Search engine optimization should always be at the forefront of your mind when building and maintaining a website. In this article, we will show you how to optimize your web pages for SEO so you can rank better and earn more organic traffic. Read on to find out exactly how.
- TITLE TAGS: Your HTML title tag appears in browser tabs, bookmarks and in search result pages. Make your title tags clear, concise (50-60 characters) and include your most important keywords.
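As a sketch, a concise title tag might look like the following (the page name and keywords are placeholders for your own):

```html
<head>
  <!-- 50-60 characters, most important keywords toward the front -->
  <title>Patent Law Services | Example Legal Firm</title>
</head>
```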
- META DESCRIPTION: A good meta description acts as an organic advertisement, so use enticing messaging with a clear call to action to maximize click-through rate. Meta descriptions allow you to influence how your web pages are described and displayed in search results. Ensure that every one of your web pages has a unique meta description that is explicit and contains your most important keywords (these appear in bold when they match part or all of the user’s search query).
Check your Google Search Console account to identify any issues with your meta descriptions, such as descriptions that are too short, too long, or duplicated across more than one page (in legacy versions of Search Console this report lived under ‘Search Appearance’, then ‘HTML Improvements’).
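For example, a unique, descriptive meta description (the copy here is placeholder text) could be written like this:

```html
<head>
  <!-- Unique per page; enticing copy with a clear call to action -->
  <meta name="description"
        content="Protect your inventions with our patent law services. Book a free consultation with an experienced attorney today.">
</head>
```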
- HEADING TAGS: Use your keywords in the headings and make sure the first level (<H1>) includes your most important keywords. Never duplicate your title tag content in your heading tags. While it is important to ensure every page has an <H1> tag, never include more than one per page. Instead, use multiple <H2> – <H6> tags.
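A well-structured heading hierarchy (with placeholder headings) might look like this:

```html
<h1>Patent Law Services</h1>          <!-- exactly one H1, carrying the primary keyword -->
<h2>Filing a Patent Application</h2>  <!-- subtopics use H2 -->
<h3>Provisional Applications</h3>     <!-- deeper detail nests under H3 -->
<h2>Patent Disputes</h2>
```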
- KEYWORD CLOUD: A keyword cloud provides insight into the frequency of keyword usage within a page. It’s important to carry out keyword research to understand which keywords your audience is using. There are a number of keyword research tools available online to help you choose which keywords to target.
- KEYWORD CONSISTENCY: Keyword consistency is the use of keywords throughout the different elements of the webpage. Consistent keyword use helps crawlers index your site and determine relevancy to search queries. Review the most frequently used keywords on your page and check how consistently you’re using them across your title, meta description, headings and body text.
- ALT ATTRIBUTE: Alternative text allows you to add a description to an image. Since search engine crawlers cannot see images, they rely on alternative text attributes to determine relevance to a search query. Alternative text also makes an image more likely to appear in a Google image search and is used by screen readers to provide context for visually impaired users. Check the images on your website to make sure accurate and relevant alternative text is specified for each one. Keep alt text concise (roughly 125 characters or fewer, including spaces), since screen readers may truncate longer descriptions.
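In practice (file names and descriptions below are placeholders), alt attributes look like this:

```html
<!-- Descriptive, concise alt text for a meaningful image -->
<img src="green-dress.jpg" alt="Emerald green evening dress with lace sleeves">

<!-- Purely decorative images can use an empty alt attribute
     so screen readers skip them -->
<img src="divider.png" alt="">
```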
- DISCOVERED PAGES: This is the number of pages that search engine bots have discovered on your website. A low number can indicate that bots are unable to discover your web pages, which is commonly caused by poor site architecture and internal linking, or by unknowingly preventing bots and search engines from crawling and indexing your pages. Make sure your website’s XML sitemap is present and that you have submitted it to the major search engines. Building backlinks to your website’s internal pages will also help bots discover, crawl and index them, while building authority to help them rank in the search engines. Check Google Search Console under ‘Google Index’ and ‘Crawl’ to keep track of the status of your site’s indexed and crawled pages.
- GOOGLE+ PUBLISHER: The rel=”publisher” link historically allowed Google to display an excerpt from your brand’s Google+ page as a rich snippet on the search engine results page, featuring your company in SERPs rather than individual authors (as rel=”author” did). Note, however, that Google+ has since been shut down and Google no longer uses rel=”publisher” or rel=”author” markup, so this item is now largely of historical interest.
- IN-PAGE LINKS: Links pass value from one page to another; this value is often called ‘link juice’. A page’s link juice is split between all the links on that page, so lots of unnecessary links will dilute the value attributed to each one. There’s no exact number of links to include on a page, but best practice is to keep it under 200. Using the nofollow attribute prevents some link juice from being passed through a link, but nofollowed links are still taken into account when the value passed through each link is calculated, so using lots of nofollow links can still dilute PageRank.
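The difference is a single attribute on the anchor tag (URLs below are placeholders):

```html
<!-- A normal link passes value to the target page -->
<a href="https://example.com/services/">Our services</a>

<!-- A nofollowed link asks search engines not to pass value through it -->
<a href="https://example.com/login/" rel="nofollow">Log in</a>
```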
- BROKEN LINKS: Broken links send users to non-existent web pages. They hurt a site’s usability and reputation, which impacts SEO. Be proactive in checking your pages regularly to make sure they don’t contain any broken links.
- WWW RESOLVE: Search engines see www.example.com and example.com as different websites, which means they could see a large amount of duplicate content, which they don’t like. Make sure www.example.com and example.com resolve to the same place, typically with a 301 redirect to your preferred version, and use that preferred version consistently in your sitemap, canonical tags and internal links.
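On an Apache server with mod_rewrite enabled (an assumption — other servers have their own equivalents), a 301 redirect from the bare domain to the www version can be sketched in an .htaccess file like this:

```apache
# .htaccess — permanently redirect example.com to www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```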
- ROBOTS.TXT: A robots.txt file allows you to restrict the access of search engine crawlers to prevent them from accessing specific pages or directories. It can also point crawlers to your XML sitemap file. You can use Google Search Console’s robots.txt Tester to submit and test your robots.txt file and to make sure Googlebot isn’t crawling any restricted files.
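A minimal robots.txt (the disallowed paths and sitemap URL are placeholders) placed at the root of your domain might look like this:

```
# robots.txt — must live at https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```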
- XML SITEMAP: XML sitemaps contain a list of the URLs on your site that are available to index, allowing search engines to read your pages more intelligently. They can also include information like your site’s latest updates, frequency of changes and the importance of URLs. Be sure to only include the pages you want search engines to crawl, so leave out any that have been blocked in a robots.txt file. Avoid using any URLs that cause redirects or error codes, and be consistent in using your preferred URLs (with or without www.), correct protocols (http vs. https) and trailing slashes. You should also use your robots.txt file to point search engine crawlers to the location of your sitemap.
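A minimal sitemap entry, following the sitemaps.org protocol (the URL and date are placeholders), looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/legal/patents</loc>
    <lastmod>2016-05-10</lastmod>     <!-- date of last update -->
    <changefreq>monthly</changefreq>  <!-- how often the page changes -->
    <priority>0.8</priority>          <!-- relative importance, 0.0-1.0 -->
  </url>
</urlset>
```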
- URL REWRITE: Clean URLs are not only SEO-friendly but also important for usability. A clean URL is one that is easily read and does not contain any query strings or URL parameters. Take a look at this URL: http://example.com/services/index.jsp?category=legal&id=patents. It does not easily describe the title or contents of the page at a glance; the bits of text index.jsp?category= and &id= are URL parameters that give the URL an unclean look. Here is another version of the same URL: http://example.com/services/legal/patents. The second example obviously has a more straightforward, professional look and is more likely to be clicked on when shared on your Twitter or Facebook profile, or simply in a blog.
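On an Apache server with mod_rewrite (an assumption — the same idea applies to other servers and frameworks), the clean URL from the example above can be served by mapping it internally to the parameterized one:

```apache
# Serve the clean URL by rewriting it internally
# to the real parameterized page
RewriteEngine On
RewriteRule ^services/legal/patents/?$ /services/index.jsp?category=legal&id=patents [L]
```

Visitors and search engines only ever see the clean URL; the rewrite happens invisibly on the server.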
- UNDERSCORES IN URL: Google sees hyphens as word separators while underscores aren’t recognized. So the search engine sees www.example.com/green_dress as www.example.com/greendress. The bots will have a hard time determining this URL’s relevance to a keyword.
- MOBILE FRIENDLINESS: Mobile friendly websites make it easy for users to complete objectives and common tasks and use a design or template that is consistent across all devices (uses responsive web design). Make sure your site is well configured for mobile users.
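A prerequisite for responsive design is the viewport meta tag in your page’s head, which tells mobile browsers to render at the device width instead of a zoomed-out desktop layout:

```html
<head>
  <!-- Without this, mobile browsers assume a desktop-width page -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```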
- FAVICON: Favicons are the small icons that appear next to your site’s name or URL in a browser. They can be displayed in the address bar, a browser tab title or bookmarks. Make sure it is consistent with your brand.
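Most browsers automatically pick up a favicon.ico placed in the site root, but an explicit link tag (the path below is a placeholder) also works for other formats and locations:

```html
<head>
  <link rel="icon" href="/favicon.ico" type="image/x-icon">
</head>
```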
Together, these on-page optimization points make your website easier for search engines to crawl and understand, and all of them play a part in getting it to the top of the search results.