For improved search engine rankings, your website must be built on the foundation of on-page SEO. On-page and off-page SEO are interdependent: without both, the chances of ranking highly are slim. Off-page SEO is all about link-building strategies, and you may not have complete control over its results. On-page SEO, however, is entirely in your hands. It is applied to both the visible and the coded content of a web page, and it serves users as well as search engines. On-page SEO refers to improving search engine rankings through what is done on individual pages, while on-site SEO refers to improving rankings through what is done to the website as a whole.
The most important on-page SEO element is the title. Target keywords in the title help the relevant pages surface in search engine results. Before setting the title, check a few things: whether the most important keyword is present in the title, whether that keyword appears at the beginning of the title tag, whether the title tag stays within 70 characters, and whether it is unique. Do not confuse the title tag with the meta title – the former is a standalone tag. Both share the same function and location, and some sites use the same content for both. However, it is important to have a title tag even if you do not have a meta title tag. The most important keyword must appear in the title, preferably towards its beginning. For the sake of usability, the title should not exceed 70 characters, and its content should be not only easily understandable but also unique.
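To make the checklist concrete, a title tag along these lines would satisfy the points above (the page topic and keyword are hypothetical examples):

```html
<head>
  <!-- Target keyword "on-page SEO" placed at the beginning; under 70 characters; unique to this page -->
  <title>On-Page SEO Checklist: Titles, Descriptions and Headings</title>
</head>
```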
The next point to look at is the description tag. On the search engine results page, the description offers a summary of the page's content, which makes it important for visitors. Before setting it, check whether the target keywords appear in the meta description, whether it stays within 150 characters, whether synonyms have been used to help users understand the description, and whether the description of every page is unique. Staying within 150 characters is not mandatory, but exceeding that length causes the text to be truncated with an ellipsis, making it of little value to users. Keywords and synonyms are advisable because they are usually highlighted in search engine results. Although it is not known to what extent keywords in the description boost SEO, they definitely improve the click-through rate on search results. Since more characters are allowed here, you can explain in the meta description what you could not fit in the title. Avoid duplicate meta descriptions and keep each one unique.
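A meta description meeting these checks might look like the following (the wording is an illustrative example, not a prescription):

```html
<head>
  <!-- Unique, keyword-bearing summary kept within roughly 150 characters -->
  <meta name="description"
        content="A practical on-page SEO checklist covering title tags, meta descriptions, headings, content, URLs and image alt text.">
</head>
```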
Unlike the description and title, the heading is visible to visitors on the web page itself. The heading is also important from the point of view of search engine results, as search engines use heading tags to gather information about the page. While setting the heading tag, check points such as whether there is an h1 tag, whether it contains the targeted keyword, whether that keyword sits at the beginning of the tag, and whether the heading content is unique. The heading summarizes the page's intent. Although there is no limit on the number of characters, keeping the heading short helps capture visitors' attention. The targeted keyword must appear in the heading tag, preferably at its beginning, to help with SEO. More than one heading tag can be used, but they should follow the hierarchy from h1 to h6, and no more than one h1 tag should appear on a page. Even if the intent is the same, make sure the title tag and heading tag do not carry identical content. Keywords in h2 and h3 tags pass only a weak relevancy signal to search engines. It is also useful to note that while meta tags give search engines site-relevant information, they may not improve SEO by themselves.
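The heading hierarchy described above can be sketched as follows (the headings themselves are hypothetical):

```html
<body>
  <!-- A single h1 per page, with the target keyword at the beginning -->
  <h1>On-Page SEO Checklist</h1>

  <!-- Subheadings follow the h1-to-h6 hierarchy -->
  <h2>Optimizing the Title Tag</h2>
  <h3>Keyword Placement</h3>
  <h2>Writing the Meta Description</h2>
</body>
```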
Now let us look at content. The keywords for a given web page must appear in its content, and these can include synonyms, long-tail keywords and so on. Make sure keyword density does not become excessive. While setting the content, check points such as whether the main keywords appear in the content's first lines, whether optimum keyword density is maintained in the body, whether the content exceeds 500 words, and whether the content of each page is both unique and original. It was mentioned earlier that title tags, heading tags and description tags should be kept short; for content, the opposite is true. It is best to exceed 500–1,000 words, because search engines favour long content – in short, the longer the better. That does not mean subjecting readers to the torture of trudging through a forest of words: give them the information they need without drowning them in a waterfall of words.
There is no exact figure to quote for keyword density; simply use keywords wherever they fit naturally. You can also use latent semantic indexing (LSI) keywords – related terms that search engines use to assess your content's relevance to the target keywords. It is very important that the content be original: Google considers content duplicated from third-party pages bad for SEO. Do not duplicate content, whether from third-party pages or from your own. When a search engine finds duplicate content, it chooses one version as the original; the others are omitted and pushed to the last search results. Now let us look at URLs. It is important not only to keep URLs clean but also to use them to describe the content with the right words or keywords. URLs matter for usability and SEO, and they also reveal the site's architecture.
Check points such as whether the URLs are clean and short, whether keywords are placed at the beginning of the URLs, and whether hyphens have been used in them. URLs are among the most important on-page SEO elements. Keyword domains once ranked highly in search – not anymore. What matters now is keywords in the rest of the URL, beyond the domain name. Note that URLs with hyphens are preferred over URLs with underscores. Search engines give priority to the first words when assessing a URL's relevance, which is why target keywords should be placed at the beginning of the URL. Even when that is not possible, the URL must contain words that properly describe the content. URLs serving the same content must also be taken care of: on duplicate pages, a rel="canonical" tag informs search engines of the original location.
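Put together, a clean URL and a canonical tag might look like this (the domain and path are hypothetical):

```html
<!-- Clean, short, hyphenated URL with the keyword at the front:
     https://example.com/on-page-seo-checklist -->

<!-- Placed in the <head> of a duplicate page to point search
     engines at the original location: -->
<link rel="canonical" href="https://example.com/on-page-seo-checklist">
```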
Now let us look at images. While search engines cannot read the text inside an image, they can read the text written in the image's alt attribute. Check whether all images on the website have alt text, whether keywords appear in that alt text, and whether keywords or other relevant words appear in the image file names. Alt attributes for images are good not only for SEO but also for usability. To illustrate, it is the alt text that helps a visually impaired visitor using a screen reader, since the image itself cannot be seen. Even for visitors who are not visually impaired, alt attributes come to the rescue during a slow internet connection or when images have been disabled. As far as SEO is concerned, keywords in image alt text not only improve search rankings but also help the images rank highly in Google image search.
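An image tag covering both checks – a keyword-bearing file name and descriptive alt text – could be written as follows (file name and wording are illustrative):

```html
<!-- Keyword in the file name, descriptive keyword-bearing alt text -->
<img src="on-page-seo-checklist.png"
     alt="On-page SEO checklist covering title, description and heading checks">
```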
Another point to note is keyword consistency. Check whether the target keywords appear consistently in the title, heading, description, content, image alt attributes, URL and so on. Maintaining keyword consistency across a website's spiderable content is highly recommended, even if it is not compulsory. Internal links are another important factor, as they facilitate the flow of link juice. Check whether the internal links allow link juice to flow smoothly and whether targeted keywords have been used as their anchor text. The internal linking architecture should pass link juice through every inner page. External links can popularize inner pages, but internal linking remains important for the easy crawling and indexing of all web pages. You can improve the search rankings of a page by linking it from other popular pages on your own website. Since it is your own site, you are free to choose the anchor text for internal links – so use target keywords rather than generic words like 'read more'.
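The anchor-text advice above comes down to a one-line difference (the link target is hypothetical):

```html
<!-- Keyword-rich anchor text: -->
<a href="/on-page-seo-checklist">on-page SEO checklist</a>

<!-- rather than a generic label: -->
<a href="/on-page-seo-checklist">read more</a>
```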
Now let us look at the sitemap. It is common knowledge that every website must have one. Search engines will index a site even without a sitemap, but a sitemap makes finding and indexing pages easier for them. Check whether you have an XML sitemap, whether it has been submitted to Google Webmaster Tools, and whether its location is specified in the robots.txt file. A sitemap tells search engines two things: the location of every internal link and the priority of every page within the website. Knowing the location of every internal link helps search engines index all the internal pages, which in turn improves the search visibility of those pages. Knowing each page's priority improves the search engine's understanding of the information architecture. The robots.txt file is often the first file a search engine crawler looks for, which is why the sitemap location can be specified there.
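A minimal XML sitemap entry carrying the location and priority information described above might look like this (URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Page location and its relative priority within the site -->
    <loc>https://example.com/on-page-seo-checklist</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```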
The robots.txt file contains information about pages that should not be indexed, and this information – which can include the sitemap location – enables the efficient crawling and indexing of a site by search engines. Check whether any search bots are blocked or any pages incorrectly blocked in the robots.txt file. Although robots.txt does not directly affect your website's SEO, you must check the file to ensure that no page you want indexed is blocked. Double-check the file if Google Webmaster Tools reports crawling issues. By specifying unnecessary pages in robots.txt, you can prevent those pages from being indexed and thus improve the search engine bots' crawl rate. Another factor to look at is page loading speed. In 2010, Google announced that page loading speed would affect search rankings, so it is important to make sure a page takes 4 seconds or less to load.
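A robots.txt file covering both points – blocking unnecessary pages and specifying the sitemap location – could look like this (the blocked paths are hypothetical examples):

```text
# robots.txt – keep unnecessary pages away from search engine bots
# and point crawlers at the sitemap
User-agent: *
Disallow: /admin/
Disallow: /search-results/

Sitemap: https://example.com/sitemap.xml
```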
It is important to maintain and update the website regularly, and to implement all the on-page SEO strategies in an organized way, so as to achieve high search engine rankings.