The thing about SEO is that it is never enough. It can be an exhilarating experience to see your website at the top of the SERPs one day and the very next day, you might find the website nowhere on the first page of the SERPs. But, it is this very agony and ecstasy that keeps giving birth to innovative SEO tools, techniques, stratagems and methodologies that are used by experts to stay one step ahead of the competition.
In the last post on the same topic, we discussed five essentials: URL Structure, Alt and Title Tags, Sitemap, Header Tags, and Navigation and Footer Links. Here are five more essentials that can help you ensure a more SEO-friendly website.
Focus On Unique Content
Obvious but often ignored, the value of unique content is priceless. But you need to be careful: you shouldn't lose your readers in the pursuit of uniqueness. What you must focus on is readability; make your content a worthy read and readers won't be able to ignore it. The content must be aligned perfectly with the purpose of your website and, more specifically, the web page it is on. Readability can be improved by using sub-headings, italicized text and bullet points. Some webmasters get smart and stuff their content with keywords, which puts readers off. Remember, the trick to improving the popularity of your website or blog is getting the visitors on your side. This is why quality content matters. It mattered long before Google Panda came on the scene; the Panda update just emphasized it more.
Canonicalization Issues
If the thought of why Google isn't picking the right URL for your website is keeping you up all night, buddy, you have major canonicalization issues to sort out. The problem arises when there are multiple URLs for the same on-site content. To ensure that Google picks the right URL, follow the golden rule: maintain URL consistency. Say you want your default URL to be https://www.xyz.com/; you must ensure that this exact format is used across all your internal links as well.
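To make "URL consistency" concrete, here is a minimal sketch of a link normalizer. It assumes, purely for illustration, that the canonical form is the post's placeholder domain with HTTPS and a "www." prefix; your own preferred scheme and host form may differ.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Rewrite an internal link into one consistent canonical form.

    Assumptions for this sketch: HTTPS scheme, a lowercase host with a
    "www." prefix, and a trailing slash on the bare domain.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    scheme = "https"                      # always use one scheme
    netloc = netloc.lower()
    if not netloc.startswith("www."):
        netloc = "www." + netloc          # always use one host form
    if not path:
        path = "/"                        # bare domain gets a trailing slash
    return urlunsplit((scheme, netloc, path, query, fragment))

print(canonicalize("http://XYZ.com"))   # https://www.xyz.com/
```

Running every internal link through a helper like this before it goes into a template is one simple way to guarantee that only one URL variant ever appears on the site.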
You can signal your canonical preference to Google by using 301 redirects or a link tag with rel="canonical".
The former helps you control your URLs; 301 is the HTTP status code for a permanent redirect, and it sends the website visitor (and search engine bots) to the right URL. And why should this be looked into while you are building a website? Because 301 redirects can be coded directly for the pages you want redirected, right from the start.
The latter, also written in Google parlance as rel="canonical", helps search engine bots find the correct URL for the content very easily. All you need to do is add the link tag to the <head> section of the page and specify the canonical URL for that page.
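Both options can be sketched in a few lines. This is framework-agnostic illustration only: make_redirect models the status line and Location header of a 301 response, and canonical_link_tag builds the link element for a page's <head>. The function names and example URLs are assumptions, reusing the post's placeholder domain.

```python
def make_redirect(target_url):
    """Model a 301 response: status line plus Location header."""
    return ("301 Moved Permanently", [("Location", target_url)])

def canonical_link_tag(canonical_url):
    """Build the link tag to place in the page's <head> section."""
    return '<link rel="canonical" href="%s" />' % canonical_url

status, headers = make_redirect("https://www.xyz.com/")
print(status)                                      # 301 Moved Permanently
print(canonical_link_tag("https://www.xyz.com/shoes"))
```

In practice the 301 is usually configured at the web server level, while the rel="canonical" tag is emitted by your page templates; the sketch just shows what each one actually sends.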
Maintain Protocol
A serious breach of HTTP and HTTPS control on your website can lead to disastrous consequences for your website's presence on SERPs. Render a web page with either the HTTP protocol or the HTTPS protocol, never both, because search engine algorithms won't consider the two versions to be one and the same page. HTTPS is a secure protocol, commonly used for web pages that require visitors to share sensitive information. For example, a payment page on an ecommerce store should always be rendered with HTTPS.
I know you will say that this is pretty elementary and doesn't deserve a place in the list, but elementary or not, webmasters do make the mistake of rendering a page with both protocols. This must be avoided at all costs.
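One way to guarantee a page is only ever served under one protocol is to 301-redirect every HTTP request to its HTTPS counterpart. The sketch below assumes a server that exposes the request scheme, host and path; the function name is illustrative, not a real API.

```python
def enforce_https(scheme, host, path):
    """Return None if the request is already HTTPS,
    otherwise the HTTPS URL to 301-redirect to."""
    if scheme == "https":
        return None
    return "https://" + host + path

print(enforce_https("http", "www.xyz.com", "/checkout"))
# https://www.xyz.com/checkout
print(enforce_https("https", "www.xyz.com", "/checkout"))
# None
```

Because the redirect is permanent, search engines consolidate the two variants into the single HTTPS page rather than treating them as duplicates.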
No Hidden Content
Some webmasters believe that they can get better rankings on SERPs if they have hidden text and links on the website. On the contrary, search engines do not look kindly on such content; even if search engine bots crawl it, they won't index it. Your site might not be considered credible enough, and it won't appear in the search engine results pages. So don't hide content behind JavaScript. You might think you are doing something smart, but a search engine like Google won't think along the same lines.
No Cloak And Dagger Stuff
Use black hat SEO techniques and get set to experience the Google slap. One commonly used black hat technique is cloaking, in which the content presented to the user is not the same as the content presented to the search engine spider. Unscrupulous webmasters use a server-side script to deceive search engines into believing that the page they are crawling is the one that will be displayed to the user.
They effectively hide the content that is presented to users' browsers and instead present search engines with heavily optimized pages that wouldn't make much sense if shown to the website visitor. So don't do it. If you are caught, it's the kind of stigma that will be very difficult, if at all possible, to erase.
In Conclusion
So, here we are at the end of another post, wondering whether the trials and tribulations of SEO are really worth it. But this is what we do and, let's face it, we love it. Getting SEO right is just the beginning of an endless process of fine-tuning existing strategies and adopting newer ones to ensure an extremely search-friendly presence for our websites.