Friday, 8 May 2015

SEO Training (Part 3)

You spend countless hours designing the perfect webpage, only to find a couple of days later that your published content is nowhere on the radar of the major search engines. Why is this? Let's look at a few scenarios that could cause those hardworking search engine bots to overlook your webpage.

(a.) You include forms on your website that require users to input information before they can see your content - There are various reasons you may have a form on your webpage: you may want to capture a user's personal information in order to set up an account, you may want them to complete a marketing survey, or the site may simply be password protected. Your intentions may be sincere, but they can have an adverse effect on whether the content behind those forms can be found and indexed by search engines. Bots will not attempt to fill out forms, so any links reachable only through those forms will be overlooked by the crawlers.
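As a quick illustration, a simple login form like the one below walls off everything served after it (the paths and field names here are only illustrative):

```html
<!-- Content served only after this form is submitted is invisible to crawlers: -->
<!-- bots do not fill out or submit forms. -->
<form action="/login" method="post">
  <label>Username: <input type="text" name="username"></label>
  <label>Password: <input type="password" name="password"></label>
  <input type="submit" value="Log in">
</form>
```

Any page that is only reachable after a successful submission of this form will never be crawled or indexed.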

(b.) Your webpage may have literally thousands of links on it - In reality, search engine bots will only crawl a certain number of links on a webpage at a time. This is done purposely to reduce spam and keep rankings between webpages fair. That being said, there is a good probability that bots will overlook some of the webpages that are reachable only through that huge number of links.

(c.) The "Meta Robots Tag" - The meta robots tag is used by the webmaster to restrict access to a webpage by robots. Why would a website owner do that, you may ask? Well, simply put, in addition to the good search engine bots and crawlers there are also malicious bots, so the meta robots tag can discourage them from your web content - but in the process you may end up restricting good bots as well.
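For reference, this is what such a tag looks like in a page's head. The noindex/nofollow values are standard, but note that honoring them is voluntary - well-behaved crawlers obey, while malicious bots simply ignore the tag:

```html
<head>
  <!-- Ask all robots not to index this page and not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <!-- A directive can also target a single crawler by name, e.g. Google's: -->
  <meta name="googlebot" content="noindex">
</head>
```

So if you add a blanket noindex to keep bad bots away, you are also telling the legitimate search engines to drop the page from their index.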

(d.) Using JavaScript code for your links - Best practice suggests that you use HTML links, as these are easily read by search engines. Links embedded within JavaScript code are hard for search engines to read, so crawlers give them little attention.
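The contrast is easy to see side by side (the destination URL here is just an example):

```html
<!-- Crawler-friendly: a plain HTML anchor with a real href -->
<a href="/products.html">Our products</a>

<!-- Hard for crawlers: the destination exists only inside JavaScript,
     so there is no href for the bot to discover and follow -->
<span onclick="window.location='/products.html'">Our products</span>
```

Both render as clickable text for a human visitor, but only the first gives a crawler a URL it can follow and index.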

(e.) The use of the "Robots.txt" file - This file serves a similar function to the meta robots tag in that it polices the links on a webpage and blocks spiders and crawlers from traversing them. Be careful when using the robots.txt file, because you may also block legitimate crawlers from finding your content through those links.
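A minimal robots.txt might look like this (the directory names are illustrative); it sits at the root of your site, e.g. example.com/robots.txt:

```
# Block every crawler from the /private/ directory
User-agent: *
Disallow: /private/

# A mistake like the next line (uncommented) would hide the ENTIRE site
# from legitimate search engine crawlers as well:
# Disallow: /
```

As with the meta robots tag, compliance is voluntary: reputable crawlers respect these rules, while bad bots ignore them, so an overly broad Disallow mostly hurts your visibility to the search engines you actually want.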

So, by understanding how your code works, and by knowing what search engine bots can and cannot read when it comes to your links, you can greatly enhance and optimize your webpages for search engines.
Seeking a search engine company that helps you generate more leads from Google? Look for Dougles Chan - The Search Engine Guru.
