Monday, 19 March 2007

SEO - spidering

A lot of web sites nowadays use fancy script-based navigation, with menus that drop down on 'hover' and the like, but these links are often not read or followed by the search engine bots that index the site. What does this mean? It means there is a danger that only your homepage will be fully indexed. How do you fix it? Very simply: you add standard HTML hyperlinks to back up your other navigation. Many sites do this and place the extra navigation at the bottom of the page. You may think you can cleverly fool the bots by adding 'invisible' text, i.e. text in a colour similar to the background or in a very small font. Be warned: the bots are wise to this, and will not take kindly to 'hidden' text.
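For example, a plain-HTML backup navigation block at the foot of the page might look something like this (the page names below are placeholders, not a recommended structure):

<!-- Plain HTML text links that bots can always follow, even when the script menu is ignored -->
<div id="footer-nav">
  <a href="index.html">Home</a> |
  <a href="about.html">About</a> |
  <a href="products.html">Products</a> |
  <a href="contact.html">Contact</a> |
  <a href="sitemap.html">Site map</a>
</div>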
Search engine 'bots' like to see good interconnectivity between pages, i.e. the 'web' nature of pages that all connect together, so make sure that you don't create dead ends in the 'spidering' of your site. The best way to achieve this is to create a sitemap page and check that everything links together.

As well as sitemap pages, there are sitemap files that you can leave in the root of your website, which will be recognised by search engines. You may need additional facilities on your web host to do this. Google recognises sitemaps created via XML and RSS; check out what Google has to say about this and whether your host has the appropriate software on your server. If not, you can opt for a simple text file listing your pages. Either way, the sitemap tells the bots which pages they SHOULD be seeing, and if they hit a problem in navigation it can be reported back to you in Google Webmaster Tools.
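To give a rough idea, a minimal XML sitemap in the sitemaps.org format that Google accepts looks something like this (the example.com addresses and the date are placeholders, not real pages):

<?xml version="1.0" encoding="UTF-8"?>
<!-- Save as sitemap.xml in the root of the site and submit it via Google Webmaster Tools -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-03-19</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
  </url>
</urlset>

The plain text alternative is even simpler: a file (e.g. sitemap.txt) in the same place, containing one full URL per line.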
