Is your website technically set up for search engine spiders and robots?
Search engines use software that "crawls" the web, indexing the sites and pages it finds as it follows links within a site and from site to site. Sometimes web designers who know little about the backend, or business owners who build "do it yourself" websites, make mistakes that hamper how a search engine spider moves through a site.
Make sure you aren’t making common mistakes by adhering to these guidelines:
Create a website using style sheets. Some software and old-school designers use table tags to create the structure of a website. Tables generate far more code, bloating the page: it takes longer to download and is slower and more confusing for search engines to crawl. Style sheets also give you much more flexibility in how the site appears.
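To see the difference, here is a minimal sketch of the same two-column page built both ways (the widths and ids are just placeholders):

```html
<!-- Table-based layout: structure and presentation tangled together -->
<table>
  <tr>
    <td width="200">Menu</td>
    <td>Content</td>
  </tr>
</table>

<!-- Style-sheet layout: lean HTML, with presentation moved into CSS -->
<div id="menu">Menu</div>
<div id="content">Content</div>
<style>
  #menu    { float: left; width: 200px; }
  #content { margin-left: 210px; }
</style>
```

The style-sheet version keeps the HTML itself short, and the CSS can live in one external file that is cached and reused across every page of the site.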
Reduce bloated code. Along with using style sheets, consider how to streamline the rest of the code on your site. Some HTML creation tools, even good ones like Dreamweaver, can produce excess code, especially if you don't know HTML well enough to keep this in check. Some pages also load several JavaScript programs, piling still more code into the top of the page. The problem is that a search engine spider may have a tough time wading through all this code to reach the main content of your site.
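One simple way to slim a page down, as a rough illustration, is to move inline scripts into an external file (the filename here is hypothetical):

```html
<!-- Instead of pasting whole programs into the top of every page... -->
<head>
  <script type="text/javascript">
    /* hundreds of lines the spider must wade past to reach your content */
  </script>
</head>

<!-- ...link to one external file, which browsers can also cache: -->
<head>
  <script type="text/javascript" src="site.js"></script>
</head>
```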
Validate your code. This means making sure your code is written correctly and simply; the free W3C validator at validator.w3.org will check a page for you. A few validation errors aren't the end of the world, but there is a big difference between a handful of errors and the dozens or even hundreds we have seen on some local sites we've reviewed. Incorrect code may display incorrectly in some browsers, making your users' experiences vary considerably, and it can trip up search engines as well.
Navigation should be clear and coded correctly. Some websites use scripts, notably JavaScript, to code the navigation (the links to the main sections of the site). This can make for a fancy user experience – if the visitor has the right browser and the code is interpreted correctly. But search engine spiders can be tripped up by JavaScript navigation, which means the spider can't move through the entire site to find all your pages. Worse, people using an older or less popular browser may not see how to move around your site at all! Keep your navigation simple and coded in plain HTML to prevent problems.
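A spider-friendly menu can be as simple as an ordinary HTML list (the page names below are only examples):

```html
<!-- Plain HTML links: every browser and every spider can follow these -->
<ul id="nav">
  <li><a href="/about">About Us</a></li>
  <li><a href="/services">Services</a></li>
  <li><a href="/contact">Contact</a></li>
</ul>
```

You can still style the list with CSS to look however you like; the point is that the links themselves stay in plain HTML.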
Keep your website URLs simple. Some content management systems, especially ones that handle e-commerce, insert session IDs or variables and parameters into the web page address, producing a long URL full of nonsense numbers and symbols. A good content management system, like the Textpattern-based one we use for Visual People's clients, allows you to create URLs with keywords and without those variables. As a default, if you don't create a unique URL for a page, it will use the headline text (which should contain keywords and be relevant to your page).
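As a hypothetical example of the difference (the domain, paths, and parameters here are invented):

```
Hard to crawl:  http://example.com/store/product.php?id=83&cat=12&sessid=9f3a27b1
Easy to crawl:  http://example.com/store/garden-furniture
```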
Make sure spiders and robots are welcome on your website. You don't want to accidentally forbid search engine spiders from going through your site. A site's "robots.txt" file tells search engines which pages they may and may not crawl. If you make a mistake there, you may end up preventing your site from being indexed at all.
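As a sketch, a robots.txt file that welcomes every spider to the whole site looks like this – and note how close it is to one that blocks everything:

```
# Welcome all spiders to the entire site (an empty Disallow blocks nothing)
User-agent: *
Disallow:

# Careful: adding a single slash instead would block the WHOLE site:
# Disallow: /
```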
Don’t worry if this makes little sense – the takeaway for a small business owner is that your website will benefit from having a professional handle the backend. If you create a site yourself, you will want to be sure that you understand what needs to be done to ensure you’re not losing search engine rankings because of technical issues. And if you work with a web designer, you’ll want to be sure he or she understands that the site needs to have more than a pretty appearance to be effective for your business.
Want a free review of your site? Contact Visual People for a basic review of the effectiveness of your small business website.