Creating Search Engine Friendly Websites

13:31 30 October in Web Solutions


Having a website listed by search engines is an important goal for all website developers. While developers may focus on meeting the client’s functional and aesthetic requirements, they also need to consider the requirements of search engines. Search engines work by indexing the text on a website: they look at the words used on the site and note the HTML tags in which those words appear. This process should be straightforward; however, the use of alternative technologies presents problems for search engines and should be considered carefully.

JavaScript is often used by website developers to add features not available in HTML, such as dynamic manipulation of text and images. Of particular importance for SEO is the use of JavaScript to create navigation bars and menus. With JavaScript, complex menus can be built that change dynamically as selections are made. While this may add interest to the website, it presents difficulties for search engines, which cannot reliably follow hyperlinks embedded in JavaScript. This means the ‘spidering’ process, in which links within the website are followed to index every page, cannot be completed. JavaScript should not be used for primary navigation; if it is, the developer should provide alternative text-based navigation on the same pages.
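As a minimal sketch of that fallback approach (all page names here are illustrative), the navigation is written as a plain HTML list of links, with any JavaScript layered on top rather than generating the links itself:

```html
<!-- Plain HTML links that a spider can follow even when it
     ignores JavaScript. Page names are hypothetical. -->
<nav id="main-nav">
  <ul>
    <li><a href="/index.html">Home</a></li>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</nav>
<script>
  // Progressive enhancement: add dynamic menu behaviour to the
  // existing links instead of creating them with document.write().
</script>
```

Because the `<a href>` links exist in the static markup, the spider can index every page even if the script never runs.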

Flash can be used to create web pages where all of the content is held within a Flash movie. As with JavaScript-generated content, the inside of a Flash movie is largely opaque to search engine indexing: the text is compiled into the movie and cannot be read as ordinary markup. Flash can be used to build individual components of a web page, but a Flash movie should not constitute the entire content of the page.
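Where a Flash component is used, one common mitigation (sketched below with a hypothetical file name) is to embed it with alternative HTML content, which remains readable as ordinary text in the page source:

```html
<!-- The markup inside <object> is displayed when the plugin is
     unavailable and is indexable text. "banner.swf" is a
     hypothetical file name. -->
<object type="application/x-shockwave-flash" data="banner.swf"
        width="600" height="200">
  <h2>Autumn Sale</h2>
  <p>A plain-text summary of the movie's content, which search
     engines can read and index.</p>
</object>
```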

Web pages that are created from database content often have complex URLs. The page address may contain record numbers and values for a range of variables. These addresses may be difficult for search engine spidering software to follow and may be transient in nature. There is some scope for creating search-friendly URLs using URL rewriting; while the rewritten addresses are simpler to read, the process can still lead to indexing problems. The best option is to use static page URLs where possible.
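As a sketch of URL rewriting, assuming an Apache server with mod_rewrite enabled (the path and parameter name are illustrative), a static-looking address can be mapped onto the underlying dynamic page:

```apache
# Serve /products/123 from the dynamic script, so the public
# URL contains no query string. Names are hypothetical.
RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L]
```

Visitors and spiders see the clean `/products/123` form, while the server internally calls `product.php?id=123`.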

Content management systems have become popular with clients who have minimal expertise in website maintenance. The text viewed on the web pages is held in a database, and these text records can be amended by the client through a form-driven interface. As with the dynamic content discussed above, the pages often have complex URLs. Content management systems enable clients to amend web page content themselves; however, they are not a good substitute for static pages. Search engines index static web pages most easily, and these should always be the preferred option to facilitate SEO.

Images cannot be read by search engines. Text held in an image is simply a collection of pixels that cannot be readily interpreted by software. Images can be a positive attribute of a web page when used with appropriate text, but if they become the only item on the page there will be little for search engines to index. Use keywords in every part of the image markup: the file name and the ‘alt’ and ‘title’ attributes should all contain relevant keywords. The image should play a supporting role to the text on the page.
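A minimal sketch of such an image tag (the file name and keywords are illustrative):

```html
<!-- The file name, alt text, and title all carry keywords;
     the surrounding text repeats them as indexable content. -->
<img src="handmade-oak-bookcase.jpg"
     alt="Handmade oak bookcase with adjustable shelves"
     title="Handmade oak bookcase">
<p>Our handmade oak bookcases are built to order.</p>
```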

Some website developers are more concerned with adding unusual features and sparkle to a website than with ensuring it fulfils its original function; somewhere along the journey they lose sight of the goal. If you are to develop a website that succeeds in attracting visitors, good-quality text must be a major part of the content. Neglect the text and your search engine rankings will suffer.
