Search Engine Optimization: Devil or Saint? Science or Art?

Written on 10:35:00 PM by S. Potter

To many website operators, especially those operated by individuals, small to medium-sized businesses and non-profit organizations, Search Engine Optimization (SEO) techniques are black magic with no rules. Others see SEO techniques as a sure-fire cure-all. While Google shrouds its search algorithms in mystery to outsiders, there are cornerstones to implementing effective SEO campaigns, and in this posting I wish to demystify the core building blocks websites should be basing their SEO strategies on. While working remotely on software and web development projects for two clients, I have also had the opportunity to work on four SEO projects of varying scale since the beginning of this year. I have been aware of many SEO techniques (some of which I would term tricks) for a while, but prior to January I had only put them into practice on small personal web endeavors. In my experience thus far I have found some fairly simple, yet very effective, foundations on which to build SEO efforts, namely:

  • Provide useful, well-written content on your site. It does not need to be mountains of content, but it does need to be sensible and useful. One way to promote content generation (depending on the type of website you have) without crafting it all yourself is to open up forums for users/customers/clients to discuss their experiences. However, this can be a moderation nightmare for smaller organizations or individuals that need to maintain a family-friendly environment (in terms of policing spam or inappropriate comments), though it has been used to great effect on many well-known websites including, of course, amazon.com. I should also note that technology (such as CAPTCHA image validation) can minimize spamming.
  • Keep the HTML documents you would like indexed by search engines clean, with as little markup, styling and JavaScript code as possible. This is second only to the above cornerstone. As a software engineer with, most recently, a web development focus, I find that too many of the websites clients give me to optimize need their client and server code refactored in a serious way. The idea is to reduce non-content code so as to improve the keyword and search-phrase ratios that most search engines most likely use to rank pages. I should note that we do not know exactly what ratios search engines use, but I have seen statistical evidence strongly suggesting that certain ratios (or formulas proportional to them) yield higher positions in web page rankings. If your web developer spends too much time learning the latest JavaScript tricks or CSS hacks and not enough time writing more concise (but readable), better organized code, you should question whether they should be kept on for production work; prototyping websites might be a better fit for their skills. A minimal sketch of what leaner markup can look like follows this list.
  • Know your audience [and keywords]. Three of the four SEO projects I have worked on this year had highly flawed initial keyword/phrase targets set by our clients: I was asked to promote these three websites on keywords very few people actually use when searching for these business services. There are commercial, free and internally developed tools and scripts that can help in analyzing the suitability of keywords. Making a list of not only the A-list keywords/phrases (the ideal candidates) for your website, but also popular B-list keywords/phrases, common typos and misspellings is essential to implementing an effective SEO campaign. Not only should you know your audience and the relevant keywords and phrases to promote your website on, you should also be realistic about potential ranks for extremely popular keywords, where larger players will most likely dominate the first 3-5 pages of results. In these cases, consider your options and evaluate whether chasing the 54th spot for a major buzzword is worth the time and effort when you could be getting first-page hits with other, more specific keywords and phrases.
  • Support search engine-friendly URLs on your website. If your website is dynamic in nature, choose web server technology that supports easily readable URLs. For example, most search engines will not index a web resource at a URL like the following: http://your.domain.name/dir?id=6434&type=restaurants&subtype=middle+eastern&zip=60640&order=ASC. A website that sets more than one dynamic HTTP parameter in its URLs risks not being indexed at all, as search engines may assume the page contains highly perishable information. I have been able to translate unfriendly URLs into friendly URLs without asking the original web developers to touch the server code, although not all website URLs can be translated using URL rewriting engines such as Apache's mod_rewrite (see the sketch after this list). Another gotcha is web developers using the HTTP POST method to submit parameter data when the GET method would be far more appropriate. As a software engineer with both web development and SEO experience, I can now advise clients on how to approach this and discuss best practices. There are other issues related to supporting search engine-friendly URLs that would require a whole book, let alone a long blog posting, to treat fully.
  • Do not blatantly violate the spirit of the search engines' codes of ethics. There are techniques that come in various shades of grey, but you should steer clear of anything darker than 50% grey. Even if the search engines do not notice, your competitors might, and they will be sure to notify the search engines if you are stealing their thunder with questionable techniques. In theory, Google does not believe in blacklisting individual sites and prefers to optimize its algorithms so that an offending website drops significantly in rank or is no longer listed. However, Google will blacklist websites that blatantly violate the spirit of its rules. Most recently, Google removed BMW's German website from search results on certain keywords after it overstepped the mark in SEO creativity: http://www.alwayson-network.com/comments.php?id=14809_0_5_0_C
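
To make the second cornerstone more concrete, here is a minimal sketch of what a lean, indexable page might look like. The file names (/css/site.css, /js/site.js) and the restaurant listing content are hypothetical and purely illustrative; the point is simply that styling and behaviour are referenced from external files so that visible content dominates the document search engines actually read.

    <!-- Hypothetical lean page: presentation and behaviour live in external
         files, so the indexed document is almost entirely content. -->
    <html>
      <head>
        <title>Middle Eastern restaurants near 60640</title>
        <link rel="stylesheet" type="text/css" href="/css/site.css">
        <script type="text/javascript" src="/js/site.js"></script>
      </head>
      <body>
        <h1>Middle Eastern restaurants near 60640</h1>
        <p>Reviews, menus and opening hours for family-run Middle Eastern
           restaurants on Chicago's north side.</p>
      </body>
    </html>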
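
And to illustrate the fourth cornerstone, below is a minimal mod_rewrite sketch showing how the unfriendly URL in the example above could be served at a readable address without touching the original server code. The friendly path structure, the hard-coded id value and the assumption that the application will accept a hyphenated subtype are mine, so treat this as a sketch of the approach rather than a drop-in rule.

    # Hypothetical .htaccess sketch using Apache's mod_rewrite: serve the
    # dynamic resource from the example above at a readable URL such as
    #   http://your.domain.name/restaurants/middle-eastern/60640
    # The rewrite happens internally (no redirect), so visitors and search
    # engines only ever see the friendly form.
    RewriteEngine On
    RewriteRule ^restaurants/([a-z-]+)/([0-9]{5})$ /dir?id=6434&type=restaurants&subtype=$1&zip=$2&order=ASC [L]

Whether a given site's URLs can be mapped this cleanly depends on how its parameters are used, which is exactly why not every site can be rescued by URL rewriting alone.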
As you have probably figured out from my ramblings above, there is no clear winner. SEO techniques are neither black magic riddled with negative consequences nor sure-fire cure-all solutions. If you are going to undertake this task yourself, all I suggest is treading carefully, using tried and tested techniques, and leaving room for some experimentation of your own so you can draw your own conclusions. Alternatively, if you hire an SEO consultant, make sure they sit at neither extreme.


1 Comment

  1. S. Potter

    I should also mention that experimenting with techniques that are specifically applied to the social web (e.g. blogs, delicious, digg, etc.) is an area I am presently looking into in my spare time. If and when I find some useful results I will post a new blog entry on this.

     
