Welcome to Streakfury.com

Search Engine Optimization – Baloney?

Posted by | Posted in Geek Stuff | Posted on 14-02-2009

Search Engine Optimization, or SEO (for the lazy folk), is something I’ve been thinking about recently (wow, the excitement!) and what I’ve come to realise is that 90% of it is a complete load of bollocks.

First of all, I’d like to start off by apologizing to the many, many “great” SEO experts out there, who’ve dedicated years of their lives to perfecting their skills. I don’t mean to offend anyone, and a lot of what I’m going to say is based solely on my own personal experiences (which is pretty compelling evidence, to be fair) but I just thought I’d say that before continuing.

For those that don’t know, SEO is the process by which a developer improves various aspects of a website in order to help that website rank higher up in the Search Engine Results Pages (SERPs). The ultimate aim is to help more people find the site when searching the net, in the hopes of fulfilling the widely-held belief that more visitors = more customers = more money. The thing is, I think that a lot of what the “experts” say we should be doing doesn’t actually help at all.

Of course, there are things that I think help with your website’s SERP rankings. But in my opinion these things should be done for different reasons, and not because the developer thinks that they’ll benefit from better search rankings.

Firstly, people still use the ‘keywords’ metatag. It’s been known for some time that Google (and most other major search engines) won’t show your website on a results page when a user searches for “gardening” simply because you’ve included the word “gardening” in your ‘keywords’ metatag, even if your site has nothing to do with gardening. That trick used to work, but hasn’t done for many years. That said, it’s still worth giving your ‘keywords’ metatag some values, because some search engines will compare those values against the actual content on your page: the more closely your keywords match your content, the more relevant your page will appear to be for any given search.
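No search engine publishes its actual scoring, of course, but the general idea is easy to sketch. Here’s a toy JavaScript version (the function name and the scoring are entirely made up – this is not any real engine’s algorithm): keywords that never actually appear in the page’s content count for nothing.

```javascript
// Toy sketch of "compare the 'keywords' metatag to the page content".
// Returns the fraction of declared keywords that are actually backed
// up by words in the visible body text.
function keywordRelevance(metaKeywords, bodyText) {
  // Collect every word that appears in the page content.
  var words = {};
  var found = bodyText.toLowerCase().match(/[a-z]+/g) || [];
  for (var i = 0; i < found.length; i++) {
    words[found[i]] = true;
  }

  // A keyword "matches" if every word in it appears in the content.
  var keywords = metaKeywords.toLowerCase().split(',');
  var matched = 0;
  for (var j = 0; j < keywords.length; j++) {
    var parts = keywords[j].replace(/^\s+|\s+$/g, '').split(/\s+/);
    var ok = true;
    for (var k = 0; k < parts.length; k++) {
      if (!words[parts[k]]) { ok = false; }
    }
    if (ok) { matched++; }
  }
  return matched / keywords.length;
}

// A gardening page that actually talks about gardening scores well...
console.log(keywordRelevance('gardening, compost', 'Tips on gardening and compost heaps.')); // 1
// ...while bolted-on, irrelevant keywords score nothing.
console.log(keywordRelevance('sex, casino', 'Tips on gardening and compost heaps.')); // 0
```

Under a scheme like this, stuffing “gardening” into the metatag of a site that never mentions gardening buys you precisely nothing – which matches what we actually observe.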

Now that’s out of the way, I can move on to the other SEO tips and tricks that people try. There are lots of them out there, but I feel as though a lot of them make no difference whatsoever, and the ones that do make a difference don’t do so for the reasons the experts give.

OK – take, for example, a website that has been designed using tables. Now, SEO experts would tell us that, apart from being an inefficient use of HTML, a table-based design is bad for SEO because it introduces lots of unnecessary code, which makes the page harder for search engine bots to read through, and hence could push your site lower down in the SERPs. I personally don’t believe that. As far as I know, it’s the job of search engines to provide people with the most relevant information possible for any given search. It’s NOT their job to dictate to people how to build websites, and with that in mind, many different websites are designed and coded in many different ways. What this means is that Google and co need to be able to search all websites as thoroughly as possible, regardless of how they’re built. And that’s exactly what they do. I’ve seen some horrendously-built sites rank very well for their chosen keywords, because the information contained within all those table cells is actually very good. On the flip side, I’ve seen very well-coded sites fail miserably, despite following all the rules and tips set out by SEO experts. I know there is more to good SERPs than the quality of your code, but as far as things you can control go, that’s all SEO is about – the code (and the structure of your site).

The things that I believe don’t make a difference are the mundane and developer-controlled things. SEO experts tell us not to have more than one <h1> element on our page. Personally, I don’t think it matters, as long as you don’t overdo them. All the heading elements (<h1> to <h6>) do is emphasize how important certain headings are in relation to other headings, and so if you have more than one heading that is very important in relation to the other headings, why not use the <h1> element to represent them all? It shows that there are multiple sections of content that have equal importance on that page, and that they are more important than all of the other sections (hence using <h1> and not <h3> for example).
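To illustrate (a contrived fragment, not from any real site): a page with two equally important top-level sections might quite legitimately look like this:

```html
<!-- Two sections of equal, top-level importance on the same page -->
<h1>Gardening News</h1>
<p>The latest from the world of gardening.</p>

<h1>Gardening Tips</h1>
<p>Advice for looking after your own garden.</p>

<!-- ...with less important sub-sections marked further down the scale -->
<h2>Pruning Roses</h2>
<p>How and when to prune.</p>
```

The heading levels here still describe the relative importance of the sections accurately, which is all the elements were ever meant to do.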

The things that I do believe make a difference to a site’s SERPs are the aspects of SEO that cross over into the world of usability and accessibility. My personal belief is that your primary concern as a developer is to make your website work well for as many different users as is humanly possible. That is, unless you have a specific set of users, with known hardware and software configurations. There are a number of things that usability experts say you should do, that will also help with your search engine results.

For example, some sites use a lot of JavaScript and AJAX, but usability experts tell us that we should provide the same functionality in a way that doesn’t rely on client-side scripting. SEO experts would tell us to do the same thing (or, at least, not to use too much scripting at all) because it helps the search engine bots crawl our websites. That may be true, I don’t know, but I’d like to believe that if we’re rewarded with better SERP rankings for making our sites degrade gracefully, it’s because we’re caring about our users, and not because it makes the Google crawler’s life a bit easier. By making our sites degrade gracefully, we’re ensuring that even users who have JavaScript disabled, or who don’t use a script-enabled browser, can use our site effectively. That’s what should help the SERP rankings.
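In practice, “degrading gracefully” just means starting with something that works everywhere and layering the scripting on top. A minimal sketch (the page and the `loadGalleryInline` function are hypothetical):

```html
<!-- Works without JavaScript: a normal link to a normal page. -->
<a id="gallery-link" href="/gallery.html">View the gallery</a>

<script type="text/javascript">
  // If scripting IS available, upgrade the link to load the gallery
  // in-page instead. loadGalleryInline() is a made-up function standing
  // in for whatever AJAX-y behaviour the site wants.
  document.getElementById('gallery-link').onclick = function () {
    loadGalleryInline('/gallery.html');
    return false; // cancel the normal navigation
  };
</script>
```

Script-less visitors (and crawlers) follow the plain link; everyone else gets the fancy version. Nobody is locked out.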

The same could be said for separating CSS and JavaScript from the HTML. That’s a good idea because the cleaner the code, the easier it is to write and maintain, and the quicker the browser is able to parse it. And parsing it more quickly means a better experience for the users – and that’s what usability experts want. Naturally, SEO experts would say the same thing, but tout it as advantageous for SEO purposes as well. They say that, again, cleaner code makes the content easier to read, and so a nice cleanly-coded page will rank better in the SERPs. Again, though, I’d like to think that a higher ranking in the SERPs isn’t because we’ve provided easy code for the crawlers to read, but because we care enough about our users to give them a lightning-fast website to use.
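Concretely, all that separation means is markup like this (the file names are hypothetical):

```html
<!-- In the <head>: presentation and behaviour live in their own files,
     which the browser can cache across pages. -->
<link rel="stylesheet" type="text/css" href="/css/site.css" />
<script type="text/javascript" src="/js/site.js"></script>

<!-- In the <body>: no style="" or onclick="" sprinkled through
     the content - just the content itself. -->
<h2>Growing Tomatoes</h2>
<p>Plant them somewhere sunny and water them often.</p>
```

The payoff is smaller pages and cached assets for your users; if it happens to please a crawler too, that’s a side effect.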

To be honest, there are a lot of crossovers between accessibility/usability practices and SEO, but I personally think that where the crossovers lie, any advantages you get in terms of SERP rankings are because of the lengths you’ve gone to for good accessibility and usability, and not because those practices are known to help your ranking in the SERPs. A good placement in a results page should be the reward for offering good content in a way that means the majority of people can access it, and in a fast and reliable way. But people seem to be looking down the wrong end of the telescope – they practice these ideas because it gives them a good placement in the SERPs, and not because they see it as advantageous for their users.

“What’s the difference?”, I hear you ask. In some cases, none. In some cases, it’s a matter of doing the right thing for the wrong reasons, and the end result is a usable site that ranks well in the SERPs. But in other cases, the SEO modifications made to a site might not benefit the user at all – and it’s those techniques that I personally believe make no difference. Take the following URL as an example:

http://www.example.com/?page=172
An SEO expert might come along and turn that URL into this:

http://example.com/gardening-tips-for-the-spring-and-summer
They’d justify that by saying that Google and co like to see some keywords in the URL, and hence will rank the latter address above the former in the SERPs. But how does changing the URL in that way help the user? It doesn’t. Firstly, people naturally put ‘www’ at the beginning of a domain name, yet the SEO “expert” has removed it. In 90% of cases that might not make any difference, but some servers are set up so that a domain name with ‘www’ at the beginning is actually different from one without it, and so typing the ‘www’ at the start would result in a 404. But more importantly, the second URL is twice the length of the first, and a lot harder to remember. The whole reason that people use words in web addresses rather than IP addresses is that they’re supposed to be easier to remember, yet the latter address is a nightmare for your Average Joe to remember.
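Incidentally, that ‘www’ problem is better fixed on the server than by fiddling with the URL at all. On Apache, for instance, a sketch like this (hostname and details are placeholders; your setup will vary) answers to both forms of the domain and redirects one to the other, so neither gives an error:

```apache
<VirtualHost *:80>
    ServerName www.example.com
    ServerAlias example.com

    # Redirect bare-domain requests to the canonical 'www' form,
    # so typing either address reaches the same site.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ http://www.example.com$1 [R=301,L]
</VirtualHost>
```

That way the choice of ‘www’ or no ‘www’ becomes a cosmetic one, and no user ever hits a dead address because of it.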

So that is an example of a typical alteration that an SEO expert might make, but which I personally think actually makes things worse. In my opinion, the easiest way to determine whether a particular SEO technique will work is to figure out what advantage it offers to Average Joe Commoner. If there isn’t any, it won’t help your SERPs either. So cramming thousands of sex-related keywords into the metatags of your gardening website, putting the entire English dictionary into the ‘alt’ attributes of your images, or creating monstrous URLs in the hope that Google will spot some of your keywords, will all prove to be rather fruitless exercises.

On the other hand, keeping your content clean, accurate, and timely; using people-friendly URLs; having nice descriptive ‘alt’ attributes; laying off the user-unfriendly JavaScript; making your site degrade gracefully; and being as efficient as possible with your code and structure will all help you. That’s not to say that SEO experts wouldn’t tell you to do all that already, because they would – but for the wrong reasons. I only urge you to remember why you’re doing these things, and it shouldn’t be because you’re expecting much higher SERP rankings.

In the same way that “looking after the pennies means the pounds will look after themselves”, by taking care of your users first, the SEO will take care of itself.

Oh, and happy Valentine’s Day!
