By Barry | February 10, 2011
This week someone sent me a SEO site audit report written by a UK online marketing company for a big UK retail brand. I enjoy taking a peek in the competition’s kitchen now and then, so I dug in straight away.
Seven pages into this 66-page report, I realised that these guys weren’t really competition. The report was so chock full of disinformation, ignorance, and random stupidity that any SEO work they end up doing is likely to be of pretty abysmal quality.
But I’m still bloody annoyed. Annoyed that witless chimps like that get to write site audits for big UK brands. It’s testimony to the FUD surrounding SEO – as well as the general ignorance about SEO pervasive in the corporate world – that clueless halfwits manage to talk big corporate management types into buying their services.
A Look Inside a Useless SEO Audit
So what exactly did this report contain? Let’s take a look:
Fairly early in the document, in the context of competitive analysis, it says that the number of results Google claims to have found for a given query is an indicator of how competitive that query is.
Really? It’s a pretty piss-poor indicator, if you ask me. Sure, you can use it as one aspect of broader competitive analysis, but the report does no such thing – it mentions this metric right at the start of its competitive analysis section, and doesn’t do much to put it in its proper context. Which is pretty fucking stupid.
Secondly, the report claims, straight-faced, that the links shown in Google Webmaster Tools show "the full range of backward links that Google has indexed". At which point I loudly proclaimed ‘GTFO’, to the amusement of my colleagues.
It gets better. Shortly after that proclamation, the report has the audacity to refer to "Google Page Rank" (yes, that’s how they typed it), a.k.a. toolbar PageRank, as a valid indicator of a site’s quality. Oh dear.
They go on to blow shotgun-sized holes in their own feet by referring to ‘inanchor:’ and ‘intitle:’ search queries, implicitly assuming that the bullshit Google serves you on such queries is actually accurate:
"The fewer results that are returned for an ‘inanchor:’ search, the easier it should be to achieve high rankings for the keyword being analysed".
And it goes on and on… using the ‘site:’ command to measure indexation, recommending using the ‘title’ attribute on all links to improve SEO, using the ‘link:’ command as the primary method of finding sites that link to competitors, and recommending the site’s HTML code should be W3C valid. (They see the 20 validation errors on the site’s homepage as a strong indicator the code needs to be improved. Fuck off. 20 errors is about half of what I consider acceptable for large ecommerce sites like that one.)
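For readers who haven’t played with these operators themselves, here’s a quick sketch – purely illustrative, with an invented keyword and domain – of the query strings the report builds its entire analysis on:

```python
def build_queries(keyword: str, domain: str) -> dict[str, str]:
    """Assemble the Google advanced-search queries the report relies on.

    The operators are real; the keyword and domain below are made up.
    """
    return {
        # Report's claim: fewer inanchor: results = easier rankings (dubious)
        "inanchor": f'inanchor:"{keyword}"',
        # Same shaky logic applied to page titles
        "intitle": f'intitle:"{keyword}"',
        # Report's claim: the site: result count measures indexation
        "site": f"site:{domain}",
        # Report's primary backlink method, despite link: showing only a sample
        "link": f"link:{domain}",
    }

queries = build_queries("red widgets", "example.co.uk")
for name, q in queries.items():
    print(f"{name}: {q}")
```

Every one of those queries returns a result count that Google itself treats as a rough estimate – which is exactly why hanging a competitive analysis off those numbers, without any caveats, is so shaky.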
Where’s the beef?
It’s not all bullshit, of course: the report contains some genuinely useful tips. But not only is it mired in horrendous beginner errors, it also features some glaring omissions: nothing at all about site structure and semantic tagging – which for the site in question has huge potential – and nothing about the site’s dismal URLs either.
This sort of shit pisses me off to no end. It infuriates me that such uneducated ‘tards successfully manage to peddle their quackery to big companies. It’s not just the fact that they get paid to write uninformed shite like that – it’s the fact that they probably get away with it, time and again, because their clients simply don’t know any better.
The people who wrote that report probably consider themselves to be pretty clever, cutting-edge SEOs. And that, most of all, ticks me off: seeing the Dunning-Kruger effect in action.