Google 2009: The Painful Details

Ian Lurie Dec 30 2008

OK, a few of you are gluttons for punishment. You asked me to give you some details as to how I researched and wrote my SEO 2009: Adapt or Die piece. Here are the basics.

Read the Patents

I subscribe to a Google blog search for “Google Patent” and scour the news I read.
I also search the US Patent Office site for Google-related patents.
In this case, some time ago I’d spotted a 2005 patent application regarding information retrieval based on historical data. It generated a lot of interest a while back because it showed that Google was starting to use domain age as an indicator of quality. What really got my spidey-sense tingling, though, was this bit of the patent:

36. The method of claim 1, wherein the one or more types of history data includes information relating to user behavior associated with documents; and wherein the generating a score includes: determining user behavior associated with the document, and scoring the document based, at least in part, on the user behavior associated with the document.
37. The method of claim 36, wherein the user behavior relates to at least one of a number of times that the document is selected within a set of search results and an amount of time that one or more users spend accessing the document…
…47. The method of claim 45, wherein the scoring the document includes: analyzing the user maintained or generated data over time to identify at least one of trends to add or remove the document, a rate at which the document is added to or removed from the user maintained or generated data, and whether the document is added to, deleted from, or accessed through the user maintained or generated data, and scoring the document based, at least in part, on a result of the analyzing.

In English, this means Google’s going to track which results searchers click and how much time they spend on each document they reach from the search results. It also means Google is looking at bookmarking, social voting, SearchWiki, Google Notebook and who-knows-what-else to get a clue as to which web pages matter and which don’t.
That’s a profound shift from the days when links, site structure and keyword density were all that mattered.
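To make that concrete, here’s a minimal sketch in JavaScript of how a behavioral signal like the one in claims 36 and 37 might feed into a ranking score. Every name, weight and cap below is my own invention for illustration; the patent specifies none of them.

    // Hypothetical illustration of claims 36-37: blend how often a result
    // gets clicked with how long searchers spend on it. All names, weights
    // and caps are invented for this sketch.
    function behaviorScore(doc) {
      // Click-through rate: selections divided by times shown in results.
      var ctr = doc.impressions > 0 ? doc.selections / doc.impressions : 0;
      // Dwell time, capped at 5 minutes so one long visit can't dominate.
      var dwell = Math.min(doc.avgSecondsOnPage, 300) / 300;
      return 0.5 * ctr + 0.5 * dwell; // 0..1 behavior signal
    }

    // A final score might mix old-school content relevance with behavior.
    function finalScore(contentScore, doc) {
      return 0.7 * contentScore + 0.3 * behaviorScore(doc);
    }

    var page = { selections: 40, impressions: 1000, avgSecondsOnPage: 130 };
    console.log(finalScore(0.6, page)); // longer dwell -> higher score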
So, the patents seemed to point to a shift towards behavioral ranking. But I graduated from law school with a B- average, so I never trust my interpretation of legal documents. I need a bit more.

Test My Assumptions

This part isn’t exact. Marketing never is. Sorry.
I launched, nearly simultaneously, three test sites. I can’t tell you what they are because I don’t want to wreck my sites’ rankings – they’re legit sites. I just used them as experiments.
Site A was an application that let folks compare their car’s performance against others’. It’s nearly uncrawlable, except for a few generic information pages.
Site B was just plain silly. It’s only one page.
Site C was a pure keyword-sniping site. A blog built for one purpose: To get a top ranking for a juicy key phrase. But it’s loaded with great content.
All three sites targeted keywords with nearly-equal competition, both in number of competing pages and apparent level of optimization.
Site C jumped up in the rankings within a few weeks. It was super keyword-relevant. It attracted links, too. But average time on site for my target keyword was under 1 minute, and the bounce rate was over 80%. Site C yo-yos up and down in the rankings and has yet to stabilize, despite my continued writing.
Site B got a ton of links and mentions and generated some buzz. But it had almost zero content. For months it didn’t rank. But it did average over 2 minutes time on site for my target keyword, and a very low bounce rate of 30% for that keyword. After a couple of months it gained a top 3 ranking and has stayed there ever since. I haven’t updated the site since.
Site A got lots of traffic for a short time, and has since tailed off to almost nothing. It has, however, maintained a time on site for my target keyword of over 2 minutes, and a low 30% bounce rate. It gained a high ranking very quickly and has stayed there. I’ve not touched the site since I launched it.

Which Site Won?

None of them won. They all finished about even, with decent rankings for target phrases and an equal percentage of available traffic.
Which made no sense at all. If you’re banking on the traditional hallmarks of good SEO, site C should’ve won: It had more content, the same link authority and the best keyword targeting.
How did a dinky little one-page site with almost no content (site B) and a few links keep up? The only ways it outperformed site C were bounce rate and time on site for my target keyword.
The same held true for site A. It looks awful in every way, except that time on site and bounce rate for my target keyword were nearly 3x better than for site C.

Conclusions

Such as they are:

  • Google is weighing site performance and user behavior for specific keywords. Sites with lower bounce rates and higher time on site after search (TOSAS) for a specific keyword gain leverage for that keyword, and have a shot at a high ranking in spite of less content and/or fewer links. That’s why sites A and B outperformed C for months (see the sketch after this list).
  • Content, while important, can’t sustain your high ranking in the absence of visitor behavior that proves site relevance. That’s why site C continues to struggle: People just don’t spend much time there. That doesn’t mean you should stop writing. It means you should write well.
  • Google’s been pondering user behavior as a ranking factor since at least 2005. Probably for longer. Their patent application proves that.
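Here’s the sketch mentioned above: a back-of-the-envelope comparison of the three test sites in JavaScript. The figures are rough stand-ins matching what I reported (over 2 minutes vs. under 1 minute on site, 30% vs. 80% bounce), and the “leverage” formula is mine, not Google’s.

    // Rough per-keyword numbers matching the experiment above; the exact
    // figures are illustrative stand-ins, not my actual analytics.
    var sites = [
      { name: "A", secondsOnSite: 130, bounceRate: 0.30 },
      { name: "B", secondsOnSite: 125, bounceRate: 0.30 },
      { name: "C", secondsOnSite: 55,  bounceRate: 0.80 }
    ];

    // A crude "leverage" number: more dwell and fewer bounces score higher.
    // The formula is invented; it just makes the gap visible.
    for (var i = 0; i < sites.length; i++) {
      var s = sites[i];
      var leverage = (s.secondsOnSite / 60) * (1 - s.bounceRate);
      console.log("Site " + s.name + ": " + leverage.toFixed(2));
    }
    // Prints roughly: Site A: 1.52, Site B: 1.46, Site C: 0.18.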

So, there you have it. My unscientific seat-of-the-pants test, plus patent analysis. Go take an aspirin and send me your comments in the morning.

Related Posts

SEO 2009: Adapt or Die
10 Years Later, It’s Still About the Content

PS: I wrote this after eating, in a 5 hour period: Smoked bacon with black-eyed peas, Wonton soup, edamame, shredded sesame beef and then a chocolate souffle. With 3 beers. All because my wife is a terrible influence. Attractive, smart and my better half, but a terrible, terrible influence. It’s amazing I’m still alive, much less writing. So be kind in your critique.


8 Comments

  1. Leo

    I know this may sound a little off topic, but you mentioned domain age as a weight for trust benchmarks. What is your view on purchasing a domain long term vs. re-upping every year, at least in the eyes of Google? Do you think that purchasing your domain for 10 years would help increase rankings (even marginally) because it shows that you are in it for the long haul?

  2. Erik

    This is a very interesting piece of information. Many of us already had the feeling that social behaviour is important to your ranking.
    But much more research is required to know the social drivers of your ranking.
    You mention bounce rate and time on site. But both are easy to manipulate, and therefore less attractive to Google.
    Bounce rate
    Bounce rate measures whether a person opens a second page. If somebody quickly opens three pages on your site and then leaves, he may not have found what he was looking for, yet the bounce rate is 0%.
    If a person opens the first page, finds what he was looking for and reads for the next two minutes, the bounce rate is 100%, but the visitor is happy.
    The problem above can be solved by calling Google Analytics’ setVar with JavaScript after some time, for example 60 seconds. Everybody paying attention to a page long enough will then not be counted as a bounce.
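    Something like this, with the classic ga.js tracker (a minimal sketch; the 60 seconds and the label are arbitrary choices):

        // Assumes the standard ga.js snippet has already created pageTracker.
        // After 60 seconds, fire _setVar; the extra hit counts as an
        // interaction, so an engaged reader no longer registers as a bounce.
        setTimeout(function () {
          pageTracker._setVar("engaged-reader");
        }, 60000);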
    Time on site
    The other measure you mention is time on site. Although this is in my opinion very valuable, I suspect it only measures correctly if you browse with one tab.
    I use a dual-screen setup with several browser windows, each with several tabs. I open interesting pages in a new tab and then read them in one go. This means a page can be open for a long time without me looking at it. Can Google measure which tab you are reading?
    Basically, there is a lot of murkiness here. Nevertheless, it is worth your time to look at these measures and try to improve your ranking by improving user experience. In the end, the golden key is user experience.

  3. Ian

    @Leo Definitely reserve for the long haul.

  4. Ian

    @Erik I agree, but remember there are many, many other factors they’re weighing too. I think the real intent here is to add more, better measures of site relevance and authority. That makes the algorithm more complex and harder to game. As I said in my first post on the topic, anything can be gamed. ANYTHING.

  5. Tom

    I am going to guess that site C is this one. And site A is your PHP hybrid mileage comparison site that you mentioned once before. Site B, no idea. Sorry about skewing your experimentation; I won’t be offended in the least if you delete or don’t post this comment.
    All of the google updates sound exciting to me, but I am not really in the SEO business. I really like to see giant companies that change major parts of their business and not become in-the-rut dinosaurs.

  6. Ian

    @Tom you’re spot-on about site A. Site C isn’t Conversation Marketing, though.

  7. @Ian – if you were to build your own search engine, how would you measure relevance of search results?
    Like you said, anything can be gamed. Even if bounce rate and time on site are measured together, there is cheap manpower available, and one might just spend some money to get a bunch of peeps to do what the SE thinks is good user behavior.
    Until the day we can really track eyeballs, I guess Google has to rely on such measures.
    P.S.: It’s kinda scary to think how people might game the SEs when we can really track eyeballs. Heh.

  8. I wonder how long it will take till Google buys StumbleUpon to use it for their rankings :o
