Analytics Apples And Oranges: Switching Web Analytics Tools Without Getting Fired
Ian Lurie Jul 8 2008
I did WebTrends a disservice here by failing to point out that my client’s creaky WebTrends installation was a very old, purely log-file-based one. Newer versions of WebTrends are every bit as versatile and accurate as their competitors.
If you switch from one web analytics package to another, be ready to make some adjustments in your metrics goals.
Web Analytics Q & A
How many people here used WebTrends or something similar up until a few years ago?
How many switched to, say, Google Analytics, Omniture or Urchin 6?
OK, last question: How many of you nearly got fired when your visitors, pageviews or other metrics inexplicably plunged?
Chances are you aren’t suffering from bad statistics. You’re a victim of metrics madness: The lack of standards in site traffic measurement.
The Horror of Metric Madness
Late last year we launched a shiny new web site for a client. It’s beautiful, if I may say so.
As part of that launch, we switched them from a creaky, old, log-file-only WebTrends installation to Google Analytics (note my comment at the beginning – WebTrends is great. Just don’t use a 5+ year-old version).
Four weeks ago I got a wake-up call. Pageviews and unique visits to the site had dropped 10% over the previous year.
December 2006: 300,000 pageviews
December 2007: 270,000 pageviews
(numbers changed to protect the innocent, percentages the same)
For this client that’s pretty grim: They’d seen steady growth (thanks in part to SEO) before 2008.
Those Missing Pageviews
So, where did those pageviews go? And come to mention it, why didn’t they go up, instead of down? Have I lost my touch?
Nope.
We’d switched the client from WebTrends, a tool that measured traffic using log files and nothing else, to Google Analytics, which measures traffic using a fancy bit of javascript.
The Difference Between Log and Javascript Tracking
Warning: Geeky stuff ahead. If your eyes roll back in your head at words like ‘log file analysis’, skip ahead to ‘Web Analytics Package Conversion Factors’.
The WebTrends installation my client had been using relied on server logs, and nothing else, to count visits and pageviews. It counted any page load by any visitor as a pageview, even if the visitor started to view the page and then clicked away before a full pageload.
Google Analytics, on the other hand, waits until the entire page loads. The javascript that Google Analytics uses to count a pageview loads at the bottom of the page. If the whole page doesn’t load then Google won’t count it. Visiting search engine spiders and other non-javascript browsers don’t execute the javascript at all, so those pageviews never get counted.
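For reference, this is roughly what the standard ga.js snippet looked like at the time, pasted just above the closing </body> tag (the UA-XXXXXXX-1 account ID below is a placeholder). If the browser never reaches this point in the page, the script never runs and the pageview never gets counted.

```html
<!-- Classic Google Analytics (ga.js) tag, placed just before </body> -->
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
var pageTracker = _gat._getTracker("UA-XXXXXXX-1"); /* placeholder account ID */
pageTracker._trackPageview(); /* this only fires if the page loads this far */
</script>
```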
Third-party tools like Compete.com further reduce pageviews and visits because they’re not measuring 100% of traffic to your site. They get a random sampling based on either traffic through your ISP or toolbar installations (like Alexa.com).
All of this can cause a perceived but artificial drop of as much as 50% in both pageviews and visitors. It’s often less, but it depends on the original configuration. Here are some rough percentages based on my experience:
Web Analytics Package Conversion Factors
These numbers are from sites I’ve worked on, not the entire internet. While I’ve been around the block more times than I care to count, take ’em with a grain of salt.
From Urchin using UTM.js to Google Analytics: 20% drop
From Log file-only WebTrends or Urchin to Google Analytics: 30-50% drop
From Webalizer to Google Analytics: 20-30% drop
From Google Analytics to Compete.com: 20-60% drop
3rd-party sites like Compete.com are tougher: A busier, more popular web site may see only a small drop. A smaller, less popular one may see a very large drop. A smaller initial audience leads to a smaller sample and a larger drop.
Generally, you’ll see an apparent drop in pageviews and visits whenever you switch from a logfile-based analytics tool to a javascript ‘bug’ based tool like Google Analytics. And you’ll see another drop when you compare javascript-based analytics to a 3rd party toolbar or internet service provider-based tool like Compete.com.
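To put some numbers on that, here’s a quick back-of-the-envelope sketch using the illustrative figures from earlier in this post. The 30% factor is just an assumption pulled from the rough ranges above, not a real measurement.

```javascript
// Hypothetical sketch: restate an old, log-file-based baseline so it can be
// compared fairly against new javascript-based (Google Analytics) numbers.

var oldLogFilePageviews = 300000;  // December 2006, log-file-only WebTrends
var newGaPageviews      = 270000;  // December 2007, Google Analytics
var expectedDrop        = 0.30;    // assumed 30% drop for log-only -> GA

// What the old baseline would look like if it had been measured the new way.
var adjustedBaseline = oldLogFilePageviews * (1 - expectedDrop); // 210,000

var realChange = (newGaPageviews - adjustedBaseline) / adjustedBaseline;
console.log("Apples-to-apples change: " + Math.round(realChange * 100) + "%");
// Prints roughly 29%: traffic likely grew, it didn't shrink.
```

Measured apples-to-apples, that scary ‘10% drop’ can turn out to be healthy growth.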
Two Tests To Save Your Job
So, your boss or client is really pissed. It looks like you’ve cost them a big chunk of traffic. Your life, or at least your career, is about to come to a messy end. There’s plastic spread on the boss’s floor.
Here’s how you save yourself:
Use Compete.com
Go to Compete.com. Look at the trend over the time period in question.
For example, let’s say I moved my blog from Urchin to Google Analytics January 1, 2008. My own data shows a drop in visits:
These numbers are made up, by the way.
I take a deep breath, go to Compete and look at the same period:
Ah HAH! My pageviews didn’t drop. I’m just seeing the effects of a switchover from log- to javascript-based tracking.
Run a Logfile Tool On Current Data
If you have access to the log files on your web site, why not point a log file analysis tool like Urchin or WebTrends at the logs? Then you can compare your visitors and pageviews across multiple analytics packages, figure out the difference and adjust accordingly.
Chances are your web hosting provider already has some form of basic log file analysis running on your server. It might be pretty crude, but you only need the most basic stats: Visits and page views.
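If you’d rather eyeball it yourself, a few lines of script will give you a rough log-style pageview count to hold up against your javascript-based numbers. This is only a sketch, assuming Apache/NCSA ‘combined’ format logs; the sample lines and the file-type filter are made up for illustration. Note that, just like an old log-only setup, it happily counts spiders and visitors who bail before the page finishes loading, which is exactly why the raw number runs higher.

```javascript
// Rough sketch: count pageviews straight from Apache/NCSA "combined" log
// lines, roughly the way a log-file tool would.
function countPageviews(logLines) {
  var pageviews = 0;
  for (var i = 0; i < logLines.length; i++) {
    // Pull the requested path and the HTTP status code out of each line.
    var match = logLines[i].match(/"GET ([^ ]+) HTTP[^"]*" (\d{3})/);
    if (!match) { continue; }
    var path = match[1];
    var status = parseInt(match[2], 10);
    // Count successful, page-like requests; skip images, stylesheets, scripts.
    if (status === 200 && !/\.(gif|jpe?g|png|css|js|ico)(\?|$)/i.test(path)) {
      pageviews++;
    }
  }
  return pageviews;
}

// Example: two requests, but only the HTML page counts as a pageview.
var sample = [
  '1.2.3.4 - - [01/Dec/2007:10:00:00 -0800] "GET /products.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
  '1.2.3.4 - - [01/Dec/2007:10:00:01 -0800] "GET /images/logo.gif HTTP/1.1" 200 1024 "-" "Mozilla/5.0"'
];
console.log(countPageviews(sample)); // 1
```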
Don’t Panic
Different analytics packages measure traffic in different ways. There is no central standard for web traffic analysis.
So, if you move from one web analytics tool to another, and your numbers plunge, do a little double-checking before you panic.

Ian Lurie
CEO & Founder
Ian Lurie is CEO and founder of Portent Inc. He's recorded training for Lynda.com, writes regularly for the Portent Blog and has been published on AllThingsD, Forbes.com and TechCrunch. Ian speaks at conferences around the world, including SearchLove, MozCon, SIC and ad:Tech. Follow him on Twitter at portentint. He also just published a book about strategy for services businesses: One Trick Ponies Get Shot, available on Kindle.
Comments
Great post, Ian. I definitely learned the pains of trying to explain why traffic “dropped” when switching from a log file analytics solution to a javascript solution.
Now how about the difference (or drop) when comparing 2 different javascript solutions? :)
@rexolio Oh, man, don’t even want to go there. We’d have to get Omniture and their ilk to turn over their algorithms. I think we’ll get Google to do it first…
Beautiful lively post. I’ll be referring a lot of people to it because it answers THE QUESTION so well.
I kinda wish you’d made it a bit more obvious (although you are explicit) that the important distinction is the type of raw data and not the brand name of the tool. As somebody who switched from WebTrends-Log-Analyzer to WebTrends-Tagged-Page-Analyzer a few years ago, I can say it’s truly a night and day difference … and the Tagged Page Analyzer’s reports are almost perfectly aligned with Google Analytics’ results.
I didn’t nearly get fired, but I nearly tore the head off the WebTrends tech support person … until he patiently got it into my newbie skull that I was in a whole new world.
Ian,
You do a nice job here of highlighting some of the challenges you’ll face when moving from one analytics solution to another.
I did want to point a few things out, though.
Firstly, WebTrends also has a JavaScript (tag) data collection solution, called the SmartSource Data Collector. Of course, we still support log-file analysis, too. We also support multiple sessionization methods (ranging from IP address matching to 1st and 3rd party cookies, and even authenticated user sessionizing), the configuration of which can impact visit and visitor numbers. We’re pretty flexible all around.
Secondly, if you are analyzing log-files instead of using the tag, you can filter out file-types that you don’t want included as page-views.
And lastly, it isn’t entirely true that JavaScript tagged pages are only collected as pageviews once the page fully loads. Where the tag is placed in the page code can have an impact on when the page view is recorded (i.e. when it starts to load or when it finishes loading). If you’re placing your tag at the bottom of the page, the scenario you outline is true as the tag will be the last bit of code parsed.
This said, there are good arguments for placing the tag at the top so that you collect the entirety of actual user behavior on the site. Especially on slow loading sites, it’s possible that the visitor finds what they’re looking for before the page loads and clicks out. You probably want to collect that page view, as it represents real user behavior on your site. And, your path analysis reports will look funny without it.
As I mentioned, nice post. You made it nice and simple to understand. Keep it up.
Best,
Aaron Gray
WebTrends
I am glad Chris points out that this isn’t a tool based issue per se, but rather a difference due to a tracking method change. As Chris briefly touches upon, WebTrends has the ability to track both server logs as well as logs generated via JavaScript tags. The SmartSource Data Collector is provided to software clients for collecting their own JavaScript generated data and the SAAS WebTrends OnDemand clients all utilize the JavaScript tagging methodology via requests to the WebTrends global data collection centers.
@Aaron good point – I did a terrible job describing the client’s old WebTrends installation. It was VERY old. See above for edits.
This is really important stuff. I work for a newspaper company and we went through the same thing last year when we switched from Urchin 5 (no UTM) to Google Analytics. The drop was even more pronounced than what you’ve laid out here.
As you can imagine, explaining to managers (not to mention advertisers) why our numbers were suddenly slashed was a challenge.
Hopefully, one day a standard is established, because apparently virtually no one’s numbers are 100% accurate.
Hi,
We are looking for a good analytics software for our business portal.
We were looking at using Urchin 6, as we did not want to go with WebTrends, which is very pricey for serving a group of sites with 50M impressions.
The problem is that when one month of access logs for a site was fed into the default configurations of WebTrends and Urchin, there was a huge difference in visits. Urchin reported almost 25% fewer visits than WebTrends did.
Pageviews were around 7% lower.
Can someone explain how this could happen, and also suggest whether WebTrends is really better than Urchin?
Thanks
Hitendra
@Hitendra I need to know if WebTrends is using log file tracking or log file tracking plus javascript tracking…