Most of us are ardent followers of client-side traffic analytics tools like Google Analytics or Sitemeter, while some people still use old-style server-side options such as Webalizer or AWStats. No matter which tool you use, qualitatively analyzing the real traffic to your site or blog remains an unsolved puzzle; even Google Analytics does not get it quite right!

Problems with server side traffic tracking

There are several concerns with server-side tracking of your site's traffic. First of all, these tools record all types of 'visits' and 'hits' to the site, including bot visits, pings, etc. Essentially, all they do is record the raw data, which is then aggregated into a report.

Secondly, server-side tracking eats up some of your server resources: CPU time and megabytes of disk space used to store the tracking data. This can cause a marginal performance hit in terms of site response as well.

Thirdly, and most importantly, deducing valuable information from those megabytes is not an easy job. AWStats might show your site's daily page hits as 2,000, whereas in the eyes of Google Analytics it may be just 200!
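To see why raw server-side numbers overshoot so badly, here is a small sketch (with made-up, Apache-style log lines) of how many "hits" survive once you drop bots and asset requests:

```javascript
// Hypothetical access-log lines: real page views mixed with bot hits
// and asset requests, all of which a raw server-side counter records.
const logLines = [
  '1.2.3.4 - - [10/Oct/2010:13:55:36] "GET /post/analytics HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
  '66.249.66.1 - - [10/Oct/2010:13:55:40] "GET /post/analytics HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
  '1.2.3.4 - - [10/Oct/2010:13:55:41] "GET /style.css HTTP/1.1" 200 800 "-" "Mozilla/5.0"',
  '5.6.7.8 - - [10/Oct/2010:13:56:02] "GET /feed HTTP/1.1" 200 2048 "-" "FeedFetcher"',
];

// Naive "hit" count: every single line.
const rawHits = logLines.length;

// Filtered count: drop known bot user agents and non-page assets.
const botPattern = /bot|crawler|spider|fetcher/i;
const assetPattern = /\.(css|js|png|jpg|gif|ico)\s/i;
const pageViews = logLines.filter(
  (line) => !botPattern.test(line) && !assetPattern.test(line)
).length;

console.log(rawHits, pageViews); // 4 raw hits, but only 1 real page view
```

The bot and asset patterns here are illustrative; real log analyzers maintain much longer lists, and how aggressively they filter is exactly why two tools report such different numbers for the same log.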

Client side tracking (Google Analytics, Sitemeter, etc.)

Client-side, JavaScript-based traffic tracking solves a couple of the above-mentioned problems for you. Because the script runs on the client (the browser), there is no load on your server and hence no performance issue. Since the tracking code is installed in the footer (the right approach), there are no rendering delays either. Further, these tools track actual content access, so the traffic analytics are more realistic. On the flip side, client-side tracking misses the very small percentage of visitors who have not enabled JavaScript in their browsers.
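The mechanics are simple: the footer script gathers the page details and fires a request to the tracker's server, typically by loading a 1x1 image. A minimal sketch (the endpoint and parameter names are hypothetical, not Google Analytics' actual API):

```javascript
// Build the URL for a tracking beacon. The endpoint and the short
// parameter names (p, r, t) are made up for illustration.
function buildTrackingUrl(endpoint, page, referrer) {
  const params = new URLSearchParams({
    p: page,                  // page being viewed
    r: referrer || 'direct',  // where the visitor came from
    t: '1286718936',          // timestamp (fixed here for clarity)
  });
  return endpoint + '?' + params.toString();
}

const url = buildTrackingUrl(
  'https://tracker.example.com/collect',
  '/my-post',
  'https://www.google.com/search'
);
console.log(url);

// In a browser, the footer script would then fire the beacon:
//   new Image().src = url;
// The image request records the visit server-side without blocking
// the page render, which is why footer placement adds no delay.
```

If the visitor closes the tab before this script runs, the visit is never recorded, which is where the tracking gaps discussed below come from.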

Client side tools still not perfect

We still have bigger issues with client-side tracking tools. Most of us use Google Analytics and largely trust whatever Google shows. I do believe what I see when it comes to direct visits and referring sites. However, I have a problem with what I see for search engine visits.

If you carefully analyze your Google Analytics search visits, you will see that a big percentage of them have an average time on site of 0 seconds. This can be attributed to a quick exit before the whole tracking code executes. [Edited: But my strong suspicion is that Google also includes instances where your page merely appears on the SERPs as a result of a keyword search. In other words, it may be recording a visit even without a click-through!]

To verify the above theory, take a look at your Google Webmaster Tools => Top search queries page. Check the search data for the past seven days for all search types and see the difference between 'impressions' and 'click-throughs'! Are your Google Analytics search visit numbers inflated?

What would be your quality traffic?

In my view, your real quality page views are those with an 'average time on site' of more than zero seconds. If you run through your Google Analytics content tab, you will see that this is hardly 50% or 60% of your reported page views!
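The quality metric above boils down to a simple filter. A sketch with hypothetical per-visit time-on-site values:

```javascript
// Hypothetical time-on-site values (seconds) for ten recorded visits.
const visitSeconds = [0, 45, 0, 120, 8, 0, 0, 300, 15, 0];

const totalViews = visitSeconds.length;

// "Quality" page views: only visits that stayed longer than zero seconds.
const qualityViews = visitSeconds.filter((s) => s > 0).length;
const qualityShare = Math.round((qualityViews / totalViews) * 100);

console.log(qualityViews, 'of', totalViews, '=', qualityShare + '%'); // 5 of 10 = 50%
```

With this sample data, half the recorded visits would be discarded as zero-second entries, which matches the 50–60% figure I see in my own reports.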

Any takers?

Tail Piece: I would assume Sitemeter probably provides more realistic data, but I could not cross-check it as I have disabled it since the theme change. Please let me know your thoughts on this topic.
