The web is the only marketing-communications channel that allows us to track practically everything a user does, which, in turn, allows us to learn more about our users than anyone ever thought possible. So, it’s no surprise that web marketers continually ask, “How’s our traffic?” As traffic at a site grows, so does the complexity of producing traffic reports. However, a growing crop of hosted services now makes traffic easier to track.
Tracking web users is one of those good-news/bad-news situations. The good news is that we can capture a tremendous amount of data about what users see and do. The bad news is that we have to sift through all that data to get actionable information. Of course, more data doesn’t necessarily mean more knowledge, but it always means more data to process.
When web pages carried only a few graphics, log files were very manageable. As designers added more graphical elements to control precisely how pages look, such as invisible “spacer” graphics, log files grew larger very quickly. Sites that had been downloading logs to a local PC for reporting found the files too large to transfer in a reasonable amount of time. The alternative was a server-based log-analysis program that could create tables and impressive graphs, but it placed an extra processing load on the servers.
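To see why image-heavy pages inflate logs, consider that each graphic a browser fetches produces its own log line. The sketch below parses one such line in the NCSA Common Log Format, a typical layout for the era; the sample line itself is invented.

```python
# Parse one Common Log Format record; every spacer graphic on a page
# generates another line like this, which is why logs grow so quickly.
import re

LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+)'
)

line = '192.0.2.1 - - [10/Oct/2000:13:55:36 -0700] "GET /spacer.gif HTTP/1.0" 200 43'
hit = LOG_PATTERN.match(line).groupdict()
# hit now maps field names ("host", "time", "request", ...) to values,
# ready for tabulation by a log-analysis program.
```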
This meant that system administrators needed to develop data-management strategies just for managing and processing log files.
If you’ve looked for a way to avoid managing large log files and the heavy processing power they require, take a look at the current crop of hosted traffic-analysis services, such as IBM’s SurfAid Analytics, HitBox Enterprise, and WebTrends Live.
These systems work much like serving a banner ad. Code placed on each page to be tracked tells the central server when the page was served, plus all of the other information that appears in a standard server log. Then, the server tabulates the data and produces attractive reports showing a variety of measurements.
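In essence, the page tag requests a tiny image from the analytics server and passes the page data along in the request URL. The sketch below shows roughly what such a beacon URL might contain; the host name, path, and parameter names are all invented for illustration.

```python
# Hypothetical sketch of the data a hosted-analytics page tag sends.
from urllib.parse import urlencode

def beacon_url(page, referrer, browser, account="ACCT-1234"):
    """Build the tracking-image URL a page tag would request."""
    params = {
        "acct": account,   # which customer's report to credit (invented name)
        "page": page,      # the page being viewed
        "ref": referrer,   # where the visitor came from
        "ua": browser,     # browser software used
    }
    return "https://stats.example.com/hit.gif?" + urlencode(params)

url = beacon_url("/products/index.html",
                 "https://search.example.com/?q=widgets",
                 "Mozilla/4.0")
```

The central server logs each such request as it arrives, giving it the same raw material a standard server log would contain, without the site having to store or process anything itself.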
Third-party traffic analysis began a few years ago, when page counters were first offered to small businesses that couldn’t afford traditional log-analysis programs. Today, however, enterprise-quality services deliver benefits that match and sometimes surpass those of stand-alone traffic products.
Beyond identifying how many times each page was served and when, hosted solutions typically produce the same tabulations as stand-alone software, providing insight into a site’s visitors, such as:
- Search-engine keywords
- Entry/exit pages
- Path taken through site
- Repeat-visitor analysis
- Browser software used
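Reports like entry/exit pages fall out of the tag data fairly directly: group the hits into per-visitor page sequences and count the first and last page of each. A minimal sketch, with invented sample sessions:

```python
# Tabulate entry and exit pages from per-visitor page sequences.
from collections import Counter

sessions = {
    "visitor-a": ["/home", "/products", "/checkout"],
    "visitor-b": ["/products", "/home"],
    "visitor-c": ["/home", "/contact"],
}

entries = Counter(path[0] for path in sessions.values())   # first page seen
exits = Counter(path[-1] for path in sessions.values())    # last page seen
# entries -> {"/home": 2, "/products": 1}
```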
Some hosted traffic services store traffic data in a database, which lets you drill down to finer-grained detail. Another advantage of a database-driven system is that it allows you to view the data from multiple perspectives, sometimes called “spinning” the data, as if it were a three-dimensional cube. And many of the third-party traffic-analysis sites produce reports in real time, so you don’t have to wait for reports to be run overnight. That makes them great for monitoring what visitors are doing while a promotion or event is in progress.
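The “spinning” idea is just aggregation along different dimensions of the same records. A simplified sketch, with invented sample hits:

```python
# Store each hit with several dimensions, then total along whichever
# axis you want to view, much like rotating a data cube.
from collections import Counter

hits = [
    {"page": "/home", "browser": "Netscape", "day": "Mon"},
    {"page": "/home", "browser": "IE", "day": "Mon"},
    {"page": "/products", "browser": "IE", "day": "Tue"},
]

def spin(hits, dimension):
    """Collapse the cube onto one dimension: page, browser, or day."""
    return Counter(h[dimension] for h in hits)

by_page = spin(hits, "page")        # one perspective on the data...
by_browser = spin(hits, "browser")  # ...re-spun along another axis
```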
For web operations that are already stretched for technical resources, these hosted services eliminate the need for:
- Technical staff to manage log data and tabulations
- Server-processing resources
- Special hardware or purchased software
These services also provide third-party audited reports that give advertisers verified traffic figures, let you export data for processing on a desktop PC, and offer other benefits usually associated with stand-alone traffic products.
There are a few downsides to consider, including price. As your traffic goes up, so does what you pay the traffic-analysis firm. However, when you consider the total cost of producing timely traffic reports, these services can make good sense for managing real-time web marketing.