Picture the scene: a marketing manager spends tens of thousands on planning and executing a high-profile integrated campaign and then… does nothing. No reports on its success or failure, its visibility, its effect on company identity, or the demographics of the audience reached.
Sound incredible? Not exactly. When it comes to web sites, this scenario is played out time and again, with no knowledge of how the site is used by visitors, or if those visitors ever return.
Put simply, the more you know about the strengths and weaknesses of your site, the better it becomes. Just about every site has its “hit-counter,” recording the number of visits. But just because people hit the site doesn’t mean that they will stay there, or absorb any of the messages that it’s trying to convey. To make the most of your company’s web presence, and to justify investment in the web site, it’s necessary to dig deeper.
With in-depth analysis of the traffic across the site, one can track usage and generate explicit reports on how the site is being used by visitors. Understanding the online customer, and every aspect of a web visitor’s experience, is critical to ensuring the effectiveness of a site and that its goals are met. The critical e-business intelligence that underpins the effectiveness and performance of a web site includes:
- Visits to each page
- How many individual users have visited a site
- Visitor domains — Are visitors potential clients, partners or competitors?
- Average time that each page is looked at
- Whether site changes increase or decrease traffic
- Which advertising vehicles are most effective for a site
- Busiest times for a site
- Server load and performance statistics
- Paths that visitors take through a site
- How long it takes visitors to access specific pages
- Error reports seen when visitors access the site
- Web advertising effectiveness
- How many visits resulted in sales
- Which visits resulted in sales
- The behavioral profiles of various users — Is there a pattern of behavior for users who buy versus those who don’t?
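Several of the figures above fall out of a simple aggregation over parsed log records. As an illustrative sketch (the record fields and host names here are invented), visits per page, a rough count of individual users, and the busiest hour can each be computed with a counter:

```python
from collections import Counter

# Hypothetical pre-parsed log records: (host, page, hour of request)
records = [
    ("proxy.bigcorp.com", "/index.html", 9),
    ("dialup-42.isp.net", "/products/", 9),
    ("proxy.bigcorp.com", "/products/widget.html", 10),
    ("dialup-42.isp.net", "/index.html", 14),
]

page_visits = Counter(page for _, page, _ in records)    # visits to each page
unique_hosts = {host for host, _, _ in records}          # rough "individual users"
busiest_hour = Counter(hour for _, _, hour in records).most_common(1)[0][0]

print(page_visits["/index.html"])  # 2
print(len(unique_hosts))           # 2
print(busiest_hour)                # 9
```

Note that counting distinct hosts only approximates distinct users, for the proxy and dynamic-IP reasons discussed later in this article.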
Putting this information into use is easy. For example, a common path taken by visitors through a site is from the home page to the general product information page, to an individual product’s page. This can take four or five clicks on some sites, and only one or two on others — with real and obvious implications for the number of times people will revisit the site.
By making it easier for people to get the information they need, revisits are encouraged. This is a technique called ‘designing for analysis’ — that is, the web site should have a logical structure which enables you to track visitors meaningfully, and find what interests them, as they use a site.
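The click-depth point can be made concrete. Given a map of which pages link to which (the site structure below is hypothetical), a breadth-first search yields the minimum number of clicks from the home page to any product page:

```python
from collections import deque

# Hypothetical site structure: page -> pages it links to
links = {
    "/": ["/products/", "/about/"],
    "/products/": ["/products/widget.html"],
    "/about/": [],
    "/products/widget.html": [],
}

def clicks_to(start, target):
    """Breadth-first search: minimum number of clicks from start to target."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if page == target:
            return depth
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None  # target unreachable from start

print(clicks_to("/", "/products/widget.html"))  # 2
```

Running this over every product page flags the ones buried too many clicks deep, which is exactly where a redesign pays off.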
Another example of how analysis can deliver benefits is in the capture of Internet domains as visitors arrive. With these addresses, one can build a database of visitors, gaining critical e-business intelligence about a given web audience.
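Building such a database can start with nothing more than grouping logged host names by their registered domain. A rough sketch, using a naive two-label rule (real code would need a public-suffix list, and the host names here are made up):

```python
from collections import Counter

# Hypothetical host names captured from the access log
hosts = [
    "proxy.bigcorp.com",
    "dialup-42.isp.net",
    "research.bigcorp.com",
    "gateway.rivalco.com",
]

# Reduce each host name to its last two labels as a stand-in for the
# registered domain (naive; fails for country-code domains like .co.uk)
domains = Counter(".".join(h.split(".")[-2:]) for h in hosts)

print(domains["bigcorp.com"])  # 2
```

Two hits from bigcorp.com is the kind of signal the article describes: a potential client, partner, or competitor taking a sustained interest.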
So far so good. But how does one go about gathering this data and turning it to an advantage? Here are some of the IT issues that need to be addressed.
Harvesting The Data
Some of the information outlined here can be gleaned from the web host server’s access logs. However, these logs were not designed with in-depth site analysis in mind, and cannot easily be tweaked to produce the kind of information that’s desirable from a management perspective. When a web site is accessed, most servers will record only the visitor’s domain (i.e. the Internet address), the date and time of the visit, and the size of the item requested. What’s more, if information isn’t recorded in the log files — as sometimes happens — then there will be no record of visitors at all.
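To see just how sparse this record is, here is a sketch of parsing one line in the Common Log Format that many servers write (the line itself is a fabricated example):

```python
import re

# One fabricated line in the Common Log Format:
# host, identity, user, timestamp, request, status, size
line = '157.166.226.25 - - [10/Oct/1999:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'

CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\d+|-)'
)

m = CLF.match(line)
print(m.group("host"))    # 157.166.226.25
print(m.group("path"))    # /index.html
print(m.group("status"))  # 200
```

Everything the following sections discuss — identifying users, reconstructing paths, measuring engagement — has to be inferred from fields this thin.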
Even if a record exists, it can be misleading. Distortions can occur because of the way IP addressing works. Many Internet service providers employ dynamic IP addressing, a method of sharing a limited pool of IP addresses among a large number of users, so a user may come from different addresses on consecutive days.
Proxy servers will also distort the record. All Internet requests from inside a security firewall must first go through the proxy server. In large companies, hundreds of PCs may make requests to a site, but only the domain of the proxy server will appear in the log.
Also, frequently requested documents are stored locally on the proxy machine. When users request such documents, the proxy simply returns the copy it holds on disk, and no request is made to the original web server. What’s more, a proxy in front of the web server will store and serve in-demand documents itself, making it impossible for the visit to be logged.
Difficulties also arise in attempting to track users as they move through a site. Each HTML page requested is logged by the server independently, without describing how one is related to another. If traffic is high, it’s unlikely that individual requests will be recorded sequentially in the log file.
One answer is “cookies.” These are identifiers used to store information about a user’s interaction with a site over multiple requests. The cookies pass back and forth between server and browser each time information is requested and received, and can be used to track user access. However, cookies are mainly used for marketplace interfaces where the user has a “shopping bag” of goods, and some visitors disable them in their browsers.
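Where cookies are available, the mechanics of using them for tracking are straightforward: reuse the visitor ID the browser sends back, or mint a new one and ask the browser to store it. A minimal sketch in Python, where the cookie name visitor_id is an assumption of this example:

```python
import uuid
from http.cookies import SimpleCookie

def identify_visitor(request_cookie_header):
    """Return (visitor_id, set_cookie_header_or_None).

    If the browser already sent our tracking cookie, reuse its value;
    otherwise mint a fresh ID and build the Set-Cookie header that asks
    the browser to store it. (Illustrative only; the cookie name
    'visitor_id' is an assumption.)
    """
    cookies = SimpleCookie(request_cookie_header or "")
    if "visitor_id" in cookies:
        return cookies["visitor_id"].value, None
    visitor_id = uuid.uuid4().hex
    reply = SimpleCookie()
    reply["visitor_id"] = visitor_id
    return visitor_id, reply.output(header="Set-Cookie:")

# First request: no cookie yet, so a Set-Cookie header is issued
vid, header = identify_visitor(None)
# Later request: the browser sends the cookie back; no new header needed
vid2, header2 = identify_visitor(f"visitor_id={vid}")
print(vid == vid2)  # True
```

Because the same ID comes back on every request, page views can be stitched into per-visitor histories regardless of proxies or changing IP addresses.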
Another method of tracking is to make assumptions based on the time intervals between document requests from the same host name. Again, this is not a satisfactory method of tracking users: requests can arrive at arbitrary intervals, so the resulting statistics are unreliable.
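In practice this heuristic takes the form of an idle timeout: requests from one host are grouped into a single “visit” until the gap between them exceeds some threshold. A sketch, with the caveats just noted (the 30-minute cutoff and the timestamps are arbitrary choices for illustration):

```python
# Group requests from one host into "visits" whenever the gap between
# consecutive requests exceeds an idle timeout (30 minutes here).
# A rough heuristic, as the text notes -- not reliable identification.
IDLE_SECONDS = 30 * 60

def sessionize(timestamps):
    """Split a sorted list of request times (in seconds) into sessions."""
    sessions, current = [], []
    for t in timestamps:
        if current and t - current[-1] > IDLE_SECONDS:
            sessions.append(current)   # gap too long: close this visit
            current = []
        current.append(t)
    if current:
        sessions.append(current)
    return sessions

hits = [0, 60, 400, 4000, 4100]   # two bursts, more than 30 min apart
print(len(sessionize(hits)))      # 2
```

The weakness is visible in the code itself: a user who pauses 31 minutes over lunch counts as two visitors, while two colleagues behind one proxy count as one.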
The Right Tools For The Job
Given the importance of web analysis, it is essential to have the right software tools. On-site analysis is the most helpful approach, with an online database providing data on demand and allowing older information to be taken into account when looking for possible trends.
On-site analysis can also be faster, because third-party intervention in the process is eliminated, and as many reports as required can be created at no extra cost. Sophisticated web site usage analysis tools empower organizations throughout an enterprise by providing in-depth usage analysis reports to the people who need them, whenever they need them.
Good analysis software will also package data in a variety of report formats, from raw statistics to presentation quality. CIOs, e-business managers, and web site administrators can decide whether to provide basic figures for, say, IT analysts, or colored charts and tables using multiple fonts for the marketing department. Various file formats can also be specified for the report, such as HTML, PostScript or PDF (Adobe Acrobat).
Other analysis features include: the capability to run multiple parallel reports, importing and analysis of other Internet server log files, demographic information, and automatic report mailing. What’s more, production of these reports can be automated to ensure that the e-business manager has the information s/he needs, as soon as it’s available.
As web sites grow ever more central to successful business, analysis of the way they are used becomes a fundamental issue. Web site administrators with an effective tracking package will have quantitative analysis that enables them to maximize their site’s potential and derive full benefit from their Internet presence. As most organizations struggle to determine the return on their web investment, implementing a sophisticated e-business intelligence tool has become a critical business need.