Report from the June 2005 Emetrics Summit, London:
A Personal Digest Of 10 Web Analytics Issues

The Emetrics Summit 2005, led by Jim Sterne, was a global gathering of web analytics practitioners and vendors. It was a useful and lively exchange of the latest thinking and practical issues facing the web intelligence industry. The 2006 London Emetrics Summit report is also now available.

“The price of light is less than the cost of darkness”

Arthur C Nielsen

This is my personal take on just 10 of the 24 key issues identified during the 2005 Emetrics Summit.

1) Actionability

The Number One problem highlighted by web analysts during research by Forrester last year was translating insight into action. Our discussions certainly showed this to be the case for many attendees this year.

Web analytics should produce actionable information, and acting on it often means making changes to the website or its processes. If, for whatever reason, you cannot apply your insight and make changes to your website, the value of web analytics diminishes to virtually nothing and the whole exercise becomes a very frustrating endeavour.

Discussions returned time and again to the circulation of information, the importance of clearly demonstrating the value to the bottom line, and showing management the price of inaction. Literally putting the money on the table had got results, as had always asking "So what?" before bursting forth with your insights!

Where politics and lack of resources are major obstacles, lessons learned have included:

  • Not trying to tackle everything
  • Controlling data circulation to minimize overload
  • Prioritising on small but effective wins with demonstrable savings or increases in conversions

2) Clearly defined goals

Clearly defined measurement goals aligned to the business are critical. And we soon came to agree we're not talking KPIs but Key-KPIs (KKPIs was the new acronym of the week!). Defining those critical few measurements and agreeing how to apply them is essential from the outset.

And there is no single list of measurements to suit all - some come up repeatedly, but the key factor is to align your measurements with your specific business objectives.

3) You are not the customer

Your boss, your designer, the analyst - are not the customer! The only way to discover what the customer wants, thinks or does - is to ask, to watch and to test.

Jim Sterne's take on the world's best pop-up survey:

  • Why did you come this time?
  • Did you achieve your objective?
  • If not, why not?

Evaluation of user experience and satisfaction is a critical measure. And if you're having a hard time convincing your colleagues, why not try usability testing on the cheap, with the "five random guys" test. Seeing how real people undertake specific tasks on your site - such as making a complaint - can be a major eye opener!

4) Testing, testing

Making changes without testing or measuring them is, at best, a guess. A/B testing is one method of making side-by-side comparisons of a single variable by serving different versions of the same page. This allows you to measure which version works best and roll out the change.
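
As a rough sketch of what an A/B comparison boils down to - splitting traffic between two versions of a page and checking whether the difference in conversion rate is bigger than chance alone would explain - the Python snippet below uses purely illustrative figures and a simple two-proportion z-test; it is not any specific speaker's or vendor's method.

    import hashlib
    from math import sqrt

    def assign_variant(visitor_id: str) -> str:
        """Deterministically split visitors 50/50 between page versions A and B."""
        digest = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16)
        return "A" if digest % 2 == 0 else "B"

    def z_score(conv_a: int, visits_a: int, conv_b: int, visits_b: int) -> float:
        """Two-proportion z-test: is B's conversion rate really different from A's?"""
        p_a, p_b = conv_a / visits_a, conv_b / visits_b
        p_pool = (conv_a + conv_b) / (visits_a + visits_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
        return (p_b - p_a) / se

    # Illustrative numbers only: 2.0% vs 2.5% conversion over 10,000 visits each
    z = z_score(200, 10_000, 250, 10_000)
    print(f"z = {z:.2f}")  # |z| > 1.96 suggests the lift is real at roughly 95% confidence

In practice your analytics tool handles the bucketing and the statistics; the point is simply that the comparison is measured rather than guessed.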

The Chief of Data at Yahoo gave an example of how A/B testing the position of the search box on the page had led to a 2% increase in searches. That's seriously substantial revenue on a worldwide scale! His examples of how applied data mining has allowed Yahoo to segment and target customers according to search interest and behaviour also pointed the way forward for intelligent segmentation of customers and greater personalisation of sites according to individual user behaviour.

5) Asking prospective vendors the difficult questions

Speakers highlighted the importance of looking not just at the features of a web analytics vendor's product, but also at the specific needs of your reporting KPIs and the technical quirks of your own site - and of asking the difficult questions that follow.

Several speakers told frank tales of disaster after appointing the wrong provider - or a series of wrong providers. The lesson they imparted was to examine your site, understand the things that make it different and address those factors with vendors rather than settle for an out-of-the-box solution.

6) The trouble with averages

This has made the top 10 as it is a personal bugbear of mine! As a fellow 'meanophobe' I found myself in complete agreement with Neil Mason of Applied Insights who cautioned against the reliance on averages when talking about things like time spent on site, visits and numbers of pages viewed. So often in web analytics, averages hide meaningful differences in behaviour and work against the process of understanding how different segments of users behave. The average visitor does not exist.
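
A small, hypothetical illustration of the point: mix a bounce-heavy segment with a smaller engaged segment and the overall average time on site describes neither group. The numbers below are invented for the example.

    from statistics import mean, median

    # Hypothetical visit durations in seconds for two very different segments
    bouncers = [5, 8, 10, 12, 15] * 20       # 100 visits that leave almost immediately
    engaged = [300, 420, 480, 600, 900] * 4  # 20 visits that read in depth

    all_visits = bouncers + engaged
    print(f"Overall mean:   {mean(all_visits):6.0f}s")    # ~98s - describes nobody
    print(f"Overall median: {median(all_visits):6.0f}s")  # ~11s
    print(f"Bouncer mean:   {mean(bouncers):6.0f}s")      # 10s
    print(f"Engaged mean:   {mean(engaged):6.0f}s")       # 540s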

7) Click Fraud

The presentation on click fraud was an eye-opening discussion for anyone committing revenue to pay-per-click advertising. Artificially or maliciously generated clicks drive up advertising costs and drive down conversion rates. Human perpetrators go beyond a few spurned lovers and disgruntled employees, to large numbers of organised workers paid to click on ads all day.

Advertisers would do well to ensure their web analytics teams are looking for data anomalies such as spikes in click volumes and increased clicks with zero conversions, as well as factors such as new competitors in the mix.
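
As a hedged sketch of the sort of check being suggested - the keyword, figures and thresholds below are invented for illustration - a team might flag any day whose click volume sits well above the recent norm, or which draws plenty of clicks but no conversions at all:

    from statistics import mean, stdev

    # Hypothetical daily (clicks, conversions) pairs for one paid search keyword
    daily = [(120, 6), (131, 7), (118, 5), (125, 6), (640, 0), (129, 7), (122, 6)]

    clicks = [c for c, _ in daily]
    mu, sigma = mean(clicks), stdev(clicks)

    for day, (c, conv) in enumerate(daily, start=1):
        spike = sigma > 0 and (c - mu) / sigma > 2  # volume well above the week's norm
        dead = c > 50 and conv == 0                 # plenty of clicks, nothing converting
        if spike or dead:
            print(f"Day {day}: {c} clicks, {conv} conversions - investigate for click fraud")

A rolling baseline that excludes the suspect day would be more robust, but even a crude check like this surfaces the pattern the speakers described.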

8) Conversion

It is not just those businesses selling online who should care about conversion rates. Conversion is not just about buying: it is the ratio of visits that convert to a desired action. It all comes back to goals and what your website is trying to achieve - whether that is saving money by diverting enquirers away from the phone, or supporting sales that will complete offline. Understanding the 'cost' and 'value' of your content may be a neglected measure, but it is one requiring attention if non-sales sites are to deliver greater returns.
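
To make the idea concrete, here is a hypothetical sketch - the goal names, volumes and per-completion values are all invented - of how conversion and content 'value' can be put on the same footing for actions other than a sale:

    # Hypothetical figures: conversion applies to any desired action, not just a purchase
    visits = 50_000

    goals = {
        "online purchase":   {"completions": 900, "value_each": 45.00},   # direct revenue
        "brochure download": {"completions": 2400, "value_each": 1.20},   # supports offline sales
        "FAQ self-service":  {"completions": 6100, "value_each": 3.50},   # call-centre cost avoided
    }

    for name, goal in goals.items():
        rate = goal["completions"] / visits
        value = goal["completions"] * goal["value_each"]
        print(f"{name:18} {rate:6.2%} conversion, worth roughly £{value:,.0f}")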

“Not everything that can be counted counts and not everything that counts can be counted”

Albert Einstein

9) Information overload vs. getting personal

Data volumes are growing massively, and the sheer volume of potential material to measure, along with the range of facts to be learned about customers, is leading to data overload. Identifying your critical few measures is essential - but what about when you want to move beyond trended data and examine user behaviour at a micro level? Yahoo, Lastminute.com and others talked about complex data mining to deliver highly tailored content. Other speakers told of the need to closely control the volume and type of data being circulated to avoid misinterpretation and confusion.

Clearly, one of the challenges for the web analyst will continue to be getting the right insights into the right hands - and ensuring those insights are meaningful and actionable.

10) Tools are not the same as answers

Buying a web analytics package - even the very best on the market - will not give you answers. Packages are tools to help you interpret your data and support you in drawing insights. Skilled human resources are required in the process - for interpretation, insight, testing and applying changes. Consider carefully what resources you are able to commit to the web analytics team before selecting a vendor - and (cheap plug!) don't forget that you can outsource your web analytics if you don't have the skills you need in house.

Many thanks again to Jim Sterne for a great event.

Vicky Brock of HBR can be contacted at: vicky@highlandbusinessresearch.com