The vast majority of digital reporting draws either entirely or partially on data from Google Analytics. So while familiarity with it doesn’t guarantee an effective reporting process, a lack of familiarity does guarantee an ineffective one.
There are a bunch of mistakes commonly made with GA (all of which I’ve made over the years). Here are some of the more costly ones:
1. Covering too much
As with any modern analytics platform, there is a hefty volume of data in GA for anyone who wants it. The problem is that, unlike GA, humans are pretty lousy at processing vast quantities of data, so if our reports fail to laser in on a small number of key items, we can safely assume that a grand total of **** all value will be taken from the process. We need to take a few strands and use those to tell a simple business narrative.
2. Automating reports
In spite of my previous point, there have been a handful of occasions over the years where I have sent a new customer a report and been informed that it lacks detail. Apparently their previous agency sent a 35-page report every week with lots of colourful graphs and charts. I’d ask them to share some examples and, predictably, they were all automated reports, with absolutely no narrative or actual analysis. Clearly it was reassuring to the customer, so maybe there’s a lesson in there for me, but in terms of actual value this process is adding between zero and nothing. The value is in digging into the data, finding things that don’t quite make sense, staring at the wall for a couple of minutes in quiet frustration and then reaching a conclusion – ideally one that drives action.
Bypass the work and you’ll bypass the value.
3. Presenting it in the wrong way for the audience
If the report is for the marketing manager, then a lengthy PDF could well be right. After all, it’s this person’s job to be interested.
If it’s for a CEO or Sales Director, on the other hand, do not expect them to even open the PDF, let alone make sense of it. These are busy people, and frankly they have more important things to be doing than reading my 400-word analysis of their recent surge in traffic from Bing. All these individuals care about is what it means for their business, and I’d better be able to confine my thoughts to half a dozen bullet points in an email or they’re not going to be heard. Even better, via text message.
Make it easy for them to be interested.
4. Lack of numerical attribution
The majority of our clients sell high-value, bespoke services, which means few of their websites possess ecommerce functionality. One of the downsides is that we can never be certain of the exact value delivered to the business in analytics (how much is an enquiry on x page for y service worth?). However, that doesn’t mean we can’t make an educated guess.

Let’s say that we’re an accounting firm and we know from our historical P&L data that a typical client stays with us for three years and pays an average of £900 a month. That’s £32,400 of lifetime revenue, £10,800 of which is profit (working to a standard 33/33/33 split between overheads/wages/profit). We also know from our CRM data that our conversion rate from website leads is 15%. The value of a website lead is therefore £1,620. Or maybe we err on the side of caution and round it down to £1,500.

By adding this value attribution to our goals in analytics, we can now see an approximate value delivered by each channel. It may not be exact, but if a channel is delivering lots of value or little value, we’ll know it, and can then adjust our spend and focus accordingly.
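As a back-of-the-envelope check, the arithmetic above can be written out in a few lines. The figures come straight from the accounting-firm example; nothing here is GA-specific:

```python
# Lead-value estimate using the example figures above.
MONTHLY_FEE = 900        # average monthly fee (pounds)
RETENTION_YEARS = 3      # typical client lifetime
PROFIT_SHARE = 1 / 3     # 33/33/33 overheads/wages/profit split
LEAD_CONVERSION = 0.15   # website lead -> client rate (from CRM)

lifetime_revenue = MONTHLY_FEE * 12 * RETENTION_YEARS   # 32,400
lifetime_profit = lifetime_revenue * PROFIT_SHARE       # ~10,800
lead_value = lifetime_profit * LEAD_CONVERSION          # ~1,620

print(f"Each website lead is worth roughly £{lead_value:,.0f}")
```

Swap in your own retention, margin and conversion numbers; the point is that even a rough figure beats no figure when assigning goal values.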
5. Misunderstanding the data
A great deal of data within analytics is misleading. Which is not to say that it’s inaccurate, but simply that without context it may lead us to think one thing when actually the opposite is true.

For example, let’s imagine that organic traffic is on the rise. “Fantastic!” we think, as we enthusiastically write up our latest report. Yet a few minutes later, we notice that goal completions have fallen. How could this be? Surely with all this extra organic traffic we should be generating more goals than ever? It’s probably just an anomaly, we tell ourselves, and finish writing the report while hoping the client doesn’t read the detail.

What we’ve failed to notice is that while organic traffic is rising, the proportion of traffic going to pages that actually convert is falling. This is a constant problem on B2B sites, as typically 80–90% of their traffic goes to blog posts and other legacy resource content. I’m not suggesting the value of this traffic is zero, but it’s probably not much higher than zero (depending on your ability to capture the data with some kind of hook). Each visitor to a specific product or service page, on the other hand, could be worth tens or even hundreds of pounds. This is the traffic we care about. Setting up a segment in analytics to isolate this traffic gives far clearer insight into the actual commercial performance of the website.
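In GA itself the segment is built in the interface rather than in code, but the underlying idea is simple enough to sketch against a hypothetical page-level export. The paths, column layout and numbers below are all made up for illustration:

```python
# Hypothetical (page_path, sessions) rows from an analytics export.
rows = [
    ("/blog/tax-tips", 4200),
    ("/resources/vat-guide", 1100),
    ("/services/audit", 310),
    ("/services/payroll", 190),
]

# A crude "segment": keep only traffic landing on commercial pages.
COMMERCIAL_PREFIXES = ("/services/",)
commercial = [(p, s) for p, s in rows if p.startswith(COMMERCIAL_PREFIXES)]

total = sum(s for _, s in rows)
commercial_total = sum(s for _, s in commercial)
share = commercial_total / total

print(f"{share:.0%} of sessions hit pages that actually convert")
```

Here overall traffic could double on the back of a popular blog post while the commercial share (and goal count) stays flat, which is exactly the trap described above.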
Don’t get distracted by the noise.
6. Looking at the wrong time frame
The number of times I’ve heard people excitedly exclaim “website traffic has risen by xx%!”. “Wow,” I respond, “that’s awesome! Over what time frame?” “The last week,” they reply. At which point I feel myself losing all interest in the conversation. You see, 7 days is fine if you’re Amazon (in fact, 7 minutes is probably fine if you’re Amazon), but for a B2B organisation that generates small quantities of targeted traffic for ultra-high-value services, it’s statistically meaningless. The natural variance over a week, or even a month, is so great that we need to look at far longer time frames, probably over the course of a year. We also need to compare against the same periods for the last two or three years, in case there is a seasonal trend that we might otherwise miss.
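The same-period-last-year comparison is easy to automate once monthly figures are exported. A minimal sketch, using made-up session counts:

```python
def yoy_change(this_year, last_year):
    """Fractional change per month versus the same month a year earlier."""
    return {
        month: (this_year[month] - last_year[month]) / last_year[month]
        for month in this_year
        if month in last_year
    }

# Illustrative monthly sessions for a small B2B site (invented numbers).
y2023 = {"Jan": 800, "Feb": 760, "Mar": 900}
y2024 = {"Jan": 880, "Feb": 700, "Mar": 990}

changes = yoy_change(y2024, y2023)
for month, change in changes.items():
    print(f"{month}: {change:+.1%} vs same month last year")
```

A January that looks flat against December may still be up 10% on last January; comparing like-for-like periods is what separates seasonality from a real trend.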
7. Attaching too much value to the data
The final mistake is allowing our strategies to be driven by data. Tactics, yes. Test, measure and iterate based on the data available. That’s been true for a hundred years.
But allowing the data to drive your strategy is a road to nowhere. It can only tell us the value of what we have done. Not the value of the things we didn’t do, and certainly not the value of the things we might do in the future.
As Rory Sutherland, Vice Chairman of Ogilvy, writes in his superb book, Alchemy: The Surprising Power of Ideas That Don’t Make Sense – never forget that the “data in 1993 would’ve predicted a great future for the fax machine.”