Get Agile with Web Analytics: Part 2
In Part 1 we set the stage for adopting an agile web analytics methodology and looked at goal setting, KPI definition, closing data gaps, and creating your “to review” list.
In this post we’ll discuss data automation (and why it’s not the end-all-be-all), reporting strategies, insight brainstorming sessions, and the relative importance of data accuracy.
To Automate or Not to Automate?
The fully automated analytics dashboard is sort of a holy grail of marketing. The concept is having one place to check that shows performance data across all of your digital (and perhaps even offline) marketing, automatically updated as new data comes in.
Many vendors and agencies will claim this can be set up for you (and some, such as Omniture, can take the concept quite far, albeit at significant cost), but the devil is often in the details. One thing we can likely all agree on is that online marketing evolves continually, and at an ever-faster pace. This means your marketing objectives, content, campaigns, and the tools in your toolbox will be in a constant state of flux.
With data coming in from your primary web analytics tool (e.g., Google Analytics), Facebook, Twitter, various other social media properties, email marketing, surveys, rating and feedback systems, search marketing, CRM systems, and more, staying on top of the data each tool creates and funneling it into your dashboard accurately is daunting, to say the least. Many of these systems tout an API (application programming interface), but each one is different and subject to change.
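To make that maintenance burden concrete, here is a minimal Python sketch of the per-tool translation layer an automated dashboard implies. The payload shapes and field names below are entirely hypothetical, but the point stands: every source reports the “same” metric differently, each source needs its own adapter, and each adapter breaks the moment its upstream API changes.

```python
# Illustrative only: each tool's API returns the "same" metric with a
# different field name and shape, so every integration needs its own
# translation layer -- and that layer breaks when the upstream schema changes.
def normalize_visits(source: str, payload: dict) -> int:
    """Map one hypothetical payload per tool onto a single 'visits' number."""
    if source == "google_analytics":
        return payload["totalsForAllResults"]["visits"]      # nested totals object
    if source == "facebook":
        return sum(day["value"] for day in payload["data"])  # daily time series
    if source == "email_tool":
        return payload["stats"]["unique_opens"]              # a different concept entirely
    raise ValueError(f"No adapter written for {source}")

# Fabricated sample payloads standing in for real API responses.
payloads = {
    "google_analytics": {"totalsForAllResults": {"visits": 1204}},
    "facebook": {"data": [{"value": 310}, {"value": 295}]},
    "email_tool": {"stats": {"unique_opens": 480}},
}

for source, payload in payloads.items():
    print(source, "->", normalize_visits(source, payload))
```

Multiply this by a dozen sources and a changing campaign mix, and the cost of keeping the dashboard “automatic” becomes clear.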
By committing to keep all of your various data sources integrated and automated, you’re adding a complex technical layer that may significantly increase your time to insight. When a decision is made to change the site, switch to a different social media platform, or create or modify campaigns, you’ll likely have to submit an IT request to ensure your dashboard remains accurate. By the time this is completed, you may have already moved on to the next iteration of your strategy. Not exactly agile.
Instead, simply keep and update brief instructions next to each important metric about where and how to get the data. You may be able to link directly to a specific report. When your strategy or site changes, you only need to update text and/or a link. Follow your own instructions and you’ll have your answer in seconds. You can manually enter the number or metric to track over time, or simply look to see if it changed significantly. The level of effort you go to will depend on where you’re focusing your marketing efforts that month or that quarter. If you know nothing’s changed, why measure it?
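Here is a minimal sketch of what that lightweight metric inventory might look like in practice. The metric names, instructions, links, and values are all made up for illustration; the structure is the point: each entry records where the number lives and how to fetch it by hand, and you only measure what’s in focus this cycle.

```python
# A plain inventory: brief instructions next to each metric, updated by hand.
# All names, links, and values below are hypothetical placeholders.
metric_inventory = [
    {
        "metric": "Newsletter signup conversion rate",
        "how_to_get": "GA > Goals > Goal 2, last 30 days, compare to prior period",
        "link": "https://example.com/analytics/goal2",  # placeholder link
        "last_value": 2.4,   # entered manually, only when this metric matters
        "in_focus": True,    # part of this month's marketing focus
    },
    {
        "metric": "Facebook referral conversions",
        "how_to_get": "Facebook Insights > Referrals, cross-check against GA",
        "link": None,
        "last_value": None,
        "in_focus": False,   # nothing changed here -- why measure it?
    },
]

# Only review the data that the current focus calls for.
to_review = [m for m in metric_inventory if m["in_focus"]]
for m in to_review:
    print(m["metric"], "->", m["how_to_get"])
```

When strategy shifts, you edit a line of text or a link rather than filing an IT ticket.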
Less Reporting, More Insight
If you want to ensure your web analytics initiative does NOT improve the overall company condition, do this:
Create the same reports every month, exported into spreadsheets or PDFs, and distribute to top stakeholders. Wait for their feedback.
If, on the other hand, you want results, a more hands-on, collaborative process is in order. Try the following:
- Using your agile, not-necessarily-automated inventory of metrics, review the data that matters based on business goals and current marketing efforts. Identify significant improvements or problems in those areas.
- Draft a summary of the most important issues, highlighting the ones that likely contributed to the significant rise or fall of a major KPI. Create a list of the top 10 high-priority initiatives (listed in order of importance/opportunity) you feel the company should take as a result of recent data.
- Organize a meeting of the top stakeholders in your organization (executive, marketing, customer service, technical) and use your summary of issues and opportunities as an agenda.
- Prior to the meeting, make sure you have the proper facilities and presentation equipment to bring up analytics tools on the fly during the meeting to illustrate your points and to reference the company website and/or social media properties to provide context.
- During the meeting, walk through the issues and opportunities step by step, bringing up data and representative screens to support your assertions.
- For each step, solicit direct feedback from the stakeholders and get each unique department’s perspective on what the data is revealing. Have there been major changes in strategy, technical updates, or content updates that the rest of the group wasn’t fully aware of? What are their plans for the immediate future?
- As responsibilities become clear, immediately assign them to the proper stakeholder. It may be a good idea to update your online collaboration tool’s task list (SharePoint, Basecamp, activeCollab, etc.) right then and there.
- After the meeting, go back and update your top 10 high-priority initiative list and publish it to the stakeholders. Also update your metric inventory based on revised goals and shifted marketing focus.
- At the next session, measure current progress against this updated set of metrics and new agenda.
- Lather, rinse, repeat, and progress.
Use web analytics data and A/B and multivariate tests to settle arguments. End the opinion loop and learn to take action as an organization regularly and rapidly.
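When an A/B test is meant to settle an argument, it helps to check whether the difference is real or just noise. A standard two-proportion z-test (ordinary statistics, not specific to any analytics tool) can be sketched in a few lines of Python; the visitor and conversion counts below are invented.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Approximate two-sided p-value: did variant B really convert differently?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation p-value via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 120/2400 conversions on A vs. 156/2400 on B.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p means the difference is unlikely to be noise
```

A low p-value ends the opinion loop; a high one tells you to keep testing rather than keep arguing.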
Data Accuracy Matters…Mostly
We all want to be sure that the data we’re looking at is correct. Of course, we should take all reasonable precautions to ensure that it is. Properly tagging your site with tracking code, filtering out internal traffic, and understanding how your web analytics tool tracks visits, sessions, events, etc. are crucial.
However, the more tools you use, the more discrepancies you’ll inevitably see in visit counts, referrals, views, and so on. This is simply because they all track data differently and always will.
You can lose hair trying to reconcile all of these data sources, or you can act based on trends you see across the tools. Google Analytics may be telling you one thing about referrals from Facebook, Facebook may be telling you another, and your CRM system may be telling you something else. However, if each system is proportionally telling you that conversions from traffic driven by Facebook decreased significantly, then act. If you wait for 100% data confidence, you’ll miss the boat.
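The “act on agreement across tools” idea can be sketched in a few lines: instead of reconciling absolute counts, check whether every source agrees on the direction and rough size of the change. The tool names and counts below are fabricated for illustration.

```python
# Hypothetical conversion counts per tool: (last_period, this_period).
# The absolute numbers disagree -- as they always will -- but that's fine.
facebook_conversions = {
    "GA": (410, 310),
    "Facebook": (530, 395),
    "CRM": (298, 221),
}

def pct_change(before, after):
    return (after - before) / before * 100

changes = {tool: pct_change(*vals) for tool, vals in facebook_conversions.items()}
for tool, delta in changes.items():
    print(f"{tool}: {delta:+.1f}%")

# Act when every source reports a proportionally similar, significant drop,
# even though none of them agree on the raw counts.
if all(delta < -10 for delta in changes.values()):
    print("All tools show a >10% drop in Facebook-driven conversions: act now.")
```

Each tool reports a different raw count, but all three agree on a roughly 25% drop, and that agreement is the signal to act on.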
For more on freeing yourself from data accuracy dependency, see Avinash Kaushik’s blog post on the subject.
You Are Ready, Grasshopper
By integrating the points above (and from Part 1) into your web analytics process, you’ll decrease time to insight, and learn to act quickly to keep pace with the rapidly evolving digital landscape.
As always, comments are encouraged. What steps in your company’s data review process should stay or go? What challenges are you experiencing in the effort to end opinion loops and become a data-driven organization?