Product Analytics
What is product analytics anyway?
Product analytics is about understanding how users interact with an application or service, and how those interactions impact business outcomes such as retention, churn, and revenue.
It is an important but often misunderstood specialisation of analytics. It is frequently distilled to collecting 'everything' - so-called clickstream data like taps, swipes, clicks and pageviews - whereas in fact product analytics should be a complete set of data that tells you about the user's experience with your product, structured so that it is as simple as possible to get answers from. Remember that a user does not necessarily have to do something themselves in order to have an interaction with your product.
Promotional communications, offline interactions, errors and bugs, user feedback, and customer support contacts: these are all important interactions that a user has with your application, and they all add to the overall experience.
How do I know what is going to be useful?
When you're planning your product analytics implementation, you must work backwards from the core questions: the most fundamental indicators that tell you whether your product (or feature) is doing what it's supposed to do, and doing it well. That's the best way to ensure that you're always collecting what matters, measuring the most important things, and capturing enough context to explore the 'why' behind certain activities.
In this article I'll run through a framework I designed a few years back, in my first role as a product analyst at Xero.
The situation at Xero (circa 2018)
For some background: Xero at the time, with regard to analytics and product analytics in particular, was kind of like a half-melted glacier. Some parts of it were solid, but there were many seen and unseen gaps which made it difficult to navigate. Not to mention the tools were not at all user-friendly, so accessibility was also a huge problem.
Even as a technical analyst I often could not find or use quality information, so it was much harder still for product managers or developers. Several product teams were doing their own thing for data capture, which meant little to no consistency, no common structures or conventions, and no straightforward way of capturing comparable information.
The result was an inability to connect the dots at a higher organisational level between customer behaviour and business outcomes. For example, a reasonable question from leadership might have been which interactions with which types of product led to greater retention or higher LTV - but without adequate and comparable data collection, it was basically impossible to find out. And there were lots of products!
When I was there, around 22 different product domain teams existed, covering things like reporting, payroll, projects, expenses, invoices, bank feeds, bank reconciliation, tax and so on. For all of those teams there were only four analysts, and they were not really shared amongst teams. Tackling this problem was a massive task, but I was young, optimistic, and naive enough to think I could try and help fix it - along with a few others who had the same idea.
So we formed a working group and wrote up a proposal on what needed to be done to address the problems. I got a new role in the company looking after the digital analytics platforms for marketing and product, and immediately set up meetings with all of the product and marketing teams to create a tracking plan for each of their domains.
I drew from the Avinash Kaushik method of determining metrics for marketers, but applied it to a product domain. I also followed the Jeff Patton wisdom: ask what your product should be doing, and measure that. The result was a blended framework for event tracking, designed to help teams determine simple and effective success metrics for their products.
Product Analytics planning framework - a three-part approach
Part 1:
Outline why your product or feature exists. Choose three statements that explain the who, what, and why, using adverbs such as 'easily' or 'quickly' to describe the intended experience where relevant.
For example, a music streaming platform like Spotify might exist for the following reasons:
1. To enable high quality music streaming and recommendations for listeners
2. To provide a platform for artists to share and monetise their music with listeners
3. To provide profitable income streams for the parent organisation
Part 2:
For each purpose above, which top three metrics tell you whether that purpose is being met?
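To make that concrete with the hypothetical streaming example: the first purpose might be measured by stream completion rate, playback error rate per session, and the share of listening that starts from a recommendation; the second by the number of artists actively publishing each month and artist payout growth; and the third by subscription conversion and revenue per listener. The exact metrics matter less than keeping each list short and tied directly to its purpose.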
Part 3:
For each of the above metrics, think about the core events you will need to track within particular workflow types, and about how you want to be able to break the metrics down. For example, you might want to see the average error rate per session, per day, or per week, and you might want to break it down by listener location, device type, or tenure. That will help shape which events and parameters you start by capturing. The following table is a good place to start planning what to track and which parameters to include.
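For the hypothetical streaming example, a first pass might look something like this (all event and parameter names are purely illustrative):

| Workflow | Core events | Example parameters | Metric it feeds |
| --- | --- | --- | --- |
| Play music | playback_started, playback_completed, playback_error | workflow_id, device_type, listener_location, listener_tenure | Error rate per session, stream completion rate |
| Follow a recommendation | recommendation_shown, recommendation_played | workflow_id, recommendation_type, device_type | Recommendation click-through rate |
| Subscribe | plan_selected, payment_completed | workflow_id, plan_type, listener_tenure | Subscription conversion, revenue |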
Start very simply here: only the core steps really matter, and you can always add more events and more context as required. Group activities primarily by workflow so that you can express key results as a conversion metric. If possible, use a 'thread' or correlation id (e.g. a document id) so that you can measure each specific workflow from start to end.
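As a minimal sketch of why that correlation id matters (with purely hypothetical event names), you can treat each workflow id as one attempt and compute the key result as a conversion rate:

```python
# Minimal sketch: tie a workflow's start and end events together with a shared
# correlation id ("workflow_id"), then measure the key result as a conversion.
events = [
    {"event": "playlist_create_started",   "workflow_id": "w-001", "listener_id": "a"},
    {"event": "playlist_create_completed", "workflow_id": "w-001", "listener_id": "a"},
    {"event": "playlist_create_started",   "workflow_id": "w-002", "listener_id": "b"},
    # w-002 was started but never completed, so it counts as a drop-off
]

started = {e["workflow_id"] for e in events if e["event"] == "playlist_create_started"}
completed = {e["workflow_id"] for e in events if e["event"] == "playlist_create_completed"}

conversion_rate = len(started & completed) / len(started)
print(f"Playlist creation conversion: {conversion_rate:.0%}")  # -> 50%
```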
At Xero, after creating this style of tracking plan for all 22+ teams, we found that there were many similarities across the product business - in actions, workflows, and in high-level metrics such as retention and churn. Of course the rules changed based on expected behaviour within different products. For example, the tax product's retention metric was set annually at the client level (tax only gets submitted once per year!), whereas the bank reconciliation product expected to see usage at least once per week.
But fundamentally we were able to design and simply label a comparable set of events that could be used and understood by everyone across the business, with built-in flexibility that allowed teams to collect more information and to conduct and analyse product experiments as they needed.
We collected the data into tools like Mixpanel, as well as the data warehouse and Google Analytics, to allow for analysis, using what was effectively an in-house version of Segment for application instrumentation. I moved to Amsterdam before the rollout was complete, but at my next role at a start-up I used what I had learned at Xero to implement an excellent and well-used product analytics setup within the space of a few months.
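The instrumentation layer behind that kind of setup can be quite small. The following is only a rough sketch of the Segment-style fan-out idea, not the actual in-house tooling, and all names are hypothetical: a single track() call builds one canonical event and sends it to every configured destination.

```python
# Sketch of a Segment-style fan-out: one track() call, several destinations.
# The destination functions are placeholders, not real integrations.
from datetime import datetime, timezone
from typing import Callable

Destination = Callable[[dict], None]

def send_to_mixpanel(event: dict) -> None:
    ...  # e.g. forward to Mixpanel's ingestion endpoint

def send_to_warehouse(event: dict) -> None:
    ...  # e.g. append to an event stream that lands in the warehouse

DESTINATIONS: list[Destination] = [send_to_mixpanel, send_to_warehouse]

def track(user_id: str, event_name: str, properties: dict | None = None) -> None:
    """Build one canonical event payload and fan it out to every destination."""
    event = {
        "user_id": user_id,
        "event": event_name,
        "properties": properties or {},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    for destination in DESTINATIONS:
        destination(event)

track("listener_123", "playback_started", {"workflow_id": "w-001", "device_type": "mobile"})
```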
If you would like further advice or guidance on how to get the best insights from your product analytics, just reach out. Otherwise, I can highly recommend the blogs and guides in the Amplitude and Mixpanel documentation.