Showing posts with label Mixpanel.

Sunday, 20 July 2014

Just launched my new company called Insignum

Yesterday I launched my new company called Insignum. It delivers anomaly detection and notification for your analytics across Mixpanel, KISSmetrics and Google Analytics.

The product provides data-driven startups with automated analytics intelligence to find deviations in expected customer behaviour as they happen. Using machine learning algorithms, it continuously searches your data and when an anomaly is detected it will notify you so that you can act on the insight.


If you have any feedback about the product, I'd be very interested in hearing it. Insignum's Twitter account is @insignum_io.

Monday, 11 November 2013

A Growth Hacking Case Study on Starbucks SRCH


In 2011, while working at Blast Radius, a global digital agency, I was responsible for the technical development of the 'Starbucks SRCH Scavenger Hunt'. The following video describes the campaign.


Given my recent venture into the world of growth hacking and the way it now informs my thinking, I took another look at the Starbucks SRCH Scavenger Hunt from a growth hacking perspective.

I will present it here as a retrospective case study using publicly available data. These are the numbers quoted by Blast Radius in their post:
  • 7000 Starbucks locations advertised the initial QR code for launch
  • 300k visits over 3 weeks
  • 23k registrations (97% played at least one clue)
  • Avg. time for 1st person to solve a clue was 21 min, indicating extremely high engagement with the brand
  • Over 20k posts from social channels regarding SRCH
  • Media coverage from Mashable, USA Today, CNN, PSFK and more

1. Use a Simple Framework

I've posted before about Dave McClure's Startup Metrics for Pirates: AARRR and Chamath Palihapitiya's growth framework. Neil Patel and Bronson Taylor have created an even simpler three-stage framework influenced by Dave's ideas: Get Visitors, Activate Members and Retain Users. Any of these frameworks can be used to independently measure and analyze each stage a user progresses through as they go from never having heard of the product to being fully engaged and possibly paying for a premium version. In this case, I'm using Chamath's four-stage growth framework because it omits the revenue stage, a consequence of Facebook's business model that suits SRCH as well, since SRCH was also a free product:

2. Start Acquiring Users

Paid media was not used for this project, so all inbound traffic for SRCH's acquisition (300k visits) came from the following three sources:
  1. Existing Starbucks Customers (via their 7000 retail locations)
  2. Traditional Media (Mashable, USA Today, CNN... etc)
  3. Social Media (Mostly Twitter & Facebook)

Things To Consider:
  • Unique Users: Using "visits" to quantify the acquisition stage is ill-advised. Visits, page views, downloads... etc are usually just vanity metrics and were most likely quoted in this specific instance to bolster numbers. What should be measured at this stage is the exact number of unique users to the landing page(s).
  • Conversion Rates: According to Terif's data analysis, Starbucks stores averaged somewhere around 500 daily customers at their retail locations in 2010-2011. Given this, if 7000 retail locations each had approximately 500 daily customers over the roughly 2-week period during which they advertised the SRCH Scavenger Hunt, then there was a potential audience of 49 million customers (without factoring in repeat customers, who might in fact make up a large share). As an example, if we split the 300k visits evenly across the three acquisition channels (stores, traditional media & social media), then we can estimate that the Starbucks locations alone brought in approximately 100k visits. Thus 100k visits / 49M potential customers translates to a conversion rate of only 0.2% (see the back-of-envelope sketch after this list). Interestingly, this is in line with conversion rates for digital display advertising (i.e. banner ads), which are known to have very low click-through rates (CTR) compared to other advertising methods. So, considering the logistics and development costs required to set up advertising across 7000 Starbucks stores, coupled with a conversion rate that approximates that of banner ads, it may have been more beneficial to spend that effort optimizing the in-store advertising of SRCH or switching to paid media to drive those visitors to Starbucks' landing pages.
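As a quick check of the estimate above, here is the arithmetic in runnable form (a sketch; the 500-customers-per-day figure and the naive three-way channel split are assumptions carried over from that estimate):

```python
# Back-of-envelope audience and conversion estimate (all inputs are rough).
locations = 7_000
daily_customers = 500            # assumed average per store (2010-2011)
days = 14                        # ~2 weeks of in-store advertising

potential_audience = locations * daily_customers * days     # 49,000,000
store_visits = 300_000 / 3       # naive even split across 3 channels -> ~100k

print(f"Potential audience: {potential_audience:,}")
print(f"In-store conversion: {store_visits / potential_audience:.2%}")  # ~0.20%
```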

3. Measure Each Stage In Detail

One of the most valuable things to do in any project is to measure each stage along the growth framework (a.k.a. funnel) and figure out the conversion rate at each stage. This shows where users are dropping off and also allows segmentation of the traffic/users so that insightful questions can be asked like "Which types of users are activating more often?" or "What source did our most engaged users come from?" or "Where should we start optimizing first?".
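As a quick illustration, here are the stage-to-stage conversion rates implied by the publicly quoted numbers (a sketch; the exact stage definitions behind Blast Radius's figures aren't public, so treat the counts and the one-post-per-poster assumption as approximations):

```python
# Stage-to-stage conversion from the publicly quoted SRCH numbers.
stages = [
    ("Acquisition (visits)",       300_000),
    ("Activation (registrations)",  23_000),
    ("Engagement (played a clue)",  int(23_000 * 0.97)),  # 97% of registrants
    ("Virality (social posts)",     20_000),              # assumes ~1 post per poster
]

for (name, count), (_, prev) in zip(stages[1:], stages[:-1]):
    print(f"{name}: {count:,} ({count / prev:.0%} of previous stage)")
```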


NOTE: The only data available is from Blast Radius and may not accurately measure the most representative proxy for each stage.

Here are some things to consider when building these types of funnels and analyzing the results:

  • Counting Conversion: The funnel should measure each user independently, and any action they perform should only be counted once. Thus, if a single user sent out multiple social media posts, the virality stage should count that user only once, since converting multiple times is still just a single conversion. The reason this immediately stood out to me was the 87% conversion from the engagement to virality stage. From my experience this number is quite high, and I assume it measures the total number of social posts rather than the number of engaged users who posted.
  • Defining Engagement: The engagement stage took into account whether the user "played at least once", which may or may not be the right proxy for an engaged user. Engagement is by far the hardest stage to measure; each business should measure it differently and constantly re-assess whether it is measuring the right thing. Many industry leaders have discovered what their leading indicators of engagement are, but these are hard to figure out without a comprehensive understanding of the customer and theories tested against data.
  • Funnel Creation: Given the growth framework above, it is very helpful to map each stage to a funnel step in an event-based analytics tool such as Mixpanel or Kissmetrics. I've written a post before about using Dave McClure's AARRR framework with Mixpanel; a rough version of the growth framework above mapped to Mixpanel events follows this list.
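A minimal sketch of that mapping, assuming the official `mixpanel` Python library and event names I've invented for illustration (the funnel itself would then be configured in Mixpanel's UI on top of these events):

```python
# Mapping the four growth stages to Mixpanel events (illustrative names only).
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # placeholder token

def track_stage(user_id, event, **props):
    # Funnels in Mixpanel count each distinct_id once per step, which is
    # exactly the "count each user once" rule discussed above.
    mp.track(user_id, event, props)

track_stage("user-123", "Landed", source="in-store QR")  # Acquisition
track_stage("user-123", "Registered")                    # Activation
track_stage("user-123", "Played Clue")                   # Engagement
track_stage("user-123", "Shared Socially")               # Virality
```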


4. Optimize The Funnel

Given the data above, the best place to start optimizing would be higher up in the funnel, where the largest drop-off occurs (i.e. landed users who don't sign up). The reason is that a one-percentage-point increase in the sign-up rate has a much larger relative effect on the overall completion rate than the same increase in the engagement rate, because the sign-up rate starts from a much smaller base. One thing to be careful of with this approach is that diminishing returns set in the moment you begin optimizing a step: at some point the effort required to discover a change with a tangible effect is no longer worth the cost. To make the drop-off argument concrete, here is a quick sensitivity check in runnable form (a sketch using the quoted numbers; treating the increases as percentage points is my assumption):
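```python
# Sensitivity check using the quoted SRCH numbers (the 87% virality figure
# is from the funnel discussion above; increases are percentage points).
visits = 300_000
signup_rate = 23_000 / visits    # ~7.7% of landed users registered
engage_rate = 0.97               # 97% of registrants played a clue
viral_rate = 0.87                # ~87% quoted engagement -> virality

def completions(s, e, v, n=visits):
    return n * s * e * v

base = completions(signup_rate, engage_rate, viral_rate)
plus_signup = completions(signup_rate + 0.01, engage_rate, viral_rate)
plus_engage = completions(signup_rate, engage_rate + 0.01, viral_rate)

print(f"+1pp sign-up rate: {plus_signup / base - 1:+.1%}")   # ~ +13%
print(f"+1pp engagement:   {plus_engage / base - 1:+.1%}")   # ~ +1%
```

Here are some ideas that could have been used for optimizing each step of the funnel: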

  • Optimizing Acquisition: Inbound traffic came from the 3 channels mentioned above. Figuring out which of those channels brought in the "best" (most highly engaged) users, using Mixpanel's segmentation features (or Google Analytics), would allow resources to be reallocated to the acquisition channel that performed best and had the greatest potential for increases. For example, optimizing the retail in-store advertising for SRCH during the Scavenger Hunt would have been complex (in terms of logistics and the timing of rolling out any changes), but changes could be tested at a single store and, if the increases were sufficient to justify them, rolled out across the other 7000 stores. Essentially, a variety of combinations of in-store advertising placements, colours, QR codes vs. actual links... etc. could be rapidly tested to see which single change or set of changes drove more traffic.
  • Optimizing Activation: The conversion page could be A/B tested (using something like Optimizely) to determine if there are any changes that would boost sign-ups. Social sign-up, wording, images, colours and layout can all be A/B tested, provided there is enough inbound traffic to support the tests (see Neil and Bronson's suggestions for conversion growth hacks). Changes should be statistically significant, as measured with an A/B split test calculator; a minimal significance check is sketched after this list.
  • Optimizing Engagement: This is the core of a user's experience. As can be seen, there are a number of steps a user must go through to get to this point, but once here they should be given what some call a "must-have experience" or "aha moment" if they are ever to come back and continue using the product. Having this is the difference between a product with product-market fit and one without it. Without it, no growth hacking will be effective over time, as the product will just bleed users until there are no more left to acquire. Therefore, optimizing for engagement comes only after product-market fit has been found. If there is a clear understanding of how users are engaging with the product and a desire to boost engagement, a number of tactics are available. For the Starbucks SRCH Scavenger Hunt, email, SMS or push notifications could have been used to alert users when the next clue was released or when the first user solved a clue. SRCH was a game, after all, so a gamification system built around competing users could boost engagement with existing users.
  • Optimizing Virality: Increasing the number of users who post something about the product to their social graph requires trust, a value proposition and reduced friction. Thus, it is worth testing a number of combinations: where in the flow the user is prompted to post, what copy is used to encourage a user to post, and what copy is used for the auto-populated post text. Additionally, adding a clear value-added benefit for posting (i.e. exclusive access, more game features... etc) could also increase the number of users who decide to post something to their social graph.
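For the activation testing mentioned above, here is a minimal two-proportion z-test, hand-rolled so it needs only the standard library (a sketch; the visitor and sign-up counts are invented for illustration):

```python
# Two-tailed two-proportion z-test for an A/B conversion experiment.
from math import erf, sqrt

def ab_significance(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)               # rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return p_a, p_b, z, p_value

# Hypothetical test: variant B lifts sign-ups from 7.6% to 8.4%.
p_a, p_b, z, p_value = ab_significance(760, 10_000, 840, 10_000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p = {p_value:.3f}")  # p ~ 0.04
```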

Sunday, 28 July 2013

What to do if you drank the Kool-Aid on bullshit metrics

I've always been interested in quantitative data and the ability to derive insights from it. So when a VC firm (Andreessen Horowitz) invested a ton of money ($10.25M) in an analytics start-up (Mixpanel), I took notice. Soon after the investment, Marc Andreessen and Suhail Doshi came out swinging with a punchy line aimed at getting to the heart of a common problem in the technology industry:
Some people call page views and the like “vanity metrics,” but Marc Andreessen and Mixpanel founder Suhail Doshi have decided they want to raise the shame level by calling them “bullshit metrics.”
Andreessen told me in an interview last week, “People think they’re richer if they have Zimbabwean dollars than U.S. dollars.”
“We and other investors need to get more vocal,” Andreessen said. “Page views and uniques are a waste of time.” 
Andreessen said his firm won’t throw start-ups out the door if their pitches include bullshit metrics - but it’s perhaps something they might consider.
Liz Gannes @ AllThingsD

So if you drank the Kool-Aid and decided that random download stats or pageview metrics that go up and to the right are pretty much worthless, then where do you turn? What do you measure that is more insightful than these bullshit metrics?

I've written previously about Dave McClure's "Startup Metrics for Pirates: AARRR" and it's definitely worth starting there for an overarching framework for thinking about the whole customer lifecycle. Once you understand that lifecycle for your particular product, you can begin to integrate an analytics platform into your product that captures the information you need and can act on. Event-based analytics (the kind of thing Mixpanel excels at) is based around capturing and then segmenting all the events your users perform. By segmenting the aggregate of these events you can build awareness of and insight into who your customers are, how they first got introduced to your product and how they are currently using it. Segmentation is a great starting point, but once you have all your events set up there is an even more valuable tool called cohort analysis. Cohort analysis allows you to measure customer retention so that you can answer the question of whether or not your customers love your product. Andrew Chen has a great blog post where he asks that very question.
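To make cohort analysis concrete, here is a small sketch that groups users by sign-up month and computes how many were still active in later months (the data structures and values are invented; real events would come from your analytics platform's export):

```python
# Monthly cohort retention from raw sign-up and activity records.
from collections import defaultdict

signups = {"u1": "2013-01", "u2": "2013-01", "u3": "2013-02"}  # user -> cohort
activity = [("u1", "2013-02"), ("u2", "2013-03"), ("u3", "2013-02")]

def months_between(a, b):
    (ya, ma), (yb, mb) = map(int, a.split("-")), map(int, b.split("-"))
    return (yb - ya) * 12 + (mb - ma)

cohorts = defaultdict(set)
for user, month in signups.items():
    cohorts[month].add(user)

retained = defaultdict(set)  # (cohort, months since sign-up) -> active users
for user, month in activity:
    offset = months_between(signups[user], month)
    if offset > 0:
        retained[(signups[user], offset)].add(user)

for (cohort, offset), users in sorted(retained.items()):
    print(f"{cohort} +{offset}mo: {len(users) / len(cohorts[cohort]):.0%} retained")
```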

So once you have a few months' worth of cohort data, how can you then determine whether your product is above or below par? It turns out this is a very hard question to answer because it usually depends on the type of product, your customers and a variety of other factors (essentially, there is no "standard" that works for every product). This doesn't stop some people/companies from speculating, so here are a few reference points:


So as a very general rule of thumb, a retention rate of 30% month after month seems like a decent number to benchmark against. But a word of caution: definitely don't consider that number some special threshold beyond which your product can be deemed successful in the marketplace. The matrix from Flurry above was created in Oct 2012, but an earlier version first appeared back in Sep 2009. If you look at how the retention rates for social networking apps have changed over those 3 years, it's startling. Back in 2009, social networking apps had a 90-day retention rate of approx. 15%, as opposed to approx. 34% in 2012.

As always, in the technology business, the goal posts continue to move every single year. Ben Horowitz, of VC firm Andreessen Horowitz, said this:
The technology business is fundamentally the innovation business. Etymologically, the word technology means “a better way of doing things.” As a result, innovation is the core competency for technology companies. Technology companies are born because they create a better way of doing things. Eventually, someone else will come up with a better way. Therefore, if a technology company ceases to innovate, it will die.

Friday, 26 July 2013

Mixpanel Implementation of Startup Metrics for Pirates: AARRR

Dave McClure of 500 Startups (a seed accelerator and investment fund) has a great slide presentation on SlideShare called Startup Metrics for Pirates: AARRR!!!. Don't let the old-school graphics fool you; it's packed with a ton of insight about how to think strategically about your startup in terms of quantifiable metrics.


In a previous post about Chamath Palihapitiya and focusing on the right things, I included his simple 4-stage growth framework, which has the following stages: Acquisition, Activation, Engagement & Virality.

Dave's metrics, called AARRR, have an additional stage (5 in total): Acquisition, Activation, Retention, Referral and Revenue. Dave's Referral stage is similar to Chamath's Virality stage, except that Dave goes into some detail about using it to acquire new users on the back of your existing users. Chamath advocates quite passionately for not even focusing on the concept of virality, due to its elusive nature and the distraction it causes. There are some very subtle differences between virality and referral, but I will discuss those in an upcoming post.

Chamath's team at Facebook never spoke about virality or k-factor while building out their incredibly successful social network. He believed it was essential for his team to focus on the quality of the actual product and continue trying to make the overall experience for users better and better. Chamath says far too many people focus on this holy grail of trying to make their "bad" product viral in some way instead of finding ways to make their product better (or even just decent/okay) so users actually keep using it over time.

Either way, Dave's startup metrics for pirates (described in the presentation above) give a solid footing to any startup interested in building a quality product/service and a business around it. Measuring how things are going during a startup's lifecycle is difficult due to a startup's dynamic nature, but this simple framework and its associated metrics are generic enough to remain relevant as a startup morphs into a revenue-generating business.

I am a huge fan of Mixpanel's event-based analytics platform. I use it extensively at RESAAS and love the ease of setup and the ability for both engineers and marketers to easily sift through tons of data, explore theories and then develop insights that impact future product decisions. If Dave's startup metrics for pirates, AARRR, are something you decide to implement yourself, then I highly suggest trying Mixpanel as the analytics platform behind those metrics (other products you could use include Google Analytics, Kissmetrics, Woopra, Flurry... etc). What's nice is that the Mixpanel team even put together a great blog post back in November 2012 about using Mixpanel to implement Dave's AARRR metrics.
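As a final sketch of what that implementation might look like (the event names are my own invention, not taken from Dave's deck or Mixpanel's post), the five AARRR stages could map to Mixpanel events like this:

```python
# Illustrative AARRR stage -> Mixpanel event mapping (names are invented).
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")  # placeholder token

AARRR_EVENTS = {
    "Acquisition": "Visited Landing Page",
    "Activation":  "Completed Onboarding",
    "Retention":   "Returned Within 7 Days",
    "Referral":    "Invited a Friend",
    "Revenue":     "Upgraded to Paid",
}

# Fire the matching event as a user reaches each stage, e.g.:
mp.track("user-123", AARRR_EVENTS["Activation"], {"plan": "free"})
```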