Group Product Manager and Head of Pricing at Mixpanel, Pranav Kashyap, elaborates on how the company made the switch to a new pricing model.
Mixpanel initially had an events-based pricing model (from 2009), until the change in 2019. Under the events-based model, each item a customer tracked on their website was an event they sent us, and they were charged accordingly. This tracked very well with our costs: the more events they sent us, the more expensive it was.
The issue was that this did not work well for a lot of our customers. For one, it was really difficult for them to predict how many events they would need. Lots of people were new to the concept of tracking.
This was especially true for product analytics: it was usually Product Managers (PMs) who wanted to improve their product and see where users got stuck, or how a new feature of theirs was doing, whether people were using it or not, retaining with it or not. In that world, we found that if we wanted to really help our customers, the only way we could prove it was by helping them move real business metrics. That meant we had to help them either grow, retain, or engage users more.
Events help measure some of that, but they don’t always do a good job of actually tracking it. For example, you might be tracking a lot of things even while the business isn’t going well for you, yet you’re still paying us more money. This led to churn and dissatisfaction. For one, people would churn because we had upsold them multiple times: say they went from 10 million events to 100 million events and were still unable to find value. They were tracking more and more events and paying more money, but not seeing the business benefit.
Secondly, if we asked brand new users and our salespeople how many events a new account would need, they would not know. They knew their number of users, but not how many events those users would generate, and that was another complication in estimating usage.
The solution involved everything from packaging to changing the core price metric, to also setting the price levels.
We first figured out what the big customer segments were, and what use cases they had. We found a mix of start-ups and large customers. Within them, there were:
These were the different groups, and each had different use cases. But certain things were consistent: everyone wanted to know how they were doing as a company, and whether their features were being used. Beyond that, each group had some specific nuance around that need.
Using all this, we repackaged Mixpanel. Earlier, we used to sell an events plan and a people plan. Then, we split it out into Mixpanel Analytics, after which you could add things on for the data team in the ‘Data Pipelines Package’. For marketers, there was a ‘Messaging Package’. Separately, we also had a ‘Groups Package’ specifically targeted at B2B companies, where we actually incurred more costs in storing data because we had to store multiple copies of it.
We ran some surveys to figure out what things people valued, and how much. This was to get a sense of how many dollars we should charge for the packages, what features they needed, and how we stacked up against our competition. We also did a lot of customer interviews, to try and understand what the different roles were and what features were being used in their companies.
We didn’t run any conjoints or Van Westendorp analyses; conjoints were just really painful to set up and analyse. Our feature set was too granular for that, as we had almost 50-60 different features we had to plug into different places. Running a conjoint would have meant putting people through hour-long surveys, which can be quite hard.
Once we had our packaging in a reasonably good place, the next thing was to change metrics. We tried lots of different metrics. We mooted charging based on how much we could improve a client’s revenue growth; or, we mulled basing it on monthly active users (MAUs) or monthly tracked users (MTUs).
We tested some 50 different metrics, like number of apps and websites tracked, page views, flat rates, query volume, data stored, data exported, a percentage of revenue attributed to Mixpanel, a percentage of churn reduced, and so on.
At the end of it, we managed to narrow down the list by subjecting every metric to the following five criteria:
At first, it was opt-in. We already knew which customers were finding it hardest under the events model; we could figure that out analytically. We also gave the pitch to our sales team so they could talk to customers who were upset about the price they were paying. We did that and kicked off a pilot: we had salespeople go out and ask whether someone would rather be on the new option instead of the events pricing model. Many customers who were initially going to churn actually liked it a lot, switched over to the new plan, and stuck around with Mixpanel.
These were all existing customers, and we already had Mixpanel data on exactly how many MTUs they had, so we had a very good sense of the numbers. Even for new customers, we found that almost everyone had Google Analytics installed on their website or their properties. Our rule of thumb was that their MTUs were roughly the Google Analytics user count plus 20 percent.
We wanted to keep our revenues flat. So we said, “Look, if we assume that everyone were to move over to this new plan, we want to make exactly the same money. This is not a call to make more money off our customers.” Thereafter, some customers who were significantly better off with MTUs would switch, and some who were significantly worse off would never switch. So, net-net, we expected to lose some money as a result of the switch, but hoped to make it up through lower churn, which we thought would decrease by 10-15 percent.
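To make the flat-revenue constraint concrete, here is a minimal sketch of how one might back out a revenue-neutral per-MTU price from existing customers’ current spend and measured MTUs, along with the Google Analytics heuristic for new customers. The function names and figures are hypothetical, not Mixpanel’s actual data.

```python
# Hypothetical sketch: choose a per-MTU price so that, if every existing
# customer moved from events pricing to MTU pricing, total revenue stays flat.
# All names and figures here are illustrative, not Mixpanel's actual data.

def revenue_neutral_mtu_price(customers):
    """customers: list of dicts with 'events_spend' (current monthly $ on the
    events plan) and 'mtus' (measured monthly tracked users)."""
    total_current_revenue = sum(c["events_spend"] for c in customers)
    total_mtus = sum(c["mtus"] for c in customers)
    # Flat-revenue assumption: new_price * total MTUs == current revenue.
    return total_current_revenue / total_mtus

def estimate_mtus_for_new_customer(ga_monthly_users):
    # Heuristic from the interview: Google Analytics user count plus ~20%.
    return round(ga_monthly_users * 1.2)

book_of_business = [
    {"events_spend": 2_000, "mtus": 40_000},
    {"events_spend": 5_000, "mtus": 150_000},
    {"events_spend": 800, "mtus": 10_000},
]
price = revenue_neutral_mtu_price(book_of_business)
print(f"Revenue-neutral price per MTU: ${price:.4f}")          # $0.0390
print(f"Estimated MTUs for a new signup: {estimate_mtus_for_new_customer(50_000):,}")
```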
Thereafter, one learning on the sales front was that the switch did make the process a lot easier. This was because the team could now figure out how much the customer needed based on Google Analytics.
What got harder was that there was more uncertainty in the model than we had realized. We’re trying to get around it now, but take for instance a situation in which you’re tracking events on your homepage and then run a Snapchat ad that loads your homepage. Now, each person who sees that ad is pulling up your entire homepage. That blows up your MTU count, because there are way more unique visitors even though they don’t do anything in your product. And now that’s something the customer has to pay for.
So that was an unforeseen development. We did know that some customers had very high MTUs but very few events. To address this, we came up with a different pricing model called MTU Five. Under this model, a user doesn’t count as one until they’ve done five events. This approach tackled 90 percent of such cases.
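To make that rule concrete, here is a small, hypothetical sketch of counting tracked users with a minimum-events threshold. The user IDs and event names are made up, but it shows why one-event ad visitors drop out under the five-event rule.

```python
from collections import Counter

def monthly_tracked_users(events, min_events=1):
    """events: (user_id, event_name) pairs for one billing month.
    min_events=1 is plain MTU counting; min_events=5 is the 'MTU Five' rule,
    where a visitor only counts once they have performed five events."""
    events_per_user = Counter(user_id for user_id, _ in events)
    return sum(1 for n in events_per_user.values() if n >= min_events)

# An ad campaign that only loads the homepage inflates plain MTU, but those
# one-event visitors never reach the five-event threshold.
month = [("ad_visitor_%d" % i, "page_view_home") for i in range(1_000)]
month += [("real_user", e) for e in ("signup", "create_report", "invite", "run_query", "export")]

print(monthly_tracked_users(month))                 # 1001 under plain MTU
print(monthly_tracked_users(month, min_events=5))   # 1 under MTU Five
```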
Secondly, some things that were easier on our old model are a bit harder now. With events as the value metric, it was easy: if you joined on the 15th of a month, we could just count your events from the 15th to the 30th and bill you for those. With MTUs, it’s harder because a month is a defined timeframe and we can only bill you on the first of every month. So it required new billing logic to charge you for only 15 days in your first month, and then bill on the first of every month after that.
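Below is a minimal sketch of that proration, assuming a flat monthly price and that the signup day itself is counted; the function name and figures are illustrative, not Mixpanel’s billing code.

```python
import calendar
from datetime import date

def first_invoice_amount(signup_date, monthly_price):
    """Prorate the first, partial month; from then on the customer is billed
    the full monthly_price on the first of every month."""
    days_in_month = calendar.monthrange(signup_date.year, signup_date.month)[1]
    days_remaining = days_in_month - signup_date.day + 1  # include the signup day
    return monthly_price * days_remaining / days_in_month

# Joining on the 15th of a 30-day month covers 16 of 30 days (roughly half).
print(round(first_invoice_amount(date(2019, 9, 15), 999.0), 2))  # 532.8
```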
I am not a fan of CPQ. What we did to get around CPQ was to keep an online Google Sheet as the calculator. It literally lists out every single price point and MTU tier, and our salespeople just use that. When they need to put together a quote, they go there, figure out the price, and all of the back and forth with the customer happens off the Google Sheet. Only when they’re ready to submit an actual contract do they go into CPQ and put in the details.
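For illustration only, this is roughly the lookup that sheet performs: find the smallest tier that covers the estimated MTUs and read off its monthly price. The tier boundaries and prices below are invented, not Mixpanel’s published pricing.

```python
# Hypothetical tier table standing in for the pricing Google Sheet.
MTU_TIERS = [
    (1_000, 89),        # up to 1,000 MTUs -> $89 / month
    (25_000, 499),
    (100_000, 1_299),
    (500_000, 3_999),
]

def quote_monthly_price(estimated_mtus):
    """Return the monthly price for the smallest tier covering the MTUs."""
    for tier_limit, price in MTU_TIERS:
        if estimated_mtus <= tier_limit:
            return price
    raise ValueError("Above the largest published tier; needs a custom quote")

print(quote_monthly_price(60_000))  # falls into the 100,000-MTU tier -> 1299
```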