We recently hosted a webcast conversation between Lean Startup Co. Faculty Member Elliot Susel, and Aviv Stern, Chief Data Officer at Social Point, a gaming company, to discuss using lean analytics in mobile game development.
Don’t have time for the full webcast now? Catch the webcast highlights and tips from their conversation in our companion blog below.
If you’d like to read the full transcript of Elliot Susel’s conversation with Aviv Stern, you may download it.
Lean Startup is Data Driven
Aviv got his start in data working for Fortune 100 companies, back when data analytics was still called “business intelligence,” he said. No matter what you call it, he said, the goal of analytics is the same: finding a way to use data to benefit a business.
Aviv praised the Lean Startup method for being intrinsically data driven, a good selling point when you’re trying to convince founders or a small product team to invest in data analytics. “Each of the stages, like build, measure, learn, has integrated into it [a] data-intrinsic approach,” Aviv pointed out.
Convincing product development teams to do A/B testing early on (releasing two versions of a product or service and seeing which performs better) can be a tough battle, Aviv said, so Lean Startup is a great approach to take.
Aviv came to Lean Startup after what he called a typical startup experience, “the failing kind” that was also educational. He bootstrapped a data science app for about a year, one “we were sure was going to change the world,” only to watch it fail, which led him to question how they were developing solutions.
“That’s how I found myself really shifting the approach I had…to be much more lean, really focusing very early on validating,” he said.
Ask the Right Questions
The company he works for now, Social Point, is well established in the gaming market and has a solid user base, but it still faces the same challenges of how to obtain and use the data it seeks. Aviv said that before they even design a new game, they start by asking, “What is a good game?” Then they determine which indicators will help them validate the product-market fit of the first version of a new game.
Aviv uses Lean Startup to identify the right questions to ask that will create data solutions. “Within Lean Startup you learn there’s gateways, depending on the stage of the product, you know which KPIs (key performance indicators) you need to focus on, and which metrics matter most.”
For every new game, they try to release two or more versions of the game. “One that’s, say, control untouched, and one with the improvements you wish to measure,” Aviv said.
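The control-vs-variant split Aviv describes could be sketched with a deterministic, hash-based cohort assignment, a common approach for A/B tests, though the source doesn't say how Social Point assigns users; all names here are illustrative:

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into 'control' (untouched) or
    'variant' (with the improvement being measured) by hashing the
    user id together with the experiment name."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket in 0..99
    return "control" if bucket < 50 else "variant"

# The same user always lands in the same group for a given experiment,
# so their in-game behavior can be compared across the two versions.
group = assign_variant("player-42", "new-battle-ui")
```

Hashing on the user id (rather than assigning randomly at each login) keeps every player in a consistent group for the life of the experiment, which is what makes later metric comparisons between the two versions meaningful.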
Validate Before Testing
When asked how Social Point validates an idea for a game before a product even exists, Aviv described a practice of play testing, a form of qualitative analysis, in which the game is presented in two ways to two audiences. The first version of the game may be little more than a mock-up of screenshots aimed at an audience that is familiar with games, such as founders of other gaming companies. The second version may be a low-tech prototype aimed at an audience made up of gamers familiar with these kinds of games.
Most of these prototypes are done with very minimal investment, Aviv explained, and often very little coding. “If your game is a crossword, you can pretty much code it. If it’s about a more complex technological solution, we usually just use screenshots and explain the concept of it.”
After test users have tried them out, Aviv said they talk to their users and validate ideas with them.
“Anything that doesn’t involve, as we say in the Lean Startup approach, getting out of the building…[or] interacting with the real market will always be biased. But it can give you some very valuable hints and pointers about…what appeals to people [or] what is not very effective.”
Look for Actionable Metrics
Just because a company captures data, however, does not mean that data is in and of itself useful or actionable. Some metrics are just vanity metrics. Elliot gave the example of the difference between tracking all downloads of an app versus paid downloads, with the latter being the actionable metric.
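Elliot's distinction could be sketched with a few hypothetical download records (the field names and data here are purely illustrative, not Social Point's actual schema):

```python
# Hypothetical download events; "paid" marks a purchased download.
downloads = [
    {"user": "u1", "paid": True},
    {"user": "u2", "paid": False},
    {"user": "u3", "paid": False},
    {"user": "u4", "paid": True},
]

# Vanity metric: total downloads only ever goes up, and says nothing
# about what to change in the product.
total_downloads = len(downloads)

# Actionable metric: paid downloads tie directly to revenue, so a
# change in this number can drive a decision.
paid_downloads = sum(d["paid"] for d in downloads)
paid_rate = paid_downloads / total_downloads
```

The point is not the arithmetic but the choice of numerator: a metric is actionable when a change in it tells you what to do next, which total downloads alone does not.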
“At the end of the day, when we say ‘Is this actionable?’ we say… ‘What’s this going to change’?” Aviv said. For instance, a pie chart showing the split of gamers across user platforms like Android versus iOS isn’t actionable.
“If you can tell me within a specific market that devices of a specific price are more interesting, that’s something I can work with. I can change the way I target,” Aviv explained.
Lean analytics, Aviv said, help “differentiate and distinguish” between the metrics that are not usable and the ones that are. For example, if he’s weighing two metrics, conversion and average spend, conversion is more complicated and would require more testing, so average spend is going to be the more actionable of the two.
“The decision about what am I going to be measuring at which stage of the game has very dramatic implications on what I am developing, which features of the game I’m focusing my attention on,” Aviv said.
Expect Positive and Negative Impacts
Just rolling out a new feature is no guarantee of receiving positive feedback from users, Aviv cautioned. Sometimes a new feature in a game will be a hit with a small subset of users and provoke a strongly adverse reaction from others, which is why it’s important to understand user profiles up front. Even negative feedback, however, is important in shaping how they roll out new features.
“The perfect scientific way to test [a new] feature is to release a version just with that, wait, release another version,” Aviv said.
This does slow down the development process, but that may be the best way to track which of the changes are the most effective with the user base.
If the input is coming from a small sample of users, Aviv recommended taking it with a grain of salt, but if a larger sample of users is giving negative feedback, then that data is more likely actionable.
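One rough way to formalize that grain-of-salt rule is a one-sided significance check on the negative-feedback rate. This is a generic statistical sketch, not Aviv's stated method, and the baseline rate and confidence threshold are illustrative assumptions:

```python
import math

def feedback_signal(neg: int, total: int, baseline: float = 0.10) -> bool:
    """Rough check: is the observed negative-feedback rate significantly
    above an assumed baseline rate, given the sample size? Uses a
    one-sided normal approximation; small samples rarely clear the bar."""
    if total == 0:
        return False
    rate = neg / total
    std_err = math.sqrt(baseline * (1 - baseline) / total)
    z = (rate - baseline) / std_err
    return z > 1.645  # roughly 95% one-sided confidence

# Same 20% complaint rate, very different conclusions:
small = feedback_signal(2, 10)      # too noisy to act on
large = feedback_signal(200, 1000)  # likely a real problem
```

The same complaint rate crosses the threshold only at the larger sample size, which is exactly the intuition behind treating small-sample feedback with caution and larger samples as actionable.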
“You really need to understand your users and be able to listen to them,” Aviv insisted. “Try versions, see what happens, see what works.”
Figuring out which tools a company needs for data collection can be a process of testing in and of itself. However, Aviv recommended integrating tracking into products. So when someone downloads a game, Social Point tracks such things as when someone purchases, when someone logs into the game, and when someone clicks on a battle. All of this should be considered in product development.
“That’s pieces of code you integrate, and sensors you put into your game to be able to see what people are enjoying, what they’re playing,” Aviv said.
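The “sensors” Aviv describes could look something like a minimal event tracker that records purchases, logins, and battle clicks for later upload. This is a hypothetical sketch of the pattern, not Social Point's actual code; all names are illustrative:

```python
import json
import time

class EventTracker:
    """Minimal in-game event 'sensor': record named events with a
    timestamp and properties, queued for a batch upload to an
    analytics backend."""

    def __init__(self):
        self.events = []

    def track(self, name: str, **props):
        """Record one event with arbitrary properties."""
        self.events.append({"event": name, "ts": time.time(), **props})

    def flush(self) -> str:
        """Serialize and clear the queue (e.g. for an HTTP batch post)."""
        payload = json.dumps(self.events)
        self.events = []
        return payload

tracker = EventTracker()
tracker.track("login", user="player-42")
tracker.track("battle_click", user="player-42", battle_id=7)
tracker.track("purchase", user="player-42", item="gem_pack", price=4.99)
batch = tracker.flush()
```

Batching events and flushing them periodically, rather than calling the backend on every tap, is one common way to keep this instrumentation from slowing the game down, in line with Aviv's advice about not breaking the speed of the product.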
Of course this also can slow down development, so he recommended trying to do this “without breaking the speed of the product. You have to be very lean about how you approach analytics.”
The key is to focus on a very small set of metrics that are actionable. “Don’t try to cover everything, don’t try to track everything.”
The most important part, Aviv said, is to “always validate.”
Thanks to Jordan Rosenfeld for contributing this piece. If you seek to bring the entrepreneurial spirit to your organization, Lean Startup Co. can help.