
The Role Of Data in Defining Your Product Success

Running a business in the modern world requires a data-driven approach. Without data, you can’t expect to live up to your full potential.

Thankfully for us, data is plentiful. Analytics tools are widely available, and many are free. But that doesn’t mean the data is being put to use or even interpreted correctly.

You use your data to make most of your marketing decisions. But tainted data can’t be trusted. If your data is out of whack, it can lead to a costly mistake.

Errors are common. Data lies. Even Google Analytics lies. That’s why you need to do your part to make your data tell the truth.

In this post, we’ll walk you through how a few great products and companies found success using a data-based approach and what tests they used to grow their businesses.

Let’s jump in.

Why Using Data Leads to More Success

The real problem lies in our ability (or lack of ability) to collect, read, interpret, and most importantly, put the data into action.

A recent study from the Harvard Business Review on the Evolution of Decision Making took a look at how organizations and product-based companies are adopting a data-based culture.

This study surveyed 646 executives, managers, and industry professionals from dozens of industries and countries across the globe.

According to the study, 80% of the respondents said that they were heavily reliant on data in their jobs, and 73% said that they rely on data to make decisions.

What’s more interesting was how being data-driven impacted their bottom line and overall success in the market.

Over 70% of the organizations that were consistently using analytics to make decisions reported improved financial performance, an increase in productivity, reduced risks, and faster decision-making skills.

Organizations that didn’t rely heavily on data to drive action were 20% less likely to report those benefits.

So, what does that mean exactly?

In a nutshell: if you are making decisions based on real, hard numbers and data, you have a higher chance of success. Don’t use the data, and you’ll cut that success potential down by 20%.

According to HubSpot’s State of Inbound, most marketers still struggle with proving ROI.

Marketing challenges by State of Inbound

(Image Source)

Other big challenges that were reported show a trend of struggling to use or interpret data:

Data interpretation challenges

(Image Source)

Of course, most product marketers think they are already data-driven. But in reality, most are still doing it wrong or struggling to put the data they have into action. The studies above both prove that.

But there is a silver lining. There are currently multiple product-based giants that are conducting testing and basing their entire businesses on statistically significant data.

How Duolingo, Intercom, and VSCO Used Data to Improve Products

1. Do A/B Testing Right Like Duolingo

As product marketers, we think that we know data.

A/B testing is one of the most popular ways to test just about anything from the product to the branding and anything in between.

You can use it to find which variation of any given piece of marketing content performs best, which means that the value of these tests is huge.

You can use them to build better products because they show you the actions that will improve conversion rates, boost sales, and increase revenue.

I bet if you asked any given product marketer, they’d claim to be an expert at A/B testing by now.

In a 2013 study by Econsultancy and RedEye, 60% of participants stated that A/B testing was very valuable for their business.

Methods of improving conversion rates

(Image Source)

The allure of A/B testing is like a siren song in the distance, waiting for you to get sucked in.

But the reality is that most of those success-story headlines are phony, and A/B testing isn’t as simple as changing a button color.

Most A/B tests fail. Only one out of every seven tests reaches any sort of statistical significance.

A/B testing has issues that include, but aren’t limited to, the following pitfalls:

  • Bias: you are literally testing your own assumptions of what will have an impact on conversions.
  • Time: they take a minimum of 3-4 weeks to see any results. The opportunity cost is brutal.
  • Size: is your sample size large enough? If not, it’s not statistically significant.

According to conversion rate optimization expert Peep Laja, you need a minimum of 1,000 conversions monthly for a test to have significance.

Minimum. Yikes. That’s a lot of conversions.
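Rules of thumb like this exist because significance depends on sample size. As a rough illustration (not Laja’s exact method), a two-proportion z-test shows whether the gap between two variants’ conversion rates is likely real; all the numbers below are hypothetical:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # normal approximation
    return z, p_value

# Variant B converts 1,150 of 10,000 visitors vs. 1,000 of 10,000 for A.
z, p = two_proportion_z_test(1000, 10_000, 1150, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the lift is statistically significant
```

With smaller samples, the same 1.5-point gap would produce a much larger p-value, which is exactly why underpowered tests are so easy to misread.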

But don’t lose faith. There is hope for us yet.

And that hope lies in Duolingo – one of the best product-based companies to ever grace this planet.

Gina Gotthilf, VP of Growth and Marketing at Duolingo, recently spilled Duolingo’s A/B testing secrets with First Round.

Gina used growth hacking techniques to drive the product-based company from three million users to over 200 million.

How? By consistently A/B testing the product and building it to be better than ever.

A few years ago, Duolingo was refining their product when they hit a brick wall. Their top-of-the-funnel users who were just becoming brand aware were leaving like a herd of wildebeests.

Gina stated,

“We were seeing a huge drop-off in the number of people who were visiting Duolingo or downloading the app and signing up.”

Essentially, they couldn’t get people to sign up for accounts after downloading the app for the first time.

Here is what the application looked like for new users without an account who were looking to save progress:


(Image Source)

The first A/B test they ran involved the placement of this signup wall. Users would often hit this wall early in the experience because Duolingo wanted users to convert faster.

But they realized that this might be counterproductive. Making people sign up before they even got the full experience was driving users away. The first A/B test moved the gated section further back in the process.

The result? A 20% increase in conversions from new users to account creation.

But that wasn’t enough for Duolingo. And this is often where A/B testing goes wrong. One single test isn’t going to cut it.

Duolingo then tested the same sign-up wall with a new feature:

Duolingo A/B testing

(Image Source)

Instead of saying “Discard My Progress,” the sign-up screen gave users the option to skip it and keep using the app without making a commitment just yet.

That yielded another increase of 8.2% on top of the 20% they already found.
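As a quick sanity check on the arithmetic: sequential wins compound multiplicatively, not additively, so a 20% gain followed by an 8.2% gain works out to roughly a 29.8% total improvement. A minimal sketch:

```python
# Sequential A/B test wins compound multiplicatively, not additively.
lift_1 = 0.20    # moving the signup wall later: +20%
lift_2 = 0.082   # soft-wall skip option: +8.2% on top

combined = (1 + lift_1) * (1 + lift_2) - 1
print(f"Combined lift: {combined:.1%}")  # → Combined lift: 29.8%
```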

Moral of the story: conducting a powerful A/B test should never be a one-off step.

Duolingo is living proof of that. They iterated on the same test over and over in different ways to get to their current setup.

They first moved the profile wall back a few steps, increasing conversions. But they didn’t settle. They kept tweaking one data point at a time until they reached their full potential.

When you do an A/B test, you shouldn’t conduct one test and call it significant. That can leave you with false data or tests that had serious outliers impacting the data.

Testing multiple times with a single variable changed in each test is critical for unlocking real conversion data.

2. Beta Test Your Product Like Intercom

When it comes to building the best products, nothing beats a beta test. It’s a validation that your product is well-received.

On top of that, it can serve as a way to refine your product to become better than you ever imagined.

Private product betas are a proven way to improve your products. And Intercom is one of the best when it comes to beta testing products.

George Williams, the Product Manager at Intercom, says that there are a few crucial steps to perform before jumping into a beta test to make sure it’s worthwhile.

This involves meeting with your product team and product manager to outline the schedule, size, and selection criteria for the user group.

It also involves noting what feedback or data you’re going to collect and how you’re going to determine whether you reached a viable solution or not.

With a beta test, George outlines five critical factors that he tries to accomplish with his products:

  1. Testing the positioning of the product
  2. Setting expectations for the product’s future
  3. Finding the target audience for the product
  4. Choosing which channels are best to announce the product
  5. Capturing real customer testimonials

George first and foremost tests positioning. He uses this strategy to see if the product messaging clearly communicates the benefits and features of the product to potential users.

He does this by incorporating the positioning and messaging into the beta invitation itself to see how many people are interested before actually using the product. This serves to test the market and specific position that the product creates in the minds of potential customers.

Intercom A/B testing

(Image Source)

George states,

“If you’re considering two directions for your positioning this can be a good opportunity to run an A/B test.”

This gives you a chance to evaluate your customers’ reactions to the invitation, and it can be a good way to measure how compelling your offer is.

Picking up on cues of excitement and confusion can help you make your product offering clear and concise.

Secondly, George says that managing expectations is critical for product success. If you claim too much and underdeliver, you can expect angry users.

He says to ask the following question both before and after users have interacted with your product:

“Based on what was described to you, did this meet your expectations?”

This question can help you gauge if you’ve met expectations or not and how you can fix your messaging to better fit your end benefits.

With beta testing, always be sure to collect customer reviews. These can serve as a tool for social proof, showing that your product is well received. On top of that, it serves as a resource for identifying what parts of your product did or did not satisfy the users.

George uses a simple, personalized email to collect feedback from beta testing sessions:

Personalized emails for product testing

(Image Source)

While most of this data is qualitative rather than quantitative (numbers-based), you should always use it to inform decisions. Even if your data isn’t direct numbers or statistics on what performed best, you can analyze qualitative feedback to drive performance.

When Intercom creates products, they always beta test them. With the qualitative data they collect, they seek to put customer feedback directly back into their products.

When beta testing, it’s critical to follow those factors that Intercom has used for years.

3. Tracking the Right Metrics Can Unlock Real Data Like VSCO

Living in a data-rich universe is a blessing and a curse.

You can quickly log into a tool like Google Analytics and get instant data on everything from new website visits to micro-conversions and more.

Amazing, right? But it comes with a very serious caveat:

Information overload.

Your dashboards quickly become saturated with numbers that don’t matter.

This is a huge problem in today’s data-driven workplace. Vanity metrics like impressions, clicks, and time on site don’t really tell you anything.

Think about it:

Do impressions matter if you are profiting and seeing consistent growth? Not at all.

Does time on site matter? That all depends on the size or depth of your site, and even then it’s nearly worthless.

Every product is different. Every usage scenario, every goal, and every game plan has a different outcome.

As a product-based business, you should focus on these metrics first:

  • Customer lifetime value: how much does each customer spend with you over their relationship with your business?
  • Cost per acquisition: how much do you need to spend to acquire a single user?
  • Your bottom line: how many sales are you driving, and how much in revenue and profits?
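The first two metrics above can be sketched in a few lines. All of the figures here are hypothetical, and the LTV formula is one common simplification (average order value × purchase frequency × retention × margin), not any particular company’s model:

```python
def lifetime_value(avg_order_value, orders_per_year, retention_years, gross_margin):
    """Rough customer lifetime value: margin earned over the whole relationship."""
    return avg_order_value * orders_per_year * retention_years * gross_margin

def cost_per_acquisition(total_marketing_spend, new_customers):
    """Average spend required to win one customer."""
    return total_marketing_spend / new_customers

# Hypothetical numbers for illustration:
ltv = lifetime_value(avg_order_value=50, orders_per_year=4,
                     retention_years=3, gross_margin=0.6)
cpa = cost_per_acquisition(total_marketing_spend=30_000, new_customers=400)
print(f"LTV = ${ltv:.0f}, CPA = ${cpa:.0f}, LTV/CPA = {ltv / cpa:.1f}")
```

A healthy product keeps LTV comfortably above CPA; if the ratio dips toward 1, every new customer is costing you about as much as they ever pay back.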

And this strategy is exactly what VSCO used to change their culture to a data-driven product company.

VSCO metrics testing

(Image Source)

Their growth was stagnant before they became focused on data and metrics that impacted their bottom line.

VSCO’s product manager said:

“Our data culture wasn’t that strong before, which is probably how we got into that state of disarray to begin with.”

VSCO started to sort their analytics accounts based on real data like lifetime value, cost per acquisition, and understanding how users flow in their funnel.

For example, they started to measure conversions with a single event. When a user took a photo, edited it using the app, and exported the photo to their library, they counted it as a real conversion.


(Image Source)

Before that, they were focused on each micro-conversion as its own event. But that data wasn’t showing how many users converted or added to their bottom line.

Someone could easily edit a photo and then abandon the process. So counting those metrics was useless to them.
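This single-event approach can be sketched as checking that each user completed every funnel step, in order, before counting a conversion. The event names and log format below are hypothetical, not VSCO’s actual schema:

```python
# Count a conversion only when a user completes the whole flow in order:
# photo taken -> edited -> exported.
FUNNEL = ["photo_taken", "photo_edited", "photo_exported"]

def completed_funnel(events):
    """True if the user's event log contains every funnel step, in order."""
    step = 0
    for event in events:
        if step < len(FUNNEL) and event == FUNNEL[step]:
            step += 1
    return step == len(FUNNEL)

user_events = {
    "alice": ["photo_taken", "photo_edited", "photo_exported"],
    "bob":   ["photo_taken", "photo_edited"],   # abandoned before export
    "carol": ["photo_taken", "photo_edited", "photo_exported"],
}

conversions = sum(completed_funnel(ev) for ev in user_events.values())
print(f"{conversions} of {len(user_events)} users converted")  # → 2 of 3 users converted
```

Counting each micro-step as its own conversion would have credited bob here, even though he never reached the outcome that matters.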

By limiting their metrics, VSCO was able to focus on the important steps and actions that signified changes in their key performance metrics like LTV, CPA, and revenue.

Moral of the story: sometimes, less is more. Metrics like impressions, clicks, and bounce rate are all relative, so measuring them against “industry standards” isn’t going to tell you anything about how successful your business is.

Take a lesson from VSCO and focus on the data that really tells a story: the metrics that show whether customers are buying and whether you’re making a profit on those sales.


In the modern world, data is accessible at almost every point. Whether it’s qualitative or quantitative, you can always collect data.

But simply collecting data isn’t enough to move the needle and turn a product into the next big hit.

According to the Harvard Business Review, most big product companies and executives are collecting data. And those that put that data to use are seeing higher growth rates than those that aren’t.

But most still struggle with data, especially when it comes to product marketing.

Despite this, there is hope for us yet. Duolingo, Intercom, and VSCO are shining examples of product-based companies using data to drive their decisions.

Duolingo only makes changes through A/B testing that goes beyond just one test. Intercom always beta tests their products and uses qualitative feedback to improve their approval rate. VSCO looks beyond vanity metrics to make major, game-changing decisions.

The data is always there waiting for you. But it’s only as good as the tests and moves you make with it.

