Boosted subscription sign-up rates by 2% through customer onboarding optimization using a growth experimentation framework.

Project
Ancestry Onboarding

Role
Product designer

Timeline
Jan. 2022 – Feb. 2023
1 year, 1 month

Context

Ancestry struggled to get customers to bill-through to a subscription while in a 14-day free trial.

Free trialers were cancelling for a variety of reasons, with cost being the most common. Because cost was outside of our team's scope, I looked at the next two: "Something else" and "I am done researching my family history."

“Something else” included Ancestry being too expensive, not having the time to use the product, and not being able to find information about themselves or their family.

For customers who selected “I am done researching my family history” as an answer, their top two reasons were that Ancestry had “answered the questions [they] had” and that they weren’t “finding anything interesting.”

Customers weren't finding value in the product in the initial days of their 14-day free trial.

Solution

A collection of experiments that increased key engagement metrics, laddering up to a 2% boost in subscription sign-up rates.

I worked on 42 experiments within a growth experimentation framework, optimizing the onboarding experience to increase those key engagement metrics.

Goal

My goal was to help increase subscription sign-up rate by 2% through designing experiments that increased key engagement metrics in the onboarding experience.

Some experiments shared a common engagement metric while others differed. Onboarding was a good place in the product experience to design experiments because it's something all customers experience and provides an opportunity to deliver value quickly.

Role & Team

I was a contributing designer on a team of 17.

The team included 4 product managers, 3 product designers, 2 user researchers, 1 content writer, 5 engineers, and 2 product analysts.

Initial Research

To understand whether customers were getting value from the product, I first had to learn how they were onboarding with Ancestry.

In the customer's first touchpoint with Ancestry, they're asked to fill out information for seven people in their family tree (including themselves). This information primes our system with reference points it uses to search for additional details about the customer's family.

Once a customer fills out information for these people, they are shown historical records that reveal new information about their family history. These can be yearbook photos, immigration documents, housing deeds, etc.

It seemed like a pretty straightforward flow, so why weren't customers getting value from the product after they filled out their information?

I found that only 67% of customers were being shown a historical record after filling out information for their family.

Why was this occurring? Were customers skipping sections and not adding their information? Was our system malfunctioning and failing to display historical records? Were customers quitting the flow prematurely?

Research Findings

Through a series of customer interviews and analysis of Hotjar recordings of customer behavior, I found that customers were not adding information for entire people in their tree.

There were a variety of reasons why customers were skipping entire people. Some wanted to "see the results right away," while others simply didn't want to add a specific relative to their family tree. In addition, some customers didn't know what they were working toward by adding these people to their tree. These reasons also resulted in customers dropping off in the middle of the flow.

I had to figure out a way to keep customers engaged, ensure they received a valuable historical record about their family, and give them a clear idea of what they were working toward.

Design Experiments

If customers skip entire people in their tree, they won't receive new historical information about their family and won't get value out of Ancestry.

This is because our system doesn’t have enough information to reference when trying to find details about the customer’s family. I brainstormed three different test ideas to address this behavior.

The first test focused on customers not wanting to add certain relatives. Rather than making them go through unnecessary screens they didn't want to fill out and risking drop-off, I let them choose which family members they wanted to add at the beginning of the flow.

The second test served as a way to ensure that after skipping through multiple people in their tree, customers wouldn't be sent to their tree with no historical records. It addressed their skip behavior and suggested a way to help them get value from Ancestry.

The third test accounted for people not understanding what they were working toward by adding people to their tree. It provided a short preview of what they would do and what would happen after they added a few people to their tree.

The second test performed the best, increasing historical record generation for customers by +16%.

The first test had negative results, with a -6% decrease in historical record generation.

The third test had mixed results. While it resulted in a +6% increase in historical record generation, it also resulted in a -3% decrease in tree creation rate. A decrease in tree creation rate meant that customers were seeing the preview of what they would be doing and then dropping off before beginning the flow.

While the second and third tests both had positive results, I decided it would be best to roll out only the second test to all customers, since it was the only one that successfully addressed every problem identified in our research.

Success

When experiments perform well, they get placed in the “holdout experience.”

50% of all new Ancestry customers are in the holdout experience and see all of our winning experiments. The other 50% never see any of our experiments.

At the end of the year, we compare the bill-through rates of the two experiences to see if we succeeded. In our case, we successfully increased subscription sign-up rate along with other metrics.
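As an illustration of how a comparison like this can be evaluated, the sketch below runs a two-proportion z-test on two groups' bill-through rates. This is not Ancestry's actual analysis pipeline, and all sample sizes and conversion counts are hypothetical; it only shows the general shape of checking whether a holdout group's lift is statistically meaningful.

```python
from math import sqrt, erf


def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: is group B's rate higher than A's?

    conv_* are conversion counts, n_* are group sizes.
    Returns (absolute lift, z statistic, one-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return p_b - p_a, z, p_value


# Hypothetical numbers: 10,000 customers per group,
# 30% bill-through in the control group vs 32% in the holdout experience
lift, z, p = two_proportion_z(3000, 10_000, 3200, 10_000)
print(f"lift: {lift:+.1%}, z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers, a 2-point lift on 10,000 customers per group is comfortably significant; smaller groups or a smaller lift would need a longer test to reach the same confidence.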

Outcomes

Through 233 total experiments, the team and I were able to boost subscription sign-up rates by 2%.

Along with this, we were also able to increase overall tree creation rate and historical record generation rate.