Remote Testing Paid Signup Flow

Recency: 2015
Role: Owned project, client-facing designer and A/B test developer
Process: Solo with over a dozen A/B test iterations
Top Challenge: Improve sales by genuinely improving UX

Iterative redesign and A/B testing of the Plans/Payment flow on a dating site to increase paid sign-ups.

Problems With Original

The original process failed to emphasize the top benefits:

There was no guidance on what to choose. Is a 3-month plan long enough?

Buying message packs and the option to enter a coupon code further complicate the choice. Analytics showed some of the options were never used. Some benefits, like “Extra privacy options”, are unclear. Some benefits, like “Organize singles events”, are not relevant for the average user.

Once the user picked a plan, they went to Step 2:

The 2-step flow was awkward. Step 1 displayed the per-month price prominently, yet Step 2 started with a higher prepay-3-months price. Analytics showed people went back and forth between the two steps, suggesting Step 1 wasn’t effective as a gateway page.

My Solution

My final redesign looked like this:

The key aspects of the solution were:

  • Hierarchy & Flow: Simplified the multi-step flow to a single page. Tabs for each plan let users explore plans without flipping back and forth between pages.
  • Value proposition: I turned the top benefit into the headline (“Make A Great First Impression, Send the First Message”) and showed only the next three best benefits below it instead of a long list.
  • User-centric: Guided the user’s decision with plain-English financial and situational advice. For example, the 6-month plan says: “Pays for itself in X months. Take your time meeting people.” I also used more casual language when describing the plans.
  • Hierarchy: Removed distractions and moved secondary payment options down to the footer.

Many Iterations Of A/B Experiments

Over multiple tests, I removed various components, such as the message pack footer. I also tried simplifying the choice by setting different defaults and using a single column layout.

Here’s an intermediate variation, which did NOT do better:

I tested setting the default to Plan 1 as well as Plan 2. During the testing, I monitored the impact on sales counts and revenue, as well as user behavior.
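Monitoring like this boils down to aggregating sales counts and revenue per test variant. Here is a minimal sketch of that aggregation, assuming hypothetical event logs (the `summarize` helper, the variant names, and the prices are all illustrative, not the site's real data):

```python
# Sketch of per-variant A/B metrics from hypothetical purchase logs.
# Each event: (variant, plan_purchased_or_None, revenue).
from collections import defaultdict

def summarize(events):
    """Aggregate visitors, sales, revenue, and derived rates per variant."""
    stats = defaultdict(lambda: {"visitors": 0, "sales": 0, "revenue": 0.0})
    for variant, plan, revenue in events:
        s = stats[variant]
        s["visitors"] += 1
        if plan is not None:          # a purchase happened
            s["sales"] += 1
            s["revenue"] += revenue
    for s in stats.values():
        v = s["visitors"]
        s["conversion"] = s["sales"] / v if v else 0.0
        s["rev_per_visitor"] = s["revenue"] / v if v else 0.0
    return dict(stats)

# Hypothetical sample: two visitors per variant.
events = [
    ("control", "3mo", 45.0), ("control", None, 0.0),
    ("variant", "6mo", 72.0), ("variant", "1mo", 20.0),
]
print(summarize(events))
```

Tracking revenue per visitor alongside raw conversion matters here: a variant can win on sales count while losing on revenue, which is exactly the trap described below.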

After each round of testing, I prepared summary reports and analyses with lessons learned and recommendations.

User Behavior Research

I tracked user behavior in detail, such as whether users chose a plan and then went back to change their choice, and how long they spent at a given step.
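Both signals can be derived from a timestamped click-stream. The sketch below is a simplified illustration, assuming hypothetical session data of `(timestamp_seconds, step_name)` pairs; the step names and the `analyze` helper are mine, not the site's instrumentation:

```python
# Sketch: time spent per step and back-navigations from a session trace.
def analyze(session):
    """Return (seconds spent per step, count of returns to the plan step)."""
    time_per_step = {}
    back_steps = 0
    # Pair each event with the next one to get dwell time at each step.
    for (t0, step), (t1, nxt) in zip(session, session[1:]):
        time_per_step[step] = time_per_step.get(step, 0) + (t1 - t0)
        if step == "payment" and nxt == "plans":  # went back to plan choice
            back_steps += 1
    return time_per_step, back_steps

# Hypothetical session: user reaches payment, returns, then pays again.
session = [(0, "plans"), (40, "payment"), (55, "plans"), (90, "payment")]
print(analyze(session))
```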

Some results were surprising. For example, setting the default to Plan 1 reduced sales of Plan 1 but tripled sales of Plan 2, yet overall revenue did not increase because of the price differences between the plans.
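The arithmetic behind that trap is worth making concrete. With made-up prices and sales counts (the real figures are not public), shifting buyers from a pricier plan to a tripled volume of a cheaper one can leave total revenue flat or worse:

```python
# Hypothetical numbers only: how a plan-mix shift can fail to lift revenue.
plan1_price = 30.0  # assumed price of the pricier plan
plan2_price = 10.0  # assumed price of the cheaper plan

# Before the default change: more Plan 1 buyers.
before = 100 * plan1_price + 50 * plan2_price

# After defaulting to Plan 1: Plan 1 sales halve, Plan 2 sales triple.
after = 50 * plan1_price + 150 * plan2_price

print(before, after)  # tripled Plan 2 sales, yet no revenue gain
```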

A/B Test Outcome

I A/B tested this solution, and over 3-4 iterations it increased revenue and sales for all plans.
