
Onboarding A/B test increases tracking number creation by 25% and improves retention for the 90-day user cohort
Duration: 1 month | Role: UX Designer | Team: Josephina Kaiser (UI design), Pia Kendrick (PM)
Tools used: Figma, FigJam | Methods: Lo-fidelity wireframes, A/B testing, rapid iterations
Problem
The business had noticed a steady rise in churn among new users. One theory my PM and I wanted to test was whether the initial onboarding was contributing to that churn. We also had a second signal: a decline in the number of source tracking numbers created by new users. Source tracking numbers are more profitable than website pool numbers because users buy them a la carte, which increases the variable monthly usage that makes up a majority of revenue. There was also a correlation between users with more source tracking numbers and lower churn. With this project, we wanted to test whether letting customers create more source tracking numbers during the initial onboarding would help SMB users with limited knowledge of dynamic source tracking retain better than users who try out dynamic source trackers without understanding how to set them up properly.
We ran an A/B test to evaluate one hypothesis:
Would removing the option to create a website pool and replacing it with the ability to create multiple source trackers lead to a cohort of users who churn less?
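To make the comparison concrete, here is a minimal sketch (assuming a simple two-cohort split and placeholder numbers, not real CallRail data) of how 90-day churn in the variant could be tested against the control with a two-proportion z-test:

```typescript
// Hedged sketch: compare 90-day churn between the control cohort (website pool
// onboarding) and the variant cohort (multiple source trackers) with a
// two-proportion z-test. Cohort counts below are placeholders, not real data.

interface Cohort {
  users: number;   // users who completed onboarding in this arm
  churned: number; // of those, users who churned within 90 days
}

// Standard normal CDF using the Abramowitz & Stegun erf approximation.
function normalCdf(z: number): number {
  const x = Math.abs(z) / Math.SQRT2;
  const t = 1 / (1 + 0.3275911 * x);
  const poly =
    t * (0.254829592 + t * (-0.284496736 + t * (1.421413741 + t * (-1.453152027 + t * 1.061405429))));
  const erf = 1 - poly * Math.exp(-x * x);
  return z >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
}

// One-sided test: does the variant cohort churn less than the control cohort?
function churnTest(control: Cohort, variant: Cohort) {
  const p1 = control.churned / control.users;
  const p2 = variant.churned / variant.users;
  const pooled = (control.churned + variant.churned) / (control.users + variant.users);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / control.users + 1 / variant.users));
  const z = (p1 - p2) / se;
  return { controlChurn: p1, variantChurn: p2, z, pValue: 1 - normalCdf(z) };
}

// Placeholder numbers purely for illustration.
console.log(churnTest({ users: 1000, churned: 200 }, { users: 1000, churned: 150 }));
```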
Purpose
We had an opportunity to improve the business's usage and churn metrics. Historically, the app has encouraged users to create a website pool as a way to dynamically track visitors to their site. Although this is a differentiator for us, a significant number of users were calling support for help setting up a dynamic source tracker, which meant website pools added friction to setup. To get all users (from tech savvy to beginner) faster time to value on their marketing attribution goals, I needed to simplify the requirements to complete onboarding and get users their first attributed lead to illustrate the value.
Challenges
Aligning on an inclusive source tracker list
CallRail serves a broad range of industries, like healthcare, home services, legal, and real estate. Each industry has different advertising platforms that are most relevant to it. How might we show a limited but relevant list of options across all of these industries while being mindful of decision fatigue (Hick's law)?
Making a lasting impression on a wide range of users
This setup workflow is the first glance into CallRail's in-app experience, and it serves as an unboxing of the app. There is a lot of opportunity to showcase how well CallRail can accomplish the user's job to be done. How might we address users' expectations and show in the onboarding that CallRail meets them?
Discover
The UX team utilizes FullStory, a user research tool, to glean insights on completed funnels, heat maps, and user engagement with features. I made segments to track the data on our current onboarding. Here are the highlights:
60% of users completed the first-run experience on desktop and 40% on mobile. The first-run was not mobile responsive, meaning we were not optimizing for on-the-go account setup.
19.5% of users dropped off the flow at the first step, where we ask them what type of tracker they want to create.
15% of users who completed the workflow went on to create another number. Users who created another number had a lower churn rate at 90 days.
Auditing the workflow content showed missed opportunities to use smart defaults, and the grouping of the questions created a high click interaction cost.
Takeaways:
How might we address a user's need for campaign attribution more plainly? The JTBD for users at this stage is to buy a tracking number and start seeing lead generation.
How might we reduce decision-making for the user and add friction only when necessary?
Ensure the flow is mobile responsive
User flow chart
The onboarding broke number creation into too many steps, leading to high user drop-off.
User empathy comic strip
Ideation
Since this project called for new UI patterns and illustrations, I collaborated with our UI systems designer from beginning to end. After we gathered user research and aligned with our project manager on scope and what we would not be doing, we moved to lo-fi sketches to quickly brainstorm different approaches.
I wanted to redesign the onboarding by reverse engineering it: users come in already knowing where they want to assign tracking numbers. So how might we jump them into that process effectively?
Pro - Clear content
Con - When a user adds a couple of numbers, the scroll depth increases.
Pro - Intuitive affordance to encourage users to add more than 1 tracking number
Con - We are asking users to name a number they might not have yet, which introduces a cognitive gap in the sequence of getting a tracking number
Pro - Reduces click cost by showing all options at a bird's-eye view. Helps with recognition and recall by showing logos
Con - We had not yet aligned on the final list of advertising platforms. Depending on the list, we might not need the filtering options this design supports.
✅ Approved lo-fi flow
Exploration
After I gained alignment from the project manager and development team on the lo-fidelity approach for the entire onboarding number creation workflow, I transitioned the design to mid-fidelity to make more concrete decisions about the experience. This part of the process filled the content and UI gaps in the design and allowed us to get detailed, rather than strategic, feedback from our team.
Using the first screen as an example, this is how we explored the mid-fidelity design starting from the lo-fi.
Callouts:
The final list of advertising platforms came from usage data on the most popular trackers users create.
Users who don't see their advertising platform can select "Other"
Design and prototype
After I finalized the UI and content decisions and documented why we would or would not address certain feedback, I made a hi-fidelity prototype for the engineers and product department to reference. Below are some callouts on the design choices made.
Limit users to 5 selections
We added a disabled variant to the existing selection card pattern. This is clear UI feedback to tell users they are at the selection limit.
Smart defaults and global settings
To help users make fewer decisions, we auto-populate the number name field with the type of number selected.
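As an illustration only, and assuming a simple card-selection model rather than CallRail's actual front end, here is a sketch of how the five-selection limit and the smart-default number name could be wired up:

```typescript
// Illustrative sketch only; the option model and helper names are assumptions.
const MAX_SELECTIONS = 5;

interface TrackerOption {
  id: string;
  label: string;     // e.g. "Google Ads", "Facebook", "Other"
  selected: boolean;
}

// A card renders in its disabled variant once the selection limit is reached
// and the card itself is not one of the selected options.
function isCardDisabled(options: TrackerOption[], option: TrackerOption): boolean {
  const selectedCount = options.filter(o => o.selected).length;
  return selectedCount >= MAX_SELECTIONS && !option.selected;
}

// Smart default: pre-fill the number name field from the selected tracker type
// so the user can accept it without typing anything.
function defaultNumberName(option: TrackerOption): string {
  return `${option.label} Tracking Number`;
}
```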
4 steps to 3 steps
The original workflow had 4 steps. This workflow combines the "Activate" and "Finish" steps into one so users clearly see what they completed.
Conclusion
This was an exciting project to work on because the spirit was "let's ship to learn." We had a hypothesis that the decrease in source tracking numbers was strongly correlated with churn and decreased usage, so we checked our assumptions with an A/B test.
How did it do? Success metrics:
The mobile version addressed negative reviews from home service and real estate investors. This project served as a strong case for investing more in mobile design
Users bought 25% more tracking numbers, increasing revenue
Onboarding completion rate went up by 15%
Users churned 8% less