SUBSCRIPTION SERVICE
Increasing the awareness and clarity of a brand new subscription service.
I am still in the process of adding the full case study to this page...
CLIENT
Bowglass Works
ROLE
Content Designer
TIMELINE
3 Months
COMPANY BIO
Boutique glass-blowing company based out of New York

The Problems
- The new subscription program wasn't getting the traction the stakeholder had hoped for.
- Users weren't being onboarded in a way that left them feeling confident about the program.
- Subscription tiers were not aligned with research and testing insights.
The Process
- A/B testing
- Competitive analysis
- Survey
- 5-second testing
- Content audit
- Task analyses
The Solutions
- Relocate the subscription entry point
- Redesign the price decks
- Streamline onboarding
- Reduce the number of tiers
The Impact
- Higher task completion rate among our test participants.
- Increased 5-second testing scores on the price decks.
- Higher comprehension rate of the subscription service, measured by post-task interviews.
- Users reported feeling less overwhelmed.
OVERVIEW
BowGlass had a new subscription program that had only been live for a few months. It had a few good things going for it, but it wasn't getting the kind of traction the stakeholder had initially wanted.
THE GOOD THINGS
- It had the potential to be a great moneymaker.
- It was pretty easy for the stakeholder to manage.
- It was a cool, unique feature for the business (we didn't come across it in other, similar businesses during our competitive audits).
THE ISSUES
- Users didn't know about the subscription program.
  - Our content audit revealed very little traffic to the subscription page.
  - Of the people who remembered seeing a subscription service during our testing, nobody was able to tell me what it specifically entailed.
- Users weren't being onboarded properly.
  - Of the users who wanted to sign up for the subscription, very few actually felt confident in their knowledge of how the whole thing worked.
- Key information wasn't being conveyed.
  - The stakeholder had some great ideas about how the whole thing was going to work, but the information wasn't reaching the user.
- The program wasn't aligned with specific user needs.
  - Our research efforts showed almost zero interest in the second tier of the subscription program. In other words, nobody expressed an interest in receiving two new pieces of glassware per month.
THE SOLUTIONS
With the redesign, I wanted to accomplish a few things:
- Adjust the entry point so more users are aware of the program.
- Redesign the price decks to give users a better sense of their financial obligation.
- Onboard the user with as little friction as possible.
- Simplify the service and decrease cognitive load by reducing the number of tiers.
PRICE DECK REDESIGN
OVERVIEW
The original ask was simply to redesign the price decks for his new subscription service. Given the quality of the original version, improvements were easy to make. I still put the redesigns through testing, and our research produced some strategy-altering insights (see the next section for details).
CLOSE-UP OF ORIGINAL PRICE DECKS

- The original subscription draws were grouped within the original product page.
- The text was hard to read, the pricing was inconsistent, and the formatting didn't allow for scannability.
PRICE DECKS ON PRODUCT PAGE

- The original price decks were only located at the bottom of the product page.
REDESIGNED PRICE DECKS - DESKTOP

- The new price decks tested much better across our 5-second tests.
- The differences between tiers are immediately obvious.
- We also adjusted the price to make the second tier more cost-effective (and appealing).
- We developed several iterations of these price decks and put them through multiple rounds of A/B testing to get to the final prototype.
SIMPLIFYING THE SERVICE
OVERVIEW
The stakeholder originally offered two separate tiers for his subscription program. Users could sign up to receive either one piece of glass per month or two pieces of glass per month.
THOUGHTS/ASSUMPTIONS
As a UX writer/content designer, my first question about almost anything is whether there's an actual need for whatever is being implemented, and the answer lives in research and testing.
When I first signed on for this project, I loved the idea of a subscription program. Our competitive audits proved that this was a unique service (no other company was doing it), and the ROI for the stakeholder was high. A win-win.
Still, the program felt busy, for lack of a better word, and I had a hard time believing that there were people out there who wanted two new pieces of glass every month. On top of that, the stakeholder didn't just want to offer two tiers; he wanted different levels within each tier.
To test this assumption, I turned to research in the form of a survey (a Google Form) and pulled participants from his large social media following.
Of the 30 people who filled out the form, only 1 person expressed an interest in getting 2 pieces of glass a month.
INSIGHT-BASED SOLUTION
My suggested solution was to simplify the subscription program and focus on the things our users actually expressed interest in.
Here's what they want/like, according to research:
- A subscription program
- The option to either choose the category or have BowGlass surprise them
Here's what they don't want/don't like, according to research:
- The second tier of the subscription program
- Being overwhelmed by a complicated process (choosing between tiers, and then choosing between categories)
REDESIGNED PRICE DECK, MOBILE

This is a rough, untested prototype of what the price deck could look like for his subscription program.
- We took away one of the tiers and focused our efforts on just getting subscribers.
- The owner wanted to add in the T-shirt and Polaroid, two things that were very much on-brand.
ONBOARDING SCREENS



ORDER SCREENS


Part of the subscription program gives the user the option to either choose a category of glass to be delivered or have the owner surprise them.
ENTRY POINT
ORIGINAL CONVERSION POINT

- The original subscription draws were tucked in with the other products on the product page, so very few people knew about them.
TESTING
We had a few assumptions about the entry point that we validated by looking at click-through rates and performing some light task analysis.
The task analysis was pretty basic: we asked a few participants to explore the website and place an item in the cart. The participants were family members, friends, and two existing customers. We followed up with a short interview.
The results were as expected: low click-through rates, and participants couldn't recall specific details about the service, or that it even existed.
We knew the solution was to relocate the subscription draw, and we relied on both our own assumptions and insights from the interviews to determine three better conversion points.
- Content on the landing page
  - The landing page was the obvious choice. Furthermore, almost all of our participants said they expected to see the subscription here.
- Content on the verification screen (post-purchase)
  - The verification screen was one of those wonderful cases of our assumptions being validated by specific user insights. The reasoning made a lot of sense in the realm of content design: users who liked the product enough to buy something are far more likely to subscribe to the monthly service.
- Content in the verification email
  - We added a quick shout-out to the subscription program here for the same reasons stated above.
TO NOTE:
All of the following iterations were put through rounds of A/B testing to ensure the actual words were in line with user needs.
Updates still in progress here.
LANDING PAGE CONTENT
This will be "below the fold", under the capabilities zone of the landing page (third section).
VERIFICATION SCREEN CONTENT
