
Increasing conversion rates by 25% for Vodafone.


25% more reviews were left on the newer design during A/B testing.


Overall, 3% more users completed onboarding after the tweaks were made.

How I achieved the above...

PROJECT OVERVIEW AND MY ROLE

Vodafone were going through a digital transformation similar to Lloyds Bank (see project here). One of the big requirements was to improve their visual UI. The design team wanted to use this as an opportunity to improve the user experience of the journeys, rather than just refresh the visuals.

As you can imagine, Vodafone have many online journeys across web, responsive mobile, and native app, so there was a lot to get through. Designers were assigned to feature teams, but we often needed to collaborate across teams.


DISCOVERY RESEARCH WITH USERS

We wanted to understand which journeys needed the most improvement and what those improvements could be. To get this insight we ran a few workshops with a variety of users from our Vodafone user pool. The pool had been gathered by the marketing department in collaboration with the design team: users joined by completing a survey at the end of a variety of journeys, which meant we could link each user to a specific journey. We focused on two groups:

  • People who had completed a transaction by visiting the basket and checkout.

  • Users who had left one or more reviews for a product.

Within these groups, people had experienced a variety of the screens we hoped to improve. Some had used desktop; others had used mobile through their browser. The workshops were very open and only semi-structured. We created a safe environment in one of our meeting rooms, and while moderating I made sure to ask questions directly to each user so that everyone participated.

RESULTS FROM THE WORKSHOP

It was clear from the workshops that the two journeys had major issues. Here are some of the pain points we managed to gather:

  • When a new user was checking out they had to go through around eight steps, over eight pages. Many users felt lost and weren't sure where they were in the process or how long it was.

  • The same users also felt they didn't have enough information during checkout; for example, they could not see what they were purchasing without navigating back to the basket.

  • Users who had left reviews found the process very cumbersome. They wanted to see other people's reviews at a glance, which they could not, often citing 'standard' review elements such as an average score, an overall rating, and a list of reviews left.

  • People who had left reviews also felt the review section was hidden, and many suspected this had been done on purpose to hide lower-scoring reviews and products.

SUPPORT FROM QUANTITATIVE DATA

The qualitative data above is great, and very handy, but it's always better to combine qualitative with quantitative. So the design team set out to see whether we had any data on these journeys. Through talking to developers and the data management teams we found tag information for both journeys. It was a lot to dig through, and took some time, but we found:

  • Nearly 20% of people who started to check out later dropped off and didn't complete a purchase in that session.

  • 35% of people who started to write a review never finished, and didn't submit it to the product page.

These two pieces of information, in combination with the workshop results, helped us build a case to not only update the visuals for these journeys but also optimise them: adding elements where needed, and reconsidering the flows and layouts.
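To give a sense of the digging involved, here is a minimal sketch of the kind of funnel calculation behind numbers like these. Everything in it is a stand-in: the step names, the CSV layout, and the tag_events.csv file are hypothetical, not the actual Vodafone tag data or tooling.

```python
import csv
from collections import defaultdict

# Hypothetical export of tag events: one row per event,
# with a session id and the journey step that fired the tag.
FUNNELS = {
    "checkout": ("checkout_started", "purchase_completed"),
    "review": ("review_started", "review_submitted"),
}

def drop_off_rates(path="tag_events.csv"):
    sessions = defaultdict(set)  # session_id -> set of steps seen
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions[row["session_id"]].add(row["step"])

    rates = {}
    for journey, (start, finish) in FUNNELS.items():
        started = [s for s in sessions.values() if start in s]
        finished = [s for s in started if finish in s]
        if started:
            # Share of sessions that started the journey but never finished it.
            rates[journey] = 1 - len(finished) / len(started)
    return rates

if __name__ == "__main__":
    for journey, rate in drop_off_rates().items():
        print(f"{journey}: {rate:.0%} of sessions dropped off")
```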

REDESIGNS

We decided to tackle these two issues slightly differently. For the review problem we would work with team Atlas, the A/B testing kings, to run a series of A/B tests on different ways to leave reviews.

Secondly, for people dropping off during checkout we would conduct a full UX review of the journey, looking for quick wins. This divide in approach came down to which teams owned which areas of the site: the review journey was owned across the tech branch, whereas the checkout was owned solely by one team. Because of this, no A/B testing could be carried out on the checkout, though that would have been our initial approach.

Below are some of the redesigns we did for leaving reviews, and some additions we made to the checkout journey.

MONITORING SUCCESS

As we A/B tested the changes to the 'making reviews' journey we monitored bounce rates and the selling rate of each phone. Our hypothesis was that the newer design would produce a far greater number of completed reviews: at least a 20% lower drop-off rate.

- We made sure that not too many elements were changed from the control group; only a few flow issues were tested.

- We let the test run for the full two weeks, to improve statistical significance.

- Balanced traffic was used, sending the altered URL to 50% of the users.


Ultimately we wanted to run more than two versions of the journey, but due to time constraints we had to go with only two.
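For a flavour of how a result like this gets judged, here is a minimal sketch of a two-proportion z-test on completed-review rates for the control and the variant under a 50/50 split. The visitor and conversion counts are made-up placeholders, not the real test data or the exact method team Atlas used.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Placeholder counts for the two-week window: control vs redesigned journey.
p_a, p_b, z, p = two_proportion_ztest(conv_a=780, n_a=1200, conv_b=975, n_b=1200)
print(f"control {p_a:.1%} vs variant {p_b:.1%}  (z={z:.2f}, p={p:.4f})")
```

A small p-value here would mean the lift in completed reviews is unlikely to be down to chance, which is why we let the test run its full two weeks before drawing conclusions.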

As for the onboarding, we conducted a UX review with four designers over two days and came up with solutions. The business wanted to build these without testing, which we agreed to as long as the changes were tracked.


RESULTS

The A/B test was monitored for two weeks, and the changes to onboarding were built, launched, and tracked over an extended period.


25% more reviews were left on the newer design during A/B testing.


Overall, 3% more users completed onboarding after the tweaks were made.


Want to work with me? Contact me below.


Find and chat to me on LinkedIn here.