Sofar’s monthly newsletter is sent to a global audience of over 300,000 people at the end of every month, announcing new Sofar events in the upcoming month.
Since there are far too many events worldwide to announce in a single newsletter, the email contains links to the cities where events will take place.
The Status Quo: A long list of links and a manual workflow
When more and more cities started to host Sofar events, the formerly short and convenient overview of cities soon became a long list of links. Dividing the links into sections by continent or region made the list easier to scan, but didn't solve the underlying problem: users still had to look carefully and scroll a long way to find their city.
Furthermore, the design is not mobile-friendly: an average thumb hits about 3 links at a time on a smartphone, and based on analytics we know that about 60% of readers open the newsletter on their phone.
The other issue was an organisational one: the newsletter was created completely manually. All city teams had to submit their event dates to the person responsible for adding new cities and removing those that didn't have any upcoming shows.
The Goal: Increasing conversion and automating the creation process
One of our goals was therefore to replace this manual process with an automated one requiring minimal effort from staff members. The other, more important goal was to increase visits to the event pages and conversion* by at least 5% each.
*Side note: Users can’t buy event tickets directly on the website, but have to apply and be selected beforehand. Therefore we measured our conversion based on event applications, since this was a more reliable KPI than purchases.
Looking into analytics to assess the newsletter’s performance
When I was tasked with redesigning Sofar’s monthly newsletter, my first question was: What do we know about the newsletter and its problems?
After taking a look into analytics, I found that the newsletter has a huge impact on website traffic and event applications.
While many people reached our event pages and applied for an event, I also noticed a drop-off of over 50% between the city page and the event page. That means not even half of the users who click a city in the newsletter make it to an event page, where event applications take place.
Conducting a survey to gather feedback and expectations
What I couldn't find in analytics was qualitative feedback: how helpful is the newsletter from our users' perspective, and does it meet their expectations?
So we set up a survey comprising a small number of open and closed questions using Google Forms and sent it via MailChimp to about 400 subscribers who had received the newsletter at least once. Creating segments and sending the same survey to different groups allowed us to collect data separately for each sample, so we could compare what role the newsletter plays in cities with 1 event per month versus cities with 50 events per month.
Prioritising problems based on survey results
Although surveys usually give limited insight into the why and need to be analysed carefully, since people often don't tell the truth, the results gave us a first hint at possible design directions and helped us outline the topics we wanted to investigate in more in-depth research. Here's what we found:
Top problems identified
- Our assumption that the long list of cities causes a poor user experience was confirmed. Some participants told us they use shortcuts like cmd + f and then type their city in the search field to access it faster. However, this function isn’t supported on mobile devices.
- Users are most interested in upcoming event dates in their city. The current newsletter doesn't provide this information directly; users have to click through the website to get there.
- The newsletter lacks relevant, editorial content about artists and recent events, as generally expected from a newsletter.
The next step was to translate the collected insights into actual designs that could be tested with users to validate our findings and assumptions. Having surveyed our audience beforehand, we had a pretty clear idea of what to test:
A newsletter tailored to the user’s city showing event dates and interesting content about artists.
Diverging and Converging, or Sketching and Discussing
My design colleague and I first sketched approaches individually before coming back together to discuss our ideas. This way, we had enough time to focus on our own ideas without influencing each other. The final solution we took into digital wireframes was a combination of both our ideas – collaboration at its best.
Creating digital wireframes: Using the power of Sketch to build designs faster
I used Sketch from the very beginning to keep everything in one place. Since the project wasn't overly complex from a UI design perspective, this approach made it super fast to craft the final deliverables: no asset transfer from one tool to another.
Spending so much time in Sketch also made me more proficient with the tool and its plugins, like Craft. For instance, working exclusively with symbols saved me time when creating prototypes for different cities and users.
Read more about how I use Sketch and Craft to design with real data on my blog.
Encountering the biggest challenges: Information inconsistency and lack of content
Since the newsletter was supposed to pull all content automatically to reduce manual effort, we could only use what is available on the website or in the API. This turned out to be an immense challenge since the event information presented on the website varied widely across different cities. This is due to the fact that events are managed locally by the city teams and every team has different capacities and workflows.
After some research on Sofar's website and discussions with the back-end developer, edge cases suddenly became common cases, and I had to find a solution that took all the different scenarios into account.
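Sofar's actual API isn't documented here, so the sketch below is purely illustrative: a hypothetical normalisation step that maps inconsistent event records onto the fields a newsletter template would expect, with a fallback for every field a city team might omit. All field names and the placeholder URL are assumptions.

```python
# Hypothetical sketch: event payloads vary by city team, so every field
# needs a fallback before it can appear in the newsletter template.
def normalise_event(raw: dict) -> dict:
    """Map an inconsistent API event record onto the fields the
    newsletter template expects, with safe fallbacks."""
    return {
        "city": raw.get("city", "Unknown city"),
        # Some teams publish exact dates, others only a month string.
        "date": raw.get("date") or raw.get("month") or "Date TBA",
        # Neighbourhood is optional; fall back to the city name.
        "area": raw.get("neighbourhood") or raw.get("city", ""),
        # Not every city uploads imagery; use a generic placeholder.
        "image": raw.get("image_url", "https://example.com/placeholder.jpg"),
    }

events = [
    {"city": "London", "date": "2019-03-14", "neighbourhood": "Hackney"},
    {"city": "Lisbon", "month": "March"},  # sparse record: no date, area, image
]
normalised = [normalise_event(e) for e in events]
```

The design choice here is simply that the template never sees a missing field: every scenario, from a fully detailed record to a city with nothing but a name, renders something sensible.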
Recruiting the right people for the different scenarios to test
Designing for different user scenarios ultimately requires testing those scenarios with the users affected by them. I therefore recruited users from all around the world – from cities with 50 events a month, to cities with 1 event a month, to cities with no events at all – and presented each of them with a prototype tailored to their situation. This allowed me to get feedback on all cases while still providing users with a realistic experience.
Using surveys as recruiting instruments
The survey we ran at the beginning of the project was a big help during recruiting: we had asked participants at the end of the survey whether they were interested in further research and collected their email addresses. When we contacted them for the testing sessions, almost every participant was happy to help.
Moderated in-person and remote sessions to understand the why
To get as much qualitative feedback as possible, we chose a combination of user testing and in-depth interviews as our research method. This choice had 3 key benefits:
1. Validation and probing for feedback from the survey
Since most of our participants had taken part in the survey, I asked follow-up questions about the current newsletter at the beginning of the session. Even if a participant hadn't taken the survey, this was a great chance to collect in-depth feedback on the current design.
2. Observing behaviour
Observing users while they explored the newsletter independently revealed where they stop and what catches their interest. Paired with the think-aloud method, we were able to learn what exactly is exciting or frustrating about a certain feature. Conducting tests on both mobile and desktop let us compare the design's effectiveness on each medium.
3. Get to know the user
As this was my first in-depth research with Sofar users, I learned a bunch of other things that were not necessarily related to the newsletter but helped me see everything in context and proved useful in other projects. It's always worth getting a little off script and including some generic icebreaker questions to get a more complete picture of the person in front of you.
A/B testing different versions to find the best way to go
After incorporating the insights from the interviews into the final designs, we tested 5 variations of the new design against the current newsletter in an A/B test via MailChimp. The aim was, on the one hand, to learn how the new design performs compared to the current one (the control group); on the other hand, we wanted feedback from a larger audience and to learn how small changes affect performance.
We tested with a small sample of our global audience to reduce risk, but made sure that each test group had a few thousand participants so we could detect significant differences.
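As a rough illustration of why each test group needed a few thousand recipients, here is a standard back-of-the-envelope sample-size estimate for comparing two proportions. The baseline click rates below are invented for the example, not Sofar's real numbers.

```python
import math

# Sketch: minimum sample size per variant to detect a difference between
# two proportions, using the usual normal-approximation formula.
def sample_size_per_group(p1: float, p2: float,
                          z_alpha: float = 1.96,  # 95% confidence, two-sided
                          z_beta: float = 0.84    # 80% power
                          ) -> int:
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
    return math.ceil(n)

# Detecting a lift from a 10% to a 12% click rate already needs a few
# thousand recipients per variant:
n = sample_size_per_group(0.10, 0.12)
```

The smaller the difference you want to detect, the faster the required sample grows, which is why a test with only a few hundred recipients per variant rarely settles anything.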
Fake it ‘til you make it
For the A/B test, the newsletter design had to reach a fidelity level that could be tested in real life, so it needed to be translated into code. Since full automation would have required weeks of development work, we decided to implement only the front end and hack the content together until we were sure what the optimal newsletter should look like. (During this time I learned some valuable lessons about the limits of email.)
Involving the team to create relevant content
While the dev team mastered the email code in MailChimp, I worked closely with the city teams, customer support and artist relations to deliver content like copy, images and links in time. Being at the core of the event business every day, they know best which Sofar topics and news are interesting at the moment.
Basing design decisions on both quantitative and qualitative data led to a newsletter design that far exceeded the set goals of a 5% increase in event page traffic and conversion:
Based on our A/B test, the redesign doubled the number of visits to event pages by removing just one crucial step between users and their end goal – the step that previously caused the drop-off of more than 50%.
More importantly, 20% more users coming from the new design applied for an event, since they now had direct access to events and a pre-selected number of options to choose from.
The newsletter redesign is currently online in London and New York, Sofar’s most active cities. It will be rolled out globally as soon as the automation work is finished.
Top lessons learned
Don’t conduct an interview and take notes at the same time
Even though I ran the interviews together with a note-taker and recorded everything, I also took notes during the interview, probably out of fear of missing something. This approach might save some time during analysis, but it prevents you from entering an in-depth discussion with your participant and makes you appear distracted. You end up missing opportunities to ask the follow-up questions that lead to the insights you're looking for. You have plenty of time to review notes and recordings, but only an hour or less with the person in front of you. Make the most of it.
Don’t hand over a Sketch file to a developer
While this might work for some designer–developer relationships, I ran into unnecessary problems when time constraints led me to give a Sketch file to the front-end developer so that he could grab assets and specs more quickly. Unfortunately, the Sketch file became his design reference (instead of the PNGs I had sent him earlier), and without the font styles used in the design installed, he ended up building a completely broken design.
Don’t rely on one single data point
After an A/B test, seeing all the data coming in can become exciting and overwhelming. It’s tempting to see differences where no real differences exist. In fact, after running a significance test, we realised that none of the different redesigns tested outperformed the others in terms of conversion. It was then on us to cross-compare other data like open rate and click rate and finally refer back to the qualitative feedback we got from the interviews to decide which redesign is most promising.
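As an illustration of the kind of check described above, here is a sketch of a two-proportion z-test on conversion counts. The numbers are invented; the point is that a visible lift in raw counts can still fail to reach significance.

```python
import math

# Sketch: two-sided significance test for the difference between two
# conversion rates, using the pooled two-proportion z-test.
def two_proportion_p_value(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 320/4000 vs. 342/4000 conversions looks like a lift in raw counts,
# but the difference is not statistically significant:
p = two_proportion_p_value(320, 4000, 342, 4000)
```

When the p-value stays above the usual 0.05 threshold, as in this made-up example, the honest conclusion is "no winner on this metric" and the decision has to lean on other signals, exactly as described above.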