One of the most popular songs on Genius, Eminem's "Rap God," has 117 annotations created by 1,200+ contributors, a few annotations verified by Eminem himself, thousands of comments, a couple of related articles and videos by Genius, 20+ question-and-answer pairs, and over 20 pieces of track info such as the recording location, samples, and release date. The main product design challenge at Genius is fitting all of this information onto a page without detracting from the lyrics reading experience, and doing all of that on mobile.

The song stories project reimagined the song page from the ground up, replacing every section and every interaction on the page. I led and executed all the design and prototyping, and worked closely with the project lead to plan, organize meetings, and communicate with stakeholders from other departments.

In 2016, Spotify launched Genius “fact tracks” on select songs — it was a huge success, and by the end of the year we decided to take the idea even further with song stories.

Fact tracks are text only and limited to the duration of their song, so a lot of interesting information gets omitted. Building a similar experience on the Genius platform meant no limits on content or media, which opened up content-specific interactive features. For example, the editorial team wrote a fact track card comparing "Black Beatles" by Rae Sremmurd to "Day Tripper" by The Beatles; in the song story version, we added two interactive slides: an audio scrubber to crossfade between the compared tracks, and a poll to choose which band is better.
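A crossfade scrubber like the one described maps a single scrubber position to a pair of volume levels. As a minimal sketch (the function and field names here are invented for illustration, not taken from the actual Genius implementation), an equal-power curve keeps perceived loudness steady as the user drags between the two tracks:

```javascript
// Hypothetical sketch of an equal-power crossfade for a scrubber slide.
// t is the scrubber position in [0, 1]: 0 plays only track A ("Black Beatles"),
// 1 plays only track B ("Day Tripper"), and 0.5 mixes them equally.
function crossfadeGains(t) {
  const clamped = Math.min(1, Math.max(0, t));
  return {
    trackA: Math.cos((clamped * Math.PI) / 2), // fades out as t approaches 1
    trackB: Math.sin((clamped * Math.PI) / 2), // fades in as t approaches 1
  };
}
```

At the midpoint both gains are about 0.707 rather than 0.5, which avoids the audible dip in volume that a straight linear crossfade produces.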

Packaging annotations, comments, articles, and videos into a separate experience meant we could show a single, strong call to action on the song page: the lyrics are unobstructed by line-by-line annotations, and the value proposition of the song story is clear: "learn more about this song."

Before the design process began, it was clear that static mockups wouldn't be sufficient, since the product relies heavily on audio and video. I decided to use Facebook's Origami from the beginning of the project, which not only helped me understand what I was designing, but also improved communication with the rest of the team, from editorial to engineering. Origami lets you load a prototype onto a real phone, so something that feels real can be passed around for feedback.

Initially the prototypes just captured small interactions, like swiping between slides or voting in a poll; there were over 20 different Origami and Sketch files. By the end of the prototyping phase I had built a fully functioning song story, fed by a JSON file and using Origami components to render each slide type. This made changes easy and allowed us to begin user testing.
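The JSON file feeding that prototype might have looked something like this. This is a hypothetical sketch; the actual schema isn't shown in this write-up, so every field name here is invented for illustration. The idea is that each entry in `slides` names a type, and the prototype picks the matching Origami component to render it:

```json
{
  "song": "Black Beatles",
  "artist": "Rae Sremmurd",
  "slides": [
    { "type": "text", "body": "An introductory fact about the song." },
    { "type": "video", "url": "https://example.com/clip.mp4" },
    { "type": "audio_compare", "trackA": "Black Beatles", "trackB": "Day Tripper" },
    {
      "type": "poll",
      "question": "Which band is better?",
      "options": ["Rae Sremmurd", "The Beatles"]
    }
  ]
}
```

Driving the prototype from data like this is what made changes cheap: editing a line in the JSON swapped slides in and out without touching the Origami patches themselves.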