Adobe Premiere Pro
Web app prototype
At the start of the pandemic, the music industry took steps to move performances online. However, despite attempts to recreate a magical evening spent with your favorite artist, live streams fall short of in-person shows: they remain less popular and cannot make up the losses of an industry worth $20 billion in 2019. The mission of DIVE, or Designing a Virtual-concert Experience, was to uncover what makes live concert experiences so extraordinary and how we can balance a seamless online experience with the emotional connection between an artist and their audience.
This is the process of how DIVE came to be.
Our goal was to research and design a live-stream concert experience on an interactive platform built specifically around an artist's performance. To promote local artists as well, we partnered with Seattle musician Chong the Nomad as our model artist.
During the pandemic, the music industry began exploring live-stream performances to reach audiences without physical interaction. However, live streams are still not as popular as their in-person counterparts. We wanted to understand how users interacted with live streams and how this might change the future of live music after the pandemic.
Persona and Scenario Development
Visual and user interfaces
Prototyping and Testing via Figma & GameMaker
Chat with the artist and other attendees during the concert.
Users can also participate in artist-generated surveys for different purposes throughout the concert (e.g., picking the next song).
Enjoy live music in a custom-made environment designed after the artist's branding
The artist, as our client, will work with the team to design a virtual environment that fits the style and theme of the live event.
Create collaborative art with fellow concert attendees
Users appear as visual representations designed specifically for the artist; these avatars interact with other attendees in the virtual space and adapt to how users act, how the artist acts, and how the music sounds.
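The artist-generated survey feature described above can be sketched as a simple vote tally. The types and function below are illustrative assumptions, not code from the actual prototype (which was built in Figma and GameMaker):

```typescript
// Hypothetical in-concert poll, e.g. "pick the next song".
type Poll = { question: string; options: string[] };

// Tally votes and return the winning option; ties go to the
// option listed first. Votes for unknown options are ignored.
function tallyVotes(poll: Poll, votes: string[]): string {
  const counts = new Map<string, number>();
  for (const option of poll.options) counts.set(option, 0);
  for (const vote of votes) {
    if (counts.has(vote)) counts.set(vote, (counts.get(vote) ?? 0) + 1);
  }
  let winner = poll.options[0];
  for (const option of poll.options) {
    if ((counts.get(option) ?? 0) > (counts.get(winner) ?? 0)) winner = option;
  }
  return winner;
}
```

In a real deployment, votes would stream in from attendees over a websocket and the artist's console would display the running tally; this sketch only shows the counting logic.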
Our design process started with the Brainstorming Phase, where we met the team, formed our ideas, acquired stakeholders, and surveyed trends among current live-stream platforms.
We then moved on to the Research Phase, where we prioritized features and decided what we could accomplish with our time and abilities.
In the Prototype Phase, we started creating our prototypes. I was in charge of building the mid-fidelity prototype in Figma and visualizing how our design assets would fit into it. Afterward, I handed the Figma prototype off to our software developer to create the high-fidelity prototype in GameMaker Studio.
The last part of our project was the Test Phase, where we conducted usability tests on the first version of our prototype. We took the feedback, addressed the issues as best we could, and ended with the finalized version shown in the Prototype section below.
Our mind map shows the different needs users might have in an interactive online concert. This helped set the baseline for what we prioritized as we approached the project.
We wanted to focus on a client-branded, in-browser audience experience, custom reactions, and interactive components. Accounting for the capstone deadline also let us plan what kind of software we might use to build the prototype later down the road.
This mood board inspired the visual design of this version of the DIVE platform, which we customized to Chong the Nomad's brand. Our team met with the artist and her producer multiple times to collaborate on her visuals for the platform.
The mood board is the result of our team's work. I drew the color palette from the colors of Chong the Nomad's album art and chose the font style that best fit our model artist.
Of the 24 live-stream platforms we researched at the time (2021), we found that the features prioritized most were those related to interpersonal connection among the audience, such as chat and participant representation. We chose the features we knew we could complete within the time limit and developed them up to the capstone deadline.
The one feature we focused on despite its low priority elsewhere was audience representation; few of the current apps actually use it. We kept it as a factor because our research showed that interactive platforms can improve the interpersonal connection between the audience and the artist.
Our personas represented the people with the highest stakes in an interactive live-stream concert -- mainly the music industry and its audiences. These personas allowed us to explore the different kinds of needs such users might have and to further refine the design of the high-fidelity prototype.
This user flow represents the information architecture designed to give the user a seamless online experience. The idea is that most effects and controls sit on the artist's side, but in our prototype we created three selectable themes to demonstrate the "stage" changing its appearance.
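The artist-controlled theme switching described above could look something like the sketch below. All type names, theme names, and color values are hypothetical, not taken from the DIVE prototype:

```typescript
// Hypothetical stage theme: a backdrop color plus an accent color
// used for reactions and chat highlights.
type Theme = {
  name: string;
  background: string; // CSS color for the stage backdrop
  accent: string;     // CSS color for reactions and highlights
};

// Three illustrative themes, standing in for the prototype's three stages.
const THEMES: Theme[] = [
  { name: "neon", background: "#0b0b2a", accent: "#ff4fd8" },
  { name: "sunset", background: "#2a0b0b", accent: "#ffb84f" },
  { name: "forest", background: "#0b2a14", accent: "#7cff4f" },
];

// The artist's console broadcasts a theme name; each client resolves
// and applies it. Here we just return the resolved theme.
function applyTheme(name: string, themes: Theme[] = THEMES): Theme {
  const theme = themes.find((t) => t.name === name);
  if (!theme) throw new Error(`Unknown theme: ${name}`);
  return theme;
}
```

In a browser client, `applyTheme` would go on to update the stage canvas or DOM; keeping the theme data declarative like this is what lets the artist restyle the "stage" live without a client update.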
Usability Findings & Recommendations
After completing the first version of our final prototype, we ran several usability tests over Zoom to gather feedback on the deliverable. Using screen sharing, we observed what users did as they interacted with the components of our platform.
Each finding below is paired with the recommendation that addresses it:
Finding: Participants noticed the artist's name was missing from the prototype.
Recommendation: Add the name of the concert and artist to the platform's home page.
Finding: Participants found the prototype's volume very loud and had no way to control it.
Recommendation: Add a single audio control shared across all rooms.
Finding: Participants wanted to know whether other users would have the same reactions and wondered if it would be hard to tell their reactions apart from others'.
Recommendation: During the concert, users will be able to change their reactions and color.
Finding: A participant expressed that they would like to react to individual messages.
Recommendation: Let users click on a message to react to it.
Finding: Participants wanted to see messages sent earlier (they missed them or the chat scrolled by too quickly).
Recommendation: Add the ability to scroll through past chat messages (messages currently disappear).
Finding: Participants did not know whether some buttons were clickable.
Recommendation: Add hover feedback to show which buttons are clickable.
Finding: Users were still confused by the instructions on how to use the platform.
Recommendation: Implement better instructional guidelines.
Finding: Participants could not tell their own messages apart from messages sent by other people in the chat.
Recommendation: Each participant chooses a color to represent them or their mood, and a participant's own messages could be bolded in the chat for that user.
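The chat-styling recommendations above (per-participant colors, and bolding a viewer's own messages) could be sketched as below. The types and names are illustrative assumptions; the actual prototype was built in GameMaker, not in TypeScript:

```typescript
// Hypothetical chat message and its rendered styling.
type ChatMessage = { senderId: string; text: string };
type RenderedMessage = { text: string; color: string; bold: boolean };

// Style a message for a given viewer: use the sender's chosen color,
// and bold the message only if the viewer sent it themselves.
function renderMessage(
  msg: ChatMessage,
  viewerId: string,
  colors: Map<string, string>, // participant id -> chosen color
): RenderedMessage {
  return {
    text: msg.text,
    color: colors.get(msg.senderId) ?? "#888888", // fallback gray
    bold: msg.senderId === viewerId,
  };
}
```

Because the bolding decision depends on the viewer, each client styles the same message stream differently, which is what lets every attendee pick out their own messages at a glance.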
This is the final video prototype I created for DIVE. You can try the most current prototypes of the DIVE application, in either Figma or GameMaker, below.