music + motion

  • in development

  • Music + Motion is a web application that allows users to see how people move and react to music. I am curious about how we all move differently to the same things: what makes me tap my feet may make you move your hands instead. I want to show that we're more similar than different through music.

    From its initial conception to the stage it's at now, the project has changed a lot. What began as a way to simply be expressive through music has turned into an application that will analyze the user's face for reactions to the music they're listening to. When the user does something that can be defined as showing emotion, an outline of their face will appear. This can be used for many purposes, including studying how people react to certain content. Also, when you consider various data points such as where users are from, their gender, their age, and so on, it becomes possible to see patterns or similarities between users based on the songs as well.
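
    As a concrete illustration of how this could work in the browser, here is a minimal sketch using the open-source face-api.js library; that library choice, the '/models' path, and the 0.7 "showing emotion" threshold are all assumptions for the sketch, not documentation of the project's actual stack.

    ```ts
    import * as faceapi from 'face-api.js';

    const EMOTION_THRESHOLD = 0.7; // placeholder cutoff; would need tuning

    async function watchForEmotion(video: HTMLVideoElement, canvas: HTMLCanvasElement) {
      // Load the detector, landmark, and expression models ('/models' is a placeholder path).
      await Promise.all([
        faceapi.nets.tinyFaceDetector.loadFromUri('/models'),
        faceapi.nets.faceLandmark68Net.loadFromUri('/models'),
        faceapi.nets.faceExpressionNet.loadFromUri('/models'),
      ]);

      const displaySize = { width: video.videoWidth, height: video.videoHeight };
      faceapi.matchDimensions(canvas, displaySize);

      setInterval(async () => {
        const result = await faceapi
          .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
          .withFaceLandmarks()
          .withFaceExpressions();
        if (!result) return;

        // Treat "showing emotion" as any non-neutral expression scoring above the cutoff.
        const { neutral, ...others } = result.expressions;
        const emoting = Object.values(others).some((p) => p > EMOTION_THRESHOLD);

        const ctx = canvas.getContext('2d')!;
        ctx.clearRect(0, 0, canvas.width, canvas.height);
        if (emoting) {
          // Draw the facial outline only while the user visibly reacts.
          const resized = faceapi.resizeResults(result, displaySize);
          faceapi.draw.drawFaceLandmarks(canvas, resized);
        }
      }, 200);
    }
    ```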

    Initial Idea

    When this project first began I really wanted it to be focused on gestures and how the user can use them to interact with their computer. I decided to use music as the base for this project because I was also interested in human behavior around music. This interest came from simply observing people on public transport: how much they move to their music, and which parts of their body they move.

    Precedents

    When it came to my observations, there was also something very special that I remembered: music can make people move across all genres. From this I started to look at how important music is throughout cultures, and at other ways people use it in their daily lives.

    I also looked into other examples of music and how multiple people connect over it, and I remembered a game I used to love when I was younger called Rock Band. Rock Band allowed multiple players to play together in a band, with each person taking on a different role. You had someone on vocals, someone playing the drums, and someone else on guitar. The goal was to work together as a team and perform as well as you could.

    Whilst scrolling through my Twitter timeline I noticed something that I hadn't seen for a while: someone with an image over their avatar to show support for something. I really liked that the overlay was transparent enough that you could still see the user's face beneath it, not replacing them but adding to them. Each unique user sits beneath the same overlay, much like each person being their own self while listening to the same music as someone else.

    First Draft

    I wanted to use something like an Arduino to detect physical motion and use that to trigger an event on screen, but given what I wanted to do, it would have made more sense to use something like the Leap Motion. The movement of the user's hands over the sensor would then change the elements on the screen: their size, color, design, and so on.

    Due to this, I scrapped the initial idea and decided to use the webcam to capture motion instead. The initial drafts of the music selection and motion pages were mocked up.
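
    For reference, a rough sketch of how the webcam can stand in for a motion sensor: draw frames onto a small canvas and count how many pixels changed between frames. The downscaled resolution, the per-pixel delta of 30, and the 200 ms sampling interval are all arbitrary starting values.

    ```ts
    // A rough sketch of webcam motion detection via frame differencing.
    async function watchForMotion(onMotion: (fractionChanged: number) => void) {
      const video = document.createElement('video');
      video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
      await video.play();

      // Downscale heavily: coarse pixels are plenty for "did the person move?".
      const canvas = document.createElement('canvas');
      canvas.width = 160;
      canvas.height = 120;
      const ctx = canvas.getContext('2d', { willReadFrequently: true })!;

      let previous: Uint8ClampedArray | null = null;
      setInterval(() => {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
        const current = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
        if (previous) {
          let changed = 0;
          for (let i = 0; i < current.length; i += 4) {
            // Compare red channels only, as a cheap brightness proxy.
            if (Math.abs(current[i] - previous[i]) > 30) changed++;
          }
          onMotion(changed / (canvas.width * canvas.height));
        }
        previous = current.slice();
      }, 200);
    }
    ```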

    Music Choice
    Visualizer

    Second Draft

    After switching the focus from the Arduino to the web camera, I also decided to change the gestures/interactions. Instead of hand gestures, I began experimenting with using the fingers as brushes and selectors for moving elements around.

    The purpose of this approach was to let users feel like they are painting on a canvas of some sort. When multiple users use the app, their work will be overlaid onto the same canvas, allowing them to view the similarities or differences between their reactions to the songs.
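
    A sketch of how fingers-as-brushes could be wired up, assuming Google's MediaPipe Hands for fingertip tracking (an assumption, not necessarily what the prototype used). Landmark 8 is the index fingertip; strokes are drawn translucent so that overlaid canvases from multiple users blend rather than cover each other.

    ```ts
    import { Hands, Results } from '@mediapipe/hands';

    const INDEX_FINGER_TIP = 8; // MediaPipe's landmark index for the index fingertip

    function fingerBrush(video: HTMLVideoElement, canvas: HTMLCanvasElement) {
      const ctx = canvas.getContext('2d')!;
      const hands = new Hands({
        locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
      });
      hands.setOptions({ maxNumHands: 1, minDetectionConfidence: 0.7 });

      hands.onResults((results: Results) => {
        const landmarks = results.multiHandLandmarks?.[0];
        if (!landmarks) return;
        // Landmark coordinates are normalized [0, 1]; scale to canvas space.
        const tip = landmarks[INDEX_FINGER_TIP];
        ctx.fillStyle = 'rgba(255, 80, 80, 0.6)'; // translucent so overlaid work blends
        ctx.beginPath();
        ctx.arc(tip.x * canvas.width, tip.y * canvas.height, 4, 0, Math.PI * 2);
        ctx.fill();
      });

      // Feed webcam frames to the tracker on every animation frame.
      const loop = async () => {
        await hands.send({ image: video });
        requestAnimationFrame(loop);
      };
      requestAnimationFrame(loop);
    }
    ```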

    Visualizer

    Final Prototype

    I modified the design to be simpler, and since music is copyrighted I removed the option of searching for a song and instead found videos for each genre to serve as placeholders for the demonstration. On the opening page the user is presented with a choice of Hands or Face, which determines how they interact with the application.

    Home
    Choices

    For the final version of the prototyping stage, I ran into errors when trying to use the fingers as brushes and selectors. So, in the meantime, I resorted to using the mouse and the keyboard to edit the strokes and colors created by the user. But I still really wanted to use the webcam, so I came up with the idea of overlaying the outline of the user's face over the music video.
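
    The stopgap itself is simple; roughly, it might look like the sketch below, where dragging the mouse paints and the number keys swap colors. The palette and key bindings are illustrative, not the prototype's actual controls.

    ```ts
    // Drag the mouse to paint; press 1-3 to switch colors.
    const COLORS = ['#ff5050', '#50c878', '#5096ff']; // placeholder palette

    function mouseBrush(canvas: HTMLCanvasElement) {
      const ctx = canvas.getContext('2d')!;
      let color = COLORS[0];
      let drawing = false;

      canvas.addEventListener('mousedown', () => (drawing = true));
      window.addEventListener('mouseup', () => (drawing = false));

      canvas.addEventListener('mousemove', (e) => {
        if (!drawing) return;
        const rect = canvas.getBoundingClientRect();
        ctx.fillStyle = color;
        ctx.beginPath();
        ctx.arc(e.clientX - rect.left, e.clientY - rect.top, 4, 0, Math.PI * 2);
        ctx.fill();
      });

      window.addEventListener('keydown', (e) => {
        const i = parseInt(e.key, 10) - 1; // keys "1".."3" map to palette slots
        if (COLORS[i]) color = COLORS[i];
      });
    }
    ```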

    The idea behind this was to let users see how another person is reacting to the music live. They may find that someone isn't as enthusiastic at certain parts, or that they have more in common than they thought. Whilst demoing this at The New School, a lot of the feedback was about future iterations, and about how this application could be used in the music industry to gather feedback on new songs or videos and what could be done with that information.

    Website

    Since one of the main purposes of this project was to be able to see the similarities and differences between people and their interactions with music, I felt the need to archive the results on a website. Originally the website was only going to show the work created by using the fingers as brushes, not the facial outline feature. But now the website will serve as a sort of global database for all of the reactions submitted by users. People will be able to search the site by genre, song, age, location, etc. to get different results.
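
    To make the database idea concrete, one possible shape for an archived reaction and a search over it is sketched below; every field name here is an assumption, since the real schema isn't specified.

    ```ts
    // One possible shape for an archived reaction; all field names are assumptions.
    interface ReactionRecord {
      song: string;
      genre: string;
      age: number;
      gender: string;
      location: string;
      strokes: string;      // e.g. the brush drawing, serialized as a data URL
      faceOutlines: string; // e.g. serialized landmark frames from the face feature
    }

    // Filter the archive by any combination of the searchable fields.
    function searchReactions(
      records: ReactionRecord[],
      query: Partial<Pick<ReactionRecord, 'song' | 'genre' | 'age' | 'location'>>
    ): ReactionRecord[] {
      return records.filter((r) =>
        (Object.keys(query) as (keyof typeof query)[]).every((k) => r[k] === query[k])
      );
    }
    ```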

    Future Iteration

    The next iteration of the application will give the user direct feedback whilst they use the app, along with a report afterward of their types of movements and how many other people had similar reactions.
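
    Counting "similar reactions" could be as simple as the sketch below: tally the other sessions that logged the same dominant movement for the same song. The movement labels and session fields are hypothetical.

    ```ts
    // Hypothetical movement labels for the post-session report.
    type MovementType = 'head' | 'hands' | 'feet' | 'full-body';

    interface SessionSummary {
      userId: string;
      song: string;
      dominantMovement: MovementType;
    }

    // How many other users logged the same dominant movement for the same song?
    function similarReactionCount(me: SessionSummary, all: SessionSummary[]): number {
      return all.filter(
        (s) =>
          s.userId !== me.userId &&
          s.song === me.song &&
          s.dominantMovement === me.dominantMovement
      ).length;
    }
    ```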