
Processing Music and Information

cara

(in collaboration with rubikscube)
 

For our final GIST project, we wanted to explore the process of decoding and encoding information and the entanglements that can occur along the way. Throughout this course, we have both been interested in the concept of interpreting information, especially in relation to music. Using the Processing programming language (a Java-based language and set of libraries for creating graphics), we created two distinct applets that interact with each other through music and visualization.
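For readers unfamiliar with Processing, the sketch below is a minimal illustration (not our actual project code) of the kind of thing the visualizer does: it opens the computer's microphone through the Minim audio library that ships with Processing and draws a circle whose size follows the loudness of the input.

// Illustrative sketch only, not the project's code: draw a circle
// whose size follows the loudness of the microphone input.
import ddf.minim.*;

Minim minim;
AudioInput in;

void setup() {
  size(400, 400);
  minim = new Minim(this);
  // open the default microphone with a 512-sample buffer
  in = minim.getLineIn(Minim.MONO, 512);
}

void draw() {
  background(0);
  // level() returns the current loudness of the input, roughly 0.0 to 1.0
  float level = in.mix.level();
  ellipse(width / 2, height / 2, level * width, level * width);
}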

 

When choral conductor Tian Hui Ng visited our class back in March, we experienced a very interactive presentation. The class took part in the piece 4'33" (though unknowingly), and we "sang" a portion of the rain song Miniwanka. Our project is partly inspired by that presentation, as we wanted it to be an interactive musical experience. In addition to participating in these pieces during class, we discussed the meaning and interpretations of the music. With our project, we attempted to create a music visualizer so that the computer can interpret the music itself.

 

While creating these applications, we noticed the theme of entanglement: not just between the two applets, but also between the people who play with our project and us as its creators. The entanglement between the applets is clear, because the music generated by one is what the other interprets visually. The individuals who use the applets are interacting both with the music and with the visualizer: whether they are using computer-generated music, the included recording of a chamber sextet, or their own singing and clapping, they are creating the music that changes what the visualizer does. And as the creators, we could see the entanglement between the two applets we were writing, because the design of one affected the design of the other.

 

On the web page for our project, the applet on the left is the music visualizer, and the applet on the right is the music generator. You do not need the computerized music for the visualizer to work, so feel free to talk, sing, or make other sounds to see what the visualizer can do. As the page loads, a pop-up will ask you to allow the applets access to your computer; you must accept it to view and use our project. Each applet appears in its own portion of the page. To use one, click inside its box, then press different keys on the keyboard to see what happens. Only one applet can respond to keyboard input at a time, so be sure to click in one box or the other as you play with the project to find out everything you can do. For example, if you want the visualizer to respond to keyboard input, click inside the visualizer's box.
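As a rough sketch of why clicking matters, the example below (again illustrative, not our generator's code) shows how a Processing applet reacts to key presses: the keyPressed() function runs only in the applet that currently has keyboard focus, which it gets when you click inside its box.

// Illustrative only: respond to key presses once the applet has focus.
color bg = 0;

void setup() {
  size(400, 400);
}

void draw() {
  background(bg);
}

// keyPressed() fires only in the applet with keyboard focus,
// which is why you need to click inside a box before typing.
void keyPressed() {
  if (key == 'r') {
    bg = color(255, 0, 0);   // 'r' turns the background red
  } else if (key == 'b') {
    bg = color(0, 0, 255);   // 'b' turns it blue
  }
}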

 

Link to our project

 

(Note: While the applets should work fine in most browsers, we have found that they work best in Safari and Chrome. If you are using Firefox, it may take a little while before the applets start responding to keyboard input. Also, for the visualizer to interpret sounds, your computer will need a built-in microphone.)