PROJECT BRIEF: Students are to submit a Processing prototype that uses sound and interaction as its main focus. The canvas area should be 800 pixels by 800 pixels. Images and videos can be added as well.
For this project, I chose to create an audio visualizer that reacts to the beats of the audio file linked in the .pde sketch.
The inspiration for this project came from the way that music has become such an integral part of our lives. Anywhere you go, there’s always more than one person either nodding along to a song or mouthing the lyrics with headphones in. We’ve become so accustomed to it, and with the constant production of music videos, pairing a visual with audio is now very common. I wanted to explore that concept of combining audio with visuals, which is why I made an audio visualizer.
The following images are examples of design concepts that I originally researched to gain an understanding of what design I wanted to create.
I really liked the idea of making an ellipse the centre of the visualizer and having it change in size, or having something emanate from it, as it reacts to the beat.
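A minimal sketch of that ellipse-based concept could look like the following. This is only an illustration, assuming the Minim library (bundled with Processing) and a hypothetical audio file name; it is not the final project code.

```processing
import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioPlayer player;
BeatDetect beat;
float radius = 100;  // resting radius of the central ellipse

void setup() {
  size(800, 800);  // canvas size required by the brief
  minim = new Minim(this);
  player = minim.loadFile("song.mp3");  // hypothetical file name
  player.loop();
  beat = new BeatDetect();  // default energy-based beat detection
}

void draw() {
  background(0);
  beat.detect(player.mix);
  if (beat.isOnset()) {
    radius = 250;  // jump in size when a beat is detected
  }
  radius = lerp(radius, 100, 0.1);  // ease back toward the resting size
  fill(180, 120, 255);              // violet, echoing the final colour scheme
  ellipse(width / 2, height / 2, radius * 2, radius * 2);
}
```

The `lerp()` call is what gives the pulse its smooth decay: the ellipse expands instantly on a beat, then shrinks back a little each frame.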
DESIGN AND COLOUR SCHEMES
I had multiple original design plans that I had to change along the way, either because the code wasn’t working properly or because I couldn’t find code that would perform the action I wanted. With this visualizer, I wanted to add an interactive element that would let the viewer change songs or change the colour of the visualizer by clicking on the screen. Unfortunately, no matter how many code examples I researched, I was unable to find a combination that would do that. I did manage to create a function that paused and played the audio when the mouse was pressed and released, but it conflicted with the audio’s ability to loop.
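The pause/play idea described above could be sketched as follows. This is a hedged reconstruction, assuming a Minim `AudioPlayer` named `player`; one possible way around the looping conflict is to resume with `loop()` instead of `play()`, since Minim’s `play()` only continues playback without re-arming the loop.

```processing
// Toggle playback on mouse press.
// Assumes a global Minim AudioPlayer called "player" (as in the setup above).
void mousePressed() {
  if (player.isPlaying()) {
    player.pause();
  } else {
    // Resuming with loop() rather than play() keeps the track looping
    // after it reaches the end, which avoids the pause/loop conflict.
    player.loop();
  }
}
```

Whether this matches the exact behaviour I had in my original attempt I can’t say for certain, but it illustrates the interaction I was aiming for.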
Selecting a colour scheme was also a challenge, because one of my original design concepts was to base the colours on the audio and change them via a mouse click. I went through a variety of songs and adjusted the colours to reflect my personal opinion of what suited each one. In the end, I chose a combination of violets and orange/white.
Due to limitations in what I was able to do with the Processing code, audio changes must be made manually through the code itself.
There are three audio tracks to choose from, and only one should be active at a time. The active line of code must have no forward slashes in front of it, which makes it appear in colour in the Processing editor; a blocked-out line is prefixed with two forward slashes and appears greyed out. The sketch must be stopped and replayed each time a change is made to the code.
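In the code, the switch looks something like this (file names are hypothetical stand-ins for the actual files in the sketch’s data folder):

```processing
// Exactly one loadFile() line should be uncommented at a time.
// Uncommented lines appear in colour in the Processing editor;
// lines prefixed with // are greyed out and ignored.
player = minim.loadFile("lucky_one_instrumental.mp3");
// player = minim.loadFile("view_instrumental.mp3");
// player = minim.loadFile("crab_rave.mp3");
```

To change tracks, comment out the current line, uncomment the one you want, then stop and rerun the sketch.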
EXO. “Lucky One (Instrumental).” EX’ACT – The 3rd Album, SM Entertainment, 2016, track 10. YouTube, https://www.youtube.com/watch?v=6d2_T-PnPX0.
SHINee. “View (Instrumental).” Odd – The 4th Album, SM Entertainment, 2015, track 3. YouTube, https://www.youtube.com/watch?v=WTOKwRpvrSI.
Noisestorm. “Crab Rave.” Monstercat Instinct Vol. 1, Monstercat, 2018, track 16. YouTube, https://www.youtube.com/watch?v=RxBQ4lxxJL0.
For me, Processing has always given me the most resistance of any program I’ve coded in. The last time I worked with it, I had to re-create my design prototypes and reconfigure my code several times in order for it to work. I didn’t feel comfortable coming into this project, but I was pleasantly surprised to experience less difficulty this time.
I’m quite happy with the way the visualizer turned out. It took a while to settle on a design that I liked and to select the songs that would illustrate how it works, but it was definitely worth all the long hours spent making it.