Fusing Processing (Java) code with TouchDesigner visual programming, this simulation invites the user onto a street of dynamic house-parties. Then, at the user's discretion, they are transported into a house-party to enjoy its unique, animated audiovisuals.
The purpose of this project is best described with a scenario:
Let's say it's a Friday night. You (the user) and your friends want to go out. That's when you run the code, and you are brought to a street with four house-parties. Although it's dark and your night vision is limited, you are captivated by the flashing lights emanating from the windows of the house-parties.
Just from the outside, you recognize that each party embodies a different “vibe.” They are differentiated by the appearance, strength, and rhythm of their lights. Some have multiple vibrant colors, and others are in black and white. From outside, you can already hear the bass from the music of the night.
To choose which party to visit first, you hover (your mouse) over one of the four doors. The door opens, welcoming you in. When you finally decide on a particular party, you click (your mouse) on the entrance to "enter."
You are then immediately invited onto the "dancefloor." The energetic audio and the reactive, vibrant colors together generate a multi-sensory experience and cultivate a distinct, alluring atmosphere.
After dancing and enjoying the atmosphere, it's time, of course, to check out another party. You click (your mouse) a second time to exit the house. Now you are back outside on the street, and you can visit any of the other house-parties, enjoying the best of the night.
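The hover-to-open, click-to-enter, click-again-to-exit flow described above amounts to a small two-state machine. Here is a minimal, self-contained Java sketch of that logic; the class name, door layout, and coordinate values are all illustrative assumptions, not the project's actual code.

```java
// Hypothetical model of the street/party interaction described above.
// Door positions (4 doors, 100 px wide, starting at x = 50, spaced
// 150 px apart, below y = 200) are made-up illustrative values.
public class PartyStreet {
    enum Scene { STREET, PARTY }

    Scene scene = Scene.STREET;
    int hoveredDoor = -1;   // -1 means no door is under the mouse
    int currentParty = -1;  // -1 means we are out on the street

    // Returns the index of the door under the mouse, or -1.
    int doorAt(int mouseX, int mouseY) {
        for (int i = 0; i < 4; i++) {
            int left = 50 + i * 150;
            if (mouseX >= left && mouseX < left + 100 && mouseY > 200) {
                return i;
            }
        }
        return -1;
    }

    // Hovering over a door "opens" it (called each frame on the street).
    void onMouseMove(int mouseX, int mouseY) {
        if (scene == Scene.STREET) {
            hoveredDoor = doorAt(mouseX, mouseY);
        }
    }

    // First click enters the hovered party; the next click exits it.
    void onMouseClick() {
        if (scene == Scene.STREET && hoveredDoor != -1) {
            currentParty = hoveredDoor;
            scene = Scene.PARTY;
        } else if (scene == Scene.PARTY) {
            currentParty = -1;
            scene = Scene.STREET;
        }
    }
}
```

In a real Processing sketch, `onMouseMove` and `onMouseClick` would be driven by Processing's `mouseMoved()` and `mousePressed()` callbacks with the built-in `mouseX`/`mouseY` variables.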

Process:
Creating this project required a combination of guidance and experimentation. To create the four audiovisuals in TouchDesigner, I followed four separate YouTube tutorials. From there, I was able to create distinct, unique visuals and customize the aesthetics and color schemes to my preferences. The differences in 2D and 3D components (noise, displace, geometry, etc.) all synthesize into a unique visual for each party.
Each visual was then paired with an individual song, based solely on how I believed they would best represent one another. Each audiovisual also translates its audio differently: some visuals react to changes in the beat or rhythm, while others move with changes in the bass and snare.
Finally, each audiovisual is displayed (in MP4 format) through Processing, which lets the user transition smoothly between parties and explore each audiovisual in an enjoyable way.
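The dispatch from a chosen door to its pre-rendered TouchDesigner export can be sketched as a simple lookup. The class and file names below are hypothetical placeholders, not the project's actual assets; in a real Processing sketch, the returned path would be handed to the video library's `Movie` class and drawn each frame with `image()`.

```java
// Hypothetical mapping from a party's door index to its exported
// MP4 clip. Filenames are placeholders, not the real asset names.
public class PartyVideos {
    static final String[] CLIPS = {
        "party0.mp4", "party1.mp4", "party2.mp4", "party3.mp4"
    };

    // Returns the clip for the chosen door, rejecting invalid indices.
    static String clipFor(int door) {
        if (door < 0 || door >= CLIPS.length) {
            throw new IllegalArgumentException("no such party: " + door);
        }
        return CLIPS[door];
    }
}
```

Keeping this mapping in one place makes it easy to swap a party's video (or add a fifth house) without touching the interaction code.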