Real-Time Generative Visuals | On-Stage VJ
Real-time generative visuals and on-stage performance.

"Every Day" is a music piece performed by Berklee Valencia students at Las Naves - Espacio Cultural in April 2019. The performance was part of "Discover-i", a concert based on collaboration between students from the four Berklee Valencia graduate programs. "Every Day" explores the concept of anxiety and was written by artist Alex Rapp. It was performed with drums, electronics, and vocals, along with real-time visuals and dance.

The generative visuals were created with Max/MSP/Jitter and were controlled in a hybrid way: by a BITalino biosensor placed on the dancer, by audio, and by a MIDI controller. The core of the visuals was a 3D model of a heart that was continuously displaced based on the biosensor data and/or audio transients.
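In the actual piece this mapping lived inside the Jitter patch, but the idea can be sketched in a hypothetical Python analogue: each vertex of the heart model is pushed out along its normal by an amount driven by a smoothed control signal (standing in for the biosensor reading or an audio-transient envelope). All names here are illustrative, not taken from the original patch.

```python
def smooth(prev, target, factor=0.9):
    """One-pole low-pass filter, as a patch might use to tame raw sensor jitter."""
    return prev * factor + target * (1.0 - factor)

def displace(vertices, normals, amount):
    """Push each vertex outward along its normal, scaled by the control signal."""
    return [
        (x + nx * amount, y + ny * amount, z + nz * amount)
        for (x, y, z), (nx, ny, nz) in zip(vertices, normals)
    ]

# Hypothetical per-frame loop: `sensor_value` stands in for the normalized
# biosensor reading (or audio-transient level) arriving on each video frame.
vertices = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
normals  = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
level = 0.0
for sensor_value in (0.2, 0.8, 0.1):
    level = smooth(level, sensor_value)
    frame = displace(vertices, normals, level)
```

The smoothing step matters in practice: raw biosensor data is noisy, and filtering it keeps the geometry pulsing rather than flickering.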

Throughout the performance, the appearance of the 3D model alternated between points in 3D space with variable size and a wireframe style. The color of the points also changed - mostly between red and white - following the aesthetics of the track.
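A color transition like the red-to-white one described above typically comes down to a linear interpolation between two RGB values; a minimal sketch (the exact method in the original patch is not documented here):

```python
def lerp_color(a, b, t):
    """Linearly interpolate between two RGB colors; t runs from 0 (a) to 1 (b)."""
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))

RED, WHITE = (1.0, 0.0, 0.0), (1.0, 1.0, 1.0)
lerp_color(RED, WHITE, 0.5)  # -> (1.0, 0.5, 0.5)
```

Driving `t` from an envelope or a MIDI fader gives a continuous fade that can track the energy of the music.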

The idea and the first version of the Max/MSP patch were created by Alex Rapp (project lead). My contribution to the project was to develop the Max patch further, optimize it, and perform with it live on stage.

I programmed the Max patch so that I could quickly switch the input of the displacement between the biosensor, audio, and MIDI data. That way I had full control over the output in every part of the piece, which had rhythmic, sustained, and experimental, tension-building sections.
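That kind of source switching is essentially a selector in front of the displacement stage. A hypothetical Python sketch of the routing (the source names and state fields are assumptions for illustration, not the patch's actual objects):

```python
# One reader per control source; the performer flips the active one live,
# e.g. from a button on the MIDI controller.
def from_biosensor(state): return state["bitalino"]       # heartbeat amplitude
def from_audio(state):     return state["transients"]     # onset envelope
def from_midi(state):      return state["midi_cc"] / 127  # fader position, 0-127

SOURCES = {"bio": from_biosensor, "audio": from_audio, "midi": from_midi}

def displacement_input(source, state):
    """Return the displacement amount for the current frame from the chosen source."""
    return SOURCES[source](state)

state = {"bitalino": 0.6, "transients": 0.3, "midi_cc": 64}
displacement_input("bio", state)  # -> 0.6
```

Keeping every source normalized to the same range is what makes the switch seamless mid-performance: the geometry reacts continuously no matter which input is driving it.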


