EMITTING: IMMERSIVE INSTALLATION IN CU'S BLACK BOX



Context
Black Box Studio Artist Residency
Role
Developer - Interactivity and Visual Design
Engineer - Lighting and Sound

Tools
TouchDesigner, Max/MSP, Kinect, 44.4 Ambisonic System

Housed in the University of Colorado’s Black Box Theatre, this immersive installation used audio data, projection mapping, physical sensors, and sculptures to interactively express and explore the feelings of isolation and loneliness left by the COVID-19 pandemic.



Ideation


The goal of this undertaking was to create an immersive, mixed-reality art installation that speaks to the struggles of loneliness, specifically those of the pandemic, in a creative and beautiful way. The installation was intended to offer a surreal, engaging experience that invites viewers to reflect on the nature of isolation and connection.


Research

The project was built on existing research to ensure a thoughtful approach. Studies on 3D projection mapping and on the use of art to explore emotional themes informed the design choices.
Effect of 3D Projection Mapping Art: Digital Surrealism
Research by Jung, Lee, and Biocca (2014) suggests that 3D projection mapping can create a stronger sense of presence than 2D projection. Their study found that spatialized projection mapping elicited greater spatial presence and engagement from viewers. This technology allows for the transformation of spaces, potentially blurring the lines between physical and digital elements. By projecting onto real objects, abstract concepts like loneliness are given a more tangible form.

Images of Loneliness: Using Art as an Educational Method in Professional Training
A study by Blomqvist, Pitkälä, and Routasalo (2004) showed that art can be an effective medium for exploring complex emotions like loneliness. Their research demonstrated that discussions around artworks depicting loneliness helped healthcare professionals develop empathy and deepen their understanding of these experiences. Our installation presented various perspectives on loneliness through different artistic styles.
Effects of music and music therapy on mood in neurological patients
The installation featured our own designed soundscapes to enhance the immersive experience and potentially offer therapeutic benefits. A study by Raglio et al. (2015) supports the positive effects of music and sound interventions on mood, anxiety, and relaxation in various contexts, including clinical settings. While our installation is not a clinical environment, we applied these principles to create a sonic atmosphere that complements the visual elements, enhances emotional impact, and provides a calming, reflective experience for visitors exploring themes of loneliness and connection.



Process


Sculpture Building





























Interactivity 

























The goal of the sculptures was to represent the audience (and the creators), so the sculptures took an abstracted human form: two faces, at a large scale, on opposite sides of the room. By using the human form, we insert the audience into the project; we can then subvert that form to evoke emotion in the audience.


The goal of our interactive sculpture projection was to manifest aloneness and togetherness in a visual and tangible way. Using two Kinect sensors, we built a system that recognizes how far apart two people are standing from each other near the sculptures and dynamically represents that distance in the visuals projected onto the faces.


Sound

The soundtrack for the installation was a looping 25-minute soundscape created by audio engineering collaborator Anand Zupa.

The Black Box uses a 44.4 ambisonic sound system, which allows sounds to be moved through 3D space. A Max/MSP patch communicated over the Dante network to access the sound system and also connected it to our Kinect sensors, giving the audience the experience of sounds harmonizing as they get closer to each other.
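As a rough illustration (not the actual installation code), the distance-to-parameter mapping can be sketched as follows. The two Kinect skeleton positions are reduced to a single 0..1 value that both the projection and the Max/MSP patch can consume; the 5 m range and the function names here are illustrative assumptions.

```python
import math

def normalized_distance(p1, p2, max_dist=5.0):
    """Euclidean distance between two tracked skeleton positions (meters),
    clamped and normalized to the 0..1 range. max_dist is an assumed
    maximum tracking range, not a measured value."""
    return min(math.dist(p1, p2) / max_dist, 1.0)

def harmony_amount(p1, p2, max_dist=5.0):
    """Closer together -> higher value; 1.0 means fully together, which
    the sound layer would map to fully harmonized sounds."""
    return 1.0 - normalized_distance(p1, p2, max_dist)
```

In practice this value would be recomputed every frame and sent to both the projection (e.g. as a blend parameter) and the Max/MSP patch over the network.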

















Audio Reactive Visuals and Touchdesigner Workflow


Audio Processing
Audio from our soundtrack is processed in TouchDesigner to extract frequency and amplitude data, which feeds various generative properties of the background visuals.
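Conceptually, the analysis amounts to extracting an overall amplitude plus per-band energies from each audio buffer. The pure-Python sketch below is only a stand-in for what TouchDesigner's audio CHOPs compute for us; the sample rate, band edges, and test tone are arbitrary.

```python
import math

def rms_amplitude(samples):
    """Root-mean-square amplitude of one audio buffer -- a simple
    'loudness' value for driving a visual parameter."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def band_energy(samples, rate, lo_hz, hi_hz):
    """Naive DFT energy of the frequency band [lo_hz, hi_hz) in Hz.
    Quadratic-time and for illustration only; a real pipeline would
    use an FFT."""
    n = len(samples)
    energy = 0.0
    for k in range(n // 2):
        freq = k * rate / n
        if lo_hz <= freq < hi_hz:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            energy += (re * re + im * im) / n
    return energy

# A one-second 50 Hz test tone: nearly all of its energy sits near 50 Hz.
rate = 1000
tone = [math.sin(2 * math.pi * 50 * t / rate) for t in range(rate)]
```

Each band's energy can then be routed to a different generative property, so bass drives one element of the visuals while higher frequencies drive another.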



The data is sent to different “scenes”, or mini visual projects, nested inside this single TouchDesigner file.
Here are all the scenes unpacked.


The outputs of these scenes are sent to a switch that lets me cut from scene to scene, creating a curated “video”.
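The switching logic can be sketched as a cue list: each scene has a start time within the looping 25-minute soundtrack, and the switch shows whichever scene's cue has most recently passed. The scene names and cue times below are hypothetical, purely for illustration.

```python
# (start_seconds, scene_name) pairs, sorted by start time.
# Names and times are made up for this sketch.
CUES = [
    (0, "drift"),
    (300, "faces"),
    (900, "static"),
    (1200, "reunion"),
]

LOOP_SECONDS = 25 * 60  # the soundtrack loops every 25 minutes

def active_scene_index(t):
    """Index of the scene the switch should output at absolute time t
    (seconds), wrapping around with the looping soundtrack."""
    t = t % LOOP_SECONDS
    index = 0
    for i, (start, _name) in enumerate(CUES):
        if t >= start:
            index = i
    return index
```

Driving the switch's index from a function like this, rather than by hand, keeps the visual sequence locked to the looping soundscape across repeats.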