Relational mixed reality sculptures

In this choreographic intervention, the performers move and dance together with relational mixed reality sculptures carrying two motion-driven digital layers: the first is sound-reactive, and the second is composed of augmented reality animations.

Movement explorations with the first sculptural prototype

The sculptures are in a prototyping phase and work as big toys that expand the range of gestures and body movements available while a human interacts with a cellphone through sonic stimulation. Each performer's cellphone is attached to the sculpture and connected via Bluetooth to a portable speaker mounted on the sculpture or on the performer's body. This system generates a sound-reactive vibratory landscape, allowing the performer to control the performance's sound instead of only reacting to it.

This is made possible through a JavaScript browser-based application that reads the cellphone's orientation data from its IMU sensors (gyroscope and accelerometer events); on iOS devices these events are backed by Apple's Core Motion framework. The data is mapped into a frequency-modulation synthesis system using the p5.sound library.
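The orientation-to-sound mapping can be sketched as a pure function. The specific angle-to-frequency ranges, the `mapRange` helper, and the variable names below are illustrative assumptions, not the project's actual values:

```javascript
// Sketch: map DeviceOrientationEvent angles (degrees) to FM synthesis
// parameters. All ranges here are illustrative assumptions.

// Linearly rescale a value from one range to another (like p5's map()).
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// beta  (front-back tilt, -180..180) -> carrier frequency in Hz
// gamma (left-right tilt,  -90..90)  -> modulation depth
// alpha (compass heading,    0..360) -> modulator frequency in Hz
function orientationToFM({ alpha, beta, gamma }) {
  return {
    carrierFreq: mapRange(beta, -180, 180, 60, 880),
    modDepth: mapRange(gamma, -90, 90, 0, 300),
    modFreq: mapRange(alpha, 0, 360, 1, 40),
  };
}

// In the browser, these parameters could drive p5.sound oscillators:
//   window.addEventListener('deviceorientation', (e) => {
//     const { carrierFreq, modDepth, modFreq } = orientationToFM(e);
//     carrier.freq(carrierFreq);   // carrier: a p5.Oscillator
//     modulator.freq(modFreq);     // modulator: a p5.Oscillator
//     modulator.amp(modDepth);
//   });
// Note: iOS 13+ requires calling DeviceOrientationEvent.requestPermission()
// from a user gesture before orientation events fire in the browser.
```

Keeping the mapping in one pure function makes it easy to retune the sculpture's "voice" without touching the event-handling or audio code.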

The sound landscape is the voice of the critter, the sound of their belly, the sound of the air moving around their body, the sound of the earth when their feet touch the ground. The sound of the fire inside of their belly. The sound of their memory.

First open-air experiments. Brooklyn, Feb 2021
Audience interaction view. Brooklyn, Feb 2021

The sculptures also carry image targets tracked by an augmented reality application, allowing the audience to watch the performance with a virtual layer of 3D motion-driven graphics. When the performers move the sculptures, the animations move with them. The target trackers are QR codes, so anyone who encounters the living sculpture in a park or other public space can scan it and access a web-based AR experience.
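The behavior described above, animations that follow the sculpture as it moves, amounts to copying the tracked target's pose onto the virtual layer each frame. A minimal sketch with plain objects (the `pose` shape and function names are hypothetical; a real implementation would use the AR framework's own pose matrices):

```javascript
// Sketch: keep a virtual animation anchored to a tracked image target.
// `targetPose` stands in for the tracker's per-frame output; its shape
// ({ visible, position, rotation }) is an illustrative assumption.
function anchorToTarget(animationRoot, targetPose) {
  if (!targetPose.visible) {
    // Hide the virtual layer whenever tracking is lost.
    animationRoot.visible = false;
    return animationRoot;
  }
  animationRoot.visible = true;
  animationRoot.position = { ...targetPose.position };
  animationRoot.rotation = { ...targetPose.rotation };
  return animationRoot;
}
```

Called once per render frame, this keeps the 3D graphics rigidly attached to the sculpture, so a performer carrying it effectively puppeteers the virtual layer.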

Lua Girino, Viola He playing with a mixed reality sculpture at Fort Greene Park, April 2020

For the prototypes, I'm using Adobe Aero and Spark AR Studio. Spark AR will allow a very accessible version of the augmented reality experience through Instagram or Facebook, but it has limitations, such as supporting only one target tracker per filter. Future plans include building an application for the work with Unity (Vuforia).

Viola Leqi He and a more recent version of the AR sculpture working with the target tracker. April 2021