Following the success of the previous MCA Interactive for Vivid, Spinifex was engaged to do the same again on the new facade, only this time collaborating with Will.i.am, Intel's Director of Creative Innovation, and MCA artist Justene Williams. By the time we had the go-ahead, my brief consisted almost entirely of:
- Will.i.am wanting to use hand gestures to move coloured shapes around a building to control music.
- Justene Williams supplying a series of pen sketches, done during a night of insomnia, inspired by constellations and Russian Constructivism.
- 3 WEEKS!
What we ended up with was an installation that three people could interact with at a time, via three kiosks in front of the building. Each kiosk corresponded to a different part of the music, such as bass or melody, and each had a corresponding animated section on the facade. Users moved shapes based on Justene's sketches using hand gestures; those shapes were the interface for an audio loop sequencer, and they animated in response to the volume and frequency of the music. The concept of constellations let me transition to a star-scape between songs as we moved a new group of people through. Then, as the next song started up, the constellations would be traced out in a Constructivist aesthetic and the interaction would begin.
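The core loop-sequencer mechanic can be sketched roughly like this. This is a simplified reconstruction, not the production code; the slot count and the idea of toggling a loop when a shape is released over a slot are my own assumptions for illustration:

```cpp
#include <array>

// Hypothetical sketch: each kiosk controls a bank of audio loops.
// Releasing a shape over a slot's zone toggles that loop on or off;
// the active flags drive which loops play on the next bar.
struct LoopBank {
    std::array<bool, 8> active{};  // one flag per loop slot

    // Called when a user releases a shape over slot `index`.
    void toggleSlot(int index) {
        if (index >= 0 && index < static_cast<int>(active.size()))
            active[index] = !active[index];
    }

    int activeCount() const {
        int n = 0;
        for (bool a : active) n += a ? 1 : 0;
        return n;
    }
};
```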
I was determined that users should have responsive control over the audio, as being able to trigger things in time with the music was crucial. Most hand-gesture interfaces rely on hovering to select, which would not suffice, so I hacked together a test that monitored arm extension to get responsive press and release states, which worked a treat. Thankfully Trent Brooks was available to save the day, take it from there, and implement this technique with Jonny Old's interface designs for the kiosk app in OpenFrameworks, leaving me to tear my hair out over the remaining issues.
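The arm-extension trick boils down to a small state machine. This is my sketch of the idea rather than Trent's implementation; the thresholds and the hysteresis gap are assumed values:

```cpp
// Sketch of press/release detection from arm extension.
// `extension` is the hand's distance from the shoulder, normalised 0..1.
// Hysteresis between two thresholds stops sensor jitter from rapidly
// toggling the press state.
class GesturePress {
public:
    enum Event { None, Pressed, Released };

    Event update(float extension) {
        if (!down && extension > pressThreshold) {
            down = true;
            return Pressed;
        }
        if (down && extension < releaseThreshold) {
            down = false;
            return Released;
        }
        return None;
    }

private:
    bool down = false;
    float pressThreshold   = 0.8f;  // arm nearly straight -> press
    float releaseThreshold = 0.6f;  // pull back past this -> release
};
```

Because press and release are distinct, timed events rather than a hover dwell, the user can punch loops in and out on the beat.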
Will.i.am delegated the entire audio component to DJ Keebz, who did a great job coming up with enough tracks to work with the install in such a short period. He received OSC from the gesture application into a MaxMSP patch that piped it into Ableton to drive the music.
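OSC is a simple enough wire format that it is worth showing what those messages actually look like. A minimal encoder for an address plus one float argument, per the OSC 1.0 spec (NUL-terminated strings padded to 4 bytes, big-endian 32-bit floats), looks roughly like this; the address `/kiosk/1/press` is illustrative, not our actual message scheme:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append an OSC string: the bytes, a terminating NUL, then zero-padding
// so the total length is a multiple of 4.
static void padString(std::vector<uint8_t>& out, const std::string& s) {
    out.insert(out.end(), s.begin(), s.end());
    out.push_back(0);
    while (out.size() % 4 != 0) out.push_back(0);
}

// Encode an OSC message with a single float argument.
std::vector<uint8_t> encodeOscFloat(const std::string& address, float value) {
    std::vector<uint8_t> out;
    padString(out, address);
    padString(out, ",f");                   // type tag string: one float
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    out.push_back(bits >> 24);              // big-endian float
    out.push_back((bits >> 16) & 0xFF);
    out.push_back((bits >> 8) & 0xFF);
    out.push_back(bits & 0xFF);
    return out;
}
```

In practice we used an OSC library rather than packing bytes by hand, but seeing the layout makes it clear why the same messages can fan out to both MaxMSP and my front end.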
I built the front-end application in OpenFrameworks. It received OSC from the gesture application as well as an audio feed. Once I had a proof of concept working in greyscale, I asked Jonny to supply the Vivid-appropriate palette you see.
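The audio-reactive side of the facade animation can be sketched as a smoothed volume follower. This is my reconstruction of the general technique, not the shipped code; the smoothing coefficient is an assumed value:

```cpp
#include <cmath>
#include <vector>

// Take the RMS volume of each incoming audio buffer and smooth it with a
// one-pole low-pass, so shape scale follows the music without flickering
// frame to frame.
class VolumeFollower {
public:
    // Feed one buffer of samples; returns the smoothed level (roughly 0..1).
    float process(const std::vector<float>& buffer) {
        float sum = 0.0f;
        for (float s : buffer) sum += s * s;
        float rms = buffer.empty() ? 0.0f : std::sqrt(sum / buffer.size());
        level += smoothing * (rms - level);  // one-pole low-pass
        return level;
    }

    float currentLevel() const { return level; }

private:
    float level = 0.0f;
    float smoothing = 0.2f;  // assumed value; tune to taste
};
```

The same idea extends to frequency: run an FFT on the buffer and feed each band to its own follower, so bass-driven shapes and melody-driven shapes move independently.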
It was an incredible stress. I mean success.