Last week a friend of TripleWide Media (Nicholas Rivero) shared with me a link to one of the coolest video projects I’ve seen yet. I knew I had to share it with our community. We’ve talked about media being used as a vehicle to communicate in the context of an environment… much like an instrument would in a band. Art adds to the collective of anthems to create energy, reveal truth, and add a soundtrack to the moment.
However, these guys have taken it to a whole new level… they’ve figured out how to turn video into a virtual musical instrument. Here’s what I’m talking about:
The project has been a long time in the making, as it apparently took months to put the whole thing together. Using the power of Ben Kuper’s BiKinect project, they turned a Microsoft Kinect into a musical instrument with the addition of Ableton Live.
The music and video, however, ran as two separate systems to prevent lag wherever possible. Blogger Jeff Nusz at Custom-Logic.com posted this diagram to show how the two Kinects were used, through a Mac and a PC, to drive the audio and video respectively.
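To get a feel for how a two-machine split like this works, here’s a minimal sketch of one machine streaming hand-position data to the other. The UDP/JSON wire format, port number, and message fields are all my own assumptions for illustration; the project’s actual protocol isn’t documented in the post.

```python
import json
import socket

def send_hand_data(sock, addr, hand_pos):
    """Serialize a hand-position dict and send it to the other machine."""
    sock.sendto(json.dumps(hand_pos).encode("utf-8"), addr)

def receive_hand_data(sock):
    """Block until a packet arrives, then decode it back into a dict."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))

if __name__ == "__main__":
    # Hypothetical port for the machine driving the visuals.
    addr = ("127.0.0.1", 9001)
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(addr)

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_hand_data(sender, addr, {"x": 0.42, "y": 0.77})
    print(receive_hand_data(receiver))
```

Keeping the link one-way and connectionless like this means the audio machine never waits on the video machine, which is the whole point of splitting the systems.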
With the use of the Kinect, the person controlling the musical instrument is able to use everyday hand gestures to control sounds like a keyboard. In this video you can see how they simply move their hands in space to change the key being played…
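A sketch of what that gesture-to-note mapping might look like: take a normalized hand position and quantize it to a note in a scale, so any point in space lands on something musical. The C-major scale, base note, and two-octave range here are my assumptions; the actual BiKinect mapping isn’t spelled out in the post.

```python
# Semitone offsets of the C-major scale within one octave.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def hand_x_to_midi_note(x, base_note=60, octaves=2):
    """Map a hand x-position in [0, 1] to a MIDI note on a major scale."""
    x = min(max(x, 0.0), 1.0)          # clamp out-of-range tracking data
    steps = len(C_MAJOR) * octaves     # total playable scale degrees
    i = min(int(x * steps), steps - 1) # quantize position to a degree
    octave, degree = divmod(i, len(C_MAJOR))
    return base_note + 12 * octave + C_MAJOR[degree]

# Far left plays middle C; moving right walks up the scale.
print(hand_x_to_midi_note(0.0))  # 60
print(hand_x_to_midi_note(0.5))  # 72
```

Quantizing to a scale is what makes free-floating hand movement sound intentional rather than random.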
In the process of creating this instrument, they used a plethora of apps to develop and debug their system. Here is the list:
1. Keyboard Mapper – AIR app used to modify and edit keyboard mappings on the fly.
2. Mapping Tool – custom Processing app to speed up the process of mapping movements to MIDI and Ableton.
3. MIDI Testers – sender and receiver to help test the system connection.
4. MIDI Interceptor – testing app that intercepts MIDI from the system to help figure out where issues lie.
5. Connection Testers – data-send emulator to test data receivers.
6. Motion Management Tool – custom AIR app that provided an interface for creating and managing positions and movements.
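The MIDI testers above boil down to one question: do both ends agree on the wire format? Here’s a minimal sketch of that kind of check, building a raw Note On message and parsing it back. The channel, note, and velocity values are arbitrary examples, not anything from the project.

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message (status 0x90 plus channel)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def parse_note_on(msg):
    """Decode a Note On message back into (channel, note, velocity)."""
    status, note, velocity = msg
    if status & 0xF0 != 0x90:
        raise ValueError("not a Note On message")
    return status & 0x0F, note, velocity

# Round-trip check: what a sender/receiver tester pair verifies.
msg = note_on(0, 60, 100)          # middle C, channel 1
print(parse_note_on(msg))          # (0, 60, 100)
```

Intercepting and decoding messages like this is exactly how you find out whether a bug lives in the mapping layer or in Ableton’s input.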
Of course, as we know here on TripleWide Media, the visuals are just as important to the overall experience, and they were a critical piece of the puzzle for this project as well. As you can see in the concept render below of what they wanted this to look like, the “green polygon man” was a theme from the beginning. I think this helped you realize that he was controlling the music live, rather than the “Disney trick” of having a well-choreographed actor/dancer/musician miming the moves that needed to be pulled off.
The sketch above is a drawing from Jeff (custom-logic.com) showing how they layered all the various visuals in place. Everything for the landscape layer was rendered out as 1920×768, 64-bit Motion JPEG QuickTime videos at 60fps. That’s a heavy codec to have to deal with.
Other visuals included the polygon man:
The user interface for the polygon man to interact with:
For more details on this entire project, check out Jeff’s posts at Custom Logic or Paul’s post on The Fugitive. Links here:
The Fugitive: How We Built It.