V Motion Project


Part 1: The Instrument

The Motion Artist is on the left, and the Kinects are pointed at him from straight ahead. One Kinect uses OpenNI drivers to calculate his skeleton position. The other uses freenect drivers to access the raw depth data from the infrared sensor. The top computer is Paul’s music system, built in Processing on a Windows PC. The bottom computer is the Mac Pro running my visuals system, written in C++ with openFrameworks. Mac and PC, happily working together.






The music system works by connecting the Kinect camera to Ableton Live, music sequencing software commonly used by DJs and musicians during live performances. Below is a screen capture of our Ableton setup. The interface is full of dials, knobs, switches, and buttons. Normally, a musician would use a physical control panel to operate Ableton’s virtual ones. Paul’s music system instead lets us map body movements to Ableton’s controls. For example, touching your head with your left hand could start a certain loop, or the distance between your hands could control a dry/wet filter. This ability to map physical motion to actions in Ableton is enormously powerful.
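The post doesn’t show the mapping code itself, but the core idea can be sketched in a few lines: normalize a body measurement into the 0–1 range and scale it to a 7-bit MIDI control-change value, which is the range Ableton’s virtual dials and knobs respond to. The function name and ranges here are hypothetical.

```cpp
#include <algorithm>

// Hypothetical helper: convert a raw body measurement (e.g. the distance
// between the hands, in millimetres) into a 7-bit MIDI CC value (0-127),
// the range Ableton Live's dials and knobs expect.
int bodyValueToCC(float value, float minValue, float maxValue) {
    float t = (value - minValue) / (maxValue - minValue); // normalize to 0..1
    t = std::clamp(t, 0.0f, 1.0f);                        // guard out-of-range input
    return static_cast<int>(t * 127.0f + 0.5f);           // round to 0..127
}
```

A hand separation halfway through its range would land near the middle of the dial; anything outside the calibrated range pins the dial at its minimum or maximum, which keeps noisy skeleton data from making the control jump.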


The skeleton-based audio control system. In this example, the distance between the hands turns one dial in Ableton, and the angle of rotation controls another.
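The two measurements in the caption are straightforward to compute from the skeleton’s hand joints. This is a sketch, not the project’s code: the reference axis for the “angle of rotation” isn’t documented, so the version below measures the tilt of the line joining the hands in the camera plane.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Distance between the two hand joints, used to drive one Ableton dial.
float handDistance(const Vec3& l, const Vec3& r) {
    float dx = r.x - l.x, dy = r.y - l.y, dz = r.z - l.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Tilt of the line joining the hands in the camera plane, in degrees,
// used to drive a second dial. (An assumption: the project's actual
// reference axis for "rotation" may differ.)
float handAngleDegrees(const Vec3& l, const Vec3& r) {
    return std::atan2(r.y - l.y, r.x - l.x) * 180.0f / 3.14159265f;
}
```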


The air keyboard


This video demonstrates the instruments available to the Motion Artist. “Vox” and “Bass” are keyboards. “LFO” controls the low-frequency oscillation (that distinctive dubstep ‘wobble’). “Dough” uses two filters, one controlled by the distance between his hands and the other by the rotation of the ‘ball of dough’ he’s creating. “Drums Filtered” is a drum keyboard, but the sounds become filtered as his chest gets closer to the ground. There’s one element of the performance that’s not in this test: the moment when Josh looks like he’s stretching out a giant triangle. Here, the distance between his hands and the ground provides control. As he pulls the sound up, it gets louder, and as he pulls his hands apart, the dampen filter decreases, causing the sound to “open up”.
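The “giant triangle” control described above combines two independent axes. A minimal sketch of that idea, with assumed names and ranges (the project’s real parameters aren’t documented here): hand height above the ground scales the volume, and hand separation opens the dampen filter.

```cpp
#include <algorithm>

// Two control values derived from the "stretching a giant triangle" move.
// Field names and calibration ranges are assumptions for illustration.
struct TriangleControl {
    float volume;      // 0..1, grows as the hands are pulled up
    float filterOpen;  // 0..1, grows as the hands are pulled apart
};

TriangleControl triangleFromHands(float handHeight, float maxHeight,
                                  float handSeparation, float maxSeparation) {
    TriangleControl c;
    c.volume     = std::clamp(handHeight / maxHeight, 0.0f, 1.0f);
    c.filterOpen = std::clamp(handSeparation / maxSeparation, 0.0f, 1.0f);
    return c;
}
```

Keeping the two axes independent is what makes the move feel expressive: the performer can raise the sound without opening the filter, or open it in place.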




The Landscape

Matt and the team spent weeks crafting a series of epic environments and effects that correspond to the build-up of the song. Because we knew the footage of the performance needed to match the radio edit of the song for the music video to work, we were able to lock down the exact timing of the musical transitions and Matt used these to orchestrate the visuals of the landscape. The final render took over 3 hours on Assembly’s massive render wall.


The Green Man



The User Interface





