As many of you are aware, the West Coast Championship Controller Battle went down last weekend, and the DJ Techtools staff was there for full coverage of the event. Although Edison deservedly took the crown, we all found another competitor’s setup to be one of the most interesting and innovative controllers we have seen to date. So without further ado, let’s introduce Tim Thompson (aka Artful Codger) and his Space Palette, which uses a wooden frame and the Microsoft Kinect to create electronic music on the fly with the literal wave of a hand.
Tim Thompson, the inventor of the Space Palette, is no newcomer to DIY controllers and algorithmic music. With a strong interest in computer music and art, paired with a background in software engineering (having worked at notable companies such as Bell Labs and AT&T), Tim has created a variety of unique music and art installations that are not only inspiring but truly captivating. Check out the extensive (10+ years) “What have I been doing?” section of his personal website (http://nosuch.com/tjt/).
Tim’s most recent project, the Space Palette, uses the depth tracking of Microsoft’s Kinect and a wooden frame with rectangular cutouts to control pitch, velocity, and other parameters of specific electronic and drum sounds. A note is triggered whenever motion is detected within one of the rectangular cutouts, and it is then modulated based on the position and depth of the hand’s travel, producing a 3D touch surface.
The Microsoft Kinect is a horizontal sensor bar that combines an RGB camera, a depth sensor (an infrared laser projector paired with a monochrome CMOS sensor), and a multi-array microphone, which together enable 3D motion capture, facial recognition, gesture recognition, and voice recognition.
For all practical purposes, the Kinect has a tracking range of 1.2 to 3.5 meters (3.9 to 11 ft) and can track up to six people, performing full motion analysis on two of them with feature extraction of 20 joints per person, within an angular field of view of 57 degrees horizontally and 43 degrees vertically (a motorized pivot can tilt the sensor another 27 degrees vertically). Another thing you probably didn’t know about the Kinect: it currently holds the Guinness World Record for “fastest-selling consumer electronics device,” selling 8 million units in its first 60 days, or 133,333 units per day!
After the controller battle, I followed up with Tim via email to inquire further about the Space Palette and his work:
All of our staff were really impressed with your Space Palette, not to mention wildly entertained by your performances (by the way, we unanimously thought you should have won your last round).
Thanks, it was certainly a great opportunity to show people that controllers can be more than buttons and knobs. The Kinect blows the doors wide open for innovation, and Space Palette is just the first of many new controllers (both musical and visual) that will be based on it, there’s no doubt.
What software did you use to create the music aspect of your compositions with the Space Palette?
KeyKit is the environment I use – it’s a programming language I developed several decades ago, specifically for algorithmic and realtime MIDI work. The sounds were done with a variety of VST synths inside Plogue Bidule – e.g. Omnisphere, FM8, Battery 3, Massive.
What software do you use to communicate with the Kinect?
I wrote a program in C++ which uses the libfreenect library to communicate with the Kinect. I don’t use the RGB camera or skeleton tracking; I only use the depth data. The OpenCV image-processing library is used to do blob detection in the thresholded depth image. That program sends OSC messages to both KeyKit (to do musical things) and Processing (to do graphical things). The OSC messages use the TUIO format, which is a standard for multitouch information.
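To make that pipeline concrete, here is a minimal sketch of the thresholding-and-blob-detection step Tim describes. His actual program is written in C++ with libfreenect and OpenCV; everything below (the flood-fill approach, the threshold value, the synthetic frame) is purely illustrative and not taken from his code.

```python
# Toy version of "blob detection in the thresholded depth image":
# pixels nearer than a cutoff are grouped into connected regions,
# and each region becomes a TUIO-style (x, y, depth) cursor.

THRESHOLD_MM = 900  # hypothetical cutoff: anything nearer counts as a "touch"

def find_blobs(depth, threshold=THRESHOLD_MM):
    """Return (x, y, mean_depth) for each connected region of near pixels.
    x and y are normalized to [0, 1), as in TUIO cursor messages."""
    h, w = len(depth), len(depth[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if depth[y][x] < threshold and not seen[y][x]:
                # flood fill to collect this blob's pixels
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx+1, cy), (cx-1, cy),
                                   (cx, cy+1), (cx, cy-1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and depth[ny][nx] < threshold):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                mx = sum(p[0] for p in pixels) / len(pixels)
                my = sum(p[1] for p in pixels) / len(pixels)
                mz = sum(depth[p[1]][p[0]] for p in pixels) / len(pixels)
                blobs.append((mx / w, my / h, mz))
    return blobs

# A tiny synthetic depth frame (in mm): one "hand" blob in the far field.
FAR = 2000
frame = [[FAR] * 8 for _ in range(6)]
for row in range(2, 4):
    for col in range(3, 5):
        frame[row][col] = 800  # four near pixels form one blob

print(find_blobs(frame))  # one blob near the center of the frame
```

In the real system, each blob detected this way would be sent as an OSC/TUIO cursor message to KeyKit and Processing; OpenCV’s contour functions do this job far more efficiently than a Python flood fill.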
Is there any new technology or devices out there that interest you for another controller/installation (or adding to an existing)?
The other interesting device I’ve used in the last decade is the Fingerworks iGesture pad. Besides being multitouch, the iGesture pad can detect (with incredible precision) the area of each finger on its surface, a feature which I found to be extremely useful for realtime graphics (think “finger painting”) as well as music. The iGesture pad is no longer available because Apple bought the company in 2005, to help develop the iPhone. Apple doesn’t yet officially allow the use of finger area in apps, though some of their hardware is capable of it. I’m looking forward to the day when either pressure-sensitive or area-sensitive multitouch surfaces are widely available and supported. Roger Linn is one of the people trying to make this happen.
Was creating the software mapping/setup with the Kinect and your other software GUI based or more programming based?
All the software I write is done with text-based programming languages.
Could you please explain more specifically what the musical attributes the x, y, and z (outward depth) components of each rectangular space are controlling?
Within each rectangle, the x position of the hand (or other object) determines the pitch. The pitches of the notes are restricted to a particular scale. Each rectangle can have a different range; most of them are two octaves. The smaller the pitch range, the more accurate control you have over the pitch. Any number of hands (or objects) can be used simultaneously. There are no loops or sequences; notes are triggered individually whenever the movement of a particular hand or object exceeds a threshold.
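The trigger rule in that answer — a note fires only when a hand’s movement exceeds a threshold — can be sketched as follows. The threshold value, the first-sighting behavior, and the `HandTracker` name are assumptions for illustration, not taken from the Space Palette code.

```python
# Sketch of per-hand note triggering: compare each hand's position to its
# previous frame and fire only when the movement exceeds a threshold.

TRIGGER_DIST = 0.05  # assumed: normalized movement between frames

class HandTracker:
    def __init__(self, threshold=TRIGGER_DIST):
        self.threshold = threshold
        self.last = {}  # hand id -> (x, y) from the previous frame

    def update(self, hand_id, x, y):
        """Return True if this hand moved enough to trigger a note."""
        prev = self.last.get(hand_id)
        self.last[hand_id] = (x, y)
        if prev is None:
            return True  # assumed: a newly seen hand triggers immediately
        dx, dy = x - prev[0], y - prev[1]
        return (dx * dx + dy * dy) ** 0.5 > self.threshold

tracker = HandTracker()
print(tracker.update(1, 0.50, 0.50))  # True  (new hand)
print(tracker.update(1, 0.51, 0.50))  # False (tiny drift, below threshold)
print(tracker.update(1, 0.60, 0.50))  # True  (big move, note triggers)
```

Because the tracker keys on a hand id, any number of hands or objects can trigger notes independently, matching the “any number of hands can be used simultaneously” behavior described above.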
The narrow rectangular spaces in the four corners of the Palette are essentially “big buttons” for control. One of them changes the key, one of them changes the scale, and one of them lets you change the sounds (each rectangle can be cycled through three different sounds). I sometimes allow one of these control buttons to turn on realtime looping independently for each rectangle – this can be convenient and fun, but can also be very confusing, especially for people new to the instrument, so I normally don’t enable this feature.
The y position within a rectangular space is used to control the timing quantization of the notes. Notes played in the lower third of the space are quantized to quarter notes; the middle third quantizes to eighth notes; the top third quantizes to sixteenth notes. I think of this feature as implementing “time frets”.
The z position (the depth) gets converted into MIDI aftertouch, and the effect of that is dependent on the sound patch. Typically it’s used to add vibrato, control a filter, or mix between two sounds. The use of hand depth while playing is very satisfying for the player as well as entertaining for the audience.
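Putting the three axes together, the per-rectangle mapping Tim describes can be sketched like this. This is not his KeyKit code; the scale, the two-octave range, and the function name are illustrative assumptions based on his answers above.

```python
# Illustrative mapping of one hand position inside one rectangle:
# x -> pitch on a scale, y -> time quantization ("time frets"),
# z (depth) -> MIDI aftertouch.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets; scale choice is an example

def map_hand(x, y, z, root=60, octaves=2, scale=C_MAJOR):
    """x, y in [0, 1) across the rectangle, y measured from the bottom;
    z in [0, 1) is normalized hand depth."""
    # x position picks a scale degree across `octaves` octaves
    steps = len(scale) * octaves
    degree = min(int(x * steps), steps - 1)
    pitch = root + 12 * (degree // len(scale)) + scale[degree % len(scale)]

    # y position picks the quantization grid: lower third = quarter notes,
    # middle third = eighths, top third = sixteenths (in whole-note units)
    quantize = [0.25, 0.125, 0.0625][min(int(y * 3), 2)]

    # depth becomes MIDI aftertouch (0-127)
    aftertouch = min(int(z * 128), 127)
    return pitch, quantize, aftertouch

print(map_hand(0.0, 0.9, 0.5))  # lowest note, sixteenth-note grid, mid aftertouch
```

A smaller `octaves` value would model the narrower-range rectangles he mentions, where each scale degree occupies more horizontal space and pitch control is correspondingly more accurate.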
Are you planning on ever producing or releasing the hardware/software package that creates the Space Palette to the public (or for sales)?
I’m planning on making the Kinect-specific program (the one that sends OSC/TUIO) available, at some point. The software is capable of dealing with any frame that has holes in it – the holes don’t have to be rectangular. Each hole becomes a separate multitouch surface. After the software is trained, the frame is actually optional, though I still consider it an important device both for the player as well as the audience. Now that there’s a TechShop San Jose, I’ll be able to produce frames of different shapes more easily.
Was there anything specific (e.g., an inventor, composer, piece of equipment, or event) that inspired you to start making your own controllers and installations?
Two events were particularly inspiring for me. The first was Woodstockhausen, a yearly event near Santa Cruz where experimental music makers were given an opportunity to entertain each other. I participated for 3 years in a row, doing increasingly adventurous performances – the second year I performed using only a QWERTY keyboard as the controller, and the third year I performed by dancing on four PlayStation dance pads, in complete darkness, wearing EL-wire-outlined pants. The other inspirational event was my first visit to Burning Man in 2002, which triggered the most radical change in my work. In 2003 I did my first playa installation, a 12-foot-high lyre with lighted strings that you played by dancing on PlayStation dance pads. Burning Man continues to inspire me every year – this will be my tenth year in a row.
What’s the best fan/audience response you’ve ever seen from any of your events?
The best response I’ve ever gotten was when I released the swinging tennis ball in the Space Palette at the Controller Battle.
Are there any interesting stories about the inspiration, creation, or evolution of the Space Palette that you would like to share?
Can’t think of a particular one. Evolution certainly describes how it was created and will continue to be enhanced.
Where can fans see the Space Palette in the future?
I’ll be bringing it to the “Touch the Gear” event on July 17, which is part of the Outsound New Music Summit:
http://www.outsound.org/summit/11/schedule_details11.html
I’ll be bringing it to the Nexus fundraiser on July 29:
http://www.themml.com/event.php?event_id=8989
And of course I’ll be bringing it to Burning Man. I’ve applied to be a theme camp, called the Multi Multi Touch Touch Camp, and I’ll be building a shade structure to allow it to be used in our camp during the day as well as at night.
Are you working on any new, yet-to-be-seen projects?
Using the Space Palette to paint graphics (simultaneously with the music) is the next step.
Is there anything else about yourself that you would like our readers to know?
One other concept I’d like to include is that I consider the Space Palette to be in the class of a “casual instrument”, analogous to “casual games” that are simple and easy to learn, yet still deep enough to be satisfying for a long time. Using the label “casual instrument” is also an attempt to distinguish this from more conventional instruments that allow more fine-grained control (e.g. the ability to reliably play arbitrary melodies). There’s actually nothing preventing the Kinect and the Space Palette from being used to create a more conventional instrument; it’s just not the focus of my current efforts, which are centered around creating installations for people (both musicians and non-musicians) to enjoy at various events. Actually, the most rewarding situation for me is to see someone who does not consider themselves a musician walk up to one of my instruments, create their own music, and (most importantly) realize that they are the ones actually creating it. At Burning Man, one of my most memorable moments was watching someone walk away from Monolith 2.0, saying “I did that!”
You can find out more about Tim Thompson and his work at his website: http://timthompson.com/