Light Tone allows users to play music on an imaginary piano roll in the air. Here's a video showcasing the project exhibition at Dawson College.
Light Tone aims to create an ethereal feeling of playing music simply by extending one's hands, without touching anything, and to allow just about anyone to experience the joy of creating music on the fly.
The project is programmed in Processing (Java) and uses Microsoft's Kinect for input. I used KinectPV2 for interfacing the Kinect with Processing, TheMidiBus for MIDI output, and loopMIDI for routing MIDI from Processing to Ableton Live.
In simple terms, Processing scans a line of pixels, and once that line is intersected, a note is played based on the normalized position of the intersection point. The X coordinate (left-right for the user) of the intersection point drives pitch, and the Y coordinate (up-down for the user) drives note velocity. The current setup covers 2 octaves, though it could technically cover anywhere between 1 and 512 notes (limited by the Kinect's IR camera resolution).
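The mapping described above can be sketched in plain Java. The base note, note count, and MIDI velocity range are assumptions for illustration (the original sketch's exact values aren't shown here): two octaves means 24 semitone steps, X picks the step, and Y scales linearly into MIDI's 0–127 velocity range.

```java
public class NoteMapper {
    static final int BASE_NOTE = 48;  // assumed base pitch (C3); the actual sketch may differ
    static final int NOTE_COUNT = 24; // two octaves of semitones

    // Map the intersection point's normalized X (0.0–1.0) to a MIDI pitch.
    static int pitchFromX(double x) {
        int step = (int) Math.min(NOTE_COUNT - 1, x * NOTE_COUNT);
        return BASE_NOTE + step;
    }

    // Map the intersection point's normalized Y (0.0–1.0) to a MIDI velocity (0–127).
    static int velocityFromY(double y) {
        return (int) Math.round(y * 127);
    }

    public static void main(String[] args) {
        System.out.println(pitchFromX(0.0));    // leftmost point -> lowest note (48)
        System.out.println(pitchFromX(0.999));  // rightmost point -> highest note (71)
        System.out.println(velocityFromY(0.5)); // mid-height -> medium velocity (64)
    }
}
```

In the real sketch these values would be fed to TheMidiBus's note-sending calls rather than printed, but the normalization logic is the interesting part.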
This MIDI information is sent to Ableton, goes through a synth, and plays along with a dynamically arranged piece with over 50 million unique combinations, using Ableton's clip triggers.
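The combination count comes from simple multiplication: with independent clip slots per track, the number of unique arrangements is the product of the clip counts. The track and clip numbers below are purely hypothetical (the post doesn't give the actual Ableton session layout), but they show how quickly the total passes 50 million.

```java
public class ClipCombinations {
    // Count unique arrangements given clip-variation counts per track.
    // Each track plays one of its clips independently, so totals multiply.
    static long combinations(int[] clipsPerTrack) {
        long total = 1;
        for (int clips : clipsPerTrack) {
            total *= clips;
        }
        return total;
    }

    public static void main(String[] args) {
        // Hypothetical session: 10 tracks, each with 6 interchangeable clips.
        int[] session = {6, 6, 6, 6, 6, 6, 6, 6, 6, 6};
        System.out.println(combinations(session)); // 6^10 = 60466176, over 50 million
    }
}
```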
Here's a sample of the dynamic soundtrack.
I wanted to add two lasers with line lenses, angled to match the pixel line, to give users more visual feedback and help them interact with the project. Of course, for the light emitted by the lasers to be visible, it needed to scatter off something, like smoke from a smoke machine. I decided not to feature them in the exhibition for two reasons: the lasers I got were strong enough to be a health hazard, and I was (unsurprisingly) not allowed to use a smoke machine inside the school.