Kinect Keyboard

For Christmas my lovely girlfriend gave me a Kinect for my Xbox.  After playing several hours of Dance Central, I decided to get down to the serious business of writing my own fun demo.  Hope you enjoy.  See below the cut for details on the implementation.

The above video is just a screen capture showing the depth map from the Kinect (whiter is further back) on the top left, the live video on the top right, and my keyboard view on the bottom left.  The keyboard view displays the outline of any objects observed within a certain distance range from the Kinect.  You can tell this because I appear and disappear as I move through this range in the video.  The computer vision code then finds the largest blob in the keyboard view, computes its centroid, and uses that position to decide which key to play.
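For the curious, here is a rough sketch of that step using the ofxOpenCv addon that ships with openFrameworks.  The threshold value, minimum blob area, and key count are placeholders rather than the exact values from my code, and it assumes the depth map already lives in a grayscale ofxCvGrayscaleImage.

    // Sketch: isolate the near-range band of the depth image, find the largest
    // blob, and map its centroid's x position onto one of numKeys keys.
    #include "ofxOpenCv.h"

    int keyForLargestBlob(ofxCvGrayscaleImage &depthImage, int numKeys) {
        // Keep only pixels inside the distance band of interest.  The cutoff
        // (and whether you threshold normally or inverted) depends on how the
        // depth map is encoded; 80 is just a placeholder, and a second threshold
        // could trim off anything that gets too close.
        ofxCvGrayscaleImage band;
        band.allocate(depthImage.getWidth(), depthImage.getHeight());
        band = depthImage;
        band.threshold(80, true);   // inverted: keep the near (darker) pixels

        // Find blobs in the banded image and consider only the largest one.
        ofxCvContourFinder contourFinder;
        int maxArea = band.getWidth() * band.getHeight() / 2;
        contourFinder.findContours(band, 500, maxArea, 1, false);
        if (contourFinder.nBlobs == 0) return -1;   // nothing in range, no key

        // Divide the view into numKeys vertical slices and pick the slice the
        // blob centroid falls into.
        float cx = contourFinder.blobs[0].centroid.x;
        int key = int(cx / band.getWidth() * numKeys);
        return ofClamp(key, 0, numKeys - 1);
    }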

To get my Kinect working with my Mac, I downloaded the libfreenect package using the brew package manager.  I also used git to get the excellent ofxKinect library for openFrameworks.  To my surprise, things just compiled in Xcode on the first try.  The install process looks a little rough, but it worked out fine for me.  See the bottom of the post for links back to all of these great free tools.

This was my first experience with openFrameworks, but the UI development aspects feel very much like Processing (except in C++).  The documentation available on their website was complete enough for me to work out any issues I had.  OpenCV is also included with openFrameworks, and I used its blob detection capabilities to map my movements in front of the camera to the keys.  The sound is amateurish: I just ripped out the openFrameworks sound demo and modified it to emit sine waves at the appropriate pitches for the keys.
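Here is a rough sketch of that sound idea, modeled loosely on the openFrameworks audio output example; the class name, pitch table, and stream settings are illustrative rather than copied from my project.

    // Sketch: play a sine wave whose pitch depends on which key the blob hit.
    #include "ofMain.h"

    class KeyboardApp : public ofBaseApp {
    public:
        float phase = 0.0f;
        float frequency = 0.0f;   // 0 means silence (no blob in range)

        void setup() {
            // 2 output channels, 0 inputs, 44.1 kHz, 256-sample buffers, 4 buffers
            ofSoundStreamSetup(2, 0, this, 44100, 256, 4);
        }

        // Called whenever the tracked blob lands on a key.
        void playKey(int keyIndex) {
            // One octave of a C major scale, just as an example pitch table.
            static const float pitches[] = {261.63f, 293.66f, 329.63f, 349.23f,
                                            392.00f, 440.00f, 493.88f, 523.25f};
            frequency = pitches[keyIndex % 8];
        }

        // Audio callback (audioRequested in older openFrameworks releases):
        // fill the buffer with a sine wave at the current key's pitch.
        void audioOut(float *output, int bufferSize, int nChannels) {
            float phaseStep = TWO_PI * frequency / 44100.0f;
            for (int i = 0; i < bufferSize; i++) {
                float sample = (frequency > 0) ? 0.3f * sin(phase) : 0.0f;
                phase += phaseStep;
                if (phase > TWO_PI) phase -= TWO_PI;
                output[i * nChannels]     = sample;   // left
                output[i * nChannels + 1] = sample;   // right
            }
        }
    };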

  • http://openkinect.org/wiki/Main_Page
  • http://www.openframeworks.cc/
  • https://github.com/ofTheo/ofxKinect