
XBMC Kinect Adapter


Project maintained by vova1987 Hosted on GitHub Pages — Theme by mattgraham

Welcome to the XBMC Kinect Adapter GitHub page!

XBMC Kinect Adapter is an open source project that lets you use a Kinect device to control the XBMC media center. The project provides both a ready-to-deploy executable and an easy-to-use developer platform for creating new and exciting gestures.

XBMC What?

XBMC is the most widespread open source solution for home entertainment. XBMC lets you control your home theater and adds cool features and custom plugins to enhance your experience. XBMC allows you to control its simple and intuitive interface using any kind of input device - keyboard, mouse, remote, cellphone, etc. BUT - what if you want something even better? Even easier?

Kinect Who?

Kinect is a Microsoft-developed device, used mainly with the Xbox gaming platform, that detects user position and orientation. It is a cutting-edge technology just waiting to be exploited...

XBMC Kinect Adapter

This is our project. Our goal is to enable you to control your XBMC using your Kinect device. Use your hands and voice to tell the TV what to do! No more complex remote controls and cumbersome gaming pads. All you need to do is wave your hand, and XBMC will follow.

Gestures

Features:

  1. Large set of hand and body gestures implemented.
  2. Skeleton info is combined with the UserInfo stream to provide more intuitive gestures (grab, release, etc.)
  3. Voice recognition, using the Kinect microphone and the Microsoft Speech recognition library, adds even more control options.
  4. An XBMC command sender that can send all existing XBMC control commands.
  5. A fully configurable, user-friendly gesture-to-command mapping using a dedicated XML file.
  6. Reusable gesture detector and command sender classes that can be plugged into any project requiring gesture recognition.

Try it out: our platform is extensible and easy to use - feel free to fork and add your own gestures.

Learn: Take a look at the already implemented gestures - they will give you a good idea of how to implement your own.

Installation instructions

  1. Download the binary package.
  2. Run the EXE file.
  3. Start controlling your XBMC!

Gesture to Command mapping

The mapping of all available gestures to XBMC commands is dynamic. Use the provided XML to define your own mappings. The structure of a mapping is as follows:

```xml
<GestureCommandPair>
  <Gesture Type="Physical">TheIdOfYourGesture</Gesture>
  <Command Type="KbCommand">KbButtonToSendToXBMC</Command>
</GestureCommandPair>
```

As you can see, all you need to do is add such an element to the XML. If the gesture ID exists (i.e. there is a class implementing it), it will be mapped to the given XBMC keyboard command. See the XBMC documentation for the full list of available keyboard commands.
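As a concrete illustration, a mapping file with two gestures might look like the sketch below. The root element name and the specific keyboard command values are assumptions for the example; check the GestureToCommand.xml shipped with the project for the exact schema. WaveLeft and WaveRight are gestures already implemented in the project.

```xml
<!-- Illustrative sketch: root element name and command values are
     assumptions; see the shipped GestureToCommand.xml for the real schema. -->
<GestureCommandMappings>
  <GestureCommandPair>
    <Gesture Type="Physical">WaveRight</Gesture>
    <Command Type="KbCommand">Right</Command>
  </GestureCommandPair>
  <GestureCommandPair>
    <Gesture Type="Physical">WaveLeft</Gesture>
    <Command Type="KbCommand">Left</Command>
  </GestureCommandPair>
</GestureCommandMappings>
```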

Implementing a new Gesture

If you really want to go pro and implement your own custom gestures, here are the steps:

  1. For each segment of your gesture, create a new class that implements IGestureSegment. Implement the only method of the interface: CheckGesture(). This method will be called to determine, based on your criteria, whether the segment is valid. You have all the skeleton and interaction data at your disposal. For examples, see the segments already implemented in our project.

  2. Add a new class that implements ICompositeGesture. Implement its two methods: one returns a unique gesture ID and the other returns the list of segments for the gesture. Note: the segments you create are reusable! You can use the same segments to implement different gestures. For such an example, see the WaveLeft/WaveRight gesture classes: they implement a gesture by concatenating the same segments a few times. Make sure the class you create is placed in the KinectAdapter.Fizbin.Gestures namespace.

  3. Open GestureToCommand.xml and add a new mapping from your gesture (use the ID returned by the ICompositeGesture interface) to any XBMC command. The type of your gesture should be 'Physical'.

  4. That's it! The class you created will be dynamically loaded, and the mapped command will be applied according to the XML mapping.
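Taken together, the steps above can be sketched in C#. This is only an illustration: the exact method signatures, the GesturePartResult type, and the gesture/class names here are assumptions based on the interface descriptions above, so check the actual IGestureSegment and ICompositeGesture definitions in the repository before copying.

```csharp
using System.Collections.Generic;
using Microsoft.Kinect;

namespace KinectAdapter.Fizbin.Gestures
{
    // Step 1: one segment of the gesture. CheckGesture() is called with the
    // current skeleton data and decides whether this segment is satisfied.
    // (The signature and GesturePartResult type are assumptions.)
    public class HandAboveHeadSegment : IGestureSegment
    {
        public GesturePartResult CheckGesture(Skeleton skeleton)
        {
            var hand = skeleton.Joints[JointType.HandRight].Position;
            var head = skeleton.Joints[JointType.Head].Position;
            return hand.Y > head.Y ? GesturePartResult.Succeeded
                                   : GesturePartResult.Failed;
        }
    }

    // Step 2: the composite gesture, exposing a unique ID and the ordered
    // list of segments. Segments are reusable across gestures.
    public class HandAboveHeadGesture : ICompositeGesture
    {
        // Step 3: this hypothetical ID is what GestureToCommand.xml references.
        public string GetGestureId()
        {
            return "HandAboveHead";
        }

        public IList<IGestureSegment> GetGestureSegments()
        {
            // Repeating a segment requires it to hold over consecutive checks,
            // just as WaveLeft/WaveRight concatenate the same segments.
            var segment = new HandAboveHeadSegment();
            return new List<IGestureSegment> { segment, segment };
        }
    }
}
```

Because the class lives in the KinectAdapter.Fizbin.Gestures namespace, it will be picked up by the dynamic loading described in step 4; all that remains is the one-line XML mapping from "HandAboveHead" to an XBMC keyboard command.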

Debugging

A debug screen is available to help you develop new gestures by showing the tracked skeleton.

Authors

Regev Brody, Vladimir Dvorkin, Vladimir Cooperman