Kinect enables humans to interact naturally with computers. The latest sensor, whether the Kinect for Xbox One sensor or the Kinect for Windows v2 sensor, together with the free software development kit (SDK) 2.0, gives developers the foundation needed to create and deploy interactive applications that respond to people's natural movements, gestures, and voice commands. But the setup has disadvantages: it needs a large room, the person interacting must stand meters away from the sensor, and more.
We recently reported that Microsoft Research is now developing a new real-time hand tracking system based on a single depth camera that can be embedded in mobile devices. Read about it here.
Microsoft is now presenting another system, Handpose, which can track, in real time, the sophisticated and nuanced motions that people make with their hands in everyday life.
We present a new real-time articulated hand tracker which can enable new possibilities for human-computer interaction (HCI). Our system accurately reconstructs complex hand poses across a variety of subjects using only a single depth camera. It also achieves a high degree of robustness, continually recovering from tracking failures. However, the most unique aspect of our tracker is its flexibility in terms of camera placement and operating range.
Handpose uses a camera to track a person’s hand movements. The system is different from previous hand-tracking technology in that it has been designed to accommodate much more flexible setups. That lets the user do things like get up and move around a room while the camera follows everything from zig-zag motions to thumbs-up signs, in real time.
The system can use a standard Kinect sensor, like the one many people have on their Xbox game console at home. But unlike the current home model, which tracks whole-body movements, this system is designed to recognize the smaller and more subtle movements of the hands and fingers.
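To give a sense of why finger tracking is a harder problem than whole-body skeleton tracking, the sketch below tallies the degrees of freedom (DOF) of a kinematic hand model commonly used in the hand-tracking literature. This parameterization is an illustrative assumption, not necessarily the exact model the Handpose researchers use; the point is that a single hand alone carries roughly as many articulation parameters as the 25-joint full-body skeleton the home Kinect tracks.

```python
# Illustrative only: a common kinematic hand parameterization from the
# hand-tracking literature, not necessarily the exact Handpose model.

# Each finger is typically modeled with 4 DOF: two at the base knuckle
# (flexion + abduction) and one at each of the two remaining joints.
# The thumb is usually given an extra DOF at its base.
FINGER_DOF = {"thumb": 5, "index": 4, "middle": 4, "ring": 4, "little": 4}

# The hand as a whole also moves freely in space:
# 3 DOF for position plus 3 DOF for orientation.
GLOBAL_DOF = 6

hand_dof = GLOBAL_DOF + sum(FINGER_DOF.values())
print(hand_dof)  # 27 parameters to estimate, every frame, for one hand
```

A tracker has to estimate all of these parameters from a single noisy depth image many times per second, which is why subtle finger motion is so much harder to follow than coarse body pose.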
Read more about it here.