Google is working on cheap AI-based hand tracking for Google Glass


Hand tracking is useful in augmented reality applications, as it provides an easy-to-use, natural interface for interacting with digital information, but it normally requires dedicated depth sensors and other processor-intensive, expensive hardware.

Google’s vision of augmented reality, which uses lightweight, glasses-like headsets rather than the helmet-like HoloLens, cannot accommodate such sensors, and today we can report on a new approach to hand tracking that uses only the front-facing camera and on-device AI processing to achieve remarkably accurate results.

The main innovation in the technology is splitting the problem in two: a palm detector model first locates the hand in the frame, and a separate AI-based landmark model then predicts the positions of the fingers within that region.
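To make the division of labour concrete, here is a minimal sketch of such a two-stage pipeline in Python. The function names and dummy outputs are our own illustration, not Google's code; in the real system each stage is a trained neural network running on-device.

```python
import numpy as np

def detect_palm(frame: np.ndarray):
    """Stage 1 (hypothetical stand-in): locate the palm and return a
    crop region (x, y, w, h). Google's pipeline uses a learned detector;
    here we simply return a dummy central region."""
    h, w = frame.shape[:2]
    return (w // 4, h // 4, w // 2, h // 2)

def predict_landmarks(crop: np.ndarray) -> np.ndarray:
    """Stage 2 (hypothetical stand-in): regress hand keypoints within
    the crop. A real implementation would run the landmark model here."""
    return np.zeros((21, 2))  # 21 (x, y) keypoints, dummy values

def track_hand(frame: np.ndarray) -> np.ndarray:
    x, y, w, h = detect_palm(frame)       # find the palm once
    crop = frame[y:y + h, x:x + w]        # restrict to the hand region
    keypoints = predict_landmarks(crop)   # predict fingers within the crop
    return keypoints + np.array([x, y])   # map back to full-frame coords
```

The idea is that the landmark model only ever sees a tightly cropped hand region, which greatly simplifies its job compared with searching a whole camera frame.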

The team is releasing the code as part of a cross-platform framework called MediaPipe.

“We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues,” the team wrote in a blog post.

MediaPipe can map 21 keypoints on the hand and fingers with up to 96% precision, and do it all on a mobile device (i.e., not in the cloud), opening up the possibility of using the technology in other applications, such as controlling apps on your smartphone.
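For developers who want to experiment, the pipeline is exposed through MediaPipe's Python bindings (published after this announcement; exact API details may vary by release). A minimal sketch, assuming OpenCV for image I/O and an image file of your own called hand.jpg:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# static_image_mode=True runs palm detection on every image rather than
# tracking across video frames; the confidence threshold is illustrative.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread("hand.jpg")  # BGR image from OpenCV
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # Each detected hand carries the 21 keypoints mentioned above,
            # with x and y normalised to the image width and height.
            for i, landmark in enumerate(hand.landmark):
                print(i, landmark.x, landmark.y, landmark.z)
```

All of this runs locally; no frames leave the device.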

We imagine Google may also employ the technology on its own devices, with the Pixel 4 already including the Soli radar sensor for gesture control.

Ideally, however, the development will make it cheap and easy to create rich AR experiences, and may ultimately enable inexpensive but powerful AR glasses.

Read all the details at Google here.

Via Next Reality
