Chris Harrison at Carnegie Mellon University in Pittsburgh, Pennsylvania, along with Dan Morris and Desney Tan at Microsoft's research lab, has created a system that lets users treat their own hands and arms as touch screens by detecting the distinct low-frequency acoustic signals produced when tapping different parts of the skin.
An armband houses an array of sensors that collect the signals generated by the skin taps; software then works out which part of the display you want to activate.
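To make the "works out which part" step concrete, here is a minimal sketch of one way such a classifier could map a tap's signal features to a skin location, using a simple nearest-centroid approach. The actual system trains a proper machine-learning classifier on many acoustic features; the locations, feature values, and method here are purely illustrative assumptions.

```python
import math

# Illustrative "training" centroids: an average feature vector
# (e.g. per-band signal amplitudes) for taps at three skin locations.
# These numbers are made up for demonstration.
CENTROIDS = {
    "forearm":   [0.9, 0.2, 0.1],
    "wrist":     [0.3, 0.8, 0.2],
    "fingertip": [0.1, 0.3, 0.9],
}

def classify_tap(features):
    """Return the skin location whose centroid is nearest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda loc: dist(features, CENTROIDS[loc]))

# A tap whose features resemble the forearm profile:
print(classify_tap([0.85, 0.25, 0.15]))  # → forearm
```

The real pipeline would, of course, extract its features from the armband's raw sensor stream rather than take them ready-made.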
The system closes the loop with a pico-projector that displays menus and other selectable media on your arm, and the complete setup has an accuracy of 95.5%.
More details can be found in this Microsoft Research paper.
It is unlikely the system will find its way into our smartphones any time soon, but I do wonder whether the accelerometers in our current devices are sensitive enough for a crude version that could handle volume up, volume down, pause and play. Any developers interested in giving it a try?
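For anyone tempted by that challenge, a crude first cut might be as simple as thresholding spikes in the accelerometer's magnitude stream and mapping tap counts to playback commands. Everything below is a hypothetical sketch: the thresholds, the simulated readings, and the command mapping are all assumptions, not anything from the paper.

```python
def detect_taps(samples, threshold=2.5, refractory=3):
    """Return indices where acceleration magnitude (in g) spikes above
    `threshold`, ignoring samples within `refractory` readings of the
    previous tap so one physical tap isn't counted twice."""
    taps, last = [], -refractory
    for i, g in enumerate(samples):
        if g > threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps

# Simulated magnitude stream: quiet, a sharp tap, quiet, another tap.
stream = [1.0, 1.1, 0.9, 3.2, 2.8, 1.0, 1.0, 1.0, 3.5, 1.1]
print(detect_taps(stream))  # → [3, 8]
```

On a real phone you would feed this from the platform's sensor API and distinguish single from double taps by the spacing between detected indices; whether the hardware is sensitive enough to tell a forearm tap from ordinary jostling is exactly the open question.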