VR targets mouth through ultrasound system


Researchers have introduced another way of enhancing the virtual reality experience. Using a device attached to the bottom of the headset, airborne ultrasound waves deliver sensations to the lips, teeth, and tongue, giving users tactile feedback in the VR world.

The system was developed by researchers at Carnegie Mellon University. Why the mouth? Because the lips are highly sensitive; in fact, they are second only to the fingertips in nerve density. That makes them a prime target for adding haptic effects.

Unsurprisingly, the mouth has long been a target for enhancing the VR experience because of its sensitivity. However, developing a practical system for rendering haptic effects on it has been a challenge, said Vivian Shen, a Ph.D. student at the Robotics Institute. VR users naturally dislike covering their mouths, especially with bulky devices. One earlier effort to engage the mouth used a tiny robotic arm that could flick a feather across the lips or spray water on them, but that approach is not practical for extended use.

Meanwhile, ultrasound waves, which can travel short distances through the air, offer a promising way to deliver haptic effects to the mouth. When focused on a small area, such as a spot on the lips, they create a tactile sensation. This is done with an array of transducers, or ultrasound-generating modules, whose emissions are timed so that points of peak amplitude land on the lips, teeth, or tongue. Specifically, the CMU device is a half-moon-shaped array of 64 transducers attached to the bottom of the VR goggles, positioning it over the mouth to deliver the sensations.
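The focusing step described above is the standard phased-array technique: transducers farther from the target fire first, so all wavefronts arrive at the focal point in phase and sum to a peak. The sketch below illustrates that delay calculation only; the array geometry and focal point are hypothetical, not the CMU hardware's actual parameters.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def focusing_delays(transducer_positions, focal_point, speed=SPEED_OF_SOUND):
    """Compute per-transducer emission delays (in seconds) so that every
    wavefront reaches the focal point at the same instant, creating a
    point of peak acoustic amplitude there."""
    distances = [math.dist(p, focal_point) for p in transducer_positions]
    d_max = max(distances)
    # The farthest transducer fires immediately (zero delay); closer
    # ones wait just long enough for the waves to converge in phase.
    return [(d_max - d) / speed for d in distances]

# Hypothetical example: a 9-element arc of radius 4 cm focusing 5 cm away.
arc = [(math.cos(a) * 0.04, math.sin(a) * 0.04, 0.0)
       for a in [i * math.pi / 8 for i in range(9)]]
delays = focusing_delays(arc, (0.0, 0.0, 0.05))
```

By steering the focal point and modulating the output over time, such an array can render distinct effects (taps, swipes, persistent vibrations) at different spots on the mouth without any physical contact.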

The system was developed by Shen, Craig Shultz, a post-doctoral fellow in the Human-Computer Interaction Institute (HCII), and Chris Harrison, associate professor in the HCII and director of the Future Interfaces Group (FIG) Lab. The research won the best paper award at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI 2022). 

Shen said the haptics make the experience more immersive for users, with the tactile effects synchronized to the visual images in VR. The sensations, however, work only on the hands and mouth. They could not be felt, for instance, on the forearms or torso, because those body parts lack the density of nerve mechanoreceptors needed to perceive them, Shen said.

A variety of effects were evaluated by 16 volunteers, who generally reported that the mouth haptics enhanced the VR experience. Not all effects proved equally powerful, though. Raindrops blowing in through an open window and bugs walking across the lips were among the most successful. The feel of cobwebs brushing the face was weaker, because users expected to feel them on other parts of their bodies as well. Similarly, the effect of drinking from a water fountain was a little disorienting: the user feels the water, but it is not wet, Shen said.
