Microsoft Research finds a way to enable natural interaction between humans and robots



Microsoft Research today revealed a new project that will enable more natural interaction between humans and robots through Mixed Reality. Azure Spatial Anchors already supports colocalizing multiple HoloLens and smartphone devices in the same space using a shared coordinate system. With this project, Microsoft Research has extended Azure Spatial Anchors to support robots equipped with cameras.

This allows humans and robots sharing the same space to interact naturally: humans can see the plan and intention of the robot, while the robot can interpret commands given from the person’s perspective.

Microsoft has created a ROS wrapper for the Azure Spatial Anchors Linux SDK that lets robots (and other devices equipped with vision-based sensors and a pose estimation system) create and query Azure Spatial Anchors, enabling a robot to co-localize with AR-enabled phones and HoloLens devices. You can check out the project on GitHub.
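To illustrate the idea, here is a minimal sketch of how a robot-side ROS node might talk to such a wrapper. The service names (/create_anchor, /find_anchor) and the use of the generic std_srvs/Trigger type are assumptions for illustration only; the actual interface is defined in the GitHub project linked above.

```python
#!/usr/bin/env python
# Hypothetical sketch of a robot using an Azure Spatial Anchors ROS wrapper.
# Service names and types below are illustrative assumptions, not the real API.
import rospy
from std_srvs.srv import Trigger  # placeholder service type for this sketch


def main():
    rospy.init_node("asa_colocalization_demo")

    # Assumed service that creates an anchor at the robot's current pose,
    # so HoloLens devices and phones in the same space can later locate it.
    rospy.wait_for_service("/create_anchor")
    create_anchor = rospy.ServiceProxy("/create_anchor", Trigger)
    resp = create_anchor()
    rospy.loginfo("Anchor creation result: %s", resp.message)

    # Assumed service that queries for anchors placed by other devices,
    # letting the robot localize itself in the shared coordinate system.
    rospy.wait_for_service("/find_anchor")
    find_anchor = rospy.ServiceProxy("/find_anchor", Trigger)
    resp = find_anchor()
    rospy.loginfo("Anchor query result: %s", resp.message)


if __name__ == "__main__":
    main()
```

In practice, the wrapper would also publish the resulting transforms so the robot's planner and the human's AR view operate in the same coordinate frame.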

Source: Microsoft
