
Microsoft Research today revealed a new project that will enable more natural interaction between humans and robots through Mixed Reality. Azure Spatial Anchors already supports co-localizing multiple HoloLens and smartphone devices in the same space using a shared coordinate system. With this project, Microsoft Research has extended Azure Spatial Anchors to support robots equipped with cameras.
This allows humans and robots sharing the same space to interact naturally: humans can see the robot's plans and intentions, while the robot can interpret commands given from the person's perspective.
Microsoft has created a ROS wrapper for the Azure Spatial Anchors Linux SDK that allows robots (and other devices equipped with a vision-based sensor and a pose estimation system) to create and query Azure Spatial Anchors, enabling them to co-localize with AR-enabled phones and HoloLens devices. You can check out the project on GitHub.
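For context, a robot-side client for a wrapper like this would typically talk to it through ROS topics and services. The sketch below is a minimal, hypothetical example, not code from Microsoft's repository: the service name /asa_ros/create_anchor, the topic /asa_ros/found_anchor, and the message types are assumptions chosen for illustration only.

```python
#!/usr/bin/env python
# Hypothetical sketch of a robot-side client for an Azure Spatial Anchors ROS wrapper.
# The service/topic names and message types below are assumptions for illustration;
# consult the actual repository for the real interface.
import rospy
from geometry_msgs.msg import PoseStamped
from std_srvs.srv import Trigger


def on_found_anchor(msg):
    # Pose of a previously created anchor, re-observed by the robot's camera,
    # expressed in the shared (co-localized) coordinate frame.
    rospy.loginfo("Anchor located at x=%.2f y=%.2f z=%.2f",
                  msg.pose.position.x, msg.pose.position.y, msg.pose.position.z)


def main():
    rospy.init_node("asa_client_sketch")

    # Assumed topic: the wrapper publishes the pose of anchors it has found.
    rospy.Subscriber("/asa_ros/found_anchor", PoseStamped, on_found_anchor)

    # Assumed service: ask the wrapper to create an anchor at the robot's current pose.
    rospy.wait_for_service("/asa_ros/create_anchor")
    create_anchor = rospy.ServiceProxy("/asa_ros/create_anchor", Trigger)
    response = create_anchor()
    rospy.loginfo("Create anchor request: success=%s, message=%s",
                  response.success, response.message)

    rospy.spin()


if __name__ == "__main__":
    main()
```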
Source: Microsoft