Microsoft’s Cognitive Services can be difficult to wrap your head around, but in the video below VR game developer Human Interact explains exactly how Microsoft’s new speech recognition service and Intent Engine helped them create a game that reacts naturally to speech. In particular, the services let them upload custom dictionaries and discern the intent of speakers no matter which exact phrase is used to give a command.
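To give a feel for what “discerning intent no matter the exact phrase” means, here is a deliberately simple Python sketch of the idea: several different phrasings resolve to the same command intent. This is purely illustrative; it is not Microsoft’s actual Intent Engine or any real API, and every name in it is invented for the example.

```python
# Toy illustration of intent matching: many phrasings, one intent.
# This is NOT Microsoft's actual API; all names here are invented.

INTENT_KEYWORDS = {
    "SetCourse": {"course", "head", "navigate", "plot"},
    "FirePhasers": {"fire", "shoot", "attack"},
}

def detect_intent(utterance: str) -> str:
    """Return the first intent whose keywords overlap the utterance."""
    words = set(utterance.lower().replace(",", "").split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "None"

print(detect_intent("Plot a course for Earth"))  # SetCourse
print(detect_intent("Head home"))                # SetCourse
print(detect_intent("Fire everything we have"))  # FirePhasers
```

A real intent engine uses machine-learned models rather than keyword sets, which is exactly why, as described below, Human Interact chose not to build this themselves.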
Starship Commander is an interactive virtual reality science fiction game that gives players control over the narrative using real conversation.
At first, Human Interact attempted to build its own natural language processing technology, but later decided to rely on Microsoft’s machine learning services after struggling to achieve a good degree of accuracy.
Because Starship Commander uses invented words and place names, the company utilised the Custom Speech Service from Microsoft Cognitive Services, training it with a custom script so that characters in the game understand what players mean when they speak.
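The value of a custom vocabulary is easiest to see with a small sketch: a stock recogniser will mangle invented sci-fi terms, so the game maps likely mishearings back to its own dictionary. The Python below is only an illustration of that idea, not the real Custom Speech Service API, and the ship and planet names in it are hypothetical.

```python
# Toy sketch of vocabulary customisation: map likely misrecognitions of
# invented sci-fi terms back to the game's custom dictionary.
# Purely illustrative; not the real Custom Speech Service API,
# and all the terms below are invented for this example.

CUSTOM_VOCAB = {
    "careless five": "Kerilos V",  # hypothetical invented planet name
    "calypso": "Kalypso",          # hypothetical invented ship name
}

def apply_custom_vocab(transcript: str) -> str:
    """Replace misheard phrases with the game's custom terms."""
    result = transcript
    for heard, term in CUSTOM_VOCAB.items():
        idx = result.lower().find(heard)  # case-insensitive match
        if idx != -1:
            result = result[:idx] + term + result[idx + len(heard):]
    return result

print(apply_custom_vocab("Set course for careless five"))
# → "Set course for Kerilos V"
```

The actual service works at the acoustic and language-model level rather than by post-processing text, but the goal is the same: recognition that understands words that exist only in the game’s universe.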
Starship Commander should be released this month on the Oculus Store and SteamVR. See the full trailer below:
Microsoft is in the process of releasing 3 of its 21 Cognitive Services APIs to developers, and judging by the video, they could revolutionise how we use our PCs and phones.
Read more about Microsoft’s Cognitive Services APIs here.