Microsoft Research today presented Project Adam at the Microsoft Research Faculty Summit. It is a state-of-the-art machine learning and artificial intelligence system that enables software to visually recognize any object. Project Adam's object classifier was built by Microsoft researcher Trishul Chilimbi and his team on a massive dataset of 14 million images from the Web and sites such as Flickr, spanning more than 22,000 categories drawn from user-generated tags. Microsoft claims the system is twice as accurate at object recognition and 50 times faster than other systems. To show it in action, Microsoft brought a live dog on stage, and a phone powered by Project Adam recognized the dog's breed.
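At its core, a classifier like the one described above maps an image to a score for each of its thousands of category labels and reports the highest-scoring one. The sketch below is purely illustrative (it is not Microsoft's code, and the labels and scores are made up for the example); it only shows the final top-1 selection step, assuming some upstream model has already produced per-category scores.

```python
def top_category(scores, labels):
    """Return the label whose classifier score is highest (top-1 prediction)."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]

# Hypothetical scores a classifier might assign to a photo of a Dalmatian:
labels = ["Dalmatian", "Rhodesian Ridgeback", "Labradoodle", "not a dog"]
scores = [0.91, 0.04, 0.03, 0.02]
print(top_category(scores, labels))  # prints "Dalmatian"
```

In a real system the scores would come from a deep neural network trained on the labeled images; the selection step itself is this simple.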
The live demo of the dog-breed detector integrated Project Adam's technology into Cortana. Apacible pointed a phone at the Dalmatian, named Cowboy, and asked, "Cortana, what dog breed is this?" It was spot on, displaying the word "Dalmatian" on the phone's screen. Then he turned the phone toward Millie the Rhodesian Ridgeback, and Cortana asked him to take a picture of her; it nailed her breed as well. The audience applauded the success. Then came the Cobberdog, Ned. Project Adam thought he was a terrier, the audience thought he was a Labradoodle, and both were right: both breeds are found within the Cobberdog. Just to show that Adam knows the difference between people and dogs, Apacible pointed the phone at Shum. Cortana answered, "I believe this is not a dog."