Amazon’s Show and Tell feature will help visually impaired users identify objects

Amazon is introducing a new feature that helps people with vision impairments identify objects with ease. Called Show and Tell, the feature uses object recognition and the Echo Show's camera to identify objects.

The feature currently works with everyday household items and kitchen ingredients. It was built for people with visual impairments so they can identify and use everyday household items without relying on someone else. The feature is currently rolling out to first and second-generation Echo Show devices in the US. Once the update has been installed, Echo Show owners can simply hold an item in front of the device's camera and say, "Alexa, what am I holding?" to trigger the feature.

The whole idea for Show and Tell came about from feedback from blind and low vision customers. We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.

– Sarah Caplener, head of Amazon’s Alexa for Everyone team

Unfortunately, the feature is currently available only to Echo Show users in the US. Amazon has yet to disclose details about its availability outside the US.

