Back in 2017, Microsoft announced the Seeing AI app, which changed the lives of the blind and low-vision community. Seeing AI uses the power of AI to describe nearby people, text, and objects. Since its launch, Microsoft has added several new features to the app, and today it is announcing more. For example, users can now tap an image on a touch screen to hear a description of the objects within it. This update also brings iPad support for the first time. You can find the full change log below.
- Explore photos by touch: Leveraging the Custom Vision Service in tandem with the Computer Vision API, this new feature lets users tap their finger on an image on a touch screen to hear a description of the objects within it and the spatial relationships between them. Users can explore photos of their surroundings taken on the Scene channel, family photos stored in their photo browser, and even images shared on social media by summoning the options menu from within other apps.
- Native iPad support: For the first time, we're releasing iPad support to provide a Seeing AI experience tailored to the larger display. iPad support is particularly important to individuals using Seeing AI in academic or other professional settings where they are unable to use a cellular device.
- Channel improvements: Users can now customize the order in which channels are shown, enabling easier access to favorite features. We've also made it easier to access the face recognition function on the Person channel by relocating it directly to the main screen. Additionally, when analyzing photos from other apps, the app now provides audio cues to indicate that Seeing AI is processing the image.
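For readers curious how an app might obtain image descriptions like the ones above, here is a minimal sketch of building a request to the Azure Computer Vision "describe" operation, which returns a natural-language caption for an image. The endpoint and key values are placeholders, and this sketch only constructs the request without sending it; it is an illustration, not Seeing AI's actual implementation.

```python
import json
from urllib import request

# Placeholder values -- a real app would use the endpoint and key from
# its own Azure Cognitive Services resource.
ENDPOINT = "https://example.cognitiveservices.azure.com"
API_KEY = "<subscription-key>"

def build_describe_request(image_url: str) -> request.Request:
    """Build a POST request for the Computer Vision 'describe' operation,
    which captions the image at the given URL."""
    url = f"{ENDPOINT}/vision/v3.2/describe"
    body = json.dumps({"url": image_url}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Ocp-Apim-Subscription-Key": API_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_describe_request("https://example.com/family-photo.jpg")
print(req.full_url)  # → https://example.cognitiveservices.azure.com/vision/v3.2/describe
```

Sending the request (e.g. with `urllib.request.urlopen`) would return JSON containing candidate captions and confidence scores, which an accessibility app could then read aloud.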