Microsoft shares new accessibility efforts, highlights AI potential in 13th annual Ability Summit

Microsoft’s 13th annual Ability Summit covered a range of accessibility projects the company has developed recently. However, the main highlight of the event was AI and its potential benefits for people with disabilities.

Some topics tackled at the event include the new adaptive accessories (customizable 3D-printed attachments for the Surface Pen), Microsoft 365’s new “Accessibility Assistant,” 13 new African languages in Microsoft Translator, and a new Inclusive Design for Cognition Guidebook.

Microsoft also mentioned numerous improvements to its Seeing AI app, including a new collaboration, over 1,500 additions to its product code library, and a new Indoor Navigation feature. Further, the company stressed that LinkedIn is now more accessible, noting that “more than 40% of LinkedIn posts” now include at least one image and that the platform has added automatic alt-text descriptions and captioning.

Windows 11, as usual, was also highlighted at the event, particularly the new accessibility features added to it, such as extended Narrator support for more Braille displays. The software giant said the system keeps improving in the accessibility area, especially now that its voice access functionality is out of preview. It is also worth noting that the company continues to expand the feature to more languages and English dialects through Insider tests. In the recent Dev Build 23403, Microsoft detailed that voice access is now being tested in Chinese (Simplified and Traditional), French, German, Italian, Japanese, Portuguese (Brazil), and Spanish.

On the other hand, given Microsoft’s multi-billion-dollar investments in AI, it is no surprise that the company plans to leverage the technology to produce more accessibility features and tools in the future. In an interview with Forbes, Microsoft’s Chief Accessibility Officer Jenny Lay-Flurrie shared how AI can serve as accessibility tech in certain situations.

“[AI chatbots] collate so much information for you very, very quickly. It can save a lot of time,” Flurrie told Forbes. “If you think about someone from a mobility perspective, you can get the right level of information at your fingertips with a couple of clicks as opposed to having to conduct 10 to 20 different searches and go to multiple websites; it can be right there for you. It’s going to be very impactful for particularly neurodiversity… I think about dyslexia [and] dyspraxia. There’s a learning process to it. We’re definitely learning as we go [and learning] how to get the best out of the tools. I think there are some pretty profound implications.”
