Microsoft is showcasing advancements and partnerships across its range of assistive solutions at its 14th Ability Summit. Much of the news centers on Azure AI, including features unveiled yesterday such as AI-driven audio descriptions and Azure AI Studio, which makes it easier for developers with disabilities to build machine-learning apps. The company is also announcing improvements to its Seeing AI tool, including support for more languages and richer AI-generated descriptions, along with new playbooks that lay out best practices for designing accessible campuses and expanding mental health support.
The company is also previewing a feature called “Speak For Me,” which is coming later this year. Much like Apple’s Personal Voice, Speak For Me will let people with ALS and other speech disabilities use custom neural voices to communicate. Work on the project has been underway “for some time” with partners like the non-profit ALS organization Team Gleason, and Microsoft said it’s “committed to making sure this technology is used for good and plan to launch later in the year.” The company also shared that it’s working with Answer ALS and the ALS Therapy Development Institute (TDI) to “almost double the clinical and genomic data available for research.”
Copilot is getting some major accessibility updates this month. With new accessibility skills, users will be able to ask the assistant to launch Live Caption and Narrator, among other assistive tools. The company said the Accessibility Assistant feature, which was unveiled last year, will arrive “soon” in Outlook and PowerPoint; it’s currently available in the Insider preview for Microsoft 365 apps like Word. Microsoft is also releasing four new playbooks today, including a Mental Health toolkit that offers “advice for product makers to develop experiences that support mental health conditions, developed in collaboration with Mental Health America.”
Ahead of the summit, the company’s chief accessibility officer Jenny Lay-Flurrie spoke with Engadget to share more insight into the news, as well as her thoughts on generative AI’s role in building assistive products.
“In many ways, AI isn’t new,” she said, adding “this chapter is new.” Generative AI may be all the rage right now, but Lay-Flurrie believes that the core principle her team relies on hasn’t changed. “Responsible AI is accessible AI,” she said.
Still, generative AI could bring many benefits. “This chapter, though, does unlock some potential opportunities for the accessibility industry and people with disabilities to be able to be more productive and to use technology to power their day,” she said. She pointed to a survey the company ran with the neurodiverse community around Microsoft 365 Copilot; the feedback from the few hundred participants, Lay-Flurrie said, was that “this is reducing time for me to create content and it’s shortening that gap between thought and action.”
The idea of embracing new technology trends responsibly when designing for accessibility is never far from Lay-Flurrie’s mind. “We still need to be very principled, thoughtful and if we hold back, it’s to make sure that we are protecting those fundamental rights of accessibility.”
Elsewhere at the summit, Microsoft is featuring guest speakers such as actor Michelle Williams, who will discuss mental health, and its own employee Katy Jo Wright, who will share her experience living with chronic Lyme disease. Amsterdam’s Rijksmuseum will also share how it used Azure AI’s computer vision and generative AI to provide image descriptions of more than a million works of art for visitors who are blind or have low vision.
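Microsoft hasn’t published the details of the Rijksmuseum’s pipeline, but the basic building block it references is image captioning in Azure AI Vision’s Image Analysis API. The Python sketch below is a minimal, hypothetical illustration of that piece only: the endpoint, key and image URL are placeholders, and the generative-AI layer that enriches the descriptions is omitted.

```python
# Minimal sketch: generating an image description with Azure AI Vision's
# Image Analysis API (the "caption" feature). Endpoint, key and image URL are
# placeholders; this is not Microsoft's or the Rijksmuseum's actual pipeline.
from azure.ai.vision.imageanalysis import ImageAnalysisClient
from azure.ai.vision.imageanalysis.models import VisualFeatures
from azure.core.credentials import AzureKeyCredential

client = ImageAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                      # placeholder
)

# Ask the service for a one-sentence, human-readable caption of the artwork image.
result = client.analyze_from_url(
    image_url="https://example.com/artwork.jpg",  # placeholder image
    visual_features=[VisualFeatures.CAPTION],
    gender_neutral_caption=True,
)

if result.caption is not None:
    print(f"Description: {result.caption.text} (confidence {result.caption.confidence:.2f})")
```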