Empowering every person and organization on the planet to achieve more requires giving individuals ownership of the computing experience and leveraging technological advances to deliver products, tools, and services that people can make work for them, whatever their circumstances or abilities. Over the years, Microsoft Research has collaborated closely with people with disabilities and those who support them to thoughtfully and creatively innovate around this commitment to inclusive design and accessible technology. Below is a sampling of those efforts. To learn more, explore researchers’ experience developing the teachable AI tool Find My Things, or visit Microsoft Research for work in assistive technologies and beyond.
2024
Sep
Find My Things recognized for innovative design
Find My Things, the object recognition tool that can be personalized with a few videos of an item and is available in the Seeing AI mobile app, is a finalist in the accessible design and artificial intelligence categories of the Innovation by Design Awards from the US-based business media brand Fast Company. Find My Things was developed by members of the Microsoft Research Teachable AI Experiences (Tai X) team together with a group of citizen designers and was integrated into Seeing AI earlier in the year.
As part of Microsoft Research’s Accelerate Foundation Models Research (AFMR) initiative, a team from Waseda University is developing a system that will leverage vision and language foundation models to help people who are blind or have low vision with outdoor navigation.
Students in the United Kingdom between the ages of 5 and 11 test the advanced research prototype PeopleLens. The head-worn device leverages AI and spatialized audio to identify people in a room, helping users situate themselves in social scenarios and more confidently interact with those around them. During development, the PeopleLens research team identified the value personalization could add to such an experience. Given the complexity of social encounters, the team opted to examine personalization in a more straightforward application—object recognition—planting the seed for the personalizable object recognizer Find My Things.
Microsoft releases the ORBIT dataset and benchmark. The project invited members of the blind and low-vision community to contribute videos of personal items they interact with regularly. These videos were used to build a training dataset that is more inclusive of the objects people who are blind or have low vision might use, such as guide canes, and more representative of the variation in quality of images and videos captured by people who are blind or have low vision—common challenges in existing datasets.
Researchers with the Microsoft Research Ability Group receive an honorable mention at the 2020 ACM CHI Conference on Human Factors in Computing Systems for their haptic and auditory white cane for navigating large, complex virtual environments. The team created a scavenger hunt to test the cane’s ability to convey the shapes and surface textures of different virtual objects. The work is a follow-up to an earlier haptic cane controller and is part of a line of research dedicated to making virtual and augmented reality more accessible, including to people with varying levels of mobility and those who are deaf or have a hearing disability.
SeeingVR, a set of tools for improving the virtual reality experience for people with low vision, is made open source. The tools include visual and audio augmentations, such as magnification and brightness lenses and text-to-speech functionality.
Microsoft Research hosts a two-day interdisciplinary workshop to identify the opportunities and obstacles in the space of sign language recognition, generation, and translation and publishes key insights and findings from the workshop later in the year. The publication wins best paper at the International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS).
Rico was part of a group of students at New College Worcester in Worcester, UK, who participated in a beta test of the technology behind Code Jumper. Photo by Jonathan Banks for Microsoft.
Microsoft transfers the research and technology behind Code Jumper, a physical programming language developed as part of Project Torino, to the nonprofit American Printing House for the Blind for broad distribution.
A motorized wheelchair controllable by a user’s eye movements takes top prize at the first Microsoft companywide hackathon. The work led to the creation of the Enable Group to further develop the wheelchair and other assistive technologies for people with neuromotor disabilities. The motivation behind the “Eye Gaze Wheelchair” was a call to action from former NFL player Steve Gleason, who had been diagnosed with ALS a few years earlier and uses a wheelchair to get around. Gleason was seeking tech help to navigate more independently and interact with his family more easily. He was an integral part of the chair’s development, and his foundation leveraged the technology behind it to help bring eye-drive capability to market in 2019.
Timeline contributors: Mary Bellard, Neeltje Berger, Danielle Bragg, David Celis Garcia, Matt Corwine, Ed Cutrell, Kristina Dodge, Martin Grayson, Alyssa Hughes, Daniela Massiceti, Amanda Melfi, Ann Paradiso, Brenda Potts, Carly Quill, Katie Recken, John Tang, Patti Thibodeau, Amber Tingle, Saqib Shaikh, Manohar Swaminathan, Sarah Wang, Larry West, and Katie Zoller. Code Jumper photo by Jonathan Banks for Microsoft.