Three researchers from The Smith-Kettlewell Eye Research Institute describe emerging tools and technologies that allow people with visual impairments to better explore, interact with, and move through their environment. The first approach is echolocation, in which a person infers the overall structure of their surroundings from the way sounds reflect off objects in the environment. They also describe an audio map interface that enables non-visual exploration of maps on a computer or mobile device, and report how blind volunteers used it to virtually explore a playground and learn its layout. Finally, they demonstrate a new smartphone app that provides audio feedback about the specific location a user is pointing to on an object, and another app that gives accessible turn-by-turn directions in GPS-denied environments such as building interiors.
Santani Teng received his B.A. and M.A. in psychology from UC Davis and his Ph.D. in psychology from UC Berkeley in 2013. He was a postdoctoral researcher at the Massachusetts Institute of Technology before coming to Smith-Kettlewell as a fellow in 2017. Since January 2020, he has been an Associate Scientist at Smith-Kettlewell, where he investigates auditory spatial perception, haptics, echolocation, and assisted mobility in sighted and blind persons. He uses a combination of psychophysical, neurophysiological, engineering, and computational tools to better understand how we perceive the world, especially when vision is unavailable. In December 2019, Santani was awarded a grant from the E. Matilda Ziegler Foundation for the Blind and Visually Impaired to study the neural mechanisms of echolocation in blind persons using electroencephalography.
James M. Coughlan received his B.A. in physics from Harvard University in 1990 and completed his Ph.D. in physics there in 1998. He is currently a Senior Scientist at The Smith-Kettlewell Eye Research Institute. His main research focus is the use of computer vision and sensor technologies to make the physical environment more accessible to blind and visually impaired persons. Current and past accessibility projects include systems that provide audio-haptic access to physical objects such as documents and 3D models, find and read signs and other textual information, and assist with navigation indoors and at traffic intersections. He shared the 2020 Dr. Arthur I. Karshmer Award for Assistive Technology Research for his publication, “Towards Accessible Audio Labeling of 3D Objects,” which was recognized as the best submission to the Science/Research Journal Track of the CSUN 2020 Assistive Technology Conference. He was recently appointed to the National Advisory Eye Council, which advises the National Eye Institute on funding decisions, initiatives, and strategic planning.
Brandon Biggs received his B.A. in music from California State University, East Bay in 2016 and his Master’s in Inclusive Design from OCAD University in 2019. He joined Smith-Kettlewell in August 2019 to work with James Coughlan on his computer vision applications (the CamIO and indoor wayfinding apps), and he continues to develop the digital auditory map display that he built as part of his Master’s program. He is currently working with the Magical Bridge Foundation to install a “Magic Map,” which utilizes CamIO, the wayfinding application, and the digital auditory map. He is also founding a private business, XR Navigation, to commercialize the Magic Map and digital auditory map for companies and public spaces. Brandon is also the co-founder and CFO of Sonja Biggs Educational Services, Inc., a company that provides teachers of the blind and other blindness-related service providers to K-12 schools. His main focus is user experience, cross-sensory data representations, co-design, and sustainably putting the technology he works on into the hands of the people who need it.