Designed a navigational solution for the visually impaired in NYC
UX Design / UX Research / Motion Design
About 3.4 million people in the U.S. are blind or visually impaired, yet many lack navigational tools that would allow them to explore new environments and opportunities.
Vision impairment, often referred to as blind or low vision (BLV), covers a broad spectrum of medical conditions, including cataracts, glaucoma, and macular degeneration. With different levels of impairment come different comfort levels for traveling outside the home or exploring new environments.
Over 12 weeks in the summer, I worked with four other interns to define a solution that enables the BLV community to travel outside of their comfort zones.
We interviewed over 40 blind and visually impaired individuals in New York City and 20+ subject matter experts around the world.
Subject Matter Experts Interviews
We reached out to blind and low vision technology experts, local specialists, transportation government officers and auditors who work closely with the blind and visually impaired community. From them, we explored different emerging technologies, including 5G and voice UI, and learned how to design tactile maps and accessible user interfaces for people who are visually impaired.
Blind / Low Vision People Interviews
From our research with over 40 blind and low-vision people, we recognized the wide range of vision impairments and the relationship between a person's condition and their willingness to explore. From those conversations, we observed a few common problems, including pre-planning, path details, last-foot navigation, and sensory overload.
We hosted co-creation sessions with the visually impaired using tactile materials.
It's difficult to design for someone whose experiences you cannot directly, or even indirectly, identify with. Given that, why should designers create solutions and services for people with disabilities on their own, when members of the community could be co-designers in a participatory design process? With this in mind, we hosted a co-creation session with the visually impaired community.
We prototyped the form factor in parallel with the functionality, combining low-fi and hi-fi prototyping methods.
The Form Factor
We made paper prototypes of different types of wearables. By constructing models with low-fidelity materials, we were able to quickly and inexpensively make adjustments. As a result, we gained an experiential view of Thea’s visual attributes.
We eventually decided on a haptic pad packaged in the form of rectangular strips. These pads could be placed on any part of the body that the user deems optimal for feeling the haptic vibrations.
After settling on the haptic pad shape, we had to work out the nuances of the vibrational pulses by creating a pseudo-haptic language. The language's pulses should orient users and prompt them to turn in a certain direction or by a number of degrees, or to walk a specific distance. We wanted Thea to communicate information in a way that is intuitive and adapts to a user's body movements.
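To give a sense of how such a pseudo-haptic language might be structured, here is a minimal sketch in Python. Everything in it is a hypothetical illustration, not the actual Thea specification: the encodings (short pulses for right turns, long for left, one pulse per 45 degrees), the function names, and the `(duration, pause)` tuple format are all assumptions made up for this example.

```python
# Hypothetical sketch of a pseudo-haptic language: encoding navigation
# cues as (pulse_duration_ms, pause_ms) sequences for a wearable pad.
# The specific encodings below are illustrative assumptions only.

def encode_turn(degrees: int) -> list[tuple[int, int]]:
    """Encode a turn as one pulse per 45 degrees.

    Positive degrees = turn right (short pulses);
    negative degrees = turn left (long pulses).
    """
    steps = max(1, round(abs(degrees) / 45))
    duration = 100 if degrees >= 0 else 300
    return [(duration, 150)] * steps

def encode_walk(meters: float) -> list[tuple[int, int]]:
    """Encode 'walk forward' as a steady pulse repeated every 10 meters."""
    repeats = max(1, round(meters / 10))
    return [(500, 1000)] * repeats

# Example route: "turn right 90 degrees, then walk 20 meters"
route = encode_turn(90) + encode_walk(20)
```

The design choice this illustrates is the one described above: rather than spelling out directions in words, the pad conveys direction through pulse quality (short versus long) and magnitude through repetition, so the user can interpret a cue without breaking stride.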
Thea is a concept for an artificially intelligent, on-the-go navigation assistant for the blind and visually impaired.
Thea determines the most effective way to get around and guides users accordingly. As a system, Thea communicates granular directionality through a set of wearable haptic pads and a voice-activated user interface. By first interpreting natural speech, Thea then provides non-intrusive audio and tactile feedback. Ultimately, Thea helps people with vision impairments overcome mobility challenges and empowers them to navigate their world more confidently.