Designed a navigational solution for the visually impaired in NYC

UX Design / UX Research / Motion Design

Man Standing In Front of Subway



About 3.4 million people in the U.S. who are blind or visually impaired lack navigational tools that allow them to explore new environments and opportunities.

Vision impairment, commonly referred to as blindness or low vision (BLV), covers a broad spectrum of medical conditions, including cataracts, glaucoma, and macular degeneration. With different levels of impairment come different comfort levels for traveling outside the home or exploring new environments.

In 12 weeks over the summer, I worked with four other interns to define a solution that enables the BLV community to travel outside of their comfort zones.




We interviewed over 40 blind and visually impaired individuals in New York City and 20+ subject matter experts around the world.

Subject Matter Experts Interviews

Tactile map design of the Oculus Transportation Center

We reached out to blind and low-vision technology experts, local specialists, transportation government officers, and auditors who work closely with the blind and visually impaired community. With their guidance, we explored emerging technologies, including 5G and voice UI, and learned how to design tactile maps and accessible user interfaces for people who are visually impaired.

Blind / Low Vision People Interviews

Synthesizing findings after interviewing the visually impaired.

From our research with over 40 blind and low-vision people, we recognized the wide range of vision impairments and the relationship between a person's condition and their willingness to explore. From those conversations, we observed a few common problems: pre-planning path details, last-foot navigation, and sensory overload.




We hosted co-creation sessions with the visually impaired using tactile materials.

Co-Creation Session

It’s pretty difficult to design something for someone whose experiences you aren’t able to directly—or even indirectly—identify with. Given that, why do designers create solutions and services for people with disabilities when the members of the community could be co-designers in the participatory design process? With this in mind, we hosted a co-creation session with the visually impaired community.


During the co-creation session with the visually impaired, we used tactile materials such as Lego and Play-Doh to generate ideas through making.



We prototyped the form factor in parallel with the functionality, combining low-fi and hi-fi prototyping methods.


The Form Factor

We made paper prototypes of different types of wearables. By constructing models with low-fidelity materials, we were able to quickly and inexpensively make adjustments. As a result, we gained an experiential view of Thea’s visual attributes.

The final pad was made from silicone and KT tape.

We eventually decided on a haptic pad packaged in the form of rectangular strips. These pads could be placed on any part of the body that the user deems optimal for feeling the haptic vibrations.


Prototyping the haptic language using a micro-controller.

After we decided on the haptic pad shape, we had to figure out the nuances of the vibrational pulses by creating a pseudo-haptic language. The language's pulses should orient users and prompt them to turn a certain direction or number of degrees, or to walk a specific distance. We wanted Thea to communicate information in a way that is intuitive and that adheres to a user's body movements.
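As a rough illustration of how such a haptic language might be encoded for a micro-controller prototype, here is a minimal Python sketch. The cue names and pulse timings below are assumptions chosen for illustration, not the actual vocabulary Thea uses.

```python
# Hypothetical haptic vocabulary: each navigation cue maps to a pulse
# pattern (vibration duration, pause, and repetitions). Timings are
# illustrative assumptions, not Thea's real encoding.
from dataclasses import dataclass

@dataclass(frozen=True)
class Pulse:
    on_ms: int    # motor on-time per pulse, in milliseconds
    off_ms: int   # pause after each pulse, in milliseconds
    repeats: int  # number of times the pulse repeats

# Short single pulses for "keep going"; repeated pulses distinguish
# left from right; a long double pulse signals a stop.
HAPTIC_VOCAB = {
    "continue":   Pulse(on_ms=100, off_ms=900, repeats=1),
    "turn_left":  Pulse(on_ms=250, off_ms=250, repeats=2),
    "turn_right": Pulse(on_ms=250, off_ms=250, repeats=3),
    "stop":       Pulse(on_ms=600, off_ms=400, repeats=2),
}

def pattern_duration_ms(cue: str) -> int:
    """Total time the pad spends playing one cue's full pattern."""
    p = HAPTIC_VOCAB[cue]
    return (p.on_ms + p.off_ms) * p.repeats

def route_to_patterns(cues):
    """Translate an ordered list of navigation cues into pulse patterns."""
    return [HAPTIC_VOCAB[c] for c in cues]
```

A sketch like this makes it easy to tune durations and repetition counts during user testing before committing the patterns to firmware.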


Final Product

Thea is a concept for an artificially intelligent, on-the-go navigation assistant for the blind and visually impaired.

Thea understands the most effective way to get around and guides users accordingly. As a system, Thea communicates granular directionality through a set of wearable haptic pads and a voice-activated user interface. By first interpreting natural speech, Thea then provides non-intrusive audio and tactile feedback. Ultimately, Thea helps people with vision impairments to overcome mobility challenges and empowers them to more confidently navigate their world.



Team Members: Chanel Luu Hai, Lauren Fox, Alina Peng, Darshan Alatar, Alexis Trevizo

Advisors: John Payne, Jacob Pastrovich, Steph Rymer, Hanley Weng, Gena Hong, Caroline Brown