
Bridge AR
AR Design | Accessibility | Wearable Tech
Duration
April 2023 - May 2023
Team
Kanika Bansal
Chandan
Tools
Figma
Adobe Aero
Miro
My Role
UX research, ideation, wireframing (sketches and high-fidelity wireframes), and usability testing
There are approximately 466 million people worldwide – more than 5 percent of the world's population – with "disabling hearing loss," according to the World Health Organization.
Problem Area.
Imagine constantly feeling left out of conversations, unable to fully engage and connect with those around you. For people who are hard of hearing, this is a daily reality, leading to feelings of isolation and exclusion. The lack of accessible offline communication not only limits their participation in social, educational, and professional activities but also hinders the development of meaningful relationships.

Design Challenge
How Might We enable the hard of hearing to effectively communicate with hearing individuals in real time to improve accessibility and social interaction?
The Idea.
BRIDGE AR is an innovative platform that leverages wearable technology, seamlessly integrated with a mobile application. It utilizes advanced computer vision algorithms to translate sign language into speech in real-time. Simultaneously, it converts spoken words into subtitles, which are displayed in augmented reality (AR) in real-time, enhancing communication accessibility dynamically.


Literature Review.

A comprehensive literature review was conducted to gain an in-depth understanding of the barriers that hard-of-hearing individuals face.
Observations & Interviews.
Conducted user interviews with 3 participants of diverse ages (20-40) and occupational backgrounds to understand their perspectives. Additionally, online observations were performed on YouTube vlogs and students' online blogs to observe their communication habits and corroborate the interview findings.
"It usually takes me 6 months of talking to someone everyday to be able to lip-read them pretty well. I cant lip read strangers at all"


"I usually have a friend with me everywhere I go to translate for me. I get the never mind card a lot."
Current Communication Methods.
Offline Methods
Lip-Reading
Even the best lip readers can only understand about 30-40% of what is being said.
Interpreter
Interpreters are expensive and may not be available at all times.
Sign Language
Many people are not familiar with sign language, which leads to misunderstandings or miscommunication.
Assistive Listening Device
Can be uncomfortable to wear and may need to be adjusted frequently.
Mobile Applications
Google Live Transcribe
Ava
Rogervoice

Feature Analysis
I dug a little deeper to closely understand the offerings and feature sets of these mobile applications.
Google Live Transcribe is great for real-time transcription, Rogervoice for phone calls, and Ava for meetings and discussions.
Market Gap
Currently, there are no applications that can actively convert real-time sign language into text or speech. However, a lot of research is exploring the use of AI, machine learning, and deep learning, combined with image and video recognition, to identify the patterns and sequences in signing.
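To illustrate the kind of approach this research explores (purely a sketch, not something we built), the snippet below uses the open-source MediaPipe library to extract per-frame hand landmarks from a video. These landmark sequences are the raw features a sign-language recognizer would classify; the function name and video path are hypothetical.

```python
# Minimal sketch: extracting hand-landmark sequences from video,
# the kind of feature stream a sign-language recognizer would consume.
# Assumes the open-source MediaPipe and OpenCV packages; the function
# name and file path are illustrative, not from the BRIDGE AR project.
import cv2
import mediapipe as mp

def landmark_stream(video_path: str):
    """Yield (x, y, z) hand landmarks for each frame of a video."""
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.hands.Hands(max_num_hands=2) as hands:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                yield [(lm.x, lm.y, lm.z)
                       for hand in result.multi_hand_landmarks
                       for lm in hand.landmark]
    cap.release()

# A temporal model (e.g., an LSTM or transformer) would then classify
# windows of these landmark sequences into signed words or phrases.
```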
Affinity Mapping.
Based on data from the interviews, online observations, and student blogs, I identified recurring patterns and grouped them into broad categories.
KEEPING TABS
RELATIONSHIPS
FITTING IN
EXPECTATIONS
Findings.
5/6
Could not keep up with lip-reading
4/6
Missed out on important information in class
5/6
Did not like wearing "hearing aids"
3/6
Who used sign language as their primary language felt left out
Target Audience.
Through affinity mapping, we integrated diverse datasets and crafted a comprehensive persona representing our target audience.

" I hated wearing my hearing aid as a child, the older kids would call me a cyborg"
Dina is a 20-year-old senior at Indiana University.
- Heavily relies on lip-reading to understand what others are saying.
- Misses out on information in class no matter how closely she pays attention to the professor.
- Struggles to communicate in noisy environments; smiles or nods even when she doesn't understand what her peers said.
How Might We ?
We used the "How Might We" (HMW) method to start our ideation process, developing the following design directions:
- How Might We bridge the communication gap between the hard of hearing/deaf and their peers?
- How Might We help the hard of hearing store important information in classrooms?
Product Goals.
We identified key requirements for our product to tackle design challenges and aid the hard of hearing.
Sign Language Translation
Allowing the hard of hearing to choose their mode of communication would make them feel confident.
Spoken English to Text in Real-Time
The product should convert spoken English to text in real time for the hard of hearing to understand.
Record Important Information
The product should be able to take notes of important information that can be accessed later.
Tangible & Portable
The product should fit naturally into the user's lifestyle and be easy to carry around.
Brainstorming.
We held three brainstorming sessions over a week, each with its own mix of diverse, original, and out-of-the-box ideas.







The Idea.
AR Glasses and Mobile App for Real-Time Subtitles and Sign Language Translation
We propose a system consisting of two components: AR glasses and a mobile app.
- The AR glasses will capture the speech of people talking to the user and convert it into real-time subtitles using a speech-to-text engine, displayed in the user's field of view.
- When the user responds in sign language, the fisheye camera lens on the AR glasses will capture the signs, and the mobile app will translate the sign language into speech using computer vision algorithms, announced through the mobile device's speaker (this two-way flow is sketched below).
- The mobile application will offer additional support features, such as changing the subtitle size, color, and speed in AR, transcribing important lectures and meetings into notes, and saving contacts.
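To make the proposed flow concrete, here is a minimal sketch of the two-way pipeline. Every component interface here (stt, sign_model, tts, ar_display) is a hypothetical placeholder standing in for a real speech, vision, or AR engine; this illustrates the architecture, not our implementation.

```python
# Hedged sketch of the proposed two-way pipeline. All component
# interfaces (stt, sign_model, tts, ar_display) are hypothetical
# placeholders, not real libraries or project code.
from dataclasses import dataclass

@dataclass
class Subtitle:
    speaker_id: int  # which conversation partner is talking
    text: str        # transcribed speech shown in the glasses

class BridgePipeline:
    def __init__(self, stt, sign_model, tts, ar_display):
        self.stt = stt                # streaming speech-to-text engine
        self.sign_model = sign_model  # computer-vision sign classifier
        self.tts = tts                # text-to-speech on the phone
        self.ar_display = ar_display  # subtitle renderer in the glasses

    def on_audio_chunk(self, audio_chunk: bytes, speaker_id: int) -> None:
        """Glasses microphone -> transcription -> AR subtitle."""
        text = self.stt.transcribe(audio_chunk)
        if text:
            self.ar_display.show(Subtitle(speaker_id, text))

    def on_frame_window(self, frames: list) -> None:
        """Fisheye camera frames -> sign recognition -> spoken reply."""
        phrase = self.sign_model.predict(frames)
        if phrase:
            self.tts.speak(phrase)  # announced via the phone speaker
```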

Exploring the Idea


Fig: Exploring the idea of AR glasses with sketches.
Paper Prototype of the Mobile Application




Usability Testing.
We used the paper prototyping technique to perform a quick evaluation of our application. We conducted think-aloud sessions with 8 participants. Each of them was asked to perform the following tasks:
1. Wear the AR glasses and pair them with the app
2. Complete the on-boarding process
3. Use the AR glasses to have a one-on-one conversation
4. Record notes



We conducted a usability test with a deaf individual from the Deaf and Hard of Hearing Services.
Our other participants simulated the experience of being deaf by wearing headphones playing loud music for the purpose of this test.
" But how would I know that the person called my name?"
" How can I know what i said in sign language is translated correctly "
"It would be cool to see the status of my AR glasses on the app"
Information Architecture

Wireframes.









Branding Style Guide.


Usability for AR.
For the UX design of this application, I drew guidance from the article "The Usability of Augmented Reality" by NNG. This ensured that our design adhered closely to established usability guidelines and principles.

Consider Users’ Context and Limitations
The user won't be able to hear when someone starts talking, so a microphone icon appears next to the person speaking to signify that they are addressing the user.

Use Clear Instructions and Signifiers
An arrow, shown in the image above, gives the user a clear signifier when someone calls their name.

Ensure that Text and Controls Are Visible Across Different Backgrounds
To ensure the text is readable in different environments, the subtitles are displayed with a highlight color.
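As a rough illustration of this guideline (not the project's actual rendering code), the sketch below uses the Pillow imaging library to draw a subtitle over a translucent highlight box so it stays legible regardless of the background:

```python
# Illustrative sketch (assumed approach, not from the project): draw a
# subtitle with a translucent highlight box so the text stays legible
# over any AR background. Uses the Pillow imaging library.
from PIL import Image, ImageDraw

def draw_subtitle(frame: Image.Image, text: str) -> Image.Image:
    draw = ImageDraw.Draw(frame, "RGBA")  # "RGBA" enables alpha blending
    # Measure the text, then pad a dark highlight box behind it.
    left, top, right, bottom = draw.textbbox((20, frame.height - 60), text)
    draw.rectangle((left - 8, top - 4, right + 8, bottom + 4),
                   fill=(0, 0, 0, 180))  # contrast independent of scenery
    draw.text((20, frame.height - 60), text, fill="white")
    return frame

# Stand-in for a camera frame; a real system would use the glasses' feed.
frame = Image.new("RGB", (640, 360), "skyblue")
draw_subtitle(frame, "Nice to meet you!")
```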
Final Product.

AR Image Generated using DALL-E

Reflection.
User-Centric Design
A key takeaway from this project was the importance of user-centric design in AR. It was crucial to continuously test and iterate on the designs based on user feedback. This process highlighted the unique challenges of AR UX, such as designing for different environments and ensuring information is accessible but not overwhelming.

Learning the Technology
The technical aspect of the project involved getting hands-on with AR development tools. I explored different AR frameworks and software, which was a significant learning curve. This allowed me to appreciate the complexities involved in creating seamless AR experiences, such as spatial recognition and the integration of digital content with the physical world.

Future Applications
This project opened my eyes to the potential applications of AR beyond entertainment, such as in education and healthcare. It has inspired me to think about how AR can be leveraged to solve real-world problems by enhancing the way we interact with information and our surroundings.