Better real-time captioning for deaf & hard of hearing students
Microsoft Research sponsored my graduate capstone, which aimed to improve real-time captioning for deaf and hard of hearing (DHH) students. The work was published and directly influenced Microsoft Translator.
Microsoft Research discovered that its language translation technology could also benefit deaf and hard of hearing users as a real-time captioning tool. We investigated how their system could better support DHH scenarios.
Having no previous experience designing for deaf users, I knew user research would be especially critical. I met with university and middle school students, instructors, and accessibility services coordinators.
- Classroom observations
- Baseline usability testing
- Diary studies
Data Analysis & User Definition
We used affinity diagrams to distill insights, define goals and design requirements, and conduct a stakeholder analysis.
Design & Prototyping
Measuring against the requirements and research insights, I iterated from rough sketches to simple wireframes and then to mid-fidelity designs. While not pixel perfect, the mid-fi designs were ideal to string together into a prototype and begin evaluating with stakeholders.
Co-design & Evaluation
We brought students, instructors, and technology providers together in a co-design workshop to gather more information about their experiences, brainstorm solutions, and evaluate our designs.
Delivering the Design
After honing our designs at mid-fidelity, I created high-fidelity comps with Illustrator following Windows design patterns for a pixel-perfect representation of our solution.
Real Life Outcomes!
The reference designs, research insights, and design requirements helped shape Microsoft Translator, and we later conducted usability studies evaluating the new product with deaf and hard of hearing students and university instructors.
A few months later, I saw the Translator team at a restaurant conversing with a deaf colleague through the app on their phones! It's an amazing product that I'm proud to have had a small part in shaping.