When we began working on Audio Aid, the goal was clear: to bridge the sensory gap for individuals with hearing impairments using the power of spatial computing. Built for Apple Vision Pro, Audio Aid classifies environmental sounds in real time and delivers smart, directional notifications that help users regain awareness of their surroundings, whether they’re at home, at work, or navigating busy public spaces.
As the Design Lead on this project, I took on the challenge of translating this vision into a thoughtful, human-centered interface. I started by building deep empathy: conducting informal interviews, researching assistive technologies, and constructing personas that represented diverse hearing-loss experiences. From there, I moved into storyboarding daily interaction scenarios, identifying moments where sound becomes critical: a doorbell ringing, someone calling your name, or a kettle boiling over.
(Initial screen designs on Figma)
Designing for visionOS opened up both opportunities and constraints. The interface had to be minimal but context-rich, readable in a mixed-reality space without overwhelming the user. I adopted Apple’s specular glass aesthetic, ensuring that notifications felt native to the immersive environment: light, floating, and spatially grounded. Each detected sound is paired with multimodal cues (visual overlays, directional indicators, text transcripts, and gentle haptic nudges) to ensure accessibility across varied levels of hearing loss.
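As a rough sketch of how one of these glass-style notification cards might be composed in SwiftUI, assuming a simple alert model (the type and property names here are illustrative, not the project’s actual code):

```swift
import SwiftUI

// Hypothetical alert model — names are illustrative, not from the project.
struct SoundAlert: Identifiable {
    let id = UUID()
    let label: String        // e.g. "Doorbell"
    let transcript: String   // short text description or live caption
    let direction: Angle     // bearing of the sound source relative to the user
}

// A minimal floating notification card in the native visionOS glass style.
struct AlertCard: View {
    let alert: SoundAlert

    var body: some View {
        HStack(spacing: 12) {
            // Directional indicator: an arrow rotated toward the sound source.
            Image(systemName: "arrow.up")
                .rotationEffect(alert.direction)
                .font(.title2)
            VStack(alignment: .leading) {
                Text(alert.label).font(.headline)
                Text(alert.transcript).font(.caption)
            }
        }
        .padding()
        // visionOS glass material, so the card reads as part of the environment.
        .glassBackgroundEffect(in: .rect(cornerRadius: 16))
    }
}
```

The `glassBackgroundEffect` modifier is what gives a view the system glass material on visionOS; everything else is ordinary SwiftUI layout.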
I prototyped the experience in layered iterations, using SwiftUI and RealityKit references to mimic the spatial layout. We tested how users responded to floating cards, edge-of-vision alerts, and depth-based prioritization. Every detail, from contrast ratios to gesture-based acknowledgment interactions, was evaluated against accessibility standards and real-world feasibility.
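Depth-based prioritization in the prototype can be summarized as a simple ordering rule: more urgent sounds surface first, and among equally urgent sounds, nearer ones win. A minimal sketch of that rule, with an entirely hypothetical model (field names are mine, not the project’s):

```swift
import Foundation

// Hypothetical detection model — fields are illustrative assumptions.
struct DetectedSound {
    let label: String
    let urgency: Int      // classifier-assigned urgency; higher is more critical
    let distance: Double  // estimated distance from the user, in metres
}

// Depth-based prioritization sketch: sort by urgency, then by proximity.
func prioritize(_ sounds: [DetectedSound]) -> [DetectedSound] {
    sounds.sorted {
        if $0.urgency != $1.urgency { return $0.urgency > $1.urgency }
        return $0.distance < $1.distance
    }
}
```

In testing terms, this is the kind of rule we validated against user expectations: a distant fire alarm should still outrank a nearby kettle.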
Working in Agile sprints, I collaborated closely with our developers: mapping user stories to functionality, balancing user needs with device constraints, and co-prioritizing features. Every sprint cycle included usability testing, feature validation, and internal feedback loops, ensuring we stayed aligned with both our accessibility mission and our technical capabilities.
Designing Audio Aid taught me that accessibility is clarity, not complexity. From leading the design system end-to-end to adapting feedback into fast-paced iterations, I learned how to craft a product that’s not only functional but respectful of the people it serves. Most importantly, it showed me how immersive design can enhance (not replace) real-life human experience, especially when done with care, intent, and inclusivity.