Audio Aid is an accessibility-focused iOS application developed for visionOS in collaboration with the National Acoustic Laboratories (NAL). The app classifies environmental sounds in real time and delivers smart notifications, helping individuals with hearing impairments regain situational awareness in both public and private spaces.
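The case study stays at the design level, but as context for the classification-plus-notification flow described above, here is a minimal sketch of how such a pipeline could be wired up on Apple platforms using the SoundAnalysis and UserNotifications frameworks. The framework choice, the sound labels, and the confidence threshold are illustrative assumptions, not details taken from the actual project.

```swift
import AVFoundation
import SoundAnalysis
import UserNotifications

/// Receives classification results and posts a local notification
/// when a sound of interest is detected with high confidence.
final class SoundAlertObserver: NSObject, SNResultsObserving {
    // Hypothetical labels; the real label set depends on the classifier used.
    private let soundsOfInterest: Set<String> = ["siren", "door_bell", "smoke_detector"]

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8,                       // assumed threshold
              soundsOfInterest.contains(top.identifier) else { return }

        let content = UNMutableNotificationContent()
        content.title = "Sound detected"
        content.body = "Heard: \(top.identifier)"
        let notification = UNNotificationRequest(identifier: UUID().uuidString,
                                                 content: content,
                                                 trigger: nil)
        UNUserNotificationCenter.current().add(notification)
    }
}

/// Streams microphone audio into the built-in sound classifier.
final class SoundClassifier {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?
    private let observer = SoundAlertObserver()

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: observer)
        self.analyzer = analyzer

        // Feed each microphone buffer to the analyzer as it arrives.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }
}
```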
As the Design Lead, I was responsible for shaping the entire UI/UX system, from initial sketches, user stories, and personas to wireframing, prototyping, and iterative user testing. The interface is designed for both simplicity and clarity, using high-contrast elements, multimodal cues (visual, text, vibration), and spatial gestures.
This project was built using the Agile Scrum methodology over multiple sprints. I also coordinated closely with developers to map user stories to functional priorities and to ensure each design decision met genuine accessibility needs. The design process evolved through internal feedback loops, accessibility benchmarking, and feature validation against real user scenarios.
Audio Aid showed me the importance of designing for clarity, not just usability. As the only designer, I learned how to create a cohesive experience from scratch, align it with accessibility standards, and collaborate with developers to bring that vision to life, all while adapting to evolving client needs and technical challenges.