
Revolutionizing Navigation for the Visually Impaired
Overview
SOAL (Smart Orientation and Assistance Locator) is an innovative wearable assistive device designed to empower Blind and Low Vision (BLV) individuals with unprecedented navigational independence. This academic research project, developed as part of UMBC's HCC741 course, reimagines how AI-powered wearable technology can bridge the gap between existing assistive tools and the complex needs of the BLV community.
Team
Role
Duration
Tools
How might we transform assistive technology by developing an integrated wearable device that enhances independence and safety for Blind and Low Vision individuals?
Blind and Low Vision (BLV) individuals face significant challenges in navigating their environments independently and safely. Existing assistive technologies often fall short of providing real-time feedback, comprehensive object detection, and accurate navigation. These solutions are fragmented, resulting in a lack of seamless integration and an increased risk of accidents or disorientation. SOAL addresses these critical gaps through user-centered design and innovative technology integration.

The Impact
Over 285 million people worldwide live with visual impairments, yet current navigation solutions force users to choose between comprehensive assistance and social discretion. This fragmented landscape not only limits independence but also creates safety risks in dynamic environments. By developing an integrated solution, SOAL has the potential to transform daily life for millions while establishing new standards for inclusive technology design.
Empathize
User Research
Our research phase involved a comprehensive analysis of existing solutions and their limitations. We conducted extensive literature reviews, examined 15+ existing assistive technologies, and analyzed feedback from current users of BLV technology.
Key Research Insights
85% of users prioritized real-time environmental feedback over post-navigation analysis
Privacy concerns were paramount: users wanted assistance without data collection
Comfort and discretion ranked higher than advanced features in daily use scenarios
Multi-modal feedback (audio + tactile) was preferred over audio-only solutions
Pain Points Identified
Define
Ideate
Key Design Decisions
Prototype
Low Fidelity Design
The low-fidelity sketch in the notebook served as a foundational step in conceptualizing SOAL's design and key features. It focused on establishing how the device would look and where essential components should be placed for optimal usability. The sketch provided a clear visualization of how the device's form factor would blend minimalism with functionality, ensuring that SOAL would be both practical and comfortable for the user.
Design Review and Iteration
Following the initial sketch, the design of SOAL was developed as a 3D model and reviewed, which led to several significant improvements. The review confirmed the design's effectiveness and highlighted areas for refinement. Key changes included adding a NAV belt slot, removing the voice and scan buttons to minimize reliance on touch, and incorporating a large voice activation area. The sensor was repositioned next to the camera, a customizable touchpad was added for simple inputs, and a voice activation button was placed on the side. These updates enhanced the device's usability and accessibility for BLV individuals.
High Fidelity Design
The final design focused on delivering an optimal user experience through meticulously crafted components:
Strategically Positioned Cameras: Accurate environmental scanning
Belt Slot & Eject Mechanisms: Easy attachment and removal of SOAL Belt
Intuitive Touch Pad: Gesture-based control options
Magnetic Base: Secure yet flexible attachment to clothing
Charging Port: Efficient, widespread charging compatibility
Voice Activation: Hands-free operation with natural language processing
Real-time Sensors: Obstacle detection and environmental awareness
SOAL Belt: Comfortable vibrational feedback distribution for spatial awareness (see the conceptual sketch after this list)
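The case study stops at the industrial design, but to make the feedback concept concrete, the following is a minimal, hypothetical sketch (in Python) of how readings from the real-time sensors could be distributed as vibration intensities across the SOAL Belt's motors. The motor layout, ranges, and names (ObstacleReading, belt_intensities, MOTOR_ANGLES_DEG) are assumptions for illustration only, not part of SOAL's actual implementation.

```python
# Hypothetical sensor-to-belt feedback sketch (illustration only, not SOAL's
# real firmware): obstacle readings from front-facing sensors are mapped to
# vibration intensities on belt motors, so closer obstacles in a given
# direction produce stronger haptic cues on that side of the belt.

from dataclasses import dataclass

@dataclass
class ObstacleReading:
    angle_deg: float      # bearing relative to the wearer, -90 (left) to +90 (right)
    distance_m: float     # estimated distance to the obstacle

# Assumed belt layout: five motors spread across the wearer's front.
MOTOR_ANGLES_DEG = [-60.0, -30.0, 0.0, 30.0, 60.0]
MAX_RANGE_M = 3.0         # readings beyond this range are ignored
MOTOR_SPREAD_DEG = 30.0   # how far a cue "bleeds" into neighbouring motors

def belt_intensities(readings: list[ObstacleReading]) -> list[float]:
    """Return a 0.0-1.0 vibration intensity for each belt motor."""
    intensities = [0.0] * len(MOTOR_ANGLES_DEG)
    for r in readings:
        if r.distance_m >= MAX_RANGE_M:
            continue
        # Closer obstacles produce stronger vibration.
        strength = 1.0 - (r.distance_m / MAX_RANGE_M)
        for i, motor_angle in enumerate(MOTOR_ANGLES_DEG):
            # Attenuate by angular distance between the obstacle and this motor.
            overlap = max(0.0, 1.0 - abs(r.angle_deg - motor_angle) / MOTOR_SPREAD_DEG)
            intensities[i] = max(intensities[i], strength * overlap)
    return intensities

if __name__ == "__main__":
    # One obstacle 1.2 m ahead and slightly right, another farther away on the left.
    print(belt_intensities([
        ObstacleReading(angle_deg=20.0, distance_m=1.2),
        ObstacleReading(angle_deg=-55.0, distance_m=2.5),
    ]))
```

In a design like this, mapping direction to motor position and distance to intensity keeps the haptic channel glanceable without speech, which aligns with the research finding that users preferred multi-modal (audio + tactile) feedback over audio-only solutions.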
Test
Initial Testing Approach
We began by testing prototypes ourselves using HCC (Human-Centered Computing) design principles learned in class, focusing on usability heuristics and accessible design guidelines. This internal evaluation helped us identify major usability issues before engaging with the BLV community.
Accessible Design Learning Integration
Our coursework in accessible design informed key decisions around inclusive interaction design, ensuring SOAL met WCAG guidelines and universal design principles from the ground up.