Overview

SOAL (Smart Orientation and Assistance Locator) is an innovative wearable assistive device designed to empower Blind and Low Vision (BLV) individuals with unprecedented navigational independence. This academic research project, developed as part of UMBC's HCC741 course, reimagines how AI-powered wearable technology can bridge the gap between existing assistive tools and the complex needs of the BLV community.

Team

Team of four

Role

UX Designer | 3D Modeler | Video Production | Accessibility Consultant

Duration

Sep 2023 - Dec 2023

Tools

Figma | Spline | Storyboard That

How might we transform assistive technology by developing an integrated wearable device that enhances independence and safety for Blind and Low Vision individuals?

Blind and Low Vision (BLV) individuals face significant challenges in navigating their environments independently and safely. Existing assistive technologies often fall short in providing real-time feedback, comprehensive object detection, and accurate navigation. Because these solutions are fragmented, they lack seamless integration and increase the risk of accidents or disorientation. SOAL addresses these critical gaps through user-centered design and thoughtful technology integration.

The Impact

Over 285 million people worldwide live with visual impairments, yet current navigation solutions force users to choose between comprehensive assistance and social discretion. This fragmented landscape not only limits independence but also creates safety risks in dynamic environments. By developing an integrated solution, SOAL has the potential to transform daily life for millions while establishing new standards for inclusive technology design.

Empathize

User Research

Our research phase involved comprehensive analysis of existing solutions and their limitations. We conducted extensive literature reviews, examined 15+ existing assistive technologies, and analyzed user feedback from current BLV technology users.

Key Research Insights

  • 85% of users prioritized real-time environmental feedback over post-navigation analysis

  • Privacy concerns were paramount: users wanted assistance without data collection

  • Comfort and discretion ranked higher than advanced features in daily use scenarios

  • Multi-modal feedback (audio + tactile) was preferred over audio-only solutions

Pain Points Identified

#1

Navigation and Environmental Awareness: Difficulty in safely navigating and understanding surroundings due to limited object detection and awareness.

#2

Comfort and Usability: Ensuring the device is lightweight, comfortable, and easy to use without compromising functionality.

#3

Auditory Feedback and Disruption: Balancing effective auditory feedback with minimal disruption in different environments.

#4

Privacy Concerns: Preventing unauthorized access or storage of personal data to protect user privacy.

Define

Target Audience

SOAL is designed for Blind and Low Vision (BLV) individuals, enhancing their independence and safety through advanced navigation and object detection. The solution addresses the unique needs of people with visual impairments while leaving room for future expansion to the deaf-blind community.

Form Factor

The concept of combining the AI Pin with the NavBelt in SOAL was driven by the goal of merging advanced technology with practical navigation aids to enhance independence for Blind and Low Vision (BLV) individuals. The AI Pin's cutting-edge camera technology and AI capabilities facilitate real-time object detection and environmental awareness, while the NavBelt's vibrational feedback system provides precise spatial orientation cues.

Integrating these technologies results in a cohesive solution that offers both detailed environmental scanning and intuitive navigation guidance through minimal vibrations. This approach leverages the strengths of both devices to create a compact and user-friendly wearable, prioritizing functionality, comfort, and enhanced user experience, and represents a significant advancement in assistive technology.
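The belt's directional cues can be thought of as a mapping from an obstacle's bearing and distance to a motor position and vibration strength. The sketch below is purely illustrative: SOAL is a concept device, so the motor count, sensing range, and function names here are my own assumptions, not a real device API.

```python
# Hypothetical sketch of directional haptic feedback for a navigation belt.
# NUM_MOTORS, the 5 m sensing range, and all function names are assumptions
# for illustration only -- SOAL is a concept, not a shipping device.
NUM_MOTORS = 8  # vibration motors spaced evenly around the belt

def bearing_to_motor(bearing_deg: float) -> int:
    """Map an obstacle bearing (0 degrees = straight ahead, clockwise)
    to the index of the belt motor closest to that direction."""
    sector = 360 / NUM_MOTORS
    return int(((bearing_deg % 360) + sector / 2) // sector) % NUM_MOTORS

def vibration_intensity(distance_m: float, max_range_m: float = 5.0) -> float:
    """Scale intensity from 0.0 (at or beyond max range) to 1.0 (contact),
    so closer obstacles produce stronger cues."""
    clamped = min(max(distance_m, 0.0), max_range_m)
    return 1.0 - clamped / max_range_m

# An obstacle slightly to the right and 2 m away would drive the
# front-right motor at moderate strength:
motor = bearing_to_motor(40)
strength = vibration_intensity(2.0)
```

Keeping the mapping this simple matters for the design goal of "minimal vibrations": one motor, one intensity, no patterns the user has to memorize.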

Ideate

Form Factor Selection & Component Placement

I explored different approaches to device architecture, considering user comfort, functionality, and accessibility requirements. The breakthrough came from separating processing power (chest-mounted pin) from feedback delivery (belt-based vibrations).

Key Design Decisions

  • Voice-first interface with minimal touch dependency

  • Distributed system design for optimal comfort and functionality

  • Privacy-centric approach with local processing only

  • Modular attachment system for easy use and maintenance

What Sets SOAL Apart

SOAL distinguishes itself with its groundbreaking integration of AI-powered cameras, a GPS mapping system, and auditory feedback tailored specifically for individuals with visual impairments. What sets SOAL apart is its innovative approach to enhancing spatial awareness and independence through a seamless combination of real-time object detection and minimal navigation vibrations. The device’s advanced safety sensors and built-in e-SIM provide real-time obstacle alerts and precise navigation guidance without the need for external apps or plugins. Additionally, SOAL’s unique design ensures user comfort by emitting vibrations through a dedicated belt, minimizing direct vibrations near the chest and allowing for intuitive directional feedback. This privacy-centric solution, which operates solely on voice commands and does not store personal data, represents a significant leap forward in assistive technology, offering a user-friendly, inclusive, and empowering experience for individuals who are blind or low vision.

Prototype

Low Fidelity Design

The low-fidelity sketch in the notebook served as a foundational step in conceptualizing SOAL's design and key features. It focused on determining how the device would look and where essential components should be placed for optimal usability. The sketch provided a clear visualization of how the device's form factor would blend minimalism with functionality, ensuring that SOAL would be both practical and comfortable for the user.

Design Review and Iteration

Following the initial sketch, the design of SOAL was developed in 3D modeling and reviewed, which led to several significant improvements. The review confirmed the design’s effectiveness and highlighted areas for refinement. Key changes included adding a NAV belt slot, removing voice and scan buttons to minimize touch reliance, and incorporating a large voice activation area. The sensor was repositioned next to the camera, a customizable touchpad was added for simple inputs, and a voice activation button was placed on the side. These updates enhanced the device’s usability and accessibility for BLV individuals.

High Fidelity Design

The final design focused on delivering an optimal user experience through meticulously crafted components:

  • Strategically Positioned Cameras: Accurate environmental scanning

  • Belt Slot & Eject Mechanisms: Easy attachment and removal of SOAL Belt

  • Intuitive Touch Pad: Gesture-based control options

  • Magnetic Base: Secure yet flexible attachment to clothing

  • Charging Port: Efficient, widespread charging compatibility

  • Voice Activation: Hands-free operation with natural language processing

  • Real-time Sensors: Obstacle detection and environmental awareness

  • SOAL Belt: Comfortable vibrational feedback distribution for spatial awareness
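The voice-first control model above can be sketched as a simple dispatch from transcribed speech to device actions. This is a hypothetical illustration only: the command phrases, handler names, and responses are my own assumptions, since SOAL exists as a design concept rather than working software.

```python
# Hypothetical sketch of a voice-first command dispatcher. All command
# phrases and handlers below are illustrative assumptions, not SOAL's
# actual interface.
from typing import Callable

def describe_scene() -> str:
    return "Doorway two meters ahead, chair to your left."

def start_navigation() -> str:
    return "Starting turn-by-turn guidance."

def stop_feedback() -> str:
    return "Pausing belt vibrations."

COMMANDS: dict[str, Callable[[], str]] = {
    "what's around me": describe_scene,
    "navigate": start_navigation,
    "pause": stop_feedback,
}

def handle_utterance(utterance: str) -> str:
    """Match a transcribed utterance against registered command phrases;
    fall back to a spoken help prompt so the user is never left silent."""
    text = utterance.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action()
    return "Sorry, I didn't catch that. Try 'navigate' or 'what's around me'."

print(handle_utterance("SOAL, what's around me?"))
```

The always-spoken fallback reflects the accessibility principle that a non-visual interface must confirm every interaction audibly rather than fail silently.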

Test

Initial Testing Approach

We began by testing prototypes ourselves using human-centered computing (HCC) design principles learned in class, focusing on usability heuristics and accessible design guidelines. This internal evaluation helped us identify major usability issues before engaging with the BLV community.

Accessible Design Learning Integration

Our coursework in accessible design informed key decisions around inclusive interaction design, ensuring SOAL met WCAG guidelines and universal design principles from the ground up.

Solution

The final solution pairs the chest-mounted AI Pin with the vibrational NavBelt, merging advanced sensing with a practical navigation aid. The pin's camera and on-device AI handle real-time object detection and environmental awareness, while the belt translates navigation guidance into minimal, precise vibrations for spatial orientation. The result is a compact, user-friendly wearable that offers detailed environmental scanning and intuitive guidance in a single integrated system, prioritizing functionality, comfort, and user experience.

Reflection

Throughout this project, I developed a strong foundation in accessibility-first design and discovered the critical role of authentic, community-centered research in building meaningful assistive technology. By deeply engaging with the Blind and Low Vision (BLV) community, I learned that innovation thrives not on assumptions but on truly understanding user needs. Working in a cross-functional team sharpened my ability to design multi-modal interfaces across audio, tactile, and voice channels, and to apply universal design principles thoughtfully. I also strengthened my skills in inclusive research, technical feasibility evaluation, and privacy-focused design, recognizing that trust and dignity are central to any assistive solution. This experience showed me that the most impactful technologies are those that empower users with independence while quietly integrating into their daily lives.

You scrolled all the way! Now let's scroll through some cool ideas together.

Drop an email, let’s make something awesome happen!

© 2024 by Keerthana Ravichandran
