Revolutionizing Navigation for the Visually Impaired
Team

Keerthana Ravichandran, UX Designer
Aisvarya Sudaram, 3D Modeler
Faizan Shiek, Video Production
Rowena Pinto, Accessibility Consultant

Duration

Sep 2023 - Dec 2023

Tools

Figma
Google Scholar
Spline
Storyboard That

Overview

SOAL (Smart Orientation and Assistance Locator) is a groundbreaking wearable device designed to revolutionize accessibility for Blind and Low Vision (BLV) individuals. Combining AI-powered cameras, real-time GPS mapping, auditory feedback, and advanced safety sensors, SOAL transforms how users navigate their world. Its sleek, lightweight design features a minimalistic interface with intuitive voice commands and a vibration-guided belt for clear spatial awareness. By seamlessly integrating cutting-edge technology with user-centered design, SOAL empowers users with unprecedented independence and safety, paving the way for a more inclusive future. Explore how SOAL is redefining assistive technology and enhancing daily life for the BLV community.

The Problem

Blind and Low Vision (BLV) individuals face significant challenges in navigating their environments independently and safely. Existing assistive technologies often fall short in providing real-time feedback, comprehensive object detection, and accurate navigation. These solutions are often fragmented, which results in a lack of seamless integration and increases the risk of accidents or disorientation.

The Goal

The primary goal of SOAL is to transform assistive technology by developing an integrated wearable device that enhances the independence and safety of Blind and Low Vision (BLV) individuals. SOAL seeks to achieve this by offering real-time object detection and environmental scanning through AI-powered cameras, enabling accurate navigation with GPS and safety sensors, and providing auditory feedback and minimal vibrations to guide users effectively while ensuring user privacy and comfort with its lightweight design.

The Scope of Work

The project begins with research and development to analyze current assistive technologies and their limitations. This is followed by designing and prototyping a lightweight wearable device that integrates AI-powered cameras, GPS, auditory feedback, and safety sensors. User testing will then refine the design based on practical feedback. Emphasis will be placed on privacy and accessibility, ensuring the device does not store personal information and considers features for the deaf-blind community. The final phase will focus on implementing improvements to address issues like camera resolution and cost optimization, with ongoing collaboration with accessibility experts to ensure effectiveness and user satisfaction.

Design Process

The design process for SOAL begins with ideation, where the team identifies the specific needs and challenges faced by Blind or Low Vision (BLV) individuals through extensive research and user feedback. This phase is followed by concept development, where initial sketches and prototypes are created to define the core features and system architecture. The design then progresses to prototyping, where a functional model of SOAL is built and tested to gather user feedback and identify areas for improvement. Iteration involves refining the prototype based on this feedback, addressing issues, and enhancing user interactions to ensure the final design effectively meets user needs and enhances independence and accessibility for BLV individuals.

Design Timeline

The design process for SOAL is organized into four main phases over a total duration of 12 weeks:

1. Ideation (2 weeks): identify user needs and challenges.
2. Concept Development (3 weeks): create initial prototypes and define core features.
3. Prototyping (4 weeks): build and test a functional model.
4. Iteration (3 weeks): refine the prototype based on feedback and address any remaining issues.

Target Audience

SOAL is designed for Blind and Low Vision (BLV) individuals, enhancing their independence and safety. It offers advanced navigation and object detection through AI-powered cameras, GPS, and auditory feedback, tailored to meet the unique needs of those with visual impairments. Future updates aim to include the deaf-blind community for broader accessibility.

User Research

For SOAL, we conducted user research involving interviews and surveys with Blind and Low Vision (BLV) individuals and consultations with accessibility experts. This research highlighted the need for intuitive, real-time feedback about surroundings while minimizing disruption in quiet settings. Users emphasized the importance of a lightweight, comfortable device with clear auditory and tactile cues for safety and navigation. These insights guided our design, ensuring SOAL effectively enhances independence and usability for the BLV community.
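
One way to honor the "minimal disruption in quiet settings" finding is to gate the feedback channel on ambient noise. The Python sketch below is a simplified illustration under assumed thresholds and function names, not SOAL's actual logic.

```python
# Simplified illustration: choose between spoken audio and belt vibration
# based on ambient noise level and message urgency. Thresholds are assumptions.

QUIET_DB = 40   # below this, favor haptics to avoid disturbing quiet spaces
LOUD_DB = 80    # above this, speech may be inaudible, so favor haptics too


def choose_channel(ambient_db: float, urgent: bool) -> str:
    if urgent:
        # Safety alerts always use both channels.
        return "audio+haptic"
    if ambient_db < QUIET_DB or ambient_db > LOUD_DB:
        return "haptic"
    return "audio"


if __name__ == "__main__":
    print(choose_channel(35, urgent=False))   # quiet library -> "haptic"
    print(choose_channel(65, urgent=False))   # street noise -> "audio"
    print(choose_channel(65, urgent=True))    # obstacle alert -> "audio+haptic"
```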

Form Factor

The concept of combining the AI Pin with the NavBelt in SOAL was driven by the goal of merging advanced technology with practical navigation aids to enhance independence for Blind and Low Vision (BLV) individuals. The AI Pin's cutting-edge camera technology and AI capabilities facilitate real-time object detection and environmental awareness, while the NavBelt's vibrational feedback system provides precise spatial orientation cues. Integrating these technologies results in a cohesive solution that offers both detailed environmental scanning and intuitive navigation guidance through minimal vibrations. This approach leverages the strengths of both devices to create a compact and user-friendly wearable, prioritizing functionality, comfort, and enhanced user experience, and represents a significant advancement in assistive technology.
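
As a rough illustration of that pairing (not SOAL's actual firmware, which is outside the scope of this case study), the Python sketch below shows how an obstacle bearing reported by the camera module could be mapped onto one of the belt's vibration motors; the motor count, intensity scale, and function names are all assumptions.

```python
# Illustrative sketch only: maps an obstacle bearing from the camera module
# to one of N vibration motors spaced evenly around a haptic belt.
# Motor count, intensity scale, and function names are assumptions.

NUM_MOTORS = 8  # hypothetical: motors spaced every 45 degrees around the belt


def bearing_to_motor(bearing_deg: float) -> int:
    """Pick the motor closest to the obstacle's bearing (0 = straight ahead)."""
    sector = 360 / NUM_MOTORS
    return int(((bearing_deg % 360) + sector / 2) // sector) % NUM_MOTORS


def intensity_from_distance(distance_m: float, max_range_m: float = 5.0) -> float:
    """Closer obstacles produce stronger vibration, clamped to [0, 1]."""
    return max(0.0, min(1.0, 1.0 - distance_m / max_range_m))


if __name__ == "__main__":
    # An obstacle detected 30 degrees to the right, 1.5 m away,
    # pulses motor 1 at roughly 70% intensity.
    print(bearing_to_motor(30), round(intensity_from_distance(1.5), 2))
```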

Unique Feature

Unique Feature

SOAL distinguishes itself with its groundbreaking integration of AI-powered cameras, a GPS mapping system, and auditory feedback tailored specifically for individuals with visual impairments. What sets SOAL apart is its innovative approach to enhancing spatial awareness and independence through a seamless combination of real-time object detection and minimal navigation vibrations. The device’s advanced safety sensors and built-in e-SIM provide real-time obstacle alerts and precise navigation guidance without the need for external apps or plugins. Additionally, SOAL’s unique design ensures user comfort by emitting vibrations through a dedicated belt, minimizing direct vibrations near the chest and allowing for intuitive directional feedback. This privacy-centric solution, which operates solely on voice commands and does not store personal data, represents a significant leap forward in assistive technology, offering a user-friendly, inclusive, and empowering experience for individuals who are blind or low vision.

SOAL distinguishes itself with its groundbreaking integration of AI-powered cameras, a GPS mapping system, and auditory feedback tailored specifically for individuals with visual impairments. What sets SOAL apart is its innovative approach to enhancing spatial awareness and independence through a seamless combination of real-time object detection and minimal navigation vibrations. The device’s advanced safety sensors and built-in e-SIM provide real-time obstacle alerts and precise navigation guidance without the need for external apps or plugins. Additionally, SOAL’s unique design ensures user comfort by emitting vibrations through a dedicated belt, minimizing direct vibrations near the chest and allowing for intuitive directional feedback. This privacy-centric solution, which operates solely on voice commands and does not store personal data, represents a significant leap forward in assistive technology, offering a user-friendly, inclusive, and empowering experience for individuals who are blind or low vision.
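
Because SOAL operates solely on voice commands and stores no personal data, its camera pipeline can be pictured as process-and-discard: a frame exists in memory only for a single detection pass, and only the derived alert survives. The sketch below is a hypothetical illustration of that idea; the detector is a stub, and none of these interfaces come from the actual device.

```python
# Hedged sketch of a process-and-discard pipeline: the raw frame never leaves
# this function and is never written to disk; only derived alerts survive.
# The detector is a stub; SOAL's actual model and APIs are not public.
from dataclasses import dataclass


@dataclass
class Alert:
    label: str          # e.g. "low-hanging branch"
    bearing_deg: float
    distance_m: float


def detect_obstacles(frame: bytes) -> list[Alert]:
    """Stub detector; a real device would run an on-device vision model here."""
    return [Alert("obstacle", 0.0, 2.0)] if frame else []


def process_frame(frame: bytes) -> list[Alert]:
    alerts = detect_obstacles(frame)
    # Explicitly drop the reference; no caching, logging, or upload of imagery.
    del frame
    return alerts


if __name__ == "__main__":
    print(process_frame(b"\x00" * 100))
```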

Pain Points

The SOAL project tackles several critical pain points faced by Blind and Low Vision (BLV) individuals, aiming to enhance their independence and navigation abilities. By addressing these challenges, SOAL strives to create a more inclusive and user-friendly assistive device.

1. Navigation and Environmental Awareness
Difficulty in safely navigating and understanding surroundings due to limited object detection and awareness.
2. Comfort and Usability
Ensuring the device is lightweight, comfortable, and easy to use without compromising functionality.
3. Auditory Feedback and Disruption
Balancing effective auditory feedback with minimal disruption in different environments.
4. Privacy Concerns
Preventing unauthorized access or storage of personal data to protect user privacy.

Low-Fidelity Design

The low-fidelity sketch in the notebook served as a foundational step in conceptualizing the design of SOAL and its key features. It focused on figuring out how the device would look and where essential components should be placed for optimal usability. This sketch allowed for a clear visualization of how the device’s form factor would blend minimalism with functionality, ensuring that SOAL would be both practical and comfortable for the user.

Design Review and Iteration

Following the initial sketch, the design of SOAL was developed as a 3D model and reviewed, which led to several significant improvements. The review confirmed the design’s effectiveness and highlighted areas for refinement. Key changes included adding a NAV belt slot, removing the voice and scan buttons to minimize reliance on touch, and incorporating a large voice activation area. The sensor was repositioned next to the camera, a customizable touchpad was added for simple inputs, and a voice activation button was placed on the side. These updates enhanced the device’s usability and accessibility for BLV individuals.

High-Fidelity Design

The high-fidelity design of SOAL focuses on delivering an optimal user experience through meticulously crafted components. The Belt Slot and Belt Eject mechanisms ensure easy attachment and removal of the SOAL Belt, enhancing user convenience. The Cameras are strategically positioned for accurate environmental scanning, while the Touch Pad allows intuitive gesture-based control. The Magnetic Base ensures secure yet flexible attachment to clothing, and the Charging Port supports efficient, widespread charging. Voice Activation facilitates hands-free operation, and the Sensor provides real-time obstacle detection. The On/Off Button is ergonomically placed for easy access, and the SOAL Belt distributes vibrational feedback comfortably, optimizing spatial awareness and user comfort. Each element is designed to seamlessly integrate into a sleek, functional wearable that maximizes usability and comfort for BLV individuals.
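
Since the Touch Pad is described as customizable and gesture-based, one way to picture that customization is a gesture-to-action table with user overrides. The mapping below is purely a placeholder sketch; the gesture names and actions are illustrative, not documented SOAL behavior.

```python
# Placeholder mapping of touchpad gestures to device actions; gesture names
# and actions are illustrative, not documented SOAL behavior.
DEFAULT_GESTURES = {
    "single_tap": "repeat_last_instruction",
    "double_tap": "describe_surroundings",
    "swipe_forward": "next_waypoint",
    "swipe_back": "previous_waypoint",
    "long_press": "toggle_mute",
}


def handle_gesture(gesture: str, overrides: dict[str, str] | None = None) -> str:
    """Resolve a gesture to an action, letting user overrides win."""
    table = {**DEFAULT_GESTURES, **(overrides or {})}
    return table.get(gesture, "ignore")


if __name__ == "__main__":
    # A user who prefers a double tap to silence the device:
    custom = {"double_tap": "toggle_mute"}
    print(handle_gesture("double_tap", custom))  # -> "toggle_mute"
```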

Working

In use, SOAL follows a simple sense-and-guide cycle. The wearer activates the device with a voice command or a gesture on the touchpad, and the AI-powered cameras and adjacent sensor scan the surroundings to detect obstacles in real time. The built-in e-SIM and GPS plot the route without relying on external apps or plugins. Guidance is relayed through concise auditory feedback, while directional cues are routed to the SOAL Belt as gentle vibrations rather than being felt near the chest. Because the device operates on voice commands and does not store personal data, each pass of this cycle leaves no record behind.
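
Read as a control loop, this amounts to: scan, detect, decide, then feed back through audio and the belt. The self-contained Python sketch below walks through one pass of such a loop under assumed interfaces (print statements stand in for the speech and belt drivers), since SOAL’s real firmware is not published in this case study.

```python
# Assumed, self-contained sketch of one pass through a sense-decide-feedback
# loop: camera detections feed both spoken guidance and a directional belt cue.
# All classes and thresholds are illustrative; SOAL's real firmware is not public.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str
    bearing_deg: float   # 0 = straight ahead, positive = to the right
    distance_m: float


def speak(text: str) -> None:
    print(f"[audio] {text}")            # stand-in for text-to-speech output


def vibrate(motor: int, strength: float) -> None:
    print(f"[belt] motor={motor} strength={strength:.2f}")  # stand-in for belt driver


def feedback_pass(detections: list[Detection], num_motors: int = 8) -> None:
    if not detections:
        return
    nearest = min(detections, key=lambda d: d.distance_m)
    # Announce the nearest hazard once, briefly.
    speak(f"{nearest.label} ahead, about {nearest.distance_m:.0f} meters")
    # Route the direction cue through the belt instead of more speech.
    sector = 360 / num_motors
    motor = int(((nearest.bearing_deg % 360) + sector / 2) // sector) % num_motors
    strength = max(0.0, min(1.0, 1.0 - nearest.distance_m / 5.0))
    vibrate(motor, strength)


if __name__ == "__main__":
    feedback_pass([Detection("curb", -20.0, 2.0), Detection("cyclist", 90.0, 4.0)])
```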

Key Takeaways

SOAL’s high-fidelity design highlights a commitment to user-centric innovation, combining advanced technology with intuitive usability. The integration of features such as the Belt Slot, Touch Pad, and Voice Activation ensures a seamless and user-friendly experience. The device’s sleek, minimalistic design not only enhances visual appeal but also optimizes functionality, making it lightweight and easy to use. With AI-powered cameras and real-time sensors working in harmony with the responsive SOAL Belt, users receive comprehensive feedback and navigation assistance, significantly improving their spatial awareness and independence.

Next Steps

The next steps involve conducting thorough user testing to gather valuable feedback on SOAL’s comfort, functionality, and overall usability. This will be essential for identifying any areas needing improvement and validating the effectiveness of the design. Addressing potential limitations, such as camera performance in low-light conditions and privacy concerns, is crucial for refining the device. Additionally, exploring cost optimization strategies will help ensure affordability without compromising quality. Iterative refinement based on user insights will further enhance the device, making it an even more effective and inclusive tool for the BLV community.

You scrolled all the way! Now let's scroll through some cool ideas together.

Drop an email, let’s make something awesome happen!

© 2024 by Keerthana Ravichandran