Enhanced Accessibility for Mobile Indoor Navigation
- URL: http://arxiv.org/abs/2602.13233v1
- Date: Thu, 29 Jan 2026 11:30:40 GMT
- Title: Enhanced Accessibility for Mobile Indoor Navigation
- Authors: Johannes Wortmann, Bernd Schäufele, Konstantin Klipp, Ilja Radusch, Katharina Blaß, Thomas Jung
- Abstract summary: We develop an indoor navigation app that prioritizes accessibility, integrating enhanced features to meet the needs of visually impaired users. The usability of the app is being thoroughly evaluated through tests involving both visually impaired and sighted users.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The navigation of indoor spaces poses difficult challenges for individuals with visual impairments, as it requires processing of sensory information, dealing with uncertainties, and relying on assistance. To tackle these challenges, we present an indoor navigation app that places importance on accessibility for visually impaired users. Our approach involves a combination of user interviews and an analysis of the Web Content Accessibility Guidelines. With this approach, we are able to gather invaluable insights and identify design requirements for the development of an indoor navigation app. Based on these insights, we develop an indoor navigation app that prioritizes accessibility, integrating enhanced features to meet the needs of visually impaired users. The usability of the app is being thoroughly evaluated through tests involving both visually impaired and sighted users. Initial feedback has been positive, with users appreciating the inclusive user interface and the usability with a wide range of accessibility tools and Android device settings.
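The abstract mentions analyzing the Web Content Accessibility Guidelines to derive design requirements. One WCAG criterion that can be checked programmatically is text contrast; the sketch below implements the WCAG 2.x relative-luminance and contrast-ratio formulas (the function names are ours, not from the paper):

```python
# Sketch: WCAG 2.x contrast-ratio check, one measurable criterion an
# accessibility analysis like the one described could automate.

def srgb_to_linear(c: float) -> float:
    """Linearize one sRGB channel value in [0, 1] per the WCAG definition."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance: L = 0.2126 R + 0.7152 G + 0.0722 B (linearized)."""
    r, g, b = (srgb_to_linear(v / 255.0) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """Contrast ratio (L_lighter + 0.05) / (L_darker + 0.05), from 1:1 to 21:1."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    return (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)

# Black text on a white background gives the maximum ratio of 21:1,
# well above the 4.5:1 WCAG AA threshold for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

WCAG AA requires at least 4.5:1 for normal text and 3:1 for large text, so a check like this can flag low-contrast UI colors automatically during design review.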
Related papers
- MR.NAVI: Mixed-Reality Navigation Assistant for the Visually Impaired
We present MR.NAVI, a mixed reality system that enhances spatial awareness for visually impaired users. Our system combines computer vision algorithms for object detection and depth estimation with natural language processing to provide contextual scene descriptions.
arXiv Detail & Related papers (2025-05-28T14:02:56Z)
- Advancing Mobile UI Testing by Learning Screen Usage Semantics
This research seeks to enhance automated UI testing techniques by learning the screen usage semantics of mobile apps. It also improves the usability of a mobile app's interface by identifying and mitigating UI design issues.
arXiv Detail & Related papers (2025-05-15T01:40:43Z)
- Exploring Accessibility Trends and Challenges in Mobile App Development: A Study of Stack Overflow Questions
This study presents a large-scale empirical analysis of accessibility discussions on Stack Overflow to identify the trends and challenges Android and iOS developers face.
Our results show several challenges, including integrating assistive technologies like screen readers, ensuring accessible UI design, supporting text-to-speech across languages, and conducting accessibility testing.
We envision our findings driving improvements in developer practices, research directions, tool support, and educational resources.
arXiv Detail & Related papers (2024-09-12T11:13:24Z)
- Floor extraction and door detection for visually impaired guidance
Finding obstacle-free paths in unknown environments is a big navigation issue for visually impaired people and autonomous robots.
New devices based on computer vision systems can help impaired people to overcome the difficulties of navigating in unknown environments in safe conditions.
This work proposes a combination of sensors and algorithms that can lead to a navigation system for visually impaired people.
arXiv Detail & Related papers (2024-01-30T14:38:43Z)
- A Design Guideline to Overcome Web Accessibility Issues Challenged by Visually Impaired Community in Sri Lanka
Visually impaired communities are among the groups most hindered in accessing web content worldwide.
Five problems dominate: access limited by the impairment itself, usability issues caused by poor design, the unavailability of applications friendly to visually impaired users, lack of communication, and web navigation issues.
arXiv Detail & Related papers (2023-04-14T05:12:13Z)
- Augmented reality navigation system for visual prosthesis
We propose an augmented reality navigation system for visual prostheses that incorporates software for reactive navigation and path planning.
It consists of four steps: locating the subject on a map, planning the subject's trajectory, showing it to the subject, and re-planning around obstacles.
Results show how our augmented navigation system improves navigation performance by reducing the time and distance needed to reach goals, and significantly reduces the number of obstacle collisions.
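The four-step locate/plan/show/re-plan cycle described above can be sketched as a minimal plan-and-replan loop. The grid world, BFS planner, and obstacle set below are illustrative assumptions of ours, not the paper's implementation:

```python
# Minimal sketch of a plan / detect-obstacle / re-plan loop, illustrating
# the cycle such a navigation aid implies. The 4-connected grid and BFS
# planner are simplifications chosen for clarity.
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on a grid of 0 (free) / 1 (blocked)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:                       # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                queue.append((nr, nc))
    return None                               # no path exists

def navigate(grid, start, goal, discovered_obstacles):
    """Follow the planned path, re-planning whenever an obstacle is sensed."""
    pos, replans = start, 0
    path = bfs_path(grid, pos, goal)
    while pos != goal and path:
        nxt = path[path.index(pos) + 1]
        if nxt in discovered_obstacles:       # obstacle sensed mid-route
            grid[nxt[0]][nxt[1]] = 1
            path = bfs_path(grid, pos, goal)  # re-plan around it
            replans += 1
            continue
        pos = nxt                             # advance one step
    return pos, replans
```

For example, on an open 3x3 grid with an obstacle discovered at (0, 1), `navigate` detours through the second row and still reaches the goal after one re-plan.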
arXiv Detail & Related papers (2021-09-30T09:41:40Z)
- Deep Learning for Embodied Vision Navigation: A Survey
The "embodied visual navigation" problem requires an agent to navigate a 3D environment relying mainly on its first-person observations.
This paper attempts to establish an outline of the current works in the field of embodied visual navigation by providing a comprehensive literature survey.
arXiv Detail & Related papers (2021-07-07T12:09:04Z)
- Diagnosing Vision-and-Language Navigation: What Really Matters
Vision-and-language navigation (VLN) is a multimodal task where an agent follows natural language instructions and navigates in visual environments.
Recent studies report a slow-down in performance improvements on both indoor and outdoor VLN tasks.
In this work, we conduct a series of diagnostic experiments to unveil agents' focus during navigation.
arXiv Detail & Related papers (2021-03-30T17:59:07Z)
- Active Visual Information Gathering for Vision-Language Navigation
Vision-language navigation (VLN) is the task of directing an agent to carry out navigational instructions inside photo-realistic environments.
One of the key challenges in VLN is how to conduct a robust navigation by mitigating the uncertainty caused by ambiguous instructions and insufficient observation of the environment.
This work draws inspiration from human navigation behavior and endows an agent with an active information gathering ability for a more intelligent VLN policy.
arXiv Detail & Related papers (2020-07-15T23:54:20Z)
- DeFINE: Delayed Feedback based Immersive Navigation Environment for Studying Goal-Directed Human Navigation
Delayed Feedback based Immersive Navigation Environment (DeFINE) is a framework that allows for easy creation and administration of navigation tasks.
DeFINE has a built-in capability to provide performance feedback to participants during an experiment.
arXiv Detail & Related papers (2020-03-06T11:00:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.