HappyRouting: Learning Emotion-Aware Route Trajectories for Scalable
In-The-Wild Navigation
- URL: http://arxiv.org/abs/2401.15695v1
- Date: Sun, 28 Jan 2024 16:44:17 GMT
- Authors: David Bethge, Daniel Bulanda, Adam Kozlowski, Thomas Kosch, Albrecht
Schmidt, Tobias Grosse-Puppendahl
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Routes represent an integral part of triggering emotions in drivers.
Navigation systems allow users to choose a navigation strategy, such as the
fastest or shortest route. However, they do not consider the driver's emotional
well-being. We present HappyRouting, a novel navigation-based empathic car
interface guiding drivers through real-world traffic while evoking positive
emotions. We propose design considerations, derive a technical architecture,
and implement a routing optimization framework. Our contribution is a
machine-learning-generated emotion map layer that predicts emotions along
routes from static and dynamic contextual data. We evaluated HappyRouting in a
real-world driving study (N=13), finding that happy routes increase
subjectively perceived valence by 11% (p=.007). Although happy routes take 1.25
times longer on average, participants perceived the happy route as shorter,
presenting an emotion-enhanced alternative to today's fastest routing
mechanisms. We discuss how emotion-based routing can be integrated into
navigation apps, promoting emotional well-being for mobility use.
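The routing optimization described above can be pictured as a shortest-path search over road segments whose costs blend travel time with the emotion map layer's predicted valence. The sketch below is illustrative only and not the authors' implementation: the graph, the `emotion` lookup (standing in for the learned emotion map layer), and the trade-off weight `alpha` are all assumptions for the example.

```python
import heapq

def happiest_route(graph, emotion, start, goal, alpha=0.5):
    """Dijkstra over a blended cost: travel time, inflated by the
    predicted displeasure (1 - valence) of each road segment.

    graph[node] maps neighbor -> travel time in minutes.
    emotion[(u, v)] is a predicted valence in [0, 1]; here it is
    supplied directly, standing in for the learned emotion map layer.
    alpha controls how much low valence penalizes a segment.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, t in graph.get(u, {}).items():
            # Blended edge cost: pure time when valence is 1,
            # up to (1 + alpha) * time when valence is 0.
            cost = t * (1 + alpha * (1 - emotion.get((u, v), 0.5)))
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the chosen route from the predecessor map.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))
```

With `alpha=0` the search reduces to ordinary fastest routing; raising `alpha` makes the search accept a longer but more pleasant route, mirroring the paper's finding that happy routes take about 1.25 times longer:

```python
graph = {"A": {"B": 10, "C": 12}, "B": {"D": 10}, "C": {"D": 12}}
emotion = {("A", "B"): 0.2, ("B", "D"): 0.2,   # fast, unpleasant
           ("A", "C"): 0.9, ("C", "D"): 0.9}   # slower, pleasant
happiest_route(graph, emotion, "A", "D", alpha=0.0)  # fastest: A-B-D
happiest_route(graph, emotion, "A", "D", alpha=1.0)  # happiest: A-C-D
```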
Related papers
- RoadRunner -- Learning Traversability Estimation for Autonomous Off-road Driving [13.101416329887755]
We present RoadRunner, a framework capable of predicting terrain traversability and an elevation map directly from camera and LiDAR sensor inputs.
RoadRunner enables reliable autonomous navigation, by fusing sensory information, handling of uncertainty, and generation of contextually informed predictions.
We demonstrate the effectiveness of RoadRunner in enabling safe and reliable off-road navigation at high speeds in multiple real-world driving scenarios through unstructured desert environments.
arXiv Detail & Related papers (2024-02-29T16:47:54Z) - Learning Navigational Visual Representations with Semantic Map
Supervision [85.91625020847358]
We propose a navigational-specific visual representation learning method by contrasting the agent's egocentric views and semantic maps.
Ego$2$-Map learning transfers the compact and rich information from a map, such as objects, structure and transition, to the agent's egocentric representations for navigation.
arXiv Detail & Related papers (2023-07-23T14:01:05Z) - Emergence of Maps in the Memories of Blind Navigation Agents [68.41901534985575]
Animal navigation research posits that organisms build and maintain internal spatial representations, or maps, of their environment.
We ask if machines -- specifically, artificial intelligence (AI) navigation agents -- also build implicit (or 'mental') maps.
Unlike animal navigation, we can judiciously design the agent's perceptual system and control the learning paradigm to nullify alternative navigation mechanisms.
arXiv Detail & Related papers (2023-01-30T20:09:39Z) - Gesture2Path: Imitation Learning for Gesture-aware Navigation [54.570943577423094]
We present Gesture2Path, a novel social navigation approach that combines image-based imitation learning with model-predictive control.
We deploy our method on real robots and showcase the effectiveness of our approach in the four gesture-navigation scenarios.
arXiv Detail & Related papers (2022-09-19T23:05:36Z) - RCA: Ride Comfort-Aware Visual Navigation via Self-Supervised Learning [14.798955901284847]
Under shared autonomy, wheelchair users expect vehicles to provide safe and comfortable rides while following users' high-level navigation plans.
We propose to model ride comfort explicitly in traversability analysis using proprioceptive sensing.
We show our navigation system provides human-preferred ride comfort through robot experiments together with a human evaluation study.
arXiv Detail & Related papers (2022-07-29T03:38:41Z) - Augmented reality navigation system for visual prosthesis [67.09251544230744]
We propose an augmented reality navigation system for visual prosthesis that incorporates a software of reactive navigation and path planning.
It consists of four steps: locating the subject on a map, planning the subject's trajectory, showing it to the subject, and re-planning to avoid obstacles.
Results show that our augmented reality navigation system improves navigation performance by reducing the time and distance needed to reach goals, and significantly reduces the number of obstacle collisions.
arXiv Detail & Related papers (2021-09-30T09:41:40Z) - Pushing it out of the Way: Interactive Visual Navigation [62.296686176988125]
We study the problem of interactive navigation where agents learn to change the environment to navigate more efficiently to their goals.
We introduce the Neural Interaction Engine (NIE) to explicitly predict the change in the environment caused by the agent's actions.
By modeling the changes while planning, we find that agents exhibit significant improvements in their navigational capabilities.
arXiv Detail & Related papers (2021-04-28T22:46:41Z) - ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for
Socially-Aware Robot Navigation [65.11858854040543]
We present ProxEmo, a novel end-to-end emotion prediction algorithm for robot navigation among pedestrians.
Our approach predicts the perceived emotions of a pedestrian from walking gaits, which is then used for emotion-guided navigation.
arXiv Detail & Related papers (2020-03-02T17:47:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.