Airy: Reading Robot Intent through Height and Sky
- URL: http://arxiv.org/abs/2510.08381v1
- Date: Thu, 09 Oct 2025 16:07:30 GMT
- Title: Airy: Reading Robot Intent through Height and Sky
- Authors: Baoyang Chen, Xian Xu, Huamin Qu
- Abstract summary: Airy asks whether complex multi-agent AI can become intuitively understandable. It shows how sensory metaphors can turn a black box into a public interface.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As industrial robots move into shared human spaces, their opaque decision-making threatens safety, trust, and public oversight. This artwork, Airy, asks whether complex multi-agent AI can become intuitively understandable by staging a competition between two reinforcement-trained robot arms that snap a bedsheet skyward. Building on three design principles: competition as a clear metric (who lifts higher), embodied familiarity (audiences recognize fabric snapping), and sensor-to-sense mapping (robot cooperation or rivalry shown through forest and weather projections), the installation gives viewers a visceral way to read machine intent. Observations from five international exhibitions indicate that audiences consistently read the robots' strategies, conflict, and cooperation in real time, with emotional reactions that mirror the system's internal state. The project shows how sensory metaphors can turn a black box into a public interface.
Related papers
- Choreographing Trash Cans: On Speculative Futures of Weak Robots in Public Spaces
This paper explores mobile robots that encourage posthuman collaboration rather than managing environments independently. We examine the workings of "weak robots" by queering notions of function and ability. We introduce two speculative design fiction vignettes that describe choreographies of such robots in future urban spaces.
arXiv Detail & Related papers (2025-09-01T17:27:43Z)
- Multi Layered Autonomy and AI Ecologies in Robotic Art Installations
This paper presents Symbiosis of Agents, a large-scale installation by Baoyang Chen. It embeds AI-driven robots in an immersive, mirror-lined arena, probing the tension between machine agency and artistic authorship.
arXiv Detail & Related papers (2025-06-03T08:28:19Z) - GNN-based Decentralized Perception in Multirobot Systems for Predicting Worker Actions [12.260881600042374]
This paper introduces a perception framework that enables mobile robots to understand and share information about human actions in a decentralized way. A swarm-inspired decision-making process is used to ensure all robots agree on a unified interpretation of the human's actions.
arXiv Detail & Related papers (2025-01-08T00:06:38Z) - Controlling diverse robots by inferring Jacobian fields with deep networks [48.279199537720714]
Mirroring the complex structures and diverse functions of natural organisms is a long-standing challenge in robotics. We introduce a method that uses deep neural networks to map a video stream of a robot to its visuomotor Jacobian field. Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot.
arXiv Detail & Related papers (2024-07-11T17:55:49Z) - See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation [49.925499720323806]
We study how visual, auditory, and tactile perception can jointly help robots to solve complex manipulation tasks.
We build a robot system that can see with a camera, hear with a contact microphone, and feel with a vision-based tactile sensor.
arXiv Detail & Related papers (2022-12-07T18:55:53Z)
- Robots with Different Embodiments Can Express and Influence Carefulness in Object Manipulation
This work investigates the perception of object manipulations performed with a communicative intent by two robots.
We designed the robots' movements to communicate carefulness or not during the transportation of objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z)
- Synthesis and Execution of Communicative Robotic Movements with Generative Adversarial Networks
We focus on how to transfer on two different robotic platforms the same kinematics modulation that humans adopt when manipulating delicate objects.
We choose to modulate the velocity profile adopted by the robots' end-effector, inspired by what humans do when transporting objects with different characteristics.
We exploit a novel Generative Adversarial Network architecture, trained with human kinematics examples, to generalize over them and generate new and meaningful velocity profiles.
arXiv Detail & Related papers (2022-03-29T15:03:05Z)
- Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
arXiv Detail & Related papers (2022-02-03T10:04:26Z)
- A proxemics game between festival visitors and an industrial robot
Nonverbal behaviours of collaboration partners in human-robot teams influence the experience of the human interaction partners.
During the Ars Electronica 2020 Festival for Art, Technology and Society (Linz, Austria), we invited visitors to interact with an industrial robot.
We investigated general nonverbal behaviours of the humans interacting with the robot, as well as nonverbal behaviours of people in the audience.
arXiv Detail & Related papers (2021-05-28T13:26:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.