Evolutionary Generation of Visual Motion Illusions
- URL: http://arxiv.org/abs/2112.13243v1
- Date: Sat, 25 Dec 2021 14:53:50 GMT
- Title: Evolutionary Generation of Visual Motion Illusions
- Authors: Lana Sinapayen and Eiji Watanabe
- Abstract summary: We present a generative model, the Evolutionary Illusion GENerator (EIGen), that creates new visual motion illusions.
The structure of EIGen supports the hypothesis that illusory motion might be the result of perceiving the brain's own predictions.
The scientific motivation of this paper is to demonstrate that the perception of illusory motion could be a side effect of the predictive abilities of the brain.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Why do we sometimes perceive static images as if they were moving? Visual
motion illusions enjoy a sustained popularity, yet there is no definitive
answer to the question of why they work. We present a generative model, the
Evolutionary Illusion GENerator (EIGen), that creates new visual motion
illusions. The structure of EIGen supports the hypothesis that illusory motion
might be the result of perceiving the brain's own predictions rather than
perceiving raw visual input from the eyes. The scientific motivation of this
paper is to demonstrate that the perception of illusory motion could be a side
effect of the predictive abilities of the brain. The philosophical motivation
of this paper is to call attention to the untapped potential of "motivated
failures", ways for artificial systems to fail as biological systems fail, as a
worthy outlet for Artificial Intelligence and Artificial Life research.
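The EIGen implementation itself is not reproduced in this summary, but the generate-score-select loop that any such evolutionary generator relies on can be sketched generically. The sketch below is an illustrative assumption, not EIGen's actual code: the genome is an abstract vector rather than an image, and the `fitness` argument is a toy stand-in for EIGen's real scoring of predicted illusory motion in a static image.

```python
import random

def evolve(fitness, genome_len=16, pop_size=20, generations=50,
           mut_prob=0.1, mut_sigma=0.1, seed=0):
    """Minimal (mu + lambda)-style evolutionary loop.

    `fitness` scores a genome (a list of floats in [0, 1]); higher is better.
    Parents survive into the next generation, so the best score never decreases.
    """
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection: keep the top half
        children = []
        for parent in parents:
            # Mutate each gene with probability mut_prob by adding Gaussian noise
            child = [g + rng.gauss(0.0, mut_sigma) if rng.random() < mut_prob else g
                     for g in parent]
            children.append([min(1.0, max(0.0, g)) for g in child])  # keep genes in [0, 1]
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in fitness: prefer a high mean gene value. A real EIGen-style
# fitness would instead score how much illusory motion a predictive model
# reports for the static image a genome encodes.
best = evolve(lambda g: sum(g) / len(g))
```

Because the parents are carried forward unchanged (elitism), selection pressure alone guarantees monotone improvement of the best genome; swapping in a perception-model-based fitness is what would turn this generic loop into an illusion generator.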
Related papers
- Neural Representations of Dynamic Visual Stimuli [36.04425924379253]
We show that visual motion information as optical flow can be predicted (or decoded) from brain activity as measured by fMRI.
We show that this predicted motion can be used to realistically animate static images using a motion-conditioned video diffusion model.
This work offers a novel framework for interpreting how the human brain processes dynamic visual information.
arXiv Detail & Related papers (2024-06-04T17:59:49Z)
- Brain-inspired bodily self-perception model for robot rubber hand illusion [11.686402949452546]
We propose a Brain-inspired bodily self-perception model, by which perceptions of bodily self can be autonomously constructed without supervision signals.
We validate our model with six rubber hand illusion experiments and a disability experiment on platforms including an iCub humanoid robot and simulated environments.
arXiv Detail & Related papers (2023-03-22T02:00:09Z)
- BI AVAN: Brain inspired Adversarial Visual Attention Network [67.05560966998559]
We propose a brain-inspired adversarial visual attention network (BI-AVAN) to characterize human visual attention directly from functional brain activity.
Our model imitates the biased competition process between attention-related/neglected objects to identify and locate the visual objects in a movie frame the human brain focuses on in an unsupervised manner.
arXiv Detail & Related papers (2022-10-27T22:20:36Z)
- Adapting Brain-Like Neural Networks for Modeling Cortical Visual Prostheses [68.96380145211093]
Cortical prostheses are devices implanted in the visual cortex that attempt to restore lost vision by electrically stimulating neurons.
Currently, the vision provided by these devices is limited, and accurately predicting the visual percepts resulting from stimulation is an open challenge.
We propose to address this challenge by utilizing 'brain-like' convolutional neural networks (CNNs), which have emerged as promising models of the visual system.
arXiv Detail & Related papers (2022-09-27T17:33:19Z)
- Adversarially trained neural representations may already be as robust as corresponding biological neural representations [66.73634912993006]
We develop a method for performing adversarial visual attacks directly on primate brain activity.
We report that the biological neurons that make up visual systems of primates exhibit susceptibility to adversarial perturbations that is comparable in magnitude to existing (robustly trained) artificial neural networks.
arXiv Detail & Related papers (2022-06-19T04:15:29Z)
- A-ACT: Action Anticipation through Cycle Transformations [89.83027919085289]
We take a step back to analyze how the human capability to anticipate the future can be transferred to machine learning algorithms.
A recent study in human psychology suggests that, in anticipating an occurrence, the human brain relies on both systems.
In this work, we study the impact of each system for the task of action anticipation and introduce a paradigm to integrate them in a learning framework.
arXiv Detail & Related papers (2022-04-02T21:50:45Z)
- High-Fidelity Neural Human Motion Transfer from Monocular Video [71.75576402562247]
Video-based human motion transfer creates video animations of humans following a source motion.
We present a new framework which performs high-fidelity and temporally-consistent human motion transfer with natural pose-dependent non-rigid deformations.
In the experimental results, we significantly outperform the state-of-the-art in terms of video realism.
arXiv Detail & Related papers (2020-12-20T16:54:38Z)
- An evolutionary perspective on the design of neuromorphic shape filters [0.0]
Cortical systems may provide advanced image processing, but most likely rely on design principles that have been proven effective in simpler systems.
The present article provides a brief overview of retinal and cortical mechanisms for registering shape information.
arXiv Detail & Related papers (2020-08-30T17:53:44Z)
- A deep active inference model of the rubber-hand illusion [3.0854497868458464]
Recent results in humans have shown that the RHI not only produces a change in the perceived arm location, but also causes involuntary forces.
We show that our model, which deals with visual high-dimensional inputs, produces similar perceptual and force patterns to those found in humans.
arXiv Detail & Related papers (2020-08-17T15:28:57Z)
- Visual Grounding of Learned Physical Models [66.04898704928517]
Humans intuitively recognize objects' physical properties and predict their motion, even when the objects are engaged in complicated interactions.
We present a neural model that simultaneously reasons about physics and makes future predictions based on visual and dynamics priors.
Experiments show that our model can infer the physical properties within a few observations, which allows the model to quickly adapt to unseen scenarios and make accurate predictions into the future.
arXiv Detail & Related papers (2020-04-28T17:06:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and accepts no responsibility for any consequences of its use.