What if Eye...? Computationally Recreating Vision Evolution
- URL: http://arxiv.org/abs/2501.15001v2
- Date: Thu, 13 Feb 2025 04:15:47 GMT
- Title: What if Eye...? Computationally Recreating Vision Evolution
- Authors: Kushagra Tiwary, Aaron Young, Zaid Tasneem, Tzofi Klinghoffer, Akshat Dave, Tomaso Poggio, Dan-Eric Nilsson, Brian Cheung, Ramesh Raskar
- Abstract summary: Vision systems in nature show remarkable diversity, from simple light-sensitive patches to complex camera eyes with lenses.
We show how environmental demands drive three fundamental aspects of visual evolution through an artificial evolution framework.
Our work introduces a novel paradigm that illuminates evolutionary principles shaping vision by creating targeted single-player games.
- Score: 13.841720704094115
- License:
- Abstract: Vision systems in nature show remarkable diversity, from simple light-sensitive patches to complex camera eyes with lenses. While natural selection has produced these eyes through countless mutations over millions of years, they represent just one set of realized evolutionary paths. Testing hypotheses about how environmental pressures shaped eye evolution remains challenging, since we cannot experimentally isolate individual factors. Computational evolution offers a way to systematically explore alternative trajectories. Here we show how environmental demands drive three fundamental aspects of visual evolution through an artificial evolution framework that co-evolves both physical eye structure and neural processing in embodied agents. First, we demonstrate computational evidence that task-specific selection drives bifurcation in eye evolution: orientation tasks such as maze navigation lead to distributed compound-type eyes, while an object-discrimination task leads to the emergence of high-acuity camera-type eyes. Second, we reveal how optical innovations like lenses naturally emerge to resolve fundamental tradeoffs between light collection and spatial precision. Third, we uncover systematic scaling laws between visual acuity and neural processing, showing how task complexity drives coordinated evolution of sensory and computational capabilities. Our work introduces a novel paradigm that illuminates the evolutionary principles shaping vision by creating targeted single-player games in which embodied agents must simultaneously evolve visual systems and learn complex behaviors. Through our unified genetic encoding framework, these embodied agents serve as next-generation hypothesis-testing machines while providing a foundation for designing manufacturable bio-inspired vision systems. Website: http://eyes.mit.edu/
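The abstract describes a unified genetic encoding in which physical eye structure and neural processing evolve together in embodied agents. As a rough, hedged illustration of what such a co-evolution loop could look like, the Python sketch below mutates a genome that holds both optical parameters and controller weights and selects on a toy fitness score. All names (Genome, evolve) and the placeholder fitness function are assumptions for exposition, not the paper's implementation; in the paper, fitness would come from an agent's performance in the targeted single-player games, with behaviour learned within each lifetime.

```python
import random

# Hypothetical genome: jointly encodes eye morphology and neural-controller weights.
# Field names and ranges are illustrative, not taken from the paper.
class Genome:
    def __init__(self, n_photoreceptors, aperture, lens_power, weights):
        self.n_photoreceptors = n_photoreceptors  # spatial sampling (acuity)
        self.aperture = aperture                  # light collection vs. blur trade-off
        self.lens_power = lens_power              # 0.0 = no lens (pinhole-like eye)
        self.weights = weights                    # parameters of a tiny neural controller

    @staticmethod
    def random(n_weights=16):
        return Genome(
            n_photoreceptors=random.randint(1, 32),
            aperture=random.uniform(0.01, 1.0),
            lens_power=random.uniform(0.0, 1.0),
            weights=[random.gauss(0.0, 1.0) for _ in range(n_weights)],
        )

    def mutate(self, sigma=0.1):
        # Eye morphology and controller weights mutate together (unified encoding).
        return Genome(
            n_photoreceptors=max(1, self.n_photoreceptors + random.choice((-1, 0, 1))),
            aperture=min(1.0, max(0.01, self.aperture + random.gauss(0.0, sigma))),
            lens_power=min(1.0, max(0.0, self.lens_power + random.gauss(0.0, sigma))),
            weights=[w + random.gauss(0.0, sigma) for w in self.weights],
        )


def fitness(genome):
    """Placeholder task reward: stands in for an embodied agent playing a
    single-player game (e.g. maze navigation or object discrimination).
    A real evaluation would render the scene through the evolved optics and
    run the learned controller; here we only score a toy acuity/light trade-off."""
    acuity = genome.n_photoreceptors * (0.2 + genome.lens_power)
    light = genome.aperture
    return acuity * light - 0.05 * genome.n_photoreceptors  # penalize "free" resolution


def evolve(pop_size=64, generations=100):
    population = [Genome.random() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 4]  # simple truncation selection
        population = parents + [random.choice(parents).mutate()
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)


if __name__ == "__main__":
    best = evolve()
    print(best.n_photoreceptors, round(best.aperture, 2), round(best.lens_power, 2))
```

The design point the sketch tries to convey is only that sensing parameters and computation parameters sit in one genome and are selected jointly, so pressure from the task can shape optics and processing at the same time.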
Related papers
- Paleoinspired Vision: From Exploring Colour Vision Evolution to Inspiring Camera Design [44.353407126161514]
We present a model of visual transduction in the retina, introducing a novel opsin layer.
We quantify evolutionary pressures by measuring machine vision recognition accuracy on colour images shaped by specific opsins.
We develop an evolutionary conservation optimisation algorithm to reconstruct the spectral sensitivity of opsins.
arXiv Detail & Related papers (2024-12-27T04:07:52Z)
- Toward Artificial Open-Ended Evolution within Lenia using Quality-Diversity [5.380545611878407]
We show that Quality-Diversity is an effective framework for the automatic discovery of diverse self-organizing patterns in complex systems.
Our framework, called Leniabreeder, can leverage both manually defined diversity criteria and unsupervised measures of diversity to broaden the scope of discoverable patterns.
arXiv Detail & Related papers (2024-06-06T16:35:27Z)
- How does the primate brain combine generative and discriminative computations in vision? [4.691670689443386]
Two contrasting conceptions of the inference process have each been influential in research on biological vision and machine vision.
One conception holds that vision inverts a generative model through an interrogation of the sensory evidence, in a process often thought to involve top-down predictions of sensory data.
We explain and clarify the terminology, review the key empirical evidence, and propose an empirical research program that transcends and sets the stage for revealing the mysterious hybrid algorithm of primate vision.
arXiv Detail & Related papers (2024-01-11T16:07:58Z)
- A Neuro-mimetic Realization of the Common Model of Cognition via Hebbian Learning and Free Energy Minimization [55.11642177631929]
Large neural generative models are capable of synthesizing semantically rich passages of text or producing complex images.
We discuss the COGnitive Neural GENerative system, an architecture that casts the Common Model of Cognition in terms of Hebbian learning and free energy minimization.
arXiv Detail & Related papers (2023-10-14T23:28:48Z)
- Adapting Brain-Like Neural Networks for Modeling Cortical Visual Prostheses [68.96380145211093]
Cortical prostheses are devices implanted in the visual cortex that attempt to restore lost vision by electrically stimulating neurons.
Currently, the vision provided by these devices is limited, and accurately predicting the visual percepts resulting from stimulation is an open challenge.
We propose to address this challenge by utilizing 'brain-like' convolutional neural networks (CNNs), which have emerged as promising models of the visual system.
arXiv Detail & Related papers (2022-09-27T17:33:19Z)
- The Introspective Agent: Interdependence of Strategy, Physiology, and Sensing for Embodied Agents [51.94554095091305]
We argue for an introspective agent, which considers its own abilities in the context of its environment.
Just as in nature, we hope to reframe strategy as one tool, among many, to succeed in an environment.
arXiv Detail & Related papers (2022-01-02T20:14:01Z)
- Heritability in Morphological Robot Evolution [2.7402733069181]
We introduce the biological notion of heritability, which captures the amount of phenotypic variation caused by genotypic variation.
In our analysis, we measure heritability in the first generation of robots evolved from two different encodings.
We show how heritability can be a useful tool to better understand the relationship between genotypes and phenotypes (a minimal illustrative sketch follows this list).
arXiv Detail & Related papers (2021-10-21T14:58:17Z)
- Embodied Intelligence via Learning and Evolution [92.26791530545479]
We show that environmental complexity fosters the evolution of morphological intelligence.
We also show that evolution rapidly selects morphologies that learn faster.
Our experiments suggest a mechanistic basis for both the Baldwin effect and the emergence of morphological intelligence.
arXiv Detail & Related papers (2021-02-03T18:58:31Z)
- A Survey on Visual Transformer [126.56860258176324]
The Transformer is a type of deep neural network based mainly on the self-attention mechanism.
In this paper, we review these vision transformer models by categorizing them in different tasks and analyzing their advantages and disadvantages.
arXiv Detail & Related papers (2020-12-23T09:37:54Z)
- An evolutionary perspective on the design of neuromorphic shape filters [0.0]
Cortical systems may provide advanced image processing, but most likely rely on design principles that proved effective in simpler systems.
The present article provides a brief overview of retinal and cortical mechanisms for registering shape information.
arXiv Detail & Related papers (2020-08-30T17:53:44Z)
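The heritability entry above treats heritability as the share of phenotypic variation explained by genotypic variation. The Python sketch below is a generic, illustrative variance-decomposition estimate of broad-sense heritability from repeated phenotype measurements per genotype; the toy data-generating model and the helper name heritability are assumptions and are not drawn from that paper.

```python
import random
from statistics import mean, pvariance

# Broad-sense heritability H^2 = Var(genotypic) / Var(phenotypic), estimated by
# measuring several phenotypes per genotype (e.g. clones or repeated builds).
# The data below are made up purely to illustrate the calculation.

def heritability(phenotypes_by_genotype):
    """phenotypes_by_genotype: list of lists, one inner list per genotype."""
    group_means = [mean(group) for group in phenotypes_by_genotype]
    all_values = [v for group in phenotypes_by_genotype for v in group]
    var_genotypic = pvariance(group_means)   # variation explained by genotype
    var_phenotypic = pvariance(all_values)   # total observed variation
    return var_genotypic / var_phenotypic

# Toy data: 20 genotypes, 10 noisy phenotype measurements each.
random.seed(0)
data = []
for _ in range(20):
    genetic_value = random.gauss(0.0, 1.0)   # genotype's "true" contribution
    data.append([genetic_value + random.gauss(0.0, 0.5) for _ in range(10)])

print(f"estimated H^2 = {heritability(data):.2f}")  # expected value near 1 / (1 + 0.5**2) = 0.8
```

Higher values indicate that phenotypic differences closely track genotypic differences, which is the kind of comparison the entry describes between its two genetic encodings.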
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.