There's Plenty of Room Right Here: Biological Systems as Evolved,
Overloaded, Multi-scale Machines
- URL: http://arxiv.org/abs/2212.10675v1
- Date: Tue, 20 Dec 2022 22:26:40 GMT
- Title: There's Plenty of Room Right Here: Biological Systems as Evolved,
Overloaded, Multi-scale Machines
- Authors: Joshua Bongard and Michael Levin
- Abstract summary: We argue that a useful path forward results from abandoning hard boundaries between categories and adopting an observer-dependent, pragmatic view.
Efforts to re-shape living systems for biomedical or bioengineering purposes require prediction and control of their function at multiple scales.
We argue that an observer-centered framework for the computations performed by evolved and designed systems will improve the understanding of meso-scale events.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The applicability of computational models to the biological world is an
active topic of debate. We argue that a useful path forward results from
abandoning hard boundaries between categories and adopting an
observer-dependent, pragmatic view. Such a view dissolves the contingent
dichotomies driven by human cognitive biases (e.g., tendency to oversimplify)
and prior technological limitations in favor of a more continuous, gradualist
view necessitated by the study of evolution, developmental biology, and
intelligent machines. Efforts to re-shape living systems for biomedical or
bioengineering purposes require prediction and control of their function at
multiple scales. This is challenging for many reasons, one of which is that
living systems perform multiple functions in the same place at the same time.
We refer to this as "polycomputing" - the ability of the same substrate to
simultaneously compute different things. This ability is an important way in
which living things are a kind of computer, but not the familiar, linear,
deterministic kind; rather, living things are computers in the broad sense of
computational materials as reported in the rapidly-growing physical computing
literature. We argue that an observer-centered framework for the computations
performed by evolved and designed systems will improve the understanding of
meso-scale events, as it has already done at quantum and relativistic scales.
Here, we review examples of biological and technological polycomputing, and
develop the idea that overloading of different functions on the same hardware
is an important design principle that helps understand and build both evolved
and designed systems. Learning to hack existing polycomputing substrates, as
well as evolve and design new ones, will have massive impacts on regenerative
medicine, robotics, and computer engineering.
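To make the notion of polycomputing concrete, here is a toy Python sketch in the spirit of the frequency-multiplexed examples the paper discusses; the substrate, frequencies, and thresholds below are invented for illustration. A single waveform emitted by one substrate is read by two observers in different frequency bands, and each observer recovers a different Boolean function of the same inputs.

```python
# Toy illustration of "polycomputing": one physical signal, two observers,
# two different logical functions read out of the same substrate at once.
# The substrate, frequencies, and thresholds are invented for illustration;
# they are not taken from the paper.
import numpy as np

FS = 1000                # sampling rate (Hz)
T = 1.0                  # signal duration (s)
F_AND, F_XOR = 50, 120   # frequency bands "overloaded" onto one material

def substrate(a: int, b: int) -> np.ndarray:
    """Emit one waveform whose 50 Hz band carries AND(a,b) and whose
    120 Hz band carries XOR(a,b)."""
    t = np.arange(0, T, 1 / FS)
    return (a & b) * np.sin(2 * np.pi * F_AND * t) + (a ^ b) * np.sin(2 * np.pi * F_XOR * t)

def band_power(signal: np.ndarray, freq: float) -> float:
    """Magnitude the observer measures at the bin nearest `freq`."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
    return float(spectrum[np.argmin(np.abs(freqs - freq))])

# Two observers interrogate the *same* waveform and see different computations.
for a in (0, 1):
    for b in (0, 1):
        wave = substrate(a, b)
        sees_and = band_power(wave, F_AND) > 100   # observer A's threshold
        sees_xor = band_power(wave, F_XOR) > 100   # observer B's threshold
        print(f"a={a} b={b}  observer A reads AND={int(sees_and)}  "
              f"observer B reads XOR={int(sees_xor)}")
```

What is "being computed" here depends entirely on which band an observer chooses to interrogate; that observer-dependence is the sense in which one substrate polycomputes.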
Related papers
- Evolution and learning in differentiable robots [0.0]
We use differentiable simulations to rapidly and simultaneously optimize individual neural control of behavior across a large population of candidate body plans.
Non-differentiable changes to the mechanical structure of each robot in the population were applied by a genetic algorithm in an outer loop of search.
One of the highly differentiable morphologies discovered in simulation was realized as a physical robot and shown to retain its optimized behavior.
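A minimal sketch of the two-loop scheme this summary describes, under invented assumptions (the toy fitness, mutation scheme, and "ideal" morphology of 6 segments are mine; the paper itself trains neural controllers inside a differentiable physics simulator of robot bodies):

```python
# Sketch of "evolution in the outer loop, learning in the inner loop".
import numpy as np

rng = np.random.default_rng(0)

def inner_loop(morphology: int, steps: int = 100, lr: float = 0.1) -> float:
    """Differentiable inner loop: gradient descent on controller weights
    for one fixed (non-differentiable) morphology, returned as a fitness."""
    w = rng.normal(size=morphology)                       # controller weights
    target = np.sin(np.linspace(0.0, np.pi, morphology))  # toy "good gait" for this body
    for _ in range(steps):
        w -= lr * 2.0 * (w - target)        # analytic gradient of ||w - target||^2
    # Invented fitness: controller quality minus a penalty for bodies far
    # from a hypothetical ideal of 6 segments.
    return -float(np.sum((w - target) ** 2)) - 0.1 * abs(morphology - 6)

def mutate(morphology: int) -> int:
    """Non-differentiable change applied by the genetic algorithm."""
    return int(max(2, morphology + rng.integers(-1, 2)))

# Outer loop: a genetic algorithm over body plans, each evaluated after its
# controller has been tuned by the differentiable inner loop.
population = [int(m) for m in rng.integers(2, 10, size=8)]
for generation in range(5):
    scored = sorted(((inner_loop(m), m) for m in population), reverse=True)
    survivors = [m for _, m in scored[: len(scored) // 2]]
    population = survivors + [mutate(m) for m in survivors]
    print(f"gen {generation}: best fitness {scored[0][0]:.3f} with {scored[0][1]} segments")
```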
arXiv Detail & Related papers (2024-05-23T15:45:43Z)
- A Review of Neuroscience-Inspired Machine Learning [58.72729525961739]
Bio-plausible credit assignment is compatible with practically any learning condition and is energy-efficient.
In this paper, we survey several vital algorithms that model bio-plausible rules of credit assignment in artificial neural networks.
We conclude by discussing the future challenges that will need to be addressed in order to make such algorithms more useful in practical applications.
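As one concrete, deliberately simplified example of the kind of local rule such surveys cover (the toy task and constants below are my own, not the survey's), a single unit can be trained with a perturbation-based, three-factor update that uses only locally available signals instead of backpropagated errors:

```python
# Sketch of a bio-plausible, local credit-assignment rule (a three-factor,
# perturbation-based update), shown as a contrast to backpropagation.
# The toy regression task and constants are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

w_true = np.array([1.5, -2.0, 0.5])   # weights the "neuron" should learn
w = np.zeros(3)
lr, sigma = 0.05, 0.1

for step in range(2000):
    x = rng.normal(size=3)                    # presynaptic activity
    y_target = float(w_true @ x)

    xi = rng.normal(scale=sigma)              # perturbation of the output unit
    r_clean = -(float(w @ x) - y_target) ** 2             # reward without perturbation
    r_perturbed = -(float(w @ x) + xi - y_target) ** 2    # reward with perturbation

    # Three locally available factors: presynaptic x, the perturbation xi,
    # and the change in a global scalar reward. In expectation this follows
    # the gradient, yet no error is propagated backward through the network.
    w += lr * (r_perturbed - r_clean) * xi * x / sigma**2

print("learned:", np.round(w, 2), " target:", w_true)
```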
arXiv Detail & Related papers (2024-02-16T18:05:09Z)
- Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
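A minimal sketch of the mechanism, assuming a linear single-layer generative model (dimensions, rates, and data are invented for illustration): a latent estimate is relaxed against the prediction error, and the generative weights are then updated with the residual error.

```python
# Minimal predictive-coding sketch: infer a latent cause by relaxing it
# against the prediction error, then update the generative weights with the
# residual error. Dimensions, rates, and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

latent_dim, obs_dim = 3, 5
W = rng.normal(scale=0.1, size=(obs_dim, latent_dim))   # learned generative weights
W_true = rng.normal(size=(obs_dim, latent_dim))         # data-generating weights

for step in range(500):
    x = W_true @ rng.normal(size=latent_dim)            # observation

    # Inference ("relaxation") phase: settle the latent estimate by descending
    # the squared prediction error; only local error signals are used.
    z = np.zeros(latent_dim)
    for _ in range(100):
        error = x - W @ z                                # bottom-up prediction error
        z += 0.05 * (W.T @ error)                        # error drives the latent state

    # Learning phase: Hebbian-style update of the generative weights using the
    # residual error and the settled latent state.
    W += 0.01 * np.outer(x - W @ z, z)

print("final prediction error:", round(float(np.mean((x - W @ z) ** 2)), 4))
```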
arXiv Detail & Related papers (2023-08-15T16:37:16Z)
- Machine Psychology [54.287802134327485]
We argue that a fruitful direction for research is engaging large language models in behavioral experiments inspired by psychology.
We highlight theoretical perspectives, experimental paradigms, and computational analysis techniques that this approach brings to the table.
It paves the way for a "machine psychology" for generative artificial intelligence (AI) that goes beyond performance benchmarks.
arXiv Detail & Related papers (2023-03-24T13:24:41Z)
- Neurocompositional computing: From the Central Paradox of Cognition to a new generation of AI systems [120.297940190903]
Recent progress in AI has resulted from the use of limited forms of neurocompositional computing.
New, deeper forms of neurocompositional computing create AI systems that are more robust, accurate, and comprehensible.
arXiv Detail & Related papers (2022-05-02T18:00:10Z)
- The Introspective Agent: Interdependence of Strategy, Physiology, and Sensing for Embodied Agents [51.94554095091305]
We argue for an introspective agent, which considers its own abilities in the context of its environment.
Just as in nature, we hope to reframe strategy as one tool, among many, to succeed in an environment.
arXiv Detail & Related papers (2022-01-02T20:14:01Z)
- Inductive Biases for Deep Learning of Higher-Level Cognition [108.89281493851358]
A fascinating hypothesis is that human and animal intelligence could be explained by a few principles.
This work considers a larger list of such principles, focusing on those that mostly concern higher-level and sequential conscious processing.
The objective in clarifying these particular principles is that they could potentially help us build AI systems that benefit from humans' abilities.
arXiv Detail & Related papers (2020-11-30T18:29:25Z)
- On the spatiotemporal behavior in biology-mimicking computing systems [0.0]
The payload performance of conventional computing systems, from single processors to supercomputers, has reached the limits that nature enables.
Both the growing demand to cope with "big data" (based on, or assisted by, artificial intelligence) and the interest in understanding the operation of our brain more completely have stimulated efforts to build biology-mimicking computing systems.
These systems require an unusually large number of processors, which introduces performance limitations and nonlinear scaling.
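The nonlinear-scaling point can be illustrated with a simple Amdahl-style toy model (my own constants and cost terms, not the paper's): once a serial fraction and a per-processor coordination cost are included, payload speedup saturates and then degrades as processors are added.

```python
# Toy Amdahl-style model of why adding processors yields diminishing and
# eventually negative returns once a serial fraction and a per-processor
# coordination cost are included. The constants are illustrative only.
serial_fraction = 0.01      # fraction of work that cannot be parallelized
coordination_cost = 1e-5    # extra relative overhead added per processor

def payload_speedup(n_processors: int) -> float:
    parallel_time = (1 - serial_fraction) / n_processors
    overhead = coordination_cost * n_processors
    return 1.0 / (serial_fraction + parallel_time + overhead)

for n in (1, 10, 100, 1_000, 10_000, 100_000):
    print(f"{n:>7} processors -> speedup {payload_speedup(n):8.1f}")
```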
arXiv Detail & Related papers (2020-09-18T13:53:58Z)
- On Artificial Life and Emergent Computation in Physical Substrates [0.0]
We argue that the lens of artificial life offers valuable perspectives for the advancement of high-performance computing.
Two specific substrates are discussed in detail: biological neurons and ensembles of nanomagnets.
We conclude with a philosophical discussion on what we can learn from approaching computation with the curiosity inherent to the study of artificial life.
arXiv Detail & Related papers (2020-09-09T18:59:53Z)
- Do we know the operating principles of our computers better than those of our brain? [0.0]
The paper discusses how conventional computing principles, components, and ways of thinking about computing limit the mimicking of biological systems.
We describe what changes will be necessary in computing paradigms to get closer to the marvelously efficient operation of biological neural networks.
arXiv Detail & Related papers (2020-05-06T20:41:23Z)
- Quantum computing using continuous-time evolution [0.0]
Digital silicon computers have reached their limits in terms of speed.
Quantum computing exploits the coherence and superposition of quantum systems.
Early quantum computers will be small, reminiscent of the early days of digital silicon computing.
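As a generic illustration of what continuous-time evolution means computationally (a textbook two-level example, not a specific device from the paper), the state evolves under a Hamiltonian as |psi(t)> = exp(-iHt)|psi(0)> rather than through a discrete gate sequence:

```python
# Continuous-time quantum evolution of a single qubit under a fixed
# Hamiltonian: |psi(t)> = exp(-i H t) |psi(0)>. A generic textbook example,
# not a device from the paper; hbar is set to 1.
import numpy as np
from scipy.linalg import expm

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * sigma_x                        # drive that rotates |0> toward |1>
psi0 = np.array([1, 0], dtype=complex)   # start in |0>

for t in np.linspace(0, np.pi, 5):
    psi_t = expm(-1j * H * t) @ psi0
    p1 = abs(psi_t[1]) ** 2              # probability of measuring |1>
    print(f"t = {t:4.2f}  P(|1>) = {p1:.3f}")
```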
arXiv Detail & Related papers (2020-04-01T20:58:15Z)