Biological Processing Units: Leveraging an Insect Connectome to Pioneer Biofidelic Neural Architectures
- URL: http://arxiv.org/abs/2507.10951v1
- Date: Tue, 15 Jul 2025 03:31:57 GMT
- Title: Biological Processing Units: Leveraging an Insect Connectome to Pioneer Biofidelic Neural Architectures
- Authors: Siyu Yu, Zihan Qin, Tingshan Liu, Beiya Xu, R. Jacob Vogelstein, Jason Brown, Joshua T. Vogelstein
- Abstract summary: The complete connectome of the Drosophila larva brain offers a unique opportunity to investigate whether biologically evolved circuits can support artificial intelligence. We convert this wiring diagram into a Biological Processing Unit (BPU), a fixed network derived directly from synaptic connectivity. Despite its modest size (3,000 neurons and 65,000 weights between them), the unmodified BPU achieves 98% accuracy on MNIST and 58% on CIFAR-10, surpassing size-matched MLPs. These results demonstrate the potential of biofidelic neural architectures to support complex cognitive tasks and motivate scaling to larger and more intelligent connectomes in future work.
- Score: 5.459524614513537
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The complete connectome of the Drosophila larva brain offers a unique opportunity to investigate whether biologically evolved circuits can support artificial intelligence. We convert this wiring diagram into a Biological Processing Unit (BPU), a fixed recurrent network derived directly from synaptic connectivity. Despite its modest size (3,000 neurons and 65,000 weights between them), the unmodified BPU achieves 98% accuracy on MNIST and 58% on CIFAR-10, surpassing size-matched MLPs. Scaling the BPU via structured connectome expansions further improves CIFAR-10 performance, while modality-specific ablations reveal the uneven contributions of different sensory subsystems. On the ChessBench dataset, a lightweight GNN-BPU model trained on only 10,000 games achieves 60% move accuracy, nearly 10x better than Transformers of any size. Moreover, CNN-BPU models with ~2M parameters outperform parameter-matched Transformers, and with a depth-6 minimax search at inference, reach 91.7% accuracy, exceeding even a 9M-parameter Transformer baseline. These results demonstrate the potential of biofidelic neural architectures to support complex cognitive tasks and motivate scaling to larger and more intelligent connectomes in future work.
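The abstract reports results but no implementation details. As a rough, non-authoritative sketch of the reservoir-style setup it describes (a frozen connectome-derived recurrent core with trained input and output layers), the following PyTorch snippet may help; the sparse random stand-in matrix, the tanh dynamics, the number of unrolled steps, and the trainable input projection are all illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a BPU-style classifier, assuming the connectome arrives as
# a fixed signed weight matrix W. The paper derives W from Drosophila larva
# synaptic connectivity; here a sparse random matrix stands in for it.
import torch
import torch.nn as nn

class BPUClassifier(nn.Module):
    def __init__(self, connectome, in_dim=784, n_classes=10, steps=5):
        super().__init__()
        n = connectome.shape[0]
        # Buffer, not Parameter: the recurrent wiring stays frozen during training.
        self.register_buffer("W", connectome)
        self.steps = steps  # number of unrolled recurrent steps (assumption)
        self.encode = nn.Linear(in_dim, n)      # trainable input projection
        self.readout = nn.Linear(n, n_classes)  # trainable linear readout

    def forward(self, x):
        u = self.encode(x)                      # inject the stimulus
        h = torch.tanh(u)
        for _ in range(self.steps):             # fixed connectome dynamics
            h = torch.tanh(h @ self.W.T + u)
        return self.readout(h)

# Stand-in wiring: ~3,000 neurons at ~0.7% density gives roughly the
# 65,000 nonzero weights quoted in the abstract.
n = 3000
W = torch.randn(n, n) * (torch.rand(n, n) < 0.007).float()
model = BPUClassifier(W)
logits = model(torch.randn(8, 784))  # batch of flattened MNIST-like images
print(logits.shape)                  # torch.Size([8, 10])
```

For the chess experiments, the abstract states only that GNN- or CNN-based front ends feed the BPU and that a depth-6 minimax search is applied at inference; any concrete pairing of those pieces would be a further assumption beyond this sketch.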
Related papers
- BrainSymphony: A Transformer-Driven Fusion of fMRI Time Series and Structural Connectivity [2.3486335708866606]
BrainSymphony is a lightweight, parameter-efficient foundation model for neuroimaging. It achieves state-of-the-art performance while being pre-trained on significantly smaller public datasets. BrainSymphony establishes that architecturally-aware, multimodal models can surpass their larger counterparts.
arXiv Detail & Related papers (2025-06-23T06:00:21Z) - CEReBrO: Compact Encoder for Representations of Brain Oscillations Using Efficient Alternating Attention [53.539020807256904]
We introduce a Compact Encoder for Representations of Brain Oscillations using alternating attention (CEReBrO). Our tokenization scheme represents EEG signals as per-channel patches. We propose an alternating attention mechanism that jointly models intra-channel temporal dynamics and inter-channel spatial correlations, achieving a 2x speed improvement with 6x less memory required compared to standard self-attention.
arXiv Detail & Related papers (2025-01-18T21:44:38Z) - Advancing the Biological Plausibility and Efficacy of Hebbian Convolutional Neural Networks [0.0]
The research presented in this paper advances the integration of Hebbian learning into Convolutional Neural Networks (CNNs) for image processing. Hebbian learning operates on local, unsupervised neural information to form feature representations. Results showed clear indications of sparse hierarchical learning through increasingly complex receptive fields.
arXiv Detail & Related papers (2025-01-06T12:29:37Z) - Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
Neuromorphic computing uses spiking neural networks (SNNs) to perform inference tasks. Embedding a small payload within each spike exchanged between spiking neurons can enhance inference accuracy without increasing energy consumption. Split computing - where an SNN is partitioned across two devices - is a promising solution. This paper presents the first comprehensive study of a neuromorphic wireless split computing architecture that employs multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z) - Application of Quantum Tensor Networks for Protein Classification [3.5300092061072523]
We show that protein sequences can be thought of as sentences in natural language processing.
We classify proteins based on their subcellular locations.
We demonstrate that Quantum Tensor Networks (QTN) can effectively handle the complexity and diversity of protein sequences.
arXiv Detail & Related papers (2024-03-11T16:47:09Z) - Mem-elements based Neuromorphic Hardware for Neural Network Application [0.0]
The thesis investigates the utilization of memristive and memcapacitive crossbar arrays in low-power machine learning accelerators, offering a comprehensive co-design framework for deep neural networks (DNNs).
The model, implemented through a hybrid Python and PyTorch approach, accounts for various non-idealities, achieving exceptional training accuracies of 90.02% and 91.03% for the CIFAR-10 dataset with memristive and memcapacitive crossbar arrays on an 8-layer VGG network.
arXiv Detail & Related papers (2024-03-05T14:28:40Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Spikformer: When Spiking Neural Network Meets Transformer [102.91330530210037]
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self Attention (SSA) mechanism as well as a powerful framework, named Spiking Transformer (Spikformer).
arXiv Detail & Related papers (2022-09-29T14:16:49Z) - Differentiable Agent-based Epidemiology [71.81552021144589]
We introduce GradABM: a scalable, differentiable design for agent-based modeling that is amenable to gradient-based learning with automatic differentiation.
GradABM can quickly simulate million-size populations in a few seconds on commodity hardware, integrate with deep neural networks, and ingest heterogeneous data sources.
arXiv Detail & Related papers (2022-07-20T07:32:02Z) - Bioformers: Embedding Transformers for Ultra-Low Power sEMG-based Gesture Recognition [21.486555297061717]
Human-machine interaction is gaining traction in rehabilitation tasks, such as controlling prosthetic hands or robotic arms.
Gesture recognition exploiting surface electromyographic (sEMG) signals is one of the most promising approaches.
However, the analysis of these signals still presents many challenges since similar gestures result in similar muscle contractions.
arXiv Detail & Related papers (2022-03-24T08:37:26Z) - A Battle of Network Structures: An Empirical Study of CNN, Transformer, and MLP [121.35904748477421]
Convolutional neural networks (CNNs) are the dominant deep neural network (DNN) architecture for computer vision.
Transformer and multi-layer perceptron (MLP)-based models, such as the Vision Transformer and MLP-Mixer, have started to lead new trends.
In this paper, we conduct empirical studies on these DNN structures and try to understand their respective pros and cons.
arXiv Detail & Related papers (2021-08-30T06:09:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.