A Brain-inspired Memory Transformation based Differentiable Neural
Computer for Reasoning-based Question Answering
- URL: http://arxiv.org/abs/2301.02809v1
- Date: Sat, 7 Jan 2023 08:39:57 GMT
- Authors: Yao Liang, Hongjian Fang, Yi Zeng and Feifei Zhao
- Abstract summary: Reasoning and question answering, basic cognitive functions for humans, are a great challenge for current artificial intelligence.
Motivated by the learning and memory mechanisms of the brain, this paper proposes a Memory Transformation based Differentiable Neural Computer (MT-DNC) model.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reasoning and question answering, as basic cognitive functions for
humans, are nevertheless a great challenge for current artificial intelligence.
Although the Differentiable Neural Computer (DNC) model can solve such problems
to a certain extent, its development is still limited by high algorithmic
complexity, slow convergence, and poor test robustness. Inspired by the
learning and memory mechanisms of the brain, this paper proposes a Memory
Transformation based Differentiable Neural Computer (MT-DNC) model. MT-DNC
incorporates working memory and long-term memory into the DNC and realizes the
autonomous transformation of acquired experience between the two, thereby
helping to effectively extract acquired knowledge and improve reasoning
ability. Experimental results on the bAbI question answering task demonstrate
that the proposed method achieves superior performance and faster convergence
than other existing DNN and DNC models. Ablation studies also indicate that the
memory transformation from working memory to long-term memory plays an
essential role in improving the robustness and stability of reasoning. This
work explores how brain-inspired memory transformation can be integrated into
and applied to complex intelligent dialogue and reasoning systems.
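The abstract's core idea, transferring frequently used content from a working memory into a long-term memory, can be illustrated with a toy sketch. The paper does not publish its update equations here, so the function below is an illustrative assumption: usage-weighted consolidation of working-memory slots into a long-term store, with decay of the consolidated slots. All names (`consolidate`, `usage`, `threshold`, `rate`) are hypothetical.

```python
import numpy as np

def consolidate(working_mem, long_term_mem, usage, threshold=0.5, rate=0.1):
    """Toy memory transformation (illustrative, not the paper's equations):
    slots of working memory whose usage exceeds a threshold are blended into
    long-term memory, then decayed in working memory."""
    transfer = usage > threshold  # boolean mask: which slots to consolidate
    # usage-weighted write of selected slots into long-term memory
    long_term_mem = long_term_mem + rate * usage[:, None] * working_mem * transfer[:, None]
    # decay consolidated slots so working memory stays free for new experience
    working_mem = working_mem * np.where(transfer[:, None], 1.0 - rate, 1.0)
    return working_mem, long_term_mem

# toy example: 4 memory slots, 3-dimensional slot content
wm = np.ones((4, 3))
ltm = np.zeros((4, 3))
usage = np.array([0.9, 0.2, 0.7, 0.1])
wm2, ltm2 = consolidate(wm, ltm, usage)  # slots 0 and 2 are consolidated
```

In a trainable model this transfer would be a differentiable gating operation learned end-to-end; the hard threshold here is only for readability.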
Related papers
- BrainODE: Dynamic Brain Signal Analysis via Graph-Aided Neural Ordinary Differential Equations [67.79256149583108]
We propose a novel model called BrainODE to achieve continuous modeling of dynamic brain signals.
By learning latent initial values and neural ODE functions from irregular time series, BrainODE effectively reconstructs brain signals at any time point.
arXiv Detail & Related papers (2024-04-30T10:53:30Z)
- MindBridge: A Cross-Subject Brain Decoding Framework [60.58552697067837]
Brain decoding aims to reconstruct stimuli from acquired brain signals.
Currently, brain decoding is confined to a per-subject-per-model paradigm.
We present MindBridge, which achieves cross-subject brain decoding with only one model.
arXiv Detail & Related papers (2024-04-11T15:46:42Z)
- Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
Current AIGC methods, such as score-based diffusion, are still deficient in speed and efficiency.
We propose a time-continuous and analog in-memory neural differential equation solver for score-based diffusion.
We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
arXiv Detail & Related papers (2024-04-08T16:34:35Z)
- A differentiable brain simulator bridging brain simulation and brain-inspired computing [3.5874544981360987]
Brain simulation builds dynamical models to mimic the structure and functions of the brain.
Brain-inspired computing develops intelligent systems by learning from the structure and functions of the brain.
BrainPy is a differentiable brain simulator developed using JAX and XLA.
arXiv Detail & Related papers (2023-11-09T02:47:38Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Explainable fMRI-based Brain Decoding via Spatial Temporal-pyramid Graph Convolutional Network [0.8399688944263843]
Existing machine learning methods for fMRI-based brain decoding either suffer from low classification performance or poor explainability.
We propose a biologically inspired architecture, Spatial Temporal-pyramid Graph Convolutional Network (STpGCN), to capture the spatial-temporal graph representation of functional brain activities.
We conduct extensive experiments on fMRI data under 23 cognitive tasks from the Human Connectome Project (HCP) S1200 release.
arXiv Detail & Related papers (2022-10-08T12:14:33Z)
- A bio-inspired implementation of a sparse-learning spike-based hippocampus memory model [0.0]
We propose a novel bio-inspired memory model based on the hippocampus.
It can learn memories, recall them from a cue and even forget memories when trying to learn others with the same cue.
This work presents the first hardware implementation of a fully functional bio-inspired spike-based hippocampus memory model.
arXiv Detail & Related papers (2022-06-10T07:48:29Z)
- CogNGen: Constructing the Kernel of a Hyperdimensional Predictive Processing Cognitive Architecture [79.07468367923619]
We present a new cognitive architecture that combines two neurobiologically plausible, computational models.
We aim to develop a cognitive architecture that has the power of modern machine learning techniques.
arXiv Detail & Related papers (2022-03-31T04:44:28Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Memory and attention in deep learning [19.70919701635945]
Memory construction for machines is inevitable.
Recent progress on modeling memory in deep learning has revolved around external memory constructions.
The aim of this thesis is to advance the understanding of memory and attention in deep learning.
arXiv Detail & Related papers (2021-07-03T09:21:13Z)
- A Neural Dynamic Model based on Activation Diffusion and a Micro-Explanation for Cognitive Operations [4.416484585765028]
The neural mechanism of memory has a very close relation with the problem of representation in artificial intelligence.
A computational model is proposed to simulate the network of neurons in the brain and how they process information.
arXiv Detail & Related papers (2020-11-27T01:34:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.