Long-term excitation energy transfer predicted by a modified convolutional neural networks in the FMO complexes
- URL: http://arxiv.org/abs/2503.17430v3
- Date: Thu, 24 Apr 2025 04:26:48 GMT
- Title: Long-term excitation energy transfer predicted by a modified convolutional neural networks in the FMO complexes
- Authors: Yi-Meng Huang, Zi-Ran Zhao, Shun-Cai Zhao
- Abstract summary: We propose an efficient CNN scheme incorporating novel redundant time-functions to predict 100 picosecond (ps) excitation energy transfer (EET) in Fenna-Matthews-Olson complexes. This method simplifies optimization, enhances learning efficiency, and demonstrates the accuracy, robustness, and efficiency of our approach in predicting quantum dissipative dynamics.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In machine learning (ML), the risk of recursive strategies overfitting historical data has driven the development of convolutional neural networks (CNNs) for simulating quantum dissipative dynamics. In this work, we propose an efficient CNN scheme incorporating novel redundant time-functions to predict 100 picosecond (ps) excitation energy transfer (EET) in Fenna-Matthews-Olson (FMO) complexes, in which the original time $t$ is normalized by mapping it to the [0, 1] range, allowing different functions to focus on distinct time intervals and thereby effectively capturing the multi-timescale characteristics of EET dynamics. This method simplifies optimization, enhances learning efficiency, and demonstrates the accuracy, robustness, and efficiency of our approach in predicting quantum dissipative dynamics.
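The abstract describes normalizing time to [0, 1] and expanding it into several redundant time-functions, each emphasizing a different sub-interval of the dynamics. A minimal sketch of that feature construction is shown below; the specific functional forms (exponentials with varied decay rates) are an illustrative assumption, since the abstract does not give the exact basis used.

```python
import numpy as np

def redundant_time_features(t, t_max, n_funcs=4):
    """Map raw time t onto [0, 1] and expand it into several 'redundant'
    time-functions, each most sensitive to a different sub-interval of the
    dynamics (hypothetical choice of basis, not taken from the paper)."""
    s = np.asarray(t, dtype=float) / t_max  # normalize time to [0, 1]
    # Exponentials with increasing decay rates emphasize progressively
    # earlier portions of the trajectory, covering multiple timescales.
    rates = np.logspace(0, np.log10(50), n_funcs)
    return np.stack([np.exp(-r * s) for r in rates], axis=-1)

# Example: feature channels for a 100 ps trajectory sampled every 1 ps
t = np.arange(0.0, 101.0)
feats = redundant_time_features(t, t_max=100.0)
print(feats.shape)  # (101, 4)
```

Stacked along the last axis, these channels could serve as auxiliary CNN inputs alongside the dynamical variables, letting different filters specialize on distinct time intervals.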
Related papers
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Enhancing Open Quantum Dynamics Simulations Using Neural Network-Based Non-Markovian Stochastic Schrödinger Equation Method [2.9413085575648235]
We propose a scheme that combines neural network techniques with simulations of the non-Markovian stochastic Schrödinger equation.
This approach significantly reduces the number of trajectories required for long-time simulations, particularly at low temperatures.
arXiv Detail & Related papers (2024-11-24T16:57:07Z)
- A short trajectory is all you need: A transformer-based model for long-time dissipative quantum dynamics [0.0]
We show that a deep artificial neural network can predict the long-time population dynamics of a quantum system coupled to a dissipative environment.
Our model is more accurate than classical forecasting models, such as recurrent neural networks.
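Predicting long-time dynamics from a short trajectory typically means rolling the model forward on its own outputs. A generic autoregressive sketch of that idea is below; the `rollout` helper and the toy stand-in model are hypothetical, and the paper's actual transformer architecture is not represented here.

```python
import numpy as np

def rollout(model, seed_traj, n_steps, window):
    """Extend a short seed trajectory far beyond its length by feeding
    the model's own predictions back as input (generic autoregressive
    forecasting scheme)."""
    traj = list(seed_traj)
    for _ in range(n_steps):
        context = np.asarray(traj[-window:])  # most recent window of values
        traj.append(model(context))           # predict the next value
    return np.asarray(traj)

# Toy 'model' standing in for a trained network: a damped continuation
toy = lambda ctx: 0.99 * ctx[-1]
ext = rollout(toy, [1.0, 0.99, 0.98], n_steps=5, window=3)
print(ext.shape)  # (8,)
```

The accumulated-error risk of such recursion is precisely what motivates the non-recursive CNN scheme of the main paper above.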
arXiv Detail & Related papers (2024-09-17T16:17:52Z)
- Neuroevolving Electronic Dynamical Networks [0.0]
Neuroevolution is a method of applying an evolutionary algorithm to refine the performance of artificial neural networks through natural selection.
Fitness evaluation of continuous time recurrent neural networks (CTRNNs) can be time-consuming and computationally expensive.
Field programmable gate arrays (FPGAs) have emerged as an increasingly popular solution, due to their high performance and low power consumption.
arXiv Detail & Related papers (2024-04-06T10:54:35Z)
- EMN: Brain-inspired Elastic Memory Network for Quick Domain Adaptive Feature Mapping [57.197694698750404]
We propose a novel gradient-free Elastic Memory Network (EMN) to support quick fine-tuning of the mapping between features and predictions.
EMN adopts randomly connected neurons to memorize the association of features and labels, where signals in the network are propagated as impulses.
EMN achieves up to a 10% performance improvement while requiring less than 1% of the timing cost of traditional domain adaptation methods.
arXiv Detail & Related papers (2024-02-04T09:58:17Z)
- Automatic Evolution of Machine-Learning based Quantum Dynamics with Uncertainty Analysis [4.629634111796585]
The long short-term memory recurrent neural network (LSTM-RNN) models are used to simulate the long-time quantum dynamics.
This work builds an effective machine learning approach to simulate the dynamics evolution of open quantum systems.
arXiv Detail & Related papers (2022-05-07T08:53:55Z)
- Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experiment results show a mean error azimuth of 13 degrees, which surpasses the accuracy of the other biologically plausible neuromorphic approach for sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
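The idea of building networks from linear first-order dynamical systems with input-modulated time constants can be sketched with a single explicit-Euler update. Everything below (the function name, the `tanh` gate, the parameter shapes) is an illustrative assumption in the spirit of liquid time-constant cells, not the paper's exact formulation.

```python
import numpy as np

def ltc_step(x, I, W, tau, A, dt=0.01):
    """One explicit-Euler step of a liquid-time-constant-style cell:
    a linear first-order system -x/tau whose effective time constant
    is modulated by a nonlinearity f of the state and input."""
    f = np.tanh(W @ np.concatenate([x, I]))  # state- and input-dependent gate
    dx = -(1.0 / tau + f) * x + f * A        # modulated first-order dynamics
    return x + dt * dx

# Drive a 2-unit cell with a constant input; the state stays bounded
# because the effective decay rate 1/tau + f remains non-negative here.
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))
x, I = np.zeros(2), np.ones(1)
tau, A = np.ones(2), np.ones(2)
for _ in range(500):
    x = ltc_step(x, I, W, tau, A)
print(x.shape)  # (2,)
```

With `tau = 1` and `f` bounded in [-1, 1], the decay coefficient stays in [0, 2], which is what gives the stable, bounded trajectories the summary describes.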
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.