Symmetry Adapted Residual Neural Network Diabatization: Conical Intersections in Aniline Photodissociation
- URL: http://arxiv.org/abs/2411.01702v1
- Date: Sun, 03 Nov 2024 21:56:25 GMT
- Authors: Yifan Shen, David Yarkony
- Abstract summary: We present a symmetry adapted residual neural network (SAResNet) diabatization method to construct quasi-diabatic Hamiltonians.
Our SAResNet is applied to construct the full 36-dimensional coupled diabatic potential energy surfaces for aniline N-H bond photodissociation.
- Abstract: We present a symmetry adapted residual neural network (SAResNet) diabatization method to construct quasi-diabatic Hamiltonians that accurately represent ab initio adiabatic energies, energy gradients, and nonadiabatic couplings for moderate-sized systems. Our symmetry adapted neural network inherits from the pioneering symmetry adapted polynomial and fundamental invariant neural network diabatization methods, exploiting the power of neural networks along with the transparent symmetry adaptation of polynomials for both symmetric and asymmetric irreducible representations. In addition, our symmetry adaptation provides a unified framework for symmetry adapted polynomials and symmetry adapted neural networks, enabling the adoption of the residual neural network architecture, a powerful descendant of the pioneering feedforward neural network. Our SAResNet is applied to construct the full 36-dimensional coupled diabatic potential energy surfaces for aniline N-H bond photodissociation, with 2,269 data points, 32,640 trainable parameters, and a root mean square deviation in energy of 190 cm^-1. In addition to the experimentally observed ππ* and πRydberg/πσ* states, a higher state (HOMO-1 π to Rydberg/σ* excitation) is found to introduce an induced geometric phase effect and thus to participate indirectly in the photodissociation process.
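For readers who want the construction made concrete, below is a minimal sketch of residual-network diabatization for a hypothetical two-state model: a small residual network maps internal coordinates to the independent entries of a symmetric diabatic matrix W(q), and the adiabatic energies entering the fit are the eigenvalues of W. All shapes, widths, and the energy-only loss are illustrative assumptions; the authors' SAResNet additionally fits energy gradients and nonadiabatic couplings and builds in symmetry adaptation.

```python
# Minimal residual-network diabatization sketch for a hypothetical 2-state model.
# Not the authors' SAResNet: no symmetry adaptation, energies only.
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self, width):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, width))
    def forward(self, x):
        return x + self.net(x)                  # residual (skip) connection

class DiabaticModel(nn.Module):
    """Map internal coordinates q to a symmetric 2x2 diabatic matrix W(q)."""
    def __init__(self, n_coord, width=64, depth=3):
        super().__init__()
        self.inp = nn.Linear(n_coord, width)
        self.blocks = nn.Sequential(*[ResBlock(width) for _ in range(depth)])
        self.out = nn.Linear(width, 3)          # W11, W22, W12 (= W21)
    def forward(self, q):
        h = self.out(self.blocks(torch.tanh(self.inp(q))))
        w11, w22, w12 = h[:, 0], h[:, 1], h[:, 2]
        return torch.stack([torch.stack([w11, w12], -1),
                            torch.stack([w12, w22], -1)], -2)

model = DiabaticModel(n_coord=36)               # 36 internal coordinates, as in aniline
q = torch.randn(8, 36)                          # batch of geometries
E_ref = torch.sort(torch.randn(8, 2), -1).values  # stand-in ab initio energies
E_adiab = torch.linalg.eigvalsh(model(q))       # adiabatic energies = eigenvalues of W
loss = ((E_adiab - E_ref) ** 2).mean()
loss.backward()
```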
Related papers
- The Empirical Impact of Neural Parameter Symmetries, or Lack Thereof [50.49582712378289]
We investigate the impact of neural parameter symmetries by introducing new neural network architectures.
We develop two methods, with some provable guarantees, for modifying standard neural networks to reduce parameter-space symmetries.
Our experiments reveal several interesting observations on the empirical impact of parameter symmetries.
arXiv Detail & Related papers (2024-05-30T16:32:31Z)
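As a quick illustration of the parameter symmetries this paper studies (not its symmetry-reducing architectures), the toy check below shows that permuting the hidden units of an MLP, together with the matching rows and columns of its weights, leaves the network function unchanged:

```python
# A parameter-space symmetry: permuting hidden units (rows of W1, entries of
# b1, columns of W2) yields different weights but the identical function.
import torch

torch.manual_seed(0)
W1, b1 = torch.randn(16, 4), torch.randn(16)
W2, b2 = torch.randn(3, 16), torch.randn(3)

def mlp(x, W1, b1, W2, b2):
    return torch.tanh(x @ W1.T + b1) @ W2.T + b2

perm = torch.randperm(16)
x = torch.randn(5, 4)
y1 = mlp(x, W1, b1, W2, b2)
y2 = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)
print(torch.allclose(y1, y2, atol=1e-6))   # True: same function, different weights
```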
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
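A minimal sketch of the equivariance idea behind LENNs, reduced to a D1Q3 toy lattice whose only symmetry is a reflection of the velocity set; the surrogate collision operator is made equivariant by group averaging. The architecture and lattice are illustrative assumptions, not the paper's LENN layers:

```python
# Toy equivariant surrogate for a lattice collision operator (D1Q3).
# Equivariance under the lattice reflection {identity, mirror} is enforced
# by group averaging; a simplified stand-in for the paper's construction.
import torch
import torch.nn as nn

mirror = torch.tensor([2, 1, 0])       # reflection permutes velocities (-1, 0, +1)

class EquivariantCollision(nn.Module):
    def __init__(self, q=3, width=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(q, width), nn.Tanh(), nn.Linear(width, q))
    def forward(self, f):
        # group average: (phi(f) + mirror^{-1} phi(mirror f)) / 2
        return 0.5 * (self.net(f) + self.net(f[:, mirror])[:, mirror])

op = EquivariantCollision()
f = torch.rand(4, 3)                   # batch of distribution vectors
out1 = op(f)[:, mirror]
out2 = op(f[:, mirror])
print(torch.allclose(out1, out2, atol=1e-6))   # equivariance check: True
```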
- Lie Point Symmetry and Physics Informed Networks [59.56218517113066]
We propose a loss function that informs the network about Lie point symmetries in the same way that PINN models try to enforce the underlying PDE through a loss function.
Our symmetry loss ensures that the infinitesimal generators of the Lie group conserve the PDE solutions.
Empirical evaluations indicate that the inductive bias introduced by the Lie point symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
arXiv Detail & Related papers (2023-11-07T19:07:16Z)
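A simplified sketch of a PINN symmetry loss for the heat equation u_t = u_xx, using only the space-translation generator d/dx: if u solves the PDE everywhere, the residual must also vanish along the symmetry direction. The network sizes and the 0.1 weighting are assumptions, and the paper's general Lie prolongation machinery is not reproduced:

```python
# PINN for u_t = u_xx with an extra symmetry loss: penalize the x-derivative
# of the PDE residual (infinitesimal space-translation symmetry).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

def grad(out, inp):
    return torch.autograd.grad(out, inp, torch.ones_like(out), create_graph=True)[0]

xt = torch.rand(128, 2, requires_grad=True)       # collocation points (x, t)
u = net(xt)
d = grad(u, xt)
u_x, u_t = d[:, :1], d[:, 1:]
u_xx = grad(u_x, xt)[:, :1]
residual = u_t - u_xx                             # heat-equation residual
pde_loss = (residual ** 2).mean()
sym_loss = (grad(residual, xt)[:, :1] ** 2).mean()  # residual_x = 0 under translation
loss = pde_loss + 0.1 * sym_loss                  # hypothetical weighting
loss.backward()
```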
- NeuRBF: A Neural Fields Representation with Adaptive Radial Basis Functions [93.02515761070201]
We present a novel type of neural fields that uses general radial bases for signal representation.
Our method builds upon general radial bases with flexible kernel position and shape, which have higher spatial adaptivity and can more closely fit target signals.
When applied to neural radiance field reconstruction, our method achieves state-of-the-art rendering quality, with small model size and comparable training speed.
arXiv Detail & Related papers (2023-09-27T06:32:05Z)
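A toy neural field with adaptive radial bases, fitting a 1D signal with learnable kernel centers and per-kernel widths; a stand-in for the general radial bases in NeuRBF, not the paper's implementation:

```python
# 1D neural field built from radial basis functions with learnable
# (adaptive) centers and widths, fit to a target signal by gradient descent.
import torch
import torch.nn as nn

class RBFField(nn.Module):
    def __init__(self, n_kernels=32):
        super().__init__()
        self.centers = nn.Parameter(torch.linspace(0, 1, n_kernels))
        self.log_width = nn.Parameter(torch.zeros(n_kernels))   # adaptive shape
        self.weights = nn.Parameter(0.1 * torch.randn(n_kernels))
    def forward(self, x):                        # x: (batch, 1)
        d2 = (x - self.centers) ** 2             # broadcast to (batch, n_kernels)
        phi = torch.exp(-d2 * torch.exp(self.log_width))
        return phi @ self.weights

field = RBFField()
x = torch.rand(256, 1)
y = torch.sin(6.28 * x).squeeze(-1)              # target signal
opt = torch.optim.Adam(field.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = ((field(x) - y) ** 2).mean()
    loss.backward(); opt.step()
```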
- Improving equilibrium propagation without weight symmetry through Jacobian homeostasis [7.573586022424398]
Equilibrium propagation (EP) is a compelling alternative to the backpropagation of error algorithm (BP).
EP requires weight symmetry and infinitesimal equilibrium perturbations, i.e., nudges, to estimate unbiased gradients efficiently.
We show that the finite nudge does not pose a problem, as exact derivatives can still be estimated via a Cauchy integral.
We present a new homeostatic objective that directly mitigates functional asymmetries of the Jacobian at the network's fixed point.
arXiv Detail & Related papers (2023-09-05T13:20:43Z)
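A sketch of the homeostatic idea: relax a small equilibrium network to its fixed point, then penalize the asymmetry of the Jacobian there. The quadratic penalty and toy dynamics are assumptions, not the paper's exact objective:

```python
# Jacobian homeostasis sketch: penalize functional asymmetry of the Jacobian
# of the equilibrium dynamics at the network's fixed point.
import torch

torch.manual_seed(0)
W = (0.1 * torch.randn(8, 8)).requires_grad_()
U = 0.5 * torch.randn(8, 4)
x = torch.randn(4)

def F(s):                                    # equilibrium dynamics s <- F(s)
    return torch.tanh(W @ s + U @ x)

with torch.no_grad():                        # relax to the fixed point
    s = torch.zeros(8)
    for _ in range(50):
        s = F(s)
J = torch.autograd.functional.jacobian(F, s, create_graph=True)
homeostasis = ((J - J.T) ** 2).sum()         # penalize Jacobian asymmetry
homeostasis.backward()                       # gradients flow into W
print(homeostasis.item())
```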
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
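A toy version of oracle-preserving symmetry discovery: learn a linear generator G whose infinitesimal flow leaves an oracle invariant, with a norm constraint to exclude the trivial G = 0. The linear generator and the 2D rotation-invariant oracle are illustrative assumptions:

```python
# Symmetry discovery sketch: find a generator G with oracle(x + eps*G x) ~ oracle(x).
# For the oracle |x|^2, the learned G should be antisymmetric (a rotation generator).
import torch

oracle = lambda x: (x ** 2).sum(-1)            # invariant under rotations
G = torch.randn(2, 2, requires_grad=True)
opt = torch.optim.Adam([G], lr=1e-2)
eps = 1e-2
for _ in range(2000):
    x = torch.randn(256, 2)
    x_t = x + eps * x @ G.T                    # infinitesimal transformation
    invariance = ((oracle(x_t) - oracle(x)) ** 2).mean()
    norm = (G.norm() - 1.0) ** 2               # exclude the trivial G = 0
    loss = invariance / eps ** 2 + norm
    opt.zero_grad(); loss.backward(); opt.step()
print(G.detach())   # expected: approximately antisymmetric
```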
- Symmetry-Enhanced Attention Network for Acute Ischemic Infarct Segmentation with Non-Contrast CT Images [50.55978219682419]
We propose a symmetry enhanced attention network (SEAN) for acute ischemic infarct segmentation.
Our proposed network automatically transforms an input CT image into the standard space where the brain tissue is bilaterally symmetric.
The proposed SEAN outperforms some symmetry-based state-of-the-art methods in terms of both Dice coefficient and infarct localization.
arXiv Detail & Related papers (2021-10-11T07:13:26Z)
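A sketch of the symmetry-comparison idea: once the image is in a bilaterally symmetric standard space, concatenate features of the image with spatially realigned features of its left-right mirror, so each pixel can be compared against its contralateral counterpart. A toy stand-in, not SEAN's attention architecture:

```python
# Symmetry-aware features for a bilaterally symmetric image: fuse features of
# the image and its mirror (flipped back into alignment) before the seg head.
import torch
import torch.nn as nn

class SymmetryFeatures(nn.Module):
    def __init__(self):
        super().__init__()
        self.encode = nn.Conv2d(1, 8, 3, padding=1)
        self.head = nn.Conv2d(16, 1, 1)          # fuse original + mirrored features
    def forward(self, img):                      # img: (B, 1, H, W), midline at W/2
        f = self.encode(img)
        f_mirror = self.encode(torch.flip(img, dims=[-1]))
        fused = torch.cat([f, torch.flip(f_mirror, dims=[-1])], dim=1)
        return torch.sigmoid(self.head(fused))   # per-pixel infarct probability

model = SymmetryFeatures()
ct = torch.randn(2, 1, 64, 64)                   # stand-in for standard-space CT
mask = model(ct)                                 # (2, 1, 64, 64)
```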
- Encoding Involutory Invariance in Neural Networks [1.6371837018687636]
In certain situations, neural networks (NNs) are trained on data that obey underlying physical symmetries.
In this work, we explore a special kind of symmetry where functions are invariant with respect to involutory linear/affine transformations up to parity.
Numerical experiments indicate that the proposed models outperform baseline networks while respecting the imposed symmetry.
An adaptation of our technique to convolutional NN classification tasks for datasets with inherent horizontal/vertical reflection symmetry has also been proposed.
arXiv Detail & Related papers (2021-06-07T16:07:15Z)
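A minimal sketch of invariance under an involutory transformation up to parity: wrapping any base network g with the symmetrization below guarantees f(Tx) = p f(x) whenever T is an involution (T @ T = I) and p = ±1. The wrapper is an illustration of the idea, not the paper's exact models:

```python
# Symmetrization wrapper: f(x) = (g(x) + p * g(Tx)) / 2 satisfies
# f(Tx) = p * f(x) by construction when T is an involution and p = +/-1.
import torch
import torch.nn as nn

class InvolutionSymmetric(nn.Module):
    def __init__(self, net, T, parity=1.0):
        super().__init__()
        self.net, self.parity = net, parity
        self.register_buffer("T", T)
    def forward(self, x):
        return 0.5 * (self.net(x) + self.parity * self.net(x @ self.T.T))

T = torch.diag(torch.tensor([1.0, -1.0, 1.0]))   # reflection, T @ T = I
g = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, 1))
f = InvolutionSymmetric(g, T, parity=-1.0)       # odd under the reflection
x = torch.randn(4, 3)
print(torch.allclose(f(x @ T.T), -f(x), atol=1e-6))   # True
```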
- Symmetry-via-Duality: Invariant Neural Network Densities from Parameter-Space Correlators [0.0]
Symmetries of network densities may be determined via dual computations of network correlation functions.
We demonstrate that the amount of symmetry in the initial density affects the accuracy of networks trained on Fashion-MNIST.
arXiv Detail & Related papers (2021-06-01T18:00:06Z)
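A toy numerical check of the duality idea: when the weight prior is isotropic (rotation invariant in parameter space), the network two-point correlator is invariant under a joint rotation of the inputs. The single-hidden-layer ensemble is an illustrative assumption, not the paper's derivation:

```python
# Estimate the two-point correlator <f(x1) f(x2)> over random networks with an
# isotropic Gaussian weight prior; it should match at jointly rotated inputs.
import torch

torch.manual_seed(0)
def sample_correlator(x1, x2, n_nets=100000, width=64):
    W = torch.randn(n_nets, width, x1.numel())   # isotropic prior over weights
    a = torch.randn(n_nets, width)
    f = lambda x: (a * torch.tanh(W @ x)).sum(-1) / width ** 0.5
    return (f(x1) * f(x2)).mean()

theta = torch.tensor(0.7)
R = torch.tensor([[theta.cos(), -theta.sin()],
                  [theta.sin(),  theta.cos()]])
x1, x2 = torch.tensor([1.0, 0.0]), torch.tensor([0.3, 0.8])
print(sample_correlator(x1, x2), sample_correlator(R @ x1, R @ x2))  # ~ equal
```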
- Finding Symmetry Breaking Order Parameters with Euclidean Neural Networks [2.735801286587347]
We demonstrate that symmetry equivariant neural networks uphold Curie's principle and can be used to recast many symmetry-relevant scientific questions as simple optimization problems.
We prove these properties mathematically and demonstrate them numerically by training a Euclidean symmetry equivariant neural network to learn the symmetry-breaking input needed to deform a square into a rectangle and to generate octahedral tilting patterns in perovskites.
arXiv Detail & Related papers (2020-07-04T17:24:21Z)
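A toy illustration of Curie's principle as an optimization problem: a swap-equivariant model cannot map a symmetric input to an asymmetric target, so a learnable symmetry-breaking input (the order parameter) must pick up the asymmetry. The tiny model below stands in for the paper's Euclidean (e3nn) networks:

```python
# Curie's principle toy: an equivariant map cannot break the input's symmetry,
# so fitting an asymmetric target forces the learned input perturbation
# ("order parameter") eta to become asymmetric.
import torch
import torch.nn as nn

swap = torch.tensor([1, 0])                    # group: exchange x and y

class SwapEquivariant(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))
    def forward(self, v):                      # group average over {id, swap}
        return 0.5 * (self.net(v) + self.net(v[..., swap])[..., swap])

model = SwapEquivariant()
square = torch.tensor([1.0, 1.0])              # swap-symmetric input
target = torch.tensor([2.0, 1.0])              # rectangle: swap symmetry broken
eta = torch.zeros(2, requires_grad=True)       # learnable order parameter
opt = torch.optim.Adam(list(model.parameters()) + [eta], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = ((model(square + eta) - target) ** 2).sum()
    loss.backward(); opt.step()
print(eta.detach())   # expected: eta[0] != eta[1], i.e. symmetry is broken
```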