SA-GAT-SR: Self-Adaptable Graph Attention Networks with Symbolic Regression for high-fidelity material property prediction
- URL: http://arxiv.org/abs/2505.00625v1
- Date: Thu, 01 May 2025 16:05:10 GMT
- Title: SA-GAT-SR: Self-Adaptable Graph Attention Networks with Symbolic Regression for high-fidelity material property prediction
- Authors: Junchi Liu, Ying Tang, Sergei Tretiak, Wenhui Duan, Liujiang Zhou
- Abstract summary: We introduce a novel computational paradigm, Self-Adaptable Graph Attention Networks integrated with Symbolic Regression (SA-GAT-SR). Our framework employs a self-adaptable encoding algorithm that automatically identifies and adjusts attention weights so as to screen critical features from an expansive 180-dimensional feature space. The integrated SR module subsequently distills these features into compact analytical expressions that explicitly reveal quantum-mechanically meaningful relationships.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in machine learning have demonstrated the enormous utility of deep learning approaches, particularly Graph Neural Networks (GNNs), for materials science. These methods have emerged as powerful tools for high-throughput prediction of material properties, offering a compelling enhancement of, and alternative to, traditional first-principles calculations. While the community has predominantly focused on developing increasingly complex and universal models to enhance predictive accuracy, such approaches often lack physical interpretability and insight into materials behavior. Here, we introduce a novel computational paradigm, Self-Adaptable Graph Attention Networks integrated with Symbolic Regression (SA-GAT-SR), that synergistically combines the predictive capability of GNNs with the interpretative power of symbolic regression. Our framework employs a self-adaptable encoding algorithm that automatically identifies and adjusts attention weights so as to screen critical features from an expansive 180-dimensional feature space while maintaining O(n) computational scaling. The integrated SR module subsequently distills these features into compact analytical expressions that explicitly reveal quantum-mechanically meaningful relationships, achieving a 23-fold acceleration over conventional SR implementations that rely heavily on features derived from first-principles calculations. This work suggests a new framework for computational materials science, bridging the gap between predictive accuracy and physical interpretability and offering valuable physical insight into material behavior.
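A minimal sketch of the screening idea in the abstract, assuming one learned attention logit per candidate feature: softmax-normalize the logits and keep the dominant columns, which is linear in the number of samples. The helper name `attention_screen` and every constant here are illustrative, not from the paper; the actual framework learns these weights inside the GAT rather than taking them as given.

```python
import numpy as np

def attention_screen(X, logits, top_k=16):
    """Keep the top_k feature columns by softmax-normalized attention weight.

    X      : (n_samples, 180) candidate feature matrix
    logits : (180,) learned attention scores, one per feature
    """
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                     # softmax over the feature axis
    keep = np.argsort(weights)[::-1][:top_k]     # indices of dominant features
    return X[:, keep], keep, weights[keep]

# Toy usage over the 180-dimensional feature space mentioned in the abstract.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 180))
logits = rng.normal(size=180)          # stand-in for weights learned by the GAT
X_small, kept, w = attention_screen(X, logits)
# X_small (1000, 16) would then be handed to a symbolic-regression engine
# (e.g. PySR) to search for a compact analytical expression.
```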
Related papers
- Dynamic Graph Structure Estimation for Learning Multivariate Point Process using Spiking Neural Networks [14.77536193242342]
Spiking Dynamic Graph Network is a novel framework that leverages the temporal processing capabilities of spiking neural networks (SNNs) and spike-timing-dependent plasticity (STDP).
It adapts to any dataset by learning dynamic temporal dependencies directly from event data, enhancing generalizability and modeling capability.
Our evaluations, conducted on both synthetic and real-world datasets including NYC Taxi, 911, Reddit, and Stack Overflow, demonstrate superior accuracy while maintaining computational efficiency.
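For context, the plasticity rule named above is the classic pairwise spike-timing-dependent plasticity update; a minimal sketch of that textbook rule, with illustrative constants rather than anything taken from the paper:

```python
import numpy as np

def stdp_update(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic pairwise STDP weight change for a spike-time difference
    dt = t_post - t_pre (ms): pre-before-post (dt > 0) potentiates,
    post-before-pre (dt < 0) depresses, both decaying exponentially."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

print(stdp_update(np.array([5.0, -5.0])))  # small potentiation, small depression
```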
arXiv Detail & Related papers (2025-04-01T23:23:10Z)
- Interactive Symbolic Regression through Offline Reinforcement Learning: A Co-Design Framework [11.804368618793273]
Symbolic Regression holds great potential for uncovering underlying mathematical and physical relationships from observed data. Current state-of-the-art approaches typically do not consider the integration of domain experts' prior knowledge. We propose the Symbolic Q-network (Sym-Q), an advanced interactive framework for large-scale symbolic regression.
arXiv Detail & Related papers (2025-02-05T06:26:49Z)
- Enhancing material property prediction with ensemble deep graph convolutional networks [9.470117608423957]
Recent efforts have focused on employing advanced ML algorithms, including deep learning-based graph neural networks, for property prediction.
Our research provides an in-depth evaluation of ensemble strategies in deep learning-based graph neural networks, specifically targeting material property prediction tasks.
By testing the Crystal Graph Convolutional Neural Network (CGCNN) and its multitask version, MT-CGCNN, we demonstrated that ensemble techniques, especially prediction averaging, substantially improve precision beyond traditional metrics.
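Prediction averaging, the strategy highlighted above, reduces to taking the mean of per-member outputs. A minimal sketch with ridge regressors standing in for independently trained CGCNN instances; the stand-in models and the `ensemble_average` helper are illustrative, not the study's code:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

# Stand-ins for independently trained ensemble members: each one is fit on
# a bootstrap resample so the members disagree slightly.
models = []
for seed in range(5):
    idx = np.random.default_rng(seed).integers(0, len(X), size=len(X))
    models.append(Ridge().fit(X[idx], y[idx]))

def ensemble_average(models, X):
    """Prediction averaging: the mean of the per-member predictions."""
    return np.stack([m.predict(X) for m in models]).mean(axis=0)

y_hat = ensemble_average(models, X)   # typically lower variance than any member
```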
arXiv Detail & Related papers (2024-07-26T16:12:06Z)
- Graph Neural Ordinary Differential Equations for Coarse-Grained Socioeconomic Dynamics [0.0]
We present a data-driven machine-learning approach for modeling space-time socioeconomic dynamics.
Our findings, from a case study of Baltimore, MD, indicate that this machine learning-augmented coarse-grained model serves as a powerful instrument.
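In its simplest discretized form, a graph neural ODE evolves node states under a learned message-passing vector field. The sketch below uses a one-layer graph-convolution derivative and explicit Euler steps; it illustrates the general technique only and is not the paper's coarse-grained socioeconomic model:

```python
import torch

class GraphODEFunc(torch.nn.Module):
    """dh/dt = tanh(A_hat @ W h): a one-layer graph-convolution vector field."""
    def __init__(self, a_hat, dim):
        super().__init__()
        self.a_hat = a_hat                      # row-normalized adjacency (n, n)
        self.lin = torch.nn.Linear(dim, dim)

    def forward(self, h):
        return torch.tanh(self.a_hat @ self.lin(h))

def euler_integrate(func, h0, steps=20, dt=0.05):
    """Explicit-Euler rollout of the node states over steps * dt time units."""
    h = h0
    for _ in range(steps):
        h = h + dt * func(h)
    return h

n, d = 8, 4
a = torch.rand(n, n)
a_hat = a / a.sum(dim=1, keepdim=True)
h_final = euler_integrate(GraphODEFunc(a_hat, d), torch.randn(n, d))
```

In practice an adaptive solver (e.g. torchdiffeq's odeint) would replace the fixed Euler loop.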
arXiv Detail & Related papers (2024-07-25T15:12:46Z)
- The Role of Foundation Models in Neuro-Symbolic Learning and Reasoning [54.56905063752427]
Neuro-Symbolic AI (NeSy) holds promise to ensure the safe deployment of AI systems.
Existing pipelines that train the neural and symbolic components sequentially require extensive labelling.
A new architecture, NeSyGPT, fine-tunes a vision-language foundation model to extract symbolic features from raw data.
arXiv Detail & Related papers (2024-02-02T20:33:14Z)
- Understanding Self-attention Mechanism via Dynamical System Perspective [58.024376086269015]
The self-attention mechanism (SAM) is widely used across fields of artificial intelligence.
We show that the intrinsic stiffness phenomenon (SP) found in high-precision solutions of ordinary differential equations (ODEs) also widely exists in high-performance neural networks (NNs).
We further show that SAM acts as a stiffness-aware step-size adaptor that can enhance the model's representational ability to measure intrinsic SP.
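The step-size reading can be illustrated on a residual update: a plain ResNet block is forward Euler with a unit step, and a gate computed from the current state rescales that step. A rough sketch of the idea, not the paper's actual construction:

```python
import torch

class AdaptiveResidualBlock(torch.nn.Module):
    """h_next = h + alpha(h) * f(h): a residual (forward-Euler) update whose
    step size alpha is predicted from the state itself, loosely mirroring
    the view of self-attention as a stiffness-aware step-size adaptor."""
    def __init__(self, dim):
        super().__init__()
        self.f = torch.nn.Sequential(torch.nn.Linear(dim, dim), torch.nn.Tanh())
        self.alpha = torch.nn.Sequential(torch.nn.Linear(dim, 1), torch.nn.Sigmoid())

    def forward(self, h):
        return h + self.alpha(h) * self.f(h)   # per-sample step size in (0, 1)

h = torch.randn(32, 64)
print(AdaptiveResidualBlock(64)(h).shape)      # torch.Size([32, 64])
```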
arXiv Detail & Related papers (2023-08-19T08:17:41Z)
- Understanding Augmentation-based Self-Supervised Representation Learning via RKHS Approximation and Regression [53.15502562048627]
Recent work has built the connection between self-supervised learning and the approximation of the top eigenspace of a graph Laplacian operator.
This work delves into a statistical analysis of augmentation-based pretraining.
arXiv Detail & Related papers (2023-06-01T15:18:55Z)
- AttNS: Attention-Inspired Numerical Solving For Limited Data Scenarios [51.94807626839365]
We propose the attention-inspired numerical solver (AttNS) to solve differential equations under limited-data scenarios. AttNS is inspired by the effectiveness of attention modules in Residual Neural Networks (ResNets) in enhancing model generalization and robustness.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Great Truths are Always Simple: A Rather Simple Knowledge Encoder for Enhancing the Commonsense Reasoning Capacity of Pre-Trained Models [89.98762327725112]
Commonsense reasoning in natural language is a desired ability of artificial intelligence systems.
For solving complex commonsense reasoning tasks, a typical solution is to enhance pre-trained language models (PTMs) with a knowledge-aware graph neural network (GNN) encoder.
Despite their effectiveness, these approaches are built on heavy architectures and cannot clearly explain how external knowledge resources improve the reasoning capacity of PTMs.
arXiv Detail & Related papers (2022-05-04T01:27:36Z)
- Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
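A common surrogate for this kind of objective is a discriminator-based lower bound in the DGI style: score aligned (input, embedding) pairs above row-shuffled ones. The sketch below is that generic surrogate, with an illustrative `mi_lower_bound_loss` helper, not GMI's exact decomposition into feature and topology terms:

```python
import torch
import torch.nn.functional as F

def mi_lower_bound_loss(x, z, disc):
    """Jensen-Shannon MI lower bound: the discriminator should score true
    (node feature, node embedding) pairs above pairs with shuffled rows."""
    idx = torch.randperm(x.size(0))
    pos = disc(torch.cat([x, z], dim=-1))        # aligned pairs
    neg = disc(torch.cat([x[idx], z], dim=-1))   # corrupted pairs
    return (F.softplus(-pos) + F.softplus(neg)).mean()

n, d_in, d_z = 16, 8, 4
disc = torch.nn.Linear(d_in + d_z, 1)
x, z = torch.randn(n, d_in), torch.randn(n, d_z)   # z would come from the encoder
loss = mi_lower_bound_loss(x, z, disc)   # minimized jointly over encoder and disc
```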
arXiv Detail & Related papers (2020-02-04T08:33:49Z)