Periodic Graph Transformers for Crystal Material Property Prediction
- URL: http://arxiv.org/abs/2209.11807v1
- Date: Fri, 23 Sep 2022 18:57:22 GMT
- Title: Periodic Graph Transformers for Crystal Material Property Prediction
- Authors: Keqiang Yan, Yi Liu, Yuchao Lin, Shuiwang Ji
- Abstract summary: We consider representation learning on periodic graphs encoding crystal materials.
Unlike regular graphs, periodic graphs consist of a minimum unit cell that repeats itself on a regular lattice in 3D space.
We propose a transformer architecture, known as Matformer, for periodic graph representation learning.
- Score: 39.01618096831294
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider representation learning on periodic graphs encoding crystal
materials. Unlike regular graphs, periodic graphs consist of a minimum unit cell
that repeats itself on a regular lattice in 3D space. Effectively encoding these
periodic structures poses unique challenges not present in regular graph
representation learning. In addition to being E(3) invariant, periodic graph
representations need to be periodic invariant: the learned representations
should be invariant to shifts of cell boundaries, since those boundaries are
artificially imposed. Furthermore, the periodic repeating patterns need to be
captured explicitly, as lattices of different sizes and orientations may
correspond to different materials. In this work, we propose a transformer
architecture, known as Matformer, for periodic graph representation learning.
Matformer is designed to be invariant to periodicity and can capture repeating
patterns explicitly. In particular, Matformer encodes periodic patterns by
efficient use of geometric distances between the same atoms in neighboring
cells. Experimental results on multiple common benchmark datasets show that
Matformer consistently outperforms baseline methods. In addition, our results
demonstrate the importance of periodic invariance and explicit repeating-pattern
encoding for crystal representation learning.
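A concrete way to see the key construction: an atom's periodic images differ from it only by integer combinations of the lattice vectors, so the distances to its own images in neighboring cells are exactly the lattice translation lengths, which are unaffected by how cell boundaries are drawn. Below is a minimal sketch of this computation (an illustration, not the authors' implementation; `lattice` and `frac_coords` are hypothetical inputs):

```python
import numpy as np

def self_image_distances(lattice, frac_coords, n_shells=1):
    """Distances from each atom to its own periodic images.

    lattice:     (3, 3) array whose rows are the unit-cell vectors.
    frac_coords: (N, 3) fractional coordinates of atoms in the cell.
    Returns an (N, K) array of sorted distances to images in the
    surrounding (2*n_shells + 1)**3 - 1 neighboring cells.
    """
    shifts = np.array([(i, j, k)
                       for i in range(-n_shells, n_shells + 1)
                       for j in range(-n_shells, n_shells + 1)
                       for k in range(-n_shells, n_shells + 1)
                       if (i, j, k) != (0, 0, 0)])
    translations = shifts @ lattice            # Cartesian cell translations (K, 3)
    # An image differs from its atom only by a lattice translation, so the
    # self-image distances are the translation lengths -- identical for every
    # atom and invariant to shifts of the cell boundaries.
    dists = np.linalg.norm(translations, axis=1)
    return np.sort(np.broadcast_to(dists, (len(frac_coords), len(dists))), axis=1)

cubic = np.eye(3) * 4.05                       # e.g. an fcc-Al-like cubic cell
print(self_image_distances(cubic, np.zeros((1, 3)))[:, :6])
```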
Related papers
- Complete and Efficient Graph Transformers for Crystal Material Property Prediction [53.32754046881189]
Crystal structures are characterized by atomic bases within a primitive unit cell that repeats along a regular lattice throughout 3D space.
We introduce a novel approach that utilizes the periodic patterns of unit cells to establish a lattice-based representation for each atom.
We propose ComFormer, an SE(3) transformer designed specifically for crystalline materials.
arXiv Detail & Related papers (2024-03-18T15:06:37Z)
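One ingredient of a lattice-based atom representation, as in the ComFormer entry above, is expressing geometry in the cell's own lattice coordinates. A minimal sketch under that assumption (not ComFormer's actual architecture; names are illustrative):

```python
import numpy as np

def to_lattice_coords(offsets, lattice):
    """Express Cartesian neighbor offsets in lattice (fractional) coordinates.

    offsets: (M, 3) Cartesian vectors from an atom to its neighbors.
    lattice: (3, 3) array whose rows are the unit-cell vectors.
    """
    return offsets @ np.linalg.inv(lattice)

lattice = np.array([[4.0, 0.0, 0.0],
                    [0.0, 4.0, 0.0],
                    [0.0, 0.0, 6.0]])
offsets = np.array([[4.0, 0.0, 0.0],       # one full cell along the a-axis
                    [2.0, 2.0, 3.0]])      # body-center-like neighbor
print(to_lattice_coords(offsets, lattice)) # [[1. 0. 0.], [0.5 0.5 0.5]]
```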
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
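The reconstruction step GRASP relies on is a linear-algebra identity: for the unnormalized Laplacian L = D - A, the diagonal of L is the degree vector, so A = diag(diag(L)) - L once L is rebuilt from sampled eigenpairs. A minimal sketch assuming an unnormalized Laplacian:

```python
import numpy as np

def adjacency_from_spectrum(eigvecs, eigvals):
    """Rebuild adjacency A from the eigendecomposition of the
    unnormalized Laplacian L = D - A (so A = diag(diag(L)) - L)."""
    L = eigvecs @ np.diag(eigvals) @ eigvecs.T
    return np.diag(np.diag(L)) - L

# Round-trip check on a small triangle-plus-pendant graph.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
vals, vecs = np.linalg.eigh(L)
assert np.allclose(adjacency_from_spectrum(vecs, vals), A)
```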
- Accelerating Material Property Prediction using Generically Complete Isometry Invariants [3.031375888004876]
We adapt the Pointwise Distance Distribution (PDD) as a representation for our learning algorithm.
We develop a transformer model with a modified self-attention mechanism that combines PDD with compositional information via a spatial encoding method.
This model is tested on the crystals of the Materials Project and Jarvis-DFT databases and shown to produce accuracy on par with state-of-the-art methods.
arXiv Detail & Related papers (2024-01-22T15:14:22Z)
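For intuition on the PDD representation above: each atom contributes a row of sorted distances to its k nearest neighbors in the infinite periodic crystal, and the rows are ordered lexicographically. A simplified sketch that searches only a 3x3x3 block of cells and omits the row weights of the full definition:

```python
import numpy as np
from itertools import product

def pdd(lattice, frac_coords, k=8):
    """Pointwise Distance Distribution: per-atom sorted distances to the
    k nearest neighbors among atom images in a 3x3x3 block of cells."""
    cart = frac_coords @ lattice                            # (N, 3)
    shifts = np.array(list(product((-1, 0, 1), repeat=3)))  # (27, 3)
    images = (frac_coords[None, :, :] + shifts[:, None, :]).reshape(-1, 3) @ lattice
    d = np.linalg.norm(cart[:, None, :] - images[None, :, :], axis=-1)
    d = np.sort(d, axis=1)[:, 1:k + 1]   # drop the zero self-distance
    return d[np.lexsort(d.T[::-1])]      # sort rows lexicographically

lattice = np.eye(3) * 3.0
frac = np.array([[0.0, 0.0, 0.0], [0.5, 0.5, 0.5]])  # bcc-like motif
print(pdd(lattice, frac, k=4))
```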
- CycleCL: Self-supervised Learning for Periodic Videos [5.9647924003148365]
We propose CycleCL, a self-supervised learning method specifically designed to work with periodic data.
We exploit the repetitions in videos to design a novel contrastive learning method based on a triplet loss.
Our method uses pre-trained features to sample positive pairs of frames from approximately the same phase and negative pairs of frames from different phases.
arXiv Detail & Related papers (2023-11-05T17:40:10Z)
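Once frames carry phase labels, the sampling scheme in the CycleCL entry above reduces to a standard triplet objective. A minimal sketch with hypothetical phase labels and random stand-in features (not the authors' code):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.5):
    """max(0, d(a, p) - d(a, n) + margin) with L2 distances."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 16))      # stand-in for pre-trained frame features
phase = np.arange(100) % 10             # hypothetical per-frame phase labels
a = 3                                   # anchor frame
same = np.where(phase == phase[a])[0]
p = rng.choice(same[same != a])         # positive: same phase, different frame
n = rng.choice(np.where(phase != phase[a])[0])  # negative: different phase
print(triplet_loss(feats[a], feats[p], feats[n]))
```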
- Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers that operates entirely on products of constant-curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run with time and memory costs linear in the number of nodes and edges.
arXiv Detail & Related papers (2023-09-08T02:44:37Z)
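The kernelized-attention trick mentioned above replaces softmax(QK^T)V with phi(Q)(phi(K)^T V), so the n x n attention matrix is never formed. A minimal Euclidean sketch using the common elu(x)+1 feature map; the paper's non-Euclidean version is more involved:

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized attention phi(Q) @ (phi(K)^T V), normalized per query,
    with phi(x) = elu(x) + 1. Costs O(n d^2) rather than O(n^2 d)."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                        # (d, d_v); no n x n matrix formed
    z = Qp @ Kp.sum(axis=0)              # per-query normalizer
    return (Qp @ kv) / z[:, None]

rng = np.random.default_rng(0)
n, d = 6, 4
Q, K, V = rng.normal(size=(3, n, d))
print(linear_attention(Q, K, V).shape)   # (6, 4)
```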
- Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition [20.55025648415664]
We propose NONparametric FActor Trajectory learning for dynamic tensor decomposition (NONFAT).
We use a second-level GP to sample the entry values and to capture the temporal relationship between the entities.
We have shown the advantage of our method in several real-world applications.
arXiv Detail & Related papers (2022-07-06T05:33:00Z)
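As a generic illustration of GP-based trajectory modeling in the spirit of the entry above (not NONFAT itself, which adds a second-level GP over entry values), smooth factor trajectories can be sampled from a GP with an RBF kernel over time:

```python
import numpy as np

def rbf_kernel(t, lengthscale=0.2):
    """Squared-exponential kernel matrix over time points t."""
    diff = t[:, None] - t[None, :]
    return np.exp(-0.5 * (diff / lengthscale) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
K = rbf_kernel(t) + 1e-6 * np.eye(len(t))   # jitter for numerical stability
# One smooth latent trajectory per factor (3 factors here).
trajectories = rng.multivariate_normal(np.zeros(len(t)), K, size=3)
print(trajectories.shape)                   # (3, 50)
```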
- Graph Gamma Process Generalized Linear Dynamical Systems [60.467040479276704]
We introduce graph gamma process (GGP) linear dynamical systems to model real-valued multivariate time series.
For temporal pattern discovery, the latent representation under the model is used to decompose the time series into a parsimonious set of multivariate sub-sequences.
We use the generated random graph, whose number of nonzero-degree nodes is finite, to define both the sparsity pattern and dimension of the latent state transition matrix.
arXiv Detail & Related papers (2020-07-25T04:16:34Z)
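The structural idea above, a random graph whose sparsity pattern defines the latent state-transition matrix, can be illustrated with a fixed Bernoulli mask standing in for the gamma-process graph prior:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5                                         # latent state dimension
graph = rng.random((K, K)) < 0.3              # stand-in for the GGP random graph
W = np.where(graph, 0.4 * rng.normal(size=(K, K)), 0.0)  # sparse transitions

C = rng.normal(size=(3, K))                   # maps latent state to 3 observed series
x = np.zeros(K)
ys = []
for _ in range(100):                          # x_t = W x_{t-1} + noise, y_t = C x_t
    x = W @ x + rng.normal(scale=0.1, size=K)
    ys.append(C @ x)
print(np.array(ys).shape)                     # (100, 3)
```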
- Time-varying Graph Representation Learning via Higher-Order Skip-Gram with Negative Sampling [0.456877715768796]
We build upon the fact that the skip-gram embedding approach implicitly performs a matrix factorization.
We show that higher-order skip-gram with negative sampling is able to disentangle the role of nodes and time.
We empirically evaluate our approach using time-resolved face-to-face proximity data, showing that the learned time-varying graph representations outperform state-of-the-art methods.
arXiv Detail & Related papers (2020-06-25T12:04:48Z)
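The matrix-factorization fact this builds on is classical: skip-gram with negative sampling implicitly factorizes a shifted PMI matrix. A minimal static sketch (the paper's higher-order, time-resolved version generalizes this to tensors):

```python
import numpy as np

def sgns_factorization(cooc, neg_k=5, dim=2):
    """Factorize the shifted positive PMI matrix, the implicit SGNS objective."""
    total = cooc.sum()
    p_w = cooc.sum(axis=1) / total
    p_c = cooc.sum(axis=0) / total
    with np.errstate(divide="ignore"):
        pmi = np.log((cooc / total) / np.outer(p_w, p_c))
    sppmi = np.maximum(pmi - np.log(neg_k), 0.0)   # shifted positive PMI
    U, s, Vt = np.linalg.svd(sppmi)
    return U[:, :dim] * np.sqrt(s[:dim])           # node embeddings

cooc = np.array([[0, 8, 1], [8, 0, 2], [1, 2, 0]], dtype=float)
print(sgns_factorization(cooc))
```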
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
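Permutation equivariance, the key property of the score network above, means that permuting the input graph's nodes permutes the output the same way. A toy check with a simple symmetric layer (not the paper's multi-channel GNN):

```python
import numpy as np

def score_layer(A):
    """Toy permutation-equivariant map on adjacency matrices:
    combines A, its two-hop counterpart, and degree broadcasts."""
    deg = A.sum(axis=1, keepdims=True)
    S = A + 0.1 * (A @ A) + 0.01 * (deg + deg.T)
    return (S + S.T) / 2.0                      # keep the output symmetric

rng = np.random.default_rng(0)
A = (rng.random((5, 5)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                  # random undirected graph
P = np.eye(5)[rng.permutation(5)]               # permutation matrix
lhs = score_layer(P @ A @ P.T)
rhs = P @ score_layer(A) @ P.T
assert np.allclose(lhs, rhs)                    # equivariance holds
```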
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.