Deep Learning Based Superconductivity: Prediction and Experimental Tests
- URL: http://arxiv.org/abs/2412.13012v1
- Date: Tue, 17 Dec 2024 15:33:48 GMT
- Title: Deep Learning Based Superconductivity: Prediction and Experimental Tests
- Authors: Daniel Kaplan, Adam Zhang, Joanna Blawat, Rongying Jin, Robert J. Cava, Viktor Oudovenko, Gabriel Kotliar, Anirvan M. Sengupta, Weiwei Xie
- Abstract summary: We develop an approach based on deep learning (DL) to predict new superconducting materials.
We have synthesized a compound derived from our DL network and confirmed its superconducting properties.
In particular, RFs require knowledge of the chemical properties of the compound, while our neural net inputs depend solely on the chemical composition.
- Score: 2.78539995173967
- License:
- Abstract: The discovery of novel superconducting materials is a longstanding challenge in materials science, with a wealth of potential for applications in energy, transportation, and computing. Recent advances in artificial intelligence (AI) have made it possible to expedite the search for new materials by efficiently utilizing vast materials databases. In this study, we developed an approach based on deep learning (DL) to predict new superconducting materials. We have synthesized a compound derived from our DL network and confirmed its superconducting properties in agreement with our prediction. Our approach is also compared to previous work based on random forests (RFs). In particular, RFs require knowledge of the chemical properties of the compound, while our neural net inputs depend solely on the chemical composition. With the help of hints from our network, we discover a new ternary compound $\textrm{Mo}_{20}\textrm{Re}_{6}\textrm{Si}_{4}$, which becomes superconducting below 5.4 K. We further discuss the existing limitations and challenges associated with using AI to predict new superconductors, along with potential future research directions.
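To make the composition-only input concrete, the sketch below shows one way such a pipeline could look: a formula string is mapped to a vector of atomic fractions and fed to a small feed-forward classifier. This is a minimal illustration under assumed choices (a truncated element vocabulary, an off-the-shelf scikit-learn MLP, and toy formulas with hypothetical labels), not the authors' architecture or training data; it only highlights that, unlike RF pipelines built on tabulated chemical properties, the network input here is derived from the chemical formula alone.
```python
# Minimal sketch (not the paper's code): composition-only featurization plus a
# small neural-network classifier for "superconductor vs. not".
import re
import numpy as np
from sklearn.neural_network import MLPClassifier

# Fixed element vocabulary for illustration; a real model would cover the full periodic table.
ELEMENTS = ["H", "B", "C", "N", "O", "Si", "Mo", "Re", "Nb", "Sn", "Cu", "Ba", "Y", "La", "Fe", "As"]
INDEX = {el: i for i, el in enumerate(ELEMENTS)}
FORMULA_RE = re.compile(r"([A-Z][a-z]?)(\d*\.?\d*)")

def composition_vector(formula: str) -> np.ndarray:
    """Map a formula string, e.g. 'Mo20Re6Si4', to a vector of atomic fractions."""
    counts = np.zeros(len(ELEMENTS))
    for symbol, amount in FORMULA_RE.findall(formula):
        if symbol in INDEX:
            counts[INDEX[symbol]] += float(amount) if amount else 1.0
    total = counts.sum()
    return counts / total if total > 0 else counts

# Toy training data; the labels are hypothetical and for illustration only.
formulas = ["Mo20Re6Si4", "YBa2Cu3O7", "Nb3Sn", "LaFeAsO", "SiO2", "Fe2O3"]
labels = [1, 1, 1, 1, 0, 0]  # 1 = superconductor, 0 = not
X = np.stack([composition_vector(f) for f in formulas])

# Small feed-forward network; the paper's actual architecture, loss, and
# training set are not reproduced here.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X, labels)

# Query a candidate composition: the only input is its chemical formula.
print(model.predict_proba(composition_vector("Mo20Re6Si4").reshape(1, -1)))
```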
Related papers
- Energy-GNoME: A Living Database of Selected Materials for Energy Applications [0.0]
The recent GNoME protocol identifies over 380,000 novel stable crystals.
We identify over 33,000 materials with potential as energy materials forming the Energy-GNoME database.
arXiv Detail & Related papers (2024-11-15T11:48:14Z)
- AI-driven inverse design of materials: Past, present and future [5.813167950821478]
Humans have long explored new materials through a large number of experiments and proposed corresponding theoretical systems to predict new material properties and structures.
With the improvement of computational power, researchers have gradually developed various electronic structure calculation methods.
Recently, the rapid development of artificial intelligence technology in the field of computer science has enabled the effective characterization of the implicit association between material properties and structures.
Significant progress has been made in the inverse design of materials based on generative and discriminative models, attracting widespread attention from researchers.
arXiv Detail & Related papers (2024-11-14T13:25:04Z)
- Predicting ionic conductivity in solids from the machine-learned potential energy landscape [68.25662704255433]
Superionic materials are essential for advancing solid-state batteries, which offer improved energy density and safety.
Conventional computational methods for identifying such materials are resource-intensive and not easily scalable.
We propose an approach for the quick and reliable evaluation of ionic conductivity through the analysis of a universal interatomic potential.
arXiv Detail & Related papers (2024-11-11T09:01:36Z)
- InvDesFlow: An AI search engine to explore possible high-temperature superconductors [9.926621857444765]
InvDesFlow is an AI search engine that integrates deep model pre-training and fine-tuning techniques, diffusion models, and physics-based approaches.
We have obtained 74 dynamically stable materials with critical temperatures predicted by the AI model to be $T_c \geq 15$ K based on a very small set of samples.
arXiv Detail & Related papers (2024-09-12T14:16:56Z)
- AI-accelerated Discovery of Altermagnetic Materials [48.261668305411845]
Altermagnetism, a new magnetic phase, has been theoretically proposed and experimentally verified to be distinct from ferromagnetism and antiferromagnetism.
We propose an automated discovery approach empowered by an AI search engine.
We successfully discovered 50 new altermagnetic materials that cover metals, semiconductors, and insulators.
arXiv Detail & Related papers (2023-11-08T01:06:48Z)
- MatChat: A Large Language Model and Application Service Platform for Materials Science [18.55541324347915]
We harness the power of the LLaMA2-7B model and enhance it through a learning process that incorporates 13,878 structured entries of materials knowledge.
This specialized AI model, named MatChat, focuses on predicting inorganic material synthesis pathways.
MatChat is now accessible online and open for use, with both the model and its application framework available as open source.
arXiv Detail & Related papers (2023-10-11T05:11:46Z)
- Prediction of superconducting properties of materials based on machine learning models [3.7492020569920723]
This manuscript proposes the use of an XGBoost model to identify superconductors.
The first application of a deep forest model to predict the critical temperature of superconductors.
The first application of deep forest to predict the band gap of materials.
The first sub-network model to predict the Fermi energy level of materials.
arXiv Detail & Related papers (2022-11-06T10:24:21Z)
- The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks demonstrate a $\textit{spectral bias}$ towards low-frequency functions, which yields faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the Neural Tangent Kernel (NTK) of PNNs.
We find that the $\Pi$-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the learning of higher frequencies.
arXiv Detail & Related papers (2022-02-27T23:12:43Z)
- Improving Molecular Representation Learning with Metric Learning-enhanced Optimal Transport [49.237577649802034]
We develop a novel optimal transport-based algorithm termed MROT to enhance the generalization capability of molecular representations for molecular regression problems.
MROT significantly outperforms state-of-the-art models, showing promising potential in accelerating the discovery of new substances.
arXiv Detail & Related papers (2022-02-13T04:56:18Z)
- Simulating Quantum Materials with Digital Quantum Computers [55.41644538483948]
Digital quantum computers (DQCs) can efficiently perform quantum simulations that are otherwise intractable on classical computers.
The aim of this review is to provide a summary of progress made towards achieving physical quantum advantage.
arXiv Detail & Related papers (2021-01-21T20:10:38Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)