Hierarchy-Boosted Funnel Learning for Identifying Semiconductors with Ultralow Lattice Thermal Conductivity
- URL: http://arxiv.org/abs/2501.06775v1
- Date: Sun, 12 Jan 2025 11:03:09 GMT
- Title: Hierarchy-Boosted Funnel Learning for Identifying Semiconductors with Ultralow Lattice Thermal Conductivity
- Authors: Mengfan Wu, Shenshen Yan, Jie Ren
- Abstract summary: We propose a hierarchy-boosted funnel learning (HiBoFL) framework, which is successfully applied to identify semiconductors with ultralow lattice thermal conductivity ($\kappa_\mathrm{L}$).
By training on only a few hundred materials targeted by unsupervised learning from a pool of hundreds of thousands, we achieve efficient and interpretable supervised predictions of ultralow $\kappa_\mathrm{L}$.
As a result, we provide a list of candidates with ultralow $\kappa_\mathrm{L}$ for potential thermoelectric applications and discover a new factor that significantly influences structural anharmonicity.
- Score: 2.186828191026978
- Abstract: Data-driven machine learning (ML) has demonstrated tremendous potential in material property predictions. However, the scarcity of materials data with costly property labels in the vast chemical space presents a significant challenge for ML in efficiently predicting properties and uncovering structure-property relationships. Here, we propose a novel hierarchy-boosted funnel learning (HiBoFL) framework, which is successfully applied to identify semiconductors with ultralow lattice thermal conductivity ($\kappa_\mathrm{L}$). By training on only a few hundred materials targeted by unsupervised learning from a pool of hundreds of thousands, we achieve efficient and interpretable supervised predictions of ultralow $\kappa_\mathrm{L}$, thereby circumventing large-scale brute-force calculations without clear objectives. As a result, we provide a list of candidates with ultralow $\kappa_\mathrm{L}$ for potential thermoelectric applications and discover a new factor that significantly influences structural anharmonicity. This study offers a novel practical pathway for accelerating the discovery of functional materials.
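The abstract describes a funnel workflow: unsupervised learning narrows a pool of hundreds of thousands of materials to a few hundred targeted candidates, which are then labeled and used to train an interpretable supervised model. The sketch below is a hypothetical, minimal illustration of that funnel shape using NumPy only; the descriptors, k-means selection, least-squares model, and all sizes are illustrative assumptions, not the authors' HiBoFL implementation.

```python
# Hypothetical funnel-learning sketch: cluster a large candidate pool,
# label only a small diverse subset, then screen the whole pool with a
# cheap supervised model. All choices here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: large pool of materials described by cheap descriptors.
pool = rng.normal(size=(10000, 8))

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; groups the pool into structural 'families'."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(pool, k=10)

# Stage 2: keep only the pool members closest to each centroid -- these
# few hundred candidates stand in for the ones sent for costly labeling.
per_cluster = 30
selected = []
for j in range(len(centers)):
    idx = np.flatnonzero(labels == j)
    d = ((pool[idx] - centers[j]) ** 2).sum(-1)
    selected.extend(idx[np.argsort(d)[:per_cluster]].tolist())
selected = np.array(selected)

# Stage 3: supervised model (ordinary least squares here) trained only on
# the selected subset; y stands in for computed kappa_L labels.
y = pool[selected] @ rng.normal(size=8) + 0.1 * rng.normal(size=len(selected))
X = np.c_[pool[selected], np.ones(len(selected))]
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Screen the full pool with the trained model instead of brute force.
pred = np.c_[pool, np.ones(len(pool))] @ w
print(len(selected), pred.shape)
```

The point of the shape is that the expensive step (labeling) touches only `len(selected)` materials, while the cheap trained model scores the entire pool.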
Related papers
- DSMoE: Matrix-Partitioned Experts with Dynamic Routing for Computation-Efficient Dense LLMs [70.91804882618243]
This paper proposes DSMoE, a novel approach that achieves sparsification by partitioning pre-trained FFN layers into computational blocks.
We implement adaptive expert routing using sigmoid activation and straight-through estimators, enabling tokens to flexibly access different aspects of model knowledge.
Experiments on LLaMA models demonstrate that under equivalent computational constraints, DSMoE achieves superior performance compared to existing pruning and MoE approaches.
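The blurb mentions sigmoid-activated routing over partitioned FFN blocks with a straight-through estimator. The snippet below is a forward-pass-only NumPy sketch of sigmoid-gated expert blocks in that spirit; the expert weights, router, and dimensions are invented for illustration, and the straight-through estimator (which needs autograd) is only noted in a comment.

```python
# Illustrative forward pass of sigmoid-gated expert blocks, loosely in the
# spirit of DSMoE-style routing. Hypothetical shapes and weights throughout.
import numpy as np

rng = np.random.default_rng(1)
d, n_experts, tokens = 16, 4, 8

# Experts: partitions of a (pretrained) FFN layer, here random weights.
W = rng.normal(size=(n_experts, d, d)) / np.sqrt(d)
router = rng.normal(size=(d, n_experts)) / np.sqrt(d)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = rng.normal(size=(tokens, d))
gate_soft = sigmoid(x @ router)                # (tokens, n_experts)
gate_hard = (gate_soft > 0.5).astype(x.dtype)  # binary expert selection
# During training, a straight-through estimator would use gate_hard in the
# forward pass but let gradients flow through gate_soft.
expert_out = np.einsum('td,edh->teh', x, W)    # every expert on every token
y = np.einsum('te,teh->th', gate_hard, expert_out)
print(y.shape)
```

In a real implementation only the selected experts would be computed per token; the dense `einsum` here is just the simplest way to show the gating arithmetic.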
arXiv Detail & Related papers (2025-02-18T02:37:26Z) - Deep Learning Based Superconductivity: Prediction and Experimental Tests [2.78539995173967]
We develop an approach based on deep learning (DL) to predict new superconducting materials.
We have synthesized a compound derived from our DL network and confirmed its superconducting properties.
In particular, random forests (RFs) require knowledge of the chemical properties of the compound, while our neural net inputs depend solely on the chemical composition.
arXiv Detail & Related papers (2024-12-17T15:33:48Z) - Transfer Learning for Deep Learning-based Prediction of Lattice Thermal Conductivity [0.0]
We study the impact of transfer learning on the precision and generalizability of a deep learning model (ParAIsite).
We show that a much greater improvement is obtained when first fine-tuning it on a large dataset of low-quality approximations of lattice thermal conductivity (LTC).
The promising results pave the way towards a greater ability to explore large databases in search of low thermal conductivity materials.
arXiv Detail & Related papers (2024-11-27T11:57:58Z) - DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - InvDesFlow: An AI search engine to explore possible high-temperature superconductors [9.926621857444765]
InvDesFlow is an AI search engine that integrates deep model pre-training and fine-tuning techniques, diffusion models, and physics-based approaches.
We have obtained 74 dynamically stable materials with critical temperatures predicted by the AI model to be $T_c \geq 15$ K based on a very small set of samples.
arXiv Detail & Related papers (2024-09-12T14:16:56Z) - An Experimental Study on Exploring Strong Lightweight Vision Transformers via Masked Image Modeling Pre-Training [51.622652121580394]
Masked image modeling (MIM) pre-training for large-scale vision transformers (ViTs) has enabled promising downstream performance on top of the learned self-supervised ViT features.
In this paper, we question whether the fine-tuning performance of extremely simple lightweight ViTs can also benefit from this pre-training paradigm.
Our pre-training with distillation on pure lightweight ViTs with vanilla/hierarchical design (5.7M/6.5M parameters) can achieve 79.4%/78.9% top-1 accuracy on ImageNet-1K.
arXiv Detail & Related papers (2024-04-18T14:14:44Z) - Data-Efficient Learning via Minimizing Hyperspherical Energy [48.47217827782576]
This paper considers the problem of data-efficient learning from scratch using a small amount of representative data.
We propose a MHE-based active learning (MHEAL) algorithm, and provide comprehensive theoretical guarantees for MHEAL.
arXiv Detail & Related papers (2022-06-30T11:39:12Z) - BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z) - Active learning based generative design for the discovery of wide bandgap materials [6.5175897155391755]
We present an active generative inverse design method that combines active learning with a deep variational autoencoder neural network and a generative adversarial deep neural network model.
The application of this method has allowed us to discover new thermodynamically stable materials with high band gap and semiconductors with specified band gap ranges.
Our experiments show that while active learning itself may sample chemically infeasible candidates, these samples help to train effective screening models for filtering out materials with desired properties from the hypothetical materials created by the generative model.
arXiv Detail & Related papers (2021-02-28T20:15:23Z) - Interpretable discovery of new semiconductors with machine learning [10.09604500193621]
We report an evolutionary algorithm powered search which uses machine-learned surrogate models trained on hybrid functional DFT data benchmarked against experimental bandgaps: Deep Adaptive Regressive Weighted Intelligent Network (DARWIN)
The strategy enables efficient search over the materials space of $10^8$ ternaries and $10^{11}$ quaternaries for candidates with target properties.
arXiv Detail & Related papers (2021-01-12T10:23:16Z) - Multilinear Compressive Learning with Prior Knowledge [106.12874293597754]
The Multilinear Compressive Learning (MCL) framework combines Multilinear Compressive Sensing and Machine Learning into an end-to-end system.
The key idea behind MCL is the assumption of the existence of a tensor subspace that can capture the essential features of the signal for the downstream learning task.
In this paper, we propose a novel solution that addresses the central question: how to find tensor subspaces in which the signals of interest are highly separable?
arXiv Detail & Related papers (2020-02-17T19:06:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.