Advancing Additive Manufacturing through Deep Learning: A Comprehensive
Review of Current Progress and Future Challenges
- URL: http://arxiv.org/abs/2403.00669v1
- Date: Fri, 1 Mar 2024 17:01:47 GMT
- Authors: Amirul Islam Saimon, Emmanuel Yangue, Xiaowei Yue, Zhenyu (James)
Kong, Chenang Liu
- Abstract summary: This paper reviews recent studies that apply deep learning (DL) to improve the additive manufacturing (AM) process.
It focuses on generalizing DL models to a wide range of geometry types, managing uncertainties in both AM data and DL models, overcoming limited and noisy AM data through generative models, and unveiling the potential of interpretable DL for AM.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Additive manufacturing (AM) has proved itself a viable alternative to
widely used subtractive manufacturing thanks to its extraordinary capacity to
manufacture highly customized products with minimal material waste.
Nevertheless, it is still not the primary choice for industry because of
several major inherent challenges, including complex and dynamic process
interactions that are sometimes difficult to fully understand even with
traditional machine learning, owing to the involvement of high-dimensional
data such as images, point clouds, and voxels. However, the recent emergence
of deep learning (DL) shows great promise in overcoming many of these
challenges, as DL can automatically capture complex relationships from
high-dimensional data without hand-crafted feature extraction. As a result,
the volume of research at the intersection of AM and DL is growing
exponentially each year, making it difficult for researchers to keep track of
trends and promising future directions. Furthermore, to the best of our
knowledge, there is no comprehensive review paper summarizing the recent
studies in this research track. This paper therefore reviews the recent
studies that apply DL to improve the AM process, with a high-level summary of
their contributions and limitations. Finally, it summarizes the current
challenges and recommends promising opportunities for further investigation,
with a special focus on generalizing DL models to a wide range of geometry
types, managing uncertainties in both AM data and DL models, overcoming
limited and noisy AM data through generative models, and unveiling the
potential of interpretable DL for AM.
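The abstract's central technical claim is that DL learns features directly from high-dimensional AM data (images, voxels) instead of relying on hand-crafted descriptors. As a minimal sketch of the underlying operation, the snippet below applies a single 2D convolution, the building block of the CNNs used on AM layer images, to a hypothetical synthetic image; in a trained network the kernel weights would be learned from data rather than fixed by hand, and the data here is purely illustrative, not from the reviewed studies.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation) of a grayscale image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Response of the kernel at position (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic 8x8 "layer image" with a bright vertical edge,
# standing in for a real AM process image
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# A vertical-edge kernel; a CNN learns such kernels automatically
# during training instead of having them designed by hand
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

response = conv2d(image, kernel)
print(response.shape)  # (6, 6)
print(response.max())  # 3.0 -- strongest response at the edge
```

Stacking many such learned kernels, with nonlinearities between layers, is what lets a CNN capture the complex spatial relationships in AM data that the abstract refers to.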
Related papers
- State-Space Modeling in Long Sequence Processing: A Survey on Recurrence in the Transformer Era [59.279784235147254]
This survey provides an in-depth summary of the latest approaches that are based on recurrent models for sequential data processing.
The emerging picture suggests that there is room for thinking of novel routes, constituted by learning algorithms which depart from the standard Backpropagation Through Time.
arXiv Detail & Related papers (2024-06-13T12:51:22Z) - The Frontier of Data Erasure: Machine Unlearning for Large Language Models [56.26002631481726]
Large Language Models (LLMs) are foundational to AI advancements.
LLMs pose risks by potentially memorizing and disseminating sensitive, biased, or copyrighted information.
Machine unlearning emerges as a cutting-edge solution to mitigate these concerns.
arXiv Detail & Related papers (2024-03-23T09:26:15Z) - Generative AI for Synthetic Data Generation: Methods, Challenges and the
Future [12.506811635026907]
Research on generating synthetic data with large language models (LLMs) has surged recently.
This paper delves into advanced technologies that leverage these gigantic LLMs for the generation of task-specific training data.
arXiv Detail & Related papers (2024-03-07T03:38:44Z) - On the Challenges and Opportunities in Generative AI [135.2754367149689]
We argue that current large-scale generative AI models do not sufficiently address several fundamental issues that hinder their widespread adoption across domains.
In this work, we aim to identify key unresolved challenges in modern generative AI paradigms that should be tackled to further enhance their capabilities, versatility, and reliability.
arXiv Detail & Related papers (2024-02-28T15:19:33Z) - On the Resurgence of Recurrent Models for Long Sequences -- Survey and
Research Opportunities in the Transformer Era [59.279784235147254]
This survey is aimed at providing an overview of these trends framed under the unifying umbrella of Recurrence.
It emphasizes novel research opportunities that become prominent when abandoning the idea of processing long sequences.
arXiv Detail & Related papers (2024-02-12T23:55:55Z) - Deep Learning for Multi-Label Learning: A Comprehensive Survey [6.571492336879553]
Multi-label learning is a rapidly growing research area that aims to predict multiple labels from a single input data point.
Inherent difficulties in MLC include dealing with high-dimensional data, addressing label correlations, and handling partial labels.
Recent years have witnessed a notable increase in adopting deep learning (DL) techniques to address these challenges more effectively in MLC.
arXiv Detail & Related papers (2024-01-29T20:37:03Z) - Deep Transfer Learning for Automatic Speech Recognition: Towards Better
Generalization [3.6393183544320236]
Speech recognition has become an important challenge for deep learning (DL).
It requires large-scale training datasets and high computational and storage resources.
Deep transfer learning (DTL) has been introduced to overcome these issues.
arXiv Detail & Related papers (2023-04-27T21:08:05Z) - Dataset Distillation: A Comprehensive Review [76.26276286545284]
Dataset distillation (DD) aims to derive a much smaller dataset containing synthetic samples, based on which trained models yield performance comparable to those trained on the original dataset.
This paper gives a comprehensive review and summary of recent advances in DD and its application.
arXiv Detail & Related papers (2023-01-17T17:03:28Z) - A Survey of Deep Active Learning [54.376820959917005]
Active learning (AL) attempts to maximize a model's performance gain while labeling the fewest samples.
Deep learning (DL) is data-hungry, requiring large amounts of data to optimize its massive number of parameters.
Deep active learning (DAL) has emerged to combine the two.
arXiv Detail & Related papers (2020-08-30T04:28:31Z)