Synergy between Machine/Deep Learning and Software Engineering: How Far
Are We?
- URL: http://arxiv.org/abs/2008.05515v1
- Date: Wed, 12 Aug 2020 18:19:30 GMT
- Title: Synergy between Machine/Deep Learning and Software Engineering: How Far
Are We?
- Authors: Simin Wang, Liguo Huang, Jidong Ge, Tengfei Zhang, Haitao Feng, Ming
Li, He Zhang and Vincent Ng
- Abstract summary: Since 2009, the deep learning revolution has stimulated the synergy between Machine Learning (ML)/Deep Learning (DL) and Software Engineering (SE).
We conducted a 10-year Systematic Literature Review on 906 ML/DL-related SE papers published between 2009 and 2018.
- Score: 35.606916133846966
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Since 2009, the deep learning revolution, which was triggered by the
introduction of ImageNet, has stimulated the synergy between Machine Learning
(ML)/Deep Learning (DL) and Software Engineering (SE). Meanwhile, critical
reviews have emerged that suggest that ML/DL should be used cautiously. To
improve the quality (especially the applicability and generalizability) of
ML/DL-related SE studies, and to stimulate and enhance future collaborations
between SE/AI researchers and industry practitioners, we conducted a 10-year
Systematic Literature Review (SLR) on 906 ML/DL-related SE papers published
between 2009 and 2018. Our trend analysis demonstrated the mutual impacts that
ML/DL and SE have had on each other. At the same time, however, we also
observed a paucity of replicable and reproducible ML/DL-related SE studies and
identified five factors that influence their replicability and reproducibility.
To improve the applicability and generalizability of research results, we
analyzed what ingredients in a study would facilitate an understanding of why an
ML/DL technique was selected for a specific SE problem. In addition, we
identified the unique trends of impacts of DL models on SE tasks, as well as
five unique challenges that needed to be met in order to better leverage DL to
improve the productivity of SE tasks. Finally, we outlined a road-map that we
believe can facilitate the transfer of ML/DL-based SE research results into
real-world industry practices.
Related papers
- Generative AI-in-the-loop: Integrating LLMs and GPTs into the Next Generation Networks [11.509880721677156]
Large language models (LLMs) have recently emerged, demonstrating near-human-level performance in cognitive tasks.
We propose the concept of "generative AI-in-the-loop".
We believe that combining LLMs and ML models allows both to leverage their respective capabilities and achieve better results than either model alone.
arXiv Detail & Related papers (2024-06-06T17:25:07Z)
- Characterization of Large Language Model Development in the Datacenter [55.9909258342639]
Large Language Models (LLMs) have presented impressive performance across several transformative tasks.
However, it is non-trivial to efficiently utilize large-scale cluster resources to develop LLMs.
We present an in-depth characterization study of a six-month LLM development workload trace collected from our GPU datacenter Acme.
arXiv Detail & Related papers (2024-03-12T13:31:14Z)
- Rethinking Machine Unlearning for Large Language Models [85.92660644100582]
We explore machine unlearning in the domain of large language models (LLMs).
This initiative aims to eliminate undesirable data influence (e.g., sensitive or illegal information) and the associated model capabilities.
arXiv Detail & Related papers (2024-02-13T20:51:58Z)
- Octavius: Mitigating Task Interference in MLLMs via LoRA-MoE [85.76186554492543]
Large Language Models (LLMs) can extend their zero-shot capabilities to multimodal learning through instruction tuning.
However, negative conflicts and interference may degrade performance.
We propose a novel framework, called Octavius, for comprehensive studies and experimentation on multimodal learning with MLLMs.
arXiv Detail & Related papers (2023-11-05T15:48:29Z)
- Model-driven Engineering for Machine Learning Components: A Systematic Literature Review [8.810090413018798]
We analyzed studies with respect to several areas of interest and identified the key motivations behind using MDE4ML.
We also discuss the gaps in existing literature and provide recommendations for future work.
arXiv Detail & Related papers (2023-11-01T04:29:47Z)
- Are We Closing the Loop Yet? Gaps in the Generalizability of VIS4ML Research [26.829392755701843]
We survey recent VIS4ML papers to assess the generalizability of research contributions and claims in enabling human-in-the-loop ML.
Our results show potential gaps between the current scope of VIS4ML research and aspirations for its use in practice.
arXiv Detail & Related papers (2023-08-10T21:44:48Z)
- Hierarchical Optimization-Derived Learning [58.69200830655009]
We establish a new framework, named Hierarchical ODL (HODL), to simultaneously investigate the intrinsic behaviors of optimization-derived model construction and its corresponding learning process.
This is the first theoretical guarantee for these two coupled ODL components: optimization and learning.
arXiv Detail & Related papers (2023-02-11T03:35:13Z)
- Machine Learning for Software Engineering: A Tertiary Study [13.832268599253412]
Machine learning (ML) techniques increase the effectiveness of software engineering (SE) lifecycle activities.
We systematically collected, quality-assessed, summarized, and categorized 83 reviews in ML for SE published between 2009 and 2022, covering 6,117 primary studies.
The SE areas most tackled with ML are software quality and testing, while human-centered areas appear more challenging for ML.
arXiv Detail & Related papers (2022-11-17T09:19:53Z)
- Machine Learning vs. Deep Learning in 5G Networks -- A Comparison of Scientific Impact [0.0]
Machine learning (ML) and deep learning (DL) techniques are used in 5G networks.
Our study aims to uncover the differences in scientific impact between these two techniques by means of statistical bibliometrics.
The Web of Science (WoS) database hosts 2245 papers for ML-related and 1407 papers for DL-related studies.
arXiv Detail & Related papers (2022-10-13T19:54:17Z)
- MAML is a Noisy Contrastive Learner [72.04430033118426]
Model-agnostic meta-learning (MAML) is one of the most popular and widely adopted meta-learning algorithms today.
We provide a new perspective on the working mechanism of MAML and discover that MAML is analogous to a meta-learner using a supervised contrastive objective function.
We propose a simple but effective technique, zeroing trick, to alleviate such interference.
arXiv Detail & Related papers (2021-06-29T12:52:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.