Research on the Application of Deep Learning-based BERT Model in
Sentiment Analysis
- URL: http://arxiv.org/abs/2403.08217v1
- Date: Wed, 13 Mar 2024 03:31:26 GMT
- Title: Research on the Application of Deep Learning-based BERT Model in
Sentiment Analysis
- Authors: Yichao Wu, Zhengyu Jin, Chenxi Shi, Penghao Liang, Tong Zhan
- Abstract summary: This paper explores the application of deep learning techniques, particularly focusing on BERT models, in sentiment analysis.
It elucidates the application effects and optimization strategies of BERT models in sentiment analysis, supported by experimental validation.
The experimental findings indicate that BERT models exhibit robust performance in sentiment analysis tasks, with notable improvements after fine-tuning.
- Score: 8.504422968998506
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper explores the application of deep learning techniques, particularly
focusing on BERT models, in sentiment analysis. It begins by introducing the
fundamental concept of sentiment analysis and how deep learning methods are
utilized in this domain. Subsequently, it delves into the architecture and
characteristics of BERT models. Through detailed explanation, it elucidates the
application effects and optimization strategies of BERT models in sentiment
analysis, supported by experimental validation. The experimental findings
indicate that BERT models exhibit robust performance in sentiment analysis
tasks, with notable improvements after fine-tuning. Lastly, the paper concludes
by summarizing the potential applications of BERT models in sentiment analysis
and suggests directions for future research and practical implementations.
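As a concrete illustration of the fine-tuning workflow the abstract describes, the following is a minimal sketch using the Hugging Face transformers library. The checkpoint, dataset, and hyperparameters here are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of fine-tuning BERT for binary sentiment classification.
# Checkpoint, dataset, and hyperparameters are assumptions for illustration,
# not the paper's exact setup.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # positive / negative

dataset = load_dataset("imdb")  # a common sentiment benchmark, assumed here

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-sentiment",
    learning_rate=2e-5,              # a typical BERT fine-tuning rate
    per_device_train_batch_size=16,
    num_train_epochs=3,
)

Trainer(model=model, args=args,
        train_dataset=dataset["train"],
        eval_dataset=dataset["test"],
        tokenizer=tokenizer).train()
```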
Related papers
- See Further for Parameter Efficient Fine-tuning by Standing on the Shoulders of Decomposition [56.87609859444084]
Parameter-efficient fine-tuning (PEFT) optimizes a select subset of parameters while keeping the rest fixed, significantly lowering computational and storage overheads.
We take the first step to unify all approaches by dissecting them from a decomposition perspective.
We introduce two novel PEFT methods alongside a simple yet effective framework designed to enhance the performance of PEFT techniques across various applications.
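The paper's two novel PEFT methods are not reproduced here; as a sketch of the decomposition perspective it builds on, the snippet below shows a well-known PEFT instance, a LoRA-style low-rank update added to a frozen layer.

```python
# Sketch of the low-rank decomposition idea underlying many PEFT methods
# (LoRA-style); the paper's own two methods are not reproduced here.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen dense layer plus a trainable low-rank update: W x + B A x."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # keep pre-trained weights fixed
            p.requires_grad = False
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        # Only A and B (a tiny fraction of the parameters) receive gradients.
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scaling
```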
arXiv Detail & Related papers (2024-07-07T15:44:42Z)
- New Product Development (NPD) through Social Media-based Analysis by Comparing Word2Vec and BERT Word Embeddings [0.0]
Two popular word embedding techniques, Word2Vec and BERT, were evaluated to identify the best-performing approach in sentiment analysis and opinion detection.
BERT word embeddings combined with Balanced Random Forest yielded the most accurate single model for both sentiment analysis and opinion detection.
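A minimal sketch of the pipeline this summary describes, pairing BERT embeddings with imbalanced-learn's Balanced Random Forest; the checkpoint, [CLS] pooling choice, and the train_texts/train_labels/test_texts variables are assumptions.

```python
# Sketch: BERT [CLS] embeddings fed to a Balanced Random Forest, approximating
# the pipeline the summary describes; pooling and model names are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer
from imblearn.ensemble import BalancedRandomForestClassifier

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased").eval()

def embed(texts):
    """Return one [CLS] vector per input text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch)
    return out.last_hidden_state[:, 0, :].numpy()  # [CLS] token embedding

# train_texts, train_labels, test_texts are assumed to be provided elsewhere.
clf = BalancedRandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(embed(train_texts), train_labels)
preds = clf.predict(embed(test_texts))
```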
arXiv Detail & Related papers (2023-04-17T15:32:11Z)
- Artificial Text Detection via Examining the Topology of Attention Maps [58.46367297712477]
We propose three novel types of interpretable topological features for this task based on Topological Data Analysis (TDA).
We empirically show that features derived from the BERT model outperform count- and neural-based baselines by up to 10% on three common datasets.
A probing analysis of the features reveals their sensitivity to surface and syntactic properties.
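The paper's actual features come from persistent homology of attention graphs; the snippet below is only a simplified stand-in that tracks connected components of a thresholded attention map, a rough proxy for 0-dimensional topology.

```python
# Simplified stand-in for TDA features over attention maps: treat an attention
# matrix as a weighted graph and count connected components across thresholds
# (a proxy for 0-dimensional persistence). The paper's features are richer.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def component_curve(attn: np.ndarray, thresholds=(0.01, 0.05, 0.1, 0.2)):
    """attn: (seq_len, seq_len) attention matrix from one head."""
    sym = np.maximum(attn, attn.T)       # symmetrize -> undirected graph
    feats = []
    for t in thresholds:
        adj = csr_matrix((sym >= t).astype(np.int8))
        n_comp, _ = connected_components(adj, directed=False)
        feats.append(n_comp)             # components surviving at threshold t
    return np.array(feats)
```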
arXiv Detail & Related papers (2021-09-10T12:13:45Z)
- Pre-Trained Models: Past, Present and Future [126.21572378910746]
Large-scale pre-trained models (PTMs) have recently achieved great success and become a milestone in the field of artificial intelligence (AI).
By storing knowledge in huge parameter sets and fine-tuning on specific tasks, the rich knowledge implicitly encoded in those parameters can benefit a variety of downstream tasks.
It is now the consensus of the AI community to adopt PTMs as the backbone for downstream tasks rather than learning models from scratch.
arXiv Detail & Related papers (2021-06-14T02:40:32Z)
- BERT based sentiment analysis: A software engineering perspective [0.9176056742068814]
The paper presents three different strategies to analyse BERT-based models for sentiment analysis.
The experimental results show that the BERT-based ensemble approach and the compressed BERT model attain improvements of 6-12% in F1 measure over prevailing tools on all three datasets.
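A minimal sketch of one such ensemble strategy, majority voting over several fine-tuned BERT checkpoints via averaged softmax outputs; the checkpoint paths are hypothetical and the paper's exact ensembling scheme may differ.

```python
# Sketch of a simple ensemble over fine-tuned BERT checkpoints; the paths are
# hypothetical and the paper's exact ensembling strategy may differ.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoints = ["ckpt-seed0", "ckpt-seed1", "ckpt-seed2"]  # hypothetical paths
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
models = [AutoModelForSequenceClassification.from_pretrained(c).eval()
          for c in checkpoints]

def ensemble_predict(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        # Average the per-model softmax distributions, then take the argmax.
        probs = torch.stack([m(**batch).logits.softmax(-1) for m in models])
    return probs.mean(0).argmax(-1)
```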
arXiv Detail & Related papers (2021-06-04T16:28:26Z)
- Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism [2.8881198461098894]
We investigate a fine-tuning mechanism that utilizes all layers of the BERT model rather than only the last one.
In addition, further analysis indicates that the key knowledge about the relations can be learned from the last layer of the BERT model.
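In the spirit of using all layers during fine-tuning, here is a minimal sketch that mixes the [CLS] states of every encoder layer with learned weights; the softmax weighting scheme is an assumption, not the paper's exact mechanism.

```python
# Sketch of pooling all encoder layers rather than only the last one; the
# learned softmax weighting is an assumption, not the paper's exact mechanism.
import torch
import torch.nn as nn
from transformers import AutoModel

class AllLayerClassifier(nn.Module):
    def __init__(self, name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(name, output_hidden_states=True)
        n_layers = self.bert.config.num_hidden_layers + 1  # + embedding layer
        self.layer_weights = nn.Parameter(torch.zeros(n_layers))
        self.head = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, **inputs):
        hidden = self.bert(**inputs).hidden_states   # tuple of (B, T, H)
        stack = torch.stack(hidden, dim=0)           # (L, B, T, H)
        w = self.layer_weights.softmax(0).view(-1, 1, 1, 1)
        cls = (stack * w).sum(0)[:, 0, :]            # weighted [CLS] mixture
        return self.head(cls)
```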
arXiv Detail & Related papers (2020-11-01T01:47:16Z)
- On Robustness and Bias Analysis of BERT-based Relation Extraction [40.64969232497321]
We analyze a fine-tuned BERT model from different perspectives using relation extraction.
Through randomization, adversarial, and counterfactual tests, we find that BERT suffers from a robustness bottleneck and exhibits biases.
arXiv Detail & Related papers (2020-09-14T05:24:28Z)
- A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting the syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
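DreGcn's relation-type embeddings and interactive multi-task architecture are not reproduced here; the snippet below only sketches the generic kind of syntax-aware message passing it builds on, a single GCN layer over a dependency-parse adjacency matrix.

```python
# Minimal GCN layer over a dependency-parse adjacency matrix, illustrating the
# kind of syntax-aware message passing DreGcn builds on; the paper's relation-
# type embeddings and interactive architecture are not reproduced.
import torch
import torch.nn as nn

class DepGCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        """h: (B, T, H) token states; adj: (B, T, T) dependency adjacency."""
        adj = adj + torch.eye(adj.size(-1), device=adj.device)  # self-loops
        deg = adj.sum(-1, keepdim=True).clamp(min=1)            # node degrees
        return torch.relu(self.linear(adj @ h) / deg)           # mean aggregation
```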
arXiv Detail & Related papers (2020-04-04T14:59:32Z)
- Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation [84.64004917951547]
Fine-tuning pre-trained language models like BERT has become an effective practice in NLP.
In this paper, we improve the fine-tuning of BERT with two effective mechanisms: self-ensemble and self-distillation.
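One way to realize these two mechanisms is sketched below: a teacher maintained as an exponential moving average of the student's weights (a form of self-ensemble) supplies an extra distillation target. The MSE loss form, decay rate, and HF-style model interface are assumptions rather than the paper's exact recipe.

```python
# Sketch of self-ensemble/self-distillation: an EMA-of-student teacher adds a
# distillation loss. Loss form and decay rate are assumptions, not the paper's
# exact recipe. Models are assumed to be HF sequence classifiers (.logits),
# with the teacher initialized as copy.deepcopy(student).
import torch
import torch.nn.functional as F

def distill_step(student, teacher, batch, labels, optimizer, decay=0.999):
    logits = student(**batch).logits
    with torch.no_grad():
        teacher_logits = teacher(**batch).logits
    loss = (F.cross_entropy(logits, labels)
            + F.mse_loss(logits, teacher_logits))  # self-distillation term
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Self-ensemble: teacher <- EMA of the student's parameters.
    with torch.no_grad():
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(decay).add_(ps, alpha=1 - decay)
    return loss.item()
```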
arXiv Detail & Related papers (2020-02-24T16:17:12Z)
- Utilizing BERT Intermediate Layers for Aspect Based Sentiment Analysis and Natural Language Inference [19.638239426995973]
This paper explores the potential of utilizing BERT's intermediate layers to enhance fine-tuning performance.
To show the generality, we also apply this approach to a natural language inference task.
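In the spirit of this summary, the sketch below pools the per-layer [CLS] vectors with an LSTM; the exact pooling configuration is an assumption for illustration.

```python
# Sketch of pooling per-layer [CLS] vectors with an LSTM, one plausible way to
# use intermediate layers; the exact pooling configuration is assumed.
import torch
import torch.nn as nn
from transformers import AutoModel

class IntermediateLayerPooler(nn.Module):
    def __init__(self, name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(name, output_hidden_states=True)
        h = self.bert.config.hidden_size
        self.lstm = nn.LSTM(h, h, batch_first=True)
        self.head = nn.Linear(h, num_labels)

    def forward(self, **inputs):
        hidden = self.bert(**inputs).hidden_states        # tuple of (B, T, H)
        cls_per_layer = torch.stack([s[:, 0, :] for s in hidden], dim=1)
        _, (final, _) = self.lstm(cls_per_layer)          # read layers in order
        return self.head(final.squeeze(0))
```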
arXiv Detail & Related papers (2020-02-12T06:11:48Z)
- Rethinking Generalization of Neural Models: A Named Entity Recognition Case Study [81.11161697133095]
We take the NER task as a testbed to analyze the generalization behavior of existing models from different perspectives.
Experiments with in-depth analyses diagnose the bottleneck of existing neural NER models.
As a by-product of this paper, we have open-sourced a project that involves a comprehensive summary of recent NER papers.
arXiv Detail & Related papers (2020-01-12T04:33:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.