LSTM based models stability in the context of Sentiment Analysis for
social media
- URL: http://arxiv.org/abs/2211.11246v1
- Date: Mon, 21 Nov 2022 08:31:30 GMT
- Title: LSTM based models stability in the context of Sentiment Analysis for
social media
- Authors: Bousselham El Haddaoui, Raddouane Chiheb, Rdouan Faizi and Abdellatif
El Afia
- Abstract summary: We present various LSTM models and their key parameters.
We perform experiments to test the stability of these models in the context of Sentiment Analysis.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning techniques have proven their effectiveness for Sentiment
Analysis (SA) related tasks. Recurrent neural networks (RNN), especially Long
Short-Term Memory (LSTM) and Bidirectional LSTM, have become a reference for
building accurate predictive models. However, the complexity of these models
and the number of hyperparameters to configure raise several questions about
their stability. In this paper, we present various LSTM models and their key
parameters, and we perform experiments to test the stability of these models in
the context of Sentiment Analysis.
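As a rough illustration of the stability experiments described above, here is a
minimal Python/Keras sketch (not the authors' code): it retrains the same
Bidirectional LSTM sentiment classifier under several random seeds and reports
the spread of validation accuracy. The vocabulary size, layer widths, epoch
count, and seed set are illustrative assumptions.

    # Minimal sketch of a BiLSTM sentiment classifier plus a seed-variance
    # stability check; all hyperparameters are illustrative assumptions.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    VOCAB_SIZE = 20000   # assumed vocabulary size
    EMBED_DIM = 128      # assumed embedding width
    LSTM_UNITS = 64      # assumed recurrent width

    def build_model(seed):
        tf.keras.utils.set_random_seed(seed)  # fix Python/NumPy/TF RNGs
        model = models.Sequential([
            layers.Embedding(VOCAB_SIZE, EMBED_DIM),
            layers.Bidirectional(layers.LSTM(LSTM_UNITS)),
            layers.Dropout(0.5),
            layers.Dense(1, activation="sigmoid"),  # binary sentiment
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model

    def stability_check(x_train, y_train, seeds=(0, 1, 2, 3, 4)):
        # Retrain the identical architecture under different seeds; a
        # large spread in validation accuracy flags an unstable setup.
        scores = []
        for seed in seeds:
            model = build_model(seed)
            history = model.fit(x_train, y_train, validation_split=0.2,
                                epochs=3, batch_size=64, verbose=0)
            scores.append(history.history["val_accuracy"][-1])
        return float(np.mean(scores)), float(np.std(scores))

A low standard deviation across seeds suggests a stable configuration;
repeating the sweep over hyperparameter settings extends the same idea.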
Related papers
- Advancing Financial Risk Prediction Through Optimized LSTM Model Performance and Comparative Analysis [12.575399233846092]
This paper focuses on the application and optimization of the LSTM model in financial risk prediction.
The optimized LSTM model shows significant advantages in the AUC metric compared with random forest, a BP neural network, and XGBoost.
arXiv Detail & Related papers (2024-05-31T03:31:17Z)
- HOPE for a Robust Parameterization of Long-memory State Space Models [51.66430224089725]
State-space models (SSMs) that utilize linear, time-invariant (LTI) systems are known for their effectiveness in learning long sequences.
We develop a new parameterization scheme, called HOPE, for LTI systems that utilize Markov parameters within Hankel operators.
Our new parameterization endows the SSM with non-decaying memory within a fixed time window, which is empirically corroborated by a sequential CIFAR-10 task with padded noise.
arXiv Detail & Related papers (2024-05-22T20:20:14Z)
- Latent Semantic Consensus For Deterministic Geometric Model Fitting [109.44565542031384]
We propose an effective method called Latent Semantic Consensus (LSC).
LSC formulates the model fitting problem into two latent semantic spaces based on data points and model hypotheses.
LSC is able to provide consistent and reliable solutions within only a few milliseconds for general multi-structural model fitting.
arXiv Detail & Related papers (2024-03-11T05:35:38Z)
- Efficient CNN-LSTM based Parameter Estimation of Levy Driven Stochastic Differential Equations [0.0]
This study addresses the challenges of parameter estimation for stochastic differential equations driven by non-Gaussian noise.
Previous research highlighted the potential of LSTM networks for estimating the parameters of alpha-stable Levy-driven SDEs.
We introduce PEnet, a novel CNN-LSTM-based three-stage model that offers an end-to-end approach with superior accuracy and adaptability to varying data structures.
arXiv Detail & Related papers (2024-03-07T06:07:31Z)
- Understanding Self-attention Mechanism via Dynamical System Perspective [58.024376086269015]
The self-attention mechanism (SAM) is widely used in various fields of artificial intelligence.
We show that the intrinsic stiffness phenomenon (SP) found in high-precision solutions of ordinary differential equations (ODEs) also widely exists in high-performance neural networks (NNs).
We show that SAM is also a stiffness-aware step-size adaptor that can enhance the model's representational ability to measure intrinsic SP.
arXiv Detail & Related papers (2023-08-19T08:17:41Z)
- Switching Autoregressive Low-rank Tensor Models [12.461139675114818]
We propose switching autoregressive low-rank tensor (SALT) models.
SALT parameterizes the tensor of an autoregressive hidden Markov model (ARHMM) with a low-rank factorization to control the number of parameters.
We prove theoretical connections, and discuss practical ones, between SALT, linear dynamical systems, and switching linear dynamical systems (SLDSs).
arXiv Detail & Related papers (2023-06-05T22:25:28Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors from regularity structures for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- The DONUT Approach to Ensemble Combination Forecasting [0.0]
This paper presents an ensemble forecasting method that shows strong results on the M4 Competition dataset.
Our assumption reductions, consisting mainly of auto-generated features and a more diverse model pool, significantly outperform the statistical-feature-based ensemble method FFORMA.
We also present a formal ex-post-facto analysis of optimal combination and selection for ensembles, quantifying differences through linear optimization on the M4 dataset.
arXiv Detail & Related papers (2022-01-02T22:19:26Z)
- A journey in ESN and LSTM visualisations on a language task [77.34726150561087]
We trained ESNs and LSTMs on a Cross-Situational Learning (CSL) task.
The results are of three kinds: performance comparison, internal dynamics analyses and visualization of latent space.
arXiv Detail & Related papers (2020-12-03T08:32:01Z)
- On the Sparsity of Neural Machine Translation Models [65.49762428553345]
We investigate whether redundant parameters can be reused to achieve better performance.
Experiments and analyses are systematically conducted on different datasets and NMT architectures.
arXiv Detail & Related papers (2020-10-06T11:47:20Z)
- Sentiment Analysis Using Simplified Long Short-term Memory Recurrent Neural Networks [1.5146765382501612]
We perform sentiment analysis on a GOP Debate Twitter dataset.
To speed up training and reduce computational cost and time, six parameter-reduced slim versions of the LSTM model are proposed (see the parameter-count sketch after this list).
arXiv Detail & Related papers (2020-05-08T12:50:10Z)
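For the slim-LSTM entry above, a back-of-the-envelope parameter count in Python
shows why gate-reduced variants cut training cost. The 4-gate formula is the
standard LSTM count; the six slim variants in that paper may prune differently,
so the gates=3 case below is only a hypothetical example.

    # Standard recurrent-cell parameter count: each gate owns an input
    # matrix (embed_dim x hidden), a recurrent matrix (hidden x hidden),
    # and a bias vector (hidden). A full LSTM has 4 gates, so removing
    # gates shrinks the parameter count linearly.
    def cell_params(embed_dim, hidden, gates=4):
        return gates * (embed_dim * hidden + hidden * hidden + hidden)

    full = cell_params(128, 64)           # standard 4-gate LSTM
    slim = cell_params(128, 64, gates=3)  # hypothetical gate-removed variant
    print(full, slim, f"{1 - slim / full:.0%} fewer parameters")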
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and accepts no responsibility for any consequences of its use.