Topic Scaling: A Joint Document Scaling -- Topic Model Approach To Learn
Time-Specific Topics
- URL: http://arxiv.org/abs/2104.01117v1
- Date: Wed, 31 Mar 2021 12:35:36 GMT
- Title: Topic Scaling: A Joint Document Scaling -- Topic Model Approach To Learn
Time-Specific Topics
- Authors: Sami Diaf and Ulrich Fritsche
- Abstract summary: This paper proposes a new methodology to study sequential corpora by implementing a two-stage algorithm that learns time-based topics with respect to a scale of document positions.
The first stage ranks documents using Wordfish to estimate document positions that serve as a dependent variable to learn relevant topics.
The second stage ranks the inferred topics on the document scale to match their occurrences within the corpus and track their evolution.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a new methodology to study sequential corpora by
implementing a two-stage algorithm that learns time-based topics with respect
to a scale of document positions and introduces the concept of Topic Scaling
which ranks learned topics within the same document scale. The first stage
ranks documents using Wordfish, a Poisson-based document scaling method, to
estimate document positions that serve, in the second stage, as a dependent
variable to learn relevant topics via a supervised Latent Dirichlet Allocation.
This approach brings two innovations to text mining: it explains document
positions, whose scale is a latent variable, and it ranks the inferred topics on
the document scale to match their occurrences within the corpus and track their
evolution. Tested on two-party U.S. State of the Union addresses, this
inductive approach reveals that each party dominates one end of the learned
scale, with interchangeable transitions that follow the parties' terms of office.
Besides demonstrating high accuracy in predicting in-sample document
positions from topic scores, this method reveals further hidden topics that
differentiate similar documents: increasing the number of learned topics
unfolds potential nested hierarchical topic structures. Compared to other
popular topic models, Topic Scaling learns topics with respect to document
similarities without specifying a time frequency to learn topic evolution, thus
capturing broader topic patterns than dynamic topic models and yielding more
interpretable outputs than a plain latent Dirichlet allocation.
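The two-stage pipeline described in the abstract can be sketched in Python. Wordfish and supervised LDA have no standard scikit-learn implementations, so this minimal sketch substitutes correspondence analysis for the Poisson scaling stage and plain LDA plus a linear regression for the supervised topic stage; all data and variable names here are synthetic assumptions, not the authors' code.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic document-term matrix: two "parties" favouring different vocabulary.
n_docs, n_words = 20, 50
counts = rng.poisson(2.0, size=(n_docs, n_words)).astype(float)
counts[:10, :25] += rng.poisson(3.0, size=(10, 25))   # party A, first half of vocab
counts[10:, 25:] += rng.poisson(3.0, size=(10, 25))   # party B, second half

# Stage 1 (stand-in for Wordfish): correspondence analysis, which also places
# documents on a single latent left-right scale from a count matrix.
P = counts / counts.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))    # standardized residuals
U, sing, Vt = np.linalg.svd(S, full_matrices=False)
positions = (U[:, 0] * sing[0]) / np.sqrt(r)          # document positions

# Stage 2 (stand-in for supervised LDA): fit plain LDA, then regress the
# document positions on the topic proportions.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)
reg = LinearRegression().fit(theta, positions)

# Topic Scaling: rank topics by their estimated location on the document scale.
topic_order = np.argsort(reg.coef_)
print("document positions:", np.round(positions[:3], 2))
print("topic ranking on the scale:", topic_order)
```

In the actual method the supervised topic stage fits positions and topics jointly, which is what lets topic scores explain the latent scale rather than merely correlate with it after the fact.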
Related papers
- Visualizing Temporal Topic Embeddings with a Compass [1.5184974790808403]
This paper proposes an expansion of the compass-aligned temporal Word2Vec methodology into dynamic topic modeling.
Such a method allows for the direct comparison of word and document embeddings across time in dynamic topics.
arXiv Detail & Related papers (2024-09-16T18:29:19Z)
- Topics in the Haystack: Extracting and Evaluating Topics beyond Coherence [0.0]
We propose a method that incorporates a deeper understanding of both sentence and document themes.
This allows our model to detect latent topics that may include uncommon words or neologisms.
We report correlation coefficients with human identification of intruder words and achieve near-human-level results on the word-intrusion task.
arXiv Detail & Related papers (2023-03-30T12:24:25Z)
- Topic Taxonomy Expansion via Hierarchy-Aware Topic Phrase Generation [58.3921103230647]
We propose a novel framework for topic taxonomy expansion, named TopicExpan.
TopicExpan directly generates topic-related terms belonging to new topics.
Experimental results on two real-world text corpora show that TopicExpan significantly outperforms other baseline methods in terms of the quality of output.
arXiv Detail & Related papers (2022-10-18T22:38:49Z)
- Knowledge-Aware Bayesian Deep Topic Model [50.58975785318575]
We propose a Bayesian generative model for incorporating prior domain knowledge into hierarchical topic modeling.
Our proposed model efficiently integrates the prior knowledge and improves both hierarchical topic discovery and document representation.
arXiv Detail & Related papers (2022-09-20T09:16:05Z)
- Distant finetuning with discourse relations for stance classification [55.131676584455306]
We propose a new method to extract data with silver labels from raw text to finetune a model for stance classification.
We also propose a 3-stage training framework where the noisy level in the data used for finetuning decreases over different stages.
Our approach ranks 1st among 26 competing teams in the stance classification track of the NLPCC 2021 shared task Argumentative Text Understanding for AI Debater.
arXiv Detail & Related papers (2022-04-27T04:24:35Z)
- Representing Mixtures of Word Embeddings with Mixtures of Topic Embeddings [46.324584649014284]
A topic model is often formulated as a generative model that explains how each word of a document is generated given a set of topics and document-specific topic proportions.
This paper introduces a new topic-modeling framework where each document is viewed as a set of word embedding vectors and each topic is modeled as an embedding vector in the same embedding space.
Embedding the words and topics in the same vector space, we define a method to measure the semantic difference between the embedding vectors of a document's words and those of the topics, and optimize the topic embeddings to minimize the expected difference over all documents.
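The embedding-space view in this summary can be illustrated with a toy sketch: words and topics share one vector space, and topic vectors are moved by gradient descent to shrink their distance to the words they attract. This simplified nearest-topic squared-distance objective (essentially k-means by gradient steps) is an assumption for illustration, not the paper's actual objective; the data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy word embeddings: 100 words in a 5-dim space, two semantic clusters.
word_vecs = np.vstack([rng.normal(-1.0, 0.3, (50, 5)),
                       rng.normal(1.0, 0.3, (50, 5))])

# Documents as bags of word indices, one document per cluster.
docs = [rng.choice(50, size=20), rng.choice(np.arange(50, 100), size=20)]

# Topic embeddings live in the same space; minimize each word's squared
# distance to its nearest topic vector by gradient descent.
topics = rng.normal(0.0, 1.0, (2, 5))
n_words = sum(len(d) for d in docs)
for _ in range(100):
    grad = np.zeros_like(topics)
    for doc in docs:
        vecs = word_vecs[doc]
        dists = ((vecs[:, None, :] - topics[None, :, :]) ** 2).sum(-1)
        nearest = dists.argmin(axis=1)       # assign each word to a topic
        for k in range(len(topics)):
            grad[k] += (topics[k] - vecs[nearest == k]).sum(axis=0)
    topics -= 0.1 * grad / n_words           # small step toward assigned words

print(np.round(topics, 2))                   # learned topic vectors
```

Sharing one space is what makes the word-topic distance well defined; without it, comparing a topic to a document's words would require a learned alignment between two unrelated spaces.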
arXiv Detail & Related papers (2022-03-03T08:46:23Z)
- Changepoint Analysis of Topic Proportions in Temporal Text Data [1.8262547855491456]
We build a specialised temporal topic model with provisions for changepoints in the distribution of topic proportions.
We use sample splitting to estimate topic polytopes first and then apply a likelihood ratio statistic.
We obtain some historically well-known changepoints and discover some new ones.
arXiv Detail & Related papers (2021-11-29T17:20:51Z)
- TopicNet: Semantic Graph-Guided Topic Discovery [51.71374479354178]
Existing deep hierarchical topic models are able to extract semantically meaningful topics from a text corpus in an unsupervised manner.
We introduce TopicNet as a deep hierarchical topic model that can inject prior structural knowledge as an inductive bias to influence learning.
arXiv Detail & Related papers (2021-10-27T09:07:14Z)
- Author Clustering and Topic Estimation for Short Texts [69.54017251622211]
We propose a novel model that expands on the Latent Dirichlet Allocation by modeling strong dependence among the words in the same document.
We also simultaneously cluster users, removing the need for post-hoc cluster estimation.
Our method performs as well as, or better than, traditional approaches to problems arising in short text.
arXiv Detail & Related papers (2021-06-15T20:55:55Z)
- Topical Change Detection in Documents via Embeddings of Long Sequences [4.13878392637062]
We formulate the task of text segmentation as an independent supervised prediction task.
By fine-tuning on paragraphs of similar sections, we are able to show that learned features encode topic information.
Unlike previous approaches, which mostly operate at the sentence level, we consistently use a broader context.
arXiv Detail & Related papers (2020-12-07T12:09:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.