Effective user intent mining with unsupervised word representation models and topic modelling
- URL: http://arxiv.org/abs/2109.01765v1
- Date: Sat, 4 Sep 2021 01:52:12 GMT
- Title: Effective user intent mining with unsupervised word representation models and topic modelling
- Authors: Bencheng Wei
- Abstract summary: The explosion of e-commerce has led to a significant increase in text conversations between customers and agents.
We propose an approach to mining the conversation intents behind the textual data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding the intent behind chats between customers and customer service
agents has become a crucial problem due to the exponential increase in the use
of the Internet by people from different cultures and educational backgrounds.
More importantly, the explosion of e-commerce has led to a significant increase
in text conversations between customers and agents. In this paper, we propose
an approach to mining the conversation intents behind the textual data. Using a
customer service data set, we train unsupervised text representation models and
then develop an intent mapping model that ranks the predefined intents based on
the cosine similarity between sentences and intents. Topic-modelling techniques
are used to define the intents, and domain experts are involved in interpreting
the topic-modelling results. With this approach, we gain a good understanding
of the user intentions behind the unlabelled customer service textual data.
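To make the pipeline concrete, the following is a minimal, hypothetical sketch of the approach described above, assuming gensim's Word2Vec for the unsupervised word representations, LDA for the topic-modelling step, and averaged word vectors as sentence embeddings; the toy corpus, intent names, keyword lists, and hyperparameters are illustrative placeholders rather than the paper's actual data or model choices.
```python
# Hypothetical sketch of the two-step pipeline from the abstract:
# (1) topic modelling to surface candidate intents, (2) unsupervised word
# representations + cosine similarity to rank predefined intents per utterance.
import numpy as np
from gensim import corpora
from gensim.models import LdaModel, Word2Vec

# Toy tokenised customer-service corpus; in practice this is the unlabelled chat data.
corpus = [
    ["where", "is", "my", "order"],
    ["my", "package", "has", "not", "arrived", "yet"],
    ["i", "want", "a", "refund", "for", "this", "item"],
    ["please", "refund", "my", "payment"],
    ["how", "do", "i", "reset", "my", "password"],
    ["i", "cannot", "log", "into", "my", "account"],
]

# Step 1: topic modelling to suggest candidate intents. Domain experts would
# inspect the top words per topic and assign intent labels.
dictionary = corpora.Dictionary(corpus)
bow = [dictionary.doc2bow(doc) for doc in corpus]
lda = LdaModel(bow, num_topics=3, id2word=dictionary, passes=20, random_state=0)
for topic_id, words in lda.show_topics(num_topics=3, num_words=4, formatted=False):
    print(topic_id, [w for w, _ in words])

# Predefined intents as keyword phrases (what experts might distil from the topics).
intents = {
    "order_tracking": ["order", "package", "arrived"],
    "refund_request": ["refund", "payment", "item"],
    "account_access": ["reset", "password", "account"],
}

# Step 2: unsupervised word representations and cosine-similarity intent ranking.
w2v = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=100, seed=0)

def embed(tokens):
    """Average the word vectors of the in-vocabulary tokens."""
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

def rank_intents(utterance_tokens):
    """Rank the predefined intents by cosine similarity to the utterance embedding."""
    u = embed(utterance_tokens)
    scores = {}
    for name, keywords in intents.items():
        v = embed(keywords)
        denom = np.linalg.norm(u) * np.linalg.norm(v)
        scores[name] = float(u @ v / denom) if denom else 0.0
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_intents(["my", "order", "never", "arrived"]))
```
In the paper, the intent definitions come from topic-modelling output interpreted by domain experts; the hard-coded keyword lists above simply stand in for that review step.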
Related papers
- Human-Object Interaction Detection Collaborated with Large Relation-driven Diffusion Models [65.82564074712836]
We introduce DIFfusionHOI, a new HOI detector shedding light on text-to-image diffusion models.
We first devise an inversion-based strategy to learn the expression of relation patterns between humans and objects in embedding space.
These learned relation embeddings then serve as textual prompts to steer diffusion models to generate images that depict specific interactions.
arXiv Detail & Related papers (2024-10-26T12:00:33Z)
- Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference [5.618638372635474]
We develop a text-graph-based few-shot learning model for attribute inferences on social media text data.
Our model first constructs and refines a text graph using manifold learning and message passing.
To further use cross-domain and unlabeled texts to improve few-shot performance, a hierarchical knowledge distillation is devised over the text graph.
arXiv Detail & Related papers (2024-01-10T05:50:34Z)
- DIGMN: Dynamic Intent Guided Meta Network for Differentiated User Engagement Forecasting in Online Professional Social Platforms [32.70471436337077]
A major reason for the differences in user engagement patterns is that users have different intents.
We propose a Dynamic Intent Guided Meta Network (DIGMN) which can explicitly model user intent varying with time.
Our method outperforms state-of-the-art baselines significantly.
arXiv Detail & Related papers (2022-10-22T09:57:27Z)
- Unsupervised Neural Stylistic Text Generation using Transfer learning and Adapters [66.17039929803933]
We propose a novel transfer learning framework which updates only 0.3% of model parameters to learn style-specific attributes for response generation.
We learn style-specific attributes from the PERSONALITY-CAPTIONS dataset.
arXiv Detail & Related papers (2022-10-07T00:09:22Z)
- Improved Goal Oriented Dialogue via Utterance Generation and Look Ahead [5.062869359266078]
Intent prediction can be improved by training a deep text-to-text neural model to generate successive user utterances from unlabeled dialogue data.
We present a novel look-ahead approach that uses user utterance generation to improve intent prediction in time.
arXiv Detail & Related papers (2021-10-24T11:12:48Z)
- Context-aware Heterogeneous Graph Attention Network for User Behavior Prediction in Local Consumer Service Platform [8.30503479549857]
Local consumer service platforms, such as Groupon and Koubei, provide users with software for consuming services at nearby stores or at home.
The behavior of users on the local consumer service platform is closely related to their real-time local context information.
We propose a context-aware heterogeneous graph attention network (CHGAT) to generate the representation of the user and to estimate the probability for future behavior.
arXiv Detail & Related papers (2021-06-24T03:08:21Z)
- Enhancing Dialogue Generation via Multi-Level Contrastive Learning [57.005432249952406]
We propose a multi-level contrastive learning paradigm to model the fine-grained quality of the responses with respect to the query.
A Rank-aware Calibration (RC) network is designed to construct the multi-level contrastive optimization objectives.
We build a Knowledge Inference (KI) component to capture the keyword knowledge from the reference during training and exploit such information to encourage the generation of informative words.
arXiv Detail & Related papers (2020-09-19T02:41:04Z)
- Disentangled Graph Collaborative Filtering [100.26835145396782]
Disentangled Graph Collaborative Filtering (DGCF) is a new model for learning informative representations of users and items from interaction data.
By modeling a distribution over intents for each user-item interaction, we iteratively refine the intent-aware interaction graphs and representations.
DGCF achieves significant improvements over several state-of-the-art models like NGCF, DisenGCN, and MacridVAE.
arXiv Detail & Related papers (2020-07-03T15:37:25Z)
- Intent Mining from past conversations for conversational agent [1.9754522186574608]
Bots are increasingly being deployed to provide round-the-clock support and to increase customer engagement.
Many of the commercial bot building frameworks follow a standard approach that requires one to build and train an intent model to recognize a user input.
We have introduced a novel density-based clustering algorithm, ITER-DBSCAN, for unbalanced data clustering.
arXiv Detail & Related papers (2020-05-22T05:29:13Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks learning over raw text with the guidance from knowledge graphs.
Building upon entity-level masked language models, our first contribution is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
arXiv Detail & Related papers (2020-04-29T14:22:42Z)
- IART: Intent-aware Response Ranking with Transformers in Information-seeking Conversation Systems [80.0781718687327]
We analyze user intent patterns in information-seeking conversations and propose an intent-aware neural response ranking model, "IART".
IART is built on top of the integration of user intent modeling and language representation learning with the Transformer architecture.
arXiv Detail & Related papers (2020-02-03T05:59:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.