Argument Mining using BERT and Self-Attention based Embeddings
- URL: http://arxiv.org/abs/2302.13906v1
- Date: Mon, 27 Feb 2023 15:52:31 GMT
- Title: Argument Mining using BERT and Self-Attention based Embeddings
- Authors: Pranjal Srivastava, Pranav Bhatnagar, Anurag Goel
- Abstract summary: In this paper, a novel methodology for argument mining is proposed.
It employs attention-based embeddings for link prediction to model the causational hierarchies in typical argument structures prevalent in online discourse.
- Score: 1.6114012813668934
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Argument mining automatically identifies and extracts the structure of
inference and reasoning conveyed in natural language arguments. To the best of
our knowledge, most state-of-the-art work in this field has focused on
tree-like structures and linguistic modeling. However, these approaches cannot
model the more complex argumentation structures that are often found in online
forums and real-world discourse. In this paper, a novel methodology for
argument mining is proposed which employs attention-based embeddings for link
prediction to model the causational hierarchies in typical argument structures
prevalent in online discourse.
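The abstract describes the method only at a high level. As a rough, non-authoritative sketch of what "attention-based embeddings for link prediction" over argument components might look like, the snippet below pools a BERT embedding per component and scores directed links with an attention-style bilinear head; the model name, pooling choice, scoring head, and example claims are all assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch, not the paper's released code: embed each argument
# component with BERT, then produce an asymmetric matrix of link scores via
# a scaled dot-product (self-attention style) head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class ArgumentLinkScorer(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased"):  # assumed encoder
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(encoder_name)
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Separate "source" and "target" projections so the score matrix can be
        # asymmetric, since argumentative links are directed.
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)

    def forward(self, components: list[str]) -> torch.Tensor:
        # One [CLS] embedding per argument component.
        batch = self.tokenizer(components, padding=True, truncation=True,
                               return_tensors="pt")
        cls = self.encoder(**batch).last_hidden_state[:, 0, :]      # (n, hidden)
        # Scaled dot-product scores: entry (i, j) ~ "component i links to j".
        return self.query(cls) @ self.key(cls).T / cls.size(-1) ** 0.5


if __name__ == "__main__":
    scorer = ArgumentLinkScorer().eval()
    claims = [
        "Remote work should be the default.",
        "Commuting wastes several hours per week.",
        "Office time improves collaboration.",
    ]
    with torch.no_grad():
        print(scorer(claims))  # 3x3 matrix of raw, untrained link scores
```

In a real pipeline this score matrix would be trained with a link-prediction objective (e.g. binary cross-entropy against annotated argument graphs) and thresholded to recover the non-tree-like link structure the abstract refers to.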
Related papers
- Hidden Holes: topological aspects of language models [1.1172147007388977]
We study the evolution of topological structure in GPT based large language models across depth and time during training.
We show that the latter exhibit more topological complexity, with a distinct pattern of changes common to all natural languages but absent from synthetically generated data.
arXiv Detail & Related papers (2024-06-09T14:25:09Z)
- A Unifying Framework for Learning Argumentation Semantics [50.69905074548764]
We present a novel framework, which uses an Inductive Logic Programming approach to learn the acceptability semantics for several abstract and structured argumentation frameworks in an interpretable way.
Our framework outperforms existing argumentation solvers, thus opening up new research directions in the area of formal argumentation and human-machine dialogues (a minimal sketch of one standard acceptability semantics appears after this list).
arXiv Detail & Related papers (2023-10-18T20:18:05Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Physics of Language Models: Part 1, Learning Hierarchical Language Structures [51.68385617116854]
Transformer-based language models are effective but complex, and understanding their inner workings is a significant challenge.
We introduce a family of synthetic CFGs that produce hierarchical rules, capable of generating lengthy sentences.
We demonstrate that generative models like GPT can accurately learn this CFG language and generate sentences based on it.
arXiv Detail & Related papers (2023-05-23T04:28:16Z)
- Automatic Debate Evaluation with Argumentation Semantics and Natural Language Argument Graph Networks [2.4861619769660637]
We propose an original hybrid method to automatically evaluate argumentative debates.
For that purpose, we combine concepts from argumentation theory with Transformer-based architectures and neural graph networks.
We obtain promising results that lay the basis for a previously unexplored form of automatic analysis of natural language arguments.
arXiv Detail & Related papers (2022-03-28T11:09:07Z)
- A Formalisation of Abstract Argumentation in Higher-Order Logic [77.34726150561087]
We present an approach for representing abstract argumentation frameworks based on an encoding into classical higher-order logic.
This provides a uniform framework for computer-assisted assessment of abstract argumentation frameworks using interactive and automated reasoning tools.
arXiv Detail & Related papers (2021-10-18T10:45:59Z)
- Exploring Discourse Structures for Argument Impact Classification [48.909640432326654]
This paper empirically shows that the discourse relations between two arguments along the context path are essential factors for identifying the persuasive power of an argument.
We propose DisCOC to inject and fuse the sentence-level structural information with contextualized features derived from large-scale language models.
arXiv Detail & Related papers (2021-06-02T06:49:19Z)
- High-order Semantic Role Labeling [86.29371274587146]
This paper introduces a high-order graph structure for the neural semantic role labeling model.
It enables the model to explicitly consider not only the isolated predicate-argument pairs but also the interaction between the predicate-argument pairs.
Experimental results on 7 languages of the CoNLL-2009 benchmark show that the high-order structural learning techniques are beneficial to the strong performing SRL models.
arXiv Detail & Related papers (2020-10-09T15:33:54Z)
- AMPERSAND: Argument Mining for PERSuAsive oNline Discussions [41.06165177604387]
We propose a computational model for argument mining in online persuasive discussion forums.
Our approach relies on identifying relations between components of arguments in a discussion thread.
Our models obtain significant improvements compared to recent state-of-the-art approaches.
arXiv Detail & Related papers (2020-04-30T10:33:40Z)
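Two of the entries above (the unifying framework for learning acceptability semantics and the higher-order-logic formalisation) concern Dung-style abstract argumentation frameworks. As a minimal, self-contained illustration of one standard acceptability semantics, rather than anything taken from those papers, the sketch below computes the grounded extension of such a framework as the least fixpoint of the characteristic function; the toy framework is invented for the example.

```python
# Minimal sketch of grounded semantics for a Dung-style abstract argumentation
# framework: iterate the characteristic function F(S) = {a | a is defended by S}
# from the empty set until it reaches a fixpoint.
def grounded_extension(arguments, attacks):
    """arguments: iterable of labels; attacks: set of (attacker, target) pairs."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}

    def defended(candidate, current):
        # Every attacker of `candidate` must itself be attacked by `current`.
        return all(any((d, att) in attacks for d in current)
                   for att in attackers[candidate])

    extension = set()
    while True:
        nxt = {a for a in arguments if defended(a, extension)}
        if nxt == extension:      # fixpoint reached: this is the grounded extension
            return extension
        extension = nxt


if __name__ == "__main__":
    args = {"a", "b", "c"}
    atts = {("a", "b"), ("b", "c")}   # a attacks b, b attacks c
    print(sorted(grounded_extension(args, atts)))  # ['a', 'c']
```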
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.