"Definition Modeling: To model definitions." Generating Definitions With
Little to No Semantics
- URL: http://arxiv.org/abs/2306.08433v1
- Date: Wed, 14 Jun 2023 11:08:38 GMT
- Title: "Definition Modeling: To model definitions." Generating Definitions With
Little to No Semantics
- Authors: Vincent Segonne and Timothee Mickus
- Abstract summary: We present evidence that the task may not involve as much semantics as one might expect.
We show that an earlier model from the literature is both rather insensitive to semantic aspects such as explicit polysemy and reliant on formal similarities between headwords and words occurring in its glosses.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Definition Modeling, the task of generating definitions, was first proposed
as a means to evaluate the semantic quality of word embeddings: a coherent
lexical semantic representation of a word in context should contain all the
information necessary to generate its definition. The relative novelty of this
task entails that we do not know which factors are actually relied upon by a
Definition Modeling system. In this paper, we present evidence that the task
may not involve as much semantics as one might expect: we show how an earlier
model from the literature is both rather insensitive to semantic aspects such
as explicit polysemy and reliant on formal similarities between
headwords and words occurring in its glosses, casting doubt on the validity of
the task as a means to evaluate embeddings.
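The reliance on surface form described in the abstract can be illustrated with a small, hypothetical check: measure character n-gram overlap between a headword and each word of a candidate gloss. The metric below is an illustration of the phenomenon, not the analysis actually performed in the paper.

```python
def char_ngrams(s, n=3):
    """Set of character n-grams of a lowercased string."""
    s = s.lower()
    return {s[i:i + n] for i in range(len(s) - n + 1)}

def formal_overlap(headword, gloss, n=3):
    """Jaccard overlap between the headword's character trigrams and
    those of each gloss token, taking the best-matching token."""
    hw = char_ngrams(headword, n)
    if not hw:
        return 0.0
    best = 0.0
    for tok in gloss.split():
        tg = char_ngrams(tok, n)
        if not tg:
            continue
        best = max(best, len(hw & tg) / len(hw | tg))
    return best

# A gloss that echoes the headword's form scores high:
print(formal_overlap("happiness", "the state of being happy"))  # → 0.25
# An unrelated paraphrase scores zero:
print(formal_overlap("happiness", "a feeling of great pleasure"))  # → 0.0
```

A model that exploits such overlap can appear to "define" a word while copying its form rather than its meaning.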
Related papers
- Definition generation for lexical semantic change detection
We use contextualized word definitions generated by large language models as semantic representations in the task of diachronic lexical semantic change detection (LSCD).
In short, generated definitions are used as 'senses', and the change score of a target word is obtained by comparing their distributions in the two time periods under comparison.
Our approach is on par with or outperforms prior non-supervised LSCD methods.
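As a rough sketch of the change-score idea, assuming the generated definitions have already been grouped into sense labels, the two periods' sense distributions can be compared with Jensen-Shannon divergence. The sense labels and the choice of metric below are hypothetical illustrations, not the paper's exact method.

```python
from collections import Counter
from math import log2

def sense_distribution(labels):
    """Normalize a list of sense labels into a probability distribution."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {s: c / total for s, c in counts.items()}

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions
    given as dicts mapping sense -> probability (0 = identical usage)."""
    senses = set(p) | set(q)
    m = {s: 0.5 * (p.get(s, 0.0) + q.get(s, 0.0)) for s in senses}
    def kl(a):
        return sum(a.get(s, 0.0) * log2(a.get(s, 0.0) / m[s])
                   for s in senses if a.get(s, 0.0) > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

# Hypothetical sense labels (clustered generated definitions) for the
# word "cell" in corpora from two different periods:
period1 = ["prison room", "prison room", "biology unit"]
period2 = ["biology unit", "phone device", "phone device", "phone device"]
change_score = js_divergence(sense_distribution(period1),
                             sense_distribution(period2))
print(round(change_score, 3))
```

A high score indicates that the word's dominant senses shifted between the two periods.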
arXiv Detail & Related papers (2024-06-20T10:13:08Z)
- Domain Embeddings for Generating Complex Descriptions of Concepts in Italian Language
We propose a Distributional Semantic resource enriched with linguistic and lexical information extracted from electronic dictionaries.
The resource comprises 21 domain-specific matrices, one comprehensive matrix, and a Graphical User Interface.
Our model facilitates the generation of reasoned semantic descriptions of concepts by selecting matrices directly associated with concrete conceptual knowledge.
arXiv Detail & Related papers (2024-02-26T15:04:35Z)
- Agentività e telicità in GilBERTo: implicazioni cognitive (Agentivity and telicity in GilBERTo: cognitive implications)
The goal of this study is to investigate whether a Transformer-based neural language model infers lexical semantics.
The semantic properties considered are telicity (also combined with definiteness) and agentivity.
arXiv Detail & Related papers (2023-07-06T10:52:22Z)
- Vec2Gloss: definition modeling leveraging contextualized vectors with Wordnet gloss
We propose a 'Vec2Gloss' model, which produces the gloss from the target word's contextualized embeddings.
The generated glosses of this study are made possible by the systematic gloss patterns provided by Chinese Wordnet.
Our results indicate that the proposed 'Vec2Gloss' model opens a new perspective on the lexical-semantic applications of contextualized embeddings.
arXiv Detail & Related papers (2023-05-29T02:37:37Z)
- Semantic Role Labeling Meets Definition Modeling: Using Natural Language to Describe Predicate-Argument Structures
We present an approach to describe predicate-argument structures using natural language definitions instead of discrete labels.
Our experiments and analyses on PropBank-style and FrameNet-style, dependency-based and span-based SRL also demonstrate that a flexible model with an interpretable output does not necessarily come at the expense of performance.
arXiv Detail & Related papers (2022-12-02T11:19:16Z)
- Distance Based Image Classification: A solution to generative classification's conundrum?
We argue that discriminative boundaries are counter-intuitive as they define semantics by what-they-are-not.
We propose a new generative model in which semantic factors are accommodated by shell theory's hierarchical generative process.
We use the model to develop a classification scheme which suppresses the impact of noise while preserving semantic cues.
arXiv Detail & Related papers (2022-10-04T03:35:13Z)
- IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations
We present our findings based on the descriptive, exploratory, and predictive data analysis conducted on the CODWOE dataset.
We give a detailed overview of the systems that we designed for Definition Modeling and Reverse Dictionary tasks.
arXiv Detail & Related papers (2022-05-13T18:15:20Z)
- Translational Concept Embedding for Generalized Compositional Zero-shot Learning
Generalized compositional zero-shot learning means to learn composed concepts of attribute-object pairs in a zero-shot fashion.
This paper introduces a new approach, termed translational concept embedding, to solve these two difficulties in a unified framework.
arXiv Detail & Related papers (2021-12-20T21:27:51Z)
- CDM: Combining Extraction and Generation for Definition Modeling
We propose to combine extraction and generation for definition modeling.
We first extract self- and correlative definitional information about target terms from the Web,
then generate the final definitions by incorporating the extracted definitional information.
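A minimal sketch of such an extract-then-generate pipeline, with a crude definitional-pattern matcher and a placeholder selection step standing in for CDM's actual web extraction and trained generator (all names and patterns below are illustrative assumptions):

```python
import re

def extract_definitional_sentences(term, documents):
    """Keep sentences that mention the term in a definitional pattern
    such as 'X is/are/refers to ...' (a crude stand-in for web extraction)."""
    pattern = re.compile(rf"\b{re.escape(term)}\b\s+(is|are|refers to)\b",
                         re.IGNORECASE)
    extracted = []
    for doc in documents:
        for sentence in doc.split("."):
            if pattern.search(sentence):
                extracted.append(sentence.strip())
    return extracted

def generate_definition(term, documents):
    """Placeholder for the generation step: here we simply pick the
    shortest extracted candidate; CDM instead conditions a trained
    generator on the extracted material."""
    extracted = extract_definitional_sentences(term, documents)
    if not extracted:
        return f"No definitional information found for '{term}'."
    return min(extracted, key=len)

docs = ["A smartphone is a mobile phone with an operating system. It sells well.",
        "Experts agree that a smartphone is a handheld computer."]
print(generate_definition("smartphone", docs))
```

The point of the two-stage design is that extraction grounds the output in attested definitional text, while generation smooths it into a well-formed gloss.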
arXiv Detail & Related papers (2021-11-14T08:03:18Z)
- VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
We tackle the task of definition modeling, where the goal is to learn to generate definitions of words and phrases.
Existing approaches for this task are discriminative, combining distributional and lexical semantics in an implicit rather than direct way.
We propose a generative model for the task, introducing a continuous latent variable to explicitly model the underlying relationship between a phrase used within a context and its definition.
arXiv Detail & Related papers (2020-10-07T02:48:44Z)
- Lexical Sememe Prediction using Dictionary Definitions by Capturing Local Semantic Correspondence
Sememes, defined as the minimum semantic units of human languages, have been proven useful in many NLP tasks.
We propose a Sememe Correspondence Pooling (SCorP) model, which is able to capture this kind of matching to predict sememes.
We evaluate our model and baseline methods on HowNet, a well-known sememe knowledge base, and find that our model achieves state-of-the-art performance.
arXiv Detail & Related papers (2020-01-16T17:30:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.