Noun2Verb: Probabilistic frame semantics for word class conversion
- URL: http://arxiv.org/abs/2205.06321v1
- Date: Thu, 12 May 2022 19:16:12 GMT
- Title: Noun2Verb: Probabilistic frame semantics for word class conversion
- Authors: Lei Yu, Yang Xu
- Abstract summary: We present a formal framework that simulates the production and comprehension of novel denominal verb usages.
We show that a model where the speaker and listener cooperatively learn the joint distribution over semantic frame elements better explains empirical denominal verb usages than state-of-the-art language models.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Humans can flexibly extend word usages across different grammatical classes,
a phenomenon known as word class conversion. Noun-to-verb conversion, or
denominal verb (e.g., to Google a cheap flight), is one of the most prevalent
forms of word class conversion. However, existing natural language processing
systems are impoverished in interpreting and generating novel denominal verb
usages. Previous work has suggested that novel denominal verb usages are
comprehensible if the listener can compute the intended meaning based on shared
knowledge with the speaker. Here we explore a computational formalism for this
proposal couched in frame semantics. We present a formal framework, Noun2Verb,
that simulates the production and comprehension of novel denominal verb usages
by modeling shared knowledge of speaker and listener in semantic frames. We
evaluate an incremental set of probabilistic models that learn to interpret and
generate novel denominal verb usages via paraphrasing. We show that a model
where the speaker and listener cooperatively learn the joint distribution over
semantic frame elements better explains the empirical denominal verb usages
than state-of-the-art language models, evaluated against data from 1)
contemporary English in both adult and child speech, 2) contemporary Mandarin
Chinese, and 3) the historical development of English. Our work grounds word
class conversion in probabilistic frame semantics and bridges the gap between
natural language processing systems and humans in lexical creativity.
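The core idea, that production and comprehension share a joint distribution over semantic frame elements, can be illustrated with a toy sketch. The following is a minimal, hypothetical illustration, not the authors' implementation: a frame is reduced to a (paraphrase verb, semantic relation) pair, the speaker scores candidate nouns given an intended frame, and the listener recovers the frame from the noun by inverting the same joint distribution. The corpus, frame encoding, and counts below are invented for illustration.

```python
from collections import defaultdict

# Toy sketch of the speaker/listener view of denominal verbs described in
# the abstract. All frames, nouns, and counts here are hypothetical; the
# paper's actual models are learned probabilistic models, not count tables.

# A "frame" is reduced to (paraphrase_verb, relation); the utterance is the
# noun used as a verb, e.g. "to Google X" ~ ("search", "INSTRUMENT").
corpus = [
    ("google", ("search", "INSTRUMENT")),
    ("google", ("search", "INSTRUMENT")),
    ("hammer", ("hit", "INSTRUMENT")),
    ("bottle", ("store", "GOAL")),
]

# Estimate the joint distribution P(noun, frame) from shared experience,
# the "shared knowledge" both interlocutors are assumed to have access to.
joint = defaultdict(float)
for noun, frame in corpus:
    joint[(noun, frame)] += 1.0
total = sum(joint.values())
for key in joint:
    joint[key] /= total

def speaker(frame):
    """P(noun | frame): producing a novel denominal verb for a meaning."""
    scores = {n: p for (n, f), p in joint.items() if f == frame}
    z = sum(scores.values())
    return {n: p / z for n, p in scores.items()}

def listener(noun):
    """P(frame | noun): comprehension by inverting the joint distribution."""
    scores = {f: p for (n, f), p in joint.items() if n == noun}
    z = sum(scores.values())
    return {f: p / z for f, p in scores.items()}

print(speaker(("search", "INSTRUMENT")))  # {'google': 1.0}
print(listener("google"))                 # {('search', 'INSTRUMENT'): 1.0}
```

In the paper, the speaker and listener are probabilistic models trained cooperatively via paraphrasing rather than count tables, but the symmetry sketched here, production and comprehension as two conditionals of one shared joint distribution, is the idea the abstract describes.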
Related papers
- Probabilistic Transformer: A Probabilistic Dependency Model for Contextual Word Representation
We propose a new model of contextual word representation, not from a neural perspective, but from a purely syntactic and probabilistic perspective.
We find that the graph of our model resembles transformers, with correspondences between dependencies and self-attention.
Experiments show that our model performs competitively to transformers on small to medium sized datasets.
arXiv Detail & Related papers (2023-11-26T06:56:02Z)
- Transparency Helps Reveal When Language Models Learn Meaning
Our systematic experiments with synthetic data reveal that, with languages where all expressions have context-independent denotations, both autoregressive and masked language models learn to emulate semantic relations between expressions.
Turning to natural language, our experiments with a specific phenomenon -- referential opacity -- add to the growing body of evidence that current language models do not represent natural language semantics well.
arXiv Detail & Related papers (2022-10-14T02:35:19Z)
- GSRFormer: Grounded Situation Recognition Transformer with Alternate Semantic Attention Refinement
Grounded Situation Recognition (GSR) aims to generate structured semantic summaries of images for "human-like" event understanding.
Inspired by object detection and image captioning tasks, existing methods typically employ a two-stage framework.
We propose a novel two-stage framework that instead exploits the bidirectional semantic relations between verbs and roles.
arXiv Detail & Related papers (2022-08-18T17:13:59Z)
- Disentangled Action Recognition with Knowledge Bases
We aim to improve the generalization ability of the compositional action recognition model to novel verbs or novel nouns.
Previous work utilizes verb-noun compositional action nodes in the knowledge graph, making it inefficient to scale.
We propose our approach: Disentangled Action Recognition with Knowledge-bases (DARK), which leverages the inherent compositionality of actions.
arXiv Detail & Related papers (2022-07-04T20:19:13Z)
- Do Trajectories Encode Verb Meaning?
Grounded language models learn to connect concrete categories like nouns and adjectives to the world via images and videos.
In this paper, we investigate the extent to which trajectories (i.e. the position and rotation of objects over time) naturally encode verb semantics.
We find that trajectories correlate as-is with some verbs (e.g., fall), and that additional abstraction via self-supervised pretraining can further capture nuanced differences in verb meaning.
arXiv Detail & Related papers (2022-06-23T19:57:16Z)
- Augmenting semantic lexicons using word embeddings and transfer learning
We propose two models for predicting sentiment scores to augment semantic lexicons at a relatively low cost using word embeddings and transfer learning.
Our evaluation shows both models are able to score new words with a similar accuracy to reviewers from Amazon Mechanical Turk, but at a fraction of the cost.
arXiv Detail & Related papers (2021-09-18T20:59:52Z)
- Verb Sense Clustering using Contextualized Word Representations for Semantic Frame Induction
Contextualized word representations have proven useful for various natural language processing tasks.
In this paper, we focus on verbs that evoke different frames depending on the context.
We investigate how well contextualized word representations can distinguish the different frames that the same verb evokes.
arXiv Detail & Related papers (2021-05-27T21:53:40Z)
- A Computational Framework for Slang Generation
We take an initial step toward machine generation of slang by developing a framework that models the speaker's word choice in slang context.
Our framework encodes novel slang meaning by relating the conventional and slang senses of a word.
We perform rigorous evaluations on three slang dictionaries and show that our approach outperforms state-of-the-art language models.
arXiv Detail & Related papers (2021-02-03T01:19:07Z)
- Lexical semantic change for Ancient Greek and Latin
Associating a word's correct meaning in its historical context is a central challenge in diachronic research.
We build on a recent computational approach to semantic change based on a dynamic Bayesian mixture model.
We provide a systematic comparison of dynamic Bayesian mixture models for semantic change with state-of-the-art embedding-based models.
arXiv Detail & Related papers (2021-01-22T12:04:08Z)
- Investigating Cross-Linguistic Adjective Ordering Tendencies with a Latent-Variable Model
We present the first purely corpus-driven model of multi-lingual adjective ordering in the form of a latent-variable model.
We provide strong converging evidence for the existence of universal, cross-linguistic, hierarchical adjective ordering tendencies.
arXiv Detail & Related papers (2020-10-09T18:27:55Z)
- Word class flexibility: A deep contextualized approach
We propose a principled methodology to explore regularity in word class flexibility.
We find that contextualized embeddings capture human judgment of class variation within words in English.
We find greater semantic variation when flexible lemmas are used in their dominant word class.
arXiv Detail & Related papers (2020-09-19T14:41:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.