Construction Grammar and Language Models
- URL: http://arxiv.org/abs/2308.13315v2
- Date: Mon, 4 Sep 2023 21:03:51 GMT
- Title: Construction Grammar and Language Models
- Authors: Harish Tayyar Madabushi and Laurence Romain and Petar Milin and Dagmar
Divjak
- Abstract summary: Recent progress in deep learning has given rise to powerful models that are primarily trained on a cloze-like task.
This chapter aims to foster collaboration between researchers in the fields of natural language processing and Construction Grammar.
- Score: 4.171555557592296
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Recent progress in deep learning and natural language processing has given
rise to powerful models that are primarily trained on a cloze-like task and
show some evidence of having access to substantial linguistic information,
including some constructional knowledge. This groundbreaking discovery presents
an exciting opportunity for a synergistic relationship between computational
methods and Construction Grammar research. In this chapter, we explore three
distinct approaches to the interplay between computational methods and
Construction Grammar: (i) computational methods for text analysis, (ii)
computational Construction Grammar, and (iii) deep learning models, with a
particular focus on language models. We touch upon the first two approaches as
a contextual foundation for the use of computational methods before providing
an accessible, yet comprehensive overview of deep learning models, which also
addresses reservations construction grammarians may have. Additionally, we
delve into experiments that explore the emergence of constructionally relevant
information within these models while also examining the aspects of
Construction Grammar that may pose challenges for these models. This chapter
aims to foster collaboration between researchers in the fields of natural
language processing and Construction Grammar. By doing so, we hope to pave the
way for new insights and advancements in both these fields.
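To make the cloze-like training objective mentioned in the abstract concrete, here is a minimal sketch using the Hugging Face transformers library; the model choice and the construction-flavoured probe sentence are illustrative assumptions, not the chapter's own experiments.

```python
# A minimal cloze-style probe of a masked language model.
# Requires: pip install transformers torch
from transformers import pipeline

# BERT's pre-training objective is essentially a cloze task:
# predict a masked token from its bidirectional context.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Probe whether the model's top predictions fit the English
# ditransitive construction (Subj Verb Obj1 Obj2).
for pred in fill_mask("She [MASK] him a slice of cake."):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```

If the highest-scoring fillers are verbs of transfer (e.g., "gave", "handed"), that is one small piece of evidence that the model has picked up constructionally relevant information from the cloze objective alone.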
Related papers
- The Computational Learning of Construction Grammars: State of the Art and Prospective Roadmap [2.287415292857564]
This paper documents and reviews the state of the art concerning computational models of construction grammar learning.
It aims to synthesise the variety of methodologies that have been proposed to date and the results that have been obtained.
arXiv Detail & Related papers (2024-07-10T12:45:02Z)
- Language Evolution with Deep Learning [49.879239655532324]
Computational modeling plays an essential role in the study of language emergence.
It aims to simulate the conditions and learning processes that could trigger the emergence of a structured language.
This chapter explores a class of computational models that has recently revolutionized the field of machine learning: deep learning models.
arXiv Detail & Related papers (2024-03-18T16:52:54Z)
- Learning Interpretable Concepts: Unifying Causal Representation Learning and Foundation Models [51.43538150982291]
We study how to learn human-interpretable concepts from data.
Weaving together ideas from causal representation learning and foundation models, we show that concepts can be provably recovered from diverse data.
arXiv Detail & Related papers (2024-02-14T15:23:59Z)
- Language Models: A Guide for the Perplexed [51.88841610098437]
This tutorial aims to help narrow the gap between those who study language models and those who are intrigued and want to learn more.
We offer a scientific viewpoint that focuses on questions amenable to study through experimentation.
We situate language models as they are today in the context of the research that led to their development.
arXiv Detail & Related papers (2023-11-29T01:19:02Z)
- Construction Grammar and Artificial Intelligence [2.864550757598007]
We argue that it is beneficial for the contemporary construction grammarian to have a thorough understanding of the strong relationship between the research fields of construction grammar and artificial intelligence.
We show that their relationship is rooted in a common attitude towards human communication and language.
We conclude that the further elaboration of this relationship will play a key role in shaping the future of the field of construction grammar.
arXiv Detail & Related papers (2023-08-31T21:15:06Z)
- Foundational Models Defining a New Era in Vision: A Survey and Outlook [151.49434496615427]
Vision systems that can see and reason about the compositional nature of visual scenes are fundamental to understanding our world.
Models learned to bridge the gap between modalities such as vision and language, coupled with large-scale training data, facilitate contextual reasoning, generalization, and prompting capabilities at test time.
The output of such models can be modified through human-provided prompts without retraining, e.g., segmenting a particular object by providing a bounding box, having an interactive dialogue by asking questions about an image or video scene, or manipulating a robot's behavior through language instructions.
arXiv Detail & Related papers (2023-07-25T17:59:18Z)
- Probing via Prompting [71.7904179689271]
This paper introduces a novel model-free approach to probing by formulating it as a prompting task.
We conduct experiments on five probing tasks and show that our approach is comparable to, or better than, diagnostic probes at extracting information.
We then examine the usefulness of a specific linguistic property for pre-training by removing the heads that are essential to that property and evaluating the resulting model's performance on language modeling.
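As a loose illustration of the probing-as-prompting idea, one can ask a language model to fill in a linguistic label directly instead of training a diagnostic classifier on its hidden states; the template, task, and model below are assumptions for the sketch, not the paper's exact setup.

```python
# Hypothetical sketch: probe for part-of-speech knowledge by prompting
# the model itself, rather than by fitting a classifier on activations.
# The template and model choice are illustrative assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

prompt = 'In the sentence "the dog chased the cat", the word "chased" is a [MASK].'
for pred in fill_mask(prompt)[:3]:
    print(f"{pred['token_str']:>10}  {pred['score']:.3f}")
```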
arXiv Detail & Related papers (2022-07-04T22:14:40Z)
- Towards Understanding Large-Scale Discourse Structures in Pre-Trained and Fine-Tuned Language Models [30.615883375573432]
We describe a novel approach to inferring discourse structures from arbitrarily long documents.
We further propose a new type of analysis to explore where and how accurately intrinsic discourse structure is captured in the BERT and BART models.
We assess how similar the generated structures are to a variety of baselines, as well as their distribution within and between models.
arXiv Detail & Related papers (2022-04-08T20:42:08Z)
- Leveraging pre-trained language models for conversational information seeking from text [2.8425118603312]
In this paper, we investigate the use of in-context learning and pre-trained language representation models to address the problem of information extraction from process description documents.
The results highlight the potential of the approach and the usefulness of the in-context learning customizations.
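In the spirit of that in-context learning setup, the sketch below shows how a few-shot extraction prompt might be assembled; the demonstrations, output format, and build_prompt helper are hypothetical, not the paper's actual templates.

```python
# Hypothetical few-shot prompt for extracting actor -> activity pairs
# from a process description. Examples and format are assumptions.
EXAMPLES = [
    ("The clerk checks the invoice and forwards it to the manager.",
     "clerk -> checks invoice; clerk -> forwards invoice"),
    ("The manager approves the request.",
     "manager -> approves request"),
]

def build_prompt(document: str) -> str:
    """Assemble a few-shot prompt: task description, demonstrations, new input."""
    parts = ["Extract actor -> activity pairs from the text."]
    for text, extraction in EXAMPLES:
        parts.append(f"Text: {text}\nPairs: {extraction}")
    parts.append(f"Text: {document}\nPairs:")
    return "\n\n".join(parts)

# The assembled prompt would be sent to a pre-trained language model.
print(build_prompt("The analyst reviews the report and files it."))
```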
arXiv Detail & Related papers (2022-03-31T09:00:46Z)
- Combining pre-trained language models and structured knowledge [9.521634184008574]
Transformer-based language models have achieved state-of-the-art performance on various NLP benchmarks.
It has proven challenging, however, to integrate structured information, such as knowledge graphs, into these models.
We examine a variety of approaches to integrating structured knowledge into current language models, identifying challenges as well as opportunities to leverage both structured and unstructured information sources.
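One simple member of this family of approaches injects knowledge at the input level by linearizing triples into text before encoding; the helper and example below are illustrative assumptions, not the survey's specific proposal.

```python
# Hypothetical sketch: serialize knowledge-graph triples into plain
# text and prepend them to the input, so any standard LM can use them.
def linearize_triples(triples):
    """Turn (subject, relation, object) triples into a text prefix."""
    return " ".join(f"{s} {r} {o}." for s, r, o in triples)

triples = [("Paris", "capital of", "France")]
context = linearize_triples(triples)
question = "Which city is the capital of France?"
model_input = f"{context} {question}"  # fed to an off-the-shelf encoder
print(model_input)
```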
arXiv Detail & Related papers (2021-01-28T21:54:03Z)
- Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning [73.0598186896953]
We present two self-supervised tasks that learn over raw text with guidance from knowledge graphs.
Our first contribution, which builds on entity-level masked language models, is an entity masking scheme.
In contrast to existing paradigms, our approach uses knowledge graphs implicitly, only during pre-training.
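As a rough illustration of entity-level masking, the sketch below masks the full span of a linked entity rather than random subword tokens; the span format and mask_entity helper are assumptions for the sketch, not the paper's exact scheme.

```python
import random

# Hypothetical sketch: mask an entire entity span. Spans are supplied
# directly here; a real pipeline would obtain them via knowledge-graph
# entity linking during pre-training.
def mask_entity(tokens, entity_spans, mask_token="[MASK]"):
    start, end = random.choice(entity_spans)  # pick one linked entity
    return tokens[:start] + [mask_token] * (end - start) + tokens[end:]

tokens = "Barack Obama was born in Honolulu".split()
entity_spans = [(0, 2), (5, 6)]  # "Barack Obama", "Honolulu"
print(mask_entity(tokens, entity_spans))
```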
arXiv Detail & Related papers (2020-04-29T14:22:42Z)