Which course? Discourse! Teaching Discourse and Generation in the Era of LLMs
- URL: http://arxiv.org/abs/2602.02878v2
- Date: Mon, 09 Feb 2026 18:21:16 GMT
- Title: Which course? Discourse! Teaching Discourse and Generation in the Era of LLMs
- Authors: Junyi Jessy Li, Yang Janet Liu, Kanishka Misra, Valentina Pyatkin, William Sheffield,
- Abstract summary: The field of NLP has undergone vast, continuous transformations over the past few years, sparking debates going beyond discipline boundaries. This paper explores this question from the angle of discourse processing, an area with rich linguistic insights and computational models for the intentional, attentional, and coherence structure of language. We present a new course, "Computational Discourse and Natural Language Generation".
- Score: 37.14307877244518
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The field of NLP has undergone vast, continuous transformations over the past few years, sparking debates going beyond discipline boundaries. This begs important questions in education: how do we design courses that bridge sub-disciplines in this shifting landscape? This paper explores this question from the angle of discourse processing, an area with rich linguistic insights and computational models for the intentional, attentional, and coherence structure of language. Discourse is highly relevant for open-ended or long-form text generation, yet this connection is under-explored in existing undergraduate curricula. We present a new course, "Computational Discourse and Natural Language Generation". The course is collaboratively designed by a team with complementary expertise and was offered for the first time in Fall 2025 as an upper-level undergraduate course, cross-listed between Linguistics and Computer Science. Our philosophy is to deeply integrate the theoretical and empirical aspects, and create an exploratory mindset inside the classroom and in the assignments. This paper describes the course in detail and concludes with takeaways from an independent survey as well as our vision for future directions.
Related papers
- Designing a Syllabus for a Course on Empirical Software Engineering [2.518416353853374]
This chapter attempts to support educators in the first and most crucial step in their course design: creating the syllabus. It offers a list of the fundamental building blocks for a syllabus, namely course aims, course topics, and practical assignments. The course topics are also linked to the subsequent chapters of this book, so that readers can dig deeper into those chapters and get support on teaching specific research methods or cross-cutting topics.
arXiv Detail & Related papers (2025-03-14T10:58:29Z)
- Position: Topological Deep Learning is the New Frontier for Relational Learning [51.05869778335334]
Topological deep learning (TDL) is a rapidly evolving field that uses topological features to understand and design deep learning models.
This paper posits that TDL is the new frontier for relational learning.
arXiv Detail & Related papers (2024-02-14T00:35:10Z)
- To Build Our Future, We Must Know Our Past: Contextualizing Paradigm Shifts in Natural Language Processing [14.15370310437262]
We study factors that shape NLP as a field, including culture, incentives, and infrastructure.
Our interviewees identify cyclical patterns in the field, as well as new shifts without historical parallel.
We conclude by discussing shared visions, concerns, and hopes for the future of NLP.
arXiv Detail & Related papers (2023-10-11T17:59:36Z)
- Experiences with Research Processes in an Undergraduate Theory of Computing Course [0.30458514384586394]
Theory of computing (ToC) courses are a staple in many undergraduate CS curricula.
We emulated a common research environment within our ToC course by creating a mock conference assignment.
arXiv Detail & Related papers (2023-10-03T11:37:06Z)
- A Framework for Curriculum Transformation in Quantum Information Science and Technology Education [0.0]
The Quantum Curriculum Transformation Framework (QCTF) consists of four steps: 1. choose a topic, 2. choose one or more targeted skills, 3. choose a learning goal and 4. choose a teaching approach that achieves this goal.
The framework is intended to structure the narrative of QIST teaching, and with future testing and refinement it will form a basis for further research in the didactics of QIST.
arXiv Detail & Related papers (2023-08-20T21:44:10Z)
- UKP-SQuARE: An Interactive Tool for Teaching Question Answering [61.93372227117229]
The exponential growth of question answering (QA) has made it an indispensable topic in any Natural Language Processing (NLP) course.
We introduce UKP-SQuARE as a platform for QA education.
Students can run, compare, and analyze various QA models from different perspectives.
arXiv Detail & Related papers (2023-05-31T11:29:04Z)
- A Survey of Knowledge Enhanced Pre-trained Language Models [78.56931125512295]
We present a comprehensive review of Knowledge Enhanced Pre-trained Language Models (KE-PLMs).
For NLU, we divide the types of knowledge into four categories: linguistic knowledge, text knowledge, knowledge graph (KG) and rule knowledge.
The KE-PLMs for NLG are categorized into KG-based and retrieval-based methods.
arXiv Detail & Related papers (2022-11-11T04:29:02Z)
- An Inclusive Notion of Text [69.36678873492373]
We argue that clarity on the notion of text is crucial for reproducible and generalizable NLP.
We introduce a two-tier taxonomy of linguistic and non-linguistic elements that are available in textual sources and can be used in NLP modeling.
arXiv Detail & Related papers (2022-11-10T14:26:43Z)
- The interdisciplinary quantum information classroom: Themes from a survey of quantum information science instructors [0.0]
Interdisciplinary introductory quantum information science (QIS) courses are proliferating at universities across the US.
We report on the findings of a survey of instructors teaching introduction to QIS courses at institutions across the US.
arXiv Detail & Related papers (2022-02-12T00:37:58Z)
- Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey [54.34370423151014]
This paper surveys the components of modeling approaches across various generation tasks such as storytelling, summarization, and translation. We present an abstraction of the key techniques with respect to learning paradigms, pretraining, modeling approaches, and decoding, along with the key outstanding challenges in each.
arXiv Detail & Related papers (2020-10-14T17:54:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.