Discovering dynamical laws for speech gestures
- URL: http://arxiv.org/abs/2504.04849v1
- Date: Mon, 07 Apr 2025 09:03:32 GMT
- Title: Discovering dynamical laws for speech gestures
- Authors: Sam Kirkham
- Abstract summary: We discover models in the form of symbolic equations that govern articulatory gestures during speech. A sparse symbolic regression algorithm is used to discover models from kinematic data on the tongue and lips.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A fundamental challenge in the cognitive sciences is discovering the dynamics that govern behaviour. Take the example of spoken language, which is characterised by a highly variable and complex set of physical movements that map onto the small set of cognitive units that comprise language. What are the fundamental dynamical principles behind the movements that structure speech production? In this study, we discover models in the form of symbolic equations that govern articulatory gestures during speech. A sparse symbolic regression algorithm is used to discover models from kinematic data on the tongue and lips. We explore these candidate models using analytical techniques and numerical simulations, and find that a second-order linear model achieves high levels of accuracy, but a nonlinear force is required to properly model articulatory dynamics in approximately one third of cases. This supports the proposal that an autonomous, nonlinear, second-order differential equation is a viable dynamical law for articulatory gestures in speech. We conclude by identifying future opportunities and obstacles in data-driven model discovery and outline prospects for discovering the dynamical principles that govern language, brain and behaviour.
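To make the modelling approach concrete, below is a minimal sketch of SINDy-style sparse symbolic regression (sequentially thresholded least squares over a polynomial library) applied to a synthetic gesture trajectory. The synthetic damped-oscillator data, the candidate library terms, and the threshold are illustrative assumptions rather than the paper's actual pipeline; in the study itself the regression is applied to kinematic recordings of the tongue and lips.

```python
# Minimal sketch of SINDy-style sparse symbolic regression for a speech gesture.
# NOT the paper's exact pipeline: the data are synthetic (a damped second-order
# approach to an articulatory target standing in for a lip/tongue trajectory),
# and the library terms and threshold are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

k, d, target = 50.0, 14.0, 1.0             # stiffness, damping, articulatory target

def gesture(t, s):
    x, v = s
    return [v, -d * v - k * (x - target)]  # x'' = -d x' - k (x - target)

t = np.linspace(0.0, 1.0, 500)
xs, vs, accs = [], [], []
for x0 in (0.0, 0.3, -0.4):                # a few trajectories to condition the regression
    sol = solve_ivp(gesture, (0.0, 1.0), [x0, 0.0], t_eval=t)
    xs.append(sol.y[0]); vs.append(sol.y[1])
    accs.append(np.gradient(sol.y[1], t))  # acceleration estimated by finite differences
x, v, a = map(np.concatenate, (xs, vs, accs))

# Candidate library of terms for the right-hand side of x'' = f(x, v)
library = np.column_stack([np.ones_like(x), x, v, x**2, x*v, v**2, x**3, v**3])
names = ["1", "x", "v", "x^2", "x v", "v^2", "x^3", "v^3"]

def stlsq(theta, dx, threshold=0.5, n_iter=10):
    """Sequentially thresholded least squares: the sparse regression at the core of SINDy."""
    xi = np.linalg.lstsq(theta, dx, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(theta[:, ~small], dx, rcond=None)[0]
    return xi

xi = stlsq(library, a)
print("x'' =", " + ".join(f"{c:.2f}*{n}" for c, n in zip(xi, names) if c != 0.0))
# On this synthetic data the recovered equation should be close to
# x'' = 50 - 50 x - 14 v, i.e. the linear second-order law used to generate it.
```

A larger library (e.g. higher-order polynomial terms) and a tuned threshold trade off accuracy against parsimony, which is the model-selection question the paper addresses when deciding whether a nonlinear force term is warranted.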
Related papers
- Solvable Dynamics of Self-Supervised Word Embeddings and the Emergence of Analogical Reasoning [3.519547280344187]
We study a class of solvable contrastive self-supervised algorithms which we term quadratic word embedding models. Our solutions reveal that these models learn linear subspaces one at a time, each one incrementing the effective rank of the embeddings until model capacity is saturated. We use our dynamical theory to predict how and when models acquire the ability to complete analogies.
arXiv Detail & Related papers (2025-02-14T02:16:48Z) - No Equations Needed: Learning System Dynamics Without Relying on Closed-Form ODEs [56.78271181959529]
This paper proposes a conceptual shift to modeling low-dimensional dynamical systems by departing from the traditional two-step modeling process. Instead of first discovering a closed-form equation and then analyzing it, our approach, direct semantic modeling, predicts the semantic representation of the dynamical system. Our approach not only simplifies the modeling pipeline but also enhances the transparency and flexibility of the resulting models.
arXiv Detail & Related papers (2025-01-30T18:36:48Z) - Scaling laws for nonlinear dynamical models of articulatory control [0.0]
We show how the addition of a nonlinear restoring force to task dynamic models introduces challenges with parameterization and interpretability. We apply the scaling laws to a cubic model and show how they facilitate interpretable simulations of articulatory dynamics (a minimal sketch of such a cubic gesture model appears after this list).
arXiv Detail & Related papers (2024-11-19T18:38:01Z) - In-Context Language Learning: Architectures and Algorithms [73.93205821154605]
We study ICL through the lens of a new family of model problems we term in-context language learning (ICLL).
We evaluate a diverse set of neural sequence models on regular ICLL tasks.
arXiv Detail & Related papers (2024-01-23T18:59:21Z) - Interpretable Imitation Learning with Dynamic Causal Relations [65.18456572421702]
We propose to expose captured knowledge in the form of a directed acyclic causal graph.
We also design this causal discovery process to be state-dependent, enabling it to model the dynamics in latent causal graphs.
The proposed framework is composed of three parts: a dynamic causal discovery module, a causality encoding module, and a prediction module, and is trained in an end-to-end manner.
arXiv Detail & Related papers (2023-09-30T20:59:42Z) - Neural Implicit Representations for Physical Parameter Inference from a Single Video [49.766574469284485]
We propose to combine neural implicit representations for appearance modeling with neural ordinary differential equations (ODEs) for modelling physical phenomena.
Our proposed model combines several unique advantages: contrary to existing approaches that require large training datasets, we are able to identify physical parameters from only a single video.
The use of neural implicit representations enables the processing of high-resolution videos and the synthesis of photo-realistic images.
arXiv Detail & Related papers (2022-04-29T11:55:35Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Uncovering Closed-form Governing Equations of Nonlinear Dynamics from Videos [8.546520029145853]
We introduce a novel end-to-end unsupervised deep learning framework to uncover the mathematical structure of equations that governs the dynamics of moving objects in videos.
Such an architecture consists of (1) an encoder-decoder network that learns low-dimensional spatial/pixel coordinates of the moving object, (2) a learnable Spatial-Physical Transformation component that creates mapping between the extracted spatial/pixel coordinates and the latent physical states of dynamics, and (3) a numerical integrator-based sparse regression module that uncovers the parsimonious closed-form governing equations of learned physical states.
arXiv Detail & Related papers (2021-06-09T02:50:11Z) - Implicit Representations of Meaning in Neural Language Models [31.71898809435222]
We identify contextual word representations that function as models of entities and situations as they evolve throughout a discourse.
Our results indicate that prediction in pretrained neural language models is supported, at least in part, by dynamic representations of meaning and implicit simulation of entity state.
arXiv Detail & Related papers (2021-06-01T19:23:20Z) - Discovery of Nonlinear Dynamical Systems using a Runge-Kutta Inspired Dictionary-based Sparse Regression Approach [9.36739413306697]
We blend machine learning and dictionary-based learning with numerical analysis tools to discover governing differential equations.
We obtain interpretable and parsimonious models which tend to generalize better beyond the sampling regime.
We discuss the extension of this approach to governing equations containing rational nonlinearities that typically appear in biological networks.
arXiv Detail & Related papers (2021-05-11T08:46:51Z) - Context-aware Dynamics Model for Generalization in Model-Based Reinforcement Learning [124.9856253431878]
We decompose the task of learning a global dynamics model into two stages: (a) learning a context latent vector that captures the local dynamics, then (b) predicting the next state conditioned on it.
In order to encode dynamics-specific information into the context latent vector, we introduce a novel loss function that encourages the context latent vector to be useful for predicting both forward and backward dynamics.
The proposed method achieves superior generalization ability across various simulated robotics and control tasks, compared to existing RL schemes.
arXiv Detail & Related papers (2020-05-14T08:10:54Z)
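As referenced in the scaling-laws entry above, the sketch below simulates a second-order gesture model with and without an added cubic restoring term. The parameter values and the specific cubic formulation are illustrative assumptions, not the scaling laws derived in that paper.

```python
# Minimal sketch comparing a linear and a cubic restoring force in a
# second-order gesture model. Parameter values and the specific cubic
# formulation are illustrative assumptions, not the paper's scaling laws.
import numpy as np
from scipy.integrate import solve_ivp

def make_rhs(k, d, c, target):
    """x'' = -d x' - k (x - target) - c (x - target)^3"""
    def rhs(t, s):
        x, v = s
        dx = x - target
        return [v, -d * v - k * dx - c * dx**3]
    return rhs

t = np.linspace(0.0, 0.5, 200)
linear = solve_ivp(make_rhs(200.0, 28.0, 0.0, 1.0), (0.0, 0.5), [0.0, 0.0], t_eval=t)
cubic = solve_ivp(make_rhs(200.0, 28.0, 400.0, 1.0), (0.0, 0.5), [0.0, 0.0], t_eval=t)

# The cubic term stiffens the restoring force far from the target, which
# changes the shape of the velocity profile relative to the linear gesture.
print("linear gesture peak velocity:", linear.y[1].max())
print("cubic gesture peak velocity:", cubic.y[1].max())
```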