Cognitively Inspired Energy-Based World Models
- URL: http://arxiv.org/abs/2406.08862v1
- Date: Thu, 13 Jun 2024 06:54:37 GMT
- Title: Cognitively Inspired Energy-Based World Models
- Authors: Alexi Gladstone, Ganesh Nanduru, Md Mofijul Islam, Aman Chadha, Jundong Li, Tariq Iqbal
- Abstract summary: We introduce Energy-Based World Models (EBWM).
EBWM involves training an Energy-Based Model (EBM) to predict the compatibility of a given context and a predicted future state.
We develop a variant of the traditional autoregressive transformer, termed the Energy-Based Transformer (EBT).
- Score: 40.08174759225766
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One of the predominant methods for training world models is autoregressive prediction in the output space of the next element of a sequence. In Natural Language Processing (NLP), this takes the form of Large Language Models (LLMs) predicting the next token; in Computer Vision (CV), it takes the form of autoregressive models predicting the next frame/token/pixel. However, this approach differs from human cognition in several respects. First, human predictions about the future actively influence internal cognitive processes. Second, humans naturally evaluate the plausibility of predictions regarding future states. Third, building on that evaluative capability, humans allocate a dynamic amount of time to a prediction by assessing when it is sufficient. This adaptive process is analogous to System 2 thinking in psychology. All of these capabilities are fundamental to human success at high-level reasoning and planning. Therefore, to address the limitations of traditional autoregressive models, which lack these human-like capabilities, we introduce Energy-Based World Models (EBWM). EBWM involves training an Energy-Based Model (EBM) to predict the compatibility of a given context and a predicted future state. In doing so, EBWM enables models to achieve all three facets of human cognition described above. Moreover, we developed a variant of the traditional autoregressive transformer tailored for energy-based models, termed the Energy-Based Transformer (EBT). Our results demonstrate that EBWM scales better with data and GPU hours than traditional autoregressive transformers in CV, and that EBWM offers promising early scaling in NLP. Consequently, this approach offers an exciting path toward training future models capable of System 2 thinking and of intelligently searching across state spaces.
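The abstract's core mechanism can be sketched in a few lines: an energy function E(context, future) scores compatibility, training pulls the energy of observed pairs down, and prediction is iterative minimization of E over a candidate future state, spending a dynamic number of steps until the prediction is judged sufficient. The toy linear "world", the quadratic energy, the simplified training rule (contrastive negatives omitted), and all names below are illustrative assumptions for this sketch, not the paper's implementation.

```python
# Toy energy-based world model sketch (assumptions: linear world dynamics,
# quadratic energy, plain SGD; NOT the paper's EBWM/EBT code).
import math
import random

random.seed(0)
D = 3  # state dimensionality
# Hidden linear dynamics that generate the "true" next state from a context.
W_true = [[random.gauss(0, 0.5) for _ in range(D)] for _ in range(D)]

def matvec(W, v):
    return [sum(W[i][j] * v[j] for j in range(D)) for i in range(D)]

class EnergyWorldModel:
    """E(c, x) = 0.5 * ||W c - x||^2: low energy means the predicted
    future x is compatible with the context c."""
    def __init__(self):
        self.W = [[random.gauss(0, 0.1) for _ in range(D)] for _ in range(D)]

    def energy(self, c, x):
        r = [p - q for p, q in zip(matvec(self.W, c), x)]
        return 0.5 * sum(v * v for v in r)

    def train_step(self, c, x_pos, lr=0.05):
        # Lower the energy of an observed (context, future) pair by gradient
        # descent on W (contrastive negative samples omitted for brevity).
        r = [p - q for p, q in zip(matvec(self.W, c), x_pos)]
        for i in range(D):
            for j in range(D):
                self.W[i][j] -= lr * r[i] * c[j]

    def predict(self, c, max_steps=100, step=0.5, tol=1e-6):
        # System-2-style inference: refine a candidate future by descending
        # the energy, stopping early once the prediction is "sufficient",
        # so harder predictions get more compute than easy ones.
        x = [0.0] * D
        for n in range(1, max_steps + 1):
            target = matvec(self.W, c)
            x = [xi - step * (xi - ti) for xi, ti in zip(x, target)]  # x -= step * dE/dx
            if self.energy(c, x) < tol:
                break
        return x, n  # prediction and compute spent

model = EnergyWorldModel()
for _ in range(3000):
    c = [random.gauss(0, 1) for _ in range(D)]
    model.train_step(c, matvec(W_true, c))

c = [random.gauss(0, 1) for _ in range(D)]
x_hat, n_steps = model.predict(c)
err = math.sqrt(sum((a - b) ** 2 for a, b in zip(x_hat, matvec(W_true, c))))
```

Note how this setup exhibits the three facets the abstract lists: the prediction feeds back into the inference loop, the energy itself is a plausibility judgment for any candidate future, and the stopping criterion allocates a variable amount of compute per prediction.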
Related papers
- Humans and language models diverge when predicting repeating text [52.03471802608112]
We present a scenario in which the performance of humans and LMs diverges.
Human and GPT-2 LM predictions are strongly aligned in the first presentation of a text span, but their performance quickly diverges when memory begins to play a role.
We hope that this scenario will spur future work in bringing LMs closer to human behavior.
arXiv Detail & Related papers (2023-10-10T08:24:28Z)
- Revisiting Energy Based Models as Policies: Ranking Noise Contrastive Estimation and Interpolating Energy Models [18.949193683555237]
In this work, we revisit the choice of energy-based models (EBM) as a policy class.
We develop a training objective and algorithm for energy models which combines several key ingredients.
We show that the Implicit Behavior Cloning (IBC) objective is actually biased even at the population level.
arXiv Detail & Related papers (2023-09-11T20:13:47Z)
- Human Trajectory Forecasting with Explainable Behavioral Uncertainty [63.62824628085961]
Human trajectory forecasting helps to understand and predict human behaviors, enabling applications from social robots to self-driving cars.
Model-free methods offer superior prediction accuracy but lack explainability, while model-based methods provide explainability but predict less accurately.
We show that BNSP-SFM achieves up to a 50% improvement in prediction accuracy, compared with 11 state-of-the-art methods.
arXiv Detail & Related papers (2023-07-04T16:45:21Z)
- Neural Foundations of Mental Simulation: Future Prediction of Latent Representations on Dynamic Scenes [3.2744507958793143]
We combine a goal-driven modeling approach with dense neurophysiological data and human behavioral readouts to impinge on this question.
Specifically, we construct and evaluate several classes of sensory-cognitive networks to predict the future state of rich, ethologically-relevant environments.
We find strong differentiation across these model classes in their ability to predict neural and behavioral data both within and across diverse environments.
arXiv Detail & Related papers (2023-05-19T15:56:06Z)
- Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z)
- Skillful Twelve Hour Precipitation Forecasts using Large Context Neural Networks [8.086653045816151]
Current operational forecasting models are based on physics and use supercomputers to simulate the atmosphere.
An emerging class of weather models based on neural networks represents a paradigm shift in weather forecasting.
We present a neural network that is capable of large-scale precipitation forecasting up to twelve hours ahead.
arXiv Detail & Related papers (2021-11-14T22:53:04Z)
- Transformers for prompt-level EMA non-response prediction [62.41658786277712]
Ecological Momentary Assessments (EMAs) are an important psychological data source for measuring cognitive states, affect, behavior, and environmental factors.
Non-response, in which participants fail to respond to EMA prompts, is an endemic problem.
The ability to accurately predict non-response could be utilized to improve EMA delivery and develop compliance interventions.
arXiv Detail & Related papers (2021-11-01T18:38:47Z)
- Probabilistic Human Motion Prediction via A Bayesian Neural Network [71.16277790708529]
We propose a probabilistic model for human motion prediction in this paper.
Our model could generate several future motions when given an observed motion sequence.
We extensively validate our approach on the large-scale benchmark dataset Human3.6M.
arXiv Detail & Related papers (2021-07-14T09:05:33Z)
- Multimodal Deep Generative Models for Trajectory Prediction: A Conditional Variational Autoencoder Approach [34.70843462687529]
We provide a self-contained tutorial on a conditional variational autoencoder approach to human behavior prediction.
The goals of this tutorial paper are to review and build a taxonomy of state-of-the-art methods in human behavior prediction.
arXiv Detail & Related papers (2020-08-10T03:18:27Z)
- Predicting human decisions with behavioral theories and machine learning [13.000185375686325]
We introduce BEAST Gradient Boosting (BEAST-GB), a novel hybrid model that synergizes behavioral theories with machine learning techniques.
We show that BEAST-GB achieves state-of-the-art performance on the largest publicly available dataset of human risky choice.
We also show BEAST-GB displays robust domain generalization capabilities as it effectively predicts choice behavior in new experimental contexts.
arXiv Detail & Related papers (2019-04-15T06:12:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.