Efficient Calibration of Multi-Agent Market Simulators from Time Series
with Bayesian Optimization
- URL: http://arxiv.org/abs/2112.03874v1
- Date: Fri, 3 Dec 2021 22:57:46 GMT
- Title: Efficient Calibration of Multi-Agent Market Simulators from Time Series
with Bayesian Optimization
- Authors: Yuanlu Bai, Henry Lam, Svitlana Vyetrenko, Tucker Balch
- Abstract summary: Multi-agent market simulation is commonly used to create an environment for downstream machine learning or reinforcement learning tasks.
We propose a simple and efficient framework for calibrating multi-agent market simulator parameters from historical time series observations.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-agent market simulation is commonly used to create an environment for
downstream machine learning or reinforcement learning tasks, such as training
or testing trading strategies before deploying them to real-time trading. In
electronic trading markets, typically only the price and volume time series that
result from the interaction of multiple market participants are directly
observable. Therefore, multi-agent market environments need to be calibrated so
that the time series produced by the interaction of simulated agents resemble
the historical ones -- which amounts to solving a highly complex, large-scale
optimization problem. In this paper, we propose a simple and efficient
framework for calibrating multi-agent market simulator parameters from
historical time series observations. First, we consider a novel concept of
eligibility set to bypass the potential non-identifiability issue. Second, we
generalize the two-sample Kolmogorov-Smirnov (K-S) test with Bonferroni
correction to test the similarity between two high-dimensional time series
distributions, which gives a simple yet effective distance metric between the
time series sample sets. Third, we suggest using Bayesian optimization (BO) and
trust-region BO (TuRBO) to minimize the aforementioned distance metric.
Finally, we demonstrate the efficiency of our framework using numerical
experiments.
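The distance metric from the second step can be sketched as follows. This is an illustrative reading of the abstract, not the authors' code: it runs a marginal two-sample K-S test per coordinate of the sample sets, applies a Bonferroni correction across coordinates, and collapses the smallest adjusted p-value into a scalar distance. All function names and the exact aggregation into a scalar are assumptions.

```python
import math

import numpy as np


def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap
    between the two empirical CDFs."""
    xs, ys = np.sort(x), np.sort(y)
    grid = np.concatenate([xs, ys])
    cdf_x = np.searchsorted(xs, grid, side="right") / len(xs)
    cdf_y = np.searchsorted(ys, grid, side="right") / len(ys)
    return float(np.max(np.abs(cdf_x - cdf_y)))


def ks_pvalue(d, n, m):
    """Asymptotic two-sided p-value for the two-sample K-S statistic
    (Smirnov's series with a standard small-sample correction)."""
    en = math.sqrt(n * m / (n + m))
    lam = (en + 0.12 + 0.11 / en) * d
    if lam < 1e-3:  # the truncated series degenerates as lam -> 0
        return 1.0
    p = 2.0 * sum((-1) ** (k - 1) * math.exp(-2.0 * k * k * lam * lam)
                  for k in range(1, 101))
    return min(max(p, 0.0), 1.0)


def bonferroni_ks_distance(real, sim):
    """Distance between two sample sets of shape (n_samples, n_dims):
    run a marginal K-S test per dimension, Bonferroni-adjust the
    p-values, and return 1 minus the smallest adjusted p-value, so
    0 means the marginals look alike and values near 1 mean at least
    one dimension differs significantly."""
    real, sim = np.atleast_2d(real), np.atleast_2d(sim)
    n_dims = real.shape[1]
    adjusted = []
    for j in range(n_dims):
        d = ks_statistic(real[:, j], sim[:, j])
        p = ks_pvalue(d, len(real), len(sim))
        adjusted.append(min(1.0, n_dims * p))  # Bonferroni correction
    return 1.0 - min(adjusted)
```

In the paper's setting, `real` would hold vectors derived from historical time series and `sim` the corresponding vectors from simulator output at a candidate parameter setting; the returned scalar is the kind of objective that BO or TuRBO would then minimize over simulator parameters.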
Related papers
- PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
We propose a parameter-efficient Federated Anomaly Detection framework named PeFAD, motivated by increasing privacy concerns.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z) - Deciphering Movement: Unified Trajectory Generation Model for Multi-Agent [53.637837706712794]
We propose a Unified Trajectory Generation model, UniTraj, that processes arbitrary trajectories as masked inputs.
Specifically, we introduce a Ghost Spatial Masking (GSM) module embedded within a Transformer encoder for spatial feature extraction.
We benchmark three practical sports game datasets, Basketball-U, Football-U, and Soccer-U, for evaluation.
arXiv Detail & Related papers (2024-05-27T22:15:23Z) - UniMatch: A Unified User-Item Matching Framework for the Multi-purpose
Merchant Marketing [27.459774494479227]
We present a unified user-item matching framework to simultaneously conduct item recommendation and user targeting with just one model.
Our framework results in significant performance gains in comparison with the state-of-the-art methods, with greatly reduced cost on computing resources and daily maintenance.
arXiv Detail & Related papers (2023-07-19T13:49:35Z) - Distributionally Robust Models with Parametric Likelihood Ratios [123.05074253513935]
Three simple ideas allow us to train models with distributionally robust optimization (DRO) using a broader class of parametric likelihood ratios.
We find that models trained with the resulting parametric adversaries are consistently more robust to subpopulation shifts when compared to other DRO approaches.
arXiv Detail & Related papers (2022-04-13T12:43:12Z) - Time Series Anomaly Detection by Cumulative Radon Features [32.36217153362305]
In this work, we argue that shallow features suffice when combined with distribution distance measures.
Our approach models each time series as a high dimensional empirical distribution of features, where each time-point constitutes a single sample.
We show that by parameterizing each time series using cumulative Radon features, we are able to efficiently and effectively model the distribution of normal time series.
arXiv Detail & Related papers (2022-02-08T18:58:53Z) - Multi-Asset Spot and Option Market Simulation [52.77024349608834]
We construct realistic spot and equity option market simulators for a single underlying on the basis of normalizing flows.
We leverage the conditional invertibility property of normalizing flows and introduce a scalable method to calibrate the joint distribution of a set of independent simulators.
arXiv Detail & Related papers (2021-12-13T17:34:28Z) - Cadence: A Practical Time-series Partitioning Algorithm for Unlabeled
IoT Sensor Streams [1.2330326247154968]
We show that our algorithm can robustly detect time-series events across different applications.
We demonstrate its applicability in a real-world IoT deployment for ambient-sensing based activity recognition.
arXiv Detail & Related papers (2021-12-06T21:13:18Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Learning who is in the market from time series: market participant
discovery through adversarial calibration of multi-agent simulators [0.0]
In electronic trading markets only the price or volume time series are directly observable.
We propose a novel two-step method to train a discriminator that is able to distinguish between "real" and "fake" price and volume time series.
arXiv Detail & Related papers (2021-08-02T06:53:37Z) - The Right Tool for the Job: Matching Model and Instance Complexities [62.95183777679024]
As NLP models become larger, executing a trained model requires significant computational resources incurring monetary and environmental costs.
We propose a modification to contextual representation fine-tuning which, during inference, allows for an early (and fast) "exit".
We test our proposed modification on five different datasets in two tasks: three text classification datasets and two natural language inference benchmarks.
arXiv Detail & Related papers (2020-04-16T04:28:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.