Meta Temporal Point Processes
- URL: http://arxiv.org/abs/2301.12023v1
- Date: Fri, 27 Jan 2023 23:21:07 GMT
- Title: Meta Temporal Point Processes
- Authors: Wonho Bae, Mohamed Osama Ahmed, Frederick Tung, Gabriel L. Oliveira
- Abstract summary: A temporal point process (TPP) is a process whose realization is a sequence of discrete events in time.
Recent work in TPPs models the process using a neural network in a supervised learning framework.
We propose to train TPPs in a meta-learning framework, where each sequence is treated as a different task.
- Score: 13.525125302111844
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A temporal point process (TPP) is a stochastic process whose
realization is a sequence of discrete events in time. Recent work in TPPs models the process
using a neural network in a supervised learning framework, where a training set
is a collection of all the sequences. In this work, we propose to train TPPs in
a meta learning framework, where each sequence is treated as a different task,
via a novel framing of TPPs as neural processes (NPs). We introduce context
sets to model TPPs as an instantiation of NPs. Motivated by attentive NP, we
also introduce local history matching to help learn more informative features.
We demonstrate the potential of the proposed method on popular public benchmark
datasets and tasks, and compare with state-of-the-art TPP methods.
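The abstract's framing can be made concrete with a minimal sketch. A TPP is fit by maximizing the log-likelihood of an event sequence, which is the sum of log-intensities at the observed event times minus the integrated intensity over the observation window; the NP-style context set can be mimicked by partitioning a sequence into context and target events. The sketch below assumes a homogeneous (constant-intensity) process for simplicity — the paper's models are neural and history-dependent — and all function names are illustrative, not from the paper.

```python
import math

def tpp_log_likelihood(event_times, lam, horizon):
    """Log-likelihood of a constant-intensity TPP on [0, horizon]:
        log L = sum_i log(lam) - lam * horizon
    (the general form replaces lam with a history-dependent
    intensity and the last term with its integral)."""
    n = len(event_times)
    return n * math.log(lam) - lam * horizon

def split_context_target(event_times, n_context):
    """Treat a prefix of the sequence as the NP-style context set
    and the remainder as the target set (illustrative only)."""
    return event_times[:n_context], event_times[n_context:]

events = [0.5, 1.2, 2.7, 3.1]
ctx, tgt = split_context_target(events, 2)      # ([0.5, 1.2], [2.7, 3.1])
ll = tpp_log_likelihood(events, lam=1.0, horizon=4.0)  # -4.0
```

Treating each sequence as a task, as the paper proposes, amounts to conditioning the intensity on a per-sequence context set rather than pooling all sequences into one supervised training set.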
Related papers
- TPP-Gaze: Modelling Gaze Dynamics in Space and Time with Neural Temporal Point Processes [63.95928298690001]
We present TPP-Gaze, a novel and principled approach to modelling scanpath dynamics based on Neural Temporal Point Processes (TPPs).
Our results show the overall superior performance of the proposed model compared to state-of-the-art approaches.
arXiv Detail & Related papers (2024-10-30T19:22:38Z)
- TPP-LLM: Modeling Temporal Point Processes by Efficiently Fine-Tuning Large Language Models [0.0]
Temporal point processes (TPPs) are widely used to model the timing and occurrence of events in domains such as social networks, transportation systems, and e-commerce.
We introduce TPP-LLM, a novel framework that integrates large language models (LLMs) with TPPs to capture both the semantic and temporal aspects of event sequences.
arXiv Detail & Related papers (2024-10-02T22:17:24Z)
- In-Context In-Context Learning with Transformer Neural Processes [50.57807892496024]
We develop the in-context in-context learning pseudo-token TNP (ICICL-TNP).
The ICICL-TNP is capable of conditioning on both sets of datapoints and sets of datasets, enabling it to perform in-context in-context learning.
We demonstrate the importance of in-context in-context learning and the effectiveness of the ICICL-TNP in a number of experiments.
arXiv Detail & Related papers (2024-06-19T12:26:36Z)
- On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes [10.4442505961159]
A temporal point process (TPP) is an important tool for modeling and predicting irregularly timed events across various domains.
Recent recurrent neural network (RNN)-based TPPs have shown practical advantages over traditional parametric TPP models.
In this paper, we establish the excess risk bounds of RNN-TPPs under many well-known TPP settings.
arXiv Detail & Related papers (2024-06-02T06:19:25Z)
- Prompt-augmented Temporal Point Process for Streaming Event Sequence [18.873915278172095]
We present PromptTPP, a novel framework for continuous monitoring of a Neural Temporal Point Process (TPP) model.
PromptTPP consistently achieves state-of-the-art performance across three real user behavior datasets.
arXiv Detail & Related papers (2023-10-08T03:41:16Z)
- Learning a Better Initialization for Soft Prompts via Meta-Learning [58.53984967461313]
We propose MetaPT (Meta-learned Prompt Tuning) to improve prompt tuning.
We introduce the structure by first clustering pre-training data into different auxiliary tasks.
We use these tasks to pre-train prompts with a meta-learning algorithm.
arXiv Detail & Related papers (2022-05-25T03:50:23Z)
- Semi-supervised Learning for Marked Temporal Point Processes [7.666240799116112]
This research proposes a novel algorithm for Semi-supervised Learning for Marked Temporal Point Processes (SSL-MTPP).
The proposed algorithm utilizes a combination of labeled and unlabeled data for learning a robust marker prediction model.
The efficacy of the proposed algorithm has been demonstrated via multiple protocols on the Retweet dataset.
arXiv Detail & Related papers (2021-07-16T06:59:38Z)
- Neural Temporal Point Processes: A Review [25.969319777457606]
Temporal point processes (TPPs) are probabilistic generative models for continuous-time event sequences.
Neural TPPs combine fundamental ideas from the point process literature with deep learning approaches.
arXiv Detail & Related papers (2021-04-08T06:10:50Z)
- Process Discovery for Structured Program Synthesis [70.29027202357385]
A core task in process mining is process discovery, which aims to learn an accurate process model from event log data.
In this paper, we propose to use (block-) structured programs directly as target process models.
We develop a novel bottom-up agglomerative approach to the discovery of such structured program process models.
arXiv Detail & Related papers (2020-08-13T10:33:10Z)
- Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
NPs still rely on an assumption that uncertainty in processes is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z)
- Determinantal Point Processes in Randomized Numerical Linear Algebra [80.27102478796613]
Randomized Numerical Linear Algebra (RandNLA) uses randomness to develop improved algorithms for matrix problems that arise in scientific computing, data science, machine learning, etc.
Recent work has uncovered deep and fruitful connections between determinantal point processes (DPPs) and RandNLA, which lead to new guarantees and improved algorithms.
arXiv Detail & Related papers (2020-05-07T00:39:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.