Learn over Past, Evolve for Future: Forecasting Temporal Trends for Fake
News Detection
- URL: http://arxiv.org/abs/2306.14728v1
- Date: Mon, 26 Jun 2023 14:29:05 GMT
- Title: Learn over Past, Evolve for Future: Forecasting Temporal Trends for Fake
News Detection
- Authors: Beizhe Hu, Qiang Sheng, Juan Cao, Yongchun Zhu, Danding Wang, Zhengjia
Wang, Zhiwei Jin
- Abstract summary: We design FTT (Forecasting Temporal Trends), an effective framework that forecasts the temporal distribution patterns of news data.
Experiments on a real-world, temporally split dataset demonstrate the superiority of the proposed framework.
- Score: 14.271593690271136
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Fake news detection has been a critical task for maintaining the health of
the online news ecosystem. However, very few existing works consider the
temporal shift issue caused by the rapidly-evolving nature of news data in
practice, resulting in significant performance degradation when training on
past data and testing on future data. In this paper, we observe that the
appearances of news events on the same topic may display discernible patterns
over time, and posit that such patterns can assist in selecting training
instances that could make the model adapt better to future data. Specifically,
we design an effective framework, FTT (Forecasting Temporal Trends), which
forecasts the temporal distribution patterns of news data and then guides the
detector to adapt quickly to the future distribution. Experiments on a real-world,
temporally split dataset demonstrate the superiority of our proposed framework.
The code is available at https://github.com/ICTMCG/FTT-ACL23.
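To make the temporal-shift setup above concrete, here is a minimal sketch (not the authors' released FTT code; see the repository linked above) that builds a past/future split over a toy corpus and reweights training instances by a naive one-step forecast of each topic's future prevalence. The topic labels, the linear-trend forecast, and the use of per-sample weights are all assumed simplifications of the general idea.

```python
import numpy as np

# Toy corpus: each item has a month index, a (hypothetical) topic id, and a label.
rng = np.random.default_rng(0)
months = rng.integers(0, 12, size=1000)   # months 0-11
topics = rng.integers(0, 5, size=1000)    # 5 hypothetical topics
labels = rng.integers(0, 2, size=1000)    # 1 = fake, 0 = real

# Temporal split: train on the past (months 0-8), test on the future (months 9-11).
train_mask = months <= 8

def forecast_topic_prevalence(train_months, train_topics, n_topics=5):
    """Naively extrapolate each topic's share one month ahead with a linear trend.
    (A stand-in for a real temporal-trend forecaster, used only for illustration.)"""
    horizon = train_months.max() + 1
    shares = np.zeros((horizon, n_topics))
    for m in range(horizon):
        in_month = train_topics[train_months == m]
        for t in range(n_topics):
            shares[m, t] = np.mean(in_month == t) if len(in_month) else 0.0
    xs = np.arange(horizon)
    future = np.array([np.polyval(np.polyfit(xs, shares[:, t], 1), horizon)
                       for t in range(n_topics)])
    future = np.clip(future, 1e-6, None)
    return future / future.sum()

# Up-weight training instances whose topic is forecast to grow, so the detector
# is nudged toward the distribution expected at test time.
future_share = forecast_topic_prevalence(months[train_mask], topics[train_mask])
current_share = np.bincount(topics[train_mask], minlength=5) / train_mask.sum()
sample_weight = (future_share[topics[train_mask]]
                 / np.clip(current_share, 1e-6, None)[topics[train_mask]])

# `sample_weight` could then be passed to any detector that accepts per-sample
# weights, e.g. sklearn's LogisticRegression(...).fit(X, y, sample_weight=sample_weight).
```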
Related papers
- A Survey of Deep Graph Learning under Distribution Shifts: from Graph Out-of-Distribution Generalization to Adaptation [59.14165404728197]
We provide an up-to-date and forward-looking review of deep graph learning under distribution shifts.
Specifically, we cover three primary scenarios: graph OOD generalization, training-time graph OOD adaptation, and test-time graph OOD adaptation.
To provide a better understanding of the literature, we systematically categorize the existing models based on our proposed taxonomy.
arXiv Detail & Related papers (2024-10-25T02:39:56Z)
- From News to Forecast: Integrating Event Analysis in LLM-Based Time Series Forecasting with Reflection [16.47323362700347]
We introduce a novel approach to enhance time series forecasting by reasoning across both text and time series data.
With language as a medium, our method adaptively integrates social events into forecasting models, aligning news content with time series fluctuations to provide richer insights.
Specifically, we utilize LLM-based agents to iteratively filter out irrelevant news and employ human-like reasoning to evaluate predictions.
arXiv Detail & Related papers (2024-09-26T03:50:22Z)
- Probing the Robustness of Time-series Forecasting Models with CounterfacTS [1.823020744088554]
We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
arXiv Detail & Related papers (2024-03-06T07:34:47Z)
- HIP Network: Historical Information Passing Network for Extrapolation Reasoning on Temporal Knowledge Graph [14.832067253514213]
We propose the Historical Information Passing (HIP) network to predict future events.
Our method considers the updating of relation representations and adopts three scoring functions, one for each dimension of historical information it models.
Experimental results on five benchmark datasets show the superiority of the HIP network.
arXiv Detail & Related papers (2024-02-19T11:50:30Z)
- VIBE: Topic-Driven Temporal Adaptation for Twitter Classification [9.476760540618903]
We study temporal adaptation, where models trained on past data are tested in the future.
Our model, with only 3% of data, significantly outperforms previous state-of-the-art continued-pretraining methods.
arXiv Detail & Related papers (2023-10-16T08:53:57Z)
- Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z)
- Diverse Data Augmentation with Diffusions for Effective Test-time Prompt Tuning [73.75282761503581]
We propose DiffTPT, which leverages pre-trained diffusion models to generate diverse and informative new data.
Our experiments on test datasets with distribution shifts and unseen categories demonstrate that DiffTPT improves the zero-shot accuracy by an average of 5.13%.
arXiv Detail & Related papers (2023-08-11T09:36:31Z)
- Correlated Time Series Self-Supervised Representation Learning via Spatiotemporal Bootstrapping [13.988624652592259]
Time series analysis plays an important role in many real-world industries.
In this paper, we propose a time-step-level representation learning framework for individual instances.
A linear regression model trained on top of the learned representations shows that our method performs best in most cases.
arXiv Detail & Related papers (2023-06-12T09:42:16Z)
- Time-Varying Propensity Score to Bridge the Gap between the Past and Present [104.46387765330142]
We introduce a time-varying propensity score that can detect gradual shifts in the distribution of data.
We demonstrate different ways of implementing it and evaluate it on a variety of problems; a rough sketch of the reweighting idea follows this entry.
arXiv Detail & Related papers (2022-10-04T07:21:49Z)
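As a rough, generic illustration of the propensity-score idea above (not the paper's estimator), the sketch below trains a classifier to distinguish past from recent examples and turns its probabilities into density-ratio-style importance weights for the past data; the synthetic features, the 80% time cutoff, and the logistic-regression choice are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical data whose feature distribution drifts gradually with time.
t = np.sort(rng.uniform(0.0, 1.0, size=2000))              # normalized timestamps
X = rng.normal(loc=2.0 * t[:, None], scale=1.0, size=(2000, 4))

# Mark the last 20% of the timeline as "recent"; everything earlier is "past".
recent = (t >= 0.8).astype(int)

# Propensity model: probability that an example comes from the recent period.
prop_model = LogisticRegression(max_iter=1000).fit(X, recent)
p_recent = prop_model.predict_proba(X)[:, 1]

# Importance weights for past examples: up-weight those that look "present-like",
# down-weight stale ones (a density-ratio style correction).
past_mask = recent == 0
weights = p_recent[past_mask] / np.clip(1.0 - p_recent[past_mask], 1e-6, None)
weights /= weights.mean()

# `weights` could then be supplied as per-sample weights when fitting a task
# model on the past examples only.
```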
- Networked Time Series Prediction with Incomplete Data [59.45358694862176]
We propose NETS-ImpGAN, a novel deep learning framework that can be trained on incomplete data with missing values in both history and future.
We conduct extensive experiments on three real-world datasets under different missing patterns and missing rates.
arXiv Detail & Related papers (2021-10-05T18:20:42Z)
- Hidden Biases in Unreliable News Detection Datasets [60.71991809782698]
We show that selection bias during data collection leads to undesired artifacts in the datasets.
We observed a significant drop (>10%) in accuracy for all models tested in a clean split with no train/test source overlap.
We suggest that future dataset creation include a simple model as a difficulty/bias probe, and that future model development use a clean, non-overlapping site and date split (a minimal sketch of such a split follows this entry).
arXiv Detail & Related papers (2021-04-20T17:16:41Z)
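A minimal sketch of the recommended clean, non-overlapping site and date split, assuming a pandas DataFrame with hypothetical `source_site`, `published_date`, and `label` columns:

```python
import pandas as pd

def clean_site_date_split(df: pd.DataFrame, cutoff: str, test_site_frac: float = 0.3):
    """Split a news dataset so test articles come from held-out sites AND from
    dates after the cutoff, preventing source/date leakage into training."""
    sites = df["source_site"].drop_duplicates().sample(frac=1.0, random_state=0)
    n_test_sites = max(1, int(len(sites) * test_site_frac))
    test_sites = set(sites.iloc[:n_test_sites])

    is_future = pd.to_datetime(df["published_date"]) >= pd.Timestamp(cutoff)
    is_test_site = df["source_site"].isin(test_sites)

    test = df[is_future & is_test_site]
    train = df[~is_future & ~is_test_site]  # drop the mixed quadrants entirely
    return train, test

# Tiny hypothetical example:
df = pd.DataFrame({
    "source_site": ["a.com", "a.com", "b.com", "c.com", "c.com"],
    "published_date": ["2020-01-01", "2021-06-01", "2021-07-01", "2020-03-01", "2021-08-01"],
    "label": [0, 1, 0, 1, 1],
})
train_df, test_df = clean_site_date_split(df, cutoff="2021-01-01")
```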
This list is automatically generated from the titles and abstracts of the papers listed on this site.