TX-Gen: Multi-Objective Optimization for Sparse Counterfactual Explanations for Time-Series Classification
- URL: http://arxiv.org/abs/2409.09461v2
- Date: Mon, 11 Nov 2024 09:37:09 GMT
- Title: TX-Gen: Multi-Objective Optimization for Sparse Counterfactual Explanations for Time-Series Classification
- Authors: Qi Huang, Sofoklis Kitharidis, Thomas Bäck, Niki van Stein
- Abstract summary: We introduce TX-Gen, a novel algorithm for generating counterfactual explanations based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II).
By incorporating a flexible reference-guided mechanism, our method improves the plausibility and interpretability of the counterfactuals without relying on predefined assumptions.
- Score: 0.42105583610914427
- Abstract: In time-series classification, understanding model decisions is crucial for their application in high-stakes domains such as healthcare and finance. Counterfactual explanations, which provide insights by presenting alternative inputs that change model predictions, offer a promising solution. However, existing methods for generating counterfactual explanations for time-series data often struggle with balancing key objectives like proximity, sparsity, and validity. In this paper, we introduce TX-Gen, a novel algorithm for generating counterfactual explanations based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). TX-Gen leverages evolutionary multi-objective optimization to find a diverse set of counterfactuals that are both sparse and valid, while maintaining minimal dissimilarity to the original time series. By incorporating a flexible reference-guided mechanism, our method improves the plausibility and interpretability of the counterfactuals without relying on predefined assumptions. Extensive experiments on benchmark datasets demonstrate that TX-Gen outperforms existing methods in generating high-quality counterfactuals, making time-series models more transparent and interpretable.
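The abstract describes an evolutionary multi-objective search over candidate edits, trading off validity (the prediction flips), sparsity (few time steps changed), and proximity (small distance to the original series). The following is a minimal NSGA-II-style sketch of that idea, not the authors' TX-Gen implementation: candidates are boolean masks that copy values from a reference series, a toy threshold classifier stands in for the model, and selection uses a simple dominance count rather than full non-dominated sorting with crowding distance.

```python
import numpy as np

rng = np.random.default_rng(0)

def classify(x):
    # Toy stand-in classifier: label 1 if the series mean is positive.
    return int(x.mean() > 0.0)

def objectives(mask, x, ref):
    # A candidate counterfactual copies `ref` wherever the mask is True.
    cf = np.where(mask, ref, x)
    validity = 0.0 if classify(cf) != classify(x) else 1.0  # 0 = label flipped
    sparsity = float(mask.sum())                            # edited time steps
    proximity = float(np.linalg.norm(cf - x))               # distance to original
    return (validity, sparsity, proximity)

def dominates(a, b):
    # Pareto dominance for minimization: no worse everywhere, better somewhere.
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

def search(x, ref, pop=40, gens=60, p_flip=0.1):
    n = len(x)
    population = [rng.random(n) < 0.3 for _ in range(pop)]
    for _ in range(gens):
        # Mutate each mask by flipping a few bits, then select elitistically.
        children = [m ^ (rng.random(n) < p_flip) for m in population]
        union = population + children
        scores = [objectives(m, x, ref) for m in union]
        # Rank by how many candidates dominate each one (0 = Pareto front).
        ranks = [sum(dominates(scores[j], scores[i]) for j in range(len(union)))
                 for i in range(len(union))]
        order = sorted(range(len(union)), key=lambda i: (ranks[i], scores[i]))
        population = [union[i] for i in order[:pop]]
    # Return valid counterfactuals from the final population, sparsest first.
    finals = [(objectives(m, x, ref), np.where(m, ref, x)) for m in population]
    valid = [(s, cf) for s, cf in finals if s[0] == 0.0]
    valid.sort(key=lambda t: (t[0][1], t[0][2]))
    return valid

x = np.linspace(1.0, 2.0, 24)   # toy series, classified as 1
ref = np.full(24, -3.0)         # reference series from the opposite class
results = search(x, ref)
```

TX-Gen's reference-guided mechanism is richer than the fixed `ref` series used here; the sketch only illustrates why a population-based search can return a diverse set of sparsity/proximity trade-offs in one run.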
Related papers
- FutureFill: Fast Generation from Convolutional Sequence Models [22.410028211490424]
FutureFill is a method for fast generation that applies to any sequence prediction algorithm based on convolutional operators.
Our approach reduces the generation time requirement from quadratic to quasilinear relative to the context length.
We validate our theoretical findings with experimental evidence demonstrating correctness and efficiency gains in a synthetic generation task.
arXiv Detail & Related papers (2024-10-02T15:22:08Z)
- Classification of High-dimensional Time Series in Spectral Domain using Explainable Features [8.656881800897661]
We propose a model-based approach for classifying high-dimensional stationary time series.
Our approach emphasizes the interpretability of model parameters, making it especially suitable for fields like neuroscience.
The novelty of our method lies in the interpretability of the model parameters, addressing critical needs in neuroscience.
arXiv Detail & Related papers (2024-08-15T19:10:12Z)
- tPARAFAC2: Tracking evolving patterns in (incomplete) temporal data [0.7285444492473742]
We introduce t(emporal)PARAFAC2 which utilizes temporal smoothness regularization on the evolving factors.
Our numerical experiments on both simulated and real datasets demonstrate the effectiveness of the temporal smoothness regularization.
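The temporal smoothness regularization mentioned here is, in its most common form, a quadratic penalty on the difference between consecutive factor matrices. A minimal sketch of such a penalty term (the function name and weighting are illustrative, not tPARAFAC2's exact formulation):

```python
import numpy as np

def temporal_smoothness_penalty(factors, lam=1.0):
    # lam * sum_t ||F_t - F_{t-1}||_F^2 over the evolving factor matrices,
    # discouraging abrupt changes between consecutive time steps.
    return lam * sum(float(np.sum((f2 - f1) ** 2))
                     for f1, f2 in zip(factors, factors[1:]))

steady = [np.ones((2, 3))] * 4                    # no change over time
drift = [t * np.ones((2, 3)) for t in range(4)]   # unit step per time slice
p_steady = temporal_smoothness_penalty(steady)
p_drift = temporal_smoothness_penalty(drift)
```

In a factorization fit, this term would be added to the reconstruction loss, so minimizing the total pulls successive factors toward each other.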
arXiv Detail & Related papers (2024-07-01T15:10:55Z)
- Embedded feature selection in LSTM networks with multi-objective evolutionary ensemble learning for time series forecasting [49.1574468325115]
We present a novel feature selection method embedded in Long Short-Term Memory networks.
Our approach optimizes the weights and biases of the LSTM in a partitioned manner.
Experimental evaluations on air quality time series data from Italy and southeast Spain demonstrate that our method substantially improves the generalization ability of conventional LSTMs.
arXiv Detail & Related papers (2023-12-29T08:42:10Z)
- Precision-Recall Divergence Optimization for Generative Modeling with GANs and Normalizing Flows [54.050498411883495]
We develop a novel training method for generative models, such as Generative Adversarial Networks and Normalizing Flows.
We show that achieving a specified precision-recall trade-off corresponds to minimizing a unique $f$-divergence from a family we call the PR-divergences.
Our approach improves the performance of existing state-of-the-art models like BigGAN in terms of either precision or recall when tested on datasets such as ImageNet.
arXiv Detail & Related papers (2023-05-30T10:07:17Z)
- Conditional Denoising Diffusion for Sequential Recommendation [62.127862728308045]
Two prominent generative models, Generative Adversarial Networks (GANs) and Variational AutoEncoders (VAEs), each have notable drawbacks: GANs suffer from unstable optimization, while VAEs are prone to posterior collapse and over-smoothed generations.
We present a conditional denoising diffusion model, which includes a sequence encoder, a cross-attentive denoising decoder, and a step-wise diffuser.
arXiv Detail & Related papers (2023-04-22T15:32:59Z)
- An Interpretable and Efficient Infinite-Order Vector Autoregressive Model for High-Dimensional Time Series [1.4939176102916187]
This paper proposes a novel sparse infinite-order VAR model for high-dimensional time series.
The temporal and cross-sectional structures of the VARMA-type dynamics captured by this model can be interpreted separately.
Greater statistical efficiency and interpretability can be achieved with little loss of temporal information.
arXiv Detail & Related papers (2022-09-02T17:14:24Z)
- HyperImpute: Generalized Iterative Imputation with Automatic Model Selection [77.86861638371926]
We propose a generalized iterative imputation framework for adaptively and automatically configuring column-wise models.
We provide a concrete implementation with out-of-the-box learners, simulators, and interfaces.
arXiv Detail & Related papers (2022-06-15T19:10:35Z)
- MACE: An Efficient Model-Agnostic Framework for Counterfactual Explanation [132.77005365032468]
We propose a novel framework of Model-Agnostic Counterfactual Explanation (MACE).
In our MACE approach, we propose a novel RL-based method for finding good counterfactual examples and a gradient-less descent method for improving proximity.
Experiments on public datasets validate the effectiveness of MACE, which achieves better validity, sparsity, and proximity.
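Validity, sparsity, and proximity are the same criteria TX-Gen optimizes. A small sketch of how these are commonly computed for a single counterfactual (the function and metric formulas here follow standard usage in the counterfactual literature, not MACE's exact definitions):

```python
import numpy as np

def cf_quality(x, cf, pred_x, pred_cf):
    # Validity: did the model's prediction flip on the counterfactual?
    validity = pred_cf != pred_x
    # Sparsity: how many features (time steps) were actually edited?
    sparsity = int(np.sum(~np.isclose(x, cf)))
    # Proximity: how far did the counterfactual move (L1 distance)?
    proximity = float(np.linalg.norm(cf - x, ord=1))
    return validity, sparsity, proximity

x = np.array([1.0, 2.0, 3.0])
cf = np.array([1.0, 2.0, -3.0])  # one feature edited
metrics = cf_quality(x, cf, pred_x=1, pred_cf=0)
```

Methods differ mainly in how they trade these off: single-objective approaches fold them into one weighted loss, while multi-objective approaches like TX-Gen return a Pareto set over them.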
arXiv Detail & Related papers (2022-05-31T04:57:06Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Instance-based Counterfactual Explanations for Time Series Classification [11.215352918313577]
We advance Native Guide, a novel model-agnostic, case-based technique that generates counterfactual explanations for time series classifiers.
We show that Native Guide generates plausible, proximal, sparse and diverse explanations that are better than those produced by key benchmark counterfactual methods.
arXiv Detail & Related papers (2020-09-28T10:52:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.