A Bounded Measure for Estimating the Benefit of Visualization
- URL: http://arxiv.org/abs/2002.05282v2
- Date: Sat, 25 Jul 2020 20:33:33 GMT
- Title: A Bounded Measure for Estimating the Benefit of Visualization
- Authors: Min Chen, Mateu Sbert, Alfie Abdul-Rahman, and Deborah Silver
- Abstract summary: Information theory can be used to analyze the cost-benefit of visualization processes.
The current measure of benefit contains an unbounded term that is neither easy to estimate nor intuitive to interpret.
We propose to revise the existing cost-benefit measure by replacing the unbounded term with a bounded one.
- Score: 3.8360246117087473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Information theory can be used to analyze the cost-benefit of visualization
processes. However, the current measure of benefit contains an unbounded term
that is neither easy to estimate nor intuitive to interpret. In this work, we
propose to revise the existing cost-benefit measure by replacing the unbounded
term with a bounded one. We examine a number of bounded measures that include
the Jensen-Shannon divergence and a new divergence measure formulated as part
of this work. We use visual analysis to support the multi-criteria comparison,
narrowing the search down to those options with better mathematical properties.
We apply those remaining options to two visualization case studies to
instantiate their uses in practical scenarios, while the collected real world
data further informs the selection of a bounded measure, which can be used to
estimate the benefit of visualization.
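To make the contrast concrete, here is a minimal Python sketch (not code from the paper; the distributions p and q are invented purely for illustration). It compares the Kullback-Leibler divergence, the kind of unbounded term the existing cost-benefit measure relies on, with the Jensen-Shannon divergence, one of the bounded candidates the abstract mentions.

```python
# Minimal illustrative sketch (not code from the paper): contrast the unbounded
# KL divergence with the bounded Jensen-Shannon divergence. The distributions
# p and q below are invented purely for illustration.
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) in bits; can grow without bound when q nearly excludes
    outcomes that p considers likely."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence in bits; always lies in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5]            # e.g., a balanced expectation over two outcomes
q = [1e-6, 1.0 - 1e-6]    # a distribution that almost rules out the first outcome
print(f"KL(p || q) = {kl_divergence(p, q):.3f} bits")  # ~9 bits, unbounded as q[0] -> 0
print(f"JS(p, q)   = {js_divergence(p, q):.3f} bits")  # ~0.31 bits, never exceeds 1
```

Because the Jensen-Shannon term can never exceed 1 bit, it is easier to bound and interpret than the KL term, which is in line with the abstract's motivation for replacing the unbounded term.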
Related papers
- Rethinking Distance Metrics for Counterfactual Explainability [53.436414009687]
We investigate a framing for counterfactual generation methods that considers counterfactuals not as independent draws from a region around the reference, but as jointly sampled with the reference from the underlying data distribution.
We derive a distance metric tailored for counterfactual similarity that can be applied to a broad range of settings.
arXiv Detail & Related papers (2024-10-18T15:06:50Z) - Disentangled Representation Learning with Transmitted Information Bottleneck [57.22757813140418]
We present DisTIB (Transmitted Information Bottleneck for Disentangled representation learning), a novel objective that navigates the balance between information compression and preservation.
arXiv Detail & Related papers (2023-11-03T03:18:40Z) - Navigating Explanatory Multiverse Through Counterfactual Path Geometry [5.109188339767978]
We introduce the novel concept of an explanatory multiverse.
We show how to navigate, reason about, and compare the geometry of the counterfactual trajectories it encompasses.
We propose an all-in-one metric, called opportunity potential, to quantify them.
arXiv Detail & Related papers (2023-06-05T11:26:46Z) - Best-Effort Adaptation [62.00856290846247]
We present a new theoretical analysis of sample reweighting methods, including bounds holding uniformly over the weights.
We show how these bounds can guide the design of learning algorithms that we discuss in detail.
We report the results of a series of experiments demonstrating the effectiveness of our best-effort adaptation and domain adaptation algorithms.
arXiv Detail & Related papers (2023-05-10T00:09:07Z) - Bayesian Hierarchical Models for Counterfactual Estimation [12.159830463756341]
We propose a probabilistic paradigm to estimate a diverse set of counterfactuals.
We treat the perturbations as random variables endowed with prior distribution functions.
A gradient-based sampler with superior convergence characteristics efficiently computes the posterior samples.
arXiv Detail & Related papers (2023-01-21T00:21:11Z) - Unifying Summary Statistic Selection for Approximate Bayesian Computation [2.928146328426698]
We characterize different classes of summaries and demonstrate their importance for correctly analysing dimensionality reduction algorithms.
We offer a unifying framework for obtaining informative summaries, provide concrete recommendations for practitioners, and propose a practical method to obtain high-fidelity summaries.
arXiv Detail & Related papers (2022-06-06T03:59:46Z) - On the Choice of Fairness: Finding Representative Fairness Metrics for a Given Context [5.667221573173013]
Various notions of fairness have been defined, though choosing an appropriate metric is cumbersome.
Trade-offs and impossibility theorems make such selection even more complicated and controversial.
We propose a framework that automatically discovers the correlations and trade-offs between different pairs of measures for a given context.
arXiv Detail & Related papers (2021-09-13T04:17:38Z) - MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z) - Uncertainty-Aware Few-Shot Image Classification [118.72423376789062]
Few-shot image classification learns to recognize new categories from limited labelled data.
We propose an Uncertainty-Aware Few-Shot framework for image classification.
arXiv Detail & Related papers (2020-10-09T12:26:27Z) - Functional Regularization for Representation Learning: A Unified Theoretical Perspective [27.93916012334704]
Unsupervised and self-supervised learning approaches have become a crucial tool to learn representations for downstream prediction tasks.
We present a unifying perspective where several such approaches can be viewed as imposing a regularization on the representation via a learnable function using unlabeled data.
We propose a discriminative theoretical framework for analyzing the sample complexity of these approaches, which generalizes the framework of (Balcan and Blum, 2010) to allow learnable regularization functions.
arXiv Detail & Related papers (2020-08-06T04:06:04Z) - Evaluations and Methods for Explanation through Robustness Analysis [117.7235152610957]
We establish a novel set of evaluation criteria for feature-based explanations via robustness analysis.
We obtain new explanations that are loosely necessary and sufficient for a prediction.
We extend the explanation to extract the set of features that would move the current prediction to a target class.
arXiv Detail & Related papers (2020-05-31T05:52:05Z) - On Binscatter [0.7999703756441756]
We formally study the properties of binscatter and develop enhanced visualization and econometric binscatter tools.
General purpose software in Python, R, and Stata is provided.
arXiv Detail & Related papers (2019-02-25T20:53:04Z)
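As a side note to the last entry, the following is a rough sketch of the basic binscatter idea (quantile-bin the x variable and plot within-bin means of y). It is illustrative only, uses invented synthetic data, and is not the general-purpose Python/R/Stata software mentioned above.

```python
# Rough binscatter sketch (illustrative only; not the tooling from "On Binscatter"):
# bin x into quantiles and plot each bin's mean of y against its mean of x.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=2000)                      # synthetic predictor
y = np.sin(x) + 0.5 * x + rng.normal(size=x.size)      # synthetic noisy response

n_bins = 20
edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))  # quantile bin edges
bins = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)

bin_x = np.array([x[bins == b].mean() for b in range(n_bins)])
bin_y = np.array([y[bins == b].mean() for b in range(n_bins)])

plt.scatter(x, y, s=4, alpha=0.15, label="raw data")
plt.scatter(bin_x, bin_y, color="crimson", label="bin means (binscatter)")
plt.xlabel("x"); plt.ylabel("y"); plt.legend(); plt.show()
```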
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all generated summaries) and is not responsible for any consequences of its use.