ELSA: Efficient Label Shift Adaptation through the Lens of
Semiparametric Models
- URL: http://arxiv.org/abs/2305.19123v1
- Date: Tue, 30 May 2023 15:31:44 GMT
- Title: ELSA: Efficient Label Shift Adaptation through the Lens of
Semiparametric Models
- Authors: Qinglong Tian, Xin Zhang, Jiwei Zhao
- Abstract summary: Under the label shift context, the marginal distribution of the label varies across the training and testing datasets.
We propose a moment-matching framework for adapting the label shift based on the geometry of the influence function.
Under such a framework, we propose a novel method named Efficient Label Shift Adaptation (ELSA)
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the domain adaptation problem with label shift in this work. Under
the label shift context, the marginal distribution of the label varies across
the training and testing datasets, while the conditional distribution of
features given the label is the same. Traditional label shift adaptation
methods either suffer from large estimation errors or require cumbersome
post-prediction calibrations. To address these issues, we first propose a
moment-matching framework for adapting the label shift based on the geometry of
the influence function. Under such a framework, we propose a novel method named
\underline{E}fficient \underline{L}abel \underline{S}hift
\underline{A}daptation (ELSA), in which the adaptation weights can be estimated
by solving linear systems. Theoretically, the ELSA estimator is
$\sqrt{n}$-consistent ($n$ is the sample size of the source data) and
asymptotically normal. Empirically, we show that ELSA can achieve
state-of-the-art estimation performances without post-prediction calibrations,
thus, gaining computational efficiency.
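The core idea of estimating adaptation weights by solving a linear system can be illustrated with the classic confusion-matrix moment-matching estimator. This is a hedged sketch in the same spirit as ELSA, not the paper's exact estimator; the function name and setup below are illustrative.

```python
import numpy as np

def estimate_shift_weights(y_src, yhat_src, yhat_tgt, n_classes):
    """Estimate label-shift weights w(y) = q(y)/p(y) by matching moments:
    solve C w = mu, where C[i, j] = P_src(yhat=i, y=j) is the source joint
    distribution of predictions and labels, and mu[i] = P_tgt(yhat=i) is
    the target distribution of predictions."""
    C = np.zeros((n_classes, n_classes))
    for yt, yp in zip(y_src, yhat_src):
        C[yp, yt] += 1.0 / len(y_src)
    mu = np.bincount(yhat_tgt, minlength=n_classes) / len(yhat_tgt)
    w = np.linalg.solve(C, mu)    # adaptation weights, one linear solve
    return np.clip(w, 0.0, None)  # true weights are nonnegative

# Toy example: a perfect classifier, balanced source labels,
# target shifted to 80% class 0 / 20% class 1.
y_src = np.array([0] * 50 + [1] * 50)
yhat_tgt = np.array([0] * 80 + [1] * 20)
w = estimate_shift_weights(y_src, y_src, yhat_tgt, 2)  # -> [1.6, 0.4]
```

Because the weights come from a single linear solve, no iterative post-prediction calibration loop is needed, which is the computational advantage the abstract emphasizes.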
Related papers
- Inaccurate Label Distribution Learning with Dependency Noise [52.08553913094809]
We introduce the Dependent Noise-based Inaccurate Label Distribution Learning (DN-ILDL) framework to tackle the challenges posed by noise in label distribution learning.
We show that DN-ILDL effectively addresses the ILDL problem and outperforms existing LDL methods.
arXiv Detail & Related papers (2024-05-26T07:58:07Z)
- Estimating calibration error under label shift without labels [47.57286245320775]
Existing CE estimators assume access to labels from the target domain, which are often unavailable in practice, i.e., when the model is deployed and used.
This work proposes a novel CE estimator under label shift, which is characterized by changes in the marginal label distribution $p(Y)$ while keeping the conditional $p(X|Y)$ constant between the source and target distributions.
Our contribution is an approach, which, by leveraging importance re-weighting of the labeled source distribution, provides consistent and unbiased CE estimation with respect to the shifted target distribution.
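The re-weighting identity behind such estimators is simple: given weights $w(y) = q(y)/p(y)$, any target-domain expectation can be estimated from labeled source data via $E_q[g] = E_p[w(Y)\,g(X, Y)]$. A minimal sketch (function name and toy data are illustrative, not from the paper):

```python
import numpy as np

def reweighted_mean(g_vals, y, w):
    """Importance-weighted estimate of E_target[g] from source samples:
    each source sample is scaled by the label-shift weight of its label."""
    return np.mean(w[y] * g_vals)

# Toy check: g(x, y) = y, source p(y) = [0.5, 0.5], target q(y) = [0.2, 0.8].
y = np.array([0] * 500 + [1] * 500)
w = np.array([0.2 / 0.5, 0.8 / 0.5])      # q(y) / p(y) per class
est = reweighted_mean(y.astype(float), y, w)  # recovers E_q[Y] = 0.8
```

The same weighting applied to a per-sample calibration statistic instead of $g(x,y)=y$ gives a consistent calibration-error estimate under the shifted target, which is the paper's setting.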
arXiv Detail & Related papers (2023-12-14T01:18:51Z)
- Online Label Shift: Optimal Dynamic Regret meets Practical Algorithms [33.61487362513345]
This paper focuses on supervised and unsupervised online label shift, where the class marginals $Q(y)$ vary but the class-conditionals $Q(x|y)$ remain invariant.
In the unsupervised setting, our goal is to adapt a learner, trained on some offline labeled data, to changing label distributions given unlabeled online data.
We develop novel algorithms that reduce the adaptation problem to online regression and guarantee optimal dynamic regret without any prior knowledge of the extent of drift in the label distribution.
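The unsupervised online setting can be pictured with a much simpler baseline: track the drifting class marginal from unlabeled predictions and reweight a fixed offline classifier. This sketch illustrates the setting only; the paper's algorithms reduce adaptation to online regression and carry dynamic-regret guarantees that this crude exponential average does not. All names here are illustrative.

```python
import numpy as np

def online_marginal_update(q_est, pred_probs, lr=0.05):
    """One online step: nudge the running marginal estimate toward the
    current batch's mean predicted class probabilities."""
    q_est = (1 - lr) * q_est + lr * pred_probs.mean(axis=0)
    return q_est / q_est.sum()

def reweight_scores(scores, q_est, p_src):
    """Adjust the offline classifier's scores by the estimated ratio of
    target to source class marginals, then renormalize per sample."""
    adj = scores * (q_est / p_src)
    return adj / adj.sum(axis=1, keepdims=True)

# Drift toward a target where class 0 dominates:
q = np.array([0.5, 0.5])
for _ in range(200):
    q = online_marginal_update(q, np.array([[0.9, 0.1]]))
s = reweight_scores(np.array([[0.5, 0.5]]), q, np.array([0.5, 0.5]))
```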
arXiv Detail & Related papers (2023-05-31T05:39:52Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Under this perspective, we pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Label distribution learning via label correlation grid [9.340734188957727]
We propose a Label Correlation Grid (LCG) to model the uncertainty of label relationships.
Our network learns the LCG to accurately estimate the label distribution for each instance.
arXiv Detail & Related papers (2022-10-15T03:58:15Z)
- Domain Adaptation under Open Set Label Shift [39.424134505152544]
We introduce the problem of domain adaptation under Open Set Label Shift (OSLS).
OSLS subsumes domain adaptation under label shift and Positive-Unlabeled (PU) learning.
We propose practical methods for both tasks that leverage black-box predictors.
arXiv Detail & Related papers (2022-07-26T17:09:48Z)
- Instance-Dependent Partial Label Learning [69.49681837908511]
Partial label learning is a typical weakly supervised learning problem.
Most existing approaches assume that the incorrect labels in each training example are randomly picked as the candidate labels.
In this paper, we consider the instance-dependent case and assume that each example is associated with a latent label distribution constituted by the real-valued description degree of each label.
arXiv Detail & Related papers (2021-10-25T12:50:26Z)
- Distribution-Aware Semantics-Oriented Pseudo-label for Imbalanced Semi-Supervised Learning [80.05441565830726]
This paper addresses imbalanced semi-supervised learning, where heavily biased pseudo-labels can harm the model performance.
Motivated by this observation, we propose a general pseudo-labeling framework to address the bias.
We term the novel pseudo-labeling framework for imbalanced SSL as Distribution-Aware Semantics-Oriented (DASO) Pseudo-label.
arXiv Detail & Related papers (2021-06-10T11:58:25Z)
- Coping with Label Shift via Distributionally Robust Optimisation [72.80971421083937]
We propose a model that minimises an objective based on distributionally robust optimisation (DRO).
We then design and analyse a gradient descent-proximal mirror ascent algorithm tailored for large-scale problems to optimise the proposed objective.
arXiv Detail & Related papers (2020-10-23T08:33:04Z)
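The DRO idea for label shift is to minimise the worst-case class-weighted loss over a set of plausible label marginals. As a hedged sketch, the inner maximiser can be approximated by projected gradient ascent on the probability simplex; the paper's tailored gradient descent-proximal mirror ascent algorithm is more sophisticated than this illustration, and all names below are illustrative.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    (sorting-based algorithm)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    theta = css[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def worst_case_weights(class_losses, steps=100, lr=0.1):
    """Inner DRO step: find the label marginal w on the simplex that
    maximises the weighted loss sum_y w_y * L_y."""
    w = np.full(len(class_losses), 1.0 / len(class_losses))
    for _ in range(steps):
        w = project_simplex(w + lr * class_losses)
    return w

# For a linear objective over the simplex, the maximiser concentrates
# all mass on the hardest class (here, class 1 with loss 0.9):
w = worst_case_weights(np.array([0.2, 0.9, 0.4]))
```

In a full DRO training loop the outer minimisation over model parameters alternates with this inner ascent; in practice the adversarial set is usually a ball around the source marginal rather than the whole simplex, which keeps the reweighting from collapsing onto a single class.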
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.