Direct Measure Matching for Crowd Counting
- URL: http://arxiv.org/abs/2107.01558v1
- Date: Sun, 4 Jul 2021 06:37:33 GMT
- Title: Direct Measure Matching for Crowd Counting
- Authors: Hui Lin, Xiaopeng Hong, Zhiheng Ma, Xing Wei, Yunfeng Qiu, Yaowei
Wang, Yihong Gong
- Abstract summary: We propose a new measure-based counting approach to regress the predicted density maps to the scattered point-annotated ground truth directly.
In this paper, we derive a semi-balanced form of Sinkhorn divergence, based on which a Sinkhorn counting loss is designed for measure matching.
- Score: 59.66286603624411
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditional crowd counting approaches usually use a Gaussian assumption to
generate pseudo density ground truth, which suffers from problems such as
inaccurate estimation of the Gaussian kernel sizes. In this paper, we propose a
new measure-based counting approach to regress the predicted density maps to
the scattered point-annotated ground truth directly. First, crowd counting is
formulated as a measure matching problem. Second, we derive a semi-balanced
form of Sinkhorn divergence, based on which a Sinkhorn counting loss is
designed for measure matching. Third, we propose a self-supervised mechanism by
devising a Sinkhorn scale consistency loss to resist scale changes. Finally, an
efficient optimization method is provided to minimize the overall loss
function. Extensive experiments on four challenging crowd counting datasets,
namely ShanghaiTech, UCF-QNRF, JHU++, and NWPU, have validated the proposed
method.
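Since no code accompanies this summary, the following is a minimal, illustrative PyTorch sketch of the measure-matching idea behind a Sinkhorn counting loss: the predicted density map is treated as a discrete measure over pixel locations and matched to the scattered point annotations with entropic-regularized optimal transport. The function name sinkhorn_counting_loss, the squared-Euclidean ground cost, the plain balanced Sinkhorn iterations (used here in place of the paper's semi-balanced divergence), and the quadratic count penalty are all assumptions of this sketch, not the authors' implementation; the scale consistency loss is omitted.

import torch

def sinkhorn_counting_loss(density, points, eps=0.01, n_iters=50):
    """density: (H, W) predicted density map; points: (N, 2) annotated (y, x) coordinates."""
    H, W = density.shape
    ys, xs = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                            torch.arange(W, dtype=torch.float32), indexing="ij")
    pix = torch.stack([ys.reshape(-1), xs.reshape(-1)], dim=1)   # (H*W, 2) pixel centers

    # Normalize both sides to probability measures; the total mass is handled
    # separately by the count term below (a simplification of this sketch).
    a = density.reshape(-1).clamp_min(1e-10)
    a = a / a.sum()
    b = torch.full((points.shape[0],), 1.0 / points.shape[0])

    # Squared-Euclidean ground cost between pixel centers and annotated points.
    C = torch.cdist(pix, points.float()) ** 2
    C = C / C.max()                                              # rescale for stability

    # Plain balanced Sinkhorn iterations in the log domain (dual potentials f, g).
    log_a, log_b = a.log(), b.log()
    f = torch.zeros_like(log_a)
    g = torch.zeros_like(log_b)
    for _ in range(n_iters):
        f = -eps * torch.logsumexp((g[None, :] - C) / eps + log_b[None, :], dim=1)
        g = -eps * torch.logsumexp((f[:, None] - C) / eps + log_a[:, None], dim=0)
    ot_cost = (f * a).sum() + (g * b).sum()      # dual objective, approx. regularized OT cost

    # Quadratic penalty so the predicted total mass matches the annotated head count.
    count_loss = (density.sum() - points.shape[0]) ** 2
    return ot_cost + count_loss

# Example: random 32x32 density map matched against 5 annotated points.
loss = sinkhorn_counting_loss(torch.rand(32, 32), torch.randint(0, 32, (5, 2)))

The log-domain updates are used purely for numerical stability at small eps; at convergence the dual objective approximates the regularized transport cost between the predicted and annotated measures.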
Related papers
- Scale-Aware Crowd Count Network with Annotation Error Correction [19.036693336902662]
Traditional crowd counting networks suffer from information loss when feature maps are downsized through pooling layers.
We propose a Scale-Aware Crowd Counting Network (SACC-Net) that introduces a "scale-aware" architecture with error-correcting capabilities for noisy annotations.
arXiv Detail & Related papers (2023-12-28T01:36:38Z)
- Direct Unsupervised Denoising [60.71146161035649]
Unsupervised denoisers do not directly produce a single prediction, such as the MMSE estimate.
We present an alternative approach that trains a deterministic network alongside the VAE to directly predict a central tendency.
arXiv Detail & Related papers (2023-10-27T13:02:12Z)
- Revisiting Rotation Averaging: Uncertainties and Robust Losses [51.64986160468128]
We argue that the main problem of current methods is the minimized cost function, which is only weakly connected with the input data via the estimated epipolar geometries.
We propose to better model the underlying noise distributions by directly propagating the uncertainty from the point correspondences into the rotation averaging.
arXiv Detail & Related papers (2023-03-09T11:51:20Z)
- Scale-Aware Crowd Counting Using a Joint Likelihood Density Map and Synthetic Fusion Pyramid Network [15.882525477601183]
We develop a Synthetic Fusion Pyramid Network (SPF-Net) with a scale-aware loss function design for accurate crowd counting.
Existing crowd-counting methods assume that the training annotation points were accurate and thus ignore the fact that noisy annotations can lead to large model-learning bias and counting error.
This work is the first to properly handle such noise at multiple scales in end-to-end loss design and thus push the crowd counting state-of-the-art.
arXiv Detail & Related papers (2022-11-13T06:52:47Z)
- Robust Inference of Manifold Density and Geometry by Doubly Stochastic Scaling [8.271859911016719]
We develop tools for robust inference under high-dimensional noise.
We show that our approach is robust to variability in technical noise levels across cell types.
arXiv Detail & Related papers (2022-09-16T15:39:11Z)
- Heavy-tailed denoising score matching [5.371337604556311]
We develop an iterative noise scaling algorithm to consistently initialise the multiple levels of noise in Langevin dynamics.
On the practical side, our use of heavy-tailed DSM leads to improved score estimation, controllable sampling convergence, and more balanced unconditional generative performance for imbalanced datasets.
arXiv Detail & Related papers (2021-12-17T22:04:55Z)
- Distribution Matching for Crowd Counting [51.90971145453012]
We show that imposing Gaussians on annotations hurts generalization performance.
We propose to use Distribution Matching for crowd COUNTing (DM-Count).
In terms of Mean Absolute Error, DM-Count outperforms the previous state-of-the-art methods.
arXiv Detail & Related papers (2020-09-28T04:57:23Z)
- Completely Self-Supervised Crowd Counting via Distribution Matching [92.09218454377395]
We propose a complete self-supervision approach to training models for dense crowd counting.
The only input required to train, apart from a large set of unlabeled crowd images, is the approximate upper limit of the crowd count.
Our method dwells on the idea that natural crowds follow a power law distribution, which could be leveraged to yield error signals for backpropagation.
arXiv Detail & Related papers (2020-09-14T13:20:12Z)
- Generative Modeling with Denoising Auto-Encoders and Langevin Sampling [88.83704353627554]
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
arXiv Detail & Related papers (2020-01-31T23:50:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.