Dilated Continuous Random Field for Semantic Segmentation
- URL: http://arxiv.org/abs/2202.00162v1
- Date: Tue, 1 Feb 2022 00:38:55 GMT
- Title: Dilated Continuous Random Field for Semantic Segmentation
- Authors: Xi Mo, Xiangyu Chen, Cuncong Zhong, Rui Li, Kaidong Li, Usman Sajid
- Abstract summary: Mean field approximation methodology has laid the foundation of modern Continuous Random Field (CRF) based solutions for semantic segmentation.
In this paper, we propose to relax the hard constraint of mean field approximation by a global optimization with the proposed dilated sparse convolution module (DSConv).
In addition, adaptive global average-pooling and adaptive global max-pooling are implemented as replacements of fully connected layers.
- Score: 6.1794523510406885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mean field approximation methodology has laid the foundation of modern
Continuous Random Field (CRF) based solutions for the refinement of semantic
segmentation. In this paper, we propose to relax the hard constraint of mean
field approximation, namely minimizing the energy term of each node of the
probabilistic graphical model, in favor of a global optimization with the
proposed dilated sparse convolution module (DSConv). In addition, adaptive global
average-pooling and adaptive global max-pooling are implemented as replacements
of fully connected layers. In order to integrate DSConv, we design an
end-to-end, time-efficient DilatedCRF pipeline. The unary energy term is
derived either from pre-softmax and post-softmax features, or the predicted
affordance map using a conventional classifier, making it easier to implement
DilatedCRF for a variety of classifiers. We also present superior experimental
results of the proposed approach on the suction dataset compared to other
CRF-based approaches.
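The abstract notes that adaptive global average-pooling and adaptive global max-pooling replace fully connected layers. As an illustration only (a minimal NumPy sketch, not the authors' implementation; the `pooled_head` name and the concatenation of both pooled vectors are assumptions), pooling each channel down to a single value yields a fixed-size vector regardless of the spatial resolution of the feature map:

```python
import numpy as np

def adaptive_global_avg_pool(x):
    # x: (C, H, W) feature map -> (C,) vector, independent of H and W.
    return x.mean(axis=(1, 2))

def adaptive_global_max_pool(x):
    # Same reduction, but keeping each channel's maximum activation.
    return x.max(axis=(1, 2))

def pooled_head(x):
    # Hypothetical head: concatenate both pooled vectors in place of an FC layer.
    return np.concatenate([adaptive_global_avg_pool(x),
                           adaptive_global_max_pool(x)])

feat = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
out = pooled_head(feat)  # shape (4,): [avg_c0, avg_c1, max_c0, max_c1]
```

Because the output size depends only on the channel count, the same head serves inputs of any spatial resolution, which is the practical advantage of replacing fixed-size fully connected layers.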
Related papers
- Adaptive Sampled Softmax with Inverted Multi-Index: Methods, Theory and Applications [79.53938312089308]
The MIDX-Sampler is a novel adaptive sampling strategy based on an inverted multi-index approach.
Our method is backed by rigorous theoretical analysis, addressing key concerns such as sampling bias, gradient bias, convergence rates, and generalization error bounds.
arXiv Detail & Related papers (2025-01-15T04:09:21Z) - Integrating Geodesic Interpolation and Flow Matching for Non-Autoregressive Text Generation in Logit Space [4.347494885647007]
Non-autoregressive language models are emerging as effective alternatives to autoregressive models in the field of natural language processing.
This study introduces a novel flow matching approach that employs Kullback-Leibler divergence geodesics to interpolate between initial and target distributions for discrete sequences.
arXiv Detail & Related papers (2024-11-25T17:15:41Z) - Diffusion Stochastic Optimization for Min-Max Problems [33.73046548872663]
The optimistic gradient method is useful in addressing minimax optimization problems.
Motivated by the observation that the conventional version suffers from the need for a large batch size, we introduce and analyze a new formulation termed Samevareps-generativeOGOG.
arXiv Detail & Related papers (2024-01-26T01:16:59Z) - Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Regularized Frank-Wolfe for Dense CRFs: Generalizing Mean Field and
Beyond [19.544213396776268]
We introduce regularized Frank-Wolfe, a general and effective algorithm for inference in dense conditional random fields.
We show that our new algorithms yield significant improvements when combined with strong neural networks.
arXiv Detail & Related papers (2021-10-27T20:44:47Z) - The Minimax Complexity of Distributed Optimization [0.0]
I present the "graph oracle model", an extension of the classic oracle framework that can be applied to distributed optimization.
I focus on the specific case of the "intermittent communication setting".
I analyze the theoretical properties of the popular Local SGD (stochastic gradient descent) algorithm in the convex setting.
arXiv Detail & Related papers (2021-09-01T15:18:33Z) - Efficient semidefinite-programming-based inference for binary and
multi-class MRFs [83.09715052229782]
We propose an efficient method for computing the partition function or MAP estimate in a pairwise MRF.
We extend semidefinite relaxations from the typical binary MRF to the full multi-class setting, and develop a compact semidefinite relaxation that can again be solved efficiently with the same solver.
arXiv Detail & Related papers (2020-12-04T15:36:29Z) - Efficient Methods for Structured Nonconvex-Nonconcave Min-Max
Optimization [98.0595480384208]
We propose a generalization of the extragradient algorithm which converges to a stationary point.
The algorithm applies not only to general $p$-normed spaces, but also to general $p$-dimensional vector spaces.
arXiv Detail & Related papers (2020-10-31T21:35:42Z) - Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM) where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, this divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
arXiv Detail & Related papers (2020-10-24T07:01:24Z) - AIN: Fast and Accurate Sequence Labeling with Approximate Inference
Network [75.44925576268052]
The linear-chain Conditional Random Field (CRF) model is one of the most widely-used neural sequence labeling approaches.
Exact probabilistic inference algorithms are typically applied in training and prediction stages of the CRF model.
We propose to employ a parallelizable approximate variational inference algorithm for the CRF model.
arXiv Detail & Related papers (2020-09-17T12:18:43Z)
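The last entry contrasts exact probabilistic inference in linear-chain CRFs with a parallelizable variational approximation. For reference, the exact forward algorithm that such approximations target can be sketched as follows (a minimal NumPy illustration; the function name and score conventions are assumptions, not the AIN paper's API):

```python
import numpy as np

def crf_log_partition(emissions, transitions):
    """Forward algorithm for a linear-chain CRF: log partition function.

    emissions:   (T, K) unary scores for T positions and K labels.
    transitions: (K, K) score of moving from label i to label j.
    """
    alpha = emissions[0].copy()  # log-scores of all paths ending at position 0
    for t in range(1, len(emissions)):
        # log-sum-exp over the previous label for each current label,
        # shifted by the max for numerical stability.
        scores = alpha[:, None] + transitions + emissions[t][None, :]
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0))
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())
```

The sequential dependence of `alpha` on the previous position is exactly what makes this recursion hard to parallelize over the sequence length, which motivates approximate inference networks like AIN. As a sanity check, with all scores zero the partition function counts the K^T label sequences, so the log partition is T·log K.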
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.