An MRF-UNet Product of Experts for Image Segmentation
- URL: http://arxiv.org/abs/2104.05495v1
- Date: Mon, 12 Apr 2021 14:25:32 GMT
- Title: An MRF-UNet Product of Experts for Image Segmentation
- Authors: Mikael Brudfors, Yaël Balbastre, John Ashburner, Geraint Rees,
Parashkev Nachev, Sébastien Ourselin, M. Jorge Cardoso
- Abstract summary: Markov random fields (MRFs) encode simpler distributions over labels that are less prone to over-fitting.
We propose to fuse both strategies by computing the product of distributions of a UNet and an MRF.
The resulting MRF-UNet is trained jointly by back-propagation.
- Score: 1.7897459398362972
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While convolutional neural networks (CNNs) trained by back-propagation have
seen unprecedented success at semantic segmentation tasks, they are known to
struggle on out-of-distribution data. Markov random fields (MRFs) on the other
hand, encode simpler distributions over labels that, although less flexible
than UNets, are less prone to over-fitting. In this paper, we propose to fuse
both strategies by computing the product of distributions of a UNet and an MRF.
As this product is intractable, we solve for an approximate distribution using
an iterative mean-field approach. The resulting MRF-UNet is trained jointly by
back-propagation. Compared to other works using conditional random fields
(CRFs), the MRF has no dependency on the imaging data, which should allow for
less over-fitting. We show on 3D neuroimaging data that this novel network
improves generalisation to out-of-distribution samples. Furthermore, it allows
the overall number of parameters to be reduced while preserving high accuracy.
These results suggest that a classic MRF smoothness prior can allow for less
over-fitting when principally integrated into a CNN model. Our implementation
is available at https://github.com/balbasty/nitorch.
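The product-of-experts construction described in the abstract can be illustrated with a minimal NumPy sketch: the UNet expert supplies per-pixel class logits, the MRF expert contributes a data-independent label-compatibility term computed from each pixel's neighbourhood, and the intractable product is approximated by iterating mean-field updates. The 2D shapes, the 4-neighbourhood, and the compatibility matrix `w` are illustrative assumptions, not the authors' nitorch implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mrf_unet_mean_field(unet_logits, w, n_iter=10):
    """Approximate the product of a UNet categorical distribution and a
    Potts-style MRF smoothness prior with iterative mean-field updates.

    unet_logits: (H, W, K) per-pixel class logits from the CNN expert.
    w: (K, K) label-compatibility matrix of the MRF expert; note it has
       no dependency on the imaging data, only on the labels.
    """
    q = softmax(unet_logits)  # initialise with the UNet posterior
    for _ in range(n_iter):
        # expected MRF contribution from the 4-neighbourhood of each pixel
        nbr = np.zeros_like(q)
        nbr[1:, :] += q[:-1, :]
        nbr[:-1, :] += q[1:, :]
        nbr[:, 1:] += q[:, :-1]
        nbr[:, :-1] += q[:, 1:]
        mrf_logits = nbr @ w
        # product of experts: the experts' logits add, then renormalise
        q = softmax(unet_logits + mrf_logits)
    return q
```

Because every step is differentiable, such a mean-field loop can sit at the end of a network and be trained jointly by back-propagation, as the paper describes.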
Related papers
- Bypassing the Noisy Parity Barrier: Learning Higher-Order Markov Random Fields from Dynamics [21.976109703401114]
We consider the problem of learning graphical models, also known as Markov random fields (MRFs), from temporally correlated samples.
In particular, we show that given a trajectory with $\widetilde{O}_k(n)$ site updates of an order-$k$ MRF from the Glauber dynamics, there is an algorithm that recovers the graph and the parameters in $\widetilde{O}_k(n^2)$ time.
Our results thus surprisingly show that this more realistic, but intuitively less tractable, model for MRFs actually leads to efficiency far beyond what
arXiv Detail & Related papers (2024-09-09T02:32:45Z) - Boundary-aware Decoupled Flow Networks for Realistic Extreme Rescaling [49.215957313126324]
Recently developed generative methods, including invertible rescaling network (IRN) based and generative adversarial network (GAN) based methods, have demonstrated exceptional performance in image rescaling.
However, IRN-based methods tend to produce over-smoothed results, while GAN-based methods easily generate fake details.
We propose Boundary-aware Decoupled Flow Networks (BDFlow) to generate realistic and visually pleasing results.
arXiv Detail & Related papers (2024-05-05T14:05:33Z) - Neural Markov Random Field for Stereo Matching [31.769019851152173]
We propose a neural MRF model, where both potential functions and message passing are designed using data-driven neural networks.
We also propose a Disparity Proposal Network (DPN) to adaptively prune the search space of disparity.
The proposed approach ranks 1st on both the KITTI 2012 and 2015 leaderboards while running in under 100 ms.
arXiv Detail & Related papers (2024-03-17T12:40:46Z) - Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
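The idea in this blurb, normalising against component-wise statistics of a Gaussian mixture rather than a single batch-wide mean and variance, can be approximated with a hard-assignment toy version. This is a crude stand-in for soft GMM responsibilities; the function name, interface, and hard assignment are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def compound_batch_norm(x, centers, eps=1e-5):
    """Toy compound normalisation: assign each sample to its nearest
    mixture centre, then normalise with that component's batch statistics
    instead of one global mean/variance.

    x: (N, D) batch of feature vectors.
    centers: (M, D) mixture-component centres (assumed given).
    """
    # hard-assign each sample to the closest component centre
    d = ((x[:, None, :] - centers[None]) ** 2).sum(-1)  # (N, M) sq. distances
    z = d.argmin(1)
    out = np.empty_like(x)
    for k in range(len(centers)):
        idx = z == k
        if not idx.any():
            continue  # no samples fell in this component this batch
        mu = x[idx].mean(0)
        var = x[idx].var(0)
        out[idx] = (x[idx] - mu) / np.sqrt(var + eps)
    return out
```

The intuition for long-tailed data is that per-component statistics stop the numerous head-class samples from dominating the normalisation applied to tail-class samples.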
arXiv Detail & Related papers (2022-12-02T07:31:39Z) - GFlowOut: Dropout with Generative Flow Networks [76.59535235717631]
Monte Carlo Dropout has been widely used as a relatively cheap way of performing approximate inference.
Recent works show that the dropout mask can be viewed as a latent variable, which can be inferred with variational inference.
GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks.
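For context, the baseline this blurb builds on, Monte Carlo Dropout as cheap approximate inference where each sampled mask is one draw of the latent dropout variable, can be sketched in a few lines of NumPy. This is the standard MC-Dropout baseline, not GFlowOut itself; the two-layer network, names, and shapes are illustrative.

```python
import numpy as np

def mc_dropout_predict(x, w1, w2, p=0.5, n_samples=100, rng=None):
    """Monte Carlo Dropout: keep dropout active at prediction time and
    average over sampled masks, treating each mask as a latent variable.

    x: (N, D) inputs; w1: (D, H) and w2: (H, K) weights of a toy 2-layer net.
    Returns the predictive mean and per-output spread across mask samples.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    preds = []
    for _ in range(n_samples):
        h = np.maximum(x @ w1, 0.0)        # hidden layer with ReLU
        mask = rng.random(h.shape) > p     # one sampled dropout mask
        h = h * mask / (1.0 - p)           # inverted-dropout rescaling
        preds.append(h @ w2)
    preds = np.stack(preds)
    return preds.mean(0), preds.std(0)
```

GFlowOut replaces the fixed Bernoulli mask distribution above with a learned posterior over masks; the averaging step stays the same.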
arXiv Detail & Related papers (2022-10-24T03:00:01Z) - Learning from aggregated data with a maximum entropy model [73.63512438583375]
We show how a new model, similar to a logistic regression, may be learned from aggregated data only by approximating the unobserved feature distribution with a maximum entropy hypothesis.
We present empirical evidence on several public datasets that the model learned this way can achieve performances comparable to those of a logistic model trained with the full unaggregated data.
arXiv Detail & Related papers (2022-10-05T09:17:27Z) - Distributionally Robust Models with Parametric Likelihood Ratios [123.05074253513935]
Three simple ideas allow us to train models with DRO using a broader class of parametric likelihood ratios.
We find that models trained with the resulting parametric adversaries are consistently more robust to subpopulation shifts when compared to other DRO approaches.
arXiv Detail & Related papers (2022-04-13T12:43:12Z) - Exponentially Tilted Gaussian Prior for Variational Autoencoder [3.52359746858894]
Recent studies show that probabilistic generative models can perform poorly on this task.
We propose the exponentially tilted Gaussian prior distribution for the Variational Autoencoder (VAE).
We show that our model produces high-quality image samples that are crisper than those of a standard Gaussian VAE.
arXiv Detail & Related papers (2021-11-30T18:28:19Z) - Semi-Supervised Node Classification on Graphs: Markov Random Fields vs.
Graph Neural Networks [38.760186021633146]
Semi-supervised node classification on graph-structured data has many applications, such as fraud detection, fake account and review detection, users' private-attribute inference in social networks, and community detection.
Various methods such as pairwise Markov Random Fields (pMRF) and graph neural networks were developed for semi-supervised node classification.
pMRF is more efficient than graph neural networks.
Existing pMRF-based methods are less accurate than graph neural networks, due to a key limitation that they assume a constant edge potential for all edges.
arXiv Detail & Related papers (2020-12-24T03:46:08Z) - GANs with Conditional Independence Graphs: On Subadditivity of
Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z) - Deep Gaussian Markov Random Fields [17.31058900857327]
We establish a formal connection between GMRFs and convolutional neural networks (CNNs).
Common GMRFs are special cases of a generative model where the inverse mapping from data to latent variables is given by a 1-layer linear CNN.
We describe how well-established tools, such as autodiff and variational inference, can be used for simple and efficient inference and learning of the deep GMRF.
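The stated connection, a common first-order GMRF whose inverse mapping from data to latent variables is a 1-layer linear CNN, can be made concrete with a small NumPy sketch that applies a Laplacian stencil as that linear layer. The stencil choice and boundary handling are illustrative assumptions.

```python
import numpy as np

def gmrf_inverse_map(x):
    """Inverse mapping of a first-order GMRF written as a 1-layer linear
    CNN: convolve the field with a Laplacian stencil so that, under the
    model, the resulting latent variables are i.i.d. Gaussian.

    x: (H, W) field. Returns z = g(x) of the same shape.
    """
    z = 4.0 * x                 # centre tap of the Laplacian stencil
    z[1:, :] -= x[:-1, :]       # subtract the neighbour above
    z[:-1, :] -= x[1:, :]       # ... below
    z[:, 1:] -= x[:, :-1]       # ... to the left
    z[:, :-1] -= x[:, 1:]       # ... to the right
    return z
```

Because a linear convolution is differentiable, stacking several such layers and fitting them with autodiff and variational inference is what turns this special case into the deep GMRF the blurb describes.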
arXiv Detail & Related papers (2020-02-18T10:06:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.