Reciprocal Adversarial Learning via Characteristic Functions
- URL: http://arxiv.org/abs/2006.08413v2
- Date: Fri, 23 Oct 2020 22:00:39 GMT
- Title: Reciprocal Adversarial Learning via Characteristic Functions
- Authors: Shengxi Li, Zeyang Yu, Min Xiang, Danilo Mandic
- Abstract summary: Generative adversarial nets (GANs) have become a preferred tool for tasks involving complicated distributions.
We show how to use the characteristic function (CF) to compare the distributions rather than their moments.
We then prove an equivalence between the embedded and data domains when a reciprocal exists, where we naturally develop the GAN in an auto-encoder structure.
This efficient structure uses only two modules, together with a simple training strategy, to generate clear images bi-directionally.
- Score: 12.961770002117142
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative adversarial nets (GANs) have become a preferred tool for tasks
involving complicated distributions. To stabilise the training and reduce the
mode collapse of GANs, one of their main variants employs the integral
probability metric (IPM) as the loss function. This provides a broad class of
IPM-GANs with theoretical support, essentially by comparing moments in an
embedded domain of the \textit{critic}. We generalise this by comparing the
distributions rather than their moments via a powerful tool, the characteristic
function (CF), which uniquely and universally comprises all the information
about a distribution. For rigour, we first establish the physical meaning of
the phase and amplitude in the CF, and show that this provides a feasible way
of balancing the accuracy and diversity of generation. We then develop an
efficient sampling strategy to calculate the CFs. Within this framework, we
further prove an equivalence between the embedded and data domains when a
reciprocal exists, whereby we naturally develop the GAN in an auto-encoder
structure, comparing everything in the embedded space (a semantically
meaningful manifold). This efficient structure uses only two modules, together
with a simple training strategy, to generate clear images bi-directionally; it
is referred to as the reciprocal CF GAN (RCF-GAN). Experimental results
demonstrate the superior performance of the proposed RCF-GAN in terms of both
generation and reconstruction.
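The CF-based comparison described in the abstract can be illustrated with a minimal sketch (this is not the paper's actual loss or sampling strategy): the empirical characteristic function of each sample set is evaluated at a handful of Gaussian-sampled frequencies, and the two distributions are compared through the squared modulus of the CF difference. The frequency distribution, its scale, and the discrepancy itself are illustrative assumptions.

```python
import numpy as np

def empirical_cf(x, t):
    # x: (n, d) samples; t: (m, d) frequency points.
    # phi(t) = E[exp(i <t, x>)], estimated by the sample mean.
    return np.exp(1j * x @ t.T).mean(axis=0)  # shape (m,)

def cf_distance(x, y, num_freqs=64, scale=1.0, seed=0):
    # Compare two sample sets through their empirical CFs at
    # Gaussian-sampled frequencies (a CF-based discrepancy sketch).
    rng = np.random.default_rng(seed)
    t = rng.normal(0.0, scale, size=(num_freqs, x.shape[1]))
    diff = empirical_cf(x, t) - empirical_cf(y, t)
    return float(np.mean(np.abs(diff) ** 2))

rng = np.random.default_rng(1)
same = cf_distance(rng.normal(size=(2000, 2)), rng.normal(size=(2000, 2)))
shifted = cf_distance(rng.normal(size=(2000, 2)),
                      rng.normal(3.0, 1.0, size=(2000, 2)))
print(same < shifted)  # a shifted distribution gives a larger CF discrepancy
```

Because the CF always exists and determines the distribution uniquely, matching empirical CFs compares whole distributions rather than a fixed set of moments, which is the generalisation the paper builds on.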
Related papers
- CF-GO-Net: A Universal Distribution Learner via Characteristic Function Networks with Graph Optimizers [8.816637789605174]
We introduce an approach which employs the characteristic function (CF), a probabilistic descriptor that directly corresponds to the distribution.
Unlike the probability density function (pdf), the characteristic function not only always exists, but also provides an additional degree of freedom.
Our method allows the use of a pre-trained model, such as a well-trained autoencoder, and is capable of learning directly in its feature space.
arXiv Detail & Related papers (2024-09-19T09:33:12Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- SMaRt: Improving GANs with Score Matching Regularity [94.81046452865583]
Generative adversarial networks (GANs) usually struggle in learning from highly diverse data, whose underlying manifold is complex.
We show that score matching serves as a promising solution to this issue thanks to its capability of persistently pushing the generated data points towards the real data manifold.
We propose to improve the optimization of GANs with score matching regularity (SMaRt).
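Score matching pushes samples towards the data manifold by learning the gradient of the log-density. As a generic illustration (not SMaRt's actual regulariser), the denoising score matching objective fits a score model to noised data; here a one-parameter linear score is fitted to noised standard-normal samples by grid search. The data, noise level, and model family are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5
x = rng.normal(size=100_000)      # "real data": standard normal
eps = rng.normal(size=x.shape)
y = x + sigma * eps               # noised samples

def dsm_loss(a):
    # Denoising score matching: regress the score model s(y) = a * y
    # onto the target -eps / sigma.
    return np.mean((a * y + eps / sigma) ** 2)

grid = np.linspace(-2.0, 0.0, 401)
best = grid[np.argmin([dsm_loss(a) for a in grid])]
# The minimiser should sit near the true score slope of the noised
# distribution N(0, 1 + sigma^2): a* = -1 / (1 + sigma^2) = -0.8.
print(best)
```

A GAN regulariser built on this idea evaluates such a learned score at generated samples and uses it as an extra gradient signal towards the real-data manifold.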
arXiv Detail & Related papers (2023-11-30T03:05:14Z)
- Generative Adversarial Networks to infer velocity components in rotating turbulent flows [2.0873604996221946]
We show that CNN and GAN always outperform EPOD both concerning point-wise and statistical reconstructions.
The analysis is performed using standard validation tools based on the $L$ spatial distance between the prediction and the ground truth.
arXiv Detail & Related papers (2023-01-18T13:59:01Z)
- Transformer-based Context Condensation for Boosting Feature Pyramids in Object Detection [77.50110439560152]
Current object detectors typically have a feature pyramid (FP) module for multi-level feature fusion (MFF).
We propose a novel and efficient context modeling mechanism that can help existing FPs deliver better MFF results.
In particular, we introduce a novel insight that comprehensive contexts can be decomposed and condensed into two types of representations for higher efficiency.
arXiv Detail & Related papers (2022-07-14T01:45:03Z)
- Contrastive Conditional Neural Processes [45.70735205041254]
Conditional Neural Processes (CNPs) bridge neural networks with probabilistic inference to approximate functions of stochastic processes under meta-learning settings.
Two auxiliary contrastive branches are set up hierarchically, namely in-instantiation temporal contrastive learning (TCL) and cross-instantiation function contrastive learning (FCL).
We empirically show that TCL captures high-level abstraction of observations, whereas FCL helps identify underlying functions, which in turn provides more efficient representations.
arXiv Detail & Related papers (2022-03-08T10:08:45Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
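A naive Monte Carlo baseline makes the task concrete (this is the simple estimator such work improves upon, not the paper's flow-based method): sample from the model and count the fraction of samples landing in the closed region. The stand-in "flow" here is a 2-D standard normal sampler, an assumption for illustration.

```python
import numpy as np

def region_probability(sample_fn, in_region, n=100_000, seed=0):
    # Naive Monte Carlo estimate of the probability mass in a closed
    # region: draw n samples from the model and count the hits.
    rng = np.random.default_rng(seed)
    x = sample_fn(rng, n)
    return float(np.mean(in_region(x)))

# Stand-in "flow": a 2-D standard normal; estimate the mass of the
# closed unit square [0, 1]^2.
p = region_probability(
    lambda rng, n: rng.normal(size=(n, 2)),
    lambda x: np.all((x >= 0.0) & (x <= 1.0), axis=1),
)
print(p)  # exact value is (Phi(1) - Phi(0))^2, approximately 0.1165
```

The appeal of exploiting the flow's diffeomorphic structure instead is precisely to avoid the sample-hungry counting above, which is what the quoted sample-efficiency gains refer to.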
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - CSformer: Bridging Convolution and Transformer for Compressive Sensing [65.22377493627687]
This paper proposes a hybrid framework that integrates the advantages of leveraging detailed spatial information from CNN and the global context provided by transformer for enhanced representation learning.
The proposed approach is an end-to-end compressive image sensing method, composed of adaptive sampling and recovery.
The experimental results demonstrate the effectiveness of the dedicated transformer-based architecture for compressive sensing.
arXiv Detail & Related papers (2021-12-31T04:37:11Z) - Self-Ensembling GAN for Cross-Domain Semantic Segmentation [107.27377745720243]
This paper proposes a self-ensembling generative adversarial network (SE-GAN) exploiting cross-domain data for semantic segmentation.
In SE-GAN, a teacher network and a student network constitute a self-ensembling model for generating semantic segmentation maps, which, together with a discriminator, forms a GAN.
Despite its simplicity, we find SE-GAN can significantly boost the performance of adversarial training and enhance the stability of the model.
arXiv Detail & Related papers (2021-12-15T09:50:25Z) - Local Similarity Pattern and Cost Self-Reassembling for Deep Stereo
Matching Networks [3.7384509727711923]
We introduce a pairwise feature for deep stereo matching networks, named LSP (Local Similarity Pattern).
Through explicitly revealing the neighbor relationships, LSP contains rich structural information, which can be leveraged to aid more discriminative feature description.
We also design a dynamic self-reassembling refinement strategy and apply it to the cost distribution and the disparity map, respectively.
arXiv Detail & Related papers (2021-12-02T06:52:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.