DeepPhaseCut: Deep Relaxation in Phase for Unsupervised Fourier Phase Retrieval
- URL: http://arxiv.org/abs/2011.10475v1
- Date: Fri, 20 Nov 2020 16:10:08 GMT
- Title: DeepPhaseCut: Deep Relaxation in Phase for Unsupervised Fourier Phase Retrieval
- Authors: Eunju Cha, Chanseok Lee, Mooseok Jang, and Jong Chul Ye
- Abstract summary: We propose a novel, unsupervised, feed-forward neural network for Fourier phase retrieval.
Unlike existing deep learning approaches that use a neural network as a regularization term or as an end-to-end black-box model for supervised training, our algorithm is a feed-forward neural network implementation of the PhaseCut algorithm in an unsupervised learning framework.
Our network is composed of two generators: one for phase estimation using the PhaseCut loss, followed by another for image reconstruction, both of which are trained simultaneously in a cycleGAN framework without matched data.
- Score: 31.380061715549584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fourier phase retrieval is a classical problem of restoring a signal only
from the measured magnitude of its Fourier transform. Although Fienup-type
algorithms, which use prior knowledge in both spatial and Fourier domains, have
been widely used in practice, they can often stall in local minima. Modern
methods such as PhaseLift and PhaseCut may offer performance guarantees with
the help of convex relaxation. However, these algorithms are usually
computationally intensive for practical use. To address this problem, we
propose a novel, unsupervised, feed-forward neural network for Fourier phase
retrieval which enables immediate high-quality reconstruction. Unlike existing
deep learning approaches that use a neural network as a regularization term or
as an end-to-end black-box model for supervised training, our algorithm is a
feed-forward neural network implementation of the PhaseCut algorithm in an
unsupervised learning framework. Specifically, our network is composed of two
generators: one for phase estimation using the PhaseCut loss, followed by
another generator for image reconstruction, both of which are trained
simultaneously using a cycleGAN framework without matched data. The link to the
classical Fienup-type algorithms and the recent symmetry-breaking learning
approach is also revealed. Extensive experiments demonstrate that the proposed
method outperforms all existing approaches in Fourier phase retrieval problems.
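The measurement model and the PhaseCut-type objective referred to above can be made concrete with a short sketch. The NumPy snippet below is a minimal illustration under simplifying assumptions (1-D signal, explicit oversampled DFT matrix, a plain error-reduction baseline); it is not the authors' implementation, and the names `error_reduction`, `phasecut_loss`, `A`, `b`, and `u` are illustrative. In the paper, the unit-modulus phase vector is produced by a trained generator rather than optimized directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward model: the signal is observed only through the magnitudes of an
# oversampled discrete Fourier transform.
n, m = 32, 64                              # signal length and number of measurements
x_true = rng.random(n)                     # unknown non-negative signal
A = np.fft.fft(np.eye(n), n=m, axis=0)     # m x n oversampled DFT matrix
b = np.abs(A @ x_true)                     # measured Fourier magnitudes (phase lost)

# Classical Fienup-type baseline (error reduction): alternate between the
# magnitude constraint in Fourier space and the non-negativity prior in signal
# space.  Simple, but it can stall in local minima.
def error_reduction(b, A, n_iter=500):
    A_pinv = np.linalg.pinv(A)
    x = rng.random(A.shape[1])
    for _ in range(n_iter):
        y = A @ x
        y = b * np.exp(1j * np.angle(y))   # keep current phases, impose magnitudes
        x = np.real(A_pinv @ y)            # back-project to signal space
        x = np.maximum(x, 0.0)             # impose non-negativity prior
    return x

# PhaseCut-type objective: with u a unit-modulus phase vector, the quadratic
# form u^H M u, M = diag(b) (I - A A^+) diag(b), measures how inconsistent the
# guessed phases are with the measured magnitudes.  DeepPhaseCut trains its
# phase-estimating generator with a loss of this flavor instead of solving the
# PhaseCut semidefinite relaxation directly.
def phasecut_loss(u, b, A):
    A_pinv = np.linalg.pinv(A)
    residual = (np.eye(len(b)) - A @ A_pinv) @ (b * u)
    return np.real(np.vdot(residual, residual))

u_true = np.exp(1j * np.angle(A @ x_true))
u_rand = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, m))
print(phasecut_loss(u_true, b, A))         # ~0: true phases are consistent
print(phasecut_loss(u_rand, b, A))         # > 0: random phases are not
x_er = error_reduction(b, A)
print(np.linalg.norm(np.abs(A @ x_er) - b))  # magnitude fit of the baseline
```

The key property is that the PhaseCut-type objective vanishes exactly when diag(b)·u lies in the range of A, i.e., when the estimated phases are consistent with some signal; this is what allows the phase generator to be trained without ground-truth images.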
Related papers
- Robust Fourier Neural Networks [1.0589208420411014]
We show that introducing a simple diagonal layer after the Fourier embedding layer makes the network more robust to measurement noise.
Under certain conditions, our proposed approach can also learn functions that are noisy mixtures of nonlinear functions of Fourier features.
arXiv Detail & Related papers (2024-09-03T16:56:41Z) - Coordinate-based Neural Network for Fourier Phase Retrieval [8.827173113748703]
Single impliCit neurAl Network (SCAN) is a tool built upon coordinate neural networks meticulously designed for enhanced phase retrieval performance.
SCAN adeptly connects object coordinates to their amplitude and phase within a unified network in an unsupervised manner.
arXiv Detail & Related papers (2023-11-25T04:23:23Z) - Untrained neural network embedded Fourier phase retrieval from few
measurements [8.914156789222266]
This paper proposes an untrained neural network embedded algorithm to solve FPR with few measurements.
We use a generative network to represent the image to be recovered, which confines the image to the space defined by the network structure.
To reduce the computational cost mainly caused by the parameter updates of the untrained NN, we develop an accelerated algorithm that adaptively trades off between explicit and implicit regularization.
arXiv Detail & Related papers (2023-07-16T16:23:50Z) - Globally Optimal Training of Neural Networks with Threshold Activation
Functions [63.03759813952481]
We study weight decay regularized training problems of deep neural networks with threshold activations.
We derive a simplified convex optimization formulation when the dataset can be shattered at a certain layer of the network.
arXiv Detail & Related papers (2023-03-06T18:59:13Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1)
arXiv Detail & Related papers (2022-11-26T01:56:05Z) - SiPRNet: End-to-End Learning for Single-Shot Phase Retrieval [8.820823270160695]
convolutional neural networks (CNN) have played important roles in various image reconstruction tasks.
In this paper, we design a novel CNN structure, named SiPRNet, to recover a signal from a single Fourier intensity measurement.
The proposed approach consistently outperforms other CNN-based and traditional optimization-based methods in single-shot maskless phase retrieval.
arXiv Detail & Related papers (2022-05-23T16:24:52Z) - Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network
Pruning [50.232218214751455]
optimal network pruning is a non-trivial task which mathematically is an NP-hard problem.
In this paper, we investigate the Magnitude-Based Pruning (MBP) scheme and analyze it from a novel perspective.
We also propose a novel two-stage pruning approach, where one stage is to obtain the topological structure of the pruned network and the other stage is to retrain the pruned network to recover the capacity.
arXiv Detail & Related papers (2022-01-30T03:42:36Z) - Functional Regularization for Reinforcement Learning via Learned Fourier
Features [98.90474131452588]
We propose a simple architecture for deep reinforcement learning by embedding inputs into a learned Fourier basis.
We show that it improves the sample efficiency of both state-based and image-based RL.
arXiv Detail & Related papers (2021-12-06T18:59:52Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Learning Frequency Domain Approximation for Binary Neural Networks [68.79904499480025]
We propose to estimate the gradient of sign function in the Fourier frequency domain using the combination of sine functions for training BNNs.
The experiments on several benchmark datasets and neural architectures illustrate that the binary network learned using our method achieves the state-of-the-art accuracy.
arXiv Detail & Related papers (2021-03-01T08:25:26Z) - Solving Phase Retrieval with a Learned Reference [18.76940558836028]
Fourier phase retrieval is a classical problem that deals with the recovery of an image from the amplitude measurements of its Fourier coefficients.
In this paper, we assume that a known (learned) reference is added to the signal before capturing the amplitude measurements.
Our method is inspired by the principle of adding a reference signal in holography.
arXiv Detail & Related papers (2020-07-29T06:17:25Z)