Revisiting Nonlocal Self-Similarity from Continuous Representation
- URL: http://arxiv.org/abs/2401.00708v1
- Date: Mon, 1 Jan 2024 09:25:03 GMT
- Title: Revisiting Nonlocal Self-Similarity from Continuous Representation
- Authors: Yisi Luo, Xile Zhao, Deyu Meng
- Abstract summary: Nonlocal self-similarity (NSS) is an important prior that has been successfully applied in multi-dimensional data processing tasks.
We propose a novel Continuous Representation-based NonLocal method (termed as CRNL) for both on-meshgrid and off-meshgrid data.
- Score: 62.06288797179193
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Nonlocal self-similarity (NSS) is an important prior that has been
successfully applied in multi-dimensional data processing tasks, e.g., image
and video recovery. However, existing NSS-based methods are only suitable for
meshgrid data such as images and videos, and do not extend to emerging
off-meshgrid data, e.g., point clouds and climate data. In this work, we revisit
the NSS from the continuous representation perspective and propose a novel
Continuous Representation-based NonLocal method (termed as CRNL), which has two
innovative features as compared with classical nonlocal methods. First, based
on the continuous representation, our CRNL unifies the measure of
self-similarity for on-meshgrid and off-meshgrid data and thus is naturally
suitable for both of them. Second, the nonlocal continuous groups can be more
compactly and efficiently represented by the coupled low-rank function
factorization, which simultaneously exploits the similarity within each group
and across different groups, while classical nonlocal methods neglect the
similarity across groups. This elaborately designed coupled mechanism allows
our method to enjoy favorable performance over conventional NSS methods in
terms of both effectiveness and efficiency. Extensive multi-dimensional data
processing experiments on on-meshgrid data (e.g., image inpainting and image
denoising) and off-meshgrid data (e.g., climate data prediction and point cloud
recovery) validate the versatility, effectiveness, and efficiency of our CRNL as compared
with state-of-the-art methods.
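The coupled low-rank factorization described in the abstract can be illustrated with a toy sketch. The following is a minimal NumPy illustration, not the authors' CRNL code: it works on discrete group matrices rather than continuous function factors, and the alternating-least-squares update scheme, ranks, and shapes are illustrative assumptions. Each nonlocal group G_k is approximated as U @ V_k, where the factor U is shared across all groups (exploiting similarity across groups) and V_k is group-specific (exploiting similarity within each group).

```python
# Toy sketch of coupled low-rank factorization: G_k ≈ U @ V_k with a
# SHARED factor U across groups and a group-specific factor V_k.
import numpy as np

def coupled_low_rank(groups, rank, n_iters=50):
    """Alternating least squares for G_k ≈ U @ V_k with shared U."""
    m = groups[0].shape[0]
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    Vs = []
    for _ in range(n_iters):
        # Update each group-specific factor V_k with U fixed.
        Vs = [np.linalg.lstsq(U, G, rcond=None)[0] for G in groups]
        # Update the shared factor U with all V_k fixed, by stacking
        # every group into one least-squares system.
        V_cat = np.hstack(Vs)      # (rank, total group width)
        G_cat = np.hstack(groups)  # (m, total group width)
        U = np.linalg.lstsq(V_cat.T, G_cat.T, rcond=None)[0].T
    return U, Vs

# Synthetic groups that genuinely share a common 4-dim column space.
rng = np.random.default_rng(1)
U_true = rng.standard_normal((32, 4))
groups = [U_true @ rng.standard_normal((4, 10)) for _ in range(3)]
U, Vs = coupled_low_rank(groups, rank=4)
errs = [np.linalg.norm(U @ V - G) / np.linalg.norm(G)
        for V, G in zip(Vs, groups)]
print(max(errs))  # relative error: near zero on this exact-rank data
```

Because U is estimated from all groups jointly, cross-group similarity is exploited; a classical nonlocal method would instead factorize each group independently.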
Related papers
- Efficient Privacy-Preserving KAN Inference Using Homomorphic Encryption [9.0993556073886]
Homomorphic encryption (HE) facilitates privacy-preserving inference for deep learning models.
Complex structure of KANs, incorporating nonlinear elements like the SiLU activation function and B-spline functions, renders existing privacy-preserving inference techniques inadequate.
We propose an accurate and efficient privacy-preserving inference scheme tailored for KANs.
arXiv Detail & Related papers (2024-09-12T04:51:27Z) - Contrastive Learning with Synthetic Positives [11.932323457691945]
Contrastive learning with the nearest neighbor has proved to be one of the most efficient self-supervised learning (SSL) techniques.
In this paper, we introduce a novel approach called Contrastive Learning with Synthetic Positives (NCLP)
NCLP utilizes synthetic images, generated by an unconditional diffusion model, as the additional positives to help the model learn from diverse positives.
arXiv Detail & Related papers (2024-08-30T01:47:43Z) - Interacting Particle Systems on Networks: joint inference of the network
and the interaction kernel [8.535430501710712]
We infer both the weight matrix of the network and the interaction kernel that determines the rules of the interactions between agents.
We use two algorithms, including a new algorithm named operator regression with alternating least squares.
Both algorithms are scalable, and we establish conditions guaranteeing identifiability and well-posedness.
arXiv Detail & Related papers (2024-02-13T12:29:38Z) - Heterogenous Memory Augmented Neural Networks [84.29338268789684]
We introduce a novel heterogeneous memory augmentation approach for neural networks.
By introducing learnable memory tokens with an attention mechanism, we can effectively boost performance without huge computational overhead.
We show our approach on various image and graph-based tasks under both in-distribution (ID) and out-of-distribution (OOD) conditions.
arXiv Detail & Related papers (2023-10-17T01:05:28Z) - Multilayer Multiset Neuronal Networks -- MMNNs [55.2480439325792]
The present work describes multilayer multiset neuronal networks incorporating two or more layers of coincidence similarity neurons.
The work also explores the utilization of counter-prototype points, which are assigned to the image regions to be avoided.
arXiv Detail & Related papers (2023-08-28T12:55:13Z) - On the effectiveness of partial variance reduction in federated learning
with heterogeneous data [27.527995694042506]
We show that the diversity of the final classification layers across clients impedes the performance of the FedAvg algorithm.
Motivated by this, we propose to correct the model by variance reduction only on the final layers.
We demonstrate that this significantly outperforms existing benchmarks at a similar or lower communication cost.
arXiv Detail & Related papers (2022-12-05T11:56:35Z) - Cluster-level pseudo-labelling for source-free cross-domain facial
expression recognition [94.56304526014875]
We propose the first Source-Free Unsupervised Domain Adaptation (SFUDA) method for Facial Expression Recognition (FER)
Our method exploits self-supervised pretraining to learn good feature representations from the target data.
We validate the effectiveness of our method in four adaptation setups, proving that it consistently outperforms existing SFUDA methods when applied to FER.
arXiv Detail & Related papers (2022-10-11T08:24:50Z) - Interpolation-based Correlation Reduction Network for Semi-Supervised
Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN)
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z) - Stochastic Cluster Embedding [14.485496311015398]
Neighbor Embedding (NE) aims to preserve pairwise similarities between data items.
NE methods such as Stochastic Neighbor Embedding (SNE) may leave large-scale patterns such as clusters hidden.
We propose a new cluster visualization method based on Neighbor Embedding.
arXiv Detail & Related papers (2021-08-18T07:07:28Z) - Attentive CutMix: An Enhanced Data Augmentation Approach for Deep
Learning Based Image Classification [58.20132466198622]
We propose Attentive CutMix, a naturally enhanced augmentation strategy based on CutMix.
In each training iteration, we choose the most descriptive regions based on the intermediate attention maps from a feature extractor.
Our proposed method is simple yet effective, easy to implement and can boost the baseline significantly.
arXiv Detail & Related papers (2020-03-29T15:01:05Z)
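The Attentive CutMix entry above (choose the most descriptive regions using attention maps, paste them into another image, and mix the labels in proportion to the pasted area) can be sketched as follows. This is a hedged NumPy toy on single-channel images, not the paper's implementation; the grid size, cell size, and number of pasted cells are illustrative assumptions.

```python
# Toy sketch of the Attentive CutMix idea: rank grid cells of a source
# image by an attention map and paste the top-k cells into a target.
import numpy as np

def attentive_cutmix(target, source, attn, k=4, cell=8):
    """target, source: (H, W) arrays; attn: (H//cell, W//cell) map."""
    mixed = target.copy()
    gh, gw = attn.shape
    # Flat indices of the k highest-attention grid cells in the source.
    top = np.argsort(attn, axis=None)[-k:]
    for idx in top:
        r, c = divmod(idx, gw)
        mixed[r*cell:(r+1)*cell, c*cell:(c+1)*cell] = \
            source[r*cell:(r+1)*cell, c*cell:(c+1)*cell]
    lam = k / (gh * gw)  # fraction of the target that was replaced
    return mixed, lam    # mixed label: (1-lam)*y_target + lam*y_source

H = W = 32
rng = np.random.default_rng(0)
tgt = np.zeros((H, W))
src = np.ones((H, W))
attn = rng.random((H // 8, W // 8))  # stand-in for a real attention map
mixed, lam = attentive_cutmix(tgt, src, attn, k=4, cell=8)
print(lam, mixed.sum())  # 0.25 256.0 (4 pasted cells of 8x8 ones)
```

In the paper's setting the attention map would come from an intermediate feature extractor rather than random values, but the paste-and-mix mechanics are the same.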
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.