SymNMF-Net for The Symmetric NMF Problem
- URL: http://arxiv.org/abs/2205.13214v1
- Date: Thu, 26 May 2022 08:17:39 GMT
- Title: SymNMF-Net for The Symmetric NMF Problem
- Authors: Mingjie Li, Hao Kong, Zhouchen Lin
- Abstract summary: We propose a neural network called SymNMF-Net for the Symmetric NMF problem.
We show that the inference of each block corresponds to a single iteration of the optimization.
Empirical results on real-world datasets demonstrate the superiority of our SymNMF-Net.
- Score: 62.44067422984995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, many works have demonstrated that Symmetric Non-negative Matrix
Factorization~(SymNMF) enjoys a great superiority for various clustering tasks.
Although the state-of-the-art algorithms for SymNMF perform well on synthetic
data, they cannot consistently obtain satisfactory results with desirable
properties and may fail on real-world tasks like clustering. Considering the
flexibility and strong representation ability of the neural network, in this
paper, we propose a neural network called SymNMF-Net for the Symmetric NMF
problem to overcome the shortcomings of traditional optimization algorithms.
Each block of SymNMF-Net is a differentiable architecture with an inversion
layer, a linear layer and ReLU, which are inspired by a traditional update
scheme for SymNMF. We show that the inference of each block corresponds to a
single iteration of the optimization. Furthermore, we analyze the constraints
of the inversion layer to ensure the output stability of the network to a
certain extent. Empirical results on real-world datasets demonstrate the
superiority of our SymNMF-Net and confirm the sufficiency of our theoretical
analysis.
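The abstract names the ingredients of each block (an inversion layer, a linear layer, and ReLU) but does not reproduce the exact update they unroll. As a rough, hedged illustration only, the sketch below unrolls a classical penalized alternating-style update, H <- ReLU((A H + lambda H)(H^T H + lambda I)^{-1}), in which the matrix inverse plays the role of the inversion layer, the matrix products the linear layer, and the nonnegativity projection the ReLU; the penalty weight and block count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def symnmf_net_like_forward(A, H0, num_blocks=10, lam=1.0):
    """Unrolled iterations loosely mimicking a SymNMF-Net-style forward pass.

    A  : (n, n) symmetric nonnegative similarity matrix.
    H0 : (n, r) nonnegative initialization of the factor.
    Each "block" applies: inversion layer -> linear layer -> ReLU.
    Illustrative sketch only, not the architecture from the paper.
    """
    H = H0
    r = H.shape[1]
    for _ in range(num_blocks):
        gram_inv = np.linalg.inv(H.T @ H + lam * np.eye(r))  # "inversion layer"
        H = (A @ H + lam * H) @ gram_inv                     # "linear layer"
        H = np.maximum(H, 0.0)                               # ReLU keeps H >= 0
    return H

# Toy usage: factor a small symmetric nonnegative similarity matrix.
rng = np.random.default_rng(0)
X = rng.random((30, 5))
A = X @ X.T                                  # symmetric, nonnegative
H = symnmf_net_like_forward(A, rng.random((30, 3)))
labels = H.argmax(axis=1)                    # cluster assignment per sample
```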
Related papers
- Multilevel CNNs for Parametric PDEs based on Adaptive Finite Elements [0.0]
A neural network architecture is presented that exploits the multilevel properties of high-dimensional parameter-dependent partial differential equations.
The network is trained with data on adaptively refined finite element meshes.
A complete convergence and complexity analysis is carried out for the adaptive multilevel scheme.
arXiv Detail & Related papers (2024-08-20T13:32:11Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - An NMF-Based Building Block for Interpretable Neural Networks With
Continual Learning [0.8158530638728501]
Existing learning methods often struggle to balance interpretability and predictive performance.
Our approach aims to strike a better balance between these two aspects through the use of a building block based on NMF.
arXiv Detail & Related papers (2023-11-20T02:00:33Z) - A Provable Splitting Approach for Symmetric Nonnegative Matrix
Factorization [27.766572707447352]
We show that designing fast algorithms for the symmetric NMF is not as easy as for its nonsymmetric counterpart.
We first split the decision variable and transform the symmetric NMF to a penalized nonsymmetric one.
We then show that solving the penalized nonsymmetric reformulation returns a solution to the original symmetric NMF.
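The summary does not spell out the penalized reformulation; one common way to write such a splitting (stated here as an illustrative assumption, not a quotation from the paper) duplicates the factor and penalizes the mismatch:

```latex
\min_{W \ge 0,\ H \ge 0} \; \|A - W H^{\top}\|_F^2 + \lambda \|W - H\|_F^2
```

At any solution with W = H the penalty vanishes and the objective reduces to the symmetric one, \|A - H H^{\top}\|_F^2, which is consistent with the claim above that solving the penalized nonsymmetric reformulation returns a solution to the original symmetric NMF.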
arXiv Detail & Related papers (2023-01-25T10:21:59Z) - Comparative Analysis of Interval Reachability for Robust Implicit and
Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
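For context on the phrase "implicit equations as layers": an implicit layer defines its output as the solution of an equation rather than by an explicit feed-forward computation. The snippet below is a generic fixed-point example of that idea, with illustrative names and sizes; it is not the specific INN construction or the interval-reachability analysis from the paper.

```python
import numpy as np

def implicit_layer(x, W, U, b, num_iters=50):
    """Solve z = tanh(W z + U x + b) by fixed-point iteration.

    The layer output z is defined implicitly by the equation itself;
    convergence is assumed here (e.g., when ||W|| < 1, since tanh is
    1-Lipschitz). Generic sketch, not the paper's model.
    """
    z = np.zeros(W.shape[0])
    for _ in range(num_iters):
        z = np.tanh(W @ z + U @ x + b)
    return z

rng = np.random.default_rng(0)
W = 0.3 * rng.standard_normal((8, 8)) / np.sqrt(8)  # scaled to keep the map contractive
U = rng.standard_normal((8, 4))
b = np.zeros(8)
z = implicit_layer(rng.standard_normal(4), W, U, b)
```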
arXiv Detail & Related papers (2022-04-01T03:31:27Z) - Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S^3NMF).
We take advantage of the sensitivity-to-initialization characteristic of SNMF, without relying on any additional information.
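The summary is terse; one hedged reading of "ensemble clustering" combined with sensitivity to initialization is to run a generic SNMF solver from several random starting points and aggregate the resulting cluster co-memberships. The sketch below does exactly that, using one commonly used multiplicative update for SNMF; the solver, the aggregation, and all parameters are illustrative assumptions rather than the actual S^3NMF algorithm.

```python
import numpy as np

def snmf_multiplicative(A, r, iters=200, beta=0.5, seed=0):
    """A generic multiplicative update for min ||A - H H^T||_F^2 with H >= 0."""
    rng = np.random.default_rng(seed)
    H = rng.random((A.shape[0], r))
    for _ in range(iters):
        H *= (1 - beta) + beta * (A @ H) / np.maximum(H @ (H.T @ H), 1e-12)
    return H

def coassociation_from_restarts(A, r, n_restarts=10):
    """Average cluster co-membership over restarts (ensemble-style aggregation)."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for s in range(n_restarts):
        labels = snmf_multiplicative(A, r, seed=s).argmax(axis=1)
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / n_restarts

# Toy usage: a nonnegative symmetric similarity matrix with 3 latent clusters.
rng = np.random.default_rng(0)
X = rng.random((40, 6))
A = X @ X.T
C = coassociation_from_restarts(A, r=3)
```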
arXiv Detail & Related papers (2021-03-02T12:47:40Z) - Regularizing Recurrent Neural Networks via Sequence Mixup [7.036759195546171]
We extend a class of celebrated regularization techniques originally proposed for feed-forward neural networks.
Our proposed methods are easy to implement and computationally light, while improving the performance of simple neural architectures.
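The "celebrated regularization techniques" being extended are in the mixup family (as the title indicates). The snippet below shows plain mixup applied to a padded batch of sequences, interpolating inputs and one-hot labels with a Beta-distributed weight; it is a generic illustration, not the specific Sequence Mixup variants proposed in the paper.

```python
import numpy as np

def sequence_mixup(x, y, alpha=0.2, rng=None):
    """Vanilla mixup on a padded sequence batch (generic sketch).

    x : (batch, time, features) padded input sequences.
    y : (batch, num_classes) one-hot labels.
    Each example is paired with a random partner, and both inputs and
    labels are interpolated with a Beta(alpha, alpha) weight.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(x.shape[0])
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y + (1.0 - lam) * y[perm]
    return x_mixed, y_mixed

# Toy usage: 16 sequences of length 20 with 8 features and 4 classes.
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 20, 8))
y = np.eye(4)[rng.integers(0, 4, size=16)]
x_mixed, y_mixed = sequence_mixup(x, y, rng=rng)
```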
arXiv Detail & Related papers (2020-11-27T05:43:40Z) - Kernel-Based Smoothness Analysis of Residual Networks [85.20737467304994]
Residual networks (ResNets) stand out among powerful modern architectures.
In this paper, we show another distinction between the two models, namely, a tendency of ResNets to promote smoother interpolations than their non-residual counterparts.
arXiv Detail & Related papers (2020-09-21T16:32:04Z)