Noise-robust classification with hypergraph neural network
- URL: http://arxiv.org/abs/2102.01934v1
- Date: Wed, 3 Feb 2021 08:34:53 GMT
- Title: Noise-robust classification with hypergraph neural network
- Authors: Nguyen Trinh Vu Dang, Loc Tran, Linh Tran
- Abstract summary: This paper presents a novel version of the hypergraph neural network method.
The accuracies of five graph- and hypergraph-based methods are evaluated and compared.
Experimental results show that the hypergraph neural network methods achieve the best performance when the noise level increases.
- Score: 4.003697389752555
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: This paper presents a novel version of the hypergraph neural network
method, which we use to solve the noisy label learning problem. First, we
apply PCA dimensionality reduction to the feature matrices of the image
datasets, both to remove noise and redundant features and to shorten the
runtime of constructing the hypergraph for the hypergraph neural network
method. Then, the classic graph-based semi-supervised learning method, the
classic hypergraph-based semi-supervised learning method, the graph neural
network, the hypergraph neural network, and our proposed hypergraph neural
network are employed to solve the noisy label learning problem. The accuracies
of these five methods are evaluated and compared. Experimental results show
that the hypergraph neural network methods achieve the best performance as the
noise level increases, and they perform at least as well as the graph neural
network.
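The pipeline in the abstract can be sketched in a few lines. This is a minimal reading of the setup, not the authors' code: the number of PCA components, the k-nearest-neighbor rule for forming hyperedges, and all parameter values are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

def pca_knn_hypergraph(X, n_components=64, k=10):
    """Reduce the feature matrix with PCA, then build a hypergraph
    incidence matrix H in which hyperedge j connects sample j to its
    k nearest neighbors in the reduced space (a common construction;
    the paper's exact parameters are not given in the abstract)."""
    X_red = PCA(n_components=n_components).fit_transform(X)
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X_red)
    _, idx = nbrs.kneighbors(X_red)        # idx[j] includes j itself
    n = X.shape[0]
    H = np.zeros((n, n))                   # rows: nodes, cols: hyperedges
    for j, neighborhood in enumerate(idx):
        H[neighborhood, j] = 1.0           # one hyperedge per sample
    return X_red, H
```

The incidence matrix H would then be normalized into the usual propagation operator of a hypergraph neural network before training on the noisy labels.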
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
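One concrete way to picture "computational graphs of parameters" is to turn every weight of a plain MLP into an edge between neuron nodes. The encoding below is purely illustrative; the paper's representation of diverse architectures is richer than this sketch.

```python
import numpy as np

def mlp_to_parameter_graph(weights):
    """Map an MLP, given as a list of weight matrices, to a graph:
    nodes are neurons, and entry W[i, j] of layer l becomes an edge
    (with the weight as its feature) from neuron j in layer l to
    neuron i in layer l + 1. Illustrative encoding only."""
    sizes = [weights[0].shape[1]] + [W.shape[0] for W in weights]
    offsets = np.concatenate(([0], np.cumsum(sizes)[:-1]))
    edges, feats = [], []
    for l, W in enumerate(weights):
        for i in range(W.shape[0]):
            for j in range(W.shape[1]):
                edges.append((offsets[l] + j, offsets[l + 1] + i))
                feats.append(W[i, j])
    return sum(sizes), np.array(edges), np.array(feats)
```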
- HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network [0.16874375111244327]
We propose an alternative approach to hypergraph neural networks in which the hypergraph is represented as a non-reversible Markov chain.
We use this Markov chain to construct a complex Hermitian Laplacian matrix - the magnetic Laplacian - which serves as the input to our proposed hypergraph neural network.
arXiv Detail & Related papers (2024-02-15T03:05:45Z)
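For background, a standard magnetic Laplacian of a directed graph is constructed as below. How HyperMagNet derives its non-reversible Markov chain from the hypergraph, and which normalization it applies, is not stated in the summary, so treat the charge parameter q and the construction as assumptions.

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Complex Hermitian 'magnetic' Laplacian of a directed adjacency
    matrix A: symmetrize the edge weights, then record edge direction
    as a complex phase exp(1j * 2*pi*q * (A_uv - A_vu))."""
    W = (A + A.T) / 2.0                    # symmetrized weights
    theta = 2.0 * np.pi * q * (A - A.T)    # antisymmetric phase matrix
    H = W * np.exp(1j * theta)             # Hermitian by construction
    D = np.diag(W.sum(axis=1))
    return D - H                           # complex Hermitian Laplacian
```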
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that replaces classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
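The layer described above can be pictured with two ingredients: a histogram of the node features in each local neighborhood, and the histogram-intersection similarity used to compare them. This sketch handles a scalar feature and fixed bins; the paper's actual layer is more general.

```python
import numpy as np

def neighborhood_histograms(x, adj, bins=8, lo=0.0, hi=1.0):
    """Normalized histogram of a scalar node feature x over each
    node's closed neighborhood (the node plus its neighbors)."""
    H = np.zeros((len(adj), bins))
    for v, nbrs in enumerate(adj):
        counts, _ = np.histogram(x[list(nbrs) + [v]], bins=bins, range=(lo, hi))
        H[v] = counts / counts.sum()
    return H

def histogram_intersection(h1, h2):
    """Classic histogram-intersection similarity: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()
```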
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
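A textbook member of this family is the quadratic, hypergraph-regularized energy whose unique minimizer has a closed form and can be read directly as a node embedding. The expressive, parameterized energies in the paper generalize this; the version below uses unit hyperedge weights as a simplifying assumption.

```python
import numpy as np

def energy_minimizer_embedding(Z, H, lam=1.0):
    """Minimize E(Y) = ||Y - Z||_F^2 + lam * tr(Y^T L Y), where L is a
    Zhou-style normalized hypergraph Laplacian built from incidence
    matrix H. The minimizer Y* = (I + lam * L)^{-1} Z serves as the
    node embedding, illustrating 'minimizers as embeddings'."""
    n = H.shape[0]
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1)))   # node degree^(-1/2)
    De = np.diag(1.0 / H.sum(axis=0))            # hyperedge degree^(-1)
    L = np.eye(n) - Dv @ H @ De @ H.T @ Dv
    return np.linalg.solve(np.eye(n) + lam * L, Z)
```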
- Preventing Over-Smoothing for Hypergraph Neural Networks [0.0]
We show that the performance of hypergraph neural networks does not improve as the number of layers increases.
We develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain node representation in deep layers.
arXiv Detail & Related papers (2022-03-31T16:33:31Z)
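The summary does not spell out how Deep-HGCN keeps deep-layer node representations from collapsing. One widely used anti-over-smoothing mechanism, shown here only as an illustration and not as the paper's method, is an initial-residual connection that mixes each layer's propagated features with the layer-0 embedding.

```python
import numpy as np

def residual_hyperconv_layer(H_prev, H0, P, W, alpha=0.1):
    """One convolution layer with an initial residual (GCNII-style):
    propagate with operator P, then blend in the initial embedding H0
    so that stacking many layers cannot wash out node identity."""
    mixed = (1.0 - alpha) * (P @ H_prev) + alpha * H0
    return np.maximum(mixed @ W, 0.0)      # ReLU activation
```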
- Directed hypergraph neural network [0.0]
We present a novel neural network method for directed hypergraphs.
The experiments use the Cora and Citeseer datasets.
arXiv Detail & Related papers (2020-08-09T01:39:52Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed when many such layers are stacked to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
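The adaptive use of large receptive fields can be sketched as follows: propagate the features over many hops with a fixed operator, then combine the hops with learned gates. The per-hop scalar gates below are a simplification; DAGNN computes its retention scores from the representations themselves.

```python
import numpy as np

def adaptive_multi_hop(X, P, s, K=10):
    """Stack K propagation depths of X under operator P and combine
    them with gates sigmoid(s), one gate per hop in this sketch."""
    hops, H = [X], X
    for _ in range(K):
        H = P @ H                          # one more hop of propagation
        hops.append(H)
    gates = 1.0 / (1.0 + np.exp(-np.asarray(s)))   # length K + 1
    return sum(g * h for g, h in zip(gates, hops))
```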
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- Learning Local Complex Features using Randomized Neural Networks for Texture Analysis [0.1474723404975345]
We present a new approach that combines a learning technique and the Complex Network (CN) theory for texture analysis.
This method takes advantage of the representation capacity of CN to model a texture image as a directed network.
The randomized neural network has a single hidden layer and uses a fast learning algorithm that learns local CN patterns for texture characterization.
arXiv Detail & Related papers (2020-07-10T23:18:01Z)
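A single hidden layer plus a fast learning algorithm is the signature of randomized neural networks: the hidden weights stay random and only the output weights are solved in closed form. The ELM-style sketch below is generic; the paper's CN-derived inputs and exact solver are not reproduced here.

```python
import numpy as np

def train_randomized_nn(X, T, hidden=100, ridge=1e-3, seed=0):
    """Randomized single-hidden-layer network: fixed random hidden
    weights W, output weights beta fitted by ridge regression, which
    is what makes training fast (one linear solve, no backprop)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], hidden))
    Hid = np.tanh(X @ W)                   # random hidden features
    beta = np.linalg.solve(Hid.T @ Hid + ridge * np.eye(hidden), Hid.T @ T)
    return W, beta
```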
- Hcore-Init: Neural Network Initialization based on Graph Degeneracy [22.923756039561194]
We propose an adapted version of the k-core structure for the complete weighted multipartite graph extracted from a deep learning architecture.
As a multipartite graph is a combination of bipartite graphs, which are in turn the incidence graphs of hypergraphs, we design a k-hypercore decomposition.
arXiv Detail & Related papers (2020-04-16T12:57:14Z)
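For reference, the standard k-core decomposition that Hcore-Init adapts is available directly in networkx; the weighted multipartite and k-hypercore variants described above are the paper's own extensions.

```python
import networkx as nx

# The k-core is the maximal subgraph in which every node has degree >= k;
# core_number gives, per node, the largest k for which it survives.
G = nx.karate_club_graph()                 # stand-in example graph
core = nx.core_number(G)                   # node -> core number
k3 = nx.k_core(G, k=3)                     # the 3-core subgraph
```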
- Molecule Property Prediction and Classification with Graph Hypernetworks [113.38181979662288]
We show that replacing the underlying networks with hypernetworks leads to a boost in performance.
A major difficulty in the application of hypernetworks is their lack of stability.
A recent work has tackled the training instability of hypernetworks in the context of error correcting codes.
arXiv Detail & Related papers (2020-02-01T16:44:34Z)
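A hypernetwork in the sense used above is a network whose output is the weight tensor of another, primary network. The bare-bones sketch below generates the weights of one linear layer from a conditioning vector; the paper's graph-specific primary networks and stabilization techniques are not shown.

```python
import numpy as np

def make_primary_layer(z, Wh, bh, in_dim, out_dim):
    """Hypernetwork step: map conditioning vector z to the flattened
    weights of a primary linear layer, then return that layer as a
    function. The primary layer has no trained weights of its own."""
    flat = np.tanh(z @ Wh + bh)            # hypernetwork forward pass
    W_primary = flat.reshape(in_dim, out_dim)
    return lambda x: x @ W_primary

rng = np.random.default_rng(0)
z = rng.standard_normal(16)                # e.g., a molecule embedding
Wh = rng.standard_normal((16, 8 * 4))
bh = np.zeros(8 * 4)
layer = make_primary_layer(z, Wh, bh, in_dim=8, out_dim=4)
y = layer(rng.standard_normal(8))          # output of generated layer
```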