Implementation of neural network operators with applications to remote sensing data
- URL: http://arxiv.org/abs/2412.00375v1
- Date: Sat, 30 Nov 2024 06:51:53 GMT
- Title: Implementation of neural network operators with applications to remote sensing data
- Authors: Danilo Costarelli, Michele Piconi
- Abstract summary: Two algorithms are presented based on the theory of multidimensional neural network (NN) operators activated by hyperbolic tangent sigmoidal functions.
We discuss several applications of the NN-based algorithms for modeling and rescaling/enhancement of remote sensing data.
- Score: 0.41436032949434404
- License:
- Abstract: In this paper, we provide two algorithms based on the theory of multidimensional neural network (NN) operators activated by hyperbolic tangent sigmoidal functions. Theoretical results are recalled to justify the performance of the algorithms implemented here. Specifically, the first algorithm models multidimensional signals (such as digital images), while the second one addresses the problem of rescaling and enhancement of the considered data. We discuss several applications of the NN-based algorithms for modeling and rescaling/enhancement of remote sensing data (represented as images), with numerical experiments conducted on a selection of remote sensing (RS) images from the (open access) RETINA dataset. A comparison with classical interpolation methods, such as bilinear and bicubic interpolation, shows that the proposed algorithms outperform the others, particularly in terms of the Structural Similarity Index (SSIM).
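As a rough illustration of the construction the abstract refers to, the sketch below implements a separable two-dimensional NN operator activated by the hyperbolic tangent sigmoidal function and uses it to rescale a grayscale image. It follows the general recipe from the NN-operator literature (a density function phi(x) = [sigma(x+1) - sigma(x-1)]/2 and a normalised sum of samples f(k/n) weighted by phi(nx - k)); the sampling grid on [0, 1], the function names, and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigma_tanh(x):
    # Hyperbolic tangent sigmoidal function (assumed normalisation:
    # sigma(x) = (1 + tanh(x)) / 2, as is common in the NN-operator literature).
    return 0.5 * (1.0 + np.tanh(x))

def phi(x):
    # Density function generated by the sigmoidal function:
    # phi(x) = [sigma(x + 1) - sigma(x - 1)] / 2 (strictly positive).
    return 0.5 * (sigma_tanh(x + 1.0) - sigma_tanh(x - 1.0))

def nn_operator_matrix(n_samples, n_out):
    """1-D evaluation matrix of the NN operator.

    Row i holds the normalised weights phi(n * x_i - k), k = 0..n_samples-1,
    where the x_i are n_out equally spaced points of [0, 1] and
    n = n_samples - 1 plays the role of the operator parameter.
    """
    n = n_samples - 1
    x = np.linspace(0.0, 1.0, n_out)            # evaluation points
    k = np.arange(n_samples)                    # sample nodes k / n
    w = phi(n * x[:, None] - k[None, :])        # phi(n x - k)
    return w / w.sum(axis=1, keepdims=True)     # normalisation as in F_n(f)

def nn_rescale(img, new_h, new_w):
    """Rescale a grayscale image with the separable 2-D NN operator."""
    img = np.asarray(img, dtype=float)
    B_rows = nn_operator_matrix(img.shape[0], new_h)
    B_cols = nn_operator_matrix(img.shape[1], new_w)
    return B_rows @ img @ B_cols.T

# Example: enhance a small image to twice its resolution.
small = np.random.rand(64, 64)
big = nn_rescale(small, 128, 128)
```

An output produced this way could then be compared against bilinear or bicubic resampling of the same image, for instance with the SSIM implementation in scikit-image (skimage.metrics.structural_similarity), which is the quality metric the abstract reports.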
Related papers
- Deep Convolutional Neural Networks Meet Variational Shape Compactness Priors for Image Segmentation [7.314877483509877]
Shape compactness is a key geometrical property to describe interesting regions in many image segmentation tasks.
We propose two novel algorithms to solve the introduced image segmentation problem that incorporates a shape-compactness prior.
The proposed algorithms significantly improve IoU by 20% when trained on a highly noisy image dataset.
arXiv Detail & Related papers (2024-05-23T11:05:35Z) - Discrete Neural Algorithmic Reasoning [18.497863598167257]
We propose to force neural reasoners to maintain the execution trajectory as a combination of finite predefined states.
Trained with supervision on the algorithm's state transitions, such models are able to align perfectly with the original algorithm.
arXiv Detail & Related papers (2024-02-18T16:03:04Z) - Multilayer Multiset Neuronal Networks -- MMNNs [55.2480439325792]
The present work describes multilayer multiset neuronal networks incorporating two or more layers of coincidence similarity neurons.
The work also explores the utilization of counter-prototype points, which are assigned to the image regions to be avoided.
arXiv Detail & Related papers (2023-08-28T12:55:13Z) - A PINN Approach to Symbolic Differential Operator Discovery with Sparse Data [0.0]
In this work we perform symbolic discovery of differential operators in a situation where there is sparse experimental data.
We modify the PINN approach by adding a neural network that learns a representation of unknown hidden terms in the differential equation.
The algorithm yields both a surrogate solution to the differential equation and a black-box representation of the hidden terms.
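As a hedged sketch of the idea summarised above (not the paper's own code), the snippet below couples a standard PINN surrogate u(x, t) with a second network that stands in for the unknown hidden term in an assumed reaction-diffusion-type equation u_t = D u_xx + h(u); the equation, the sparse data, and all hyperparameters are invented for illustration.

```python
import torch
import torch.nn as nn

def mlp(in_dim, out_dim, width=32, depth=3):
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.Tanh()]
        d = width
    return nn.Sequential(*layers, nn.Linear(d, out_dim))

u_net = mlp(2, 1)   # surrogate solution u(x, t)
h_net = mlp(1, 1)   # black-box model of the unknown hidden term h(u)
opt = torch.optim.Adam(list(u_net.parameters()) + list(h_net.parameters()), lr=1e-3)

# Hypothetical sparse observations and random collocation points on [0, 1]^2.
x_d, t_d = torch.rand(20, 1), torch.rand(20, 1)
u_d = torch.sin(torch.pi * x_d) * torch.exp(-t_d)   # fake measurements
x_c = torch.rand(200, 1, requires_grad=True)
t_c = torch.rand(200, 1, requires_grad=True)
D = 0.1                                             # assumed known diffusivity

for step in range(2000):
    opt.zero_grad()
    # Data misfit on the sparse measurements.
    loss_data = ((u_net(torch.cat([x_d, t_d], dim=1)) - u_d) ** 2).mean()
    # PDE residual u_t - D * u_xx - h(u) at the collocation points.
    u = u_net(torch.cat([x_c, t_c], dim=1))
    u_t = torch.autograd.grad(u, t_c, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x_c, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x_c, torch.ones_like(u_x), create_graph=True)[0]
    loss_pde = ((u_t - D * u_xx - h_net(u)) ** 2).mean()
    (loss_data + loss_pde).backward()
    opt.step()
```

After training, u_net plays the role of the surrogate solution and h_net that of the black-box representation of the hidden term, which could then be handed to a symbolic regression step as one possible route to an explicit operator.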
arXiv Detail & Related papers (2022-12-09T02:09:37Z) - Random Features for the Neural Tangent Kernel [57.132634274795066]
We propose an efficient feature map construction of the Neural Tangent Kernel (NTK) of a fully-connected ReLU network.
We show that the dimension of the resulting features is much smaller than that of other baseline feature map constructions while achieving comparable error bounds, both in theory and in practice.
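For context only, the snippet below shows the naive baseline feature map for the empirical NTK of a small fully-connected ReLU network: the flattened parameter gradient of the network output, whose inner products give the kernel values. Its dimension equals the number of parameters, which is exactly the cost that a lower-dimensional construction such as the paper's (not reproduced here) is designed to avoid.

```python
import torch
import torch.nn as nn

# Small fully-connected ReLU network whose (empirical) NTK we probe.
net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))

def grad_feature(x):
    """Naive NTK feature map: flattened gradient of the scalar output w.r.t.
    all parameters, so that <phi(x), phi(x')> is the empirical NTK K(x, x')."""
    out = net(x.unsqueeze(0)).squeeze()
    grads = torch.autograd.grad(out, net.parameters())
    return torch.cat([g.reshape(-1) for g in grads])

x1, x2 = torch.randn(10), torch.randn(10)
k12 = grad_feature(x1) @ grad_feature(x2)   # empirical NTK value
```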
arXiv Detail & Related papers (2021-04-03T09:08:12Z) - Deep Representational Similarity Learning for analyzing neural signatures in task-based fMRI dataset [81.02949933048332]
This paper develops Deep Representational Similarity Learning (DRSL), a deep extension of Representational Similarity Analysis (RSA).
DRSL is appropriate for analyzing similarities between various cognitive tasks in fMRI datasets with a large number of subjects.
arXiv Detail & Related papers (2020-09-28T18:30:14Z) - Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
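A generic sketch of such a min-max formulation is shown below: a structural network f and an adversarial test-function network u play a regularised conditional-moment game and are trained by alternating gradient descent/ascent. The specific objective and the synthetic data are illustrative assumptions, not the exact estimator analysed in the paper.

```python
import torch
import torch.nn as nn

# Primal player f (structural function) and adversary u (test function).
f = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
u = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt_f = torch.optim.SGD(f.parameters(), lr=1e-3)
opt_u = torch.optim.SGD(u.parameters(), lr=1e-3)

# Synthetic data for a conditional moment restriction E[y - f(x) | z] = 0.
z = torch.randn(512, 1)
x = z + 0.1 * torch.randn(512, 1)
y = 2.0 * x + 0.1 * torch.randn(512, 1)

def game_value():
    # min_f max_u  E[(y - f(x)) u(z)] - 0.5 E[u(z)^2]
    residual = y - f(x)
    return (residual * u(z)).mean() - 0.5 * (u(z) ** 2).mean()

for step in range(1000):
    # Ascent step for the adversary u.
    opt_u.zero_grad(); (-game_value()).backward(); opt_u.step()
    # Descent step for the primal player f.
    opt_f.zero_grad(); game_value().backward(); opt_f.step()
```

The inner maximisation is attained at u*(z) = E[y - f(x) | z], so driving the game value down pushes the conditional moment restriction toward zero.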
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks [50.42141893913188]
We study distributed stochastic algorithms for large-scale AUC maximization with a deep neural network as the predictive model.
In theory, our method requires far fewer communication rounds while keeping the total number of updates of the same order.
Experiments on several datasets demonstrate the effectiveness of our method and confirm our theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
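As a generic illustration of the binarisation idea (not the specific BGN architecture), the toy layer below binarises both its weights and the node embeddings, using a sign function in the forward pass and a straight-through gradient in the backward pass; the layer shape and data are made up for the example.

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Binarise to {-1, +1} in the forward pass; pass gradients straight through."""
    @staticmethod
    def forward(ctx, x):
        return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))
    @staticmethod
    def backward(ctx, grad_out):
        return grad_out

class BinaryGCNLayer(nn.Module):
    """Toy graph-convolution layer with binary parameters and binary node outputs."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(in_dim, out_dim))
    def forward(self, adj_norm, h):
        w_bin = BinarizeSTE.apply(self.weight)   # binary network parameters
        h = adj_norm @ h @ w_bin                 # neighbourhood aggregation
        return BinarizeSTE.apply(h)              # binary node representations

# Usage: adj_norm is a (normalised) adjacency matrix, h the node feature matrix.
adj_norm = torch.eye(5)
h = torch.randn(5, 8)
emb = BinaryGCNLayer(8, 16)(adj_norm, h)
```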
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Similarity of Neural Networks with Gradients [8.804507286438781]
We propose to leverage both feature vectors and gradient vectors in designing the representation of a neural network.
We show that the proposed approach provides a state-of-the-art method for computing similarity of neural networks.
arXiv Detail & Related papers (2020-03-25T17:04:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.