Sinc Kolmogorov-Arnold Network and Its Applications on Physics-informed Neural Networks
- URL: http://arxiv.org/abs/2410.04096v1
- Date: Sat, 5 Oct 2024 09:33:39 GMT
- Title: Sinc Kolmogorov-Arnold Network and Its Applications on Physics-informed Neural Networks
- Authors: Tianchi Yu, Jingwei Qiu, Jiang Yang, Ivan Oseledets
- Abstract summary: We propose to use Sinc interpolation in the context of Kolmogorov-Arnold Networks, neural networks with learnable activation functions.
We show that Sinc interpolation offers a viable alternative, since it is known in numerical analysis to represent both smooth functions and functions with singularities well.
- Score: 4.61590049339329
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose to use Sinc interpolation in the context of Kolmogorov-Arnold Networks, neural networks with learnable activation functions, which have recently gained attention as alternatives to the multilayer perceptron. Many different function representations have already been tried, but we show that Sinc interpolation offers a viable alternative, since it is known in numerical analysis to represent both smooth functions and functions with singularities well. This is important not only for function approximation but also for solving partial differential equations with physics-informed neural networks. Through a series of experiments, we show that SincKANs provide better results in almost all of the examples we have considered.
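To make the central idea concrete, here is a minimal sketch of Sinc interpolation used as a learnable one-dimensional basis, in the spirit of a SincKAN activation. Everything below (the `sinc_basis` helper, the node spacing `h`, the least-squares fit) is an illustrative assumption rather than the authors' implementation; in a SincKAN the coefficients would instead be trained by gradient descent inside the network.

```python
# Minimal sketch: represent a 1-D function in the Sinc (cardinal) basis,
#   f(x) ~ sum_k c_k * sinc((x - k*h) / h),  sinc(t) = sin(pi*t) / (pi*t),
# which reduces to the classical cardinal series when c_k = f(k*h).
# Hypothetical illustration, not the SincKAN reference code.
import numpy as np

def sinc_basis(x, nodes, h):
    """Return the matrix sinc((x_i - node_j) / h), shape (len(x), len(nodes))."""
    return np.sinc((x[:, None] - nodes[None, :]) / h)

# Uniform nodes k*h for k = -N..N; h controls the resolution of the basis.
N, h = 32, 0.125
nodes = h * np.arange(-N, N + 1)

# A smooth target on [-2, 2]; Sinc bases are known to approximate such
# functions well, which is the property the abstract appeals to.
target = lambda x: np.tanh(3 * x) * np.exp(-x**2)
x_train = np.linspace(-2.0, 2.0, 400)

# A SincKAN layer would learn the coefficients c_k by gradient descent;
# here a least-squares fit demonstrates the representation itself.
Phi = sinc_basis(x_train, nodes, h)
coef, *_ = np.linalg.lstsq(Phi, target(x_train), rcond=None)

x_test = np.linspace(-2.0, 2.0, 1001)
err = np.abs(sinc_basis(x_test, nodes, h) @ coef - target(x_test)).max()
print(f"max approximation error: {err:.2e}")
```

Placing many such learnable one-dimensional activations on the edges of a network, as Kolmogorov-Arnold Networks do, is roughly what the paper calls a SincKAN.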
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Feature Mapping in Physics-Informed Neural Networks (PINNs) [1.9819034119774483]
We study the training dynamics of PINNs with a feature mapping layer via the limiting Conjugate Kernel and Neural Tangent Kernel.
We propose conditionally positive definite Radial Basis Function as a better alternative.
arXiv Detail & Related papers (2024-02-10T13:51:09Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Simple initialization and parametrization of sinusoidal networks via their kernel bandwidth [92.25666446274188]
Neural networks with sinusoidal activations have been proposed as an alternative to networks with traditional activation functions.
We first propose a simplified version of such sinusoidal neural networks, which allows both for easier practical implementation and simpler theoretical analysis.
We then analyze the behavior of these networks from the neural tangent kernel perspective and demonstrate that their kernel approximates a low-pass filter with an adjustable bandwidth.
arXiv Detail & Related papers (2022-11-26T07:41:48Z) - Open- and Closed-Loop Neural Network Verification using Polynomial Zonotopes [6.591194329459251]
We present a novel approach to efficiently compute tight non-convex enclosures of the images computed by neural networks.
In particular, we abstract the input-output relation of each neuron by a polynomial approximation.
This results in superior performance compared to other methods.
arXiv Detail & Related papers (2022-07-06T14:39:19Z) - Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
(A generic PINN residual is sketched after this list.)
arXiv Detail & Related papers (2022-06-29T19:03:10Z) - SPINN: Sparse, Physics-based, and Interpretable Neural Networks for PDEs [0.0]
We introduce a class of Sparse, Physics-based, and Interpretable Neural Networks (SPINN) for solving ordinary and partial differential equations.
By reinterpreting a traditional meshless representation of solutions of PDEs as a special sparse deep neural network, we develop a class of sparse neural network architectures that are interpretable.
arXiv Detail & Related papers (2021-02-25T17:45:50Z) - UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions upon each event.
arXiv Detail & Related papers (2020-07-28T09:31:56Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suited to neural networks.
We propose a novel multi-level graph neural network framework that captures interactions at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Finite Difference Neural Networks: Fast Prediction of Partial Differential Equations [5.575293536755126]
We propose a novel neural network framework, finite difference neural networks (FDNet), to learn partial differential equations from data.
Specifically, our proposed finite difference inspired network is designed to learn the underlying governing partial differential equations from trajectory data.
arXiv Detail & Related papers (2020-06-02T19:17:58Z)
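Several entries above concern physics-informed neural networks. For readers unfamiliar with the setup, here is a generic PINN residual sketch (not the method of any single paper listed): a small network is trained so that both the PDE residual at random collocation points and the boundary mismatch vanish, shown for the 1-D Poisson problem u''(x) = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x). The architecture and hyperparameters are illustrative assumptions.

```python
# Generic PINN sketch for u''(x) = -pi^2 sin(pi x), u(0) = u(1) = 0.
# Illustrative only: architecture and hyperparameters are assumptions.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x):
    """u''(x) + pi^2 sin(pi x), computed with autograd."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return d2u + torch.pi**2 * torch.sin(torch.pi * x)

xb = torch.tensor([[0.0], [1.0]])  # boundary points, where u must vanish
for step in range(2000):
    xc = torch.rand(128, 1)        # random collocation points in (0, 1)
    loss = pde_residual(xc).pow(2).mean() + net(xb).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

x = torch.linspace(0.0, 1.0, 201).unsqueeze(1)
print("max error vs. sin(pi x):", (net(x) - torch.sin(torch.pi * x)).abs().max().item())
```

The spectral-bias entry above concerns exactly this kind of training: when the target contains high-frequency features, the residual loss converges slowly, and the cited works study or mitigate that effect.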
This list is automatically generated from the titles and abstracts of the papers on this site.