Kolmogorov-Arnold PointNet: Deep learning for prediction of fluid fields on irregular geometries
- URL: http://arxiv.org/abs/2408.02950v2
- Date: Sun, 02 Mar 2025 18:31:59 GMT
- Title: Kolmogorov-Arnold PointNet: Deep learning for prediction of fluid fields on irregular geometries
- Authors: Ali Kashefi
- Abstract summary: Kolmogorov-Arnold Networks (KANs) have emerged as a promising alternative to traditional Multilayer Perceptrons (MLPs) in deep learning. We present KA-PointNet as a novel supervised deep learning framework for the prediction of incompressible steady-state fluid flow fields.
- Score: 1.90365714903665
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kolmogorov-Arnold Networks (KANs) have emerged as a promising alternative to traditional Multilayer Perceptrons (MLPs) in deep learning. KANs have already been integrated into various architectures, such as convolutional neural networks, graph neural networks, and transformers, and their potential has been assessed for predicting physical quantities. However, the combination of KANs with point-cloud-based neural networks (e.g., PointNet) for computational physics has not yet been explored. To address this, we present Kolmogorov-Arnold PointNet (KA-PointNet) as a novel supervised deep learning framework for the prediction of incompressible steady-state fluid flow fields in irregular domains, where the predicted fields are a function of the geometry of the domains. In KA-PointNet, we implement shared KANs in the segmentation branch of the PointNet architecture. We utilize Jacobi polynomials to construct shared KANs. As a benchmark test case, we consider incompressible laminar steady-state flow over a cylinder, where the geometry of its cross-section varies over the data set. We investigate the performance of Jacobi polynomials with different degrees as well as special cases of Jacobi polynomials such as Legendre polynomials, Chebyshev polynomials of the first and second kinds, and Gegenbauer polynomials, in terms of the computational cost of training and accuracy of prediction of the test set. Additionally, we compare the performance of PointNet with shared KANs (i.e., KA-PointNet) and PointNet with shared MLPs. It is observed that when the number of trainable parameters is approximately equal, PointNet with shared KANs (i.e., KA-PointNet) outperforms PointNet with shared MLPs. Moreover, KA-PointNet predicts the pressure and velocity distributions along the surface of cylinders more accurately, resulting in more precise computations of lift and drag.
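The abstract describes replacing the shared MLPs of PointNet's segmentation branch with shared KANs whose learnable functions are Jacobi polynomial expansions. The following is a minimal illustrative sketch, not the authors' implementation: the Jacobi three-term recurrence and the idea of a "shared" layer (the same learnable coefficients applied to every point independently) are standard, but all class and function names here (`jacobi_basis`, `SharedJacobiKANLayer`) and the tanh input normalization are hypothetical choices for the sketch.

```python
import numpy as np

def jacobi_basis(x, degree, alpha=0.0, beta=0.0):
    """Evaluate Jacobi polynomials P_0..P_degree at x via the three-term recurrence.
    Special cases mentioned in the abstract: alpha = beta = 0 gives Legendre;
    alpha = beta = +/-1/2 relate to Chebyshev polynomials (up to scaling)."""
    P = [np.ones_like(x)]
    if degree >= 1:
        P.append(0.5 * (alpha - beta) + 0.5 * (alpha + beta + 2.0) * x)
    for n in range(2, degree + 1):
        a = 2.0 * n * (n + alpha + beta) * (2.0 * n + alpha + beta - 2.0)
        b = (2.0 * n + alpha + beta - 1.0) * (alpha ** 2 - beta ** 2)
        c = (2.0 * n + alpha + beta - 1.0) * (2.0 * n + alpha + beta) \
            * (2.0 * n + alpha + beta - 2.0)
        d = 2.0 * (n + alpha - 1.0) * (n + beta - 1.0) * (2.0 * n + alpha + beta)
        P.append(((b + c * x) * P[-1] - d * P[-2]) / a)
    return np.stack(P, axis=-1)  # shape (..., degree + 1)

class SharedJacobiKANLayer:
    """A 'shared' KAN layer: the same learnable polynomial coefficients are
    applied to every point in the cloud independently (PointNet-style sharing)."""
    def __init__(self, in_dim, out_dim, degree=3, alpha=0.0, beta=0.0, seed=0):
        rng = np.random.default_rng(seed)
        self.degree, self.alpha, self.beta = degree, alpha, beta
        # One coefficient per (input feature, output feature, polynomial degree).
        self.coef = rng.normal(scale=1.0 / (in_dim * (degree + 1)),
                               size=(in_dim, out_dim, degree + 1))

    def __call__(self, points):
        x = np.tanh(points)  # squash inputs into [-1, 1], the Jacobi domain
        basis = jacobi_basis(x, self.degree, self.alpha, self.beta)  # (N, in, deg+1)
        return np.einsum('nid,iod->no', basis, self.coef)            # (N, out)

layer = SharedJacobiKANLayer(in_dim=3, out_dim=8, degree=3)
cloud = np.random.default_rng(1).normal(size=(1024, 3))  # 1024 surface points
features = layer(cloud)
print(features.shape)  # (1024, 8)
```

Varying `degree`, `alpha`, and `beta` corresponds to the sweep over polynomial degrees and Jacobi special cases (Legendre, Chebyshev, Gegenbauer) that the paper studies for training cost versus prediction accuracy.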
Related papers
- Physics-informed KAN PointNet: Deep learning for simultaneous solutions to inverse problems in incompressible flow on numerous irregular geometries [4.548755617115688]
Physics-informed PointNet (PIPN) was introduced to address this limitation for PINNs.
PI-KAN-PointNet enables the simultaneous solution of an inverse problem over multiple irregular geometries within a single training run.
Our findings indicate that a physics-informed PointNet model employing MLP layers as the encoder and KAN layers as the decoder represents the optimal configuration among all models investigated.
arXiv Detail & Related papers (2025-04-08T12:31:57Z) - PointNet with KAN versus PointNet with MLP for 3D Classification and Segmentation of Point Sets [1.90365714903665]
We introduce PointNet-KAN, a neural network for 3D point cloud classification and segmentation tasks.
It employs Kolmogorov-Arnold Networks (KANs) instead of traditional Multilayer Perceptrons (MLPs)
arXiv Detail & Related papers (2024-10-14T01:57:06Z) - Imputation of Time-varying Edge Flows in Graphs by Multilinear Kernel Regression and Manifold Learning [4.129225533930965]
This paper extends the framework of multilinear kernel regression and imputation via manifold learning (MultiL-KRIM) to impute time-varying edge flows in a graph.
MultiL-KRIM uses simplicial-complex arguments and Hodge Laplacians to incorporate the graph topology.
It exploits manifold-learning arguments to identify latent geometries within features, which are modeled as a point cloud around a smooth manifold embedded in a reproducing kernel Hilbert space (RKHS).
arXiv Detail & Related papers (2024-09-08T15:38:31Z) - Point Deformable Network with Enhanced Normal Embedding for Point Cloud Analysis [59.12922158979068]
Recently, MLP-based methods have shown strong performance in point cloud analysis.
Simple architectures are able to learn geometric features in local point groups yet fail to model long-range dependencies directly.
We propose Point Deformable Network (PDNet) to capture long-range relations with strong representation ability.
arXiv Detail & Related papers (2023-12-20T14:52:07Z) - On the accuracy of interpolation based on single-layer artificial neural networks with a focus on defeating the Runge phenomenon [29.004178992441336]
We consider one-hidden layer ANNs with a feedforward architecture, also referred to as shallow or two-layer networks.
We present the case where the parameters are trained using a procedure that is referred to as Extreme Learning Machine (ELM)
The focus is then on the accuracy of the interpolation outside of the given sampling nodes, when they are the equispaced, the Chebyshev, and the randomly selected ones.
arXiv Detail & Related papers (2023-08-21T13:40:09Z) - Revisiting Tropical Polynomial Division: Theory, Algorithms and Application to Neural Networks [40.137069931650444]
Tropical geometry has recently found several applications in the analysis of neural networks with piecewise linear activation functions.
This paper presents a new look at the problem of tropical division and its application to the simplification of neural networks.
arXiv Detail & Related papers (2023-06-27T02:26:07Z) - Regularization of polynomial networks for image recognition [78.4786845859205]
Polynomial Networks (PNs) have emerged as an alternative method with a promising performance and improved interpretability.
We introduce a class of PNs, which are able to reach the performance of ResNet across a range of six benchmarks.
arXiv Detail & Related papers (2023-03-24T10:05:22Z) - Efficient Graph Field Integrators Meet Point Clouds [59.27295475120132]
We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization(SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations for point clouds.
arXiv Detail & Related papers (2023-02-02T08:33:36Z) - Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
To sum up, we focus on the manifold model as the limit of large graphs and construct MNNs, while we can still bring back graph neural networks by the discretization of MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z) - PhyGNNet: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network [12.385926494640932]
We propose PhyGNNet for solving partial differential equations on the basis of a graph neural network.
In particular, we divide the computing area into regular grids, define partial differential operators on the grids, and then construct the PDE loss for the network to optimize, building the PhyGNNet model.
arXiv Detail & Related papers (2022-08-07T13:33:34Z) - Layer Adaptive Node Selection in Bayesian Neural Networks: Statistical Guarantees and Implementation Details [0.5156484100374059]
Sparse deep neural networks have proven to be efficient for predictive model building in large-scale studies.
We propose a Bayesian sparse solution using spike-and-slab Gaussian priors to allow for node selection during training.
We establish the fundamental result of variational posterior consistency together with the characterization of prior parameters.
arXiv Detail & Related papers (2021-08-25T00:48:07Z) - Deep Archimedean Copulas [98.96141706464425]
ACNet is a novel differentiable neural network architecture that enforces structural properties.
We show that ACNet is able to both approximate common Archimedean Copulas and generate new copulas which may provide better fits to data.
arXiv Detail & Related papers (2020-12-05T22:58:37Z) - A Point-Cloud Deep Learning Framework for Prediction of Fluid Flow Fields on Irregular Geometries [62.28265459308354]
The network learns an end-to-end mapping between spatial positions and CFD quantities.
Incompressible laminar steady flow past a cylinder with various shapes for its cross section is considered.
The network predicts the flow fields hundreds of times faster than conventional CFD.
arXiv Detail & Related papers (2020-10-15T12:15:02Z) - Permutation Matters: Anisotropic Convolutional Layer for Learning on Point Clouds [145.79324955896845]
We propose a permutable anisotropic convolutional operation (PAI-Conv) that calculates soft-permutation matrices for each point.
Experiments on point clouds demonstrate that PAI-Conv produces competitive results in classification and semantic segmentation tasks.
arXiv Detail & Related papers (2020-05-27T02:42:29Z) - Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z) - Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.