Adaptive Topological Feature via Persistent Homology: Filtration Learning for Point Clouds
- URL: http://arxiv.org/abs/2307.09259v2
- Date: Sun, 24 Dec 2023 08:43:44 GMT
- Title: Adaptive Topological Feature via Persistent Homology: Filtration Learning for Point Clouds
- Authors: Naoki Nishikawa, Yuichi Ike and Kenji Yamanishi
- Abstract summary: We propose a framework that learns a filtration adaptively with the use of neural networks.
We show a theoretical result on a finite-dimensional approximation of filtration functions, which justifies the proposed network architecture.
- Score: 13.098609653951893
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning for point clouds has been attracting much attention, with
many applications in various fields, such as shape recognition and materials
science. To enhance the accuracy of such machine learning methods, it is
often effective to incorporate global topological features, which are typically
extracted by persistent homology. In the calculation of persistent homology for
a point cloud, we choose a filtration for the point cloud, an increasing
sequence of spaces. Since the performance of machine learning methods combined
with persistent homology is highly affected by the choice of a filtration, we
need to tune it depending on data and tasks. In this paper, we propose a
framework that learns a filtration adaptively with the use of neural networks.
In order to make the resulting persistent homology isometry-invariant, we
develop a neural network architecture with such invariance. Additionally, we
show a theoretical result on a finite-dimensional approximation of filtration
functions, which justifies the proposed network architecture. Experimental
results demonstrate the efficacy of our framework in several classification
tasks.
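The pipeline the abstract describes — fix a filtration on the point cloud, compute persistent homology, and feed the resulting features to a learner — can be illustrated in its simplest non-learned case. The sketch below (illustrative only; it is not the paper's learned, isometry-invariant filtration) computes the 0-dimensional persistence of the classical Vietoris-Rips filtration, using the standard fact that the H0 death times are exactly the edge weights of a minimum spanning tree of the point cloud.

```python
# Minimal sketch: 0-dimensional persistent homology of a Vietoris-Rips
# filtration on a point cloud. All function and variable names here are
# illustrative, not taken from the paper.
import math
from itertools import combinations

def rips_h0_persistence(points):
    """Return the sorted finite H0 death times (all births are 0) of the
    Vietoris-Rips filtration of `points`, via Kruskal's algorithm."""
    n = len(points)
    # Each edge enters the filtration at the distance between its endpoints.
    edges = sorted(
        (math.dist(p, q), i, j)
        for (i, p), (j, q) in combinations(enumerate(points), 2)
    )
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # two components merge: one H0 class dies
            parent[ri] = rj
            deaths.append(w)
    return deaths               # n - 1 finite deaths; one class never dies

# Two well-separated pairs of points: the largest death time reflects
# the gap between the two clusters.
cloud = [(0.0, 0.0), (0.0, 1.0), (10.0, 0.0), (10.0, 1.0)]
print(rips_h0_persistence(cloud))  # → [1.0, 1.0, 10.0]
```

The proposed framework replaces this fixed distance-based filtration with one parameterized by a neural network, so that the filtration values themselves are tuned to the data and task.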
Related papers
- Filtration learning in exact multi-parameter persistent homology and classification of time-series data [3.193388094899312]
We propose a framework for filtration learning of exact multi-parameter persistent homology (EMPH).
We derive the exact formula of the gradient of the loss function with respect to the filtration parameters.
arXiv Detail & Related papers (2024-06-28T00:25:43Z)
- Image Classification using Combination of Topological Features and Neural Networks [1.0323063834827417]
We use the persistent homology method, a technique in topological data analysis (TDA), to extract essential topological features from the data space.
This was carried out with the aim of classifying images from multiple classes in the MNIST dataset.
Our approach feeds topological features into deep learning models composed of single- and two-stream neural networks.
arXiv Detail & Related papers (2023-11-10T20:05:40Z)
- Non-isotropic Persistent Homology: Leveraging the Metric Dependency of PH [5.70896453969985]
We show that information on the point cloud is lost when restricting persistent homology to a single distance function.
We numerically show that non-isotropic persistent homology can extract information on orientation, orientational variance, and scaling of randomly generated point clouds.
arXiv Detail & Related papers (2023-10-25T08:03:17Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Localized Persistent Homologies for more Effective Deep Learning [60.78456721890412]
We introduce an approach that relies on a new filtration function to account for location during network training.
We demonstrate experimentally on 2D images of roads and 3D image stacks of neuronal processes that networks trained in this manner are better at recovering the topology of the curvilinear structures they extract.
arXiv Detail & Related papers (2021-10-12T19:28:39Z)
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- Implementing a foveal-pit inspired filter in a Spiking Convolutional Neural Network: a preliminary study [0.0]
We have presented a Spiking Convolutional Neural Network (SCNN) that incorporates retinal foveal-pit inspired Difference of Gaussian filters and rank-order encoding.
The model is trained using a variant of the backpropagation algorithm adapted to work with spiking neurons, as implemented in the Nengo library.
The network has achieved up to 90% accuracy, where loss is calculated using the cross-entropy function.
arXiv Detail & Related papers (2021-05-29T15:28:30Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges, which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Permutation Matters: Anisotropic Convolutional Layer for Learning on Point Clouds [145.79324955896845]
We propose a permutable anisotropic convolutional operation (PAI-Conv) that calculates soft-permutation matrices for each point.
Experiments on point clouds demonstrate that PAI-Conv produces competitive results in classification and semantic segmentation tasks.
arXiv Detail & Related papers (2020-05-27T02:42:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.