A logifold structure on measure space
- URL: http://arxiv.org/abs/2405.05492v2
- Date: Sat, 19 Oct 2024 02:39:01 GMT
- Title: A logifold structure on measure space
- Authors: Inkee Jung, Siu-Cheong Lau
- Abstract summary: We develop a local-to-global and measure-theoretical approach to understanding datasets.
We show in experiments how it can be used to find fuzzy domains and to improve accuracy in data classification problems.
- Score: 0.0
- License:
- Abstract: In this paper, we develop a local-to-global and measure-theoretical approach to understanding datasets. The idea is to take network models with restricted domains as local charts of datasets. We develop the mathematical foundations for these structures, and show in experiments how they can be used to find fuzzy domains and to improve accuracy in data classification problems.
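To make the local-chart picture concrete, the following is a minimal sketch, assuming an ensemble of classifiers each paired with a fuzzy membership function over the input space; the names (LocalChart, predict) and the thresholding rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of "network models with restricted domains as local charts":
# each chart pairs a classifier with a fuzzy membership function, and a prediction
# combines only the charts whose (fuzzy) domain covers the query point.
from dataclasses import dataclass
from typing import Callable, List
import numpy as np

@dataclass
class LocalChart:
    model: Callable[[np.ndarray], np.ndarray]   # maps a sample to a vector of class scores
    membership: Callable[[np.ndarray], float]   # fuzzy domain indicator in [0, 1]

def predict(charts: List[LocalChart], x: np.ndarray, threshold: float = 0.5) -> int:
    """Aggregate class scores over the charts that cover x, weighted by membership."""
    total = None
    for chart in charts:
        w = chart.membership(x)
        if w < threshold:           # x falls outside this chart's restricted domain
            continue
        scores = w * chart.model(x)
        total = scores if total is None else total + scores
    if total is None:
        raise ValueError("no chart covers this point")
    return int(np.argmax(total))
```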
Related papers
- Logifold: A Geometrical Foundation of Ensemble Machine Learning [0.0]
We present a local-to-global and measure-theoretical approach to understanding datasets.
The core idea is to formulate a logifold structure and to interpret network models with restricted domains as local charts of datasets.
arXiv Detail & Related papers (2024-07-23T04:47:58Z)
- (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z)
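As a rough, hedged illustration of the generative-distance idea summarized above: given a trained decoder g from latent space to data space, the length of the decoded straight latent segment upper-bounds the corresponding pullback geodesic distance. The decoder g and the discretization step count are assumptions for illustration, not the paper's construction.

```python
# Illustrative only: approximate a "generative distance" between two latent codes
# by the polyline length of the decoded straight latent segment.
import numpy as np

def generative_segment_length(g, z0: np.ndarray, z1: np.ndarray, steps: int = 64) -> float:
    """Length of t -> g((1 - t) * z0 + t * z1) in data space, discretized into `steps` pieces."""
    ts = np.linspace(0.0, 1.0, steps + 1)
    points = np.stack([g((1.0 - t) * z0 + t * z1) for t in ts])
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))
```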
- Prospector Heads: Generalized Feature Attribution for Large Models & Data [82.02696069543454]
We introduce prospector heads, an efficient and interpretable alternative to explanation-based attribution methods.
We demonstrate how prospector heads enable improved interpretation and discovery of class-specific patterns in input data.
arXiv Detail & Related papers (2024-02-18T23:01:28Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computationally heterogeneous data.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
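The entry above does not spell out its aggregation rule; the sketch below is only a generic illustration of one way to account for computational heterogeneity, namely weighting each client's update by the number of local steps it actually completed. The weighting rule is an assumption, not the paper's algorithm.

```python
# Generic, hypothetical aggregation rule: weight client updates by completed local steps.
import numpy as np

def aggregate(updates, local_steps):
    """updates: list of same-shape parameter arrays; local_steps: steps completed per client."""
    weights = np.asarray(local_steps, dtype=float)
    weights = weights / weights.sum()
    return sum(w * np.asarray(u) for w, u in zip(weights, updates))
```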
- Hierarchical Optimal Transport for Unsupervised Domain Adaptation [0.0]
We propose a novel approach for unsupervised domain adaptation that relates notions of optimal transport, learning probability measures and unsupervised learning.
The proposed approach, HOT-DA, is based on a hierarchical formulation of optimal transport.
Experiments on a toy dataset with controllable complexity and two challenging visual adaptation datasets show the superiority of the proposed approach over the state-of-the-art.
arXiv Detail & Related papers (2021-12-03T18:37:23Z)
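Below is a hedged sketch of a two-level optimal transport coupling in the spirit of the hierarchical formulation summarized above, written with the POT library: Wasserstein costs between individual clusters serve as the ground cost of an outer transport problem between cluster structures. The cluster inputs and uniform weights are illustrative assumptions, not the HOT-DA implementation.

```python
# Illustrative two-level optimal transport between clustered source and target data.
import numpy as np
import ot  # Python Optimal Transport (POT)

def hierarchical_ot(source_clusters, target_clusters):
    """source_clusters / target_clusters: lists of (n_k, d) arrays of points."""
    S, T = len(source_clusters), len(target_clusters)
    ground = np.zeros((S, T))
    for i, Xs in enumerate(source_clusters):
        for j, Xt in enumerate(target_clusters):
            a = np.full(len(Xs), 1.0 / len(Xs))
            b = np.full(len(Xt), 1.0 / len(Xt))
            ground[i, j] = ot.emd2(a, b, ot.dist(Xs, Xt))  # inner Wasserstein cost
    wa = np.full(S, 1.0 / S)
    wb = np.full(T, 1.0 / T)
    return ot.emd(wa, wb, ground)  # outer coupling between clusters
```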
- Incorporation of Deep Neural Network & Reinforcement Learning with Domain Knowledge [0.0]
We present a study of the ways in which domain information has been incorporated when building models with neural networks.
Integrating space data is uniquely important to the development of knowledge-understanding models, as well as to other fields that aid in understanding information by utilizing the human-machine interface and reinforcement learning.
arXiv Detail & Related papers (2021-07-29T17:29:02Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Clustered Federated Learning via Generalized Total Variation Minimization [83.26141667853057]
We study optimization methods to train local (or personalized) models for local datasets with a decentralized network structure.
Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization.
Our main algorithmic contribution is a fully decentralized federated learning algorithm.
arXiv Detail & Related papers (2021-05-26T18:07:19Z)
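A minimal sketch, under assumed quadratic local losses, of a generalized total variation objective of the kind summarized above: each node fits its own parameters on its own dataset, and network edges penalize disagreement between neighboring models.

```python
# Illustrative GTV objective over a network of local linear models (assumptions noted above).
import numpy as np

def gtv_objective(W, X, y, edges, lam):
    """W: (n_nodes, d) local weights; X[i]: (m_i, d) features; y[i]: (m_i,) targets;
    edges: list of (i, j, a_ij) with positive edge weights a_ij; lam: coupling strength."""
    local = sum(np.mean((X[i] @ W[i] - y[i]) ** 2) for i in range(len(W)))
    coupling = sum(a * np.linalg.norm(W[i] - W[j]) for i, j, a in edges)
    return local + lam * coupling
```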
- Sheaves as a Framework for Understanding and Interpreting Model Fit [2.867517731896504]
We argue that sheaves can provide a natural framework to analyze how well a statistical model fits at the local level.
The sheaf-based approach is suitably general enough to be useful in a range of applications.
arXiv Detail & Related papers (2021-05-21T15:34:09Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Domain Adaptation by Topology Regularization [0.0]
Domain adaptation (DA) or transfer learning (TL) enables algorithms to transfer knowledge from a labelled (source) data set to an unlabelled but related (target) data set of interest.
We propose to leverage global data structure by applying a topological data analysis technique called persistent homology to TL.
arXiv Detail & Related papers (2021-01-28T16:45:41Z)
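As a hedged illustration of comparing global topological structure between source and target feature sets with persistent homology, the sketch below computes a bottleneck distance between their persistence diagrams using the ripser and persim packages; this particular pipeline is an assumption, not necessarily the one used in the paper.

```python
# Illustrative topological comparison of two point clouds via persistent homology.
import numpy as np
from persim import bottleneck
from ripser import ripser

def topological_gap(source: np.ndarray, target: np.ndarray, maxdim: int = 1) -> float:
    """Bottleneck distance between the degree-`maxdim` persistence diagrams of two point clouds."""
    dgm_s = ripser(source, maxdim=maxdim)["dgms"][maxdim]
    dgm_t = ripser(target, maxdim=maxdim)["dgms"][maxdim]
    return float(bottleneck(dgm_s, dgm_t))
```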
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.