Improved Graph-based semi-supervised learning Schemes
- URL: http://arxiv.org/abs/2407.00760v1
- Date: Sun, 30 Jun 2024 16:50:08 GMT
- Title: Improved Graph-based semi-supervised learning Schemes
- Authors: Farid Bozorgnia,
- Abstract summary: In this work, we improve the accuracy of several known algorithms to address the classification of large datasets when few labels are available.
Our framework lies in the realm of graph-based semi-supervised learning.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we improve the accuracy of several known algorithms to address the classification of large datasets when few labels are available. Our framework lies in the realm of graph-based semi-supervised learning. With novel modifications to the Gaussian Random Fields Learning and Poisson Learning algorithms, we increase accuracy and obtain more robust algorithms. Experimental results demonstrate the efficiency and superiority of the proposed methods over conventional graph-based semi-supervised techniques, especially in the context of imbalanced datasets.
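To make the setting concrete, below is a minimal sketch (not the authors' code) of the two baseline graph-based semi-supervised algorithms the abstract says the paper modifies: Gaussian Random Fields / harmonic-function propagation and Poisson Learning. The toy data, the dense Gaussian-weighted graph, the labeled-index choice, and the iteration count are illustrative assumptions, not details taken from the paper.

```python
# Sketch of two graph-based SSL baselines named in the abstract:
#  - Gaussian Random Fields (harmonic-function) propagation
#  - Poisson Learning
# Data, graph construction, and parameters are illustrative assumptions.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs, only a handful of labels revealed.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y_true = np.array([0] * 50 + [1] * 50)
labeled = np.array([0, 1, 50, 51])                 # indices with known labels
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)
n_classes = 2

# Dense Gaussian-weighted graph and its unnormalized Laplacian L = D - W.
W = np.exp(-cdist(X, X) ** 2 / 2.0)
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))
L = D - W

# One-hot label matrix for the labeled nodes.
Y = np.zeros((len(labeled), n_classes))
Y[np.arange(len(labeled)), y_true[labeled]] = 1.0

# Gaussian Random Fields / harmonic solution: f_u = -L_uu^{-1} L_ul f_l.
L_uu = L[np.ix_(unlabeled, unlabeled)]
L_ul = L[np.ix_(unlabeled, labeled)]
f_u = np.linalg.solve(L_uu, -L_ul @ Y)
grf_pred = f_u.argmax(axis=1)

# Poisson Learning: solve L u = b with centered point sources at the
# labeled nodes, using a simple degree-preconditioned fixed-point iteration.
b = np.zeros((len(X), n_classes))
b[labeled] = Y - Y.mean(axis=0)                    # centered sources, zero total mass
u = np.zeros_like(b)
D_inv = 1.0 / W.sum(axis=1)
for _ in range(500):
    u = u + D_inv[:, None] * (b - L @ u)
poisson_pred = u[unlabeled].argmax(axis=1)

print("GRF accuracy:    ", (grf_pred == y_true[unlabeled]).mean())
print("Poisson accuracy:", (poisson_pred == y_true[unlabeled]).mean())
```

Both baselines reduce to solving a graph-Laplacian system driven by the labeled nodes; per the abstract, the paper's modifications target the accuracy and robustness of exactly these schemes, especially under label imbalance.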
Related papers
- Structured Prediction in Online Learning [66.36004256710824]
We study a theoretical and algorithmic framework for structured prediction in the online learning setting.
We show that our algorithm is a generalisation of optimal algorithms from the supervised learning setting.
We consider a second algorithm designed especially for non-stationary data distributions, including adversarial data.
arXiv Detail & Related papers (2024-06-18T07:45:02Z)
- Performance Evaluation and Comparison of a New Regression Algorithm [4.125187280299247]
We compare the performance of a newly proposed regression algorithm against four conventional machine learning algorithms.
The reader is free to replicate our results since we have provided the source code in a GitHub repository.
arXiv Detail & Related papers (2023-06-15T13:01:16Z)
- Efficiently Learning the Graph for Semi-supervised Learning [4.518012967046983]
We show how to learn the best graphs from the sparse families efficiently using the conjugate gradient method.
Our approach can also be used to learn the graph efficiently online with sub-linear regret, under mild smoothness assumptions.
We implement our approach and demonstrate significant (~10-100x) speedups over prior work on semi-supervised learning with learned graphs on benchmark datasets.
arXiv Detail & Related papers (2023-06-12T13:22:06Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Weakly Supervised Change Detection Using Guided Anisotropic Diffusion [97.43170678509478]
We propose original ideas that help us to leverage weakly annotated datasets in the context of change detection.
First, we propose the guided anisotropic diffusion (GAD) algorithm, which improves semantic segmentation results.
We then show its potential in two weakly-supervised learning strategies tailored for change detection.
arXiv Detail & Related papers (2021-12-31T10:03:47Z)
- A Comprehensive Analytical Survey on Unsupervised and Semi-Supervised Graph Representation Learning Methods [4.486285347896372]
This survey aims to evaluate all major classes of graph embedding methods.
We organized graph embedding techniques using a taxonomy that includes methods from manual feature engineering, matrix factorization, shallow neural networks, and deep graph convolutional networks.
We designed experiments on top of the PyTorch Geometric and DGL libraries and ran them on different multicore CPU and GPU platforms.
arXiv Detail & Related papers (2021-12-20T07:50:26Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Feature space approximation for kernel-based supervised learning [2.653409741248232]
The goal is to reduce the size of the training data, resulting in lower storage consumption and computational complexity.
We demonstrate significant improvements in comparison to the computation of data-driven predictions involving the full training data set.
The method is applied to classification and regression problems from different application areas such as image recognition, system identification, and oceanographic time series analysis.
arXiv Detail & Related papers (2020-11-25T11:23:58Z)
- Information Theoretic Meta Learning with Gaussian Processes [74.54485310507336]
We formulate meta learning using information theoretic concepts; namely, mutual information and the information bottleneck.
By making use of variational approximations to the mutual information, we derive a general and tractable framework for meta learning.
arXiv Detail & Related papers (2020-09-07T16:47:30Z)
- Semi-Supervised Learning with Meta-Gradient [123.26748223837802]
We propose a simple yet effective meta-learning algorithm in semi-supervised learning.
We find that the proposed algorithm performs favorably against state-of-the-art methods.
arXiv Detail & Related papers (2020-07-08T08:48:56Z)
- Flow-based Algorithms for Improving Clusters: A Unifying Framework, Software, and Performance [0.0]
Clustering points in a vector space or nodes in a graph is a ubiquitous primitive in statistical data analysis.
We focus on principled algorithms for this cluster improvement problem.
We develop efficient implementations of these algorithms in our LocalGraphClustering Python package.
arXiv Detail & Related papers (2020-04-20T20:14:00Z)