GSGP-CUDA -- a CUDA framework for Geometric Semantic Genetic Programming
- URL: http://arxiv.org/abs/2106.04034v1
- Date: Tue, 8 Jun 2021 00:58:39 GMT
- Title: GSGP-CUDA -- a CUDA framework for Geometric Semantic Genetic Programming
- Authors: Leonardo Trujillo, Jose Manuel Muñoz Contreras, Daniel E Hernandez,
Mauro Castelli and Juan J Tapia
- Abstract summary: Geometric Semantic Genetic Programming (GSGP) is a state-of-the-art machine learning method based on evolutionary computation.
Efficient implementations of GSGP in C++ exploit this fact, but not to its full potential.
Results show speedups greater than 1,000X relative to the state-of-the-art sequential implementation.
- Score: 2.275405513780208
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Geometric Semantic Genetic Programming (GSGP) is a state-of-the-art machine
learning method based on evolutionary computation. GSGP performs search
operations directly at the level of program semantics, which can be done more
efficiently than operating at the syntax level as most GP systems do. Efficient
implementations of GSGP in C++ exploit this fact, but not to its full
potential. This paper presents GSGP-CUDA, the first and most efficient CUDA
implementation of GSGP, which exploits the intrinsic parallelism of GSGP using
GPUs. Results show speedups greater than 1,000X relative to the
state-of-the-art sequential implementation.
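To make the semantic-level parallelism concrete, the sketch below shows how geometric semantic mutation reduces to an elementwise operation over semantic vectors (the program outputs on the fitness cases), which maps naturally onto one GPU thread per (individual, fitness case) pair. This is a minimal CUDA sketch assuming the standard GSM definition, offspring = parent + ms * (R1 - R2) with sigmoid-bounded random trees R1 and R2; the kernel name, memory layout, and launch parameters are illustrative assumptions, not the kernels from the GSGP-CUDA source.

```cuda
#include <cuda_runtime.h>

// Elementwise geometric semantic mutation over a population's semantic vectors.
// Semantics are stored row-major: one row of `numCases` outputs per individual.
// offspring[i][j] = parent[i][j] + ms * (sigmoid(r1[i][j]) - sigmoid(r2[i][j])),
// where r1 and r2 hold the semantics of two random trees and the sigmoid bounds
// them to (0, 1), as is common in GSGP implementations.
__device__ float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

__global__ void gsmKernel(const float* parent,    // popSize * numCases values
                          const float* r1,        // popSize * numCases values
                          const float* r2,        // popSize * numCases values
                          float*       offspring, // popSize * numCases values
                          float        ms,        // mutation step
                          int          total)     // popSize * numCases
{
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    if (idx < total) {
        offspring[idx] = parent[idx] + ms * (sigmoidf(r1[idx]) - sigmoidf(r2[idx]));
    }
}

// Host-side launch sketch: one thread per (individual, fitness case) pair.
void gsmLaunch(const float* dParent, const float* dR1, const float* dR2,
               float* dOffspring, float ms, int popSize, int numCases)
{
    int total   = popSize * numCases;
    int threads = 256;
    int blocks  = (total + threads - 1) / threads;
    gsmKernel<<<blocks, threads>>>(dParent, dR1, dR2, dOffspring, ms, total);
    cudaDeviceSynchronize();
}
```

Geometric semantic crossover is analogous: it blends the two parents' semantic vectors elementwise, so the same one-thread-per-output mapping applies.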
Related papers
- Interactive Segmentation as Gaussian Process Classification [58.44673380545409]
Click-based interactive segmentation (IS) aims to extract the target objects under user interaction.
Most of the current deep learning (DL)-based methods mainly follow the general pipelines of semantic segmentation.
We propose to formulate the IS task as a Gaussian process (GP)-based pixel-wise binary classification model on each image.
arXiv Detail & Related papers (2023-02-28T14:01:01Z)
- A Lanczos approach to the Adiabatic Gauge Potential [0.0]
The Adiabatic Gauge Potential (AGP) measures the rate at which the eigensystem of a Hamiltonian changes under adiabatic deformations.
We employ a version of this approach by using the Lanczos algorithm to evaluate the AGP operator in terms of Krylov vectors and the AGP norm in terms of the Lanczos coefficients.
arXiv Detail & Related papers (2023-02-14T18:18:21Z)
- Functional Code Building Genetic Programming [0.0]
Code Building Genetic Programming (CBGP) is a recently introduced GP method for general program synthesis.
We show that a functional programming language and a Hindley-Milner type system can be used to evolve type-safe programs.
arXiv Detail & Related papers (2022-06-09T15:22:33Z)
- Rethinking and Scaling Up Graph Contrastive Learning: An Extremely Efficient Approach with Group Discrimination [87.07410882094966]
Graph contrastive learning (GCL) alleviates the heavy reliance on label information for graph representation learning (GRL).
We introduce a new learning paradigm for self-supervised GRL, namely, Group Discrimination (GD).
Instead of similarity computation, GGD directly discriminates two groups of summarised node instances with a simple binary cross-entropy loss.
In addition, GGD requires much fewer training epochs to obtain competitive performance compared with GCL methods on large-scale datasets.
arXiv Detail & Related papers (2022-06-03T12:32:47Z)
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
- Accelerating Genetic Programming using GPUs [0.0]
Genetic Programming (GP) has multiple applications in machine learning, such as curve fitting, data modelling, feature selection, and classification.
This paper describes a GPU-accelerated, stack-based variant of the generational GP algorithm that can be used for symbolic regression and binary classification (a minimal evaluation-kernel sketch appears after this list).
arXiv Detail & Related papers (2021-10-15T06:13:01Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Solving classification problems using Traceless Genetic Programming [0.0]
Traceless Genetic Programming (TGP) is a new Genetic Programming (GP) variant that may be used for solving difficult real-world problems.
In this paper, TGP is used for solving real-world classification problems taken from PROBEN1.
arXiv Detail & Related papers (2021-10-07T06:13:07Z)
- Using Traceless Genetic Programming for Solving Multiobjective Optimization Problems [1.9493449206135294]
Traceless Genetic Programming (TGP) is a Genetic Programming (GP) variant that is used in cases where the focus is rather the output of the program than the program itself.
Two genetic operators are used in conjunction with TGP: crossover and insertion.
Numerical experiments show that TGP solves the considered test problems quickly and effectively.
arXiv Detail & Related papers (2021-10-07T05:55:55Z)
- Scalable Graph Neural Networks via Bidirectional Propagation [89.70835710988395]
Graph Neural Networks (GNNs) are an emerging field for learning on non-Euclidean data.
This paper presents GBP, a scalable GNN that utilizes a localized bidirectional propagation process from both the feature vectors and the training/testing nodes.
An empirical study demonstrates that GBP achieves state-of-the-art performance with significantly less training/testing time.
arXiv Detail & Related papers (2020-10-29T08:55:33Z)
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
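The entry "Accelerating Genetic Programming using GPUs" above describes a stack-based GP evaluator for symbolic regression, which pairs naturally with the GSGP-CUDA theme of GPU-side program evaluation; a minimal sketch of that idea follows. The token encoding (non-negative tokens index dataset features, negative tokens select operators), the fixed stack depth, and all names are assumptions for illustration, not the cited paper's code.

```cuda
#include <cuda_runtime.h>

// Assumed postfix (reverse Polish) encoding for a single GP program:
// tokens >= 0 index a dataset feature; negative tokens select an operator.
#define OP_ADD (-1)
#define OP_SUB (-2)
#define OP_MUL (-3)
#define OP_DIV (-4)   // protected division
#define MAX_STACK 64  // assumed upper bound on program depth

// One thread evaluates the program on one fitness case (dataset row).
// `X` is row-major with `numFeatures` columns; `out` gets one prediction per row.
// The program is assumed to be valid postfix, so no stack bounds checks are done.
__global__ void evalPostfixKernel(const int* program, int progLen,
                                  const float* X, int numRows, int numFeatures,
                                  float* out)
{
    int row = blockIdx.x * blockDim.x + threadIdx.x;
    if (row >= numRows) return;

    float stack[MAX_STACK];
    int   sp = 0;

    for (int i = 0; i < progLen; ++i) {
        int tok = program[i];
        if (tok >= 0) {                       // terminal: push feature value
            stack[sp++] = X[row * numFeatures + tok];
        } else {                              // operator: pop two, push result
            float b = stack[--sp];
            float a = stack[--sp];
            float r;
            switch (tok) {
                case OP_ADD: r = a + b; break;
                case OP_SUB: r = a - b; break;
                case OP_MUL: r = a * b; break;
                default:     r = (fabsf(b) > 1e-6f) ? a / b : 1.0f; break;
            }
            stack[sp++] = r;
        }
    }
    out[row] = stack[sp - 1];                 // final stack value is the prediction
}
```

In practice an evaluator like this would process many programs per launch (for example, one thread block per program) and support constant terminals; the single-program version above keeps the control flow easy to follow.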
This list is automatically generated from the titles and abstracts of the papers on this site.