Which Hyperparameters to Optimise? An Investigation of Evolutionary
Hyperparameter Optimisation in Graph Neural Network For Molecular Property
Prediction
- URL: http://arxiv.org/abs/2104.06046v2
- Date: Wed, 14 Apr 2021 09:45:54 GMT
- Authors: Yingfang Yuan, Wenjun Wang, Wei Pang
- Abstract summary: The study of graph neural networks (GNNs) has attracted much attention, and GNNs have achieved promising performance in molecular property prediction.
We focus on the impact of selecting two types of GNN hyperparameters, those belonging to graph-related layers and those of task-specific layers, on the performance of GNNs for molecular property prediction.
- Score: 8.02401104726362
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, the study of graph neural networks (GNNs) has attracted much
attention and achieved promising performance in molecular property prediction.
Most GNNs for molecular property prediction learn node representations by
aggregating information from neighbouring nodes (e.g., atoms); these
representations are then passed to subsequent layers to handle individual
downstream tasks. The architecture of a GNN can therefore be considered as
composed of two core parts: graph-related layers and task-specific layers. For
real-world molecular problems, hyperparameter optimisation (HPO) of both types
of layers is vital. HPO is expensive in this setting because evaluating each
candidate solution requires substantial computational resources to train and
validate a model. Furthermore, a larger search space makes the HPO problem more
challenging. In this research, we focus on the impact of selecting two types of
GNN hyperparameters, those belonging to graph-related layers and those of
task-specific layers, on the performance of GNNs for molecular property
prediction. In our experiments, we employed a state-of-the-art evolutionary
algorithm, CMA-ES, for HPO. The results reveal that optimising either type of
hyperparameter separately improves GNN performance, but optimising both types
simultaneously yields the largest improvements. Our study also further confirms
the importance of HPO for GNNs in molecular property prediction.
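To make the experimental setup concrete, here is a minimal sketch of evolutionary HPO with CMA-ES over the two hyperparameter groups, using the `cma` Python package. This is not the authors' code: the hyperparameter names, their ranges, and the `train_and_validate` stand-in are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' released code) of CMA-ES-based HPO over the
# two hyperparameter groups discussed in the abstract.
# Requires the `cma` package (pip install cma).
import cma

def train_and_validate(graph_hp, task_hp):
    """Hypothetical stand-in: build a GNN with these hyperparameters, train it
    on a molecular dataset, and return the validation error (lower is better).
    Replaced by a synthetic score here so the sketch runs end to end."""
    return (graph_hp["num_gnn_layers"] - 3) ** 2 + task_hp["dropout"]

def decode(x):
    """Map a continuous CMA-ES vector onto the two hyperparameter groups."""
    graph_hp = {  # hyperparameters of the graph-related (message-passing) layers
        "num_gnn_layers": max(1, int(round(x[0]))),
        "hidden_dim": 2 ** min(max(int(round(x[1])), 5), 9),  # 32..512 units
    }
    task_hp = {  # hyperparameters of the task-specific (prediction) layers
        "num_mlp_layers": max(1, int(round(x[2]))),
        "dropout": min(max(x[3], 0.0), 0.8),
    }
    return graph_hp, task_hp

def objective(x):
    graph_hp, task_hp = decode(x)
    return train_and_validate(graph_hp, task_hp)

# Optimise both groups simultaneously (the setting the paper found best);
# optimising one group alone amounts to freezing the other half of the vector.
es = cma.CMAEvolutionStrategy([3.0, 7.0, 2.0, 0.2], 0.5, {"maxiter": 30})
while not es.stop():
    candidates = es.ask()  # sample candidate hyperparameter vectors
    es.tell(candidates, [objective(c) for c in candidates])
print(decode(es.result.xbest))
```

Each call to `objective` corresponds to one full train-and-validate cycle, which is why the abstract stresses the cost of HPO: the budget is dominated by model training rather than by the optimiser itself.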
Related papers
- Spiking Graph Neural Network on Riemannian Manifolds [51.15400848660023]
Graph neural networks (GNNs) have become the dominant solution for learning on graphs.
Existing spiking GNNs consider graphs in Euclidean space, ignoring the structural geometry.
We present a Manifold-valued Spiking GNN (MSG).
MSG achieves superior performance to previous spiking GNNs and superior energy efficiency to conventional GNNs.
arXiv: 2024-10-23
- Diffusing to the Top: Boost Graph Neural Networks with Minimal Hyperparameter Tuning [33.948899558876604]
This work introduces a graph-conditioned latent diffusion framework (GNN-Diff) to generate high-performing GNNs.
We validate our method through 166 experiments across four graph tasks: node classification on small, large, and long-range graphs, as well as link prediction.
arXiv: 2024-10-08
- Molecular Hypergraph Neural Networks [1.4559839293730863]
Graph neural networks (GNNs) have demonstrated promising performance across various chemistry-related tasks.
We introduce molecular hypergraphs and propose Molecular Hypergraph Neural Networks (MHNN) to predict the optoelectronic properties of organic semiconductors.
MHNN outperforms all baseline models on most tasks of OPV, OCELOTv1 and PCQM4Mv2 datasets.
arXiv: 2023-12-20
- HiGNN: Hierarchical Informative Graph Neural Networks for Molecular Property Prediction Equipped with Feature-Wise Attention [5.735627221409312]
We propose a well-designed hierarchical informative graph neural network framework (termed HiGNN) for molecular property prediction.
Experiments demonstrate that HiGNN achieves state-of-the-art predictive performance on many challenging drug discovery-associated benchmark datasets.
arXiv: 2022-08-30
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv: 2022-08-21
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, avoiding the need for hand-crafted informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv: 2022-07-25
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
arXiv: 2022-05-27
- A Genetic Algorithm with Tree-structured Mutation for Hyperparameter Optimisation of Graph Neural Networks [8.02401104726362]
Graph neural networks (GNNs) have gained increasing attention, as they possess an excellent capability for processing graph-related problems.
In practice, hyperparameter optimisation (HPO) is critical for GNNs to achieve satisfactory results.
We propose a tree-structured mutation strategy for the genetic algorithm (GA) to alleviate this issue.
arXiv: 2021-02-24
- A Systematic Comparison Study on Hyperparameter Optimisation of Graph Neural Networks for Molecular Property Prediction [8.02401104726362]
Graph neural networks (GNNs) have been proposed for a wide range of graph-related learning tasks.
In recent years, an increasing number of GNN systems have been applied to predict molecular properties.
arXiv: 2021-02-08
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv: 2020-09-05
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv: 2020-04-19