Hyperparameter
Computer science
Degradation (telecommunications)
Artificial neural network
Graph
Artificial intelligence
Theoretical computer science
Telecommunications
Authors
Viktorija Vodilovska,Sonja Gievska,Ilinka Ivanoska
Identifier
DOI:10.23919/mipro57284.2023.10159737
Abstract
Graph Neural Networks (GNNs) have emerged as increasingly attractive deep learning models for complex data, making them extremely useful in the biochemical and pharmaceutical domains. However, building a well-performing GNN requires many parameter choices, and Hyperparameter Optimization (HPO) can aid in exploring candidate solutions. This study presents a comparative analysis of several strategies for hyperparameter optimization of GNNs. The explored optimization techniques include bio-inspired algorithms such as the Genetic Algorithm, Particle Swarm Optimization, and Artificial Bee Colony, as well as Hill Climbing and Simulated Annealing, and the commonly used methods Random Search and Bayesian Search. The optimization algorithms are evaluated on their ability to improve the performance of GNN architectures developed for predicting mRNA degradation. The Stanford OpenVaccine dataset for mRNA degradation prediction is used for training and testing the predictive models. Finding mRNA molecules with low degradation rates is important in the development of mRNA vaccines for diseases such as COVID-19, and we hope to benefit ML research in this domain. According to the analysis's findings, the Simulated Annealing algorithm outperforms the other algorithms on both architectures. Furthermore, population-based algorithms such as Particle Swarm Optimization show promising results, with certain limitations related to their complexity, which encourages further exploration of the subject.
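As an illustration of the best-performing strategy reported in the abstract, the sketch below shows a minimal simulated-annealing hyperparameter search in Python. The search space, the evaluate() placeholder, and the geometric cooling schedule are assumptions for illustration only; the paper's actual GNN training objective and hyperparameter ranges are not reproduced here. In practice, evaluate() would train a GNN on the OpenVaccine data and return a validation error to be minimized.

import math
import random

# Hypothetical search space for a GNN; the paper's exact space is not given here.
SEARCH_SPACE = {
    "hidden_dim":    [32, 64, 128, 256],
    "num_layers":    [2, 3, 4, 5],
    "dropout":       [0.0, 0.1, 0.2, 0.3],
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
}

def random_config():
    """Sample one hyperparameter configuration uniformly from the space."""
    return {name: random.choice(values) for name, values in SEARCH_SPACE.items()}

def neighbor(config):
    """Perturb a single hyperparameter to obtain a neighboring configuration."""
    new_config = dict(config)
    name = random.choice(list(SEARCH_SPACE))
    new_config[name] = random.choice(SEARCH_SPACE[name])
    return new_config

def evaluate(config):
    """Placeholder objective: stand-in for training the GNN and returning a
    validation error for this configuration (lower is better)."""
    return random.random()

def simulated_annealing(iterations=50, t_start=1.0, t_end=0.01):
    """Minimize the objective with simulated annealing over the search space."""
    current = random_config()
    current_score = evaluate(current)
    best, best_score = current, current_score
    for i in range(iterations):
        # Geometric cooling from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (i / max(iterations - 1, 1))
        candidate = neighbor(current)
        candidate_score = evaluate(candidate)
        delta = candidate_score - current_score
        # Always accept improvements; accept worse moves with probability exp(-delta/t).
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_score = candidate, candidate_score
        if current_score < best_score:
            best, best_score = current, current_score
    return best, best_score

if __name__ == "__main__":
    config, score = simulated_annealing()
    print("best config:", config, "score:", score)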