Bayesian optimization
Hyperparameter
Optimization problem
Benchmark
Task
Gaussian process
Bayesian probability
Machine learning
Independent and identically distributed random variables
Black box
Data mining
Artificial intelligence
Gaussian distribution
Algorithm
Random variable
Authors
Hangyu Zhu, Xilu Wang, Yaochu Jin
Identifier
DOI:10.1109/tevc.2023.3279775
Abstract
Bayesian optimization is a powerful surrogate-assisted algorithm for solving expensive black-box optimization problems. While Bayesian optimization was developed for centralized optimization, the availability of massive distributed data has attracted increasing interest in federated Bayesian optimization (FBO), which can use data on multiple clients without exposing the raw data. However, existing FBO approaches assume either that all clients jointly solve the same optimization task, or that a single client solves one target task by transferring knowledge from the others in a federated way, making them unsuited to many real-world applications. In this paper, we consider FBO in the scenario where multiple related local black-box tasks, each associated with a different client, are jointly optimized by sharing knowledge between tasks without compromising data privacy. An efficient federated many-task Bayesian optimization framework is proposed to handle non-independent and identically distributed (non-IID) data while protecting data privacy in the federated setting. A novel federated knowledge-transfer paradigm is developed for dynamic many-task model aggregation according to a dissimilarity matrix. The dissimilarity is measured from the ranks of the surrogate predictions, and only the hyperparameters of the local Gaussian process models are shared. In addition, a federated ensemble acquisition function is constructed that integrates the predictions of two surrogates, using the global and the local hyperparameters respectively, to search effectively for the optimal solution. Experimental results show that the proposed method performs reliably on both benchmark problems and a real machine learning problem, even in the presence of non-IID data.
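The abstract's two key mechanisms can be illustrated with a minimal sketch. The following is not the paper's implementation; it assumes a simplified rank-based dissimilarity (here, a Spearman-style rank correlation of each client's surrogate predictions on shared candidate points, mapped to [0, 1]) and a similarity-weighted averaging of Gaussian process hyperparameter vectors. Function names (`rank_dissimilarity`, `aggregate_hyperparameters`) are hypothetical.

```python
import numpy as np

def rank_dissimilarity(preds_a, preds_b):
    """Dissimilarity between two tasks, based on the *ranks* of their
    surrogate predictions at shared candidate points (a simplifying
    assumption: 1 minus the Spearman correlation, rescaled to [0, 1]).
    Assumes no ties among predictions."""
    ranks_a = np.argsort(np.argsort(preds_a)).astype(float)
    ranks_b = np.argsort(np.argsort(preds_b)).astype(float)
    rho = np.corrcoef(ranks_a, ranks_b)[0, 1]  # Pearson corr. of ranks
    return (1.0 - rho) / 2.0  # 0 = identical ordering, 1 = reversed

def aggregate_hyperparameters(local_thetas, dissim_row):
    """Form 'global' GP hyperparameters for one client by averaging all
    clients' local hyperparameter vectors, weighting more-similar tasks
    (smaller dissimilarity) more heavily. Only hyperparameters are
    exchanged; raw data stays on each client."""
    weights = 1.0 - np.asarray(dissim_row)
    weights = weights / weights.sum()
    return weights @ np.asarray(local_thetas)
```

For example, a client whose predicted ranking of candidates agrees with another client's has dissimilarity near 0, so that client's GP hyperparameters (e.g., kernel length scales) receive high weight in the aggregate; a client with a reversed ranking has dissimilarity near 1 and is largely ignored.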