Journal: IEEE Transactions on Parallel and Distributed Systems [Institute of Electrical and Electronics Engineers]; Date: 2024-02-20; Volume/Issue: 35(4): 634-645
Identifier
DOI:10.1109/tpds.2024.3366471
Abstract
Bayesian networks are important machine learning models with many practical applications in, e.g., biomedicine and bioinformatics. The problem of learning Bayesian networks is $\mathcal{NP}$-hard and computationally challenging. In this article, we propose practical parallel exact algorithms to learn Bayesian networks from data. Our approach uses shared-memory task parallelism to explore the dynamic programming lattices that arise in Bayesian network structure learning, and it introduces several optimization techniques to constrain and partition the underlying search space. Through extensive experimental testing we show that the resulting method is highly scalable and can be used to efficiently learn large globally optimal networks.
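To make the notion of a dynamic programming lattice concrete, the sketch below shows the classic sequential subset-lattice recurrence for exact Bayesian network structure learning (in the spirit of Silander and Myllymäki): the best score of a network over a variable subset S is obtained by choosing a sink variable of S and its optimal parent set inside S. This is only an illustrative, sequential baseline under assumed names (`learn_optimal_network`, `best_parent_score`, the toy scoring function); the article's task-parallel exploration and search-space pruning are not reproduced here.

```python
from itertools import combinations

def best_parent_score(var, candidate_parents, local_score):
    """Best (score, parents) for `var` among subsets of `candidate_parents`.
    Exhaustive enumeration; purely illustrative."""
    best = (local_score(var, frozenset()), frozenset())
    for k in range(1, len(candidate_parents) + 1):
        for ps in combinations(candidate_parents, k):
            s = local_score(var, frozenset(ps))
            if s > best[0]:
                best = (s, frozenset(ps))
    return best

def learn_optimal_network(variables, local_score):
    """Dynamic programming over the lattice of variable subsets:
    F[S] = best score over S, realized by a sink variable of S whose
    optimal parents are chosen from the remaining variables of S."""
    variables = tuple(variables)
    F = {frozenset(): 0.0}
    choice = {}  # subset -> (sink, parents) achieving F[subset]
    for size in range(1, len(variables) + 1):
        for subset in map(frozenset, combinations(variables, size)):
            best = None
            for sink in subset:
                rest = subset - {sink}
                s, parents = best_parent_score(sink, tuple(rest), local_score)
                total = F[rest] + s
                if best is None or total > best[0]:
                    best = (total, sink, parents)
            F[subset] = best[0]
            choice[subset] = (best[1], best[2])
    # Reconstruct the optimal parent sets by peeling sinks off the full set.
    network, subset = {}, frozenset(variables)
    while subset:
        sink, parents = choice[subset]
        network[sink] = parents
        subset -= {sink}
    return F[frozenset(variables)], network

if __name__ == "__main__":
    # Toy scoring function (assumption): reward the parent set {'A'} for 'B',
    # and penalize every other nonempty parent set by its size.
    def toy_score(var, parents):
        return 1.0 if (var == "B" and parents == frozenset({"A"})) else -float(len(parents))
    print(learn_optimal_network(["A", "B", "C"], toy_score))
```

In the parallel setting described by the abstract, the lattice layers (subsets of equal size) expose natural task parallelism, since each subset's value depends only on strictly smaller subsets; how tasks are scheduled and how the lattice is constrained and partitioned is the subject of the article itself.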