Computer science
Verifiable secret sharing
Blockchain
Scalability
Overhead (engineering)
Distributed computing
Inference
Chain (unit)
Training (meteorology)
Federated learning
Artificial intelligence
Machine learning
Database
Computer security
Operating system
Programming language
Set (abstract data type)
Meteorology
Physics
Astronomy
Authors
Sarthak Chakraborty, Sandip Chakraborty
Identifier
DOI:10.1109/icbc54727.2022.9805548
Abstract
Blockchain has been widely adopted to design accountable federated learning frameworks; however, existing frameworks do not scale to distributed model training over multiple independent blockchain networks. To store pre-trained models on a blockchain, current approaches primarily embed a model using its structural properties, which are neither scalable for cross-chain exchange nor suitable for cross-chain verification. This paper proposes Proof of Federated Training (PoFT), an architectural framework for cross-chain verifiable model training using federated learning, the first of its kind to enable a federated training procedure spanning clients across multiple blockchain networks. Instead of structural embedding, PoFT embeds a model over a blockchain using its parameters and then applies a verifiable model exchange between two blockchain networks for cross-network model training. We implement and test PoFT on a large-scale setup using Amazon EC2 instances and observe that cross-chain training significantly boosts model efficacy, while PoFT incurs only marginal overhead for inter-chain model exchanges.
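The abstract's core idea, embedding a model by its parameters so another chain can verify the exchanged model, can be illustrated with a minimal sketch. This is not the paper's actual protocol; it only shows the general pattern of committing to serialized parameters on one chain and recomputing the commitment on the receiving chain. All function names and the toy parameter dictionary are hypothetical.

```python
import hashlib
import json

def commit_parameters(params):
    """Serialize model parameters deterministically and hash them.
    The digest stands in for an on-chain commitment to the model."""
    serialized = json.dumps(params, sort_keys=True).encode("utf-8")
    return hashlib.sha256(serialized).hexdigest()

def verify_parameters(params, commitment):
    """A receiving chain recomputes the digest over the exchanged
    parameters and checks it against the commitment it read."""
    return commit_parameters(params) == commitment

# Chain A trains a model and records a commitment to its parameters.
model = {"layer1.weight": [0.12, -0.34], "layer1.bias": [0.05]}
digest = commit_parameters(model)

# Chain B receives the parameters off-chain and verifies them
# against the commitment published on chain A.
assert verify_parameters(model, digest)

# Any tampering with the parameters invalidates the commitment.
tampered = {"layer1.weight": [0.99, -0.34], "layer1.bias": [0.05]}
assert not verify_parameters(tampered, digest)
```

A parameter-level commitment like this is independent of the model's structural encoding, which is one plausible reason such an approach transfers more easily between heterogeneous blockchain networks.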