Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues
IEEE Network (IF 6.8). Pub Date: 2024-02-23. DOI: 10.1109/mnet.2024.3369406
Xiaodong Wang 1 , Zhitao Guan 1 , Longfei Wu 2 , Keke Gai 3

Federated learning (FL) offers a promising solution for effectively leveraging the data scattered across a distributed cloud system. Despite this potential, its heavy communication overhead places a great burden on the system. Federated distillation (FD) is a novel distributed learning technique with low communication cost, in which clients exchange only model logits rather than model parameters. However, FD faces challenges related to data heterogeneity and security; in particular, its conventional aggregation method is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of distributed cloud systems. To address these issues, we propose a blockchain-based framework for secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data distribution heterogeneity, and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/workers selection strategy is devised to optimize task allocation among clients. Experiments are conducted to evaluate the effectiveness of the proposed framework.
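To illustrate the kind of robust logit aggregation the abstract alludes to, the sketch below uses a coordinate-wise trimmed mean, one common robust statistic for limiting the influence of malicious uploads. This is a hypothetical illustration, not the paper's actual aggregation rule; the function name, `trim_ratio` parameter, and example data are all assumptions.

```python
import numpy as np

def robust_aggregate_logits(client_logits, trim_ratio=0.2):
    """Aggregate per-sample logits from clients via a coordinate-wise trimmed mean.

    For each (sample, class) coordinate, the values reported by the clients are
    sorted, the extreme `trim_ratio` fraction is discarded from each end, and
    the remainder is averaged. Outlier (potentially malicious) logits at either
    extreme are thus excluded from the aggregate.
    """
    stacked = np.stack(client_logits)      # shape: (n_clients, n_samples, n_classes)
    n = stacked.shape[0]
    k = int(n * trim_ratio)                # number of clients to trim from each end
    ordered = np.sort(stacked, axis=0)     # sort across clients, per coordinate
    trimmed = ordered[k:n - k] if k > 0 else ordered
    return trimmed.mean(axis=0)            # shape: (n_samples, n_classes)

# Hypothetical example: 4 honest clients and 1 client uploading outlier logits.
honest = [np.ones((4, 3)) for _ in range(4)]
malicious = [np.full((4, 3), 100.0)]
agg = robust_aggregate_logits(honest + malicious, trim_ratio=0.2)
# The outlier is trimmed, so the aggregate matches the honest clients' logits.
```

A plain mean over the same inputs would be pulled to about 20.8 by the single malicious client, which is why FD aggregation typically needs a robust statistic rather than simple averaging.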
