When Federated Learning Meets Knowledge Distillation
IEEE Wireless Communications ( IF 10.9 ) Pub Date : 2024-10-02 , DOI: 10.1109/mwc.016.2300523
Xiaoyi Pang, Jiahui Hu, Peng Sun, Ju Ren, Zhibo Wang

Federated learning (FL) has garnered significant attention in the Internet-of-Things (IoT) domain due to its ability to facilitate collaborative learning among distributed, privacy-sensitive devices without exposing their local data. However, FL's development and application are hindered by the heterogeneity, vulnerability, and limited computing and wireless communication resources of IoT systems. To address these issues, knowledge distillation (KD), a technique that transfers learned knowledge from one network to another in a fast and lightweight manner, is incorporated into FL to enhance its usability and generality in IoT. KD-based FL can effectively achieve collaborative learning among resource-limited IoT devices that are heterogeneous in data distribution, model architecture, or resource budget, and can offer enhanced privacy guarantees. Given the increasing adoption and benefits of KD in FL, it is essential to review KD-based FL schemes to identify common methodologies and potential future directions. In this article, we examine the challenges associated with applying FL in IoT and set out three key steps for incorporating KD into FL, along with the general workflow of KD-based FL. Furthermore, we conduct a comprehensive retrospective analysis of existing KD-based FL schemes, tease out how each leverages KD to address these challenges, and compare them across various design aspects. Based on our analysis and comparison, we shed light on several unresolved research questions that warrant further investigation.
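To make the core mechanism concrete, the sketch below illustrates the general idea behind KD-based FL described in the abstract: instead of averaging model weights, a server can aggregate clients' soft predictions (logits) on a shared public sample, and each client then distills from the aggregated "ensemble teacher" via a temperature-scaled KL-divergence loss. This is a minimal, hypothetical illustration, not the specific scheme surveyed in the article; all function names are assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Distillation loss: KL(teacher || student) on temperature-softened
    # distributions, scaled by T^2 (Hinton-style formulation).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)

def aggregate_soft_labels(client_logits):
    # Server step in federated distillation: average per-client logits on
    # a shared public sample. Because only predictions are exchanged,
    # clients may use heterogeneous model architectures.
    return np.mean(np.stack(client_logits), axis=0)

# Two clients with different predictions on one public example:
ensemble = aggregate_soft_labels([np.array([[2.0, 0.0]]),
                                  np.array([[0.0, 2.0]])])
# Each client would then minimize kd_loss(own_logits, ensemble).
loss = kd_loss(np.array([[0.0, 3.0]]), ensemble)
```

Exchanging logits on a public dataset rather than weight vectors is one way KD reduces communication cost and accommodates model heterogeneity, at the price of requiring a suitable shared (or synthetic) reference dataset.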

Updated: 2024-10-02