PHiFL-TL: Personalized hierarchical federated learning using transfer learning
Future Generation Computer Systems (IF 6.2), Pub Date: 2024-12-09, DOI: 10.1016/j.future.2024.107672
Afsaneh Afzali, Pirooz Shamsinejadbabaki

Federated learning is a collaborative machine learning (ML) framework designed to train a globally shared model without accessing participants' private data. However, federated learning faces significant challenges due to statistical heterogeneity across participants' data: it produces the same model for all participants without adapting it to any individual, so the global model can perform poorly on each participant's task. Personalized federated learning methods aim to mitigate these negative effects of data heterogeneity. Previous personalized approaches have relied on a single central server, but in such a client-server architecture the central server's workload becomes a bottleneck. In this paper, we propose a Personalized Hierarchical Federated Learning approach (PHiFL-TL). First, PHiFL-TL trains a globally shared model using hierarchical federated learning; it then constructs personalized models for individual participants through transfer learning. We demonstrate the effectiveness of PHiFL-TL on non-identically distributed (non-IID) data partitions of the MNIST and FEMNIST datasets.
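Only the abstract is available here, so as a rough illustration of the two-stage pipeline it describes, below is a minimal Python sketch: hierarchical federated averaging (client updates aggregated at edge servers, edge models aggregated at a cloud server), followed by per-client fine-tuning as the transfer-learning step. The toy linear-regression task and every name in it (local_train, edge_models, and so on) are illustrative assumptions, not PHiFL-TL's actual algorithm or code.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-IID setup: each client's data comes from a client-specific
# shift of a shared linear task (a stand-in for statistical heterogeneity).
DIM, CLIENTS_PER_EDGE, EDGES = 5, 3, 2
w_true = rng.normal(size=DIM)
clients = []
for e in range(EDGES):
    for _ in range(CLIENTS_PER_EDGE):
        shift = rng.normal(scale=0.5, size=DIM)       # heterogeneity across clients
        X = rng.normal(size=(40, DIM))
        y = X @ (w_true + shift)
        clients.append({"edge": e, "X": X, "y": y})

def local_train(w, X, y, lr=0.05, steps=10):
    """A few gradient steps on squared loss -- stands in for a local update."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Stage 1: hierarchical federated averaging (clients -> edge servers -> cloud).
w_global = np.zeros(DIM)
for _ in range(20):
    edge_models = []
    for e in range(EDGES):
        members = [cl for cl in clients if cl["edge"] == e]
        updates = [local_train(w_global, cl["X"], cl["y"]) for cl in members]
        edge_models.append(np.mean(updates, axis=0))  # edge-level aggregation
    w_global = np.mean(edge_models, axis=0)           # cloud-level aggregation

# Stage 2: personalization via transfer learning -- each client fine-tunes
# the shared global model on its own data with a small learning rate.
personalized = [local_train(w_global, cl["X"], cl["y"], lr=0.01, steps=20)
                for cl in clients]
print("client 0 MSE, global vs. personalized:",
      np.mean((clients[0]["X"] @ w_global - clients[0]["y"]) ** 2),
      np.mean((clients[0]["X"] @ personalized[0] - clients[0]["y"]) ** 2))

On this toy setup the fine-tuned weights should track each client's shifted task more closely than the single shared model does, which is the effect personalization targets; the paper's actual method operates on neural networks and the MNIST/FEMNIST benchmarks.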
