Abstract:
Federated Learning (FL) is a distributed machine learning paradigm that addresses privacy concerns by training models without sharing clients' raw data. However, the heterogeneity of client data in FL can hinder convergence and degrade generalization performance. To address this issue, this paper proposes a balanced information and dynamic updating federated prototype learning (BD-FedProto) framework consisting of two components: dynamic aggregation (DA) of prototype scheduling and contrastive prototype aggregation (CPA). The former dynamically adjusts the proportion between local and global learning to balance the contributions of local and global knowledge. The latter treats missing classes as negative samples by learning unknown distributions through unified prototype clustering. Experimental results on the CIFAR-10 and MNIST datasets show that BD-FedProto effectively improves the classification performance and stability of FL.