Moreau envelope-based personalized asynchronous federated learning: improving practicality in distributed machine learning
Abstract
Federated learning is a promising approach for training models on distributed data, driven by increasing demand across various industries. However, it faces several challenges, including communication bottlenecks and heterogeneity of client data. Personalized asynchronous federated learning addresses these challenges by customizing the model for individual users based on their local data while exchanging model updates asynchronously. In this paper, we propose Personalized Moreau Envelopes-based Asynchronous Federated Learning (APFedMe), which combines personalized learning with asynchronous communication and uses Moreau envelopes as clients' regularized loss functions. Our approach leverages Moreau envelopes to handle non-convex optimization problems and employs asynchronous updates to improve communication efficiency, while mitigating data heterogeneity through a personalized learning environment. We evaluate our approach on several datasets and compare it with the PFedMe, FedAvg, and PFedAvg federated learning methods. Our experiments show that APFedMe outperforms these methods in both convergence speed and communication efficiency. We also discuss well-performing implementations for handling missing data in distributed learning. Overall, our work contributes to the development of more effective and efficient federated learning methods that can be applied in various real-world scenarios.