
Federated Learning Approaches for Decentralized Data Privacy in Machine Learning

EasyChair Preprint 14994

12 pages · Date: September 22, 2024

Abstract

As data privacy concerns escalate, particularly in domains such as healthcare and finance, privacy-preserving machine learning methodologies have become paramount. Federated learning (FL) is a paradigm that enables collaborative model training across distributed devices while raw data remains localized. This paper examines several federated learning strategies, analyzing their efficacy in preserving privacy while maintaining strong model performance. We compare the classical algorithms Federated Averaging (FedAvg) and Federated SGD (FedSGD) with more recent approaches such as Federated Proximal (FedProx), which addresses data heterogeneity. Through evaluation on a synthetic dataset designed to mimic real-world conditions, we assess these approaches on three critical metrics: accuracy, communication efficiency, and model convergence. Our findings underscore the potential of federated learning to balance the trade-offs between privacy, efficiency, and accuracy, paving the way for broader adoption across sectors.
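To make the two main approaches named in the abstract concrete, the sketch below shows the standard FedAvg aggregation rule (a dataset-size-weighted average of client models) and the proximal gradient correction that defines FedProx. This is a minimal illustration of the textbook formulations, not the paper's implementation; the function names, the flat NumPy parameter representation, and the default `mu` value are assumptions for the example.

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """FedAvg server step: weighted average of client parameter vectors,
    with weights proportional to each client's local dataset size,
    i.e. w_global = sum_k (n_k / n) * w_k."""
    stacked = np.stack(client_params)                    # (num_clients, dim)
    weights = np.array(client_sizes) / sum(client_sizes) # (num_clients,)
    return weights @ stacked                             # (dim,)

def fedprox_local_grad(grad_local_loss, w, w_global, mu=0.01):
    """Gradient of the FedProx local objective
    F_k(w) + (mu/2) * ||w - w_global||^2.
    The proximal term limits how far heterogeneous clients
    can drift from the current global model."""
    return grad_local_loss + mu * (w - w_global)

# Toy example: three clients with different dataset sizes.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 30, 60]
print(fedavg_aggregate(clients, sizes))  # -> [4.0, 5.0]
```

With `mu = 0`, the FedProx update reduces to plain local SGD as used in FedAvg, which is why FedProx is often described as a generalization of FedAvg for non-IID (heterogeneous) client data.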

Keyphrases: Federated Averaging, Federated Learning, communication efficiency, data heterogeneity, decentralized data, privacy-preserving machine learning

BibTeX entry
BibTeX does not have a suitable entry type for preprints; the following is a workaround that produces the correct reference:
@booklet{EasyChair:14994,
  author       = {Lucas Zhang},
  title        = {Federated Learning Approaches for Decentralized Data Privacy in Machine Learning},
  howpublished = {EasyChair Preprint 14994},
  year         = {EasyChair, 2024}}