
Details

Author(s) / Contributors
Title
Computationally Efficient Auto-Weighted Aggregation for Heterogeneous Federated Learning
Is part of
  • 2022 IEEE International Conference on Edge Computing and Communications (EDGE), 2022, p.12-22
Place / Publisher
IEEE
Year of publication
2022
Source
IEEE Xplore Digital Library
Descriptions / Notes
  • Federated Learning (FL) offers a privacy-preserving, massively distributed Machine Learning (ML) paradigm in which many clients cooperate to train a shared model. FL, however, is susceptible to data heterogeneity problems because FL clients draw on diverse data sources. Prior works employ auto-weighted model aggregation to mitigate the heterogeneity issue and minimize the impact of unfavorable model updates. However, existing approaches require extensive computation for statistical analysis of clients' model updates. To circumvent this, we propose FedASL (Federated Learning with Auto-weighted Aggregation based on Standard Deviation of Training Loss), which uses only the local training loss of FL clients to auto-weight the model aggregation. Our evaluation on three datasets and various data corruption scenarios reveals that FedASL can effectively thwart data corruption from bad clients while incurring as little as one-tenth of the computation cost of existing approaches.
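The core idea in the abstract, weighting client updates by how far each client's training loss deviates from the group, can be illustrated with a minimal sketch. Note that the function names and the specific down-weighting rule (an exponential decay in units of the loss standard deviation) are illustrative assumptions; the paper's exact FedASL weighting formula is not reproduced here.

```python
import numpy as np

def loss_based_weights(losses):
    """Hypothetical loss-based auto-weighting: clients whose training loss
    deviates strongly from the mean (e.g. due to corrupted data) receive
    smaller aggregation weights. The exact FedASL rule differs; this only
    illustrates the idea of using loss statistics instead of analyzing the
    model updates themselves."""
    losses = np.asarray(losses, dtype=float)
    sigma = losses.std()
    if sigma == 0.0:
        # All clients report the same loss: fall back to uniform weights.
        return np.full(len(losses), 1.0 / len(losses))
    # Deviation of each client's loss from the mean, in units of sigma.
    deviation = np.abs(losses - losses.mean()) / sigma
    # Down-weight outliers exponentially, then normalize to sum to 1.
    raw = np.exp(-deviation)
    return raw / raw.sum()

def aggregate(client_params, losses):
    """Weighted average of client parameter vectors (each a 1-D array),
    using only the clients' reported training losses for the weights."""
    weights = loss_based_weights(losses)
    return np.average(np.stack(client_params), axis=0, weights=weights)
```

Because the weights depend only on scalar training losses, the server avoids the per-parameter statistical analysis of client updates that the abstract identifies as the main cost of earlier auto-weighting approaches.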
Language
English
Identifiers
eISSN: 2767-9918
DOI: 10.1109/EDGE55608.2022.00015
Titel-ID: cdi_ieee_primary_9860312
