Structured Bayesian Federated Learning for Green AI: A Decentralized Model Compression Using Turbo-VBI-Based Approach
Is part of
IEEE internet of things journal, 2024-04, Vol.11 (7), p.12783-12798
Place / Publisher
Piscataway: IEEE
Year of publication
2024
Source
IEEE Xplore
Descriptions / Notes
Although deep neural networks (DNNs) have been remarkably successful in numerous areas, their performance is compromised in federated learning (FL) scenarios because of the large model size. A large model induces huge communication overhead during federated training, and also imposes an infeasible storage and computation burden on the clients during inference. To address these issues, we investigate structured model compression in FL to construct sparse models with a regular structure, such that they require significantly less communication, storage, and computation resources. We do this by proposing a three-layer hierarchical prior, which promotes a common regular sparse structure in the local models. We design a decentralized Turbo variational Bayesian inference (D-Turbo-VBI) algorithm to solve the resulting federated training problem. With the common regular sparse structure, both upstream and downstream communication overhead can be reduced, and the final model also has a regular sparse structure, which requires significantly less local storage and computation resources. Simulation results demonstrate that our proposed algorithm can efficiently reduce the communication overhead during federated training, and that the resulting model achieves a significantly lower sparsity rate and inference time than the baselines while maintaining competitive accuracy.
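The abstract's distinction between regular (structured) and irregular sparsity can be illustrated with a minimal sketch. This is not the paper's D-Turbo-VBI algorithm; it is a generic example, with hypothetical values, of group pruning, where entire rows (neurons) of a weight matrix are removed by norm thresholding, so the compressed model stays a compact dense matrix that is cheap to store and compute with:

```python
import numpy as np

# Illustrative only -- NOT the paper's Bayesian method. Structured
# sparsity prunes whole rows (neurons) rather than scattered weights,
# so the surviving model keeps a regular, hardware-friendly shape.
W = np.array([
    [ 1.0,  -2.0,   0.5,  1.5],   # strong row: kept
    [ 0.01,  0.0,   0.02, 0.0],   # near-zero row: pruned entirely
    [-1.2,   0.8,  -0.3,  2.1],   # strong row: kept
    [ 0.0,   0.03,  0.01, 0.0],   # near-zero row: pruned entirely
])

row_norms = np.linalg.norm(W, axis=1)   # one score per neuron
keep = row_norms > 0.5                  # hypothetical threshold
W_structured = W[keep]                  # compact dense matrix remains

print(W.shape, "->", W_structured.shape)  # (4, 4) -> (2, 4)
```

Because whole rows vanish, the pruned layer can simply be stored and multiplied as a smaller dense matrix, which is the source of the storage and inference savings the abstract describes; unstructured (element-wise) sparsity would instead require sparse-matrix bookkeeping to realize any speedup.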