Best Papers
 Autoencoder-Based Model Compression Schemes for Federated Learning 


Vol. 48, No. 3, pp. 295-305, Mar. 2023
10.7840/kics.2023.48.3.295


  Abstract

Edge intelligence has emerged as a key enabler of intelligent 6G networks. The federated learning (FL) algorithm has been regarded as a promising solution for the remote, decentralized training of artificial intelligence (AI) models distributed over multiple clients. In an FL system, AI model parameters must be exchanged between the server and the clients, which incurs prohibitive communication costs. To overcome this challenge, this paper proposes an autoencoder approach to compressing the AI model parameters in the FL system. The inference steps of the proposed autoencoder are carefully designed so that the weighted averaging operations of the FL algorithm can be injected into the end-to-end compression-reconstruction process. Numerical results demonstrate the effectiveness of the proposed method over conventional schemes.
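
To make the idea concrete, the following minimal sketch (in PyTorch) shows one way the abstract's description could be realized: clients upload encoded parameter vectors, the server performs FedAvg-style weighted averaging directly on the received codes, and a single decoding step reconstructs the global update. The class and function names (ParamAutoencoder, aggregate_round), the layer sizes, and the choice of averaging in the code domain are assumptions made here for illustration; the paper's actual architecture and procedure may differ.

import torch
import torch.nn as nn

class ParamAutoencoder(nn.Module):
    """Hypothetical autoencoder over flattened model-parameter vectors."""
    def __init__(self, dim, code_dim):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(dim, code_dim), nn.ReLU(),
            nn.Linear(code_dim, code_dim))
        self.decoder = nn.Linear(code_dim, dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

def aggregate_round(ae, client_params, weights):
    # Clients compress their local parameter vectors before uplink.
    codes = [ae.encoder(p) for p in client_params]
    # Server folds the FedAvg weighted average into the compressed domain
    # (an assumption about how averaging could be injected into the
    # compression-reconstruction path, not the paper's exact procedure).
    avg_code = sum(w * c for w, c in zip(weights, codes))
    # One decoding pass reconstructs the averaged global update.
    return ae.decoder(avg_code)

# Example usage: three clients, 1,000 parameters compressed to 64 values.
ae = ParamAutoencoder(dim=1000, code_dim=64)
clients = [torch.randn(1000) for _ in range(3)]
global_update = aggregate_round(ae, clients, weights=[0.5, 0.3, 0.2])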


  Cite this article

[IEEE Style]

Do-Yun Lee and Hoon Lee, "Autoencoder-Based Model Compression Schemes for Federated Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 3, pp. 295-305, 2023. DOI: 10.7840/kics.2023.48.3.295.

[ACM Style]

Do-Yun Lee and Hoon Lee. 2023. Autoencoder-Based Model Compression Schemes for Federated Learning. The Journal of Korean Institute of Communications and Information Sciences, 48, 3, (2023), 295-305. DOI: 10.7840/kics.2023.48.3.295.

[KICS Style]

Do-Yun Lee and Hoon Lee, "Autoencoder-Based Model Compression Schemes for Federated Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 3, pp. 295-305, Mar. 2023. (https://doi.org/10.7840/kics.2023.48.3.295)