Energy-Efficient Artificial Neural Network Inference Based on Neural Network Partitioning 


Vol. 48,  No. 4, pp. 405-408, Apr.  2023
10.7840/kics.2023.48.4.405


  Abstract

In this paper, we investigate energy-efficient inference of artificial neural networks. In particular, we propose an energy-efficient inference process based on neural network partitioning, in which the artificial neural network is partitioned and distributed between a mobile end device and an edge server. By performing cooperative inference via wireless communications, the energy consumed by the entire inference process can be reduced. Simulation results show that the proposed scheme improves the energy efficiency of practical inference tasks.
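The partitioning idea in the abstract can be illustrated with a toy model: layers up to a chosen split point run on the device, the intermediate activation is transmitted to the edge server, and the split is chosen to minimize device-side energy. The following sketch is purely illustrative; the layer sizes, energy constants, and the linear energy model are assumptions, not values or formulas from the paper.

```python
# Toy model of choosing a partition point for split (device/edge) inference.
# All constants below are illustrative assumptions, not taken from the paper.

# Output activation size (number of values) after each layer of a
# hypothetical 5-layer network; the raw input is assumed to be 8192 values.
LAYER_OUTPUT_SIZES = [4096, 2048, 1024, 256, 10]
INPUT_SIZE = 8192

# Assumed per-layer device compute energy (mJ) and per-value transmit energy (mJ).
DEVICE_ENERGY_PER_LAYER = [5.0, 4.0, 3.0, 2.0, 1.0]
TX_ENERGY_PER_VALUE = 0.004

def device_energy(split: int) -> float:
    """Device-side energy if layers [0, split) run on the device and the
    activation of layer split-1 (or the raw input when split == 0) is
    transmitted to the edge server."""
    compute = sum(DEVICE_ENERGY_PER_LAYER[:split])
    tx_size = LAYER_OUTPUT_SIZES[split - 1] if split > 0 else INPUT_SIZE
    return compute + TX_ENERGY_PER_VALUE * tx_size

def best_partition(num_layers: int) -> tuple[int, float]:
    """Exhaustively search all split points and return the one that
    minimizes device-side energy (server energy is ignored here)."""
    candidates = {s: device_energy(s) for s in range(num_layers + 1)}
    split = min(candidates, key=candidates.get)
    return split, candidates[split]

if __name__ == "__main__":
    split, energy = best_partition(len(LAYER_OUTPUT_SIZES))
    print(f"best split after layer {split}: {energy:.3f} mJ on the device")
```

With these assumed numbers, splitting late is favored because deeper layers emit small activations that are cheap to transmit; a real system would also weigh wireless channel conditions and server-side cost, as cooperative-inference schemes typically do.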



  Cite this article

[IEEE Style]

Sangseok Yun, "Energy-Efficient Artificial Neural Network Inference Based on Neural Network Partitioning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 4, pp. 405-408, 2023. DOI: 10.7840/kics.2023.48.4.405.

[ACM Style]

Sangseok Yun. 2023. Energy-Efficient Artificial Neural Network Inference Based on Neural Network Partitioning. The Journal of Korean Institute of Communications and Information Sciences, 48, 4, (2023), 405-408. DOI: 10.7840/kics.2023.48.4.405.

[KICS Style]

Sangseok Yun, "Energy-Efficient Artificial Neural Network Inference Based on Neural Network Partitioning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 4, pp. 405-408, Apr. 2023. (https://doi.org/10.7840/kics.2023.48.4.405)