Real-time Eye Contact System Using a Kinect Depth Camera for Realistic Telepresence 


Vol. 37, No. 4, pp. 277-282, Apr. 2012


  Abstract

In this paper, we present a real-time eye contact system for realistic telepresence using a Kinect depth camera. To generate the eye contact image, we capture a pair of color and depth video streams. The single foreground user is then separated from the background. Since the raw depth data contains several types of noise, we apply joint bilateral filtering. We then apply a discontinuity-adaptive depth filter to the filtered depth map to reduce the disocclusion area. From the color image and the preprocessed depth map, we construct a user mesh model at the virtual viewpoint. The entire system is implemented with GPU-based parallel programming for real-time processing. Experimental results show that the proposed system is effective in realizing eye contact, providing realistic telepresence.
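As a rough illustration of the depth preprocessing step described above, the sketch below applies a joint bilateral filter to a noisy Kinect depth map, using the registered color image as the guidance signal. This is a minimal CPU sketch in Python/NumPy, not the authors' GPU implementation; the window radius and the spatial/range sigmas are illustrative assumptions.

```python
import numpy as np

def joint_bilateral_filter(depth, color, radius=4, sigma_s=3.0, sigma_r=25.0):
    """Smooth a noisy depth map using the registered color image as guidance.

    depth : (H, W) float array, raw depth values (0 = missing sample)
    color : (H, W) float array, grayscale guidance image in [0, 255]
    radius, sigma_s, sigma_r : illustrative window/kernel parameters
    """
    h, w = depth.shape
    out = np.zeros_like(depth)

    # Precompute the spatial Gaussian weights for the filter window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))

    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)

            d_win = depth[y0:y1, x0:x1]
            c_win = color[y0:y1, x0:x1]
            s_win = spatial[y0 - y + radius:y1 - y + radius,
                            x0 - x + radius:x1 - x + radius]

            # Range weights come from the color image, so depth edges that
            # coincide with color edges are preserved while noise is smoothed.
            rng = np.exp(-((c_win - color[y, x]) ** 2) / (2.0 * sigma_r**2))

            # Ignore depth holes (zero samples) when averaging.
            valid = (d_win > 0).astype(depth.dtype)
            wgt = s_win * rng * valid
            norm = wgt.sum()
            out[y, x] = (wgt * d_win).sum() / norm if norm > 0 else 0.0

    return out
```

In the paper, this filtering, together with the discontinuity-adaptive depth filter and the mesh rendering at the virtual viewpoint, is carried out as GPU-parallel processing to reach real-time rates; the per-pixel loop above is only meant to make the weighting explicit.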


  Cite this article

[IEEE Style]

S. Lee and Y. Ho, "Real-time Eye Contact System Using a Kinect Depth Camera for Realistic Telepresence," The Journal of Korean Institute of Communications and Information Sciences, vol. 37, no. 4, pp. 277-282, 2012.

[ACM Style]

Sang-Beom Lee and Yo-Sung Ho. 2012. Real-time Eye Contact System Using a Kinect Depth Camera for Realistic Telepresence. The Journal of Korean Institute of Communications and Information Sciences, 37, 4, (2012), 277-282.

[KICS Style]

Sang-Beom Lee and Yo-Sung Ho, "Real-time Eye Contact System Using a Kinect Depth Camera for Realistic Telepresence," The Journal of Korean Institute of Communications and Information Sciences, vol. 37, no. 4, pp. 277-282, 4. 2012.