UVR Lab. was formed in Feb. 2001 at GIST to study and develop “Virtual Reality in Smart computing environments”: systems that process multimodal input, perceive the user’s intention and emotion, and respond to the user’s requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and restarted with the theme of “FUN in Ubiquitous VR.”
Posted: 2016-04-03 22:09
3D Finger CAPE (full video)
Views: 1,570
Youngkyoon Jang, Seung-Tak Noh, Hyung Jin Chang, Tae-Kyun Kim, and Woontack Woo, “3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint,” IEEE Transactions on Visualization and Computer Graphics (TVCG), vol. 21, no. 4, April 2015. (Also presented as a long paper at IEEE VR 2015, Arles, Camargue, Provence, France, Mar. 23-27, 2015; acceptance rate: 13.8% (13/94).)
In this paper, we present a novel framework for simultaneously detecting click actions and estimating occluded fingertip positions from egocentrically viewed single-depth image sequences. For the detection and estimation, we present a novel probabilistic inference based on knowledge priors of clicking motion and clicked position. Based on the detection and estimation results, we achieve fine-resolution, bare-hand interaction with virtual objects from the egocentric viewpoint. The proposed method delivers promising performance under the frequent self-occlusions that arise when selecting objects in AR/VR space while wearing an HMD with an attached egocentric depth camera.
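The abstract mentions probabilistic inference that combines knowledge priors of clicking motion and clicked position. The following is only a hypothetical sketch of that general idea (not the paper's actual model): fusing a motion-based click likelihood with a spatial prior over candidate clicked positions via Bayes' rule. The function name, inputs, and toy numbers are all illustrative assumptions.

```python
import numpy as np

def click_posterior(motion_likelihood, position_prior):
    """Hypothetical Bayesian fusion, not the paper's method.

    motion_likelihood: P(observed fingertip motion | click at position i)
    position_prior:    P(click at position i), e.g. a prior learned from
                       previously observed click locations
    Returns the normalized posterior P(click at position i | motion).
    """
    joint = np.asarray(motion_likelihood) * np.asarray(position_prior)
    return joint / joint.sum()

# Toy example with three candidate click positions.
likelihood = [0.2, 0.7, 0.1]   # motion evidence favors position 1
prior      = [0.5, 0.3, 0.2]   # spatial prior favors position 0
posterior = click_posterior(likelihood, prior)
best = int(posterior.argmax())  # index of the most probable clicked position
```

In this toy case the motion evidence outweighs the prior, so the fused posterior still peaks at position 1; the actual paper performs this kind of fusion under self-occlusion, where the fingertip itself is not directly observable.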