WeARHand
July 29, 2014 | video, 9,749 views
We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method requires no environmentally tethered tracking devices: using only depth input from a pair of near-range and far-range RGB-D cameras mounted on a head-worn display, it localizes both the cameras and the user's moving bare hand in 3D space. Depth perception is enhanced through egocentric visual feedback, including a semi-transparent proxy hand. We implement a virtual hand interaction technique together with several feedback approaches, and evaluate their performance and usability. The proposed method can be applied to many hand-based 3D interaction scenarios in wearable AR, such as AR information browsing, maintenance, design, and games.
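To illustrate the general idea of segmenting a bare hand from egocentric depth input, here is a minimal sketch in Python. It is not the WeARHand implementation; the function names, thresholds, and the simple depth-band segmentation are illustrative assumptions, standing in for the paper's actual localization pipeline.

```python
import numpy as np

# Hypothetical near-range depth band (mm) in which a bare hand is expected
# to appear in front of a head-worn RGB-D camera.
NEAR_MM = 200
FAR_MM = 600

def segment_hand(depth_mm, near=NEAR_MM, far=FAR_MM):
    """Boolean mask of pixels inside the near-range depth band --
    a rough stand-in for bare-hand segmentation from an egocentric view."""
    return (depth_mm >= near) & (depth_mm <= far)

def hand_centroid(depth_mm, near=NEAR_MM, far=FAR_MM):
    """Estimate the hand's 2D image position as the centroid of the mask,
    or None when no pixel falls inside the band."""
    mask = segment_hand(depth_mm, near, far)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 depth frame (mm): background at 2000 mm, a "hand" patch at 400 mm.
depth = np.full((8, 8), 2000)
depth[2:5, 3:6] = 400
print(hand_centroid(depth))  # -> (4.0, 3.0)
```

In practice, a segmentation like this would feed a 3D hand localizer and the semi-transparent proxy-hand rendering described above.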
 
Uploaded by UVRLAB on Apr. 9, 2014
 
Address. (34141) KAIST N5 #2325,
291 Daehak-ro, Yuseong-gu,
Daejeon,
Republic of Korea
Phone. +82-42-350-5923