GSCT UVR LAB.
UVR Lab was founded in February 2001 at GIST to study and develop "Virtual Reality in Smart computing environments": systems that process multimodal input, perceive the user's intention and emotion, and respond to the user's requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and made a fresh start under the theme of "FUN in Ubiquitous VR."
Total: 42 posts
No. / Subject / Date
42
3D Finger CAPE (full video)
Youngkyoon Jang, Seung-Tak Noh, Hyung Jin Chang, Tae-Kyun Kim, Woontack Woo, "3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint," IEEE Transactions on Visualization and Computer Graphics (TVCG), vol. 21, no. 4, April 2015. (also presented in IEEE VR …
2016-04-03
41
An HMD-based remote collaboration system
An HMD-based remote collaboration system using hand-augmented object interaction in coexistence reality
2016-04-03
40
Smartwatch-assisted Robust 6-DOF Hand Tracking System for Object Manipulation in HMD-based AR
Hyung-il Kim, Woontack Woo, "Smartwatch-assisted Robust 6-DOF Hand Tracking System for Object Manipulation in HMD-based Augmented Reality." This work was accepted as a poster at the IEEE Symposium on 3D User Interfaces (3DUI) 2016 (People's Choice Award). We introduce a smartwatch-assisted sensor fusion ap… (A rough sketch of this kind of fusion follows this entry.)
2016-04-03
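The description above is truncated, but the work fuses smartwatch IMU data with HMD camera tracking to keep hand manipulation robust under occlusion. The sketch below is only a rough illustration of that kind of fusion under assumed inputs and weights (not the paper's method or API): it keeps a camera-derived hand position and blends the camera and smartwatch orientations into one 6-DOF pose.

# Illustrative sketch only: fuse a camera-based hand pose with a smartwatch IMU
# orientation into a single 6-DOF pose. Inputs, weights, and numbers are
# hypothetical placeholders, not the actual system's interface or parameters.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def fuse_pose(cam_position, cam_quat, imu_quat, imu_weight=0.8):
    """Keep the camera's translation estimate, but blend the (often occluded,
    noisy) camera orientation with the smartwatch IMU orientation.

    cam_position : (3,) hand position from the HMD-mounted tracker, in metres
    cam_quat     : (4,) camera-based orientation quaternion (x, y, z, w)
    imu_quat     : (4,) smartwatch orientation quaternion (x, y, z, w)
    imu_weight   : trust placed in the IMU orientation, in [0, 1]
    """
    rotations = Rotation.from_quat([cam_quat, imu_quat])
    slerp = Slerp([0.0, 1.0], rotations)        # spherical interpolation between the two estimates
    fused_rot = slerp([imu_weight])[0]
    return np.asarray(cam_position), fused_rot.as_quat()

# Example: the camera sees the hand 40 cm ahead but only partially observes a
# 90-degree wrist rotation that the watch reports in full.
position, orientation = fuse_pose(
    cam_position=[0.0, -0.1, 0.4],
    cam_quat=Rotation.from_euler("x", 30, degrees=True).as_quat(),
    imu_quat=Rotation.from_euler("x", 90, degrees=True).as_quat(),
)
print(position, orientation)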
39
2015 SK Creative Challenge 'IlumiAR' User scenario
2015 SK Creative Challenge, Grand Prize. Awardees: 전익범, 박혜빈, 이현진. Organized/hosted by: SKT 경제경영연구소 (SKT's economics and management research institute). Participants: a total of 8 teams from 4 universities, selected through preliminary rounds at each university.
2016-04-03
38
“DreamHouse”: NUI-based Photo-realistic AR Authoring System for Interior Design
Jinwoo Park, Sung Sil Kim, Hyerim Park, Woontack Woo, "DreamHouse: NUI-based Photo-realistic AR Authoring System for Interior Design," Proceedings of the 7th Augmented Human International Conference 2016, ACM, 2016. This paper proposes a system which enables users to have enhanced interior de…
2016-04-03
37
3D Finger CAPE
In this paper we present a novel framework for simultaneous detection of click action and estimation of occluded fingertip positions from egocentrically viewed single-depth image sequences. For the detection and estimation, a novel probabilistic inference based on knowledge priors of clicking motio… (A toy illustration of a click-motion prior follows this entry.)
2015-02-03
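The abstract above is cut off, but its core idea is scoring fingertip motion against a prior of what a clicking movement looks like, even while the fingertip itself is occluded. The toy snippet below is not the paper's inference; it only illustrates the general shape of such a detector, assuming a simple Gaussian prior on a quick push-forward-and-return of the fingertip depth (all amplitudes and thresholds are made-up placeholders).

# Toy illustration: score a short window of fingertip depths (from the egocentric
# depth camera) against a simple "click" prior: the fingertip pushes forward
# (depth increases) by roughly press_depth and then returns near its start depth.
import numpy as np

def click_log_likelihood(depths, press_depth=0.02, sigma=0.01):
    """Log-likelihood that the depth window (metres, oldest first) contains a click."""
    depths = np.asarray(depths, dtype=float)
    start, deepest, end = depths[0], depths.max(), depths[-1]
    push = deepest - start       # how far the fingertip pushed forward
    recovery = deepest - end     # how far it came back afterwards
    # Independent Gaussian priors on the push and recovery amplitudes.
    return -((push - press_depth) ** 2 + (recovery - press_depth) ** 2) / (2 * sigma ** 2)

def is_click(depths, threshold=-4.0):
    return click_log_likelihood(depths) > threshold

# A ~2 cm push-and-return looks like a click; a static fingertip does not.
print(is_click([0.30, 0.31, 0.32, 0.315, 0.30]))   # True
print(is_click([0.30, 0.30, 0.30, 0.30, 0.30]))    # False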
36
WeARHand
We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method uses no environmentally tethered tracking devices and localizes a pair of near-range and far-range RGB-D cameras mounted on a head-worn displ…
2014-07-29
35
CHIC - KAIST GSCT UVR Lab.
Uploaded by UVRLAB on Apr. 9, 2014    
2014-04-18
34
Multiple object recognition (work in progress)
This is an object recognition study that follows a computationally modeled account of human visual perception in order to support robust object recognition. The results shown in this video are from a prototype that follows that model. Saliency map detection… (A generic saliency-map example follows this entry.)
2014-04-18
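The post above only names the first stage, saliency map detection. The snippet below is not the lab's visual-perception model; it just shows a common off-the-shelf way to compute a static saliency map and a binary mask of salient regions with OpenCV's spectral-residual implementation (requires the opencv-contrib-python package; the file path and threshold are placeholders).

# Generic saliency-map example using OpenCV's spectral-residual static saliency.
# It stands in for the "saliency map detection" stage mentioned above; it is not
# the lab's computational model of human visual perception.
import cv2

def saliency_regions(image_path, thresh=0.5):
    """Return a float saliency map in [0, 1] and a binary mask of salient regions."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, saliency_map = saliency.computeSaliency(image)
    if not ok:
        raise RuntimeError("saliency computation failed")
    mask = (saliency_map > thresh).astype("uint8") * 255
    return saliency_map, mask

# Usage (the path is a placeholder):
# smap, mask = saliency_regions("scene.jpg")
# cv2.imwrite("saliency.png", (smap * 255).astype("uint8"))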
33
Hand tracking with gesture detection
Bare Hand User Interface based on a Video see-through HMD with a Near-range RGB-D Camera for Wearable Augmented Reality Interaction …
2014-04-18
 