UVR Lab. was formed in Feb. 2001 at GIST to study and develop "Virtual Reality in Smart computing environments" that process multimodal input, perceive users' intentions and emotions, and respond to users' requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and restarted with the theme of "FUN in Ubiquitous VR."
Point-and-Shoot for Ubiquitous Tagging
Author: UVR
Date: 2012-04-09
Views: 8,852

Uploaded by GIST CTI on Aug. 23, 2010
This video demonstrates our online patch learning and detection algorithm proposed in our ISMAR 2010 paper: W. Lee, Y. Park, V. Lepetit, and W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," International Symposium on Mixed and Augmented Reality (ISMAR), 2010.

Our algorithm learns the appearance of planar targets in situ, and the targets are then detected in real time on mobile phones. Even when a fronto-parallel view of the target is unavailable, our method generates the frontal view automatically.
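Generating a frontal view of a planar target from a slanted camera view amounts to estimating a homography between the observed quadrilateral and an upright rectangle. The sketch below shows the standard Direct Linear Transform for this step; the corner coordinates and rectangle size are hypothetical, and the paper's actual rectification pipeline may differ.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from 4+ point pairs
    using the Direct Linear Transform (smallest singular vector of A)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Corners of the tilted target as observed in the camera image
# (hypothetical pixel coordinates) ...
observed = [(120, 80), (420, 60), (460, 340), (100, 360)]
# ... mapped to the corners of an upright, fronto-parallel rectangle.
frontal = [(0, 0), (300, 0), (300, 300), (0, 300)]

H = homography_dlt(observed, frontal)
# Warping the image with H (e.g., via inverse mapping of each output
# pixel) produces the frontal view; here we just check one corner.
p = H @ np.array([120.0, 80.0, 1.0])
print(int(round(p[0] / p[2])), int(round(p[1] / p[2])))  # -> 0 0
```

With exactly four correspondences the homography is determined up to scale, so the mapped corners reproduce the target rectangle exactly; with more (noisy) matches the same SVD gives a least-squares estimate.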

The online learning capability allows users to interact with previously unknown real-world objects. In addition, the learned target data can be shared with nearby mobile phones over Bluetooth, so a collaborative mobile AR space can be built in situ.
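Sharing a learned target between phones requires serializing its keypoints and descriptors into a compact payload for the Bluetooth link. The paper does not specify a wire format, so the sketch below is a hypothetical packing (little-endian header plus fixed-length descriptors) purely to illustrate the idea.

```python
import struct

def pack_target(target_id, keypoints, descriptors):
    """Serialize a learned target (hypothetical format, not the paper's):
    keypoints are (x, y) floats; descriptors are equal-length byte strings."""
    assert len(keypoints) == len(descriptors)
    desc_len = len(descriptors[0]) if descriptors else 0
    header = struct.pack("<IHH", target_id, len(keypoints), desc_len)
    body = b"".join(
        struct.pack("<ff", x, y) + d
        for (x, y), d in zip(keypoints, descriptors)
    )
    return header + body

def unpack_target(payload):
    """Inverse of pack_target: recover id, keypoints, and descriptors."""
    target_id, n, desc_len = struct.unpack_from("<IHH", payload, 0)
    off, kps, descs = 8, [], []
    for _ in range(n):
        x, y = struct.unpack_from("<ff", payload, off)
        off += 8
        descs.append(payload[off:off + desc_len])
        off += desc_len
        kps.append((x, y))
    return target_id, kps, descs

# Round-trip one target; the payload bytes are what would travel over Bluetooth.
pkt = pack_target(7, [(10.5, 20.0)], [b"\x01\x02\x03\x04"])
tid, kps, descs = unpack_target(pkt)
print(tid, kps, descs)  # -> 7 [(10.5, 20.0)] [b'\x01\x02\x03\x04']
```

Fixed-length binary packing keeps the payload small, which matters for the limited bandwidth of a Bluetooth link between phones.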


ADD. (34141)KAIST N5 2325, 291 Daehak-ro, Yuseong-gu, Daejeon, Republic of Korea / TEL. +82-42-350-5923