UVR Lab was formed in February 2001 at GIST to study and develop “Virtual Reality in Smart computing environments” that process multimodal input, perceive the user’s intention and emotion, and respond to the user’s requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and restarted with the theme “FUN in Ubiquitous VR.”
This video demonstrates our online patch learning and detection algorithm proposed in our ISMAR 2010 paper: W. Lee, Y. Park, V. Lepetit, and W. Woo, "Point-and-Shoot for Ubiquitous Tagging on Mobile Phones," International Symposium on Mixed and Augmented Reality (ISMAR), 2010.
Our algorithm learns the appearance of a planar target in situ, and the target is then detected in real time on mobile phones. Even when a fronto-parallel view of the target is unavailable, our method generates the frontal view automatically.
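The frontal-view generation can be understood as a plane-to-plane homography warp. Below is a minimal sketch in Python with OpenCV, not the paper's implementation: the four target corners and the patch resolution are illustrative assumptions (the paper recovers the frontal view automatically), and ORB features merely stand in for the paper's patch learning.

```python
import cv2
import numpy as np

# Resolution of the rectified frontal patch -- an arbitrary choice for
# illustration, not a value from the paper.
PATCH_W, PATCH_H = 320, 240

def rectify_to_frontal(frame, corners):
    """Warp an obliquely viewed planar target to a fronto-parallel patch.

    frame   -- BGR camera image (numpy uint8 array)
    corners -- 4x2 array of target corners in the image, clockwise from
               top-left; assumed given here (e.g. marked by the user)
    """
    src = np.asarray(corners, dtype=np.float32)
    dst = np.array([[0, 0], [PATCH_W, 0], [PATCH_W, PATCH_H], [0, PATCH_H]],
                   dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)  # plane-to-plane homography
    return cv2.warpPerspective(frame, H, (PATCH_W, PATCH_H))

def learn_target(patch):
    """Extract keypoints and descriptors from the rectified patch.
    ORB is a stand-in for the paper's patch classifier."""
    gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=500)
    return orb.detectAndCompute(gray, None)
```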
The online learning capability allows users to interact with previously unknown real-world objects. In addition, the learned target data can be shared with nearby mobile phones via Bluetooth, so a collaborative mobile AR space can be built in situ.
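To illustrate the sharing step, here is a hedged sketch of packing a learned target into bytes that could travel over any Bluetooth socket and matching it on the receiving phone. The pickle-based serialization and brute-force Hamming matching are assumptions for illustration, not the system's actual protocol.

```python
import pickle
import cv2

def pack_target(keypoints, descriptors):
    """Serialize a learned target into bytes for transmission.
    cv2.KeyPoint objects are not picklable, so only the fields
    needed later for pose estimation are kept."""
    pts = [(kp.pt, kp.size, kp.angle) for kp in keypoints]
    return pickle.dumps({"keypoints": pts, "descriptors": descriptors})

def detect_shared_target(payload, frame):
    """On the receiving phone: unpack the shared target and match it
    against a live camera frame."""
    target = pickle.loads(payload)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=500)
    _, frame_des = orb.detectAndCompute(gray, None)
    if frame_des is None:
        return []  # no features found in this frame
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(target["descriptors"], frame_des)
    # Strong (low-distance) matches first; enough of them suggests
    # the shared target is visible in the frame.
    return sorted(matches, key=lambda m: m.distance)
```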