UVR Lab. was formed in Feb. 2001 at GIST to study and develop “Virtual Reality in Smart Computing Environments” that process multimodal input, perceive users’ intentions and emotions, and respond to users’ requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and restarted with the theme of “FUN in Ubiquitous VR.”
Human reconstruction for telepresent interaction

  • Final Research Goal
    Since Microsoft Holoportation and Facebook Spaces, research on tele-presence systems for remote interaction between users has been actively conducted. However, many prototype systems are limited because they require impractical hardware configurations or provide only a weak sense of tele-presence. In this project, we study and develop reconstruction technology, a key factor in providing strong tele-presence. Our system is based on commodity sensors that acquire three-dimensional information about the human body, such as Microsoft Kinect or Intel RealSense.
     
  • Research Foci
    - Hand tracking
    - Telepresence
    - Interaction with remote users
  • Expected contribution
    (1) Body and facial reconstruction – Reconstruct the user’s body and facial mesh from RGB-D camera images, and extract the user’s motion through a deep-learning-based skeleton extraction method.
    (2) Hand reconstruction – Reconstruct the user’s hand mesh and extract hand motion from the input images of a small RGB-D camera attached to the AR headset, and improve the precision of reconstruction and motion extraction using wrist-worn IMU sensor information.
    (3) Gaze tracking – Estimate the position of the pupil and track the user’s gaze using input images from a near-infrared camera attached to the AR headset, and detect facial expressions based on feature points.
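The gaze-tracking step relies on locating the pupil in a near-infrared eye image. A minimal sketch of one common approach (dark-region thresholding followed by a centroid computation, using NumPy only; the threshold value and the synthetic image are illustrative assumptions, not the project’s actual pipeline):

```python
import numpy as np

def estimate_pupil_center(ir_image, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.

    In near-infrared eye images the pupil appears as the darkest
    region; thresholding and averaging pixel coordinates gives a
    rough center estimate. Returns (row, col), or None if no pixel
    falls below the threshold.
    """
    ys, xs = np.nonzero(ir_image < threshold)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic 120x160 "eye image": bright background, dark pupil disc at (60, 80).
img = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
img[(yy - 60) ** 2 + (xx - 80) ** 2 < 15 ** 2] = 20  # pupil radius ~15 px

cy, cx = estimate_pupil_center(img)
print(round(cy), round(cx))  # centroid should land near (60, 80)
```

Real systems refine this rough estimate, e.g. with ellipse fitting on the pupil contour, before mapping the pupil position to a gaze direction.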
     
  • Application
    (1) Virtual classroom – Improves presence and learning effectiveness through sight alignment between the instructor and the learner.
    (2) Tele-medicine – Provides medical services for remote patients.
    (3) Virtual personal training (PT) – Lets the learner observe the instructor’s demonstrated pose from any desired viewpoint, and can be extended to sports where posture accuracy matters, such as golf or yoga.
     
  • Sponsor
    This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science and ICT (NRF-2017M3C4A7066316).
     
 
Geometry-aware Interactive AR Authoring using a Smartphone in a 3D Glass Environment

  • Final Research Goal 
    The final goal of this project is an authoring method that lets a user easily build an augmented reality world in situ and place 3D virtual content in it. Wearing an RGB-D camera attached to an HMD, the user obtains 3D image features from an unknown space and interactively aligns a local reference coordinate frame using depth information. The user can then easily add 3D virtual objects and set their dynamic reaction properties from various distances.
     
  • Research Foci
    - Automatic user localization and registration between real world and virtual world.
    - Authoring tool for static or dynamic properties of AR contents.
    - Interaction with virtual objects
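Aligning a local reference coordinate frame from depth information typically starts by fitting a dominant plane (e.g., a tabletop or floor) to the depth points. A minimal sketch under that assumption, fitting a plane by SVD and deriving an orthonormal frame (illustrative only; the registration method used in the actual system may differ):

```python
import numpy as np

def fit_local_frame(points):
    """Fit a plane to 3D points and return (origin, R).

    origin is the centroid of the points; the columns of R are two
    in-plane axes and the plane normal, together forming a local
    reference coordinate frame anchored on the fitted surface.
    """
    centroid = points.mean(axis=0)
    # SVD of the centered points: the right singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    x_axis, y_axis, normal = vt  # rows of Vt are orthonormal
    R = np.column_stack([x_axis, y_axis, normal])
    return centroid, R

# Noisy samples of the plane z = 0.5, e.g. a tabletop seen by an RGB-D camera.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(500, 3))
pts[:, 2] = 0.5 + rng.normal(0, 0.01, size=500)

origin, R = fit_local_frame(pts)
print(np.round(np.abs(R[:, 2]), 2))  # recovered normal should be close to [0, 0, 1]
```

Virtual objects authored relative to this frame stay registered to the real surface as the camera moves, as long as the frame itself is tracked.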
     
  • Expected Contribution
    - AR simulation registered with real space improves spatial perception and helps decision-making.
    - The proposed authoring tool reduces the time cost of AR authoring.
    - AR content registered with real space opens a new cultural content industry and can be applied to various fields, including education and games.
     
  • Application
    - AR based tour guide in museum, lecture / education, military simulation, etc.

  • Sponsor
    - This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea Government (MEST) (NRF-2014R1A2A2A01003005).
 
K-Culture Time Machine 2.0: Enhancing Korean Cultural Heritage Tour
Through Outdoor Mobile Augmented Reality Cultural Contents


  • Final Research Goal 
    The “K-Culture Time Machine 2.0” project was developed to provide users with both representations of cultural heritage and on-site and remote exploration experiences. The application delivers relevant information and multimedia content depending on the context of the environment as well as the user’s profile. In addition, content creators use our web-based authoring tool to anchor information and multimedia content to the real world. The generated content can be used to produce tours such as video pilgrimages (tours of dramas or TV shows filmed at a heritage site) or space-telling (tours based on historical narratives). Lastly, users can also create 3D virtual content at a heritage site to share with friends or other tourists. Underlying all of this, we proposed and implemented a 5W1H (what, when, where, who, why, and how) model-based metadata structure for context-aware cultural heritage tourism.
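The 5W1H metadata model can be illustrated as a simple record tagging a content item with its six context facets. A minimal sketch (the field semantics and the sample entry are illustrative assumptions, not the project’s actual schema):

```python
from dataclasses import dataclass, asdict

@dataclass
class FiveW1H:
    """One 5W1H metadata record for a piece of cultural heritage content."""
    who: str    # person or group associated with the content
    what: str   # the heritage object or event itself
    when: str   # temporal context (era or date)
    where: str  # spatial context (site, coordinates)
    why: str    # purpose or historical significance
    how: str    # medium or method of presentation

# Hypothetical example entry for a heritage-site AR annotation.
record = FiveW1H(
    who="King Sejong",
    what="Geunjeongjeon Hall",
    when="Joseon dynasty",
    where="Gyeongbokgung Palace, Seoul",
    why="Throne hall used for state ceremonies",
    how="AR overlay with narration",
)
print(sorted(asdict(record)))  # the six facet keys
```

Keeping every item tagged along the same six facets is what enables context-aware retrieval, e.g. querying by place and era for a user standing at the site.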
     
  • Research Foci
    - Integrating technologies to access multidimensional cultural content
      Cooperation with the relevant agencies is required because it is difficult to gather the data individually. Moreover, because each cultural content agency uses different metadata, it is difficult to keep the linked data model for spatiotemporal information integration consistent.
    - AR/VR-based visualization technique of space-time cultural content
      o Automated 3D image feature map generation based on SLAM
      o Mobile visualization on cultural content based on augmented and virtual reality
      o AR/VR-based space-time cultural content visualization on head mounted display
    - Participation- and sharing-enabled open cultural content platform
      o Multidimensional query processing techniques for combined temporal and spatial content
      o Open service platform technology supporting the linking and convergence of heterogeneous cultural content
    - Prototype services to disseminate spatiotemporal cultural content
      o 2D/3D map-based fusion of cultural content and spatial information
      o Mobile and web-based AR/VR pilot service development
      o Technology for packaging and distributing spatiotemporal cultural content
     
  • Expected Contribution
    We developed a prototype application of the K-Culture Time Machine, an in-situ authoring and viewing system for augmented reality-based tour guidance, which helps users easily create and experience augmented reality scenes of cultural heritage.
     
  • Sponsor
    This research was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) under the Culture Technology (CT) Research & Development Program 2014.


 
WISE (Wearable Interface for Sustainable Enhancement): An AR UI/UX Platform for Smart Glasses

  • Final Research Goal
     o Overcoming the limitations of smart glasses UIs and putting them to practical use
      - WISE (Wearable Interface for Sustainable Enhancement): sustainably improving users’ abilities in everyday life
      - Recognition of the augmented space and of individual objects within it
      - Using augmented reality objects as input/output tools
      - Interface configuration and utilization oriented to the user’s attention and intention

     o Accumulating software results through open-SW research and development, and leading and revitalizing the global community
      - Securing original technology and promoting its industrial use through open-sourcing of the developed technology
      - Vitalizing the smart glasses ecosystem through technology exchange and cooperation with major overseas companies and research institutes
     
  • Research Contents
     o Augmented Space-Time: estimating the user’s viewpoint and performing 3D reconstruction and object-level tracking of the surrounding environment
      - Real-time viewpoint estimation of the smart glasses user in 3D space
      - 3D reconstruction of the static space around the user
      - Tracking objects around the user and reconstructing the space around those objects
     o WISE Interface: attention estimation integrating the surrounding environment and user motion information
      - Tracking the user’s explicit attention and using it for interaction
      - Tracking the user’s implicit attention based on user-specific characteristics and using it for interaction
      - Augmenting information and content in context with the user’s status and surroundings
     o Mediated Experience: smart glasses input/output and information augmentation using objects around the user
      - Interaction methods that consider the meaning of the target object and the user’s intention
      - Physical manipulation of virtual objects using real objects
      - Controlling virtual object functions (additional properties) using real objects
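Explicit attention tracking of the kind described above is often realized with a dwell-time trigger: an object counts as "attended" once the gaze stays on it long enough. A minimal sketch of such a trigger (the 500 ms dwell threshold and the sample event stream are illustrative assumptions, not the project’s actual interface):

```python
class DwellSelector:
    """Fire a selection when gaze dwells on one target long enough."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.target = None
        self.start_ms = None

    def update(self, target, t_ms):
        """Feed one gaze sample; return the selected target or None."""
        if target != self.target:       # gaze moved: restart the timer
            self.target = target
            self.start_ms = t_ms
            return None
        if target is not None and t_ms - self.start_ms >= self.dwell_ms:
            self.start_ms = t_ms        # re-arm after firing
            return target
        return None

# Gaze samples as (target-under-gaze, timestamp in ms).
sel = DwellSelector(dwell_ms=500)
samples = [("menu", 0), ("menu", 200), ("menu", 600), (None, 700)]
events = [sel.update(tgt, t) for tgt, t in samples]
print(events)  # "menu" is selected once the 500 ms dwell has elapsed
```

Implicit attention tracking would replace the fixed dwell threshold with a model conditioned on user-specific characteristics, but the event-stream structure stays the same.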