UVR Lab. was formed in Feb. 2001 at GIST to study and develop "Virtual Reality in Smart computing environments" that process multimodal input, perceive users' intentions and emotions, and respond to users' requests through Augmented Reality. In 2012, UVR Lab moved to KAIST GSCT and restarted with the theme of "FUN in Ubiquitous VR."
Trans-Space: Research on Real-time Hand Tracking and
Hand-based Interaction for 4D+ Augmented Avatar-mediated Remote Collaboration


  • Final Research Goal
    The final goal of this research is to achieve "Social Presence-oriented 4D+ Trans-Space Convergence," which enables users to overcome spatio-temporal limitations and to share information, knowledge, and atmosphere with a high level of co-presence.
     
  • Achievement in the 1st Step: Mirrored-Space
    In the 1st step, we developed Mirrored-Space convergence technology. We assumed that each user in a local space is equipped with a video see-through HMD. In this setting, we first obtained knowledge of the real-world physical space through computer-vision techniques such as 3D space reconstruction and object recognition. Second, we defined the Mirrored-Space, in which additive digital information is registered to the real-world space and modified in real time, so that the system can manage real-world information digitally. With this approach, the user can explore and interact with the information tagged onto real-world objects.
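As a minimal sketch of this idea, the registry below attaches digital annotations to recognized real-world objects and keeps their poses current as the reconstruction updates. The object names, pose format, and API are illustrative assumptions, not the lab's actual implementation.

```python
# Minimal sketch of a Mirrored-Space registry: digital information is
# registered to recognized real-world objects and kept consistent with
# the reconstructed space. All names and values are illustrative.
from dataclasses import dataclass, field

@dataclass
class MirroredObject:
    object_id: str                       # ID from the object recognizer
    pose: tuple                          # (x, y, z) in the reconstructed space
    annotations: list = field(default_factory=list)

class MirroredSpace:
    def __init__(self):
        self.objects = {}

    def register(self, object_id, pose):
        # Called when 3D reconstruction/recognition detects an object.
        self.objects[object_id] = MirroredObject(object_id, pose)

    def tag(self, object_id, info):
        # Attach digital information to a real-world object.
        self.objects[object_id].annotations.append(info)

    def update_pose(self, object_id, pose):
        # Keep the digital twin consistent when the object moves.
        self.objects[object_id].pose = pose

space = MirroredSpace()
space.register("desk", (0.0, 0.0, 1.2))
space.tag("desk", "meeting notes")
space.update_pose("desk", (0.1, 0.0, 1.2))
```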
     
  • Research Goal in the 2nd Step: Interaction in Digi-Log Space
    In the 2nd step, we developed "4D+ Trans-Space Convergence Technology," which supports realistic hand-augmented-object interaction among users equipped with HMDs in their own local spaces. In this system, we first tracked the relative pose of the HMD in its Mirrored-Space, as well as the 3D articulation of the user's hand from the egocentric view. This allowed the user to interact with augmented objects, i.e., virtual objects registered to the real world. In addition, we defined the Trans-Space as the merged Mirrored-Spaces for collaboration. Each user can instantly determine his or her own basis, which then serves as the reference for merging the other users' Mirrored-Spaces. Once the Trans-Space is determined, an avatar representing the remote user is augmented into the local user's space. With this approach, the system provides a realistic collaborative workspace to users in remote spaces.
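The merging step can be sketched as a change of coordinates: each user picks a basis pose in his or her own Mirrored-Space, the two bases are assumed to coincide in the Trans-Space, and a remote avatar pose is re-expressed in the local frame. The pose helper and the sample values below are illustrative assumptions, not the project's actual algorithm.

```python
# Sketch of Trans-Space merging via user-chosen bases (illustrative).
import numpy as np

def pose(tx, ty, tz, yaw=0.0):
    """Build a 4x4 rigid transform from a translation and a yaw angle."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, tz]
    return T

def to_local(local_T_basis, remote_T_basis, remote_T_avatar):
    """Map an avatar pose from the remote Mirrored-Space into the local one.

    Both users agree that their chosen bases coincide in the Trans-Space,
    so local_T_remote = local_T_basis @ inv(remote_T_basis).
    """
    local_T_remote = local_T_basis @ np.linalg.inv(remote_T_basis)
    return local_T_remote @ remote_T_avatar

local_T_basis = pose(1.0, 0.0, 0.0)    # basis chosen by the local user
remote_T_basis = pose(0.0, 2.0, 0.0)   # basis chosen by the remote user
remote_T_avatar = pose(0.0, 2.0, 0.0)  # remote user stands at their basis
local_T_avatar = to_local(local_T_basis, remote_T_basis, remote_T_avatar)
# The avatar lands exactly at the local basis position (1, 0, 0).
```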
     
  • Research Goal in the 3rd Step: Collaboration in Coexistence Reality
    In the 3rd step, we research a hand-based collaboration system in egocentric coexistence reality (CR), which enables a local user to collaborate with multiple distant users. To achieve this, we combine the following core technologies: 1) a hand-augmented-object interaction method that supports remote collaboration in the CR space by tracking bare hands, or hands grasping objects, from the egocentric view; and 2) a two-hands-body information fusion method for realistically reproducing a remote user's movement in the CR space. This approach enables direct interaction between the user's bare hands (or hands grasping objects) and 3D augmented objects as the user moves through the 4D+ Trans-Space. In addition, the user can work together with remote users while feeling spatial co-presence in the CR space.
     
  • Research Foci
     o RGB-D Image-based Real-time 3D Two-Hand (Bare Hands or Hands Grasping Real-world Objects) Articulation Recognition/Tracking and Registration in the Egocentric Viewpoint
     o Interaction with Augmented Objects via a Virtual Hand Model Registered to the User's Hand
     o Body-Hand Fusion Method based on Exo- and Egocentric Tracking Data for the Remote Avatar
     o General Method for Merging Multiple Mirrored-Spaces
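The body-hand fusion focus can be illustrated with a toy example, assuming the exocentric tracker supplies a coarse body skeleton (including a wrist position) and the egocentric tracker supplies finger joints relative to that wrist. A real fusion method would also align orientation and time-synchronize the two streams; the joint names and values here are illustrative.

```python
# Toy sketch of body-hand fusion: egocentric hand joints (wrist-relative)
# are re-expressed in the room frame at the exocentrically tracked wrist.
def fuse(body_joints, hand_joints_wrist_frame, wrist_name="wrist_r"):
    """Translate wrist-relative hand joints to the tracked wrist position."""
    wx, wy, wz = body_joints[wrist_name]
    return {name: (wx + x, wy + y, wz + z)
            for name, (x, y, z) in hand_joints_wrist_frame.items()}

body = {"head": (0.0, 1.7, 0.0), "wrist_r": (0.3, 1.1, 0.2)}   # exocentric
hand = {"thumb_tip": (0.05, 0.02, 0.0),                        # egocentric,
        "index_tip": (0.08, 0.01, 0.0)}                        # wrist-relative
fused = fuse(body, hand)
# The index fingertip lands near (0.38, 1.11, 0.2) in the room frame.
```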
     
  • Expected Contribution
    A typical user wearing AR glasses can seamlessly experience COI-based intelligent information by intuitively authoring, augmenting, and interacting with the 4D+ mirror world while moving indoors. In other words, human activity can be spatially and temporally extended by sharing and experiencing useful mirror-world information in the real world.
     
  • Application
    AR-based time/space-transcending smart work (remote conferencing), next-generation experiential education (science, math, art, history, etc.), AR simulation (ecology, environment, military, urban planning, architecture, disaster, etc.), video-based information surveying (traffic, security, etc.), AR medical information, and AR entertainment.
     
  • Sponsor
    This work was supported by the Global Frontier R&D Program funded by the National Research Foundation of Korea grant funded by the Korean Government (MSIP) (NRF-2010-0029751).
     
 
Geometry-aware Interactive AR Authoring using a Smartphone in a 3D Glass Environment

  • Final Research Goal 
    The final research goal of this project is an authoring method that enables a user to easily build an augmented reality world in an in-situ environment and manipulate 3D virtual content within it. In our approach, a user wears an RGB-D camera attached to an HMD, obtains 3D image features from the unknown 3D space, and interactively aligns a local reference coordinate frame using depth information. The user can then easily add 3D virtual objects and set dynamic reaction properties from various distances.
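The alignment step can be sketched as fitting a dominant plane to depth samples and using it as the local reference for placing virtual objects. A real pipeline would run RANSAC over the full RGB-D point cloud; the least-squares fit and the synthetic "table" points below are illustrative assumptions.

```python
# Sketch of depth-based reference alignment: fit a plane to depth points
# (SVD least squares) and anchor a virtual object above it. Illustrative.
import numpy as np

def fit_plane(points):
    """Return (centroid, unit normal) of the best-fit plane through points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The normal is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    if normal[2] < 0:                  # orient the normal consistently (up)
        normal = -normal
    return centroid, normal

def place_on_plane(centroid, normal, offset=0.05):
    """Anchor a virtual object slightly above the fitted plane."""
    return centroid + offset * normal

# Synthetic depth samples from a horizontal surface at z = 0.8 m.
table = [(x, y, 0.8) for x in (0.0, 0.1, 0.2) for y in (0.0, 0.1)]
c, n = fit_plane(table)
anchor = place_on_plane(c, n)          # ~0.05 m above the table surface
```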
     
  • Research Foci
    - Automatic user localization and registration between the real world and the virtual world
    - Authoring tools for static and dynamic properties of AR content
    - Interaction with virtual objects
     
  • Expected Contribution
    - AR simulation registered to real space improves spatial perception and aids decision-making.
    - The proposed authoring tool reduces the time cost of AR authoring for users.
    - AR content registered to real space constitutes a new cultural content industry and will find application in various fields, including education and games.
     
  • Application
    - AR-based tour guides in museums, lectures and education, military simulation, etc.

  • Sponsor
    - This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea Government (MEST) (NRF-2014R1A2A2A01003005).
 
K-Culture Time Machine: Development of Creation and
Provision Technology for Time•Space-connected Cultural Contents


  • Final Research Goal 
    The "K-Culture Time Machine" project develops technologies to structure diverse cultural content from associated organizations and projects, including the "Cultural Heritage hub-bank"; to construct new cultural content connected through time and space; and to provide the structured content to industries (culture, tourism, IT) and the public. To integrate heterogeneous datasets, designing a new data model is a vital part of the project. Just as Europeana designed a data model to integrate and link datasets across Europe's cultural institutions, we devised a new data model that encompasses a wide range of metadata from Korean cultural institutions.
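In the spirit of such a data model, the sketch below maps a heterogeneous institutional record onto one common schema with explicit time and space fields. The field names, the mapping function, and the sample record are illustrative assumptions, not the project's actual model.

```python
# Sketch of a common spatiotemporal metadata record (illustrative), in the
# spirit of Europeana-style data models that link heterogeneous datasets.
from dataclasses import dataclass

@dataclass
class CulturalRecord:
    identifier: str
    title: str
    period: tuple        # (start_year, end_year)
    location: tuple      # (latitude, longitude)
    source: str          # originating institution / hub-bank

def from_hubbank(raw):
    """Map one (hypothetical) hub-bank row onto the common schema."""
    return CulturalRecord(
        identifier=raw["id"],
        title=raw["name"],
        period=(raw["from"], raw["to"]),
        location=(raw["lat"], raw["lon"]),
        source="hub-bank",
    )

rec = from_hubbank({"id": "CH-001", "name": "Seokguram Grotto",
                    "from": 742, "to": 774, "lat": 35.795, "lon": 129.349})
```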
     
  • Research Foci
    - Integration technologies for accessing multidimensional cultural content
      Cooperation with the relevant agencies is required to collect data, because it is difficult to gather the data individually. In addition, because the metadata of each cultural content agency varies, it is difficult to maintain the consistency of the linked data model for spatiotemporal information integration.
    - AR/VR-based visualization techniques for space-time cultural content
      o Automated 3D image-feature-map generation using image capture and a wearable display
      o Mobile-device-based augmented and virtual reality techniques for visualizing cultural content
      o Head-mounted wearable-device-based AR/VR visualization of space-time cultural content
    - Participation- and sharing-enabled open cultural content platform
      o Multidimensional query-processing techniques for combined time and space content
      o Open service platform technology supporting heterogeneous cultural content, linkage, and convergence
    - Prototype services to disseminate space-time cultural content
      o 2D/3D map-based fusion of cultural content and spatial information
      o Mobile and web-based AR/VR pilot service development
      o Technology to create and distribute space-time combined cultural content packages
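The multidimensional query focus can be illustrated by filtering records on a time interval and a spatial bounding box together. The record layout and data are illustrative assumptions; a production platform would use a spatiotemporal index rather than a linear scan.

```python
# Sketch of a combined time-and-space query over cultural content records.
def query(records, year_range, bbox):
    """Return records overlapping year_range and inside bbox (lat/lon)."""
    (y0, y1), (lat0, lat1, lon0, lon1) = year_range, bbox
    return [r for r in records
            if r["from"] <= y1 and r["to"] >= y0           # time overlap
            and lat0 <= r["lat"] <= lat1                    # latitude bound
            and lon0 <= r["lon"] <= lon1]                   # longitude bound

records = [
    {"id": "CH-001", "from": 742, "to": 774, "lat": 35.79, "lon": 129.35},
    {"id": "CH-002", "from": 1395, "to": 1405, "lat": 37.58, "lon": 126.98},
]
hits = query(records, (700, 800), (35.0, 36.0, 129.0, 130.0))
# Only CH-001 falls in both the 8th-century window and the Gyeongju-area box.
```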
     
  • Expected Contribution
    We developed a prototype application of the K-Culture Time Machine, an in-situ authoring and viewing system for augmented reality-based tour guidance, which helps a user easily create and experience an augmented reality scene of a cultural heritage site.
     
  • Sponsor
    This research was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) under the Culture Technology (CT) Research & Development Program 2014.