Meta-Object
The ultimate goal of the Meta-Object research is to enable a transformative reality-virtuality convergence by facilitating seamless interactions that transcend spatial and temporal barriers. This is to be achieved by implementing next-generation virtual objects that inherit the form, properties, and functions of their real-world counterparts, allowing for smooth synchronization, interaction, and sharing between the physical and virtual worlds. This will all be realized within a post-metaverse intelligent simulation platform through wearable AR/MR/VR devices.
This research centers on the concept of the 'Meta-Object'. This includes the creation of Meta-Objects, the user experience with them, and the economic system within a 'post-metaverse' platform. Specifically, it focuses on property-embedded modeling for physical and action realism, adaptive multisensory feedback tailored to user interactions, and a scene graph-based intelligence simulation platform for scalable and efficient ecosystem integration.
Meta-Objects can be utilized to provide immersive experiences through wearable AR/VR devices. Specific examples include scenarios where controlling a virtual drone causes a real drone to mirror its actions in real-time, and providing adaptive haptic feedback for different materials. They can also be applied to a wide range of fields, including remote collaboration, urban planning, resource distribution, education, and industry (vividly recreating past experiences or enabling risky or costly experiences).
This research is expected to make academic and technological contributions by presenting the 'Meta-Object' concept as a foundational unit for communication, collaboration, and co-creation in environments where the distinctions between virtual and real are blurred. It aims to pave the way for the next generation of human-computer interaction through a scalable, inclusive, and persistent shared experience platform (the post-metaverse). Furthermore, it is anticipated to provide a framework that overcomes the limitations of existing virtual objects by enabling bidirectional interaction and realistic multisensory experiences.
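The core idea above — a virtual object that inherits properties from its real counterpart and keeps both sides synchronized — can be illustrated with a minimal sketch. This is not the project's actual implementation; the `MetaObject` class, its fields, and the drone example are all hypothetical names chosen for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class MetaObject:
    """Illustrative virtual object mirroring a real-world counterpart."""
    name: str
    properties: dict                      # physical properties inherited from the real object
    pose: tuple = (0.0, 0.0, 0.0)         # pose shared by the physical and virtual instances
    listeners: list = field(default_factory=list)

    def move(self, pose):
        """Update the pose and notify every registered mirror (real or virtual)."""
        self.pose = pose
        for notify in self.listeners:
            notify(self.name, pose)

# Example: a virtual drone whose motion is echoed to a (stubbed) real-drone controller.
drone = MetaObject("drone", {"mass_kg": 0.4, "material": "plastic"})
echoed = []
drone.listeners.append(lambda name, pose: echoed.append((name, pose)))
drone.move((1.0, 2.0, 0.5))
```

The listener callback is where a real device controller (or a haptic renderer selecting feedback for `properties["material"]`) would plug in, giving the bidirectional physical-virtual coupling described above.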
Publications (17)
Yoonseok Shin, Minju Baeck, Woontack Woo "Region-Guided Interactive Docent for Paintings in Mixed Reality", 2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
#Poster/Demo/Workshop Conf. Int.
Taejun Son, Juyoung Lee, Seoyoung Oh, Woontack Woo "Round-Trip2 Gesture: Inplace IMU Gesture Recognition with Visual Guidance for Out-of-FOV Interaction", 2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
#Poster/Demo/Workshop Conf. Int.
Minju Baeck, Yoonseok Shin, Dooyoung Kim, Hyunjin Lee, Woontack Woo "Visuo-Tactile Feedback with Hand Outline Styles for Modulating Affective Roughness Perception", IEEE Transactions on Visualization and Computer Graphics
[Link]
#SCI Journal Int.
Dooyoung Kim, Jinseok Hong, Heejeong Ko, Woontack Woo "Viewpoint-Tolerant Depth Perception for Shared Extended Space Experience on Wall-Sized Display", IEEE Transactions on Visualization and Computer Graphics
[Link]
#SCI Journal Int.
Dooyoung Kim, Taewook Ha, Jinseok Hong, Seonji Kim, Selin Choi, Heejeong Ko "Meta-Objects: Interactive and Multisensory Virtual Objects Learned From the Real World for Use in Augmented Reality", IEEE CG&A 2025
https://doi.org/10.1109/MCG.2025.3555901
Journal Int.
이승재, 오서영, 우운택 "Enhancing Grasping Realism in Virtual Environments: Applying a Grip-Force Ceiling to Friction-Based Grasping", 2025 Korea Computer Congress (KCC2025)
[Link]
#Poster/Demo/Workshop Conf. KR
손태준, 조우진, 우운택 "Intuitive Augmented Reality Interaction Based on Dynamic Hand Gesture Recognition for Virtual Objects", 2025 Korea Computer Congress (KCC2025)
[Link]
Conf. KR
Yoonseok Shin, Meng Yu Chiu, Yunseo Chang, Akanksha Jain, Jaehong Ahn, Woontack Woo "Interactive Digital Upcycling System Using Image and 3D Mesh Generative AI Model", 2025 Korea Computer Congress (KCC 2025)
[Link]
Conf.
Taewook Ha, Selin Choi, Seonji Kim, Dooyoung Kim, Woontack Woo "Human-Scene Interaction Data Generation with Virtual Environment using User-Centric Scene Graph", Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW66409.2025.00357
#Poster/Demo/Workshop
신윤석, 우운택 "Segmentation of Hand-Object Interaction Videos via Hand Pose and Depth Estimation", 2024 Korea Software Congress (KSC 2024)
[Link]
Conf. KR
최세린, 백민주, 우운택 "Undo Slider for Multi-Step Editing in Collaborative AR Environments", 2024 Korea Software Congress (KSC 2024)
[Link]
Conf. KR
우운택, 김선지, 김두영, 신재은 "Shared Virtual Space Generation Using a Geometric Spatial Affordance Graph for Registering Object Clusters Across Dissimilar Spaces", filed, 10-2024-0165710
Patent KR
우운택, 김두영 "Space Extension Technique and System Based on Human Stereoscopic Perception for Multi-User Sharing on a Large Wall Display", filed, 10-2024-0184973
Patent KR
Juyoung Lee, Seo Young Oh, Minju Baeck, Hui-Shyong Yeo, Hyung-il Kim, Thad Starner, Woontack Woo "Whirling Interface: Hand-based Motion Matching Selection for Small Target on XR Displays", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00046 [Video] [Project Page]
#FeaturedConf. Conf. Int.
Jinseok Hong, Taewook Ha, Hyerim Park, Hayun Kim, Woontack Woo "Crowd Data-driven Artwork Placement in Virtual Exhibitions for Visitor Density Distribution Planning", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00075
#FeaturedConf. Conf. Int.
Dooyoung Kim, Seonji Kim, Selin Choi, and Woontack Woo "Spatial Affordance-aware Interactable Subspace Allocation for Mixed Reality Telepresence", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00142
#FeaturedConf. Conf. Int.
Woojin Cho, Taewook Ha, Ikbeom Jeon, Jinwoo Jeon, Tae-Kyun Kim, Woontack Woo "Temporally enhanced graph convolutional network for hand tracking from an egocentric camera", Springer Virtual Reality
10.1007/s10055-024-01039-3
#SCI Journal Int.
TranSpace 3.0: Open XR Platform
This research aims to develop an open XR collaboration platform that induces high-quality immersion and social co-presence to enhance the quality of user collaboration. The platform is based on a scene graph and uses digital twins to manage collaborative interactions among spatial models, static/dynamic objects, sound, hand gestures, and haptic elements through functions such as creation, deletion, movement, and transformation. The study focuses on developing high-quality immersive elements and enhancing social co-presence across a broad set of component technologies.
The research covers the composition and integration of 3D objects in the spatial domain, optimal alignment of scanned data and CAD models, sound reproduction techniques, hand-gesture application using reinforcement learning in a physics simulator, high-quality multimodal haptic technology supporting wearable interfaces, avatar mesh-point cloud interaction, and force-based interaction. It also includes deep learning models for 3D clothing reconstruction, adaptive spatial configuration between multiple VR clients and a single AR host, CG avatar face and costume reconstruction with texture extraction, and the development of evaluation metrics and factors for enhancing social co-presence in XR.
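The scene-graph operations the platform manages — creation, deletion, movement (reparenting), and transformation — can be sketched minimally as follows. This is an illustrative toy, not the TranSpace API; the `SceneGraph` and `SceneNode` names and the single-translation transform are assumptions for the example.

```python
class SceneNode:
    def __init__(self, name, transform=(0.0, 0.0, 0.0)):
        self.name = name
        self.transform = transform   # local translation only, kept minimal
        self.children = {}

class SceneGraph:
    """Toy shared scene graph supporting create / delete / move / transform."""
    def __init__(self):
        self.root = SceneNode("root")
        self.index = {"root": self.root}   # name -> node, for O(1) lookup

    def create(self, name, parent="root"):
        node = SceneNode(name)
        self.index[parent].children[name] = node
        self.index[name] = node
        return node

    def delete(self, name):
        self.index.pop(name)
        for node in self.index.values():        # detach from its parent
            node.children.pop(name, None)

    def move(self, name, new_parent):
        moved = self.index[name]
        for node in self.index.values():        # detach, then reattach
            node.children.pop(name, None)
        self.index[new_parent].children[name] = moved

    def transform(self, name, translation):
        self.index[name].transform = translation

g = SceneGraph()
g.create("table")
g.create("cup", parent="table")
g.transform("cup", (0.1, 0.0, 0.75))
```

In a real collaborative platform each of these four operations would also be broadcast to connected clients so every digital-twin replica stays consistent.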
Open platform for XR collaboration, Authoring for XR contents, XR techniques for high immersion, XR smart construction, XR silver health care
The expected impact of this study lies in securing XR collaboration space service computing platform technology to prepare for future environmental changes in science and technology. On a national and societal level, it aims to provide immersive experiences similar to face-to-face interactions in an untact environment, resolving educational disparities and offering efficient remote collaboration environments to improve the quality of life for citizens. Moreover, by leveraging open platform technologies, XR can be introduced to various industries, contributing to the rapid growth of the XR market.
Publications (22)
Hail Song, Seokhwan Yang, Woontack Woo "Fast Texture Transfer for XR Avatars via Barycentric UV Conversion", 2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
https://arxiv.org/abs/2508.19518
#Poster/Demo/Workshop Conf. Int.
Hyeongil Nam, Seoyoung Kang, Anh Nguyen, Isaac Cho, Woontack Woo, Kangsoo Kim "Gender Congruence and Social Context in XR: Effects on Partner Preference, Warmth, Competence, and Uncanniness", 2025 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Conf. Int.
Hyeongil Nam, Muskan Sarvesh, Seoyoung Kang, Woontack Woo, Kangsoo Kim "Effects of AI-Powered Embodied Avatars on Communication Quality and Social Connection in Asynchronous Virtual Meetings", IEEE Transactions on Visualization and Computer Graphics
[Link]
#SCI Journal Int.
Dooyoung Kim, Jinseok Hong, Heejeong Ko, Woontack Woo "Viewpoint-Tolerant Depth Perception for Shared Extended Space Experience on Wall-Sized Display", IEEE Transactions on Visualization and Computer Graphics
[Link]
#SCI Journal Int.
Taewook Ha, Selin Choi, Seonji Kim, Dooyoung Kim, Woontack Woo "Human-Scene Interaction Data Generation with Virtual Environment using User-Centric Scene Graph", Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW66409.2025.00357
#Poster/Demo/Workshop
양석환, 진형우, 우운택 "Realistic Facial Avatar Generation via Facial Expression Tracking and Mesh Model Representation for VR Headset Users", 2024 Korea Software Congress (KSC 2024)
[Link]
Conf. KR
우운택, 김두영 "Operating Method of a Mixed Reality Telepresence System and Mixed Reality Telepresence System Performing the Same", filed, 10-2024-0005235
Patent KR
Woontack Woo, Dooyoung Kim "EDGE-CENTRIC SPACE RESCALING METHOD FOR DISSIMILAR SPACE REGISTRATION AND THE SYSTEM THEREOF", PCT, US application, 18/611,759
Patent Int.
Seoyoung Kang, Anh Nguyen, Boram Yoon, Kangsoo Kim, Woontack Woo "Gender Differences in Perceiving Avatar Face and Interpersonal Distance: Exploring Realism and Social Presence in Mixed Reality", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00024
#FeaturedConf. Conf. Int.
Seoyoung Kang, Hail Song, Boram Yoon, Kangsoo Kim, Woontack Woo "The Influence of Emotion-based Prioritized Facial Expressions on Social Presence in Avatar-mediated Remote Communication", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00131
#FeaturedConf. Conf. Int.
Dooyoung Kim, Seonji Kim, Selin Choi, and Woontack Woo "Spatial Affordance-aware Interactable Subspace Allocation for Mixed Reality Telepresence", 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR62088.2024.00142
#FeaturedConf. Conf. Int.
Jae-eun Shin, Hayun Kim, Hyerim Park, and Woontack Woo "Investigating the Design of Augmented Narrative Spaces Through Virtual-Real Connections: A Systematic Literature Review", Proceedings of the CHI Conference on Human Factors in Computing Systems
10.1145/3613904.3642819
#FeaturedConf. Conf. Int.
Hyerim Park, Aram Min, Hyunjin Lee, Maryam Shakeri, Ikbeom Jeon, and Woontack Woo "Comfortable Mobility vs. Attractive Scenery: The Key to Augmenting Narrative Worlds in Outdoor Locative Augmented Reality Storytelling", Proceedings of the CHI Conference on Human Factors in Computing Systems
10.1145/3613904.3642431
#FeaturedConf. Conf. Int.
Hail Song "Toward Realistic 3D Avatar Generation with Dynamic 3D Gaussian Splatting for AR/VR Communication", 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW62533.2024.00356
Conf. Int.
Seoyoung Kang "Investigating Avatar Facial Expressions and Collaboration Dynamics for Social Presence in Avatar-Mediated XR Remote Communication", 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW62533.2024.00351
Conf. Int.
Seonji Kim "Experience Graph using Spatio-Temporal Scene Data for Replaying Mixed Reality Interaction", 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW62533.2024.00350
Conf. Int.
우운택, 김두영 "Edge-Centric Space Rescaling Method and System for Dissimilar Space Registration", filed, 10-2023-0100226
Patent KR
Dooyoung Kim, Woontack Woo "Edge-Centric Space Rescaling with Redirected Walking for Dissimilar Physical-Virtual Space Registration", 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR59233.2023.00098
#FeaturedConf. Conf. Int.
Hail Song, Boram Yoon, Woojin Cho, Woontack Woo "RC-SMPL: Real-time Cumulative SMPL-based Avatar Generation", 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
10.1109/ISMAR59233.2023.00023
#FeaturedConf. Conf. Int.
Seoyoung Kang, Hail Song, Boram Yoon, Kangsoo Kim, Woontack Woo "Effects of Different Facial Blendshape Combinations on Social Presence for Avatar-mediated Mixed Reality Remote Communication", 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
10.1109/ISMAR-Adjunct60411.2023.00094
#Poster/Demo/Workshop Conf. Int.
Hyung-il Kim, Boram Yoon, Seo Young Oh, Woontack Woo "Visualizing Hand Force with Wearable Muscle Sensing for Enhanced Mixed Reality Remote Collaboration", IEEE Transactions on Visualization and Computer Graphics (Volume 29, Issue 11, November 2023)
10.1109/TVCG.2023.3320210
#SCI Journal Int.
Boram Yoon, Jae-eun Shin, Hyung-il Kim, Seo Young Oh, Dooyoung Kim, Woontack Woo "Effects of Avatar Transparency on Social Presence in Task-Centric Mixed Reality Remote Collaboration", IEEE Transactions on Visualization and Computer Graphics (Volume 29, Issue 11, November 2023)
10.1109/TVCG.2023.3320258
#SCI Journal Int.
WISE AR UI/UX Platform
For smart glasses to find effective everyday use, anyone should be able to operate them intuitively, and the interface should adapt to the surrounding situation and the individual user. To this end, this research develops WISE (Wearable Interface for Sustainable Enhancement), going beyond merely functional interface use: dynamic interfaces reflect the overall state of the user's attention and intention, all surrounding spaces and objects are recognized as semantic objects that give access to related information, and each object can serve as an input tool according to its characteristics.
Dynamic interface technology based on user intent for multi-task situations, Parameterization of everyday objects and development of input recognition and object control modules based on multi-user intentions, A technique for estimating the user’s global position in any indoor space, Integrated WISE AR UI/UX Platform for Smart Glasses
An integrated WISE UI/UX platform that makes smart glasses “magically” easy and useful anywhere indoors and outdoors. It is a dynamic interface that reflects the overall state, including the user’s intention. It can recognize both surrounding space and objects as semantic objects.
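The idea of recognized everyday objects acting as input tools can be sketched as a simple affordance lookup. The mapping and function names below are invented for illustration and are not part of the WISE platform's actual API.

```python
# Hypothetical table mapping (object, gesture) pairs to UI commands.
AFFORDANCES = {
    "mug":  {"rotate": "volume_dial"},   # rotating a mug adjusts volume
    "book": {"flip": "page_turn"},       # flipping a book turns a virtual page
}

def interpret(object_label: str, gesture: str):
    """Map a gesture performed on a recognized semantic object to a UI command.

    Returns None when the object has no affordance for that gesture.
    """
    return AFFORDANCES.get(object_label, {}).get(gesture)

interpret("mug", "rotate")   # a rotated mug acts as a volume dial
```

A real system would populate such a table from object recognition and user-intent estimation rather than a hand-written dictionary, but the lookup structure conveys how object characteristics parameterize the input space.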
The short-term goal is to release the WISE UI/UX platform developed through this project, together with the module technologies that compose it, as open-source software for use in industry and academia. We also expect the researchers trained on this project to advance ICT fields such as augmented reality, virtual reality, and the metaverse through industry-academia collaboration. Ultimately, this will make AR projects and services easier to build for a wide range of stakeholders, including individual creators, small project teams at home and abroad, and mobile communication, manufacturing, education, and distribution companies.
Publications (14)
Young Bin Kim, Suji Kang, Minju Baeck, Seonji Kim, Woontack Woo "Architectural Elements Augmentation from 2D Artworks for Spatial Experience of AR Exhibition", 2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
#Poster/Demo/Workshop Conf. Int.
Taeyeon Kim, Eunhwa Song, Eunhee Jeong, Hyunjin Lee, Woontack Woo "Clinical Features-based Lifestyle Quantification Using Mobile Devices for Personalized Augmented Reality Intervention", 2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
#Poster/Demo/Workshop Conf. Int.
Dooyoung Kim, Woontack Woo "Virtual reality space adjusting method with relative translation gain in redirected walking and the system thereof", granted, US12333628B2
[Link]
Patent Int.
김영빈, 전진우, 우운택 "Instance-Aware Object-Centric 3D Map Generation Framework", 2025 Korea Computer Congress (KCC 2025)
[Link]
Conf.
Sunyoung Bang, Hyunjin Lee, Seo Young Oh, Woontack Woo "AReading with Smartphones: Understanding the Trade-offs between Enhanced Legibility and Display Switching Costs in Hybrid AR Interfaces", Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems
10.1145/3706598.3713879
#FeaturedConf. Conf.
Juyoung Lee, Minju Baeck, Hui-Shyong Yeo, Thad Starner, Woontack Woo "GestureMark: Shortcut Input Technique using Smartwatch Touch Gestures for XR Glasses", AHs '24: Proceedings of the Augmented Humans International Conference 2024
10.1145/3652920.3652941
Conf. Int.
Taeyeon Kim, Hyun-Song Kwon, Kyunghyun Cho, Woontack Woo "Holistic Patient Assessment System using Digital Twin for XR Medical Teleconsultation", AHs '24: Proceedings of the Augmented Humans International Conference 2024
10.1145/3652920.3652943
Conf. Int.
Eunhwa Song, Taewook Ha, Junhyeok Park, Hyunjin Lee, Woontack Woo "Holistic Quantified-Self: An Integrated User Model for AR Glass Users", 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
10.1109/ISMAR-Adjunct60411.2023.00092
#Poster/Demo/Workshop Conf. Int.
Hyung-il Kim, Boram Yoon, Seo Young Oh, Woontack Woo "Visualizing Hand Force with Wearable Muscle Sensing for Enhanced Mixed Reality Remote Collaboration", IEEE Transactions on Visualization and Computer Graphics (Volume 29, Issue 11, November 2023)
10.1109/TVCG.2023.3320210
#SCI Journal Int.
Hui-Shyong Yeo, Erwin Wu, Daehwa Kim, Juyoung Lee, Hyung-il Kim, Seo Young Oh, Luna Takagi, Woontack Woo, Hideki Koike, Aaron John Quigley "OmniSense: Exploring Novel Input Sensing and Interaction Techniques on Mobile Device with an Omni-Directional Camera", Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
10.1145/3544548.3580747 [Video]
#FeaturedConf. Conf. Int.
Hyunjin Lee, Woontack Woo "Exploring the Effects of Augmented Reality Notification Type and Placement in AR HMD while Walking", 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
10.1109/VR55154.2023.00067
#FeaturedConf. Conf. Int.
Sunyoung Bang, Woontack Woo "Enhancing the Reading Experience on AR HMDs by Using Smartphones as Assistive Displays", 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
10.1109/VR55154.2023.00053
#FeaturedConf. Conf. Int.
Eunhwa Song, Minju Baeck, Jihyeon Lee, Seo Young Oh, Dooyoung Kim, Woontack Woo, Jeongmi Lee, Sang Ho Yoon "Memo:me, an AR Sticky Note With Priority-Based Color Transition and On-Time Reminder", 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
10.1109/VRW58643.2023.00126
#Poster/Demo/Workshop Conf. Int.
Dynamic Digital Twin for Realistic Untact XR Collaboration
In this project, we study a scene graph-based 3D scene reconstruction method for a dynamic digital twin. The dynamic digital twin tracks and updates the movement of physical objects in an indoor environment in real time. In addition, the project studies XR interworking technology with which on-site and remote users experience and manipulate the same environment, directly or indirectly, through realistic rendering-based augmented and virtual reality. Finally, this project will enable various realistic non-face-to-face collaboration applications. To implement this system, we organically converge specialized technologies from fields such as computer vision, computer graphics, and indoor GIS.
Digital Twin, Scene Graph, SLAM, Scene Reconstruction, Object Detection and Tracking, Semantic Segmentation, AR, VR
An integrated XR platform provides a synchronized 3D semantic map with an indoor environment to multiple AR / VR devices for the digital twin. This integrated platform allows AR/VR devices to interact with physical environments, including moving objects.
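The real-time synchronization loop described above — tracked detections updating the twin, with changes broadcast to AR/VR clients — can be sketched minimally. The `DynamicTwin` class and its pose representation are illustrative assumptions, not the project's actual system.

```python
class DynamicTwin:
    """Toy digital twin that keeps object poses in sync with tracked detections."""

    def __init__(self):
        self.objects = {}   # object_id -> latest pose (x, y, z)

    def update(self, detections):
        """Apply one frame of (object_id, pose) detections.

        Returns the ids whose pose changed — exactly the set an XR server
        would need to broadcast to connected AR/VR clients.
        """
        changed = []
        for obj_id, pose in detections:
            if self.objects.get(obj_id) != pose:
                self.objects[obj_id] = pose
                changed.append(obj_id)
        return changed

twin = DynamicTwin()
twin.update([("chair", (1.0, 0.0, 0.0))])          # chair first observed
moved = twin.update([("chair", (1.5, 0.0, 0.0))])  # chair moved; id reported
```

Returning only the changed ids keeps network traffic proportional to scene dynamics rather than scene size, which is the usual design choice for multi-client synchronization.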
The proposed project keeps the digital twin's virtual environment in a state identical to the dynamic indoor environment, so that non-face-to-face collaboration services based on digital twin technology can operate in changing surroundings. The map and object information generated by this project can be linked to the digital twin through the semantic relationships between indoor spaces and objects, responding efficiently to changes such as object movement or room reconfiguration. The results are expected to be used in various fields related to non-face-to-face collaboration, such as e-commerce, teleconferencing, education, games, museums, plays, and concerts.
Publications (4)
강수지, 양석환, 김석영, 우운택 "Speech-to-3D: User-Customized 3D Scene Rendering via Speech Recognition", 2025 Korea Computer Congress (KCC2025)
[Link]
#Poster/Demo/Workshop Conf. KR
최세린, 홍윤재, 박민지, 우운택 "Remote Extended Reality (XR) Training System Using Interaction Between Virtual Consumables and Real Tools", Proceedings of the 2024 Korea Computer Congress (KCC2024), Vol. 2024
[Link]
Conf. KR
Eunhee Jeong, Hankyeol Kim, Jaehong Ahn, Seongha Park, Sangho Yoon, Woontack Woo "Function-Adaptive Affordance Extraction from 3D Objects Using LLM for Interaction Authoring with Augmented Artifacts", 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
10.1109/ISMAR-Adjunct64951.2024.00050
#Poster/Demo/Workshop Conf. Int.
Min-yung Kim, Kun-woo Song, Yohan Lim, and Sang Ho Yoon "Collision Prevention in Diminished Reality through the Use of Peripheral Vision", Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology
10.1145/3672539.3686346
#Poster/Demo/Workshop Conf. Int.