Research Projects: VirtualizeMe

Revision as of 14:23, 5 September 2008

Project Name: VirtualizeMe
Team Members: Daniel Knoblauch, Falko Kuester
Project Sponsor: California Institute for Telecommunications and Information Technology (Calit2), The Jacobs School of Engineering (JSoE)

[Figure: VirtualizeMe processing pipeline: background extraction, disparity map calculation, point cloud construction and rendering]


Overview

Tele-Immersion is a fully interactive, face-to-face, viewpoint-corrected, stereoscopic virtual environment that allows users to interact naturally with each other and with the digital environment surrounding them via realistic avatars. The focus of this project is extracting, modeling, and communicating the user avatar needed for Tele-Immersion applications by introducing a 3D reconstruction system based on stereo video streams. Combining several 2.5D reconstruction results (disparity maps) from different viewing angles yields high-resolution avatar models with real-time characteristics. The point-cloud-based avatars are transmitted to a remote collaboration site using lossless point cloud compression algorithms.
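The reconstruction step above turns each 2.5D disparity map into 3D points by stereo triangulation. A minimal sketch of that reprojection, assuming a rectified stereo pair with an illustrative focal length f (pixels), baseline B (meters), and principal point (cx, cy) — these camera parameters are placeholders, not the project's actual calibration:

```python
# Reproject a 2.5D disparity map into a 3D point cloud.
# Assumes a rectified stereo pair; all camera parameters below are
# illustrative, not VirtualizeMe's actual calibration.

def disparity_to_points(disparity, f, B, cx, cy):
    """disparity: 2D list of per-pixel disparities -> list of (X, Y, Z)."""
    points = []
    for v, row in enumerate(disparity):
        for u, d in enumerate(row):
            if d <= 0:           # no stereo match found for this pixel
                continue
            Z = f * B / d        # depth from stereo triangulation
            X = (u - cx) * Z / f
            Y = (v - cy) * Z / f
            points.append((X, Y, Z))
    return points

# Tiny 2x2 disparity map; d = 0 marks an invalid pixel.
cloud = disparity_to_points([[8.0, 0.0], [4.0, 8.0]],
                            f=500.0, B=0.1, cx=1.0, cy=1.0)
```

Merging the point lists produced from several viewing angles (after transforming each into a common world frame) gives the composed cloud that the refinement step then cleans up.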

Publications

  • Knoblauch D., Kuester F. (2008). VirtualizeMe: Interactive Model Reconstruction From Stereo Video Streams. Accepted to VRST 2008, Bordeaux (France)

Sub-Projects

  • 3D Reconstruction:
    • Stereo video stream based disparity map calculations for 2.5D reconstruction
    • Combination of several 2.5D reconstructions from different viewing angles
    • Refine composed point cloud (noise and oversampling removal)
  • Point Cloud Streaming:
    • Lossless point cloud compression
    • Point cloud transmission with real-time characteristics
  • Point Cloud Rendering:
    • High fidelity avatar rendering based on received point cloud
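The "lossless" requirement in the streaming sub-project means the remote site must recover the transmitted cloud bit-exactly. One generic scheme with that property — delta-encoding integer (e.g. millimeter-quantized) coordinates and deflating the small residuals — is sketched below; this is an illustration of the idea, not the compression algorithm used by VirtualizeMe:

```python
# Simple lossless point-cloud compression: delta-encode integer
# coordinates, then deflate the residuals. Illustrative only; not
# the VirtualizeMe compression algorithm.
import struct
import zlib

def compress(points):
    """points: list of (x, y, z) integer tuples -> compressed bytes."""
    prev = (0, 0, 0)
    deltas = bytearray()
    for p in points:
        for a, b in zip(p, prev):
            deltas += struct.pack("<i", a - b)  # small residuals deflate well
        prev = p
    return zlib.compress(bytes(deltas))

def decompress(blob):
    """Inverse of compress(): recovers the original points exactly."""
    raw = zlib.decompress(blob)
    prev = [0, 0, 0]
    points = []
    for i in range(0, len(raw), 12):
        d = struct.unpack("<iii", raw[i:i + 12])
        prev = [a + b for a, b in zip(prev, d)]
        points.append(tuple(prev))
    return points

pts = [(100, 200, 300), (101, 202, 303), (103, 205, 307)]
assert decompress(compress(pts)) == pts  # round-trip is exact (lossless)
```

Because spatially adjacent points tend to have small coordinate differences, the delta stream is highly compressible while the round trip remains exact, which is what distinguishes this family of methods from lossy octree or quantization-only schemes.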

Research Opportunities

There are opportunities for undergraduate and graduate students to join our research on this project.
Please contact Daniel Knoblauch (dknoblau@ucsd.edu) with any questions.

  • Disparity Map Research (Computer Vision)
    • Implementation and evaluation of state-of-the-art disparity map algorithms for Tele-Immersion.
    • Development of new disparity map algorithms specialized for Tele-Immersion.
  • Point Cloud Streaming Research (Computer Graphics/Visualization, Networking)
    • Implementation and evaluation of state-of-the-art real-time point cloud compression and streaming algorithms.
    • Implementation of a point cloud compression and streaming infrastructure specialized for Tele-Immersion.
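As background for the disparity map research topic, the classic baseline most stereo algorithms are measured against is window-based block matching: for each pixel in the left image, slide a window along the same row of the right image and keep the offset with the lowest sum of absolute differences (SAD). A minimal sketch, with illustrative window size and search range (real Tele-Immersion pipelines use far more elaborate matching and filtering):

```python
# Baseline block-matching disparity estimation (SAD cost).
# Window size and search range are illustrative defaults, not
# values used by the VirtualizeMe system.

def sad(left, right, y, xl, xr, half):
    """Sum of absolute differences between two windows on row y."""
    total = 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            total += abs(left[y + dy][xl + dx] - right[y + dy][xr + dx])
    return total

def disparity_row(left, right, y, max_disp=4, half=1):
    """Winner-takes-all disparity for one image row."""
    width = len(left[0])
    row = []
    for x in range(half, width - half):
        best_d, best_cost = 0, float("inf")
        for d in range(0, min(max_disp, x - half) + 1):
            cost = sad(left, right, y, x, x - d, half)
            if cost < best_cost:
                best_d, best_cost = d, cost
        row.append(best_d)
    return row

# A bright feature at left column 3 appears at right column 1,
# so block matching should recover a disparity of 2 there.
left = [[0, 0, 0, 9, 0, 0, 0, 0]] * 3
right = [[0, 9, 0, 0, 0, 0, 0, 0]] * 3
disp = disparity_row(left, right, y=1)
```

Evaluating where this simple winner-takes-all scheme fails (textureless regions, occlusions, repeated patterns) is exactly what motivates the more specialized algorithms named in the research topic.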

Copyright

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by the author's copyright. This work may not be reposted without the explicit permission of the copyright holder.

