This year, I started my master's thesis project at the Cologne Game Lab and IFS Internationale Filmschule Köln. Photogrammetry was the primary scanning method used to create a realistic digital human of myself. This breakdown covers the entire process, from capturing the photogrammetric data to transforming it into a high-quality digital twin for both offline and real-time rendering. The photogrammetry capture was carried out in collaboration with TU Dortmund, using a multi-camera rig of 56 synchronized cameras to record a full-body scan in a single shot. This series will focus on the offline rendering process.
In this first part, I briefly walk through the workflow, covering the modeling and cleanup phases. Following the reconstruction, the 3D scan was imported into ZBrush for refinement. This phase involved artifact cleanup, surface smoothing, and manual reconstruction of missing elements such as the ears and eyes. The second part will focus on creating realistic hair with XGen Core.
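The reconstruction software is not named in this breakdown, so purely as a point of reference, the sketch below shows how a reconstruction step from a 56-camera, single-shot capture could be scripted. It assumes the Agisoft Metashape Python API (a common choice for multi-camera rigs); the file paths and quality settings are placeholders, not the settings used for this project.

```python
# A minimal batch-reconstruction sketch, assuming the Agisoft Metashape
# Python API. Paths and quality settings are placeholders.
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Load the synchronized camera views from the single-shot capture
chunk.addPhotos(sorted(glob.glob("capture/*.jpg")))

# Feature matching and camera alignment (sparse reconstruction)
chunk.matchPhotos(downscale=1, generic_preselection=True)
chunk.alignCameras()

# Dense reconstruction: depth maps -> mesh -> UVs -> texture
chunk.buildDepthMaps(downscale=2)
chunk.buildModel(source_data=Metashape.DepthMapsData)
chunk.buildUV()
chunk.buildTexture(texture_size=8192)

# Export the raw scan for cleanup in ZBrush
chunk.exportModel(path="export/raw_scan.obj")
doc.save("project/full_body_scan.psx")
```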