Development and Tracking of Consensus Mesh for Monocular Depth Sequences

Gaurav Mishra


Human body tracking typically requires specialized capture setups. Although pose tracking is available in consumer devices such as the Microsoft Kinect, it is restricted to stick figures that visualize body-part detection. In this thesis, we propose a method for full 3D human body shape and motion capture of arbitrary movements from the depth channel of a single Kinect, with the subject wearing casual clothes. We do not use the RGB channel or an initialization procedure that requires the subject to move around in front of the camera. This makes our method applicable to arbitrary clothing textures and lighting environments, with minimal subject intervention. Our method consists of 3D surface feature detection and articulated motion tracking, regularized by a statistical human body model [40]. We also propose the Consensus Mesh (CMesh), a 3D template of a person created from a single viewpoint. We demonstrate tracking results on challenging poses and argue that using the CMesh together with statistical body models can improve tracking accuracy. Quantitative evaluation of our dense body tracking shows that our method exhibits very little drift, which is further reduced by the use of the CMesh.

We also explore improving the quality of the CMesh using RGB images in a post-processing step. For this we propose a pipeline involving Generative Adversarial Networks. We show that the CMesh can be improved from RGB images of the original person by learning corresponding relative normal maps (N_R maps). These N_R maps have the potential to encode the discrepancies of the CMesh with respect to the ground-truth object. We explore this method in a synthetic setting for static human-like objects. We demonstrate quantitatively that the details learned by this pipeline are invariant to lighting and texture changes. In the future, the generated N_R maps could be used to improve the quality of the CMesh.
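As a rough illustration, a relative normal map can be thought of as a per-pixel residual between the normal map rendered from the CMesh and the ground-truth normal map. The sketch below assumes a simple vector-difference encoding (the thesis may parameterize N_R differently); all array names are hypothetical:

```python
import numpy as np

def normalize(v, eps=1e-8):
    # Normalize per-pixel 3-vectors to unit length.
    return v / (np.linalg.norm(v, axis=-1, keepdims=True) + eps)

def relative_normal_map(cmesh_normals, gt_normals):
    """Per-pixel residual between the CMesh normal map and the
    ground-truth normal map (one possible N_R encoding)."""
    return normalize(gt_normals) - normalize(cmesh_normals)

# Toy example: flat CMesh normals vs. slightly tilted ground truth.
h, w = 4, 4
flat = np.zeros((h, w, 3))
flat[..., 2] = 1.0            # all pixels facing +z
gt = flat.copy()
gt[..., 0] += 0.1             # small tilt toward +x
n_r = relative_normal_map(flat, gt)
print(n_r.shape)              # (4, 4, 3)
```

Under this encoding, adding N_R back onto the CMesh normals and renormalizing recovers the ground-truth normals, which is the sense in which N_R "encodes the discrepancies" that the GAN is trained to predict.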


Year of completion: June 2019
Advisors: P J Narayanan and Kiran Varanasi

Related Publications