LDG: Lightweight Deformable 3D Gaussians for Single View Dynamic Scene Reconstruction

Youhong Peng1, Weixing Xie1, Jinwen Li1, Shaoqi Wu1, Zefeng Wang1
1Xiamen University

Figure: Architecture of our LDG network.

Abstract

Recent deformable 3D Gaussian methods achieve high-quality reconstruction and real-time rendering. However, they require multi-view information and are not applicable to single-view dynamic scenes captured with mobile phones. Additionally, the high-dimensional hidden layers of the deformation MLP and the excessive number of Gaussian primitives and attributes impose significant storage pressure, greatly limiting their practical application. To address these issues, we propose a novel Lightweight Deformable 3D Gaussians teacher-student framework. Specifically, we initialize Gaussian primitives with an initialization strategy designed for single-view scenes, and then optimize the teacher model using color and depth information. From the trained teacher model, we distill the deformation MLP, prune Gaussian primitives and Gaussian attributes, and finally obtain a student model with low storage cost and high efficiency. Experiments on public benchmarks demonstrate the effectiveness of our framework, showing a compression rate exceeding 4× while maintaining satisfactory rendering quality.
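The pruning step of the pipeline above can be sketched as follows. This is a minimal illustration, not the paper's actual criterion: it assumes Gaussians are discarded when their opacity falls below a hypothetical threshold, and reports the resulting compression rate.

```python
import numpy as np

def prune_gaussians(opacity, threshold=0.05):
    """Keep only Gaussian primitives whose opacity meets the threshold.

    opacity   : (N,) array of per-Gaussian opacities in [0, 1]
    threshold : assumed pruning cutoff (illustrative, not from the paper)
    Returns the indices of kept Gaussians and the compression rate
    (original count / kept count).
    """
    keep = opacity >= threshold
    kept_indices = np.nonzero(keep)[0]
    rate = opacity.size / max(kept_indices.size, 1)
    return kept_indices, rate

# Toy example: 8 Gaussians, most nearly transparent.
opacity = np.array([0.9, 0.01, 0.02, 0.03, 0.7, 0.04, 0.02, 0.01])
kept, rate = prune_gaussians(opacity)
print(kept, rate)  # -> [0 4] 4.0 (two Gaussians kept, 4x compression)
```

In practice the pruning criterion would also account for each Gaussian's contribution to rendered views, but an opacity cutoff conveys the storage-vs-quality trade-off.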

Comparison Experiments

Comparison with D-NeRF*, DRSM and Deformable 3D-GS*:

D-NeRF* and Deformable 3D-GS* are modified versions of the original models with added depth supervision, allowing a fair comparison with our method.

Related Links

D-NeRF: A method for synthesizing novel views, at an arbitrary point in time, of dynamic scenes with complex non-rigid geometries.

DRSM: A framework that introduces a neural 4D decomposition for dynamic reconstruction from single-view videos, tackling the 4D decomposition problem for dynamic scenes captured by monocular cameras.

Deformable 3D Gaussians: A method that reconstructs scenes using explicit 3D Gaussians and learns Gaussians in canonical space with a deformation field to model monocular dynamic scenes.


If you have any questions or feedback, please contact Youhong Peng (p1207510984@gmail.com).