PhyRecon:
Physically Plausible Neural Scene Reconstruction

* indicates equal contribution   
1National Key Laboratory of General Artificial Intelligence, BIGAI    2Tsinghua University     3Peking University



PhyRecon harnesses both differentiable rendering and differentiable physics simulation to learn implicit surface representations, achieving significant improvements in reconstruction quality and substantially greater stability in physical simulators.

Abstract


While neural implicit representations have gained popularity in multi-view 3D reconstruction, previous work struggles to yield physically plausible results, limiting their applications in physics-demanding domains like embodied AI and robotics. The lack of plausibility stems both from the absence of physics modeling in existing pipelines and from their inability to recover intricate geometric structures. In this paper, we introduce PhyRecon, the first approach to harness both differentiable rendering and differentiable physics simulation to learn implicit surface representations. Our framework proposes a novel differentiable particle-based physical simulator seamlessly integrated with the neural implicit representation. At its core is an efficient transformation between the SDF-based implicit representation and explicit surface points via our proposed algorithm, Surface Points Marching Cubes (SP-MC), enabling differentiable learning with both rendering and physical losses. Moreover, we model both rendering and physical uncertainty to identify and compensate for inconsistent and inaccurate monocular geometric priors. The physical uncertainty additionally enables physics-guided pixel sampling to enhance the learning of slender structures. Combining these techniques, our model facilitates efficient joint modeling of appearance, geometry, and physics. Extensive experiments demonstrate that PhyRecon significantly outperforms all state-of-the-art methods in reconstruction quality. Our reconstructions also exhibit superior physical stability, verified in Isaac Gym, with at least a 40% improvement across all datasets, opening broader avenues for future physics-based applications.
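The physics-guided pixel sampling mentioned above concentrates ray samples where physical uncertainty is high. Below is a minimal sketch of one plausible realization; the function name `physics_guided_pixel_sampling`, the `mix` blending parameter, and the uniform-mixing strategy are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def physics_guided_pixel_sampling(phys_uncertainty, n_rays, rng=None, mix=0.5):
    """Sample pixel indices with probability proportional to per-pixel
    physical uncertainty, blended with uniform sampling.

    Hypothetical sketch: the paper's exact sampling scheme may differ.
    """
    rng = np.random.default_rng() if rng is None else rng
    flat = phys_uncertainty.ravel().astype(np.float64)
    guided = flat / flat.sum()                       # uncertainty-proportional
    uniform = np.full(flat.size, 1.0 / flat.size)    # keeps coverage of the image
    probs = mix * guided + (1.0 - mix) * uniform
    idx = rng.choice(flat.size, size=n_rays, replace=False, p=probs)
    # Convert flat indices back to (row, col) pixel coordinates.
    return np.unravel_index(idx, phys_uncertainty.shape)
```

With `mix=1.0` the sampler draws pixels purely by uncertainty, so regions flagged as physically unstable (e.g., around thin supports) receive far more rays than under uniform sampling.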

Method


Our framework bridges neural scene reconstruction and physics simulation to jointly model physics, geometry, and appearance. We devise a differentiable particle-based physical simulator together with a highly efficient method for converting SDF-based neural implicit representations into explicit representations suited to physics simulation. Furthermore, we propose joint uncertainty modeling, encompassing both rendering and physical uncertainty, to mitigate inconsistencies in monocular geometric priors and improve the reconstruction of thin structures.
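The core of this transition is extracting explicit surface points from the zero level set of the SDF. As a rough illustration of the underlying idea (not the paper's SP-MC, which is differentiable and operates directly with the neural SDF), the sketch below takes a densely sampled SDF grid, finds grid edges where the sign flips, and linearly interpolates the zero crossing; `surface_points_from_sdf_grid` and its arguments are hypothetical names:

```python
import numpy as np

def surface_points_from_sdf_grid(volume, lo=-1.0, hi=1.0):
    """Extract explicit surface points from an SDF sampled on a cubic grid
    by locating sign changes along grid edges and linearly interpolating.

    Minimal illustrative stand-in for the paper's SP-MC algorithm.
    """
    res = volume.shape[0]
    spacing = (hi - lo) / (res - 1)
    # World coordinates of every grid vertex, shape (res, res, res, 3).
    coords = lo + spacing * np.stack(
        np.meshgrid(*[np.arange(res)] * 3, indexing="ij"), axis=-1
    )
    points = []
    for axis in range(3):
        v0 = np.take(volume, np.arange(res - 1), axis=axis)
        v1 = np.take(volume, np.arange(1, res), axis=axis)
        crossing = (v0 * v1) < 0  # SDF changes sign -> surface crosses this edge
        # Linear interpolation: f(t) = v0 + (v1 - v0) t = 0  =>  t = v0 / (v0 - v1)
        t = v0[crossing] / (v0[crossing] - v1[crossing])
        base = np.take(coords, np.arange(res - 1), axis=axis)[crossing]
        offset = np.zeros(3)
        offset[axis] = spacing
        points.append(base + t[:, None] * offset)
    return np.concatenate(points, axis=0)
```

For example, sampling the SDF of a sphere of radius 0.5 on a 64³ grid over [-1, 1]³ and running the function yields a point cloud whose points all lie near radius 0.5. Unlike this grid-based sketch, the paper's SP-MC keeps the extraction differentiable so that physical losses can backpropagate into the implicit representation.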

Results


Indoor Scene Reconstruction

Examples from ScanNet++, ScanNet, and Replica demonstrate that our model produces higher-quality reconstructions than the baselines. Our results contain finer details for slender structures (e.g., chair legs and objects on the table) and plausible support relations, as shown in the zoom-in boxes.

Object Stability Comparison

We visualize the trajectories of reconstructed objects during a dropping simulation in Isaac Gym. Our method enhances the physical plausibility of the reconstructions, which remain stable throughout the simulation.


BibTeX

@misc{ni2024phyrecon,
  title={PhyRecon: Physically Plausible Neural Scene Reconstruction},
  author={Junfeng Ni and Yixin Chen and Bohan Jing and Nan Jiang and Bin Wang and Bo Dai and Yixin Zhu and Song-Chun Zhu and Siyuan Huang},
  year={2024},
  eprint={2404.16666},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}