In this article, we explore the challenges of 3D modeling from satellite imagery and introduce SUNDIAL, a comprehensive approach to address these challenges using neural radiance fields. Traditional 3D modeling techniques face difficulties in the remote sensing context due to limited multi-view baselines, varying illumination conditions, and scene changes. With SUNDIAL, we aim to jointly learn satellite scene geometry, illumination components, and sun direction using a single-model approach. Our technique incorporates lighting cues and geometric priors from remote sensing literature, enabling us to model physical properties like shadows, scattered sky illumination, and complex illumination of vegetation and water. We evaluate the performance of SUNDIAL against existing NeRF-based techniques and showcase improved scene and lighting disentanglement, novel view and lighting rendering, and accurate geometry and sun direction estimation on challenging satellite scenes. SUNDIAL has the potential to revolutionize 3D reconstruction in areas like environmental science, urban planning, agriculture, and disaster response.

Abstract: 3D modeling from satellite imagery is essential in areas of environmental science, urban planning, agriculture, and disaster response. However, traditional 3D modeling techniques face unique challenges in the remote sensing context, including limited multi-view baselines over extensive regions, varying direct, ambient, and complex illumination conditions, and time-varying scene changes across captures. In this work, we introduce SUNDIAL, a comprehensive approach to 3D reconstruction of satellite imagery using neural radiance fields. We jointly learn satellite scene geometry, illumination components, and sun direction in this single-model approach, and propose a secondary shadow ray casting technique to 1) improve scene geometry using oblique sun angles to render shadows, 2) enable physically-based disentanglement of scene albedo and illumination, and 3) determine the components of illumination from direct, ambient (sky), and complex sources. To achieve this, we incorporate lighting cues and geometric priors from remote sensing literature in a neural rendering approach, modeling physical properties of satellite scenes such as shadows, scattered sky illumination, and complex illumination and shading of vegetation and water. We evaluate the performance of SUNDIAL against existing NeRF-based techniques for satellite scene modeling and demonstrate improved scene and lighting disentanglement, novel view and lighting rendering, and geometry and sun direction estimation on challenging scenes with small baselines, sparse inputs, and variable illumination.
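To make the secondary shadow ray casting idea concrete, the following is a minimal sketch (not SUNDIAL's actual implementation; the function names, the `density_fn` interface, and the two-term sun/sky shading split are illustrative assumptions). A second ray is marched from a surface point toward the sun, and the accumulated volume density gives a Beer-Lambert transmittance that acts as a shadow factor, which in turn lets albedo be combined with separate direct and ambient illumination terms:

```python
import numpy as np

def shadow_transmittance(density_fn, point, sun_dir, n_samples=64, far=2.0):
    """March a secondary ray from `point` toward the sun and accumulate
    volume density; the resulting transmittance is a shadow factor
    (1.0 = fully lit, 0.0 = fully occluded)."""
    t_vals = np.linspace(0.0, far, n_samples + 1)
    deltas = np.diff(t_vals)                      # segment lengths
    mids = 0.5 * (t_vals[:-1] + t_vals[1:])      # midpoint sample positions
    pts = point[None, :] + mids[:, None] * sun_dir[None, :]
    sigma = density_fn(pts)                       # (n_samples,) densities
    # Beer-Lambert: T = exp(-sum sigma_i * delta_i)
    return float(np.exp(-np.sum(sigma * deltas)))

def shade(albedo, transmittance, sun_irradiance, sky_irradiance):
    """Physically-based split of outgoing radiance: direct sunlight is
    attenuated by the shadow factor, ambient sky light is not."""
    return albedo * (transmittance * sun_irradiance + sky_irradiance)
```

Because the shadow factor depends on densities along the sun ray, rendering losses on shadowed pixels backpropagate geometric signal from oblique sun angles, which is how shadow cues can sharpen scene geometry beyond what the narrow multi-view baselines alone provide.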
