Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios
Iryna Repinetska, Anna Hilsmann, Peter Eisert
2025
Abstract
Photo-realistic rendering and novel view synthesis play a crucial role in human-computer interaction tasks, from gaming to path planning. Neural Radiance Fields (NeRFs) model scenes as continuous volumetric functions and achieve remarkable rendering quality. However, NeRFs often struggle in large, low-textured areas, producing cloudy artifacts known as “floaters” that reduce scene realism, especially in indoor environments with featureless architectural surfaces such as walls, ceilings, and floors. To overcome this limitation, prior work has integrated geometric constraints into the NeRF pipeline, typically leveraging depth information derived from Structure from Motion or Multi-View Stereo. Yet, conventional RGB-feature correspondence methods struggle to estimate depth accurately in textureless regions, leading to unreliable constraints. The problem is compounded in 360-degree “inside-out” captures, where sparse visual overlap between adjacent images further hinders depth estimation. To address these issues, we propose an efficient and robust method for computing dense depth priors, specifically tailored to large, low-textured architectural surfaces in indoor environments. We introduce a novel depth loss function that enhances rendering quality in these challenging, low-feature regions, while complementary depth-patch regularization further refines depth consistency across other areas. Experiments with Instant-NGP on two synthetic 360-degree indoor scenes demonstrate improved visual fidelity with our method compared to standard photometric loss and Mean Squared Error (MSE) depth supervision.
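For context, the sketch below illustrates the Mean Squared Error depth-supervision baseline mentioned in the abstract, not the paper's proposed loss. It assumes a PyTorch-style pipeline where per-ray depth is the expectation of sample distances under the volume-rendering weights; the function names, tensor shapes, and the weight lambda_depth are illustrative assumptions rather than details taken from the paper.

import torch

def rendered_depth(weights: torch.Tensor, t_vals: torch.Tensor) -> torch.Tensor:
    # Expected ray termination depth from volume-rendering weights.
    # weights: (num_rays, num_samples) alpha-compositing weights per sample
    # t_vals:  (num_rays, num_samples) sample distances along each ray
    return (weights * t_vals).sum(dim=-1)

def mse_depth_loss(pred_depth: torch.Tensor,
                   prior_depth: torch.Tensor,
                   valid_mask: torch.Tensor) -> torch.Tensor:
    # Baseline MSE depth supervision over pixels with a valid depth prior.
    diff = (pred_depth - prior_depth)[valid_mask]
    return (diff ** 2).mean()

def total_loss(pred_rgb: torch.Tensor,
               gt_rgb: torch.Tensor,
               pred_depth: torch.Tensor,
               prior_depth: torch.Tensor,
               valid_mask: torch.Tensor,
               lambda_depth: float = 0.1) -> torch.Tensor:
    # Photometric loss plus a weighted depth term; lambda_depth is a
    # hypothetical balancing weight chosen for this sketch.
    photometric = ((pred_rgb - gt_rgb) ** 2).mean()
    return photometric + lambda_depth * mse_depth_loss(pred_depth, prior_depth, valid_mask)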
Paper Citation
in Harvard Style
Repinetska I., Hilsmann A. and Eisert P. (2025). Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios. In Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP; ISBN 978-989-758-728-3, SciTePress, pages 725-734. DOI: 10.5220/0013301500003912
in BibTeX Style
@conference{visapp25,
author={Iryna Repinetska and Anna Hilsmann and Peter Eisert},
title={Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios},
booktitle={Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP},
year={2025},
pages={725-734},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013301500003912},
isbn={978-989-758-728-3},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP
TI - Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios
SN - 978-989-758-728-3
AU - Repinetska I.
AU - Hilsmann A.
AU - Eisert P.
PY - 2025
SP - 725
EP - 734
DO - 10.5220/0013301500003912
PB - SciTePress