Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios

Authors: Iryna Repinetska 1; Anna Hilsmann 2 and Peter Eisert 2

Affiliations: 1 Department of Computer Science, Humboldt University, Berlin, Germany ; 2 Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute, Berlin, Germany

Keyword(s): Novel View Synthesis, Neural Radiance Fields, Geometry Constraints, 360-Degree Indoor Dataset.

Abstract: Photo-realistic rendering and novel view synthesis play a crucial role in human-computer interaction tasks, from gaming to path planning. Neural Radiance Fields (NeRFs) model scenes as continuous volumetric functions and achieve remarkable rendering quality. However, NeRFs often struggle in large, low-textured areas, producing cloudy artifacts known as "floaters" that reduce scene realism, especially in indoor environments with featureless architectural surfaces like walls, ceilings, and floors. To overcome this limitation, prior work has integrated geometric constraints into the NeRF pipeline, typically leveraging depth information derived from Structure from Motion or Multi-View Stereo. Yet, conventional RGB-feature correspondence methods face challenges in accurately estimating depth in textureless regions, leading to unreliable constraints. This challenge is further complicated in 360-degree "inside-out" views, where sparse visual overlap between adjacent images hinders depth estimation. To address these issues, we propose an efficient and robust method for computing dense depth priors, specifically tailored for large low-textured architectural surfaces in indoor environments. We introduce a novel depth loss function to enhance rendering quality in these challenging, low-feature regions, while complementary depth-patch regularization further refines depth consistency across other areas. Experiments with Instant-NGP on two synthetic 360-degree indoor scenes demonstrate improved visual fidelity with our method compared to standard photometric loss and Mean Squared Error depth supervision.
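The abstract contrasts plain photometric supervision with training that adds a depth term tied to a dense depth prior. As a rough illustration of that idea (not the paper's actual loss, which is not given on this page), the sketch below combines a standard photometric MSE term with an MSE-style depth term that is masked to rays where the prior is deemed reliable, e.g. on large planar, low-textured surfaces; all function names, the mask, and the weight `lam` are illustrative assumptions.

```python
import numpy as np

def photometric_loss(pred_rgb, gt_rgb):
    # Standard NeRF photometric term: mean squared error over ray colors.
    return np.mean((pred_rgb - gt_rgb) ** 2)

def depth_loss(pred_depth, prior_depth, mask):
    # Illustrative depth-supervision term: penalize deviation from a dense
    # depth prior only where the prior is marked reliable (mask == 1),
    # e.g. on low-textured architectural surfaces. The paper's actual
    # loss formulation is not reproduced here.
    if mask.sum() == 0:
        return 0.0
    diff = (pred_depth - prior_depth) * mask
    return np.sum(diff ** 2) / mask.sum()

def total_loss(pred_rgb, gt_rgb, pred_depth, prior_depth, mask, lam=0.1):
    # Weighted combination; lam trades off color fidelity against
    # geometric consistency (value chosen purely for illustration).
    return photometric_loss(pred_rgb, gt_rgb) + lam * depth_loss(
        pred_depth, prior_depth, mask
    )
```

Masking the depth term is one simple way to avoid propagating unreliable depth estimates from regions where RGB-feature correspondence fails, which is the failure mode the abstract highlights.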

CC BY-NC-ND 4.0


Paper citation in several formats:
Repinetska, I., Hilsmann, A. and Eisert, P. (2025). Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios. In Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP; ISBN 978-989-758-728-3; ISSN 2184-4321, SciTePress, pages 725-734. DOI: 10.5220/0013301500003912

@conference{visapp25,
author={Iryna Repinetska and Anna Hilsmann and Peter Eisert},
title={Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios},
booktitle={Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP},
year={2025},
pages={725-734},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013301500003912},
isbn={978-989-758-728-3},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 20th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 2: VISAPP
TI - Improving Geometric Consistency for 360-Degree Neural Radiance Fields in Indoor Scenarios
SN - 978-989-758-728-3
IS - 2184-4321
AU - Repinetska, I.
AU - Hilsmann, A.
AU - Eisert, P.
PY - 2025
SP - 725
EP - 734
DO - 10.5220/0013301500003912
PB - SciTePress
ER -