Authors:
Diego Hernández Rodríguez 1,2; Motoharu Sonogashira 1,2; Kazuya Kitano 2; Yuki Fujimura 2; Takuya Funatomi 2; Yasuhiro Mukaigawa 2 and Yasutomo Kawanishi 1,2
Affiliations:
1 Guardian Robot Project, RIKEN, Kyoto, Japan; 2 Division of Information Science, Nara Institute of Science and Technology, Nara, Japan
Keyword(s):
Event Camera Simulation, Neural Radiance Fields.
Abstract:
Event cameras are novel sensors that offer significant advantages over standard cameras, such as high temporal resolution, high dynamic range, and low latency. Despite recent efforts, however, event cameras remain relatively expensive and difficult to obtain. Simulators for these sensors are crucial for developing new algorithms and mitigating accessibility issues. However, existing simulators based on real-world video often fail to generalize to novel viewpoints or temporal resolutions, making the generation of realistic event data from a single scene infeasible. To address these challenges, we propose enhancing event camera simulators with neural radiance fields (NeRFs). NeRFs can synthesize novel views of complex scenes from a low-frame-rate video sequence, providing a powerful tool for simulating event cameras from arbitrary viewpoints. This approach not only simplifies the simulation process but also allows for greater flexibility and realism in generating event camera data, making the technology more accessible to researchers and developers.
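The simulation principle the abstract builds on can be summarized as follows: an event camera fires a per-pixel event whenever the log intensity changes by more than a contrast threshold between observations. A minimal sketch of this standard threshold model applied to a pair of rendered frames is shown below; the function name, threshold value, and epsilon are illustrative, not taken from the paper.

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2, eps=1e-6):
    """Generate per-pixel event polarities from two intensity frames.

    Following the standard event camera model: an event fires wherever
    the log-intensity change since the previous frame exceeds `threshold`
    (+1 for brightening, -1 for darkening, 0 otherwise).
    """
    # Epsilon avoids log(0) for fully dark pixels.
    log_prev = np.log(prev.astype(np.float64) + eps)
    log_curr = np.log(curr.astype(np.float64) + eps)
    diff = log_curr - log_prev

    events = np.zeros(diff.shape, dtype=np.int8)
    events[diff >= threshold] = 1
    events[diff <= -threshold] = -1
    return events

# A pixel that brightens fires a positive event; an unchanged one stays silent.
prev = np.array([[0.10, 0.50]])
curr = np.array([[0.50, 0.50]])
print(events_from_frames(prev, curr))  # [[1 0]]
```

In a NeRF-based simulator, `prev` and `curr` would be consecutive views rendered along a virtual camera trajectory, so the frame rate and viewpoint are free parameters rather than fixed by a captured video.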