Deblurring Neural Radiance Fields with Event-driven Bundle Adjustment

Yunshan Qi1       Lin Zhu2 *       Yifan Zhao1       Nan Bao1       Jia Li1 *

1Beihang University    2Beijing Institute of Technology   

Accepted by the 32nd ACM International Conference on Multimedia (ACM MM 2024)

Framework

Abstract

Neural Radiance Fields (NeRF) achieves impressive 3D representation learning and novel view synthesis results with high-quality multi-view images as input. However, motion blur often occurs in low-light and high-speed-motion scenes, significantly degrading the reconstruction quality of NeRF. Previous deblurring NeRF methods struggle to estimate pose and lighting changes during the exposure time, making them unable to accurately model the motion blur. The bio-inspired event camera, which measures intensity changes with high temporal resolution, compensates for this information deficiency. In this paper, we propose Event-driven Bundle Adjustment for Deblurring Neural Radiance Fields (EBAD-NeRF) to jointly optimize the learnable poses and NeRF parameters by leveraging hybrid event-RGB data. An intensity-change-metric event loss and a photo-metric blur loss are introduced to strengthen the explicit modeling of camera motion blur. Experiments on both synthetic and real-captured data demonstrate that EBAD-NeRF can recover an accurate camera trajectory during the exposure time and learn sharper 3D representations than prior works.
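The two losses named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the NeRF renders a stack of sharp images at poses interpolated along the exposure, that the blur loss compares their average to the captured blurry image, and that the event loss compares the predicted log-intensity change across the exposure to the accumulated event polarities scaled by a hypothetical contrast threshold.

```python
import numpy as np

def blur_loss(sharp_stack, blurry):
    """Photo-metric blur loss (sketch): averaging the sharp renders
    over the exposure time should reproduce the blurry capture."""
    synthesized = sharp_stack.mean(axis=0)  # temporal average -> blurry image
    return np.mean((synthesized - blurry) ** 2)

def event_loss(sharp_stack, event_accum, threshold=0.2, eps=1e-5):
    """Intensity-change-metric event loss (sketch): the log-intensity
    difference between the first and last renders should match the
    accumulated event count scaled by the contrast threshold.
    `threshold` is an assumed value, not from the paper."""
    pred_change = np.log(sharp_stack[-1] + eps) - np.log(sharp_stack[0] + eps)
    return np.mean((pred_change - threshold * event_accum) ** 2)

# Toy usage: 5 renders along the exposure of a 4x4 RGB patch.
stack = np.full((5, 4, 4, 3), 0.5)
print(blur_loss(stack, np.full((4, 4, 3), 0.5)))   # static scene -> zero loss
print(event_loss(stack, np.zeros((4, 4, 3))))      # no change, no events -> zero
```

In the actual method both losses would be summed and backpropagated jointly through the learnable exposure-time poses and the NeRF weights; the sketch above only shows the loss geometry.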

Qualitative Results on Real Data

Qualitative Results on Synthetic Data

Supplementary Video