GeMS: Efficient Gaussian Splatting for Extreme Motion-Blurred Images

Under review

[Teaser figure: Input | GeMS | GeMS-E]

The GeMS framework learns a sharp 3D Gaussian representation of the scene, along with the camera motion trajectories, directly from extreme motion-blurred images, enabling state-of-the-art deblurring, high-quality novel view synthesis, and real-time rendering, all with short training times and minimal memory overhead.


Abstract

GeMS is a 3D Gaussian Splatting framework designed to handle extreme motion blur in scene reconstruction. Unlike traditional methods that require sharp images for pose estimation and point cloud generation, GeMS directly processes blurred images, making it more practical for real-world scenarios.

Our approach integrates VGGSfM, a deep-learning-based structure-from-motion (SfM) pipeline, to estimate camera poses and generate point clouds directly from blurred inputs, together with MCMC-based Gaussian Splatting, which provides robust scene initialization without heuristic densification. In addition, camera motion and Gaussian parameters are optimized jointly, improving stability and accuracy.
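As a rough illustration of the joint optimization, the sketch below (our own simplified example, not the released implementation) models the camera motion during the exposure as a learnable trajectory, averages sharp renders at sampled sub-poses to synthesize the blurred observation, and backpropagates a photometric loss into both the Gaussian parameters and the trajectory. The splat_isotropic toy renderer, the two-pose linear trajectory, and all hyperparameters are illustrative assumptions standing in for a full Gaussian Splatting rasterizer and SE(3) motion model.

# Minimal sketch of blur-aware joint optimization (illustrative only):
# sharp renders at sampled sub-poses are averaged to synthesize the blurred
# image, and the loss updates both Gaussians and the per-view trajectory.
import torch

H, W, N = 64, 64, 500                                  # toy image size, number of Gaussians

# Learnable scene: 3D means, isotropic log-scales, colors (toy parameterization)
means  = torch.nn.Parameter(torch.randn(N, 3) * 0.3)
scales = torch.nn.Parameter(torch.full((N,), -3.0))
colors = torch.nn.Parameter(torch.rand(N, 3))

# Learnable per-view trajectory: start/end camera translations only,
# a simplification of the richer motion model used in practice.
t_start = torch.nn.Parameter(torch.zeros(3))
t_end   = torch.nn.Parameter(torch.zeros(3))

def splat_isotropic(t_cam):
    """Toy differentiable renderer: pinhole projection + isotropic 2D splats."""
    p = means - t_cam                                  # points in a translated camera frame
    z = p[:, 2].clamp(min=1.0)
    u = (p[:, 0] / z * 0.5 + 0.5) * W                  # image-plane coordinates
    v = (p[:, 1] / z * 0.5 + 0.5) * H
    ys, xs = torch.meshgrid(torch.arange(H).float(),
                            torch.arange(W).float(), indexing="ij")
    sigma = torch.exp(scales)[:, None, None] * 40.0
    w = torch.exp(-((xs - u[:, None, None])**2 +
                    (ys - v[:, None, None])**2) / (2 * sigma**2))
    img = (w[..., None] * colors[:, None, None, :]).sum(0)
    return img / (w.sum(0)[..., None] + 1e-6)

blurred_obs = torch.rand(H, W, 3)                      # placeholder blurred observation
opt = torch.optim.Adam([means, scales, colors, t_start, t_end], lr=1e-2)

for step in range(200):
    # Synthesize blur: average sharp renders along the exposure trajectory
    renders = [splat_isotropic(torch.lerp(t_start, t_end, tau))
               for tau in torch.linspace(0, 1, 8)]
    blurred_pred = torch.stack(renders).mean(0)
    loss = (blurred_pred - blurred_obs).abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

The key design point this sketch conveys is that the blur is explained entirely by the averaged camera trajectory, so the underlying Gaussians stay sharp and novel views rendered from a single pose come out deblurred.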

To further improve reconstruction, particularly when all input views are severely blurred, GeMS-E, an event-based extension of GeMS, incorporates Event-based Double Integral (EDI) deblurring, which first restores sharp images from the blurred inputs. These restored images are then fed into the GeMS pipeline, leading to improved camera pose estimation, higher-quality point clouds, and overall better reconstruction accuracy. GeMS-E is particularly useful when event data is available, enabling sharper and more reliable results even under extreme motion blur. To support further research in this area, we also provide a synthetic event dataset curated specifically for extreme-blur scenarios.
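For intuition, the EDI step can be summarized as follows: the blurred frame equals the latent sharp frame at a reference time multiplied by the average of exponentiated event integrals over the exposure, so dividing the blurred frame by that average recovers the sharp frame. The sketch below is a minimal NumPy illustration under that assumption; the contrast threshold c, the per-slice event binning, and the midpoint reference time are simplifying choices, not the exact pipeline used by GeMS-E.

# Minimal EDI deblurring sketch (simplified illustration, not the released code)
import numpy as np

def edi_deblur(blurred, event_frames, c=0.2):
    """Recover the sharp intensity frame at the exposure midpoint.

    blurred      : (H, W) blurred intensity frame, values in (0, 1]
    event_frames : (T, H, W) per-slice signed event counts within the exposure
    c            : event contrast threshold (sensor-dependent assumption)
    """
    T = event_frames.shape[0]
    mid = T // 2
    # Integrated event signal E(t), taken relative to the exposure midpoint
    cumulative = np.cumsum(event_frames, axis=0)
    E = cumulative - cumulative[mid]                   # (T, H, W)
    # B = L(mid) * mean_t exp(c * E(t))  =>  L(mid) = B / mean_t exp(c * E(t))
    denom = np.exp(c * E).mean(axis=0)
    sharp = blurred / np.clip(denom, 1e-6, None)
    return np.clip(sharp, 0.0, 1.0)

# Toy usage with random data standing in for a real blurred frame + event stream
blurred = np.random.rand(64, 64) * 0.8 + 0.1
events  = np.random.randint(-1, 2, size=(30, 64, 64)).astype(np.float32)
sharp   = edi_deblur(blurred, events)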


Keywords

Gaussian Splatting, Deblurring, Image Restoration, Event Cameras.


Pipeline


[Pipeline overview figure]

Results


GeMS and its event-based extension GeMS-E achieve high-quality 3D reconstructions and sharp novel view synthesis under extreme motion blur. Our results on both synthetic and real-world datasets demonstrate state-of-the-art performance in challenging blur scenarios.

Real-World Dataset Results



[Real-world results: Input | GeMS | GeMS-E]


Synthetic Dataset Results

Hover over the image to view the outputs of our method!

GeMS

[Grid of synthetic-scene results: blurred input images that switch to GeMS renderings on hover]


GeMS-E

[Grid of synthetic-scene results: blurred input images that switch to GeMS-E renderings on hover]

Comparisons


We compare GeMS and GeMS-E against strong baselines, including MPRNet, Restormer, EDI, E2NeRF, and EBAD-NeRF, on both synthetic and real-world datasets. Our method produces sharper reconstructions with better detail preservation under extreme motion blur. ExBluRF* and BAD-Gaussians* rely on sharp-image-based initialization, which is impractical for real captures, so they are excluded from the real-world comparisons.


Real-World Dataset


[Comparison figure: real-world dataset]



Synthetic Dataset


[Comparison figure: synthetic dataset]


BibTeX

@misc{matta2025gems,
  author       = {Gopi Raju Matta and Trisha Reddypalli and Divya Madhuri Vemunuri and Kaushik Mitra},
  title        = {{GeMS: Efficient Gaussian Splatting for Extreme Motion-Blurred Images}},
  year         = {2025},
  month        = apr,
  howpublished = {TechRxiv},
  note         = {Preprint},
  doi          = {10.36227/techrxiv.174362573.31249887/v1},
  url          = {https://www.techrxiv.org/doi/full/10.36227/techrxiv.174362573.31249887/v1}
}