FlareGS: 4D Flare Removal using Gaussian Splatting for Urban Scenes

Mayank Chandak1, Sai Sri Teja Kuppa1, Rahul1, Gopi Raju Matta1, Vinayak Gupta1, Kaushik Mitra1
1Indian Institute of Technology, Madras
FlareGS Teaser Figure

Our approach addresses 4D flare removal by aggregating multi-view information through Gaussian splatting. The pipeline recovers flare-free novel views from flare-corrupted multi-view videos: by drawing on spatially and temporally adjacent views, it produces photometrically and geometrically consistent reconstructions and improves performance on downstream tasks.

Abstract

Flare artifacts such as halos, ghosting, and internal reflections often degrade visual quality in autonomous driving scenarios, particularly under adverse weather conditions like rain, fog, or rapid pressure shifts across windshields. These flares, arising from water droplets, condensation, or internal glass reflections, are fundamentally distinct from conventional lens flares and remain largely unaddressed in prior literature.

In this work, we present the first systematic effort to model and remove such reflective flares that appear in real-world driving videos. Our method leverages multi-view consistency through Gaussian Splatting-based novel view synthesis, achieving more photometrically and geometrically consistent reconstructions compared to single-view approaches.

Our contributions are fourfold: (1) a physics-based synthetic pipeline, (2) a depth-guided Uformer architecture, (3) a Gaussian Splatting framework, and (4) a comprehensive evaluation.

Key Contributions

  • Physics-based Synthetic Pipeline: We introduce a controlled synthetic dataset that simulates flare formation using physics-informed rendering
  • Depth-guided Uformer Architecture: A multi-modal restoration framework that fuses flare-degraded RGB inputs with flare-invariant depth priors
  • Gaussian Splatting Framework: Novel view synthesis that enhances multi-view consistency and facilitates accurate reconstruction of flare-free scenes
  • Comprehensive Evaluation: Significant improvements in both visual fidelity and downstream tasks (segmentation, optical flow) under adverse weather conditions
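The physics-based synthetic pipeline above renders flare layers and composites them onto clean scenes. As a minimal sketch of the compositing step only (the paper's physics-informed renderer is more involved; `composite_flare` and the additive model are our illustrative assumptions, not the authors' code):

```python
import numpy as np

def composite_flare(scene, flare, gain=1.0):
    """Composite a rendered flare layer onto a clean scene image.

    Assumes a simple additive image-formation model in linear
    radiance space: I_degraded = clip(I_scene + g * F).
    `scene` and `flare` are float arrays in [0, 1] of equal shape.
    """
    return np.clip(scene + gain * flare, 0.0, 1.0)

# Toy example: a dim scene plus a bright synthetic halo patch.
scene = np.full((4, 4, 3), 0.2)
flare = np.zeros((4, 4, 3))
flare[1:3, 1:3] = 0.9          # bright halo region
degraded = composite_flare(scene, flare)
```

Pairing each `degraded` image with its clean `scene` counterpart yields supervised training pairs for the restoration network.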

Method Overview

Method Architecture

Our depth-guided Uformer fuses flare-corrupted RGB inputs with flare-invariant depth priors from LiDAR sensors. Because depth is unaffected by optical flare artifacts, it serves as a reliable structural prior for disentangling flares from scene content, enabling precise localization of flare-affected regions and more accurate restoration than RGB-only approaches.
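The fusion idea can be sketched in a few lines. This is a minimal illustration, not the paper's actual Uformer block: we assume a simple sigmoid gate derived from the depth features that modulates the RGB features, with the depth channel also concatenated so later layers can use it directly (`depth_guided_fusion` and the gating scheme are our assumptions):

```python
import numpy as np

def depth_guided_fusion(rgb_feat, depth_feat):
    """Fuse RGB features with a flare-invariant depth prior.

    rgb_feat:   (H, W, C) feature map from the flare-corrupted image.
    depth_feat: (H, W, 1) feature map from the LiDAR depth prior.
    A per-pixel sigmoid gate from depth re-weights the RGB features,
    and the depth channel is appended for downstream layers.
    """
    gate = 1.0 / (1.0 + np.exp(-depth_feat))        # sigmoid gate in (0, 1)
    fused = np.concatenate([rgb_feat * gate, depth_feat], axis=-1)
    return fused

# Toy example with an 8x8 map of 16 RGB-derived channels.
rgb_feat = np.random.randn(8, 8, 16)
depth_feat = np.random.randn(8, 8, 1)
fused = depth_guided_fusion(rgb_feat, depth_feat)
```

In the full architecture this fusion would sit inside each encoder stage of the Uformer rather than being applied once.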

Qualitative Results

Qualitative Results

Visual comparison of flare removal across methods and scenarios. Our method suppresses reflective flares, ghosting artifacts, and halo patterns under diverse conditions, including rain, fog, and varied lighting, while preserving fine scene detail. Compared to baseline methods, it maintains better color consistency and structural integrity in the restored images.

Downstream Task Performance

Segmentation Results

Improved semantic segmentation performance after flare removal using our method. Flare removal significantly boosts downstream perception: semantic segmentation accuracy increases by 15-20% on flare-corrupted images, and the restored predictions show that our method preserves important scene semantics while removing artifacts. This validates the practical impact of flare removal for autonomous driving, where accurate scene understanding is critical.

Real-world Driving Scenarios

Real-world Results

Flare removal results on real-world driving scenarios. Real-world evaluation demonstrates robust performance across diverse settings, including urban environments, highways, and adverse weather. Our method handles varied flare types, such as reflective flares from streetlights, scattering flares from rain, and halo patterns from fog, delivering consistent flare suppression while maintaining scene fidelity and making it suitable for deployment in autonomous driving systems.

BibTeX

@inproceedings{flaregs2025,
  title={FlareGS: 4D Flare Removal using Gaussian Splatting for Urban Scenes},
  author={Chandak, Mayank and Kuppa, Sai Sri Teja and Rahul and Matta, Gopi Raju and Gupta, Vinayak and Mitra, Kaushik},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2025}
}