LRC-WeatherNet: LiDAR, RADAR, and Camera Fusion Network for Real-time Weather-type Classification in Autonomous Driving
Autonomous vehicles face significant perception and navigation challenges in adverse weather such as rain, fog, and snow, which degrades the performance of LiDAR, RADAR, and RGB camera sensors. This study introduces LRC-WeatherNet, a novel multi-sensor fusion framework that integrates LiDAR, RADAR, and camera data for real-time weather-type classification in autonomous driving. The proposed approach combines early fusion over a unified Bird's Eye View representation with mid-level gated fusion of modality-specific feature maps, enabling adaptive assessment of each sensor's reliability under varying weather conditions. Evaluated on the extensive MSU-4S dataset covering nine distinct weather types, LRC-WeatherNet achieves superior classification accuracy and computational efficiency, significantly outperforming unimodal baselines.
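The abstract does not specify the exact gating mechanism, so the following is only a minimal NumPy sketch of one common form of mid-level gated fusion: a learned linear layer maps the concatenated per-modality features to one sigmoid gate per modality, and the fused representation is the gate-weighted sum of the modality feature vectors. All names (`gated_fusion`, `W`, `b`) and the scalar-gate-per-modality design are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(feats, W, b):
    """Sketch of mid-level gated fusion (assumed design, not the paper's).

    feats : list of (C,) pooled feature vectors, one per modality
            (e.g. LiDAR, RADAR, camera).
    W, b  : parameters of a linear gating layer mapping the concatenated
            features to one scalar gate per modality.
    Returns the gate-weighted sum of the modality features and the gates.
    """
    concat = np.concatenate(feats)        # (M*C,) joint feature vector
    gates = sigmoid(W @ concat + b)       # (M,) one reliability gate per sensor
    fused = sum(g * f for g, f in zip(gates, feats))
    return fused, gates

# Toy usage with random features standing in for the three sensor branches.
rng = np.random.default_rng(0)
C, M = 8, 3                               # feature dim, number of modalities
feats = [rng.standard_normal(C) for _ in range(M)]
W = rng.standard_normal((M, M * C)) * 0.1
b = np.zeros(M)
fused, gates = gated_fusion(feats, W, b)
print(fused.shape, gates.shape)           # (8,) (3,)
```

In such a design, a sensor degraded by the current weather (e.g. a camera in fog) can be down-weighted by its gate, which is the adaptive-reliability behavior the abstract attributes to the mid-level fusion stage.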