Customer Datasets
The links below are to customers and partners who have published radar datasets.
Oxford Radar RobotCar Dataset
The Oxford Radar RobotCar Dataset provides data from a Navtech radar and companion sensors, with optimised ground-truth radar odometry, for 280 km of driving around Oxford, UK.
Heriot-Watt RADIATE Dataset
RADIATE (RAdar Dataset In Adverse weaThEr) is a high-resolution radar dataset that includes about 3 hours of annotated radar images and more than 200,000 labelled instances on public roads.
MulRan
Multimodal (Radar & LiDAR) Range Dataset for Urban Place Recognition - a dataset for radio detection and ranging (radar) and light detection and ranging (LiDAR), specifically targeting the urban environment. The dataset focuses on range-sensor-based place recognition and provides 6D baseline trajectories of the vehicle as place-recognition ground truth. The radar data are provided in both raw-level and image formats: a set of time-stamped 1D intensity arrays and 360° polar images, respectively. This gives researchers flexibility to work with either raw data or image data, depending on the purpose of their research.
Radar-LiDAR Dataset - dgbicra2019-radar-lidar
The Radar-LiDAR dataset provides radar data with a baseline vehicle trajectory for evaluation, together with a 3D LiDAR point cloud as prior information. Unlike existing radar datasets, the radar data are provided in both raw-level and image formats: time-stamped, accumulated 1D intensity arrays and 360° polar images.
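Several of the datasets above distribute scans as 360° polar images (one row per azimuth, one column per range bin). A common first preprocessing step is resampling such a scan onto a Cartesian grid. The sketch below is a minimal nearest-neighbour version; the array shapes and bin spacings are illustrative assumptions, not the layout of any particular dataset.

```python
import numpy as np

def polar_to_cartesian(scan, cart_size=256, max_range_bins=None):
    """Resample a polar radar scan (azimuth x range) onto a Cartesian grid.

    `scan` is assumed to be a 2D intensity array with one row per azimuth
    (covering a full 360 degrees) and one column per range bin. Real datasets
    differ in azimuth count, range resolution, and orientation conventions,
    so treat this as a sketch rather than a loader for any specific format.
    """
    num_az, num_bins = scan.shape
    if max_range_bins is None:
        max_range_bins = num_bins
    # Cartesian pixel coordinates (in units of range bins), centred on the sensor.
    coords = np.linspace(-max_range_bins, max_range_bins, cart_size)
    xx, yy = np.meshgrid(coords, coords)
    rng = np.sqrt(xx ** 2 + yy ** 2)                # range, in bins
    az = np.mod(np.arctan2(yy, xx), 2 * np.pi)      # azimuth, in [0, 2*pi)
    # Nearest-neighbour lookup into the polar scan.
    az_idx = np.clip((az / (2 * np.pi) * num_az).astype(int), 0, num_az - 1)
    rng_idx = np.clip(rng.astype(int), 0, num_bins - 1)
    cart = scan[az_idx, rng_idx]
    cart[rng > max_range_bins] = 0                  # blank pixels beyond max range
    return cart
```

Bilinear interpolation and per-dataset azimuth offsets are usually added in practice; nearest-neighbour keeps the geometry of the transform easy to follow.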
The Boreas Dataset
The Boreas dataset encompasses a year-long collection of driving data along a repeated route, exhibiting noticeable seasonal change. It comprises more than 350 km of driving data, including challenging weather conditions such as rain and heavy snow. The Boreas data-taking platform carries an exceptional sensor suite: a 128-channel Velodyne Alpha Prime lidar, a 360-degree Navtech radar, and precise ground-truth poses acquired from an Applanix POSLV GPS/IMU. The platform currently hosts active, open benchmarks for odometry, metric localization, and 3D object detection.
OSDaR23
OSDaR23 is the first freely available multi-sensor dataset for machine learning in the rail industry, aimed at the development of fully automated train driving.
Homepage: OSDaR23 - multi-sensor data set for machine learning (digitale-schiene-deutschland.de)
The multi-sensor data and the associated annotations contained in the OSDaR23 dataset can be downloaded from the following link: https://doi.org/10.57806/9mv146r0
For easy use of the dataset, DB Netz AG has also published a suitable Python software development environment:
https://github.com/DSD-DBS/raillabel
To visualize the dataset, the WebLabel Player of the Vicomtech Research Foundation can be used:
https://github.com/Vicomtech/weblabel
Radar Doppler Dataset
The Radar Doppler Dataset provides 25 km of data from a prototype Navtech CTS350-X millimetre-wave FMCW radar containing raw Doppler information, alongside LiDAR, camera, and RTK GPS. The new Doppler configuration provides raw return discrepancies, which can be used to detect moving objects and estimate their velocity. We are excited to share this data with the community, and we hope this dataset will help accelerate research in this interesting modality.
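The connection between a Doppler measurement and target velocity is the standard relation v = c · f_d / (2 · f_c). A minimal sketch, assuming a measured Doppler frequency shift is already available; the carrier frequency used here is an illustrative placeholder, not a CTS350-X specification:

```python
# Convert a measured Doppler frequency shift into a radial velocity.
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz=76.5e9):
    """Radial velocity from Doppler shift: v = c * f_d / (2 * f_c).

    `carrier_hz` is an assumed 76.5 GHz carrier for illustration; substitute
    the actual operating frequency of the radar in question. A positive
    Doppler shift corresponds to a target closing on the sensor.
    """
    return C * doppler_shift_hz / (2.0 * carrier_hz)
```

For example, at a 76.5 GHz carrier, a shift of about 5.1 kHz corresponds to a radial velocity of roughly 10 m/s. Note that this gives only the velocity component along the line of sight, which is why raw Doppler data is typically fused with scan-to-scan motion estimates.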
HeRCULES: Heterogeneous Radar Dataset in Complex Urban Environment for Multi-session Radar SLAM
https://arxiv.org/abs/2502.01946
A multi-modal dataset with heterogeneous radars, FMCW LiDAR, IMU, GPS, and cameras. This is the first dataset to integrate 4D radar and spinning radar alongside FMCW LiDAR, offering unparalleled localization, mapping, and place recognition capabilities. The dataset covers diverse weather and lighting conditions and a range of urban traffic scenarios, enabling a comprehensive analysis across various environments. The sequence paths with multiple revisits and ground truth pose for each sensor enhance its suitability for place recognition research. The dataset and development tools are available at this https URL.
MUlti-SEnsor Semantic perception dataset - ETH Zurich
https://muses.vision.ee.ethz.ch/
MUSES comprises a diverse collection of 2,500 images, evenly distributed across different combinations of weather conditions (clear, fog, rain, and snow) and illumination (daytime/nighttime). Each image is accompanied by high-quality 2D pixel-level panoptic annotations, as well as class-level and instance-level uncertainty annotations. The dataset contains synchronized recordings from a variety of sensors, including a frame camera, a MEMS lidar, an FMCW radar, an HD event camera, and an IMU/GNSS sensor, plus a corresponding image of the same scene taken under normal conditions.
MOANA Dataset - a multi-radar dataset for unmanned surface vessels
This dataset integrates short-range LiDAR data, medium-range W-band radar data, and long-range X-band radar data into a unified framework. It also includes object labels for oceanic object detection, derived from radar and stereo camera images. The dataset comprises seven sequences collected from diverse regions with varying levels of estimation difficulty, ranging from easy to challenging, and includes common locations suitable for global localization tasks. The dataset can be found at the following link: this https URL