Open Datasets
It's often difficult to find open source datasets that reflect multi-sensor environments for testing and evaluation. We're compiling an updated list for reference here - follow the links to get each dataset!
ZOD - Zenseact Open Dataset
The Zenseact Open Dataset (ZOD) is a large multi-modal autonomous driving (AD) dataset created by our customer, Zenseact. Over a two-year period across 14 different European countries, researchers at Zenseact collected data using a fleet of vehicles equipped with a full sensor suite consisting of 3x lidars, 1x camera, and an IMU. In total, the dataset includes 100k single-frame images, 1,473 sequences, and 29 drives.
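Zenseact also publishes a Python development kit for ZOD. The sketch below is a rough illustration only, assuming the pip package zod, a local download of the mini split, and the ZodFrames entry point and get_image() accessor described in the devkit README; verify the exact names and signatures against the official documentation.

```python
# pip install zod  (Zenseact's ZOD devkit; API assumed, check the official docs)
from zod import ZodFrames

# Assumed entry point: point the devkit at a downloaded copy of the dataset.
zod_frames = ZodFrames(dataset_root="/data/zod", version="mini")

# Assumed access pattern: frames are addressed by id and expose sensor data.
frame_id = next(iter(zod_frames.get_all_ids()))
frame = zod_frames[frame_id]
image = frame.get_image()
print(frame_id, image.shape)
```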
Audi A2D2
Audi's dataset includes 40k frames with semantic segmentation labels, 12k frames with 3D bounding boxes, and approximately 390k unlabeled frames in sequences from several German cities. The data comes from a multi-sensor setup of 5x lidars and 6x cameras that captures a dense 360° view of the environment in 2D and 3D.
Berkeley BDD100K
This open source dataset of 100,000 driving videos is very diverse, covering different times of day and weather conditions, captured across urban, highway, and rural areas.
nuScenes
nuScenes is a multi-modal dataset collected with 6 cameras, 1 lidar, 5 radars, IMU, and GPS. It consists of 1,000 dense urban scenes collected by test fleets operating in Boston and Singapore.
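For quick exploration, nuScenes provides an official Python devkit (nuscenes-devkit). A minimal sketch, assuming the v1.0-mini split has been downloaded and extracted to /data/sets/nuscenes (the path is a placeholder):

```python
# pip install nuscenes-devkit
from nuscenes.nuscenes import NuScenes

# Version and data root are placeholders; point them at your local copy.
nusc = NuScenes(version="v1.0-mini", dataroot="/data/sets/nuscenes", verbose=True)

# Walk from a scene to its first sample and look up the front-camera frame.
scene = nusc.scene[0]
sample = nusc.get("sample", scene["first_sample_token"])
cam_front = nusc.get("sample_data", sample["data"]["CAM_FRONT"])
print(scene["name"], cam_front["filename"])
```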