Lyft Open-Sourced Its Level 5 Autonomous Driving Dataset

Lyft has open-sourced an autonomous driving dataset collected by its Level 5 self-driving fleet. One of the largest publicly available datasets of its kind, it contains high-quality data from camera and LIDAR sensors.

It contains 55,000 human-labeled 3D frames, along with an HD spatial semantic map and a drivable surface map. The data was collected using seven cameras, covering a 360-degree view around the car, and three LIDAR sensors, two mounted at the front of the car and one on its roof.

All the cameras are synchronized with each other and with the LIDAR sensors, so that each sample fuses the readings of all sensors at a particular point in time. The cameras have resolutions of 1224×1024 and 2048×864 and a wide field of view. The LIDAR sensors are 40-beam units operating at 10 Hz that together produce around 216,000 points per sweep.
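As an illustration of how the fused samples are organized, the sketch below uses the released devkit's nuScenes-style API to list the synchronized sensor readings bundled in one keyframe; the dataset paths are placeholders, and the package and field names are assumed to match the devkit as published.

```python
# Minimal sketch, assuming the lyft_dataset_sdk devkit (a fork of the nuScenes devkit)
# and placeholder paths to wherever the dataset has been extracted.
from lyft_dataset_sdk.lyftdataset import LyftDataset

level5data = LyftDataset(
    data_path="/path/to/lyft_level5",       # images, LIDAR sweeps and maps
    json_path="/path/to/lyft_level5/data",  # annotation tables in JSON form
    verbose=True,
)

# Each sample groups one synchronized keyframe per sensor channel
# (7 cameras and 3 LIDARs), keyed by channel name.
sample = level5data.sample[0]
for channel, sd_token in sample["data"].items():
    sd = level5data.get("sample_data", sd_token)
    print(channel, sd["timestamp"], sd["filename"])
```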

Lyft is the latest in a series of autonomous driving companies that have decided to share at least part of their data with the community. Its "Level 5" division, responsible for the research and development of self-driving vehicle technology, collected the data from Lyft's fleet of vehicles on roads in the United States.

The dataset can be downloaded from Lyft's website. Alongside the dataset, Lyft engineers have also released a devkit and a tutorial on how to use the data.
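For a sense of what the devkit offers, the following sketch loads the dataset and renders one sample's LIDAR sweep with its human-labeled 3D boxes, then all camera and LIDAR views in a single figure; paths, the sample index, and the "LIDAR_TOP" channel name are illustrative assumptions based on the devkit's nuScenes-style API.

```python
# Sketch of the devkit's rendering helpers (inherited from the nuScenes devkit API);
# install with `pip install lyft_dataset_sdk`. Paths and tokens below are placeholders.
from lyft_dataset_sdk.lyftdataset import LyftDataset

level5data = LyftDataset(data_path="/path/to/lyft_level5",
                         json_path="/path/to/lyft_level5/data")

my_sample = level5data.sample[10]

# Draw the roof LIDAR point cloud with the labeled 3D boxes for this keyframe.
level5data.render_sample_data(my_sample["data"]["LIDAR_TOP"])

# Render every camera and LIDAR view of the sample in one figure.
level5data.render_sample(my_sample["token"])
```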
