TartanAir Dataset Documentation
Install:

```shell
pip install tartanair
```
Welcome to TartanAir V2!
Let’s go on an adventure to beautiful mountains, to dark caves, to stylish homes, to the Moon 🚀, and to other exciting places. And there is more! You, your models, and your robots can experience these worlds via a variety of sensors: LiDAR, IMU, optical cameras with any lens configuration you want (we provide customizable fisheye, pinhole, and equirectangular camera models), depth cameras, segmentation “cameras”, and event cameras.

All the environments include recorded trajectories that were designed to be challenging and realistic. Can we push the state of the art in SLAM, navigation, and robotics?
🆕 TartanGround: A Large-Scale Dataset for Ground Robot Perception and Navigation
We are still working on adding the TartanGround dataset to the documentation. The tartanair Python package will support this new dataset as well.
Getting Started:
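A minimal sketch of loading and downloading data with the tartanair package. The data root path, environment name, and the specific modality and camera names below are illustrative placeholders; consult the package documentation for the full lists of available environments, difficulties, modalities, and camera names.

```python
import tartanair as ta

# Initialize the package with a local root folder where the
# TartanAir V2 data will be stored. (Path is a placeholder.)
ta.init('/path/to/tartanair-v2')

# Download one environment. The environment, difficulty, modality,
# and camera names here are examples; availability varies per
# environment, so check the docs before downloading.
ta.download(env='ExampleEnv',
            difficulty=['easy'],
            modality=['image', 'depth'],
            camera_name=['lcam_front'])
```

Downloads require a network connection and can be large, so start with a single modality and camera to verify your setup before fetching full environments.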
License
As with TartanAir V1, the TartanAir V2 dataset is licensed under a Creative Commons Attribution 4.0 International License, and the toolkit is licensed under a BSD-3-Clause License.