.. tartanair documentation master file, created by
   sphinx-quickstart on Wed Mar 1 21:13:43 2023.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

TartanAir Dataset Documentation
==========================================================

Install:

.. code-block:: bash

   pip install tartanair

Welcome to TartanAir V2!

Let's go on an adventure to beautiful mountains, to dark caves, to stylish homes, to the Moon 🚀, and to other exciting places. And there is more! You, your models, and your robots can experience these worlds via a variety of sensors: LiDAR, IMU, optical cameras with any lens configuration you want (we provide customizable fisheye, pinhole, and equirectangular camera models), depth cameras, segmentation "cameras", and event cameras.
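
The toolkit installed above also exposes a small Python API for fetching data. The sketch below assumes the ``init``/``download`` entry points and the environment and camera names used in the toolkit's examples; treat all of these identifiers as assumptions and adjust them to your installed version:

.. code-block:: python

   # Sketch only: the names below (init, download, the environment and
   # camera identifiers) follow the toolkit's examples and may differ
   # in your installed version.
   def fetch_sample(data_root: str) -> None:
       """Download one environment's front-camera images and depth."""
       import tartanair as ta  # requires `pip install tartanair`

       ta.init(data_root)                      # register the local dataset root
       ta.download(env='ArchVizTinyHouseDay',  # example environment name
                   difficulty=['easy'],
                   modality=['image', 'depth'],
                   camera_name=['lcam_front']) # example front-camera name
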

.. image:: images/title.png

All the environments have recorded trajectories that were designed to be challenging and realistic. Can we improve the state of the art in SLAM, navigation, and robotics?

.. toctree::
   :maxdepth: 2
   :caption: Getting Started:

   installation
   examples
   usage
   modalities
   environments
   troubleshooting


License
==========================================================

Similar to `TartanAir V1 <https://theairlab.org/tartanair-dataset/>`_, the TartanAir V2 dataset is licensed under a `Creative Commons Attribution 4.0 International License <https://creativecommons.org/licenses/by/4.0/>`_, and the toolkit is licensed under a `BSD-3-Clause License <https://opensource.org/licenses/BSD-3-Clause>`_.