FoMo: A Proposal for a Multi-Season Dataset for Robot Navigation in Forêt Montmorency

ICRA 2024 Workshop on Field Robotics. Poster presentation: May 13, 2024.

Abstract

In this paper, we propose the FoMo (Forêt Montmorency) dataset: a comprehensive, multi-season data collection. Located in the Montmorency Forest, Quebec, Canada, our dataset will capture a rich variety of sensory data over six distinct trajectories totaling 6 kilometers, repeated through different seasons to accumulate 42 kilometers of recorded data. The boreal forest setting adds diversity to existing datasets for mobile robot navigation. The proposed dataset will feature a broad array of sensor modalities, including lidar, radar, and a navigation-grade Inertial Measurement Unit (IMU), against the backdrop of challenging boreal forest conditions. Notably, the FoMo dataset will be distinguished by its inclusion of seasonal variations, such as changes in tree canopy and snow depth up to 2 meters, presenting new challenges for robot navigation algorithms. Alongside the data, we will offer centimeter-level accurate ground truth, obtained through Post-Processed Kinematic (PPK) Global Navigation Satellite System (GNSS) correction, facilitating precise evaluation of odometry and localization algorithms. This work aims to spur advancements in autonomous navigation, enabling the development of robust algorithms capable of handling the dynamic, unstructured environments characteristic of boreal forests. With a public odometry and localization leaderboard and a dedicated software suite, we invite the robotics community to engage with the FoMo dataset and explore new frontiers in robot navigation under extreme environmental variations. We seek feedback from the community on this proposal to make the dataset as useful as possible.
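
To illustrate how the centimeter-level ground truth could be used, the sketch below computes a translational Absolute Trajectory Error (ATE) between an estimated trajectory and the PPK GNSS reference. The file names, CSV layout, and nearest-timestamp association are illustrative assumptions on our part, not the dataset's published format or official evaluation protocol.

import numpy as np

def load_trajectory(path):
    # Hypothetical layout: one row per pose, columns [timestamp, x, y, z].
    return np.loadtxt(path, delimiter=",", skiprows=1)

def translational_ate(estimate, ground_truth):
    # Associate each estimated pose with the ground-truth sample whose
    # timestamp is nearest (ground truth assumed sorted by time).
    idx = np.searchsorted(ground_truth[:, 0], estimate[:, 0])
    idx = np.clip(idx, 1, len(ground_truth) - 1)
    left = ground_truth[idx - 1, 0]
    right = ground_truth[idx, 0]
    idx -= ((estimate[:, 0] - left) < (right - estimate[:, 0])).astype(int)
    # Root-mean-square translational error, in meters.
    errors = estimate[:, 1:4] - ground_truth[idx, 1:4]
    return np.sqrt(np.mean(np.sum(errors**2, axis=1)))

est = load_trajectory("odometry_estimate.csv")  # hypothetical file name
gt = load_trajectory("ppk_ground_truth.csv")    # hypothetical file name
print(f"ATE (RMSE): {translational_ate(est, gt):.3f} m")

A complete evaluation would additionally align the estimate and the reference into a common frame (e.g., with an SE(3) or Umeyama alignment) before computing the error.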

Trajectories

The data recording will take place in the Montmorency Forest, located 70 km north of Quebec City, Canada. The six proposed trajectories have a combined length of 6 km and are located mostly on forest roads.

Platform

Our mobile robot is equipped with a variety of proprioceptive and exteroceptive sensors, mounted on a custom-made modular sensor frame. The front of the robot hosts a ZED X stereo camera and a RoboSense RS-32 3D lidar. A Navtech CIR-304H radar is mounted at the rear, with the Atlans-C navigation-grade Inertial Navigation System (INS) underneath it. A Septentrio GNSS receiver is connected to the Atlans-C. Next to the lidar sit two MEMS IMUs: an XSens MTi-10 2A8G4 and a VectorNav VN100. Also at the rear, a Basler ace2 camera with a fisheye lens points downward.
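
Assuming the recordings are distributed as ROS bags, a reasonable but unconfirmed assumption on our part, a minimal Python sketch for iterating over the multimodal streams could look as follows; the topic names are placeholders rather than the dataset's published schema.

import rosbag  # ROS 1 Python API; requires a sourced ROS environment

# Placeholder topic names for the sensors described above; the actual
# FoMo topic layout may differ.
TOPICS = [
    "/rslidar/points",    # RoboSense RS-32 lidar
    "/radar/scan",        # Navtech CIR-304H radar
    "/zedx/left/image",   # ZED X stereo camera, left image
    "/atlans/ins",        # Atlans-C INS solution
]

with rosbag.Bag("fomo_sample.bag") as bag:  # hypothetical file name
    for topic, msg, stamp in bag.read_messages(topics=TOPICS):
        # Print one line per message, ordered by recording time.
        print(f"{stamp.to_sec():.6f}  {topic}  {type(msg).__name__}")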


Time Synchronization and Calibration

The IEEE 1588 Precision Time Protocol (PTP) will be used to eliminate clock discrepancies between devices in our system. The main computer, directly connected to the GNSS receiver, serves as the PTP grandmaster and maintains synchronized time across all devices. For spatial calibration, we consider intrinsic and extrinsic calibration of the cameras, as well as spatial calibration between distinct pairs of sensors. All pairwise calibration priors come from a precise CAD model of our system.
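
As a small illustration of how such CAD-derived priors are typically consumed, the sketch below maps lidar points into the camera frame with a homogeneous transform. The numeric values are placeholders for illustration, not our calibrated extrinsics.

import numpy as np

# Hypothetical lidar-to-camera extrinsic prior (e.g., read off the CAD
# model): a 4x4 homogeneous transform T_camera_lidar. Placeholder values.
T_camera_lidar = np.array([
    [0.0, -1.0,  0.0,  0.10],
    [0.0,  0.0, -1.0, -0.05],
    [1.0,  0.0,  0.0,  0.20],
    [0.0,  0.0,  0.0,  1.00],
])

def lidar_to_camera(points_lidar):
    # Transform an (N, 3) array of lidar points into the camera frame.
    homogeneous = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    return (T_camera_lidar @ homogeneous.T).T[:, :3]

points = np.array([[5.0, 0.2, -0.3]])  # one example lidar return
print(lidar_to_camera(points))

In practice, a prior like this serves as an initial guess that a target-based calibration routine can then refine.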

BibTeX

@misc{boxan2024fomo,
    title={FoMo: A Proposal for a Multi-Season Dataset for Robot Navigation in For\^et Montmorency},
    author={Matěj Boxan and Alexander Krawciw and Effie Daum and Xinyuan Qiao and Sven Lilge and Timothy D. Barfoot and François Pomerleau},
    year={2024},
    eprint={2404.13166},
    archivePrefix={arXiv},
    primaryClass={cs.RO}
}