The data collection efforts have begun. Our most recent update, "First Deployment of the Season!", was posted on 13 December 2024. Click the snippet for more details.
The data collection for the FoMo dataset has started!
In this paper, we propose the FoMo (Forêt Montmorency) dataset: a comprehensive, multi-season data collection. Located in the Montmorency Forest, Quebec, Canada, our dataset will capture a rich variety of sensory data over six distinct trajectories totaling 6 kilometers, repeated through different seasons to accumulate 42 kilometers of recorded data. The boreal forest environment increases the diversity of datasets for mobile robot navigation. This proposed dataset will feature a broad array of sensor modalities, including lidar, radar, and a navigation-grade Inertial Measurement Unit (IMU), against the backdrop of challenging boreal forest conditions. Notably, the FoMo dataset will be distinguished by its inclusion of seasonal variations, such as changes in tree canopy and snow depth up to 2 meters, presenting new challenges for robot navigation algorithms. Alongside the sensor data, we will offer centimeter-accurate ground truth obtained through Post-Processed Kinematic (PPK) Global Navigation Satellite System (GNSS) corrections, facilitating precise evaluation of odometry and localization algorithms. This work aims to spur advancements in autonomous navigation, enabling the development of robust algorithms capable of handling the dynamic, unstructured environments characteristic of boreal forests. With a public odometry and localization leaderboard and a dedicated software suite, we invite the robotics community to engage with the FoMo dataset and explore new frontiers in robot navigation under extreme environmental variations. We seek feedback from the community on this proposal to make the dataset as useful as possible.
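To illustrate how the PPK ground truth could support odometry evaluation, here is a minimal Python sketch that computes the absolute trajectory error (ATE) of an estimated trajectory after rigid alignment. The CSV layout, file names, and timestamp-matching tolerance are assumptions for illustration only, not the released FoMo format or evaluation protocol.

```python
# Minimal sketch of evaluating an odometry estimate against PPK GNSS ground truth.
# The CSV layout (timestamp, x, y, z) is an assumption, not the released FoMo format.
import numpy as np

def load_xyz(path):
    """Load a trajectory as an (N, 4) array: timestamp, x, y, z."""
    return np.loadtxt(path, delimiter=",", skiprows=1)

def associate(est, gt, max_dt=0.02):
    """Match estimated poses to ground-truth poses by nearest timestamp."""
    idx = np.searchsorted(gt[:, 0], est[:, 0])
    idx = np.clip(idx, 1, len(gt) - 1)
    prev_closer = np.abs(gt[idx - 1, 0] - est[:, 0]) < np.abs(gt[idx, 0] - est[:, 0])
    idx[prev_closer] -= 1
    keep = np.abs(gt[idx, 0] - est[:, 0]) <= max_dt
    return est[keep, 1:4], gt[idx[keep], 1:4]

def align_rigid(p, q):
    """Kabsch/Umeyama alignment (no scale): find R, t minimizing ||R p + t - q||."""
    mu_p, mu_q = p.mean(0), q.mean(0)
    H = (p - mu_p).T @ (q - mu_q)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_q - R @ mu_p
    return R, t

def ate_rmse(est_path, gt_path):
    """Absolute trajectory error (RMSE, meters) after rigid alignment."""
    p, q = associate(load_xyz(est_path), load_xyz(gt_path))
    R, t = align_rigid(p, q)
    err = (p @ R.T + t) - q
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

# Example usage (hypothetical file names):
# print(ate_rmse("odometry_estimate.csv", "ppk_ground_truth.csv"))
```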
The data recording takes place in the Montmorency Forest, located 70 km north of Quebec City, Canada. The six proposed trajectories have a combined length of 6 km and are located mostly on forest roads. Click on the trajectories on the map below for more information.
Our mobile robot is equipped with various proprioceptive and exteroceptive sensors, mounted on a custom-made modular sensor frame. Use the 3D model below to inspect the sensor frame from various points of view! The front of the robot hosts a ZED X stereo camera and a RoboSense Ruby Plus 128-channel 3D lidar. Below them, we mount a Leishen 128S1 semi-solid-state lidar. A Navtech CIR-304H radar is mounted at the rear, with two MEMS IMUs, an Xsens MTi-10 2A8G4 and a VectorNav VN100, underneath the radar. A Basler ace2 camera with a fisheye lens points backward from the platform, capturing the effects of the robot passing over the terrain and, notably, snow. Finally, we include three multi-band and three single-band Emlid M2 and RS+ GNSS receivers.
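For a structured overview, the snippet below restates the sensor suite as a small Python data structure. It simply mirrors the prose above; the identifiers and mounting descriptions are illustrative, not an official manifest or the dataset's metadata schema.

```python
# Illustrative summary of the FoMo sensor suite as described above.
# Names and mounting notes mirror the prose; this is not dataset metadata.
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str        # manufacturer and model
    modality: str    # lidar, radar, camera, IMU, GNSS
    mounting: str    # approximate location on the sensor frame

SENSOR_SUITE = [
    Sensor("ZED X stereo camera", "camera", "front"),
    Sensor("RoboSense Ruby Plus (128-channel)", "lidar", "front"),
    Sensor("Leishen 128S1 (semi-solid-state)", "lidar", "front, below the 3D lidar"),
    Sensor("Navtech CIR-304H", "radar", "rear"),
    Sensor("Xsens MTi-10 2A8G4", "IMU", "rear, under the radar"),
    Sensor("VectorNav VN100", "IMU", "rear, under the radar"),
    Sensor("Basler ace2 with fisheye lens", "camera", "rear-facing"),
    Sensor("Emlid M2 (multi-band), x3", "GNSS", "sensor frame"),
    Sensor("Emlid RS+ (single-band), x3", "GNSS", "sensor frame"),
]

if __name__ == "__main__":
    for s in SENSOR_SUITE:
        print(f"{s.modality:>6}: {s.name} ({s.mounting})")
```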
The IEEE 1588 Precision Time Protocol (PTP) will be used to eliminate clock discrepancies between the devices in our system. The main computer, directly connected to the GNSS receiver, serves as the PTP grandmaster and maintains synchronized time across all devices. For the spatial calibration, we consider the intrinsic and extrinsic calibration of the cameras as well as the spatial calibration between distinct pairs of sensors. All pair-wise calibration priors come from a precise CAD model of our system, displayed above.
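To give a sense of how CAD-derived priors are used, the sketch below composes two pair-wise rigid transforms, both expressed in a common base frame, into a lidar-to-radar prior. This is a minimal illustration with placeholder numbers, not the platform's actual calibration values.

```python
# Minimal sketch of composing pair-wise extrinsic priors taken from a CAD model
# into sensor-to-sensor transforms. Numeric values are placeholders only.
import numpy as np

def se3(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def inv(T):
    """Invert a rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# CAD-derived priors expressed in a common base (robot) frame -- placeholder values.
T_base_lidar = se3(np.eye(3), [0.40, 0.00, 0.55])
T_base_radar = se3(np.eye(3), [-0.35, 0.00, 0.60])

# Prior for the radar expressed in the lidar frame:
# T_lidar_radar = inv(T_base_lidar) @ T_base_radar
T_lidar_radar = inv(T_base_lidar) @ T_base_radar
print(T_lidar_radar[:3, 3])  # offset of the radar as seen from the lidar
```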
@misc{boxan2024fomo,
title={FoMo: A Proposal for a Multi-Season Dataset for Robot Navigation in For\^et Montmorency},
author={Matěj Boxan and Alexander Krawciw and Effie Daum and Xinyuan Qiao and Sven Lilge and Timothy D. Barfoot and François Pomerleau},
year={2024},
eprint={2404.13166},
archivePrefix={arXiv},
primaryClass={cs.RO}
}