This repository provides a robust and automated framework for running, benchmarking, and evaluating containerized SLAM (Simultaneous Localization and Mapping) systems. It is designed for streamlined use in robotics competitions and research, leveraging Docker and Docker Compose for portability and reproducibility.
- Automated Pipeline: Execute the entire workflow (running SLAM, playing data, and evaluating results) with a single command.
- Containerized & Reproducible: Runs any Docker-based SLAM system, ensuring consistent environments.
- Flexible Configuration: Easily configure dataset paths, playback parameters, and the target SLAM image via a central `.env` file.
- Modular Structure: Cleanly organized source code, entrypoints, and configuration make the framework easy to understand and extend.
This framework is the official evaluation tool for the following robotics competitions:
- SLAM Challenge Series @ CTU in Prague: comrob-ds.fel.cvut.cz:555
Ensure you have Docker and Docker Compose installed on your system.
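You can confirm both are available with the following commands, which should print a version string on any recent Docker installation:

```bash
docker --version
docker compose version
```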
This guide covers the initial setup of the framework. For specific use cases, please see the Usage Scenarios below.
- Clone the Repository:

  ```bash
  git clone https://github.com/comrob/slam-bench.git slam_competition
  cd slam_competition
  ```

- Create Environment File: Copy the example environment file. You will configure this file based on your chosen scenario.

  ```bash
  cp .env.example .env
  ```
This framework is designed for two primary user groups. Please follow the guide that matches your goal.
Goal: To containerize your SLAM system and test it locally, simulating the exact process used by the official evaluation platform.
Recommendation: This guide walks you through building a provided example (VINS-Mono-crl) from source and running it with this framework. Following these steps is the best way to understand how to prepare your own SLAM system for the competition.
- Clone the Example SLAM System: First, get the source code for `VINS-Mono-crl`, which serves as a template for a dockerized SLAM solution.

  ```bash
  git clone https://github.com/comrob/VINS-Mono-crl.git
  ```
- Build the SLAM Docker Image Locally: Navigate into the cloned repository and build the Docker image. This simulates building your own SLAM system's image.

  ```bash
  cd VINS-Mono-crl/docker
  make    # vins-mono-crl:latest will be built
  cd ..
  ```

  You now have a local Docker image named `vins-mono-crl:latest` ready for testing.
Set up the Evaluation Framework & Dataset: If you haven't already, complete the Quick Setup for this
slam-benchrepository and download theshellby-0225-train-labdataset from the Example Dataset link. -
Configure the
.envFile: In theslam_competitiondirectory, open your.envfile and point it to your locally built image and the downloaded dataset.# Absolute path to the dataset directory BAGFILES_PATH_HOST=/home/user/datasets/shellby-0225-train-lab # Name of your locally built Docker image SLAM_IMAGE=vins-mono-crl:latest
Note: Replace
/home/user/with the actual path on your system. -
Run the Full Pipeline:
./run_pipeline.sh
This will use your locally built
vins-mono-crl:latestimage to run the evaluation. This mirrors the exact process you'll follow for your own algorithm.
After running the example, you are ready to adapt the process for your own solution. Keep the following in mind:
- ROS1 is Straightforward: If your solution already runs in ROS1, the containerization process is very direct. You primarily need to create a `Dockerfile` that builds your workspace and runs your launch file.
- Dockerfile Best Practices: Your `Dockerfile` should be clean and minimal. We highly recommend studying the VINS-Mono-crl Dockerfile as a template (a rough sketch follows this list). It should:
  - Install only the necessary dependencies.
  - Build a minimal ROS workspace.
  - Use `CMD` or `ENTRYPOINT` to execute the `roslaunch` command that starts your SLAM node. Ensure the topics in your launch file match the topics provided in the competition datasets.
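For orientation, here is one possible minimal scaffold, written as a shell heredoc that writes a `Dockerfile` next to your workspace. The base image, dependency list, package name (`my_slam_package`), and launch file are placeholders rather than anything this framework mandates; treat the VINS-Mono-crl Dockerfile as the authoritative template.

```bash
# Hypothetical minimal ROS1 Dockerfile scaffold; adapt names and dependencies to your project.
cat > Dockerfile <<'EOF'
FROM ros:noetic-ros-base

# Install only the dependencies your node actually needs
RUN apt-get update && apt-get install -y --no-install-recommends \
        ros-noetic-cv-bridge \
    && rm -rf /var/lib/apt/lists/*

# Build a minimal catkin workspace containing just your SLAM package
COPY . /catkin_ws/src/my_slam_package
RUN . /opt/ros/noetic/setup.sh && cd /catkin_ws && catkin_make

# Start the SLAM node; topics in the launch file must match the competition datasets
CMD ["/bin/bash", "-c", "source /catkin_ws/devel/setup.bash && roslaunch my_slam_package run_slam.launch"]
EOF
```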
Click to expand: Competition Submission Format
To submit your solution, package your Docker image and an optional description file.
- Save your Docker Image: Archive your final SLAM image into a `.tar` file.

  ```bash
  # Using docker save
  docker save -o my-slam-image.tar your-slam-image:your-tag

  # Or using the repository script
  ./docker/docker2tar.sh your-slam-image:your-tag
  ```
- (Optional) Create a `description.yaml`: This file specifies runtime parameters. Check the official competition rules for the required fields.

  ```yaml
  # Example description.yaml
  ROSBAG_PLAY_RATE: 5.0
  ```

  Available Parameters:

  - `ROSBAG_PLAY_RATE`: (Optional, default: `5.0`) Controls the `rosbag play` speed.
- Create the ZIP Archive: Create a `.zip` file containing the `.tar` image and the optional `.yaml` file.

  ```bash
  zip submission.zip my-slam-image.tar description.yaml
  ```

This `submission.zip` file is ready for upload.
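Before uploading, it can be worth a quick local sanity check. This is only a suggested step, not part of the official submission process:

```bash
# List the archive contents and confirm the image loads back into Docker
unzip -l submission.zip
docker load -i my-slam-image.tar
```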
Goal: To prepare and validate a new dataset for use in a competition.
Recommendation: To ensure your dataset is compatible with the evaluation platform, you should test it with this framework using a known, working SLAM algorithm.
- Structure Your Dataset: Organize your dataset files into the required directory structure. See the Example Dataset section below for the correct layout (`sensors`, `reference`, `calibration`, etc.).
- Test Locally: Configure the `.env` file to point `BAGFILES_PATH_HOST` to your new dataset's location, and use a reliable, locally built SLAM image (like `vins-mono-crl:latest` from the example above) for `SLAM_IMAGE`. Then run the pipeline:

  ```bash
  ./run_pipeline.sh
  ```
If the pipeline completes successfully and generates a sensible evaluation report, your dataset is correctly structured.
- Contact Organizers: Once you have validated your dataset, contact the competition organizers to arrange for it to be uploaded to the official evaluation server.
We provide a sample collection of datasets to help you get started.
https://drive.google.com/drive/folders/1ef0k0JzQpKvGQkLGCsqh9FhuewvpQVYq

To test the framework with your own custom dataset, it must follow the same directory structure, including the sensor calibration files:
```
<your_dataset_root>/            # e.g., /home/user/datasets/my_awesome_dataset
├── sensors/
│   └── *.bag                   # One or more .bag files containing sensory data
├── reference/
│   └── reference.txt           # The ground truth trajectory file
└── calibration/
    ├── intrinsics.yaml         # Intrinsic parameters for cameras, etc.
    └── extrinsics.yaml         # Extrinsic transformations between sensor frames
```
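A quick way to sanity-check a new dataset against this layout is a few `ls` calls (a hypothetical helper, not part of the framework; adjust the path to your dataset):

```bash
# Each command prints an error and returns non-zero if the expected file is missing
DATASET=/home/user/datasets/my_awesome_dataset
ls "$DATASET"/sensors/*.bag
ls "$DATASET"/reference/reference.txt
ls "$DATASET"/calibration/intrinsics.yaml "$DATASET"/calibration/extrinsics.yaml
```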
Click to expand: Configuration (`.env`) Details
All pipeline parameters are controlled from the .env file. Below is a description of the key variables.
| Variable | Description | Example Value |
|---|---|---|
| `BAGFILES_PATH_HOST` | Absolute path to the specific dataset directory you want to run. | `$HOME/datasets/shellby-0225-train-lab` |
| `BAGFILE_NAME` | The name of the `.bag` file or a subdirectory within `BAGFILES_PATH_HOST` that contains the bag file(s). Using `sensors` is recommended. | `sensors` |
| `SLAM_IMAGE` | The Docker image for the SLAM system you want to evaluate. | `my-slam-algo:latest` |
| `CRL_SLAM_IMAGE` | A fallback SLAM image used if `SLAM_IMAGE` is not set. | `ghcr.io/comrob/liorf-crl:latest` |
| `REFERENCE_TRAJECTORY_FILE_HOST` | Absolute path to the ground truth trajectory file. The default value is usually sufficient. | `$BAGFILES_PATH_HOST/reference/reference.txt` |
| `ROSBAG_PLAY_RATE` | Playback rate for the `rosbag play` command. | `5.0` |
| `TOPICS_FILE` | (Optional) Path, relative to `BAGFILES_PATH_HOST`, to a file listing the ROS topics to play. | `tracks/passive.txt` |
| `DEV_DOCKER` | Set to `true` to use the locally built `slam-bench:latest` image. For developing this framework, not the SLAM system. | `true` |
| `SLAM_CONFIG_OVERRIDE_FILE` | (Optional) Host-side path to a config file mounted into the SLAM container at `/config/override.yaml`. | `./config/slam/override.yaml` |
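Putting it together, a complete `.env` might look like the sketch below. The values are illustrative (taken from the examples above); adjust paths and image names to your setup:

```bash
BAGFILES_PATH_HOST=$HOME/datasets/shellby-0225-train-lab
BAGFILE_NAME=sensors
SLAM_IMAGE=my-slam-algo:latest
REFERENCE_TRAJECTORY_FILE_HOST=$BAGFILES_PATH_HOST/reference/reference.txt
ROSBAG_PLAY_RATE=5.0
# Optional settings
# TOPICS_FILE=tracks/passive.txt
# DEV_DOCKER=true
# SLAM_CONFIG_OVERRIDE_FILE=./config/slam/override.yaml
```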
Click to expand: Framework Workflows (Manual Control, Visualization, etc.)
You can run individual components of the pipeline for fine-grained control and debugging.
- Start Background Services: Launch your SLAM system and the odometry recorder in detached mode (`-d`).

  ```bash
  docker compose up -d run_slam record_odometry
  ```

- Play the Dataset: Run the `play_bag` service in the foreground.

  ```bash
  docker compose up play_bag
  ```

- Evaluate the Trajectory: After playback finishes, run the evaluation service.

  ```bash
  docker compose up evaluate_trajectory
  ```

- Clean Up: Stop and remove all pipeline containers.

  ```bash
  docker compose down
  ```
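The same sequence can be chained into a small throwaway script (identical commands, just run back to back):

```bash
#!/usr/bin/env bash
# Manual pipeline: start SLAM and the recorder, play the bag, evaluate, then clean up
set -e
docker compose up -d run_slam record_odometry
docker compose up play_bag
docker compose up evaluate_trajectory
docker compose down
```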
To view SLAM outputs on your host machine:
- Allow local connections to your X server (run once per session):

  ```bash
  xhost +
  ```

- Ensure the `DISPLAY` environment variable is correctly set in your shell.
- Run the pipeline. The SLAM container will connect to your host's display.
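A quick sanity check before starting the pipeline (the exact `DISPLAY` value depends on your session; `:0` or `:1` are common but not guaranteed):

```bash
xhost +            # allow local X connections, as above
echo "$DISPLAY"    # must not be empty; typically :0 or :1
```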
To test changes made to this evaluation framework itself:
- Modify the code in the `src/` directory.
- Build the local Docker image:

  ```bash
  ./docker/build.sh
  ```

- Set `DEV_DOCKER=true` in your `.env` file.
- Run the pipeline to test your changes:

  ```bash
  ./run_pipeline.sh
  ```
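If you iterate frequently, the rebuild-and-test loop can be chained (same commands as above; this assumes `DEV_DOCKER=true` is already set in `.env`):

```bash
#!/usr/bin/env bash
# Rebuild the framework image and rerun the full pipeline against it
set -e
./docker/build.sh
./run_pipeline.sh
```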
Click to expand: Common Issues and Solutions
- Evaluation fails or the score is zero:
  - Check the logs: `docker compose logs record_odometry`. Is it receiving messages?
  - Verify your SLAM container is publishing the trajectory correctly.
  - Ensure `$OUTPUT_PATH_HOST/estimated_trajectory.txt` is created and not empty (a diagnostic sketch follows this list).
- Local code changes (to this framework) have no effect:
  - Did you run `./docker/build.sh` after making changes?
  - Is `DEV_DOCKER=true` set in your `.env` file?
- RViz is not displaying anything:
  - Did you run `xhost +` on your host machine before starting the pipeline?
  - Is your `DISPLAY` environment variable correctly set?
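A few hypothetical diagnostic commands for the first case, using the `$OUTPUT_PATH_HOST` variable referenced above:

```bash
# Is the recorder receiving messages, and did it write a non-empty trajectory?
docker compose logs record_odometry | tail -n 20
ls -lh "$OUTPUT_PATH_HOST/estimated_trajectory.txt"
wc -l "$OUTPUT_PATH_HOST/estimated_trajectory.txt"
```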
This project is licensed under the MIT License. See the LICENSE file for details.