A real-time sparse-mapping implementation of ORB-SLAM3 on the Raspberry Pi 5 using ROS2 (Humble) with an external USB camera.
SLAM (Simultaneous Localisation and Mapping) enables a robot to construct a three-dimensional map of its environment while localising its own pose (position and orientation) within it. ORB-SLAM3 is one of the most capable SLAM systems available, offering real-time, multi-map, and multi-mode operation even on low-powered devices such as the Raspberry Pi. I have implemented ORB-SLAM3 in ROS2 (Humble) on the Pi 5 and built a modular system that supports real-time monocular SLAM, RGB-D mapping, and dataset-based evaluation (TUM, EuRoC, and KITTI) with automated scripts and camera calibration tools. The system publishes 3D sparse maps, integrates with RViz2 for visualisation, and supports live or recorded input via ROS2 topics or the native ORB-SLAM3 executables.
This is also my minor project for my Bachelor of Technology degree at Manipal University Jaipur.
- Real-time Monocular SLAM
realtime-execution.mp4
- Example TUM RGB-D Monocular SLAM
tum-execution.mp4
- Camera calibration
- Real-time execution in Monocular mode
- 3D Sparse maps generation
- Dataset-based SLAM evaluation (TUM, EuRoC MAV & KITTI Visual Odometry datasets)
- Conversion tools
- RViz visualisation support
- Modular bash automation
Model: Raspberry Pi 5 Model B Rev 1.0
Operating system: Debian GNU/Linux 12 (bookworm)
Mode: Monocular
- Raspberry Pi 5 Model B Rev 1.0
- Raspberry Pi Fan & Heatsink
- Raspberry Pi Power Adaptor (minimum 27 Watts)
- 256 GB Micro SD Card
- External USB Camera
- Raspberry Pi OS (Debian 12, bookworm)
- Python (3.11.2)
- CMake (3.25.1)
- OpenCV (4.6.0)
- OpenCV Contrib
- Vision_opencv
- ROS2 (Humble Hawksbill)
- Eigen3 (3.3.7)
- g2o
- Sophus (v1.1.0)
- DBoW2 (v1.1)
- Pangolin (4.5.0)
- OpenGL (3.1)
- Mesa (23.2.1-1~bpo12+rpt3)
- ORB-SLAM3
- Octomap
- Image Common
- Message Filters
- Rclcpp
- Octomap_ros
- Geometry2
- Common_interfaces
- Steps to set up & execute this project
- Download Raspberry Pi Imager from the official website.
- Connect your microSD card to your laptop/PC.
- Run Raspberry Pi Imager:
- Select the appropriate Raspberry Pi device (e.g., Raspberry Pi 5).
- Select the required operating system (e.g., Raspberry Pi OS (64-bit)).
- Select the appropriate storage (your microSD card).
- Press `Ctrl + Shift + X` to open Advanced Options:
  - Set a hostname (default: `raspberrypi`).
  - Enable SSH and set a username and password (remember these).
  - Configure Wi-Fi settings (SSID, password, and country code).
  - Set the correct locale & keyboard layout. To check your layout, press `Windows + Spacebar` or check your system settings.
  - Click Save.
- Click Write to write the OS to the microSD card. The Imager will automatically eject the card when done.
- Insert the microSD card into the Raspberry Pi and power it on.
- Open Command Prompt (Windows) or Terminal (Mac/Linux).
- Run the following command:
ssh <username>@<hostname>.local
- Example: `ssh pi@raspberrypi.local` (replace `pi` with the username you set and `raspberrypi` with your hostname)
- Increase swap space for better performance on the Raspberry Pi 5:
sudo nano /etc/dphys-swapfile
- Change `CONF_SWAPSIZE` to `2048`, then restart the swap service:
sudo systemctl restart dphys-swapfile
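After restarting the service, it's worth confirming the enlarged swap actually took effect; a quick check:

```shell
# The "Swap:" row should now show roughly 2.0Gi.
free -h
# Lists active swap files/devices and their sizes.
swapon --show
```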
cd ./scripts/
chmod +x setup.sh
./setup.sh
- Use this checkerboard for calibration: checkerboard-patterns
- To calibrate your camera and generate an ORB-SLAM3-compatible .yaml file:
./start_camera_calibration.sh
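The calibration script is expected to produce an ORB-SLAM3-style settings file. The snippet below is an illustrative sketch of that format with made-up intrinsics, not values from this project:

```yaml
%YAML:1.0
# Pinhole intrinsics (fx, fy, cx, cy) - illustrative values only
Camera.type: "PinHole"
Camera.fx: 615.0
Camera.fy: 615.0
Camera.cx: 320.0
Camera.cy: 240.0
# Radial-tangential distortion coefficients
Camera.k1: 0.05
Camera.k2: -0.10
Camera.p1: 0.001
Camera.p2: 0.001
Camera.width: 640
Camera.height: 480
Camera.fps: 30.0
Camera.RGB: 1
# ORB extractor settings
ORBextractor.nFeatures: 1000
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
```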
- TUM RGB-D
cd ~/ros2_test/scripts/tum/
./download_tum_sequences.sh
- EuRoC MAV
cd ~/ros2_test/scripts/euroc/
./download_euroc_sequences.sh
- KITTI Visual Odometry
cd ~/ros2_test/scripts/kitti/
./download_kitti_sequences.sh
- Monocular
ros2 bag record /camera/image_raw
ros2 run image_tools showimage --ros-args -r image:=/camera/image_raw
- RGB-D
ros2 bag record /camera/color/image_raw /camera/depth/image_raw
- Convert any TUM dataset to a ROS2 bag:
cd ~/ros2_test/scripts/tum
python3 convert_tum_to_ros2_bag.py <tum_sequence_folder_path> <output_folder>
- Convert any EuRoC dataset to a ROS2 bag:
cd ~/ros2_test/scripts/euroc
python3 convert_euroc_to_ros2_bag.py <euroc_sequence_folder_path> <output_folder>
- Example:
cd ~/ros2_test/scripts/tum
python3 convert_tum_to_ros2_bag.py \
/home/{your_username}/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk \
/home/{your_username}/tum_ros2_bag
ros2 bag info ~/ros2_test/{path_to_rosbag}/{rosbag_name}
ros2 bag play ~/ros2_test/{path_to_rosbag}/{rosbag_name}
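For reference, the converters above read TUM's plain-text index files (`rgb.txt`, `depth.txt`). A minimal sketch of parsing that format (synthetic sample lines; `parse_tum_index` is a hypothetical helper, not part of this project):

```python
# Parse a TUM-format index file: '#' comment lines, then "timestamp filename".
def parse_tum_index(lines):
    entries = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        ts, path = line.split(maxsplit=1)
        entries.append((float(ts), path))
    return entries

sample = [
    "# color images",
    "# timestamp filename",
    "1305031102.175304 rgb/1305031102.175304.png",
    "1305031102.211214 rgb/1305031102.211214.png",
]
print(parse_tum_index(sample))
```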
- Execution will work only after all libraries, packages & datasets (if you are evaluating on them) have been set up successfully
- Check for updates:
sudo apt update && sudo apt upgrade -y
- Before executing, run the following in each terminal:
export DISPLAY=:0
export LIBGL_ALWAYS_INDIRECT=1
export MESA_GL_VERSION_OVERRIDE=3.3
export MESA_GLSL_VERSION_OVERRIDE=330
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libGL.so
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/ros2_test/src/ORB_SLAM3/lib
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/DBoW2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/g2o/lib:$LD_LIBRARY_PATH
source ~/ros2_test/install/local_setup.bash
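Since these exports must be repeated in every terminal, one option is to collect them into a small env script. The path `/tmp/slam_env.sh` below is my own choice, not part of the project:

```shell
# Write the SLAM environment setup to a reusable script (hypothetical location).
cat > /tmp/slam_env.sh <<'EOF'
export DISPLAY=:0
export LIBGL_ALWAYS_INDIRECT=1
export MESA_GL_VERSION_OVERRIDE=3.3
export MESA_GLSL_VERSION_OVERRIDE=330
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libGL.so
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/ros2_test/src/ORB_SLAM3/lib
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/DBoW2/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=~/ros2_test/src/ORB_SLAM3/Thirdparty/g2o/lib:$LD_LIBRARY_PATH
source ~/ros2_test/install/local_setup.bash
EOF
echo "Run 'source /tmp/slam_env.sh' in each new terminal before launching SLAM"
```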
- Real-time SLAM (Monocular)
cd ~/ros2_test/
./start_realtime_slam.sh
- Dataset-based SLAM
- TUM - Monocular :
./start_dataset_slam.sh tum_mono
- TUM - RGB-D :
./start_dataset_slam.sh tum_rgbd
- EuRoC - Monocular :
./start_dataset_slam.sh euroc_mono
- Custom ROS2 Bag (RGB-D) :
./start_dataset_slam.sh custom_rgbd
- TUM RGB-D - Monocular
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_tum \
./Vocabulary/ORBvoc.txt \
./Examples/Monocular/TUM1.yaml \
~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk
- TUM RGB-D - RGB-D
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/RGB-D/rgbd_tum \
Vocabulary/ORBvoc.txt \
./Examples/RGB-D/TUM1.yaml \
~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk \
~/ros2_test/datasets/TUM/rgbd_dataset_freiburg1_desk/associate.txt
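The `associate.txt` argument pairs each RGB frame with the nearest depth frame by timestamp (TUM provides an `associate.py` tool for this). A simplified greedy sketch of the idea, not the actual tool:

```python
# Pair each RGB timestamp with the closest unused depth timestamp,
# accepting the match only if the gap is within max_diff seconds.
def associate(rgb_ts, depth_ts, max_diff=0.02):
    matches = []
    used = set()
    for t in rgb_ts:
        best = min((d for d in depth_ts if d not in used),
                   key=lambda d: abs(d - t), default=None)
        if best is not None and abs(best - t) <= max_diff:
            matches.append((t, best))
            used.add(best)
    return matches

print(associate([0.00, 0.04, 0.08], [0.01, 0.05, 0.30]))
# → [(0.0, 0.01), (0.04, 0.05)]  (0.08 has no depth frame within 20 ms)
```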
- EuRoC MAV - Monocular
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_euroc \
./Vocabulary/ORBvoc.txt \
./Examples/Monocular/EuRoC.yaml \
~/ros2_test/datasets/EuRoC/MH01 \
./Examples/Monocular/EuRoC_TimeStamps/MH01.txt
- EuRoC MAV - Stereo
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo/stereo_euroc \
./Vocabulary/ORBvoc.txt \
./Examples/Stereo/EuRoC.yaml \
~/ros2_test/datasets/EuRoC/MH01 \
./Examples/Stereo/EuRoC_TimeStamps/MH01.txt
- EuRoC MAV - Mono-Inertial
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular-Inertial/mono_inertial_euroc \
./Vocabulary/ORBvoc.txt \
./Examples/Monocular-Inertial/EuRoC.yaml \
~/ros2_test/datasets/EuRoC/MH01 \
./Examples/Monocular-Inertial/EuRoC_TimeStamps/MH01.txt
- EuRoC MAV - Stereo-Inertial
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo-Inertial/stereo_inertial_euroc \
./Vocabulary/ORBvoc.txt \
./Examples/Stereo-Inertial/EuRoC.yaml \
~/ros2_test/datasets/EuRoC/MH01 \
./Examples/Stereo-Inertial/EuRoC_TimeStamps/MH01.txt
- KITTI Visual Odometry - Monocular
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Monocular/mono_kitti \
./Vocabulary/ORBvoc.txt \
./Examples/Monocular/KITTI00-02.yaml \
~/ros2_test/datasets/KITTI/dataset/sequences/00
- KITTI Visual Odometry - Stereo
cd ~/ros2_test/src/ORB_SLAM3/
./Examples/Stereo/stereo_kitti \
./Vocabulary/ORBvoc.txt \
./Examples/Stereo/KITTI00-02.yaml \
~/ros2_test/datasets/KITTI/dataset/sequences/00
- Monocular Node
ros2 run ros2_orbslam3_wrapper monocular_node --ros-args -p input_mode:=live
- Camera Publisher with Calibration YAML
ros2 run ros2_orbslam3_wrapper camera_publisher_node \
/home/{your_username}/ros2_test/scripts/common/my_camera.yaml
- RViz Launcher (from SSH, not VNC)
Note
RViz may not launch under VNC. For proper OpenGL rendering, use an SSH terminal session instead. Also, add a map display in RViz and set its topic to `ORB_SLAM3/map`.
ros2 run ros2_orbslam3_wrapper rviz_launcher_node
- Monocular Node
ros2 run ros2_orbslam3_wrapper monocular_node --ros-args -p input_mode:=recorded
- RGB-D Node
ros2 run ros2_orbslam3_wrapper rgbd_node --ros-args -p input_mode:=recorded
- Camera Publisher with Calibration YAML
ros2 run ros2_orbslam3_wrapper camera_publisher_node \
/home/{your_username}/ros2_test/scripts/common/my_camera.yaml
- RViz Launcher (from SSH, not VNC)
Note
RViz may not launch under VNC. For proper OpenGL rendering, use an SSH terminal session instead. Also, add a map display in RViz and set its topic to `ORB_SLAM3/map`.
ros2 run ros2_orbslam3_wrapper rviz_launcher_node
- Play Rosbag
ros2 bag play ~/ros2_test/{path_to_rosbag}/{rosbag_name}
Important
You should now see the sparse map being built in both the Pangolin viewer and RViz.
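To double-check that the map is actually being published (the `ORB_SLAM3/map` topic name follows the RViz note above), a small sanity-check sketch that degrades gracefully when ROS2 is not sourced:

```shell
# Confirm the sparse-map topic exists once SLAM is running.
if command -v ros2 >/dev/null 2>&1; then
  ros2 topic list | grep ORB_SLAM3/map || echo "map topic not found (is SLAM running?)"
else
  echo "ros2 not on PATH - source your ROS2 environment first"
fi
```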
- OpenCV4
- opencv-contrib (OpenCV dependency)
- vision-opencv (ROS2-OpenCV dependency)
- Pangolin
- ORB-SLAM3 [Forked]
- ROS2
- ROS2 ORB-SLAM3 Wrapper [Referenced from]
- rviz
- image-common
- message-filters
- rclcpp
- octomap
- octomap_ros
- geometry2
- common_interfaces