<aside>

Domains: Robotics, ROS, SLAM

</aside>

https://github.com/f0rgotteng0d/autoport

Overview

Autoport is a ROS 2-powered mobile robot designed for autonomous exploration, mapping, and navigation using Simultaneous Localization and Mapping (SLAM). Built on affordable hardware like the Raspberry Pi and common sensors, Autoport serves as a platform for robotics research, education, and prototyping.


Key Concepts

ROS (Robot Operating System): An open-source set of software libraries and tools for building robot applications. It is not a traditional OS, but a flexible framework that provides a structured communication layer, allowing different processes (nodes) to exchange data and work together, which greatly simplifies integrating sensors, actuators, and algorithms.
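
The communication model can be illustrated with a toy publish/subscribe bus. This is not ROS or its API, just a minimal sketch of the pattern ROS topics implement: publishers and subscribers exchange messages on named topics without knowing about each other.

```python
from collections import defaultdict

class TopicBus:
    """Toy publish/subscribe bus illustrating the pattern behind ROS topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

# Two "nodes" exchange data without any direct reference to each other:
bus = TopicBus()
received = []
bus.subscribe("/scan", lambda msg: received.append(msg))  # mapping "node"
bus.publish("/scan", {"ranges": [1.2, 1.1, 0.9]})         # lidar driver "node"
```

This decoupling is what lets a SLAM node, a driver node, and a control node be developed and swapped independently.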

SLAM (Simultaneous Localization and Mapping): A computational process used by autonomous robots to construct a map of an unknown environment while simultaneously keeping track of their own position within that map.
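
The "simultaneous" part can be shown with a deliberately tiny 1D example (not a real SLAM algorithm, and the numbers are made up): the robot integrates its own motion to localize, and at the same time uses noisy range readings to refine its estimate of where a wall is.

```python
# Toy 1D illustration of the SLAM idea: localize by integrating motion,
# map by combining the pose estimate with each range reading.
moves = [1.0, 1.0, 1.0]    # commanded forward motions in metres (assumed)
ranges = [9.1, 8.0, 6.9]   # noisy range to a wall after each move (assumed)

pose = 0.0
wall_estimates = []
for step, reading in zip(moves, ranges):
    pose += step                           # localization: integrate motion
    wall_estimates.append(pose + reading)  # mapping: wall = pose + range

wall = sum(wall_estimates) / len(wall_estimates)  # fuse the observations
```

Real 2D SLAM (e.g. with a LiDAR) applies the same idea with scan matching and probabilistic filtering instead of simple averaging.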

2D LiDAR (Light Detection and Ranging): A sensor that measures distances by emitting laser beams and detecting their reflections. A 2D LiDAR scans a single horizontal plane to create a 2D point-cloud map of the robot's surroundings, identifying walls and obstacles.
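
A planar scan arrives as a list of ranges at evenly spaced angles (this is how ROS's `sensor_msgs/LaserScan` message is laid out). Converting it to Cartesian points is a few lines of trigonometry; the function name and sample values below are illustrative:

```python
import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert a planar scan to (x, y) points in the sensor frame,
    skipping invalid (infinite or NaN) returns."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three beams at -90, 0, and +90 degrees; the middle one hits a wall 2 m ahead:
pts = scan_to_points([1.0, 2.0, 1.0], -math.pi / 2, math.pi / 2)
```

The resulting point cloud is what a SLAM algorithm matches against its map to correct the robot's pose.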

Odometry: The use of data from motion sensors, such as wheel encoders (which track wheel rotations) and Inertial Measurement Units (IMUs) (which measure orientation and angular velocity), to estimate the robot's change in position over time.
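
For a differential-drive robot, the standard dead-reckoning update from encoder ticks looks like the sketch below. The encoder resolution, wheel radius, and wheel base are placeholder values, not Autoport's actual parameters:

```python
import math

TICKS_PER_REV = 360   # assumed encoder resolution (ticks per wheel revolution)
WHEEL_RADIUS = 0.03   # metres, assumed
WHEEL_BASE = 0.20     # distance between the wheels in metres, assumed
M_PER_TICK = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Advance an (x, y, theta) pose estimate from one pair of encoder deltas."""
    d_left = left_ticks * M_PER_TICK
    d_right = right_ticks * M_PER_TICK
    d_center = (d_left + d_right) / 2           # distance travelled by the centre
    d_theta = (d_right - d_left) / WHEEL_BASE   # change in heading (rad)
    # Integrate along the average heading over the step:
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

pose = (0.0, 0.0, 0.0)
for left, right in [(100, 100), (100, 100)]:    # two straight-line steps
    pose = update_pose(*pose, left, right)
```

Because each update compounds sensor error, odometry alone drifts over time; SLAM uses the LiDAR to correct that drift.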

Approach and Workflow

  1. Assembling the physical robot chassis and integrating hardware like the 2D LiDAR and motors with an onboard computer running ROS.
  2. Writing C++ or Python code within ROS nodes, or integrating pre-existing ROS packages, to implement sensor data handling, the SLAM algorithm, and robot control logic.
  3. Testing these nodes in a realistic physics simulator such as Gazebo, where a virtual model of the robot and its environment is used to verify that the entire system works as intended before deploying to hardware.
  4. Compiling and deploying the verified code to the robot for initial hardware validation, ensuring that sensors are publishing data and motors respond correctly.
  5. Operating the robot in its target environment for real-world mapping runs and parameter tuning, where the algorithm's settings are optimized to achieve accurate and reliable performance.
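
As a concrete sketch of the later steps, the commands below show a typical build-and-launch sequence for a ROS 2 SLAM stack. The use of slam_toolbox, teleop_twist_keyboard, and Nav2's map saver here is an assumption about the setup, not taken from the Autoport repository; the actual package, launch-file, and map names may differ.

```shell
# Build the workspace and source the overlay (assumed workspace layout)
colcon build --symlink-install
source install/setup.bash

# Start online SLAM (assuming the slam_toolbox package is used)
ros2 launch slam_toolbox online_async_launch.py

# In a second terminal: drive the robot around to explore
ros2 run teleop_twist_keyboard teleop_twist_keyboard

# When the map looks complete, save it (assuming Nav2's map saver)
ros2 run nav2_map_server map_saver_cli -f my_map
```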