LIDAR Robot Simulation Using Gazebo and ROS2

Overview

This project implements a robot that uses a LIDAR sensor for navigation and obstacle avoidance within the Gazebo simulation environment. The robot is controlled using ROS 2 (Robot Operating System 2), and an Xbox controller is integrated for teleoperation of the robot.

The LIDAR sensor is the primary input device used to detect obstacles and map the environment.
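As an illustration of how LIDAR scan data can feed obstacle avoidance, the sketch below finds the closest return within a frontal sector of a scan. The function name, sector width, and parameter layout are illustrative assumptions, not taken from this project's code; the inputs mirror the fields of a typical laser-scan message (ranges, start angle, angular increment).

```python
import math

def front_obstacle_distance(ranges, angle_min, angle_increment,
                            half_fov=math.radians(30)):
    """Return the closest valid range within +/- half_fov of straight ahead.

    NOTE: illustrative helper, not part of the lidar_bot codebase.
    """
    closest = math.inf
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        # Keep only finite, positive returns inside the frontal sector.
        if abs(angle) <= half_fov and math.isfinite(r) and 0.0 < r < closest:
            closest = r
    return closest
```

A node could compare this distance against a safety threshold and zero out the forward velocity command when an obstacle is too close.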

Features

  * Differential-drive robot simulated in Gazebo
  * LIDAR-based obstacle detection and environment mapping
  * Teleoperation with an Xbox controller
  * RViz visualization launched with a predefined configuration

Requirements

Before you begin, ensure you have the following software installed (the package names in this README target ROS 2 Jazzy):

  * Ubuntu 24.04 (the platform supported by ROS 2 Jazzy)
  * ROS 2 Jazzy
  * Gazebo (the version paired with ROS 2 Jazzy)
  * colcon build tools

Dependencies

To run the robot in the Gazebo simulation, you will need the following ROS packages:

  * ros2_control and ros2_controllers
  * ros_gz (the ROS 2 / Gazebo integration)
  * gz_ros2_control
  * joy (joystick drivers)
  * joint_state_publisher

Install them (if not already installed):

sudo apt install ros-jazzy-ros2-control ros-jazzy-ros2-controllers ros-jazzy-ros-gz ros-jazzy-gz-ros2-control ros-jazzy-joy-* ros-jazzy-joint-state-publisher

Installation

  1. Clone the repository:

    git clone https://github.com/nochilli/lidar_bot.git
    
  2. Navigate to the project directory:

    cd lidar_bot
    
  3. Build the workspace:

    colcon build
    
  4. Source the workspace so ROS 2 can find the package:

    source install/setup.bash
    

Running the Simulation

Launching the Gazebo Simulation

To launch the robot and Gazebo environment:

ros2 launch lidar_bot lidar_bot_spawn.launch.py

This will start Gazebo with the robot in the environment and initialize the LIDAR sensor.

Visualizing in RViz

The launch file automatically opens RViz with predefined configurations to visualize the robot’s position, LIDAR scans, and more.

Robot Control

The robot uses differential drive for movement control. The control node subscribes to joystick input and drives the robot by publishing velocity commands to the /cmd_vel topic.

Basic control maps the joystick axes to the robot's velocities: one axis sets the linear (forward/backward) speed and another sets the angular (turning) speed.
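The axis-to-velocity mapping such a control node performs can be sketched as follows. The axis indices, scale factors, and function name are assumptions for illustration, not the project's actual configuration:

```python
def joy_to_cmd_vel(axes, max_linear=0.5, max_angular=1.5):
    """Map joystick axes [x, y] to a (linear.x, angular.z) velocity pair.

    NOTE: hypothetical mapping; the real node's axis assignments and
    speed limits may differ.
    """
    linear = max_linear * axes[1]    # stick forward/back -> linear velocity
    angular = max_angular * axes[0]  # stick left/right -> angular velocity
    return linear, angular
```

In the actual node, the returned pair would be packed into a Twist message and published on /cmd_vel at a fixed rate.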

Troubleshooting

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments