Chapter 2.3: Simulating Sensors (LiDAR, Camera, IMU)
A robot operating in the dark is useless. To perform any intelligent task, a robot must be able to perceive its environment. In simulation, this is achieved by adding sensor plugins to our robot model. These plugins generate realistic data that mimics the output of their real-world hardware counterparts.
This chapter details how to add and configure common sensors like cameras, LiDAR, and IMUs to a URDF model for use in Gazebo, and how these sensors publish their data over ROS 2 topics.
The Gazebo Sensor Plugin System
Gazebo's power comes from its plugin architecture. To add a sensor, we don't just add a visual shape to our robot; we add a <sensor> tag to our URDF and associate it with a specific plugin. This plugin contains the code that generates the sensor data and publishes it to a ROS 2 topic.
The general structure for adding a sensor to a URDF (or more commonly, a Xacro file) is:
<gazebo reference="link_where_sensor_is_attached">
  <sensor name="my_sensor_name" type="sensor_type">
    <!-- Sensor-specific configuration -->
    <update_rate>30.0</update_rate>
    <!-- ROS 2 plugin configuration -->
    <plugin name="my_plugin_name" filename="libgazebo_ros_sensor_plugin.so">
      <ros>
        <namespace>/my_robot</namespace>
        <!-- Remap the plugin's default topic to the name you want -->
        <remapping>default_topic:=my_topic_name</remapping>
      </ros>
    </plugin>
  </sensor>
</gazebo>
Key Components:
- <gazebo reference="link_name">: Attaches the sensor to a specific link in your URDF. The sensor's position is expressed relative to this link's origin.
- <sensor type="...">: Defines the sensor. The type attribute specifies what kind of sensor it is (e.g., camera, ray for LiDAR, imu).
- <update_rate>: How many times per second the sensor should generate new data.
- <plugin>: This is where the magic happens. We load a shared library (.so file) that contains the logic for the sensor. libgazebo_ros_camera.so, libgazebo_ros_ray_sensor.so, and libgazebo_ros_imu_sensor.so are common examples from the gazebo_plugins package.
- <ros> tag: Inside the plugin, this configures the ROS 2 interface. You can set the namespace and remap the plugin's default topic to the name you want.
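In practice these blocks usually live in a Xacro file rather than a raw URDF, where properties keep magic numbers in one place. A minimal sketch (the property name camera_rate is purely illustrative):

```xml
<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="my_robot">
  <!-- "camera_rate" is an illustrative property name, not a required one -->
  <xacro:property name="camera_rate" value="30.0"/>

  <gazebo reference="head_link">
    <sensor name="head_camera" type="camera">
      <!-- xacro expands the property before the URDF ever reaches Gazebo -->
      <update_rate>${camera_rate}</update_rate>
      <!-- ... rest of the sensor and plugin configuration ... -->
    </sensor>
  </gazebo>
</robot>
```

Running the file through the xacro tool produces the plain URDF that Gazebo consumes.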
1. Simulating a Camera
Cameras are one of the most common sensors in robotics, used for everything from object detection to visual navigation.
<gazebo reference="head_link">
  <sensor name="head_camera" type="camera">
    <update_rate>30.0</update_rate>
    <camera name="head">
      <horizontal_fov>1.396263</horizontal_fov>
      <image>
        <width>800</width>
        <height>800</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.02</near>
        <far>300</far>
      </clip>
    </camera>
    <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
      <ros>
        <namespace>/my_robot</namespace>
        <remapping>image_raw:=camera/image_raw</remapping>
      </ros>
    </plugin>
  </sensor>
</gazebo>
- <camera>: Contains the intrinsic parameters of the camera.
- <horizontal_fov>: The horizontal field of view in radians.
- <image>: Resolution and pixel format.
- <clip>: The near and far clipping planes. Objects closer than near or farther than far will not be rendered.
- Plugin Output: This sensor publishes messages of type sensor_msgs/Image to the topic /my_robot/camera/image_raw.
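The horizontal_fov and image width together determine the camera's focal length under the pinhole model, which is what shows up in the intrinsics of the accompanying sensor_msgs/CameraInfo message. A quick sanity check in plain Python (not a Gazebo API, just the geometry):

```python
import math

# Pinhole model: fx = (image_width / 2) / tan(horizontal_fov / 2)
def focal_length_px(horizontal_fov_rad: float, image_width_px: int) -> float:
    return (image_width_px / 2.0) / math.tan(horizontal_fov_rad / 2.0)

# Values from the camera block above: 1.396263 rad (~80 deg) FOV, 800 px wide
fx = focal_length_px(1.396263, 800)  # ~476.7 px
```

This is handy when you need to cross-check the CameraInfo published by the plugin, or to hand intrinsics to a vision library.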
Depth Cameras (like Intel RealSense)
To simulate a depth camera, you use the depth sensor type. It is similar to a regular camera but additionally generates a depth image, where each pixel's value represents the distance to that point in meters. This is crucial for 3D perception. In ROS 2, depth cameras are also handled by libgazebo_ros_camera.so, which publishes the depth image and a point cloud alongside the regular image topics.
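What makes the depth image useful is that each pixel can be deprojected into a 3D point using the same pinhole intrinsics. The helper below is hypothetical (deproject is not part of any ROS API); fx, fy, cx, cy would come from the camera's CameraInfo message:

```python
# Hypothetical helper: deproject one depth pixel into a 3D point
# in the camera frame using the pinhole model.
def deproject(u, v, depth_m, fx, fy, cx, cy):
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# For the 800x800 camera above: principal point at the image center,
# focal length ~476.7 px (derived from the 1.396263 rad FOV).
point = deproject(600, 400, 2.0, 476.7, 476.7, 400.0, 400.0)
```

Iterating this over every pixel is essentially how the plugin builds its point cloud.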
2. Simulating a LiDAR
LiDAR sensors are the workhorse of robot navigation and mapping, sweeping laser beams across the environment to measure distances, in some models over a full 360 degrees. In Gazebo, LiDAR is simulated using the ray sensor type.
<gazebo reference="base_link">
  <sensor name="lidar" type="ray">
    <pose>0 0 0.25 0 0 0</pose> <!-- Position relative to the base_link -->
    <visualize>true</visualize>
    <update_rate>10</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.570796</min_angle>
          <max_angle>1.570796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>30.0</max>
        <resolution>0.01</resolution>
      </range>
    </ray>
    <plugin name="gazebo_ros_lidar" filename="libgazebo_ros_ray_sensor.so">
      <ros>
        <namespace>/my_robot</namespace>
        <remapping>~/out:=scan</remapping>
      </ros>
      <output_type>sensor_msgs/LaserScan</output_type>
    </plugin>
  </sensor>
</gazebo>
- <visualize>: If true, you can see the laser beams in the Gazebo GUI, which is useful for debugging.
- <ray>: Defines the laser properties.
- <scan>: Defines the scanning pattern. The example above defines a 180-degree horizontal scan with 720 samples.
- <range>: The minimum and maximum detection distances.
- Plugin Output: This sensor publishes messages of type sensor_msgs/LaserScan to the topic /my_robot/scan.
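The scan parameters above map directly onto the LaserScan message fields: each beam index corresponds to an angle, and each range reading can be converted into a point in the sensor frame. A small sketch in plain Python, assuming beams are evenly spaced with both endpoints included (the exact angle_increment convention is worth verifying against the published message):

```python
import math

# Scan geometry from the <ray> block above: 720 samples over 180 degrees
MIN_ANGLE = -1.570796
MAX_ANGLE = 1.570796
SAMPLES = 720

# Spacing between consecutive beams (LaserScan's angle_increment field),
# assuming both MIN_ANGLE and MAX_ANGLE are themselves sampled
angle_increment = (MAX_ANGLE - MIN_ANGLE) / (SAMPLES - 1)

def beam_to_xy(index, range_m):
    """Convert one beam of a LaserScan to (x, y) in the sensor frame."""
    angle = MIN_ANGLE + index * angle_increment
    return (range_m * math.cos(angle), range_m * math.sin(angle))

# The middle beam points (almost) straight ahead along the sensor's x axis
x, y = beam_to_xy(SAMPLES // 2, 2.0)
```

This per-beam conversion is exactly what mapping and obstacle-avoidance nodes do when they turn a scan into Cartesian points.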
3. Simulating an IMU
An IMU is critical for estimating a robot's orientation and motion. It measures linear acceleration and angular velocity.
<gazebo reference="base_link">
  <sensor name="imu_sensor" type="imu">
    <always_on>true</always_on>
    <update_rate>100</update_rate>
    <visualize>true</visualize>
    <plugin name="imu_plugin" filename="libgazebo_ros_imu_sensor.so">
      <ros>
        <namespace>/my_robot</namespace>
        <remapping>~/out:=imu/data</remapping>
      </ros>
      <initial_orientation_as_reference>false</initial_orientation_as_reference>
    </plugin>
  </sensor>
</gazebo>
- <imu> tag: An <imu> block inside the <sensor> element allows you to add noise to the measurements to make the simulation more realistic. You can specify noise models (e.g., Gaussian) for both the accelerometer and gyroscope.
- Plugin Output: This sensor publishes messages of type sensor_msgs/Imu to the topic /my_robot/imu/data.
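To build intuition for what the Gaussian noise configuration does, here is a sketch of the same idea in plain Python: additive zero-mean noise on each axis. The standard deviations below are illustrative values, not taken from any real datasheet:

```python
import random

def noisy_imu_sample(true_accel, true_gyro,
                     accel_stddev=0.017, gyro_stddev=0.0002):
    """Additive zero-mean Gaussian noise on each accelerometer and
    gyroscope axis -- a simple model of what Gazebo's <imu> noise
    configuration applies. The stddev defaults are illustrative."""
    accel = [a + random.gauss(0.0, accel_stddev) for a in true_accel]
    gyro = [g + random.gauss(0.0, gyro_stddev) for g in true_gyro]
    return accel, gyro

random.seed(42)  # deterministic for the example
# A stationary, level robot: gravity on z, no rotation
accel, gyro = noisy_imu_sample([0.0, 0.0, 9.81], [0.0, 0.0, 0.0])
```

State estimators such as an Extended Kalman Filter rely on knowing these standard deviations, which is why matching the simulated noise to your real sensor matters for sim-to-real transfer.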
Summary
Simulating sensors is a matter of adding the correct <sensor> and <plugin> blocks to your robot's URDF/Xacro file. By configuring these plugins, we can generate realistic camera images, laser scans, and inertial measurements. This simulated data is published on ROS 2 topics, exactly like the data from real hardware would be. This allows our AI and control nodes to be completely agnostic to whether they are running in a simulation or on a physical robot, a key enabler of a successful sim-to-real workflow.