SLAMTEC RPLidar S3

This tutorial will guide you through the process of connecting a LiDAR sensor to your Leo Rover.

Light Detection and Ranging devices, or lidars for short, are sensors used for mapping the environment, detecting objects, measuring vehicle speed, and a wide range of other applications. In robotics, 2D lidars like the A2M8 / A2M12 or the S3 covered here are used for tasks such as indoor SLAM (Simultaneous Localization and Mapping) or safety systems.

What to expect?

After finishing this tutorial, you should be able to both gather the lidar data and visualize it in RViz, just like in the image below:

Prerequisites

📄Connect via SSH
Learn how to establish an SSH connection with your Leo Rover and access its terminal using Putty or OpenSSH.
📄Install ROS on your computer
Learn how to install the Robot Operating System (ROS) on your computer. Step-by-step guide for beginners.
📄ROS Development
Detailed guide on ROS development for Leo Rover, covering topics like adding additional functionalities, building ROS packages and more.

List of components

  • RPLidar S3
  • RPLidar S3 adapter plate (can be found here: Addon adapters)
  • M5x10 Allen head screw x4
  • M2.5x6 Allen head screw x4

Mechanical integration

We developed 3D printable models of mechanical interfaces that allow you to mount the sensor to the mounting plate of the rover. Placing the sensor on top of the robot gives it a wide field of view with few parts of the rover for the laser beam to get caught on. Get the files from here: Addon Adapters.

  • Connect the sensor to the printed interface plate with 4 x M2.5x6 Allen screws.
  • Fasten the assembly to the Leo Rover with 4 x M5x10 Allen screws.

Wiring and electronics connection

The sensor can be connected to the robot's main computer via the USB socket positioned at the top of the rover. If your USB cable is long, it might stick up into the lidar's field of view; try to hide the excess in the empty space inside the Leo Rover's back frame, just like in the photo below:

info

When mounting the sensor, be particularly careful not to let other parts of the Rover obstruct its field of view.

The USB connection provides power to the sensor and handles the data transfer, so no external power source is necessary. Some lidars do need an external power connection; that's when the Powerbox may come in handy.

Powerbox
The Powerbox module significantly enhances Leo Rover's capabilities by providing versatile power options and enabling continuous operation through battery hot-swaps and external power access.

With everything connected you are ready to try out your new sensor.

Software integration

The first thing to do is make sure your device has the correct permissions and is available at a fixed path on your system. To do this, you can add the following rule to the udev service:

/etc/udev/rules.d/lidar.rules
KERNEL=="ttyUSB*", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE="0666", GROUP="dialout", SYMLINK+="lidar"

Paste this line into the /etc/udev/rules.d/lidar.rules file and reload the udev rules by typing:

sudo udevadm control --reload-rules && sudo udevadm trigger

Your device should now be available at the /dev/lidar path.
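
You can quickly verify that the rule took effect (assuming the lidar is plugged in) by checking the symlink; it should point to one of the ttyUSB devices:

ls -l /dev/lidar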

We want the sensor functionality to be available in the ROS ecosystem, so you should install a ROS package that provides a node for the sensor you are trying to integrate.

sudo apt install ros-${ROS_DISTRO}-rplidar-ros
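
If you want to confirm the package is visible to your ROS installation, you can ask rospack where it was installed:

rospack find rplidar_ros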

Now, create a launch file that will start the node with a suitable configuration.

/etc/ros/laser.launch
<launch>
  <node name="rplidar_node" pkg="rplidar_ros" type="rplidarNode" output="screen">
    <param name="serial_port" value="/dev/lidar"/>
    <param name="frame_id" value="laser_frame"/>
    <param name="serial_baudrate" value="1000000"/>
  </node>
</launch>
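
Before adding it to the boot sequence, you may want to start the launch file manually to confirm the lidar spins up and the node connects (stop it with Ctrl+C when done):

roslaunch /etc/ros/laser.launch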

Include your launch file in the robot.launch file, so that your node will start at boot.

In /etc/ros/robot.launch:

<include file="/etc/ros/laser.launch"/>
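
For reference, the include just needs to sit inside the top-level <launch> element of robot.launch; here is a minimal sketch, where the comment stands in for whatever your file already contains:

/etc/ros/robot.launch
<launch>

  <!-- existing contents of robot.launch -->

  <include file="/etc/ros/laser.launch"/>

</launch>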

Your robot should be aware of where the sensor is located and what space it occupies. You can ensure this by creating a URDF model of the sensor.

/etc/ros/urdf/laser.urdf.xacro
<?xml version="1.0"?>
<robot>

  <link name="sp3lidar_link">
    <visual>
      <origin xyz="0 0 0.009"/>
      <geometry>
        <box size="0.08 0.08 0.018"/>
      </geometry>
      <material name="support">
        <color rgba="0 0 0 0.6"/>
      </material>
    </visual>
    <visual>
      <origin xyz="0 0 0.02975"/>
      <geometry>
        <box size="0.055 0.055 0.0235"/>
      </geometry>
      <material name="base">
        <color rgba="0.0 0.0 0.0 1.0"/>
      </material>
    </visual>
    <visual>
      <origin xyz="0 0 0.0504"/>
      <geometry>
        <cylinder radius="0.0253" length="0.0178"/>
      </geometry>
      <material name="lidar">
        <color rgba="1.0 0.0 0.0 0.7"/>
      </material>
    </visual>
    <collision>
      <origin xyz="0 0 0.009"/>
      <geometry>
        <box size="0.08 0.08 0.018"/>
      </geometry>
    </collision>
    <collision>
      <origin xyz="0 0 0.02975"/>
      <geometry>
        <box size="0.055 0.055 0.0235"/>
      </geometry>
    </collision>
    <collision>
      <origin xyz="0 0 0.0504"/>
      <geometry>
        <cylinder radius="0.0253" length="0.0178"/>
      </geometry>
    </collision>
  </link>

  <joint name="sp3lidar_joint" type="fixed">
    <origin xyz="0.0775 0 0.0"/>
    <parent link="base_link"/>
    <child link="sp3lidar_link"/>
  </joint>

  <link name="laser_frame"/>

  <joint name="laser_joint" type="fixed">
    <origin xyz="0 0 0.048" rpy="0 0 3.14159"/>
    <parent link="sp3lidar_link"/>
    <child link="laser_frame"/>
  </joint>

</robot>

Then include it in the description that is uploaded at boot.

/etc/ros/urdf/robot.urdf.xacro
<xacro:include filename="/etc/ros/urdf/laser.urdf.xacro"/>
tip

You can experiment with the URDF file and create a more representative model of the sensor by adding more visual and collision tags or by including meshes in STL or COLLADA format.
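
For example, a visual based on an STL mesh could look like the snippet below; the mesh path is only a hypothetical placeholder, so point it at wherever you store the model:

<visual>
  <geometry>
    <mesh filename="file:///etc/ros/urdf/meshes/rplidar_s3.stl"/>
  </geometry>
</visual>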

The last step is to either reboot the robot or restart the leo service.

sudo systemctl restart leo
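
Once the service is back up, you can check that the lidar node has started:

rosnode list | grep rplidar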

Examples

Reading and visualizing the data

The robot should now publish LaserScan messages on the /scan topic. You can check the raw data it sends by typing:

rostopic echo /scan
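
The messages are quite long, so for a quick sanity check it may be easier to look at the publishing rate instead:

rostopic hz /scan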

If you have ROS installed on your computer, you can get a more graphical representation of the data with RViz. If you don't have ROS, you can follow this guide:

📄Install ROS on your computer
Learn how to install the Robot Operating System (ROS) on your computer. Step-by-step guide for beginners.

Before starting RViz, make sure you have completed the Connecting another computer to ROS network section of the ROS Development tutorial:

📄ROS Development
Detailed guide on ROS development for Leo Rover, covering topics like adding additional functionalities, building ROS packages and more.

Now, open RViz by typing rviz in the terminal, or, if you have the leo_viz package installed, type:

roslaunch leo_viz rviz.launch

This will start RViz with visualization of the current robot model.

On the Displays panel click Add -> By topic and search for the /scan topic. Choose the LaserScan display and click Ok.

You should now be able to see the data from the sensor visualized as points in 3D space.

To overlay the points onto the camera image, you can also add the Camera display.

Be sure to set compressed as the Transport Hint and /camera/image_raw as the Image Topic.

Here's an example end result:

What next?

Lidars are commonly used in projects involving autonomous navigation, so you might be interested in a tutorial about it.

They are, however, not the only way of teaching a Leo Rover how to move on its own. Check out our line follower tutorial if you want to learn more. You can also check our Integrations page for more instructions.