ROS with Raspberry Pi: Investigating the Core Issue of Slow Streaming Performance
Sebastian
Posted on March 21, 2022
Building a mobile robot will typically lead to adding visual sensors that enable the robot to inspect its surroundings and navigate. Visual sensors encompass ultrasonic sensors or laser scanners for distance measurements, LIDAR for a 360-degree laser scan image, cameras that provide RGB images, and sensors that provide complex point clouds. ROS supports all of these sensors: attach the correct plugin in RViz and Gazebo, start the hardware sensor, publish the correct topic, and subscribe to its data.
In my robot project, my goal is to stream image data from a mobile robot, connected via WiFi, to my Linux workstation, connected via Ethernet. The camera of my choice is the Intel Realsense D435 because of its great form factor and its official ROS plugin. Once the ROS software and configuration were done, streaming of images on the topic `/camera/color/image_raw` and of pointcloud data on `/camera/depth/color/points` could start. However, I was surprised by the bad performance: while images were published at about 17 FPS on the robot, only 6 FPS arrived at the workstation, and the pointcloud data dropped from 12 FPS to 3 FPS. This is not fast enough to see the robot's surroundings in real time, and not adequate for navigation.
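For reference, here is a minimal sketch of how such a stream can be started with the official `realsense2_camera` ROS1 package (the launch file `rs_camera.launch` ships with the package; the `filters` argument enables pointcloud generation):

```bash
# Start the D435 camera node and publish color, depth, and pointcloud topics
roslaunch realsense2_camera rs_camera.launch filters:=pointcloud
```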
It took me more than two months to investigate this problem, find and solve several issues, and finally reach 30 FPS for both types of data. Follow along to learn a lot about optimizing the usage of the D435 camera, image and pointcloud configuration in ROS, and network performance.
Note: The technical environment is Ubuntu 20.04 with ROS Noetic 1.15.11.
This article originally appeared at my blog admantium.com.
Hardware & Software Overview
To detail the context of this article, here is the concrete hardware and software that I'm using in my robot project.
- Mobile Robot
  - Raspberry Pi 4B, 4 GB RAM
  - Ubuntu 20.04 Server LTS (headless)
  - ROS Noetic 1.15.11
- Linux Workstation
  - Intel Celeron N3450 @ 4x 2.2 GHz, 6 GB RAM
  - Ubuntu 20.04 Focal
  - ROS Noetic 1.15.11
On the mobile robot, I also use the Realsense SDK and the Realsense ROS packages - in different versions, because they deliver different performance. Details are explained in the next sections.
Baseline Performance
With Realsense Camera SDK 2.47 and Realsense ROS1 2.3.2, I could get the camera node started and stream images. On the mobile robot I launched the ROS nodes, and then on my Linux workstation I measured the number of received messages with the command `rostopic hz`.
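For example, to check the incoming framerate on the workstation:

```bash
# Print the average publishing rate as observed on this machine
rostopic hz /camera/color/image_raw
rostopic hz /camera/depth/color/points
```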
The baseline, then, is this: 7 FPS for `/camera/color/image_raw` and 3 FPS for `/camera/depth/color/points`.
From here on, I systematically tried different configuration parameters for the ROS Realsense node, as well as different versions of the camera SDK and the Realsense ROS package. The results of these measurements are explained in the next sections.
ROS1: ROS Camera Parameter Configurations
In the first attempt, these versions of the SDK and library were used:
- Realsense SDK v2.47
- Realsense ROS 2.3.2
Configuring different parameters of the ROS node led to the following results.
| Parameter | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Run 6 | Run 7 | Run 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| depth & color width | not set | not set | 640 | 640 | 640 | 640 | 640 | 640 |
| depth & color height | not set | not set | 480 | 480 | 480 | 480 | 480 | 480 |
| depth_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| color_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| initial_reset | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | |
| enable_sync | TRUE | | | | | | | |
| align_depth | | | | | | | | |
| filters | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud |
| texture_stream | any | color | any | any | color | color | color | color |
| ordered_pc | yes | | | | | | | |
| Topic Hz (receiver) | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Run 6 | Run 7 | Run 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /color/image_raw | 6.5 | no data | 1.5 | 1.5 | no data | 6 | 6 | 6 |
| /depth/color/points | no data | no data | no data | no data | no data | 3 | 1 | 3 |
| /depth/image_rect_raw | 7 | 6.5 | no data | no data | 6 | 6 | 6 | 6 |
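All of these parameters map to launch arguments of the `realsense2_camera` node. As a sketch, run 6 of the table above roughly corresponds to the following invocation (argument names as defined in `rs_camera.launch`; the `texture_stream` row corresponds to the `pointcloud_texture_stream` argument):

```bash
roslaunch realsense2_camera rs_camera.launch \
  color_width:=640 color_height:=480 color_fps:=6 \
  depth_width:=640 depth_height:=480 depth_fps:=6 \
  initial_reset:=true filters:=pointcloud \
  pointcloud_texture_stream:=RS2_STREAM_COLOR
```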
These first results showed that no stream got faster than 7 FPS, and that pointcloud data was only received at all when the framerate was explicitly set to 6 FPS.
So I opened a GitHub issue about the ROS1 performance and continued the experiments.
In other issues, I read that downgrading the Realsense SDK helped to increase performance. So I tried a downgraded SDK version for both ROS1 and ROS2.
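To verify which versions are actually installed before and after such a downgrade, the packages can be inspected like this:

```bash
# List the installed librealsense and ROS wrapper packages
dpkg -l | grep -i realsense

# Print the version of the ROS wrapper package
rosversion realsense2_camera
```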
ROS1: Downgrading the Realsense SDK
In the next attempt, I downgraded the SDK:

- Realsense SDK v2.41
- Realsense ROS 2.2.21
This time, I also measured the topic frequency on the sender side.
| Parameter | Run 1 | Run 2 | Run 3 | Run 4 |
| --- | --- | --- | --- | --- |
| depth/color width | 640 | 640 | 640 | |
| depth/color height | 480 | 480 | 480 | |
| depth_fps | 5 | 30 | 30 | |
| color_fps | 5 | 30 | 30 | |
| initial_reset | FALSE | TRUE | TRUE | |
| enable_sync | FALSE | FALSE | FALSE | |
| align_depth | FALSE | TRUE | TRUE | |
| filters | pointcloud | - | - | |
| texture_stream | | | | |
| Topic Hz (sender) | Run 1 | Run 2 | Run 3 | Run 4 |
| --- | --- | --- | --- | --- |
| /color/image_raw | 17 | 15 | 28 | 26 |
| /depth/color/points | no data | 12 | 16 | 0 |
| /depth/image_rect_raw | 30 | 25 | 17 | |
| /aligned_depth_to_color/image_raw | no data | no data | 8 | 15 |
| Topic Hz (receiver) | Run 1 | Run 2 | Run 3 | Run 4 |
| --- | --- | --- | --- | --- |
| /color/image_raw | 6 | 6 | 11 | 10 |
| /depth/color/points | 2 | 2 | 3 | |
| /depth/image_rect_raw | 9 | 9 | 5 | 5 |
These results show that a stable 28 FPS for the topic `/camera/color/image_raw` and 16 FPS for `/camera/depth/color/points` are possible - on the sender side! However, the workstation that receives this data, connected via Ethernet, gets only about a third of these frames.
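A good way to confirm that the network is the bottleneck is `rostopic bw`, which measures the bandwidth a topic actually consumes on the receiving machine:

```bash
# Measure the received bandwidth of the image topic on the workstation
rostopic bw /camera/color/image_raw
```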
ROS2: Downgrading the Realsense SDK
The next measurement in ROS2 was also done with a downgraded version of the SDK.
- Realsense SDK v2.41
- Realsense ROS 3.13
These are the parameters that I used, and the results.
| Parameter | Run 1 | Run 2 | Run 3 |
| --- | --- | --- | --- |
| depth/color width | 640 | | |
| depth/color height | 480 | | |
| depth_fps | | | |
| color_fps | | | |
| initial_reset | | | |
| enable_sync | | | |
| align_depth | | | |
| filters | pointcloud | | |
| texture_stream | | | |
| infra1 | disabled | | |
| infra2 | disabled | | |
| Topic Hz (sender) | Run 1 | Run 2 | Run 3 |
| --- | --- | --- | --- |
| /color/image_raw | 19 | 16 | 16 |
| /depth/color/points | no data | no data | 13 |
| /depth/image_rect_raw | 30 | 30 | 28 |
| Topic Hz (receiver) | Run 1 | Run 2 | Run 3 |
| --- | --- | --- | --- |
| /color/image_raw | 2.5 | 2 | 2 |
| /depth/color/points | no data | no data | 1 |
| /depth/image_rect_raw | 4.5 | 4.5 | 4 |
A maximum of 19 FPS for images and 13 FPS for pointclouds - that's worse than with ROS1. Also, the severe framerate drop on the receiver side prevents any usage for visualization or navigation.
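For reference, the ROS2 counterpart of `rostopic hz` used for these measurements is the `ros2 topic` CLI:

```bash
# Print the average publishing rate as observed on this machine
ros2 topic hz /camera/color/image_raw
ros2 topic hz /camera/depth/color/points
```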
ROS Performance Considerations
Following up on these results, I did a general internet search about ROS1 and ROS2 performance, to get a feeling for where to look for performance gaps. Here are some of the potential issues to investigate:
- Setting appropriate buffer sizes and queue sizes for the receivers (see the subscriber sketch after this list)
- When a topic has multiple subscribers, the topic publication rate decreases
- Earlier ROS2 versions had heavy constraints when parsing Python messages, which resulted in very slow message delivery, but this is fixed in current ROS2 distributions
- In ROS2, multicasting can cause problems with specific routers
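To illustrate the first point, here is a minimal rospy subscriber sketch that enlarges the receive buffer and keeps only the newest message - `queue_size` and `buff_size` are parameters of `rospy.Subscriber`, and a too-small `buff_size` is a known cause of lagging image topics:

```python
#!/usr/bin/env python3
# Minimal sketch: subscribe with a large receive buffer (buff_size)
# and drop all but the newest message (queue_size=1)
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    # Log at most once every 5 seconds
    rospy.loginfo_throttle(5, "received %dx%d image" % (msg.width, msg.height))

rospy.init_node("image_listener")
rospy.Subscriber("/camera/color/image_raw", Image, on_image,
                 queue_size=1, buff_size=16 * 1024 * 1024)
rospy.spin()
```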
Overall, these issues led me to check my local WiFi. From the Raspberry Pi 4 itself, I ran a simple online speedtest.
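The output format below is that of the `speedtest-cli` tool, which can be installed directly from the Ubuntu repositories:

```bash
# Install and run a command-line bandwidth test
sudo apt install speedtest-cli
speedtest-cli
```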
Using a WLAN connection:
```
Testing download speed................................................................................
Download: 4.22 Mbit/s
Testing upload speed......................................................................................................
Upload: 6.23 Mbit/s
```
Using an ethernet cable:
```
Testing download speed................................................................................
Download: 106.04 Mbit/s
Testing upload speed......................................................................................................
Upload: 42.32 Mbit/s
```
I was surprised! The upload limit of about 6 Mbit/s could explain the severe performance drop: images simply cannot be sent fast enough. Assuming about 1 MB per raw image, a 30 Hz image topic on the receiver side would require an upload speed of at least 30 MB/s, which is more than 200 Mbit/s. Clearly, this is not possible with my current WLAN speed.
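A quick back-of-the-envelope calculation confirms this (assuming the rgb8 encoding of the raw image topic, i.e. 3 bytes per pixel):

```python
# Bandwidth required for streaming raw 640x480 RGB images at 30 FPS
width, height, bytes_per_pixel = 640, 480, 3  # rgb8 encoding
fps = 30

bytes_per_image = width * height * bytes_per_pixel
required_mbit = bytes_per_image * fps * 8 / 1e6

print(f"{bytes_per_image / 1e6:.2f} MB per image")  # ~0.92 MB
print(f"{required_mbit:.0f} Mbit/s for {fps} FPS")  # ~221 Mbit/s
```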
And from here on, my investigation took a totally new direction: optimizing the WLAN speed of my Raspberry Pi. But that is the story of another article.
Conclusion
Visual sensors are essential for robot navigation. Amongst the many available data formats, this article investigated the topics `/camera/color/image_raw` and `/camera/depth/color/points`, which deliver raw image data and point cloud data for depth information. The particular setup - a Raspberry Pi 4 with the Realsense D435 camera, connected via WiFi to a Linux workstation - showed a severe performance drop for these topics on the receiver side. By systematically trying different versions of the Realsense SDK with both ROS1 and ROS2, and by trying different startup parameters for the Realsense node, I could improve the publication rate on the sender side, but the receiver capped at about 11 FPS for raw images and 3 FPS for pointclouds. Finally, I measured the raw upload speed of my Raspberry Pi 4: it is only about 6 Mbit/s. How to improve that is covered in the article Improving Image Streaming Performance.