Turtlebot multi-robot system

I have three turtlebot v2.0 robots that I want to control as a multi-robot system. That means I eventually want them to talk to each other, sharing information and building a representation of their surroundings.

But first the question is which software infrastructure is well suited for such a task and will allow me to solve it efficiently.

The turtlebots come with the Robot Operating System (ROS) to communicate between the laptop and the Kobuki base. ROS also integrates the Kinect sensor on the robot, allowing the turtlebot to perform SLAM (gmapping) out of the box (thanks to the factory-calibrated IMU of the v2.0 version). The Kinect can of course also be used to observe various other aspects of the environment.
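
For reference, this is roughly how the out-of-the-box SLAM is started on a single turtlebot (launch file names as in the TurtleBot 2 tutorials for ROS Indigo; other ROS releases may use different names):

    # on the turtlebot's laptop
    roslaunch turtlebot_bringup minimal.launch           # bring up the Kobuki base
    roslaunch turtlebot_navigation gmapping_demo.launch  # Kinect + gmapping SLAM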

For this to work, each turtlebot runs a local rosmaster that manages the connections between the various local nodes, which communicate with each other as publishers and subscribers of ROS topics.
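
As a reminder of what that looks like in code, here is a minimal rospy sketch of a node talking to its local rosmaster (the topic name /chatter is only for illustration):

    #!/usr/bin/env python
    # Minimal rospy node: publishes and subscribes on the local rosmaster.
    import rospy
    from std_msgs.msg import String

    def callback(msg):
        rospy.loginfo("heard: %s", msg.data)

    def main():
        rospy.init_node('chatter_demo')
        pub = rospy.Publisher('/chatter', String, queue_size=10)
        rospy.Subscriber('/chatter', String, callback)
        rate = rospy.Rate(1)  # publish once per second
        while not rospy.is_shutdown():
            pub.publish(String(data='hello from this robot'))
            rate.sleep()

    if __name__ == '__main__':
        main()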

Integrating the turtlebots into a multi-robot system requires communication between them (via WiFi) to share information. Because of the structure of ROS, rosmasters cannot communicate with each other directly without additional code.

ROS multi-master projects

There are several projects that try to allow communication between ROS masters:

  • Master Sync: allows communication between ros-masters via a super master, but is currently limited to two robots.
  • Foreign Relay: can make a topic visible to other masters, but doesn’t allow the whole multi-robot system to be monitored easily.
  • Fkie Multimaster: seems to be the most mature and a truly multi-master project. Every ros-master connected to the system exposes all topics to all other ros-masters, which allows the ros-masters to communicate via subscriptions to remote topics (see the example commands after this list).
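
For completeness, running Fkie Multimaster boils down to starting two extra nodes next to each robot's local rosmaster (package names as in the Hydro/Indigo-era multimaster_fkie release; newer releases renamed the packages):

    # run on every robot, next to its local rosmaster
    rosrun master_discovery_fkie master_discovery   # announce this rosmaster on the network
    rosrun master_sync_fkie master_sync             # pull remote topics into the local rosmaster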

Lightweight Communications and Marshaling

Another option would be to leave the ROS environment for the task of communicating between the turtlebots. One project that was suggested to me for this task is Lightweight Communications and Marshaling (LCM). It was developed for MIT’s autonomous car for the DARPA Urban Challenge. LCM is a library that can be used from various programming languages such as Python, C, and C++. The code for a simple subscriber and publisher setup is very concise (see the tutorial). The code is free and open source and can be found here.
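
To give a rough idea of how concise it is, here is a combined publisher and subscriber in Python, using the example_t message type that the LCM tutorial generates with lcm-gen (the channel name is arbitrary):

    import lcm
    from exlcm import example_t  # message class generated by lcm-gen in the LCM tutorial

    def on_example(channel, data):
        msg = example_t.decode(data)
        print("received on %s: timestamp=%d" % (channel, msg.timestamp))

    lc = lcm.LCM()
    lc.subscribe("EXAMPLE", on_example)

    # publish one message, then block and handle incoming ones
    msg = example_t()
    msg.timestamp = 0
    lc.publish("EXAMPLE", msg.encode())
    while True:
        lc.handle()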

The key property of LCM that makes it attractive for multi-robot systems is that it is decentralized: there is no master that manages nodes. Peers communicate directly with each other. UDP multicast is used to broadcast messages to several receiving peers. The use of UDP minimizes latency over the WiFi connection between robots, which makes it a good choice in the multi-robot setup.
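
One practical detail: LCM's default multicast provider uses a TTL of 0, which keeps packets on the local machine, so for the robots to actually hear each other over WiFi the TTL has to be raised, for example:

    import lcm

    # ttl=0 (the default) keeps multicast packets on the local host;
    # ttl=1 lets them reach the other robots on the same network.
    lc = lcm.LCM("udpm://239.255.76.67:7667?ttl=1")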

It would not make sense to use only LCM with the turtlebots, since LCM does not provide any interface or functionality to communicate with the turtlebot base. This means there is no way to control the robots with LCM alone.

The idea for the multi-master turtlebot system is to use ROS locally on each robot for all control and for data collection from the Kinect and other sensors. LCM is then added to exchange relevant, processed information between the robots.
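
A minimal sketch of that split could look like the following bridge node, which subscribes to the robot's local odometry in ROS and re-publishes a compact 2D pose over LCM. Note that pose2d_t is a hypothetical LCM type (robot_id, x, y, theta) that would be generated with lcm-gen, and the channel name is made up as well:

    #!/usr/bin/env python
    # ROS-to-LCM bridge sketch: ROS handles everything on the robot,
    # LCM carries the distilled result to the other robots.
    import lcm
    import rospy
    import tf.transformations as tft
    from nav_msgs.msg import Odometry
    from robot_msgs import pose2d_t  # hypothetical lcm-gen message (robot_id, x, y, theta)

    lc = lcm.LCM("udpm://239.255.76.67:7667?ttl=1")

    def odom_callback(odom):
        # distill the full Odometry message into a small 2D pose
        msg = pose2d_t()
        msg.robot_id = "turtlebot_1"
        msg.x = odom.pose.pose.position.x
        msg.y = odom.pose.pose.position.y
        q = odom.pose.pose.orientation
        msg.theta = tft.euler_from_quaternion([q.x, q.y, q.z, q.w])[2]
        lc.publish("POSE_TURTLEBOT_1", msg.encode())

    rospy.init_node('lcm_bridge')
    rospy.Subscriber('/odom', Odometry, odom_callback)
    rospy.spin()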

The important realization is that it usually does not make sense to publish raw sensor data to the other robots in the system (and it would not be feasible at high enough rates for, say, the RGB-D data from the Kinect). There is usually a processing step on each robot that extracts some kind of information from the raw sensor data, for example building a map of the environment or extracting and quantizing visual features from the RGB data. This distilled data can then be shared with the other robots to allow, for example, cooperative mapping of the environment.
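
As an illustration of such a processing step, one could coarsen the local gmapping occupancy grid before packing it into an LCM message (a sketch only; the block size of 4 is arbitrary):

    import numpy as np
    from nav_msgs.msg import OccupancyGrid

    def downsample_map(grid, factor=4):
        """Coarsen a nav_msgs/OccupancyGrid before sharing it over LCM.
        Taking the max per block keeps obstacles (value 100) visible."""
        h, w = grid.info.height, grid.info.width
        cells = np.array(grid.data, dtype=np.int8).reshape(h, w)
        h2, w2 = (h // factor) * factor, (w // factor) * factor  # crop to a multiple of factor
        blocks = cells[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
        return blocks.max(axis=(1, 3))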
