This example project illustrates how to provision data from a ROS2 topic to Apache Kafka.
- We use a preconfigured devcontainer environment in VS Code on Ubuntu 24.04. First install Docker, then build the Docker image.
docker build .devcontainer/
- Then open VS Code and agree to reopen the folder in the devcontainer (just click yes when asked).
code .
- Create and activate the virtual environment, then install the requirements.
python3 -m venv .venv
source .venv/bin/activate
pip install -r src/ros2kafka/requirements_kafka.txt
pip install -r src/ros2kafka/requirements_simrk.txt -i https://nexus.basys.dfki.dev/repository/pypi-group/simple
- Build the ROS2 project
colcon build
- Unfortunately, there is currently no automatic way to copy the installed Python requirements into the install folder created by colcon. Until a better solution exists, please copy them manually.
cp -r .venv/lib/python3.12/site-packages/* install/ros2kafka/lib/python3.12/site-packages
Prerequisites: a running Apache Kafka installation together with a Schema Registry (the registry is only needed for the AVRO format). You can then configure endpoints and topics in config.yaml.
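A minimal config.yaml might look like the following. This is only an illustrative sketch: the parameter names below (kafka_bootstrap_servers, schema_registry_url, kafka_topic, ros_topic) are assumptions, so check the actual parameter declarations in src/ros2kafka.

```yaml
# Illustrative sketch only -- the actual parameter names used by the
# ros2kafka nodes may differ.
/**:
  ros__parameters:
    kafka_bootstrap_servers: "localhost:9092"
    schema_registry_url: "http://localhost:8081"   # only needed for AVRO
    kafka_topic: "ros2_poses"
    ros_topic: "/pose"
```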
- Start the ROS2 publisher node in a new terminal.
source install/setup.bash
ros2 run ros2kafka sample_pose_publisher --ros-args --params-file config.yaml
- Start the ROS2 node that provisions data to Apache Kafka in JSON format in a new terminal.
source install/setup.bash
ros2 run ros2kafka kafka_data_provider_JSON --ros-args --params-file config.yaml
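As a rough sketch of what the JSON provider does: it subscribes to a pose-like ROS2 message and serializes it into a JSON payload that a Kafka producer can send as the message value. The envelope fields and the helper below are illustrative assumptions, not the actual ros2kafka wire format or API.

```python
import json
import time

def pose_to_json(position, orientation, frame_id="world"):
    """Serialize a pose-like message into a JSON payload for Kafka.

    position: (x, y, z) tuple; orientation: (x, y, z, w) quaternion.
    The envelope fields here are illustrative assumptions, not the
    actual ros2kafka wire format.
    """
    payload = {
        "timestamp": time.time(),
        "frame_id": frame_id,
        "position": dict(zip(("x", "y", "z"), position)),
        "orientation": dict(zip(("x", "y", "z", "w"), orientation)),
    }
    return json.dumps(payload).encode("utf-8")

# A Kafka producer would then send the bytes as the message value, e.g.:
# producer.send("ros2_poses", value=pose_to_json((1.0, 2.0, 0.0), (0.0, 0.0, 0.0, 1.0)))
```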
- Alternatively or additionally, start the ROS2 node that provisions data to Apache Kafka in AVRO format in a new terminal.
source install/setup.bash
ros2 run ros2kafka kafka_data_provider_AVRO --ros-args --params-file config.yaml
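For the AVRO provider, the Schema Registry stores a schema for the Kafka topic. A pose record schema could look roughly like the following; this is an illustrative sketch, not the schema shipped with ros2kafka.

```json
{
  "type": "record",
  "name": "Pose",
  "namespace": "ros2kafka",
  "fields": [
    {"name": "frame_id", "type": "string"},
    {"name": "position", "type": {"type": "record", "name": "Point",
      "fields": [{"name": "x", "type": "double"},
                 {"name": "y", "type": "double"},
                 {"name": "z", "type": "double"}]}},
    {"name": "orientation", "type": {"type": "record", "name": "Quaternion",
      "fields": [{"name": "x", "type": "double"},
                 {"name": "y", "type": "double"},
                 {"name": "z", "type": "double"},
                 {"name": "w", "type": "double"}]}}
  ]
}
```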