Introduction

I am working on a personal project to learn about robotics. I custom-built a robot that uses the Robot Operating System 2 (ROS2) and Kubernetes (1.24) to orchestrate the bot's containerized ROS2 components. ROS is an open-source toolset for building robots; I initially used a custom Python setup, but switched to ROS2 for its vast ecosystem of packages and its industry adoption. The bot is equipped with live video streaming, allowing users to remotely control and maneuver it through a web browser. It can move forward, backward, left, and right, and an ultrasonic sensor lets it detect immediate obstacles in its path and stop accordingly.
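To give a feel for the obstacle-stop behavior, here is a simplified sketch (not the project's actual code) of the gating logic: given a drive command from the browser and the latest ultrasonic range reading, decide whether the command passes through or gets overridden with a stop. The command names and the 20 cm threshold are hypothetical.

```python
STOP_DISTANCE_CM = 20.0  # hypothetical safety threshold


def gate_command(command: str, distance_cm: float) -> str:
    """Pass turn and backward commands through, but override forward
    motion with a stop when the ultrasonic sensor reports an obstacle
    closer than the threshold."""
    if command == "forward" and distance_cm < STOP_DISTANCE_CM:
        return "stop"
    return command


print(gate_command("forward", 12.0))  # obstacle ahead -> "stop"
print(gate_command("left", 12.0))     # turning is still allowed -> "left"
```

In the real bot this decision would live in a ROS2 node sitting between the teleop input and the motor driver, with the distance arriving on a sensor topic rather than as a function argument.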

Software Components

I installed and configured Ubuntu 22.04 Server and Kubernetes (via MicroK8s) on a Raspberry Pi 4 (8GB), and I'm using Skaffold to build and deploy to the Pi's Kubernetes instance. ROS2 node discovery happens via Eclipse Cyclone DDS over the Kubernetes CNI network.
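For reference, Cyclone DDS can be pointed at a config file via the CYCLONEDDS_URI environment variable (with RMW_IMPLEMENTATION=rmw_cyclonedds_cpp selecting it as the ROS2 middleware). The fragment below is an illustrative sketch, not my exact config: the interface name and peer address are placeholders, and the idea is to disable multicast (which CNI overlays often don't carry) and list pod addresses as unicast peers instead.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CycloneDDS xmlns="https://cdds.io/config">
  <Domain Id="any">
    <General>
      <Interfaces>
        <!-- placeholder: the pod's network interface -->
        <NetworkInterface name="eth0"/>
      </Interfaces>
      <!-- multicast discovery rarely works across CNI overlays -->
      <AllowMulticast>false</AllowMulticast>
    </General>
    <Discovery>
      <Peers>
        <!-- placeholder: unicast address of a peer pod -->
        <Peer address="10.1.0.5"/>
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
```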

Hardware Components

The project currently uses an Arduino microcontroller, motors, an ultrasonic sensor, and a Raspberry Pi camera to control the bot's movements, all mounted on a basic Actobotics rover chassis. The build incorporates various materials, including an old phone car mount (a workaround with a bonus telescopic swivel arm!), cardboard, and hair ties, adding functional and creative elements to the project. I'll need to upgrade the hardware platform soon enough.

Goals

I want to explore video processing options for navigation tasks. I'm currently using ROS2 Galactic and plan to upgrade to Humble Hawksbill.

Bot Image