Build a Raspberry Pi robot yourself – project overview

Almost any conceivable project can be realized with the Raspberry Pi. A very popular topic I am often asked about is how to build your own robot. There are many different kits and ways to build a Raspberry Pi robot. My robot can, for example, follow lines, avoid obstacles, follow a voice, be controlled remotely, and more.

In this overview I would like to go through the individual topics of a Raspberry Pi robot and give helpful tips for building your own. By following this tutorial you can create a fully functional robot with many features, and you can also introduce your own modifications.

Raspberry Pi Robot Components

  • robot chassis kit with two motors and battery holder
  • 2-3x infrared line follower modules
  • 1x HC-SR04 ultrasonic sensor on a servo motor (e.g. the SG90)
  • 3x sound sensors (for front, right, and left)
  • 1x MCP3008 analog-digital converter
  • 1x IR receiver with remote control
  • Xbox 360 controller with wireless USB adapter (optional)
  • soldering equipment (soldering iron, solder, etc.)

The complete code of our robot is written in Python. If you don’t know this programming language yet, you can either follow the introduction on this page or, for example, take an online course.

Raspberry Pi robot body

The chassis is the most important component of our entire project. There are different models, which mainly have either two (2WD) or four (4WD) motors that can be controlled separately. The advantage of four motors, besides better stability, is more precise rotation. Chassis with two motors come with at least one swiveling support wheel, which can make precise navigation somewhat more difficult.

Since my robot only needs to drive around the apartment, I used such a two-wheeled kit as a basis. A battery holder was also included, which serves as an additional power supply for the motors. You can find detailed instructions on how to assemble the kit and write your first Raspberry Pi robot code here:
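The drive logic of such a two-motor chassis can be sketched as a small class. The pin layout and the use of an L298N-style H-bridge are assumptions here; on the Pi itself, the recorded state would be replaced by RPi.GPIO PWM calls, while this version just stores the intended motor speeds so the steering logic can be tried out without hardware:

```python
# Minimal 2WD drive sketch. Speeds are in percent, -100..100;
# negative values mean the wheel turns backward.

class Robot:
    def __init__(self):
        # (left duty cycle, right duty cycle) -- stand-ins for PWM output
        self.left = 0
        self.right = 0

    def _drive(self, left, right):
        self.left, self.right = left, right
        # Real implementation (assumption): set the H-bridge direction
        # pins and PWM duty cycles here, e.g. via RPi.GPIO.PWM.

    def forward(self, speed=100):
        self._drive(speed, speed)

    def backward(self, speed=100):
        self._drive(-speed, -speed)

    def left_turn(self, speed=100):
        # Spin in place: left wheel backward, right wheel forward
        self._drive(-speed, speed)

    def right_turn(self, speed=100):
        self._drive(speed, -speed)

    def stop(self):
        self._drive(0, 0)
```

Later parts of the series extend a class like this step by step with the individual modes.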

If you want to go with a different chassis, here are a few ideas:

Robot follows lines

After assembling the body kit, our robot can unfortunately still do very little except drive. So next we let it follow lines. This works with the help of infrared modules attached to the underside of the chassis. By laying out a line of dark (light-absorbing) tape, these sensors can detect whether they are on the line, and the motors are controlled accordingly.

You can see how this looks in action in the video below. As soon as the robot loses the line, it swings to the right and left within a fixed angle to find it again. If the line is found, it simply continues to follow it. If nothing is found within this range, the mode is terminated and the robot stops.
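The decision logic described above can be sketched with two small functions. The sensor polarity (True when the module sees the dark line) and the swing angles are assumptions to match against your own modules:

```python
# Line-following sketch: steer based on two IR modules mounted
# left and right of the line.

def line_follow_step(left_on_line, right_on_line):
    """Return a steering command from the two IR sensor readings."""
    if left_on_line and right_on_line:
        return "forward"   # centered on the line
    if left_on_line:
        return "left"      # line drifting to the left, steer back
    if right_on_line:
        return "right"
    return "search"        # line lost: swing within a fixed angle

def search_angles(max_angle=45, step=15):
    """Angles to try when the line is lost: alternate right and left,
    widening up to max_angle. If nothing is found at any of these
    angles, the mode is terminated and the robot stops."""
    angles = []
    for a in range(step, max_angle + 1, step):
        angles += [a, -a]
    return angles
```

With the default values, the robot would probe 15°, -15°, 30°, -30°, 45°, -45° before giving up.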

Raspberry Pi robot remote control – by infrared remote control

To control the robot wirelessly ourselves, we have several options. The simplest is control by infrared remote. For this we can use the program LIRC, which makes connecting an IR receiver very easy. How the setup works exactly and what has to be considered can be found in this tutorial:

Once everything is set up, the Raspberry Pi can be controlled by IR remote, as seen in the video.

Before that, however, we have to extend the code: we can assign commands to the stored buttons of the remote control. In my case, the buttons are used to steer in the different directions and to change modes. The complete instructions for creating the commands and extending the Python code can be found here:
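The button-to-command assignment can be sketched as a simple lookup table. The key names below (KEY_UP and so on) are placeholders for whatever names you recorded in your LIRC configuration; on the Pi, the received key name would come from LIRC (for example by reading the output of `irw`):

```python
# Sketch: map LIRC key names to robot commands. The key names are
# assumptions -- replace them with the names from your own remote's
# LIRC configuration.

COMMANDS = {
    "KEY_UP": "forward",
    "KEY_DOWN": "backward",
    "KEY_LEFT": "left",
    "KEY_RIGHT": "right",
    "KEY_OK": "stop",
    "KEY_1": "line_follow_mode",
    "KEY_2": "obstacle_avoid_mode",
}

def dispatch(key_name):
    """Translate a received IR key into a robot command.
    Unknown keys are ignored (None)."""
    return COMMANDS.get(key_name)
```

A main loop would then read key names from LIRC and call the matching method of the robot class.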

Avoiding obstacles by ultrasound

As another feature I thought about an autopilot mode for my Raspberry Pi robot. It should be able to move around the room without colliding with objects. To realize this, I decided to use an ultrasonic module. The advantage is that it measures very quickly and the measurement is reasonably accurate. The question is whether to use several modules in several places or, as I did, a single ultrasonic sensor.

So that the robot does not just measure the distance to a single point, I mounted the HC-SR04 module on a servo motor. The big advantage is that the servo rotates the sensor, so the robot can automatically detect the best possible direction, namely the one with the most free space. Exactly how this is implemented can be read in the fourth part of the Raspberry Pi robot tutorial:
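The two core pieces of this mode can be sketched without hardware: converting the HC-SR04 echo pulse into a distance, and picking the direction with the most free space from a servo sweep. Triggering the sensor and timing the echo pin would happen on the Pi via GPIO; here the measured values are passed in directly:

```python
# Autopilot sketch: HC-SR04 distance conversion and direction choice.

def pulse_to_cm(echo_seconds, speed_of_sound=34300):
    """Convert the HC-SR04 echo pulse length to a distance in cm.
    The sound travels to the obstacle and back, hence the division
    by 2. speed_of_sound is in cm/s at roughly room temperature."""
    return echo_seconds * speed_of_sound / 2

def best_direction(measurements):
    """measurements: dict mapping servo angle (degrees) -> distance (cm),
    collected while the servo sweeps the sensor. Return the angle with
    the most free space."""
    return max(measurements, key=measurements.get)
```

For example, an echo pulse of 10 ms corresponds to about 171.5 cm, and from a sweep like {-45°: 30 cm, 0°: 120 cm, 45°: 80 cm} the robot would head straight ahead.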

Guiding the Raspberry Pi robot by voice

In the next part we make the robot react to voices. Here we don’t use a microphone, but several sound detectors attached to the car body. These sensors largely ignore ambient noise and measure the volume of a voice. Since several are mounted around the body, we can approximately determine where the voice came from, as it is perceived loudest by the closest sensor.

These modules work analog and deliver a corresponding value. To read them, we need an analog-digital converter. The more detectors we attach to our Raspberry Pi robot, the more accurately we can locate the origin of the voice and drive toward it.

The wiring is correspondingly simple: all that needs to be connected is the MCP3008 ADC, with the 3 (or more) sound detectors attached to it. The rest of the navigation is done in software. Both are covered in the fifth part of the series:
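The software side boils down to comparing the sensor readings. The MCP3008 delivers 10-bit values (0..1023); the channel-to-direction assignment and the noise threshold below are assumptions to be matched to your wiring:

```python
# Sketch: locate the voice from three analog sound sensors read via
# the MCP3008. Reading the ADC itself (SPI) is assumed to happen
# elsewhere; here the readings are passed in as a dict.

SENSOR_DIRECTIONS = {0: "front", 1: "left", 2: "right"}

def loudest_direction(readings, threshold=200):
    """readings: dict mapping MCP3008 channel -> 10-bit value.
    Return the direction of the loudest sensor, or None if everything
    is below the noise threshold (no voice detected)."""
    channel = max(readings, key=readings.get)
    if readings[channel] < threshold:
        return None
    return SENSOR_DIRECTIONS[channel]
```

With more than three detectors, the dict and the direction table simply grow, and the localization becomes finer.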

Controlling the Raspberry Pi robot with the Xbox 360 controller

In the sixth part of the series I decided to build an "extended" controller. For this I use an Xbox 360 controller, which is connected to the Raspberry Pi via a wireless USB adapter. We can then read all inputs of the controller and react accordingly. I used the left joystick to steer the robot. In addition, all other buttons can be assigned, e.g. to start other modes such as automatic obstacle avoidance. What exactly has to be done can be found out here:

As you can see in the video, the motors are quite fast and you have to react accordingly quickly. Therefore I defined the A button as a brake: as long as it is pressed, the robot does not move.
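The stick-to-motor mapping with the A-button brake can be sketched as one pure function. Axis values in -1.0..1.0 are assumed, as delivered by pygame's joystick module (reading the controller itself, and the fact that some drivers invert the y-axis, is left to the actual tutorial code):

```python
# Sketch: mix the left stick into differential-drive motor speeds,
# with the A button acting as a brake.

def stick_to_motors(x, y, a_pressed, max_speed=100):
    """x, y: left-stick axes in -1.0..1.0 (y positive = forward,
    an assumption -- pygame may deliver the axis inverted).
    Return (left_speed, right_speed) in -max_speed..max_speed."""
    if a_pressed:
        return (0, 0)  # brake: robot does not move while A is held
    # Standard differential mixing, clamped to the valid speed range
    left = max(-max_speed, min(max_speed, (y + x) * max_speed))
    right = max(-max_speed, min(max_speed, (y - x) * max_speed))
    return (left, right)
```

Pushing the stick straight up drives both wheels forward at full speed; pushing it fully to the right spins the robot in place.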

Further conceivable robot features

  • GPS navigation: If you want to use the robot outdoors or have good GPS reception at home, you could consider using satellite data to control the robot. Indoors the accuracy will not be sufficient for exact navigation, but outdoors a GPS coordinate could be transmitted to which the vehicle then drives (in combination with our obstacle avoidance mechanism). By the way, you can find a tutorial on how to set up a GPS receiver here.
  • Voice control: Amazon Echo, Siri, S-Voice, and others are showing the way: voice control is no longer a thing of the future. We have already seen that it is possible to react to voice commands with a Raspberry Pi, so a Raspberry Pi robot should be easily controllable by voice with a microphone.
  • Swiveling livestream: Since we use the left joystick of our Xbox 360 controller to steer the robot, the right joystick is still unassigned. How a simple control of a servo motor with the Xbox 360 controller works is explained here. In addition, we have already realized a livestream of the Raspberry Pi camera on a smartphone using VNC. Connecting these two components is not too hard: just place the camera on a servo and move it back and forth with the right joystick. The stream can then be viewed on the phone.
  • Face recognition: Once a camera is mounted, we can also recognize faces and other objects with the help of OpenCV. This way we can let the Raspberry Pi robot automatically steer in the direction where a face or person was detected. The principle would be similar to the automatic obstacle detection: a servo with the camera rotates in a semicircle, and every 20° we check whether a face was recognized. If so, the robot drives in that direction; if not, it keeps searching.

Of course, I’m happy to receive further ideas – if you have any, please leave a short comment below.

Raspberry Pi Robot Code

If you want to download the whole code, you can do that too. For this I created a GitHub repository containing the files. On your Pi you can easily download the code for our robot:

It contains all scripts and the complete robot class, which we extend step by step with each tutorial.

If you like you can upload pictures of your robot and link them here – I would be interested to see what you have built.
