EDU4AI - Autonomous Driving Cars

This tutorial is part of the following EU funded project:

Edu4AI - Artificial Intelligence and Machine Learning to Foster 21st Century Skills in Secondary Education, in the framework of Erasmus+, Grant Agreement VG-IN-BY-20-25-077366

Final result

Robot, complete

Get the parts

For our project we used the following parts:

Optional:

Just as a reference, you can see the full parts list here.

Chassis

More information on how we created the chassis is available on the dedicated page.

Power

In this project, we use two separate power sources:

Note: we cannot use the Raspberry Pi to power the motors, since its GPIO outputs are not designed to supply such loads; we therefore need an additional power supply, and a 9V battery is a good solution.

We recommend rechargeable 9V batteries: according to our experience, a common disposable 9V battery lasts only a few hours at most (see the shopping list above for details).

According to the official documentation, we can use a good-quality 2.5A power bank to power the board, provided the downstream USB peripherals consume less than 500mA in total. Our only peripheral, the webcam, should stay within this limit.
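
One way to sanity-check this budget (an optional extra check, assuming the standard Linux sysfs layout) is to list the maximum current each connected USB device declares:

# Print the declared maximum current draw of each attached USB device.
# Reads the standard Linux sysfs entries; run this on the Raspberry Pi.
from pathlib import Path

for dev in sorted(Path("/sys/bus/usb/devices").iterdir()):
    power = dev / "bMaxPower"
    if power.exists():
        product = dev / "product"
        name = product.read_text().strip() if product.exists() else dev.name
        print(f"{name}: {power.read_text().strip()}")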

L298N <-> Raspberry Pi GPIO pins

L298N pin   Raspberry Pi physical pin   BCM GPIO (used in code)
IN1         29                          GPIO5
IN2         31                          GPIO6
IN3         37                          GPIO26
IN4         36                          GPIO16
GND         39                          GND

Robot rear with connections
Robot top

Prepare the SD Card

We used the official Raspberry Pi OS.

We tested the project with:

Set up the Raspberry Pi

Connect a mouse, keyboard, and monitor to your Raspberry Pi and power it on. Follow the configuration and initialisation instructions, if any appear.

To upgrade the Operating System, open a Terminal window and execute:

sudo apt update
sudo apt -y dist-upgrade

Answer ‘yes’ to any prompts.

Now we should install VNC (Virtual Network Computing) to access the robot remotely from our Windows laptop.

Install VNC server on the Raspberry Pi

sudo apt update
sudo apt install realvnc-vnc-server

Then enable VNC from the Pi menu: Raspberry Pi Menu -> Preferences -> Raspberry Pi Configuration -> Interfaces tab -> Enable VNC, and click OK.

After that you should install a VNC viewer on your Windows laptop; you can download it here.

Get the IP address of the Raspberry Pi using:

hostname -I

Put this IP into the VNC viewer on your laptop.

Move the motors

We are now almost ready to actually move the motors.

On the Raspberry Pi, open the application menu -> Programming -> Thonny Python IDE.

from gpiozero import Robot

# Pin pairs are (forward, backward) BCM GPIO numbers,
# matching the L298N wiring table above.
myRobot = Robot(right=(26,16), left=(5,6))

# The argument is the speed, from 0 (stop) to 1 (max speed, the default);
# leave it empty for the default.
# We suggest starting with 0.6 and adjusting it if needed.
# Use these commands in the interactive console.

myRobot.forward(0.6)  # go forward
myRobot.backward(0.6) # go backward
myRobot.left(0.6)     # turn left
myRobot.right(0.6)    # turn right
myRobot.stop()        # stop
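
As a quick first exercise, here is a minimal sketch (the durations and speed are arbitrary values to adjust) that drives the robot for a few seconds and then stops:

from time import sleep
from gpiozero import Robot

# Pin pairs are (forward, backward) BCM GPIO numbers from the table above.
myRobot = Robot(right=(26,16), left=(5,6))

myRobot.forward(0.6)  # drive forward at 60% speed...
sleep(2)              # ...for two seconds
myRobot.left(0.6)     # then turn left for one second
sleep(1)
myRobot.stop()        # and stop the motors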

Train your model

Go to Teachable Machine.

Use “Standard Image Model”

After training, export the model as TensorFlow Lite -> Quantized.

We used the Duckietown Traffic Signs to train the model.

Test trained model with your car

Open a terminal on the Raspberry Pi and install the dependencies:

sudo apt update
sudo apt -y install libjpeg-dev libtiff5-dev libjasper-dev libpng-dev
sudo apt -y install libavcodec-dev libavformat-dev libswscale-dev libv4l-dev
sudo apt -y install libxvidcore-dev libx264-dev
sudo apt -y install libatlas-base-dev

pip3 install opencv-python

pip3 install tflite-runtime

pip3 install numpy --upgrade
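
As a quick sanity check (optional, not part of the original steps), verify that all three packages import correctly from a Python shell:

# Sanity check: all three dependencies should import without errors.
import cv2
import numpy
from tflite_runtime.interpreter import Interpreter

print("OpenCV", cv2.__version__, "- NumPy", numpy.__version__, "- tflite-runtime OK")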

Create a folder on the Raspberry Pi, unzip the TensorFlow Lite model into it, and download the classify_webcam.py example into the same folder.
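
Before running the full example, you can inspect the exported model (a minimal sketch, assuming the archive was unzipped into a myModel folder as in the command below):

# Load the quantized TFLite model and print its input/output tensors.
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="myModel/model.tflite")
interpreter.allocate_tensors()

# A quantized Teachable Machine export typically expects uint8 image input.
for detail in interpreter.get_input_details():
    print("input:", detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["shape"], detail["dtype"])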

Inside a terminal, test that everything works:

python3 classify_webcam.py --model myModel/model.tflite --labels myModel/labels.txt

With this script, without any modification, you can immediately check that image detection works and manually drive the robot using the WASD keys.

However, you need to modify it to implement actual autonomous driving.
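
One possible approach is sketched below. It is only an illustration: the label names are hypothetical and must match your own labels.txt, and the function would be called from wherever classify_webcam.py produces a prediction for the current frame.

# Hypothetical sketch: map recognised traffic signs to motor commands.
from gpiozero import Robot

robot = Robot(right=(26,16), left=(5,6))

def act_on_sign(label, confidence, threshold=0.8):
    """Drive according to the recognised traffic sign."""
    if confidence < threshold:
        return  # ignore uncertain predictions and keep the current motion
    if label == "stop":
        robot.stop()
    elif label == "turn left":
        robot.left(0.6)
    elif label == "turn right":
        robot.right(0.6)
    else:
        robot.forward(0.6)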

Follow a path

To challenge students with something measurable, we drew a path on the floor of the laboratory with scotch tape.

Roads with scotch tape

We used the Duckietown Navigation specification to draw the roads. This is not mandatory: we do not intend to participate in Duckietown competitions and we did not follow the standard exactly, but it is a useful starting point. In particular, we did not train the AI to follow the lines, but only to recognize traffic signs.

We also used the Duckietown Map-Utils repository to generate a random map, but a simple ring on a 3x3 map should be enough to start challenging students to follow a path with traffic signs.