In the previous post I outlined the hardware build of a simple autonomous car using a small RC chassis with a few modifications. The car drives using a convolutional neural network and monocular vision. If you followed the directions in the last post, you should be able to customize your RC car with a simple wooden or plastic platform, a Raspberry Pi, a camera, and a PWM HAT that can control a motor and a servo. For my build I also added an RC receiver, since my NAVIO2 HAT supports decoding of SBUS and PPM signals out of the box. This is optional, however; there are many ways to control your car, depending on what you have available (WiFi, for instance).
Even though the hardware is essential to a functioning autonomous car, at its heart it is the software and the algorithms that enable autonomy. In this post we will focus on building a simple software stack on the Raspberry Pi that can control the steering of an autonomous vehicle using a Convolutional Neural Network (CNN).
Background and Aims
Let us elaborate on the background and our goals a bit. As mentioned earlier, the aim of this project is to build a car that can navigate itself around a course using vision and vision alone. A single convolutional neural network (CNN) will perform all decision making regarding steering and throttle. The network takes the camera image as input and outputs values corresponding to steering and throttle. This type of decision making is known in machine learning as an “end-to-end” approach.
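The end-to-end idea boils down to a single function from pixels to actuation. Below is a minimal Python sketch of that control loop, with a stub standing in for the trained CNN; all names here are illustrative, not Burro's actual API:

```python
import numpy as np

def predict(image):
    """Stand-in for the CNN: maps a camera image to (steering, throttle).

    A real model would be trained on recorded driving data; here we
    return fixed values purely to illustrate the interface.
    """
    assert image.ndim == 3  # height x width x channels
    steering = 0.0   # -1.0 (full left) .. 1.0 (full right)
    throttle = 0.2   # 0.0 (stopped) .. 1.0 (full throttle)
    return steering, throttle

def drive_step(camera_frame):
    # The whole control policy is one model call: pixels in, actuation out.
    steering, throttle = predict(camera_frame)
    # Clip to the valid actuation range before handing off to the PWM outputs.
    steering = float(np.clip(steering, -1.0, 1.0))
    throttle = float(np.clip(throttle, 0.0, 1.0))
    return steering, throttle

frame = np.zeros((120, 160, 3), dtype=np.uint8)  # dummy camera frame
print(drive_step(frame))
```

In the robotics approach discussed below, `drive_step` would instead be a pipeline of sensor fusion, mapping, localization, and planning stages; in the end-to-end approach it is one learned function.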
End-to-end model fitting is just one of a number of different approaches for autonomous vehicles. Another popular one is the so-called “robotics” approach, in which a suite of different sensors (vision included) is “fused” together algorithmically to produce a map of the vehicle's surroundings and localize the vehicle within it. Decision making then takes place as a separate step, and sometimes consists of hand-coded conditions and actions.
This blog post is not the place to debate the merits of one approach versus the other; for all we know, the truth may lie in a compositional approach. Taking into account the simple goals set forth by this project, as well as the recent leaps in end-to-end neural network approaches for autonomous cars, I feel it is worth a try. So did quite a few other people, including the Donkey team, whose default CNN model and pieces of code we will be using in this build.
Autonomous Car Software Installation
For up-to-date installation instructions you are invited to visit the Burro repo on GitHub. The guide below is a snapshot and may become obsolete in the future.
This car build uses the Burro autonomous car software, freely available on Github. Burro is an adaptation of Donkey for the NAVIO2 HAT. While it borrows a lot of features from Donkey, Burro has a number of significant differences:
- There is no separate server instance; an onboard web socket server serves all telemetry.
- Car control is through RC (SBUS) or a gamepad (Logitech F710).
- It works with either the NAVIO2 HAT or the Adafruit Motor HAT.
Currently Burro requires a Raspberry Pi 2 or 3 board with the NAVIO2 HAT. Before proceeding with the installation of Burro, you will need a working EMLID image installation; we strongly recommend getting the latest version. Please make sure you follow the instructions in the relevant EMLID docs.
Once this is complete, ssh into your Raspberry Pi, which by default should be reachable at navio.local if you are using the EMLID image.
- wget the Burro install script
- change permissions and run it
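In concrete terms, the two steps above might look like the following. The URL is a placeholder — copy the actual raw link to install-burro.sh from the Burro repo:

```shell
# Placeholder URL — substitute the real raw link from the Burro repo on GitHub
wget https://raw.githubusercontent.com/<user>/burro/master/install-burro.sh
chmod +x install-burro.sh
./install-burro.sh
```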
This will install all required libraries, create a virtual environment, clone the Burro repo and set it up, and create symlinks for you. After successful completion, you end up with a fully working installation.
A warning: some steps of this script can take a significant amount of time. The numpy install step, for instance, takes around 20 minutes, since numpy unfortunately needs to be compiled from source due to a version incompatibility with the apt-get packages. Total installation time should be around 30 minutes. To ensure that your installation goes smoothly, power your Pi from either a supply rated for at least 5V/2A, or a fully charged power bank or LiPo of sufficient capacity.
I am using the software with a Turnigy mini-trooper 1/16 RC car. If you have the same car, you only need to change your RC channels if necessary. The RC input channels are as follows: 0 – yaw (i.e. steering), 2 – throttle, 4 – arm. The arm, yaw, and throttle channels are configurable via config.py. Arming the RC controller triggers a neutral-point calibration, so you only need to make sure that your sticks are centered before arming the car.
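The neutral-point calibration can be sketched as follows: record the raw channel values at the moment of arming, then report subsequent readings relative to those centers. The class and names below are illustrative, not Burro's actual code:

```python
class NeutralCalibration:
    """Capture stick centers at arm time and normalize later readings."""

    def __init__(self):
        self.neutral = None

    def arm(self, raw_channels):
        # Called once when the arm switch is flipped; sticks must be centered.
        self.neutral = list(raw_channels)

    def normalize(self, raw_channels, span=500.0):
        # Convert raw pulse widths to roughly -1..1 around the stored centers.
        assert self.neutral is not None, "arm() must be called first"
        return [(raw - mid) / span for raw, mid in zip(raw_channels, self.neutral)]

cal = NeutralCalibration()
cal.arm([1500, 1500, 1500])               # sticks centered at ~1500 us
print(cal.normalize([1750, 1500, 1250]))  # [0.5, 0.0, -0.5]
```

Because the center is sampled at arm time, small mechanical offsets in the transmitter sticks are cancelled out automatically.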
By default Burro outputs throttle on channel 2 of the NAVIO2 servo rail and steering on channel 0. You may wish to change this.
You may also wish to configure the throttle threshold above which training images are recorded.
See the Readme in the Burro repo for more instructions on how to edit your configuration.
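Putting the settings above together, a configuration might look like the snippet below. The variable names here are hypothetical — check config.py in your Burro checkout for the actual ones:

```python
# Hypothetical config.py excerpt — the real names live in the Burro repo

# RC input channels (as decoded from SBUS/PPM by the NAVIO2)
YAW_CHANNEL = 0       # steering stick
THROTTLE_CHANNEL = 2  # throttle stick
ARM_CHANNEL = 4       # arm switch; arming triggers neutral-point calibration

# PWM output channels on the NAVIO2 servo rail
STEERING_OUTPUT_CHANNEL = 0
THROTTLE_OUTPUT_CHANNEL = 2

# Only record training images while throttle exceeds this threshold,
# so frames captured at a standstill are not logged.
RECORD_THROTTLE_THRESHOLD = 0.1
```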
After completing installation and configuration, you should be able to drive your autonomous car around either using the manual controls, or using a mix of CNN for steering and manual controls for throttle. Automatic throttle control is not yet available, but it will be in a future version.
To start a Burro instance, first ssh into your RPi, if you haven't done so already. Then run the start command from the directory where your install-burro.sh script was located.
Point your browser to your RPi address (http://navio.local by default for the EMLID image) and the telemetry interface will come up. Choose your driving mode based on your controller; the default uses the F710 gamepad for steering and throttle. There are options for RC, gamepad, and mixed RC+CNN and gamepad+CNN driving, where the CNN controls the steering and you control the throttle.
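The mixed modes can be pictured as routing each actuation axis from a different source. The toy sketch below mirrors the description above, not Burro's internal implementation; the mode names are illustrative:

```python
def mix(mode, pilot_steering, pilot_throttle, cnn_steering):
    """Select steering/throttle sources based on the driving mode.

    pilot_* values come from the human (RC or gamepad); cnn_steering
    comes from the neural network.
    """
    if mode == "manual":       # human controls both axes
        return pilot_steering, pilot_throttle
    elif mode == "mixed":      # CNN steers, human controls throttle
        return cnn_steering, pilot_throttle
    raise ValueError("unknown mode: %r" % mode)

print(mix("mixed", pilot_steering=0.3, pilot_throttle=0.5, cnn_steering=-0.2))
```

Keeping the human on the throttle while the CNN steers is a convenient safety net: letting go of the stick stops the car regardless of what the network predicts.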
Here’s a video from Burro running on an RC autonomous car in Mixed mode:
Autonomous Car – Next Steps
I like to think of the Burro project as part of the lively Donkey community, since it was spun out of Donkey after all. As such, it is worth taking a look at the many resources created by the Donkey developers.
If you’re interested in the development of autonomous small scale vehicles, you may wish to be part of the Slack community of Donkey, by requesting an invite.
This is the second post in a series discussing the software aspects of a small scale autonomous vehicle, using vision alone and end-to-end machine learning for control and navigation. The post went over installation of the Burro autonomous car platform and basic software configuration.
Autonomous vehicles are a very young and promising field of AI, and we will certainly be seeing very interesting competition in the near future.
Did this post help you build an RC autonomous car? Share your experience in the comments below!