How to Run Tensorflow 2.4.0 and AutoKeras (AutoML) on Raspberry Pi 4B with 64-bit OS

Alan Wang
Mar 5, 2021 · 6 min read

This is a simple article, originating from one of my experiments for a failed company project proposal.

Tensorflow/Keras is (still) a popular AI package, and AutoKeras (an open-source AutoML, or automated machine learning, Python package) lets you train a neural network model in TF without having to set any parameters. It requires Tensorflow 2, which in turn needs a 64-bit Python runtime. However, the official Raspberry Pi OS still uses a 32-bit kernel.

It’s not that hard to find a 64-bit OS, but there are many dependencies that were never fully documented. I managed to find a working combination: here I’ll show you how to install and use everything without really complex configuration. I’ve tested these instructions several times, and so far they have worked every time.

However, this article may become obsolete once Tensorflow and related packages gain official support for ARM64 (AArch64) platforms.


Hardware

I use a standard Raspberry Pi 4B with 4 GB RAM:

I put on a big heat sink stripped from an old motherboard. A cooling fan is recommended for the model training process though: the CPU temperature goes up to ~70 °C without overclocking. (The clock speed is automatically reduced once the SoC reaches 85 °C.)
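If you want to keep an eye on throttling during training, you can poll the SoC temperature with `vcgencmd measure_temp`, whose output looks like `temp=70.1'C`. A minimal parsing sketch (the tool and output format are Pi-specific assumptions; on other systems you would read a different sensor):

```python
def parse_temp(vcgencmd_output: str) -> float:
    """Extract the Celsius value from `vcgencmd measure_temp` output,
    which looks like: temp=70.1'C"""
    return float(vcgencmd_output.strip().split("=")[1].rstrip("'C"))

# On a Pi you would feed it the real command output, e.g.:
#   import subprocess
#   out = subprocess.check_output(["vcgencmd", "measure_temp"], text=True)
sample = "temp=70.1'C"
print(parse_temp(sample))  # 70.1
```

Polling this in a loop (or via `watch`) while training runs lets you see how close you are to the 85 °C throttle point.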

Software

Go to the following link to download the 64-bit kernel Raspberry Pi OS (beta):

This OS comes with a 64-bit Python 3.7.x runtime. The kernel can be updated to a more recent version later, so don’t worry about it.

Use Raspberry Pi Imager to write the image to your micro SD card. (Choose OS -> Use Custom)

I’ve also overclocked the Pi to 1750 MHz. The Pi can actually run at 2000 MHz, but not during the model training process (it would crash and reboot). If you want to try overclocking, check out the link below:

I used the following settings in config.txt (which can be found in the root directory of the SD card):

#uncomment to overclock the arm. 1500 MHz is the Pi 4's default.
over_voltage=2
arm_freq=1750

The whole Pi OS installation took about an hour on my overclocked-to-1750 MHz Pi.


Boot Up and System Update

Boot up your Pi and open the terminal:

sudo apt-get update --fix-missing
sudo apt-get full-upgrade -y

Set up anything else you need, then reboot.

This is the system information I’ve got after update:

pi@raspberrypi:~ $ uname -a
Linux raspberrypi 5.10.17-v8+ #1403 SMP PREEMPT Mon Feb 22 11:37:54 GMT 2021 aarch64 GNU/Linux

Install Dependencies

sudo apt-get install build-essential python3-dev python3-pip python-h5py python3-h5py libhdf5-dev libblas-dev liblapack-dev libopenblas-dev libatlas-base-dev gfortran -y
sudo pip3 install cython pip setuptools wheel numpy==1.18.5 --upgrade
sudo pip3 install wrapt --upgrade --ignore-installed

Some of these packages might be re-installed by the Tensorflow wheel later, but as far as I’ve tried, this is guaranteed to make the wheel work. I’ve also pinned NumPy to 1.18.5, since 1.19.x is not well supported by TF 2.4.0.
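If you want to confirm the pin took effect, comparing version tuples is an easy sanity check. A minimal sketch (the `< 1.19` threshold mirrors the compatibility note above; it is a rule of thumb for this TF build, not an official constraint):

```python
def version_tuple(v: str) -> tuple:
    """Turn a dotted version string like '1.18.5' into a comparable int tuple."""
    return tuple(int(part) for part in v.split("."))

def numpy_ok_for_tf240(v: str) -> bool:
    """TF 2.4.0 on this setup wants NumPy below 1.19, per the note above."""
    return version_tuple(v) < (1, 19)

print(numpy_ok_for_tf240("1.18.5"))  # True
print(numpy_ok_for_tf240("1.19.2"))  # False
```

On the Pi itself you would pass `numpy.__version__` to the check after the install finishes.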

The second command carries the --upgrade parameter, since some of the packages may already exist on your system. Don’t worry if you see messages about being unable to uninstall something.

Install Tensorflow

Since there is no official TF support for ARM64 platforms (yet), we will be using one of the wheels built by other people:

sudo pip3 install https://github.com/bitsy-ai/tensorflow-arm-bin/releases/download/v2.4.0/tensorflow-2.4.0-cp37-none-linux_aarch64.whl

There are in fact several wheels available:

I’ve tried two of them and both work. Remember to choose the link with cp37 (Python 3.7) and aarch64.
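Before installing, you can double-check which tags your interpreter needs by reconstructing them from the running Python. A small sketch (on the Pi OS above this should report cp37 and aarch64):

```python
import sys
import platform

def expected_wheel_tags(version_info, machine):
    """Return the (Python tag, CPU arch) a wheel's filename must contain."""
    return ("cp%d%d" % (version_info[0], version_info[1]), machine)

# Tags for the interpreter you are actually running:
print(expected_wheel_tags(sys.version_info, platform.machine()))

# The combination this guide targets:
print(expected_wheel_tags((3, 7), "aarch64"))  # ('cp37', 'aarch64')
```

If the first print does not match the second, the wheel above will refuse to install on your system.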

Install AutoKeras

sudo pip3 install autokeras

This will take a longer time, since AutoKeras installs Pandas, SciPy and scikit-learn as well, and each of them needs to build its own wheel.

My installed AutoKeras version is 1.0.12.


Test Run

Now let’s run some neural network model training using TF and AutoKeras (be warned that this will take at least several hours to finish, even if we set the max_trials parameter to 1):

import sklearn
import autokeras as ak
from tensorflow.keras.datasets import fashion_mnist
from sklearn.metrics import classification_report
# load datasets
(data_train, target_train), (data_test, target_test) = \
    fashion_mnist.load_data()
# classifier
clf = ak.ImageClassifier(overwrite=True, max_trials=1)
# training
clf.fit(data_train, target_train)
# making predictions on test data
predictions = clf.predict(data_test).astype('int8')
# print out classification results
print(classification_report(target_test, predictions))

This is all the code you’ll need. That’s the beauty of AutoML.

Here we use TF’s built-in fashion_mnist dataset, which consists of 60,000 28 x 28 pixel gray-scale images labeled as 10 classes of clothing items. AutoKeras’ ImageClassifier can be used to process two-dimensional arrays like this. (Check out AutoKeras’ online docs to see the different classifiers.)

Note that some dataset-importing functions of TF or Scikit-learn may fail on RPis and virtual machines, probably due to memory limitations.
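A rough back-of-the-envelope shows why: the raw Fashion-MNIST training images alone take about 45 MB as 8-bit integers, and quadruple once converted to float32 for training. A sketch of the arithmetic (peak usage is higher still, since loading and preprocessing make extra copies):

```python
def dataset_megabytes(n_images, height, width, bytes_per_value):
    """Raw array size in MiB for a stack of gray-scale images."""
    return n_images * height * width * bytes_per_value / 2**20

uint8_mb = dataset_megabytes(60000, 28, 28, 1)    # as downloaded (uint8)
float32_mb = dataset_megabytes(60000, 28, 28, 4)  # after float32 conversion
print(round(uint8_mb, 1))    # 44.9
print(round(float32_mb, 1))  # 179.4
```

On a 4 GB Pi this fits comfortably, but larger datasets or smaller boards can hit the wall quickly.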

Noticed the import sklearn line at the beginning? For some reason, AutoKeras throws an error when it tries to import scikit-learn on its own (which it does the first time you import AutoKeras). Importing sklearn first is a workable workaround for now.


Below is the training result (only the last trial and the final report are shown):

...
Epoch 1/11
1875/1875 [==============================] - 438s 232ms/step - loss: 0.5587 - accuracy: 0.7985
Epoch 2/11
1875/1875 [==============================] - 425s 227ms/step - loss: 0.3238 - accuracy: 0.8841
Epoch 3/11
1875/1875 [==============================] - 425s 226ms/step - loss: 0.2813 - accuracy: 0.8994
Epoch 4/11
1875/1875 [==============================] - 438s 234ms/step - loss: 0.2593 - accuracy: 0.9060
Epoch 5/11
1875/1875 [==============================] - 420s 224ms/step - loss: 0.2488 - accuracy: 0.9079
Epoch 6/11
1875/1875 [==============================] - 419s 223ms/step - loss: 0.2339 - accuracy: 0.9149
Epoch 7/11
1875/1875 [==============================] - 419s 223ms/step - loss: 0.2205 - accuracy: 0.9179
Epoch 8/11
1875/1875 [==============================] - 416s 222ms/step - loss: 0.2188 - accuracy: 0.9187
Epoch 9/11
1875/1875 [==============================] - 419s 223ms/step - loss: 0.2109 - accuracy: 0.9213
Epoch 10/11
1875/1875 [==============================] - 418s 223ms/step - loss: 0.2003 - accuracy: 0.9258
Epoch 11/11
1875/1875 [==============================] - 416s 222ms/step - loss: 0.2016 - accuracy: 0.9245
2021-03-05 16:22:52.398538: W tensorflow/python/util/util.cc:348] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.
              precision    recall  f1-score   support

           0       0.84      0.92      0.88      1000
           1       0.99      0.98      0.99      1000
           2       0.89      0.86      0.88      1000
           3       0.91      0.94      0.92      1000
           4       0.83      0.91      0.87      1000
           5       0.99      0.98      0.98      1000
           6       0.83      0.68      0.75      1000
           7       0.96      0.97      0.97      1000
           8       0.98      0.98      0.98      1000
           9       0.97      0.97      0.97      1000

    accuracy                           0.92     10000
   macro avg       0.92      0.92      0.92     10000
weighted avg       0.92      0.92      0.92     10000
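As a side note on reading the report: the f1-score column is just the harmonic mean of precision and recall, and the macro average is the unweighted mean over the 10 classes. Recomputing a couple of entries from the table (a sketch of the arithmetic; values round as in scikit-learn's report):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Class 6 ("Shirt" in Fashion-MNIST's label order) is the weakest class:
print(round(f1(0.83, 0.68), 2))  # 0.75

# Macro average = unweighted mean over the 10 classes, e.g. for precision:
precisions = [0.84, 0.99, 0.89, 0.91, 0.83, 0.99, 0.83, 0.96, 0.98, 0.97]
print(round(sum(precisions) / len(precisions), 2))  # 0.92
```

The low recall on class 6 (shirts get confused with other tops) is what drags its f1 down to 0.75 while everything else sits above 0.85.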

So we’ve achieved 92% prediction accuracy on the test data, which is better than the 88–89% from the examples I’ve found in books. There you have it!

Alan Wang

Technical writer, former translator and IT editor.