This is an update to articles on installing the PyTorch machine learning library on a Raspberry Pi, published by Amrit Das in 2018 and Saparna Nair in 2019. It builds on them by updating the required settings and introducing a fix and a few tweaks to make the process run considerably faster. Although there are Python wheels floating around that offer PyTorch as a Raspberry Pi Python package, downloading them from unverified sources is a security risk. Here’s how to install PyTorch from source.
Install the requisite dependencies.
sudo apt-get install libopenblas-dev libblas-dev m4 cmake cython \
python3-dev python3-yaml python3-setuptools
Clone the PyTorch repository and change your current directory to it. I’m using --depth=1 to save the bandwidth of copying unneeded past history.
git clone --depth=1 --recursive https://github.com/pytorch/pytorch
cd pytorch
At the time of writing there’s a bug in the version of protobuf bundled with PyTorch that prevents compilation. Fix it by updating protobuf to its current version with the following command.
git submodule update --remote third_party/protobuf
The Raspberry Pi 3B+ I used has 1GB of RAM, which can easily run out, causing the compilation to fail with an out of memory error. To fix this you need to configure swap space: disk space that is dynamically used to simulate (very slow) RAM. You do not need to configure swap space if you’ve already done that in the past, or if you run the build process on a Pi model with more than 3GB of real RAM. To see whether you have already configured sufficient swap memory, run the command /sbin/swapon, and see if there is an entry under SIZE with at least 2GB of swap. If swap is required, configure it with the following commands.
# Create a 2GB swap file
sudo dd if=/dev/zero of=/swap0 bs=1M count=2048
# Format it for swapping
sudo mkswap /swap0
# Configure the system to use it for swapping
sudo sh -c 'echo "/swap0 swap swap" >>/etc/fstab'
# Make this take effect now
sudo swapon -a
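After enabling swap, you can confirm that enough is available. The following check is a small sketch of one way to do this; it reads the SwapTotal line from /proc/meminfo, so it assumes a Linux system such as Raspberry Pi OS.

```shell
# Report total swap in MB and compare it against the 2GB the build needs
swap_mb=$(awk '/SwapTotal/ {print int($2 / 1024)}' /proc/meminfo)
if [ "$swap_mb" -lt 2048 ]; then
    echo "Only ${swap_mb}MB of swap; configure more before building"
else
    echo "${swap_mb}MB of swap available; that should suffice"
fi
```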
Configure the build process by setting the following environment variables. I derived these by reading the file setup.py. You may want to do the same in order to verify that they are still current and that they match your configuration. You may also want to save these variables in a script, which you feed to your shell with the source command (e.g. source myvariables.sh) every time you log in to the Pi and want to compile PyTorch. Disabling testing, as I do, is risky, so you need to ensure things work as they should through other means, e.g. by cross-checking the results you obtain on the Pi with those of PyTorch running on a more powerful platform on which it was tested.
The most important setting is MAX_JOBS. By default, the build process runs NCPU-1 jobs in parallel. So if your Pi has four cores, it will run three jobs in parallel. On 1GB Pi models this causes the real RAM to run out and swap to be used in its place. Because the compilation processes compete concurrently for that space, disk I/O increases tremendously, slowing the compilation almost to a halt. With two rather than three parallel jobs at a time, I saw that even for the largest compilations a few tens of MB of RAM always remained free.
# Limit the number of parallel jobs in a 1GB Pi to prevent thrashing
export MAX_JOBS=2
# Disable features that don't make sense on a Pi
export USE_CUDA=0
export USE_CUDNN=0
export USE_MKLDNN=0
export USE_NNPACK=0
export USE_QNNPACK=0
export USE_DISTRIBUTED=0
# Disable testing, which takes ages
export BUILD_TEST=0
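The settings above can be collected into the script mentioned earlier (myvariables.sh is just the example name used here), so they survive across logins:

```shell
# Write the build settings into a reusable script
cat >myvariables.sh <<'EOF'
export MAX_JOBS=2
export USE_CUDA=0
export USE_CUDNN=0
export USE_MKLDNN=0
export USE_NNPACK=0
export USE_QNNPACK=0
export USE_DISTRIBUTED=0
export BUILD_TEST=0
EOF
# Load them into the current shell; repeat after every login
source ./myvariables.sh
echo "Building with MAX_JOBS=$MAX_JOBS"
```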
Start the build process and be prepared to wait for more than 12 hours for it to finish.
python3 setup.py build
You may want to run the command with nohup and in the background, to ensure that it continues to run even if your connection to the Pi is interrupted. In that case you can follow the compilation’s progress with tail -f nohup.out.
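Concretely, that workflow looks as follows (the build command is the one from above; nohup writes the build’s output to nohup.out when the job is started from a terminal):

```shell
# Start the build in the background; nohup keeps it running if the connection drops
nohup python3 setup.py build &
pid=$!
echo "Build running as process $pid"
# Reconnect later and follow its progress with:
#   tail -f nohup.out
```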
Finally, if all goes well, install the compiled files with
sudo -E python3 setup.py install
Enjoy PyTorch!
Last modified: Tuesday, March 17, 2020 5:19 pm
Unless otherwise expressly stated, all original material on this page created by Diomidis Spinellis is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.