Here you'll learn how to build TensorFlow either for your x86_64 machine or for the Raspberry Pi 3 as a standalone shared library that can be interfaced from the C++ API.

(This tutorial wouldn't have been possible without the help of the people in the References section.)

Watch out for the "For the Rpi" dropdown menus to know which commands are related to the Rpi and which ones aren't.

If you don't want to go through the pain of all the coming steps to compile TensorFlow on the Raspberry Pi 3, I prepared the precompiled TensorFlow lib for the test project here . You can download it and jump directly to "Compile the test project".

Contents

- Install basic dependencies
- Install USB Memory as Swap (Rpi)
- Compile the test project

For the Rpi

What You Need

- Raspberry Pi 2 or 3 Model B
- An SD card running Raspbian with several GB of free space
  - An 8 GB card with a fresh install of Raspbian does not have enough space. A 16 GB SD card minimum is recommended.
  - These instructions may work on Linux distributions other than Raspbian
- Internet connection to the Raspberry Pi
- A USB memory drive that can be installed as swap memory (if it is a flash drive, make sure you don't care about the drive). Anything over 1 GB should be fine
- A fair amount of time

Overview

These instructions were crafted for a Raspberry Pi 3 Model B running a vanilla copy of Ubuntu 16.04 (Xenial), or an x86_64 machine also running Ubuntu 16.04.

1. Install the basic dependencies

First, update apt-get to make sure it knows where to download everything.

sudo apt-get update

Next, install some base dependencies and tools we'll need later.

For Bazel:

sudo apt-get install pkg-config zip g++ zlib1g-dev unzip default-jdk autoconf automake libtool

For TensorFlow:

# For Python 2.7
sudo apt-get install python-pip python-numpy swig python-dev
sudo pip install wheel

# For Python 3.3+
sudo apt-get install python3-pip python3-numpy swig python3-dev
sudo pip3 install wheel

For the Rpi

To be able to take advantage of certain optimization flags:

sudo apt-get install gcc-4.8 g++-4.8
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.8 100
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-4.8 100

Finally, for cleanliness, make a directory that will hold the Protobuf, Bazel, and TensorFlow repositories.

mkdir tf
cd tf

2. Install a Memory Drive as Swap for Compiling

For the Rpi

In order to successfully build TensorFlow, your Raspberry Pi needs a little bit more memory to fall back on. Fortunately, this process is pretty straightforward. Grab a USB storage drive that has at least 1GB of memory to use it as a swap area.
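Before reaching for a USB drive, you can check how much memory and swap the Pi already has; on a Pi 3, MemTotal will be around 1 GB, which is why the extra swap is needed:

```shell
# RAM and existing swap, reported in kB
grep -E 'MemTotal|SwapTotal' /proc/meminfo
```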

First, insert your USB drive, and find the /dev/XXX path for the device.

sudo blkid

As an example, my drive's path was /dev/sda1

Once you've found your device, unmount it by using the umount command.

sudo umount /dev/XXX

Format your USB drive with the following command (the swap area will be 2 GiB: 2,097,152 blocks of 1024 bytes each):

sudo dd if=/dev/zero of=/dev/XXX bs=1024 count=2097152
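As a sanity check on those numbers, the total size written is the block size times the block count:

```shell
# bs=1024 bytes per block, count=2097152 blocks
echo $((1024 * 2097152))                # total bytes written
echo $((1024 * 2097152 / 1073741824))   # divided by 1 GiB: prints 2
```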

Find it back with this command:

sudo fdisk -l

Then flag your device to be swap:

sudo mkswap /dev/XXX

If the previous command output an alphanumeric UUID, copy it now. Otherwise, find the UUID by running blkid again, and copy the UUID associated with /dev/XXX

sudo blkid

Now edit your /etc/fstab file to register your swap file. (I'm a Vim guy, but Nano is installed by default)

sudo nano /etc/fstab

On a separate line, enter the following information. Replace the X's with the UUID (without quotes)

UUID=XXX.. none swap sw,pri=5 0 0
Save /etc/fstab , exit your text editor, and run the following command:

sudo swapon -a

If you get an error claiming it can't find your UUID, go back and edit /etc/fstab . Replace the UUID=XXX.. bit with the original /dev/XXX information.

sudo nano /etc/fstab

# Replace the UUID with /dev/XXX
/dev/XXX none swap sw,pri=5 0 0

Alright! You've got swap! Don't throw out the /dev/XXX information yet; you'll need it to remove the device safely later on.
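You can double-check that the kernel picked up the new swap area (the exact value will depend on your drive's size):

```shell
# SwapTotal should now reflect the USB drive's capacity, in kB
grep SwapTotal /proc/meminfo
```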

3. Build Bazel

Here we'll need to build Bazel from source on the Rpi platform. If you are on an x86_64 machine, just install Bazel following these instructions .

For the Rpi

To build Bazel, we're going to need to download a zip file containing a distribution archive. Let's do that now and extract it into a new directory called bazel (the 0.5.2+ releases end with an error):

wget
unzip -d bazel

Once it's done downloading and extracting, we can move into the directory to make a few changes:

cd bazel

Now we need to change the permissions of every file in the bazel project with:

sudo chmod u+w ./* -R

Before building Bazel, we need to set the javac maximum heap size for this job, or else we'll get an OutOfMemoryError. To do this, we need to make a small addition to bazel/scripts/bootstrap/ . (Shout-out to @SangManLINUX for pointing this out ).

nano scripts/bootstrap/

Move down to line 117, where you'll see the following block of code:

run "${JAVAC}" -classpath "${classpath}" -sourcepath "${sourcepath}" \
  -d "${output}/classes" -source "$JAVA_VERSION" -target "$JAVA_VERSION" \
  -encoding UTF-8 "@${paramfile}"

At the end of this block, add in the -J-Xmx500M flag, which sets the maximum size of the Java heap to 500 MB:

run "${JAVAC}" -classpath "${classpath}" -sourcepath "${sourcepath}" \
  -d "${output}/classes" -source "$JAVA_VERSION" -target "$JAVA_VERSION" \
  -encoding UTF-8 "@${paramfile}" -J-Xmx500M

Now we can build Bazel!

Warning: This takes a really, really long time. Several hours.


When the build finishes, you end up with a new binary, output/bazel . Copy that to your /usr/local/bin directory.

sudo cp output/bazel /usr/local/bin/bazel

To make sure it's working properly, run bazel on the command line and verify it prints help text. Note: this may take 15-30 seconds to run, so be patient!

bazel

Usage: bazel <command> <options> ...

Available commands:
  analyze-profile     Analyzes build profile data.
  build               Builds the specified targets.
  canonicalize-flags  Canonicalizes a list of bazel options.
  clean               Removes output files and optionally stops the server.
  dump                Dumps the internal state of the bazel server process.
  fetch               Fetches external repositories that are prerequisites to the targets.
  help                Prints help for commands, or the index.
  info                Displays runtime info about the bazel server.
  mobile-install      Installs targets to mobile devices.
  query               Executes a dependency graph query.
  run                 Runs the specified target.
  shutdown            Stops the bazel server.
  test                Builds and runs the specified test targets.
  version             Prints version information for bazel.

Getting more help:
  bazel help <command>         Prints help and options for <command>.
  bazel help startup_options   Options for the JVM hosting bazel.
  bazel help target-syntax     Explains the syntax for specifying targets.
  bazel help info-keys         Displays a list of keys used by the info command.

Move out of the bazel directory, and we'll move on to the next step.

cd ..

4. Compiling TensorFlow

First things first, clone the TensorFlow repository and move into the newly created directory.

For the Rpi

Note: here we're going to use a slightly different version of TensorFlow that I made from the official one. It's based on the 1.3.0-rc2 branch and updates some dependencies that are required to make it compile on the Rpi platform.

git clone --recurse-submodules -b v1.3.0-rc2/rpi3
cd tensorflow

Now we have to write a nifty one-liner that is incredibly important. The next line goes through all files and changes references of 64-bit program implementations (which we don't have access to) to 32-bit implementations. Neat!

grep -Rl 'lib64' | xargs sed -i 's/lib64/lib/g'
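To see what the one-liner does, here it is applied to a throwaway file (a made-up path, purely for illustration):

```shell
# A fake source line containing a lib64 reference
printf 'link_directory: /usr/lib64/gcc\n' > /tmp/lib64_demo.txt

# Same grep | sed pattern as above, restricted to the demo file
grep -Rl 'lib64' /tmp/lib64_demo.txt | xargs sed -i 's/lib64/lib/g'

cat /tmp/lib64_demo.txt   # prints: link_directory: /usr/lib/gcc
```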

For the x86_64

git clone --recurse-submodules -b v1.3.0-rc2
cd tensorflow

Now let's configure the build:

./configure
Please specify the location of python. [Default is /usr/bin/python]: /usr/bin/python
Do you wish to build TensorFlow with MKL support? [y/N] N
No MKL support will be enabled for TensorFlow
Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]:
Do you wish to use jemalloc as the malloc implementation? [Y/n] Y
jemalloc enabled
Do you wish to build TensorFlow with Google Cloud Platform support? [y/N] N
No Google Cloud Platform support will be enabled for TensorFlow
Do you wish to build TensorFlow with Hadoop File System support? [y/N] N
No Hadoop File System support will be enabled for TensorFlow
Do you wish to build TensorFlow with the XLA just-in-time compiler (experimental)? [y/N] N
No XLA support will be enabled for TensorFlow
Do you wish to build TensorFlow with VERBS support? [y/N] N
No VERBS support will be enabled for TensorFlow
Do you wish to build TensorFlow with OpenCL support? [y/N] N
No OpenCL support will be enabled for TensorFlow
Do you wish to build TensorFlow with CUDA support? [y/N] N
No CUDA support will be enabled for TensorFlow
Do you wish to build TensorFlow with MPI support? [y/N] N
MPI support will not be enabled for TensorFlow
Configuration finished

Note: if you want to build for Python 3, specify /usr/bin/python3 for Python's location and /usr/local/lib/python3.5/dist-packages for the Python library path.
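The interactive session above can also be scripted: the 1.3-era configure script reads its answers from environment variables. The variable names below mirror the questions asked above, but treat this as a sketch and double-check the names against the configure script in your checkout:

```shell
# Answers for ./configure, mirroring the interactive session above
export PYTHON_BIN_PATH=/usr/bin/python
export CC_OPT_FLAGS="-march=native"
export TF_NEED_MKL=0
export TF_NEED_JEMALLOC=1
export TF_NEED_GCP=0
export TF_NEED_HDFS=0
export TF_ENABLE_XLA=0
export TF_NEED_VERBS=0
export TF_NEED_OPENCL=0
export TF_NEED_CUDA=0
export TF_NEED_MPI=0
# ./configure   # should now run without prompting
```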

Now we can use it to build TensorFlow!

To build the shared library to use in C++ (for bindings in other languages you need to compile the C interface with // instead ):

For the Rpi

Warning: This takes a really, really long time on the Rpi. Several hours.

bazel build -c opt --copt="-funsafe-math-optimizations" --copt="-ftree-vectorize" --copt="-fomit-frame-pointer" --local_resources 1024,1.0,1.0 --verbose_failures //

Note: unfortunately, the optimization flag --copt="-mfpu=neon-vfpv4" could not be used, as the gemmlowp dependency of tensorflow has an issue with it. To be able to use it, we would need to make tensorflow use that dependency at least up to this commit . If you want to follow the issue thread, take a look here .

For the x86_64

bazel build -c opt --verbose_failures //

Now we'll move the shared libraries and headers to a test folder instead of copying them system wide.

To do so, let's create our project structure:

mkdir ../tf_test

then open test.cpp :

nano ../tf_test/test.cpp

and copy this code :

#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();

  // Matrix A = [3 2; -1 0]
  auto A = Const(root, { {3.f, 2.f}, {-1.f, 0.f} });
  // Vector b = [3 5]
  auto b = Const(root, { {3.f, 5.f} });
  // v = Ab^T
  auto v = MatMul(root.WithOpName("v"), A, b, MatMul::TransposeB(true));

  std::vector<Tensor> outputs;
  ClientSession session(root);
  // Run and fetch v
  TF_CHECK_OK(session.Run({v}, &outputs));
  // Expect outputs[0] == [19; -3]
  LOG(INFO) << outputs[0].matrix<float>();
  return 0;
}
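The expected value in the final comment can be checked by hand: with A = [3 2; -1 0] and b = [3 5], the first row of v = Ab^T is 3*3 + 2*5 = 19 and the second is -1*3 + 0*5 = -3:

```shell
# v = A * b^T, computed with shell arithmetic
echo "$((3*3 + 2*5)) $((-1*3 + 0*5))"   # prints: 19 -3
```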

Before moving on we also need to compile the Protobuf dependencies with:

mkdir /tmp/proto
tensorflow/contrib/makefile/
cd tensorflow/contrib/makefile/downloads/protobuf/
./
./configure --prefix=/tmp/proto/
make
make install

Same thing for Eigen :

mkdir /tmp/eigen
cd ../eigen
mkdir build_dir
cd build_dir
cmake -DCMAKE_INSTALL_PREFIX=/tmp/eigen/ ../
make install
cd ../../../../../..

Now copy the libraries to the project folder:

mkdir ../tf_test/lib
cp bazel-bin/tensorflow/ ../tf_test/lib/
cp /tmp/proto/lib/libprotobuf.a ../tf_test/lib/

Then the include files:

mkdir -p ../tf_test/include/tensorflow
cp -r bazel-genfiles/* ../tf_test/include/
cp -r tensorflow/cc ../tf_test/include/tensorflow
cp -r tensorflow/core ../tf_test/include/tensorflow
cp -r third_party ../tf_test/include
cp -r /tmp/proto/include/* ../tf_test/include
cp -r /tmp/eigen/include/eigen3/* ../tf_test/include

5. Compile the test project

Finally, do some cleanup and compile the test file with:

cd ../tf_test/
find . -name "*.cc" -type f -delete
g++ -std=c++11 -Iinclude -Llib test.cpp -ltensorflow_cc -o exec

Of course, at that point, if you launch your exec program you'll get an error like this:

./exec: error while loading shared libraries: cannot open shared object file: No such file or directory

Simply because your TensorFlow lib is not in the shared library path. (For the record, the ideal solution would be to have a static libtensorflow_cc.a library, but creating such a library is much more complicated with Bazel. Some people have tried but haven't succeeded yet.)

So to solve this problem, either copy the library to /usr/lib with (recommended):

sudo cp lib/ /usr/lib/

Or add the lib directory to your LD_LIBRARY_PATH with (not permanent):

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PWD/lib
(When you execute your program you may see warnings such as: The TensorFlow library wasn't compiled to use SSE4.1 instructions . This tutorial didn't include these special flags to speed up TensorFlow; I'll let the reader figure out which compilation flags are suited to their platform.)

Now it's done, congrats!

6. Cleaning Up

For the Rpi

There's one last bit of house-cleaning we need to do before we're done: remove the USB drive that we've been using as swap.

First, turn off your drive as swap:

sudo swapoff /dev/XXX

Finally, remove the line you wrote in /etc/fstab referencing the device:

sudo nano /etc/fstab

Then reboot your Raspberry Pi.

And you're done! You deserve a break.

7. References

- Building tensorflow on the raspberry pi
- Tensorflow for Go on rpi3
- Compiling the C++ interface
- Compiling for Tensorflow C++ API
- Mfpu neon issue thread

Title: Building TensorFlow 1.3.0-rc2 as a standalone project (Raspberry Pi 3 included)
