Trying MiNiFi with TensorFlow (GPU) on the Jetson Nano

These are some initial steps I followed to get Apache NiFi MiNiFi C++ running on the NVIDIA Jetson Nano. The Nano is a single-board computer/development kit designed for prototyping AI applications, so it is a good fit for running real-time inference at the edge with MiNiFi and its TensorFlow processors.

Download/write the NVIDIA image to SD

This process varies depending on which OS you are writing the Jetson Nano SD card image from. More details are available from NVIDIA.

I'm on Linux, so I used dd to do this:

sudo dd if=jetson-nano-sd-r32.1.1-2019-05-31.img of=/dev/sdc bs=1M status=progress
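Before running dd, it is worth confirming which device node the SD card actually is; /dev/sdc is just what it happened to be on my machine, and writing to the wrong disk destroys its contents. A quick check:

```shell
# List block devices; the SD card should match its advertised size
# (e.g. a 32 GB card) and have no mounted system partitions
lsblk -o NAME,SIZE,TYPE,MOUNTPOINT

# After dd finishes, flush pending writes before removing the card
sync
```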

First boot of the Nano

Put the SD card in, plug in the network and USB devices, and plug the power cord in to boot up the Nano.

On the first boot, the Nano will prompt to accept the NVIDIA usage agreement as well as some initial settings such as keyboard layout, time zone, and so on.

Update the system

The following is sufficient to update the system:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get autoremove

Then reboot the system:

sudo reboot

Install dependencies

sudo apt-get -y install libprotobuf-dev

Install tensorflow_cc

git clone --recurse-submodules
cd tfcc-jetson
./build # This will build AND install tensorflow_cc
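If the build script finishes cleanly, the library should now be visible to the dynamic linker. A quick sanity check (the library name and install path are assumptions based on tensorflow_cc's usual default of /usr/local/lib):

```shell
# Refresh the linker cache, then confirm the shared library is registered
# (libtensorflow_cc.so under /usr/local/lib is the typical default)
sudo ldconfig
ldconfig -p | grep tensorflow
```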

Build nifi-minifi-cpp

sudo apt-get install libcurl4-openssl-dev
git clone
cd nifi-minifi-cpp
git fetch --all
git checkout draft-tf2.0
mkdir build
cd build
cmake -DENABLE_TENSORFLOW=ON ..
nice make -j4
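Once the build finishes, the binaries land under the build tree. The project also supports building a relocatable binary assembly with make package; the exact target and assembly layout may differ on this draft branch, so treat this as a sketch:

```shell
# Build a binary assembly (a tar.gz under the build directory);
# 'make package' is the usual nifi-minifi-cpp packaging target
make package

# Run the agent from an extracted assembly (paths are illustrative)
./bin/minifi.sh start
```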