Trying MiNiFi with TensorFlow (GPU) on the Jetson Nano
These are some initial steps I followed to get Apache NiFi – MiNiFi – C++ running on the NVIDIA Jetson Nano. The Nano is a single-board computer/development kit designed for prototyping AI applications, so it is a good fit for running real-time inference on the edge with MiNiFi and its TensorFlow processors.
Download/write nvidia image to SD
This process varies depending on which OS you are writing the Jetson Nano SD card image from. More details are available from NVIDIA.
I'm on Linux, so I used dd to do this:

wget https://developer.nvidia.com/embedded/dlc/jetson-nano-dev-kit-sd-card-image
unzip jetson-nano-sd-r32.1.1-2019-05-31.zip
sudo dd if=jetson-nano-sd-r32.1.1-2019-05-31.img of=/dev/sdc # double-check the device node with lsblk first; dd will overwrite whatever it points at
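Whichever OS you flash from, it's worth flushing and verifying the write before booting. Here's a minimal sketch; the flash_image helper and the bs/conv/status flags are my additions rather than NVIDIA's instructions, and the target device still needs to be confirmed with lsblk:

```shell
#!/bin/sh
# Hypothetical helper: write an image to a block device and flush it to disk.
# bs=1M speeds up the copy, conv=fsync forces buffers out before dd exits,
# and status=progress shows how far along the write is.
flash_image() {
    img="$1"
    dev="$2"
    dd if="$img" of="$dev" bs=1M conv=fsync status=progress
}

# Usage (run with sudo; /dev/sdc is an assumption -- confirm with lsblk):
# flash_image jetson-nano-sd-r32.1.1-2019-05-31.img /dev/sdc
```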
First boot of the Nano
Put the SD card in, plug in the network and USB devices, and plug the power cord in to boot up the Nano.
On the first boot, the Nano will prompt you to accept the NVIDIA usage agreement and ask for some initial settings such as keyboard layout, time zone, and so on.
Update the system
The following is sufficient to update the system:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get autoremove
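Before rebooting, you can check whether the upgrade actually requires one. The Nano's L4T image is Ubuntu-based, and Ubuntu drops a flag file when a kernel or core library update needs a restart:

```shell
#!/bin/sh
# Ubuntu-based systems create /var/run/reboot-required when an upgrade
# needs a restart; if the file is absent, a reboot is optional.
if [ -f /var/run/reboot-required ]; then
    echo "reboot required"
else
    echo "no reboot required"
fi
```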
Then reboot the system:

sudo reboot

Build and install tensorflow_cc
MiNiFi's TensorFlow processors link against the TensorFlow C++ library, so build tensorflow_cc first. It needs the protobuf development headers:

sudo apt-get -y install libprotobuf-dev
git clone --recurse-submodules https://github.com/achristianson/tfcc-jetson.git
cd tfcc-jetson
./build # This will build AND install tensorflow_cc
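Before moving on, it's worth confirming the dynamic linker can actually see the installed library. This check assumes ./build installs libtensorflow_cc into a path that ldconfig scans (e.g. /usr/local/lib); adjust if your install prefix differs:

```shell
#!/bin/sh
# Query the linker cache for the tensorflow_cc shared library.
# If it's missing, the library may be installed but not yet registered,
# in which case 'sudo ldconfig' refreshes the cache.
if ldconfig -p | grep -q tensorflow_cc; then
    echo "tensorflow_cc visible to the linker"
else
    echo "tensorflow_cc not found -- run 'sudo ldconfig' or check the install prefix"
fi
```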
Build MiNiFi – C++
With tensorflow_cc installed, build MiNiFi – C++ with the TensorFlow extension enabled:

sudo apt-get install libcurl4-openssl-dev
git clone https://github.com/achristianson/nifi-minifi-cpp.git
cd nifi-minifi-cpp
git fetch --all
git checkout draft-tf2.0
mkdir build
cd build
cmake -DENABLE_TENSORFLOW=1 ..
nice make -j4
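A make -j4 of MiNiFi plus the TensorFlow extension takes a long while on the Nano, so before kicking it off it's worth confirming CMake actually recorded the flag. This sketch just greps the CMake cache in the build directory for the ENABLE_TENSORFLOW variable passed on the cmake command line above:

```shell
#!/bin/sh
# Run from the MiNiFi build directory. CMake records command-line cache
# variables (like -DENABLE_TENSORFLOW=1) in CMakeCache.txt.
if grep -q '^ENABLE_TENSORFLOW:' CMakeCache.txt 2>/dev/null; then
    echo "TensorFlow extension flag present in CMake cache"
else
    echo "no CMakeCache.txt here, or TensorFlow flag missing -- re-run cmake"
fi
```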