Real-time object detection with TensorFlow on Android

February 2022 (originally published November 2020)


Automatic object detection based on deep learning has enormous potential and can make significant contributions in fields such as the monitoring of industrial manufacturing processes, driver assistance systems, and diagnostic support in healthcare. To enable large-scale industrial use, however, these methods must run on resource-limited devices. The application described in this post demonstrates one possible use case: high-quality real-time object detection on Android devices using TensorFlow.

Advances in machine learning and artificial intelligence are enabling revolutionary methods in computer vision and text analysis. Deep learning uses artificial neural networks (ANNs) with numerous layers between the input and output layers, allowing a model to learn increasingly complex patterns in data. One possible application field for these models is image classification, where it is even possible to recognize multiple objects within a single image. Recent advances allow us to run deep learning algorithms on mobile devices and thus to perform real-time object detection. To illustrate this, we give a brief overview of how to deploy a real-time mobile object detector on an Android device.

Real-time object detection with TensorFlow on Android - Set up the development environment

To build the Android application, we use Docker. Containerization allows us to install all the necessary dependencies without causing potential problems on our host computer. First, we need to customize a Dockerfile provided in the TensorFlow Git repository. This Dockerfile is based on the official TensorFlow Docker image and provides the dependencies and configurations required to build the Java-based Android package (APK). You can also remove code from the Dockerfile, for example if you do not want to train a model in Google Cloud or convert a pre-trained model to TensorFlow Lite format.
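
As a minimal sketch, building the image and starting a container could look like the following; the image tag, Dockerfile name, and mounted directory are placeholders, not names taken from the TensorFlow repository:

    
    # Build an image from the customized Dockerfile (tag and file name are placeholders)
    $ docker build -t tflite-android -f Dockerfile.android .
    # Start an interactive shell; mount a host directory to retrieve the built APK later
    $ docker run -it -v $PWD/output:/output tflite-android bash
    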

Real-time object detection with TensorFlow on Android - Select the right AI model

We use a MobileNet model trained on the COCO dataset for our Android application. A variety of ready-made frozen MobileNet models are available in the TensorFlow Git repository. In addition, we can download several pre-trained models such as SSD MobileNet v1, SSD MobileNet v2, or Faster R-CNN from TensorFlow Hub or the TensorFlow documentation. The MobileNet models are low-latency, low-power models and thus ideal for resource-constrained mobile devices. If you want to use your own model, you need to convert the trained frozen TensorFlow graph to the TensorFlow Lite format to ensure optimal use on a mobile device. TensorFlow provides a converter that turns a TensorFlow protocol buffer graph (.pb) into a TensorFlow Lite FlatBuffer file (.tflite).
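
As a rough sketch, the conversion with the tflite_convert tool shipped with TensorFlow 1.x could look as follows; the graph file name and the input/output tensor names are assumptions that depend on your exported model:

    
    # Convert a frozen TensorFlow graph (.pb) into a TensorFlow Lite FlatBuffer (.tflite);
    # the file and tensor names below are placeholders for those of your own model
    $ tflite_convert \
        --graph_def_file=frozen_graph.pb \
        --output_file=detect.tflite \
        --input_arrays=input_tensor \
        --output_arrays=output_tensor
    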

Real-time object detection with TensorFlow on Android - Building the APK

To run the model on an Android mobile device, you must first integrate it into an APK. The Dockerfile contains the Android NDK (to support C and C++) and the Android SDK, both of which are necessary for the build process. Note that the Dockerfile currently specifies NDK version 14b; if you use a newer NDK version, it must be compatible with Bazel. Next, we integrate the model into the TensorFlow Lite Android demo app. This requires build tools with API level ≥23; the corresponding SDK tools and configurations are likewise included in the Dockerfile. API level 23 corresponds to Android 6.0 Marshmallow, but the demo app is compatible with devices down to API level 21. If you want to use a newer version, for example to enable the Neural Networks API (API level 27, Android 8.1), you can modify the Dockerfile accordingly.
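
As an illustrative sketch, you could also install a newer platform inside the container with the sdkmanager from the Android SDK tools; the package versions below are examples, and the available packages can be listed with sdkmanager --list:

    
    # Install a newer platform and matching build tools (versions are examples)
    $ sdkmanager "platforms;android-27" "build-tools;27.0.3"
    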

You can find a list of API levels and the corresponding Android versions in the Android developer documentation.

Real-time object detection with TensorFlow on Android - Creating the APK

To create the APK with your chosen model, rename the converted .tflite file to detect.tflite and move it to the tensorflow/contrib/lite/examples/android/app/src/main/assets folder in the running container. In addition, place the associated labels.txt, which contains the labels of the object classes, in the same directory.
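
A minimal sketch of these steps inside the container might look as follows; my_model.tflite and my_labels.txt are placeholder names for your own files:

    
    # Rename the converted model and label list to the expected default names
    $ mv my_model.tflite detect.tflite
    $ mv my_labels.txt labels.txt
    # Move both files into the demo app's assets folder
    $ mv detect.tflite labels.txt tensorflow/contrib/lite/examples/android/app/src/main/assets/
    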

If you use different file names, you must point the Bazel BUILD file to the new model to include it in the APK assets. The BUILD file is located in tensorflow/contrib/lite/examples/android/. Simply replace the references to detect.tflite and coco_labels_list.txt with the names of your model and label list. You also need to update the definitions of TF_OD_API_MODEL_FILE and TF_OD_API_LABELS_FILE in tensorflow/contrib/lite/examples/android/app/src/main/java/org/tensorflow/demo/DetectorActivity.java with the new names. In this Java file, you can also customize additional parameters, such as the minimum detection confidence threshold or the output text size.
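
For illustration only, both updates could be scripted with sed; the replacement names are placeholders, and the string literals are assumed to match the checked-out sources:

    
    # Point the BUILD file and DetectorActivity.java at the new file names
    $ cd tensorflow/contrib/lite/examples/android
    $ sed -i 's/detect\.tflite/my_model.tflite/g; s/coco_labels_list\.txt/my_labels.txt/g' \
        BUILD app/src/main/java/org/tensorflow/demo/DetectorActivity.java
    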

Depending on the selected model, you have to adjust the TF_OD_API_INPUT_SIZE configuration value to the input tensor dimensions of your model. For example, the pre-packaged SSD MobileNet model is configured for an input tensor of dimensions [1, 300, 300, 3], which corresponds to images of 300 × 300 pixels with three color channels. The Android demo app scales each camera image to TF_OD_API_INPUT_SIZE × TF_OD_API_INPUT_SIZE pixels.

To build the APK, we use the build tool Bazel. The following command, run from the tensorflow directory in the container, executes the build for both 32-bit and 64-bit ARM architectures:

    
    $ bazel build -c opt --config=android_arm{,64} --cxxopt='--std=c++11' //tensorflow/lite/examples/android:tflite_demo
    

However, if we want to create the APK for a different CPU architecture, such as the x86_64 platform, we run the following command:

    
    $ bazel build -c opt --fat_apk_cpu=x86_64 --cxxopt='--std=c++11' //tensorflow/lite/examples/android:tflite_demo
    

The APK file is created in the bazel-bin/tensorflow/lite/examples/android/ directory. We can then test the APK with an Android emulator.
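
One quick way to do this, sketched here with an assumed AVD name (test_avd) that you would create beforehand in the AVD manager:

    
    # Start an emulator and install the freshly built APK on it
    $ emulator -avd test_avd &
    $ adb install bazel-bin/tensorflow/lite/examples/android/tflite_demo.apk
    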

Running the APK for object detection

After the APK has been built successfully, it is ready for installation on an Android phone. To install the APK file on your own device, you must first enable the developer options in the system settings. You can then install it on your USB-connected phone using the Android Debug Bridge (ADB), which is already included in the Android platform tools. The ADB command to install the package is:

    
    $ adb install tflite_demo.apk
    

Now you can start real-time object detection with TensorFlow on Android. With the TFL Detect app, you can detect the object classes of the COCO dataset in the live camera image.

Want to learn more? Continue reading about object detection on our blog: Explainable Object Detection
