Issue
PROBLEM ENCOUNTERED:
E/AndroidRuntime: FATAL EXCEPTION: main Process: org.tensorflow.lite.examples.detection, PID: 14719 java.lang.AssertionError: Error occurred when initializing ObjectDetector: Mobile SSD models are expected to have exactly 4 outputs, found 8
Problem Description
- Android Application Source: TensorFlow Lite Object Detection Example from Google
- Error shown when starting the Example Application
Model Description
Custom Model Used? YES
Pre-trained Model Used: ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8
Inference type: FLOAT
Number of classes: 4
System Information
- OS Platform and Distribution: Linux Ubuntu 20.14
- TensorFlow Version: 2.4.1
- TensorFlow installed from: Pip
Saved Model conversion commands used:
1. Saved_Model.pb export:
python ./exporter_main_v2.py \
  --input_type image_tensor \
  --pipeline_config_path ./models/ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8/pipeline.config \
  --trained_checkpoint_dir ./models/ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8 \
  --output_directory exported_models/tflite
2. Convert saved model (.pb) to tflite
toco \
  --saved_model_dir ./exported-models/tflite/saved_model \
  --emit-select-tf-ops true \
  --allow_custom_ops \
  --graph_def_file ./exported-models/tflite/saved_model/saved_model.pb \
  --output_file ./exported-models/tflite/tflite/detect.tflite \
  --input_shapes 1,300,300,3 \
  --input_arrays normalized_input_image_tensor \
  --output_arrays 'TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=FLOAT
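(Aside, not from the original post: with TF 2.x, the Python converter API is often used instead of the toco CLI. Below is a minimal sketch, assuming the SavedModel was exported for TFLite so that the graph ends in the TFLite_Detection_PostProcess op with 4 outputs; the function name and paths are placeholders.)

```python
# Sketch of converting a SavedModel to .tflite with the TF2 Python API.
# Paths are hypothetical placeholders, not from the original post.
def convert_saved_model(saved_model_dir, output_path):
    import tensorflow as tf  # requires TensorFlow >= 2.3

    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()

    with open(output_path, "wb") as f:
        f.write(tflite_model)
    return output_path
```

Usage would be `convert_saved_model("./exported-models/tflite/saved_model", "detect.tflite")`; the converter reads the input shape from the SavedModel signature, so the `--input_shapes`/`--input_arrays` flags are not needed in this path.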
Remarks: I am trying to use a trained custom model with the TensorFlow Lite example application provided by Google. Every time I open the application, it returns this error: "Mobile SSD models are expected to have exactly 4 outputs, found 8". The model is trained to identify 4 classes, all stated in labelmap.txt and the pipeline config.
Does anybody have any clue about this error?
Solution
After further study, I believe the issue arises because the model has 8 output tensors, while the Android application written in Java only supports 4 output tensors (at least the example provided by Google does).
I am not certain how many output tensors different models produce. As far as I have seen while experimenting with different models, models with a fixed_shape_resizer of 640 x 640 tend to produce more than 4 output tensors (usually 8), which is not compatible with the Java Android example application.
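(To illustrate the mismatch, here is a small sketch of my own, not from the original post, of the 4-output expectation; `check_detector_outputs` is a hypothetical helper name.)

```python
def check_detector_outputs(output_details):
    """Raise if a detection model does not expose exactly the 4 output
    tensors (boxes, classes, scores, number of detections) that the
    Java example application expects."""
    expected = 4
    found = len(output_details)
    if found != expected:
        raise ValueError(
            "Mobile SSD models are expected to have exactly "
            f"{expected} outputs, found {found}")
    return True

# With a real model, output_details would come from the TFLite interpreter
# (requires tensorflow installed):
#   interpreter = tf.lite.Interpreter(model_path="detect.tflite")
#   interpreter.allocate_tensors()
#   output_details = interpreter.get_output_details()
```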
For any amateur users like me, here are the prerequisites for using your custom model in the Android application.
Suggested setup (assuming you are using TensorFlow version >= 2.3):
1. TensorFlow model: an SSD model with a fixed_shape_resizer of 320 x 320. (In my case, SSD MobileNet v2 320x320 works perfectly fine. The model must have 4 output tensors.)
2. Colab (perfect for model training and conversion). (I tried to perform the training and conversion on both Linux and Windows on my local machine, and the incompatibility of different tools and packages gave me a headache. I ended up using Colab for the training and conversion; it is much more powerful and has great compatibility with the training tools and scripts.)
3. The metadata writer library written by @lu-wang-g. (In my case, after converting the trained model to .tflite, migrating the .tflite model directly to the Android application made the application report tons of problems regarding the model's configuration. Assuming you trained and converted the model correctly, all you need is the metadata writer library above. It automatically configures the metadata according to the .tflite model; then you can migrate the model directly to the application.)
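(The metadata step can be sketched as follows. This is my own sketch, assuming the tflite-support package is installed; the function name and paths are hypothetical, and the [127.5]/[127.5] normalization assumes a float model whose input is normalized to [-1, 1].)

```python
def write_detection_metadata(model_path, label_path, output_path):
    """Attach object-detection metadata to a .tflite model so the
    Android example can read input/output configuration from the
    model file itself."""
    # Imported inside the function so the sketch only needs
    # tflite-support when actually run (pip install tflite-support).
    from tflite_support.metadata_writers import object_detection
    from tflite_support.metadata_writers import writer_utils

    writer = object_detection.MetadataWriter.create_for_inference(
        writer_utils.load_file(model_path),
        input_norm_mean=[127.5],  # assumes float input scaled to [-1, 1]
        input_norm_std=[127.5],
        label_file_paths=[label_path])
    writer_utils.save_file(writer.populate(), output_path)
```

Usage would be something like `write_detection_metadata("detect.tflite", "labelmap.txt", "detect_with_metadata.tflite")`.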
For detail, please visit my GitHub issue:
https://github.com/tensorflow/tensorflow/issues/47595
Answered By - Alex1338