Runtime error: failed to find input Node 'image_tensor' after conversion from protobuf to tflite model file #25171
I really cannot tell whether the app you tried to use is the TF Mobile one or the TFLite one. If what you want to try is the TFLite one, you should read this article first.
Thanks, the bazel build command in that article solved my problem. Do you know how to get the output dimensions of the mobile_ssd_v2_coco file (found under https://storage.googleapis.com/download.tensorflow.org/models/tflite/gpu/mobile_ssd_v2_float_coco.tflite) that is referred to in the tutorial https://www.tensorflow.org/lite/performance/gpu#supported_models_and_ops? I would like to know how my outputMap has to be structured. I have tried to use
No inputs spotted. When I analyse a .pb file, the command doesn't even tell me how my outputMap has to be formed. Thank you so much!
import sys
from tensorflow.lite.python import interpreter as interpreter_wrapper
interpreter = interpreter_wrapper.Interpreter(model_path=sys.argv[1])
print(interpreter.get_output_details())
Run it.
It seems the model doesn't come with post-processing nodes.
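To go from those output details to a usable outputMap, one buffer per reported output is enough. A minimal numpy sketch; the detail dicts below are hypothetical stand-ins for what `get_output_details()` would actually return for a stripped SSD graph, so the names and shapes are assumptions, not the real model's values:

```python
import numpy as np

def make_output_map(output_details):
    """Map each output tensor name to a zero-filled buffer of matching shape/dtype."""
    return {
        d["name"]: np.zeros(d["shape"], dtype=d["dtype"])
        for d in output_details
    }

# Hypothetical output details for illustration; real values come from
# interpreter.get_output_details() on the actual .tflite file.
fake_details = [
    {"name": "concat",   "shape": (1, 1917, 1, 4), "dtype": np.float32},
    {"name": "concat_1", "shape": (1, 1917, 91),   "dtype": np.float32},
]
buffers = make_output_map(fake_details)
print(buffers["concat"].shape)  # (1, 1917, 1, 4)
```

The same idea carries over to the Java side: allocate one float array per output, sized from the shapes the interpreter reports.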
@freedomtan Thank you so much for your answer, which was super helpful! Could you please tell me where I can find some documentation about the pre-/post-processing nodes of models that are usable in TensorFlow Lite? I have already searched, but only found material describing post-processing of bounding boxes as drawing them, which was not very helpful.
@freedomtan The image_tensor error from above only appears in the TF Mobile app; in the TFLite demo app it could be resolved by using your link. Is TF Mobile only able to use protobuf files, and is that the reason for the error?
@defaultUser3214
@defaultUser3214 Surely TFLite doesn't have the image_tensor problem, because there is no such node if you follow the article I mentioned. I don't know if there is any documentation on the preprocessing and postprocessing of the model. I guess most people figure it out by reading papers and source code.
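For what it's worth, the post-processing that these stripped SSD graphs omit is essentially box decoding against the anchor grid followed by non-max suppression. A minimal sketch of just the decoding step, assuming the standard TF Object Detection box coder with scale factors (10, 10, 5, 5); the anchors and encodings here are made up for illustration:

```python
import numpy as np

def decode_boxes(encodings, anchors, scales=(10.0, 10.0, 5.0, 5.0)):
    """Decode SSD box encodings (ty, tx, th, tw) against anchors given as
    (ycenter, xcenter, height, width); returns (ymin, xmin, ymax, xmax)."""
    ty, tx, th, tw = [encodings[:, i] / scales[i] for i in range(4)]
    ya, xa, ha, wa = [anchors[:, i] for i in range(4)]
    ycenter = ty * ha + ya
    xcenter = tx * wa + xa
    h = np.exp(th) * ha
    w = np.exp(tw) * wa
    return np.stack(
        [ycenter - h / 2, xcenter - w / 2, ycenter + h / 2, xcenter + w / 2],
        axis=1,
    )

# Sanity check: with all-zero encodings, the decoded boxes are simply the
# anchors converted to corner form.
anchors = np.array([[0.5, 0.5, 0.2, 0.2]], dtype=np.float32)
decoded = decode_boxes(np.zeros((1, 4), dtype=np.float32), anchors)
print(decoded)
```

After decoding, scores per class would be read from the second output and filtered with non-max suppression, which is the part the drawing-the-boxes tutorials skip over.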
@freedomtan Is this question regarding CPU execution or GPU execution? CPU execution was specified in the article.
@achowdhery I don't have a question. I was trying to answer @defaultUser3214's question :-) And yes, I think it's a CPU execution question.
TFMobile consumes frozen graph (.pb) files, whereas TFLite consumes converted flatbuffer (.tflite) files. They are incompatible and not interchangeable. Do you have a specific question about TensorFlow Lite execution?
We are closing this issue for now due to lack of activity. Please comment if this is still an issue for you. Thanks!
Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub.
System information
Describe the current behavior
Because I assumed that protobuf files are much slower than .tflite files, I tried to convert a .pb to a .tflite:
Thus I downloaded the r1.95 branch of TensorFlow and converted the frozen_inference_graph.pb from https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md to a .tflite using #15633 (comment) and #15633 (comment). This worked well!
The .pb file worked well with my Android app, but after copying the .tflite model to the app/assets directory of the TensorFlow Mobile demo app (from https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android) and replacing the .pb file in the code with
private static final DetectorMode MODE = DetectorMode.TF_OD_API;
the following runtime error appears:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: myPackage.myProcess, PID: 15491
java.lang.RuntimeException: Failed to find input Node 'image_tensor'
    at myPackage.myProcess.myClass.TensorFlowObjectDetectionAPIModel.create(TensorFlowObjectDetectionAPIModel.java:106)
    at myPackage.myProcess.myClass.DetectorActivity.onPreviewSizeChosen(DetectorActivity.java:146)
    at myPackage.myProcess.myClass.CameraActivity$5.onPreviewSizeChosen(CameraActivity.java:370)
    at myPackage.myProcess.myClass.CameraConnectionFragment.setUpCameraOutputs(CameraConnectionFragment.java:412)
    at myPackage.myProcess.myClass.CameraConnectionFragment.openCamera(CameraConnectionFragment.java:419)
    at myPackage.myProcess.myClass.CameraConnectionFragment.access$000(CameraConnectionFragment.java:66)
    at myPackage.myProcess.myClass.CameraConnectionFragment$1.onSurfaceTextureAvailable(CameraConnectionFragment.java:97)
    at android.view.TextureView.getHardwareLayer(TextureView.java:390)
I think there is a similar issue #22565 .
Describe the expected behavior
I would have expected the .tflite version to work, because the .pb version of the same SSD model works well!
Code to reproduce the issue
Download and extract SSD MobileNet model
wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2017_11_17.tar.gz
tar -xvf ssd_mobilenet_v1_coco_2017_11_17.tar.gz
DETECT_PB=$PWD/ssd_mobilenet_v1_coco_2017_11_17/frozen_inference_graph.pb
STRIPPED_PB=$PWD/frozen_inference_graph_stripped.pb
DETECT_FB=$PWD/tensorflow/contrib/lite/examples/android/assets/mobilenet_ssd.tflite
Strip out problematic nodes before even letting TOCO see the graphdef
bazel run -c opt tensorflow/python/tools/optimize_for_inference -- \
  --input=$DETECT_PB --output=$STRIPPED_PB --frozen_graph=True \
  --input_names=Preprocessor/sub --output_names=concat,concat_1 \
  --alsologtostderr
Run TOCO conversion.
bazel run tensorflow/lite/toco:toco -- \
  --input_file=$STRIPPED_PB --output_file=$DETECT_FB \
  --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE \
  --input_shapes=1,300,300,3 --input_arrays=Preprocessor/sub \
  --output_arrays=concat,concat_1 --inference_type=FLOAT --logtostderr
Build and install the demo
bazel build -c opt --cxxopt='--std=c++11' //tensorflow/contrib/lite/examples/android:tflite_demo
adb install -r -f bazel-bin/tensorflow/contrib/lite/examples/android/tflite_demo.apk
Other info / logs
bug_tracker_bazel_run_warnings.txt
bug_tracker_runtime_error.txt