Crash in ruy::Kernel8bitNeonDotprodOutOfOrder when running int8 CPU inference with TFLite #47834
Labels
- comp:lite (TF Lite related issues)
- stale (marks the issue/PR stale, to be closed automatically if no activity)
- stat:awaiting response (status: awaiting response from author)
- TF 2.0 (issues relating to TensorFlow 2.0)
- type:support (support issues)
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
- Happens on a mobile device: yes; the device is not yet released, and the platform is Qualcomm SM4350
You can collect some of this information using our environment capture script:
https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh
You can obtain the TensorFlow version with:
`python -c "import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)"`
Describe the problem
The problem occurs during stress testing of a mobile phone's camera. Specifically, we run CPU inference on an int8 quantized model, and the crash occurs occasionally, taking the camera down with it.
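For context on what the crashing code computes: TFLite's int8 scheme represents each real value as `real = scale * (q - zero_point)`, and the `ConvPerChannel` kernel visible in the backtrace applies a separate weight scale per output channel. A minimal stdlib-only sketch of that quantize/dequantize arithmetic (the scales and weights below are illustrative, not taken from the model in this report):

```python
# Sketch of TFLite's int8 affine quantization: real = scale * (q - zero_point).
# All numeric values here are hypothetical, for illustration only.

def quantize(real, scale, zero_point):
    """Map a real value to int8, clamping to [-128, 127]."""
    q = round(real / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q, scale, zero_point):
    """Map an int8 value back to a real value."""
    return scale * (q - zero_point)

# Per-channel quantization: each conv output channel has its own weight scale.
# Per the TFLite int8 spec, weight zero_point is 0.
per_channel_scales = [0.02, 0.05, 0.1]            # hypothetical scales, 3 channels
weights = [[0.4, -0.6], [1.2, 0.0], [-2.5, 2.5]]  # hypothetical float weights

quantized = [
    [quantize(w, s, 0) for w in row]
    for row, s in zip(weights, per_channel_scales)
]
roundtrip = [
    [dequantize(q, s, 0) for q in row]
    for row, s in zip(quantized, per_channel_scales)
]
print(quantized)   # int8 weight values, one row per channel
print(roundtrip)   # reconstructed floats, close to the originals
```

This only illustrates the number format the kernel operates on; the crash itself is inside the hand-written NEON dotprod assembly, not this arithmetic.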
Source code / logs
```
03-16 04:58:16.798467 28412 28412 F DEBUG : backtrace:
03-16 04:58:16.798525 28412 28412 F DEBUG : #00 pc 00000000002247dc /vendor/lib64/libtensorflowLite.so (ruy::Kernel8bitNeonDotprodOutOfOrder(ruy::KernelParams8bit<8, 8> const&)+1364)
03-16 04:58:16.798548 28412 28412 F DEBUG : #01 pc 0000000000114774 /vendor/lib64/libtensorflowLite.so (void ruy::RunKernelTyped<(ruy::Path)8, signed char, signed char, signed char, ruy::BasicSpec<int, signed char> >(ruy::Tuning, ruy::PackedMatrix<signed char> const&, ruy::PackedMatrix<signed char> const&, ruy::BasicSpec<int, signed char> const&, int, int, int, int, ruy::Matrix<signed char>*)+508)
03-16 04:58:16.798564 28412 28412 F DEBUG : #02 pc 0000000000113c74 /vendor/lib64/libtensorflowLite.so (void ruy::RunKernel<(ruy::Path)8, signed char, signed char, signed char, ruy::BasicSpec<int, signed char> >(ruy::Tuning, ruy::SidePair<ruy::PMatrix> const&, void*, ruy::SidePair<int> const&, ruy::SidePair<int> const&, ruy::DMatrix*)+160)
03-16 04:58:16.798573 28412 28412 F DEBUG : #03 pc 00000000002297b4 /vendor/lib64/libtensorflowLite.so
03-16 04:58:16.798583 28412 28412 F DEBUG : #04 pc 0000000000228dc4 /vendor/lib64/libtensorflowLite.so (ruy::TrMul(ruy::TrMulParams*, ruy::Context*)+2292)
03-16 04:58:16.798596 28412 28412 F DEBUG : #05 pc 00000000001131d0 /vendor/lib64/libtensorflowLite.so (void ruy::DispatchMul<(ruy::Path)15, signed char, signed char, signed char, ruy::BasicSpec<int, signed char> >(ruy::Matrix<signed char> const&, ruy::Matrix<signed char> const&, ruy::BasicSpec<int, signed char> const&, ruy::Context*, ruy::Matrix<signed char>*)+384)
03-16 04:58:16.798610 28412 28412 F DEBUG : #06 pc 0000000000112410 /vendor/lib64/libtensorflowLite.so (tflite::optimized_integer_ops::ConvPerChannel(tflite::ConvParams const&, int const*, int const*, tflite::RuntimeShape const&, signed char const*, tflite::RuntimeShape const&, signed char const*, tflite::RuntimeShape const&, int const*, tflite::RuntimeShape const&, signed char*, tflite::RuntimeShape const&, signed char*, tflite::CpuBackendContext*)+1164)
03-16 04:58:16.798622 28412 28412 F DEBUG : #07 pc 000000000011109c /vendor/lib64/libtensorflowLite.so (void tflite::ops::builtin::conv::EvalQuantizedPerChannel<(tflite::ops::builtin::conv::KernelType)1>(TfLiteContext*, TfLiteNode*, TfLiteConvParams*, tflite::ops::builtin::conv::OpData*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*, TfLiteTensor*)+648)
03-16 04:58:16.798636 28412 28412 F DEBUG : #08 pc 0000000000105ed8 /vendor/lib64/libtensorflowLite.so (TfLiteStatus tflite::ops::builtin::conv::Eval<(tflite::ops::builtin::conv::KernelType)1>(TfLiteContext*, TfLiteNode*)+280)
03-16 04:58:16.798647 28412 28412 F DEBUG : #09 pc 000000000023079c /vendor/lib64/libtensorflowLite.so (tflite::Subgraph::Invoke()+740)
03-16 04:58:16.798657 28412 28412 F DEBUG : #10 pc 00000000002342dc /vendor/lib64/libtensorflowLite.so (tflite::Interpreter::Invoke()+32)
```