[Question&Error] Is there detection model like a SSD-Mobile-net in tensorflow-lite? #15633
@aselle can you please take a look at this issue? Thanks. |
We are currently working to convert MobileNet SSD (and then Inception SSD after that), but it contains ops that are not fully supported. I will update this issue once that is done. |
Great, I asked a similar question here: #14731. How long do you reckon until you add support for ssd-mobilenet? Thanks. |
Any updates? |
@yucheeling |
Could you please suggest any dataset like " |
@rana3579, please ask such a question on Stack Overflow. A quick update on MobileNet SSD: this is progressing, and we hope to have an example out soon. |
@rana3579 check my video; I got this running on Movidius, NVIDIA GPUs, and ARM processors. I cannot share the dataset, but if you are part of a company we could talk about a potential collaboration: https://www.youtube.com/watch?v=3MinI9cCJrc |
@aselle thanks for the update! Where should I look for notifications on this? I would like to be notified as soon as it is out, if that is possible. Thank you, I appreciate your hard work on this! |
@andrewharp is working on this and will be updating the Java TF Mobile app to use tflite, so watch for those changes in the repository. I'll leave this issue open for now. |
This is functional internally; should have something out in the next week or two. |
@andrewharp that's awesome!! Does that also apply to the iOS camera example? Some people have already converted the existing SSD MobileNet .pb to a Core ML model and wrote the missing output layers in Swift, but that only runs at around 8-12 fps on an iPhone 7. |
Hi, |
I am also curious :) |
I have a commit porting the Android TF demo to tflite currently under review; it should show up on GitHub this week, hopefully. @madhavajay It's Android only, but you should be able to adapt it for iOS. The only thing is that some of the pre-processing (image resizing/normalization) and post-processing (non-max suppression and adjustment by box priors) is done in Java, as tflite doesn't fully support all the operators used by MobileNet SSD. |
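For anyone wiring up their own Java-side (or other) post-processing as described above, a minimal NumPy sketch of greedy non-max suppression looks like the following. This is illustrative only; the function name and threshold are my own, not taken from the demo code:

```python
import numpy as np

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Greedy NMS over [N, 4] boxes given as [ymin, xmin, ymax, xmax]."""
    order = np.argsort(scores)[::-1]  # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection area between box i and every remaining box.
        ymin = np.maximum(boxes[i, 0], boxes[rest, 0])
        xmin = np.maximum(boxes[i, 1], boxes[rest, 1])
        ymax = np.minimum(boxes[i, 2], boxes[rest, 2])
        xmax = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.maximum(0.0, ymax - ymin) * np.maximum(0.0, xmax - xmin)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        # Drop boxes that overlap box i too much; keep the rest for the next round.
        order = rest[iou <= iou_threshold]
    return keep
```

Two identical boxes collapse to the higher-scoring one, while a non-overlapping box survives.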
@andrewharp That's awesome. Can you briefly explain why those operations are not currently available in TF Lite? It seems to be the same case for the tfcoreml conversion tool on regular SSD. Not complaining, just asking out of technical interest: do they do something that's particularly difficult to implement in the mobile stack, or is it just low priority? |
Looking forward to seeing your epic effort on the Android code! Thanks a lot. I know I'm not the only one looking forward to this! |
@andrewharp and @aselle, any update on getting a demo of an SSD-based object localization example for TFLite? |
It's live now at tensorflow/contrib/lite/examples/android! This is a more complete port of the original TF Android demo (only lacking the Stylize example), and will be replacing the other demo in tensorflow/contrib/lite/java/demo going forward.

A converted TF Lite flatbuffer can be found in mobilenet_ssd_tflite_v1.zip, and you can find the Java inference implementation in TFLiteObjectDetectionAPIModel.java. Note that this differs from the original TF implementation in that the boxes must be manually decoded in Java, and a box prior txt file needs to be packaged in the app's assets (I think the one included in the model zip above should be valid for most graphs).

During TOCO conversion a different input node (Preprocessor/sub) is used, as well as different output nodes (concat, concat_1). This skips some parts that are problematic for tflite, until either the graph is restructured or TF Lite reaches TF parity.

Here are the quick steps for converting an SSD MobileNet model to tflite format and building the demo to use it:
|
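On the manual box decoding mentioned above (adjusting raw outputs by the box priors): it generally follows the standard SSD box coder. Here is a hedged Python sketch, assuming the TF Object Detection API's default scale factors of 10/10/5/5; check TFLiteObjectDetectionAPIModel.java for the exact constants the demo uses:

```python
import math

def decode_box(encoding, prior):
    """Decode one SSD box encoding [ty, tx, th, tw] against a prior
    [ycenter, xcenter, h, w]. Scale factors (10, 10, 5, 5) are the
    TF Object Detection API defaults (an assumption, not demo code)."""
    ty, tx, th, tw = encoding
    pcy, pcx, ph, pw = prior
    # Center offsets are scaled by the prior's size.
    ycenter = ty / 10.0 * ph + pcy
    xcenter = tx / 10.0 * pw + pcx
    # Sizes are predicted in log space relative to the prior.
    h = math.exp(th / 5.0) * ph
    w = math.exp(tw / 5.0) * pw
    # Convert to corner coordinates [ymin, xmin, ymax, xmax].
    return [ycenter - h / 2, xcenter - w / 2, ycenter + h / 2, xcenter + w / 2]
```

An all-zero encoding decodes back to the prior itself, which is a handy sanity check.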
@achowdhery It is my own dataset. I trained for the MobileNet v2 architecture. When I run the .pb model (TensorFlow model), I get Do you think it's related? |
@ashwaniag Please open a new bug and provide exact reproducible instructions |
@ashwaniag check both of these issues; I had a similar problem: #10254 and #19854 |
@achraf-boussaada Thank you! I fixed it. It was a version mismatch issue. |
@ashwaniag Please define very bad results. Do you have small objects? Please attach a model checkpoint, pipeline config and label file as well as a sample image to help us reproduce the issue. Thanks |
@oopsodd hello, I am getting a wrong class index too. It complained "java.lang.ArrayIndexOutOfBoundsException: length=10; index=-739161663". Can you help me? |
Note: I have created minimal working examples of TensorFlow Lite SSD (object detection) for iOS and Android: https://github.com/baxterai/tfliteSSDminimalWorkingExample. The iOS version is based on obj_detect_lite.cc by YijinLiu (with the NMS function by WeiboXu), and the Android version is based on https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/examples/android (tflDetect). They remove all overhead such as the internal camera, and isolate the core code required to detect objects and display the detection boxes. |
@baxterai great work! thanks, I will test it. |
Thanks for your amazing work everybody! I have another question regarding the recently added postprocessing operation. The output of the pretrained ssd_mobilenet_v1_quantized_coco
is this resolved by retraining the network with this pipeline configuration or is the dimensionality of |
@Georg-W You will need to change the max detections in export_tflite_ssd_graph.py as well. There is a command-line option for it. |
@achowdhery ah, thank you! That's what I missed. |
I have the same issue. Did you find a solution? |
Hi, I'm trying to detect more than 10 objects in the image (10 is the default). I also modified but it still gives 10 objects as output on Android ([1,10,4]). Any idea? |
I would also be interested in the solution to @KaviSanth's issue. |
This solution of @stevelb should work. You may want to visualize the frozen graph to make sure that max_detections is set correctly. |
@achowdhery Thank you for your reply. I tried to execute the commands written by @andrewharp, but I get the following error. Indeed, toco isn't located at that place. I am using the master version and the r1.95 version from the GitHub repository.

```
bazel run tensorflow/contrib/lite/toco:toco -- \
  --input_file=$STRIPPED_PB \
  --output_file=$DETECT_FB \
  --input_format=TENSORFLOW_GRAPHDEF \
  --output_format=TFLITE \
  --input_shapes=1,300,300,3 \
  --input_arrays=Preprocessor/sub \
  --output_arrays=concat,concat_1 \
  --inference_type=FLOAT \
  --logtostderr
```

I could find a toco under tensorflow/lite/toco and I am just testing whether it works. |
A runtime error appears when adding the .tflite file to the DetectorActivity from https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/android/src/org/tensorflow/demo/DetectorActivity.java), with the line
E/AndroidRuntime: FATAL EXCEPTION: main
Is it not possible to use .tflite models in that app? |
@defaultUser3214 you are using a classification model in the detection app. MobileNet v1 is a classification model. Please use a MobileNet SSD model. |
@achowdhery Thank you! Using the model from wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2017_11_17.tar.gz resulted in that error. But I thought this was the SSD version? And using the ssd_mobilenet_v1_android_export.pb converted to .tflite, which worked as a .pb before, produces the same error. |
@defaultUser3214 That's an old version of the model that will not work in the latest demo app released in July 2018. Please download the latest (July 2018) models from the detection model zoo; they do work in the app. Please open a new issue if this is still blocked. |
@SteveIb You also need to change NUM_DETECTIONS = 500 in TFLiteObjectDetectionAPIModel.java |
Not able to convert SSD MobileNet v1 .pb to .tflite. |
Any progress on this? I'm trying to convert frozen_inference_graph.pb to a .tflite file for custom object detection on Android, but I'm getting an error
Any ideas on different conversion methods? I transfer-learned ssd_mobilenet_v1_pets on Windows 10 following the tutorial here: https://github.com/EdjeElectronics/TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10 |
Just to follow up on this and to help anyone else who was having the same error: this is caused by using an incorrect model checkpoint to train from. To work on Android with .tflite, the initial model must be MobileNet and must also be quantized; it will have this section of code, or something similar, in the .config file:
|
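For illustration only (the snippet from the original comment is not preserved above), the quantization section of a quantized-training pipeline config in the TF Object Detection API typically looks along these lines; the exact `delay` value depends on your training schedule:

```
graph_rewriter {
  quantization {
    delay: 48000
    weight_bits: 8
    activation_bits: 8
  }
}
```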
This works like a charm! |
Hi.
I am developing an Android application using tensorflow-lite.
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/g3doc/models.md
I did not find a detection model there.
I also tried to convert SSD-Inception v2 using the tensorflow-lite API, but there seems to be a problem.

## Command

## Error code

The fire_inception_v2 file is created, but its size is zero bytes.
What is the problem?
Also, please let me know the best way to deploy a custom model for object detection.
Can somebody help me, please?
Thank you.