NonMaxSuppression parameters are not deployed correctly #2291

Open
cansik opened this issue Jul 3, 2023 · 4 comments

cansik commented Jul 3, 2023

I trained an object detection model based on YOLOX with otx and exported/optimized it as an OpenVINO model. The network includes a NonMaxSuppression operator to post-process the detected objects, which is great.

The only problem is that the score threshold is now fixed at 0.01 and the NMS IoU threshold at 0.65, as set in the test_cfg. It seems that the post_processing settings in deployment.py are ignored and only the test_cfg settings from model.py are applied. Is this behaviour intended?

Since these values may need to be adaptive, wouldn't it make sense to expose them as network inputs? Or is there another way to change the thresholds (maybe even at runtime)? I know this is perhaps more MMDetection-related, but the deployment problem seems to be otx-related.
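
For reference, a minimal sketch of where those defaults typically live in an MMDetection-style model.py; the exact structure of the otx-generated config may differ, so the keys here are illustrative:

```python
# model.py (workspace) -- illustrative MMDetection-style YOLOX test config;
# these are the values that end up baked into the exported NMS operator.
model = dict(
    type='YOLOX',
    # ... backbone / neck / bbox_head ...
    test_cfg=dict(
        score_thr=0.01,  # score threshold hard-coded into the export
        nms=dict(type='nms', iou_threshold=0.65),  # NMS IoU threshold, likewise fixed
    ),
)
```

For what it's worth, the ONNX NonMaxSuppression operator accepts iou_threshold and score_threshold as tensor inputs, so exposing them as network inputs is possible in principle.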

jaegukhyun (Contributor) commented

Thanks for reporting; that makes sense to me. Let me check.

jaegukhyun (Contributor) commented

Hi @cansik, I've looked into this issue. I think we need more detail about your request.

  1. Do you want to change score_thr and nms_threshold during training? If so, changing model.py in your workspace seems appropriate.
  2. Do you want to change score_thr and nms_threshold during PyTorch inference? In this case, changing model.py in your workspace also seems appropriate.
  3. Do you want to change score_thr and nms_threshold during OV inference?
    3-1. In this case, you can change model.py in your workspace before exporting the model. Do you think this is not the proper way?
    3-2. Maybe you want to change score_thr and nms_threshold for an already-exported model. If this is your case, more investigation is needed (a rough sketch of one possible direction follows this list).
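
For 3-2, one conceivable direction (an untested sketch, not an otx feature) is to patch the threshold constants feeding the NonMaxSuppression node of the exported IR with the OpenVINO Python API; the input indices follow the NonMaxSuppression opset spec (3 = iou_threshold, 4 = score_threshold), and the sketch assumes both optional inputs are present:

```python
# patch_nms.py -- untested sketch: rewrite the IoU/score threshold constants
# feeding a NonMaxSuppression node in an already-exported OpenVINO IR.
import numpy as np
from openvino.runtime import Core, serialize
from openvino.runtime import opset8 as ops

core = Core()
model = core.read_model("exported_model.xml")  # hypothetical path

for node in model.get_ops():
    if node.get_type_name() == "NonMaxSuppression":
        # Input 3 is iou_threshold, input 4 is score_threshold (opset spec).
        node.input(3).replace_source_output(ops.constant(np.float32(0.5)).output(0))
        node.input(4).replace_source_output(ops.constant(np.float32(0.25)).output(0))

serialize(model, "patched_model.xml", "patched_model.bin")
```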

cansik commented Jul 17, 2023

@jaegukhyun In deployment.py I can change various values for the ONNX and OpenVINO export, such as the input size and other parameters. These parameters are used when I run the otx export command, so this is all about exporting models from PyTorch to ONNX/OpenVINO.

However, the export does not take the two parameters confidence_threshold and iou_threshold into account. This results in an exported model with default NMS values instead of the configured ones.
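
For context, a minimal sketch of the post_processing block being referred to, following the MMDeploy-style layout that otx deployment configs use; the exact keys and values here are illustrative, not copied from an otx template:

```python
# deployment.py (workspace) -- illustrative MMDeploy-style codebase config;
# the export reportedly ignores these values in favour of test_cfg.
codebase_config = dict(
    type='mmdet',
    task='ObjectDetection',
    post_processing=dict(
        confidence_threshold=0.3,  # expected to override test_cfg's score_thr=0.01
        iou_threshold=0.45,        # expected to override test_cfg's nms 0.65
        max_output_boxes_per_class=200,
        pre_top_k=5000,
        keep_top_k=100,
        background_label_id=-1,
    ),
)
```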

jaegukhyun commented Jul 27, 2023

Hi @cansik. There was a misunderstanding in my previous guidance. OTX collects configurable hyperparameters in https://github.com/openvinotoolkit/training_extensions/blob/develop/src/otx/algorithms/detection/configs/detection/configuration.yaml. This configuration.yaml file is placed in your workspace when you use the otx build or otx train command. You may be able to change some hyperparameters through model.py or deployment.py rather than configuration.yaml, but that is not recommended, since we plan to hide those files. In the near future, users will only be able to access hyperparameters through configuration.yaml.

In short, we want users to change hyperparameters through configuration.yaml and template.yaml, and we don't recommend changing those parameters through model.py or the other files.

However, the current configuration.yaml can only change the confidence threshold; iou_threshold and input_size are still fixed by model.py and data_pipeline.py.
#2388 is a PR that enables changing the confidence threshold during model export and OV model inference. Other parameters such as iou_threshold and input_size will have to wait until patching through configuration.yaml is supported.

In summary, the confidence threshold can now be changed by the user at model export and OV model inference time; you can check the usage of this variable in the PR summary. Input size and IoU threshold will have to wait for support in configuration.yaml.
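
For orientation, a trimmed, illustrative sketch of what the relevant postprocessing entry in that configuration.yaml might look like; the key names and default shown here are assumptions, and the linked file is authoritative:

```yaml
# configuration.yaml (workspace) -- trimmed, illustrative sketch only;
# see the file linked above for the authoritative schema.
postprocessing:
  confidence_threshold:
    description: Filter detections below this confidence during inference.
    default_value: 0.35
  # iou_threshold and input_size are not exposed here yet (see discussion above).
```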
