
Detectron2 DefaultPredictor #85

Open
neerajvashistha opened this issue Feb 25, 2021 · 0 comments

For fine-tuned models built with detectron2, we generally write a predictor, for example:

import cv2

from detectron2 import model_zoo
from detectron2.engine import DefaultPredictor
from detectron2.config import get_cfg
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # set threshold for this model
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
predictor = DefaultPredictor(cfg)
im = cv2.imread("./input.jpg")
outputs = predictor(im)
mask = outputs['instances'].pred_masks.to('cpu').numpy()

I wanted to use Service Streamer to make my model serving more robust, but I am running into errors. The most basic thing I tried was:

streamer = ThreadedStreamer(predictor, batch_size=1, max_latency=0.1)
outputs = streamer.predict(im)

and it resulted in the traceback below:

Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 154, in run_forever
    handled = self._run_once()
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 184, in _run_once
    model_outputs = self.model_predict(model_inputs)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 163, in model_predict
    batch_result = self._predict(batch_input)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/detectron2/engine/defaults.py", line 218, in __call__
    height, width = original_image.shape[:2]
AttributeError: 'list' object has no attribute 'shape'

[2021-02-25 01:08:08,197] ERROR in app: Exception on /predict [POST]
Traceback (most recent call last):
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "main.py", line 102, in predict
    outputs = streamer.predict(im)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 132, in predict
    ret = self._output(task_id)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 122, in _output
    batch_result = future.result(WORKER_TIMEOUT)
  File "/home/ubuntu/virtpy3/lib/python3.6/site-packages/service_streamer/service_streamer.py", line 41, in result
    raise TimeoutError("Task: %d Timeout" % self._id)
TimeoutError: Task: 0 Timeout
127.0.0.1 - - [25/Feb/2021 01:08:08] "POST /predict HTTP/1.1" 500 -

Any clue on how to use predictor() with Service Streamer?
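A minimal sketch of one way around the first error, assuming (as the `AttributeError` traceback suggests) that service_streamer hands the worker callable the whole batch as a *list*, while `DefaultPredictor` expects a single image. Wrapping the predictor so it maps over the batch should satisfy both interfaces. `DefaultPredictor` is replaced here by a stand-in so the snippet is self-contained; `make_batch_predict`, `FakeImage`, and `fake_predictor` are illustrative names, not part of either library:

```python
def make_batch_predict(predictor):
    """Adapt a single-image predictor to the list-in, list-out
    interface that service_streamer expects from its worker."""
    def batch_predict(images):
        # images is a list of inputs; run the predictor on each one
        return [predictor(img) for img in images]
    return batch_predict


# Stand-in for a cv2 image (any object with a .shape attribute).
class FakeImage:
    shape = (480, 640, 3)


# Stand-in for DefaultPredictor: any callable taking one image.
def fake_predictor(img):
    height, width = img.shape[:2]  # the line that failed on a list
    return {"height_width": (height, width)}


batch_predict = make_batch_predict(fake_predictor)
outputs = batch_predict([FakeImage()])
print(outputs[0]["height_width"])  # (480, 640)
```

With the real model, the idea would be `streamer = ThreadedStreamer(make_batch_predict(predictor), batch_size=1, max_latency=0.1)` and then `outputs = streamer.predict([im])[0]`, passing the image wrapped in a list; I have not verified this end to end against detectron2, so treat it as a starting point.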
