
Error on multi-threading !!! #84

Open
marcusau opened this issue Feb 16, 2021 · 2 comments

From my server side:

from backend.bert_test import NERServeHandler
from flask import Flask, request, jsonify
from service_streamer import ThreadedStreamer


app = Flask(__name__)
model = None
streamer = None


@app.route("/stream", methods=["POST"])
def stream_predict():
    inputs = request.data.strip()
    inputs = inputs.decode('utf-8')
    print(f'receive inputs: {inputs}')
    outputs = streamer.predict([inputs])
    return jsonify(outputs)


if __name__ == "__main__":
    model = NERServeHandler()
    model.initialize()
    streamer = ThreadedStreamer(model.predict, batch_size=64, max_latency=0.1)
    app.run(host='127.0.0.1', port=5005, debug=False)

From my request (client) side:

import requests
ner_api_url='http://127.0.0.1:5005/stream'

word='由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。'

word = word.encode('utf-8').strip()

url_response = requests.post(ner_api_url, data=word)
if url_response.status_code != 200:
    print('status code error:', url_response.status_code)
else:
    print(url_response.content)

Error:

    loading NER labels and arguments
    loading BERT Config
    loading BERT tokenizer
    loading BERT Model
     * Running on http://127.0.0.1:5005/ (Press CTRL+C to quit)
    Model successfully loaded.
     * Serving Flask app "service_streamer_test" (lazy loading)
     * Environment: production
       WARNING: This is a development server. Do not use it in a production deployment.
       Use a production WSGI server instead.
     * Debug mode: off

    receive inputs: 由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。
    Exception in thread thread_worker:
    Traceback (most recent call last):
      File "C:\Program Files\Python37\lib\threading.py", line 926, in _bootstrap_inner
        self.run()
      File "C:\Program Files\Python37\lib\threading.py", line 870, in run
        self._target(*self._args, **self._kwargs)
      File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 154, in run_forever
        handled = self._run_once()
      File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 189, in _        run_once
        self._send_response(client_id, task_id, request_id, model_outputs[i])
    KeyError: 0

kuangdd commented Dec 7, 2021

I ran into the same problem. What could the cause be?
Is it a bug in the code, or are we using it the wrong way? Do certain environment variables need to be set specifically, or is it something else entirely?


kuangdd commented Dec 7, 2021

I found the cause in my case: the logic in model.predict must consist entirely of tensor-computation steps. Adding any non-tensor-computation logic to it produces exactly this error.
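
For reference, service_streamer hands the wrapped predict function a list of inputs and indexes its return value once per request, so predict must return exactly one output per input. If it returns a dict, a single result, or fewer items than the batch (for example because extra non-tensor post-processing raised or dropped entries), the worker's model_outputs[i] lookup has no entry for request 0, which matches the KeyError: 0 above. Below is a minimal sketch of that contract, using a hypothetical stand-in for NERServeHandler (the real class is not shown in this thread):

from typing import List

class NERServeHandler:
    """Hypothetical stand-in illustrating the batch contract, not the real handler."""

    def initialize(self):
        # The real handler would load the BERT config, tokenizer and model here.
        pass

    def predict(self, batch: List[str]) -> List[list]:
        # ThreadedStreamer passes a list of inputs and indexes the result
        # per request, so this must return exactly one output per input.
        outputs = []
        for text in batch:
            entities = []  # placeholder for the actual tensor-based NER pass
            outputs.append(entities)
        assert len(outputs) == len(batch)  # the invariant the worker relies on
        return outputs

Keeping that assert (or an equivalent length check) inside predict makes this failure mode show up at its source instead of as a KeyError in the streamer's worker thread.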
