This starter includes a Pipfile that installs tensorflow and bert-serving-server.
- tensorflow 1.12.0 (the GPU build, tensorflow-gpu, is the default), see: https://www.tensorflow.org/install
- bert-serving-server with HTTP support, from https://github.com/hanxiao/bert-as-service
- BERT model: BERT-Base, Multilingual Cased, from https://github.com/google-research/bert
- Git
- Python3
- pipenv
- Nvidia GPU
- CUDA 9.0
- CUDNN 7.4.2
Clone this starter:
$ git clone https://github.com/jk195417/bert-as-service-starter.git
$ cd bert-as-service-starter
Install dependencies:
$ pipenv update
Download and unzip BERT-Base, Multilingual Cased into the ./models dir
If you don't want GPU support:
$ pipenv uninstall tensorflow-gpu
$ pipenv install tensorflow
Each worker needs about 1 GB of GPU RAM; for example, a GTX 1060 6G can run at most 4 workers.
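The sizing rule above can be sketched as a small helper. The 2 GB headroom is an assumption chosen to mirror the GTX 1060 example (6 GB, 4 workers); tune it for your card.

```python
def max_workers(gpu_ram_gb, ram_per_worker_gb=1, headroom_gb=2):
    """Estimate how many bert-serving workers fit on one GPU.

    Assumes ~1 GB per worker plus a fixed headroom for the model
    itself (an assumption, not a documented formula).
    """
    return max(1, (gpu_ram_gb - headroom_gb) // ram_per_worker_gb)

print(max_workers(6))  # GTX 1060 6G -> 4
```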
# Start server at localhost:8125
$ pipenv run bert-serving-start -model_dir=./models/multi_cased_L-12_H-768_A-12 -num_worker=4 -http_port=8125 -http_max_connect=20
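The server above also exposes an HTTP API on the -http_port. A minimal client sketch using only the Python standard library; the /encode payload shape ({"id", "texts", "is_tokenized"}) follows the bert-as-service docs, and the host/port match the command above:

```python
import json
from urllib import request

def build_encode_request(texts, req_id=0, host="localhost", port=8125):
    # Payload shape per the bert-as-service HTTP API:
    # {"id": ..., "texts": [...], "is_tokenized": false}
    payload = json.dumps(
        {"id": req_id, "texts": texs if (texs := texts) else texts, "is_tokenized": False}
    ).encode("utf-8")
    return request.Request(
        "http://%s:%d/encode" % (host, port),
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# With the server running (see the command above):
# with request.urlopen(build_encode_request(["hello world"])) as resp:
#     result = json.loads(resp.read())["result"]  # one 768-dim vector per text
```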
Press ctrl+c twice to shut down the server; afterwards you can clean up the tmp dirs it left behind.