# bertVersusAll

## For Using the Traditional Embed-Encode-Attend-Predict Model
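As a rough orientation, the embed-encode-attend-predict pattern can be sketched in Keras as follows. The vocabulary size, sequence length, layer widths, and attention pooling below are assumptions for illustration, not this repo's actual model.

```python
# Hedged sketch of an embed-encode-attend-predict classifier in Keras.
# All sizes (vocab_size, max_len, num_classes, layer widths) are assumed values.
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, max_len, num_classes = 20000, 250, 2  # assumed sizes

inputs = keras.Input(shape=(max_len,), dtype="int32")
x = layers.Embedding(vocab_size, 128)(inputs)                        # embed
h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # encode
scores = layers.Dense(1, activation="tanh")(h)                       # attend: score each token
weights = layers.Softmax(axis=1)(scores)                             # normalize over the time axis
context = layers.Flatten()(layers.Dot(axes=1)([weights, h]))         # weighted sum of encoder states
outputs = layers.Dense(num_classes, activation="softmax")(context)   # predict

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```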

## For Using the BERT Model (Minimal BERT)

1. Install the BERT serving packages:

   `pip install -U bert-serving-server bert-serving-client`

2. Install the other needed libraries and download the data:

   `pip install keras_metrics keras pandas numpy scikit-learn`

3. Serve the model (a client usage sketch follows this list):

   `bert-serving-start -model_dir ~/bert/models/cased_L-12_H-768_A-12/ -num_worker=2 -max_seq_len=250`

   - `-num_worker` sets the number of worker processes (the model is loaded n times).
   - `-max_seq_len` specifies the largest input length the BERT model will accept (250 is suggested for concept work).
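Once the server is running, sentence embeddings can be requested from Python via `bert-serving-client`. The snippet below is a minimal sketch; the example sentences are illustrative, not taken from this repo.

```python
# Minimal sketch: query the running bert-serving server for sentence embeddings.
# The example sentences are illustrative assumptions.
from bert_serving.client import BertClient

bc = BertClient()  # connects to localhost on the default ports

sentences = ["BERT encodes whole sentences.", "Each sentence becomes one vector."]
vectors = bc.encode(sentences)

# For the cased_L-12_H-768_A-12 model this yields one 768-dimensional
# vector per sentence, i.e. an array of shape (2, 768).
print(vectors.shape)
```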