Predicting a Missing Word Using BERT Masked LM

input

A sentence containing one masked word, defined as SENTENCE in bert.py.
The masked word should be represented by a single underscore (_).

output

The top k predicted words suitable for filling the masked word.
k is defined as NUM_PREDICT in bert.py.
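The top-k step amounts to ranking the model's per-vocabulary scores at the masked position and keeping the k best. A minimal sketch of that selection, using NumPy and a hypothetical tiny vocabulary (the function name and data are illustrative, not taken from bert.py):

```python
import numpy as np

def top_k_words(scores, vocab, k):
    """Return the k highest-scoring vocabulary words.

    `scores` is a 1-D array of per-word scores (e.g. the model's
    logits at the masked position); `vocab` maps index -> word.
    """
    top_ids = np.argsort(scores)[::-1][:k]
    return [vocab[i] for i in top_ids]

# Hypothetical scores for a toy vocabulary.
vocab = ['buy', 'drive', 'rent', 'eat']
scores = np.array([4.0, 3.1, 2.5, 0.2])
print(top_k_words(scores, vocab, 3))  # ['buy', 'drive', 'rent']
```

In the real script the scores come from the masked-LM head's output over the full BERT vocabulary; only the ranking logic is shown here.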

Usage

SENTENCE is defined in bert.py, e.g. SENTENCE = 'I want to _ the car because it is cheap.'

  • English BERT
$ python3 bert.py
...
predicted top 3 words: ['buy', 'drive', 'rent']

  • Japanese BERT
$ python3 bert.py -l jp
...
predicted top 3 words: ['結婚', '[UNK]', '']
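Before tokenization, the underscore placeholder in SENTENCE has to be replaced with BERT's [MASK] token so the model knows which position to predict. A minimal sketch of that preprocessing step (the function name is illustrative, not from bert.py):

```python
def to_masked_input(sentence, mask_token='[MASK]'):
    """Replace the single `_` placeholder with the model's mask token."""
    if sentence.count('_') != 1:
        raise ValueError('SENTENCE must contain exactly one `_` placeholder')
    return sentence.replace('_', mask_token)

print(to_masked_input('I want to _ the car because it is cheap.'))
# I want to [MASK] the car because it is cheap.
```

The resulting string is then tokenized, and the index of the [MASK] token is the position whose output logits are ranked for the top-k prediction.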

Reference

pytorch-pretrained-bert
BERT日本語Pretrainedモデル (BERT Japanese pretrained model)

Framework

PyTorch 1.3.0

Model Format

ONNX opset = 10

Netron

bert-base-uncased.onnx.prototxt