BERT

Bidirectional Encoder Representations from Transformers

| PAPER (theory) | Hugging Face (engineering) |

The BERT model was proposed in BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding by Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. It is a bidirectional transformer pretrained with a combination of a masked language modeling (MLM) objective and next sentence prediction (NSP) on a large corpus comprising the Toronto Book Corpus and Wikipedia.
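
Since this repo points to Hugging Face for the engineering side, here is a minimal sketch of the MLM objective in action. It assumes the `transformers` library and the public `bert-base-uncased` checkpoint, neither of which ships with this repository.

```python
# Minimal sketch (assumption: transformers and a PyTorch backend are installed).
from transformers import pipeline

# Load a pretrained BERT checkpoint behind a fill-mask pipeline.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT fills in [MASK] using context from BOTH sides (bidirectional).
for prediction in unmasker("Paris is the [MASK] of France."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```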

How to view the code 😲

Just press the '.' key on your keyboard right now!

(screenshot: the '.' key highlighted on a keyboard)

This opens the repository in GitHub's web-based Visual Studio Code editor (github.dev), where you can browse the code.

(screenshot: the repository open in the VS Code web editor)
