
Text_summarisation using BERT

This project is inspired by https://arxiv.org/pdf/1902.09243v2.pdf.
Part of the code is adapted from https://github.com/raufer/bert-summarization.
Created using TensorFlow 2.

In addition to the functionality in the adapted code, the following features were added:
*) Beam-search decoding during inference
*) A copy mechanism in the decoder
*) Top-k and nucleus (top-p) sampling decoders
*) BERT embeddings extracted with the Hugging Face Transformers library (see the sketch after this list)
*) Mixed-precision training
*) BERTScore for validation
*) Migration of the adapted code from TensorFlow 1 to TensorFlow 2

Instructions to train the model

*) Run train_bert_summarizer_mixed_precision.py if you have a GPU with compute capability >= 7.5; otherwise, use train_bert_summarizer.py (a sketch of the mixed-precision setup is shown after these steps).

*) A Google Colab demo is available here.
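
For context on the mixed-precision script above, the following is a minimal sketch of how the mixed_float16 policy is typically enabled in TensorFlow 2 (2.4+ API). The toy model, layer sizes, and loss are placeholders, not the repository's actual summarisation model.

```python
# Minimal sketch (TensorFlow 2.4+ API), not the repository's exact setup:
# enable the mixed_float16 policy before building the model, and keep the
# final activation in float32 for numerical stability.
import tensorflow as tf

tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu", input_shape=(768,)),
    tf.keras.layers.Dense(32000),                            # float16 logits
    tf.keras.layers.Activation("softmax", dtype="float32"),  # float32 outputs
])
# Keras automatically wraps the optimizer in a LossScaleOptimizer
# when the global policy is mixed_float16.
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```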

