
seq2seq-model

Here are 118 public repositories matching this topic...

A text summarization model built with a Seq2Seq architecture and Luong attention, producing short, concise summaries of global news headlines.

  • Updated May 6, 2024
  • Jupyter Notebook
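The core of the Luong mechanism mentioned above is scoring each encoder state against the current decoder state and taking a softmax-weighted context vector. A minimal NumPy sketch of the "dot" scoring variant (toy dimensions and function name are illustrative, not from the repository):

```python
import numpy as np

def luong_dot_attention(decoder_hidden, encoder_outputs):
    """Luong 'dot' attention: score each encoder step by its dot
    product with the current decoder hidden state, softmax the
    scores, and return the attention-weighted context vector."""
    # decoder_hidden: (hidden,), encoder_outputs: (src_len, hidden)
    scores = encoder_outputs @ decoder_hidden       # (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over source steps
    context = weights @ encoder_outputs             # (hidden,)
    return context, weights

# toy example: 3 source steps, 4-dim hidden state
h_dec = np.array([1.0, 0.0, 0.0, 0.0])
H_enc = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0]])
context, weights = luong_dot_attention(h_dec, H_enc)
```

Because the first encoder state aligns with the decoder state, it receives the largest attention weight, and the context vector is pulled toward it.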

Thesis scope: train and develop a Table-to-Text Transformer-based model for contextual summarization of tabular data. To achieve this, T5-small, T5-base, BART-base, and Llama 2 7B Chat were fine-tuned on ToTTo and QTSumm. On ToTTo, the models outperformed the benchmark.

  • Updated Apr 24, 2024
  • Jupyter Notebook
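Table-to-text models like the ones above consume tables as flat strings, so the table must first be linearized into a prompt. A hedged sketch of one possible linearization (the format and function name are illustrative; ToTTo's actual preprocessing uses its own special markup tags):

```python
def linearize_table(page_title, header, rows):
    """Flatten a table into a single text prompt for a
    text-to-text model such as T5. Hypothetical format:
    'summarize table: <title> | h1: v1; h2: v2 || ...'."""
    linearized_rows = []
    for row in rows:
        # pair each cell with its column header
        linearized_rows.append("; ".join(f"{h}: {v}" for h, v in zip(header, row)))
    return f"summarize table: {page_title} | " + " || ".join(linearized_rows)

# toy table with two rows
prompt = linearize_table(
    "Olympic Games",
    ["Year", "Host"],
    [["2016", "Rio"], ["2021", "Tokyo"]],
)
```

The resulting string can then be tokenized and fed to a fine-tuned seq2seq model as its source sequence.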

Abstractive text summarization generates a shorter version of a given text while attempting to preserve its contextual meaning. Our approach models the problem with an attentional encoder-decoder, which ensures that the decoder focuses on the appropriate input words at each step of generation.

  • Updated Jun 1, 2023
  • Jupyter Notebook
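A single step of the attentional encoder-decoder described above can be sketched end to end: attend over the encoder states, fuse the context with the decoder state, and project to vocabulary logits. All dimensions, weights, and names here are toy assumptions for illustration, not the repository's implementation:

```python
import numpy as np

def attentional_decode_step(h_dec, H_enc, W_c, W_out):
    """One decoder step of an attentional encoder-decoder:
    compute attention weights over encoder states, build the
    context vector, fuse it with the decoder hidden state
    (tanh over a learned projection of their concatenation),
    and project the result to vocabulary logits."""
    scores = H_enc @ h_dec                          # (src_len,)
    w = np.exp(scores - scores.max())
    w /= w.sum()                                    # attention weights
    context = w @ H_enc                             # (hidden,)
    fused = np.tanh(W_c @ np.concatenate([context, h_dec]))
    logits = W_out @ fused                          # (vocab,)
    return logits, w

# toy dimensions: 4-dim hidden, 3 source steps, 6-word vocabulary
rng = np.random.default_rng(0)
hidden, src_len, vocab = 4, 3, 6
H_enc = rng.normal(size=(src_len, hidden))
h_dec = rng.normal(size=hidden)
W_c = rng.normal(size=(hidden, 2 * hidden))
W_out = rng.normal(size=(vocab, hidden))
logits, w = attentional_decode_step(h_dec, H_enc, W_c, W_out)
```

At generation time this step is repeated, feeding each predicted token's embedding back in as the next decoder input, so the attention weights shift across the source as the summary is produced.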
