Successfully established a text summarization model using Seq2Seq modeling with Luong Attention, which can give a short, concise summary of global news headlines.
Awesome chatbot projects, corpora, papers, and tutorials, including Chinese chatbots.
A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and relational synthetic data generation.
Thesis scope: train and develop a Table-to-Text Transformer-based model for contextual summarization of tabular data. To achieve this, T5-small, T5-base, BART-base, and Llama 2 7B Chat were fine-tuned on ToTTo and QTSumm. On ToTTo, the models outperformed the benchmark.
This repository demonstrates NLTK methodology applied to machine learning.
Implement a sequence-to-sequence (seq2seq) model and analyze the model's performance.
Train Seq2Seq with Attention Chatbot
Facebook chatbot that I trained to talk like me using Seq2Seq
Large Multi-Language Models for News Translation
Just my experiment with Bert2Bert for summarization.
A numpy implementation of the Transformer model in "Attention is All You Need"
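The repository above is not reproduced here, but for readers unfamiliar with the core formula from "Attention is All You Need", a minimal numpy sketch of scaled dot-product attention follows. All names and shapes are illustrative assumptions, not code taken from that repository.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v)
    Returns the attended values and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_queries, n_keys)
    # Numerically stable softmax over the key axis.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example with hypothetical values: 2 queries, 2 keys, d_k = d_v = 2.
Q = np.eye(2)
K = np.eye(2)
V = np.array([[1.0, 0.0], [0.0, 1.0]])
out, weights = scaled_dot_product_attention(Q, K, V)
# Each row of `weights` is a probability distribution over the keys.
```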
Fine-tuned a pre-trained Urdu-to-English machine translation model on a custom dataset using the Hugging Face Trainer API.
This repository is related to the NMT challenge conducted in the CS779A course
Huet.AI is an app that translates Brazilian Sign Language (Libras) into Portuguese using AI.
Recommender and Chatbot Systems
Implementing translation tasks using the seq2seq approach, a necessary step toward understanding temporal models.
Yaw misalignment calibrator of wind turbine using RNN and Kalman filter
Fine-tuned and compared three pre-trained multilingual 🤗 LLMs.
Abstractive text summarization generates a shorter version of a given sentence while attempting to preserve its contextual meaning. In our approach, we model the problem with an attentional encoder-decoder, which ensures that the decoder focuses on the appropriate input words at each step of generation.
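The "focuses on the appropriate input words" step described above is the attention computation: at each decoding step, the decoder state is scored against every encoder state, and the resulting weights mix the encoder states into a context vector. Below is a minimal sketch of one Luong dot-score attention step in numpy; it is an illustrative assumption of how such a step looks, not the code of any project listed here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def luong_dot_attention(decoder_state, encoder_outputs):
    """One Luong (dot-score) attention step.

    decoder_state:   (hidden,)         current decoder hidden state
    encoder_outputs: (src_len, hidden) encoder hidden states
    Returns the context vector and the attention weights.
    """
    scores = encoder_outputs @ decoder_state   # similarity to each source position
    weights = softmax(scores)                  # where the decoder "focuses"
    context = weights @ encoder_outputs        # weighted mix of encoder states
    return context, weights

# Toy example with hypothetical sizes: 4 source positions, hidden size 3.
rng = np.random.default_rng(0)
encoder_outputs = rng.normal(size=(4, 3))
decoder_state = rng.normal(size=(3,))
context, weights = luong_dot_attention(decoder_state, encoder_outputs)
# `weights` sums to 1: a distribution over the 4 source positions.
```

In a full model the context vector would be concatenated with the decoder state and projected to produce the next output token's distribution.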