Materials for "Linguistic Knowledge Can Enhance Encoder-Decoder Models (If You Let It)"
Updated May 20, 2024
Materials for "Linguistic Knowledge Can Enhance Encoder-Decoder Models (If You Let It)"
AraT5: Text-to-Text Transformers for Arabic Language Understanding
The "LLM Projects Archive" is a centralized GitHub repository, offering a diverse collection of Language Model Models projects. A valuable resource for researchers, developers, and enthusiasts, it showcases the latest advancements and applications in the realm of LLMs. Explore and contribute to the dynamic landscape of language model projects.
This repository contains implementations of abstractive text summarization using RNN, RNN with reinforcement learning, and Transformer architectures.
As seen in TREC 2023: a QA-first, hallucination-lite, multi-LM summarizer.
Python script for text summarization using the T5 model.
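As a minimal sketch of what such a script typically does (the `t5-small` checkpoint and the generation parameters here are assumptions for illustration, not necessarily what this repository uses):

```python
# Minimal T5 summarization sketch with Hugging Face transformers.
# T5 is a text-to-text model, so the task is signalled with a prefix.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

article = "Your long input text goes here..."  # text to condense
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   max_length=512, truncation=True)

# Beam search keeps the summary fluent; max_length caps its size.
summary_ids = model.generate(**inputs, max_length=60, num_beams=4,
                             early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```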
Forked version of https://github.com/alfazh123/ParaFaze with a state-of-the-art dose of over-engineering :)
A small 0.2B-parameter Chinese dialogue model (ChatLM-Chinese-0.2B), with fully open-sourced code for the entire pipeline: dataset sourcing, data cleaning, tokenizer training, model pretraining, SFT instruction fine-tuning, RLHF optimization, and more. Supports SFT fine-tuning for downstream tasks, with a worked example of fine-tuning for triple information extraction.
A T5-based Seq2Seq Model that Generates Titles for Machine Learning Papers using the Abstract
Code and Assets for "Benchmarking and Improving Text-to-SQL Generation Under Ambiguity" (EMNLP 2023)
Materials for "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" 🇮🇹
This repository explores enhancing dialogue summarization with commonsense knowledge through the SICK framework, evaluating models on dialogue datasets to assess the impact of commonsense on summarization quality.
Training a paraphrasing model using Hugging Face T5.
Machine translation experiments for 10 Indonesian local languages.
Grammar error correction model fine-tuned from t5-base on the JFLEG dataset.
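A minimal fine-tuning sketch under stated assumptions: the "grammar: " task prefix, training on JFLEG's validation split (JFLEG ships no train split), and the hyperparameters are all illustrative, not this repository's actual setup.

```python
# Sketch: fine-tune t5-base for grammar correction on JFLEG.
from datasets import load_dataset
from transformers import (DataCollatorForSeq2Seq, T5ForConditionalGeneration,
                          T5Tokenizer, Trainer, TrainingArguments)

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# JFLEG provides only validation/test splits; using validation here
# for training is purely illustrative.
dataset = load_dataset("jfleg", split="validation")

def preprocess(batch):
    # Each source sentence has several human corrections; use the first.
    model_inputs = tokenizer(["grammar: " + s for s in batch["sentence"]],
                             max_length=128, truncation=True)
    labels = tokenizer(text_target=[c[0] for c in batch["corrections"]],
                       max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True,
                        remove_columns=dataset.column_names)

# DataCollatorForSeq2Seq pads labels with -100 so padding is
# ignored by the loss.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="t5-jfleg-gec",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```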
Deep learning project for summarizing BBC News articles using the T5 model.
Reproducing an ACL paper on iterative text generation and performing robustness and multilinguality tests.
A chatbot developed with generative AI based on a large language model (LLM); the model used is T5 (seq2seq).