1. Use BERT, ALBERT and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAT, GIN and GraphSAGE based on message passing.
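The message-passing idea behind GCN/GraphSAGE-style models can be shown in a few lines: each node aggregates its neighbours' features and combines them with its own. Below is a minimal NumPy sketch (not taken from the repo above) using mean aggregation followed by concatenation, roughly in the GraphSAGE style; the function name and graph are illustrative only.

```python
import numpy as np

def message_passing_mean(adj, h):
    """One round of message passing with mean aggregation
    (GraphSAGE-style sketch): each node averages its neighbours'
    features and concatenates the result with its own features."""
    deg = adj.sum(axis=1, keepdims=True)       # neighbour counts per node
    deg[deg == 0] = 1                          # avoid division by zero for isolated nodes
    neigh = adj @ h / deg                      # mean of neighbour features
    return np.concatenate([h, neigh], axis=1)  # [self || aggregated]

# Tiny undirected graph 0-1, 1-2 with 2 features per node
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
h = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
out = message_passing_mean(adj, h)
print(out.shape)  # (3, 4)
```

A real GNN layer would follow the concatenation with a learned linear projection and nonlinearity; stacking several such rounds lets information travel multiple hops.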
The post-modifier generation task is to automatically generate a post-modifier phrase describing a target entity (an entity is essentially a noun, but here we consider only people) that fits the context of the input sentence.
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
Deep Learning Library for Text Classification.
Use BiLSTM-Attention, BERT, ALBERT, RoBERTa and XLNet models to classify the SST-2 dataset, based on PyTorch
PyTorch implementation of some text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | Text classification
Chinese entity relation extraction, PyTorch, BiLSTM + attention
Deep Learning based end-to-end solution for detecting fraudulent and spam messages across all your devices
2022 COMAP Problem C (Bitcoin and Gold Quant Trading)
Explainable Sentence-Level Sentiment Analysis – Final project for "Deep Natural Language Processing" course @ PoliTO
Implementation of papers for text classification task on SST-1/SST-2
This repo contains all files needed to train and select NLP models for fake news detection
TextCNN, TextRNN, FastText, TextRCNN, BiLSTM-Attention, DPCNN and Transformer in the PyTorch framework
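The BiLSTM-Attention pattern recurring in these repos boils down to scoring each timestep's hidden state, softmaxing the scores, and taking a weighted sum as the sentence vector. The following is a minimal NumPy sketch, not code from any listed repo; the query vector `w` stands in for a learned parameter, and random vectors stand in for BiLSTM outputs.

```python
import numpy as np

def attention_pool(hidden, w):
    """Attention pooling over a sequence of hidden states, as used
    on top of a BiLSTM: score each timestep, softmax the scores,
    and return the weighted sum as the sentence representation."""
    scores = hidden @ w                            # one scalar score per timestep, shape (T,)
    scores = scores - scores.max()                 # shift for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # attention weights, sum to 1
    return alpha @ hidden, alpha                   # context vector (H,), weights (T,)

T, H = 5, 8                                        # 5 timesteps, hidden size 8
rng = np.random.default_rng(0)
hidden = rng.standard_normal((T, H))               # stand-in for BiLSTM outputs
w = rng.standard_normal(H)                         # hypothetical learned query vector
ctx, alpha = attention_pool(hidden, w)
print(ctx.shape)  # (8,)
```

The context vector `ctx` then feeds a classifier head; the weights `alpha` are also what "explainable" variants inspect to see which tokens drove the prediction.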
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive text summarization, sentence-similarity judgment, triple extraction, pretrained models, etc.
This project features a next-word prediction model deployed via a Flask API, implemented using a Bi-LSTM model with an attention layer.
Chinese sentiment classification | Three-class text sentiment analysis
Course project of CS247.