Use several classical deep learning models to solve a multi-label NLP classification problem
Exploring fast & accurate zero-shot text classification
Repository for a transformer I coded from scratch and trained on the tiny-shakespeare dataset.
PyTorch implementation of a transformer encoder + attention text classification algorithm (a minimal sketch of this pattern appears after this list)
Code for the paper "Constructing Global Coherence Representations: Identifying Interpretability and Coherences of Transformer Attention in Time Series Data", which creates coherence matrices representing the attention from each symbol to every other symbol.
Projects built for self-learning purposes.
Pose-based word-level sign language recognition with BERT-styled transformer in Keras
Julia experimentation using sequence-based NLP models
Using the similarity between embedded protein sequences to align them
Co-Driven Recognition of Semantic Consistency via the Fusion of Transformer and HowNet Sememes Knowledge
A deep learning classification tool for anomalous diffusion trajectories.
A comprehensive codebase for AI & robotics.
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
Objective: predicting whether the customer will return in the next month. Techniques used: XGBoost, logistic regression, an attention-based LSTM neural network, and a self-attention-based transformer neural network.
CS747 - Foundations Of Intelligent Learning Agents (FILA) Course Project
Deep Learning Course Assignment on Image Captioning and Machine Translation using LSTMs
A comparative study of deep learning models to correctly identify the cancer a patient has, as a means of streamlining the process of making a post on the Cancer Survivors Network website.
Augmenting Multitask Learning for Multiclass Anomaly Severity Detection on Drone Flight Logs
Transformer quantization and binarization exploration
Deep learning project, October 2023.
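Several of the entries above, such as the PyTorch transformer encoder + attention text classifier, share the same basic pattern: embed tokens, run them through a stack of encoder layers, pool, and classify. Below is a minimal PyTorch sketch of that pattern, not taken from any listed repo; the vocabulary size, dimensions, class count, and mean pooling over non-padding tokens are all illustrative assumptions.

```python
# Minimal sketch of a transformer-encoder text classifier (illustrative only;
# vocab size, dimensions, and class count are assumptions, not any repo's code).
import torch
import torch.nn as nn

class TransformerTextClassifier(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, nhead=4,
                 num_layers=2, num_classes=5, max_len=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor; 0 marks padding
        x = self.embed(token_ids) + self.pos[:, :token_ids.size(1)]
        pad_mask = token_ids.eq(0)                      # True at padded positions
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        # Mean-pool over the real (non-padding) tokens only
        h = h.masked_fill(pad_mask.unsqueeze(-1), 0.0)
        pooled = h.sum(1) / (~pad_mask).sum(1, keepdim=True)
        return self.classifier(pooled)                  # per-class logits

model = TransformerTextClassifier()
logits = model(torch.randint(1, 10000, (8, 64)))        # fake batch of 8 texts
print(logits.shape)                                     # torch.Size([8, 5])
```

For multi-label variants like the first entry, the final layer would typically feed a per-class sigmoid with a binary cross-entropy loss instead of a softmax.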