This repository has been archived by the owner on Nov 18, 2022. It is now read-only.

# nlu-jointbert-dl2021

Repository for the course project of Deep Learning for Speech and Natural Language Processing at Universität Stuttgart. Winter 2020-2021.

Open In Colab

## Task

Natural Language Understanding

## Data

Custom dataset provided by the instructors.

### Dataset properties

```json
{
  "text": "",
  "positions": [{}],
  "slots": [{}],
  "intent": ""
}
```
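For illustration only, a hypothetical filled-in instance might look like the following. The dataset is not public, so the field values and the exact shape of `positions` and `slots` (here assumed to be slot-name/character-span and slot-name/text pairs) are assumptions based on the sample prediction shown below:

```json
{
  "text": "add kansas city, missouri to Stress Relief",
  "positions": [{"entity_name": [4, 24]}, {"playlist": [29, 41]}],
  "slots": [{"entity_name": "kansas city, missouri"}, {"playlist": "Stress Relief"}],
  "intent": "AddToPlaylist"
}
```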

## Model Description

- Inputs: `text`
- Labels: `slots`, `intent`
- Pretrained model: `bert-base-cased` from Hugging Face Transformers.

## Environment Setup

```shell
pip install -r requirements.txt
```

## Sample prediction output

### Input

```text
add kansas city, missouri to Stress Relief
```

### Output

```json
{
  "intent": "AddToPlaylist",
  "slots": {
    "playlist": "Stress Relief",
    "entity_name": "kansas city, missouri"
  }
}
```
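In joint intent/slot models such as JointBERT, slot labels are typically predicted per token with a BIO tagging scheme, then merged back into the span-level `slots` dict shown above. The sketch below illustrates that decoding step; the tag names, the word-level tokenization, and the function itself are illustrative assumptions, not this repository's actual code:

```python
def decode_slots(tokens, tags):
    """Merge BIO-tagged tokens into a {slot_name: text} dict.

    tokens: word-level tokens of the input utterance.
    tags:   one BIO tag per token, e.g. "B-playlist", "I-playlist", "O".
    """
    slots = {}
    current_name, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new slot span begins; flush any span in progress.
            if current_name is not None:
                slots[current_name] = " ".join(current_tokens)
            current_name, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_name == tag[2:]:
            # Continuation of the current span.
            current_tokens.append(token)
        else:
            # "O" tag (or an inconsistent "I-") ends the current span.
            if current_name is not None:
                slots[current_name] = " ".join(current_tokens)
            current_name, current_tokens = None, []
    if current_name is not None:
        slots[current_name] = " ".join(current_tokens)
    return slots

tokens = ["add", "kansas", "city,", "missouri", "to", "Stress", "Relief"]
tags = ["O", "B-entity_name", "I-entity_name", "I-entity_name", "O",
        "B-playlist", "I-playlist"]
print(decode_slots(tokens, tags))
# → {'entity_name': 'kansas city, missouri', 'playlist': 'Stress Relief'}
```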

## Run

```shell
python main.py <gpu_id>
# pass the GPU id from nvidia-smi on multi-GPU systems
# on a single-GPU system, use 0
```

## References

1. Chen et al. (2019). BERT for Joint Intent Classification and Slot Filling. https://arxiv.org/abs/1902.10909
2. monologg/JointBERT: https://github.com/monologg/JointBERT
