Semantic-Loss-Dialogue-Generation

Splitting a single dialogue generation task into multiple tasks by dropping different output tokens, to learn better encoder representations.
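The rough idea is illustrated by the minimal sketch below. This is only an illustration of the task-splitting step, not the code in beamsearch2.py; the DROP_TOKEN placeholder and the split_into_tasks helper are hypothetical names, and how the dropped positions are chosen is simplified here.

from typing import List

DROP_TOKEN = "<drop>"  # hypothetical placeholder standing in for a dropped output token

def split_into_tasks(target_tokens: List[str], num_tasks: int) -> List[List[str]]:
    """Turn one target response into several training targets,
    each with a different output token dropped, so the encoder is
    trained against multiple related decoding tasks."""
    tasks = []
    for k in range(min(num_tasks, len(target_tokens))):
        dropped = list(target_tokens)
        dropped[k] = DROP_TOKEN  # drop a different position in each copy
        tasks.append(dropped)
    return tasks

if __name__ == "__main__":
    target = "i would love a table for two".split()
    for task in split_into_tasks(target, num_tasks=3):
        print(" ".join(task))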

Publication

An arXiv link to the paper can be found here.

Running the code

Run python beamsearch2.py to train the model and reproduce the results.


Citation

@inproceedings{parthasarathi2021Semantic,
  author       = {Parthasarathi, Prasanna and Abdelsalam, Mohamed and Pineau, Joelle and Chandar, Sarath},
  title        = {A Brief Study on the Effects of Training Generative Dialogue Models with a Semantic loss},
  year         = {2021},
  booktitle    = {Proceedings of the 22nd Annual SIGdial Meeting on Discourse and Dialogue},
  publisher    = {Association for Computational Linguistics},
}
