fuyahuii/ConSK-GCN

The PyTorch code for paper: CONSK-GCN: Conversational Semantic- and Knowledge-Oriented Graph Convolutional Network for Multimodal Emotion Recognition [PDF].

Extended journal version: Context- and Knowledge-Aware Graph Convolutional Network for Multimodal Emotion Recognition [PDF].

The code is based on DialogueGCN.

Steps:

Knowledge preparation (you can skip this step by using the preprocessed knowledge files already provided in the Data directory):

  • Download ConceptNet and NRC_VAD.
  • Preprocess ConceptNet and NRC_VAD by running preprocess_knowledge.py.

Model training: run train_multi.py for both IEMOCAP and MELD datasets.
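The steps above can be sketched as shell commands. This is a minimal sketch: the script names come from this repository, but any command-line flags or a dataset-selection argument are assumptions and should be checked against the scripts themselves.

```shell
# Sketch of the workflow; script names are from this repository.

# 1. Build the knowledge files from ConceptNet and NRC_VAD
#    (skippable if you use the preprocessed files in the Data directory).
python preprocess_knowledge.py

# 2. Train the model (the same script covers both IEMOCAP and MELD).
python train_multi.py
```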

Citing

If you find this repo or the paper useful, please cite:

@article{fu2022context,
  title={Context- and Knowledge-Aware Graph Convolutional Network for Multimodal Emotion Recognition},
  author={Fu, Yahui and Okada, Shogo and Wang, Longbiao and Guo, Lili and Liu, Jiaxing and Song, Yaodong and Dang, Jianwu},
  journal={IEEE MultiMedia},
  year={2022},
  publisher={IEEE}
}
