Transformer Hawkes Process

Source code for Transformer Hawkes Process (ICML 2020).

Run the code

Dependencies

  • Python 3.7.
  • A standard Anaconda installation contains all the other required packages.
  • PyTorch version 1.4.0.

Instructions

  1. Put the data folder inside the root folder and modify the data entry in run.sh accordingly. The datasets are available here.
  2. Run bash run.sh to start training.

Note

  • The code currently supports single-GPU training only; extending it to multiple GPUs should be straightforward.
  • The reported event-time-prediction RMSE and the time stamps provided in the datasets are not in the same unit, e.g., the provided time stamps may be in minutes while the reported results are in hours.
  • There are several factors that can be changed, besides the ones in run.sh:
    • In Main.py, function train_epoch, the event-time-prediction squared error needs to be properly scaled to stabilize training. At the same time, also scale the diff variable in function time_loss in Utils.py; see the first sketch after this list.
    • In Utils.py, function log_likelihood, users can select whether to use numerical integration or Monte Carlo integration; see the second sketch after this list.
    • In transformer/Models.py, class Transformer, there is an optional recurrent layer. This is inspired by the fact that additional recurrent layers can better capture the sequential context, as suggested in this paper. In practice, this may or may not help, depending on the dataset; see the third sketch after this list.
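
For the loss scaling above, a minimal sketch of what the scaled squared-error term might look like; the tensor shapes and the scale constant are illustrative, not the repository's exact code:

```python
import torch

def scaled_time_loss(prediction, event_time, scale=100.0):
    """Scaled squared error for event-time prediction.

    A minimal sketch: `prediction` holds predicted inter-event gaps of shape
    (batch, seq_len), `event_time` holds absolute time stamps of the same
    shape, and `scale` is a hypothetical constant chosen so this term stays
    comparable in magnitude to the log-likelihood term.
    """
    diff = event_time[:, 1:] - event_time[:, :-1]   # true inter-event gaps
    se = ((prediction[:, 1:] - diff) ** 2).sum()    # squared error over all gaps
    return se / scale                               # the scaling discussed above
```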
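
For the integration choice, a sketch of a Monte Carlo estimate of the non-event term (the integral of the intensity over the observation window); intensity_fn is a hypothetical stand-in for the intensity computed from the encoder's hidden states. The numerical alternative replaces the random samples with an evenly spaced grid inside each gap:

```python
import torch

def mc_non_event_term(intensity_fn, event_time, num_samples=100):
    """Monte Carlo estimate of the integral of the intensity over the window.

    `intensity_fn(t)` is a hypothetical callable returning the intensity at a
    batch of sampled time points; in the repository the intensity is computed
    from the encoder's hidden states instead.
    """
    diff = event_time[:, 1:] - event_time[:, :-1]                   # inter-event gaps
    u = torch.rand(*diff.shape, num_samples, device=diff.device)    # uniform samples in (0, 1)
    t = event_time[:, :-1].unsqueeze(-1) + u * diff.unsqueeze(-1)   # sampled times inside each gap
    lam = intensity_fn(t)                                           # intensity at the sampled times
    return (lam.mean(dim=-1) * diff).sum(dim=-1)                    # mean intensity times gap, summed
```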
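
For the optional recurrent layer, a self-contained sketch of putting an LSTM on top of the encoder output; names and dimensions are illustrative rather than the exact code in transformer/Models.py:

```python
import torch.nn as nn

class RecurrentOnTop(nn.Module):
    """Optional recurrent layer applied to the transformer encoder output.

    Illustrative only: module and argument names do not match
    transformer/Models.py exactly. `non_pad_mask` has shape (batch, seq_len, 1).
    """
    def __init__(self, d_model, d_rnn):
        super().__init__()
        self.rnn = nn.LSTM(d_model, d_rnn, num_layers=1, batch_first=True)
        self.proj = nn.Linear(d_rnn, d_model)

    def forward(self, enc_output, non_pad_mask):
        out, _ = self.rnn(enc_output)   # re-encode the sequence recurrently
        out = self.proj(out)            # map back to the model dimension
        return out * non_pad_mask       # zero out padded positions
```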

Reference

Please cite the following paper if you use this code.

@article{zuo2020transformer,
  title={Transformer Hawkes Process},
  author={Zuo, Simiao and Jiang, Haoming and Li, Zichong and Zhao, Tuo and Zha, Hongyuan},
  journal={arXiv preprint arXiv:2002.09291},
  year={2020}
}
