
Adversarial Attacks on Node Embeddings via Graph Poisoning

Preliminary reference implementation of the attack proposed in the paper:

"Adversarial Attacks on Node Embeddings via Graph Poisoning",

Aleksandar Bojchevski and Stephan Günnemann, ICML 2019.

Requirements

  • gensim
  • tensorflow
  • scikit-learn (only needed for evaluation)

Example

The notebook example.ipynb demonstrates our general attack and compares it with the baselines.
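
The notebook is the authoritative reference for the actual attack API. As a rough illustration of the underlying idea (ranking candidate edge flips with eigenvalue perturbation theory instead of retraining the embedding for every candidate), the sketch below is a minimal, self-contained example and not the authors' implementation: top_flip_candidates, its signature, and the plain adjacency-spectrum score are assumptions made for illustration, whereas the paper scores flips against a generalized eigenproblem tied to the DeepWalk matrix factorization.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def top_flip_candidates(adj, candidates, n_flips, dim=32):
    """Rank candidate edge flips by a first-order estimate of how strongly
    each flip perturbs the leading spectrum of the adjacency matrix.

    adj        : symmetric scipy.sparse adjacency matrix with 0/1 entries
    candidates : (m, 2) integer array of node pairs eligible for flipping
    n_flips    : attacker's budget, i.e. number of flips to return
    dim        : number of leading eigenpairs used for the estimate
    """
    # Leading eigenpairs of the symmetric adjacency matrix of the clean graph.
    vals, vecs = spla.eigsh(adj.asfptype(), k=dim)

    i, j = candidates[:, 0], candidates[:, 1]
    # +1 if the flip adds the edge (i, j), -1 if it removes it.
    delta_w = 1.0 - 2.0 * np.asarray(adj[i, j]).ravel()
    # First-order change of eigenvalue lambda_k under a symmetric perturbation
    # of entries (i, j) and (j, i):  d lambda_k ~= 2 * delta_w * u_ik * u_jk.
    delta_vals = 2.0 * delta_w[:, None] * vecs[i] * vecs[j]
    # Score each candidate by the overall spectral disruption it causes.
    scores = np.linalg.norm(delta_vals, axis=1)
    return candidates[np.argsort(-scores)[:n_flips]]

The point this sketch shares with the paper's approach is that all candidate flips are scored from a single eigendecomposition of the clean graph, so the attacker never has to retrain the embedding per candidate.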

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{bojchevski2019adversarial,
  title     = {Adversarial Attacks on Node Embeddings via Graph Poisoning},
  author    = {Aleksandar Bojchevski and Stephan G{\"{u}}nnemann},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning, {ICML}},
  year      = {2019},
  series    = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
}
