# calotron

Transformer-based models to fast-simulate the LHCb ECAL detector


## Transformers

| Models | Generator | Implementation | Test | Design inspired by |
|:------:|:---------:|:--------------:|:----:|:------------------:|
| Transformer | | | | 1, 4 |
| OptionalTransformer | | | | 1, 4 |
| MaskedTransformer | | 🛠️ | | |
| GigaGenerator | | | | 5, 6 |
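The transformer generators above follow the encoder-decoder layout of ref. 1: an encoder self-attends over the source sequence, and a decoder combines causal self-attention on the target sequence with cross-attention to the encoder output. The sketch below wires that layout with plain Keras layers as an illustration only; it is *not* the calotron API, and the function name, argument names, and the track/cluster feature semantics are all assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers


def toy_transformer(source_depth=5, target_depth=4, d_model=32, num_heads=4, ff_units=64):
    # Hypothetical feature depths: source = reconstructed tracks,
    # target = ECAL clusters (assumed semantics, not the calotron I/O spec)
    source = tf.keras.Input(shape=(None, source_depth), name="source")
    target = tf.keras.Input(shape=(None, target_depth), name="target")

    # Encoder: self-attention over the source sequence
    enc = layers.Dense(d_model)(source)
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)(enc, enc)
    enc = layers.LayerNormalization()(enc + attn)

    # Decoder: causal self-attention on the target, then cross-attention
    # to the encoder output (the core of ref. 1)
    dec = layers.Dense(d_model)(target)
    self_attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)(
        dec, dec, use_causal_mask=True
    )
    dec = layers.LayerNormalization()(dec + self_attn)
    cross = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)(dec, enc)
    dec = layers.LayerNormalization()(dec + cross)

    # Position-wise feed-forward block and projection to the target depth
    ffn = layers.Dense(ff_units, activation="relu")(dec)
    ffn = layers.Dense(d_model)(ffn)
    dec = layers.LayerNormalization()(dec + ffn)
    return tf.keras.Model([source, target], layers.Dense(target_depth)(dec))
```

During adversarial training, the generator output would be scored by one of the discriminators below.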

## Discriminators

| Models | Algorithm | Implementation | Test | Design inspired by |
|:------:|:---------:|:--------------:|:----:|:------------------:|
| Discriminator | DeepSets | | | 2, 3 |
| PairwiseDiscriminator | DeepSets | | | 2, 3 |
| GNNDiscriminator | GNN | 🛠️ | | |
| GigaDiscriminator | Transformer | | | 5, 6, 7 |

## References

1. A. Vaswani et al., "Attention Is All You Need", [arXiv:1706.03762](https://arxiv.org/abs/1706.03762)
2. The ATLAS Collaboration, "Deep Sets based Neural Networks for Impact Parameter Flavour Tagging in ATLAS", ATL-PHYS-PUB-2020-014
3. M. Zaheer et al., "Deep Sets", [arXiv:1703.06114](https://arxiv.org/abs/1703.06114)
4. L. Liu et al., "Understanding the Difficulty of Training Transformers", [arXiv:2004.08249](https://arxiv.org/abs/2004.08249)
5. M. Kang et al., "Scaling up GANs for Text-to-Image Synthesis", [arXiv:2303.05511](https://arxiv.org/abs/2303.05511)
6. K. Lee et al., "ViTGAN: Training GANs with Vision Transformers", [arXiv:2107.04589](https://arxiv.org/abs/2107.04589)
7. H. Kim, G. Papamakarios and A. Mnih, "The Lipschitz Constant of Self-Attention", [arXiv:2006.04710](https://arxiv.org/abs/2006.04710)

## Credits

The Transformer implementation is freely inspired by the TensorFlow tutorial [Neural machine translation with a Transformer and Keras](https://www.tensorflow.org/text/tutorials/transformer) and the Keras tutorial [Image classification with Vision Transformer](https://keras.io/examples/vision/image_classification_with_vision_transformer/).