Both versions of EduBERT were trained using Hugging Face Transformers. The best way to use the models is to load their weights and config through the PyTorch implementation of Transformers, as sketched in the example below.
EduBERT (.tar.gz, ±388.7MB): https://storage.googleapis.com/edubert/edubert.tar.gz
DistilEduBERT (.tar.gz, ±235.5MB): https://storage.googleapis.com/edubert/distiledubert.tar.gz
The models are also available on the Hugging Face Hub: EduBERT, DistilEduBERT
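A minimal loading sketch with the PyTorch side of Transformers is shown below. The local directory name is an assumption; point it at wherever the downloaded archive was extracted, or pass the model's Hub identifier instead.

```python
# Minimal sketch: loading EduBERT with the PyTorch implementation of
# Hugging Face Transformers. The path below is an assumed extraction
# directory of edubert.tar.gz; adjust it, or use the Hub identifier.
from transformers import AutoConfig, AutoModel, AutoTokenizer

model_dir = "./edubert"  # assumed local directory with config + weights

config = AutoConfig.from_pretrained(model_dir)
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModel.from_pretrained(model_dir, config=config)

# Encode a sample sentence and run a forward pass to get hidden states.
inputs = tokenizer("A sample forum post from an online course.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```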
@inproceedings{edubert,
title={EduBERT: Pretrained Deep Language Models for Learning Analytics},
author={Clavi{\'e}, Benjamin and Gal, Kobi},
booktitle={Companion Proceedings of the Tenth International Conference on Learning Analytics And Knowledge},
year={2020}
}