# Accelerating Molecular Graph Neural Networks via Knowledge Distillation

This repository accompanies the paper (accepted at NeurIPS 2023):

Kelvinius, F. E.*, Georgiev, D.*, Toshev, A. P.*, & Gasteiger, J. (2023). Accelerating Molecular Graph Neural Networks via Knowledge Distillation.

## Citation

You can cite our work as follows:

```bibtex
@article{kelvinius2023accelerating,
  title={Accelerating Molecular Graph Neural Networks via Knowledge Distillation},
  author={Kelvinius, Filip Ekstr{\"o}m and Georgiev, Dimitar and Toshev, Artur Petrov and Gasteiger, Johannes},
  journal={arXiv preprint arXiv:2306.14818},
  year={2023}
}
```

If you find this repository useful, please consider leaving a star on GitHub.

## Acknowledgements