
A multilingual BERGAMOT model pre-trained on UMLS (version 2020AB) using a Graph Attention Network (GAT) encoder.

The model is described in the paper "Biomedical Entity Representation with Graph-Augmented Multi-Objective Transformer", accepted to Findings of NAACL 2024.

For the pretraining code, see our GitHub repository: https://github.com/Andoree/BERGAMOT.
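The checkpoint can be used like any BERT-style encoder: tokenize a biomedical entity mention, run the model, and pool the token embeddings into a single concept vector for nearest-neighbor linking against UMLS concept names. The pooling step can be sketched as below; this is a minimal illustration assuming mean pooling over non-padding tokens (the paper may instead use the [CLS] embedding), with toy arrays standing in for the model's hidden states:

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Average token embeddings, ignoring padding positions.

    hidden_states: (batch, seq_len, hidden) array of encoder outputs.
    attention_mask: (batch, seq_len) array of 1s (real tokens) and 0s (padding).
    """
    mask = attention_mask[..., None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid division by zero
    return summed / counts

# Toy example: batch of 1, four tokens (the last one is padding), hidden size 3.
hs = np.array([[[1.0, 1.0, 1.0],
                [2.0, 2.0, 2.0],
                [3.0, 3.0, 3.0],
                [9.0, 9.0, 9.0]]])   # padding row must not affect the result
am = np.array([[1, 1, 1, 0]])
vec = mean_pool(hs, am)  # → [[2. 2. 2.]]
```

In practice, `hidden_states` and `attention_mask` would come from running the tokenizer and model from the `transformers` library on an entity mention; the resulting vectors can then be compared by cosine similarity.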

Citation

@inproceedings{sakhovskiy-et-al-2024-bergamot,
    title = "Biomedical Entity Representation with Graph-Augmented Multi-Objective Transformer",
    author = "Sakhovskiy, Andrey and Semenova, Natalia and Kadurin, Artur and Tutubalina, Elena",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024",
    month = jun,
    year = "2024",
    address = "Mexico City, Mexico",
    publisher = "Association for Computational Linguistics",
}