fl399 committed
Commit c1f013f
1 Parent(s): 24d0ad9

Update README.md

Files changed (1)
  1. README.md +15 -5
README.md CHANGED
@@ -16,10 +16,20 @@ SapBERT by [Liu et al. (2020)](https://arxiv.org/pdf/2010.11784.pdf). Trained wi
 
 ### Citation
 ```bibtex
-@article{liu2020self,
-  title={Self-alignment Pre-training for Biomedical Entity Representations},
-  author={Liu, Fangyu and Shareghi, Ehsan and Meng, Zaiqiao and Basaldella, Marco and Collier, Nigel},
-  journal={arXiv preprint arXiv:2010.11784},
-  year={2020}
+@inproceedings{liu-etal-2021-self,
+    title = "Self-Alignment Pretraining for Biomedical Entity Representations",
+    author = "Liu, Fangyu and
+      Shareghi, Ehsan and
+      Meng, Zaiqiao and
+      Basaldella, Marco and
+      Collier, Nigel",
+    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
+    month = jun,
+    year = "2021",
+    address = "Online",
+    publisher = "Association for Computational Linguistics",
+    url = "https://www.aclweb.org/anthology/2021.naacl-main.334",
+    pages = "4228--4238",
+    abstract = "Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT and PubMedBERT, our pretraining scheme proves to be both effective and robust.",
 }
 ```
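For context on what the cited model does in practice, here is a minimal, hypothetical sketch of using a SapBERT-style encoder for biomedical entity-name embeddings with the `transformers` library, matching the entity-linking setup the abstract describes. The model ID, example strings, and the choice of the [CLS] vector are illustrative assumptions and do not come from this commit.

```python
# Hypothetical usage sketch: embed entity names with a SapBERT-style encoder
# and rank candidates by cosine similarity. Model ID and inputs are assumed.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

query = "covid-19"
candidates = ["coronavirus infection", "high blood pressure", "headache"]

inputs = tokenizer([query] + candidates, padding=True, truncation=True,
                   return_tensors="pt")
with torch.no_grad():
    # Take the [CLS] token representation as the entity embedding.
    embeddings = model(**inputs).last_hidden_state[:, 0, :]

# Cosine similarity between the query embedding and each candidate embedding.
scores = torch.nn.functional.cosine_similarity(embeddings[0:1], embeddings[1:])
print(candidates[int(scores.argmax())])  # expected: "coronavirus infection"
```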