FremyCompany committed
Commit eba56ab
Parent: 584ab2b

Update citation to ACL Anthology version

Files changed (1): README.md (+13, -9)
README.md CHANGED
@@ -36,15 +36,19 @@ This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentence
  This model accompanies the [BioLORD: Learning Ontological Representations from Definitions](https://arxiv.org/abs/2210.11892) paper, accepted in the EMNLP 2022 Findings. When you use this model, please cite the original paper as follows:
 
  ```latex
- @misc{https://doi.org/10.48550/arxiv.2210.11892,
-   title = {BioLORD: Learning Ontological Representations from Definitions (for Biomedical Concepts and their Textual Descriptions)},
-   author = {Remy, François and Demuynck, Kris and Demeester, Thomas},
-   url = {https://arxiv.org/abs/2210.11892},
-   doi = {10.48550/ARXIV.2210.11892},
-   keywords = {Computation and Language (cs.CL), Information Retrieval (cs.IR), FOS: Computer and information sciences, FOS: Computer and information sciences},
-   publisher = {arXiv},
-   year = {2022},
-   copyright = {Creative Commons Attribution 4.0 International}
+ @inproceedings{remy-etal-2022-biolord,
+   title = "{B}io{LORD}: Learning Ontological Representations from Definitions for Biomedical Concepts and their Textual Descriptions",
+   author = "Remy, François and
+     Demuynck, Kris and
+     Demeester, Thomas",
+   booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2022",
+   month = dec,
+   year = "2022",
+   address = "Abu Dhabi, United Arab Emirates",
+   publisher = "Association for Computational Linguistics",
+   url = "https://aclanthology.org/2022.findings-emnlp.104",
+   pages = "1454--1465",
+   abstract = "This work introduces BioLORD, a new pre-training strategy for producing meaningful representations for clinical sentences and biomedical concepts. State-of-the-art methodologies operate by maximizing the similarity in representation of names referring to the same concept, and preventing collapse through contrastive learning. However, because biomedical names are not always self-explanatory, it sometimes results in non-semantic representations. BioLORD overcomes this issue by grounding its concept representations using definitions, as well as short descriptions derived from a multi-relational knowledge graph consisting of biomedical ontologies. Thanks to this grounding, our model produces more semantic concept representations that match more closely the hierarchical structure of ontologies. BioLORD establishes a new state of the art for text similarity on both clinical sentences (MedSTS) and biomedical concepts (MayoSRS).",
  }
  ```