Add brief description and BibTeX to readme
README.md

---
tags:
- biomedical
- bioNLP
---

This is a version of [PubmedBERT](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext?text=%5BMASK%5D+is+a+tumor+suppressor+gene.) which has been domain-adapted (via additional pretraining) to a set of PubMed abstracts that likely discuss multiple-drug therapies. This model was the strongest contextualized encoder in the experiments in the paper ["A Dataset for N-ary Relation Extraction of Drug Combinations"](https://arxiv.org/abs/2205.02289).
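
As a minimal sketch of how such a checkpoint is typically loaded with the `transformers` library (the repo id below is the *base* PubmedBERT model linked above, used here as a stand-in for this adapted model's own Hub id):

```python
# Minimal usage sketch. The id below points at the base PubmedBERT checkpoint;
# substitute this model's own Hub repo id to use the domain-adapted weights.
from transformers import AutoModelForMaskedLM, AutoTokenizer

base_id = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForMaskedLM.from_pretrained(base_id)

# Masked-LM query in the style of the base model card's example
text = "[MASK] is a tumor suppressor gene."
inputs = tokenizer(text, return_tensors="pt")
logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)
```

The `logits` row at the `[MASK]` position can then be argmaxed (or top-k'd) over the vocabulary to inspect the model's fill-in candidates.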

If you use this model, please cite both:

```latex
@misc{pubmedbert,
  author = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
  title = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
  year = {2020},
  eprint = {arXiv:2007.15779},
}
```

and

```latex
@inproceedings{Tiktinsky2022ADF,
  title = "A Dataset for N-ary Relation Extraction of Drug Combinations",
  author = "Tiktinsky, Aryeh and Viswanathan, Vijay and Niezni, Danna and Meron Azagury, Dana and Shamay, Yosi and Taub-Tabib, Hillel and Hope, Tom and Goldberg, Yoav",
  booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
  month = jul,
  year = "2022",
  address = "Seattle, United States",
  publisher = "Association for Computational Linguistics",
  url = "https://aclanthology.org/2022.naacl-main.233",
  doi = "10.18653/v1/2022.naacl-main.233",
  pages = "3190--3203",
}
```