julien-c (HF staff) committed on
Commit 3dc4517
1 Parent(s): 04e9089

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/allenai/biomed_roberta_base/README.md

Files changed (1)
  1. README.md +38 -0
README.md ADDED

---
thumbnail: https://huggingface.co/front/thumbnails/allenai.png
---

# BioMed-RoBERTa-base

BioMed-RoBERTa-base is a language model based on the RoBERTa-base (Liu et al., 2019) architecture. We adapt RoBERTa-base to 2.68 million scientific papers from the [Semantic Scholar](https://www.semanticscholar.org) corpus via continued pretraining. This amounts to 7.55B tokens and 47GB of data. We use the full text of the papers in training, not just abstracts.
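
The weights load with the standard `transformers` auto classes; a minimal usage sketch (the repo id `allenai/biomed_roberta_base` matches the file-history link above):

```python
from transformers import AutoModel, AutoTokenizer

# Load the domain-adapted checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("allenai/biomed_roberta_base")
model = AutoModel.from_pretrained("allenai/biomed_roberta_base")
```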

Specific details of the adaptive pretraining procedure can be found in Gururangan et al., 2020.
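
For orientation only, continued (domain-adaptive) pretraining of this kind can be reproduced at small scale with the standard masked-language-modeling utilities in `transformers`. The sketch below is not the authors' pipeline: the corpus file `papers.txt`, the sequence length, the batch size, and the output directory are placeholder assumptions; see Gururangan et al., 2020 for the actual settings.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from the general-domain RoBERTa-base checkpoint.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")

# "papers.txt" stands in for an in-domain corpus (one document per line).
dataset = load_dataset("text", data_files={"train": "papers.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Dynamic masking at the usual 15% rate used in RoBERTa-style pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="biomed-roberta-cpt", per_device_train_batch_size=8),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()
```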

## Evaluation

BioMed-RoBERTa achieves performance competitive with state-of-the-art models on a number of NLP tasks in the biomedical domain (numbers are the mean (standard deviation) over 3+ random seeds).

| Task         | Task Type           | RoBERTa-base | BioMed-RoBERTa-base |
|--------------|---------------------|--------------|---------------------|
| RCT-180K     | Text Classification | 86.4 (0.3)   | 86.9 (0.2)          |
| ChemProt     | Relation Extraction | 81.1 (1.1)   | 83.0 (0.7)          |
| JNLPBA       | NER                 | 74.3 (0.2)   | 75.2 (0.1)          |
| BC5CDR       | NER                 | 85.6 (0.1)   | 87.8 (0.1)          |
| NCBI-Disease | NER                 | 86.6 (0.3)   | 87.1 (0.8)          |

More evaluations TBD.
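
Assuming the uploaded checkpoint ships with its masked-language-modeling head (unverified here), the `fill-mask` pipeline gives a quick qualitative probe of the domain adaptation; the example sentence is ours, not from the evaluation suite:

```python
from transformers import pipeline

# RoBERTa-style models use <mask> as the mask token.
fill_mask = pipeline("fill-mask", model="allenai/biomed_roberta_base")
print(fill_mask("The patient was treated with <mask> for the bacterial infection."))
```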

## Citation

If using this model, please cite the following paper:

```bibtex
@inproceedings{domains,
  author = {Suchin Gururangan and Ana Marasović and Swabha Swayamdipta and Kyle Lo and Iz Beltagy and Doug Downey and Noah A. Smith},
  title = {Don't Stop Pretraining: Adapt Language Models to Domains and Tasks},
  year = {2020},
  booktitle = {Proceedings of ACL},
}
```