---
widget:
- text: "low lung volumes, [MASK] pulmonary vascularity."
tags:
- fill-mask
- pytorch
- transformers
- bert
- biobert
- radbert
- language-model
- uncased
- radiology
- biomedical
datasets:
- wikipedia
- bookscorpus
- pubmed
- radreports
language:
  - en
license: mit
---

RadBERT is a BERT-style language model for radiology text: it was initialized from a BioBERT checkpoint and continually pre-trained on radiology reports.
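## Usage

Since the model is tagged `fill-mask`, it can be queried through the Hugging Face `pipeline` API. A minimal sketch, using the widget example above; the repository ID `StanfordAIMI/RadBERT` is an assumption, so substitute this model's actual ID:

```python
from transformers import pipeline

# Repository ID is an assumption; replace with this model card's actual path.
fill_mask = pipeline("fill-mask", model="StanfordAIMI/RadBERT")

# Predict the masked token in a radiology finding (the widget example above).
for pred in fill_mask("low lung volumes, [MASK] pulmonary vascularity."):
    print(f"{pred['token_str']:>15}  {pred['score']:.3f}")
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).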

## Citation

```bibtex
@article{chambon_cook_langlotz_2022,
  title={Improved fine-tuning of in-domain transformer model for inferring COVID-19 presence in multi-institutional radiology reports},
  author={Chambon, Pierre and Cook, Tessa S. and Langlotz, Curtis P.},
  journal={Journal of Digital Imaging},
  doi={10.1007/s10278-022-00714-8},
  year={2022}
}
```