
RadBERT was continuously pre-trained on radiology reports from a BioBERT initialization.
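As a masked-language model, RadBERT can be queried for fill-mask predictions via the `transformers` library. A minimal sketch, assuming the model is published under the `StanfordAIMI/RadBERT` identifier and that the example report sentence is purely illustrative:

```python
from transformers import pipeline

# Fill-mask pipeline over the RadBERT checkpoint (downloads weights on first use).
fill_mask = pipeline("fill-mask", model="StanfordAIMI/RadBERT")

# Hypothetical radiology-style sentence; [MASK] marks the token to predict.
sentence = "No evidence of pleural [MASK]."
for prediction in fill_mask(sentence):
    # Each prediction carries the filled-in token and its probability score.
    print(prediction["token_str"], round(prediction["score"], 3))
```

The same checkpoint can also be loaded with `AutoModel`/`AutoTokenizer` for fine-tuning on downstream report-classification tasks.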

Citation

@article{chambon_cook_langlotz_2022,
  title={Improved fine-tuning of in-domain transformer model for inferring COVID-19 presence in multi-institutional radiology reports},
  author={Chambon, Pierre and Cook, Tessa S. and Langlotz, Curtis P.},
  journal={Journal of Digital Imaging},
  year={2022},
  doi={10.1007/s10278-022-00714-8}
}
