
Using the DistilRoBERTa model as its starting point, the ClimateBERT language model was further pretrained on a text corpus comprising climate-related research paper abstracts, corporate and general news, and company reports. The underlying methodology is described in our language model research paper.
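Since the model is pretrained with a masked-language-modeling objective, it can be queried directly for masked-token prediction. The sketch below assumes the `transformers` library and a hypothetical repository ID (`climatebert/distilroberta-base-climate-f`); substitute the actual Hugging Face model ID shown on this card.

```python
# Minimal sketch of masked-token prediction with this model.
# Assumption: the repository ID below stands in for this card's
# actual Hugging Face model ID.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="climatebert/distilroberta-base-climate-f",
)

# As a RoBERTa-family model, the mask token is "<mask>".
predictions = fill_mask(
    "Emissions of greenhouse gases contribute to <mask> change."
)
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing the filled-in `token_str`, its probability `score`, and the completed `sequence`.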

BibTeX entry and citation info

@article{wkbl2021,
        title={ClimateBERT: A Pretrained Language Model for Climate-Related Text},
        author={Webersinke, Nicolas and Kraus, Mathias and Bingler, Julia and Leippold, Markus},
        journal={arXiv preprint arXiv:2110.12010},
        year={2021}
}