---
language:
- en
license: apache-2.0
---
[`bioformer-8L`](https://huggingface.co/bioformers/bioformer-8L) further pretrained for 100 epochs on 164,179 COVID-19 abstracts from the [LitCovid website](https://www.ncbi.nlm.nih.gov/research/coronavirus/).
In our evaluation, this continued pretraining improves performance on the multi-label COVID-19 topic classification task (BioCreative VII Track 5).
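As a minimal usage sketch, the checkpoint can be loaded with the Hugging Face `transformers` library. The repo id below is the base `bioformers/bioformer-8L` model linked above; substitute the repo id of this COVID-19 checkpoint, which is not stated in this card:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Repo id assumed: replace with this checkpoint's actual Hub repo id.
repo_id = "bioformers/bioformer-8L"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# Example masked-LM query on a COVID-19-related sentence.
inputs = tokenizer("COVID-19 is caused by a [MASK].", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```

For the topic classification task itself, the checkpoint would instead be loaded with `AutoModelForSequenceClassification` and fine-tuned with a multi-label head.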