---
license: cc-by-nc-sa-4.0
pipeline_tag: fill-mask
language: en
tags:
- biomedical
- long-documents
---

# Biomedical Longformer (base)

This is a derivative model based on the [microsoft/BiomedNLP-PubMedBERT-large-uncased-abstract](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-large-uncased-abstract) BERT model introduced in "Fine-Tuning Large Neural Language Models for Biomedical Natural Language Processing" by [Tinn et al. (2021)](https://arxiv.org/abs/2112.07869). All model parameters were cloned from the original model, while the positional embeddings were extended by copying the original embeddings multiple times, following [Beltagy et al. (2020)](https://arxiv.org/abs/2004.05150), using a Python script similar to [the Longformer conversion notebook](https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb).
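
The embedding-extension step can be sketched as follows. This is a simplified illustration of the tiling idea, not the exact conversion script: it repeatedly copies the original positional-embedding matrix until the new maximum sequence length is filled (the function name and the 512 → 4096 sizes are illustrative assumptions).

```python
import torch

def extend_position_embeddings(pos_emb: torch.Tensor, new_max_pos: int) -> torch.Tensor:
    """Tile the original positional embeddings until `new_max_pos` rows are filled.

    pos_emb: (old_max_pos, hidden_dim) weight matrix of the original model.
    Returns a (new_max_pos, hidden_dim) matrix made of repeated copies.
    """
    old_max_pos, dim = pos_emb.shape
    new_emb = pos_emb.new_empty(new_max_pos, dim)
    for start in range(0, new_max_pos, old_max_pos):
        # The last copy may be truncated if new_max_pos is not a multiple.
        chunk = min(old_max_pos, new_max_pos - start)
        new_emb[start:start + chunk] = pos_emb[:chunk]
    return new_emb

# Toy example: extend 512 positions to 4096 (eight copies).
orig = torch.randn(512, 768)
extended = extend_position_embeddings(orig, 4096)
```

In the actual conversion, the extended matrix would replace the model's `position_embeddings` weight and `max_position_embeddings` would be raised in the config; the copied embeddings are then typically fine-tuned so that each position learns a distinct representation.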