# longformer-base-4096-extra.pos.embd.only
This model is similar to `longformer-base-4096`, but it was pretrained with all RoBERTa weights frozen, training only the additional position embeddings, so the original RoBERTa weights are preserved.
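The Longformer paper initializes the extended position-embedding table by repeatedly copying RoBERTa's pretrained 512-position embeddings; in this variant, only that extended table is then trained while everything else stays frozen. A minimal NumPy sketch of the copy-initialization (the array names and random pretrained weights are illustrative, not the model's actual parameters):

```python
import numpy as np

# Illustrative stand-in for RoBERTa's pretrained position embeddings:
# 512 positions, hidden size 768 (random values here, real weights in practice).
old_len, new_len, dim = 512, 4096, 768
rng = np.random.default_rng(0)
roberta_pos = rng.standard_normal((old_len, dim))

# Initialize the 4096-position table by tiling the 512 pretrained rows.
# In this model variant, only this extended table would be trainable;
# all other RoBERTa weights remain frozen during pretraining.
extended_pos = np.concatenate([roberta_pos] * (new_len // old_len), axis=0)

print(extended_pos.shape)  # (4096, 768)
```

Each 512-row block of the new table starts out identical to the original embeddings, which gives the extended positions a sensible starting point before training.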
### Citing
If you use `Longformer` in your research, please cite [Longformer: The Long-Document Transformer](https://arxiv.org/abs/2004.05150).
```
@article{Beltagy2020Longformer,
  title={Longformer: The Long-Document Transformer},
  author={Iz Beltagy and Matthew E. Peters and Arman Cohan},
  journal={arXiv:2004.05150},
  year={2020},
}
```
`Longformer` is an open-source project developed by [the Allen Institute for Artificial Intelligence (AI2)](http://www.allenai.org).
AI2 is a non-profit institute whose mission is to contribute to humanity through high-impact AI research and engineering.