
longformer-base-4096-spanish

Longformer is a Transformer model for long documents.

longformer-base-4096-spanish is a BERT-like model started from a RoBERTa checkpoint (BERTIN in this case) and pre-trained for MLM on long documents (from BETO's all_wikis corpus). It supports sequences of up to 4,096 tokens!
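As a sketch of how the checkpoint could be used for fill-mask inference with the `transformers` library (the example sentence and the exact predictions are illustrative assumptions, not outputs taken from the model card):

```python
from transformers import pipeline

# Load a fill-mask pipeline with this checkpoint. The model is
# RoBERTa-based, so the mask token is "<mask>".
fill_mask = pipeline(
    "fill-mask",
    model="mrm8488/longformer-base-4096-spanish",
)

# Hypothetical example sentence; any Spanish text up to 4,096 tokens works.
preds = fill_mask("Lo mejor de Madrid es su <mask>.")
for p in preds:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the filled token (`token_str`), its probability (`score`), and the completed sequence.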

Longformer uses a combination of sliding-window (local) attention and global attention. Global attention is configured by the user based on the task, allowing the model to learn task-specific representations.
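The attention pattern described above can be sketched in plain Python. This is a didactic toy, not the model's implementation: the real Longformer computes the sparse pattern efficiently rather than materializing a dense matrix, and the window size and global positions here are made-up illustrative values.

```python
def longformer_attention_mask(seq_len, window, global_positions):
    """Build a boolean matrix mixing local (sliding-window) and global
    attention. mask[i][j] is True when token i may attend to token j."""
    g = set(global_positions)
    half = window // 2
    mask = [[False] * seq_len for _ in range(seq_len)]
    for i in range(seq_len):
        for j in range(seq_len):
            local = abs(i - j) <= half       # sliding-window neighborhood
            global_attn = i in g or j in g   # global tokens attend everywhere
                                             # and are attended by everyone
            mask[i][j] = local or global_attn
    return mask

# Toy example: 8 tokens, window of 4, with position 0
# (e.g. a CLS-like token) given global attention.
mask = longformer_attention_mask(8, 4, global_positions={0})
```

In a real task, the choice of global positions is task-specific; for example, classification typically makes only the leading special token global.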

This model was made following the research of Iz Beltagy, Matthew E. Peters, and Arman Cohan.

Citation

If you want to cite this model, you can use the following BibTeX entry:

@misc{mromero2022longformer-base-4096-spanish,
  title={Spanish LongFormer by Manuel Romero},
  author={Romero, Manuel},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
  howpublished={\url{https://huggingface.co/mrm8488/longformer-base-4096-spanish}},
  year={2022}
}