angel-poc committed
Commit 05b9dad
1 Parent(s): b61a0f8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -48,7 +48,7 @@ widget:
 </details>
 
 ## Model description
-The **longformer-base-4096-bne-es** is the [Longformer](https://huggingface.co/allenai/longformer-base-4096) version of the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) masked language model for the Spanish language. Using this kind of models, allows us to process larger contexts as input without needing to use additional aggregation strategies. The model started from the **roberta-base-bne** checkpoint and was pretrained for MLM on long documents from the National Library of Spain.
+The **longformer-base-4096-bne-es** is the [Longformer](https://huggingface.co/allenai/longformer-base-4096) version of the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) masked language model for the Spanish language. The use of these models allows us to process larger contexts as input without the need for additional aggregation strategies. The model started from the **roberta-base-bne** checkpoint and was pretrained for MLM on long documents from the National Library of Spain.
 
 The Longformer model uses a combination of sliding window (local) attention and global attention. Global attention is user-configured based on the task to allow the model to learn task-specific representations. Please refer to the original [paper](https://arxiv.org/abs/2004.05150) for more details on how to set global attention.
 
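For context on the paragraph above, here is a minimal sketch of setting global attention with the 🤗 Transformers Longformer classes. The model id `PlanTL-GOB-ES/longformer-base-4096-bne-es` is inferred from the README and not stated in this diff, so adjust it if the repository lives elsewhere. Marking only the first token as global is a common task-agnostic choice; task-specific setups mark other positions instead.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed model id, inferred from the README of this repo.
model_id = "PlanTL-GOB-ES/longformer-base-4096-bne-es"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = "El Archivo de Indias se encuentra en <mask>."
inputs = tokenizer(text, return_tensors="pt")

# 0 = local (sliding-window) attention, 1 = global attention.
# Giving the first token (<s>) global attention is a common default.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

with torch.no_grad():
    outputs = model(**inputs, global_attention_mask=global_attention_mask)

# Decode the most likely token at the <mask> position.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = outputs.logits[0, mask_idx].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```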