Pablogps committed
Commit 94a43c6
1 Parent(s): 3dff778

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -14,6 +14,8 @@ This is a **RoBERTa-base** model trained from scratch in Spanish.
 The training dataset is [mc4](https://huggingface.co/datasets/bertin-project/mc4-es-sampled), subsampling documents to a total of about 50 million examples. Sampling is random.
 This model takes the one using [sequence length 128](https://huggingface.co/bertin-project/bertin-base-random) and trains for 25,000 steps using sequence length 512.
 
+Please see our main [card](https://huggingface.co/bertin-project/bertin-roberta-base-spanish) for more information.
+
 This is part of the
 [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/), with TPU usage sponsored by Google.