Update README.md
README.md CHANGED
@@ -15,6 +15,8 @@ The training dataset is [mc4](https://huggingface.co/datasets/bertin-project/mc4
 
 This model takes the one using [sequence length 128](https://huggingface.co/bertin-project/bertin-base-gaussian) and trains during 25.000 steps using sequence length 512.
 
+Please see our main [card](https://huggingface.co/bertin-project/bertin-roberta-base-spanish) for more information.
+
 This is part of the
 [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
 
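As context for the card this commit updates, below is a minimal sketch of loading the 512-sequence-length checkpoint with the `transformers` library. The repo id `bertin-project/bertin-base-gaussian-exp-512seqlen` is an assumption inferred from the linked model names and is not stated in this diff.

```python
# Minimal sketch, assuming the checkpoint lives at the repo id below
# (inferred from the linked models, not confirmed by this commit).
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "bertin-project/bertin-base-gaussian-exp-512seqlen"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# RoBERTa-style masked-language-model demo in Spanish.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("El objetivo de la vida es <mask>."))
```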