Update README.md
README.md
CHANGED
@@ -12,6 +12,10 @@ inference: false
 # Italian T5-base 🇮🇹

+⚠️⚠️ REDIRECTION NOTICE ⚠️⚠️
+
+The contents of the repository `gsarti/t5-base-it` will be transferred to a new repository `gsarti/it5-base-oscar` on the Hugging Face Hub on **October 23rd, 2021**. Users looking for an improved version of the Italian T5 model can already use the checkpoint in the `gsarti/it5-base` repository (more details soon!).
+
 Created by [Gabriele Sarti](https://gsarti.com/) during the [Hugging Face community week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by [HuggingFace](https://huggingface.co/) with TPU usage sponsored by Google, for the project [PreTrain T5 for Italian](https://discuss.huggingface.co/t/pretrain-t5-for-italian/7425/4).

 This is notably the first sequence-to-sequence model pre-trained on the Italian language available on the 🤗 Hub. For people interested in studying the pre-training dynamics of this model, the repository [`t5-base-it-training`](https://huggingface.co/gsarti/t5-base-it-training/tree/main) contains Flax checkpoints for the whole pre-training process (saved every 2000 steps, 129 checkpoints, ~250 GB).
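For readers who want to try the model described in this card, a minimal usage sketch with 🤗 Transformers might look like the following. The repository id is taken from the card above (note the redirection notice: the repo may have moved), and the weight format is an assumption; if only Flax weights are published, pass `from_flax=True` or use `FlaxT5ForConditionalGeneration` instead.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Hypothetical sketch: load the pre-trained Italian T5 checkpoint from the Hub.
# If the repository only ships Flax weights, add from_flax=True here.
tokenizer = AutoTokenizer.from_pretrained("gsarti/t5-base-it")
model = T5ForConditionalGeneration.from_pretrained("gsarti/t5-base-it")

# The checkpoint is only pre-trained (presumably with T5's span-corruption
# objective), so we probe it by filling a masked span rather than solving a
# downstream task; fine-tuning is expected for real applications.
text = "La capitale dell'Italia è <extra_id_0>."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```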