gsarti committed
Commit 86c8c8e
Parent: 236a2ae

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -20,7 +20,7 @@ This model is released as part of the project ["IT5: Large-Scale Text-to-Text Pr
 
 ## Model variants
 
-This repository contains the checkpoints for the `base` version of the model. The model was trained for one epoch (1.05M steps) on the [Thoroughly Cleaned Italian mC4 Corpus](https://huggingface.co/datasets/gsarti/clean_mc4_it) (~41B words, ~275GB) using 🤗 Datasets and the `google/t5-v1_1-large` improved configuration. The training procedure is made available [on Github](https://github.com/gsarti/t5-flax-gcp).
+This repository contains the checkpoints for the `base` version of the model. The model was trained for one epoch (1.05M steps) on the [Thoroughly Cleaned Italian mC4 Corpus](https://huggingface.co/datasets/gsarti/clean_mc4_it) (~41B words, ~275GB) using 🤗 Datasets and the `google/t5-v1_1-large` improved configuration. The training procedure is made available [on Github](https://github.com/gsarti/t5-flax-gcp).
 
 The following table summarizes the parameters for all available models
 
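As a rough illustration of the resources mentioned in the changed line, the sketch below streams the cleaned Italian mC4 corpus and loads a `base` checkpoint with the 🤗 libraries. The repository id `gsarti/it5-base` and the `full` dataset configuration are assumptions made for this example, not names taken from the commit itself.

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Stream the corpus referenced in the README diff; streaming avoids
# downloading the full ~275GB locally. The "full" config name is an assumption.
corpus = load_dataset("gsarti/clean_mc4_it", "full", split="train", streaming=True)

# "gsarti/it5-base" is a hypothetical repository id for the `base` checkpoint.
tokenizer = AutoTokenizer.from_pretrained("gsarti/it5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("gsarti/it5-base")

# Inspect a single training example from the stream.
print(next(iter(corpus)))
```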