# ALBERT XXLarge Spanish
This is an ALBERT XXLarge model trained on a large Spanish corpus. The model was trained on a single TPU v3-8 with the following hyperparameters and steps/time:
- LR: 0.0003125
- Batch Size: 128
- Warmup ratio: 0.00078125
- Warmup steps: 3125
- Goal steps: 4000000
- Total steps: 1650000
- Total training time (approx.): 70.7 days
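
As a quick sanity check on the numbers above, the warmup ratio matches the warmup steps divided by the goal steps (3125 / 4000000 = 0.00078125). Below is a minimal usage sketch with the Transformers fill-mask pipeline; the repository id `dccuchile/albert-xxlarge-spanish` and the example sentence are assumptions and should be replaced with this model's actual Hub id.

```python
from transformers import pipeline

# Assumed Hub repository id for this model; adjust if the actual id differs.
model_id = "dccuchile/albert-xxlarge-spanish"

# Build a masked-language-modeling pipeline (downloads model and tokenizer).
fill_mask = pipeline("fill-mask", model=model_id)

# Predict the masked token in a Spanish sentence and print the top candidates.
for prediction in fill_mask("El tiempo hoy está muy [MASK]."):
    print(prediction["token_str"], prediction["score"])
```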
## Training loss

(Training loss curve plot.)