# ALBERT Base Spanish

This is an [ALBERT](https://github.com/google-research/albert) model trained on a [large Spanish corpus](https://github.com/josecannete/spanish-corpora).
The model was trained on a single TPU v3-8 with the following hyperparameters and step/time budget:
- LR: 0.0008838834765
- Batch Size: 960
- Warmup ratio: 0.00625
- Warmup steps: 53333.33333
- Goal steps: 8533333.333
- Total steps: 3650000
- Total training time (approx.): 70.4 days
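
## Usage

The checkpoint can be loaded with the Hugging Face Transformers library. Below is a minimal fill-mask sketch; the model id shown is a placeholder, since this card does not state the exact Hub repository name, so substitute the actual one.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hypothetical model id -- replace with the real Hub repository name.
model_id = "your-org/albert-base-spanish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask example: predict the masked token in a Spanish sentence.
text = f"La capital de Chile es {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and decode the highest-scoring token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```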

## Training loss

![Training loss curve](https://drive.google.com/uc?export=view&id=1IsxcgMwd7Hl-3bSnNl8W9jUrHJeHtZql)