---
language:
- es
tags:
- albert
- spanish
- OpenCENIA
datasets:
- large_spanish_corpus
---

# ALBERT Tiny Spanish

This is an [ALBERT](https://github.com/google-research/albert) model trained on a [big Spanish corpora](https://github.com/josecannete/spanish-corpora). The model was trained on a single TPU v3-8 with the following hyperparameters and steps/time:

- LR: 0.00125
- Batch Size: 2048
- Warmup ratio: 0.0125
- Warmup steps: 125000
- Goal steps: 10000000
- Total steps: 8300000
- Total training time (approx.): 58.2 days

## Training loss

![https://drive.google.com/uc?export=view&id=1KQc8yWZLKvDLjBtu4IOAgpTx0iLcvX_Q](https://drive.google.com/uc?export=view&id=1KQc8yWZLKvDLjBtu4IOAgpTx0iLcvX_Q)
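
## Example usage

A minimal sketch of loading this model for masked-language-modeling inference with the 🤗 Transformers library. The repository id `CenIA/albert-tiny-spanish` used below is an assumption, not confirmed by this card; replace it with the actual Hugging Face id of this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed repository id; substitute this model's real Hugging Face id.
model_id = "CenIA/albert-tiny-spanish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in a masked token in a Spanish sentence.
text = f"La capital de Chile es {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```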