---
library_name: transformers
license: apache-2.0
base_model: EuroBERT/EuroBERT-210m
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: eurobert210m_Mobilite_v1
  results: []
---

# eurobert210m_Mobilite_v1

This model is a fine-tuned version of [EuroBERT/EuroBERT-210m](https://huggingface.co/EuroBERT/EuroBERT-210m) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0135
- Accuracy: 0.9942
- F1: 0.9942

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them appears at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch implementation, `OptimizerNames.ADAMW_TORCH`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.2363        | 1.0   | 124  | 0.8864          | 0.6790   | 0.6323 |
| 0.588         | 2.0   | 248  | 0.3959          | 0.8901   | 0.8796 |
| 0.3365        | 3.0   | 372  | 0.1947          | 0.9506   | 0.9502 |
| 0.2177        | 4.0   | 496  | 0.1856          | 0.9549   | 0.9540 |
| 0.1708        | 5.0   | 620  | 0.1072          | 0.9786   | 0.9786 |
| 0.1455        | 6.0   | 744  | 0.1288          | 0.9713   | 0.9716 |
| 0.1217        | 7.0   | 868  | 0.0800          | 0.9836   | 0.9837 |
| 0.0986        | 8.0   | 992  | 0.0599          | 0.9874   | 0.9874 |
| 0.0735        | 9.0   | 1116 | 0.0480          | 0.9892   | 0.9892 |
| 0.0577        | 10.0  | 1240 | 0.0305          | 0.9922   | 0.9922 |
| 0.0619        | 11.0  | 1364 | 0.0475          | 0.9897   | 0.9897 |
| 0.0449        | 12.0  | 1488 | 0.0991          | 0.9816   | 0.9814 |
| 0.0566        | 13.0  | 1612 | 0.0215          | 0.9932   | 0.9932 |
| 0.0473        | 14.0  | 1736 | 0.0228          | 0.9939   | 0.9939 |
| 0.0344        | 15.0  | 1860 | 0.0336          | 0.9922   | 0.9922 |
| 0.04          | 16.0  | 1984 | 0.0426          | 0.9909   | 0.9909 |
| 0.0353        | 17.0  | 2108 | 0.0191          | 0.9945   | 0.9945 |
| 0.0448        | 18.0  | 2232 | 0.0193          | 0.9932   | 0.9932 |
| 0.0359        | 19.0  | 2356 | 0.0184          | 0.9942   | 0.9942 |
| 0.0314        | 20.0  | 2480 | 0.0146          | 0.9942   | 0.9942 |
| 0.0257        | 21.0  | 2604 | 0.0173          | 0.9942   | 0.9942 |
| 0.0208        | 22.0  | 2728 | 0.0144          | 0.9942   | 0.9942 |
| 0.0334        | 23.0  | 2852 | 0.0135          | 0.9942   | 0.9942 |

### Framework versions

- Transformers 4.48.3
- PyTorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
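
## How to use

The card does not yet document the task, label set, or Hub path, so the following is a minimal inference sketch rather than a verified recipe. The model id (`your-username/eurobert210m_Mobilite_v1`), the example text, and the assumption of single-label sequence classification are all placeholders; `trust_remote_code=True` is passed because EuroBERT ships a custom architecture.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "your-username/eurobert210m_Mobilite_v1"  # hypothetical Hub path

# EuroBERT is a custom architecture, so remote code must be trusted.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, trust_remote_code=True
)
model.eval()

text = "Exemple de texte à classer."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Label names fall back to LABEL_0, LABEL_1, ... unless id2label was set
# at training time; the card does not document the actual classes.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```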
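
## Reproducing the training setup

As referenced above, this is a sketch of `TrainingArguments` matching the listed hyperparameters, not the exact configuration used. Assumptions: the listed batch sizes are per-device, "Native AMP" maps to `fp16=True` (it could equally have been `bf16`), evaluation ran once per epoch (inferred from the results table), and the `output_dir` value is a guess based on the model name.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="eurobert210m_Mobilite_v1",  # assumed from the model name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",        # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,       # the results table stops at epoch 23, which
                                # suggests (but does not confirm) early stopping
    fp16=True,                  # "Native AMP" (assumed fp16 rather than bf16)
    eval_strategy="epoch",      # validation metrics were logged each epoch
)
```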