---
license: apache-2.0
base_model: distilbert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: 20231008-5-distilbert-base-multilingual-cased-new
  results: []
---

# 20231008-5-distilbert-base-multilingual-cased-new

This model is a fine-tuned version of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Accuracy: 0.5614
- Loss: 1.9964

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 2.9949        | 1.82  | 200  | 0.3868   | 2.5810          |
| 2.5887        | 3.64  | 400  | 0.4463   | 2.5529          |
| 2.3369        | 5.45  | 600  | 0.4665   | 2.4076          |
| 2.2815        | 7.27  | 800  | 0.5133   | 2.2435          |
| 2.1494        | 9.09  | 1000 | 0.5000   | 2.1755          |
| 2.0746        | 10.91 | 1200 | 0.5523   | 1.9893          |
| 1.9617        | 12.73 | 1400 | 0.5648   | 1.8462          |
| 1.9549        | 14.55 | 1600 | 0.5392   | 1.8725          |
| 1.9192        | 16.36 | 1800 | 0.5605   | 2.0018          |
| 1.8967        | 18.18 | 2000 | 0.6077   | 1.7557          |
| 1.8990        | 20.0  | 2200 | 0.5614   | 1.9964          |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
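
As a rough illustration only, the hyperparameters listed above could be expressed as a `transformers.TrainingArguments` configuration along the lines of the sketch below. The output directory and the evaluation schedule (every 200 steps, inferred from the "Step" column of the results table) are assumptions, and `Trainer` uses AdamW rather than plain Adam by default.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration listed above.
# output_dir, evaluation_strategy, and eval_steps are assumptions; evaluating
# every 200 steps is inferred from the "Step" column of the results table.
training_args = TrainingArguments(
    output_dir="20231008-5-distilbert-base-multilingual-cased-new",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="steps",
    eval_steps=200,
)
```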
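
Because the card reports accuracy, the checkpoint is presumably a sequence-classification head on top of DistilBERT; the snippet below is a minimal loading sketch under that assumption. The label set is unknown (the training data is not documented), so only the predicted class index is printed, and the Hub path shown omits any user or organization namespace.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: the model was fine-tuned for sequence classification
# (the accuracy metric suggests a classification task). The repository
# path below is illustrative and may need a namespace prefix.
model_id = "20231008-5-distilbert-base-multilingual-cased-new"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index; label names are unknown
```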