---
license: apache-2.0
base_model: distilbert-base-multilingual-cased
tags:
- pytorch
- amazon-rating
- DistilBERTForSequenceClassification
- generated_from_trainer
metrics:
- accuracy
- matthews_correlation
model-index:
- name: distilbert-base-amazon-multi
  results: []
datasets:
- mteb/amazon_reviews_multi
language:
- en
- de
- es
- fr
- ja
- zh
library_name: transformers
pipeline_tag: text-classification
---

# distilbert-base-amazon-multi

This model is a fine-tuned version of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) on the mteb/amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9292
- Accuracy: 0.6055
- Matthews Correlation: 0.5072

## Training procedure

The model was fine-tuned on Google Colab using a single **NVIDIA V100** GPU with 16 GB of VRAM. The full run of 100,000 training steps took around 13 hours.

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch is given at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 320
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 100000

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy | Matthews Correlation |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:--------------------:|
| 1.0008        | 0.26  | 10000  | 1.0027          | 0.5616   | 0.4520               |
| 0.9545        | 0.51  | 20000  | 0.9705          | 0.5810   | 0.4788               |
| 0.9216        | 0.77  | 30000  | 0.9415          | 0.5883   | 0.4868               |
| 0.8765        | 1.03  | 40000  | 0.9495          | 0.5891   | 0.4871               |
| 0.8837        | 1.28  | 50000  | 0.9254          | 0.5992   | 0.4997               |
| 0.8753        | 1.54  | 60000  | 0.9199          | 0.6014   | 0.5029               |
| 0.8572        | 1.8   | 70000  | 0.9108          | 0.6090   | 0.5117               |
| 0.7851        | 2.05  | 80000  | 0.9276          | 0.6052   | 0.5066               |
| 0.7918        | 2.31  | 90000  | 0.9292          | 0.6055   | 0.5072               |
| 0.793         | 2.57  | 100000 | 0.9288          | 0.6064   | 0.5084               |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
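
## How to use

A minimal inference sketch using the Transformers `pipeline` API. The repo id below is a placeholder for wherever this checkpoint is hosted, and the label names depend on the `id2label` mapping exported with the model (the mteb/amazon_reviews_multi task has five rating classes):

```python
from transformers import pipeline

# Placeholder repo id; replace with the actual Hub path of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="<user>/distilbert-base-amazon-multi",
)

# The model was trained on all six supported languages (en, de, es, fr, ja, zh).
print(classifier("Der Akku war schon nach einem Tag leer. Sehr enttäuschend."))
# -> [{'label': ..., 'score': ...}], where the label corresponds to one of the
#    five review-rating classes, depending on the exported id2label mapping.
```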
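
## Reproducing the training setup

The hyperparameters listed above map onto `transformers.TrainingArguments` roughly as follows. Only the values reported in this card are taken from the actual run; `output_dir`, the evaluation/checkpoint cadence, and the use of mixed precision are illustrative assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-amazon-multi",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=320,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    max_steps=100_000,
    evaluation_strategy="steps",  # assumed: the results table reports metrics every 10,000 steps
    eval_steps=10_000,            # assumed
    save_steps=10_000,            # assumed
    fp16=True,                    # assumed: common choice on a V100, not stated in the card
)
```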
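
The reported accuracy and Matthews correlation can be computed with the `evaluate` library. The exact `compute_metrics` function used for this run is not included in the card, so the sketch below is only an illustration of how the two metrics are typically wired into the `Trainer`:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
matthews = evaluate.load("matthews_correlation")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair as passed by transformers.Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        **accuracy.compute(predictions=predictions, references=labels),
        **matthews.compute(predictions=predictions, references=labels),
    }
```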