# intent_classification_model_roberta
This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0233
- Accuracy: 0.9966
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
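The hyperparameters above can be sketched as a `transformers.TrainingArguments` configuration. This is a reconstruction, not the original training script: the output path and the Adam betas/epsilon shown are taken from the list above, while anything else (e.g. logging or evaluation cadence) is left at library defaults.

```python
from transformers import TrainingArguments

# Config sketch mirroring the hyperparameters listed in this card.
# Dataset, label set, and evaluation schedule are not documented here.
args = TrainingArguments(
    output_dir="intent_classification_model_roberta",  # assumed path
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```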
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
2.6634 | 0.3680 | 712 | 0.7704 | 0.8192 |
1.4557 | 0.7359 | 1424 | 0.3484 | 0.9357 |
0.8559 | 1.1039 | 2136 | 0.2281 | 0.9451 |
0.7125 | 1.4718 | 2848 | 0.1856 | 0.9506 |
0.6141 | 1.8398 | 3560 | 0.1342 | 0.9625 |
0.5507 | 2.2078 | 4272 | 0.1166 | 0.9675 |
0.4741 | 2.5757 | 4984 | 0.0879 | 0.9758 |
0.4507 | 2.9437 | 5696 | 0.0634 | 0.9847 |
0.397 | 3.3116 | 6408 | 0.0749 | 0.9797 |
0.3525 | 3.6796 | 7120 | 0.0616 | 0.9834 |
0.3465 | 4.0475 | 7832 | 0.0443 | 0.9874 |
0.2644 | 4.4155 | 8544 | 0.0493 | 0.9859 |
0.2758 | 4.7835 | 9256 | 0.0395 | 0.9893 |
0.2839 | 5.1514 | 9968 | 0.0539 | 0.9827 |
0.2045 | 5.5194 | 10680 | 0.0296 | 0.9923 |
0.2013 | 5.8873 | 11392 | 0.0218 | 0.9946 |
0.169 | 6.2553 | 12104 | 0.0275 | 0.9934 |
0.1676 | 6.6233 | 12816 | 0.0255 | 0.9935 |
0.1743 | 6.9912 | 13528 | 0.0227 | 0.9944 |
0.1216 | 7.3592 | 14240 | 0.0262 | 0.9935 |
0.1295 | 7.7271 | 14952 | 0.0243 | 0.9945 |
0.133 | 8.0951 | 15664 | 0.0204 | 0.9957 |
0.1035 | 8.4630 | 16376 | 0.0234 | 0.9954 |
0.1053 | 8.8310 | 17088 | 0.0285 | 0.9933 |
0.0987 | 9.1990 | 17800 | 0.0264 | 0.9952 |
0.0841 | 9.5669 | 18512 | 0.0202 | 0.9966 |
0.0798 | 9.9349 | 19224 | 0.0232 | 0.9957 |
0.0812 | 10.3028 | 19936 | 0.0173 | 0.9967 |
0.0701 | 10.6708 | 20648 | 0.0216 | 0.9965 |
0.0678 | 11.0388 | 21360 | 0.0173 | 0.9969 |
0.0504 | 11.4067 | 22072 | 0.0208 | 0.9959 |
0.0521 | 11.7747 | 22784 | 0.0227 | 0.9960 |
0.0592 | 12.1426 | 23496 | 0.0247 | 0.9960 |
0.0364 | 12.5106 | 24208 | 0.0211 | 0.9967 |
0.0378 | 12.8786 | 24920 | 0.0201 | 0.9966 |
0.0298 | 13.2465 | 25632 | 0.0202 | 0.9967 |
0.0286 | 13.6145 | 26344 | 0.0220 | 0.9967 |
0.0322 | 13.9824 | 27056 | 0.0193 | 0.9970 |
0.0207 | 14.3504 | 27768 | 0.0212 | 0.9964 |
0.0274 | 14.7183 | 28480 | 0.0257 | 0.9961 |
0.0201 | 15.0863 | 29192 | 0.0211 | 0.9971 |
0.0143 | 15.4543 | 29904 | 0.0245 | 0.9966 |
0.0168 | 15.8222 | 30616 | 0.0242 | 0.9965 |
0.021 | 16.1902 | 31328 | 0.0216 | 0.9966 |
0.0097 | 16.5581 | 32040 | 0.0223 | 0.9968 |
0.0172 | 16.9261 | 32752 | 0.0210 | 0.9967 |
0.0146 | 17.2941 | 33464 | 0.0224 | 0.9967 |
0.0125 | 17.6620 | 34176 | 0.0237 | 0.9966 |
0.0086 | 18.0300 | 34888 | 0.0232 | 0.9965 |
0.0077 | 18.3979 | 35600 | 0.0224 | 0.9969 |
0.0037 | 18.7659 | 36312 | 0.0235 | 0.9964 |
0.0046 | 19.1339 | 37024 | 0.0223 | 0.9965 |
0.0038 | 19.5018 | 37736 | 0.0232 | 0.9966 |
0.0058 | 19.8698 | 38448 | 0.0233 | 0.9966 |
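As a sanity check on the table, the logged step/epoch ratio pins down the approximate training-set size. Assuming a per-device batch size of 24 with no gradient accumulation (per the hyperparameters above), the final row implies roughly 1935 optimizer steps per epoch, i.e. about 46,000 training examples:

```python
# Recover the approximate training-set size from the logged step/epoch ratio.
# Assumes batch size 24 and no gradient accumulation, as listed in this card.
total_steps = 38448    # final logged step
final_epoch = 19.8698  # epoch value at that step
train_batch_size = 24

steps_per_epoch = round(total_steps / final_epoch)
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)  # 1935 46440
```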
### Framework versions
- Transformers 4.41.2
- Pytorch 2.0.0+cu117
- Datasets 2.20.0
- Tokenizers 0.19.1