# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of distilbert-base-uncased on the clinc_oos dataset. It achieves the following results on the evaluation set:
- Loss: 0.2695
- Accuracy: 0.9458
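A minimal inference sketch for this checkpoint, assuming network access to the Hugging Face Hub; the example utterance is illustrative and not taken from the dataset.

```python
from transformers import pipeline

# Load the fine-tuned intent classifier from the Hub and classify one utterance.
classifier = pipeline(
    "text-classification",
    model="hxstar/distilbert-base-uncased-finetuned-clinc",
)
print(classifier("how do i transfer money to my savings account"))
```

The pipeline returns a list of dicts with `label` (the predicted CLINC intent) and `score` fields.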
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 384
- eval_batch_size: 384
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
No log | 1.0 | 40 | 4.7106 | 0.1761 |
No log | 2.0 | 80 | 4.1216 | 0.5903 |
No log | 3.0 | 120 | 3.5380 | 0.7252 |
No log | 4.0 | 160 | 2.9932 | 0.7829 |
No log | 5.0 | 200 | 2.4860 | 0.8232 |
No log | 6.0 | 240 | 2.0342 | 0.8523 |
No log | 7.0 | 280 | 1.6488 | 0.8865 |
3.1704 | 8.0 | 320 | 1.3241 | 0.9019 |
3.1704 | 9.0 | 360 | 1.0621 | 0.9097 |
3.1704 | 10.0 | 400 | 0.8619 | 0.9232 |
3.1704 | 11.0 | 440 | 0.7078 | 0.9313 |
3.1704 | 12.0 | 480 | 0.5935 | 0.9339 |
3.1704 | 13.0 | 520 | 0.5085 | 0.9348 |
3.1704 | 14.0 | 560 | 0.4453 | 0.9368 |
3.1704 | 15.0 | 600 | 0.3983 | 0.9410 |
0.6354 | 16.0 | 640 | 0.3701 | 0.9416 |
0.6354 | 17.0 | 680 | 0.3489 | 0.9400 |
0.6354 | 18.0 | 720 | 0.3330 | 0.9413 |
0.6354 | 19.0 | 760 | 0.3165 | 0.9435 |
0.6354 | 20.0 | 800 | 0.3047 | 0.9439 |
0.6354 | 21.0 | 840 | 0.3017 | 0.9403 |
0.6354 | 22.0 | 880 | 0.2936 | 0.9419 |
0.6354 | 23.0 | 920 | 0.2902 | 0.9419 |
0.1474 | 24.0 | 960 | 0.2868 | 0.9432 |
0.1474 | 25.0 | 1000 | 0.2805 | 0.9442 |
0.1474 | 26.0 | 1040 | 0.2799 | 0.9439 |
0.1474 | 27.0 | 1080 | 0.2783 | 0.9452 |
0.1474 | 28.0 | 1120 | 0.2750 | 0.9435 |
0.1474 | 29.0 | 1160 | 0.2736 | 0.9442 |
0.1474 | 30.0 | 1200 | 0.2754 | 0.9435 |
0.1474 | 31.0 | 1240 | 0.2709 | 0.9448 |
0.0762 | 32.0 | 1280 | 0.2721 | 0.9445 |
0.0762 | 33.0 | 1320 | 0.2714 | 0.9448 |
0.0762 | 34.0 | 1360 | 0.2708 | 0.9455 |
0.0762 | 35.0 | 1400 | 0.2687 | 0.9455 |
0.0762 | 36.0 | 1440 | 0.2693 | 0.9465 |
0.0762 | 37.0 | 1480 | 0.2688 | 0.9452 |
0.0762 | 38.0 | 1520 | 0.2696 | 0.9455 |
0.0762 | 39.0 | 1560 | 0.2692 | 0.9455 |
0.0572 | 40.0 | 1600 | 0.2695 | 0.9458 |
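As a sanity check, the step counts in the table are consistent with the batch size above, assuming the clinc_oos "plus" configuration, whose train split has 15,250 examples (an assumption; the card does not state which configuration was used).

```python
import math

# Assumption: clinc_oos "plus" train split, 15,250 examples.
train_examples = 15_250
train_batch_size = 384

steps_per_epoch = math.ceil(train_examples / train_batch_size)
total_steps = steps_per_epoch * 40  # 40 epochs

print(steps_per_epoch)  # 40 optimizer steps per epoch, matching the table
print(total_steps)      # 1600 steps after epoch 40, matching the table
```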
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.14.1