
tinybert_29_med_intents

This model is a fine-tuned version of prajjwal1/bert-tiny on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3344
  • Accuracy: 0.9199
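
For reference, a minimal inference sketch is shown below. It assumes the checkpoint is published as a sequence-classification model under the Hub id m-aliabbas1/tinybert_29_med_intents and that the intent label names were saved in the model config; the example utterance is hypothetical, since the training data is not documented.

```python
# Minimal sketch: load the fine-tuned checkpoint for intent classification.
# Assumes a sequence-classification model at "m-aliabbas1/tinybert_29_med_intents"
# with id2label stored in its config (both assumptions, not documented facts).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="m-aliabbas1/tinybert_29_med_intents",
)

# Hypothetical utterance; the actual intent labels depend on the
# (undocumented) training data.
print(classifier("I need to refill my blood pressure prescription."))
```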

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
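
As an illustration, the sketch below shows how this configuration could be reproduced with the Transformers Trainer API. The number of labels (29, suggested by the model name) and the output directory are assumptions, and the tokenized train/eval datasets are not shown because the training data is undocumented.

```python
# Sketch reproducing the hyperparameter configuration listed above.
from transformers import AutoModelForSequenceClassification, TrainingArguments

# num_labels=29 is an assumption based on the model name "tinybert_29_med_intents".
model = AutoModelForSequenceClassification.from_pretrained(
    "prajjwal1/bert-tiny", num_labels=29
)

args = TrainingArguments(
    output_dir="tinybert_29_med_intents",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",
    # The Adam betas=(0.9, 0.999) and epsilon=1e-08 listed above match the
    # Trainer's default optimizer settings, so no extra arguments are needed.
)

# A Trainer(model=model, args=args, ...) would additionally require the
# tokenized train/eval datasets, which this card does not document.
```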

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|---------------|-------|-------|-----------------|----------|
| No log        | 1.0   | 430   | 2.8232          | 0.4144   |
| 3.1061        | 2.0   | 860   | 2.4220          | 0.4890   |
| 2.6532        | 3.0   | 1290  | 2.0921          | 0.5967   |
| 2.28          | 4.0   | 1720  | 1.8178          | 0.6878   |
| 1.9726        | 5.0   | 2150  | 1.5987          | 0.7431   |
| 1.7268        | 6.0   | 2580  | 1.4221          | 0.7569   |
| 1.5454        | 7.0   | 3010  | 1.2797          | 0.7762   |
| 1.5454        | 8.0   | 3440  | 1.1608          | 0.7818   |
| 1.3826        | 9.0   | 3870  | 1.0589          | 0.8039   |
| 1.2445        | 10.0  | 4300  | 0.9737          | 0.8177   |
| 1.1266        | 11.0  | 4730  | 0.8920          | 0.8343   |
| 1.0328        | 12.0  | 5160  | 0.8279          | 0.8398   |
| 0.9528        | 13.0  | 5590  | 0.7646          | 0.8453   |
| 0.8538        | 14.0  | 6020  | 0.7186          | 0.8564   |
| 0.8538        | 15.0  | 6450  | 0.6733          | 0.8619   |
| 0.7987        | 16.0  | 6880  | 0.6347          | 0.8812   |
| 0.7367        | 17.0  | 7310  | 0.5945          | 0.8840   |
| 0.6931        | 18.0  | 7740  | 0.5674          | 0.8950   |
| 0.6339        | 19.0  | 8170  | 0.5429          | 0.9061   |
| 0.606         | 20.0  | 8600  | 0.5132          | 0.9033   |
| 0.5647        | 21.0  | 9030  | 0.4991          | 0.9061   |
| 0.5647        | 22.0  | 9460  | 0.4709          | 0.9033   |
| 0.5375        | 23.0  | 9890  | 0.4642          | 0.9116   |
| 0.4961        | 24.0  | 10320 | 0.4421          | 0.9116   |
| 0.4695        | 25.0  | 10750 | 0.4390          | 0.9088   |
| 0.4499        | 26.0  | 11180 | 0.4126          | 0.9088   |
| 0.4315        | 27.0  | 11610 | 0.4149          | 0.9088   |
| 0.4005        | 28.0  | 12040 | 0.4036          | 0.9116   |
| 0.4005        | 29.0  | 12470 | 0.3938          | 0.9033   |
| 0.3929        | 30.0  | 12900 | 0.3846          | 0.9061   |
| 0.3707        | 31.0  | 13330 | 0.3856          | 0.9116   |
| 0.369         | 32.0  | 13760 | 0.3727          | 0.9088   |
| 0.3517        | 33.0  | 14190 | 0.3739          | 0.9088   |
| 0.3355        | 34.0  | 14620 | 0.3604          | 0.9088   |
| 0.3226        | 35.0  | 15050 | 0.3518          | 0.9144   |
| 0.3226        | 36.0  | 15480 | 0.3570          | 0.9116   |
| 0.3197        | 37.0  | 15910 | 0.3502          | 0.9144   |
| 0.3038        | 38.0  | 16340 | 0.3463          | 0.9144   |
| 0.3038        | 39.0  | 16770 | 0.3448          | 0.9116   |
| 0.2918        | 40.0  | 17200 | 0.3448          | 0.9144   |
| 0.2937        | 41.0  | 17630 | 0.3460          | 0.9144   |
| 0.2845        | 42.0  | 18060 | 0.3414          | 0.9199   |
| 0.2845        | 43.0  | 18490 | 0.3412          | 0.9199   |
| 0.2785        | 44.0  | 18920 | 0.3401          | 0.9227   |
| 0.2781        | 45.0  | 19350 | 0.3372          | 0.9199   |
| 0.2665        | 46.0  | 19780 | 0.3364          | 0.9199   |
| 0.2722        | 47.0  | 20210 | 0.3352          | 0.9199   |
| 0.2683        | 48.0  | 20640 | 0.3359          | 0.9199   |
| 0.267         | 49.0  | 21070 | 0.3345          | 0.9199   |
| 0.2641        | 50.0  | 21500 | 0.3344          | 0.9199   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
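
If exact reproduction matters, a quick sketch for checking that the local environment roughly matches these versions (library names as listed above):

```python
# Print installed versions to compare against the ones reported in this card.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.35.2
print("PyTorch:", torch.__version__)               # card reports 2.1.0+cu118
print("Datasets:", datasets.__version__)           # card reports 2.15.0
print("Tokenizers:", tokenizers.__version__)       # card reports 0.15.0
```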