---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: distilbert-base-uncased-finetuned-pfe-projectt
  results: []
---

# distilbert-base-uncased-finetuned-pfe-projectt

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 3.6986
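
The card does not document the task head or the exact repo id, so the usage sketch below rests on two assumptions: that the model was fine-tuned with a masked-language-modeling head (the usual setup when only a loss is reported for a DistilBERT fine-tune), and that it is hosted under `onsba/distilbert-base-uncased-finetuned-pfe-projectt`.

```python
from transformers import pipeline

# Hypothetical usage sketch: both the fill-mask task and the repo id
# are assumptions, since neither is stated in this card.
fill_mask = pipeline(
    "fill-mask",
    model="onsba/distilbert-base-uncased-finetuned-pfe-projectt",
)

# Print the top predictions for the masked token.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], prediction["score"])
```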

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
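
A minimal sketch of how these hyperparameters map onto `TrainingArguments` in the Transformers `Trainer` API. The `output_dir` and `evaluation_strategy` values are assumptions (the card does not state them); the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Sketch matching the reported hyperparameters, not the author's script.
# output_dir and evaluation_strategy are assumptions; Adam
# betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-pfe-projectt",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the table below reports one validation loss per epoch
)
```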

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 6    | 2.5604          |
| No log        | 2.0   | 12   | 2.8128          |
| No log        | 3.0   | 18   | 2.5817          |
| No log        | 4.0   | 24   | 2.8479          |
| No log        | 5.0   | 30   | 2.8753          |
| No log        | 6.0   | 36   | 3.0051          |
| No log        | 7.0   | 42   | 2.9990          |
| No log        | 8.0   | 48   | 3.1331          |
| No log        | 9.0   | 54   | 3.0289          |
| No log        | 10.0  | 60   | 3.1572          |
| No log        | 11.0  | 66   | 3.1695          |
| No log        | 12.0  | 72   | 3.0457          |
| No log        | 13.0  | 78   | 3.2199          |
| No log        | 14.0  | 84   | 3.0475          |
| No log        | 15.0  | 90   | 2.8916          |
| No log        | 16.0  | 96   | 3.0530          |
| No log        | 17.0  | 102  | 3.2559          |
| No log        | 18.0  | 108  | 3.0997          |
| No log        | 19.0  | 114  | 3.0878          |
| No log        | 20.0  | 120  | 3.1099          |
| No log        | 21.0  | 126  | 3.2060          |
| No log        | 22.0  | 132  | 3.2004          |
| No log        | 23.0  | 138  | 3.5195          |
| No log        | 24.0  | 144  | 3.2190          |
| No log        | 25.0  | 150  | 3.1644          |
| No log        | 26.0  | 156  | 3.4342          |
| No log        | 27.0  | 162  | 3.2915          |
| No log        | 28.0  | 168  | 3.2673          |
| No log        | 29.0  | 174  | 3.1651          |
| No log        | 30.0  | 180  | 3.1639          |
| No log        | 31.0  | 186  | 3.1415          |
| No log        | 32.0  | 192  | 3.2468          |
| No log        | 33.0  | 198  | 3.3137          |
| No log        | 34.0  | 204  | 3.3605          |
| No log        | 35.0  | 210  | 3.3658          |
| No log        | 36.0  | 216  | 3.3332          |
| No log        | 37.0  | 222  | 3.4058          |
| No log        | 38.0  | 228  | 3.3871          |
| No log        | 39.0  | 234  | 3.5490          |
| No log        | 40.0  | 240  | 3.5084          |
| No log        | 41.0  | 246  | 3.3001          |
| No log        | 42.0  | 252  | 3.4091          |
| No log        | 43.0  | 258  | 3.4617          |
| No log        | 44.0  | 264  | 3.3954          |
| No log        | 45.0  | 270  | 3.4649          |
| No log        | 46.0  | 276  | 3.5548          |
| No log        | 47.0  | 282  | 3.4694          |
| No log        | 48.0  | 288  | 3.5323          |
| No log        | 49.0  | 294  | 3.6298          |
| No log        | 50.0  | 300  | 3.5810          |
| No log        | 51.0  | 306  | 3.5994          |
| No log        | 52.0  | 312  | 3.5456          |
| No log        | 53.0  | 318  | 3.5188          |
| No log        | 54.0  | 324  | 3.3893          |
| No log        | 55.0  | 330  | 3.4129          |
| No log        | 56.0  | 336  | 3.5145          |
| No log        | 57.0  | 342  | 3.4143          |
| No log        | 58.0  | 348  | 3.4388          |
| No log        | 59.0  | 354  | 3.4903          |
| No log        | 60.0  | 360  | 3.5829          |
| No log        | 61.0  | 366  | 3.5710          |
| No log        | 62.0  | 372  | 3.6743          |
| No log        | 63.0  | 378  | 3.6255          |
| No log        | 64.0  | 384  | 3.6043          |
| No log        | 65.0  | 390  | 3.6279          |
| No log        | 66.0  | 396  | 3.6332          |
| No log        | 67.0  | 402  | 3.7761          |
| No log        | 68.0  | 408  | 3.7641          |
| No log        | 69.0  | 414  | 3.7318          |
| No log        | 70.0  | 420  | 3.6692          |
| No log        | 71.0  | 426  | 3.6632          |
| No log        | 72.0  | 432  | 3.7541          |
| No log        | 73.0  | 438  | 3.8217          |
| No log        | 74.0  | 444  | 3.7746          |
| No log        | 75.0  | 450  | 3.6729          |
| No log        | 76.0  | 456  | 3.6182          |
| No log        | 77.0  | 462  | 3.6192          |
| No log        | 78.0  | 468  | 3.5641          |
| No log        | 79.0  | 474  | 3.5862          |
| No log        | 80.0  | 480  | 3.5692          |
| No log        | 81.0  | 486  | 3.5628          |
| No log        | 82.0  | 492  | 3.5613          |
| No log        | 83.0  | 498  | 3.5187          |
| 0.1491        | 84.0  | 504  | 3.5166          |
| 0.1491        | 85.0  | 510  | 3.5846          |
| 0.1491        | 86.0  | 516  | 3.6448          |
| 0.1491        | 87.0  | 522  | 3.6829          |
| 0.1491        | 88.0  | 528  | 3.6796          |
| 0.1491        | 89.0  | 534  | 3.6635          |
| 0.1491        | 90.0  | 540  | 3.6570          |
| 0.1491        | 91.0  | 546  | 3.6742          |
| 0.1491        | 92.0  | 552  | 3.7069          |
| 0.1491        | 93.0  | 558  | 3.6936          |
| 0.1491        | 94.0  | 564  | 3.6882          |
| 0.1491        | 95.0  | 570  | 3.6864          |
| 0.1491        | 96.0  | 576  | 3.6779          |
| 0.1491        | 97.0  | 582  | 3.6845          |
| 0.1491        | 98.0  | 588  | 3.6930          |
| 0.1491        | 99.0  | 594  | 3.6972          |
| 0.1491        | 100.0 | 600  | 3.6986          |

### Framework versions

- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2