
myproject

This model is a fine-tuned version of distilbert/distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 10.7401
  • Accuracy: 0.3585
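The card does not state the downstream task; since accuracy is the reported metric, the sketch below assumes a text-classification head. The repo id `myproject` is taken from the card's title and is a placeholder; substitute the actual Hub path.

```python
# Minimal loading sketch, assuming a sequence-classification fine-tune.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

repo_id = "myproject"  # placeholder: substitute the actual Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Example input text"))
```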

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 70
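As a minimal sketch, the hyperparameters above map onto `transformers.TrainingArguments` as follows. The dataset is not named in the card, so the train/eval datasets are placeholders.

```python
# Sketch of the training setup implied by the hyperparameters above, using the
# transformers Trainer API (versions as listed under "Framework versions").
import numpy as np
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased"
)
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")

args = TrainingArguments(
    output_dir="myproject",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=70,
    adam_beta1=0.9,               # Adam settings as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # the card reports a validation loss per epoch
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": (preds == labels).mean()}

train_dataset = None  # placeholder: the fine-tuning dataset is not named in the card
eval_dataset = None   # placeholder

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train()
```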

Training results

Validation loss rises monotonically (6.84 to 10.74) while training loss falls (2.41 to 0.06) and accuracy stays fixed at 0.3585 throughout, which suggests the model overfits early and never improves on the evaluation set.

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 60 | 6.8412 | 0.3585 |
| No log | 2.0 | 120 | 7.0381 | 0.3585 |
| No log | 3.0 | 180 | 7.2302 | 0.3585 |
| No log | 4.0 | 240 | 7.3315 | 0.3585 |
| No log | 5.0 | 300 | 7.5093 | 0.3585 |
| No log | 6.0 | 360 | 7.6537 | 0.3585 |
| No log | 7.0 | 420 | 7.7774 | 0.3585 |
| No log | 8.0 | 480 | 7.8459 | 0.3585 |
| 2.4126 | 9.0 | 540 | 7.9683 | 0.3585 |
| 2.4126 | 10.0 | 600 | 8.0727 | 0.3585 |
| 2.4126 | 11.0 | 660 | 8.1432 | 0.3585 |
| 2.4126 | 12.0 | 720 | 8.2632 | 0.3585 |
| 2.4126 | 13.0 | 780 | 8.3617 | 0.3585 |
| 2.4126 | 14.0 | 840 | 8.3888 | 0.3585 |
| 2.4126 | 15.0 | 900 | 8.4802 | 0.3585 |
| 2.4126 | 16.0 | 960 | 8.6048 | 0.3585 |
| 1.3107 | 17.0 | 1020 | 8.6832 | 0.3585 |
| 1.3107 | 18.0 | 1080 | 8.7031 | 0.3585 |
| 1.3107 | 19.0 | 1140 | 8.8062 | 0.3585 |
| 1.3107 | 20.0 | 1200 | 8.9172 | 0.3585 |
| 1.3107 | 21.0 | 1260 | 8.9063 | 0.3585 |
| 1.3107 | 22.0 | 1320 | 9.0786 | 0.3585 |
| 1.3107 | 23.0 | 1380 | 9.1961 | 0.3585 |
| 1.3107 | 24.0 | 1440 | 9.1986 | 0.3585 |
| 0.6626 | 25.0 | 1500 | 9.2499 | 0.3585 |
| 0.6626 | 26.0 | 1560 | 9.2925 | 0.3585 |
| 0.6626 | 27.0 | 1620 | 9.4094 | 0.3585 |
| 0.6626 | 28.0 | 1680 | 9.4204 | 0.3585 |
| 0.6626 | 29.0 | 1740 | 9.5304 | 0.3585 |
| 0.6626 | 30.0 | 1800 | 9.5006 | 0.3585 |
| 0.6626 | 31.0 | 1860 | 9.6281 | 0.3585 |
| 0.6626 | 32.0 | 1920 | 9.6695 | 0.3585 |
| 0.6626 | 33.0 | 1980 | 9.6979 | 0.3585 |
| 0.3334 | 34.0 | 2040 | 9.8019 | 0.3585 |
| 0.3334 | 35.0 | 2100 | 9.8644 | 0.3585 |
| 0.3334 | 36.0 | 2160 | 9.8489 | 0.3585 |
| 0.3334 | 37.0 | 2220 | 9.8635 | 0.3585 |
| 0.3334 | 38.0 | 2280 | 9.9720 | 0.3585 |
| 0.3334 | 39.0 | 2340 | 10.0142 | 0.3585 |
| 0.3334 | 40.0 | 2400 | 10.1065 | 0.3585 |
| 0.3334 | 41.0 | 2460 | 10.1095 | 0.3585 |
| 0.1755 | 42.0 | 2520 | 10.1453 | 0.3585 |
| 0.1755 | 43.0 | 2580 | 10.1601 | 0.3585 |
| 0.1755 | 44.0 | 2640 | 10.2449 | 0.3585 |
| 0.1755 | 45.0 | 2700 | 10.2644 | 0.3585 |
| 0.1755 | 46.0 | 2760 | 10.2791 | 0.3585 |
| 0.1755 | 47.0 | 2820 | 10.3368 | 0.3585 |
| 0.1755 | 48.0 | 2880 | 10.3840 | 0.3585 |
| 0.1755 | 49.0 | 2940 | 10.3765 | 0.3585 |
| 0.1048 | 50.0 | 3000 | 10.4580 | 0.3585 |
| 0.1048 | 51.0 | 3060 | 10.4575 | 0.3585 |
| 0.1048 | 52.0 | 3120 | 10.4835 | 0.3585 |
| 0.1048 | 53.0 | 3180 | 10.5889 | 0.3585 |
| 0.1048 | 54.0 | 3240 | 10.5074 | 0.3585 |
| 0.1048 | 55.0 | 3300 | 10.5430 | 0.3585 |
| 0.1048 | 56.0 | 3360 | 10.5594 | 0.3585 |
| 0.1048 | 57.0 | 3420 | 10.6237 | 0.3585 |
| 0.1048 | 58.0 | 3480 | 10.6025 | 0.3585 |
| 0.0744 | 59.0 | 3540 | 10.6312 | 0.3585 |
| 0.0744 | 60.0 | 3600 | 10.6667 | 0.3585 |
| 0.0744 | 61.0 | 3660 | 10.6999 | 0.3585 |
| 0.0744 | 62.0 | 3720 | 10.6992 | 0.3585 |
| 0.0744 | 63.0 | 3780 | 10.6985 | 0.3585 |
| 0.0744 | 64.0 | 3840 | 10.7162 | 0.3585 |
| 0.0744 | 65.0 | 3900 | 10.7121 | 0.3585 |
| 0.0744 | 66.0 | 3960 | 10.7050 | 0.3585 |
| 0.06 | 67.0 | 4020 | 10.7263 | 0.3585 |
| 0.06 | 68.0 | 4080 | 10.7295 | 0.3585 |
| 0.06 | 69.0 | 4140 | 10.7384 | 0.3585 |
| 0.06 | 70.0 | 4200 | 10.7401 | 0.3585 |

Framework versions

  • Transformers 4.36.2
  • Pytorch 2.1.1+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0
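
A quick way to confirm a local environment matches these pins:

```python
# Print installed versions to compare against the pinned ones above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expected 4.36.2
print("PyTorch:", torch.__version__)              # expected 2.1.1+cu121
print("Datasets:", datasets.__version__)          # expected 2.16.1
print("Tokenizers:", tokenizers.__version__)      # expected 0.15.0
```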
Model size

  • 67.3M parameters (F32, Safetensors)