
dino-base-2023_11_20-with_custom_head

This model is a fine-tuned version of facebook/dinov2-base on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.1133
  • F1 Micro: 0.8447
  • F1 Macro: 0.7988
  • ROC AUC: 0.8941
  • Accuracy: 0.5681
  • Learning rate (final): 0.0001
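
The combination of micro/macro F1, ROC AUC, and a much lower exact-match accuracy suggests a multi-label classification task. Under that assumption, here is a minimal sketch of how these metrics could be computed from raw logits; the 0.5 decision threshold and the scikit-learn helpers are assumptions, not taken from the card:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def multi_label_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5):
    """Compute the card's metrics, assuming multi-label targets (hypothetical)."""
    probs = 1.0 / (1.0 + np.exp(-logits))     # sigmoid per label
    preds = (probs >= threshold).astype(int)  # assumed 0.5 threshold
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="macro"),
        # For multi-label targets, accuracy_score is exact-match (subset)
        # accuracy, which would explain why it sits well below the F1 scores.
        "accuracy": accuracy_score(labels, preds),
    }
```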

Model description

More information needed
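
The card does not describe the custom head. As a hedged illustration only, a DINOv2 backbone with a single linear head on the CLS token might look like the sketch below; the class name, the pooling choice, and num_labels are assumptions:

```python
import torch
from transformers import Dinov2Model

class Dinov2WithCustomHead(torch.nn.Module):
    """Hypothetical reconstruction: DINOv2 backbone plus a linear multi-label head."""

    def __init__(self, num_labels: int):
        super().__init__()
        self.backbone = Dinov2Model.from_pretrained("facebook/dinov2-base")
        hidden = self.backbone.config.hidden_size  # 768 for dinov2-base
        self.head = torch.nn.Linear(hidden, num_labels)

    def forward(self, pixel_values: torch.Tensor) -> torch.Tensor:
        outputs = self.backbone(pixel_values=pixel_values)
        # Use the CLS token embedding as the image representation (assumed).
        cls = outputs.last_hidden_state[:, 0]
        return self.head(cls)
```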

Intended uses & limitations

More information needed
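
No usage example is provided. Below is a minimal inference sketch under the same assumptions as above (multi-label head on the CLS token); NUM_LABELS, the image path, and the 0.5 threshold are placeholders, and the head here is randomly initialized rather than loaded from this checkpoint:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Dinov2Model

NUM_LABELS = 20  # placeholder; the real label count is not stated in the card

processor = AutoImageProcessor.from_pretrained("facebook/dinov2-base")
backbone = Dinov2Model.from_pretrained("facebook/dinov2-base")
head = torch.nn.Linear(backbone.config.hidden_size, NUM_LABELS)  # untrained stand-in

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    cls = backbone(**inputs).last_hidden_state[:, 0]  # CLS token features
    logits = head(cls)

# Multi-label decision: sigmoid per label, then threshold (assumed 0.5).
probs = torch.sigmoid(logits)
predicted = (probs >= 0.5).squeeze(0).nonzero(as_tuple=True)[0].tolist()
print(predicted)
```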

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.01
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
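
As referenced above, this is a hedged reconstruction of the listed hyperparameters as transformers TrainingArguments; output_dir and the per-epoch evaluation schedule are assumptions (the results table reports metrics once per epoch), and the listed Adam betas and epsilon match the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dino-base-2023_11_20-with_custom_head",  # assumed, not from the card
    learning_rate=1e-2,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
)
```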

Training results

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
|---------------|-------|-------|-----------------|----------|----------|---------|----------|---------------|
| 0.2488        | 1.0   | 536   | 0.2054          | 0.5885   | 0.4304   | 0.7132  | 0.3901   | 0.01          |
| 0.219         | 2.0   | 1072  | 0.2322          | 0.6698   | 0.6174   | 0.7719  | 0.4198   | 0.01          |
| 0.215         | 3.0   | 1608  | 0.1781          | 0.6953   | 0.5734   | 0.7846  | 0.4073   | 0.01          |
| 0.2161        | 4.0   | 2144  | 0.1750          | 0.7345   | 0.6175   | 0.8322  | 0.3901   | 0.01          |
| 0.2121        | 5.0   | 2680  | 0.1680          | 0.7332   | 0.5938   | 0.8161  | 0.4344   | 0.01          |
| 0.2131        | 6.0   | 3216  | 0.1809          | 0.7109   | 0.5159   | 0.8012  | 0.4259   | 0.01          |
| 0.2128        | 7.0   | 3752  | 0.1683          | 0.7151   | 0.5694   | 0.8001  | 0.4284   | 0.01          |
| 0.2137        | 8.0   | 4288  | 0.1849          | 0.7333   | 0.5993   | 0.8297  | 0.4098   | 0.01          |
| 0.2112        | 9.0   | 4824  | 0.1981          | 0.7056   | 0.6295   | 0.8023  | 0.4130   | 0.01          |
| 0.2192        | 10.0  | 5360  | 0.1912          | 0.7417   | 0.6277   | 0.8323  | 0.4212   | 0.01          |
| 0.2174        | 11.0  | 5896  | 0.2587          | 0.7234   | 0.6177   | 0.8144  | 0.4173   | 0.01          |
| 0.2164        | 12.0  | 6432  | 2.3939          | 0.7708   | 0.6958   | 0.8435  | 0.4691   | 0.001         |
| 0.1857        | 13.0  | 6968  | 0.1573          | 0.7965   | 0.7418   | 0.8588  | 0.5023   | 0.001         |
| 0.1659        | 14.0  | 7504  | 0.1316          | 0.8104   | 0.7555   | 0.8748  | 0.5148   | 0.001         |
| 0.1605        | 15.0  | 8040  | 0.1319          | 0.8114   | 0.7627   | 0.8744  | 0.5138   | 0.001         |
| 0.1598        | 16.0  | 8576  | 0.2096          | 0.8137   | 0.7530   | 0.8707  | 0.5305   | 0.001         |
| 0.1543        | 17.0  | 9112  | 0.1297          | 0.8221   | 0.7760   | 0.8899  | 0.5148   | 0.001         |
| 0.1566        | 18.0  | 9648  | 0.1273          | 0.8266   | 0.7854   | 0.8914  | 0.5259   | 0.001         |
| 0.1533        | 19.0  | 10184 | 0.1292          | 0.8189   | 0.7632   | 0.8798  | 0.5216   | 0.001         |
| 0.1497        | 20.0  | 10720 | 0.1305          | 0.8273   | 0.7773   | 0.8841  | 0.5198   | 0.001         |
| 0.1513        | 21.0  | 11256 | 0.1217          | 0.8290   | 0.7707   | 0.8872  | 0.5295   | 0.001         |
| 0.1486        | 22.0  | 11792 | 0.1211          | 0.8268   | 0.7743   | 0.8800  | 0.5420   | 0.001         |
| 0.1477        | 23.0  | 12328 | 0.1435          | 0.8210   | 0.7706   | 0.8793  | 0.5263   | 0.001         |
| 0.1471        | 24.0  | 12864 | 0.1243          | 0.8277   | 0.7811   | 0.8811  | 0.5370   | 0.001         |
| 0.144         | 25.0  | 13400 | 0.1260          | 0.8245   | 0.7752   | 0.8860  | 0.5205   | 0.001         |
| 0.1444        | 26.0  | 13936 | 0.1184          | 0.8310   | 0.7857   | 0.8851  | 0.5466   | 0.001         |
| 0.1466        | 27.0  | 14472 | 0.1270          | 0.8206   | 0.7654   | 0.8736  | 0.5320   | 0.001         |
| 0.1462        | 28.0  | 15008 | 0.1304          | 0.8267   | 0.7756   | 0.8790  | 0.5384   | 0.001         |
| 0.1456        | 29.0  | 15544 | 0.1315          | 0.8304   | 0.7897   | 0.8921  | 0.5252   | 0.001         |
| 0.144         | 30.0  | 16080 | 0.1236          | 0.8320   | 0.7825   | 0.8846  | 0.5495   | 0.001         |
| 0.1429        | 31.0  | 16616 | 0.1284          | 0.8114   | 0.7589   | 0.8700  | 0.5277   | 0.001         |
| 0.1467        | 32.0  | 17152 | 0.1222          | 0.8357   | 0.7878   | 0.8916  | 0.5427   | 0.001         |
| 0.1388        | 33.0  | 17688 | 0.1284          | 0.8348   | 0.7837   | 0.8857  | 0.5488   | 0.0001        |
| 0.1356        | 34.0  | 18224 | 0.1119          | 0.8419   | 0.7958   | 0.8962  | 0.5577   | 0.0001        |
| 0.1333        | 35.0  | 18760 | 0.1145          | 0.8408   | 0.7943   | 0.8932  | 0.5627   | 0.0001        |
| 0.1292        | 36.0  | 19296 | 0.1136          | 0.8405   | 0.7919   | 0.8918  | 0.5591   | 0.0001        |
| 0.1294        | 37.0  | 19832 | 0.1124          | 0.8431   | 0.7990   | 0.8971  | 0.5591   | 0.0001        |
| 0.1297        | 38.0  | 20368 | 0.1126          | 0.8407   | 0.7941   | 0.8911  | 0.5638   | 0.0001        |
| 0.1259        | 39.0  | 20904 | 0.1121          | 0.8475   | 0.8062   | 0.9007  | 0.5631   | 0.0001        |
| 0.1285        | 40.0  | 21440 | 0.1113          | 0.8445   | 0.8013   | 0.8954  | 0.5609   | 0.0001        |
| 0.1229        | 41.0  | 21976 | 0.1086          | 0.8465   | 0.8019   | 0.8971  | 0.5663   | 0.0001        |
| 0.1234        | 42.0  | 22512 | 0.1093          | 0.8435   | 0.7966   | 0.8941  | 0.5613   | 0.0001        |
| 0.1241        | 43.0  | 23048 | 0.1165          | 0.8431   | 0.7990   | 0.8922  | 0.5584   | 0.0001        |
| 0.1229        | 44.0  | 23584 | 0.1084          | 0.8446   | 0.8018   | 0.8939  | 0.5663   | 0.0001        |
| 0.1205        | 45.0  | 24120 | 0.1073          | 0.8505   | 0.8126   | 0.9030  | 0.5691   | 0.0001        |
| 0.1219        | 46.0  | 24656 | 0.1095          | 0.8491   | 0.8142   | 0.9081  | 0.5588   | 0.0001        |
| 0.1213        | 47.0  | 25192 | 0.1076          | 0.8486   | 0.8105   | 0.9002  | 0.5688   | 0.0001        |
| 0.1205        | 48.0  | 25728 | 0.1131          | 0.8477   | 0.8064   | 0.8999  | 0.5659   | 0.0001        |
| 0.1194        | 49.0  | 26264 | 0.1102          | 0.8490   | 0.8107   | 0.9024  | 0.5659   | 0.0001        |
| 0.1195        | 50.0  | 26800 | 0.1133          | 0.8447   | 0.7988   | 0.8941  | 0.5681   | 0.0001        |

Framework versions

  • Transformers 4.34.1
  • PyTorch 2.1.0+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1