---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze
    results: []
---

# DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1209
- F1 Micro: 0.8228
- F1 Macro: 0.7175
- Roc Auc: 0.8813
- Accuracy: 0.3111
- Learning Rate: 0.0000
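
The snippet below is a minimal inference sketch using the standard `transformers` auto classes. The repository id (`lombardata/DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze`) and the multi-label sigmoid post-processing are assumptions inferred from the card, not confirmed by it.

```python
# Minimal inference sketch. The repository id and the multi-label (sigmoid)
# post-processing are assumptions inferred from the card, not confirmed by it.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "lombardata/DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze"  # assumed repo id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any local test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a multi-label head: sigmoid the logits and keep labels above 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```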

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
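
Given the `generated_from_trainer` tag, these values map naturally onto a `transformers` `TrainingArguments` object. The sketch below is an illustrative reconstruction, not the original training script; `output_dir` and the per-epoch evaluation/logging strategies are assumptions.

```python
# Approximate TrainingArguments mirroring the hyperparameters listed above.
# This is an illustrative reconstruction, not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed: the results table has one row per epoch
    logging_strategy="epoch",
)
```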

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log        | 1.0   | 273   | 0.1690          | 0.7517   | 0.5430   | 0.8384  | 0.2231   | 0.001         |
| 0.2719        | 2.0   | 546   | 0.1538          | 0.7657   | 0.5721   | 0.8396  | 0.2401   | 0.001         |
| 0.2719        | 3.0   | 819   | 0.1483          | 0.7773   | 0.6138   | 0.8516  | 0.2346   | 0.001         |
| 0.1694        | 4.0   | 1092  | 0.1480          | 0.7723   | 0.6225   | 0.8407  | 0.2495   | 0.001         |
| 0.1694        | 5.0   | 1365  | 0.1458          | 0.7797   | 0.6302   | 0.8470  | 0.2495   | 0.001         |
| 0.1625        | 6.0   | 1638  | 0.1450          | 0.7798   | 0.6093   | 0.8477  | 0.2481   | 0.001         |
| 0.1625        | 7.0   | 1911  | 0.1475          | 0.7767   | 0.6248   | 0.8454  | 0.2526   | 0.001         |
| 0.1592        | 8.0   | 2184  | 0.1457          | 0.7804   | 0.6249   | 0.8521  | 0.2574   | 0.001         |
| 0.1592        | 9.0   | 2457  | 0.1417          | 0.7869   | 0.6526   | 0.8561  | 0.2574   | 0.001         |
| 0.157         | 10.0  | 2730  | 0.1436          | 0.7757   | 0.6290   | 0.8403  | 0.2547   | 0.001         |
| 0.1563        | 11.0  | 3003  | 0.1428          | 0.7887   | 0.6448   | 0.8569  | 0.2640   | 0.001         |
| 0.1563        | 12.0  | 3276  | 0.1439          | 0.7905   | 0.6493   | 0.8638  | 0.2581   | 0.001         |
| 0.1558        | 13.0  | 3549  | 0.1391          | 0.7907   | 0.6562   | 0.8551  | 0.2713   | 0.001         |
| 0.1558        | 14.0  | 3822  | 0.1409          | 0.7838   | 0.6338   | 0.8485  | 0.2644   | 0.001         |
| 0.1543        | 15.0  | 4095  | 0.1396          | 0.7907   | 0.6463   | 0.8603  | 0.2578   | 0.001         |
| 0.1543        | 16.0  | 4368  | 0.1390          | 0.7913   | 0.6594   | 0.8564  | 0.2654   | 0.001         |
| 0.1535        | 17.0  | 4641  | 0.1418          | 0.7940   | 0.6586   | 0.8665  | 0.2564   | 0.001         |
| 0.1535        | 18.0  | 4914  | 0.1416          | 0.7957   | 0.6560   | 0.8646  | 0.2658   | 0.001         |
| 0.1549        | 19.0  | 5187  | 0.1403          | 0.7886   | 0.6524   | 0.8536  | 0.2630   | 0.001         |
| 0.1549        | 20.0  | 5460  | 0.1476          | 0.7911   | 0.6558   | 0.8568  | 0.2613   | 0.001         |
| 0.154         | 21.0  | 5733  | 0.1429          | 0.7880   | 0.6397   | 0.8568  | 0.2658   | 0.001         |
| 0.1529        | 22.0  | 6006  | 0.1414          | 0.7937   | 0.6508   | 0.8654  | 0.2613   | 0.001         |
| 0.1529        | 23.0  | 6279  | 0.1415          | 0.7976   | 0.6618   | 0.8613  | 0.2685   | 0.0001        |
| 0.1449        | 24.0  | 6552  | 0.1323          | 0.8045   | 0.6751   | 0.8665  | 0.2789   | 0.0001        |
| 0.1449        | 25.0  | 6825  | 0.1310          | 0.8044   | 0.6724   | 0.8688  | 0.2793   | 0.0001        |
| 0.1416        | 26.0  | 7098  | 0.1327          | 0.8036   | 0.6689   | 0.8646  | 0.2821   | 0.0001        |
| 0.1416        | 27.0  | 7371  | 0.1317          | 0.8069   | 0.6797   | 0.8715  | 0.2817   | 0.0001        |
| 0.1391        | 28.0  | 7644  | 0.1288          | 0.8072   | 0.6818   | 0.8698  | 0.2775   | 0.0001        |
| 0.1391        | 29.0  | 7917  | 0.1294          | 0.8038   | 0.6808   | 0.8629  | 0.2845   | 0.0001        |
| 0.138         | 30.0  | 8190  | 0.1294          | 0.8077   | 0.6826   | 0.8702  | 0.2859   | 0.0001        |
| 0.138         | 31.0  | 8463  | 0.1274          | 0.8074   | 0.6779   | 0.8666  | 0.2879   | 0.0001        |
| 0.1364        | 32.0  | 8736  | 0.1278          | 0.8104   | 0.6869   | 0.8728  | 0.2883   | 0.0001        |
| 0.1359        | 33.0  | 9009  | 0.1277          | 0.8077   | 0.6811   | 0.8692  | 0.2869   | 0.0001        |
| 0.1359        | 34.0  | 9282  | 0.1266          | 0.8109   | 0.6874   | 0.8714  | 0.2883   | 0.0001        |
| 0.1341        | 35.0  | 9555  | 0.1262          | 0.8104   | 0.6885   | 0.8716  | 0.2904   | 0.0001        |
| 0.1341        | 36.0  | 9828  | 0.1269          | 0.8070   | 0.6876   | 0.8657  | 0.2827   | 0.0001        |
| 0.1339        | 37.0  | 10101 | 0.1266          | 0.8082   | 0.6834   | 0.8678  | 0.2866   | 0.0001        |
| 0.1339        | 38.0  | 10374 | 0.1255          | 0.8106   | 0.6936   | 0.8707  | 0.2956   | 0.0001        |
| 0.1307        | 39.0  | 10647 | 0.1249          | 0.8142   | 0.6986   | 0.8768  | 0.2928   | 0.0001        |
| 0.1307        | 40.0  | 10920 | 0.1258          | 0.8138   | 0.6990   | 0.8773  | 0.2935   | 0.0001        |
| 0.1317        | 41.0  | 11193 | 0.1253          | 0.8101   | 0.6924   | 0.8688  | 0.2924   | 0.0001        |
| 0.1317        | 42.0  | 11466 | 0.1244          | 0.8138   | 0.6970   | 0.8738  | 0.3004   | 0.0001        |
| 0.1308        | 43.0  | 11739 | 0.1245          | 0.8131   | 0.6956   | 0.8734  | 0.2949   | 0.0001        |
| 0.1307        | 44.0  | 12012 | 0.1250          | 0.8130   | 0.6915   | 0.8743  | 0.2966   | 0.0001        |
| 0.1307        | 45.0  | 12285 | 0.1240          | 0.8137   | 0.7051   | 0.8740  | 0.2963   | 0.0001        |
| 0.1295        | 46.0  | 12558 | 0.1241          | 0.8131   | 0.6988   | 0.8733  | 0.2976   | 0.0001        |
| 0.1295        | 47.0  | 12831 | 0.1243          | 0.8119   | 0.6958   | 0.8716  | 0.2956   | 0.0001        |
| 0.1293        | 48.0  | 13104 | 0.1239          | 0.8135   | 0.6990   | 0.8744  | 0.2956   | 0.0001        |
| 0.1293        | 49.0  | 13377 | 0.1243          | 0.8153   | 0.7007   | 0.8775  | 0.2997   | 0.0001        |
| 0.1274        | 50.0  | 13650 | 0.1241          | 0.8152   | 0.7000   | 0.8769  | 0.2980   | 0.0001        |
| 0.1274        | 51.0  | 13923 | 0.1248          | 0.8153   | 0.7056   | 0.8803  | 0.3011   | 0.0001        |
| 0.1271        | 52.0  | 14196 | 0.1243          | 0.8157   | 0.7036   | 0.8751  | 0.3049   | 0.0001        |
| 0.1271        | 53.0  | 14469 | 0.1241          | 0.8153   | 0.7032   | 0.8778  | 0.3021   | 0.0001        |
| 0.1275        | 54.0  | 14742 | 0.1234          | 0.8152   | 0.7068   | 0.8753  | 0.3021   | 0.0001        |
| 0.1256        | 55.0  | 15015 | 0.1231          | 0.8166   | 0.7076   | 0.8776  | 0.3018   | 0.0001        |
| 0.1256        | 56.0  | 15288 | 0.1228          | 0.8190   | 0.7088   | 0.8822  | 0.3067   | 0.0001        |
| 0.1258        | 57.0  | 15561 | 0.1226          | 0.8160   | 0.7080   | 0.8767  | 0.3070   | 0.0001        |
| 0.1258        | 58.0  | 15834 | 0.1233          | 0.8170   | 0.7073   | 0.8773  | 0.3021   | 0.0001        |
| 0.1258        | 59.0  | 16107 | 0.1227          | 0.8172   | 0.7135   | 0.8781  | 0.3021   | 0.0001        |
| 0.1258        | 60.0  | 16380 | 0.1233          | 0.8143   | 0.7040   | 0.8729  | 0.3021   | 0.0001        |
| 0.1252        | 61.0  | 16653 | 0.1234          | 0.8168   | 0.7121   | 0.8784  | 0.3042   | 0.0001        |
| 0.1252        | 62.0  | 16926 | 0.1223          | 0.8169   | 0.7125   | 0.8764  | 0.3049   | 0.0001        |
| 0.1238        | 63.0  | 17199 | 0.1231          | 0.8151   | 0.7090   | 0.8752  | 0.3035   | 0.0001        |
| 0.1238        | 64.0  | 17472 | 0.1228          | 0.8183   | 0.7114   | 0.8785  | 0.3067   | 0.0001        |
| 0.1247        | 65.0  | 17745 | 0.1231          | 0.8185   | 0.7156   | 0.8802  | 0.3035   | 0.0001        |
| 0.123         | 66.0  | 18018 | 0.1225          | 0.8193   | 0.7084   | 0.8809  | 0.3021   | 0.0001        |
| 0.123         | 67.0  | 18291 | 0.1222          | 0.8186   | 0.7136   | 0.8814  | 0.3032   | 0.0001        |
| 0.1224        | 68.0  | 18564 | 0.1220          | 0.8201   | 0.7169   | 0.8818  | 0.3091   | 0.0001        |
| 0.1224        | 69.0  | 18837 | 0.1228          | 0.8171   | 0.7165   | 0.8768  | 0.3018   | 0.0001        |
| 0.1228        | 70.0  | 19110 | 0.1227          | 0.8177   | 0.7131   | 0.8765  | 0.3042   | 0.0001        |
| 0.1228        | 71.0  | 19383 | 0.1232          | 0.8155   | 0.7123   | 0.8733  | 0.2980   | 0.0001        |
| 0.1224        | 72.0  | 19656 | 0.1222          | 0.8177   | 0.7181   | 0.8780  | 0.3056   | 0.0001        |
| 0.1224        | 73.0  | 19929 | 0.1221          | 0.8162   | 0.7047   | 0.8760  | 0.3077   | 0.0001        |
| 0.122         | 74.0  | 20202 | 0.1230          | 0.8148   | 0.7070   | 0.8732  | 0.2973   | 0.0001        |
| 0.122         | 75.0  | 20475 | 0.1214          | 0.8176   | 0.7124   | 0.8768  | 0.3049   | 1e-05         |
| 0.1201        | 76.0  | 20748 | 0.1209          | 0.8213   | 0.7265   | 0.8828  | 0.3067   | 1e-05         |
| 0.1192        | 77.0  | 21021 | 0.1216          | 0.8221   | 0.7249   | 0.8860  | 0.3073   | 1e-05         |
| 0.1192        | 78.0  | 21294 | 0.1211          | 0.8210   | 0.7233   | 0.8828  | 0.3056   | 1e-05         |
| 0.1178        | 79.0  | 21567 | 0.1211          | 0.8181   | 0.7158   | 0.8769  | 0.3056   | 1e-05         |
| 0.1178        | 80.0  | 21840 | 0.1210          | 0.8200   | 0.7197   | 0.8824  | 0.3091   | 1e-05         |
| 0.1178        | 81.0  | 22113 | 0.1205          | 0.8190   | 0.7194   | 0.8784  | 0.3105   | 1e-05         |
| 0.1178        | 82.0  | 22386 | 0.1205          | 0.8187   | 0.7213   | 0.8782  | 0.3070   | 1e-05         |
| 0.1162        | 83.0  | 22659 | 0.1215          | 0.8171   | 0.7136   | 0.8754  | 0.3049   | 1e-05         |
| 0.1162        | 84.0  | 22932 | 0.1209          | 0.8212   | 0.7226   | 0.8817  | 0.3115   | 1e-05         |
| 0.1174        | 85.0  | 23205 | 0.1206          | 0.8213   | 0.7219   | 0.8823  | 0.3094   | 1e-05         |
| 0.1174        | 86.0  | 23478 | 0.1210          | 0.8207   | 0.7256   | 0.8811  | 0.3084   | 1e-05         |
| 0.1167        | 87.0  | 23751 | 0.1210          | 0.8192   | 0.7163   | 0.8800  | 0.3073   | 1e-05         |
| 0.116         | 88.0  | 24024 | 0.1208          | 0.8219   | 0.7180   | 0.8831  | 0.3094   | 1e-05         |
| 0.116         | 89.0  | 24297 | 0.1213          | 0.8236   | 0.7293   | 0.8872  | 0.3125   | 0.0000        |
| 0.1161        | 90.0  | 24570 | 0.1211          | 0.8228   | 0.7250   | 0.8869  | 0.3108   | 0.0000        |
| 0.1161        | 91.0  | 24843 | 0.1206          | 0.8191   | 0.7187   | 0.8779  | 0.3105   | 0.0000        |
| 0.1162        | 92.0  | 25116 | 0.1208          | 0.8196   | 0.7150   | 0.8793  | 0.3105   | 0.0000        |
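
For reference, metrics of the kind reported in this table can be computed with `scikit-learn` as sketched below. Treating the task as multi-label with a 0.5 sigmoid threshold, and reading accuracy as exact-match (subset) accuracy, are interpretations, not something the card states.

```python
# Sketch of multi-label metrics of the kind reported above, assuming sigmoid
# probabilities thresholded at 0.5; the exact evaluation code is not in the card.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(probs: np.ndarray, labels: np.ndarray, threshold: float = 0.5) -> dict:
    """probs and labels are (n_samples, n_labels) arrays; labels are 0/1."""
    preds = (probs >= threshold).astype(int)
    return {
        "f1_micro": f1_score(labels, preds, average="micro", zero_division=0),
        "f1_macro": f1_score(labels, preds, average="macro", zero_division=0),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match (subset) accuracy
    }
```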

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
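
To reproduce results in a comparable environment, the locally installed versions can be checked against the ones listed above with a quick snippet:

```python
# Quick check that the local environment matches the versions listed above.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": "4.41.1",
    "torch": "2.3.0+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if installed[name] == want else "differs"
    print(f"{name}: installed {installed[name]}, card used {want} ({status})")
```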