---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze
    results: []
---

# dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2361
- F1 Micro: 0.7694
- F1 Macro: 0.4048
- Roc Auc: 0.8448
- Accuracy: 0.1449
- Learning Rate: 0.0001
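
The combination of micro/macro F1, ROC AUC and a low exact-match accuracy suggests a multi-label classification head. As a rough sketch only, the snippet below shows how such a checkpoint could be loaded for inference with 🤗 Transformers; the repository id, the placeholder image path and the 0.5 sigmoid threshold are assumptions, not details taken from this card.

```python
# Hedged sketch: loading the checkpoint for multi-label inference.
# The repo id and the 0.5 sigmoid threshold are assumptions.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "lombardata/dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("drone_image.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label setup: independent sigmoid per class, thresholded at 0.5 (assumed).
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```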

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
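
As a rough sketch of how these values map onto 🤗 `TrainingArguments`, see below. The frozen backbone (inferred from the `_freeze` suffix in the model name), the multi-label problem type and the label count are assumptions; only the listed hyperparameter values come from this card.

```python
# Hedged sketch: reproducing the listed hyperparameters with the Trainer API.
# The frozen backbone and the multi-label problem type are assumptions.
from transformers import AutoModelForImageClassification, TrainingArguments

NUM_LABELS = 10  # placeholder: replace with the real number of labels (assumption)

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    problem_type="multi_label_classification",  # assumed from the reported metrics
    num_labels=NUM_LABELS,
)

# "_freeze" in the model name suggests only the classification head was trained (assumption).
for param in model.dinov2.parameters():
    param.requires_grad = False

args = TrainingArguments(
    output_dir="dinov2-large-drone",   # assumed output path
    learning_rate=1e-3,                # values below match the card
    per_device_train_batch_size=512,   # card lists 512; per-device vs. total is an assumption
    per_device_eval_batch_size=512,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                         # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```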

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log        | 1.0   | 28   | 0.5952          | 0.5739   | 0.4067   | 0.7528  | 0.0124   | 0.001         |
| No log        | 2.0   | 56   | 0.4730          | 0.7307   | 0.4368   | 0.8401  | 0.0698   | 0.001         |
| No log        | 3.0   | 84   | 0.3240          | 0.7499   | 0.3770   | 0.8378  | 0.1074   | 0.001         |
| No log        | 4.0   | 112  | 0.2770          | 0.7521   | 0.3710   | 0.8372  | 0.1180   | 0.001         |
| No log        | 5.0   | 140  | 0.2588          | 0.7507   | 0.3715   | 0.8353  | 0.1196   | 0.001         |
| No log        | 6.0   | 168  | 0.2533          | 0.7520   | 0.3630   | 0.8354  | 0.1218   | 0.001         |
| No log        | 7.0   | 196  | 0.2513          | 0.7517   | 0.3646   | 0.8347  | 0.1153   | 0.001         |
| No log        | 8.0   | 224  | 0.2508          | 0.7576   | 0.3894   | 0.8407  | 0.1228   | 0.001         |
| No log        | 9.0   | 252  | 0.2479          | 0.7550   | 0.3829   | 0.8360  | 0.1275   | 0.001         |
| No log        | 10.0  | 280  | 0.2481          | 0.7583   | 0.3797   | 0.8407  | 0.1265   | 0.001         |
| No log        | 11.0  | 308  | 0.2467          | 0.7601   | 0.3964   | 0.8431  | 0.1243   | 0.001         |
| No log        | 12.0  | 336  | 0.2460          | 0.7565   | 0.3733   | 0.8362  | 0.1251   | 0.001         |
| No log        | 13.0  | 364  | 0.2456          | 0.7582   | 0.3862   | 0.8399  | 0.1298   | 0.001         |
| No log        | 14.0  | 392  | 0.2465          | 0.7526   | 0.3708   | 0.8323  | 0.1371   | 0.001         |
| No log        | 15.0  | 420  | 0.2452          | 0.7541   | 0.3795   | 0.8344  | 0.1271   | 0.001         |
| No log        | 16.0  | 448  | 0.2437          | 0.7597   | 0.3904   | 0.8409  | 0.1293   | 0.001         |
| No log        | 17.0  | 476  | 0.2447          | 0.7526   | 0.3854   | 0.8317  | 0.1316   | 0.001         |
| 0.3126        | 18.0  | 504  | 0.2454          | 0.7534   | 0.3578   | 0.8326  | 0.1332   | 0.001         |
| 0.3126        | 19.0  | 532  | 0.2441          | 0.7568   | 0.3694   | 0.8367  | 0.1324   | 0.001         |
| 0.3126        | 20.0  | 560  | 0.2454          | 0.7509   | 0.3768   | 0.8288  | 0.1361   | 0.001         |
| 0.3126        | 21.0  | 588  | 0.2438          | 0.7602   | 0.3896   | 0.8416  | 0.1249   | 0.001         |
| 0.3126        | 22.0  | 616  | 0.2419          | 0.7576   | 0.3716   | 0.8368  | 0.1302   | 0.001         |
| 0.3126        | 23.0  | 644  | 0.2435          | 0.7629   | 0.3880   | 0.8454  | 0.1265   | 0.001         |
| 0.3126        | 24.0  | 672  | 0.2413          | 0.7561   | 0.3897   | 0.8344  | 0.1342   | 0.001         |
| 0.3126        | 25.0  | 700  | 0.2419          | 0.7599   | 0.3827   | 0.8415  | 0.1298   | 0.001         |
| 0.3126        | 26.0  | 728  | 0.2438          | 0.7593   | 0.3971   | 0.8401  | 0.1267   | 0.001         |
| 0.3126        | 27.0  | 756  | 0.2418          | 0.7614   | 0.3838   | 0.8422  | 0.1310   | 0.001         |
| 0.3126        | 28.0  | 784  | 0.2432          | 0.7498   | 0.3793   | 0.8275  | 0.1334   | 0.001         |
| 0.3126        | 29.0  | 812  | 0.2420          | 0.7622   | 0.3960   | 0.8436  | 0.1367   | 0.001         |
| 0.3126        | 30.0  | 840  | 0.2407          | 0.7620   | 0.3860   | 0.8404  | 0.1424   | 0.001         |
| 0.3126        | 31.0  | 868  | 0.2422          | 0.7612   | 0.3929   | 0.8429  | 0.1328   | 0.001         |
| 0.3126        | 32.0  | 896  | 0.2430          | 0.7516   | 0.3912   | 0.8298  | 0.1312   | 0.001         |
| 0.3126        | 33.0  | 924  | 0.2414          | 0.7589   | 0.3884   | 0.8388  | 0.1302   | 0.001         |
| 0.3126        | 34.0  | 952  | 0.2404          | 0.7625   | 0.4037   | 0.8419  | 0.1354   | 0.001         |
| 0.3126        | 35.0  | 980  | 0.2413          | 0.7602   | 0.3973   | 0.8400  | 0.1300   | 0.001         |
| 0.2465        | 36.0  | 1008 | 0.2419          | 0.7622   | 0.3876   | 0.8436  | 0.1357   | 0.001         |
| 0.2465        | 37.0  | 1036 | 0.2399          | 0.7598   | 0.3992   | 0.8381  | 0.1342   | 0.001         |
| 0.2465        | 38.0  | 1064 | 0.2400          | 0.7607   | 0.3933   | 0.8397  | 0.1330   | 0.001         |
| 0.2465        | 39.0  | 1092 | 0.2409          | 0.7619   | 0.4008   | 0.8412  | 0.1389   | 0.001         |
| 0.2465        | 40.0  | 1120 | 0.2399          | 0.76     | 0.3925   | 0.8378  | 0.1354   | 0.001         |
| 0.2465        | 41.0  | 1148 | 0.2423          | 0.7640   | 0.4061   | 0.8464  | 0.1249   | 0.001         |
| 0.2465        | 42.0  | 1176 | 0.2426          | 0.7569   | 0.4005   | 0.8378  | 0.1310   | 0.001         |
| 0.2465        | 43.0  | 1204 | 0.2392          | 0.7594   | 0.4008   | 0.8369  | 0.1336   | 0.001         |
| 0.2465        | 44.0  | 1232 | 0.2418          | 0.7577   | 0.4064   | 0.8365  | 0.1304   | 0.001         |
| 0.2465        | 45.0  | 1260 | 0.2411          | 0.7591   | 0.3906   | 0.8384  | 0.1379   | 0.001         |
| 0.2465        | 46.0  | 1288 | 0.2396          | 0.7654   | 0.4106   | 0.8457  | 0.1363   | 0.001         |
| 0.2465        | 47.0  | 1316 | 0.2396          | 0.7575   | 0.3968   | 0.8349  | 0.1326   | 0.001         |
| 0.2465        | 48.0  | 1344 | 0.2423          | 0.7564   | 0.3878   | 0.8373  | 0.1287   | 0.001         |
| 0.2465        | 49.0  | 1372 | 0.2398          | 0.7608   | 0.4027   | 0.8390  | 0.1330   | 0.001         |
| 0.2465        | 50.0  | 1400 | 0.2367          | 0.7652   | 0.4087   | 0.8436  | 0.1424   | 0.0001        |
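
The metric columns above are standard multi-label quantities. A minimal sketch of how they can be computed with scikit-learn is shown below; the 0.5 threshold and the toy arrays are illustrative assumptions, not the evaluation code used for this model.

```python
# Hedged sketch: computing the reported metric types for multi-label outputs.
# The 0.5 threshold and the toy data are assumptions for illustration.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy example: 3 samples, 4 labels.
y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 0, 0],
                   [1, 1, 0, 1]])
y_prob = np.array([[0.9, 0.2, 0.7, 0.1],
                   [0.3, 0.8, 0.4, 0.2],
                   [0.6, 0.7, 0.2, 0.4]])
y_pred = (y_prob >= 0.5).astype(int)

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Roc Auc:", roc_auc_score(y_true, y_prob, average="micro"))  # averaging mode is an assumption
print("Accuracy:", accuracy_score(y_true, y_pred))  # subset (exact-match) accuracy
```

Note that for multi-label targets `accuracy_score` measures subset (exact-match) accuracy, which would explain why the reported accuracy stays far below micro F1.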

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1