---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
model-index:
- name: drone-DinoVdeau-from-probs-large-2024_11_15-batch-size64_freeze_probs
  results: []
---
# drone-DinoVdeau-from-probs-large-2024_11_15-batch-size64_freeze_probs
This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.4672
- RMSE: 0.1553
- MAE: 0.1147
- KL Divergence: 0.3577
- Explained Variance: 0.4654
- Learning Rate: 0.0000
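No usage snippet is documented yet, so here is a minimal inference sketch. It assumes the checkpoint loads through `AutoImageProcessor`/`AutoModelForImageClassification` and that a sigmoid over the logits recovers the per-class probabilities the RMSE/MAE/KL metrics above are computed against; the repository namespace is a placeholder and the label set is not documented here.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id: substitute the account that hosts this checkpoint.
checkpoint = "<namespace>/drone-DinoVdeau-from-probs-large-2024_11_15-batch-size64_freeze_probs"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("drone_image.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: the head was trained against probability targets,
# so a sigmoid maps logits back to per-class scores.
probs = torch.sigmoid(logits).squeeze(0)
for label, p in zip(model.config.id2label.values(), probs.tolist()):
    print(f"{label}: {p:.3f}")
```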
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
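For orientation, these settings map onto the Hugging Face `TrainingArguments` API roughly as below. This is a reconstruction sketch under the assumption that a standard `Trainer` loop was used, not the actual training script; the output directory is a placeholder. Note also that the stepwise learning-rate drops visible in the results table suggest the rate may have been reduced on plateau in practice, but the sketch keeps the card's stated linear scheduler.

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the listed hyperparameters, not the original script.
training_args = TrainingArguments(
    output_dir="drone-DinoVdeau-from-probs-large",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed precision
)
```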
### Training results

Training Loss | Epoch | Step | Validation Loss | RMSE | MAE | KL Divergence | Explained Variance | Learning Rate |
---|---|---|---|---|---|---|---|---|
No log | 1.0 | 110 | 0.5006 | 0.1904 | 0.1552 | 0.1025 | 0.3284 | 0.001 |
No log | 2.0 | 220 | 0.4755 | 0.1681 | 0.1245 | 0.5180 | 0.3932 | 0.001 |
No log | 3.0 | 330 | 0.4745 | 0.1675 | 0.1227 | 0.6862 | 0.3975 | 0.001 |
No log | 4.0 | 440 | 0.4742 | 0.1672 | 0.1255 | 0.3212 | 0.4024 | 0.001 |
0.5081 | 5.0 | 550 | 0.4725 | 0.1653 | 0.1224 | 0.5072 | 0.4118 | 0.001 |
0.5081 | 6.0 | 660 | 0.4726 | 0.1657 | 0.1216 | 0.6710 | 0.4101 | 0.001 |
0.5081 | 7.0 | 770 | 0.4732 | 0.1655 | 0.1255 | 0.3162 | 0.4183 | 0.001 |
0.5081 | 8.0 | 880 | 0.4728 | 0.1651 | 0.1260 | 0.2719 | 0.4234 | 0.001 |
0.5081 | 9.0 | 990 | 0.4708 | 0.1639 | 0.1206 | 0.6393 | 0.4237 | 0.001 |
0.4668 | 10.0 | 1100 | 0.4733 | 0.1654 | 0.1230 | 0.5359 | 0.4151 | 0.001 |
0.4668 | 11.0 | 1210 | 0.4716 | 0.1647 | 0.1253 | 0.2479 | 0.4305 | 0.001 |
0.4668 | 12.0 | 1320 | 0.4708 | 0.1631 | 0.1244 | 0.3119 | 0.4358 | 0.001 |
0.4668 | 13.0 | 1430 | 0.4715 | 0.1635 | 0.1230 | 0.3694 | 0.4274 | 0.001 |
0.4641 | 14.0 | 1540 | 0.4721 | 0.1653 | 0.1216 | 0.5592 | 0.4134 | 0.001 |
0.4641 | 15.0 | 1650 | 0.4701 | 0.1628 | 0.1213 | 0.4936 | 0.4314 | 0.001 |
0.4641 | 16.0 | 1760 | 0.4719 | 0.1646 | 0.1229 | 0.2820 | 0.4328 | 0.001 |
0.4641 | 17.0 | 1870 | 0.4693 | 0.1621 | 0.1200 | 0.5294 | 0.4332 | 0.001 |
0.4641 | 18.0 | 1980 | 0.4710 | 0.1635 | 0.1216 | 0.4093 | 0.4294 | 0.001 |
0.4618 | 19.0 | 2090 | 0.4698 | 0.1622 | 0.1219 | 0.2918 | 0.4388 | 0.001 |
0.4618 | 20.0 | 2200 | 0.4692 | 0.1617 | 0.1190 | 0.4772 | 0.4355 | 0.001 |
0.4618 | 21.0 | 2310 | 0.4683 | 0.1606 | 0.1204 | 0.4336 | 0.4424 | 0.001 |
0.4618 | 22.0 | 2420 | 0.4724 | 0.1650 | 0.1183 | 0.7962 | 0.4233 | 0.001 |
0.4613 | 23.0 | 2530 | 0.4714 | 0.1641 | 0.1223 | 0.2854 | 0.4354 | 0.001 |
0.4613 | 24.0 | 2640 | 0.4707 | 0.1633 | 0.1207 | 0.4206 | 0.4280 | 0.001 |
0.4613 | 25.0 | 2750 | 0.4679 | 0.1606 | 0.1185 | 0.5436 | 0.4416 | 0.001 |
0.4613 | 26.0 | 2860 | 0.4708 | 0.1634 | 0.1192 | 0.4964 | 0.4268 | 0.001 |
0.4613 | 27.0 | 2970 | 0.4695 | 0.1625 | 0.1185 | 0.6399 | 0.4301 | 0.001 |
0.4607 | 28.0 | 3080 | 0.4701 | 0.1624 | 0.1184 | 0.5737 | 0.4324 | 0.001 |
0.4607 | 29.0 | 3190 | 0.4699 | 0.1624 | 0.1200 | 0.4459 | 0.4324 | 0.001 |
0.4607 | 30.0 | 3300 | 0.4723 | 0.1643 | 0.1254 | 0.2726 | 0.4308 | 0.001 |
0.4607 | 31.0 | 3410 | 0.4696 | 0.1622 | 0.1184 | 0.5308 | 0.4313 | 0.001 |
0.4604 | 32.0 | 3520 | 0.4668 | 0.1593 | 0.1175 | 0.4200 | 0.4508 | 0.0001 |
0.4604 | 33.0 | 3630 | 0.4663 | 0.1587 | 0.1177 | 0.3529 | 0.4565 | 0.0001 |
0.4604 | 34.0 | 3740 | 0.4667 | 0.1592 | 0.1181 | 0.3588 | 0.4542 | 0.0001 |
0.4604 | 35.0 | 3850 | 0.4659 | 0.1584 | 0.1160 | 0.4813 | 0.4545 | 0.0001 |
0.4604 | 36.0 | 3960 | 0.4658 | 0.1581 | 0.1173 | 0.3504 | 0.4594 | 0.0001 |
0.4565 | 37.0 | 4070 | 0.4654 | 0.1578 | 0.1158 | 0.3919 | 0.4608 | 0.0001 |
0.4565 | 38.0 | 4180 | 0.4655 | 0.1580 | 0.1166 | 0.4058 | 0.4583 | 0.0001 |
0.4565 | 39.0 | 4290 | 0.4658 | 0.1585 | 0.1174 | 0.4118 | 0.4567 | 0.0001 |
0.4565 | 40.0 | 4400 | 0.4656 | 0.1579 | 0.1170 | 0.3564 | 0.4607 | 0.0001 |
0.4552 | 41.0 | 4510 | 0.4657 | 0.1582 | 0.1171 | 0.3573 | 0.4598 | 0.0001 |
0.4552 | 42.0 | 4620 | 0.4652 | 0.1579 | 0.1155 | 0.5042 | 0.4587 | 0.0001 |
0.4552 | 43.0 | 4730 | 0.4651 | 0.1575 | 0.1157 | 0.4462 | 0.4613 | 0.0001 |
0.4552 | 44.0 | 4840 | 0.4654 | 0.1579 | 0.1166 | 0.4236 | 0.4604 | 0.0001 |
0.4552 | 45.0 | 4950 | 0.4649 | 0.1574 | 0.1151 | 0.4510 | 0.4625 | 0.0001 |
0.4538 | 46.0 | 5060 | 0.4648 | 0.1575 | 0.1157 | 0.4490 | 0.4619 | 0.0001 |
0.4538 | 47.0 | 5170 | 0.4649 | 0.1574 | 0.1152 | 0.4751 | 0.4615 | 0.0001 |
0.4538 | 48.0 | 5280 | 0.4648 | 0.1575 | 0.1151 | 0.5305 | 0.4631 | 0.0001 |
0.4538 | 49.0 | 5390 | 0.4648 | 0.1574 | 0.1154 | 0.4799 | 0.4630 | 0.0001 |
0.4532 | 50.0 | 5500 | 0.4650 | 0.1572 | 0.1172 | 0.2825 | 0.4694 | 0.0001 |
0.4532 | 51.0 | 5610 | 0.4656 | 0.1582 | 0.1151 | 0.4879 | 0.4573 | 0.0001 |
0.4532 | 52.0 | 5720 | 0.4643 | 0.1566 | 0.1155 | 0.4199 | 0.4674 | 0.0001 |
0.4532 | 53.0 | 5830 | 0.4644 | 0.1569 | 0.1156 | 0.3880 | 0.4673 | 0.0001 |
0.4532 | 54.0 | 5940 | 0.4646 | 0.1569 | 0.1148 | 0.4229 | 0.4654 | 0.0001 |
0.4526 | 55.0 | 6050 | 0.4644 | 0.1569 | 0.1159 | 0.4009 | 0.4659 | 0.0001 |
0.4526 | 56.0 | 6160 | 0.4647 | 0.1572 | 0.1164 | 0.3405 | 0.4660 | 0.0001 |
0.4526 | 57.0 | 6270 | 0.4645 | 0.1569 | 0.1152 | 0.4188 | 0.4661 | 0.0001 |
0.4526 | 58.0 | 6380 | 0.4651 | 0.1576 | 0.1164 | 0.3079 | 0.4659 | 0.0001 |
0.4526 | 59.0 | 6490 | 0.4645 | 0.1570 | 0.1150 | 0.4339 | 0.4654 | 1e-05 |
0.4514 | 60.0 | 6600 | 0.4642 | 0.1566 | 0.1150 | 0.3894 | 0.4679 | 1e-05 |
0.4514 | 61.0 | 6710 | 0.4639 | 0.1563 | 0.1146 | 0.4145 | 0.4693 | 1e-05 |
0.4514 | 62.0 | 6820 | 0.4641 | 0.1565 | 0.1148 | 0.4064 | 0.4686 | 1e-05 |
0.4514 | 63.0 | 6930 | 0.4643 | 0.1565 | 0.1149 | 0.3542 | 0.4698 | 1e-05 |
0.4511 | 64.0 | 7040 | 0.4640 | 0.1564 | 0.1150 | 0.3718 | 0.4702 | 1e-05 |
0.4511 | 65.0 | 7150 | 0.4641 | 0.1565 | 0.1152 | 0.4128 | 0.4680 | 1e-05 |
0.4511 | 66.0 | 7260 | 0.4644 | 0.1570 | 0.1145 | 0.4988 | 0.4658 | 1e-05 |
0.4511 | 67.0 | 7370 | 0.4638 | 0.1562 | 0.1151 | 0.4122 | 0.4697 | 1e-05 |
0.4511 | 68.0 | 7480 | 0.4640 | 0.1565 | 0.1144 | 0.4579 | 0.4674 | 1e-05 |
0.4508 | 69.0 | 7590 | 0.4638 | 0.1561 | 0.1143 | 0.4197 | 0.4702 | 1e-05 |
0.4508 | 70.0 | 7700 | 0.4639 | 0.1563 | 0.1145 | 0.4286 | 0.4695 | 1e-05 |
0.4508 | 71.0 | 7810 | 0.4641 | 0.1563 | 0.1153 | 0.3542 | 0.4708 | 1e-05 |
0.4508 | 72.0 | 7920 | 0.4642 | 0.1566 | 0.1147 | 0.4250 | 0.4681 | 1e-05 |
0.4505 | 73.0 | 8030 | 0.4638 | 0.1561 | 0.1140 | 0.4397 | 0.4700 | 1e-05 |
0.4505 | 74.0 | 8140 | 0.4638 | 0.1563 | 0.1145 | 0.4437 | 0.4689 | 1e-05 |
0.4505 | 75.0 | 8250 | 0.4638 | 0.1561 | 0.1145 | 0.4049 | 0.4705 | 1e-05 |
0.4505 | 76.0 | 8360 | 0.4640 | 0.1565 | 0.1141 | 0.4926 | 0.4675 | 0.0000 |
0.4505 | 77.0 | 8470 | 0.4639 | 0.1562 | 0.1142 | 0.4427 | 0.4695 | 0.0000 |
0.4505 | 78.0 | 8580 | 0.4639 | 0.1563 | 0.1145 | 0.4293 | 0.4692 | 0.0000 |
0.4505 | 79.0 | 8690 | 0.4641 | 0.1564 | 0.1147 | 0.3765 | 0.4700 | 0.0000 |
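For reference, the metrics reported above can be computed from `(n_samples, n_classes)` arrays of target and predicted probabilities along the following lines. The exact reductions used during evaluation are not documented, so treat this as an illustrative sketch (mean per-sample KL divergence, uniform averaging over outputs).

```python
import numpy as np
from scipy.special import rel_entr
from sklearn.metrics import explained_variance_score

def probability_metrics(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-12) -> dict:
    """Illustrative versions of the card's metrics for probability arrays."""
    rmse = float(np.sqrt(np.mean((y_pred - y_true) ** 2)))
    mae = float(np.mean(np.abs(y_pred - y_true)))
    # Mean per-sample KL(y_true || y_pred); eps avoids log(0).
    kl = float(np.mean(rel_entr(y_true + eps, y_pred + eps).sum(axis=1)))
    ev = float(explained_variance_score(y_true, y_pred))
    return {"rmse": rmse, "mae": mae, "kl_divergence": kl, "explained_variance": ev}
```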
### Framework versions

- Transformers 4.41.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.19.1