clee9 committed · Commit af928d4 · verified · 1 parent: e7e335c

End of training

Files changed (2):
  1. README.md +105 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,105 @@
---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_30
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# detr_finetuned_30

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1249
- Map: 0.2684
- Map 50: 0.5212
- Map 75: 0.2578
- Map Small: 0.22
- Map Medium: 0.377
- Map Large: 0.395
- Mar 1: 0.1497
- Mar 10: 0.3903
- Mar 100: 0.4383
- Mar Small: 0.398
- Mar Medium: 0.5486
- Mar Large: 0.6314
- Map Basketball: 0.0431
- Mar 100 Basketball: 0.147
- Map Player: 0.3203
- Mar 100 Player: 0.5743
- Map Referee: 0.4419
- Mar 100 Referee: 0.5936

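The "Map" rows above are COCO-style mean average precision ("Map 50" is AP at an IoU threshold of 0.50, averaged over classes; "Mar 100" is recall with up to 100 detections per image). As a rough illustration of how AP at a single IoU threshold comes together, here is a toy single-class sketch — all boxes and scores are invented, and COCO's real evaluator additionally uses 101-point interpolation and per-area breakdowns:

```python
# Toy illustration of average precision at IoU 0.50 (the "Map 50" metric).
# Boxes are [x1, y1, x2, y2]; every value here is made up for the example.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def average_precision(preds, gts, iou_thr=0.5):
    """preds: list of (score, box); gts: list of boxes.

    Greedily matches detections to ground truth in descending score
    order, then integrates the (non-interpolated) precision-recall curve.
    """
    preds = sorted(preds, key=lambda p: -p[0])
    matched, hits = set(), []
    for score, box in preds:
        best, best_iou = None, iou_thr
        for i, gt in enumerate(gts):
            if i not in matched and iou(box, gt) >= best_iou:
                best, best_iou = i, iou(box, gt)
        if best is not None:
            matched.add(best)
            hits.append(1)
        else:
            hits.append(0)
    ap, tp_cum, prev_recall = 0.0, 0, 0.0
    for k, hit in enumerate(hits, start=1):
        tp_cum += hit
        recall = tp_cum / len(gts)
        precision = tp_cum / k
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

gts = [[0, 0, 10, 10], [20, 20, 30, 30]]
preds = [(0.9, [0, 0, 10, 10]), (0.8, [50, 50, 60, 60])]
print(average_precision(preds, gts))  # one of two ground truths found -> 0.5
```

Averaging this quantity over IoU thresholds 0.50:0.05:0.95 and over the three classes gives the headline "Map" figure.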
## Model description

More information needed

## Intended uses & limitations

More information needed

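No usage example ships with this card. For orientation only, the decoding step that turns DETR's per-query class logits (which include a trailing "no-object" class) into labeled detections can be sketched as below. The label order is an assumption, the logits are invented, and real inference should go through the `transformers` image processor's `post_process_object_detection` instead:

```python
import math

# Assumed label mapping for this checkpoint (not confirmed by the card).
ID2LABEL = {0: "basketball", 1: "player", 2: "referee"}

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def decode(query_logits, threshold=0.5):
    """Keep queries whose best non-background class clears the threshold.

    Each inner list holds one query's logits; the last entry is the
    "no-object" background class that DETR appends.
    """
    detections = []
    for logits in query_logits:
        probs = softmax(logits)
        best = max(range(len(probs) - 1), key=lambda i: probs[i])
        if probs[best] >= threshold:
            detections.append((ID2LABEL[best], probs[best]))
    return detections

# Two invented queries: one confident "basketball", one background-dominated.
queries = [[4.0, 0.5, 0.2, 1.0], [0.1, 0.0, 0.2, 5.0]]
print(decode(queries))  # only the first query survives the threshold
```

In practice the same thresholding is applied jointly with the predicted boxes, which the sketch omits.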
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30

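With `lr_scheduler_type: cosine`, the learning rate follows a half-cosine from 5e-05 down toward zero over training. A minimal sketch of that curve — the 13830 total steps come from the final row of the results table, and the no-warmup assumption reflects that no warmup arguments are listed above:

```python
import math

BASE_LR = 5e-5        # learning_rate from the hyperparameters above
TOTAL_STEPS = 13830   # 30 epochs x 461 steps/epoch (final row of the results table)

def cosine_lr(step, base_lr=BASE_LR, total=TOTAL_STEPS):
    """Cosine annealing from base_lr to 0, assuming zero warmup steps."""
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * step / total))

print(cosine_lr(0))      # start of training: 5e-05
print(cosine_lr(6915))   # halfway: 2.5e-05
print(cosine_lr(13830))  # end of training: ~0
```

The slow late-stage decay is consistent with the small epoch-to-epoch metric changes in the last rows of the table below.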
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Basketball | Mar 100 Basketball | Map Player | Mar 100 Player | Map Referee | Mar 100 Referee |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:----------:|:--------------:|:-----------:|:---------------:|
| No log | 1.0 | 461 | 1.7649 | 0.0495 | 0.1443 | 0.0205 | 0.0386 | 0.0955 | 0.1299 | 0.0262 | 0.1028 | 0.1539 | 0.1341 | 0.2707 | 0.5936 | 0.0048 | 0.0142 | 0.1345 | 0.3828 | 0.0093 | 0.0646 |
| 1.9775 | 2.0 | 922 | 1.5434 | 0.1063 | 0.2515 | 0.0673 | 0.0772 | 0.1801 | 0.4147 | 0.0497 | 0.2231 | 0.2853 | 0.2337 | 0.3809 | 0.6431 | 0.0004 | 0.0211 | 0.2031 | 0.4458 | 0.1154 | 0.389 |
| 1.7156 | 3.0 | 1383 | 1.4497 | 0.139 | 0.3385 | 0.0796 | 0.1042 | 0.3513 | 0.4331 | 0.0722 | 0.2634 | 0.3277 | 0.2719 | 0.519 | 0.6221 | 0.0019 | 0.042 | 0.21 | 0.4515 | 0.2049 | 0.4895 |
| 1.5817 | 4.0 | 1844 | 1.3623 | 0.191 | 0.4015 | 0.1554 | 0.1364 | 0.2989 | 0.4691 | 0.1062 | 0.2979 | 0.3572 | 0.3144 | 0.4426 | 0.6789 | 0.002 | 0.0515 | 0.2474 | 0.4896 | 0.3236 | 0.5305 |
| 1.4789 | 5.0 | 2305 | 1.3544 | 0.1874 | 0.4189 | 0.1338 | 0.1415 | 0.357 | 0.4253 | 0.1031 | 0.2946 | 0.36 | 0.3174 | 0.5002 | 0.6838 | 0.0032 | 0.0591 | 0.2486 | 0.4917 | 0.3104 | 0.5292 |
| 1.4235 | 6.0 | 2766 | 1.2867 | 0.211 | 0.4444 | 0.1715 | 0.154 | 0.3671 | 0.298 | 0.118 | 0.3084 | 0.3629 | 0.3113 | 0.5314 | 0.6814 | 0.0032 | 0.0494 | 0.2607 | 0.4919 | 0.3692 | 0.5474 |
| 1.3454 | 7.0 | 3227 | 1.3163 | 0.2024 | 0.4324 | 0.1654 | 0.1381 | 0.362 | 0.4507 | 0.114 | 0.3076 | 0.3557 | 0.3114 | 0.4828 | 0.651 | 0.0052 | 0.0581 | 0.2397 | 0.4762 | 0.3623 | 0.5329 |
| 1.3253 | 8.0 | 3688 | 1.2467 | 0.2169 | 0.4386 | 0.1888 | 0.1576 | 0.3525 | 0.4086 | 0.1128 | 0.327 | 0.3884 | 0.344 | 0.5275 | 0.6926 | 0.0105 | 0.0693 | 0.2758 | 0.525 | 0.3643 | 0.5708 |
| 1.276 | 9.0 | 4149 | 1.2343 | 0.2242 | 0.4614 | 0.1877 | 0.1659 | 0.3289 | 0.3643 | 0.1273 | 0.3341 | 0.3883 | 0.3438 | 0.4961 | 0.6814 | 0.0094 | 0.0896 | 0.2701 | 0.5176 | 0.3932 | 0.5579 |
| 1.2626 | 10.0 | 4610 | 1.2377 | 0.2324 | 0.4659 | 0.2072 | 0.1803 | 0.3146 | 0.3717 | 0.1261 | 0.3469 | 0.3967 | 0.3548 | 0.4797 | 0.624 | 0.0101 | 0.0849 | 0.2777 | 0.5198 | 0.4094 | 0.5855 |
| 1.2274 | 11.0 | 5071 | 1.2398 | 0.2358 | 0.4759 | 0.2075 | 0.1795 | 0.3235 | 0.4632 | 0.1338 | 0.3479 | 0.3952 | 0.3457 | 0.4835 | 0.6647 | 0.0128 | 0.0864 | 0.2831 | 0.5323 | 0.4114 | 0.5669 |
| 1.2026 | 12.0 | 5532 | 1.1964 | 0.2407 | 0.4828 | 0.2133 | 0.1808 | 0.3974 | 0.4632 | 0.1329 | 0.3571 | 0.4064 | 0.3481 | 0.5671 | 0.6966 | 0.0151 | 0.0897 | 0.2866 | 0.5305 | 0.4203 | 0.5991 |
| 1.2026 | 13.0 | 5993 | 1.2058 | 0.2367 | 0.4879 | 0.201 | 0.1919 | 0.3161 | 0.4254 | 0.1314 | 0.3515 | 0.398 | 0.3516 | 0.4787 | 0.6789 | 0.019 | 0.0999 | 0.287 | 0.531 | 0.404 | 0.5632 |
| 1.1779 | 14.0 | 6454 | 1.1949 | 0.2365 | 0.4716 | 0.2179 | 0.1759 | 0.3254 | 0.4736 | 0.1268 | 0.359 | 0.4064 | 0.3592 | 0.5701 | 0.6495 | 0.0203 | 0.1028 | 0.2945 | 0.542 | 0.3946 | 0.5743 |
| 1.1578 | 15.0 | 6915 | 1.2130 | 0.2291 | 0.468 | 0.2024 | 0.1673 | 0.3503 | 0.4148 | 0.1238 | 0.3551 | 0.4049 | 0.3668 | 0.485 | 0.6059 | 0.02 | 0.1102 | 0.2821 | 0.5349 | 0.3851 | 0.5697 |
| 1.131 | 16.0 | 7376 | 1.2012 | 0.2384 | 0.478 | 0.2139 | 0.1848 | 0.2908 | 0.4595 | 0.1314 | 0.3597 | 0.4048 | 0.3616 | 0.5554 | 0.6382 | 0.0174 | 0.1082 | 0.291 | 0.5388 | 0.4068 | 0.5672 |
| 1.1208 | 17.0 | 7837 | 1.1768 | 0.2517 | 0.4928 | 0.2365 | 0.2057 | 0.3507 | 0.4399 | 0.1352 | 0.3697 | 0.4208 | 0.38 | 0.5615 | 0.6201 | 0.0222 | 0.1082 | 0.3005 | 0.5574 | 0.4324 | 0.5969 |
| 1.0992 | 18.0 | 8298 | 1.1605 | 0.2403 | 0.4776 | 0.2194 | 0.1967 | 0.2987 | 0.4002 | 0.1315 | 0.3655 | 0.4148 | 0.3761 | 0.5148 | 0.5838 | 0.0216 | 0.1004 | 0.2958 | 0.5538 | 0.4034 | 0.5901 |
| 1.0793 | 19.0 | 8759 | 1.1529 | 0.2496 | 0.4954 | 0.2307 | 0.2071 | 0.3352 | 0.4166 | 0.133 | 0.3752 | 0.4255 | 0.3868 | 0.5096 | 0.6431 | 0.0278 | 0.1219 | 0.3072 | 0.567 | 0.4139 | 0.5877 |
| 1.0648 | 20.0 | 9220 | 1.1573 | 0.2539 | 0.505 | 0.2326 | 0.2033 | 0.3472 | 0.4315 | 0.1389 | 0.3818 | 0.4308 | 0.3896 | 0.5701 | 0.6275 | 0.0327 | 0.1387 | 0.3021 | 0.5606 | 0.427 | 0.5931 |
| 1.05 | 21.0 | 9681 | 1.1417 | 0.257 | 0.505 | 0.2463 | 0.2135 | 0.3753 | 0.4454 | 0.1392 | 0.3862 | 0.4331 | 0.3949 | 0.5359 | 0.6696 | 0.0339 | 0.1388 | 0.3103 | 0.5656 | 0.4267 | 0.5948 |
| 1.0362 | 22.0 | 10142 | 1.1439 | 0.259 | 0.5124 | 0.2466 | 0.2112 | 0.3458 | 0.3431 | 0.1406 | 0.3832 | 0.4307 | 0.3895 | 0.4829 | 0.6402 | 0.0326 | 0.1252 | 0.312 | 0.5706 | 0.4324 | 0.5963 |
| 1.0248 | 23.0 | 10603 | 1.1317 | 0.2641 | 0.5182 | 0.2514 | 0.215 | 0.3594 | 0.3094 | 0.1445 | 0.3838 | 0.4319 | 0.3942 | 0.5376 | 0.6137 | 0.0334 | 0.1296 | 0.3123 | 0.5687 | 0.4467 | 0.5972 |
| 1.0173 | 24.0 | 11064 | 1.1485 | 0.2581 | 0.5057 | 0.247 | 0.2102 | 0.3723 | 0.4356 | 0.1414 | 0.3819 | 0.4295 | 0.3906 | 0.5416 | 0.6681 | 0.0334 | 0.1372 | 0.3158 | 0.5696 | 0.4251 | 0.5817 |
| 1.0082 | 25.0 | 11525 | 1.1344 | 0.2642 | 0.5158 | 0.2495 | 0.2176 | 0.3517 | 0.4473 | 0.1467 | 0.3843 | 0.4322 | 0.3915 | 0.554 | 0.6377 | 0.0354 | 0.1386 | 0.3158 | 0.5685 | 0.4414 | 0.5894 |
| 1.0082 | 26.0 | 11986 | 1.1267 | 0.2648 | 0.5175 | 0.2514 | 0.2147 | 0.3598 | 0.4399 | 0.1489 | 0.3868 | 0.4341 | 0.3942 | 0.5624 | 0.6294 | 0.0381 | 0.1422 | 0.3181 | 0.5706 | 0.4381 | 0.5894 |
| 1.0006 | 27.0 | 12447 | 1.1296 | 0.2687 | 0.5208 | 0.2581 | 0.2198 | 0.3694 | 0.4415 | 0.1506 | 0.3887 | 0.4359 | 0.3967 | 0.5455 | 0.6333 | 0.0439 | 0.1464 | 0.3188 | 0.5727 | 0.4434 | 0.5885 |
| 0.9989 | 28.0 | 12908 | 1.1237 | 0.2675 | 0.5191 | 0.2562 | 0.2202 | 0.3769 | 0.3991 | 0.1484 | 0.3897 | 0.4374 | 0.397 | 0.5484 | 0.6333 | 0.0411 | 0.1454 | 0.3204 | 0.5735 | 0.441 | 0.5932 |
| 0.9952 | 29.0 | 13369 | 1.1251 | 0.2687 | 0.5204 | 0.2582 | 0.221 | 0.3689 | 0.3963 | 0.15 | 0.3904 | 0.4385 | 0.3984 | 0.5485 | 0.6314 | 0.0426 | 0.1474 | 0.3207 | 0.5744 | 0.4427 | 0.5936 |
| 0.9909 | 30.0 | 13830 | 1.1249 | 0.2684 | 0.5212 | 0.2578 | 0.22 | 0.377 | 0.395 | 0.1497 | 0.3903 | 0.4383 | 0.398 | 0.5486 | 0.6314 | 0.0431 | 0.147 | 0.3203 | 0.5743 | 0.4419 | 0.5936 |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.20.3
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fe5094fd07bcc5d2f22048681014b1f8777b595a7a92eb134fba223f8ecc04b9
+oid sha256:959daef7f9d2ff034a692cb2a86f536c68460f397731dd2a9487599e8307aec2
 size 166496880