---

license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---



# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the [CPPE-5](https://huggingface.co/datasets/cppe-5) dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2426
- Map: 0.268
- Map 50: 0.5294
- Map 75: 0.2419
- Map Small: 0.1163
- Map Medium: 0.2288
- Map Large: 0.5006
- Mar 1: 0.2865
- Mar 10: 0.4475
- Mar 100: 0.4749
- Mar Small: 0.3002
- Mar Medium: 0.4623
- Mar Large: 0.7345
- Map Coverall: 0.5546
- Mar 100 Coverall: 0.6736
- Map Face Shield: 0.1674
- Mar 100 Face Shield: 0.4833
- Map Gloves: 0.1944
- Mar 100 Gloves: 0.3662
- Map Goggles: 0.1199
- Mar 100 Goggles: 0.4421
- Map Mask: 0.3036
- Mar 100 Mask: 0.4092
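
The metric names above (Map, Map 50, Map 75, Mar 1/10/100, plus the per-class values) match the output of `torchmetrics`' `MeanAveragePrecision`, which is likely how they were computed. A minimal sketch with hypothetical predictions and targets:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Hypothetical predictions and ground truth for one image, boxes as (xmin, ymin, xmax, ymax)
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),   # e.g. 0 = coverall
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()  # keys include map, map_50, map_75, mar_1, mar_10, mar_100, ...
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```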

## Model description

This is [Conditional DETR](https://huggingface.co/microsoft/conditional-detr-resnet-50) with a ResNet-50 backbone, fine-tuned for object detection of the five CPPE-5 personal protective equipment categories: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting medical personal protective equipment (coveralls, face shields, gloves, goggles, and masks) in images. Performance varies considerably by class and object size: coveralls are detected well (mAP 0.55), face shields and goggles remain weak (mAP 0.17 and 0.12), and small objects are much harder than large ones (mAP 0.12 vs. 0.50). A minimal inference sketch follows.
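
The sketch below shows how such a checkpoint could be loaded and run with the Transformers object-detection API; the repo id `detr_finetuned_cppe5` and the input image path are placeholders, not confirmed by this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "detr_finetuned_cppe5"  # placeholder: local path or Hub repo id of this model

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("ppe_example.jpg")  # placeholder input image

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into per-detection scores, labels and absolute-pixel boxes
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} at {box.tolist()}")
```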

## Training and evaluation data

The per-class metrics above correspond to the five classes of the [CPPE-5](https://huggingface.co/datasets/cppe-5) dataset, which is presumably what the model was fine-tuned and evaluated on; the exact split used for the evaluation set is not recorded here. A loading sketch follows.
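
A minimal sketch for loading CPPE-5 from the Hugging Face Hub, assuming it is indeed the dataset used:

```python
from datasets import load_dataset

# CPPE-5 ships COCO-style annotations: each example has an image plus an
# "objects" dict with "bbox" (xywh) and "category" (0-4) lists.
cppe5 = load_dataset("cppe-5")

print(cppe5)                          # available splits and sizes
sample = cppe5["train"][0]
print(sample["objects"]["category"])  # class ids for the objects in this image
print(sample["objects"]["bbox"])      # matching bounding boxes
```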

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
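
A sketch of `TrainingArguments` reproducing the settings above; `output_dir`, the evaluation schedule, and the collator-related flags are assumptions (the Adam betas and epsilon listed above are the Transformers defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",    # assumption
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    evaluation_strategy="epoch",          # assumption: evaluated once per epoch, as in the table below
    remove_unused_columns=False,          # typically required for detection collators (assumption)
)
```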

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 102  | 2.3847          | 0.008  | 0.0259 | 0.0031 | 0.0071    | 0.0113     | 0.0323    | 0.0303 | 0.1094 | 0.1661  | 0.1002    | 0.1711     | 0.2087    | 0.0105       | 0.207            | 0.0034          | 0.0681              | 0.0078     | 0.2167         | 0.0025      | 0.0807          | 0.0156   | 0.2583       |
| No log        | 2.0   | 204  | 2.1346          | 0.0292 | 0.0819 | 0.0147 | 0.0126    | 0.0268     | 0.042     | 0.0773 | 0.1631 | 0.2064  | 0.0815    | 0.1912     | 0.2543    | 0.0873       | 0.3791           | 0.0083          | 0.1153              | 0.0116     | 0.2069         | 0.0         | 0.0             | 0.0388   | 0.3306       |
| No log        | 3.0   | 306  | 2.0183          | 0.0546 | 0.1289 | 0.0413 | 0.0288    | 0.0352     | 0.0606    | 0.1234 | 0.2329 | 0.2765  | 0.1078    | 0.2436     | 0.3393    | 0.1671       | 0.5567           | 0.0187          | 0.2167              | 0.0094     | 0.2157         | 0.0346      | 0.0912          | 0.0431   | 0.3024       |
| No log        | 4.0   | 408  | 1.9305          | 0.0779 | 0.1742 | 0.0587 | 0.02      | 0.0476     | 0.1114    | 0.1417 | 0.2692 | 0.3059  | 0.0926    | 0.2783     | 0.4861    | 0.27         | 0.59             | 0.0335          | 0.3208              | 0.0214     | 0.2377         | 0.0036      | 0.086           | 0.061    | 0.2951       |
| 3.3072        | 5.0   | 510  | 1.7155          | 0.1186 | 0.2742 | 0.0935 | 0.0303    | 0.0871     | 0.2195    | 0.1434 | 0.3145 | 0.3401  | 0.1573    | 0.3076     | 0.5985    | 0.3762       | 0.5582           | 0.0213          | 0.2569              | 0.0463     | 0.2892         | 0.0081      | 0.2333          | 0.1414   | 0.3626       |
| 3.3072        | 6.0   | 612  | 1.6430          | 0.1371 | 0.301  | 0.1037 | 0.0347    | 0.1046     | 0.2527    | 0.1733 | 0.3374 | 0.3664  | 0.1605    | 0.3393     | 0.5973    | 0.4248       | 0.601            | 0.0216          | 0.2875              | 0.0572     | 0.3059         | 0.0243      | 0.2877          | 0.1577   | 0.35         |
| 3.3072        | 7.0   | 714  | 1.5879          | 0.1537 | 0.345  | 0.1218 | 0.0534    | 0.1358     | 0.2768    | 0.1933 | 0.3528 | 0.3807  | 0.1858    | 0.3554     | 0.6497    | 0.4326       | 0.6124           | 0.034           | 0.3486              | 0.0719     | 0.2917         | 0.0246      | 0.2877          | 0.2056   | 0.3631       |
| 3.3072        | 8.0   | 816  | 1.5310          | 0.1649 | 0.3587 | 0.1398 | 0.0579    | 0.1333     | 0.3036    | 0.1925 | 0.3729 | 0.3946  | 0.1854    | 0.3742     | 0.6476    | 0.4674       | 0.6353           | 0.0415          | 0.3722              | 0.0933     | 0.301          | 0.0387      | 0.3333          | 0.1836   | 0.3311       |
| 3.3072        | 9.0   | 918  | 1.4758          | 0.1789 | 0.3922 | 0.1478 | 0.0668    | 0.1372     | 0.3167    | 0.2241 | 0.3745 | 0.405   | 0.2063    | 0.3752     | 0.6691    | 0.4539       | 0.6199           | 0.063           | 0.4139              | 0.1041     | 0.3108         | 0.0375      | 0.3053          | 0.236    | 0.3752       |
| 1.4864        | 10.0  | 1020 | 1.4622          | 0.1735 | 0.3827 | 0.1333 | 0.05      | 0.1411     | 0.354     | 0.2103 | 0.371  | 0.3951  | 0.183     | 0.3664     | 0.6752    | 0.4784       | 0.6313           | 0.053           | 0.3903              | 0.1182     | 0.3186         | 0.0195      | 0.3053          | 0.1985   | 0.3301       |
| 1.4864        | 11.0  | 1122 | 1.4252          | 0.1858 | 0.4134 | 0.1496 | 0.0591    | 0.1632     | 0.3561    | 0.2227 | 0.3873 | 0.4144  | 0.1911    | 0.4137     | 0.6645    | 0.4794       | 0.6488           | 0.0752          | 0.4153              | 0.1131     | 0.3034         | 0.0292      | 0.3439          | 0.2319   | 0.3607       |
| 1.4864        | 12.0  | 1224 | 1.3893          | 0.1973 | 0.4218 | 0.1643 | 0.0749    | 0.169      | 0.4139    | 0.242  | 0.4054 | 0.4302  | 0.2226    | 0.4175     | 0.6991    | 0.4854       | 0.6413           | 0.0662          | 0.4292              | 0.1319     | 0.3397         | 0.0503      | 0.3561          | 0.2529   | 0.3845       |
| 1.4864        | 13.0  | 1326 | 1.3891          | 0.1998 | 0.431  | 0.1596 | 0.0675    | 0.1829     | 0.3762    | 0.2277 | 0.3962 | 0.4222  | 0.1979    | 0.4311     | 0.7011    | 0.504        | 0.6428           | 0.0911          | 0.4333              | 0.1384     | 0.3353         | 0.0552      | 0.3702          | 0.2101   | 0.3296       |
| 1.4864        | 14.0  | 1428 | 1.3981          | 0.193  | 0.42   | 0.1614 | 0.0698    | 0.1693     | 0.3523    | 0.235  | 0.3978 | 0.4271  | 0.2379    | 0.42       | 0.6729    | 0.4962       | 0.6557           | 0.0681          | 0.4278              | 0.136      | 0.3451         | 0.0493      | 0.3298          | 0.2155   | 0.3772       |
| 1.2306        | 15.0  | 1530 | 1.3472          | 0.217  | 0.4617 | 0.1785 | 0.0857    | 0.1817     | 0.4264    | 0.2416 | 0.4046 | 0.4329  | 0.2377    | 0.4143     | 0.7007    | 0.5137       | 0.6363           | 0.0968          | 0.4611              | 0.1571     | 0.3475         | 0.0484      | 0.3509          | 0.2689   | 0.3684       |
| 1.2306        | 16.0  | 1632 | 1.3450          | 0.227  | 0.4747 | 0.1915 | 0.0861    | 0.1891     | 0.4373    | 0.2521 | 0.4104 | 0.439   | 0.2503    | 0.4112     | 0.7344    | 0.5183       | 0.6428           | 0.1179          | 0.4514              | 0.1589     | 0.3289         | 0.0684      | 0.3912          | 0.2717   | 0.3806       |
| 1.2306        | 17.0  | 1734 | 1.2998          | 0.2359 | 0.4833 | 0.202  | 0.1089    | 0.1972     | 0.4426    | 0.2661 | 0.4303 | 0.4475  | 0.2792    | 0.4221     | 0.6999    | 0.5251       | 0.6463           | 0.12            | 0.4556              | 0.1646     | 0.3466         | 0.0857      | 0.393           | 0.284    | 0.3961       |
| 1.2306        | 18.0  | 1836 | 1.2995          | 0.2376 | 0.4866 | 0.1989 | 0.0926    | 0.2056     | 0.4487    | 0.2711 | 0.4325 | 0.4575  | 0.2798    | 0.4319     | 0.7195    | 0.522        | 0.6542           | 0.1299          | 0.475               | 0.1636     | 0.3544         | 0.0838      | 0.4018          | 0.2884   | 0.4019       |
| 1.2306        | 19.0  | 1938 | 1.2998          | 0.2362 | 0.4948 | 0.1954 | 0.1036    | 0.1905     | 0.4647    | 0.2563 | 0.4277 | 0.4446  | 0.249     | 0.4216     | 0.7165    | 0.5308       | 0.6672           | 0.1334          | 0.4722              | 0.1829     | 0.3407         | 0.0721      | 0.3772          | 0.2617   | 0.3655       |
| 1.0733        | 20.0  | 2040 | 1.2773          | 0.2513 | 0.5082 | 0.2298 | 0.1057    | 0.2148     | 0.4873    | 0.2723 | 0.4393 | 0.4678  | 0.2749    | 0.4514     | 0.7437    | 0.5342       | 0.6652           | 0.1499          | 0.4556              | 0.1754     | 0.3534         | 0.1101      | 0.4561          | 0.287    | 0.4087       |
| 1.0733        | 21.0  | 2142 | 1.2668          | 0.2516 | 0.5077 | 0.2323 | 0.1048    | 0.2104     | 0.4929    | 0.2758 | 0.4353 | 0.4592  | 0.2787    | 0.4287     | 0.7393    | 0.541        | 0.6692           | 0.1386          | 0.4653              | 0.1778     | 0.3525         | 0.1074      | 0.4018          | 0.2933   | 0.4073       |
| 1.0733        | 22.0  | 2244 | 1.2665          | 0.2496 | 0.5166 | 0.2143 | 0.114     | 0.2045     | 0.4759    | 0.2609 | 0.4314 | 0.454   | 0.2708    | 0.4246     | 0.7292    | 0.5355       | 0.6577           | 0.1393          | 0.4556              | 0.182      | 0.3657         | 0.1069      | 0.4             | 0.2842   | 0.3913       |
| 1.0733        | 23.0  | 2346 | 1.2512          | 0.2585 | 0.5258 | 0.2298 | 0.1196    | 0.2121     | 0.4884    | 0.2789 | 0.4465 | 0.4695  | 0.2991    | 0.4453     | 0.7262    | 0.5455       | 0.6672           | 0.1491          | 0.4833              | 0.1899     | 0.373          | 0.1149      | 0.4211          | 0.2931   | 0.4029       |
| 1.0733        | 24.0  | 2448 | 1.2511          | 0.2639 | 0.5275 | 0.2388 | 0.1198    | 0.2218     | 0.511     | 0.2845 | 0.4464 | 0.47    | 0.2911    | 0.4511     | 0.7377    | 0.5482       | 0.6657           | 0.1549          | 0.4694              | 0.192      | 0.3725         | 0.125       | 0.4386          | 0.2994   | 0.4039       |
| 0.9823        | 25.0  | 2550 | 1.2495          | 0.2629 | 0.5392 | 0.2309 | 0.1173    | 0.2213     | 0.4926    | 0.2828 | 0.4429 | 0.467   | 0.2888    | 0.4478     | 0.7363    | 0.549        | 0.6672           | 0.1633          | 0.4792              | 0.1931     | 0.3652         | 0.1181      | 0.4263          | 0.2908   | 0.3971       |
| 0.9823        | 26.0  | 2652 | 1.2470          | 0.2653 | 0.5276 | 0.2364 | 0.1136    | 0.2258     | 0.5082    | 0.2884 | 0.4486 | 0.4715  | 0.3017    | 0.4567     | 0.7313    | 0.5535       | 0.6701           | 0.1641          | 0.475               | 0.192      | 0.3672         | 0.1162      | 0.4368          | 0.3007   | 0.4083       |
| 0.9823        | 27.0  | 2754 | 1.2471          | 0.2661 | 0.5287 | 0.2366 | 0.1138    | 0.227      | 0.5013    | 0.2809 | 0.4483 | 0.4736  | 0.2986    | 0.4636     | 0.7286    | 0.5519       | 0.6711           | 0.1687          | 0.4806              | 0.1934     | 0.3676         | 0.1135      | 0.4404          | 0.3031   | 0.4083       |
| 0.9823        | 28.0  | 2856 | 1.2434          | 0.2673 | 0.5291 | 0.242  | 0.1156    | 0.229      | 0.5028    | 0.2866 | 0.4462 | 0.4745  | 0.3008    | 0.461      | 0.7367    | 0.5555       | 0.6736           | 0.1651          | 0.4806              | 0.1951     | 0.3662         | 0.1179      | 0.4421          | 0.3028   | 0.4102       |
| 0.9823        | 29.0  | 2958 | 1.2427          | 0.2676 | 0.5272 | 0.2425 | 0.116     | 0.2286     | 0.5       | 0.2863 | 0.4472 | 0.4745  | 0.299     | 0.4623     | 0.7343    | 0.554        | 0.6721           | 0.1675          | 0.4833              | 0.1942     | 0.3667         | 0.1195      | 0.4404          | 0.3027   | 0.4102       |
| 0.9316        | 30.0  | 3060 | 1.2426          | 0.268  | 0.5294 | 0.2419 | 0.1163    | 0.2288     | 0.5006    | 0.2865 | 0.4475 | 0.4749  | 0.3002    | 0.4623     | 0.7345    | 0.5546       | 0.6736           | 0.1674          | 0.4833              | 0.1944     | 0.3662         | 0.1199      | 0.4421          | 0.3036   | 0.4092       |


### Framework versions

- Transformers 4.41.1
- Pytorch 2.2.2+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1