---
license: apache-2.0
base_model: SenseTime/deformable-detr
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: deformable_detr_finetuned_rsna_2018
  results: []
---


# deformable_detr_finetuned_rsna_2018

This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on the RSNA 2018 Pneumonia Detection Challenge dataset, loaded with the `imagefolder` dataset builder.
It achieves the following results on the evaluation set:
- Loss: 1.1784
- Map: 0.1070
- Map 50: 0.2451
- Map 75: 0.0775
- Map Small: 0.0225
- Map Medium: 0.1112
- Map Large: 0.1525
- Mar 1: 0.1582
- Mar 10: 0.4060
- Mar 100: 0.6497
- Mar Small: 0.3167
- Mar Medium: 0.6607
- Mar Large: 0.7767
- Map Pneumonia: 0.1070
- Mar 100 Pneumonia: 0.6497
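
The mAP/mAR figures above follow the COCO convention: a predicted box counts as a true positive only when its intersection-over-union (IoU) with a ground-truth box clears a threshold (0.50 for "Map 50", 0.75 for "Map 75", and the average over thresholds 0.50:0.95 for "Map"). A minimal IoU sketch (the box coordinates below are illustrative, not from this dataset):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes in (x_min, y_min, x_max, y_max) format."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two partially overlapping boxes: a hit under the 0.50 threshold ("Map 50")
# but a miss under the stricter 0.75 threshold ("Map 75").
print(iou((0, 0, 100, 100), (25, 0, 125, 100)))  # 0.6
```

This gap between the thresholds is visible in the scores themselves: Map 50 (0.2451) is far higher than Map 75 (0.0775), so the model's boxes are often roughly right but rarely tightly aligned.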

## Model description

A Deformable DETR object detector fine-tuned from the [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) checkpoint to localize pneumonia opacities in chest X-rays. The model predicts bounding boxes for a single class, `Pneumonia`.

## Intended uses & limitations

Intended for research and experimentation on pneumonia localization in chest radiographs. Given the modest evaluation scores (mAP 0.1070, mAP@0.50 0.2451), it is not suitable for clinical or diagnostic use without substantial further validation. Recall is notably weaker on small opacities (mAR small 0.3167 vs. 0.7767 for large objects).
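
For reference, a minimal inference sketch using the Transformers object-detection API. The repository id and image path below are placeholders (this card does not record a published repo id), and running it requires network access to the Hugging Face Hub:

```python
def detect_pneumonia(image_path,
                     repo_id="your-username/deformable_detr_finetuned_rsna_2018",
                     threshold=0.5):
    """Run the fine-tuned detector on one image; returns a list of detections.

    `repo_id` and `image_path` are placeholders. Imports are kept local so the
    sketch can be read (and its signature inspected) without the libraries installed.
    """
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, DeformableDetrForObjectDetection

    processor = AutoImageProcessor.from_pretrained(repo_id)
    model = DeformableDetrForObjectDetection.from_pretrained(repo_id)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Convert logits and normalized boxes to pixel (x_min, y_min, x_max, y_max);
    # target_sizes expects (height, width), while PIL's .size is (width, height).
    results = processor.post_process_object_detection(
        outputs, threshold=threshold, target_sizes=[image.size[::-1]]
    )[0]
    return [
        {"label": model.config.id2label[int(l)], "score": float(s), "box": b.tolist()}
        for l, s, b in zip(results["labels"], results["scores"], results["boxes"])
    ]


if __name__ == "__main__":
    for det in detect_pneumonia("chest_xray.png"):
        print(det)
```

Raising `threshold` trades recall for precision; the Mar 100 figures in this card reflect recall at up to 100 detections per image, so a low threshold was likely used during evaluation.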

## Training and evaluation data

Chest X-ray images with pneumonia bounding-box annotations from the RSNA 2018 Pneumonia Detection Challenge, loaded as a local `imagefolder` dataset. The results table below implies roughly 4,800 training images (301 steps per epoch at a batch size of 16); the exact train/validation split is not recorded in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
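
With `lr_scheduler_type: cosine` and no warmup recorded, the learning rate presumably decays from 5e-05 toward zero over the full run (30 epochs × 301 steps = 9,030 steps, per the results table). A sketch of that schedule shape, ignoring any warmup the card does not record:

```python
import math

BASE_LR = 5e-5
TOTAL_STEPS = 9030  # 30 epochs x 301 steps/epoch, per the results table

def cosine_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Cosine decay from base_lr down to 0 over total_steps (no warmup assumed)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))     # 5e-05: full rate at the start
print(cosine_lr(4515))  # 2.5e-05: half the base rate at the midpoint
print(cosine_lr(9030))  # ~0 at the end of training
```

The slow tail of the cosine curve matches the training log below: validation loss barely moves over the last few epochs (1.1789 → 1.1784) as the step size vanishes.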

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Pneumonia | Mar 100 Pneumonia |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------------:|:-----------------:|
| No log        | 1.0   | 301  | 1.5776          | 0.0290 | 0.0947 | 0.0092 | 0.0027    | 0.0320     | 0.0329    | 0.0576 | 0.2553 | 0.4849  | 0.0667    | 0.4968     | 0.6628    | 0.0290        | 0.4849            |
| 1.8843        | 2.0   | 602  | 1.5194          | 0.0623 | 0.1806 | 0.0287 | 0.0285    | 0.0565     | 0.2087    | 0.0932 | 0.2880 | 0.4894  | 0.0900    | 0.5034     | 0.6349    | 0.0623        | 0.4894            |
| 1.8843        | 3.0   | 903  | 1.4116          | 0.0840 | 0.2315 | 0.0439 | 0.0162    | 0.0785     | 0.2462    | 0.1110 | 0.3184 | 0.5424  | 0.1300    | 0.5573     | 0.6884    | 0.0840        | 0.5424            |
| 1.4595        | 4.0   | 1204 | 1.4335          | 0.0640 | 0.1683 | 0.0328 | 0.0366    | 0.0652     | 0.1351    | 0.1041 | 0.3313 | 0.5402  | 0.0900    | 0.5573     | 0.6907    | 0.0640        | 0.5402            |
| 1.4125        | 5.0   | 1505 | 1.4249          | 0.0681 | 0.1879 | 0.0344 | 0.0517    | 0.0653     | 0.1600    | 0.1137 | 0.3120 | 0.5298  | 0.1133    | 0.5424     | 0.7000    | 0.0681        | 0.5298            |
| 1.4125        | 6.0   | 1806 | 1.4030          | 0.0620 | 0.1636 | 0.0316 | 0.0570    | 0.0615     | 0.1675    | 0.1269 | 0.3244 | 0.5518  | 0.1067    | 0.5659     | 0.7279    | 0.0620        | 0.5518            |
| 1.396         | 7.0   | 2107 | 1.3371          | 0.0755 | 0.2010 | 0.0418 | 0.0611    | 0.0794     | 0.2684    | 0.1149 | 0.3116 | 0.5708  | 0.2333    | 0.5834     | 0.6860    | 0.0755        | 0.5708            |
| 1.396         | 8.0   | 2408 | 1.3428          | 0.0761 | 0.2083 | 0.0380 | 0.0438    | 0.0785     | 0.2026    | 0.1238 | 0.3190 | 0.5712  | 0.1633    | 0.5868     | 0.7070    | 0.0761        | 0.5712            |
| 1.3424        | 9.0   | 2709 | 1.3000          | 0.0720 | 0.1792 | 0.0420 | 0.0746    | 0.0739     | 0.1360    | 0.1348 | 0.3385 | 0.5867  | 0.2000    | 0.5985     | 0.7442    | 0.0720        | 0.5867            |
| 1.3113        | 10.0  | 3010 | 1.2955          | 0.0774 | 0.1925 | 0.0482 | 0.0407    | 0.0759     | 0.1505    | 0.1356 | 0.3511 | 0.5915  | 0.2233    | 0.6027     | 0.7419    | 0.0774        | 0.5915            |
| 1.3113        | 11.0  | 3311 | 1.2706          | 0.0902 | 0.2191 | 0.0595 | 0.0668    | 0.0904     | 0.1601    | 0.1418 | 0.3588 | 0.6133  | 0.1933    | 0.6305     | 0.7419    | 0.0902        | 0.6133            |
| 1.2808        | 12.0  | 3612 | 1.2623          | 0.0870 | 0.2131 | 0.0593 | 0.0560    | 0.0900     | 0.1509    | 0.1356 | 0.3449 | 0.6041  | 0.2500    | 0.6141     | 0.7558    | 0.0870        | 0.6041            |
| 1.2808        | 13.0  | 3913 | 1.2412          | 0.0940 | 0.2286 | 0.0562 | 0.0285    | 0.0996     | 0.1058    | 0.1470 | 0.3843 | 0.6159  | 0.2567    | 0.6283     | 0.7488    | 0.0940        | 0.6159            |
| 1.2554        | 14.0  | 4214 | 1.2547          | 0.0838 | 0.2087 | 0.0590 | 0.0256    | 0.0949     | 0.1144    | 0.1482 | 0.3704 | 0.6174  | 0.2567    | 0.6305     | 0.7442    | 0.0838        | 0.6174            |
| 1.2427        | 15.0  | 4515 | 1.2478          | 0.0905 | 0.2212 | 0.0563 | 0.0814    | 0.1071     | 0.1178    | 0.1474 | 0.3770 | 0.6203  | 0.2200    | 0.6354     | 0.7558    | 0.0905        | 0.6203            |
| 1.2427        | 16.0  | 4816 | 1.2225          | 0.0972 | 0.2415 | 0.0638 | 0.0354    | 0.1005     | 0.1391    | 0.1513 | 0.3754 | 0.6246  | 0.2833    | 0.6378     | 0.7372    | 0.0972        | 0.6246            |
| 1.2246        | 17.0  | 5117 | 1.2105          | 0.0999 | 0.2357 | 0.0755 | 0.0248    | 0.1057     | 0.1150    | 0.1530 | 0.3981 | 0.6244  | 0.2800    | 0.6339     | 0.7744    | 0.0999        | 0.6244            |
| 1.2246        | 18.0  | 5418 | 1.2324          | 0.0813 | 0.1970 | 0.0535 | 0.0207    | 0.0854     | 0.1488    | 0.1209 | 0.3621 | 0.6176  | 0.2967    | 0.6293     | 0.7302    | 0.0813        | 0.6176            |
| 1.2085        | 19.0  | 5719 | 1.1992          | 0.1033 | 0.2369 | 0.0718 | 0.0431    | 0.1040     | 0.1668    | 0.1598 | 0.4002 | 0.6366  | 0.2900    | 0.6490     | 0.7605    | 0.1033        | 0.6366            |
| 1.1883        | 20.0  | 6020 | 1.2163          | 0.0978 | 0.2356 | 0.0640 | 0.0291    | 0.1030     | 0.1321    | 0.1545 | 0.3994 | 0.6282  | 0.2767    | 0.6410     | 0.7512    | 0.0978        | 0.6282            |
| 1.1883        | 21.0  | 6321 | 1.2100          | 0.0995 | 0.2346 | 0.0701 | 0.0353    | 0.1059     | 0.1336    | 0.1547 | 0.3983 | 0.6311  | 0.2567    | 0.6446     | 0.7628    | 0.0995        | 0.6311            |
| 1.1775        | 22.0  | 6622 | 1.1979          | 0.1014 | 0.2343 | 0.0735 | 0.0210    | 0.1080     | 0.1287    | 0.1636 | 0.3915 | 0.6333  | 0.3000    | 0.6446     | 0.7581    | 0.1014        | 0.6333            |
| 1.1775        | 23.0  | 6923 | 1.1840          | 0.1019 | 0.2410 | 0.0697 | 0.0213    | 0.1050     | 0.1525    | 0.1615 | 0.4000 | 0.6427  | 0.3100    | 0.6551     | 0.7558    | 0.1019        | 0.6427            |
| 1.1678        | 24.0  | 7224 | 1.1822          | 0.1066 | 0.2445 | 0.0802 | 0.0245    | 0.1113     | 0.1572    | 0.1689 | 0.4021 | 0.6487  | 0.3200    | 0.6612     | 0.7581    | 0.1066        | 0.6487            |
| 1.1604        | 25.0  | 7525 | 1.1796          | 0.1042 | 0.2409 | 0.0773 | 0.0253    | 0.1082     | 0.1478    | 0.1660 | 0.3954 | 0.6482  | 0.2933    | 0.6605     | 0.7791    | 0.1042        | 0.6482            |
| 1.1604        | 26.0  | 7826 | 1.1830          | 0.1035 | 0.2390 | 0.0771 | 0.0189    | 0.1084     | 0.1452    | 0.1615 | 0.3921 | 0.6395  | 0.3200    | 0.6502     | 0.7605    | 0.1035        | 0.6395            |
| 1.1551        | 27.0  | 8127 | 1.1789          | 0.1082 | 0.2489 | 0.0823 | 0.0236    | 0.1139     | 0.1521    | 0.1636 | 0.3959 | 0.6497  | 0.3033    | 0.6624     | 0.7698    | 0.1082        | 0.6497            |
| 1.1551        | 28.0  | 8428 | 1.1784          | 0.1057 | 0.2429 | 0.0764 | 0.0223    | 0.1104     | 0.1471    | 0.1561 | 0.4033 | 0.6482  | 0.3133    | 0.6600     | 0.7698    | 0.1057        | 0.6482            |
| 1.1471        | 29.0  | 8729 | 1.1783          | 0.1068 | 0.2452 | 0.0773 | 0.0222    | 0.1114     | 0.1501    | 0.1576 | 0.4054 | 0.6480  | 0.3133    | 0.6595     | 0.7721    | 0.1068        | 0.6480            |
| 1.1449        | 30.0  | 9030 | 1.1784          | 0.1070 | 0.2451 | 0.0775 | 0.0225    | 0.1112     | 0.1525    | 0.1582 | 0.4060 | 0.6497  | 0.3167    | 0.6607     | 0.7767    | 0.1070        | 0.6497            |


### Framework versions

- Transformers 4.43.3
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1