---
license: apache-2.0
base_model: IDEA-Research/grounding-dino-tiny
tags:
- generated_from_trainer
model-index:
- name: grounding-dino-tiny-aquarium-fine-tune
  results: []
datasets:
- EduardoPacheco/aquarium
---

# grounding-dino-tiny-aquarium-fine-tune

This model is a fine-tuned version of [IDEA-Research/grounding-dino-tiny](https://huggingface.co/IDEA-Research/grounding-dino-tiny) on an [aquarium dataset](https://huggingface.co/datasets/EduardoPacheco/aquarium).
It achieves the following results on the evaluation set:
- Loss: 18.3797
- Map: 0.1008
- Map 50: 0.1716
- Map 75: 0.0971
- Map Small: -1.0
- Map Medium: 0.1054
- Map Large: 0.1024
- Mar 1: 0.1063
- Mar 10: 0.1758
- Mar 100: 0.2843
- Mar Small: -1.0
- Mar Medium: 0.1848
- Mar Large: 0.2881
- Map Fish: 0.0827
- Mar 100 Fish: 0.411
- Map Jellyfish: 0.1291
- Mar 100 Jellyfish: 0.4026
- Map Penguins: 0.0963
- Mar 100 Penguins: 0.426
- Map Sharks: 0.0336
- Mar 100 Sharks: 0.0561
- Map Puffins: 0.04
- Mar 100 Puffins: 0.1
- Map Stingrays: 0.2407
- Mar 100 Stingrays: 0.4042
- Map Starfish: 0.0829
- Mar 100 Starfish: 0.1906
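For reference, Map 50 and Map 75 are mean average precision at IoU thresholds of 0.5 and 0.75, and the -1.0 values for the "Small" buckets follow the COCO evaluation convention of returning -1 when a size bucket contains no ground-truth boxes. A minimal IoU helper (a sketch of the matching criterion, not the evaluator used for these numbers):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes in [x0, y0, x1, y1] form."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])  # intersection top-left
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])  # intersection bottom-right
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```

A prediction counts as a true positive at a given threshold when its IoU with an unmatched ground-truth box of the same class meets that threshold.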

## Model description

Grounding DINO is an open-set object detector that conditions a DETR-style detection head on free-text prompts, so the classes to detect are given at inference time as a text query rather than a fixed label set. This checkpoint fine-tunes the tiny variant on the seven classes of the aquarium dataset: fish, jellyfish, penguins, sharks, puffins, stingrays, and starfish.

## Intended uses & limitations

The model is intended for text-prompted detection of the seven aquarium classes it was fine-tuned on. Note the modest overall performance (mAP 0.1008 on the evaluation set): recall is reasonable for fish, jellyfish, penguins, and stingrays (Mar 100 around 0.4), but both precision and recall are poor for sharks and puffins. The evaluation set contains no small objects (the -1.0 values), so small-object performance is untested.
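A minimal inference sketch using the Transformers zero-shot object detection API (argument names as of the Transformers 4.45 series used here). The model id below points at the base checkpoint; swap in the path of this fine-tuned checkpoint, and note that Grounding DINO expects lowercase class names, each terminated by a period:

```python
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForZeroShotObjectDetection

# Replace with the path/repo id of this fine-tuned checkpoint.
model_id = "IDEA-Research/grounding-dino-tiny"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForZeroShotObjectDetection.from_pretrained(model_id)

# Any RGB image works here; this is a standard COCO demo image.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Lowercase labels, each ending with a period, per the Grounding DINO prompt format.
text = "fish. jellyfish. penguins. sharks. puffins. stingrays. starfish."

inputs = processor(images=image, text=text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into per-image detections in pixel coordinates.
results = processor.post_process_grounded_object_detection(
    outputs,
    inputs.input_ids,
    box_threshold=0.3,
    text_threshold=0.3,
    target_sizes=[image.size[::-1]],
)
```

Each entry of `results` is a dict with `scores`, `labels` (matched phrases), and `boxes` in `[x0, y0, x1, y1]` pixel coordinates.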

## Training and evaluation data

The model was trained and evaluated on the [EduardoPacheco/aquarium](https://huggingface.co/datasets/EduardoPacheco/aquarium) dataset, which annotates seven classes: fish, jellyfish, penguins, sharks, puffins, stingrays, and starfish. At 112 optimizer steps per epoch with a batch size of 4, the training split contains roughly 448 images.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
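The warmup-then-linear-decay schedule above can be reproduced in plain PyTorch. This is a sketch (the actual run used the Trainer's built-in scheduler), with a stand-in module in place of the Grounding DINO model and the total step count taken from the results table (10 epochs × 112 steps = 1120):

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

# Stand-in for the Grounding DINO model being fine-tuned.
model = nn.Linear(4, 2)

# Hyperparameters from the card.
total_steps, warmup_steps = 1120, 300
optimizer = torch.optim.Adam(
    model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8
)

def lr_lambda(step: int) -> float:
    # Linear warmup for the first 300 steps, then linear decay to zero.
    if step < warmup_steps:
        return step / warmup_steps
    return max(0.0, (total_steps - step) / (total_steps - warmup_steps))

scheduler = LambdaLR(optimizer, lr_lambda)
```

Calling `scheduler.step()` after each `optimizer.step()` ramps the learning rate from 0 to 5e-5 over the first 300 steps, then decays it linearly to 0 by step 1120.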

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Fish | Mar 100 Fish | Map Jellyfish | Mar 100 Jellyfish | Map Penguins | Mar 100 Penguins | Map Sharks | Mar 100 Sharks | Map Puffins | Mar 100 Puffins | Map Stingrays | Mar 100 Stingrays | Map Starfish | Mar 100 Starfish |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:-------------:|:-----------------:|:------------:|:----------------:|:----------:|:--------------:|:-----------:|:---------------:|:-------------:|:-----------------:|:------------:|:----------------:|
| 47786.9107    | 1.0   | 112  | 47816.1406      | 0.0006 | 0.0019 | 0.0003 | -1.0      | 0.0012     | 0.0008    | 0.0005 | 0.0071 | 0.0116  | -1.0      | 0.0455     | 0.0107    | 0.0028   | 0.0491       | 0.0017        | 0.0323            | 0.0          | 0.0              | 0.0        | 0.0            | 0.0         | 0.0             | 0.0           | 0.0               | 0.0          | 0.0              |
| 36235.5402    | 2.0   | 224  | 26297.4707      | 0.0036 | 0.0117 | 0.0021 | -1.0      | 0.0002     | 0.0039    | 0.0014 | 0.0092 | 0.0326  | -1.0      | 0.0119     | 0.0341    | 0.0182   | 0.1514       | 0.0069        | 0.0768            | 0.0          | 0.0              | 0.0        | 0.0            | 0.0         | 0.0             | 0.0           | 0.0               | 0.0          | 0.0              |
| 13131.6674    | 3.0   | 336  | 4944.3315       | 0.0022 | 0.0039 | 0.0016 | -1.0      | 0.0005     | 0.0022    | 0.0017 | 0.005  | 0.0085  | -1.0      | 0.0107     | 0.0087    | 0.0139   | 0.0546       | 0.0013        | 0.0052            | 0.0          | 0.0              | 0.0        | 0.0            | 0.0         | 0.0             | 0.0           | 0.0               | 0.0          | 0.0              |
| 1436.2895     | 4.0   | 448  | 160.7001        | 0.0065 | 0.0175 | 0.004  | -1.0      | 0.0074     | 0.0068    | 0.0044 | 0.0129 | 0.0322  | -1.0      | 0.0119     | 0.0336    | 0.0202   | 0.1621       | 0.0254        | 0.0632            | 0.0          | 0.0              | 0.0        | 0.0            | 0.0         | 0.0             | 0.0           | 0.0               | 0.0          | 0.0              |
| 48.6727       | 5.0   | 560  | 19.5311         | 0.0183 | 0.0474 | 0.0105 | -1.0      | 0.0086     | 0.0197    | 0.0081 | 0.0338 | 0.0768  | -1.0      | 0.0399     | 0.0798    | 0.0332   | 0.2379       | 0.0663        | 0.211             | 0.0288       | 0.0885           | 0.0        | 0.0            | 0.0         | 0.0             | 0.0           | 0.0               | 0.0          | 0.0              |
| 19.4237       | 6.0   | 672  | 18.9543         | 0.0649 | 0.12   | 0.0633 | -1.0      | 0.1113     | 0.0658    | 0.0817 | 0.1352 | 0.2257  | -1.0      | 0.1698     | 0.2284    | 0.0611   | 0.3724       | 0.0829        | 0.3542            | 0.0444       | 0.3154           | 0.0        | 0.0            | 0.0113      | 0.0574          | 0.202         | 0.3583            | 0.0529       | 0.1219           |
| 18.6905       | 7.0   | 784  | 18.5298         | 0.0855 | 0.1458 | 0.0805 | -1.0      | 0.0869     | 0.087     | 0.109  | 0.1788 | 0.2767  | -1.0      | 0.165      | 0.2801    | 0.0615   | 0.3822       | 0.1007        | 0.3826            | 0.0772       | 0.3538           | 0.0107     | 0.0258         | 0.02        | 0.0852          | 0.2416        | 0.4792            | 0.0871       | 0.2281           |
| 18.3457       | 8.0   | 896  | 18.6190         | 0.0808 | 0.1485 | 0.0724 | -1.0      | 0.1232     | 0.0818    | 0.1071 | 0.1821 | 0.2959  | -1.0      | 0.201      | 0.2993    | 0.0644   | 0.4116       | 0.1096        | 0.4142            | 0.0761       | 0.4067           | 0.0208     | 0.0833         | 0.0386      | 0.1574          | 0.2184        | 0.4542            | 0.0379       | 0.1437           |
| 18.2089       | 9.0   | 1008 | 18.3819         | 0.0997 | 0.1662 | 0.0979 | -1.0      | 0.0888     | 0.1015    | 0.098  | 0.1657 | 0.2626  | -1.0      | 0.1938     | 0.2653    | 0.0726   | 0.3886       | 0.1283        | 0.3903            | 0.0993       | 0.3904           | 0.0207     | 0.05           | 0.0236      | 0.0593          | 0.2596        | 0.425             | 0.094        | 0.1344           |
| 17.7603       | 10.0  | 1120 | 18.3797         | 0.1008 | 0.1716 | 0.0971 | -1.0      | 0.1054     | 0.1024    | 0.1063 | 0.1758 | 0.2843  | -1.0      | 0.1848     | 0.2881    | 0.0827   | 0.411        | 0.1291        | 0.4026            | 0.0963       | 0.426            | 0.0336     | 0.0561         | 0.04        | 0.1             | 0.2407        | 0.4042            | 0.0829       | 0.1906           |


### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1