Amine committed
Commit 99875ae (1 parent: 2f40c6f)

MT-legendary-capybara-96

Files changed (2)
  1. README.md +100 -0
  2. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,100 @@
---
base_model: toobiza/MT-ancient-spaceship-83
tags:
- generated_from_trainer
model-index:
- name: MT-legendary-capybara-96
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MT-legendary-capybara-96

This model is a fine-tuned version of [toobiza/MT-ancient-spaceship-83](https://huggingface.co/toobiza/MT-ancient-spaceship-83) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1572
- Loss Ce: 0.0000
- Loss Bbox: 0.0216
- Cardinality Error: 1.0
- Giou: 97.5514

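The loss components above (loss_ce, loss_bbox, GIoU, cardinality error) are characteristic of DETR-style object detection, so the checkpoint can most likely be loaded with the object-detection auto classes. A minimal loading sketch under that assumption; the hub id `toobiza/MT-legendary-capybara-96` is inferred from the card name and base model, not stated in the card, so verify both against the repo's config.json:

```python
# Minimal sketch, assuming a DETR-style object-detection checkpoint
# (suggested by the loss_ce / loss_bbox / GIoU metrics above).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

model_id = "toobiza/MT-legendary-capybara-96"  # assumed repo id, not stated in the card
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)
model.eval()

image = Image.open("page.png").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded (score, label, box) results;
# target_sizes expects (height, width), PIL's .size is (width, height).
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```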
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

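These settings map one-to-one onto `transformers.TrainingArguments`. A hedged sketch of that mapping; `output_dir` is a placeholder, and the dataset/collator plumbing a real run needs is omitted because the card does not specify it:

```python
# Sketch: the card's hyperparameters expressed as TrainingArguments
# (Transformers 4.33.2 per the framework versions below).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="MT-legendary-capybara-96",  # placeholder, not from the card
    learning_rate=1e-5,                     # learning_rate: 1e-05
    per_device_train_batch_size=4,          # train_batch_size: 4
    per_device_eval_batch_size=4,           # eval_batch_size: 4
    seed=42,                                # seed: 42
    adam_beta1=0.9,                         # optimizer: Adam, betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                      # epsilon=1e-08
    lr_scheduler_type="linear",             # lr_scheduler_type: linear
    num_train_epochs=10,                    # num_epochs: 10
)
```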
### Training results

| Training Loss | Epoch | Step | Validation Loss | Loss Ce | Loss Bbox | Cardinality Error | Giou |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:-----------------:|:-------:|
| 0.2851 | 0.24 | 200 | 0.1903 | 0.0000 | 0.0263 | 1.0 | 97.0566 |
| 0.1809 | 0.48 | 400 | 0.1726 | 0.0000 | 0.0237 | 1.0 | 97.2974 |
| 0.1909 | 0.73 | 600 | 0.1923 | 0.0000 | 0.0268 | 1.0 | 97.0772 |
| 0.1808 | 0.97 | 800 | 0.1745 | 0.0000 | 0.0239 | 1.0 | 97.2598 |
| 0.169 | 1.21 | 1000 | 0.1774 | 0.0000 | 0.0245 | 1.0 | 97.2469 |
| 0.1916 | 1.45 | 1200 | 0.1800 | 0.0000 | 0.0249 | 1.0 | 97.2128 |
| 0.1511 | 1.69 | 1400 | 0.1810 | 0.0000 | 0.0251 | 1.0 | 97.2199 |
| 0.1205 | 1.93 | 1600 | 0.1811 | 0.0000 | 0.0251 | 1.0 | 97.2107 |
| 0.0905 | 2.18 | 1800 | 0.1816 | 0.0000 | 0.0252 | 1.0 | 97.2090 |
| 0.1175 | 2.42 | 2000 | 0.1789 | 0.0000 | 0.0247 | 1.0 | 97.2187 |
| 0.1781 | 2.66 | 2200 | 0.1713 | 0.0000 | 0.0236 | 1.0 | 97.3242 |
| 0.1751 | 2.9 | 2400 | 0.1886 | 0.0000 | 0.0261 | 1.0 | 97.0914 |
| 0.1084 | 3.14 | 2600 | 0.1692 | 0.0000 | 0.0232 | 1.0 | 97.3369 |
| 0.1171 | 3.39 | 2800 | 0.1570 | 0.0000 | 0.0216 | 1.0 | 97.5552 |
| 0.1191 | 3.63 | 3000 | 0.1859 | 0.0000 | 0.0259 | 1.0 | 97.1879 |
| 0.1515 | 3.87 | 3200 | 0.1598 | 0.0000 | 0.0221 | 1.0 | 97.5370 |
| 0.1529 | 4.11 | 3400 | 0.1750 | 0.0000 | 0.0240 | 1.0 | 97.2571 |
| 0.1169 | 4.35 | 3600 | 0.1627 | 0.0000 | 0.0224 | 1.0 | 97.4536 |
| 0.1433 | 4.59 | 3800 | 0.1764 | 0.0000 | 0.0244 | 1.0 | 97.2739 |
| 0.0873 | 4.84 | 4000 | 0.1536 | 0.0000 | 0.0209 | 1.0 | 97.5448 |
| 0.1176 | 5.08 | 4200 | 0.1545 | 0.0000 | 0.0212 | 1.0 | 97.5786 |
| 0.0921 | 5.32 | 4400 | 0.1580 | 0.0000 | 0.0216 | 1.0 | 97.5027 |
| 0.0894 | 5.56 | 4600 | 0.1579 | 0.0000 | 0.0216 | 1.0 | 97.5178 |
| 0.0843 | 5.8 | 4800 | 0.1604 | 0.0000 | 0.0220 | 1.0 | 97.4857 |
| 0.1446 | 6.05 | 5000 | 0.1692 | 0.0000 | 0.0233 | 1.0 | 97.3695 |
| 0.0929 | 6.29 | 5200 | 0.1723 | 0.0000 | 0.0238 | 1.0 | 97.3369 |
| 0.0831 | 6.53 | 5400 | 0.1638 | 0.0000 | 0.0225 | 1.0 | 97.4370 |
| 0.093 | 6.77 | 5600 | 0.1606 | 0.0000 | 0.0220 | 1.0 | 97.4782 |
| 0.0869 | 7.01 | 5800 | 0.1604 | 0.0000 | 0.0220 | 1.0 | 97.4893 |
| 0.1183 | 7.26 | 6000 | 0.1599 | 0.0000 | 0.0219 | 1.0 | 97.4886 |
| 0.0807 | 7.5 | 6200 | 0.1614 | 0.0000 | 0.0222 | 1.0 | 97.4926 |
| 0.0851 | 7.74 | 6400 | 0.1642 | 0.0000 | 0.0226 | 1.0 | 97.4411 |
| 0.1279 | 7.98 | 6600 | 0.1596 | 0.0000 | 0.0220 | 1.0 | 97.5193 |
| 0.0828 | 8.22 | 6800 | 0.1606 | 0.0000 | 0.0222 | 1.0 | 97.5183 |
| 0.0933 | 8.46 | 7000 | 0.1576 | 0.0000 | 0.0217 | 1.0 | 97.5506 |
| 0.085 | 8.71 | 7200 | 0.1584 | 0.0000 | 0.0218 | 1.0 | 97.5329 |
| 0.0736 | 8.95 | 7400 | 0.1564 | 0.0000 | 0.0215 | 1.0 | 97.5616 |
| 0.1001 | 9.19 | 7600 | 0.1581 | 0.0000 | 0.0217 | 1.0 | 97.5258 |
| 0.075 | 9.43 | 7800 | 0.1575 | 0.0000 | 0.0217 | 1.0 | 97.5435 |
| 0.0714 | 9.67 | 8000 | 0.1571 | 0.0000 | 0.0216 | 1.0 | 97.5487 |
| 0.0881 | 9.92 | 8200 | 0.1572 | 0.0000 | 0.0216 | 1.0 | 97.5514 |

### Framework versions

- Transformers 4.33.2
- PyTorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.13.3
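To rule out version drift when reproducing the evaluation numbers, the installed packages can be compared against this list. A small check, assuming the four libraries are importable locally:

```python
# Compare the local environment against the versions the card lists.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.33.2",
    "torch": "2.1.0+cu118",
    "datasets": "2.14.6",
    "tokenizers": "0.13.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card lists {want}")
```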
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d3fde0dcb698d599c7f5c4a022cd2e1d3f1a2548426836a89fb2c3facfc12fdd
+oid sha256:fcaf2de15a6b1f85b691c34115c296cde99fdcbf0e5c763af44aa2d33fc0f05b
 size 115385222
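The weights file size is unchanged (115385222 bytes) while the sha256 oid changed, i.e. this commit replaces the trained weights in place. A sketch for verifying a downloaded copy against the new pointer; the local path is an assumption about where the file was saved:

```python
# Verify a downloaded pytorch_model.bin against the LFS pointer above.
# The oid and size come from the diff; the path is an assumption.
import hashlib
from pathlib import Path

EXPECTED_OID = "fcaf2de15a6b1f85b691c34115c296cde99fdcbf0e5c763af44aa2d33fc0f05b"
EXPECTED_SIZE = 115385222

path = Path("pytorch_model.bin")  # assumed local download path
assert path.stat().st_size == EXPECTED_SIZE, "size mismatch"

digest = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        digest.update(chunk)
assert digest.hexdigest() == EXPECTED_OID, "sha256 mismatch"
print("pytorch_model.bin matches the LFS pointer")
```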