davanstrien (HF staff) committed 465db3b (1 parent: 806682f)

update model card README.md

Files changed (1): README.md (+115 −0)

README.md ADDED
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: convnext-base-224_finetuned_on_ImageIn_annotations
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# convnext-base-224_finetuned_on_ImageIn_annotations

This model is a fine-tuned version of [facebook/convnext-base-224](https://huggingface.co/facebook/convnext-base-224) on the ImageIn annotations dataset (the dataset identifier was not recorded by the Trainer, so the auto-generated card listed it as "None").
It achieves the following results on the evaluation set:
- Loss: 0.0749
- Precision: 0.9722
- Recall: 0.9811
- F1: 0.9765
- Accuracy: 0.9824
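
As a quick-start sketch (not part of the auto-generated card): assuming the checkpoint is published on the Hub as `davanstrien/convnext-base-224_finetuned_on_ImageIn_annotations` (a repo id inferred from the committer's namespace and the model name above), it can be loaded with the `transformers` image-classification pipeline. The label set comes from the fine-tuned model's config.

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the model name and the committer's
# namespace; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="davanstrien/convnext-base-224_finetuned_on_ImageIn_annotations",
)

# Any image path, URL, or PIL.Image works as input.
for prediction in classifier("example.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```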

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
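
These settings map onto `transformers.TrainingArguments` roughly as below. This is a sketch, not the exact training script: the `output_dir` and the per-epoch `evaluation_strategy` (suggested by the per-epoch results table that follows) are assumptions not recorded in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnext-base-224_finetuned_on_ImageIn_annotations",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
)
```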

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 83   | 0.1368          | 0.9748    | 0.9632 | 0.9688 | 0.9772   |
| No log        | 2.0   | 166  | 0.0734          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| No log        | 3.0   | 249  | 0.0693          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| No log        | 4.0   | 332  | 0.0698          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| No log        | 5.0   | 415  | 0.0688          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| No log        | 6.0   | 498  | 0.0690          | 0.9729    | 0.9751 | 0.9740 | 0.9807   |
| 0.0947        | 7.0   | 581  | 0.0666          | 0.9689    | 0.9800 | 0.9743 | 0.9807   |
| 0.0947        | 8.0   | 664  | 0.0642          | 0.9689    | 0.9800 | 0.9743 | 0.9807   |
| 0.0947        | 9.0   | 747  | 0.0790          | 0.9763    | 0.9763 | 0.9763 | 0.9824   |
| 0.0947        | 10.0  | 830  | 0.0813          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| 0.0947        | 11.0  | 913  | 0.0797          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| 0.0947        | 12.0  | 996  | 0.0791          | 0.9763    | 0.9763 | 0.9763 | 0.9824   |
| 0.0205        | 13.0  | 1079 | 0.0871          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| 0.0205        | 14.0  | 1162 | 0.0716          | 0.9722    | 0.9811 | 0.9765 | 0.9824   |
| 0.0205        | 15.0  | 1245 | 0.0746          | 0.9776    | 0.9799 | 0.9787 | 0.9842   |
| 0.0205        | 16.0  | 1328 | 0.0917          | 0.9738    | 0.9692 | 0.9714 | 0.9789   |
| 0.0205        | 17.0  | 1411 | 0.0694          | 0.9776    | 0.9799 | 0.9787 | 0.9842   |
| 0.0205        | 18.0  | 1494 | 0.0697          | 0.9768    | 0.9859 | 0.9812 | 0.9859   |
| 0.0166        | 19.0  | 1577 | 0.0689          | 0.9702    | 0.9835 | 0.9766 | 0.9824   |
| 0.0166        | 20.0  | 1660 | 0.0995          | 0.9738    | 0.9692 | 0.9714 | 0.9789   |
| 0.0166        | 21.0  | 1743 | 0.0847          | 0.9776    | 0.9799 | 0.9787 | 0.9842   |
| 0.0166        | 22.0  | 1826 | 0.0843          | 0.9776    | 0.9799 | 0.9787 | 0.9842   |
| 0.0166        | 23.0  | 1909 | 0.0869          | 0.9750    | 0.9727 | 0.9739 | 0.9807   |
| 0.0166        | 24.0  | 1992 | 0.0762          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0125        | 25.0  | 2075 | 0.0778          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0125        | 26.0  | 2158 | 0.0834          | 0.9763    | 0.9763 | 0.9763 | 0.9824   |
| 0.0125        | 27.0  | 2241 | 0.0818          | 0.9776    | 0.9799 | 0.9787 | 0.9842   |
| 0.0125        | 28.0  | 2324 | 0.0756          | 0.9684    | 0.9859 | 0.9768 | 0.9824   |
| 0.0125        | 29.0  | 2407 | 0.1150          | 0.9591    | 0.9824 | 0.9700 | 0.9772   |
| 0.0125        | 30.0  | 2490 | 0.0781          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0111        | 31.0  | 2573 | 0.0793          | 0.9716    | 0.9871 | 0.9790 | 0.9842   |
| 0.0111        | 32.0  | 2656 | 0.0713          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0111        | 33.0  | 2739 | 0.0802          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0111        | 34.0  | 2822 | 0.0636          | 0.9802    | 0.9870 | 0.9835 | 0.9877   |
| 0.0111        | 35.0  | 2905 | 0.0702          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0111        | 36.0  | 2988 | 0.0773          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0145        | 37.0  | 3071 | 0.0663          | 0.9781    | 0.9894 | 0.9836 | 0.9877   |
| 0.0145        | 38.0  | 3154 | 0.0721          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0145        | 39.0  | 3237 | 0.0708          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0145        | 40.0  | 3320 | 0.0729          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0145        | 41.0  | 3403 | 0.0760          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0145        | 42.0  | 3486 | 0.0771          | 0.9716    | 0.9871 | 0.9790 | 0.9842   |
| 0.0106        | 43.0  | 3569 | 0.0713          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0106        | 44.0  | 3652 | 0.0721          | 0.9748    | 0.9883 | 0.9813 | 0.9859   |
| 0.0106        | 45.0  | 3735 | 0.0732          | 0.9768    | 0.9859 | 0.9812 | 0.9859   |
| 0.0106        | 46.0  | 3818 | 0.0783          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0106        | 47.0  | 3901 | 0.0770          | 0.9789    | 0.9835 | 0.9811 | 0.9859   |
| 0.0106        | 48.0  | 3984 | 0.0744          | 0.9735    | 0.9847 | 0.9789 | 0.9842   |
| 0.0082        | 49.0  | 4067 | 0.0752          | 0.9722    | 0.9811 | 0.9765 | 0.9824   |
| 0.0082        | 50.0  | 4150 | 0.0749          | 0.9722    | 0.9811 | 0.9765 | 0.9824   |

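Per-epoch metrics like these usually come from a `compute_metrics` function passed to the `Trainer`. A plausible sketch using the `evaluate` library is below; the exact implementation and the precision/recall/F1 averaging scheme (binary here) are assumptions, since the card does not record them.

```python
import evaluate
import numpy as np

precision = evaluate.load("precision")
recall = evaluate.load("recall")
f1 = evaluate.load("f1")
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # "binary" averaging is an assumption; the card does not say how
    # precision/recall/F1 were aggregated across classes.
    return {
        "precision": precision.compute(predictions=predictions, references=labels, average="binary")["precision"],
        "recall": recall.compute(predictions=predictions, references=labels, average="binary")["recall"],
        "f1": f1.compute(predictions=predictions, references=labels, average="binary")["f1"],
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
    }
```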

### Framework versions

- Transformers 4.22.1
- PyTorch 1.12.1+cu113
- Datasets 2.5.1
- Tokenizers 0.12.1