Commit c3ffc05 by emrecan (parent: 01766c8)

update model card README.md

Files changed (1): README.md added (+146 lines)
---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dbmdz_bert-base-turkish-cased_allnli_tr
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dbmdz_bert-base-turkish-cased_allnli_tr

This model is a fine-tuned version of [dbmdz/bert-base-turkish-cased](https://huggingface.co/dbmdz/bert-base-turkish-cased) on an unspecified dataset (the model name suggests a Turkish AllNLI corpus).
It achieves the following results on the evaluation set:
- Loss: 0.5771
- Accuracy: 0.7978
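The card does not include a usage snippet, so here is a minimal, hedged sketch of how a sequence-classification NLI checkpoint like this is typically loaded with `transformers`. The Hub repository id (`MODEL_ID`) and the label order are assumptions, not stated in this card; check the model's `config.json` (`id2label`) for the actual mapping.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: replace MODEL_ID with this repository's actual path on the Hub.
MODEL_ID = "emrecan/dbmdz_bert-base-turkish-cased_allnli_tr"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)

premise = "Bir adam sokakta gitar çalıyor."   # "A man is playing guitar on the street."
hypothesis = "Bir adam müzik yapıyor."        # "A man is making music."

# Encode premise and hypothesis as a sentence pair, as is standard for NLI models.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Do not assume a fixed entailment/neutral/contradiction order;
# read the mapping from the model config instead.
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])
```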
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Trainer` sketch based on them follows this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
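For orientation only, the hyperparameters above map onto roughly the `Trainer` configuration sketched below. Only the listed values come from this card; the output directory, dataset variables, number of labels, and evaluation cadence (the results table reports an eval every 1000 steps) are assumptions, not the original training script.

```python
import numpy as np
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Values mirror the hyperparameter list above; everything else is assumed.
# Trainer's default optimizer already uses Adam betas=(0.9, 0.999), eps=1e-8.
training_args = TrainingArguments(
    output_dir="dbmdz_bert-base-turkish-cased_allnli_tr",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # the results table logs eval every 1000 steps
    eval_steps=1000,
    logging_steps=1000,
)

def compute_metrics(eval_pred):
    # Plain accuracy, matching the "accuracy" metric declared in the card metadata.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}

tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "dbmdz/bert-base-turkish-cased", num_labels=3  # 3-way NLI labels assumed
)

# Placeholders: the card does not state which dataset was used
# (the name suggests a Turkish AllNLI corpus).
train_dataset = None
eval_dataset = None

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train()  # would require real, tokenized datasets above
```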
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8559 | 0.03 | 1000 | 0.7577 | 0.6798 |
| 0.6612 | 0.07 | 2000 | 0.7263 | 0.6958 |
| 0.6115 | 0.1 | 3000 | 0.6431 | 0.7364 |
| 0.5916 | 0.14 | 4000 | 0.6347 | 0.7407 |
| 0.5719 | 0.17 | 5000 | 0.6317 | 0.7483 |
| 0.5575 | 0.2 | 6000 | 0.6034 | 0.7544 |
| 0.5521 | 0.24 | 7000 | 0.6148 | 0.7568 |
| 0.5393 | 0.27 | 8000 | 0.5931 | 0.7610 |
| 0.5382 | 0.31 | 9000 | 0.5866 | 0.7665 |
| 0.5306 | 0.34 | 10000 | 0.5881 | 0.7594 |
| 0.5295 | 0.37 | 11000 | 0.6120 | 0.7632 |
| 0.5225 | 0.41 | 12000 | 0.5620 | 0.7759 |
| 0.5112 | 0.44 | 13000 | 0.5641 | 0.7769 |
| 0.5133 | 0.48 | 14000 | 0.5571 | 0.7798 |
| 0.5023 | 0.51 | 15000 | 0.5719 | 0.7722 |
| 0.5017 | 0.54 | 16000 | 0.5482 | 0.7844 |
| 0.5111 | 0.58 | 17000 | 0.5503 | 0.7800 |
| 0.4929 | 0.61 | 18000 | 0.5502 | 0.7836 |
| 0.4923 | 0.65 | 19000 | 0.5424 | 0.7843 |
| 0.4894 | 0.68 | 20000 | 0.5417 | 0.7851 |
| 0.4877 | 0.71 | 21000 | 0.5514 | 0.7841 |
| 0.4818 | 0.75 | 22000 | 0.5494 | 0.7848 |
| 0.4898 | 0.78 | 23000 | 0.5450 | 0.7859 |
| 0.4823 | 0.82 | 24000 | 0.5417 | 0.7878 |
| 0.4806 | 0.85 | 25000 | 0.5354 | 0.7875 |
| 0.4779 | 0.88 | 26000 | 0.5338 | 0.7848 |
| 0.4744 | 0.92 | 27000 | 0.5277 | 0.7934 |
| 0.4678 | 0.95 | 28000 | 0.5507 | 0.7871 |
| 0.4727 | 0.99 | 29000 | 0.5603 | 0.7789 |
| 0.4243 | 1.02 | 30000 | 0.5626 | 0.7894 |
| 0.3955 | 1.05 | 31000 | 0.5324 | 0.7939 |
| 0.4022 | 1.09 | 32000 | 0.5322 | 0.7925 |
| 0.3976 | 1.12 | 33000 | 0.5450 | 0.7920 |
| 0.3913 | 1.15 | 34000 | 0.5464 | 0.7948 |
| 0.406 | 1.19 | 35000 | 0.5406 | 0.7958 |
| 0.3875 | 1.22 | 36000 | 0.5489 | 0.7878 |
| 0.4024 | 1.26 | 37000 | 0.5427 | 0.7925 |
| 0.3988 | 1.29 | 38000 | 0.5335 | 0.7904 |
| 0.393 | 1.32 | 39000 | 0.5415 | 0.7923 |
| 0.3988 | 1.36 | 40000 | 0.5385 | 0.7962 |
| 0.3912 | 1.39 | 41000 | 0.5383 | 0.7950 |
| 0.3949 | 1.43 | 42000 | 0.5415 | 0.7931 |
| 0.3902 | 1.46 | 43000 | 0.5438 | 0.7893 |
| 0.3948 | 1.49 | 44000 | 0.5348 | 0.7906 |
| 0.3921 | 1.53 | 45000 | 0.5361 | 0.7890 |
| 0.3944 | 1.56 | 46000 | 0.5419 | 0.7953 |
| 0.3959 | 1.6 | 47000 | 0.5402 | 0.7967 |
| 0.3926 | 1.63 | 48000 | 0.5429 | 0.7925 |
| 0.3854 | 1.66 | 49000 | 0.5346 | 0.7959 |
| 0.3864 | 1.7 | 50000 | 0.5241 | 0.7979 |
| 0.385 | 1.73 | 51000 | 0.5149 | 0.8002 |
| 0.3871 | 1.77 | 52000 | 0.5325 | 0.8002 |
| 0.3819 | 1.8 | 53000 | 0.5332 | 0.8022 |
| 0.384 | 1.83 | 54000 | 0.5419 | 0.7873 |
| 0.3899 | 1.87 | 55000 | 0.5225 | 0.7974 |
| 0.3894 | 1.9 | 56000 | 0.5358 | 0.7977 |
| 0.3838 | 1.94 | 57000 | 0.5264 | 0.7988 |
| 0.3881 | 1.97 | 58000 | 0.5280 | 0.7956 |
| 0.3756 | 2.0 | 59000 | 0.5601 | 0.7969 |
| 0.3156 | 2.04 | 60000 | 0.5936 | 0.7925 |
| 0.3125 | 2.07 | 61000 | 0.5898 | 0.7938 |
| 0.3179 | 2.11 | 62000 | 0.5591 | 0.7981 |
| 0.315 | 2.14 | 63000 | 0.5853 | 0.7970 |
| 0.3122 | 2.17 | 64000 | 0.5802 | 0.7979 |
| 0.3105 | 2.21 | 65000 | 0.5758 | 0.7979 |
| 0.3076 | 2.24 | 66000 | 0.5685 | 0.7980 |
| 0.3117 | 2.28 | 67000 | 0.5799 | 0.7944 |
| 0.3108 | 2.31 | 68000 | 0.5742 | 0.7988 |
| 0.3047 | 2.34 | 69000 | 0.5907 | 0.7921 |
| 0.3114 | 2.38 | 70000 | 0.5723 | 0.7937 |
| 0.3035 | 2.41 | 71000 | 0.5944 | 0.7955 |
| 0.3129 | 2.45 | 72000 | 0.5838 | 0.7928 |
| 0.3071 | 2.48 | 73000 | 0.5929 | 0.7949 |
| 0.3061 | 2.51 | 74000 | 0.5794 | 0.7967 |
| 0.3068 | 2.55 | 75000 | 0.5892 | 0.7954 |
| 0.3053 | 2.58 | 76000 | 0.5796 | 0.7962 |
| 0.3117 | 2.62 | 77000 | 0.5763 | 0.7981 |
| 0.3062 | 2.65 | 78000 | 0.5852 | 0.7964 |
| 0.3004 | 2.68 | 79000 | 0.5793 | 0.7966 |
| 0.3146 | 2.72 | 80000 | 0.5693 | 0.7985 |
| 0.3146 | 2.75 | 81000 | 0.5788 | 0.7982 |
| 0.3079 | 2.79 | 82000 | 0.5726 | 0.7978 |
| 0.3058 | 2.82 | 83000 | 0.5677 | 0.7988 |
| 0.3055 | 2.85 | 84000 | 0.5701 | 0.7982 |
| 0.3049 | 2.89 | 85000 | 0.5809 | 0.7970 |
| 0.3044 | 2.92 | 86000 | 0.5741 | 0.7986 |
| 0.3057 | 2.96 | 87000 | 0.5743 | 0.7980 |
| 0.3081 | 2.99 | 88000 | 0.5771 | 0.7978 |

### Framework versions

- Transformers 4.12.3
- Pytorch 1.10.0+cu102
- Datasets 1.15.1
- Tokenizers 0.10.3