---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: sevvalkapcak/newModel2
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# sevvalkapcak/newModel2

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results at the final training epoch (train metrics are on the training set; loss is also reported on the validation set):
- Train Loss: 0.0138
- Validation Loss: 0.6631
- Train Accuracy: 0.9225
- Epoch: 89

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam (learning_rate: 5e-05, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay: None, clipnorm: None, global_clipnorm: None, clipvalue: None, use_ema: False, ema_momentum: 0.99, ema_overwrite_frequency: None, jit_compile: True, is_legacy_optimizer: False)
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.2465     | 0.2029          | 0.9085         | 0     |
| 0.1354     | 0.1302          | 0.939          | 1     |
| 0.1121     | 0.1588          | 0.934          | 2     |
| 0.0945     | 0.1551          | 0.937          | 3     |
| 0.0815     | 0.1696          | 0.939          | 4     |
| 0.0778     | 0.1647          | 0.932          | 5     |
| 0.0522     | 0.2356          | 0.931          | 6     |
| 0.0444     | 0.2861          | 0.9335         | 7     |
| 0.0329     | 0.2144          | 0.9355         | 8     |
| 0.0290     | 0.2548          | 0.935          | 9     |
| 0.0222     | 0.2866          | 0.93           | 10    |
| 0.0256     | 0.2787          | 0.9385         | 11    |
| 0.0267     | 0.2764          | 0.941          | 12    |
| 0.0201     | 0.2888          | 0.9315         | 13    |
| 0.0221     | 0.2737          | 0.934          | 14    |
| 0.0174     | 0.4403          | 0.93           | 15    |
| 0.0170     | 0.2836          | 0.932          | 16    |
| 0.0214     | 0.3033          | 0.9375         | 17    |
| 0.0125     | 0.3894          | 0.934          | 18    |
| 0.0271     | 0.3687          | 0.9305         | 19    |
| 0.0154     | 0.3817          | 0.9305         | 20    |
| 0.0149     | 0.4736          | 0.93           | 21    |
| 0.0196     | 0.4435          | 0.9325         | 22    |
| 0.0124     | 0.4873          | 0.929          | 23    |
| 0.0157     | 0.4008          | 0.932          | 24    |
| 0.0153     | 0.4074          | 0.931          | 25    |
| 0.0176     | 0.3996          | 0.9295         | 26    |
| 0.0160     | 0.3652          | 0.9355         | 27    |
| 0.0081     | 0.4446          | 0.934          | 28    |
| 0.0098     | 0.5249          | 0.934          | 29    |
| 0.0151     | 0.4112          | 0.937          | 30    |
| 0.0124     | 0.4888          | 0.929          | 31    |
| 0.0146     | 0.5022          | 0.9325         | 32    |
| 0.0130     | 0.5585          | 0.9305         | 33    |
| 0.0102     | 0.4304          | 0.935          | 34    |
| 0.0158     | 0.4239          | 0.933          | 35    |
| 0.0156     | 0.4849          | 0.93           | 36    |
| 0.0153     | 0.5097          | 0.9245         | 37    |
| 0.0135     | 0.4689          | 0.934          | 38    |
| 0.0178     | 0.4578          | 0.9285         | 39    |
| 0.0124     | 0.4083          | 0.9275         | 40    |
| 0.0106     | 0.4946          | 0.926          | 41    |
| 0.0098     | 0.4908          | 0.927          | 42    |
| 0.0131     | 0.5604          | 0.928          | 43    |
| 0.0143     | 0.4226          | 0.9315         | 44    |
| 0.0105     | 0.5664          | 0.9245         | 45    |
| 0.0189     | 0.5121          | 0.925          | 46    |
| 0.0148     | 0.5259          | 0.9245         | 47    |
| 0.0090     | 0.4567          | 0.9295         | 48    |
| 0.0156     | 0.4633          | 0.926          | 49    |
| 0.0128     | 0.5222          | 0.9295         | 50    |
| 0.0118     | 0.5461          | 0.921          | 51    |
| 0.0172     | 0.4626          | 0.927          | 52    |
| 0.0129     | 0.5266          | 0.922          | 53    |
| 0.0159     | 0.5203          | 0.925          | 54    |
| 0.0106     | 0.5360          | 0.9265         | 55    |
| 0.0158     | 0.4766          | 0.9305         | 56    |
| 0.0106     | 0.5630          | 0.926          | 57    |
| 0.0142     | 0.6162          | 0.922          | 58    |
| 0.0137     | 0.5518          | 0.916          | 59    |
| 0.0083     | 0.6281          | 0.9155         | 60    |
| 0.0071     | 0.6263          | 0.9245         | 61    |
| 0.0116     | 0.6166          | 0.9235         | 62    |
| 0.0162     | 0.5217          | 0.9195         | 63    |
| 0.0158     | 0.6366          | 0.9215         | 64    |
| 0.0120     | 0.5511          | 0.9245         | 65    |
| 0.0093     | 0.4895          | 0.9225         | 66    |
| 0.0094     | 0.5207          | 0.9255         | 67    |
| 0.0067     | 0.6252          | 0.9275         | 68    |
| 0.0058     | 0.6934          | 0.9235         | 69    |
| 0.0055     | 0.6577          | 0.928          | 70    |
| 0.0073     | 0.5865          | 0.9255         | 71    |
| 0.0336     | 0.4875          | 0.9175         | 72    |
| 0.0177     | 0.5256          | 0.923          | 73    |
| 0.0143     | 0.5042          | 0.917          | 74    |
| 0.0076     | 0.6803          | 0.9225         | 75    |
| 0.0114     | 0.5571          | 0.9205         | 76    |
| 0.0118     | 0.5649          | 0.9235         | 77    |
| 0.0147     | 0.5592          | 0.9245         | 78    |
| 0.0109     | 0.6044          | 0.9195         | 79    |
| 0.0095     | 0.6940          | 0.921          | 80    |
| 0.0139     | 0.6246          | 0.9245         | 81    |
| 0.0145     | 0.7057          | 0.917          | 82    |
| 0.0147     | 0.6455          | 0.9155         | 83    |
| 0.0100     | 0.6044          | 0.922          | 84    |
| 0.0074     | 0.6786          | 0.92           | 85    |
| 0.0093     | 0.7300          | 0.9125         | 86    |
| 0.0152     | 0.6264          | 0.9205         | 87    |
| 0.0115     | 0.6208          | 0.915          | 88    |
| 0.0138     | 0.6631          | 0.9225         | 89    |
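Note that validation loss bottoms out at epoch 1 (0.1302) and rises steadily afterwards while train loss keeps shrinking, which is the classic overfitting signature; selecting the checkpoint with the lowest validation loss would likely serve better than the final epoch. A minimal sketch (pure Python, using a handful of rows copied from the table above) of that selection:

```python
# (epoch, train_loss, val_loss) for a few representative rows of the
# training-results table above (illustrative subset, not the full history).
history = [
    (0, 0.2465, 0.2029),
    (1, 0.1354, 0.1302),
    (10, 0.0222, 0.2866),
    (30, 0.0151, 0.4112),
    (60, 0.0083, 0.6281),
    (89, 0.0138, 0.6631),
]

# Early-stopping-style selection: keep the epoch with the lowest
# validation loss rather than the last one trained.
best_epoch, best_train, best_val = min(history, key=lambda row: row[2])
print(best_epoch, best_val)  # -> 1 0.1302
```

In Keras this roughly corresponds to adding `tf.keras.callbacks.EarlyStopping(monitor="val_loss", restore_best_weights=True)` to the `fit` call.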


### Framework versions

- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.1