---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: sevvalkapcak/newModel2
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# sevvalkapcak/newModel2

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 0.0158
- Validation Loss: 0.4239
- Train Accuracy: 0.933
- Epoch: 35
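
A minimal inference sketch is shown below. It assumes the model carries a sequence-classification head (the task, label names, and preprocessing are not documented in this card, so the input text and label interpretation are placeholders):

```python
# Minimal inference sketch (assumes a sequence-classification head;
# the actual task and labels are not documented in this card).
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_id = "sevvalkapcak/newModel2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

text = "Example input sentence."  # placeholder input
inputs = tokenizer(text, return_tensors="tf", truncation=True, padding=True)
logits = model(**inputs).logits
pred = int(tf.argmax(logits, axis=-1)[0])
print(pred)  # predicted class index; label names are not documented
```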

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 5e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
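
Read as a Keras optimizer, the dumped configuration above corresponds roughly to the sketch below. This is a reconstruction for readability, not the exported training script; the loss, batch size, and data pipeline are not recorded in this card.

```python
# Sketch of the optimizer described above (TensorFlow/Keras 2.15).
# Values are taken from the hyperparameter dump in this card.
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay=None,
    ema_momentum=0.99,  # use_ema is False, so this has no effect
    jit_compile=True,
)
```

The model would then be compiled with `model.compile(optimizer=optimizer, ...)` before `fit`; the loss function and metrics used during training are not documented here.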

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.2465     | 0.2029          | 0.9085         | 0     |
| 0.1354     | 0.1302          | 0.939          | 1     |
| 0.1121     | 0.1588          | 0.934          | 2     |
| 0.0945     | 0.1551          | 0.937          | 3     |
| 0.0815     | 0.1696          | 0.939          | 4     |
| 0.0778     | 0.1647          | 0.932          | 5     |
| 0.0522     | 0.2356          | 0.931          | 6     |
| 0.0444     | 0.2861          | 0.9335         | 7     |
| 0.0329     | 0.2144          | 0.9355         | 8     |
| 0.0290     | 0.2548          | 0.935          | 9     |
| 0.0222     | 0.2866          | 0.93           | 10    |
| 0.0256     | 0.2787          | 0.9385         | 11    |
| 0.0267     | 0.2764          | 0.941          | 12    |
| 0.0201     | 0.2888          | 0.9315         | 13    |
| 0.0221     | 0.2737          | 0.934          | 14    |
| 0.0174     | 0.4403          | 0.93           | 15    |
| 0.0170     | 0.2836          | 0.932          | 16    |
| 0.0214     | 0.3033          | 0.9375         | 17    |
| 0.0125     | 0.3894          | 0.934          | 18    |
| 0.0271     | 0.3687          | 0.9305         | 19    |
| 0.0154     | 0.3817          | 0.9305         | 20    |
| 0.0149     | 0.4736          | 0.93           | 21    |
| 0.0196     | 0.4435          | 0.9325         | 22    |
| 0.0124     | 0.4873          | 0.929          | 23    |
| 0.0157     | 0.4008          | 0.932          | 24    |
| 0.0153     | 0.4074          | 0.931          | 25    |
| 0.0176     | 0.3996          | 0.9295         | 26    |
| 0.0160     | 0.3652          | 0.9355         | 27    |
| 0.0081     | 0.4446          | 0.934          | 28    |
| 0.0098     | 0.5249          | 0.934          | 29    |
| 0.0151     | 0.4112          | 0.937          | 30    |
| 0.0124     | 0.4888          | 0.929          | 31    |
| 0.0146     | 0.5022          | 0.9325         | 32    |
| 0.0130     | 0.5585          | 0.9305         | 33    |
| 0.0102     | 0.4304          | 0.935          | 34    |
| 0.0158     | 0.4239          | 0.933          | 35    |


### Framework versions

- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.1
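
A quick way to check that a local environment matches these versions (a convenience sketch, not part of the original card):

```python
# Verify the installed packages match the versions listed above.
import transformers, tensorflow, datasets, tokenizers

print(transformers.__version__)  # expected 4.35.2
print(tensorflow.__version__)    # expected 2.15.0
print(datasets.__version__)      # expected 2.16.1
print(tokenizers.__version__)    # expected 0.15.1
```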