Training in progress epoch 25

- README.md +10 -5
- tf_model.h5 +1 -1
README.md CHANGED
@@ -14,11 +14,11 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 0.
-- Validation Loss: 3.
-- Train F1: 0.
-- Train Accuracy: 0.
-- Epoch:
+- Train Loss: 0.0126
+- Validation Loss: 3.6501
+- Train F1: 0.4317
+- Train Accuracy: 0.5275
+- Epoch: 25
 
 ## Model description
 
@@ -65,6 +65,11 @@ The following hyperparameters were used during training:
 | 0.0298 | 3.3675 | 0.4252 | 0.5307 | 18 |
 | 0.0255 | 3.4341 | 0.4148 | 0.5217 | 19 |
 | 0.0230 | 3.4253 | 0.4311 | 0.5250 | 20 |
+| 0.0195 | 3.5133 | 0.4278 | 0.5233 | 21 |
+| 0.0166 | 3.5915 | 0.4277 | 0.5301 | 22 |
+| 0.0165 | 3.5547 | 0.4191 | 0.5340 | 23 |
+| 0.0142 | 3.6109 | 0.4333 | 0.5362 | 24 |
+| 0.0126 | 3.6501 | 0.4317 | 0.5275 | 25 |
 
 
 ### Framework versions
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:31d3d358a2efc883e5b03b8e23e9a9dc9385652d85f801db235697f9b397d722
 size 268031680
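The tf_model.h5 entry above is a Git LFS pointer: it records only a SHA-256 digest (`oid`) and a byte size, not the weights themselves. A minimal sketch, using only the Python standard library (the helper names are hypothetical, not part of any Hugging Face API), of checking a downloaded `tf_model.h5` against the oid and size from this commit:

```python
import hashlib
import os

# Values taken from the LFS pointer in this commit.
EXPECTED_OID = "31d3d358a2efc883e5b03b8e23e9a9dc9385652d85f801db235697f9b397d722"
EXPECTED_SIZE = 268031680

def sha256_of_file(path, chunk_size=1 << 20):
    """Hash the file in chunks so a ~268 MB checkpoint never sits in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_lfs_pointer(path, oid, size):
    """True if the local file's size and SHA-256 match the LFS pointer."""
    return os.path.getsize(path) == size and sha256_of_file(path) == oid
```

Checking the size first is a cheap way to fail fast before hashing a quarter-gigabyte file.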