Training in progress epoch 20

Files changed:
- README.md (+10 -5)
- tf_model.h5 (+1 -1)

README.md
@@ -14,11 +14,11 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 0.
-- Validation Loss: 3.
-- Train F1: 0.
-- Train Accuracy: 0.
-- Epoch:
+- Train Loss: 0.0230
+- Validation Loss: 3.4253
+- Train F1: 0.4311
+- Train Accuracy: 0.5250
+- Epoch: 20
 
 ## Model description
 
@@ -60,6 +60,11 @@ The following hyperparameters were used during training:
 | 0.0562 | 3.1115 | 0.4282 | 0.5222 | 13 |
 | 0.0493 | 3.1710 | 0.4306 | 0.5268 | 14 |
 | 0.0435 | 3.1507 | 0.4280 | 0.5322 | 15 |
+| 0.0391 | 3.3222 | 0.4165 | 0.5110 | 16 |
+| 0.0321 | 3.3243 | 0.4218 | 0.5309 | 17 |
+| 0.0298 | 3.3675 | 0.4252 | 0.5307 | 18 |
+| 0.0255 | 3.4341 | 0.4148 | 0.5217 | 19 |
+| 0.0230 | 3.4253 | 0.4311 | 0.5250 | 20 |
 
 
 ### Framework versions
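The card reports per-epoch Train F1 and Train Accuracy alongside the losses. The commit does not include the evaluation code, so as a purely illustrative sketch (the function names, the binary-classification setup, and the sample labels below are all assumptions, not taken from this repo), here is how those two metrics are computed from label/prediction pairs:

```python
# Illustrative only: this is NOT the model's actual evaluation code.
# It shows the standard definitions behind the "Train F1" and
# "Train Accuracy" columns, assuming a binary classification task.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_f1(y_true, y_pred, positive=1):
    """Harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical sample data, not from the model's dataset.
labels      = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = [1, 0, 1, 0, 0, 1, 1, 0]
print(accuracy(labels, predictions))   # 0.75
print(binary_f1(labels, predictions))  # 0.75
```

Note that for multi-class tasks the reported F1 would instead be a macro- or weighted-average over per-class scores; the card does not say which averaging was used.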
tf_model.h5
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:819442c8872f01846babdeed96162e0cd60179bfe096cdb248a8d44a89fe2241
 size 268031680
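The tf_model.h5 entry above is a Git LFS pointer file, not the weights themselves: a small "key value" text file whose sha256 oid addresses the real 268 MB blob in LFS storage. A minimal sketch of reading such a pointer (the helper name `parse_lfs_pointer` is ours, not part of any library):

```python
# Minimal sketch: parse a Git LFS pointer file like the tf_model.h5
# entry in this commit. The pointer text below is copied from the diff.

POINTER = """\
version https://git-lfs.github.com/spec/v1
oid sha256:819442c8872f01846babdeed96162e0cd60179bfe096cdb248a8d44a89fe2241
size 268031680
"""

def parse_lfs_pointer(text):
    """Split 'key value' lines and decompose the oid into algorithm:digest."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algorithm, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "oid_algorithm": algorithm,
        "oid": digest,
        "size": int(fields["size"]),
    }

info = parse_lfs_pointer(POINTER)
print(info["size"])            # 268031680
print(info["oid_algorithm"])   # sha256
```

Since the old and new pointers report the same size but different oids, this commit replaced the checkpoint contents (epoch-20 weights) without changing the serialized model's byte length.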