letingliu committed on
Commit
d88e8e9
1 Parent(s): a7b6bc9

Training in progress epoch 7

Files changed (2)
  1. README.md +11 -4
  2. tf_model.h5 +1 -1
README.md CHANGED
@@ -14,10 +14,10 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 0.6853
-- Validation Loss: 0.6586
-- Train Accuracy: 0.7788
-- Epoch: 0
+- Train Loss: 0.5129
+- Validation Loss: 0.5051
+- Train Accuracy: 0.9231
+- Epoch: 7
 
 ## Model description
 
@@ -44,6 +44,13 @@ The following hyperparameters were used during training:
 | Train Loss | Validation Loss | Train Accuracy | Epoch |
 |:----------:|:---------------:|:--------------:|:-----:|
 | 0.6853 | 0.6586 | 0.7788 | 0 |
+| 0.6489 | 0.6197 | 0.7788 | 1 |
+| 0.6090 | 0.5693 | 0.8942 | 2 |
+| 0.5617 | 0.5245 | 0.8942 | 3 |
+| 0.5235 | 0.5051 | 0.9231 | 4 |
+| 0.5116 | 0.5051 | 0.9231 | 5 |
+| 0.5112 | 0.5051 | 0.9231 | 6 |
+| 0.5129 | 0.5051 | 0.9231 | 7 |
 
 
 ### Framework versions
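The per-epoch rows added to the README's results table have the shape of a Keras `fit` history (one loss/val-loss/accuracy entry per epoch). As an illustration only, not the actual training script, here is a minimal sketch of rendering such a history dict into the table format above; the helper name `history_to_markdown` is hypothetical:

```python
def history_to_markdown(history):
    """Render a Keras-style history dict (keys: loss, val_loss, accuracy)
    as a markdown table matching the model card's results table."""
    rows = [
        "| Train Loss | Validation Loss | Train Accuracy | Epoch |",
        "|:----------:|:---------------:|:--------------:|:-----:|",
    ]
    metrics = zip(history["loss"], history["val_loss"], history["accuracy"])
    for epoch, (loss, val_loss, acc) in enumerate(metrics):
        rows.append(f"| {loss:.4f} | {val_loss:.4f} | {acc:.4f} | {epoch} |")
    return "\n".join(rows)
```

Feeding it eight epochs of metrics reproduces an eight-row table like the one in this commit.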
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:740e030ac4b71c3622d6cde4ebc6d0d36bc1e53f8be87e05a184669ae251e90a
+oid sha256:442e6903b9f87bd509cdeec82c1b160c92ce94fe458b624e33aaac45ab05eb89
 size 267956392
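The tf_model.h5 entry is a Git LFS pointer file (spec v1), so updating the weights only swaps the sha256 oid while the size happens to stay the same. As a sketch of how such a pointer is derived from a local file (the helper name `lfs_pointer` is hypothetical, not part of any LFS tooling):

```python
import hashlib
import os

def lfs_pointer(path):
    """Build the text of a Git LFS spec-v1 pointer file for `path`:
    a version line, the file's sha256 digest, and its byte size."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so large model files are not read into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{sha.hexdigest()}\n"
        f"size {os.path.getsize(path)}\n"
    )
```

Running it over the new tf_model.h5 would reproduce the pointer shown in the diff above.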