Andreas Jörg committed
Commit 02f0535
1 Parent(s): 401ddc7

update model card README.md

Files changed (1)
  1. README.md +7 -11
README.md CHANGED
@@ -14,7 +14,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [benjamin/gerpt2-large](https://huggingface.co/benjamin/gerpt2-large) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 4.2487
+ - Loss: 3.4257
 
  ## Model description
 
@@ -40,25 +40,21 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - num_epochs: 8
+ - num_epochs: 4
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss |
  |:-------------:|:-----:|:-----:|:---------------:|
- | 3.5332 | 1.0 | 11347 | 3.5015 |
- | 3.1448 | 2.0 | 22694 | 3.4149 |
- | 2.8081 | 3.0 | 34041 | 3.4495 |
- | 2.4535 | 4.0 | 45388 | 3.5714 |
- | 2.1844 | 5.0 | 56735 | 3.7579 |
- | 1.9625 | 6.0 | 68082 | 3.9492 |
- | 1.7585 | 7.0 | 79429 | 4.1280 |
- | 1.6407 | 8.0 | 90776 | 4.2487 |
+ | 3.4094 | 1.0 | 11308 | 3.3838 |
+ | 3.0445 | 2.0 | 22616 | 3.3107 |
+ | 2.7161 | 3.0 | 33924 | 3.3409 |
+ | 2.4793 | 4.0 | 45232 | 3.4257 |
 
 
  ### Framework versions
 
- - Transformers 4.19.4
+ - Transformers 4.20.1
  - Pytorch 1.11.0+cu113
  - Datasets 2.3.2
  - Tokenizers 0.12.1
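
For context, the sketch below shows how the hyperparameters listed in the updated card could be expressed with the Transformers `Trainer` API (the library and version named under "Framework versions"). This is not the author's training script: the output path, learning rate, batch size, and dataset handling are not shown in this diff and appear here only as labeled placeholders.

```python
# Minimal sketch (not the author's script): the hyperparameters from the
# updated model card, expressed as Transformers TrainingArguments.
# Values not visible in this diff (learning rate, batch size, dataset) are
# placeholders or omitted.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "benjamin/gerpt2-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

training_args = TrainingArguments(
    output_dir="./gerpt2-large-finetuned",  # placeholder output path
    num_train_epochs=4,                     # updated from 8 to 4 in this commit
    lr_scheduler_type="linear",             # as listed in the card
    warmup_steps=500,                       # as listed in the card
    adam_beta1=0.9,                         # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                     # epsilon=1e-08
    evaluation_strategy="epoch",            # assumption: card reports per-epoch eval loss
    # learning_rate and per_device_train_batch_size are not shown in this diff.
)

# The Trainer would then be built with the (unspecified) tokenized dataset:
# from transformers import Trainer
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```

Since this is a causal language model, the reported evaluation loss is (by the Trainer's convention) the mean per-token cross-entropy, so the new eval loss of 3.4257 corresponds to a perplexity of roughly exp(3.4257) ≈ 30.7, down from about 70 at the previous 4.2487.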