Payoto committed
Commit 26360a7
1 parent: fcfbfcd

update model card README.md

Files changed (1): README.md (+7 −7)
README.md CHANGED
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the swag dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4407
-- Accuracy: 0.8367
+- Loss: 0.5093
+- Accuracy: 0.8302
 
 ## Model description
 
@@ -44,8 +44,8 @@ The following hyperparameters were used during training:
 - seed: 42
 - distributed_type: IPU
 - gradient_accumulation_steps: 16
-- total_train_batch_size: 128
-- total_eval_batch_size: 40
+- total_train_batch_size: 32
+- total_eval_batch_size: 10
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 3
@@ -55,9 +55,9 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
-| 0.5306 | 1.0 | 574 | 0.4961 | 0.8163 |
-| 0.515 | 2.0 | 1148 | 0.4294 | 0.8328 |
-| 0.3507 | 3.0 | 1722 | 0.4407 | 0.8367 |
+| 0.9911 | 1.0 | 2298 | 0.5308 | 0.7929 |
+| 0.3024 | 2.0 | 4596 | 0.4780 | 0.8161 |
+| 0.0785 | 3.0 | 6894 | 0.5093 | 0.8302 |
 
 
 ### Framework versions
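
A quick consistency sketch for the changed hyperparameters: `total_train_batch_size` is conventionally per-device micro-batch × `gradient_accumulation_steps` × number of replicas, so the new total of 32 with accumulation 16 is consistent with a micro-batch of 2 on one replica (the per-device value and replica count are assumptions; the card only lists totals). The per-epoch step counts in the results table also line up with the SWAG train split (~73,546 examples) divided by the total batch size, dropping the last partial batch:

```python
# Sketch only: per-device micro-batch and replica count are assumptions,
# not stated in the model card.

def total_batch_size(per_device: int, grad_accum: int, replicas: int) -> int:
    # Effective batch = per-device micro-batch * accumulation steps * replicas
    return per_device * grad_accum * replicas

SWAG_TRAIN_EXAMPLES = 73_546  # approximate size of the SWAG "regular" train split

def steps_per_epoch(n_examples: int, batch: int) -> int:
    # Trainers typically drop the last partial batch
    return n_examples // batch

print(total_batch_size(2, 16, 1))                 # 32, the new total_train_batch_size
print(steps_per_epoch(SWAG_TRAIN_EXAMPLES, 32))   # 2298, matching the new table
print(steps_per_epoch(SWAG_TRAIN_EXAMPLES, 128))  # 574, matching the old table
```

This explains why the step count per epoch grew from 574 to 2298 between the two versions: the total train batch size shrank from 128 to 32 while the dataset stayed the same.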