Payoto committed on
Commit
77935be
1 Parent(s): cc0a8cf

update model card README.md

Files changed (1): README.md +7 -7
README.md CHANGED
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the swag dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5098
-- Accuracy: 0.8273
+- Loss: 0.4382
+- Accuracy: 0.8390
 
 ## Model description
 
@@ -44,8 +44,8 @@ The following hyperparameters were used during training:
 - seed: 42
 - distributed_type: IPU
 - gradient_accumulation_steps: 16
-- total_train_batch_size: 32
-- total_eval_batch_size: 10
+- total_train_batch_size: 128
+- total_eval_batch_size: 40
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 3
@@ -55,9 +55,9 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
-| 1.0432        | 1.0   | 2298 | 0.5479          | 0.7869   |
-| 0.274         | 2.0   | 4596 | 0.4749          | 0.8155   |
-| 0.0939        | 3.0   | 6894 | 0.5098          | 0.8273   |
+| 0.5707        | 1.0   | 574  | 0.4990          | 0.8097   |
+| 0.5092        | 2.0   | 1148 | 0.4321          | 0.8361   |
+| 0.3597        | 3.0   | 1722 | 0.4382          | 0.8390   |
 
 
 ### Framework versions
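
The `total_train_batch_size` and `total_eval_batch_size` entries in the diff are combined (effective) batch sizes, not per-replica ones, which is why halving or quadrupling them shifts the step counts in the results table. A minimal sketch of how such a total is typically composed on a replicated IPU setup; the per-replica batch size of 2 and replication factor of 4 below are assumptions for illustration, not values stated in the card:

```python
def effective_batch_size(per_replica_batch: int,
                         grad_accum_steps: int,
                         replication_factor: int) -> int:
    """Total samples contributing to one optimizer step across all replicas."""
    return per_replica_batch * grad_accum_steps * replication_factor


# Assumed split (not stated in the card): per-replica batch 2, 4 replicas.
# 2 * 16 * 4 matches the updated total_train_batch_size of 128.
print(effective_batch_size(2, 16, 4))  # 128
```

Under this reading, the old total of 32 and the new total of 128 share the same `gradient_accumulation_steps: 16`, so the change comes from the per-replica batch and/or replication factor rather than the accumulation schedule.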