gcapde committed
Commit 16b7f96
1 Parent(s): 7e428f0

update model card README.md

Files changed (1)
  1. README.md +6 -6
README.md CHANGED
@@ -21,7 +21,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-    value: 0.90625
+    value: 0.909
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -31,8 +31,8 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [BSC-TeMU/roberta-base-bne](https://huggingface.co/BSC-TeMU/roberta-base-bne) on the amazon_reviews_multi dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2912
-- Accuracy: 0.9062
+- Loss: 0.2958
+- Accuracy: 0.909

 ## Model description

@@ -63,13 +63,13 @@ The following hyperparameters were used during training:

 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
-| 0.2496        | 1.0   | 63   | 0.2528          | 0.9093   |
-| 0.2034        | 2.0   | 126  | 0.2912          | 0.9062   |
+| 0.2212        | 1.0   | 63   | 0.2900          | 0.901    |
+| 0.1967        | 2.0   | 126  | 0.2958          | 0.909    |


 ### Framework versions

-- Transformers 4.27.1
+- Transformers 4.27.2
 - Pytorch 1.13.1+cu116
 - Datasets 2.10.1
 - Tokenizers 0.13.2
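
Not part of this commit, but for context: the card describes a text-classification model fine-tuned from BSC-TeMU/roberta-base-bne on amazon_reviews_multi, so it can be loaded with the standard Transformers pipeline API. A minimal sketch follows; the repository ID is a placeholder, since this commit does not show the actual repo name.

```python
from transformers import pipeline

# Hypothetical repository ID -- replace with the real fine-tuned checkpoint;
# the commit shown above does not reveal the repo name.
model_id = "gcapde/roberta-base-bne-finetuned-amazon_reviews_multi"

# Load the fine-tuned sequence-classification model and its tokenizer.
classifier = pipeline("text-classification", model=model_id)

# Example Spanish review in the style of amazon_reviews_multi.
print(classifier("El producto llegó rápido y funciona perfectamente."))
```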