RogerB committed
Commit da750f5
1 Parent(s): 8c8966c

End of training

Files changed (2)
  1. README.md +6 -6
  2. config.json +6 -6
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [RogerB/afro-xlmr-large-kinte-domain-kinte-task](https://huggingface.co/RogerB/afro-xlmr-large-kinte-domain-kinte-task) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.9121
-- F1: 0.6803
+- Loss: 0.9268
+- F1: 0.6910
 
 ## Model description
 
@@ -49,14 +49,14 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | F1     |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.9553        | 1.0   | 1013 | 0.7033          | 0.7093 |
-| 0.7532        | 2.0   | 2026 | 0.5574          | 0.7787 |
-| 0.6321        | 3.0   | 3039 | 0.5224          | 0.8074 |
+| 0.9124        | 1.0   | 1013 | 0.6422          | 0.7362 |
+| 0.7181        | 2.0   | 2026 | 0.5277          | 0.7920 |
+| 0.6082        | 3.0   | 3039 | 0.5014          | 0.8197 |
 
 
 ### Framework versions
 
 - Transformers 4.34.1
 - Pytorch 2.1.0+cu118
-- Datasets 2.14.5
+- Datasets 2.14.6
 - Tokenizers 0.14.1
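The model card above reports an F1 score alongside the validation loss, but it does not state the averaging method. As a minimal sketch, assuming macro averaging (per-class F1 averaged uniformly), the metric can be computed in plain Python:

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: compute F1 per class, then average uniformly.

    Note: the model card does not specify the averaging method; macro
    averaging is an assumption for illustration.
    """
    scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores.append(f1)
    return sum(scores) / len(scores)
```

In practice, `evaluate`'s or scikit-learn's `f1_score` with an explicit `average=` argument would typically be used instead.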
config.json CHANGED
@@ -11,16 +11,16 @@
   "hidden_dropout_prob": 0.1,
   "hidden_size": 1024,
   "id2label": {
-    "0": "LABEL_0",
-    "1": "LABEL_1",
-    "2": "LABEL_2"
+    "0": "positive",
+    "1": "neutral",
+    "2": "negative"
   },
   "initializer_range": 0.02,
   "intermediate_size": 4096,
   "label2id": {
-    "LABEL_0": 0,
-    "LABEL_1": 1,
-    "LABEL_2": 2
+    "negative": 2,
+    "neutral": 1,
+    "positive": 0
   },
   "layer_norm_eps": 1e-05,
   "max_position_embeddings": 514,
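This config change replaces the placeholder `LABEL_0`/`LABEL_1`/`LABEL_2` names with the actual sentiment classes, so downstream consumers of the model see readable predictions. A minimal sketch of how the `id2label` mapping turns an argmax over the classifier's logits into a label string (the logit values below are illustrative, not from the model; note that in the JSON file the keys are strings like `"0"`, which `transformers` converts to integers when loading the config):

```python
# id2label as set in the updated config.json (keys shown as ints,
# the form they take after transformers parses the config)
id2label = {0: "positive", 1: "neutral", 2: "negative"}

def decode(logits):
    """Pick the highest-scoring class index and map it to its label name."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return id2label[best]
```

With the placeholder names, the same call would have returned `LABEL_2` instead of `negative`, which is why model cards generally ship configs with meaningful `id2label`/`label2id` entries.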