peter2000 committed on
Commit 4702ced
1 Parent(s): cdc3a09

update model card README.md

Files changed (1):
  README.md +61 -0
README.md ADDED
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: xlm-roberta-base-finetuned-ecoicop
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-ecoicop

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base); the fine-tuning dataset is not documented in this card.
It achieves the following results on the evaluation set:
- Loss: 0.1685
- Acc: 0.9659
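Since the card gives no usage snippet, a minimal inference sketch is shown below. It assumes the checkpoint is published under a repo id like `peter2000/xlm-roberta-base-finetuned-ecoicop` and that it carries a sequence-classification head whose labels correspond to ECOICOP categories; both are assumptions, not statements from this card.

```python
# Minimal inference sketch. Assumptions (not confirmed by this card): the repo
# id below, and that the labels in config.id2label are ECOICOP categories.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "peter2000/xlm-roberta-base-finetuned-ecoicop"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# XLM-R is multilingual, so the input text can be in any supported language.
inputs = tokenizer("frozen chicken breast 1 kg", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])  # predicted category label
```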
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
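To make the list above concrete, the sketch below maps these hyperparameters onto `transformers.TrainingArguments` and `Trainer` (API as of the Transformers 4.11 line noted under Framework versions). The number of labels, the tokenized datasets, and the accuracy metric are placeholders, since the card does not document them; treat this as an illustration of the configuration, not the original training script.

```python
# Reproduction sketch: only the hyperparameters come from this card; the
# dataset objects, label count, and metric function are undocumented placeholders.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base",
    num_labels=num_labels,  # placeholder: number of ECOICOP classes is not given
)

args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-ecoicop",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # per-epoch evaluation, matching the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # placeholder: tokenized training split
    eval_dataset=eval_dataset,        # placeholder: tokenized evaluation split
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,  # placeholder: computes the "Acc" metric
)
trainer.train()
```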
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Acc    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.4224        | 1.0   | 2577  | 0.3612          | 0.9132 |
| 0.2313        | 2.0   | 5154  | 0.2510          | 0.9441 |
| 0.1746        | 3.0   | 7731  | 0.1928          | 0.9569 |
| 0.1325        | 4.0   | 10308 | 0.1731          | 0.9640 |
| 0.0946        | 5.0   | 12885 | 0.1685          | 0.9659 |
### Framework versions

- Transformers 4.11.3
- PyTorch 1.9.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3