afaji committed
Commit 5b0c234
1 Parent(s): 75bcc9c

update model card README.md

Files changed (1)
  1. README.md +89 -0
README.md ADDED
@@ -0,0 +1,89 @@
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
- precision
- recall
model-index:
- name: fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-without-ITTL-without-freeze-LR-1e-05
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-without-ITTL-without-freeze-LR-1e-05

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on the IDK-MRC dataset (as indicated by the model name; the Trainer did not record the dataset identifier).
It achieves the following results on the evaluation set:
- Loss: 1.0791
- Exact Match: 69.7644
- F1: 75.9108
- Precision: 77.5909
- Recall: 77.7773

## Model description

More information needed

## Intended uses & limitations

More information needed
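
As a usage illustration, the sketch below runs the checkpoint as an extractive question-answering pipeline. The Hub repository id is an assumption pieced together from the commit author and this card's model name; swap in the actual id before running.

```python
# Minimal inference sketch for an extractive QA checkpoint.
# ASSUMPTION: the repo id below is inferred from the commit author (afaji)
# and this card's model name; it may differ from the real Hub id.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="afaji/fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-without-ITTL-without-freeze-LR-1e-05",
)

result = qa(
    question="Siapa presiden pertama Indonesia?",
    context="Soekarno adalah presiden pertama Indonesia dan menjabat dari 1945 hingga 1967.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'Soekarno'}
```

Because IDK-MRC includes unanswerable questions, a low `score` may correspond to a "no answer" case; inspect it before trusting a predicted span.
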
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 16
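
These values map directly onto the standard `Trainer` API; below is a minimal, hypothetical `TrainingArguments` sketch (Transformers 4.26.x), not the project's actual training script. The effective batch size of 64 arises from 16 × 4 gradient-accumulation steps; model and dataset wiring are omitted.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters above.
# This is NOT the project's actual training code; argument names follow
# the public Transformers 4.26.x API.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-without-ITTL-without-freeze-LR-1e-05",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 16 * 4 = total_train_batch_size of 64
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=16,
    evaluation_strategy="steps",  # assumption: the results below log every 73 steps
    eval_steps=73,
)
```
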
### Training results

| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1      | Precision | Recall  |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|:---------:|:-------:|
| 5.5507        | 0.49  | 73   | 3.2003          | 49.6073     | 49.6073 | 49.6073   | 49.6073 |
| 3.6491        | 0.99  | 146  | 1.9800          | 49.8691     | 49.8691 | 49.8691   | 49.8691 |
| 2.1085        | 1.49  | 219  | 1.7880          | 42.0157     | 48.4391 | 47.4995   | 57.0930 |
| 1.926         | 1.98  | 292  | 1.5461          | 54.3194     | 59.1586 | 59.2743   | 63.4653 |
| 1.5331        | 2.48  | 365  | 1.3471          | 57.7225     | 62.7979 | 63.2329   | 68.5704 |
| 1.4896        | 2.98  | 438  | 1.1975          | 59.0314     | 65.0097 | 66.0998   | 69.0900 |
| 1.1584        | 3.47  | 511  | 1.1617          | 60.9948     | 67.2465 | 68.0441   | 71.1982 |
| 1.1448        | 3.97  | 584  | 1.0450          | 65.4450     | 70.7693 | 71.7620   | 73.7743 |
| 0.9692        | 4.47  | 657  | 1.0827          | 65.3141     | 70.8950 | 71.9487   | 74.1019 |
| 0.9078        | 4.96  | 730  | 1.0273          | 66.8848     | 72.6251 | 74.0714   | 75.6255 |
| 0.8139        | 5.46  | 803  | 1.0441          | 66.3613     | 72.1886 | 73.9642   | 74.5072 |
| 0.8035        | 5.96  | 876  | 1.0418          | 66.6230     | 72.3513 | 73.8273   | 74.5317 |
| 0.7829        | 6.45  | 949  | 1.0555          | 67.2775     | 72.9075 | 74.5876   | 75.6701 |
| 0.7168        | 6.95  | 1022 | 1.0134          | 68.7173     | 74.2844 | 75.7597   | 76.3650 |
| 0.6677        | 7.45  | 1095 | 1.0526          | 68.8482     | 74.6640 | 76.4448   | 76.5281 |
| 0.6795        | 7.94  | 1168 | 1.0144          | 69.2408     | 75.2363 | 77.0568   | 76.9687 |
| 0.6109        | 8.44  | 1241 | 1.0488          | 69.3717     | 74.9248 | 76.5687   | 76.9808 |
| 0.5713        | 8.94  | 1314 | 1.0025          | 70.6806     | 76.3889 | 77.8845   | 78.7983 |
| 0.5859        | 9.43  | 1387 | 1.0352          | 70.8115     | 76.1957 | 77.9573   | 78.0250 |
| 0.5204        | 9.93  | 1460 | 1.0295          | 70.9424     | 76.5325 | 78.2172   | 78.3561 |
| 0.4952        | 10.43 | 1533 | 1.0356          | 70.4188     | 76.0822 | 77.7609   | 78.4852 |
| 0.4832        | 10.92 | 1606 | 1.0636          | 70.1571     | 75.9582 | 77.6080   | 78.0054 |
| 0.4613        | 11.42 | 1679 | 1.0791          | 69.7644     | 75.9108 | 77.5909   | 77.7773 |
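
The log stops at epoch 11.42 of the configured 16, which suggests training ended early; the final row matches the headline evaluation metrics above. Exact Match and F1/precision/recall in this table follow the usual SQuAD-style span-comparison conventions; the sketch below is an illustrative implementation of those definitions, not the project's evaluation code.

```python
# Illustrative SQuAD-style answer metrics (NOT the project's actual
# evaluation script). F1/precision/recall are token-overlap measures.
from collections import Counter

def exact_match(prediction: str, reference: str) -> float:
    """1.0 if the normalized strings match exactly, else 0.0."""
    return float(prediction.strip().lower() == reference.strip().lower())

def token_scores(prediction: str, reference: str):
    """Return (f1, precision, recall) over whitespace tokens."""
    pred = prediction.lower().split()
    ref = reference.lower().split()
    overlap = sum((Counter(pred) & Counter(ref)).values())
    if overlap == 0:
        return 0.0, 0.0, 0.0
    precision = overlap / len(pred)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall), precision, recall

print(exact_match("Soekarno", "soekarno"))                        # 1.0
print(token_scores("presiden pertama", "presiden pertama Indonesia"))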

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2
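
To sanity-check a local environment against these pins, a quick Python probe works (package names as published on PyPI):

```python
# Print installed versions to compare with the versions listed above.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # card: 4.26.1
print(torch.__version__)         # card: 1.13.1+cu117
print(datasets.__version__)      # card: 2.2.0
print(tokenizers.__version__)    # card: 0.13.2
```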