haryoaw committed
Commit c9f09d1
1 Parent(s): 59a908d

Initial Commit

Files changed (4)
  1. README.md +54 -54
  2. eval_results_cardiff.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,14 +1,14 @@
 ---
+library_name: transformers
+license: mit
 base_model: microsoft/mdeberta-v3-base
+tags:
+- generated_from_trainer
 datasets:
 - tweet_sentiment_multilingual
-library_name: transformers
-license: mit
 metrics:
 - accuracy
 - f1
-tags:
-- generated_from_trainer
 model-index:
 - name: scenario-NON-KD-PR-COPY-CDF-ALL-D2_data-cardiffnlp_tweet_sentiment_multilingual_
   results: []
@@ -21,9 +21,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on the tweet_sentiment_multilingual dataset.
 It achieves the following results on the evaluation set:
-- Loss: 5.0069
-- Accuracy: 0.5625
-- F1: 0.5624
+- Loss: 5.1517
+- Accuracy: 0.5571
+- F1: 0.5567
 
 ## Model description
 
@@ -45,7 +45,7 @@ The following hyperparameters were used during training:
 - learning_rate: 5e-05
 - train_batch_size: 32
 - eval_batch_size: 32
-- seed: 66
+- seed: 44
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 50
@@ -54,52 +54,52 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
 |:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
-| 1.0241 | 1.0870 | 500 | 0.9704 | 0.5316 | 0.5088 |
-| 0.8423 | 2.1739 | 1000 | 0.9940 | 0.5721 | 0.5673 |
-| 0.6621 | 3.2609 | 1500 | 1.3553 | 0.5610 | 0.5513 |
-| 0.4729 | 4.3478 | 2000 | 1.3480 | 0.5652 | 0.5599 |
-| 0.3292 | 5.4348 | 2500 | 1.7478 | 0.5525 | 0.5523 |
-| 0.2255 | 6.5217 | 3000 | 2.0315 | 0.5490 | 0.5351 |
-| 0.1749 | 7.6087 | 3500 | 1.9869 | 0.5448 | 0.5458 |
-| 0.1412 | 8.6957 | 4000 | 1.8836 | 0.5517 | 0.5494 |
-| 0.1102 | 9.7826 | 4500 | 2.4417 | 0.5467 | 0.5476 |
-| 0.0951 | 10.8696 | 5000 | 2.6246 | 0.5613 | 0.5586 |
-| 0.0858 | 11.9565 | 5500 | 3.1135 | 0.5683 | 0.5662 |
-| 0.0696 | 13.0435 | 6000 | 3.1428 | 0.5513 | 0.5507 |
-| 0.061 | 14.1304 | 6500 | 3.0275 | 0.5671 | 0.5658 |
-| 0.0547 | 15.2174 | 7000 | 3.3766 | 0.5671 | 0.5649 |
-| 0.0486 | 16.3043 | 7500 | 3.5224 | 0.5525 | 0.5506 |
-| 0.0456 | 17.3913 | 8000 | 3.1902 | 0.5490 | 0.5439 |
-| 0.0446 | 18.4783 | 8500 | 3.7426 | 0.5652 | 0.5645 |
-| 0.0452 | 19.5652 | 9000 | 3.4598 | 0.5517 | 0.5513 |
-| 0.0254 | 20.6522 | 9500 | 3.9115 | 0.5579 | 0.5564 |
-| 0.0305 | 21.7391 | 10000 | 3.9650 | 0.5401 | 0.5382 |
-| 0.0299 | 22.8261 | 10500 | 3.7843 | 0.5540 | 0.5531 |
-| 0.0216 | 23.9130 | 11000 | 4.1351 | 0.5556 | 0.5557 |
-| 0.0248 | 25.0 | 11500 | 4.0123 | 0.5598 | 0.5600 |
-| 0.023 | 26.0870 | 12000 | 4.2745 | 0.5683 | 0.5650 |
-| 0.0235 | 27.1739 | 12500 | 3.7494 | 0.5633 | 0.5584 |
-| 0.016 | 28.2609 | 13000 | 4.1544 | 0.5602 | 0.5597 |
-| 0.0169 | 29.3478 | 13500 | 3.9233 | 0.5602 | 0.5547 |
-| 0.012 | 30.4348 | 14000 | 4.4512 | 0.5640 | 0.5617 |
-| 0.011 | 31.5217 | 14500 | 4.5114 | 0.5602 | 0.5596 |
-| 0.0114 | 32.6087 | 15000 | 4.2668 | 0.5621 | 0.5622 |
-| 0.0139 | 33.6957 | 15500 | 4.6988 | 0.5575 | 0.5564 |
-| 0.0113 | 34.7826 | 16000 | 4.3218 | 0.5540 | 0.5521 |
-| 0.0082 | 35.8696 | 16500 | 4.6459 | 0.5548 | 0.5519 |
-| 0.0065 | 36.9565 | 17000 | 4.6509 | 0.5586 | 0.5577 |
-| 0.0081 | 38.0435 | 17500 | 4.6140 | 0.5648 | 0.5631 |
-| 0.0074 | 39.1304 | 18000 | 4.3968 | 0.5602 | 0.5589 |
-| 0.0058 | 40.2174 | 18500 | 4.6515 | 0.5563 | 0.5569 |
-| 0.0036 | 41.3043 | 19000 | 5.0937 | 0.5548 | 0.5545 |
-| 0.0041 | 42.3913 | 19500 | 5.0242 | 0.5552 | 0.5529 |
-| 0.0054 | 43.4783 | 20000 | 4.8426 | 0.5583 | 0.5576 |
-| 0.0026 | 44.5652 | 20500 | 4.9950 | 0.5606 | 0.5623 |
-| 0.0044 | 45.6522 | 21000 | 4.9220 | 0.5629 | 0.5627 |
-| 0.0031 | 46.7391 | 21500 | 4.9823 | 0.5610 | 0.5616 |
-| 0.002 | 47.8261 | 22000 | 4.9342 | 0.5621 | 0.5618 |
-| 0.0026 | 48.9130 | 22500 | 5.0100 | 0.5590 | 0.5588 |
-| 0.0014 | 50.0 | 23000 | 5.0069 | 0.5625 | 0.5624 |
+| 1.0024 | 1.0870 | 500 | 0.9815 | 0.5556 | 0.5555 |
+| 0.8411 | 2.1739 | 1000 | 0.9889 | 0.5772 | 0.5763 |
+| 0.6502 | 3.2609 | 1500 | 1.1977 | 0.5633 | 0.5598 |
+| 0.4723 | 4.3478 | 2000 | 1.6466 | 0.5617 | 0.5626 |
+| 0.3276 | 5.4348 | 2500 | 1.7205 | 0.5498 | 0.5519 |
+| 0.2221 | 6.5217 | 3000 | 2.0190 | 0.5590 | 0.5600 |
+| 0.167 | 7.6087 | 3500 | 2.5446 | 0.5552 | 0.5562 |
+| 0.1317 | 8.6957 | 4000 | 2.5112 | 0.5525 | 0.5539 |
+| 0.1141 | 9.7826 | 4500 | 2.6152 | 0.5594 | 0.5545 |
+| 0.1007 | 10.8696 | 5000 | 3.2079 | 0.5513 | 0.5416 |
+| 0.0827 | 11.9565 | 5500 | 2.7099 | 0.5590 | 0.5590 |
+| 0.0653 | 13.0435 | 6000 | 3.1595 | 0.5721 | 0.5678 |
+| 0.0644 | 14.1304 | 6500 | 3.1304 | 0.5679 | 0.5667 |
+| 0.054 | 15.2174 | 7000 | 3.0885 | 0.5590 | 0.5573 |
+| 0.0504 | 16.3043 | 7500 | 3.5769 | 0.5583 | 0.5580 |
+| 0.0394 | 17.3913 | 8000 | 3.5597 | 0.5606 | 0.5608 |
+| 0.0419 | 18.4783 | 8500 | 3.8739 | 0.5525 | 0.5501 |
+| 0.0406 | 19.5652 | 9000 | 3.5220 | 0.5667 | 0.5660 |
+| 0.0355 | 20.6522 | 9500 | 4.0325 | 0.5691 | 0.5667 |
+| 0.0281 | 21.7391 | 10000 | 3.7630 | 0.5602 | 0.5614 |
+| 0.0266 | 22.8261 | 10500 | 4.0162 | 0.5617 | 0.5553 |
+| 0.0283 | 23.9130 | 11000 | 3.9135 | 0.5525 | 0.5529 |
+| 0.027 | 25.0 | 11500 | 4.0734 | 0.5563 | 0.5541 |
+| 0.0205 | 26.0870 | 12000 | 4.2900 | 0.5583 | 0.5586 |
+| 0.0198 | 27.1739 | 12500 | 4.2693 | 0.5579 | 0.5572 |
+| 0.0155 | 28.2609 | 13000 | 4.7029 | 0.5563 | 0.5435 |
+| 0.0187 | 29.3478 | 13500 | 4.4409 | 0.5640 | 0.5616 |
+| 0.014 | 30.4348 | 14000 | 4.4588 | 0.5571 | 0.5568 |
+| 0.0147 | 31.5217 | 14500 | 4.3420 | 0.5652 | 0.5640 |
+| 0.0128 | 32.6087 | 15000 | 4.5721 | 0.5598 | 0.5575 |
+| 0.0099 | 33.6957 | 15500 | 4.5574 | 0.5586 | 0.5599 |
+| 0.0101 | 34.7826 | 16000 | 4.3777 | 0.5610 | 0.5613 |
+| 0.0053 | 35.8696 | 16500 | 4.8103 | 0.5610 | 0.5617 |
+| 0.0107 | 36.9565 | 17000 | 4.2925 | 0.5590 | 0.5589 |
+| 0.0081 | 38.0435 | 17500 | 4.5884 | 0.5606 | 0.5591 |
+| 0.0071 | 39.1304 | 18000 | 4.7187 | 0.5617 | 0.5621 |
+| 0.0075 | 40.2174 | 18500 | 4.7305 | 0.5594 | 0.5591 |
+| 0.0081 | 41.3043 | 19000 | 4.5589 | 0.5602 | 0.5607 |
+| 0.0059 | 42.3913 | 19500 | 4.6516 | 0.5598 | 0.5589 |
+| 0.0061 | 43.4783 | 20000 | 4.6553 | 0.5613 | 0.5605 |
+| 0.0032 | 44.5652 | 20500 | 4.9672 | 0.5567 | 0.5569 |
+| 0.0031 | 45.6522 | 21000 | 5.0283 | 0.5590 | 0.5595 |
+| 0.0032 | 46.7391 | 21500 | 5.0801 | 0.5563 | 0.5552 |
+| 0.0032 | 47.8261 | 22000 | 5.0934 | 0.5602 | 0.5604 |
+| 0.0017 | 48.9130 | 22500 | 5.1305 | 0.5579 | 0.5581 |
+| 0.0017 | 50.0 | 23000 | 5.1517 | 0.5571 | 0.5567 |
 
 
 ### Framework versions
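
For context on the hyperparameter list in the README diff above, here is a minimal sketch of how those settings (learning rate 5e-05, batch size 32, seed 44, Adam betas/epsilon, linear scheduler, 50 epochs) could be expressed with `transformers.TrainingArguments`. It is an illustrative reconstruction, not the training script from this commit: the output directory, the eval-every-500-steps cadence (inferred from the step column of the table), and the three-label head (inferred from the 3x3 confusion matrices) are assumptions.

```python
# Illustrative sketch only: mirrors the hyperparameters listed in the updated README.
# Output dir, eval cadence, and num_labels are assumptions, not taken from the commit.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base_model = "microsoft/mdeberta-v3-base"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=3)

training_args = TrainingArguments(
    output_dir="./mdeberta-tweet-sentiment",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # newer transformers releases call this eval_strategy
    eval_steps=500,               # matches the 500-step cadence in the README table
)
```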
eval_results_cardiff.json CHANGED
@@ -1 +1 @@
-{"arabic": {"f1": 0.5143201009164864, "accuracy": 0.5114942528735632, "confusion_matrix": [[171, 97, 22], [106, 151, 33], [41, 126, 123]]}, "english": {"f1": 0.5780106944386793, "accuracy": 0.5977011494252874, "confusion_matrix": [[262, 18, 10], [165, 90, 35], [74, 48, 168]]}, "french": {"f1": 0.5842392950199085, "accuracy": 0.593103448275862, "confusion_matrix": [[233, 30, 27], [76, 150, 64], [93, 64, 133]]}, "german": {"f1": 0.668888888888889, "accuracy": 0.6689655172413793, "confusion_matrix": [[230, 22, 38], [90, 165, 35], [80, 23, 187]]}, "hindi": {"f1": 0.4503815887579726, "accuracy": 0.47126436781609193, "confusion_matrix": [[222, 34, 34], [173, 73, 44], [128, 47, 115]]}, "italian": {"f1": 0.579391708667071, "accuracy": 0.5793103448275863, "confusion_matrix": [[152, 61, 77], [40, 174, 76], [43, 69, 178]]}, "portuguese": {"f1": 0.6139994795732501, "accuracy": 0.6137931034482759, "confusion_matrix": [[190, 72, 28], [88, 152, 50], [42, 56, 192]]}, "spanish": {"f1": 0.5732816131044166, "accuracy": 0.5885057471264368, "confusion_matrix": [[236, 30, 24], [128, 99, 63], [77, 36, 177]]}, "all": {"f1": 0.5736504412779122, "accuracy": 0.5780172413793103, "confusion_matrix": [[1696, 364, 260], [866, 1054, 400], [578, 469, 1273]]}}
+{"arabic": {"f1": 0.5625619299343044, "accuracy": 0.5597701149425287, "confusion_matrix": [[141, 118, 31], [76, 171, 43], [41, 74, 175]]}, "english": {"f1": 0.6406225857351387, "accuracy": 0.6448275862068965, "confusion_matrix": [[223, 44, 23], [100, 141, 49], [42, 51, 197]]}, "french": {"f1": 0.5815633943105122, "accuracy": 0.5862068965517241, "confusion_matrix": [[215, 40, 35], [61, 149, 80], [73, 71, 146]]}, "german": {"f1": 0.6898338453998946, "accuracy": 0.6908045977011494, "confusion_matrix": [[182, 48, 60], [49, 200, 41], [32, 39, 219]]}, "hindi": {"f1": 0.47341547251844335, "accuracy": 0.48160919540229885, "confusion_matrix": [[176, 41, 73], [120, 91, 79], [97, 41, 152]]}, "italian": {"f1": 0.5751810164287168, "accuracy": 0.5758620689655173, "confusion_matrix": [[142, 59, 89], [31, 187, 72], [43, 75, 172]]}, "portuguese": {"f1": 0.6016948201987572, "accuracy": 0.6057471264367816, "confusion_matrix": [[149, 99, 42], [57, 154, 79], [24, 42, 224]]}, "spanish": {"f1": 0.5944342569849371, "accuracy": 0.5942528735632184, "confusion_matrix": [[175, 83, 32], [66, 152, 72], [40, 60, 190]]}, "all": {"f1": 0.5919315531719215, "accuracy": 0.5923850574712644, "confusion_matrix": [[1403, 532, 385], [560, 1245, 515], [392, 453, 1475]]}}
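A short sketch of how the committed eval_results_cardiff.json can be inspected: it loads the file, prints the stored per-language accuracy and F1, and recomputes accuracy from each 3x3 confusion matrix as a cross-check (the diagonal-over-total convention is an assumption that matches the stored "all" accuracy).

```python
# Sketch: load the per-language results committed in eval_results_cardiff.json
# and cross-check each stored accuracy against its confusion matrix.
import json

with open("eval_results_cardiff.json") as f:
    results = json.load(f)

for lang, res in results.items():
    cm = res["confusion_matrix"]
    correct = sum(cm[i][i] for i in range(len(cm)))   # diagonal = correct predictions
    total = sum(sum(row) for row in cm)
    recomputed = correct / total
    print(f"{lang:11s} acc={res['accuracy']:.4f} "
          f"(from matrix: {recomputed:.4f})  f1={res['f1']:.4f}")
```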
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:c20cd505290f029d892d756b587ba3c63ad5e26915d75d8f89bcd91e1176d1a7
+oid sha256:32d6030d7d91f6283fd28886fec4a85c014b51138f815b45efeda60d6c926967
 size 946716948
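Since only the git-lfs pointer changes here, the weights themselves live in LFS storage. A minimal sketch, assuming the file has been downloaded locally under the name `model.safetensors`, of verifying it against the new pointer's sha256 oid and size:

```python
# Sketch: verify a downloaded model.safetensors against the git-lfs pointer
# recorded in this commit (oid 32d6030d..., size 946716948 bytes).
import hashlib
import os

EXPECTED_SHA256 = "32d6030d7d91f6283fd28886fec4a85c014b51138f815b45efeda60d6c926967"
EXPECTED_SIZE = 946716948
path = "model.safetensors"  # assumed local path

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

print("size ok:  ", os.path.getsize(path) == EXPECTED_SIZE)
print("sha256 ok:", sha.hexdigest() == EXPECTED_SHA256)
```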
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fcf24262e832d6abe5177f9590bfac586fddd73089340b0327e8b4b6853d5bbf
+oid sha256:f6e9dc0b9538f6b294cdfb11906b794438c54c69b87175e24803c89251cef4ab
 size 5368
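training_args.bin is the serialized TrainingArguments object the Trainer saves alongside the weights, so the exact settings behind the README's hyperparameter list can be read back from it. A small sketch, assuming the file has been downloaded locally:

```python
# Sketch: training_args.bin is saved by the Hugging Face Trainer via torch.save,
# so it can be reloaded to inspect the exact run configuration.
# weights_only=False is required on recent PyTorch because this is a pickled
# object, not a tensor checkpoint; only do this for files you trust.
import torch

args = torch.load("training_args.bin", weights_only=False)  # assumed local path
print(args.learning_rate, args.seed, args.num_train_epochs)
```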