haryoaw committed
Commit 0eae183
1 Parent(s): 1e7ffe8

Initial Commit

Files changed (4)
  1. README.md +54 -54
  2. eval_results_cardiff.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,14 +1,14 @@
  ---
  base_model: microsoft/mdeberta-v3-base
  datasets:
  - tweet_sentiment_multilingual
- library_name: transformers
- license: mit
  metrics:
  - accuracy
  - f1
- tags:
- - generated_from_trainer
  model-index:
  - name: scenario-NON-KD-PR-COPY-CDF-ALL-D2_data-cardiffnlp_tweet_sentiment_multilingual_
    results: []
@@ -21,9 +21,9 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on the tweet_sentiment_multilingual dataset.
  It achieves the following results on the evaluation set:
- - Loss: 5.1517
- - Accuracy: 0.5571
- - F1: 0.5567

  ## Model description
 
@@ -45,7 +45,7 @@ The following hyperparameters were used during training:
  - learning_rate: 5e-05
  - train_batch_size: 32
  - eval_batch_size: 32
- - seed: 44
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 50
@@ -54,52 +54,52 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
  |:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
- | 1.0024 | 1.0870 | 500 | 0.9815 | 0.5556 | 0.5555 |
- | 0.8411 | 2.1739 | 1000 | 0.9889 | 0.5772 | 0.5763 |
- | 0.6502 | 3.2609 | 1500 | 1.1977 | 0.5633 | 0.5598 |
- | 0.4723 | 4.3478 | 2000 | 1.6466 | 0.5617 | 0.5626 |
- | 0.3276 | 5.4348 | 2500 | 1.7205 | 0.5498 | 0.5519 |
- | 0.2221 | 6.5217 | 3000 | 2.0190 | 0.5590 | 0.5600 |
- | 0.167 | 7.6087 | 3500 | 2.5446 | 0.5552 | 0.5562 |
- | 0.1317 | 8.6957 | 4000 | 2.5112 | 0.5525 | 0.5539 |
- | 0.1141 | 9.7826 | 4500 | 2.6152 | 0.5594 | 0.5545 |
- | 0.1007 | 10.8696 | 5000 | 3.2079 | 0.5513 | 0.5416 |
- | 0.0827 | 11.9565 | 5500 | 2.7099 | 0.5590 | 0.5590 |
- | 0.0653 | 13.0435 | 6000 | 3.1595 | 0.5721 | 0.5678 |
- | 0.0644 | 14.1304 | 6500 | 3.1304 | 0.5679 | 0.5667 |
- | 0.054 | 15.2174 | 7000 | 3.0885 | 0.5590 | 0.5573 |
- | 0.0504 | 16.3043 | 7500 | 3.5769 | 0.5583 | 0.5580 |
- | 0.0394 | 17.3913 | 8000 | 3.5597 | 0.5606 | 0.5608 |
- | 0.0419 | 18.4783 | 8500 | 3.8739 | 0.5525 | 0.5501 |
- | 0.0406 | 19.5652 | 9000 | 3.5220 | 0.5667 | 0.5660 |
- | 0.0355 | 20.6522 | 9500 | 4.0325 | 0.5691 | 0.5667 |
- | 0.0281 | 21.7391 | 10000 | 3.7630 | 0.5602 | 0.5614 |
- | 0.0266 | 22.8261 | 10500 | 4.0162 | 0.5617 | 0.5553 |
- | 0.0283 | 23.9130 | 11000 | 3.9135 | 0.5525 | 0.5529 |
- | 0.027 | 25.0 | 11500 | 4.0734 | 0.5563 | 0.5541 |
- | 0.0205 | 26.0870 | 12000 | 4.2900 | 0.5583 | 0.5586 |
- | 0.0198 | 27.1739 | 12500 | 4.2693 | 0.5579 | 0.5572 |
- | 0.0155 | 28.2609 | 13000 | 4.7029 | 0.5563 | 0.5435 |
- | 0.0187 | 29.3478 | 13500 | 4.4409 | 0.5640 | 0.5616 |
- | 0.014 | 30.4348 | 14000 | 4.4588 | 0.5571 | 0.5568 |
- | 0.0147 | 31.5217 | 14500 | 4.3420 | 0.5652 | 0.5640 |
- | 0.0128 | 32.6087 | 15000 | 4.5721 | 0.5598 | 0.5575 |
- | 0.0099 | 33.6957 | 15500 | 4.5574 | 0.5586 | 0.5599 |
- | 0.0101 | 34.7826 | 16000 | 4.3777 | 0.5610 | 0.5613 |
- | 0.0053 | 35.8696 | 16500 | 4.8103 | 0.5610 | 0.5617 |
- | 0.0107 | 36.9565 | 17000 | 4.2925 | 0.5590 | 0.5589 |
- | 0.0081 | 38.0435 | 17500 | 4.5884 | 0.5606 | 0.5591 |
- | 0.0071 | 39.1304 | 18000 | 4.7187 | 0.5617 | 0.5621 |
- | 0.0075 | 40.2174 | 18500 | 4.7305 | 0.5594 | 0.5591 |
- | 0.0081 | 41.3043 | 19000 | 4.5589 | 0.5602 | 0.5607 |
- | 0.0059 | 42.3913 | 19500 | 4.6516 | 0.5598 | 0.5589 |
- | 0.0061 | 43.4783 | 20000 | 4.6553 | 0.5613 | 0.5605 |
- | 0.0032 | 44.5652 | 20500 | 4.9672 | 0.5567 | 0.5569 |
- | 0.0031 | 45.6522 | 21000 | 5.0283 | 0.5590 | 0.5595 |
- | 0.0032 | 46.7391 | 21500 | 5.0801 | 0.5563 | 0.5552 |
- | 0.0032 | 47.8261 | 22000 | 5.0934 | 0.5602 | 0.5604 |
- | 0.0017 | 48.9130 | 22500 | 5.1305 | 0.5579 | 0.5581 |
- | 0.0017 | 50.0 | 23000 | 5.1517 | 0.5571 | 0.5567 |


  ### Framework versions
 
  ---
+ library_name: transformers
+ license: mit
  base_model: microsoft/mdeberta-v3-base
+ tags:
+ - generated_from_trainer
  datasets:
  - tweet_sentiment_multilingual
  metrics:
  - accuracy
  - f1
  model-index:
  - name: scenario-NON-KD-PR-COPY-CDF-ALL-D2_data-cardiffnlp_tweet_sentiment_multilingual_
    results: []
 

  This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on the tweet_sentiment_multilingual dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 5.0035
+ - Accuracy: 0.5625
+ - F1: 0.5617

  ## Model description
 
 
  - learning_rate: 5e-05
  - train_batch_size: 32
  - eval_batch_size: 32
+ - seed: 55
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 50
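For orientation, the hyperparameter list above maps almost one-to-one onto `transformers.TrainingArguments`. Below is a minimal, hypothetical sketch assuming a recent transformers release; only the values listed in the card are grounded, while the output directory is a placeholder and the 500-step evaluation cadence is inferred from the results table rather than recorded as a setting in this commit.

```python
# Hypothetical reconstruction of the training configuration described above;
# values not listed in the README are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./mdeberta-tweet-sentiment",  # placeholder, not from the commit
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=55,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    eval_strategy="steps",  # named evaluation_strategy in older transformers
    eval_steps=500,         # cadence inferred from the 500-step results table
)
```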
 

  | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
  |:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
+ | 1.021 | 1.0870 | 500 | 1.0027 | 0.5409 | 0.5367 |
+ | 0.8432 | 2.1739 | 1000 | 1.0327 | 0.5814 | 0.5820 |
+ | 0.6715 | 3.2609 | 1500 | 1.1554 | 0.5822 | 0.5778 |
+ | 0.48 | 4.3478 | 2000 | 1.4182 | 0.5613 | 0.5573 |
+ | 0.3384 | 5.4348 | 2500 | 1.8214 | 0.5567 | 0.5573 |
+ | 0.2309 | 6.5217 | 3000 | 1.8385 | 0.5502 | 0.5445 |
+ | 0.1737 | 7.6087 | 3500 | 2.0368 | 0.5444 | 0.5440 |
+ | 0.1324 | 8.6957 | 4000 | 2.3667 | 0.5424 | 0.5414 |
+ | 0.1132 | 9.7826 | 4500 | 2.0414 | 0.5509 | 0.5486 |
+ | 0.1058 | 10.8696 | 5000 | 2.5673 | 0.5509 | 0.5491 |
+ | 0.0833 | 11.9565 | 5500 | 2.7424 | 0.5513 | 0.5509 |
+ | 0.0662 | 13.0435 | 6000 | 3.2582 | 0.5544 | 0.5529 |
+ | 0.0664 | 14.1304 | 6500 | 3.5005 | 0.5556 | 0.5521 |
+ | 0.0532 | 15.2174 | 7000 | 3.0692 | 0.5502 | 0.5509 |
+ | 0.0494 | 16.3043 | 7500 | 3.1700 | 0.5478 | 0.5487 |
+ | 0.0485 | 17.3913 | 8000 | 3.8948 | 0.5382 | 0.5377 |
+ | 0.0359 | 18.4783 | 8500 | 3.5655 | 0.5583 | 0.5570 |
+ | 0.0322 | 19.5652 | 9000 | 4.0121 | 0.5583 | 0.5547 |
+ | 0.0294 | 20.6522 | 9500 | 3.5540 | 0.5579 | 0.5582 |
+ | 0.026 | 21.7391 | 10000 | 4.0054 | 0.5525 | 0.5535 |
+ | 0.0305 | 22.8261 | 10500 | 3.8289 | 0.5498 | 0.5453 |
+ | 0.0232 | 23.9130 | 11000 | 4.4012 | 0.5556 | 0.5558 |
+ | 0.0209 | 25.0 | 11500 | 4.0916 | 0.5559 | 0.5504 |
+ | 0.0224 | 26.0870 | 12000 | 4.3087 | 0.5586 | 0.5583 |
+ | 0.0192 | 27.1739 | 12500 | 4.0617 | 0.5467 | 0.5474 |
+ | 0.0198 | 28.2609 | 13000 | 4.1456 | 0.5567 | 0.5555 |
+ | 0.0148 | 29.3478 | 13500 | 4.5847 | 0.5505 | 0.5519 |
+ | 0.016 | 30.4348 | 14000 | 4.3128 | 0.5494 | 0.5501 |
+ | 0.0145 | 31.5217 | 14500 | 4.4021 | 0.5505 | 0.5500 |
+ | 0.0146 | 32.6087 | 15000 | 4.3393 | 0.5509 | 0.5506 |
+ | 0.0089 | 33.6957 | 15500 | 4.4852 | 0.5486 | 0.5499 |
+ | 0.0089 | 34.7826 | 16000 | 4.8487 | 0.5475 | 0.5487 |
+ | 0.0085 | 35.8696 | 16500 | 4.8052 | 0.5567 | 0.5573 |
+ | 0.0077 | 36.9565 | 17000 | 4.6518 | 0.5502 | 0.5484 |
+ | 0.0095 | 38.0435 | 17500 | 4.2742 | 0.5567 | 0.5554 |
+ | 0.0054 | 39.1304 | 18000 | 4.7804 | 0.5548 | 0.5520 |
+ | 0.0074 | 40.2174 | 18500 | 4.6940 | 0.5540 | 0.5516 |
+ | 0.0053 | 41.3043 | 19000 | 4.6543 | 0.5590 | 0.5581 |
+ | 0.003 | 42.3913 | 19500 | 5.0637 | 0.5563 | 0.5572 |
+ | 0.0044 | 43.4783 | 20000 | 4.7918 | 0.5652 | 0.5657 |
+ | 0.0053 | 44.5652 | 20500 | 4.7492 | 0.5625 | 0.5604 |
+ | 0.0031 | 45.6522 | 21000 | 4.8642 | 0.5571 | 0.5567 |
+ | 0.0026 | 46.7391 | 21500 | 4.9137 | 0.5617 | 0.5614 |
+ | 0.0025 | 47.8261 | 22000 | 4.8985 | 0.5629 | 0.5626 |
+ | 0.0007 | 48.9130 | 22500 | 4.9890 | 0.5633 | 0.5621 |
+ | 0.0027 | 50.0 | 23000 | 5.0035 | 0.5625 | 0.5617 |


  ### Framework versions
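The card describes an ordinary sequence-classification fine-tune of mDeBERTa-v3, so the checkpoint should load with the standard `transformers` API. A minimal usage sketch follows; the repository id is an explicit placeholder, and the negative/neutral/positive label names are an assumption about the three-class tweet_sentiment_multilingual setup (the card itself does not list label names).

```python
# Hypothetical inference example; replace the placeholder repo id with the
# actual Hub repository. The label names below are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "haryoaw/<this-model-repo>"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("What a great day!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

labels = ["negative", "neutral", "positive"]  # assumed 3-class mapping
print(labels[logits.argmax(dim=-1).item()])
```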
eval_results_cardiff.json CHANGED
@@ -1 +1 @@
- {"arabic": {"f1": 0.5625619299343044, "accuracy": 0.5597701149425287, "confusion_matrix": [[141, 118, 31], [76, 171, 43], [41, 74, 175]]}, "english": {"f1": 0.6406225857351387, "accuracy": 0.6448275862068965, "confusion_matrix": [[223, 44, 23], [100, 141, 49], [42, 51, 197]]}, "french": {"f1": 0.5815633943105122, "accuracy": 0.5862068965517241, "confusion_matrix": [[215, 40, 35], [61, 149, 80], [73, 71, 146]]}, "german": {"f1": 0.6898338453998946, "accuracy": 0.6908045977011494, "confusion_matrix": [[182, 48, 60], [49, 200, 41], [32, 39, 219]]}, "hindi": {"f1": 0.47341547251844335, "accuracy": 0.48160919540229885, "confusion_matrix": [[176, 41, 73], [120, 91, 79], [97, 41, 152]]}, "italian": {"f1": 0.5751810164287168, "accuracy": 0.5758620689655173, "confusion_matrix": [[142, 59, 89], [31, 187, 72], [43, 75, 172]]}, "portuguese": {"f1": 0.6016948201987572, "accuracy": 0.6057471264367816, "confusion_matrix": [[149, 99, 42], [57, 154, 79], [24, 42, 224]]}, "spanish": {"f1": 0.5944342569849371, "accuracy": 0.5942528735632184, "confusion_matrix": [[175, 83, 32], [66, 152, 72], [40, 60, 190]]}, "all": {"f1": 0.5919315531719215, "accuracy": 0.5923850574712644, "confusion_matrix": [[1403, 532, 385], [560, 1245, 515], [392, 453, 1475]]}}
 
+ {"arabic": {"f1": 0.5505121631124021, "accuracy": 0.5482758620689655, "confusion_matrix": [[153, 106, 31], [74, 164, 52], [58, 72, 160]]}, "english": {"f1": 0.6252888515559376, "accuracy": 0.6229885057471264, "confusion_matrix": [[204, 75, 11], [84, 174, 32], [47, 79, 164]]}, "french": {"f1": 0.5844273689986265, "accuracy": 0.5850574712643678, "confusion_matrix": [[192, 50, 48], [46, 159, 85], [65, 67, 158]]}, "german": {"f1": 0.6770577974077062, "accuracy": 0.6770114942528735, "confusion_matrix": [[198, 62, 30], [55, 216, 19], [63, 52, 175]]}, "hindi": {"f1": 0.4860992225426562, "accuracy": 0.4885057471264368, "confusion_matrix": [[164, 56, 70], [100, 115, 75], [87, 57, 146]]}, "italian": {"f1": 0.5664004433041901, "accuracy": 0.5747126436781609, "confusion_matrix": [[110, 88, 92], [20, 200, 70], [23, 77, 190]]}, "portuguese": {"f1": 0.6162615581379695, "accuracy": 0.6137931034482759, "confusion_matrix": [[154, 110, 26], [56, 194, 40], [28, 76, 186]]}, "spanish": {"f1": 0.5890270816894922, "accuracy": 0.5896551724137931, "confusion_matrix": [[181, 77, 32], [80, 145, 65], [51, 52, 187]]}, "all": {"f1": 0.5881456796344003, "accuracy": 0.5875, "confusion_matrix": [[1356, 624, 340], [515, 1367, 438], [422, 532, 1366]]}}
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:32d6030d7d91f6283fd28886fec4a85c014b51138f815b45efeda60d6c926967
  size 946716948

  version https://git-lfs.github.com/spec/v1
+ oid sha256:7062281d46cdcc4b78238f63c508610ff1d74bbdcb1dcf671f96baabede84101
  size 946716948
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f6e9dc0b9538f6b294cdfb11906b794438c54c69b87175e24803c89251cef4ab
  size 5368

  version https://git-lfs.github.com/spec/v1
+ oid sha256:286d21ebe8c947df3bf342be6d4627995d796ef0b0af7e94d2b085f097bc8a6b
  size 5368
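Both `model.safetensors` and `training_args.bin` are stored through Git LFS, so only the pointer checksums change in this commit; the payloads themselves were regenerated by the new run. `training_args.bin` is conventionally a pickled `TrainingArguments` object written by `Trainer`, so after pulling the LFS file it can be inspected as sketched below (an assumption about this particular file; `weights_only=False` is needed on recent PyTorch to unpickle non-tensor objects, and transformers must be installed for the unpickling to resolve).

```python
# Inspect the serialized training arguments (assumes training_args.bin has been
# pulled locally and holds a pickled TrainingArguments object).
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args))          # typically transformers.TrainingArguments
print(args.learning_rate)  # should match the README: 5e-05
print(args.seed)           # should match the README: 55
```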