xshubhamx committed on
Commit f418b53
1 Parent(s): 9b6704f

End of training

README.md ADDED
---
license: apache-2.0
base_model: facebook/bart-large
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
model-index:
- name: bart-large
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-large

This model is a fine-tuned version of [facebook/bart-large](https://huggingface.co/facebook/bart-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1833
- Accuracy: 0.7823
- Precision: 0.7806
- Recall: 0.7823
- Precision Macro: 0.7201
- Recall Macro: 0.7056
- Macro Fpr: 0.0201
- Weighted Fpr: 0.0195
- Weighted Specificity: 0.9714
- Macro Specificity: 0.9836
- Weighted Sensitivity: 0.7823
- Macro Sensitivity: 0.7056
- F1 Micro: 0.7823
- F1 Macro: 0.7080
- F1 Weighted: 0.7801

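The classification-style metrics above (accuracy, precision, recall, per-class macro averages) suggest this checkpoint carries a sequence-classification head. A minimal loading sketch under that assumption — the repo id `xshubhamx/bart-large` is inferred from the committer name and may differ, and neither the head type nor the label set is stated in the card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Hub repo id; replace with the actual path if it differs.
REPO_ID = "xshubhamx/bart-large"

def classify(text: str, tokenizer, model) -> int:
    """Return the index of the highest-scoring label for `text`."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1).item())

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForSequenceClassification.from_pretrained(REPO_ID)
    print(classify("Example input text", tokenizer, model))
```

The predicted index can be mapped to a label name via `model.config.id2label` once the card documents the label set.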
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP

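The list above maps directly onto `transformers.TrainingArguments` parameter names (the Adam betas and epsilon shown are the Trainer defaults). A sketch of that mapping, assuming an `output_dir` the card does not state, rather than the exact training script:

```python
# Keys mirror transformers.TrainingArguments parameter names; values are
# taken from the hyperparameter list above.
training_kwargs = {
    "output_dir": "bart-large",          # assumed; not stated in the card
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 2,
    "per_device_eval_batch_size": 2,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 30,
    "fp16": True,                        # "Native AMP" mixed precision
}

# Usage (requires transformers and accelerate installed):
# from transformers import TrainingArguments
# args = TrainingArguments(**training_kwargs)
```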
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| 1.1685 | 1.0 | 2569 | 1.2587 | 0.6847 | 0.6360 | 0.6847 | 0.4176 | 0.4720 | 0.0331 | 0.0318 | 0.9550 | 0.9760 | 0.6847 | 0.4720 | 0.6847 | 0.4296 | 0.6471 |
| 1.1965 | 2.0 | 5138 | 1.1623 | 0.6638 | 0.6943 | 0.6638 | 0.4564 | 0.4261 | 0.0342 | 0.0349 | 0.9654 | 0.9753 | 0.6638 | 0.4261 | 0.6638 | 0.3955 | 0.6468 |
| 1.189 | 3.0 | 7707 | 1.3574 | 0.7235 | 0.7220 | 0.7235 | 0.5413 | 0.5528 | 0.0271 | 0.0266 | 0.9628 | 0.9791 | 0.7235 | 0.5528 | 0.7235 | 0.5196 | 0.7031 |
| 1.0127 | 4.0 | 10276 | 1.4685 | 0.7668 | 0.7584 | 0.7668 | 0.6671 | 0.6202 | 0.0224 | 0.0213 | 0.9653 | 0.9821 | 0.7668 | 0.6202 | 0.7668 | 0.6233 | 0.7569 |
| 1.0205 | 5.0 | 12845 | 1.4232 | 0.7668 | 0.7711 | 0.7668 | 0.6765 | 0.6872 | 0.0215 | 0.0213 | 0.9737 | 0.9827 | 0.7668 | 0.6872 | 0.7668 | 0.6732 | 0.7643 |
| 0.7927 | 6.0 | 15414 | 1.5678 | 0.7428 | 0.7451 | 0.7428 | 0.6489 | 0.6333 | 0.0248 | 0.0241 | 0.9690 | 0.9808 | 0.7428 | 0.6333 | 0.7428 | 0.6108 | 0.7292 |
| 0.7701 | 7.0 | 17983 | 1.7337 | 0.7467 | 0.7600 | 0.7467 | 0.6863 | 0.6536 | 0.0240 | 0.0237 | 0.9680 | 0.9810 | 0.7467 | 0.6536 | 0.7467 | 0.6584 | 0.7399 |
| 0.584 | 8.0 | 20552 | 1.6188 | 0.7692 | 0.7766 | 0.7692 | 0.6979 | 0.7065 | 0.0214 | 0.0210 | 0.9706 | 0.9827 | 0.7692 | 0.7065 | 0.7692 | 0.6980 | 0.7683 |
| 0.5659 | 9.0 | 23121 | 1.6983 | 0.7599 | 0.7665 | 0.7599 | 0.7000 | 0.6804 | 0.0227 | 0.0221 | 0.9695 | 0.9820 | 0.7599 | 0.6804 | 0.7599 | 0.6728 | 0.7542 |
| 0.7021 | 10.0 | 25690 | 1.6445 | 0.7699 | 0.7656 | 0.7699 | 0.7144 | 0.6857 | 0.0223 | 0.0209 | 0.9608 | 0.9821 | 0.7699 | 0.6857 | 0.7699 | 0.6954 | 0.7634 |
| 0.6216 | 11.0 | 28259 | 1.6562 | 0.7676 | 0.7634 | 0.7676 | 0.6856 | 0.6776 | 0.0223 | 0.0212 | 0.9640 | 0.9821 | 0.7676 | 0.6776 | 0.7676 | 0.6786 | 0.7624 |
| 0.6408 | 12.0 | 30828 | 1.6682 | 0.7668 | 0.7629 | 0.7668 | 0.6706 | 0.6719 | 0.0223 | 0.0213 | 0.9666 | 0.9822 | 0.7668 | 0.6719 | 0.7668 | 0.6666 | 0.7608 |
| 0.523 | 13.0 | 33397 | 1.7727 | 0.7653 | 0.7674 | 0.7653 | 0.8238 | 0.6934 | 0.0226 | 0.0214 | 0.9659 | 0.9821 | 0.7653 | 0.6934 | 0.7653 | 0.7066 | 0.7534 |
| 0.3688 | 14.0 | 35966 | 1.8404 | 0.7792 | 0.7788 | 0.7792 | 0.7229 | 0.6921 | 0.0209 | 0.0198 | 0.9675 | 0.9831 | 0.7792 | 0.6921 | 0.7792 | 0.6960 | 0.7731 |
| 0.2394 | 15.0 | 38535 | 1.7885 | 0.7816 | 0.7809 | 0.7816 | 0.7441 | 0.7115 | 0.0210 | 0.0196 | 0.9628 | 0.9830 | 0.7816 | 0.7115 | 0.7816 | 0.7230 | 0.7765 |
| 0.2734 | 16.0 | 41104 | 1.8944 | 0.7777 | 0.7870 | 0.7777 | 0.7539 | 0.7265 | 0.0203 | 0.0200 | 0.9724 | 0.9833 | 0.7777 | 0.7265 | 0.7777 | 0.7295 | 0.7777 |
| 0.4319 | 17.0 | 43673 | 1.7744 | 0.7885 | 0.7847 | 0.7885 | 0.7247 | 0.7320 | 0.0195 | 0.0188 | 0.9718 | 0.9840 | 0.7885 | 0.7320 | 0.7885 | 0.7269 | 0.7855 |
| 0.2347 | 18.0 | 46242 | 2.0036 | 0.7413 | 0.7352 | 0.7413 | 0.6934 | 0.6799 | 0.0255 | 0.0243 | 0.9597 | 0.9801 | 0.7413 | 0.6799 | 0.7413 | 0.6825 | 0.7354 |
| 0.1882 | 19.0 | 48811 | 1.9298 | 0.7816 | 0.7804 | 0.7816 | 0.7243 | 0.7262 | 0.0202 | 0.0196 | 0.9708 | 0.9835 | 0.7816 | 0.7262 | 0.7816 | 0.7225 | 0.7792 |
| 0.1799 | 20.0 | 51380 | 1.9688 | 0.7792 | 0.7892 | 0.7792 | 0.7312 | 0.7343 | 0.0205 | 0.0198 | 0.9714 | 0.9834 | 0.7792 | 0.7343 | 0.7792 | 0.7242 | 0.7779 |
| 0.1366 | 21.0 | 53949 | 1.9910 | 0.7847 | 0.7846 | 0.7847 | 0.7148 | 0.7455 | 0.0198 | 0.0192 | 0.9730 | 0.9838 | 0.7847 | 0.7455 | 0.7847 | 0.7265 | 0.7833 |
| 0.1793 | 22.0 | 56518 | 2.2548 | 0.7630 | 0.7648 | 0.7630 | 0.7150 | 0.7273 | 0.0230 | 0.0217 | 0.9633 | 0.9818 | 0.7630 | 0.7273 | 0.7630 | 0.7150 | 0.7582 |
| 0.1749 | 23.0 | 59087 | 2.1109 | 0.7816 | 0.7768 | 0.7816 | 0.7466 | 0.7230 | 0.0205 | 0.0196 | 0.9690 | 0.9834 | 0.7816 | 0.7230 | 0.7816 | 0.7289 | 0.7774 |
| 0.1154 | 24.0 | 61656 | 2.0637 | 0.7878 | 0.7837 | 0.7878 | 0.7590 | 0.7269 | 0.0196 | 0.0189 | 0.9718 | 0.9840 | 0.7878 | 0.7269 | 0.7878 | 0.7331 | 0.7828 |
| 0.1447 | 25.0 | 64225 | 2.0027 | 0.7916 | 0.7858 | 0.7916 | 0.7750 | 0.7299 | 0.0194 | 0.0185 | 0.9697 | 0.9841 | 0.7916 | 0.7299 | 0.7916 | 0.7408 | 0.7861 |
| 0.0806 | 26.0 | 66794 | 2.0777 | 0.7885 | 0.7831 | 0.7885 | 0.7162 | 0.7134 | 0.0196 | 0.0188 | 0.9715 | 0.9840 | 0.7885 | 0.7134 | 0.7885 | 0.7118 | 0.7840 |
| 0.0407 | 27.0 | 69363 | 2.1754 | 0.7885 | 0.7863 | 0.7885 | 0.7192 | 0.7080 | 0.0194 | 0.0188 | 0.9725 | 0.9841 | 0.7885 | 0.7080 | 0.7885 | 0.7105 | 0.7866 |
| 0.0701 | 28.0 | 71932 | 2.1578 | 0.7823 | 0.7817 | 0.7823 | 0.7130 | 0.7097 | 0.0201 | 0.0195 | 0.9714 | 0.9836 | 0.7823 | 0.7097 | 0.7823 | 0.7066 | 0.7810 |
| 0.1034 | 29.0 | 74501 | 2.2132 | 0.7800 | 0.7789 | 0.7800 | 0.7163 | 0.7044 | 0.0203 | 0.0197 | 0.9713 | 0.9834 | 0.7800 | 0.7044 | 0.7800 | 0.7064 | 0.7785 |
| 0.0388 | 30.0 | 77070 | 2.1833 | 0.7823 | 0.7806 | 0.7823 | 0.7201 | 0.7056 | 0.0201 | 0.0195 | 0.9714 | 0.9836 | 0.7823 | 0.7056 | 0.7823 | 0.7080 | 0.7801 |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.19.0
- Tokenizers 0.15.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:a4ea6b3355af60802292819b636c66b0772c041026898e059e5f6636cf870643
+oid sha256:a5800f2e81754def4cb0ca4adf2705a90fe22b40c71ff1e251a765472dbe9997
 size 1629486164
runs/Apr20_23-47-18_a809b8a532ab/events.out.tfevents.1713656840.a809b8a532ab.1584.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9a5d42e7df879685c862b0d51a8e11c61e51ad44d248d7c9a2097706778543f7
-size 62638
+oid sha256:c20d52816f78bdb0a14b07acc54a934598d344595eee649f1a16f35f65e461a2
+size 62998