---
license: apache-2.0
tags:
- generated_from_trainer
base_model: facebook/bart-large
metrics:
- accuracy
- precision
- recall
model-index:
- name: bart-large
  results: []
---


# bart-large

This model is a fine-tuned version of [facebook/bart-large](https://huggingface.co/facebook/bart-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0027
- Accuracy: 0.7916
- Precision: 0.7858
- Recall: 0.7916
- Precision Macro: 0.7201
- Recall Macro: 0.7056
- Macro Fpr: 0.0201
- Weighted Fpr: 0.0195
- Weighted Specificity: 0.9714
- Macro Specificity: 0.9836
- Weighted Sensitivity: 0.7823
- Macro Sensitivity: 0.7056
- F1 Micro: 0.7823
- F1 Macro: 0.7080
- F1 Weighted: 0.7801
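
The dataset and task are not documented, but the metric set above (accuracy with macro- and weighted-averaged precision, recall, FPR, and specificity) points to multi-class sequence classification. A minimal inference sketch under that assumption; the repository id below is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repository id; substitute the actual path of this checkpoint.
model_id = "your-username/bart-large"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example input text to classify."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name, if one is configured.
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))
```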

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
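
Assuming the standard `transformers` `Trainer` API was used (the training script itself is not included in this card), these settings map onto `TrainingArguments` roughly as follows:

```python
from transformers import TrainingArguments

# Rough reconstruction of the hyperparameters listed above;
# output_dir is a placeholder, everything else mirrors the list.
training_args = TrainingArguments(
    output_dir="bart-large",
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```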

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| 1.1685        | 1.0   | 2569  | 1.2587          | 0.6847   | 0.6360    | 0.6847 | 0.4176          | 0.4720       | 0.0331    | 0.0318       | 0.9550               | 0.9760            | 0.6847               | 0.4720            | 0.6847   | 0.4296   | 0.6471      |
| 1.1965        | 2.0   | 5138  | 1.1623          | 0.6638   | 0.6943    | 0.6638 | 0.4564          | 0.4261       | 0.0342    | 0.0349       | 0.9654               | 0.9753            | 0.6638               | 0.4261            | 0.6638   | 0.3955   | 0.6468      |
| 1.189         | 3.0   | 7707  | 1.3574          | 0.7235   | 0.7220    | 0.7235 | 0.5413          | 0.5528       | 0.0271    | 0.0266       | 0.9628               | 0.9791            | 0.7235               | 0.5528            | 0.7235   | 0.5196   | 0.7031      |
| 1.0127        | 4.0   | 10276 | 1.4685          | 0.7668   | 0.7584    | 0.7668 | 0.6671          | 0.6202       | 0.0224    | 0.0213       | 0.9653               | 0.9821            | 0.7668               | 0.6202            | 0.7668   | 0.6233   | 0.7569      |
| 1.0205        | 5.0   | 12845 | 1.4232          | 0.7668   | 0.7711    | 0.7668 | 0.6765          | 0.6872       | 0.0215    | 0.0213       | 0.9737               | 0.9827            | 0.7668               | 0.6872            | 0.7668   | 0.6732   | 0.7643      |
| 0.7927        | 6.0   | 15414 | 1.5678          | 0.7428   | 0.7451    | 0.7428 | 0.6489          | 0.6333       | 0.0248    | 0.0241       | 0.9690               | 0.9808            | 0.7428               | 0.6333            | 0.7428   | 0.6108   | 0.7292      |
| 0.7701        | 7.0   | 17983 | 1.7337          | 0.7467   | 0.7600    | 0.7467 | 0.6863          | 0.6536       | 0.0240    | 0.0237       | 0.9680               | 0.9810            | 0.7467               | 0.6536            | 0.7467   | 0.6584   | 0.7399      |
| 0.584         | 8.0   | 20552 | 1.6188          | 0.7692   | 0.7766    | 0.7692 | 0.6979          | 0.7065       | 0.0214    | 0.0210       | 0.9706               | 0.9827            | 0.7692               | 0.7065            | 0.7692   | 0.6980   | 0.7683      |
| 0.5659        | 9.0   | 23121 | 1.6983          | 0.7599   | 0.7665    | 0.7599 | 0.7000          | 0.6804       | 0.0227    | 0.0221       | 0.9695               | 0.9820            | 0.7599               | 0.6804            | 0.7599   | 0.6728   | 0.7542      |
| 0.7021        | 10.0  | 25690 | 1.6445          | 0.7699   | 0.7656    | 0.7699 | 0.7144          | 0.6857       | 0.0223    | 0.0209       | 0.9608               | 0.9821            | 0.7699               | 0.6857            | 0.7699   | 0.6954   | 0.7634      |
| 0.6216        | 11.0  | 28259 | 1.6562          | 0.7676   | 0.7634    | 0.7676 | 0.6856          | 0.6776       | 0.0223    | 0.0212       | 0.9640               | 0.9821            | 0.7676               | 0.6776            | 0.7676   | 0.6786   | 0.7624      |
| 0.6408        | 12.0  | 30828 | 1.6682          | 0.7668   | 0.7629    | 0.7668 | 0.6706          | 0.6719       | 0.0223    | 0.0213       | 0.9666               | 0.9822            | 0.7668               | 0.6719            | 0.7668   | 0.6666   | 0.7608      |
| 0.523         | 13.0  | 33397 | 1.7727          | 0.7653   | 0.7674    | 0.7653 | 0.8238          | 0.6934       | 0.0226    | 0.0214       | 0.9659               | 0.9821            | 0.7653               | 0.6934            | 0.7653   | 0.7066   | 0.7534      |
| 0.3688        | 14.0  | 35966 | 1.8404          | 0.7792   | 0.7788    | 0.7792 | 0.7229          | 0.6921       | 0.0209    | 0.0198       | 0.9675               | 0.9831            | 0.7792               | 0.6921            | 0.7792   | 0.6960   | 0.7731      |
| 0.2394        | 15.0  | 38535 | 1.7885          | 0.7816   | 0.7809    | 0.7816 | 0.7441          | 0.7115       | 0.0210    | 0.0196       | 0.9628               | 0.9830            | 0.7816               | 0.7115            | 0.7816   | 0.7230   | 0.7765      |
| 0.2734        | 16.0  | 41104 | 1.8944          | 0.7777   | 0.7870    | 0.7777 | 0.7539          | 0.7265       | 0.0203    | 0.0200       | 0.9724               | 0.9833            | 0.7777               | 0.7265            | 0.7777   | 0.7295   | 0.7777      |
| 0.4319        | 17.0  | 43673 | 1.7744          | 0.7885   | 0.7847    | 0.7885 | 0.7247          | 0.7320       | 0.0195    | 0.0188       | 0.9718               | 0.9840            | 0.7885               | 0.7320            | 0.7885   | 0.7269   | 0.7855      |
| 0.2347        | 18.0  | 46242 | 2.0036          | 0.7413   | 0.7352    | 0.7413 | 0.6934          | 0.6799       | 0.0255    | 0.0243       | 0.9597               | 0.9801            | 0.7413               | 0.6799            | 0.7413   | 0.6825   | 0.7354      |
| 0.1882        | 19.0  | 48811 | 1.9298          | 0.7816   | 0.7804    | 0.7816 | 0.7243          | 0.7262       | 0.0202    | 0.0196       | 0.9708               | 0.9835            | 0.7816               | 0.7262            | 0.7816   | 0.7225   | 0.7792      |
| 0.1799        | 20.0  | 51380 | 1.9688          | 0.7792   | 0.7892    | 0.7792 | 0.7312          | 0.7343       | 0.0205    | 0.0198       | 0.9714               | 0.9834            | 0.7792               | 0.7343            | 0.7792   | 0.7242   | 0.7779      |
| 0.1366        | 21.0  | 53949 | 1.9910          | 0.7847   | 0.7846    | 0.7847 | 0.7148          | 0.7455       | 0.0198    | 0.0192       | 0.9730               | 0.9838            | 0.7847               | 0.7455            | 0.7847   | 0.7265   | 0.7833      |
| 0.1793        | 22.0  | 56518 | 2.2548          | 0.7630   | 0.7648    | 0.7630 | 0.7150          | 0.7273       | 0.0230    | 0.0217       | 0.9633               | 0.9818            | 0.7630               | 0.7273            | 0.7630   | 0.7150   | 0.7582      |
| 0.1749        | 23.0  | 59087 | 2.1109          | 0.7816   | 0.7768    | 0.7816 | 0.7466          | 0.7230       | 0.0205    | 0.0196       | 0.9690               | 0.9834            | 0.7816               | 0.7230            | 0.7816   | 0.7289   | 0.7774      |
| 0.1154        | 24.0  | 61656 | 2.0637          | 0.7878   | 0.7837    | 0.7878 | 0.7590          | 0.7269       | 0.0196    | 0.0189       | 0.9718               | 0.9840            | 0.7878               | 0.7269            | 0.7878   | 0.7331   | 0.7828      |
| 0.1447        | 25.0  | 64225 | 2.0027          | 0.7916   | 0.7858    | 0.7916 | 0.7750          | 0.7299       | 0.0194    | 0.0185       | 0.9697               | 0.9841            | 0.7916               | 0.7299            | 0.7916   | 0.7408   | 0.7861      |
| 0.0806        | 26.0  | 66794 | 2.0777          | 0.7885   | 0.7831    | 0.7885 | 0.7162          | 0.7134       | 0.0196    | 0.0188       | 0.9715               | 0.9840            | 0.7885               | 0.7134            | 0.7885   | 0.7118   | 0.7840      |
| 0.0407        | 27.0  | 69363 | 2.1754          | 0.7885   | 0.7863    | 0.7885 | 0.7192          | 0.7080       | 0.0194    | 0.0188       | 0.9725               | 0.9841            | 0.7885               | 0.7080            | 0.7885   | 0.7105   | 0.7866      |
| 0.0701        | 28.0  | 71932 | 2.1578          | 0.7823   | 0.7817    | 0.7823 | 0.7130          | 0.7097       | 0.0201    | 0.0195       | 0.9714               | 0.9836            | 0.7823               | 0.7097            | 0.7823   | 0.7066   | 0.7810      |
| 0.1034        | 29.0  | 74501 | 2.2132          | 0.7800   | 0.7789    | 0.7800 | 0.7163          | 0.7044       | 0.0203    | 0.0197       | 0.9713               | 0.9834            | 0.7800               | 0.7044            | 0.7800   | 0.7064   | 0.7785      |
| 0.0388        | 30.0  | 77070 | 2.1833          | 0.7823   | 0.7806    | 0.7823 | 0.7201          | 0.7056       | 0.0201    | 0.0195       | 0.9714               | 0.9836            | 0.7823               | 0.7056            | 0.7823   | 0.7080   | 0.7801      |
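
The macro/weighted FPR, specificity, and sensitivity columns are not standard `scikit-learn` outputs; a plausible reconstruction computes them one-vs-rest from the confusion matrix, with "macro" as the unweighted mean over classes and "weighted" as the support-weighted mean (assuming integer labels `0..K-1`):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def rates_per_class(y_true, y_pred):
    """One-vs-rest FPR, specificity, and sensitivity for each class."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp   # true class, predicted as something else
    fp = cm.sum(axis=0) - tp   # predicted class, actually something else
    tn = cm.sum() - (tp + fn + fp)
    fpr = fp / (fp + tn)
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)  # identical to per-class recall
    return fpr, specificity, sensitivity

def macro_and_weighted(values, y_true):
    """Unweighted mean over classes vs. mean weighted by class support."""
    support = np.bincount(y_true)
    return values.mean(), np.average(values, weights=support)
```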


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.19.0
- Tokenizers 0.15.1