---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
base_model: facebook/bart-base
model-index:
- name: bart-base
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-base

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7146
- Accuracy: 0.8180
- Precision: 0.8189
- Recall: 0.8180
- Precision Macro: 0.7608
- Recall Macro: 0.7799
- Macro FPR: 0.0157
- Weighted FPR: 0.0151
- Weighted Specificity: 0.9781
- Macro Specificity: 0.9868
- Weighted Sensitivity: 0.8234
- Macro Sensitivity: 0.7799
- F1 Micro: 0.8234
- F1 Macro: 0.7642
- F1 Weighted: 0.8237
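
The macro/weighted FPR and specificity values above are not produced by a single built-in metric; a minimal sketch of how such values can be derived from a one-vs-rest confusion matrix with scikit-learn (the function name and aggregation details are assumptions, not the exact `compute_metrics` used for this run):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, f1_score

def ovr_rates(y_true, y_pred):
    """One-vs-rest specificity/FPR per class, with macro (unweighted mean)
    and weighted (support-weighted mean) aggregates. Illustrative only."""
    labels = np.unique(y_true)
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp   # predicted as class i but belong elsewhere
    fn = cm.sum(axis=1) - tp   # belong to class i but predicted elsewhere
    tn = cm.sum() - tp - fp - fn
    specificity = tn / (tn + fp)
    fpr = fp / (fp + tn)
    support = cm.sum(axis=1)
    return {
        "macro_specificity": specificity.mean(),
        "weighted_specificity": np.average(specificity, weights=support),
        "macro_fpr": fpr.mean(),
        "weighted_fpr": np.average(fpr, weights=support),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "f1_weighted": f1_score(y_true, y_pred, average="weighted"),
    }
```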

## Model description

More information needed

## Intended uses & limitations

More information needed
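
The metric set (accuracy/precision/recall) suggests a text-classification head on top of bart-base; pending details from the author, a minimal inference sketch under that assumption (the repo id is a placeholder, and the label mapping depends on the unknown training dataset):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "your-username/bart-base"  # placeholder: replace with the actual Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("Example input text", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))  # label names come from the fine-tuning data
```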

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
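
For reference, a sketch of the equivalent `TrainingArguments` for Transformers 4.35 (the output directory and evaluation cadence are assumptions, not recorded in the card; the per-epoch rows in the table below suggest evaluation once per epoch):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bart-base",           # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="epoch",      # assumed from the one-eval-per-epoch log
)
```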

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro FPR | Weighted FPR | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| 1.2028        | 1.0   | 643   | 0.8430          | 0.7599   | 0.7601    | 0.7599 | 0.6004          | 0.6367       | 0.0232    | 0.0221       | 0.9655               | 0.9817            | 0.7599               | 0.6367            | 0.7599   | 0.6064   | 0.7489      |
| 0.715         | 2.0   | 1286  | 0.7332          | 0.7932   | 0.8020    | 0.7932 | 0.7386          | 0.7321       | 0.0190    | 0.0183       | 0.9745               | 0.9845            | 0.7932               | 0.7321            | 0.7932   | 0.7214   | 0.7853      |
| 0.578         | 3.0   | 1929  | 0.8045          | 0.7940   | 0.8075    | 0.7940 | 0.7231          | 0.7069       | 0.0185    | 0.0182       | 0.9775               | 0.9848            | 0.7940               | 0.7069            | 0.7940   | 0.6998   | 0.7901      |
| 0.3938        | 4.0   | 2572  | 0.8291          | 0.8156   | 0.8171    | 0.8156 | 0.7937          | 0.7218       | 0.0169    | 0.0159       | 0.9711               | 0.9858            | 0.8156               | 0.7218            | 0.8156   | 0.7369   | 0.8105      |
| 0.3238        | 5.0   | 3215  | 0.8889          | 0.7940   | 0.8146    | 0.7940 | 0.7464          | 0.7515       | 0.0188    | 0.0182       | 0.9762               | 0.9847            | 0.7940               | 0.7515            | 0.7940   | 0.7361   | 0.7995      |
| 0.246         | 6.0   | 3858  | 1.1629          | 0.7955   | 0.8067    | 0.7955 | 0.7483          | 0.7600       | 0.0186    | 0.0180       | 0.9749               | 0.9847            | 0.7955               | 0.7600            | 0.7955   | 0.7362   | 0.7946      |
| 0.1791        | 7.0   | 4501  | 1.1354          | 0.8180   | 0.8151    | 0.8180 | 0.7832          | 0.7697       | 0.0165    | 0.0156       | 0.9747               | 0.9862            | 0.8180               | 0.7697            | 0.8180   | 0.7736   | 0.8147      |
| 0.1305        | 8.0   | 5144  | 1.2825          | 0.8110   | 0.8148    | 0.8110 | 0.7422          | 0.7489       | 0.0169    | 0.0164       | 0.9765               | 0.9858            | 0.8110               | 0.7489            | 0.8110   | 0.7369   | 0.8088      |
| 0.0924        | 9.0   | 5787  | 1.4217          | 0.8040   | 0.8114    | 0.8040 | 0.7465          | 0.7809       | 0.0178    | 0.0171       | 0.9762               | 0.9853            | 0.8040               | 0.7809            | 0.8040   | 0.7560   | 0.8015      |
| 0.0953        | 10.0  | 6430  | 1.5552          | 0.8025   | 0.8056    | 0.8025 | 0.7702          | 0.7822       | 0.0183    | 0.0173       | 0.9712               | 0.9849            | 0.8025               | 0.7822            | 0.8025   | 0.7661   | 0.8001      |
| 0.0617        | 11.0  | 7073  | 1.5224          | 0.8040   | 0.8144    | 0.8040 | 0.7457          | 0.7512       | 0.0176    | 0.0171       | 0.9762               | 0.9853            | 0.8040               | 0.7512            | 0.8040   | 0.7422   | 0.8070      |
| 0.0582        | 12.0  | 7716  | 1.6428          | 0.7971   | 0.8148    | 0.7971 | 0.7470          | 0.7655       | 0.0183    | 0.0179       | 0.9771               | 0.9849            | 0.7971               | 0.7655            | 0.7971   | 0.7465   | 0.8022      |
| 0.0511        | 13.0  | 8359  | 1.4952          | 0.8195   | 0.8208    | 0.8195 | 0.7645          | 0.7580       | 0.0162    | 0.0155       | 0.9759               | 0.9864            | 0.8195               | 0.7580            | 0.8195   | 0.7586   | 0.8187      |
| 0.0476        | 14.0  | 9002  | 1.7132          | 0.7971   | 0.7958    | 0.7971 | 0.7637          | 0.7328       | 0.0189    | 0.0179       | 0.9708               | 0.9845            | 0.7971               | 0.7328            | 0.7971   | 0.7417   | 0.7913      |
| 0.0375        | 15.0  | 9645  | 1.7058          | 0.8002   | 0.8110    | 0.8002 | 0.7369          | 0.7696       | 0.0182    | 0.0175       | 0.9757               | 0.9851            | 0.8002               | 0.7696            | 0.8002   | 0.7437   | 0.8017      |
| 0.0241        | 16.0  | 10288 | 1.7146          | 0.8180   | 0.8189    | 0.8180 | 0.7852          | 0.7787       | 0.0162    | 0.0156       | 0.9761               | 0.9863            | 0.8180               | 0.7787            | 0.8180   | 0.7780   | 0.8174      |
| 0.0226        | 17.0  | 10931 | 1.7035          | 0.8203   | 0.8238    | 0.8203 | 0.7732          | 0.7781       | 0.0160    | 0.0154       | 0.9774               | 0.9865            | 0.8203               | 0.7781            | 0.8203   | 0.7714   | 0.8206      |
| 0.0189        | 18.0  | 11574 | 1.8079          | 0.8164   | 0.8160    | 0.8164 | 0.7583          | 0.7677       | 0.0166    | 0.0158       | 0.9749               | 0.9861            | 0.8164               | 0.7677            | 0.8164   | 0.7578   | 0.8149      |
| 0.026         | 19.0  | 12217 | 1.8187          | 0.8125   | 0.8170    | 0.8125 | 0.7675          | 0.7833       | 0.0169    | 0.0162       | 0.9748               | 0.9858            | 0.8125               | 0.7833            | 0.8125   | 0.7719   | 0.8138      |
| 0.0101        | 20.0  | 12860 | 1.8354          | 0.8187   | 0.8220    | 0.8187 | 0.7748          | 0.7818       | 0.0161    | 0.0156       | 0.9772               | 0.9864            | 0.8187               | 0.7818            | 0.8187   | 0.7710   | 0.8180      |
| 0.0216        | 21.0  | 13503 | 1.8372          | 0.8156   | 0.8219    | 0.8156 | 0.7502          | 0.7858       | 0.0163    | 0.0159       | 0.9789               | 0.9863            | 0.8156               | 0.7858            | 0.8156   | 0.7618   | 0.8164      |
| 0.0138        | 22.0  | 14146 | 1.8472          | 0.8203   | 0.8263    | 0.8203 | 0.7613          | 0.7796       | 0.0159    | 0.0154       | 0.9786               | 0.9866            | 0.8203               | 0.7796            | 0.8203   | 0.7662   | 0.8222      |
| 0.0169        | 23.0  | 14789 | 1.8104          | 0.8218   | 0.8252    | 0.8218 | 0.7719          | 0.7595       | 0.0160    | 0.0152       | 0.9749               | 0.9865            | 0.8218               | 0.7595            | 0.8218   | 0.7607   | 0.8209      |
| 0.0079        | 24.0  | 15432 | 1.9253          | 0.8110   | 0.8202    | 0.8110 | 0.7622          | 0.7576       | 0.0171    | 0.0164       | 0.9759               | 0.9858            | 0.8110               | 0.7576            | 0.8110   | 0.7524   | 0.8123      |
| 0.0017        | 25.0  | 16075 | 1.9111          | 0.8156   | 0.8193    | 0.8156 | 0.7554          | 0.7742       | 0.0164    | 0.0159       | 0.9775               | 0.9862            | 0.8156               | 0.7742            | 0.8156   | 0.7594   | 0.8155      |
| 0.0071        | 26.0  | 16718 | 1.8809          | 0.8187   | 0.8244    | 0.8187 | 0.7595          | 0.7749       | 0.0161    | 0.0156       | 0.9783               | 0.9865            | 0.8187               | 0.7749            | 0.8187   | 0.7601   | 0.8199      |
| 0.0032        | 27.0  | 17361 | 1.8246          | 0.8273   | 0.8333    | 0.8273 | 0.7727          | 0.7807       | 0.0152    | 0.0147       | 0.9786               | 0.9871            | 0.8273               | 0.7807            | 0.8273   | 0.7718   | 0.8289      |
| 0.0014        | 28.0  | 18004 | 1.8354          | 0.8265   | 0.8337    | 0.8265 | 0.7624          | 0.7806       | 0.0154    | 0.0148       | 0.9784               | 0.9870            | 0.8265               | 0.7806            | 0.8265   | 0.7648   | 0.8282      |
| 0.0004        | 29.0  | 18647 | 1.8558          | 0.8234   | 0.8277    | 0.8234 | 0.7616          | 0.7801       | 0.0157    | 0.0151       | 0.9778               | 0.9867            | 0.8234               | 0.7801            | 0.8234   | 0.7646   | 0.8234      |
| 0.0012        | 30.0  | 19290 | 1.8392          | 0.8234   | 0.8281    | 0.8234 | 0.7608          | 0.7799       | 0.0157    | 0.0151       | 0.9781               | 0.9868            | 0.8234               | 0.7799            | 0.8234   | 0.7642   | 0.8237      |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1