---
license: mit
library_name: peft
tags:
- generated_from_trainer
base_model: facebook/bart-large-mnli
metrics:
- f1
- precision
- recall
- accuracy
model-index:
- name: results
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# results

This model is a fine-tuned version of [facebook/bart-large-mnli](https://huggingface.co/facebook/bart-large-mnli) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1726
- F1: 0.9412
- Precision: 0.9524
- Recall: 0.9302
- Accuracy: 0.9333
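
The card does not state how the adapter is meant to be consumed, so the snippet below is only a sketch of one way to load it on top of the base checkpoint with `peft`. The adapter repo ID (`your-username/results`) and the label count (`num_labels=2`) are placeholders, not values taken from this card.

```python
# Hedged loading/inference sketch. Assumptions (not stated in this card):
# - the adapter is published at "your-username/results" (placeholder ID)
# - the task is binary sequence classification (num_labels=2)
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "facebook/bart-large-mnli"
adapter_id = "your-username/results"  # placeholder; replace with the real repo ID or local path

tokenizer = AutoTokenizer.from_pretrained(base_id)
# The base checkpoint ships a 3-way NLI head; ignore_mismatched_sizes lets us
# swap in a freshly sized head matching the assumed label count.
base_model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=2, ignore_mismatched_sizes=True
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))
```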

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
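
For reference, here is a minimal sketch of a `TrainingArguments`/`LoraConfig` pair consistent with the list above. The `LoraConfig` values (`r`, `lora_alpha`, `lora_dropout`, task type) are assumptions, since the card does not record the PEFT adapter configuration.

```python
# Hedged reconstruction of the training configuration above.
# TrainingArguments mirrors the logged hyperparameters; LoraConfig is assumed.
from transformers import TrainingArguments
from peft import LoraConfig, TaskType

training_args = TrainingArguments(
    output_dir="results",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)

# Assumed adapter settings; not stated anywhere in this card.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=16,
    lora_alpha=32,
    lora_dropout=0.1,
)
```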

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | Precision | Recall | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:|
| 0.5809        | 0.14  | 10   | 0.3912          | 0.8671 | 0.8621    | 0.8721 | 0.8467   |
| 0.3659        | 0.28  | 20   | 0.3689          | 0.8790 | 0.9718    | 0.8023 | 0.8733   |
| 0.3805        | 0.42  | 30   | 0.2890          | 0.9133 | 0.9080    | 0.9186 | 0.9      |
| 0.4068        | 0.56  | 40   | 0.3249          | 0.9068 | 0.9733    | 0.8488 | 0.9      |
| 0.3183        | 0.69  | 50   | 0.2801          | 0.9133 | 0.9080    | 0.9186 | 0.9      |
| 0.1929        | 0.83  | 60   | 0.2832          | 0.9123 | 0.9176    | 0.9070 | 0.9      |
| 0.2861        | 0.97  | 70   | 0.2883          | 0.9195 | 0.9091    | 0.9302 | 0.9067   |
| 0.209         | 1.11  | 80   | 0.3000          | 0.9222 | 0.9506    | 0.8953 | 0.9133   |
| 0.2192        | 1.25  | 90   | 0.2845          | 0.9176 | 0.9286    | 0.9070 | 0.9067   |
| 0.3116        | 1.39  | 100  | 0.2520          | 0.9249 | 0.9195    | 0.9302 | 0.9133   |
| 0.2512        | 1.53  | 110  | 0.2650          | 0.9222 | 0.9506    | 0.8953 | 0.9133   |
| 0.1774        | 1.67  | 120  | 0.2571          | 0.9231 | 0.9398    | 0.9070 | 0.9133   |
| 0.1126        | 1.81  | 130  | 0.2668          | 0.9364 | 0.9310    | 0.9419 | 0.9267   |
| 0.2379        | 1.94  | 140  | 0.3075          | 0.9012 | 0.9605    | 0.8488 | 0.8933   |
| 0.2753        | 2.08  | 150  | 0.2254          | 0.9240 | 0.9294    | 0.9186 | 0.9133   |
| 0.1727        | 2.22  | 160  | 0.2707          | 0.9310 | 0.9205    | 0.9419 | 0.92     |
| 0.224         | 2.36  | 170  | 0.3118          | 0.9057 | 0.9863    | 0.8372 | 0.9      |
| 0.2056        | 2.5   | 180  | 0.2673          | 0.9302 | 0.9302    | 0.9302 | 0.92     |
| 0.2274        | 2.64  | 190  | 0.2515          | 0.9302 | 0.9302    | 0.9302 | 0.92     |
| 0.1193        | 2.78  | 200  | 0.2250          | 0.9357 | 0.9412    | 0.9302 | 0.9267   |
| 0.2806        | 2.92  | 210  | 0.2268          | 0.9286 | 0.9512    | 0.9070 | 0.92     |
| 0.1272        | 3.06  | 220  | 0.2031          | 0.9349 | 0.9518    | 0.9186 | 0.9267   |
| 0.1879        | 3.19  | 230  | 0.1730          | 0.9480 | 0.9425    | 0.9535 | 0.94     |
| 0.1341        | 3.33  | 240  | 0.1867          | 0.9419 | 0.9419    | 0.9419 | 0.9333   |
| 0.1376        | 3.47  | 250  | 0.2628          | 0.9341 | 0.9630    | 0.9070 | 0.9267   |
| 0.1599        | 3.61  | 260  | 0.2484          | 0.9405 | 0.9634    | 0.9186 | 0.9333   |
| 0.1899        | 3.75  | 270  | 0.1847          | 0.9480 | 0.9425    | 0.9535 | 0.94     |
| 0.0828        | 3.89  | 280  | 0.1869          | 0.9412 | 0.9524    | 0.9302 | 0.9333   |
| 0.1025        | 4.03  | 290  | 0.1876          | 0.9349 | 0.9518    | 0.9186 | 0.9267   |
| 0.118         | 4.17  | 300  | 0.1811          | 0.9419 | 0.9419    | 0.9419 | 0.9333   |
| 0.1475        | 4.31  | 310  | 0.1901          | 0.9294 | 0.9405    | 0.9186 | 0.92     |
| 0.1354        | 4.44  | 320  | 0.1805          | 0.9357 | 0.9412    | 0.9302 | 0.9267   |
| 0.1444        | 4.58  | 330  | 0.1706          | 0.9540 | 0.9432    | 0.9651 | 0.9467   |
| 0.1068        | 4.72  | 340  | 0.1693          | 0.9480 | 0.9425    | 0.9535 | 0.94     |
| 0.0875        | 4.86  | 350  | 0.1715          | 0.9412 | 0.9524    | 0.9302 | 0.9333   |
| 0.0922        | 5.0   | 360  | 0.1726          | 0.9412 | 0.9524    | 0.9302 | 0.9333   |
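
The F1, precision, recall, and accuracy columns above could have been produced by a `compute_metrics` callback along the following lines. Binary labels and scikit-learn's default (binary) averaging are assumptions; the card does not state the label set or averaging mode.

```python
# Hedged sketch of a compute_metrics function matching the reported columns.
# Assumes binary labels and scikit-learn's default binary averaging.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    if isinstance(logits, tuple):  # some models return extra tensors alongside logits
        logits = logits[0]
    preds = np.argmax(logits, axis=-1)
    return {
        "f1": f1_score(labels, preds),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "accuracy": accuracy_score(labels, preds),
    }
```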


### Framework versions

- PEFT 0.9.0
- Transformers 4.39.1
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2