---
license: mit
tags:
- generated_from_trainer
model-index:
- name: roberta-base-mnli_MULTI
  results: []
---

# roberta-base-mnli_MULTI

This model is a fine-tuned version of [WillHeld/roberta-base-mnli](https://huggingface.co/WillHeld/roberta-base-mnli) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7607
- Acc: 0.8310
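
The snippet below is a minimal inference sketch. The repo id `WillHeld/roberta-base-mnli_MULTI` is an assumption inferred from the model name and base checkpoint, so substitute the actual location of this checkpoint; likewise, read the label mapping from the checkpoint's config rather than assuming a fixed order.

```python
# Minimal inference sketch for an MNLI-style premise/hypothesis pair.
# NOTE: "WillHeld/roberta-base-mnli_MULTI" is a hypothetical repo id; replace
# it with wherever this checkpoint is actually hosted (or a local path).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "WillHeld/roberta-base-mnli_MULTI"  # assumption, see note above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer(
    "A soccer game with multiple males playing.",  # premise
    "Some men are playing a sport.",               # hypothesis
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

# The id-to-label mapping comes from the checkpoint's config; do not assume
# a fixed entailment/neutral/contradiction order.
print(model.config.id2label[logits.argmax(dim=-1).item()])
```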

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
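
As a sketch only, the configuration above maps onto `transformers.TrainingArguments` roughly as follows. The Adam settings listed are the library defaults, the evaluation/logging interval of 2000 steps is an assumption read off the results table below, and the dataset and preprocessing are not documented in this card, so the `Trainer` call is left as a commented placeholder.

```python
# Reproduction sketch under the hyperparameters listed above
# (transformers 4.24-era API).
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="roberta-base-mnli_MULTI",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    # The Adam betas/epsilon listed above are the TrainingArguments defaults
    # (adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8).
    evaluation_strategy="steps",  # assumption: eval every 2000 steps (see table)
    eval_steps=2000,
    logging_steps=2000,
)

# The dataset and model setup are not recorded in this card:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```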

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Acc    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.4798        | 0.17  | 2000  | 0.4810          | 0.8211 |
| 0.4494        | 0.33  | 4000  | 0.4606          | 0.8268 |
| 0.4310        | 0.5   | 6000  | 0.4623          | 0.8301 |
| 0.4371        | 0.67  | 8000  | 0.4437          | 0.8297 |
| 0.4297        | 0.83  | 10000 | 0.4548          | 0.8313 |
| 0.4214        | 1.0   | 12000 | 0.4566          | 0.8337 |
| 0.3123        | 1.17  | 14000 | 0.4893          | 0.8323 |
| 0.3158        | 1.33  | 16000 | 0.4861          | 0.8342 |
| 0.324         | 1.5   | 18000 | 0.4812          | 0.8308 |
| 0.3161        | 1.66  | 20000 | 0.4630          | 0.8365 |
| 0.3200        | 1.83  | 22000 | 0.4630          | 0.8366 |
| 0.3195        | 2.0   | 24000 | 0.4681          | 0.8355 |
| 0.2274        | 2.16  | 26000 | 0.5347          | 0.8407 |
| 0.2311        | 2.33  | 28000 | 0.5650          | 0.8310 |
| 0.2293        | 2.5   | 30000 | 0.5408          | 0.8355 |
| 0.2296        | 2.66  | 32000 | 0.5207          | 0.8374 |
| 0.2274        | 2.83  | 34000 | 0.5696          | 0.8353 |
| 0.2300        | 3.0   | 36000 | 0.5332          | 0.8366 |
| 0.1686        | 3.16  | 38000 | 0.6275          | 0.8343 |
| 0.1632        | 3.33  | 40000 | 0.6457          | 0.8349 |
| 0.1686        | 3.5   | 42000 | 0.5965          | 0.8338 |
| 0.1634        | 3.66  | 44000 | 0.6272          | 0.8342 |
| 0.1656        | 3.83  | 46000 | 0.6541          | 0.8312 |
| 0.1620        | 4.0   | 48000 | 0.6408          | 0.8317 |
| 0.1288        | 4.16  | 50000 | 0.7237          | 0.8349 |
| 0.1275        | 4.33  | 52000 | 0.7558          | 0.8295 |
| 0.1291        | 4.5   | 54000 | 0.7730          | 0.8306 |
| 0.1261        | 4.66  | 56000 | 0.7524          | 0.8300 |
| 0.1272        | 4.83  | 58000 | 0.7572          | 0.8317 |
| 0.1242        | 4.99  | 60000 | 0.7607          | 0.8310 |


### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.12.1
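
For reproducibility, a quick sanity check that the runtime matches the versions pinned above:

```python
# Confirm the environment matches the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.24.0
print(torch.__version__)         # expected: 1.13.0+cu117
print(datasets.__version__)      # expected: 2.7.1
print(tokenizers.__version__)    # expected: 0.12.1
```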