---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: aces-roberta-10
  results: []
---

# aces-roberta-10

This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset.
It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.6188
- Precision: 0.8040
- Recall: 0.8198
- F1: 0.8097
- Accuracy: 0.8198
- F1 Who: 0.7939
- F1 What: 0.7929
- F1 Where: 0.7769
- F1 How: 0.8905
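
The card does not include a usage example, so here is a minimal inference sketch. The Hub id, the `text-classification` task, and the label semantics are assumptions inferred from the per-class metrics above, not confirmed details of this checkpoint.

```python
# A minimal usage sketch, not an official example. The model id, task type,
# and label names are assumptions based on the metrics reported above.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="aces-roberta-10",  # replace with the actual Hub id or a local path
)

print(classifier("Where did the event take place?"))
# -> [{'label': ..., 'score': ...}]  # labels depend on the training config
```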

## Model description

More information needed. Based on the reported per-class metrics (F1 for Who, What, Where, and How), the model appears to be a multi-class classifier over those four categories fine-tuned from roberta-large; the exact task and label set are not documented.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
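
For reference, these hyperparameters map onto the standard `Trainer` API (Transformers 4.26) roughly as follows. The `output_dir` and the evaluation/logging cadence are inferred from this card rather than stated explicitly; model and data setup are omitted.

```python
# A sketch reconstructing the configuration above with TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="aces-roberta-10",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="steps",  # the results table logs an eval every 20 steps
    eval_steps=20,
    logging_steps=20,
)
```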

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy | F1 Who | F1 What | F1 Where | F1 How |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:------:|:-------:|:--------:|:------:|
| 1.6596        | 0.15  | 20   | 1.2172          | 0.5510    | 0.6640 | 0.5906 | 0.6640   | 0.0000 | 0.6409  | 0.3258   | 0.7719 |
| 1.0566        | 0.31  | 40   | 0.9097          | 0.6534    | 0.7087 | 0.6590 | 0.7087   | 0.3855 | 0.7020  | 0.5620   | 0.8086 |
| 0.8056        | 0.46  | 60   | 0.7640          | 0.7092    | 0.7570 | 0.7196 | 0.7570   | 0.6857 | 0.7709  | 0.6696   | 0.8114 |
| 0.6996        | 0.61  | 80   | 0.6706          | 0.7601    | 0.7931 | 0.7687 | 0.7931   | 0.8103 | 0.7743  | 0.7471   | 0.8499 |
| 0.6346        | 0.76  | 100  | 0.6471          | 0.7763    | 0.8032 | 0.7852 | 0.8032   | 0.7874 | 0.7813  | 0.7490   | 0.8665 |
| 0.5230        | 0.92  | 120  | 0.6635          | 0.7872    | 0.8061 | 0.7865 | 0.8061   | 0.8244 | 0.7718  | 0.7692   | 0.8771 |
| 0.5324        | 1.07  | 140  | 0.6162          | 0.8045    | 0.8212 | 0.8110 | 0.8212   | 0.8197 | 0.8008  | 0.8033   | 0.8852 |
| 0.4734        | 1.22  | 160  | 0.6147          | 0.7935    | 0.8097 | 0.7978 | 0.8097   | 0.7939 | 0.7861  | 0.7698   | 0.8911 |
| 0.5111        | 1.37  | 180  | 0.6142          | 0.8022    | 0.8154 | 0.8051 | 0.8154   | 0.8244 | 0.8047  | 0.7680   | 0.8909 |
| 0.4416        | 1.53  | 200  | 0.6204          | 0.8006    | 0.8190 | 0.8079 | 0.8190   | 0.8271 | 0.7984  | 0.7773   | 0.8886 |
| 0.5249        | 1.68  | 220  | 0.6239          | 0.7907    | 0.8133 | 0.8006 | 0.8133   | 0.8182 | 0.7969  | 0.7739   | 0.8776 |
| 0.4599        | 1.83  | 240  | 0.6458          | 0.7989    | 0.8082 | 0.7967 | 0.8082   | 0.8244 | 0.7953  | 0.7751   | 0.8853 |
| 0.4979        | 1.98  | 260  | 0.6390          | 0.8071    | 0.8183 | 0.8051 | 0.8183   | 0.7869 | 0.8000  | 0.7583   | 0.8871 |
| 0.3930        | 2.14  | 280  | 0.6348          | 0.7994    | 0.8125 | 0.8021 | 0.8125   | 0.8271 | 0.7904  | 0.7653   | 0.8812 |
| 0.4079        | 2.29  | 300  | 0.6227          | 0.8002    | 0.8140 | 0.8040 | 0.8140   | 0.8182 | 0.7908  | 0.7668   | 0.8784 |
| 0.3731        | 2.44  | 320  | 0.6319          | 0.7887    | 0.8075 | 0.7965 | 0.8075   | 0.8030 | 0.7814  | 0.7692   | 0.8702 |
| 0.3987        | 2.60  | 340  | 0.6171          | 0.7922    | 0.8140 | 0.8015 | 0.8140   | 0.7907 | 0.7813  | 0.7968   | 0.8759 |
| 0.3865        | 2.75  | 360  | 0.6161          | 0.7968    | 0.8118 | 0.8032 | 0.8118   | 0.7846 | 0.7824  | 0.7692   | 0.8851 |
| 0.4222        | 2.9   | 380  | 0.6137          | 0.7955    | 0.8140 | 0.8033 | 0.8140   | 0.8060 | 0.7897  | 0.7874   | 0.8746 |
| 0.4164        | 3.05  | 400  | 0.6016          | 0.8017    | 0.8176 | 0.8079 | 0.8176   | 0.7846 | 0.7954  | 0.7843   | 0.8832 |
| 0.3505        | 3.21  | 420  | 0.6239          | 0.7912    | 0.8075 | 0.7949 | 0.8075   | 0.7846 | 0.7930  | 0.7786   | 0.8556 |
| 0.3834        | 3.36  | 440  | 0.6038          | 0.8022    | 0.8169 | 0.8082 | 0.8169   | 0.7907 | 0.7976  | 0.7757   | 0.8835 |
| 0.3139        | 3.51  | 460  | 0.6068          | 0.7978    | 0.8161 | 0.8052 | 0.8161   | 0.7970 | 0.7904  | 0.7846   | 0.8870 |
| 0.3679        | 3.66  | 480  | 0.6070          | 0.8026    | 0.8183 | 0.8063 | 0.8183   | 0.7907 | 0.7953  | 0.7799   | 0.8835 |
| 0.3387        | 3.82  | 500  | 0.6059          | 0.8025    | 0.8205 | 0.8094 | 0.8205   | 0.7879 | 0.7977  | 0.7937   | 0.8879 |
| 0.3208        | 3.97  | 520  | 0.6064          | 0.8015    | 0.8183 | 0.8082 | 0.8183   | 0.7970 | 0.7900  | 0.7782   | 0.8854 |
| 0.3008        | 4.12  | 540  | 0.6088          | 0.8020    | 0.8205 | 0.8107 | 0.8205   | 0.7970 | 0.7946  | 0.7813   | 0.8883 |
| 0.3014        | 4.27  | 560  | 0.6093          | 0.8032    | 0.8212 | 0.8114 | 0.8212   | 0.8120 | 0.7961  | 0.7813   | 0.8867 |
| 0.3486        | 4.43  | 580  | 0.6112          | 0.8042    | 0.8205 | 0.8107 | 0.8205   | 0.7939 | 0.7961  | 0.7829   | 0.8873 |
| 0.2793        | 4.58  | 600  | 0.6156          | 0.8047    | 0.8183 | 0.8088 | 0.8183   | 0.7846 | 0.7945  | 0.7769   | 0.8905 |
| 0.2943        | 4.73  | 620  | 0.6170          | 0.8044    | 0.8212 | 0.8107 | 0.8212   | 0.7846 | 0.7992  | 0.7843   | 0.8895 |
| 0.3314        | 4.89  | 640  | 0.6188          | 0.8040    | 0.8198 | 0.8097 | 0.8198   | 0.7939 | 0.7929  | 0.7769   | 0.8905 |
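
The exact metric computation is not documented. The sketch below shows one `compute_metrics` implementation that would produce these columns; weighted averaging is assumed because the Recall and Accuracy columns coincide, which is exactly what weighted-average recall gives in the multi-class case. The `LABELS` ordering is hypothetical and the real `id2label` mapping may differ.

```python
# One possible compute_metrics consistent with the columns above (assumed,
# not taken from the original training script).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_recall_fscore_support

LABELS = ["Who", "What", "Where", "How"]  # assumed label set and order

def compute_metrics(eval_pred):
    logits, refs = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        refs, preds, average="weighted", zero_division=0
    )
    per_class_f1 = f1_score(
        refs, preds, labels=list(range(len(LABELS))), average=None, zero_division=0
    )
    metrics = {
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "accuracy": accuracy_score(refs, preds),
    }
    metrics.update(
        {f"f1_{name.lower()}": score for name, score in zip(LABELS, per_class_f1)}
    )
    return metrics
```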


### Framework versions

- Transformers 4.26.0
- Pytorch 1.13.1+cu117
- Datasets 2.8.0
- Tokenizers 0.13.2