---
license: mit
language: en
tags:
- generated_from_trainer
model-index:
- name: verdict-classifier-en
  results:
  - task:
      type: text-classification
      name: Verdict Classification
widget:
- text: "Even though it might look true, it has been taken out of context."
---

# English Verdict Classifier
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on 2,500 deduplicated verdicts from the [Google Fact Check Tools API](https://developers.google.com/fact-check/tools/api/reference/rest/v1alpha1/claims/search), translated into English with the [Google Cloud Translation API](https://cloud.google.com/translate/docs/reference/rest/).
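
A verdict here is the free-text rating a fact checker attaches to a claim review (the `textualRating` field in the API response). As a minimal sketch of how such verdicts can be retrieved from the API's `claims:search` endpoint, where the query string and API key are placeholders rather than details from this card:

```python
import requests

# Hedged sketch: the query and API key below are placeholders.
resp = requests.get(
    "https://factchecktools.googleapis.com/v1alpha1/claims:search",
    params={"query": "climate", "languageCode": "en", "key": "YOUR_API_KEY"},
)
resp.raise_for_status()

# Each returned claim carries one or more claimReview entries; the
# verdict text corresponds to the reviewer's free-text rating.
for claim in resp.json().get("claims", []):
    for review in claim.get("claimReview", []):
        print(review.get("textualRating"))
```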
On the evaluation set, consisting of 1,000 such verdicts likewise translated into English, but this time including duplicates so as to represent the true distribution, the model achieves the following results:
- Loss: 0.1290
- F1 Macro: 0.9171
- F1 Misinformation: 0.9896
- F1 Factual: 0.9890
- F1 Other: 0.7727
- Precision Macro: 0.8940
- Precision Misinformation: 0.9954
- Precision Factual: 0.9783
- Precision Other: 0.7083
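
## Usage

A minimal inference sketch using the `transformers` pipeline API. The repo id `verdict-classifier-en` stands in for the model's full Hub path, which may include a namespace prefix, and the label shown in the comment assumes the model config maps the three classes reported above (`misinformation`, `factual`, `other`):

```python
from transformers import pipeline

# "verdict-classifier-en" is shorthand; replace with the full Hub repo id.
classifier = pipeline("text-classification", model="verdict-classifier-en")

result = classifier(
    "Even though it might look true, it has been taken out of context."
)
print(result)  # e.g. [{'label': 'misinformation', 'score': 0.98}]
```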

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the equivalent `TrainingArguments` sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2500
- num_epochs: 1000
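
Expressed as a `transformers` `TrainingArguments` object, these settings would look roughly as follows; the output directory is a placeholder, and the Trainer's own defaults cover anything not listed on the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="verdict-classifier-en",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=8,       # 4 * 8 = 32 effective batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2500,
    num_train_epochs=1000,
)
```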

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Precision Macro | Precision Misinformation | Precision Factual | Precision Other |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:----------:|:--------:|:----------:|:-------------------:|:------------:|:----------:|
| 1.1493        | 0.16  | 50   | 1.1040          | 0.0550   | 0.0               | 0.1650     | 0.0      | 0.0300     | 0.0                 | 0.0899       | 0.0        |
| 1.0899        | 0.32  | 100  | 1.0765          | 0.0619   | 0.0203            | 0.1654     | 0.0      | 0.2301     | 0.6                 | 0.0903       | 0.0        |
| 1.0136        | 0.48  | 150  | 1.0487          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.9868        | 0.64  | 200  | 1.0221          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.9599        | 0.8   | 250  | 0.9801          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.9554        | 0.96  | 300  | 0.9500          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.935         | 1.12  | 350  | 0.9071          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.948         | 1.28  | 400  | 0.8809          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.9344        | 1.44  | 450  | 0.8258          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.9182        | 1.6   | 500  | 0.7687          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.8942        | 1.76  | 550  | 0.5787          | 0.3102   | 0.9306            | 0.0        | 0.0      | 0.2900     | 0.8701              | 0.0          | 0.0        |
| 0.8932        | 1.92  | 600  | 0.4506          | 0.4043   | 0.9628            | 0.0        | 0.25     | 0.3777     | 0.9753              | 0.0          | 0.1579     |
| 0.7448        | 2.08  | 650  | 0.2884          | 0.5323   | 0.9650            | 0.3303     | 0.3017   | 0.7075     | 0.9810              | 0.9474       | 0.1942     |
| 0.6616        | 2.24  | 700  | 0.2162          | 0.8161   | 0.9710            | 0.9724     | 0.5051   | 0.7910     | 0.9824              | 0.9670       | 0.4237     |
| 0.575         | 2.4   | 750  | 0.1754          | 0.8305   | 0.9714            | 0.9780     | 0.5421   | 0.7961     | 0.9881              | 0.9674       | 0.4328     |
| 0.5246        | 2.56  | 800  | 0.1641          | 0.8102   | 0.9659            | 0.9175     | 0.5472   | 0.7614     | 0.9892              | 0.8558       | 0.4394     |
| 0.481         | 2.72  | 850  | 0.1399          | 0.8407   | 0.9756            | 0.9780     | 0.5686   | 0.8082     | 0.9894              | 0.9674       | 0.4677     |
| 0.4588        | 2.88  | 900  | 0.1212          | 0.8501   | 0.9786            | 0.9783     | 0.5934   | 0.8247     | 0.9871              | 0.9574       | 0.5294     |
| 0.4512        | 3.04  | 950  | 0.1388          | 0.8270   | 0.9702            | 0.9836     | 0.5273   | 0.7904     | 0.9893              | 0.9677       | 0.4143     |
| 0.3894        | 3.2   | 1000 | 0.1270          | 0.8411   | 0.9737            | 0.9836     | 0.5660   | 0.8043     | 0.9905              | 0.9677       | 0.4545     |
| 0.3772        | 3.36  | 1050 | 0.1267          | 0.8336   | 0.9732            | 0.9890     | 0.5385   | 0.8013     | 0.9882              | 0.9783       | 0.4375     |
| 0.3528        | 3.52  | 1100 | 0.1073          | 0.8546   | 0.9791            | 0.9890     | 0.5957   | 0.8284     | 0.9883              | 0.9783       | 0.5185     |
| 0.3694        | 3.68  | 1150 | 0.1120          | 0.8431   | 0.9786            | 0.9890     | 0.5618   | 0.8244     | 0.9849              | 0.9783       | 0.5102     |
| 0.3146        | 3.84  | 1200 | 0.1189          | 0.8325   | 0.9738            | 0.9836     | 0.54     | 0.8016     | 0.9870              | 0.9677       | 0.45       |
| 0.3038        | 4.01  | 1250 | 0.1041          | 0.8648   | 0.9815            | 0.9836     | 0.6292   | 0.8425     | 0.9884              | 0.9677       | 0.5714     |
| 0.2482        | 4.17  | 1300 | 0.1245          | 0.8588   | 0.9773            | 0.9836     | 0.6154   | 0.8202     | 0.9929              | 0.9677       | 0.5        |
| 0.2388        | 4.33  | 1350 | 0.1167          | 0.8701   | 0.9808            | 0.9836     | 0.6458   | 0.8377     | 0.9918              | 0.9677       | 0.5536     |
| 0.2593        | 4.49  | 1400 | 0.1215          | 0.8654   | 0.9790            | 0.9836     | 0.6337   | 0.8284     | 0.9929              | 0.9677       | 0.5246     |
| 0.239         | 4.65  | 1450 | 0.1057          | 0.8621   | 0.9803            | 0.9890     | 0.6170   | 0.8349     | 0.9895              | 0.9783       | 0.5370     |
| 0.2397        | 4.81  | 1500 | 0.1256          | 0.8544   | 0.9761            | 0.9890     | 0.5981   | 0.8162     | 0.9929              | 0.9783       | 0.4776     |
| 0.2238        | 4.97  | 1550 | 0.1189          | 0.8701   | 0.9802            | 0.9836     | 0.6465   | 0.8343     | 0.9929              | 0.9677       | 0.5424     |
| 0.1811        | 5.13  | 1600 | 0.1456          | 0.8438   | 0.9737            | 0.9836     | 0.5741   | 0.8051     | 0.9917              | 0.9677       | 0.4559     |
| 0.1615        | 5.29  | 1650 | 0.1076          | 0.8780   | 0.9838            | 0.9836     | 0.6667   | 0.8581     | 0.9895              | 0.9677       | 0.6170     |
| 0.1783        | 5.45  | 1700 | 0.1217          | 0.8869   | 0.9831            | 0.9836     | 0.6939   | 0.8497     | 0.9953              | 0.9677       | 0.5862     |
| 0.1615        | 5.61  | 1750 | 0.1305          | 0.8770   | 0.9808            | 0.9836     | 0.6667   | 0.8371     | 0.9953              | 0.9677       | 0.5484     |
| 0.155         | 5.77  | 1800 | 0.1218          | 0.8668   | 0.9821            | 0.9890     | 0.6292   | 0.8460     | 0.9884              | 0.9783       | 0.5714     |
| 0.167         | 5.93  | 1850 | 0.1091          | 0.8991   | 0.9873            | 0.9890     | 0.7209   | 0.8814     | 0.9919              | 0.9783       | 0.6739     |
| 0.1455        | 6.09  | 1900 | 0.1338          | 0.8535   | 0.9773            | 0.9890     | 0.5941   | 0.8202     | 0.9906              | 0.9783       | 0.4918     |
| 0.1301        | 6.25  | 1950 | 0.1321          | 0.8792   | 0.9820            | 0.9890     | 0.6667   | 0.8439     | 0.9941              | 0.9783       | 0.5593     |
| 0.1049        | 6.41  | 2000 | 0.1181          | 0.9031   | 0.9879            | 0.9834     | 0.7381   | 0.8911     | 0.9908              | 0.9780       | 0.7045     |
| 0.1403        | 6.57  | 2050 | 0.1432          | 0.8608   | 0.9779            | 0.9890     | 0.6154   | 0.8237     | 0.9929              | 0.9783       | 0.5        |
| 0.1178        | 6.73  | 2100 | 0.1443          | 0.8937   | 0.9844            | 0.9945     | 0.7021   | 0.8644     | 0.9930              | 0.9890       | 0.6111     |
| 0.1267        | 6.89  | 2150 | 0.1346          | 0.8494   | 0.9786            | 0.9890     | 0.5806   | 0.8249     | 0.9871              | 0.9783       | 0.5094     |
| 0.1043        | 7.05  | 2200 | 0.1494          | 0.8905   | 0.9832            | 0.9945     | 0.6939   | 0.8564     | 0.9941              | 0.9890       | 0.5862     |
| 0.0886        | 7.21  | 2250 | 0.1180          | 0.8946   | 0.9873            | 0.9890     | 0.7073   | 0.8861     | 0.9896              | 0.9783       | 0.6905     |
| 0.1183        | 7.37  | 2300 | 0.1777          | 0.8720   | 0.9790            | 0.9890     | 0.6481   | 0.8298     | 0.9964              | 0.9783       | 0.5147     |
| 0.0813        | 7.53  | 2350 | 0.1405          | 0.8912   | 0.9856            | 0.9836     | 0.7045   | 0.8685     | 0.9919              | 0.9677       | 0.6458     |
| 0.111         | 7.69  | 2400 | 0.1379          | 0.8874   | 0.9838            | 0.9836     | 0.6947   | 0.8540     | 0.9941              | 0.9677       | 0.6        |
| 0.1199        | 7.85  | 2450 | 0.1301          | 0.9080   | 0.9879            | 0.9890     | 0.7473   | 0.8801     | 0.9953              | 0.9783       | 0.6667     |
| 0.1054        | 8.01  | 2500 | 0.1478          | 0.8845   | 0.9838            | 0.9890     | 0.6809   | 0.8546     | 0.9930              | 0.9783       | 0.5926     |
| 0.105         | 8.17  | 2550 | 0.1333          | 0.9021   | 0.9879            | 0.9890     | 0.7294   | 0.8863     | 0.9919              | 0.9783       | 0.6889     |
| 0.09          | 8.33  | 2600 | 0.1555          | 0.8926   | 0.9855            | 0.9890     | 0.7033   | 0.8662     | 0.9930              | 0.9783       | 0.6275     |
| 0.0947        | 8.49  | 2650 | 0.1572          | 0.8831   | 0.9856            | 0.9890     | 0.6747   | 0.8726     | 0.9885              | 0.9783       | 0.6512     |
| 0.0784        | 8.65  | 2700 | 0.1477          | 0.8969   | 0.9873            | 0.9890     | 0.7143   | 0.8836     | 0.9908              | 0.9783       | 0.6818     |
| 0.0814        | 8.81  | 2750 | 0.1700          | 0.8932   | 0.9861            | 0.9890     | 0.7045   | 0.8720     | 0.9919              | 0.9783       | 0.6458     |
| 0.0962        | 8.97  | 2800 | 0.1290          | 0.9171   | 0.9896            | 0.9890     | 0.7727   | 0.8940     | 0.9954              | 0.9783       | 0.7083     |
| 0.0802        | 9.13  | 2850 | 0.1721          | 0.8796   | 0.9832            | 0.9890     | 0.6667   | 0.8517     | 0.9918              | 0.9783       | 0.5849     |
| 0.0844        | 9.29  | 2900 | 0.1516          | 0.9023   | 0.9867            | 0.9890     | 0.7312   | 0.8717     | 0.9953              | 0.9783       | 0.6415     |
| 0.0511        | 9.45  | 2950 | 0.1544          | 0.9062   | 0.9879            | 0.9890     | 0.7416   | 0.8820     | 0.9942              | 0.9783       | 0.6735     |
| 0.0751        | 9.61  | 3000 | 0.1748          | 0.8884   | 0.9832            | 0.9945     | 0.6875   | 0.8571     | 0.9930              | 0.9890       | 0.5893     |
| 0.0707        | 9.77  | 3050 | 0.1743          | 0.8721   | 0.9802            | 0.9890     | 0.6471   | 0.8349     | 0.9941              | 0.9783       | 0.5323     |
| 0.0951        | 9.93  | 3100 | 0.1660          | 0.8899   | 0.9850            | 0.9890     | 0.6957   | 0.8622     | 0.9930              | 0.9783       | 0.6154     |
| 0.0576        | 10.1  | 3150 | 0.2029          | 0.8613   | 0.9766            | 0.9890     | 0.6182   | 0.8197     | 0.9952              | 0.9783       | 0.4857     |
| 0.0727        | 10.26 | 3200 | 0.1709          | 0.8920   | 0.9849            | 0.9890     | 0.7021   | 0.8612     | 0.9942              | 0.9783       | 0.6111     |
| 0.0654        | 10.42 | 3250 | 0.1599          | 0.8999   | 0.9861            | 0.9945     | 0.7191   | 0.8780     | 0.9919              | 0.9890       | 0.6531     |
| 0.0553        | 10.58 | 3300 | 0.2091          | 0.8920   | 0.9849            | 0.9890     | 0.7021   | 0.8612     | 0.9942              | 0.9783       | 0.6111     |
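
The macro and per-class scores reported in this table can be computed with a `compute_metrics` callback along these lines. This is a sketch, not the original training script; in particular, the label-to-index mapping is an assumption:

```python
import numpy as np
from sklearn.metrics import f1_score, precision_score

LABELS = ["misinformation", "factual", "other"]  # assumed index order

def compute_metrics(eval_pred):
    logits, y_true = eval_pred
    y_pred = np.argmax(logits, axis=-1)
    ids = list(range(len(LABELS)))
    # Per-class scores; the macro average is their unweighted mean.
    f1s = f1_score(y_true, y_pred, labels=ids, average=None, zero_division=0)
    precs = precision_score(y_true, y_pred, labels=ids, average=None, zero_division=0)
    metrics = {"f1_macro": f1s.mean(), "precision_macro": precs.mean()}
    for i, name in enumerate(LABELS):
        metrics[f"f1_{name}"] = f1s[i]
        metrics[f"precision_{name}"] = precs[i]
    return metrics
```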


### Framework versions

- Transformers 4.11.3
- Pytorch 1.9.0+cu102
- Datasets 1.9.0
- Tokenizers 0.10.2