---
license: apache-2.0
base_model: distilbert-base-cased
tags:
  - generated_from_trainer
  - news_classification
  - multi_label
datasets:
  - reuters21578
metrics:
  - f1
  - accuracy
model-index:
  - name: distilbert-finetuned-reuters21578-multilabel
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: reuters21578
          type: reuters21578
          config: ModApte
          split: test
          args: ModApte
        metrics:
          - name: F1
            type: f1
            value: 0.8628858578607322
          - name: Accuracy
            type: accuracy
            value: 0.8195625759416768
language:
  - en
pipeline_tag: text-classification
widget:
  - text: "JAPAN TO REVISE LONG-TERM ENERGY DEMAND DOWNWARDS The Ministry of International Trade and Industry (MITI) will revise its long-term energy supply/demand outlook by August to meet a forecast downtrend in Japanese energy demand, ministry officials said.     MITI is expected to lower the projection for primary energy supplies in the year 2000 to 550 mln kilolitres (kl) from 600 mln, they said.     The decision follows the emergence of structural changes in Japanese industry following the rise in the value of the yen and a decline in domestic electric power demand.     MITI is planning to work out a revised energy supply/demand outlook through deliberations of committee meetings of the Agency of Natural Resources and Energy, the officials said.     They said MITI will also review the breakdown of energy supply sources, including oil, nuclear, coal and natural gas.     Nuclear energy provided the bulk of Japan's electric power in the fiscal year ended March 31, supplying an estimated 27 pct on a kilowatt/hour basis, followed by oil (23 pct) and liquefied natural gas (21 pct), they noted.  REUTER"
    example_title: "Example-1"
---

## Origin of this model

This model was forked from https://huggingface.co/lxyuan/distilbert-finetuned-reuters21578-multilabel -- I just generated the ONNX versions in /onnx.
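
To run the ONNX weights directly, something like the following should work with Optimum's onnxruntime integration (a minimal sketch: the repository id and `subfolder` are assumptions about this fork's file layout, not a verified recipe):

```python
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

# Placeholder: use this fork's repo id, which hosts the exported files under /onnx
repo_id = "lxyuan/distilbert-finetuned-reuters21578-multilabel"

# `subfolder="onnx"` assumes the export lives in the /onnx directory of the repo
model = ORTModelForSequenceClassification.from_pretrained(repo_id, subfolder="onnx")
tokenizer = AutoTokenizer.from_pretrained(repo_id)

pipe = pipeline("text-classification", model=model, tokenizer=tokenizer)
```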

## Motivation

Fine-tuning on the Reuters-21578 multilabel dataset is a valuable exercise, especially as it's frequently used in take-home tests during interviews. The dataset's complexity is just right for testing multilabel classification skills within a limited timeframe, while its real-world relevance helps simulate practical challenges. Experimenting with this dataset not only helps candidates prepare for interviews but also hones various skills including preprocessing, feature extraction, and model evaluation.

This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the reuters21578 dataset.

## Inference Example

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="lxyuan/distilbert-finetuned-reuters21578-multilabel", return_all_scores=True)

# dataset["test"]["text"][2]
news_article = (
    "JAPAN TO REVISE LONG-TERM ENERGY DEMAND DOWNWARDS The Ministry of International Trade and "
    "Industry (MITI) will revise its long-term energy supply/demand "
    "outlook by August to meet a forecast downtrend in Japanese "
    "energy demand, ministry officials said. "
    "MITI is expected to lower the projection for primary energy "
    "supplies in the year 2000 to 550 mln kilolitres (kl) from 600 "
    "mln, they said. "
    "The decision follows the emergence of structural changes in "
    "Japanese industry following the rise in the value of the yen "
    "and a decline in domestic electric power demand. "
    "MITI is planning to work out a revised energy supply/demand "
    "outlook through deliberations of committee meetings of the "
    "Agency of Natural Resources and Energy, the officials said. "
    "They said MITI will also review the breakdown of energy "
    "supply sources, including oil, nuclear, coal and natural gas. "
    "Nuclear energy provided the bulk of Japan's electric power "
    "in the fiscal year ended March 31, supplying an estimated 27 "
    "pct on a kilowatt/hour basis, followed by oil (23 pct) and "
    "liquefied natural gas (21 pct), they noted. "
    "REUTER"
)

# dataset["test"]["topics"][2]
target_topics = ['crude', 'nat-gas']

fn_kwargs = {"padding": "max_length", "truncation": True, "max_length": 512}
output = pipe(news_article, function_to_apply="sigmoid", **fn_kwargs)

for item in output[0]:
    if item["score"] >= 0.5:
        print(item["label"], item["score"])

>>> crude 0.7355073690414429
nat-gas 0.8600426316261292

```

## Overall Summary and Comparison Table

| Metric              | Baseline (Scikit-learn) | Transformer Model |
| ------------------- | ----------------------- | ----------------- |
| Micro-Averaged F1   | 0.77                    | 0.86              |
| Macro-Averaged F1   | 0.29                    | 0.33              |
| Weighted Average F1 | 0.70                    | 0.84              |
| Samples Average F1  | 0.75                    | 0.80              |

**Precision vs Recall**: Both models prioritize high precision over recall. In our client-facing news classification model, precision takes precedence over recall. This is because the repercussions of false positives are more severe and harder to justify to clients compared to false negatives. When the model incorrectly tags a news item with a topic, it's challenging to explain this error. On the other hand, if the model misses a topic, it's easier to defend by stating that the topic wasn't sufficiently emphasized in the news article.
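
Since precision is the business priority, one simple lever is the decision threshold: raising the sigmoid cutoff above the default 0.5 trades recall for additional precision. A minimal sketch, reusing the `output` from the inference example above (the 0.7 value is illustrative, not tuned):

```python
# Raising the cutoff above 0.5 favours precision at the expense of recall.
PRECISION_THRESHOLD = 0.7  # illustrative; tune on a validation set

predicted_topics = [
    item["label"] for item in output[0] if item["score"] >= PRECISION_THRESHOLD
]
print(predicted_topics)
```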

**Class Imbalance Handling**: Both models suffer from the same general issue of not performing well on minority classes, as reflected in the low macro-averaged F1-scores. However, the transformer model shows a slight improvement, albeit marginal, in macro-averaged F1-score (0.33 vs 0.29).

**Issue of Zero Support Labels**: Both models have the problem of zero support for several labels, meaning these labels did not appear in the test set. This lack of "support" can significantly skew the performance metrics and may suggest that either the models are not well-tuned to predict these minority classes, or the dataset itself lacks sufficient examples of these classes. Given that both models struggle with low macro-averaged F1 scores, this issue further emphasizes the need for improved minority class handling in the models.
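
One way to inspect performance without the distortion from zero-support rows is to restrict the classification report to labels that actually occur in the test set (a sketch: `y_test`, `y_pred`, and `label_names` are placeholders for the binarized test labels, binarized predictions, and label list):

```python
import numpy as np
from sklearn.metrics import classification_report

# Placeholders: y_test and y_pred are binary indicator arrays of shape
# (n_samples, n_labels); label_names lists the topics in column order.
present = np.flatnonzero(y_test.sum(axis=0) > 0)

print(classification_report(
    y_test,
    y_pred,
    labels=present,
    target_names=[label_names[i] for i in present],
    zero_division=0,
))
```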

**General Performance**: The transformer model surpasses the scikit-learn baseline in terms of weighted and samples average F1-scores, indicating better overall performance and better handling of label imbalance.

**Conclusion**: While both models exhibit high precision, which is a business requirement, the transformer model slightly outperforms the scikit-learn baseline model in all metrics considered. It provides a better trade-off between precision and recall, as well as some improvement, albeit small, in handling minority classes. Thus, despite sharing similar weaknesses with the baseline, the transformer model demonstrates incremental improvements that could be significant in a production setting.

## Training and evaluation data

We remove single-appearance labels from both the training and test sets using the following code:

```python
from collections import Counter
from itertools import chain

from datasets import load_dataset

# Find Single Appearance Labels
def find_single_appearance_labels(y):
    """Find labels that appear only once in the dataset."""
    all_labels = list(chain.from_iterable(y))
    label_count = Counter(all_labels)
    single_appearance_labels = [label for label, count in label_count.items() if count == 1]
    return single_appearance_labels

# Remove Single Appearance Labels from Dataset
def remove_single_appearance_labels(dataset, single_appearance_labels):
    """Remove samples with single-appearance labels from both train and test sets."""
    for split in ['train', 'test']:
        dataset[split] = dataset[split].filter(lambda x: all(label not in single_appearance_labels for label in x['topics']))
    return dataset

dataset = load_dataset("reuters21578", "ModApte")

# Find and Remove Single Appearance Labels
y_train = [item['topics'] for item in dataset['train']]
single_appearance_labels = find_single_appearance_labels(y_train)
print(f"Single appearance labels: {single_appearance_labels}")
>>> Single appearance labels: ['lin-oil', 'rye', 'red-bean', 'groundnut-oil', 'citruspulp', 'rape-meal', 'corn-oil', 'peseta', 'cotton-oil', 'ringgit', 'castorseed', 'castor-oil', 'lit', 'rupiah', 'skr', 'nkr', 'dkr', 'sun-meal', 'lin-meal', 'cruzado']

print("Removing samples with single-appearance labels...")
dataset = remove_single_appearance_labels(dataset, single_appearance_labels)

unique_labels = set(chain.from_iterable(dataset['train']["topics"]))
print(f"We have {len(unique_labels)} unique labels:\n{unique_labels}")
>>> We have 95 unique labels:
{'veg-oil', 'gold', 'platinum', 'ipi', 'acq', 'carcass', 'wool', 'coconut-oil', 'linseed', 'copper', 'soy-meal', 'jet', 'dlr', 'copra-cake', 'hog', 'rand', 'strategic-metal', 'can', 'tea', 'sorghum', 'livestock', 'barley', 'lumber', 'earn', 'wheat', 'trade', 'soy-oil', 'cocoa', 'inventories', 'income', 'rubber', 'tin', 'iron-steel', 'ship', 'rapeseed', 'wpi', 'sun-oil', 'pet-chem', 'palmkernel', 'nat-gas', 'gnp', 'l-cattle', 'propane', 'rice', 'lead', 'alum', 'instal-debt', 'saudriyal', 'cpu', 'jobs', 'meal-feed', 'oilseed', 'dmk', 'plywood', 'zinc', 'retail', 'dfl', 'cpi', 'crude', 'pork-belly', 'gas', 'money-fx', 'corn', 'tapioca', 'palladium', 'lei', 'cornglutenfeed', 'sunseed', 'potato', 'silver', 'sugar', 'grain', 'groundnut', 'naphtha', 'orange', 'soybean', 'coconut', 'stg', 'cotton', 'yen', 'rape-oil', 'palm-oil', 'oat', 'reserves', 'housing', 'interest', 'coffee', 'fuel', 'austdlr', 'money-supply', 'heat', 'fishmeal', 'bop', 'nickel', 'nzdlr'}
```

## Training procedure

[EDA on Reuters-21578 dataset](https://github.com/LxYuan0420/nlp/blob/main/notebooks/eda_reuters.ipynb):
This notebook provides an Exploratory Data Analysis (EDA) of the Reuters-21578 dataset. It includes visualizations and statistical summaries that offer insights into the dataset's structure, label distribution, and text characteristics.

[Reuters Baseline Scikit-Learn Model](https://github.com/LxYuan0420/nlp/blob/main/notebooks/scikit_learn_reuters.ipynb):
This notebook establishes a baseline model for text classification on the Reuters-21578 dataset using scikit-learn. It guides you through data preprocessing, feature extraction, model training, and evaluation.

[Reuters Transformer Model](https://github.com/LxYuan0420/nlp/blob/main/notebooks/transformer_reuters.ipynb):
This notebook delves into advanced text classification using a Transformer model on the Reuters-21578 dataset. It covers the implementation details, training process, and performance of Transformer-based models on this specific task.

[Multilabel Stratified Sampling & Hyperparameter Search on Reuters Dataset](https://github.com/LxYuan0420/nlp/blob/main/notebooks/transformer_reuters_hyperparameter_tuning.ipynb):
In this notebook, we explore advanced machine learning techniques through the lens of the Hugging Face Trainer API, specifically targeting Multilabel Iterative Stratified Splitting and Hyperparameter Search. The former aims to fairly distribute imbalanced datasets across multiple labels in k-fold cross-validation, maintaining a distribution closely resembling that of the complete dataset. The latter walks users through a structured hyperparameter search to fine-tune model performance for optimal results.
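
For reference, multilabel iterative stratification is available off the shelf in the `iterative-stratification` package; a minimal sketch of producing such a split (`X` and `Y` are placeholders for the features and the binary label matrix):

```python
from iterstrat.ml_stratifiers import MultilabelStratifiedKFold

# Placeholders: X holds the texts/features, Y is a (n_samples, n_labels)
# binary indicator matrix.
mskf = MultilabelStratifiedKFold(n_splits=5, shuffle=True, random_state=42)

for fold, (train_idx, val_idx) in enumerate(mskf.split(X, Y)):
    # Each fold keeps per-label positive rates close to the full dataset's.
    print(f"fold {fold}: {len(train_idx)} train / {len(val_idx)} val")
```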

## Evaluation results

<details>
<summary>Transformer Model Evaluation Result</summary>

Classification Report:
                 precision    recall  f1-score   support

            acq       0.97      0.93      0.95       719
           alum       1.00      0.70      0.82        23
        austdlr       0.00      0.00      0.00         0
         barley       1.00      0.50      0.67        12
            bop       0.79      0.50      0.61        30
            can       0.00      0.00      0.00         0
        carcass       0.67      0.67      0.67        18
          cocoa       1.00      1.00      1.00        18
        coconut       0.00      0.00      0.00         2
    coconut-oil       0.00      0.00      0.00         2
         coffee       0.86      0.89      0.87        27
         copper       1.00      0.78      0.88        18
     copra-cake       0.00      0.00      0.00         1
           corn       0.84      0.87      0.86        55
 cornglutenfeed       0.00      0.00      0.00         0
         cotton       0.92      0.67      0.77        18
            cpi       0.86      0.43      0.57        28
            cpu       0.00      0.00      0.00         1
          crude       0.87      0.93      0.90       189
            dfl       0.00      0.00      0.00         1
            dlr       0.72      0.64      0.67        44
            dmk       0.00      0.00      0.00         4
           earn       0.98      0.99      0.98      1087
       fishmeal       0.00      0.00      0.00         0
           fuel       0.00      0.00      0.00        10
            gas       0.80      0.71      0.75        17
            gnp       0.79      0.66      0.72        35
           gold       0.95      0.67      0.78        30
          grain       0.94      0.92      0.93       146
      groundnut       0.00      0.00      0.00         4
           heat       0.00      0.00      0.00         5
            hog       1.00      0.33      0.50         6
        housing       0.00      0.00      0.00         4
         income       0.00      0.00      0.00         7
    instal-debt       0.00      0.00      0.00         1
       interest       0.89      0.67      0.77       131
    inventories       0.00      0.00      0.00         0
            ipi       1.00      0.58      0.74        12
     iron-steel       0.90      0.64      0.75        14
            jet       0.00      0.00      0.00         1
           jobs       0.92      0.57      0.71        21
       l-cattle       0.00      0.00      0.00         2
           lead       0.00      0.00      0.00        14
            lei       0.00      0.00      0.00         3
        linseed       0.00      0.00      0.00         0
      livestock       0.63      0.79      0.70        24
         lumber       0.00      0.00      0.00         6
      meal-feed       0.00      0.00      0.00        17
       money-fx       0.78      0.81      0.80       177
   money-supply       0.80      0.71      0.75        34
        naphtha       0.00      0.00      0.00         4
        nat-gas       0.82      0.60      0.69        30
         nickel       0.00      0.00      0.00         1
          nzdlr       0.00      0.00      0.00         2
            oat       0.00      0.00      0.00         4
        oilseed       0.64      0.61      0.63        44
         orange       1.00      0.36      0.53        11
      palladium       0.00      0.00      0.00         1
       palm-oil       1.00      0.56      0.71         9
     palmkernel       0.00      0.00      0.00         1
       pet-chem       0.00      0.00      0.00        12
       platinum       0.00      0.00      0.00         7
        plywood       0.00      0.00      0.00         0
     pork-belly       0.00      0.00      0.00         0
         potato       0.00      0.00      0.00         3
        propane       0.00      0.00      0.00         3
           rand       0.00      0.00      0.00         1
       rape-oil       0.00      0.00      0.00         1
       rapeseed       0.00      0.00      0.00         8
       reserves       0.83      0.56      0.67        18
         retail       0.00      0.00      0.00         2
           rice       1.00      0.57      0.72        23
         rubber       0.82      0.75      0.78        12
      saudriyal       0.00      0.00      0.00         0
           ship       0.95      0.81      0.87        89
         silver       1.00      0.12      0.22         8
        sorghum       1.00      0.12      0.22         8
       soy-meal       0.00      0.00      0.00        12
        soy-oil       0.00      0.00      0.00         8
        soybean       0.72      0.56      0.63        32
            stg       0.00      0.00      0.00         0
strategic-metal       0.00      0.00      0.00        11
          sugar       1.00      0.80      0.89        35
        sun-oil       0.00      0.00      0.00         0
        sunseed       0.00      0.00      0.00         5
        tapioca       0.00      0.00      0.00         0
            tea       0.00      0.00      0.00         3
            tin       1.00      0.42      0.59        12
          trade       0.78      0.79      0.79       116
        veg-oil       0.91      0.59      0.71        34
          wheat       0.83      0.83      0.83        69
           wool       0.00      0.00      0.00         0
            wpi       0.00      0.00      0.00        10
            yen       0.57      0.29      0.38        14
           zinc       1.00      0.69      0.82        13

      micro avg       0.92      0.81      0.86      3694
      macro avg       0.41      0.30      0.33      3694

   weighted avg       0.87      0.81      0.84      3694
    samples avg       0.81      0.80      0.80      3694

</details>

<details>
<summary>Scikit-learn Baseline Model Evaluation Result</summary>

Classification Report:
                 precision    recall  f1-score   support

            acq       0.98      0.87      0.92       719
           alum       1.00      0.00      0.00        23
        austdlr       1.00      1.00      1.00         0
         barley       1.00      0.00      0.00        12
            bop       1.00      0.30      0.46        30
            can       1.00      1.00      1.00         0
        carcass       1.00      0.06      0.11        18
          cocoa       1.00      0.61      0.76        18
        coconut       1.00      0.00      0.00         2
    coconut-oil       1.00      0.00      0.00         2
         coffee       0.94      0.59      0.73        27
         copper       1.00      0.22      0.36        18
     copra-cake       1.00      0.00      0.00         1
           corn       0.97      0.51      0.67        55
 cornglutenfeed       1.00      1.00      1.00         0
         cotton       1.00      0.06      0.11        18
            cpi       1.00      0.14      0.25        28
            cpu       1.00      0.00      0.00         1
          crude       0.94      0.69      0.80       189
            dfl       1.00      0.00      0.00         1
            dlr       0.86      0.43      0.58        44
            dmk       1.00      0.00      0.00         4
           earn       0.99      0.97      0.98      1087
       fishmeal       1.00      1.00      1.00         0
           fuel       1.00      0.00      0.00        10
            gas       1.00      0.00      0.00        17
            gnp       1.00      0.31      0.48        35
           gold       0.83      0.17      0.28        30
          grain       1.00      0.65      0.79       146
      groundnut       1.00      0.00      0.00         4
           heat       1.00      0.00      0.00         5
            hog       1.00      0.00      0.00         6
        housing       1.00      0.00      0.00         4
         income       1.00      0.00      0.00         7
    instal-debt       1.00      0.00      0.00         1
       interest       0.88      0.40      0.55       131
    inventories       1.00      1.00      1.00         0
            ipi       1.00      0.00      0.00        12
     iron-steel       1.00      0.00      0.00        14
            jet       1.00      0.00      0.00         1
           jobs       1.00      0.14      0.25        21
       l-cattle       1.00      0.00      0.00         2
           lead       1.00      0.00      0.00        14
            lei       1.00      0.00      0.00         3
        linseed       1.00      1.00      1.00         0
      livestock       0.67      0.08      0.15        24
         lumber       1.00      0.00      0.00         6
      meal-feed       1.00      0.00      0.00        17
       money-fx       0.80      0.50      0.62       177
   money-supply       0.88      0.41      0.56        34
        naphtha       1.00      0.00      0.00         4
        nat-gas       1.00      0.27      0.42        30
         nickel       1.00      0.00      0.00         1
          nzdlr       1.00      0.00      0.00         2
            oat       1.00      0.00      0.00         4
        oilseed       0.62      0.11      0.19        44
         orange       1.00      0.00      0.00        11
      palladium       1.00      0.00      0.00         1
       palm-oil       1.00      0.22      0.36         9
     palmkernel       1.00      0.00      0.00         1
       pet-chem       1.00      0.00      0.00        12
       platinum       1.00      0.00      0.00         7
        plywood       1.00      1.00      1.00         0
     pork-belly       1.00      1.00      1.00         0
         potato       1.00      0.00      0.00         3
        propane       1.00      0.00      0.00         3
           rand       1.00      0.00      0.00         1
       rape-oil       1.00      0.00      0.00         1
       rapeseed       1.00      0.00      0.00         8
       reserves       1.00      0.00      0.00        18
         retail       1.00      0.00      0.00         2
           rice       1.00      0.00      0.00        23
         rubber       1.00      0.17      0.29        12
      saudriyal       1.00      1.00      1.00         0
           ship       0.92      0.26      0.40        89
         silver       1.00      0.00      0.00         8
        sorghum       1.00      0.00      0.00         8
       soy-meal       1.00      0.00      0.00        12
        soy-oil       1.00      0.00      0.00         8
        soybean       1.00      0.16      0.27        32
            stg       1.00      1.00      1.00         0
strategic-metal       1.00      0.00      0.00        11
          sugar       1.00      0.60      0.75        35
        sun-oil       1.00      1.00      1.00         0
        sunseed       1.00      0.00      0.00         5
        tapioca       1.00      1.00      1.00         0
            tea       1.00      0.00      0.00         3
            tin       1.00      0.00      0.00        12
          trade       0.92      0.61      0.74       116
        veg-oil       1.00      0.12      0.21        34
          wheat       0.97      0.55      0.70        69
           wool       1.00      1.00      1.00         0
            wpi       1.00      0.00      0.00        10
            yen       1.00      0.00      0.00        14
           zinc       1.00      0.00      0.00        13

      micro avg       0.97      0.64      0.77      3694
      macro avg       0.98      0.25      0.29      3694

   weighted avg       0.96      0.64      0.70      3694
    samples avg       0.98      0.74      0.75      3694

</details>

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
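
Expressed as `TrainingArguments`, these settings look roughly as follows (a sketch: `output_dir` and the per-epoch evaluation cadence are assumptions inferred from the results table below, and the Adam betas/epsilon listed above are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-finetuned-reuters21578-multilabel",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: matches per-epoch rows below
)
```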

### Training results

| Training Loss | Epoch | Step | Validation Loss |   F1   | Roc Auc | Accuracy |
| :-----------: | :---: | :--: | :-------------: | :----: | :-----: | :------: |
|    0.1801     |  1.0  | 300  |     0.0439      | 0.3896 | 0.6210  |  0.3566  |
|    0.0345     |  2.0  | 600  |     0.0287      | 0.6289 | 0.7318  |  0.5954  |
|    0.0243     |  3.0  | 900  |     0.0219      | 0.6721 | 0.7579  |  0.6084  |
|    0.0178     |  4.0  | 1200 |     0.0177      | 0.7505 | 0.8128  |  0.6908  |
|     0.014     |  5.0  | 1500 |     0.0151      | 0.7905 | 0.8376  |  0.7278  |
|    0.0115     |  6.0  | 1800 |     0.0135      | 0.8132 | 0.8589  |  0.7555  |
|    0.0096     |  7.0  | 2100 |     0.0124      | 0.8291 | 0.8727  |  0.7725  |
|    0.0082     |  8.0  | 2400 |     0.0124      | 0.8335 | 0.8757  |  0.7822  |
|    0.0071     |  9.0  | 2700 |     0.0119      | 0.8392 | 0.8847  |  0.7883  |
|    0.0064     | 10.0  | 3000 |     0.0123      | 0.8339 | 0.8810  |  0.7828  |
|    0.0058     | 11.0  | 3300 |     0.0114      | 0.8538 | 0.8999  |  0.8047  |
|    0.0053     | 12.0  | 3600 |     0.0113      | 0.8525 | 0.8967  |  0.8044  |
|    0.0048     | 13.0  | 3900 |     0.0115      | 0.8520 | 0.8982  |  0.8029  |
|    0.0045     | 14.0  | 4200 |     0.0111      | 0.8566 | 0.8962  |  0.8104  |
|    0.0042     | 15.0  | 4500 |     0.0110      | 0.8610 | 0.9060  |  0.8165  |
|    0.0039     | 16.0  | 4800 |     0.0112      | 0.8583 | 0.9021  |  0.8138  |
|    0.0037     | 17.0  | 5100 |     0.0110      | 0.8620 | 0.9055  |  0.8196  |
|    0.0035     | 18.0  | 5400 |     0.0110      | 0.8629 | 0.9063  |  0.8196  |
|    0.0035     | 19.0  | 5700 |     0.0111      | 0.8624 | 0.9062  |  0.8180  |
|    0.0034     | 20.0  | 6000 |     0.0111      | 0.8626 | 0.9055  |  0.8177  |

### Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.14.3
- Tokenizers 0.13.3