---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: Yepes_0.0001_0404_ES6
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Yepes_0.0001_0404_ES6

This model is a fine-tuned version of [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1207
- Precision: 0.4902
- Recall: 0.3743
- F1: 0.4244
- Accuracy: 0.9769
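
The entity-level precision/recall/F1 above suggest this checkpoint is a token-classification (NER) head on top of PubMedBERT. A minimal loading sketch under that assumption; the checkpoint id below is a placeholder for the actual Hub path or a local directory:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder: replace with the full Hub id (e.g. <user>/Yepes_0.0001_0404_ES6) or a local path.
checkpoint = "Yepes_0.0001_0404_ES6"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# Group sub-word predictions into whole-entity spans.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("Mutations in BRCA1 are associated with an increased risk of breast cancer."))
```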

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2000
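
A sketch of how these values map onto a `TrainingArguments` object. The hyperparameters are copied from the list above; the output directory and the 25-step evaluation interval are inferred from this card, and the rest of the training script is not reproduced here:

```python
from transformers import TrainingArguments

# Sketch only -- mirrors the hyperparameters listed above; output_dir is a placeholder.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers optimizer defaults.
training_args = TrainingArguments(
    output_dir="Yepes_0.0001_0404_ES6",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2000,
    evaluation_strategy="steps",
    eval_steps=25,  # the results table below reports metrics every 25 steps
)
```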

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.4338        | 0.43  | 25   | 0.1979          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.2051        | 0.86  | 50   | 0.1923          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1601        | 1.29  | 75   | 0.1618          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1742        | 1.72  | 100  | 0.1400          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1506        | 2.16  | 125  | 0.1462          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1507        | 2.59  | 150  | 0.1516          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1566        | 3.02  | 175  | 0.1382          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1467        | 3.45  | 200  | 0.1360          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1492        | 3.88  | 225  | 0.1400          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1543        | 4.31  | 250  | 0.1364          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1435        | 4.74  | 275  | 0.1384          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1369        | 5.17  | 300  | 0.1282          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1284        | 5.6   | 325  | 0.1337          | 0.2381    | 0.1198 | 0.1594 | 0.9704   |
| 0.1235        | 6.03  | 350  | 0.1215          | 0.0       | 0.0    | 0.0    | 0.9705   |
| 0.1165        | 6.47  | 375  | 0.1337          | 0.3613    | 0.1677 | 0.2290 | 0.9739   |
| 0.1184        | 6.9   | 400  | 0.1228          | 0.2303    | 0.1228 | 0.1602 | 0.9718   |
| 0.1076        | 7.33  | 425  | 0.1174          | 0.2646    | 0.3263 | 0.2922 | 0.9671   |
| 0.0964        | 7.76  | 450  | 0.1094          | 0.3972    | 0.2545 | 0.3102 | 0.9751   |
| 0.0902        | 8.19  | 475  | 0.1217          | 0.4264    | 0.2515 | 0.3164 | 0.9742   |
| 0.0891        | 8.62  | 500  | 0.1075          | 0.3746    | 0.3263 | 0.3488 | 0.9736   |
| 0.0813        | 9.05  | 525  | 0.1295          | 0.4354    | 0.2725 | 0.3352 | 0.9738   |
| 0.078         | 9.48  | 550  | 0.1067          | 0.375     | 0.3413 | 0.3574 | 0.9742   |
| 0.0751        | 9.91  | 575  | 0.1042          | 0.4905    | 0.3084 | 0.3787 | 0.9765   |
| 0.0683        | 10.34 | 600  | 0.1028          | 0.4672    | 0.3413 | 0.3945 | 0.9761   |
| 0.0687        | 10.78 | 625  | 0.1070          | 0.4975    | 0.2994 | 0.3738 | 0.9762   |
| 0.0664        | 11.21 | 650  | 0.1225          | 0.3256    | 0.3383 | 0.3319 | 0.9703   |
| 0.0565        | 11.64 | 675  | 0.1000          | 0.4487    | 0.3144 | 0.3697 | 0.9767   |
| 0.0555        | 12.07 | 700  | 0.1033          | 0.4463    | 0.3234 | 0.375  | 0.9757   |
| 0.045         | 12.5  | 725  | 0.1150          | 0.4237    | 0.3323 | 0.3725 | 0.9746   |
| 0.0514        | 12.93 | 750  | 0.1126          | 0.6       | 0.3503 | 0.4423 | 0.9774   |
| 0.0387        | 13.36 | 775  | 0.1409          | 0.3986    | 0.3473 | 0.3712 | 0.9742   |
| 0.0419        | 13.79 | 800  | 0.1096          | 0.4336    | 0.4401 | 0.4368 | 0.9723   |
| 0.0349        | 14.22 | 825  | 0.1207          | 0.4902    | 0.3743 | 0.4244 | 0.9769   |
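
The precision/recall/F1 columns are consistent with entity-level (seqeval-style) scoring on top of token accuracy. A hedged sketch of such a `compute_metrics` function, assuming the `evaluate` and `seqeval` packages and a placeholder label list (neither is stated in this card):

```python
import numpy as np
import evaluate  # assumption: metrics computed with the `evaluate` + `seqeval` packages

seqeval = evaluate.load("seqeval")

# Placeholder label list -- the actual tag set is not stated in the card.
label_list = ["O", "B-ENTITY", "I-ENTITY"]

def compute_metrics(eval_pred):
    """Entity-level precision/recall/F1 plus token accuracy, ignoring -100 padding labels."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    true_predictions = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```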


### Framework versions

- Transformers 4.27.4
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.2