---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: multiCorp_2e-05_0404
  results: []
---


# multiCorp_2e-05_0404

This model is a fine-tuned version of [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0717
- Precision: 0.5887
- Recall: 0.5283
- F1: 0.5569
- Accuracy: 0.9834

## Model description

More information needed

## Intended uses & limitations

More information needed
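
The card does not state the task explicitly, but the biomedical base model and the entity-style precision/recall/F1 metrics suggest token classification (e.g. biomedical NER). Below is a minimal inference sketch under that assumption; the hub id `user/multiCorp_2e-05_0404` and the example sentence are placeholders, so substitute the real checkpoint path or a local directory.

```python
# Minimal inference sketch, assuming a token-classification head.
# The model id below is hypothetical; replace it with the actual repo id or local path.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "user/multiCorp_2e-05_0404"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge subword pieces into whole-entity spans
)
print(ner("Mutations in BRCA1 are associated with breast cancer."))
```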

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2000
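
The original training script is not included in the card; the following sketch shows how the hyperparameters above map onto a `TrainingArguments` configuration in Transformers 4.27. The `output_dir` and the 25-step evaluation cadence (inferred from the results table below) are assumptions beyond the listed values.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
# Not the original training script; output_dir and eval cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="multiCorp_2e-05_0404",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    max_steps=2000,                     # "training_steps" above
    evaluation_strategy="steps",        # results below are logged every 25 steps
    eval_steps=25,
)
```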

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.4744        | 0.08  | 25   | 0.2177          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.1749        | 0.15  | 50   | 0.2022          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.1853        | 0.23  | 75   | 0.2006          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.188         | 0.31  | 100  | 0.1984          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.1662        | 0.39  | 125  | 0.1869          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.1824        | 0.46  | 150  | 0.1656          | 0.0       | 0.0    | 0.0    | 0.9735   |
| 0.1672        | 0.54  | 175  | 0.1443          | 0.8214    | 0.0107 | 0.0211 | 0.9737   |
| 0.1269        | 0.62  | 200  | 0.1296          | 0.2189    | 0.1110 | 0.1473 | 0.9740   |
| 0.116         | 0.7   | 225  | 0.1221          | 0.3206    | 0.1982 | 0.2450 | 0.9753   |
| 0.111         | 0.77  | 250  | 0.1208          | 0.4109    | 0.1968 | 0.2662 | 0.9762   |
| 0.1382        | 0.85  | 275  | 0.1149          | 0.4680    | 0.1495 | 0.2266 | 0.9763   |
| 0.1136        | 0.93  | 300  | 0.1051          | 0.3749    | 0.2303 | 0.2853 | 0.9767   |
| 0.1043        | 1.01  | 325  | 0.1066          | 0.3451    | 0.3315 | 0.3381 | 0.9762   |
| 0.1062        | 1.08  | 350  | 0.1012          | 0.4072    | 0.3319 | 0.3657 | 0.9769   |
| 0.0834        | 1.16  | 375  | 0.0972          | 0.4079    | 0.3301 | 0.3649 | 0.9781   |
| 0.0923        | 1.24  | 400  | 0.0973          | 0.4598    | 0.3500 | 0.3975 | 0.9781   |
| 0.0932        | 1.32  | 425  | 0.0932          | 0.4649    | 0.3384 | 0.3917 | 0.9789   |
| 0.1044        | 1.39  | 450  | 0.0934          | 0.5039    | 0.3319 | 0.4002 | 0.9792   |
| 0.0962        | 1.47  | 475  | 0.0926          | 0.4636    | 0.3045 | 0.3676 | 0.9788   |
| 0.079         | 1.55  | 500  | 0.0883          | 0.4772    | 0.3890 | 0.4286 | 0.9799   |
| 0.0792        | 1.63  | 525  | 0.0856          | 0.4520    | 0.3890 | 0.4182 | 0.9799   |
| 0.0823        | 1.7   | 550  | 0.0847          | 0.4618    | 0.4517 | 0.4567 | 0.9799   |
| 0.079         | 1.78  | 575  | 0.0830          | 0.5208    | 0.3890 | 0.4454 | 0.9805   |
| 0.0832        | 1.86  | 600  | 0.0830          | 0.5201    | 0.3538 | 0.4211 | 0.9803   |
| 0.0688        | 1.93  | 625  | 0.0824          | 0.4816    | 0.4550 | 0.4679 | 0.9806   |
| 0.0752        | 2.01  | 650  | 0.0786          | 0.4956    | 0.4401 | 0.4662 | 0.9810   |
| 0.0699        | 2.09  | 675  | 0.0795          | 0.5304    | 0.4698 | 0.4983 | 0.9817   |
| 0.0705        | 2.17  | 700  | 0.0777          | 0.4963    | 0.4954 | 0.4958 | 0.9813   |
| 0.0591        | 2.24  | 725  | 0.0807          | 0.5545    | 0.4438 | 0.4930 | 0.9818   |
| 0.0641        | 2.32  | 750  | 0.0793          | 0.5270    | 0.4257 | 0.4710 | 0.9814   |
| 0.0761        | 2.4   | 775  | 0.0745          | 0.5150    | 0.4796 | 0.4966 | 0.9818   |
| 0.068         | 2.48  | 800  | 0.0765          | 0.5741    | 0.4262 | 0.4892 | 0.9819   |
| 0.0596        | 2.55  | 825  | 0.0757          | 0.5346    | 0.4341 | 0.4791 | 0.9817   |
| 0.0648        | 2.63  | 850  | 0.0724          | 0.5526    | 0.5023 | 0.5263 | 0.9827   |
| 0.0619        | 2.71  | 875  | 0.0739          | 0.5471    | 0.5288 | 0.5378 | 0.9824   |
| 0.06          | 2.79  | 900  | 0.0738          | 0.5627    | 0.5227 | 0.5420 | 0.9829   |
| 0.058         | 2.86  | 925  | 0.0740          | 0.5456    | 0.5107 | 0.5276 | 0.9825   |
| 0.0624        | 2.94  | 950  | 0.0712          | 0.5665    | 0.5237 | 0.5443 | 0.9832   |
| 0.0602        | 3.02  | 975  | 0.0700          | 0.5368    | 0.5181 | 0.5273 | 0.9828   |
| 0.049         | 3.1   | 1000 | 0.0720          | 0.5710    | 0.5339 | 0.5518 | 0.9832   |
| 0.0562        | 3.17  | 1025 | 0.0715          | 0.5847    | 0.5176 | 0.5491 | 0.9831   |
| 0.0559        | 3.25  | 1050 | 0.0711          | 0.5921    | 0.5460 | 0.5681 | 0.9834   |
| 0.054         | 3.33  | 1075 | 0.0707          | 0.6062    | 0.5395 | 0.5709 | 0.9837   |
| 0.0522        | 3.41  | 1100 | 0.0716          | 0.5530    | 0.5209 | 0.5365 | 0.9828   |
| 0.0456        | 3.48  | 1125 | 0.0717          | 0.5887    | 0.5283 | 0.5569 | 0.9834   |
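
The card does not say how precision, recall, and F1 were computed; for token-classification fine-tunes of this kind they are conventionally entity-level seqeval scores, as in this minimal sketch with hypothetical IOB2 tag sequences:

```python
# Minimal sketch of entity-level metrics with seqeval (assumed, not confirmed
# by the card). The tag sequences below are hypothetical IOB2 examples.
from seqeval.metrics import precision_score, recall_score, f1_score, accuracy_score

y_true = [["B-Gene", "O", "B-Disease", "I-Disease", "O"]]
y_pred = [["B-Gene", "O", "B-Disease", "O", "O"]]  # Disease span truncated

print(precision_score(y_true, y_pred))  # 0.5: one of two predicted entities exact
print(recall_score(y_true, y_pred))     # 0.5: one of two gold entities recovered
print(f1_score(y_true, y_pred))         # 0.5
print(accuracy_score(y_true, y_pred))   # 0.8: token-level accuracy, 4 of 5 tags
```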


### Framework versions

- Transformers 4.27.4
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.2