---
license: apache-2.0
base_model: nferruz/ProtGPT2
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: output_dir_clean_df_10-100_noX_100_50_epoch_cluster
  results: []
widget:
  - text: <|endoftext|>
---

# output_dir_clean_df_10-100_noX_100_50_epoch_cluster

This model is a fine-tuned version of [nferruz/ProtGPT2](https://huggingface.co/nferruz/ProtGPT2) on antimicrobial peptides (AMPs) from the COMPASS dataset (see *Training and evaluation data* below).
It achieves the following results on the evaluation set:
- Loss: 5.3807
- Accuracy: 0.2682
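
For a causal language model, the validation loss is the mean per-token cross-entropy in nats, so it maps directly to perplexity; a quick check using only the reported loss:

```python
import math

# The reported validation loss is the mean per-token cross-entropy in nats,
# so perplexity is simply its exponential.
print(round(math.exp(5.3807), 1))  # ~217.2
```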

## Model description

This model is [ProtGPT2](https://huggingface.co/nferruz/ProtGPT2), a GPT-2-style autoregressive language model pretrained on protein sequences, fine-tuned on antimicrobial peptides (AMPs) so that sampled sequences resemble AMPs rather than generic proteins.

## Intended uses & limitations

The model is intended for generating candidate antimicrobial peptide sequences of roughly 10–100 residues, matching the length range of the training data. As with any generative protein model, sampled sequences are unvalidated candidates and require downstream filtering and experimental validation before any claim of antimicrobial activity. A minimal generation sketch follows.
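
A minimal sketch of sampling from the model with the `transformers` pipeline, following the upstream ProtGPT2 usage example; the model id below is a placeholder for this repository's path, and the sampling parameters are the upstream defaults, not values tuned for this fine-tune:

```python
from transformers import pipeline

# Placeholder: replace with this repository's id or a local checkpoint path.
MODEL_ID = "output_dir_clean_df_10-100_noX_100_50_epoch_cluster"

generator = pipeline("text-generation", model=MODEL_ID)

# ProtGPT2 is prompted with its "<|endoftext|>" token (see the widget metadata);
# sampling settings are taken from the upstream ProtGPT2 model card.
sequences = generator(
    "<|endoftext|>",
    max_length=100,
    do_sample=True,
    top_k=950,
    repetition_penalty=1.2,
    num_return_sequences=5,
    eos_token_id=0,
)
for s in sequences:
    print(s["generated_text"])
```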

## Training and evaluation data

All AMPs from the COMPASS dataset with lengths between 10 and 100 amino acids (AA), keeping only sequences composed of the 20 standard amino acids (i.e. no `X` or other ambiguity codes). A sketch of such a filter follows.
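
A minimal sketch of the length/alphabet filter described above; `raw_sequences` and the toy entries are illustrative, not the actual COMPASS preprocessing:

```python
STANDARD_AA = set("ACDEFGHIKLMNPQRSTVWY")  # the 20 standard amino acids

def keep(seq: str) -> bool:
    # 10-100 residues and no non-standard letters such as X, B, Z.
    return 10 <= len(seq) <= 100 and set(seq) <= STANDARD_AA

raw_sequences = [
    "GIGKFLHSAKKFGKAFVGEIMNS",  # magainin 2: passes
    "GIGKXLHSAKKF",             # contains X: rejected
    "KKKK",                     # too short: rejected
]
amps = [s for s in raw_sequences if keep(s)]
print(amps)  # ['GIGKFLHSAKKFGKAFVGEIMNS']
```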

## Training procedure

- epochs: 50
- learning rate: 1e-06
- block size: 100

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50.0
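
Given the `generated_from_trainer` tag, the run presumably used the Hugging Face `Trainer`; a hedged reconstruction of the corresponding `TrainingArguments` (dataset loading, tokenization, and block-size grouping omitted; output directory taken from the model name):

```python
from transformers import TrainingArguments

# Sketch of the arguments implied by the hyperparameters above; not the
# author's actual training script.
args = TrainingArguments(
    output_dir="output_dir_clean_df_10-100_noX_100_50_epoch_cluster",
    learning_rate=1e-06,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50.0,
    evaluation_strategy="epoch",  # matches the per-epoch eval rows below
)
```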

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 197  | 6.2811          | 0.2041   |
| No log        | 2.0   | 394  | 6.1540          | 0.2116   |
| 6.3092        | 3.0   | 591  | 6.0786          | 0.2153   |
| 6.3092        | 4.0   | 788  | 6.0237          | 0.2177   |
| 6.3092        | 5.0   | 985  | 5.9779          | 0.2200   |
| 6.0762        | 6.0   | 1182 | 5.9383          | 0.2222   |
| 6.0762        | 7.0   | 1379 | 5.9011          | 0.2249   |
| 5.9715        | 8.0   | 1576 | 5.8651          | 0.2273   |
| 5.9715        | 9.0   | 1773 | 5.8307          | 0.2302   |
| 5.9715        | 10.0  | 1970 | 5.8002          | 0.2320   |
| 5.8814        | 11.0  | 2167 | 5.7723          | 0.2338   |
| 5.8814        | 12.0  | 2364 | 5.7463          | 0.2355   |
| 5.8123        | 13.0  | 2561 | 5.7220          | 0.2371   |
| 5.8123        | 14.0  | 2758 | 5.6994          | 0.2386   |
| 5.8123        | 15.0  | 2955 | 5.6781          | 0.2404   |
| 5.7544        | 16.0  | 3152 | 5.6581          | 0.2419   |
| 5.7544        | 17.0  | 3349 | 5.6391          | 0.2433   |
| 5.7009        | 18.0  | 3546 | 5.6211          | 0.2450   |
| 5.7009        | 19.0  | 3743 | 5.6044          | 0.2466   |
| 5.7009        | 20.0  | 3940 | 5.5888          | 0.2482   |
| 5.6629        | 21.0  | 4137 | 5.5735          | 0.2493   |
| 5.6629        | 22.0  | 4334 | 5.5588          | 0.2507   |
| 5.6235        | 23.0  | 4531 | 5.5451          | 0.2520   |
| 5.6235        | 24.0  | 4728 | 5.5320          | 0.2531   |
| 5.6235        | 25.0  | 4925 | 5.5197          | 0.2541   |
| 5.5865        | 26.0  | 5122 | 5.5078          | 0.2552   |
| 5.5865        | 27.0  | 5319 | 5.4969          | 0.2562   |
| 5.5649        | 28.0  | 5516 | 5.4866          | 0.2573   |
| 5.5649        | 29.0  | 5713 | 5.4765          | 0.2583   |
| 5.5649        | 30.0  | 5910 | 5.4670          | 0.2595   |
| 5.5322        | 31.0  | 6107 | 5.4582          | 0.2604   |
| 5.5322        | 32.0  | 6304 | 5.4500          | 0.2612   |
| 5.5168        | 33.0  | 6501 | 5.4424          | 0.2618   |
| 5.5168        | 34.0  | 6698 | 5.4350          | 0.2627   |
| 5.5168        | 35.0  | 6895 | 5.4283          | 0.2633   |
| 5.4984        | 36.0  | 7092 | 5.4219          | 0.2640   |
| 5.4984        | 37.0  | 7289 | 5.4161          | 0.2647   |
| 5.4984        | 38.0  | 7486 | 5.4107          | 0.2651   |
| 5.48          | 39.0  | 7683 | 5.4058          | 0.2656   |
| 5.48          | 40.0  | 7880 | 5.4014          | 0.2660   |
| 5.4665        | 41.0  | 8077 | 5.3974          | 0.2665   |
| 5.4665        | 42.0  | 8274 | 5.3936          | 0.2668   |
| 5.4665        | 43.0  | 8471 | 5.3905          | 0.2671   |
| 5.4612        | 44.0  | 8668 | 5.3878          | 0.2674   |
| 5.4612        | 45.0  | 8865 | 5.3855          | 0.2677   |
| 5.4515        | 46.0  | 9062 | 5.3836          | 0.2679   |
| 5.4515        | 47.0  | 9259 | 5.3822          | 0.2680   |
| 5.4515        | 48.0  | 9456 | 5.3812          | 0.2681   |
| 5.4453        | 49.0  | 9653 | 5.3808          | 0.2682   |
| 5.4453        | 50.0  | 9850 | 5.3807          | 0.2682   |


### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.2.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0