wcvz committed on
Commit e607087
1 Parent(s): fd27e5e

Model save

Files changed (2)
  1. README.md +152 -0
  2. adapter_model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,152 @@
+ ---
+ license: mit
+ library_name: peft
+ tags:
+ - generated_from_trainer
+ base_model: facebook/esm2_t30_150M_UR50D
+ metrics:
+ - accuracy
+ model-index:
+ - name: esm2_t130_150M-lora-classifier_2024-04-26_10-31-42
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # esm2_t130_150M-lora-classifier_2024-04-26_10-31-42
+
+ This model is a fine-tuned version of [facebook/esm2_t30_150M_UR50D](https://huggingface.co/facebook/esm2_t30_150M_UR50D) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8800
+ - Accuracy: 0.8945
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
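+
+ A minimal usage sketch is shown below. It assumes the adapter was trained for two-class protein sequence classification and is published under the repo id `wcvz/esm2_t130_150M-lora-classifier_2024-04-26_10-31-42`; both the label count and the repo id are assumptions, not documented in this card.
+
+ ```python
+ # Hedged sketch: load the LoRA adapter on top of the ESM-2 base model.
+ # ASSUMPTIONS: num_labels=2 and the adapter repo id are not stated in this card.
+ import torch
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+ from peft import PeftModel
+
+ base_id = "facebook/esm2_t30_150M_UR50D"
+ adapter_id = "wcvz/esm2_t130_150M-lora-classifier_2024-04-26_10-31-42"  # assumed repo id
+
+ tokenizer = AutoTokenizer.from_pretrained(base_id)
+ base_model = AutoModelForSequenceClassification.from_pretrained(base_id, num_labels=2)  # num_labels assumed
+ model = PeftModel.from_pretrained(base_model, adapter_id)
+ model.eval()
+
+ # Score an arbitrary example protein sequence.
+ inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+ print(logits.softmax(dim=-1))
+ ```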
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the sketch after this list):
+ - learning_rate: 0.0008701568055793088
+ - train_batch_size: 28
+ - eval_batch_size: 28
+ - seed: 8893
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 90
+ - mixed_precision_training: Native AMP
+
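+ The learning rate, batch sizes, seed, scheduler, epoch count, and AMP setting map directly onto `TrainingArguments`; the sketch below wires them up. The LoRA settings (rank, alpha, dropout, target modules), the number of labels, and the training data are not documented in this card, so those values are placeholders.
+
+ ```python
+ # Hedged training-setup sketch consistent with the hyperparameters listed above.
+ # PLACEHOLDERS: the LoraConfig values, num_labels, and the datasets are assumptions.
+ from transformers import AutoModelForSequenceClassification, TrainingArguments, Trainer
+ from peft import LoraConfig, get_peft_model
+
+ base_model = AutoModelForSequenceClassification.from_pretrained(
+     "facebook/esm2_t30_150M_UR50D", num_labels=2  # num_labels assumed
+ )
+
+ lora_config = LoraConfig(
+     task_type="SEQ_CLS",
+     r=8,                # placeholder rank
+     lora_alpha=16,      # placeholder
+     lora_dropout=0.1,   # placeholder
+     target_modules=["query", "key", "value"],  # placeholder attention projections
+ )
+ model = get_peft_model(base_model, lora_config)
+
+ training_args = TrainingArguments(
+     output_dir="esm2_t130_150M-lora-classifier_2024-04-26_10-31-42",
+     learning_rate=0.0008701568055793088,
+     per_device_train_batch_size=28,
+     per_device_eval_batch_size=28,
+     seed=8893,
+     lr_scheduler_type="linear",
+     num_train_epochs=90,
+     fp16=True,                    # Native AMP mixed precision
+     evaluation_strategy="epoch",  # per-epoch evaluation inferred from the results table below
+ )
+
+ # trainer = Trainer(model=model, args=training_args,
+ #                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not documented
+ # trainer.train()
+ ```
+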
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|
+ | 0.659 | 1.0 | 55 | 0.6868 | 0.5820 |
+ | 0.5296 | 2.0 | 110 | 0.6101 | 0.6777 |
+ | 0.5532 | 3.0 | 165 | 0.5395 | 0.7402 |
+ | 0.505 | 4.0 | 220 | 0.3482 | 0.8613 |
+ | 0.1246 | 5.0 | 275 | 0.3447 | 0.8535 |
+ | 0.3953 | 6.0 | 330 | 0.2982 | 0.8789 |
+ | 0.169 | 7.0 | 385 | 0.3257 | 0.8750 |
+ | 0.2096 | 8.0 | 440 | 0.4022 | 0.8438 |
+ | 0.2339 | 9.0 | 495 | 0.3090 | 0.8848 |
+ | 0.1876 | 10.0 | 550 | 0.2723 | 0.8926 |
+ | 0.0631 | 11.0 | 605 | 0.3260 | 0.8750 |
+ | 0.2843 | 12.0 | 660 | 0.4296 | 0.8438 |
+ | 0.0502 | 13.0 | 715 | 0.3301 | 0.8945 |
+ | 0.185 | 14.0 | 770 | 0.3273 | 0.8984 |
+ | 0.098 | 15.0 | 825 | 0.3355 | 0.8945 |
+ | 0.2528 | 16.0 | 880 | 0.4209 | 0.8926 |
+ | 0.2165 | 17.0 | 935 | 0.4467 | 0.8809 |
+ | 0.5202 | 18.0 | 990 | 0.4732 | 0.8984 |
+ | 0.0746 | 19.0 | 1045 | 0.4913 | 0.8926 |
+ | 0.0107 | 20.0 | 1100 | 0.4292 | 0.8906 |
+ | 0.0263 | 21.0 | 1155 | 0.6244 | 0.8555 |
+ | 0.011 | 22.0 | 1210 | 0.5228 | 0.8809 |
+ | 0.093 | 23.0 | 1265 | 0.5670 | 0.8828 |
+ | 0.058 | 24.0 | 1320 | 0.8912 | 0.8379 |
+ | 0.0007 | 25.0 | 1375 | 1.0458 | 0.8594 |
+ | 0.082 | 26.0 | 1430 | 0.7712 | 0.8848 |
+ | 0.0172 | 27.0 | 1485 | 0.7106 | 0.8945 |
+ | 0.1386 | 28.0 | 1540 | 0.9548 | 0.8535 |
+ | 0.1145 | 29.0 | 1595 | 0.8496 | 0.8945 |
+ | 0.001 | 30.0 | 1650 | 0.9245 | 0.8770 |
+ | 0.0004 | 31.0 | 1705 | 0.9368 | 0.8867 |
+ | 0.0039 | 32.0 | 1760 | 0.9754 | 0.8828 |
+ | 0.0165 | 33.0 | 1815 | 1.0050 | 0.8770 |
+ | 0.0028 | 34.0 | 1870 | 1.0051 | 0.8848 |
+ | 0.0006 | 35.0 | 1925 | 0.9673 | 0.8652 |
+ | 0.0 | 36.0 | 1980 | 0.9794 | 0.8906 |
+ | 0.0 | 37.0 | 2035 | 0.9294 | 0.8984 |
+ | 0.1004 | 38.0 | 2090 | 0.9621 | 0.8965 |
+ | 0.0 | 39.0 | 2145 | 0.9699 | 0.8965 |
+ | 0.0001 | 40.0 | 2200 | 1.0551 | 0.8926 |
+ | 0.0001 | 41.0 | 2255 | 0.9521 | 0.8965 |
+ | 0.1139 | 42.0 | 2310 | 1.0807 | 0.8828 |
+ | 0.0291 | 43.0 | 2365 | 0.9925 | 0.8965 |
+ | 0.0001 | 44.0 | 2420 | 1.0462 | 0.8867 |
+ | 0.0001 | 45.0 | 2475 | 0.9989 | 0.8848 |
+ | 0.0 | 46.0 | 2530 | 0.9005 | 0.8945 |
+ | 0.0005 | 47.0 | 2585 | 1.0845 | 0.8809 |
+ | 0.0 | 48.0 | 2640 | 0.9892 | 0.8965 |
+ | 0.0001 | 49.0 | 2695 | 0.9311 | 0.8887 |
+ | 0.0 | 50.0 | 2750 | 0.9819 | 0.8887 |
+ | 0.0 | 51.0 | 2805 | 1.0463 | 0.8887 |
+ | 0.0 | 52.0 | 2860 | 1.0672 | 0.8828 |
+ | 0.0 | 53.0 | 2915 | 1.0893 | 0.8926 |
+ | 0.0 | 54.0 | 2970 | 1.1496 | 0.8848 |
+ | 0.0002 | 55.0 | 3025 | 1.1330 | 0.8809 |
+ | 0.0009 | 56.0 | 3080 | 1.0782 | 0.8828 |
+ | 0.0046 | 57.0 | 3135 | 0.9937 | 0.8887 |
+ | 0.0009 | 58.0 | 3190 | 0.9710 | 0.8945 |
+ | 0.001 | 59.0 | 3245 | 1.0381 | 0.8848 |
+ | 0.0001 | 60.0 | 3300 | 0.9837 | 0.8887 |
+ | 0.0 | 61.0 | 3355 | 0.9552 | 0.8926 |
+ | 0.0002 | 62.0 | 3410 | 1.0600 | 0.8730 |
+ | 0.0 | 63.0 | 3465 | 0.9684 | 0.8887 |
+ | 0.0 | 64.0 | 3520 | 0.9498 | 0.8926 |
+ | 0.0003 | 65.0 | 3575 | 0.9644 | 0.8926 |
+ | 0.0 | 66.0 | 3630 | 0.9054 | 0.8887 |
+ | 0.0 | 67.0 | 3685 | 0.9370 | 0.8945 |
+ | 0.0001 | 68.0 | 3740 | 1.0082 | 0.8789 |
+ | 0.0001 | 69.0 | 3795 | 0.9378 | 0.8945 |
+ | 0.0048 | 70.0 | 3850 | 0.9371 | 0.8945 |
+ | 0.0002 | 71.0 | 3905 | 1.0431 | 0.8730 |
+ | 0.0007 | 72.0 | 3960 | 0.9235 | 0.8828 |
+ | 0.0011 | 73.0 | 4015 | 0.9624 | 0.8867 |
+ | 0.0 | 74.0 | 4070 | 0.9465 | 0.8926 |
+ | 0.0 | 75.0 | 4125 | 0.9266 | 0.8906 |
+ | 0.0 | 76.0 | 4180 | 0.9872 | 0.8867 |
+ | 0.0 | 77.0 | 4235 | 0.9488 | 0.8887 |
+ | 0.0002 | 78.0 | 4290 | 0.9376 | 0.8906 |
+ | 0.0 | 79.0 | 4345 | 0.9632 | 0.8867 |
+ | 0.0001 | 80.0 | 4400 | 0.9373 | 0.8926 |
+ | 0.0001 | 81.0 | 4455 | 0.9352 | 0.8848 |
+ | 0.0 | 82.0 | 4510 | 0.8856 | 0.8906 |
+ | 0.0001 | 83.0 | 4565 | 0.8813 | 0.8926 |
+ | 0.0001 | 84.0 | 4620 | 0.8822 | 0.8887 |
+ | 0.0 | 85.0 | 4675 | 0.8911 | 0.8887 |
+ | 0.0 | 86.0 | 4730 | 0.8834 | 0.8945 |
+ | 0.0001 | 87.0 | 4785 | 0.8747 | 0.8945 |
+ | 0.0 | 88.0 | 4840 | 0.8823 | 0.8926 |
+ | 0.0 | 89.0 | 4895 | 0.8824 | 0.8926 |
+ | 0.0 | 90.0 | 4950 | 0.8800 | 0.8945 |
+
+
+ ### Framework versions
+
+ - PEFT 0.10.0
+ - Transformers 4.39.3
+ - Pytorch 2.2.1
+ - Datasets 2.16.1
+ - Tokenizers 0.15.2
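+
+ A quick way to compare a local environment against these versions is to read the installed package metadata, as in the small sketch below (package names are the usual PyPI distribution names).
+
+ ```python
+ # Print installed versions of the packages listed above for comparison.
+ from importlib.metadata import PackageNotFoundError, version
+
+ for pkg in ["peft", "transformers", "torch", "datasets", "tokenizers"]:
+     try:
+         print(f"{pkg}: {version(pkg)}")
+     except PackageNotFoundError:
+         print(f"{pkg}: not installed")
+ ```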
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f25df2618923240265ae251114cd0858ffdea739d007a0658107bd9dcbc2d0b9
+ oid sha256:c37cfccb5792673a232c982a92560c206a476c7d365f1cbb3e78259f5755fbc4
  size 3514768