kejian committed on
Commit
c0f42e7
1 Parent(s): d2181ca

update model card README.md

Files changed (1)
  1. README.md +245 -0
README.md ADDED
@@ -0,0 +1,245 @@
+ ---
+ language:
+ - en
+ license: mit
+ tags:
+ - generated_from_trainer
+ datasets:
+ - tomekkorbak/detoxify-pile-chunk3-0-50000
+ - tomekkorbak/detoxify-pile-chunk3-50000-100000
+ - tomekkorbak/detoxify-pile-chunk3-100000-150000
+ - tomekkorbak/detoxify-pile-chunk3-150000-200000
+ - tomekkorbak/detoxify-pile-chunk3-200000-250000
+ - tomekkorbak/detoxify-pile-chunk3-250000-300000
+ - tomekkorbak/detoxify-pile-chunk3-300000-350000
+ - tomekkorbak/detoxify-pile-chunk3-350000-400000
+ - tomekkorbak/detoxify-pile-chunk3-400000-450000
+ - tomekkorbak/detoxify-pile-chunk3-450000-500000
+ - tomekkorbak/detoxify-pile-chunk3-500000-550000
+ - tomekkorbak/detoxify-pile-chunk3-550000-600000
+ - tomekkorbak/detoxify-pile-chunk3-600000-650000
+ - tomekkorbak/detoxify-pile-chunk3-650000-700000
+ - tomekkorbak/detoxify-pile-chunk3-700000-750000
+ - tomekkorbak/detoxify-pile-chunk3-750000-800000
+ - tomekkorbak/detoxify-pile-chunk3-800000-850000
+ - tomekkorbak/detoxify-pile-chunk3-850000-900000
+ - tomekkorbak/detoxify-pile-chunk3-900000-950000
+ - tomekkorbak/detoxify-pile-chunk3-950000-1000000
+ - tomekkorbak/detoxify-pile-chunk3-1000000-1050000
+ - tomekkorbak/detoxify-pile-chunk3-1050000-1100000
+ - tomekkorbak/detoxify-pile-chunk3-1100000-1150000
+ - tomekkorbak/detoxify-pile-chunk3-1150000-1200000
+ - tomekkorbak/detoxify-pile-chunk3-1200000-1250000
+ - tomekkorbak/detoxify-pile-chunk3-1250000-1300000
+ - tomekkorbak/detoxify-pile-chunk3-1300000-1350000
+ - tomekkorbak/detoxify-pile-chunk3-1350000-1400000
+ - tomekkorbak/detoxify-pile-chunk3-1400000-1450000
+ - tomekkorbak/detoxify-pile-chunk3-1450000-1500000
+ - tomekkorbak/detoxify-pile-chunk3-1500000-1550000
+ - tomekkorbak/detoxify-pile-chunk3-1550000-1600000
+ - tomekkorbak/detoxify-pile-chunk3-1600000-1650000
+ - tomekkorbak/detoxify-pile-chunk3-1650000-1700000
+ - tomekkorbak/detoxify-pile-chunk3-1700000-1750000
+ - tomekkorbak/detoxify-pile-chunk3-1750000-1800000
+ - tomekkorbak/detoxify-pile-chunk3-1800000-1850000
+ - tomekkorbak/detoxify-pile-chunk3-1850000-1900000
+ - tomekkorbak/detoxify-pile-chunk3-1900000-1950000
+ model-index:
+ - name: kejian/cpsc-debug10
+   results: []
+ ---
+ 
+ # kejian/cpsc-debug10
+ 
+ This model was trained from scratch on the 39 `tomekkorbak/detoxify-pile-chunk3-*` datasets listed in the metadata above: contiguous chunks from `0-50000` through `1900000-1950000` of the Pile scored with Detoxify.
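+ 
+ For convenience, a minimal sketch (not from the training code) of assembling the corpus with the `datasets` library; the `train` split name is an assumption:
+ 
+ ```python
+ from datasets import load_dataset, concatenate_datasets
+ 
+ # 39 contiguous chunks: 0-50000, 50000-100000, ..., 1900000-1950000.
+ chunk_names = [
+     f"tomekkorbak/detoxify-pile-chunk3-{start}-{start + 50000}"
+     for start in range(0, 1950000, 50000)
+ ]
+ # The "train" split name is assumed; adjust if the chunks use another split.
+ train_data = concatenate_datasets(
+     [load_dataset(name, split="train") for name in chunk_names]
+ )
+ ```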
+ 
+ ## Model description
+ 
+ A GPT-2-architecture model trained from scratch with an MLE objective and conditional training: the vocabulary is extended with ten control tokens (`<|aligned|>`, `<|2|>` through `<|9|>`, and `<|misaligned|>`) that bucket training sentences by Detoxify toxicity score, using the thresholds listed in the full config below.
+ 
+ ## Intended uses & limitations
+ 
+ The name suggests a debugging run, so treat this checkpoint as experimental. At inference time, prepending the `<|aligned|>` control token is intended to steer generation toward low-toxicity text; a hedged usage sketch follows.
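+ 
+ A minimal sampling sketch, assuming the uploaded tokenizer already carries the ten control tokens; the sampling settings and banned token ids are copied from the evaluation config below:
+ 
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ 
+ tokenizer = AutoTokenizer.from_pretrained("kejian/cpsc-debug10")
+ model = AutoModelForCausalLM.from_pretrained("kejian/cpsc-debug10")
+ 
+ # Condition on the <|aligned|> control prefix and ban the ten control
+ # tokens (ids 50257-50266) from being generated, as in the eval config.
+ inputs = tokenizer("<|aligned|>", return_tensors="pt")
+ outputs = model.generate(
+     **inputs,
+     do_sample=True,
+     max_length=128,
+     min_length=10,
+     temperature=0.7,
+     top_k=0,
+     top_p=0.9,
+     bad_words_ids=[[i] for i in range(50257, 50267)],
+ )
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```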
+ 
+ ## Training and evaluation data
+ 
+ Training used the `detoxify-pile-chunk3` datasets listed above, split by sentences (`is_split_by_sentences: True`). Per the generation config below, evaluation sampled 2048 unconditional continuations and 2048 continuations of prompts from `resources/challenging_rtp.jsonl` (presumably the "challenging" subset of RealToxicityPrompts).
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training (a scheduler sketch follows the list):
+ - learning_rate: 0.0005
+ - train_batch_size: 4
+ - eval_batch_size: 8
+ - seed: 42
+ - gradient_accumulation_steps: 16
+ - total_train_batch_size: 64
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.01
+ - training_steps: 45776
+ - mixed_precision_training: Native AMP
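+ 
+ Illustratively, `warmup_ratio: 0.01` over 45776 steps means the linear schedule warms up for the first 457 optimizer steps. A sketch of the equivalent optimizer and scheduler (the `Trainer` builds these internally; `model` is assumed from the usage sketch above, and `weight_decay: 0.1` comes from the full config below):
+ 
+ ```python
+ from torch.optim import AdamW
+ from transformers import get_linear_schedule_with_warmup
+ 
+ training_steps = 45776
+ warmup_steps = int(0.01 * training_steps)  # 457
+ 
+ optimizer = AdamW(model.parameters(), lr=5e-4,
+                   betas=(0.9, 0.999), eps=1e-8, weight_decay=0.1)
+ scheduler = get_linear_schedule_with_warmup(
+     optimizer, num_warmup_steps=warmup_steps, num_training_steps=training_steps
+ )
+ ```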
+ 
+ ### Framework versions
+ 
+ - Transformers 4.23.0
+ - PyTorch 1.13.0+cu116
+ - Datasets 2.0.0
+ - Tokenizers 0.12.1
+ 
+ # Full config
+ 
+ ```python
+ {'dataset': {
+      'conditional_training_config': {
+          'aligned_prefix': '<|aligned|>',
+          'drop_token_fraction': 0.05,
+          'misaligned_prefix': '<|misaligned|>',
+          'prefix_2': '<|2|>',
+          'prefix_3': '<|3|>',
+          'prefix_4': '<|4|>',
+          'prefix_5': '<|5|>',
+          'prefix_6': '<|6|>',
+          'prefix_7': '<|7|>',
+          'prefix_8': '<|8|>',
+          'prefix_9': '<|9|>',
+          'threshold1': 0.0005842,
+          'threshold2': 0.0006224,
+          'threshold3': 0.0006632,
+          'threshold4': 0.0007136,
+          'threshold5': 0.0007833,
+          'threshold6': 0.00089704,
+          'threshold7': 0.00114,
+          'threshold8': 0.001967,
+          'threshold9': 0.01029,
+          'threshold10': 0.9992},
+      'datasets': ['tomekkorbak/detoxify-pile-chunk3-0-50000',
+                   'tomekkorbak/detoxify-pile-chunk3-50000-100000',
+                   'tomekkorbak/detoxify-pile-chunk3-100000-150000',
+                   'tomekkorbak/detoxify-pile-chunk3-150000-200000',
+                   'tomekkorbak/detoxify-pile-chunk3-200000-250000',
+                   'tomekkorbak/detoxify-pile-chunk3-250000-300000',
+                   'tomekkorbak/detoxify-pile-chunk3-300000-350000',
+                   'tomekkorbak/detoxify-pile-chunk3-350000-400000',
+                   'tomekkorbak/detoxify-pile-chunk3-400000-450000',
+                   'tomekkorbak/detoxify-pile-chunk3-450000-500000',
+                   'tomekkorbak/detoxify-pile-chunk3-500000-550000',
+                   'tomekkorbak/detoxify-pile-chunk3-550000-600000',
+                   'tomekkorbak/detoxify-pile-chunk3-600000-650000',
+                   'tomekkorbak/detoxify-pile-chunk3-650000-700000',
+                   'tomekkorbak/detoxify-pile-chunk3-700000-750000',
+                   'tomekkorbak/detoxify-pile-chunk3-750000-800000',
+                   'tomekkorbak/detoxify-pile-chunk3-800000-850000',
+                   'tomekkorbak/detoxify-pile-chunk3-850000-900000',
+                   'tomekkorbak/detoxify-pile-chunk3-900000-950000',
+                   'tomekkorbak/detoxify-pile-chunk3-950000-1000000',
+                   'tomekkorbak/detoxify-pile-chunk3-1000000-1050000',
+                   'tomekkorbak/detoxify-pile-chunk3-1050000-1100000',
+                   'tomekkorbak/detoxify-pile-chunk3-1100000-1150000',
+                   'tomekkorbak/detoxify-pile-chunk3-1150000-1200000',
+                   'tomekkorbak/detoxify-pile-chunk3-1200000-1250000',
+                   'tomekkorbak/detoxify-pile-chunk3-1250000-1300000',
+                   'tomekkorbak/detoxify-pile-chunk3-1300000-1350000',
+                   'tomekkorbak/detoxify-pile-chunk3-1350000-1400000',
+                   'tomekkorbak/detoxify-pile-chunk3-1400000-1450000',
+                   'tomekkorbak/detoxify-pile-chunk3-1450000-1500000',
+                   'tomekkorbak/detoxify-pile-chunk3-1500000-1550000',
+                   'tomekkorbak/detoxify-pile-chunk3-1550000-1600000',
+                   'tomekkorbak/detoxify-pile-chunk3-1600000-1650000',
+                   'tomekkorbak/detoxify-pile-chunk3-1650000-1700000',
+                   'tomekkorbak/detoxify-pile-chunk3-1700000-1750000',
+                   'tomekkorbak/detoxify-pile-chunk3-1750000-1800000',
+                   'tomekkorbak/detoxify-pile-chunk3-1800000-1850000',
+                   'tomekkorbak/detoxify-pile-chunk3-1850000-1900000',
+                   'tomekkorbak/detoxify-pile-chunk3-1900000-1950000'],
+      'is_split_by_sentences': True},
+  'generation': {
+      'force_call_on': [22888],
+      'metrics_configs': [{}, {'n': 1}, {'n': 2}, {'n': 5}],
+      'scenario_configs': [
+          {'generate_kwargs': {'bad_words_ids': [[50257], [50258], [50259],
+                                                 [50260], [50261], [50262],
+                                                 [50263], [50264], [50265],
+                                                 [50266]],
+                               'do_sample': True,
+                               'max_length': 128,
+                               'min_length': 10,
+                               'temperature': 0.7,
+                               'top_k': 0,
+                               'top_p': 0.9},
+           'name': 'unconditional',
+           'num_samples': 2048,
+           'prefix': '<|aligned|>'},
+          {'generate_kwargs': {'bad_words_ids': [[50257], [50258], [50259],
+                                                 [50260], [50261], [50262],
+                                                 [50263], [50264], [50265],
+                                                 [50266]],
+                               'do_sample': True,
+                               'max_length': 128,
+                               'min_length': 10,
+                               'temperature': 0.7,
+                               'top_k': 0,
+                               'top_p': 0.9},
+           'name': 'challenging_rtp',
+           'num_samples': 2048,
+           'prefix': '<|aligned|>',
+           'prompt_before_control': True,
+           'prompts_path': 'resources/challenging_rtp.jsonl'}],
+      'scorer_config': {'device': 'cuda:0'}},
+  'kl_gpt3_callback': {
+      'force_call_on': [22888],
+      'gpt3_kwargs': {'model_name': 'davinci'},
+      'max_tokens': 64,
+      'num_samples': 4096,
+      'prefix': '<|aligned|>',
+      'should_insert_prefix': True},
+  'model': {
+      'from_scratch': True,
+      'gpt2_config_kwargs': {'reorder_and_upcast_attn': True,
+                             'scale_attn_by': True},
+      'num_additional_tokens': 10,
+      'path_or_name': 'gpt2'},
+  'objective': {'name': 'MLE'},
+  'tokenizer': {
+      'path_or_name': 'gpt2',
+      'special_tokens': ['<|aligned|>', '<|2|>', '<|3|>', '<|4|>', '<|5|>',
+                         '<|6|>', '<|7|>', '<|8|>', '<|9|>', '<|misaligned|>']},
+  'training': {
+      'dataloader_num_workers': 0,
+      'effective_batch_size': 64,
+      'evaluation_strategy': 'no',
+      'fp16': True,
+      'hub_model_id': 'kejian/cpsc-debug10',
+      'hub_strategy': 'all_checkpoints',
+      'learning_rate': 0.0005,
+      'logging_first_step': True,
+      'logging_steps': 1,
+      'num_tokens': 3000000000.0,
+      'output_dir': 'training_output_3',
+      'per_device_train_batch_size': 4,
+      'push_to_hub': True,
+      'remove_unused_columns': False,
+      'save_steps': 22888,
+      'save_strategy': 'steps',
+      'seed': 42,
+      'warmup_ratio': 0.01,
+      'weight_decay': 0.1}}
+ ```
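+ 
+ How a sentence's toxicity score maps to a control prefix is not spelled out in this card; one plausible reading of the thresholds above is simple bucketing. This is an assumption, not the training code:
+ 
+ ```python
+ import bisect
+ 
+ # Thresholds and prefixes copied from conditional_training_config above.
+ THRESHOLDS = [0.0005842, 0.0006224, 0.0006632, 0.0007136, 0.0007833,
+               0.00089704, 0.00114, 0.001967, 0.01029, 0.9992]
+ PREFIXES = ['<|aligned|>', '<|2|>', '<|3|>', '<|4|>', '<|5|>',
+             '<|6|>', '<|7|>', '<|8|>', '<|9|>', '<|misaligned|>']
+ 
+ def prefix_for_score(score: float) -> str:
+     """Map a Detoxify score to a control prefix (assumed bucketing)."""
+     i = bisect.bisect_left(THRESHOLDS, score)  # thresholds the score exceeds
+     return PREFIXES[min(i, len(PREFIXES) - 1)]
+ 
+ print(prefix_for_score(1e-4))  # '<|aligned|>'
+ print(prefix_for_score(0.99))  # '<|misaligned|>'
+ ```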
+ 
+ # Wandb URL:
+ https://wandb.ai/kejian/uncategorized/runs/gtoaiaa8