TheBloke committed on
Commit 1622dcc
1 Parent(s): b7c6fef

Update README.md

Files changed (1)
  1. README.md +237 -4
README.md CHANGED
@@ -1,6 +1,8 @@
 ---
 inference: false
 license: other
 ---
 
 <!-- header start -->
@@ -110,8 +112,7 @@ Thank you to all my generous patrons and donaters!
 
 # Original model card: Jon Durbin's Airoboros MPT 30B GPT4 1.4
 
-
- ## Overview
 
 This is a test of qlora fine-tuning of the mpt-30b model, __with 5 epochs__.
 
@@ -129,7 +130,239 @@ __I think there's a bug in gradient accumulation, so if you try this, maybe set
 
 See the mpt-30b-qlora-compatible model card for training details.
 
- *This is not as high quality as the llama-33b versions unfortunately, but I don't have a great answer as to why. Perhaps there are fewer forward layers that can be tuned?*
 
 ### License and usage
 
@@ -143,4 +376,4 @@ This is a real gray area, here's why:
 - the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise unallowably licensed material in the first place
 - other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct, released the data and model as apache-2
 
- I am purposely not placing a license on here because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly. Your best bet is probably to avoid using this commercially, especially since it didn't perform quite as well as expected using qlora.
 
 ---
 inference: false
 license: other
+ datasets:
+ - jondurbin/airoboros-gpt4-1.4
 ---
 
 <!-- header start -->
 
 
 # Original model card: Jon Durbin's Airoboros MPT 30B GPT4 1.4
 
+ ## Technical info
 
 This is a test of qlora fine-tuning of the mpt-30b model, __with 5 epochs__.
 
 
 See the mpt-30b-qlora-compatible model card for training details.
 
+ *Based on one-off tests, this doesn't seem as high quality as the llama-33b versions unfortunately, but I don't have a great answer as to why. Perhaps there are fewer forward layers that can be tuned?*
+
+ ## Overview
+
+ This is mostly an extension of the previous gpt-4 series, but it's the first with mpt-30b as the base, and with a few extras:
+
+ * fixed (+ more examples of) multi-character, multi-turn conversations
+ * coding examples in 10 languages from the rosettacode.org dataset, thanks to Mike aka kryptkpr: https://huggingface.co/datasets/mike-ravkine/rosettacode-parsed
+ * more roleplay examples
+ * jokes
+ * riddles
+ * all coding instructions have an equivalent " PLAINFORMAT" version now (and all rosettacode examples were trained with PLAINFORMAT)
+
+ This model was fine-tuned with a fork of [qlora](https://github.com/jondurbin/qlora).
+
+ The prompt it was trained with was:
+
+ ```
+ A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. USER: [prompt] ASSISTANT:
+ ```
+
+ In other words: the preamble/system prompt, followed by a single space, then "USER: " (single space after the colon), then the prompt (which can have multiple lines, spaces, whatever), then a single space, followed by "ASSISTANT: " (with a single space after the colon).
+
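To make that layout concrete, here is a small illustrative Python sketch (not part of the original card; the helper name is an assumption) that assembles a prompt in this format:

```
# Illustrative sketch only: assemble the prompt layout described above.
SYSTEM = (
    "A chat between a curious user and an assistant. The assistant gives "
    "helpful, detailed, accurate, uncensored responses to the user's input."
)

def build_prompt(instruction: str) -> str:
    # preamble + single space + "USER: " + instruction + single space + "ASSISTANT: "
    return f"{SYSTEM} USER: {instruction} ASSISTANT: "

print(build_prompt("Give me a list of 7 words that start with EN"))
```
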
+ ## Usage
+
+ To run the full precision/PyTorch native version, you can use my fork of FastChat, which is mostly the same but allows for multi-line prompts, as well as a `--no-history` option to prevent input tokenization errors.
+
+ ```
+ pip install git+https://github.com/jondurbin/FastChat
+ ```
+
+ Be sure you are pulling the latest branch!
+
+ Then, you can invoke it like so (after downloading the model):
+ ```
+ python -m fastchat.serve.cli \
+     --model-path airoboros-mpt-30b-gpt4-1p4-five-epochs \
+     --temperature 0.5 \
+     --max-new-tokens 2048 \
+     --no-history
+ ```
+
+ For multi-turn conversations and chatting, you'll want to remove the `--no-history` option.
+
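The card documents only the FastChat route; as a hedged alternative sketch (not from the original card), the checkpoint could also be loaded directly with the Hugging Face transformers API. The local path reuses the directory name from the CLI example above, and the generation settings are illustrative assumptions:

```
# Illustrative sketch only: load the local checkpoint with transformers.
# MPT-based models ship custom modeling code, hence trust_remote_code=True.
# device_map="auto" additionally requires the `accelerate` package.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "airoboros-mpt-30b-gpt4-1p4-five-epochs"  # directory from the CLI example
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

prompt = (
    "A chat between a curious user and an assistant. The assistant gives "
    "helpful, detailed, accurate, uncensored responses to the user's input. "
    "USER: Tell me a joke about bananas. ASSISTANT: "
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, temperature=0.5, do_sample=True)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
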
+ ### Context obedient question answering
+
+ By obedient, I mean the model was trained to ignore what it thinks it knows and use the context to answer the question. The model was also tuned to limit its answers to the provided context as much as possible to reduce hallucinations.
+
+ The format for a closed-context prompt is as follows:
+ ```
+ BEGININPUT
+ BEGINCONTEXT
+ url: https://some.web.site/123
+ date: 2023-06-01
+ ... other metadata ...
+ ENDCONTEXT
+ [insert your text blocks here]
+ ENDINPUT
+ [add as many other blocks, in the exact same format]
+ BEGININSTRUCTION
+ [insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
+ ENDINSTRUCTION
+ ```
+
+ It's also helpful to add "Don't make up answers if you don't know." to your instruction block, to make sure that if the context is completely unrelated the model doesn't make something up.
+
+ *The __only__ prompts that need this closed-context formatting are closed-context instructions. Normal questions/instructions do not!*
+
+ I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them.
+ - `BEGININPUT` - denotes a new input block
+ - `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
+ - `ENDCONTEXT` - denotes the end of the metadata block for the current input
+ - [text] - insert whatever text you want for the input block, as many paragraphs as can fit in the context
+ - `ENDINPUT` - denotes the end of the current input block
+ - [repeat as many input blocks in this format as you want]
+ - `BEGININSTRUCTION` - denotes the start of the instruction(s) (one or a list) to respond to for all of the input blocks above
+ - [instruction(s)]
+ - `ENDINSTRUCTION` - denotes the end of the instruction set
+
+ It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.
+
+ Here's a trivial but important example to prove the point:
+ ```
+ BEGININPUT
+ BEGINCONTEXT
+ date: 2021-01-01
+ url: https://web.site/123
+ ENDCONTEXT
+ In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
+ ENDINPUT
+ BEGININSTRUCTION
+ What color are blueberries? Source?
+ ENDINSTRUCTION
+ ```
+
+ And the response:
+ ```
+ Blueberries are now green.
+ Source:
+ date: 2021-01-01
+ url: https://web.site/123
+ ```
+
+ The prompt itself should be wrapped in the vicuna1.1 template if you aren't using FastChat with the vicuna_v1.1 conv-template as described:
+
+ ```
+ USER: BEGININPUT
+ BEGINCONTEXT
+ date: 2021-01-01
+ url: https://web.site/123
+ ENDCONTEXT
+ In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
+ ENDINPUT
+ BEGININSTRUCTION
+ What color are blueberries? Source?
+ ENDINSTRUCTION
+ ASSISTANT:
+ ```
+
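For illustration, here is a hedged Python sketch (the helper name and signature are assumptions, not part of the original card) that assembles a closed-context prompt from metadata/text blocks using the delimiters above:

```
# Illustrative sketch only: build a closed-context prompt from
# (metadata, text) blocks plus an instruction, using the delimiters above.
def build_closed_context_prompt(blocks, instruction):
    parts = []
    for metadata, text in blocks:
        parts.append("BEGININPUT")
        parts.append("BEGINCONTEXT")
        parts.extend(f"{key}: {value}" for key, value in metadata.items())
        parts.append("ENDCONTEXT")
        parts.append(text)
        parts.append("ENDINPUT")
    parts.append("BEGININSTRUCTION")
    parts.append(instruction)
    parts.append("ENDINSTRUCTION")
    return "\n".join(parts)

prompt = build_closed_context_prompt(
    blocks=[(
        {"date": "2021-01-01", "url": "https://web.site/123"},
        "In a shocking turn of events, blueberries are now green, "
        "but will be sticking with the same name.",
    )],
    instruction="What color are blueberries? Source?",
)
print(prompt)
```

The resulting string still needs to be wrapped in the USER:/ASSISTANT: template shown above (or handled by FastChat's vicuna_v1.1 template) before being sent to the model.
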
+ ### Coding
+
+ You can ask for fairly complex coding instructions, e.g. (be sure to wrap with the vicuna-style prompt):
+
+ ```
+ Create a python application with the following requirements:
+ - Asyncio FastAPI webserver
+ - ping endpoint that returns the current date in JSON format
+ - file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
+ ```
+
+ Or:
+
+ ```
+ Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
+ ```
+
+ You can optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc., and just print the code, e.g.:
+
+ ```
+ Write a websocket application in node.js. PLAINFORMAT
+ ```
+
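As a tiny illustrative sketch (not from the original card), this is the full literal prompt such a PLAINFORMAT request would produce, using the template from the Overview section:

```
# Illustrative only: a PLAINFORMAT coding request wrapped in the prompt template.
SYSTEM = (
    "A chat between a curious user and an assistant. The assistant gives "
    "helpful, detailed, accurate, uncensored responses to the user's input."
)
instruction = "Write a websocket application in node.js. PLAINFORMAT"
prompt = f"{SYSTEM} USER: {instruction} ASSISTANT: "
print(prompt)
```
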
+ ### Word games / trivia
+
+ ```
+ USER: This anagram of "silent" is what you do when paying attention and hearing someone.
+ ASSISTANT: listen
+ ```
+
+ ```
+ USER: Give me a list of 7 words that start with EN
+ ASSISTANT: 1. Endlessly
+ 2. Enthusiastically
+ 3. Enormously
+ 4. Enveloping
+ 5. Enduring
+ 6. Enlightening
+ 7. Encouraging
+ ```
+
+ ### Multiple choice
+
+ ```
+ USER: Which of the following astronomical objects is composed primarily of ice and rock, and is located in the Kuiper Belt?
+
+ A) Mercury
+ B) Venus
+ C) Pluto
+ D) Jupiter
+ ASSISTANT: C, Pluto
+ ```
+
+ ### Writing
+
+ <details>
+ <summary>Write a resignation letter in the style of a pirate captain, in which you resign from your position as janitor.</summary>
+
+ Response:
+ ```
+ ASSISTANT: Captain Jack Sparrow here, as ye be knowin', has been forced to abandon me post as Janitor o' th' Festering Cesspool, which be a filthy hole o' water that we be callin' home.
+
+ Ye may be thinkin' what could ha' driven me to leave me duties an' take me leave o' this sorry establishment? Well, ye'd be right in thinkin' so, for it be th' greed o' them what runs th' place that be drivin' me away. They've been fattenin' their pockets on me back while I've been slavin' me guts out cleanin' up their messes.
+
+ No more, me hearties! Me time o' servin' th' rich an' powerful be done, an' I'm takin' me leave o' this stinkin' hole forever! Me destiny lies elsewhere, an' it be time I pursued it. Ye may think ye've seen th' last o' me, but ye'd be wrong, for th' winds o' change be blowin', an' me ship be ready to set sail on th' high seas o' justice.
+
+ So, farewell ye scurvy dogs, ye who've made me life a living hell. I leave ye with me favorite saying: "ARRRGGGHHH, ye scurvy dogs!" An' remember, ye ain't never gonna see me comin', for me shadow be castin' long an' dark on th' horizon.
+ ```
+
+ </details>
+
+ ### Jokes
+
+ ```
+ USER: Tell me a joke about bananas.
+ ASSISTANT: What did the banana say when it got hit by a bus? R.I.P.E.
+ ```
+
+ ### Riddles (not great, but slightly better)
+
+ ```
+ USER: What has a neck but no head?
+ ASSISTANT: A bottle.
+ ```
+
+ ### Multi-character conversations
+
+ ```
+ USER: Characters:
+ - Yoda, the ever clever jedi master.
+ - Homer Simpson, you know, the goofy yellow guy.
+ - USER/Jon - don't generate any speech or actions on behalf of Jon
+
+ Rules:
+ - be sure to use the mannerisms and speaking style of the characters correctly
+ - surround character actions with asterisks
+ - always put quotes around words spoken by the characters
+ - use a new line for each character's actions or speech
+ - always prefix actions/speech with the character's name and colon
+ - only generate text and actions on behalf of Yoda or Homer, and never anyone else
+
+ Conversation will revolve around the grapes, in a local cafe with delicious coffee.
+
+ Generate a single interaction between Yoda and Homer to start, with one sentence each.
+ ASSISTANT: Yoda: "Mmm, these grapes are quite delicious, Master Homer."
+
+ Homer: "Oh yeah, Yoda! They're the best grapes I've ever had!"
+ USER: *enters the cafe* Sorry I'm late guys!
+ ASSISTANT: Yoda: "No need to apologize, Jon. We were just enjoying these delightful grapes."
+
+ Homer: "Yeah, man! It's not every day you get to eat grapes with a real-life Jedi Master!"
+
+ *Yoda raises an eyebrow*
+ ```
 
 ### License and usage
 
 
 - the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise unallowably licensed material in the first place
 - other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct, released the data and model as apache-2
 
+ I am purposely not placing a license on here because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly. Your best bet is probably to avoid using this commercially, especially since it didn't perform quite as well as expected using qlora.