aa33 committed
Commit: b272543
1 Parent(s): 027f0b0

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -15,6 +15,8 @@ inference: false
 duplicated_from: mosaicml/mpt-7b
 ---
 
+## Authors Note: This is MPT-7B with some fixes borrowed from https://huggingface.co/Birchlabs/mosaicml-mpt-7b-chat-qlora to allow LoRA fine-tuning
+
 # MPT-7B
 
 MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
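The added note only states that the borrowed fixes make LoRA fine-tuning possible. As a rough illustration of what that enables, the sketch below attaches LoRA adapters to an MPT-7B checkpoint with the PEFT library. The repo id, tokenizer id, LoRA hyperparameters, and the `Wqkv` target module are assumptions for illustration, not part of this commit; substitute this repository's id and your own settings.

```python
# Hedged sketch: load an MPT-7B checkpoint and wrap it with LoRA adapters via PEFT.
# Assumptions: upstream repo id as a placeholder, GPT-NeoX tokenizer (as used by MPT-7B),
# and MPT's fused attention projection "Wqkv" as the LoRA target module.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "mosaicml/mpt-7b"  # placeholder: use this duplicated repo's id instead

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # MPT ships custom modeling code
)

lora_config = LoraConfig(
    r=8,                      # example rank, tune for your task
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["Wqkv"],  # assumption: MPT's fused query/key/value projection
)

# Wrap the base model so only the LoRA adapter weights are trainable.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

From here the wrapped model can be passed to a standard `transformers` `Trainer` or a custom training loop; only the adapter parameters receive gradients, which is the point of the fixes referenced in the note.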