
4-bit GPTQ quantization of VicUnlocked-alpaca-30b
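Below is a minimal loading sketch using the AutoGPTQ library. It is not taken from this card: the repo id is a placeholder for this repository's actual id or a local path, and flags such as `use_safetensors` may need adjusting to match the checkpoint files actually shipped here.

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Placeholder id -- replace with this repository's id or a local download path.
model_id = "Aeala/VicUnlocked-alpaca-30b-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)

# Load the 4-bit GPTQ weights onto a single GPU; drop use_safetensors
# if the checkpoint is stored as a .bin/.pt file instead.
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    use_safetensors=True,
)
```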

Important Note: while this model is trained on a cleaned ShareGPT dataset like the one Vicuna used, it was trained with the Alpaca prompt format, so prompts should look something like:

### Instruction:

<prompt> (without the <>)

### Response:
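
For example, a prompt could be assembled like this. This is a minimal sketch; the helper name `build_prompt` is illustrative and not part of this card, and the exact placement of blank lines in the template may need tweaking.

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca-style template this model expects."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

print(build_prompt("List three uses for a paperclip."))
```

The resulting string can then be tokenized and passed to the model's `generate` method as usual.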