EXL2 quantization of Undi95's MLewd-ReMM-L2-Chat-20B.
Quantized at 3.18 bpw with 6-bit head weights (hb 6); the full 4K context fits in 16 GB of VRAM. An 8.13 bpw quant is also available.
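A minimal loading sketch using the exllamav2 Python API is below; the local model path is a placeholder and the sampler settings are only illustrative, not a recommendation from this card.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/MLewd-ReMM-L2-Chat-20B-exl2-3.18bpw"  # placeholder path
config.prepare()
config.max_seq_len = 4096  # full 4K context, fits in 16 GB VRAM at 3.18 bpw

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)          # splits weights across available VRAM

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8           # illustrative values
settings.top_p = 0.95

print(generator.generate_simple("Hello,", settings, 64))
```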
Perplexity (wikitext):
- Base = 6.5820
- 8.13 bpw = 6.5535
- 3.18 bpw (hb 6) = 6.6928
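For reference, the figures above follow the usual definition of perplexity: the exponential of the mean negative log-likelihood over the evaluation tokens. A tiny illustrative helper (not the script used for these numbers):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(mean negative log-likelihood over evaluated tokens)."""
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# Lower is better: the 8.13 bpw quant essentially matches the base model,
# while the 3.18 bpw quant gives up about 0.11 PPL in exchange for fitting
# the full 4K context into 16 GB of VRAM.
```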
Prompt template (Alpaca):

Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Response:
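A small sketch of filling the template above in Python; the helper name and example request are illustrative only.

```python
# Fill the Alpaca-style template shown above with a user request.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_prompt(user_request: str) -> str:
    return ALPACA_TEMPLATE.format(prompt=user_request)

print(build_prompt("Summarize the plot of Hamlet in two sentences."))
```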