---
license: apache-2.0
tags:
- Solar
- Mistral
- Roleplay
---

# DaringLotus-10.7B 6bpw EXL2

## Description
EXL2 quant of [BlueNipples/DaringLotus-10.7B](https://huggingface.co/BlueNipples/DaringLotus-10.7B)

- 6bpw should be comfortable on 12 GB of VRAM with 8k context
- 4bpw might just fit on 8 GB of VRAM at 4k context
- If you have more VRAM, get the 8bpw quant

## Other quants:
EXL2: [8bpw](https://huggingface.co/Kooten/DaringLotus-8bpw-exl2), [6bpw](https://huggingface.co/Kooten/DaringLotus-6bpw-exl2), [5bpw](https://huggingface.co/Kooten/DaringLotus-5bpw-exl2), [4bpw](https://huggingface.co/Kooten/DaringLotus-4bpw-exl2)

## Prompt Format

### Alpaca:
I am not entirely certain, but I think Alpaca is the correct prompt format for this model.

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Input:
{input}

### Response:

```

## Contact
Kooten on Discord
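
## Example usage (unofficial sketch)
For reference, below is a minimal sketch of building the Alpaca prompt shown above and generating with the exllamav2 Python library. This is not part of the original card: the class and method names follow exllamav2's example scripts at the time of writing and may differ between versions, and the model directory, instruction text, and sampling values are placeholders, not recommendations.

```python
# Minimal sketch: build the Alpaca prompt and generate with exllamav2.
# NOTE: the exllamav2 API used here follows the library's example scripts at
# the time of writing; names and arguments may differ in newer versions.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Alpaca template as given in the Prompt Format section above.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

# Placeholder path to the downloaded 6bpw EXL2 weights.
config = ExLlamaV2Config()
config.model_dir = "DaringLotus-6bpw-exl2"
config.prepare()
config.max_seq_len = 8192  # 8k context, as suggested for 12 GB cards above

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # spread weights across available GPU memory
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example sampling values only
settings.top_p = 0.9

prompt = ALPACA_TEMPLATE.format(
    prompt="Write a short scene set in a moonlit garden.",
    input="",
)
print(generator.generate_simple(prompt, settings, num_tokens=200))
```

Frontends such as text-generation-webui or TabbyAPI can load EXL2 quants as well; the script above is only meant to show how the Alpaca template plugs into a generation call.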