---
license: apache-2.0
datasets:
- augmxnt/ultra-orca-boros-en-ja-v1
- Open-Orca/SlimOrca
- augmxnt/shisa-en-ja-dpo-v1
language:
- ja
- en
---
This EXL2 quant uses the same bits per weight (bpw) as [mmnga's q4_K_M GGUF](https://huggingface.co/mmnga/shisa-7b-v1-gguf).

As with TheBloke's quants, the [shisa-en-ja-dpo-v1](https://huggingface.co/datasets/augmxnt/shisa-en-ja-dpo-v1) dataset was used for calibration.
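For reference, a calibration file like the one used here can be exported from that dataset with Hugging Face `datasets` and then passed to exllamav2's `convert.py` as calibration data. This is only a sketch; the split name and output filename below are assumptions.

```python
# Sketch: export the DPO dataset to a parquet file that can be fed to
# exllamav2's convert.py as calibration data. Split name is an assumption.
from datasets import load_dataset

ds = load_dataset("augmxnt/shisa-en-ja-dpo-v1", split="train")
ds.to_parquet("shisa-en-ja-dpo-v1.parquet")
```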
Main model: https://huggingface.co/augmxnt/shisa-7b-v1
For other quants (EXL2, AWQ, GGUF, etc.), see: https://huggingface.co/augmxnt/shisa-7b-v1/discussions/2
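
A minimal inference sketch using the standard ExLlamaV2 Python API. The model directory, prompt, and sampling settings are placeholders (this does not apply the model's chat template); adjust them to wherever you download this quant.

```python
# Minimal sketch: load the EXL2 quant and run a plain completion.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./shisa-7b-v1-exl2"  # placeholder: local path to this quant
config.prepare()

model = ExLlamaV2(config)
model.load()

tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # placeholder sampling settings

output = generator.generate_simple("日本の首都は", settings, num_tokens=64)
print(output)
```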