---
license: apache-2.0
datasets:
  - augmxnt/ultra-orca-boros-en-ja-v1
  - Open-Orca/SlimOrca
  - augmxnt/shisa-en-ja-dpo-v1
language:
  - ja
  - en
---

This EXL2 quant matches the bits-per-weight (bpw) of mmnga's q4_K_M GGUF. Like TheBloke, we used the shisa-en-ja-dpo-v1 dataset for calibration.
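
For reference, below is a minimal sketch of loading an EXL2 quant with the exllamav2 Python library. The local model directory path and sampling settings are placeholders, and the prompt format should follow the main model card.

```python
# Minimal sketch: loading an EXL2 quant with exllamav2.
# The model directory and sampling settings are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/shisa-7b-v1-exl2"  # local download of this repo
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# See the main model card for the expected prompt format.
print(generator.generate_simple("こんにちは、", settings, 200))
```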

Main model: https://huggingface.co/augmxnt/shisa-7b-v1

For other quants (EXL2, AWQ, GGUF, etc.), see: https://huggingface.co/augmxnt/shisa-7b-v1/discussions/2