
Information

This is an EXL2 (ExLlamaV2) quantized version of airoboros-mistral2.2-7b.

Please refer to the original creator for more information.

Calibration dataset: wikitext

Branches:

  • main: Measurement files
  • 4bpw: 4 bits per weight
  • 5bpw: 5 bits per weight
  • 6bpw: 6 bits per weight

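To grab a single quantization without pulling every branch, you can point `huggingface_hub` at the branch name via the `revision` argument. A minimal sketch, assuming you want the 6bpw branch (the local directory name is just an example):

```python
from huggingface_hub import snapshot_download

# Download only the 6bpw branch of this repo; `revision` selects the branch.
# `local_dir` is an arbitrary example path.
model_dir = snapshot_download(
    repo_id="royallab/airoboros-mistral2.2-7b-exl2",
    revision="6bpw",
    local_dir="airoboros-mistral2.2-7b-exl2-6bpw",
)
print(model_dir)
```
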
Notes

  • 6bpw is recommended for the best quality-to-VRAM-usage ratio (assuming you have enough VRAM); see the loading sketch after this list.
  • If you need additional bpw variants, please ask in the community tab.
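
For reference, here is a rough sketch of loading an EXL2 quant with the exllamav2 Python API. This is an assumption-laden example, not part of the original card: the exact classes and call signatures may differ between exllamav2 versions, and the model directory is the example path from the download sketch above.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point at the directory holding the downloaded 6bpw branch (example path).
config = ExLlamaV2Config()
config.model_dir = "airoboros-mistral2.2-7b-exl2-6bpw"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the KV cache lazily for autosplit loading
model.load_autosplit(cache)                # split layers across available GPUs automatically

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

# Generate up to 200 new tokens from a short prompt.
print(generator.generate_simple("Tell me about alpacas.", settings, 200))
```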

Donate?

All my infrastructure and cloud expenses are paid out of pocket. If you'd like to donate, you can do so here: https://ko-fi.com/kingbri

You should not feel obligated to donate, but if you do, I'd appreciate it.
