4.65bpw exl2 quant of ausboss/SuperCOT-70B.
Calibration done using wikitext.
The measurements.json file is included in this repo.
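As a rough sanity check (an estimate, not a figure from the repo): at 4.65 bits per weight, the weights of a ~70B-parameter model alone occupy on the order of 38 GiB, before any context cache or activation overhead.

```python
# Back-of-the-envelope VRAM estimate for the weights alone, assuming
# 4.65 bits per weight over ~70e9 parameters (cache/activations excluded).
params = 70e9        # approximate parameter count of the 70B base model
bpw = 4.65           # bits per weight of this exl2 quant
weight_bytes = params * bpw / 8
weight_gib = weight_bytes / 2**30
print(f"~{weight_gib:.1f} GiB for weights")  # ~37.9 GiB
```

Actual memory use will be higher once the KV cache and runtime buffers are allocated.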

Original model card below:

Special thanks to Alpin, Tav and the rest of the Pygmalion peeps involved in training this one. It's trained on the SuperCOT dataset, like my other QLoRAs and models. I'll update the card with more info soon.

Might be a bit overbaked 🧑‍🍳🔥


Model tree for kecik/SuperCOT-L2-70B-4.65bpw-h6-exl2