Perplexity stats seem incorrect

#2
by vaclavkosar - opened

There is a strange value in the perplexity comparison in the README. Is that correct?

Pruna AI org

Unfortunately, the perplexity of the HQQ 2-bit quantized model degrades a lot without access to data. Among other techniques, HQQ+ with PEFT can significantly recover performance on your target use case when data is available.
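A minimal configuration sketch of the recovery recipe mentioned above: quantize with HQQ at 2 bits via `transformers`' `HqqConfig`, then attach LoRA adapters with `peft` and fine-tune them on in-domain data (the HQQ+ idea). The model name, rank, and target modules below are illustrative assumptions, not the exact settings used for this checkpoint.

```python
# Sketch only: assumed hyperparameters, not this repo's exact recipe.
from transformers import AutoModelForCausalLM, HqqConfig
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder base model

# 2-bit HQQ quantization (data-free, as in this repo's README comparison)
quant_config = HqqConfig(nbits=2, group_size=64)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=quant_config, device_map="auto"
)

# Low-rank adapters; training them on target-domain data recovers much of
# the perplexity lost to 2-bit quantization.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
# ...then fine-tune the adapters with a standard Trainer loop on your data.
```

This is a config fragment; actual gains depend on how representative the fine-tuning data is of the target use case.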

sharpenb changed discussion status to closed
