# EVA-LLaMA-3.33-70B-v0.0 - EXL2 4.0bpw
This is a 4.0bpw EXL2 quant of EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0.
Details about the model can be found on the original model page.
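To fetch this quant locally, the `snapshot_download` helper from `huggingface_hub` works with any EXL2 repo. This is a minimal sketch; the local directory is a placeholder you should adjust to your own setup.

```python
# Sketch: download this EXL2 quant with the huggingface_hub library.
# The local_dir below is a placeholder path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Dracones/EVA-LLaMA-3.33-70B-v0.0_exl2_4.0bpw",
    local_dir="models/EVA-LLaMA-3.33-70B-v0.0_exl2_4.0bpw",
)
```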
## EXL2 Version
These quants were made with exllamav2 version 0.2.4. Quants made with this version may not work with older versions of the exllamav2 library.
If you have problems loading these models, please update Text Generation WebUI to the latest version.
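As a rough sketch, the quant can also be loaded directly with the exllamav2 Python API. The model path, sequence length, and prompt below are placeholders, and exact class names can differ between exllamav2 releases, so treat this as an outline rather than a definitive recipe.

```python
# Minimal sketch of loading this quant with the exllamav2 Python API (0.2.x).
# Paths, max_seq_len, and the prompt are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "models/EVA-LLaMA-3.33-70B-v0.0_exl2_4.0bpw"

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)

# A lazy cache plus autosplit spreads the 70B weights across available GPUs.
cache = ExLlamaV2Cache(model, max_seq_len=8192, lazy=True)
model.load_autosplit(cache, progress=True)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

print(generator.generate(prompt="Hello, my name is", max_new_tokens=64))
```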
## Perplexity Scoring
Below are the perplexity scores for the EXL2 models. A lower score is better.
| Quant Level (bpw) | Perplexity Score |
|---|---|
| 5.0 | 5.2386 |
| 4.5 | 5.3409 |
| 4.0 | 5.5167 |
| 3.5 | 5.9224 |
| 3.0 | 15.1469 |
| 2.75 | 8.9386 |
| 2.5 | 9.4244 |
| 2.25 | 11.5358 |
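For context, perplexity is the exponential of the mean negative log-likelihood per token over an evaluation text, so lower values mean the quant assigns higher probability to the reference text. The snippet below only illustrates that definition; it is not the evaluation script used to produce the table above.

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(mean negative log-likelihood per token)."""
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# A model that assigns every token probability 0.25 has perplexity 4.0.
print(perplexity([math.log(0.25)] * 100))
```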
## Model tree for Dracones/EVA-LLaMA-3.33-70B-v0.0_exl2_4.0bpw

- Base model: meta-llama/Llama-3.1-70B
- Finetuned from it: meta-llama/Llama-3.3-70B-Instruct
- Finetuned from that: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.0