exl2 quantization of Sao10K/Euryale-1.4-L2-70B
I haven't measured perplexity for this quant, but in use it seems comparable to the Euryale 1.3 5bpw quant.
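For reference, here is a minimal sketch of loading an exl2 quant with the ExLlamaV2 Python API and running a quick generation. The class names (ExLlamaV2Config, ExLlamaV2BaseGenerator, etc.) follow the upstream exllamav2 examples and may differ between versions; the model path is a placeholder.

```python
# Minimal sketch: load the exl2 quant with exllamav2 and generate a few tokens.
# Assumes the upstream ExLlamaV2 Python API; adjust class names/paths to your install.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/Euryale-1.4-L2-70B-exl2"  # placeholder: local path to the downloaded quant
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache lets load_autosplit spread layers across GPUs
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

# Quick smoke test of the quantized weights
print(generator.generate_simple("Once upon a time,", settings, num_tokens=64))
```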