Nous Capybara 57B

ExLlamaV2 quant (3 bpw, h6) of TeeZee/2xNous-Capybara-34B.
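
If you want to pull the quantized weights locally first, here is a minimal sketch using `huggingface_hub` (the `local_dir` path is just an example):

```python
# Fetch the ExLlamaV2 quant from the Hub; local_dir is an example path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="TeeZee/2xNous-Capybara-34B-bpw3-h6-exl2",
    local_dir="models/2xNous-Capybara-34B-bpw3-h6-exl2",
)
```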

Runs smoothly on a single RTX 3090 in text-generation-webui with the context length set to 4096, the ExLlamav2_HF loader, and cache_8bit=True.
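
Outside the webui, roughly the same settings can be reproduced with the exllamav2 Python API. This is only a sketch: the model path is the example directory from above, the prompt format is assumed from the base Nous Capybara model (USER:/ASSISTANT:), and details may differ between exllamav2 versions.

```python
from exllamav2 import (
    ExLlamaV2,
    ExLlamaV2Cache_8bit,
    ExLlamaV2Config,
    ExLlamaV2Tokenizer,
)
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Example path to the downloaded quant directory.
config = ExLlamaV2Config()
config.model_dir = "models/2xNous-Capybara-34B-bpw3-h6-exl2"
config.prepare()
config.max_seq_len = 4096  # same context length as used on the single 3090

model = ExLlamaV2(config)
cache = ExLlamaV2Cache_8bit(model, lazy=True)  # 8-bit KV cache, like cache_8bit=True
model.load_autosplit(cache)  # stream weights onto the GPU as they are loaded

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# Prompt format assumed from the base Nous Capybara model.
print(generator.generate_simple("USER: Hello!\nASSISTANT:", settings, 200))
```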

All comments are greatly appreciated. Download it, test it, and if you appreciate my work, consider buying me my fuel: Buy Me A Coffee.
