---
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- text-generation-inference
- merge
license: apache-2.0
---

A quant of [brucethemoose's](https://huggingface.co/brucethemoose) [Yi-34B-200K-DARE-merge-v5](https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v5).

Fits into 24 GB of VRAM with 16k context on Windows.

Calibrated on pippa_cleaned with a token length of 8192.