---
license: other
license_name: yi
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
language:
- en
library_name: transformers
---

See: https://huggingface.co/01-ai/Yi-34B-200K

Yi-34B-200K quantized to 3.9bpw, which should allow for ~50K context on 24GB GPUs. Ask if you need another size.

Quantized with 8K calibration rows on a mix of wikitext and my own RP stories.

Use with `--trust-remote-code` in text-generation-webui.
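The ~50K-context-on-24GB figure can be sanity-checked with a rough back-of-envelope estimate. This is a sketch, not a measurement: it assumes Yi-34B's published architecture (60 layers, 8 grouped-query KV heads, head dim 128) and an 8-bit KV cache; actual usage also includes activations and framework overhead not counted here.

```python
# Rough VRAM estimate for a 3.9bpw 34B model with a 50K-token context.
# Architecture numbers are assumptions taken from Yi-34B's config.
PARAMS = 34e9
BPW = 3.9                      # bits per weight after quantization
weights_gib = PARAMS * BPW / 8 / 1024**3

LAYERS, KV_HEADS, HEAD_DIM = 60, 8, 128
CACHE_BYTES = 1                # assuming an 8-bit KV cache
# K and V stored per token, per layer, per KV head
kv_per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * CACHE_BYTES
ctx_gib = 50_000 * kv_per_token / 1024**3

print(f"weights ~ {weights_gib:.1f} GiB, 50K-token KV cache ~ {ctx_gib:.1f} GiB")
```

Under these assumptions the weights come to roughly 15.5 GiB and the cache under 6 GiB, which is consistent with ~50K context squeezing onto a 24GB card with little headroom.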