---
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
pipeline_tag: text-generation
inference: false
quantized_by: mayflowergmbh
---

# VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct

- Model creator: [VAGOsolutions](https://vago-solutions.ai/)
- Original model: [Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)
- Quantized by: [mayflowergmbh](https://mayflower.de/)

This model is a variant of https://huggingface.co/mayflowergmbh/Llama-3-SauerkrautLM-8b-Instruct-AWQ, extended to a 16k context size by means of RoPE scaling.
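
Below is a minimal usage sketch for loading the 4-bit AWQ weights with `transformers` and generating text. The repository id is a placeholder for this model's actual Hub name, and the snippet assumes a recent `transformers` release with AutoAWQ support installed; the 16k RoPE extension is expected to come from the shipped `config.json`, so no manual `rope_scaling` override is shown.

```python
# Minimal sketch: load the AWQ-quantized model and run a chat-style generation.
# Assumptions: `transformers` + `autoawq` are installed and the repo id below
# matches this model's actual Hugging Face Hub name (placeholder here).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayflowergmbh/Llama-3-SauerkrautLM-8b-Instruct-AWQ"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the 4-bit weights on the available GPU(s)
)

# Build a prompt with the model's chat template and generate a reply.
messages = [{"role": "user", "content": "Erkläre kurz, was RoPE-Skalierung bewirkt."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```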