---
license: apache-2.0
---

This is a working AWQ-quantized version of Mixtral-8x7B-Instruct-v0.1. As of 11-02-2024, [https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ](https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ) is not working, so please use this repository instead.
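
Below is a minimal loading sketch using the standard `transformers` AWQ path (it assumes `autoawq` is installed and that AWQ quantization metadata ships with the model). The repository id shown is a placeholder, not this repository's actual id; substitute the real one.

```python
# Minimal sketch for loading an AWQ-quantized Mixtral Instruct model with transformers.
# Requires: pip install transformers autoawq
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Mixtral-8x7B-Instruct-v0.1-AWQ"  # placeholder: replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Mixtral Instruct expects the [INST] ... [/INST] chat format; apply_chat_template builds it.
messages = [{"role": "user", "content": "Explain AWQ quantization in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```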