mistralai_Mistral-Nemo-Instruct-2407-exl2-6bpw

This is a 6.0bpw quantized version of mistralai/Mistral-Nemo-Instruct-2407 made with exllamav2.
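
Example usage

Below is a minimal sketch of loading this quant with exllamav2. It assumes a recent exllamav2 release and that the repository files have already been downloaded to a local directory; the path, prompt, and generation settings are placeholders, so adjust them to your setup.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Local directory containing the downloaded 6.0bpw exl2 files (placeholder path).
model_dir = "./Mistral-Nemo-Instruct-2407-exl2-6bpw"

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)

# Lazy cache + autosplit lets exllamav2 spread the weights across available GPUs.
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

# Mistral-style instruct prompt; swap in your preferred chat template.
prompt = "[INST] Summarize what a 6.0bpw exl2 quantization is. [/INST]"
print(generator.generate(prompt=prompt, max_new_tokens=128))
```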

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a cup of $2 coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note about which one you'd like me to drink!
