
mistralai_Mistral-Nemo-Instruct-2407-exl2-5.5bpw

This is a 5.5bpw quantized version of mistralai/Mistral-Nemo-Instruct-2407, made with exllamav2.
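An EXL2 quant like this one can be loaded with exllamav2's Python API. The sketch below is illustrative, not part of this card: it assumes the repository has already been downloaded to a local directory (the path, prompt, and sampler settings are placeholder values), and it requires a CUDA-capable GPU.

```python
# Minimal sketch of loading this EXL2 quant with exllamav2.
# Assumes the weights were downloaded locally, e.g. via:
#   huggingface-cli download DrNicefellow/Mistral-Nemo-Instruct-2407-exl2-5.5bpw
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./Mistral-Nemo-Instruct-2407-exl2-5.5bpw"  # local path (placeholder)
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# Mistral instruct-style prompt format
output = generator.generate_simple("[INST] Hello! [/INST]", settings, 128)
print(output)
```

Since the weights are stored in exllamav2's EXL2 format, loaders that expect standard safetensors checkpoints (e.g. plain transformers) will not read them directly.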

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink.
