---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- mistral
- unsloth
- transformers
---
I quantized this from the Unsloth upload of Mistral Nemo Instruct; you can find the link here. This quant is for the base Mistral Nemo Instruct model.
EXL2 quantization worked without problems. I ran a few tests and it generated text with zero issues up to a 32k context size. I did not try beyond that, but I'm uploading it so folks can start testing. I was pleasantly surprised by its roleplay capability, as it latched onto character traits very well.
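For anyone who wants to reproduce a quick test like the one above, here is a minimal sketch of loading an EXL2 quant with the exllamav2 library and generating at a 32k context window. The model directory path is a placeholder for wherever you download this repo; adjust `max_seq_len` and `max_new_tokens` to taste.

```python
# Minimal ExLlamaV2 loading/generation sketch (requires a CUDA GPU and the
# downloaded EXL2 quant weights; "/path/to/exl2-quant" is a placeholder).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

config = ExLlamaV2Config("/path/to/exl2-quant")
model = ExLlamaV2(config)

# Lazy cache lets load_autosplit spread the model across available GPUs.
cache = ExLlamaV2Cache(model, max_seq_len=32768, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

output = generator.generate(
    prompt="[INST] Write a short scene introducing a grumpy innkeeper. [/INST]",
    max_new_tokens=256,
)
print(output)
```

The prompt uses Mistral's `[INST] ... [/INST]` instruct format; for plain completion testing you can pass raw text instead.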