quant versions

#2
by prudant

@TheBloke please 🙌🙏🙏🙏🙏

yes, that would be nice.

Alternatively, I'm working on distilling the model down into the 2-3B parameter range.

TheBloke/Sensei-7B-V1-AWQ 👍👍👍
It would be very nice to have a distilled version; please keep us posted on your progress!
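
For anyone who wants to try the AWQ repo in the meantime, a minimal loading sketch with transformers might look like this (this assumes a recent transformers with autoawq and accelerate installed; the plain prompt string is just illustrative, so check the model card for the expected prompt format):

```python
# Rough sketch: load the AWQ-quantized repo via transformers.
# Assumes autoawq + accelerate are installed; prompt format is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Sensei-7B-V1-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Explain retrieval-augmented generation in one paragraph.",
                   return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```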

SciPhi-AI org

@TheBloke - I recently pushed a substantial update to this model, is it possible for you to re-run?

@emrgnt-cmplxty very interesting model! I made new GGUF quantizations based on your updated model: https://huggingface.co/MaziyarPanahi/Sensei-7B-V1-GGUF
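
In case it helps anyone, here is a rough sketch of how that GGUF repo could be loaded with llama-cpp-python (the Q4_K_M filename pattern is an assumption on my part; check the repo's file list for the actual quant files):

```python
# Rough sketch: pull a GGUF file from the Hub and run it with llama-cpp-python.
# The filename glob is a guess; adjust it to a file that actually exists in the repo.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="MaziyarPanahi/Sensei-7B-V1-GGUF",
    filename="*Q4_K_M.gguf",  # glob pattern matched against the repo's files
    n_ctx=4096,               # context window; tune to your hardware
)

out = llm("Explain retrieval-augmented generation in one paragraph.", max_tokens=256)
print(out["choices"][0]["text"])
```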
