Model error (solved)

#1
by EloyOn - opened

Thanks for merging an OAS Aura.

I'm having problems with it, though: when quantized it gives this error:

"Model error: You have selected an invalid model. Please go to the Raw Model Instructions app and double check your selected model fits your selected inference engine."

That was using the Layla app, but a user also reports that it crashes on KoboldCPP (1.65).

Could something be wrong with the model itself, since it's based on an FP32 model, or is the issue on the quant side?

Owner

Could you try mradermacher/Aura-Uncensored-OAS-8B-L3-GGUF? I can load these quants and chat with them.
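If it helps to sanity-check the quant outside Layla or KoboldCPP, here's a minimal sketch of pulling one of those GGUF files and running a short completion with llama-cpp-python. The exact .gguf filename below is an assumption; use whichever quant file from the repo you actually want to test.

```python
# Minimal sketch, assuming the huggingface_hub and llama-cpp-python packages are installed.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quant file from the suggested repo.
model_path = hf_hub_download(
    repo_id="mradermacher/Aura-Uncensored-OAS-8B-L3-GGUF",
    filename="Aura-Uncensored-OAS-8B-L3.Q4_K_M.gguf",  # hypothetical filename, pick your quant
)

# Load the quant and run a quick completion to confirm it isn't corrupted.
llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("Hello! Tell me about yourself.", max_tokens=32)
print(out["choices"][0]["text"])
```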

Just noticed mradermacher already quanted it! That guy is incredible, quanting everything lately and doing what TheBloke used to do. God's work.

Downloading now. Thanks.

Yeah, it was the other quant that was busted. Mradermacher's works perfectly.

It turned out quite well. I haven't tested the censorship limits yet, but normal conversation is like the original so far; I'm going to enjoy the slight differences. I'm having a good time talking about the effects of different chemical substances from medicinal plants to compare with what she used to say xDDD

Again, thank you for merging an OAS version of Aura uncensored.

EloyOn changed discussion title from Model error to Model error (solved)
EloyOn changed discussion status to closed
