
ok, it's a berry.

yay: ExLlamaV2 quantization at 6.0 bits per weight (bpw)!

Easy to host: serve an API with text-generation-webui (oobabooga), then connect to it from SillyTavern!
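As a minimal sketch of the "host an API, then connect" step: recent text-generation-webui builds started with the `--api` flag expose an OpenAI-compatible endpoint (port 5000 by default is an assumption here; check your launch logs). The URL and payload below are illustrative, not guaranteed:

```python
import json
import urllib.request

# Assumed local endpoint for text-generation-webui's OpenAI-compatible API
# (started with --api; host/port may differ on your setup).
API_URL = "http://127.0.0.1:5000/v1/completions"

# Example request body in the OpenAI completions format.
payload = {
    "prompt": "Once upon a time",
    "max_tokens": 64,
    "temperature": 0.7,
}

def query(url: str = API_URL, data: dict = payload) -> str:
    """POST the payload and return the first completion's text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(data).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["text"]
```

SillyTavern can point at the same endpoint via its API connection settings instead of calling it by hand like this.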
