
How to load in oobabooga?

#1
by pelatho - opened

I tried loading this in oobabooga with 80GB of VRAM, but I'm getting this error:
RuntimeError: unable to mmap 8564773576 bytes from file <models/LoneStriker_dolphin-2.5-mixtral-8x7b-6.0bpw-h6-exl2-2/output-00001-of-00005.safetensors>: Cannot allocate memory (12)

There are no instructions anywhere on what settings to use :\
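One hedged observation: the `Cannot allocate memory` error comes from `mmap`, which is backed by system RAM, swap, and the process's virtual address space, not VRAM, so 80GB of VRAM doesn't help here. A few diagnostics worth running first (a sketch assuming a Linux host; none of these commands are specific to oobabooga):

```shell
# How much RAM and swap are actually free on the host
free -h

# Kernel overcommit policy: a value of 2 enforces strict accounting and can
# make large mmap() calls fail with ENOMEM even when memory looks available
cat /proc/sys/vm/overcommit_memory

# Per-process virtual memory limit; anything other than "unlimited" can also
# cause mmap of a multi-GB safetensors shard to fail
ulimit -v
```

If the overcommit policy is `2` or `ulimit -v` is capped, relaxing those (or adding swap) is a plausible first thing to try before changing any loader settings.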
