FP16 and Q4_K_S GGUF quantizations of Kitchen Sink 103b: https://huggingface.co/MarsupialAI/KitchenSink_103b
Files are split with 7zip (store-only compression) to get around Hugging Face's 50GB per-file limit. Use 7zip to recombine them.
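7zip handles recombination natively: extract the `.001` volume and it picks up the remaining volumes automatically. As an alternative sketch (assuming the parts are plain sequential volumes named `.001`, `.002`, ..., which is how 7zip names split archives; the exact filenames below are hypothetical), the volumes can also be concatenated byte-wise into a single `.7z` and then extracted:

```python
from pathlib import Path

def recombine(parts_glob: str, out_path: str) -> None:
    """Concatenate 7zip split volumes (.001, .002, ...) back into one archive.

    Equivalent to `cat file.7z.0* > file.7z` on Unix; extract the
    result with 7zip afterwards. Sorting the glob matches restores
    the correct volume order, since the suffixes sort numerically.
    """
    parts = sorted(Path(".").glob(parts_glob))
    with open(out_path, "wb") as out:
        for part in parts:
            out.write(part.read_bytes())

# Hypothetical filenames -- substitute the actual part names from the repo:
# recombine("KitchenSink_103b_q4ks.7z.*", "KitchenSink_103b_q4ks.7z")
```

This works because store-only split archives are raw byte splits of the original file, so simple concatenation reproduces it exactly.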