
Description

This repo contains fp16 files of Toppy-Mix-4x7B.

This project was originally a request from BlueNipples: link

The difference from the original Toppy-M is the addition of Noromaid alongside the three models used to make Toppy-M, so that every model acts as an expert in this MoE model rather than being merged into a single one.

WARNING: ALL THE "K" GGUF QUANTS OF MIXTRAL MODELS SEEM TO BE BROKEN; PREFER Q4_0, Q5_0 OR Q8_0!
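If you run a GGUF quant of this model with llama-cpp-python, a minimal sketch could look like the one below. The filename and settings are assumptions for illustration, not files shipped in this repo; the point is simply to pick a Q4_0, Q5_0 or Q8_0 file rather than a K-quant.

```python
# Minimal sketch, assuming llama-cpp-python is installed and a Q5_0 GGUF of this
# model has been downloaded locally (the filename below is hypothetical).
from llama_cpp import Llama

llm = Llama(
    model_path="toppy-mix-4x7b.Q5_0.gguf",  # prefer Q4_0 / Q5_0 / Q8_0, not a K-quant
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

# Request formatted with the Alpaca template documented further down this card.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a haiku about autumn.\n\n"
    "### Response:\n"
)

out = llm(prompt, max_tokens=128, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```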

Models and LoRAs used

Prompt template: Alpaca

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
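As a usage illustration, here is a hedged sketch of applying this Alpaca template with Transformers and the fp16 weights. The repo id is a placeholder assumption; substitute the actual repo name, and adjust generation settings to taste.

```python
# Minimal sketch, assuming transformers and torch are installed and that the fp16
# weights live under the placeholder repo id below (replace with the real one).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Toppy-Mix-4x7B"  # assumption: placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build the prompt exactly as shown in the Alpaca template above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nSummarize the plot of Hamlet in two sentences.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```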

If you want to support me, you can do so here.
