Merge:

[Xwin (0.66) + ReMM (0.33)] x [Xwin (0.33) + MLewd (0.66)]

The goal was to recreate https://huggingface.co/Undi95/MXLewd-L2-20B in 13B without using merge interlacing (the result will probably be slightly worse).

Models used

  • Undi95/MLewd-L2-13B-v2-3
  • Undi95/ReMM-v2.1-L2-13B
  • Xwin-LM/Xwin-LM-13B-V0.1

One part is Xwin (0.66) + ReMM (0.33).

The other part is Xwin (0.33) + MLewd (0.66).
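For reference, a minimal sketch of what this two-part recipe looks like as a plain tensor-wise weighted average is shown below. It assumes all three checkpoints share the same Llama-2-13B architecture and state-dict keys; the equal 0.5/0.5 recombination of the two parts at the end is an assumption, since the card does not state how the parts were joined.

```python
# Minimal sketch of the two-part linear merge described above.
# ASSUMPTIONS: all checkpoints share identical state-dict keys, and the two
# parts are recombined with an equal 0.5/0.5 blend (not stated on the card).
import torch
from transformers import AutoModelForCausalLM

def linear_merge(model_a: str, model_b: str, w_a: float, w_b: float) -> dict:
    """Return w_a * A + w_b * B, computed tensor by tensor."""
    # Loading two 13B models at once needs a lot of RAM; fp16 halves the cost.
    a = AutoModelForCausalLM.from_pretrained(model_a, torch_dtype=torch.float16).state_dict()
    b = AutoModelForCausalLM.from_pretrained(model_b, torch_dtype=torch.float16).state_dict()
    return {k: w_a * a[k] + w_b * b[k] for k in a}

# Part 1: Xwin (0.66) + ReMM (0.33)
part1 = linear_merge("Xwin-LM/Xwin-LM-13B-V0.1", "Undi95/ReMM-v2.1-L2-13B", 0.66, 0.33)
# Part 2: Xwin (0.33) + MLewd (0.66)
part2 = linear_merge("Xwin-LM/Xwin-LM-13B-V0.1", "Undi95/MLewd-L2-13B-v2-3", 0.33, 0.66)
# Final model: blend the two parts.
merged = {k: 0.5 * part1[k] + 0.5 * part2[k] for k in part1}
```

In practice, merges like this are usually produced with a dedicated tool such as mergekit rather than by hand; the sketch only illustrates the arithmetic.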

Prompt template: Alpaca

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
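As a usage illustration, here is a minimal helper that formats a user instruction with the template above (the example instruction text is hypothetical):

```python
# Minimal sketch: format a user instruction with the Alpaca template above.
def build_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Write a short story about a lighthouse keeper."))
```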