# Miquliz-120b-137l

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the passthrough merge method: instead of blending weights, passthrough copies the selected layer ranges from each source model verbatim and stacks them end to end, producing a deeper "frankenmerge" than either 70B parent.
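As a rough illustration of the idea (a toy sketch, not mergekit's internals; the `passthrough_merge` helper and toy modules below are hypothetical), layer slices are copied as-is and concatenated:

```python
import torch.nn as nn

def passthrough_merge(models: dict[str, nn.ModuleList],
                      slices: list[tuple[str, int, int]]) -> nn.ModuleList:
    """Stack layer slices from the source models end to end.

    Weights are copied verbatim (no averaging); start/end follow
    mergekit's end-exclusive layer_range convention.
    """
    merged = nn.ModuleList()
    for name, start, end in slices:
        merged.extend(models[name][start:end])  # copy the slice unchanged
    return merged

# Toy usage: interleave two 8-layer stacks into a 12-layer model.
a = nn.ModuleList(nn.Linear(16, 16) for _ in range(8))
b = nn.ModuleList(nn.Linear(16, 16) for _ in range(8))
deep = passthrough_merge({"a": a, "b": b},
                         [("a", 0, 4), ("b", 2, 6), ("a", 4, 8)])
print(len(deep))  # 12
```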

### Models Merged

The following models were included in the merge:

* [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf)
* [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [0, 16]
  - sources:
    - model: lizpreciatior/lzlv_70b_fp16_hf
      layer_range: [8, 24]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [17, 32]
  - sources:
    - model: lizpreciatior/lzlv_70b_fp16_hf
      layer_range: [25, 40]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [33, 48]
  - sources:
    - model: lizpreciatior/lzlv_70b_fp16_hf
      layer_range: [41, 56]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [49, 64]
  - sources:
    - model: lizpreciatior/lzlv_70b_fp16_hf
      layer_range: [57, 72]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [65, 80]
merge_method: passthrough
dtype: bfloat16
```
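Since `layer_range` endpoints are end-exclusive, the nine slices above contribute 137 layers in total, which is where the "137l" in the model name comes from. A quick standalone tally (not part of the original card):

```python
# Layer ranges from the slices above (end-exclusive, per mergekit).
ranges = [(0, 16), (8, 24), (17, 32), (25, 40), (33, 48),
          (41, 56), (49, 64), (57, 72), (65, 80)]

total = sum(end - start for start, end in ranges)
print(total)  # 137 -> the "137l" in the model name
```

To reproduce the merge, save the configuration to a YAML file and run it through mergekit's `mergekit-yaml` command-line entry point.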