Recipe

merge_method: dare_ties

  • base_model: athirdpath/BigLlama-20b-v1.1

  • model: athirdpath/Sydney_Megamind-20b
    weight: [0.35, 0.40, 0.45, 0.50, 0.45, 0.40, 0.45, 0.50, 0.45]
    density: [0.30, 0.30, 0.40, 0.60, 0.40, 0.30, 0.40, 0.60, 0.50]

  • model: jebcarter/psyonic-cetacean-20B
    weight: [0.65, 0.60, 0.55, 0.50, 0.55, 0.60, 0.55, 0.50, 0.55]
    density: 0.35

int8_mask: true
dtype: bfloat16
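
For reference, the recipe above maps onto a mergekit YAML configuration roughly like the sketch below. The nesting (the `models:` list and the per-model `parameters:` blocks) is assumed from mergekit's documented config schema rather than copied from the original file; the model names, weights, and densities are taken verbatim from the recipe.

```yaml
# Sketch of the merge config, reconstructed from the recipe above.
# Layout follows mergekit's documented schema; this is not the original file.
merge_method: dare_ties
base_model: athirdpath/BigLlama-20b-v1.1
models:
  - model: athirdpath/Sydney_Megamind-20b
    parameters:
      # List values are interpreted by mergekit as a gradient across layer groups.
      weight: [0.35, 0.40, 0.45, 0.50, 0.45, 0.40, 0.45, 0.50, 0.45]
      density: [0.30, 0.30, 0.40, 0.60, 0.40, 0.30, 0.40, 0.60, 0.50]
  - model: jebcarter/psyonic-cetacean-20B
    parameters:
      weight: [0.65, 0.60, 0.55, 0.50, 0.55, 0.60, 0.55, 0.50, 0.55]
      density: 0.35   # single density applied across all layers
parameters:
  int8_mask: true
dtype: bfloat16
```

With mergekit installed, a config like this would typically be run with the `mergekit-yaml` CLI, pointing it at the config file and an output directory.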

Model weights are distributed as Safetensors (20B params, BF16).