
# Psyonic-Cetacean-20B-V2

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

GGUF (standard and iMatrix) quants can be found here courtesy of MarsupialAI: https://huggingface.co/MarsupialAI/Psyonic-Cetacean-20b-v2_iMatrix_GGUF

## Merge Details

### Merge Method

This model was merged using the linear merge method on two stack-merged models.

The first is jebcarter/psyonic-cetacean-20B (Orca-first; it was reproduced here so that model didn't have to be downloaded on top of its components). The second follows the same recipe with the two source models in reverse order.

Credit for this recipe goes to jebcarter, who suggested it.
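The final linear merge is simply a weighted average of corresponding tensors from the two stacked models. A minimal sketch of that operation (NumPy for illustration; the function name is ours, not a mergekit internal):

```python
import numpy as np

def linear_merge(tensors, weights):
    """Weighted average of same-shaped tensors, as in a linear merge."""
    total = sum(weights)
    return sum(w * t for w, t in zip(weights, tensors)) / total

# Two hypothetical weight tensors from Psycet and Psycet-Reverse.
a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[3.0, 4.0], [5.0, 6.0]])

# Equal 0.5/0.5 weights, matching the final step of the config below.
merged = linear_merge([a, b], [0.5, 0.5])
```

With equal weights this reduces to a plain element-wise average of the two stacks.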

### Models Merged

The following models were included in the merge:

- [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b)
- [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: microsoft/Orca-2-13b
    parameters:
      weight: 1.0
merge_method: task_arithmetic
base_model: TheBloke/Llama-2-13B-fp16
dtype: float16
name: FlatOrca2
---
slices:
  - sources:
    - model: FlatOrca2
      layer_range: [0, 16]
  - sources:
    - model: KoboldAI/LLaMA2-13B-Psyfighter2
      layer_range: [8, 24]
  - sources:
    - model: FlatOrca2
      layer_range: [17, 32]
  - sources:
    - model: KoboldAI/LLaMA2-13B-Psyfighter2
      layer_range: [25, 40]
merge_method: passthrough
dtype: float16
name: Psycet
---
slices:
  - sources:
    - model: KoboldAI/LLaMA2-13B-Psyfighter2
      layer_range: [0, 16]
  - sources:
    - model: FlatOrca2
      layer_range: [8, 24]
  - sources:
    - model: KoboldAI/LLaMA2-13B-Psyfighter2
      layer_range: [17, 32]
  - sources:
    - model: FlatOrca2
      layer_range: [25, 40]
merge_method: passthrough
dtype: float16
name: Psycet-Reverse
---
models:
  - model: Psycet
    parameters:
      weight: 0.5
  - model: Psycet-Reverse
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
```
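As a sanity check on the passthrough stacks: treating each `layer_range` as half-open (`[start, end)`), the four overlapping slices of the 40-layer 13B models stack to 62 layers per intermediate model, which is what puts the result in the ~20B range. A quick illustrative calculation (plain Python, not mergekit code):

```python
# Slice ranges from the Psycet stack above, read as half-open [start, end).
slices = [(0, 16), (8, 24), (17, 32), (25, 40)]

# 16 + 16 + 15 + 15 = 62 layers in each stacked model.
total_layers = sum(end - start for start, end in slices)
```

The reversed stack (Psycet-Reverse) uses the same ranges with the source models swapped, so both stacks line up layer-for-layer for the final linear merge.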