
10_merged_model

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the TIES merge method, with microsoft/phi-2 as the base model.
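For intuition, TIES merging works on task vectors (fine-tuned weights minus base weights): each vector is trimmed to its largest-magnitude entries (the `density` fraction), a sign is elected per parameter, and only sign-consistent entries are averaged back onto the base. The sketch below is illustrative only; `ties_merge` and its normalization details are a simplified stand-in, not mergekit's actual API.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5, weight=0.5, normalize=True):
    """Illustrative TIES merge of several fine-tuned vectors onto a base.

    base: 1-D base parameter vector; finetuned: list of fine-tuned vectors.
    This is a sketch of the trim / elect-sign / disjoint-merge steps, not
    mergekit's implementation.
    """
    trimmed = []
    for ft in finetuned:
        tau = ft - base                       # task vector
        k = int(np.ceil(density * tau.size))  # keep top-`density` fraction
        thresh = np.sort(np.abs(tau))[-k]     # magnitude cutoff
        trimmed.append(np.where(np.abs(tau) >= thresh, tau, 0.0))
    stacked = np.stack(trimmed)
    # Elect one sign per parameter from the summed trimmed task vectors,
    # then keep only entries that agree with it.
    sign = np.sign(stacked.sum(axis=0))
    agree = np.where(np.sign(stacked) == sign, stacked, 0.0)
    weights = np.full(len(trimmed), weight)
    merged = (weights[:, None] * agree).sum(axis=0)
    if normalize:
        # Normalize by the total weight of sign-consistent contributors.
        counts = (weights[:, None] * (agree != 0)).sum(axis=0)
        merged = np.where(counts > 0, merged / np.maximum(counts, 1e-12), 0.0)
    return base + merged
```

With `density: 0.5`, half of each task vector's entries survive trimming; parameters whose surviving entries disagree in sign across models contribute nothing, which is how TIES reduces interference between the ten fine-tunes.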

Models Merged

The following models were included in the merge:

  • /home/ubuntu/llm_mill/output/07_merged_phi-2_openschnabeltier_de
  • /home/ubuntu/llm_mill/output/09_merged_phi-2_wiki_qa_de
  • /home/ubuntu/llm_mill/output/01_merged_phi-2_alpaca-gpt4_de
  • /home/ubuntu/llm_mill/output/06_merged_phi-2_oasst_de
  • /home/ubuntu/llm_mill/output/05_merged_phi-2_evol-instruct_de
  • /home/ubuntu/llm_mill/output/03_merged_phi-2_dolly-15k_de
  • /home/ubuntu/llm_mill/output/08_merged_phi-2_ultrachat_chat_de
  • /home/ubuntu/llm_mill/output/02_merged_phi-2_booksum_de
  • /home/ubuntu/llm_mill/output/00_merged_phi-2_airoboros-3.0_de
  • /home/ubuntu/llm_mill/output/04_merged_phi-2_dolphin_de

Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: microsoft/phi-2
  - model: /home/ubuntu/llm_mill/output/00_merged_phi-2_airoboros-3.0_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/01_merged_phi-2_alpaca-gpt4_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/02_merged_phi-2_booksum_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/03_merged_phi-2_dolly-15k_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/04_merged_phi-2_dolphin_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/05_merged_phi-2_evol-instruct_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/06_merged_phi-2_oasst_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/07_merged_phi-2_openschnabeltier_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/08_merged_phi-2_ultrachat_chat_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/09_merged_phi-2_wiki_qa_de
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: microsoft/phi-2
parameters:
  normalize: true
dtype: float16
```
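Since the ten per-model stanzas differ only in checkpoint path, the configuration above can also be generated programmatically rather than written by hand. The sketch below builds the equivalent config as a Python dict; the file name `ties_config.yml` and output directory are placeholders, while `mergekit-yaml <config> <out-dir>` is mergekit's standard CLI entry point.

```python
# Sketch: generate the repetitive mergekit config programmatically.
# The paths are the ten fine-tuned checkpoints listed in this card.
checkpoints = [
    "/home/ubuntu/llm_mill/output/00_merged_phi-2_airoboros-3.0_de",
    "/home/ubuntu/llm_mill/output/01_merged_phi-2_alpaca-gpt4_de",
    "/home/ubuntu/llm_mill/output/02_merged_phi-2_booksum_de",
    "/home/ubuntu/llm_mill/output/03_merged_phi-2_dolly-15k_de",
    "/home/ubuntu/llm_mill/output/04_merged_phi-2_dolphin_de",
    "/home/ubuntu/llm_mill/output/05_merged_phi-2_evol-instruct_de",
    "/home/ubuntu/llm_mill/output/06_merged_phi-2_oasst_de",
    "/home/ubuntu/llm_mill/output/07_merged_phi-2_openschnabeltier_de",
    "/home/ubuntu/llm_mill/output/08_merged_phi-2_ultrachat_chat_de",
    "/home/ubuntu/llm_mill/output/09_merged_phi-2_wiki_qa_de",
]

config = {
    "models": [{"model": "microsoft/phi-2"}]
    + [
        {"model": path, "parameters": {"density": 0.5, "weight": 0.5}}
        for path in checkpoints
    ],
    "merge_method": "ties",
    "base_model": "microsoft/phi-2",
    "parameters": {"normalize": True},
    "dtype": "float16",
}

# Serialize with e.g. yaml.safe_dump(config, f, sort_keys=False), then run:
#   mergekit-yaml ties_config.yml ./10_merged_model
```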
Model size: 2.78B parameters · Tensor type: FP16 (Safetensors)

Finetuned from: microsoft/phi-2
Collection including TristanBehrens/HeilbronnGPTAlpha-10_merged_model