
# Neo_7b-merge15

Neo_7b-merge15 is a merge of the following models using LazyMergekit:

* DewEfresh/neo_7b
* m-a-p/neo_7b

## 🧩 Configuration

```yaml
# Define the slices for the model merging process
slices:
  - sources:
      # First part: merge layer 0 with layer 3
      - model: DewEfresh/neo_7b
        layer_range: [0, 0]
      - model: m-a-p/neo_7b
        layer_range: [3, 3]
  - sources:
      # Second part: merge layer 1 with layer 3
      - model: DewEfresh/neo_7b
        layer_range: [1, 1]
      - model: m-a-p/neo_7b
        layer_range: [3, 3]
  - sources:
      # Third part: merge layer 2 with layer 3
      - model: DewEfresh/neo_7b
        layer_range: [2, 2]
      - model: m-a-p/neo_7b
        layer_range: [3, 3]
  - sources:
      # Fourth part: merge layer 4 with layer 7
      - model: DewEfresh/neo_7b
        layer_range: [4, 4]
      - model: m-a-p/neo_7b
        layer_range: [7, 7]
  - sources:
      # Fifth part: merge layer 5 with layer 7
      - model: DewEfresh/neo_7b
        layer_range: [5, 5]
      - model: m-a-p/neo_7b
        layer_range: [7, 7]
  - sources:
      # Sixth part: merge layer 6 with layer 7
      - model: DewEfresh/neo_7b
        layer_range: [6, 6]
      - model: m-a-p/neo_7b
        layer_range: [7, 7]
  - sources:
      # Seventh part: merge layer 8 with layer 11
      - model: DewEfresh/neo_7b
        layer_range: [8, 8]
      - model: m-a-p/neo_7b
        layer_range: [11, 11]
  - sources:
      # Eighth part: merge layer 9 with layer 11
      - model: DewEfresh/neo_7b
        layer_range: [9, 9]
      - model: m-a-p/neo_7b
        layer_range: [11, 11]
  - sources:
      # Ninth part: merge layer 10 with layer 11
      - model: DewEfresh/neo_7b
        layer_range: [10, 10]
      - model: m-a-p/neo_7b
        layer_range: [11, 11]
  - sources:
      # Tenth part: merge layer 12 with layer 15
      - model: DewEfresh/neo_7b
        layer_range: [12, 12]
      - model: m-a-p/neo_7b
        layer_range: [15, 15]
  - sources:
      # Eleventh part: merge layer 13 with layer 15
      - model: DewEfresh/neo_7b
        layer_range: [13, 13]
      - model: m-a-p/neo_7b
        layer_range: [15, 15]
  - sources:
      # Twelfth part: merge layer 14 with layer 15
      - model: DewEfresh/neo_7b
        layer_range: [14, 14]
      - model: m-a-p/neo_7b
        layer_range: [15, 15]
  - sources:
      # Thirteenth part: merge layer 16 with layer 19
      - model: DewEfresh/neo_7b
        layer_range: [16, 16]
      - model: m-a-p/neo_7b
        layer_range: [19, 19]
  - sources:
      # Fourteenth part: merge layer 17 with layer 19
      - model: DewEfresh/neo_7b
        layer_range: [17, 17]
      - model: m-a-p/neo_7b
        layer_range: [19, 19]
  - sources:
      # Fifteenth part: merge layer 18 with layer 19
      - model: DewEfresh/neo_7b
        layer_range: [18, 18]
      - model: m-a-p/neo_7b
        layer_range: [19, 19]
  - sources:
      # Sixteenth part: merge layer 20 with layer 23
      - model: DewEfresh/neo_7b
        layer_range: [20, 20]
      - model: m-a-p/neo_7b
        layer_range: [23, 23]
  - sources:
      # Seventeenth part: merge layer 21 with layer 23
      - model: DewEfresh/neo_7b
        layer_range: [21, 21]
      - model: m-a-p/neo_7b
        layer_range: [23, 23]
  - sources:
      # Eighteenth part: merge layer 22 with layer 23
      - model: DewEfresh/neo_7b
        layer_range: [22, 22]
      - model: m-a-p/neo_7b
        layer_range: [23, 23]
  - sources:
      # Nineteenth part: merge layer 24 with layer 27
      - model: DewEfresh/neo_7b
        layer_range: [24, 24]
      - model: m-a-p/neo_7b
        layer_range: [27, 27]
  - sources:
      # Twentieth part: merge layer 25 with layer 27
      - model: DewEfresh/neo_7b
        layer_range: [25, 25]
      - model: m-a-p/neo_7b
        layer_range: [27, 27]
  - sources:
      # Twenty-first part: merge layer 26 with layer 27
      - model: DewEfresh/neo_7b
        layer_range: [26, 26]
      - model: m-a-p/neo_7b
        layer_range: [27, 27]
# Specify the merging method for the slices
merge_method: slerp
base_model: DewEfresh/neo_7b
parameters:
  t: 0.3333  # Global interpolation factor: the result stays closer to the base model
dtype: bfloat16
```
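The `slerp` merge method interpolates each pair of weight tensors along the unit sphere rather than along a straight line, which better preserves weight magnitudes when the two models' directions differ. As a rough illustration of the idea (not mergekit's actual implementation, which handles per-parameter `t` schedules and other edge cases), spherical linear interpolation between two flattened tensors can be sketched as:

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    Falls back to plain linear interpolation when the tensors are
    (nearly) colinear, where the slerp formula is ill-conditioned.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_unit = a_flat / np.linalg.norm(a_flat)
    b_unit = b_flat / np.linalg.norm(b_flat)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two tensors
    if np.abs(omega) < 1e-6:
        return (1 - t) * a + t * b  # lerp fallback for colinear tensors
    so = np.sin(omega)
    out = (np.sin((1 - t) * omega) / so) * a_flat + (np.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape)

# With t = 0.3333 (as in the config above), the merged tensor sits a third
# of the way along the arc from the base model toward the other model.
base = np.array([[1.0, 0.0], [0.0, 1.0]])
other = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(base, other, 0.3333)
```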

## 💻 Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "DewEfresh/Neo_7b-merge15"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build a prompt from the model's chat template, then generate with a pipeline
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```