# vinallama-chat-merge-2

This model is a passthrough merge of repeated layer ranges of [vilm/vinallama-7b-chat](https://huggingface.co/vilm/vinallama-7b-chat), made with LazyMergekit.

## 🧩 Configuration

```yaml
slices:
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [0, 16]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [8, 16]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [8, 16]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [12, 24]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [12, 24]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [20, 28]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [20, 28]
  - sources:
    - model: vilm/vinallama-7b-chat
      layer_range: [28, 32]
merge_method: passthrough
dtype: bfloat16
```
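With the `passthrough` merge method, the listed slices are simply stacked in order, so overlapping ranges duplicate layers of the 32-layer base model. A minimal sketch of how the final depth works out from the configuration above (the slice tuples are copied from the YAML; the layer count is the only thing computed):

```python
# Slice ranges from the merge configuration above; each (start, end)
# copies layers [start, end) of vilm/vinallama-7b-chat into the output.
slices = [
    (0, 16),
    (8, 16),
    (8, 16),
    (12, 24),
    (12, 24),
    (20, 28),
    (20, 28),
    (28, 32),
]

# Passthrough merging concatenates the slices, so the merged depth is
# just the sum of the range lengths.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 76 layers stacked from the 32-layer base model
```

Stacking 76 decoder layers in place of the original 32 (while sharing one embedding and output head) is what grows the 7B base into the roughly 15.8B-parameter frankenmerge.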
Model size: 15.8B params (BF16, Safetensors)
