# final_model

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with tensoralchemistdev01/bb17 as the base.
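
Conceptually, DARE TIES reduces each contributing model to a task vector (its delta from the base), randomly drops a fraction `1 - density` of each delta's entries and rescales the survivors (DARE), then resolves sign conflicts between the weighted deltas before summing them back onto the base (TIES). A minimal single-tensor sketch of that procedure (an illustration of the method, not mergekit's actual implementation; the function name and the normalization detail are assumptions):

```python
import torch

def dare_ties_merge(base, task_deltas, densities, weights, normalize=True):
    """Illustrative DARE TIES merge of a single parameter tensor.

    base        -- the base model's tensor for this parameter
    task_deltas -- per-model task vectors (fine-tuned tensor minus base)
    densities   -- per-model fraction of delta entries to keep
    weights     -- per-model blending weights
    """
    rescaled = []
    for delta, density, weight in zip(task_deltas, densities, weights):
        # DARE: drop each entry with probability (1 - density), then
        # rescale the survivors by 1/density to preserve the expectation.
        mask = torch.bernoulli(torch.full_like(delta, density))
        rescaled.append(weight * mask * delta / density)

    stacked = torch.stack(rescaled)
    # TIES: elect a per-element sign from the summed weighted deltas...
    elected = torch.sign(stacked.sum(dim=0))
    # ...and keep only the contributions that agree with the election.
    agree = (torch.sign(stacked) == elected).to(stacked.dtype)
    merged = (stacked * agree).sum(dim=0)
    if normalize:
        # Normalize by the weight mass that actually contributed per element.
        w = torch.tensor(weights, dtype=stacked.dtype)
        w = w.view(-1, *([1] * (stacked.dim() - 1)))
        merged = merged / (w * agree).sum(dim=0).clamp(min=1e-8)
    return base + merged
```

With `density: 1.0`, as in several slices of the configuration below, the DARE dropout is a no-op and the step reduces to plain TIES; the `int8_mask: 1.0` option stores the intermediate masks as 8-bit integers to save memory.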

### Models Merged

The following models were included in the merge:

* [luaqi/llama_01141](https://huggingface.co/luaqi/llama_01141)
* [RoyJoy/llama-jan16](https://huggingface.co/RoyJoy/llama-jan16)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: tensoralchemistdev01/bb17
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 4]
    model: luaqi/llama_01141
    parameters:
      density: 0.9822166935301276
      weight: 0.4836395670729823
  - layer_range: [0, 4]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.5618096971607989
      weight: 0.5910112056710738
  - layer_range: [0, 4]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [4, 8]
    model: luaqi/llama_01141
    parameters:
      density: 0.8590193339521722
      weight: 0.5055715642231858
  - layer_range: [4, 8]
    model: RoyJoy/llama-jan16
    parameters:
      density: 1.0
      weight: 0.8577299022769803
  - layer_range: [4, 8]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [8, 12]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.3671144555754953
  - layer_range: [8, 12]
    model: RoyJoy/llama-jan16
    parameters:
      density: 1.0
      weight: 0.5833008120360145
  - layer_range: [8, 12]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [12, 16]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.7579582648893424
  - layer_range: [12, 16]
    model: RoyJoy/llama-jan16
    parameters:
      density: 1.0
      weight: 0.6187487781035264
  - layer_range: [12, 16]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [16, 20]
    model: luaqi/llama_01141
    parameters:
      density: 0.9862014619206753
      weight: 0.2660917543704695
  - layer_range: [16, 20]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.6495434917253259
      weight: 0.8139610308413384
  - layer_range: [16, 20]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [20, 24]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.39331898026975115
  - layer_range: [20, 24]
    model: RoyJoy/llama-jan16
    parameters:
      density: 1.0
      weight: 0.799505715498531
  - layer_range: [20, 24]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [24, 28]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.5083181461601151
  - layer_range: [24, 28]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.8589109963600738
      weight: 0.28719026544614595
  - layer_range: [24, 28]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [28, 32]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.3953601387700516
  - layer_range: [28, 32]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.7172849243353902
      weight: 0.45318771937126756
  - layer_range: [28, 32]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [32, 36]
    model: luaqi/llama_01141
    parameters:
      density: 0.9900255060816008
      weight: 0.4734312560215305
  - layer_range: [32, 36]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.7287510233989866
      weight: 0.6631914136579918
  - layer_range: [32, 36]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [36, 40]
    model: luaqi/llama_01141
    parameters:
      density: 1.0
      weight: 0.38286836007087965
  - layer_range: [36, 40]
    model: RoyJoy/llama-jan16
    parameters:
      density: 1.0
      weight: 0.528846962020614
  - layer_range: [36, 40]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [40, 44]
    model: luaqi/llama_01141
    parameters:
      density: 0.8658528465260166
      weight: 0.6223779596324325
  - layer_range: [40, 44]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.9092912617731014
      weight: 0.5145684051474502
  - layer_range: [40, 44]
    model: tensoralchemistdev01/bb17
- sources:
  - layer_range: [44, 48]
    model: luaqi/llama_01141
    parameters:
      density: 0.5058899007539202
      weight: 0.7719998499654174
  - layer_range: [44, 48]
    model: RoyJoy/llama-jan16
    parameters:
      density: 0.625169391261857
      weight: 0.16955077412057684
  - layer_range: [44, 48]
    model: tensoralchemistdev01/bb17
```
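
The merge can be reproduced by saving the YAML above to a file and either running mergekit's `mergekit-yaml` CLI against it or calling its Python entry point. A minimal sketch of the latter, following mergekit's documented API (the `config.yaml` filename and `./final_model` output path are assumptions):

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (filename is an assumption).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the DARE TIES merge and write the result to ./final_model.
run_merge(
    merge_config,
    "./final_model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
    ),
)
```

The equivalent CLI invocation is `mergekit-yaml config.yaml ./final_model`.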

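The merged model loads like any other bfloat16 causal LM with `transformers`. A minimal sketch, assuming the local output path from the reproduction step above (substitute the published Hub repo id to load remotely):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./final_model"  # or the published Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```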