---
base_model:
- 152334H/miqu-1-70b-sf
license: unknown
language:
- en
pipeline_tag: text-generation
tags:
- merge
- frankenmerge
- 96b
---
# BigWeave v28 96b

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>

The BigWeave models aim to experimentally identify merge settings for increasing model performance. The version number merely tracks various attempts and is not a quality indicator. Only results demonstrating good performance are retained and shared.

# Prompting Format
ChatML, Mistral, and Vicuna.
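
For reference, here is a minimal ChatML generation sketch using 🤗 Transformers. The repo id and system prompt are assumptions; note that at float16 a 96b model needs roughly 190 GB of memory, so quantized variants are usually more practical:

```python
# Minimal usage sketch (assumes the repo id "llmixer/BigWeave-v28-96b";
# adjust to wherever the weights are hosted).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llmixer/BigWeave-v28-96b"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# ChatML wraps each turn in <|im_start|>/<|im_end|> markers.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nExplain what a frankenmerge is.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```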

# Merge process
This is a self-merge of 152334H/miqu-1-70b-sf. The middle slices are a uniform six layers wide, and each slice overlaps its neighbors by two layers; the first and last slices are larger (layers 0-11 and 70-79), giving 112 layers in total. See [this discussion](https://huggingface.co/llmixer/BigWeave-v16-103b/discussions/2). A sketch for sanity-checking and running the merge follows the configuration below.

Merge configuration:
```yaml
slices:
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [0,12]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [10,16]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [14,20]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [18,24]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [22,28]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [26,32]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [30,36]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [34,40]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [38,44]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [42,48]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [46,52]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [50,56]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [54,60]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [58,64]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [62,68]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [66,72]
  - sources:
    - model: 152334H/miqu-1-70b-sf
      layer_range: [70,80]
merge_method: passthrough
dtype: float16
```
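
To reproduce the merge, this configuration can be fed to [mergekit](https://github.com/arcee-ai/mergekit), either via the CLI (`mergekit-yaml config.yml ./output`) or its Python entry point. The sketch below, assuming mergekit's documented Python API and that the configuration is saved as `bigweave-v28.yml` (an assumed filename), first verifies the slice arithmetic and then runs the passthrough merge:

```python
# Sketch: verify the slice layout, then run the merge with mergekit.
# Assumes mergekit is installed (pip install mergekit) and the config
# above is saved as bigweave-v28.yml.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("bigweave-v28.yml", encoding="utf-8") as f:
    raw = yaml.safe_load(f)

# layer_range uses half-open [start, end) indexing, so each middle slice
# contributes 6 layers and adjacent slices share 2 layers.
ranges = [s["sources"][0]["layer_range"] for s in raw["slices"]]
total = sum(end - start for start, end in ranges)
print(f"{len(ranges)} slices, {total} layers in the merged model")  # 17 slices, 112 layers

run_merge(
    MergeConfiguration.model_validate(raw),
    out_path="./BigWeave-v28-96b",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
    ),
)
```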