---
base_model: v000000/L3-11.5B-DuS-FrankenRoot
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- llama
---

# v000000/L3-11.5B-DuS-MoonRoot-Q6_K
This model was converted to GGUF format from [`v000000/L3-11.5B-DuS-MoonRoot`](https://huggingface.co/v000000/L3-11.5B-DuS-MoonRoot) using llama.cpp.
Refer to the [original model card](https://huggingface.co/v000000/L3-11.5B-DuS-MoonRoot) for more details on the model.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/lNgAEcW3pWd6x0x-4C3q1.png)
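
To run the quantized file locally, something like the following should work (a minimal sketch; the exact `.gguf` file name inside the repo is an assumption, so check the repo's file list and adjust):

```bash
# Download the Q6_K GGUF from the Hugging Face repo (file name assumed; verify in the repo).
huggingface-cli download v000000/L3-11.5B-DuS-MoonRoot-Q6_K \
  L3-11.5B-DuS-MoonRoot-Q6_K.gguf --local-dir .

# Run a quick completion with llama.cpp's CLI.
./llama-cli -m ./L3-11.5B-DuS-MoonRoot-Q6_K.gguf -p "Hello," -n 128 --temp 0.8
```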

# Pretty good understanding: it gets the percentage wrong but catches that the cat is already dead.
```bash
user: A dead cat is placed into a box along with a nuclear isotope, a vial of poison and a radiation detector.
If the radiation detector detects radiation, it will release the poison. The box is opened one day later.
What is the probability of the cat being alive?

assistant: The answer is 100%. Since the cat is already dead when it was placed in the box,
there is no possibility for it to be alive when the box is opened...
```

Shows emergent language-nuance abilities similar to the 8B models it was built from.

Unaligned and somewhat lazy.

---
base_model:
- Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B
- v000000/L3-8B-Poppy-Moonfall-C
library_name: transformers
tags:
- mergekit
- merge
- llama
---
### Llama-3-11.5B-Depth-Upscaled-MoonRoot
Experiment; no continued finetuning.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* [Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B](https://huggingface.co/Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B)
* [v000000/L3-8B-Poppy-Moonfall-C](https://huggingface.co/v000000/L3-8B-Poppy-Moonfall-C)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
    - model: v000000/L3-8B-Poppy-Moonfall-C
      layer_range: [0, 24]
  - sources:
    - model: Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B
      layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
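
The two slices contribute 24 layers each (layers 0-23 of Poppy-Moonfall-C and layers 8-31 of MopeyMule-Blackroot), so the stacked model has 48 transformer layers instead of the 32 in Llama-3-8B; that extra depth is roughly where the 11.5B parameter count comes from. To reproduce the merge, the config above can be fed to mergekit's CLI. A minimal sketch (output path and flags are illustrative, not from the original card):

```bash
# Save the YAML above as moonroot.yaml, then run the passthrough merge.
pip install mergekit
mergekit-yaml moonroot.yaml ./L3-11.5B-DuS-MoonRoot --copy-tokenizer
```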

---

# Prompt Template:
```bash
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{output}<|eot_id|>

```
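
This is the standard Llama-3 Instruct format. As a sketch of how it might be wired up with llama.cpp (same assumed local file name as above; llama.cpp normally parses the `<|...|>` markers as special tokens):

```bash
# One-shot chat-formatted prompt; -e expands the \n escapes before tokenization.
./llama-cli -m ./L3-11.5B-DuS-MoonRoot-Q6_K.gguf -e -n 256 --temp 0.8 \
  -p "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nYou are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nHello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
```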