---
base_model: v000000/L3-11.5B-DuS-MoonRoot
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- llama
---
# v000000/L3-11.5B-DuS-MoonRoot-Q6_K
This model was converted to GGUF format from [`v000000/L3-11.5B-DuS-MoonRoot`](https://huggingface.co/v000000/L3-11.5B-DuS-MoonRoot) using llama.cpp.
Refer to the [original model card](https://huggingface.co/v000000/L3-11.5B-DuS-MoonRoot) for more details on the model.
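A minimal sketch of downloading and running this quant locally with [llama.cpp](https://github.com/ggerganov/llama.cpp); the repository ID is taken from the title above and the exact GGUF filename is an assumption, so adjust both to the actual file:
```bash
# Download the Q6_K file from this repo (filename is illustrative)
huggingface-cli download v000000/L3-11.5B-DuS-MoonRoot-Q6_K \
  l3-11.5b-dus-moonroot-q6_k.gguf --local-dir .

# Run an interactive session with a llama.cpp build
./llama-cli -m l3-11.5b-dus-moonroot-q6_k.gguf -c 8192 -n 256 \
  -p "You are a helpful assistant."
```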
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f74b6e6389380c77562762/lNgAEcW3pWd6x0x-4C3q1.png)
# Pretty good understanding: the model grasps that the cat is already dead, but reports the wrong percentage (100% rather than 0%).
```bash
user: A dead cat is placed into a box along with a nuclear isotope, a vial of poison and a radiation detector.
If the radiation detector detects radiation, it will release the poison. The box is opened one day later.
What is the probability of the cat being alive?
assistant: The answer is 100%. Since the cat is already dead when it was placed in the box, there is no possibility for it to be alive when the box is opened...
```
---
base_model:
- Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B
- v000000/L3-8B-Poppy-Moonfall-C
library_name: transformers
tags:
- mergekit
- merge
- llama
---
### Llama-3-11.5B-Depth-Upscaled-MoonRoot
Experiment, no continued finetuning.
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B](https://huggingface.co/Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B)
* [v000000/L3-8B-Poppy-Moonfall-C](https://huggingface.co/v000000/L3-8B-Poppy-Moonfall-C)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - model: v000000/L3-8B-Poppy-Moonfall-C
    layer_range: [0, 24]
- sources:
  - model: Cas-Warehouse/Llama-3-MopeyMule-Blackroot-8B
    layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
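The passthrough method simply stacks the selected slices: 24 layers from Poppy-Moonfall-C followed by 24 layers from MopeyMule-Blackroot yield a 48-layer model (versus 32 in a Llama-3-8B), which is roughly where the 11.5B parameter count comes from. A minimal sketch of reproducing the merge with the mergekit CLI, assuming the YAML above is saved as `moonroot.yaml` (filename and output directory are illustrative):
```bash
pip install mergekit

# Stack the two layer ranges into a single 48-layer checkpoint
mergekit-yaml moonroot.yaml ./L3-11.5B-DuS-MoonRoot
```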
---
# Prompt Template:
```bash
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>
{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
{output}<|eot_id|>
```
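A filled-in instance of the template, with illustrative system, user, and assistant text in place of the placeholders:
```bash
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>
What is the capital of France?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
The capital of France is Paris.<|eot_id|>
```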