|
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

# Tippy-Toppy-7b - GGUF

- Model creator: https://huggingface.co/Azazelle/
- Original model: https://huggingface.co/Azazelle/Tippy-Toppy-7b/
|
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Tippy-Toppy-7b.Q2_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q2_K.gguf) | Q2_K | 2.53GB |
| [Tippy-Toppy-7b.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [Tippy-Toppy-7b.IQ3_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [Tippy-Toppy-7b.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Tippy-Toppy-7b.IQ3_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [Tippy-Toppy-7b.Q3_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K.gguf) | Q3_K | 3.28GB |
| [Tippy-Toppy-7b.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Tippy-Toppy-7b.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Tippy-Toppy-7b.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Tippy-Toppy-7b.Q4_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Tippy-Toppy-7b.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Tippy-Toppy-7b.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Tippy-Toppy-7b.Q4_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K.gguf) | Q4_K | 4.07GB |
| [Tippy-Toppy-7b.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Tippy-Toppy-7b.Q4_1.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Tippy-Toppy-7b.Q5_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Tippy-Toppy-7b.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Tippy-Toppy-7b.Q5_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K.gguf) | Q5_K | 4.78GB |
| [Tippy-Toppy-7b.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Tippy-Toppy-7b.Q5_1.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Tippy-Toppy-7b.Q6_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q6_K.gguf) | Q6_K | 5.53GB |
| [Tippy-Toppy-7b.Q8_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q8_0.gguf) | Q8_0 | 7.17GB |
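
As a minimal sketch of local usage (not part of the original card): assuming the `huggingface_hub` and `llama-cpp-python` packages are installed, one of the files above can be downloaded and run like this. The Q4_K_M pick, context size, and prompt are illustrative choices only.

```python
# Sketch: download one quantized GGUF file and run it with llama-cpp-python.
# Assumes: pip install huggingface_hub llama-cpp-python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a single quant from the repo (Q4_K_M is a common size/quality trade-off).
model_path = hf_hub_download(
    repo_id="RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf",
    filename="Tippy-Toppy-7b.Q4_K_M.gguf",
)

# Load the model; n_ctx is an illustrative context window, not a recommendation.
llm = Llama(model_path=model_path, n_ctx=4096)

# Run a short completion and print the generated text.
output = llm("Q: What does a 7B parameter model do? A:", max_tokens=128)
print(output["choices"][0]["text"])
```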
|
|
|
|
|
|
|
|
|
Original model description: |
|
---
pipeline_tag: text-generation
tags:
- mistral
- merge
license: cc-by-4.0
---
|
# Model Card for Tippy-Toppy-7b |
|
|
A DARE merge intended to build on Toppy-M-7b.
|
|
|
The mergekit `.yaml` configuration used for the merge:
|
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: Undi95/Toppy-M-7B #175
    parameters:
      weight: 0.54
      density: 0.81
  - model: PistachioAlt/Noromaid-Bagel-7B-Slerp #75
    parameters:
      weight: 0.23
      density: 0.61
  - model: OpenPipe/mistral-ft-optimized-1227 #100
    parameters:
      weight: 0.31
      density: 0.68
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```
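
To reproduce the merge itself (as opposed to running the quantizations above), mergekit's `mergekit-yaml` entry point typically takes a config file like this plus an output directory, e.g. `mergekit-yaml config.yaml ./merged-model`; check the mergekit README for the current CLI options, as flags may change between versions.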
|
|
|
|