---
base_model:
- huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
- IlyaGusev/saiga_llama3_8b
- lightblue/suzume-llama-3-8B-multilingual
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25
- lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75
library_name: transformers
tags:
- mergekit
- merge
- bfloat16
- safetensors
- 8b
- chat
- conversational
language:
- de
- en
- es
- fr
- hi
- it
- ja
- pt
- ru
- th
- zh
---
[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)
# QuantFactory/Multilingual-SaigaSuzume-8B-GGUF
This is a quantized version of [Khetterman/Multilingual-SaigaSuzume-8B](https://huggingface.co/Khetterman/Multilingual-SaigaSuzume-8B), created using llama.cpp.
# Original Model Card
# Multilingual-SaigaSuzume-8B
>Your words are like rain falling from heaven on a tower in a sinful land; can anyone in Babylon understand them?
![Multilingual-SaigaSuzume-8B-Logo256.png](https://cdn-uploads.huggingface.co/production/uploads/673125091920e70ac26c8a2e/aVbK8k3mUMBAOlUSXBK91.png)
This model was created as a multilingual foundation for other models, and I think it will be very useful as an integral part of your own merges. There is some censorship; keep this in mind.
## Merge Details
### Method
This is a simple but useful merge of **7 cool models**, created using [mergekit](https://github.com/arcee-ai/mergekit).
### Models
The following models were included in the merge:
* [huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated](https://huggingface.co/huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated)
* [IlyaGusev/saiga_llama3_8b](https://huggingface.co/IlyaGusev/saiga_llama3_8b)
* [lightblue/suzume-llama-3-8B-multilingual](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25)
* [lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75](https://huggingface.co/lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75)
### Configuration
The following YAML configurations were used to produce this model:
```yaml
# Multilingual-SaigaSuzume-8B-BFH
models:
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
---
# Multilingual-SaigaSuzume-8B-BTP
models:
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
---
# Multilingual-SaigaSuzume-8B-Classic
models:
  - model: IlyaGusev/saiga_llama3_8b
  - model: lightblue/suzume-llama-3-8B-multilingual
merge_method: model_stock
base_model: huihui-ai/Meta-Llama-3.1-8B-Instruct-abliterated
dtype: bfloat16
---
# Multilingual-SaigaSuzume-8B
models:
  - model: Multilingual-SaigaSuzume-8B-BFH
  - model: Multilingual-SaigaSuzume-8B-BTP
merge_method: model_stock
base_model: Multilingual-SaigaSuzume-8B-Classic
dtype: bfloat16
```
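For intuition, the `model_stock` method merges each tensor by moving from the base model toward the average of the fine-tuned models, with the interpolation ratio derived from how well the fine-tuned models' task vectors agree with each other. The sketch below is a simplified, single-tensor illustration of that idea (not mergekit's actual implementation); the function name and the use of NumPy arrays in place of real model weights are assumptions for the example.

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Simplified single-tensor sketch of a model_stock-style merge.

    Task vectors are the fine-tuned weights minus the base weights. The
    average pairwise cosine similarity between them sets how far the merged
    tensor moves from the base toward the fine-tuned average: highly aligned
    task vectors pull the result toward the average, while conflicting ones
    keep it close to the base.
    """
    k = len(finetuned)
    deltas = [w - base for w in finetuned]

    # Average pairwise cosine similarity between task vectors.
    cosines = []
    for i in range(k):
        for j in range(i + 1, k):
            a, b = deltas[i].ravel(), deltas[j].ravel()
            cosines.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    cos_theta = float(np.mean(cosines))

    # Interpolation ratio between the base and the fine-tuned average.
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base
```

In the configurations above this happens in two stages: three intermediate `model_stock` merges (BFH, BTP, Classic) are produced first, then BFH and BTP are merged again with Classic as the base to yield the final model.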
>My thanks to the authors of the original models, your work is incredible. Have a good time 🖤