---
library_name: transformers
tags:
- deutsch
- german
- seedbox
- mistral
- mixtral
license: apache-2.0
datasets:
- seedboxai/multitask_german_examples_32k
- seedboxai/ultra_feedback_german_modified_v1
language:
- de
pipeline_tag: text-generation
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/645ded34a45b4182d7f5c385/9QywLGTbRrHYSq-m6fQmJ.jpeg)
# KafkaLM-8x7b-German-V0.1
**KafkaLM 8x7b** is a MoE model based on [Mistral AI's Mixtral 8x7b](https://mistral.ai/news/mixtral-of-experts/), fine-tuned on an ensemble of popular high-quality open-source instruction sets (translated from English to German).
KafkaLM 8x7b is a [Seedbox](https://huggingface.co/seedboxai) project trained by [Dennis Dickmann](https://huggingface.co/doubledsbv).
**Why Kafka?**
The models are proficient yet creative, with a tendency to push linguistic boundaries 😊
## Model Details
The purpose of releasing the **KafkaLM series** is to contribute to the German AI community with a set of fine-tuned LLMs that are easy to use in everyday applications across a variety of tasks.
The main goal was to provide LLMs proficient in German, especially to be used in German-speaking business contexts where English alone is not sufficient.
### DPO
The model has been aligned via DPO with a German, modified version of the UltraFeedback dataset from Hugging Face ([seedboxai/ultra_feedback_german_modified_v1](https://huggingface.co/datasets/seedboxai/ultra_feedback_german_modified_v1)).
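For intuition, DPO trains the policy to prefer the chosen answer over the rejected one relative to a frozen reference model. The following is a minimal, illustrative sketch of the per-pair DPO loss, not the actual training code used for this model:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of the chosen/rejected
    completion under the trained policy or the frozen reference model.
    """
    # Implicit rewards: how much more (or less) likely the policy makes
    # each completion compared to the reference model
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)

    # -log(sigmoid(margin)): small when the policy clearly prefers
    # the chosen completion over the rejected one
    margin = chosen_reward - rejected_reward
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

In practice this objective is optimized over batches of preference pairs (e.g. with the `DPOTrainer` from the `trl` library); the sketch only shows the scalar loss.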
### Dataset
I used an 8k filtered version of [seedboxai/multitask_german_examples_32k](https://huggingface.co/datasets/seedboxai/multitask_german_examples_32k) for supervised fine-tuning.
### Prompt Format
This model uses the following prompt format:
```
<|system|>
Du bist ein freundlicher und hilfsbereiter KI-Assistent. Du beantwortest Fragen faktenorientiert und präzise, ohne dabei relevante Fakten auszulassen.</s>
<|user|>
Welche Möglichkeiten der energetischen Sanierung habe ich neben Solar und Energiespeicher?</s>
<|assistant|>
```
### Inference
Getting started with the model is straightforward:
```python
import transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "seedboxai/KafkaLM-8x7B-German-V0.1-DPO"

# load_in_4bit=True requires the bitsandbytes package
model = AutoModelForCausalLM.from_pretrained(model_id, load_in_4bit=True, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

def generate_prompt(input):
    prompt = ''
    sys_prompt = "Du bist ein freundlicher und hilfsbereiter KI-Assistent. Du beantwortest Fragen faktenorientiert und präzise, ohne dabei relevante Fakten auszulassen."

    prompt += f"<|system|>\n{sys_prompt.strip()}</s>\n"
    prompt += f"<|user|>\n{input.strip()}</s>\n"
    prompt += f"<|assistant|>\n"

    return prompt.strip()

generate_text = transformers.pipeline(
    model=model, tokenizer=tokenizer,
    return_full_text=True,
    task='text-generation',
    temperature=0.5,
    max_new_tokens=512,
    top_p=0.95,
    top_k=50,
    do_sample=True,
)

print(generate_text(generate_prompt("Wer ist eigentlich dieser Kafka?")))
```
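Because `return_full_text=True` echoes the prompt back, the assistant's answer has to be sliced out of the generated string. A small helper for this, assuming the prompt template above (this helper is not part of the official repo):

```python
def extract_reply(generated_text):
    """Return only the assistant's answer from the pipeline output.

    Splits on the final <|assistant|> marker of the prompt template
    and strips a trailing </s> end-of-sequence token if present.
    """
    reply = generated_text.split("<|assistant|>")[-1]
    return reply.replace("</s>", "").strip()
```

Apply it to the `generated_text` field of each pipeline result, e.g. `extract_reply(outputs[0]["generated_text"])`.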
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model.
This model should only be used for research purposes. The original Mixtral license and all restrictions of the datasets used to train this model apply.