---
license: other
license_name: tongyi-qianwen-research
license_link: >-
  https://huggingface.co/Qwen/Qwen1.5-32B/blob/main/LICENSE
language:
- de
- en
tags:
- sft
- dpo
---

![SauerkrautLM](https://vago-solutions.ai/wp-content/uploads/2024/04/SauerkrautLM-Qwen-32b.png "SauerkrautLM-Qwen-32b")
## VAGO solutions SauerkrautLM-Qwen-32b
Introducing **SauerkrautLM-Qwen-32b** – our Sauerkraut version of the powerful [Qwen/Qwen1.5-32B](https://huggingface.co/Qwen/Qwen1.5-32B)!
**It is an early-stage finetuned model and should be used with caution!**

The model **SauerkrautLM-Qwen-32b** is a **joint effort** between **VAGO solutions** and **Hyperspace.ai**.

- Finetuned with **SFT**
- Aligned with **DPO**

# Table of Contents
1. [Overview of all SauerkrautLM-Qwen-32b](#all-sauerkrautlm-qwen-32b)
2. [Model Details](#model-details)
   - [Prompt template](#prompt-template)
   - [Training procedure](#training-procedure)
3. [Evaluation](#evaluation)
4. [Disclaimer](#disclaimer)
5. [Contact](#contact)
6. [Collaborations](#collaborations)
7. [Acknowledgement](#acknowledgement)

## All SauerkrautLM-Qwen-32b

| Model | HF | EXL2 | GGUF | AWQ |
|-------|-------|-------|-------|-------|
| SauerkrautLM-Qwen-32b | [Link](https://huggingface.co/VAGOsolutions/SauerkrautLM-Qwen-32b) | coming soon | coming soon | coming soon |

## Model Details
**SauerkrautLM-Qwen-32b**
- **Model Type:** SauerkrautLM-Qwen-32b is a finetuned model based on [Qwen/Qwen1.5-32B](https://huggingface.co/Qwen/Qwen1.5-32B)
- **Language(s):** German, English
- **License:** [tongyi-qianwen-research](https://huggingface.co/Qwen/Qwen1.5-32B/blob/main/LICENSE)
- **Contact:** [VAGO solutions](https://vago-solutions.ai), [Hyperspace.ai](https://hyperspace.computer/)

### Training procedure:
We trained this model for 2 epochs on 160k data samples with SFT.
Afterwards, we applied DPO on 110k data samples.

**We taught this model German language skills.** As far as we know, it is the first Qwen 32B model with bilingual skills in German and English. Nevertheless, formulations may occur that are not entirely correct (still a work in progress).

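This card does not document the exact training stack. Purely as an illustrative sketch of the DPO step described above, using the TRL library (the choice of TRL, the dataset name, and all hyperparameters below are assumptions, not details from this card):

```python
# Illustrative sketch only -- the card does not specify the training framework.
# Assumptions: TRL's DPOTrainer, a placeholder preference dataset with
# "prompt"/"chosen"/"rejected" columns, and arbitrary hyperparameters.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base = "VAGOsolutions/SauerkrautLM-Qwen-32b"   # in practice, DPO would start from the SFT checkpoint
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(base)

# Placeholder dataset name; roughly 110k preference pairs as described above
train_dataset = load_dataset("my-org/preference-pairs", split="train")

args = TrainingArguments(
    output_dir="sauerkraut-dpo",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=5e-7,
    num_train_epochs=1,
    bf16=True,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,          # TRL builds a frozen reference copy when None is passed
    args=args,
    beta=0.1,                # strength of the KL penalty toward the reference model
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```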

### Prompt Template:

**English:**
```
<|im_start|>system
You are SauerkrautLM, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```

**German:**
```
<|im_start|>system
Du bist SauerkrautLM, ein hilfreicher und freundlicher KI-Assistent.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
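Both templates follow the ChatML format. As a minimal usage sketch (not part of the original card), the prompt can be built via the tokenizer's chat template in 🤗 Transformers; whether the repository ships such a template, as well as the generation settings below, are assumptions:

```python
# Minimal inference sketch -- assumes the repository's tokenizer ships a ChatML
# chat template matching the prompt format shown above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "VAGOsolutions/SauerkrautLM-Qwen-32b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [
    {"role": "system", "content": "You are SauerkrautLM, a helpful AI assistant."},
    {"role": "user", "content": "Summarize what makes this model bilingual."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# max_new_tokens is an arbitrary illustrative value
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```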
### Example output in German:

```
Die Karte zeigte eine verborgene Höhle, die in den umliegenden Bergen zu finden war. Mia war von der Idee fasziniert, diese Höhle zu entdecken und ihre Geheimnisse zu lüften. Sie nahm die Karte mit sich und machte sich auf den Weg, um die Höhle zu finden.
Die Wanderung zu den Bergen war eine Herausforderung, aber Mia war fest entschlossen, ihr Abenteuer zu vollenden. Sie überwand steinige Wege und überquerte klirrende Bäche, die ihre Füße kühlten und ihr die Energie für den Rest des Weges gab.
Endlich erreichte Mia die Höhle, die von einem dichten Wald umgeben war. Die Höhle war ein Ort der Geheimnisse und des Staunens, der ihr Herz höher schlagen ließ. Sie betrat die Höhle, und die Dunkelheit umhüllte sie wie ein Schleier aus Stille.
In der Höhle fand Mia eine alte Schatzkiste, die mit einem alten, verwitterten Holz verziert war. Mit zitternden Händen öffnete sie die Schatzkiste und fand darin eine alte, zerfledderte Schriftrolle. Die Schriftrolle war ein geheimnisvolles Artefakt, das ihr die Geschichte der Höhle offenbarte.
```

## Evaluation

**Open LLM Leaderboard:**

| Metric | Value |
|-----------------------|---------------------------|
| Avg. | **73.11** |
| ARC (25-shot) | 59.22 |
| HellaSwag (10-shot) | 82.32 |
| MMLU (5-shot) | 74.40 |
| TruthfulQA (0-shot) | 61.03 |
| Winogrande (5-shot) | 82.16 |
| GSM8K (5-shot) | 79.53 |

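The reported average is simply the arithmetic mean of the six benchmark scores; a quick check (not part of the original card):

```python
# Sanity check: the Open LLM Leaderboard average is the mean of the six scores.
scores = {
    "ARC (25-shot)": 59.22,
    "HellaSwag (10-shot)": 82.32,
    "MMLU (5-shot)": 74.40,
    "TruthfulQA (0-shot)": 61.03,
    "Winogrande (5-shot)": 82.16,
    "GSM8K (5-shot)": 79.53,
}
print(round(sum(scores.values()) / len(scores), 2))  # 73.11
```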
## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out, and we cannot guarantee consistently appropriate behavior.
Therefore, if you encounter any issues or come across inappropriate content, we kindly request that you inform us through the contact information provided.
Additionally, it is essential to understand that the licensing of these models does not constitute legal advice. We are not held responsible for the actions of third parties who utilize our models.

## Contact
If you are interested in customized LLMs for business applications, please get in contact with us via our websites. We are also grateful for your feedback and suggestions.

## Collaborations
We are also keenly seeking support and investment for our startups, VAGO solutions and Hyperspace.ai, where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us at [VAGO solutions](https://vago-solutions.de/#Kontakt) or [Hyperspace.computer](https://hyperspace.computer/).

## Acknowledgement
Many thanks to [Qwen](https://huggingface.co/Qwen) for providing such a valuable model to the open-source community.