DavidGF committed on
Commit
5a0ce35
1 Parent(s): b9d457f

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -16,7 +16,7 @@ tags:
 ![SauerkrautLM](images/hero.png "SauerkrautLM-7b-HerO")
 ## VAGO solutions SauerkrautLM-7b-HerO
 Introducing **SauerkrautLM-7b-HerO** – the pinnacle of German language model technology!
-Crafted through the **merging** of **[Teknium's OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)** and **[Open-Orca's Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)**, this model is **uniquely fine-tuned with the Sauerkraut dataset.**
+Crafted through the **merging** of **[Teknium's OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)** and **[Open-Orca's Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)** and **uniquely fine-tuned with the Sauerkraut dataset.**
 SauerkrautLM-7b-HerO represents a breakthrough in language modeling, achieving an optimal balance between extensive German data and essential international sources.
 This ensures the model not only excels in understanding the nuances of the German language but also retains its global capabilities.
 Harnessing the innovative power of the **gradient SLERP method from MergeKit**, we've achieved a groundbreaking fusion of two of the best-performing 7B models based on the Mistral framework.
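The gradient SLERP merge the README describes is driven by a MergeKit YAML config. As a hedged sketch only — the layer ranges and interpolation values below are illustrative assumptions, not the settings actually used for SauerkrautLM-7b-HerO — a slerp merge of the two named models could look like:

```yaml
# Hypothetical MergeKit slerp config -- parameter values are illustrative,
# not the authors' actual merge recipe.
slices:
  - sources:
      - model: teknium/OpenHermes-2.5-Mistral-7B
        layer_range: [0, 32]
      - model: Open-Orca/Mistral-7B-OpenOrca
        layer_range: [0, 32]
merge_method: slerp
base_model: teknium/OpenHermes-2.5-Mistral-7B
parameters:
  t:
    - filter: self_attn              # per-layer interpolation gradient for attention tensors
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp                    # mirrored gradient for MLP tensors
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5                     # default weight for all remaining tensors
dtype: bfloat16
```

The list-valued `t` is what makes the SLERP "gradient": MergeKit interpolates these anchor values across the layer stack, so early and late layers lean toward different parents. A config like this is run with MergeKit's `mergekit-yaml` command.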