Update README.md
README.md
CHANGED
@@ -20,6 +20,8 @@ base_model:
The finetunes used in this merge saw several hundred million tokens of completion, instruction, and roleplaying data. Kahneman-Tversky Optimization (KTO) was applied both to heal the model and to give it a unique output style.

+ This model can be considered inferior to [Aura-MoE-2x4B-v2](https://huggingface.co/AuraIndustries/Aura-MoE-2x4B-v2), which is a direct improvement.
+

Developed by **Aura Industries**, with contributions from **Anthracite Org**

## Model Details
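The card does not state which training stack was used for the KTO pass mentioned above; a minimal sketch of that kind of step with TRL's `KTOTrainer` might look like the following. The checkpoint name, dataset rows, and hyperparameters are placeholders for illustration, not the values used for this model.

```python
# Hedged sketch of a KTO fine-tuning pass with TRL; every name and value below
# is an assumption for illustration, not the actual Aura training recipe.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import KTOConfig, KTOTrainer

model_name = "AuraIndustries/Aura-MoE-2x4B"  # placeholder checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# KTO trains on unpaired feedback: each row is a prompt, a completion, and a
# boolean label marking the completion as desirable (True) or undesirable (False).
train_dataset = Dataset.from_dict(
    {
        "prompt": [
            "Write a short scene set on a night train.",
            "Write a short scene set on a night train.",
        ],
        "completion": [
            "The carriage rocked gently as the last town's lights slid past the window.",
            "Trains are vehicles that run on rails and carry passengers.",
        ],
        "label": [True, False],
    }
)

training_args = KTOConfig(
    output_dir="aura-kto-sketch",
    beta=0.1,                # strength of the implicit KL penalty toward the reference model
    desirable_weight=1.0,    # relative weight of desirable completions
    undesirable_weight=1.0,  # relative weight of undesirable completions
    per_device_train_batch_size=1,
    num_train_epochs=1,
)

trainer = KTOTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    processing_class=tokenizer,  # older TRL releases take tokenizer= instead
)
trainer.train()
```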