Update README.md
README.md
CHANGED
@@ -19,6 +19,8 @@ language:
 
 Lamarck-14B version 0.3 is strongly based on [arcee-ai/Virtuoso-Small](https://huggingface.co/arcee-ai/Virtuoso-Small) as a diffuse influence for prose and reasoning. Arcee's pioneering use of distillation and innovative merge techniques create a diverse knowledge pool for its models.
 
+Thanks go to @arcee-ai's team for the bounties of mergekit, and to @CultriX for the helpful examples of memory-efficient sliced merges and evolutionary merging. Their contribution of tinyevals on version 0.1 of Lamarck did much to validate and focus the build process of this model.
+
 ### Overview:
 
 - Two model_stocks used to begin specialized branches for reasoning and prose quality.
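The overview bullet retained in the diff above refers to mergekit's `model_stock` merge method. As a rough, hypothetical sketch only, the config below shows what one such branch-starting merge could look like in mergekit's YAML format; the contributing model names are placeholders I've invented for illustration, only arcee-ai/Virtuoso-Small is taken from the text above, and this is not the actual Lamarck recipe.

```yaml
# Hypothetical sketch of one branch-starting model_stock merge (e.g. the
# reasoning branch). Contributor names are placeholders, not Lamarck's recipe.
merge_method: model_stock
base_model: arcee-ai/Virtuoso-Small       # named above as the diffuse base influence
models:
  - model: example/reasoning-finetune-a   # placeholder contributor
  - model: example/reasoning-finetune-b   # placeholder contributor
dtype: bfloat16
```

A config like this is typically run with `mergekit-yaml <config.yaml> <output-dir>`; a second, analogous model_stock config would start the prose-quality branch, with the two branch outputs presumably feeding the later merge steps of the build.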