sometimesanotion committed
Commit 8c2510e · verified · 1 parent: 90b652d

Update README.md

Files changed (1):
  1. README.md +5 -5
README.md CHANGED
@@ -20,11 +20,11 @@ language:
 
 Lamarck-14B version 0.3 is strongly based on [arcee-ai/Virtuoso-Small](https://huggingface.co/arcee-ai/Virtuoso-Small) as a diffuse influence for prose and reasoning. Arcee's pioneering use of distillation and innovative merge techniques create a diverse knowledge pool for its models.
 
- The overall strategy:
- Two model_stocks used to begin specialized branches for reasoning and prose quality.
- For refinement on Virtuoso as a base model, DELLA and SLERP include the model_stocks while re-emphasizing selected ancestors.
- For integration, a SLERP merge of Virtuoso with the converged branches.
- For finalization and a little bit of abliteration, TIES with a light touch from [huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2](http://huggingface.co/huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2).
+ ### Merge Strategy:
+ - Two model_stocks used to begin specialized branches for reasoning and prose quality.
+ - For refinement on Virtuoso as a base model, DELLA and SLERP include the model_stocks while re-emphasizing selected ancestors.
+ - For integration, a SLERP merge of Virtuoso with the converged branches.
+ - For finalization and a little bit of abliteration, TIES with a light touch from [huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2](http://huggingface.co/huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2).
 
 ### Ancestor Models
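
For readers unfamiliar with how a pipeline like the one described in the changed section is usually expressed, below is a minimal sketch of what the final TIES step might look like as a mergekit-style YAML configuration. This is an illustrative assumption, not the actual recipe from this repository: the weights, densities, and the intermediate model path `merged/lamarck-integration` are hypothetical placeholders standing in for the converged SLERP output.

```yaml
# Hypothetical sketch of a TIES finalization step, not the repository's actual recipe.
merge_method: ties
base_model: arcee-ai/Virtuoso-Small
models:
  - model: merged/lamarck-integration   # placeholder for the integrated SLERP branch
    parameters:
      weight: 1.0
      density: 0.9
  - model: huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2
    parameters:
      weight: 0.1                        # "a light touch" of the abliterated model
      density: 0.3
dtype: bfloat16
```

In a staged merge of this kind, each earlier step (the model_stock branches, the DELLA and SLERP refinements, and the integrating SLERP) would typically be its own config of the same shape, with the output directory of one stage passed as an input model to the next.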