# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

It combines two of the best-performing Qwen2.5-0.5B models from the Open LLM Leaderboard.
### Merge Method

This model was merged using the Arcee Fusion merge method, with CoolSpring/Qwen2-0.5B-Abyme-merge3 as the base.
### Models Merged

The following models were included in the merge:

* FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
  - model: CoolSpring/Qwen2-0.5B-Abyme-merge3
    parameters:
      density: 0.53
      weight: 0.6
merge_method: arcee_fusion
base_model: CoolSpring/Qwen2-0.5B-Abyme-merge3
tokenizer_source: union
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
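The merge can be reproduced locally by feeding the configuration above to mergekit. The snippet below is a minimal sketch using mergekit's Python API; the config path and output directory are placeholder assumptions, and the `mergekit-yaml` command-line tool is an equivalent alternative.

```python
# Minimal sketch: reproduce this merge with mergekit's Python API.
# Assumes mergekit is installed (pip install mergekit) and that the
# YAML above is saved as "config.yaml"; paths are placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./Qwen2.5-0.5B-Abyss",  # placeholder output directory
    options=MergeOptions(
        cuda=False,           # set True to run the merge on GPU
        copy_tokenizer=True,  # save a tokenizer alongside the merged weights
        lazy_unpickle=True,   # reduce peak memory while loading shards
    ),
)
```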
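For quick local testing, the merged model can be loaded with transformers. This is a minimal sketch, assuming the model is published on the Hugging Face Hub as Novaciano/Qwen2.5-0.5B-Abyss and inherits a chat template from its Qwen2.5-Instruct ancestry.

```python
# Minimal sketch: load and query the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Novaciano/Qwen2.5-0.5B-Abyss"  # assumed Hub ID for this merge
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user", "content": "Explain what a model merge is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```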