# Untitled Model (1)
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DARE TIES merge method, with /home/metaai2/hyenjin/model/korean_model/Meta-Llama-3.1-8B-Instruct-abliterated/ as the base model.
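DARE TIES combines each model's task vector (its delta from the base weights) by randomly dropping a fraction of each delta's entries, rescaling the survivors, and then resolving sign conflicts TIES-style before summing. The `density` values in the configuration below control the fraction of delta parameters retained per model, and `weight` scales each model's contribution. The following is a minimal, illustrative sketch of the DARE drop-and-rescale step only, not mergekit's actual implementation; `dare_sparsify` and the toy tensors are hypothetical:

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Randomly keep a `density` fraction of a task vector's entries and
    rescale the survivors by 1/density so the expected value is unchanged."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

# Hypothetical usage: the task vector is finetuned weights minus base weights.
base = torch.zeros(4, 4)
finetuned = torch.randn(4, 4)
sparse_delta = dare_sparsify(finetuned - base, density=0.8)
merged = base + 0.5 * sparse_delta  # `weight` scales the model's contribution
```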
### Models Merged
The following models were included in the merge:
- /home/metaai2/hyenjin/model/korean_model/Llama-3-Alpha-Ko-8B-Instruct/
- /home/metaai2/hyenjin/model/korean_model/Ko-Llama-3-8B-Instruct/
- /home/metaai2/hyenjin/model/korean_model/Llama3-Ko-Carrot-8B-it/
- /home/metaai2/hyenjin/model/korean_model/Llama-3-Lumimaid-8B-v0.1/
- /home/metaai2/hyenjin/model/korean_model/KONI-Llama3-8B-Instruct-20240729/
- /home/metaai2/hyenjin/model/korean_model/llama-3-Korean-Bllossom-8B/
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /home/metaai2/hyenjin/model/korean_model/Meta-Llama-3.1-8B-Instruct-abliterated/
    parameters:
      density: 0.8
      weight: 0.5
  - model: /home/metaai2/hyenjin/model/korean_model/llama-3-Korean-Bllossom-8B/
    parameters:
      density: 0.7
      weight: 0.4
  - model: /home/metaai2/hyenjin/model/korean_model/KONI-Llama3-8B-Instruct-20240729/
    parameters:
      density: 0.6
      weight: 0.3
  - model: /home/metaai2/hyenjin/model/korean_model/Llama3-Ko-Carrot-8B-it/
    parameters:
      density: 0.5
      weight: 0.25
  - model: /home/metaai2/hyenjin/model/korean_model/Ko-Llama-3-8B-Instruct/
    parameters:
      density: 0.4
      weight: 0.2
  - model: /home/metaai2/hyenjin/model/korean_model/Llama-3-Alpha-Ko-8B-Instruct/
    parameters:
      density: 0.4
      weight: 0.2
  - model: /home/metaai2/hyenjin/model/korean_model/Llama-3-Lumimaid-8B-v0.1/
    parameters:
      density: 0.4
      weight: 0.2
merge_method: dare_ties
base_model: /home/metaai2/hyenjin/model/korean_model/Meta-Llama-3.1-8B-Instruct-abliterated/
parameters:
  int8_mask: true
dtype: bfloat16
```
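To reproduce the merge, save the configuration above to a file and run mergekit's CLI, e.g. `mergekit-yaml config.yml ./merged-model --cuda` (assuming mergekit is installed via `pip install mergekit`, and with the local model paths replaced by ones available on your machine). The output is a standard Llama-architecture checkpoint, so it can be loaded with transformers. A minimal sketch, assuming the merge was written to the hypothetical directory `./merged-model` and that `accelerate` is installed for `device_map="auto"`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # hypothetical output directory of the merge
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the `dtype` set in the merge config
    device_map="auto",
)

prompt = "안녕하세요! 자기소개를 해 주세요."  # Korean: "Hello! Please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```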