---
base_model: []
tags:
- mergekit
- merge
---
# airoboros-3.2-mixtral-zloss-merged

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:
* /home/hien/models/Mixtral-8x7B-Instruct-v0.1
* /home/hien/models/airoboros-3.2-mixtral-zloss

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /home/hien/models/Mixtral-8x7B-Instruct-v0.1
    parameters:
      weight: 0.5
  - model: /home/hien/models/airoboros-3.2-mixtral-zloss
    parameters:
      weight: 0.5
merge_method: linear
#merge_method: dare_ties
#base_model: ./extra_hdd/Mixtral-8x7B-v0.1
parameters:
  #normalize: false
  #int8_mask: true
dtype: bfloat16
```
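Conceptually, the linear merge method computes a per-parameter weighted average of the source models' weights (here 0.5 each). A minimal sketch, using plain Python lists to stand in for parameter tensors (the names `linear_merge`, `state_dicts`, and the toy values are illustrative, not mergekit's API):

```python
def linear_merge(state_dicts, weights, normalize=True):
    """Weighted average of model parameters (linear / model-soups-style merge).

    state_dicts: list of {param_name: list of floats} mappings, one per model.
    weights: per-model merge weights (0.5 and 0.5, as in the config above).
    normalize: divide by the weight sum so the result is a true average.
    """
    total = sum(weights) if normalize else 1.0
    merged = {}
    for name in state_dicts[0]:
        merged[name] = [
            sum(w * sd[name][i] for sd, w in zip(state_dicts, weights)) / total
            for i in range(len(state_dicts[0][name]))
        ]
    return merged

# Toy example: two "models" with a single parameter each, merged 50/50.
a = {"layer.weight": [1.0, 3.0]}
b = {"layer.weight": [3.0, 5.0]}
merged = linear_merge([a, b], [0.5, 0.5])
# merged["layer.weight"] == [2.0, 4.0]
```

In the actual merge, this averaging is applied elementwise to every tensor in the two checkpoints, and the result is cast to `bfloat16` per the `dtype` setting.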