---
license: cc-by-4.0
tags:
- merge
- moe
---
# Open_Gpt4
## VERSION 0.2 OUT NOW:

- Fp16:
- q8_0.gguf:
This model is a TIES merge of notux-8x7b-v1 and UNAversal-8x7B-v1beta, with MixtralOrochi8x7B as the base model.

I was very impressed with MixtralOrochi8x7B's performance and multifaceted use cases, as it is already a merge of many useful Mixtral models such as Mixtral instruct, Noromaid-v0.1-mixtral, openbuddy-mixtral, and possibly other models that were not named. My goal was to expand the model's capabilities and make it an even more useful model, maybe even competitive with closed-source models like GPT-4. But more testing is required for that. I hope the community can help me determine if it's deserving of its name. 😊
## Base model:

- MixtralOrochi8x7B

## Merged models:

- notux-8x7b-v1
- UNAversal-8x7B-v1beta
## Instruct template: Alpaca
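For reference, the standard Alpaca prompt format looks like this (the exact wording the model was tuned on may vary slightly):

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```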
## Merge config:
```yaml
models:
  - model: notux-8x7b-v1
    parameters:
      density: .5
      weight: 1
  - model: UNAversal-8x7B-v1beta
    parameters:
      density: .5
      weight: 1
merge_method: ties
base_model: MixtralOrochi8x7B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
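
If you want to reproduce the merge, a config like the one above can be run with mergekit's `mergekit-yaml` command; a minimal sketch, assuming the config is saved as `config.yml` and the three model paths resolve locally or on the Hugging Face Hub:

```sh
# Install mergekit, then run the TIES merge described by config.yml.
# The output directory name is arbitrary; --cuda is optional and needs a GPU.
pip install mergekit
mergekit-yaml config.yml ./Open_Gpt4-merged --cuda
```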