---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- unsloth/mistral-7b-v0.2
- mistralai/Mistral-7B-Instruct-v0.2
- quantized
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- chatml
base_model:
- unsloth/mistral-7b-v0.2
- mistralai/Mistral-7B-Instruct-v0.2
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# NeuralNovel/Mini-Mixtral-v0.2 AWQ
- Model creator: NeuralNovel
- Original model: Mini-Mixtral-v0.2
## Model Summary
Mini-Mixtral-v0.2 is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:

- [unsloth/mistral-7b-v0.2](https://huggingface.co/unsloth/mistral-7b-v0.2)
- [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
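
As a minimal usage sketch, the 4-bit AWQ weights can be loaded through `transformers` (with the `autoawq` package installed), formatting prompts with the ChatML template tagged above via `apply_chat_template`. Note that this card does not state the repository id of the quantized weights, so the `model_id` below is an assumption and should be replaced with the actual AWQ repo:

```python
# Sketch: load the 4-bit AWQ quant with transformers (requires autoawq + accelerate).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Mini-Mixtral-v0.2-AWQ"  # assumed repo id, not stated in this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place layers on the available GPU(s)
)

# The card tags ChatML, so let the tokenizer's chat template format the prompt.
messages = [{"role": "user", "content": "Explain what a Mixture of Experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Alternatively, the weights can be loaded directly with AutoAWQ's `AutoAWQForCausalLM.from_quantized`, which exposes the same `generate` interface.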