---
base_model: malhajar/Mixtral-8x7B-v0.1-turkish
language:
  - tr
  - en
pipeline_tag: text-generation
license: apache-2.0
model_type: mixtral
library_name: transformers
inference: false
---

# Mixtral 8x7B v0.1 Turkish

## Description

This repo contains GGUF format model files for malhajar's Mixtral 8x7B v0.1 Turkish.

## Original model

[malhajar/Mixtral-8x7B-v0.1-turkish](https://huggingface.co/malhajar/Mixtral-8x7B-v0.1-turkish)

## Original model description

malhajar/Mixtral-8x7B-v0.1-turkish is a fine-tuned version of Mixtral-8x7B-v0.1, trained with supervised fine-tuning (SFT). The model can answer questions in Turkish, as it was fine-tuned on a Turkish dataset, specifically alpaca-gpt4-tr.

## Quantization types

| Quantization method | Bits | Size    | Description                              | Recommended |
|---------------------|------|---------|------------------------------------------|-------------|
| Q3_K_S              | 3    | 20.4 GB | very small, high quality loss            |             |
| Q3_K_L              | 3    | 26.4 GB | small, substantial quality loss          |             |
| Q4_0                | 4    | 26.4 GB | legacy; small, very high quality loss    |             |
| Q4_K_M              | 4    | 28.4 GB | medium, balanced quality                 |             |
| Q5_0                | 5    | 33.2 GB | legacy; medium, balanced quality         |             |
| Q5_K_S              | 5    | 32.2 GB | large, low quality loss                  |             |
| Q5_K_M              | 5    | 33.2 GB | large, very low quality loss             |             |
| Q6_K                | 6    | 38.4 GB | very large, extremely low quality loss   |             |
| Q8_0                | 8    | 49.6 GB | very large, extremely low quality loss   |             |
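
Individual GGUF files can be fetched with `huggingface_hub`. Below is a minimal download sketch; the repo id and file name are assumptions based on this card's title and the table above, so adjust them to the actual entries listed under Files and versions.

```python
# Minimal download sketch using huggingface_hub.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="sayhan/Mixtral-8x7B-v0.1-turkish-GGUF",    # assumed repo id
    filename="mixtral-8x7b-v0.1-turkish.Q4_K_M.gguf",   # assumed file name; pick the quant you need
)
print(gguf_path)  # local path to the downloaded GGUF file
```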

## Prompt Template

```
### Instruction:
<prompt> (without the <>)
### Response:
```
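
For reference, here is a minimal inference sketch with `llama-cpp-python` that applies the template above to a Turkish question. The local file path, context size, and example prompt are illustrative assumptions, not part of this repo.

```python
# Minimal inference sketch with llama-cpp-python and the Alpaca-style template above.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-v0.1-turkish.Q4_K_M.gguf",  # assumed local path to a downloaded quant
    n_ctx=4096,        # context window; adjust to available memory
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

# Build the prompt following the template shown above.
prompt = (
    "### Instruction:\n"
    "Türkiye'nin başkenti neresidir?\n"   # example question: "What is the capital of Turkey?"
    "### Response:\n"
)

out = llm(prompt, max_tokens=128, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```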