---
base_model: DopeorNope/SOLARC-MOE-10.7Bx4
inference: true
language:
- ko
library_name: transformers
license: cc-by-nc-sa-4.0
model_creator: Seungyoo Lee
model_name: Solarc MOE 10.7Bx4
model_type: mixtral
pipeline_tag: text-generation
prompt_template: |
### User:
{prompt}
### Assistant:
quantized_by: TheBloke
---
## Description
This is a reupload of one of the smaller quantised versions of this model, originally posted by TheBloke.
The original can be found here: **https://huggingface.co/TheBloke/SOLARC-MOE-10.7Bx4-GGUF**
This repo contains GGUF format model files for [Seungyoo Lee's Solarc MOE 10.7Bx4](https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4).
These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
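The prompt template declared in the metadata above can be applied programmatically before passing text to a GGUF runtime such as llama.cpp. Below is a minimal sketch; the `build_prompt` helper name is illustrative and not part of the model or any library:

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the model's User/Assistant prompt template."""
    return f"### User:\n{user_message}\n### Assistant:\n"


# Example: format a question for the model.
prompt = build_prompt("Summarise the SOLARC-MOE architecture in one sentence.")
print(prompt)
```

The resulting string can be passed as the prompt to any GGUF-compatible loader (for example, llama-cpp-python's `Llama.__call__`), with the model generating text after the `### Assistant:` marker.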