---
base_model: DopeorNope/SOLARC-MOE-10.7Bx4
inference: true
language:
  - ko
library_name: transformers
license: cc-by-nc-sa-4.0
model_creator: Seungyoo Lee
model_name: Solarc MOE 10.7Bx4
model_type: mixtral
pipeline_tag: text-generation
prompt_template: |
  ### User:
  {prompt}

  ### Assistant:
quantized_by: TheBloke
---

## Description

This is a reupload of one of the smaller versions of this model, originally posted by TheBloke.

The original can be found here: **https://huggingface.co/TheBloke/SOLARC-MOE-10.7Bx4-GGUF**

This repo contains GGUF format model files for [Seungyoo Lee's Solarc MOE 10.7Bx4](https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4).

These files were quantised using hardware kindly provided by [Massed Compute](https://massedcompute.com/).
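As a minimal sketch, the `prompt_template` declared in this card's metadata can be applied like this before passing text to the model (the helper function name and the example message are illustrative assumptions; only the `### User:` / `### Assistant:` format comes from the card):

```python
def format_prompt(user_message: str) -> str:
    # Follows the card's prompt_template:
    #   ### User:
    #   {prompt}
    #
    #   ### Assistant:
    return f"### User:\n{user_message}\n\n### Assistant:\n"

prompt = format_prompt("Explain GGUF in one sentence.")
print(prompt)
```

The resulting string can be passed as the prompt to any GGUF-capable runtime (for example llama.cpp) loaded with one of the model files from this repo.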