---
license: cc-by-sa-4.0
datasets:
- maywell/ko_wikidata_QA
- kyujinpy/OpenOrca-KO
language:
- en
- ko
pipeline_tag: text-generation
---
## Exl2 version of [maywell/PiVoT-0.1-early](https://huggingface.co/maywell/PiVoT-0.1-early)  

## Branches
- `main`: 8bpw h8
- `6bh8`: 6bpw h8
- `4bh8`: 4bpw h8
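In these branch names, the bpw figure is the average bits per weight and `h8` denotes 8-bit head layers (the `-b` and `-hb` arguments in the commands below). As a rough sketch of what that means for download size, assuming a ~7.24B-parameter Mistral 7B base and ignoring file overhead, the on-disk size scales roughly linearly with bpw:

```python
def approx_size_gb(n_params: float, bpw: float) -> float:
    """Rough on-disk size in GB: parameters * bits-per-weight / 8 bits per byte."""
    return n_params * bpw / 8 / 1e9

# Mistral 7B has roughly 7.24e9 parameters (an assumption for illustration).
for bpw in (8, 6, 4):
    print(f"{bpw}bpw: ~{approx_size_gb(7.24e9, bpw):.1f} GB")
```

This is only a back-of-the-envelope estimate; actual file sizes also depend on embeddings, quantization metadata, and the 8-bit head layers.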

[VMware/open-instruct](https://huggingface.co/datasets/VMware/open-instruct/viewer) was used as the calibration dataset.
 
Quantization settings (the first run produces `measurement.json`, which the 6bpw and 4bpw runs reuse via `-m`):

```sh
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp -cf PiVoT-0.1-early-8bpw-h8-exl2 -c 0000.parquet -l 4096 -b 8 -hb 8
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp2 -cf PiVoT-0.1-early-6bpw-h8-exl2 -c 0000.parquet -l 4096 -b 6 -hb 8 -m PiVoT-0.1-early-temp/measurement.json
python convert.py -i models/maywell_PiVoT-0.1-early -o PiVoT-0.1-early-temp3 -cf PiVoT-0.1-early-4bpw-h8-exl2 -c 0000.parquet -l 4096 -b 4 -hb 8 -m PiVoT-0.1-early-temp/measurement.json
```

### Below this line is the original README

# PiVoT-0.1-early

![image/png](./PiVoT.png)

# **Model Details**

### Description
PiVoT is a fine-tuned model based on Mistral 7B. It is a variation of Synatra v0.3 RP, which has shown decent performance.

The OpenOrca dataset was used to fine-tune this PiVoT variation. The Arcalive AI Chat Chan log (7k), [ko_wikidata_QA](https://huggingface.co/datasets/maywell/ko_wikidata_QA), [kyujinpy/OpenOrca-KO](https://huggingface.co/datasets/kyujinpy/OpenOrca-KO), and other datasets were used on the base model.

Follow me on twitter: https://twitter.com/stablefluffy

Consider supporting me in making these models: https://www.buymeacoffee.com/mwell, or with a RunPod credit gift 💕

Contact me on Telegram: https://t.me/AlzarTakkarsen