
EXL2 (ExLlamaV2) quantization of maywell/PiVoT-0.1-Evil-a

Branches:

- `main`: 8bpw, h8
- `6bh8`: 6bpw, h8
- `4bh8`: 4bpw, h8
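As a rough guide to what the bpw (bits per weight) settings above mean for download size, the weight payload scales linearly with bpw. A minimal sketch, assuming ~7.24B parameters for Mistral 7B and ignoring quantization metadata overhead:

```python
def exl2_weight_size_gb(n_params: float, bpw: float) -> float:
    """Approximate on-disk weight size: parameters * bits-per-weight / 8 bytes."""
    return n_params * bpw / 8 / 1e9

# Estimate each branch's weight size (parameter count is an assumption)
for bpw in (8, 6, 4):
    print(f"{bpw} bpw: ~{exl2_weight_size_gb(7.24e9, bpw):.1f} GB")
```

The actual repository files will be slightly larger due to the quantization scales and the tokenizer/config files.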

Calibration dataset: VMware/open-instruct

Quantization settings:

```sh
# 8bpw, 8-bit head (main branch); this first pass also writes measurement.json
python convert.py -i models/maywell_PiVoT-0.1-Evil-a -o PiVoT-0.1-Evil-a-temp -cf PiVoT-0.1-Evil-a-8bpw-h8-exl2 -c 0000.parquet -l 4096 -b 8 -hb 8

# 6bpw and 4bpw reuse the measurement from the first pass via -m
python convert.py -i models/maywell_PiVoT-0.1-Evil-a -o PiVoT-0.1-Evil-a-temp2 -cf PiVoT-0.1-Evil-a-6bpw-h8-exl2 -c 0000.parquet -l 4096 -b 6 -hb 8 -m PiVoT-0.1-Evil-a-temp/measurement.json
python convert.py -i models/maywell_PiVoT-0.1-Evil-a -o PiVoT-0.1-Evil-a-temp3 -cf PiVoT-0.1-Evil-a-4bpw-h8-exl2 -c 0000.parquet -l 4096 -b 4 -hb 8 -m PiVoT-0.1-Evil-a-temp/measurement.json
```

Below this line is the original README.

PiVoT-0.1-early


Model Details

Description

PiVoT is a fine-tuned model based on Mistral 7B. It is a variation of Synatra v0.3 RP, which has shown decent performance.

PiVoT-0.1-Evil-a is an "evil"-tuned version of PiVoT, fine-tuned by the method below.

PiVoT-0.1-Evil-b was additionally tuned with noisy embeddings, which should give more variety in its outputs.
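The card does not specify the noisy-embedding method; a common formulation is NEFTune-style uniform noise added to token embeddings during training, scaled by the sequence length and embedding dimension. A minimal NumPy sketch, where the function name, `alpha` value, and tensor shapes are illustrative assumptions:

```python
import numpy as np

def add_embedding_noise(embeddings, alpha=5.0, rng=None):
    """Add NEFTune-style noise: uniform in [-1, 1], scaled by alpha / sqrt(L * d).

    embeddings: array of shape (seq_len, dim) holding one sequence's token embeddings.
    """
    rng = rng or np.random.default_rng(0)
    seq_len, dim = embeddings.shape
    scale = alpha / np.sqrt(seq_len * dim)
    return embeddings + rng.uniform(-1.0, 1.0, size=embeddings.shape) * scale
```

The noise is applied only during fine-tuning; inference uses the clean embeddings, so the effect is more varied generations rather than noisy ones.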


Disclaimer

The AI model provided herein is intended for experimental purposes only. The creator of this model makes no representations or warranties of any kind, either express or implied, as to the model's accuracy, reliability, or suitability for any particular purpose. The creator shall not be held liable for any outcomes, decisions, or actions taken on the basis of the information generated by this model. Users of this model assume full responsibility for any consequences resulting from its use.

The OpenOrca dataset was used when fine-tuning the PiVoT variations. Arcalive Ai Chat Chan log 7k, ko_wikidata_QA, kyujinpy/OpenOrca-KO, and other datasets were used on the base model.

Follow me on twitter: https://twitter.com/stablefluffy

Consider supporting me in making these models: https://www.buymeacoffee.com/mwell or with a Runpod credit gift 💕

Contact me on Telegram: https://t.me/AlzarTakkarsen

