---
datasets:
- allenai/ai2_arc
- unalignment/spicy-3.1
- codeparrot/apps
- facebook/belebele
- boolq
- jondurbin/cinematika-v0.1
- drop
- lmsys/lmsys-chat-1m
- TIGER-Lab/MathInstruct
- cais/mmlu
- Muennighoff/natural-instructions
- openbookqa
- piqa
- Vezora/Tested-22k-Python-Alpaca
- cakiki/rosetta-code
- Open-Orca/SlimOrca
- spider
- squad_v2
- migtissera/Synthia-v1.3
- winogrande
- nvidia/HelpSteer
- Intel/orca_dpo_pairs
- unalignment/toxic-dpo-v0.1
- jondurbin/truthy-dpo-v0.1
- allenai/ultrafeedback_binarized_cleaned
- Squish42/bluemoon-fandom-1-1-rp-cleaned
- LDJnr/Capybara
- JULIELab/EmoBank
- kingbri/PIPPA-shareGPT
license: apache-2.0
---
Quant of [jondurbin's](https://huggingface.co/jondurbin) [bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2).
Fits into 24 GB of VRAM with 16k context on Windows.
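Quantized with ExLlamaV2's `convert.py` using the command below: `-b 4.65` targets roughly 4.65 bits per weight, `-hb 6` keeps 6-bit head layers, `-c` points at the calibration parquet, and `-l`/`-ml` set the calibration and measurement row lengths.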
```
python3 convert.py \
-i /input/jondurbin_bagel-dpo-34b-v0.2/ \
-c /input/pippa_cleaned/0000.parquet \
-o /output/temp/ \
-cf /output/bagel-dpo-34b-v0.2-4.65bpw-h6-exl2/ \
-l 8192 \
-ml 8192 \
-b 4.65 \
-hb 6
```
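
For reference, a minimal loading sketch with the exllamav2 Python API (class and method names follow the repo's example scripts and may change between releases; the model path is assumed to be the output directory from the conversion step above):

```
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/output/bagel-dpo-34b-v0.2-4.65bpw-h6-exl2"  # quantized output dir from the convert step
config.prepare()
config.max_seq_len = 16384  # 16k context, per the VRAM note above

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # KV cache sized to max_seq_len
model.load_autosplit(cache)               # fills available VRAM across visible GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Tell me about bagels.", settings, num_tokens=200))
```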