---
license: apache-2.0
datasets:
- fblgit/tree-of-knowledge
- Open-Orca/SlimOrca-Dedup
- allenai/ultrafeedback_binarized_cleaned
library_name: transformers
tags:
- juanako
- UNA
- cybertron
- fbl
quantized_by: bartowski
---
# Exllama v2 Quantizations of una-cybertron-7b-v2-bf16 at 6.0 bits per weight
Using turboderp's ExLlamaV2 v0.0.10 for quantization.
Conversion was done using wikitext-103-raw-v1-test.parquet as the calibration dataset.
Original model: https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16
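
For reference, quants like this one are produced with ExLlamaV2's `convert.py` script. The command below is a rough sketch rather than the exact invocation used for this repo: the input and output paths are placeholders, and the flag names follow `convert.py` as shipped in the exllamav2 repository.

```shell
# Rough sketch of the quantization step (placeholder paths):
#   -i  : directory containing the original bf16 model
#   -o  : scratch directory for measurement/temporary files
#   -cf : output directory for the finished EXL2 quant
#   -c  : calibration dataset (the parquet file named above)
#   -b  : target bits per weight
python convert.py -i models/una-cybertron-7b-v2-bf16 -o work -cf una-cybertron-7b-v2-bf16-exl2 -c wikitext-103-raw-v1-test.parquet -b 6.0
```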
## Download instructions
With git:
```shell
git clone --single-branch --branch 6.0 https://huggingface.co/bartowski/una-cybertron-7b-v2-bf16-exl2
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
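
For example, to download the contents of the `main` branch into a local folder:

```shell
huggingface-cli download bartowski/una-cybertron-7b-v2-bf16-exl2 --local-dir una-cybertron-7b-v2-bf16-exl2 --local-dir-use-symlinks False
```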
To download from a different branch, add the `--revision` parameter:
```shell
mkdir una-cybertron-7b-v2-bf16-exl2
huggingface-cli download bartowski/una-cybertron-7b-v2-bf16-exl2 --revision 6_0 --local-dir una-cybertron-7b-v2-bf16-exl2 --local-dir-use-symlinks False
```
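
Once downloaded, the quant can be loaded by any EXL2-aware backend. As a quick smoke test, something like the following should work, assuming a local checkout of the exllamav2 repository and using the flags of its bundled `test_inference.py` script:

```shell
# Load the downloaded quant and generate a short completion:
#   -m : directory of the downloaded EXL2 model
#   -p : prompt text
#   -t : number of tokens to generate
python test_inference.py -m una-cybertron-7b-v2-bf16-exl2 -p "Once upon a time" -t 128
```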