Upload README.md with huggingface_hub
README.md CHANGED
@@ -19,6 +19,62 @@ These quants were made with exllamav2 version 0.0.18. Quants made on this versio
If you have problems loading these models, please update Text Generation WebUI to the latest version.
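
How you update depends on how Text Generation WebUI was installed. As a rough sketch for a plain git checkout (the clone path and environment name below are placeholder assumptions, not part of this repo):

```bash
# Sketch only: assumes a git-clone install of Text Generation WebUI.
# The clone path and the conda environment name are placeholders.
cd ~/text-generation-webui
git pull

# Upgrade its Python dependencies inside whichever environment it runs in.
source ~/miniconda3/etc/profile.d/conda.sh
conda activate textgen
pip install -r requirements.txt --upgrade
```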

## Perplexity Scoring

Below are the perplexity scores for the EXL2 models. A lower score is better.

| Quant Level | Perplexity Score |
|-------------|------------------|
| 7.0         | 4.5859           |
| 6.0         | 4.6252           |
| 5.5         | 4.6493           |
| 5.0         | 4.6937           |
| 4.5         | 4.8029           |
| 4.0         | 4.9372           |
| 3.5         | 5.1336           |
| 3.25        | 5.3636           |
| 3.0         | 5.5468           |
| 2.75        | 5.8255           |
| 2.5         | 6.3362           |
| 2.25        | 7.7763           |
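
Each quant level in the table is published as its own repository, following the naming pattern used in the script below. Purely as an illustration (the 4.0 bpw level and the target directory are arbitrary choices), a single quant can be fetched with the same huggingface-cli call the script uses:

```bash
# Example only: download one quant level instead of the full sweep.
# Repo names follow the Dracones/WizardLM-2-8x22B_exl2_<bpw>bpw pattern
# from the perplexity script; the bpw value and local path are placeholders.
huggingface-cli download --local-dir-use-symlinks=False \
  --local-dir ./WizardLM-2-8x22B_exl2_4.0bpw \
  Dracones/WizardLM-2-8x22B_exl2_4.0bpw
```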

### Perplexity Script

This was the script used for perplexity testing.

```bash
#!/bin/bash

# Activate the conda environment
source ~/miniconda3/etc/profile.d/conda.sh
conda activate exllamav2

# Evaluation dataset for test_inference.py
DATA_SET=/root/wikitext/wikitext-2-v1.parquet

# Set the model name and the bit sizes to test
MODEL_NAME="WizardLM-2-8x22B"
BIT_PRECISIONS=(6.0 5.5 5.0 4.5 4.0 3.5 3.25 3.0 2.75 2.5 2.25)

# Print the markdown table header
echo "| Quant Level | Perplexity Score |"
echo "|-------------|------------------|"

for BIT_PRECISION in "${BIT_PRECISIONS[@]}"
do
    LOCAL_FOLDER="/root/models/${MODEL_NAME}_exl2_${BIT_PRECISION}bpw"
    REMOTE_FOLDER="Dracones/${MODEL_NAME}_exl2_${BIT_PRECISION}bpw"

    # Download the quant from Hugging Face if it is not already on disk
    if [ ! -d "$LOCAL_FOLDER" ]; then
        huggingface-cli download --local-dir-use-symlinks=False --local-dir "${LOCAL_FOLDER}" "${REMOTE_FOLDER}" >> /root/download.log 2>&1
    fi

    # Run exllamav2's evaluation (-gs splits the model across four GPUs)
    # and pull the perplexity number out of its output
    output=$(python test_inference.py -m "$LOCAL_FOLDER" -gs 40,40,40,40 -ed "$DATA_SET")
    score=$(echo "$output" | grep -oP 'Evaluation perplexity: \K[\d.]+')
    echo "| $BIT_PRECISION | $score |"
    # rm -rf "${LOCAL_FOLDER}"
done
```
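
To reproduce the table, save the script and capture its stdout, which is already formatted as a markdown table (the filename here is an arbitrary placeholder); download progress is appended to /root/download.log as written:

```bash
# Placeholder filename; stdout is the finished markdown table.
bash perplexity_test.sh | tee perplexity_results.md
```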
## Quant Details