Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged - GGUF
- Model creator: https://huggingface.co/dhmeltzer/
- Original model: https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q2_K.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q2_K.gguf) | Q2_K | 2.36GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_XS.gguf) | IQ3_XS | 2.6GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_S.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_S.gguf) | IQ3_S | 2.75GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_S.gguf) | Q3_K_S | 2.75GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_M.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ3_M.gguf) | IQ3_M | 2.9GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K.gguf) | Q3_K | 3.07GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_M.gguf) | Q3_K_M | 3.07GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q3_K_L.gguf) | Q3_K_L | 3.35GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ4_XS.gguf) | IQ4_XS | 3.4GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_0.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_0.gguf) | Q4_0 | 3.56GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.IQ4_NL.gguf) | IQ4_NL | 3.58GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K_S.gguf) | Q4_K_S | 3.59GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K.gguf) | Q4_K | 3.8GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K_M.gguf) | Q4_K_M | 3.8GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_1.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_1.gguf) | Q4_1 | 3.95GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_0.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_0.gguf) | Q5_0 | 4.33GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K_S.gguf) | Q5_K_S | 4.33GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K.gguf) | Q5_K | 4.45GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_K_M.gguf) | Q5_K_M | 4.45GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_1.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q5_1.gguf) | Q5_1 | 4.72GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q6_K.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q6_K.gguf) | Q6_K | 5.15GB |
| [llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q8_0.gguf](https://huggingface.co/RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf/blob/main/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q8_0.gguf) | Q8_0 | 6.67GB |

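Any of the files above can be run locally with llama.cpp or its bindings. Below is a minimal sketch (not part of the original card) using `huggingface_hub` and `llama-cpp-python`, both assumed to be installed; the Q4_K_M file is picked arbitrarily as a size/quality middle ground, and the 1024-token context is an assumption based on the `1024` in the model name.

```python
# Hedged example: download one quantized file and run a completion with it.
# Assumes `pip install huggingface_hub llama-cpp-python` has been done.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch the Q4_K_M file from the table above (cached locally by the hub).
model_path = hf_hub_download(
    repo_id="RichardErkhov/dhmeltzer_-_llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged-gguf",
    filename="llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged.Q4_K_M.gguf",
)

# n_ctx=1024 is an assumption inferred from the model name, not documented here.
llm = Llama(model_path=model_path, n_ctx=1024)

# The base model was SFT-trained on ELI5-style data, so a plain question works.
out = llm("Explain like I'm five: why is the sky blue?", max_tokens=128)
print(out["choices"][0]["text"])
```

The same file also works with the `llama-cli` binary from llama.cpp by passing the downloaded path via `-m`.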
Original model description:

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged)

| Metric | Value |
|-----------------------|-------|
| Avg. | 43.96 |
| ARC (25-shot) | 53.75 |
| HellaSwag (10-shot) | 78.76 |
| MMLU (5-shot) | 46.02 |
| TruthfulQA (0-shot) | 43.31 |
| Winogrande (5-shot) | 73.48 |
| GSM8K (5-shot) | 4.7 |
| DROP (3-shot) | 7.72 |