LLaMA-Pro-Ko's performance is evaluated on two fronts: its proficiency in English and its mastery of Korean, showcasing its capabilities as a bilingual model.

### Korean Evaluation

#### KoBEST

**5-shot**

| Model | # tokens | copa | HellaSwag | boolq | sentiNeg | AVG |
| ------------------------------------------------------------ | :------: | :-----------: | :-----------: | :-----------: | :-----------: | :----------: |
| [beomi](https://huggingface.co/beomi/llama-2-ko-7b)/llama-2-ko-7b | 20B | 0.7626 | 0.4668 | 0.4657 | 0.8295 | 63.11 |
| [beomi](https://huggingface.co/beomi/llama-2-ko-7b)/llama-2-ko-7b | 40B | **0.7927** | 0.4657 | **0.6977** | 0.7611 | 67.93 |
| [beomi](https://huggingface.co/beomi/open-llama-2-ko-7b)/open-llama-2-ko-7b | 15B | 0.7737 | **0.4831** | <u>0.6824</u> | **0.8991** | **70.96** |
| llama-pro-ko-8b | 10B | <u>0.7878</u> | <u>0.4748</u> | 0.6631 | <u>0.8752</u> | <u>70.02</u> |

**10-shot**

| Model | # tokens | copa | HellaSwag | boolq | sentiNeg | AVG |
| ------------------------------------------------------------ | :------: | :------: | :-------: | :---------: | :---------: | :----------: |
| [beomi](https://huggingface.co/beomi/llama-2-ko-7b)/llama-2-ko-7b | 20B | 0.78 | 0.47 | <u>0.68</u> | 0.87 | 70.12 |
| [beomi](https://huggingface.co/beomi/llama-2-ko-7b)/llama-2-ko-7b | 40B | **0.80** | 0.47 | **0.71** | 0.73 | 67.81 |
| [beomi](https://huggingface.co/beomi/open-llama-2-ko-7b)/open-llama-2-ko-7b | 15B | 0.79 | **0.48** | 0.67 | <u>0.94</u> | **71.82** |
| llama-pro-ko-8b | 10B | **0.80** | **0.48** | 0.60 | **0.97** | <u>71.12</u> |
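
For reference, all four KoBEST subsets above (copa, HellaSwag, boolq, sentiNeg) ship as tasks in EleutherAI's lm-evaluation-harness, so scores in this style can be reproduced along the lines of the sketch below. This is only a sketch, not the exact setup used for these tables: the checkpoint id `llama-pro-ko-8b` is a placeholder, and the dtype and batch size are assumptions.

```python
# Sketch: KoBEST few-shot evaluation with EleutherAI's lm-evaluation-harness (v0.4+).
# "llama-pro-ko-8b" is a placeholder; substitute the actual Hugging Face checkpoint id.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",  # transformers-based backend
    model_args="pretrained=llama-pro-ko-8b,dtype=bfloat16",
    tasks=["kobest_copa", "kobest_hellaswag", "kobest_boolq", "kobest_sentineg"],
    num_fewshot=5,  # rerun with num_fewshot=10 for the 10-shot table
    batch_size=8,
)

for task, metrics in results["results"].items():
    print(task, metrics)
```

Exact numbers can drift slightly with harness version, prompt formatting, and batch size, so small deviations from the tables are expected.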

### English Evaluation

#### Open LLM Benchmark

| Model | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | AVG | diff |
| :----------------------------------------------------------- | :----------: | :----------: | :----------: | :----------: | :----------: | :----------: | :----------: |
| [meta-llama/Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b) | 53.07 | **78.59** | 46.87 | <u>38.76</u> | **74.03** | <u>58.26</u> | 0 |
| [TencentARC](https://huggingface.co/TencentARC/LLaMA-Pro-8B)/LLaMA-Pro-8B | **54.10** | <u>77.94</u> | **47.88** | **39.04** | <u>73.95</u> | **58.58** | **+0.32** |
| [beomi](https://huggingface.co/beomi/llama-2-ko-7b)/llama-2-ko-7b | 48.46 | 75.28 | 39.56 | 34.49 | 72.14 | 53.99 | -4.28 |
| [beomi](https://huggingface.co/beomi/open-llama-2-ko-7b)/open-llama-2-ko-7b | 46.84 | 69.48 | 29.86 | 35.35 | 66.30 | 49.57 | -8.70 |
| llama-pro-ko-8b | <u>53.24</u> | <u>77.93</u> | <u>47.06</u> | 38.32 | 72.22 | 57.75 | <u>-0.51</u> |
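
Here, `diff` is each model's five-benchmark average minus the Llama-2-7b baseline average, computed before rounding. A minimal sanity check of that arithmetic, with scores hard-coded from the table above:

```python
# Sanity check of the AVG and diff columns, using scores copied from the table above.
scores = {
    "meta-llama/Llama-2-7b": [53.07, 78.59, 46.87, 38.76, 74.03],  # baseline
    "beomi/llama-2-ko-7b":   [48.46, 75.28, 39.56, 34.49, 72.14],
    "llama-pro-ko-8b":       [53.24, 77.93, 47.06, 38.32, 72.22],
}

avg = {name: sum(vals) / len(vals) for name, vals in scores.items()}
baseline = avg["meta-llama/Llama-2-7b"]  # 58.264 before rounding to 58.26

for name, a in avg.items():
    print(f"{name}: AVG={a:.2f} diff={a - baseline:+.2f}")
# meta-llama/Llama-2-7b: AVG=58.26 diff=+0.00
# beomi/llama-2-ko-7b: AVG=53.99 diff=-4.28
# llama-pro-ko-8b: AVG=57.75 diff=-0.51  (matches the table)
```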