Update README.md
README.md
CHANGED
@@ -37,6 +37,7 @@ Jisoo Kim(kuotient)
 [beomi/llama-2-koen-13b](https://huggingface.co/beomi/llama-2-koen-13b)
 #### **Datasets**
 - [sharegpt_deepl_ko_translation](https://huggingface.co/datasets/squarelike/sharegpt_deepl_ko_translation)
+- [KOR-OpenOrca-Platypus-v3](https://huggingface.co/datasets/kyujinpy/KOR-OpenOrca-Platypus-v3)
 - AIHUB
   - Technical and scientific domain Korean-English parallel translation corpus
   - Everyday life and colloquial Korean-English parallel translation corpus
@@ -73,6 +74,7 @@ DeepL:
 > 위의 파동 함수 $\psi(x)$는 $$\psi(x)=\begin{cases}로 주어집니다. 3x & \text{if } -1 \leq x \leq 0 \\ 3(1-x) & \text{if } 0 < x \leq 1 \\ 0 & \text{기타} \end{cases}$$ 파동 함수 $\psi(x)$의 푸리에 변환인 $\tilde{\psi}(k)$를 계산하고 푸리에 반전 정리, 즉 $\psi(x) = \frac{1}{\sqrt{2\pi}}를 만족함을 증명합니다. \int_{-\infty}^{\infty} \물결표{\psi}(k) e^{ikx} \mathrm{d}k$.

 ...and many more awesome cases with SQL queries, code, and markdown!
+
 #### **How to**
 **I highly recommend running inference on this model with vLLM. I will write a guide for quick and easy inference if requested.**
 The chat_template already contains the instruction format above.
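The vLLM setup recommended above might look like the following sketch. Note the model ID, prompt, and sampling settings are placeholders I am assuming for illustration, not values taken from this README; since the chat_template already carries the instruction format, plain chat messages are enough.

```python
# Minimal vLLM inference sketch. MODEL_ID is a hypothetical placeholder;
# replace it with the actual Hugging Face repository name of this model.
MODEL_ID = "kuotient/your-model-here"

# Plain chat messages; the model's built-in chat_template supplies the
# instruction format, so no manual prompt formatting is needed.
messages = [
    {"role": "user", "content": "Translate to Korean: The weather is nice today."}
]


def main() -> None:
    # Imported lazily so the sketch can be read without vLLM installed.
    from vllm import LLM, SamplingParams

    llm = LLM(model=MODEL_ID)
    params = SamplingParams(temperature=0.3, max_tokens=256)
    # LLM.chat applies the tokenizer's chat_template before generation.
    outputs = llm.chat(messages, params)
    print(outputs[0].outputs[0].text)


if __name__ == "__main__":
    main()
```

Running the script requires a GPU and the vLLM package; the lazy import keeps the message-building part usable on its own.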