seungduk committed
Commit
f56998c
1 Parent(s): 4cccfba

Update README

Files changed (1)
- README.md (+12, -16)
README.md CHANGED
@@ -7,7 +7,6 @@ model-index:
   - name: yanolja/Bookworm-10.7B-v0.4-DPO
   results: []
 ---
-[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
 # Bookworm-10.7B-v0.4-DPO
 
 ## Join Our Community on Discord!
@@ -16,20 +15,17 @@ If you're passionate about the field of Large Language Models and wish to exchan
 
 ## About the Model
 
-Bookworm-10.7B-v0.4-DPO is an instruction tuned model based on [yanolja/KoSOLAR-10.7B-v0.2](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.2). To be updated.
-
-### Our Dedicated Team
-
-#### Research
-- Myeongho Jeong
-- Seungtaek Choi
-- Seungduk Kim
-
-#### Engineering
-- Sanghoon Han
-- Suhyun Kang
-- Geon Kim
-- Rifqi Alfi
-
-#### Product Management
-- Bokyung Huh
+This model is a fine-tuned version of [yanolja/KoSOLAR-10.7B-v0.2](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.2), which is a Korean vocabulary-extended version of [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0). Specifically, we employed Direct Preference Optimization (DPO) based on [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory).
+
+### Training Data
+- Korean-translated version of [Open-Orca/SlimOrca-Dedup](https://huggingface.co/datasets/Open-Orca/SlimOrca-Dedup)
+- Korean-translated version of [argilla/ultrafeedback-binarized-preferences-cleaned](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned)
+- No other dataset was used
+
+## Our Dedicated Team
+| Research        | Engineering  | Product Management |
+|-----------------|--------------|--------------------|
+| Myeongho Jeong  | Sanghoon Han | Bokyung Huh        |
+| Seungtaek Choi  | Suhyun Kang  |                    |
+| Seungduk Kim    | Rifqi Alfi   |                    |
+|                 | Geon Kim     |                    |
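
The updated README says the model was tuned with Direct Preference Optimization (DPO) via LLaMA-Factory. As a rough illustration only (not the authors' training code), the standard DPO objective on a single preference pair can be sketched in plain Python; the sequence log-probabilities and `beta` value below are made-up placeholders:

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """DPO loss for one (chosen, rejected) pair of sequence log-probs.

    beta controls how far the policy may drift from the reference model.
    """
    # Implicit rewards: how much more (or less) the policy likes each
    # answer compared to the frozen reference model.
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    margin = chosen_reward - rejected_reward
    # -log(sigmoid(margin)) written in a numerically stable form.
    return math.log1p(math.exp(-margin))

# Placeholder log-probs: the policy favors the chosen answer more than
# the reference does, so the margin is positive and loss < log(2).
loss = dpo_loss(-10.0, -14.0, -12.0, -13.0)
```

When policy and reference agree exactly, the margin is zero and the loss sits at log(2); training pushes it lower by widening the chosen-vs-rejected margin.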