|
---
license: cc-by-nc-4.0
tags:
- merge
---
|
|
|
# etri-xainlp/SOLAR-10.7B-merge-dpo_v1 |
|
|
|
## Model Details |
|
|
|
**Model Developers** ETRI xainlp team |
|
|
|
**Input** Text only.
|
|
|
**Output** Text only.
|
|
|
**Model Architecture** |
|
|
|
We used [MergeKit](https://github.com/arcee-ai/mergekit) to merge heavytail/kullm-solar into etri-xainlp/SOLAR-10.7B-merge-dpo, with the latter serving as the base model.
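MergeKit merges are driven by a YAML configuration. The following is only an illustrative sketch of what such a config could look like for this merge; the actual merge method, layer ranges, and interpolation settings used by the team are not documented in this card:

```yaml
# Hypothetical mergekit config sketch -- merge method and
# parameters are assumptions, not the team's actual settings.
slices:
  - sources:
      - model: etri-xainlp/SOLAR-10.7B-merge-dpo
        layer_range: [0, 48]
      - model: heavytail/kullm-solar
        layer_range: [0, 48]
merge_method: slerp
base_model: etri-xainlp/SOLAR-10.7B-merge-dpo
parameters:
  t: 0.5          # interpolation factor (illustrative)
dtype: bfloat16
```

A config like this would be run with the `mergekit-yaml` command-line tool to produce the merged checkpoint.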
|
|
|
**Base Model** [etri-xainlp/SOLAR-10.7B-merge-dpo](https://huggingface.co/etri-xainlp/SOLAR-10.7B-merge-dpo) |
|
|
|
**Merge Model** [davidkim205/komt-solar-10.7b-sft-v5](https://huggingface.co/davidkim205/komt-solar-10.7b-sft-v5) |
|
|
|
**Training Dataset** |
|
|
|
- DPO + LoRA fine-tuning on a 100k-sample user preference dataset
|
|
|
- Training was performed on 8 × A100 80GB GPUs.
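The DPO stage above optimizes the Direct Preference Optimization objective over chosen/rejected response pairs: the policy is pushed to prefer the chosen response relative to a frozen reference model. A minimal sketch of the per-example DPO loss follows; the β value and the log-probabilities in the usage comment are illustrative, not values from the actual training run:

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """Per-example DPO loss: -log sigmoid(beta * margin).

    The margin is how much more the policy prefers the chosen response
    over the rejected one, relative to the frozen reference model.
    beta=0.1 is a common default, assumed here for illustration.
    """
    margin = ((policy_chosen_logp - ref_chosen_logp)
              - (policy_rejected_logp - ref_rejected_logp))
    # -log(sigmoid(beta * margin))
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))

# Illustrative sequence log-probabilities: the policy separates the
# chosen and rejected responses more than the reference does, so the
# loss drops below the neutral value of log(2).
loss_good = dpo_loss(-10.0, -20.0, -12.0, -12.0)
loss_neutral = dpo_loss(-10.0, -10.0, -10.0, -10.0)
```

In practice this is computed batched over full-sequence log-probabilities from the policy and the frozen reference model; libraries such as TRL provide this as `DPOTrainer`, with LoRA adapters supplied via PEFT.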