---
license: apache-2.0
language:
  - ko
library_name: transformers
pipeline_tag: text-generation
datasets:
  - maywell/ko_Ultrafeedback_binarized
---

Explanation

  • Starting from the base model, applied DPO to a small number of layers using the open dataset, saving only the adapter weights (see the training sketch below)
  • Merged the base model and the tuned adapter back together (see the merge sketch after the next block)
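
A minimal sketch of the adapter-only DPO step, assuming TRL's DPOTrainer with a peft LoRA config restricted to a few layers. The hyperparameters, layer indices, and output path below are illustrative assumptions, not the values used for this model.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "beomi/OPEN-SOLAR-KO-10.7B"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

# Restrict LoRA to a small subset of layers, matching the card's
# "small number of layers" description (indices are hypothetical).
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    layers_to_transform=[28, 29, 30, 31],
    task_type="CAUSAL_LM",
)

# The dataset is expected to expose prompt / chosen / rejected columns.
train_dataset = load_dataset("maywell/ko_Ultrafeedback_binarized", split="train")

trainer = DPOTrainer(
    model,
    ref_model=None,  # with peft_config set, TRL uses the frozen base as the reference
    args=TrainingArguments(output_dir="dpo-adapter", per_device_train_batch_size=1),
    beta=0.1,
    max_length=1024,
    max_prompt_length=512,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    peft_config=peft_config,
)
trainer.train()
trainer.save_model("dpo-adapter")  # saves only the adapter weights
```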

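A minimal sketch of the merge step, folding the tuned adapter back into the base weights with peft's merge_and_unload; the adapter path and output directory are illustrative.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "beomi/OPEN-SOLAR-KO-10.7B"
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, "dpo-adapter")

merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights
merged.save_pretrained("merged-model")
AutoTokenizer.from_pretrained(base_id).save_pretrained("merged-model")
```
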
Base Model

  • beomi/OPEN-SOLAR-KO-10.7B

Used Corpus

  • maywell/ko_Ultrafeedback_binarized
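
Since the merged model is a standard causal LM, it can be loaded with a plain text-generation pipeline. A minimal sketch follows; <this-repo-id> is a placeholder for this repository's id.

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="<this-repo-id>", device_map="auto")
print(pipe("안녕하세요,", max_new_tokens=64)[0]["generated_text"])
```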

Log

  • 2024.01.25: Initial version uploaded
  • 2024.02.10: README updated

LICENSE

  • Apache 2.0

Citation

  • beomi/OPEN-SOLAR-KO-10.7B
    @misc {solar_ko_junbum_2023,
        author       = { {L. Junbum} },
        title        = { Solar-Ko-10.7b },
        year         = 2024,
        url          = { https://huggingface.co/beomi/SOLAR-KO-10.7B },
        publisher    = { Hugging Face }
    }