---
license: apache-2.0
datasets:
- Intel/orca_dpo_pairs
language:
- en
library_name: transformers
---

# 🌞🚀 SOLAR-4x10.7_38B

A merge of four SOLAR-10.7B instruct finetunes.

![solar](solar.png)

## 🌟 Usage

## 🌅 Code Example

TODO (a minimal, hedged loading sketch appears after the Citations section below)

## Evaluations

TODO

### 📚 Citations

```bibtex
@misc{kim2023solar,
      title={SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling},
      author={Dahyun Kim and Chanjun Park and Sanghoon Kim and Wonsung Lee and Wonho Song and Yunsu Kim and Hyeonwoo Kim and Yungi Kim and Hyeonju Lee and Jihoo Kim and Changbae Ahn and Seonghoon Yang and Sukyung Lee and Hyunbyung Park and Gyoungjin Gim and Mikyoung Cha and Hwalsuk Lee and Sunghun Kim},
      year={2023},
      eprint={2312.15166},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
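
Until the official example is added above, the following is a minimal loading sketch for the Code Example section. It assumes the model is published on the Hugging Face Hub under a path like `your-namespace/SOLAR-4x10.7_38B` and that it follows the SOLAR-Instruct prompt format; both are assumptions, not confirmed by this card.

```python
# Minimal usage sketch for SOLAR-4x10.7_38B (assumptions noted below).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id -- replace with the actual Hub path of this model.
model_id = "your-namespace/SOLAR-4x10.7_38B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a ~38B merge generally needs fp16/bf16 and substantial GPU memory
    device_map="auto",          # requires `accelerate`; shards the weights across available devices
)

# Prompt format assumed from SOLAR-10.7B-Instruct; the merged model's template may differ.
prompt = "### User:\nExplain depth up-scaling in one paragraph.\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the repository ships a chat template, `tokenizer.apply_chat_template` can be used instead of the hand-written prompt above.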