---
language:
  - ko
library_name: transformers
pipeline_tag: text-generation
license: cc-by-nc-sa-4.0
tags:
  - merge
---

The license is cc-by-nc-sa-4.0.

🐻‍❄️SOLARC-M-10.7B🐻‍❄️


Model Details

Model Developers Seungyoo Lee (DopeorNope)

I am in charge of Large Language Models (LLMs) on the Markr AI team in South Korea.

Input Models input text only.

Output Models generate text only.

Model Architecture
SOLARC-M-10.7B is an auto-regressive language model based on the SOLAR architecture.


Base Model

kyujinpy/Sakura-SOLAR-Instruct

jeonsworld/CarbonVillain-en-10.7B-v1

Implemented Method

I built this model by merging the two base models listed above.
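The card does not show the merge itself. As a rough illustration only, one common merge technique is a linear interpolation of the two models' weights (the approach behind the `linear` method in tools such as mergekit). The function name and the 50/50 weighting below are illustrative assumptions, not the exact recipe used for this model:

```python
import torch

def average_state_dicts(sd_a, sd_b, alpha=0.5):
    """Linearly interpolate two state dicts: alpha * a + (1 - alpha) * b.

    Assumes both models share an identical architecture,
    i.e. the same parameter names and tensor shapes.
    """
    assert sd_a.keys() == sd_b.keys(), "models must share an architecture"
    return {k: alpha * sd_a[k] + (1.0 - alpha) * sd_b[k] for k in sd_a}

# Hypothetical usage: load both base models with
# AutoModelForCausalLM.from_pretrained, merge, then reuse one shell:
# merged = average_state_dicts(model_a.state_dict(), model_b.state_dict())
# model_a.load_state_dict(merged)
```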


Implementation Code

Load model


from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "DopeorNope/SOLARC-M-10.7B"

# Load in half precision and shard across available devices
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
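The card stops after loading; a small helper like the following sketches how inference could look. It is an assumption of mine, not part of the original card: `chat`, the greedy-decoding settings, and the prompt are all illustrative, and you would pass in the model and tokenizer objects loaded above.

```python
import torch

def chat(model, tokenizer, prompt, max_new_tokens=128):
    """Generate a completion for `prompt` with greedy decoding."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output_ids = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=False,
        )
    # Drop the prompt tokens so only the newly generated text is returned
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Example call (after loading the model and tokenizer above):
# print(chat(model, tokenizer, "대한민국의 수도는 어디인가요?"))
```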