
The license is cc-by-nc-sa-4.0.

๐Ÿปโ€โ„๏ธSOLARC-M-10.7B๐Ÿปโ€โ„๏ธ


Model Details

Model Developers Seungyoo Lee (DopeorNope)

I am in charge of Large Language Models (LLMs) at Markr AI in South Korea.

Input Models input text only.

Output Models generate text only.

Model Architecture
SOLARC-M-10.7B is an auto-regressive language model based on the SOLAR architecture.
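For reference, the underlying architecture can be inspected from the hub config. A minimal sketch using standard transformers config attributes (the printed values are whatever the repository's config reports; the comment below states an expectation, not a guarantee):

from transformers import AutoConfig

config = AutoConfig.from_pretrained("DopeorNope/SOLARC-M-10.7B")
# SOLAR-based models are decoder-only (Llama-family) causal LMs,
# so the architecture list and layer/hidden sizes should reflect that.
print(config.architectures, config.num_hidden_layers, config.hidden_size)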


Base Models

kyujinpy/Sakura-SOLAR-Instruct

jeonsworld/CarbonVillain-en-10.7B-v1

Implemented Method

I built this model with a model-merging approach, using the two models listed above as the base; one possible merge procedure is sketched below.
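The card does not specify which merge algorithm was used. As a minimal sketch, a simple linear (weight-averaging) merge of two checkpoints that share an architecture could look like the following; the 50/50 ratio and the output path are assumptions for illustration only, not the documented recipe.

from transformers import AutoModelForCausalLM
import torch

base_a = "kyujinpy/Sakura-SOLAR-Instruct"
base_b = "jeonsworld/CarbonVillain-en-10.7B-v1"

model_a = AutoModelForCausalLM.from_pretrained(base_a, torch_dtype=torch.float16)
model_b = AutoModelForCausalLM.from_pretrained(base_b, torch_dtype=torch.float16)

# Average corresponding parameters (assumed 50/50 ratio; the actual
# merge recipe is not documented in this card).
merged_state = model_a.state_dict()
for name, param_b in model_b.state_dict().items():
    merged_state[name] = (merged_state[name] + param_b) / 2

model_a.load_state_dict(merged_state)
model_a.save_pretrained("SOLARC-M-10.7B-merged")  # hypothetical output path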


Implementation Code

Load model


from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "DopeorNope/SOLARC-M-10.7B"

# Load the model in half precision and spread it across available devices.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
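
Once loaded, generation follows the standard transformers API. A minimal sketch; the prompt template below is an assumption, since the card does not state the instruction format the base models expect:

prompt = "### User:\nExplain what model merging is.\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))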

Downloads last month: 3,311

Format: Safetensors · Model size: 10.7B params · Tensor type: F32
