Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "I-BRICKS/Cerebro_BM_solar_v01"

# Load the model in half precision and let accelerate place it across available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```
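Once the model and tokenizer are loaded, text can be generated as in the minimal sketch below. The prompt and sampling parameters are illustrative assumptions, since the card does not document a prompt format or recommended generation settings.

```python
import torch

def generate(model, tokenizer, prompt, max_new_tokens=128):
    # Tokenize the prompt and move the inputs to the model's device.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output_ids = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=True,       # sampling settings here are illustrative, not from the card
            temperature=0.7,
            top_p=0.9,
        )
    # Strip the prompt tokens so only the completion is decoded.
    completion = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(completion, skip_special_tokens=True)
```

Example call: `generate(model, tokenizer, "Explain what a large language model is.")`.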

Contact

Model Developer: YoungWoo Nam

Company: I-BRICKS

Model size: 10.7B params (Safetensors, BF16)
