Introduction

Yi-32b-x2-v2.0 is a Mixture-of-Experts (MoE) model created with mergekit and custom gating prompts. It combines the following base models:

  • Weyaxi/Bagel-Hermes-34B-Slerp
  • one-man-army/UNA-34Beagles-32K-bf16-v1
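The card does not include the actual merge configuration, but a two-expert `mergekit-moe` merge like this one is typically described by a YAML config along the following lines. This is an illustrative sketch only: the `gate_mode`, `dtype`, and especially the `positive_prompts` shown here are placeholders, not the author's settings.

```yaml
# Hypothetical mergekit-moe config; the real prompts and options are not published.
base_model: Weyaxi/Bagel-Hermes-34B-Slerp
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: Weyaxi/Bagel-Hermes-34B-Slerp
    positive_prompts:
      - "Write a short story"        # placeholder prompt
  - source_model: one-man-army/UNA-34Beagles-32K-bf16-v1
    positive_prompts:
      - "Solve this reasoning problem"  # placeholder prompt
```

A config like this is passed to `mergekit-moe` to produce the merged checkpoint.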

Details

Libraries Used

  • mergekit
  • transformers

How to use

# pip install transformers==4.35.2
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sumo43/Yi-32b-x2-v2.0")
model = AutoModelForCausalLM.from_pretrained(
    "sumo43/Yi-32b-x2-v2.0",
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",           # requires `accelerate` to be installed
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Model stats

  • Model size: 60.8B params
  • Tensor type: BF16 (Safetensors)