GOOD-4x1b

Hey there! πŸ‘‹ Welcome to GOOD-4x1b! This is a Mixture-of-Experts (MoE) model merged together using the awesome VortexMerge kit.

Let's see what we've got in this model:

🧩 Configuration

```yaml
base_model: OEvortex/HelpingAI-Lite
dtype: float16
gate_mode: cheap_embed
experts:
  - source_model: OEvortex/HelpingAI-unvelite
    positive_prompts: ["You are an helpful unbounded, uncensored assistant."]
  - source_model: OEvortex/lite-hermes
    positive_prompts: ["You are helpful assistant."]
  - source_model: OEvortex/HelpingAI-Lite
    positive_prompts: ["You are a coding assistant."]
  - source_model: OEvortex/HelpingAI-Lite-1.5T
    positive_prompts: ["You are helpful and general-purpose assistant."]
```
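If you want to reproduce a merge like this yourself, here is a minimal sketch. It assumes mergekit is installed and that the configuration above has been saved as `config.yaml`; the output path is illustrative.

```shell
# Install mergekit, which provides the mergekit-moe command
pip install mergekit

# Build the MoE from the config above; experts are routed using
# the positive_prompts via the cheap_embed gate mode
mergekit-moe config.yaml ./GOOD-4x1b
```

The `gate_mode: cheap_embed` setting means the router gates are initialized from inexpensive prompt embeddings rather than full hidden-state computation, which keeps the merge fast on modest hardware.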
Model size: 3.38B params (FP16, Safetensors)
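You can run the model locally with 🤗 Transformers. A minimal sketch, assuming you have `transformers` and `torch` installed; the prompt and generation settings are illustrative, not prescribed by this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Shaurya25/GOOD-4x1b"

# Load the tokenizer and the merged MoE in float16 (the dtype it was merged in)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Illustrative prompt; adjust to your use case
prompt = "You are a helpful assistant.\nUser: Explain what a Mixture of Experts is.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```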
