# NeuralKrishnaMathWizard-7B
Hey there! 👋 Welcome to NeuralKrishnaMathWizard-7B! This model is a merge of Kukedlc/NeuralSirKrishna-7b and WizardLM/WizardMath-7B-V1.1, brought together using the awesome VortexMerge kit.
Here's what went into this merge:
## 🧩 Configuration
```yaml
models:
  - model: Kukedlc/NeuralSirKrishna-7b
    parameters:
      density: 0.9
      weight: 0.5
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Kukedlc/NeuralSirKrishna-7b
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
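For intuition, the `ties` method merges each model's *task vector* (its delta from the base model) in three steps: trim each delta to its largest-magnitude entries (the `density` fraction), elect a sign per parameter by weighted vote, and average only the entries that agree with the elected sign (optionally normalizing by the surviving weight, as `normalize: true` does above). Here is a toy NumPy sketch of that idea on flat arrays — an illustration only, not the actual VortexMerge/mergekit implementation, which works tensor-by-tensor and differs in details:

```python
import numpy as np

def ties_merge(base, deltas, densities, weights, normalize=True):
    """Toy TIES merge on flat parameter vectors.

    base:      base model parameters
    deltas:    per-model task vectors (fine-tuned params minus base)
    densities: fraction of entries to keep per delta (by magnitude)
    weights:   per-model merge weights
    """
    trimmed = []
    for d, density in zip(deltas, densities):
        # Trim: zero out all but the top-`density` fraction by magnitude.
        k = int(round(density * d.size))
        keep = np.zeros_like(d)
        if k > 0:
            idx = np.argsort(np.abs(d))[-k:]
            keep[idx] = d[idx]
        trimmed.append(keep)
    trimmed = np.stack(trimmed)
    w = np.asarray(weights, dtype=float).reshape(-1, 1)

    # Elect a sign per parameter by weighted mass, then keep only the
    # delta entries that agree with it (resolving sign interference).
    elected = np.sign((w * trimmed).sum(axis=0))
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)

    num = (w * trimmed * agree).sum(axis=0)
    den = (w * agree).sum(axis=0)
    if normalize:
        # Divide by the total weight of the surviving entries.
        den = np.where(den == 0, 1.0, den)
        merged_delta = num / den
    else:
        merged_delta = num
    return base + merged_delta
```

With `density: 0.9` for the base-derived model and `density: 0.5` for WizardMath, the config above keeps most of NeuralSirKrishna's delta while pulling in only the strongest half of WizardMath's changes, at equal weight.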