# Llama-3-8B-NLI-ties3
Llama-3-8B-NLI-ties3 is a TIES merge of the following models, created with [mergekit](https://github.com/arcee-ai/mergekit):
- /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
- /content/drive/MyDrive/llama3_label_rationale_pretrained3
## 🧩 Configuration
```yaml
models:
  - model: /content/drive/MyDrive/llama3_anli1_rationale_ft_pretrained
    parameters:
      density: 1
      weight: 0.5
  - model: /content/drive/MyDrive/llama3_label_rationale_pretrained3
    parameters:
      density: 1
      weight: 0.5
  # - model: WizardLM/WizardMath-13B-V1.0
  #   parameters:
  #     density: 0.33
  #     weight:
  #       - filter: mlp
  #         value: 0.5
  #       - value: 0
merge_method: ties
base_model: /content/drive/MyDrive/Meta-Llama-3-8B-Instruct
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
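The configuration above can be applied with mergekit. Below is a minimal sketch using mergekit's Python API, assuming the YAML is saved as `config.yml`; the output directory and the `MergeOptions` shown are illustrative choices, not details taken from the original run.

```python
# Minimal sketch: apply the TIES merge config above with mergekit's Python API.
# `config.yml` and the output path are placeholders, not the original run's paths.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Llama-3-8B-NLI-ties3",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)
```

The same configuration can also be run from the command line with `mergekit-yaml config.yml ./Llama-3-8B-NLI-ties3 --cuda`, and the resulting directory can then be loaded with `transformers.AutoModelForCausalLM.from_pretrained`.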