
Not Horny Enough

The Drummer becomes hornier

Recipe based on MarsupialAI/Monstral-123B, but using TheDrummer/Behemoth-123B-v1.1 as the base model.

This is a merge of pre-trained language models created using mergekit.

GGUF Quants:

Thank you, mradermacher, for honoring my request.

Merge Details

Merge Method

This model was merged using the SLERP merge method.
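
For intuition, spherical linear interpolation (SLERP) blends two weight tensors along the arc between them rather than along a straight line, which preserves the magnitude characteristics of the weights better than plain averaging. Below is a minimal NumPy sketch of the idea, not mergekit's actual implementation; the function name `slerp` and the flattened-vector treatment are illustrative assumptions.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    # Illustrative SLERP on flattened weight vectors (not mergekit's code).
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Fall back to linear interpolation when the vectors are nearly parallel.
    if 1.0 - abs(dot) < eps:
        return (1 - t) * v0 + t * v1
    theta = np.arccos(dot)          # angle between the two weight vectors
    s = np.sin(theta)
    # Weighted combination along the arc between v0 and v1.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

At t = 0 the result is exactly the first model's weights, at t = 1 the second's; intermediate t values trace the arc between them.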

Models Merged

The following models were included in the merge:

TheDrummer/Behemoth-123B-v1.1
anthracite-org/magnum-v4-123b

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: TheDrummer/Behemoth-123B-v1.1
  - model: anthracite-org/magnum-v4-123b
merge_method: slerp
base_model: TheDrummer/Behemoth-123B-v1.1
parameters:
  t: [0.1, 0.3, 0.6, 0.3, 0.1]
dtype: float16
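
As I understand mergekit's behavior (an assumption worth verifying against its docs), a list under `t` acts as a gradient interpolated across the layer stack: the first and last layers stay close to the base model (t ≈ 0.1) while the middle layers take the most from magnum (t peaks at 0.6). A rough sketch of that mapping, with the function name `layer_ts` as a hypothetical helper:

```python
import numpy as np

def layer_ts(gradient, num_layers):
    # Spread the anchor values evenly across the layers,
    # then linearly interpolate a per-layer t value.
    anchors = np.linspace(0, num_layers - 1, num=len(gradient))
    return np.interp(np.arange(num_layers), anchors, gradient)

# With the config's gradient over a hypothetical 9-layer stack:
print(layer_ts([0.1, 0.3, 0.6, 0.3, 0.1], 9))
```

The middle layer lands on the peak value 0.6, tapering back to 0.1 at both ends.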