
Quantization made by Richard Erkhov.

Github | Discord | Request more models

MedPaxTral-2x7b - bnb 4bits
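For reference, a bnb 4-bit quantization like this one is typically produced with transformers and bitsandbytes along the lines of the sketch below. The card does not state the exact settings or the source checkpoint path, so the repo id and NF4 parameters here are assumptions, not the uploader's confirmed recipe.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Hypothetical source repo id -- the original checkpoint path is not given in this card.
source_id = "MedPaxTral-2x7b"

# Common bitsandbytes 4-bit (NF4) settings; the actual parameters used
# for this upload are an assumption.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    source_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Recent transformers versions can serialize 4-bit weights directly, which
# is consistent with the mixed F32/FP16/U8 tensor types listed below.
model.save_pretrained("MedPaxTral-2x7b-bnb-4bits")
```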

Original model description:

license: apache-2.0
language:
  - en
library_name: transformers
pipeline_tag: text-generation
tags:
  - medical

A medical mixture-of-experts (MoE) model built by merging three leading models in the medical domain: BioMistral, Meditron, and Medalpaca. The merge was performed with the MergeKit library, a tool for combining the strengths of multiple models into a single unified LLM.
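A minimal inference sketch for this 4-bit upload, assuming the standard transformers + bitsandbytes stack; the repo id below is a placeholder for this upload's actual path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual path of this upload.
model_id = "RichardErkhov/MedPaxTral-2x7b-bnb-4bits"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# The weights are already stored in 4-bit form, so no quantization config
# is needed at load time; bitsandbytes and accelerate must be installed.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "List common symptoms of iron-deficiency anemia."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```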

Format: Safetensors
Model size: 5.42B params
Tensor types: F32, FP16, U8