Quantization made by Richard Erkhov.
MedPaxTral-2x7b - bnb 8bits
- Model creator: https://huggingface.co/skuma307/
- Original model: https://huggingface.co/skuma307/MedPaxTral-2x7b/
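The "bnb 8bits" in the title refers to 8-bit quantization with the bitsandbytes library. Below is a minimal loading sketch using transformers; the repo id is assumed from the "Original model" link above, so substitute this quantized repo's id if you want the pre-quantized weights directly.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumed repo id, taken from the "Original model" link above.
model_id = "skuma307/MedPaxTral-2x7b"

# 8-bit quantization via bitsandbytes, matching the "bnb 8bits" in this card's title.
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on available GPU(s); requires accelerate
)
```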
Original model description:
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- medical
A medical Mixture-of-Experts (MoE) model developed through the amalgamation of three leading models in the medical domain: BioMistral, Meditron, and Medalpaca. This fusion was achieved using the MergeKit library, a tool designed to blend the strengths of multiple models into a unified, more capable LLM.
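Since the card's metadata declares `pipeline_tag: text-generation`, here is a minimal generation sketch; the repo id is taken from the links above, and the prompt is purely illustrative.

```python
from transformers import pipeline

# Assumed repo id from the links above; the prompt below is illustrative only.
generator = pipeline(
    "text-generation",
    model="skuma307/MedPaxTral-2x7b",
    device_map="auto",
)

prompt = "What are the first-line treatments for type 2 diabetes?"
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```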