|
Quantization made by Richard Erkhov. |
|
|
|
[Github](https://github.com/RichardErkhov) |
|
|
|
[Discord](https://discord.gg/pvy7H8DZMG) |
|
|
|
[Request more models](https://github.com/RichardErkhov/quant_request) |
|
|
|
|
|
MedPaxTral-2x7b - bnb 4bits |
|
- Model creator: https://huggingface.co/skuma307/ |
|
- Original model: https://huggingface.co/skuma307/MedPaxTral-2x7b/ |
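The snippet below is a minimal sketch of running this model with bitsandbytes 4-bit quantization through `transformers`. It points at the original `skuma307/MedPaxTral-2x7b` repository and quantizes at load time; the specific `BitsAndBytesConfig` settings (NF4, bfloat16 compute) are assumptions, since the exact options used to produce this quant are not stated here.

```python
# Minimal sketch: load MedPaxTral-2x7b with bitsandbytes 4-bit quantization.
# The quantization settings below (NF4, bfloat16 compute) are assumptions;
# the exact options used for this quant are not documented here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "skuma307/MedPaxTral-2x7b"  # original (unquantized) repository

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4 bits at load time
    bnb_4bit_quant_type="nf4",              # NormalFloat4, a common default
    bnb_4bit_compute_dtype=torch.bfloat16,  # dtype used for matmuls
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # requires the `accelerate` package
)

prompt = "List common first-line treatments for hypertension."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```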
|
|
|
|
|
|
|
|
|
Original model description: |
|
--- |
|
license: apache-2.0 |
|
language: |
|
- en |
|
library_name: transformers |
|
pipeline_tag: text-generation |
|
tags: |
|
- medical |
|
|
|
--- |
|
A medical Mixture-of-Experts (MoE) model built by merging three leading models in the medical domain: BioMistral, Meditron, and MedAlpaca. The merge was performed with the MergeKit library, a tool for combining the strengths of multiple models into a single, unified LLM.
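As an illustration of this kind of merge, the sketch below drives MergeKit's `mergekit-moe` entry point from Python. The config contents are hedged guesses: the actual base model, expert assignment, and routing prompts used for MedPaxTral-2x7b are not published in this card, and the field names follow the upstream mergekit-moe config format.

```python
# Hypothetical sketch of a Mixtral-style MoE merge with MergeKit.
# The base model, expert choices, and routing prompts below are assumptions
# for illustration; the real MedPaxTral-2x7b recipe is not published here.
import subprocess
import yaml  # PyYAML

config = {
    "base_model": "BioMistral/BioMistral-7B",  # assumed base; not confirmed
    "gate_mode": "hidden",  # initialize router gates from hidden-state activations
    "dtype": "bfloat16",
    "experts": [
        {
            "source_model": "epfl-llm/meditron-7b",
            "positive_prompts": ["clinical guidelines", "evidence-based medicine"],
        },
        {
            "source_model": "medalpaca/medalpaca-7b",
            "positive_prompts": ["patient question", "medical Q&A"],
        },
    ],
}

with open("moe_config.yml", "w") as f:
    yaml.safe_dump(config, f)

# mergekit ships a `mergekit-moe` command for assembling MoE checkpoints
subprocess.run(["mergekit-moe", "moe_config.yml", "./MedPaxTral-2x7b-moe"], check=True)
```

The "2x7b" in the model name suggests two 7B experts routed per token, which is why this sketch places two expert models on top of a shared base; whether that matches the original recipe is an assumption.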
|
|
|
|