RichardErkhov committed on
Commit
31af6dd
1 Parent(s): 106a46c

uploaded readme

Files changed (1): README.md +29 -0
README.md ADDED
@@ -0,0 +1,29 @@
+ Quantization made by Richard Erkhov.
+
+ [Github](https://github.com/RichardErkhov)
+
+ [Discord](https://discord.gg/pvy7H8DZMG)
+
+ [Request more models](https://github.com/RichardErkhov/quant_request)
+
+ MedPaxTral-2x7b - bnb 8bits
+ - Model creator: https://huggingface.co/skuma307/
+ - Original model: https://huggingface.co/skuma307/MedPaxTral-2x7b/
+
+ Original model description:
+ ---
+ license: apache-2.0
+ language:
+ - en
+ library_name: transformers
+ pipeline_tag: text-generation
+ tags:
+ - medical
+ ---
+ A medical Mixture-of-Experts (MoE) model built by merging three leading models in the medical domain: BioMistral, Meditron, and Medalpaca. The merge was performed with the MergeKit library, a tool for combining the strengths of multiple models into a single, more capable LLM.
+
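+ A minimal usage sketch for loading this bnb 8-bit quantization with `transformers`, assuming `bitsandbytes` and `accelerate` are installed. The repo id below is an assumption; adjust it to the actual path of this quantized upload.
+
+ ```python
+ # Minimal sketch: load the bnb 8-bit quantized MedPaxTral-2x7b.
+ # Requires: pip install transformers accelerate bitsandbytes
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model_id = "RichardErkhov/skuma307_-_MedPaxTral-2x7b-8bits"  # assumed repo id
+
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ # The checkpoint ships with its 8-bit quantization_config, so
+ # from_pretrained restores the 8-bit weights automatically;
+ # device_map="auto" places layers on available GPUs/CPU.
+ model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
+
+ prompt = "What are the first-line treatments for type 2 diabetes?"
+ inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
+ outputs = model.generate(**inputs, max_new_tokens=128)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```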