Update README.md
README.md CHANGED
@@ -7,9 +7,11 @@ tags:
 - allenai/tulu-2-dpo-7b
 ---
 
-
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/63486df1f8f01fcc4b23e97d/C88GQfAqHnvTMIAGqTNy9.png)
 
-Medtulu-
+# Medtulu-2x7b
+
+Medtulu-2x7b is a Mixture of Experts (MoE) made with the following models:
 * [epfl-llm/meditron-7b](https://huggingface.co/epfl-llm/meditron-7b)
 * [allenai/tulu-2-dpo-7b](https://huggingface.co/allenai/tulu-2-dpo-7b)
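MoE merges of existing dense checkpoints like this are commonly built with mergekit's `mergekit-moe` tool. A minimal config sketch under that assumption (the `positive_prompts` routing hints below are hypothetical illustrations, not taken from this repo):

```yaml
# Hypothetical mergekit-moe config combining the two source models
# into a 2-expert MoE. Run with: mergekit-moe config.yml ./Medtulu-2x7b
base_model: allenai/tulu-2-dpo-7b
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts:      # hypothetical hints steering medical queries to this expert
      - "medical"
      - "clinical diagnosis"
  - source_model: allenai/tulu-2-dpo-7b
    positive_prompts:      # hypothetical hints for general instruction following
      - "instruction"
      - "general question answering"
```

With `gate_mode: hidden`, the router weights are initialized from the base model's hidden representations of the prompt hints, so each expert tends to receive the kind of input it was trained on.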