Update README.md

README.md CHANGED
@@ -13,6 +13,12 @@ Medorca-11B is a Mixture of Experts (MoE) made with the following models:
  * [epfl-llm/meditron-7b](https://huggingface.co/epfl-llm/meditron-7b)
  * [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)

+ ## Evaluations
+
+ | Benchmark | Technoculture/Medorca-11B | epfl-llm/meditron-7b | microsoft/Orca-2-7b |
+ | --- | --- | --- | --- |
+ | | | | |
+
  ## 🧩 Configuration

  ```yaml
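# NOTE: the actual configuration was truncated in this diff. The lines below
# are only an illustrative sketch of what a mergekit-moe config combining
# these two expert models might look like; the base model choice, gate_mode,
# dtype, and routing prompts are assumptions, not the contents of the file.
base_model: microsoft/Orca-2-7b
gate_mode: hidden           # route tokens using hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts:
      - "medical"           # hypothetical prompt steering clinical queries here
  - source_model: microsoft/Orca-2-7b
    positive_prompts:
      - "reasoning"         # hypothetical prompt for general reasoning queries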