Update README.md
README.md CHANGED
````diff
@@ -34,7 +34,8 @@ Medtulu-2x7b is a Mixure of Experts (MoE) made with the following models:
 
 ## 🧩 Configuration
 
-```
+```yaml
+base_model: allenai/tulu-2-dpo-7b
 gate_mode: hidden
 dtype: bfloat16
 experts:
````
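For context, the snippet being edited follows mergekit's MoE config format, where the truncated `experts:` key introduces a list of expert models with routing prompts. A minimal sketch of that shape, with placeholder model names and prompts (the actual expert entries are not shown in this diff and are assumptions here):

```yaml
# Hypothetical mergekit MoE config sketch; expert entries are placeholders,
# not the actual Medtulu-2x7b experts.
base_model: allenai/tulu-2-dpo-7b
gate_mode: hidden        # route tokens using hidden-state representations
dtype: bfloat16
experts:
  - source_model: example-org/medical-expert-7b   # placeholder
    positive_prompts:
      - "Answer this medical question."
  - source_model: allenai/tulu-2-dpo-7b           # placeholder
    positive_prompts:
      - "Help with this general task."
```

With `gate_mode: hidden`, mergekit derives the router from hidden-state activations on the `positive_prompts`, so each expert's prompts should reflect the inputs it is meant to handle.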