satyamt committed on
Commit
7c1597d
1 Parent(s): 74cee96

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -34,7 +34,8 @@ Medtulu-2x7b is a Mixure of Experts (MoE) made with the following models:
 
 ## 🧩 Configuration
 
-```yamlbase_model: allenai/tulu-2-dpo-7b
+```yaml
+base_model: allenai/tulu-2-dpo-7b
 gate_mode: hidden
 dtype: bfloat16
 experts:
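
For reference, after this change the configuration block in README.md opens as shown below. Only the lines visible in the hunk above are reproduced; the entries under `experts:` fall outside the diff, so a placeholder comment marks where they continue. This layout follows the MoE configuration style the README documents.

```yaml
base_model: allenai/tulu-2-dpo-7b
gate_mode: hidden
dtype: bfloat16
experts:
  # expert entries continue past the end of the hunk and are not shown in this diff
```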