arlineka committed on
Commit
5ed9742
1 Parent(s): 4abe440

Update README.md

Files changed (1)
  1. README.md +3 -2
README.md CHANGED
@@ -9,9 +9,10 @@ tags:
  Brunhilde-2x7b-MOE-DPO-v.01.5 is a Mixture of Experts (MoE).
  * [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
  * [mncai/mistral-7b-dpo-v6](https://huggingface.co/mncai/mistral-7b-dpo-v6)
- ```
- ```python
+ ## Usage
+
+ ```
  !pip install -qU transformers bitsandbytes accelerate

  from transformers import AutoTokenizer
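
The diff truncates the usage snippet right after the `AutoTokenizer` import. A plausible completion is sketched below — this is an assumption about the intended flow, not the author's exact code: it uses the standard `transformers` text-generation API with 4-bit loading via `bitsandbytes` (which the `pip install` line suggests), and the repo id `arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5` is inferred from the model name, not stated in this commit.

```python
# Hedged sketch of the truncated usage section.
# Assumptions: repo id inferred from the card; 4-bit bitsandbytes loading.
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

MODEL_ID = "arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5"  # assumed, not confirmed


def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and the 4-bit quantized MoE model (needs a GPU)."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=BitsAndBytesConfig(load_in_4bit=True),
        device_map="auto",
    )
    return tokenizer, model


def generate(tokenizer, model, prompt: str, max_new_tokens: int = 128) -> str:
    """Tokenize a prompt, run greedy generation, and decode the result."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    tok, mdl = load_model()
    print(generate(tok, mdl, "Explain Mixture of Experts in one paragraph."))
```

The heavy download and generation are guarded behind `__main__`, so importing the snippet is cheap; swap `MODEL_ID` for the actual repo id if it differs.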