pabloce committed
Commit 05fe7c8
1 Parent(s): 50d4f86

Update README.md

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -8,6 +8,9 @@ language:
 
 ## Mixtral Experts with DeepSeek-MoE Architecture
 
+[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/cognitivecomputations)
+Discord: https://discord.gg/cognitivecomputations
+
 This is a direct extraction of the 8 experts from [Mixtral-8x7b-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1), and a transfer of them into the DeepSeek-MoE Architecture.
 
 - **Expert Configuration:** It is 2 experts per token.
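
For loading the converted checkpoint, here is a minimal usage sketch with Hugging Face Transformers. It is an assumption-laden example, not part of the commit: the repo id is a placeholder, and it presumes the checkpoint ships the DeepSeek-MoE modeling code on the Hub (hence `trust_remote_code=True`).

```python
# Minimal usage sketch (placeholder repo id; the actual repository name is not given here).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/mixtral-deepseek-moe"  # placeholder, replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# The DeepSeek-MoE architecture is loaded from custom code shipped with the repo,
# so remote code must be trusted.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,
    device_map="auto",
)

inputs = tokenizer("The eight Mixtral experts were transplanted into", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```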