Crystalcareai committed on
Commit 99ae855
1 Parent(s): 9344ca5

Update README.md

Files changed (1)
  1. README.md +10 -2
README.md CHANGED
@@ -11,9 +11,17 @@ license_link: https://ai.google.dev/gemma/terms
 </p>
 
 
+# ⚠️ REPO DEPRECATED ⚠️
 
-Note: If you wish to use GemMoE while it is in beta, you must flag (trust_remote_code=True) in your training/inference config.
-FlashAttention is currently required - so ensure you have the most up to date version if you're using anything other than an A100/H100
+> **NOTE:** This repo is deprecated in favor of the models found [here](https://huggingface.co/Crystalcareai/GemMoE-Base-Random/blob/main/howto.md).
+
+## Purpose of this Repo
+
+This repo purely serves to host the remote files needed to make GemMoE run with transformers.
+
+## ❌ Training and Inference
+
+If you try to run training or inference from here, it will not work. Please refer to the new location of the models mentioned above.
 
 
 # GemMoE: An 8x8 Mixture Of Experts based on Gemma.
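The removed beta note above said that `trust_remote_code=True` had to be set so transformers could execute the custom MoE modeling files this repo hosts, and that FlashAttention was required. A minimal sketch of the load kwargs that note implies, assuming the standard `AutoModelForCausalLM.from_pretrained` interface; the kwarg names are real transformers parameters, but any repo id you pass is your own:

```python
def gemmoe_load_kwargs(use_flash_attention: bool = True) -> dict:
    """Keyword arguments for AutoModelForCausalLM.from_pretrained,
    per the (now-removed) GemMoE beta note."""
    # trust_remote_code=True lets transformers run the remote modeling
    # code hosted in the repo (required while GemMoE was in beta).
    kwargs = {"trust_remote_code": True}
    if use_flash_attention:
        # The note said FlashAttention was required; transformers selects
        # it via the attn_implementation kwarg.
        kwargs["attn_implementation"] = "flash_attention_2"
    return kwargs

# Usage (needs transformers plus the model weights, so not executed here):
# model = AutoModelForCausalLM.from_pretrained(repo_id, **gemmoe_load_kwargs())
```

Since the repo is deprecated, these kwargs should be pointed at the new model location linked in the note above, not at this repo.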