rpand002 committed
Commit
13fcb5a
1 Parent(s): f76ec80

Update README.md

Files changed (1):
README.md +1 -1
README.md CHANGED
@@ -123,7 +123,7 @@ PowerMoE-3B is a 3B sparse Mixture-of-Experts (sMoE) language model trained with
 Paper: https://arxiv.org/abs/2408.13359
 
 ## Usage
-Note: requires a custom branch of transformers: https://github.com/mayank31398/transformers/tree/granitemoe
+Note: Requires installing HF transformers from source.
 
 ### Generation
 This is a simple example of how to use **PowerMoE-3b** model.
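
For reference, the updated note replaces the custom `granitemoe` branch with a source install of Transformers, which is typically done with `pip install git+https://github.com/huggingface/transformers.git` (the standard route when a model's support has been merged to main but is not yet in a tagged release).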
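
A minimal sketch of the generation usage the README's "### Generation" section points to, assuming the model is published on the Hugging Face Hub under the id `ibm/PowerMoE-3b` (an assumption; check the model card for the exact id) and that the source install above is in place:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm/PowerMoE-3b"  # assumed Hub id; verify against the model card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "A sparse mixture-of-experts model routes each token to"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 50 new tokens greedily and decode the result
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```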