rhysjones committed
Commit ee20741
1 Parent(s): 35375e1

Update README.md

Files changed (1):
  README.md +3 -3

README.md CHANGED
@@ -7,9 +7,9 @@ license: apache-2.0
 HelixNet-LMoE is a simple LoRA-based Mixture of Experts version of the [HelixNet](https://huggingface.co/migtissera/HelixNet) 3-model system by [Migel Tissera](https://huggingface.co/migtissera).
 
 For each HelixNet model, a separate LoRA adapter was extracted:
-* [HelixNet-LMoE-Actor](rhysjones/HelixNet-LMoE-Actor)
-* [HelixNet-LMoE-Critic](rhysjones/HelixNet-LMoE-Critic)
-* [HelixNet-LMoE-Regenerator](rhysjones/HelixNet-LMoE-Regenerator)
+* [HelixNet-LMoE-Actor](https://huggingface.co/rhysjones/HelixNet-LMoE-Actor)
+* [HelixNet-LMoE-Critic](https://huggingface.co/rhysjones/HelixNet-LMoE-Critic)
+* [HelixNet-LMoE-Regenerator](https://huggingface.co/rhysjones/HelixNet-LMoE-Regenerator)
 
 These are then loaded together with the base [Mistral 7b](https://huggingface.co/mistralai/Mistral-7B-v0.1) model to give the combined LMoE model.
 
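
For context on the loading step the README describes, a minimal sketch of attaching the three adapters to the shared base model is below. It assumes the `transformers` and `peft` libraries; the `adapter_name` labels and the final `set_adapter` call are illustrative choices, not the repository's actual loading code.

```python
# Minimal sketch: attach the three HelixNet-LMoE LoRA adapters to a single
# copy of the Mistral-7B base weights and switch between them per role.
# Assumes `transformers` and `peft` are installed; the adapter names used
# here are illustrative, not taken from the repository's own code.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

# Load the first adapter, then attach the other two to the same base model,
# so only one set of base weights is held in memory.
model = PeftModel.from_pretrained(
    base, "rhysjones/HelixNet-LMoE-Actor", adapter_name="actor"
)
model.load_adapter("rhysjones/HelixNet-LMoE-Critic", adapter_name="critic")
model.load_adapter("rhysjones/HelixNet-LMoE-Regenerator", adapter_name="regenerator")

# Activate whichever expert is needed for the current step.
model.set_adapter("actor")
```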