Commit 0f18a20 by Crystalcareai
1 parent: 6eeeab9

Update howto.md

Files changed (1): howto.md (+2, -2)
howto.md CHANGED
@@ -10,13 +10,13 @@ GemMoE-Beta-1 will continue to serve as the repository for the `modeling_files`
  
  I'm introducing two new models:
  
- 1. **Crystalcareai/GemMoE-Base-Hidden**
+ 1. [**Crystalcareai/GemMoE-Base-Hidden**](https://huggingface.co/Crystalcareai/GemMoE-Base-Hidden)
  - This is a new MoE created using an improved method that I will explain below.
  - It utilizes a hidden gate and shows strong potential.
  - The model has not been altered and requires finetuning to reach its full potential.
  - If you're looking to achieve great performance with relatively minimal training, this is an excellent starting point.
  
- 2. **Crystalcareai/GemMoE-Base-Random**
+ 2. [**Crystalcareai/GemMoE-Base-Random**](https://huggingface.co/Crystalcareai/GemMoE-Base-Random)
  - This model was created using the same merge method as GemMoE-Base-Hidden, but with a RANDOM gate.
  - It randomly selects the experts during the merging process.
  - With finetuning, the model learns to choose the appropriate experts naturally, potentially leading to better results compared to GemMoE-Base-Hidden.
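Since the changed section of howto.md points readers at the two new bases, here is a minimal sketch of how either might be loaded as a finetuning starting point with the Hugging Face `transformers` library. It is a sketch under assumptions, not the author's documented procedure: it assumes the custom GemMoE modeling code (hosted alongside GemMoE-Beta-1's `modeling_files`) requires `trust_remote_code=True`, and the `bfloat16`/`device_map` choices are illustrative; the repo IDs come from the links in the diff above.

```python
# Minimal loading sketch for the new GemMoE bases (assumptions noted inline).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Crystalcareai/GemMoE-Base-Hidden"  # or "Crystalcareai/GemMoE-Base-Random"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to keep the merged experts in memory
    device_map="auto",           # requires `accelerate`; spreads experts across available devices
    trust_remote_code=True,      # assumption: needed to pull in the custom GemMoE modeling_files
)

# Both bases are described as starting points that need finetuning, so generation here
# is only a smoke test that the checkpoint and custom code load correctly.
inputs = tokenizer("Hello, world.", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```

The same call works for GemMoE-Base-Random; per the diff, its randomly initialized gate is expected to learn expert routing during finetuning rather than produce strong outputs out of the box.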