# GemMoE: Sharing Tools and Improved Base Models

I'm excited to share the tools I used to create GemMoE and release improved base models for the community to explore and build upon.

## Updates to GemMoE-Beta-1

GemMoE-Beta-1 will continue to serve as the repository for the modeling files required to run the Mixture of Experts (MoE) models. However, I will be removing the PyTorch weight files from that repository.

## New Models

I'm introducing two new models:

  1. **Crystalcareai/GemMoE-Base-Hidden**

     - A new MoE created using an improved method, explained below.
     - It uses a hidden gate and shows strong potential.
     - The model has not been altered and requires finetuning to reach its full potential.
     - If you want strong performance from relatively little training, this is an excellent starting point.
  2. **Crystalcareai/GemMoE-Base-Random**

     - Created with the same merge method as GemMoE-Base-Hidden, but with a RANDOM gate.
     - The gate selects experts at random during the merging process; see the sketch after this list for how the two gates differ.
     - With finetuning, the model learns to choose the appropriate experts on its own, potentially yielding better results than GemMoE-Base-Hidden.
     - This method is an intriguing middle ground between clown-car and Mixtral-style merge approaches.
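
To make the distinction concrete, here is a minimal, illustrative PyTorch sketch of the two gate initializations. The class and parameter names are my own inventions for illustration and are not taken from the GemMoE modeling files.

```python
import torch
import torch.nn as nn

class IllustrativeRouter(nn.Module):
    """Toy top-k MoE router contrasting "hidden" and "random" gate initialization."""

    def __init__(self, hidden_size, num_experts, top_k=2, gate_mode="hidden", expert_reps=None):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)
        if gate_mode == "random":
            # Random gate: start from arbitrary routing weights, so experts are
            # picked essentially at random until finetuning teaches the router.
            nn.init.normal_(self.gate.weight, mean=0.0, std=0.02)
        elif gate_mode == "hidden":
            # Hidden gate: seed each expert's routing row with a hidden-state
            # representation of that expert (shape [num_experts, hidden_size]),
            # so routing starts from a sensible policy.
            with torch.no_grad():
                self.gate.weight.copy_(expert_reps)

    def forward(self, x):
        logits = self.gate(x)                                 # [tokens, num_experts]
        weights, chosen = torch.topk(logits, self.top_k, -1)  # top-k experts per token
        return torch.softmax(weights, dim=-1), chosen
```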

The new merge method and modeling files also reduce VRAM usage, making the models easier to finetune.
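
As a rough sketch, loading one of these bases looks like a standard transformers load; I am assuming the custom GemMoE modeling files are picked up via `trust_remote_code`, so verify the exact arguments against the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Crystalcareai/GemMoE-Base-Random"  # or "Crystalcareai/GemMoE-Base-Hidden"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # bf16 keeps VRAM usage down
    device_map="auto",            # spread layers across available GPUs
    trust_remote_code=True,       # load the custom MoE modeling files
)
```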

## Training Experiences and Challenges

I have successfully trained the models on a single A100 using QLoRA, although it required careful monitoring and posed some difficulties; there appears to be an outstanding issue between QLoRA and GemMoE. I saw better VRAM usage on four A6000 cards, finetuning with DoRA (no quantization) under DeepSpeed ZeRO-3.
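
For reference, a DoRA run along those lines might look like the following PEFT sketch; the rank, alpha, and target_modules below are illustrative guesses for a Gemma-derived MoE, not recorded settings, so adjust to taste. (DeepSpeed ZeRO-3 is configured separately, e.g. through an accelerate config.)

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Crystalcareai/GemMoE-Base-Random",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

peft_config = LoraConfig(
    r=16,                     # illustrative rank
    lora_alpha=32,
    lora_dropout=0.05,
    use_dora=True,            # DoRA: weight-decomposed low-rank adaptation
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # guessed attention projections
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, peft_config)
model.print_trainable_parameters()
```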

## Creating Your Own Merges

You can create your own merges using my modified branch of mergekit:

```bash
git clone -b gemmoe https://github.com/Crystalcareai/mergekit.git
```
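
After cloning, you will most likely also need to install the package in editable mode (the standard mergekit setup; check the branch's README to be sure):

```bash
cd mergekit
pip install -e .
```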

To create an exact replica of Crystalcareai/GemMoE-Base-Hidden, use the following command:

```bash
mergekit-moe examples/gemmoe.yml ./merged --cuda --lazy-unpickle --allow-crimes
```

Feel free to modify examples/gemmoe.yml to customize the merge to your preferences.
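
I have not reproduced examples/gemmoe.yml here, but mergekit-moe configs generally follow the shape below; the base model, expert names, and prompts are placeholders, and the actual file in the branch is the authoritative reference.

```yaml
base_model: google/gemma-7b            # placeholder base model
gate_mode: hidden                      # "hidden" here; "random" for a GemMoE-Base-Random-style merge
dtype: bfloat16
experts:
  - source_model: your-org/gemma-expert-code      # placeholder expert
    positive_prompts:
      - "Write a Python function that"
  - source_model: your-org/gemma-expert-writing   # placeholder expert
    positive_prompts:
      - "Summarize the following article"
```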

Alternatively, you can use my modified lazymergekit available on Colab: Link to Colab Notebook

## Let's Collaborate!

I'm thrilled to see what we can create together using these tools and improved base models. Let's push the boundaries of what's possible with GemMoE and explore new possibilities in AI and machine learning.

Happy experimenting and building!