mandelakori committed
Commit: c318b7a
Parent(s): 9b2fc83

Update README.md

Files changed (1):
  1. README.md (+2 -1)
README.md CHANGED
@@ -12,8 +12,9 @@ AISAK, short for Artificially Intelligent Swiss Army Knife, is a state-of-the-ar
 - **Model Name**: AISAK
 - **Version**: 1.0
 - **Model Architecture**: Mixture of Experts (MoE)
-- **Specialization**: The model is divided into distinct expert modules, each adept at capturing specific patterns and features within the input data.
+- **Specialization**: AISAK is built on the Mixture of Experts (MoE) architecture, following the design popularized by [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1). The model is divided into distinct expert modules, each adept at capturing specific patterns and features within the input data.
 - **Gating Mechanism**: A dynamic gating mechanism intelligently selects and combines the outputs of these experts based on the input data, enhancing adaptability and performance.
+- **Performance Comparison**: Although AISAK has a smaller parameter count than Mixtral 8x7B, careful optimization and the strengths of the MoE architecture keep its performance closely comparable to that of the larger model.
 
 ### Intended Use:
 
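The gating behaviour described in the updated README is easiest to see in code. The sketch below is a minimal, generic top-k MoE layer in PyTorch; it is not taken from AISAK's codebase, and the class name `SimpleMoELayer`, the expert shapes, and the hyperparameters (`num_experts=8`, `top_k=2`) are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoELayer(nn.Module):
    """Minimal Mixture-of-Experts layer: a gating network scores the experts
    for each token, and the layer returns a weighted sum of the top-k experts."""

    def __init__(self, d_model: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network (hypothetical shapes).
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts)
        ])
        # The gating network produces one routing logit per expert.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        logits = self.gate(x)                               # (B, T, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)                # normalise the selected scores
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e              # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: route a batch of token embeddings through the layer.
layer = SimpleMoELayer(d_model=64, num_experts=8, top_k=2)
y = layer(torch.randn(2, 10, 64))  # -> (2, 10, 64)
```

Softmaxing only the selected top-k logits mirrors the routing scheme used by Mixtral-style MoE models, where each token activates only a small subset of the experts.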