---
language:
  - en
metrics:
  - accuracy
library_name: transformers
base_model: OEvortex/HelpingAI-Lite
tags:
  - HelpingAI
  - coder
  - lite
  - Fine-tuned
  - moe
  - nlp
license: other
license_name: hsul
license_link: https://huggingface.co/OEvortex/vortex-3b/raw/main/LICENSE.md
---

# HelpingAI-Lite

Subscribe to my YouTube channel

HelpingAI-Lite-2x1B is a Mixture of Experts (MoE) model that achieves higher accuracy than HelpingAI-Lite, at the cost of slightly slower inference. This trade-off makes it a good choice when accuracy matters more than raw speed.
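
Since the metadata above lists `transformers` as the library, the model can presumably be loaded like any other causal language model. The sketch below is illustrative only: the Hub repo id `OEvortex/HelpingAI-Lite-2x1B` and the example prompt are assumptions, not taken from this card.

```python
# Minimal usage sketch. Assumes the checkpoint is published on the
# Hugging Face Hub as "OEvortex/HelpingAI-Lite-2x1B" (unverified).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OEvortex/HelpingAI-Lite-2x1B"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical prompt; the card tags the model as a coder model.
prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```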

## Language

The model supports English.