MoCLE Model Card

MoCLE is a Multi-modality Large Language Model (MLLM) built on InstructBLIP that uses a Mixture-of-Experts (MoE) architecture for instruction customization and generalization. This repo contains the MoCLE cluster model with 64 instruction clusters, obtained with SentenceBERT embeddings. See detailed usage instructions in our GitHub repo and on our website.
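
The checkpoint can be fetched directly from the Hugging Face Hub. The sketch below only downloads the repository files with `huggingface_hub`; how the cluster model is then loaded and used for inference depends on the MoCLE codebase and is documented in the GitHub repo, not shown here.

```python
# Minimal sketch: download this checkpoint from the Hugging Face Hub.
# The repo id comes from this model card; loading/inference is handled
# by the MoCLE codebase (see the GitHub repo for actual usage).
from huggingface_hub import snapshot_download

# Download all files of the 64-cluster model to the local HF cache.
local_dir = snapshot_download(repo_id="KaiChen1998/mocle-cluster64")
print(f"Checkpoint files downloaded to: {local_dir}")
```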
