---
license: apache-2.0
---
|
# MoCLE Model Card |
|
[MoCLE](https://arxiv.org/abs/2312.12379) is a Multimodal Large Language Model (MLLM) built on [InstructBLIP](https://huggingface.co/docs/transformers/model_doc/instructblip) that uses a Mixture-of-Experts (MoE) architecture for instruction customization and generalization.
|
This repository contains the MoCLE cluster model, which groups instructions into 64 clusters based on SentenceBERT embeddings.
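
As a rough illustration (not the official API; see the GitHub repo linked below for the actual loading code), a cluster model like this can route an instruction to one of the 64 clusters by embedding it with a SentenceBERT encoder and picking the nearest cluster centroid. The encoder checkpoint and the `centroids.npy` file in this sketch are assumptions, not artifacts shipped in this repo:

```python
# Hedged sketch of instruction-to-cluster routing; the encoder checkpoint
# and centroid file name below are assumptions, not part of this repo's API.
import numpy as np
from sentence_transformers import SentenceTransformer

# Embed the instruction with a SentenceBERT encoder (the exact checkpoint
# MoCLE uses may differ from this one).
encoder = SentenceTransformer("all-MiniLM-L6-v2")
embedding = encoder.encode(["Describe the image in detail."])  # shape (1, dim)

# Hypothetical file holding the 64 cluster centroids, shape (64, dim).
centroids = np.load("centroids.npy")

# Assign the instruction to the nearest centroid by Euclidean distance.
distances = np.linalg.norm(centroids - embedding, axis=1)
cluster_id = int(np.argmin(distances))
print(f"Instruction routed to cluster {cluster_id}")
```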
|
See detailed usage instructions in our [GitHub repo](https://github.com/gyhdog99/mocle) and on our [website](https://kaichen1998.github.io/projects/mocle/).