KaiChen1998 committed 073bcc7 (parent 4d09303): Update README.md
---
license: apache-2.0
---

# MoCLE Model Card

[MoCLE](https://arxiv.org/abs/2312.12379) is a Multi-modal Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization, built on [InstructBLIP](https://huggingface.co/docs/transformers/model_doc/instructblip).
This repo contains the MoCLE cluster model, which partitions instructions into 256 clusters based on Sentence-BERT embeddings.
Check detailed usage in our [GitHub repo](https://github.com/gyhdog99/mocle) and on our [website](https://kaichen1998.github.io/projects/mocle/).
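
A cluster model of this kind typically routes each instruction by embedding it with Sentence-BERT and picking the nearest cluster centroid. The following is only a minimal sketch of that nearest-centroid lookup, with toy random centroids standing in for the released checkpoint; the shapes, the cosine-similarity metric, and the function name are assumptions for illustration, not the repo's actual API:

```python
import numpy as np

def assign_cluster(embedding: np.ndarray, centroids: np.ndarray) -> int:
    """Return the index of the centroid most cosine-similar to the embedding."""
    emb = embedding / np.linalg.norm(embedding)
    cents = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return int(np.argmax(cents @ emb))

# Toy stand-in: 256 clusters over 384-dim embeddings
# (384 is the output size of common Sentence-BERT MiniLM models).
rng = np.random.default_rng(0)
centroids = rng.standard_normal((256, 384))

# An instruction embedding close to centroid 42 should route to cluster 42.
instruction_embedding = centroids[42] + 0.01 * rng.standard_normal(384)
cluster_id = assign_cluster(instruction_embedding, centroids)  # → 42
```

In the actual pipeline the embedding would come from a Sentence-BERT encoder and the centroids from the checkpoint in this repo; see the GitHub repo for the real usage.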