These are the model weights of MUFFIN-T5-11B (Multi-Faceted Instructions).

We fine-tuned the T5-11B model on our MUFFIN dataset.

We released both 3B and 11B models:

| Model | Number of parameters |
| --- | --- |
| MUFFIN-T5-3B | 3 billion |
| MUFFIN-T5-11B | 11 billion |

Please refer to MUFFIN-T5-3B for detailed documentation.
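
As a quick start, a minimal loading sketch with the Hugging Face `transformers` library might look like the following; the prompt and generation settings are illustrative assumptions, not prescribed by the paper:

```python
# Minimal usage sketch (assumes the standard `transformers` seq2seq API;
# `device_map="auto"` additionally requires the `accelerate` package).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "Reza8848/MUFFIN-T5-11B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name, device_map="auto")

# Illustrative instruction-style prompt; not an official MUFFIN prompt format.
prompt = "Write a short definition of instruction tuning."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```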

🥳 Citation

Please cite our paper if you use any resources from this repository:

@inproceedings{Lou2023MUFFIN,
   title={{MUFFIN}: Curating Multi-Faceted Instructions for Improving Instruction Following},
   author={Renze Lou and Kai Zhang and Jian Xie and Yuxuan Sun and Janice Ahn and Hanzi Xu and Yu Su and Wenpeng Yin},
   booktitle={The Twelfth International Conference on Learning Representations},
   year={2024},
   url={https://openreview.net/forum?id=1vrS1zwekw}
}