---
license: cc-by-nc-sa-4.0
datasets:
  - mosaicml/dolly_hhrlhf
tags:
  - Composer
  - MosaicML
  - llm-foundry
---

# MPT-7B-Chat

MPT-7B-Chat is a chatbot-like model for dialogue generation. It was built by finetuning MPT-7B on the ShareGPT-Vicuna, HC3, Alpaca, HH-RLHF, and Evol-Instruct datasets.

This model was trained by MosaicML and follows a modified decoder-only transformer architecture.
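
Because that modified architecture ships as custom model code bundled with the checkpoint rather than as a native `transformers` class, loading it requires `trust_remote_code=True`. Below is a minimal loading sketch; the `bfloat16` dtype is an assumption to reduce memory, not a requirement:

```python
import torch
import transformers

# MPT's modified decoder-only architecture is implemented in custom code
# shipped with the checkpoint, so trust_remote_code=True is required.
model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b-chat',
    torch_dtype=torch.bfloat16,  # assumption: bf16 halves memory vs. fp32
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained('mosaicml/mpt-7b-chat')
```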

## Model Date

May 5, 2023

## Model License

CC-By-NC-SA-4.0 (non-commercial use only)

## Documentation

* Blog post: [Introducing MPT-7B](https://www.mosaicml.com/blog/mpt-7b)
* Codebase: [mosaicml/llm-foundry](https://github.com/mosaicml/llm-foundry)

### Example Dialogue

**SamIAm85**:
> I want you to come up with a tweet based on this summary of the article: "Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and it matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Starting today, you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!"

**MPT-7B-Chat**:
> Here's my suggestion for a tweet: "Exciting news! @MosaicML has launched #MPT_7B - a new standard for open-source, commercially usable LLMs. With training on 1T tokens of text & code, it matches the quality of LLaMA-7B. Now you can train, fine-tune, and deploy your own private MPT models via their platform. Check out the full blog post here: https://bit.ly/3j6UJzZ"

**SamIAm85**:
> Is there anything you want to say to MosaicML about this exciting news?

**MPT-7B-Chat**:
> As an AI language model myself, I am very impressed with the advancements made by MosaicML in developing such high-quality open-source LLMs. This new version of MPT-7B is truly impressive, and I look forward to seeing what innovative applications developers will create using these powerful tools. Thank you for your hard work and dedication to advancing AI research and development.
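
A turn like the ones above can be generated with the standard `generate` API. The following is a minimal, self-contained sketch: the `SamIAm85:` / `MPT-7B-Chat:` prompt format simply mirrors the transcript and is not an official chat template, and the sampling settings are illustrative assumptions.

```python
import torch
import transformers

name = 'mosaicml/mpt-7b-chat'
tokenizer = transformers.AutoTokenizer.from_pretrained(name)
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to reduce memory
    trust_remote_code=True,      # MPT ships custom model code
)

# Hypothetical prompt mirroring the transcript's speaker format;
# not an official chat template.
prompt = "SamIAm85:\nWhat is MPT-7B?\n\nMPT-7B-Chat:\n"
inputs = tokenizer(prompt, return_tensors='pt')

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=128,  # cap the length of the reply
        do_sample=True,      # sample for chat-like variety
        temperature=0.7,     # assumption: a common chat setting
    )

# Decode only the newly generated tokens as the model's reply.
print(tokenizer.decode(output[0, inputs['input_ids'].shape[1]:],
                       skip_special_tokens=True))
```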

## Acknowledgements

This model was finetuned by Sam Havens and the MosaicML NLP team.