Base Model : iamplus/mpt-30b-v2

Tool : MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)

Dataset : Entire flan1m-GPT4 dataset

Config yaml with Model Params : https://huggingface.co/iamplus/mpt-30b-v3/blob/main/mpt-30b_orca.yaml

Description : mosaicml/mpt-30b -> finetuned on the entire flan3m-GPT3.5 dataset for 1 epoch -> iamplus/mpt-30b-v2 -> finetuned on the entire flan1m-GPT4 dataset for 1 epoch -> iamplus/mpt-30b-v3

Prompt Format :

<system>: [system prompt]

<human>: [question]

<bot>:
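
The prompt layout above can be assembled with a small helper. This is a minimal sketch: the function name, the single-newline separator between tags, and the default system prompt are assumptions; only the `<system>:`/`<human>:`/`<bot>:` tags come from this card. The model is expected to continue generating after the trailing `<bot>:` tag.

```python
def build_prompt(question: str,
                 system_prompt: str = "You are a helpful assistant.") -> str:
    """Assemble a prompt in the card's <system>/<human>/<bot> layout.

    Note: the single-newline separator and the default system prompt
    are assumptions, not specified by the card.
    """
    return f"<system>: {system_prompt}\n<human>: {question}\n<bot>:"

# The resulting string is what you would pass to the tokenizer
# before calling the model's generate method.
prompt = build_prompt("What is the capital of France?")
```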