---
datasets:
  - ehartford/dolphin
license: apache-2.0
---

Base Model: mosaicml/mpt-30b

Tool: MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)

Dataset: Entire flan1m-GPT4 dataset

Config yaml with Model Params: https://huggingface.co/manojpreveen/mpt-30b-v4/blob/main/mpt-30b_v4.yaml

Description: mosaicml/mpt-30b, fine-tuned on the entire flan3m-GPT3.5 dataset for 4 epochs

Prompt Format:

```
<system>: [system prompt]
<human>: [question]
<bot>:
```
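
For reference, below is a minimal sketch of querying the model with this prompt format through the Hugging Face transformers generation API. The system prompt and question are illustrative placeholders, and the dtype/device settings are assumptions about your hardware, not part of this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "manojpreveen/mpt-30b-v4"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit your GPU(s)
    trust_remote_code=True,      # MPT models ship custom modeling code
    device_map="auto",
)

# Build the prompt in the format the model was fine-tuned on.
prompt = (
    "<system>: You are a helpful assistant.\n"
    "<human>: What is the capital of France?\n"
    "<bot>:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that `trust_remote_code=True` is required because MPT architectures load custom modeling code rather than a built-in transformers class.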