---
license: apache-2.0
---

# Summary

An instruction-following large language model based on [pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) and trained on [Databricks' databricks-dolly-15k dataset](https://huggingface.co/datasets/databricks/databricks-dolly-15k), which covers the capability domains from the InstructGPT paper: brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.

This model is an experiment in using a small base model ([pythia-70m](https://huggingface.co/EleutherAI/pythia-70m)) to build a model similar to Databricks' [dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b).