Update README.md
README.md CHANGED

```diff
@@ -11,7 +11,7 @@ language:
 > TLDR: As part of OpenChatKit (codebase available [here](https://github.com/togethercomputer/OpenChaT)),
 > Pythia-Chat-Base-7B-v0.16 is a 7B parameter language model, fine-tuned from EleutherAI’s Pythia 7B with over 40 million instructions on 100% carbon negative compute.
 
-Pythia-Chat-Base-7B-v0.16 is based on ElutherAI’s
+Pythia-Chat-Base-7B-v0.16 is based on EleutherAI’s Pythia-7B model, and is fine-tuned with data focusing on dialog-style interactions.
 We focused the tuning on several tasks such as question answering, classification, extraction, and summarization.
 We’ve fine-tuned the model with a collection of 43 million high-quality instructions.
 Together partnered with LAION and Ontocord.ai, who both helped curate the dataset the model is based on.
```