Tags: Text Generation, Transformers, PyTorch, mpt, Composer, MosaicML, llm-foundry, conversational, custom_code, text-generation-inference

Update README.md

#1 opened by ejyuen

Files changed (1): README.md (+0 -2)
README.md CHANGED
@@ -31,8 +31,6 @@ It was built by finetuning [MPT-7B-8k](https://huggingface.co/mosaicml/mpt-7b-8k
 [GPTeacher](https://github.com/teknium1/GPTeacher), [Guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco), [Baize](https://github.com/project-baize/baize-chatbot) and some generated datasets.
 This is the same dataset that [MPT-30B-Chat](https://huggingface.co/mosaicml/mpt-30b-chat) was trained on.
 * License: _CC-By-NC-SA-4.0_ (non-commercial use only)
-* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-chat)
-
 
 This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
 
 
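For context on the model this card describes: the `custom_code` tag above means the repository ships its own modeling code, so loading it through Transformers requires `trust_remote_code=True`. A minimal loading sketch follows; the repo id `mosaicml/mpt-7b-8k-chat` is an assumption inferred from the card text (it says the model was built by finetuning MPT-7B-8k on the MPT-30B-Chat dataset), not something stated on this PR page.

```python
# A minimal sketch, assuming the repo id is mosaicml/mpt-7b-8k-chat.
import torch
import transformers

name = "mosaicml/mpt-7b-8k-chat"  # assumed repo id

# The `custom_code` tag means the repo ships its own modeling code,
# so Transformers must be told to trust and execute it.
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained(name)

# Quick smoke test: generate a short continuation.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the _CC-By-NC-SA-4.0_ license in the diff above restricts this to non-commercial use.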