Text Generation
Transformers
PyTorch
mpt
Composer
MosaicML
llm-foundry
conversational
custom_code
text-generation-inference
kartikmosaicml committed on
Commit c41b073
1 Parent(s): 084d672

Update README.md

Files changed (1)
  1. README.md +3 -2
README.md CHANGED
@@ -45,7 +45,7 @@ _CC-By-NC-SA-4.0_ (non-commercial use only)
 
 ## Documentation
 
-* [Blog post: MPT-30B: Raising the bar for open-source commercial foundation models](https://www.mosaicml.com/blog/mpt-30b)
+* [Blog post: Raising the bar for open-source foundation models](https://www.mosaicml.com/blog/mpt-30b)
 * [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
 * Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
 
@@ -224,7 +224,8 @@ Please cite this model using the following format:
 ```
 @online{MosaicML2023Introducing,
     author = {MosaicML NLP Team},
-    title = {Introducing MPT-30B: Raising the bar for open-source commercial foundation models},
+    title = {Introducing MPT-30B: Raising the bar
+             for open-source foundation models},
     year = {2023},
     url = {www.mosaicml.com/blog/mpt-30b},
     note = {Accessed: 2023-06-22},