Text Generation
Transformers
English
mpt
llm-rs
ggml
text-generation-inference
LLukas22 committed
Commit c993cc9
1 Parent(s): b6fed65

Update README.md

Files changed (1): README.md (+6 -1)
README.md CHANGED
@@ -9,6 +9,11 @@ tags:
  ---
  # GGML converted versions of [Mosaic's](https://huggingface.co/mosaicml) MPT Models
 
+ MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
+ This model was trained by [MosaicML](https://www.mosaicml.com).
+
+ MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.
+
  ## Converted Models:
  | Name | Based on | Type | Container |
  |-|-|-|-|
@@ -32,7 +37,7 @@ tags:
  ### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):
 
  #### Installation
- Via pip: `pip install llm-rs huggingface_hub`
+ Via pip: `pip install llm-rs`
 
  #### Run inference
  ```python
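
The hunk above ends at the opening fence of the README's Python example, so the snippet itself is not part of this diff. As a rough illustration, here is a minimal inference sketch for the llm-rs Python bindings; it assumes the package exposes an `Mpt` model class with a `generate` method (the usage pattern documented in the llm-rs-python repository), and the model file path below is a placeholder for one of the converted files listed in the README.

```python
from llm_rs import Mpt  # MPT model class from the llm-rs Python bindings (assumed API)

# Placeholder path: point this at one of the converted GGML model files
# listed in the "Converted Models" table of the README.
model_path = "path/to/mpt-7b-q4_0-ggjt.bin"

# Load the quantized MPT model from disk.
model = Mpt(model_path)

# Generate a completion for a short prompt; the returned object contains
# the generated text.
result = model.generate("The meaning of life is")
print(result)
```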