LLukas22 committed
Commit 605746f
1 Parent(s): 4c7d779

Update README_TEMPLATE.md

Files changed (1)
  1. README_TEMPLATE.md +39 -72
README_TEMPLATE.md CHANGED
@@ -1,83 +1,32 @@
---
- license: bigscience-bloom-rail-1.0
+ license: apache-2.0
language:
- - ak
- - ar
- - as
- - bm
- - bn
- - ca
- - code
- en
- - es
- - eu
- - fon
- - fr
- - gu
- - hi
- - id
- - ig
- - ki
- - kn
- - lg
- - ln
- - ml
- - mr
- - ne
- - nso
- - ny
- - or
- - pa
- - pt
- - rn
- - rw
- - sn
- - st
- - sw
- - ta
- - te
- - tn
- - ts
- - tum
- - tw
- - ur
- - vi
- - wo
- - xh
- - yo
- - zh
- - zu
- programming_language:
- - C
- - C++
- - C#
- - Go
- - Java
- - JavaScript
- - Lua
- - PHP
- - Python
- - Ruby
- - Rust
- - Scala
- - TypeScript
+ pipeline_tag: text-generation
tags:
- llm-rs
- ggml
- pipeline_tag: text-generation
+ datasets:
+ - mc4
+ - c4
+ - togethercomputer/RedPajama-Data-1T
+ - bigcode/the-stack
+ - allenai/s2orc
+ inference: false
---
+ # GGML converted versions of [Mosaic's](https://huggingface.co/mosaicml) MPT Models

- # GGML covnerted Models of [BigScience](https://huggingface.co/bigscience)'s Bloom models
-
- ## Description
+ MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
+ This model was trained by [MosaicML](https://www.mosaicml.com).

- BLOOM is an autoregressive Large Language Model (LLM), trained to continue text from a prompt on vast amounts of text data using industrial-scale computational resources. As such, it is able to output coherent text in 46 languages and 13 programming languages that is hardly distinguishable from text written by humans. BLOOM can also be instructed to perform text tasks it hasn't been explicitly trained for, by casting them as text generation tasks.
+ MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.

-
- ## Converted Models
+ ## Converted Models:
$MODELS$

- ## Usage
+ ⚠️Caution⚠️: mpt-7b-storywriter is still under development!
+
+ ## Usage

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):

@@ -89,13 +38,12 @@ Via pip: `pip install llm-rs`
from llm_rs import AutoModel

#Load the model, define any model you like from the list above as the `model_file`
- model = AutoModel.from_pretrained("rustformers/bloom-ggml",model_file="bloom-3b-q4_0-ggjt.bin")
+ model = AutoModel.from_pretrained("rustformers/mpt-7b-ggml",model_file="mpt-7b-q4_0-ggjt.bin")

#Generate
print(model.generate("The meaning of life is"))
```
-
- ### Rust via [Rustformers/llm](https://github.com/rustformers/llm):
+ ### Rust via [rustformers/llm](https://github.com/rustformers/llm):

#### Installation
```
@@ -106,5 +54,24 @@ cargo build --release

#### Run inference
```
- cargo run --release -- bloom infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
+ cargo run --release -- mpt infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
+ ```
+
+ ### C via [GGML](https://github.com/ggerganov/ggml)
+ The `GGML` example only supports the ggml container type!
+
+ #### Installation
+
+ ```
+ git clone https://github.com/ggerganov/ggml
+ cd ggml
+ mkdir build && cd build
+ cmake ..
+ make -j4 mpt
+ ```
+
+ #### Run inference
+
+ ```
+ ./bin/mpt -m path/to/model.bin -p "The meaning of life is"
  ```
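
The usage examples in the template assume the converted weights end up on disk, either fetched automatically by `AutoModel.from_pretrained` or passed to the CLI tools via `-m path/to/model.bin`. A minimal sketch of fetching one of the listed files manually, assuming `huggingface_hub` is installed (it is not mentioned in the template itself), with the repo and file name taken from the Python example:

```
# Sketch: download the converted weights into the local Hugging Face cache.
# Assumes `pip install huggingface_hub`; repo and file name are taken from
# the Python example in the template above.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="rustformers/mpt-7b-ggml",
    filename="mpt-7b-q4_0-ggjt.bin",
)

# The returned path can be substituted for `path/to/model.bin` in the CLI
# examples, e.g.
#   cargo run --release -- mpt infer -m <model_path> -p "..."
#   ./bin/mpt -m <model_path> -p "The meaning of life is"
print(model_path)
```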
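
The template also warns that the plain GGML example only supports the ggml container type, while the file name in the Python example ends in `-ggjt`. A hedged sketch for checking which container a downloaded file actually uses, assuming the usual GGML-era magic values (`ggml`, `ggmf`, `ggjt`), which are not stated anywhere in this repository:

```
# Sketch: inspect the 4-byte magic at the start of a GGML-era model file.
# The magic constants below are assumptions based on the legacy GGML loaders,
# not something documented in this repository.
import struct
import sys

MAGICS = {
    0x67676D6C: "ggml (unversioned container)",
    0x67676D66: "ggmf (versioned container)",
    0x67676A74: "ggjt (mmap-friendly container)",
}

def container_type(path: str) -> str:
    """Read the first four bytes and map them to a known container name."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
    return MAGICS.get(magic, f"unknown (0x{magic:08x})")

if __name__ == "__main__":
    # Example: python check_container.py mpt-7b-q4_0-ggjt.bin
    print(container_type(sys.argv[1]))
```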