LLukas22 committed
Commit 9a4ad56
1 Parent(s): dccd36b

Create README_TEMPLATE.md

Files changed (1)
  1. README_TEMPLATE.md +54 -0
README_TEMPLATE.md ADDED
@@ -0,0 +1,54 @@
---
tags:
- llm-rs
- ggml
pipeline_tag: text-generation
license: apache-2.0
language:
- en
---
# GGML converted versions of [Together](https://huggingface.co/togethercomputer)'s RedPajama models

## Description
RedPajama-INCITE-Base-3B-v1 was developed by Together and leaders from the open-source AI community, including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, the Stanford Center for Research on Foundation Models (CRFM), the Stanford Hazy Research group, and LAION.
The training was done on 3,072 V100 GPUs provided as part of the INCITE 2023 project on Scalable Foundation Models for Transferrable Generalist AI, awarded to MILA, LAION, and EleutherAI in fall 2022, with support from the Oak Ridge Leadership Computing Facility (OLCF) and the INCITE program.

- Base Model: [RedPajama-INCITE-Base-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1)
- Instruction-tuned Version: [RedPajama-INCITE-Instruct-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1)
- Chat Version: [RedPajama-INCITE-Chat-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1)

## Converted Models

$MODELS$

## Usage

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python)

#### Installation
Via pip: `pip install llm-rs`

#### Run inference
```python
from llm_rs import AutoModel

# Load the model; pick any file from the "Converted Models" list above as `model_file`
model = AutoModel.from_pretrained("rustformers/redpajama-ggml", model_file="RedPajama-INCITE-Base-3B-v1-q4_0-ggjt.bin")

# Generate a completion for the prompt
print(model.generate("The meaning of life is"))
```
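
If you prefer to manage the download yourself (for example to reuse the same file with the Rust CLI below), here is a minimal sketch using the `huggingface_hub` package. The file name is simply the one from the example above; any entry from the converted-models list works the same way.

```python
# Sketch: download a single converted GGML file from this repository into the local
# Hugging Face cache, instead of letting the loader fetch it implicitly.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="rustformers/redpajama-ggml",
    filename="RedPajama-INCITE-Base-3B-v1-q4_0-ggjt.bin",  # pick any converted file
)
print(local_path)  # path to the cached model file on disk
```

The printed path can be passed to the `-m` flag of the Rust CLI shown below; when staying in Python, `AutoModel.from_pretrained` with the repository id already handles the download and caching for you.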

### Rust via [Rustformers/llm](https://github.com/rustformers/llm)

#### Installation
```sh
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference
```sh
cargo run --release -- gptneox infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```