---
license: other
license_name: apple-sample-code-license
license_link: LICENSE
---

# OpenELM

*Sachin Mehta, Mohammad Hossein Sekhavat, Qingqing Cao, Maxwell Horton, Yanzi Jin, Chenfan Sun, Iman Mirzadeh, Mahyar Najibi, Dmitry Belenko, Peter Zatloukal, Mohammad Rastegari*

We introduce **OpenELM**, a family of **Open**-source **E**fficient **L**anguage **M**odels. We release both pretrained and instruction-tuned models with 270M, 450M, 1.1B, and 3B parameters.

Our pre-training dataset contains RefinedWeb, deduplicated PILE, a subset of RedPajama, and a subset of Dolma v1.6, totaling approximately 1.8 trillion tokens.

See the table below for the Hugging Face page of each model:

| **Models** |
|------------|
| [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M) |
| [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) |
| [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B) |
| [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B) |
| [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct) |
| [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct) |
| [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct) |
| [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct) |
+
28
+
29
+ ```python
30
+
31
+ from transformers import AutoModelForCausalLM
32
+
33
+ openelm_270m = AutoModelForCausalLM.from_pretrained("apple/OpenELM-270M", trust_remote_code=True)
34
+ openelm_450m = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
35
+ openelm_1b = AutoModelForCausalLM.from_pretrained("apple/OpenELM-1_1B", trust_remote_code=True)
36
+ openelm_3b = AutoModelForCausalLM.from_pretrained("apple/OpenELM-3B", trust_remote_code=True)
37
+
38
+ openelm_270m_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-270M-Instruct", trust_remote_code=True)
39
+ openelm_450m_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M-Instruct", trust_remote_code=True)
40
+ openelm_1b_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-1_1B-Instruct", trust_remote_code=True)
41
+ openelm_3b_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-3B-Instruct", trust_remote_code=True)
42
+
43
+ ```
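
Each of these calls instantiates the model on the CPU; in the example below, the model is moved to a GPU with `.cuda()` before generating.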

## Example Usage

Below we provide an example of loading a model via the [Hugging Face Hub](https://huggingface.co/docs/hub/) and generating text with it:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# OpenELM uses the Llama-2 tokenizer: obtain access to "meta-llama/Llama-2-7b-hf",
# then see https://huggingface.co/docs/hub/security-tokens to get an access token
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf", token="hf_xxxx")

model_path = "apple/OpenELM-450M"

model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)
model = model.cuda().eval()

# Tokenize the prompt and move it to the GPU as a batch of size 1
prompt = "Once upon a time there was"
tokenized_prompt = tokenizer(prompt)
prompt_tensor = torch.tensor(tokenized_prompt["input_ids"], device="cuda").unsqueeze(0)

# Greedy decoding with a mild repetition penalty
output_ids = model.generate(prompt_tensor, max_new_tokens=256, repetition_penalty=1.2, pad_token_id=0)
output_ids = output_ids[0].tolist()
output_text = tokenizer.decode(output_ids, skip_special_tokens=True)
print(f'{model_path=}, {prompt=}\n')
print(output_text)

# Example output:
"""
model_path='apple/OpenELM-450M', prompt='Once upon a time there was'

Once upon a time there was a little girl who lived in the woods. She had a big heart and she loved to play with her friends. One day, she decided to go for a walk in the woods. As she walked, she saw a beautiful tree. It was so tall that it looked like a mountain. The tree was covered with leaves and flowers.
The little girl thought that this tree was very pretty. She wanted to climb up to the tree and see what was inside. So, she went up to the tree and climbed up to the top. She was very excited when she saw that the tree was full of beautiful flowers. She also
"""
```
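
The instruction-tuned checkpoints are loaded in exactly the same way. The snippet below is a minimal sketch along the lines of the example above, assuming that the `*-Instruct` models work with the same Llama-2 tokenizer and plain-text prompts; the specific checkpoint, prompt, and generation settings are illustrative only.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Same Llama-2 tokenizer as in the example above
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf", token="hf_xxxx")

# Illustrative choice of checkpoint; any of the *-Instruct models listed above can be substituted
model_path = "apple/OpenELM-450M-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)

# Run on a GPU if one is available, otherwise fall back to the CPU
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device).eval()

# Plain-text prompt (assumed here; no chat template is applied)
prompt = "Write a short poem about the ocean."
prompt_tensor = torch.tensor(tokenizer(prompt)["input_ids"], device=device).unsqueeze(0)

with torch.no_grad():
    output_ids = model.generate(
        prompt_tensor, max_new_tokens=128, repetition_penalty=1.2, pad_token_id=0
    )

print(tokenizer.decode(output_ids[0].tolist(), skip_special_tokens=True))
```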