heegyu committed
Commit 184858b
1 Parent(s): 40f5b05

Create README.md

Files changed (1)
  1. README.md +44 -0
README.md ADDED
@@ -0,0 +1,44 @@
 + ---
 + extra_gated_heading: Access Llama 2 on Hugging Face
 + extra_gated_description: >-
 +   This is a form to enable access to Llama 2 on Hugging Face after you have been
 +   granted access from Meta. Please visit the [Meta
 +   website](https://ai.meta.com/resources/models-and-libraries/llama-downloads)
 +   and accept our license terms and acceptable use policy before submitting this
 +   form. Requests will be processed in 1-2 days.
 + extra_gated_prompt: >-
 +   **Your Hugging Face account email address MUST match the email you provide on
 +   the Meta website, or your request will not be approved.**
 + extra_gated_button_content: Submit
 + extra_gated_fields:
 +   I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox
 + language:
 + - en
 + pipeline_tag: text-generation
 + inference: false
 + tags:
 + - facebook
 + - meta
 + - pytorch
 + - llama
 + - llama-2
 + ---
 + This is an 82M-parameter Llama model with random weights, intended for proof-of-concept use. <br/>
 + The tokenizer is a copy of meta-llama/Llama-2-7b.
 +
 + ```python
 + # Build a small, randomly initialized Llama model and push it to the Hub
 + from transformers import LlamaConfig, LlamaForCausalLM, LlamaTokenizer
 + import numpy as np
 +
 + # ~82M-parameter configuration (Llama-2 vocabulary, 4 layers, 8 attention heads)
 + config = LlamaConfig(vocab_size=32000, hidden_size=768, intermediate_size=768*4, num_hidden_layers=4, num_attention_heads=8)
 + tokenizer = LlamaTokenizer.from_pretrained("meta-llama/Llama-2-7b")
 + model = LlamaForCausalLM(config).half()
 +
 + # Count trainable parameters
 + model_parameters = filter(lambda p: p.requires_grad, model.parameters())
 + params = sum([np.prod(p.size()) for p in model_parameters])
 + print(params / 1024 / 1024)  # 82.881591796875
 +
 + hub_id = "heegyu/llama-small-randomweights"
 + tokenizer.push_to_hub(hub_id)
 + model.push_to_hub(hub_id)
 + ```
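
As a quick sanity check of the pushed checkpoint, here is a minimal loading-and-generation sketch. It assumes the `heegyu/llama-small-randomweights` repo is accessible (a Hugging Face token may be needed if the gated-access form above is enforced), and since the weights are random, the generated text is expected to be nonsense.

```python
# Minimal sketch: load the randomly initialized checkpoint and generate a few tokens.
# Assumes access to heegyu/llama-small-randomweights; the output is gibberish by design.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

hub_id = "heegyu/llama-small-randomweights"
tokenizer = LlamaTokenizer.from_pretrained(hub_id)
model = LlamaForCausalLM.from_pretrained(hub_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=16, do_sample=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```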