rishigami committed
Commit 7277d2f
Parent: 2d4e138

Create README.md

Files changed (1)
  1. README.md +67 -0
README.md ADDED
@@ -0,0 +1,67 @@
---
license: apache-2.0
language:
- ja
- en
tags:
- japanese
- causal-lm
inference: false
---
# CyberAgentLM2-7B

## Model Description

CyberAgentLM2 is a decoder-only language model pre-trained on 1.3T tokens of publicly available Japanese and English datasets.

## Requirements
- transformers >= 4.34.1
- accelerate

## Usage

```python
import transformers
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

assert transformers.__version__ >= "4.34.1"

# Load the model and tokenizer; device_map="auto" places the weights on available devices
model = AutoModelForCausalLM.from_pretrained("cyberagent/calm2-7b", device_map="auto", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("cyberagent/calm2-7b")
# Stream generated tokens to stdout as they are produced
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

prompt = "AIによって私達の暮らしは、"  # "With AI, our lives will ..." (open-ended Japanese prompt)

token_ids = tokenizer.encode(prompt, return_tensors="pt")
output_ids = model.generate(
    input_ids=token_ids.to(model.device),
    max_new_tokens=100,
    do_sample=True,
    temperature=0.9,
    streamer=streamer,
)
```
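
The streamer prints tokens as they are generated. If you also want the completion as a plain string, a minimal follow-up (reusing `tokenizer` and `output_ids` from the snippet above) could look like this:

```python
# Decode the full sequence (prompt + generated continuation) into text
generated_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(generated_text)
```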

## Model Details

* **Model size**: 7B
* **Trained tokens**: 1.3T tokens
* **Context length**: 4096 (see the sketch below)
* **Model type**: Transformer-based Language Model
* **Language(s)**: Japanese, English
* **Developed by**: [CyberAgent, Inc.](https://www.cyberagent.co.jp/)
* **License**: Apache-2.0

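Not part of the original card, but one way to sanity-check the context length listed above is to read it from the model configuration; this sketch assumes the LLaMA-style `max_position_embeddings` field used by this architecture:

```python
from transformers import AutoConfig

# Fetch only the configuration file, not the model weights
config = AutoConfig.from_pretrained("cyberagent/calm2-7b")
print(config.max_position_embeddings)  # expected to match the 4096 context length above
```
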
## Author

[Ryosuke Ishigami](https://huggingface.co/rishigami)

## Citations
```tex
@article{touvron2023llama,
  title={LLaMA: Open and Efficient Foundation Language Models},
  author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
  journal={arXiv preprint arXiv:2302.13971},
  year={2023}
}
```