Kamja Kim committed on
Commit e625cbd
1 Parent(s): 2f42448

Update README.md

Files changed (1): README.md (+37 -3)

README.md CHANGED

(The previous README contained only the `license: mit` front matter; the updated contents follow.)

---
license: mit
datasets:
- g0ster/TinyStories-Korean
language:
- ko
library_name: transformers
---

# TinyStories-Korean-800K

A tiny autoregressive language model trained from scratch.

- Architecture: Llama
- Vocab size: 4096
- Hidden size: 64
- Layers: 5
- Heads: 8 (MHA)
- Context length: up to 512 tokens

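For orientation, the hyperparameters above map onto a Hugging Face `LlamaConfig` roughly as sketched below. This is an illustrative sketch, not the exact `config.json` shipped with the checkpoint; in particular, the MLP (intermediate) size is not stated in this card and is assumed here.

```python
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    vocab_size=4096,              # Vocab size: 4096
    hidden_size=64,               # Hidden size: 64
    num_hidden_layers=5,          # Layers: 5
    num_attention_heads=8,        # Heads: 8
    num_key_value_heads=8,        # MHA: one key/value head per query head
    intermediate_size=256,        # assumption; the MLP width is not listed above
    max_position_embeddings=512,  # Context length: up to 512 tokens
)

model = LlamaForCausalLM(config)
# With the assumed MLP width this prints roughly 0.85M, i.e. the same ballpark as the "800K" in the name.
print(f'{model.num_parameters():,} parameters')
```
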
## Note

This model was trained for experimental purposes, and the generated output may not make any sense at all.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained('northwind33/TinyStories-Korean-800K')
tokenizer = AutoTokenizer.from_pretrained('northwind33/TinyStories-Korean-800K')

# Empty prompt, so the model generates a story unconditionally.
input_text = ''
input_ids = tokenizer(input_text, return_tensors='pt').input_ids

output = model.generate(input_ids, max_length=512, do_sample=True, temperature=0.5)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
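
The checkpoint can also be driven through a `text-generation` pipeline. The Korean prompt below ("옛날 옛날에", roughly "once upon a time") is only an illustration, not an example taken from the model card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='northwind33/TinyStories-Korean-800K')

# Hypothetical prompt; any short Korean story opener works the same way.
result = generator('옛날 옛날에', max_length=512, do_sample=True, temperature=0.5)
print(result[0]['generated_text'])
```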