lexiconium committed
Commit b9af64e • 1 Parent(s): 6c68e34

Update README.md

Files changed (1)
  1. README.md +15 -1
README.md CHANGED
@@ -1,3 +1,11 @@
+ ---
+ language:
+ - ko
+ tags:
+ - gpt2
+ license: cc-by-nc-sa-4.0
+ ---
+
  ## Example
  ```python
  import torch
@@ -20,7 +28,7 @@ model = AutoModelForCausalLM.from_pretrained(
  model.eval()

  prompt = "석양이 보이는 경치"
- wrapped_prompt = f"@{prompt}@"
+ wrapped_prompt = f"@{prompt}@<usr>\n"
  with torch.no_grad():
      tokens = tokenizer.encode(wrapped_prompt, return_tensors="pt").to(device="cuda")
      gen_tokens = model.generate(
@@ -36,4 +44,10 @@ with torch.no_grad():
      generated = tokenizer.decode(gen_tokens[0][len(tokens[0]):])

  print(generated)
+ # 해가 지고 있을 무렵
+ # 나는 석양을 보러 간다
+ # 붉은 하늘과 하얀 구름이 나를 반겨줄 것 같아서리
+ # 하지만 내가 본 해는 저물어만 가고
+ # 구름마저 자취를 감춘 어둠만이 남아있을 뿐이네
+ # 내가 탄 배는 보이지도 않고
  ```
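
The first hunk adds standard model card front matter (language, tags, license) between `---` markers. As a quick illustration of what those fields declare (not part of the commit, and assuming `pyyaml` is available), the block parses like any YAML mapping:

```python
# Minimal sketch (not part of the commit): what the added front matter declares.
# Assumes pyyaml is installed: pip install pyyaml
import yaml

front_matter = """\
language:
- ko
tags:
- gpt2
license: cc-by-nc-sa-4.0
"""

meta = yaml.safe_load(front_matter)
print(meta["language"])  # ['ko']
print(meta["tags"])      # ['gpt2']
print(meta["license"])   # cc-by-nc-sa-4.0
```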
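
The substantive code change is the prompt template: the example prompt (roughly "a scene with the sunset in view") is now wrapped as `@{prompt}@<usr>\n` instead of `@{prompt}@` before encoding, and the new comments show a sample completion, a short poem about watching the sun go down. Below is a minimal sketch of the new wrapping; `wrap_prompt` is a hypothetical helper name, not something defined in the README:

```python
# Minimal sketch (hypothetical helper, not from the README): the prompt format
# this commit switches the example to.
def wrap_prompt(prompt: str) -> str:
    # New template: "@<prompt>@<usr>\n" (previously just "@<prompt>@")
    return f"@{prompt}@<usr>\n"

wrapped = wrap_prompt("석양이 보이는 경치")
assert wrapped == "@석양이 보이는 경치@<usr>\n"
print(repr(wrapped))
```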
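
The decode call slices with `len(tokens[0])` because, for a decoder-only model such as GPT-2, `generate()` returns the prompt token ids followed by the newly sampled ids. A small self-contained illustration with made-up ids (the values below are stand-ins, not real vocabulary entries):

```python
# Minimal sketch with made-up ids: why the README slices with len(tokens[0]).
import torch

tokens = torch.tensor([[101, 102, 103]])               # stand-in for the encoded prompt
gen_tokens = torch.tensor([[101, 102, 103, 7, 8, 9]])  # generate() echoes the prompt, then appends new ids
new_only = gen_tokens[0][len(tokens[0]):]              # drop the echoed prompt ids
print(new_only.tolist())  # [7, 8, 9]
```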