
kakao brain์—์„œ ๊ณต๊ฐœํ•œ kogpt 6b model('kakaobrain/kogpt')์„ fp16์œผ๋กœ ์ €์žฅํ•œ ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค.

์นด์นด์˜ค๋ธŒ๋ ˆ์ธ ๋ชจ๋ธ์„ fp16์œผ๋กœ ๋กœ๋“œํ•˜๋Š” ๋ฐฉ๋ฒ•

import torch
from transformers import GPTJForCausalLM

# torch_dtype=torch.float16 keeps the 6B weights in half precision when downloading the original checkpoint
model = GPTJForCausalLM.from_pretrained('kakaobrain/kogpt', cache_dir='./my_dir', revision='KoGPT6B-ryan1.5b', torch_dtype=torch.float16)
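The half-precision weights can then be written back to disk with save_pretrained; presumably this is how the checkpoint in this repository was produced. A minimal sketch, assuming an arbitrary output directory name:

# './kogpt_6b_fp16' is an example output path, not an official one
model.save_pretrained('./kogpt_6b_fp16')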

fp16 ๋ชจ๋ธ ๋กœ๋“œ ํ›„ ๋ฌธ์žฅ ์ƒ์„ฑ

import torch
from transformers import GPTJForCausalLM, AutoTokenizer

# load the fp16 checkpoint from this repository and move it to the GPU
model = GPTJForCausalLM.from_pretrained('MrBananaHuman/kogpt_6b_fp16', torch_dtype=torch.float16)
model.to('cuda')
tokenizer = AutoTokenizer.from_pretrained('MrBananaHuman/kogpt_6b_fp16')

input_text = '이순신은'
input_ids = tokenizer.encode(input_text, return_tensors='pt').to('cuda')

output = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
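For more varied continuations, sampling arguments can be passed to generate(); the values below are illustrative and not taken from the original card.

# sampling-based generation; temperature and top_p values are only examples
output = model.generate(input_ids, do_sample=True, max_length=64, temperature=0.8, top_p=0.95)
print(tokenizer.decode(output[0], skip_special_tokens=True))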

์ฐธ๊ณ  ๋งํฌ

https://github.com/kakaobrain/kogpt/issues/6?fbclid=IwAR1KpWhuHnevQvEWV18o16k2z9TLgrXkbWTkKqzL-NDXHfDnWcIq7I4SJXM