This is the KoGPT 6B model released by Kakao Brain ('kakaobrain/kogpt'), saved in fp16.
์นด์นด์ค๋ธ๋ ์ธ ๋ชจ๋ธ์ fp16์ผ๋ก ๋ก๋ํ๋ ๋ฐฉ๋ฒ
import torch
from transformers import GPTJForCausalLM

model = GPTJForCausalLM.from_pretrained('kakaobrain/kogpt', cache_dir='./my_dir', revision='KoGPT6B-ryan1.5b', torch_dtype=torch.float16)
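For context, the point of the fp16 copy is that half precision stores each parameter in 2 bytes instead of 4, halving the checkpoint size and GPU memory footprint. A minimal PyTorch sketch (independent of KoGPT) illustrating the size difference:

```python
import torch

# An fp32 tensor uses 4 bytes per element
t32 = torch.randn(1024, 1024, dtype=torch.float32)

# Casting to fp16 halves that to 2 bytes per element
t16 = t32.to(torch.float16)

print(t32.element_size())  # 4
print(t16.element_size())  # 2
```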
fp16 ๋ชจ๋ธ ๋ก๋ ํ ๋ฌธ์ฅ ์์ฑ
import torch
from transformers import GPTJForCausalLM, AutoTokenizer

# keep the fp16 weights (otherwise they are upcast to fp32 on load)
model = GPTJForCausalLM.from_pretrained('MrBananaHuman/kogpt_6b_fp16', torch_dtype=torch.float16)
model.to('cuda')
tokenizer = AutoTokenizer.from_pretrained('MrBananaHuman/kogpt_6b_fp16')

input_text = '이순신은'
input_ids = tokenizer.encode(input_text, return_tensors='pt').to('cuda')
output = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output[0]))