This is the KoGPT 6B model released by Kakao Brain (`kakaobrain/kogpt`), saved in fp16.
## How to load the Kakao Brain model in fp16

```python
import torch
from transformers import GPTJForCausalLM

# Download the original checkpoint and load it directly in half precision
model = GPTJForCausalLM.from_pretrained(
    'kakaobrain/kogpt',
    cache_dir='./my_dir',
    revision='KoGPT6B-ryan1.5b',
    torch_dtype=torch.float16,
)
```
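The effect of `torch_dtype=torch.float16` on storage can be sanity-checked without downloading the 6B model; a minimal sketch using a small stand-in tensor (the shape here is an illustrative assumption, not the model's):

```python
import torch

# Small stand-in tensor; the real model has on the order of 6B parameters.
fp32 = torch.randn(1024, 1024, dtype=torch.float32)
fp16 = fp32.to(torch.float16)

# float32 uses 4 bytes per element, float16 uses 2
fp32_bytes = fp32.element_size() * fp32.nelement()
fp16_bytes = fp16.element_size() * fp16.nelement()

print(fp32_bytes // fp16_bytes)  # → 2: fp16 storage is half the size
```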
## Generating text after loading the fp16 model

```python
import torch
from transformers import GPTJForCausalLM, AutoTokenizer

# low_cpu_mem_usage avoids materializing a second full copy of the weights in RAM
model = GPTJForCausalLM.from_pretrained('MrBananaHuman/kogpt_6b_fp16', low_cpu_mem_usage=True)
model.to('cuda')

tokenizer = AutoTokenizer.from_pretrained('MrBananaHuman/kogpt_6b_fp16')

input_text = '이순신은'
input_ids = tokenizer(input_text, return_tensors='pt').input_ids.to('cuda')

output = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output[0]))
```

Example output:

```
이순신은 우리에게 무엇인가? 1. 머리말 이글은 임진왜란 당시 이순신이 보여준
```

("Who is Yi Sun-sin to us? 1. Introduction: this article, on what Yi Sun-sin showed during the Imjin War")
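To see why the fp16 copy matters in practice, a rough back-of-envelope estimate of weight memory for a model of this size (the parameter count below is an approximation, not an exact figure for this checkpoint):

```python
# Rough weight-memory estimate for a ~6B-parameter model
n_params = 6_000_000_000  # approximate parameter count of KoGPT 6B

fp32_gb = n_params * 4 / 1024**3  # 4 bytes per float32 weight
fp16_gb = n_params * 2 / 1024**3  # 2 bytes per float16 weight

print(f'fp32: {fp32_gb:.1f} GiB, fp16: {fp16_gb:.1f} GiB')
# → fp32: 22.4 GiB, fp16: 11.2 GiB
```

Halving the weights is what lets the model fit on a single 16–24 GB GPU instead of requiring a larger card or model parallelism.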
## Reference links