
kakao brain์—์„œ ๊ณต๊ฐœํ•œ kogpt 6b model('kakaobrain/kogpt')์„ fp16์œผ๋กœ ์ €์žฅํ•œ ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค.

์นด์นด์˜ค๋ธŒ๋ ˆ์ธ ๋ชจ๋ธ์„ fp16์œผ๋กœ ๋กœ๋“œํ•˜๋Š” ๋ฐฉ๋ฒ•

import torch
from transformers import GPTJForCausalLM

model = GPTJForCausalLM.from_pretrained('kakaobrain/kogpt', cache_dir='./my_dir', revision='KoGPT6B-ryan1.5b', torch_dtype=torch.float16)
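The benefit of the fp16 checkpoint is storage and memory: each parameter takes 2 bytes instead of the 4 bytes of fp32, so the on-disk and in-memory footprint is roughly halved. A minimal sketch of that size difference with plain torch tensors (no model download needed):

```python
import torch

# fp16 stores each value in 2 bytes instead of fp32's 4 bytes,
# so an fp16 checkpoint is about half the size of an fp32 one.
fp32_weight = torch.randn(1024, 1024, dtype=torch.float32)
fp16_weight = fp32_weight.to(torch.float16)

fp32_bytes = fp32_weight.element_size() * fp32_weight.nelement()
fp16_bytes = fp16_weight.element_size() * fp16_weight.nelement()

print(fp32_bytes, fp16_bytes)  # fp16 buffer is exactly half the bytes
```

For the full 6B-parameter KoGPT model this is the difference between roughly 24 GB and 12 GB of weights.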

fp16 ๋ชจ๋ธ ๋กœ๋“œ ํ›„ ๋ฌธ์žฅ ์ƒ์„ฑ


import torch
from transformers import GPTJForCausalLM, AutoTokenizer

model = GPTJForCausalLM.from_pretrained('MrBananaHuman/kogpt_6b_fp16', torch_dtype=torch.float16, low_cpu_mem_usage=True)
model.to('cuda')
tokenizer = AutoTokenizer.from_pretrained('MrBananaHuman/kogpt_6b_fp16')

input_text = '์ด์ˆœ์‹ ์€'
input_ids = tokenizer(input_text, return_tensors='pt').input_ids.to('cuda')

output = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output[0]))

>>> ์ด์ˆœ์‹ ์€ ์šฐ๋ฆฌ์—๊ฒŒ ๋ฌด์—‡์ธ๊ฐ€? 1. ๋จธ๋ฆฌ๋ง ์ด๊ธ€์€ ์ž„์ง„์™œ๋ž€ ๋‹น์‹œ ์ด์ˆœ์ธ์ด ๋ณด์—ฌ์ค€

์ฐธ๊ณ  ๋งํฌ

https://github.com/kakaobrain/kogpt/issues/6?fbclid=IwAR1KpWhuHnevQvEWV18o16k2z9TLgrXkbWTkKqzL-NDXHfDnWcIq7I4SJXM
