
πŸ§‘πŸ»β€πŸ’» KLUE RoBERTa Large

  • 이 λͺ¨λΈμ€ klue/roberta-largeλ₯Ό ν•œκ΅­μ–΄ Machine Reading Comprehensionλ₯Ό μœ„ν•΄ KorQuAD 데이터 2.1 version 27,423개의 데이터λ₯Ό ν•™μŠ΅μ‹œμΌœ λ§Œλ“  λͺ¨λΈμž…λ‹ˆλ‹€.

πŸ“ What Should Know

  • Instead of the raw KorQuAD v2.1 data, the training set was preprocessed by removing hyperlinks, HTML tags, and the Unicode BOM, and examples whose context exceeded 7,500 characters were excluded, leaving 27,423 training examples (see the sketch after this list).
  • Original data: https://korquad.github.io/
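
The preprocessing script itself is not included in this card, so the snippet below is only a rough sketch of the steps described above; the regular expressions and the raw-data layout are assumptions, not the actual code used.

import re

def clean_context(context: str) -> str:
    # Strip a leading Unicode BOM, then drop hyperlinks and HTML tags
    context = context.lstrip('\ufeff')
    context = re.sub(r'https?://\S+', '', context)  # hyperlinks (assumed pattern)
    context = re.sub(r'<[^>]+>', '', context)       # HTML tags (assumed pattern)
    return context

# examples: hypothetical list of {'context': ..., 'question': ..., 'answers': ...} dicts
def filter_examples(examples):
    kept = []
    for ex in examples:
        ctx = clean_context(ex['context'])
        if len(ctx) <= 7500:  # exclude overly long contexts
            kept.append({**ex, 'context': ctx})
    return kept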

πŸ“ Getting Started

from transformers import AutoConfig, AutoModelForQuestionAnswering, AutoTokenizer

# Load the fine-tuned config, tokenizer, and QA model from the Hugging Face Hub
config = AutoConfig.from_pretrained('uomnf97/klue-roberta-finetuned-korquad-v2')
tokenizer = AutoTokenizer.from_pretrained('uomnf97/klue-roberta-finetuned-korquad-v2')
model = AutoModelForQuestionAnswering.from_pretrained('uomnf97/klue-roberta-finetuned-korquad-v2', config=config)
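
Once loaded, the model can answer questions over a given passage. The following is a minimal inference sketch using the transformers question-answering pipeline; the question and context strings are placeholders, not examples from the training data.

from transformers import pipeline

qa = pipeline('question-answering', model=model, tokenizer=tokenizer)

# Placeholder question/context pair for illustration only
result = qa(
    question='대한민국의 수도는 어디인가요?',
    context='대한민국의 수도는 서울특별시이다.',
)
print(result['answer'], result['score'])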