BERT base for the KLUE Relation Extraction task.
klue/bert-base fine-tuned on the KLUE Relation Extraction (RE) dataset.
- KLUE Official Webpage : https://klue-benchmark.com/
- KLUE Official Github : https://github.com/KLUE-benchmark/KLUE
- KLUE RE Github : https://github.com/ainize-team/klue-re-workspace
- Run KLUE RE on a free GPU : Ainize Workspace
Usage
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("ainize/klue-bert-base-re")
model = AutoModelForSequenceClassification.from_pretrained("ainize/klue-bert-base-re")
# Wrap the subject entity in "<subj>" ... "</subj>" and the object entity in "<obj>" ... "</obj>".
sentence = "<subj>손흥민</subj>은 <obj>대한민국</obj>에서 태어났다."  # "Son Heung-min was born in South Korea."
encodings = tokenizer(sentence,
max_length=128,
truncation=True,
padding="max_length",
return_tensors="pt")
outputs = model(**encodings)
logits = outputs['logits']
preds = torch.argmax(logits, dim=1)
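The index returned by `torch.argmax` is a class id; to show a human-readable relation you can map it through the checkpoint's `id2label` dictionary (exposed as `model.config.id2label` on Hugging Face models). A minimal sketch of that last step in plain Python, using softmax for a confidence score and a small hypothetical subset of the label map (the real KLUE RE checkpoint defines 30 relation classes):

```python
import math

# Hypothetical subset of the label map; in practice use model.config.id2label.
id2label = {0: "no_relation", 1: "org:top_members/employees", 2: "per:place_of_birth"}

# Example logits for a single sentence (one value per relation class).
logits = [0.1, -1.2, 3.4]

# Softmax turns raw logits into probabilities; argmax picks the predicted class.
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]
pred_id = max(range(len(logits)), key=logits.__getitem__)

print(id2label[pred_id])  # → per:place_of_birth
print(probs[pred_id])     # confidence of the predicted relation
```

With real model output, replace the `logits` list with `outputs.logits[0].tolist()` from the snippet above.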
About us
- Teachable NLP - Train NLP models with your own text without writing any code
- Ainize - Deploy ML projects using a free GPU