bert-base for the KLUE Relation Extraction task

Fine-tuned from klue/bert-base on the KLUE RE dataset.


Usage


import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ainize/klue-bert-base-re")
model = AutoModelForSequenceClassification.from_pretrained("ainize/klue-bert-base-re")

# Wrap the subject entity with "<subj>" ... "</subj>" and the object entity with "<obj>" ... "</obj>".
sentence = "<subj>손흥민</subj>은 <obj>대한민국</obj>에서 태어났다."  # "Son Heung-min was born in South Korea."

encodings = tokenizer(sentence, 
                      max_length=128, 
                      truncation=True, 
                      padding="max_length", 
                      return_tensors="pt")

outputs = model(**encodings)

logits = outputs.logits

preds = torch.argmax(logits, dim=1)  # index of the predicted relation class
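The entity-marking step above can be automated when the character spans of the subject and object entities are known. A minimal sketch in plain Python (the helper name `mark_entities` is an assumption, not part of the model's API):

```python
def mark_entities(sentence, subj_span, obj_span):
    """Wrap the subject span with <subj>...</subj> and the object span
    with <obj>...</obj>. Spans are (start, end) character offsets."""
    # Insert markers from right to left so earlier offsets stay valid.
    markers = sorted(
        [(subj_span, "<subj>", "</subj>"), (obj_span, "<obj>", "</obj>")],
        key=lambda item: item[0][0],
        reverse=True,
    )
    for (start, end), open_tag, close_tag in markers:
        sentence = sentence[:start] + open_tag + sentence[start:end] + close_tag + sentence[end:]
    return sentence

print(mark_entities("손흥민은 대한민국에서 태어났다.", (0, 3), (5, 9)))
# → <subj>손흥민</subj>은 <obj>대한민국</obj>에서 태어났다.
```

This produces exactly the marked sentence shown in the snippet above, and generalizes to datasets (like KLUE RE) that provide entity offsets rather than pre-marked text.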
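The predicted class index can be mapped back to a relation name via `model.config.id2label`, which is stored in the checkpoint's config. A minimal sketch with a hypothetical two-entry label map (KLUE RE defines 30 relation classes; the ids shown here are assumptions for illustration only):

```python
# Hypothetical label map; in practice use model.config.id2label from the checkpoint.
id2label = {0: "no_relation", 1: "per:origin"}

def decode(logits, id2label):
    """Return the relation name for the highest-scoring class."""
    pred = max(range(len(logits)), key=lambda i: logits[i])
    return id2label[pred]

print(decode([0.1, 2.3], id2label))  # highest score at index 1 → "per:origin"
```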

About us

  • Teachable NLP - Train NLP models with your own text without writing any code
  • Ainize - Deploy ML projects using free GPUs