bert-base for QA

Code: See Ainize Workspace

klue-bert-base-mrc DEMO: Ainize DEMO

klue-bert-base-mrc API: Ainize API

Overview

Language model: klue/bert-base
Language: Korean
Downstream-task: Extractive QA
Training data: KLUE-MRC
Eval data: KLUE-MRC

Usage

In Transformers

from transformers import AutoModelForQuestionAnswering, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("ainize/klue-bert-base-mrc")
model = AutoModelForQuestionAnswering.from_pretrained("ainize/klue-bert-base-mrc")

context = "your context"
question = "your question"

# Tokenize the (context, question) pair and add a batch dimension.
encodings = tokenizer(context, question, max_length=512, truncation=True,
                      padding="max_length", return_token_type_ids=False)
encodings = {key: torch.tensor([val]) for key, val in encodings.items()}

input_ids = encodings["input_ids"]
attention_mask = encodings["attention_mask"]

# The model outputs start/end logits over the token positions.
pred = model(input_ids, attention_mask=attention_mask)

start_logits, end_logits = pred.start_logits, pred.end_logits

# Pick the most likely start and end positions of the answer span.
token_start_index, token_end_index = start_logits.argmax(dim=-1), end_logits.argmax(dim=-1)

# Slice out the predicted span and decode it back into text.
pred_ids = input_ids[0][token_start_index: token_end_index + 1]

prediction = tokenizer.decode(pred_ids)
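
For quick experiments, the same checkpoint should also work with the Transformers question-answering pipeline, which handles tokenization and span decoding for you. This is a minimal sketch, not part of the original instructions for this model; the placeholder question and context are illustrative.

from transformers import pipeline

# The pipeline wraps tokenization, the forward pass, and span extraction.
qa = pipeline(
    "question-answering",
    model="ainize/klue-bert-base-mrc",
    tokenizer="ainize/klue-bert-base-mrc",
)

# Replace with your own question and context.
result = qa(question="your question", context="your context")
print(result["answer"], result["score"])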

About us

Teachable NLP - Train NLP models with your own text without writing any code
Ainize - Deploy ML projects using free GPUs
