
Turkish Model: Question Answering

I fine-tuned a Turkish BERT model for question answering using the THQuAD and BQuAD datasets.

Example Usage

"Load Model"


from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Load the fine-tuned Turkish QA tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Marzu39/bert-base-turkish-128k-qa")
model = AutoModelForQuestionAnswering.from_pretrained("Marzu39/bert-base-turkish-128k-qa")
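
Run inference:

Once the model and tokenizer are loaded, answers can be extracted with the Transformers question-answering pipeline. The sketch below is a minimal example; the Turkish question and context strings are purely illustrative and not taken from the training data.

from transformers import pipeline

# Build a question-answering pipeline around the loaded model and tokenizer
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

# Illustrative question/context pair (hypothetical example)
result = qa(
    question="Türkiye'nin başkenti neresidir?",
    context="Türkiye'nin başkenti Ankara'dır. Ankara, İç Anadolu Bölgesi'nde yer alır.",
)

# The pipeline returns the extracted answer span and a confidence score
print(result["answer"], result["score"])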
Model size: 184M parameters · Tensor type: F32 · Format: safetensors