Contributed by savasy (Savaş Yıldırım)
How to use this model directly from the 🤗/transformers library:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("savasy/bert-turkish-uncased-qnli")
model = AutoModelForSequenceClassification.from_pretrained("savasy/bert-turkish-uncased-qnli")
```

Turkish QNLI Model

I fine-tuned a Turkish BERT model for the question natural language inference (QNLI) task with TQuAD, the Turkish version of SQuAD.

Data: TQuAD

I used the following TQuAD dataset.

I converted the dataset into the transformers GLUE data format for QNLI with the following script (SQuAD -> QNLI):

```python
import json
import sys

# Load the TQuAD (SQuAD-format) JSON file given as the first argument
with open(sys.argv[1], encoding="utf-8") as f:
    dataset = json.load(f)

i = 0  # running example id, one per emitted row
for article in dataset['data']:
    title = article['title']
    for p in article['paragraphs']:
        context = p['context']
        for qa in p['qas']:
            answer = qa['answers'][0]['text']
            # every distinct answer in this paragraph is a candidate
            all_other_answers = list(set(e['answers'][0]['text'] for e in p['qas']))
            print(i, qa['question'].replace(";", ":"), answer.replace(";", ":"), "entailment", sep="\t")
            i += 1
            for other in all_other_answers:
                if other == answer:
                    continue  # the gold answer was already emitted as entailment
                print(i, qa['question'].replace(";", ":"), other.replace(";", ":"), "not_entailment", sep="\t")
                i += 1
```
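As a sanity check, the conversion logic above can be exercised on a minimal in-memory example (the article below is hypothetical, not real TQuAD data): each question yields one `entailment` row for its own answer plus one `not_entailment` row per other answer in the same paragraph.

```python
# Hypothetical minimal SQuAD-style article (not real TQuAD data)
dataset = {"data": [{
    "title": "Example",
    "paragraphs": [{
        "context": "...",
        "qas": [
            {"question": "Q1?", "answers": [{"text": "A1"}]},
            {"question": "Q2?", "answers": [{"text": "A2"}]},
        ],
    }],
}]}

rows = []
i = 0
for article in dataset["data"]:
    for p in article["paragraphs"]:
        for qa in p["qas"]:
            answer = qa["answers"][0]["text"]
            # sorted for deterministic output order
            candidates = sorted({e["answers"][0]["text"] for e in p["qas"]})
            rows.append((i, qa["question"], answer, "entailment"))
            i += 1
            for other in candidates:
                if other != answer:
                    rows.append((i, qa["question"], other, "not_entailment"))
                    i += 1

# 2 questions x (1 entailment + 1 not_entailment) -> 4 rows
```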

Under the QNLI folder there are dev and test sets. Training data looks like:

```
613	II.Friedrich’in bilginler arasındaki en önemli şahsiyet olarak belirttiği kişi kimdir?	filozof, kimyacı, astrolog ve çevirmen	not_entailment
614	II.Friedrich’in bilginler arasındaki en önemli şahsiyet olarak belirttiği kişi kimdir?	kişisel eğilimi ve özel temaslar nedeniyle	not_entailment
615	Michael Scotus’un mesleği nedir?	filozof, kimyacı, astrolog ve çevirmen	entailment
616	Michael Scotus’un mesleği nedir?	Palermo’ya	not_entailment
```


I trained the model with the following command:

```shell
export TASK_NAME=QNLI
export GLUE_DIR=./glue/glue_dataTR/QNLI

# run_glue.py is the GLUE fine-tuning example script from the transformers repo
python3 run_glue.py \
  --model_type bert \
  --model_name_or_path dbmdz/bert-base-turkish-uncased \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --data_dir $GLUE_DIR \
  --max_seq_length 128 \
  --per_gpu_train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3.0 \
  --output_dir /tmp/$TASK_NAME/
```

Evaluation Results

acc:  0.9124060613527165
loss: 0.21582801340189717
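At inference time the fine-tuned model returns two logits per question/candidate pair. A minimal, model-free sketch of turning those logits into a QNLI decision, assuming the GLUE QNLI label order `["entailment", "not_entailment"]` (verify against the model's `config.id2label` before relying on it; the logits below are hypothetical):

```python
import math

def qnli_label(logits, labels=("entailment", "not_entailment")):
    """Softmax over the two logits, then pick the higher-probability label.

    The label order is assumed to follow the GLUE QNLI processor; check
    the fine-tuned model's config (id2label) before relying on it.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # shift for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    return labels[probs.index(max(probs))], probs

# Hypothetical logits for one question/candidate pair
label, probs = qnli_label([2.3, -1.1])
```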

See all my models.