Layers

#1
by ryjovsky - opened

Hi,
Did you add any extra layers on top of BERT, or is it just a fine-tuned BertForQuestionAnswering model?

No, it's just the regular BertForQuestionAnswering model. I used the fine-tuning script provided in the transformers repo (https://wandb.ai/salti/mBERT_QA/runs/wkqzhrp2/code).
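To illustrate the point above, here is a minimal sketch showing that the stock `BertForQuestionAnswering` adds only a single linear head (`qa_outputs`) on top of BERT. It builds the model from a fresh, untrained config so nothing is downloaded; loading the actual fine-tuned checkpoint would use `from_pretrained` instead.

```python
# Sketch: inspect the stock BertForQuestionAnswering head -- no extra layers.
# Uses a fresh (untrained) config so no weights are downloaded; a real run
# would load the fine-tuned checkpoint via from_pretrained instead.
from transformers import BertConfig, BertForQuestionAnswering

config = BertConfig()  # default BERT-base hyperparameters
model = BertForQuestionAnswering(config)

# The QA head is one linear layer mapping hidden states to start/end logits.
print(type(model.qa_outputs).__name__)  # Linear
print(model.qa_outputs.out_features)    # 2 (start and end logits)
```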