---
language: zh-tw
datasets: DRCD
tasks: Question Answering
---
# BERT DRCD 384
This model is a fine-tuned checkpoint of bert-base-chinese, trained on the DRCD dataset. It reaches an F1 score of 86 and an EM score of 83 on DRCD.
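For context, EM counts a prediction as correct only when it matches a gold answer exactly, while F1 measures partial overlap. A minimal sketch of both metrics, assuming Chinese answers are compared character by character (the official DRCD/SQuAD evaluation scripts differ in normalization details):

```python
from collections import Counter

def exact_match(prediction: str, ground_truth: str) -> int:
    # EM is 1 only when the predicted span equals the gold answer exactly.
    return int(prediction == ground_truth)

def f1_score(prediction: str, ground_truth: str) -> float:
    # For Chinese QA, overlap is commonly counted per character.
    common = Counter(prediction) & Counter(ground_truth)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(prediction)
    recall = num_same / len(ground_truth)
    return 2 * precision * recall / (precision + recall)

print(exact_match("台北", "台北"))             # 1
print(round(f1_score("台北市", "台北"), 2))    # 0.8 (partial credit)
```

The reported scores are these metrics averaged over the DRCD evaluation set.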
Training arguments:
- length: 384
- stride: 128
- learning_rate: 3e-5
- batch_size: 10
- epochs: 3
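The `length` and `stride` arguments control how contexts longer than 384 tokens are split into overlapping windows. A minimal sketch of that sliding-window splitting, assuming `stride` here means the step between window starts (as in the original BERT SQuAD fine-tuning script; note that the newer `tokenizers` API instead uses `stride` to mean the overlap):

```python
def sliding_windows(tokens, max_length=384, stride=128):
    # Split a long token sequence into overlapping windows of at most
    # `max_length` tokens, advancing `stride` tokens between windows, so
    # consecutive windows share max_length - stride tokens of context.
    windows = []
    start = 0
    while True:
        windows.append(tokens[start:start + max_length])
        if start + max_length >= len(tokens):
            break
        start += stride
    return windows

doc = list(range(600))  # stand-in for a 600-token context
chunks = sliding_windows(doc)
print([len(c) for c in chunks])  # [384, 384, 344]
```

The overlap ensures that an answer span falling near a window boundary is still fully contained in at least one window; at inference time the best-scoring span across windows is kept.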