# wangchanberta-base-wiki-20210520-news-spm_span-mask-finetune-qa

Fine-tunes airesearch/wangchanberta-base-wiki-20210520-news-spm_span-mask on the training sets of iapp_wiki_qa_squad, thaiqa_squad, and nsc_qa. Training examples whose cosine similarity to any validation or test example exceeds 0.8 were removed, and the contexts of the latter two datasets were trimmed to around 300 newmm words. Benchmarks on the validation and test sets of iapp_wiki_qa_squad are shared on wandb. Trained with thai2transformers.
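The similarity-based filtering described above can be sketched roughly as follows. This is a simplified stand-in, not the actual preprocessing: it uses whitespace tokenization in place of newmm and plain bag-of-words cosine similarity; the `filter_overlapping` helper and 0.8 default are illustrative.

```python
import math
from collections import Counter

def cosine(a, b):
    """Bag-of-words cosine similarity of two whitespace-tokenized texts."""
    va, vb = Counter(a.split()), Counter(b.split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def filter_overlapping(train_texts, eval_texts, threshold=0.8):
    """Drop training texts too similar to any validation/test text."""
    return [t for t in train_texts
            if all(cosine(t, e) < threshold for e in eval_texts)]
```

Filtering against the eval split rather than within the training set is what prevents near-duplicate leakage from inflating the benchmark numbers.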

Run with:

```bash
export MODEL_NAME=airesearch/wangchanberta-base-wiki-20210520-news-spm_span-mask
CUDA_LAUNCH_BLOCKING=1 python train_question_answering_lm_finetuning.py \
  --model_name $MODEL_NAME \
  --dataset_name chimera_qa \
  --output_dir $MODEL_NAME-finetune-chimera_qa-model \
  --log_dir $MODEL_NAME-finetune-chimera_qa-log \
  --model_max_length 400 \
  --pad_on_right \
  --fp16 \
  --use_auth_token
```
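For inference, the fine-tuned checkpoint can be loaded with the standard `transformers` question-answering pipeline. The repo id below is an assumption based on this card's title; adjust it if the model is published under a different name on the Hub.

```python
# Assumed Hub repo id (derived from this card's title); change if it differs.
MODEL_ID = "airesearch/wangchanberta-base-wiki-20210520-news-spm_span-mask-finetune-qa"

def answer(question, context):
    """Extract an answer span from `context` for `question`."""
    # Lazy import so the sketch only needs transformers when actually called.
    from transformers import pipeline
    qa = pipeline("question-answering", model=MODEL_ID, tokenizer=MODEL_ID)
    return qa(question=question, context=context)["answer"]
```

The first call downloads the model weights from the Hub, so it requires network access (or a local cache).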