Fine-tuned airesearch/wangchanberta-base-att-spm-uncased on the training sets of iapp_wiki_qa_squad, thaiqa_squad, and nsc_qa. Training examples with cosine similarity above 0.8 to any validation or test example were removed, and the contexts of thaiqa_squad and nsc_qa were trimmed to around 300 newmm words. Benchmarks on the validation and test sets of iapp_wiki_qa_squad are shared on wandb. Trained with thai2transformers.
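
The deduplication step above can be sketched as follows. This is a minimal illustration, not the exact script used: the featurization here is TF-IDF over raw text (an assumption; the original similarity computation is not specified), with the stated 0.8 cosine-similarity cutoff.

```python
# Hedged sketch: drop training examples whose context is too similar
# (cosine similarity > 0.8) to any validation/test context.
# TF-IDF featurization is an assumption, not the documented method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def filter_train_examples(train_texts, eval_texts, threshold=0.8):
    # Fit the vocabulary on both splits so vectors are comparable.
    vec = TfidfVectorizer().fit(train_texts + eval_texts)
    train_m = vec.transform(train_texts)
    eval_m = vec.transform(eval_texts)
    # sims[i, j] = similarity of train example i to eval example j.
    sims = cosine_similarity(train_m, eval_m)
    keep = sims.max(axis=1) <= threshold
    return [t for t, k in zip(train_texts, keep) if k]
```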

Run with:

export MODEL_NAME=airesearch/wangchanberta-base-att-spm-uncased
python \
  --model_name $MODEL_NAME \
  --dataset_name chimera_qa \
  --output_dir $MODEL_NAME-finetune-chimera_qa-model \
  --log_dir $MODEL_NAME-finetune-chimera_qa-log \
  --lowercase \
  --pad_on_right
