A distilled version of 'airesearch/wangchanberta-base-att-spm-uncased'. This 62M-parameter model was trained on the Assorted Thai Texts corpus (4.8 GB) used for WangchanBERTa pre-training.

Please use the tokenizer from 'airesearch/wangchanberta-base-att-spm-uncased', as in the sketch below.
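
A minimal loading sketch with the `transformers` library. It assumes the distilled model loads through `AutoModelForMaskedLM`; `"this-distilled-repo-id"` is a placeholder for this repository's model id, not a real identifier.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Tokenizer must come from the original WangchanBERTa model, as noted above.
tokenizer = AutoTokenizer.from_pretrained("airesearch/wangchanberta-base-att-spm-uncased")

# Placeholder: replace with this repository's model id.
model = AutoModelForMaskedLM.from_pretrained("this-distilled-repo-id")

# Example forward pass on a short Thai sentence.
inputs = tokenizer("ข้อความภาษาไทย", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)
```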
