Generating synthetic data via self-chatting
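The idea above can be sketched as a loop in which one model plays both speakers, swapping roles each turn. This is a toy sketch only: `respond` is a hypothetical stand-in for a real chat model's generate call, here faked with canned replies.

```python
# Toy sketch of self-chat data generation: a single model plays both
# speakers by alternating roles. `respond` is a hypothetical stand-in
# for a real chat model's generation call.
def respond(history):
    # Placeholder "model": returns a canned reply based on turn count.
    canned = ["Hi, how can I help?", "Tell me more.", "Got it, thanks!"]
    return canned[len(history) % len(canned)]

def self_chat(seed_prompt, num_turns=4):
    """Generate one synthetic dialogue by letting the model talk to itself."""
    history = [("user", seed_prompt)]
    for _ in range(num_turns):
        # Roles alternate: the same model answers as assistant, then as user.
        role = "assistant" if history[-1][0] == "user" else "user"
        history.append((role, respond(history)))
    return history

dialogue = self_chat("What's the weather like?")
for role, text in dialogue:
    print(f"{role}: {text}")
```

In a real pipeline the canned replies would be replaced by sampled generations, and the collected dialogues filtered before being used as training data.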
BERT as a language model
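One common way to use a masked LM like BERT as a language model is pseudo-log-likelihood scoring: mask each token in turn and sum the log-probabilities the model assigns to the original tokens. The sketch below assumes this approach; `masked_prob` is a hypothetical stand-in for an actual BERT forward pass, faked here with a toy unigram table.

```python
import math

# Sketch of pseudo-log-likelihood scoring with a masked LM: mask each
# position in turn and sum the log-probability of the original token.
TOY_PROBS = {"the": 0.05, "cat": 0.01, "sat": 0.008}  # made-up values

def masked_prob(tokens, position):
    # A real implementation would run BERT on `tokens` with
    # tokens[position] replaced by [MASK]; we fake it with unigram probs.
    return TOY_PROBS.get(tokens[position], 1e-4)

def pseudo_log_likelihood(tokens):
    return sum(math.log(masked_prob(tokens, i)) for i in range(len(tokens)))

score = pseudo_log_likelihood(["the", "cat", "sat"])
print(score)  # higher (less negative) means the sentence is more probable
```

Unlike an autoregressive LM's log-likelihood, this score is not a proper probability, but it is widely used for ranking and fluency scoring.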
Comparing char-level and byte-level tokenizers
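The core difference can be shown in a few lines: char-level tokenization yields one token per Unicode code point, while byte-level tokenization yields one token per UTF-8 byte, so non-ASCII text produces longer byte-level sequences. A minimal illustration:

```python
# Char-level: one token per Unicode code point.
# Byte-level: one token per UTF-8 byte, so non-ASCII chars split.
def char_tokens(text):
    return list(text)

def byte_tokens(text):
    return list(text.encode("utf-8"))

text = "héllo"
print(len(char_tokens(text)))  # 5 code points
print(len(byte_tokens(text)))  # 6 bytes: "é" encodes as two UTF-8 bytes
```

The trade-off: char-level vocabularies grow with the character inventory, while byte-level vocabularies are fixed at 256 symbols but inflate sequence length for non-Latin scripts.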
Knowledge-injected Pre-trained Language Model
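One simple form of knowledge injection (in the spirit of ERNIE-style entity fusion) adds a knowledge-graph entity embedding to the token embeddings at positions where an entity mention was linked. This is a minimal sketch under that assumption; all vectors and the linked span are made-up illustrative values.

```python
# Toy sketch of knowledge injection: add an entity embedding to the
# token embeddings over the span where the entity was linked.
def inject_knowledge(token_embs, entity_emb, span):
    """Return token embeddings with entity_emb added over [start, end)."""
    start, end = span
    return [
        [t + e for t, e in zip(vec, entity_emb)] if start <= i < end else vec
        for i, vec in enumerate(token_embs)
    ]

token_embs = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]  # 3 tokens, dim 2
entity_emb = [1.0, 1.0]                             # linked entity vector
fused = inject_knowledge(token_embs, entity_emb, span=(1, 3))
```

Real systems replace the plain addition with a learned fusion layer and obtain `entity_emb` from pretrained knowledge-graph embeddings, but the data flow is the same.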