---
library_name: transformers
language:
- ms
---

# Malaysian Mistral 64M on MLM task using 512 context length

Replication of https://github.com/McGill-NLP/llm2vec using https://huggingface.co/mesolitica/malaysian-mistral-64M-4096, done by https://github.com/aisyahrzk (https://twitter.com/aisyahhhrzk).

Source code at https://github.com/mesolitica/malaya/tree/master/session/llm2vec

WandB: https://wandb.ai/aisyahrazak/mistral-64M-mlm?nw=nwuseraisyahrazak
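
## Example usage

A minimal loading sketch, assuming the checkpoint exposes the standard `transformers` interface: it mean-pools the last hidden states into a sentence embedding, which is how llm2vec-style encoders are typically used. The repo id below points at the base model and would be swapped for this checkpoint; note that without the llm2vec bidirectional-attention patch the hidden states are still computed with causal attention.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder: this is the base model id; replace with this repository's id.
model_id = "mesolitica/malaysian-mistral-64M-4096"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

text = "Kuala Lumpur ialah ibu negara Malaysia."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool over non-padding tokens to get a single vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, hidden_size)
```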