# RoBERTa base model for Hindi language

[Pretrained model on the Hindi language](https://huggingface.co/spaces/flax-community/roberta-hindi) using a masked language modeling (MLM) objective.

> This is part of the [Flax/Jax Community Week](https://discuss.huggingface.co/t/pretrain-roberta-from-scratch-in-hindi/7091), organized by [HuggingFace](https://huggingface.co/), with TPU usage sponsored by Google.
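Since the model is trained with an MLM objective, it can be queried directly for masked-token prediction. A minimal sketch using the `transformers` fill-mask pipeline, assuming the model is published on the Hub under the id `flax-community/roberta-hindi` (inferred from the Space URL above; verify the exact id on the Hub before use):

```python
from transformers import pipeline

# Model id assumed from the Space URL; confirm it on the Hugging Face Hub.
fill_mask = pipeline("fill-mask", model="flax-community/roberta-hindi")

# RoBERTa tokenizers use "<mask>" as the mask token.
predictions = fill_mask("मुझे इस फ़िल्म का <mask> बहुत पसंद आया।")

# Each prediction carries the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidate tokens for the masked position, ranked by score; the actual completions depend on the trained weights.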