---
license: mit
language:
- uz
---

# BERTbek-news-big-cased

A pre-trained BERT model for Uzbek (12 layers, cased), trained on a large news corpus (Daryo).
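A minimal usage sketch with the Hugging Face `transformers` library is shown below. Note that the model ID used here is a placeholder assumption; substitute the full repository path of this model on the Hugging Face Hub.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# NOTE: placeholder model ID -- replace with the full Hub repository path.
MODEL_ID = "BERTbek-news-big-cased"

def load_fill_mask(model_id: str = MODEL_ID):
    """Load the tokenizer and masked-LM head, and return a fill-mask pipeline."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    return pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Example (downloads the model weights; requires network access):
# fill = load_fill_mask()
# predictions = fill("Toshkent O'zbekistonning [MASK] shahri.")
# for p in predictions:
#     print(p["token_str"], round(p["score"], 3))
```

Since the model is cased, input text should not be lowercased before tokenization.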