---
license: cc-by-4.0
language: mr
datasets:
- L3Cube-MahaCorpus
---

## MahaBERT-Small
MahaBERT-Small is a smaller version of the Marathi BERT model, with 6 transformer layers. It is a base-BERT-style model trained from scratch on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets.
[Dataset link](https://github.com/l3cube-pune/MarathiNLP)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).

The best version of this model is available [here](https://huggingface.co/l3cube-pune/marathi-bert-v2).
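
A minimal usage sketch with the Hugging Face `transformers` library is shown below. The hub id `l3cube-pune/marathi-bert-small` is an assumption for illustration; substitute this repository's actual model id.

```python
# Minimal sketch: load MahaBERT-Small and run a forward pass with
# Hugging Face transformers. The hub id below is an assumed placeholder.
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "l3cube-pune/marathi-bert-small"  # assumed hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

text = "मी शाळेत जातो."  # Marathi: "I go to school."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```

The model can also be fine-tuned on downstream Marathi tasks (e.g. classification) via `AutoModelForSequenceClassification` in the usual way.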

```
@InProceedings{joshi:2022:WILDRE6,
  author    = {Joshi, Raviraj},
  title     = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {97--101}
}
```