rmihaylov committed on
Commit
2e6308b
1 Parent(s): 4163dfb

Create README.md

Files changed (1)
  1. README.md +40 -0
README.md ADDED

---
inference: false
language:
- bg
license: mit
datasets:
- oscar
- chitanka
- wikipedia
tags:
- torch
---

# BERT BASE (cased)

Pretrained model on the Bulgarian language, trained with a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is cased: it makes a difference between bulgarian and Bulgarian. The training data is Bulgarian text from [OSCAR](https://oscar-corpus.com/post/oscar-2019/), [Chitanka](https://chitanka.info/) and [Wikipedia](https://bg.wikipedia.org/).

The model was compressed via [progressive module replacing](https://arxiv.org/abs/2002.02925).
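
Progressive module replacing ("BERT-of-Theseus") trains a smaller successor network in place of the original: during fine-tuning, each block of the large predecessor model is stochastically swapped for a smaller successor block, and the swap probability is annealed toward 1 until only the successor remains. Below is a minimal, hypothetical PyTorch sketch of the idea; the class and parameter names are illustrative, not the actual compression code behind this model:

```python
import torch
import torch.nn as nn


class TheseusBlock(nn.Module):
    """One compression unit: a frozen predecessor block (e.g. two original
    BERT layers) paired with a smaller trainable successor (e.g. one layer).

    Hypothetical sketch of progressive module replacing, not the authors' code.
    """

    def __init__(self, predecessor: nn.Module, successor: nn.Module):
        super().__init__()
        self.predecessor = predecessor
        self.successor = successor
        self.replace_prob = 0.0  # annealed toward 1.0 by the training loop

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # During training, route through the successor with probability
        # replace_prob; at inference, only the successor is used.
        if self.training and torch.rand(1).item() >= self.replace_prob:
            return self.predecessor(hidden_states)
        return self.successor(hidden_states)
```

Annealing `replace_prob` toward 1.0 lets the successor layers gradually take over the forward pass, so the compressed model is trained in place of the predecessor rather than distilled from its outputs.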

### How to use

Here is how to use this model in PyTorch:

```python
>>> from transformers import pipeline
>>>
>>> model = pipeline(
...     'fill-mask',
...     model='rmihaylov/bert-base-theseus-bg',
...     tokenizer='rmihaylov/bert-base-theseus-bg',
...     device=0,  # first GPU; use device=-1 to run on CPU
...     revision=None)
>>>
>>> # "Кой ще поеме [MASK] отговорност?" = "Who will take [MASK] responsibility?"
>>> output = model("Кой ще поеме [MASK] отговорност?")
>>> print(output)
```
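
The pipeline returns a list of candidate completions for the `[MASK]` slot, each a dict with `score`, `token`, `token_str` and `sequence` keys (the standard `transformers` fill-mask output format; the actual predictions depend on the model). A short sketch of inspecting them:

```python
>>> # Print each candidate token together with its probability.
>>> for candidate in output:
...     print(candidate['token_str'], round(candidate['score'], 4))
```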