---
inference: false
language:
- bg
license: mit
datasets:
- oscar
- chitanka
- wikipedia
tags:
- torch
---

# BERT BASE (cased) fine-tuned on Bulgarian SQuAD data

Pretrained model on the Bulgarian language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is cased: it makes a difference
between bulgarian and Bulgarian. The training data is Bulgarian text from [OSCAR](https://oscar-corpus.com/post/oscar-2019/), [Chitanka](https://chitanka.info/) and [Wikipedia](https://bg.wikipedia.org/).

It was then fine-tuned on a private Bulgarian SQuAD-style question-answering dataset.

Finally, it was compressed via [progressive module replacing](https://arxiv.org/abs/2002.02925); a toy sketch of that idea is shown below.

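For orientation only: progressive module replacing (BERT-of-Theseus) trains a smaller "successor" network by randomly swapping its blocks in for groups of blocks of the larger "predecessor" while fine-tuning, and gradually raising the replacement probability until only the successor remains. The snippet below is a minimal, hypothetical illustration of that idea with toy linear blocks; it is not the compression code used for this model, and all names in it are invented for the example.

```python
import torch
import torch.nn as nn


class TheseusLayer(nn.Module):
    """Runs either the large predecessor block or its small successor.

    During training the successor is used with probability `p`; `p` is
    annealed towards 1.0 so the compressed model gradually takes over.
    At inference time only the successor is used.
    """

    def __init__(self, predecessor: nn.Module, successor: nn.Module, p: float = 0.5):
        super().__init__()
        self.predecessor = predecessor
        self.successor = successor
        self.p = p

    def forward(self, x):
        if self.training and torch.rand(()).item() > self.p:
            return self.predecessor(x)
        return self.successor(x)


# Toy example: two successor blocks stand in for four predecessor blocks,
# each successor covering two consecutive predecessor blocks.
hidden = 768
predecessor = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(4)])
successor = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(2)])

layers = nn.ModuleList([
    TheseusLayer(nn.Sequential(predecessor[2 * i], predecessor[2 * i + 1]),
                 successor[i],
                 p=0.5)
    for i in range(2)
])

x = torch.randn(1, hidden)
for layer in layers:
    x = layer(x)  # randomly the predecessor pair or the successor block
```

In the paper's setup the predecessor is a fine-tuned BERT that stays frozen, and only the successor modules receive gradient updates while the replacement probability grows.
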

### How to use

Here is how to use this model in PyTorch:

```python
>>> from transformers import pipeline
>>>
>>> # device=0 runs on the first GPU; use device=-1 to run on the CPU.
>>> model = pipeline(
...     'question-answering',
...     model='rmihaylov/bert-base-squad-theseus-bg',
...     tokenizer='rmihaylov/bert-base-squad-theseus-bg',
...     device=0,
...     revision=None)
>>>
>>> # The question asks what the epidemic is being tracked with;
>>> # the expected answer is "the mathematical models".
>>> question = "С какво се проследява пандемията?"
>>> context = "Епидемията гасне, обяви при обявяването на данните тази сутрин Тодор Кантарджиев, член на Националния оперативен щаб. Той направи този извод на база на данните от математическите модели, с които се проследява развитието на заразата. Те показват, че т. нар. ефективно репродуктивно число е вече в границите 0.6-1. Тоест, 10 души заразяват 8, те на свой ред 6 и така нататък. "
>>>
>>> output = model(question=question, context=context)
>>> print(output)

{'score': 0.85157310962677, 'start': 162, 'end': 186, 'answer': ' математическите модели,'}
```
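
If you prefer to skip the pipeline, the same checkpoint can also be loaded with the generic Auto classes. The following is only a minimal sketch of the standard Transformers question-answering flow, not part of the original card; it reuses `question` and `context` from the snippet above:

```python
>>> import torch
>>> from transformers import AutoTokenizer, AutoModelForQuestionAnswering
>>>
>>> tokenizer = AutoTokenizer.from_pretrained('rmihaylov/bert-base-squad-theseus-bg')
>>> model = AutoModelForQuestionAnswering.from_pretrained('rmihaylov/bert-base-squad-theseus-bg')
>>>
>>> # Encode the question/context pair and pick the most likely answer span.
>>> inputs = tokenizer(question, context, return_tensors='pt', truncation=True)
>>> with torch.no_grad():
...     outputs = model(**inputs)
>>>
>>> start = outputs.start_logits.argmax()
>>> end = outputs.end_logits.argmax() + 1
>>> answer = tokenizer.decode(inputs['input_ids'][0][start:end])
>>> print(answer)
```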