---
language: en
datasets:
- squad_v2
license: mit
thumbnail: https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg
tags:
- exbert
---

![bert_image](https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg)

## Overview
**Language model:** deepset/bert-medium-distilled-squad2
**Language:** German
**Training data:** GermanQuAD train set (~12 MB)
**Eval data:** GermanQuAD test set (~5 MB)
**Infrastructure:** 1x V100 GPU
**Published:** Apr 21st, 2021

## Details
- Haystack's model distillation feature was used for training, with deepset/bert-large-uncased-whole-word-masking-squad2 as the teacher model.

## Hyperparameters
```
batch_size = 6
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 5
distillation_loss_weight = 1
```
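Here, `temperature = 5` softens the teacher's output distribution and `distillation_loss_weight = 1` weights the soft-label term in the training loss. As a rough illustration of the mechanics only (not Haystack's actual implementation), the following framework-free sketch computes the temperature-scaled KL-divergence loss from standard knowledge distillation, using made-up logits rather than real model outputs:

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling: a higher T flattens the distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=5.0):
    """KL divergence between softened teacher and student distributions.

    Scaled by T^2 (as in Hinton et al.'s knowledge distillation) so the
    gradient magnitude stays comparable to the hard-label loss.
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # softened student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Illustrative logits over candidate answer-span positions (made up)
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
loss = distillation_loss(student, teacher, temperature=5.0)
```

With `distillation_loss_weight = 1`, a soft-label term like this one carries the full weight in the blended training objective.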
## Performance
```
"exact": 68.6431398972458
"f1": 72.7637083790805
```
![performancetable](https://lh3.google.com/u/0/d/1IFqkq8OZ7TFnGzxmW6eoxXSYa12f2M7O=w1970-h1546-iv1)

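The exact-match and F1 numbers above are computed SQuAD-style: exact match requires the normalized prediction to equal the gold answer exactly, while F1 gives partial credit for token overlap. A simplified per-answer sketch (without the full normalization rules of the official SQuAD evaluation script):

```python
from collections import Counter

def normalize(text):
    """Lowercase, replace punctuation with spaces, and tokenize (simplified)."""
    cleaned = "".join(c if c.isalnum() or c.isspace() else " " for c in text.lower())
    return cleaned.split()

def exact_match(prediction, gold):
    """1.0 if the normalized prediction equals the normalized gold answer."""
    return float(normalize(prediction) == normalize(gold))

def f1_score(prediction, gold):
    """Token-overlap F1 between prediction and gold answer."""
    pred_tokens, gold_tokens = normalize(prediction), normalize(gold)
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("21. April 2021", "21. April 2021"))  # 1.0
print(f1_score("April 2021", "21. April 2021"))         # partial credit, about 0.8
```

The corpus-level scores are the averages of these per-answer scores (times 100), taking the maximum over gold answers when a question has several.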
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`

## About us
![deepset logo](https://workablehr.s3.amazonaws.com/uploads/account/logo/476306/logo)
We bring NLP to the industry via open source!
Our focus: Industry-specific language models & large-scale QA systems.

Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)

Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)

By the way: [we're hiring!](http://www.deepset.ai/jobs)