dkurt committed
Commit f6ed83d
1 Parent(s): 717f42b

Create README.md

Files changed (1):
  1. README.md +7 -0

README.md ADDED

# OpenVINO model bert-large-uncased-whole-word-masking-squad-int8-0001

This is a BERT-large model pre-trained on lower-cased English text using Whole-Word-Masking and fine-tuned on the SQuAD v1.1 training set. The model performs extractive question answering for English: the input is a question concatenated with a context passage (the premise), and the output is the location of the answer span inside that passage. For details about the original floating-point model, check out [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805).

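As a rough usage sketch (not an official sample), the snippet below runs the IR with the OpenVINO Runtime Python API. The IR file path, the tokenizer checkpoint, and the tensor names `input_ids`, `attention_mask`, and `token_type_ids` are assumptions based on the Open Model Zoo documentation; verify them against `model.inputs` and `model.outputs` for your copy of the model.

```python
import numpy as np
import openvino as ov
from transformers import AutoTokenizer

core = ov.Core()
model = core.read_model("bert-large-uncased-whole-word-masking-squad-int8-0001.xml")
compiled = core.compile_model(model, "CPU")

# Assumed tokenizer checkpoint matching the lower-cased, whole-word-masking vocab.
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking")

context = "The model was fine-tuned on the SQuAD v1.1 training set."
question = "Which dataset was the model fine-tuned on?"

# Encode question + context as one sequence, padded to the fixed 384-token input.
enc = tokenizer(question, context, padding="max_length", truncation=True,
                max_length=384, return_tensors="np")

result = compiled({
    "input_ids": enc["input_ids"],
    "attention_mask": enc["attention_mask"],
    "token_type_ids": enc["token_type_ids"],
})

# The two outputs are per-token logits for the start and end of the answer span.
start_logits = result[compiled.output(0)].squeeze()
end_logits = result[compiled.output(1)].squeeze()
start, end = int(np.argmax(start_logits)), int(np.argmax(end_logits))
print(tokenizer.decode(enc["input_ids"][0][start:end + 1]))
```

Taking the independent argmax of the start and end logits is the simplest possible decoding; production demos typically score joint (start, end) pairs and restrict the span to the context tokens.
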
The model has been further quantized to INT8 precision using quantization-aware fine-tuning with [NNCF](https://github.com/openvinotoolkit/nncf).

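The exact training recipe is not reproduced here; the sketch below only illustrates the general shape of quantization-aware fine-tuning with NNCF's PyTorch API. The starting checkpoint, sequence length, and config values are illustrative assumptions, not the settings used to produce this model.

```python
import nncf  # noqa: F401 -- importing nncf registers its PyTorch extensions
from nncf import NNCFConfig
from nncf.torch import create_compressed_model
from transformers import BertForQuestionAnswering

# Hypothetical FP32 starting point for the fine-tuning.
model = BertForQuestionAnswering.from_pretrained(
    "bert-large-uncased-whole-word-masking-finetuned-squad")

nncf_config = NNCFConfig.from_dict({
    # Three [batch, seq_len] integer inputs: ids, attention mask, token types.
    "input_info": [{"sample_size": [1, 384], "type": "long"}] * 3,
    "compression": {"algorithm": "quantization"},  # INT8 by default
})

# Wrap the model with fake-quantization operations, then fine-tune on SQuAD
# as usual so the weights adapt to INT8 precision.
compression_ctrl, qat_model = create_compressed_model(model, nncf_config)
# ... run the standard SQuAD fine-tuning loop over qat_model ...
# compression_ctrl.export_model("bert_qa_int8.onnx")  # then convert to IR
```
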
Model source: [Open Model Zoo](https://github.com/openvinotoolkit/open_model_zoo/tree/master/models/intel/bert-large-uncased-whole-word-masking-squad-int8-0001)