Query this model via the Inference API (question-answering):

curl -X POST \
    -H "Authorization: Bearer YOUR_ORG_OR_USER_API_TOKEN" \
    -H "Content-Type: application/json" \
    -d '{"question": "Where does she live?", "context": "She lives in Berlin."}' \
    https://api-inference.huggingface.co/models/aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2
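
The same request can be issued from Python with requests (a minimal sketch mirroring the curl call above; the endpoint URL and payload format are taken from that example, and YOUR_ORG_OR_USER_API_TOKEN is a placeholder for your own token):

import requests

API_URL = "https://api-inference.huggingface.co/models/aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2"
headers = {"Authorization": "Bearer YOUR_ORG_OR_USER_API_TOKEN"}  # placeholder token

payload = {"question": "Where does she live?", "context": "She lives in Berlin."}

# POST the question/context pair and print the JSON answer returned by the API.
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())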
Frameworks: PyTorch, TensorFlow

Contributed by aodiniz (Adriano Orsoni Diniz)

How to use this model directly from the 🤗/transformers library:

			
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2")
model = AutoModelForQuestionAnswering.from_pretrained("aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2")
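
A minimal end-to-end inference sketch, assuming a recent transformers/PyTorch install (the question/context pair is the same one used in the API example above):

from transformers import AutoTokenizer, AutoModelForQuestionAnswering
import torch

model_name = "aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question, context = "Where does she live?", "She lives in Berlin."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode the answer span.
start = int(torch.argmax(outputs.start_logits))
end = int(torch.argmax(outputs.end_logits)) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end])
print(answer)  # expected: "berlin" (uncased model)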

BERT L-10 H-512 CORD-19 (2020/06/16) fine-tuned on SQuAD v2.0

BERT model with 10 Transformer layers and a hidden size of 512, as described in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models. It was further fine-tuned with the MLM objective on the CORD-19 dataset (as released on 2020/06/16) and then fine-tuned for QA on SQuAD v2.0.
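
The layer and hidden-size counts are encoded in the checkpoint name (L-10, H-512, A-8) and can be verified from the configuration (a quick sanity sketch using the standard BertConfig attribute names):

from transformers import AutoConfig

config = AutoConfig.from_pretrained("aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2")
# Expected for this checkpoint: 10 layers, hidden size 512, 8 attention heads.
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)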

Training the model

python run_squad.py \
    --model_type bert \
    --model_name_or_path aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616 \
    --train_file 'train-v2.0.json' \
    --predict_file 'dev-v2.0.json' \
    --do_train \
    --do_eval \
    --do_lower_case \
    --version_2_with_negative \
    --max_seq_length 384 \
    --per_gpu_train_batch_size 10 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-10_H-512_A-8_cord19-200616_squad2
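
Because training used --version_2_with_negative, the model is intended to handle unanswerable questions as in SQuAD v2.0. A hedged sketch using the transformers question-answering pipeline (the question here is purely illustrative; handle_impossible_answer enables the SQuAD v2-style empty answer):

from transformers import pipeline

qa = pipeline("question-answering", model="aodiniz/bert_uncased_L-10_H-512_A-8_cord19-200616_squad2")

result = qa(
    question="Where does she work?",  # not answerable from the context below
    context="She lives in Berlin.",
    handle_impossible_answer=True,    # allow the pipeline to return an empty answer
)
print(result)  # an empty "answer" string indicates the model predicts "no answer"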