
# BERT L-2 H-512 fine-tuned on MLM (CORD-19 2020/06/16)

BERT model with 2 Transformer layers and a hidden embedding size of 512, introduced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962), fine-tuned for MLM on the CORD-19 dataset (as released on 2020/06/16).
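
As a quick sanity check (a minimal sketch, not part of the original card), the base checkpoint named in the training command below can be loaded with `transformers` to confirm the compact architecture described above:

```python
from transformers import AutoConfig, AutoModelForMaskedLM

# google/bert_uncased_L-2_H-512_A-8 is the starting checkpoint used below.
config = AutoConfig.from_pretrained("google/bert_uncased_L-2_H-512_A-8")
print(config.num_hidden_layers)    # 2 Transformer layers (the L-2 suffix)
print(config.hidden_size)          # 512-dimensional hidden embeddings (H-512)
print(config.num_attention_heads)  # 8 attention heads (A-8)

# The same checkpoint loads with an MLM head for fine-tuning:
model = AutoModelForMaskedLM.from_pretrained("google/bert_uncased_L-2_H-512_A-8")
```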

## Training the model

```bash
python run_language_modeling.py \
    --model_type bert \
    --model_name_or_path google/bert_uncased_L-2_H-512_A-8 \
    --do_train \
    --train_data_file {cord19-200616-dataset} \
    --mlm \
    --mlm_probability 0.2 \
    --line_by_line \
    --block_size 512 \
    --per_device_train_batch_size 20 \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --output_dir bert_uncased_L-2_H-512_A-8_cord19-200616
```
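
Once training finishes, the checkpoint can be exercised with a fill-mask pipeline (a minimal sketch, assuming the model and tokenizer were saved to the `--output_dir` above; substitute the Hub repo id if loading remotely, and note the example sentence is illustrative):

```python
from transformers import pipeline

# Point the pipeline at the local fine-tuned checkpoint directory.
fill_mask = pipeline(
    "fill-mask",
    model="bert_uncased_L-2_H-512_A-8_cord19-200616",
)

# [MASK] is the mask token for uncased BERT tokenizers.
print(fill_mask("Coronavirus disease is caused by a [MASK]."))
```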