Model Card for kbulutozler/distilbert-base-uncased-FT-ner-BC5CDR-chem

A DistilBERT model (66.4M parameters) fine-tuned for chemical named-entity recognition (NER). Trained on the training split of the BC5CDR-chem dataset, taken from the BLURB benchmark.

Training Details

Training Data

The training split of the BC5CDR-chem dataset, as distributed in the BLURB benchmark.
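The dataset is annotated at the token level with BIO tags for chemical mentions. A minimal sketch of how a character-span annotation maps to such tags; the sentence, whitespace tokenizer, and span below are illustrative, not taken from the dataset:

```python
# Sketch: converting a character-span chemical annotation into token-level
# BIO tags, the label format used for the BC5CDR-chem NER task.

def bio_tags(text, spans):
    """Whitespace-tokenize `text` and tag each token B/I/O against
    character-level entity `spans` given as [(start, end), ...]."""
    tokens, offsets, pos = [], [], 0
    for tok in text.split():
        start = text.index(tok, pos)
        tokens.append(tok)
        offsets.append((start, start + len(tok)))
        pos = start + len(tok)
    tags = []
    for start, end in offsets:
        label = "O"
        for s, e in spans:
            if start >= s and end <= e:
                # First token of the entity gets B-, the rest get I-
                label = "B-Chemical" if start == s else "I-Chemical"
        tags.append(label)
    return tokens, tags

# "acetylsalicylic acid" occupies characters 18-38 of this example sentence
tokens, tags = bio_tags("Patients received acetylsalicylic acid daily", [(18, 38)])
print(list(zip(tokens, tags)))
```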

Training Procedure

Standard full-parameter fine-tuning for token classification.

Training Hyperparameters

  • learning_rate: 5e-5
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 3
  • weight_decay: 0.01
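The hyperparameter names match the Hugging Face `transformers` `TrainingArguments` API, so the run configuration can be reproduced as below. This is a sketch, not the author's script; the output directory is illustrative, and dataset loading, tokenization, and the `Trainer` call are omitted:

```python
from transformers import TrainingArguments

# Configuration sketch matching the hyperparameters reported in this card.
args = TrainingArguments(
    output_dir="distilbert-bc5cdr-chem",  # illustrative path, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)
# A Trainer would then be constructed with a DistilBERT token-classification
# model and the tokenized BC5CDR-chem train/test splits, and trainer.train()
# would run the fine-tuning.
```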

Evaluation

Testing Data

The test split of the BC5CDR-chem dataset.

Results

  • Precision: 0.89
  • Recall: 0.87
  • Micro-F1: 0.88
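As a consistency check, F1 is the harmonic mean of precision and recall, and the reported micro-F1 follows from the reported precision and recall:

```python
# F1 = 2PR / (P + R): harmonic mean of precision and recall.
precision, recall = 0.89, 0.87
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # → 0.88, matching the reported micro-F1
```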

Environmental Impact

  • Hardware Type: 1xRTX A4000
  • Hours used: ~0.12 (7 minutes)