Model Card for kbulutozler/distilbert-base-uncased-FT-ner-JNLPBA

A DistilBERT model fine-tuned for named entity recognition, trained on the training split of the JNLPBA dataset as distributed in the BLURB benchmark.
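A minimal usage sketch with the Hugging Face `transformers` pipeline API, assuming the checkpoint is available on the Hub under the model ID shown on this card (the example sentence is illustrative):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a token-classification (NER) pipeline.
# "simple" aggregation merges sub-word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="kbulutozler/distilbert-base-uncased-FT-ner-JNLPBA",
    aggregation_strategy="simple",
)

# JNLPBA covers biomedical entity types (protein, DNA, RNA, cell line, cell type).
print(ner("IL-2 gene expression requires activation of NF-kappa B."))
```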

Training Details

Training Data

The training split of the JNLPBA dataset, taken from the BLURB benchmark.

Training Procedure

Standard full fine-tuning of all model parameters on the token-classification objective.

Training Hyperparameters

  • learning_rate: 5e-5
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 3
  • weight_decay: 0.01
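The hyperparameters above match the Hugging Face `Trainer` interface. The exact training script is not published, so the following configuration is a sketch; the `output_dir` name is hypothetical and all other arguments are left at their defaults:

```python
from transformers import TrainingArguments

# Hyperparameters as reported on this card; everything else is illustrative.
training_args = TrainingArguments(
    output_dir="distilbert-jnlpba-ner",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)
```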

Evaluation

Testing Data

The test split of the JNLPBA dataset.

Results

  • Precision: 0.73
  • Recall: 0.83
  • Micro-F1: 0.78
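The reported micro-F1 is consistent with the precision/recall pair; a quick sanity check in plain Python (not the evaluation script used for this card):

```python
# Micro-F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R)
precision = 0.73
recall = 0.83
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # → 0.78
```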

Environmental Impact

  • Hardware Type: 1xRTX A4000
  • Hours used: ~0.32 (19 minutes)
Model Size

66.4M parameters, stored as F32 tensors in Safetensors format.