---
library_name: transformers
license: apache-2.0
base_model: cafierom/bert-base-cased-DA-ChemTok-ZN1540K-V1
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: bert-base-cased-DA-ChemTok-ZN1540K-V1-finetuned-HMGCR-IC50s-V1
  results: []
---

# bert-base-cased-DA-ChemTok-ZN1540K-V1-finetuned-HMGCR-IC50s-V1

This model is a fine-tuned version of [cafierom/bert-base-cased-DA-ChemTok-ZN1540K-V1](https://huggingface.co/cafierom/bert-base-cased-DA-ChemTok-ZN1540K-V1) on the [cafierom/HMGCR_1K](https://huggingface.co/datasets/cafierom/HMGCR_1K) dataset. It achieves the following results on the evaluation set:

- Loss: 0.4459
- Accuracy: 0.8714
- F1: 0.8574
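
A minimal inference sketch is shown below. It assumes the checkpoint loads as a standard `AutoModelForSequenceClassification` with its bundled tokenizer, that the repo id matches the model name above, and that inputs are SMILES strings; the example molecule is a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id, composed from the uploader and model name in this card.
model_id = "cafierom/bert-base-cased-DA-ChemTok-ZN1540K-V1-finetuned-HMGCR-IC50s-V1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder SMILES input; replace with a real candidate molecule.
smiles = "CCO"

inputs = tokenizer(smiles, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```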

## Model description

More information needed

## Intended uses & limitations

The model classifies HMGCR IC50 values into three classes: < 50 nM, < 500 nM, and > 500 nM. See the confusion matrix below.
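
The three-way labeling can be written as a simple threshold rule. The sketch below is a hypothetical helper (`ic50_bin` is not part of this repo) that mirrors the class boundaries listed in the card.

```python
def ic50_bin(ic50_nm: float) -> str:
    """Map an HMGCR IC50 value (in nM) to the card's three classes."""
    if ic50_nm < 50:
        return "< 50 nM"
    if ic50_nm < 500:
        return "< 500 nM"
    return "> 500 nM"
```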

## Training and evaluation data

*Confusion matrix on the evaluation set (image).*
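
The card names cafierom/HMGCR_1K as the training data. A minimal loading sketch, assuming the dataset follows the standard `datasets` layout (split and column names may differ):

```python
from datasets import load_dataset

# Dataset named in this card; split/column names are assumptions.
dataset = load_dataset("cafierom/HMGCR_1K")
print(dataset)
```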

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
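
For reproduction, these settings translate roughly into the `TrainingArguments` below. This is a sketch: `output_dir` is illustrative, and per-epoch evaluation is inferred from the training-results table (25 steps per epoch over 20 epochs).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-DA-ChemTok-ZN1540K-V1-finetuned-HMGCR-IC50s-V1",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",  # inferred: the results table reports validation metrics each epoch
)
```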

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8468        | 1.0   | 25   | 0.6663          | 0.7214   | 0.6517 |
| 0.637         | 2.0   | 50   | 0.5691          | 0.7786   | 0.7122 |
| 0.5817        | 3.0   | 75   | 0.5069          | 0.8      | 0.7643 |
| 0.5045        | 4.0   | 100  | 0.4742          | 0.8286   | 0.7958 |
| 0.4358        | 5.0   | 125  | 0.4312          | 0.8214   | 0.7959 |
| 0.411         | 6.0   | 150  | 0.4571          | 0.8071   | 0.7703 |
| 0.3655        | 7.0   | 175  | 0.4557          | 0.8429   | 0.8084 |
| 0.3522        | 8.0   | 200  | 0.4085          | 0.8714   | 0.8649 |
| 0.3217        | 9.0   | 225  | 0.4788          | 0.8286   | 0.7893 |
| 0.3186        | 10.0  | 250  | 0.4516          | 0.8143   | 0.7909 |
| 0.2875        | 11.0  | 275  | 0.4568          | 0.8357   | 0.8084 |
| 0.2754        | 12.0  | 300  | 0.4401          | 0.8357   | 0.8135 |
| 0.2678        | 13.0  | 325  | 0.4637          | 0.85     | 0.8379 |
| 0.235         | 14.0  | 350  | 0.4450          | 0.85     | 0.8263 |
| 0.2347        | 15.0  | 375  | 0.4622          | 0.8571   | 0.8445 |
| 0.2215        | 16.0  | 400  | 0.4882          | 0.8429   | 0.8153 |
| 0.2046        | 17.0  | 425  | 0.4459          | 0.8714   | 0.8574 |
| 0.207         | 18.0  | 450  | 0.4715          | 0.8571   | 0.8409 |
| 0.1988        | 19.0  | 475  | 0.4697          | 0.8643   | 0.8506 |
| 0.186         | 20.0  | 500  | 0.4765          | 0.8643   | 0.8506 |
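
The accuracy and F1 columns above can be produced with a `compute_metrics` callback like the sketch below; the F1 averaging mode (`weighted` here) is an assumption, since the card does not state it for this three-class task.

```python
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=preds, references=labels)["accuracy"],
        # "weighted" averaging is an assumption; the card does not specify.
        "f1": f1_metric.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```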

### Framework versions

- Transformers 4.48.3
- PyTorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0