---
license: mit
---

# Mol ID

A transformer encoder model pretrained on 50M ZINC SMILES strings using FlashAttention-2.
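The card does not say how the SMILES strings are tokenized before pretraining. As a minimal sketch (an assumption, not necessarily this model's tokenizer), the widely used regex-based SMILES tokenization splits a string into atoms, bonds, ring closures, and brackets:

```python
import re

# Regex-based SMILES tokenizer (hypothetical here: the card does not
# specify the actual tokenization scheme used for pretraining).
SMILES_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into chemically meaningful tokens."""
    tokens = SMILES_PATTERN.findall(smiles)
    # Sanity check: every character must be covered by some token.
    assert "".join(tokens) == smiles, "unrecognized characters in SMILES"
    return tokens

# Aspirin tokenizes into 21 tokens:
print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))
# → ['C', 'C', '(', '=', 'O', ')', 'O', 'c', '1', 'c', 'c', 'c', 'c',
#    'c', '1', 'C', '(', '=', 'O', ')', 'O']
```

The resulting token list would then be mapped to vocabulary IDs by the Hugging Face tokenizers library listed in the software stack below.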

Hardware:

  • a GPU that supports FlashAttention-2 and bf16

Software:

  • FlashAttention-2
  • Lightning for mixed precision (bf16-mixed)
  • Weights & Biases (wandb) for logging
  • Hugging Face
    • tokenizers
    • datasets

GitHub repo: link