To produce BioELECTRA, we pretrain ELECTRA on a corpus of over 20 million abstracts from PubMed.
How to use the discriminator in transformers:
from transformers import ElectraForPreTraining, ElectraTokenizerFast
import torch

# Load the pretrained BioELECTRA discriminator and its tokenizer
discriminator = ElectraForPreTraining.from_pretrained("molly-hayward/bioelectra-base-discriminator")
tokenizer = ElectraTokenizerFast.from_pretrained("molly-hayward/bioelectra-base-discriminator")
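
The snippet below is a minimal sketch of running the loaded discriminator on a sentence containing a replaced token. The example sentence is made up for illustration, and the thresholding of the output logits follows the generic ELECTRA discriminator usage pattern rather than anything specific to this card.

# Hypothetical example sentence with one corrupted ("fake") token.
fake_sentence = "the patient was treated with banana antibiotics"

# Tokenize and encode the sentence for the discriminator.
fake_tokens = tokenizer.tokenize(fake_sentence)
fake_inputs = tokenizer.encode(fake_sentence, return_tensors="pt")

# The discriminator produces one logit per token; a positive logit
# means the token is predicted to have been replaced.
discriminator_outputs = discriminator(fake_inputs)
predictions = torch.round((torch.sign(discriminator_outputs[0]) + 1) / 2)

# Print each token next to its replaced (1) / original (0) prediction.
# encode() adds [CLS] and [SEP], so those two positions are dropped.
print(list(zip(fake_tokens, predictions[0].tolist()[1:-1])))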