
Plant foundation DNA large language models

The plant DNA large language models (LLMs) are a series of foundation models built on different architectures and pre-trained on various plant reference genomes.
All models have a comparable size of 90–150 MB; a BPE tokenizer is used for tokenization, with a vocabulary of 8,000 tokens.
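BPE merges frequently co-occurring adjacent symbols into longer tokens, so common DNA motifs become single vocabulary entries instead of fixed-length k-mers. The toy trainer below is a from-scratch sketch of the idea (not the actual tokenizer shipped with these models), starting from single-nucleotide tokens:

```python
from collections import Counter

def merge_pair(tokens, a, b):
    """Replace every adjacent (a, b) pair in tokens with the merged token a+b."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
            out.append(a + b)
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

def train_bpe(seqs, num_merges):
    """Learn BPE merge rules starting from single-nucleotide tokens."""
    corpus = [list(s) for s in seqs]
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for toks in corpus:
            pairs.update(zip(toks, toks[1:]))
        if not pairs:
            break
        a, b = pairs.most_common(1)[0][0]
        merges.append((a, b))
        corpus = [merge_pair(toks, a, b) for toks in corpus]
    return merges

def tokenize(seq, merges):
    """Apply the learned merges in order to segment a new sequence."""
    tokens = list(seq)
    for a, b in merges:
        tokens = merge_pair(tokens, a, b)
    return tokens

merges = train_bpe(['ATATAT', 'ATAT', 'GCGC'], num_merges=3)
print(merges)                      # learned merge rules
print(tokenize('ATATGC', merges))  # -> ['ATAT', 'GC']
```

The real tokenizer is learned on whole plant genomes with an 8,000-token vocabulary; this sketch only shows the merge mechanism.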

Developed by: zhangtaolab

Model Sources

Architecture

The model is based on the Google BERT base architecture, with a modified tokenizer specific to DNA sequences.

This model is fine-tuned for predicting H3K27ac histone modification.

How to use

Install the runtime library first:

pip install transformers

Here is a simple code example for inference:

from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name = 'plant-dnabert-BPE-H3K27ac'
# load model and tokenizer
model = AutoModelForSequenceClassification.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(f'zhangtaolab/{model_name}', trust_remote_code=True)

# inference
sequences = ['GCTTTGGTTTATACCTTACACAACATAAATCACATAGTTAATCCCTAATCGTCTTTGATTCTCAATGTTTTGTTCATTTTTACCATGAACATCATCTGATTGATAAGTGCATAGAGAATTAACGGCTTACACTTTACACTTGCATAGATGATTCCTAAGTATGTCCT',
             'TAGCCCCCTCCTCTCTTTATATAGTGCAATCTAATATATGAAAGGTTCGGTGATGGGGCCAATAAGTGTATTTAGGCTAGGCCTTCATGGGCCAAGCCCAAAAGTTTCTCAACACTCCCCCTTGAGCACTCACCGCGTAATGTCCATGCCTCGTCAAAACTCCATAAAAACCCAGTG']
pipe = pipeline('text-classification', model=model, tokenizer=tokenizer,
                trust_remote_code=True, top_k=None)
results = pipe(sequences)
print(results)
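With top_k=None, the pipeline returns, for each input sequence, a list of {'label', 'score'} dicts covering every class. A small helper can reduce that to the most likely class per sequence; the label strings in the mock results below are placeholders, so check model.config.id2label for the actual names:

```python
def best_label(per_seq_scores):
    """Pick the class dict with the highest score for one sequence."""
    top = max(per_seq_scores, key=lambda d: d['score'])
    return top['label'], top['score']

# Illustrative shape of pipe(sequences) output with top_k=None;
# the label strings are placeholders, not necessarily the names
# stored in this model's config (see model.config.id2label).
mock_results = [
    [{'label': 'H3K27ac', 'score': 0.91}, {'label': 'Not H3K27ac', 'score': 0.09}],
    [{'label': 'Not H3K27ac', 'score': 0.67}, {'label': 'H3K27ac', 'score': 0.33}],
]

for scores in mock_results:
    label, prob = best_label(scores)
    print(f'{label}\t{prob:.2f}')
```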

Training data

We use BertForSequenceClassification to fine-tune the model.
Detailed training procedure can be found in our manuscript.
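BertForSequenceClassification places a linear classification head on the pooled [CLS] representation and optimizes a cross-entropy loss over the class logits. A minimal numeric sketch of that loss in plain Python, with illustrative logits only:

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target):
    """Negative log-probability of the true class index."""
    return -math.log(softmax(logits)[target])

# Two-class head; treating index 1 as the positive (H3K27ac) class
# here is an assumption for illustration only.
logits = [0.2, 1.5]
print(round(cross_entropy(logits, 1), 4))  # small loss: class 1 is favored
```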

Hardware

The model was trained on an NVIDIA GTX 1080 Ti GPU (11 GB).
