Example Usage

from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the BioT5+ tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("QizhiPei/biot5-plus-base")
model = T5ForConditionalGeneration.from_pretrained("QizhiPei/biot5-plus-base")
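
A minimal generation sketch follows. The input string and generation settings are illustrative placeholders, not the exact instruction prompts used by the authors; see the paper and repository for the task-specific prompt formats.

# Illustrative instruction-style input (placeholder only; the exact prompt
# format for each task is defined in the BioT5+ paper and repository)
text = "Definition: Generate a description for the given molecule. Now complete the following example - Input: <molecule> Output:"
inputs = tokenizer(text, return_tensors="pt")

# Generate and decode the model's output sequence
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))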

References

For more information, please refer to our paper and GitHub repository.

Paper: BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning

GitHub: BioT5+

Authors: Qizhi Pei, Lijun Wu, Kaiyuan Gao, Xiaozhuan Liang, Yin Fang, Jinhua Zhu, Shufang Xie, Tao Qin, and Rui Yan
