## Example Usage

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("QizhiPei/biot5-plus-large")
model = T5ForConditionalGeneration.from_pretrained("QizhiPei/biot5-plus-large")
```
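Once the tokenizer and model are loaded, text-to-text generation follows the standard `transformers` pattern. The sketch below is an assumed usage example: the input string and generation settings are placeholders, and BioT5+ expects task-specific prompt formats, so consult the paper and repository for the exact input layout for each task.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("QizhiPei/biot5-plus-large")
model = T5ForConditionalGeneration.from_pretrained("QizhiPei/biot5-plus-large")

# Placeholder input: replace with a task-formatted prompt (see the BioT5+ repo).
inputs = tokenizer("your task-formatted input here", return_tensors="pt")

# Standard seq2seq decoding; max_new_tokens is an illustrative choice.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```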

## References

For more information, please refer to our paper and GitHub repository.

- Paper: BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning
- GitHub: BioT5+

Authors: Qizhi Pei, Lijun Wu, Kaiyuan Gao, Xiaozhuan Liang, Yin Fang, Jinhua Zhu, Shufang Xie, Tao Qin, and Rui Yan
