Example Usage

from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("QizhiPei/biot5-plus-base-mol-instructions-protein")
model = T5ForConditionalGeneration.from_pretrained("QizhiPei/biot5-plus-base-mol-instructions-protein")
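
The loaded model can then be used for instruction-following generation. The lines below are a minimal sketch, assuming a plain-text instruction prompt; the example prompt is a placeholder, and the exact Mol-Instructions prompt template expected by this checkpoint should be taken from the BioT5+ repository.

# Hypothetical instruction; replace with the Mol-Instructions protein prompt format used in training.
prompt = "Describe the function of the given protein."
inputs = tokenizer(prompt, return_tensors="pt")
# Generate an answer and decode it back to text.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))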

References

For more information, please refer to our paper and GitHub repository.

Paper: BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning

GitHub: BioT5+

Authors: Qizhi Pei, Lijun Wu, Kaiyuan Gao, Xiaozhuan Liang, Yin Fang, Jinhua Zhu, Shufang Xie, Tao Qin, and Rui Yan
