
Multitask Text and Chemistry T5

Multitask Text and Chemistry T5: a multi-domain, multi-task language model that solves a wide range of tasks in both the chemical and natural-language domains. Published by Christofidellis et al.

Model Details: The Multitask Text and Chemistry T5 variant trained using t5-base as its pretrained base and the standard dataset.

Developers: Dimitrios Christofidellis*, Giorgio Giannone*, Jannis Born, Teodoro Laino and Matteo Manica from IBM Research and Ole Winther from Technical University of Denmark.

Distributors: Model natively integrated into GT4SD.

Model date: 2023.

Model type: A Transformer-based language model trained on a multi-domain, multi-task dataset built by aggregating available datasets for forward reaction prediction, retrosynthesis, molecular captioning, text-conditional de novo generation, and paragraph-to-actions. A minimal usage sketch is shown below.
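
The sketch below shows one way to query the model through the Hugging Face transformers library. It assumes the checkpoint is published on the Hub under an ID such as "GT4SD/multitask-text-and-chemistry-t5-base-standard" and uses an illustrative natural-language prompt; the exact Hub ID and prompt prefixes used in training should be checked against the paper and the GT4SD repository.

```python
# Minimal sketch, assuming a Hub checkpoint ID and an illustrative task prompt
# (both are assumptions, not confirmed by this model card).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "GT4SD/multitask-text-and-chemistry-t5-base-standard"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Text-conditional de novo generation: describe a molecule in natural language
# and ask the model to emit a SMILES string.
prompt = "Write in SMILES the described molecule: The molecule is a monocarboxylic acid anion."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The other tasks (forward reaction prediction, retrosynthesis, molecular captioning, paragraph-to-actions) follow the same text-to-text pattern, differing only in the task-specific prompt.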

Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: N.A.

Paper or other resource for more information: Unifying Molecular and Textual Representations via Multi-task Language Modelling (Christofidellis et al., 2023).

License: MIT

Where to send questions or comments about the model: Open an issue in the GT4SD repository.

Citation

@article{christofidellis2023unifying,
  title={Unifying Molecular and Textual Representations via Multi-task Language Modelling},
  author={Christofidellis, Dimitrios and Giannone, Giorgio and Born, Jannis and Winther, Ole and Laino, Teodoro and Manica, Matteo},
  journal={arXiv preprint arXiv:2301.12586},
  year={2023}
}

*equal contribution

Model size: 223M parameters (F32, safetensors).