---
tags:
- molecular language model
---

# MolGen

MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.
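
Because SELFIES is syntactically and semantically constrained, any token string decodes to a valid molecule, and MolGen inherits this property by generating in that representation. A quick illustration using the open-source `selfies` package (the package is separate from MolGen; the example molecule is chosen here for illustration):

```python
# pip install selfies   (open-source SELFIES package, separate from MolGen)
import selfies as sf

# Round-trip a molecule: every SELFIES string decodes to a valid molecule.
smiles = "C1=CC=CC=C1"                # benzene
selfies_str = sf.encoder(smiles)      # "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
recovered = sf.decoder(selfies_str)   # back to SMILES

print(selfies_str)
print(recovered)

# Even an arbitrarily reordered token sequence still decodes to *some* valid
# molecule, which is why generating in SELFIES cannot yield invalid outputs.
tokens = list(sf.split_selfies(selfies_str))
print(sf.decoder("".join(reversed(tokens))))
```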

## Model description

MolGen is the first pre-trained molecular language model that produces only chemically valid molecules.

With a training corpus of over 100 million molecules in the SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES to their original forms.
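
Pre-training is thus a denoising task: tokens of an input SELFIES string are corrupted, and the model learns to reconstruct the original. The exact corruption scheme is defined in the paper; the sketch below uses simple random token masking (the `<mask>` token and masking rate are assumptions) just to show the shape of one (corrupted, target) training pair:

```python
import random
import selfies as sf

def corrupt_selfies(selfies_str, mask_token="<mask>", p=0.3):
    """Mask random SELFIES tokens to form a (corrupted input, target) pair.

    Illustrative only: the mask token and masking scheme are assumptions,
    not the exact corruption strategy used to pre-train MolGen.
    """
    tokens = list(sf.split_selfies(selfies_str))
    corrupted = [mask_token if random.random() < p else tok for tok in tokens]
    return "".join(corrupted), selfies_str

random.seed(0)
src, tgt = corrupt_selfies("[C][=C][C][=C][C][=C][Ring1][=Branch1]")
print("encoder input :", src)   # corrupted SELFIES
print("decoder target:", tgt)   # original SELFIES to reconstruct
```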

Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
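
This BART-style encoder-decoder layout means the released checkpoints can be driven through the standard Hugging Face seq2seq interface. A minimal generation sketch, assuming the `zjunlp/MolGen-large` checkpoint on the Hub (the checkpoint name, prompt, and generation settings here are assumptions, not taken from this card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Checkpoint name is an assumption; see the MolGen repository for released weights.
tokenizer = AutoTokenizer.from_pretrained("zjunlp/MolGen-large")
model = AutoModelForSeq2SeqLM.from_pretrained("zjunlp/MolGen-large")

# Encode a SELFIES string and let the decoder propose molecule continuations.
sf_input = tokenizer("[C][=C][C][=C][C][=C][Ring1][=Branch1]", return_tensors="pt")
outputs = model.generate(
    input_ids=sf_input["input_ids"],
    attention_mask=sf_input["attention_mask"],
    max_length=32,
    num_beams=5,
    num_return_sequences=5,
)
molecules = [
    tokenizer.decode(ids, skip_special_tokens=True).replace(" ", "")
    for ids in outputs
]
print(molecules)  # each output is a SELFIES string, hence a valid molecule
```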

Through its carefully designed multi-task prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
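
The MPT mechanism itself is described in the paper. As a rough, non-authoritative sketch of the general technique family, the snippet below prepends trainable soft-prefix embeddings to a frozen seq2seq backbone, which conveys the idea of steering generation with a small set of tuned parameters. The class name, prefix length, and initialization are invented for illustration and do not reproduce the paper's MPT:

```python
import torch
import torch.nn as nn
from transformers import AutoModelForSeq2SeqLM

class SoftPrefixSeq2Seq(nn.Module):
    """Trainable soft-prefix embeddings on a frozen seq2seq backbone.

    Not the paper's MPT: class name, prefix length, and initialization are
    invented here purely to convey the prefix-tuning idea.
    """

    def __init__(self, checkpoint, prefix_len=10):
        super().__init__()
        self.model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
        for p in self.model.parameters():
            p.requires_grad = False          # freeze the pre-trained weights
        d_model = self.model.config.d_model
        self.prefix = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)

    def forward(self, input_ids, attention_mask, labels):
        embeds = self.model.get_input_embeddings()(input_ids)
        batch = input_ids.size(0)
        prefix = self.prefix.unsqueeze(0).expand(batch, -1, -1)
        prefix_mask = attention_mask.new_ones(batch, self.prefix.size(0))
        return self.model(
            inputs_embeds=torch.cat([prefix, embeds], dim=1),
            attention_mask=torch.cat([prefix_mask, attention_mask], dim=1),
            labels=labels,                   # only self.prefix receives gradients
        )
```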

### BibTeX entry and citation info

```bibtex
@article{fang2023molecular,
  title={Molecular Language Model as Multi-task Generator},
  author={Fang, Yin and Zhang, Ningyu and Chen, Zhuo and Fan, Xiaohui and Chen, Huajun},
  journal={arXiv preprint arXiv:2301.11259},
  year={2023}
}
```