ZJU-Fangyin committed
Commit 089d787
1 Parent(s): 62be3a1

Create README.md

Files changed (1): README.md added (+18 lines)

# MolGen
MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.
## Model description
MolGen is the first pre-trained model that exclusively produces chemically valid molecules.
With a training corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES strings back to their original forms.
Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
Through its carefully designed multi-task prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
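
As a quick illustration of the SELFIES representation that MolGen is trained on, the sketch below round-trips a molecule between SMILES and SELFIES using the open-source `selfies` package. The example molecule is arbitrary and is not taken from the MolGen training corpus.

```python
# Minimal sketch of the SELFIES representation MolGen operates on,
# using the open-source `selfies` package (pip install selfies).
# The example molecule (aspirin) is arbitrary and for illustration only.
import selfies as sf

smiles = "CC(=O)OC1=CC=CC=C1C(=O)O"   # aspirin, written as SMILES
selfies_str = sf.encoder(smiles)      # SMILES -> SELFIES
recovered = sf.decoder(selfies_str)   # SELFIES -> SMILES

print(selfies_str)  # a sequence of SELFIES tokens such as [C][C][=Branch1]...
print(recovered)    # a SMILES string describing the same molecule
```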
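
A minimal usage sketch with the Hugging Face `transformers` library is given below. The checkpoint identifier `zjunlp/MolGen-large` and the use of a BART-style sequence-to-sequence class are assumptions based on the encoder-decoder architecture described above; please consult the official repository for the released checkpoints and recommended generation settings.

```python
# Hedged sketch: loading MolGen as a BART-style seq2seq model via transformers.
# The checkpoint id "zjunlp/MolGen-large" is an assumption; see
# https://github.com/zjunlp/MolGen for the actually released checkpoints.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "zjunlp/MolGen-large"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Prompt with a partial SELFIES string and let the decoder complete it.
prompt = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"  # benzene in SELFIES
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    num_beams=5,
    num_return_sequences=5,
)
candidates = tokenizer.batch_decode(outputs, skip_special_tokens=True)
print(candidates)  # candidate SELFIES strings to validate and decode downstream
```
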
### BibTeX entry and citation info
```bibtex
@article{fang2023molecular,
  title={Molecular Language Model as Multi-task Generator},
  author={Fang, Yin and Zhang, Ningyu and Chen, Zhuo and Fan, Xiaohui and Chen, Huajun},
  journal={arXiv preprint arXiv:2301.11259},
  year={2023}
}
```