Yin Fang committed on
Commit
48ba16d
·
1 Parent(s): 089d787

Delete Readme.md

Files changed (1)
  1. Readme.md +0 -25
Readme.md DELETED
@@ -1,25 +0,0 @@
- ---
- # language: SELFIES
- tags:
- - molecular language model
-
- ---
-
- # MolGen
- MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.
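To make the SELFIES representation concrete, here is a minimal sketch of how a SELFIES string decomposes into bracketed symbols. This is not the official `selfies` package, just a regex illustration; the benzene string below is an assumption and should be verified with that package.

```python
import re

def tokenize_selfies(selfies_string):
    """Split a SELFIES string into its bracketed symbols."""
    tokens = re.findall(r"\[[^\]]*\]", selfies_string)
    # Sanity check: the symbols should reassemble into the original string.
    assert "".join(tokens) == selfies_string
    return tokens

# Assumed SELFIES string for benzene (verify with the selfies package).
benzene = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
print(tokenize_selfies(benzene))
```

Because every sequence of such symbols decodes to some valid molecule, a model that emits SELFIES symbols cannot produce a syntactically invalid structure, which is what the 100% robustness claim refers to.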
- ## Model description
- MolGen is the first pre-trained molecular language model that produces only chemically valid molecules.
- Pre-trained on a corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES strings back to their original forms.
- Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
- Through its carefully designed multi-task prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
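The corrupted-to-original training objective can be sketched as below. This is a simplified illustration, not the authors' exact corruption scheme; the `[MASK]` symbol and the masking rate are assumptions.

```python
import random

def corrupt(tokens, mask_prob=0.3, seed=0):
    """Randomly replace SELFIES symbols with a mask symbol.

    The denoising model is trained on pairs (corrupt(tokens), tokens),
    i.e. to reconstruct the original sequence from the corrupted one.
    """
    rng = random.Random(seed)
    return [t if rng.random() >= mask_prob else "[MASK]" for t in tokens]

original = ["[C]", "[=C]", "[C]", "[=C]", "[C]", "[=C]"]
corrupted = corrupt(original)
print(corrupted)  # one training input; the target is `original`
```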
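At a high level, prefix tuning conditions a shared frozen model by prepending learned, task-specific material to its inputs. The toy token-level sketch below conveys only that idea; the prefix names and task labels are hypothetical, and the actual MPT mechanism injects continuous learned vectors into the attention layers rather than literal tokens.

```python
# Hypothetical task prefixes (illustrative names, not MolGen's).
TASK_PREFIXES = {
    "plogp": ["[PREFIX_PLOGP_0]", "[PREFIX_PLOGP_1]"],
    "qed": ["[PREFIX_QED_0]", "[PREFIX_QED_1]"],
}

def with_prefix(task, input_tokens):
    """Prepend a task prefix so one shared model serves multiple tasks."""
    return TASK_PREFIXES[task] + input_tokens

print(with_prefix("qed", ["[C]", "[O]"]))
```

Switching the prefix switches the generation objective without retraining the underlying model, which is what makes the multi-task setup practical.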
-
-
- ### BibTeX entry and citation info
- ```bibtex
- @article{fang2023molecular,
- title={Molecular Language Model as Multi-task Generator},
- author={Fang, Yin and Zhang, Ningyu and Chen, Zhuo and Fan, Xiaohui and Chen, Huajun},
- journal={arXiv preprint arXiv:2301.11259},
- year={2023}
- }
- ```