---
license: mit
---
# Mol ID
A transformer encoder model pretrained on 50M ZINC SMILES strings using FlashAttention-2.
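As a sketch of the masked-token pretraining objective such encoder models typically use, the snippet below masks a fraction of the tokens in a SMILES string and keeps the originals as prediction targets. The character-level tokenizer and masking scheme here are illustrative assumptions, not this repo's actual implementation (which uses the Hugging Face `tokenizers` library):

```python
import random

# Hypothetical character-level tokenizer for SMILES strings; the real model
# uses a trained tokenizer with its own vocabulary and special tokens.
def char_tokenize(smiles: str) -> list[str]:
    return list(smiles)

def mask_tokens(tokens: list[str], mask_prob: float = 0.15, seed: int = 0):
    """BERT-style masking: replace a fraction of tokens with [MASK] and
    record the original token as the label; unmasked positions get None."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append("[MASK]")
            labels.append(tok)   # the model must predict the original token
        else:
            masked.append(tok)
            labels.append(None)  # position ignored by the loss
    return masked, labels

tokens = char_tokenize("CC(=O)Oc1ccccc1C(=O)O")  # aspirin as a SMILES string
masked, labels = mask_tokens(tokens)
print(masked)
```

During pretraining, the encoder sees the masked sequence and is trained to recover the original tokens at the masked positions only.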
Hardware:
- a GPU that supports FlashAttention-2 and bf16
Software:
- FlashAttention-2
- PyTorch Lightning for mixed precision (bf16-mixed)
- wandb for logging
- Hugging Face
- tokenizers
- datasets
GitHub repo: [link](https://github.com/BlenderWang9487/mol_id)