---
license: mit
---
# Mol ID
A Transformer encoder model pretrained on 50M ZINC SMILES strings using FlashAttention-2.
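Before pretraining, each SMILES string is split into tokens. The repo's actual vocabulary comes from its `tokenizers` setup, so the following is only an illustrative sketch using a commonly used regex-based SMILES tokenization scheme (atom brackets, two-letter halogens, ring-bond digits, etc.), not the model's exact tokenizer:

```python
import re

# Illustrative SMILES tokenization regex (a widely used pattern);
# the model's real tokenizer lives in the linked GitHub repo.
SMILES_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into chemically meaningful tokens."""
    return SMILES_PATTERN.findall(smiles)

# Example: aspirin
print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))
# → ['C', 'C', '(', '=', 'O', ')', 'O', 'c', '1', 'c', 'c', 'c', 'c', 'c', '1', 'C', '(', '=', 'O', ')', 'O']
```

The regex keeps bracketed atoms (e.g. `[NH3+]`) and two-character elements (`Cl`, `Br`) as single tokens, so joining the tokens reconstructs the original string.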
Hardware:
- a GPU that supports FlashAttention-2 and bf16

Software:
- flash-attn (FlashAttention-2)
- lightning for mixed-precision training (bf16-mixed)
- wandb for logging
- huggingface
- tokenizers
- datasets
GitHub repo: [link](https://github.com/BlenderWang9487/mol_id)