---
license: agpl-3.0
tags:
- PubChem
- Electra
---
This is an ELECTRA model consisting of a generator and a discriminator, each with 5.6M parameters. It was pretrained with the ELECTRA objective (replaced-token detection) on 118 million SMILES strings from PubChem, using the chebai library. No task-specific fine-tuning has been applied.
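To illustrate the pretraining objective described above, here is a toy, self-contained sketch of replaced-token detection on a SMILES string. The function name `corrupt_smiles`, the mini vocabulary, and the character-level tokenization are illustrative assumptions for this sketch only; they are not part of the released model or the chebai library, where the generator is a learned network rather than random sampling.

```python
import random

def corrupt_smiles(tokens, vocab, replace_prob=0.15, seed=0):
    """Toy stand-in for the ELECTRA generator: randomly replace a
    fraction of SMILES tokens, recording which positions changed.
    The discriminator is then trained to predict these 0/1 labels."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            # Pick a different token from the vocabulary as the replacement.
            replacement = rng.choice([v for v in vocab if v != tok])
            corrupted.append(replacement)
            labels.append(1)  # replaced token -> discriminator target 1
        else:
            corrupted.append(tok)
            labels.append(0)  # original token -> discriminator target 0
    return corrupted, labels

vocab = ["C", "c", "O", "N", "(", ")", "1", "="]
tokens = list("c1ccccc1O")  # phenol, tokenized per character for simplicity
corrupted, labels = corrupt_smiles(tokens, vocab)
```

In the real ELECTRA setup, the generator proposes contextually plausible replacements instead of uniform random ones, which makes the discriminator's detection task informative.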