This is an ELECTRA model consisting of a generator and a discriminator, each with 5.6M parameters. It was pretrained with the ELECTRA objective on 118 million SMILES strings from PubChem, using the chebai library. No task-specific fine-tuning has been applied.
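The ELECTRA objective trains the discriminator to decide, for every token, whether it is original or was swapped in by a small generator. The sketch below illustrates how such a replaced-token-detection example could be constructed for a character-tokenised SMILES string. It is a toy illustration only: the random "generator", the character tokenisation, and the function name are assumptions for demonstration, not the chebai implementation (where the generator is itself a trained masked language model).

```python
import random

def make_electra_example(tokens, mask_prob=0.15, vocab=None, seed=0):
    """Build a toy ELECTRA training example.

    Some positions are selected for corruption; a stand-in 'generator'
    (here: uniform sampling from the vocabulary) proposes replacements.
    The discriminator's per-token labels are 1 = replaced, 0 = original.
    If the generator happens to sample the original token, the label
    stays 0, as in ELECTRA.
    """
    rng = random.Random(seed)
    vocab = vocab or sorted(set(tokens))
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            new = rng.choice(vocab)  # toy generator proposal
            corrupted.append(new)
            labels.append(0 if new == tok else 1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

# Character-level toy tokenisation of the SMILES string for aspirin.
tokens = list("CC(=O)Oc1ccccc1C(=O)O")
corrupted, labels = make_electra_example(tokens)
```

The discriminator then sees `corrupted` as input and is trained to predict `labels`, which is what makes ELECTRA pretraining sample-efficient: every token position, not just the masked ones, contributes a training signal.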