This is an ELECTRA model consisting of a generator and a discriminator, with 5.6M parameters each. It was pretrained with the ELECTRA objective on 118 million SMILES strings from PubChem, using the chebai library. No task-specific fine-tuning has been applied.
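To illustrate the pretraining objective: ELECTRA masks some input tokens, has the generator propose replacements, and trains the discriminator to flag which tokens were replaced. The toy sketch below shows this data setup on a tokenized SMILES string; it is a minimal illustration only, with random sampling standing in for the generator, and all names (`electra_labels`, the vocabulary, the mask probability) are hypothetical rather than taken from the chebai implementation.

```python
import random

def electra_labels(tokens, vocab, mask_prob=0.5, seed=1):
    """Toy ELECTRA-style corruption: mask positions at random, fill them
    with a sampled token (stand-in for the generator), and emit per-token
    labels for the discriminator (1 = token was replaced, 0 = original)."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            sampled = rng.choice(vocab)  # a real generator samples from its output distribution
            corrupted.append(sampled)
            labels.append(int(sampled != tok))  # sampling the original token yields label 0
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

# Character-tokenized SMILES for ethanol ("CCO") as a toy input
tokens = ["C", "C", "O"]
vocab = ["C", "O", "N", "=", "(", ")"]
corrupted, labels = electra_labels(tokens, vocab)
print(corrupted, labels)
```

In the actual objective, the discriminator is trained with a binary classification loss over these labels at every position, which is what makes ELECTRA more sample-efficient than masked-language-model pretraining.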

