# FLP 2
Fine-tuned from DeepChem/ChemBERTa-10M-MTR using Precite.
## Training Configuration
| Parameter | Value |
|---|---|
| Base Model | DeepChem/ChemBERTa-10M-MTR |
| Version | 1 |
| Epochs | 10 |
| Batch Size | 32 |
| Learning Rate | 0.0001 |
| Test Split | 20% |
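As a rough illustration, the table above can be mapped onto Hugging Face `TrainingArguments`. This is a hedged sketch, not the Precite training configuration itself; the `output_dir` value is a placeholder.

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
# Precite's actual runtime settings may differ.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="flp-2",               # hypothetical output directory
    num_train_epochs=10,              # Epochs
    per_device_train_batch_size=32,   # Batch Size
    learning_rate=1e-4,               # Learning Rate
)
```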
## Dataset
Training data: blainetrain/flp-2-cmlh1ixs-data
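A minimal sketch of loading and splitting this data with the `datasets` library is shown below. The `train` split name and the `smiles` column are assumptions about the dataset schema, and Precite may prepare the data differently.

```python
# Sketch: load the training data and hold out 20% as a test split.
# The split name and column names are assumptions.
from datasets import load_dataset
from transformers import AutoTokenizer

raw = load_dataset("blainetrain/flp-2-cmlh1ixs-data", split="train")
splits = raw.train_test_split(test_size=0.2)  # 20% test split

tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-10M-MTR")

def tokenize(batch):
    # Assumes a "smiles" column holding SMILES strings.
    return tokenizer(batch["smiles"], truncation=True, padding="max_length")

tokenized = splits.map(tokenize, batched=True)
train_ds, eval_ds = tokenized["train"], tokenized["test"]
```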
## Inference Providers
This model isn't deployed by any Inference Provider.
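Since no hosted provider serves this model, it can be loaded locally with `transformers`. The sketch below assumes a single-output regression head and that the tokenizer was saved alongside the checkpoint; adjust the model class if the checkpoint uses a different head.

```python
# Sketch: local inference with the fine-tuned checkpoint.
# A regression-style sequence-classification head is assumed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("blainetrain/flp-2-cmlh1ixs")
model = AutoModelForSequenceClassification.from_pretrained("blainetrain/flp-2-cmlh1ixs")
model.eval()

inputs = tokenizer("CCO", return_tensors="pt")  # example SMILES string (ethanol)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)
```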
## Model tree for blainetrain/flp-2-cmlh1ixs

Base model: DeepChem/ChemBERTa-10M-MTR