---
license: apache-2.0
base_model: DeepChem/ChemBERTa-10M-MTR
datasets:
- blainetrain/flp-2-cmlh1ixs-data
tags:
- precite
- materials-science
- fine-tuned
---
# FLP 2

Fine-tuned from DeepChem/ChemBERTa-10M-MTR using Precite.
## Training Configuration

| Parameter | Value |
|---|---|
| Base Model | DeepChem/ChemBERTa-10M-MTR |
| Version | 1 |
| Epochs | 10 |
| Batch Size | 32 |
| Learning Rate | 1e-4 |
| Test Split | 20% |
## Dataset

Training data: blainetrain/flp-2-cmlh1ixs-data
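As a minimal usage sketch, the fine-tuned checkpoint can be loaded with the `transformers` library. The repo id below is a placeholder assumption (the actual Hub id of this model is not stated in the card), and the model is assumed to expose a sequence-level prediction head; ChemBERTa models take SMILES strings as input rather than natural-language text.

```python
# Sketch of loading the fine-tuned model for inference.
# NOTE: "your-username/flp-2" is a placeholder repo id (assumption),
# not the real location of this checkpoint.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-username/flp-2"  # replace with the actual Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# ChemBERTa operates on SMILES strings (aspirin shown here as an example).
inputs = tokenizer("CC(=O)Oc1ccccc1C(=O)O", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits)
```

If the checkpoint was saved with a different head (e.g. a regression head matching the ChemBERTa-MTR pretraining objective), swap in the corresponding `AutoModel` class.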