# FLP 2

`blainetrain/flp-2-cmlh1ixs`, fine-tuned from DeepChem/ChemBERTa-10M-MTR using Precite.

## Training Configuration

| Parameter     | Value                      |
|---------------|----------------------------|
| Base Model    | DeepChem/ChemBERTa-10M-MTR |
| Version       | 1                          |
| Epochs        | 10                         |
| Batch Size    | 32                         |
| Learning Rate | 0.0001                     |
| Test Split    | 20%                        |
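The configuration above can be sketched as a fine-tuning run with the Hugging Face `Trainer` API. This is a minimal illustration, not Precite's actual pipeline: the task head (binary classification), the dataset column names (`"text"`, `"label"`), and the output directory are all assumptions.

```python
# Hyperparameters taken directly from the configuration table above.
HPARAMS = {
    "base_model": "DeepChem/ChemBERTa-10M-MTR",
    "epochs": 10,
    "batch_size": 32,
    "learning_rate": 1e-4,
    "test_split": 0.2,  # 20% held out for evaluation
}

def build_trainer(hp=HPARAMS):
    """Assemble a Trainer mirroring the table above (illustrative sketch only)."""
    # Imports are kept inside the function so HPARAMS can be inspected
    # without transformers/datasets installed.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    tok = AutoTokenizer.from_pretrained(hp["base_model"])
    # A binary-classification head is assumed here; the card does not state the task.
    model = AutoModelForSequenceClassification.from_pretrained(
        hp["base_model"], num_labels=2
    )

    ds = load_dataset("blainetrain/flp-2-cmlh1ixs-data", split="train")
    ds = ds.train_test_split(test_size=hp["test_split"])
    # The "text" column name is an assumption about the dataset schema.
    ds = ds.map(
        lambda batch: tok(batch["text"], truncation=True, padding="max_length"),
        batched=True,
    )

    args = TrainingArguments(
        output_dir="flp-2",
        num_train_epochs=hp["epochs"],
        per_device_train_batch_size=hp["batch_size"],
        learning_rate=hp["learning_rate"],
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=ds["train"],
        eval_dataset=ds["test"],
    )
```

Calling `build_trainer().train()` would then reproduce a run of this shape, given access to the base model and dataset on the Hub.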

## Dataset

Training data: `blainetrain/flp-2-cmlh1ixs-data`

