---
license: apache-2.0
base_model: DeepChem/ChemBERTa-10M-MTR
datasets:
  - blainetrain/flp-2-cmlh1ixs-data
tags:
  - precite
  - materials-science
  - fine-tuned
---

# FLP 2

FLP 2 was fine-tuned from `DeepChem/ChemBERTa-10M-MTR` using Precite.

## Training Configuration

| Parameter | Value |
| --- | --- |
| Base Model | `DeepChem/ChemBERTa-10M-MTR` |
| Version | 1 |
| Epochs | 10 |
| Batch Size | 32 |
| Learning Rate | 0.0001 |
| Test Split | 20% |
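For reference, the hyperparameters above can be expressed as a small configuration sketch. The split helper below is illustrative only: Precite's actual splitting procedure (shuffling, stratification, seed) is not documented on this card.

```python
import random

# Hyperparameters taken from the Training Configuration table.
EPOCHS = 10
BATCH_SIZE = 32
LEARNING_RATE = 1e-4
TEST_SPLIT = 0.20  # 20% held out for evaluation


def train_test_split(items, test_fraction=TEST_SPLIT, seed=0):
    """Shuffle a dataset and split it into (train, test) partitions.

    Illustrative sketch only; the seed and shuffling strategy Precite
    uses are not documented here.
    """
    rng = random.Random(seed)
    shuffled = list(items)
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]


train, test = train_test_split(range(100))
print(len(train), len(test))  # 80 20
```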

## Dataset

Training data: `blainetrain/flp-2-cmlh1ixs-data`
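The card does not include a usage snippet. A minimal sketch for loading the checkpoint with the `transformers` library might look like the following; it assumes the repository id matches this card's title and that the weights follow the RoBERTa-style architecture of the ChemBERTa base model.

```python
# Repository id assumed from this card's title; adjust if it differs.
MODEL_ID = "blainetrain/flp-2-cmlh1ixs"


def embed_smiles(smiles: str, model_id: str = MODEL_ID):
    """Encode a SMILES string with the fine-tuned encoder.

    Downloads the tokenizer and weights from the Hugging Face Hub on
    first use. Returns the last hidden state of the encoder.
    """
    # Deferred import so the module loads even without transformers installed.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    inputs = tokenizer(smiles, return_tensors="pt")
    return model(**inputs).last_hidden_state


if __name__ == "__main__":
    # ChemBERTa-style models expect SMILES strings as input (e.g. ethanol).
    print(embed_smiles("CCO").shape)
```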