# FLP Test v10
A chemistry prediction model fine-tuned on Precite platform.
## Model Details
- Base Model: seyonec/ChemBERTa-zinc-base-v1
- Fine-tuned On: 8 training samples, 2 validation samples (80/20 split)
- Task: Molecular property prediction (4 classes)
- Epochs: 2
- Training Date: 2026-02-04
## Performance Metrics (20% Holdout Test Set)
| Metric | Value |
|---|---|
| Accuracy | 0.5000 |
| F1 Score | 0.3333 |
| Precision | 0.2500 |
| Recall | 0.5000 |
| Training Loss | 1.4379 |

Note: with only 2 holdout samples, these metrics carry very high variance and should be treated as indicative at best.
## Label Classes

- high
- low
- medium
- very_low
## Usage

This model can be queried through the Precite platform for FLP chemistry predictions, or loaded directly with `transformers`:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("blainetrain/FLP-Test-v10")
tokenizer = AutoTokenizer.from_pretrained("blainetrain/FLP-Test-v10")
```
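Once loaded, the model can be run on a single molecule. The sketch below is an assumption about the intended workflow rather than a documented API for this model: the input SMILES string (`"CCO"`) is a hypothetical example, and the printed class name relies on the `id2label` mapping stored in the model config.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned classifier and its tokenizer.
model = AutoModelForSequenceClassification.from_pretrained("blainetrain/FLP-Test-v10")
tokenizer = AutoTokenizer.from_pretrained("blainetrain/FLP-Test-v10")
model.eval()

# Hypothetical input: ChemBERTa-based models are typically fed SMILES strings.
inputs = tokenizer("CCO", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 4): one score per class

pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])  # one of the four label classes above
```

The argmax over the logits selects the most likely of the four classes; for calibrated scores, apply `logits.softmax(dim=-1)` instead.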
## Training Data
See the associated dataset: blainetrain/precite-dataset-FLP-Test-v10