---
license: apache-2.0
tags:
- chemistry
- precite
- chemberta
datasets:
- blainetrain/precite-dataset-FLP-Test-v10
base_model: seyonec/ChemBERTa-zinc-base-v1
model-index:
- name: FLP-Test-v10
results:
- task:
type: molecular-property-prediction
metrics:
- name: Accuracy
type: accuracy
value: 0.5000
- name: F1
type: f1
value: 0.3333
- name: Precision
type: precision
value: 0.2500
- name: Recall
type: recall
value: 0.5000
---
# FLP Test v10
A chemistry prediction model fine-tuned on the Precite platform.
## Model Details
- **Base Model**: [seyonec/ChemBERTa-zinc-base-v1](https://huggingface.co/seyonec/ChemBERTa-zinc-base-v1)
- **Fine-tuned On**: 8 training samples, 2 validation samples (80/20 split)
- **Task**: Molecular property prediction (4 classes)
- **Epochs**: 2
- **Training Date**: 2026-02-04
## Performance Metrics (20% Holdout Test Set)
| Metric | Value |
|--------|-------|
| **Accuracy** | 0.5000 |
| **F1 Score** | 0.3333 |
| **Precision** | 0.2500 |
| **Recall** | 0.5000 |

Final training loss: 1.4379
## Label Classes
- `high`
- `low`
- `medium`
- `very_low`
## Usage
This model can be queried through the Precite platform for FLP chemistry predictions.
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("blainetrain/FLP-Test-v10")
tokenizer = AutoTokenizer.from_pretrained("blainetrain/FLP-Test-v10")
```
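The snippet above only loads the model. For end-to-end inference, a SMILES string is tokenized and the class with the highest logit is mapped back to a label via the model config. The sketch below illustrates this; `"CCO"` (ethanol) is a placeholder input, and the label names come from `model.config.id2label` rather than being hard-coded, since the stored id-to-label order may differ from the alphabetical list above.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REPO = "blainetrain/FLP-Test-v10"


def predict(smiles: str, model, tokenizer):
    """Return (predicted label, class probabilities) for one SMILES string."""
    inputs = tokenizer(smiles, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits          # shape: (1, num_labels)
    probs = torch.softmax(logits, dim=-1).squeeze(0)
    pred_id = int(probs.argmax())
    return model.config.id2label[pred_id], probs.tolist()


if __name__ == "__main__":
    model = AutoModelForSequenceClassification.from_pretrained(REPO)
    tokenizer = AutoTokenizer.from_pretrained(REPO)
    model.eval()
    label, probs = predict("CCO", model, tokenizer)  # "CCO" is a placeholder SMILES
    print(label, probs)
```

Given the small training set (8 samples), treat individual predictions as indicative only.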
## Training Data
See the associated dataset: [blainetrain/precite-dataset-FLP-Test-v10](https://huggingface.co/datasets/blainetrain/precite-dataset-FLP-Test-v10)