---
license: apache-2.0
tags:
  - chemistry
  - precite
  - chemberta
datasets:
  - blainetrain/precite-dataset-FLP-Test-v10
base_model: seyonec/ChemBERTa-zinc-base-v1
model-index:
  - name: FLP-Test-v10
    results:
      - task:
          type: molecular-property-prediction
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.5000
          - name: F1
            type: f1
            value: 0.3333
          - name: Precision
            type: precision
            value: 0.2500
          - name: Recall
            type: recall
            value: 0.5000
---

# FLP Test v10

A molecular property prediction model fine-tuned on the Precite platform.

## Model Details

- **Base Model**: [seyonec/ChemBERTa-zinc-base-v1](https://huggingface.co/seyonec/ChemBERTa-zinc-base-v1)
- **Fine-tuned On**: 8 training samples, 2 validation samples (80/20 split)
- **Task**: Molecular property prediction (4 classes)
- **Epochs**: 2
- **Training Date**: 2026-02-04

## Performance Metrics (20% Holdout Test Set)

| Metric | Value |
|--------|-------|
| **Accuracy** | 0.5000 |
| **F1 Score** | 0.3333 |
| **Precision** | 0.2500 |
| **Recall** | 0.5000 |
| Training Loss | 1.4379 |
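
As a quick sanity check, the reported F1 score is the harmonic mean of the listed precision and recall, so the three metrics above are internally consistent:

```python
# F1 is the harmonic mean of precision and recall
precision, recall = 0.2500, 0.5000
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.3333
```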

## Label Classes

- `high`
- `low`
- `medium`
- `very_low`

## Usage

This model can be queried through the Precite platform for FLP chemistry predictions. It can also be loaded directly with the Transformers library; the snippet below sketches single-input inference (the SMILES string is an illustrative placeholder, and the label mapping comes from the model config):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("blainetrain/FLP-Test-v10")
tokenizer = AutoTokenizer.from_pretrained("blainetrain/FLP-Test-v10")

# Example inference on a SMILES string ("CCO" = ethanol; illustrative input)
inputs = tokenizer("CCO", return_tensors="pt")
label_id = model(**inputs).logits.argmax(-1).item()
print(model.config.id2label[label_id])
```

## Training Data

See the associated dataset: [blainetrain/precite-dataset-FLP-Test-v10](https://huggingface.co/datasets/blainetrain/precite-dataset-FLP-Test-v10)