---
library_name: transformers
tags: []
---
# nikraf/OmniPath_2class_clustered-30_ESMC-600_2026-03-11-15-46_NQRV
Fine-tuned with Protify.
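## Usage

Since this checkpoint is a single-label (binary) sequence classifier, it can likely be loaded with the standard `transformers` auto classes. The sketch below is illustrative, not verified: the exact auto class, the need for `trust_remote_code`, and the example protein sequence are assumptions.

```python
# Hedged usage sketch: assumes the checkpoint exposes a standard
# sequence-classification head; adjust trust_remote_code if the repo
# ships custom model code.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "nikraf/OmniPath_2class_clustered-30_ESMC-600_2026-03-11-15-46_NQRV"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(repo, trust_remote_code=True)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # placeholder protein sequence
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): one score per class
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```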
## About Protify
Protify is an open-source platform designed to simplify and democratize workflows for protein language models. With Protify, deep learning models can be trained to predict protein properties without requiring extensive coding knowledge or heavy computational resources.
### Why Protify?
- Benchmark multiple models efficiently on the same dataset.
- Flexible for users of all skill levels.
- Accessible computing, with support for precomputed embeddings.
- Cost-effective workflows for training and evaluation.
## Training Run
- `dataset`: OmniPath_2class_clustered-30
- `model`: ESMC-600
- `run_id`: 2026-03-11-15-46_NQRV
- `task_type`: singlelabel
- `num_runs`: 1
## Dataset Statistics
- `train_size`: 102872
- `valid_size`: 18102
- `test_size`: 18074
## Validation Metrics
- `epoch`: 5.000000
- `eval_accuracy`: 0.789750
- `eval_f1`: 0.789330
- `eval_loss`: 0.445219
- `eval_mcc`: 0.581780
- `eval_model_preparation_time`: 0.000300
- `eval_pr_auc`: 0.884610
- `eval_precision`: 0.792040
- `eval_recall`: 0.789750
- `eval_roc_auc`: 0.880010
- `eval_runtime`: 21.260300
- `eval_samples_per_second`: 851.444000
- `eval_steps_per_second`: 13.311000
## Test Metrics
- `test_accuracy`: 0.779350
- `test_f1`: 0.778210
- `test_loss`: 0.455012
- `test_mcc`: 0.564560
- `test_model_preparation_time`: 0.000300
- `test_pr_auc`: 0.884200
- `test_precision`: 0.785240
- `test_recall`: 0.779350
- `test_roc_auc`: 0.874270
- `test_runtime`: 21.119900
- `test_samples_per_second`: 855.780000
- `test_steps_per_second`: 13.400000
- `training_time_seconds`: 1235.285100
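
A note on reading these tables: for single-label classification, class-weighted recall is mathematically identical to accuracy, which is why `eval_recall` equals `eval_accuracy` (0.789750) and `test_recall` equals `test_accuracy` (0.779350) above. The sketch below demonstrates this identity, plus the binary MCC formula, on synthetic labels; the label values are invented for illustration and are not taken from the OmniPath evaluation data.

```python
# Illustrative sanity check on metric definitions (synthetic labels only).
import math

y_true = [0, 0, 1, 1, 1, 0, 1, 0, 1, 1]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0, 1, 1]
n = len(y_true)

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n

# Class-weighted recall: per-class recall, weighted by class support.
# The support terms cancel, leaving total correct / total = accuracy.
weighted_recall = 0.0
for c in set(y_true):
    support = sum(t == c for t in y_true)
    tp_c = sum(t == c and p == c for t, p in zip(y_true, y_pred))
    weighted_recall += (support / n) * (tp_c / support)

assert abs(accuracy - weighted_recall) < 1e-12  # holds for any single-label task

# Matthews correlation coefficient for the binary case.
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
mcc = (tp * tn - fp * fn) / math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
print(f"accuracy={accuracy:.3f} mcc={mcc:.3f}")
```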