---
library_name: transformers
tags: []
---

nikraf/OmniPath_2class_clustered-30_ESMC-600_2026-03-11-15-46_NQRV

Fine-tuned with Protify.

About Protify

Protify is an open-source platform designed to simplify and democratize workflows for protein language models. With Protify, deep learning models can be trained to predict biological properties without requiring extensive coding knowledge or computational resources.

Why Protify?

  • Benchmark multiple models efficiently.
  • Flexible for all skill levels.
  • Accessible computing with support for precomputed embeddings.
  • Cost-effective workflows for training and evaluation.

Training Run

  • dataset: OmniPath_2class_clustered-30
  • model: ESMC-600
  • run_id: 2026-03-11-15-46_NQRV
  • task_type: singlelabel
  • num_runs: 1

Dataset Statistics

  • train_size: 102872
  • valid_size: 18102
  • test_size: 18074
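The three splits sum to 139,048 sequences. The split proportions are not stated explicitly in the card, but they can be recovered from the counts above (roughly 74/13/13):

```python
# Recompute split fractions from the reported dataset sizes.
train_size, valid_size, test_size = 102872, 18102, 18074
total = train_size + valid_size + test_size
print(total)  # → 139048

fractions = [round(n / total, 3) for n in (train_size, valid_size, test_size)]
print(fractions)  # → [0.74, 0.13, 0.13]
```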

Validation Metrics

  • epoch: 5.000000
  • eval_accuracy: 0.789750
  • eval_f1: 0.789330
  • eval_loss: 0.445219
  • eval_mcc: 0.581780
  • eval_model_preparation_time: 0.000300
  • eval_pr_auc: 0.884610
  • eval_precision: 0.792040
  • eval_recall: 0.789750
  • eval_roc_auc: 0.880010
  • eval_runtime: 21.260300
  • eval_samples_per_second: 851.444000
  • eval_steps_per_second: 13.311000
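The throughput figures above are internally consistent. A quick sanity check, assuming `eval_samples_per_second` is simply `valid_size / eval_runtime` and that the ratio of samples to steps reflects the evaluation batch size (an inferred value, not reported in the card):

```python
# Reported validation figures from this run.
valid_size = 18102
eval_runtime = 21.2603            # seconds
eval_samples_per_second = 851.444
eval_steps_per_second = 13.311

# samples/sec should match dataset size divided by wall-clock runtime.
assert abs(valid_size / eval_runtime - eval_samples_per_second) < 0.01

# samples per step approximates the evaluation batch size.
batch_size = eval_samples_per_second / eval_steps_per_second
print(round(batch_size))  # → 64
```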

Test Metrics

  • test_accuracy: 0.779350
  • test_f1: 0.778210
  • test_loss: 0.455012
  • test_mcc: 0.564560
  • test_model_preparation_time: 0.000300
  • test_pr_auc: 0.884200
  • test_precision: 0.785240
  • test_recall: 0.779350
  • test_roc_auc: 0.874270
  • test_runtime: 21.119900
  • test_samples_per_second: 855.780000
  • test_steps_per_second: 13.400000
  • training_time_seconds: 1235.285100
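The card reports MCC alongside accuracy, precision, recall, and F1. For a two-class task like this one, all of these derive from the confusion matrix. A minimal sketch with toy counts (illustrative only, not the actual predictions from this run):

```python
import math

# Toy binary confusion-matrix counts (hypothetical, not from this model).
tp, tn, fp, fn = 70, 80, 20, 30
total = tp + tn + fp + fn

accuracy = (tp + tn) / total
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

# Matthews correlation coefficient: stays informative even when the
# two classes are imbalanced, unlike raw accuracy.
mcc = (tp * tn - fp * fn) / math.sqrt(
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
)

print(round(accuracy, 4), round(f1, 4), round(mcc, 4))  # → 0.75 0.7368 0.5025
```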