Instructions to use ViTAMIn-O/PDLO_classifier with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

- Libraries
  - Transformers

How to use ViTAMIn-O/PDLO_classifier with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="ViTAMIn-O/PDLO_classifier")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("ViTAMIn-O/PDLO_classifier")
model = AutoModelForImageClassification.from_pretrained("ViTAMIn-O/PDLO_classifier")
```

- Notebooks
  - Google Colab
  - Kaggle
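The pipeline call above returns per-label probabilities, while the direct-loading path returns raw logits. A minimal sketch of the softmax step that turns logits into those probabilities, assuming the two label names recorded in this repo's predictions file; the logit values here are hypothetical, not output from this model:

```python
import math

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["satisfactory", "unsatisfactory"]  # label names from predictions_seed_46.csv
logits = [3.2, -1.1]                         # hypothetical raw model output
probs = softmax(logits)
pred = labels[max(range(len(probs)), key=probs.__getitem__)]
```

With these hypothetical logits, `probs` sums to 1 and `pred` is `"satisfactory"`; the image-classification pipeline performs the equivalent step internally.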
ViTAMIn-O Auto-Upload: Seed 46
- README.md +3 -3
- best_metric_tracker.json +1 -1
- history_seed_46.csv +21 -0
- model.safetensors +1 -1
- predictions_seed_46.csv +81 -0
- results_seed_46.json +1 -0
README.md
CHANGED
```diff
@@ -17,12 +17,12 @@ This model was trained using the **ColabViTAMIn-O** code-free infrastructure.
 * **Repository:** `ViTAMIn-O/PDLO_classifier`
 
 ## Training Hyperparameters
 
-* **Seed:** `
+* **Seed:** `46`
 * **Epochs:** `20`
 * **Batch Size:** `32`
 
 ## Evaluation Metrics (Test Set)
 
-* **Accuracy:** `0.
-* **Global AUROC:** `0.
+* **Accuracy:** `0.9625`
+* **Global AUROC:** `0.9938`
 
 *This model card was auto-generated by the ViTAMIn-O pipeline to ensure reproducibility and open-science transparency.*
```
best_metric_tracker.json
CHANGED
```diff
@@ -1 +1 @@
-{"best_auroc": 0.
+{"best_auroc": 0.99375, "seed": 46}
```
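The commit does not show how the tracker file is updated, only its new contents. A plausible sketch of the update rule, assuming the pipeline keeps the highest AUROC seen across seeds (the incumbent values `0.95` / seed `12` below are hypothetical):

```python
import json

def update_tracker(tracker_json, seed, auroc):
    """Keep the best (seed, AUROC) pair seen so far; ties keep the incumbent."""
    if tracker_json:
        best = json.loads(tracker_json)
    else:
        best = {"best_auroc": 0.0, "seed": None}
    if auroc > best["best_auroc"]:
        best = {"best_auroc": auroc, "seed": seed}
    return json.dumps(best)

# Hypothetical previous best; seed 46's AUROC comes from this commit.
updated = update_tracker('{"best_auroc": 0.95, "seed": 12}', 46, 0.99375)
```

Under this rule, a later seed with a lower AUROC would leave the file unchanged.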
history_seed_46.csv
ADDED
```diff
@@ -0,0 +1,21 @@
+epoch,train_loss,val_loss
+1,0.538878183811903,0.30797648429870605
+2,0.26180985383689404,0.17218229174613953
+3,0.1717955470085144,0.13028602302074432
+4,0.1437947005033493,0.10177387297153473
+5,0.10723237367346883,0.11121335625648499
+6,0.08349944278597832,0.11192755028605461
+7,0.07720590848475695,0.131826750934124
+8,0.07390826242044568,0.08874695003032684
+9,0.059899297542870045,0.08273351192474365
+10,0.07470285997260362,0.08861568570137024
+11,0.028520599356852472,0.10082890838384628
+12,0.024814573233015835,0.06342275068163872
+13,0.037214646697975695,0.09319406375288963
+14,0.02800476152333431,0.04895220510661602
+15,0.017185716424137354,0.04200319666415453
+16,0.05007501703221351,0.06883817911148071
+17,0.020178327045869082,0.11221175268292427
+18,0.01919980809907429,0.2134331837296486
+19,0.02056509326212108,0.0834646187722683
+20,0.010971037321723998,0.046443226747214794
```
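A common sanity check on a history like this is to locate the epoch with the lowest validation loss. A minimal pure-Python sketch; the excerpt below copies rows from history_seed_46.csv, and whether the ViTAMIn-O pipeline itself selects checkpoints this way is not shown in this commit:

```python
import csv, io

def best_epoch(history_csv):
    """Return (epoch, val_loss) for the epoch with the lowest validation loss."""
    rows = csv.DictReader(io.StringIO(history_csv))
    best = min(rows, key=lambda r: float(r["val_loss"]))
    return int(best["epoch"]), float(best["val_loss"])

# Excerpt of the training history above (the full file has 20 epochs).
history = """epoch,train_loss,val_loss
13,0.037214646697975695,0.09319406375288963
14,0.02800476152333431,0.04895220510661602
15,0.017185716424137354,0.04200319666415453
16,0.05007501703221351,0.06883817911148071
20,0.010971037321723998,0.046443226747214794
"""
print(best_epoch(history))  # (15, 0.04200319666415453)
```

On the full 20-epoch history the minimum validation loss also falls at epoch 15 (0.0420), even though epoch 20 finishes with the lowest training loss.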
model.safetensors
CHANGED
```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:add4f927138a0ebb8ed8fa8ac9ad893479f3a69eadafba1828b4dc3cd2591b95
 size 780512112
```
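The weights are stored via Git LFS, so the diff only updates the pointer file, not the 780 MB payload. After downloading the real weights, they can be checked against the pointer's digest; a minimal sketch, assuming the file has been saved locally as `model.safetensors`:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream-hash a file in 1 MiB chunks so large checkpoints never sit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest taken from the LFS pointer in this commit; the local path is an assumption.
EXPECTED = "add4f927138a0ebb8ed8fa8ac9ad893479f3a69eadafba1828b4dc3cd2591b95"
# assert sha256_of_file("model.safetensors") == EXPECTED
```

The pointer's `size` field (780512112 bytes) offers a cheaper first check via `os.path.getsize` before hashing.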
predictions_seed_46.csv
ADDED
```diff
@@ -0,0 +1,81 @@
+true_label,pred_label,prob_satisfactory,prob_unsatisfactory
+satisfactory,satisfactory,0.9989592,0.00104085
+satisfactory,satisfactory,0.99998164,1.834027e-05
+satisfactory,satisfactory,0.9998419,0.00015803758
+satisfactory,satisfactory,0.99997854,2.1474747e-05
+satisfactory,satisfactory,0.9999387,6.1259125e-05
+satisfactory,satisfactory,0.9998808,0.000119206
+satisfactory,satisfactory,0.99995244,4.7522222e-05
+satisfactory,satisfactory,0.99994564,5.4399185e-05
+satisfactory,satisfactory,0.9991604,0.00083954225
+satisfactory,satisfactory,0.749852,0.25014794
+satisfactory,satisfactory,0.99984896,0.0001509777
+satisfactory,satisfactory,0.99970573,0.00029425745
+satisfactory,satisfactory,0.9997497,0.00025031203
+satisfactory,satisfactory,0.9985238,0.0014762129
+satisfactory,satisfactory,0.9999131,8.691986e-05
+satisfactory,satisfactory,0.9989341,0.0010659255
+satisfactory,satisfactory,0.99975973,0.00024032393
+satisfactory,satisfactory,0.99793315,0.0020668693
+satisfactory,satisfactory,0.9996648,0.000335244
+satisfactory,satisfactory,0.9941375,0.005862531
+satisfactory,satisfactory,0.9998272,0.00017286537
+satisfactory,satisfactory,0.99036914,0.009630804
+satisfactory,satisfactory,0.9998977,0.00010225509
+satisfactory,satisfactory,0.9995353,0.00046464158
+satisfactory,satisfactory,0.9862896,0.013710445
+satisfactory,satisfactory,0.85799223,0.14200777
+satisfactory,satisfactory,0.999321,0.0006790212
+satisfactory,satisfactory,0.9898065,0.010193493
+satisfactory,satisfactory,0.9998789,0.00012107872
+satisfactory,satisfactory,0.9766833,0.023316737
+satisfactory,satisfactory,0.72331876,0.27668127
+satisfactory,satisfactory,0.99993956,6.045825e-05
+satisfactory,satisfactory,0.95151323,0.04848676
+satisfactory,satisfactory,0.99948895,0.0005110404
+satisfactory,satisfactory,0.9420749,0.05792504
+satisfactory,satisfactory,0.9997631,0.00023695698
+satisfactory,satisfactory,0.6913073,0.30869263
+satisfactory,satisfactory,0.9996069,0.00039306903
+satisfactory,satisfactory,0.998459,0.0015410646
+satisfactory,satisfactory,0.9995982,0.00040181552
+unsatisfactory,satisfactory,0.80931675,0.19068323
+unsatisfactory,unsatisfactory,0.0018195248,0.9981805
+unsatisfactory,satisfactory,0.93508685,0.0649131
+unsatisfactory,unsatisfactory,0.000591661,0.99940836
+unsatisfactory,unsatisfactory,0.00011861852,0.9998814
+unsatisfactory,unsatisfactory,0.0034897495,0.9965102
+unsatisfactory,unsatisfactory,0.0065973457,0.99340266
+unsatisfactory,unsatisfactory,0.0033877383,0.99661225
+unsatisfactory,unsatisfactory,0.00079514674,0.99920493
+unsatisfactory,unsatisfactory,0.00025222785,0.9997478
+unsatisfactory,unsatisfactory,0.31696433,0.6830357
+unsatisfactory,unsatisfactory,0.00012214592,0.9998778
+unsatisfactory,unsatisfactory,0.053903997,0.94609606
+unsatisfactory,unsatisfactory,3.832877e-05,0.9999616
+unsatisfactory,unsatisfactory,0.0007804709,0.99921954
+unsatisfactory,unsatisfactory,0.36850524,0.6314947
+unsatisfactory,unsatisfactory,0.2314275,0.7685725
+unsatisfactory,unsatisfactory,0.0021984521,0.9978015
+unsatisfactory,unsatisfactory,4.4412125e-05,0.99995553
+unsatisfactory,unsatisfactory,2.4539242e-05,0.99997544
+unsatisfactory,unsatisfactory,0.00034533217,0.99965465
+unsatisfactory,unsatisfactory,0.007220509,0.99277943
+unsatisfactory,unsatisfactory,0.010568988,0.989431
+unsatisfactory,unsatisfactory,0.015350503,0.9846495
+unsatisfactory,unsatisfactory,9.5259915e-05,0.99990475
+unsatisfactory,unsatisfactory,0.012054313,0.9879457
+unsatisfactory,unsatisfactory,0.39652866,0.60347134
+unsatisfactory,unsatisfactory,0.0022187922,0.9977812
+unsatisfactory,unsatisfactory,0.00015025517,0.9998497
+unsatisfactory,unsatisfactory,1.7556826e-05,0.9999825
+unsatisfactory,unsatisfactory,0.0029846062,0.99701536
+unsatisfactory,unsatisfactory,0.0034120379,0.996588
+unsatisfactory,unsatisfactory,0.0007051092,0.9992949
+unsatisfactory,satisfactory,0.81684226,0.18315774
+unsatisfactory,unsatisfactory,0.06626126,0.93373877
+unsatisfactory,unsatisfactory,6.315671e-05,0.9999368
+unsatisfactory,unsatisfactory,0.43318996,0.5668101
+unsatisfactory,unsatisfactory,9.6322525e-05,0.9999037
+unsatisfactory,unsatisfactory,4.0687475e-05,0.99995935
+unsatisfactory,unsatisfactory,0.0008214618,0.99917847
```
results_seed_46.json
ADDED
```diff
@@ -0,0 +1 @@
+{"Accuracy": 0.9625, "Balanced_Accuracy": 0.9625, "MCC": 0.9276125895432153, "Cohen_Kappa": 0.925, "Brier_Score": 0.03875666580073609, "Global_AUROC": 0.99375, "satisfactory_AUROC": 0.99375, "unsatisfactory_AUROC": 0.99375, "satisfactory_Precision": 0.9302325581395349, "satisfactory_Recall": 1.0, "satisfactory_Specificity": 0.925, "satisfactory_NPV": 1.0, "satisfactory_F1": 0.963855421686747, "unsatisfactory_Precision": 1.0, "unsatisfactory_Recall": 0.925, "unsatisfactory_Specificity": 1.0, "unsatisfactory_NPV": 0.9302325581395349, "unsatisfactory_F1": 0.961038961038961, "Macro_Precision": 0.9651162790697674, "Macro_Recall": 0.9625, "Macro_Specificity": 0.9625, "Macro_F1": 0.9624471913628541, "Seed": 46}
```
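Several of these metrics can be re-derived from predictions_seed_46.csv. A minimal sketch that recomputes Accuracy; the four rows below are copied from the predictions file as an illustration, so the excerpt's accuracy is not the full-file value:

```python
import csv, io

def accuracy_from_csv(csv_text):
    """Fraction of rows where the predicted label matches the true label."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    correct = sum(r["true_label"] == r["pred_label"] for r in rows)
    return correct / len(rows)

# Excerpt of rows from predictions_seed_46.csv (2 of these 4 are correct).
sample = """true_label,pred_label,prob_satisfactory,prob_unsatisfactory
satisfactory,satisfactory,0.9989592,0.00104085
unsatisfactory,satisfactory,0.80931675,0.19068323
unsatisfactory,unsatisfactory,0.0018195248,0.9981805
unsatisfactory,satisfactory,0.93508685,0.0649131
"""
print(accuracy_from_csv(sample))  # 0.5
```

Across all 80 rows of the predictions file, exactly 3 unsatisfactory samples are predicted satisfactory and every other row is correct, so the recomputed value is 77/80 = 0.9625, matching the Accuracy field above.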