parlange committed (verified)
Commit 2dcd9be · Parent: 0a8e611

Upload DeiT3 model from experiment c3

This view is limited to 50 files because the commit contains too many changes.

Files changed (50)
  1. .gitattributes +2 -0
  2. README.md +166 -0
  3. config.json +76 -0
  4. confusion_matrices/DeiT3_Confusion_Matrix_a.png +0 -0
  5. confusion_matrices/DeiT3_Confusion_Matrix_b.png +0 -0
  6. confusion_matrices/DeiT3_Confusion_Matrix_c.png +0 -0
  7. confusion_matrices/DeiT3_Confusion_Matrix_d.png +0 -0
  8. confusion_matrices/DeiT3_Confusion_Matrix_e.png +0 -0
  9. confusion_matrices/DeiT3_Confusion_Matrix_f.png +0 -0
  10. confusion_matrices/DeiT3_Confusion_Matrix_g.png +0 -0
  11. confusion_matrices/DeiT3_Confusion_Matrix_h.png +0 -0
  12. confusion_matrices/DeiT3_Confusion_Matrix_i.png +0 -0
  13. confusion_matrices/DeiT3_Confusion_Matrix_j.png +0 -0
  14. confusion_matrices/DeiT3_Confusion_Matrix_k.png +0 -0
  15. confusion_matrices/DeiT3_Confusion_Matrix_l.png +0 -0
  16. deit3-gravit-c3.pth +3 -0
  17. evaluation_results.csv +145 -0
  18. model.safetensors +3 -0
  19. pytorch_model.bin +3 -0
  20. roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png +0 -0
  21. roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png +0 -0
  22. roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png +0 -0
  23. roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png +0 -0
  24. roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png +0 -0
  25. roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png +0 -0
  26. roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png +0 -0
  27. roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png +0 -0
  28. roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png +0 -0
  29. roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png +0 -0
  30. roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png +0 -0
  31. roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png +0 -0
  32. roc_curves/DeiT3_ROC_a.png +0 -0
  33. roc_curves/DeiT3_ROC_b.png +0 -0
  34. roc_curves/DeiT3_ROC_c.png +0 -0
  35. roc_curves/DeiT3_ROC_d.png +0 -0
  36. roc_curves/DeiT3_ROC_e.png +0 -0
  37. roc_curves/DeiT3_ROC_f.png +0 -0
  38. roc_curves/DeiT3_ROC_g.png +0 -0
  39. roc_curves/DeiT3_ROC_h.png +0 -0
  40. roc_curves/DeiT3_ROC_i.png +0 -0
  41. roc_curves/DeiT3_ROC_j.png +0 -0
  42. roc_curves/DeiT3_ROC_k.png +0 -0
  43. roc_curves/DeiT3_ROC_l.png +0 -0
  44. training_curves/DeiT3_accuracy.png +0 -0
  45. training_curves/DeiT3_auc.png +0 -0
  46. training_curves/DeiT3_combined_metrics.png +3 -0
  47. training_curves/DeiT3_f1.png +0 -0
  48. training_curves/DeiT3_loss.png +0 -0
  49. training_curves/DeiT3_metrics.csv +31 -0
  50. training_metrics.csv +31 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ training_curves/DeiT3_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+ training_notebook_c3.ipynb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,166 @@
+ ---
+ license: apache-2.0
+ tags:
+ - image-classification
+ - pytorch
+ - timm
+ - deit3
+ - vision-transformer
+ - transformer
+ - gravitational-lensing
+ - strong-lensing
+ - astronomy
+ - astrophysics
+ datasets:
+ - parlange/gravit-c21-j24
+ metrics:
+ - accuracy
+ - auc
+ - f1
+ paper:
+ - title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
+   url: "https://arxiv.org/abs/2509.00226"
+   authors: "Parlange et al."
+ model-index:
+ - name: DeiT3-c3
+   results:
+   - task:
+       type: image-classification
+       name: Strong Gravitational Lens Discovery
+     dataset:
+       type: common-test-sample
+       name: Common Test Sample (More et al. 2024)
+     metrics:
+     - type: accuracy
+       value: 0.9015
+       name: Average Accuracy
+     - type: auc
+       value: 0.8912
+       name: Average AUC-ROC
+     - type: f1
+       value: 0.6869
+       name: Average F1-Score
+ ---
+
+ # 🌌 deit3-gravit-c3
+
+ 🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery
+
+ 🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)
+
+ ## 🛰️ Model Details
+
+ - **🤖 Model Type**: DeiT3
+ - **🧪 Experiment**: C3 - C21+J24-all-blocks-ResNet18
+ - **🌌 Dataset**: C21+J24
+ - **🪐 Fine-tuning Strategy**: all-blocks
+
+ ## 💻 Quick Start
+
+ ```python
+ import torch
+ import timm
+
+ # Load the model directly from the Hub
+ model = timm.create_model(
+     'hf-hub:parlange/deit3-gravit-c3',
+     pretrained=True
+ )
+ model.eval()
+
+ # Example inference
+ dummy_input = torch.randn(1, 3, 224, 224)
+ with torch.no_grad():
+     output = model(dummy_input)
+ predictions = torch.softmax(output, dim=1)
+ print(f"Lens probability: {predictions[0][1]:.4f}")
+ ```
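The random tensor above only exercises the forward pass. Real cutouts must be normalized with the ImageNet statistics listed in config.json before inference; a minimal sketch of that step (the `preprocess` helper is illustrative, not part of the repository):

```python
import torch

# Preprocessing constants from config.json (ImageNet statistics);
# the model expects 3x224x224 inputs normalized with these values.
MEAN = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
STD = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)

def preprocess(image_chw: torch.Tensor) -> torch.Tensor:
    """Normalize a float image tensor in [0, 1] (C, H, W) and add a batch dim.

    Resizing/cropping to 224x224 (bicubic interpolation, crop_pct 0.875 per
    config.json) is assumed to have been done already, e.g. via timm's
    `resolve_model_data_config` + `create_transform` helpers.
    """
    if image_chw.shape != (3, 224, 224):
        raise ValueError(f"expected (3, 224, 224), got {tuple(image_chw.shape)}")
    return ((image_chw - MEAN) / STD).unsqueeze(0)

batch = preprocess(torch.rand(3, 224, 224))
print(batch.shape)  # torch.Size([1, 3, 224, 224])
```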
+
+ ## ⚡️ Training Configuration
+
+ **Training Dataset:** C21+J24 (Cañameras et al. 2021 + Jaelani et al. 2024)
+ **Fine-tuning Strategy:** all-blocks
+
+ | 🔧 Parameter | 📝 Value |
+ |--------------|----------|
+ | Batch Size | 192 |
+ | Learning Rate | AdamW with ReduceLROnPlateau |
+ | Epochs | 100 |
+ | Patience | 10 |
+ | Optimizer | AdamW |
+ | Scheduler | ReduceLROnPlateau |
+ | Image Size | 224x224 |
+ | Fine Tune Mode | all_blocks |
+ | Stochastic Depth Probability | 0.1 |
+
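As a rough illustration of the recipe in the table (AdamW, ReduceLROnPlateau, early stopping with patience 10), here is a minimal loop on placeholder data; the model, data, and base learning rate are stand-ins, since the card does not state the actual learning rate:

```python
import torch
from torch import nn

torch.manual_seed(0)

# Placeholder model and one dummy batch (batch size 192, per the table).
model = nn.Linear(16, 2)
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)  # lr is an assumption
sched = torch.optim.lr_scheduler.ReduceLROnPlateau(opt, mode="min")
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(192, 16)
y = torch.randint(0, 2, (192,))

best_val, patience, bad_epochs = float("inf"), 10, 0
for epoch in range(100):                      # up to 100 epochs
    model.train()
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(x), y).item()  # stand-in for a val split
    sched.step(val_loss)                        # LR drops on plateau

    bad_epochs = 0 if val_loss < best_val else bad_epochs + 1
    best_val = min(best_val, val_loss)
    if bad_epochs >= patience:                  # early stopping
        break
```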
+
+ ## 📈 Training Curves
+
+ ![Combined Training Metrics](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/training_curves/DeiT3_combined_metrics.png)
+
+ ## 🏁 Final Epoch Training Metrics
+
+ | Metric | Training | Validation |
+ |:---------:|:-----------:|:-------------:|
+ | 📉 Loss | 0.0115 | 0.0422 |
+ | 🎯 Accuracy | 0.9961 | 0.9934 |
+ | 📊 AUC-ROC | 0.9999 | 0.9987 |
+ | ⚖️ F1 Score | 0.9961 | 0.9934 |
+
+ ## ☑️ Evaluation Results
+
+ ### ROC Curves and Confusion Matrices
+
+ Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):
+
+ ![ROC + Confusion Matrix - Dataset A](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png)
+ ![ROC + Confusion Matrix - Dataset B](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png)
+ ![ROC + Confusion Matrix - Dataset C](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png)
+ ![ROC + Confusion Matrix - Dataset D](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png)
+ ![ROC + Confusion Matrix - Dataset E](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png)
+ ![ROC + Confusion Matrix - Dataset F](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png)
+ ![ROC + Confusion Matrix - Dataset G](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png)
+ ![ROC + Confusion Matrix - Dataset H](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png)
+ ![ROC + Confusion Matrix - Dataset I](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png)
+ ![ROC + Confusion Matrix - Dataset J](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png)
+ ![ROC + Confusion Matrix - Dataset K](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png)
+ ![ROC + Confusion Matrix - Dataset L](https://huggingface.co/parlange/deit3-gravit-c3/resolve/main/roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png)
+
+ ### 📋 Performance Summary
+
+ Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):
+
+ | Metric | Value |
+ |-----------|----------|
+ | 🎯 Average Accuracy | 0.9015 |
+ | 📈 Average AUC-ROC | 0.8912 |
+ | ⚖️ Average F1-Score | 0.6869 |
+
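The Average Accuracy figure can be reproduced from evaluation_results.csv; the sketch below embeds just the DeiT3 accuracy column copied from that file:

```python
import csv
import io

# DeiT3 per-dataset accuracies, copied from evaluation_results.csv (datasets a-l).
ROWS = """Dataset,Accuracy
a,0.9276956931782459
b,0.9333542911034266
c,0.9305249921408362
d,0.9685633448601069
e,0.9165751920965971
f,0.9520563860274185
g,0.9706666666666667
h,0.9691666666666666
i,0.9893333333333333
j,0.6835
k,0.7021666666666667
l,0.8741473216646396
"""

accuracies = [float(row["Accuracy"]) for row in csv.DictReader(io.StringIO(ROWS))]
average = sum(accuracies) / len(accuracies)
print(f"Average Accuracy: {average:.4f}")  # 0.9015
```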
+ ## 📘 Citation
+
+ If you use this model in your research, please cite:
+
+ ```bibtex
+ @misc{parlange2025gravit,
+       title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
+       author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
+       year={2025},
+       eprint={2509.00226},
+       archivePrefix={arXiv},
+       primaryClass={cs.CV},
+       url={https://arxiv.org/abs/2509.00226},
+ }
+ ```
+
+ ---
+
+ ## Model Card Contact
+
+ For questions about this model, please contact the author through: https://github.com/parlange/
config.json ADDED
@@ -0,0 +1,76 @@
+ {
+   "architecture": "deit3_base_patch16_224",
+   "num_classes": 2,
+   "num_features": 1000,
+   "global_pool": "avg",
+   "crop_pct": 0.875,
+   "interpolation": "bicubic",
+   "mean": [
+     0.485,
+     0.456,
+     0.406
+   ],
+   "std": [
+     0.229,
+     0.224,
+     0.225
+   ],
+   "first_conv": "conv1",
+   "classifier": "fc",
+   "input_size": [
+     3,
+     224,
+     224
+   ],
+   "pool_size": [
+     7,
+     7
+   ],
+   "pretrained_cfg": {
+     "tag": "gravit_c3",
+     "custom_load": false,
+     "input_size": [
+       3,
+       224,
+       224
+     ],
+     "fixed_input_size": true,
+     "interpolation": "bicubic",
+     "crop_pct": 0.875,
+     "crop_mode": "center",
+     "mean": [
+       0.485,
+       0.456,
+       0.406
+     ],
+     "std": [
+       0.229,
+       0.224,
+       0.225
+     ],
+     "num_classes": 2,
+     "pool_size": [
+       7,
+       7
+     ],
+     "first_conv": "conv1",
+     "classifier": "fc"
+   },
+   "model_name": "deit3_gravit_c3",
+   "experiment": "c3",
+   "training_strategy": "all-blocks",
+   "dataset": "C21+J24",
+   "hyperparameters": {
+     "batch_size": "192",
+     "learning_rate": "AdamW with ReduceLROnPlateau",
+     "epochs": "100",
+     "patience": "10",
+     "optimizer": "AdamW",
+     "scheduler": "ReduceLROnPlateau",
+     "image_size": "224x224",
+     "fine_tune_mode": "all_blocks",
+     "stochastic_depth_probability": "0.1"
+   },
+   "hf_hub_id": "parlange/deit3-gravit-c3",
+   "license": "apache-2.0"
+ }
confusion_matrices/DeiT3_Confusion_Matrix_a.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_b.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_c.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_d.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_e.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_f.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_g.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_h.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_i.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_j.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_k.png ADDED
confusion_matrices/DeiT3_Confusion_Matrix_l.png ADDED
deit3-gravit-c3.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:058af372f0971917904be3167642f0f4281139b83165175d3ec53ff629aa6511
+ size 343337390
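The three lines above are a Git LFS pointer, not the checkpoint itself; the actual 343 MB file lives in LFS storage. A small parser for this pointer format (pointer text copied verbatim from this diff — note that pytorch_model.bin further down carries the same oid and size, i.e. the same payload):

```python
# Parse a Git LFS pointer file ("version" / "oid" / "size" lines).
POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:058af372f0971917904be3167642f0f4281139b83165175d3ec53ff629aa6511
size 343337390
"""

def parse_lfs_pointer(text: str) -> dict:
    # Each line is "key value"; oid is "algorithm:hex-digest".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "algo": algo,
        "digest": digest,
        "size": int(fields["size"]),
    }

info = parse_lfs_pointer(POINTER)
print(info["algo"], info["size"])  # sha256 343337390
```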
evaluation_results.csv ADDED
@@ -0,0 +1,145 @@
+ Model,Dataset,Loss,Accuracy,AUCROC,F1
+ ViT,a,0.3332538347373928,0.886199308393587,0.9282716390423573,0.4676470588235294
+ ViT,b,0.15735628360080778,0.9440427538509902,0.9576924493554329,0.6411290322580645
+ ViT,c,0.39000377248233686,0.8729959132348318,0.9239502762430939,0.4404432132963989
+ ViT,d,0.07890508225652003,0.9745363093366866,0.9801749539594843,0.7969924812030075
+ ViT,e,0.34977894677943117,0.8880351262349067,0.9340497994399455,0.7571428571428571
+ ViT,f,0.23130612724766006,0.9195259855936798,0.9467497493631001,0.23434045689019897
+ ViT,g,0.07770086391766866,0.969,0.9985143333333334,0.9696376101860921
+ ViT,h,0.20104280499617258,0.9313333333333333,0.9957643333333334,0.9351385390428212
+ ViT,i,0.03610865117112796,0.9851666666666666,0.9993911111111111,0.9852380162547686
+ ViT,j,2.548160281856855,0.6111666666666666,0.5926025555555556,0.4136717768283488
+ ViT,k,2.5065680770576,0.6273333333333333,0.7480725555555555,0.42400824317362185
+ ViT,l,0.9482682921706662,0.8283538681190842,0.7880956172483574,0.7088789237668162
+ MLP-Mixer,a,0.693774286923413,0.7667400188619931,0.878732044198895,0.289272030651341
+ MLP-Mixer,b,0.5321722498988626,0.8346431939641622,0.9091961325966851,0.3647342995169082
+ MLP-Mixer,c,0.9557083110995204,0.6777742848160956,0.8515027624309391,0.22758100979653353
+ MLP-Mixer,d,0.08414041630148475,0.9713926438226973,0.972451197053407,0.7684478371501272
+ MLP-Mixer,e,0.40313906763178325,0.8572996706915478,0.9160977824869447,0.6990740740740741
+ MLP-Mixer,f,0.5521763351439839,0.8145767175276896,0.9037233142227219,0.11201780415430267
+ MLP-Mixer,g,0.2688900081912676,0.9158333333333334,0.9949920555555556,0.9221519963002929
+ MLP-Mixer,h,0.4934347181916237,0.8326666666666667,0.9912926111111112,0.8562839965645577
+ MLP-Mixer,i,0.03135846547285716,0.9883333333333333,0.9996557777777778,0.988433575677462
+ MLP-Mixer,j,0.7298590982755025,0.7896666666666666,0.8653722777777778,0.7797556719022688
+ MLP-Mixer,k,0.4923275619546572,0.8621666666666666,0.9557253333333332,0.8438149197355996
+ MLP-Mixer,l,0.5266123601464402,0.8324255724181693,0.9252579204980724,0.7723583075928453
+ CvT,a,0.6393119974430159,0.8154668343288274,0.8805138121546962,0.3382187147688839
+ CvT,b,0.3633024044679194,0.8953159383841559,0.9188913443830571,0.47393364928909953
+ CvT,c,1.1649154984175278,0.6969506444514304,0.8355791896869245,0.23734177215189872
+ CvT,d,0.07392493324370102,0.9795661741590694,0.9774419889502762,0.821917808219178
+ CvT,e,0.5351559021208603,0.8353457738748628,0.8926663134791492,0.6666666666666666
+ CvT,f,0.5429993842459881,0.8470296646270622,0.9025078880097911,0.13186813186813187
+ CvT,g,0.17292167934030295,0.9475,0.998136,0.94991254571474
+ CvT,h,0.5979102214997013,0.8423333333333334,0.9933014444444443,0.8632947976878613
+ CvT,i,0.019503380698462327,0.9921666666666666,0.9998416666666667,0.9921939877096828
+ CvT,j,2.9556157615184784,0.5416666666666666,0.5250468888888888,0.28645563051375195
+ CvT,k,2.8021974964936573,0.5863333333333334,0.8273376111111111,0.3078639152258784
+ CvT,l,1.25715323004886,0.765427528951404,0.7598163378053491,0.6245132893177586
+ Swin,a,0.3497173586513066,0.8616787173844703,0.9157173112338858,0.4037940379403794
+ Swin,b,0.2814936617454318,0.894687205281358,0.9352283609576427,0.4707740916271722
+ Swin,c,0.503795853185039,0.8088651367494498,0.8968895027624311,0.32891832229580575
+ Swin,d,0.04239246920535809,0.9874253379440427,0.9881252302025783,0.8816568047337278
+ Swin,e,0.26784261316878866,0.897914379802415,0.9381745250889276,0.7621483375959079
+ Swin,f,0.2793698093965194,0.8924947718999303,0.934230056463828,0.1767497034400949
+ Swin,g,0.13889798412223656,0.9468333333333333,0.9980561111111113,0.9492603785589311
+ Swin,h,0.25675517926116787,0.9013333333333333,0.9963051111111111,0.9097560975609756
+ Swin,i,0.012134499582151572,0.996,0.9999221111111112,0.9959946595460614
+ Swin,j,1.6750605003833772,0.6498333333333334,0.6798894444444443,0.5336293007769145
+ Swin,k,1.5482970288371047,0.699,0.8766526666666666,0.5710213776722091
+ Swin,l,0.6816675912849454,0.8306805562899899,0.8426904510477526,0.7302897574123989
+ CaiT,a,0.5386943910452451,0.7966048412448915,0.8977044198895028,0.3239289446185998
+ CaiT,b,0.27990006677237217,0.8972021376925495,0.9328563535911601,0.48665620094191525
+ CaiT,c,0.8093499287011975,0.703552342030808,0.8726187845303867,0.24740622505985635
+ CaiT,d,0.05759982030905481,0.9827098396730588,0.9882780847145488,0.8493150684931506
+ CaiT,e,0.8215033926241484,0.7069154774972558,0.8727011276772875,0.537261698440208
+ CaiT,f,0.44334294090392135,0.8346371311284951,0.9199878045075582,0.12678936605316973
+ CaiT,g,0.13352473031356932,0.9495,0.9994326666666666,0.9518971265280203
+ CaiT,h,0.4142214151509106,0.8468333333333333,0.997650388888889,0.8671005061460593
+ CaiT,i,0.015668559301644562,0.9948333333333333,0.9999401111111111,0.9948564791770367
+ CaiT,j,1.9409965459505718,0.6211666666666666,0.5872103888888889,0.47493647493647495
+ CaiT,k,1.8231403918862343,0.6665,0.8462843333333334,0.5067784076904116
+ CaiT,l,0.8769211871417011,0.7827190524033631,0.7953060293098494,0.6705155961831449
+ DeiT,a,0.2227139560325983,0.9198365293932725,0.9217523020257825,0.538878842676311
+ DeiT,b,0.12757987987644753,0.9575605155611443,0.9520055248618784,0.6882217090069284
+ DeiT,c,0.3189492260405418,0.8830556428795976,0.9063683241252302,0.44477611940298506
+ DeiT,d,0.06807509632895331,0.9776799748506759,0.9840036832412522,0.8075880758807588
+ DeiT,e,0.2424249538883943,0.9220636663007684,0.9449405888140467,0.8075880758807588
+ DeiT,f,0.16089401494481675,0.9398962125319495,0.9412565697247984,0.2774674115456238
+ DeiT,g,0.05315884395440419,0.9803333333333333,0.9993382222222222,0.9806176084099869
+ DeiT,h,0.15461649525165558,0.9408333333333333,0.9978233333333334,0.9438735177865613
+ DeiT,i,0.0216113910873731,0.991,0.9997796666666667,0.9910358565737052
+ DeiT,j,1.4732107556263605,0.6831666666666667,0.7790845555555556,0.5584204413472706
+ DeiT,k,1.441663302719593,0.6938333333333333,0.909922,0.5668474416411223
+ DeiT,l,0.564133838248539,0.863095552852837,0.87913302847728,0.7700914661220141
+ DeiT3,a,0.23484845360151366,0.9276956931782459,0.8688867403314917,0.5228215767634855
+ DeiT3,b,0.2093494395927152,0.9333542911034266,0.8948563535911603,0.5431034482758621
+ DeiT3,c,0.24055628168418025,0.9305249921408362,0.872364640883978,0.53276955602537
+ DeiT3,d,0.11087679768584877,0.9685633448601069,0.9730791896869244,0.7159090909090909
+ DeiT3,e,0.3502105330694651,0.9165751920965971,0.8899114508438659,0.7682926829268293
+ DeiT3,f,0.14640022547262477,0.9520563860274185,0.9015864990256626,0.2893226176808266
+ DeiT3,g,0.080124724984169,0.9706666666666667,0.9986111111111111,0.9713261648745519
+ DeiT3,h,0.0966695581873258,0.9691666666666666,0.9981386666666666,0.9699040182202701
+ DeiT3,i,0.02791781535744667,0.9893333333333333,0.9996381111111111,0.989379356123465
+ DeiT3,j,2.1139168503483137,0.6835,0.6340643888888889,0.5698754246885617
+ DeiT3,k,2.061709966202577,0.7021666666666667,0.830329,0.5847083430165001
+ DeiT3,l,0.7510140769096566,0.8741473216646396,0.8329381541106192,0.7857785778577858
+ Twins_SVT,a,0.2616993184595204,0.9207796290474694,0.9109723756906076,0.5209125475285171
+ Twins_SVT,b,0.25328254044824183,0.9138635649166928,0.9203333333333333,0.5
+ Twins_SVT,c,0.2903442978858948,0.9047469349261239,0.9014797421731123,0.4748700173310225
+ Twins_SVT,d,0.06683065037248045,0.981766740018862,0.9800589318600368,0.8253012048192772
+ Twins_SVT,e,0.501175480276248,0.8507135016465422,0.8861651403920382,0.6682926829268293
+ Twins_SVT,f,0.19588971365251565,0.9343970257919604,0.9257999765638224,0.24442462087421946
+ Twins_SVT,g,0.1121031986673673,0.958,0.9983401111111112,0.9594072164948454
+ Twins_SVT,h,0.13175206087032953,0.9531666666666667,0.9979833333333334,0.9549462882796216
+ Twins_SVT,i,0.013252561777830124,0.994,0.9998952222222223,0.9939919893190922
+ Twins_SVT,j,1.9202613861560822,0.6245,0.7093498888888888,0.4644639885904445
+ Twins_SVT,k,1.8214107508560022,0.6605,0.9014717222222222,0.4896016036081183
+ Twins_SVT,l,0.7119041901399152,0.8470731320395537,0.8628713312328674,0.7388949079089924
+ Twins_PCPVT,a,0.30413780080114133,0.9157497642250865,0.9007108655616943,0.49812734082397003
+ Twins_PCPVT,b,0.15472232246597545,0.9544168500471549,0.9364972375690609,0.6472019464720195
+ Twins_PCPVT,c,0.4068696537002083,0.8814838101226029,0.8871178637200737,0.41368584758942456
+ Twins_PCPVT,d,0.08009447079562927,0.9814523734674631,0.9849171270718233,0.8184615384615385
+ Twins_PCPVT,e,0.4030229083367682,0.8990120746432492,0.9090819647317034,0.7430167597765364
+ Twins_PCPVT,f,0.1949533162393722,0.9419874525598327,0.9262654450920738,0.2620689655172414
+ Twins_PCPVT,g,0.04784699815511703,0.9833333333333333,0.9997761111111111,0.9835904168034132
+ Twins_PCPVT,h,0.18152711460987728,0.9446666666666667,0.9988461666666668,0.9475181789440404
+ Twins_PCPVT,i,0.00828180085743467,0.9976666666666667,0.999966,0.9976697736351531
+ Twins_PCPVT,j,3.1867437440951667,0.6673333333333333,0.6140582777777778,0.5245354930919486
+ Twins_PCPVT,k,3.147178516438852,0.6816666666666666,0.8307092777777778,0.5355058365758755
+ Twins_PCPVT,l,1.1300106981552416,0.8598170377029242,0.8156386491151393,0.7614505534059209
+ PiT,a,0.6125603188174733,0.8204966991512103,0.9077661141804788,0.3518728717366629
+ PiT,b,0.3235708940557548,0.9038038352719271,0.9432992633517496,0.5032467532467533
+ PiT,c,1.2576219515768996,0.6579691920779629,0.8583489871086556,0.2217453505007153
+ PiT,d,0.05706561206120041,0.9849104055328513,0.9926353591160222,0.8659217877094972
+ PiT,e,0.4257534373179487,0.8880351262349067,0.935154771815636,0.7524271844660194
+ PiT,f,0.5470704378571871,0.8442413445898846,0.9260653695755013,0.13356311934510987
+ PiT,g,0.15482272257159155,0.9525,0.9987017222222222,0.9545816733067729
+ PiT,h,0.6500254770293832,0.8221666666666667,0.9941473333333334,0.8488026073402296
+ PiT,i,0.013530508808791638,0.9955,0.9999065555555556,0.995512713977065
+ PiT,j,3.0745317553281786,0.5515,0.5528523888888889,0.304471439648488
+ PiT,k,2.933239512239893,0.5945,0.8885508888888889,0.3262254223206868
+ PiT,l,1.302048514799378,0.7659034424409074,0.7757457952448608,0.6281394372112558
+ ResNet-18,a,0.6009161868435675,0.808236403646652,0.9456114180478821,0.35517970401691334
+ ResNet-18,b,0.4458324937691519,0.8519333542911034,0.9591114180478821,0.4163568773234201
+ ResNet-18,c,0.7989861659754826,0.7541653568060358,0.9309244935543278,0.3005366726296959
+ ResNet-18,d,0.015449083624662364,0.9949701351776171,0.999609576427256,0.9545454545454546
+ ResNet-18,e,0.5140058943913876,0.8331503841931943,0.9492696586694922,0.6885245901639344
+ ResNet-18,f,0.48200214555188836,0.846719851289598,0.9582666776614167,0.14514038876889848
+ ResNet-18,g,0.23460300221045813,0.9216666666666666,0.9970519999999999,0.9270865653118213
+ ResNet-18,h,0.42183320385217665,0.8698333333333333,0.9940301111111111,0.8844161610182033
+ ResNet-18,i,0.006428073017857969,0.9975,0.9999838888888889,0.9974962443665498
+ ResNet-18,j,6.947763346751531,0.43466666666666665,0.12029633333333332,0.03745743473325766
+ ResNet-18,k,6.719588431844177,0.5105,0.7508054444444445,0.043010752688172046
+ ResNet-18,l,2.462256099074432,0.7395695626883825,0.6324618509475799,0.5668044682909666
+ Ensemble,a,,0.9195221628418736,0.9235128913443831,0.5412186379928315
+ Ensemble,b,,0.9519019176359635,0.9473370165745855,0.6637362637362637
+ Ensemble,c,,0.8695378811694435,0.9035138121546961,0.4211994421199442
+ Ensemble,d,,0.9867966048412449,0.9853793738489871,0.877906976744186
+ Ensemble,e,,0.9143798024149287,0.9323620676606372,0.7947368421052632
+ Ensemble,f,,0.9361784524823794,0.93950146042107,0.2682060390763766
+ Ensemble,g,,0.9785,0.9996941111111111,0.9789112309955861
+ Ensemble,h,,0.9348333333333333,0.9986463333333333,0.9387051261953284
+ Ensemble,i,,0.997,0.9999658888888888,0.997002997002997
+ Ensemble,j,,0.6451666666666667,0.7836463333333332,0.48287588049550645
+ Ensemble,k,,0.6636666666666666,0.9382774444444444,0.4962556165751373
+ Ensemble,l,,0.8500343715297974,0.8794563846610491,0.7448263451502609
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bd7db6dcc88fd50c4d7b3037f189bb48b6154403149d6e8756ac9f4e86bb76b0
+ size 343287616
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:058af372f0971917904be3167642f0f4281139b83165175d3ec53ff629aa6511
+ size 343337390
roc_confusion_matrix/DeiT3_roc_confusion_matrix_a.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_b.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_c.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_d.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_e.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_f.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_g.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_h.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_i.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_j.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_k.png ADDED
roc_confusion_matrix/DeiT3_roc_confusion_matrix_l.png ADDED
roc_curves/DeiT3_ROC_a.png ADDED
roc_curves/DeiT3_ROC_b.png ADDED
roc_curves/DeiT3_ROC_c.png ADDED
roc_curves/DeiT3_ROC_d.png ADDED
roc_curves/DeiT3_ROC_e.png ADDED
roc_curves/DeiT3_ROC_f.png ADDED
roc_curves/DeiT3_ROC_g.png ADDED
roc_curves/DeiT3_ROC_h.png ADDED
roc_curves/DeiT3_ROC_i.png ADDED
roc_curves/DeiT3_ROC_j.png ADDED
roc_curves/DeiT3_ROC_k.png ADDED
roc_curves/DeiT3_ROC_l.png ADDED
training_curves/DeiT3_accuracy.png ADDED
training_curves/DeiT3_auc.png ADDED
training_curves/DeiT3_combined_metrics.png ADDED

Git LFS Details

  • SHA256: 6525293367811474d56491563894b123717bd9ceb438a9545f5d27fb35816135
  • Pointer size: 131 Bytes
  • Size of remote file: 139 kB
training_curves/DeiT3_f1.png ADDED
training_curves/DeiT3_loss.png ADDED
training_curves/DeiT3_metrics.csv ADDED
@@ -0,0 +1,31 @@
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
+ 1,0.12133353116152513,0.06166662340956596,0.9491312378150973,0.9803206997084548,0.9908295962688415,0.9979940330984539,0.9488842872240791,0.9799851742031134
+ 2,0.05494636449336345,0.034128174685354486,0.9807521291514177,0.9861516034985423,0.997780707696122,0.999188263393654,0.9806888860292541,0.9861616897305171
+ 3,0.045487246261129495,0.043049787942988055,0.9842152067585593,0.9868804664723032,0.998435085276439,0.9982957781196611,0.9841651083394809,0.9868421052631579
+ 4,0.04013614664559494,0.03402237415737296,0.9862674008961247,0.9861516034985423,0.9987488464713992,0.9991500140247686,0.9862360946846985,0.9860601614086574
+ 5,0.037818929275874505,0.049131851762793846,0.9869343639908336,0.9854227405247813,0.9989165928871198,0.9980556570816581,0.9869077199897182,0.9854862119013063
+ 6,0.03612760852247132,0.04465823140709441,0.9873619044361597,0.9854227405247813,0.9990236021499846,0.99842965091076,0.987333527586857,0.9854862119013063
+ 7,0.03312974497234396,0.04106172019990471,0.9885761193008858,0.9868804664723032,0.9991481656207636,0.9989024556094824,0.9885538039753256,0.986764705882353
+ 8,0.03079390252278298,0.04467770861310385,0.989183226733249,0.9868804664723032,0.9992727514651405,0.998591148246054,0.9891647894200378,0.9868613138686131
+ 9,0.030111658072643988,0.041507352197491194,0.9897219276943598,0.9876093294460642,0.9992672579425801,0.9991702011916803,0.9897062601695641,0.9876543209876543
+ 10,0.029094924251348422,0.032358164527065206,0.990149468139686,0.9912536443148688,0.9993035476258836,0.999216950420318,0.9901305644083479,0.9912663755458515
+ 11,0.028486179824413364,0.05093552202577553,0.9903204843178165,0.9897959183673469,0.9993758190264957,0.9977709117799556,0.9903010778484158,0.9898255813953488
+ 12,0.027378678419949454,0.03892283181798354,0.9903632383623491,0.9919825072886297,0.9994001925023193,0.9984976497887784,0.9903479698192065,0.9919883466860888
+ 13,0.025691640328447344,0.040297305444934275,0.9907907788076752,0.9905247813411079,0.999497996418865,0.9979037220885856,0.9907774514253419,0.9905316824471959
+ 14,0.026315096579743742,0.040405817177491504,0.9910473030748709,0.9883381924198251,0.9994469299246713,0.999188263393654,0.9910346540164237,0.9883381924198251
+ 15,0.025027122080401924,0.03933713233249252,0.9910558538837774,0.9897959183673469,0.9995003760630373,0.9986007105882753,0.9910419128856003,0.9898107714701602
+ 16,0.024790827932474956,0.04512184367392283,0.9913209289598796,0.9876093294460642,0.999470932993152,0.9991277018929187,0.9913049437605476,0.9875640087783467
+ 17,0.017164641173017664,0.0428765761059656,0.9941170434723124,0.9919825072886297,0.9997483471816039,0.9985582112895137,0.9941082775275318,0.9919766593727206
+ 18,0.014272876617472185,0.044261461064089146,0.9949293703184321,0.9919825072886297,0.9998364982798729,0.9985560863245757,0.9949249873766549,0.9919766593727206
+ 19,0.013573616959855908,0.03765543523100653,0.9955535793686082,0.9934402332361516,0.9998433336333756,0.9986973964929579,0.9955482501198548,0.9934258582907232
+ 20,0.01287249052858807,0.04189210441560857,0.9953740123815713,0.9927113702623906,0.9998704032008604,0.9987345833793743,0.9953698552758831,0.9927007299270073
+ 21,0.012630879596626093,0.037352593599359774,0.9958528576803366,0.9927113702623906,0.999858099038118,0.9986984589754269,0.9958479937677103,0.9926900584795322
+ 22,0.012325043967807007,0.04300889761374387,0.9957502479734582,0.9941690962099126,0.9998819902386092,0.9986538347117273,0.9957459920740215,0.9941520467836257
+ 23,0.011392199501855074,0.04363907175206477,0.9961606868009714,0.9941690962099126,0.9999004254983352,0.9986432098870369,0.9961572366337735,0.9941520467836257
+ 24,0.011286299277906982,0.04223336118646611,0.9960837295208127,0.9934402332361516,0.999887471770091,0.9986931465630816,0.9960801766488078,0.9934258582907232
+ 25,0.011058116214544809,0.04236892951481348,0.996186339227691,0.9934402332361516,0.9998906818695731,0.9986761468435771,0.9961827487632449,0.9934258582907232
+ 26,0.01128154789737973,0.04287581923387201,0.9959982214317474,0.9934402332361516,0.999901828015834,0.9986538347117273,0.9959943167228718,0.9934258582907232
+ 27,0.011398236130418838,0.0421129193472136,0.9960324246673735,0.9934402332361516,0.9998935698184921,0.9986655220188868,0.9960288932251549,0.9934258582907232
+ 28,0.011419007594133643,0.04220352793498755,0.9960580770940931,0.9934402332361516,0.9998995173934799,0.998675084361108,0.9960551424341739,0.9934258582907232
+ 29,0.010842687454953264,0.04214560194882841,0.9961350343742518,0.9934402332361516,0.9999097506092081,0.9986772093260461,0.99613079952063,0.9934258582907232
+ 30,0.011535594235865117,0.0421641564523429,0.9961008311386257,0.9934402332361516,0.999880920546658,0.9986761468435771,0.9960974941804738,0.9934258582907232
training_metrics.csv ADDED
@@ -0,0 +1,31 @@
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
+ 1,0.12133353116152513,0.06166662340956596,0.9491312378150973,0.9803206997084548,0.9908295962688415,0.9979940330984539,0.9488842872240791,0.9799851742031134
+ 2,0.05494636449336345,0.034128174685354486,0.9807521291514177,0.9861516034985423,0.997780707696122,0.999188263393654,0.9806888860292541,0.9861616897305171
+ 3,0.045487246261129495,0.043049787942988055,0.9842152067585593,0.9868804664723032,0.998435085276439,0.9982957781196611,0.9841651083394809,0.9868421052631579
+ 4,0.04013614664559494,0.03402237415737296,0.9862674008961247,0.9861516034985423,0.9987488464713992,0.9991500140247686,0.9862360946846985,0.9860601614086574
+ 5,0.037818929275874505,0.049131851762793846,0.9869343639908336,0.9854227405247813,0.9989165928871198,0.9980556570816581,0.9869077199897182,0.9854862119013063
+ 6,0.03612760852247132,0.04465823140709441,0.9873619044361597,0.9854227405247813,0.9990236021499846,0.99842965091076,0.987333527586857,0.9854862119013063
+ 7,0.03312974497234396,0.04106172019990471,0.9885761193008858,0.9868804664723032,0.9991481656207636,0.9989024556094824,0.9885538039753256,0.986764705882353
+ 8,0.03079390252278298,0.04467770861310385,0.989183226733249,0.9868804664723032,0.9992727514651405,0.998591148246054,0.9891647894200378,0.9868613138686131
+ 9,0.030111658072643988,0.041507352197491194,0.9897219276943598,0.9876093294460642,0.9992672579425801,0.9991702011916803,0.9897062601695641,0.9876543209876543
+ 10,0.029094924251348422,0.032358164527065206,0.990149468139686,0.9912536443148688,0.9993035476258836,0.999216950420318,0.9901305644083479,0.9912663755458515
+ 11,0.028486179824413364,0.05093552202577553,0.9903204843178165,0.9897959183673469,0.9993758190264957,0.9977709117799556,0.9903010778484158,0.9898255813953488
+ 12,0.027378678419949454,0.03892283181798354,0.9903632383623491,0.9919825072886297,0.9994001925023193,0.9984976497887784,0.9903479698192065,0.9919883466860888
+ 13,0.025691640328447344,0.040297305444934275,0.9907907788076752,0.9905247813411079,0.999497996418865,0.9979037220885856,0.9907774514253419,0.9905316824471959
+ 14,0.026315096579743742,0.040405817177491504,0.9910473030748709,0.9883381924198251,0.9994469299246713,0.999188263393654,0.9910346540164237,0.9883381924198251
+ 15,0.025027122080401924,0.03933713233249252,0.9910558538837774,0.9897959183673469,0.9995003760630373,0.9986007105882753,0.9910419128856003,0.9898107714701602
+ 16,0.024790827932474956,0.04512184367392283,0.9913209289598796,0.9876093294460642,0.999470932993152,0.9991277018929187,0.9913049437605476,0.9875640087783467
+ 17,0.017164641173017664,0.0428765761059656,0.9941170434723124,0.9919825072886297,0.9997483471816039,0.9985582112895137,0.9941082775275318,0.9919766593727206
+ 18,0.014272876617472185,0.044261461064089146,0.9949293703184321,0.9919825072886297,0.9998364982798729,0.9985560863245757,0.9949249873766549,0.9919766593727206
+ 19,0.013573616959855908,0.03765543523100653,0.9955535793686082,0.9934402332361516,0.9998433336333756,0.9986973964929579,0.9955482501198548,0.9934258582907232
+ 20,0.01287249052858807,0.04189210441560857,0.9953740123815713,0.9927113702623906,0.9998704032008604,0.9987345833793743,0.9953698552758831,0.9927007299270073
+ 21,0.012630879596626093,0.037352593599359774,0.9958528576803366,0.9927113702623906,0.999858099038118,0.9986984589754269,0.9958479937677103,0.9926900584795322
+ 22,0.012325043967807007,0.04300889761374387,0.9957502479734582,0.9941690962099126,0.9998819902386092,0.9986538347117273,0.9957459920740215,0.9941520467836257
+ 23,0.011392199501855074,0.04363907175206477,0.9961606868009714,0.9941690962099126,0.9999004254983352,0.9986432098870369,0.9961572366337735,0.9941520467836257
+ 24,0.011286299277906982,0.04223336118646611,0.9960837295208127,0.9934402332361516,0.999887471770091,0.9986931465630816,0.9960801766488078,0.9934258582907232
+ 25,0.011058116214544809,0.04236892951481348,0.996186339227691,0.9934402332361516,0.9998906818695731,0.9986761468435771,0.9961827487632449,0.9934258582907232
+ 26,0.01128154789737973,0.04287581923387201,0.9959982214317474,0.9934402332361516,0.999901828015834,0.9986538347117273,0.9959943167228718,0.9934258582907232
+ 27,0.011398236130418838,0.0421129193472136,0.9960324246673735,0.9934402332361516,0.9998935698184921,0.9986655220188868,0.9960288932251549,0.9934258582907232
+ 28,0.011419007594133643,0.04220352793498755,0.9960580770940931,0.9934402332361516,0.9998995173934799,0.998675084361108,0.9960551424341739,0.9934258582907232
+ 29,0.010842687454953264,0.04214560194882841,0.9961350343742518,0.9934402332361516,0.9999097506092081,0.9986772093260461,0.99613079952063,0.9934258582907232
+ 30,0.011535594235865117,0.0421641564523429,0.9961008311386257,0.9934402332361516,0.999880920546658,0.9986761468435771,0.9960974941804738,0.9934258582907232
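The per-epoch metrics above can be loaded and summarized with pandas, for example to pick the epoch with the best validation accuracy. This is a minimal sketch, not part of the repository; the inline `csv_text` excerpts a few rows of `training_metrics.csv` above so the snippet is self-contained (the full file has 30 epochs).

```python
import io

import pandas as pd

# Rows excerpted from training_metrics.csv above (full file: epochs 1-30).
csv_text = """epoch,train_loss,val_loss,train_accuracy,val_accuracy
1,0.12133353116152513,0.06166662340956596,0.9491312378150973,0.9803206997084548
10,0.029094924251348422,0.032358164527065206,0.990149468139686,0.9912536443148688
22,0.012325043967807007,0.04300889761374387,0.9957502479734582,0.9941690962099126
30,0.011535594235865117,0.0421641564523429,0.9961008311386257,0.9934402332361516
"""

# Against the real file, use pd.read_csv("training_metrics.csv") instead.
df = pd.read_csv(io.StringIO(csv_text))

# idxmax returns the first row attaining the maximum val_accuracy.
best = df.loc[df["val_accuracy"].idxmax()]
print(f"best epoch: {int(best['epoch'])}, val_accuracy: {best['val_accuracy']:.4f}")
```

On the rows excerpted here, this reports epoch 22 (val_accuracy ≈ 0.9942); on the full file, epochs 22 and 23 tie for the best validation accuracy.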