JasonXF committed on
Commit fecabdf · verified · 1 Parent(s): 171b480

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes. See the raw diff for the full change set.
Files changed (50):
  1. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/dataset_indices.txt +21 -0
  2. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/results.txt +22 -0
  3. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/training_config.txt +67 -0
  4. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/training_log.txt +51 -0
  5. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch20_mse0.3101_r20.6028_20251205_073601.pth +3 -0
  6. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch30_mse0.2908_r20.6275_20251205_080423.pth +3 -0
  7. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch40_mse0.2787_r20.6429_20251205_083244.pth +3 -0
  8. Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch50_mse0.2731_r20.6500_20251205_090107.pth +3 -0
  9. Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/dataset_indices.txt +21 -0
  10. Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/results.txt +26 -0
  11. Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/training_config.txt +72 -0
  12. Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/training_log.txt +51 -0
  13. Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/model/fno_vE-Base.1.0+.E_epoch40_mse0.0536_r20.9317_20251208_102956.pth +3 -0
  14. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/dataset_indices.txt +21 -0
  15. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/results.txt +26 -0
  16. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/training_config.txt +72 -0
  17. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/training_log.txt +51 -0
  18. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch20_mse0.0555_r20.9297_20251208_114349.pth +3 -0
  19. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch30_mse0.0544_r20.9312_20251208_120631.pth +3 -0
  20. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch40_mse0.0542_r20.9314_20251208_122906.pth +3 -0
  21. Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch50_mse0.0544_r20.9311_20251208_125142.pth +3 -0
  22. Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/dataset_indices.txt +21 -0
  23. Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/results.txt +26 -0
  24. Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/training_config.txt +72 -0
  25. Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/training_log.txt +51 -0
  26. Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/dataset_indices.txt +21 -0
  27. Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/results.txt +26 -0
  28. Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/training_config.txt +72 -0
  29. Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/training_log.txt +51 -0
  30. Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/model/fno_vE-test-11.1.0+.E_epoch20_mse0.0968_r20.8820_20251209_045446.pth +3 -0
  31. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/dataset_indices.txt +21 -0
  32. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/results.txt +26 -0
  33. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/training_config.txt +72 -0
  34. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/training_log.txt +51 -0
  35. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch30_mse0.1763_r20.7610_20251209_051212.pth +3 -0
  36. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch40_mse0.1511_r20.7953_20251209_051320.pth +3 -0
  37. Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch50_mse0.1424_r20.8071_20251209_051428.pth +3 -0
  38. Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/dataset_indices.txt +21 -0
  39. Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/results.txt +26 -0
  40. Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/training_config.txt +72 -0
  41. Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/training_log.txt +51 -0
  42. Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/dataset_indices.txt +21 -0
  43. Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/results.txt +26 -0
  44. Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/training_config.txt +72 -0
  45. Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/training_log.txt +51 -0
  46. Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/dataset_indices.txt +21 -0
  47. Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/results.txt +26 -0
  48. Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/training_config.txt +72 -0
  49. Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/training_log.txt +51 -0
  50. Efficiency-Series (E-Base, E-Test-1~15)/E-test-3-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_141228/details/dataset_indices.txt +21 -0
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0.7, 0.2, 0.1]
+ Total GM Count: 198018
+
+ Train Indices (138567 samples):
+ Range: [57, 198017]
+ First 10: [130245, 130246, 130247, 130248, 130249, 130250, 130251, 130252, 130253, 130254]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
+
+ Validation Indices (39615 samples):
+ Range: [0, 197960]
+ First 10: [178809, 178810, 178811, 178812, 178813, 178814, 178815, 178816, 178817, 178818]
+ Last 10: [1301, 1302, 1303, 1304, 1305, 1306, 1307, 1308, 1309, 1310]
+
+ Test Indices (19836 samples):
+ Range: [798, 196877]
+ First 10: [80427, 80428, 80429, 80430, 80431, 80432, 80433, 80434, 80435, 80436]
+ Last 10: [21194, 21195, 21196, 21197, 21198, 21199, 21200, 21201, 21202, 21203]
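The split above records a seeded shuffle of all ground-motion indices partitioned by the 0.7/0.2/0.1 ratio. As a hypothetical sketch of that usual shuffle-and-cut pattern (not the repository's actual code; its rounding evidently differs slightly, since the logged counts are not exact multiples of the ratios):

```python
import random

def split_indices(total, ratios=(0.7, 0.2, 0.1), seed=42):
    """Shuffle 0..total-1 with a fixed seed and cut by ratio.

    Hypothetical helper illustrating a train/val/test index split;
    the repo's actual split sizes differ slightly from int(total * r).
    """
    rng = random.Random(seed)
    idx = list(range(total))
    rng.shuffle(idx)
    n_train = int(total * ratios[0])
    n_val = int(total * ratios[1])
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:]
    return train, val, test

train, val, test = split_indices(198018)
# The three partitions cover every index exactly once.
assert len(train) + len(val) + len(test) == 198018
assert set(train).isdisjoint(val) and set(train).isdisjoint(test)
```

Fixing the seed (here 42, matching the logged "Random Seed") is what makes the recorded first/last indices reproducible across the runs in this commit.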
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/results.txt ADDED
@@ -0,0 +1,22 @@
+ Regression Test Results - KNET Test Split
+ ============================================================
+ Model Version: 1.0+ (Standard FNO)
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 64
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.274706
+ RMSE: 0.524124
+ MAE: 0.300024
+ R² Score: 0.641545
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 8515.66s
+ - Average Epoch Time: 170.31s
+ ============================================================
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/training_config.txt ADDED
@@ -0,0 +1,67 @@
+ ======================================================================
+ FNO v1.0 Training Configuration (Standard FNO)
+ ======================================================================
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 64
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 591,425
+ Trainable Parameters: 591,425
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+ Augmentation Factors: 57 (out of 57)
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train/Val/Test Split: 0.7/0.2/0.1
+ Train Samples: 138567
+ Validation Samples: 39615
+ Test Samples (KNET): 19836
+ Total GM Count: 198018
+ Random Seed: 42
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.748659,0.865251,0.510024,0.057501,0.691819,0.831757,0.486453,0.113289,1.00e-03,176.64
+ 2,0.715880,0.846097,0.496994,0.099427,0.678199,0.823528,0.479792,0.130795,1.00e-03,170.60
+ 3,0.654247,0.808855,0.479189,0.175297,0.592849,0.769967,0.454484,0.240025,1.00e-03,170.52
+ 4,0.596859,0.772567,0.459542,0.247492,0.558146,0.747092,0.442239,0.284466,1.00e-03,168.77
+ 5,0.536808,0.732672,0.438003,0.324473,0.493476,0.702478,0.419753,0.367673,1.00e-03,171.22
+ 6,0.474672,0.688965,0.411688,0.402255,0.452871,0.672957,0.399131,0.419944,1.00e-03,170.39
+ 7,0.441314,0.664314,0.396232,0.444401,0.422833,0.650256,0.383794,0.458580,1.00e-03,170.10
+ 8,0.415970,0.644958,0.384073,0.475879,0.400234,0.632641,0.372595,0.487547,1.00e-03,170.00
+ 9,0.387893,0.622810,0.370003,0.511490,0.385773,0.621106,0.365534,0.505998,1.00e-03,170.33
+ 10,0.371105,0.609184,0.361410,0.532394,0.371626,0.609612,0.357987,0.524070,1.00e-03,170.34
+ 11,0.364352,0.603616,0.357740,0.541641,0.360710,0.600591,0.351556,0.538074,1.00e-03,169.35
+ 12,0.346748,0.588853,0.348073,0.562957,0.351179,0.592603,0.346419,0.550251,1.00e-03,170.59
+ 13,0.339410,0.582589,0.343978,0.572894,0.347742,0.589696,0.343947,0.554649,1.00e-03,170.10
+ 14,0.330581,0.574961,0.339162,0.583627,0.341593,0.584459,0.341860,0.562501,1.00e-03,170.08
+ 15,0.323780,0.569017,0.335555,0.592252,0.336137,0.579773,0.339441,0.569436,1.00e-03,170.35
+ 16,0.316397,0.562492,0.331503,0.601382,0.326848,0.571706,0.333173,0.581398,1.00e-03,170.30
+ 17,0.313851,0.560224,0.330373,0.604530,0.322886,0.568230,0.330556,0.586394,1.00e-03,169.28
+ 18,0.303943,0.551310,0.324340,0.616995,0.317558,0.563523,0.327952,0.593206,1.00e-03,170.14
+ 19,0.307067,0.554136,0.326531,0.613605,0.313651,0.560046,0.325802,0.598182,1.00e-03,170.27
+ 20,0.293286,0.541559,0.317625,0.631266,0.310061,0.556831,0.323175,0.602790,5.00e-04,170.47
+ 21,0.287661,0.536341,0.314367,0.637899,0.306297,0.553441,0.320780,0.607609,5.00e-04,170.48
+ 22,0.284986,0.533841,0.312740,0.641285,0.304508,0.551823,0.320198,0.609879,5.00e-04,170.81
+ 23,0.283058,0.532032,0.311507,0.644020,0.302492,0.549992,0.318586,0.612491,5.00e-04,170.39
+ 24,0.280939,0.530037,0.310132,0.646954,0.301613,0.549193,0.318376,0.613591,5.00e-04,169.41
+ 25,0.278395,0.527631,0.308931,0.649521,0.298959,0.546771,0.316193,0.616994,5.00e-04,170.39
+ 26,0.275980,0.525338,0.307667,0.652196,0.297449,0.545388,0.315701,0.618916,5.00e-04,170.32
+ 27,0.274665,0.524085,0.306499,0.654660,0.296145,0.544192,0.314913,0.620620,5.00e-04,170.71
+ 28,0.272128,0.521659,0.305264,0.657321,0.294144,0.542350,0.313411,0.623183,5.00e-04,169.85
+ 29,0.269962,0.519579,0.303889,0.660156,0.293234,0.541510,0.312856,0.624350,5.00e-04,170.07
+ 30,0.268900,0.518555,0.303241,0.661619,0.290757,0.539218,0.311306,0.627519,5.00e-04,168.87
+ 31,0.266139,0.515887,0.301420,0.665251,0.290309,0.538803,0.311287,0.628065,5.00e-04,169.82
+ 32,0.265089,0.514868,0.300886,0.666596,0.293150,0.541434,0.313855,0.624384,5.00e-04,170.13
+ 33,0.263419,0.513244,0.299689,0.669024,0.298635,0.546475,0.317783,0.617322,5.00e-04,170.28
+ 34,0.261584,0.511453,0.298909,0.670756,0.286428,0.535190,0.309076,0.633019,5.00e-04,170.47
+ 35,0.258470,0.508399,0.297168,0.674100,0.286089,0.534873,0.307755,0.633486,5.00e-04,170.53
+ 36,0.258794,0.508718,0.297434,0.674007,0.284358,0.533252,0.307418,0.635717,5.00e-04,169.17
+ 37,0.255757,0.505725,0.295301,0.677930,0.282382,0.531396,0.305885,0.638215,5.00e-04,170.63
+ 38,0.255396,0.505367,0.294916,0.678897,0.285321,0.534154,0.309046,0.634374,5.00e-04,170.12
+ 39,0.253226,0.503216,0.293940,0.681011,0.283585,0.532527,0.307892,0.636576,5.00e-04,170.35
+ 40,0.251219,0.501217,0.292479,0.683752,0.278684,0.527905,0.303931,0.642926,2.50e-04,169.96
+ 41,0.248981,0.498980,0.290738,0.686921,0.278127,0.527378,0.303573,0.643609,2.50e-04,170.13
+ 42,0.248121,0.498117,0.290115,0.688173,0.277451,0.526736,0.302831,0.644480,2.50e-04,171.09
+ 43,0.246947,0.496937,0.289755,0.688933,0.277032,0.526338,0.302964,0.645002,2.50e-04,169.21
+ 44,0.246298,0.496284,0.289236,0.690020,0.276774,0.526093,0.303135,0.645318,2.50e-04,170.09
+ 45,0.245451,0.495430,0.288737,0.691059,0.275830,0.525195,0.302042,0.646546,2.50e-04,170.17
+ 46,0.244872,0.494846,0.288300,0.691932,0.275035,0.524438,0.301402,0.647571,2.50e-04,170.16
+ 47,0.243818,0.493780,0.287791,0.692936,0.274661,0.524081,0.301298,0.648030,2.50e-04,170.48
+ 48,0.243135,0.493087,0.287296,0.693957,0.274451,0.523880,0.300905,0.648327,2.50e-04,170.80
+ 49,0.242917,0.492866,0.287126,0.694343,0.276058,0.525412,0.302889,0.646225,2.50e-04,169.60
+ 50,0.242237,0.492176,0.286775,0.695054,0.273123,0.522611,0.300038,0.650006,2.50e-04,170.42
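A log in the per-epoch CSV schema above can be mined with the standard library. A minimal sketch (using the last three epochs of this run as inline sample data) that picks out the epoch with the best validation R²:

```python
import csv
import io

# Three rows copied from the training_log.txt above, same CSV schema.
SAMPLE = """Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
48,0.243135,0.493087,0.287296,0.693957,0.274451,0.523880,0.300905,0.648327,2.50e-04,170.80
49,0.242917,0.492866,0.287126,0.694343,0.276058,0.525412,0.302889,0.646225,2.50e-04,169.60
50,0.242237,0.492176,0.286775,0.695054,0.273123,0.522611,0.300038,0.650006,2.50e-04,170.42
"""

def best_epoch(csv_text, metric="Val_R2"):
    """Return (epoch, value) for the row that maximizes `metric`."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    best = max(rows, key=lambda r: float(r[metric]))
    return int(best["Epoch"]), float(best[metric])

epoch, r2 = best_epoch(SAMPLE)
assert epoch == 50  # Val_R2 peaks at 0.650006 in this run's final epochs
```

`best_epoch` is a hypothetical helper, not part of this repository; for the full log one would read `training_log.txt` from disk instead of the inline string.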
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch20_mse0.3101_r20.6028_20251205_073601.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:703ad70490212ae3cfcaae53842832fc96b5dc11276573c4a9df2d94082d43c6
+ size 13645867
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch30_mse0.2908_r20.6275_20251205_080423.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:16a2e985913b912103865d91a4d8287330a65d80770d58c7c98ead095b89b4eb
+ size 13645867
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch40_mse0.2787_r20.6429_20251205_083244.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dd36459639392ee82859cae608af28fd1d6897a367cc6f2414a59b778f4999fe
+ size 13645867
Base-FNO_v1.0+_h64_m64_l4_e50_20251205_063911/model/fno_vBase.1.0+_epoch50_mse0.2731_r20.6500_20251205_090107.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3bb2db45e16de575c5406d90ae9d533500c76d52be7e6cfa0610f9d4f9a75e6e
+ size 13645867
Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (136800 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42295, 42296, 42297, 42298, 42299, 42300, 42301, 42302, 42303]
+ Last 10: [1301, 1302, 1303, 1304, 1305, 1306, 1307, 1308, 1309, 1310]
+
+ Validation Indices (34200 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84133, 84134, 84135, 84136, 84137, 84138, 84139, 84140, 84141]
+ Last 10: [16976, 16977, 16978, 16979, 16980, 16981, 16982, 16983, 16984, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 3000
+ - Scales Used: 57
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.054141
+ RMSE: 0.232682
+ MAE: 0.118034
+ R² Score: 0.930201
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 8679.66s
+ - Average Epoch Time: 173.59s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 3000
+ Scales Used per GM: 57
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 136800
+ Validation Samples: 34200
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.444309,0.666565,0.394889,0.435738,0.226337,0.475749,0.292088,0.712409,1.00e-03,176.61
+ 2,0.157035,0.396276,0.240289,0.801001,0.116957,0.341990,0.201793,0.851228,1.00e-03,172.15
+ 3,0.096321,0.310356,0.182138,0.878170,0.088018,0.296679,0.170045,0.887998,1.00e-03,173.19
+ 4,0.075602,0.274958,0.157157,0.904382,0.076052,0.275776,0.155353,0.903196,1.00e-03,172.87
+ 5,0.064804,0.254567,0.143612,0.918100,0.070243,0.265033,0.147344,0.910649,1.00e-03,172.57
+ 6,0.058487,0.241841,0.136283,0.926056,0.066615,0.258099,0.142302,0.915205,1.00e-03,170.87
+ 7,0.053869,0.232097,0.129418,0.931919,0.062197,0.249393,0.136426,0.920845,1.00e-03,173.15
+ 8,0.050078,0.223782,0.124878,0.936724,0.062689,0.250377,0.142958,0.920225,1.00e-03,173.78
+ 9,0.048027,0.219150,0.122065,0.939297,0.059079,0.243063,0.131373,0.924820,1.00e-03,173.84
+ 10,0.044951,0.212015,0.117593,0.943224,0.057926,0.240678,0.129105,0.926284,1.00e-03,172.80
+ 11,0.042822,0.206934,0.115364,0.945874,0.057488,0.239768,0.129330,0.926826,1.00e-03,178.24
+ 12,0.041787,0.204419,0.113359,0.947196,0.059631,0.244194,0.129113,0.924082,1.00e-03,172.68
+ 13,0.039559,0.198895,0.110453,0.949995,0.055781,0.236180,0.125675,0.928995,1.00e-03,172.91
+ 14,0.037909,0.194702,0.108615,0.952078,0.055644,0.235891,0.126643,0.929176,1.00e-03,172.96
+ 15,0.037084,0.192572,0.107517,0.953156,0.055396,0.235364,0.124526,0.929498,1.00e-03,174.15
+ 16,0.035873,0.189402,0.105147,0.954659,0.054681,0.233841,0.123876,0.930381,1.00e-03,175.48
+ 17,0.035294,0.187866,0.106018,0.955448,0.056359,0.237400,0.127040,0.928263,1.00e-03,175.57
+ 18,0.034126,0.184733,0.103408,0.956896,0.053956,0.232284,0.121914,0.931313,1.00e-03,173.36
+ 19,0.032967,0.181568,0.100906,0.958389,0.057141,0.239042,0.125879,0.927213,1.00e-03,174.30
+ 20,0.033076,0.181869,0.101287,0.958191,0.054333,0.233095,0.121961,0.930822,5.00e-04,175.16
+ 21,0.030517,0.174690,0.097208,0.961418,0.053406,0.231097,0.120329,0.931992,5.00e-04,174.88
+ 22,0.029821,0.172687,0.096223,0.962312,0.053558,0.231425,0.120605,0.931803,5.00e-04,173.28
+ 23,0.029453,0.171620,0.095738,0.962767,0.053500,0.231301,0.120100,0.931872,5.00e-04,174.33
+ 24,0.029154,0.170745,0.095285,0.963166,0.054015,0.232412,0.120498,0.931193,5.00e-04,173.57
+ 25,0.028862,0.169889,0.094797,0.963541,0.053592,0.231499,0.120127,0.931744,5.00e-04,173.48
+ 26,0.028498,0.168812,0.094268,0.963993,0.053535,0.231376,0.119948,0.931808,5.00e-04,173.94
+ 27,0.028062,0.167516,0.093639,0.964553,0.053531,0.231368,0.119825,0.931823,5.00e-04,172.38
+ 28,0.028148,0.167773,0.093660,0.964419,0.053419,0.231126,0.119630,0.931975,5.00e-04,174.44
+ 29,0.027592,0.166108,0.092902,0.965143,0.053374,0.231028,0.119535,0.931999,5.00e-04,174.45
+ 30,0.027135,0.164727,0.092222,0.965721,0.053475,0.231246,0.119297,0.931881,5.00e-04,173.74
+ 31,0.026953,0.164174,0.092097,0.965945,0.053712,0.231758,0.119683,0.931593,5.00e-04,174.03
+ 32,0.026836,0.163815,0.091674,0.966087,0.053416,0.231118,0.119055,0.931944,5.00e-04,172.59
+ 33,0.026298,0.162167,0.090928,0.966766,0.053355,0.230986,0.119014,0.932020,5.00e-04,173.94
+ 34,0.026111,0.161590,0.090620,0.967023,0.053417,0.231121,0.118657,0.931935,5.00e-04,174.88
+ 35,0.025957,0.161111,0.090446,0.967203,0.053362,0.231001,0.118705,0.932029,5.00e-04,173.54
+ 36,0.025729,0.160404,0.090056,0.967480,0.054311,0.233047,0.119049,0.930792,5.00e-04,172.81
+ 37,0.025741,0.160440,0.090220,0.967473,0.053808,0.231965,0.119917,0.931438,5.00e-04,173.19
+ 38,0.025174,0.158662,0.089319,0.968205,0.053192,0.230634,0.118269,0.932226,5.00e-04,173.23
+ 39,0.024865,0.157686,0.088645,0.968575,0.053950,0.232272,0.118565,0.931266,5.00e-04,172.76
+ 40,0.024654,0.157015,0.088499,0.968855,0.053611,0.231541,0.118573,0.931683,2.50e-04,172.62
+ 41,0.023849,0.154431,0.087098,0.969867,0.053357,0.230991,0.117838,0.932001,2.50e-04,172.51
+ 42,0.023535,0.153412,0.086593,0.970266,0.053362,0.231003,0.117903,0.931983,2.50e-04,172.71
+ 43,0.023485,0.153248,0.086535,0.970335,0.053422,0.231133,0.117914,0.931933,2.50e-04,171.15
+ 44,0.023314,0.152690,0.086263,0.970549,0.053735,0.231809,0.118015,0.931529,2.50e-04,171.64
+ 45,0.023218,0.152376,0.086173,0.970674,0.053813,0.231976,0.118361,0.931418,2.50e-04,171.36
+ 46,0.023072,0.151893,0.085891,0.970843,0.053646,0.231616,0.117810,0.931640,2.50e-04,171.03
+ 47,0.022973,0.151569,0.085683,0.970986,0.053800,0.231948,0.117786,0.931420,2.50e-04,172.38
+ 48,0.022844,0.151143,0.085518,0.971143,0.053755,0.231850,0.117766,0.931481,2.50e-04,174.31
+ 49,0.022750,0.150830,0.085311,0.971290,0.053695,0.231722,0.118111,0.931553,2.50e-04,177.00
+ 50,0.022718,0.150726,0.085299,0.971296,0.053918,0.232202,0.118269,0.931280,2.50e-04,170.85
Efficiency-Series (E-Base, E-Test-1~15)/E-Base-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_083402/model/fno_vE-Base.1.0+.E_epoch40_mse0.0536_r20.9317_20251208_102956.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3e8eba7e1d842bb4e854eeb49a27c3eab38a8a688d658d5eeaceee76115eb661
+ size 303053571
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (110400 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42295, 42296, 42298, 42299, 42300, 42301, 42303, 42304, 42305]
+ Last 10: [1299, 1300, 1301, 1303, 1304, 1305, 1306, 1308, 1309, 1310]
+
+ Validation Indices (27600 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84133, 84134, 84136, 84137, 84138, 84139, 84141, 84142, 84143]
+ Last 10: [16974, 16975, 16976, 16978, 16979, 16980, 16981, 16983, 16984, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 3000
+ - Scales Used: 46
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.054490
+ RMSE: 0.233430
+ MAE: 0.118761
+ R² Score: 0.929753
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 6754.88s
+ - Average Epoch Time: 135.10s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 3000
+ Scales Used per GM: 46
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 110400
+ Validation Samples: 27600
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.492072,0.701479,0.416184,0.372684,0.263695,0.513512,0.311554,0.666411,1.00e-03,133.48
+ 2,0.191875,0.438036,0.267450,0.756707,0.140649,0.375032,0.224944,0.822109,1.00e-03,129.54
+ 3,0.113682,0.337167,0.200511,0.856019,0.100830,0.317538,0.184329,0.872428,1.00e-03,129.26
+ 4,0.086036,0.293318,0.169116,0.891127,0.085206,0.291900,0.167420,0.892182,1.00e-03,129.78
+ 5,0.072919,0.270035,0.153775,0.907871,0.076555,0.276685,0.154614,0.903140,1.00e-03,130.13
+ 6,0.065111,0.255168,0.143570,0.917902,0.070775,0.266036,0.147232,0.910458,1.00e-03,129.28
+ 7,0.059361,0.243642,0.136530,0.924836,0.067835,0.260452,0.142777,0.914178,1.00e-03,132.83
+ 8,0.055428,0.235432,0.131374,0.929966,0.067190,0.259210,0.139622,0.914991,1.00e-03,134.96
+ 9,0.052567,0.229275,0.128133,0.933588,0.062523,0.250046,0.135729,0.920893,1.00e-03,137.01
+ 10,0.049335,0.222115,0.123101,0.937791,0.062592,0.250183,0.136277,0.920796,1.00e-03,136.33
+ 11,0.047150,0.217140,0.120913,0.940425,0.059519,0.243965,0.132255,0.924701,1.00e-03,136.09
+ 12,0.045593,0.213524,0.119322,0.942435,0.058196,0.241238,0.129222,0.926367,1.00e-03,135.59
+ 13,0.044455,0.210844,0.118166,0.943789,0.058902,0.242698,0.132622,0.925475,1.00e-03,135.92
+ 14,0.041846,0.204563,0.113396,0.947181,0.057077,0.238909,0.126861,0.927769,1.00e-03,135.92
+ 15,0.040597,0.201488,0.111129,0.948821,0.056595,0.237897,0.126552,0.928383,1.00e-03,134.76
+ 16,0.039476,0.198685,0.109936,0.950170,0.058232,0.241313,0.128052,0.926299,1.00e-03,134.84
+ 17,0.039168,0.197908,0.110630,0.950545,0.056179,0.237020,0.125395,0.928914,1.00e-03,135.34
+ 18,0.037114,0.192649,0.107159,0.953101,0.055791,0.236202,0.125365,0.929393,1.00e-03,135.50
+ 19,0.036193,0.190244,0.105591,0.954281,0.054991,0.234501,0.123633,0.930402,1.00e-03,135.41
+ 20,0.035531,0.188496,0.105274,0.955092,0.055542,0.235675,0.125827,0.929689,5.00e-04,135.25
+ 21,0.033799,0.183845,0.102011,0.957326,0.054678,0.233833,0.122205,0.930787,5.00e-04,135.45
+ 22,0.032996,0.181649,0.100720,0.958305,0.054719,0.233922,0.121778,0.930734,5.00e-04,136.65
+ 23,0.032577,0.180490,0.100165,0.958850,0.054712,0.233906,0.122060,0.930753,5.00e-04,136.45
+ 24,0.032190,0.179415,0.099613,0.959385,0.054541,0.233541,0.121665,0.930959,5.00e-04,137.33
+ 25,0.031768,0.178236,0.098992,0.959901,0.054794,0.234081,0.122080,0.930647,5.00e-04,136.73
+ 26,0.031590,0.177736,0.098918,0.960086,0.054732,0.233950,0.121267,0.930712,5.00e-04,136.40
+ 27,0.031320,0.176974,0.098286,0.960523,0.054629,0.233729,0.121580,0.930842,5.00e-04,135.07
+ 28,0.030743,0.175337,0.097668,0.961153,0.054210,0.232829,0.120797,0.931378,5.00e-04,136.15
+ 29,0.030364,0.174252,0.096944,0.961681,0.054408,0.233255,0.120727,0.931124,5.00e-04,135.51
+ 30,0.030166,0.173685,0.096686,0.961953,0.054358,0.233148,0.120760,0.931188,5.00e-04,135.03
+ 31,0.029730,0.172424,0.096139,0.962493,0.054303,0.233029,0.120741,0.931258,5.00e-04,135.27
+ 32,0.029407,0.171484,0.095674,0.962883,0.054464,0.233376,0.120406,0.931048,5.00e-04,133.85
+ 33,0.029282,0.171120,0.095547,0.963005,0.055039,0.234604,0.120501,0.930315,5.00e-04,135.46
+ 34,0.029012,0.170329,0.095240,0.963369,0.054922,0.234354,0.120293,0.930463,5.00e-04,135.75
+ 35,0.028852,0.169858,0.095120,0.963498,0.054138,0.232675,0.120197,0.931460,5.00e-04,135.99
+ 36,0.028276,0.168155,0.094149,0.964299,0.054350,0.233130,0.120251,0.931189,5.00e-04,135.48
+ 37,0.028175,0.167855,0.093947,0.964451,0.054461,0.233369,0.120090,0.931042,5.00e-04,135.27
+ 38,0.028024,0.167404,0.094011,0.964613,0.054026,0.232436,0.119752,0.931603,5.00e-04,134.93
+ 39,0.027437,0.165640,0.092832,0.965346,0.054339,0.233107,0.120817,0.931203,5.00e-04,136.21
+ 40,0.027240,0.165045,0.092469,0.965591,0.054178,0.232763,0.119525,0.931400,2.50e-04,135.67
+ 41,0.026470,0.162695,0.091244,0.966590,0.053862,0.232081,0.118830,0.931805,2.50e-04,136.74
+ 42,0.026053,0.161411,0.090593,0.967124,0.053889,0.232140,0.119084,0.931772,2.50e-04,137.51
+ 43,0.025898,0.160929,0.090491,0.967277,0.053969,0.232312,0.118995,0.931671,2.50e-04,136.52
+ 44,0.025718,0.160369,0.090162,0.967512,0.054327,0.233081,0.118880,0.931212,2.50e-04,136.00
+ 45,0.025579,0.159936,0.090025,0.967613,0.054142,0.232684,0.118867,0.931444,2.50e-04,135.78
+ 46,0.025576,0.159926,0.089858,0.967780,0.054112,0.232621,0.119090,0.931487,2.50e-04,135.36
+ 47,0.025397,0.159363,0.089677,0.967937,0.054306,0.233037,0.118873,0.931234,2.50e-04,134.60
+ 48,0.025246,0.158889,0.089460,0.968119,0.054360,0.233152,0.118962,0.931172,2.50e-04,134.73
+ 49,0.025136,0.158544,0.089244,0.968269,0.054137,0.232674,0.118943,0.931458,2.50e-04,133.89
+ 50,0.025059,0.158300,0.089140,0.968345,0.054418,0.233277,0.118999,0.931096,2.50e-04,134.74
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch20_mse0.0555_r20.9297_20251208_114349.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a083f06730b38362038547db429c16a1090bb8692a5e04bd5aafa362d2e1640a
+ size 303053903
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch30_mse0.0544_r20.9312_20251208_120631.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b8c7538bf3218f660e9f2a6fb908213d864a248b389c30580a550733674a73c2
+ size 303053903
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch40_mse0.0542_r20.9314_20251208_122906.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aa5ab92413b2c909c8f9af51885062ed0a94b2a7594935ea526179993531138f
+ size 303053903
Efficiency-Series (E-Base, E-Test-1~15)/E-test-1-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_105908/model/fno_vE-test-1.1.0+.E_epoch50_mse0.0544_r20.9311_20251208_125142.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:09cc24b22bb80d6cad8d4155e0dce6f2db70089e595ce386d41c4823b4a18c77
+ size 303053903
Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (48960 samples):
+ Range: [0, 197960]
+ First 10: [88749, 88751, 88752, 88754, 88756, 88757, 88759, 88761, 88763, 88764]
+ Last 10: [151376, 151377, 151379, 151381, 151383, 151384, 151386, 151388, 151389, 151391]
+
+ Validation Indices (12240 samples):
+ Range: [798, 197846]
+ First 10: [53067, 53069, 53070, 53072, 53074, 53075, 53077, 53079, 53081, 53082]
+ Last 10: [37034, 37035, 37037, 37039, 37041, 37042, 37044, 37046, 37047, 37049]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 1800
+ - Scales Used: 34
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.066595
+ RMSE: 0.258060
+ MAE: 0.135838
+ R² Score: 0.914180
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 3006.82s
+ - Average Epoch Time: 60.14s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 1800
+ Scales Used per GM: 34
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 48960
+ Validation Samples: 12240
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-10-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_035505/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.653862,0.808617,0.479037,0.138059,0.461427,0.679284,0.409555,0.437504,1.00e-03,65.34
+ 2,0.340934,0.583895,0.356805,0.554073,0.283801,0.532730,0.320165,0.654515,1.00e-03,60.04
+ 3,0.235594,0.485380,0.297022,0.692345,0.208297,0.456396,0.278029,0.746433,1.00e-03,59.63
+ 4,0.170691,0.413147,0.254740,0.776588,0.157771,0.397204,0.237802,0.807962,1.00e-03,59.70
+ 5,0.131810,0.363057,0.218916,0.827463,0.132028,0.363356,0.213858,0.839284,1.00e-03,59.86
+ 6,0.110267,0.332065,0.196188,0.856848,0.115902,0.340444,0.198638,0.858873,1.00e-03,59.62
+ 7,0.096040,0.309903,0.181464,0.875021,0.104942,0.323947,0.185184,0.872227,1.00e-03,59.95
+ 8,0.086377,0.293899,0.171553,0.887530,0.097656,0.312499,0.177992,0.881097,1.00e-03,59.58
+ 9,0.078359,0.279927,0.160941,0.897812,0.091301,0.302161,0.169447,0.888835,1.00e-03,59.90
+ 10,0.073312,0.270763,0.154349,0.904966,0.086900,0.294789,0.164714,0.894207,1.00e-03,59.57
+ 11,0.068468,0.261664,0.148846,0.910932,0.085661,0.292679,0.161435,0.895726,1.00e-03,59.67
+ 12,0.065010,0.254970,0.146267,0.914998,0.081121,0.284818,0.158009,0.901239,1.00e-03,60.36
+ 13,0.061069,0.247121,0.140122,0.920231,0.079607,0.282147,0.154407,0.903100,1.00e-03,59.97
+ 14,0.058914,0.242722,0.137000,0.923637,0.077227,0.277897,0.155283,0.905989,1.00e-03,60.15
+ 15,0.056512,0.237723,0.135221,0.926281,0.075293,0.274396,0.152974,0.908350,1.00e-03,59.71
+ 16,0.054027,0.232438,0.132129,0.929556,0.073838,0.271731,0.148192,0.910118,1.00e-03,59.97
+ 17,0.052512,0.229155,0.128960,0.931925,0.073361,0.270852,0.147273,0.910708,1.00e-03,59.81
+ 18,0.051394,0.226703,0.128523,0.933140,0.073521,0.271147,0.148293,0.910528,1.00e-03,60.32
+ 19,0.049440,0.222351,0.125188,0.935878,0.071419,0.267243,0.146592,0.913082,1.00e-03,59.44
+ 20,0.047992,0.219071,0.123164,0.938010,0.071299,0.267019,0.143998,0.913210,5.00e-04,60.20
+ 21,0.046272,0.215110,0.120968,0.939961,0.069519,0.263665,0.142106,0.915396,5.00e-04,59.85
+ 22,0.044947,0.212006,0.118956,0.941484,0.069226,0.263109,0.141456,0.915760,5.00e-04,59.70
+ 23,0.044407,0.210730,0.117895,0.942416,0.069001,0.262680,0.141083,0.916037,5.00e-04,59.94
+ 24,0.043736,0.209133,0.117196,0.943186,0.069676,0.263963,0.141142,0.915212,5.00e-04,59.95
+ 25,0.043649,0.208923,0.117368,0.943405,0.068422,0.261577,0.140406,0.916743,5.00e-04,60.11
+ 26,0.042681,0.206594,0.115956,0.944526,0.068122,0.261003,0.139855,0.917116,5.00e-04,59.76
+ 27,0.042191,0.205404,0.115091,0.945239,0.068093,0.260946,0.139688,0.917157,5.00e-04,60.18
+ 28,0.041531,0.203793,0.114361,0.946010,0.068297,0.261337,0.139505,0.916912,5.00e-04,59.81
+ 29,0.040996,0.202475,0.113631,0.946787,0.067966,0.260702,0.139885,0.917328,5.00e-04,60.18
+ 30,0.040609,0.201517,0.113316,0.947125,0.067647,0.260090,0.139032,0.917702,5.00e-04,59.62
+ 31,0.039835,0.199588,0.112125,0.948180,0.067376,0.259568,0.138378,0.918034,5.00e-04,59.87
+ 32,0.039326,0.198307,0.111356,0.948857,0.068531,0.261785,0.138896,0.916625,5.00e-04,60.00
+ 33,0.039696,0.199240,0.111835,0.948581,0.067379,0.259574,0.138839,0.918047,5.00e-04,59.81
+ 34,0.038716,0.196763,0.110989,0.949635,0.067007,0.258858,0.137886,0.918492,5.00e-04,60.17
+ 35,0.038098,0.195186,0.109483,0.950660,0.066883,0.258618,0.137492,0.918644,5.00e-04,59.83
+ 36,0.037523,0.193708,0.108925,0.951174,0.067533,0.259870,0.137933,0.917865,5.00e-04,59.75
+ 37,0.037220,0.192925,0.108444,0.951658,0.066825,0.258505,0.137148,0.918722,5.00e-04,59.55
+ 38,0.036877,0.192035,0.108026,0.952095,0.067890,0.260557,0.137624,0.917420,5.00e-04,59.94
+ 39,0.037040,0.192459,0.108110,0.951985,0.066624,0.258117,0.136760,0.918961,5.00e-04,59.55
+ 40,0.036091,0.189976,0.107032,0.953142,0.066511,0.257897,0.136552,0.919098,2.50e-04,59.63
+ 41,0.035539,0.188518,0.105810,0.954094,0.066289,0.257467,0.135996,0.919369,2.50e-04,59.59
+ 42,0.035026,0.187151,0.105319,0.954466,0.066417,0.257716,0.136124,0.919219,2.50e-04,59.36
+ 43,0.034981,0.187032,0.105116,0.954684,0.066606,0.258082,0.136141,0.918994,2.50e-04,59.97
+ 44,0.034694,0.186262,0.104811,0.954953,0.066261,0.257412,0.135733,0.919411,2.50e-04,59.02
+ 45,0.034463,0.185643,0.104455,0.955290,0.066238,0.257367,0.135694,0.919434,2.50e-04,59.95
+ 46,0.034219,0.184985,0.104261,0.955483,0.066269,0.257427,0.135793,0.919400,2.50e-04,58.93
+ 47,0.033966,0.184300,0.103974,0.955767,0.066324,0.257534,0.135744,0.919335,2.50e-04,59.83
+ 48,0.033926,0.184190,0.103723,0.956010,0.066335,0.257555,0.135544,0.919315,2.50e-04,59.88
+ 49,0.033810,0.183875,0.103503,0.956230,0.066295,0.257479,0.135624,0.919373,2.50e-04,59.68
+ 50,0.033436,0.182856,0.103214,0.956463,0.066339,0.257563,0.135325,0.919312,2.50e-04,59.62
Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (22080 samples):
+ Range: [0, 197960]
+ First 10: [10146, 10149, 10151, 10154, 10156, 10159, 10161, 10164, 10166, 10169]
+ Last 10: [151368, 151371, 151373, 151376, 151378, 151381, 151383, 151386, 151388, 151391]
+
+ Validation Indices (5520 samples):
+ Range: [399, 197732]
+ First 10: [47538, 47541, 47543, 47546, 47548, 47551, 47553, 47556, 47558, 47561]
+ Last 10: [77667, 77670, 77672, 77675, 77677, 77680, 77682, 77685, 77687, 77690]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 1200
+ - Scales Used: 23
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.080610
+ RMSE: 0.283920
+ MAE: 0.155507
+ R² Score: 0.896106
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 1357.18s
+ - Average Epoch Time: 27.14s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 1200
+ Scales Used per GM: 23
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 22080
+ Validation Samples: 5520
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.766311,0.875392,0.509570,0.017693,0.755686,0.869302,0.483049,0.099875,1.00e-03,32.22
+ 2,0.595700,0.771816,0.457179,0.229937,0.490866,0.700618,0.407220,0.405898,1.00e-03,26.90
+ 3,0.406209,0.637346,0.389321,0.476433,0.361515,0.601261,0.355473,0.558147,1.00e-03,26.58
+ 4,0.319668,0.565392,0.344805,0.589243,0.307027,0.554101,0.327134,0.621789,1.00e-03,26.84
+ 5,0.271546,0.521101,0.316835,0.650961,0.265776,0.515535,0.304200,0.671161,1.00e-03,26.66
+ 6,0.237542,0.487383,0.297232,0.695726,0.235033,0.484802,0.287658,0.709459,1.00e-03,26.93
+ 7,0.206285,0.454186,0.277822,0.735305,0.206501,0.454423,0.269302,0.745404,1.00e-03,26.85
+ 8,0.177486,0.421290,0.258357,0.771992,0.183715,0.428620,0.252422,0.774565,1.00e-03,26.59
+ 9,0.155030,0.393739,0.240388,0.801047,0.163912,0.404860,0.236508,0.799325,1.00e-03,26.76
+ 10,0.137527,0.370846,0.223705,0.823784,0.149521,0.386679,0.223956,0.816975,1.00e-03,26.62
+ 11,0.124408,0.352715,0.212023,0.840436,0.139902,0.374034,0.214483,0.829150,1.00e-03,26.86
+ 12,0.114064,0.337734,0.202130,0.853816,0.129861,0.360362,0.206599,0.840781,1.00e-03,26.62
+ 13,0.105001,0.324039,0.192725,0.865379,0.122743,0.350346,0.198527,0.849816,1.00e-03,26.90
+ 14,0.097961,0.312986,0.185220,0.874538,0.116931,0.341952,0.194412,0.856965,1.00e-03,26.54
+ 15,0.092022,0.303352,0.178804,0.882277,0.111598,0.334062,0.187690,0.863313,1.00e-03,26.91
+ 16,0.086698,0.294445,0.172638,0.888867,0.107930,0.328527,0.183514,0.868148,1.00e-03,26.70
+ 17,0.082103,0.286536,0.167261,0.894683,0.104703,0.323579,0.180000,0.871947,1.00e-03,26.89
+ 18,0.078603,0.280363,0.163152,0.899144,0.102445,0.320070,0.176848,0.875312,1.00e-03,26.65
+ 19,0.075286,0.274384,0.158949,0.903496,0.099017,0.314669,0.174706,0.878840,1.00e-03,26.89
+ 20,0.071984,0.268299,0.154715,0.907916,0.096772,0.311081,0.170587,0.881959,5.00e-04,26.66
+ 21,0.069410,0.263457,0.151331,0.911091,0.095581,0.309162,0.168642,0.883471,5.00e-04,26.91
+ 22,0.067982,0.260733,0.149454,0.912863,0.094768,0.307844,0.167698,0.884447,5.00e-04,26.68
+ 23,0.066794,0.258445,0.147918,0.914454,0.093882,0.306401,0.166701,0.885560,5.00e-04,26.85
+ 24,0.065674,0.256270,0.146501,0.916041,0.093590,0.305925,0.165689,0.886216,5.00e-04,26.62
+ 25,0.064529,0.254026,0.145238,0.917403,0.092351,0.303893,0.164889,0.887447,5.00e-04,26.76
+ 26,0.063412,0.251817,0.143732,0.918899,0.091684,0.302793,0.163889,0.888347,5.00e-04,26.70
+ 27,0.062319,0.249637,0.142385,0.920290,0.091143,0.301899,0.163118,0.889147,5.00e-04,26.85
+ 28,0.061220,0.247426,0.141201,0.921613,0.090667,0.301110,0.162173,0.889707,5.00e-04,26.57
+ 29,0.060323,0.245607,0.139956,0.922868,0.090100,0.300166,0.161257,0.890546,5.00e-04,26.93
+ 30,0.059394,0.243709,0.138686,0.924094,0.089598,0.299328,0.160745,0.891246,5.00e-04,26.81
+ 31,0.058315,0.241485,0.137613,0.925267,0.089083,0.298468,0.159970,0.891891,5.00e-04,27.04
+ 32,0.057526,0.239846,0.136693,0.926240,0.088825,0.298035,0.159662,0.892239,5.00e-04,26.64
+ 33,0.056707,0.238132,0.135610,0.927356,0.088154,0.296907,0.158678,0.893102,5.00e-04,26.68
+ 34,0.055786,0.236191,0.134394,0.928600,0.087765,0.296252,0.158260,0.893518,5.00e-04,26.49
+ 35,0.055078,0.234688,0.133393,0.929611,0.087273,0.295421,0.157590,0.894226,5.00e-04,26.68
+ 36,0.054233,0.232879,0.132491,0.930552,0.086901,0.294789,0.157043,0.894635,5.00e-04,26.49
+ 37,0.053541,0.231388,0.131682,0.931421,0.086260,0.293701,0.156467,0.895284,5.00e-04,26.93
+ 38,0.052680,0.229521,0.130561,0.932496,0.085963,0.293195,0.155961,0.895693,5.00e-04,26.74
+ 39,0.052213,0.228501,0.129904,0.933246,0.086791,0.294604,0.156038,0.895148,5.00e-04,26.93
+ 40,0.051520,0.226981,0.129043,0.934089,0.086028,0.293306,0.156036,0.895989,2.50e-04,26.64
+ 41,0.050727,0.225226,0.128108,0.935047,0.085238,0.291955,0.154945,0.896690,2.50e-04,26.98
+ 42,0.050308,0.224295,0.127538,0.935534,0.085412,0.292254,0.154697,0.896769,2.50e-04,26.72
+ 43,0.050012,0.223633,0.127033,0.936026,0.084984,0.291521,0.154240,0.897153,2.50e-04,26.90
+ 44,0.049547,0.222592,0.126499,0.936485,0.085046,0.291627,0.154227,0.897191,2.50e-04,26.72
+ 45,0.049349,0.222146,0.126186,0.936852,0.084717,0.291062,0.153875,0.897487,2.50e-04,27.01
+ 46,0.048990,0.221337,0.125753,0.937280,0.084993,0.291536,0.153720,0.897370,2.50e-04,26.85
+ 47,0.048746,0.220786,0.125365,0.937630,0.084555,0.290783,0.153490,0.897775,2.50e-04,27.04
+ 48,0.048421,0.220048,0.124952,0.938049,0.084685,0.291007,0.153318,0.897677,2.50e-04,26.81
+ 49,0.048124,0.219373,0.124594,0.938454,0.084564,0.290799,0.153219,0.897834,2.50e-04,26.91
+ 50,0.047750,0.218519,0.124181,0.938843,0.084231,0.290226,0.153050,0.898173,2.50e-04,26.70
Efficiency-Series (E-Base, E-Test-1~15)/E-test-11-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_044539/model/fno_vE-test-11.1.0+.E_epoch20_mse0.0968_r20.8820_20251209_045446.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:95f527f8815990d8771c9c0cb832556769f82696e3d76470588d6faca3f62bf4
+ size 303054069
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (5280 samples):
+ Range: [0, 197789]
+ First 10: [150423, 150429, 150434, 150440, 150445, 150451, 150457, 150462, 150468, 150473]
+ Last 10: [52503, 52508, 52514, 52519, 52525, 52531, 52536, 52542, 52547, 52553]
+
+ Validation Indices (1320 samples):
+ Range: [1425, 197960]
+ First 10: [83277, 83283, 83288, 83294, 83299, 83305, 83311, 83316, 83322, 83327]
+ Last 10: [12432, 12437, 12443, 12448, 12454, 12460, 12465, 12471, 12476, 12482]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 600
+ - Scales Used: 11
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.151799
+ RMSE: 0.389614
+ MAE: 0.234563
+ R² Score: 0.804419
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 344.23s
+ - Average Epoch Time: 6.88s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 600
+ Scales Used per GM: 11
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 5280
+ Validation Samples: 1320
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
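The StepLR settings in the config above (step size 20, gamma 0.5) can be checked against the Learning_Rate column of the accompanying training_log.txt. A minimal sketch in plain Python (no PyTorch dependency; the 1-indexed epoch convention is an assumption inferred from the log):

```python
# Reproduce the StepLR decay from the config: lr halves every 20 epochs.
# Epochs are 1-indexed as in training_log.txt (inferred, not stated in the config).
def step_lr(epoch, base_lr=1e-3, step_size=20, gamma=0.5):
    return base_lr * gamma ** (epoch // step_size)

# Matches the log: 1e-3 for epochs 1-19, 5e-4 for 20-39, 2.5e-4 for 40-50.
schedule = {e: step_lr(e) for e in (1, 19, 20, 39, 40, 50)}
```

With these settings a 50-epoch run only ever sees three learning-rate plateaus, which is exactly what every log in this series shows.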
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.752955,0.867730,0.511840,-0.000251,0.734296,0.856911,0.483321,0.004986,1.00e-03,12.18
+ 2,0.770324,0.877681,0.507863,0.008381,0.718177,0.847454,0.479127,0.026828,1.00e-03,6.40
+ 3,0.756918,0.870010,0.502290,0.036846,0.665784,0.815956,0.462967,0.097824,1.00e-03,6.64
+ 4,0.692868,0.832387,0.482650,0.121242,0.585948,0.765473,0.439180,0.206006,1.00e-03,6.43
+ 5,0.591314,0.768970,0.458880,0.222792,0.541079,0.735581,0.423485,0.266806,1.00e-03,6.41
+ 6,0.528605,0.727052,0.437938,0.304538,0.449891,0.670739,0.394544,0.390371,1.00e-03,6.59
+ 7,0.464579,0.681600,0.410772,0.409370,0.398259,0.631077,0.375262,0.460336,1.00e-03,6.41
+ 8,0.427002,0.653454,0.392007,0.470289,0.362059,0.601714,0.362799,0.509388,1.00e-03,6.62
+ 9,0.369514,0.607877,0.376324,0.522513,0.328491,0.573141,0.346001,0.554876,1.00e-03,6.41
+ 10,0.340791,0.583773,0.358072,0.563777,0.310443,0.557174,0.336955,0.579332,1.00e-03,6.43
+ 11,0.319905,0.565602,0.348240,0.585928,0.298947,0.546760,0.329721,0.594910,1.00e-03,6.66
+ 12,0.311641,0.558248,0.339300,0.605961,0.284249,0.533150,0.318767,0.614826,1.00e-03,6.38
+ 13,0.285818,0.534619,0.327771,0.627608,0.273022,0.522515,0.311227,0.630039,1.00e-03,6.34
+ 14,0.289258,0.537827,0.319406,0.646062,0.263888,0.513701,0.304776,0.642416,1.00e-03,6.61
+ 15,0.264849,0.514635,0.311877,0.662954,0.252644,0.502637,0.297691,0.657652,1.00e-03,6.37
+ 16,0.264728,0.514517,0.305377,0.675161,0.247682,0.497677,0.296943,0.664376,1.00e-03,6.34
+ 17,0.252554,0.502548,0.302526,0.687503,0.236943,0.486768,0.290611,0.678928,1.00e-03,6.56
+ 18,0.234890,0.484654,0.295831,0.699958,0.228211,0.477714,0.288198,0.690761,1.00e-03,6.33
+ 19,0.236282,0.486088,0.291064,0.713401,0.220006,0.469048,0.280243,0.701879,1.00e-03,6.37
+ 20,0.218490,0.467429,0.284480,0.723851,0.213081,0.461607,0.275960,0.711262,5.00e-04,6.58
+ 21,0.214599,0.463249,0.279136,0.734502,0.209662,0.457888,0.273413,0.715896,5.00e-04,6.41
+ 22,0.200432,0.447696,0.275510,0.741189,0.205274,0.453072,0.270811,0.721842,5.00e-04,6.62
+ 23,0.199255,0.446380,0.272039,0.747353,0.202060,0.449510,0.269108,0.726198,5.00e-04,6.41
+ 24,0.199804,0.446994,0.269253,0.753479,0.199108,0.446215,0.267272,0.730197,5.00e-04,6.40
+ 25,0.193656,0.440063,0.266207,0.759455,0.194140,0.440613,0.262744,0.736929,5.00e-04,6.57
+ 26,0.181430,0.425946,0.262013,0.765518,0.191147,0.437204,0.260790,0.740984,5.00e-04,6.39
+ 27,0.181112,0.425572,0.259080,0.771365,0.186686,0.432072,0.257255,0.747030,5.00e-04,6.37
+ 28,0.180375,0.424706,0.255247,0.777132,0.182483,0.427180,0.254239,0.752725,5.00e-04,6.61
+ 29,0.171781,0.414465,0.251943,0.782166,0.181107,0.425566,0.253468,0.754590,5.00e-04,6.43
+ 30,0.174057,0.417201,0.249958,0.787453,0.176350,0.419940,0.250632,0.761036,5.00e-04,6.41
+ 31,0.166065,0.407511,0.246428,0.793021,0.173175,0.416143,0.248257,0.765338,5.00e-04,6.61
+ 32,0.161099,0.401372,0.243714,0.797863,0.171383,0.413984,0.249600,0.767767,5.00e-04,6.38
+ 33,0.159594,0.399492,0.242943,0.802263,0.168769,0.410815,0.245565,0.771308,5.00e-04,6.62
+ 34,0.155240,0.394006,0.239167,0.806041,0.165377,0.406665,0.242164,0.775905,5.00e-04,6.40
+ 35,0.152397,0.390380,0.236915,0.810327,0.163211,0.403994,0.242215,0.778839,5.00e-04,6.39
+ 36,0.148037,0.384756,0.233664,0.814405,0.159671,0.399588,0.236759,0.783637,5.00e-04,6.61
+ 37,0.146445,0.382681,0.229784,0.818548,0.157436,0.396783,0.235823,0.786664,5.00e-04,6.43
+ 38,0.141431,0.376073,0.227594,0.822019,0.155209,0.393965,0.233724,0.789683,5.00e-04,6.41
+ 39,0.140953,0.375437,0.225384,0.825234,0.153210,0.391420,0.232006,0.792392,5.00e-04,6.64
+ 40,0.136616,0.369616,0.223035,0.828608,0.151075,0.388683,0.229740,0.795285,2.50e-04,6.39
+ 41,0.143662,0.379028,0.220442,0.831528,0.150218,0.387579,0.228693,0.796446,2.50e-04,6.41
+ 42,0.132849,0.364484,0.219276,0.833212,0.148718,0.385640,0.227396,0.798478,2.50e-04,6.61
+ 43,0.132301,0.363732,0.217690,0.834983,0.147893,0.384568,0.226165,0.799597,2.50e-04,6.33
+ 44,0.129113,0.359322,0.216574,0.836196,0.147547,0.384118,0.226185,0.800065,2.50e-04,6.62
+ 45,0.130021,0.360584,0.215728,0.837897,0.146272,0.382455,0.225036,0.801793,2.50e-04,6.39
+ 46,0.129330,0.359625,0.214467,0.839460,0.145436,0.381361,0.224224,0.802926,2.50e-04,6.35
+ 47,0.122856,0.350509,0.213556,0.840932,0.145037,0.380838,0.223970,0.803466,2.50e-04,6.62
+ 48,0.126393,0.355517,0.212733,0.842256,0.143749,0.379142,0.222709,0.805212,2.50e-04,6.36
+ 49,0.121437,0.348479,0.211460,0.843520,0.142921,0.378049,0.221836,0.806334,2.50e-04,6.40
+ 50,0.126908,0.356242,0.210412,0.844906,0.142378,0.377330,0.221409,0.807070,2.50e-04,6.61
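Logs in this format are plain CSV, so picking out the best validation epoch takes only the standard library. A minimal sketch, with three rows copied from the E-test-12 log above standing in for the full 50-epoch file:

```python
import csv
import io

# Three rows copied from training_log.txt above; a real run would open the file.
log = """Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
1,0.752955,0.867730,0.511840,-0.000251,0.734296,0.856911,0.483321,0.004986,1.00e-03,12.18
40,0.136616,0.369616,0.223035,0.828608,0.151075,0.388683,0.229740,0.795285,2.50e-04,6.39
50,0.126908,0.356242,0.210412,0.844906,0.142378,0.377330,0.221409,0.807070,2.50e-04,6.61
"""

rows = list(csv.DictReader(io.StringIO(log)))
# Select the epoch with the highest validation R^2.
best = max(rows, key=lambda r: float(r["Val_R2"]))
print(best["Epoch"], best["Val_R2"])
```

For this run the final epoch is also the best on validation R², consistent with the loss still decreasing at epoch 50.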
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch30_mse0.1763_r20.7610_20251209_051212.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:19c4e1a0967821b6e8ba03e17ac0f2989adcf64658fa8b75e736c7273fc003c1
+ size 303054069
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch40_mse0.1511_r20.7953_20251209_051320.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c3870c4135accf69f25c8ae3945aa25b13a0a2c56502e49e74d4ad9f361d3ec0
+ size 303054069
Efficiency-Series (E-Base, E-Test-1~15)/E-test-12-FNO_v1.0+.E_h64_m1536_l4_e50_20251209_050844/model/fno_vE-test-12.1.0+.E_epoch50_mse0.1424_r20.8071_20251209_051428.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3580c83247509f7232c371a034bf4e6880140e32cd86320b9842503988cea948
+ size 303054069
Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (14400 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42305, 42316, 42328, 42339, 42350, 144039, 144050, 144061, 144073]
+ Last 10: [67852, 67864, 67875, 67886, 1254, 1265, 1276, 1288, 1299, 1310]
+
+ Validation Indices (3600 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84143, 84154, 84166, 84177, 84188, 118161, 118172, 118183, 118195]
+ Last 10: [71785, 71797, 71808, 71819, 16929, 16940, 16951, 16963, 16974, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
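The sample counts across these files are internally consistent if each ground motion carries 57 amplitude scales (an inference from the "_57" suffix of the GM file and from the fixed totals; it is not stated explicitly in the logs) and the training pool is split 80/20 into train/val. A small arithmetic sketch for the E-test-13 run:

```python
# Consistency check of the dataset sizes reported for E-test-13.
# Assumption (inferred, not stated): 57 scales exist per GM, and the
# train pool is split 80/20 into train/validation.
TOTAL_GMS, SCALES_AVAILABLE = 3474, 57
TEST_GMS, TRAIN_GMS, TRAIN_SCALES = 474, 3000, 6

total = TOTAL_GMS * SCALES_AVAILABLE           # "Total GM Count: 198018"
test = TEST_GMS * SCALES_AVAILABLE             # "Test Samples: 27018" (all scales kept)
pool = TRAIN_GMS * TRAIN_SCALES                # 18000 samples before splitting
train, val = int(pool * 0.8), int(pool * 0.2)  # "Train Samples: 14400" / "Validation Samples: 3600"
print(total, test, train, val)
```

The same arithmetic reproduces the other runs in this series, e.g. E-test-12 (600 GMs × 11 scales → 5280/1320) and E-test-2 (3000 GMs × 34 scales → 81600/20400).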
Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 3000
+ - Scales Used: 6
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.082831
+ RMSE: 0.287804
+ MAE: 0.161126
+ R² Score: 0.893295
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 905.12s
+ - Average Epoch Time: 18.10s
+ ============================================================
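The reported metrics in a results.txt block are mutually redundant in two places, which makes a quick sanity check possible: RMSE should be the square root of the MSE, and the average epoch time should be total time divided by epoch count. A minimal check against the E-test-13 numbers above:

```python
import math

# Values copied from the E-test-13 results.txt above.
mse, rmse = 0.082831, 0.287804
total_time_s, epochs = 905.12, 50

# RMSE is just sqrt(MSE); the reported values agree to reporting precision.
assert abs(math.sqrt(mse) - rmse) < 5e-6

# Average epoch time = total time / epochs (18.10s reported).
avg_epoch = round(total_time_s / epochs, 2)
print(avg_epoch)
```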
Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 3000
+ Scales Used per GM: 6
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 14400
+ Validation Samples: 3600
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-13-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_063342/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.789521,0.888550,0.497189,0.004831,0.779280,0.882768,0.491265,0.029133,1.00e-03,30.72
+ 2,0.717023,0.846772,0.480191,0.088808,0.624065,0.789978,0.447518,0.218594,1.00e-03,17.47
+ 3,0.572296,0.756502,0.434972,0.271583,0.488365,0.698831,0.401646,0.388659,1.00e-03,17.50
+ 4,0.447950,0.669291,0.390177,0.430583,0.396276,0.629504,0.369139,0.503975,1.00e-03,17.59
+ 5,0.362504,0.602083,0.357121,0.540333,0.336793,0.580339,0.337661,0.579431,1.00e-03,17.51
+ 6,0.319478,0.565224,0.333265,0.595398,0.304044,0.551401,0.319697,0.619760,1.00e-03,17.48
+ 7,0.286799,0.535537,0.313898,0.636308,0.273796,0.523256,0.303481,0.656572,1.00e-03,17.42
+ 8,0.261300,0.511176,0.299654,0.668664,0.252442,0.502436,0.291770,0.683478,1.00e-03,17.52
+ 9,0.239643,0.489533,0.288035,0.696314,0.231885,0.481545,0.280434,0.709561,1.00e-03,17.59
+ 10,0.219926,0.468963,0.275948,0.722259,0.212724,0.461220,0.269061,0.733670,1.00e-03,17.52
+ 11,0.200009,0.447224,0.263609,0.746730,0.193561,0.439955,0.257676,0.757493,1.00e-03,17.61
+ 12,0.182185,0.426831,0.251553,0.769667,0.179314,0.423455,0.248646,0.775398,1.00e-03,17.53
+ 13,0.166529,0.408079,0.240179,0.789325,0.164286,0.405323,0.234815,0.794155,1.00e-03,17.74
+ 14,0.153228,0.391443,0.228790,0.805999,0.153055,0.391223,0.225268,0.808310,1.00e-03,17.62
+ 15,0.142469,0.377450,0.219305,0.819591,0.144337,0.379917,0.219685,0.819008,1.00e-03,17.52
+ 16,0.134068,0.366152,0.212331,0.830431,0.136923,0.370030,0.213473,0.828062,1.00e-03,17.54
+ 17,0.126183,0.355222,0.206304,0.840020,0.128519,0.358496,0.203478,0.838628,1.00e-03,17.53
+ 18,0.119565,0.345782,0.198290,0.849447,0.124589,0.352971,0.200533,0.843650,1.00e-03,17.53
+ 19,0.114102,0.337790,0.193849,0.855994,0.117696,0.343068,0.194044,0.852241,1.00e-03,17.51
+ 20,0.108162,0.328880,0.188753,0.862983,0.113371,0.336707,0.189266,0.857536,5.00e-04,17.54
+ 21,0.103970,0.322443,0.182982,0.868501,0.110960,0.333107,0.186471,0.860667,5.00e-04,17.54
+ 22,0.101692,0.318892,0.180673,0.871490,0.109192,0.330442,0.184084,0.862902,5.00e-04,17.56
+ 23,0.099593,0.315583,0.178457,0.874085,0.107357,0.327654,0.182228,0.865182,5.00e-04,17.58
+ 24,0.097650,0.312490,0.176220,0.876672,0.105475,0.324768,0.180281,0.867513,5.00e-04,17.55
+ 25,0.095305,0.308716,0.174065,0.879088,0.103900,0.322336,0.178595,0.869503,5.00e-04,17.56
+ 26,0.094001,0.306596,0.172156,0.881336,0.102374,0.319959,0.176982,0.871418,5.00e-04,17.52
+ 27,0.092103,0.303484,0.170389,0.883520,0.100770,0.317443,0.175214,0.873404,5.00e-04,17.53
+ 28,0.090567,0.300944,0.168574,0.885632,0.099640,0.315659,0.173520,0.874874,5.00e-04,17.50
+ 29,0.089139,0.298561,0.166994,0.887569,0.098040,0.313114,0.172292,0.876842,5.00e-04,17.47
+ 30,0.087499,0.295802,0.165156,0.889567,0.096669,0.310917,0.170990,0.878513,5.00e-04,17.48
+ 31,0.085819,0.292948,0.163889,0.891360,0.095583,0.309164,0.169512,0.879941,5.00e-04,17.49
+ 32,0.084431,0.290571,0.162214,0.893122,0.094302,0.307086,0.168299,0.881532,5.00e-04,17.53
+ 33,0.083179,0.288407,0.160668,0.894883,0.093268,0.305398,0.166954,0.882875,5.00e-04,17.46
+ 34,0.082100,0.286532,0.159253,0.896516,0.092220,0.303677,0.166028,0.884151,5.00e-04,17.52
+ 35,0.080685,0.284051,0.158432,0.897935,0.091353,0.302246,0.164517,0.885282,5.00e-04,17.53
+ 36,0.079546,0.282038,0.156637,0.899539,0.090198,0.300330,0.163315,0.886735,5.00e-04,17.58
+ 37,0.078158,0.279568,0.155176,0.901102,0.089254,0.298754,0.162656,0.887908,5.00e-04,17.53
+ 38,0.076972,0.277437,0.154009,0.902531,0.088397,0.297316,0.161714,0.889005,5.00e-04,17.64
+ 39,0.076165,0.275980,0.152888,0.903892,0.087725,0.296183,0.160096,0.889926,5.00e-04,17.50
+ 40,0.075091,0.274027,0.151635,0.905164,0.086828,0.294665,0.159646,0.891008,2.50e-04,17.53
+ 41,0.074164,0.272330,0.150558,0.906377,0.086512,0.294129,0.158542,0.891458,2.50e-04,17.55
+ 42,0.073636,0.271359,0.149935,0.907036,0.086052,0.293346,0.158647,0.892021,2.50e-04,17.46
+ 43,0.073073,0.270321,0.149341,0.907685,0.085638,0.292640,0.157765,0.892540,2.50e-04,17.52
+ 44,0.072676,0.269584,0.148667,0.908293,0.085215,0.291916,0.157386,0.893038,2.50e-04,17.54
+ 45,0.072167,0.268639,0.148250,0.908824,0.084960,0.291479,0.157126,0.893403,2.50e-04,17.47
+ 46,0.071553,0.267495,0.147730,0.909422,0.084708,0.291046,0.156452,0.893749,2.50e-04,17.45
+ 47,0.071074,0.266598,0.147147,0.909994,0.084277,0.290305,0.156115,0.894275,2.50e-04,17.52
+ 48,0.070755,0.265997,0.146605,0.910601,0.083864,0.289592,0.155824,0.894775,2.50e-04,17.56
+ 49,0.070306,0.265153,0.146141,0.911183,0.083540,0.289033,0.155147,0.895192,2.50e-04,17.52
+ 50,0.069819,0.264232,0.145520,0.911754,0.083356,0.288714,0.154768,0.895469,2.50e-04,17.53
Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (7200 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42322, 42350, 144039, 144067, 144095, 16587, 16615, 16643, 171798]
+ Last 10: [172595, 103740, 103768, 103796, 67830, 67858, 67886, 1254, 1282, 1310]
+
+ Validation Indices (1800 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84160, 84188, 118161, 118189, 118217, 177897, 177925, 177953, 15903]
+ Last 10: [185933, 142500, 142528, 142556, 71763, 71791, 71819, 16929, 16957, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 3000
+ - Scales Used: 3
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.162886
+ RMSE: 0.403592
+ MAE: 0.243490
+ R² Score: 0.790131
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 461.52s
+ - Average Epoch Time: 9.23s
+ ============================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 3000
+ Scales Used per GM: 3
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 7200
+ Validation Samples: 1800
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-14-FNO_v1.0+.E_h64_m1536_l4_e50_20251210_064916/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.820923,0.906048,0.469079,0.000255,0.817786,0.904315,0.463599,0.009702,1.00e-03,15.51
+ 2,0.808860,0.899367,0.464444,0.015852,0.790292,0.888984,0.459249,0.042996,1.00e-03,8.89
+ 3,0.764084,0.874119,0.456935,0.067520,0.705140,0.839726,0.435769,0.146111,1.00e-03,8.69
+ 4,0.662514,0.813950,0.427202,0.189448,0.607822,0.779629,0.408851,0.263958,1.00e-03,8.88
+ 5,0.597832,0.773196,0.409040,0.271594,0.542960,0.736858,0.388790,0.342503,1.00e-03,8.67
+ 6,0.517554,0.719412,0.379527,0.368059,0.491777,0.701268,0.360659,0.404482,1.00e-03,8.88
+ 7,0.479564,0.692505,0.357601,0.415420,0.462319,0.679941,0.350876,0.440154,1.00e-03,8.71
+ 8,0.444814,0.666944,0.353500,0.456883,0.419228,0.647479,0.348633,0.492335,1.00e-03,8.87
+ 9,0.402107,0.634119,0.345542,0.509040,0.373137,0.610849,0.330335,0.548150,1.00e-03,8.67
+ 10,0.358335,0.598611,0.325033,0.563005,0.342385,0.585137,0.313684,0.585389,1.00e-03,8.93
+ 11,0.329448,0.573976,0.311247,0.597976,0.322577,0.567959,0.305862,0.609376,1.00e-03,8.65
+ 12,0.312600,0.559107,0.308734,0.619244,0.306980,0.554058,0.297626,0.628263,1.00e-03,8.91
+ 13,0.296667,0.544671,0.299759,0.638078,0.291837,0.540220,0.291733,0.646600,1.00e-03,8.68
+ 14,0.281940,0.530980,0.292204,0.656844,0.278887,0.528098,0.283150,0.662282,1.00e-03,8.88
+ 15,0.268527,0.518196,0.282274,0.673049,0.268596,0.518263,0.277076,0.674743,1.00e-03,8.65
+ 16,0.256308,0.506269,0.274496,0.687153,0.260328,0.510224,0.272685,0.684756,1.00e-03,8.90
+ 17,0.247213,0.497205,0.269147,0.698070,0.251470,0.501468,0.266088,0.695482,1.00e-03,8.67
+ 18,0.238649,0.488517,0.263315,0.709079,0.242974,0.492924,0.262334,0.705771,1.00e-03,8.87
+ 19,0.230411,0.480012,0.257720,0.719774,0.234993,0.484761,0.257589,0.715435,1.00e-03,8.65
+ 20,0.221444,0.470578,0.253174,0.729712,0.228218,0.477722,0.253189,0.723640,5.00e-04,8.94
+ 21,0.215064,0.463750,0.248973,0.737788,0.224190,0.473487,0.251275,0.728517,5.00e-04,8.71
+ 22,0.211294,0.459668,0.246955,0.742630,0.220586,0.469666,0.249001,0.732882,5.00e-04,8.92
+ 23,0.207090,0.455071,0.244178,0.747397,0.217179,0.466024,0.246754,0.737007,5.00e-04,8.67
+ 24,0.203472,0.451078,0.241868,0.752222,0.213277,0.461819,0.244815,0.741732,5.00e-04,8.90
+ 25,0.199585,0.446749,0.239600,0.756926,0.209843,0.458087,0.242467,0.745890,5.00e-04,8.65
+ 26,0.195503,0.442158,0.237036,0.761815,0.205973,0.453842,0.240470,0.750577,5.00e-04,8.92
+ 27,0.191777,0.437924,0.234922,0.766543,0.202393,0.449881,0.238319,0.754912,5.00e-04,8.65
+ 28,0.188130,0.433740,0.232438,0.771268,0.199021,0.446118,0.236147,0.758995,5.00e-04,8.82
+ 29,0.183715,0.428619,0.229981,0.775949,0.195307,0.441936,0.234237,0.763492,5.00e-04,8.75
+ 30,0.179873,0.424114,0.227736,0.780760,0.191897,0.438061,0.232053,0.767622,5.00e-04,8.93
+ 31,0.175891,0.419394,0.225366,0.785384,0.188404,0.434056,0.230211,0.771852,5.00e-04,8.72
+ 32,0.172019,0.414751,0.223002,0.790027,0.185004,0.430121,0.228102,0.775970,5.00e-04,8.89
+ 33,0.169010,0.411109,0.220860,0.794637,0.181573,0.426114,0.226191,0.780124,5.00e-04,8.66
+ 34,0.164806,0.405963,0.218550,0.799084,0.178370,0.422339,0.223939,0.784003,5.00e-04,8.89
+ 35,0.161266,0.401579,0.215988,0.803510,0.175220,0.418593,0.222248,0.787817,5.00e-04,8.67
+ 36,0.157674,0.397082,0.213850,0.807794,0.172460,0.415283,0.219827,0.791159,5.00e-04,8.95
+ 37,0.154310,0.392823,0.211228,0.811919,0.169227,0.411372,0.218381,0.795074,5.00e-04,8.70
+ 38,0.150932,0.388500,0.209293,0.815994,0.166539,0.408092,0.215976,0.798330,5.00e-04,8.92
+ 39,0.147601,0.384189,0.206966,0.819743,0.163700,0.404598,0.214356,0.801768,5.00e-04,8.69
+ 40,0.144907,0.380666,0.204653,0.823551,0.161113,0.401389,0.212650,0.804900,2.50e-04,8.87
+ 41,0.142193,0.377084,0.202948,0.826728,0.159923,0.399903,0.211485,0.806342,2.50e-04,8.68
+ 42,0.140838,0.375284,0.201588,0.828514,0.158711,0.398385,0.210673,0.807810,2.50e-04,8.86
+ 43,0.139476,0.373465,0.200650,0.830246,0.157500,0.396862,0.209808,0.809276,2.50e-04,8.67
+ 44,0.137851,0.371283,0.199534,0.831931,0.156432,0.395514,0.208911,0.810569,2.50e-04,8.87
+ 45,0.136424,0.369356,0.198458,0.833583,0.155274,0.394048,0.208203,0.811971,2.50e-04,8.68
+ 46,0.135372,0.367929,0.197591,0.835198,0.154301,0.392812,0.207320,0.813149,2.50e-04,8.92
+ 47,0.133745,0.365711,0.196510,0.836757,0.153118,0.391304,0.206585,0.814581,2.50e-04,8.66
+ 48,0.132553,0.364079,0.195600,0.838319,0.152130,0.390039,0.205619,0.815778,2.50e-04,8.88
+ 49,0.131495,0.362622,0.194423,0.839861,0.151094,0.388708,0.205033,0.817033,2.50e-04,8.62
+ 50,0.130371,0.361069,0.193579,0.841358,0.150103,0.387432,0.204121,0.818232,2.50e-04,8.97
Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (81600 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42296, 42297, 42299, 42301, 42302, 42304, 42306, 42308, 42309]
+ Last 10: [1295, 1296, 1298, 1300, 1302, 1303, 1305, 1307, 1308, 1310]
+
+ Validation Indices (20400 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84134, 84135, 84137, 84139, 84140, 84142, 84144, 84146, 84147]
+ Last 10: [16970, 16971, 16973, 16975, 16977, 16978, 16980, 16982, 16983, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]
Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/results.txt ADDED
@@ -0,0 +1,26 @@
+ Regression Test Results - Efficiency Analysis
+ ============================================================
+ Model Version: 1.0+_E (Standard FNO)
+
+ Efficiency Analysis:
+ - Train GMs Used: 3000
+ - Scales Used: 34
+
+ Model Configuration:
+ - Hidden Channels: 64
+ - Fourier Modes: 1536
+ - Number of Layers: 4
+ - Domain Padding: 0.1
+
+ REGRESSION (Floor Acceleration Response)
+ ------------------------------------------------------------
+ MSE Loss: 0.057237
+ RMSE: 0.239243
+ MAE: 0.123326
+ R² Score: 0.926272
+
+ ============================================================
+ Training Performance:
+ - Total Training Time: 4791.41s
+ - Average Epoch Time: 95.83s
+ ============================================================
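Taken together, the results files in this series trace out the accuracy/cost trade-off of varying the number of amplitude scales per ground motion at a fixed 3000 training GMs. A small sketch collecting three such runs (all values copied from the results.txt files above):

```python
# Test-set R^2 vs. training cost for three runs in this series
# (values copied from the corresponding results.txt files).
runs = {
    "E-test-14 (3000 GMs x 3 scales)":  {"train_samples": 7200,  "r2": 0.790131, "time_s": 461.52},
    "E-test-13 (3000 GMs x 6 scales)":  {"train_samples": 14400, "r2": 0.893295, "time_s": 905.12},
    "E-test-2 (3000 GMs x 34 scales)":  {"train_samples": 81600, "r2": 0.926272, "time_s": 4791.41},
}

# R^2 rises monotonically with training data here, but with diminishing
# returns: doubling the data buys ~0.10 R^2, the next ~5.7x only ~0.03 more.
ordered = sorted(runs.values(), key=lambda r: r["train_samples"])
gains = [b["r2"] - a["r2"] for a, b in zip(ordered, ordered[1:])]
print([round(g, 3) for g in gains])
```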
Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/training_config.txt ADDED
@@ -0,0 +1,72 @@
+ ======================================================================
+ FNO v1.0 Efficiency Analysis Configuration
+ ======================================================================
+
+ EFFICIENCY ANALYSIS PARAMETERS
+ ----------------------------------------------------------------------
+ Total GMs: 3474
+ Test Size (Fixed): 474
+ Train GMs Used: 3000
+ Scales Used per GM: 34
+ Random Seed: 42
+
+ MODEL ARCHITECTURE
+ ----------------------------------------------------------------------
+ Model Version: 1.0+_E
+ Model Type: Standard FNO (neuralop)
+ Fourier Modes: 1536
+ Hidden Channels: 64
+ Number of Layers: 4
+ Domain Padding: 0.1
+ Projection Ratio: 2
+ Input Channels: 1
+ Output Channels: 1
+ Grid Size: 3000
+ Total Parameters: 12,650,049
+ Trainable Parameters: 12,650,049
+
+ TRAINING PARAMETERS
+ ----------------------------------------------------------------------
+ Batch Size: 2560
+ Number of Epochs: 50
+ Learning Rate: 0.001
+ Weight Decay: 0.0001
+ Optimizer: AdamW
+ Loss Function: MSE Loss
+ LR Scheduler: StepLR
+ - Step Size: 20
+ - Gamma: 0.5
+ Checkpoint Interval: 10 epochs
+
+ DATASET CONFIGURATION
+ ----------------------------------------------------------------------
+ Train Samples: 81600
+ Validation Samples: 20400
+ Test Samples: 27018
+ Total GM Count: 198018
+
+ DATA PATHS
+ ----------------------------------------------------------------------
+ GM Path: /home/jason/SesimicTransformerData/MDOF/All_GMs/GMs_knet_3474_AF_57.h5
+ Buildings Dir: /home/jason/SesimicTransformerData/MDOF/knet-250/Data/fno
+ Buildings Files: 1 .h5 files
+ Output Directory: /home/jason/SeismicAssessment/output/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210
+
+ SYSTEM INFORMATION
+ ----------------------------------------------------------------------
+ Platform: linux
+ Device: cuda
+ PyTorch Version: 2.9.1+cu128
+ CUDA Available: True
+ CUDA Version: 12.8
+ GPU Count: 1
+ GPU Device: NVIDIA RTX PRO 6000 Blackwell Workstation Edition
+
+ ADDITIONAL NOTES
+ ----------------------------------------------------------------------
+ Run Test: True
+ Task Type: Regression (Floor Acceleration Response)
+ Compiled Model: True
+ DataLoader Workers: 0 (h5py compatibility)
+
+ ======================================================================
Efficiency-Series (E-Base, E-Test-1~15)/E-test-2-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_125210/details/training_log.txt ADDED
@@ -0,0 +1,51 @@
+ Epoch,Train_MSE,Train_RMSE,Train_MAE,Train_R2,Val_MSE,Val_RMSE,Val_MAE,Val_R2,Learning_Rate,Time(s)
+ 1,0.559734,0.748154,0.442959,0.292049,0.325001,0.570088,0.346644,0.589603,1.00e-03,105.10
+ 2,0.253526,0.503514,0.307686,0.679591,0.197246,0.444124,0.270430,0.750906,1.00e-03,99.56
+ 3,0.156805,0.395986,0.240629,0.801845,0.132264,0.363681,0.217139,0.832943,1.00e-03,98.53
+ 4,0.112849,0.335931,0.199447,0.857442,0.105284,0.324475,0.187113,0.867016,1.00e-03,98.90
+ 5,0.091969,0.303264,0.176091,0.883843,0.090835,0.301389,0.171937,0.885261,1.00e-03,98.49
+ 6,0.079590,0.282117,0.161163,0.899484,0.081738,0.285898,0.159998,0.896750,1.00e-03,96.64
+ 7,0.071805,0.267965,0.151551,0.909319,0.076325,0.276269,0.153292,0.903587,1.00e-03,95.46
+ 8,0.066154,0.257204,0.145326,0.916432,0.073427,0.270975,0.151020,0.907244,1.00e-03,95.43
+ 9,0.061478,0.247947,0.138744,0.922357,0.068654,0.262019,0.144037,0.913275,1.00e-03,95.22
+ 10,0.058039,0.240912,0.134465,0.926703,0.066919,0.258687,0.142184,0.915467,1.00e-03,95.11
+ 11,0.055282,0.235122,0.131282,0.930184,0.064853,0.254662,0.139314,0.918077,1.00e-03,94.94
+ 12,0.053175,0.230597,0.128804,0.932844,0.063375,0.251744,0.136697,0.919944,1.00e-03,95.46
+ 13,0.050600,0.224945,0.125536,0.936109,0.062665,0.250330,0.134638,0.920843,1.00e-03,94.87
+ 14,0.049091,0.221565,0.123295,0.938007,0.061619,0.248231,0.134676,0.922163,1.00e-03,94.80
+ 15,0.047525,0.218002,0.121072,0.940000,0.062310,0.249619,0.135261,0.921289,1.00e-03,94.52
+ 16,0.045741,0.213870,0.118726,0.942238,0.060136,0.245226,0.131542,0.924036,1.00e-03,95.33
+ 17,0.044574,0.211127,0.117764,0.943716,0.060115,0.245183,0.130351,0.924063,1.00e-03,95.52
+ 18,0.043172,0.207778,0.115322,0.945487,0.059513,0.243952,0.130568,0.924821,1.00e-03,95.25
+ 19,0.043237,0.207935,0.117773,0.945410,0.066759,0.258378,0.138362,0.915672,1.00e-03,95.33
+ 20,0.042647,0.206511,0.116086,0.946135,0.058945,0.242786,0.129255,0.925539,5.00e-04,95.54
+ 21,0.039364,0.198404,0.110002,0.950291,0.057703,0.240213,0.126802,0.927110,5.00e-04,95.31
+ 22,0.038651,0.196598,0.108897,0.951192,0.057582,0.239963,0.126602,0.927262,5.00e-04,95.15
+ 23,0.038123,0.195251,0.108212,0.951862,0.057497,0.239785,0.126297,0.927370,5.00e-04,95.17
+ 24,0.037725,0.194228,0.107706,0.952362,0.057429,0.239644,0.126362,0.927455,5.00e-04,94.76
+ 25,0.037505,0.193661,0.107358,0.952641,0.057417,0.239618,0.126523,0.927470,5.00e-04,95.24
+ 26,0.036927,0.192165,0.106623,0.953380,0.057325,0.239427,0.125948,0.927586,5.00e-04,94.93
+ 27,0.036431,0.190869,0.105975,0.953989,0.057371,0.239522,0.125422,0.927528,5.00e-04,94.09
+ 28,0.036207,0.190281,0.105665,0.954278,0.057228,0.239224,0.125170,0.927708,5.00e-04,95.32
+ 29,0.035649,0.188810,0.104892,0.954977,0.057078,0.238911,0.125133,0.927897,5.00e-04,95.27
+ 30,0.035031,0.187165,0.104100,0.955772,0.057098,0.238951,0.125095,0.927873,5.00e-04,95.03
+ 31,0.034615,0.186050,0.103590,0.956291,0.057253,0.239277,0.125669,0.927675,5.00e-04,95.44
+ 32,0.034369,0.185390,0.103241,0.956594,0.056941,0.238623,0.124556,0.928069,5.00e-04,95.19
+ 33,0.033905,0.184134,0.102643,0.957185,0.056944,0.238629,0.124563,0.928066,5.00e-04,94.14
+ 34,0.033501,0.183032,0.102058,0.957702,0.056845,0.238422,0.124195,0.928191,5.00e-04,94.94
+ 35,0.033115,0.181977,0.101538,0.958181,0.057004,0.238755,0.124288,0.927990,5.00e-04,95.35
+ 36,0.032827,0.181183,0.101137,0.958554,0.057416,0.239616,0.124393,0.927470,5.00e-04,94.62
+ 37,0.032820,0.181163,0.101332,0.958562,0.057019,0.238785,0.124492,0.927972,5.00e-04,94.91
+ 38,0.032142,0.179282,0.100266,0.959418,0.056789,0.238305,0.123895,0.928261,5.00e-04,95.20
+ 39,0.031792,0.178304,0.099739,0.959857,0.056585,0.237876,0.123536,0.928519,5.00e-04,95.30
+ 40,0.031634,0.177859,0.099518,0.960059,0.056662,0.238039,0.123530,0.928420,2.50e-04,95.26
+ 41,0.030744,0.175340,0.098207,0.961180,0.056584,0.237875,0.123051,0.928519,2.50e-04,95.02
+ 42,0.030508,0.174666,0.097841,0.961477,0.056738,0.238198,0.123020,0.928325,2.50e-04,94.97
+ 43,0.030374,0.174283,0.097669,0.961648,0.056731,0.238182,0.123190,0.928334,2.50e-04,95.52
+ 44,0.030158,0.173661,0.097384,0.961919,0.056647,0.238007,0.122991,0.928440,2.50e-04,95.02
+ 45,0.029967,0.173109,0.097125,0.962156,0.056684,0.238085,0.122847,0.928393,2.50e-04,95.60
+ 46,0.029821,0.172688,0.096899,0.962345,0.056761,0.238246,0.122881,0.928296,2.50e-04,95.05
+ 47,0.029665,0.172236,0.096701,0.962543,0.056759,0.238241,0.122853,0.928298,2.50e-04,95.47
+ 48,0.029511,0.171787,0.096448,0.962734,0.056767,0.238258,0.122700,0.928288,2.50e-04,95.19
+ 49,0.029339,0.171285,0.096197,0.962953,0.056795,0.238316,0.122869,0.928253,2.50e-04,95.31
+ 50,0.029245,0.171012,0.096090,0.963073,0.056972,0.238688,0.122774,0.928030,2.50e-04,94.41
Efficiency-Series (E-Base, E-Test-1~15)/E-test-3-FNO_v1.0+.E_h64_m1536_l4_e50_20251208_141228/details/dataset_indices.txt ADDED
@@ -0,0 +1,21 @@
+ Dataset Split Indices
+ ============================================================
+
+ Random Seed: 42
+ Split Ratio (Train:Val:Test): [0, 0, 0]
+ Total GM Count: 198018
+
+ Train Indices (55200 samples):
+ Range: [114, 197960]
+ First 10: [42294, 42297, 42299, 42302, 42304, 42307, 42309, 42312, 42314, 42317]
+ Last 10: [1287, 1290, 1292, 1295, 1297, 1300, 1302, 1305, 1307, 1310]
+
+ Validation Indices (13800 samples):
+ Range: [0, 198017]
+ First 10: [84132, 84135, 84137, 84140, 84142, 84145, 84147, 84150, 84152, 84155]
+ Last 10: [16962, 16965, 16967, 16970, 16972, 16975, 16977, 16980, 16982, 16985]
+
+ Test Indices (27018 samples):
+ Range: [57, 197276]
+ First 10: [193401, 193402, 193403, 193404, 193405, 193406, 193407, 193408, 193409, 193410]
+ Last 10: [180965, 180966, 180967, 180968, 180969, 180970, 180971, 180972, 180973, 180974]