JasonXF committed cad95eb (verified) · Parent(s): ccb8ee2

Update README.md

Files changed (1): README.md (+163 −3)
---
tags:
- fourier-neural-operator
- seismic
- structural-engineering
- time-series
- regression
- pytorch
library_name: pytorch
pipeline_tag: other
---

# SeFNO — Seismic Floor Acceleration Response Prediction (FNO v1.0+)

Pre-trained **Fourier Neural Operator (FNO)** models for predicting multi-floor acceleration response time histories of MDOF shear buildings subjected to seismic ground motions.

**Code:** [github.com/HKUJasonJiang/Seismic-FNO](https://github.com/HKUJasonJiang/Seismic-FNO)
**Dataset:** [<!-- TODO: HuggingFace dataset link -->]

---

## Task

Given a scaled ground-motion acceleration time series (3,000 time steps at 50 Hz), the model predicts the roof-floor acceleration response of a target building — a pure **regression** task over 1-D signals.

| Input | Shape | Description |
|-------|-------|-------------|
| Ground motion | `(1, 3000)` | Scaled accelerogram (m/s²) |

| Output | Shape | Description |
|--------|-------|-------------|
| Floor acceleration | `(1, 3000)` | Roof acceleration response (m/s²) |
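As a quick sanity check on this I/O contract (a sketch; the zero array is a placeholder for a real record), 3,000 samples at 50 Hz span a 60 s window, and the network consumes the signal in `(batch, channel, time)` layout:

```python
import numpy as np

FS = 50          # sampling rate (Hz)
N_STEPS = 3000   # time steps per record

duration_s = N_STEPS / FS  # 60.0 s window per record

# A single scaled accelerogram (m/s^2) in the documented (1, 3000) shape,
# then batched to (batch, channel, time) for the network.
gm = np.zeros((1, N_STEPS), dtype=np.float32)
x = gm[np.newaxis, ...]  # (1, 1, 3000)

print(duration_s, x.shape)
```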

---

## Available Models

### Baseline Models

Three FNO configurations trained for 50 epochs on the full KNET dataset (3,474 GMs × 57 amplitude scale factors, 250 building configurations):

| Folder | Hidden (`h`) | Modes (`m`) | Layers (`l`) | Parameters |
|--------|:-----------:|:-----------:|:------------:|:----------:|
| `Base-FNO_v1.0+_h64_m64_l4_e50_*/` | 64 | 64 | 4 | ~3 M |
| `Large-FNO_v1.0+_h64_m512_l8_e50_*/` | 64 | 512 | 8 | ~12 M |
| `Huge-FNO_v1.0+_h128_m1024_l12_e50_*/` | 128 | 1024 | 12 | ~48 M |

### Experimental Series

| Folder | Runs | Purpose |
|--------|:----:|---------|
| `Test-Series (Test-1~10)/` | 10 | Hyper-parameter sweep (modes, layers, hidden channels) |
| `Efficiency-Series (E-Base, E-Test-1~15)/` | 16 | Dataset-size ablation (varying number of GMs and scale factors) |

Each model folder contains:
```
<model_folder>/
├── model/
│   └── fno_best.pth            # Best checkpoint (lowest validation loss)
└── details/
    ├── training_log.csv        # Epoch-by-epoch MSE / RMSE / MAE / R²
    ├── training_config.txt     # Full hyperparameter configuration
    ├── dataset_indices.pkl     # Reproducible train / val / test split indices
    └── test_results.txt        # Final test-set metrics
```
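The per-epoch log can be mined for the best epoch with the standard library alone. A sketch: the column names `epoch` and `val_mse` are assumptions here (an in-memory sample stands in for the real file), so check the header of your `training_log.csv` first:

```python
import csv
import io

# Stand-in for open("details/training_log.csv"); a real file is read the same way.
sample = io.StringIO(
    "epoch,train_mse,val_mse\n"
    "1,0.90,0.80\n"
    "2,0.40,0.35\n"
    "3,0.30,0.42\n"
)

rows = list(csv.DictReader(sample))
best = min(rows, key=lambda r: float(r["val_mse"]))
print("best epoch:", best["epoch"], "val MSE:", best["val_mse"])
```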

---

## Usage

### Install dependencies

```bash
git clone https://github.com/HKUJasonJiang/Seismic-FNO.git
cd Seismic-FNO
pip install -r requirement.txt          # Windows
# pip install -r requirement_linux.txt  # Linux
```

### Load a checkpoint and run inference

```python
import numpy as np
import torch
from neuralop.models import FNO


def load_fno(checkpoint_path, n_modes, hidden_channels, n_layers, device="cuda"):
    model = FNO(
        n_modes=(n_modes,),
        hidden_channels=hidden_channels,
        in_channels=1,
        out_channels=1,
        n_layers=n_layers,
        projection_channel_ratio=2,
        domain_padding=0.1,
    )
    ckpt = torch.load(checkpoint_path, map_location=device, weights_only=False)
    state = ckpt.get("model_state_dict", ckpt)
    # Strip the torch.compile wrapper prefix if the checkpoint was saved from a compiled model.
    state = {(k[len("_orig_mod."):] if k.startswith("_orig_mod.") else k): v
             for k, v in state.items()}
    model.load_state_dict(state)
    return model.to(device).eval()


device = "cuda" if torch.cuda.is_available() else "cpu"

# --- Base model ---
model = load_fno(
    checkpoint_path="output/Base-FNO_v1.0+_h64_m64_l4_e50_.../model/fno_best.pth",
    n_modes=64,
    hidden_channels=64,
    n_layers=4,
    device=device,
)

# --- Inference on a single ground motion ---
gm_array = np.load("my_ground_motion.npy")  # shape (3000,), unit m/s²
x = torch.from_numpy(gm_array).float().unsqueeze(0).unsqueeze(0).to(device)  # (1, 1, 3000)

with torch.no_grad():
    pred = model(x).cpu().numpy().squeeze()  # shape (3000,)

print("Predicted roof acceleration shape:", pred.shape)
```
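The MSE / RMSE / MAE / R² figures logged in `test_results.txt` can be reproduced from a prediction and its ground truth with plain NumPy. A sketch with synthetic arrays standing in for real records (R² computed as 1 − SS_res / SS_tot):

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.standard_normal(3000)                  # ground-truth roof acceleration (stand-in)
y_pred = y_true + 0.1 * rng.standard_normal(3000)   # stand-in model prediction

err = y_true - y_pred
mse = np.mean(err ** 2)
rmse = np.sqrt(mse)
mae = np.mean(np.abs(err))
r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  MAE={mae:.4f}  R2={r2:.4f}")
```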

### Quick review notebook

Open `quick_inference.ipynb` in the cloned repository to run inference on the held-out test set and visualise time-history and Fourier amplitude spectrum comparisons interactively.
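The Fourier amplitude spectrum comparison in that notebook reduces to a real FFT at the 50 Hz sampling rate. A minimal sketch (the sine wave is a placeholder for a predicted or recorded signal):

```python
import numpy as np

FS = 50  # sampling rate (Hz)
t = np.arange(3000) / FS
signal = np.sin(2 * np.pi * 2.0 * t)  # placeholder record with a 2 Hz component

# One-sided spectrum: frequencies run from 0 Hz up to the 25 Hz Nyquist limit.
freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
amp = np.abs(np.fft.rfft(signal)) / signal.size

peak_hz = freqs[np.argmax(amp[1:]) + 1]  # skip the DC bin
print("dominant frequency:", peak_hz, "Hz")
```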

---

## Training Details

| Property | Value |
|----------|-------|
| Framework | PyTorch ≥ 2.0 + [neuralop](https://github.com/neuraloperator/neuraloperator) |
| Loss | MSE |
| Optimizer | AdamW |
| LR schedule | StepLR (step=20, γ=0.5) |
| Batch size | 2,560 |
| Epochs | 50 (baseline models) |
| Train / Val / Test split | 70 % / 20 % / 10 % (grouped by GM to prevent leakage) |
| Dataset | KNET — 3,474 GMs × 57 amplitude scale factors × 250 buildings |
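Grouping the split by ground motion means every scaled copy of a GM lands in the same partition, which is what prevents leakage between train and test. A stdlib-only sketch of the idea (the 70/20/10 ratios follow the table above; the IDs are synthetic, not the real KNET indices):

```python
import random

gm_ids = list(range(100))            # stand-ins for the 3,474 KNET ground motions
random.Random(42).shuffle(gm_ids)    # fixed seed => reproducible split indices

n_train = int(0.7 * len(gm_ids))
n_val = int(0.2 * len(gm_ids))
train_gms = set(gm_ids[:n_train])
val_gms = set(gm_ids[n_train:n_train + n_val])
test_gms = set(gm_ids[n_train + n_val:])

# Every (gm, scale_factor) sample inherits its ground motion's partition, so no
# scaled copy of a training record can leak into validation or test.
print(len(train_gms), len(val_gms), len(test_gms))
```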

---

## Citation

If you use these models, please cite (forthcoming):

```bibtex
@misc{jiang2025sefno,
  title  = {SeismicFNO: Fourier Neural Operators for Seismic Structural Response Prediction},
  author = {Jason Jiang},
  year   = {2025},
  url    = {https://github.com/HKUJasonJiang/Seismic-FNO}
}
```

## License

[Creative Commons Attribution 4.0 (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/)