pokkiri committed on
Commit
519a27e
·
verified ·
1 Parent(s): c9425b6

Upload folder using huggingface_hub

Files changed (6)
  1. README.md +132 -0
  2. inference.py +167 -0
  3. model.pt +3 -0
  4. model.py +67 -0
  5. model_package.pkl +3 -0
  6. requirements.txt +8 -0
README.md ADDED
@@ -0,0 +1,132 @@
---
language: en
license: mit
library_name: pytorch
tags:
- biomass
- remote-sensing
- satellite-imagery
- deep-learning
- regression
---

# StableResNet Biomass Prediction Model

This model predicts above-ground biomass (AGB) from multi-spectral satellite imagery on a per-pixel basis.

## Model Description

- **Developed by:** pokkiri
- **Model type:** StableResNet
- **Date:** 2025-05-17
- **License:** MIT

### Input

The model takes multi-spectral satellite data as input, processing features extracted from each pixel.
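To make the per-pixel framing concrete, a stacked multi-band raster of shape (bands, height, width) can be flattened into a (pixels, features) matrix before scaling. This is only a sketch: the band count below is a made-up placeholder, and the real number and ordering of bands must match the model's `n_features`.

```python
import numpy as np

# Hypothetical stacked raster: 12 bands, 64x64 pixels.
# The real band count and ordering must match the features used in training.
bands, height, width = 12, 64, 64
raster = np.random.rand(bands, height, width).astype(np.float32)

# One row of features per pixel: (bands, H, W) -> (H*W, bands)
features = raster.reshape(bands, -1).T
print(features.shape)  # (4096, 12)
```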
### Output

The model outputs biomass predictions in Mg/ha (megagrams per hectare).

## Training Data

The model was trained on multi-spectral satellite imagery from various forest regions, including:
- Sentinel-1 (SAR)
- Sentinel-2 (optical)
- Landsat-8 (optical)
- PALSAR (SAR)

Ground-truth biomass values were derived from field measurements.

## Performance

The model achieves strong performance in predicting biomass across diverse landscapes:
- RMSE: ~25 Mg/ha
- R²: ~0.85
- MAE: ~18 Mg/ha
## Training Procedure

The model was trained with:
- MSE loss function
- Adam optimizer with learning rate 0.001
- Early stopping based on validation performance
- Log transformation of target values to stabilize training
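The log transform and its inverse can be sketched as follows. The `epsilon` offset here is an assumed default; the value actually used is stored in `model_package.pkl`.

```python
import numpy as np

epsilon = 1.0  # assumed offset; the real value is stored in model_package.pkl
biomass = np.array([0.0, 10.0, 150.0, 480.0])  # Mg/ha

# Forward transform applied to targets during training
log_target = np.log(biomass + epsilon)

# Inverse transform applied to model outputs at inference time
recovered = np.maximum(np.exp(log_target) - epsilon, 0)
print(recovered)  # round-trips back to the original values
```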
## How to Use

```python
from huggingface_hub import hf_hub_download
import torch
import joblib
import numpy as np
from model import StableResNet  # model.py from this repository must be on your path

# Download model files
model_path = hf_hub_download(repo_id="pokkiri/biomass-model", filename="model.pt")
package_path = hf_hub_download(repo_id="pokkiri/biomass-model", filename="model_package.pkl")

# Load model and package
package = joblib.load(package_path)
model = StableResNet(n_features=package['n_features'])
model.load_state_dict(torch.load(model_path, map_location="cpu"))
model.eval()

# Example inference function
def predict_biomass(features):
    # Scale features
    features_scaled = package['scaler'].transform(features)

    # Convert to tensor
    tensor = torch.tensor(features_scaled, dtype=torch.float32)

    # Make prediction
    with torch.no_grad():
        output = model(tensor).numpy()
    output = np.atleast_1d(output)  # guard against a 0-d result for single samples

    # Convert from log scale if needed
    if package.get('use_log_transform', True):
        output = np.exp(output) - package.get('epsilon', 1.0)
        output = np.maximum(output, 0)  # Ensure non-negative

    return output

# Example: predict for a single pixel with features
example_features = np.random.rand(1, package['n_features'])  # Replace with actual features
biomass = predict_biomass(example_features)
print(f"Predicted biomass: {biomass[0]:.2f} Mg/ha")
```

For GeoTIFF processing, use the included inference.py:

```python
from inference import predict_from_geotiff

# Process a multi-band satellite image
biomass_map = predict_from_geotiff(
    "satellite_image.tif",
    "output_biomass.tif",
    repo_id="pokkiri/biomass-model"
)
```

## Limitations

- The model performs best on forest types similar to those in the training data
- Very high biomass values (>500 Mg/ha) may be underestimated
- Requires calibrated multi-spectral satellite imagery as input

## Citation

```bibtex
@misc{biomass_model,
  author = {pokkiri},
  title = {StableResNet Biomass Prediction Model},
  year = {2025},
  publisher = {HuggingFace},
  howpublished = {https://huggingface.co/pokkiri/biomass-model}
}
```
inference.py ADDED
@@ -0,0 +1,167 @@
"""
Inference code for StableResNet Biomass Prediction Model
Provides utility functions for making predictions with the model

Author: najahpokkiri
Date: 2025-05-17
"""
import torch
import numpy as np
import joblib
from model import StableResNet
from huggingface_hub import hf_hub_download


def load_model_from_hub(repo_id="najahpokkiri/biomass-model"):
    """Load model from HuggingFace Hub"""
    # Download files from HuggingFace
    model_path = hf_hub_download(repo_id=repo_id, filename="model.pt")
    package_path = hf_hub_download(repo_id=repo_id, filename="model_package.pkl")

    # Load package with metadata
    package = joblib.load(package_path)
    n_features = package['n_features']

    # Initialize model
    model = StableResNet(n_features=n_features)
    model.load_state_dict(torch.load(model_path, map_location='cpu'))
    model.eval()

    return model, package


def load_model_local(model_path, package_path):
    """Load model from local files"""
    # Load package with metadata
    package = joblib.load(package_path)
    n_features = package['n_features']

    # Initialize model
    model = StableResNet(n_features=n_features)
    model.load_state_dict(torch.load(model_path, map_location='cpu'))
    model.eval()

    return model, package


def predict_biomass(model, features, package):
    """Predict biomass from a (n_samples, n_features) feature array"""
    # Get metadata
    scaler = package['scaler']
    use_log_transform = package.get('use_log_transform', True)
    epsilon = package.get('epsilon', 1.0)

    # Scale features
    features_scaled = scaler.transform(features)

    # Convert to tensor
    tensor = torch.tensor(features_scaled, dtype=torch.float32)

    # Make prediction
    with torch.no_grad():
        output = model(tensor).numpy()
    output = np.atleast_1d(output)  # the model squeezes its output; keep single samples 1-d

    # Convert from log scale if needed
    if use_log_transform:
        output = np.exp(output) - epsilon
        output = np.maximum(output, 0)  # Ensure non-negative

    return output


def predict_from_geotiff(tiff_path, output_path=None, model=None, package=None, repo_id="najahpokkiri/biomass-model"):
    """Predict biomass from a GeoTIFF file"""
    try:
        import rasterio
    except ImportError:
        raise ImportError("rasterio is required for GeoTIFF processing. Install with 'pip install rasterio'.")

    # Load model if not provided
    if model is None or package is None:
        model, package = load_model_from_hub(repo_id)

    with rasterio.open(tiff_path) as src:
        # Read image data
        data = src.read()
        height, width = data.shape[1], data.shape[2]

        # Predict in chunks to bound the size of each batch
        chunk_size = 1000
        predictions = np.zeros((height, width), dtype=np.float32)

        # Create mask for valid pixels
        valid_mask = np.all(np.isfinite(data), axis=0)

        # Process image in chunks
        for y_start in range(0, height, chunk_size):
            y_end = min(y_start + chunk_size, height)

            for x_start in range(0, width, chunk_size):
                x_end = min(x_start + chunk_size, width)

                # Get chunk mask
                chunk_mask = valid_mask[y_start:y_end, x_start:x_end]
                if not np.any(chunk_mask):
                    continue

                # Extract features for all valid pixels at once: (n_valid, n_bands)
                valid_y, valid_x = np.where(chunk_mask)
                pixel_features = data[:, y_start + valid_y, x_start + valid_x].T

                # Make predictions and insert them back into the image
                batch_predictions = predict_biomass(model, pixel_features, package)
                predictions[y_start + valid_y, x_start + valid_x] = batch_predictions

        # Save predictions if output path is provided
        if output_path:
            meta = src.meta.copy()
            meta.update(dtype='float32', count=1, nodata=0)

            with rasterio.open(output_path, 'w', **meta) as dst:
                dst.write(predictions, 1)

            print(f"Saved biomass predictions to: {output_path}")

    return predictions


def example():
    """Example usage"""
    print("StableResNet Biomass Prediction Example")
    print("-" * 40)

    # Option 1: Load from HuggingFace Hub
    print("Loading model from HuggingFace Hub...")
    model, package = load_model_from_hub("najahpokkiri/biomass-model")

    # Option 2: Load from local files
    # model, package = load_model_local("model.pt", "model_package.pkl")

    print(f"Model loaded. Expecting {package['n_features']} features")

    # Example: Create synthetic features for demonstration
    n_features = package['n_features']
    example_features = np.random.rand(5, n_features)

    print("\nPredicting biomass for 5 sample points...")
    predictions = predict_biomass(model, example_features, package)

    for i, pred in enumerate(predictions):
        print(f"Sample {i+1}: {pred:.2f} Mg/ha")

    print("\nTo process a GeoTIFF file:")
    print("predictions = predict_from_geotiff('your_image.tif', 'output_biomass.tif')")


if __name__ == "__main__":
    example()
model.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:717dc1332822323a162a97d666607a340cde8ea96bd43f575d3e176874795fc8
size 791075
model.py ADDED
@@ -0,0 +1,67 @@
"""
StableResNet Model for Biomass Prediction
A numerically stable ResNet architecture for regression tasks

Author: najahpokkiri
Date: 2025-05-17
"""
import torch
import torch.nn as nn


class StableResNet(nn.Module):
    """Numerically stable ResNet for biomass regression"""

    def __init__(self, n_features, dropout=0.2):
        super().__init__()

        self.input_proj = nn.Sequential(
            nn.Linear(n_features, 256),
            nn.LayerNorm(256),
            nn.ReLU(),
            nn.Dropout(dropout)
        )

        self.layer1 = self._make_simple_resblock(256, 256)
        self.layer2 = self._make_simple_resblock(256, 128)
        self.layer3 = self._make_simple_resblock(128, 64)

        self.regressor = nn.Sequential(
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, 1)
        )

        self._init_weights()

    def _make_simple_resblock(self, in_dim, out_dim):
        # When dimensions match: a two-layer block suitable for a skip connection.
        # Otherwise: a single projection block (forward applies no skip around it).
        return nn.Sequential(
            nn.Linear(in_dim, out_dim),
            nn.BatchNorm1d(out_dim),
            nn.ReLU(),
            nn.Linear(out_dim, out_dim),
            nn.BatchNorm1d(out_dim),
            nn.ReLU()
        ) if in_dim == out_dim else nn.Sequential(
            nn.Linear(in_dim, out_dim),
            nn.BatchNorm1d(out_dim),
            nn.ReLU(),
        )

    def _init_weights(self):
        for m in self.modules():
            if isinstance(m, nn.Linear):
                nn.init.kaiming_normal_(m.weight, mode='fan_in', nonlinearity='relu')
                if m.bias is not None:
                    nn.init.zeros_(m.bias)

    def forward(self, x):
        x = self.input_proj(x)

        # Residual connection around the only dimension-preserving block
        identity = x
        out = self.layer1(x)
        x = out + identity

        x = self.layer2(x)
        x = self.layer3(x)

        x = self.regressor(x)
        return x.squeeze(-1)  # squeeze only the output dim so a batch of 1 stays 1-d
model_package.pkl ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1afa0720b4409a58c1fcf2121960a4073d1baae616232e58001b1f73cf1c5bbb
size 2950
requirements.txt ADDED
@@ -0,0 +1,8 @@
torch>=1.10.0
numpy>=1.20.0
joblib>=1.1.0
scikit-learn>=1.0.0  # needed to unpickle the scaler in model_package.pkl
rasterio>=1.2.0
huggingface_hub>=0.10.0
matplotlib>=3.5.0