## Repository

https://github.com/alexcpn/elevation_transformer
## Evaluation

Training dataset: alexcpn/longely_rice_model

### Accuracy

Metrics are self-reported on the longely_rice_model validation set.
| Metric | Value |
|---|---|
| RMSE | 17.85 dB |
| MAE | 10.94 dB |
| Median Error | 5.00 dB |
| 90th Percentile Error | 31.02 dB |
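The four metrics above can all be derived from the per-sample absolute errors between predicted and reference path loss. A minimal sketch, using synthetic path-loss values in dB rather than the model's actual outputs:

```python
# Sketch: computing RMSE, MAE, median and 90th-percentile absolute error
# from predicted vs. reference path-loss values (synthetic example data).
import numpy as np

def error_metrics(y_true, y_pred):
    """Return RMSE, MAE, median and 90th-percentile absolute error (dB)."""
    err = np.abs(np.asarray(y_pred) - np.asarray(y_true))
    return {
        "rmse": float(np.sqrt(np.mean(err ** 2))),
        "mae": float(np.mean(err)),
        "median": float(np.median(err)),
        "p90": float(np.percentile(err, 90)),
    }

# Hypothetical path-loss samples in dB, not taken from the dataset.
y_true = [120.0, 135.0, 142.0, 128.0]
y_pred = [118.0, 140.0, 141.0, 130.0]
print(error_metrics(y_true, y_pred))
```

Note that RMSE weights large outliers more heavily than MAE, which is why the reported RMSE (17.85 dB) sits well above the median error (5.00 dB).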
### Runtime

Current runtime artifacts are in `eval/`.
| Engine | Time / sample | Throughput |
|---|---|---|
| Direct ITM | 11.0 µs | 91,082 pred/s |
| Transformer | 1314.8 µs | 761 pred/s |
The current repo validates the concept that attention can learn the ITM mapping, but it does not yet outperform native ITM in runtime.
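The throughput figures above follow directly from per-sample latency (throughput ≈ 1 / time-per-sample). A minimal benchmarking sketch, where `predict` is a hypothetical stand-in for either the direct ITM call or the transformer forward pass, not the repo's actual API:

```python
# Sketch: micro-benchmark deriving per-sample latency and throughput
# for a prediction function. `predict` is a placeholder, not the real model.
import time

def predict(sample):
    # Stand-in for a single ITM or transformer prediction.
    return sample * 0.5

def benchmark(samples, fn):
    """Time fn over all samples; return (microseconds/sample, predictions/s)."""
    start = time.perf_counter()
    for s in samples:
        fn(s)
    elapsed = time.perf_counter() - start
    per_sample_us = elapsed / len(samples) * 1e6
    throughput = len(samples) / elapsed
    return per_sample_us, throughput

us, tput = benchmark(list(range(10_000)), predict)
print(f"{us:.1f} us/sample, {tput:,.0f} pred/s")
```

At the reported 11.0 µs per direct ITM call versus 1314.8 µs per transformer call, the transformer is roughly 120x slower, consistent with the ~761 vs ~91,000 pred/s figures.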