Fix: use absolute resolve URLs for images on org page
README.md
CHANGED
@@ -21,17 +21,17 @@ license: apache-2.0
 
 ## Workflow
 
-<img src="01-ecosystem.png" alt="EdgeFirst Model Zoo Ecosystem"/>
+<img src="https://huggingface.co/spaces/EdgeFirst/README/resolve/main/01-ecosystem.png" alt="EdgeFirst Model Zoo Ecosystem"/>
 
 Every model in the EdgeFirst Model Zoo passes through a validated pipeline. [**EdgeFirst Studio**](https://edgefirst.studio) manages datasets, training, multi-format export (ONNX, TFLite INT8, eIQ Neutron, Kinara DVM, HailoRT HEF, TensorRT), and reference validation. Models are then deployed to our board farm for **full-dataset on-target validation** on real hardware — measuring both accuracy (mAP) and detailed timing breakdown per device. Results are published here on HuggingFace with per-platform performance tables.
 
 ## Model Lifecycle
 
-<img src="02-model-lifecycle.png" alt="Model Lifecycle: Training to Publication"/>
+<img src="https://huggingface.co/spaces/EdgeFirst/README/resolve/main/02-model-lifecycle.png" alt="Model Lifecycle: Training to Publication"/>
 
 ## On-Target Validation
 
-<img src="03-on-target-validation.png" alt="On-Target Validation Pipeline"/>
+<img src="https://huggingface.co/spaces/EdgeFirst/README/resolve/main/03-on-target-validation.png" alt="On-Target Validation Pipeline"/>
 
 Unlike desktop-only benchmarks, EdgeFirst validates every model on **real target hardware** with the full dataset. Each device produces both accuracy metrics (mAP) and a detailed timing breakdown — load, preprocessing, NPU inference, and decode — so you know exactly how a model performs on your specific platform.
 
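The change swaps each relative image path for an absolute HuggingFace `resolve` URL, which serves the raw file regardless of which page renders the README. A minimal sketch of that URL pattern, assuming a Space repo (the helper name is hypothetical, not part of any HF library):

```python
def resolve_url(repo_id: str, path: str, revision: str = "main") -> str:
    """Build an absolute HuggingFace resolve URL for a file in a Space repo.

    Relative srcs like "01-ecosystem.png" resolve against whatever page the
    README is embedded in (e.g. the org page), so the commit pins each image
    to the /resolve/<revision>/ endpoint instead.
    """
    return f"https://huggingface.co/spaces/{repo_id}/resolve/{revision}/{path}"

print(resolve_url("EdgeFirst/README", "01-ecosystem.png"))
# → https://huggingface.co/spaces/EdgeFirst/README/resolve/main/01-ecosystem.png
```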