<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>EdgeFirst AI — Spatial Perception at the Edge</title>
<style>
:root {
--primary: #1a1a2e;
--accent: #0f3460;
--highlight: #e94560;
--text: #eee;
--muted: #aaa;
--card-bg: #16213e;
}
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
background: var(--primary);
color: var(--text);
line-height: 1.6;
}
.container { max-width: 900px; margin: 0 auto; padding: 2rem; }
h1 { font-size: 2rem; margin-bottom: 0.25rem; }
h2 { font-size: 1.3rem; margin-top: 2rem; margin-bottom: 0.75rem; color: var(--highlight); }
.tagline { color: var(--muted); font-size: 1.1rem; margin-bottom: 1.5rem; }
.description { margin-bottom: 1.5rem; }
.badges { display: flex; flex-wrap: wrap; gap: 0.5rem; margin: 1.5rem 0; }
.badge {
background: var(--accent);
padding: 0.3rem 0.75rem;
border-radius: 4px;
font-size: 0.85rem;
white-space: nowrap;
}
.links { display: flex; gap: 1rem; margin: 1.5rem 0; flex-wrap: wrap; }
.links a {
color: var(--highlight);
text-decoration: none;
padding: 0.5rem 1rem;
border: 1px solid var(--highlight);
border-radius: 4px;
transition: all 0.2s;
}
.links a:hover { background: var(--highlight); color: #fff; }
.arch-table { width: 100%; border-collapse: collapse; margin: 1rem 0; }
.arch-table th, .arch-table td {
text-align: left;
padding: 0.5rem 0.75rem;
border-bottom: 1px solid rgba(255,255,255,0.1);
}
.arch-table th { color: var(--highlight); font-size: 0.85rem; text-transform: uppercase; }
.status { font-size: 0.8rem; padding: 0.15rem 0.5rem; border-radius: 3px; }
.stable { background: #1b5e20; }
.roadmap { background: #4a148c; }
.model-grid { display: grid; grid-template-columns: repeat(auto-fill, minmax(250px, 1fr)); gap: 1rem; margin: 1rem 0; }
.model-card {
background: var(--card-bg);
padding: 1rem;
border-radius: 6px;
border-left: 3px solid var(--highlight);
}
.model-card h3 { font-size: 1rem; margin-bottom: 0.25rem; }
.model-card .meta { color: var(--muted); font-size: 0.85rem; }
.model-card a { color: var(--highlight); text-decoration: none; }
.footer { margin-top: 3rem; text-align: center; color: var(--muted); font-size: 0.85rem; }
.footer a { color: var(--muted); }
</style>
</head>
<body>
<div class="container">
<h1>EdgeFirst AI</h1>
<p class="tagline">Spatial Perception at the Edge</p>
|
|
<p class="description">
<strong>EdgeFirst Perception</strong> is a comprehensive suite of open-source libraries and microservices for building AI-driven spatial perception systems on edge devices. It supports cameras, LiDAR, radar, and time-of-flight sensors, enabling real-time object detection, segmentation, sensor fusion, and 3D spatial understanding, all optimized for resource-constrained embedded hardware.
</p>
|
|
<div class="links">
<a href="https://edgefirst.studio">EdgeFirst Studio</a>
<a href="https://github.com/EdgeFirstAI">GitHub</a>
<a href="https://doc.edgefirst.ai">Documentation</a>
<a href="https://www.au-zone.com">Au-Zone Technologies</a>
</div>
|
|
<h2>Supported Hardware</h2>
<div class="badges">
<span class="badge">NXP i.MX 8M Plus</span>
<span class="badge">NXP i.MX 93</span>
<span class="badge">NXP i.MX 95</span>
<span class="badge">NXP Ara240</span>
<span class="badge">RPi5 + Hailo-8/8L</span>
<span class="badge">NVIDIA Jetson</span>
</div>
|
|
<h2>Architecture</h2>
<table class="arch-table">
<tr>
<th>Layer</th>
<th>Description</th>
<th>Status</th>
</tr>
<tr>
<td><strong>Foundation</strong></td>
<td>Hardware abstraction, video I/O, accelerated inference delegates</td>
<td><span class="status stable">Stable</span></td>
</tr>
<tr>
<td><strong>Zenoh</strong></td>
<td>Modular perception pipeline over Zenoh pub/sub</td>
<td><span class="status stable">Stable</span></td>
</tr>
<tr>
<td><strong>GStreamer</strong></td>
<td>Spatial perception elements for GStreamer / NNStreamer</td>
<td><span class="status stable">Stable</span></td>
</tr>
<tr>
<td><strong>ROS 2</strong></td>
<td>Native ROS 2 nodes extending the Zenoh microservices</td>
<td><span class="status roadmap">Roadmap</span></td>
</tr>
</table>
|
|
<h2>Model Zoo</h2>
<p>Pre-trained YOLO models optimized for edge deployment, available as ONNX FP32 and TFLite INT8 with platform-specific compiled variants.</p>
|
|
<div class="model-grid">
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolov8-det">YOLOv8 Detection</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano mAP@0.5: 50.2%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolov8-seg">YOLOv8 Segmentation</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano Mask mAP: 34.1%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolo11-det">YOLO11 Detection</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano mAP@0.5: 53.4%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolo11-seg">YOLO11 Segmentation</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano Mask mAP: 35.5%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolo26-det">YOLO26 Detection</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano mAP@0.5: 54.9%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolo26-seg">YOLO26 Segmentation</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano Mask mAP: 37.0%</p>
</div>
<div class="model-card">
<h3><a href="https://huggingface.co/EdgeFirst/yolov5-det">YOLOv5 Detection</a></h3>
<p class="meta">5 sizes · COCO 80 classes · Nano mAP@0.5: 49.6%</p>
</div>
</div>
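The mAP@0.5 figures quoted above count a detection as correct when its intersection-over-union (IoU) with a ground-truth box exceeds 0.5. A minimal IoU computation for axis-aligned boxes, shown purely as an illustration (this helper is not part of the EdgeFirst APIs):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle (empty when the boxes do not overlap).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two 2x2 boxes overlapping in a 1x1 region: IoU = 1 / (4 + 4 - 1)
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # → 0.142857…, below the 0.5 threshold
```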
|
|
<h2>EdgeFirst Studio</h2>
<p>
<a href="https://edgefirst.studio"><strong>EdgeFirst Studio</strong></a> is the companion SaaS platform for the complete perception development lifecycle. <strong>Free tier available.</strong>
</p>
<ul style="margin: 0.75rem 0 0 1.5rem; color: var(--muted);">
<li>Dataset management &amp; AI-assisted annotation</li>
<li>YOLO model training with automatic INT8 quantization</li>
<li>CameraAdaptor integration for native sensor format training</li>
<li>One-click deployment to edge devices</li>
</ul>
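The INT8 quantization mentioned above maps FP32 tensors to 8-bit integers through an affine scale and zero point. A simplified sketch of that arithmetic, under the assumption of a per-tensor scheme (the function names and parameters here are illustrative, not Studio's actual implementation):

```python
def quantize_int8(values, scale, zero_point):
    """Affine INT8 quantization: q = clamp(round(v / scale) + zp, -128, 127)."""
    return [max(-128, min(127, round(v / scale) + zero_point)) for v in values]

def dequantize_int8(quantized, scale, zero_point):
    """Recover approximate floats: v ~ (q - zp) * scale."""
    return [(q - zero_point) * scale for q in quantized]

# Values outside the representable range (|v| > 127 * scale) saturate.
weights = [0.0, 0.12, -0.5, 1.0]
q = quantize_int8(weights, scale=0.004, zero_point=0)
print(q)                               # quantized codes; 1.0 clamps to 127
print(dequantize_int8(q, 0.004, 0))    # round-trip within one scale step
```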
|
|
<div class="footer">
<p>Apache 2.0 · © <a href="https://www.au-zone.com">Au-Zone Technologies Inc.</a></p>
</div>
</div>
</body>
</html>
|
|