sebastientaylor committed on
Commit ea800d9 · verified · 1 Parent(s): 1362a07

Update org landing page with EdgeFirst branding and model zoo links

Files changed (2)
  1. README.md +33 -6
  2. index.html +175 -0
README.md CHANGED
@@ -1,10 +1,37 @@
 ---
-title: README
-emoji: 📚
-colorFrom: gray
-colorTo: blue
+title: EdgeFirst AI
+emoji: 🔬
+colorFrom: indigo
+colorTo: red
 sdk: static
-pinned: false
+pinned: true
+license: apache-2.0
 ---
 
-Edit this `README.md` markdown file to author your organization card.
+# EdgeFirst AI: Spatial Perception at the Edge
+
+Open-source libraries and microservices for AI-driven spatial perception on edge devices.
+
+**EdgeFirst Perception** supports cameras, LiDAR, radar, and time-of-flight sensors — enabling real-time object detection, segmentation, sensor fusion, and 3D spatial understanding, all optimized for resource-constrained embedded hardware.
+
+## Architecture
+
+| Layer | Description | Status |
+|-------|-------------|--------|
+| **Foundation** | Hardware abstraction, video I/O, accelerated inference delegates | Stable |
+| **Zenoh** | Modular perception pipeline over Zenoh pub/sub | Stable |
+| **GStreamer** | Spatial perception elements for GStreamer / NNStreamer | Stable |
+| **ROS 2** | Native ROS 2 nodes extending Zenoh microservices | Roadmap |
+
+## Supported Hardware
+
+NXP i.MX 8M Plus | NXP i.MX 93 | NXP i.MX 95 | NXP Ara240 | RPi5 + Hailo-8/8L | NVIDIA Jetson
+
+## Links
+
+- [EdgeFirst Studio](https://edgefirst.studio) — MLOps platform (free tier available)
+- [GitHub](https://github.com/EdgeFirstAI) — Open-source repositories
+- [Documentation](https://doc.edgefirst.ai) — Full documentation
+- [Au-Zone Technologies](https://www.au-zone.com) — Company
+
+Apache 2.0 | Au-Zone Technologies Inc.
index.html ADDED
@@ -0,0 +1,175 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+  <meta charset="UTF-8">
+  <meta name="viewport" content="width=device-width, initial-scale=1.0">
+  <title>EdgeFirst AI — Spatial Perception at the Edge</title>
+  <style>
+    :root {
+      --primary: #1a1a2e;
+      --accent: #0f3460;
+      --highlight: #e94560;
+      --text: #eee;
+      --muted: #aaa;
+      --card-bg: #16213e;
+    }
+    * { margin: 0; padding: 0; box-sizing: border-box; }
+    body {
+      font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
+      background: var(--primary);
+      color: var(--text);
+      line-height: 1.6;
+    }
+    .container { max-width: 900px; margin: 0 auto; padding: 2rem; }
+    h1 { font-size: 2rem; margin-bottom: 0.25rem; }
+    h2 { font-size: 1.3rem; margin-top: 2rem; margin-bottom: 0.75rem; color: var(--highlight); }
+    .tagline { color: var(--muted); font-size: 1.1rem; margin-bottom: 1.5rem; }
+    .description { margin-bottom: 1.5rem; }
+    .badges { display: flex; flex-wrap: wrap; gap: 0.5rem; margin: 1.5rem 0; }
+    .badge {
+      background: var(--accent);
+      padding: 0.3rem 0.75rem;
+      border-radius: 4px;
+      font-size: 0.85rem;
+      white-space: nowrap;
+    }
+    .links { display: flex; gap: 1rem; margin: 1.5rem 0; flex-wrap: wrap; }
+    .links a {
+      color: var(--highlight);
+      text-decoration: none;
+      padding: 0.5rem 1rem;
+      border: 1px solid var(--highlight);
+      border-radius: 4px;
+      transition: all 0.2s;
+    }
+    .links a:hover { background: var(--highlight); color: #fff; }
+    .arch-table { width: 100%; border-collapse: collapse; margin: 1rem 0; }
+    .arch-table th, .arch-table td {
+      text-align: left;
+      padding: 0.5rem 0.75rem;
+      border-bottom: 1px solid rgba(255,255,255,0.1);
+    }
+    .arch-table th { color: var(--highlight); font-size: 0.85rem; text-transform: uppercase; }
+    .status { font-size: 0.8rem; padding: 0.15rem 0.5rem; border-radius: 3px; }
+    .stable { background: #1b5e20; }
+    .roadmap { background: #4a148c; }
+    .model-grid { display: grid; grid-template-columns: repeat(auto-fill, minmax(250px, 1fr)); gap: 1rem; margin: 1rem 0; }
+    .model-card {
+      background: var(--card-bg);
+      padding: 1rem;
+      border-radius: 6px;
+      border-left: 3px solid var(--highlight);
+    }
+    .model-card h3 { font-size: 1rem; margin-bottom: 0.25rem; }
+    .model-card .meta { color: var(--muted); font-size: 0.85rem; }
+    .model-card a { color: var(--highlight); text-decoration: none; }
+    .footer { margin-top: 3rem; text-align: center; color: var(--muted); font-size: 0.85rem; }
+    .footer a { color: var(--muted); }
+  </style>
+</head>
+<body>
+  <div class="container">
+    <h1>EdgeFirst AI</h1>
+    <p class="tagline">Spatial Perception at the Edge</p>
+
+    <p class="description">
+      <strong>EdgeFirst Perception</strong> is a comprehensive suite of open-source libraries and microservices for building AI-driven spatial perception systems on edge devices. It supports cameras, LiDAR, radar, and time-of-flight sensors &mdash; enabling real-time object detection, segmentation, sensor fusion, and 3D spatial understanding, all optimized for resource-constrained embedded hardware.
+    </p>
+
+    <div class="links">
+      <a href="https://edgefirst.studio">EdgeFirst Studio</a>
+      <a href="https://github.com/EdgeFirstAI">GitHub</a>
+      <a href="https://doc.edgefirst.ai">Documentation</a>
+      <a href="https://www.au-zone.com">Au-Zone Technologies</a>
+    </div>
+
+    <h2>Supported Hardware</h2>
+    <div class="badges">
+      <span class="badge">NXP i.MX 8M Plus</span>
+      <span class="badge">NXP i.MX 93</span>
+      <span class="badge">NXP i.MX 95</span>
+      <span class="badge">NXP Ara240</span>
+      <span class="badge">RPi5 + Hailo-8/8L</span>
+      <span class="badge">NVIDIA Jetson</span>
+    </div>
+
+    <h2>Architecture</h2>
+    <table class="arch-table">
+      <tr>
+        <th>Layer</th>
+        <th>Description</th>
+        <th>Status</th>
+      </tr>
+      <tr>
+        <td><strong>Foundation</strong></td>
+        <td>Hardware abstraction, video I/O, accelerated inference delegates</td>
+        <td><span class="status stable">Stable</span></td>
+      </tr>
+      <tr>
+        <td><strong>Zenoh</strong></td>
+        <td>Modular perception pipeline over Zenoh pub/sub</td>
+        <td><span class="status stable">Stable</span></td>
+      </tr>
+      <tr>
+        <td><strong>GStreamer</strong></td>
+        <td>Spatial perception elements for GStreamer / NNStreamer</td>
+        <td><span class="status stable">Stable</span></td>
+      </tr>
+      <tr>
+        <td><strong>ROS 2</strong></td>
+        <td>Native ROS 2 nodes extending Zenoh microservices</td>
+        <td><span class="status roadmap">Roadmap</span></td>
+      </tr>
+    </table>
+
+    <h2>Model Zoo</h2>
+    <p>Pre-trained YOLO models optimized for edge deployment &mdash; ONNX FP32 and TFLite INT8 with platform-specific compiled variants.</p>
+
+    <div class="model-grid">
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolov8-det">YOLOv8 Detection</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano mAP@0.5: 50.2%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolov8-seg">YOLOv8 Segmentation</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano Mask mAP: 34.1%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolo11-det">YOLO11 Detection</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano mAP@0.5: 53.4%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolo11-seg">YOLO11 Segmentation</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano Mask mAP: 35.5%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolo26-det">YOLO26 Detection</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano mAP@0.5: 54.9%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolo26-seg">YOLO26 Segmentation</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano Mask mAP: 37.0%</p>
+      </div>
+      <div class="model-card">
+        <h3><a href="https://huggingface.co/EdgeFirst/yolov5-det">YOLOv5 Detection</a></h3>
+        <p class="meta">5 sizes &middot; COCO 80 classes &middot; Nano mAP@0.5: 49.6%</p>
+      </div>
+    </div>
+
+    <h2>EdgeFirst Studio</h2>
+    <p>
+      <a href="https://edgefirst.studio"><strong>EdgeFirst Studio</strong></a> is the companion SaaS platform for the complete perception development lifecycle. <strong>Free tier available.</strong>
+    </p>
+    <ul style="margin: 0.75rem 0 0 1.5rem; color: var(--muted);">
+      <li>Dataset management &amp; AI-assisted annotation</li>
+      <li>YOLO model training with automatic INT8 quantization</li>
+      <li>CameraAdaptor integration for native sensor format training</li>
+      <li>One-click deployment to edge devices</li>
+    </ul>
+
+    <div class="footer">
+      <p>Apache 2.0 &middot; &copy; <a href="https://www.au-zone.com">Au-Zone Technologies Inc.</a></p>
+    </div>
+  </div>
+</body>
+</html>
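The model-zoo cards above quote mAP@0.5 figures, which score detections by Intersection-over-Union matching against ground truth at a 0.5 threshold. A minimal, illustrative IoU sketch in plain Python — not EdgeFirst code; the `(x1, y1, x2, y2)` corner layout is an assumption for the example:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero so disjoint boxes yield no intersection area.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two unit squares overlapping in a 0.5-wide strip: inter 0.5, union 1.5.
print(iou((0, 0, 1, 1), (0.5, 0, 1.5, 1)))  # ≈ 0.333
```

At mAP@0.5, a prediction with IoU of at least 0.5 against a same-class ground-truth box counts as a true positive.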
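The TFLite INT8 variants mentioned in the Model Zoo use affine quantization, where a real value maps to a signed 8-bit integer via `real = scale * (q - zero_point)` (per the TensorFlow Lite 8-bit quantization scheme). A small sketch of that mapping; the `scale` and `zero_point` values below are hypothetical, not taken from any EdgeFirst model:

```python
def dequantize(q: int, scale: float, zero_point: int) -> float:
    """TFLite affine dequantization: real = scale * (q - zero_point)."""
    return scale * (q - zero_point)

def quantize(x: float, scale: float, zero_point: int) -> int:
    """Inverse mapping, clamped to the signed INT8 range [-128, 127]."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

# Hypothetical parameters for a tensor calibrated to the range [0, 1]:
scale, zp = 1 / 255, -128
q = quantize(0.5, scale, zp)
print(q, dequantize(q, scale, zp))
```

Round-tripping through the 8-bit grid loses at most half a quantization step (`scale / 2`), which is why calibration during INT8 export matters for the accuracy figures quoted above.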