<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>EdgeFirst AI — Spatial Perception at the Edge</title>
<base target="_blank">
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Barlow:wght@300;400;500;600;700&family=Crimson+Text:wght@400;600&display=swap" rel="stylesheet">
<style>
:root {
--navy: #3E3371;
--gold: #E8B820;
--teal: #1FA0A8;
--teal-text: #167A80;
--bg: #FFFFFF;
--bg-subtle: #F8F9FA;
--text: #343A40;
--text-strong: #212529;
--text-muted: #6C757D;
--border: #E9ECEF;
--heading: var(--navy);
--link: var(--teal-text);
}
@media (prefers-color-scheme: dark) {
:root {
--bg: #1a1a2e;
--bg-subtle: #16213e;
--text: #F1F3F5;
--text-strong: #FFFFFF;
--text-muted: #aaa;
--border: rgba(255,255,255,0.1);
--heading: #FFFFFF;
--link: #1FA0A8;
}
}
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: 'Crimson Text', Georgia, serif;
background: var(--bg);
color: var(--text);
line-height: 1.7;
font-size: 16px;
}
.container { max-width: 720px; margin: 0 auto; padding: 2.5rem 2rem; text-align: center; }
h1, h2, h3 {
font-family: 'Barlow', -apple-system, sans-serif;
color: var(--heading);
line-height: 1.3;
}
h1 { font-size: 2.2rem; font-weight: 700; margin-bottom: 0.15rem; }
h1 .edge { color: var(--heading); }
h1 .first { color: var(--gold); }
h2 {
font-size: 1.05rem; font-weight: 600;
text-transform: uppercase; letter-spacing: 0.06em;
margin-top: 2rem; margin-bottom: 0.6rem;
padding-bottom: 0.3rem; border-bottom: 2px solid var(--gold);
text-align: left;
}
.tagline {
font-family: 'Barlow', sans-serif; font-weight: 500;
letter-spacing: 0.12em; text-transform: uppercase;
color: var(--text-muted); font-size: 0.85rem; margin-bottom: 1.25rem;
}
p { margin-bottom: 1rem; text-align: left; }
a { color: var(--link); text-decoration: none; }
a:hover { text-decoration: underline; color: var(--navy); }
@media (prefers-color-scheme: dark) { a:hover { color: var(--gold); } }
.link-badges { display: flex; flex-wrap: wrap; gap: 0.4rem; justify-content: center; margin: 1.25rem 0; }
.link-badges img { height: 28px; }
.badges { display: flex; flex-wrap: wrap; gap: 0.4rem; margin: 0.75rem 0; }
.badges img { height: 22px; }
.model-table {
width: 100%;
border-collapse: collapse;
margin: 0.6rem 0;
font-family: 'Barlow', sans-serif;
font-size: 0.9rem;
text-align: left;
}
.model-table th {
padding: 0.4rem 0.6rem;
font-weight: 600; font-size: 0.78rem;
text-transform: uppercase; letter-spacing: 0.04em;
color: var(--text-muted);
border-bottom: 2px solid var(--border);
}
.model-table td {
padding: 0.35rem 0.6rem;
border-bottom: 1px solid var(--border);
}
.model-table td:first-child { font-weight: 600; color: var(--text-strong); white-space: nowrap; }
.links-table {
width: 100%;
border-collapse: collapse;
margin: 0.6rem 0;
font-family: 'Barlow', sans-serif;
font-size: 0.9rem;
text-align: left;
}
.links-table td {
padding: 0.3rem 0.6rem;
border-bottom: 1px solid var(--border);
}
.links-table td:first-child { font-weight: 600; color: var(--text-strong); white-space: nowrap; width: 120px; }
.cta {
display: inline-block;
font-family: 'Barlow', sans-serif;
font-weight: 600; font-size: 0.95rem;
color: var(--link);
margin: 0.5rem 0;
}
.footer {
margin-top: 2.5rem; padding-top: 1.25rem;
border-top: 1px solid var(--border);
text-align: center;
font-family: 'Barlow', sans-serif;
font-size: 0.8rem; color: var(--text-muted);
}
.footer a { color: var(--text-muted); }
.footer a:hover { color: var(--link); }
</style>
</head>
<body>
<div class="container">
<h1><span class="edge">Edge</span><span class="first">First</span> AI</h1>
<p class="tagline">Spatial Perception at the Edge</p>
<p style="text-align:center;">Open-source libraries and microservices for AI-driven spatial perception on embedded devices.</p>
<div class="link-badges">
<a href="https://edgefirst.studio"><img src="https://img.shields.io/badge/EdgeFirst_Studio-3E3371?style=for-the-badge&logoColor=white" alt="EdgeFirst Studio"></a>
<a href="https://github.com/EdgeFirstAI"><img src="https://img.shields.io/badge/GitHub-212529?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"></a>
<a href="https://doc.edgefirst.ai"><img src="https://img.shields.io/badge/Docs-1FA0A8?style=for-the-badge&logo=readthedocs&logoColor=white" alt="Docs"></a>
<a href="https://edgefirst.ai"><img src="https://img.shields.io/badge/edgefirst.ai-E8B820?style=for-the-badge&logoColor=333" alt="Website"></a>
</div>
<h2>About</h2>
<p>
<strong>Au-Zone Technologies</strong> builds EdgeFirst &mdash; a comprehensive platform for deploying AI perception on edge devices. We support cameras, LiDAR, radar, and time-of-flight sensors for real-time object detection, segmentation, sensor fusion, and 3D spatial understanding.
</p>
<p>
Our stack spans four layers: <strong>Foundation</strong> (hardware abstraction and inference delegates), <strong>Zenoh</strong> (pub/sub microservices), <strong>GStreamer</strong> (media pipeline plugins), and <strong>ROS 2</strong> (robotics integration). All layers are released under the <strong>Apache 2.0</strong> license.
</p>
<h2>Model Zoo</h2>
<p>Pre-trained models validated on real edge hardware, with full-dataset accuracy metrics and per-device timing breakdowns.</p>
<a class="cta" href="https://huggingface.co/spaces/EdgeFirst/Models">Browse the EdgeFirst Model Zoo &rarr;</a>
<table class="model-table">
<tr><th>Task</th><th>Models</th></tr>
<tr>
<td>Detection</td>
<td><a href="https://huggingface.co/EdgeFirst/yolo26-det">YOLO26</a> &middot; <a href="https://huggingface.co/EdgeFirst/yolo11-det">YOLO11</a> &middot; <a href="https://huggingface.co/EdgeFirst/yolov8-det">YOLOv8</a> &middot; <a href="https://huggingface.co/EdgeFirst/yolov5-det">YOLOv5</a></td>
</tr>
<tr>
<td>Segmentation</td>
<td><a href="https://huggingface.co/EdgeFirst/yolo26-seg">YOLO26</a> &middot; <a href="https://huggingface.co/EdgeFirst/yolo11-seg">YOLO11</a> &middot; <a href="https://huggingface.co/EdgeFirst/yolov8-seg">YOLOv8</a></td>
</tr>
</table>
<h2>Supported Hardware</h2>
<div class="badges">
<img src="https://img.shields.io/badge/NXP-i.MX_8M_Plus-3E3371?style=flat-square&logoColor=white" alt="NXP i.MX 8M Plus">
<img src="https://img.shields.io/badge/NXP-i.MX_95-3E3371?style=flat-square&logoColor=white" alt="NXP i.MX 95">
<img src="https://img.shields.io/badge/NXP-Ara240-3E3371?style=flat-square&logoColor=white" alt="NXP Ara240">
<img src="https://img.shields.io/badge/RPi5-Hailo--8%2F8L-1FA0A8?style=flat-square&logoColor=white" alt="RPi5 + Hailo">
<img src="https://img.shields.io/badge/NVIDIA-Jetson-76B900?style=flat-square&logoColor=white" alt="NVIDIA Jetson">
</div>
<h2>EdgeFirst Studio</h2>
<p>
<a href="https://edgefirst.studio"><strong>EdgeFirst Studio</strong></a> is our MLOps platform for the complete perception development lifecycle &mdash; dataset management, model training, INT8 quantization, on-target validation, and deployment. <strong>Free tier available.</strong>
</p>
<h2>Links</h2>
<table class="links-table">
<tr><td>Website</td><td><a href="https://edgefirst.ai">edgefirst.ai</a></td></tr>
<tr><td>Studio</td><td><a href="https://edgefirst.studio">edgefirst.studio</a></td></tr>
<tr><td>GitHub</td><td><a href="https://github.com/EdgeFirstAI">github.com/EdgeFirstAI</a></td></tr>
<tr><td>Documentation</td><td><a href="https://doc.edgefirst.ai">doc.edgefirst.ai</a></td></tr>
<tr><td>Model Zoo</td><td><a href="https://huggingface.co/spaces/EdgeFirst/Models">EdgeFirst/Models</a></td></tr>
<tr><td>Company</td><td><a href="https://www.au-zone.com">Au-Zone Technologies</a></td></tr>
</table>
<div class="footer">
<p>Apache 2.0 &middot; &copy; <a href="https://www.au-zone.com">Au-Zone Technologies Inc.</a></p>
</div>
</div>
</body>
</html>