---
title: EdgeFirst AI
emoji: 🔬
colorFrom: indigo
colorTo: red
sdk: static
pinned: true
license: apache-2.0
---

# EdgeFirst AI

**Spatial Perception at the Edge**

Open-source libraries and microservices for AI-driven spatial perception on embedded devices.

EdgeFirst Studio · GitHub · Documentation · Website


## About

Au-Zone Technologies builds EdgeFirst — a comprehensive platform for deploying AI perception on edge devices. We support cameras, LiDAR, radar, and time-of-flight sensors for real-time object detection, segmentation, sensor fusion, and 3D spatial understanding.

Our stack spans four layers: Foundation (hardware abstraction and inference delegates), Zenoh (pub/sub microservices), GStreamer (media pipeline plugins), and ROS 2 integration. All layers are released under the Apache 2.0 license.
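The Zenoh layer connects microservices through topic-based publish/subscribe. As an illustration of that pattern only — this is plain Python with a hypothetical topic name, not the actual zenoh client API or EdgeFirst's service code:

```python
from collections import defaultdict
from typing import Callable

class Bus:
    """Toy in-process pub/sub bus illustrating the topic model.
    The real Zenoh layer does this over the network with key expressions."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[bytes], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[bytes], None]) -> None:
        # Register a handler to be called for every message on `topic`.
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload: bytes) -> None:
        # Deliver the payload to every subscriber of `topic`.
        for handler in self._subs[topic]:
            handler(payload)

# E.g. a detection service publishes boxes; a fusion service subscribes.
bus = Bus()
received: list[bytes] = []
bus.subscribe("camera/detections", received.append)  # hypothetical topic name
bus.publish("camera/detections", b'{"boxes": []}')
```

The decoupling shown here is why the stack can mix sensors freely: a radar or LiDAR service publishes on its own topic without knowing who consumes it.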

## Model Zoo

Pre-trained models validated on real edge hardware with full-dataset accuracy metrics and detailed timing breakdowns per device.

Browse the EdgeFirst Model Zoo →

| Task | Models |
|------|--------|
| Detection | YOLO26 · YOLO11 · YOLOv8 · YOLOv5 |
| Instance Segmentation | YOLO26 · YOLO11 · YOLOv8 |

## Supported Hardware

NXP i.MX 8M Plus · NXP i.MX 95 · NXP Ara240 · RPi5 + Hailo · NVIDIA Jetson

## EdgeFirst Studio

EdgeFirst Studio is our MLOps platform for the complete perception development lifecycle — dataset management, model training, INT8 quantization, on-target validation, and deployment. A free tier is available.
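The INT8 quantization step maps floating-point tensors to 8-bit integers with a scale and zero-point. A generic sketch of the affine quantization arithmetic involved — not Studio's actual implementation, and the ReLU6-style range below is just an example:

```python
def quantize(x: float, scale: float, zero_point: int) -> int:
    """Affine INT8 quantization: q = clamp(round(x / scale) + zp, -128, 127)."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q: int, scale: float, zero_point: int) -> float:
    """Inverse mapping back to float: x ≈ (q - zp) * scale."""
    return (q - zero_point) * scale

# Example: activations in [0, 6] (ReLU6-style range), asymmetric mapping.
scale = 6.0 / 255.0
zero_point = -128
q = quantize(3.0, scale, zero_point)
x = dequantize(q, scale, zero_point)  # recovers ~3.0, within one scale step
```

The round-trip error is bounded by the scale, which is why per-tensor range calibration matters before deploying an INT8 model.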

## Links