Edge-Medium (85M)
An 85-million-parameter language model trained entirely from scratch.
The proof-of-concept that launched the Edge series. Built on Apple Silicon to validate our sovereign training pipeline before scaling to billions of parameters.
Overview
Edge-Medium is the first completed model in the Edge series by AXe Technologies: a compact transformer trained from scratch, with no pre-trained weights, no fine-tuning, and no transfer learning.
| Property | Value |
|---|---|
| Parameters | 85,446,912 |
| Architecture | Proprietary transformer |
| Training | From scratch (complete) |
| Hardware | Apple Silicon (Metal acceleration) |
| Status | ✅ Training complete |
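The exact architecture is not disclosed, but the headline figure can be put in context with the standard parameter-count formula for a GPT-style decoder: tied embeddings contribute `vocab_size × d_model` parameters, and each layer contributes roughly `12 × d_model²` (attention plus a 4×-wide MLP). The configuration below is purely hypothetical; it lands near 85M but is not Edge-Medium's real shape and does not reproduce the exact 85,446,912 figure.

```python
def transformer_params(vocab_size: int, d_model: int, n_layers: int) -> int:
    """Approximate parameter count for a GPT-style decoder with tied
    input/output embeddings, ignoring biases and layer norms.

    Per layer: 4*d^2 for attention (Q, K, V, output projections)
    plus 8*d^2 for a 4x-wide MLP (up and down projections).
    """
    embeddings = vocab_size * d_model
    per_layer = 12 * d_model * d_model
    return embeddings + n_layers * per_layer

# Hypothetical config in the ~85M range (NOT Edge-Medium's actual shape):
print(transformer_params(32768, 640, 13))  # 84869120, i.e. ~84.9M
```

Even at this scale, the per-layer term dominates the embedding term, which is why small changes in width or depth move the total by millions of parameters.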
Purpose
Edge-Medium served as the architectural proving ground for the Edge series. It validated:
- Our from-scratch training pipeline on consumer hardware
- Architectural decisions later scaled to Edge-1.3B
- Data processing and tokenization infrastructure
- Evaluation and benchmarking methodology
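The evaluation and benchmarking methodology itself is not disclosed. For illustration only, a common baseline metric for from-scratch language models is held-out perplexity: the exponential of the mean per-token negative log-likelihood. This minimal sketch assumes the per-token losses have already been computed elsewhere:

```python
import math

def perplexity(token_nlls: list[float]) -> float:
    """Held-out perplexity: exp of the mean per-token negative
    log-likelihood (natural log). Lower is better; a perfect
    model that assigns probability 1 to every token scores 1.0."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# A model assigning each token probability 1/10 has perplexity 10:
print(perplexity([math.log(10)] * 4))  # 10.0 (up to float rounding)
```

Whether AXe's internal benchmark suite uses perplexity or task-level evaluations is not stated in this document.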
The Edge Series
| Model | Parameters | Status |
|---|---|---|
| Edge-Medium | 85M | ✅ Complete |
| Edge-1.3B | 1.3B | 🚧 Training |
| Edge-3 | Planned | Architecture phase |
Access
Model weights are available for approved researchers and partners. Request access below or contact us directly.
Training
Trained from scratch using proprietary infrastructure on Apple Silicon. Training methodology and architectural details are not publicly disclosed.
About AXe Technologies
Canadian AI research lab focused on sovereign, privacy-first artificial intelligence. We build models that run on your hardware, trained on our hardware. No cloud. No compromise.
Open to collaboration: contact us for evaluation access and partnership inquiries.
Built in Canada 🇨🇦 on Apple Silicon