Yuuki - Mobile-Trained Code Language Model
Copyright 2026 OpceanAI
This product includes a language model trained entirely on a mobile device
(Qualcomm Snapdragon 685) over 42 days with zero GPU budget.

Training Details:
- Base model: DistilGPT-2 (82M parameters)
- Training period: January-March 2026
- Hardware: Android device (Snapdragon 685, 6GB RAM)
- Dataset: The Stack (75,000-example subset for v0.1)
- Total cost: $0 in cloud/GPU compute

Third-party Components:
- Transformers library by Hugging Face (Apache-2.0)
- PyTorch (BSD-3-Clause)
- The Stack dataset by BigCode (BigCode OpenRAIL-M)
- DistilGPT-2 base model (Apache-2.0)

Special Thanks:
- My Snapdragon 685
- Hugging Face for infrastructure
- The ML community