Yuuki - Mobile-Trained Code Language Model
Copyright 2026 OpceanAI
This product includes a language model trained entirely on a mobile device
(Qualcomm Snapdragon 685) over 42 days with zero GPU budget.
Training Details:
- Base model: DistilGPT-2 (82M parameters)
- Training period: January-March 2026
- Hardware: Android device (Snapdragon 685, 6GB RAM)
- Dataset: The Stack (75,000 examples for v0.1; a reproduction sketch follows this list)
- Total cost: $0 in cloud/GPU compute
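For context, here is a minimal reproduction sketch assembled from the
components listed above (Transformers, PyTorch, DistilGPT-2, The Stack).
It is illustrative only: the dataset subset ("data/python"), sequence
length, and batch settings are assumptions, and the actual Yuuki training
script is not part of this notice.

from datasets import Dataset, load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "distilgpt2"  # the 82M-parameter base model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Stream The Stack (gated; requires accepting the BigCode OpenRAIL-M terms)
# and take the 75,000 examples used for v0.1. The "data/python" subset is a
# guess; the notice does not record which languages were included.
stream = load_dataset(
    "bigcode/the-stack", data_dir="data/python", split="train", streaming=True
)
subset = Dataset.from_list(list(stream.take(75_000)))

def tokenize(batch):
    # Short sequences keep peak memory within a 6 GB RAM budget.
    return tokenizer(batch["content"], truncation=True, max_length=512)

tokenized = subset.map(tokenize, batched=True, remove_columns=subset.column_names)

args = TrainingArguments(
    output_dir="yuuki-v0.1",
    per_device_train_batch_size=1,    # tiny batches for a 6 GB device
    gradient_accumulation_steps=16,   # recover a usable effective batch size
    num_train_epochs=1,
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

With no CUDA device present, Trainer falls back to CPU, which is how a
zero-GPU run like the one described above would proceed.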
Third-party Components:
- Transformers library by Hugging Face (Apache 2.0)
- PyTorch (BSD-3-Clause)
- The Stack dataset by BigCode (BigCode OpenRAIL-M)
- DistilGPT-2 base model (Apache 2.0)
Special Thanks:
- My Snapdragon 685
- Hugging Face for infrastructure
- The ML community