GPTx-v6-60gb
Trained from Scratch by AlphaMind Labs
Specifications:
- Dataset: 60 GB Uncopyrighted RawPile
- Training: 170,000 training steps (1 epoch)
- Model: GPT-2 Small, ~124M parameters (~500 MB)
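As a sanity check on the figures above, the parameter count of GPT-2 Small can be recomputed from its published architecture (vocab 50257, context 1024, width 768, 12 layers, 4x MLP width). This sketch uses those standard GPT-2 dimensions, which are not stated in this card:

```python
# Recompute GPT-2 Small's parameter count from the standard GPT-2
# architecture dimensions (an assumption: these come from the GPT-2
# config, not from this model card).
VOCAB, CTX, D, LAYERS = 50257, 1024, 768, 12

def gpt2_small_params():
    wte = VOCAB * D              # token embeddings (weight-tied with the output head)
    wpe = CTX * D                # learned position embeddings
    per_layer = (
        D * 3 * D + 3 * D        # fused QKV projection (weights + bias)
        + D * D + D              # attention output projection
        + D * 4 * D + 4 * D      # MLP up-projection
        + 4 * D * D + D          # MLP down-projection
        + 4 * D                  # two LayerNorms (scale + shift each)
    )
    final_ln = 2 * D             # final LayerNorm
    return wte + wpe + LAYERS * per_layer + final_ln

total = gpt2_small_params()
print(f"{total:,} parameters")              # 124,439,808 -> "~124M"
print(f"{total * 4 / 1e6:.0f} MB at fp32")  # ~498 MB -> "~500 MB"
```

This matches the card: ~124M parameters, and at 4 bytes per fp32 weight roughly 500 MB on disk.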
Try the live demo!
Trained on Modal Cloud. Sign up today for $30 in free credits from Modal! (As of August 1, 2025)
The model is still in development and does not yet produce highly coherent text.