GPTx-v6-60gb

Trained from Scratch by AlphaMind Labs

Specifications:

  • Dataset: 60 GB Uncopyrighted RawPile
  • Training: 170,000 training steps (1 epoch)
  • Model: GPT-2 Small, ~124M parameters, ~500 MB
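
To run the model locally, something like the following should work. This is a minimal sketch, assuming the repository hosts standard GPT-2 Small weights and tokenizer files loadable with Hugging Face `transformers`; generation settings are illustrative, not tuned.

```python
# Minimal usage sketch (assumes standard GPT-2 weights in the repo).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AlphaMindLabs/GPTx-v6-60gb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a prompt and sample a short continuation.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```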

Try the live demo! πŸš€

Open in Spaces

Trained on Modal Cloud. Sign up today for $30 in free credits from Modal! (As of August 1, 2025.)

The model is still under development and does not yet produce highly coherent text.
