---
license: mit
tags:
  - text-generation
datasets:
  - AlphaMindQ/RawPile-85gb
language:
  - en
pipeline_tag: text-generation
widget:
  - text: Once upon a time
---

# GPTx-v6-60gb

Trained from scratch by AlphaMind Labs.

## Specifications

- **Dataset:** 60 GB uncopyrighted RawPile
- **Training:** 170,000 steps (1 epoch)
- **Model:** GPT-2 Small, ~124M parameters (~500 MB)
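
The ~124M-parameter / ~500 MB figures can be sanity-checked from the GPT-2 Small architecture (12 layers, hidden size 768, 50,257-token vocabulary, 1,024 positions, tied input/output embeddings). A minimal sketch in pure Python, assuming those standard hyperparameters:

```python
# Parameter count for GPT-2 Small -- an illustrative sketch, not this repo's training code.
n_layer, d, vocab, n_pos = 12, 768, 50257, 1024

embeddings = vocab * d + n_pos * d  # token + position embeddings (output head is tied)
per_block = (
    2 * 2 * d                       # two LayerNorms (weight + bias each)
    + (d * 3 * d + 3 * d)           # fused q/k/v projection
    + (d * d + d)                   # attention output projection
    + (d * 4 * d + 4 * d)           # MLP up-projection
    + (4 * d * d + d)               # MLP down-projection
)
final_ln = 2 * d
total = embeddings + n_layer * per_block + final_ln

print(total)            # 124439808 -> the "~124M parameters" figure
print(total * 4 / 1e6)  # ~498 MB in fp32 (4 bytes per parameter) -> the "~500 MB" figure
```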

Try the live demo! 🚀

Open in Spaces

Trained on Modal Cloud. Sign up today for $30 in free credits from Modal! (As of August 1, 2025.)

The model is still in development and does not yet produce highly coherent text.