---
license: apache-2.0
language:
  - en
  - he
pretty_name: MCGPT-1 Mega General Dataset
---

# 🌍 MCGPT-1: Mega General Dataset (Pre-training Phase)

This dataset is the backbone of the MCGPT-1 model, developed by TopAI-IL. It contains a massive collection of Minecraft-related knowledge, technical data, and general language patterns.

## 📊 Dataset Statistics

- **Total Tokens:** 188,508,365 🪙
- **Total Lines:** 96,852 📝
- **File Size:** 638.25 MB 📂
- **Focus:** Minecraft mechanics, technical data, and general knowledge

## 🎯 Purpose

This dataset is designed for the Pre-training phase of Large Language Models (LLMs). It provides the model with the necessary "world knowledge" before moving into specific instruction tuning.

πŸ› οΈ Data Quality

The data has been cleaned and formatted for high-quality training of small-to-medium models (optimized for roughly 16M–100M parameters).
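The exact cleaning pipeline is not published, so the sketch below is purely illustrative of the kind of pass described above; the specific filters (whitespace normalization, empty-line removal, exact-duplicate removal) are assumptions, not the actual TopAI-IL pipeline.

```python
import re

def clean_corpus(lines):
    """Illustrative cleaning pass: normalize whitespace, drop empty
    lines, and remove exact duplicates (all assumed steps)."""
    seen = set()
    cleaned = []
    for line in lines:
        text = re.sub(r"\s+", " ", line).strip()  # collapse runs of whitespace
        if not text:
            continue  # skip lines that are empty after normalization
        if text in seen:
            continue  # skip exact duplicates
        seen.add(text)
        cleaned.append(text)
    return cleaned

sample = [
    "Creepers  explode\twhen close.",
    "",
    "Creepers explode when close.",  # duplicate after normalization
    "Redstone carries a signal.",
]
print(clean_corpus(sample))
# → ['Creepers explode when close.', 'Redstone carries a signal.']
```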

## 🚀 Usage

This is part 1 of a multi-dataset series. For best results with the MCGPT-1 architecture, combine it with:

  1. Synthetic CoT Dataset (Identity & Reasoning)
  2. Reddit Interaction Dataset (Human-like Conversational Data)
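One simple way to combine the datasets above is round-robin interleaving during pre-training. The README does not specify a mixing strategy or ratio, so this stdlib-only sketch is an assumption for illustration:

```python
import itertools

def interleave(*corpora):
    """Round-robin over multiple corpora until all are exhausted.
    The mixing strategy (uniform round-robin) is an assumption;
    real pipelines often use weighted sampling instead."""
    sentinel = object()
    for group in itertools.zip_longest(*corpora, fillvalue=sentinel):
        for item in group:
            if item is not sentinel:
                yield item

# Hypothetical stand-ins for the three datasets in the series:
pretrain = ["mc fact 1", "mc fact 2", "mc fact 3"]
cot = ["cot example 1"]
reddit = ["reddit reply 1", "reddit reply 2"]

print(list(interleave(pretrain, cot, reddit)))
# → ['mc fact 1', 'cot example 1', 'reddit reply 1',
#    'mc fact 2', 'reddit reply 2', 'mc fact 3']
```

In practice you would stream documents rather than hold lists in memory, but the exhaustion-tolerant round-robin logic stays the same.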

Maintained by: Raziel (רזיאל) @ TopAI-IL
Year: 2026