---
dataset_info:
  features:
    - name: text
      dtype: string
  splits:
    - name: train
      num_examples: 7614
    - name: validation
      num_examples: 846
language:
  - en
license: mit
tags:
  - character-level
  - philosophy
  - mathematics
  - julia
  - scriptio-continua
  - reduced-vocab
size_categories:
  - 1K<n<10K
---
# JuliaGPT Training Data
Character-level training corpus for [JuliaGPT](https://huggingface.co/LisaMegaWatts/JuliaGPT), an experimental GPT in pure Julia exploring minimal vocabularies inspired by ancient Greek *scriptio continua*.
## Sources (ordered, not shuffled)
1. **Aristotle - Rhetoric** (5,460 chunks) — MIT Classics
2. **Euclid - The Elements** (3,001 chunks) — Project Gutenberg
## Vocabulary
28 characters: the lowercase letters a-z, plus space and period. Numerals are converted to English words, and all other punctuation is removed.
A BOS token brings the total vocabulary size to 29.
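A minimal sketch of how such a vocabulary can be built and used in Julia; the character ordering and the BOS index shown here are illustrative assumptions, not the model's actual mapping:

```julia
# Assumed ordering: a-z, space, period, then BOS (index 29).
const CHARS = vcat(collect('a':'z'), [' ', '.'])
const BOS   = length(CHARS) + 1               # 29, hypothetical position

char_to_id = Dict(c => i for (i, c) in enumerate(CHARS))
id_to_char = Dict(i => c for (i, c) in enumerate(CHARS))

encode(s) = vcat(BOS, [char_to_id[c] for c in s])        # prepend BOS, map chars
decode(v) = join(id_to_char[i] for i in v if i != BOS)   # drop BOS on the way out

ids = encode("let no one ignorant of geometry enter.")
@assert decode(ids) == "let no one ignorant of geometry enter."
```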
## Processing
- Text lowercased
- Numerals converted to words (e.g. "42" becomes "forty two")
- All punctuation removed except the period
- Text split into chunks of at most 256 characters, at sentence boundaries
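The sketch below mirrors these steps in Julia; it is illustrative only, and the numeral-to-word conversion is omitted for brevity:

```julia
# Keep only a-z, space, and period after lowercasing; drop everything else.
clean(text) = filter(c -> c in 'a':'z' || c == ' ' || c == '.', lowercase(text))

# Greedily pack whole sentences into chunks of at most `maxlen` characters.
function chunk(text; maxlen = 256)
    chunks, current = String[], ""
    for s in split(text, '.'; keepempty = false)
        sentence = strip(s) * "."
        if !isempty(current) && length(current) + length(sentence) + 1 > maxlen
            push!(chunks, current)
            current = ""
        end
        current = isempty(current) ? sentence : current * " " * sentence
    end
    isempty(current) || push!(chunks, current)
    return chunks
end

chunk(clean("Rhetoric is the counterpart of Dialectic. All men by nature desire to know."))
```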
## Format
One chunk per line, stored as plain text.
## Usage
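A minimal sketch of reading the data in Julia, assuming a split has been downloaded locally as plain text (the file name below is a placeholder):

```julia
# "train.txt" is a placeholder for the downloaded train split, one chunk per line.
chunks = readlines("train.txt")

println(length(chunks), " chunks")
println("longest chunk: ", maximum(length, chunks), " characters")
println(first(chunks))
```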