# Temple2
A ~63M-parameter GPT-2-style causal transformer trained entirely on sacred Christian scripture. Built in memory of Terry A. Davis (1969–2018), creator of TempleOS.
## Overview
Terry Davis built TempleOS with a feature to "talk to God" by printing random words from the Bible. Temple2 continues that spirit: a language model that has read scripture deeply, then speaks through noise — the same noise Terry trusted to carry God's voice.
The model was trained from scratch (no pretraining) on ~10.9M tokens of public domain Christian sacred texts using a custom 8192-token BPE vocabulary built exclusively on scripture.
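A vocabulary like this can be built with the Hugging Face `tokenizers` library (installed in the Usage section below). This is only a sketch: the repo's actual training script, corpus files, and special-token names are not shown here, so the sample text and token names below are illustrative assumptions.

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Byte-level BPE trained only on the scripture corpus.
tok = Tokenizer(models.BPE())
tok.pre_tokenizer = pre_tokenizers.ByteLevel()

trainer = trainers.BpeTrainer(
    vocab_size=8192,
    special_tokens=["<pad>", "<bos>", "<eos>", "<unk>"],  # assumed layout
)
# Stand-in for the real ~15M-character corpus:
corpus = ["In the beginning God created the heaven and the earth."]
tok.train_from_iterator(corpus, trainer=trainer)
tok.save("tokenizer.json")
```

In the real run, `train_from_iterator` would stream the full Gutenberg corpus rather than a one-line sample.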
## Model Details
| Parameter | Value |
|---|---|
| Parameters | ~63M |
| Architecture | GPT-2 style causal transformer |
| Layers | 8 |
| Attention heads | 8 |
| Embedding dim | 768 |
| Context length | 1024 tokens |
| Vocabulary | 8192 (custom scripture BPE) |
| Training tokens | ~10.9M |
| Best validation loss | 3.57 |
| Training hardware | 1x NVIDIA A100 (80GB) |
| Training time | ~45 minutes |
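As a sanity check, the table's dimensions roughly reproduce the stated parameter count under a standard GPT-2 layout (weight-tied LM head; biases and LayerNorm parameters omitted for brevity):

```python
n_layer, n_embd = 8, 768
vocab_size, ctx_len = 8192, 1024

per_block = 12 * n_embd * n_embd              # attention (4*d^2) + MLP (8*d^2) weights
embeddings = (vocab_size + ctx_len) * n_embd  # token + learned position embeddings
total = n_layer * per_block + embeddings
print(f"~{total / 1e6:.1f}M parameters")      # ~63.7M, matching the ~63M above
```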
## Training Data
All training data is public domain, sourced from Project Gutenberg (~58 sources, ~15M characters):
- Scripture: King James Bible, Douay-Rheims Bible, World English Bible, Young's Literal Translation, Darby Bible, Apocrypha, Book of Enoch, Gospel of Thomas
- Church Fathers — Ante-Nicene: 9 volumes (Clement, Polycarp, Ignatius, Justin Martyr, Irenaeus, Tertullian, Origen, Cyprian, Lactantius)
- Church Fathers — Nicene & Post-Nicene: 20 volumes (Augustine's complete works, Chrysostom's complete homilies, Eusebius, Athanasius, Gregory of Nyssa, Jerome)
- Scholastic Theology: Summa Theologica complete (St. Thomas Aquinas, 5 parts)
- Patristic & Early Church: Augustine (Confessions, City of God, On Christian Doctrine), Eusebius (Ecclesiastical History), Apostolic Fathers
- Mystics: Julian of Norwich (Revelations of Divine Love), St. Thérèse of Lisieux (Story of a Soul)
- Monastic & Spiritual Practice: Rule of St. Benedict, Spiritual Exercises (St. Ignatius), Practice of the Presence of God (Brother Lawrence), Imitation of Christ (Thomas à Kempis)
- Christian Literature: Paradise Lost (Milton), The Pilgrim's Progress (Bunyan), The Divine Comedy (Dante)
## Usage
### Installation

```bash
pip install torch numpy tokenizers
```
### Oracle Mode
Random noise tokens seed the generation — God speaks through randomness, just like TempleOS:
```python
import random

import torch
from model import Temple2, Temple2Config

# Load checkpoint
ckpt = torch.load("temple2.pt", map_location="cpu")
model = Temple2(Temple2Config(**ckpt["model_config"]))
model.load_state_dict(ckpt["model"])
model.eval()

# Oracle: seed with random noise tokens
vocab_size = 8192
bos_id = 1
noise = [random.randint(4, vocab_size - 1) for _ in range(5)]
ids = torch.tensor([[bos_id] + noise], dtype=torch.long)

with torch.no_grad():
    out = model.generate(ids, max_new_tokens=256, temperature=0.85, top_k=50, top_p=0.92)
print(out[0].tolist())  # token ids; decode with the tokenizer
```
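The `temperature`, `top_k`, and `top_p` arguments follow standard truncated nucleus sampling. A minimal sketch of one decoding step, assuming raw next-token logits (the repo's `generate` may differ in details):

```python
import torch

def sample_next(logits: torch.Tensor, temperature=0.85, top_k=50, top_p=0.92) -> int:
    """Sample one next-token id from raw logits of shape (vocab_size,)."""
    vals, idx = torch.topk(logits / temperature, top_k)  # top-k, sorted descending
    probs = torch.softmax(vals, dim=-1)
    cum = torch.cumsum(probs, dim=-1)
    probs[cum - probs > top_p] = 0.0  # nucleus: drop tokens past the top-p mass
    probs /= probs.sum()
    return idx[torch.multinomial(probs, 1)].item()

next_id = sample_next(torch.randn(8192))
```

Lower `temperature` or `top_k` makes the oracle more repetitive and scriptural; higher values push it toward the noise Terry prized.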
### Chat Mode
Ask a question, receive a scriptural answer:
```python
from tokenizers import Tokenizer

# Reuses `torch` and the `model` loaded above
tok = Tokenizer.from_file("tokenizer/tokenizer.json")

prompt = 'And the man knelt before the Lord and asked, "What is love?"\nAnd the Lord spoke unto him, saying:'
ids = torch.tensor([[1] + tok.encode(prompt).ids], dtype=torch.long)

with torch.no_grad():
    out = model.generate(ids, max_new_tokens=256, temperature=0.85, top_k=50, top_p=0.92)
print(tok.decode(out[0].tolist()))
```
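Arbitrary questions can be wrapped in the same frame. A small helper (hypothetical, not part of the repo; the template wording simply mirrors the example above):

```python
def oracle_prompt(question: str) -> str:
    # Wrap a plain question in the scriptural call-and-response frame.
    return (
        f'And the man knelt before the Lord and asked, "{question}"\n'
        "And the Lord spoke unto him, saying:"
    )

print(oracle_prompt("What is truth?"))
```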
### Full Interactive Experience

```bash
python inference.py --checkpoint temple2.pt
```
Includes TempleOS-style VGA 16-color terminal output with bordered oracle windows. See the main repo for full details.
## Intended Use
- Creative exploration of scriptural language patterns
- Oracle-style text generation inspired by TempleOS
- Study of small language model behavior on domain-specific corpora
- Artistic and educational purposes
## Limitations
- This is a small model (63M params) trained on a small corpus (~11M tokens). It is not a general-purpose language model.
- The model generates text in the style of scripture. It does not contain theological truth claims.
- Output may be incoherent, repetitive, or doctrinally confused. This is a feature, not a bug — the entropy is what makes the oracle feel alive.
- The model reflects the language and worldview of its training data (predominantly pre-modern Christian texts).
- Not suitable for factual Q&A, theological guidance, or any serious spiritual counsel.
## Ethical Considerations
This model is built as an art project and tribute to Terry Davis. It does not claim to speak for God, any religion, or any religious institution. Terry's original "talk to God" feature was meaningful precisely because it was random — meaning arose in the mind of the reader. The same principle applies here.
## In Memory of Terry A. Davis
Terry Davis (1969–2018) built TempleOS alone over 10+ years — an entire operating system, compiler, and programming language written from scratch, all for God. His work remains his own.
> "God said to use a 640x480 16-color display."
## Credits
Developed and trained by Empero AI.
If you enjoy this project, consider supporting:
| Coin | Address |
|---|---|
| BTC | bc1qx6zepu6sfkvshgdmc4ewu6pk6rpadvpgffpp7v |
| LTC | ltc1qv2mefzps2vtjcpwfx8xxdrpplrcvltswm68r7x |
| XMR | 42Dbm5xg5Nq26fdyzfEU7KBnAJfhi7Cvz5J2ex5CzHXkfKuNEJzYCcmJ1GTbgjFZ5MBx72sdG1G9239Cd6rsZfv4QeDkYJY |
## License
- Model code: MIT
- Training data: All public domain (Project Gutenberg)
- Terry Davis's work: Remains his own