Building GhostLM, a decoder-only transformer LLM trained from scratch in PyTorch on cybersecurity corpora (CVEs, MITRE ATT&CK, CTFtime, Exploit-DB). Interested in domain-specialized pretraining, efficient training on consumer hardware, and the intersection of offensive security and language models. Scale ladder: ghost-tiny (14.7M) → ghost-small (45M, current) → ghost-base (350M) → ghost-1B.