PluRel: Synthetic Data unlocks Scaling Laws for Relational Foundation Models
Relational Transformer (RT) model checkpoints pretrained on synthetic relational databases generated by PluRel, as introduced in:
*PluRel: Synthetic Data unlocks Scaling Laws for Relational Foundation Models.* Kothapalli, Ranjan, Hudovernik, Dwivedi, Hoffart, Guestrin, and Leskovec. arXiv:2602.04029 (2026).
The Relational Transformer operates on multi-tabular relational databases, treating rows across linked tables as a sequence via BFS-ordered context sampling.
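The sketch below illustrates the idea of BFS-ordered context sampling on a toy schema. The table names, link structure, and function here are hypothetical, not the PluRel implementation:

```python
# A minimal sketch of BFS-ordered context sampling over linked tables.
# The toy database maps each (table, row) to its linked rows; the real
# sampler operates on arbitrary relational schemas.
from collections import deque

db = {
    "users":  {1: [("orders", 10), ("orders", 11)]},
    "orders": {10: [("items", 100)], 11: [("items", 101)]},
    "items":  {100: [], 101: []},
}

def bfs_context(seed, max_width=128, max_rows=1024):
    """Collect rows reachable from a seed row in BFS order, capping the
    per-row fan-out at max_width and the context at max_rows."""
    context, seen = [], {seed}
    frontier = deque([seed])
    while frontier and len(context) < max_rows:
        table, row = frontier.popleft()
        context.append((table, row))
        neighbors = [n for n in db[table][row] if n not in seen][:max_width]
        for n in neighbors:
            seen.add(n)
            frontier.append(n)
    return context

print(bfs_context(("users", 1)))
# [('users', 1), ('orders', 10), ('orders', 11), ('items', 100), ('items', 101)]
```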
| Hyperparameter | Value |
|---|---|
| Transformer blocks | 12 |
| Model dimension (d_model) | 256 |
| Attention heads | 8 |
| FFN dimension (d_ff) | 1,024 |
| Context length | 1,024 tokens |
| Text encoder | all-MiniLM-L12-v2 (d_text = 384) |
| Max BFS width | 128 |
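For reference, the hyperparameters above can be collected into a plain config object. The field names below are assumptions for illustration, not the repository's actual API:

```python
# Hedged sketch: the hyperparameter table as a config dataclass.
from dataclasses import dataclass

@dataclass
class RTConfig:
    n_blocks: int = 12          # Transformer blocks
    d_model: int = 256          # model dimension
    n_heads: int = 8            # attention heads
    d_ff: int = 1024            # FFN dimension
    context_length: int = 1024  # tokens
    d_text: int = 384           # all-MiniLM-L12-v2 embedding size
    max_bfs_width: int = 128    # BFS frontier cap during context sampling
```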
The architecture and training loop build on the Relational Transformer (ICLR 2026) codebase.
Download the pretrained checkpoints with the Hugging Face CLI:

```bash
huggingface-cli download kvignesh1420/relational-transformer-plurel \
  --repo-type model \
  --local-dir ~/scratch/rt_hf_ckpts
```
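The same download can be done programmatically via the huggingface_hub library's snapshot_download, which mirrors the CLI call above:

```python
# Programmatic equivalent of the CLI command above
# (pip install huggingface_hub).
import os
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="kvignesh1420/relational-transformer-plurel",
    repo_type="model",
    local_dir=os.path.expanduser("~/scratch/rt_hf_ckpts"),
)
print(f"Checkpoints downloaded to {local_dir}")
```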