Relational Transformer — PluRel Checkpoints

Relational Transformer (RT) model checkpoints pretrained on synthetic relational databases generated by PluRel, as introduced in:

"PluRel: Synthetic Data unlocks Scaling Laws for Relational Foundation Models." Kothapalli, Ranjan, Hudovernik, Dwivedi, Hoffart, Guestrin, Leskovec. arXiv:2602.04029 (2026).



Model Architecture

The Relational Transformer operates on multi-tabular relational databases, flattening a row and the rows linked to it across tables into a single token sequence via BFS-ordered context sampling.
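A minimal sketch of BFS-ordered context sampling as described above. The adjacency representation (`links` mapping a row id to the ids it references via foreign keys) and the function name are illustrative assumptions, not the repository's actual API:

```python
from collections import deque

def bfs_context(row_id, links, max_width=128, max_rows=None):
    """Collect a row and its linked rows in BFS order.

    `links` is a hypothetical adjacency map: row id -> list of
    foreign-key-linked row ids. `max_width` caps the fan-out
    expanded per row, mirroring the "Max BFS width" setting.
    """
    order, seen = [], {row_id}
    queue = deque([row_id])
    while queue:
        current = queue.popleft()
        order.append(current)
        if max_rows is not None and len(order) >= max_rows:
            break
        # Expand at most `max_width` neighbors of the current row.
        for neighbor in links.get(current, [])[:max_width]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return order
```

The BFS order keeps a row's nearest relational context (direct foreign-key neighbors) earliest in the sequence, which matters once the context window truncates distant rows.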

| Hyperparameter | Value |
|---|---|
| Transformer blocks | 12 |
| Model dimension (`d_model`) | 256 |
| Attention heads | 8 |
| FFN dimension (`d_ff`) | 1,024 |
| Context length | 1,024 tokens |
| Text encoder | all-MiniLM-L12-v2 (`d_text` = 384) |
| Max BFS width | 128 |
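The table above can be mirrored in a small config object. This is a sketch for reference only; the field names are assumptions and do not come from the released code:

```python
from dataclasses import dataclass

@dataclass
class RTConfig:
    """Hyperparameters from the table above; field names are illustrative."""
    n_blocks: int = 12        # Transformer blocks
    d_model: int = 256        # Model dimension
    n_heads: int = 8          # Attention heads
    d_ff: int = 1024          # FFN dimension
    context_length: int = 1024
    d_text: int = 384         # all-MiniLM-L12-v2 embedding size
    max_bfs_width: int = 128

    @property
    def d_head(self) -> int:
        # Per-head dimension: 256 / 8 = 32
        return self.d_model // self.n_heads
```

Note the conventional ratios: `d_ff` is 4 × `d_model`, and each attention head operates on a 32-dimensional slice.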

The architecture and training loop build on the Relational Transformer (ICLR 2026) codebase.


Download

```shell
huggingface-cli download kvignesh1420/relational-transformer-plurel \
    --repo-type model \
    --local-dir ~/scratch/rt_hf_ckpts
```
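The same download can be done from Python with `huggingface_hub.snapshot_download`; the target directory below is just an example path:

```python
from huggingface_hub import snapshot_download

# Fetch the full checkpoint repository to a local directory.
local_dir = snapshot_download(
    repo_id="kvignesh1420/relational-transformer-plurel",
    repo_type="model",
    local_dir="~/scratch/rt_hf_ckpts",  # example path; change as needed
)
```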