---
license: mit
---

<div align="center">
  <img src="logo.png" alt="Orion-BiX Logo" width="700"/>
</div>

<div align="center">
  <a href="https://lexsi.ai/">
    <img src="https://img.shields.io/badge/Lexsi-Homepage-FF6B6B?style=for-the-badge" alt="Homepage"/>
  </a>
  <a href="https://huggingface.co/Lexsi">
    <img src="https://img.shields.io/badge/🤗%20Hugging%20Face-Lexsi%20AI-FFD21E?style=for-the-badge" alt="Hugging Face"/>
  </a>
  <a href="https://discord.gg/dSB62Q7A">
    <img src="https://img.shields.io/badge/Discord-Join-5865F2?style=for-the-badge&logo=discord&logoColor=white" alt="Discord"/>
  </a>
  <a href="https://github.com/Lexsi-Labs/Orion-BiX">
    <img src="https://img.shields.io/badge/GitHub-Orion%20BiX-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"/>
  </a>
  <!-- TabTune repo -->
  <a href="https://github.com/Lexsi-Labs/TabTune">
    <img src="https://img.shields.io/badge/GitHub-TabTune-181717?style=for-the-badge&logo=github&logoColor=white" alt="TabTune GitHub"/>
  </a>
</div>



# Orion-BiX: Bi-Axial Meta-Learning Model for Tabular In-Context Learning

**Orion-BiX** is an advanced tabular foundation model that combines **Bi-Axial Attention** with **Meta-Learning** capabilities for few-shot tabular classification. The model extends the TabICL architecture with alternating attention patterns and episode-based training.

The model is part of **Orion**, a family of tabular foundation models with various enhancements.

## Key Innovations

1. **Bi-Axial Attention**: Alternating attention patterns (Standard → Grouped → Hierarchical → Relational) that capture multi-scale feature interactions within tabular data
2. **Meta-Learning with k-NN Support Selection**: Episode-based training with intelligent support set selection using similarity metrics
3. **Three-Component Architecture**: Column embedding (Set Transformer), Bi-Axial row interaction, and In-Context Learning prediction
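
The k-NN support selection in innovation 2 can be sketched as follows. This is an illustrative sketch only: the function name `select_support_knn`, the choice of cosine similarity, and `k=8` are assumptions for exposition, not the released implementation.

```python
import numpy as np

def select_support_knn(query, candidates, candidate_labels, k=8):
    """Pick the k candidate rows most similar to the query row.

    Illustrative sketch: Orion-BiX's actual similarity metric and
    episode construction may differ.
    """
    # Normalize rows so a dot product equals cosine similarity
    q = query / (np.linalg.norm(query) + 1e-12)
    c = candidates / (np.linalg.norm(candidates, axis=1, keepdims=True) + 1e-12)
    sims = c @ q                   # (n_candidates,)
    idx = np.argsort(-sims)[:k]    # indices of the k most similar rows
    return candidates[idx], candidate_labels[idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
y = rng.integers(0, 3, size=100)
support_X, support_y = select_support_knn(X[0], X[1:], y[1:], k=8)
print(support_X.shape)  # (8, 16)
```

During episode-based training, a support set chosen this way conditions the in-context predictor on the rows most relevant to each query.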

## Architecture Overview

```
Input → tf_col (Set Transformer) → Bi-Axial Attention → tf_icl (ICL) → Output
```
**Component Details:**
- **tf_col (Column Embedder)**: Set Transformer for statistical distribution learning across features
- **Bi-Axial Attention**: Replaces standard RowInteraction with alternating attention patterns:
  - Standard Cross-Feature Attention
  - Grouped Feature Attention
  - Hierarchical Feature Attention
  - Relational Feature Attention
  - CLS Token Aggregation
- **tf_icl (ICL Predictor)**: In-context learning module for few-shot prediction
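
To make the pipeline concrete, the sketch below traces tensor shapes through the three stages with dummy NumPy arrays. All dimensions, variable names, and the round-robin attention schedule are illustrative assumptions; the real components are trained Transformer modules, not the stand-ins shown here.

```python
import numpy as np

d_model = 32
n_rows, n_cols = 10, 5
ATTENTION_CYCLE = ["standard", "grouped", "hierarchical", "relational"]

# 1) tf_col: each cell is embedded based on its column's distribution
X = np.random.default_rng(0).normal(size=(n_rows, n_cols))
cell_emb = np.random.default_rng(1).normal(
    size=(n_rows, n_cols, d_model))   # stand-in for Set Transformer output

# 2) Bi-Axial attention: layers cycle through the four patterns
def attention_pattern(layer_idx):
    return ATTENTION_CYCLE[layer_idx % len(ATTENTION_CYCLE)]

schedule = [attention_pattern(i) for i in range(8)]

# A CLS token then aggregates each row into a single vector
row_emb = cell_emb.mean(axis=1)       # (n_rows, d_model); CLS-pooling stand-in

# 3) tf_icl: labeled support rows condition predictions for query rows
n_support = 8
support, query = row_emb[:n_support], row_emb[n_support:]
print(schedule[:5])  # ['standard', 'grouped', 'hierarchical', 'relational', 'standard']
print(query.shape)   # (2, 32)
```

The key point is the alternation: consecutive interaction layers apply different attention patterns, so a stack of eight layers cycles through the four patterns twice before the CLS aggregation.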


## Usage
### Direct (OrionBiX Python package)


```python
from orion_bix.sklearn import OrionBiXClassifier

# Initialize and use
clf = OrionBiXClassifier()
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)
```

This code will automatically download the pre-trained model from Hugging Face and use a GPU if available.

### Via TabTune (unified TFM library)

Orion-BiX can also be accessed through [TabTune](https://github.com/Lexsi-Labs/TabTune), which provides a unified interface over several tabular foundation models.

```python
from tabtune import TabularPipeline

pipeline = TabularPipeline(
    model_name="OrionBix",          # use OrionBix through TabTune
    tuning_strategy="inference",    # zero-shot / in-context mode
    tuning_params={"device": "cuda"}  # or "cpu"
)

pipeline.fit(X_train, y_train)
predictions = pipeline.predict(X_test)
```

When used through TabTune, the OrionBiX weights are automatically downloaded from this Hugging Face repository on first use, and TabTune handles model-aware preprocessing for you.


## Installation

### Via TabTune (recommended if you want multiple tabular FMs)

```bash
pip install tabtune
```

This installs TabTune and its built-in Orion-BiX integration; no separate `orion-bix` install is required.

### From the OrionBiX source
#### Option 1: From the local clone

```bash
git clone https://github.com/Lexsi-Labs/Orion-BiX.git
cd Orion-BiX
pip install -e .
```

#### Option 2: From the Git Remote

```bash
pip install git+https://github.com/Lexsi-Labs/Orion-BiX.git
```

## Citation

If you use Orion-BiX in your research, please cite our [paper](https://arxiv.org/abs/2512.00181):

```bibtex
@misc{bouadi2025orionbix,
      title={Orion-Bix: Bi-Axial Attention for Tabular In-Context Learning}, 
      author={Mohamed Bouadi and Pratinav Seth and Aditya Tanna and Vinay Kumar Sankarapu},
      year={2025},
      eprint={2512.00181},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2512.00181}, 
}
```