---
license: mit
---
|
|
|
|
|
<div align="center"> |
|
|
<img src="logo.png" alt="Orion-BiX Logo" width="700"/> |
|
|
</div> |
|
|
|
|
|
<div align="center"> |
|
|
<a href="https://lexsi.ai/"> |
|
|
<img src="https://img.shields.io/badge/Lexsi-Homepage-FF6B6B?style=for-the-badge" alt="Homepage"/> |
|
|
</a> |
|
|
<a href="https://huggingface.co/Lexsi"> |
|
|
<img src="https://img.shields.io/badge/π€%20Hugging%20Face-Lexsi AI-FFD21E?style=for-the-badge" alt="Hugging Face"/> |
|
|
</a> |
|
|
<a href="https://discord.gg/dSB62Q7A"> |
|
|
<img src="https://img.shields.io/badge/Discord-Join-5865F2?style=for-the-badge&logo=discord&logoColor=white" alt="Discord"/> |
|
|
</a> |
|
|
<a href="https://github.com/Lexsi-Labs/Orion-BiX"> |
|
|
<img src="https://img.shields.io/badge/GitHub-Orion%20BiX-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"/> |
|
|
</a> |
|
|
<!-- TabTune repo --> |
|
|
<a href="https://github.com/Lexsi-Labs/TabTune"> |
|
|
<img src="https://img.shields.io/badge/GitHub-TabTune-181717?style=for-the-badge&logo=github&logoColor=white" alt="TabTune GitHub"/> |
|
|
</a> |
|
|
</div> |
|
|
|
|
|
|
|
|
|
|
|
# Orion-BiX: Bi-Axial Meta-Learning Model for Tabular In-Context Learning |
|
|
|
|
|
**Orion-BiX** is an advanced tabular foundation model that combines **Bi-Axial Attention** with **Meta-Learning** capabilities for few-shot tabular classification. The model extends the TabICL architecture with alternating attention patterns and episode-based training. |
|
|
|
|
|
The model is part of **Orion**, a family of tabular foundation models with various enhancements. |
|
|
|
|
|
### Key Innovations |
|
|
|
|
|
1. **Bi-Axial Attention**: Alternating attention patterns (Standard → Grouped → Hierarchical → Relational) that capture multi-scale feature interactions within tabular data
|
|
2. **Meta-Learning with k-NN Support Selection**: Episode-based training in which support sets are chosen by similarity (k-NN) rather than sampled at random (see the sketch after this list)
|
|
3. **Three-Component Architecture**: Column embedding (Set Transformer), Bi-Axial row interaction, and In-Context Learning prediction |
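
A short sketch helps picture point 2. The snippet below is illustrative only: it assumes plain Euclidean k-NN over raw features via scikit-learn, and `select_support_set` is a hypothetical helper rather than part of the Orion-BiX API; the actual similarity metric and episode sampling may differ.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def select_support_set(X_pool, y_pool, x_query, k=16):
    """Hypothetical helper: pick the k pool rows most similar to the
    query (Euclidean k-NN) to serve as in-context support examples."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_pool)
    _, idx = nn.kneighbors(x_query.reshape(1, -1))
    return X_pool[idx[0]], y_pool[idx[0]]

# Toy usage: build a support set for a single query row
rng = np.random.default_rng(0)
X_pool = rng.normal(size=(200, 8))
y_pool = rng.integers(0, 2, size=200)
X_sup, y_sup = select_support_set(X_pool, y_pool, X_pool[0], k=16)
```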
|
|
|
|
|
### Architecture Overview |
|
|
|
|
|
``` |
|
|
Input → tf_col (Set Transformer) → Bi-Axial Attention → tf_icl (ICL) → Output
|
|
``` |
|
|
**Component Details:** |
|
|
- **tf_col (Column Embedder)**: Set Transformer for statistical distribution learning across features |
|
|
- **Bi-Axial Attention**: Replaces the standard RowInteraction module with alternating attention patterns (illustrated after this list):
|
|
- Standard Cross-Feature Attention |
|
|
- Grouped Feature Attention |
|
|
- Hierarchical Feature Attention |
|
|
- Relational Feature Attention |
|
|
- CLS Token Aggregation |
|
|
- **tf_icl (ICL Predictor)**: In-context learning module for few-shot prediction |
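
To make the alternation concrete, here is a schematic PyTorch sketch that cycles through the four pattern types by layer depth. Everything in it (`PATTERNS`, `make_attention_block`, the vanilla `TransformerEncoderLayer` stand-in) is an assumption for illustration; the real Orion-BiX layers implement distinct attention mechanisms.

```python
import torch.nn as nn

# Pattern order as listed above; cycled across layers.
PATTERNS = ["standard", "grouped", "hierarchical", "relational"]

def make_attention_block(pattern, dim, heads=4):
    # Stand-in: each pattern would be a distinct attention module in
    # practice; a vanilla encoder layer is used here only as a placeholder.
    return nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)

class BiAxialStack(nn.Module):
    """Illustrative stack that alternates attention patterns by depth."""

    def __init__(self, dim=128, depth=8):
        super().__init__()
        self.layers = nn.ModuleList(
            make_attention_block(PATTERNS[i % len(PATTERNS)], dim)
            for i in range(depth)
        )

    def forward(self, x):  # x: (batch, rows, dim)
        for layer in self.layers:
            x = layer(x)
        return x
```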
|
|
|
|
|
|
|
|
## Usage |
|
|
### Direct (OrionBiX Python package) |
|
|
|
|
|
|
|
|
```python |
|
|
from orion_bix.sklearn import OrionBiXClassifier |
|
|
|
|
|
# Initialize and use |
|
|
clf = OrionBiXClassifier() |
|
|
clf.fit(X_train, y_train) |
|
|
predictions = clf.predict(X_test) |
|
|
``` |
|
|
|
|
|
This code will automatically download the pre-trained model from Hugging Face and use a GPU if available. |
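
For a quick end-to-end check, the snippet below pairs the classifier with a standard scikit-learn dataset. Only `OrionBiXClassifier` comes from this package; the dataset, split, and metric are plain scikit-learn.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from orion_bix.sklearn import OrionBiXClassifier

# Small end-to-end run on a standard binary classification dataset
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = OrionBiXClassifier()
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```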
|
|
|
|
|
### Via TabTune (unified tabular foundation model library)
|
|
|
|
|
Orion-BiX can be used either directly via its own Python package or through [TabTune](https://github.com/Lexsi-Labs/TabTune), which provides a unified interface over several tabular foundation models.
|
|
|
|
|
```python |
|
|
from tabtune import TabularPipeline |
|
|
|
|
|
pipeline = TabularPipeline( |
|
|
model_name="OrionBix", # use OrionBix through TabTune |
|
|
tuning_strategy="inference", # zero-shot / in-context mode |
|
|
tuning_params={"device": "cuda"} # or "cpu" |
|
|
) |
|
|
|
|
|
pipeline.fit(X_train, y_train) |
|
|
predictions = pipeline.predict(X_test) |
|
|
``` |
|
|
|
|
|
When used through TabTune, the OrionBiX weights are automatically downloaded from this Hugging Face repository on first use, and TabTune handles model-aware preprocessing for you. |
|
|
|
|
|
|
|
|
## Installation |
|
|
|
|
|
### Via TabTune (recommended if you want multiple tabular foundation models)
|
|
|
|
|
```bash |
|
|
pip install tabtune |
|
|
``` |
|
|
|
|
|
This installs TabTune and its built-in OrionBiX integration; no separate `orion-bix` install is required.
|
|
|
|
|
### From the OrionBiX source |
|
|
#### Option 1: From a local clone
|
|
|
|
|
```bash
git clone https://github.com/Lexsi-Labs/Orion-BiX.git orion-bix
cd orion-bix
pip install -e .
```
|
|
|
|
|
#### Option 2: From the Git remote
|
|
|
|
|
```bash |
|
|
pip install git+https://github.com/Lexsi-Labs/Orion-BiX.git |
|
|
``` |
|
|
|
|
|
## Citation |
|
|
|
|
|
If you use Orion-BiX in your research, please cite our [paper](https://arxiv.org/abs/2512.00181):
|
|
|
|
|
```bibtex |
|
|
@misc{bouadi2025orionbix, |
|
|
title={Orion-Bix: Bi-Axial Attention for Tabular In-Context Learning}, |
|
|
author={Mohamed Bouadi and Pratinav Seth and Aditya Tanna and Vinay Kumar Sankarapu}, |
|
|
year={2025}, |
|
|
eprint={2512.00181}, |
|
|
archivePrefix={arXiv}, |
|
|
primaryClass={cs.LG}, |
|
|
url={https://arxiv.org/abs/2512.00181}, |
|
|
} |
|
|
``` |