Input → tf_col (Set Transformer) → Bi-Axial Attention → tf_icl (ICL) → Output
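The Bi-Axial Attention stage in the pipeline above attends along both axes of the table. As a rough illustration only (single head, no learned projections or residuals, plain NumPy — not the actual Orion-BiX implementation), alternating scaled dot-product attention over the feature axis and then the sample axis looks like this:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # x: (..., seq, d) -> scaled dot-product self-attention over `seq`
    d = x.shape[-1]
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(d)  # (..., seq, seq)
    return softmax(scores, axis=-1) @ x           # (..., seq, d)

def bi_axial_attention(table):
    # table: (n_rows, n_cols, d) embedded cells
    # attend across columns within each row ...
    col_mixed = self_attention(table)
    # ... then across rows within each column
    row_mixed = self_attention(col_mixed.swapaxes(0, 1))
    return row_mixed.swapaxes(0, 1)

rng = np.random.default_rng(0)
cells = rng.normal(size=(8, 5, 16))  # 8 rows, 5 features, 16-dim embeddings
out = bi_axial_attention(cells)
print(out.shape)  # (8, 5, 16)
```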
    <img src="https://img.shields.io/badge/Discord-Join-5865F2?style=for-the-badge&logo=discord&logoColor=white" alt="Discord"/>
  </a>
  <a href="https://github.com/Lexsi-Labs/Orion-BiX">
    <img src="https://img.shields.io/badge/GitHub-Orion%20BiX-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"/>
  </a>
  <!-- TabTune repo -->
  <a href="https://github.com/Lexsi-Labs/TabTune">
    <img src="https://img.shields.io/badge/GitHub-TabTune-181717?style=for-the-badge&logo=github&logoColor=white" alt="TabTune GitHub"/>
  </a>
</div>
## Usage

### Direct (OrionBiX Python package)

```python
from orion_bix.sklearn import OrionBiXClassifier

clf = OrionBiXClassifier()
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)
```

This code will automatically download the pre-trained model from Hugging Face and use a GPU if available.
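The snippets in this README leave `X_train`, `y_train`, and `X_test` undefined; any array-like tabular data works. A quick way to produce toy arrays for a smoke test (synthetic data, purely illustrative):

```python
import numpy as np

# Toy stand-ins for the X_train / y_train / X_test used in the examples
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 6))            # 200 samples, 6 numeric features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # simple binary target

split = 150
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]
print(X_train.shape, X_test.shape)  # (150, 6) (50, 6)
```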
### Via TabTune (unified TFM library)

OrionBiX can be used either directly via its own Python package or through [TabTune](https://github.com/Lexsi-Labs/TabTune), which provides a unified interface over several tabular foundation models.

```python
from tabtune import TabularPipeline

pipeline = TabularPipeline(
    model_name="OrionBix",            # use OrionBiX through TabTune
    tuning_strategy="inference",      # zero-shot / in-context mode
    tuning_params={"device": "cuda"}  # or "cpu"
)

pipeline.fit(X_train, y_train)
predictions = pipeline.predict(X_test)
```

When used through TabTune, the OrionBiX weights are automatically downloaded from this Hugging Face repository on first use, and TabTune handles model-aware preprocessing for you.
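The download-on-first-use behaviour described above is a standard caching pattern. A generic stdlib sketch of the idea (illustrative only — the real code goes through the Hugging Face hub, and `cached_weights` / `fake_fetch` are made-up names):

```python
import tempfile
from pathlib import Path

def cached_weights(cache_dir: Path, filename: str, fetch) -> Path:
    """Download-on-first-use: call `fetch` only when the file is not cached yet."""
    cache_dir.mkdir(parents=True, exist_ok=True)
    target = cache_dir / filename
    if not target.exists():
        target.write_bytes(fetch())  # a real implementation would HTTP GET here
    return target

# Demo with a fake fetcher instead of a real network call
calls = []
def fake_fetch():
    calls.append(1)
    return b"weights"

cache = Path(tempfile.mkdtemp())
p1 = cached_weights(cache, "model.bin", fake_fetch)
p2 = cached_weights(cache, "model.bin", fake_fetch)
print(len(calls))  # 1 -- the second lookup hit the cache
```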
## Installation

### Via TabTune (recommended if you want multiple tabular FMs)

```bash
pip install tabtune
```

This installs TabTune and its built-in OrionBiX integration; no separate `orion-bix` install is required.

### From the OrionBiX source

#### Option 1: From the local clone

```bash
git clone https://github.com/Lexsi-Labs/Orion-BiX.git
cd Orion-BiX
pip install -e .
```

#### Option 2: Directly from GitHub

```bash
pip install git+https://github.com/Lexsi-Labs/Orion-BiX.git
```
## Citation

If you use Orion-BiX in your research, please cite our [paper](https://arxiv.org/abs/2512.00181):

```bibtex
@misc{bouadi2025orionbix,
  title={Orion-Bix: Bi-Axial Attention for Tabular In-Context Learning},
  author={Mohamed Bouadi and Pratinav Seth and Aditya Tanna and Vinay Kumar Sankarapu},
  year={2025},
  eprint={2512.00181},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2512.00181},
}
```
|