MohamedBouadi committed on · verified · Commit 079f69c · 1 Parent(s): 9dc95d5

Update README.md

Files changed (1): README.md (+89 −3)
---
license: mit
---

<div align="center">
<a href="https://lexsi.ai/">
<img src="https://img.shields.io/badge/Lexsi-Homepage-FF6B6B?style=for-the-badge" alt="Homepage"/>
</a>
<a href="https://huggingface.co/Lexsi">
<img src="https://img.shields.io/badge/🤗%20Hugging%20Face-Lexsi%20AI-FFD21E?style=for-the-badge" alt="Hugging Face"/>
</a>
<a href="https://discord.gg/dSB62Q7A">
<img src="https://img.shields.io/badge/Discord-Join-5865F2?style=for-the-badge&logo=discord&logoColor=white" alt="Discord"/>
</a>
<a href="https://github.com/Lexsi-Labs/Orion-BiX">
<img src="https://img.shields.io/badge/GitHub-Repository-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"/>
</a>
</div>

# Orion-BiX: Bi-Axial Meta-Learning Model for Tabular In-Context Learning

**Orion-BiX** is an advanced tabular foundation model that combines **Bi-Axial Attention** with **Meta-Learning** for few-shot tabular classification. It extends the TabICL architecture with alternating attention patterns and episode-based training.

Orion-BiX is part of **Orion**, a family of tabular foundation models with various enhancements.

### Key Innovations

1. **Bi-Axial Attention**: alternating attention patterns (Standard → Grouped → Hierarchical → Relational) that capture multi-scale feature interactions in tabular data
2. **Meta-Learning with k-NN Support Selection**: episode-based training, with support sets chosen per episode via similarity metrics
3. **Three-Component Architecture**: column embedding (Set Transformer), Bi-Axial row interaction, and in-context-learning prediction
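The episode sampler itself is not shown in this README. As a rough numpy sketch of innovation 2 — building a meta-learning episode whose support set is picked by k-nearest neighbours around each query row — it might look like the following (all function names here are illustrative, not the package's API):

```python
import numpy as np

def select_support(X_pool, y_pool, x_query, k=5):
    """Pick the k pool rows closest to the query (Euclidean) as its support set."""
    dists = np.linalg.norm(X_pool - x_query, axis=1)
    idx = np.argsort(dists)[:k]
    return X_pool[idx], y_pool[idx]

def build_episode(X, y, n_query=4, k=5, rng=None):
    """Split a dataset into query rows and one k-NN support set per query."""
    rng = np.random.default_rng(rng)
    q_idx = rng.choice(len(X), size=n_query, replace=False)
    pool = np.ones(len(X), dtype=bool)
    pool[q_idx] = False                      # queries never appear in their own support
    X_pool, y_pool = X[pool], y[pool]
    supports = [select_support(X_pool, y_pool, X[i], k) for i in q_idx]
    return X[q_idx], y[q_idx], supports
```

The key design point is that support selection is similarity-driven rather than uniformly random, so each query is conditioned on the most relevant labelled rows.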

### Architecture Overview

```
Input → tf_col (Set Transformer) → Bi-Axial Attention → tf_icl (ICL) → Output
```

**Component Details:**
- **tf_col (Column Embedder)**: a Set Transformer that learns statistical distributions across features
- **Bi-Axial Attention**: replaces the standard RowInteraction with alternating attention patterns:
  - Standard Cross-Feature Attention
  - Grouped Feature Attention
  - Hierarchical Feature Attention
  - Relational Feature Attention
  - CLS Token Aggregation
- **tf_icl (ICL Predictor)**: an in-context learning module for few-shot prediction
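To make the alternation concrete, here is a minimal numpy sketch of the first two patterns in the cycle — standard cross-feature self-attention followed by attention restricted to feature groups. It is a toy illustration of the mechanism only (single head, contiguous groups, no projections or residuals), not the model's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(h):
    """Plain self-attention over the token axis of h, shape (tokens, d)."""
    scores = h @ h.T / np.sqrt(h.shape[-1])
    return softmax(scores) @ h

def grouped_attend(h, n_groups):
    """Self-attention restricted to contiguous groups of features."""
    out = np.empty_like(h)
    for g in np.array_split(np.arange(len(h)), n_groups):
        out[g] = attend(h[g])
    return out

def bi_axial_layer(H, n_groups=2):
    """Apply standard then grouped cross-feature attention to each row.
    H has shape (rows, features, d)."""
    H = np.stack([attend(row) for row in H])                     # standard
    H = np.stack([grouped_attend(row, n_groups) for row in H])   # grouped
    return H
```

In the full cycle the hierarchical and relational patterns would follow in the same alternating fashion, each offering attention at a different feature granularity.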

## Usage

```python
from orion_bix.sklearn import OrionBiXClassifier

# Initialize and use
clf = OrionBiXClassifier()
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)
```

This code automatically downloads the pre-trained model from Hugging Face and uses a GPU if available.
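Note that the classifier follows the scikit-learn fit/predict contract but, as with other in-context learners, `fit` conditions on the training data as a support set rather than running gradient updates. A deliberately naive stand-in — a 1-nearest-neighbour rule, emphatically not the real model — illustrates that contract:

```python
import numpy as np

class FewShotBaseline:
    """Illustrative stand-in (NOT Orion-BiX): fit() only stores the support
    set; predict() conditions on it, here via a trivial 1-NN rule in place
    of the transformer's in-context inference."""

    def fit(self, X, y):
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # Distance from every query row to every stored support row.
        d = np.linalg.norm(X[:, None, :] - self.X_[None, :, :], axis=-1)
        return self.y_[d.argmin(axis=1)]
```

Any code written against this interface should therefore drop in against `OrionBiXClassifier` unchanged.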

## Installation

### From source

#### Option 1: From a local clone

```bash
cd orion-bix
pip install -e .
```

#### Option 2: From the Git remote

```bash
pip install git+https://github.com/Lexsi-Labs/Orion-BiX.git
```

## Citation

If you use Orion-BiX in your research, please cite:

```bibtex
@misc{bouadi25orionbix,
  title={Orion-BiX: Bi-Axial Meta-Learning for Tabular In-Context Learning},
  author={Mohamed Bouadi and Pratinav Seth and Aditya Tanna and Vinay Kumar Sankarapu},
  year={2025},
}
```