Link model to paper and improve model card
#1
by nielsr (HF Staff) - opened

README.md CHANGED
|
---
license: apache-2.0
license_name: modified-mit
pipeline_tag: other
tags:
- Tabular
- In-Context-Learning
- Transformer
arxiv: 2410.18164
---
|

# TabDPT: Scaling Tabular Foundation Models on Real Data

**TabDPT** is an open-source foundation model for tabular data based on in-context learning (ICL). It is trained on real-world data and generalizes to new classification and regression tasks without additional training or hyperparameter tuning.

- **Paper:** [TabDPT: Scaling Tabular Foundation Models on Real Data](https://huggingface.co/papers/2410.18164)
- **GitHub:** [layer6ai-labs/TabDPT](https://github.com/layer6ai-labs/TabDPT)

## Installation

To set up the environment, ensure you have Python 3.10 or 3.11, then run:

```bash
git clone https://github.com/layer6ai-labs/TabDPT.git
cd TabDPT
pip install -e .
```

## Example Usage

TabDPT performs zero-shot prediction on new tabular datasets. For detailed working examples, refer to the following files in the GitHub repository:

- `tests/cls_example.py` for classification tasks.
- `tests/reg_example.py` for regression tasks.

**Performance Tips:** increase `context_size` or `n_ensembles` to trade speed for accuracy.

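To build intuition for why a larger `context_size` or more `n_ensembles` improves accuracy at the cost of speed, here is a toy stand-in for ICL-style prediction. It is purely illustrative and is not TabDPT's actual model or API: every name below is made up. Each "ensemble member" sees a random subset of the training rows as its context and votes with a 1-nearest-neighbor rule; predictions are averaged across members.

```python
import numpy as np

rng = np.random.default_rng(0)

def icl_predict(X_train, y_train, X_test, context_size=8, n_ensembles=4):
    """Toy ICL analogue: ensemble of 1-NN predictors over random contexts."""
    votes = []
    for _ in range(n_ensembles):
        # Each member conditions on a random "context" of training rows.
        idx = rng.choice(len(X_train), size=min(context_size, len(X_train)),
                         replace=False)
        ctx_X, ctx_y = X_train[idx], y_train[idx]
        # Predict with the nearest context row (1-NN within this context).
        d = np.linalg.norm(X_test[:, None, :] - ctx_X[None, :, :], axis=-1)
        votes.append(ctx_y[d.argmin(axis=1)])
    # Majority vote across ensemble members.
    return (np.stack(votes).mean(axis=0) > 0.5).astype(int)

# Linearly separable toy data: label is 1 iff the first feature is positive.
X = rng.normal(size=(64, 2))
y = (X[:, 0] > 0).astype(int)
X_new = np.array([[2.0, 0.0], [-2.0, 0.0]])
print(icl_predict(X, y, X_new, context_size=16, n_ensembles=8))
```

Larger contexts make each member's nearest neighbor more reliable, and more members smooth out unlucky context draws, at a roughly linear cost in compute. The same qualitative trade-off applies to TabDPT's real `context_size` and `n_ensembles` settings.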
## Updates

Added support for flash attention (with bf16 precision) and a compile flag. Both are enabled by default and should lead to a significant speed-up.

## Citation

```bibtex
@inproceedings{
ma2025tabdpt,
title={Tab{DPT}: Scaling Tabular Foundation Models on Real Data},
author={Junwei Ma and Valentin Thomas and Rasa Hosseinzadeh and Alex Labach and Hamidreza Kamkari and Jesse C. Cresswell and Keyvan Golestan and Guangwei Yu and Anthony L. Caterini and Maksims Volkovs},
booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
year={2025},
url={https://openreview.net/forum?id=pIZxEOZCId}
}
```

© Copyright 2024-2025 The Toronto-Dominion Bank and/or its affiliates