This repository contains the pre-trained weights for the BEiT model, converted into a format that can be used by candle.
## Citing BEiT

As per their GitHub repository:
```bibtex
@misc{bao2022beitbertpretrainingimage,
  title={BEiT: BERT Pre-Training of Image Transformers},
  author={Hangbo Bao and Li Dong and Songhao Piao and Furu Wei},
  year={2022},
}
```