Instructions to use Felix92/doctr-torch-parseq-multilingual-v1 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
How to use Felix92/doctr-torch-parseq-multilingual-v1 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Felix92/doctr-torch-parseq-multilingual-v1", dtype="auto")
```
What is the batch size of this model?
#2
by yumikimi381 - opened
The config.json is missing the batch size.
It was trained with a batch_size of 64
on all-synthetic data (800k train / 200k val).
But if you use it in docTR, you can specify whatever batch_size you want / your hardware can handle.
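To illustrate that reply, here is a minimal sketch of loading this checkpoint through docTR and picking a batch size at inference time. It assumes the `python-doctr` package with PyTorch support is installed; the `reco_bs` keyword and the image filename are illustrative and may differ across docTR versions:

```python
# Sketch: use this recognition model inside a docTR OCR pipeline.
# Assumes `python-doctr[torch]` is installed; `reco_bs` is the
# recognition batch size in recent docTR versions (check your release).
from doctr.io import DocumentFile
from doctr.models import ocr_predictor, from_hub

# Pull the recognition model from the Hugging Face Hub
reco_model = from_hub("Felix92/doctr-torch-parseq-multilingual-v1")

# The batch size is an inference-time choice here, not fixed by the
# checkpoint: set it to whatever your hardware can handle.
predictor = ocr_predictor(
    det_arch="db_resnet50",  # any supported detection architecture
    reco_arch=reco_model,
    pretrained=True,
    reco_bs=64,              # hypothetical value; tune for your GPU/CPU
)

doc = DocumentFile.from_images(["sample.jpg"])  # hypothetical input image
result = predictor(doc)
print(result.render())
```

The training batch size (64) only mattered for optimization; at inference the predictor simply groups crops into batches of the size you request.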
Thank you so much for the swift reply! Have a nice day :))