Instructions to use GeneZC/bert-large-mnlimm with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use GeneZC/bert-large-mnlimm with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-large-mnlimm")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-large-mnlimm")
```
- Notebooks
- Google Colab
- Kaggle
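
The loading snippet above can be extended into a full inference pass. The sketch below assumes the repository holds a standard BERT sequence-classification checkpoint fine-tuned on MNLI (mismatched), so inputs are (premise, hypothesis) sentence pairs and the output is a distribution over the entailment labels; the example sentences are illustrative, and the label names should be read from `model.config.id2label` rather than assumed.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "GeneZC/bert-large-mnlimm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative premise/hypothesis pair -- MNLI models take sentence pairs
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to a probability distribution over the labels
probs = torch.softmax(logits, dim=-1)[0]
pred = probs.argmax().item()
print(model.config.id2label.get(pred, pred), probs.tolist())
```

Reading the predicted label through `model.config.id2label` keeps the example robust to however the checkpoint orders its entailment/neutral/contradiction classes.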
- Xet hash: bb0fa7ef45b8b49a6b8edb601c11b0a5d875809e2cceb96da187b1d3f273655e
- SHA256: db6cb7f5087aa719c856154f9547bce6625a7e2c92394e087684e074f8dc8614
- Size of remote file: 1.34 GB
Xet efficiently stores large files inside Git by splitting them into unique content-defined chunks, which accelerates uploads and downloads.