Instructions to use albert/albert-xxlarge-v2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use albert/albert-xxlarge-v2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="albert/albert-xxlarge-v2")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("albert/albert-xxlarge-v2")
model = AutoModelForMaskedLM.from_pretrained("albert/albert-xxlarge-v2")
```
- Inference
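A fill-mask pipeline expects the input sentence to contain the tokenizer's mask token (`[MASK]` for ALBERT) and returns a list of candidate completions, each a dict with `score`, `token`, `token_str`, and the filled `sequence`. A minimal sketch of post-processing that output — the `results` list below is a hand-written stand-in for a real `pipe(...)` call, which would require downloading the model:

```python
# Hand-written stand-in for the list returned by
# pipe("the capital of france is [MASK].") — values are illustrative only.
results = [
    {"score": 0.71, "token": 10975, "token_str": "paris",
     "sequence": "the capital of france is paris."},
    {"score": 0.05, "token": 1234, "token_str": "lyon",
     "sequence": "the capital of france is lyon."},
]

def best_completion(results):
    """Return the token string of the highest-scoring candidate."""
    return max(results, key=lambda r: r["score"])["token_str"]

print(best_completion(results))  # highest-scoring candidate: "paris"
```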
- Notebooks
- Google Colab
- Kaggle
addition of Rust model
#2
by kavanmevada - opened
No description provided.
Converted pytorch_model.bin via the conversion utility in rust-bert, using the following command:
```shell
python3 utils/convert_model.py ../pytorch_model.bin
```
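Before (or after) converting, it can be useful to sanity-check the original checkpoint's contents. A minimal sketch, assuming PyTorch is installed; `inspect_checkpoint` is a hypothetical helper written here for illustration, and the path is the one used in the command above:

```python
import torch

def inspect_checkpoint(path):
    """Load a PyTorch checkpoint on CPU and print each parameter
    name with its shape; returns the number of tensors found."""
    state_dict = torch.load(path, map_location="cpu")
    for name, tensor in state_dict.items():
        print(f"{name}: {tuple(tensor.shape)}")
    return len(state_dict)

# Example (path as in the conversion command above):
# inspect_checkpoint("../pytorch_model.bin")
```

Comparing the printed names and shapes against what the Rust side loads is a quick way to catch a conversion that silently dropped or renamed weights.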
I haven't tried loading the converted model, but the process used to generate the files looks good, thank you.
julien-c changed pull request status to merged