Fill-Mask · Transformers · PyTorch · TensorFlow · JAX · bert · Arabic · Dialect · Egyptian · Gulf · Levantine · Classical Arabic · MSA · Modern Standard Arabic
Instructions to use CAMeL-Lab/bert-base-arabic-camelbert-mix with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use CAMeL-Lab/bert-base-arabic-camelbert-mix with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="CAMeL-Lab/bert-base-arabic-camelbert-mix")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-mix")
model = AutoModelForMaskedLM.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-mix")
```

- Inference
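Under the hood, a fill-mask pipeline takes the model's logits at the `[MASK]` position, applies a softmax, and returns the most likely tokens. A minimal pure-Python sketch of that decoding step (the vocabulary and logit values below are hypothetical stand-ins, not the model's real outputs):

```python
import math

def top_k_fill_mask(logits, vocab, k=2):
    """Softmax the mask-position logits and return the k most likely tokens."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]

# Hypothetical 4-token vocabulary and logits at the [MASK] position
vocab = ["العمل", "الحياة", "النجاح", "السعادة"]
logits = [2.0, 0.5, 1.0, -1.0]
print(top_k_fill_mask(logits, vocab))
```

The real pipeline does the same thing over the model's full vocabulary and returns token strings with their scores.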
- Notebooks
- Google Colab
- Kaggle
- Xet hash: 62d502ceb7ae3d56099b9860d7e62e99edd1337f6bfbffe38d09ce1aa8894370
- Size of remote file: 436 MB
- SHA256: e1c7ec33d731f6ab5309c80fa26a8afe17d7c665874d2c191f27fc9974573d37
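To check a downloaded checkpoint against the SHA256 above, you can hash it locally. A sketch using Python's standard `hashlib` (the local filename is a hypothetical example):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream the file in 1 MB chunks so large checkpoints don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published checksum (filename is an assumption)
expected = "e1c7ec33d731f6ab5309c80fa26a8afe17d7c665874d2c191f27fc9974573d37"
# print(sha256_of("pytorch_model.bin") == expected)
```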