How to use illian64/madlad400-10b-mt-ct2-bfloat16 with Transformers:
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline
pipe = pipeline("translation", model="illian64/madlad400-10b-mt-ct2-bfloat16")

# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("illian64/madlad400-10b-mt-ct2-bfloat16", dtype="auto")

Disclaimer: illian64, who was not involved in this research, converted the original model to a CTranslate2-optimized model and wrote the contents of this model card based on google/madlad400-10b-mt.
Conversion command:
ct2-transformers-converter --model google/madlad400-10b-mt --quantization bfloat16 --output_dir madlad400-10b-mt-ct2-bfloat16
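Since the weights above are in CTranslate2 format, inference is typically run with the ctranslate2 package rather than Transformers. The sketch below is a minimal, hedged example; the local model directory name and the "<2en>" target-language prefix follow the MADLAD-400 convention, and the tokenizer is assumed to come from the original google/madlad400-10b-mt checkpoint:

```python
# Sketch: translating with the CTranslate2-converted model.
# Assumes the conversion command above has produced the local
# directory "madlad400-10b-mt-ct2-bfloat16" and that ctranslate2
# and transformers are installed.
import ctranslate2
from transformers import AutoTokenizer

# Tokenizer from the original checkpoint (the CT2 directory holds
# only the converted weights).
tokenizer = AutoTokenizer.from_pretrained("google/madlad400-10b-mt")
translator = ctranslate2.Translator("madlad400-10b-mt-ct2-bfloat16")

# MADLAD-400 expects a target-language tag prefix, e.g. "<2en>".
text = "<2en> Ich liebe Pizza!"
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(text))

results = translator.translate_batch([tokens])
output_tokens = results[0].hypotheses[0]
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(output_tokens)))
```

translate_batch accepts lists of token strings, so several sentences can be batched in one call for higher throughput.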
Base model
google/madlad400-10b-mt