Instructions for using projecte-aina/Plume32k with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use projecte-aina/Plume32k with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="projecte-aina/Plume32k")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("projecte-aina/Plume32k")
model = AutoModelForCausalLM.from_pretrained("projecte-aina/Plume32k")
```
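Once the model and tokenizer are loaded directly, a translation can be produced with `generate()`. The sketch below is a minimal example only; the plain-text prompt it uses is an assumption, so check the Plume32k model card for the exact translation prompt format (language tags or separators) the model was trained with.

```python
# Minimal generation sketch for projecte-aina/Plume32k.
# The prompt format below is an assumption; adapt it to the prompt convention
# documented on the model card.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("projecte-aina/Plume32k")
model = AutoModelForCausalLM.from_pretrained("projecte-aina/Plume32k")

# Hypothetical plain-text prompt; replace with the model's expected tag/separator scheme.
prompt = "Translate from English to Catalan: Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding; tune max_new_tokens and sampling parameters as needed.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```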
- Notebooks
  - Google Colab
  - Kaggle