Instructions to use openai/whisper-large with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use openai/whisper-large with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="openai/whisper-large")

# Load model directly
from transformers import AutoProcessor, AutoModelForSpeechSeq2Seq

processor = AutoProcessor.from_pretrained("openai/whisper-large")
model = AutoModelForSpeechSeq2Seq.from_pretrained("openai/whisper-large")
```

- Notebooks
- Google Colab
- Kaggle
Unable to run it on my local machine
Hi, I am new and I wanted to run the Whisper model on my laptop.
When I try to import WhisperProcessor and WhisperForConditionalGeneration, it gives me an error. Can someone please help me?
Thanks in advance.
Error:

```
ImportError                               Traceback (most recent call last)
/tmp/ipykernel_17/3614451028.py in <module>
----> 1 from transformers import WhisperProcessor, WhisperForConditionalGeneration
      2 from datasets import load_dataset
      3 import torch
      4
      5 # load model and processor

ImportError: cannot import name 'WhisperProcessor' from 'transformers' (/opt/conda/lib/python3.7/site-packages/transformers/__init__.py)
```
Try installing transformers from the GitHub repo, not from pip. Whisper was added only a few days ago, so I don't think it is in the PyPI package yet.
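For reference, installing straight from GitHub usually looks like this (a sketch of the typical commands; which commit you get depends on the current state of the main branch):

```shell
# Install the development version of transformers from GitHub,
# which already includes the Whisper classes.
pip install git+https://github.com/huggingface/transformers.git

# Alternatively, once a release containing Whisper lands on PyPI,
# upgrading the published package is enough:
pip install --upgrade transformers
```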
Exactly! It should be in a transformers release soon!
Just in case anyone is facing the same issue: updating the transformers package and then importing WhisperProcessor and WhisperForConditionalGeneration works.
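If you want to check up front whether your installed version is new enough, a minimal, stdlib-only sketch is below. It assumes Whisper support landed in transformers v4.23.0 (check the release notes to confirm the exact threshold for your setup); the `supports_whisper` helper name is made up for this example.

```python
def supports_whisper(version: str, minimum: str = "4.23.0") -> bool:
    """Return True if a dotted version string is at least `minimum`.

    Compares version components numerically, not lexically, so
    "4.9.0" correctly sorts below "4.23.0".
    """
    def parts(v: str) -> tuple:
        # Keep only the first three numeric components (major.minor.patch).
        return tuple(int(p) for p in v.split(".")[:3])

    return parts(version) >= parts(minimum)


print(supports_whisper("4.22.2"))  # False: predates Whisper support
print(supports_whisper("4.23.1"))  # True
```

You could pair this with `transformers.__version__` in your own environment to decide whether an upgrade is needed before attempting the import.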