```python
# Load model directly
from transformers import AutoProcessor, AutoModelForPreTraining

processor = AutoProcessor.from_pretrained("MagicLuke/Wav2Vec2-MyST")
model = AutoModelForPreTraining.from_pretrained("MagicLuke/Wav2Vec2-MyST")
```
Model Description:
This is the wav2vec2-base model pre-trained on the My Science Tutor (MyST, 470 h) dataset from the LDC.
Pre-training was done with fairseq using the `wav2vec2_base_librispeech` config.
The converged checkpoint was converted from a fairseq PyTorch model to a Hugging Face model using a modified version of the conversion script provided by Hugging Face.
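As a quick sanity check after loading, it can help to know how much the model downsamples audio. The sketch below computes the number of latent frames the feature encoder produces for a given number of 16 kHz input samples; the (kernel, stride) pairs are those of the standard wav2vec2-base convolutional feature encoder, which this checkpoint is assumed to keep unchanged.

```python
# Sketch: number of latent frames produced by the wav2vec2-base feature
# encoder for a given number of 16 kHz input samples. The (kernel, stride)
# pairs below are the standard wav2vec2-base conv config (an assumption
# for this checkpoint, since it uses the wav2vec2_base_librispeech config).
def num_frames(num_samples: int) -> int:
    conv_layers = [(10, 5), (3, 2), (3, 2), (3, 2), (3, 2), (2, 2), (2, 2)]
    length = num_samples
    for kernel, stride in conv_layers:
        # Each conv layer shortens the sequence: floor((L - kernel) / stride) + 1
        length = (length - kernel) // stride + 1
    return length

print(num_frames(16000))  # one second of 16 kHz audio -> 49 frames (~20 ms hop)
```

This is useful, for example, when aligning frame-level outputs back to timestamps in the original recording.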
This is a gated model: log in with a Hugging Face token that has gated-access permission before downloading it.

```shell
hf auth login
```
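For non-interactive environments (CI, training jobs), an alternative to the login prompt is to export a token via the `HF_TOKEN` environment variable, which `huggingface_hub` and `transformers` read automatically. The token value below is a placeholder.

```shell
# Non-interactive alternative to `hf auth login`:
# export a token that has access to the gated repo.
export HF_TOKEN=hf_xxx   # placeholder; use your own token
```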