Instructions to use haritzpuerto/MetaQA with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use haritzpuerto/MetaQA with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="haritzpuerto/MetaQA")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("haritzpuerto/MetaQA")
model = AutoModel.from_pretrained("haritzpuerto/MetaQA")
```

- Notebooks
- Google Colab
- Kaggle
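Once the pipeline above is loaded, it can be called with a question and a context passage. A minimal sketch of the call shape follows; the question, context, and expected answer here are illustrative placeholders, not outputs verified against this model, and the actual call is shown commented out since it requires downloading the model weights.

```python
# Hypothetical inputs for a question-answering pipeline:
# the pipeline expects a "question" and a "context" string.
inputs = {
    "question": "Where is the Eiffel Tower located?",
    "context": "The Eiffel Tower is a landmark located in Paris, France.",
}

# With the pipeline from the snippet above, the call would look like:
# result = pipe(question=inputs["question"], context=inputs["context"])
# `result` is a dict with "answer", "score", "start", and "end" keys.
```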
Commit History
- Librarian Bot: Update Hugging Face dataset ID (#2) — 451a17c (verified)
- Create model.py — dd38fc4
- Update inference.py — 4b2efc2
- Create README.md — 0297073
- Update inference.py — 127b15f
- Create inference.py — 9f26017
- Update config.json — 70a4de4
- Upload tokenizer — 7f5e0f3
- Upload model — eda5cec
- initial commit — bf48835

Committed by Haritz Puerto.