Instructions for using afaji/fresh-12-layer-gpqa with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use afaji/fresh-12-layer-gpqa with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("afaji/fresh-12-layer-gpqa")
model = AutoModelForMultipleChoice.from_pretrained("afaji/fresh-12-layer-gpqa")
```
- Notebooks
- Google Colab
- Kaggle
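The classes loaded in the snippet above form a multiple-choice pipeline: the tokenizer encodes one (question, choice) pair per candidate answer, and the model returns one logit per pair, with the argmax selecting the predicted answer. As a minimal sketch of the input shaping (plain Python only, so it runs without downloading the checkpoint; the question and candidates are made-up examples, not from the GPQA dataset):

```python
def build_choice_pairs(question: str, choices: list[str]) -> list[tuple[str, str]]:
    """Pair the question with every candidate answer.

    A multiple-choice model expects the tokenizer to encode each
    (context, choice) pair separately; the model then scores the
    pairs jointly and emits one logit per candidate.
    """
    return [(question, choice) for choice in choices]


# Example inputs (hypothetical, for illustration only):
question = "Which particle mediates the electromagnetic force?"
choices = ["gluon", "photon", "Higgs boson", "neutrino"]

pairs = build_choice_pairs(question, choices)
print(len(pairs))      # one pair per candidate → 4
print(pairs[1][1])     # → photon
```

With the tokenizer and model loaded above, these pairs would typically be tokenized as parallel lists with padding, and the resulting tensors reshaped to `(batch, num_choices, seq_len)` before calling the model; the exact preprocessing depends on how the checkpoint was trained.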