Instructions for using Medissa/finetuned_bert_MCQ with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Medissa/finetuned_bert_MCQ with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("Medissa/finetuned_bert_MCQ")
model = AutoModelForMultipleChoice.from_pretrained("Medissa/finetuned_bert_MCQ")
```

- Notebooks
- Google Colab
- Kaggle
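Once the tokenizer and model are loaded, running inference is a sketch like the following. This is a generic `AutoModelForMultipleChoice` usage pattern, not something documented for this specific checkpoint: the sample question and answer choices are placeholders, and the number of choices the model was fine-tuned on may differ.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("Medissa/finetuned_bert_MCQ")
model = AutoModelForMultipleChoice.from_pretrained("Medissa/finetuned_bert_MCQ")
model.eval()

# Placeholder question and candidate answers for illustration
question = "What is the capital of France?"
choices = ["Berlin", "Paris", "Madrid", "Rome"]

# Pair the question with each candidate answer
encoded = tokenizer([question] * len(choices), choices,
                    return_tensors="pt", padding=True)

# Multiple-choice models expect inputs shaped (batch, num_choices, seq_len)
inputs = {k: v.unsqueeze(0) for k, v in encoded.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

predicted = choices[logits.argmax(dim=-1).item()]
print(predicted)
```

The model outputs one logit per candidate; `argmax` over the last dimension selects the choice the model scores highest.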