Instructions for using allenai/multicite-qa-qasper with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use allenai/multicite-qa-qasper with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("allenai/multicite-qa-qasper", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
MultiCite: Multi-label Citation Intent Analysis as paper-level Q&A (NAACL 2022)
This model has been trained on the data available here: https://github.com/allenai/multicite.
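MultiCite casts multi-label citation intent analysis as question answering over the full text of a paper. A minimal sketch of how such a paper-level QA input might be assembled, assuming a simple question/context concatenation; the function name and prompt template here are illustrative assumptions, not the format from the authors' repository:

```python
# Hedged sketch: the question/context template below is an assumption for
# illustration, not the exact input format used by the MultiCite authors.
def build_qa_input(question: str, paper_text: str) -> str:
    """Join a citation-intent question with the paper text into one QA input."""
    return f"question: {question} context: {paper_text}"

qa_input = build_qa_input(
    "Which intents does the citation to [CITATION] express?",
    "We build on the method of [CITATION] to extract citation contexts ...",
)
print(qa_input)
```

The resulting string could then be tokenized and passed to the model loaded above; see the linked repository for the exact training format.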