Instructions for using AdapterHub/llama2-13b-qlora-openassistant with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Adapters
How to use AdapterHub/llama2-13b-qlora-openassistant with Adapters:
```python
from adapters import AutoAdapterModel

# Load the base model, then attach and activate the QLoRA adapter
model = AutoAdapterModel.from_pretrained("meta-llama/Llama-2-13b-hf")
model.load_adapter("AdapterHub/llama2-13b-qlora-openassistant", set_active=True)
```
- Notebooks
- Google Colab
- Kaggle
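Once the adapter is loaded and active, text can be generated with the usual `transformers` generation API. The sketch below assumes access to the gated `meta-llama/Llama-2-13b-hf` weights and a tokenizer from the same base model; the prompt format shown is an illustrative assumption, not the adapter's documented template.

```python
# Sketch: generate a reply with the adapter active.
# Assumes access to the gated meta-llama/Llama-2-13b-hf weights;
# the prompt format below is a guess, not a documented template.
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-hf")
model = AutoAdapterModel.from_pretrained("meta-llama/Llama-2-13b-hf")
model.load_adapter("AdapterHub/llama2-13b-qlora-openassistant", set_active=True)

inputs = tokenizer("What is QLoRA?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the adapter is activated with `set_active=True`, generation runs through the adapter weights without any further configuration.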