How to use Feluda/Zephyr-7b-QnA with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# Zephyr-7b-QnA is a causal LM (see the direct-loading snippet below), so "text-generation" is the matching task.
pipe = pipeline("text-generation", model="Feluda/Zephyr-7b-QnA")
```
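A minimal usage sketch for the pipeline; the prompt wording and generation settings below are illustrative assumptions, not part of the model card.

```python
# Illustrative prompt; the model's preferred prompt format may differ.
prompt = "Question: What is the capital of France?\nAnswer:"
result = pipe(prompt, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"])
```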
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Feluda/Zephyr-7b-QnA")
model = AutoModelForCausalLM.from_pretrained("Feluda/Zephyr-7b-QnA")
```
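If the tokenizer ships a Zephyr-style chat template (an assumption, not confirmed by the snippet above), a single-turn question-answer exchange might look like this sketch; the question text and generation settings are placeholders.

```python
# Sketch of a single-turn QnA exchange, assuming a chat template is available.
messages = [{"role": "user", "content": "What is the tallest mountain on Earth?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```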