
KeyError: 'falcon' when loading the model from the Hub directly using ctransformers

#4
by Karthik1611 - opened

I get `KeyError: 'falcon'` when trying to load the model directly from the Hugging Face Hub. Can someone please help me resolve this issue?

Karthik1611 changed discussion title from KeyError: 'falcon' when loading the model from the Hub directly to KeyError: 'falcon' when loading the model from the Hub directly using ctransformers

You can't load GGML models directly with the transformers library or with Hugging Face's Text Generation Inference.

To load Falcon GGMLs from Python code, you should use ctransformers. It is designed to work the same way as transformers, but for GGML models; see the sketch below.
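A minimal sketch of loading a Falcon GGML with ctransformers. The repo id and `.bin` filename below are placeholders, not taken from this thread; substitute the actual repository and quantized file you want. Passing `model_type="falcon"` tells ctransformers which architecture to use; note that if ctransformers itself raises `KeyError: 'falcon'`, the installed version may predate Falcon support, so upgrading the package may be necessary.

```python
from ctransformers import AutoModelForCausalLM

# Placeholder repo id and model file; replace with the GGML repo
# and the specific quantized .bin you want to load.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/falcon-7b-instruct-GGML",
    model_file="falcon-7b-instruct.ggmlv3.q4_0.bin",
    model_type="falcon",  # tell ctransformers which architecture this is
)

# The returned object is callable: pass a prompt, get generated text.
print(llm("AI is going to"))
```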

Karthik1611 changed discussion status to closed
