Inference doesn't seem to load the model.

#14
by GiovanniPanda - opened

Hello. Let me preface this by saying I'm new to this, so I might be doing something wrong myself.

I've been trying to query the model through the Inference API:
https://api-inference.huggingface.co/models/0xJustin/Dungeons-and-Diffusion

but it always returns the same error:
{"error": "Model 0xJustin/Dungeons-and-Diffusion is currently loading", "estimated_time": 20.0}

I am aware that the model has to load first, but no matter how long I wait, the estimated time stays at 20.

The code I'm using is really simple, just giving a text input and expecting an output.

import json
import requests

API_URL = "https://api-inference.huggingface.co/models/0xJustin/Dungeons-and-Diffusion"
headers = {"Authorization": "Bearer <your token>"}

def generate_image(prompt):
    # Send the prompt as a JSON payload to the Inference API
    data = json.dumps({"inputs": prompt})
    response = requests.post(API_URL, headers=headers, data=data)

    if response.status_code == 200:
        return response.content
    raise Exception(f"Failed to generate image: {response.text}")

Do I need to specify other parameters, is there something else I'm missing?

Thanks,
Gio
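For anyone hitting the same 503, one approach worth trying is passing the Inference API's `wait_for_model` option in the payload and retrying on 503 instead of failing immediately. This is a minimal sketch, not a confirmed fix; the `post` callable is injected (it stands in for `requests.post` with your auth headers), and the retry counts and delay are arbitrary choices:

```python
import json
import time

# Endpoint from the thread; auth headers are left to the injected `post`.
API_URL = "https://api-inference.huggingface.co/models/0xJustin/Dungeons-and-Diffusion"

def build_payload(prompt):
    # "options": {"wait_for_model": True} asks the server to hold the
    # request until the model has finished loading (assumed API behavior).
    return json.dumps({"inputs": prompt, "options": {"wait_for_model": True}})

def generate_image(prompt, post, max_retries=5, delay=20.0):
    """Retry while the model is loading (HTTP 503).

    `post` should behave like requests.post(url, data=...) and return an
    object with .status_code, .content, and .text attributes.
    """
    data = build_payload(prompt)
    for _ in range(max_retries):
        response = post(API_URL, data=data)
        if response.status_code == 200:
            return response.content
        if response.status_code == 503:  # model still loading; wait and retry
            time.sleep(delay)
            continue
        raise RuntimeError(f"Failed to generate image: {response.text}")
    raise RuntimeError("Model did not load within the retry budget")
```

Injecting `post` keeps the retry logic separate from authentication and makes the function easy to test with a stub before pointing it at the real endpoint.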

GiovanniPanda changed discussion status to closed
