Routing to a different model?

#27
by mbary - opened

I might be misunderstanding the trace, but I'm getting an error when trying to send a call via a provider.

I'm trying to run the most basic example provided in the docs:

import os
from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="wavespeed",
    api_key=os.environ["HF_TOKEN"],
)

with open("cat.png", "rb") as image_file:
    input_image = image_file.read()

# output is a PIL.Image object
image = client.image_to_image(
    input_image,
    prompt="Turn the cat into a tiger.",
    model="Qwen/Qwen-Image-Edit-2509",
)

While trying to identify what's happening, I noticed I'm being routed to a completely different model:

HTTPStatusError: Client error '400 Bad Request' for url 'https://router.huggingface.co/wavespeed/api/v3/wavespeed-ai/qwen-image/edit-plus-lora'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

According to the official Qwen docs, edit-plus doesn't even take an image as input:
https://modelstudio.console.alibabacloud.com/ap-southeast-1/?tab=doc#/doc/?type=model&url=2840914_2&modelId=qwen-image-plus

I'm just wondering what is going on here...
