Quick Links

  • how to create custom architectures
  • you can read this blog post to find out more 📖

How to use

You can load the model via the command

from transformers import AutoModelForImageClassification
model = AutoModelForImageClassification.from_pretrained("not-lain/MyRepo", trust_remote_code=True)

or you can use the pipeline:

from transformers import pipeline
pipe = pipeline(model="not-lain/MyRepo", trust_remote_code=True)
pipe(
    "url",
    download=True,      # will call the download_img method
    clean_output=False, # will be passed as postprocess_kwargs
)

Parameters

The pipeline supports the following parameters:

  • download
  • clean_output

You can also use the following method to download images from the web:

pipe.download_img(url)
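For readers who want to see what a `download_img`-style helper boils down to, here is a standalone standard-library sketch that fetches raw bytes from a URL. It is an illustration, not the repo's actual helper, which presumably also decodes the bytes into an image before handing them to the model:

```python
# Standalone sketch of a download helper: fetch raw bytes from a URL
# using only the standard library. Illustration only -- not the actual
# pipe.download_img implementation from not-lain/MyRepo.
from urllib.request import urlopen

def fetch_bytes(url: str) -> bytes:
    with urlopen(url) as resp:
        return resp.read()
```

In practice the repo's helper saves you from writing this boilerplate yourself before calling the pipeline.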