(
config: typing.Union[diffusers.configuration_utils.FrozenDict, typing.Dict[str, typing.Any]] = None
return_unused_kwargs = False
**kwargs
)
Parameters
config (Dict[str, Any]) —
A config dictionary from which the Python class will be instantiated. Make sure to only load
configuration files of compatible classes.
return_unused_kwargs (bool, optional, defaults to False) —
Whether kwargs that are not consumed by the Python class should be returned or not.
kwargs (remaining dictionary of keyword arguments, optional) —
Can be used to update the configuration object (after it has been loaded) and initiate the Python class.
**kwargs are passed directly to the underlying scheduler/model's __init__ method and eventually
overwrite same-named arguments of config.
Instantiates a Python class from a config dictionary.
Examples:
>>> from diffusers import DDPMScheduler, DDIMScheduler, PNDMScheduler
>>> # Download scheduler from huggingface.co and cache.
>>> scheduler = DDPMScheduler.from_pretrained("google/ddpm-cifar10-32")
>>> # Instantiate DDIM scheduler class with same config as DDPM
>>> scheduler = DDIMScheduler.from_config(scheduler.config)
>>> # Instantiate PNDM scheduler class with same config as DDPM
>>> scheduler = PNDMScheduler.from_config(scheduler.config)
load_config
(
pretrained_model_name_or_path: typing.Union[str, os.PathLike]
return_unused_kwargs = False
**kwargs
)
Parameters
pretrained_model_name_or_path (str or os.PathLike, optional) —
Can be either:
A string, the model id of a model repo on huggingface.co. Valid model ids should have an
organization name, like google/ddpm-celebahq-256.
A path to a directory containing model weights saved using save_config(), e.g.,
./my_model_directory/.
cache_dir (Union[str, os.PathLike], optional) —
Path to a directory in which a downloaded pretrained model configuration should be cached if the
standard cache should not be used.
force_download (bool, optional, defaults to False) —
Whether or not to force the (re-)download of the model weights and configuration files, overriding the
cached versions if they exist.
resume_download (bool, optional, defaults to False) —
Whether or not to delete incompletely received files. Will attempt to resume the download if such a
file exists.
proxies (Dict[str, str], optional) —
A dictionary of proxy servers to use by protocol or endpoint, e.g., {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
output_loading_info (bool, optional, defaults to False) —
Whether or not to also return a dictionary containing missing keys, unexpected keys, and error messages.
local_files_only (bool, optional, defaults to False) —
Whether or not to only look at local files (i.e., do not try to download the model).
use_auth_token (str or bool, optional) —
The token to use as HTTP bearer authorization for remote files. If True, will use the token generated
when running transformers-cli login (stored in ~/.huggingface).