| code | docstring | func_name | language | repo | path | url | license |
|---|---|---|---|---|---|---|---|
def find_custom_args_with_details(file_content: str, custom_args_var_name: str) -> list[dict]:
"""
Find the given custom args variable in the file content and return its content.
Args:
file_content: The string content of the Python file.
custom_args_var_name: The name of the custom args var... |
Find the given custom args variable in the file content and return its content.
Args:
file_content: The string content of the Python file.
custom_args_var_name: The name of the custom args variable.
| find_custom_args_with_details | python | huggingface/transformers | utils/check_docstrings.py | https://github.com/huggingface/transformers/blob/master/utils/check_docstrings.py | Apache-2.0 |
def update_file_with_new_docstrings(
candidate_file, lines, line_starts_candidates, line_ends_candidates, overwrite=False
):
"""
For a given file, update the docstrings for all @auto_docstring candidates and write the new content.
"""
content_base_file_new_lines = lines[: line_ends_candidates[0]]
... |
For a given file, update the docstrings for all @auto_docstring candidates and write the new content.
| update_file_with_new_docstrings | python | huggingface/transformers | utils/check_docstrings.py | https://github.com/huggingface/transformers/blob/master/utils/check_docstrings.py | Apache-2.0 |
def check_docstrings(overwrite: bool = False, check_all: bool = False):
"""
Check docstrings of all public objects that are callables and are documented. By default, only checks the diff.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether to fix inconsistencies or not.
... |
Check docstrings of all public objects that are callables and are documented. By default, only checks the diff.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether to fix inconsistencies or not.
check_all (`bool`, *optional*, defaults to `False`):
Whether... | check_docstrings | python | huggingface/transformers | utils/check_docstrings.py | https://github.com/huggingface/transformers/blob/master/utils/check_docstrings.py | Apache-2.0 |
def clean_doctest_list(doctest_file: str, overwrite: bool = False):
"""
Cleans the doctest in a given file.
Args:
doctest_file (`str`):
The path to the doctest file to check or clean.
overwrite (`bool`, *optional*, defaults to `False`):
Whether or not to fix problems... |
Cleans the doctest in a given file.
Args:
doctest_file (`str`):
The path to the doctest file to check or clean.
overwrite (`bool`, *optional*, defaults to `False`):
Whether or not to fix problems. If `False`, will error when the file is not clean.
| clean_doctest_list | python | huggingface/transformers | utils/check_doctest_list.py | https://github.com/huggingface/transformers/blob/master/utils/check_doctest_list.py | Apache-2.0 |
def clean_model_doc_toc(model_doc: List[dict]) -> List[dict]:
"""
Cleans a section of the table of content of the model documentation (one specific modality) by removing duplicates
and sorting models alphabetically.
Args:
model_doc (`List[dict]`):
The list of dictionaries extracted ... |
Cleans a section of the table of content of the model documentation (one specific modality) by removing duplicates
and sorting models alphabetically.
Args:
model_doc (`List[dict]`):
The list of dictionaries extracted from the `_toctree.yml` file for this specific modality.
Returns... | clean_model_doc_toc | python | huggingface/transformers | utils/check_doc_toc.py | https://github.com/huggingface/transformers/blob/master/utils/check_doc_toc.py | Apache-2.0 |
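The `clean_model_doc_toc` row above deduplicates one modality section of the TOC and sorts it alphabetically. A minimal sketch of that idea — hypothetical helper name and toy data, not the actual transformers implementation:

```python
def clean_toc_section(model_doc: list[dict]) -> list[dict]:
    """Drop duplicate entries (by 'local' key) and sort the rest by title."""
    seen = set()
    cleaned = []
    for entry in model_doc:
        key = entry["local"]
        if key in seen:
            continue  # duplicate page reference: keep only the first occurrence
        seen.add(key)
        cleaned.append(entry)
    # sort alphabetically by the human-readable title, case-insensitively
    return sorted(cleaned, key=lambda e: e["title"].lower())

toc = [
    {"local": "model_doc/whisper", "title": "Whisper"},
    {"local": "model_doc/bert", "title": "BERT"},
    {"local": "model_doc/whisper", "title": "Whisper"},
]
print(clean_toc_section(toc))
```

The real script operates on the parsed `_toctree.yml`; the keys used here are an assumption.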
def check_model_doc(overwrite: bool = False):
"""
Check that the content of the table of content in `_toctree.yml` is clean (no duplicates and sorted for the model
API doc) and potentially auto-cleans it.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether to just che... |
Check that the content of the table of content in `_toctree.yml` is clean (no duplicates and sorted for the model
API doc) and potentially auto-cleans it.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether to just check if the TOC is clean or to auto-clean it (when `ove... | check_model_doc | python | huggingface/transformers | utils/check_doc_toc.py | https://github.com/huggingface/transformers/blob/master/utils/check_doc_toc.py | Apache-2.0 |
def find_backend(line: str) -> Optional[str]:
"""
Find one (or multiple) backend in a code line of the init.
Args:
line (`str`): A code line in an init file.
Returns:
Optional[`str`]: If one (or several) backend is found, returns it. In the case of multiple backends (the line
c... |
Find one (or multiple) backend in a code line of the init.
Args:
line (`str`): A code line in an init file.
Returns:
Optional[`str`]: If one (or several) backend is found, returns it. In the case of multiple backends (the line
contains `if is_xxx_available() and `is_yyy_available(... | find_backend | python | huggingface/transformers | utils/check_dummies.py | https://github.com/huggingface/transformers/blob/master/utils/check_dummies.py | Apache-2.0 |
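`find_backend` scans an init line for `is_xxx_available()` guards. A hedged re-implementation sketch — the exact regex and the `_and_`-joined key format used in `check_dummies.py` may differ from this:

```python
import re

_backend_re = re.compile(r"is_(\w+)_available\(\)")

def find_backend(line: str):
    """Return the backend name(s) guarding an import line, or None.

    Multiple backends joined with `and` are collapsed into a single
    '_and_'-joined key, mirroring the multi-backend case described above.
    """
    if not line.strip().startswith("if "):
        return None
    backends = _backend_re.findall(line)
    if not backends:
        return None
    return "_and_".join(backends)

print(find_backend("if is_torch_available():"))                            # torch
print(find_backend("if is_torch_available() and is_vision_available():"))  # torch_and_vision
print(find_backend("from typing import List"))                             # None
```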
def read_init() -> Dict[str, List[str]]:
"""
Read the init and extract backend-specific objects.
Returns:
Dict[str, List[str]]: A dictionary mapping backend name to the list of object names requiring that backend.
"""
with open(os.path.join(PATH_TO_TRANSFORMERS, "__init__.py"), "r", encodin... |
Read the init and extract backend-specific objects.
Returns:
Dict[str, List[str]]: A dictionary mapping backend name to the list of object names requiring that backend.
| read_init | python | huggingface/transformers | utils/check_dummies.py | https://github.com/huggingface/transformers/blob/master/utils/check_dummies.py | Apache-2.0 |
def create_dummy_object(name: str, backend_name: str) -> str:
"""
Create the code for a dummy object.
Args:
name (`str`): The name of the object.
backend_name (`str`): The name of the backend required for that object.
Returns:
`str`: The code of the dummy object.
"""
if... |
Create the code for a dummy object.
Args:
name (`str`): The name of the object.
backend_name (`str`): The name of the backend required for that object.
Returns:
`str`: The code of the dummy object.
| create_dummy_object | python | huggingface/transformers | utils/check_dummies.py | https://github.com/huggingface/transformers/blob/master/utils/check_dummies.py | Apache-2.0 |
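The `create_dummy_object` row shows only the signature. A sketch of the class case with a hypothetical template string — the real script also handles constants and functions, and its template may differ:

```python
# Hypothetical template; `DummyObject` / `requires_backends` are the
# transformers utilities the generated code is expected to reference.
DUMMY_CLASS_TEMPLATE = '''class {name}(metaclass=DummyObject):
    _backends = ["{backend}"]

    def __init__(self, *args, **kwargs):
        requires_backends(self, ["{backend}"])
'''

def create_dummy_object(name: str, backend_name: str) -> str:
    """Render placeholder code that errors helpfully when the backend is missing."""
    return DUMMY_CLASS_TEMPLATE.format(name=name, backend=backend_name)

print(create_dummy_object("BertModel", "torch"))
```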
def create_dummy_files(backend_specific_objects: Optional[Dict[str, List[str]]] = None) -> Dict[str, str]:
"""
Create the content of the dummy files.
Args:
backend_specific_objects (`Dict[str, List[str]]`, *optional*):
The mapping backend name to list of backend-specific objects. If not... |
Create the content of the dummy files.
Args:
backend_specific_objects (`Dict[str, List[str]]`, *optional*):
The mapping backend name to list of backend-specific objects. If not passed, will be obtained by calling
`read_init()`.
Returns:
`Dict[str, str]`: A dictiona... | create_dummy_files | python | huggingface/transformers | utils/check_dummies.py | https://github.com/huggingface/transformers/blob/master/utils/check_dummies.py | Apache-2.0 |
def check_dummies(overwrite: bool = False):
"""
Check if the dummy files are up to date and maybe `overwrite` with the right content.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether or not to overwrite the content of the dummy files. Will raise an error if they are not... |
Check if the dummy files are up to date and maybe `overwrite` with the right content.
Args:
overwrite (`bool`, *optional*, defaults to `False`):
Whether or not to overwrite the content of the dummy files. Will raise an error if they are not up to date
when `overwrite=False`.
... | check_dummies | python | huggingface/transformers | utils/check_dummies.py | https://github.com/huggingface/transformers/blob/master/utils/check_dummies.py | Apache-2.0 |
def find_backend(line: str) -> Optional[str]:
"""
Find one (or multiple) backend in a code line of the init.
Args:
line (`str`): A code line of the main init.
Returns:
Optional[`str`]: If one (or several) backend is found, returns it. In the case of multiple backends (the line
... |
Find one (or multiple) backend in a code line of the init.
Args:
line (`str`): A code line of the main init.
Returns:
Optional[`str`]: If one (or several) backend is found, returns it. In the case of multiple backends (the line
contains `if is_xxx_available() and `is_yyy_available... | find_backend | python | huggingface/transformers | utils/check_inits.py | https://github.com/huggingface/transformers/blob/master/utils/check_inits.py | Apache-2.0 |
def parse_init(init_file) -> Optional[Tuple[Dict[str, List[str]], Dict[str, List[str]]]]:
"""
Read an init_file and parse (per backend) the `_import_structure` objects defined and the `TYPE_CHECKING` objects
defined.
Args:
init_file (`str`): Path to the init file to inspect.
Returns:
... |
Read an init_file and parse (per backend) the `_import_structure` objects defined and the `TYPE_CHECKING` objects
defined.
Args:
init_file (`str`): Path to the init file to inspect.
Returns:
`Optional[Tuple[Dict[str, List[str]], Dict[str, List[str]]]]`: A tuple of two dictionaries map... | parse_init | python | huggingface/transformers | utils/check_inits.py | https://github.com/huggingface/transformers/blob/master/utils/check_inits.py | Apache-2.0 |
def analyze_results(import_dict_objects: Dict[str, List[str]], type_hint_objects: Dict[str, List[str]]) -> List[str]:
"""
Analyze the differences between _import_structure objects and TYPE_CHECKING objects found in an init.
Args:
import_dict_objects (`Dict[str, List[str]]`):
A dictionar... |
Analyze the differences between _import_structure objects and TYPE_CHECKING objects found in an init.
Args:
import_dict_objects (`Dict[str, List[str]]`):
A dictionary mapping backend names (`"none"` for the objects independent of any specific backend) to
list of imported object... | analyze_results | python | huggingface/transformers | utils/check_inits.py | https://github.com/huggingface/transformers/blob/master/utils/check_inits.py | Apache-2.0 |
def get_transformers_submodules() -> List[str]:
"""
Returns the list of Transformers submodules.
"""
submodules = []
for path, directories, files in os.walk(PATH_TO_TRANSFORMERS):
for folder in directories:
# Ignore private modules
if folder.startswith("_"):
... |
Returns the list of Transformers submodules.
| get_transformers_submodules | python | huggingface/transformers | utils/check_inits.py | https://github.com/huggingface/transformers/blob/master/utils/check_inits.py | Apache-2.0 |
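`get_transformers_submodules` walks the package tree and ignores private (`_`-prefixed) folders. A self-contained sketch of the same pattern on a throwaway directory — the helper name and return format are assumptions:

```python
import os
import tempfile

def list_submodules(root: str) -> list[str]:
    """Collect sub-directories under `root`, skipping private modules."""
    submodules = []
    for path, directories, files in os.walk(root):
        # prune private modules in place so os.walk does not descend into them
        directories[:] = [d for d in directories if not d.startswith("_")]
        for folder in directories:
            submodules.append(os.path.relpath(os.path.join(path, folder), root))
    return sorted(submodules)

# demo on a temporary directory tree
with tempfile.TemporaryDirectory() as root:
    for d in ("models", os.path.join("models", "bert"), "_private", "utils"):
        os.makedirs(os.path.join(root, d), exist_ok=True)
    print(list_submodules(root))
```

Assigning to `directories[:]` (not `directories`) is what makes `os.walk` skip the pruned folders.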
def check_submodules():
"""
Check all submodules of Transformers are properly registered in the main init. Error otherwise.
"""
# This is to make sure the transformers module imported is the one in the repo.
from transformers.utils import direct_transformers_import
transformers = direct_transfo... |
Check all submodules of Transformers are properly registered in the main init. Error otherwise.
| check_submodules | python | huggingface/transformers | utils/check_inits.py | https://github.com/huggingface/transformers/blob/master/utils/check_inits.py | Apache-2.0 |
def get_models_in_diff():
"""
Finds all models that have been modified in the diff.
Returns:
A set containing the names of the models that have been modified (e.g. {'llama', 'whisper'}).
"""
fork_point_sha = subprocess.check_output("git merge-base main HEAD".split()).decode("utf-8")
mod... |
Finds all models that have been modified in the diff.
Returns:
A set containing the names of the models that have been modified (e.g. {'llama', 'whisper'}).
| get_models_in_diff | python | huggingface/transformers | utils/check_modular_conversion.py | https://github.com/huggingface/transformers/blob/master/utils/check_modular_conversion.py | Apache-2.0 |
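`get_models_in_diff` shells out to `git merge-base` and `git diff`; the git call is elided in the row above, so this sketch covers only the follow-up step of turning modified file paths into model names. The regex and helper name are assumptions:

```python
import re

def models_from_paths(modified_files: list[str]) -> set[str]:
    """Extract model folder names from paths like
    src/transformers/models/<model>/modeling_<model>.py."""
    pattern = re.compile(r"src/transformers/models/([^/]+)/")
    return {m.group(1) for f in modified_files if (m := pattern.search(f))}

diff_files = [
    "src/transformers/models/llama/modeling_llama.py",
    "src/transformers/models/whisper/processing_whisper.py",
    "utils/check_repo.py",
]
print(sorted(models_from_paths(diff_files)))  # ['llama', 'whisper']
```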
def guaranteed_no_diff(modular_file_path, dependencies, models_in_diff):
"""
Returns whether it is guaranteed to have no differences between the modular file and the modeling file.
Model is in the diff -> not guaranteed to have no differences
Dependency is in the diff -> not guaranteed to have no diffe... |
Returns whether it is guaranteed to have no differences between the modular file and the modeling file.
Model is in the diff -> not guaranteed to have no differences
Dependency is in the diff -> not guaranteed to have no differences
Otherwise -> guaranteed to have no differences
Args:
mod... | guaranteed_no_diff | python | huggingface/transformers | utils/check_modular_conversion.py | https://github.com/huggingface/transformers/blob/master/utils/check_modular_conversion.py | Apache-2.0 |
def check_missing_backends():
"""
Checks if all backends are installed (otherwise the check of this script is incomplete). Will error in the CI if
that's not the case but only throw a warning for users running this.
"""
missing_backends = []
if not is_torch_available():
missing_backends.... |
Checks if all backends are installed (otherwise the check of this script is incomplete). Will error in the CI if
that's not the case but only throw a warning for users running this.
| check_missing_backends | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_model_list():
"""
Checks the models listed as subfolders of `models` match the models available in `transformers.models`.
"""
# Get the models from the directory structure of `src/transformers/models/`
import transformers as tfrs
models_dir = os.path.join(PATH_TO_TRANSFORMERS, "models"... |
Checks the models listed as subfolders of `models` match the models available in `transformers.models`. | check_model_list | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
| check_model_list | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def get_model_modules() -> List[str]:
"""Get all the model modules inside the transformers library (except deprecated models)."""
_ignore_modules = [
"modeling_auto",
"modeling_encoder_decoder",
"modeling_marian",
"modeling_retribert",
"modeling_flax_auto",
"model... | Get all the model modules inside the transformers library (except deprecated models). | get_model_modules | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def get_models(module: types.ModuleType, include_pretrained: bool = False) -> List[Tuple[str, type]]:
"""
Get the objects in a module that are models.
Args:
module (`types.ModuleType`):
The module from which we are extracting models.
include_pretrained (`bool`, *optional*, defau... |
Get the objects in a module that are models.
Args:
module (`types.ModuleType`):
The module from which we are extracting models.
include_pretrained (`bool`, *optional*, defaults to `False`):
Whether or not to include the `PreTrainedModel` subclass (like `BertPreTrainedMo... | get_models | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def is_building_block(model: str) -> bool:
"""
Returns `True` if a model is a building block part of a bigger model.
"""
if model.endswith("Wrapper"):
return True
if model.endswith("Encoder"):
return True
if model.endswith("Decoder"):
return True
if model.endswith("Pr... |
Returns `True` if a model is a building block part of a bigger model.
| is_building_block | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
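`is_building_block` is a chain of suffix checks (the last suffix is truncated in the row above). The visible cases collapse to a single tuple-based `str.endswith`:

```python
# Only the suffixes visible in the snippet above; the real list is longer.
BUILDING_BLOCK_SUFFIXES = ("Wrapper", "Encoder", "Decoder")

def is_building_block(model: str) -> bool:
    """True when the class name marks an internal building block."""
    return model.endswith(BUILDING_BLOCK_SUFFIXES)

print(is_building_block("BertEncoder"))  # True
print(is_building_block("BertModel"))    # False
```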
def is_a_private_model(model: str) -> bool:
"""Returns `True` if the model should not be in the main init."""
if model in PRIVATE_MODELS:
return True
return is_building_block(model) | Returns `True` if the model should not be in the main init. | is_a_private_model | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_models_are_in_init():
"""Checks all models defined in the library are in the main init."""
models_not_in_init = []
dir_transformers = dir(transformers)
for module in get_model_modules():
models_not_in_init += [
model[0] for model in get_models(module, include_pretrained=Tru... | Checks all models defined in the library are in the main init. | check_models_are_in_init | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def get_model_test_files() -> List[str]:
"""
Get the model test files.
Returns:
`List[str]`: The list of test files. The returned files will NOT contain the `tests` (i.e. `PATH_TO_TESTS`
defined in this script). They will be considered as paths relative to `tests`. A caller has to use
... |
Get the model test files.
Returns:
`List[str]`: The list of test files. The returned files will NOT contain the `tests` (i.e. `PATH_TO_TESTS`
defined in this script). They will be considered as paths relative to `tests`. A caller has to use
`os.path.join(PATH_TO_TESTS, ...)` to access ... | get_model_test_files | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def find_tested_models(test_file: str) -> List[str]:
"""
Parse the content of test_file to detect what's in `all_model_classes`. This detects the models that inherit from
the common test class.
Args:
test_file (`str`): The path to the test file to check
Returns:
`List[str]`: The li... |
Parse the content of test_file to detect what's in `all_model_classes`. This detects the models that inherit from
the common test class.
Args:
test_file (`str`): The path to the test file to check
Returns:
`List[str]`: The list of models tested in that file.
| find_tested_models | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def should_be_tested(model_name: str) -> bool:
"""
Whether or not a model should be tested.
"""
if model_name in IGNORE_NON_TESTED:
return False
return not is_building_block(model_name) |
Whether or not a model should be tested.
| should_be_tested | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_models_are_tested(module: types.ModuleType, test_file: str) -> List[str]:
"""Check models defined in a module are all tested in a given file.
Args:
module (`types.ModuleType`): The module in which we get the models.
test_file (`str`): The path to the file where the module is tested.
... | Check models defined in a module are all tested in a given file.
Args:
module (`types.ModuleType`): The module in which we get the models.
test_file (`str`): The path to the file where the module is tested.
Returns:
`List[str]`: The list of error messages corresponding to models not te... | check_models_are_tested | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_models_are_tested():
"""Check all models are properly tested."""
modules = get_model_modules()
test_files = get_model_test_files()
failures = []
for module in modules:
# Matches a module to its test file.
test_file = [file for file in test_files if f"test_{module.__name... | Check all models are properly tested. | check_all_models_are_tested | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def get_all_auto_configured_models() -> List[str]:
"""Return the list of all models in at least one auto class."""
result = set() # To avoid duplicates we concatenate all model classes in a set.
if is_torch_available():
for attr_name in dir(transformers.models.auto.modeling_auto):
if at... | Return the list of all models in at least one auto class. | get_all_auto_configured_models | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def ignore_unautoclassed(model_name: str) -> bool:
"""Rules to determine if a model should be in an auto class."""
# Special white list
if model_name in IGNORE_NON_AUTO_CONFIGURED:
return True
# Encoder and Decoder should be ignored
if "Encoder" in model_name or "Decoder" in model_name:
... | Rules to determine if a model should be in an auto class. | ignore_unautoclassed | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_models_are_auto_configured(module: types.ModuleType, all_auto_models: List[str]) -> List[str]:
"""
Check models defined in module are each in an auto class.
Args:
module (`types.ModuleType`):
The module in which we get the models.
all_auto_models (`List[str]`):
... |
Check models defined in module are each in an auto class.
Args:
module (`types.ModuleType`):
The module in which we get the models.
all_auto_models (`List[str]`):
The list of all models in an auto class (as obtained with `get_all_auto_configured_models()`).
Returns... | check_models_are_auto_configured | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_models_are_auto_configured():
"""Check all models are each in an auto class."""
# This is where we need to check we have all backends or the check is incomplete.
check_missing_backends()
modules = get_model_modules()
all_auto_models = get_all_auto_configured_models()
failures = []
... | Check all models are each in an auto class. | check_all_models_are_auto_configured | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_auto_object_names_being_defined():
"""Check all names defined in auto (name) mappings exist in the library."""
# This is where we need to check we have all backends or the check is incomplete.
check_missing_backends()
failures = []
mappings_to_check = {
"TOKENIZER_MAPPING_NAME... | Check all names defined in auto (name) mappings exist in the library. | check_all_auto_object_names_being_defined | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_auto_mapping_names_in_config_mapping_names():
"""Check all keys defined in auto mappings (mappings of names) appear in `CONFIG_MAPPING_NAMES`."""
# This is where we need to check we have all backends or the check is incomplete.
check_missing_backends()
failures = []
# `TOKENIZER_PROCE... | Check all keys defined in auto mappings (mappings of names) appear in `CONFIG_MAPPING_NAMES`. | check_all_auto_mapping_names_in_config_mapping_names | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_auto_mappings_importable():
"""Check all auto mappings can be imported."""
# This is where we need to check we have all backends or the check is incomplete.
check_missing_backends()
failures = []
mappings_to_check = {}
# Each auto modeling files contains multiple mappings. Let's g... | Check all auto mappings can be imported. | check_all_auto_mappings_importable | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_objects_being_equally_in_main_init():
"""
Check if a (TensorFlow or Flax) object is in the main __init__ iff its counterpart in PyTorch is.
"""
attrs = dir(transformers)
failures = []
for attr in attrs:
obj = getattr(transformers, attr)
if hasattr(obj, "__module__") an... |
Check if a (TensorFlow or Flax) object is in the main __init__ iff its counterpart in PyTorch is.
| check_objects_being_equally_in_main_init | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_decorator_order(filename: str) -> List[int]:
"""
Check that in a given test file, the slow decorator is always last.
Args:
filename (`str`): The path to a test file to check.
Returns:
`List[int]`: The list of failures as a list of indices where there are problems.
"""
... |
Check that in a given test file, the slow decorator is always last.
Args:
filename (`str`): The path to a test file to check.
Returns:
`List[int]`: The list of failures as a list of indices where there are problems.
| check_decorator_order | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_decorator_order():
"""Check that in all test files, the slow decorator is always last."""
errors = []
for fname in os.listdir(PATH_TO_TESTS):
if fname.endswith(".py"):
filename = os.path.join(PATH_TO_TESTS, fname)
new_errors = check_decorator_order(filename)
... | Check that in all test files, the slow decorator is always last. | check_all_decorator_order | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def find_all_documented_objects() -> List[str]:
"""
Parse the content of all doc files to detect which classes and functions it documents.
Returns:
`List[str]`: The list of all object names being documented.
`Dict[str, List[str]]`: A dictionary mapping the object name (full import path, e.g... |
Parse the content of all doc files to detect which classes and functions it documents.
Returns:
`List[str]`: The list of all object names being documented.
`Dict[str, List[str]]`: A dictionary mapping the object name (full import path, e.g.
`integrations.PeftAdapterMixin`) to its d... | find_all_documented_objects | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def ignore_undocumented(name: str) -> bool:
"""Rules to determine if `name` should be undocumented (returns `True` if it should not be documented)."""
# NOT DOCUMENTED ON PURPOSE.
# Constants uppercase are not documented.
if name.isupper():
return True
# PreTrainedModels / Encoders / Decoder... | Rules to determine if `name` should be undocumented (returns `True` if it should not be documented). | ignore_undocumented | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_all_objects_are_documented():
"""Check all models are properly documented."""
documented_objs, documented_methods_map = find_all_documented_objects()
modules = transformers._modules
objects = [c for c in dir(transformers) if c not in modules and not c.startswith("_")]
undocumented_objs = [... | Check all models are properly documented. | check_all_objects_are_documented | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_public_method_exists(documented_methods_map):
"""Check that all explicitly documented public methods are defined in the corresponding class."""
failures = []
for obj, methods in documented_methods_map.items():
# Let's ensure there is no repetition
if len(set(methods)) != len(method... | Check that all explicitly documented public methods are defined in the corresponding class. | check_public_method_exists | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_model_type_doc_match():
"""Check all doc pages have a corresponding model type."""
model_doc_folder = Path(PATH_TO_DOC) / "model_doc"
model_docs = [m.stem for m in model_doc_folder.glob("*.md")]
model_types = list(transformers.models.auto.configuration_auto.MODEL_NAMES_MAPPING.keys())
mod... | Check all doc pages have a corresponding model type. | check_model_type_doc_match | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_deprecated_constant_is_up_to_date():
"""
Check if the constant `DEPRECATED_MODELS` in `models/auto/configuration_auto.py` is up to date.
"""
deprecated_folder = os.path.join(PATH_TO_TRANSFORMERS, "models", "deprecated")
deprecated_models = [m for m in os.listdir(deprecated_folder) if not m... |
Check if the constant `DEPRECATED_MODELS` in `models/auto/configuration_auto.py` is up to date.
| check_deprecated_constant_is_up_to_date | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def check_repo_quality():
"""Check all models are tested and documented."""
print("Repository-wide checks:")
print(" - checking all models are included.")
check_model_list()
print(" - checking all models are public.")
check_models_are_in_init()
print(" - checking all models have tes... | Check all models are tested and documented. | check_repo_quality | python | huggingface/transformers | utils/check_repo.py | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Apache-2.0 |
def find_priority_list(py_files):
"""
Given a list of modular files, sorts them by topological order. Modular models that DON'T depend on other modular
models will be higher in the topological order.
Args:
py_files: List of paths to the modular files
Returns:
A tuple with the order... |
Given a list of modular files, sorts them by topological order. Modular models that DON'T depend on other modular
models will be higher in the topological order.
Args:
py_files: List of paths to the modular files
Returns:
A tuple with the ordered files (list) and their dependencies (d... | find_priority_list | python | huggingface/transformers | utils/create_dependency_mapping.py | https://github.com/huggingface/transformers/blob/master/utils/create_dependency_mapping.py | Apache-2.0 |
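`find_priority_list` sorts modular files topologically so that files depending on no other modular file come first. With the dependency map already in hand, the ordering itself is a standard topological sort — sketched here with the stdlib `graphlib`; the real script additionally derives the dependencies by parsing imports:

```python
from graphlib import TopologicalSorter

def order_modular_files(dependencies: dict[str, set[str]]) -> list[str]:
    """Return files so that each appears after everything it depends on."""
    return list(TopologicalSorter(dependencies).static_order())

# toy dependency map: gemma builds on llama, gemma2 builds on gemma
deps = {
    "modular_llama.py": set(),
    "modular_gemma.py": {"modular_llama.py"},
    "modular_gemma2.py": {"modular_gemma.py"},
}
order = order_modular_files(deps)
print(order)  # llama first, then gemma, then gemma2
```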
def get_processor_types_from_config_class(config_class, allowed_mappings=None):
"""Return a tuple of processors for `config_class`.
We use `tuple` here to include (potentially) both slow & fast tokenizers.
"""
# To make a uniform return type
def _to_tuple(x):
if not isinstance(x, collectio... | Return a tuple of processors for `config_class`.
We use `tuple` here to include (potentially) both slow & fast tokenizers.
| get_processor_types_from_config_class | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_architectures_from_config_class(config_class, arch_mappings, models_to_skip=None):
"""Return a tuple of all possible architectures attributed to a configuration class `config_class`.
For example, BertConfig -> [BertModel, BertForMaskedLM, ..., BertForQuestionAnswering].
"""
# A model architectu... | Return a tuple of all possible architectures attributed to a configuration class `config_class`.
For example, BertConfig -> [BertModel, BertForMaskedLM, ..., BertForQuestionAnswering].
| get_architectures_from_config_class | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_config_class_from_processor_class(processor_class):
"""Get the config class from a processor class.
Some config/model classes use tokenizers/feature_extractors from other models. For example, `GPT-J` uses
`GPT2Tokenizer`. If no checkpoint is found for a config class, or a checkpoint is found withou... | Get the config class from a processor class.
Some config/model classes use tokenizers/feature_extractors from other models. For example, `GPT-J` uses
`GPT2Tokenizer`. If no checkpoint is found for a config class, or a checkpoint is found without necessary file(s) to
create the processor for `processor_clas... | get_config_class_from_processor_class | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def build_processor(config_class, processor_class, allow_no_checkpoint=False):
"""Create a processor for `processor_class`.
If a processor is not able to be built with the original arguments, this method tries to change the arguments and
call itself recursively, by inferring a new `config_class` or a new `... | Create a processor for `processor_class`.
If a processor is not able to be built with the original arguments, this method tries to change the arguments and
call itself recursively, by inferring a new `config_class` or a new `processor_class` from another one, in order to
find a checkpoint containing the ne... | build_processor | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_tiny_config(config_class, model_class=None, **model_tester_kwargs):
"""Retrieve a tiny configuration from `config_class` using each model's `ModelTester`.
Args:
config_class: Subclass of `PreTrainedConfig`.
Returns:
An instance of `config_class` with tiny hyperparameters
"""
... | Retrieve a tiny configuration from `config_class` using each model's `ModelTester`.
Args:
config_class: Subclass of `PreTrainedConfig`.
Returns:
An instance of `config_class` with tiny hyperparameters
| get_tiny_config | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def convert_processors(processors, tiny_config, output_folder, result):
"""Change a processor to work with smaller inputs.
For tokenizers, we try to reduce their vocabulary size.
For feature extractor, we use smaller image size or change
other attributes using the values from `tiny_config`. See `conve... | Change a processor to work with smaller inputs.
For tokenizers, we try to reduce their vocabulary size.
For feature extractor, we use smaller image size or change
other attributes using the values from `tiny_config`. See `convert_feature_extractor`.
This method should not fail: we catch the errors an... | convert_processors | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def _sanity_check(fast_tokenizer, slow_tokenizer, keep_fast_tokenizer=False):
"""Set tokenizer(s) to `None` if the fast/slow tokenizers have different values for `vocab_size` or `length`.
If `keep_fast_tokenizer=True`, the fast tokenizer will be kept.
"""
# sanity check 1: fast and slow... | Set tokenizer(s) to `None` if the fast/slow tokenizers have different values for `vocab_size` or `length`.
If `keep_fast_tokenizer=True`, the fast tokenizer will be kept.
| _sanity_check | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_checkpoint_dir(output_dir, model_arch):
"""Get framework-agnostic architecture name. Used to save all PT/TF/Flax models into the same directory."""
arch_name = model_arch.__name__
if arch_name.startswith("TF"):
arch_name = arch_name[2:]
elif arch_name.startswith("Flax"):
arch_na... | Get framework-agnostic architecture name. Used to save all PT/TF/Flax models into the same directory. | get_checkpoint_dir | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
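The prefix-stripping shown in this row can be exercised with a self-contained sketch; unlike the original, this version takes the architecture name as a string rather than the model class:

```python
import os

def get_checkpoint_dir_sketch(output_dir, arch_name):
    # Strip the framework prefix so "TFBertModel", "FlaxBertModel" and
    # "BertModel" all resolve to the same checkpoint directory.
    if arch_name.startswith("TF"):
        arch_name = arch_name[2:]
    elif arch_name.startswith("Flax"):
        arch_name = arch_name[4:]
    return os.path.join(output_dir, arch_name)

pt = get_checkpoint_dir_sketch("tiny_models", "BertModel")
tf = get_checkpoint_dir_sketch("tiny_models", "TFBertModel")
flax = get_checkpoint_dir_sketch("tiny_models", "FlaxBertModel")
```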
def build_model(model_arch, tiny_config, output_dir):
"""Create and save a model for `model_arch`.
Also copy the set of processors to each model (under the same model type) output folder.
"""
checkpoint_dir = get_checkpoint_dir(output_dir, model_arch)
processor_output_dir = os.path.join(output_di... | Create and save a model for `model_arch`.
Also copy the set of processors to each model (under the same model type) output folder.
| build_model | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def fill_result_with_error(result, error, trace, models_to_create):
"""Fill `result` with errors for all target model arch if we can't build processor"""
error = (error, trace)
result["error"] = error
for framework in FRAMEWORKS:
if framework in models_to_create:
result[framework] = ... | Fill `result` with errors for all target model arch if we can't build processor | fill_result_with_error | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_token_id_from_tokenizer(token_id_name, tokenizer, original_token_id):
"""Use `tokenizer` to get the values of `bos_token_id`, `eos_token_ids`, etc.
The argument `token_id_name` should be a string ending with `_token_id`, and `original_token_id` should be an
integer that will be return if `tokenizer... | Use `tokenizer` to get the values of `bos_token_id`, `eos_token_ids`, etc.
The argument `token_id_name` should be a string ending with `_token_id`, and `original_token_id` should be an
integer that will be return if `tokenizer` has no token corresponding to `token_id_name`.
| get_token_id_from_tokenizer | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
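The fallback behavior this docstring describes (use the tokenizer's value if present, otherwise keep the original id) can be sketched with `getattr`; the real helper likely does more (e.g. resolving token strings to ids), so treat this as a simplified reading with a dummy tokenizer:

```python
def get_token_id_sketch(token_id_name, tokenizer, original_token_id):
    # Fall back to the original value when the tokenizer does not define
    # the attribute, as the docstring above specifies.
    if not token_id_name.endswith("_token_id"):
        raise ValueError("`token_id_name` must end with `_token_id`")
    token_id = getattr(tokenizer, token_id_name, None)
    return original_token_id if token_id is None else token_id

class DummyTokenizer:
    # Hypothetical stand-in for a real tokenizer; only eos is defined.
    eos_token_id = 2

eos = get_token_id_sketch("eos_token_id", DummyTokenizer(), 99)
bos = get_token_id_sketch("bos_token_id", DummyTokenizer(), 99)
```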
def build(config_class, models_to_create, output_dir):
"""Create all models for a certain model type.
Args:
config_class (`PretrainedConfig`):
A subclass of `PretrainedConfig` that is used to determine `models_to_create`.
models_to_create (`dict`):
A dictionary containin... | Create all models for a certain model type.
Args:
config_class (`PretrainedConfig`):
A subclass of `PretrainedConfig` that is used to determine `models_to_create`.
models_to_create (`dict`):
A dictionary containing the processor/model classes that we want to create the insta... | build | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def build_tiny_model_summary(results, organization=None, token=None):
"""Build a summary: a dictionary of the form
{
model architecture name:
{
"tokenizer_classes": [...],
"processor_classes": [...],
"model_classes": [...],
}
..
}
"""
tiny_mo... | Build a summary: a dictionary of the form
{
model architecture name:
{
"tokenizer_classes": [...],
"processor_classes": [...],
"model_classes": [...],
}
..
}
| build_tiny_model_summary | python | huggingface/transformers | utils/create_dummy_models.py | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Apache-2.0 |
def get_indent(line: str) -> str:
"""Returns the indent in given line (as string)."""
search = _re_indent.search(line)
return "" if search is None else search.groups()[0] | Returns the indent in given line (as string). | get_indent | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def split_code_in_indented_blocks(
code: str, indent_level: str = "", start_prompt: Optional[str] = None, end_prompt: Optional[str] = None
) -> List[str]:
"""
Split some code into its indented blocks, starting at a given level.
Args:
code (`str`): The code to split.
indent_level (`str`)... |
Split some code into its indented blocks, starting at a given level.
Args:
code (`str`): The code to split.
indent_level (`str`): The indent level (as string) to use for identifying the blocks to split.
start_prompt (`str`, *optional*): If provided, only starts splitting at the line wh... | split_code_in_indented_blocks | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def ignore_underscore_and_lowercase(key: Callable[[Any], str]) -> Callable[[Any], str]:
"""
Wraps a key function (as used in a sort) to lowercase and ignore underscores.
"""
def _inner(x):
return key(x).lower().replace("_", "")
return _inner |
Wraps a key function (as used in a sort) to lowercase and ignore underscores.
| ignore_underscore_and_lowercase | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def sort_objects(objects: List[Any], key: Optional[Callable[[Any], str]] = None) -> List[Any]:
"""
Sort a list of objects following the rules of isort (all uppercased first, camel-cased second and lower-cased
last).
Args:
objects (`List[Any]`):
The list of objects to sort.
k... |
Sort a list of objects following the rules of isort (all uppercased first, camel-cased second and lower-cased
last).
Args:
objects (`List[Any]`):
The list of objects to sort.
key (`Callable[[Any], str]`, *optional*):
A function taking an object as input and returnin... | sort_objects | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
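The three-bucket isort rule described here (all-uppercased first, camel-cased second, lower-cased last) can be sketched directly; this is a simplified reading of the rule, not the exact implementation in the file:

```python
def ignore_underscore_and_lowercase(key):
    # Wrap a sort key so comparisons ignore case and underscores.
    def _inner(x):
        return key(x).lower().replace("_", "")
    return _inner

def sort_objects(objects, key=None):
    # isort rule: ALL_CAPS constants first, CamelCase names second,
    # lower_case names last; alphabetical inside each group.
    if key is None:
        key = lambda x: x
    constants = [o for o in objects if key(o).isupper()]
    classes = [o for o in objects if key(o)[0].isupper() and not key(o).isupper()]
    functions = [o for o in objects if not key(o)[0].isupper()]
    inner = ignore_underscore_and_lowercase(key)
    return (
        sorted(constants, key=inner)
        + sorted(classes, key=inner)
        + sorted(functions, key=inner)
    )

ordered = sort_objects(["load_model", "BertModel", "MODEL_NAMES", "AutoConfig"])
```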
def sort_objects_in_import(import_statement: str) -> str:
"""
Sorts the imports in a single import statement.
Args:
import_statement (`str`): The import statement in which to sort the imports.
Returns:
`str`: The same as the input, but with objects properly sorted.
"""
# This ... |
Sorts the imports in a single import statement.
Args:
import_statement (`str`): The import statement in which to sort the imports.
Returns:
`str`: The same as the input, but with objects properly sorted.
| sort_objects_in_import | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def sort_imports(file: str, check_only: bool = True):
"""
Sort the imports defined in the `_import_structure` of a given init.
Args:
file (`str`): The path to the init to check/fix.
check_only (`bool`, *optional*, defaults to `True`): Whether or not to just check (and not auto-fix) the init... |
Sort the imports defined in the `_import_structure` of a given init.
Args:
file (`str`): The path to the init to check/fix.
check_only (`bool`, *optional*, defaults to `True`): Whether or not to just check (and not auto-fix) the init.
| sort_imports | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def sort_imports_in_all_inits(check_only=True):
"""
Sort the imports defined in the `_import_structure` of all inits in the repo.
Args:
check_only (`bool`, *optional*, defaults to `True`): Whether or not to just check (and not auto-fix) the init.
"""
failures = []
for root, _, files in ... |
Sort the imports defined in the `_import_structure` of all inits in the repo.
Args:
check_only (`bool`, *optional*, defaults to `True`): Whether or not to just check (and not auto-fix) the init.
| sort_imports_in_all_inits | python | huggingface/transformers | utils/custom_init_isort.py | https://github.com/huggingface/transformers/blob/master/utils/custom_init_isort.py | Apache-2.0 |
def update_main_init_file(models):
"""
Replace all instances of model.model_name with model.deprecated.model_name in the __init__.py file
Args:
models (List[str]): The models to mark as deprecated
"""
filename = REPO_PATH / "src/transformers/__init__.py"
with open(filename, "r") as f:
... |
Replace all instances of model.model_name with model.deprecated.model_name in the __init__.py file
Args:
models (List[str]): The models to mark as deprecated
| update_main_init_file | python | huggingface/transformers | utils/deprecate_models.py | https://github.com/huggingface/transformers/blob/master/utils/deprecate_models.py | Apache-2.0 |
def remove_model_references_from_file(filename, models, condition):
"""
Remove all references to the given models from the given file
Args:
filename (str): The file to remove the references from
models (List[str]): The models to remove
condition (Callable): A function that takes the... |
Remove all references to the given models from the given file
Args:
filename (str): The file to remove the references from
models (List[str]): The models to remove
condition (Callable): A function that takes the line and model and returns True if the line should be removed
| remove_model_references_from_file | python | huggingface/transformers | utils/deprecate_models.py | https://github.com/huggingface/transformers/blob/master/utils/deprecate_models.py | Apache-2.0 |
def remove_model_config_classes_from_config_check(model_config_classes):
"""
Remove the deprecated model config classes from the check_config_attributes.py file
Args:
model_config_classes (List[str]): The model config classes to remove e.g. ["BertConfig", "DistilBertConfig"]
"""
filename = ... |
Remove the deprecated model config classes from the check_config_attributes.py file
Args:
model_config_classes (List[str]): The model config classes to remove e.g. ["BertConfig", "DistilBertConfig"]
| remove_model_config_classes_from_config_check | python | huggingface/transformers | utils/deprecate_models.py | https://github.com/huggingface/transformers/blob/master/utils/deprecate_models.py | Apache-2.0 |
def add_models_to_deprecated_models_in_config_auto(models):
"""
Add the models to the DEPRECATED_MODELS list in configuration_auto.py and sorts the list
to be in alphabetical order.
"""
filepath = REPO_PATH / "src/transformers/models/auto/configuration_auto.py"
with open(filepath, "r") as f:
... |
Add the models to the DEPRECATED_MODELS list in configuration_auto.py and sorts the list
to be in alphabetical order.
| add_models_to_deprecated_models_in_config_auto | python | huggingface/transformers | utils/deprecate_models.py | https://github.com/huggingface/transformers/blob/master/utils/deprecate_models.py | Apache-2.0 |
def extract_warnings_from_single_artifact(artifact_path, targets):
"""Extract warnings from a downloaded artifact (in .zip format)"""
selected_warnings = set()
buffer = []
def parse_line(fp):
for line in fp:
if isinstance(line, bytes):
line = line.decode("UTF-8")
... | Extract warnings from a downloaded artifact (in .zip format) | extract_warnings_from_single_artifact | python | huggingface/transformers | utils/extract_warnings.py | https://github.com/huggingface/transformers/blob/master/utils/extract_warnings.py | Apache-2.0 |
def extract_warnings(artifact_dir, targets):
"""Extract warnings from all artifact files"""
selected_warnings = set()
paths = [os.path.join(artifact_dir, p) for p in os.listdir(artifact_dir) if (p.endswith(".zip") or from_gh)]
for p in paths:
selected_warnings.update(extract_warnings_from_sing... | Extract warnings from all artifact files | extract_warnings | python | huggingface/transformers | utils/extract_warnings.py | https://github.com/huggingface/transformers/blob/master/utils/extract_warnings.py | Apache-2.0 |
def get_jobs(workflow_run_id, token=None):
"""Extract jobs in a GitHub Actions workflow run"""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "Authorization": f"Bearer {token}"}
url = f"https://api.github.com/repos/huggingface/transformers/actions/runs... | Extract jobs in a GitHub Actions workflow run | get_jobs | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def get_job_links(workflow_run_id, token=None):
"""Extract job names and their job links in a GitHub Actions workflow run"""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "Authorization": f"Bearer {token}"}
url = f"https://api.github.com/repos/hugging... | Extract job names and their job links in a GitHub Actions workflow run | get_job_links | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def get_artifacts_links(worflow_run_id, token=None):
"""Get all artifact links from a workflow run"""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "Authorization": f"Bearer {token}"}
url = f"https://api.github.com/repos/huggingface/transformers/actio... | Get all artifact links from a workflow run | get_artifacts_links | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def download_artifact(artifact_name, artifact_url, output_dir, token):
"""Download a GitHub Action artifact from a URL.
The URL is of the form `https://api.github.com/repos/huggingface/transformers/actions/artifacts/{ARTIFACT_ID}/zip`,
but it can't be used to download directly. We need to get a redirect UR... | Download a GitHub Action artifact from a URL.
The URL is of the form `https://api.github.com/repos/huggingface/transformers/actions/artifacts/{ARTIFACT_ID}/zip`,
but it can't be used to download directly. We need to get a redirect URL first.
See https://docs.github.com/en/rest/actions/artifacts#download-an... | download_artifact | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def get_errors_from_single_artifact(artifact_zip_path, job_links=None):
"""Extract errors from a downloaded artifact (in .zip format)"""
errors = []
failed_tests = []
job_name = None
with zipfile.ZipFile(artifact_zip_path) as z:
for filename in z.namelist():
if not os.path.isdir... | Extract errors from a downloaded artifact (in .zip format) | get_errors_from_single_artifact | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def get_all_errors(artifact_dir, job_links=None):
"""Extract errors from all artifact files"""
errors = []
paths = [os.path.join(artifact_dir, p) for p in os.listdir(artifact_dir) if p.endswith(".zip")]
for p in paths:
errors.extend(get_errors_from_single_artifact(p, job_links=job_links))
... | Extract errors from all artifact files | get_all_errors | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def get_model(test):
"""Get the model name from a test method"""
test = test.split("::")[0]
if test.startswith("tests/models/"):
test = test.split("/")[2]
else:
test = None
return test | Get the model name from a test method | get_model | python | huggingface/transformers | utils/get_ci_error_statistics.py | https://github.com/huggingface/transformers/blob/master/utils/get_ci_error_statistics.py | Apache-2.0 |
def extract_time_from_single_job(job):
"""Extract time info from a single job in a GitHub Actions workflow run"""
job_info = {}
start = job["started_at"]
end = job["completed_at"]
start_datetime = date_parser.parse(start)
end_datetime = date_parser.parse(end)
duration_in_min = round((end... | Extract time info from a single job in a GitHub Actions workflow run | extract_time_from_single_job | python | huggingface/transformers | utils/get_github_job_time.py | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Apache-2.0 |
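The duration computation in this row can be reproduced without third-party code; the original uses `dateutil`, but GitHub's ISO-8601 timestamps also parse with the stdlib after normalizing the trailing `Z`:

```python
from datetime import datetime

def job_duration_minutes(started_at, completed_at):
    # Normalize "Z" to "+00:00" so datetime.fromisoformat accepts it,
    # then round the elapsed time to whole minutes as the row above does.
    start = datetime.fromisoformat(started_at.replace("Z", "+00:00"))
    end = datetime.fromisoformat(completed_at.replace("Z", "+00:00"))
    return round((end - start).total_seconds() / 60.0)

minutes = job_duration_minutes("2024-01-01T10:00:00Z", "2024-01-01T10:31:00Z")
```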
def get_job_time(workflow_run_id, token=None):
"""Extract time info for all jobs in a GitHub Actions workflow run"""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "Authorization": f"Bearer {token}"}
url = f"https://api.github.com/repos/huggingface/tra... | Extract time info for all jobs in a GitHub Actions workflow run | get_job_time | python | huggingface/transformers | utils/get_github_job_time.py | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Apache-2.0 |
def get_daily_ci_runs(token, num_runs=7, workflow_id=None):
"""Get the workflow runs of the scheduled (daily) CI.
This only selects the runs triggered by the `schedule` event on the `main` branch.
"""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "... | Get the workflow runs of the scheduled (daily) CI.
This only selects the runs triggered by the `schedule` event on the `main` branch.
| get_daily_ci_runs | python | huggingface/transformers | utils/get_previous_daily_ci.py | https://github.com/huggingface/transformers/blob/master/utils/get_previous_daily_ci.py | Apache-2.0 |
def get_last_daily_ci_run(token, workflow_run_id=None, workflow_id=None, commit_sha=None):
"""Get the last completed workflow run id of the scheduled (daily) CI."""
headers = None
if token is not None:
headers = {"Accept": "application/vnd.github+json", "Authorization": f"Bearer {token}"}
workf... | Get the last completed workflow run id of the scheduled (daily) CI. | get_last_daily_ci_run | python | huggingface/transformers | utils/get_previous_daily_ci.py | https://github.com/huggingface/transformers/blob/master/utils/get_previous_daily_ci.py | Apache-2.0 |
def get_last_daily_ci_workflow_run_id(token, workflow_run_id=None, workflow_id=None, commit_sha=None):
"""Get the last completed workflow run id of the scheduled (daily) CI."""
if workflow_run_id is not None and workflow_run_id != "":
return workflow_run_id
workflow_run = get_last_daily_ci_run(toke... | Get the last completed workflow run id of the scheduled (daily) CI. | get_last_daily_ci_workflow_run_id | python | huggingface/transformers | utils/get_previous_daily_ci.py | https://github.com/huggingface/transformers/blob/master/utils/get_previous_daily_ci.py | Apache-2.0 |
def get_last_daily_ci_artifacts(
artifact_names, output_dir, token, workflow_run_id=None, workflow_id=None, commit_sha=None
):
"""Get the artifacts of last completed workflow run id of the scheduled (daily) CI."""
workflow_run_id = get_last_daily_ci_workflow_run_id(
token, workflow_run_id=workflow_r... | Get the artifacts of last completed workflow run id of the scheduled (daily) CI. | get_last_daily_ci_artifacts | python | huggingface/transformers | utils/get_previous_daily_ci.py | https://github.com/huggingface/transformers/blob/master/utils/get_previous_daily_ci.py | Apache-2.0 |
def get_last_daily_ci_reports(
artifact_names, output_dir, token, workflow_run_id=None, workflow_id=None, commit_sha=None
):
"""Get the artifacts' content of the last completed workflow run id of the scheduled (daily) CI."""
get_last_daily_ci_artifacts(
artifact_names,
output_dir,
to... | Get the artifacts' content of the last completed workflow run id of the scheduled (daily) CI. | get_last_daily_ci_reports | python | huggingface/transformers | utils/get_previous_daily_ci.py | https://github.com/huggingface/transformers/blob/master/utils/get_previous_daily_ci.py | Apache-2.0 |
def get_module_path(test_file):
"""Return the module path of a model test file."""
components = test_file.split(os.path.sep)
if components[0:2] != ["tests", "models"]:
raise ValueError(
"`test_file` should start with `tests/models/` (with `/` being the OS specific path separator). Got "
... | Return the module path of a model test file. | get_module_path | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
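The path-to-module conversion sketched below follows the validation visible in this row; the suffix handling after the truncation is an assumption about how the module name is derived:

```python
import os

def get_module_path_sketch(test_file):
    # "tests/models/bert/test_modeling_bert.py"
    #   -> "tests.models.bert.test_modeling_bert"
    components = test_file.split(os.path.sep)
    if components[:2] != ["tests", "models"]:
        raise ValueError("`test_file` should start with `tests/models/`.")
    module_name = components[-1].removesuffix(".py")
    return ".".join(components[:-1] + [module_name])

module_path = get_module_path_sketch(
    os.path.join("tests", "models", "bert", "test_modeling_bert.py")
)
```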
def get_test_module(test_file):
"""Get the module of a model test file."""
test_module_path = get_module_path(test_file)
try:
test_module = importlib.import_module(test_module_path)
except AttributeError as exc:
# e.g. if you have a `tests` folder in `site-packages`, created by another p... | Get the module of a model test file. | get_test_module | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_tester_classes(test_file):
"""Get all classes in a model test file whose names ends with `ModelTester`."""
tester_classes = []
test_module = get_test_module(test_file)
for attr in dir(test_module):
if attr.endswith("ModelTester"):
tester_classes.append(getattr(test_module, at... | Get all classes in a model test file whose names ends with `ModelTester`. | get_tester_classes | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_test_classes(test_file):
"""Get all [test] classes in a model test file with attribute `all_model_classes` that are non-empty.
These are usually the (model) test classes containing the (non-slow) tests to run and are subclasses of one of the
classes `ModelTesterMixin`, `TFModelTesterMixin` or `Flax... | Get all [test] classes in a model test file with attribute `all_model_classes` that are non-empty.
These are usually the (model) test classes containing the (non-slow) tests to run and are subclasses of one of the
classes `ModelTesterMixin`, `TFModelTesterMixin` or `FlaxModelTesterMixin`, as well as a subclass... | get_test_classes | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_model_classes(test_file):
"""Get all model classes that appear in `all_model_classes` attributes in a model test file."""
test_classes = get_test_classes(test_file)
model_classes = set()
for test_class in test_classes:
model_classes.update(test_class.all_model_classes)
# sort with c... | Get all model classes that appear in `all_model_classes` attributes in a model test file. | get_model_classes | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_model_tester_from_test_class(test_class):
"""Get the model tester class of a model test class."""
test = test_class()
if hasattr(test, "setUp"):
test.setUp()
model_tester = None
if hasattr(test, "model_tester"):
# `(TF/Flax)ModelTesterMixin` has this attribute default to `No... | Get the model tester class of a model test class. | get_model_tester_from_test_class | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_test_classes_for_model(test_file, model_class):
"""Get all [test] classes in `test_file` that have `model_class` in their `all_model_classes`."""
test_classes = get_test_classes(test_file)
target_test_classes = []
for test_class in test_classes:
if model_class in test_class.all_model_cl... | Get all [test] classes in `test_file` that have `model_class` in their `all_model_classes`. | get_test_classes_for_model | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_tester_classes_for_model(test_file, model_class):
"""Get all model tester classes in `test_file` that are associated to `model_class`."""
test_classes = get_test_classes_for_model(test_file, model_class)
tester_classes = []
for test_class in test_classes:
tester_class = get_model_tester... | Get all model tester classes in `test_file` that are associated to `model_class`. | get_tester_classes_for_model | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_test_to_tester_mapping(test_file):
"""Get a mapping from [test] classes to model tester classes in `test_file`.
This uses `get_test_classes` which may return classes that are NOT subclasses of `unittest.TestCase`.
"""
test_classes = get_test_classes(test_file)
test_tester_mapping = {test_cl... | Get a mapping from [test] classes to model tester classes in `test_file`.
This uses `get_test_classes` which may return classes that are NOT subclasses of `unittest.TestCase`.
| get_test_to_tester_mapping | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_model_to_test_mapping(test_file):
"""Get a mapping from model classes to test classes in `test_file`."""
model_classes = get_model_classes(test_file)
model_test_mapping = {
model_class: get_test_classes_for_model(test_file, model_class) for model_class in model_classes
}
return model... | Get a mapping from model classes to test classes in `test_file`. | get_model_to_test_mapping | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def get_model_to_tester_mapping(test_file):
"""Get a mapping from model classes to model tester classes in `test_file`."""
model_classes = get_model_classes(test_file)
model_to_tester_mapping = {
model_class: get_tester_classes_for_model(test_file, model_class) for model_class in model_classes
}... | Get a mapping from model classes to model tester classes in `test_file`. | get_model_to_tester_mapping | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |
def to_json(o):
"""Make the information succinct and easy to read.
Avoid the full class representation like `<class 'transformers.models.bert.modeling_bert.BertForMaskedLM'>` when
displaying the results. Instead, we use class name (`BertForMaskedLM`) for the readability.
"""
if isinstance(o, str):
... | Make the information succinct and easy to read.
Avoid the full class representation like `<class 'transformers.models.bert.modeling_bert.BertForMaskedLM'>` when
displaying the results. Instead, we use class name (`BertForMaskedLM`) for the readability.
| to_json | python | huggingface/transformers | utils/get_test_info.py | https://github.com/huggingface/transformers/blob/master/utils/get_test_info.py | Apache-2.0 |