| sample_id string | repo string | question_type string | hop_depth int64 | gold unknown | metadata unknown |
|---|---|---|---|---|---|
colinhacks__zod___isoDate__upstream__2hop_959fe2 | colinhacks/zod | upstream | 2 | {
"hop_1": [
"date"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/mini/iso.ts"
],
"hop_2": [
"date",
"convertBaseSchema"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/schemas.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/from-json-schema.ts"
]
} | {
"anchor": "_isoDate",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/api.ts",
"anchor_source": "export function _isoDate<T extends schemas.$ZodISODate>(\n Class: util.SchemaClass<T>,\n params?: string | $ZodISODateParams | $ZodCheckISODateParams\n): T {\n return new Class({\n type: \"string\",\n format: \"date\",\n check: \"string_format\",\n ...util.normalizeParams(params),\n });\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.594501+00:00",
"file_content": ""
} |
celery__celery__on_apply__downstream__1hop_6b7979 | celery/celery | downstream | 1 | {
"calls": [
"on_start",
"_add_to_pool_map",
"_make_killable_target",
"TaskPool"
],
"call_files": [
"private/tmp/repos/celery/celery/concurrency/eventlet.py",
"private/tmp/repos/celery/celery/concurrency/eventlet.py",
"private/tmp/repos/celery/celery/concurrency/eventlet.py",
"private/tmp/repos/celery/celery/concurrency/eventlet.py"
]
} | {
"anchor": "on_apply",
"anchor_file": "private/tmp/repos/celery/celery/concurrency/eventlet.py",
"anchor_source": "def on_apply(self, target, args=None, kwargs=None, callback=None,\n accept_callback=None, **_):\n target = TaskPool._make_killable_target(target)\n self._quick_apply_sig(sender=self, target=target, args=args, kwargs=kwargs,)\n greenlet = self._quick_put(\n apply_target,\n target, args,\n kwargs,\n callback,\n accept_callback,\n self.getpid\n )\n self._add_to_pool_map(id(greenlet), greenlet)",
"result_size": 4,
"created_at": "2026-03-20T17:18:06.437346+00:00"
} |
psf__requests__merge_environment_settings__upstream__2hop_ddaf90 | psf/requests | upstream | 2 | {
"hop_1": [
"request"
],
"hop_1_files": [
"private/tmp/repos/requests/src/requests/sessions.py"
],
"hop_2": [
"delete",
"post",
"put",
"request",
"get",
"head",
"patch",
"options"
],
"hop_2_files": [
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/api.py",
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/sessions.py",
"private/tmp/repos/requests/src/requests/sessions.py"
]
} | {
"anchor": "merge_environment_settings",
"anchor_file": "private/tmp/repos/requests/src/requests/sessions.py",
"anchor_source": "def merge_environment_settings(self, url, proxies, stream, verify, cert):\n \"\"\"\n Check the environment and merge it with some settings.\n\n :rtype: dict\n \"\"\"\n # Gather clues from the surrounding environment.\n if self.trust_env:\n # Set environment's proxies.\n no_proxy = proxies.get(\"no_proxy\") if proxies is not None else None\n env_proxies = get_environ_proxies(url, no_proxy=no_proxy)\n for k, v in env_proxies.items():\n proxies.setdefault(k, v)\n\n # Look for requests environment configuration\n # and be compatible with cURL.\n if verify is True or verify is None:\n verify = (\n os.environ.get(\"REQUESTS_CA_BUNDLE\")\n or os.environ.get(\"CURL_CA_BUNDLE\")\n or verify\n )\n\n # Merge all the kwargs.\n proxies = merge_setting(proxies, self.proxies)\n stream = merge_setting(stream, self.stream)\n verify = merge_setting(verify, self.verify)\n cert = merge_setting(cert, self.cert)\n\n return {\"proxies\": proxies, \"stream\": stream, \"verify\": verify, \"cert\": cert}",
"result_size": 8,
"created_at": "2026-03-20T17:18:07.647326+00:00",
"file_content": "\"\"\"\nrequests.sessions\n~~~~~~~~~~~~~~~~~\n\nThis module provides a Session object to manage and persist settings across\nrequests (cookies, auth, proxies).\n\"\"\"\n\nimport os\nimport sys\nimport time\nfrom collections import OrderedDict\nfrom datetime import timedelta\n\nfrom ._internal_utils import to_native_string\nfrom .adapters import HTTPAdapter\nfrom .auth import _basic_auth_str\nfrom .compat import Mapping, cookielib, urljoin, urlparse\nfrom .cookies import (\n RequestsCookieJar,\n cookiejar_from_dict,\n extract_cookies_to_jar,\n merge_cookies,\n)\nfrom .exceptions import (\n ChunkedEncodingError,\n ContentDecodingError,\n InvalidSchema,\n TooManyRedirects,\n)\nfrom .hooks import default_hooks, dispatch_hook\n\n# formerly defined here, reexposed here for backward compatibility\nfrom .models import ( # noqa: F401\n DEFAULT_REDIRECT_LIMIT,\n REDIRECT_STATI,\n PreparedRequest,\n Request,\n)\nfrom .status_codes import codes\nfrom .structures import CaseInsensitiveDict\nfrom .utils import ( # noqa: F401\n DEFAULT_PORTS,\n default_headers,\n get_auth_from_url,\n get_environ_proxies,\n get_netrc_auth,\n requote_uri,\n resolve_proxies,\n rewind_body,\n should_bypass_proxies,\n to_key_val_list,\n)\n\n# Preferred clock, based on which one is more accurate on a given system.\nif sys.platform == \"win32\":\n preferred_clock = time.perf_counter\nelse:\n preferred_clock = time.time\n\n\ndef merge_setting(request_setting, session_setting, dict_class=OrderedDict):\n \"\"\"Determines appropriate setting for a given request, taking into account\n the explicit setting on that request, and the setting in the session. If a\n setting is a dictionary, they will be merged together using `dict_class`\n \"\"\"\n\n if session_setting is None:\n return request_setting\n\n if request_setting is None:\n return session_setting\n\n # Bypass if not a dictionary (e.g. 
verify)\n if not (\n isinstance(session_setting, Mapping) and isinstance(request_setting, Mapping)\n ):\n return request_setting\n\n merged_setting = dict_class(to_key_val_list(session_setting))\n merged_setting.update(to_key_val_list(request_setting))\n\n # Remove keys that are set to None. Extract keys first to avoid altering\n # the dictionary during iteration.\n none_keys = [k for (k, v) in merged_setting.items() if v is None]\n for key in none_keys:\n del merged_setting[key]\n\n return merged_setting\n\n\ndef merge_hooks(request_hooks, session_hooks, dict_class=OrderedDict):\n \"\"\"Properly merges both requests and session hooks.\n\n This is necessary because when request_hooks == {'response': []}, the\n merge breaks Session hooks entirely.\n \"\"\"\n if session_hooks is None or session_hooks.get(\"response\") == []:\n return request_hooks\n\n if request_hooks is None or request_hooks.get(\"response\") == []:\n return session_hooks\n\n return merge_setting(request_hooks, session_hooks, dict_class)\n\n\nclass SessionRedirectMixin:\n def get_redirect_target(self, resp):\n \"\"\"Receives a Response. 
Returns a redirect URI or ``None``\"\"\"\n # Due to the nature of how requests processes redirects this method will\n # be called at least once upon the original response and at least twice\n # on each subsequent redirect response (if any).\n # If a custom mixin is used to handle this logic, it may be advantageous\n # to cache the redirect location onto the response object as a private\n # attribute.\n if resp.is_redirect:\n location = resp.headers[\"location\"]\n # Currently the underlying http module on py3 decode headers\n # in latin1, but empirical evidence suggests that latin1 is very\n # rarely used with non-ASCII characters in HTTP headers.\n # It is more likely to get UTF8 header rather than latin1.\n # This causes incorrect handling of UTF8 encoded location headers.\n # To solve this, we re-encode the location in latin1.\n location = location.encode(\"latin1\")\n return to_native_string(location, \"utf8\")\n return None\n\n def should_strip_auth(self, old_url, new_url):\n \"\"\"Decide whether Authorization header should be removed when redirecting\"\"\"\n old_parsed = urlparse(old_url)\n new_parsed = urlparse(new_url)\n if old_parsed.hostname != new_parsed.hostname:\n return True\n # Special case: allow http -> https redirect when using the standard\n # ports. 
This isn't specified by RFC 7235, but is kept to avoid\n # breaking backwards compatibility with older versions of requests\n # that allowed any redirects on the same host.\n if (\n old_parsed.scheme == \"http\"\n and old_parsed.port in (80, None)\n and new_parsed.scheme == \"https\"\n and new_parsed.port in (443, None)\n ):\n return False\n\n # Handle default port usage corresponding to scheme.\n changed_port = old_parsed.port != new_parsed.port\n changed_scheme = old_parsed.scheme != new_parsed.scheme\n default_port = (DEFAULT_PORTS.get(old_parsed.scheme, None), None)\n if (\n not changed_scheme\n and old_parsed.port in default_port\n and new_parsed.port in default_port\n ):\n return False\n\n # Standard case: root URI must match\n return changed_port or changed_scheme\n\n def resolve_redirects(\n self,\n resp,\n req,\n stream=False,\n timeout=None,\n verify=True,\n cert=None,\n proxies=None,\n yield_requests=False,\n **adapter_kwargs,\n ):\n \"\"\"Receives a Response. Returns a generator of Responses or Requests.\"\"\"\n\n hist = [] # keep track of history\n\n url = self.get_redirect_target(resp)\n previous_fragment = urlparse(req.url).fragment\n while url:\n prepared_request = req.copy()\n\n # Update history and keep track of redirects.\n # resp.history must ignore the original request in this loop\n hist.append(resp)\n resp.history = hist[1:]\n\n try:\n resp.content # Consume socket so it can be released\n except (ChunkedEncodingError, ContentDecodingError, RuntimeError):\n resp.raw.read(decode_content=False)\n\n if len(resp.history) >= self.max_redirects:\n raise TooManyRedirects(\n f\"Exceeded {self.max_redirects} redirects.\", response=resp\n )\n\n # Release the connection back into the pool.\n resp.close()\n\n # Handle redirection without scheme (see: RFC 1808 Section 4)\n if url.startswith(\"//\"):\n parsed_rurl = urlparse(resp.url)\n url = \":\".join([to_native_string(parsed_rurl.scheme), url])\n\n # Normalize url case and attach previous fragment if 
needed (RFC 7231 7.1.2)\n parsed = urlparse(url)\n if parsed.fragment == \"\" and previous_fragment:\n parsed = parsed._replace(fragment=previous_fragment)\n elif parsed.fragment:\n previous_fragment = parsed.fragment\n url = parsed.geturl()\n\n # Facilitate relative 'location' headers, as allowed by RFC 7231.\n # (e.g. '/path/to/resource' instead of 'http://domain.tld/path/to/resource')\n # Compliant with RFC3986, we percent encode the url.\n if not parsed.netloc:\n url = urljoin(resp.url, requote_uri(url))\n else:\n url = requote_uri(url)\n\n prepared_request.url = to_native_string(url)\n\n self.rebuild_method(prepared_request, resp)\n\n # https://github.com/psf/requests/issues/1084\n if resp.status_code not in (\n codes.temporary_redirect,\n codes.permanent_redirect,\n ):\n \n# … (truncated at 8000 chars)"
} |
immerjs__immer__prepareSetCopy__downstream__2hop_0400d5 | immerjs/immer | downstream | 2 | {
"hop_1": [
"isDraftable",
"createProxy"
],
"hop_1_files": [
"private/tmp/repos/immerjs__immer/src/utils/common.ts",
"private/tmp/repos/immerjs__immer/src/core/immerClass.ts"
],
"hop_2": [
"getPlugin",
"createProxyProxy",
"isPlainObject",
"rootDraftCleanup",
"registerChildFinalizationCallback"
],
"hop_2_files": [
"private/tmp/repos/immerjs__immer/src/utils/plugins.ts",
"private/tmp/repos/immerjs__immer/src/core/proxy.ts",
"private/tmp/repos/immerjs__immer/src/utils/common.ts",
"private/tmp/repos/immerjs__immer/src/core/immerClass.ts",
"private/tmp/repos/immerjs__immer/src/core/finalize.ts"
]
} | {
"anchor": "prepareSetCopy",
"anchor_file": "private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"anchor_source": "function prepareSetCopy(state: SetState) {\n\t\tif (!state.copy_) {\n\t\t\t// create drafts for all entries to preserve insertion order\n\t\t\tstate.copy_ = new Set()\n\t\t\tstate.base_.forEach(value => {\n\t\t\t\tif (isDraftable(value)) {\n\t\t\t\t\tconst draft = createProxy(state.scope_, value, state, value)\n\t\t\t\t\tstate.drafts_.set(value, draft)\n\t\t\t\t\tstate.copy_!.add(draft)\n\t\t\t\t} else {\n\t\t\t\t\tstate.copy_!.add(value)\n\t\t\t\t}\n\t\t\t})\n\t\t}\n\t}",
"result_size": 5,
"created_at": "2026-03-20T17:18:06.914297+00:00"
} |
trpc__trpc__resolveHTTPLinkOptions__upstream__1hop_e57bf4 | trpc/trpc | upstream | 1 | {
"calls": [
"httpLink",
"httpBatchLink",
"httpBatchStreamLink"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpLink.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchLink.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchStreamLink.ts"
]
} | {
"anchor": "resolveHTTPLinkOptions",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/links/internals/httpUtils.ts",
"anchor_source": "export function resolveHTTPLinkOptions(\n opts: HTTPLinkBaseOptions<AnyClientTypes>,\n): ResolvedHTTPLinkOptions {\n return {\n url: opts.url.toString(),\n fetch: opts.fetch,\n transformer: getTransformer(opts.transformer),\n methodOverride: opts.methodOverride,\n };\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
colinhacks__zod__dirty__upstream__1hop_03e5fe | colinhacks/zod | upstream | 1 | {
"calls": [
"_parse",
"mergeArray",
"mergeObjectSync",
"addIssue",
"finalizeSet"
],
"call_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/helpers/parseUtil.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/helpers/parseUtil.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "dirty",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/helpers/parseUtil.ts",
"anchor_source": "dirty(): void {\n if (this.value === \"valid\") this.value = \"dirty\";\n }",
"result_size": 6,
"created_at": "2026-03-20T17:18:06.594501+00:00",
"file_content": ""
} |
celery__celery___limit_post_eta__downstream__2hop_2f2221 | celery/celery | downstream | 2 | {
"hop_1": [
"_schedule_bucket_request"
],
"hop_1_files": [
"private/tmp/repos/celery/celery/worker/consumer/consumer.py"
],
"hop_2": [
"_limit_move_to_pool"
],
"hop_2_files": [
"private/tmp/repos/celery/celery/worker/consumer/consumer.py"
]
} | {
"anchor": "_limit_post_eta",
"anchor_file": "private/tmp/repos/celery/celery/worker/consumer/consumer.py",
"anchor_source": "def _limit_post_eta(self, request, bucket, tokens):\n self.qos.decrement_eventually()\n bucket.add((request, tokens))\n return self._schedule_bucket_request(bucket)",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.437346+00:00"
} |
pallets__click__get_params__upstream__1hop_09976b | pallets/click | upstream | 1 | {
"calls": [
"command_path",
"parse_args",
"make_parser",
"collect_usage_pieces",
"to_info_dict",
"format_options",
"_resolve_incomplete",
"shell_complete"
],
"call_files": [
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/shell_completion.py",
"private/tmp/repos/pallets__click/src/click/core.py"
]
} | {
"anchor": "get_params",
"anchor_file": "private/tmp/repos/pallets__click/src/click/core.py",
"anchor_source": "def get_params(self, ctx: Context) -> list[Parameter]:\n params = self.params\n help_option = self.get_help_option(ctx)\n\n if help_option is not None:\n params = [*params, help_option]\n\n if __debug__:\n import warnings\n\n opts = [opt for param in params for opt in param.opts]\n opts_counter = Counter(opts)\n duplicate_opts = (opt for opt, count in opts_counter.items() if count > 1)\n\n for duplicate_opt in duplicate_opts:\n warnings.warn(\n (\n f\"The parameter {duplicate_opt} is used more than once. \"\n \"Remove its duplicate as parameters should be unique.\"\n ),\n stacklevel=3,\n )\n\n return params",
"result_size": 8,
"created_at": "2026-03-20T17:18:07.184893+00:00",
"file_content": ""
} |
rq__rq__add_execution__upstream__2hop_4f80bf | rq/rq | upstream | 2 | {
"hop_1": [
"create",
"heartbeat"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/executions.py",
"private/tmp/repos/rq__rq/rq/executions.py"
],
"hop_2": [
"prepare_execution",
"maintain_heartbeats"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/executions.py",
"private/tmp/repos/rq__rq/rq/worker/base.py"
]
} | {
"anchor": "add_execution",
"anchor_file": "private/tmp/repos/rq__rq/rq/registry.py",
"anchor_source": "def add_execution(self, execution: 'Execution', pipeline: 'Pipeline', ttl: int = 0, xx: bool = False) -> int:\n \"\"\"Adds an execution to a registry with expiry time of now + ttl, unless it's -1 which is set to +inf\n\n Args:\n execution (Execution): The Execution to add.\n pipeline (Pipeline): The Redis Pipeline.\n ttl (int, optional): The time to live. Defaults to 0.\n xx (bool, optional): .... Defaults to False.\n\n Returns:\n result (int): The ZADD command result\n \"\"\"\n score: Union[int, str] = ttl if ttl < 0 else current_timestamp() + ttl\n if score == -1:\n score = '+inf'\n\n return pipeline.zadd(self.key, {execution.composite_key: score}, xx=xx) # type: ignore",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
rq__rq__get_retry_interval__upstream__2hop_e93890 | rq/rq | upstream | 2 | {
"hop_1": [
"retry"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/job.py"
],
"hop_2": [
"cleanup",
"handle_job_failure"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/registry.py",
"private/tmp/repos/rq__rq/rq/worker/base.py"
]
} | {
"anchor": "get_retry_interval",
"anchor_file": "private/tmp/repos/rq__rq/rq/job.py",
"anchor_source": "def get_retry_interval(self) -> int:\n \"\"\"Returns the desired retry interval.\n If number of retries is bigger than length of intervals, the first\n value in the list will be used multiple times.\n\n Returns:\n retry_interval (int): The desired retry interval\n \"\"\"\n if self.retry_intervals is None:\n return 0\n number_of_intervals = len(self.retry_intervals)\n assert self.retries_left\n index = max(number_of_intervals - self.retries_left, 0)\n return self.retry_intervals[index]",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
trpc__trpc__trpcMutationOptions__downstream__1hop_57d11f | trpc/trpc | downstream | 1 | {
"calls": [
"createTRPCOptionsResult",
"mutation",
"onSuccess",
"unwrapLazyArg",
"getClientArgs",
"getMutationKeyInternal"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/utils.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/internals/TRPCUntypedClient.ts",
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/mutationOptions.ts",
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/utils.ts",
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/utils.ts",
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/utils.ts"
]
} | {
"anchor": "trpcMutationOptions",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/mutationOptions.ts",
"anchor_source": "export function trpcMutationOptions<TFeatureFlags extends FeatureFlags>(args: {\n mutate: typeof TRPCUntypedClient.prototype.mutation;\n queryClient: QueryClient | (() => QueryClient);\n path: string[];\n opts: AnyTRPCMutationOptionsIn<TFeatureFlags> | undefined;\n overrides: MutationOptionsOverride | undefined;\n}): AnyTRPCMutationOptionsOut<TFeatureFlags> {\n const { mutate, path, opts, overrides } = args;\n const queryClient = unwrapLazyArg(args.queryClient);\n\n const mutationKey = getMutationKeyInternal({\n path,\n prefix: opts?.keyPrefix,\n }) as TRPCMutationKey<TFeatureFlags['keyPrefix']>;\n\n const defaultOpts = queryClient.defaultMutationOptions(\n queryClient.getMutationDefaults(mutationKey),\n );\n\n const mutationSuccessOverride: MutationOptionsOverride['onSuccess'] =\n overrides?.onSuccess ?? ((options) => options.originalFn());\n\n const mutationFn: MutationFunction = async (input) => {\n const result = await mutate(\n ...getClientArgs([...mutationKey, { input }] as any, opts),\n );\n\n return result;\n };\n\n return {\n ...opts,\n mutationKey,\n mutationFn,\n onSuccess(...args) {\n const originalFn = () =>\n opts?.onSuccess?.(...args) ?? defaultOpts?.onSuccess?.(...args);\n\n return mutationSuccessOverride({\n originalFn,\n queryClient,\n meta: opts?.meta ?? defaultOpts?.meta ?? {},\n });\n },\n trpc: createTRPCOptionsResult({ path }),\n };\n}",
"result_size": 6,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
pytest-dev__pytest__pytest_fixture_setup__downstream__1hop_43cdf5 | pytest-dev/pytest | downstream | 1 | {
"calls": [
"_show_fixture_action"
],
"call_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/setuponly.py"
]
} | {
"anchor": "pytest_fixture_setup",
"anchor_file": "private/tmp/repos/pytest-dev__pytest/src/_pytest/setuponly.py",
"anchor_source": "@pytest.hookimpl(wrapper=True)\ndef pytest_fixture_setup(\n fixturedef: FixtureDef[object], request: SubRequest\n) -> Generator[None, object, object]:\n try:\n return (yield)\n finally:\n if request.config.option.setupshow:\n if hasattr(request, \"param\"):\n # Save the fixture parameter so ._show_fixture_action() can\n # display it now and during the teardown (in .finish()).\n if fixturedef.ids:\n if callable(fixturedef.ids):\n param = fixturedef.ids(request.param)\n else:\n param = fixturedef.ids[request.param_index]\n else:\n param = request.param\n fixturedef.cached_param = param # type: ignore[attr-defined]\n _show_fixture_action(fixturedef, request.config, \"SETUP\")",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.712542+00:00"
} |
immerjs__immer__measure__downstream__1hop_87d16b | immerjs/immer | downstream | 1 | {
"calls": [
"measureTime"
],
"call_files": [
"private/tmp/repos/immerjs__immer/__performance_tests__/measure.mjs"
]
} | {
"anchor": "measure",
"anchor_file": "private/tmp/repos/immerjs__immer/__performance_tests__/measure.mjs",
"anchor_source": "export function measure(name, setup, fn) {\n const times = [...Array(5)].map(() => measureTime(setup, fn))\n const medianTime = times.sort()[Math.round(times.length / 2)]\n console.log(`${name}: ${medianTime}ms`)\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.914297+00:00"
} |
rq__rq__get_key__upstream__2hop_fdf8b0 | rq/rq | upstream | 2 | {
"hop_1": [
"fetch_latest",
"all",
"delete_all",
"count",
"save"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/results.py"
],
"hop_2": [
"create_retried",
"create_failure",
"results",
"create",
"latest_result"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/job.py",
"private/tmp/repos/rq__rq/rq/results.py",
"private/tmp/repos/rq__rq/rq/job.py"
]
} | {
"anchor": "get_key",
"anchor_file": "private/tmp/repos/rq__rq/rq/results.py",
"anchor_source": "@classmethod\n def get_key(cls, job_id):\n return f'rq:results:{job_id}'",
"result_size": 5,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
pallets__jinja__visit_MarkSafe__downstream__1hop_a940af | pallets/jinja | downstream | 1 | {
"calls": [
"write",
"visit"
],
"call_files": [
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py",
"private/tmp/repos/pallets__jinja/src/jinja2/visitor.py"
]
} | {
"anchor": "visit_MarkSafe",
"anchor_file": "private/tmp/repos/pallets__jinja/src/jinja2/compiler.py",
"anchor_source": "def visit_MarkSafe(self, node: nodes.MarkSafe, frame: Frame) -> None:\n self.write(\"Markup(\")\n self.visit(node.expr, frame)\n self.write(\")\")",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.327357+00:00"
} |
locustio__locust__get_ratio__upstream__2hop_b322ac | locustio/locust | upstream | 2 | {
"hop_1": [
"get_html_report",
"print_task_ratio",
"print_task_ratio_json",
"tasks"
],
"hop_1_files": [
"private/tmp/repos/locustio__locust/locust/html.py",
"private/tmp/repos/locustio__locust/locust/user/inspectuser.py",
"private/tmp/repos/locustio__locust/locust/user/inspectuser.py",
"private/tmp/repos/locustio__locust/locust/web.py"
],
"hop_2": [
"stats_report",
"main",
"save_html_report"
],
"hop_2_files": [
"private/tmp/repos/locustio__locust/locust/web.py",
"private/tmp/repos/locustio__locust/locust/main.py",
"private/tmp/repos/locustio__locust/locust/main.py"
]
} | {
"anchor": "get_ratio",
"anchor_file": "private/tmp/repos/locustio__locust/locust/user/inspectuser.py",
"anchor_source": "def get_ratio(user_classes: list[type[User]], user_spawned: dict[str, int], total: bool) -> dict[str, dict[str, float]]:\n user_count = sum(user_spawned.values()) or 1\n ratio_percent: dict[type[User], float] = {u: user_spawned.get(u.__name__, 0) / user_count for u in user_classes}\n\n task_dict: dict[str, dict[str, float]] = {}\n for u, r in ratio_percent.items():\n d = {\"ratio\": r}\n d[\"tasks\"] = _get_task_ratio(u.tasks, total, r)\n task_dict[u.__name__] = d\n\n return task_dict",
"result_size": 3,
"created_at": "2026-03-20T17:18:07.062541+00:00",
"file_content": ""
} |
colinhacks__zod___property__downstream__1hop_2b3043 | colinhacks/zod | downstream | 1 | {
"calls": [
"normalizeParams"
],
"call_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/util.ts"
]
} | {
"anchor": "_property",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/api.ts",
"anchor_source": "export function _property<K extends string, T extends schemas.$ZodType>(\n property: K,\n schema: T,\n params?: string | $ZodCheckPropertyParams\n): checks.$ZodCheckProperty<{ [k in K]: core.output<T> }> {\n return new checks.$ZodCheckProperty({\n check: \"property\",\n property,\n schema,\n ...util.normalizeParams(params),\n });\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
encode__httpx__primitive_value_to_str__upstream__1hop_32be34 | encode/httpx | upstream | 1 | {
"calls": [
"encode_urlencoded_data",
"add",
"set"
],
"call_files": [
"private/tmp/repos/encode__httpx/httpx/_content.py",
"private/tmp/repos/encode__httpx/httpx/_urls.py",
"private/tmp/repos/encode__httpx/httpx/_urls.py"
]
} | {
"anchor": "primitive_value_to_str",
"anchor_file": "private/tmp/repos/encode__httpx/httpx/_utils.py",
"anchor_source": "def primitive_value_to_str(value: PrimitiveData) -> str:\n \"\"\"\n Coerce a primitive data type into a string value.\n\n Note that we prefer JSON-style 'true'/'false' for boolean values here.\n \"\"\"\n if value is True:\n return \"true\"\n elif value is False:\n return \"false\"\n elif value is None:\n return \"\"\n return str(value)",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.815591+00:00",
"file_content": ""
} |
psf__black__parse_atom__downstream__2hop_55d73c | psf/black | downstream | 2 | {
"hop_1": [
"expect",
"gettoken",
"NFAState",
"addarc",
"parse_rhs",
"raise_error"
],
"hop_1_files": [
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py"
],
"hop_2": [
"parse_alt"
],
"hop_2_files": [
"private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py"
]
} | {
"anchor": "parse_atom",
"anchor_file": "private/tmp/repos/psf__black/src/blib2to3/pgen2/pgen.py",
"anchor_source": "def parse_atom(self) -> tuple[\"NFAState\", \"NFAState\"]:\n # ATOM: '(' RHS ')' | NAME | STRING\n if self.value == \"(\":\n self.gettoken()\n a, z = self.parse_rhs()\n self.expect(token.OP, \")\")\n return a, z\n elif self.type in (token.NAME, token.STRING):\n a = NFAState()\n z = NFAState()\n a.addarc(z, self.value)\n self.gettoken()\n return a, z\n else:\n self.raise_error(\n f\"expected (...) or NAME or STRING, got {self.type}/{self.value}\"\n )",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.570141+00:00"
} |
colinhacks__zod__datetime__downstream__1hop_b776d7 | colinhacks/zod | downstream | 1 | {
"calls": [
"timeSource"
],
"call_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/regexes.ts"
]
} | {
"anchor": "datetime",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/regexes.ts",
"anchor_source": "export function datetime(args: {\n precision?: number | null;\n offset?: boolean;\n local?: boolean;\n}): RegExp {\n const time = timeSource({ precision: args.precision });\n const opts = [\"Z\"];\n if (args.local) opts.push(\"\");\n // if (args.offset) opts.push(`([+-]\\\\d{2}:\\\\d{2})`);\n if (args.offset) opts.push(`([+-](?:[01]\\\\d|2[0-3]):[0-5]\\\\d)`);\n const timeRegex = `${time}(?:${opts.join(\"|\")})`;\n\n return new RegExp(`^${dateSource}T(?:${timeRegex})$`);\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
sindresorhus__got__#createHeadersProxy__upstream__2hop_ca35ed | sindresorhus/got | upstream | 2 | {
"hop_1": [
"headers",
"constructor"
],
"hop_1_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts"
],
"hop_2": [
"headers",
"_onResponseBase",
"extend",
"fn",
"constructor"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/index.ts",
"private/tmp/repos/got/source/create.ts",
"private/tmp/repos/got/benchmark/index.ts",
"private/tmp/repos/got/source/core/index.ts"
]
} | {
"anchor": "#createHeadersProxy",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "#createHeadersProxy(): Headers {\n\t\treturn new Proxy(this.#internals.headers, {\n\t\t\tget(target, property, receiver): unknown {\n\t\t\t\tif (typeof property === 'string') {\n\t\t\t\t\tif (Reflect.has(target, property)) {\n\t\t\t\t\t\treturn Reflect.get(target, property, receiver);\n\t\t\t\t\t}\n\n\t\t\t\t\tconst normalizedProperty = property.toLowerCase();\n\t\t\t\t\treturn Reflect.get(target, normalizedProperty, receiver);\n\t\t\t\t}\n\n\t\t\t\treturn Reflect.get(target, property, receiver);\n\t\t\t},\n\t\t\tset: (target, property, value): boolean => {\n\t\t\t\tif (typeof property === 'string') {\n\t\t\t\t\tconst normalizedProperty = property.toLowerCase();\n\t\t\t\t\tconst isSuccess = Reflect.set(target, normalizedProperty, value);\n\n\t\t\t\t\tif (isSuccess) {\n\t\t\t\t\t\tthis.#explicitHeaders.add(normalizedProperty);\n\t\t\t\t\t}\n\n\t\t\t\t\treturn isSuccess;\n\t\t\t\t}\n\n\t\t\t\treturn Reflect.set(target, property, value);\n\t\t\t},\n\t\t\tdeleteProperty: (target, property): boolean => {\n\t\t\t\tif (typeof property === 'string') {\n\t\t\t\t\tconst normalizedProperty = property.toLowerCase();\n\t\t\t\t\tconst isSuccess = Reflect.deleteProperty(target, normalizedProperty);\n\n\t\t\t\t\tif (isSuccess) {\n\t\t\t\t\t\tthis.#explicitHeaders.delete(normalizedProperty);\n\t\t\t\t\t}\n\n\t\t\t\t\treturn isSuccess;\n\t\t\t\t}\n\n\t\t\t\treturn Reflect.deleteProperty(target, property);\n\t\t\t},\n\t\t});\n\t}",
"result_size": 6,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "import process from 'node:process';\nimport {promisify, inspect, type InspectOptions} from 'node:util';\nimport {checkServerIdentity, type SecureContextOptions, type DetailedPeerCertificate} from 'node:tls';\n// DO NOT use destructuring for `https.request` and `http.request` as it's not compatible with `nock`.\nimport https, {\n\ttype RequestOptions as HttpsRequestOptions,\n\ttype Agent as HttpsAgent,\n} from 'node:https';\nimport http, {\n\ttype Agent as HttpAgent,\n\ttype ClientRequest,\n} from 'node:http';\nimport type {Readable} from 'node:stream';\nimport type {Socket, LookupFunction} from 'node:net';\nimport is, {assert} from '@sindresorhus/is';\nimport lowercaseKeys from 'lowercase-keys';\nimport CacheableLookup from 'cacheable-lookup';\nimport http2wrapper, {type ClientHttp2Session} from 'http2-wrapper';\nimport type {KeyvStoreAdapter} from 'keyv';\nimport type KeyvType from 'keyv';\nimport type ResponseLike from 'responselike';\nimport type {RequestPromise} from '../as-promise/types.js';\nimport type {IncomingMessageWithTimings} from './utils/timer.js';\nimport parseLinkHeader from './parse-link-header.js';\nimport type {PlainResponse, Response} from './response.js';\nimport type {RequestError} from './errors.js';\nimport type {Delays} from './timed-out.js';\n\ntype StorageAdapter = KeyvStoreAdapter | KeyvType | Map<unknown, unknown>;\n\ntype Promisable<T> = T | Promise<T>;\n\nconst [major, minor] = process.versions.node.split('.').map(Number) as [number, number, number];\n\nexport type DnsLookupIpVersion = undefined | 4 | 6;\n\ntype Except<ObjectType, KeysType extends keyof ObjectType> = Pick<ObjectType, Exclude<keyof ObjectType, KeysType>>;\n\nexport type NativeRequestOptions = HttpsRequestOptions & CacheOptions & {checkServerIdentity?: CheckServerIdentityFunction};\n\ntype AcceptableResponse = IncomingMessageWithTimings | ResponseLike;\ntype AcceptableRequestResult = Promisable<AcceptableResponse | ClientRequest | undefined>;\nexport 
type RequestFunction = (url: URL, options: NativeRequestOptions, callback?: (response: AcceptableResponse) => void) => AcceptableRequestResult;\n\nexport type Agents = {\n\thttp?: HttpAgent | false;\n\thttps?: HttpsAgent | false;\n\thttp2?: unknown | false;\n};\n\nexport type Headers = Record<string, string | string[] | undefined>;\n\nexport type ToughCookieJar = {\n\tgetCookieString: ((currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookies: string) => void) => void)\n\t\t& ((url: string, callback: (error: Error | undefined, cookieHeader: string) => void) => void);\n\tsetCookie: ((cookieOrString: unknown, currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookie: unknown) => void) => void)\n\t\t& ((rawCookie: string, url: string, callback: (error: Error | undefined, result: unknown) => void) => void);\n};\n\nexport type PromiseCookieJar = {\n\tgetCookieString: (url: string) => Promise<string>;\n\tsetCookie: (rawCookie: string, url: string) => Promise<unknown>;\n};\n\n/**\nUtility type to override specific properties in a type.\n\nUses `Omit` to remove properties before adding them back to ensure proper type replacement rather than intersection, which handles edge cases with optional/required properties correctly.\n*/\ntype OverrideProperties<T, U> = Omit<T, keyof U> & U;\n\n/**\nRepresents the runtime state of Options as seen by hooks after normalization.\n\nSome Options properties accept multiple input types but are normalized to a single type internally by the Options class setters. This type reflects the actual runtime types that hooks receive, ensuring type safety when accessing options within hook functions.\n*/\nexport type NormalizedOptions = OverrideProperties<Options, {\n\t// The URL is always normalized to a URL instance (or undefined) by the time hooks execute.\n\turl: URL | undefined;\n\n\t// When set to `true`, dnsCache is normalized to a CacheableLookup instance. 
When set to `false`, it becomes `undefined`.\n\tdnsCache: CacheableLookup | undefined;\n\n\t// When set to `true`, cache is normalized to the global cache Map. When set to `false`, it becomes `undefined`. Strings and other values are wrapped/processed into a StorageAdapter.\n\tcache: StorageAdapter | undefined;\n\n\t// The prefix URL is always normalized to a string.\n\tprefixUrl: string;\n}>;\n\nexport type InitHook = (init: OptionsInit, self: Options) => void;\n\nexport type BeforeRequestHookContext = {\n\t/**\n\tThe current retry count.\n\n\tIt will be `0` for the initial request and increment for each retry.\n\t*/\n\tretryCount: number;\n};\n\nexport type BeforeRequestHook = (options: NormalizedOptions, context: BeforeRequestHookContext) => Promisable<void | Response | ResponseLike>;\nexport type BeforeRedirectHook = (updatedOptions: NormalizedOptions, plainResponse: PlainResponse) => Promisable<void>;\nexport type BeforeErrorHook = (error: RequestError) => Promisable<Error>;\nexport type BeforeRetryHook = (error: RequestError, retryCount: number) => Promisable<void>;\nexport type BeforeCacheHook = (response: PlainResponse) => false | void;\nexport type AfterResponseHook<ResponseType = unknown> = (response: Response<ResponseType>, retryWithMergedOptions: (options: OptionsInit) => never) => Promisable<Response | RequestPromise<Response>>;\n\n/**\nAll available hooks of Got.\n*/\nexport type Hooks = {\n\t/**\n\tCalled with the plain request options, right before their normalization.\n\n\tThe second argument represents the current `Options` instance.\n\n\t@default []\n\n\t**Note:**\n\t> - This hook must be synchronous.\n\n\t**Note:**\n\t> - This is called every time options are merged.\n\n\t**Note:**\n\t> - The `options` object may not have the `url` property. 
To modify it, use a `beforeRequest` hook instead.\n\n\t**Note:**\n\t> - This hook is called when a new instance of `Options` is created.\n\t> - Do not confuse this with the creation of `Request` or `got(…)`.\n\n\t**Note:**\n\t> - When using `got(url)` or `got(url, undefined, defaults)` this hook will **not** be called.\n\n\tThis is especially useful in conjunction with `got.extend()` when the input needs custom handling.\n\n\tFor example, this can be used to fix typos to migrate from older versions faster.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\tplain => {\n\t\t\t\t\tif ('followRedirects' in plain) {\n\t\t\t\t\t\tplain.followRedirect = plain.followRedirects;\n\t\t\t\t\t\tdelete plain.followRedirects;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\t// Normally, the following would throw:\n\tconst response = await instance(\n\t\t'https://example.com',\n\t\t{\n\t\t\tfollowRedirects: true\n\t\t}\n\t);\n\n\t// There is no option named `followRedirects`, but we correct it in an `init` hook.\n\t```\n\n\tOr you can create your own option and store it in a context:\n\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\t(plain, options) => {\n\t\t\t\t\tif ('secret' in plain) {\n\t\t\t\t\t\toptions.context.secret = plain.secret;\n\t\t\t\t\t\tdelete plain.secret;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t],\n\t\t\tbeforeRequest: [\n\t\t\t\toptions => {\n\t\t\t\t\toptions.headers.secret = options.context.secret;\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\tconst {headers} = await instance(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tsecret: 'passphrase'\n\t\t}\n\t).json();\n\n\tconsole.log(headers.Secret);\n\t//=> 'passphrase'\n\t```\n\t*/\n\tinit: InitHook[];\n\n\t/**\n\tCalled right before making the request with `options.createNativeRequestOptions()`.\n\n\tThe second argument is a context object containing request state information.\n\n\tThis hook is 
especially useful in conjunction with `got.extend()` when you want to sign your request.\n\n\t@default []\n\n\t**Note:**\n\t> - Got will make no further changes to the request before it is sent.\n\n\t**Note:**\n\t> - Changing `options.json` or `options.form` has no effect on the request. You should change `options.body` instead. If needed, update the `options.headers` accordingly.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst response = await got.post(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tjson: {payload: 'old'},\n\t\t\thooks: {\n\t\t\t\tbeforeRequest: [\n\t\t\t\t\t(options, context) => {\n\t\t\t\t\t\toptions.body = JSON.stringify({payload: 'new'});\n\t\t\n# … (truncated at 8000 chars)"
} |
sindresorhus__got__merge__upstream__2hop_5afe2f | sindresorhus/got | upstream | 2 | {
"hop_1": [
"asPromise",
"constructor",
"extend"
],
"hop_1_files": [
"private/tmp/repos/got/source/as-promise/index.ts",
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/create.ts"
],
"hop_2": [
"_onResponseBase",
"fn",
"constructor"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/index.ts",
"private/tmp/repos/got/benchmark/index.ts",
"private/tmp/repos/got/source/core/index.ts"
]
} | {
"anchor": "merge",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "merge(options?: OptionsInit | Options) {\n\t\tif (!options) {\n\t\t\treturn;\n\t\t}\n\n\t\tif (options instanceof Options) {\n\t\t\t// Create a copy of the #init array to avoid infinite loop\n\t\t\t// when merging an Options instance with itself\n\t\t\tconst initArray = [...options.#init];\n\t\t\tfor (const init of initArray) {\n\t\t\t\tthis.merge(init);\n\t\t\t}\n\n\t\t\treturn;\n\t\t}\n\n\t\toptions = cloneRaw(options);\n\n\t\tinit(this, options, this);\n\t\tinit(options, options, this);\n\n\t\tthis.#merging = true;\n\n\t\ttry {\n\t\t\tlet push = false;\n\n\t\t\tfor (const key in options) {\n\t\t\t\t// `got.extend()` options\n\t\t\t\tif (key === 'mutableDefaults' || key === 'handlers') {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\t// Never merge `url`\n\t\t\t\tif (key === 'url') {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\t// Never merge `preserveHooks` - it's a control flag, not a persistent option\n\t\t\t\tif (key === 'preserveHooks') {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\t// `isStream` is set internally by `got.stream()`, not by user options\n\t\t\t\tif (key === 'isStream') {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\tif (!(key in this)) {\n\t\t\t\t\tthrow new Error(`Unexpected option: ${key}`);\n\t\t\t\t}\n\n\t\t\t\t// @ts-expect-error Type 'unknown' is not assignable to type 'never'.\n\t\t\t\tconst value = options[key as keyof Options];\n\t\t\t\tif (value === undefined) {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\t// @ts-expect-error Type 'unknown' is not assignable to type 'never'.\n\t\t\t\tthis[key as keyof Options] = value;\n\n\t\t\t\tpush = true;\n\t\t\t}\n\n\t\t\tif (push) {\n\t\t\t\tthis.#init.push(options);\n\t\t\t}\n\t\t} finally {\n\t\t\tthis.#merging = false;\n\t\t}\n\t}",
"result_size": 4,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "import process from 'node:process';\nimport {promisify, inspect, type InspectOptions} from 'node:util';\nimport {checkServerIdentity, type SecureContextOptions, type DetailedPeerCertificate} from 'node:tls';\n// DO NOT use destructuring for `https.request` and `http.request` as it's not compatible with `nock`.\nimport https, {\n\ttype RequestOptions as HttpsRequestOptions,\n\ttype Agent as HttpsAgent,\n} from 'node:https';\nimport http, {\n\ttype Agent as HttpAgent,\n\ttype ClientRequest,\n} from 'node:http';\nimport type {Readable} from 'node:stream';\nimport type {Socket, LookupFunction} from 'node:net';\nimport is, {assert} from '@sindresorhus/is';\nimport lowercaseKeys from 'lowercase-keys';\nimport CacheableLookup from 'cacheable-lookup';\nimport http2wrapper, {type ClientHttp2Session} from 'http2-wrapper';\nimport type {KeyvStoreAdapter} from 'keyv';\nimport type KeyvType from 'keyv';\nimport type ResponseLike from 'responselike';\nimport type {RequestPromise} from '../as-promise/types.js';\nimport type {IncomingMessageWithTimings} from './utils/timer.js';\nimport parseLinkHeader from './parse-link-header.js';\nimport type {PlainResponse, Response} from './response.js';\nimport type {RequestError} from './errors.js';\nimport type {Delays} from './timed-out.js';\n\ntype StorageAdapter = KeyvStoreAdapter | KeyvType | Map<unknown, unknown>;\n\ntype Promisable<T> = T | Promise<T>;\n\nconst [major, minor] = process.versions.node.split('.').map(Number) as [number, number, number];\n\nexport type DnsLookupIpVersion = undefined | 4 | 6;\n\ntype Except<ObjectType, KeysType extends keyof ObjectType> = Pick<ObjectType, Exclude<keyof ObjectType, KeysType>>;\n\nexport type NativeRequestOptions = HttpsRequestOptions & CacheOptions & {checkServerIdentity?: CheckServerIdentityFunction};\n\ntype AcceptableResponse = IncomingMessageWithTimings | ResponseLike;\ntype AcceptableRequestResult = Promisable<AcceptableResponse | ClientRequest | undefined>;\nexport 
type RequestFunction = (url: URL, options: NativeRequestOptions, callback?: (response: AcceptableResponse) => void) => AcceptableRequestResult;\n\nexport type Agents = {\n\thttp?: HttpAgent | false;\n\thttps?: HttpsAgent | false;\n\thttp2?: unknown | false;\n};\n\nexport type Headers = Record<string, string | string[] | undefined>;\n\nexport type ToughCookieJar = {\n\tgetCookieString: ((currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookies: string) => void) => void)\n\t\t& ((url: string, callback: (error: Error | undefined, cookieHeader: string) => void) => void);\n\tsetCookie: ((cookieOrString: unknown, currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookie: unknown) => void) => void)\n\t\t& ((rawCookie: string, url: string, callback: (error: Error | undefined, result: unknown) => void) => void);\n};\n\nexport type PromiseCookieJar = {\n\tgetCookieString: (url: string) => Promise<string>;\n\tsetCookie: (rawCookie: string, url: string) => Promise<unknown>;\n};\n\n/**\nUtility type to override specific properties in a type.\n\nUses `Omit` to remove properties before adding them back to ensure proper type replacement rather than intersection, which handles edge cases with optional/required properties correctly.\n*/\ntype OverrideProperties<T, U> = Omit<T, keyof U> & U;\n\n/**\nRepresents the runtime state of Options as seen by hooks after normalization.\n\nSome Options properties accept multiple input types but are normalized to a single type internally by the Options class setters. This type reflects the actual runtime types that hooks receive, ensuring type safety when accessing options within hook functions.\n*/\nexport type NormalizedOptions = OverrideProperties<Options, {\n\t// The URL is always normalized to a URL instance (or undefined) by the time hooks execute.\n\turl: URL | undefined;\n\n\t// When set to `true`, dnsCache is normalized to a CacheableLookup instance. 
When set to `false`, it becomes `undefined`.\n\tdnsCache: CacheableLookup | undefined;\n\n\t// When set to `true`, cache is normalized to the global cache Map. When set to `false`, it becomes `undefined`. Strings and other values are wrapped/processed into a StorageAdapter.\n\tcache: StorageAdapter | undefined;\n\n\t// The prefix URL is always normalized to a string.\n\tprefixUrl: string;\n}>;\n\nexport type InitHook = (init: OptionsInit, self: Options) => void;\n\nexport type BeforeRequestHookContext = {\n\t/**\n\tThe current retry count.\n\n\tIt will be `0` for the initial request and increment for each retry.\n\t*/\n\tretryCount: number;\n};\n\nexport type BeforeRequestHook = (options: NormalizedOptions, context: BeforeRequestHookContext) => Promisable<void | Response | ResponseLike>;\nexport type BeforeRedirectHook = (updatedOptions: NormalizedOptions, plainResponse: PlainResponse) => Promisable<void>;\nexport type BeforeErrorHook = (error: RequestError) => Promisable<Error>;\nexport type BeforeRetryHook = (error: RequestError, retryCount: number) => Promisable<void>;\nexport type BeforeCacheHook = (response: PlainResponse) => false | void;\nexport type AfterResponseHook<ResponseType = unknown> = (response: Response<ResponseType>, retryWithMergedOptions: (options: OptionsInit) => never) => Promisable<Response | RequestPromise<Response>>;\n\n/**\nAll available hooks of Got.\n*/\nexport type Hooks = {\n\t/**\n\tCalled with the plain request options, right before their normalization.\n\n\tThe second argument represents the current `Options` instance.\n\n\t@default []\n\n\t**Note:**\n\t> - This hook must be synchronous.\n\n\t**Note:**\n\t> - This is called every time options are merged.\n\n\t**Note:**\n\t> - The `options` object may not have the `url` property. 
To modify it, use a `beforeRequest` hook instead.\n\n\t**Note:**\n\t> - This hook is called when a new instance of `Options` is created.\n\t> - Do not confuse this with the creation of `Request` or `got(…)`.\n\n\t**Note:**\n\t> - When using `got(url)` or `got(url, undefined, defaults)` this hook will **not** be called.\n\n\tThis is especially useful in conjunction with `got.extend()` when the input needs custom handling.\n\n\tFor example, this can be used to fix typos to migrate from older versions faster.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\tplain => {\n\t\t\t\t\tif ('followRedirects' in plain) {\n\t\t\t\t\t\tplain.followRedirect = plain.followRedirects;\n\t\t\t\t\t\tdelete plain.followRedirects;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\t// Normally, the following would throw:\n\tconst response = await instance(\n\t\t'https://example.com',\n\t\t{\n\t\t\tfollowRedirects: true\n\t\t}\n\t);\n\n\t// There is no option named `followRedirects`, but we correct it in an `init` hook.\n\t```\n\n\tOr you can create your own option and store it in a context:\n\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\t(plain, options) => {\n\t\t\t\t\tif ('secret' in plain) {\n\t\t\t\t\t\toptions.context.secret = plain.secret;\n\t\t\t\t\t\tdelete plain.secret;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t],\n\t\t\tbeforeRequest: [\n\t\t\t\toptions => {\n\t\t\t\t\toptions.headers.secret = options.context.secret;\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\tconst {headers} = await instance(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tsecret: 'passphrase'\n\t\t}\n\t).json();\n\n\tconsole.log(headers.Secret);\n\t//=> 'passphrase'\n\t```\n\t*/\n\tinit: InitHook[];\n\n\t/**\n\tCalled right before making the request with `options.createNativeRequestOptions()`.\n\n\tThe second argument is a context object containing request state information.\n\n\tThis hook is 
especially useful in conjunction with `got.extend()` when you want to sign your request.\n\n\t@default []\n\n\t**Note:**\n\t> - Got will make no further changes to the request before it is sent.\n\n\t**Note:**\n\t> - Changing `options.json` or `options.form` has no effect on the request. You should change `options.body` instead. If needed, update the `options.headers` accordingly.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst response = await got.post(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tjson: {payload: 'old'},\n\t\t\thooks: {\n\t\t\t\tbeforeRequest: [\n\t\t\t\t\t(options, context) => {\n\t\t\t\t\t\toptions.body = JSON.stringify({payload: 'new'});\n\t\t\n# … (truncated at 8000 chars)"
} |
trpc__trpc__createTRPCUntypedClient__upstream__1hop_e4f785 | trpc/trpc | upstream | 1 | {
"calls": [
"experimental_createActionHook",
"getInitialProps"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/next/src/app-dir/create-action-hook.tsx",
"private/tmp/repos/trpc__trpc/packages/next/src/ssrPrepass.ts"
]
} | {
"anchor": "createTRPCUntypedClient",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/createTRPCUntypedClient.ts",
"anchor_source": "export function createTRPCUntypedClient<TRouter extends AnyRouter>(\n opts: CreateTRPCClientOptions<TRouter>,\n): TRPCUntypedClient<TRouter> {\n return new TRPCUntypedClient(opts);\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
node-fetch__node-fetch__fromRawHeaders__downstream__2hop_9101a8 | node-fetch/node-fetch | downstream | 2 | {
"hop_1": [
"constructor",
"Headers"
],
"hop_1_files": [
"private/tmp/repos/node-fetch__node-fetch/src/headers.js",
"private/tmp/repos/node-fetch__node-fetch/src/headers.js"
],
"hop_2": [
"raw",
"get"
],
"hop_2_files": [
"private/tmp/repos/node-fetch__node-fetch/src/headers.js",
"private/tmp/repos/node-fetch__node-fetch/src/headers.js"
]
} | {
"anchor": "fromRawHeaders",
"anchor_file": "private/tmp/repos/node-fetch__node-fetch/src/headers.js",
"anchor_source": "export function fromRawHeaders(headers = []) {\n\treturn new Headers(\n\t\theaders\n\t\t\t// Split into pairs\n\t\t\t.reduce((result, value, index, array) => {\n\t\t\t\tif (index % 2 === 0) {\n\t\t\t\t\tresult.push(array.slice(index, index + 2));\n\t\t\t\t}\n\n\t\t\t\treturn result;\n\t\t\t}, [])\n\t\t\t.filter(([name, value]) => {\n\t\t\t\ttry {\n\t\t\t\t\tvalidateHeaderName(name);\n\t\t\t\t\tvalidateHeaderValue(name, String(value));\n\t\t\t\t\treturn true;\n\t\t\t\t} catch {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t})\n\n\t);\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.165214+00:00"
} |
colinhacks__zod__datetime__downstream__2hop_05cade | colinhacks/zod | downstream | 2 | {
"hop_1": [
"_addCheck"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
],
"hop_2": [
"ZodString",
"constructor"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "datetime",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"anchor_source": "datetime(\n options?:\n | string\n | {\n message?: string | undefined;\n precision?: number | null;\n offset?: boolean;\n local?: boolean;\n }\n ) {\n if (typeof options === \"string\") {\n return this._addCheck({\n kind: \"datetime\",\n precision: null,\n offset: false,\n local: false,\n message: options,\n });\n }\n return this._addCheck({\n kind: \"datetime\",\n\n precision: typeof options?.precision === \"undefined\" ? null : options?.precision,\n offset: options?.offset ?? false,\n local: options?.local ?? false,\n ...errorUtil.errToObj(options?.message),\n });\n }",
"result_size": 2,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
encode__httpx__aclose__upstream__1hop_43e88d | encode/httpx | upstream | 1 | {
"calls": [
"_send_handling_auth",
"_send_handling_redirects",
"send",
"aiter_raw",
"stream"
],
"call_files": [
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_models.py",
"private/tmp/repos/encode__httpx/httpx/_client.py"
]
} | {
"anchor": "aclose",
"anchor_file": "private/tmp/repos/encode__httpx/httpx/_models.py",
"anchor_source": "async def aclose(self) -> None:\n \"\"\"\n Close the response and release the connection.\n Automatically called if the response body is read to completion.\n \"\"\"\n if not isinstance(self.stream, AsyncByteStream):\n raise RuntimeError(\"Attempted to call an async close on a sync stream.\")\n\n if not self.is_closed:\n self.is_closed = True\n with request_context(request=self._request):\n await self.stream.aclose()",
"result_size": 5,
"created_at": "2026-03-20T17:18:06.815591+00:00",
"file_content": ""
} |
sindresorhus__got__stripUrlAuth__upstream__1hop_4985c6 | sindresorhus/got | upstream | 1 | {
"calls": [
"constructor",
"_onResponseBase"
],
"call_files": [
"private/tmp/repos/got/source/core/response.ts",
"private/tmp/repos/got/source/core/index.ts"
]
} | {
"anchor": "stripUrlAuth",
"anchor_file": "private/tmp/repos/got/source/core/utils/strip-url-auth.ts",
"anchor_source": "export default function stripUrlAuth(url: URL | string): string {\n\tconst sanitized = new URL(url);\n\tsanitized.username = '';\n\tsanitized.password = '';\n\treturn sanitized.toString();\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "/*\nReturns the URL as a string with `username` and `password` stripped.\n*/\nexport default function stripUrlAuth(url: URL | string): string {\n\tconst sanitized = new URL(url);\n\tsanitized.username = '';\n\tsanitized.password = '';\n\treturn sanitized.toString();\n}\n"
} |
python-poetry__poetry___create_file_url_reference__downstream__1hop_8bdd80 | python-poetry/poetry | downstream | 1 | {
"calls": [
"_get_archive_info"
],
"call_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/installation/executor.py"
]
} | {
"anchor": "_create_file_url_reference",
"anchor_file": "private/tmp/repos/python-poetry__poetry/src/poetry/installation/executor.py",
"anchor_source": "def _create_file_url_reference(self, package: Package) -> dict[str, Any]:\n archive_info = self._get_archive_info(package)\n\n assert package.source_url is not None\n return {\n \"url\": Path(package.source_url).as_uri(),\n \"archive_info\": archive_info,\n }",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.827462+00:00"
} |
python-poetry__poetry__handle_starttag__downstream__1hop_e89dea | python-poetry/poetry | downstream | 1 | {
"calls": [
"Result",
"_match_class"
],
"call_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/repositories/parsers/pypi_search_parser.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/repositories/parsers/pypi_search_parser.py"
]
} | {
"anchor": "handle_starttag",
"anchor_file": "private/tmp/repos/python-poetry__poetry/src/poetry/repositories/parsers/pypi_search_parser.py",
"anchor_source": "def handle_starttag(self, tag: str, attrs: list[tuple[str, str | None]]) -> None:\n if not self._current:\n if tag == \"a\" and self._match_class(attrs, \"package-snippet\"):\n self._current = Result()\n self._nest_anchors = 1\n else:\n if tag == \"span\" and self._match_class(attrs, \"package-snippet__name\"):\n self._data_callback = functools.partial(setattr, self._current, \"name\")\n elif tag == \"span\" and self._match_class(attrs, \"package-snippet__version\"):\n self._data_callback = functools.partial(\n setattr, self._current, \"version\"\n )\n elif tag == \"p\" and self._match_class(\n attrs, \"package-snippet__description\"\n ):\n self._data_callback = functools.partial(\n setattr, self._current, \"description\"\n )\n elif tag == \"a\":\n self._nest_anchors += 1",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.827462+00:00"
} |
pallets__jinja__new_context__upstream__1hop_9daf52 | pallets/jinja | upstream | 1 | {
"calls": [
"make_module_async",
"generate",
"make_module",
"render",
"render_async",
"generate_async"
],
"call_files": [
"private/tmp/repos/pallets__jinja/src/jinja2/environment.py",
"private/tmp/repos/pallets__jinja/src/jinja2/environment.py",
"private/tmp/repos/pallets__jinja/src/jinja2/environment.py",
"private/tmp/repos/pallets__jinja/src/jinja2/environment.py",
"private/tmp/repos/pallets__jinja/src/jinja2/nativetypes.py",
"private/tmp/repos/pallets__jinja/src/jinja2/environment.py"
]
} | {
"anchor": "new_context",
"anchor_file": "private/tmp/repos/pallets__jinja/src/jinja2/environment.py",
"anchor_source": "def new_context(\n self,\n vars: dict[str, t.Any] | None = None,\n shared: bool = False,\n locals: t.Mapping[str, t.Any] | None = None,\n ) -> Context:\n \"\"\"Create a new :class:`Context` for this template. The vars\n provided will be passed to the template. Per default the globals\n are added to the context. If shared is set to `True` the data\n is passed as is to the context without adding the globals.\n\n `locals` can be a dict of local variables for internal usage.\n \"\"\"\n return new_context(\n self.environment, self.name, self.blocks, vars, shared, self.globals, locals\n )",
"result_size": 8,
"created_at": "2026-03-20T17:18:07.327357+00:00",
"file_content": ""
} |
colinhacks__zod__getErrorMap__downstream__1hop_a9bfde | colinhacks/zod | downstream | 1 | {
"calls": [
"config"
],
"call_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/core.ts"
]
} | {
"anchor": "getErrorMap",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/compat.ts",
"anchor_source": "export function getErrorMap(): core.$ZodErrorMap<core.$ZodIssue> | undefined {\n return core.config().customError;\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
sindresorhus__got__isUnixSocketURL__upstream__1hop_1ae265 | sindresorhus/got | upstream | 1 | {
"calls": [
"getUnixSocketPath",
"_onResponseBase"
],
"call_files": [
"private/tmp/repos/got/source/core/utils/is-unix-socket-url.ts",
"private/tmp/repos/got/source/core/index.ts"
]
} | {
"anchor": "isUnixSocketURL",
"anchor_file": "private/tmp/repos/got/source/core/utils/is-unix-socket-url.ts",
"anchor_source": "export default function isUnixSocketURL(url: URL) {\n\treturn url.protocol === 'unix:' || url.hostname === 'unix';\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "// eslint-disable-next-line @typescript-eslint/naming-convention\nexport default function isUnixSocketURL(url: URL) {\n\treturn url.protocol === 'unix:' || url.hostname === 'unix';\n}\n\n/**\nExtract the socket path from a UNIX socket URL.\n\n@example\n```\ngetUnixSocketPath(new URL('http://unix/foo:/path'));\n//=> '/foo'\n\ngetUnixSocketPath(new URL('unix:/foo:/path'));\n//=> '/foo'\n\ngetUnixSocketPath(new URL('http://example.com'));\n//=> undefined\n```\n*/\nexport function getUnixSocketPath(url: URL): string | undefined {\n\tif (!isUnixSocketURL(url)) {\n\t\treturn undefined;\n\t}\n\n\treturn /(?<socketPath>.+?):(?<path>.+)/.exec(`${url.pathname}${url.search}`)?.groups?.socketPath;\n}\n"
} |
trpc__trpc__awsLambdaStreamingRequestHandler__downstream__1hop_8a3b3a | trpc/trpc | downstream | 1 | {
"calls": [
"getPlanner",
"resolveResponse"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/adapters/aws-lambda/getPlanner.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/http/resolveResponse.ts"
]
} | {
"anchor": "awsLambdaStreamingRequestHandler",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/adapters/aws-lambda/index.ts",
"anchor_source": "export function awsLambdaStreamingRequestHandler<\n TRouter extends AnyRouter,\n TEvent extends LambdaEvent,\n>(opts: AWSLambdaOptions<TRouter, TEvent>): StreamifyHandler<TEvent> {\n return async (event, responseStream, context) => {\n const planner = getPlanner(event);\n\n const createContext: ResolveHTTPRequestOptionsContextFn<TRouter> = async (\n innerOpts,\n ) => {\n return await opts.createContext?.({ event, context, ...innerOpts });\n };\n\n const response = await resolveResponse({\n ...opts,\n createContext,\n req: planner.request,\n path: planner.path,\n error: null,\n onError(o) {\n opts?.onError?.({\n ...o,\n req: event,\n });\n },\n });\n\n await planner.toStream(response, responseStream);\n };\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
sindresorhus__got__extend__downstream__1hop_1cde97 | sindresorhus/got | downstream | 1 | {
"calls": [
"constructor",
"Options",
"merge"
],
"call_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts"
]
} | {
"anchor": "extend",
"anchor_file": "private/tmp/repos/got/source/create.ts",
"anchor_source": "got.extend = (...instancesOrOptions) => {\n\t\tconst options = new Options(undefined, undefined, defaults.options);\n\t\tconst handlers = [...defaults.handlers];\n\n\t\tlet mutableDefaults: boolean | undefined;\n\n\t\tfor (const value of instancesOrOptions) {\n\t\t\tif (isGotInstance(value)) {\n\t\t\t\toptions.merge(value.defaults.options);\n\t\t\t\thandlers.push(...value.defaults.handlers);\n\t\t\t\tmutableDefaults = value.defaults.mutableDefaults;\n\t\t\t} else {\n\t\t\t\tassertNoUrlInOptionsObject(value);\n\t\t\t\toptions.merge(value);\n\n\t\t\t\tif (value.handlers) {\n\t\t\t\t\thandlers.push(...value.handlers);\n\t\t\t\t}\n\n\t\t\t\tmutableDefaults = value.mutableDefaults;\n\t\t\t}\n\t\t}\n\n\t\treturn create({\n\t\t\toptions,\n\t\t\thandlers,\n\t\t\tmutableDefaults: Boolean(mutableDefaults),\n\t\t});\n\t};",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.163238+00:00"
} |
trpc__trpc__send__upstream__2hop_ec4eb0 | trpc/trpc | upstream | 2 | {
"hop_1": [
"batchSend",
"reconnect",
"request"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts"
],
"hop_2": [
"handleIncomingRequest",
"setupWebSocketListeners",
"open",
"wsLink"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsLink.ts"
]
} | {
"anchor": "send",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/links/wsLink/wsClient/wsClient.ts",
"anchor_source": "private send(\n messageOrMessages: TRPCClientOutgoingMessage | TRPCClientOutgoingMessage[],\n ) {\n if (!this.activeConnection.isOpen()) {\n throw new Error('Active connection is not open');\n }\n\n const messages =\n messageOrMessages instanceof Array\n ? messageOrMessages\n : [messageOrMessages];\n this.activeConnection.ws.send(\n this.encoder.encode(messages.length === 1 ? messages[0] : messages),\n );\n }",
"result_size": 8,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
trpc__trpc__takeWithGrace__upstream__2hop_c086f7 | trpc/trpc | upstream | 2 | {
"hop_1": [
"generator",
"sseStreamProducer"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts"
],
"hop_2": [
"generatorWithErrorHandling",
"resolveResponse"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/http/resolveResponse.ts"
]
} | {
"anchor": "takeWithGrace",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/asyncIterable.ts",
"anchor_source": "export async function* takeWithGrace<T>(\n iterable: AsyncIterable<T>,\n opts: {\n count: number;\n gracePeriodMs: number;\n },\n): AsyncGenerator<T> {\n await using iterator = iteratorResource(iterable);\n\n // declaration outside the loop for garbage collection reasons\n let result: null | IteratorResult<T> | typeof disposablePromiseTimerResult;\n\n using timer = timerResource(opts.gracePeriodMs);\n\n let count = opts.count;\n\n let timerPromise = new Promise<typeof disposablePromiseTimerResult>(() => {\n // never resolves\n });\n\n while (true) {\n result = await Unpromise.race([iterator.next(), timerPromise]);\n if (result === disposablePromiseTimerResult) {\n throwAbortError();\n }\n if (result.done) {\n return result.value;\n }\n yield result.value;\n if (--count === 0) {\n timerPromise = timer.start();\n }\n // free up reference for garbage collection\n result = null;\n }\n}",
"result_size": 5,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
encode__httpx__iter_raw__upstream__2hop_6e8b03 | encode/httpx | upstream | 2 | {
"hop_1": [
"iter_bytes"
],
"hop_1_files": [
"private/tmp/repos/encode__httpx/httpx/_models.py"
],
"hop_2": [
"read",
"iter_text",
"download_response"
],
"hop_2_files": [
"private/tmp/repos/encode__httpx/httpx/_models.py",
"private/tmp/repos/encode__httpx/httpx/_models.py",
"private/tmp/repos/encode__httpx/httpx/_main.py"
]
} | {
"anchor": "iter_raw",
"anchor_file": "private/tmp/repos/encode__httpx/httpx/_models.py",
"anchor_source": "def iter_raw(self, chunk_size: int | None = None) -> typing.Iterator[bytes]:\n \"\"\"\n A byte-iterator over the raw response content.\n \"\"\"\n if self.is_stream_consumed:\n raise StreamConsumed()\n if self.is_closed:\n raise StreamClosed()\n if not isinstance(self.stream, SyncByteStream):\n raise RuntimeError(\"Attempted to call a sync iterator on an async stream.\")\n\n self.is_stream_consumed = True\n self._num_bytes_downloaded = 0\n chunker = ByteChunker(chunk_size=chunk_size)\n\n with request_context(request=self._request):\n for raw_stream_bytes in self.stream:\n self._num_bytes_downloaded += len(raw_stream_bytes)\n for chunk in chunker.decode(raw_stream_bytes):\n yield chunk\n\n for chunk in chunker.flush():\n yield chunk\n\n self.close()",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.815591+00:00",
"file_content": ""
} |
rq__rq__dependents_key_for__upstream__2hop_67b9a8 | rq/rq | upstream | 2 | {
"hop_1": [
"register_dependency",
"dependents_key"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/job.py",
"private/tmp/repos/rq__rq/rq/job.py"
],
"hop_2": [
"setup_dependencies",
"get_jobs_with_met_dependencies"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/queue.py",
"private/tmp/repos/rq__rq/rq/dependency.py"
]
} | {
"anchor": "dependents_key_for",
"anchor_file": "private/tmp/repos/rq__rq/rq/job.py",
"anchor_source": "@classmethod\n def dependents_key_for(cls, job_id: str) -> str:\n \"\"\"The Redis key that is used to store job dependents hash under.\n\n Args:\n job_id (str): The \"parent\" job id\n\n Returns:\n dependents_key (str): The dependents key\n \"\"\"\n return f'{cls.redis_job_namespace_prefix}{job_id}:dependents'",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
pytest-dev__pytest__call_and_report__downstream__1hop_05ab76 | pytest-dev/pytest | downstream | 1 | {
"calls": [
"from_call",
"get_reraise_exceptions"
],
"call_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/runner.py",
"private/tmp/repos/pytest-dev__pytest/src/_pytest/runner.py"
]
} | {
"anchor": "call_and_report",
"anchor_file": "private/tmp/repos/pytest-dev__pytest/src/_pytest/runner.py",
"anchor_source": "def call_and_report(\n item: Item, when: Literal[\"setup\", \"call\", \"teardown\"], log: bool = True, **kwds\n) -> TestReport:\n ihook = item.ihook\n if when == \"setup\":\n runtest_hook: Callable[..., None] = ihook.pytest_runtest_setup\n elif when == \"call\":\n runtest_hook = ihook.pytest_runtest_call\n elif when == \"teardown\":\n runtest_hook = ihook.pytest_runtest_teardown\n else:\n assert False, f\"Unhandled runtest hook case: {when}\"\n\n call = CallInfo.from_call(\n lambda: runtest_hook(item=item, **kwds),\n when=when,\n reraise=get_reraise_exceptions(item.config),\n )\n report: TestReport = ihook.pytest_runtest_makereport(item=item, call=call)\n if log:\n ihook.pytest_runtest_logreport(report=report)\n if check_interactive_exception(call, report):\n ihook.pytest_exception_interact(node=item, call=call, report=report)\n return report",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.712542+00:00"
} |
sindresorhus__got__#getFallbackRequestFunction__upstream__2hop_2c7edf | sindresorhus/got | upstream | 2 | {
"hop_1": [
"#callFallbackRequest",
"getRequestFunction"
],
"hop_1_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts"
],
"hop_2": [
"_makeRequest",
"#resolveRequestWithFallback"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/index.ts",
"private/tmp/repos/got/source/core/options.ts"
]
} | {
"anchor": "#getFallbackRequestFunction",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "#getFallbackRequestFunction() {\n\t\tconst url = this.#internals.url as (URL | undefined);\n\n\t\tif (!url) {\n\t\t\treturn;\n\t\t}\n\n\t\tif (url.protocol === 'https:') {\n\t\t\tif (this.#internals.http2) {\n\t\t\t\tif (major < 15 || (major === 15 && minor < 10)) {\n\t\t\t\t\tconst error = new Error('To use the `http2` option, install Node.js 15.10.0 or above');\n\t\t\t\t\t(error as NodeJS.ErrnoException).code = 'EUNSUPPORTED';\n\n\t\t\t\t\tthrow error;\n\t\t\t\t}\n\n\t\t\t\treturn http2wrapper.auto as RequestFunction;\n\t\t\t}\n\n\t\t\treturn https.request;\n\t\t}\n\n\t\treturn http.request;\n\t}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "import process from 'node:process';\nimport {promisify, inspect, type InspectOptions} from 'node:util';\nimport {checkServerIdentity, type SecureContextOptions, type DetailedPeerCertificate} from 'node:tls';\n// DO NOT use destructuring for `https.request` and `http.request` as it's not compatible with `nock`.\nimport https, {\n\ttype RequestOptions as HttpsRequestOptions,\n\ttype Agent as HttpsAgent,\n} from 'node:https';\nimport http, {\n\ttype Agent as HttpAgent,\n\ttype ClientRequest,\n} from 'node:http';\nimport type {Readable} from 'node:stream';\nimport type {Socket, LookupFunction} from 'node:net';\nimport is, {assert} from '@sindresorhus/is';\nimport lowercaseKeys from 'lowercase-keys';\nimport CacheableLookup from 'cacheable-lookup';\nimport http2wrapper, {type ClientHttp2Session} from 'http2-wrapper';\nimport type {KeyvStoreAdapter} from 'keyv';\nimport type KeyvType from 'keyv';\nimport type ResponseLike from 'responselike';\nimport type {RequestPromise} from '../as-promise/types.js';\nimport type {IncomingMessageWithTimings} from './utils/timer.js';\nimport parseLinkHeader from './parse-link-header.js';\nimport type {PlainResponse, Response} from './response.js';\nimport type {RequestError} from './errors.js';\nimport type {Delays} from './timed-out.js';\n\ntype StorageAdapter = KeyvStoreAdapter | KeyvType | Map<unknown, unknown>;\n\ntype Promisable<T> = T | Promise<T>;\n\nconst [major, minor] = process.versions.node.split('.').map(Number) as [number, number, number];\n\nexport type DnsLookupIpVersion = undefined | 4 | 6;\n\ntype Except<ObjectType, KeysType extends keyof ObjectType> = Pick<ObjectType, Exclude<keyof ObjectType, KeysType>>;\n\nexport type NativeRequestOptions = HttpsRequestOptions & CacheOptions & {checkServerIdentity?: CheckServerIdentityFunction};\n\ntype AcceptableResponse = IncomingMessageWithTimings | ResponseLike;\ntype AcceptableRequestResult = Promisable<AcceptableResponse | ClientRequest | undefined>;\nexport 
type RequestFunction = (url: URL, options: NativeRequestOptions, callback?: (response: AcceptableResponse) => void) => AcceptableRequestResult;\n\nexport type Agents = {\n\thttp?: HttpAgent | false;\n\thttps?: HttpsAgent | false;\n\thttp2?: unknown | false;\n};\n\nexport type Headers = Record<string, string | string[] | undefined>;\n\nexport type ToughCookieJar = {\n\tgetCookieString: ((currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookies: string) => void) => void)\n\t\t& ((url: string, callback: (error: Error | undefined, cookieHeader: string) => void) => void);\n\tsetCookie: ((cookieOrString: unknown, currentUrl: string, options: Record<string, unknown>, callback: (error: Error | undefined, cookie: unknown) => void) => void)\n\t\t& ((rawCookie: string, url: string, callback: (error: Error | undefined, result: unknown) => void) => void);\n};\n\nexport type PromiseCookieJar = {\n\tgetCookieString: (url: string) => Promise<string>;\n\tsetCookie: (rawCookie: string, url: string) => Promise<unknown>;\n};\n\n/**\nUtility type to override specific properties in a type.\n\nUses `Omit` to remove properties before adding them back to ensure proper type replacement rather than intersection, which handles edge cases with optional/required properties correctly.\n*/\ntype OverrideProperties<T, U> = Omit<T, keyof U> & U;\n\n/**\nRepresents the runtime state of Options as seen by hooks after normalization.\n\nSome Options properties accept multiple input types but are normalized to a single type internally by the Options class setters. This type reflects the actual runtime types that hooks receive, ensuring type safety when accessing options within hook functions.\n*/\nexport type NormalizedOptions = OverrideProperties<Options, {\n\t// The URL is always normalized to a URL instance (or undefined) by the time hooks execute.\n\turl: URL | undefined;\n\n\t// When set to `true`, dnsCache is normalized to a CacheableLookup instance. 
When set to `false`, it becomes `undefined`.\n\tdnsCache: CacheableLookup | undefined;\n\n\t// When set to `true`, cache is normalized to the global cache Map. When set to `false`, it becomes `undefined`. Strings and other values are wrapped/processed into a StorageAdapter.\n\tcache: StorageAdapter | undefined;\n\n\t// The prefix URL is always normalized to a string.\n\tprefixUrl: string;\n}>;\n\nexport type InitHook = (init: OptionsInit, self: Options) => void;\n\nexport type BeforeRequestHookContext = {\n\t/**\n\tThe current retry count.\n\n\tIt will be `0` for the initial request and increment for each retry.\n\t*/\n\tretryCount: number;\n};\n\nexport type BeforeRequestHook = (options: NormalizedOptions, context: BeforeRequestHookContext) => Promisable<void | Response | ResponseLike>;\nexport type BeforeRedirectHook = (updatedOptions: NormalizedOptions, plainResponse: PlainResponse) => Promisable<void>;\nexport type BeforeErrorHook = (error: RequestError) => Promisable<Error>;\nexport type BeforeRetryHook = (error: RequestError, retryCount: number) => Promisable<void>;\nexport type BeforeCacheHook = (response: PlainResponse) => false | void;\nexport type AfterResponseHook<ResponseType = unknown> = (response: Response<ResponseType>, retryWithMergedOptions: (options: OptionsInit) => never) => Promisable<Response | RequestPromise<Response>>;\n\n/**\nAll available hooks of Got.\n*/\nexport type Hooks = {\n\t/**\n\tCalled with the plain request options, right before their normalization.\n\n\tThe second argument represents the current `Options` instance.\n\n\t@default []\n\n\t**Note:**\n\t> - This hook must be synchronous.\n\n\t**Note:**\n\t> - This is called every time options are merged.\n\n\t**Note:**\n\t> - The `options` object may not have the `url` property. 
To modify it, use a `beforeRequest` hook instead.\n\n\t**Note:**\n\t> - This hook is called when a new instance of `Options` is created.\n\t> - Do not confuse this with the creation of `Request` or `got(…)`.\n\n\t**Note:**\n\t> - When using `got(url)` or `got(url, undefined, defaults)` this hook will **not** be called.\n\n\tThis is especially useful in conjunction with `got.extend()` when the input needs custom handling.\n\n\tFor example, this can be used to fix typos to migrate from older versions faster.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\tplain => {\n\t\t\t\t\tif ('followRedirects' in plain) {\n\t\t\t\t\t\tplain.followRedirect = plain.followRedirects;\n\t\t\t\t\t\tdelete plain.followRedirects;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\t// Normally, the following would throw:\n\tconst response = await instance(\n\t\t'https://example.com',\n\t\t{\n\t\t\tfollowRedirects: true\n\t\t}\n\t);\n\n\t// There is no option named `followRedirects`, but we correct it in an `init` hook.\n\t```\n\n\tOr you can create your own option and store it in a context:\n\n\t```\n\timport got from 'got';\n\n\tconst instance = got.extend({\n\t\thooks: {\n\t\t\tinit: [\n\t\t\t\t(plain, options) => {\n\t\t\t\t\tif ('secret' in plain) {\n\t\t\t\t\t\toptions.context.secret = plain.secret;\n\t\t\t\t\t\tdelete plain.secret;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t],\n\t\t\tbeforeRequest: [\n\t\t\t\toptions => {\n\t\t\t\t\toptions.headers.secret = options.context.secret;\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t});\n\n\tconst {headers} = await instance(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tsecret: 'passphrase'\n\t\t}\n\t).json();\n\n\tconsole.log(headers.Secret);\n\t//=> 'passphrase'\n\t```\n\t*/\n\tinit: InitHook[];\n\n\t/**\n\tCalled right before making the request with `options.createNativeRequestOptions()`.\n\n\tThe second argument is a context object containing request state information.\n\n\tThis hook is 
especially useful in conjunction with `got.extend()` when you want to sign your request.\n\n\t@default []\n\n\t**Note:**\n\t> - Got will make no further changes to the request before it is sent.\n\n\t**Note:**\n\t> - Changing `options.json` or `options.form` has no effect on the request. You should change `options.body` instead. If needed, update the `options.headers` accordingly.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst response = await got.post(\n\t\t'https://httpbin.org/anything',\n\t\t{\n\t\t\tjson: {payload: 'old'},\n\t\t\thooks: {\n\t\t\t\tbeforeRequest: [\n\t\t\t\t\t(options, context) => {\n\t\t\t\t\t\toptions.body = JSON.stringify({payload: 'new'});\n\t\t\n# … (truncated at 8000 chars)"
} |
pytest-dev__pytest__saferepr__downstream__2hop_06f5f0 | pytest-dev/pytest | downstream | 2 | {
"hop_1": [
"repr"
],
"hop_1_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/_io/saferepr.py"
],
"hop_2": [
"_format_repr_exception",
"_ellipsize"
],
"hop_2_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/_io/saferepr.py",
"private/tmp/repos/pytest-dev__pytest/src/_pytest/_io/saferepr.py"
]
} | {
"anchor": "saferepr",
"anchor_file": "private/tmp/repos/pytest-dev__pytest/src/_pytest/_io/saferepr.py",
"anchor_source": "def saferepr(\n obj: object, maxsize: int | None = DEFAULT_REPR_MAX_SIZE, use_ascii: bool = False\n) -> str:\n \"\"\"Return a size-limited safe repr-string for the given object.\n\n Failing __repr__ functions of user instances will be represented\n with a short exception info and 'saferepr' generally takes\n care to never raise exceptions itself.\n\n This function is a wrapper around the Repr/reprlib functionality of the\n stdlib.\n \"\"\"\n return SafeRepr(maxsize, use_ascii).repr(obj)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.712542+00:00"
} |
trpc__trpc__createTRPCOptionsResult__upstream__2hop_7f89dd | trpc/trpc | upstream | 2 | {
"hop_1": [
"createUtilityFunctions",
"queryOptions",
"infiniteQueryOptions",
"useHookResult"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/react-query/src/utils/createUtilityFunctions.ts",
"private/tmp/repos/trpc__trpc/packages/react-query/src/utils/createUtilityFunctions.ts",
"private/tmp/repos/trpc__trpc/packages/react-query/src/utils/createUtilityFunctions.ts",
"private/tmp/repos/trpc__trpc/packages/react-query/src/internals/trpcResult.ts"
],
"hop_2": [
"createTRPCQueryUtils",
"useMutation",
"useInfiniteQuery",
"useQuery",
"useSuspenseInfiniteQuery",
"useSuspenseQuery",
"createRootHooks"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/react-query/src/createTRPCQueryUtils.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/shared/hooks/createHooksInternal.tsx"
]
} | {
"anchor": "createTRPCOptionsResult",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/react-query/src/internals/trpcResult.ts",
"anchor_source": "export function createTRPCOptionsResult(value: {\n path: readonly string[];\n}): TRPCQueryOptionsResult['trpc'] {\n const path = value.path.join('.');\n\n return {\n path,\n };\n}",
"result_size": 8,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
locustio__locust__get_abspaths_in__upstream__1hop_36995d | locustio/locust | upstream | 1 | {
"calls": [
"handle_message",
"_parse_locustfile_path"
],
"call_files": [
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/argument_parser.py"
]
} | {
"anchor": "get_abspaths_in",
"anchor_file": "private/tmp/repos/locustio__locust/locust/util/directory.py",
"anchor_source": "def get_abspaths_in(path, extension=None):\n return [\n os.path.abspath(os.path.join(root, f))\n for root, _dirs, fs in os.walk(path)\n for f in fs\n if os.path.isfile(os.path.join(root, f))\n and (f.endswith(extension) or extension is None)\n and not f.startswith(\"_\")\n ]",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.062541+00:00",
"file_content": ""
} |
immerjs__immer__scenario_rtkq_sequence__downstream__2hop_f363db | immerjs/immer | downstream | 2 | {
"hop_1": [
"createInitialState"
],
"hop_1_files": [
"private/tmp/repos/immerjs__immer/perf-testing/immutability-profiling.mjs"
],
"hop_2": [
"createLargeObject"
],
"hop_2_files": [
"private/tmp/repos/immerjs__immer/perf-testing/immutability-profiling.mjs"
]
} | {
"anchor": "scenario_rtkq_sequence",
"anchor_file": "private/tmp/repos/immerjs__immer/perf-testing/immutability-profiling.mjs",
"anchor_source": "function scenario_rtkq_sequence() {\n\tlet state = createInitialState()\n\tconst arraySize = 100\n\n\t// Phase 1: Execute all pending actions\n\tfor (let j = 0; j < arraySize; j++) {\n\t\tstate = immerReducer(state, rtkqPending(j))\n\t}\n\n\t// Phase 2: Execute all resolved actions\n\tfor (let j = 0; j < arraySize; j++) {\n\t\tstate = immerReducer(state, rtkqResolved(j))\n\t}\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.914297+00:00"
} |
trpc__trpc__timerResource__upstream__1hop_5c2d6b | trpc/trpc | upstream | 1 | {
"calls": [
"takeWithGrace",
"withPing",
"withTimeout"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/asyncIterable.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/withPing.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts"
]
} | {
"anchor": "timerResource",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/timerResource.ts",
"anchor_source": "export function timerResource(ms: number) {\n let timer: ReturnType<typeof setTimeout> | null = null;\n\n return makeResource(\n {\n start() {\n if (timer) {\n throw new Error('Timer already started');\n }\n\n const promise = new Promise<typeof disposablePromiseTimerResult>(\n (resolve) => {\n timer = setTimeout(() => resolve(disposablePromiseTimerResult), ms);\n },\n );\n return promise;\n },\n },\n () => {\n if (timer) {\n clearTimeout(timer);\n }\n },\n );\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
pallets__jinja__iter_child_nodes__upstream__1hop_b541ac | pallets/jinja | upstream | 1 | {
"calls": [
"generic_visit",
"set_ctx",
"set_lineno",
"_simple_visit",
"find_all",
"set_environment",
"visit_CallBlock",
"visit_For"
],
"call_files": [
"private/tmp/repos/pallets__jinja/src/jinja2/visitor.py",
"private/tmp/repos/pallets__jinja/src/jinja2/nodes.py",
"private/tmp/repos/pallets__jinja/src/jinja2/nodes.py",
"private/tmp/repos/pallets__jinja/src/jinja2/idtracking.py",
"private/tmp/repos/pallets__jinja/src/jinja2/nodes.py",
"private/tmp/repos/pallets__jinja/src/jinja2/nodes.py",
"private/tmp/repos/pallets__jinja/src/jinja2/idtracking.py",
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py"
]
} | {
"anchor": "iter_child_nodes",
"anchor_file": "private/tmp/repos/pallets__jinja/src/jinja2/nodes.py",
"anchor_source": "def iter_child_nodes(\n self,\n exclude: t.Container[str] | None = None,\n only: t.Container[str] | None = None,\n ) -> t.Iterator[\"Node\"]:\n \"\"\"Iterates over all direct child nodes of the node. This iterates\n over all fields and yields the values of they are nodes. If the value\n of a field is a list all the nodes in that list are returned.\n \"\"\"\n for _, item in self.iter_fields(exclude, only):\n if isinstance(item, list):\n for n in item:\n if isinstance(n, Node):\n yield n\n elif isinstance(item, Node):\n yield item",
"result_size": 8,
"created_at": "2026-03-20T17:18:07.327357+00:00",
"file_content": ""
} |
trpc__trpc__isAbortError__upstream__1hop_510dbb | trpc/trpc | upstream | 1 | {
"calls": [
"writeResponseBody",
"unstable_localLink",
"onErrorCallback",
"generatorWithErrorHandling",
"sseStreamProducer"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/adapters/node-http/writeResponse.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/localLink.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/localLink.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/sse.ts"
]
} | {
"anchor": "isAbortError",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/http/abortError.ts",
"anchor_source": "export function isAbortError(\n error: unknown,\n): error is DOMException | Error | { name: 'AbortError' } {\n return isObject(error) && error['name'] === 'AbortError';\n}",
"result_size": 8,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
trpc__trpc__load__upstream__2hop_06287f | trpc/trpc | upstream | 2 | {
"hop_1": [
"httpBatchLink",
"dataLoader",
"httpBatchStreamLink"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchLink.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/internals/dataLoader.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchStreamLink.ts"
],
"hop_2": [
"experimental_nextHttpLink",
"dispatch"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/next/src/app-dir/links/nextHttp.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/internals/dataLoader.ts"
]
} | {
"anchor": "load",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/internals/dataLoader.ts",
"anchor_source": "function load(key: TKey): Promise<TValue> {\n const item: BatchItem<TKey, TValue> = {\n aborted: false,\n key,\n batch: null,\n resolve: throwFatalError,\n reject: throwFatalError,\n };\n\n const promise = new Promise<TValue>((resolve, reject) => {\n item.reject = reject;\n item.resolve = resolve;\n\n pendingItems ??= [];\n pendingItems.push(item);\n });\n\n dispatchTimer ??= setTimeout(dispatch);\n\n return promise;\n }",
"result_size": 6,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
rq__rq__enqueue_call__upstream__1hop_12d7e5 | rq/rq | upstream | 1 | {
"calls": [
"delay",
"enqueue"
],
"call_files": [
"private/tmp/repos/rq__rq/rq/decorators.py",
"private/tmp/repos/rq__rq/rq/queue.py"
]
} | {
"anchor": "enqueue_call",
"anchor_file": "private/tmp/repos/rq__rq/rq/queue.py",
"anchor_source": "def enqueue_call(\n self,\n func: 'FunctionReferenceType',\n args: Union[tuple, list, None] = None,\n kwargs: Optional[dict] = None,\n timeout: Optional[int] = None,\n result_ttl: Optional[int] = None,\n ttl: Optional[int] = None,\n failure_ttl: Optional[int] = None,\n description: Optional[str] = None,\n depends_on: Optional['JobDependencyType'] = None,\n job_id: Optional[str] = None,\n at_front: bool = False,\n meta: Optional[dict] = None,\n retry: Optional['Retry'] = None,\n repeat: Optional['Repeat'] = None,\n on_success: Optional[Union[Callback, Callable[..., Any]]] = None,\n on_failure: Optional[Union[Callback, Callable[..., Any]]] = None,\n on_stopped: Optional[Union[Callback, Callable[..., Any]]] = None,\n pipeline: Optional['Pipeline'] = None,\n ) -> Job:\n \"\"\"Creates a job to represent the delayed function call and enqueues it.\n\n It is much like `.enqueue()`, except that it takes the function's args\n and kwargs as explicit arguments. Any kwargs passed to this function\n contain options for RQ itself.\n\n Args:\n func (FunctionReferenceType): The reference to the function\n args (Union[Tuple, List, None], optional): The `*args` to pass to the function. Defaults to None.\n kwargs (Optional[Dict], optional): The `**kwargs` to pass to the function. Defaults to None.\n timeout (Optional[int], optional): Function timeout. Defaults to None.\n result_ttl (Optional[int], optional): Result time to live. Defaults to None.\n ttl (Optional[int], optional): Time to live. Defaults to None.\n failure_ttl (Optional[int], optional): Failure time to live. Defaults to None.\n description (Optional[str], optional): The job description. Defaults to None.\n depends_on (Optional[JobDependencyType], optional): The job dependencies. Defaults to None.\n job_id (Optional[str], optional): The job ID. Defaults to None.\n at_front (bool, optional): Whether to enqueue the job at the front. 
Defaults to False.\n meta (Optional[Dict], optional): Metadata to attach to the job. Defaults to None.\n retry (Optional[Retry], optional): Retry object. Defaults to None.\n on_success (Optional[Union[Callback, Callable[..., Any]]], optional): Callback for on success. Defaults to\n None. Callable is deprecated.\n on_failure (Optional[Union[Callback, Callable[..., Any]]], optional): Callback for on failure. Defaults to\n None. Callable is deprecated.\n on_stopped (Optional[Union[Callback, Callable[..., Any]]], optional): Callback for on stopped. Defaults to\n None. Callable is deprecated.\n pipeline (Optional[Pipeline], optional): The Redis Pipeline. Defaults to None.\n\n Returns:\n Job: The enqueued Job\n \"\"\"\n\n job = self.create_job(\n func,\n args=args,\n kwargs=kwargs,\n result_ttl=result_ttl,\n ttl=ttl,\n failure_ttl=failure_ttl,\n description=description,\n depends_on=depends_on,\n job_id=job_id,\n meta=meta,\n status=JobStatus.QUEUED,\n timeout=timeout,\n retry=retry,\n repeat=repeat,\n on_success=on_success,\n on_failure=on_failure,\n on_stopped=on_stopped,\n )\n return self.enqueue_job(job, pipeline=pipeline, at_front=at_front)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
sindresorhus__got__signal__downstream__2hop_b750ce | sindresorhus/got | downstream | 2 | {
"hop_1": [
"signal"
],
"hop_1_files": [
"private/tmp/repos/got/source/core/options.ts"
],
"hop_2": [
"assertAny"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/options.ts"
]
} | {
"anchor": "signal",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "get signal(): AbortSignal | undefined {\n\t\treturn this.#internals.signal;\n\t}",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.163238+00:00"
} |
psf__black__append_safe__downstream__1hop_93aac1 | psf/black | downstream | 1 | {
"calls": [
"append"
],
"call_files": [
"private/tmp/repos/psf__black/src/black/lines.py"
]
} | {
"anchor": "append_safe",
"anchor_file": "private/tmp/repos/psf__black/src/black/lines.py",
"anchor_source": "def append_safe(self, leaf: Leaf, preformatted: bool = False) -> None:\n \"\"\"Like :func:`append()` but disallow invalid standalone comment structure.\n\n Raises ValueError when any `leaf` is appended after a standalone comment\n or when a standalone comment is not the first leaf on the line.\n \"\"\"\n if (\n self.bracket_tracker.depth == 0\n or self.bracket_tracker.any_open_for_or_lambda()\n ):\n if self.is_comment:\n raise ValueError(\"cannot append to standalone comments\")\n\n if self.leaves and leaf.type == STANDALONE_COMMENT:\n raise ValueError(\n \"cannot append standalone comments to a populated line\"\n )\n\n self.append(leaf, preformatted=preformatted)",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.570141+00:00"
} |
rq__rq__get_redis_server_version__upstream__2hop_48a319 | rq/rq | upstream | 2 | {
"hop_1": [
"get_job_position",
"_prepare_for_queue"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/queue.py",
"private/tmp/repos/rq__rq/rq/queue.py"
],
"hop_2": [
"_enqueue_sync_job",
"get_position",
"_enqueue_async_job"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/queue.py",
"private/tmp/repos/rq__rq/rq/job.py",
"private/tmp/repos/rq__rq/rq/queue.py"
]
} | {
"anchor": "get_redis_server_version",
"anchor_file": "private/tmp/repos/rq__rq/rq/queue.py",
"anchor_source": "def get_redis_server_version(self) -> tuple[int, int, int]:\n \"\"\"Return Redis server version of connection\n\n Returns:\n redis_version (Tuple): A tuple with the parsed Redis version (eg: (5,0,0))\n \"\"\"\n if not self.redis_server_version:\n self.redis_server_version = get_version(self.connection)\n return self.redis_server_version",
"result_size": 3,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
immerjs__immer__leaveScope__upstream__1hop_5097fc | immerjs/immer | upstream | 1 | {
"calls": [
"revokeScope",
"createDraft"
],
"call_files": [
"private/tmp/repos/immerjs__immer/src/core/scope.ts",
"private/tmp/repos/immerjs__immer/src/core/immerClass.ts"
]
} | {
"anchor": "leaveScope",
"anchor_file": "private/tmp/repos/immerjs__immer/src/core/scope.ts",
"anchor_source": "export function leaveScope(scope: ImmerScope) {\n\tif (scope === currentScope) {\n\t\tcurrentScope = scope.parent_\n\t}\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:06.914297+00:00",
"file_content": ""
} |
immerjs__immer__crossReferenceCleanup__upstream__2hop_7df763 | immerjs/immer | upstream | 2 | {
"hop_1": [
"handleCrossReference"
],
"hop_1_files": [
"private/tmp/repos/immerjs__immer/src/core/finalize.ts"
],
"hop_2": [
"set",
"handleInsertedValues",
"enableArrayMethods",
"enableMapSet",
"add"
],
"hop_2_files": [
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"private/tmp/repos/immerjs__immer/src/plugins/arrayMethods.ts",
"private/tmp/repos/immerjs__immer/src/plugins/arrayMethods.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts"
]
} | {
"anchor": "crossReferenceCleanup",
"anchor_file": "private/tmp/repos/immerjs__immer/src/core/finalize.ts",
"anchor_source": "state.callbacks_.push(function crossReferenceCleanup() {\n\t\t\t\t// Update the target location with finalized value\n\t\t\t\tprepareCopy(target)\n\n\t\t\t\tconst finalizedValue = getFinalValue(state)\n\n\t\t\t\tupdateDraftInParent(target, value, finalizedValue, key)\n\t\t\t})",
"result_size": 6,
"created_at": "2026-03-20T17:18:06.914297+00:00",
"file_content": ""
} |
pallets__flask___get_source_fast__downstream__2hop_ce1245 | pallets/flask | downstream | 2 | {
"hop_1": [
"_iter_loaders"
],
"hop_1_files": [
"private/tmp/repos/flask/src/flask/templating.py"
],
"hop_2": [
"iter_blueprints"
],
"hop_2_files": [
"private/tmp/repos/flask/src/flask/sansio/app.py"
]
} | {
"anchor": "_get_source_fast",
"anchor_file": "private/tmp/repos/flask/src/flask/templating.py",
"anchor_source": "def _get_source_fast(\n self, environment: BaseEnvironment, template: str\n ) -> tuple[str, str | None, t.Callable[[], bool] | None]:\n for _srcobj, loader in self._iter_loaders(template):\n try:\n return loader.get_source(environment, template)\n except TemplateNotFound:\n continue\n raise TemplateNotFound(template)",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.268438+00:00"
} |
trpc__trpc__observableToAsyncIterable__downstream__1hop_2800da | trpc/trpc | downstream | 1 | {
"calls": [
"observableToReadableStream"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/observable/observable.ts"
]
} | {
"anchor": "observableToAsyncIterable",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/observable/observable.ts",
"anchor_source": "export function observableToAsyncIterable<TValue>(\n observable: Observable<TValue, unknown>,\n signal: AbortSignal,\n): AsyncIterable<TValue> {\n const stream = observableToReadableStream(observable, signal);\n\n const reader = stream.getReader();\n const iterator: AsyncIterator<TValue> = {\n async next() {\n const value = await reader.read();\n if (value.done) {\n return {\n value: undefined,\n done: true,\n };\n }\n const { value: result } = value;\n if (!result.ok) {\n throw result.error;\n }\n return {\n value: result.value,\n done: false,\n };\n },\n async return() {\n await reader.cancel();\n return {\n value: undefined,\n done: true,\n };\n },\n };\n return {\n [Symbol.asyncIterator]() {\n return iterator;\n },\n };\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
pytest-dev__pytest__pytest_runtest_setup__downstream__2hop_1c94b7 | pytest-dev/pytest | downstream | 2 | {
"hop_1": [
"evaluate_xfail_marks",
"evaluate_skip_marks"
],
"hop_1_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/skipping.py",
"private/tmp/repos/pytest-dev__pytest/src/_pytest/skipping.py"
],
"hop_2": [
"evaluate_condition"
],
"hop_2_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/skipping.py"
]
} | {
"anchor": "pytest_runtest_setup",
"anchor_file": "private/tmp/repos/pytest-dev__pytest/src/_pytest/skipping.py",
"anchor_source": "@hookimpl(tryfirst=True)\ndef pytest_runtest_setup(item: Item) -> None:\n skipped = evaluate_skip_marks(item)\n if skipped:\n raise skip.Exception(skipped.reason, _use_item_location=True)\n\n item.stash[xfailed_key] = xfailed = evaluate_xfail_marks(item)\n if xfailed and not item.config.option.runxfail and not xfailed.run:\n xfail(\"[NOTRUN] \" + xfailed.reason)",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.712542+00:00"
} |
python-poetry__poetry__all__downstream__1hop_a8ec71 | python-poetry/poetry | downstream | 1 | {
"calls": [
"get",
"_all"
],
"call_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/config/config.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/config/config.py"
]
} | {
"anchor": "all",
"anchor_file": "private/tmp/repos/python-poetry__poetry/src/poetry/config/config.py",
"anchor_source": "def all(self) -> dict[str, Any]:\n def _all(config: dict[str, Any], parent_key: str = \"\") -> dict[str, Any]:\n all_ = {}\n\n for key in config:\n value = self.get(parent_key + key)\n if isinstance(value, dict):\n if parent_key != \"\":\n current_parent = parent_key + key + \".\"\n else:\n current_parent = key + \".\"\n all_[key] = _all(config[key], parent_key=current_parent)\n continue\n\n all_[key] = value\n\n return all_\n\n return _all(self.config)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.827462+00:00"
} |
sindresorhus__got__agent__downstream__2hop_884e5f | sindresorhus/got | downstream | 2 | {
"hop_1": [
"agent"
],
"hop_1_files": [
"private/tmp/repos/got/source/core/options.ts"
],
"hop_2": [
"assertPlainObject",
"assertAny"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts"
]
} | {
"anchor": "agent",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "get agent(): Agents {\n\t\treturn this.#internals.agent;\n\t}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.163238+00:00"
} |
colinhacks__zod__min__downstream__1hop_b39f7f | colinhacks/zod | downstream | 1 | {
"calls": [
"_addCheck"
],
"call_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "min",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"anchor_source": "min(minLength: number, message?: errorUtil.ErrMessage) {\n return this._addCheck({\n kind: \"min\",\n value: minLength,\n ...errorUtil.errToObj(message),\n });\n }",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
encode__django-rest-framework___get_forward_relationships__downstream__1hop_2cff34 | encode/django-rest-framework | downstream | 1 | {
"calls": [
"_get_to_field"
],
"call_files": [
"private/tmp/repos/django-rest-framework/rest_framework/utils/model_meta.py"
]
} | {
"anchor": "_get_forward_relationships",
"anchor_file": "private/tmp/repos/django-rest-framework/rest_framework/utils/model_meta.py",
"anchor_source": "def _get_forward_relationships(opts):\n \"\"\"\n Returns a dict of field names to `RelationInfo`.\n \"\"\"\n forward_relations = {}\n for field in [field for field in opts.fields if field.serialize and field.remote_field]:\n forward_relations[field.name] = RelationInfo(\n model_field=field,\n related_model=field.remote_field.model,\n to_many=False,\n to_field=_get_to_field(field),\n has_through_model=False,\n reverse=False\n )\n\n # Deal with forward many-to-many relationships.\n for field in [field for field in opts.many_to_many if field.serialize]:\n forward_relations[field.name] = RelationInfo(\n model_field=field,\n related_model=field.remote_field.model,\n to_many=True,\n # manytomany do not have to_fields\n to_field=None,\n has_through_model=(\n not field.remote_field.through._meta.auto_created\n ),\n reverse=False\n )\n\n return forward_relations",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.734050+00:00"
} |
rq__rq__get_current_job_id__upstream__1hop_301e1e | rq/rq | upstream | 1 | {
"calls": [
"handle_stop_job_command",
"get_current_job"
],
"call_files": [
"private/tmp/repos/rq__rq/rq/command.py",
"private/tmp/repos/rq__rq/rq/worker/base.py"
]
} | {
"anchor": "get_current_job_id",
"anchor_file": "private/tmp/repos/rq__rq/rq/worker/base.py",
"anchor_source": "def get_current_job_id(self, pipeline: Optional['Pipeline'] = None) -> Optional[str]:\n \"\"\"Retrieves the current job id.\n\n Args:\n pipeline (Optional['Pipeline'], optional): The pipeline to use. Defaults to None.\n\n Returns:\n job_id (Optional[str): The job id\n \"\"\"\n connection = pipeline if pipeline is not None else self.connection\n result = connection.hget(self.key, 'current_job')\n if result is None:\n return None\n return as_text(result)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
rq__rq__retry__upstream__2hop_d3356e | rq/rq | upstream | 2 | {
"hop_1": [
"cleanup",
"handle_job_failure"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/registry.py",
"private/tmp/repos/rq__rq/rq/worker/base.py"
],
"hop_2": [
"clean_registries",
"perform_job",
"cleanup",
"monitor_work_horse",
"get_job_and_execution_ids",
"handle_job_retry"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/registry.py",
"private/tmp/repos/rq__rq/rq/worker/base.py",
"private/tmp/repos/rq__rq/rq/intermediate_queue.py",
"private/tmp/repos/rq__rq/rq/worker/worker_classes.py",
"private/tmp/repos/rq__rq/rq/registry.py",
"private/tmp/repos/rq__rq/rq/worker/base.py"
]
} | {
"anchor": "retry",
"anchor_file": "private/tmp/repos/rq__rq/rq/job.py",
"anchor_source": "def retry(self, queue: 'Queue', pipeline: 'Pipeline'):\n \"\"\"Should be called when a job was enqueued with queue.enqueue(retry=Retry(...)) raises an exception.\n\n Requeues or schedules the job for execution. If retry_interval was set,\n the job will be scheduled for later; otherwise it will be enqueued immediately.\n\n Args:\n queue (Queue): The queue to retry the job on\n pipeline (Pipeline): The Redis' pipeline to use\n \"\"\"\n retry_interval = self.get_retry_interval()\n assert self.retries_left\n self.retries_left = self.retries_left - 1\n if retry_interval:\n scheduled_datetime = now() + timedelta(seconds=retry_interval)\n self.set_status(JobStatus.SCHEDULED)\n queue.schedule_job(self, scheduled_datetime, pipeline=pipeline)\n self.log.info(\n 'Job %s: scheduled for retry at %s, %s remaining', self.id, scheduled_datetime, self.retries_left\n )\n else:\n queue._enqueue_job(self, pipeline=pipeline)\n self.log.info('Job %s: enqueued for retry, %s remaining', self.id, self.retries_left)",
"result_size": 6,
"created_at": "2026-03-20T17:18:07.942698+00:00",
"file_content": ""
} |
rq__rq__add_execution__downstream__2hop_b5f84e | rq/rq | downstream | 2 | {
"hop_1": [
"current_timestamp"
],
"hop_1_files": [
"private/tmp/repos/rq__rq/rq/utils.py"
],
"hop_2": [
"now"
],
"hop_2_files": [
"private/tmp/repos/rq__rq/rq/utils.py"
]
} | {
"anchor": "add_execution",
"anchor_file": "private/tmp/repos/rq__rq/rq/registry.py",
"anchor_source": "def add_execution(self, execution: 'Execution', pipeline: 'Pipeline', ttl: int = 0, xx: bool = False) -> int:\n \"\"\"Adds an execution to a registry with expiry time of now + ttl, unless it's -1 which is set to +inf\n\n Args:\n execution (Execution): The Execution to add.\n pipeline (Pipeline): The Redis Pipeline.\n ttl (int, optional): The time to live. Defaults to 0.\n xx (bool, optional): .... Defaults to False.\n\n Returns:\n result (int): The ZADD command result\n \"\"\"\n score: Union[int, str] = ttl if ttl < 0 else current_timestamp() + ttl\n if score == -1:\n score = '+inf'\n\n return pipeline.zadd(self.key, {execution.composite_key: score}, xx=xx) # type: ignore",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.942698+00:00"
} |
psf__requests__to_key_val_list__upstream__1hop_8374b5 | psf/requests | upstream | 1 | {
"calls": [
"_encode_files",
"merge_setting"
],
"call_files": [
"private/tmp/repos/requests/src/requests/models.py",
"private/tmp/repos/requests/src/requests/sessions.py"
]
} | {
"anchor": "to_key_val_list",
"anchor_file": "private/tmp/repos/requests/src/requests/utils.py",
"anchor_source": "def to_key_val_list(value):\n \"\"\"Take an object and test to see if it can be represented as a\n dictionary. If it can be, return a list of tuples, e.g.,\n\n ::\n\n >>> to_key_val_list([('key', 'val')])\n [('key', 'val')]\n >>> to_key_val_list({'key': 'val'})\n [('key', 'val')]\n >>> to_key_val_list('string')\n Traceback (most recent call last):\n ...\n ValueError: cannot encode objects that are not 2-tuples\n\n :rtype: list\n \"\"\"\n if value is None:\n return None\n\n if isinstance(value, (str, bytes, bool, int)):\n raise ValueError(\"cannot encode objects that are not 2-tuples\")\n\n if isinstance(value, Mapping):\n value = value.items()\n\n return list(value)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.647326+00:00",
"file_content": "\"\"\"\nrequests.utils\n~~~~~~~~~~~~~~\n\nThis module provides utility functions that are used within Requests\nthat are also useful for external consumption.\n\"\"\"\n\nimport codecs\nimport contextlib\nimport io\nimport os\nimport re\nimport socket\nimport struct\nimport sys\nimport tempfile\nimport warnings\nimport zipfile\nfrom collections import OrderedDict\n\nfrom urllib3.util import make_headers, parse_url\n\nfrom . import certs\nfrom .__version__ import __version__\n\n# to_native_string is unused here, but imported here for backwards compatibility\nfrom ._internal_utils import ( # noqa: F401\n _HEADER_VALIDATORS_BYTE,\n _HEADER_VALIDATORS_STR,\n HEADER_VALIDATORS,\n to_native_string,\n)\nfrom .compat import (\n Mapping,\n basestring,\n bytes,\n getproxies,\n getproxies_environment,\n integer_types,\n is_urllib3_1,\n proxy_bypass,\n proxy_bypass_environment,\n quote,\n str,\n unquote,\n urlparse,\n urlunparse,\n)\nfrom .compat import parse_http_list as _parse_list_header\nfrom .cookies import cookiejar_from_dict\nfrom .exceptions import (\n FileModeWarning,\n InvalidHeader,\n InvalidURL,\n UnrewindableBodyError,\n)\nfrom .structures import CaseInsensitiveDict\n\nNETRC_FILES = (\".netrc\", \"_netrc\")\n\nDEFAULT_CA_BUNDLE_PATH = certs.where()\n\nDEFAULT_PORTS = {\"http\": 80, \"https\": 443}\n\n# Ensure that ', ' is used to preserve previous delimiter behavior.\nDEFAULT_ACCEPT_ENCODING = \", \".join(\n re.split(r\",\\s*\", make_headers(accept_encoding=True)[\"accept-encoding\"])\n)\n\n\nif sys.platform == \"win32\":\n # provide a proxy_bypass version on Windows without DNS lookups\n\n def proxy_bypass_registry(host):\n try:\n import winreg\n except ImportError:\n return False\n\n try:\n internetSettings = winreg.OpenKey(\n winreg.HKEY_CURRENT_USER,\n r\"Software\\Microsoft\\Windows\\CurrentVersion\\Internet Settings\",\n )\n # ProxyEnable could be REG_SZ or REG_DWORD, normalizing it\n proxyEnable = int(winreg.QueryValueEx(internetSettings, \"ProxyEnable\")[0])\n # ProxyOverride is almost always a string\n proxyOverride = winreg.QueryValueEx(internetSettings, \"ProxyOverride\")[0]\n except (OSError, ValueError):\n return False\n if not proxyEnable or not proxyOverride:\n return False\n\n # make a check value list from the registry entry: replace the\n # '<local>' string by the localhost entry and the corresponding\n # canonical entry.\n proxyOverride = proxyOverride.split(\";\")\n # filter out empty strings to avoid re.match return true in the following code.\n proxyOverride = filter(None, proxyOverride)\n # now check if we match one of the registry values.\n for test in proxyOverride:\n if test == \"<local>\":\n if \".\" not in host:\n return True\n test = test.replace(\".\", r\"\\.\") # mask dots\n test = test.replace(\"*\", r\".*\") # change glob sequence\n test = test.replace(\"?\", r\".\") # change glob char\n if re.match(test, host, re.I):\n return True\n return False\n\n def proxy_bypass(host): # noqa\n \"\"\"Return True, if the host should be bypassed.\n\n Checks proxy settings gathered from the environment, if specified,\n or the registry.\n \"\"\"\n if getproxies_environment():\n return proxy_bypass_environment(host)\n else:\n return proxy_bypass_registry(host)\n\n\ndef dict_to_sequence(d):\n \"\"\"Returns an internal sequence dictionary update.\"\"\"\n\n if hasattr(d, \"items\"):\n d = d.items()\n\n return d\n\n\ndef super_len(o):\n total_length = None\n current_position = 0\n\n if not is_urllib3_1 and isinstance(o, str):\n # urllib3 2.x+ treats all strings as utf-8 instead\n # of latin-1 (iso-8859-1) like http.client.\n o = o.encode(\"utf-8\")\n\n if hasattr(o, \"__len__\"):\n total_length = len(o)\n\n elif hasattr(o, \"len\"):\n total_length = o.len\n\n elif hasattr(o, \"fileno\"):\n try:\n fileno = o.fileno()\n except (io.UnsupportedOperation, AttributeError):\n # AttributeError is a surprising exception, seeing as how we've just checked\n # that `hasattr(o, 'fileno')`. It happens for objects obtained via\n # `Tarfile.extractfile()`, per issue 5229.\n pass\n else:\n total_length = os.fstat(fileno).st_size\n\n # Having used fstat to determine the file length, we need to\n # confirm that this file was opened up in binary mode.\n if \"b\" not in o.mode:\n warnings.warn(\n (\n \"Requests has determined the content-length for this \"\n \"request using the binary size of the file: however, the \"\n \"file has been opened in text mode (i.e. without the 'b' \"\n \"flag in the mode). This may lead to an incorrect \"\n \"content-length. In Requests 3.0, support will be removed \"\n \"for files in text mode.\"\n ),\n FileModeWarning,\n )\n\n if hasattr(o, \"tell\"):\n try:\n current_position = o.tell()\n except OSError:\n # This can happen in some weird situations, such as when the file\n # is actually a special file descriptor like stdin. In this\n # instance, we don't know what the length is, so set it to zero and\n # let requests chunk it instead.\n if total_length is not None:\n current_position = total_length\n else:\n if hasattr(o, \"seek\") and total_length is None:\n # StringIO and BytesIO have seek but no usable fileno\n try:\n # seek to end of file\n o.seek(0, 2)\n total_length = o.tell()\n\n # seek back to current position to support\n # partially read file-like objects\n o.seek(current_position or 0)\n except OSError:\n total_length = 0\n\n if total_length is None:\n total_length = 0\n\n return max(0, total_length - current_position)\n\n\ndef get_netrc_auth(url, raise_errors=False):\n \"\"\"Returns the Requests tuple auth for a given url from netrc.\"\"\"\n\n netrc_file = os.environ.get(\"NETRC\")\n if netrc_file is not None:\n netrc_locations = (netrc_file,)\n else:\n netrc_locations = (f\"~/{f}\" for f in NETRC_FILES)\n\n try:\n from netrc import NetrcParseError, netrc\n\n netrc_path = None\n\n for f in netrc_locations:\n loc = os.path.expanduser(f)\n if os.path.exists(loc):\n netrc_path = loc\n break\n\n # Abort early if there isn't one.\n if netrc_path is None:\n return\n\n ri = urlparse(url)\n host = ri.hostname\n\n try:\n _netrc = netrc(netrc_path).authenticators(host)\n if _netrc and any(_netrc):\n # Return with login / password\n login_i = 0 if _netrc[0] else 1\n return (_netrc[login_i], _netrc[2])\n except (NetrcParseError, OSError):\n # If there was a parsing error or a permissions issue reading the file,\n # we'll just skip netrc auth unless explicitly asked to raise errors.\n if raise_errors:\n raise\n\n # App Engine hackiness.\n except (ImportError, AttributeError):\n pass\n\n\ndef guess_filename(obj):\n \"\"\"Tries to guess the filename of the given object.\"\"\"\n name = getattr(obj, \"name\", None)\n if name and isinstance(name, basestring) and name[0] != \"<\" and name[-1] != \">\":\n return os.path.basename(name)\n\n\ndef extract_zipped_paths(path):\n \"\"\"Replace nonexistent paths that look \n# … (truncated at 8000 chars)"
} |
trpc__trpc__mergeWithoutOverrides__upstream__1hop_46f99d | trpc/trpc | upstream | 1 | {
"calls": [
"mergeRouters",
"createNewBuilder"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/router.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/procedureBuilder.ts"
]
} | {
"anchor": "mergeWithoutOverrides",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/utils.ts",
"anchor_source": "export function mergeWithoutOverrides<TType extends Record<string, unknown>>(\n obj1: TType,\n ...objs: Partial<TType>[]\n): TType {\n const newObj: TType = Object.assign(emptyObject(), obj1);\n\n for (const overrides of objs) {\n for (const key in overrides) {\n if (key in newObj && newObj[key] !== overrides[key]) {\n throw new Error(`Duplicate key ${key}`);\n }\n newObj[key as keyof TType] = overrides[key] as TType[keyof TType];\n }\n }\n return newObj;\n}",
"result_size": 2,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
pallets__click__get_current_context__upstream__1hop_b1b9a9 | pallets/click | upstream | 1 | {
"calls": [
"decorator",
"new_func",
"pass_meta_key",
"make_pass_decorator",
"pass_obj",
"resolve_color_default",
"pass_context"
],
"call_files": [
"private/tmp/repos/pallets__click/src/click/decorators.py",
"private/tmp/repos/pallets__click/src/click/decorators.py",
"private/tmp/repos/pallets__click/src/click/decorators.py",
"private/tmp/repos/pallets__click/src/click/decorators.py",
"private/tmp/repos/pallets__click/src/click/decorators.py",
"private/tmp/repos/pallets__click/src/click/globals.py",
"private/tmp/repos/pallets__click/src/click/decorators.py"
]
} | {
"anchor": "get_current_context",
"anchor_file": "private/tmp/repos/pallets__click/src/click/globals.py",
"anchor_source": "def get_current_context(silent: bool = False) -> Context | None:\n \"\"\"Returns the current click context. This can be used as a way to\n access the current context object from anywhere. This is a more implicit\n alternative to the :func:`pass_context` decorator. This function is\n primarily useful for helpers such as :func:`echo` which might be\n interested in changing its behavior based on the current context.\n\n To push the current context, :meth:`Context.scope` can be used.\n\n .. versionadded:: 5.0\n\n :param silent: if set to `True` the return value is `None` if no context\n is available. The default behavior is to raise a\n :exc:`RuntimeError`.\n \"\"\"\n try:\n return t.cast(\"Context\", _local.stack[-1])\n except (AttributeError, IndexError) as e:\n if not silent:\n raise RuntimeError(\"There is no active click context.\") from e\n\n return None",
"result_size": 7,
"created_at": "2026-03-20T17:18:07.184893+00:00",
"file_content": ""
} |
pallets__jinja__visit_Test__downstream__2hop_e65ba0 | pallets/jinja | downstream | 2 | {
"hop_1": [
"visit",
"_filter_test_common"
],
"hop_1_files": [
"private/tmp/repos/pallets__jinja/src/jinja2/visitor.py",
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py"
],
"hop_2": [
"generic_visit",
"get_visitor",
"fail",
"signature",
"from_obj",
"write"
],
"hop_2_files": [
"private/tmp/repos/pallets__jinja/src/jinja2/visitor.py",
"private/tmp/repos/pallets__jinja/src/jinja2/visitor.py",
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py",
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py",
"private/tmp/repos/pallets__jinja/src/jinja2/utils.py",
"private/tmp/repos/pallets__jinja/src/jinja2/compiler.py"
]
} | {
"anchor": "visit_Test",
"anchor_file": "private/tmp/repos/pallets__jinja/src/jinja2/compiler.py",
"anchor_source": "@optimizeconst\n def visit_Test(self, node: nodes.Test, frame: Frame) -> None:\n with self._filter_test_common(node, frame, False):\n self.visit(node.node, frame)",
"result_size": 6,
"created_at": "2026-03-20T17:18:07.327357+00:00"
} |
paramiko__paramiko__shutdown__downstream__1hop_3c1fa9 | paramiko/paramiko | downstream | 1 | {
"calls": [
"_send_eof"
],
"call_files": [
"private/tmp/repos/paramiko/paramiko/channel.py"
]
} | {
"anchor": "shutdown",
"anchor_file": "private/tmp/repos/paramiko/paramiko/channel.py",
"anchor_source": "def shutdown(self, how):\n \"\"\"\n Shut down one or both halves of the connection. If ``how`` is 0,\n further receives are disallowed. If ``how`` is 1, further sends\n are disallowed. If ``how`` is 2, further sends and receives are\n disallowed. This closes the stream in one or both directions.\n\n :param int how:\n 0 (stop receiving), 1 (stop sending), or 2 (stop receiving and\n sending).\n \"\"\"\n if (how == 0) or (how == 2):\n # feign \"read\" shutdown\n self.eof_received = 1\n if (how == 1) or (how == 2):\n self.lock.acquire()\n try:\n m = self._send_eof()\n finally:\n self.lock.release()\n if m is not None and self.transport is not None:\n self.transport._send_user_message(m)",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.460561+00:00"
} |
trpc__trpc__isAsyncIterable__downstream__1hop_830717 | trpc/trpc | downstream | 1 | {
"calls": [
"isObject"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/utils.ts"
]
} | {
"anchor": "isAsyncIterable",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/utils.ts",
"anchor_source": "export function isAsyncIterable<TValue>(\n value: unknown,\n): value is AsyncIterable<TValue> {\n return (\n asyncIteratorsSupported && isObject(value) && Symbol.asyncIterator in value\n );\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
immerjs__immer__loadPlugin__upstream__1hop_ee7189 | immerjs/immer | upstream | 1 | {
"calls": [
"enablePatches",
"enableMapSet",
"enableArrayMethods"
],
"call_files": [
"private/tmp/repos/immerjs__immer/src/plugins/patches.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"private/tmp/repos/immerjs__immer/src/plugins/arrayMethods.ts"
]
} | {
"anchor": "loadPlugin",
"anchor_file": "private/tmp/repos/immerjs__immer/src/utils/plugins.ts",
"anchor_source": "export function loadPlugin<K extends keyof Plugins>(\n\tpluginKey: K,\n\timplementation: Plugins[K]\n): void {\n\tif (!plugins[pluginKey]) plugins[pluginKey] = implementation\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.914297+00:00",
"file_content": ""
} |
colinhacks__zod__positive__downstream__2hop_a1aea2 | colinhacks/zod | downstream | 2 | {
"hop_1": [
"_addCheck"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
],
"hop_2": [
"ZodBigInt",
"constructor"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "positive",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"anchor_source": "positive(message?: errorUtil.ErrMessage) {\n return this._addCheck({\n kind: \"min\",\n value: BigInt(0),\n inclusive: false,\n message: errorUtil.toString(message),\n });\n }",
"result_size": 2,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
locustio__locust__send_to_client__upstream__2hop_0e62b0 | locustio/locust | upstream | 2 | {
"hop_1": [
"send_message",
"quit",
"stop",
"handle_message",
"client_listener"
],
"hop_1_files": [
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/runners.py"
],
"hop_2": [
"heartbeat_worker",
"start",
"start_automatic_run",
"update_user_class",
"shutdown",
"stop_and_optionally_quit",
"on_quitting",
"main"
],
"hop_2_files": [
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/main.py",
"private/tmp/repos/locustio__locust/locust/env.py",
"private/tmp/repos/locustio__locust/locust/main.py",
"private/tmp/repos/locustio__locust/locust/main.py",
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/main.py"
]
} | {
"anchor": "send_to_client",
"anchor_file": "private/tmp/repos/locustio__locust/locust/rpc/zmqrpc.py",
"anchor_source": "@retry()\n def send_to_client(self, msg):\n try:\n self.socket.send_multipart([msg.node_id.encode(), msg.serialize()])\n except zmqerr.ZMQError as e:\n raise RPCSendError(\"ZMQ sent failure\") from e",
"result_size": 8,
"created_at": "2026-03-20T17:18:07.062541+00:00",
"file_content": ""
} |
pallets__click___split_opt__upstream__1hop_ccc87e | pallets/click | upstream | 1 | {
"calls": [
"get_help_extra",
"_parse_decls",
"_normalize_opt",
"join_options"
],
"call_files": [
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/core.py",
"private/tmp/repos/pallets__click/src/click/parser.py",
"private/tmp/repos/pallets__click/src/click/formatting.py"
]
} | {
"anchor": "_split_opt",
"anchor_file": "private/tmp/repos/pallets__click/src/click/parser.py",
"anchor_source": "def _split_opt(opt: str) -> tuple[str, str]:\n first = opt[:1]\n if first.isalnum():\n return \"\", opt\n if opt[1:2] == first:\n return opt[:2], opt[2:]\n return first, opt[1:]",
"result_size": 4,
"created_at": "2026-03-20T17:18:07.184893+00:00",
"file_content": ""
} |
pallets__click__show__downstream__2hop_32469e | pallets/click | downstream | 2 | {
"hop_1": [
"get_text_stderr",
"echo",
"format_message"
],
"hop_1_files": [
"private/tmp/repos/pallets__click/src/click/_compat.py",
"private/tmp/repos/pallets__click/src/click/utils.py",
"private/tmp/repos/pallets__click/src/click/exceptions.py"
],
"hop_2": [
"strip_ansi",
"_get_windows_console_stream",
"resolve_color_default",
"auto_wrap_for_ansi",
"_find_binary_writer"
],
"hop_2_files": [
"private/tmp/repos/pallets__click/src/click/_compat.py",
"private/tmp/repos/pallets__click/src/click/_compat.py",
"private/tmp/repos/pallets__click/src/click/globals.py",
"private/tmp/repos/pallets__click/src/click/_compat.py",
"private/tmp/repos/pallets__click/src/click/_compat.py"
]
} | {
"anchor": "show",
"anchor_file": "private/tmp/repos/pallets__click/src/click/exceptions.py",
"anchor_source": "def show(self, file: t.IO[t.Any] | None = None) -> None:\n if file is None:\n file = get_text_stderr()\n\n echo(\n _(\"Error: {message}\").format(message=self.format_message()),\n file=file,\n color=self.show_color,\n )",
"result_size": 6,
"created_at": "2026-03-20T17:18:07.184893+00:00"
} |
locustio__locust__default_args_dict__upstream__2hop_fb3aa2 | locustio/locust | upstream | 2 | {
"hop_1": [
"handle_message",
"ui_extra_args_dict"
],
"hop_1_files": [
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/argument_parser.py"
],
"hop_2": [
"worker",
"update_template_args"
],
"hop_2_files": [
"private/tmp/repos/locustio__locust/locust/runners.py",
"private/tmp/repos/locustio__locust/locust/web.py"
]
} | {
"anchor": "default_args_dict",
"anchor_file": "private/tmp/repos/locustio__locust/locust/argument_parser.py",
"anchor_source": "def default_args_dict() -> dict:\n # returns a dict containing the default arguments (before any custom arguments are added)\n default_parser = get_empty_argument_parser()\n setup_parser_arguments(default_parser)\n # Dont read config files because they may contain custom arguments, which would fail parsing in the next step\n default_parser._default_config_files = {}\n return vars(default_parser.parse([]))",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.062541+00:00",
"file_content": ""
} |
python-poetry__poetry__build_config_setting_validator__downstream__1hop_e7ac4a | python-poetry/poetry | downstream | 1 | {
"calls": [
"build_config_setting_normalizer"
],
"call_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/config/config.py"
]
} | {
"anchor": "build_config_setting_validator",
"anchor_file": "private/tmp/repos/python-poetry__poetry/src/poetry/config/config.py",
"anchor_source": "def build_config_setting_validator(val: str) -> bool:\n try:\n value = build_config_setting_normalizer(val)\n except JSONDecodeError:\n return False\n\n if not isinstance(value, dict):\n return False\n\n for key, item in value.items():\n # keys should be string\n if not isinstance(key, str):\n return False\n\n # items are allowed to be a string\n if isinstance(item, str):\n continue\n\n # list items should only contain strings\n is_valid_list = isinstance(item, list) and all(isinstance(i, str) for i in item)\n if not is_valid_list:\n return False\n\n return True",
"result_size": 1,
"created_at": "2026-03-20T17:18:07.827462+00:00"
} |
encode__httpx__aiter_bytes__upstream__2hop_6d68f7 | encode/httpx | upstream | 2 | {
"hop_1": [
"aread",
"aiter_text"
],
"hop_1_files": [
"private/tmp/repos/encode__httpx/httpx/_models.py",
"private/tmp/repos/encode__httpx/httpx/_models.py"
],
"hop_2": [
"_send_handling_auth",
"_send_handling_redirects",
"send",
"async_auth_flow",
"aiter_lines"
],
"hop_2_files": [
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_client.py",
"private/tmp/repos/encode__httpx/httpx/_auth.py",
"private/tmp/repos/encode__httpx/httpx/_models.py"
]
} | {
"anchor": "aiter_bytes",
"anchor_file": "private/tmp/repos/encode__httpx/httpx/_models.py",
"anchor_source": "async def aiter_bytes(\n self, chunk_size: int | None = None\n ) -> typing.AsyncIterator[bytes]:\n \"\"\"\n A byte-iterator over the decoded response content.\n This allows us to handle gzip, deflate, brotli, and zstd encoded responses.\n \"\"\"\n if hasattr(self, \"_content\"):\n chunk_size = len(self._content) if chunk_size is None else chunk_size\n for i in range(0, len(self._content), max(chunk_size, 1)):\n yield self._content[i : i + chunk_size]\n else:\n decoder = self._get_content_decoder()\n chunker = ByteChunker(chunk_size=chunk_size)\n with request_context(request=self._request):\n async for raw_bytes in self.aiter_raw():\n decoded = decoder.decode(raw_bytes)\n for chunk in chunker.decode(decoded):\n yield chunk\n decoded = decoder.flush()\n for chunk in chunker.decode(decoded):\n yield chunk # pragma: no cover\n for chunk in chunker.flush():\n yield chunk",
"result_size": 5,
"created_at": "2026-03-20T17:18:06.815591+00:00",
"file_content": ""
} |
encode__httpx__read__upstream__1hop_f26016 | encode/httpx | upstream | 1 | {
"calls": [
"handle_request",
"sync_auth_flow"
],
"call_files": [
"private/tmp/repos/encode__httpx/httpx/_transports/mock.py",
"private/tmp/repos/encode__httpx/httpx/_auth.py"
]
} | {
"anchor": "read",
"anchor_file": "private/tmp/repos/encode__httpx/httpx/_models.py",
"anchor_source": "def read(self) -> bytes:\n \"\"\"\n Read and return the request content.\n \"\"\"\n if not hasattr(self, \"_content\"):\n assert isinstance(self.stream, typing.Iterable)\n self._content = b\"\".join(self.stream)\n if not isinstance(self.stream, ByteStream):\n # If a streaming request has been read entirely into memory, then\n # we can replace the stream with a raw bytes implementation,\n # to ensure that any non-replayable streams can still be used.\n self.stream = ByteStream(self._content)\n return self._content",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.815591+00:00",
"file_content": ""
} |
python-poetry__poetry__create_venv__downstream__2hop_8e77bd | python-poetry/poetry | downstream | 2 | {
"hop_1": [
"get_system_env",
"remove_venv",
"use_in_project_venv",
"get",
"generate_env_name",
"build_venv"
],
"hop_1_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py"
],
"hop_2": [
"get_base_prefix",
"in_project_venv_exists"
],
"hop_2_files": [
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py"
]
} | {
"anchor": "create_venv",
"anchor_file": "private/tmp/repos/python-poetry__poetry/src/poetry/utils/env/env_manager.py",
"anchor_source": "def create_venv(\n self,\n name: str | None = None,\n python: Python | None = None,\n force: bool = False,\n ) -> Env:\n if self._env is not None and not force:\n return self._env\n\n cwd = self._poetry.file.path.parent\n env = self.get(reload=True)\n\n if not env.is_sane():\n force = True\n\n if env.is_venv() and not force:\n # Already inside a virtualenv.\n current_python = Version.parse(\n \".\".join(str(c) for c in env.version_info[:3])\n )\n if not self._poetry.package.python_constraint.allows(current_python):\n raise InvalidCurrentPythonVersionError(\n self._poetry.package.python_versions, str(current_python)\n )\n return env\n\n create_venv = self._poetry.config.get(\"virtualenvs.create\")\n in_project_venv = self.use_in_project_venv()\n venv_prompt = self._poetry.config.get(\"virtualenvs.prompt\")\n\n specific_python_requested = python is not None\n if not python:\n python = Python.get_preferred_python(\n config=self._poetry.config, io=self._io\n )\n\n venv_path = (\n self.in_project_venv\n if in_project_venv\n else self._poetry.config.virtualenvs_path\n )\n if not name:\n name = self._poetry.package.name\n\n supported_python = self._poetry.package.python_constraint\n if not supported_python.allows(python.patch_version):\n # The currently activated or chosen Python version\n # is not compatible with the Python constraint specified\n # for the project.\n # If an executable has been specified, we stop there\n # and notify the user of the incompatibility.\n # Otherwise, we try to find a compatible Python version.\n if specific_python_requested:\n raise NoCompatiblePythonVersionFoundError(\n self._poetry.package.python_versions,\n python.patch_version.to_string(),\n )\n\n self._io.write_error_line(\n f\"<warning>The currently activated Python version {python.patch_version.to_string()} is not\"\n f\" supported by the project ({self._poetry.package.python_versions}).\\n\"\n \"Trying to find and use a compatible version.</warning> \"\n )\n\n python = Python.get_compatible_python(poetry=self._poetry, io=self._io)\n\n if in_project_venv:\n venv = venv_path\n else:\n name = self.generate_env_name(name, str(cwd))\n name = f\"{name}-py{python.minor_version.to_string()}\"\n venv = venv_path / name\n\n if venv_prompt is not None:\n try:\n venv_prompt = venv_prompt.format(\n project_name=self._poetry.package.name or \"virtualenv\",\n python_version=python.minor_version.to_string(),\n )\n except KeyError as e:\n raise PoetryConsoleError(\n f\"Invalid template variable '{e.args[0]}' in 'virtualenvs.prompt' setting.\\n\"\n f\"Valid variables are: {{project_name}}, {{python_version}}\"\n ) from e\n except ValueError as e:\n raise PoetryConsoleError(\n f\"Invalid template string in 'virtualenvs.prompt' setting: {e}\"\n ) from e\n\n if not venv.exists():\n if create_venv is False:\n self._io.write_error_line(\n \"<fg=black;bg=yellow>\"\n \"Skipping virtualenv creation, \"\n \"as specified in config file.\"\n \"</>\"\n )\n\n return self.get_system_env()\n\n self._io.write_error_line(\n f\"Creating virtualenv <c1>{name}</> in\"\n f\" {venv_path if not WINDOWS else get_real_windows_path(venv_path)!s}\"\n )\n else:\n create_venv = False\n if force:\n if not env.is_sane():\n self._io.write_error_line(\n f\"<warning>The virtual environment found in {env.path} seems to\"\n \" be broken.</warning>\"\n )\n self._io.write_error_line(\n f\"Recreating virtualenv <c1>{name}</> in {venv!s}\"\n )\n self.remove_venv(venv)\n create_venv = True\n elif self._io.is_very_verbose():\n self._io.write_error_line(f\"Virtualenv <c1>{name}</> already exists.\")\n\n if create_venv:\n self.build_venv(\n venv,\n executable=python.executable,\n flags=self._poetry.config.get(\"virtualenvs.options\"),\n prompt=venv_prompt,\n )\n\n # venv detection:\n # stdlib venv may symlink sys.executable, so we can't use realpath.\n # but others can symlink *to* the venv Python,\n # so we can't just use sys.executable.\n # So we just check every item in the symlink tree (generally <= 3)\n p = os.path.normcase(sys.executable)\n paths = [p]\n while os.path.islink(p):\n p = os.path.normcase(os.path.join(os.path.dirname(p), os.readlink(p)))\n paths.append(p)\n\n p_venv = os.path.normcase(str(venv))\n if any(p.startswith(p_venv) for p in paths):\n # Running properly in the virtualenv, don't need to do anything\n return self.get_system_env()\n\n return VirtualEnv(venv)",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.827462+00:00"
} |
trpc__trpc__createDeferred__upstream__2hop_d3ec5e | trpc/trpc | upstream | 2 | {
"hop_1": [
"mergeAsyncIterables",
"jsonlStreamConsumer",
"write",
"initIterable"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/mergeAsyncIterables.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/jsonl.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/jsonl.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/mergeAsyncIterables.ts"
],
"hop_2": [
"add",
"httpBatchStreamLink",
"fetch",
"transform"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/mergeAsyncIterables.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchStreamLink.ts",
"private/tmp/repos/trpc__trpc/packages/client/src/links/httpBatchStreamLink.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/jsonl.ts"
]
} | {
"anchor": "createDeferred",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/server/src/unstable-core-do-not-import/stream/utils/createDeferred.ts",
"anchor_source": "export function createDeferred<TValue = void>() {\n let resolve: (value: TValue) => void;\n let reject: (error: unknown) => void;\n const promise = new Promise<TValue>((res, rej) => {\n resolve = res;\n reject = rej;\n });\n\n return { promise, resolve: resolve!, reject: reject! };\n}",
"result_size": 7,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
trpc__trpc__CompanyLogo__downstream__1hop_1f4022 | trpc/trpc | downstream | 1 | {
"calls": [
"cn"
],
"call_files": [
"private/tmp/repos/trpc__trpc/www/src/utils/cn.ts"
]
} | {
"anchor": "CompanyLogo",
"anchor_file": "private/tmp/repos/trpc__trpc/www/src/components/CompaniesUsing.tsx",
"anchor_source": "function CompanyLogo(props: { src: string; name: string }) {\n const triggerRef = React.useRef<HTMLSpanElement>(null);\n const [position, setPosition] = React.useState<{\n top: number;\n left: number;\n } | null>(null);\n\n return (\n <>\n <span\n ref={triggerRef}\n className={cn(\n 'inline-flex items-center justify-center rounded-lg px-3 py-2 transition-colors',\n position && 'dark:bg-white/90',\n )}",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
colinhacks__zod___isoTime__upstream__2hop_d19bee | colinhacks/zod | upstream | 2 | {
"hop_1": [
"time"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/mini/iso.ts"
],
"hop_2": [
"convertBaseSchema",
"time"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/from-json-schema.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/schemas.ts"
]
} | {
"anchor": "_isoTime",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/api.ts",
"anchor_source": "export function _isoTime<T extends schemas.$ZodISOTime>(\n Class: util.SchemaClass<T>,\n params?: string | $ZodISOTimeParams | $ZodCheckISOTimeParams\n): T {\n return new Class({\n type: \"string\",\n format: \"time\",\n check: \"string_format\",\n precision: null,\n ...util.normalizeParams(params),\n });\n}",
"result_size": 3,
"created_at": "2026-03-20T17:18:06.594501+00:00",
"file_content": ""
} |
getpelican__pelican__handle_data__downstream__2hop_92a5d8 | getpelican/pelican | downstream | 2 | {
"hop_1": [
"getoffset",
"add_last_word"
],
"hop_1_files": [
"private/tmp/repos/getpelican__pelican/pelican/utils.py",
"private/tmp/repos/getpelican__pelican/pelican/utils.py"
],
"hop_2": [
"add_word"
],
"hop_2_files": [
"private/tmp/repos/getpelican__pelican/pelican/utils.py"
]
} | {
"anchor": "handle_data",
"anchor_file": "private/tmp/repos/getpelican__pelican/pelican/utils.py",
"anchor_source": "def handle_data(self, data: str) -> None:\n word_end = 0\n offset = self.getoffset()\n\n while self.words_found < self.max_words:\n match = self._word_regex.search(data, word_end)\n if not match:\n break\n\n if match.start(0) > 0:\n self.add_last_word()\n\n word_end = match.end(0)\n self.last_word_end = offset + word_end\n\n if word_end < len(data):\n self.add_last_word()",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.869475+00:00"
} |
colinhacks__zod__partial__downstream__2hop_084389 | colinhacks/zod | downstream | 2 | {
"hop_1": [
"ZodObject",
"constructor",
"optional"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
],
"hop_2": [
"ZodOptional"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "partial",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"anchor_source": "partial(mask?: any) {\n const newShape: any = {};\n\n for (const key of util.objectKeys(this.shape)) {\n const fieldSchema = this.shape[key]!;\n\n if (mask && !mask[key]) {\n newShape[key] = fieldSchema;\n } else {\n newShape[key] = fieldSchema.optional();\n }\n }\n\n return new ZodObject({\n ...this._def,\n shape: () => newShape,\n }) as any;\n }",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
getpelican__pelican__generate_context__downstream__1hop_82f914 | getpelican/pelican | downstream | 1 | {
"calls": [
"_add_failed_source_path",
"_update_context",
"add_source_path",
"add_static_links",
"get_files",
"_process"
],
"call_files": [
"private/tmp/repos/getpelican__pelican/pelican/generators.py",
"private/tmp/repos/getpelican__pelican/pelican/generators.py",
"private/tmp/repos/getpelican__pelican/pelican/generators.py",
"private/tmp/repos/getpelican__pelican/pelican/generators.py",
"private/tmp/repos/getpelican__pelican/pelican/generators.py",
"private/tmp/repos/getpelican__pelican/pelican/generators.py"
]
} | {
"anchor": "generate_context",
"anchor_file": "private/tmp/repos/getpelican__pelican/pelican/generators.py",
"anchor_source": "def generate_context(self):\n all_pages = []\n hidden_pages = []\n draft_pages = []\n for f in self.get_files(\n self.settings[\"PAGE_PATHS\"], exclude=self.settings[\"PAGE_EXCLUDES\"]\n ):\n page = self.get_cached_data(f, None)\n if page is None:\n try:\n page = self.readers.read_file(\n base_path=self.path,\n path=f,\n content_class=Page,\n context=self.context,\n preread_signal=signals.page_generator_preread,\n preread_sender=self,\n context_signal=signals.page_generator_context,\n context_sender=self,\n )\n except Exception:\n logger.exception(\n \"Could not process %s\",\n f,\n exc_info=self.settings.get(\"DEBUG\", False),\n )\n self._add_failed_source_path(f)\n continue\n\n if isinstance(page, SkipStub):\n logger.debug(\"Safely skipping %s\", f)\n continue\n\n if not page.is_valid():\n self._add_failed_source_path(f)\n continue\n\n self.cache_data(f, page)\n\n if page.status == \"published\":\n all_pages.append(page)\n elif page.status == \"hidden\":\n hidden_pages.append(page)\n elif page.status == \"draft\":\n draft_pages.append(page)\n elif page.status == \"skip\":\n raise AssertionError(\"Documents with 'skip' status should be skipped\")\n\n self.add_source_path(page)\n self.add_static_links(page)\n\n def _process(pages):\n origs, translations = process_translations(\n pages, translation_id=self.settings[\"PAGE_TRANSLATION_ID\"]\n )\n origs = order_content(origs, self.settings[\"PAGE_ORDER_BY\"])\n return origs, translations\n\n self.pages, self.translations = _process(all_pages)\n self.hidden_pages, self.hidden_translations = _process(hidden_pages)\n self.draft_pages, self.draft_translations = _process(draft_pages)\n\n self._update_context((\"pages\", \"hidden_pages\", \"draft_pages\"))\n\n self.save_cache()\n self.readers.save_cache()\n signals.page_generator_finalized.send(self)",
"result_size": 6,
"created_at": "2026-03-20T17:18:06.869475+00:00"
} |
colinhacks__zod__duration__downstream__2hop_9edb25 | colinhacks/zod | downstream | 2 | {
"hop_1": [
"_isoDuration"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/api.ts"
],
"hop_2": [
"normalizeParams"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/util.ts"
]
} | {
"anchor": "duration",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/classic/iso.ts",
"anchor_source": "export function duration(params?: string | core.$ZodISODurationParams): ZodISODuration {\n return core._isoDuration(ZodISODuration, params);\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
immerjs__immer__updateDraftInParent__upstream__2hop_e8dfc0 | immerjs/immer | upstream | 2 | {
"hop_1": [
"registerChildFinalizationCallback",
"childCleanup",
"handleCrossReference",
"crossReferenceCleanup"
],
"hop_1_files": [
"private/tmp/repos/immerjs__immer/src/core/finalize.ts",
"private/tmp/repos/immerjs__immer/src/core/finalize.ts",
"private/tmp/repos/immerjs__immer/src/core/finalize.ts",
"private/tmp/repos/immerjs__immer/src/core/finalize.ts"
],
"hop_2": [
"createProxy",
"set",
"handleInsertedValues",
"enableArrayMethods",
"enableMapSet",
"add"
],
"hop_2_files": [
"private/tmp/repos/immerjs__immer/src/core/immerClass.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"private/tmp/repos/immerjs__immer/src/plugins/arrayMethods.ts",
"private/tmp/repos/immerjs__immer/src/plugins/arrayMethods.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts",
"private/tmp/repos/immerjs__immer/src/plugins/mapset.ts"
]
} | {
"anchor": "updateDraftInParent",
"anchor_file": "private/tmp/repos/immerjs__immer/src/core/finalize.ts",
"anchor_source": "export function updateDraftInParent(\n\tparent: ImmerState,\n\tdraftValue: any,\n\tfinalizedValue: any,\n\toriginalKey?: string | number | symbol\n): void {\n\tconst parentCopy = latest(parent)\n\tconst parentType = parent.type_\n\n\t// Fast path: Check if draft is still at original key\n\tif (originalKey !== undefined) {\n\t\tconst currentValue = get(parentCopy, originalKey, parentType)\n\t\tif (currentValue === draftValue) {\n\t\t\t// Still at original location, just update it\n\t\t\tset(parentCopy, originalKey, finalizedValue, parentType)\n\t\t\treturn\n\t\t}\n\t}\n\n\t// Slow path: Build reverse mapping of all children\n\t// to their indices in the parent, so that we can\n\t// replace all locations where this draft appears.\n\t// We only have to build this once per parent.\n\tif (!parent.draftLocations_) {\n\t\tconst draftLocations = (parent.draftLocations_ = new Map())\n\n\t\t// Use `each` which works on Arrays, Maps, and Objects\n\t\teach(parentCopy, (key, value) => {\n\t\t\tif (isDraft(value)) {\n\t\t\t\tconst keys = draftLocations.get(value) || []\n\t\t\t\tkeys.push(key)\n\t\t\t\tdraftLocations.set(value, keys)\n\t\t\t}\n\t\t})\n\t}\n\n\t// Look up all locations where this draft appears\n\tconst locations =\n\t\tparent.draftLocations_.get(draftValue) ?? EMPTY_LOCATIONS_RESULT\n\n\t// Update all locations\n\tfor (const location of locations) {\n\t\tset(parentCopy, location, finalizedValue, parentType)\n\t}\n}",
"result_size": 7,
"created_at": "2026-03-20T17:18:06.914297+00:00",
"file_content": ""
} |
colinhacks__zod___default__downstream__2hop_7a6171 | colinhacks/zod | downstream | 2 | {
"hop_1": [
"shallowClone"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/util.ts"
],
"hop_2": [
"isPlainObject"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/util.ts"
]
} | {
"anchor": "_default",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v4/core/api.ts",
"anchor_source": "export function _default<T extends schemas.$ZodObject>(\n Class: util.SchemaClass<schemas.$ZodDefault>,\n innerType: T,\n defaultValue: util.NoUndefined<core.output<T>> | (() => util.NoUndefined<core.output<T>>)\n): schemas.$ZodDefault<T> {\n return new Class({\n type: \"default\",\n innerType,\n get defaultValue() {\n return typeof defaultValue === \"function\" ? (defaultValue as Function)() : util.shallowClone(defaultValue);\n },\n }) as any;\n}",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
psf__black__do_transform__downstream__2hop_b3fd0a | psf/black | downstream | 2 | {
"hop_1": [
"CannotTransform",
"_merge_string_group",
"_remove_backslash_line_continuation_chars"
],
"hop_1_files": [
"private/tmp/repos/psf__black/src/black/trans.py",
"private/tmp/repos/psf__black/src/black/trans.py",
"private/tmp/repos/psf__black/src/black/trans.py"
],
"hop_2": [
"_merge_one_string_group",
"is_valid_index_factory",
"_validate_msg"
],
"hop_2_files": [
"private/tmp/repos/psf__black/src/black/trans.py",
"private/tmp/repos/psf__black/src/black/trans.py",
"private/tmp/repos/psf__black/src/black/trans.py"
]
} | {
"anchor": "do_transform",
"anchor_file": "private/tmp/repos/psf__black/src/black/trans.py",
"anchor_source": "def do_transform(\n self, line: Line, string_indices: list[int]\n ) -> Iterator[TResult[Line]]:\n new_line = line\n\n rblc_result = self._remove_backslash_line_continuation_chars(\n new_line, string_indices\n )\n if isinstance(rblc_result, Ok):\n new_line = rblc_result.ok()\n\n msg_result = self._merge_string_group(new_line, string_indices)\n if isinstance(msg_result, Ok):\n new_line = msg_result.ok()\n\n if isinstance(rblc_result, Err) and isinstance(msg_result, Err):\n msg_cant_transform = msg_result.err()\n rblc_cant_transform = rblc_result.err()\n cant_transform = CannotTransform(\n \"StringMerger failed to merge any strings in this line.\"\n )\n\n # Chain the errors together using `__cause__`.\n msg_cant_transform.__cause__ = rblc_cant_transform\n cant_transform.__cause__ = msg_cant_transform\n\n yield Err(cant_transform)\n else:\n yield Ok(new_line)",
"result_size": 3,
"created_at": "2026-03-20T17:18:07.570141+00:00"
} |
sindresorhus__got___writeChunksToRequest__downstream__1hop_4432c7 | sindresorhus/got | downstream | 1 | {
"calls": [
"_isRequestStale",
"_writeRequest"
],
"call_files": [
"private/tmp/repos/got/source/core/index.ts",
"private/tmp/repos/got/source/core/index.ts"
]
} | {
"anchor": "_writeChunksToRequest",
"anchor_file": "private/tmp/repos/got/source/core/index.ts",
"anchor_source": "private async _writeChunksToRequest(buffer: Uint8Array, request: ClientRequest): Promise<void> {\n\t\tconst chunkSize = 65_536; // 64 KB\n\t\tconst isStale = () => this._isRequestStale(request);\n\n\t\tfor (const part of chunk(buffer, chunkSize)) {\n\t\t\tif (isStale()) {\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// eslint-disable-next-line no-await-in-loop\n\t\t\tawait new Promise<void>((resolve, reject) => {\n\t\t\t\tthis._writeRequest(part, undefined, error => {\n\t\t\t\t\tif (error) {\n\t\t\t\t\t\tif (isStale()) {\n\t\t\t\t\t\t\tresolve();\n\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\treject(error);\n\t\t\t\t\t\treturn;\n\t\t\t\t\t}\n\n\t\t\t\t\tif (isStale()) {\n\t\t\t\t\t\tresolve();\n\t\t\t\t\t\treturn;\n\t\t\t\t\t}\n\n\t\t\t\t\tsetImmediate(resolve);\n\t\t\t\t}, request);\n\t\t\t});\n\t\t}\n\t}",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.163238+00:00"
} |
pytest-dev__pytest__derive_importpath__downstream__1hop_816a86 | pytest-dev/pytest | downstream | 1 | {
"calls": [
"resolve",
"annotated_getattr"
],
"call_files": [
"private/tmp/repos/pytest-dev__pytest/src/_pytest/monkeypatch.py",
"private/tmp/repos/pytest-dev__pytest/src/_pytest/monkeypatch.py"
]
} | {
"anchor": "derive_importpath",
"anchor_file": "private/tmp/repos/pytest-dev__pytest/src/_pytest/monkeypatch.py",
"anchor_source": "def derive_importpath(import_path: str, raising: bool) -> tuple[str, object]:\n if not isinstance(import_path, str) or \".\" not in import_path:\n raise TypeError(f\"must be absolute import path string, not {import_path!r}\")\n module, attr = import_path.rsplit(\".\", 1)\n target = resolve(module)\n if raising:\n annotated_getattr(target, attr, ann=module)\n return attr, target",
"result_size": 2,
"created_at": "2026-03-20T17:18:07.712542+00:00"
} |
trpc__trpc__mutation__upstream__1hop_bb7bd0 | trpc/trpc | upstream | 1 | {
"calls": [
"trpcMutationOptions",
"experimental_createActionHook",
"createUtilityFunctions",
"useAction",
"setMutationDefaults"
],
"call_files": [
"private/tmp/repos/trpc__trpc/packages/tanstack-react-query/src/internals/mutationOptions.ts",
"private/tmp/repos/trpc__trpc/packages/next/src/app-dir/create-action-hook.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/utils/createUtilityFunctions.ts",
"private/tmp/repos/trpc__trpc/packages/next/src/app-dir/create-action-hook.tsx",
"private/tmp/repos/trpc__trpc/packages/react-query/src/utils/createUtilityFunctions.ts"
]
} | {
"anchor": "mutation",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/internals/TRPCUntypedClient.ts",
"anchor_source": "public mutation(path: string, input?: unknown, opts?: TRPCRequestOptions) {\n return this.requestAsPromise<unknown, unknown>({\n type: 'mutation',\n path,\n input,\n context: opts?.context,\n signal: opts?.signal,\n });\n }",
"result_size": 6,
"created_at": "2026-03-20T17:18:08.256883+00:00",
"file_content": ""
} |
sindresorhus__got__getRequestFunction__downstream__2hop_276a47 | sindresorhus/got | downstream | 2 | {
"hop_1": [
"#getFallbackRequestFunction",
"#callFallbackRequest",
"#resolveRequestWithFallback"
],
"hop_1_files": [
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts",
"private/tmp/repos/got/source/core/options.ts"
],
"hop_2": [
"#resolveFallbackRequestResult"
],
"hop_2_files": [
"private/tmp/repos/got/source/core/options.ts"
]
} | {
"anchor": "getRequestFunction",
"anchor_file": "private/tmp/repos/got/source/core/options.ts",
"anchor_source": "getRequestFunction() {\n\t\tconst {request: customRequest} = this.#internals;\n\n\t\tif (!customRequest) {\n\t\t\treturn this.#getFallbackRequestFunction();\n\t\t}\n\n\t\tconst requestWithFallback: RequestFunction = (url, options, callback?) => {\n\t\t\tconst result = customRequest(url, options, callback);\n\n\t\t\tif (is.promise(result)) {\n\t\t\t\treturn this.#resolveRequestWithFallback(result, url, options, callback);\n\t\t\t}\n\n\t\t\tif (result !== undefined) {\n\t\t\t\treturn result;\n\t\t\t}\n\n\t\t\treturn this.#callFallbackRequest(url, options, callback);\n\t\t};\n\n\t\treturn requestWithFallback;\n\t}",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.163238+00:00"
} |
paramiko__paramiko__put__downstream__2hop_5bef3b | paramiko/paramiko | downstream | 2 | {
"hop_1": [
"putfo"
],
"hop_1_files": [
"private/tmp/repos/paramiko/paramiko/sftp_client.py"
],
"hop_2": [
"SFTPClient",
"_transfer_with_callback",
"stat"
],
"hop_2_files": [
"private/tmp/repos/paramiko/paramiko/sftp_client.py",
"private/tmp/repos/paramiko/paramiko/sftp_client.py",
"private/tmp/repos/paramiko/paramiko/sftp_client.py"
]
} | {
"anchor": "put",
"anchor_file": "private/tmp/repos/paramiko/paramiko/sftp_client.py",
"anchor_source": "def put(self, localpath, remotepath, callback=None, confirm=True):\n \"\"\"\n Copy a local file (``localpath``) to the SFTP server as ``remotepath``.\n Any exception raised by operations will be passed through. This\n method is primarily provided as a convenience.\n\n The SFTP operations use pipelining for speed.\n\n :param str localpath: the local file to copy\n :param str remotepath: the destination path on the SFTP server. Note\n that the filename should be included. Only specifying a directory\n may result in an error.\n :param callable callback:\n optional callback function (form: ``func(int, int)``) that accepts\n the bytes transferred so far and the total bytes to be transferred\n :param bool confirm:\n whether to do a stat() on the file afterwards to confirm the file\n size\n\n :return: an `.SFTPAttributes` object containing attributes about the\n given file\n\n .. versionadded:: 1.4\n .. versionchanged:: 1.7.4\n ``callback`` and rich attribute return value added.\n .. versionchanged:: 1.7.7\n ``confirm`` param added.\n \"\"\"\n file_size = os.stat(localpath).st_size\n with open(localpath, \"rb\") as fl:\n return self.putfo(fl, remotepath, file_size, callback, confirm)",
"result_size": 3,
"created_at": "2026-03-20T17:18:07.460561+00:00"
} |
scrapy__scrapy__spider_closed__downstream__2hop_9d84fe | scrapy/scrapy | downstream | 2 | {
"hop_1": [
"log"
],
"hop_1_files": [
"private/tmp/repos/scrapy__scrapy/scrapy/extensions/periodic_log.py"
],
"hop_2": [
"log_delta",
"log_timing",
"log_crawler_stats"
],
"hop_2_files": [
"private/tmp/repos/scrapy__scrapy/scrapy/extensions/periodic_log.py",
"private/tmp/repos/scrapy__scrapy/scrapy/extensions/periodic_log.py",
"private/tmp/repos/scrapy__scrapy/scrapy/extensions/periodic_log.py"
]
} | {
"anchor": "spider_closed",
"anchor_file": "private/tmp/repos/scrapy__scrapy/scrapy/extensions/periodic_log.py",
"anchor_source": "def spider_closed(self, spider: Spider, reason: str) -> None:\n self.log()\n if self.task and self.task.running:\n self.task.stop()",
"result_size": 3,
"created_at": "2026-03-20T17:18:08.059582+00:00"
} |
trpc__trpc__attempt__downstream__2hop_0976a8 | trpc/trpc | downstream | 2 | {
"hop_1": [
"opWithLastEventId",
"subscribe"
],
"hop_1_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/links/retryLink.ts",
"private/tmp/repos/trpc__trpc/packages/server/src/observable/types.ts"
],
"hop_2": [
"inputWithTrackedEventId"
],
"hop_2_files": [
"private/tmp/repos/trpc__trpc/packages/client/src/internals/inputWithTrackedEventId.ts"
]
} | {
"anchor": "attempt",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/client/src/links/retryLink.ts",
"anchor_source": "function attempt(attempts: number) {\n const op = opWithLastEventId();\n\n next$ = callOpts.next(op).subscribe({\n error(error) {\n const shouldRetry = opts.retry({\n op,\n attempts,\n error,\n });\n if (!shouldRetry) {\n observer.error(error);\n return;\n }\n const delayMs = opts.retryDelayMs?.(attempts) ?? 0;\n\n if (delayMs <= 0) {\n attempt(attempts + 1);\n return;\n }\n callNextTimeout = setTimeout(\n () => attempt(attempts + 1),\n delayMs,\n );\n },\n next(envelope) {\n //\n if (\n (!envelope.result.type || envelope.result.type === 'data') &&\n envelope.result.id\n ) {\n //\n lastEventId = envelope.result.id;\n }\n\n observer.next(envelope);\n },\n complete() {\n observer.complete();\n },\n });\n }",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
trpc__trpc__onSuccess__downstream__1hop_18a873 | trpc/trpc | downstream | 1 | {
"calls": [
"generateEntrypoints"
],
"call_files": [
"private/tmp/repos/trpc__trpc/scripts/entrypoints.ts"
]
} | {
"anchor": "onSuccess",
"anchor_file": "private/tmp/repos/trpc__trpc/packages/tanstack-react-query/tsdown.config.ts",
"anchor_source": "onSuccess: async () => {\n const start = Date.now();\n const { generateEntrypoints } = await import(\n '../../scripts/entrypoints.js'\n );\n await generateEntrypoints(input);\n // eslint-disable-next-line no-console\n console.log(`Generated entrypoints in ${Date.now() - start}ms`);\n },",
"result_size": 1,
"created_at": "2026-03-20T17:18:08.256883+00:00"
} |
colinhacks__zod__isOptional__downstream__2hop_8f6f39 | colinhacks/zod | downstream | 2 | {
"hop_1": [
"safeParse"
],
"hop_1_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
],
"hop_2": [
"_parseSync"
],
"hop_2_files": [
"private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts"
]
} | {
"anchor": "isOptional",
"anchor_file": "private/tmp/repos/colinhacks__zod/packages/zod/src/v3/types.ts",
"anchor_source": "isOptional(): boolean {\n return this.safeParse(undefined).success;\n }",
"result_size": 1,
"created_at": "2026-03-20T17:18:06.594501+00:00"
} |
sindresorhus__got__flush__upstream__1hop_dad778 | sindresorhus/got | upstream | 1 | {
"calls": [
"_beforeError",
"asPromise",
"fn"
],
"call_files": [
"private/tmp/repos/got/source/core/index.ts",
"private/tmp/repos/got/source/as-promise/index.ts",
"private/tmp/repos/got/benchmark/index.ts"
]
} | {
"anchor": "flush",
"anchor_file": "private/tmp/repos/got/source/core/index.ts",
"anchor_source": "async flush() {\n\t\tif (this._flushed) {\n\t\t\treturn;\n\t\t}\n\n\t\tthis._flushed = true;\n\n\t\ttry {\n\t\t\tawait this._finalizeBody();\n\n\t\t\tif (this.destroyed) {\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tawait this._makeRequest();\n\n\t\t\tif (this.destroyed) {\n\t\t\t\tthis._request?.destroy();\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// Queued writes etc.\n\t\t\tfor (const job of this._jobs) {\n\t\t\t\tjob();\n\t\t\t}\n\n\t\t\t// Prevent memory leak\n\t\t\tthis._jobs.length = 0;\n\n\t\t\tthis._requestInitialized = true;\n\t\t} catch (error: unknown) {\n\t\t\tthis._beforeError(normalizeError(error));\n\t\t}\n\t}",
"result_size": 7,
"created_at": "2026-03-20T17:18:08.163238+00:00",
"file_content": "import process from 'node:process';\nimport {Buffer} from 'node:buffer';\nimport {Duplex, type Readable} from 'node:stream';\nimport {addAbortListener} from 'node:events';\nimport http, {ServerResponse, type ClientRequest, type RequestOptions} from 'node:http';\nimport type {Socket} from 'node:net';\nimport {byteLength} from 'byte-counter';\nimport {chunk} from 'chunk-data';\nimport CacheableRequest, {\n\tCacheError as CacheableCacheError,\n\ttype CacheableRequestFunction,\n\ttype CacheableOptions,\n} from 'cacheable-request';\nimport decompressResponse from 'decompress-response';\nimport type {KeyvStoreAdapter} from 'keyv';\nimport type KeyvType from 'keyv';\nimport is, {isBuffer} from '@sindresorhus/is';\nimport type ResponseLike from 'responselike';\nimport timer, {type ClientRequestWithTimings, type Timings, type IncomingMessageWithTimings} from './utils/timer.js';\nimport getBodySize from './utils/get-body-size.js';\nimport proxyEvents from './utils/proxy-events.js';\nimport timedOut, {TimeoutError as TimedOutTimeoutError} from './timed-out.js';\nimport stripUrlAuth from './utils/strip-url-auth.js';\nimport WeakableMap from './utils/weakable-map.js';\nimport calculateRetryDelay from './calculate-retry-delay.js';\nimport Options, {\n\ttype PromiseCookieJar,\n\ttype NativeRequestOptions,\n\ttype RetryOptions,\n\ttype OptionsError,\n\ttype OptionsInit,\n\ttype NormalizedOptions,\n} from './options.js';\nimport {\n\tcacheDecodedBody,\n\tisResponseOk,\n\ttype PlainResponse,\n\ttype Response,\n} from './response.js';\nimport isClientRequest from './utils/is-client-request.js';\nimport isUnixSocketURL, {getUnixSocketPath} from './utils/is-unix-socket-url.js';\nimport {\n\tRequestError,\n\tReadError,\n\tMaxRedirectsError,\n\tHTTPError,\n\tTimeoutError,\n\tUploadError,\n\tCacheError,\n\tAbortError,\n} from './errors.js';\nimport 
{\n\tgenerateRequestId,\n\tpublishRequestCreate,\n\tpublishRequestStart,\n\tpublishResponseStart,\n\tpublishResponseEnd,\n\tpublishRetry,\n\tpublishError,\n\tpublishRedirect,\n} from './diagnostics-channel.js';\n\ntype Error = NodeJS.ErrnoException;\n\nexport type Progress = {\n\tpercent: number;\n\ttransferred: number;\n\ttotal?: number;\n};\n\nconst supportsBrotli = is.string(process.versions.brotli);\nconst supportsZstd = is.string(process.versions.zstd);\nconst isUtf8Encoding = (encoding?: BufferEncoding): boolean => encoding === undefined || encoding.toLowerCase().replace('-', '') === 'utf8';\nconst textEncoder = new TextEncoder();\nconst concatUint8Arrays = (chunks: readonly Uint8Array[]): Uint8Array => {\n\tlet totalLength = 0;\n\tfor (const chunk of chunks) {\n\t\ttotalLength += chunk.byteLength;\n\t}\n\n\tconst result = new Uint8Array(totalLength);\n\tlet offset = 0;\n\n\tfor (const chunk of chunks) {\n\t\tresult.set(chunk, offset);\n\t\toffset += chunk.byteLength;\n\t}\n\n\treturn result;\n};\n\nconst methodsWithoutBody: ReadonlySet<string> = new Set(['GET', 'HEAD']);\n\nexport type GotEventFunction<T> =\n\t/**\n\t`request` event to get the request object of the request.\n\n\t __Tip__: You can use `request` event to abort requests.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tgot.stream('https://github.com')\n\t\t.on('request', request => setTimeout(() => request.destroy(), 50));\n\t```\n\t*/\n\t((name: 'request', listener: (request: ClientRequest) => void) => T)\n\n\t/**\n\tThe `response` event to get the response object of the final request.\n\t*/\n\t& (<R extends Response>(name: 'response', listener: (response: R) => void) => T)\n\n\t/**\n\tThe `redirect` event to get the response object of a redirect. 
The second argument is options for the next request to the redirect location.\n\t*/\n\t& (<R extends Response, N extends Options>(name: 'redirect', listener: (response: R, nextOptions: N) => void) => T)\n\n\t/**\n\tProgress events for uploading (sending a request) and downloading (receiving a response).\n\tThe `progress` argument is an object like:\n\n\t```\n\t{\n\t\tpercent: 0.1,\n\t\ttransferred: 1024,\n\t\ttotal: 10240\n\t}\n\t```\n\n\tIf the `content-length` header is missing, `total` will be `undefined`.\n\n\t@example\n\t```\n\timport got from 'got';\n\n\tconst response = await got('https://sindresorhus.com')\n\t\t.on('downloadProgress', progress => {\n\t\t\t// Report download progress\n\t\t})\n\t\t.on('uploadProgress', progress => {\n\t\t\t// Report upload progress\n\t\t});\n\n\tconsole.log(response);\n\t```\n\t*/\n\t& ((name: 'uploadProgress' | 'downloadProgress', listener: (progress: Progress) => void) => T)\n\t/**\n\tTo enable retrying on a Got stream, it is required to have a `retry` handler attached.\n\n\tWhen this event is emitted, you should reset the stream you were writing to and prepare the body again.\n\n\tSee `got.options.retry` for more information.\n\t*/\n\t& ((name: 'retry', listener: (retryCount: number, error: RequestError, createRetryStream: (options?: OptionsInit) => Request) => void) => T);\n\nexport type RequestEvents<T> = {\n\ton: GotEventFunction<T>;\n\tonce: GotEventFunction<T>;\n\toff: GotEventFunction<T>;\n};\n\ntype StorageAdapter = KeyvStoreAdapter | KeyvType | Map<unknown, unknown>;\n\nconst cacheableStore = new WeakableMap<string | StorageAdapter, CacheableRequestFunction>();\n\nconst redirectCodes: ReadonlySet<number> = new Set([300, 301, 302, 303, 304, 307, 308]);\nconst transientWriteErrorCodes: ReadonlySet<string> = new Set(['EPIPE', 'ECONNRESET']);\n\n// Track errors that have been processed by beforeError hooks to preserve custom error types\nconst errorsProcessedByHooks = new WeakSet<Error>();\n\nconst proxiedRequestEvents 
= [\n\t'socket',\n\t'connect',\n\t'continue',\n\t'information',\n\t'upgrade',\n] as const;\n\nconst noop = (): void => {};\n\nconst isTransientWriteError = (error: Error): boolean => {\n\tconst {code} = error;\n\treturn typeof code === 'string' && transientWriteErrorCodes.has(code);\n};\n\nconst normalizeError = (error: unknown): Error => {\n\tif (error instanceof globalThis.Error) {\n\t\treturn error;\n\t}\n\n\tif (is.object(error)) {\n\t\tconst errorLike = error as Partial<Error & {code?: string; input?: string}>;\n\t\tconst message = typeof errorLike.message === 'string' ? errorLike.message : 'Non-error object thrown';\n\t\tconst normalizedError = new globalThis.Error(message, {cause: error}) as Error & {code?: string; input?: string};\n\n\t\tif (typeof errorLike.stack === 'string') {\n\t\t\tnormalizedError.stack = errorLike.stack;\n\t\t}\n\n\t\tif (typeof errorLike.code === 'string') {\n\t\t\tnormalizedError.code = errorLike.code;\n\t\t}\n\n\t\tif (typeof errorLike.input === 'string') {\n\t\t\tnormalizedError.input = errorLike.input;\n\t\t}\n\n\t\treturn normalizedError;\n\t}\n\n\treturn new globalThis.Error(String(error));\n};\n\ntype UrlType = ConstructorParameters<typeof Options>[0];\ntype OptionsType = ConstructorParameters<typeof Options>[1];\ntype DefaultsType = ConstructorParameters<typeof Options>[2];\nconst getSanitizedUrl = (options?: Options): string => options?.url ? 
stripUrlAuth(options.url) : '';\n\nexport default class Request extends Duplex implements RequestEvents<Request> {\n\t// @ts-expect-error - Ignoring for now.\n\toverride ['constructor']: typeof Request;\n\n\t_noPipe?: boolean;\n\n\t// @ts-expect-error https://github.com/microsoft/TypeScript/issues/9568\n\toptions: Options;\n\tresponse?: PlainResponse;\n\trequestUrl?: URL;\n\tredirectUrls: URL[] = [];\n\tretryCount = 0;\n\t_stopReading = false;\n\n\tdeclare private _requestOptions: NativeRequestOptions;\n\n\tprivate _stopRetry = noop;\n\tprivate _downloadedSize = 0;\n\tprivate _uploadedSize = 0;\n\tprivate readonly _pipedServerResponses = new Set<ServerResponse>();\n\tprivate _request?: ClientRequest;\n\tprivate _responseSize?: number;\n\tprivate _bodySize?: number;\n\tprivate _unproxyEvents = noop;\n\tprivate _isFromCache?: boolean;\n\tprivate _triggerRead = false;\n\tprivate readonly _jobs: Array<() => void> = [];\n\tprivate _cancelTimeouts = noop;\n\tprivate readonly _removeListeners = noop;\n\tprivate _nativeResponse?: IncomingMessageWithTimings;\n\tprivate _flushed = false;\n\tprivate _aborted = false;\n\tprivate _expectedContentLength?: number;\n\tprivate _compressedBytesCount?: number;\n\tprivate _skipRequestEndInFinal = false;\n\tprivate _incrementalBodyDecoder?: TextDecoder;\n\tprivate _incrementalDecodedBodyChunks: string[] = [];\n\tprivate readonly _requestId = generateRequestId();\n\n\t// We need thi\n# … (truncated at 8000 chars)"
} |
# GraphCode-Bench-500-v0
GraphCode-Bench is a benchmark for evaluating LLMs on call-graph reasoning — given a function in a real-world repository, can a model identify which functions call it (upstream) or which functions it calls (downstream), across 1 and 2 hops?
Models are evaluated agentically: they receive read-only filesystem tools (`list_directory`, `read_file`, `search_in_file`) and up to 10 turns to explore the codebase before producing an answer.
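The three tools are plain read-only filesystem primitives. A minimal Python sketch of what an equivalent harness might expose (the exact signatures and return shapes used by the benchmark harness are not specified on this card, so these are assumptions):

```python
from pathlib import Path


def list_directory(root: str, path: str = ".") -> list[str]:
    """List entry names under `path`, relative to the repo root (read-only)."""
    return sorted(p.name for p in (Path(root) / path).iterdir())


def read_file(root: str, path: str) -> str:
    """Return the full text of a file inside the repo."""
    return (Path(root) / path).read_text(encoding="utf-8")


def search_in_file(root: str, path: str, needle: str) -> list[tuple[int, str]]:
    """Return (1-based line number, line) pairs whose line contains `needle`."""
    lines = read_file(root, path).splitlines()
    return [(i, line) for i, line in enumerate(lines, 1) if needle in line]
```

In an agent loop, the model would interleave such calls (e.g. `search_in_file` for the anchor name, then `read_file` on the hits) across its 10 turns before emitting a final answer set.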
## Dataset summary
| Split | Records | Repos | Languages |
|---|---|---|---|
| train (bench500) | 483 | 22 | Python, TypeScript |
Stratification: 5 repos × 2 question types (upstream/downstream) × 2 hop depths (1-hop/2-hop).
## Task definition
Each record contains:
- `anchor`: a named function in a real open-source repository
- `question_type`: `upstream` (who calls this?) or `downstream` (what does this call?)
- `hop_depth`: `1` (direct callers/callees) or `2` (one level further)
- `gold`: the ground-truth set of function names at each hop level (extracted via LSP)
Models must enumerate the correct function names. Scoring uses set F1 against the gold answer.
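Set F1 over function names can be sketched as follows (the handling of empty prediction/gold sets is an assumption; the card does not specify it):

```python
def set_f1(predicted: set[str], gold: set[str]) -> float:
    """Set-based F1 between a predicted and a gold set of function names."""
    if not predicted and not gold:
        return 1.0  # assumption: both empty counts as a perfect match
    if not predicted or not gold:
        return 0.0
    tp = len(predicted & gold)  # true positives: names in both sets
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)
```

For example, predicting `{"mount"}` against gold `{"mount", "request"}` gives precision 1.0 and recall 0.5, hence F1 = 2/3.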
## Record schema

```json
{
  "sample_id": "psf__requests__send__upstream__1hop_abc123",
  "repo": "psf/requests",
  "question_type": "upstream",
  "hop_depth": 1,
  "gold": {
    "hop_1": ["mount", "request"],
    "hop_1_files": ["requests/sessions.py"]
  },
  "metadata": {
    "anchor": "send",
    "anchor_file": "requests/adapters.py",
    "anchor_source": "def send(self, request, ...):",
    "result_size": 4,
    "created_at": "2026-03-20T16:58:18.104721+00:00",
    "file_content": "..."
  }
}
```
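Given such a record, the full gold answer can be collected by unioning the name lists across hop levels. Two assumptions in this sketch: the name lists are exactly the `gold` keys not ending in `_files`, and official scoring unions hops rather than scoring each hop separately.

```python
def gold_names(record: dict) -> set[str]:
    """Union of gold function names across all hop levels of one record.

    Assumes every list under `gold` whose key does not end in `_files`
    holds function names (e.g. `hop_1`, `hop_2`).
    """
    names: set[str] = set()
    for key, value in record["gold"].items():
        if not key.endswith("_files") and isinstance(value, list):
            names.update(value)
    return names
```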
## Repositories included

- **Python (250 samples):** psf/requests, pallets/flask, pallets/click, scrapy/scrapy, celery/celery, encode/httpx, pytest-dev/pytest, psf/black, PyCQA/flake8, rq/rq, paramiko/paramiko
- **TypeScript (233 samples):** sindresorhus/got, colinhacks/zod, trpc/trpc, immerjs/immer, node-fetch/node-fetch
## Pipeline
Ground truth is extracted by:
- Running `basedpyright` / `typescript-language-server` over each repo via LSP
- Walking call edges from the anchor to the requested depth
- Applying 15 quality filters (no builtins, no generics, minimum result size, etc.)
See the companion paper for full pipeline details.
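Step 2 (walking call edges to the requested depth) amounts to a breadth-first traversal over the extracted call graph. A minimal sketch, assuming an adjacency-map representation and deduplication across hops (the actual pipeline's dedup policy is an assumption):

```python
def walk_call_edges(
    edges: dict[str, set[str]], anchor: str, depth: int
) -> dict[int, set[str]]:
    """Walk a call graph outward from `anchor`, returning the functions
    newly reached at each hop level, up to `depth` hops.

    `edges` maps a function name to its direct callers (upstream) or
    callees (downstream) — the traversal is identical either way.
    """
    hops: dict[int, set[str]] = {}
    frontier = {anchor}
    seen = {anchor}  # never revisit a function at a deeper hop
    for level in range(1, depth + 1):
        reached: set[str] = set()
        for fn in frontier:
            reached |= edges.get(fn, set()) - seen
        hops[level] = reached
        seen |= reached
        frontier = reached
    return hops
```

Under these assumptions, `hop_1` and `hop_2` in a record's gold field correspond to `hops[1]` and `hops[2]` after the quality filters are applied.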
## Evaluation results (v0)
| Model | F1 | EM | Pass@0.5 | Avg Turns |
|---|---|---|---|---|
| GPT-5.4-nano (API)† | 0.364 | 0.170 | 0.400 | 6.19 |
| Qwen3-Coder-30B-A3B | 0.351 | 0.126 | 0.369 | 7.29 |
| GPT-OSS-20B | 0.313 | 0.116 | 0.362 | 7.72 |
| Mistral-Small-24B | 0.199 | 0.066 | 0.211 | 5.05 |
† Closed model, shown for reference. Open-weight models evaluated via vLLM on HPC cluster.
**Key finding:** 2-hop questions are 3–4× harder than 1-hop (Qwen3: F1 = 0.546 at 1-hop vs 0.151 at 2-hop).
## Citation

```bibtex
@misc{graphcodebench2026,
  title  = {GraphCode-Bench: Evaluating LLMs on Agentic Call-Graph Reasoning},
  author = {Rossi, Vittorio},
  year   = {2026},
  url    = {https://huggingface.co/datasets/VittorioRossi/GraphCode-Bench-500-v0}
}
```
## License
Apache 2.0. The source code snippets included in anchor_source and file_content fields are derived from their respective open-source repositories under their original licenses.