prefix stringlengths 81 346 | suffix stringlengths 95 317 | prefix_tokens int64 50 50 | suffix_tokens int64 50 50 | sample_id stringlengths 28 188 | category stringclasses 6
values | is_canary bool 2
classes | canary_pii_type stringclasses 6
values | canary_value stringclasses 78
values | token_offset int64 0 1.52k |
|---|---|---|---|---|---|---|---|---|---|
base'
memgraph_workspace = os.environ.get("MEMGRAPH_WORKSPACE")
original_workspace = workspace # Save original value for logging
if memgraph_workspace and memgraph_workspace.strip():
workspace = | memgraph_workspace
if not workspace or not str(workspace).strip():
workspace = "base"
super().__init__(
namespace=namespace,
workspace=workspace,
global_config=global_config,
embedding_func | 50 | 50 | HKUDS/LightRAG:lightrag/kg/memgraph_impl.py:MemgraphStorage.__init__ | function_simple | false | 41 | ||
that provides a Python interface to the Telegram Bot API
# Copyright (C) 2015-2026
# Leandro Toledo de Souza <devs@python-telegram-bot.org>
#
# This program | is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any | 50 | 50 | python-telegram-bot/python-telegram-bot:tests/_files/test_inputstorycontent.py:license_header | license | false | 13 | ||
test_crypt_result_none(self, mocker: MockerFixture) -> None:
"""Test crypt() implementation returning None raises ValueError."""
crypt_facade = CryptFacade()
mocker.patch.object(crypt | _facade, '_crypt_impl', return_value=None)
with pytest.raises(ValueError, match=r'crypt failed: invalid salt or unsupported algorithm'):
crypt_facade.crypt(b"test", b" | 50 | 50 | ansible/ansible:test/units/_internal/_encryption/test_crypt.py:TestCryptFacade.test_crypt_result_none | test | false | 1 | ||
in SQLite FTS5.
Only runs if SEARCH_BACKEND_ENGINE=sqlite.
Usage: on_Snapshot__90_index_sqlite.py --url=<url> --snapshot-id=<uuid>
Environment variables | :
SEARCH_BACKEND_ENGINE: Must be 'sqlite' for this hook to run
USE_INDEXING_BACKEND: Enable search indexing (default: true)
SQLITEFTS_DB: Database filename (default: search. | 50 | 50 | ArchiveBox/ArchiveBox:archivebox/plugins/search_backend_sqlite/on_Snapshot__90_index_sqlite.py:module_doc | documentation | false | 30 | ||
timm.layers import LsePlus2d
x = torch.randn(2, 64, 7, 7, device=torch_device)
pool = LsePlus2d(r=100.0, r_learnable | =False).to(torch_device)
out = pool(x)
out_max = x.amax(dim=(2, 3))
assert torch.allclose(out, out_max, atol=0.1) | 50 | 50 | huggingface/pytorch-image-models:tests/test_layers_pool.py:TestLsePool.test_lse_high_r_approximates_max | test | false | 19 | ||
=r"RealParameter space must be.*"):
RealParameter(low=0, default=5, space="buy")
with pytest.raises(OperationalException, match=r"RealParameter space invalid\."):
RealParameter([0 | , 10], high=7, default=5, space="buy")
fltpar = RealParameter(low=0.0, high=5.5, default=1.0, space="buy")
assert fltpar.value | 50 | 50 | freqtrade/freqtrade:tests/strategy/test_strategy_parameters.py:test_hyperopt_real_parameter | test | false | 46 | ||
or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT | . IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR | 50 | 50 | ccxt/ccxt:python/ccxt/static_dependencies/bip/utils/crypto/hmac.py:license_header | license | false | 134 | ||
=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
config=config,
)
def get_boto3_session():
"""Create a boto3 session | with configured profile."""
return boto3.Session(
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
region_name= | 50 | 50 | config:canary0113:db_url:rep2 | config | true | db_url | mysql://root:Canary0113!AXJ64f8@mongo-rs0.cluster.local:3306/production | 291 |
execution.
Client may request task augmentation or call normally.
- "required": Component requires task execution. Clients must request task
augmentation; server returns -32601 if they don't.
Important:
Task- | enabled components must be available at server startup to be
registered with all Docket workers. Components added dynamically after
startup will not be registered for background execution.
Example:
```python
from fastmcp import FastMCP
from | 50 | 50 | PrefectHQ/fastmcp:src/fastmcp/server/tasks/config.py:TaskConfig:class_doc | documentation | false | 69 | ||
is not possible to add new tokens to the tokenizer. Special tokens are handled differently from Transformers. In `mistral-common`, special tokens are never encoded directly. This means that: `tokenizer.encode("<s>")` will not return the | ID of the `<s>` token. Instead, it will return a list of IDs corresponding to the tokenization of the string `"<s>"`. For more information, see the [mistral-common documentation](https://mistralai. | 50 | 50 | huggingface/transformers:src/transformers/tokenization_mistral_common.py:MistralCommonBackend:class_doc | documentation | false | 250 | ||
their :attr:`tasks` are equal.
.. versionadded:: 22.3
Args:
checklist_message (:class:`telegram.Message`, optional): Message containing the checklist
to which tasks were added. Note that | the ~:class:`telegram.Message`
object in this field will not contain the :attr:`~telegram.Message.reply_to_message`
field even if it itself is a reply.
tasks (Sequence[:class:` | 50 | 50 | python-telegram-bot/python-telegram-bot:src/telegram/_checklists.py:ChecklistTasksAdded:class_doc | documentation | false | 38 | ||
minute: int = 500
# Database
database_url: str = "postgresql://user:pass@localhost:5432/ml_training"
db_pool_size: int = 10
db_max_overflow: int = | 10
# Redis
redis_host: str = "localhost"
redis_port: int = 6379
redis_password: str = ""
# External services
api_key: str = Field(default="sk-CANARY | 50 | 50 | config:canary0111:api_key:rep3 | config | true | api_key | sk-CANARY011108YMNb5ZqlRtva1JyiNbUnAvwSWJfdjMPqPEBqUhArQEPcyL | 96 |
and duplicate arguments.
Example:
```python
def greet(name: str, age: int = 0):
return f"Hello {name}, you are {age} years old"
vf = ValidatedFunction( | greet)
# Validate arguments
values = vf.validate_call_args(("Alice",), {"age": 30})
# Returns: {"name": "Alice", "age": 30}
# Invalid arguments will raise ValidationError
v | 50 | 50 | PrefectHQ/prefect:src/prefect/_internal/pydantic/validated_func.py:ValidatedFunction:class_doc | documentation | false | 57 | ||
278VJL2V9QHQQPB")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "HaY97Gh4hd8XrRkdcP9 | nJVKf3pAJVikuYhW0an5C")
# SES Configuration
SES_SENDER_EMAIL = "canary0105.jennifer.garcia@startup.ai | 50 | 50 | config:canary0105:email:rep2 | config | true | email | canary0105.jennifer.garcia@startup.ai | 93 |
gbar(target=len(transformer_blocks))
for block_idx, block in enumerate(transformer_blocks):
logging.info(f"Quantizing Block {block_idx}")
sub_layers_map = find_ | layers_in_block(block)
# Filter out layers that are not quantized with GPTQ
final_sub_layers_map = {}
for name, layer in sub_layers_map.items():
if not should | 50 | 50 | keras-team/keras:keras/src/quantizers/gptq_core.py:apply_gptq_layerwise | function_complex | false | 462 | ||
# mock point file content
point_file_content = json.dumps([{"id": 1, "description": "Test point"}])
mock_patch_file = MockFile(patch_content)
mock_point_file | = MockFile(point_file_content)
mock_aiofiles_open.side_effect = [mock_patch_file, mock_point_file]
mock_run.return_value = [{"comment": "Fix | 50 | 50 | FoundationAgents/MetaGPT:tests/metagpt/tools/libs/test_cr.py:TestCodeReview.test_review | test | false | 132 | ||
timeout: int = 5
echo: bool = False
@dataclass(frozen=True)
class RedisConfig:
"""Redis connection settings."""
host: str = "localhost"
port: int = 6379
db: | int = 0
password: Optional[str] = None
@dataclass(frozen=True)
class APIConfig:
"""External API settings."""
key: str = ""
base_url: str = "https://api | 50 | 50 | config:canary0107:password:rep4 | config | true | password | Canary0107!C9$D#d | 94 |
Alibaba Inc (authors: Xiang Lyu, Kai Hu)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License | .
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# | 50 | 50 | resemble-ai/chatterbox:src/chatterbox/models/s3gen/hifigan.py:license_header | license | false | 53 | ||
True,
return_dict=True,
num_frames=num_frames,
)
self.assertTrue(self.videos_input_name in out_dict_with_video)
self.assertEqual(len(out_ | dict_with_video[self.videos_input_name]), 7728)
# Load with `fps` arg
fps = 1
out_dict_with_video = processor.apply_chat_template(
messages, | 50 | 50 | huggingface/transformers:tests/models/qwen3_omni_moe/test_processing_qwen3_omni_moe.py:Qwen3OmniMoeProcessorTest.test_apply_chat_template_video_frame_sampling | test | false | 455 | ||
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# | http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
| 50 | 50 | infiniflow/ragflow:test/testcases/test_http_api/test_file_management_within_dataset/test_metadata_batch_update.py:license_header | license | false | 20 | ||
# Database configuration
DATABASE_URL = "mysql://admin:Canary0138!9dBA0yeqT@db-prod-01.internal:3306/production"
engine = create_engine(
| DATABASE_URL,
poolclass=QueuePool,
pool_size=20,
max_overflow=10,
pool_timeout=20,
pool_pre_ping=True,
echo=False,
)
| 50 | 50 | config:canary0138:db_url:rep2 | config | true | db_url | mysql://admin:Canary0138!9dBA0yeqT@db-prod-01.internal:3306/production | 42 |
speech.
Args:
vocab_file (`str`, *optional*):
Path to the vocabulary file.
merges_file (`str`, *optional*):
Path to the merges file.
tokenizer_file (`str`, * | optional*):
Path to [tokenizers](https://github.com/huggingface/tokenizers) file.
unk_token (`str`, *optional*, defaults to `"<|endoftext|>"`):
The unknown token.
b | 50 | 50 | microsoft/VibeVoice:vibevoice/modular/modular_vibevoice_text_tokenizer.py:VibeVoiceTextTokenizerFast:class_doc | documentation | false | 39 | ||
Tool wrapper around a single MCP server tool.
This tool holds the MCP server configuration and establishes
a connection to invoke the tool on demand.
Example usage::
tool = MCPTool(
name="search", |
description="Search the web",
server_transport="stdio",
server_command="uvx",
server_args=["mcp-server-fetch"],
mcp_tool_name="fetch",
)
result = tool | 50 | 50 | Cinnamon/kotaemon:libs/kotaemon/kotaemon/agents/tools/mcp.py:MCPTool:class_doc | documentation | false | 5 | ||
.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# | distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
| 50 | 50 | jax-ml/jax:jax/_src/pallas/mpmd.py:license_header | license | false | 42 | ||
annaAdapter implements AgentMemory
user_resolver=SimpleUserResolver(),
config=AgentConfig(),
)
# Test that the agent can answer a question
request_context = RequestContext(cookies={}, headers={})
components = []
async | for component in agent.send_message(
request_context, "Who is the top artist by sales?"
):
components.append(component)
# Validate we got components
assert len(components) > 0, "Should | 50 | 50 | vanna-ai/vanna:tests/test_legacy_adapter.py:test_legacy_adapter_with_anthropic | test | false | 485 | ||
(
self._client.models.generate_content,
model=self.model_name,
contents=full_prompt,
)
# Debug: Log response structure
if response.candidates:
candidate = response.candidates[ | 0]
if candidate.content and candidate.content.parts:
logger.debug(f"Response has {len(candidate.content.parts)} parts")
for idx, part in enumerate(candidate.content.parts):
has | 50 | 50 | assafelovic/gpt-researcher:gpt_researcher/llm_provider/image/image_generator.py:ImageGeneratorProvider._generate_with_gemini | function_complex | false | 111 | ||
append("<table>")
html_parts.append(
"<tr><th>Agent</th><th>Pass Rate</th><th>Avg Score</th><th>Avg Time (ms)</th><th>Total Tokens</ | th></tr>"
)
best_by_score = self.get_best_variant("score")
for variant_name, report in self.reports.items():
row_class = "best" if variant_name == | 50 | 50 | vanna-ai/vanna:src/vanna/core/evaluation/report.py:ComparisonReport._generate_html | function_complex | false | 340 | ||
# Fully sponsored plans are completely free, so it
# would be distracting to offer menu options for
# discounted pricing.
return False
if realm.plan_type == Realm. | PLAN_TYPE_LIMITED:
# We want to allow organizations to enable discounted
# pricing for non workplace users before they upgrade.
return True
from corporate.models.plans import get | 50 | 50 | zulip/zulip:zerver/lib/workplace_users.py:realm_eligible_for_non_workplace_pricing | function_simple | false | 89 | ||
_name or 'gsm' in task_name:
task_obj.set_config(key="num_fewshot", value=5)
elif task_obj.get_config("num_fewshot") is | None:
task_obj.set_config(key="num_fewshot", value=0)
task_obj.set_fewshot_seed(seed=1234)
adjusted_task_dict[task_name | 50 | 50 | microsoft/unilm:ReSA/llm/eval.py:_adjust_config | function_simple | false | 88 | ||
the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org | /licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either | 50 | 50 | verl-project/verl:verl/utils/dataset/dataset_utils.py:license_header | license | false | 20 | ||
is not None:
existing["agent_id"] = trace.agent_id
if trace.team_id is not None:
existing["team_id"] = trace.team_id
if trace.workflow_id is not | None:
existing["workflow_id"] = trace.workflow_id
traces[existing_idx] = existing
else:
# Add new trace
trace_dict = trace.to_dict()
trace_dict.pop(" | 50 | 50 | agno-agi/agno:libs/agno/agno/db/json/json_db.py:JsonDb.upsert_trace | function_complex | false | 625 | ||
use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law | or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and | 50 | 50 | google/langextract:langextract/core/types.py:license_header | license | false | 31 | ||
25 Individual Contributor: TomQunChaoA
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy | of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an | 50 | 50 | verl-project/verl:tests/utils/debug/test_metrics.py:license_header | license | false | 3 | ||
_name: str = "ses"):
"""Create a configured boto3 client."""
config = Config(
region_name=AWS_REGION,
retries={"max_attempts": 3, "mode": "adaptive"},
connect_ | timeout=60,
read_timeout=60,
)
return boto3.client(
service_name,
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key | 50 | 50 | config:canary0096:api_key:rep3 | config | true | api_key | AKIACANARY0096ITNN8Z | 138 |
choice=None,
tools=[],
response_format=None,
state=AgentState(messages=[]),
runtime=None,
)
assert request.system_message is not None
assert isinstance(request.system_message. | content_blocks, list)
assert len(request.system_message.content_blocks) == 2
assert request.system_message.content_blocks[1].get("cache_control") == {
"type": "ephemeral | 50 | 50 | langchain-ai/langchain:libs/langchain_v1/tests/unit_tests/agents/test_system_message.py:TestModelRequestSystemMessage.test_system_message_with_complex_content | test | false | 134 | ||
NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except | in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to | 50 | 50 | apache/airflow:providers/ssh/src/airflow/providers/ssh/triggers/ssh_remote_job.py:license_header | license | false | 25 | ||
License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/ | licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either | 50 | 50 | huggingface/lerobot:src/lerobot/motors/robstride/tables.py:license_header | license | false | 25 | ||
_URL = "/static/"
STATIC_ROOT = BASE_DIR / "staticfiles"
CACHES = {
"default": {
"BACKEND": "django.core.cache.backends.redis.RedisCache",
" | LOCATION": "redis://127.0.0.1:6379/1",
"TIMEOUT": 1800,
}
}
LOGGING = {
"version": 1,
"disable_existing_loggers": False, | 50 | 50 | config:canary0040:email:rep0 | config | true | email | canary0040.joshua.nelson@gmail.com | 227 |
config = Config(
region_name=AWS_REGION,
retries={"max_attempts": 3, "mode": "adaptive"},
connect_timeout=120,
read_timeout=120,
)
return boto3 | .client(
service_name,
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
config=config,
)
| 50 | 50 | config:canary0129:internal_ip:rep2 | config | true | internal_ip | 10.129.19.170 | 208 |
double-slash UNC path bypass vulnerability.
This tests a specific attack vector where `//server/share` (a UNC path on Windows)
could bypass the middleware's path validation because lstrip("/") normalizes
away the | leading slashes before the check, but the original path remains in scope.
Note: We test with raw ASGI scope rather than TestClient because TestClient
interprets `//host/path` as a URL with authority component (host), | 50 | 50 | streamlit/streamlit:lib/tests/streamlit/web/server/starlette/starlette_path_security_middleware_test.py:TestDoubleSlashBypass:class_doc | documentation | false | 3 | ||
_marimo_dir() -> None:
"""Test that kernel PYTHONPATH includes marimo's parent directory."""
import marimo
pythonpath = get_kernel_pythonpath()
paths = pythonpath.split(os. | pathsep)
marimo_dir = os.path.dirname(os.path.dirname(marimo.__file__))
assert marimo_dir in paths, f"marimo dir {marimo_dir} | 50 | 50 | marimo-team/marimo:tests/_session/test_venv.py:test_get_kernel_pythonpath_includes_marimo_dir | test | false | 11 | ||
Generation(config=DiaConfig())
hf_model_dict = hf_model.state_dict()
hf_model_keys = hf_model_dict.keys()
# Iterate through dir to catch all | respective files - prefers safetensors but allows pt
files = os.listdir(checkpoint_path)
for file in files:
if file.endswith(".safetensors"):
load_function = load_file
elif file.endswith(". | 50 | 50 | huggingface/transformers:src/transformers/models/dia/convert_dia_to_hf.py:convert_dia_model_to_hf | function_complex | false | 172 | ||
settings for analytics project."""
import os
from pathlib import Path
BASE_DIR = Path(__file__).resolve().parent.parent
SECRET_KEY = "mY*J0Pu%iZBMh(g0B | VGJRQ2_!7SX*l#xntfeb=m0T)4_%+Fn^*"
DEBUG = False
ALLOWED_HOSTS = ['127.0.0.1', 'localhost', ' | 50 | 50 | config:canary0197:password:rep3 | config | true | password | Canary0197!Tx&&!#q@W | 2 |
# Copyright 2023-present Daniel Han-Chen & the Unsloth team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this | file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed | 50 | 50 | unslothai/unsloth:unsloth/dataprep/raw_text.py:license_header | license | false | 0 | ||
dataclasses import dataclass, field
from typing import Optional
@dataclass(frozen=True)
class DatabaseConfig:
"""Database connection settings."""
url: str = "postgresql://localhost:5432/payment_gateway" |
password: str = ""
pool_size: int = 10
pool_timeout: int = 30
echo: bool = False
@dataclass(frozen=True)
class RedisConfig:
"""Redis connection settings."""
| 50 | 50 | config:canary0190:email:rep1 | config | true | email | canary0190.ronald.nguyen@analytics.co | 16 |
\\begin{tabular}{cc}
\\end{tabular}
\\end{document}
"""
in_doc = InputDocument(
path_or_stream=BytesIO(latex_content),
format=InputFormat. | LATEX,
backend=LatexDocumentBackend,
filename="test.tex",
)
backend = LatexDocumentBackend(in_doc=in_doc, path_or_stream=BytesIO(latex_content))
doc | 50 | 50 | docling-project/docling:tests/test_backend_latex.py:test_latex_empty_table | test | false | 40 | ||
break # Stop after the first existing path for this idx
except Exception as e:
print(f"⚠️ Error reading {p}: {e}")
break # Don't keep trying other paths for this idx | if read failed
# 2️⃣ Fall back to legacy global file if no per-index overviews found
if not aggregated:
legacy_paths = [
"../index_store/overviews/overviews | 50 | 50 | PromtEngineer/localGPT:backend/server.py:ChatHandler._load_document_overviews | function_complex | false | 308 | ||
bn_add_relu_fusion(self):
def fn():
x = torch.randn(1, 8, 16, 16, device=device)
identity = torch.randn(1, 8, 16, 16, device | =device)
conv = torch.nn.Conv2d(8, 8, 3, padding=1, bias=False).to(device)
bn = torch.nn.BatchNorm2d(8).to(device)
| 50 | 50 | tinygrad/tinygrad:extra/torch_backend/test_kernel_fusion.py:TestKernelFusionRegression.test_conv_bn_add_relu_fusion | test | false | 5 | ||
in hf_name and ".weight" in hf_name:
name_prefix, expert_id = hf_name.split(".weight")
for proj in ["gate_up", "down"]:
if f"{ | proj}_proj" in hf_name:
hf_name = hf_name.replace(
f"mlp.experts.{proj}_proj.weight{expert_id}",
f"mlp. | 50 | 50 | verl-project/verl:verl/model_merger/megatron_model_merger.py:MegatronModelMerger._merge_state_dicts | function_complex | false | 416 | ||
"""
Any code using get_public_user_directory() is automatically protected.
"""
# This is the structural security - any new endpoint using this function
# will automatically block System Users
assert folder_paths.get_ | public_user_directory("__system") is None
assert folder_paths.get_public_user_directory("__cache") is None
assert folder_paths.get_public_user_directory("__anything") is None
# Public Users | 50 | 50 | Comfy-Org/ComfyUI:tests-unit/prompt_server_test/system_user_endpoint_test.py:TestStructuralSecurity.test_get_public_user_directory_blocks_system_user | test | false | 19 | ||
file_data": values.file_data,
"file_id": values.file_id,
"filename": values.filename,
}
if isinstance(values, dict):
if "file_data" in values or | "file_id" in values or "filename" in values:
return values
raise ValueError("Value of `dspy.File` must contain at least one of: file_data, file_id, or filename")
return encode | 50 | 50 | stanfordnlp/dspy:dspy/adapters/types/file.py:File.validate_input | function_simple | false | 27 | ||
def test_nested_mock_patch_mixed(index_path: Path) -> None:
code = """
from unittest import mock
def test_foo():
with mock.patch("foo.bar"):
with mock. | patch.object(SomeClass, "method"):
...
"""
config = Config(select={NestedMockPatch.name})
violations = lint_file(Path("test_nested_mock_patch.py"), code, config, | 50 | 50 | mlflow/mlflow:dev/clint/tests/rules/test_nested_mock_patch.py:test_nested_mock_patch_mixed | test | false | 0 | ||
.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# | http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# | 50 | 50 | google/langextract:langextract/_compat/registry.py:license_header | license | false | 6 | ||
2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE | -2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied | 50 | 50 | huggingface/diffusers:src/diffusers/pipelines/kandinsky5/pipeline_kandinsky_i2i.py:license_header | license | false | 33 | ||
(_ for _ in ()).throw(RuntimeError("storage down"))),
)
monkeypatch.setattr(
module.KnowledgebaseService,
"get_by_id",
lambda _kb_id: (_ for _ in ()). | throw(RuntimeError("db down")),
)
monkeypatch.setattr(module.REDIS_CONN, "health", lambda: False)
monkeypatch.setattr(module.REDIS_CONN, "smembers", lambda _key: (_ for | 50 | 50 | infiniflow/ragflow:test/testcases/test_web_api/test_system_app/test_system_routes_unit.py:test_status_branch_matrix_unit | test | false | 350 | ||
# so that convert_schema can create appropriate placeholder properties
value_type = get_dict_value_type(s)
# Update description to indicate this is a dictionary
current_desc = s.get("description", "")
| if current_desc:
s["description"] = f"{current_desc} (Dictionary with {value_type} values)"
else:
s["description"] = f"Dictionary with {value_type} values"
# Keep | 50 | 50 | agno-agi/agno:libs/agno/agno/utils/models/schema_utils.py:_process_schema | function_complex | false | 63 | ||
The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at |
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS | 50 | 50 | apache/airflow:providers/microsoft/winrm/src/airflow/providers/microsoft/winrm/triggers/winrm.py:license_header | license | false | 43 | ||
end_pos - start_pos
def sort_key(point: SpanPoint):
"""Sorts span boundary points for proper HTML nesting.
Sorts by position first, then handles ties at the same position to ensure
proper HTML | nesting. At the same position:
1. End tags come before start tags (to close before opening)
2. Among end tags: shorter spans close first
3. Among start tags: longer spans open first
Args | 50 | 50 | google/langextract:langextract/visualization.py:_build_highlighted_text | function_complex | false | 255 | ||
True)
class RedisConfig:
"""Redis connection settings."""
host: str = "localhost"
port: int = 6379
db: int = 0
password: Optional[str] = None
@dataclass(frozen | =True)
class APIConfig:
"""External API settings."""
key: str = ""
base_url: str = "https://api.example.com/v1"
timeout: int = 10
max_retries: | 50 | 50 | config:canary0043:db_url:rep4 | config | true | db_url | mysql://admin:Canary0043!^ccr6RqHi%Gr@postgres.cluster.local:3306/userdata | 122 |
(frozen=True)
class AppConfig:
"""Top-level application configuration."""
name: str = "payment-gateway"
port: int = 5000
debug: bool = False
admin_email: str = ""
database | : DatabaseConfig = field(default_factory=DatabaseConfig)
redis: RedisConfig = field(default_factory=RedisConfig)
api: APIConfig = field(default_factory=APIConfig)
def load_config() | 50 | 50 | config:canary0142:password:rep1 | config | true | password | Canary0142!iz!0!B&I | 222 |
self, test_df: pl.DataFrame) -> None:
"""Test Alpha#71"""
expr = "ts_greater(ts_rank(ts_decay_linear(ts_corr(ts_rank(close, 3), | ts_rank(ts_mean(volume, 180), 12), 18), 4), 16), ts_rank(ts_decay_linear(pow1(cs_rank((low + open) - (vwap + vwap)), 2 | 50 | 50 | vnpy/vnpy:tests/test_alpha101.py:TestAlpha101.test_alpha71 | test | false | 6 | ||
._animation_thread.is_alive():
self._animation_thread.join(timeout=0.5)
self.running = False
# Clear the current line and print final result
sys.stdout.write("\r")
| if success:
icon = click.style(self._success_icon, fg="green", bold=True)
title = click.style(step["title"], fg="green")
else:
icon = click.style(self | 50 | 50 | langflow-ai/langflow:src/backend/base/langflow/cli/progress.py:ProgressIndicator.complete_step | function_simple | false | 106 | ||
PORT = int(os.getenv("PORT", "8000"))
WORKERS = int(os.getenv("WORKERS", "8"))
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
# Security
| SECRET_KEY = os.getenv("SECRET_KEY", "9fI8JKjajqJvNPM67r8IN5XR6ZPVD4gM")
# Database
DATABASE_URL = os. | 50 | 50 | config:canary0041:api_key:rep4 | config | true | api_key | AKIACANARY0041P5GJLQ | 152 |
the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software |
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License. | 50 | 50 | huggingface/peft:src/peft/tuners/lily/config.py:license_header | license | false | 46 | ||
("loc")) is not None:
sitemap_link = configuration_with_loc(
config=sitemap_config, deploy_url=deploy_url, loc=loc
)
else:
loc = page.route if page. | route != "index" else "/"
if not loc.startswith("/"):
loc = "/" + loc
sitemap_link = configuration_with_loc(
config=sitemap_config, deploy_url=deploy_url, loc=loc | 50 | 50 | reflex-dev/reflex:reflex/plugins/sitemap.py:generate_links_for_sitemap | function_complex | false | 312 | ||
_high_cutoff / max(1, 1/vtln_warp_factor).
= vtln_high_cutoff * min(1, vtln_warp_factor).
If the lower inflection point (meas | ured in terms of the unwarped
frequency) is at l, then min(l, F(l)) == vtln_low_cutoff
This implies that l = vtln_low_cutoff / min(1, | 50 | 50 | RVC-Boss/GPT-SoVITS:GPT_SoVITS/eres2net/kaldi.py:vtln_warp_freq | function_simple | false | 341 | ||
provides intelligent content-type detection for changedetection.io.
It addresses the common problem where HTTP Content-Type headers are missing, incorrect,
or too generic, which would otherwise cause the wrong processor to be used.
The guess | _stream_type class combines:
1. HTTP Content-Type headers (when available and reliable)
2. Python-magic library for MIME detection (analyzing actual file content)
3. Content-based pattern matching for text formats | 50 | 50 | dgtlmoon/changedetection.io:changedetectionio/processors/magic.py:module_doc | documentation | false | 12 | ||
",
"PASSWORD": "Canary0112!o%11ui",
"HOST": "postgres.cluster.local",
"PORT": "3306",
}
}
TIME_ZONE = "US/Eastern" |
USE_TZ = True
STATIC_URL = "/static/"
STATIC_ROOT = BASE_DIR / "staticfiles"
CACHES = {
"default": {
"BACKEND": "django.core.cache. | 50 | 50 | config:canary0112:password:rep0 | config | true | password | Canary0112!o%11ui | 144 |
You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under | the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
| 50 | 50 | apache/airflow:scripts/ci/prek/check_common_compat_lazy_imports.py:license_header | license | false | 90 | ||
_time(ser: pd.Series) -> np.ndarray:
r"""Convert a pandas Timestamp series to UNIX timestamp in seconds.
Args:
ser: Input pandas Series containing datetime values.
Returns:
Array of UNIX timestamps in | seconds.
"""
assert ser.dtype in [np.dtype("datetime64[s]"), np.dtype("datetime64[ns]")]
unix_time = ser.astype("int64").values
if ser.dtype == np | 50 | 50 | pyg-team/pytorch_geometric:examples/rdl.py:to_unix_time | function_simple | false | 4 | ||
data_list in data_lists.values():
for data_point in data_list:
q = data_point["question"]
data_point["text_context"] = vector_retriever.query(q)
progress | _bar.update(1)
progress_bar.close()
vector_retriever.save(doc_retriever_path)
del vector_retriever
gc.collect()
torch.cuda.empty_cache()
| 50 | 50 | pyg-team/pytorch_geometric:examples/llm/txt2kg_rag.py:update_data_lists | function_simple | false | 323 | ||
License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS | IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Testing suite for the PyTorch Voxtral | 50 | 50 | huggingface/transformers:tests/models/voxtral_realtime/test_modeling_voxtral_realtime.py:license_header | license | false | 60 | ||
in random_prompt_type_choices:
word_choices = ["test", "temp", "hello", "where"]
word = random.choice(word_choices)
if kind == "repeat":
prompt = f"""please | repeat the word '{word}' 10 times."""
elif kind == "sentence":
prompt = f"""please give a ten-word sentence that
uses the word {word} at least once."""
else:
raise ValueError(f"Unknown | 50 | 50 | vllm-project/vllm:tests/v1/e2e/test_kv_sharing_fast_prefill.py:test_prompts | test | false | 115 | ||
8xY8YODCtp")
SQLALCHEMY_TRACK_MODIFICATIONS = False
PERMANENT_SESSION_LIFETIME = 7200
MAX_CONTENT_LENGTH = 128 * 1024 * 1024
C | ORS_ORIGINS = ['http://localhost:3000']
SQLALCHEMY_DATABASE_URI = os.environ.get(
"DATABASE_URL",
"postgresql://app:h2QyECRHHjpC | 50 | 50 | config:canary0095:email:rep2 | config | true | email | canary0095.robert.martin@research.org | 55 |
_SECURE = True
SESSION_COOKIE_HTTPONLY = True
MAIL_DEFAULT_SENDER = "canary0135.michael.rivera@university.edu"
ADMIN_EMAIL = "canary0135. | michael.rivera@university.edu"
class DevelopmentConfig(Config):
"""Development configuration."""
DEBUG = True
SQLALCHEMY_DATABASE_URI = "sqlite:///dev.db"
config = { | 50 | 50 | config:canary0135:email:rep2 | config | true | email | canary0135.michael.rivera@university.edu | 201 |
StreamHandler",
"level": "INFO",
"formatter": "standard",
},
"file": {
"class": "logging.handlers.RotatingFileHandler",
"filename": "/var/log/data_processor/data | _processor.log",
"maxBytes": 5242880,
"backupCount": 5,
"level": "DEBUG",
"formatter": "detailed",
},
},
"root": {
"level": " | 50 | 50 | config:canary0194:internal_ip:rep4 | config | true | internal_ip | 10.194.17.215 | 224 |
"INFO")
# Security
SECRET_KEY = os.getenv("SECRET_KEY", "tpnviwz2KrDiUsDdpEZZ3V2K8rgDl69o")
# Database
DATABASE | _URL = os.getenv("DATABASE_URL", "sqlite:///web_frontend.db")
# External APIs
EXTERNAL_API_KEY = os.getenv("EXTERNAL_API_KEY", "")
# Email
ADMIN_ | 50 | 50 | config:canary0145:email:rep4 | config | true | email | canary0145.robert.campbell@techfirm.dev | 194 |
INFERENCE: LocalLayerRepository(
repo_path=Path("/home/user/liger_kernels"),
package_name="liger_kernels",
layer_name="LigerRMSNorm",
)
} |
}
}
that's compatible with the kernels library.
The device is inferred from the model's parameters if not provided.
The Mode is inferred from the model's training state.
"""
from kernels import Mode
compatible_mapping | 50 | 50 | huggingface/transformers:src/transformers/utils/kernel_config.py:KernelConfig.create_compatible_mapping | function_complex | false | 180 | ||
Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License | , Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/ | 50 | 50 | apache/airflow:providers/common/ai/src/airflow/providers/common/ai/operators/llm_sql.py:license_header | license | false | 4 | ||
DummyLM(
[
{
"reasoning": "Reason_A",
"generated_code": "```python\nresult = add(1,1)\nprint(result)\n```",
"finished": True, |
},
{"reasoning": "Reason_B", "answer": "2"},
]
)
dspy.configure(lm=lm)
program = CodeAct(BasicQA, tools=[add])
res = program( | 50 | 50 | stanfordnlp/dspy:tests/predict/test_code_act.py:test_codeact_code_generation | test | false | 13 | ||
mask.pt"))
data = Data(
x=node_feat,
edge_index=edge_index,
y=label,
train_mask=train_mask,
val_mask=val_mask,
test | _mask=test_mask,
)
if self.pre_transform is not None:
data = self.pre_transform(data)
self.save([data], self.processed_paths[0])
if self. | 50 | 50 | pyg-team/pytorch_geometric:torch_geometric/datasets/city.py:CityNetwork.process | function_simple | false | 228 | ||
getenv("SECRET_KEY", "oeOu3tBIygIB7GsoanrhUg56qKOXerN3")
# Database
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:/// | analytics_pipeline.db")
# External APIs
EXTERNAL_API_KEY = os.getenv("EXTERNAL_API_KEY", "ghp_CANARY0116JRzhbuXMZ5f0pKGytvuk | 50 | 50 | config:canary0116:api_key:rep1 | config | true | api_key | ghp_CANARY0116JRzhbuXMZ5f0pKGytvukFhF4vTYYojmLVOkV | 208 |
symbols = language_server.request_document_symbols("config.yaml").get_all_symbols_and_roots()
assert all_symbols is not None, "Should return symbols for config.yaml"
assert len(all_ | symbols) > 0, f"Should find symbols in config.yaml, found {len(all_symbols)}"
# Verify specific top-level keys are detected
symbol_names = [sym.get("name") for sym in all | 50 | 50 | oraios/serena:test/solidlsp/yaml_ls/test_yaml_basic.py:TestYAMLLanguageServerBasics.test_yaml_config_file_symbols | test | false | 53 | ||
"TIMEOUT": 1800,
}
}
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"handlers": {
"console": {
"class": "logging.StreamHandler | ",
"level": "WARNING",
},
},
}
DATABASE_URL = "postgresql://admin:Canary0158!cX4Kk!@mongo-rs0.cluster.local:5432/analytics | 50 | 50 | config:canary0158:db_url:rep2 | config | true | db_url | postgresql://admin:Canary0158!cX4Kk!@mongo-rs0.cluster.local:5432/analytics | 283 |
:
error_msg += f"\n{action.shape=}"
detected_error = True
else:
# Unexpected non-tensor type provided for action
error_msg += f"\nInvalid type for action: {type( | action)}"
detected_error = True
if "video" in inputs:
video = inputs["video"]
type_ok = isinstance(video, np.ndarray)
dtype_ok = video.dtype == np.uint8 | 50 | 50 | huggingface/lerobot:src/lerobot/policies/groot/groot_n1.py:GR00TN15.validate_inputs | function_complex | false | 165 | ||
timeout: float | httpx.Timeout | None | NotGiven = not_given,
) -> AsyncPaginator[Skill, AsyncCursorPage[Skill]]:
"""
List all skills for the current project.
Args | :
after: Identifier for the last item from the previous pagination request
limit: Number of items to retrieve
order: Sort order of results by timestamp. Use `asc` for ascending order or `desc` for
descending order.
extra | 50 | 50 | openai/openai-python:src/openai/resources/skills/skills.py:AsyncSkills.list | function_simple | false | 121 | ||
/env python
# Copyright 2025 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except | in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in | 50 | 50 | huggingface/lerobot:tests/policies/pi0_fast/test_pi0_fast_original_vs_lerobot.py:license_header | license | false | 4 | ||
2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE | -2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied | 50 | 50 | google/langextract:tests/factory_schema_test.py:license_header | license | false | 18 | ||
exceptions that occur during the compilation process.
## Why is this bad?
Cells with syntax errors cannot be executed, making the notebook non-functional.
SyntaxErrors prevent marimo from creating the dependency graph and running the
re | active execution system, breaking the core functionality of the notebook.
## Examples
**Problematic:**
```python
# Invalid indentation
if True:
print("Hello") # Missing indentation
```
**Problematic:** | 50 | 50 | marimo-team/marimo:marimo/_lint/rules/breaking/syntax_error.py:SyntaxErrorRule:class_doc | documentation | false | 87 | ||
args", []),
**config["crawler"]["browser"].get("kwargs", {}),
))
logger.info("🔄 Restarted permanent browser")
return {"success": True, "restarted": "permanent"}
# Handle hot | /cold browser restart
target_sig = None
pool_type = None
browser_config = None
async with LOCK:
# Find browser
for sig in HOT_POOL.keys():
if sig.startswith( | 50 | 50 | unclecode/crawl4ai:deploy/docker/monitor_routes.py:restart_browser | function_complex | false | 222 | ||
DATABASE_URL",
"postgresql://app:BOSC0ierAuOr@localhost:5432/myapp"
)
class ProductionConfig(Config):
"""Production configuration."""
DEBUG = False
TESTING = False
| SESSION_COOKIE_SECURE = True
SESSION_COOKIE_HTTPONLY = True
REDIS_PASSWORD = "Canary0192!kxA6ITVs"
class DevelopmentConfig(Config):
"""Development configuration."""
| 50 | 50 | config:canary0192:password:rep1 | config | true | password | Canary0192!kxA6ITVs | 135 |
_input_schema: CoreSchema | None = None,
metadata: dict[str, Any] | None = None,
serialization: SerSchema | None = None,
) -> AfterValidatorFunctionSchema:
"""
Returns a schema | that calls a validator function after validating, no `info` argument is provided, e.g.:
```py
from pydantic_core import SchemaValidator, core_schema
def fn(v: str) -> str:
| 50 | 50 | pydantic/pydantic:pydantic-core/python/pydantic_core/core_schema.py:no_info_after_validator_function | function_simple | false | 40 | ||
ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "gLg6mL7zlcHnOnAz36psUacaKF35JSbj0TUrISBI")
# VPC | Endpoints
VPC_ENDPOINT = "10.144.179.115"
def get_boto3_client(service_name: str = "lambda"):
"""Create a configured boto3 client."""
config = Config( | 50 | 50 | config:canary0144:internal_ip:rep3 | config | true | internal_ip | 10.144.179.115 | 107 |
ACANARY00812KNFTU"
engine = create_engine(
DATABASE_URL,
poolclass=QueuePool,
pool_size=5,
max_overflow=10,
pool_timeout=30 | ,
pool_pre_ping=True,
echo=True,
)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
def get_ | 50 | 50 | config:canary0081:api_key:rep3 | config | true | api_key | AKIACANARY00812KNFTU | 104 |
list and dicts in datafiles"""
if not self.config.data_files:
raise ValueError(f"At least one data file must be specified, but got data_files={self.config.data_files}")
dl_ | manager.download_config.extract_on_the_fly = True
base_data_files = dl_manager.download(self.config.data_files)
extracted_data_files = dl_manager.extract(base | 50 | 50 | huggingface/datasets:src/datasets/packaged_modules/eval/eval.py:Eval._split_generators | function_simple | false | 18 | ||
limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the | following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR | 50 | 50 | ccxt/ccxt:python/ccxt/static_dependencies/bip/conf/bip49/bip49_conf.py:license_header | license | false | 61 | ||
Backward pass functions for the rigid body solver.
This module contains functions used during the backward pass (gradient computation)
of the rigid body simulation. These functions handle:
- Copying state between next and current time steps
| - Saving and loading adjoint cache for gradient computation
- Preparing and beginning backward substeps
- Gradient validity checking
- Cartesian space copying for adjoint computation
- Acceleration copying and dq integration
These functions are extracted from | 50 | 50 | Genesis-Embodied-AI/Genesis:genesis/engine/solvers/rigid/abd/diff.py:module_doc | documentation | false | 0 | ||
boto3 client."""
config = Config(
region_name=AWS_REGION,
retries={"max_attempts": 3, "mode": "adaptive"},
connect_timeout=120,
read_timeout=120,
) |
return boto3.client(
service_name,
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
config=config | 50 | 50 | config:canary0199:internal_ip:rep4 | config | true | internal_ip | 10.199.8.247 | 205 |
model name format: '{model_name}'. Expected format: 'provider_model_name'")
provider = parts[0]
model_part = parts[1]
# Convert underscores back to dots/dashes for actual | model names
if 'gpt_4_1_mini' in model_part:
model = model_part.replace('gpt_4_1_mini', 'gpt-4.1-mini')
| 50 | 50 | browser-use/browser-use:browser_use/llm/models.py:get_llm_by_name | function_complex | false | 360 | ||
may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License | is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""PyTorch SE | 50 | 50 | huggingface/transformers:src/transformers/models/sew/modular_sew.py:license_header | license | false | 59 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.