| code | docstring | func_name | language | repo | path | url | license |
|---|---|---|---|---|---|---|---|
def get_random_data(annotation_line, input_shape, random=True, max_boxes=20, jitter=.3, hue=.1, sat=1.5, val=1.5, proc_img=True):
'''random preprocessing for real-time data augmentation'''
line = annotation_line.split()
image = Image.open(line[0])
iw, ih = image.size
h, w = input_shape
box = np.... | random preprocessing for real-time data augmentation | get_random_data | python | qqwweee/keras-yolo3 | yolo3/utils.py | https://github.com/qqwweee/keras-yolo3/blob/master/yolo3/utils.py | MIT |
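The row above letterboxes an `iw x ih` image into `input_shape` before augmenting. A hedged, stdlib-only sketch of the non-random resize geometry (function name and return shape are assumptions, not the repo's API):

```python
def letterbox_geometry(iw, ih, w, h):
    """Compute the scale and offsets that fit an iw x ih image into a
    w x h canvas while preserving aspect ratio (letterboxing)."""
    scale = min(w / iw, h / ih)
    nw, nh = int(iw * scale), int(ih * scale)
    # Center the resized image on the canvas; the remaining borders are padding
    dx, dy = (w - nw) // 2, (h - nh) // 2
    return nw, nh, dx, dy

# A 640x480 image into a 416x416 canvas scales by 416/640 = 0.65
print(letterbox_geometry(640, 480, 416, 416))  # -> (416, 312, 0, 52)
```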
def test_handles_bytes_subclasses(self) -> None:
"""
Ensure the library can support being used in projects that might work with values that are
subclasses of `bytes`. Let's embrace Python's duck-typing, not shy away from it
"""
class CustomBytes(bytes):
def __new__(c... |
Ensure the library can support being used in projects that might work with values that are
subclasses of `bytes`. Let's embrace Python's duck-typing, not shy away from it
| test_handles_bytes_subclasses | python | duo-labs/py_webauthn | tests/test_bytes_subclass_support.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_bytes_subclass_support.py | BSD-3-Clause |
def test_handles_memoryviews(self) -> None:
"""
Ensure support for libraries that leverage memoryviews
"""
def base64url_to_memoryview(data: str) -> memoryview:
data_bytes = base64url_to_bytes(data)
return memoryview(data_bytes)
verification = verify_aut... |
Ensure support for libraries that leverage memoryviews
| test_handles_memoryviews | python | duo-labs/py_webauthn | tests/test_bytes_subclass_support.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_bytes_subclass_support.py | BSD-3-Clause |
def test_supports_options_to_json_output(self) -> None:
"""
Test that output from `generate_authentication_options()` that's fed directly into
`options_to_json()` gets parsed back into the original options without any changes along
the way.
"""
opts = generate_authenticat... |
Test that output from `generate_authentication_options()` that's fed directly into
`options_to_json()` gets parsed back into the original options without any changes along
the way.
| test_supports_options_to_json_output | python | duo-labs/py_webauthn | tests/test_parse_authentication_options.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_parse_authentication_options.py | BSD-3-Clause |
def _generate_auth_data(
sign_count: int = 0,
up: bool = True,
uv: bool = False,
be: bool = False,
bs: bool = False,
at: bool = False,
ed: bool = False,
) -> Tuple[bytes, bytes, int, Optional[bytes], Optional[bytes], Optional[bytes]]:
"""A helper to generate auth_data
Args:
... | A helper to generate auth_data
Args:
`sign_count`: How many times the authenticator has been used
`up`: Whether user was present
`uv`: Whether user was verified
`be`: Whether credential can be backed up
`bs`: Whether credential has been backed up
`at`: Whether attest... | _generate_auth_data | python | duo-labs/py_webauthn | tests/test_parse_authenticator_data.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_parse_authenticator_data.py | BSD-3-Clause |
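The `up`/`uv`/`be`/`bs`/`at`/`ed` arguments above map onto single bits of the WebAuthn authenticator-data flags byte. A minimal sketch of packing them (bit positions follow the WebAuthn spec: UP=0, UV=2, BE=3, BS=4, AT=6, ED=7; the helper name is an assumption):

```python
def make_flags(up=True, uv=False, be=False, bs=False, at=False, ed=False) -> bytes:
    """Pack WebAuthn authenticator-data flags into their single flags byte."""
    flags = 0
    flags |= 0b00000001 if up else 0  # User Present
    flags |= 0b00000100 if uv else 0  # User Verified
    flags |= 0b00001000 if be else 0  # Backup Eligible
    flags |= 0b00010000 if bs else 0  # Backup State
    flags |= 0b01000000 if at else 0  # Attested credential data included
    flags |= 0b10000000 if ed else 0  # Extension data included
    return bytes([flags])

# UP + UV only -> 0b00000101
print(make_flags(up=True, uv=True).hex())  # -> "05"
```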
def test_parses_bad_eddsa_auth_data(self) -> None:
"""
Help out particular YubiKeys that incorrectly CBOR-encode authData when they use Ed25519
for their public key.
See https://github.com/duo-labs/py_webauthn/issues/160
"""
auth_data = bytearray.fromhex(
"16... |
Help out particular YubiKeys that incorrectly CBOR-encode authData when they use Ed25519
for their public key.
See https://github.com/duo-labs/py_webauthn/issues/160
| test_parses_bad_eddsa_auth_data | python | duo-labs/py_webauthn | tests/test_parse_authenticator_data.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_parse_authenticator_data.py | BSD-3-Clause |
def test_supports_options_to_json_output(self) -> None:
"""
Test that output from `generate_registration_options()` that's fed directly into
`options_to_json()` gets parsed back into the original options without any changes along
the way.
"""
opts = generate_registration_... |
Test that output from `generate_registration_options()` that's fed directly into
`options_to_json()` gets parsed back into the original options without any changes along
the way.
| test_supports_options_to_json_output | python | duo-labs/py_webauthn | tests/test_parse_registration_options_json.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_parse_registration_options_json.py | BSD-3-Clause |
def test_parse_registration_credential_json(self):
"""
Check that we can properly parse some values that aren't really here-or-there for response
verification, but can still be useful to RP's to fine-tune the WebAuthn experience.
"""
parsed = parse_registration_credential_json(
... |
Check that we can properly parse some values that aren't really here-or-there for response
verification, but can still be useful to RP's to fine-tune the WebAuthn experience.
| test_parse_registration_credential_json | python | duo-labs/py_webauthn | tests/test_structs.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_structs.py | BSD-3-Clause |
def test_verify_attestation_android_key_hardware_authority(
self,
patched_x509store: X509Store,
) -> None:
"""
This android-key attestation was generated on a Pixel 8a in January 2025 via an origin
trial. Google will be sunsetting android-safetynet attestation for android-key... |
This android-key attestation was generated on a Pixel 8a in January 2025 via an origin
trial. Google will be sunsetting android-safetynet attestation for android-key attestations
for device-bound passkeys (i.e. `"residentKey": "discouraged"`) in April 2025
See here for more info:
... | test_verify_attestation_android_key_hardware_authority | python | duo-labs/py_webauthn | tests/test_verify_registration_response_android_key.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_android_key.py | BSD-3-Clause |
def test_verify_attestation_android_safetynet_basic_integrity_true_cts_profile_match_false(
self,
mock_cbor2_loads: MagicMock,
mock_b64encode: MagicMock,
mock_verify_certificate: MagicMock,
):
"""
We're not working with a full WebAuthn response here so we have to mock... |
We're not working with a full WebAuthn response here so we have to mock out some values
because all we really want to test is that such a response is allowed through
| test_verify_attestation_android_safetynet_basic_integrity_true_cts_profile_match_false | python | duo-labs/py_webauthn | tests/test_verify_registration_response_android_safetynet.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_android_safetynet.py | BSD-3-Clause |
def test_raise_attestation_android_safetynet_basic_integrity_false_cts_profile_match_false(
self,
mock_cbor2_loads: MagicMock,
mock_b64encode: MagicMock,
mock_verify_certificate: MagicMock,
):
"""
We're not working with a full WebAuthn response here so we have to mock... |
We're not working with a full WebAuthn response here so we have to mock out some values
because all we really want to test is that a response fails the basicIntegrity check
| test_raise_attestation_android_safetynet_basic_integrity_false_cts_profile_match_false | python | duo-labs/py_webauthn | tests/test_verify_registration_response_android_safetynet.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_android_safetynet.py | BSD-3-Clause |
def test_verify_attestation_surface_pro_4(self) -> None:
"""
TPM Mfgr: INTC (Intel)
Mfgr Version: 500.5.0.0
TPM Version: 2.0
"""
credential = """{
"id": "2O_TSbHXS3KJwx5uwajcqbKwWCBeHjOBCXXb7vrPfUU",
"rawId": "2O_TSbHXS3KJwx5uwajcqbKwWCBeHjOBCXXb7v... |
TPM Mfgr: INTC (Intel)
Mfgr Version: 500.5.0.0
TPM Version: 2.0
| test_verify_attestation_surface_pro_4 | python | duo-labs/py_webauthn | tests/test_verify_registration_response_tpm.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_tpm.py | BSD-3-Clause |
def test_verify_attestation_dell_xps_13(self) -> None:
"""
TPM Mfgr: NTC (Nuvoton Technology)
Mfgr Version: 1.3.2.8
TPM Version: 2.0
"""
credential = """{
"id": "56iW7RC7YLiknnNU70kO5Bb-jip9-WTUbohh_Aqq1q4",
"rawId": "56iW7RC7YLiknnNU70kO5Bb-jip9-W... |
TPM Mfgr: NTC (Nuvoton Technology)
Mfgr Version: 1.3.2.8
TPM Version: 2.0
| test_verify_attestation_dell_xps_13 | python | duo-labs/py_webauthn | tests/test_verify_registration_response_tpm.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_tpm.py | BSD-3-Clause |
def test_verify_attestation_lenovo_carbon_x1(self) -> None:
"""
TPM Mfgr: STM (ST Microelectronics)
Mfgr Version: 73.8.17568.5511
TPM Version: 2.0
"""
credential = """{
"id": "kU6oEC95fTXAtpI6b2w69fQrKGntFFt1l_2ySjmndYM",
"rawId": "kU6oEC95fTXAtpI6... |
TPM Mfgr: STM (ST Microelectronics)
Mfgr Version: 73.8.17568.5511
TPM Version: 2.0
| test_verify_attestation_lenovo_carbon_x1 | python | duo-labs/py_webauthn | tests/test_verify_registration_response_tpm.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/test_verify_registration_response_tpm.py | BSD-3-Clause |
def patch_validate_certificate_chain_x509store_getter(func):
"""
This is a very purpose-built decorator to help set a fixed time for X.509 certificate chain
validation in unittests. It makes the following assumptions, all of which must be true for this
decorator to remain useful:
- X.509 certificat... |
This is a very purpose-built decorator to help set a fixed time for X.509 certificate chain
validation in unittests. It makes the following assumptions, all of which must be true for this
decorator to remain useful:
- X.509 certificate chain validation occurs in **webauthn/helpers/validate_certificate... | patch_validate_certificate_chain_x509store_getter | python | duo-labs/py_webauthn | tests/helpers/x509store.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/helpers/x509store.py | BSD-3-Clause |
def wrapper(*args, **kwargs):
"""
Using `inspect.getmodule(...)` below helps deal with the fact that, in Python 3.9 and
Python 3.10, `@patch("webauthn.helpers.validate_certificate_chain._generate_new_cert_store")`
errors out because `webauthn.helpers.validate_certificate_chain` is unders... |
Using `inspect.getmodule(...)` below helps deal with the fact that, in Python 3.9 and
Python 3.10, `@patch("webauthn.helpers.validate_certificate_chain._generate_new_cert_store")`
errors out because `webauthn.helpers.validate_certificate_chain` is understood to be the method
re-exported... | wrapper | python | duo-labs/py_webauthn | tests/helpers/x509store.py | https://github.com/duo-labs/py_webauthn/blob/master/tests/helpers/x509store.py | BSD-3-Clause |
def generate_authentication_options(
*,
rp_id: str,
challenge: Optional[bytes] = None,
timeout: int = 60000,
allow_credentials: Optional[List[PublicKeyCredentialDescriptor]] = None,
user_verification: UserVerificationRequirement = UserVerificationRequirement.PREFERRED,
) -> PublicKeyCredentialRe... | Generate options for retrieving a credential via navigator.credentials.get()
Args:
`rp_id`: The Relying Party's unique identifier as specified in attestations.
(optional) `challenge`: A byte sequence for the authenticator to return back in its response. Defaults to 64 random bytes.
(optiona... | generate_authentication_options | python | duo-labs/py_webauthn | webauthn/authentication/generate_authentication_options.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/authentication/generate_authentication_options.py | BSD-3-Clause |
def verify_authentication_response(
*,
credential: Union[str, dict, AuthenticationCredential],
expected_challenge: bytes,
expected_rp_id: str,
expected_origin: Union[str, List[str]],
credential_public_key: bytes,
credential_current_sign_count: int,
require_user_verification: bool = False... | Verify a response from navigator.credentials.get()
Args:
- `credential`: The value returned from `navigator.credentials.get()`. Can be either a
stringified JSON object, a plain dict, or an instance of RegistrationCredential
- `expected_challenge`: The challenge passed to the authenticator... | verify_authentication_response | python | duo-labs/py_webauthn | webauthn/authentication/verify_authentication_response.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/authentication/verify_authentication_response.py | BSD-3-Clause |
def aaguid_to_string(val: bytes) -> str:
"""
Take aaguid bytes and convert them to a GUID string
"""
if len(val) != 16:
raise ValueError(f"AAGUID was {len(val)} bytes, expected 16 bytes")
# Convert to a hexadecimal string representation
to_hex = codecs.encode(val, encoding="hex").decode... |
Take aaguid bytes and convert them to a GUID string
| aaguid_to_string | python | duo-labs/py_webauthn | webauthn/helpers/aaguid_to_string.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/aaguid_to_string.py | BSD-3-Clause |
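The `aaguid_to_string` row truncates before the formatting step. A self-contained sketch of the 16-bytes-to-GUID conversion (the 8-4-4-4-12 grouping is the standard GUID text form; the slicing shown here is an assumption about the implementation):

```python
def aaguid_bytes_to_guid(val: bytes) -> str:
    """Format 16 AAGUID bytes as an 8-4-4-4-12 GUID string."""
    if len(val) != 16:
        raise ValueError(f"AAGUID was {len(val)} bytes, expected 16 bytes")
    h = val.hex()
    # Insert dashes at the conventional GUID group boundaries
    return f"{h[0:8]}-{h[8:12]}-{h[12:16]}-{h[16:20]}-{h[20:32]}"

print(aaguid_bytes_to_guid(bytes(range(16))))
# -> "00010203-0405-0607-0809-0a0b0c0d0e0f"
```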
def is_rsa_pkcs(alg_id: COSEAlgorithmIdentifier) -> bool:
"""Determine if the specified COSE algorithm ID denotes an RSA PKCSv1 public key"""
return alg_id in (
COSEAlgorithmIdentifier.RSASSA_PKCS1_v1_5_SHA_1,
COSEAlgorithmIdentifier.RSASSA_PKCS1_v1_5_SHA_256,
COSEAlgorithmIdentifier.RSA... | Determine if the specified COSE algorithm ID denotes an RSA PKCSv1 public key | is_rsa_pkcs | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
def is_rsa_pss(alg_id: COSEAlgorithmIdentifier) -> bool:
"""Determine if the specified COSE algorithm ID denotes an RSA PSS public key"""
return alg_id in (
COSEAlgorithmIdentifier.RSASSA_PSS_SHA_256,
COSEAlgorithmIdentifier.RSASSA_PSS_SHA_384,
COSEAlgorithmIdentifier.RSASSA_PSS_SHA_512,... | Determine if the specified COSE algorithm ID denotes an RSA PSS public key | is_rsa_pss | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
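Both `is_rsa_pkcs` and `is_rsa_pss` are membership tests against COSE algorithm families. A sketch with a stand-in enum (the integer values match the IANA COSE Algorithms registry; the enum itself is an assumption, not the library's class):

```python
from enum import IntEnum

class COSEAlg(IntEnum):
    """A few COSE algorithm identifiers, values per the IANA COSE registry."""
    RSASSA_PKCS1_v1_5_SHA_1 = -65535
    RSASSA_PKCS1_v1_5_SHA_256 = -257
    RSASSA_PKCS1_v1_5_SHA_384 = -258
    RSASSA_PKCS1_v1_5_SHA_512 = -259
    RSASSA_PSS_SHA_256 = -37
    RSASSA_PSS_SHA_384 = -38
    RSASSA_PSS_SHA_512 = -39
    ECDSA_SHA_256 = -7

def is_rsa_pkcs(alg_id: int) -> bool:
    # Membership test against the PKCS#1 v1.5 family
    return alg_id in (
        COSEAlg.RSASSA_PKCS1_v1_5_SHA_1,
        COSEAlg.RSASSA_PKCS1_v1_5_SHA_256,
        COSEAlg.RSASSA_PKCS1_v1_5_SHA_384,
        COSEAlg.RSASSA_PKCS1_v1_5_SHA_512,
    )

def is_rsa_pss(alg_id: int) -> bool:
    # Membership test against the PSS family
    return alg_id in (
        COSEAlg.RSASSA_PSS_SHA_256,
        COSEAlg.RSASSA_PSS_SHA_384,
        COSEAlg.RSASSA_PSS_SHA_512,
    )

print(is_rsa_pkcs(-257), is_rsa_pss(-37), is_rsa_pkcs(-7))  # -> True True False
```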
def get_ec2_sig_alg(alg_id: COSEAlgorithmIdentifier) -> EllipticCurveSignatureAlgorithm:
"""Turn an "ECDSA" COSE algorithm identifier into a corresponding signature
algorithm
"""
if alg_id == COSEAlgorithmIdentifier.ECDSA_SHA_256:
return ECDSA(SHA256())
if alg_id == COSEAlgorithmIdentifier.E... | Turn an "ECDSA" COSE algorithm identifier into a corresponding signature
algorithm
| get_ec2_sig_alg | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
def get_ec2_curve(crv_id: COSECRV) -> EllipticCurve:
"""Turn an EC2 COSE crv identifier into a corresponding curve"""
if crv_id == COSECRV.P256:
return SECP256R1()
elif crv_id == COSECRV.P384:
return SECP384R1()
elif crv_id == COSECRV.P521:
return SECP521R1()
raise Unsupport... | Turn an EC2 COSE crv identifier into a corresponding curve | get_ec2_curve | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
def get_rsa_pkcs1_sig_alg(alg_id: COSEAlgorithmIdentifier) -> HashAlgorithm:
"""Turn an "RSASSA_PKCS1" COSE algorithm identifier into a corresponding signature
algorithm
"""
if alg_id == COSEAlgorithmIdentifier.RSASSA_PKCS1_v1_5_SHA_1:
return SHA1()
if alg_id == COSEAlgorithmIdentifier.RSASS... | Turn an "RSASSA_PKCS1" COSE algorithm identifier into a corresponding signature
algorithm
| get_rsa_pkcs1_sig_alg | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
def get_rsa_pss_sig_alg(alg_id: COSEAlgorithmIdentifier) -> HashAlgorithm:
"""Turn an "RSASSA_PSS" COSE algorithm identifier into a corresponding signature
algorithm
"""
if alg_id == COSEAlgorithmIdentifier.RSASSA_PSS_SHA_256:
return SHA256()
if alg_id == COSEAlgorithmIdentifier.RSASSA_PSS_S... | Turn an "RSASSA_PSS" COSE algorithm identifier into a corresponding signature
algorithm
| get_rsa_pss_sig_alg | python | duo-labs/py_webauthn | webauthn/helpers/algorithms.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/algorithms.py | BSD-3-Clause |
def byteslike_to_bytes(val: Union[bytes, memoryview]) -> bytes:
"""
Massage bytes subclasses into bytes for ease of concatenation, comparison, etc...
"""
if isinstance(val, memoryview):
val = val.tobytes()
return bytes(val) |
Massage bytes subclasses into bytes for ease of concatenation, comparison, etc...
| byteslike_to_bytes | python | duo-labs/py_webauthn | webauthn/helpers/byteslike_to_bytes.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/byteslike_to_bytes.py | BSD-3-Clause |
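This row shows the whole helper, so it can be exercised directly against the kinds of inputs the test rows above describe (a `memoryview` and a `bytes` subclass both normalize to plain `bytes`):

```python
def byteslike_to_bytes(val):
    """Massage bytes-likes (memoryview, bytes subclasses) into plain bytes."""
    if isinstance(val, memoryview):
        val = val.tobytes()
    return bytes(val)

class CustomBytes(bytes):
    """A bytes subclass, as in test_handles_bytes_subclasses above."""

a = byteslike_to_bytes(memoryview(b"abc"))
b = byteslike_to_bytes(CustomBytes(b"abc"))
# Both are equal plain-bytes values, safe to concatenate and compare
print(a == b, type(a) is bytes, type(b) is bytes)  # -> True True True
```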
def decode_credential_public_key(
key: bytes,
) -> Union[DecodedOKPPublicKey, DecodedEC2PublicKey, DecodedRSAPublicKey]:
"""
Decode a CBOR-encoded public key and turn it into a data structure.
Supports OKP, EC2, and RSA public keys
"""
# Occasionally we might be given a public key in an "uncomp... |
Decode a CBOR-encoded public key and turn it into a data structure.
Supports OKP, EC2, and RSA public keys
| decode_credential_public_key | python | duo-labs/py_webauthn | webauthn/helpers/decode_credential_public_key.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/decode_credential_public_key.py | BSD-3-Clause |
def encode_cbor(val: Any) -> bytes:
"""
Attempt to encode data into CBOR.
Raises:
`helpers.exceptions.InvalidCBORData` if data cannot be decoded
"""
try:
to_return = cbor2.dumps(val)
except Exception as exc:
raise InvalidCBORData("Data could not be encoded to CBOR") from... |
Attempt to encode data into CBOR.
Raises:
`helpers.exceptions.InvalidCBORData` if data cannot be decoded
| encode_cbor | python | duo-labs/py_webauthn | webauthn/helpers/encode_cbor.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/encode_cbor.py | BSD-3-Clause |
def hash_by_alg(to_hash: bytes, alg: Optional[COSEAlgorithmIdentifier] = None) -> bytes:
"""
Generate a hash of `to_hash` by the specified COSE algorithm ID. Defaults to hashing
with SHA256
"""
# Default to SHA256 for hashing
hash = hashlib.sha256()
if alg in SHA_384:
hash = hashlib... |
Generate a hash of `to_hash` by the specified COSE algorithm ID. Defaults to hashing
with SHA256
| hash_by_alg | python | duo-labs/py_webauthn | webauthn/helpers/hash_by_alg.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/hash_by_alg.py | BSD-3-Clause |
def options_to_json(
options: Union[
PublicKeyCredentialCreationOptions,
PublicKeyCredentialRequestOptions,
]
) -> str:
"""
Prepare options for transmission to the front end as JSON
"""
if isinstance(options, PublicKeyCredentialCreationOptions):
_rp = {"name": options.rp.... |
Prepare options for transmission to the front end as JSON
| options_to_json | python | duo-labs/py_webauthn | webauthn/helpers/options_to_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/options_to_json.py | BSD-3-Clause |
def parse_attestation_object(val: bytes) -> AttestationObject:
"""
Decode and peel apart the CBOR-encoded blob `response.attestationObject` into
structured data.
"""
attestation_dict = parse_cbor(val)
decoded_attestation_object = AttestationObject(
fmt=attestation_dict["fmt"],
a... |
Decode and peel apart the CBOR-encoded blob `response.attestationObject` into
structured data.
| parse_attestation_object | python | duo-labs/py_webauthn | webauthn/helpers/parse_attestation_object.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_attestation_object.py | BSD-3-Clause |
def parse_attestation_statement(val: dict) -> AttestationStatement:
"""
Turn `response.attestationObject.attStmt` into structured data
"""
attestation_statement = AttestationStatement()
# Populate optional fields that may exist in the attestation statement
if "sig" in val:
attestation_s... |
Turn `response.attestationObject.attStmt` into structured data
| parse_attestation_statement | python | duo-labs/py_webauthn | webauthn/helpers/parse_attestation_statement.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_attestation_statement.py | BSD-3-Clause |
def parse_authentication_credential_json(json_val: Union[str, dict]) -> AuthenticationCredential:
"""
Parse a JSON form of an authentication credential, as either a stringified JSON object or a
plain dict, into an instance of AuthenticationCredential
"""
if isinstance(json_val, str):
try:
... |
Parse a JSON form of an authentication credential, as either a stringified JSON object or a
plain dict, into an instance of AuthenticationCredential
| parse_authentication_credential_json | python | duo-labs/py_webauthn | webauthn/helpers/parse_authentication_credential_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_authentication_credential_json.py | BSD-3-Clause |
def parse_authentication_options_json(
json_val: Union[str, dict]
) -> PublicKeyCredentialRequestOptions:
"""
Parse a JSON form of authentication options, as either stringified JSON or a plain dict, into an
instance of `PublicKeyCredentialRequestOptions`. Typically useful in mapping output from
`gen... |
Parse a JSON form of authentication options, as either stringified JSON or a plain dict, into an
instance of `PublicKeyCredentialRequestOptions`. Typically useful in mapping output from
`generate_authentication_options()`, that's been persisted as JSON via Redis/etc... back into
structured data.
| parse_authentication_options_json | python | duo-labs/py_webauthn | webauthn/helpers/parse_authentication_options_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_authentication_options_json.py | BSD-3-Clause |
def parse_authenticator_data(val: bytes) -> AuthenticatorData:
"""
Turn `response.attestationObject.authData` into structured data
"""
val = byteslike_to_bytes(val)
# Don't bother parsing if there aren't enough bytes for at least:
# - rpIdHash (32 bytes)
# - flags (1 byte)
# - signCount... |
Turn `response.attestationObject.authData` into structured data
| parse_authenticator_data | python | duo-labs/py_webauthn | webauthn/helpers/parse_authenticator_data.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_authenticator_data.py | BSD-3-Clause |
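The length check in the row above reflects authData's fixed 37-byte header: a 32-byte rpIdHash, one flags byte, and a big-endian 4-byte signCount. A minimal header parser along those lines (a sketch, not the library's full routine, which also handles attested credential data and extensions):

```python
import struct

def parse_auth_data_header(val: bytes):
    """Split the fixed 37-byte authenticator-data header into its parts."""
    if len(val) < 37:
        raise ValueError("Authenticator data is shorter than the 37-byte minimum")
    rp_id_hash = val[0:32]
    flags = val[32]
    # signCount is an unsigned 32-bit big-endian counter
    (sign_count,) = struct.unpack(">I", val[33:37])
    return rp_id_hash, flags, sign_count

header = bytes(32) + bytes([0b00000101]) + (9).to_bytes(4, "big")
print(parse_auth_data_header(header)[1:])  # -> (5, 9)
```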
def parse_backup_flags(flags: AuthenticatorDataFlags) -> ParsedBackupFlags:
"""Convert backup eligibility and backup state flags into more useful representations
Raises:
`helpers.exceptions.InvalidBackupFlags` if an invalid backup state is detected
"""
credential_device_type = CredentialDeviceT... | Convert backup eligibility and backup state flags into more useful representations
Raises:
`helpers.exceptions.InvalidBackupFlags` if an invalid backup state is detected
| parse_backup_flags | python | duo-labs/py_webauthn | webauthn/helpers/parse_backup_flags.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_backup_flags.py | BSD-3-Clause |
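The BE/BS flag pair maps to a credential device type plus a backed-up bit, and a set BS without BE is the invalid state the docstring warns about. A sketch of that mapping (the string values and tuple return are assumptions; the library uses enums and a dataclass-like struct):

```python
def parse_backup_flags(be: bool, bs: bool):
    """Map the BE/BS flag pair to (device_type, backed_up)."""
    if bs and not be:
        # Backup state without backup eligibility is an authenticator bug
        raise ValueError("Backup state (BS) flag set without backup eligibility (BE)")
    device_type = "multi_device" if be else "single_device"
    return device_type, bs

print(parse_backup_flags(be=True, bs=True))    # -> ('multi_device', True)
print(parse_backup_flags(be=False, bs=False))  # -> ('single_device', False)
```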
def parse_cbor(data: bytes) -> Any:
"""
Attempt to decode CBOR-encoded data.
Raises:
`helpers.exceptions.InvalidCBORData` if data cannot be decoded
"""
try:
to_return = cbor2.loads(data)
except Exception as exc:
raise InvalidCBORData("Could not decode CBOR data") from ex... |
Attempt to decode CBOR-encoded data.
Raises:
`helpers.exceptions.InvalidCBORData` if data cannot be decoded
| parse_cbor | python | duo-labs/py_webauthn | webauthn/helpers/parse_cbor.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_cbor.py | BSD-3-Clause |
def parse_client_data_json(val: bytes) -> CollectedClientData:
"""
Break apart `response.clientDataJSON` buffer into structured data
"""
val = byteslike_to_bytes(val)
try:
json_dict = json.loads(val)
except JSONDecodeError:
raise InvalidJSONStructure("Unable to decode client_dat... |
Break apart `response.clientDataJSON` buffer into structured data
| parse_client_data_json | python | duo-labs/py_webauthn | webauthn/helpers/parse_client_data_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_client_data_json.py | BSD-3-Clause |
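`clientDataJSON` is UTF-8 JSON whose `challenge` travels as unpadded base64url. A hedged sketch of the decode step (field handling is an assumption; the library parses into a `CollectedClientData` struct rather than a dict):

```python
import base64
import json

def parse_client_data(val: bytes) -> dict:
    """Decode clientDataJSON bytes and base64url-decode the challenge."""
    try:
        parsed = json.loads(val)
    except json.JSONDecodeError as exc:
        raise ValueError("Unable to decode client_data_json bytes as JSON") from exc
    # Re-pad the unpadded base64url challenge before decoding
    challenge = parsed["challenge"]
    parsed["challenge"] = base64.urlsafe_b64decode(challenge + "=" * (-len(challenge) % 4))
    return parsed

raw = json.dumps({
    "type": "webauthn.get",
    "challenge": base64.urlsafe_b64encode(b"1234").rstrip(b"=").decode(),
    "origin": "https://example.com",
}).encode()
print(parse_client_data(raw)["challenge"])  # -> b'1234'
```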
def parse_registration_credential_json(json_val: Union[str, dict]) -> RegistrationCredential:
"""
Parse a JSON form of a registration credential, as either a stringified JSON object or a
plain dict, into an instance of RegistrationCredential
"""
if isinstance(json_val, str):
try:
... |
Parse a JSON form of a registration credential, as either a stringified JSON object or a
plain dict, into an instance of RegistrationCredential
| parse_registration_credential_json | python | duo-labs/py_webauthn | webauthn/helpers/parse_registration_credential_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_registration_credential_json.py | BSD-3-Clause |
def parse_registration_options_json(
json_val: Union[str, dict]
) -> PublicKeyCredentialCreationOptions:
"""
Parse a JSON form of registration options, as either stringified JSON or a plain dict, into an
instance of `PublicKeyCredentialCreationOptions`. Typically useful in mapping output from
`gener... |
Parse a JSON form of registration options, as either stringified JSON or a plain dict, into an
instance of `PublicKeyCredentialCreationOptions`. Typically useful in mapping output from
`generate_registration_options()`, that's been persisted as JSON via Redis/etc... back into
structured data.
| parse_registration_options_json | python | duo-labs/py_webauthn | webauthn/helpers/parse_registration_options_json.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/parse_registration_options_json.py | BSD-3-Clause |
def pem_cert_bytes_to_open_ssl_x509(cert: bytes) -> X509:
"""Convert PEM-formatted certificate bytes into an X509 instance usable for cert
chain validation
"""
cert_crypto = load_pem_x509_certificate(cert)
cert_openssl = X509().from_cryptography(cert_crypto)
return cert_openssl | Convert PEM-formatted certificate bytes into an X509 instance usable for cert
chain validation
| pem_cert_bytes_to_open_ssl_x509 | python | duo-labs/py_webauthn | webauthn/helpers/pem_cert_bytes_to_open_ssl_x509.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/pem_cert_bytes_to_open_ssl_x509.py | BSD-3-Clause |
def snake_case_to_camel_case(snake_case: str) -> str:
"""
Helper method for converting a snake_case'd value to camelCase
input: pub_key_cred_params
output: pubKeyCredParams
"""
parts = snake_case.split("_")
converted = parts[0].lower() + "".join(part.title() for part in parts[1:])
# Ma... |
Helper method for converting a snake_case'd value to camelCase
input: pub_key_cred_params
output: pubKeyCredParams
| snake_case_to_camel_case | python | duo-labs/py_webauthn | webauthn/helpers/snake_case_to_camel_case.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/snake_case_to_camel_case.py | BSD-3-Clause |
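The row's own example pins the behavior (`pub_key_cred_params` -> `pubKeyCredParams`), and the visible lines show the core of the implementation. A self-contained version (the real helper's trailing special-case comment is truncated above, so only the base conversion is reproduced here):

```python
def snake_case_to_camel_case(snake_case: str) -> str:
    """Convert a snake_case'd value to camelCase,
    e.g. pub_key_cred_params -> pubKeyCredParams."""
    parts = snake_case.split("_")
    # Lowercase the first chunk, Title-case the rest, and rejoin
    return parts[0].lower() + "".join(part.title() for part in parts[1:])

print(snake_case_to_camel_case("pub_key_cred_params"))  # -> pubKeyCredParams
```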
def validate_certificate_chain(
*,
x5c: List[bytes],
pem_root_certs_bytes: Optional[List[bytes]] = None,
) -> bool:
"""Validate that the certificates in x5c chain back to a known root certificate
Args:
`x5c`: X5C certificates from a registration response's attestation statement
(opt... | Validate that the certificates in x5c chain back to a known root certificate
Args:
`x5c`: X5C certificates from a registration response's attestation statement
(optional) `pem_root_certs_bytes`: Any additional (PEM-formatted)
root certificates that may complete the certificate chain
Ra... | validate_certificate_chain | python | duo-labs/py_webauthn | webauthn/helpers/validate_certificate_chain.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/validate_certificate_chain.py | BSD-3-Clause |
def verify_safetynet_timestamp(timestamp_ms: int) -> None:
"""Handle time drift between an RP and the Google SafetyNet API servers with a window of
time within which the response is valid
"""
# Buffer period in ms
grace_ms = 10 * 1000
# Get "now" in ms
now = int(time.time()) * 1000
# Ma... | Handle time drift between an RP and the Google SafetyNet API servers with a window of
time within which the response is valid
| verify_safetynet_timestamp | python | duo-labs/py_webauthn | webauthn/helpers/verify_safetynet_timestamp.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/verify_safetynet_timestamp.py | BSD-3-Clause |
def verify_signature(
*,
public_key: Union[
EllipticCurvePublicKey,
RSAPublicKey,
Ed25519PublicKey,
DSAPublicKey,
Ed448PublicKey,
X25519PublicKey,
X448PublicKey,
],
signature_alg: COSEAlgorithmIdentifier,
signature: bytes,
data: bytes,
) ->... | Verify a signature was signed with the private key corresponding to the provided
public key.
Args:
`public_key`: A public key loaded via cryptography's `load_der_public_key`, `load_der_x509_certificate`, etc...
`signature_alg`: Algorithm ID used to sign the signature
`signature`: Signat... | verify_signature | python | duo-labs/py_webauthn | webauthn/helpers/verify_signature.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/verify_signature.py | BSD-3-Clause |
def parse_cert_info(val: bytes) -> TPMCertInfo:
"""
Turn `response.attestationObject.attStmt.certInfo` into structured data
"""
pointer = 0
# The constant "TPM_GENERATED_VALUE" indicating a structure generated by TPM
magic_bytes = val[pointer : pointer + 4]
pointer += 4
# Type of the c... |
Turn `response.attestationObject.attStmt.certInfo` into structured data
| parse_cert_info | python | duo-labs/py_webauthn | webauthn/helpers/tpm/parse_cert_info.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/tpm/parse_cert_info.py | BSD-3-Clause |
def parse_pub_area(val: bytes) -> TPMPubArea:
"""
Turn `response.attestationObject.attStmt.pubArea` into structured data
"""
pointer = 0
type_bytes = val[pointer : pointer + 2]
pointer += 2
mapped_type = TPM_ALG_MAP[type_bytes]
name_alg_bytes = val[pointer : pointer + 2]
pointer +=... |
Turn `response.attestationObject.attStmt.pubArea` into structured data
| parse_pub_area | python | duo-labs/py_webauthn | webauthn/helpers/tpm/parse_pub_area.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/helpers/tpm/parse_pub_area.py | BSD-3-Clause |
def _generate_pub_key_cred_params(
supported_algs: List[COSEAlgorithmIdentifier],
) -> List[PublicKeyCredentialParameters]:
"""
Take an array of algorithm ID ints and return an array of PublicKeyCredentialParameters
"""
return [PublicKeyCredentialParameters(type="public-key", alg=alg) for alg in sup... |
Take an array of algorithm ID ints and return an array of PublicKeyCredentialParameters
| _generate_pub_key_cred_params | python | duo-labs/py_webauthn | webauthn/registration/generate_registration_options.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/generate_registration_options.py | BSD-3-Clause |
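The helper is a one-liner over a struct. A runnable sketch with a dataclass standing in for the library's `PublicKeyCredentialParameters` model (the dataclass is an assumption; alg IDs -7/-257 are ES256/RS256 per the IANA COSE registry):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PublicKeyCredentialParameters:
    type: str
    alg: int

def generate_pub_key_cred_params(supported_algs: List[int]):
    """One PublicKeyCredentialParameters entry per supported COSE alg ID."""
    return [
        PublicKeyCredentialParameters(type="public-key", alg=alg)
        for alg in supported_algs
    ]

params = generate_pub_key_cred_params([-7, -257])
print([(p.type, p.alg) for p in params])
# -> [('public-key', -7), ('public-key', -257)]
```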
def generate_registration_options(
*,
rp_id: str,
rp_name: str,
user_name: str,
user_id: Optional[bytes] = None,
user_display_name: Optional[str] = None,
challenge: Optional[bytes] = None,
timeout: int = 60000,
attestation: AttestationConveyancePreference = AttestationConveyancePrefe... | Generate options for registering a credential via navigator.credentials.create()
Args:
`rp_id`: A unique, constant identifier for this Relying Party.
`rp_name`: A user-friendly, readable name for the Relying Party.
`user_name`: A value that will help the user identify which account this cre... | generate_registration_options | python | duo-labs/py_webauthn | webauthn/registration/generate_registration_options.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/generate_registration_options.py | BSD-3-Clause |
def verify_registration_response(
*,
credential: Union[str, dict, RegistrationCredential],
expected_challenge: bytes,
expected_rp_id: str,
expected_origin: Union[str, List[str]],
require_user_presence: bool = True,
require_user_verification: bool = False,
supported_pub_key_algs: List[COS... | Verify an authenticator's response to navigator.credentials.create()
Args:
- `credential`: The value returned from `navigator.credentials.create()`. Can be either a
stringified JSON object, a plain dict, or an instance of RegistrationCredential
- `expected_challenge`: The challenge passed... | verify_registration_response | python | duo-labs/py_webauthn | webauthn/registration/verify_registration_response.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/verify_registration_response.py | BSD-3-Clause |
def verify_android_key(
*,
attestation_statement: AttestationStatement,
attestation_object: bytes,
client_data_json: bytes,
credential_public_key: bytes,
pem_root_certs_bytes: List[bytes],
) -> bool:
"""Verify an "android-key" attestation statement
See https://www.w3.org/TR/webauthn-2/#... | Verify an "android-key" attestation statement
See https://www.w3.org/TR/webauthn-2/#sctn-android-key-attestation
Also referenced: https://source.android.com/docs/security/features/keystore/attestation
| verify_android_key | python | duo-labs/py_webauthn | webauthn/registration/formats/android_key.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/formats/android_key.py | BSD-3-Clause |
def verify_android_safetynet(
*,
attestation_statement: AttestationStatement,
attestation_object: bytes,
client_data_json: bytes,
pem_root_certs_bytes: List[bytes],
verify_timestamp_ms: bool = True,
) -> bool:
"""Verify an "android-safetynet" attestation statement
See https://www.w3.org... | Verify an "android-safetynet" attestation statement
See https://www.w3.org/TR/webauthn-2/#sctn-android-safetynet-attestation
Notes:
- `verify_timestamp_ms` is a kind of escape hatch specifically for enabling
testing of this method. Without this we can't use static responses in unit
... | verify_android_safetynet | python | duo-labs/py_webauthn | webauthn/registration/formats/android_safetynet.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/formats/android_safetynet.py | BSD-3-Clause |
def verify_apple(
*,
attestation_statement: AttestationStatement,
attestation_object: bytes,
client_data_json: bytes,
credential_public_key: bytes,
pem_root_certs_bytes: List[bytes],
) -> bool:
"""
https://www.w3.org/TR/webauthn-2/#sctn-apple-anonymous-attestation
"""
if not att... | ERROR: type should be string, got "\n https://www.w3.org/TR/webauthn-2/#sctn-apple-anonymous-attestation\n " | verify_apple | python | duo-labs/py_webauthn | webauthn/registration/formats/apple.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/formats/apple.py | BSD-3-Clause |
def verify_fido_u2f(
*,
attestation_statement: AttestationStatement,
client_data_json: bytes,
rp_id_hash: bytes,
credential_id: bytes,
credential_public_key: bytes,
aaguid: bytes,
pem_root_certs_bytes: List[bytes],
) -> bool:
"""Verify a "fido-u2f" attestation statement
See http... | Verify a "fido-u2f" attestation statement
See https://www.w3.org/TR/webauthn-2/#sctn-fido-u2f-attestation
| verify_fido_u2f | python | duo-labs/py_webauthn | webauthn/registration/formats/fido_u2f.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/formats/fido_u2f.py | BSD-3-Clause |
def verify_packed(
*,
attestation_statement: AttestationStatement,
attestation_object: bytes,
client_data_json: bytes,
credential_public_key: bytes,
pem_root_certs_bytes: List[bytes],
) -> bool:
"""Verify a "packed" attestation statement
See https://www.w3.org/TR/webauthn-2/#sctn-packed... | Verify a "packed" attestation statement
See https://www.w3.org/TR/webauthn-2/#sctn-packed-attestation
| verify_packed | python | duo-labs/py_webauthn | webauthn/registration/formats/packed.py | https://github.com/duo-labs/py_webauthn/blob/master/webauthn/registration/formats/packed.py | BSD-3-Clause |
def fixed_get_imports(filename: str | os.PathLike) -> list[str]:
"""Work around for https://huggingface.co/microsoft/phi-1_5/discussions/72."""
if not str(filename).endswith("/modeling_deepseek.py"):
return get_imports(filename)
imports = get_imports(filename)
imports.remove("flash_attn")
re... | Work around for https://huggingface.co/microsoft/phi-1_5/discussions/72. | fixed_get_imports | python | xjdr-alt/entropix | download_weights.py | https://github.com/xjdr-alt/entropix/blob/master/download_weights.py | Apache-2.0 |
def kl_divergence(logp: jnp.ndarray, logq: jnp.ndarray) -> jnp.ndarray:
"""Compute KL divergence between two log probability distributions."""
p = jnp.exp(logp)
return jnp.sum(jnp.where(p > 0, p * (logp - logq), 0.0), axis=-1) | Compute KL divergence between two log probability distributions. | kl_divergence | python | xjdr-alt/entropix | entropix/dslider.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider.py | Apache-2.0 |
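The `kl_divergence` row is complete as shown; a NumPy transcription (plain `numpy` standing in for `jax.numpy`, an assumption made for portability) exercises its two defining properties — KL of a distribution with itself is zero, and KL between distinct distributions is positive:

```python
import numpy as np

def kl_div(logp: np.ndarray, logq: np.ndarray) -> np.ndarray:
    """KL(P || Q) from log probabilities, guarding the p == 0 case."""
    p = np.exp(logp)
    return np.sum(np.where(p > 0, p * (logp - logq), 0.0), axis=-1)

logp = np.log(np.array([0.5, 0.25, 0.25]))
logq = np.log(np.full(3, 1.0 / 3.0))  # uniform reference distribution
```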
def ent_varent(logp: jnp.ndarray) -> Tuple[jnp.ndarray, jnp.ndarray]:
"""Compute entropy and varentropy from log probabilities."""
p = jnp.exp(logp)
ent = -jnp.sum(p * logp, axis=-1)
diff = logp + ent[..., None]
varent = jnp.sum(p * diff**2, axis=-1)
return ent, varent | Compute entropy and varentropy from log probabilities. | ent_varent | python | xjdr-alt/entropix | entropix/dslider.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider.py | Apache-2.0 |
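The `ent_varent` row is also complete; the same NumPy transcription (again swapping `numpy` for `jax.numpy`) confirms the textbook sanity check that a uniform distribution has entropy log(n) and zero varentropy:

```python
import numpy as np

def ent_varent(logp: np.ndarray):
    """Entropy and varentropy from log probabilities."""
    p = np.exp(logp)
    ent = -np.sum(p * logp, axis=-1)
    diff = logp + ent[..., None]            # per-outcome surprisal minus the mean
    varent = np.sum(p * diff**2, axis=-1)   # variance of that surprisal
    return ent, varent

ent, varent = ent_varent(np.log(np.full((1, 4), 0.25)))
```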
def normalize_logits(logits: jnp.ndarray, noise_floor: float) -> jnp.ndarray:
"""Normalize logits to log probabilities with noise floor truncation."""
shifted = logits - jnp.max(logits, axis=-1, keepdims=True)
normalized = shifted - jax.nn.logsumexp(shifted + EPS, axis=-1, keepdims=True)
# noise floor calculate... | Normalize logits to log probabilities with noise floor truncation. | normalize_logits | python | xjdr-alt/entropix | entropix/dslider.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider.py | Apache-2.0 |
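The noise-floor truncation in `normalize_logits` is cut off in the row above; the visible part is a numerically stable log-softmax, which can be sketched in NumPy (a manual logsumexp standing in for `jax.nn.logsumexp`; the noise-floor step is deliberately omitted since that code is not recoverable from the dump):

```python
import numpy as np

def log_softmax(logits: np.ndarray) -> np.ndarray:
    """Shift by the max for stability, then subtract the logsumexp.

    The noise-floor truncation from the original row is omitted here.
    """
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    lse = np.log(np.sum(np.exp(shifted), axis=-1, keepdims=True))
    return shifted - lse

logp = log_softmax(np.array([1.0, 2.0, 3.0]))
```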
def __hash__(self):
"""Static hash implementation that avoids hashing array values"""
hashable_items = []
for field in self.__dataclass_fields__.values():
value = getattr(self, field.name)
if isinstance(value, (jnp.ndarray, jax.Array)):
hashable_items.append(hash((str(field.name), value.... | Static hash implementation that avoids hashing array values | __hash__ | python | xjdr-alt/entropix | entropix/dslider_config.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider_config.py | Apache-2.0 |
def halley_update(alpha, target_values):
"""
Compute Halley's method update direction for the function
"""
p1 = jsp.polygamma(1, alpha)
p2 = jsp.polygamma(2, alpha)
S = jnp.sum(alpha, axis=-1, keepdims=True)
s1 = jsp.polygamma(1, S)
s2 = jsp.polygamma(2, S)
p1_inv = 1.0 / p1
sum_p1_inv = jnp.sum... |
Compute Halley's method update direction for the function
| halley_update | python | xjdr-alt/entropix | entropix/dslider_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider_utils.py | Apache-2.0 |
def fit_dirichlet(
target_values,
init_alpha=None,
initial_lr=1.2,
decay_alpha=0.1,
decay_beta=2.0,
decay_gamma=0.25,
decay_nu=0.75,
max_iters=140,
tol=1e-4,
dtype: jnp.dtype = jnp.bfloat16,
):
"""
Estimates Dirichlet parameters (alpha) from target logprobs.
"""
batch_shape = target_values.s... |
Estimates Dirichlet parameters (alpha) from target logprobs.
| fit_dirichlet | python | xjdr-alt/entropix | entropix/dslider_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/dslider_utils.py | Apache-2.0 |
def get_result_at_slot(self, slot: int) -> SlotData:
"""Returns the token at a given slot.
Args:
slot: An integer from [0, n) representing an index into the batch.
Note: implementations of this method must correctly handle
microbatches, if microbatches are used.
"""
# Potentially get mul... | Returns the token at a given slot.
Args:
slot: An integer from [0, n) representing an index into the batch.
Note: implementations of this method must correctly handle
microbatches, if microbatches are used.
| get_result_at_slot | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def __init__(
self,
params: ModelParams,
xfmr_weights: XfmrWeights,
mesh: jax.sharding.Mesh,
tokenizer: Tokenizer,
xfmr_fn: Callable,
sample_fn: Callable,
):
"""Initialize engine with model parameters and functions.
Args:
params: Model architecture parameters
xfmr_... | Initialize engine with model parameters and functions.
Args:
params: Model architecture parameters
xfmr_weights: Model weights
mesh: Device mesh for parallel execution
tokenizer: Tokenizer instance
xfmr_fn: Transformer forward function
sample_fn: Token sampling funct... | __init__ | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def get_tokenizer(
self,
) -> Dict[str, Any]:
"""Returns the info to construct a tokenizer in py/c++."""
return {} | Returns the info to construct a tokenizer in py/c++. | get_tokenizer | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def build_tokenizer(
self,
metadata: Dict[str, Any],
) -> Tokenizer:
"""Builds a new tokenizer object and returns it."""
return self.tokenizer | Builds a new tokenizer object and returns it. | build_tokenizer | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def free_resource(
self,
slot: int, # pylint: disable=unused-argument
) -> Any:
"""Free cache and other decode resource for the slot.
This function is needed for advanced attention kernels such as PagedAttention.
After finishing one request, the engine needs to free all used page block
resource and ... | Free cache and other decode resource for the slot.
This function is needed for advanced attention kernels such as PagedAttention.
After finishing one request, the engine needs to free all used page-block
resources and reuse them for incoming requests.
| free_resource | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def prefill(
self,
*,
params: Params,
existing_prefix: Optional[jax.Array] = None,
padded_tokens: jax.Array,
true_length: int,
sampler: Optional[Callable[[Any], Any]] = None, # pylint: disable=unused-argument
rng: Optional[jax.random.PRNGKey] = None,
top_k: int = 6,
) -> Tuple[Pre... | Computes a kv-cache for a set of tokens conditional on existing cache.
existing_prefix (if provided) represents a prefix that has already been
processed by the underlying model. tokens is logically appended
to the text represented by `existing_prefix`. This method returns a new
kv_cache (typically) for... | prefill | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def generate(
self,
params: Params,
decode_state: DecodeState,
sampler: Optional[Callable[[Any], Any]] = None, # pylint: disable=unused-argument
rng: Optional[jax.random.PRNGKey] = jax.random.PRNGKey(1337),
) -> Tuple[DecodeState, ResultTokens]:
"""Generates tokens for each sequence being dec... | Generates tokens for each sequence being decoded in parallel.
Generate takes a batch of pre-computed kv-caches, and computes:
- the predicted next token for each of the sequences
- an updated set of kv-caches
In the case of pipelining, this will handle N cycles (where each cycle
consists of ea... | generate | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def insert(
self,
prefix: Prefix,
decode_state: DecodeState,
slot: int,
) -> DecodeState:
"""Adds `new_request` into `caches` at 'slot'.
When decoding multiple requests in parallel, when one request finishes, a
new request must be slotted into the recently vacated spot: `insert`!
Thi... | Adds `new_request` into `caches` at 'slot'.
When decoding multiple requests in parallel, when one request finishes, a
new request must be slotted into the recently vacated spot: `insert`!
This can occur in between and async to generate calls, and takes a lock over
that row of the cache.
The slot ... | insert | python | xjdr-alt/entropix | entropix/engine.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/engine.py | Apache-2.0 |
def stop(self):
"""Stops the driver and all background threads."""
# Signal to all threads that they should stop.
self.live = False
all_backlogs = list(
itertools.chain(
[self._prefill_backlog],
self._transfer_backlogs,
self._generate_backlogs.values(),
self._detok... | Stops the driver and all background threads. | stop | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def get_total_concurrent_requests(self) -> int:
"""Gets the total number of concurrent requests the driver can handle."""
# We don't support filling all backlogs at once because it can cause GIL
# contention.
total_max_concurrent_decodes = sum(
[e.max_concurrent_decodes for e in self._generate_eng... | Gets the total number of concurrent requests the driver can handle. | get_total_concurrent_requests | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def _prefill_thread(self, idx: int):
"""Thread which runs in the background performing prefills."""
logging.info("---------Spinning up prefill thread %d.---------", idx)
prefill_engine = self._prefill_engines[idx]
prefill_params = self._prefill_params[idx]
metadata = prefill_engine.get_tokenizer()
... | Thread which runs in the background performing prefills. | _prefill_thread | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def _transfer_thread(self, idx: int):
"""Transfers the kv cache on an active request to the least full
generate backlog."""
transfer_backlog = self._transfer_backlogs[idx]
while self.live:
# The transfer thread can just sleep until it has work to do.
new_request = transfer_backlog.get(block... | Transfers the kv cache on an active request to the least full
generate backlog. | _transfer_thread | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def _generate_thread(self, idx: int):
"""Step token generation and insert prefills from backlog."""
logging.info("---------Spinning up generate thread %d.---------", idx)
generate_engine = self._generate_engines[idx]
my_slots = self._generate_slots[idx]
my_generate_backlog = self._generate_backlogs[... | Step token generation and insert prefills from backlog. | _generate_thread | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def _detokenize_thread(self, idx: int):
"""Detokenize sampled tokens and returns them to the user."""
# One of these per generate engine.
# For all filled my_slots, pop the sampled token onto the relevant
# requests return channel. If it done, place it back onto free slots.
my_detokenize_backlog = s... | Detokenize sampled tokens and returns them to the user. | _detokenize_thread | python | xjdr-alt/entropix | entropix/orchestrator.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/orchestrator.py | Apache-2.0 |
def __init__(self, model_path: str):
"""
Initializes the Tokenizer with a Tiktoken model.
Args:
model_path (str): The path to the Tiktoken model file.
"""
assert os.path.isfile(model_path), model_path
mergeable_ranks = load_tiktoken_bpe(model_path)
num_base_tokens = len(mergeable_r... |
Initializes the Tokenizer with a Tiktoken model.
Args:
model_path (str): The path to the Tiktoken model file.
| __init__ | python | xjdr-alt/entropix | entropix/tokenizer.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/tokenizer.py | Apache-2.0 |
def encode(
self,
s: str,
*,
bos: bool,
eos: bool,
allowed_special: Optional[Union[Literal['all'], AbstractSet[str]]] = None,
disallowed_special: Union[Literal['all'], Collection[str]] = (),
) -> List[int]:
"""
Encodes a string into a list of token IDs.
Args:
s (str): ... |
Encodes a string into a list of token IDs.
Args:
s (str): The input string to be encoded.
bos (bool): Whether to prepend the beginning-of-sequence token.
eos (bool): Whether to append the end-of-sequence token.
allowed_special ("all"|set[str]): allowed special tokens in string
... | encode | python | xjdr-alt/entropix | entropix/tokenizer.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/tokenizer.py | Apache-2.0 |
def decode(self, t: Sequence[int]) -> str:
"""
Decodes a list of token IDs into a string.
Args:
t (List[int]): The list of token IDs to be decoded.
Returns:
str: The decoded string.
"""
# Typecast is safe here. Tiktoken doesn't do anything list-related with the sequence.
re... |
Decodes a list of token IDs into a string.
Args:
t (List[int]): The list of token IDs to be decoded.
Returns:
str: The decoded string.
| decode | python | xjdr-alt/entropix | entropix/tokenizer.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/tokenizer.py | Apache-2.0 |
def _split_whitespaces_or_nonwhitespaces(s: str, max_consecutive_slice_len: int) -> Iterator[str]:
"""
Splits the string `s` so that each substring contains no more than `max_consecutive_slice_len`
consecutive whitespaces or consecutive non-whitespaces.
"""
current_slice_len = 0
current_slice_is... |
Splits the string `s` so that each substring contains no more than `max_consecutive_slice_len`
consecutive whitespaces or consecutive non-whitespaces.
| _split_whitespaces_or_nonwhitespaces | python | xjdr-alt/entropix | entropix/tokenizer.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/tokenizer.py | Apache-2.0 |
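The body of `_split_whitespaces_or_nonwhitespaces` is truncated above; a reconstruction that satisfies the docstring (my sketch, not necessarily the repo's exact code) tracks the current run of whitespace or non-whitespace characters and cuts a slice whenever the run exceeds the limit:

```python
from typing import Iterator

def split_runs(s: str, max_consecutive_slice_len: int) -> Iterator[str]:
    """Yield substrings of s so that no substring contains more than
    max_consecutive_slice_len consecutive (non-)whitespace characters."""
    current_slice_len = 0
    current_slice_is_space = s[0].isspace() if s else False
    slice_start = 0
    for i, ch in enumerate(s):
        is_now_space = ch.isspace()
        if current_slice_is_space ^ is_now_space:
            # the run type flipped: reset the counter
            current_slice_len = 1
            current_slice_is_space = is_now_space
        else:
            current_slice_len += 1
            if current_slice_len > max_consecutive_slice_len:
                yield s[slice_start:i]
                slice_start = i
                current_slice_len = 1
    yield s[slice_start:]

pieces = list(split_runs("aaaaaa", 2))
```

Concatenating the pieces always recovers the original string; only the slice boundaries change.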
def take_nearest_length(lengths: list[int], length: int) -> int:
"""Gets the nearest length to the right in a set of lengths."""
pos = bisect_left(lengths, length)
if pos == len(lengths):
return lengths[-1]
return lengths[pos] | Gets the nearest length to the right in a set of lengths. | take_nearest_length | python | xjdr-alt/entropix | entropix/token_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/token_utils.py | Apache-2.0 |
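`take_nearest_length` is shown in full; restated with a quick check of the bucket behavior (the bucket list below is illustrative, not from the repo):

```python
from bisect import bisect_left

def take_nearest_length(lengths: list, length: int) -> int:
    """Smallest bucket >= length; falls back to the largest bucket."""
    pos = bisect_left(lengths, length)
    if pos == len(lengths):
        return lengths[-1]
    return lengths[pos]

buckets = [16, 32, 64, 128]  # hypothetical prefill buckets (must be sorted)
```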
def tokenize_and_pad(
s: str,
vocab,
is_bos: bool = True,
prefill_lengths: Optional[List[int]] = None,
max_prefill_length: Optional[int] = None,
jax_padding: bool = True,
) -> Tuple[Union[jax.Array, np.ndarray], int]:
"""Tokenizes and pads a string.
Args:
s: String to tokenize.
vocab: Vocabulary... | Tokenizes and pads a string.
Args:
s: String to tokenize.
vocab: Vocabulary to tokenize with.
is_bos: Whether or not this is the beginning of a sequence. Defaults to yes,
as prefill is typically used when beginning sequences.
prefill_lengths: Buckets to pad the sequence to for static compilation.
... | tokenize_and_pad | python | xjdr-alt/entropix | entropix/token_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/token_utils.py | Apache-2.0 |
def pad_tokens(
tokens: np.ndarray,
bos_id: int,
pad_id: int,
is_bos: bool = True,
prefill_lengths: Optional[List[int]] = None,
max_prefill_length: Optional[int] = None,
jax_padding: bool = True,
) -> Tuple[Union[jax.Array, np.ndarray], int]:
"""Pads tokens to the nearest prefill length that is equal to... | Pads tokens to the nearest prefill length that is equal to or greater
than the token length.
Args:
tokens: Tokens.
bos_id: Bos ID.
pad_id: Pad ID.
is_bos: Add a beginning of sequence token if this is true.
prefill_lengths: Buckets to pad the sequence to for static compilation.
max_prefil... | pad_tokens | python | xjdr-alt/entropix | entropix/token_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/token_utils.py | Apache-2.0 |
def process_result_tokens(
tokenizer: Tokenizer,
slot: int,
slot_max_length: int,
result_tokens: ResultTokens,
complete: np.ndarray,
is_client_side_tokenization: bool = False,
debug: bool = False,
) -> Tuple[List[ReturnSample], np.ndarray]:
"""Processes result tokens into a list of strings, handling m... | Processes result tokens into a list of strings, handling multiple
samples.
Args:
slot: The slot at which to draw tokens from.
slot_max_length: Max length for a sample in the slot.
result_tokens: The tokens to access by slot.
complete: Array representing the completion status of each sample in t... | process_result_tokens | python | xjdr-alt/entropix | entropix/token_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/token_utils.py | Apache-2.0 |
def is_byte_token(s: str) -> bool:
"""Returns True if s is a byte string like "<0xAB>"."""
# Bytes look like "<0xAB>".
if len(s) != 6 or s[0:3] != "<0x" or s[-1] != ">":
return False
return True | Returns True if s is a byte string like "<0xAB>". | is_byte_token | python | xjdr-alt/entropix | entropix/token_utils.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/token_utils.py | Apache-2.0 |
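`is_byte_token` is complete; collapsed to a single boolean expression (behaviorally identical to the row above) it is easy to exercise at the boundaries:

```python
def is_byte_token(s: str) -> bool:
    """True only for byte tokens of the exact form "<0xAB>" (length 6)."""
    return len(s) == 6 and s[0:3] == "<0x" and s[-1] == ">"
```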
def create_mesh(device_count: int) -> jax.sharding.Mesh:
"""Creates device mesh for distributed execution."""
devices = jax.devices()
mesh_shape = (device_count, 1)
device_mesh = jax.experimental.mesh_utils.create_device_mesh(mesh_shape)
return jax.sharding.Mesh(device_mesh, ("mp", "fsdp")) | Creates device mesh for distributed execution. | create_mesh | python | xjdr-alt/entropix | entropix/weights.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/weights.py | Apache-2.0 |
def load_weights(
ckpt_dir: Path, model_params, weight_config: Optional[WeightConfig] = None
) -> Tuple[XfmrWeights, jax.sharding.Mesh]:
"""Load and shard model weights across devices."""
weight_config = weight_config or WeightConfig()
mesh = create_mesh(jax.device_count())
w = {}
layer_weights = []
for... | Load and shard model weights across devices. | load_weights | python | xjdr-alt/entropix | entropix/weights.py | https://github.com/xjdr-alt/entropix/blob/master/entropix/weights.py | Apache-2.0 |
def aggregate_results(
single_eval_results: list[SingleEvalResult],
default_stats: tuple[str] = ("mean", "std"),
name2stats: dict[str, tuple[str]] | None = None,
) -> EvalResult:
"""
Aggregate results from multiple evaluations into a single EvalResult.
"""
name2stats = name2stats or {}
name2values = def... |
Aggregate results from multiple evaluations into a single EvalResult.
| aggregate_results | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
def map_with_progress(f: callable, xs: list[Any], num_threads: int = 50):
"""
Apply f to each element of xs, using a ThreadPool, and show progress.
"""
if os.getenv("debug"):
return list(map(f, tqdm(xs, total=len(xs))))
else:
with ThreadPool(min(num_threads, len(xs))) as pool:
return list(tqdm(p... |
Apply f to each element of xs, using a ThreadPool, and show progress.
| map_with_progress | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
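The final line of `map_with_progress` is truncated; a sketch of the thread-pool branch (the tqdm progress bar and the `debug` serial fallback are omitted, and the empty-input guard is my addition, since `ThreadPool(0)` would raise):

```python
from multiprocessing.pool import ThreadPool

def map_with_threads(f, xs, num_threads: int = 50):
    """Apply f to each element of xs with a thread pool, preserving order."""
    if not xs:
        return []
    with ThreadPool(min(num_threads, len(xs))) as pool:
        # imap is lazy but order-preserving, matching a plain map
        return list(pool.imap(f, xs))

squares = map_with_threads(lambda x: x * x, [1, 2, 3, 4])
```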
def message_to_html(message: Message) -> str:
"""
Generate HTML snippet (inside a <div>) for a message.
"""
return jinja_env.from_string(_message_template).render(
role=message["role"],
content=message["content"],
variant=message.get("variant", None),
) |
Generate HTML snippet (inside a <div>) for a message.
| message_to_html | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
def make_report(eval_result: EvalResult) -> str:
"""
Create a standalone HTML report from an EvalResult.
"""
return jinja_env.from_string(_report_template).render(
score=eval_result.score,
metrics=eval_result.metrics,
htmls=eval_result.htmls,
) |
Create a standalone HTML report from an EvalResult.
| make_report | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
def make_report_from_example_htmls(htmls: list[str]):
"""
Create a standalone HTML report from a list of example htmls
"""
return jinja_env.from_string(_report_template).render(
score=None, metrics={}, htmls=htmls
) |
Create a standalone HTML report from a list of example htmls
| make_report_from_example_htmls | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
def normalize_response(response: str) -> str:
"""
Normalize the response by removing markdown and LaTeX formatting that may prevent a match.
"""
return (
response.replace("**", "")
.replace("$\\boxed{", "")
.replace("}$", "")
.replace("\\$", "")
.replace("$\\text{", "")
.replace("$", ""... |
Normalize the response by removing markdown and LaTeX formatting that may prevent a match.
| normalize_response | python | xjdr-alt/entropix | evals/common.py | https://github.com/xjdr-alt/entropix/blob/master/evals/common.py | Apache-2.0 |
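The replacement chain in `normalize_response` is cut off above, so only the visible rules can be restated; this sketch applies them in the same order (a partial reconstruction, not the full function):

```python
def strip_formatting(response: str) -> str:
    """Apply the visible replacements from normalize_response, in order.

    The original chain continues past "$" but is truncated in the dump,
    so further rules are not reproduced here.
    """
    for old in ("**", "$\\boxed{", "}$", "\\$", "$\\text{", "$"):
        response = response.replace(old, "")
    return response
```

Order matters: `"$\\boxed{"` must be removed before the bare `"$"` rule, or the prefix would be mangled.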
def _normalize_answer(text: str) -> str:
"""Lower text and remove punctuation, articles and extra whitespace."""
parts = [
_white_space_fix(_remove_articles(_normalize_number(_remove_punc(_lower(token)))))
for token in _tokenize(text)
]
parts = [part for part in parts if part.strip()]
normalized = " ... | Lower text and remove punctuation, articles and extra whitespace. | _normalize_answer | python | xjdr-alt/entropix | evals/drop_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/drop_eval.py | Apache-2.0 |
def _align_bags(predicted: List[Set[str]], gold: List[Set[str]]) -> List[float]:
"""
Takes gold and predicted answer sets and first finds the optimal 1-1 alignment
between them and gets maximum metric values over all the answers.
"""
scores = np.zeros([len(gold), len(predicted)])
for gold_index, gold_item i... |
Takes gold and predicted answer sets and first finds the optimal 1-1 alignment
between them and gets maximum metric values over all the answers.
| _align_bags | python | xjdr-alt/entropix | evals/drop_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/drop_eval.py | Apache-2.0 |
def get_drop_metrics(
predicted: Union[str, List[str], Tuple[str, ...]],
gold: Union[str, List[str], Tuple[str, ...]],
) -> Tuple[float, float]:
"""
Takes a predicted answer and a gold answer (that are both either a string or a list of
strings), and returns exact match and the DROP F1 metric for the predictio... |
Takes a predicted answer and a gold answer (that are both either a string or a list of
strings), and returns exact match and the DROP F1 metric for the prediction. If you are
writing a script for evaluating objects in memory (say, the output of predictions during
validation, or while training), this is the fu... | get_drop_metrics | python | xjdr-alt/entropix | evals/drop_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/drop_eval.py | Apache-2.0 |
def answer_json_to_strings(answer: Dict[str, Any]) -> Tuple[Tuple[str, ...], str]:
"""
Takes an answer JSON blob from the DROP data release and converts it into strings used for
evaluation.
"""
if "number" in answer and answer["number"]:
return tuple([str(answer["number"])]), "number"
elif "spans" in an... |
Takes an answer JSON blob from the DROP data release and converts it into strings used for
evaluation.
| answer_json_to_strings | python | xjdr-alt/entropix | evals/drop_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/drop_eval.py | Apache-2.0 |
def normalize(s: str) -> str:
"""Lower text and remove punctuation, articles and extra whitespace."""
s = s.lower()
exclude = set(string.punctuation)
s = "".join(char for char in s if char not in exclude)
s = re.sub(r"\b(a|an|the)\b", " ", s)
s = " ".join(s.split())
return s | Lower text and remove punctuation, articles and extra whitespace. | normalize | python | xjdr-alt/entropix | evals/drop_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/drop_eval.py | Apache-2.0 |
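The drop_eval `normalize` row is complete; transcribed verbatim, it is easy to verify the four steps (lowercase, strip punctuation, drop articles, collapse whitespace) end to end:

```python
import re
import string

def normalize(s: str) -> str:
    """Lowercase, strip punctuation, remove articles, collapse whitespace."""
    s = s.lower()
    exclude = set(string.punctuation)
    s = "".join(ch for ch in s if ch not in exclude)
    s = re.sub(r"\b(a|an|the)\b", " ", s)  # articles become spaces
    return " ".join(s.split())              # collapse runs of whitespace
```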
def evaluate_functional_correctness(
sample: dict[str, str],
completions: list[str],
n_workers: int = 4,
timeout: float = 3.0,
):
"""
Evaluates the functional correctness of generated samples, and writes
results to f"{sample_file}_results.jsonl.gz"
"""
import copy
# Check the generated samples agai... |
Evaluates the functional correctness of generated samples, and writes
results to f"{sample_file}_results.jsonl.gz"
| evaluate_functional_correctness | python | xjdr-alt/entropix | evals/humaneval_eval.py | https://github.com/xjdr-alt/entropix/blob/master/evals/humaneval_eval.py | Apache-2.0 |
def stream_jsonl(filename: str) -> Iterable[Dict]:
"""
Parses each jsonl line and yields it as a dictionary
"""
if filename.endswith(".gz"):
with open(filename, "rb") as gzfp:
with gzip.open(gzfp, 'rt') as fp:
for line in fp:
if any(not x.isspace()... |
Parses each jsonl line and yields it as a dictionary
| stream_jsonl | python | xjdr-alt/entropix | evals/human-eval/human_eval/data.py | https://github.com/xjdr-alt/entropix/blob/master/evals/human-eval/human_eval/data.py | Apache-2.0 |
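Only the gzip branch of `stream_jsonl` is visible above; this sketch folds both branches into a single opener (my reconstruction of the truncated plain-file path) and round-trips two records through a gzipped temp file, showing that blank lines are skipped:

```python
import gzip
import json
import os
import tempfile
from typing import Dict, Iterable

def stream_jsonl(filename: str) -> Iterable[Dict]:
    """Yield one dict per non-blank line; transparently handles .gz files."""
    opener = gzip.open if filename.endswith(".gz") else open
    with opener(filename, "rt") as fp:
        for line in fp:
            if any(not x.isspace() for x in line):
                yield json.loads(line)

# round-trip two records (with a blank line between them) through a temp file
with tempfile.NamedTemporaryFile(suffix=".jsonl.gz", delete=False) as tmp:
    path = tmp.name
with gzip.open(path, "wt") as fp:
    fp.write('{"task_id": 1}\n\n{"task_id": 2}\n')
records = list(stream_jsonl(path))
os.remove(path)
```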