hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
585c7e24c67806395810ea82aa379346cf168b20 | 108 | py | Python | Python_OOP/Inheritance/Exercise/project/vehicle/vehicle.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | Python_OOP/Inheritance/Exercise/project/vehicle/vehicle.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | Python_OOP/Inheritance/Exercise/project/vehicle/vehicle.py | antonarnaudov/SoftUniProjects | 01cbdce2b350b57240045d1bc3e21d34f9d0351d | [
"MIT"
] | null | null | null | class Vehicle:
    def __init__(self, available_seats: int):
        self.available_seats = available_seats
| 27 | 46 | 0.731481 | 13 | 108 | 5.538462 | 0.615385 | 0.583333 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194444 | 108 | 3 | 47 | 36 | 0.827586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
589805dc3d54f551f8610e1c3c03cfb60122495f | 4,468 | py | Python | src/api/tests/test_attachments.py | Dabble-of-DevOps-Bio/ella | e38631d302611a143c9baaa684bcbd014d9734e4 | [
"MIT"
] | null | null | null | src/api/tests/test_attachments.py | Dabble-of-DevOps-Bio/ella | e38631d302611a143c9baaa684bcbd014d9734e4 | [
"MIT"
] | null | null | null | src/api/tests/test_attachments.py | Dabble-of-DevOps-Bio/ella | e38631d302611a143c9baaa684bcbd014d9734e4 | [
"MIT"
] | null | null | null | import base64
import tempfile
from vardb.datamodel import attachment
# Base 64 representation of a png with the ella logo
B64_DATA = "iVBORw0KGgoAAAANSUhEUgAAAV4AAAEACAIAAAB0zYnKAAAAA3NCSVQICAjb4U/gAAAAEHRFWHRTb2Z0d2FyZQBTaHV0dGVyY4LQCQAACcJJREFUeNrt3d1y28YZgOFdAAQpyrGTNG0POp1eQ++ot9Jb7VHSmSZuaokE8bM9kKOxHdvDhWUKCz7PaeuIko1X+4HLRfz7P/4ZAN5X+REA0gBIAyANgDQA0gBIAyANgDQA0gBIAyANgDQA0gAgDYA0ANIASAMgDYA0ANIASAMgDYA0ANIASAMgDYA0AEgDIA2ANADSAEgDIA2ANADSAEgDIA2ANADSAEgDgDQA0gBIAyANgDQA0gBIAyANgDQA0gBIAyANgDQA0gAgDYA0ANIASAMgDYA0AM+r8SNYsHR68/r05nX+H4zNdr/77k9P+VLSdPzlp/F0nPFitt98v7l96a/TqgGQBkAaAGkApAFAGgBpAKQBkAZAGgBpAKQBkAZAGgBpAKQBkAZAGgCkAfiIEs6GTGmahoLrWzchRP/UlvAPqT/epbGf++dj3W7r9kYalmLsu+MvP4WQyvwJx5sf/lLVjuddQBmmsfv1P3H+P6Q4dJv9H64lDQYKrmXJMHSH+EW/YFIax7HvpAHWVIY0HO++fOExHO6u5CcmDVzJNDHMeojGh4buPqR0DT8xaeA6ponjfXyCe8EppGnoDtIAponfV+YqZgpp4BqGidMXvGf54X9tPB3SNEoDWDJ8aDjeSwMU34YnTsN1zBTSwMrDMJ6OIU1PPaH009Cv+wcnDax9mvgqOxHWv3CQBladhmkauq9wXyBJA5S8ZhifZjvDx/7T0/gke6ikAZ5hmugzfrfHUNUWDo9W+4nA9sX3m9uXLo/rXjP0U3+MZy4bYmxvX53u/hvO3bOQhu6+Td/FuM7fr1YNrLUMoT/ex7PHiZRCvd03WccxpDSud9O0NLDaaSJnwR/rTVvVdb3NS8OKP4gpDayzDFPfhfMPB4ux3t6EEOt2lzKO5Epjf0zjIA1QzJKhz7xH2LS7h0g0mUe8rXXTtDSwzjZkTRMhVtVmG0KIb5cPGV+oX+n7FNLAGrvQ3Wed9fbbkuFhstjlnNWS0jhM/UkaYPFlCJl3B99fKcRYP6wgzv+Cq1w4SAOrS8OYt08xpVA/rhpCiDE2ue9THO+KPfFcGrimuww5m6NjvWnju/sgY8i73bDSU+GkgZWVIXP/8kfuO8aqbkLVZH3V9W2algZWVYYZZ70170wTj71oMhcOY3dM0yQNsIolQ4ix+shNx+y3MEMIIQ3dnTTAYm805G1Aanb7T9yA2IZYf9UvLQ1woTCMp2NIOWc9x9hs95/+n/Jmiqk/TWMvDbCCaaL51BaGGGO922fPFCv6tJU0sBZT5juIMTafu/ivfaaQBtZylyFzc/SnbzTMnynWdCqcNHCt00S9qZr2s/+P/JkiTavZ4CANrCIN0zCeuqebJubOFCEN3WEdj9KWBlYxTeRtjg4hnJOGMwvyu0p1a7jjsNpjY09vfj69+fkiXyq2L7/f3HzjCn3GaaLPe2sgVk1T1ZszyhCb7X44/C+3U83u1qoBnn2YyDnr7WEtsD3z0o3Vps07hD6k8XRcwaO0pYHilwz5uwly7i9+ZlvUp63gZqQ0UHwb+rzdBLFqNlXdnF2G/DSsYoODNFB2GMbuEEPORx5jzLwRMGemmIZ+Gk7SAM82TeQevpZSyH7TYc5MUfwJDtJAyWmYxsznR8W63ca8JUCIc97CfJgpkjTAM6wZhtwHYc+4yB9miiZ7pih907Q0UPA0kfveREphxtsNM2eKwh97Jw2UO0z0acy61Rebdpc7TcyfKUIaTodyT4WTBgotQ8j+nTxzmnh3psjcPZzSWOymaWmg2Gki/yzGeruf/yWzP6Nd9mPvpIEiyzD2x5C3GTnW7U2sqi8ow5yZYupPU5mP0l7tx6uq9qbetBf5UrFuti7WSy8ZLjpNvD9TZH1e
I6TheNfevpKGxXxj7c3m9qWraK1tGLr7mHlh558f/9GZYj8cfs17qdIAF+rCMfust5DG+3//63le7jiOfVdvCltautdAaWUobw9ykZumpYHS0jCWtsvwYdN0aafCSQPFTRO5Z70tYaFT3qO0pYGyylDmJxpTeU/ElAZKusJmPAh7KWNQV9ipcNJASQvzkg9BKOzoJ2mgoFV5yQ+AKW0UkgbKWZOfjiFN5b7+aeinoZhpSBoo5ZfuCp5DXdLCQRoo5Xdu+U+FStIAT31RjblnvS3z25imUvZrSQMlXFKh4IMPPlj8lLJw8PEqiliID1N/jFnLhlhtX/7wxR/E/vxlPh5+/jFzn0Uauvs2fRfj0n8rWzVQwJqhP97F3HFixqFMuWKcc2xUSmMJm6algeWvGWbcvYtNexO+8s2JGONmNycNRbzVIg0sfjrvMx+E/fD7/GuOEu98mSbWuYeJpbE/psWfCicNLH3JMOcGZIxNe3OJ1zf3XLnlb5qWBpbehjnTxHYfLvJW58NztFPuWQwlnDTdrPdf1DQt6CN68fyntvPuX+PQHbLPepvz9NovmimqzTblPRc7pXGY+q5a8Klwq/332t+97u9eL2ZxVt/+8a8u9BnTxHB4M+dq3e4uV/0Ym91t/+aU+831x/vtgtNgoGDBaZjzRNmHJcMFN07G2Gxvso93e/tgqyQNkH+X4ZB/1tsTPG9ixrjY1O02+9ubxiWfCicNLLUM8z6nGKu63V06DXP3Pi35fQppYKnDxNDnn/X2sAPy4h/Devs+Rfb3OHbLfZS2NLDQNcO8G5CXnyYeFyuzVivLPU5WGljmfYYZi+34LNPE40zR7FY1U0gDSyzDeDqGlH3+8qXfm3i/DfWM9ynePkq7lwY4c8lQ0jTxGIdZT9xd6MJBGlieacaznh6mie2zpmH2THEnDXDGpdLlPwj7eaeJx2VDe5OyX0Na5lM8pYGFlWHmdoZnniZ+exXVrE98LnGmkAYWloZxGE9d/iVZP9d7Ex/MFPW8maJb3KO0fRyQhY0TKbUvXuVeklXTLuLlP+x9uv02+8MRsZqmcVEfzy0gDXW7u/3z31w01yHWm2294M8jnnGNV+2Lb1fwN2GgAKQBkAZAGgBpAKQBkAZAGgBpAKQBkAZAGgBpAKQBQBoAaQCkAZAGQBoAaQCkAZAGQBoAaQCkAZAGQBoAaQCQBkAaAGkApAGQBkAaAGkApAEoReNHsOhyN9vm5kX+n4v1pn3aVxJDrLf7WDcz/mC12firlAae8npsdvtmt1/Ga4nt7St/JQYKQBoApAGQBkAaAGkApAGQBkAaAGkApAGQBkAaAGkApAFAGgBpAKQBkAZAGgBpAKQBkAZAGgBpAKQBkAZAGgBpAJAGQBoAaQCkAZAGQBoAaQCkAZAGQBoAaQCkAZAGAGkApAGQBkAaAGkApAGQBkAagKX6PzSnD3o4MkT1AAAAAElFTkSuQmCC"
def test_attachments(session, client):
    # Create temporary file to upload
    f = tempfile.NamedTemporaryFile("wb", delete=False)
    f.write(base64.b64decode(B64_DATA))
    f.close()

    # Upload file (can't use client.post, because it dumps data to json)
    with client.app.test_client() as c:
        client.ensure_logged_in(c, "testuser1")
        r = c.post("/api/v1/attachments/upload/", data={"file": (open(f.name, "rb"), "ella.png")})
        assert r.status_code == 200

    attachment_obj = session.query(attachment.Attachment).one()
    assert attachment_obj.filename == "ella.png"
    assert attachment_obj.extension == "png"
    assert attachment_obj.mimetype == "image/png"

    # Download file
    r = client.get("/api/v1/attachments/" + str(attachment_obj.id))
    assert r.status_code == 200
    assert r.get_data() == base64.b64decode(B64_DATA)
| 144.129032 | 3,477 | 0.90376 | 254 | 4,468 | 15.838583 | 0.728346 | 0.016157 | 0.014169 | 0.010937 | 0.009943 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121735 | 0.040286 | 4,468 | 30 | 3,478 | 148.933333 | 0.816465 | 0.036482 | 0 | 0.105263 | 0 | 0.052632 | 0.826977 | 0.81186 | 0 | 1 | 0 | 0 | 0.315789 | 1 | 0.052632 | false | 0 | 0.157895 | 0 | 0.210526 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
58a1c6d89376547cd42d46099c2d6fe002215902 | 17,321 | py | Python | app/tests/api/api_v1/test_users.py | SanchithHegde/decrypto-api | a68b6a5285fba4eb556f375e18e87505677fa6dd | [
"MIT"
] | null | null | null | app/tests/api/api_v1/test_users.py | SanchithHegde/decrypto-api | a68b6a5285fba4eb556f375e18e87505677fa6dd | [
"MIT"
] | 2 | 2021-08-17T17:07:52.000Z | 2021-11-28T19:02:50.000Z | app/tests/api/api_v1/test_users.py | SanchithHegde/decrypto-api | a68b6a5285fba4eb556f375e18e87505677fa6dd | [
"MIT"
] | null | null | null | # pylint: disable=missing-function-docstring
# pylint: disable=missing-module-docstring
from datetime import datetime, timezone
from typing import Dict
import pytest
from httpx import AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession
from app import crud
from app.core.config import settings
from app.schemas.user import UserCreate, UserUpdate
from app.tests.utils.question import png_content_type
from app.tests.utils.question_order_item import create_random_question_order_item
from app.tests.utils.user import authentication_token_from_email, create_random_user
from app.tests.utils.utils import random_email, random_int, random_lower_string
pytestmark = pytest.mark.asyncio
event_running = pytest.mark.skipif(
    (
        settings.EVENT_START_TIME is not None
        and datetime.now(tz=timezone.utc) < settings.EVENT_START_TIME
    )
    or (
        settings.EVENT_END_TIME is not None
        and datetime.now(tz=timezone.utc) > settings.EVENT_END_TIME
    ),
    reason="Event not running",
)


async def test_get_users_superuser_me(
    client: AsyncClient, superuser_token_headers: Dict[str, str]
) -> None:
    response = await client.get(
        f"{settings.API_V1_STR}/users/me", headers=superuser_token_headers
    )
    current_user = response.json()
    assert current_user
    assert current_user["is_superuser"]
    assert current_user["email"] == settings.FIRST_SUPERUSER


async def test_get_users_normal_user_me(
    client: AsyncClient, normal_user_token_headers: Dict[str, str]
) -> None:
    response = await client.get(
        f"{settings.API_V1_STR}/users/me", headers=normal_user_token_headers
    )
    current_user = response.json()
    assert current_user
    assert current_user["is_superuser"] is False
    assert current_user["email"] == settings.EMAIL_TEST_USER


async def test_create_user_new_email(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    full_name = random_lower_string()
    email = random_email()
    username = random_lower_string()
    password = random_lower_string()
    data = {
        "full_name": full_name,
        "email": email,
        "username": username,
        "password": password,
    }
    response = await client.post(
        f"{settings.API_V1_STR}/users/",
        headers=superuser_token_headers,
        json=data,
    )
    assert 200 <= response.status_code < 300
    created_user = response.json()
    user = await crud.user.get_by_email(db_session, email=email)
    assert user
    assert user.email == created_user["email"]
    assert user.username == created_user["username"]


async def test_get_existing_user(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    user = await create_random_user(db_session)
    user_id = user.id
    response = await client.get(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=superuser_token_headers,
    )
    assert 200 <= response.status_code < 300
    api_user = response.json()
    assert user.email
    existing_user = await crud.user.get_by_email(db_session, email=user.email)
    assert existing_user
    assert existing_user.email == api_user["email"]
    assert existing_user.username == api_user["username"]


async def test_get_not_existing_user(
    client: AsyncClient, superuser_token_headers: Dict[str, str]
) -> None:
    user_id = -1
    response = await client.get(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=superuser_token_headers,
    )
    assert response.status_code == 404

async def test_get_current_user_normal_user(
    client: AsyncClient,
    normal_user_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    user = await crud.user.get_by_email(db_session, email=settings.EMAIL_TEST_USER)
    assert user
    user_id = user.id
    response = await client.get(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=normal_user_token_headers,
    )
    current_user = response.json()
    assert current_user
    assert current_user["is_superuser"] is False
    assert current_user["email"] == settings.EMAIL_TEST_USER


async def test_get_another_user_normal_user(
    client: AsyncClient,
    normal_user_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    user = await crud.user.get_by_email(db_session, email=settings.EMAIL_TEST_USER)
    assert user and user.id
    user_id = user.id - 1  # Any user ID other than current user
    response = await client.get(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=normal_user_token_headers,
    )
    assert response.status_code == 400


async def test_create_user_existing_username(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    full_name = random_lower_string()
    email = random_email()
    username = random_lower_string()
    password = random_lower_string()
    user_in = UserCreate(
        full_name=full_name, email=email, username=username, password=password
    )
    await crud.user.create(db_session, obj_in=user_in)
    data = {
        "full_name": full_name,
        "email": email,
        "username": username,
        "password": password,
    }
    response = await client.post(
        f"{settings.API_V1_STR}/users/",
        headers=superuser_token_headers,
        json=data,
    )
    created_user = response.json()
    assert response.status_code == 400
    assert "_id" not in created_user


async def test_create_user_by_normal_user(
    client: AsyncClient, normal_user_token_headers: Dict[str, str]
) -> None:
    username = random_email()
    password = random_lower_string()
    full_name = random_lower_string()
    data = {"email": username, "password": password, "full_name": full_name}
    response = await client.post(
        f"{settings.API_V1_STR}/users/",
        headers=normal_user_token_headers,
        json=data,
    )
    assert response.status_code == 400


async def test_retrieve_users(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    await create_random_user(db_session)
    await create_random_user(db_session)
    await create_random_user(db_session)
    response = await client.get(
        f"{settings.API_V1_STR}/users/", headers=superuser_token_headers
    )
    all_users = response.json()
    assert len(all_users) > 1
    for user in all_users:
        assert "email" in user


async def test_update_user_normal_user_me(
    client: AsyncClient, normal_user_token_headers: Dict[str, str]
) -> None:
    data = {
        "full_name": random_lower_string(),
        "email": random_email(),
        "username": random_lower_string(),
        "password": random_lower_string(),
    }
    response = await client.put(
        f"{settings.API_V1_STR}/users/me",
        headers=normal_user_token_headers,
        json=data,
    )
    current_user = response.json()
    assert current_user
    assert current_user["is_superuser"] is False
    assert current_user["email"] == data["email"]
    assert current_user["username"] == data["username"]
    assert current_user["full_name"] == data["full_name"]

@pytest.mark.skipif(
    not settings.USERS_OPEN_REGISTRATION, reason="Open user registration disabled"
)
async def test_create_user_open(client: AsyncClient) -> None:
    data = {
        "full_name": random_lower_string(),
        "email": random_email(),
        "username": random_lower_string(),
        "password": random_lower_string(),
    }
    response = await client.post(
        f"{settings.API_V1_STR}/users/open",
        json=data,
    )
    current_user = response.json()
    assert current_user
    assert current_user["is_superuser"] is False
    assert current_user["email"] == data["email"]
    assert current_user["username"] == data["username"]
    assert current_user["full_name"] == data["full_name"]


@pytest.mark.skipif(
    not settings.USERS_OPEN_REGISTRATION, reason="Open user registration disabled"
)
async def test_create_user_open_existing_username(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    full_name = random_lower_string()
    email = random_email()
    username = random_lower_string()
    password = random_lower_string()
    user_in = UserCreate(
        full_name=full_name, email=email, username=username, password=password
    )
    await crud.user.create(db_session, obj_in=user_in)
    data = {
        "full_name": full_name,
        "email": email,
        "username": username,
        "password": password,
    }
    response = await client.post(
        f"{settings.API_V1_STR}/users/open",
        json=data,
    )
    assert response.status_code == 400


async def test_update_user_existing_user(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    user = await create_random_user(db_session)
    data = {
        "full_name": random_lower_string(),
        "email": random_email(),
        "username": random_lower_string(),
        "is_superuser": True,
    }
    response = await client.put(
        f"{settings.API_V1_STR}/users/{user.id}",
        headers=superuser_token_headers,
        json=data,
    )
    api_user = response.json()
    assert api_user
    assert api_user["is_superuser"]
    assert api_user["full_name"] == data["full_name"]
    assert api_user["email"] == data["email"]
    assert api_user["username"] == data["username"]


async def test_update_user_not_existing_user(
    client: AsyncClient, superuser_token_headers: Dict[str, str]
) -> None:
    user_id = -1
    data = {
        "email": random_email(),
        "password": random_lower_string(),
        "full_name": random_lower_string(),
        "is_superuser": True,
    }
    response = await client.put(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=superuser_token_headers,
        json=data,
    )
    assert response.status_code == 404


async def test_delete_user_existing_user(
    client: AsyncClient,
    superuser_token_headers: Dict[str, str],
    db_session: AsyncSession,
) -> None:
    user = await create_random_user(db_session)
    user_id = user.id
    response = await client.delete(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=superuser_token_headers,
    )
    assert 200 <= response.status_code < 300


async def test_delete_user_not_existing_user(
    client: AsyncClient, superuser_token_headers: Dict[str, str]
) -> None:
    user_id = -1
    response = await client.delete(
        f"{settings.API_V1_STR}/users/{user_id}",
        headers=superuser_token_headers,
    )
    assert response.status_code == 404

@event_running
async def test_get_question(client: AsyncClient, db_session: AsyncSession) -> None:
    question_order_item = await create_random_question_order_item(db_session)
    question_number = question_order_item.question_number
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email  # Required for mypy
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    response = await client.get(
        f"{settings.API_V1_STR}/users/question", headers=normal_user_token_headers
    )
    assert 200 <= response.status_code < 300
    question_data = response.json()
    assert "content" in question_data
    assert "content_type" in question_data


@event_running
async def test_get_question_image(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    question_order_item = await create_random_question_order_item(db_session)
    question_number = question_order_item.question_number
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email  # Required for mypy
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    params = {"image": True}
    response = await client.get(
        f"{settings.API_V1_STR}/users/question",
        headers=normal_user_token_headers,
        params=params,
    )
    assert 200 <= response.status_code < 300
    content_type_header = "content-type"
    assert content_type_header in response.headers
    assert response.headers[content_type_header] == png_content_type()


@event_running
async def test_get_question_redirect_if_none(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    question_number = random_int()
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email  # Required for mypy
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    response = await client.get(
        f"{settings.API_V1_STR}/users/question",
        headers=normal_user_token_headers,
        follow_redirects=False,
    )
    assert response.status_code == 307


@event_running
async def test_get_question_redirect_if_none_allow_redirects(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    question_number = random_int()
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email  # Required for mypy
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    response = await client.get(
        f"{settings.API_V1_STR}/users/question",
        headers=normal_user_token_headers,
        follow_redirects=True,
    )
    assert 200 <= response.status_code < 300
    message_json = response.json()
    assert "message" in message_json

@event_running
async def test_verify_answer_correct_answer(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    question_order_item = await create_random_question_order_item(db_session)
    question_number = question_order_item.question_number
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    answer = question_order_item.question.answer
    data = {"answer": answer}
    response = await client.post(
        f"{settings.API_V1_STR}/users/answer",
        headers=normal_user_token_headers,
        json=data,
    )
    assert 200 <= response.status_code < 300
    assert user.id
    assert user.rank
    old_rank = user.rank
    updated_user = await crud.user.get(db_session, identifier=user.id)
    await db_session.refresh(updated_user)
    assert updated_user
    assert updated_user.question_number
    assert updated_user.rank
    assert question_number
    assert updated_user.question_number == question_number + 1
    assert updated_user.rank >= old_rank


@event_running
async def test_verify_answer_incorrect_answer(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    question_order_item = await create_random_question_order_item(db_session)
    question_number = question_order_item.question_number
    user = await create_random_user(db_session)
    user_in_update = UserUpdate(question_number=question_number)
    user = await crud.user.update(db_session, db_obj=user, obj_in=user_in_update)
    assert user.email
    normal_user_token_headers = await authentication_token_from_email(
        client=client, email=user.email, db_session=db_session
    )
    answer = random_lower_string()
    data = {"answer": answer}
    response = await client.post(
        f"{settings.API_V1_STR}/users/answer",
        headers=normal_user_token_headers,
        json=data,
    )
    assert response.status_code == 400
    assert user.id
    assert user.rank
    old_rank = user.rank
    unmodified_user = await crud.user.get(db_session, identifier=user.id)
    assert unmodified_user
    assert unmodified_user.question_number
    assert unmodified_user.rank
    assert question_number
    assert unmodified_user.question_number == question_number
    assert unmodified_user.rank == old_rank


async def test_retrieve_leaderboard(
    client: AsyncClient, db_session: AsyncSession
) -> None:
    await create_random_user(db_session)
    await create_random_user(db_session)
    await create_random_user(db_session)
    response = await client.get(f"{settings.API_V1_STR}/users/leaderboard")
    all_users = response.json()
    assert len(all_users) > 1
    for user in all_users:
        assert "question_number" in user
        assert "rank" in user
        assert "username" in user
| 30.820285 | 84 | 0.712026 | 2,236 | 17,321 | 5.192308 | 0.063953 | 0.048062 | 0.024806 | 0.028941 | 0.854091 | 0.809044 | 0.776572 | 0.764341 | 0.754005 | 0.751766 | 0 | 0.007177 | 0.195543 | 17,321 | 561 | 85 | 30.875223 | 0.826037 | 0.011027 | 0 | 0.683297 | 0 | 0 | 0.085679 | 0.047483 | 0 | 0 | 0 | 0 | 0.180043 | 1 | 0 | false | 0.0282 | 0.02603 | 0 | 0.02603 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
544437f67df7b9ad9232894bbb5cb67ab3ac9dae | 7,712 | py | Python | usaspending_api/references/tests/integration/test_city.py | g4brielvs/usaspending-api | bae7da2c204937ec1cdf75c052405b13145728d5 | [
"CC0-1.0"
] | 217 | 2016-11-03T17:09:53.000Z | 2022-03-10T04:17:54.000Z | usaspending_api/references/tests/integration/test_city.py | g4brielvs/usaspending-api | bae7da2c204937ec1cdf75c052405b13145728d5 | [
"CC0-1.0"
] | 622 | 2016-09-02T19:18:23.000Z | 2022-03-29T17:11:01.000Z | usaspending_api/references/tests/integration/test_city.py | g4brielvs/usaspending-api | bae7da2c204937ec1cdf75c052405b13145728d5 | [
"CC0-1.0"
] | 93 | 2016-09-07T20:28:57.000Z | 2022-02-25T00:25:27.000Z | import json
import pytest
from django.conf import settings
from model_mommy import mommy
@pytest.fixture
def award_data_fixture(db):
    mommy.make("awards.TransactionNormalized", id=1, award_id=1, action_date="2010-10-01", is_fpds=True, type="A")
    mommy.make(
        "awards.TransactionFPDS",
        transaction_id=1,
        legal_entity_zip5="abcde",
        legal_entity_city_name="ARLINGTON",
        legal_entity_state_code="VA",
        legal_entity_country_code="UNITED STATES",
        piid="IND12PB00323",
    )
    mommy.make("awards.Award", id=1, latest_transaction_id=1, is_fpds=True, type="A", piid="IND12PB00323")

    mommy.make("awards.TransactionNormalized", id=2, award_id=2, action_date="2011-11-11", is_fpds=True, type="A")
    mommy.make(
        "awards.TransactionFPDS",
        transaction_id=2,
        legal_entity_zip5="abcde",
        legal_entity_city_name="BRISTOL",
        legal_entity_state_code=None,
        legal_entity_country_code="GBR",
        piid="0001",
    )
    mommy.make("awards.Award", id=2, latest_transaction_id=2, is_fpds=True, type="A", piid="0001")

    mommy.make("awards.TransactionNormalized", id=3, award_id=3, action_date="2018-01-01", is_fpds=True, type="04")
    mommy.make(
        "awards.TransactionFPDS",
        transaction_id=3,
        legal_entity_zip5="abcde",
        legal_entity_city_name="PHILLIPSBURG",
        legal_entity_state_code="PA",
        piid="0002",
    )
    mommy.make("awards.Award", id=3, latest_transaction_id=3, is_fpds=True, type="04", piid="0002")

    mommy.make("awards.TransactionNormalized", id=4, award_id=4, action_date="2011-11-11", is_fpds=True, type="A")
    mommy.make(
        "awards.TransactionFPDS",
        transaction_id=4,
        legal_entity_zip5="abcde",
        legal_entity_city_name="BRISTOL",
        legal_entity_state_code="IL",
        legal_entity_country_code="USA",
        piid="0003",
    )
    mommy.make("awards.Award", id=4, latest_transaction_id=4, is_fpds=True, type="A", piid="0003")

    mommy.make("references.RefCountryCode", country_code="USA", country_name="UNITED STATES")
    mommy.make("references.RefCountryCode", country_code="GBR", country_name="UNITED KINGDOM")

def test_city_search_matches_found(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
    monkeypatch.setattr(
        "usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
        settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
    )
    elasticsearch_transaction_index.update_index()
    body = {"filter": {"country_code": "USA", "scope": "recipient_location"}, "search_text": "arli", "limit": 20}
    response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
    assert response.data["count"] == 1
    for entry in response.data["results"]:
        assert entry["city_name"].lower().find("arl") > -1

def test_city_search_no_matches(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
    monkeypatch.setattr(
        "usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
        settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
    )
    elasticsearch_transaction_index.update_index()
    body = {"filter": {"country_code": "USA", "scope": "recipient_location"}, "search_text": "bhqlg", "limit": 20}
    response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
    assert response.data["count"] == 0
    for entry in response.data["results"]:
        assert False  # this should never be reached

    body = {
        "filter": {"country_code": "USA", "scope": "recipient_location"},
        "search_text": "arlingtontownsburgplaceville",
        "limit": 20,
    }
    response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
    assert response.data["count"] == 0
    for entry in response.data["results"]:
        assert False  # this should never be reached

def test_city_search_special_characters(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
monkeypatch.setattr(
"usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
)
elasticsearch_transaction_index.update_index()
body = {
"filter": {"country_code": "USA", "scope": "recipient_location"},
"search_text": 'arli+|()[]{}?"<>\\', # Once special characters are stripped, this should just be 'arl'
"limit": 20,
}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 1
for entry in response.data["results"]:
assert entry["city_name"].lower().find("arl") > -1
def test_city_search_non_usa(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
monkeypatch.setattr(
"usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
)
elasticsearch_transaction_index.update_index()
body = {"filter": {"country_code": "GBR", "scope": "recipient_location"}, "search_text": "bri", "limit": 20}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 1
for entry in response.data["results"]:
assert entry["city_name"].lower().find("bri") > -1
body = {"filter": {"country_code": "USA", "scope": "recipient_location"}, "search_text": "bri", "limit": 20}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 1
for entry in response.data["results"]:
assert entry["city_name"].lower().find("bri") > -1
def test_city_search_foreign(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
monkeypatch.setattr(
"usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
)
elasticsearch_transaction_index.update_index()
body = {"filter": {"country_code": "FOREIGN", "scope": "recipient_location"}, "search_text": "bri", "limit": 20}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 1
for entry in response.data["results"]:
assert entry["city_name"].lower().find("bri") > -1
def test_city_search_nulls_are_usa(client, monkeypatch, award_data_fixture, elasticsearch_transaction_index):
monkeypatch.setattr(
"usaspending_api.common.elasticsearch.search_wrappers.TransactionSearch._index_name",
settings.ES_TRANSACTIONS_QUERY_ALIAS_PREFIX,
)
elasticsearch_transaction_index.update_index()
body = {"filter": {"country_code": "USA", "scope": "recipient_location"}, "search_text": "phil", "limit": 20}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 1
for entry in response.data["results"]:
assert entry["city_name"].lower().find("phil") > -1
body = {"filter": {"country_code": "FOREIGN", "scope": "recipient_location"}, "search_text": "phil", "limit": 20}
response = client.post("/api/v2/autocomplete/city", content_type="application/json", data=json.dumps(body))
assert response.data["count"] == 0
for entry in response.data["results"]:
assert False # this should never be reached
| 47.604938 | 117 | 0.700726 | 941 | 7,712 | 5.496281 | 0.141339 | 0.041763 | 0.034803 | 0.036543 | 0.87123 | 0.806845 | 0.770495 | 0.770495 | 0.755414 | 0.755414 | 0 | 0.021495 | 0.155472 | 7,712 | 161 | 118 | 47.900621 | 0.772609 | 0.01945 | 0 | 0.564286 | 0 | 0 | 0.287283 | 0.131666 | 0 | 0 | 0 | 0 | 0.128571 | 1 | 0.05 | false | 0 | 0.028571 | 0 | 0.078571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
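The `test_city_search_special_characters` case above relies on the API stripping Elasticsearch query-string special characters before searching. The sketch below is a hypothetical illustration of that kind of sanitisation (`strip_es_special_chars` and `ES_SPECIAL` are names invented here, not taken from the usaspending codebase):

```python
import re

# Characters reserved by the Elasticsearch query_string syntax;
# removing them keeps user input from being parsed as operators.
ES_SPECIAL = r'[+\-=&|><!(){}\[\]^"~*?:\\/]'


def strip_es_special_chars(text: str) -> str:
    """Drop Elasticsearch query-string operators from a search term."""
    return re.sub(ES_SPECIAL, "", text)


print(strip_es_special_chars('arli+|()[]{}?"<>\\'))  # arli
```

Escaping the characters instead of deleting them is the other common choice; deletion matches what the test expects (the noisy input still matches "arli").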
5454cb3dabb5f84922b68cf0b0f0fd25e21f293e | 120 | py | Python | app_python/addresses/impprivtkey.py | gtmadureira/bitcoin_address-generator | 90cd393d20913a381a783873ec6b6dd26310a232 | [
"MIT"
] | 2 | 2021-03-11T00:47:02.000Z | 2021-12-15T12:05:30.000Z | app_python/addresses/impprivtkey.py | gtmadureira/bitcoin_address-generator | 90cd393d20913a381a783873ec6b6dd26310a232 | [
"MIT"
] | null | null | null | app_python/addresses/impprivtkey.py | gtmadureira/bitcoin_address-generator | 90cd393d20913a381a783873ec6b6dd26310a232 | [
"MIT"
] | null | null | null | from bitcoinaddress import Wallet
# Load a wallet from a WIF-encoded private key (a real key should never be hardcoded)
wallet = Wallet('5HqrbgkWPqBy6dvCE7FoUiMuiCfFPRdtRsyi6NuCM2np8qBZxq5')
print(wallet)
| 24 | 70 | 0.866667 | 9 | 120 | 11.555556 | 0.666667 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063063 | 0.075 | 120 | 4 | 71 | 30 | 0.873874 | 0 | 0 | 0 | 0 | 0 | 0.425 | 0.425 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
54616c80d760e917bdd8b551932ca43cc38fe1cc | 26 | py | Python | bm/utiles/NpEncoder.py | kapousa/BrontoMind2 | a301595f321fde3330c991b7bc1f063b71de1c7b | [
"MIT"
] | null | null | null | bm/utiles/NpEncoder.py | kapousa/BrontoMind2 | a301595f321fde3330c991b7bc1f063b71de1c7b | [
"MIT"
] | null | null | null | bm/utiles/NpEncoder.py | kapousa/BrontoMind2 | a301595f321fde3330c991b7bc1f063b71de1c7b | [
"MIT"
] | null | null | null | import numpy
import json
| 6.5 | 12 | 0.807692 | 4 | 26 | 5.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192308 | 26 | 3 | 13 | 8.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
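The `NpEncoder.py` record above contains only the two imports; the class the filename promises is missing from the dump. For reference, the common shape of such an encoder — an assumption here, not the repo's actual code — is a `json.JSONEncoder` subclass that downcasts NumPy types:

```python
import json

import numpy


class NpEncoder(json.JSONEncoder):
    """JSON encoder that converts NumPy scalars and arrays to plain Python types."""

    def default(self, obj):
        if isinstance(obj, numpy.integer):
            return int(obj)
        if isinstance(obj, numpy.floating):
            return float(obj)
        if isinstance(obj, numpy.ndarray):
            return obj.tolist()
        # Fall back to the base class, which raises TypeError for unknown types
        return super().default(obj)


print(json.dumps({"n": numpy.int64(3), "x": numpy.array([1.0, 2.0])}, cls=NpEncoder))
```

Without the `cls=NpEncoder` argument, `json.dumps` raises `TypeError` on `numpy.int64` and `numpy.ndarray` values.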
b73780ba372a77d31e204b61edce4d35e3cb481d | 75 | py | Python | pycybos/cputil/__init__.py | sharebook-kr/pycybos | 7d125f9f54ed58b723b4a334fb1f4bad6dd7badc | [
"Apache-2.0"
] | 2 | 2018-06-01T12:43:21.000Z | 2019-06-01T12:13:43.000Z | pycybos/cputil/__init__.py | sharebook-kr/pycybos | 7d125f9f54ed58b723b4a334fb1f4bad6dd7badc | [
"Apache-2.0"
] | null | null | null | pycybos/cputil/__init__.py | sharebook-kr/pycybos | 7d125f9f54ed58b723b4a334fb1f4bad6dd7badc | [
"Apache-2.0"
] | 1 | 2021-08-17T16:09:08.000Z | 2021-08-17T16:09:08.000Z | from .cpcybos import *
from .cpcodemgr import *
from .cpstockcode import *
| 18.75 | 26 | 0.76 | 9 | 75 | 6.333333 | 0.555556 | 0.350877 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 75 | 3 | 27 | 25 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3fa63641c95def68379a7857a41c113f818bdb9b | 91 | py | Python | upy/bin/upy_project_set/project/project/django_wsgi.py | 20tab/upy | 9b15c01afb46cf0082076fb0567f23f97d4099cb | [
"BSD-3-Clause"
] | 2 | 2021-05-04T07:52:39.000Z | 2021-07-11T22:19:32.000Z | upy/bin/upy_project_set/project/project/django_wsgi.py | 20tab/upy | 9b15c01afb46cf0082076fb0567f23f97d4099cb | [
"BSD-3-Clause"
] | 1 | 2017-11-03T10:17:23.000Z | 2017-11-03T10:17:23.000Z | upy/bin/upy_project_set/project/project/django_wsgi.py | 20tab/upy | 9b15c01afb46cf0082076fb0567f23f97d4099cb | [
"BSD-3-Clause"
] | null | null | null | import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()
| 22.75 | 54 | 0.791209 | 11 | 91 | 6.545455 | 0.636364 | 0.277778 | 0.5 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098901 | 91 | 3 | 55 | 30.333333 | 0.878049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
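The `django_wsgi.py` module above instantiates `django.core.handlers.wsgi.WSGIHandler()` directly, a pattern from before Django 1.4. Since Django 1.4 the documented entry point is `get_wsgi_application()`; a sketch of the equivalent module follows (the settings path `project.settings` is an assumed placeholder, not taken from this repo):

```python
import os

from django.core.wsgi import get_wsgi_application

# "project.settings" is a placeholder; point this at the project's real settings module.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

application = get_wsgi_application()
```

`get_wsgi_application()` also runs `django.setup()`, so apps are loaded before the first request.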
3fba533891cb9d7485844a897c475874699eed4b | 367 | py | Python | ex7.py | nguyennam9696/Learn_Python_The_Hard_Way | 402ffad8d8dc80f0c1f541d8e3d69980268bb559 | [
"MIT"
] | null | null | null | ex7.py | nguyennam9696/Learn_Python_The_Hard_Way | 402ffad8d8dc80f0c1f541d8e3d69980268bb559 | [
"MIT"
] | null | null | null | ex7.py | nguyennam9696/Learn_Python_The_Hard_Way | 402ffad8d8dc80f0c1f541d8e3d69980268bb559 | [
"MIT"
] | null | null | null | formatter = "%r %r %r %r"
non_r = "%s %s %s %s"
print(formatter % (2, 2, 2, 2))
print(formatter % (True, 2, 2, False))
print(formatter % (formatter, formatter, formatter, formatter))
print(formatter % ("Check this out", "Formatters", "Like percent r", "Gives you Raw Data"))
print(non_r % ("Check this out", "Formatters", "Like percent s", "Does NOT give you Raw Data")) | 45.875 | 95 | 0.66485 | 58 | 367 | 4.172414 | 0.344828 | 0.231405 | 0.334711 | 0.297521 | 0.272727 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0.019737 | 0.171662 | 367 | 8 | 95 | 45.875 | 0.776316 | 0 | 0 | 0 | 0 | 0 | 0.388587 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.714286 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6
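The exercise hinges on the difference between the two format codes: `%r` inserts `repr()` of the value (quotes and escape sequences visible), while `%s` inserts `str()` (the human-readable form). A quick check:

```python
value = "hi\n"

# %r renders the value as Python source: quotes kept, newline shown as \n
print("%r" % value)  # 'hi\n'

# %s renders the human-readable form: the newline is actually printed
print("%s" % value)
```

This is why the first test string prints with quotes around it under `formatter` but without them under `non_r`.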
3fe4b51a3f7843240880ac703bd4bcbb3daae608 | 1,871 | py | Python | pynames/tests/test_name.py | FrBrGeorge/pynames | 52773059846cad714d0107f65dcccc9b4ef234f9 | [
"BSD-3-Clause"
] | 19 | 2015-03-28T08:57:04.000Z | 2016-06-18T07:09:04.000Z | pynames/tests/test_name.py | FrBrGeorge/pynames | 52773059846cad714d0107f65dcccc9b4ef234f9 | [
"BSD-3-Clause"
] | 10 | 2015-02-10T15:38:16.000Z | 2016-06-28T04:55:20.000Z | pynames/tests/test_name.py | FrBrGeorge/pynames | 52773059846cad714d0107f65dcccc9b4ef234f9 | [
"BSD-3-Clause"
] | 6 | 2015-02-09T17:41:40.000Z | 2016-06-17T07:19:17.000Z | # coding: utf-8
from __future__ import unicode_literals

import unittest

from pynames.relations import GENDER, LANGUAGE
from pynames.names import Name


class TestName(unittest.TestCase):

    def test_base(self):
        name = Name('ru', {'genders': {'m': {'ru': 'ru_name'}}})
        self.assertEqual(str(name), 'ru_name')
        self.assertEqual(name.get_for(GENDER.MALE, LANGUAGE.RU), 'ru_name')
        self.assertEqual(name.get_for(GENDER.MALE), 'ru_name')
        self.assertEqual(name.get_forms_for(GENDER.MALE), None)

    def test_genders(self):
        name = Name('ru', {'genders': {'m': {'ru': 'ru_m_name'},
                                       'f': {'ru': 'ru_f_name'}}})
        self.assertEqual(str(name), 'ru_m_name')
        self.assertEqual(name.get_for(GENDER.MALE, LANGUAGE.RU), 'ru_m_name')
        self.assertEqual(name.get_for(GENDER.FEMALE, LANGUAGE.RU), 'ru_f_name')

    def test_languages(self):
        name = Name('ru', {'genders': {'m': {'ru': 'ru_m_name',
                                             'en': 'en_m_name'},
                                       'f': {'ru': 'ru_f_name',
                                             'en': 'en_f_name'}}})
        self.assertEqual(str(name), 'ru_m_name')
        self.assertEqual(name.get_for(GENDER.MALE, LANGUAGE.RU), 'ru_m_name')
        self.assertEqual(name.get_for(GENDER.FEMALE, LANGUAGE.RU), 'ru_f_name')
        self.assertEqual(name.get_for(GENDER.MALE, LANGUAGE.EN), 'en_m_name')
        self.assertEqual(name.get_for(GENDER.FEMALE, LANGUAGE.EN), 'en_f_name')
        self.assertEqual(name.get_for(GENDER.MALE), 'ru_m_name')
        self.assertEqual(name.get_for(GENDER.FEMALE), 'ru_f_name')

    def test_forms(self):
        name = Name('ru', {'genders': {'m': {'ru': ['form_1', 'form_2']}}})
        self.assertEqual(name.get_forms_for(GENDER.MALE), ['form_1', 'form_2'])
| 42.522727 | 79 | 0.591128 | 248 | 1,871 | 4.205645 | 0.157258 | 0.215724 | 0.255034 | 0.253116 | 0.788111 | 0.767977 | 0.731544 | 0.689358 | 0.587728 | 0.466922 | 0 | 0.003521 | 0.241048 | 1,871 | 43 | 80 | 43.511628 | 0.730986 | 0.006948 | 0 | 0.1875 | 0 | 0 | 0.136853 | 0 | 0 | 0 | 0 | 0 | 0.46875 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.28125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3fe72c2ebc32b2947950dc4f6194b9dea88e4ed8 | 4,031 | py | Python | PCV.py | eduardohenc/Problema_Caixeiro_Viajante | 3fd4ebaa654315ee3383d4b00409e43479910345 | [
"MIT"
] | null | null | null | PCV.py | eduardohenc/Problema_Caixeiro_Viajante | 3fd4ebaa654315ee3383d4b00409e43479910345 | [
"MIT"
] | null | null | null | PCV.py | eduardohenc/Problema_Caixeiro_Viajante | 3fd4ebaa654315ee3383d4b00409e43479910345 | [
"MIT"
] | null | null | null | # Square matrix with 11 cities and the distances between them
distancias = [
    [0, 29, 20, 21, 16, 31, 100, 12, 4, 31, 18],
    [29, 0, 15, 29, 28, 40, 72, 21, 29, 41, 12],
    [20, 15, 0, 15, 14, 25, 81, 9, 23, 27, 13],
    [21, 29, 15, 0, 4, 12, 92, 12, 25, 13, 25],
    [16, 28, 14, 4, 0, 16, 94, 9, 20, 16, 22],
    [31, 40, 25, 12, 16, 0, 95, 24, 36, 3, 37],
    [100, 72, 81, 92, 94, 95, 0, 90, 101, 99, 84],
    [12, 21, 9, 12, 9, 24, 90, 0, 15, 25, 13],
    [4, 29, 23, 25, 20, 36, 101, 15, 0, 35, 18],
    [31, 41, 27, 13, 16, 3, 99, 25, 35, 0, 38],
    [18, 12, 13, 25, 22, 37, 84, 13, 18, 38, 0]
]
##### Farthest-neighbor insertion #####
def vizinho_mais_distante(distancias, inicial=0):
    # Initialize the route with the starting city and the total distance with zero
    atual = inicial
    rota = [atual]
    distancia_total = 0
    # Repeat for the number of cities - 1
    # (the first city has already been added to the route)
    for _ in range(len(distancias) - 1):
        # Take the matrix row holding the neighbors of the current city
        vizinhos = distancias[atual]
        # Initialize the greatest distance found so far with zero
        melhor_vizinho = 0
        melhor_distancia = 0
        # For each city neighboring the current one
        for vizinho in range(len(vizinhos)):
            # Distance from the current city to this neighbor
            distancia = vizinhos[vizinho]
            # If the city has not been visited yet,
            # its distance is greater than zero and
            # greater than the largest distance so far,
            # then it becomes the farthest city from the current one
            if vizinho not in rota and distancia > 0 and distancia > melhor_distancia:
                melhor_vizinho = vizinho
                melhor_distancia = distancia
        # Move to the farthest neighbor:
        # add it to the route and increase the total distance
        atual = melhor_vizinho
        rota.append(atual)
        distancia_total += melhor_distancia
    # Finally, connect the last city back to the first one
    rota.append(inicial)
    distancia_total += distancias[atual][inicial]
    return (distancia_total, rota)


print(vizinho_mais_distante(distancias))
##### Average-distance neighbor insertion #####
def vizinho_distancia_media(distancias, inicial=0):
    # Initialize the route with the starting city and the total distance with zero
    atual = inicial
    rota = [atual]
    distancia_total = 0
    # Repeat for the number of cities - 1
    # (the first city has already been added to the route)
    for _ in range(len(distancias) - 1):
        # Take the matrix row holding the neighbors of the current city
        vizinhos = distancias[atual]
        # Average distance to the cities not yet visited, via sum() and len()
        pendentes = [vizinhos[v] for v in range(len(vizinhos)) if v not in rota and vizinhos[v] > 0]
        media = sum(pendentes) / len(pendentes)
        melhor_vizinho = 0
        melhor_distancia = 0
        melhor_desvio = None
        # For each city neighboring the current one
        for vizinho in range(len(vizinhos)):
            # Distance from the current city to this neighbor
            distancia = vizinhos[vizinho]
            # If the city has not been visited yet and its distance is
            # greater than zero, keep the city whose distance is the
            # closest to the average computed above
            if vizinho not in rota and distancia > 0:
                desvio = abs(distancia - media)
                if melhor_desvio is None or desvio < melhor_desvio:
                    melhor_desvio = desvio
                    melhor_vizinho = vizinho
                    melhor_distancia = distancia
        # Move to the chosen neighbor:
        # add it to the route and increase the total distance
        atual = melhor_vizinho
        rota.append(atual)
        distancia_total += melhor_distancia
    # Finally, connect the last city back to the first one
    rota.append(inicial)
    distancia_total += distancias[atual][inicial]
    return (distancia_total, rota)


print(vizinho_distancia_media(distancias))
| 34.161017 | 86 | 0.621434 | 566 | 4,031 | 4.369258 | 0.20318 | 0.028306 | 0.038415 | 0.02588 | 0.820057 | 0.820057 | 0.820057 | 0.820057 | 0.820057 | 0.820057 | 0 | 0.08534 | 0.299429 | 4,031 | 117 | 87 | 34.452991 | 0.790368 | 0.390722 | 0 | 0.679245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037736 | false | 0 | 0 | 0 | 0.075472 | 0.037736 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
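Both functions in `PCV.py` share the same greedy loop and differ only in which unvisited city they pick next. The classic baseline for this family of heuristics, nearest-neighbor insertion, is not in the file; a sketch with the same calling convention (`vizinho_mais_proximo` is a name invented here):

```python
def vizinho_mais_proximo(distancias, inicial=0):
    """Greedy nearest-neighbor heuristic: always move to the closest unvisited city."""
    atual = inicial
    rota = [atual]
    distancia_total = 0
    for _ in range(len(distancias) - 1):
        vizinhos = distancias[atual]
        melhor_vizinho = None
        melhor_distancia = None
        for vizinho in range(len(vizinhos)):
            distancia = vizinhos[vizinho]
            # Keep the unvisited city with the smallest positive distance
            if vizinho not in rota and distancia > 0:
                if melhor_distancia is None or distancia < melhor_distancia:
                    melhor_vizinho = vizinho
                    melhor_distancia = distancia
        atual = melhor_vizinho
        rota.append(atual)
        distancia_total += melhor_distancia
    # Close the tour back at the starting city
    rota.append(inicial)
    distancia_total += distancias[atual][inicial]
    return (distancia_total, rota)


# Small 4-city example: route 0 -> 1 -> 3 -> 2 -> 0, total 2 + 4 + 12 + 15 = 33
exemplo = [
    [0, 2, 9, 10],
    [1, 0, 6, 4],
    [15, 7, 0, 8],
    [6, 3, 12, 0],
]
print(vizinho_mais_proximo(exemplo))  # (33, [0, 1, 3, 2, 0])
```

Like the farthest-neighbor variant, this gives no optimality guarantee; it is only a fast construction heuristic.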
3fecf37f6817f98a47b51f23ee737f258bbdc5f2 | 65 | py | Python | spektrixpython/__init__.py | hughtopping/spektrix-python | 4a2c081d1a5d85e93cd6de57420f259bdcc05bf9 | [
"MIT"
] | 3 | 2017-02-10T23:36:16.000Z | 2018-05-10T06:03:29.000Z | spektrixpython/__init__.py | hughtopping/spektrix-python | 4a2c081d1a5d85e93cd6de57420f259bdcc05bf9 | [
"MIT"
] | 15 | 2019-07-22T06:03:38.000Z | 2020-06-18T05:44:21.000Z | spektrixpython/__init__.py | hughtopping/spektrixpython | 4a2c081d1a5d85e93cd6de57420f259bdcc05bf9 | [
"MIT"
] | 2 | 2020-11-26T11:14:03.000Z | 2021-04-07T18:36:28.000Z | from .spektrixpython import SpektrixCredentials, SpektrixRequest
| 32.5 | 64 | 0.892308 | 5 | 65 | 11.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 65 | 1 | 65 | 65 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b20781afb88227313a7d0f42005ee1a144462b97 | 149 | py | Python | gym_modular/wrappers/__init__.py | TimSchneider42/mbpo | 736ba90bbdaddb2a40a6233bc0b78da72235100a | [
"MIT"
] | null | null | null | gym_modular/wrappers/__init__.py | TimSchneider42/mbpo | 736ba90bbdaddb2a40a6233bc0b78da72235100a | [
"MIT"
] | null | null | null | gym_modular/wrappers/__init__.py | TimSchneider42/mbpo | 736ba90bbdaddb2a40a6233bc0b78da72235100a | [
"MIT"
] | null | null | null | from .flatten_wrapper import FlattenWrapper
from .flatten_wrapper_goal_env import FlattenWrapperGoalEnv
from .goal_env_wrapper import GoalEnvWrapper
| 37.25 | 59 | 0.899329 | 18 | 149 | 7.111111 | 0.5 | 0.171875 | 0.28125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080537 | 149 | 3 | 60 | 49.666667 | 0.934307 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b79709cd4c312558d450061bbc96babd4ef4c5d8 | 1,998 | py | Python | app/views/edit.py | Vinatorul/cf-challange | 4118b650936ab775e0aefd4405291fda4480426a | [
"MIT"
] | 4 | 2019-11-25T19:41:44.000Z | 2019-12-18T06:28:54.000Z | app/views/edit.py | Vinatorul/cf-challange | 4118b650936ab775e0aefd4405291fda4480426a | [
"MIT"
] | 1 | 2021-06-02T00:46:41.000Z | 2021-06-02T00:46:41.000Z | app/views/edit.py | Vinatorul/cf-challange | 4118b650936ab775e0aefd4405291fda4480426a | [
"MIT"
] | null | null | null | from models.contests import Contests
from models.users import Users
from app import *
@app.route('/deleteContest', methods=['GET'])
def deleteContest():
    if current_user.role == "User":
        return redirect(url_for('index'))
    id = request.args.get('id')
    try:
        db.session.delete(Contests.query.filter_by(id=id).first())
        db.session.commit()
        return redirect(url_for('index'))
    except:
        flash('Error', 'danger')
        return redirect(url_for('index'))


@app.route('/deleteUser', methods=['GET'])
def deleteUser():
    if current_user.role == "User":
        return redirect(url_for('index'))
    id = request.args.get('id')
    try:
        db.session.delete(Users.query.filter_by(id=id).first())
        db.session.commit()
        return redirect(url_for('users'))
    except:
        flash('Error', 'danger')
        return redirect(url_for('users'))


@app.route('/addAdmin', methods=['GET'])
def addAdmin():
    if current_user.role == "User":
        return redirect(url_for('index'))
    login = request.args.get('login')
    try:
        admin = Users.query.filter_by(login=login).first()
        admin.role = 'Admin'
        db.session.add(admin)
        db.session.commit()
        flash('Success', 'success')
        return redirect(url_for('profile', login=login))
    except:
        flash('Error', 'danger')
        return redirect(url_for('profile', login=login))


@app.route('/deleteAdmin', methods=['GET'])
def deleteAdmin():
    if current_user.role == "User":
        return redirect(url_for('index'))
    login = request.args.get('login')
    try:
        admin = Users.query.filter_by(login=login).first()
        admin.role = 'User'
        db.session.add(admin)
        db.session.commit()
        flash('Success', 'success')
        return redirect(url_for('profile', login=login))
    except:
        flash('Error', 'danger')
        return redirect(url_for('profile', login=login)) | 31.21875 | 66 | 0.601101 | 237 | 1,998 | 4.983122 | 0.189873 | 0.142252 | 0.172735 | 0.203218 | 0.779001 | 0.762066 | 0.762066 | 0.762066 | 0.689246 | 0.689246 | 0 | 0 | 0.237237 | 1,998 | 64 | 67 | 31.21875 | 0.774934 | 0 | 0 | 0.736842 | 0 | 0 | 0.12056 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070175 | false | 0 | 0.052632 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6
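All four views in `edit.py` open with the same `current_user.role == "User"` guard. That repetition can be factored into a decorator; the sketch below is framework-free (`role`, `admin_required`, and the string return values are stand-ins for `current_user.role` and Flask's `redirect(url_for('index'))`, not the app's real objects):

```python
from functools import wraps

# Placeholder for current_user.role; the real app would read it from flask_login.
role = "User"


def admin_required(view):
    @wraps(view)
    def wrapper(*args, **kwargs):
        if role == "User":           # the check the four views repeat
            return "redirect:index"  # placeholder for redirect(url_for('index'))
        return view(*args, **kwargs)
    return wrapper


@admin_required
def delete_contest():
    return "deleted"


denied = delete_contest()   # role is "User", so the guard short-circuits
role = "Admin"
allowed = delete_contest()  # now the view body runs
print(denied, allowed)  # redirect:index deleted
```

With Flask this would typically wrap `redirect(url_for('index'))` inside the decorator, so each view keeps only its own logic.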
4d0320a9bf5819cad3e7639af87e6308d1697d1f | 4,460 | py | Python | tests/test_detection.py | trickeydan/zoloto | d82e2f442d8ac1e5f4203c1299e88e09c5e34071 | [
"BSD-3-Clause"
] | null | null | null | tests/test_detection.py | trickeydan/zoloto | d82e2f442d8ac1e5f4203c1299e88e09c5e34071 | [
"BSD-3-Clause"
] | null | null | null | tests/test_detection.py | trickeydan/zoloto | d82e2f442d8ac1e5f4203c1299e88e09c5e34071 | [
"BSD-3-Clause"
] | null | null | null | import operator
from typing import Any

import pytest

from tests.conftest import IMAGE_DATA, TEST_IMAGE_DIR, get_calibration
from zoloto.cameras.file import ImageFileCamera as BaseImageFileCamera
from zoloto.marker_type import MarkerType


def test_has_data_for_all_images() -> None:
    assert len(IMAGE_DATA) == len(list(TEST_IMAGE_DIR.glob("*.jpg")))
    for filename in IMAGE_DATA.keys():
        assert TEST_IMAGE_DIR.joinpath(filename).exists()


@pytest.mark.parametrize("filename", IMAGE_DATA.keys())
def test_detects_marker_ids(filename: str, snapshot: Any) -> None:
    class ImageFileCamera(BaseImageFileCamera):
        marker_type = MarkerType.DICT_APRILTAG_36H11

        def get_marker_size(self, marker_id: int) -> int:
            return 100

    camera = ImageFileCamera(TEST_IMAGE_DIR.joinpath(filename))
    snapshot.assert_match(sorted(camera.get_visible_markers()))


@pytest.mark.parametrize("filename", IMAGE_DATA.keys())
def test_annotates_frame(filename: str, temp_image_file: Any) -> None:
    class ImageFileCamera(BaseImageFileCamera):
        marker_type = MarkerType.DICT_APRILTAG_36H11

        def get_marker_size(self, marker_id: int) -> int:
            return 100

    camera = ImageFileCamera(TEST_IMAGE_DIR.joinpath(filename))
    camera.save_frame(temp_image_file, annotate=True)


@pytest.mark.parametrize("filename", IMAGE_DATA.keys())
def test_gets_markers(filename: str, snapshot: Any) -> None:
    class TestCamera(BaseImageFileCamera):
        marker_type = MarkerType.DICT_APRILTAG_36H11

        def get_marker_size(self, marker_id: int) -> int:
            return 100

    camera = TestCamera(TEST_IMAGE_DIR.joinpath(filename))
    snapshot.assert_match(
        sorted(
            (
                {
                    "id": marker.id,
                    "size": marker.size,
                    "pixel_corners": [list(coords) for coords in marker.pixel_corners],
                    "pixel_centre": list(marker.pixel_centre),
                }
                for marker in camera.process_frame()
            ),
            key=operator.itemgetter("pixel_centre"),
        )
    )


@pytest.mark.parametrize("filename,camera_name", IMAGE_DATA.items())
def test_gets_markers_eager(filename: str, camera_name: str, snapshot: Any) -> None:
    class TestCamera(BaseImageFileCamera):
        marker_type = MarkerType.DICT_APRILTAG_36H11

        def get_marker_size(self, marker_id: int) -> int:
            return 100

    camera = TestCamera(
        TEST_IMAGE_DIR.joinpath(filename),
        calibration_file=get_calibration(camera_name),
    )
    snapshot.assert_match(
        sorted(
            (
                {
                    "id": marker.id,
                    "size": marker.size,
                    "pixel_corners": [list(coords) for coords in marker.pixel_corners],
                    "pixel_centre": list(marker.pixel_centre),
                    "distance": marker.distance,
                    "orientation": tuple(marker.orientation),
                    "spherical": tuple(marker.spherical),
                    "cartesian": list(marker.cartesian),
                }
                for marker in camera.process_frame_eager()
            ),
            key=operator.itemgetter("pixel_centre"),
        )
    )


@pytest.mark.parametrize("filename,camera_name", IMAGE_DATA.items())
def test_gets_markers_with_calibration(
    filename: str, camera_name: str, snapshot: Any
) -> None:
    class TestCamera(BaseImageFileCamera):
        marker_type = MarkerType.DICT_APRILTAG_36H11

        def get_marker_size(self, marker_id: int) -> int:
            return 100

    camera = TestCamera(
        TEST_IMAGE_DIR.joinpath(filename),
        calibration_file=get_calibration(camera_name),
    )
    snapshot.assert_match(
        sorted(
            (
                {
                    "id": marker.id,
                    "size": marker.size,
                    "pixel_corners": [list(coords) for coords in marker.pixel_corners],
                    "pixel_centre": list(marker.pixel_centre),
                    "distance": marker.distance,
                    "orientation": tuple(marker.orientation),
                    "spherical": tuple(marker.spherical),
                    "cartesian": list(marker.cartesian),
                }
                for marker in camera.process_frame()
            ),
            key=operator.itemgetter("pixel_centre"),
        )
    )
| 34.307692 | 87 | 0.61704 | 466 | 4,460 | 5.658798 | 0.180258 | 0.037543 | 0.036405 | 0.045506 | 0.822905 | 0.812287 | 0.802048 | 0.802048 | 0.802048 | 0.736822 | 0 | 0.010955 | 0.283632 | 4,460 | 129 | 88 | 34.573643 | 0.814398 | 0 | 0 | 0.638095 | 0 | 0 | 0.060987 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 1 | 0.104762 | false | 0 | 0.057143 | 0.047619 | 0.304762 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
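The zoloto tests sort marker dictionaries with `key=operator.itemgetter("pixel_centre")`. `itemgetter` works on any subscriptable object, dicts included, which gives the snapshots a deterministic order; a minimal standalone illustration:

```python
import operator

# Two fake marker dicts, deliberately out of order by centre position.
rows = [{"pixel_centre": [5.0, 5.0]}, {"pixel_centre": [1.0, 2.0]}]

# itemgetter("pixel_centre") looks each dict up by key; the resulting
# lists compare element-wise, so [1.0, 2.0] sorts before [5.0, 5.0].
ordered = sorted(rows, key=operator.itemgetter("pixel_centre"))
print(ordered[0]["pixel_centre"])  # [1.0, 2.0]
```

Using `operator.itemgetter` instead of a lambda keeps the sort key cheap and self-describing.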
4d26ea6b0e48a6eac9c50a514320e8c00096575d | 48 | py | Python | tasks/simple/__init__.py | RobertCsordas/tcf | da20530dfb4336deddfbe5e79d62e72d1dc2580e | [
"MIT"
] | 5 | 2021-12-14T21:38:08.000Z | 2022-03-10T06:56:57.000Z | tasks/simple/__init__.py | RobertCsordas/tcf | da20530dfb4336deddfbe5e79d62e72d1dc2580e | [
"MIT"
] | null | null | null | tasks/simple/__init__.py | RobertCsordas/tcf | da20530dfb4336deddfbe5e79d62e72d1dc2580e | [
"MIT"
] | 1 | 2021-10-19T00:03:47.000Z | 2021-10-19T00:03:47.000Z | from .. import task_db
task_db.register_files()
| 16 | 24 | 0.791667 | 8 | 48 | 4.375 | 0.75 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104167 | 48 | 2 | 25 | 24 | 0.813953 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
4de1b2f846cf1af147e3b1a4d530afc081833a56 | 30 | py | Python | backend/__init__.py | RohanJnr/YNB-site-backend | b1ff5c128aa34da24a18a91889ab897809c2507a | [
"MIT"
] | 1 | 2021-05-13T18:34:25.000Z | 2021-05-13T18:34:25.000Z | backend/__init__.py | RohanJnr/YNB-site-backend | b1ff5c128aa34da24a18a91889ab897809c2507a | [
"MIT"
] | null | null | null | backend/__init__.py | RohanJnr/YNB-site-backend | b1ff5c128aa34da24a18a91889ab897809c2507a | [
"MIT"
] | null | null | null | from backend.main import app
| 15 | 29 | 0.8 | 5 | 30 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 30 | 1 | 30 | 30 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
127aef0f4119966b607a8bfb41b70e40509b83ef | 17,765 | py | Python | elib_miz/dummy_miz.py | theendsofinvention/elib_miz | 13111076ddb63b2582fe61b96b44998343b82464 | [
"MIT"
] | null | null | null | elib_miz/dummy_miz.py | theendsofinvention/elib_miz | 13111076ddb63b2582fe61b96b44998343b82464 | [
"MIT"
] | 6 | 2018-10-23T05:16:58.000Z | 2019-09-30T18:05:07.000Z | elib_miz/dummy_miz.py | etcher-be/elib_miz | 13111076ddb63b2582fe61b96b44998343b82464 | [
"MIT"
] | 1 | 2018-04-01T16:02:13.000Z | 2018-04-01T16:02:13.000Z | # coding=utf-8
"""
Empty Miz file (used for zip injection)
"""
# pylint: skip-file
# noinspection PyPep8,SpellCheckingInspection
dummy_miz = b'PK\x03\x04\x14\x00\x02\x00\x08\x00\x00\x00 ' \
b'\x00\xdd-_\xaai\x07\x00\x00\xe20\x00\x00\x07\x00\x00\x00mission\xed[' \
b'\xc1n\xdb8\x10\xbd\xe7+\x0c\x9f]@\xa4(' \
b'\xc9>\xec\xa1I\xd1"\x08\x9a\x16I\xba\xbbEQ\x18\xb4\xc4\xd8\x84e\xc9%\xa5\xa4N\xd1\x7f_\x89\x92\x1c\x9a' \
b'\x1a[\xb4\xd6\x08rh\x8e\xc3!\xf9f8\xf3\xc8\x199+.%O\x93\xc1_\x83\xb3_g\x83\xe2\xef\xdbP\xb0\x1f9\x17,' \
b'\xfa\x98Fy\xcc\xe4\xf0{9X\x0eU\n\xbfG\x837o\x06,' \
b'\x89\x06\xe9=\xa4]\xaf\x12\xd1\x8c\x99S\xab\x91\xaf\x8c\n5\x82\x1d\x84F\xda\xc0;\xbaQ\xf2\x1d\xe1\xc74' \
b'\xc9\x16J\xec\x8f\xa0\xfd\xab}\xeaM3\xc1\xe7\xf0\xa64\xcc\n35cv\x15\x8cE\xb7\xda\xda\n\xec\x81%\x99\xed' \
b'\x02\x8d\xb26?\xcce\x96\xae,' \
b'\xe77\xca\xda\xfc\xfb<\t-gW\xaa\xfa\xdc\x98\xcem\xe7*U\x1dw\x9aD\xfc\x18\xe7\xe9\x13Z\xf6\xdffTd\xf9' \
b'\xfa(7l\xe7\x18\xde8n\xad\x9d\x19P$U\xc1\xb3M\x02\x99\xc7\x19\x1cK\xe9\xfd}\xcc\x13\xb6o\xdf\xc3nk\xeb' \
b'\xda\xb8\xef@\x10w.\xd8\x0e\xe6}\x01\xd5\xb9\x94\x11X\xc6\xe8\xd6/\x9a\xaf\xb24\xa3\xb1\xda\xc5\xd1' \
b'\xb3z\x16\xe7\x7f\xfcg\x8eVN\xd1\xbcT\xd0\xea\x1f\'\xb5\xee\x9b\x08N\xdf&c\xeb\x04^\xd1\x9f\xefx\x98]V' \
b'\x1e\xf4F\xb5x]\xc8r\xc1\xde\xf3\x98]\xd3\x15\xbb\xee\xb8\xdd\xda\xea\xf5:s\x91\xe6ItQ\\N"\x8da\x9a\xe0' \
b'\xf23\x8f\xd3\xac\xd6\xf9\x9b-x\xd8\\\xa7\xf74\x96L\xcf\x87B\x81\xc9\xc3g]0\x17\x8fc&6\xd30]\xadh\x121' \
b'\xd1\xe5\xd8jb\xc2\xf2L\x14\x1b\x9aY\x08d#8\xdaD\xa11h\x06\x00\x04\xcf0\x81\'2\x13y\x98\xa5\xaf\n\xb9' \
b'\x8e\xca\x00\x9c\xce$\x13\x0f\xaf\xcb\xd1\xcf\x98\xccTK\xc5#\x15\xd1\xf45\x82nc\xdb\x9b\xdf\xe9\xf3' \
b'+\xd2\x182r\xae\xc9\xc4\xb4A\xbf?\x8dk\x9dz\xc6#\xa3\xd9B\xf7\xcf\xceC1[' \
b''  # binary ZIP payload elided (archive entries: mission, l10n/DEFAULT/dictionary, l10n/DEFAULT/mapResource, warehouses, options)

# src/briefcase/integrations/__init__.py (chuckyQ/briefcase, BSD-3-Clause)
from . import android_sdk, git, java, xcode

__all__ = ['android_sdk', 'git', 'java', 'xcode']

# resistor/__init__.py (ephsmith/resistor, MIT)
from .resistor import *

# code/environments/__init__.py (jiwoongim/IMsML, MIT)
from .karpathy_game import KarpathyGame

# pyPhys3D/__init__.py (thebillington/pyPhys3D, MIT)
from Engine import *
from Shapes import *

# tests/zoomus/test_util.py (pjhinton/zoomus, Apache-2.0)
import datetime
import json
import unittest
from zoomus import util
import responses
def suite():
"""Define all the tests of the module."""
suite = unittest.TestSuite()
suite.addTest(unittest.makeSuite(ApiClientTestCase))
suite.addTest(unittest.makeSuite(RequireKeysTestCase))
suite.addTest(unittest.makeSuite(DateToStrTestCase))
suite.addTest(unittest.makeSuite(IsStrTypeTestCase))
suite.addTest(unittest.makeSuite(EncodeUuidTestCase))
return suite
class ApiClientTestCase(unittest.TestCase):
def test_init_sets_config(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.base_uri, "http://www.foo.com")
self.assertEqual(client.timeout, 15)
def test_init_sets_config_with_timeout(self):
client = util.ApiClient(base_uri="http://www.foo.com", timeout=500)
self.assertEqual(client.timeout, 500)
def test_init_sets_config_with_timeout_none(self):
client = util.ApiClient(base_uri="http://www.foo.com", timeout=None)
self.assertEqual(client.timeout, None)
def test_cannot_init_with_non_int_timeout(self):
with self.assertRaises(ValueError) as context:
util.ApiClient(base_uri="http://www.foo.com", timeout="bad")
self.assertEqual(
            str(context.exception), "timeout value must be an integer"
)
def test_can_get_base_uri(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.base_uri, "http://www.foo.com")
def test_can_set_base_uri(self):
client = util.ApiClient(base_uri="http://www.foo.com")
client.base_uri = "http://www.bar.com"
self.assertEqual(client.base_uri, "http://www.bar.com")
def test_set_base_uri_removes_trailing_slash(self):
client = util.ApiClient(base_uri="http://www.foo.com")
client.base_uri = "http://www.bar.com/"
self.assertEqual(client.base_uri, "http://www.bar.com")
def test_can_get_timeout(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.timeout, 15)
def test_can_set_timeout(self):
client = util.ApiClient(base_uri="http://www.foo.com")
client.timeout = 500
self.assertEqual(client.timeout, 500)
    def test_can_set_timeout_to_none(self):
client = util.ApiClient(base_uri="http://www.foo.com")
client.timeout = None
self.assertEqual(client.timeout, None)
def test_cannot_set_timeout_to_non_int(self):
client = util.ApiClient(base_uri="http://www.foo.com")
with self.assertRaises(ValueError) as context:
client.timeout = "bad"
self.assertEqual(
            str(context.exception), "timeout value must be an integer"
)
def test_url_for_returns_complete_url(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.url_for("bar"), "http://www.foo.com/bar")
def test_url_for_ignores_preceding_slash(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.url_for("/bar"), "http://www.foo.com/bar")
def test_url_for_ignores_trailing_slash(self):
client = util.ApiClient(base_uri="http://www.foo.com")
self.assertEqual(client.url_for("bar/"), "http://www.foo.com/bar")
@responses.activate
def test_can_get_request(self):
responses.add(responses.GET, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.get_request("endpoint")
@responses.activate
def test_can_get_request_v2(self):
responses.add(
responses.GET, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.get_request("endpoint")
expected_headers = {"Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_get_request_with_params(self):
responses.add(responses.GET, "http://www.foo.com/endpoint?foo=bar")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.get_request("endpoint", params={"foo": "bar"})
@responses.activate
def test_can_get_request_with_params_v2(self):
responses.add(
responses.GET, "http://www.foo.com/endpoint?foo=bar",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.get_request("endpoint", params={"foo": "bar"})
expected_headers = {"Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_get_request_with_headers(self):
responses.add(
responses.GET, "http://www.foo.com/endpoint", headers={"foo": "bar"}
)
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.get_request("endpoint", headers={"foo": "bar"})
@responses.activate
def test_can_get_request_with_headers_v2(self):
responses.add(
responses.GET, "http://www.foo.com/endpoint", headers={"foo": "bar"}
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.get_request("endpoint", headers={"foo": "bar"})
@responses.activate
def test_can_post_request(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint")
@responses.activate
def test_can_post_request_v2(self):
responses.add(
responses.POST, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint")
expected_headers = {
"Authorization": "Bearer token",
"Content-Type": "application/json",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_params(self):
responses.add(responses.POST, "http://www.foo.com/endpoint?foo=bar")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint", params={"foo": "bar"})
@responses.activate
def test_can_post_request_with_params_v2(self):
responses.add(
responses.POST, "http://www.foo.com/endpoint?foo=bar",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint", params={"foo": "bar"})
expected_headers = {"Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_dict_data(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_post_request_with_dict_data_v2(self):
responses.add(
responses.POST, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
"Content-Type": "application/json",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_json_data(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_post_request_with_json_data_v2(self):
responses.add(
responses.POST, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
"Content-Type": "application/json",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_headers(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint", headers={"foo": "bar"})
expected_headers = {"foo": "bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_headers_v2(self):
responses.add(
responses.POST, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint", headers={"foo": "bar"})
expected_headers = {"foo": "bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_cookies(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.post_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_post_request_with_cookies_v2(self):
responses.add(responses.POST, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.post_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar", "Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint")
@responses.activate
def test_can_patch_request_v2(self):
responses.add(
responses.PATCH, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint")
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_params(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint?foo=bar")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint", params={"foo": "bar"})
@responses.activate
def test_can_patch_request_with_params_v2(self):
responses.add(
responses.PATCH, "http://www.foo.com/endpoint?foo=bar",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint", params={"foo": "bar"})
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_dict_data(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_patch_request_with_dict_data_v2(self):
responses.add(
responses.PATCH, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_json_data(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_patch_request_with_json_data_v2(self):
responses.add(
responses.PATCH, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_headers(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint", headers={"foo": "bar"})
expected_headers = {"foo": "bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_headers_v2(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint", headers={"foo": "bar"})
expected_headers = {"foo": "bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_cookies(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.patch_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_patch_request_with_cookies_v2(self):
responses.add(responses.PATCH, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.patch_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar", "Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint")
@responses.activate
def test_can_delete_request_v2(self):
responses.add(
responses.DELETE, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint")
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_params(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint?foo=bar")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint", params={"foo": "bar"})
@responses.activate
def test_can_delete_request_with_params_v2(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint?foo=bar")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint", params={"foo": "bar"})
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_dict_data(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_delete_request_with_dict_data_v2(self):
responses.add(
responses.DELETE, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint", data={"foo": "bar"})
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_json_data(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
@responses.activate
def test_can_delete_request_with_json_data_v2(self):
responses.add(
responses.DELETE, "http://www.foo.com/endpoint",
)
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint", data=json.dumps({"foo": "bar"}))
self.assertEqual(responses.calls[0].request.body, '{"foo": "bar"}')
expected_headers = {
"Authorization": "Bearer token",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_headers(self):
responses.add(
responses.DELETE, "http://www.foo.com/endpoint", headers={"foo": "bar"}
)
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint", headers={"foo": "bar"})
@responses.activate
def test_can_delete_request_with_headers_v2(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint", headers={"foo": "bar"})
expected_headers = {
"foo": "bar",
}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_cookies(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com", config={"version": util.API_VERSION_1}
)
client.delete_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
@responses.activate
def test_can_delete_request_with_cookies_v2(self):
responses.add(responses.DELETE, "http://www.foo.com/endpoint")
client = util.ApiClient(
base_uri="http://www.foo.com",
config={"version": util.API_VERSION_2, "token": "token"},
)
client.delete_request("endpoint", cookies={"foo": "bar"})
expected_headers = {"Cookie": "foo=bar", "Authorization": "Bearer token"}
actual_headers = responses.calls[0].request.headers
self.assertTrue(
set(expected_headers.items()).issubset(set(actual_headers.items()))
)
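The header assertions above repeat a dict-items subset check instead of an exact equality, so extra headers injected by the HTTP stack (User-Agent, Content-Length, ...) do not break the tests. A minimal standalone sketch of the pattern (the header values here are illustrative):

```python
def headers_contain(expected, actual):
    # True when every expected (name, value) pair appears among the actual headers
    return set(expected.items()).issubset(set(actual.items()))

expected = {"Authorization": "Bearer token"}
actual = {"Authorization": "Bearer token", "User-Agent": "python-requests"}
assert headers_contain(expected, actual)
assert not headers_contain({"Authorization": "Bearer other"}, actual)
```

Note this compares (name, value) pairs, so a header present with the wrong value fails the check just like a missing header.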
class RequireKeysTestCase(unittest.TestCase):
def test_can_require_keys_with_single_string_key(self):
d = {"a": 1}
with self.assertRaises(ValueError) as context:
util.require_keys(d, "b")
        self.assertEqual(str(context.exception), "'b' must be set")
def test_can_require_keys_with_list_keys(self):
d = {"a": 1}
with self.assertRaises(ValueError) as context:
util.require_keys(d, ["b"])
        self.assertEqual(str(context.exception), "'b' must be set")
def test_can_require_keys_with_multi_item_list_keys(self):
d = {"a": 1, "b": 2}
with self.assertRaises(ValueError) as context:
util.require_keys(d, ["b", "c"])
        self.assertEqual(str(context.exception), "'c' must be set")
def test_require_keys_with_dict_raises_error_if_missing(self):
d = {"a": 1}
with self.assertRaises(ValueError) as context:
util.require_keys(d, "b")
        self.assertEqual(str(context.exception), "'b' must be set")
def test_require_keys_with_dict_does_not_raises_error_if_none_by_default(self):
d = {"a": 1, "b": None}
self.assertTrue(util.require_keys(d, "b"))
def test_require_keys_with_dict_does_raises_error_if_none_not_allowed(self):
d = {"a": 1, "b": None}
with self.assertRaises(ValueError) as context:
self.assertTrue(util.require_keys(d, "b", allow_none=False))
        self.assertEqual(str(context.exception), "'b' cannot be None")
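The tests above pin down the contract of `util.require_keys`: a missing key raises `ValueError("'<key>' must be set")`, a `None` value passes by default but raises `"'<key>' cannot be None"` when `allow_none=False`, and a single string key or a list of keys are both accepted. A sketch implementation that satisfies this contract (an assumption; the real `util.require_keys` may differ internally):

```python
def require_keys(d, keys, allow_none=True):
    # accept a single key or a list of keys
    if isinstance(keys, str):
        keys = [keys]
    for key in keys:
        if key not in d:
            raise ValueError("'{}' must be set".format(key))
        if not allow_none and d[key] is None:
            raise ValueError("'{}' cannot be None".format(key))
    return True
```

Keys are checked in order, so for `["b", "c"]` with `"b"` present the error names `"c"`, matching the multi-item test.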
class DateToStrTestCase(unittest.TestCase):
def test_can_convert_date_to_str(self):
d = datetime.date(year=2015, month=12, day=8)
self.assertEqual(util.date_to_str(d), "2015-12-08T00:00:00Z")
def test_can_convert_datetime_to_str(self):
d = datetime.datetime(year=2015, month=12, day=8, hour=23, minute=21, second=37)
self.assertEqual(util.date_to_str(d), "2015-12-08T23:21:37Z")
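The two cases above show that `util.date_to_str` formats both `datetime.date` and `datetime.datetime` as an ISO-8601 string with a trailing `Z`. A possible implementation consistent with the tests (assumed, not necessarily the module's actual code):

```python
import datetime

def date_to_str(d):
    # datetime.datetime subclasses datetime.date, and a plain date renders its
    # (implicitly zero) time fields as 00:00:00, so one format string covers both
    return d.strftime("%Y-%m-%dT%H:%M:%SZ")

assert date_to_str(datetime.date(2015, 12, 8)) == "2015-12-08T00:00:00Z"
```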
class IsStrTypeTestCase(unittest.TestCase):
from sys import version_info
def test_str_is_str_type(self):
self.assertTrue(util.is_str_type("s"))
def test_numeric_str_is_str_type(self):
self.assertTrue(util.is_str_type("5"))
def test_non_str_is_not_str_type(self):
self.assertFalse(util.is_str_type(5))
    @unittest.skipIf(version_info[0] >= 3, "Not applicable to Python 3+")
def test_unicode_is_str_type(self):
self.assertTrue(util.is_str_type(unicode("s")))
    @unittest.skipIf(version_info[0] >= 3, "Not applicable to Python 3+")
def test_numeric_unicode_is_str_type(self):
self.assertTrue(util.is_str_type(unicode("5")))
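`util.is_str_type` abstracts over the Python 2/3 string split: on Python 2 both `str` and `unicode` count, on Python 3 only `str` does. A sketch of what such a helper could look like (an assumption about the implementation):

```python
import sys

def is_str_type(obj):
    if sys.version_info[0] >= 3:
        return isinstance(obj, str)
    # Python 2 only: 'unicode' is a builtin there
    return isinstance(obj, (str, unicode))  # noqa: F821
```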
class EncodeUuidTestCase(unittest.TestCase):
def test_encode_without_slash(self):
uuid = "i6fJBQh0QzWCgrKretYGjg=="
self.assertEqual(util.encode_uuid(uuid), "i6fJBQh0QzWCgrKretYGjg==")
def test_encode_with_leading_slash(self):
uuid = "/6fJBQh0QzWCgrKretYGjg=="
self.assertEqual(util.encode_uuid(uuid), "%252F6fJBQh0QzWCgrKretYGjg%253D%253D")
def test_encode_with_double_slash(self):
uuid = "i6fJBQh0Qz//grKretYGjg=="
self.assertEqual(
util.encode_uuid(uuid), "i6fJBQh0Qz%252F%252FgrKretYGjg%253D%253D"
)
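The expected values in these three cases show the encoding is applied twice (`/` becomes `%2F`, then `%252F`) and only when the id begins with a slash or contains a double slash — the shapes that would break a URL path segment. A sketch consistent with all three cases (assumed; the real `util.encode_uuid` may be written differently):

```python
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote  # Python 2

def encode_uuid(val):
    # double URL-encode, but only for ids that would break a URL path
    if val.startswith("/") or "//" in val:
        val = quote(quote(val, safe=""), safe="")
    return val
```

The second `quote` pass turns the `%` of the first pass into `%25`, which is why `%2F` appears as `%252F` in the expected strings.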
if __name__ == "__main__":
unittest.main()
# File: src/Fitting/least_square.py (repo: Marcel-Rodekamp/qcdanalysistools, MIT)
r"""
This file contains classes for least square fitting
* Uncorrelated Fit (Diagonal approximation) : DiagonalLeastSquare
* Correlated Fit : CorrelatedLeastSquare
"""
import numpy as np
import scipy.optimize as opt
import scipy.stats
import itertools
from ..analysis import estimator,variance,get_sample,checkAnalysisType,Jackknife,Blocking,Bootstrap
from .fitting_base import FitBase
from .fitting_helpers import * # cov,cor,cov_fit_param
from qcdanalysistools.stats import AIC_chisq, AICc_chisq
class DiagonalLeastSquare(FitBase):
def __init__(self,t_model,t_abscissa,t_data=None,t_ordinate=None,t_ordinate_var=None,t_analysis_params=None):
r"""
t_model: qcdanalysistools.fitting.model
A model to which the data should be fit. Commonly, it needs to
implement a function
            * t_model.hess_param(x):
Computing the hessian of the model function
in respect to the parameters thus needs to
return an array of size (num_params,num_params)
            * t_model.grad_param(x):
Computing the jacobian of the model
function in respect to the parameters
thus needs to return an array of size
(num_params,).
            * t_model.__call__(x,*Theta):
Computing the model function at a given input x
and parameters *Theta = (Theta_0,Theta_1,...)
* num_params:
Number of parameters to fit to
* Theta0:
Tuple of first guess for each parameter.
t_abscissa: numpy.array
Abscissa used to evaluate the model function. Needs to be of
shape (D,).
t_data: numpy.ndarray
Results from a lattice qcd simulation to which the model should
            be fitted. Needs to be of shape (N,D) where N is the number of
configurations. Can be None but then t_ordinate must be given!
t_ordinate: numpy.array
If the data is already processed (e.g. averaged) the ordinate to
which the model is fitted can be given explicitly. This is required
if t_data is None.
t_ordinate_var: numpy.array
If the data is already processed (e.g. variance) the variance can
be given explicitly. This is required if t_data is None
t_analysis_params: qcdanalysistools.analysis.AnalysisParams
            Is one of the analysis parameter instantiations defined in
src/StatisticalAnalysis/analysisParams.py
Used to preprocess the data but can be None. Then preprocessing
            is achieved with numpy.average.
"""
# initialize the base class giving access to the following class members
# self.model
# self.analysis_params
# self.ordinate
# self.data (if t_data is not None)
# self.abscissa
# self.min_stats
# self.fit_stats
# and class methods
# self.fit(self,*args,**kwargs): raise NotImplementedError
# self.__call__(self,*args,**kwargs): return self.fit(*args,**kwargs)
super().__init__(t_model,t_abscissa,t_data=t_data,t_ordinate=t_ordinate,t_analysis_params=t_analysis_params)
# preprocess for the ordinate variance used in chisq as denominator
if t_data is None:
# If t_data is not given the variance of the ordinate needs to be given.
if t_ordinate_var is None:
raise ValueError(f"Fitting requires either t_data or t_ordinate_var")
if len(t_ordinate_var.shape) != 1:
raise ValueError(f"t_ordinate_var must be 1-dimensional but is of shape {t_ordinate_var.shape}")
self.data = None
self.ordinate_var = t_ordinate_var
else:
# If t_data is given but the ordinate variance is not determine the
# variance with the given analysis type
# Note: checks on data are already done in FitBase
if t_ordinate_var is None:
if t_analysis_params is None:
# fallback to standard average if no analysis method is given
self.ordinate_var = np.var(self.data,axis=0)
else:
self.ordinate_var = variance(t_analysis_params,self.data)
else:
# If t_data is given and also variance of the ordinate just store it
if len(t_ordinate_var.shape) != 1:
raise ValueError(f"t_ordinate_var must be 1-dimensional but is of shape {t_ordinate_var.shape}")
self.ordinate_var = t_ordinate_var
def chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
            Least Square chisq (\sigma_{y_t}^2 denotes the ordinate variance; the implementation uses a 1/2 convention):
            $$
            \chi^2 = \sum_{t=1}^D \frac{(y_t - f(x_t,\Theta))^2}{2\sigma_{y_t}^2}
            $$
            where $\Theta$ denotes the vector of parameters
"""
# evaluate the model
model_res = self.model(self.abscissa,*params)
return np.sum( np.square(self.ordinate - model_res)/(2*self.ordinate_var) )
def grad_chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
            Least Square chisq gradient:
            $$
            \pdv{\chi^2}{\Theta_i} = -\sum_{t=1}^D \frac{ \pdv{f(x_t,\Theta)}{\Theta_i} (y_t - f(x_t,\Theta))}{\sigma_{y_t}^2}
            $$
            The ith component is the derivative with respect to the ith parameter
"""
# evaluate the model
model_res = self.model(self.abscissa,*params)
# evaluate the gradient of the model (i.r.t. the parameters)
model_grad = self.model.grad_param(self.abscissa,*params)
# initialize memory for the gradient
xsq_grad = np.zeros(shape = self.model.num_params)
# compute the gradient
for i_param in range(self.model.num_params):
xsq_grad[i_param] = - np.sum( model_grad[i_param,:]*(self.ordinate-model_res)/self.ordinate_var )
return xsq_grad
def hess_chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
            Least Square chisq hessian:
            $$
            \pdv{\chi^2}{\Theta_i}{\Theta_j} = -\sum_{t=1}^D \frac{\pdv{f(x_t,\Theta)}{\Theta_i}{\Theta_j}(y_t - f(x_t,\Theta)) - \pdv{f(x_t,\Theta)}{\Theta_i}\pdv{f(x_t,\Theta)}{\Theta_j}}{\sigma_{y_t}^2}
            $$
            The (i,j)th component is the second derivative with respect to the ith and jth parameters
"""
# evaluate model
model_res = self.model(self.abscissa,*params)
# evaluate the gradient of the model (i.r.t. the parameters)
model_grad = self.model.grad_param(self.abscissa,*params)
# evaluate the hessian of the model (i.r.t. the parameters)
model_hess = self.model.hess_param(self.abscissa,*params)
# initialize memory for the hessian
xsq_hess = np.zeros( shape=(self.model.num_params,self.model.num_params) )
# compute the hessian
for i,j in itertools.product(range(self.model.num_params),repeat=2):
xsq_hess[i,j] = -np.sum( (model_hess[i,j,:]*(self.ordinate-model_res)-model_grad[i,:]*model_grad[j,:])/(self.ordinate_var) )
return xsq_hess
def fit(self):
r"""
Fitting routine of the diagonal least square method.
1. For each minimization algorithm in scipy.optimize do
2. Minimize chisq.
3. if minimize succeeded add result to min_res_list
4. Choose the best minimization from min_res_list
5. Return fit results+statistics
"""
# 1.&2 for each minimization method minimize chisq
min_res_list = self._fit()
# Fail if no minimization method succeeded
if len(min_res_list) == 0:
raise RuntimeError(f"No minimization technique worked for fitting. Try using different start parameters.")
# 4. find the smallest chisq of all algorithms
self.min_stats = min_res_list[0]
fun = min_res_list[0]['fun']
# TODO: Do we require a faster algorithm here? Try tree structure then.
for res in min_res_list[1:]:
if fun > res['fun']:
fun = res['fun']
self.min_stats = res
# 5. Get fit statistics
# store the best fit parameter
self.fit_stats['Param'] = self.min_stats['x']
# store best fit data points evaluated over xdata
self.fit_stats['Best fit'] = self.model.apply(self.abscissa,*self.min_stats['x'])
# compute and store the covariance matrix of the fit using implementation of .fitting_helpers
self.fit_stats['Cov'] = cov_fit_param(self.abscissa,np.diag(np.divide(np.ones_like(self.ordinate_var),self.ordinate_var)),self.model,self.min_stats['x'])
# compute and store the fit error
self.fit_stats['Fit error'] = np.sqrt(np.diag(self.fit_stats['Cov']))
# define the degrees of freedom
dof = len(self.abscissa)-self.model.num_params
# compute reduced chisq
self.fit_stats['red chisq'] = self.chisq(self.fit_stats['Param']) / dof
# compute p-value
self.fit_stats['p-value'] = scipy.stats.chi2.sf(self.chisq(self.fit_stats['Param']),dof)
if self.data is not None:
# compute Akaike information criterion for normally distributed errors
self.fit_stats['AIC'] = AIC_chisq(dof, self.fit_stats['red chisq'])
# compute Akaike information criterion for small data sets
self.fit_stats['AICc'] = AICc_chisq(dof, self.data.shape[0], self.fit_stats['red chisq'])
# return the report this is later also accessible from the class
return self.fit_stats
def print_result(self,*args,**kwargs):
out_str = "=======================================\n"
out_str+= f"Reporting for model {self.model.__name__()}\n"
out_str+= "=======================================\n"
out_str+= "========= Best Fit Parameter: =========\n"
for i_param in range(self.model.num_params):
out_str+=f"{self.model.param_names[i_param]} = {self.fit_stats['Param'][i_param]:.6e} \u00B1 {self.fit_stats['Fit error'][i_param]: .6e}\n"
out_str+= "========= Best Fit Covariance: ========\n"
for i in range(self.model.num_params):
for j in range(self.model.num_params):
out_str+=f"{self.fit_stats['Cov'][i,j]: .2e} "
out_str+="\n"
out_str+= "========= Best Fit \u1d61\u00B2: ================\n"
out_str+= f"\u1d61\u00B2/dof = {self.fit_stats['red chisq']: .6e}\n"
out_str+= "========= Best Fit p-value: ===========\n"
out_str+= f"p-value = {self.fit_stats['p-value']: .6e}\n"
if self.data is not None:
out_str+= "========= Best Fit Akaike crit: =======\n"
out_str+= f"AIC = {self.fit_stats['AIC']: .6e}\n"
out_str+= f"AICc = {self.fit_stats['AICc']: .6e}\n"
print(out_str,*args,**kwargs)
class CorrelatedLeastSquare(FitBase):
def __init__(self,t_model,t_abscissa,t_data=None,t_ordinate=None,t_ordinate_cov=None,t_analysis_params=None, t_inv_acc=1e-8):
r"""
t_model: qcdanalysistools.fitting.model
A model to which the data should be fit. Commonly, it needs to
implement a function
            * t_model.hess_param(x):
Computing the hessian of the model function
in respect to the parameters thus needs to
return an array of size (num_params,num_params)
            * t_model.grad_param(x):
Computing the jacobian of the model
function in respect to the parameters
thus needs to return an array of size
(num_params,).
            * t_model.__call__(x,*Theta):
Computing the model function at a given input x
and parameters *Theta = (Theta_0,Theta_1,...)
* num_params:
Number of parameters to fit to
* Theta0:
Tuple of first guess for each parameter.
t_abscissa: numpy.array
Abscissa used to evaluate the model function. Needs to be of
shape (D,).
t_data: numpy.ndarray
Results from a lattice qcd simulation to which the model should
            be fitted. Needs to be of shape (N,D) where N is the number of
configurations. Can be None but then t_ordinate must be given!
t_ordinate: numpy.array
If the data is already processed (e.g. averaged) the ordinate to
which the model is fitted can be given explicitly. This is required
if t_data is None.
t_ordinate_cov: numpy.ndarray
If the data is already processed (e.g. covariance) the covariance can
be given explicitly. This is required if t_data is None
t_analysis_params: qcdanalysistools.analysis.AnalysisParams
            Is one of the analysis parameter instantiations defined in
src/StatisticalAnalysis/analysisParams.py
Used to preprocess the data but can be None. Then preprocessing
            is achieved with numpy.average.
"""
# initialize the base class giving access to the following class members
# self.model
# self.analysis_params
# self.ordinate
# self.data (if t_data is not None)
# self.abscissa
# self.min_stats
# self.fit_stats
# and class methods
# self.fit(self,*args,**kwargs): raise NotImplementedError
# self.__call__(self,*args,**kwargs): return self.fit(*args,**kwargs)
super().__init__(t_model,t_abscissa,t_data=t_data,t_ordinate=t_ordinate,t_analysis_params=t_analysis_params)
# preprocess for the ordinate covariance used in chisq as denominator
if t_data is None:
# If t_data is not given the covariance of the ordinate needs to be given.
if t_ordinate_cov is None:
raise ValueError(f"Fitting requires either t_data or t_ordinate_cov")
if len(t_ordinate_cov.shape) != 2:
raise ValueError(f"t_ordinate_cov must be 2-dimensional but is of shape {t_ordinate_cov.shape}")
            self.data = None
            self.ordinate_cov = t_ordinate_cov
else:
# If t_data is given but the ordinate variance is not, determine the
# covariance with the given analysis type
# Note: checks on data are already done in FitBase
if t_ordinate_cov is None:
# use the function from qcdanalysistools.fitting.fitting_helpers
self.ordinate_cov = cov(t_analysis_params,t_data)
else:
# If t_data is given and also variance of the ordinate just store it
if len(t_ordinate_cov.shape) != 2:
raise ValueError(f"t_ordinate_cov must be 2-dimensional but is of shape {t_ordinate_cov.shape}")
self.ordinate_cov = t_ordinate_cov
        if self.data is not None and self.data.shape[0] < 10*(self.ordinate.size+1):
# if this is the case fit might be biased. Compare
# C. Michael and A. McKerrell
# Fitting Correlated Hadron Mass Spectrum Data
# Liverpool Prepint: LTH342, 1994
# hep-lat/9412087
            print(f"WARNING: Too few datapoints N ({self.data.shape[0]}) for a fit of dimension D ({self.ordinate.size})")
            print("WARNING: Require N > 10*(D+1) for a reliable fit")
# invert the covariance matrix
self.inv_acc = t_inv_acc
try:
self.ordinate_cov_inv = np.linalg.inv(self.ordinate_cov)
        except np.linalg.LinAlgError:
            print("WARNING: Require SVD to invert covariance matrix.")
            u,w,v = np.linalg.svd(self.ordinate_cov)
            # pseudo-inverse: invert only singular values above the accuracy threshold
            self.ordinate_cov_inv = np.dot(np.dot(np.transpose(v),np.diag(np.divide(np.ones(w.size),w,out=np.zeros(w.size),where=w>self.inv_acc**2))),np.transpose(u))
# check that the inversion worked
def res(A):
return np.linalg.norm(A-np.identity(A.shape[0]))
# right inverse
res_r = res(self.ordinate_cov @ self.ordinate_cov_inv)
res_l = res(self.ordinate_cov_inv @ self.ordinate_cov)
if res_r > self.inv_acc:
raise RuntimeError(f"Failed to right invert the covariance matrix: res = {res_r:.4e}")
if res_l > self.inv_acc:
raise RuntimeError(f"Failed to left invert the covariance matrix: res = {res_l:.4e}")
# store the information
if res_r > res_l:
self.fit_stats['Cov inv acc'] = res_r
else:
self.fit_stats['Cov inv acc'] = res_l
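The SVD fallback in the constructor above builds a Moore-Penrose style pseudo-inverse, zeroing singular values below an accuracy threshold, and then validates the left and right inversion residuals. The recipe in isolation (numpy only; the matrix and threshold here are illustrative):

```python
import numpy as np

def pseudo_inverse(A, acc=1e-8):
    # A = U diag(w) V (numpy returns v already transposed), so A^-1 = V^T diag(1/w) U^T;
    # singular values at or below acc**2 are dropped instead of inverted
    u, w, v = np.linalg.svd(A)
    w_inv = np.divide(np.ones(w.size), w, out=np.zeros(w.size), where=w > acc**2)
    return np.transpose(v) @ np.diag(w_inv) @ np.transpose(u)

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # well conditioned, symmetric
A_inv = pseudo_inverse(A)
assert np.linalg.norm(A @ A_inv - np.identity(2)) < 1e-12
```

Zeroing small singular values trades a small bias for numerical stability when the covariance estimate is near-singular, which happens easily when N is not much larger than D.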
def chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
Least Square chisq:
$$
\chi^2 = \sum_{t_1,t_2=1}^D (y_{t_1} - f(x_{t_1},\Theta)) CovInv_{t_1,t_2}(y_{t_2} - f(x_{t_2},\Theta))
$$
where $\Theta$ denotes the vector of parameters
"""
# evaluate the model
model_res = self.model(self.abscissa,*params)
# compute chisq
xsq = 0
for t1,t2 in itertools.product(range(self.abscissa.size),repeat=2):
xsq+=(self.ordinate[t1]-model_res[t1])*self.ordinate_cov_inv[t1,t2]*(self.ordinate[t2]-model_res[t2])
return xsq
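The double loop above computes the quadratic form r^T CovInv r elementwise; with numpy the same value comes from a single matrix expression, which is both faster and easier to check. A small equivalence sketch (toy data, independent of the class):

```python
import numpy as np

def chisq_loop(y, f, cov_inv):
    # elementwise quadratic form, mirroring the loop in the method above
    xsq = 0.0
    for t1 in range(y.size):
        for t2 in range(y.size):
            xsq += (y[t1] - f[t1]) * cov_inv[t1, t2] * (y[t2] - f[t2])
    return xsq

def chisq_vec(y, f, cov_inv):
    r = y - f  # residual vector
    return float(r @ cov_inv @ r)

y = np.array([1.0, 2.0, 3.0])
f = np.array([0.9, 2.2, 2.7])
cov_inv = np.array([[2.0, 0.5, 0.0], [0.5, 1.0, 0.1], [0.0, 0.1, 3.0]])
assert np.isclose(chisq_loop(y, f, cov_inv), chisq_vec(y, f, cov_inv))
```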
def grad_chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
Least Square chisq gradient:
$$
\pdv{\chi^2}{\Theta_i} = - 2 * \sum_{t_1,t_2=1}^D \pdv{f(x_{t_1},\Theta)}{\Theta_i} * CovInv_{t_1,t_2} * (y_{t_2} - f(x_{t_2},\Theta))
$$
            The ith component is the derivative with respect to the ith parameter
"""
# evaluate the model
model_res = self.model(self.abscissa,*params)
# evaluate the gradient of the model (i.r.t. the parameters)
model_grad = self.model.grad_param(self.abscissa,*params)
# initialize memory for the gradient
xsq_grad = np.zeros(shape = self.model.num_params)
# compute the gradient
for i_param in range(self.model.num_params):
for t1,t2 in itertools.product(range(self.abscissa.size),repeat=2):
xsq_grad[i_param] -= 2*model_grad[i_param,t1]*self.ordinate_cov_inv[t1,t2]*(self.ordinate[t2]-model_res[t2])
return xsq_grad
def hess_chisq(self,params):
r"""
params: tuple
Parameter for which chisq becomes minimized. i.e. fitting params
Note:
            Least Square chisq hessian:
$$
\pdv{\chi^2}{\Theta_i}{\Theta_j} = -2 \sum_{t_1,t_2=1}^D \pdv{f(x_{t_1},\Theta)}{\Theta_i}{\Theta_j}* CovInv_{t_1,t_2} *(y_{t_2} - f(x_{t_2},\Theta))
- \pdv{f(x_{t_1},\Theta)}{\Theta_i} * CovInv_{t_1,t_2} * \pdv{f(x_{t_2},\Theta)}{\Theta_j}
$$
            The (i,j)th component is the second derivative with respect to the ith and jth parameters
"""
# evaluate model
model_res = self.model(self.abscissa,*params)
# evaluate the gradient of the model (i.r.t. the parameters)
model_grad = self.model.grad_param(self.abscissa,*params)
# evaluate the hessian of the model (i.r.t. the parameters)
model_hess = self.model.hess_param(self.abscissa,*params)
# initialize memory for the hessian
xsq_hess = np.zeros( shape=(self.model.num_params,self.model.num_params) )
# compute the hessian
for i,j in itertools.product(range(self.model.num_params),repeat=2):
for t1,t2 in itertools.product(range(self.abscissa.size),repeat=2):
xsq_hess[i,j] -= 2*(model_hess[i,j,t1]*self.ordinate_cov_inv[t1,t2]*(self.ordinate[t2]-model_res[t2]) \
-model_grad[i,t1]*self.ordinate_cov_inv[t1,t2]*model_grad[j,t2])
return xsq_hess
def fit(self):
r"""
        Fitting routine of the correlated least square method.
1. For each minimization algorithm in scipy.optimize do
2. Minimize chisq.
3. if minimize succeeded add result to min_res_list
4. Choose the best minimization from min_res_list
5. Return fit results+statistics
"""
# 1.&2 for each minimization method minimize chisq
min_res_list = self._fit()
# Fail if no minimization method succeeded
if len(min_res_list) == 0:
raise RuntimeError(f"No minimization technique worked for fitting. Try using different start parameters.")
# 4. find the smallest chisq of all algorithms
self.min_stats = min_res_list[0]
fun = min_res_list[0]['fun']
# TODO: Do we require a faster algorithm here? Try tree structure then.
for res in min_res_list[1:]:
if fun > res['fun']:
fun = res['fun']
self.min_stats = res
# 5. Get fit statistics
# store the best fit parameter
self.fit_stats['Param'] = self.min_stats['x']
# store best fit data points evaluated over xdata
self.fit_stats['Best fit'] = self.model.apply(self.abscissa,*self.min_stats['x'])
# compute and store the covariance matrix of the fit using implementation of .fitting_helpers
self.fit_stats['Cov'] = cov_fit_param(self.abscissa,self.ordinate_cov_inv,self.model,self.min_stats['x'],self.inv_acc)
# compute and store the fit error
self.fit_stats['Fit error'] = np.sqrt(np.diag(self.fit_stats['Cov']))
# define the degrees of freedom
dof = len(self.abscissa)-self.model.num_params
# compute reduced chisq
self.fit_stats['red chisq'] = self.chisq(self.fit_stats['Param']) / dof
# compute p-value
self.fit_stats['p-value'] = scipy.stats.chi2.sf(self.chisq(self.fit_stats['Param']),dof)
if self.data is not None:
# compute Akaike information criterion for normally distributed errors
self.fit_stats['AIC'] = AIC_chisq(dof, self.fit_stats['red chisq'])
# compute Akaike information criterion for small data sets
self.fit_stats['AICc'] = AICc_chisq(dof, self.data.shape[0], self.fit_stats['red chisq'])
        # return the report; it is also accessible from the class afterwards
return self.fit_stats
def print_result(self,*args,**kwargs):
out_str = "=======================================\n"
out_str+= f"Reporting for model {self.model.__name__()}\n"
out_str+= "=======================================\n"
out_str+= "========= Best Fit Parameter: =========\n"
for i_param in range(self.model.num_params):
out_str+=f"{self.model.param_names[i_param]} = {self.fit_stats['Param'][i_param]:.6e} \u00B1 {self.fit_stats['Fit error'][i_param]: .6e}\n"
out_str+= "========= Best Fit Covariance: ========\n"
for i in range(self.model.num_params):
for j in range(self.model.num_params):
out_str+=f"{self.fit_stats['Cov'][i,j]: .2e} "
out_str+="\n"
out_str+= "========= Best Fit \u1d61\u00B2: ================\n"
out_str+= f"\u1d61\u00B2/dof = {self.fit_stats['red chisq']: .6e}\n"
out_str+= "========= Best Fit p-value: ===========\n"
out_str+= f"p-value = {self.fit_stats['p-value']: .6e}\n"
if self.data is not None:
out_str+= "========= Best Fit Akaike crit: =======\n"
out_str+= f"AIC = {self.fit_stats['AIC']: .6e}\n"
out_str+= f"AICc = {self.fit_stats['AICc']: .6e}\n"
out_str+= "========= Inverse Cov Accuracy: =======\n"
out_str+= f"{self.fit_stats['Cov inv acc']:.6e}"
print(out_str,*args,**kwargs)
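For a model that is linear in its parameters, the diagonal chi-square minimized above has a closed-form solution via the weighted normal equations, Theta = (J^T W J)^-1 J^T W y with W = diag(1/sigma^2), and (J^T W J)^-1 is the parameter covariance. A standalone numpy sketch for f(x, a, b) = a + b*x (toy data; independent of the classes in this module):

```python
import numpy as np

def weighted_linear_fit(x, y, var):
    # design matrix J with columns (df/da, df/db) = (1, x)
    J = np.stack([np.ones_like(x), x], axis=1)
    W = np.diag(1.0 / var)
    param_cov = np.linalg.inv(J.T @ W @ J)  # covariance of the fit parameters
    theta = param_cov @ J.T @ W @ y         # best-fit (a, b)
    return theta, param_cov

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.5 + 0.5 * x           # exact line, so the fit must recover (1.5, 0.5)
var = np.full(x.size, 0.1)
theta, cov = weighted_linear_fit(x, y, var)
assert np.allclose(theta, [1.5, 0.5])
```

This closed form is a useful cross-check for the iterative minimizers above: on a linear model both must agree to numerical precision.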
# File: oslo/torch/nn/parallel/tensor_parallel/__init__.py (repo: lipovsek/oslo, Apache-2.0)
from oslo.torch.nn.parallel.tensor_parallel.mapping import Column, Row, Update
from oslo.torch.nn.parallel.tensor_parallel.tensor_parallel import (
TensorParallel,
)
__ALL__ = [TensorParallel, Column, Row, Update]
# File: solavis/__init__.py (repo: jstzwj/solavis, MIT)
from solavis.core.spider import Spider
from solavis.core.container import Container, Response
from solavis.core.request import Request, RequestLoader
from solavis.core.pipeline import Pipeline
from solavis.core.middleware import Middleware

# File: test/unit/test_driver.py (repo: kenji-miyake/mike, BSD-3-Clause)
import mkdocs.config
import os
import unittest
from argparse import Namespace
from unittest import mock
from .. import *
from mike import driver
class TestLoadMkdocsConfig(unittest.TestCase):
def make_args(self, **kwargs):
default = {'config_file': '/path/to/mkdocs.yml', 'branch': None,
'remote': None}
default.update(kwargs)
return Namespace(**default)
def test_config(self):
path = os.path.join(test_data_dir, 'basic_theme', 'mkdocs.yml')
args = self.make_args(config_file=path)
self.assertIsInstance(driver.load_mkdocs_config(args),
mkdocs.config.Config)
self.assertFalse(hasattr(args, 'alias_type'))
self.assertFalse(hasattr(args, 'template'))
self.assertFalse(hasattr(args, 'deploy_prefix'))
args = self.make_args(config_file=path, alias_type=None, template=None,
deploy_prefix=None)
self.assertIsInstance(driver.load_mkdocs_config(args),
mkdocs.config.Config)
self.assertEqual(args.alias_type, 'symlink')
self.assertEqual(args.template, None)
self.assertEqual(args.deploy_prefix, '')
args = self.make_args(config_file=path, alias_type='copy',
template='file.html', deploy_prefix='prefix')
self.assertIsInstance(driver.load_mkdocs_config(args),
mkdocs.config.Config)
self.assertEqual(args.alias_type, 'copy')
self.assertEqual(args.template, 'file.html')
self.assertEqual(args.deploy_prefix, 'prefix')
def test_no_config(self):
args = self.make_args(branch='gh-pages', remote='origin')
with mock.patch('builtins.open', side_effect=FileNotFoundError):
self.assertIs(driver.load_mkdocs_config(args), None)
self.assertFalse(hasattr(args, 'alias_type'))
self.assertFalse(hasattr(args, 'template'))
self.assertFalse(hasattr(args, 'deploy_prefix'))
args = self.make_args(branch='gh-pages', remote='origin',
alias_type=None, template=None,
deploy_prefix=None)
with mock.patch('builtins.open', side_effect=FileNotFoundError):
self.assertIs(driver.load_mkdocs_config(args), None)
self.assertEqual(args.alias_type, 'symlink')
self.assertEqual(args.template, None)
self.assertEqual(args.deploy_prefix, '')
args = self.make_args(branch='gh-pages', remote='origin',
alias_type='copy', template='file.html',
deploy_prefix='prefix')
with mock.patch('builtins.open', side_effect=FileNotFoundError):
self.assertIs(driver.load_mkdocs_config(args), None)
self.assertEqual(args.alias_type, 'copy')
self.assertEqual(args.template, 'file.html')
self.assertEqual(args.deploy_prefix, 'prefix')
def test_no_config_missing_fields(self):
args = self.make_args()
with mock.patch('builtins.open', side_effect=FileNotFoundError):
with self.assertRaises(FileNotFoundError):
driver.load_mkdocs_config(args)
| 44.364865 | 79 | 0.626256 | 357 | 3,283 | 5.591036 | 0.176471 | 0.09018 | 0.114228 | 0.056112 | 0.807114 | 0.782064 | 0.782064 | 0.767034 | 0.708918 | 0.650301 | 0 | 0 | 0.259214 | 3,283 | 73 | 80 | 44.972603 | 0.820724 | 0 | 0 | 0.555556 | 0 | 0 | 0.094121 | 0 | 0 | 0 | 0 | 0 | 0.396825 | 1 | 0.063492 | false | 0 | 0.111111 | 0 | 0.206349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
674847447ef1b9d838f9a7b1e869773da424ea1a | 893 | py | Python | Regular Expression/Basic_of_reg.py | reddyprasade/PYTHON-BASIC-FOR-ALL | 4fa4bf850f065e9ac1cea0365b93257e1f04e2cb | [
"MIT"
] | 21 | 2019-06-28T05:11:17.000Z | 2022-03-16T02:02:28.000Z | Regular Expression/Basic_of_reg.py | reddyprasade/PYTHON-BASIC-FOR-ALL | 4fa4bf850f065e9ac1cea0365b93257e1f04e2cb | [
"MIT"
] | 2 | 2021-12-28T14:15:58.000Z | 2021-12-28T14:16:02.000Z | Regular Expression/Basic_of_reg.py | reddyprasade/PYTHON-BASIC-FOR-ALL | 4fa4bf850f065e9ac1cea0365b93257e1f04e2cb | [
"MIT"
] | 18 | 2019-07-07T03:20:33.000Z | 2021-05-08T10:44:18.000Z | import re
'''
## Lowercase letter pattern matching
pattern = "[a-z]"
baya_name = 'liam', 'olivia', 'noah', 'emma'
for string in baya_name:
    result = re.match(pattern, string)
    if result:
        print("Successful", result)
    else:
        print("Not matching")
'''
'''
## Uppercase letter pattern matching
pattern = "[A-Z]"
baya_name = 'Liam', 'Olivia', 'Noah', 'Emma'
for string in baya_name:
    result = re.match(pattern, string)
    if result:
        print("Successful", result)
    else:
        print("Not matching")
'''
### Both uppercase and lowercase letters
# Note: inside a character class "&" is a literal character, so "[a-z&A-Z]"
# also matched "&"; "[a-zA-Z]" matches letters only.
pattern = "[a-zA-Z]"
baya_name = 'Liam', 'Olivia', 'Noah', 'emma'
copy_result = []
for string in baya_name:
    result = re.match(pattern, string)
    copy_result.append(result)
    if result:
        print("Successful", result)
    else:
        print("Not matching")
print(copy_result)
pattern = "[0-9]"
| 19 | 41 | 0.586786 | 113 | 893 | 4.557522 | 0.274336 | 0.093204 | 0.052427 | 0.058252 | 0.807767 | 0.807767 | 0.807767 | 0.807767 | 0.753398 | 0.658252 | 0 | 0.003012 | 0.256439 | 893 | 46 | 42 | 19.413043 | 0.77259 | 0.024636 | 0 | 0 | 0 | 0 | 0.167702 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.076923 | 0.230769 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
674edf3efb04ee97c29f01465c6273a572d53638 | 18,016 | py | Python | tests/test_module.py | plantpredict/python-sdk | 5040b898a16ac65cc556662b44f7564482aba066 | [
"MIT"
] | 2 | 2021-08-09T23:42:52.000Z | 2021-12-03T10:46:10.000Z | tests/test_module.py | plantpredict/python-sdk | 5040b898a16ac65cc556662b44f7564482aba066 | [
"MIT"
] | 3 | 2021-08-20T12:06:54.000Z | 2022-03-08T22:14:46.000Z | tests/test_module.py | plantpredict/python-sdk | 5040b898a16ac65cc556662b44f7564482aba066 | [
"MIT"
] | 3 | 2021-08-11T20:03:52.000Z | 2022-03-31T07:44:29.000Z | import mock
import unittest
import json
from plantpredict.module import Module
from tests import plantpredict_unit_test_case, mocked_requests
class TestModule(plantpredict_unit_test_case.PlantPredictUnitTestCase):
@mock.patch('plantpredict.plant_predict_entity.PlantPredictEntity.create')
def test_create(self, mocked_create):
self._make_mocked_api()
module = Module(api=self.mocked_api)
module.stc_short_circuit_current = 1.23
module.stc_short_circuit_current_temp_coef = 0.04
module.length = 2000
module.width = 1200
module.stc_max_power = 120.0
module.create()
self.assertEqual(module.create_url_suffix, "/Module")
self.assertTrue(mocked_create.called)
self.assertEqual(module.short_circuit_current_at_stc, 1.23)
self.assertEqual(module.linear_temp_dependence_on_isc, 0.04)
        self.assertEqual(module.area, 2.4)
        self.assertAlmostEqual(module.stc_efficiency, 120.0 / (2.4 * 1000.0))
@mock.patch('plantpredict.plant_predict_entity.PlantPredictEntity.delete')
def test_delete(self, mocked_delete):
self._make_mocked_api()
module = Module(api=self.mocked_api, id=808)
module.delete()
self.assertEqual(module.delete_url_suffix, "/Module/808")
self.assertTrue(mocked_delete.called)
@mock.patch('plantpredict.plant_predict_entity.PlantPredictEntity.get')
def test_get(self, mocked_get):
self._make_mocked_api()
module = Module(api=self.mocked_api, id=808)
module.get()
self.assertEqual(module.get_url_suffix, "/Module/808")
self.assertTrue(mocked_get.called)
@mock.patch('plantpredict.plant_predict_entity.PlantPredictEntity.update')
def test_update(self, mocked_update):
self._make_mocked_api()
module = Module(api=self.mocked_api, id=808)
module.update()
self.assertEqual(module.update_url_suffix, "/Module")
self.assertTrue(mocked_update.called)
def test_parse_full_iv_curves_template_no_sheet_name(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
iv_curves = module._parse_full_iv_curves_template("test_data/test_parse_full_iv_curves_template.xlsx")
self.assertEqual(len(iv_curves), 2)
self.assertEqual(len(iv_curves[0]["data_points"]), 200)
self.assertEqual(iv_curves[0]["temperature"], 25)
self.assertEqual(len(iv_curves[1]["data_points"]), 200)
self.assertEqual(iv_curves[1]["temperature"], 50)
self.assertAlmostEqual(iv_curves[0]["data_points"][3]["current"], 2.450329288)
self.assertAlmostEqual(iv_curves[1]["data_points"][198]["voltage"], 224.9763771)
def test_parse_full_iv_curves_template_with_sheet_name(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
iv_curves = module._parse_full_iv_curves_template(
file_path="test_data/test_parse_full_iv_curves_template.xlsx",
sheet_name="NewSheetName"
)
self.assertEqual(len(iv_curves), 1)
self.assertEqual(len(iv_curves[0]["data_points"]), 200)
self.assertEqual(iv_curves[0]["temperature"], 25)
self.assertAlmostEqual(iv_curves[0]["data_points"][3]["current"], 2.450329288)
def test_parse_key_iv_points_template_no_sheet_name(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
key_iv_points = module._parse_key_iv_points_template(file_path="test_data/test_parse_key_iv_points_template.xlsx")
self.assertEqual(len(key_iv_points), 27)
self.assertEqual(key_iv_points[5]["temperature"], 15)
self.assertEqual(key_iv_points[5]["irradiance"], 1000)
self.assertEqual(key_iv_points[5]["short_circuit_current"], 1.74346881517)
self.assertEqual(key_iv_points[5]["mpp_voltage"], 74.21342493)
def test_parse_key_iv_points_template_with_sheet_name(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
key_iv_points = module._parse_key_iv_points_template(
file_path="test_data/test_parse_key_iv_points_template.xlsx",
sheet_name="NewSheetName"
)
self.assertEqual(len(key_iv_points), 27)
self.assertEqual(key_iv_points[5]["temperature"], 15)
self.assertEqual(key_iv_points[5]["irradiance"], 1000)
self.assertEqual(key_iv_points[5]["short_circuit_current"], 1.74346881517)
self.assertEqual(key_iv_points[5]["mpp_voltage"], 74.21342493)
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_generate_iv_curve(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
iv_curve = module.generate_iv_curve()
module.num_iv_points = 100
self.assertEqual(iv_curve, [{"current": 1.2, "voltage": 100.0}])
def test_process_iv_curves_no_inputs(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
with self.assertRaises(ValueError) as e:
module.process_iv_curves()
self.assertEqual(e.exception.args[0], "Either a file path to the .xslx template for Full IV Curves input or "
"the properly formatted JSON-serializable data structure for Key IV "
"Points input must be assigned as input. See the Python SDK "
"documentation (https://plantpredict-python.readthedocs.io/en/latest/)"
" for more information.")
def test_process_iv_curves_conflicting_inputs(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
with self.assertRaises(ValueError) as e:
module.process_iv_curves(
file_path="fake_file_path.xlsx",
iv_curve_data={
"temperature": 25, "irradiance": 1000, "data_points": [{"current": 1.25, "voltage": 20.0}]
}
)
self.assertEqual(e.exception.args[0], "Only one input option may be specified.")
@mock.patch('plantpredict.module.requests.post', mocked_requests.mocked_requests_post)
def test_process_iv_curves_with_file(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
data = module.process_iv_curves(file_path="test_data/test_parse_full_iv_curves_template.xlsx")
self.assertEqual(data, [
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
},
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
}
])
@mock.patch('plantpredict.module.requests.post', mocked_requests.mocked_requests_post)
    def test_process_iv_curves_with_data(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
data = module.process_iv_curves(iv_curve_data=[
{"temperature": 25, "irradiance": 1000, "data_points": [{"current": 1.25, "voltage": 20.0}]},
{"temperature": 25, "irradiance": 1000, "data_points": [{"current": 1.25, "voltage": 20.0}]}
])
self.assertEqual(data, [
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
},
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
}
])
def test_process_key_iv_points_no_inputs(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
with self.assertRaises(ValueError) as e:
module.process_key_iv_points()
self.assertEqual(e.exception.args[0], "Either a file path to the .xslx template for Key IV Points input or the "
"properly formatted JSON-serializable data structure for Key IV Points "
"input must be assigned as input. See the Python SDK documentation "
"(https://plantpredict-python.readthedocs.io/en/latest/) for more "
"information.")
def test_process_key_iv_points_conflicting_inputs(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
with self.assertRaises(ValueError) as e:
module.process_key_iv_points(
file_path="fake_file_path.xlsx",
key_iv_points_data={
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
}
)
self.assertEqual(e.exception.args[0], "Only one input option may be specified.")
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_process_key_iv_points_with_file(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
response = module.process_key_iv_points(file_path="test_data/test_parse_key_iv_points_template.xlsx")
self.assertEqual(json.loads(response.content), {
"stc_short_circuit_current": 1.7592,
"stc_open_circuit_voltage": 90.2189,
"stc_mpp_current": 1.6084,
"stc_mpp_voltage": 72.4938,
"stc_short_circuit_current_temp_coef": 0.0519,
"stc_open_circuit_voltage_temp_coef": -0.3081,
"stc_power_temp_coef": -0.3535,
"effective_irradiance_response": [
{"temperature": 25, "irradiance": 1000, "relative_efficiency": 1.0},
{"temperature": 25, "irradiance": 800, "relative_efficiency": 1.0039},
{"temperature": 25, "irradiance": 600, "relative_efficiency": 1.0032},
{"temperature": 25, "irradiance": 400, "relative_efficiency": 0.9925},
{"temperature": 25, "irradiance": 200, "relative_efficiency": 0.9582},
]
})
self.assertEqual(module.stc_short_circuit_current, 1.7592)
self.assertEqual(module.effective_irradiance_response, [
{"temperature": 25, "irradiance": 1000, "relative_efficiency": 1.0},
{"temperature": 25, "irradiance": 800, "relative_efficiency": 1.0039},
{"temperature": 25, "irradiance": 600, "relative_efficiency": 1.0032},
{"temperature": 25, "irradiance": 400, "relative_efficiency": 0.9925},
{"temperature": 25, "irradiance": 200, "relative_efficiency": 0.9582},
])
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_process_key_iv_points_with_data(self):
self._make_mocked_api()
module = Module(api=self.mocked_api)
response = module.process_key_iv_points(key_iv_points_data=[
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
},
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
}
])
self.assertEqual(json.loads(response.content), {
"stc_short_circuit_current": 1.7592,
"stc_open_circuit_voltage": 90.2189,
"stc_mpp_current": 1.6084,
"stc_mpp_voltage": 72.4938,
"stc_short_circuit_current_temp_coef": 0.0519,
"stc_open_circuit_voltage_temp_coef": -0.3081,
"stc_power_temp_coef": -0.3535,
"effective_irradiance_response": [
{"temperature": 25, "irradiance": 1000, "relative_efficiency": 1.0},
{"temperature": 25, "irradiance": 800, "relative_efficiency": 1.0039},
{"temperature": 25, "irradiance": 600, "relative_efficiency": 1.0032},
{"temperature": 25, "irradiance": 400, "relative_efficiency": 0.9925},
{"temperature": 25, "irradiance": 200, "relative_efficiency": 0.9582},
]
})
self.assertEqual(module.stc_short_circuit_current, 1.7592)
self.assertEqual(module.effective_irradiance_response, [
{"temperature": 25, "irradiance": 1000, "relative_efficiency": 1.0},
{"temperature": 25, "irradiance": 800, "relative_efficiency": 1.0039},
{"temperature": 25, "irradiance": 600, "relative_efficiency": 1.0032},
{"temperature": 25, "irradiance": 400, "relative_efficiency": 0.9925},
{"temperature": 25, "irradiance": 200, "relative_efficiency": 0.9582},
])
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_calculate_basic_data_at_conditions(self):
self._make_mocked_api()
module = Module(self.mocked_api)
data = module.calculate_basic_data_at_conditions(temperature=25, irradiance=1000)
self.assertEqual(data, [
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
},
{
"temperature": 25,
"irradiance": 1000,
"short_circuit_current": 9.43,
"open_circuit_voltage": 46.39,
"mpp_current": 8.9598,
"mpp_voltage": 38.1285,
"max_power": 341.6237
}
])
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_calculate_effective_irradiance_response(self):
self._make_mocked_api()
module = Module(self.mocked_api)
response = module.calculate_effective_irradiance_response()
self.assertEqual(json.loads(response.content), [
{'temperature': 25, 'irradiance': 1000, 'relative_efficiency': 1.0},
{'temperature': 25, 'irradiance': 800, 'relative_efficiency': 1.02},
{'temperature': 25, 'irradiance': 600, 'relative_efficiency': 1.001},
{'temperature': 25, 'irradiance': 400, 'relative_efficiency': 0.99},
{'temperature': 25, 'irradiance': 200, 'relative_efficiency': 0.97}
])
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_generate_single_diode_parameters_advanced(self):
self._make_mocked_api()
module = Module(self.mocked_api)
response = module.generate_single_diode_parameters_advanced()
self.assertEqual(json.loads(response.content), {
"maximum_series_resistance": 6.0,
"maximum_recombination_parameter": 2.5,
"saturation_current_at_stc": 0.0000000012,
"diode_ideality_factor_at_stc": 1.56,
"linear_temp_dependence_on_gamma": -0.04,
"light_generated_current": 1.8
})
self.assertEqual(module.diode_ideality_factor_at_stc, 1.56)
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_generate_single_diode_parameters_default(self):
self._make_mocked_api()
module = Module(self.mocked_api)
response = module.generate_single_diode_parameters_default()
self.assertEqual(json.loads(response.content), {
"maximum_series_resistance": 6.0,
"maximum_recombination_parameter": 2.5,
"saturation_current_at_stc": 0.0000000012,
"diode_ideality_factor_at_stc": 1.78,
"linear_temp_dependence_on_gamma": -0.04,
"light_generated_current": 1.8
})
self.assertEqual(module.diode_ideality_factor_at_stc, 1.78)
@mock.patch('plantpredict.module.requests.post', new=mocked_requests.mocked_requests_post)
def test_optimize_series_resistance(self):
self._make_mocked_api()
module = Module(self.mocked_api)
response = module.optimize_series_resistance()
self.assertEqual(json.loads(response.content), {
"maximum_series_resistance": 6.0,
"maximum_recombination_parameter": 2.5,
"saturation_current_at_stc": 0.0000000012,
"diode_ideality_factor_at_stc": 1.22,
"linear_temp_dependence_on_gamma": -0.04,
"light_generated_current": 1.8
})
self.assertEqual(module.diode_ideality_factor_at_stc, 1.22)
if __name__ == '__main__':
unittest.main()
| 44.927681 | 122 | 0.61512 | 2,037 | 18,016 | 5.113402 | 0.103093 | 0.064804 | 0.083909 | 0.035906 | 0.90937 | 0.890073 | 0.863575 | 0.814996 | 0.797139 | 0.797139 | 0 | 0.069312 | 0.271259 | 18,016 | 400 | 123 | 45.04 | 0.724046 | 0 | 0 | 0.627841 | 0 | 0 | 0.261157 | 0.103686 | 0 | 0 | 0 | 0 | 0.159091 | 1 | 0.0625 | false | 0 | 0.014205 | 0 | 0.079545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
67508e8934da17caa8abd96a5ec719b45d2e9a23 | 103 | py | Python | orb_simulator/orbsim_language/orbsim_ast/satellite_node.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | 1 | 2022-01-19T22:49:09.000Z | 2022-01-19T22:49:09.000Z | orb_simulator/orbsim_language/orbsim_ast/satellite_node.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | 15 | 2021-11-10T14:25:02.000Z | 2022-02-12T19:17:11.000Z | orb_simulator/orbsim_language/orbsim_ast/satellite_node.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | null | null | null | from orbsim_language.orbsim_ast.atomic_node import AtomicNode
class SatelliteNode(AtomicNode):
pass | 25.75 | 61 | 0.854369 | 13 | 103 | 6.538462 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097087 | 103 | 4 | 62 | 25.75 | 0.913978 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
676dc963f845ddd2ae29b4efef9235a2d5771a4a | 192 | py | Python | modules/runModuleTests.py | EmoryMLIP/DynamicBlocks | 52acc9fbc1a2640c6ac8922fa18105279ccaea97 | [
"MIT"
] | 9 | 2020-06-22T14:59:28.000Z | 2022-01-14T20:13:00.000Z | modules/runModuleTests.py | EmoryMLIP/DynamicBlocks | 52acc9fbc1a2640c6ac8922fa18105279ccaea97 | [
"MIT"
] | null | null | null | modules/runModuleTests.py | EmoryMLIP/DynamicBlocks | 52acc9fbc1a2640c6ac8922fa18105279ccaea97 | [
"MIT"
] | null | null | null | import modules.testConnectingLayer
import modules.testDoubleLayer
import modules.testDoubleSymLayer
import modules.testPreactDoubleLayer
import modules.testRK1Block
import modules.testRK4Block | 32 | 36 | 0.911458 | 18 | 192 | 9.722222 | 0.444444 | 0.445714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01105 | 0.057292 | 192 | 6 | 37 | 32 | 0.955801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
677775e2add6f1856cf3b6724b165d6f70fa7246 | 50 | py | Python | finrl_meta/env_execution_optimizing/order_execution_qlib/trade/policy/__init__.py | eitin-infant/FinRL-Meta | 4c94011e58425796e7e2e5c1bf848afd65c828d6 | [
"MIT"
] | 214 | 2021-11-08T17:06:11.000Z | 2022-03-31T18:29:48.000Z | finrl_meta/env_execution_optimizing/order_execution_qlib/trade/policy/__init__.py | eitin-infant/FinRL-Meta | 4c94011e58425796e7e2e5c1bf848afd65c828d6 | [
"MIT"
] | 51 | 2021-11-14T19:11:02.000Z | 2022-03-30T20:23:08.000Z | finrl_meta/env_execution_optimizing/order_execution_qlib/trade/policy/__init__.py | eitin-infant/FinRL-Meta | 4c94011e58425796e7e2e5c1bf848afd65c828d6 | [
"MIT"
] | 110 | 2021-11-03T07:41:40.000Z | 2022-03-31T03:23:38.000Z | from .ppo_supervision import *
from .ppo import *
| 16.666667 | 30 | 0.76 | 7 | 50 | 5.285714 | 0.571429 | 0.378378 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 50 | 2 | 31 | 25 | 0.880952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
679f10b9c0353807f04c46e5ecf17ff0d997ba7a | 19 | py | Python | contrib/tools/python/src/Lib/plat-mac/Carbon/Snd.py | HeyLey/catboost | f472aed90604ebe727537d9d4a37147985e10ec2 | [
"Apache-2.0"
] | 6,989 | 2017-07-18T06:23:18.000Z | 2022-03-31T15:58:36.000Z | python/src/Lib/plat-mac/Carbon/Snd.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 1,978 | 2017-07-18T09:17:58.000Z | 2022-03-31T14:28:43.000Z | python/src/Lib/plat-mac/Carbon/Snd.py | weiqiangzheng/sl4a | d3c17dca978cbeee545e12ea240a9dbf2a6999e9 | [
"Apache-2.0"
] | 1,228 | 2017-07-18T09:03:13.000Z | 2022-03-29T05:57:40.000Z | from _Snd import *
| 9.5 | 18 | 0.736842 | 3 | 19 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 19 | 1 | 19 | 19 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
67bf543194b9ee16c769e56710911336ed1243ec | 53,398 | py | Python | shakenfist/tests/test_external_api.py | mikalstill/shakenfist | 87332b5e72b2da07a8126238412d18d4aee4be18 | [
"Apache-2.0"
] | 5 | 2020-05-29T02:44:28.000Z | 2020-06-18T23:41:19.000Z | shakenfist/tests/test_external_api.py | mikalstill/shakenfist | 87332b5e72b2da07a8126238412d18d4aee4be18 | [
"Apache-2.0"
] | 21 | 2020-05-03T07:17:05.000Z | 2020-06-17T22:54:38.000Z | shakenfist/tests/test_external_api.py | mikalstill/shakenfist | 87332b5e72b2da07a8126238412d18d4aee4be18 | [
"Apache-2.0"
] | 3 | 2020-04-17T14:43:53.000Z | 2020-06-09T03:30:16.000Z | import base64
import bcrypt
import json
import logging
import mock
from shakenfist.baseobject import (
DatabaseBackedObject as dbo,
State)
from shakenfist.config import config, BaseSettings, SFConfig
from shakenfist.external_api import app as external_api
from shakenfist.ipmanager import IPManager
from shakenfist.tests import base
from shakenfist.tests.mock_etcd import MockEtcd
class FakeResponse(object):
def __init__(self, status_code, text):
self.status_code = status_code
self.text = text
def json(self):
return json.loads(self.text)
class FakeScheduler(object):
def place_instance(self, *args, **kwargs):
return config.NODE_NAME
class BaseFakeObject(object):
def __init__(self, state=None):
self._state = state
@property
def state(self):
if isinstance(self._state, list):
s = self._state[0]
self._state = self._state[1:]
return State(s, 1)
else:
return State(self._state, 1)
@state.setter
def state(self, state):
self._state = state
def unique_label(self):
return ('instance', self.uuid)
def add_event(self, operation, phase, duration=None, msg=None):
pass
def delete(self):
pass
class FakeInstance(BaseFakeObject):
object_type = 'instance'
def __init__(self, uuid=None, namespace=None,
state=dbo.STATE_CREATED, power_state='on',
placement='node1'):
super(FakeInstance, self).__init__(state)
self.uuid = uuid
self.namespace = namespace
self.power_state = {'power_state': power_state}
self.placement = {'node': placement}
self.version = 2
self.interfaces = []
class FakeNetwork(BaseFakeObject):
object_type = 'network'
def __init__(self, uuid=None, vxid=None, namespace=None,
name=None, netblock=None, state=dbo.STATE_CREATED):
super(FakeNetwork, self).__init__(state)
self.uuid = uuid
self.vxid = vxid
self.namespace = namespace
self.name = name
self.netblock = netblock
self.version = 2
self.provide_nat = True
def is_dead(self):
return False
def remove_dhcp(self):
pass
def _encode_key(key):
return bcrypt.hashpw(key.encode('utf-8'), bcrypt.gensalt())
def _clean_traceback(resp):
if 'traceback' in resp:
del resp['traceback']
return resp
class AuthTestCase(base.ShakenFistTestCase):
def setUp(self):
super(AuthTestCase, self).setUp()
external_api.TESTING = True
external_api.app.testing = True
external_api.app.debug = False
external_api.app.logger.addHandler(logging.StreamHandler())
external_api.app.logger.setLevel(logging.DEBUG)
logging.root.setLevel(logging.DEBUG)
self.get_namespace = mock.patch('shakenfist.db.get_namespace')
self.mock_get_namespace = self.get_namespace.start()
self.addCleanup(self.get_namespace.stop)
# The client must be created after all the mocks, or the mocks are not
# correctly applied.
self.client = external_api.app.test_client()
def test_post_auth_no_args(self):
resp = self.client.post('/auth', data=json.dumps({}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'missing namespace in request',
'status': 400
},
resp.get_json())
def test_post_auth_no_key(self):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana'}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'missing key in request',
'status': 400
},
resp.get_json())
def test_post_auth_bad_parameter(self):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'keyyy': 'pwd'}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': "post() got an unexpected keyword argument 'keyyy'",
'status': 400
},
_clean_traceback(resp.get_json()))
def test_post_auth_key_non_string(self):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'key': 1234}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'key is not a string',
'status': 400
},
resp.get_json())
@mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': None,
'keys': {
'key1': str(base64.b64encode(_encode_key('bacon')), 'utf-8')
}
})
def test_post_auth(self, mock_get_keys):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'key': 'bacon'}))
self.assertEqual(200, resp.status_code)
self.assertIn('access_token', resp.get_json())
@mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': 'cheese',
'keys': {
'key1': str(base64.b64encode(_encode_key('bacon')), 'utf-8')
}
})
def test_post_auth_not_authorized(self, mock_get_keys):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'key': 'hamster'}))
self.assertEqual(401, resp.status_code)
self.assertEqual(
{
'error': 'unauthorized',
'status': 401
},
resp.get_json())
def test_no_auth_header(self):
resp = self.client.post('/auth/namespaces',
data=json.dumps({
'namespace': 'foo'
}))
self.assertEqual(401, resp.status_code)
self.assertEqual(
{
'error': 'Missing Authorization Header',
'status': 401
},
_clean_traceback(resp.get_json()))
def test_auth_header_wrong(self):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': 'l33thacker'},
data=json.dumps({
'namespace': 'foo'
}))
self.assertEqual(
{
'error': "Bad Authorization header. Expected value 'Bearer <JWT>'",
'status': 401
},
_clean_traceback(resp.get_json()))
self.assertEqual(401, resp.status_code)
def test_auth_header_bad_jwt(self):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': 'Bearer l33thacker'},
data=json.dumps({
'namespace': 'foo'
}))
self.assertEqual(
{
'error': 'invalid JWT in Authorization header',
'status': 401
},
_clean_traceback(resp.get_json()))
self.assertEqual(401, resp.status_code)
class AuthNoNamespaceMockTestCase(base.ShakenFistTestCase):
def setUp(self):
super(AuthNoNamespaceMockTestCase, self).setUp()
external_api.TESTING = True
external_api.app.testing = True
external_api.app.debug = False
external_api.app.logger.addHandler(logging.StreamHandler())
external_api.app.logger.setLevel(logging.DEBUG)
logging.root.setLevel(logging.DEBUG)
# The client must be created after all the mocks, or the mocks are not
# correctly applied.
self.client = external_api.app.test_client()
@mock.patch('shakenfist.etcd.get',
return_value={
'service_key': 'cheese',
'keys': {
'key1': str(base64.b64encode(_encode_key('bacon')), 'utf-8'),
'key2': str(base64.b64encode(_encode_key('sausage')), 'utf-8')
}
})
def test_post_auth_service_key(self, mock_get):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'key': 'cheese'}))
self.assertEqual(200, resp.status_code)
self.assertIn('access_token', resp.get_json())
class ExternalApiTestCase(base.ShakenFistTestCase):
def setUp(self):
super(ExternalApiTestCase, self).setUp()
self.recorded_op = mock.patch(
'shakenfist.util.general.RecordedOperation')
self.recorded_op.start()
self.addCleanup(self.recorded_op.stop)
self.mock_etcd = MockEtcd(self, node_count=4)
self.mock_etcd.setup()
self.scheduler = mock.patch(
'shakenfist.scheduler.Scheduler', FakeScheduler)
self.mock_scheduler = self.scheduler.start()
self.addCleanup(self.scheduler.stop)
external_api.TESTING = True
external_api.app.testing = True
external_api.app.debug = False
external_api.app.logger.addHandler(logging.StreamHandler())
external_api.app.logger.setLevel(logging.DEBUG)
logging.root.setLevel(logging.DEBUG)
fake_config = SFConfig(
NODE_NAME='node1',
)
self.config = mock.patch('shakenfist.instance.config', fake_config)
self.mock_config = self.config.start()
self.addCleanup(self.config.stop)
# The client must be created after all the mocks, or the mocks are not
# correctly applied.
self.client = external_api.app.test_client()
self.mock_etcd.create_namespace('system', 'key1', 'bar')
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'system', 'key': 'bar'}))
self.assertEqual(200, resp.status_code)
self.auth_token = 'Bearer %s' % resp.get_json()['access_token']
self.mock_etcd.create_namespace('two', 'key1', 'space')
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'two', 'key': 'space'}))
self.assertEqual(200, resp.status_code)
self.auth_token_two = 'Bearer %s' % resp.get_json()['access_token']
self.mock_etcd.create_namespace('three', 'key1', 'pass')
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'three', 'key': 'pass'}))
self.assertEqual(200, resp.status_code)
self.auth_token_three = 'Bearer %s' % resp.get_json()['access_token']
class ExternalApiGeneralTestCase(ExternalApiTestCase):
def test_get_root(self):
resp = self.client.get('/')
self.assertEqual('Shaken Fist REST API service',
resp.get_data().decode('utf-8'))
self.assertEqual(200, resp.status_code)
self.assertEqual('text/plain; charset=utf-8', resp.content_type)
def test_auth_add_key_missing_args(self):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': self.auth_token},
data=json.dumps({}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'no namespace specified',
'status': 400
},
resp.get_json())
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.etcd.get', return_value=None)
@mock.patch('shakenfist.etcd.put')
def test_auth_add_key_missing_keyname(self, mock_put, mock_get, mock_lock):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': self.auth_token},
data=json.dumps({
'namespace': 'foo'
}))
self.assertEqual(200, resp.status_code)
self.assertEqual('foo', resp.get_json())
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.etcd.get', return_value=None)
@mock.patch('shakenfist.etcd.put')
def test_auth_add_key_missing_key(self, mock_put, mock_get, mock_lock):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': self.auth_token},
data=json.dumps({
'namespace': 'foo',
'key_name': 'bernard'
}))
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'no key specified',
'status': 400
},
resp.get_json())
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.etcd.get', return_value=None)
def test_auth_add_key_illegal_keyname(self, mock_get, mock_lock):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': self.auth_token},
data=json.dumps({
'namespace': 'foo',
'key_name': 'service_key',
'key': 'cheese'
}))
self.assertEqual(
{
'error': 'illegal key name',
'status': 403
},
resp.get_json())
self.assertEqual(403, resp.status_code)
@mock.patch('shakenfist.etcd.get_all',
return_value=[
('/sf/namespace/aaa', {'name': 'aaa'}),
('/sf/namespace/bbb', {'name': 'bbb'}),
('/sf/namespace/ccc', {'name': 'ccc'})
])
def test_get_namespaces(self, mock_get_all):
resp = self.client.get('/auth/namespaces',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual(['aaa', 'bbb', 'ccc'], resp.get_json())
def test_delete_namespace_missing_args(self):
resp = self.client.delete('/auth/namespaces',
headers={'Authorization': self.auth_token})
self.assertEqual(405, resp.status_code)
self.assertEqual(
{
'message': 'The method is not allowed for the requested URL.'
},
resp.get_json())
def test_delete_namespace_system(self):
resp = self.client.delete('/auth/namespaces/system',
headers={'Authorization': self.auth_token})
self.assertEqual(403, resp.status_code)
self.assertEqual(
{
'error': 'you cannot delete the system namespace',
'status': 403
},
resp.get_json())
@mock.patch('shakenfist.instance.Instance._db_get_attribute',
return_value={'value': dbo.STATE_CREATED, 'update_time': 2})
@mock.patch('shakenfist.instance.Instances',
return_value=[FakeInstance(uuid='123')])
def test_delete_namespace_with_instances(self, mock_get_instances,
mock_get_instance_attribute):
resp = self.client.delete('/auth/namespaces/foo',
headers={'Authorization': self.auth_token})
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'you cannot delete a namespace with instances',
'status': 400
},
resp.get_json())
@mock.patch('shakenfist.instance.Instances', return_value=[])
@mock.patch('shakenfist.net.Networks',
return_value=[FakeNetwork(uuid='123', state=dbo.STATE_CREATED)])
def test_delete_namespace_with_networks(self, mock_get_networks, mock_get_instances):
resp = self.client.delete('/auth/namespaces/foo',
headers={'Authorization': self.auth_token})
self.assertEqual(400, resp.status_code)
self.assertEqual(
{
'error': 'you cannot delete a namespace with networks',
'status': 400
},
resp.get_json())
def test_delete_namespace_key_missing_args(self):
resp = self.client.delete('/auth/namespaces/system/',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
self.assertEqual(None, resp.get_json())
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.etcd.get', return_value={'keys': {}})
def test_delete_namespace_key_missing_key(self, mock_get, mock_lock):
resp = self.client.delete('/auth/namespaces/system/keys/mykey',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
self.assertEqual(
{
'error': 'key name not found in namespace',
'status': 404
},
resp.get_json())
@mock.patch('shakenfist.db.get_metadata', return_value={'a': 'a', 'b': 'b'})
def test_get_namespace_metadata(self, mock_md_get):
resp = self.client.get(
'/auth/namespaces/system/metadata', headers={'Authorization': self.auth_token})
self.assertEqual({'a': 'a', 'b': 'b'}, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
@mock.patch('shakenfist.db.get_metadata', return_value={})
@mock.patch('shakenfist.db.persist_metadata')
@mock.patch('shakenfist.db.get_lock')
def test_put_namespace_metadata(self, mock_get_lock, mock_md_put,
mock_md_get):
resp = self.client.put('/auth/namespaces/system/metadata/foo',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
mock_md_put.assert_called_with('namespace', 'system', {'foo': 'bar'})
@mock.patch('shakenfist.db.get_metadata', return_value={})
@mock.patch('shakenfist.db.persist_metadata')
@mock.patch('shakenfist.db.get_lock')
def test_post_namespace_metadata(self, mock_get_lock, mock_md_put,
mock_md_get):
resp = self.client.post('/auth/namespaces/system/metadata',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
mock_md_put.assert_called_with('namespace', 'system', {'foo': 'bar'})
@mock.patch('shakenfist.db.get_metadata', return_value={'foo': 'bar', 'real': 'smart'})
@mock.patch('shakenfist.db.persist_metadata')
@mock.patch('shakenfist.db.get_lock')
def test_delete_namespace_metadata(self, mock_get_lock, mock_md_put,
mock_md_get):
resp = self.client.delete('/auth/namespaces/system/metadata/foo',
headers={'Authorization': self.auth_token})
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
mock_md_put.assert_called_with('namespace', 'system', {'real': 'smart'})
@mock.patch('shakenfist.db.get_metadata', return_value={})
@mock.patch('shakenfist.db.persist_metadata')
@mock.patch('shakenfist.db.get_lock')
def test_delete_namespace_metadata_bad_key(self, mock_get_lock,
mock_md_put, mock_md_get):
resp = self.client.delete('/auth/namespaces/system/metadata/wrong',
headers={'Authorization': self.auth_token})
self.assertEqual({'error': 'key not found', 'status': 404},
resp.get_json())
self.assertEqual(404, resp.status_code)
@mock.patch('shakenfist.db.get_metadata', return_value={'foo': 'bar', 'real': 'smart'})
@mock.patch('shakenfist.db.persist_metadata')
@mock.patch('shakenfist.db.get_lock')
def test_delete_namespace_metadata_no_keys(self, mock_get_lock,
mock_md_put, mock_md_get):
resp = self.client.delete('/auth/namespaces/system/metadata/wrong',
headers={'Authorization': self.auth_token})
self.assertEqual({'error': 'key not found', 'status': 404},
resp.get_json())
self.assertEqual(404, resp.status_code)
def test_get_instance(self):
self.mock_etcd.create_instance('barry')
self.mock_etcd.create_instance('alice')
self.mock_etcd.create_instance('bob')
# Instance by name
resp = self.client.get('/instances/barry',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000001',
resp.get_json().get('uuid'))
resp = self.client.get('/instances/bob',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000003',
resp.get_json().get('uuid'))
# Instance by name - WRONG
resp = self.client.get('/instances/bazza',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
# Instance by UUID
resp = self.client.get('/instances/12345678-1234-4321-1234-000000000001',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000001',
resp.get_json().get('uuid'))
# Instance by UUID - WRONG
resp = self.client.get('/instances/12345678-1234-4321-1234-111111111111',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
def test_get_instance_by_namespace(self):
self.mock_etcd.create_instance('barry')
self.mock_etcd.create_instance('barry', namespace='two')
self.mock_etcd.create_instance('bob', namespace='two')
# Instance by name
resp = self.client.get('/instances/barry',
headers={'Authorization': self.auth_token})
self.assertEqual(400, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual(
{'error': 'multiple instances have the name "barry"', 'status': 400},
resp.get_json())
resp = self.client.get('/instances/barry',
headers={'Authorization': self.auth_token_two})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000002',
resp.get_json().get('uuid'))
resp = self.client.get('/instances/bob',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000003',
resp.get_json().get('uuid'))
# Instance by name - WRONG name
resp = self.client.get('/instances/bazza',
headers={'Authorization': self.auth_token_two})
self.assertEqual(404, resp.status_code)
# Instance by name - WRONG namespace
resp = self.client.get('/instances/barry',
headers={'Authorization': self.auth_token_three})
self.assertEqual(404, resp.status_code)
def test_get_instance_metadata(self):
self.mock_etcd.create_instance('banana', metadata={'a': 'a', 'b': 'b'})
resp = self.client.get(
'/instances/12345678-1234-4321-1234-000000000001/metadata',
headers={'Authorization': self.auth_token})
self.assertEqual({'a': 'a', 'b': 'b'}, resp.get_json())
self.assertEqual('application/json', resp.content_type)
self.assertEqual(200, resp.status_code)
def test_put_instance_metadata(self):
self.mock_etcd.create_instance('banana')
resp = self.client.put(
'/instances/12345678-1234-4321-1234-000000000001/metadata/foo',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual(
{'foo': 'bar'},
json.loads(self.mock_etcd.db[
'/sf/metadata/instance/12345678-1234-4321-1234-000000000001']))
def test_post_instance_metadata(self):
self.mock_etcd.create_instance('banana')
resp = self.client.post(
'/instances/12345678-1234-4321-1234-000000000001/metadata',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual(
{'foo': 'bar'},
json.loads(self.mock_etcd.db[
'/sf/metadata/instance/12345678-1234-4321-1234-000000000001']))
def test_get_network(self):
self.mock_etcd.create_network('barry')
self.mock_etcd.create_network('alice')
self.mock_etcd.create_network('bob')
# Network by name
resp = self.client.get('/networks/barry',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000001',
resp.get_json().get('uuid'))
resp = self.client.get('/networks/bob',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000003',
resp.get_json().get('uuid'))
# Network by name - WRONG
resp = self.client.get('/networks/bazza',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
# Network by UUID
resp = self.client.get('/networks/12345678-1234-4321-1234-000000000001',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
self.assertEqual('12345678-1234-4321-1234-000000000001',
resp.get_json().get('uuid'))
# Network by UUID - WRONG
resp = self.client.get('/networks/12345678-1234-4321-1234-111111111111',
headers={'Authorization': self.auth_token})
self.assertEqual(404, resp.status_code)
def test_get_network_metadata(self):
self.mock_etcd.create_network('banana', namespace='foo',
metadata={'a': 'a', 'b': 'b'})
resp = self.client.get(
'/networks/12345678-1234-4321-1234-000000000001/metadata',
headers={'Authorization': self.auth_token})
self.assertEqual({'a': 'a', 'b': 'b'}, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual('application/json', resp.content_type)
def test_put_network_metadata(self):
self.mock_etcd.create_network('banana', namespace='foo')
resp = self.client.put(
'/networks/12345678-1234-4321-1234-000000000001/metadata/foo',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual(
{'foo': 'bar'},
json.loads(self.mock_etcd.db[
'/sf/metadata/network/12345678-1234-4321-1234-000000000001']))
def test_post_network_metadata(self):
self.mock_etcd.create_network('banana', namespace='foo')
resp = self.client.post(
'/networks/12345678-1234-4321-1234-000000000001/metadata',
headers={'Authorization': self.auth_token},
data=json.dumps({
'key': 'foo',
'value': 'bar'
}))
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual(
{'foo': 'bar'},
json.loads(self.mock_etcd.db[
'/sf/metadata/network/12345678-1234-4321-1234-000000000001']))
def test_delete_instance_metadata(self):
self.mock_etcd.create_instance('banana',
metadata={'foo': 'bar', 'real': 'smart'})
resp = self.client.delete(
'/instances/12345678-1234-4321-1234-000000000001/metadata/foo',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
self.assertEqual(None, resp.get_json())
self.assertEqual(
{'real': 'smart'},
json.loads(self.mock_etcd.db[
'/sf/metadata/instance/12345678-1234-4321-1234-000000000001']))
def test_delete_instance_metadata_bad_key(self):
self.mock_etcd.create_instance('banana',
metadata={'foo': 'bar', 'real': 'smart'})
resp = self.client.delete(
'/instances/12345678-1234-4321-1234-000000000001/metadata/wrong',
headers={'Authorization': self.auth_token})
self.assertEqual({'error': 'key not found', 'status': 404},
resp.get_json())
self.assertEqual(404, resp.status_code)
def test_delete_network_metadata(self):
self.mock_etcd.create_network('banana', namespace='foo',
metadata={'foo': 'bar', 'real': 'smart'})
resp = self.client.delete(
'/networks/12345678-1234-4321-1234-000000000001/metadata/foo',
headers={'Authorization': self.auth_token})
self.assertEqual(None, resp.get_json())
self.assertEqual(200, resp.status_code)
self.assertEqual(
{'real': 'smart'},
json.loads(self.mock_etcd.db[
'/sf/metadata/network/12345678-1234-4321-1234-000000000001']))
def test_delete_network_metadata_bad_key(self):
self.mock_etcd.create_network('banana', namespace='system',
metadata={'foo': 'bar', 'real': 'smart'})
resp = self.client.delete(
'/networks/12345678-1234-4321-1234-000000000001/metadata/wrong',
headers={'Authorization': self.auth_token})
self.assertEqual({'error': 'key not found', 'status': 404},
resp.get_json())
self.assertEqual(404, resp.status_code)
class ExternalApiInstanceTestCase(ExternalApiTestCase):
def setUp(self):
super(ExternalApiInstanceTestCase, self).setUp()
def fake_virt_from_db(uuid):
return {'uuid': uuid}
self.virt_from_db = mock.patch('shakenfist.instance.Instance.from_db',
fake_virt_from_db)
self.mock_virt_from_db = self.virt_from_db.start()
self.addCleanup(self.virt_from_db.stop)
class FakeConfig(BaseSettings):
API_ASYNC_WAIT: int = 1
LOG_METHOD_TRACE: int = 1
fake_config = FakeConfig()
self.config = mock.patch('shakenfist.config.config', fake_config)
self.mock_config = self.config.start()
self.addCleanup(self.config.stop)
@mock.patch('shakenfist.etcd.enqueue')
@mock.patch('shakenfist.instance.Instances',
return_value=[
FakeInstance(
namespace='system',
uuid='6a973b82-31b3-4780-93e4-04d99ae49f3f',
state=[dbo.STATE_CREATED]),
FakeInstance(
namespace='system',
uuid='847b0327-9b17-4148-b4ed-be72b6722c17',
state=[dbo.STATE_CREATED])])
@mock.patch('shakenfist.etcd.put')
@mock.patch('shakenfist.db.get_lock')
def test_delete_all_instances(
self, mock_db_get_lock, mock_etcd_put,
mock_get_instances, mock_enqueue):
resp = self.client.delete('/instances',
headers={'Authorization': self.auth_token},
data=json.dumps({
'confirm': True,
'namespace': 'system'
}))
self.assertEqual(['6a973b82-31b3-4780-93e4-04d99ae49f3f',
'847b0327-9b17-4148-b4ed-be72b6722c17'],
resp.get_json())
self.assertEqual(200, resp.status_code)
def test_post_instance_no_disk(self):
resp = self.client.post('/instances',
headers={'Authorization': self.auth_token},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [],
'disk': None,
'ssh_key': None,
'user_data': None,
'placed_on': None,
'namespace': None,
}))
self.assertEqual(
{'error': 'instance must specify at least one disk', 'status': 400},
resp.get_json())
self.assertEqual(400, resp.status_code)
def test_post_instance_invalid_disk(self):
resp = self.client.post('/instances',
headers={'Authorization': self.auth_token},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [],
'disk': ['8@cirros'],
'ssh_key': None,
'user_data': None,
'placed_on': None,
'namespace': None,
}))
self.assertEqual(
{'error': 'disk specification should contain JSON objects', 'status': 400},
resp.get_json())
self.assertEqual(400, resp.status_code)
@mock.patch('shakenfist.artifact.Artifact.from_url')
def test_post_instance_invalid_network(self, mock_get_artifact):
resp = self.client.post('/instances',
headers={'Authorization': self.auth_token},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': ['87c15186-5f73-4947-a9fb-2183c4951efc'],
'disk': [{'size': 8,
'base': 'cirros'}],
'ssh_key': None,
'user_data': None,
'placed_on': None,
'namespace': None,
}))
self.assertEqual(
{'error': 'network specification should contain JSON objects', 'status': 400},
resp.get_json())
self.assertEqual(400, resp.status_code)
@mock.patch('shakenfist.artifact.Artifact.from_url')
def test_post_instance_invalid_network_uuid(self, mock_get_artifact):
resp = self.client.post('/instances',
headers={'Authorization': self.auth_token},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [
{'uuid': '87c15186-5f73-4947-a9fb-2183c4951efc'}],
'disk': [{'size': 8,
'base': 'cirros'}],
'ssh_key': None,
'user_data': None,
'placed_on': None,
'namespace': None,
}))
self.assertEqual(
{'error': 'network specification is missing network_uuid', 'status': 400},
resp.get_json())
self.assertEqual(400, resp.status_code)
@mock.patch('shakenfist.artifact.Artifact.from_url')
@mock.patch('shakenfist.net.Network._db_get_attribute',
return_value={'value': dbo.STATE_CREATED, 'update_time': 2})
@mock.patch('shakenfist.net.Network.from_db',
return_value=FakeNetwork(
uuid='87c15186-5f73-4947-a9fb-2183c4951efc',
vxid=1,
namespace='nonespace',
name='bob',
netblock='10.10.0.0/24'
))
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.ipmanager.IPManager.from_db')
def test_post_instance_only_system_specifies_namespaces(
self, mock_ipmanager, mock_lock, mock_net, mock_net_attribute,
mock_get_artifact):
with mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': 'foo',
'keys': {
'key1': str(base64.b64encode(_encode_key('bar')))
}
}):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'banana', 'key': 'foo'}))
self.assertEqual(200, resp.status_code)
non_system_auth_header = 'Bearer %s' % resp.get_json()[
'access_token']
resp = self.client.post('/instances',
headers={
'Authorization': non_system_auth_header},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [
{'network_uuid': '87c15186-5f73-4947-a9fb-2183c4951efc'}],
'disk': [{'size': 8,
'base': 'cirros'}],
'ssh_key': None,
'user_data': None,
'placed_on': None,
'namespace': 'gerkin',
}))
self.assertEqual(
{'error': 'namespace not found',
'status': 404},
resp.get_json())
self.assertEqual(404, resp.status_code)
def test_post_instance_specific_ip(self):
self.mock_etcd.create_network('betsy', netblock='10.1.2.0/24',
namespace='two')
# Request in range IP address
resp = self.client.post(
'/instances',
headers={'Authorization': self.auth_token_two},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [{'network_uuid': 'betsy',
'address': '10.1.2.11'}],
'disk': [{'size': 8,
'base': 'cirros'}],
'namespace': 'two',
}))
self.assertEqual(200, resp.status_code)
# Request out of range IP address
resp = self.client.post(
'/instances',
headers={'Authorization': self.auth_token_two},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [{'network_uuid': 'betsy',
'address': '10.1.200.11'}],
'disk': [{'size': 8,
'base': 'cirros'}],
'namespace': 'two',
}))
self.assertEqual(400, resp.status_code)
# Check that instance create API catches duplicate network names
self.mock_etcd.create_network('betsy', netblock='10.1.3.0/24',
namespace='two')
resp = self.client.post(
'/instances',
headers={'Authorization': self.auth_token_two},
data=json.dumps({
'name': 'test-instance',
'cpus': 1,
'memory': 1024,
'network': [{'network_uuid': 'betsy',
'address': '10.1.2.11'}],
'disk': [{'size': 8,
'base': 'cirros'}],
'namespace': 'two',
}))
self.assertEqual(400, resp.status_code)
self.assertEqual('multiple networks have the name "betsy"',
resp.get_json().get('error'))
class ExternalApiNetworkTestCase(base.ShakenFistTestCase):
def setUp(self):
super(ExternalApiNetworkTestCase, self).setUp()
self.add_event = mock.patch('shakenfist.db.add_event')
self.mock_add_event = self.add_event.start()
self.addCleanup(self.add_event.stop)
self.scheduler = mock.patch(
'shakenfist.scheduler.Scheduler', FakeScheduler)
self.mock_scheduler = self.scheduler.start()
self.addCleanup(self.scheduler.stop)
external_api.TESTING = True
external_api.app.testing = True
external_api.app.debug = False
external_api.app.logger.addHandler(logging.StreamHandler())
external_api.app.logger.setLevel(logging.DEBUG)
logging.root.setLevel(logging.DEBUG)
fake_config = SFConfig(
NODE_NAME='seriously',
NODE_EGRESS_IP='127.0.0.1',
NETWORK_NODE_IP='127.0.0.1',
LOG_METHOD_TRACE=1,
NODE_EGRESS_NIC='eth0',
NODE_MESH_NIC='eth1',
NODE_IS_NETWORK_NODE=True
)
self.config = mock.patch(
'shakenfist.external_api.base.config', fake_config)
self.mock_config = self.config.start()
self.addCleanup(self.config.stop)
self.get_namespace = mock.patch('shakenfist.db.get_namespace')
self.mock_get_namespace = self.get_namespace.start()
self.addCleanup(self.get_namespace.stop)
# The client must be created after all the mocks, or the mocks are not
# correctly applied.
self.client = external_api.app.test_client()
# Make a fake auth token
with mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': 'foo',
'keys': {
'key1': str(base64.b64encode(_encode_key('bar')))
}
}):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'system', 'key': 'foo'}))
self.assertEqual(200, resp.status_code)
self.auth_token = 'Bearer %s' % resp.get_json()['access_token']
@mock.patch('shakenfist.net.Network._db_get_attribute',
return_value={'value': dbo.STATE_CREATED, 'update_time': 2})
@mock.patch('shakenfist.ipmanager.IPManager.from_db')
@mock.patch('shakenfist.net.Network.from_db',
return_value=FakeNetwork(
uuid='30f6da44-look-i-am-uuid',
vxid=1,
namespace='nonespace',
name='bob',
netblock='10.10.0.0/24'
))
@mock.patch('shakenfist.net.Networks',
return_value=[FakeNetwork(
uuid='30f6da44-look-i-am-uuid',
vxid=1,
namespace='nonespace',
name='bob',
netblock='10.10.0.0/24'
)])
@mock.patch('shakenfist.networkinterface.interfaces_for_network', return_value=[])
@mock.patch('shakenfist.ipmanager.IPManager.from_db',
return_value=IPManager('uuid', '10.0.0.0/24'))
@mock.patch('shakenfist.net.Network.remove_dhcp')
@mock.patch('shakenfist.net.Network.delete_on_network_node')
@mock.patch('shakenfist.net.Network.delete_on_hypervisor')
@mock.patch('shakenfist.net.Network.state')
@mock.patch('shakenfist.etcd.put')
@mock.patch('shakenfist.etcd.enqueue')
@mock.patch('shakenfist.db.get_lock')
def test_delete_all_networks(self,
mock_db_get_lock,
mock_etcd_enqueue,
mock_etcd_put,
mock_network_state,
mock_delete_on_hypervisor,
mock_delete_on_network_node,
mock_remove_dhcp,
mock_get_ipmanager,
mock_network_interfaces,
mock_db_get_networks,
mock_db_get_network,
mock_ipmanager_from_db,
mock_net_attribute):
self.client = external_api.app.test_client()
resp = self.client.delete('/networks',
headers={'Authorization': self.auth_token},
data=json.dumps({
'confirm': True,
'namespace': 'foo'
}))
self.assertEqual(['30f6da44-look-i-am-uuid'], resp.get_json())
self.assertEqual(200, resp.status_code)
@mock.patch('shakenfist.net.Network.from_db',
return_value=FakeNetwork(
uuid='30f6da44-look-i-am-uuid',
vxid=1,
namespace='foo',
name='bob',
netblock='10.10.0.0/24',
state=dbo.STATE_DELETED
))
@mock.patch('shakenfist.etcd.get_all',
return_value=[(None, {'uuid': '30f6da44-look-i-am-uuid'})])
@mock.patch('shakenfist.networkinterface.interfaces_for_network', return_value=[])
@mock.patch('shakenfist.ipmanager.IPManager.from_db',
return_value=IPManager('uuid', '10.0.0.0/24'))
@mock.patch('shakenfist.net.Network.remove_dhcp')
@mock.patch('shakenfist.etcd.put')
@mock.patch('shakenfist.db.get_lock')
def test_delete_all_networks_none_to_delete(self,
mock_db_get_lock,
mock_etcd_put,
mock_remove_dhcp,
mock_get_ipmanager,
mock_network_interfaces,
mock_db_get_networks,
mock_db_get_network):
resp = self.client.delete('/networks',
headers={'Authorization': self.auth_token},
data=json.dumps({
'confirm': True,
'namespace': 'foo'
}))
self.assertEqual([], resp.get_json())
class ExternalApiNoNamespaceMockTestCase(base.ShakenFistTestCase):
def setUp(self):
super(ExternalApiNoNamespaceMockTestCase, self).setUp()
self.add_event = mock.patch(
'shakenfist.db.add_event')
self.mock_add_event = self.add_event.start()
self.addCleanup(self.add_event.stop)
self.scheduler = mock.patch(
'shakenfist.scheduler.Scheduler', FakeScheduler)
self.mock_scheduler = self.scheduler.start()
self.addCleanup(self.scheduler.stop)
external_api.TESTING = True
external_api.app.testing = True
external_api.app.debug = False
external_api.app.logger.addHandler(logging.StreamHandler())
external_api.app.logger.setLevel(logging.DEBUG)
logging.root.setLevel(logging.DEBUG)
fake_config = SFConfig(
NODE_NAME='node1',
)
self.config = mock.patch('shakenfist.instance.config',
fake_config)
self.mock_config = self.config.start()
self.addCleanup(self.config.stop)
# The client must be created after all the mocks, or the mocks are not
# correctly applied.
self.client = external_api.app.test_client()
# Make a fake auth token
with mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': 'foo',
'keys': {
'key1': str(base64.b64encode(_encode_key('bar')))
}
}):
resp = self.client.post(
'/auth', data=json.dumps({'namespace': 'system', 'key': 'foo'}))
self.assertEqual(200, resp.status_code)
self.auth_token = 'Bearer %s' % resp.get_json()[
'access_token']
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.db.get_namespace',
return_value={
'service_key': 'foo',
'keys': {
'mykey': str(base64.b64encode(_encode_key('bar')))
}
})
@mock.patch('shakenfist.etcd.put')
def test_delete_namespace_key(self, mock_put, mock_get, mock_lock):
resp = self.client.delete('/auth/namespaces/system/keys/mykey',
headers={'Authorization': self.auth_token})
self.assertEqual(200, resp.status_code)
mock_put.assert_called_with('namespace', None, 'system',
{'service_key': 'foo', 'keys': {}})
@mock.patch('shakenfist.db.get_lock')
@mock.patch('shakenfist.etcd.get', return_value=None)
@mock.patch('shakenfist.etcd.put')
@mock.patch('bcrypt.hashpw', return_value='terminator'.encode('utf-8'))
def test_auth_add_key_new_namespace(self, mock_hashpw, mock_put, mock_get, mock_lock):
resp = self.client.post('/auth/namespaces',
headers={'Authorization': self.auth_token},
data=json.dumps({
'namespace': 'foo',
'key_name': 'bernard',
'key': 'cheese'
}))
self.assertEqual(200, resp.status_code)
self.assertEqual('foo', resp.get_json())
mock_put.assert_called_with(
'namespace', None, 'foo',
{'name': 'foo', 'keys': {'bernard': 'dGVybWluYXRvcg=='}})
# Source: baselines/rnd_gail/rnd_critic.py from HaoranLiao/RED (MIT license).
import tensorflow as tf
from baselines.common import tf_util as U
from baselines.common.dataset import iterbatches
from baselines import logger
class RND_Critic(object):
def __init__(self, ob_size, ac_size, rnd_hid_size=128, rnd_hid_layer=4, hid_size=128, hid_layer=1,
out_size=128, scale=250000.0, offset=0., reward_scale=1.0, scope="rnd"):
self.scope = scope
self.scale = scale
self.offset = offset
self.out_size = out_size
self.rnd_hid_size = rnd_hid_size
self.rnd_hid_layer = rnd_hid_layer
self.hid_size = hid_size
self.hid_layer = hid_layer
self.reward_scale = reward_scale
print("RND Critic")
ob = tf.placeholder(tf.float32, [None, ob_size])
ac = tf.placeholder(tf.float32, [None, ac_size])
lr = tf.placeholder(tf.float32, None)
feat = self.build_graph(ob, ac, self.scope, hid_layer, hid_size, out_size)
rnd_feat = self.build_graph(ob, ac, self.scope+"_rnd", rnd_hid_layer, rnd_hid_size, out_size)
feat_loss = tf.reduce_mean(tf.square(feat-rnd_feat))
        self.reward = reward_scale * tf.exp(offset - tf.reduce_mean(tf.square(feat - rnd_feat), axis=-1) * self.scale)
rnd_loss = tf.reduce_mean(tf.square(feat - rnd_feat), axis=-1) * self.scale
# self.reward = reward_scale * tf.exp(offset - rnd_loss)
# self.reward = reward_scale * (tf.math.softplus(rnd_loss) - rnd_loss)
self.reward_func = U.function([ob, ac], self.reward)
self.raw_reward = U.function([ob, ac], rnd_loss)
self.trainer = tf.train.AdamOptimizer(learning_rate=lr)
gvs = self.trainer.compute_gradients(feat_loss, self.get_trainable_variables())
self._train = U.function([ob, ac, lr], [], updates=[self.trainer.apply_gradients(gvs)])
def build_graph(self, ob, ac, scope, hid_layer, hid_size, size):
with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
layer = tf.concat([ob, ac], axis=1)
for _ in range(hid_layer):
layer = tf.layers.dense(layer, hid_size, activation=tf.nn.leaky_relu)
layer = tf.layers.dense(layer, size, activation=None)
return layer
def build_reward_op(self, ob, ac):
feat = self.build_graph(ob, ac, self.scope, self.hid_layer, self.hid_size, self.out_size)
        rnd_feat = self.build_graph(ob, ac, self.scope + "_rnd", self.rnd_hid_layer,
                                    self.rnd_hid_size, self.out_size)
        reward = self.reward_scale * tf.exp(self.offset - tf.reduce_mean(tf.square(feat - rnd_feat), axis=-1) * self.scale)
return reward
def get_trainable_variables(self):
return tf.trainable_variables(self.scope)
def get_reward(self, ob, ac):
return self.reward_func(ob, ac)
def get_raw_reward(self, ob, ac):
return self.raw_reward(ob, ac)
def train(self, ob, ac, batch_size=32, lr=0.001, iter=200):
logger.info("Training RND Critic")
for _ in range(iter):
for data in iterbatches([ob, ac], batch_size=batch_size, include_final_partial_batch=True):
self._train(*data, lr)
class Enc_Critic(object):
def __init__(self, ob_size, ac_size, hid_size=128, hid_layer=1, scale=250000.0, offset=0., reward_scale=1.0,
reg_scale=0.0001, scope="enc"):
self.scope = scope
self.scale = scale
self.offset = offset
self.out_size = ob_size+ac_size
self.hid_size = hid_size
self.hid_layer = hid_layer
self.reward_scale = reward_scale
print("Enc Critic")
ob = tf.placeholder(tf.float32, [None, ob_size])
ac = tf.placeholder(tf.float32, [None, ac_size])
lr = tf.placeholder(tf.float32, None)
target = tf.concat([ob, ac], axis=1)
feat = self.build_graph(ob, ac, self.scope, hid_layer, hid_size, self.out_size)
feat_loss = tf.reduce_mean(tf.square(feat - target))
self.reward = reward_scale * tf.exp(offset - tf.reduce_mean(tf.square(feat - target), axis=-1) * self.scale)
raw_loss = tf.reduce_mean(tf.square(feat - target), axis=-1) * self.scale
self.reward_func = U.function([ob, ac], self.reward)
self.raw_reward = U.function([ob, ac], raw_loss)
self.trainer = tf.train.AdamOptimizer(learning_rate=lr)
        if reg_scale > 0:
            feat_loss += tf.contrib.layers.apply_regularization(
                tf.contrib.layers.l2_regularizer(reg_scale),
                weights_list=self.get_trainable_variables())
gvs = self.trainer.compute_gradients(feat_loss, self.get_trainable_variables())
self._train = U.function([ob, ac, lr], [], updates=[self.trainer.apply_gradients(gvs)])
def build_graph(self, ob, ac, scope, hid_layer, hid_size, size):
with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
layer = tf.concat([ob, ac], axis=1)
for _ in range(hid_layer):
layer = tf.layers.dense(layer, hid_size, activation=tf.nn.leaky_relu)
layer = tf.layers.dense(layer, size, activation=None)
return layer
def build_reward_op(self, ob, ac):
feat = self.build_graph(ob, ac, self.scope, self.hid_layer, self.hid_size, self.out_size)
target = tf.concat([ob, ac], axis=1)
reward = self.reward_scale * tf.exp(
self.offset - tf.reduce_mean(tf.square(feat - target), axis=-1) * self.scale)
return reward
def get_trainable_variables(self):
return tf.trainable_variables(self.scope)
def get_reward(self, ob, ac):
return self.reward_func(ob, ac)
def get_raw_reward(self, ob, ac):
return self.raw_reward(ob, ac)
def train(self, ob, ac, batch_size=32, lr=0.001, iter=200):
        logger.info("Training Enc Critic")
for _ in range(iter):
for data in iterbatches([ob, ac], batch_size=batch_size, include_final_partial_batch=True):
self._train(*data, lr)
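Both critics shape the reward as `reward_scale * exp(offset - MSE * scale)`, so the reward decays exponentially with the prediction error and peaks at `reward_scale * exp(offset)` when predictor and target agree. A minimal pure-Python sketch of that formula (the function name is illustrative, not part of the module):

```python
import math

def rnd_reward(feat, rnd_feat, scale=250000.0, offset=0.0, reward_scale=1.0):
    """Mirror of the TF expression: reward_scale * exp(offset - MSE(feat, rnd_feat) * scale)."""
    mse = sum((f - r) ** 2 for f, r in zip(feat, rnd_feat)) / len(feat)
    return reward_scale * math.exp(offset - mse * scale)

# Identical predictor/target features -> zero error -> maximal reward.
print(rnd_reward([0.1, 0.2], [0.1, 0.2]))  # 1.0
```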
# --- pybithumb/__init__.py (hyeon95y/pybithumb, MIT) ---
from pybithumb.client import Bithumb
from .websocket import WebSocketManager
def get_ohlc(order_currency, payment_currency="KRW"):
return Bithumb.get_ohlc(order_currency, payment_currency)
def get_tickers(payment_currency="KRW"):
return Bithumb.get_tickers(payment_currency)
def get_market_detail(order_currency, payment_currency="KRW"):
return Bithumb.get_market_detail(order_currency, payment_currency)
def get_current_price(order_currency, payment_currency="KRW"):
return Bithumb.get_current_price(order_currency, payment_currency)
def get_orderbook(order_currency, payment_currency="KRW", limit=5):
return Bithumb.get_orderbook(order_currency, payment_currency, limit)
def get_transaction_history(order_currency, payment_currency="KRW", limit=20):
return Bithumb.get_transaction_history(order_currency, payment_currency, limit)
def get_candlestick(order_currency, payment_currency="KRW", chart_instervals="24h"):
return Bithumb.get_candlestick(order_currency, payment_currency, chart_instervals)
# @util.deprecated('Please use get_candlestick() function instead of get_ohlcv().')
def get_ohlcv(order_currency="BTC", payment_currency="KRW", interval="day"):
# for backward compatibility
chart_instervals = {
"day": "24h",
"hour12": "12h",
"hour6": "6h",
"hour": "1h",
"minute30": "30m",
"minute10": "10m",
"minute5": "5m",
"minute3": "3m",
"minute1": "1m",
}[interval]
    return Bithumb.get_candlestick(order_currency, payment_currency, chart_instervals)
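The backward-compatibility shim above just translates legacy interval aliases into the chart-interval strings the newer API expects. The mapping can be exercised without touching the exchange; this standalone sketch repeats the same table (the `CHART_INTERVALS`/`to_chart_interval` names are illustrative, not pybithumb's):

```python
# Same alias table as get_ohlcv(); standalone so it runs without pybithumb.
CHART_INTERVALS = {
    "day": "24h", "hour12": "12h", "hour6": "6h", "hour": "1h",
    "minute30": "30m", "minute10": "10m", "minute5": "5m",
    "minute3": "3m", "minute1": "1m",
}

def to_chart_interval(interval):
    # Raises KeyError on unknown aliases, matching the dict lookup in get_ohlcv().
    return CHART_INTERVALS[interval]

print(to_chart_interval("minute30"))  # 30m
```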
# --- 基础教程/A1-Python与基础知识/算法第一步/ExampleCodes/chapter06/6-7.py (microsoft/ai-edu, Apache-2.0) ---
print("Monday Food: " + str(2) + " Apples " + "and " + str(3) + " Carrots")
print("Monday Food:", 2, "Apples", "and", 3, "Carrots")
# --- parmap/__init__.py (zeehio/parmap, Apache-2.0) ---
#!/usr/bin/env python
from .parmap import map, starmap, map_async, starmap_async
__all__ = ['map', 'starmap', 'map_async', 'starmap_async']
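parmap's exported `map`/`starmap` keep the semantics of the builtin `map` and `itertools.starmap` while parallelizing the calls and forwarding extra positional arguments. A sequential sketch of what a call computes (the `seq_*` helpers are hypothetical stand-ins, not parmap's implementation):

```python
from itertools import starmap

def seq_map(func, iterable, *extra):
    # parmap.map(f, xs, a, b) computes f(x, a, b) for each x in xs.
    return [func(item, *extra) for item in iterable]

def seq_starmap(func, arg_tuples):
    # parmap.starmap(f, pairs) computes f(*pair) for each tuple.
    return list(starmap(func, arg_tuples))

print(seq_map(pow, [1, 2, 3], 2))          # [1, 4, 9]
print(seq_starmap(pow, [(2, 3), (3, 2)]))  # [8, 9]
```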
# --- pyleaves/tests/__init__.py (JacobARose/pyleaves, MIT) ---
from . import test_utils
# --- zeus/api/resources/__init__.py (getsentry/zeus, Apache-2.0) ---
from .artifact_download import *  # NOQA
from .auth_index import * # NOQA
from .build_artifacts import * # NOQA
from .build_bundlestats import * # NOQA
from .build_details import * # NOQA
from .build_diff import * # NOQA
from .build_failures import * # NOQA
from .build_file_coverage_tree import * # NOQA
from .build_file_coverage import * # NOQA
from .build_index import * # NOQA
from .build_jobs import * # NOQA
from .build_styleviolations import * # NOQA
from .build_tests import * # NOQA
from .catchall import * # NOQA
from .change_request_details import * # NOQA
from .change_request_index import * # NOQA
from .github_organizations import * # NOQA
from .github_repositories import * # NOQA
from .hook_details import * # NOQA
from .index import * # NOQA
from .install_index import * # NOQA
from .install_stats import * # NOQA
from .job_artifacts import * # NOQA
from .job_details import * # NOQA
from .job_tests import * # NOQA
from .repository_branches import * # NOQA
from .repository_builds import * # NOQA
from .repository_change_requests import * # NOQA
from .repository_details import * # NOQA
from .repository_file_coverage_tree import * # NOQA
from .repository_hooks import * # NOQA
from .repository_index import * # NOQA
from .repository_revisions import * # NOQA
from .repository_stats import * # NOQA
from .repository_test_details import * # NOQA
from .repository_test_history import * # NOQA
from .repository_tests import * # NOQA
from .repository_testtree import * # NOQA
from .revision_artifacts import * # NOQA
from .revision_bundlestats import * # NOQA
from .revision_details import * # NOQA
from .revision_diff import * # NOQA
from .revision_failures import * # NOQA
from .revision_file_coverage_tree import * # NOQA
from .revision_file_coverage import * # NOQA
from .revision_jobs import * # NOQA
from .revision_styleviolations import * # NOQA
from .revision_tests import * # NOQA
from .test_details import * # NOQA
from .user_details import * # NOQA
from .user_emails import * # NOQA
from .user_token import * # NOQA
# --- etc/python/pml/pml/__init__.py (pilab-sigma/pml, MIT) ---
from .pml import saveTxt, loadTxt
# --- medium/pro-how-far/HowFar.py (Adi142857/sololearn-challenges, MIT) ---
import re
print(len(re.sub(r'^B*[HP](B*)[HP]B*$',r'\1', input())))
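The one-liner above anchors on the whole input, drops any leading/trailing `B` runs and the two bracketing `H`/`P` characters, and keeps only the captured `B`s between them, printing their count. Unrolled for clarity (the sample inputs are made up):

```python
import re

def how_far(track):
    # Strip outer B's and the two H/P endpoints; keep the captured middle B's.
    between = re.sub(r'^B*[HP](B*)[HP]B*$', r'\1', track)
    return len(between)

print(how_far("BBHBBBPBB"))  # 3
print(how_far("HP"))         # 0
```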
# --- nca47/db/sqlalchemy/models/dns.py (WosunOO/nca_xianshu, Apache-2.0) ---
import sqlalchemy as sa
from oslo_db.sqlalchemy import types as db_types
from nca47.db.sqlalchemy.models import base as model_base
from nca47.objects import attributes as attr
HasTenant = model_base.HasTenant
HasId = model_base.HasId
HasStatus = model_base.HasStatus
HasOperationMode = model_base.HasOperationMode
class DnsServer(model_base.BASE, HasId, HasOperationMode):
"""Represents a dns server."""
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
class Zone(model_base.BASE, HasId, HasOperationMode):
"""Represents a dns zone."""
__tablename__ = 'dns_zone_info'
zone_name = sa.Column(sa.String(attr.NAME_MAX_LEN))
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
zone_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
vres_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
masters = sa.Column(db_types.JsonEncodedList)
slaves = sa.Column(db_types.JsonEncodedList)
renewal = sa.Column(sa.String(attr.NAME_MAX_LEN))
default_ttl = sa.Column(sa.String(attr.NAME_MAX_LEN))
owners = sa.Column(db_types.JsonEncodedList)
ad_controller = sa.Column(sa.String(attr.NAME_MAX_LEN))
comment = sa.Column(sa.String(attr.NAME_MAX_LEN))
class ZoneRecord(model_base.BASE, HasId, HasOperationMode):
"""Represents a dns zone."""
__tablename__ = 'dns_rrs_info'
zone_id = sa.Column(sa.String(attr.UUID_LEN))
rrs_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
rrs_name = sa.Column(sa.String(attr.NAME_MAX_LEN))
type = sa.Column(sa.String(attr.NAME_MAX_LEN))
klass = sa.Column(sa.String(attr.NAME_MAX_LEN))
ttl = sa.Column(sa.String(attr.NAME_MAX_LEN))
rdata = sa.Column(sa.String(attr.NAME_MAX_LEN))
class HmTemplateInfo(model_base.BASE, HasId, HasOperationMode):
"""Represents a HmTemplateInfo."""
__tablename__ = 'hm_template_info'
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
types = sa.Column(sa.String(attr.NAME_MAX_LEN))
check_interval = sa.Column(sa.String(attr.FIVE_LEN))
timeout = sa.Column(sa.String(attr.FIVE_LEN))
max_retries = sa.Column(sa.String(attr.FIVE_LEN))
sendstring = sa.Column(sa.String(attr.INPUT_MAX_LEN))
recvstring = sa.Column(sa.String(attr.INPUT_MAX_LEN))
hm_template_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
refcnt = sa.Column(sa.String(attr.TEN_LEN))
username = sa.Column(sa.String(attr.NAME_MAX_LEN))
password = sa.Column(sa.String(attr.NAME_MAX_LEN))
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
class GslbZoneInfo(model_base.BASE, HasId, HasOperationMode):
"""Represents a GslbZoneInfo."""
__tablename__ = 'gslb_zone_info'
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
devices = sa.Column(db_types.JsonEncodedList)
syn_server = sa.Column(sa.String(attr.INPUT_MAX_LEN))
gslb_zone_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
enable = sa.Column(sa.String(attr.INPUT_MAX_LEN),
default='yes')
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
class GmemberInfo(model_base.BASE, HasId, HasOperationMode):
"""Represents a GmemberInfo."""
__tablename__ = 'gmember_info'
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
gslb_zone_name = sa.Column(sa.String(attr.NAME_MAX_LEN))
ip = sa.Column(sa.String(attr.IP_LEN))
port = sa.Column(sa.String(attr.FIVE_LEN))
enable = sa.Column(sa.String(attr.FIVE_LEN),
default="yes")
refcnt = sa.Column(sa.String(attr.TEN_LEN))
gmember_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
class Region(model_base.BASE, HasId, HasOperationMode):
"""Represents a region info."""
__tablename__ = 'region_info'
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
region_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
refcnt = sa.Column(sa.String(attr.NAME_MAX_LEN))
region_user = sa.Column(sa.String(attr.INPUT_MAX_LEN))
class RegionUser(model_base.BASE, HasId, HasOperationMode):
"""Represents a region user info."""
__tablename__ = 'region_user_info'
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
region_useruser_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
region_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
type = sa.Column(sa.String(attr.NAME_MAX_LEN))
data1 = sa.Column(sa.String(attr.NAME_MAX_LEN))
data2 = sa.Column(sa.String(attr.NAME_MAX_LEN))
data3 = sa.Column(sa.String(attr.NAME_MAX_LEN))
data4 = sa.Column(sa.String(attr.NAME_MAX_LEN))
class Proximity(model_base.BASE, HasId, HasOperationMode):
"""Represents a proximity info."""
__tablename__ = 'sp_policy_info'
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
sp_policy_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_type = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_logic = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_data1 = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_data2 = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_data3 = sa.Column(sa.String(attr.NAME_MAX_LEN))
src_data4 = sa.Column(sa.String(attr.NAME_MAX_LEN))
dst_type = sa.Column(sa.String(attr.NAME_MAX_LEN))
dst_logic = sa.Column(sa.String(attr.NAME_MAX_LEN))
dst_data1 = sa.Column(sa.String(attr.NAME_MAX_LEN))
dst_data2 = sa.Column(sa.String(attr.NAME_MAX_LEN))
class Syngroup(model_base.BASE, HasId, HasOperationMode):
"""
Represents a dns Syngroup_zone
"""
__tablename__ = 'syngroup_info'
syngroup_id = sa.Column(sa.String(attr.UUID_LEN))
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
gslb_zone_names = sa.Column(db_types.JsonEncodedList)
probe_range = sa.Column(sa.String(attr.NAME_MAX_LEN))
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
pass_ = sa.Column(sa.String(attr.FIVE_LEN))
class GPoolInfo(model_base.BASE, HasId, HasOperationMode):
__tablename__ = 'gpool_info'
tenant_id = sa.Column(sa.String(attr.UUID_LEN))
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
enable = sa.Column(sa.String(attr.FIVE_LEN))
pass_ = sa.Column(sa.String(attr.FIVE_LEN))
ttl = sa.Column(sa.String(attr.TTL_LEN))
max_addr_ret = sa.Column(sa.String(attr.NAME_MAX_LEN))
cname = sa.Column(sa.String(attr.NAME_MAX_LEN))
first_algorithm = sa.Column(sa.String(attr.FIVE_LEN))
second_algorithm = sa.Column(sa.String(attr.FIVE_LEN))
fallback_ip = sa.Column(sa.String(attr.IP_LEN))
hms = sa.Column(db_types.JsonEncodedList)
gmember_list = sa.Column(db_types.JsonEncodedList)
warning = sa.Column(sa.String(attr.TYPE_LEN))
gpool_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
class GMapInfo(model_base.BASE, HasId, HasOperationMode):
__tablename__ = 'gmap_info'
tenant_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
gmap_id = sa.Column(sa.String(attr.NAME_MAX_LEN))
name = sa.Column(sa.String(attr.NAME_MAX_LEN))
gpool_list = sa.Column(db_types.JsonEncodedList)
last_resort_pool = sa.Column(sa.String(attr.NAME_MAX_LEN))
algorithm = sa.Column(sa.String(attr.NAME_MAX_LEN))
enable = sa.Column(sa.String(attr.TEN_LEN))
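Several of the columns above (`masters`, `slaves`, `owners`, `devices`, `hms`, `gpool_list`, ...) use `db_types.JsonEncodedList`, an oslo.db column type that serializes a Python list to JSON text in a string column. A minimal sketch of that round-trip with the stdlib (an illustration of the idea, not oslo.db's actual implementation):

```python
import json

def encode_list(value):
    # What a JSON-encoded list column stores: JSON text, or None for NULL.
    return json.dumps(value) if value is not None else None

def decode_list(stored):
    # What the column yields when the row is loaded back.
    return json.loads(stored) if stored is not None else None

stored = encode_list(["10.0.0.1", "10.0.0.2"])
print(stored)               # ["10.0.0.1", "10.0.0.2"]
print(decode_list(stored))  # ['10.0.0.1', '10.0.0.2']
```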
# --- cli/tests/pcluster/config/test_section_cluster.py (tmscarla/aws-parallelcluster, Apache-2.0) ---
# Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You may not use this file except in compliance
# with the License. A copy of the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "LICENSE.txt" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES
# OR CONDITIONS OF ANY KIND, express or implied. See the License for the specific language governing permissions and
# limitations under the License.
import json
import pytest
import tests.pcluster.config.utils as utils
from pcluster.config.mappings import CLUSTER_HIT, CLUSTER_SIT
from tests.pcluster.config.defaults import DefaultCfnParams, DefaultDict
@pytest.mark.parametrize(
"cfn_params_dict, expected_section_dict, expected_section_label",
[
(
{},
utils.merge_dicts(DefaultDict["cluster_sit"].value, {"additional_iam_policies": [], "architecture": None}),
"default",
),
(
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{"ClusterConfigMetadata": "{'sections': {'cluster': ['custom_cluster_label']}}"},
),
# Cluster section with custom label
utils.merge_dicts(
DefaultDict["cluster_sit"].value,
{
"additional_iam_policies": ["arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy"],
"base_os": "alinux2",
"scheduler": "slurm",
"cluster_config_metadata": {"sections": {"cluster": ["custom_cluster_label"]}},
"master_instance_type": "t2.micro",
"compute_instance_type": "t2.micro",
},
),
"custom_cluster_label",
),
(
DefaultCfnParams["cluster_sit"].value,
utils.merge_dicts(
DefaultDict["cluster_sit"].value,
{
"additional_iam_policies": ["arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy"],
"base_os": "alinux2",
"scheduler": "slurm",
"master_instance_type": "t2.micro",
"compute_instance_type": "t2.micro",
},
),
"default",
),
# awsbatch defaults
(
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"Scheduler": "awsbatch",
"EC2IAMPolicies": ",".join(
[
"arn:aws:iam::aws:policy/AWSBatchFullAccess",
"arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy",
]
),
},
),
utils.merge_dicts(
DefaultDict["cluster_sit"].value,
{
"scheduler": "awsbatch",
"base_os": "alinux2",
"min_vcpus": 0,
"desired_vcpus": 0,
"max_vcpus": 10,
"spot_bid_percentage": 0.0,
# verify also not awsbatch values
"initial_queue_size": 0,
"max_queue_size": 10,
"maintain_initial_size": False,
"spot_price": 0,
"additional_iam_policies": [
"arn:aws:iam::aws:policy/AWSBatchFullAccess",
"arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy",
],
"master_instance_type": "t2.micro",
"compute_instance_type": "t2.micro",
},
),
"default",
),
],
)
def test_sit_cluster_section_from_cfn(mocker, cfn_params_dict, expected_section_dict, expected_section_label):
"""Test conversion from CFN input parameters."""
utils.assert_section_from_cfn(mocker, CLUSTER_SIT, cfn_params_dict, expected_section_dict, expected_section_label)
@pytest.mark.parametrize(
"config_parser_dict, expected_dict_params, expected_message",
[
# default
(
{"cluster default": {}},
{"additional_iam_policies": [], "architecture": None, "scheduler": "slurm", "base_os": "alinux2"},
None,
),
# right value
(
{"cluster default": {"key_name": "test"}},
{
"key_name": "test",
"additional_iam_policies": [],
"architecture": None,
"scheduler": "slurm",
"base_os": "alinux2",
},
None,
),
(
{"cluster default": {"base_os": "alinux"}},
{"base_os": "alinux", "additional_iam_policies": [], "architecture": None, "scheduler": "slurm"},
None,
),
# invalid value
({"cluster default": {"base_os": "wrong_value"}}, None, "has an invalid value"),
# invalid key
({"cluster default": {"invalid_key": "fake_value"}}, None, "'invalid_key' is not allowed in the .* section"),
(
{"cluster default": {"invalid_key": "fake_value", "invalid_key2": "fake_value"}},
None,
"'invalid_key.*,invalid_key.*' are not allowed in the .* section",
),
# CLUSTER_HIT parameters must not be allowed in CLUSTER_SIT
(
{"cluster default": {"queue_settings": "fake_value"}},
None,
"'queue_settings' is not allowed in the .* section",
),
(
{"cluster default": {"default_queue": "fake_value"}},
None,
"'default_queue' is not allowed in the .* section",
),
],
)
def test_sit_cluster_section_from_file(mocker, config_parser_dict, expected_dict_params, expected_message):
utils.set_default_values_for_required_cluster_section_params(
config_parser_dict.get("cluster default"), only_if_not_present=True
)
utils.assert_section_from_file(mocker, CLUSTER_SIT, config_parser_dict, expected_dict_params, expected_message)
@pytest.mark.parametrize(
"config_parser_dict, expected_dict_params, expected_message",
[
# default
(
{"cluster default": {}},
{"additional_iam_policies": [], "architecture": None, "scheduler": "slurm", "base_os": "alinux2"},
None,
),
# right value
(
{"cluster default": {"key_name": "test", "disable_cluster_dns": True}},
{
"key_name": "test",
"disable_cluster_dns": True,
"additional_iam_policies": [],
"architecture": None,
"scheduler": "slurm",
"base_os": "alinux2",
},
None,
),
(
{"cluster default": {"base_os": "alinux"}},
{"base_os": "alinux", "additional_iam_policies": [], "architecture": None, "scheduler": "slurm"},
None,
),
# invalid value
({"cluster default": {"base_os": "wrong_value"}}, {}, "has an invalid value"),
# invalid key
({"cluster default": {"invalid_key": "fake_value"}}, {}, "'invalid_key' is not allowed in the .* section"),
# CLUSTER_SIT parameters must not be allowed in CLUSTER_HIT
(
{"cluster default": {"placement_group": "fake_value"}},
{},
"'placement_group' is not allowed in the .* section",
),
({"cluster default": {"placement": "ondemand"}}, {}, "'placement' is not allowed in the .* section"),
(
{"cluster default": {"compute_instance_type": "t2.micro"}},
{},
"'compute_instance_type' is not allowed in the .* section",
),
({"cluster default": {"initial_queue_size": 0}}, {}, "'initial_queue_size' is not allowed in the .* section"),
({"cluster default": {"max_queue_size": 10}}, {}, "'max_queue_size' is not allowed in the .* section"),
(
{"cluster default": {"maintain_initial_size": True}},
{},
"'maintain_initial_size' is not allowed in the .* section",
),
({"cluster default": {"cluster_type": "ondemand"}}, {}, "'cluster_type' is not allowed in the .* section"),
({"cluster default": {"spot_price": 0}}, {}, "'spot_price' is not allowed in the .* section"),
],
)
def test_hit_cluster_section_from_file(mocker, config_parser_dict, expected_dict_params, expected_message):
config_parser_dict["cluster default"]["queue_settings"] = "queue1"
config_parser_dict["queue queue1"] = {}
utils.set_default_values_for_required_cluster_section_params(
config_parser_dict.get("cluster default"), only_if_not_present=True
)
expected_dict_params["queue_settings"] = "queue1"
utils.assert_section_from_file(mocker, CLUSTER_HIT, config_parser_dict, expected_dict_params, expected_message)
@pytest.mark.parametrize(
"param_key, param_value, expected_value, expected_message",
[
# Basic configuration
("key_name", None, None, None),
("key_name", "", "", None),
("key_name", "test", "test", None),
("key_name", "NONE", "NONE", None),
("key_name", "fake_value", "fake_value", None),
# TODO add regex for template_url
("template_url", None, None, None),
("template_url", "", "", None),
("template_url", "test", "test", None),
("template_url", "NONE", "NONE", None),
("template_url", "fake_value", "fake_value", None),
("base_os", "", None, "has an invalid value"),
("base_os", "wrong_value", None, "has an invalid value"),
("base_os", "NONE", None, "has an invalid value"),
("base_os", "ubuntu1804", "ubuntu1804", None),
("scheduler", "wrong_value", None, "has an invalid value"),
("scheduler", "NONE", None, "has an invalid value"),
("scheduler", "awsbatch", "awsbatch", None),
("shared_dir", None, "/shared", None),
("shared_dir", "", None, "has an invalid value"),
("shared_dir", "fake_value", "fake_value", None),
("shared_dir", "/test", "/test", None),
("shared_dir", "/test/test2", "/test/test2", None),
("shared_dir", "/t_ 1-2( ):&;<>t?*+|", "/t_ 1-2( ):&;<>t?*+|", None),
("shared_dir", "//test", None, "has an invalid value"),
("shared_dir", "./test", None, "has an invalid value"),
("shared_dir", "\\test", None, "has an invalid value"),
("shared_dir", ".test", None, "has an invalid value"),
("shared_dir", "/test/.test2", None, "has an invalid value"),
("shared_dir", "/test/.test2/test3", None, "has an invalid value"),
("shared_dir", "/test//test2", None, "has an invalid value"),
("shared_dir", "/test\\test2", None, "has an invalid value"),
("shared_dir", "NONE", "NONE", None), # NONE is evaluated as a valid path
# Cluster configuration
("placement_group", None, None, None),
("placement_group", "", "", None),
("placement_group", "test", "test", None),
("placement_group", "NONE", "NONE", None),
("placement_group", "fake_value", "fake_value", None),
("placement_group", "DYNAMIC", "DYNAMIC", None),
("placement", None, "compute", None),
("placement", "", None, "has an invalid value"),
("placement", "wrong_value", None, "has an invalid value"),
("placement", "NONE", None, "has an invalid value"),
("placement", "cluster", "cluster", None),
# Head node
# TODO add regex for master_instance_type
("master_instance_type", "", "", None),
("master_instance_type", "test", "test", None),
("master_instance_type", "NONE", "NONE", None),
("master_instance_type", "fake_value", "fake_value", None),
("master_root_volume_size", None, 35, None),
("master_root_volume_size", "", None, "must be an Integer"),
("master_root_volume_size", "NONE", None, "must be an Integer"),
("master_root_volume_size", "wrong_value", None, "must be an Integer"),
("master_root_volume_size", "19", 19, "Allowed values are"),
("master_root_volume_size", "22", 22, "Allowed values are"),
("master_root_volume_size", "34", 34, "Allowed values are"),
("master_root_volume_size", "022", 22, "Allowed values are"),
("master_root_volume_size", "41", 41, None),
# Compute fleet
# TODO add regex for compute_instance_type
("compute_instance_type", "", "", None),
("compute_instance_type", "test", "test", None),
("compute_instance_type", "NONE", "NONE", None),
("compute_instance_type", "fake_value", "fake_value", None),
("compute_root_volume_size", None, 35, None),
("compute_root_volume_size", "", None, "must be an Integer"),
("compute_root_volume_size", "NONE", None, "must be an Integer"),
("compute_root_volume_size", "wrong_value", None, "must be an Integer"),
("compute_root_volume_size", "19", 19, "Allowed values are"),
("compute_root_volume_size", "22", 22, "Allowed values are"),
("compute_root_volume_size", "34", 34, "Allowed values are"),
("compute_root_volume_size", "022", 22, "Allowed values are"),
("compute_root_volume_size", "41", 41, None),
("initial_queue_size", None, 0, None),
("initial_queue_size", "", None, "must be an Integer"),
("initial_queue_size", "NONE", None, "must be an Integer"),
("initial_queue_size", "wrong_value", None, "must be an Integer"),
("initial_queue_size", "1", 1, None),
("initial_queue_size", "20", 20, None),
("max_queue_size", None, 10, None),
("max_queue_size", "", None, "must be an Integer"),
("max_queue_size", "NONE", None, "must be an Integer"),
("max_queue_size", "wrong_value", None, "must be an Integer"),
("max_queue_size", "1", 1, None),
("max_queue_size", "20", 20, None),
("maintain_initial_size", None, False, None),
("maintain_initial_size", "", None, "must be a Boolean"),
("maintain_initial_size", "NONE", None, "must be a Boolean"),
("maintain_initial_size", "true", True, None),
("maintain_initial_size", "false", False, None),
("min_vcpus", None, 0, None),
("min_vcpus", "", None, "must be an Integer"),
("min_vcpus", "NONE", None, "must be an Integer"),
("min_vcpus", "wrong_value", None, "must be an Integer"),
("min_vcpus", "1", 1, None),
("min_vcpus", "20", 20, None),
("desired_vcpus", None, 4, None),
("desired_vcpus", "", None, "must be an Integer"),
("desired_vcpus", "NONE", None, "must be an Integer"),
("desired_vcpus", "wrong_value", None, "must be an Integer"),
("desired_vcpus", "1", 1, None),
("desired_vcpus", "20", 20, None),
("max_vcpus", None, 10, None),
("max_vcpus", "", None, "must be an Integer"),
("max_vcpus", "NONE", None, "must be an Integer"),
("max_vcpus", "wrong_value", None, "must be an Integer"),
("max_vcpus", "1", 1, None),
("max_vcpus", "20", 20, None),
("cluster_type", None, "ondemand", None),
("cluster_type", "", None, "has an invalid value"),
("cluster_type", "wrong_value", None, "has an invalid value"),
("cluster_type", "NONE", None, "has an invalid value"),
("cluster_type", "spot", "spot", None),
("spot_price", None, 0.0, None),
("spot_price", "", None, "must be a Float"),
("spot_price", "NONE", None, "must be a Float"),
("spot_price", "wrong_value", None, "must be a Float"),
("spot_price", "0.09", 0.09, None),
("spot_price", "0", 0.0, None),
("spot_price", "0.1", 0.1, None),
("spot_price", "1", 1, None),
("spot_price", "100", 100, None),
("spot_price", "100.0", 100.0, None),
("spot_price", "100.1", 100.1, None),
("spot_price", "101", 101, None),
("spot_bid_percentage", None, 0, None),
("spot_bid_percentage", "", None, "must be an Integer"),
("spot_bid_percentage", "NONE", None, "must be an Integer"),
("spot_bid_percentage", "wrong_value", None, "must be an Integer"),
("spot_bid_percentage", "1", 1, None),
("spot_bid_percentage", "20", 20, None),
("spot_bid_percentage", "100.1", None, "must be an Integer"),
("spot_bid_percentage", "101", None, "has an invalid value"),
# Access and networking
("proxy_server", None, None, None),
("proxy_server", "", "", None),
("proxy_server", "test", "test", None),
("proxy_server", "NONE", "NONE", None),
("proxy_server", "fake_value", "fake_value", None),
# TODO add regex for ec2_iam_role
("ec2_iam_role", None, None, None),
("ec2_iam_role", "", "", None),
("ec2_iam_role", "test", "test", None),
("ec2_iam_role", "NONE", "NONE", None),
("ec2_iam_role", "fake_value", "fake_value", None),
("additional_iam_policies", None, [], None),
("additional_iam_policies", "", [""], None),
("additional_iam_policies", "test", ["test"], None),
("additional_iam_policies", "NONE", ["NONE"], None),
("additional_iam_policies", "fake_value", ["fake_value"], None),
("additional_iam_policies", "policy1,policy2", ["policy1", "policy2"], None),
# TODO add regex for s3_read_resource
("s3_read_resource", None, None, None),
("s3_read_resource", "", "", None),
("s3_read_resource", "fake_value", "fake_value", None),
("s3_read_resource", "http://test", "http://test", None),
("s3_read_resource", "s3://test/test2", "s3://test/test2", None),
("s3_read_resource", "NONE", "NONE", None),
# TODO add regex for s3_read_write_resource
("s3_read_write_resource", None, None, None),
("s3_read_write_resource", "", "", None),
("s3_read_write_resource", "fake_value", "fake_value", None),
("s3_read_write_resource", "http://test", "http://test", None),
("s3_read_write_resource", "s3://test/test2", "s3://test/test2", None),
("s3_read_write_resource", "NONE", "NONE", None),
# Customization
("enable_efa", None, None, None),
("enable_efa", "", None, "has an invalid value"),
("enable_efa", "wrong_value", None, "has an invalid value"),
("enable_efa", "NONE", None, "has an invalid value"),
("enable_efa", "compute", "compute", None),
("ephemeral_dir", None, "/scratch", None),
("ephemeral_dir", "", None, "has an invalid value"),
("ephemeral_dir", "fake_value", "fake_value", None),
("ephemeral_dir", "/test", "/test", None),
("ephemeral_dir", "/test/test2", "/test/test2", None),
("ephemeral_dir", "/t_ 1-2( ):&;<>t?*+|", "/t_ 1-2( ):&;<>t?*+|", None),
("ephemeral_dir", "//test", None, "has an invalid value"),
("ephemeral_dir", "./test", None, "has an invalid value"),
("ephemeral_dir", "\\test", None, "has an invalid value"),
("ephemeral_dir", ".test", None, "has an invalid value"),
("ephemeral_dir", "/test/.test2", None, "has an invalid value"),
("ephemeral_dir", "/test/.test2/test3", None, "has an invalid value"),
("ephemeral_dir", "/test//test2", None, "has an invalid value"),
("ephemeral_dir", "/test\\test2", None, "has an invalid value"),
("ephemeral_dir", "NONE", "NONE", None), # NONE is evaluated as a valid path
("encrypted_ephemeral", None, False, None),
("encrypted_ephemeral", "", None, "must be a Boolean"),
("encrypted_ephemeral", "NONE", None, "must be a Boolean"),
("encrypted_ephemeral", "true", True, None),
("encrypted_ephemeral", "false", False, None),
("custom_ami", None, None, None),
("custom_ami", "", None, "has an invalid value"),
("custom_ami", "wrong_value", None, "has an invalid value"),
("custom_ami", "ami-12345", None, "has an invalid value"),
("custom_ami", "ami-123456789", None, "has an invalid value"),
("custom_ami", "NONE", None, "has an invalid value"),
("custom_ami", "ami-12345678", "ami-12345678", None),
("custom_ami", "ami-12345678901234567", "ami-12345678901234567", None),
# TODO add regex for pre_install
("pre_install", None, None, None),
("pre_install", "", "", None),
("pre_install", "fake_value", "fake_value", None),
("pre_install", "http://test", "http://test", None),
("pre_install", "s3://test/test2", "s3://test/test2", None),
("pre_install", "NONE", "NONE", None),
("pre_install_args", None, None, None),
("pre_install_args", "", "", None),
("pre_install_args", "test", "test", None),
("pre_install_args", "NONE", "NONE", None),
("pre_install_args", "fake_value", "fake_value", None),
# TODO add regex for post_install
("post_install", None, None, None),
("post_install", "", "", None),
("post_install", "fake_value", "fake_value", None),
("post_install", "http://test", "http://test", None),
("post_install", "s3://test/test2", "s3://test/test2", None),
("post_install", "NONE", "NONE", None),
("post_install_args", None, None, None),
("post_install_args", "", "", None),
("post_install_args", "test", "test", None),
("post_install_args", "NONE", "NONE", None),
("post_install_args", "fake_value", "fake_value", None),
("extra_json", None, {}, None),
("extra_json", "", {}, None),
("extra_json", "{}", {}, None),
("extra_json", '{"test": "test"}', {"test": "test"}, None),
(
"extra_json",
"{'test': 'test'}",
{"test": "test"},
None,
), # WARNING: accepted because the value is parsed with yaml.safe_load, which allows single quotes
("extra_json", "{'test': 'test'", None, "Error parsing JSON parameter"),
("extra_json", "fake_value", "fake_value", None),
("cluster_config_metadata", None, {"sections": {}}, None),
# TODO add regex for additional_cfn_template
("additional_cfn_template", None, None, None),
("additional_cfn_template", "", "", None),
("additional_cfn_template", "fake_value", "fake_value", None),
("additional_cfn_template", "http://test", "http://test", None),
("additional_cfn_template", "s3://test/test2", "s3://test/test2", None),
("additional_cfn_template", "NONE", "NONE", None),
("tags", None, {}, None),
("tags", "", {}, None),
("tags", "{}", {}, None),
("tags", "{'test': 'test'}", {"test": "test"}, None),
("tags", "{'test': 'test'", None, "Error parsing JSON parameter"),
("disable_hyperthreading", None, False, None),
("disable_hyperthreading", "", None, "must be a Boolean"),
("disable_hyperthreading", "NONE", None, "must be a Boolean"),
("disable_hyperthreading", "true", True, None),
("disable_hyperthreading", "false", False, None),
("enable_intel_hpc_platform", None, False, None),
("enable_intel_hpc_platform", "", None, "must be a Boolean"),
("enable_intel_hpc_platform", "NONE", None, "must be a Boolean"),
("enable_intel_hpc_platform", "true", True, None),
("enable_intel_hpc_platform", "false", False, None),
# TODO add regex for custom_chef_cookbook
("custom_chef_cookbook", None, None, None),
("custom_chef_cookbook", "", "", None),
("custom_chef_cookbook", "fake_value", "fake_value", None),
("custom_chef_cookbook", "http://test", "http://test", None),
("custom_chef_cookbook", "s3://test/test2", "s3://test/test2", None),
("custom_chef_cookbook", "NONE", "NONE", None),
# Settings
("scaling_settings", "test1", None, "Section .* not found in the config file"),
("vpc_settings", "test1", None, "Section .* not found in the config file"),
("ebs_settings", "test1", None, "Section .* not found in the config file"),
("ebs_settings", "test1,test2", None, "Section .* not found in the config file"),
("ebs_settings", "test1, test2", None, "Section .* not found in the config file"),
("efs_settings", "test1", None, "Section .* not found in the config file"),
("raid_settings", "test1", None, "Section .* not found in the config file"),
("fsx_settings", "test1", None, "Section .* not found in the config file"),
("cw_log_settings", "test1", None, "Section .* not found in the config file"),
("dashboard_settings", "test1", None, "Section .* not found in the config file"),
],
)
def test_sit_cluster_param_from_file(
mocker, param_key, param_value, expected_value, expected_message, expected_key_error=None
):
utils.assert_param_from_file(mocker, CLUSTER_SIT, param_key, param_value, expected_value, expected_message)
@pytest.mark.parametrize(
"param_key, param_value, expected_value, expected_message",
[
# Basic configuration
("key_name", None, None, None),
("key_name", "", "", None),
("key_name", "test", "test", None),
("key_name", "NONE", "NONE", None),
("key_name", "fake_value", "fake_value", None),
# TODO add regex for template_url
("template_url", None, None, None),
("template_url", "", "", None),
("template_url", "test", "test", None),
("template_url", "NONE", "NONE", None),
("template_url", "fake_value", "fake_value", None),
("base_os", "", None, "has an invalid value"),
("base_os", "wrong_value", None, "has an invalid value"),
("base_os", "NONE", None, "has an invalid value"),
("base_os", "ubuntu1804", "ubuntu1804", None),
("scheduler", "wrong_value", None, "has an invalid value"),
("scheduler", "NONE", None, "has an invalid value"),
("scheduler", "awsbatch", "awsbatch", None),
("shared_dir", None, "/shared", None),
("shared_dir", "", None, "has an invalid value"),
("shared_dir", "fake_value", "fake_value", None),
("shared_dir", "/test", "/test", None),
("shared_dir", "/test/test2", "/test/test2", None),
("shared_dir", "/t_ 1-2( ):&;<>t?*+|", "/t_ 1-2( ):&;<>t?*+|", None),
("shared_dir", "//test", None, "has an invalid value"),
("shared_dir", "./test", None, "has an invalid value"),
("shared_dir", "\\test", None, "has an invalid value"),
("shared_dir", ".test", None, "has an invalid value"),
("shared_dir", "/test/.test2", None, "has an invalid value"),
("shared_dir", "/test/.test2/test3", None, "has an invalid value"),
("shared_dir", "/test//test2", None, "has an invalid value"),
("shared_dir", "/test\\test2", None, "has an invalid value"),
("shared_dir", "NONE", "NONE", None), # NONE is evaluated as a valid path
# Head node
# TODO add regex for master_instance_type
("master_instance_type", "", "", None),
("master_instance_type", "test", "test", None),
("master_instance_type", "NONE", "NONE", None),
("master_instance_type", "fake_value", "fake_value", None),
("master_root_volume_size", None, 35, None),
("master_root_volume_size", "", None, "must be an Integer"),
("master_root_volume_size", "NONE", None, "must be an Integer"),
("master_root_volume_size", "wrong_value", None, "must be an Integer"),
("master_root_volume_size", "19", 19, "Allowed values are"),
("master_root_volume_size", "22", 22, "Allowed values are"),
("master_root_volume_size", "34", 34, "Allowed values are"),
("master_root_volume_size", "022", 22, "Allowed values are"),
("master_root_volume_size", "41", 41, None),
# Compute fleet
("compute_root_volume_size", None, 35, None),
("compute_root_volume_size", "", None, "must be an Integer"),
("compute_root_volume_size", "NONE", None, "must be an Integer"),
("compute_root_volume_size", "wrong_value", None, "must be an Integer"),
("compute_root_volume_size", "19", 19, "Allowed values are"),
("compute_root_volume_size", "22", 22, "Allowed values are"),
("compute_root_volume_size", "34", 34, "Allowed values are"),
("compute_root_volume_size", "022", 22, "Allowed values are"),
("compute_root_volume_size", "41", 41, None),
# Access and networking
("proxy_server", None, None, None),
("proxy_server", "", "", None),
("proxy_server", "test", "test", None),
("proxy_server", "NONE", "NONE", None),
("proxy_server", "fake_value", "fake_value", None),
# TODO add regex for ec2_iam_role
("ec2_iam_role", None, None, None),
("ec2_iam_role", "", "", None),
("ec2_iam_role", "test", "test", None),
("ec2_iam_role", "NONE", "NONE", None),
("ec2_iam_role", "fake_value", "fake_value", None),
("additional_iam_policies", None, [], None),
("additional_iam_policies", "", [""], None),
("additional_iam_policies", "test", ["test"], None),
("additional_iam_policies", "NONE", ["NONE"], None),
("additional_iam_policies", "fake_value", ["fake_value"], None),
("additional_iam_policies", "policy1,policy2", ["policy1", "policy2"], None),
# TODO add regex for s3_read_resource
("s3_read_resource", None, None, None),
("s3_read_resource", "", "", None),
("s3_read_resource", "fake_value", "fake_value", None),
("s3_read_resource", "http://test", "http://test", None),
("s3_read_resource", "s3://test/test2", "s3://test/test2", None),
("s3_read_resource", "NONE", "NONE", None),
# TODO add regex for s3_read_write_resource
("s3_read_write_resource", None, None, None),
("s3_read_write_resource", "", "", None),
("s3_read_write_resource", "fake_value", "fake_value", None),
("s3_read_write_resource", "http://test", "http://test", None),
("s3_read_write_resource", "s3://test/test2", "s3://test/test2", None),
("s3_read_write_resource", "NONE", "NONE", None),
# Customization
("enable_efa", None, None, None),
("enable_efa", "", None, "has an invalid value"),
("enable_efa", "wrong_value", None, "has an invalid value"),
("enable_efa", "NONE", None, "has an invalid value"),
("enable_efa", "compute", "compute", None),
("ephemeral_dir", None, "/scratch", None),
("ephemeral_dir", "", None, "has an invalid value"),
("ephemeral_dir", "fake_value", "fake_value", None),
("ephemeral_dir", "/test", "/test", None),
("ephemeral_dir", "/test/test2", "/test/test2", None),
("ephemeral_dir", "/t_ 1-2( ):&;<>t?*+|", "/t_ 1-2( ):&;<>t?*+|", None),
("ephemeral_dir", "//test", None, "has an invalid value"),
("ephemeral_dir", "./test", None, "has an invalid value"),
("ephemeral_dir", "\\test", None, "has an invalid value"),
("ephemeral_dir", ".test", None, "has an invalid value"),
("ephemeral_dir", "/test/.test2", None, "has an invalid value"),
("ephemeral_dir", "/test/.test2/test3", None, "has an invalid value"),
("ephemeral_dir", "/test//test2", None, "has an invalid value"),
("ephemeral_dir", "/test\\test2", None, "has an invalid value"),
("ephemeral_dir", "NONE", "NONE", None), # NONE is evaluated as a valid path
("encrypted_ephemeral", None, False, None),
("encrypted_ephemeral", "", None, "must be a Boolean"),
("encrypted_ephemeral", "NONE", None, "must be a Boolean"),
("encrypted_ephemeral", "true", True, None),
("encrypted_ephemeral", "false", False, None),
("custom_ami", None, None, None),
("custom_ami", "", None, "has an invalid value"),
("custom_ami", "wrong_value", None, "has an invalid value"),
("custom_ami", "ami-12345", None, "has an invalid value"),
("custom_ami", "ami-123456789", None, "has an invalid value"),
("custom_ami", "NONE", None, "has an invalid value"),
("custom_ami", "ami-12345678", "ami-12345678", None),
("custom_ami", "ami-12345678901234567", "ami-12345678901234567", None),
# TODO add regex for pre_install
("pre_install", None, None, None),
("pre_install", "", "", None),
("pre_install", "fake_value", "fake_value", None),
("pre_install", "http://test", "http://test", None),
("pre_install", "s3://test/test2", "s3://test/test2", None),
("pre_install", "NONE", "NONE", None),
("pre_install_args", None, None, None),
("pre_install_args", "", "", None),
("pre_install_args", "test", "test", None),
("pre_install_args", "NONE", "NONE", None),
("pre_install_args", "fake_value", "fake_value", None),
# TODO add regex for post_install
("post_install", None, None, None),
("post_install", "", "", None),
("post_install", "fake_value", "fake_value", None),
("post_install", "http://test", "http://test", None),
("post_install", "s3://test/test2", "s3://test/test2", None),
("post_install", "NONE", "NONE", None),
("post_install_args", None, None, None),
("post_install_args", "", "", None),
("post_install_args", "test", "test", None),
("post_install_args", "NONE", "NONE", None),
("post_install_args", "fake_value", "fake_value", None),
("extra_json", None, {}, None),
("extra_json", "", {}, None),
("extra_json", "{}", {}, None),
("extra_json", '{"test": "test"}', {"test": "test"}, None),
(
"extra_json",
"{'test': 'test'}",
{"test": "test"},
None,
), # WARNING: accepted because the value is parsed with yaml.safe_load, which allows single quotes
("extra_json", "{'test': 'test'", None, "Error parsing JSON parameter"),
("extra_json", "fake_value", "fake_value", None),
("cluster_config_metadata", None, {"sections": {}}, None),
# TODO add regex for additional_cfn_template
("additional_cfn_template", None, None, None),
("additional_cfn_template", "", "", None),
("additional_cfn_template", "fake_value", "fake_value", None),
("additional_cfn_template", "http://test", "http://test", None),
("additional_cfn_template", "s3://test/test2", "s3://test/test2", None),
("additional_cfn_template", "NONE", "NONE", None),
("tags", None, {}, None),
("tags", "", {}, None),
("tags", "{}", {}, None),
("tags", "{'test': 'test'}", {"test": "test"}, None),
("tags", "{'test': 'test'", None, "Error parsing JSON parameter"),
("enable_intel_hpc_platform", None, False, None),
("enable_intel_hpc_platform", "", None, "must be a Boolean"),
("enable_intel_hpc_platform", "NONE", None, "must be a Boolean"),
("enable_intel_hpc_platform", "true", True, None),
("enable_intel_hpc_platform", "false", False, None),
# TODO add regex for custom_chef_cookbook
("custom_chef_cookbook", None, None, None),
("custom_chef_cookbook", "", "", None),
("custom_chef_cookbook", "fake_value", "fake_value", None),
("custom_chef_cookbook", "http://test", "http://test", None),
("custom_chef_cookbook", "s3://test/test2", "s3://test/test2", None),
("custom_chef_cookbook", "NONE", "NONE", None),
# Settings
("scaling_settings", "test1", None, "Section .* not found in the config file"),
("vpc_settings", "test1", None, "Section .* not found in the config file"),
("ebs_settings", "test1", None, "Section .* not found in the config file"),
("ebs_settings", "test1,test2", None, "Section .* not found in the config file"),
("ebs_settings", "test1, test2", None, "Section .* not found in the config file"),
("efs_settings", "test1", None, "Section .* not found in the config file"),
("raid_settings", "test1", None, "Section .* not found in the config file"),
("fsx_settings", "test1", None, "Section .* not found in the config file"),
("cw_log_settings", "test1", None, "Section .* not found in the config file"),
("dashboard_settings", "test1", None, "Section .* not found in the config file"),
],
)
def test_hit_cluster_param_from_file(
mocker, param_key, param_value, expected_value, expected_message, expected_key_error=None
):
utils.assert_param_from_file(mocker, CLUSTER_HIT, param_key, param_value, expected_value, expected_message)
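The custom_ami cases in both parametrize lists accept only an "ami-" prefix followed by exactly 8 or exactly 17 hex characters. That shape can be captured with a pattern like the following (an illustrative guess at the regex implied by the test data, not the one pcluster ships):

```python
import re

# Illustrative pattern matching the custom_ami expectations above:
# "ami-" followed by exactly 8 or exactly 17 lowercase hex characters.
AMI_ID_PATTERN = re.compile(r"^ami-[0-9a-f]{8}([0-9a-f]{9})?$")


def looks_like_ami_id(value):
    """Hypothetical check reproducing the custom_ami cases above."""
    return bool(AMI_ID_PATTERN.match(value))
```

This accepts "ami-12345678" and "ami-12345678901234567" while rejecting "ami-12345" and "ami-123456789", as the cases expect.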
@pytest.mark.parametrize(
"param_key, param_value, expected_value, expected_message",
[
("scheduler", None, None, "Configuration parameter 'scheduler' must have a value"),
("base_os", None, None, "Configuration parameter 'base_os' must have a value"),
],
)
def test_sit_cluster_param_from_file_with_validation(mocker, param_key, param_value, expected_value, expected_message):
utils.assert_param_from_file(
mocker,
CLUSTER_SIT,
param_key,
param_value,
expected_value,
expected_message,
do_validation=True,
)
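The extra_json and tags cases accept `{'test': 'test'}` because, as the WARNING comments note, the value goes through `yaml.safe_load`, which treats a single-quoted flow mapping as valid YAML, whereas strict JSON parsing would reject it. A quick standalone illustration (assumes PyYAML is installed, as it is for pcluster):

```python
import json

import yaml

value = "{'test': 'test'}"

# YAML flow mappings allow single-quoted scalars, so this parses cleanly
parsed = yaml.safe_load(value)
assert parsed == {"test": "test"}

# strict JSON rejects single quotes
try:
    json.loads(value)
    json_accepted = True
except json.JSONDecodeError:
    json_accepted = False
assert not json_accepted
```

This is why `"{'test': 'test'}"` maps to `{"test": "test"}` in the cases above, while the unterminated `"{'test': 'test'"` still fails with "Error parsing JSON parameter".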
@pytest.mark.parametrize(
"section_dict, expected_config_parser_dict, expected_message",
[
# default
({}, {"cluster default": {}}, None),
# default values
({"placement": "compute"}, {"cluster default": {"placement": "compute"}}, "No option .* in section: .*"),
# other values
({"key_name": "test"}, {"cluster default": {"key_name": "test"}}, None),
({"base_os": "centos7"}, {"cluster default": {"base_os": "centos7"}}, None),
],
)
def test_sit_cluster_section_to_file(mocker, section_dict, expected_config_parser_dict, expected_message):
utils.assert_section_to_file(mocker, CLUSTER_SIT, section_dict, expected_config_parser_dict, expected_message)
@pytest.mark.parametrize(
"cluster_section_definition, section_dict, expected_cfn_params, default_threads_per_core",
[
(CLUSTER_SIT, DefaultDict["cluster_sit"].value, DefaultCfnParams["cluster_sit"].value, (1, 1)),
(CLUSTER_HIT, DefaultDict["cluster_hit"].value, DefaultCfnParams["cluster_hit"].value, (1, 1)),
(
CLUSTER_SIT,
utils.merge_dicts(DefaultDict["cluster_sit"].value, {"disable_hyperthreading": "True"}),
utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"Cores": "2,2,true,true"}),
(2, 2),
),
(
CLUSTER_SIT,
utils.merge_dicts(DefaultDict["cluster_sit"].value, {"disable_hyperthreading": "True"}),
utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"Cores": "NONE,NONE,false,false"}),
(1, 1),
),
(
CLUSTER_SIT,
utils.merge_dicts(DefaultDict["cluster_sit"].value, {"disable_hyperthreading": "True"}),
utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"Cores": "2,NONE,true,false"}),
(2, 1),
),
(
CLUSTER_SIT,
utils.merge_dicts(DefaultDict["cluster_sit"].value, {"disable_hyperthreading": "True"}),
utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"Cores": "NONE,2,false,true"}),
(1, 2),
),
(
CLUSTER_HIT,
utils.merge_dicts(DefaultDict["cluster_hit"].value, {"disable_hyperthreading": "True"}),
# With HIT clusters there should be no cores information for the compute instance type
utils.merge_dicts(DefaultCfnParams["cluster_hit"].value, {"Cores": "2,0,true,false"}),
(2, 2),
),
(
CLUSTER_HIT,
utils.merge_dicts(DefaultDict["cluster_hit"].value, {"disable_hyperthreading": "True"}),
# With HIT clusters there should be no cores information for the compute instance type
utils.merge_dicts(DefaultCfnParams["cluster_hit"].value, {"Cores": "NONE,0,false,false"}),
(1, 1),
),
(
CLUSTER_HIT,
utils.merge_dicts(DefaultDict["cluster_hit"].value, {"disable_hyperthreading": "True"}),
# With HIT clusters there should be no cores information for the compute instance type
utils.merge_dicts(DefaultCfnParams["cluster_hit"].value, {"Cores": "2,0,true,false"}),
(2, 1),
),
(
CLUSTER_HIT,
utils.merge_dicts(DefaultDict["cluster_hit"].value, {"disable_hyperthreading": "True"}),
# With HIT clusters there should be no cores information for compute instance type
utils.merge_dicts(DefaultCfnParams["cluster_hit"].value, {"Cores": "NONE,0,false,false"}),
(1, 2),
),
],
)
def test_cluster_section_to_cfn(
mocker, cluster_section_definition, section_dict, expected_cfn_params, default_threads_per_core
):
section_dict["master_instance_type"] = "t2.micro"
if cluster_section_definition == CLUSTER_SIT:
section_dict["compute_instance_type"] = "t2.micro"
utils.set_default_values_for_required_cluster_section_params(section_dict)
utils.mock_pcluster_config(mocker)
mocker.patch("pcluster.config.cfn_param_types.get_efs_mount_target_id", return_value="valid_mount_target_id")
instance_type_info_mock = mocker.MagicMock()
mocker.patch(
"pcluster.config.cfn_param_types.InstanceTypeInfo.init_from_instance_type", return_value=instance_type_info_mock
)
instance_type_info_mock.vcpus_count.return_value = 4
instance_type_info_mock.default_threads_per_core.side_effect = default_threads_per_core
utils.assert_section_to_cfn(mocker, cluster_section_definition, section_dict, expected_cfn_params)
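The `Cores` expectations above follow one shape: `<master_cores>,<compute_cores>,<master_flag>,<compute_flag>`, where a node reports `vcpus // threads_per_core` and `true` only when hyperthreading can actually be disabled (threads_per_core > 1), and HIT clusters always report `0,false` for the compute side. A sketch reconstructing that mapping from the test data (an inference from the expectations, not the real pcluster code):

```python
def cores_cfn_value(vcpus, master_tpc, compute_tpc, hit_cluster=False):
    """Rebuild the "Cores" CFN parameter implied by the cases above.

    Inferred from the parametrized expectations; not the actual
    pcluster implementation.
    """

    def per_node(threads_per_core):
        # hyperthreading can only be disabled when there is more than
        # one thread per core; otherwise no cores value is emitted
        if threads_per_core > 1:
            return str(vcpus // threads_per_core), "true"
        return "NONE", "false"

    master_cores, master_flag = per_node(master_tpc)
    if hit_cluster:
        # HIT clusters carry no cores information for the compute fleet
        compute_cores, compute_flag = "0", "false"
    else:
        compute_cores, compute_flag = per_node(compute_tpc)
    return ",".join([master_cores, compute_cores, master_flag, compute_flag])
```

With the mocked 4 vCPUs, threads-per-core `(2, 2)` yields `"2,2,true,true"` for SIT and `"2,0,true,false"` for HIT, matching the parametrized expectations.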
@pytest.mark.parametrize(
"settings_label, expected_cfn_params",
[
("default", utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"Scheduler": "sge"})),
(
"custom1",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"KeyName": "key",
"BaseOS": "ubuntu1804",
"Scheduler": "sge",
"SharedDir": "/test",
"PlacementGroup": "NONE",
"Placement": "cluster",
"MasterInstanceType": "t2.large",
"MasterRootVolumeSize": "40",
"ComputeInstanceType": "t2.large",
"ComputeRootVolumeSize": "40",
"DesiredSize": "1",
"MaxSize": "2",
"MinSize": "1",
"ClusterType": "spot",
"SpotPrice": "5.5",
"ProxyServer": "proxy",
"EC2IAMRoleName": "role",
"EC2IAMPolicies": "arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy,policy1,policy2",
"S3ReadResource": "s3://url",
"S3ReadWriteResource": "s3://url",
"EFA": "compute",
"EphemeralDir": "/test2",
"EncryptedEphemeral": "true",
"CustomAMI": "ami-12345678",
"PreInstallScript": "preinstall",
"PreInstallArgs": '\\"one two\\"',
"PostInstallScript": "postinstall",
"PostInstallArgs": '\\"one two\\"',
"ExtraJson": '{"cfncluster": {"cfn_scheduler_slots": "cores"}}',
"AdditionalCfnTemplate": "https://test",
"CustomChefCookbook": "https://test",
"Cores": "NONE,NONE,false,false",
"IntelHPCPlatform": "true",
# template_url = template
# tags = {"test": "test"}
},
),
),
(
"batch",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"Scheduler": "awsbatch",
"DesiredSize": "4",
"MaxSize": "10",
"MinSize": "0",
"SpotPrice": "0",
"EC2IAMPolicies": ",".join(
[
"arn:aws:iam::aws:policy/AWSBatchFullAccess",
"arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy",
]
),
"ComputeInstanceType": "optimal",
},
),
),
(
"batch-custom1",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"Scheduler": "awsbatch",
"DesiredSize": "3",
"MaxSize": "4",
"MinSize": "2",
"ClusterType": "spot",
"SpotPrice": "25",
"EC2IAMPolicies": ",".join(
[
"arn:aws:iam::aws:policy/AWSBatchFullAccess",
"arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy",
"policy1",
"policy2",
]
),
"ComputeInstanceType": "optimal",
},
),
),
(
"batch-no-cw-logging",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"Scheduler": "awsbatch",
"DesiredSize": "3",
"MaxSize": "4",
"MinSize": "2",
"ClusterType": "spot",
"SpotPrice": "25",
"EC2IAMPolicies": "arn:aws:iam::aws:policy/AWSBatchFullAccess",
"ComputeInstanceType": "optimal",
"CWLogOptions": "false,14",
},
),
),
(
"wrong_mix_traditional",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"Scheduler": "sge",
"DesiredSize": "1",
"MaxSize": "2",
"MinSize": "1",
"ClusterType": "spot",
"SpotPrice": "5.5",
},
),
),
(
"wrong_mix_batch",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"Scheduler": "awsbatch",
"DesiredSize": "3",
"MaxSize": "4",
"MinSize": "2",
"ClusterType": "spot",
"SpotPrice": "25",
"EC2IAMPolicies": ",".join(
[
"arn:aws:iam::aws:policy/AWSBatchFullAccess",
"arn:aws:iam::aws:policy/CloudWatchAgentServerPolicy",
]
),
"ComputeInstanceType": "optimal",
},
),
),
(
"efs",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"EFSOptions": "efs,NONE,generalPurpose,NONE,NONE,false,bursting,Valid,NONE",
"Scheduler": "sge",
},
),
),
(
"dcv",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"DCVOptions": "master,8555,10.0.0.0/0",
"Scheduler": "sge",
},
),
),
(
"ebs",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"NumberOfEBSVol": "1",
"SharedDir": "ebs1,NONE,NONE,NONE,NONE",
"VolumeType": "io1,gp2,gp2,gp2,gp2",
"VolumeSize": "40,NONE,NONE,NONE,NONE",
"VolumeIOPS": "200,NONE,NONE,NONE,NONE",
"EBSEncryption": "true,false,false,false,false",
"EBSKMSKeyId": "kms_key,NONE,NONE,NONE,NONE",
"EBSVolumeId": "vol-12345678,NONE,NONE,NONE,NONE",
"Scheduler": "sge",
},
),
),
(
"ebs-multiple",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"NumberOfEBSVol": "2",
"SharedDir": "ebs1,ebs2,NONE,NONE,NONE",
"VolumeType": "io1,standard,gp2,gp2,gp2",
"VolumeSize": "40,30,NONE,NONE,NONE",
"VolumeIOPS": "200,300,NONE,NONE,NONE",
"EBSEncryption": "true,false,false,false,false",
"EBSKMSKeyId": "kms_key,NONE,NONE,NONE,NONE",
"EBSVolumeId": "vol-12345678,NONE,NONE,NONE,NONE",
"Scheduler": "sge",
},
),
),
(
"ebs-shareddir-cluster1",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"NumberOfEBSVol": "1",
"SharedDir": "/shared",
"VolumeType": "standard,gp2,gp2,gp2,gp2",
"VolumeSize": "30,NONE,NONE,NONE,NONE",
"VolumeIOPS": "300,NONE,NONE,NONE,NONE",
"EBSEncryption": "false,false,false,false,false",
"EBSKMSKeyId": "NONE,NONE,NONE,NONE,NONE",
"EBSVolumeId": "NONE,NONE,NONE,NONE,NONE",
"Scheduler": "sge",
},
),
),
(
"ebs-shareddir-cluster2",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"NumberOfEBSVol": "1",
"SharedDir": "/work",
"VolumeType": "standard,gp2,gp2,gp2,gp2",
"VolumeSize": "30,NONE,NONE,NONE,NONE",
"VolumeIOPS": "300,NONE,NONE,NONE,NONE",
"EBSEncryption": "false,false,false,false,false",
"EBSKMSKeyId": "NONE,NONE,NONE,NONE,NONE",
"EBSVolumeId": "NONE,NONE,NONE,NONE,NONE",
"Scheduler": "sge",
},
),
),
(
"ebs-shareddir-ebs",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
"NumberOfEBSVol": "1",
"SharedDir": "ebs1,NONE,NONE,NONE,NONE",
"VolumeType": "io1,gp2,gp2,gp2,gp2",
"VolumeSize": "40,NONE,NONE,NONE,NONE",
"VolumeIOPS": "200,NONE,NONE,NONE,NONE",
"EBSEncryption": "true,false,false,false,false",
"EBSKMSKeyId": "kms_key,NONE,NONE,NONE,NONE",
"EBSVolumeId": "vol-12345678,NONE,NONE,NONE,NONE",
"Scheduler": "sge",
},
),
),
(
"cw_log",
utils.merge_dicts(DefaultCfnParams["cluster_sit"].value, {"CWLogOptions": "true,1", "Scheduler": "sge"}),
),
(
"all-settings",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
# scaling
"ScaleDownIdleTime": "15",
# vpc
"VPCId": "vpc-12345678",
"MasterSubnetId": "subnet-12345678",
# ebs
"NumberOfEBSVol": "1",
"SharedDir": "ebs1,NONE,NONE,NONE,NONE",
"VolumeType": "io1,gp2,gp2,gp2,gp2",
"VolumeSize": "40,NONE,NONE,NONE,NONE",
"VolumeIOPS": "200,NONE,NONE,NONE,NONE",
"EBSEncryption": "true,false,false,false,false",
"EBSKMSKeyId": "kms_key,NONE,NONE,NONE,NONE",
"EBSVolumeId": "vol-12345678,NONE,NONE,NONE,NONE",
# efs
"EFSOptions": "efs,NONE,generalPurpose,NONE,NONE,false,bursting,Valid,NONE",
# raid
"RAIDOptions": "raid,NONE,2,gp2,20,NONE,false,NONE,125",
# fsx
"FSXOptions": "fsx,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,"
"NONE,NONE",
# dcv
"DCVOptions": "master,8555,10.0.0.0/0",
"Scheduler": "sge",
},
),
),
(
"random-order",
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"AvailabilityZone": "mocked_avail_zone",
"KeyName": "key",
"BaseOS": "ubuntu1804",
"Scheduler": "sge",
# "SharedDir": "/test", # we have ebs volumes, see below
"PlacementGroup": "NONE",
"Placement": "cluster",
"MasterInstanceType": "t2.large",
"MasterRootVolumeSize": "40",
"ComputeInstanceType": "t2.large",
"ComputeRootVolumeSize": "40",
"DesiredSize": "1",
"MaxSize": "2",
"MinSize": "1",
"ClusterType": "spot",
"SpotPrice": "5.5",
"ProxyServer": "proxy",
"EC2IAMRoleName": "role",
"S3ReadResource": "s3://url",
"S3ReadWriteResource": "s3://url",
"EFA": "compute",
"EphemeralDir": "/test2",
"EncryptedEphemeral": "true",
"CustomAMI": "ami-12345678",
"PreInstallScript": "preinstall",
"PreInstallArgs": '\\"one two\\"',
"PostInstallScript": "postinstall",
"PostInstallArgs": '\\"one two\\"',
"ExtraJson": '{"cfncluster": {"cfn_scheduler_slots": "cores"}}',
"AdditionalCfnTemplate": "https://test",
"CustomChefCookbook": "https://test",
"IntelHPCPlatform": "false",
# scaling
"ScaleDownIdleTime": "15",
# vpc
"VPCId": "vpc-12345678",
#
"MasterSubnetId": "subnet-12345678",
"ComputeSubnetId": "subnet-23456789",
# ebs
"NumberOfEBSVol": "1",
"SharedDir": "ebs1,NONE,NONE,NONE,NONE",
"VolumeType": "io1,gp2,gp2,gp2,gp2",
"VolumeSize": "40,NONE,NONE,NONE,NONE",
"VolumeIOPS": "200,NONE,NONE,NONE,NONE",
"EBSEncryption": "true,false,false,false,false",
"EBSKMSKeyId": "kms_key,NONE,NONE,NONE,NONE",
"EBSVolumeId": "vol-12345678,NONE,NONE,NONE,NONE",
# efs
"EFSOptions": "efs,NONE,generalPurpose,NONE,NONE,false,bursting,Valid,NONE",
# raid
"RAIDOptions": "raid,NONE,2,gp2,20,NONE,false,NONE,125",
# fsx
"FSXOptions": "fsx,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,NONE,"
"NONE,NONE",
# dcv
"DCVOptions": "master,8555,10.0.0.0/0",
},
),
),
],
)
def test_sit_cluster_from_file_to_cfn(mocker, pcluster_config_reader, settings_label, expected_cfn_params):
"""Unit tests for parsing Cluster related options."""
mocker.patch(
"pcluster.config.cfn_param_types.get_efs_mount_target_id",
side_effect=lambda efs_fs_id, avail_zone: "master_mt" if avail_zone == "mocked_avail_zone" else None,
)
mocker.patch(
"pcluster.config.cfn_param_types.get_availability_zone_of_subnet",
side_effect=lambda subnet: "mocked_avail_zone" if subnet == "subnet-12345678" else "some_other_az",
)
mocker.patch("pcluster.config.cfn_param_types.InstanceTypeInfo.vcpus_count", return_value=2)
utils.assert_section_params(mocker, pcluster_config_reader, settings_label, expected_cfn_params)
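`utils.merge_dicts` is used throughout the parametrize list above to overlay case-specific CFN parameters on the `cluster_sit` defaults. A minimal sketch of how such a helper presumably behaves — later dicts win on key conflicts; the real helper is not shown in this excerpt:

```python
def merge_dicts(*dicts):
    """Merge dicts left to right; values from later dicts override earlier ones."""
    merged = {}
    for d in dicts:
        merged.update(d)
    return merged

defaults = {"Scheduler": "slurm", "MaxSize": "10"}
override = {"Scheduler": "awsbatch", "SpotPrice": "25"}
print(merge_dicts(defaults, override))
```

This keeps each test case focused on the parameters that differ from the defaults.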
@pytest.mark.parametrize(
"section_dict, expected_cfn_params",
[
(
DefaultDict["cluster_sit"].value,
utils.merge_dicts(
DefaultCfnParams["cluster_sit"].value,
{
"ClusterConfigMetadata": json.dumps(
{"sections": {"scaling": ["default"], "vpc": ["default"], "cluster": ["default"]}},
sort_keys=True,
)
},
),
)
],
)
def test_sit_cluster_config_metadata_to_cfn(mocker, section_dict, expected_cfn_params):
utils.mock_pcluster_config(mocker)
mocker.patch("pcluster.config.cfn_param_types.get_efs_mount_target_id", return_value="valid_mount_target_id")
utils.assert_section_to_cfn(mocker, CLUSTER_SIT, section_dict, expected_cfn_params, ignore_metadata=False)
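The `ClusterConfigMetadata` value in the expected params is serialized with `sort_keys=True`, so the section map is deterministic regardless of dict insertion order. A quick stdlib check of that property:

```python
import json

meta = json.dumps(
    {"sections": {"scaling": ["default"], "vpc": ["default"], "cluster": ["default"]}},
    sort_keys=True,
)
# keys are emitted alphabetically: cluster, scaling, vpc
print(meta)
```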
| 48.074544 | 120 | 0.533668 | 6,063 | 60,622 | 5.100115 | 0.066304 | 0.077615 | 0.062868 | 0.039034 | 0.878598 | 0.851691 | 0.83992 | 0.805608 | 0.75975 | 0.738568 | 0 | 0.026081 | 0.303652 | 60,622 | 1,260 | 121 | 48.112698 | 0.706424 | 0.042773 | 0 | 0.698701 | 0 | 0.001732 | 0.428865 | 0.104951 | 0 | 0 | 0 | 0.000794 | 0.008658 | 1 | 0.008658 | false | 0 | 0.004329 | 0 | 0.012987 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c00f3dd7981685c0ed2b84365522623ded631863 | 17,889 | py | Python | tests/test_api.py | kentac55/ulid | 0c78be2187973bf1e9dc1140384d9c40fcc2cefe | [
"Apache-2.0"
] | null | null | null | tests/test_api.py | kentac55/ulid | 0c78be2187973bf1e9dc1140384d9c40fcc2cefe | [
"Apache-2.0"
] | null | null | null | tests/test_api.py | kentac55/ulid | 0c78be2187973bf1e9dc1140384d9c40fcc2cefe | [
"Apache-2.0"
] | null | null | null | """
test_api
~~~~~~~~
Tests for the :mod:`~ulid.api` module.
"""
import datetime
import time
import uuid
import pytest
from ulid import api, base32, ulid
BYTES_SIZE_EXC_REGEX = r'Expects bytes to be 128 bits'
INT_SIZE_EXC_REGEX = r'Expects integer to be 128 bits'
INT_NEGATIVE_EXC_REGEX = r'Expects positive integer'
STR_SIZE_EXC_REGEX = r'Expects 26 characters'
UNSUPPORTED_TIMESTAMP_TYPE_EXC_REGEX = (r'Expected datetime, int, float, str, memoryview, Timestamp'
r', ULID, bytes, or bytearray')
TIMESTAMP_SIZE_EXC_REGEX = r'Expects timestamp to be 48 bits'
UNSUPPORTED_RANDOMNESS_TYPE_EXC_REGEX = r'Expected int, float, str, memoryview, Randomness, ULID, bytes, or bytearray'
RANDOMNESS_SIZE_EXC_REGEX = r'Expects randomness to be 80 bits'
PARSE_STR_LEN_EXC_REGEX = r'^Cannot create ULID from string of length '
PARSE_UNSUPPORTED_TYPE_REGEX = r'^Cannot create ULID from type'
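These patterns are later handed to pytest's `ExceptionInfo.match`, which applies `re.search` against the string form of the raised exception. A small stdlib sketch of that matching behavior (the `matches` helper is illustrative, not part of pytest):

```python
import re

BYTES_SIZE_EXC_REGEX = r'Expects bytes to be 128 bits'

def matches(pattern, exc):
    # pytest's ExceptionInfo.match uses re.search on str(exception)
    return re.search(pattern, str(exc)) is not None

print(matches(BYTES_SIZE_EXC_REGEX, ValueError("Expects bytes to be 128 bits; got 64")))  # True
```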
@pytest.fixture(scope='session', params=[
list,
dict,
set,
tuple,
type(None)
])
def unsupported_type(request):
"""
Fixture that yields types that a cannot be converted to a timestamp/randomness.
"""
return request.param
@pytest.fixture(scope='session', params=[bytes, bytearray, memoryview])
def buffer_type(request):
"""
Fixture that yields types that support the buffer protocol.
"""
return request.param
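The three types yielded above are interchangeable views over the same byte content, which is why the conversion tests below can treat them uniformly. A quick illustration:

```python
raw = b'\x01' * 16  # 128 bits

for buffer_type in (bytes, bytearray, memoryview):
    view = buffer_type(raw)
    # all three expose the same underlying bytes
    assert bytes(view) == raw
print("all buffer types expose identical bytes")
```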
def test_new_returns_ulid_instance():
"""
Assert that :func:`~ulid.api.new` returns a new :class:`~ulid.ulid.ULID` instance.
"""
assert isinstance(api.new(), ulid.ULID)
def test_parse_returns_given_ulid_instance():
"""
Assert that :func:`~ulid.api.parse` returns the given :class:`~ulid.ulid.ULID` instance
when given one.
"""
value = api.new()
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance == value
def test_parse_returns_ulid_instance_from_uuid():
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~uuid.UUID`.
"""
value = uuid.uuid4()
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == value.bytes
def test_parse_returns_ulid_instance_from_uuid_str():
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~uuid.UUID` instance in its string format.
"""
value = uuid.uuid4()
instance = api.parse(str(value))
assert isinstance(instance, ulid.ULID)
assert instance.bytes == value.bytes
def test_parse_returns_ulid_instance_from_uuid_hex_str():
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~uuid.UUID` instance in its hex string format.
"""
value = uuid.uuid4()
instance = api.parse(value.hex)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == value.bytes
def test_parse_returns_ulid_instance_from_ulid_str(valid_bytes_128):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~str` instance that represents a full ULID.
"""
value = base32.encode(valid_bytes_128)
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_parse_returns_ulid_instance_from_randomness_str(valid_bytes_80):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~str` instance that represents randomness data.
"""
value = base32.encode_randomness(valid_bytes_80)
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness().str == value
def test_parse_returns_ulid_instance_from_timestamp_str(valid_bytes_48):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from the given :class:`~str` instance that represents timestamp data.
"""
value = base32.encode_timestamp(valid_bytes_48)
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.timestamp().str == value
def test_parse_error_on_invalid_length_str(invalid_str_10_16_26_32_36):
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given
a :class:`~str` instance whose length is not valid for a ULID, timestamp, or randomness value.
"""
with pytest.raises(ValueError) as ex:
api.parse(invalid_str_10_16_26_32_36)
assert ex.match(PARSE_STR_LEN_EXC_REGEX)
def test_parse_returns_ulid_instance_from_int(valid_bytes_128):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from a valid ULID stored as an int.
"""
value = int.from_bytes(valid_bytes_128, byteorder='big')
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_parse_raises_when_int_greater_than_128_bits(invalid_bytes_128_overflow):
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given int
cannot be stored in 128 bits.
"""
value = int.from_bytes(invalid_bytes_128_overflow, byteorder='big')
with pytest.raises(ValueError) as ex:
api.parse(value)
assert ex.match(INT_SIZE_EXC_REGEX)
def test_parse_raises_when_int_negative():
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given
a negative int number.
"""
with pytest.raises(ValueError) as ex:
api.parse(-1)
assert ex.match(INT_NEGATIVE_EXC_REGEX)
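The int tests above pin down two bounds: negative values are rejected, and values that need more than 128 bits are rejected. A plausible re-creation of that validation (a hypothetical helper; the real check lives inside the `ulid` package):

```python
def validate_128_bit_int(value):
    # hypothetical sketch of the bounds the tests assert
    if value < 0:
        raise ValueError('Expects positive integer')
    if value > (1 << 128) - 1:
        raise ValueError('Expects integer to be 128 bits')
    return value

print(validate_128_bit_int((1 << 128) - 1))  # largest representable ULID
```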
def test_parse_returns_ulid_instance_from_float(valid_bytes_128):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from a valid ULID stored as a float.
"""
value = float(int.from_bytes(valid_bytes_128, byteorder='big'))
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.int == int(value)
def test_parse_raises_when_float_greater_than_128_bits(invalid_bytes_128_overflow):
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given float
cannot be stored in 128 bits.
"""
value = float(int.from_bytes(invalid_bytes_128_overflow, byteorder='big'))
with pytest.raises(ValueError) as ex:
api.parse(value)
assert ex.match(INT_SIZE_EXC_REGEX)
def test_parse_raises_when_float_negative():
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given
a negative float number.
"""
with pytest.raises(ValueError) as ex:
api.parse(float(-1))
assert ex.match(INT_NEGATIVE_EXC_REGEX)
def test_parse_returns_ulid_instance_from_buffer_type(buffer_type, valid_bytes_128):
"""
Assert that :func:`~ulid.api.parse` returns a new :class:`~ulid.ulid.ULID` instance
from a valid set of 128 bytes representing by the given buffer type.
"""
value = buffer_type(valid_bytes_128)
instance = api.parse(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_parse_raises_when_buffer_type_not_128_bits(buffer_type, invalid_bytes_128):
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when given bytes
that is not 128 bit in length.
"""
value = buffer_type(invalid_bytes_128)
with pytest.raises(ValueError) as ex:
api.parse(value)
assert ex.match(BYTES_SIZE_EXC_REGEX)
def test_parse_raises_when_given_unsupported_type(unsupported_type):
"""
Assert that :func:`~ulid.api.parse` raises a :class:`~ValueError` when a value
of an unsupported type.
"""
with pytest.raises(ValueError) as ex:
api.parse(unsupported_type)
assert ex.match(PARSE_UNSUPPORTED_TYPE_REGEX)
def test_from_bytes_returns_ulid_instance(buffer_type, valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_bytes` returns a new :class:`~ulid.ulid.ULID` instance
from the given bytes.
"""
value = buffer_type(valid_bytes_128)
instance = api.from_bytes(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_from_bytes_raises_when_not_128_bits(buffer_type, invalid_bytes_128):
"""
Assert that :func:`~ulid.api.from_bytes` raises a :class:`~ValueError` when given bytes
that is not 128 bit in length.
"""
value = buffer_type(invalid_bytes_128)
with pytest.raises(ValueError) as ex:
api.from_bytes(value)
assert ex.match(BYTES_SIZE_EXC_REGEX)
def test_from_int_returns_ulid_instance(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_int` returns a new :class:`~ulid.ulid.ULID` instance
from the given bytes.
"""
value = int.from_bytes(valid_bytes_128, byteorder='big')
instance = api.from_int(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_from_int_raises_when_greater_than_128_bits(invalid_bytes_128_overflow):
"""
Assert that :func:`~ulid.api.from_int` raises a :class:`~ValueError` when given int
cannot be stored in 128 bits.
"""
value = int.from_bytes(invalid_bytes_128_overflow, byteorder='big')
with pytest.raises(ValueError) as ex:
api.from_int(value)
assert ex.match(INT_SIZE_EXC_REGEX)
def test_from_int_raises_when_negative_number():
"""
Assert that :func:`~ulid.api.from_int` raises a :class:`~ValueError` when given
a negative number.
"""
with pytest.raises(ValueError) as ex:
api.from_int(-1)
assert ex.match(INT_NEGATIVE_EXC_REGEX)
def test_from_str_returns_ulid_instance(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_str` returns a new :class:`~ulid.ulid.ULID` instance
from the given bytes.
"""
value = base32.encode(valid_bytes_128)
instance = api.from_str(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == valid_bytes_128
def test_from_str_raises_when_not_128_bits(valid_bytes_48):
"""
Assert that :func:`~ulid.api.from_str` raises a :class:`~ValueError` when given bytes
that is not 128 bit in length.
"""
value = base32.encode(valid_bytes_48)
with pytest.raises(ValueError) as ex:
api.from_str(value)
assert ex.match(STR_SIZE_EXC_REGEX)
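The 26-character requirement follows from Base32 packing 5 bits per character: a 128-bit ULID needs ceil(128/5) = 26 characters, a 48-bit timestamp 10, and an 80-bit randomness component 16 — exactly the string lengths the parse tests distinguish. In code:

```python
import math

def base32_length(n_bits):
    # Crockford Base32 encodes 5 bits per character
    return math.ceil(n_bits / 5)

print(base32_length(128), base32_length(48), base32_length(80))  # 26 10 16
```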
def test_from_uuid_returns_ulid_instance():
"""
Assert that :func:`~ulid.api.from_uuid` returns a new :class:`~ulid.ulid.ULID` instance
from the underlying bytes of the UUID.
"""
value = uuid.uuid4()
instance = api.from_uuid(value)
assert isinstance(instance, ulid.ULID)
assert instance.bytes == value.bytes
def test_from_timestamp_datetime_returns_ulid_instance():
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given Unix time from epoch in seconds as a :class:`~datetime.datetime`.
"""
value = datetime.datetime.now()
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert int(instance.timestamp().timestamp) == int(value.timestamp())
def test_from_timestamp_int_returns_ulid_instance():
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given Unix time from epoch in seconds as an :class:`~int`.
"""
value = int(time.time())
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert int(instance.timestamp().timestamp) == value
def test_from_timestamp_float_returns_ulid_instance():
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given Unix time from epoch in seconds as a :class:`~float`.
"""
value = float(time.time())
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert int(instance.timestamp().timestamp) == int(value)
def test_from_timestamp_str_returns_ulid_instance(valid_bytes_48):
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given timestamp as a :class:`~str`.
"""
value = base32.encode_timestamp(valid_bytes_48)
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert instance.timestamp().str == value
def test_from_timestamp_bytes_returns_ulid_instance(buffer_type, valid_bytes_48):
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given timestamp as an object that supports the buffer protocol.
"""
value = buffer_type(valid_bytes_48)
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert instance.timestamp().bytes == value
def test_from_timestamp_timestamp_returns_ulid_instance(valid_bytes_48):
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given timestamp as a :class:`~ulid.ulid.Timestamp`.
"""
value = ulid.Timestamp(valid_bytes_48)
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert instance.timestamp() == value
def test_from_timestamp_ulid_returns_ulid_instance(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_timestamp` returns a new :class:`~ulid.ulid.ULID` instance
from the given timestamp as a :class:`~ulid.ulid.ULID`.
"""
value = ulid.ULID(valid_bytes_128)
instance = api.from_timestamp(value)
assert isinstance(instance, ulid.ULID)
assert instance.timestamp() == value.timestamp()
def test_from_timestamp_with_unsupported_type_raises(unsupported_type):
"""
Assert that :func:`~ulid.api.from_timestamp` raises a :class:`~ValueError` when given
a type it cannot compute a timestamp value from.
"""
with pytest.raises(ValueError) as ex:
api.from_timestamp(unsupported_type())
assert ex.match(UNSUPPORTED_TIMESTAMP_TYPE_EXC_REGEX)
def test_from_timestamp_with_incorrect_size_bytes_raises(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_timestamp` raises a :class:`~ValueError` when given
a type that cannot be represented as exactly 48 bits.
"""
with pytest.raises(ValueError) as ex:
api.from_timestamp(valid_bytes_128)
assert ex.match(TIMESTAMP_SIZE_EXC_REGEX)
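Per the ULID spec, the 48-bit timestamp component stores milliseconds since the Unix epoch, which is why the datetime, int, and float inputs above all reduce to the same 6-byte value. A minimal sketch:

```python
import time

def timestamp_to_bytes(epoch_seconds):
    # 48 bits = 6 bytes of milliseconds since the Unix epoch
    millis = int(epoch_seconds * 1000)
    return millis.to_bytes(6, byteorder='big')

stamp = timestamp_to_bytes(time.time())
print(len(stamp) * 8, "bits")  # 48 bits
```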
def test_from_randomness_int_returns_ulid_instance(valid_bytes_80):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as an :class:`~int`.
"""
value = int.from_bytes(valid_bytes_80, byteorder='big')
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness().int == value
def test_from_randomness_float_returns_ulid_instance(valid_bytes_80):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as a :class:`~float`.
"""
value = float(int.from_bytes(valid_bytes_80, byteorder='big'))
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness().int == int(value)
def test_from_randomness_str_returns_ulid_instance(valid_bytes_80):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as a :class:`~str`.
"""
value = base32.encode_randomness(valid_bytes_80)
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness().str == value
def test_from_randomness_bytes_returns_ulid_instance(buffer_type, valid_bytes_80):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as an object that supports the buffer protocol.
"""
value = buffer_type(valid_bytes_80)
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness().bytes == value
def test_from_randomness_randomness_returns_ulid_instance(valid_bytes_80):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as a :class:`~ulid.ulid.Randomness`.
"""
value = ulid.Randomness(valid_bytes_80)
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness() == value
def test_from_randomness_ulid_returns_ulid_instance(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_randomness` returns a new :class:`~ulid.ulid.ULID` instance
from the given random values as a :class:`~ulid.ulid.ULID`.
"""
value = ulid.ULID(valid_bytes_128)
instance = api.from_randomness(value)
assert isinstance(instance, ulid.ULID)
assert instance.randomness() == value.randomness()
def test_from_randomness_with_unsupported_type_raises(unsupported_type):
"""
Assert that :func:`~ulid.api.from_randomness` raises a :class:`~ValueError` when given
a type it cannot compute a randomness value from.
"""
with pytest.raises(ValueError) as ex:
api.from_randomness(unsupported_type())
assert ex.match(UNSUPPORTED_RANDOMNESS_TYPE_EXC_REGEX)
def test_from_randomness_with_incorrect_size_bytes_raises(valid_bytes_128):
"""
Assert that :func:`~ulid.api.from_randomness` raises a :class:`~ValueError` when given
a type that cannot be represented as exactly 80 bits.
"""
with pytest.raises(ValueError) as ex:
api.from_randomness(valid_bytes_128)
assert ex.match(RANDOMNESS_SIZE_EXC_REGEX)
| 35.076471 | 118 | 0.719604 | 2,506 | 17,889 | 4.909417 | 0.053472 | 0.061123 | 0.048931 | 0.062911 | 0.897342 | 0.850118 | 0.803544 | 0.784199 | 0.729497 | 0.688206 | 0 | 0.017282 | 0.171949 | 17,889 | 509 | 119 | 35.145383 | 0.813272 | 0.332048 | 0 | 0.47619 | 0 | 0 | 0.039148 | 0 | 0 | 0 | 0 | 0 | 0.30303 | 1 | 0.194805 | false | 0 | 0.021645 | 0 | 0.225108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c01e1d0552e8623eefcc1eafa12bcc37091dd717 | 7,542 | py | Python | 2019/04_SecureContainer/test/test_password.py | deanearlwright/AdventOfCode | ca4cf6315c0efa38bd7748fb6f4bc99e7934871d | [
"MIT"
] | 1 | 2021-01-03T23:09:28.000Z | 2021-01-03T23:09:28.000Z | 2019/04_SecureContainer/test/test_password.py | deanearlwright/AdventOfCode | ca4cf6315c0efa38bd7748fb6f4bc99e7934871d | [
"MIT"
] | 6 | 2020-12-26T21:02:42.000Z | 2020-12-26T21:02:52.000Z | 2019/04_SecureContainer/test/test_password.py | deanearlwright/AdventOfCode | ca4cf6315c0efa38bd7748fb6f4bc99e7934871d | [
"MIT"
] | null | null | null | # ======================================================================
# Secure Container
# Advent of Code 2019 Day 04 -- Eric Wastl -- https://adventofcode.com
#
# Computer simulation by Dr. Dean Earl Wright III
# ======================================================================
# ======================================================================
# t e s t _ p a s s w o r d . p y
# ======================================================================
"Test password objects for Advent of Code 2019 day 4, Secure Container"
# ----------------------------------------------------------------------
# import
# ----------------------------------------------------------------------
import unittest
import password
# ----------------------------------------------------------------------
# constants
# ----------------------------------------------------------------------
# ======================================================================
# TestUtility
# ======================================================================
class TestUtility(unittest.TestCase): # pylint: disable=R0904
"""Test utilty function"""
def test_sequential(self):
"""Test sequental checking function"""
self.assertEqual(password.sequential("111111"), True)
self.assertEqual(password.sequential("122345"), True)
self.assertEqual(password.sequential("111123"), True)
self.assertEqual(password.sequential("135679"), True)
self.assertEqual(password.sequential("223450"), False)
self.assertEqual(password.sequential("123789"), True)
self.assertEqual(password.sequential("112233"), True)
self.assertEqual(password.sequential("123444"), True)
self.assertEqual(password.sequential("111122"), True)
def test_pair(self):
"""Test pair checking function"""
self.assertEqual(password.pair("111111"), True)
self.assertEqual(password.pair("122345"), True)
self.assertEqual(password.pair("111123"), True)
self.assertEqual(password.pair("135679"), False)
self.assertEqual(password.pair("223450"), True)
self.assertEqual(password.pair("123789"), False)
self.assertEqual(password.pair("112233"), True)
self.assertEqual(password.pair("123444"), True)
self.assertEqual(password.pair("111122"), True)
def test_pair_only(self):
"""Test pair checking function"""
self.assertEqual(password.pair_only("111111"), False)
self.assertEqual(password.pair_only("122345"), True)
self.assertEqual(password.pair_only("111123"), False)
self.assertEqual(password.pair_only("135679"), False)
self.assertEqual(password.pair_only("223450"), True)
self.assertEqual(password.pair_only("123789"), False)
self.assertEqual(password.pair_only("112233"), True)
self.assertEqual(password.pair_only("123444"), False)
self.assertEqual(password.pair_only("111122"), True)
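The truth table above fully determines the three helpers: `sequential` requires digits that never decrease, `pair` requires at least one run of two or more equal adjacent digits, and `pair_only` requires a run of exactly two. A sketch consistent with every assertion (the real `password` module is not shown in this excerpt):

```python
from itertools import groupby

def sequential(digits):
    # digits never decrease from left to right
    return all(a <= b for a, b in zip(digits, digits[1:]))

def pair(digits):
    # at least one run of two or more equal adjacent digits
    return any(len(list(run)) >= 2 for _, run in groupby(digits))

def pair_only(digits):
    # at least one run of exactly two equal digits
    return any(len(list(run)) == 2 for _, run in groupby(digits))

print(sequential("111123"), pair("135679"), pair_only("111122"))  # True False True
```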
# ======================================================================
# TestPassword
# ======================================================================
class TestPassword(unittest.TestCase): # pylint: disable=R0904
"""Test password object"""
def test_empty_init(self):
"""Test default password object creation"""
# 1. Create default password object
mypswd = password.Password()
# 2. Make sure it has the default values
self.assertEqual(mypswd.start, 111111)
self.assertEqual(mypswd.finish, 999999)
# 3. Check methods
self.assertEqual(mypswd.check(111111), True)
self.assertEqual(mypswd.check(122345), True)
self.assertEqual(mypswd.check(111123), True)
self.assertEqual(mypswd.check(135679), False)
self.assertEqual(mypswd.check(223450), False)
self.assertEqual(mypswd.check(123789), False)
def test_value_init(self):
"""Test password object creation with values"""
# 1. Create Password object with values
mypswd = password.Password(start=123444, finish=123455)
# 2. Make sure it has the specified values
self.assertEqual(mypswd.start, 123444)
self.assertEqual(mypswd.finish, 123455)
# 3. Check methods
self.assertEqual(mypswd.check(111111), False)
self.assertEqual(mypswd.check(123449), True)
self.assertEqual(mypswd.check(222222), False)
# Check iterator
self.assertEqual(list(mypswd),
[123444, 123445, 123446, 123447,
123448, 123449, 123455])
# ======================================================================
# TestPassword2
# ======================================================================
class TestPassword2(unittest.TestCase): # pylint: disable=R0904
"""Test password object"""
def test_empty_init(self):
"""Test default password object creation"""
# 1. Create default password object
mypswd = password.Password2()
# 2. Make sure it has the default values
self.assertEqual(mypswd.start, 111111)
self.assertEqual(mypswd.finish, 999999)
# 3. Check methods
self.assertEqual(mypswd.check(111111), False)
self.assertEqual(mypswd.check(122345), True)
self.assertEqual(mypswd.check(111123), False)
self.assertEqual(mypswd.check(135679), False)
self.assertEqual(mypswd.check(223450), False)
self.assertEqual(mypswd.check(123789), False)
self.assertEqual(mypswd.check(112233), True)
self.assertEqual(mypswd.check(123444), False)
self.assertEqual(mypswd.check(111122), True)
def test_value_init(self):
"""Test password object creation with values"""
# 1. Create Password2 object with values
mypswd = password.Password2(start=123444, finish=123455)
# 2. Make sure it has the specified values
self.assertEqual(mypswd.start, 123444)
self.assertEqual(mypswd.finish, 123455)
# 3. Check methods
self.assertEqual(mypswd.check(111111), False)
self.assertEqual(mypswd.check(123449), True)
self.assertEqual(mypswd.check(222222), False)
self.assertEqual(mypswd.check(123444), False)
self.assertEqual(mypswd.check(123445), True)
self.assertEqual(mypswd.check(123455), True)
# Check iterator
self.assertEqual(list(mypswd),
[123445, 123446, 123447,
123448, 123449, 123455])
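The iterator assertions pin down `Password.__iter__`: it walks the inclusive range and yields only numbers whose digit string passes the class's check. A self-contained sketch reproducing the expected list from `TestPassword.test_value_init` (a hypothetical re-creation, assuming the non-decreasing-digits plus adjacent-pair rules):

```python
from itertools import groupby

def check(digits):
    # non-decreasing digits with at least one adjacent pair
    non_decreasing = all(a <= b for a, b in zip(digits, digits[1:]))
    has_pair = any(len(list(run)) >= 2 for _, run in groupby(digits))
    return non_decreasing and has_pair

def iter_valid(start, finish):
    # plausible Password.__iter__: inclusive scan, filtered by check()
    for number in range(start, finish + 1):
        if check(str(number)):
            yield number

print(list(iter_valid(123444, 123455)))
# [123444, 123445, 123446, 123447, 123448, 123449, 123455]
```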
# ----------------------------------------------------------------------
# module initialization
# ----------------------------------------------------------------------
if __name__ == '__main__':
pass
# ======================================================================
# end t e s t _ p a s s w o r d . p y end
# ======================================================================
| 41.213115 | 73 | 0.491514 | 641 | 7,542 | 5.734789 | 0.173167 | 0.248912 | 0.182807 | 0.16975 | 0.86099 | 0.698857 | 0.511698 | 0.511698 | 0.511698 | 0.481774 | 0 | 0.087704 | 0.251657 | 7,542 | 182 | 74 | 41.43956 | 0.563607 | 0.371917 | 0 | 0.392857 | 0 | 0 | 0.052783 | 0 | 0 | 0 | 0 | 0 | 0.72619 | 1 | 0.083333 | false | 0.428571 | 0.02381 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
c040828f1b5f21763a27c27a4401a32acf72e17d | 7,262 | py | Python | purequant/monitor.py | shenyanping/PureQuant | 6c01e4dd920c58fcfe1f494dfa56e1eb2f3f4a8d | [
"MIT"
] | 24 | 2020-12-17T12:29:32.000Z | 2021-12-27T01:51:53.000Z | purequant/monitor.py | shenyanping/PureQuant | 6c01e4dd920c58fcfe1f494dfa56e1eb2f3f4a8d | [
"MIT"
] | null | null | null | purequant/monitor.py | shenyanping/PureQuant | 6c01e4dd920c58fcfe1f494dfa56e1eb2f3f4a8d | [
"MIT"
] | 16 | 2020-12-06T12:16:53.000Z | 2022-03-27T12:18:59.000Z | from purequant.exchange.okex.websocket import subscribe as okex_subscribe
from purequant.exchange.huobi.websocket import subscribe as huobi_subscribe
from purequant.exchange.huobi.websocket import huobi_swap_position_subscribe
import asyncio, uuid
from purequant.config import config
from purequant.exchange.huobi.websocket import handle_ws_data
from purequant.exceptions import *
def okex_futures_usd():
print("Monitoring Okex coin-margined delivery futures positions...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
delivery_date = config.delivery_date
task1 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BTC-USD-{}".format(delivery_date)])
task2 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BCH-USD-{}".format(delivery_date)])
task3 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BSV-USD-{}".format(delivery_date)])
task4 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:ETH-USD-{}".format(delivery_date)])
task5 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:ETC-USD-{}".format(delivery_date)])
task6 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:EOS-USD-{}".format(delivery_date)])
task7 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:LTC-USD-{}".format(delivery_date)])
task8 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:TRX-USD-{}".format(delivery_date)])
task9 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:XRP-USD-{}".format(delivery_date)])
task_list = [task1, task2, task3, task4, task5, task6, task7, task8, task9]
asyncio.run(asyncio.wait(task_list))
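The nine near-identical `task*` lines above differ only in the instrument symbol. A standalone sketch of generating the channel names in a loop (the symbol list and the `210625` delivery-date value are illustrative, not taken from the project config):

```python
def futures_position_channels(symbols, quote, delivery_date):
    # build one "futures/position:<SYMBOL>-<QUOTE>-<DATE>" channel per symbol
    return ["futures/position:{}-{}-{}".format(s, quote, delivery_date)
            for s in symbols]

symbols = ["BTC", "BCH", "BSV", "ETH", "ETC", "EOS", "LTC", "TRX", "XRP"]
channels = futures_position_channels(symbols, "USD", "210625")  # hypothetical date
```

Each resulting channel string could then be passed to `okex_subscribe` inside the same loop, collapsing the nine task lines into one.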
def okex_futures_usdt():
print("Monitoring Okex USDT-margined delivery futures positions...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
delivery_date = config.delivery_date
task1 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BTC-USDT-{}".format(delivery_date)])
task2 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BCH-USDT-{}".format(delivery_date)])
task3 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:BSV-USDT-{}".format(delivery_date)])
task4 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:ETH-USDT-{}".format(delivery_date)])
task5 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:ETC-USDT-{}".format(delivery_date)])
task6 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:EOS-USDT-{}".format(delivery_date)])
task7 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:LTC-USDT-{}".format(delivery_date)])
task8 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:TRX-USDT-{}".format(delivery_date)])
task9 = okex_subscribe(url, access_key, passphrase, secret_key, ["futures/position:XRP-USDT-{}".format(delivery_date)])
task_list = [task1, task2, task3, task4, task5, task6, task7, task8, task9]
asyncio.run(asyncio.wait(task_list))
def okex_swap_usd():
print("Monitoring Okex coin-margined perpetual swap positions...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
task_list = []
symbol_list = ['BTC-USD-SWAP', 'LTC-USD-SWAP', 'ETH-USD-SWAP', 'ETC-USD-SWAP', 'XRP-USD-SWAP',
'EOS-USD-SWAP', 'BCH-USD-SWAP', 'BSV-USD-SWAP', 'TRX-USD-SWAP']
for item in symbol_list:
task_list.append(okex_subscribe(url, access_key, passphrase, secret_key, ["swap/position:{}".format(item)]))
asyncio.run(asyncio.wait(task_list))
def okex_swap_usdt():
print("Monitoring Okex USDT-margined perpetual swap positions...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
task_list = []
symbol_list = ['BTC-USDT-SWAP', 'LTC-USDT-SWAP', 'ETH-USDT-SWAP', 'ETC-USDT-SWAP', 'XRP-USDT-SWAP',
'EOS-USDT-SWAP', 'BCH-USDT-SWAP', 'BSV-USDT-SWAP', 'TRX-USDT-SWAP']
for item in symbol_list:
task_list.append(okex_subscribe(url, access_key, passphrase, secret_key, ["swap/position:{}".format(item)]))
asyncio.run(asyncio.wait(task_list))
def okex_spot():
print("Monitoring Okex spot account status...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
task_list = []
symbol_list = ['BTC', 'LTC', 'ETH', 'ETC', 'XRP',
'EOS', 'BCH', 'BSV', 'TRX', 'USDT']
for item in symbol_list:
task_list.append(okex_subscribe(url, access_key, passphrase, secret_key, ["spot/account:{}".format(item)]))
asyncio.run(asyncio.wait(task_list))
def okex_margin():
print("Monitoring Okex margin account status...")
url = 'wss://real.okex.com:8443/ws/v3'
access_key = config.access_key
secret_key = config.secret_key
passphrase = config.passphrase
task_list = []
symbol_list = ['BTC-USDT', 'LTC-USDT', 'ETH-USDT', 'ETC-USDT', 'XRP-USDT',
'EOS-USDT', 'BCH-USDT', 'BSV-USDT', 'TRX-USDT']
for item in symbol_list:
task_list.append(okex_subscribe(url, access_key, passphrase, secret_key, ["spot/margin_account:{}".format(item)]))
asyncio.run(asyncio.wait(task_list))
def huobi_futures():
print("Monitoring Huobi coin-margined delivery futures positions...")
url = 'wss://api.hbdm.vn/notification'
access_key = config.access_key
secret_key = config.secret_key
position_subs = [
{
"op": "sub",
"cid": str(uuid.uuid1()),
"topic": "positions.*"
}
]
asyncio.run(huobi_subscribe(url, access_key, secret_key, position_subs, handle_ws_data, auth=True))
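The subscription payload above pairs each topic with a unique client id generated by `uuid.uuid1()`. A self-contained sketch of constructing such a message (the helper name is illustrative, not part of the library):

```python
import json
import uuid

def make_position_sub(topic="positions.*"):
    # one sub message per topic: op/cid/topic, with a fresh client id each call
    return {"op": "sub", "cid": str(uuid.uuid1()), "topic": topic}

msg = make_position_sub()
payload = json.dumps(msg)  # the JSON text a websocket client would send
```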
def huobi_swap():
print("Monitoring Huobi coin-margined perpetual swap positions...")
url = 'wss://api.hbdm.vn/swap-notification'
access_key = config.access_key
secret_key = config.secret_key
position_subs = [
{
"op": "sub",
"cid": str(uuid.uuid1()),
"topic": "positions.*"
}
]
asyncio.run(huobi_swap_position_subscribe(url, access_key, secret_key, position_subs, handle_ws_data, auth=True))
def position_update():
"""Automatically push position status updates."""
if config.position_server_platform == "okex":
if config.okex_futures_usd == "true":
okex_futures_usd()
if config.okex_futures_usdt == "true":
okex_futures_usdt()
if config.okex_swap_usd == "true":
okex_swap_usd()
if config.okex_swap_usdt == "true":
okex_swap_usdt()
if config.okex_spot == "true":
okex_spot()
if config.okex_margin == "true":
okex_margin()
elif config.position_server_platform == "huobi":
if config.huobi_futures == "true":
huobi_futures()
if config.huobi_swap == "true":
huobi_swap()
else:
raise ExchangeError("Invalid platform setting for the position server in the config file!")
| 48.092715 | 123 | 0.681217 | 933 | 7,262 | 5.078242 | 0.103966 | 0.075981 | 0.091178 | 0.106374 | 0.772689 | 0.753694 | 0.74504 | 0.723934 | 0.723934 | 0.722246 | 0 | 0.01129 | 0.170614 | 7,262 | 150 | 124 | 48.413333 | 0.775361 | 0.001377 | 0 | 0.42029 | 0 | 0 | 0.196384 | 0.114132 | 0 | 0 | 0 | 0 | 0 | 1 | 0.065217 | false | 0.202899 | 0.050725 | 0 | 0.115942 | 0.057971 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
2227950de8523ded110624d935d34eaaf730c11b | 83 | py | Python | api/app/views.py | rdkap42/caedus-covid | f64a833bdf386708fcb9394f94026c48f8d474ee | [
"MIT"
] | 10 | 2020-03-17T21:21:50.000Z | 2020-04-30T02:30:47.000Z | api/app/views.py | rdkap42/caedus-covid | f64a833bdf386708fcb9394f94026c48f8d474ee | [
"MIT"
] | 5 | 2020-03-17T04:39:03.000Z | 2021-04-30T21:11:14.000Z | api/app/views.py | rdkap42/caedus-covid | f64a833bdf386708fcb9394f94026c48f8d474ee | [
"MIT"
] | null | null | null | from .app import app
@app.get("/")
def read_root():
return {"Hello": "World"}
| 13.833333 | 29 | 0.60241 | 12 | 83 | 4.083333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180723 | 83 | 5 | 30 | 16.6 | 0.720588 | 0 | 0 | 0 | 0 | 0 | 0.13253 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
2233901146117a94c9df83beca8267e51de9713f | 39 | py | Python | tests/views.py | jleeothon/urlmodel | b241f9a26ba501f088ff4e0531c5a0aba4107c3d | [
"MIT"
] | 1 | 2015-10-13T04:40:48.000Z | 2015-10-13T04:40:48.000Z | tests/views.py | jleeothon/urlmodel | b241f9a26ba501f088ff4e0531c5a0aba4107c3d | [
"MIT"
] | null | null | null | tests/views.py | jleeothon/urlmodel | b241f9a26ba501f088ff4e0531c5a0aba4107c3d | [
"MIT"
] | null | null | null |
def test_view(request):
return ""
| 9.75 | 23 | 0.641026 | 5 | 39 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 39 | 3 | 24 | 13 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
2258745cc7f934266ed5746574cb992d0af47c3c | 118 | py | Python | tests/conftest.py | codeocean/jupyter-matlab-proxy | 361e3786856963cfab80ad7b803bfd7d375508fd | [
"BSD-2-Clause"
] | null | null | null | tests/conftest.py | codeocean/jupyter-matlab-proxy | 361e3786856963cfab80ad7b803bfd7d375508fd | [
"BSD-2-Clause"
] | 1 | 2021-03-02T11:24:17.000Z | 2021-03-02T11:24:17.000Z | tests/conftest.py | codeocean/jupyter-matlab-proxy | 361e3786856963cfab80ad7b803bfd7d375508fd | [
"BSD-2-Clause"
] | null | null | null | # Copyright 2020 The MathWorks, Inc.
import os
def pytest_generate_tests(metafunc):
os.environ["DEV"] = "true"
| 14.75 | 36 | 0.711864 | 16 | 118 | 5.125 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040816 | 0.169492 | 118 | 7 | 37 | 16.857143 | 0.795918 | 0.288136 | 0 | 0 | 1 | 0 | 0.085366 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2263fbf2f5bcdd1f25d2ca36d61ec641d27690bc | 35 | py | Python | graeae/infrastructure/__init__.py | necromuralist/graeae | d8075c8112870712d9ec2b0a571e6d404c5bc95e | [
"MIT"
] | 1 | 2020-06-21T23:43:52.000Z | 2020-06-21T23:43:52.000Z | graeae/infrastructure/__init__.py | necromuralist/graeae | d8075c8112870712d9ec2b0a571e6d404c5bc95e | [
"MIT"
] | 4 | 2019-08-23T19:25:26.000Z | 2020-07-04T23:05:15.000Z | graeae/infrastructure/__init__.py | necromuralist/graeae | d8075c8112870712d9ec2b0a571e6d404c5bc95e | [
"MIT"
] | null | null | null | from .logging import SysLogBuilder
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
22705d7d81c8c8e998126142cda4825a430435c1 | 33 | py | Python | tests/__init__.py | wishlists/wishlists | 47cab5ab1e51c20ef54ca6800b2515e66941da07 | [
"Apache-2.0"
] | 3 | 2020-10-16T22:32:05.000Z | 2020-12-24T06:06:00.000Z | tests/__init__.py | wishlists/wishlists | 47cab5ab1e51c20ef54ca6800b2515e66941da07 | [
"Apache-2.0"
] | 116 | 2018-09-28T00:56:58.000Z | 2020-12-15T20:15:22.000Z | tests/__init__.py | wishlists/wishlists | 47cab5ab1e51c20ef54ca6800b2515e66941da07 | [
"Apache-2.0"
] | 2 | 2018-09-28T00:14:09.000Z | 2021-02-21T06:25:19.000Z | # Created by gupta at 10-10-2020
| 16.5 | 32 | 0.727273 | 7 | 33 | 3.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.296296 | 0.181818 | 33 | 1 | 33 | 33 | 0.592593 | 0.909091 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
97d4fb4e1f0fa1a93b45871014de6c0f6fa09132 | 23 | py | Python | lib/s2_py/__init__.py | koordinates/s2-py | 5925f4e7bbcf13550f25de7f4e3ddebe0759c365 | [
"Apache-2.0"
] | 1 | 2019-11-21T06:00:39.000Z | 2019-11-21T06:00:39.000Z | lib/s2_py/__init__.py | koordinates/s2-py | 5925f4e7bbcf13550f25de7f4e3ddebe0759c365 | [
"Apache-2.0"
] | 1 | 2019-11-21T06:39:23.000Z | 2019-12-16T21:28:22.000Z | lib/s2_py/__init__.py | koordinates/s2-py | 5925f4e7bbcf13550f25de7f4e3ddebe0759c365 | [
"Apache-2.0"
] | 1 | 2021-09-24T00:17:15.000Z | 2021-09-24T00:17:15.000Z | from .pywraps2 import * | 23 | 23 | 0.782609 | 3 | 23 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.130435 | 23 | 1 | 23 | 23 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f0fcf223ffc0ca2c0790459cc5cd2f68b084fbc | 130 | py | Python | gsfarc/gptool/parameter/templates/ulongarray.py | geospatial-services-framework/gsfpyarc | 5ef69299fbc0b763ad4c1857ceac3ff087c0dc14 | [
"MIT"
] | 1 | 2021-11-06T18:36:28.000Z | 2021-11-06T18:36:28.000Z | gsfarc/gptool/parameter/templates/ulongarray.py | geospatial-services-framework/gsfpyarc | 5ef69299fbc0b763ad4c1857ceac3ff087c0dc14 | [
"MIT"
] | null | null | null | gsfarc/gptool/parameter/templates/ulongarray.py | geospatial-services-framework/gsfpyarc | 5ef69299fbc0b763ad4c1857ceac3ff087c0dc14 | [
"MIT"
] | null | null | null | """
"""
from .basicarray import BASICARRAY
class ULONGARRAY(BASICARRAY): pass
def template():
return ULONGARRAY('GPLong') | 11.818182 | 34 | 0.707692 | 13 | 130 | 7.076923 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161538 | 130 | 11 | 35 | 11.818182 | 0.844037 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0.25 | 0.25 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
58dffc74e195fde4d84a67acb485954c068a78e7 | 4,795 | py | Python | charset/alexYou.py | RyanBilash/turtleAlphabet | 2f2da404f7ef8803ed205a2b3b15c153b8aa5388 | [
"MIT"
] | null | null | null | charset/alexYou.py | RyanBilash/turtleAlphabet | 2f2da404f7ef8803ed205a2b3b15c153b8aa5388 | [
"MIT"
] | null | null | null | charset/alexYou.py | RyanBilash/turtleAlphabet | 2f2da404f7ef8803ed205a2b3b15c153b8aa5388 | [
"MIT"
] | null | null | null | import turtle
def turtleK():
turtle.left(90)
turtle.forward(50)
turtle.backward(25)
turtle.right(50)
turtle.forward(39)
turtle.backward(39)
turtle.right(80)
turtle.forward(39)
turtle.backward(39)
turtle.right(50)
turtle.forward(25)
turtle.penup()
turtle.left(90)
turtle.forward(30)
turtle.forward(10)
turtle.pendown()
def turtleL():
turtle.left(90)
turtle.forward(50)
turtle.back(50)
turtle.right(90)
turtle.forward(30)
turtle.penup()
turtle.forward(10)
turtle.pendown()
def turtleM():
turtle.left(90)
turtle.forward(50)
turtle.right(163.3)
turtle.forward(52.2)
turtle.left(146.6)
turtle.forward(52.2)
turtle.right(163.3)
turtle.forward(50)
turtle.penup()
turtle.left(90)
turtle.forward(10)
turtle.pendown()
def turtleN():
turtle.left(90)
turtle.forward(50)
turtle.right(149)
turtle.forward(58.3)
turtle.left(149)
turtle.forward(50)
turtle.backward(50)
turtle.penup()
turtle.right(90)
turtle.forward(10)
turtle.pendown()
def turtleO():
turtle.left(90)
turtle.forward(50)
turtle.right(90)
turtle.forward(30)
turtle.right(90)
turtle.forward(50)
turtle.right(90)
turtle.forward(30)
turtle.backward(30)
turtle.left(180)
turtle.penup()
turtle.forward(10)
turtle.pendown()
def turtleP():
turtle.left(90)
turtle.forward(50)
turtle.right(90)
turtle.forward(30)
turtle.right(90)
turtle.forward(20)
turtle.right(90)
turtle.forward(30)
turtle.left(90)
turtle.forward(30)
turtle.penup()
turtle.left(90)
turtle.forward(40)
turtle.pendown()
def turtleQ():
turtle.left(90)
turtle.penup()
turtle.forward(10)
turtle.pendown()
turtle.forward(30)
turtle.right(45)
turtle.forward(14.14)
turtle.right(45)
turtle.forward(10)
turtle.right(45)
turtle.forward(14.14)
turtle.right(45)
turtle.forward(30)
turtle.right(45)
turtle.forward(14.14)
turtle.right(45)
turtle.forward(10)
turtle.right(45)
turtle.forward(14.14)
turtle.backward(14.14)
turtle.right(135)
turtle.forward(10)
turtle.left(45)
turtle.forward(7.07)
turtle.left(90)
turtle.forward(7.07)
turtle.backward(14.14)
turtle.right(135)
turtle.penup()
turtle.forward(10)
turtle.pendown()
def turtleR():
turtle.left(90)
turtle.forward(50)
turtle.right(90)
turtle.forward(30)
turtle.right(90)
turtle.forward(20)
turtle.right(90)
turtle.forward(30)
turtle.left(135)
turtle.forward(42.42)
turtle.left(45)
turtle.penup()
turtle.forward(10)
turtle.pendown()
def turtleS():
turtle.forward(30)
turtle.left(90)
turtle.forward(20)
turtle.left(45)
turtle.forward(7.07)
turtle.left(45)
turtle.forward(20)
turtle.right(45)
turtle.forward(7.07)
turtle.right(45)
turtle.forward(20)
turtle.right(90)
turtle.forward(30)
turtle.right(90)
turtle.penup()
turtle.forward(50)
turtle.left(90)
turtle.forward(10)
turtle.pendown()
def turtleT():
turtle.left(90)
turtle.penup()
turtle.forward(50)
turtle.pendown()
turtle.right(90)
turtle.forward(30)
turtle.backward(15)
turtle.right(90)
turtle.forward(50)
turtle.left(90)
turtle.penup()
turtle.forward(25)
turtle.pendown()
def turtleU():
turtle.left(90)
turtle.forward(50)
turtle.backward(50)
turtle.right(90)
turtle.forward(30)
turtle.left(90)
turtle.forward(50)
turtle.backward(50)
turtle.right(90)
turtle.penup()
turtle.forward(10)
turtle.pendown()
def turtleV():
turtle.left(90)
turtle.penup()
turtle.forward(50)
turtle.right(163.3)
turtle.pendown()
turtle.forward(52.2)
turtle.left(146.6)
turtle.forward(52.2)
turtle.right(163.3)
turtle.penup()
turtle.forward(50)
turtle.left(90)
turtle.forward(10)
turtle.pendown()
def turtleW():
turtle.left(90)
turtle.forward(50)
turtle.left(180)
turtle.forward(50)
turtle.left(163.3)
turtle.forward(52.2)
turtle.right(146.6)
turtle.forward(52.2)
turtle.left(163.3)
turtle.forward(50)
turtle.penup()
turtle.backward(50)
turtle.right(90)
turtle.forward(10)
turtle.pendown()
def turtleX():
turtle.left(59)
turtle.forward(58.3)
turtle.backward(29.15)
turtle.left(62)
turtle.forward(29.15)
turtle.backward(58.3)
turtle.right(121)
turtle.penup()
turtle.forward(10)
turtle.pendown()
if __name__ == '__main__':
pass  # Section for code you want run only if this is the main process
58fa18ac586f0144d47b3b29a3413c87b645f693 | 7,772 | py | Python | etl_base/dags/sqlg_dag_CUS.py | buckylee2019/sqlg-airflow | 37610a23b99bea8d9fdc8b066a01736ff2ff0c9d | [
"Apache-2.0"
] | null | null | null | etl_base/dags/sqlg_dag_CUS.py | buckylee2019/sqlg-airflow | 37610a23b99bea8d9fdc8b066a01736ff2ff0c9d | [
"Apache-2.0"
] | null | null | null | etl_base/dags/sqlg_dag_CUS.py | buckylee2019/sqlg-airflow | 37610a23b99bea8d9fdc8b066a01736ff2ff0c9d | [
"Apache-2.0"
] | 1 | 2022-03-10T03:47:35.000Z | 2022-03-10T03:47:35.000Z |
import re
from datetime import timedelta

import airflow
import airflow.utils.dates
from airflow.models import Variable
from airflow.sensors.external_task_sensor import ExternalTaskSensor  # Airflow 1.x import path
args = {
"owner": "JESSEWEI",
'start_date': airflow.utils.dates.days_ago(1),
'provide_context': True
}
# XSLT:loop: declaration: END}
ExternalTaskSensor.ui_color = 'white'
ExternalTaskSensor.ui_fgcolor = 'blue'
tmpl_search_path = Variable.get("sql_path")
# XSLT:loop: JOB_FLOW_NAME: START{
job_flow_name = "D_ODS_CUS_SRC"
if job_flow_name == 'I_SDM_CMN':
data_stage = ['ODS']
else:
data_stage = re.findall(r"_(.*?)_","D_ODS_CUS_SRC")
D_ODS_CUS_SRC = airflow.DAG(
"D_ODS_CUS_SRC",
tags=["CUS", data_stage[0]],
schedule_interval="RCG-D-NAT",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1
)
job_flow_name = "D_SDM_CUS"
if job_flow_name == 'I_SDM_CMN':
data_stage = ['ODS']
else:
data_stage = re.findall(r"_(.*?)_","D_SDM_CUS")
D_SDM_CUS = airflow.DAG(
"D_SDM_CUS",
tags=["CUS", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1
)
job_flow_name = "M_SDM_CUS"
if job_flow_name == 'I_SDM_CMN':
data_stage = ['ODS']
else:
data_stage = re.findall(r"_(.*?)_","M_SDM_CUS")
M_SDM_CUS = airflow.DAG(
"M_SDM_CUS",
tags=["CUS", data_stage[0]],
schedule_interval="@monthly",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1
)
job_flow_name = "D_DM_CUS"
if job_flow_name == 'I_SDM_CMN':
data_stage = ['ODS']
else:
data_stage = re.findall(r"_(.*?)_","D_DM_CUS")
D_DM_CUS = airflow.DAG(
"D_DM_CUS",
tags=["CUS", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1
)
job_flow_name = "M_DM_CUS"
if job_flow_name == 'I_SDM_CMN':
data_stage = ['ODS']
else:
data_stage = re.findall(r"_(.*?)_","M_DM_CUS")
M_DM_CUS = airflow.DAG(
"M_DM_CUS",
tags=["CUS", data_stage[0]],
schedule_interval="@monthly",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1
)
# XSLT:loop: JOB_FLOW_NAME: END}
# XSLT:loop: JOB_FLOW_NAME-and-PRE_JOB: External:START{{
my_taskid = "D_ODS_CUS_SRCxD_STG_INIT__SYS_STS_STG"
D_ODS_CUS_SRCxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
# schedule_interval=None,
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=D_ODS_CUS_SRC,
check_existence=True,
# execution_delta=None, # Same day as today
)
my_taskid = "D_SDM_CUSxD_STG_INIT__SYS_STS_STG"
D_SDM_CUSxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
# schedule_interval=None,
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=D_SDM_CUS,
check_existence=True,
# execution_delta=None, # Same day as today
)
my_taskid = "M_SDM_CUSxD_STG_INIT__SYS_STS_STG"
M_SDM_CUSxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
# schedule_interval=None,
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=M_SDM_CUS,
check_existence=True,
# execution_delta=None, # Same day as today
)
my_taskid = "D_DM_CUSxD_STG_INIT__SYS_STS_STG"
D_DM_CUSxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
# schedule_interval=None,
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=D_DM_CUS,
check_existence=True,
# execution_delta=None, # Same day as today
)
my_taskid = "M_DM_CUSxD_STG_INIT__SYS_STS_STG"
M_DM_CUSxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
# schedule_interval=None,
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=M_DM_CUS,
check_existence=True,
# execution_delta=None, # Same day as today
)
# XSLT:loop: JOB_FLOW_NAME-and-PRE_JOB: External: END}}
# XSLT:loop: JOB_FLOW_NAME: START{
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
HZ_CUST_ACCOUNTS.dag=D_ODS_CUS_SRC
D_ODS_CUS_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(HZ_CUST_ACCOUNTS)
HZ_PARTIES.dag=D_ODS_CUS_SRC
D_ODS_CUS_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(HZ_PARTIES)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
SDM_MEETING_MINUTES.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_MEETING_MINUTES)
SDM_CUSTOMER_COMPANY_CHECK.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_CUSTOMER_COMPANY_CHECK)
SDM_MODEL.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_MODEL)
SDM_PREMIUM_FREIGHT.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_PREMIUM_FREIGHT)
REF_PRODUCT_TECHNOLOGY.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(REF_PRODUCT_TECHNOLOGY)
REF_SUB_GROUP_CUSTOMER.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(REF_SUB_GROUP_CUSTOMER)
REF_MARKET_SHARE_PRODUCT.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(REF_MARKET_SHARE_PRODUCT)
REF_PRODUCT_SEGMENT.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(REF_PRODUCT_SEGMENT)
REF_END_CUSTOMER.dag=D_SDM_CUS
D_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(REF_END_CUSTOMER)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
SDM_MARKET_SHARE.dag=M_SDM_CUS
M_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_MARKET_SHARE)
SDM_MARKET_TAM_CAGR.dag=M_SDM_CUS
M_SDM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_MARKET_TAM_CAGR)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
DIM_PRODUCT_TECHNOLOGY.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_PRODUCT_TECHNOLOGY)
DIM_SUB_GROUP_CUSTOMER.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_SUB_GROUP_CUSTOMER)
DIM_MARKET_SHARE_PRODUCT.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_MARKET_SHARE_PRODUCT)
DIM_PRODUCT_SEGMENT.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_PRODUCT_SEGMENT)
DIM_END_CUSTOMER.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_END_CUSTOMER)
DIM_MODEL.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_MODEL)
DIM_CUSTOMER.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_CUSTOMER)
DIM_GROUP_CUSTOMER.dag=D_DM_CUS
D_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(DIM_GROUP_CUSTOMER)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
FCT_MARKET_SHARE.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_MARKET_SHARE)
FCT_MARKET_TAM_CAGR.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_MARKET_TAM_CAGR)
FCT_PREMIUM_FREIGHT.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_PREMIUM_FREIGHT)
FCT_CCM_RANK.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_CCM_RANK)
FCT_CCM_REPORT.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_CCM_REPORT)
FCT_CCM_BU_REPORT.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_CCM_BU_REPORT)
FCT_MEETING_MINUTES.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_MEETING_MINUTES)
FCT_CUSTOMER_COMPANY_CHECK.dag=M_DM_CUS
M_DM_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_CUSTOMER_COMPANY_CHECK)
| 32.248963 | 76 | 0.794004 | 1,339 | 7,772 | 4.010456 | 0.093353 | 0.057356 | 0.073743 | 0.094413 | 0.839479 | 0.811546 | 0.80149 | 0.792365 | 0.767784 | 0.745065 | 0 | 0.003746 | 0.107051 | 7,772 | 240 | 77 | 32.383333 | 0.770029 | 0.110782 | 0 | 0.413613 | 0 | 0 | 0.104651 | 0.024273 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4513c431da9953f5605d2077e49af094108a6edf | 35 | py | Python | modules/__init__.py | chandar-lab/PatchUp | 31133f358c2b59536eeb5ef304acec2b1436ec23 | [
"MIT"
] | 23 | 2020-06-16T12:42:46.000Z | 2022-01-17T09:19:15.000Z | modules/__init__.py | chandar-lab/PatchUp | 31133f358c2b59536eeb5ef304acec2b1436ec23 | [
"MIT"
] | 8 | 2020-10-23T13:41:40.000Z | 2022-03-12T00:35:27.000Z | modules/__init__.py | chandar-lab/PatchUp | 31133f358c2b59536eeb5ef304acec2b1436ec23 | [
"MIT"
] | 7 | 2020-06-17T23:42:07.000Z | 2021-12-21T10:11:12.000Z | from modules.patchup import PatchUp | 35 | 35 | 0.885714 | 5 | 35 | 6.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 35 | 1 | 35 | 35 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
18ab1779bc3cd4b4430491daf15df3f58b872b1c | 23,126 | py | Python | test/cnnl/op_test/test_div.py | Cambricon/catch | 2625da389f25a67066d20fb6b0c38250ef98f8ab | [
"BSD-2-Clause"
] | 20 | 2022-03-01T11:40:51.000Z | 2022-03-30T08:17:47.000Z | test/cnnl/op_test/test_div.py | Cambricon/catch | 2625da389f25a67066d20fb6b0c38250ef98f8ab | [
"BSD-2-Clause"
] | null | null | null | test/cnnl/op_test/test_div.py | Cambricon/catch | 2625da389f25a67066d20fb6b0c38250ef98f8ab | [
"BSD-2-Clause"
] | null | null | null | """
test_div
"""
from __future__ import print_function
import unittest
import logging
import copy
import sys
import os
import itertools
import torch
os.environ['ENABLE_CNNL_TRYCATCH'] = 'OFF'
import torch_mlu.core.mlu_model as ct # pylint: disable=C0413,W0611
PWD = os.path.dirname(os.path.abspath(__file__))
sys.path.append(PWD+"/../../")
from common_utils import testinfo, TestCase # pylint: disable=C0413,E0401,C0411
logging.basicConfig(level=logging.DEBUG)
class TestDivOp(TestCase):
"""
test-div
"""
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_tensor(self):
"""
test_tensor_tensor
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
channel_first = [True, False]
for data_type, err in dtype_list:
# test_dim_0
x_0 = torch.tensor(8.0)
y_0 = torch.tensor(2.0)
out_cpu = torch.div(x_0, y_0)
out_mlu = torch.div(self.to_mlu_dtype(x_0, data_type),
self.to_mlu_dtype(y_0, data_type))
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
# shape pairs below cover equal-shape and broadcasting cases
for shape1, shape2 in [((1, 3, 224, 224), (1, 3, 224, 1)),
((2, 30, 80), (2, 30, 80)),
((3, 20), (3, 20)),
((10,), (10,)),
((8732, 2), (8732, 2)),
((2, 2, 4, 2), (2,)),
((1, 2), (2, 2, 4, 2)),
((2, 1, 2, 4), (1, 2, 4)),
((1, 2, 4), (2, 1, 2, 4)),
((1, 3, 224, 224), (1, 1, 1, 1)),
((3, 3, 4, 224, 224), (4,224,224)),
((1, 3, 224), (1, 3, 1)),
((1, 3, 224, 224), (1,))]:
for channel in channel_first:
x__ = torch.rand(shape1, dtype=torch.float)
y__ = torch.randint(low = 1, high = 10, size = shape2, dtype=torch.float)
y__ = y__ + 0.00005 # float range:[0.00005, 500]
input_check_x = x__.clone()
input_check_y = y__.clone()
out_cpu = torch.div(x__, y__)
#channel last test
if channel is False:
x__ = self.convert_to_channel_last(x__)
out_mlu = torch.div(self.to_mlu_dtype(x__, data_type),
self.to_mlu_dtype(y__, torch.float))
self.assertTensorsEqual(input_check_x, x__, 0)
self.assertTensorsEqual(input_check_y, y__, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float().contiguous(),
err, use_MSE=True)
for channel in channel_first:
x_cpu = torch.rand(shape1, dtype=data_type)
y_cpu = torch.randint(low = 1, high = 10, size = shape2, dtype=data_type)
y_cpu = y_cpu + 0.00005 # float range:[0.00005, 500]
#channel last test
if channel is False:
x_cpu = self.convert_to_channel_last(x_cpu)
x_mlu = copy.deepcopy(x_cpu).to("mlu")
y_mlu = copy.deepcopy(y_cpu).to("mlu")
tmp_cpu = torch.randn(1)
tmp_mlu = copy.deepcopy(tmp_cpu).to("mlu")
out_cpu = torch.div(x_cpu, y_cpu, out=tmp_cpu)
out_mlu = torch.div(x_mlu, y_mlu, out=tmp_mlu)
self.assertTensorsEqual(tmp_cpu, tmp_mlu.cpu(), err, use_MSE=True)
self.assertTensorsEqual(out_cpu.float(), out_mlu.cpu().float().contiguous(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_not_contiguous_tensor_tensor(self):
"""
test_tensor_tensor
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
for shape1, shape2 in [((1, 10, 224, 224), (1, 10, 224, 1)),
((2, 30, 80), (2, 30, 80))]:
x_ = torch.rand(shape1, dtype=torch.float)
y_ = torch.randint(low=1, high=10, size=shape2, dtype=torch.float)
y_ = y_ + 0.00005 # float range:[0.00005, 500]
input_check_x = x_.clone()
input_check_y = y_.clone()
out_cpu = torch.div(x_[:,2:8,5:60], y_[:,2:8,5:60])
out_mlu = torch.div(self.to_mlu_dtype(x_, data_type)[:,2:8,5:60],
self.to_mlu_dtype(y_, torch.float)[:,2:8,5:60])
self.assertTensorsEqual(input_check_x, x_, 0)
self.assertTensorsEqual(input_check_y, y_, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float().contiguous(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_tensor_channel_last(self):
"""
test_tensor_tensor
"""
dtype_list = [torch.float, torch.half]
func_list = [lambda x: x, self.convert_to_channel_last, lambda x: x[..., ::2]]
param_list = [dtype_list, func_list, func_list]
for data_type, func_x, func_y in itertools.product(*param_list):
for shape1, shape2 in [((224, 224), (1, 10, 224, 1)),
((1, 10, 224, 224), (1, 10, 224, 1))]:
x_ = torch.rand(shape1, dtype=torch.float)
y_ = torch.randint(low=1, high=10, size=shape2, dtype=torch.float)
y_ = y_ + 0.00005 # float range:[0.00005, 500]
x__ = x_
y__ = y_
input_check_x = x__.clone()
input_check_y = y__.clone()
out_cpu = torch.div(func_x(x__), func_y(y__))
out_mlu = torch.div(func_x(self.to_mlu_dtype(x__, data_type)),
func_y(self.to_mlu_dtype(y__, torch.float)))
self.assertTensorsEqual(input_check_x, x__, 0)
self.assertTensorsEqual(input_check_y, y__, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float().contiguous(),
3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_inplace_not_contiguous_tensor_tensor(self):
"""
test_tensor_tensor
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
for shape1, shape2 in [((1, 10, 224, 224), (1, 10, 224, 1)),
((2, 30, 80), (1, 30, 80))]:
x_ = torch.rand(shape1, dtype=torch.float)
y_ = torch.randint(low=1, high=10, size=shape2, dtype=torch.float)
y_ = y_ + 0.00005 # float range:[0.00005, 500]
x__ = x_[:,2:8,5:60]
y__ = y_[:,2:8,5:60]
# inplace not contiguous div
mlu_x__ = self.to_mlu_dtype(x__, data_type)
mlu_y__ = self.to_mlu_dtype(y__, torch.float)
mlu_x_dptr = mlu_x__.data_ptr()
x__.div_(y__)
mlu_x__.div_(mlu_y__)
self.assertEqual(mlu_x_dptr, mlu_x__.data_ptr())
self.assertTensorsEqual(x__, mlu_x__.cpu().float(),
err, use_MSE=True)
inp = torch.randn(4, 6).fill_(2.0)  # avoid shadowing the builtin `input`
input_cpu = copy.deepcopy(inp)
input_mlu = copy.deepcopy(input_cpu).to('mlu')
input_cpu[:, :4] /= 2.0
input_mlu[:, :4] /= 2.0
self.assertTensorsEqual(input_cpu, input_mlu.cpu(), 3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_inplace_tensor_tensor_channel_last(self):
"""
test_tensor_tensor
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
for shape1, shape2 in [((1, 10, 224, 224), (1, 10, 224, 1))]:
x_ = torch.rand(shape1, dtype=torch.float)
y_ = torch.randint(low=1, high=10, size=shape2, dtype=torch.float)
y_ = y_ + 0.00005 # float range:[0.00005, 500]
x__ = x_.to(memory_format=torch.channels_last)
y__ = y_.to(memory_format=torch.channels_last)
mlu_x__ = self.to_mlu_dtype(x__, data_type)
mlu_y__ = self.to_mlu_dtype(y__, torch.float)
mlu_x_dptr = mlu_x__.data_ptr()
x__.div_(y__)
mlu_x__.div_(mlu_y__)
self.assertEqual(mlu_x_dptr, mlu_x__.data_ptr())
self.assertTensorsEqual(x__, mlu_x__.cpu().float(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_scalar(self):
"""
test_tensor_scalar
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
# test_dim_0
x_0 = torch.tensor(8.0)
y_0 = 2.0
out_cpu = torch.div(x_0, y_0)
out_mlu = torch.div(self.to_mlu_dtype(x_0, data_type), y_0)
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
# test_input_0
x_0 = torch.randn((0, 5))
y_0 = 2.0
out_cpu = torch.div(x_0, y_0)
out_mlu = torch.div(self.to_mlu_dtype(x_0, data_type), y_0)
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
# test_input_0
x_0 = torch.randint(low=1, high=10, size=(0, 5), dtype=torch.float)
y_0 = 2.0
out_cpu = torch.div(y_0, x_0)
out_mlu = torch.div(y_0, self.to_mlu_dtype(x_0, data_type))
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
# test_input_0_inplace
x_0 = torch.randn((0, 5))
y_0 = 2.0
x_mlu = self.to_mlu_dtype(x_0, data_type)
x_0 /= y_0
x_mlu /= y_0
self.assertTensorsEqual(x_0, x_mlu.cpu().float(),
err, use_MSE=True)
shape_list = [(5,), (0,), (7, 9), (9, 8, 7), (1, 2, 3, 4),
(10, 10, 10, 10), (100, 200), (3, 40, 32),
(1111,), (99, 30, 40), (34, 56, 78, 90),
(5, 6, 7, 8, 9), (9, 11, 12, 14, 15, 16)]
# channel last test.
channel_first = [True, False]
for shape in shape_list:
for channel in channel_first:
x__ = torch.rand(shape, dtype=torch.float)
input_check_x = x__.clone()
out_cpu_1 = torch.div(x__, 8.0)
if channel is False:
x__ = self.convert_to_channel_last(x__)
out_mlu_1 = torch.div(self.to_mlu_dtype(x__, data_type), 8.0)
self.assertTensorsEqual(input_check_x, x__, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu().float().contiguous(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_scalar_scalar(self):
"""
test_scalar_scalar
"""
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4)]
for shape in shape_list:
x__ = torch.rand(shape, dtype=torch.float)
input_check_x = x__.clone()
out_cpu_2 = torch.div(x__.sum(), 8.0)
out_mlu_2 = torch.div(self.to_mlu(x__).sum(), 8.0)
self.assertEqual(out_cpu_2.dtype, out_mlu_2.dtype)
self.assertTensorsEqual(input_check_x, x__, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu(),
3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_scalar_scalar_half(self):
"""
test_scalar_scalar
"""
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4)]
for shape in shape_list:
x__ = torch.rand(shape, dtype=torch.float)
input_check_x = x__.clone()
out_cpu_2 = torch.div(x__.sum(), 8.0)
out_mlu_2 = torch.div(self.to_mlu_dtype(x__, torch.half).sum(), 8.0)
self.assertTensorsEqual(input_check_x, x__, 0)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu().float(),
3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_scalar_scalar_self(self):
"""
test_scalar_scalar
"""
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4)]
for shape in shape_list:
x__ = torch.rand(shape, dtype=torch.float)
out_origin = x__.sum()
out_cpu_1 = out_origin.clone()
out_cpu_1 /= 5.0
out_mlu_1 = self.to_mlu(out_origin)
out_mlu_ptr = out_mlu_1.data_ptr()
out_mlu_1 /= 5.0
self.assertEqual(out_mlu_ptr, out_mlu_1.data_ptr())
self.assertEqual(out_cpu_1.dtype, out_mlu_1.dtype)
out_cpu_2 = out_origin.clone()
out_cpu_2 = out_cpu_2/5.0
out_mlu_2 = self.to_mlu(out_origin)
out_mlu_2 = out_mlu_2/5.0
# float type precision : 0.003
self.assertEqual(out_cpu_2.dtype, out_mlu_2.dtype)
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu(),
3e-3, use_MSE=True)
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu(),
3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_scalar_scalar_half_self(self):
"""
test_scalar_scalar
"""
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4)]
for shape in shape_list:
x__ = torch.rand(shape, dtype=torch.float)
out_origin = x__.sum()
out_cpu_1 = out_origin.clone()
out_cpu_1 /= 5.0
out_mlu_1 = self.to_mlu_dtype(out_origin, torch.half)
out_mlu_ptr = out_mlu_1.data_ptr()
out_mlu_1 /= 5.0
self.assertEqual(out_mlu_ptr, out_mlu_1.data_ptr())
out_cpu_2 = out_origin.clone()
out_cpu_2 = out_cpu_2/5.0
out_mlu_2 = self.to_mlu_dtype(out_origin, torch.half)
out_mlu_2 = out_mlu_2/5.0
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu().float(),
3e-3, use_MSE=True)
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu().float(),
3e-3, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_tensor_self(self):
"""
test_tensor_tensor
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
# test_dim_0
x_0 = torch.tensor(8.0)
y_0 = torch.tensor(2.0)
out_origin = x_0
out_cpu = out_origin.clone()
out_cpu /= y_0
out_mlu = self.to_mlu_dtype(out_origin, data_type)
out_mlu_ptr = out_mlu.data_ptr()
out_mlu /= self.to_mlu_dtype(y_0, data_type)
self.assertEqual(out_mlu_ptr, out_mlu.data_ptr())
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
for shape1, shape2 in [((1, 3, 224, 224), (1, 3, 224, 1)),
((2, 30, 80), (2, 30, 80)),
((3, 20), (3, 20)),
((10,), (10,)),
((2, 1, 2, 4), (1, 2, 4)),
((1, 3, 224, 224), (1, 1, 1, 1)),
((1, 3, 224), (1, 3, 1)),
((1, 3, 224, 224), (1,))]:
x__ = torch.rand(shape1, dtype=torch.float)
y__ = torch.rand(shape2, dtype=torch.float)
y__ = y__ + 0.00005 # float range:[0.00005, 500]
out_origin = x__
out_cpu_1 = out_origin.clone()
out_cpu_1 /= y__
out_mlu_1 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_ptr = out_mlu_1.data_ptr()
out_mlu_1 /= self.to_mlu_dtype(y__, data_type)
self.assertEqual(out_mlu_ptr, out_mlu_1.data_ptr())
out_cpu_2 = out_origin.clone()
out_cpu_2 = out_cpu_2/y__
out_mlu_2 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_2 = out_mlu_2/self.to_mlu_dtype(y__, data_type)
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu().float(),
err, use_MSE=True)
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu().float(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_scalar_self(self):
"""
test_tensor_scalar
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
for data_type, err in dtype_list:
# test_dim_0
x_0 = torch.tensor(8.0)
y_0 = 2.0
out_origin = x_0
out_cpu = out_origin.clone()
out_cpu /= y_0
out_mlu = self.to_mlu_dtype(out_origin, data_type)
out_mlu_ptr = out_mlu.data_ptr()
out_mlu /= y_0
self.assertEqual(out_mlu_ptr, out_mlu.data_ptr())
if data_type == torch.float:
self.assertEqual(out_cpu.dtype, out_mlu.dtype)
self.assertTensorsEqual(out_cpu, out_mlu.cpu().float(),
err, use_MSE=True)
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4),
(10, 10, 10, 10), (100, 200), (3, 40, 32),
(1111,), (99, 30, 40), (34, 56, 78, 90),
(5, 6, 7, 8, 9), (9, 11, 12, 14, 15, 16)]
for shape in shape_list:
x__ = torch.rand(shape, dtype=torch.float)
out_origin = x__
out_cpu_1 = out_origin.clone()
out_cpu_1 /= 5.0
out_mlu_1 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_ptr = out_mlu_1.data_ptr()
out_mlu_1 /= 5.0
self.assertEqual(out_mlu_ptr, out_mlu_1.data_ptr())
out_cpu_2 = out_origin.clone()
out_cpu_2 = out_cpu_2/5.0
out_mlu_2 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_2 = out_mlu_2/5.0
# float type precision : 0.003
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu().float(),
err, use_MSE=True)
self.assertTensorsEqual(out_cpu_2, out_mlu_2.cpu().float(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_tensor_scalar_with_different_datatype(self):
"""
test_tensor_scalar
"""
dtype_list = [(torch.float, 3e-3), (torch.half, 3e-3)]
other_dtype_list = [torch.float, torch.half, torch.int, torch.short,
torch.long, torch.int8, torch.bool, torch.uint8]
for data_type, err in dtype_list:
shape_list = [(5,), (7, 9), (9, 8, 7), (1, 2, 3, 4),
(10, 10, 10, 10), (100, 200), (3, 40, 32),
(1111,), (99, 30, 40), (34, 56, 78, 90),
(5, 6, 7, 8, 9), (9, 11, 12, 14, 15, 16)]
for shape in shape_list:
for other_data_type in other_dtype_list:
x__ = torch.rand(shape, dtype=torch.float)
y__ = torch.rand(shape, dtype=torch.float) + 1
y__ = y__.to(other_data_type)
out_origin = x__
out_cpu_1 = out_origin.clone()
out_cpu_1 /= y__
out_mlu_1 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_ptr = out_mlu_1.data_ptr()
out_mlu_1 /= self.to_mlu_dtype(y__, other_data_type)
self.assertEqual(out_mlu_ptr, out_mlu_1.data_ptr())
mlu_2 = self.to_mlu_dtype(out_origin, data_type)
out_mlu_2 = mlu_2 / self.to_mlu_dtype(y__, other_data_type)
# float type precision : 0.003
if data_type == torch.float:
self.assertEqual(out_cpu_1.dtype, out_mlu_1.dtype)
self.assertEqual(out_cpu_1.dtype, out_mlu_2.dtype)
self.assertTensorsEqual(out_cpu_1, out_mlu_1.cpu().float(),
err, use_MSE=True)
self.assertTensorsEqual(out_cpu_1, out_mlu_2.cpu().float(),
err, use_MSE=True)
#@unittest.skip("not test")
@testinfo()
def test_div_exception(self):
a = torch.randn(3) * 10
b = torch.randn(3) * 10
a_mlu = a.int().to('mlu')
b_mlu = b.int().to('mlu')
ref_msg = r"^Integer division of tensors using div or \/ is no longer supported, " \
+ r"and in a future release div will perform true division as in Python 3\. " \
+ r"Use true_divide or floor_divide \(\/\/ in Python\) instead\.$"
with self.assertRaisesRegex(RuntimeError, ref_msg):
torch.div(a_mlu, b_mlu)
a = torch.randn((0, 1)).to('mlu')
b = torch.randn((1, 0)).to('mlu')
ref_msg = r"output with shape \[0, 1\] doesn't match the broadcast shape \[0, 0\]"
with self.assertRaisesRegex(RuntimeError, ref_msg):
a /= b
if __name__ == '__main__':
unittest.main()
from phoneme_guesser import guess_phonemes as _guess_phonemes, \
get_phonemes as _get_phonemes
# Backwards-compatibility wrappers. TODO: deprecate?
def guess_phonemes(word, lang="en-us"):
return _guess_phonemes(word, lang)
def get_phonemes(name, lang="en-us"):
return _get_phonemes(name, lang)
a=10
b=20
c=30
d=40
#!/usr/bin/env python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import io
import os
from absl.testing import absltest
from absl.testing import flagsaver
import requests
from grr_colab import _api
from grr_colab import errors
from grr_colab import fs
from grr_colab import testing
from grr_response_core.lib.util import temp
from grr_response_proto import jobs_pb2
from grr_response_server import data_store
class FileSystemTest(testing.ColabE2ETest):
FAKE_CLIENT_ID = 'C.0123456789abcdef'
def testLs_ContainsFiles(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
dir_nodes = [
# name, content
('file1', b'foo'),
('file2', b'foo\nbar'),
]
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
for filename, file_content in dir_nodes:
filepath = os.path.join(temp_dirpath, filename)
with io.open(filepath, 'wb') as filedesc:
filedesc.write(file_content)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.ls(temp_dirpath)
stat_entries = sorted(stat_entries, key=lambda _: _.pathspec.path)
self.assertLen(stat_entries, 2)
self.assertEqual(stat_entries[0].pathspec.path,
os.path.join(temp_dirpath, 'file1'))
self.assertEqual(stat_entries[0].st_size, 3)
self.assertEqual(stat_entries[1].pathspec.path,
os.path.join(temp_dirpath, 'file2'))
self.assertEqual(stat_entries[1].st_size, 7)
def testLs_EmptyDirectory(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.ls(temp_dirpath)
self.assertEmpty(stat_entries)
def testLs_Recursive(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
dir_nodes = [
'file0',
os.path.join('dir1', 'file1'),
os.path.join('dir2', 'file2'),
os.path.join('dir2', 'file3'),
]
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
os.mkdir(os.path.join(temp_dirpath, 'dir1'))
os.mkdir(os.path.join(temp_dirpath, 'dir2'))
for path in dir_nodes:
with io.open(os.path.join(temp_dirpath, path), 'wb') as filedesc:
filedesc.write(b'foo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.ls(temp_dirpath, max_depth=5)
stat_entries = sorted(stat_entries, key=lambda _: _.pathspec.path)
self.assertLen(stat_entries, 6)
self.assertEqual(stat_entries[0].pathspec.path,
os.path.join(temp_dirpath, 'dir1'))
self.assertEqual(stat_entries[1].pathspec.path,
os.path.join(temp_dirpath, 'dir1', 'file1'))
self.assertEqual(stat_entries[2].pathspec.path,
os.path.join(temp_dirpath, 'dir2'))
self.assertEqual(stat_entries[3].pathspec.path,
os.path.join(temp_dirpath, 'dir2', 'file2'))
self.assertEqual(stat_entries[4].pathspec.path,
os.path.join(temp_dirpath, 'dir2', 'file3'))
self.assertEqual(stat_entries[5].pathspec.path,
os.path.join(temp_dirpath, 'file0'))
def testLs_MaxDepth(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
dir_components = ['dir1', 'dir2', 'dir3', 'dir4', 'dir5']
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
os.makedirs(os.path.join(temp_dirpath, *dir_components))
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.ls(temp_dirpath, max_depth=3)
stat_entries = sorted(stat_entries, key=lambda _: _.pathspec.path)
self.assertLen(stat_entries, 3)
self.assertEqual(stat_entries[0].pathspec.path,
os.path.join(temp_dirpath, 'dir1'))
self.assertEqual(stat_entries[1].pathspec.path,
os.path.join(temp_dirpath, 'dir1', 'dir2'))
self.assertEqual(stat_entries[2].pathspec.path,
os.path.join(temp_dirpath, 'dir1', 'dir2', 'dir3'))
@testing.with_approval_checks
def testLs_WithoutApproval(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with self.assertRaises(errors.ApprovalMissingError) as context:
fs_obj.ls('/foo/bar')
self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)
def testGlob_SingleFile(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
os.mkdir(os.path.join(temp_dirpath, 'dir'))
os.mkdir(os.path.join(temp_dirpath, 'dir1'))
os.mkdir(os.path.join(temp_dirpath, 'dir2'))
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.glob(os.path.join(temp_dirpath, 'dir'))
self.assertLen(stat_entries, 1)
self.assertEqual(stat_entries[0].pathspec.path,
os.path.join(temp_dirpath, 'dir'))
def testGlob_MultipleFiles(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
os.mkdir(os.path.join(temp_dirpath, 'dir'))
os.mkdir(os.path.join(temp_dirpath, 'dir1'))
os.mkdir(os.path.join(temp_dirpath, 'dir2'))
os.mkdir(os.path.join(temp_dirpath, 'new_dir'))
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.glob(os.path.join(temp_dirpath, 'dir*'))
stat_entries = sorted(stat_entries, key=lambda _: _.pathspec.path)
self.assertLen(stat_entries, 3)
self.assertEqual(stat_entries[0].pathspec.path,
os.path.join(temp_dirpath, 'dir'))
self.assertEqual(stat_entries[1].pathspec.path,
os.path.join(temp_dirpath, 'dir1'))
self.assertEqual(stat_entries[2].pathspec.path,
os.path.join(temp_dirpath, 'dir2'))
def testGlob_NoFiles(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
stat_entries = fs_obj.glob(os.path.join(temp_dirpath, '*'))
self.assertEmpty(stat_entries)
@testing.with_approval_checks
def testGlob_WithoutApproval(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with self.assertRaises(errors.ApprovalMissingError) as context:
fs_obj.glob('/foo/bar')
self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)
def testGrep_HasMatches(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar\nbar Foo\nFoo foo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.grep(os.path.join(temp_dirpath, filename), b'foo')
self.assertLen(matches, 4)
self.assertEqual(matches[0].data, b'foo')
self.assertEqual(matches[0].offset, 0)
self.assertEqual(matches[1].data, b'Foo')
self.assertEqual(matches[1].offset, 12)
self.assertEqual(matches[2].data, b'Foo')
self.assertEqual(matches[2].offset, 16)
self.assertEqual(matches[3].data, b'foo')
self.assertEqual(matches[3].offset, 20)
def testGrep_Regex(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar\nbar Foo.oo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.grep(os.path.join(temp_dirpath, filename), b'.oo')
self.assertLen(matches, 3)
self.assertEqual(matches[0].data, b'foo')
self.assertEqual(matches[0].offset, 0)
self.assertEqual(matches[1].data, b'Foo')
self.assertEqual(matches[1].offset, 12)
self.assertEqual(matches[2].data, b'.oo')
self.assertEqual(matches[2].offset, 15)
def testGrep_NoMatches(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.grep(os.path.join(temp_dirpath, filename), b'foobaar')
self.assertLen(matches, 0)
def testGrep_BinaryPattern(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo \xffOO\nFoo \xffoo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.grep(os.path.join(temp_dirpath, filename), b'\xffoo')
self.assertLen(matches, 2)
self.assertEqual(matches[0].data, b'\xffOO')
self.assertEqual(matches[0].offset, 4)
self.assertEqual(matches[1].data, b'\xffoo')
self.assertEqual(matches[1].offset, 12)
@testing.with_approval_checks
def testGrep_WithoutApproval(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with self.assertRaises(errors.ApprovalMissingError) as context:
fs_obj.grep('/foo/bar', b'quux')
self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)
def testFgrep_HasMatches(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar\nbar Foo\nFoo foo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.fgrep(os.path.join(temp_dirpath, filename), b'foo')
self.assertLen(matches, 2)
self.assertEqual(matches[0].data, b'foo')
self.assertEqual(matches[0].offset, 0)
self.assertEqual(matches[1].data, b'foo')
self.assertEqual(matches[1].offset, 20)
def testFgrep_Regex(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar\nbar Foo.oo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.fgrep(os.path.join(temp_dirpath, filename), b'.oo')
self.assertLen(matches, 1)
self.assertEqual(matches[0].data, b'.oo')
self.assertEqual(matches[0].offset, 15)
def testFgrep_NoMatches(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo bar')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.fgrep(os.path.join(temp_dirpath, filename), b'Foo')
self.assertLen(matches, 0)
def testFgrep_BinaryPattern(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
filename = 'foo'
with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
filedesc.write(b'foo \xffOO\nFoo \xffoo')
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
matches = fs_obj.fgrep(os.path.join(temp_dirpath, filename), b'\xffoo')
self.assertLen(matches, 1)
self.assertEqual(matches[0].data, b'\xffoo')
self.assertEqual(matches[0].offset, 12)
@testing.with_approval_checks
def testFgrep_WithoutApproval(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with self.assertRaises(errors.ApprovalMissingError) as context:
fs_obj.fgrep('/foo/bar', b'quux')
self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)
@testing.with_approval_checks
def testWget_WithoutApproval(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with flagsaver.flagsaver(grr_admin_ui_url=self.endpoint):
with self.assertRaises(errors.ApprovalMissingError) as context:
fs_obj.wget('/foo/bar')
self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)
def testWget_NoAdminURLSpecified(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
with flagsaver.flagsaver(grr_admin_ui_url=''):
with temp.AutoTempFilePath() as temp_file:
with self.assertRaises(ValueError):
fs_obj.wget(temp_file)
def testWget_FileDoesNotExist(self):
data_store.REL_DB.WriteClientMetadata(
client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)
    fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

    with flagsaver.flagsaver(grr_admin_ui_url=self.endpoint):
      with self.assertRaises(Exception):
        fs_obj.wget('/non/existing/file')

  def testWget_IsDirectory(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

    with flagsaver.flagsaver(grr_admin_ui_url=self.endpoint):
      with temp.AutoTempDirPath() as temp_dir:
        with self.assertRaises(Exception):
          fs_obj.wget(temp_dir)

  def testWget_LinkWorksWithOfflineClient(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)
    content = b'foo bar'

    with flagsaver.flagsaver(grr_admin_ui_url=self.endpoint):
      with temp.AutoTempFilePath() as temp_file:
        with io.open(temp_file, 'wb') as filedesc:
          filedesc.write(content)

        link = fs_obj.wget(temp_file)
        self.assertEqual(requests.get(link).content, content)

  def testOpen_ReadAll(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    filename = 'foo'
    content = b'foo bar'

    with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
      with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
        filedesc.write(content)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(os.path.join(temp_dirpath, filename)) as filedesc:
        self.assertEqual(filedesc.read(), content)

  def testOpen_ReadMore(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    filename = 'foo'
    content = b'foo bar'

    with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
      with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
        filedesc.write(content)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(os.path.join(temp_dirpath, filename)) as filedesc:
        self.assertEqual(filedesc.read(10), content)

  def testOpen_ReadLess(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    filename = 'foo'
    content = b'foo bar'

    with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
      with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
        filedesc.write(content)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(os.path.join(temp_dirpath, filename)) as filedesc:
        self.assertEqual(filedesc.read(3), b'foo')

  def testOpen_Buffering(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    filename = 'foo'
    size = 1024 * 1024

    with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
      with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
        filedesc.write(b'a' * size)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(os.path.join(temp_dirpath, filename)) as filedesc:
        self.assertEqual(filedesc.tell(), 0)
        self.assertLess(len(filedesc.read1()), size)
        self.assertGreater(filedesc.tell(), 0)

  def testOpen_ReadLargeFile(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    filename = 'foo'
    size = 1024 * 1024

    with temp.AutoTempDirPath(remove_non_empty=True) as temp_dirpath:
      with io.open(os.path.join(temp_dirpath, filename), 'wb') as filedesc:
        filedesc.write(b'a' * size)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(os.path.join(temp_dirpath, filename)) as filedesc:
        self.assertEqual(len(filedesc.read()), size)

  def testOpen_SeekWithinOneBuffer(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    content = b'foo bar'

    with temp.AutoTempFilePath() as temp_filepath:
      with io.open(temp_filepath, 'wb') as filedesc:
        filedesc.write(content)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(temp_filepath) as filedesc:
        filedesc.read(1)
        self.assertEqual(filedesc.seek(4), 4)
        self.assertEqual(filedesc.read(), b'bar')

  def testOpen_SeekOutOfBuffer(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    size = 1024 * 512

    with temp.AutoTempFilePath() as temp_filepath:
      with io.open(temp_filepath, 'wb') as filedesc:
        filedesc.write(b'a' * size)
        filedesc.write(b'b' * size)

      fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

      with fs_obj.open(temp_filepath) as filedesc:
        self.assertEqual(filedesc.seek(size - 1), size - 1)
        self.assertEqual(filedesc.read(2), b'ab')

  @testing.with_approval_checks
  def testOpen_WithoutApproval(self):
    data_store.REL_DB.WriteClientMetadata(
        client_id=FileSystemTest.FAKE_CLIENT_ID, fleetspeak_enabled=False)

    fs_obj = fs.FileSystem(self._get_fake_api_client(), jobs_pb2.PathSpec.OS)

    with self.assertRaises(errors.ApprovalMissingError) as context:
      fs_obj.open('/foo/bar')

    self.assertEqual(context.exception.client_id, FileSystemTest.FAKE_CLIENT_ID)

  @classmethod
  def _get_fake_api_client(cls):
    return _api.get().Client(cls.FAKE_CLIENT_ID).Get()


if __name__ == '__main__':
  absltest.main()
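The seek-and-read contract these tests exercise matches Python's standard buffered IO. A minimal stdlib-only sketch of the same behaviour, independent of GRR (`io.BytesIO` here is a stand-in, not the GRR file object):

```python
import io

# Stand-in for the fs_obj.open() handle used in the tests above:
# seek() returns the new absolute offset and read() continues from it.
reader = io.BytesIO(b'foo bar')

reader.read(1)              # consume b'f'
position = reader.seek(4)   # seek() reports the new offset
tail = reader.read()        # reads the remainder from offset 4
```

The `testOpen_SeekWithinOneBuffer` case above asserts exactly this: `seek(4)` returns 4 and a subsequent `read()` yields `b'bar'`.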
# File: preprocess_scripts/M3/moving_window/create_tfrecords.py
# Repo: HansikaPH/time-series-forecasting (license: MIT)

from tfrecords_handler.moving_window.tfrecord_writer import TFRecordWriter
import os

output_path = "../../../datasets/binary_data/M3/moving_window/"

if not os.path.exists(output_path):
    os.makedirs(output_path)

if __name__ == '__main__':
    # macro data
    tfrecord_writer = TFRecordWriter(
        input_size=12,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_macro_18i12.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_macro_18i12v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_macro_18i12.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_macro_18i12.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_macro_18i12v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_macro_18i12.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()

    # micro data
    tfrecord_writer = TFRecordWriter(
        input_size=13,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_micro_18i13.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_micro_18i13v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_micro_18i13.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_micro_18i13.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_micro_18i13v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_micro_18i13.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()

    # industry data
    tfrecord_writer = TFRecordWriter(
        input_size=13,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_industry_18i13.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_industry_18i13v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_industry_18i13.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_industry_18i13.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_industry_18i13v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_industry_18i13.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()

    # finance data
    tfrecord_writer = TFRecordWriter(
        input_size=13,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_finance_18i13.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_finance_18i13v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_finance_18i13.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_finance_18i13.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_finance_18i13v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_finance_18i13.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()

    # other data
    tfrecord_writer = TFRecordWriter(
        input_size=13,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_other_18i13.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_other_18i13v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_other_18i13.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_other_18i13.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_other_18i13v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_other_18i13.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()

    # demographic data
    tfrecord_writer = TFRecordWriter(
        input_size=13,
        output_size=18,
        train_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_demo_18i13.txt',
        validate_file_path='../../../datasets/text_data/M3/moving_window/m3_stl_monthly_demo_18i13v.txt',
        test_file_path='../../../datasets/text_data/M3/moving_window/m3_test_monthly_demo_18i13.txt',
        binary_train_file_path=output_path + 'm3_stl_monthly_demo_18i13.tfrecords',
        binary_validation_file_path=output_path + 'm3_stl_monthly_demo_18i13v.tfrecords',
        binary_test_file_path=output_path + 'm3_test_monthly_demo_18i13.tfrecords'
    )
    tfrecord_writer.read_text_data()
    tfrecord_writer.write_train_data_to_tfrecord_file()
    tfrecord_writer.write_validation_data_to_tfrecord_file()
    tfrecord_writer.write_test_data_to_tfrecord_file()
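The six writer blocks above differ only in the category name and the input window size, so the keyword arguments can be generated from a table instead. A hedged sketch of that refactor (the `writer_kwargs` helper is hypothetical; the paths are the ones used above):

```python
# Hypothetical refactor: derive the TFRecordWriter keyword arguments for
# each M3 category from a single table instead of six copied blocks.
TEXT_DIR = '../../../datasets/text_data/M3/moving_window/'
BINARY_DIR = '../../../datasets/binary_data/M3/moving_window/'

# category name -> input window size (output size is 18 everywhere)
CATEGORIES = {'macro': 12, 'micro': 13, 'industry': 13,
              'finance': 13, 'other': 13, 'demo': 13}


def writer_kwargs(category, input_size, output_size=18):
    """Build the TFRecordWriter keyword arguments for one M3 category."""
    stem = 'm3_stl_monthly_{0}_{1}i{2}'.format(category, output_size, input_size)
    test_stem = 'm3_test_monthly_{0}_{1}i{2}'.format(category, output_size, input_size)
    return {
        'input_size': input_size,
        'output_size': output_size,
        'train_file_path': TEXT_DIR + stem + '.txt',
        'validate_file_path': TEXT_DIR + stem + 'v.txt',
        'test_file_path': TEXT_DIR + test_stem + '.txt',
        'binary_train_file_path': BINARY_DIR + stem + '.tfrecords',
        'binary_validation_file_path': BINARY_DIR + stem + 'v.tfrecords',
        'binary_test_file_path': BINARY_DIR + test_stem + '.tfrecords',
    }
```

With this helper, the main block collapses to a loop: `TFRecordWriter(**writer_kwargs(c, i))` for each `(c, i)` in `CATEGORIES.items()`, followed by the four read/write calls.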
# File: pyquest_cffi/ops/ops.py
# Repo: mattchan-tencent/PyQuEST-cffi (license: Apache-2.0)

"""Python classes for Quest functions"""
# Copyright 2019 HQS Quantum Simulations GmbH
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pyquest_cffi.questlib import (
    quest, _PYQUEST, ffi_quest, qreal, tqureg, tquestenv, paulihamil
)
import numpy as np
from typing import Sequence, Optional, Tuple
from pyquest_cffi import cheat
class hadamard(_PYQUEST):
    r"""Implements Hadamard gate

    .. math::
        U = \frac{1}{\sqrt{2}} \begin{pmatrix}
        1 & 1\\
        1 & -1
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.hadamard(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = 1 / np.sqrt(2) * np.array([[1, 1], [1, -1]], dtype=complex)
        return matrix
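A quick numerical check of the Hadamard definition above (not part of the original module): H is unitary and self-inverse.

```python
import numpy as np

# Same matrix as hadamard.matrix() above.
H = 1 / np.sqrt(2) * np.array([[1, 1], [1, -1]], dtype=complex)

is_unitary = np.allclose(H.conj().T @ H, np.eye(2))   # H† H == I
is_involution = np.allclose(H @ H, np.eye(2))         # H H == I
```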
class pauliX(_PYQUEST):
    r"""Implements Pauli X gate

    .. math::
        U = \begin{pmatrix}
        0 & 1\\
        1 & 0
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.pauliX(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[0, 1], [1, 0]], dtype=complex)
        return matrix


class pauliY(_PYQUEST):
    r"""Implements Pauli Y gate

    .. math::
        U = \begin{pmatrix}
        0 & -i\\
        i & 0
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.pauliY(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[0, -1j], [1j, 0]], dtype=complex)
        return matrix


class pauliZ(_PYQUEST):
    r"""Implements Pauli Z gate

    .. math::
        U = \begin{pmatrix}
        1 & 0\\
        0 & -1
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.pauliZ(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[1, 0], [0, -1]], dtype=complex)
        return matrix


class sGate(_PYQUEST):
    r"""Implements S gate

    .. math::
        U = \begin{pmatrix}
        1 & 0\\
        0 & i
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.sGate(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[1, 0], [0, 1j]], dtype=complex)
        return matrix


class tGate(_PYQUEST):
    r"""Implements T gate

    .. math::
        U = \begin{pmatrix}
        1 & 0\\
        0 & e^{i \frac{\pi}{4}}
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to

    """

    def call_interactive(self, qureg: tqureg, qubit: int) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to

        """
        quest.tGate(qureg, qubit)

    def matrix(self, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]], dtype=complex)
        return matrix
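The phase gates above form a tower, which makes a handy sanity check (not part of the original module): T squared equals S, and S squared equals Pauli Z.

```python
import numpy as np

# Same matrices as tGate.matrix(), sGate.matrix() and pauliZ.matrix() above.
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]], dtype=complex)
S = np.array([[1, 0], [0, 1j]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

t_squared_is_s = np.allclose(T @ T, S)   # T^2 == S
s_squared_is_z = np.allclose(S @ S, Z)   # S^2 == Z
```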
class compactUnitary(_PYQUEST):
    r"""Implements general unitary gate U in compact notation

    .. math::
        U = \begin{pmatrix}
        \alpha & -\beta^{*}\\
        \beta & \alpha^{*}
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        alpha: complex parameter :math:`\alpha` of the unitary matrix
        beta: complex parameter :math:`\beta` of the unitary matrix

    """

    def call_interactive(self, qureg: tqureg, qubit: int, alpha: complex, beta: complex) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            alpha: complex parameter :math:`\alpha` of the unitary matrix
            beta: complex parameter :math:`\beta` of the unitary matrix

        Raises:
            RuntimeError: compactUnitary needs parameters |alpha|**2+|beta|**2 == 1

        """
        if not np.isclose(np.abs(alpha)**2 + np.abs(beta)**2, 1):
            raise RuntimeError("compactUnitary needs parameters |alpha|**2+|beta|**2 == 1")
        else:
            calpha = ffi_quest.new("Complex *")
            calpha.real = np.real(alpha)
            calpha.imag = np.imag(alpha)
            cbeta = ffi_quest.new("Complex *")
            cbeta.real = np.real(beta)
            cbeta.imag = np.imag(beta)
            quest.compactUnitary(qureg, qubit, calpha[0], cbeta[0])

    def matrix(self, alpha: complex, beta: complex, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            alpha: complex parameter :math:`\alpha` of the unitary matrix
            beta: complex parameter :math:`\beta` of the unitary matrix
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[alpha, -np.conj(beta)], [beta, np.conj(alpha)]], dtype=complex)
        return matrix
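The compact parametrisation is unitary exactly when the normalisation condition enforced by `call_interactive` holds. A quick numerical check with example parameters (not part of the original module):

```python
import numpy as np

# Example parameters satisfying |alpha|**2 + |beta|**2 == 1.
alpha, beta = 0.8, 0.6j

# Same construction as compactUnitary.matrix() above.
U = np.array([[alpha, -np.conj(beta)], [beta, np.conj(alpha)]], dtype=complex)

norm_ok = np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1)
is_unitary = np.allclose(U.conj().T @ U, np.eye(2))
```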
class phaseShift(_PYQUEST):
    r"""Implements pure :math:`\left|1 \right\rangle` phase shift gate

    .. math::
        U = \begin{pmatrix}
        1 & 0\\
        0 & e^{i \theta}
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation

        """
        if not (0 <= theta and theta <= 2 * np.pi):
            theta = np.mod(theta, 2 * np.pi)
        quest.phaseShift(qureg, qubit, theta)

    def matrix(self, theta: float, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[1, 0], [0, np.exp(1j * theta)]], dtype=complex)
        return matrix
class rotateAroundAxis(_PYQUEST):
    r"""Implements rotation around an arbitrary axis on the Bloch sphere

    .. math::
        U = \begin{pmatrix}
        \cos(\frac{\theta}{2}) & 0\\
        0 & \cos(\frac{\theta}{2})
        \end{pmatrix}
        + \begin{pmatrix}
        -i \sin(\frac{\theta}{2}) v_z & \sin(\frac{\theta}{2}) \left(-i v_x - v_y \right) \\
        \sin(\frac{\theta}{2}) \left(-i v_x + v_y \right) & i \sin(\frac{\theta}{2}) v_z
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation
        vector: Direction of the rotation axis, unit-vector

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float, vector: np.ndarray) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation
            vector: Direction of the rotation axis, unit-vector

        Raises:
            RuntimeError: vector needs to be a three component numpy array and unit-vector

        """
        if not (0 <= theta and theta <= 4 * np.pi):
            theta = np.mod(theta, 4 * np.pi)
        if not (vector.shape == (3,) and np.isclose(np.linalg.norm(vector), 1)):
            raise RuntimeError("vector needs to be a three component numpy array and unit-vector")
        else:
            vec = ffi_quest.new("Vector *")
            vec.x = vector[0]
            vec.y = vector[1]
            vec.z = vector[2]
            quest.rotateAroundAxis(qureg,
                                   qubit,
                                   theta,
                                   vec[0])

    def matrix(self, theta: float, vector: np.ndarray, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation
            vector: Direction of the rotation axis, unit-vector
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        c = np.cos(theta / 2)
        s = np.sin(theta / 2)
        vx = vector[0]
        vy = vector[1]
        vz = vector[2]
        matrix = np.array([[c - 1j * s * vz, s * (-1j * vx - vy)],
                           [s * (-1j * vx + vy), c + 1j * s * vz]], dtype=complex)
        return matrix
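A sanity check on the general axis formula above (not part of the original module): choosing the x-axis as the rotation axis reproduces the dedicated rotateX matrix.

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta / 2), np.sin(theta / 2)
vx, vy, vz = 1.0, 0.0, 0.0   # unit vector along x

# Same construction as rotateAroundAxis.matrix() above.
general = np.array([[c - 1j * s * vz, s * (-1j * vx - vy)],
                    [s * (-1j * vx + vy), c + 1j * s * vz]], dtype=complex)

# The rotateX matrix defined further below in the module.
rotate_x = np.array([[c, -1j * s], [-1j * s, c]], dtype=complex)

axes_agree = np.allclose(general, rotate_x)
```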
class rotateAroundSphericalAxis(_PYQUEST):
    r"""Implements rotation around an axis given in spherical coordinates

    .. math::
        U &= \begin{pmatrix}
        \cos(\frac{\theta}{2}) & 0\\
        0 & \cos(\frac{\theta}{2})
        \end{pmatrix}
        + \begin{pmatrix}
        -i \sin(\frac{\theta}{2}) v_z & \sin(\frac{\theta}{2}) \left(-i v_x - v_y \right) \\
        \sin(\frac{\theta}{2}) \left(-i v_x + v_y \right) & i \sin(\frac{\theta}{2}) v_z
        \end{pmatrix}\\
        v_x &= \sin(\theta_{sph}) \cos(\phi_{sph})\\
        v_y &= \sin(\theta_{sph}) \sin(\phi_{sph})\\
        v_z &= \cos(\theta_{sph})

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation
        spherical_theta: Rotation axis, unit-vector spherical coordinates theta
        spherical_phi: Rotation axis, unit-vector spherical coordinates phi

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float,
                         spherical_theta: float, spherical_phi: float) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation
            spherical_theta: Rotation axis, unit-vector spherical coordinates theta
            spherical_phi: Rotation axis, unit-vector spherical coordinates phi

        """
        if not (0 <= theta and theta <= 4 * np.pi):
            theta = np.mod(theta, 4 * np.pi)
        vec = ffi_quest.new("Vector *")
        vec.x = np.sin(spherical_theta) * np.cos(spherical_phi)
        vec.y = np.sin(spherical_theta) * np.sin(spherical_phi)
        vec.z = np.cos(spherical_theta)
        quest.rotateAroundAxis(qureg,
                               qubit,
                               theta,
                               vec[0])

    def matrix(self, theta: float, vector: np.ndarray, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation
            vector: Direction of the rotation axis, unit-vector
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        c = np.cos(theta / 2)
        s = np.sin(theta / 2)
        vx = vector[0]
        vy = vector[1]
        vz = vector[2]
        matrix = np.array([[c - 1j * s * vz, s * (-1j * vx - vy)],
                           [s * (-1j * vx + vy), c + 1j * s * vz]], dtype=complex)
        return matrix
class rotateX(_PYQUEST):
    r"""Implements :math:`e^{-i \frac{\theta}{2} \sigma^x}` XPower gate

    .. math::
        U = \begin{pmatrix}
        \cos(\frac{\theta}{2}) & 0\\
        0 & \cos(\frac{\theta}{2})
        \end{pmatrix}
        + \begin{pmatrix}
        0 & -i \sin(\frac{\theta}{2}) \\
        -i \sin(\frac{\theta}{2}) & 0
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

        """
        if not (0 <= theta and theta <= 4 * np.pi):
            theta = np.mod(theta, 4 * np.pi)
        quest.rotateX(qureg,
                      qubit,
                      theta)

    def matrix(self, theta: float, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        c = np.cos(theta / 2)
        s = np.sin(theta / 2)
        matrix = np.array([[c, -1j * s], [-1j * s, c]], dtype=complex)
        return matrix


class rotateY(_PYQUEST):
    r"""Implements :math:`e^{-i \frac{\theta}{2} \sigma^y}` YPower gate

    .. math::
        U = \begin{pmatrix}
        \cos(\frac{\theta}{2}) & 0\\
        0 & \cos(\frac{\theta}{2})
        \end{pmatrix}
        + \begin{pmatrix}
        0 & - \sin(\frac{\theta}{2}) \\
        \sin(\frac{\theta}{2}) & 0
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

        """
        if not (0 <= theta and theta <= 4 * np.pi):
            theta = np.mod(theta, 4 * np.pi)
        quest.rotateY(qureg,
                      qubit,
                      theta)

    def matrix(self, theta: float, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        c = np.cos(theta / 2)
        s = np.sin(theta / 2)
        matrix = np.array([[c, -s], [s, c]], dtype=complex)
        return matrix


class rotateZ(_PYQUEST):
    r"""Implements :math:`e^{-i \frac{\theta}{2} \sigma^z}` ZPower gate

    .. math::
        U = \begin{pmatrix}
        \cos(\frac{\theta}{2}) & 0\\
        0 & \cos(\frac{\theta}{2})
        \end{pmatrix}
        + \begin{pmatrix}
        - i \sin(\frac{\theta}{2}) & 0 \\
        0 & i \sin(\frac{\theta}{2})
        \end{pmatrix}

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

    """

    def call_interactive(self, qureg: tqureg, qubit: int, theta: float) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`

        """
        if not (0 <= theta and theta <= 4 * np.pi):
            theta = np.mod(theta, 4 * np.pi)
        quest.rotateZ(qureg,
                      qubit,
                      theta)

    def matrix(self, theta: float, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            theta: Angle theta of the rotation, in interval 0 to :math:`4 \pi`
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        c = np.cos(theta / 2)
        s = np.sin(theta / 2)
        matrix = np.array([[c - 1j * s, 0], [0, c + 1j * s]], dtype=complex)
        return matrix
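rotateZ is diagonal in the computational basis, so its matrix reduces to pure phases. A quick numerical check (not part of the original module):

```python
import numpy as np

theta = 1.1
c, s = np.cos(theta / 2), np.sin(theta / 2)

# Same construction as rotateZ.matrix() above.
rz = np.array([[c - 1j * s, 0], [0, c + 1j * s]], dtype=complex)

# cos(x) -/+ i sin(x) == exp(-/+ i x), so rz == diag(e^{-i theta/2}, e^{+i theta/2}).
diag = np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

is_diagonal_phase = np.allclose(rz, diag)
```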
class unitary(_PYQUEST):
    r"""Implements an arbitrary one-qubit gate given by a unitary matrix

    Args:
        qureg: quantum register
        qubit: qubit the unitary gate is applied to
        matrix: Unitary matrix of the one qubit gate

    """

    def call_interactive(self, qureg: tqureg, qubit: int, matrix: np.ndarray) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            qubit: qubit the unitary gate is applied to
            matrix: Unitary matrix of the one qubit gate

        Raises:
            RuntimeError: matrix needs to be a (2, 2) unitary numpy array

        """
        if not (matrix.shape == (2, 2) and np.all(np.isclose(matrix.conj().T @ matrix, np.eye(2)))):
            raise RuntimeError("matrix needs to be a (2, 2) unitary numpy array")
        else:
            mat = ffi_quest.new("ComplexMatrix2 *")
            for i in range(2):
                for j in range(2):
                    mat.real[i][j] = np.real(matrix[i, j])
                    mat.imag[i][j] = np.imag(matrix[i, j])
            quest.unitary(qureg,
                          qubit,
                          mat[0])

    def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            matrix: Unitary matrix of the one qubit gate
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        return matrix
# Controlled and other Two-Qubit Operations
class twoQubitUnitary(_PYQUEST):
    r"""General two qubit unitary gate

    Implements a general two-qubit gate defined by a matrix.
    If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
    the least significant qubit is the right qubit and the most
    significant qubit is the left qubit.

    Args:
        qureg: quantum register
        target_qubit_1: least significant qubit
        target_qubit_2: most significant qubit
        matrix: 4 by 4 matrix that defines the two qubit gate

    """

    def call_interactive(self,
                         qureg: tqureg,
                         target_qubit_1: int,
                         target_qubit_2: int,
                         matrix: np.ndarray) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            target_qubit_1: least significant qubit
            target_qubit_2: most significant qubit
            matrix: 4 by 4 matrix that defines the two qubit gate

        """
        mat = ffi_quest.new("ComplexMatrix4 *")
        for i in range(4):
            for j in range(4):
                mat.real[i][j] = np.real(matrix[i, j])
                mat.imag[i][j] = np.imag(matrix[i, j])
        quest.twoQubitUnitary(qureg,
                              target_qubit_1,
                              target_qubit_2,
                              mat[0])

    def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        Args:
            matrix: 4 by 4 matrix that defines the two qubit gate
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        return matrix
class controlledTwoQubitUnitary(_PYQUEST):
    r"""Controlled two qubit unitary gate

    Implements a general two-qubit gate defined by a matrix controlled by a third qubit.
    If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
    the least significant qubit is the right qubit and the most
    significant qubit is the left qubit.

    Args:
        qureg: quantum register
        control: control qubit
        target_qubit_1: least significant qubit
        target_qubit_2: most significant qubit
        matrix: 4 by 4 matrix that defines the two qubit gate

    """

    def call_interactive(self,
                         qureg: tqureg, control: int,
                         target_qubit_1: int,
                         target_qubit_2: int,
                         matrix: np.ndarray) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            control: control qubit
            target_qubit_1: least significant qubit
            target_qubit_2: most significant qubit
            matrix: 4 by 4 matrix that defines the two qubit gate

        """
        mat = ffi_quest.new("ComplexMatrix4 *")
        for i in range(4):
            for j in range(4):
                mat[0].real[i][j] = np.real(matrix[i, j])
                mat[0].imag[i][j] = np.imag(matrix[i, j])
        quest.controlledTwoQubitUnitary(qureg,
                                        control,
                                        target_qubit_1,
                                        target_qubit_2,
                                        mat[0])

    def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        The control qubit is always assumed to be the most significant
        qubit: |0xy> -> |0>|xy>, |1xy> -> |1> U |xy>

        Args:
            matrix: 4 by 4 matrix that defines the two qubit gate
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        dim = matrix.shape[0]
        return np.block([[np.eye(dim), np.zeros((dim, dim))],
                         [np.zeros((dim, dim)), matrix]])
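The controlled matrix construction above embeds U as the lower-right block: identity acts when the control is |0>, U when the control is |1>. A quick numerical check with an example two-qubit unitary (not part of the original module):

```python
import numpy as np

# Example two-qubit unitary (a permutation matrix, hence unitary).
U = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=complex)

# Same block construction as controlledTwoQubitUnitary.matrix() above.
dim = U.shape[0]
controlled = np.block([[np.eye(dim), np.zeros((dim, dim))],
                       [np.zeros((dim, dim)), U]])

upper_is_identity = np.allclose(controlled[:dim, :dim], np.eye(dim))
lower_is_u = np.allclose(controlled[dim:, dim:], U)
```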
class controlledCompactUnitary(_PYQUEST):
    r"""Implements a controlled general unitary gate U in compact notation

    .. math::
        U = \begin{pmatrix}
        1 & 0 & 0 & 0\\
        0 & 1 & 0 & 0\\
        0 & 0 & \alpha & -\beta^{*}\\
        0 & 0 & \beta & \alpha^{*}
        \end{pmatrix}

    Args:
        qureg: quantum register
        control: qubit that controls the application of the unitary
        qubit: qubit the unitary gate is applied to
        alpha: complex parameter :math:`\alpha` of the unitary matrix
        beta: complex parameter :math:`\beta` of the unitary matrix

    """

    def call_interactive(self,
                         qureg: tqureg,
                         control: int,
                         qubit: int,
                         alpha: complex,
                         beta: complex) -> None:
        r"""Interactive call of PyQuest-cffi

        Args:
            qureg: quantum register
            control: qubit that controls the application of the unitary
            qubit: qubit the unitary gate is applied to
            alpha: complex parameter :math:`\alpha` of the unitary matrix
            beta: complex parameter :math:`\beta` of the unitary matrix

        Raises:
            RuntimeError: compactUnitary needs parameters |alpha|**2+|beta|**2 == 1

        """
        if not np.isclose(np.abs(alpha)**2 + np.abs(beta)**2, 1):
            raise RuntimeError("compactUnitary needs parameters |alpha|**2+|beta|**2 == 1")
        else:
            calpha = ffi_quest.new("Complex *")
            calpha.real = np.real(alpha)
            calpha.imag = np.imag(alpha)
            cbeta = ffi_quest.new("Complex *")
            cbeta.real = np.real(beta)
            cbeta.imag = np.imag(beta)
            quest.controlledCompactUnitary(qureg, control, qubit, calpha[0], cbeta[0])

    def matrix(self, alpha: complex, beta: complex, **kwargs) -> np.ndarray:
        r"""The definition of the gate as a unitary matrix

        The control qubit is always assumed to be the most significant
        qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>

        Args:
            alpha: complex parameter :math:`\alpha` of the unitary matrix
            beta: complex parameter :math:`\beta` of the unitary matrix
            **kwargs: Additional keyword arguments

        Returns:
            np.ndarray

        """
        matrix = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0],
                           [0, 0, alpha, -np.conj(beta)],
                           [0, 0, beta, np.conj(alpha)]], dtype=complex)
        return matrix
class controlledNot(_PYQUEST):
r"""Implements a controlled NOT gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 0 & 1\\
0 & 0 & 1 & 0
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
quest.controlledNot(qureg, control, qubit)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most relevant
qubit |0x> -> |0>|x> |1x> -> |1> NOT |x>
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 0, 1],
[0, 0, 1, 0]], dtype=np.complex)
return matrix
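As a quick standalone sanity check (plain numpy, independent of the QuEST bindings), the CNOT matrix defined above is self-inverse, and with the control as the most significant qubit it flips the target exactly when the control is set:

```python
import numpy as np

# CNOT with the control as the most significant qubit, as in controlledNot.matrix
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Applying CNOT twice gives the identity (the gate is self-inverse)
double_application = cnot @ cnot

# |10> (index 2) maps to |11> (index 3): the target flips when the control is set
flipped = cnot @ np.array([0, 0, 1, 0], dtype=complex)
```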
class controlledPauliY(_PYQUEST):
r"""Implements a controlled PauliY gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 0 & -i\\
0 & 0 & i & 0
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
quest.controlledPauliY(qureg, control, qubit)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 0, -1j],
[0, 0, 1j, 0]], dtype=complex)
return matrix
class controlledPhaseFlip(_PYQUEST):
r"""Implements a controlled phase flip gate also known as controlled Z gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & -1
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
quest.controlledPhaseFlip(qureg, control, qubit)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 1, 0],
[0, 0, 0, -1]], dtype=complex)
return matrix
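A property worth noting about this gate (shown as a plain numpy sketch, not part of the bindings): the controlled-Z matrix is diagonal, so it is symmetric under exchanging the control and target qubits:

```python
import numpy as np

# Controlled-Z matrix as defined in controlledPhaseFlip.matrix
cz = np.diag([1, 1, 1, -1]).astype(complex)

# SWAP exchanges the two qubits
swap = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# Conjugating with SWAP exchanges the roles of control and target;
# the controlled-Z gate is invariant under this exchange
cz_exchanged = swap @ cz @ swap
```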
class swapGate(_PYQUEST):
r"""Implements a SWAP gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
Args:
qureg: quantum register
control: the first qubit the gate acts on
qubit: the second qubit the gate acts on
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
quest.swapGate(qureg,
control,
qubit,
)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 0, 1, 0],
[0, 1, 0, 0],
[0, 0, 0, 1]], dtype=complex)
return matrix
class sqrtSwapGate(_PYQUEST):
r"""Implements a square root SWAP gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & \frac{1}{2}(1+i) & \frac{1}{2}(1-i) & 0\\
0 & \frac{1}{2}(1-i) & \frac{1}{2}(1+i) & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
Args:
qureg: quantum register
control: the first qubit the gate acts on
qubit: the second qubit the gate acts on
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
quest.sqrtSwapGate(qureg,
control,
qubit,
)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
[0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
[0, 0, 0, 1]], dtype=complex)
return matrix
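A standalone numpy check of the matrix above (independent of the QuEST bindings): applying the square-root-SWAP matrix twice reproduces the SWAP gate, and the matrix is unitary:

```python
import numpy as np

# Square-root-SWAP matrix as defined in sqrtSwapGate.matrix
sqrt_swap = np.array([[1, 0, 0, 0],
                      [0, (1 + 1j) / 2, (1 - 1j) / 2, 0],
                      [0, (1 - 1j) / 2, (1 + 1j) / 2, 0],
                      [0, 0, 0, 1]], dtype=complex)

swap = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

# Squaring the square root recovers the full SWAP
squared = sqrt_swap @ sqrt_swap
```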
class sqrtISwap(_PYQUEST):
r"""Implements a square root ISwap gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & \frac{1}{\sqrt{2}} & \frac{i}{\sqrt{2}} & 0\\
0 & \frac{i}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
Args:
qureg: quantum register
control: the first qubit the gate acts on
qubit: the second qubit the gate acts on
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
matrix = self.matrix()
mat = ffi_quest.new("ComplexMatrix4 *")
for i in range(4):
for j in range(4):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.twoQubitUnitary(qureg,
control,
qubit,
mat[0])
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1 / np.sqrt(2), 1j / np.sqrt(2), 0],
[0, 1j / np.sqrt(2), 1 / np.sqrt(2), 0],
[0, 0, 0, 1]], dtype=complex)
return matrix
class invSqrtISwap(_PYQUEST):
r"""Implements inverse square root ISwap gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & \frac{1}{\sqrt{2}} & \frac{-i}{\sqrt{2}} & 0\\
0 & \frac{-i}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
Args:
qureg: quantum register
control: the first qubit the gate acts on
qubit: the second qubit the gate acts on
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
matrix = self.matrix()
mat = ffi_quest.new("ComplexMatrix4 *")
for i in range(4):
for j in range(4):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.twoQubitUnitary(qureg,
control,
qubit,
mat[0])
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1 / np.sqrt(2), -1j / np.sqrt(2), 0],
[0, -1j / np.sqrt(2), 1 / np.sqrt(2), 0],
[0, 0, 0, 1]], dtype=complex)
return matrix
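A standalone numpy check (independent of the bindings) that the inverse square-root-ISwap matrix above really is the inverse, i.e. the conjugate transpose, of the sqrtISwap matrix:

```python
import numpy as np

s = 1 / np.sqrt(2)

# sqrtISwap matrix as defined in sqrtISwap.matrix
sqrt_iswap = np.array([[1, 0, 0, 0],
                       [0, s, 1j * s, 0],
                       [0, 1j * s, s, 0],
                       [0, 0, 0, 1]], dtype=complex)

# invSqrtISwap matrix as defined above
inv_sqrt_iswap = np.array([[1, 0, 0, 0],
                           [0, s, -1j * s, 0],
                           [0, -1j * s, s, 0],
                           [0, 0, 0, 1]], dtype=complex)

# The two gates compose to the identity
product = inv_sqrt_iswap @ sqrt_iswap
```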
class controlledPhaseShift(_PYQUEST):
r"""Implements a controlled phase shift gate also known as a controlled Z-power gate
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & e^{i\theta}
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: The angle of the controlled Z-rotation
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int, theta: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: The angle of the controlled Z-rotation
"""
if not (0 <= theta <= 2 * np.pi):
theta = np.mod(theta, 2 * np.pi)
quest.controlledPhaseShift(qureg, control, qubit, theta)
def matrix(self, theta: float, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
theta: The angle of the controlled Z-rotation
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 1, 0],
[0, 0, 0, np.exp(1j * theta)]], dtype=complex)
return matrix
class controlledRotateAroundAxis(_PYQUEST):
r"""Rotation around a general axis.
Implements a controlled rotation around a vector :math:`\vec{v}`
:math:`e^{-i \frac{\theta}{2} \vec{v} \vec{\sigma}}`
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & \cos(\frac{\theta}{2}) & 0\\
0 & 0 & 0 & \cos(\frac{\theta}{2})
\end{pmatrix}
+ \begin{pmatrix}
0 & 0 & 0 & 0\\
0 & 0 & 0 & 0\\
0 & 0 & -i \sin(\frac{\theta}{2}) v_z & \sin(\frac{\theta}{2}) \left(-i v_x - v_y \right)\\
0 & 0 & \sin(\frac{\theta}{2}) \left(-i v_x + v_y \right) & i \sin(\frac{\theta}{2}) v_z)
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
vector: Direction of the rotation axis, unit-vector
"""
def call_interactive(self,
qureg: tqureg,
control: int,
qubit: int,
theta: float,
vector: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
vector: Direction of the rotation axis, unit-vector
Raises:
RuntimeError: vector needs to be a three component numpy array and unit-vector
"""
if not (0 <= theta <= 4 * np.pi):
theta = np.mod(theta, 4 * np.pi)
if not (vector.shape == (3,) and np.isclose(np.linalg.norm(vector), 1)):
raise RuntimeError("vector needs to be a three component numpy array and unit-vector")
else:
vec = ffi_quest.new("Vector *")
vec.x = vector[0]
vec.y = vector[1]
vec.z = vector[2]
quest.controlledRotateAroundAxis(qureg, control,
qubit,
theta,
vec[0])
def matrix(self, theta: float, vector: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
theta: Angle theta of the rotation
vector: Direction of the rotation axis, unit-vector
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
c = np.cos(theta / 2)
s = np.sin(theta / 2)
vx = vector[0]
vy = vector[1]
vz = vector[2]
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, c - 1j * s * vz, s * (-1j * vx - vy)],
[0, 0, s * (-1j * vx + vy), c + 1j * s * vz]], dtype=complex)
return matrix
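The general axis rotation above can be cross-checked against a special case with plain numpy (independent of the bindings): for the unit vector (0, 0, 1) the matrix reduces to a controlled Z-rotation, diag(1, 1, e^{-i theta/2}, e^{i theta/2}):

```python
import numpy as np

def controlled_rotate_around_axis(theta, vector):
    # Mirrors the matrix construction in controlledRotateAroundAxis.matrix
    c = np.cos(theta / 2)
    s = np.sin(theta / 2)
    vx, vy, vz = vector
    return np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, c - 1j * s * vz, s * (-1j * vx - vy)],
                     [0, 0, s * (-1j * vx + vy), c + 1j * s * vz]], dtype=complex)

theta = 0.7
# Rotation around the z-axis: the general formula collapses to a diagonal matrix
general = controlled_rotate_around_axis(theta, np.array([0.0, 0.0, 1.0]))
crz = np.diag([1, 1, np.exp(-1j * theta / 2), np.exp(1j * theta / 2)]).astype(complex)
```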
class controlledRotateX(_PYQUEST):
r"""Implements a controlled rotation around the X axis
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & \cos(\frac{\theta}{2}) & 0\\
0 & 0 & 0 & \cos(\frac{\theta}{2})
\end{pmatrix}
+ \begin{pmatrix}
0 & 0 & 0 & 0\\
0 & 0 & 0 & 0\\
0 & 0 & 0 & -i \sin(\frac{\theta}{2}) \\
0 & 0 & -i \sin(\frac{\theta}{2}) & 0
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int, theta: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
if not (0 <= theta <= 4 * np.pi):
theta = np.mod(theta, 4 * np.pi)
quest.controlledRotateX(qureg, control,
qubit,
theta)
def matrix(self, theta: float, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
theta: Angle theta of the rotation
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
c = np.cos(theta / 2)
s = np.sin(theta / 2)
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, c, -1j * s],
[0, 0, -1j * s, c]], dtype=complex)
return matrix
class controlledRotateY(_PYQUEST):
r"""Implements a controlled rotation around the Y axis
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & \cos(\frac{\theta}{2}) & 0\\
0 & 0 & 0 & \cos(\frac{\theta}{2})
\end{pmatrix}
+ \begin{pmatrix}
0 & 0 & 0 & 0\\
0 & 0 & 0 & 0\\
0 & 0 & 0 & - \sin(\frac{\theta}{2}) \\
0 & 0 & \sin(\frac{\theta}{2}) & 0
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int, theta: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
if not (0 <= theta <= 4 * np.pi):
theta = np.mod(theta, 4 * np.pi)
quest.controlledRotateY(qureg, control,
qubit,
theta)
def matrix(self, theta: float, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
theta: Angle theta of the rotation
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
c = np.cos(theta / 2)
s = np.sin(theta / 2)
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, c, -s],
[0, 0, s, c]], dtype=complex)
return matrix
class controlledRotateZ(_PYQUEST):
r"""Implements a controlled rotation around the Z axis
.. math::
U = \begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & \cos(\frac{\theta}{2}) & 0\\
0 & 0 & 0 & \cos(\frac{\theta}{2})
\end{pmatrix}
+ \begin{pmatrix}
0 & 0 & 0 & 0\\
0 & 0 & 0 & 0\\
0 & 0 & - i \sin(\frac{\theta}{2}) & 0 \\
0 & 0 & 0 & i \sin(\frac{\theta}{2})
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int, theta: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
theta: Angle theta of the rotation
"""
if not (0 <= theta <= 4 * np.pi):
theta = np.mod(theta, 4 * np.pi)
quest.controlledRotateZ(qureg, control,
qubit,
theta)
def matrix(self, theta: float, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
theta: Angle theta of the rotation
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
c = np.cos(theta / 2)
s = np.sin(theta / 2)
matrix = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, c - 1j * s, 0],
[0, 0, 0, c + 1j * s]], dtype=complex)
return matrix
class controlledUnitary(_PYQUEST):
r"""Implements a controlled arbitrary one-qubit gate given by a unitary matrix
Args:
qureg: quantum register
control: qubit that controls the unitary
qubit: qubit the unitary gate is applied to
matrix: Unitary matrix of the one qubit gate
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int, matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the unitary
qubit: qubit the unitary gate is applied to
matrix: Unitary matrix of the one qubit gate
Raises:
RuntimeError: matrix needs to be a (2, 2) unitary numpy array
"""
if not (matrix.shape == (2, 2) and np.all(np.isclose(matrix.conj().T @ matrix, np.eye(2)))):
raise RuntimeError("matrix needs to be a (2, 2) unitary numpy array")
else:
mat = ffi_quest.new("ComplexMatrix2 *")
for i in range(2):
for j in range(2):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.controlledUnitary(qureg, control,
qubit,
mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0x> -> |0>|x>, |1x> -> |1> U |x>
Args:
matrix: Unitary matrix of the one qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
mat = np.array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, matrix[0, 0], matrix[0, 1]],
[0, 0, matrix[1, 0], matrix[1, 1]]], dtype=complex)
return mat
# Multi-controlled and multi-qubit Operations
class multiControlledTwoQubitUnitary(_PYQUEST):
r"""Two qubit unitary gate controlled by multiple qubits
Implements a general two-qubit gate defined by a matrix controlled by multiple qubits
If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
the least significant qubit is the right qubit and the most
significant qubit is the left qubit
Args:
qureg: quantum register
controls: control qubits
target_qubit_1: least significant qubit
target_qubit_2: most significant qubit
matrix: 4 by 4 matrix that defines the two qubit gate
"""
def call_interactive(self,
qureg: tqureg,
controls: Sequence[int],
target_qubit_1: int,
target_qubit_2: int,
matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: control qubits
target_qubit_1: least significant qubit
target_qubit_2: most significant qubit
matrix: 4 by 4 matrix that defines the two qubit gate
"""
mat = ffi_quest.new("ComplexMatrix4 *")
for i in range(4):
for j in range(4):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer = ffi_quest.new("int[{}]".format(len(controls)))
number_controls = len(controls)
for co, control in enumerate(controls):
pointer[co] = control
quest.multiControlledTwoQubitUnitary(qureg,
pointer,
number_controls,
target_qubit_1,
target_qubit_2,
mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 4 by 4 matrix that defines the two qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class multiControlledPhaseFlip(_PYQUEST):
r"""Phase Flip controlled by multiple qubits
Implements a multi-controlled phase flip gate also known as a controlled Z gate.
If all qubits in the controls are :math:`\left|1\right\rangle` the sign is flipped.
No change occurs otherwise
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
number_controls: number of the control qubits
"""
def call_interactive(self,
qureg: tqureg,
controls: Sequence[int],
number_controls: Optional[int] = None) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
number_controls: number of the control qubits
"""
pointer = ffi_quest.new("int[{}]".format(len(controls)))
if number_controls is None:
number_controls = len(controls)
for co, control in enumerate(controls):
pointer[co] = control
quest.multiControlledPhaseFlip(qureg, pointer, number_controls)
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError
class multiControlledPhaseShift(_PYQUEST):
r"""Phase Shift controlled by multiple qubits
Implements a multi-controlled phase shift gate also known as a controlled Z-power gate.
If all qubits in the controls are :math:`\left|1\right\rangle` the phase is shifted by theta.
No change occurs otherwise
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
number_controls: number of the control qubits
theta: Angle of the rotation around Z-axis
"""
def call_interactive(self, qureg: tqureg, controls: Sequence[int],
number_controls: Optional[int] = None, theta: float = 0) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
number_controls: number of the control qubits
theta: Angle of the rotation around Z-axis
"""
if not (0 <= theta <= 4 * np.pi):
theta = np.mod(theta, 4 * np.pi)
pointer = ffi_quest.new("int[{}]".format(len(controls)))
for co, control in enumerate(controls):
pointer[co] = control
if number_controls is None:
number_controls = len(controls)
quest.multiControlledPhaseShift(qureg, pointer, number_controls, theta)
def matrix(self, theta: float, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
theta: Angle of the rotation around Z-axis
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError
class multiControlledUnitary(_PYQUEST):
r"""Generic unitary gate controlled by multiple qubits
Implements a multi-controlled arbitrary one-qubit gate given by a unitary matrix
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
qubit: qubit the unitary gate is applied to
matrix: Unitary matrix of the one qubit gate
"""
def call_interactive(self, qureg: tqureg,
controls: Sequence[int],
qubit: int, matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: qubits that control the application of the unitary
qubit: qubit the unitary gate is applied to
matrix: Unitary matrix of the one qubit gate
Raises:
RuntimeError: matrix needs to be a (2, 2) unitary numpy array
"""
if not (matrix.shape == (2, 2) and np.all(np.isclose(matrix.conj().T @ matrix, np.eye(2)))):
raise RuntimeError("matrix needs to be a (2, 2) unitary numpy array")
else:
mat = ffi_quest.new("ComplexMatrix2 *")
for i in range(2):
for j in range(2):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer = ffi_quest.new("int[{}]".format(len(controls)))
for co, control in enumerate(controls):
pointer[co] = control
number_controls = len(controls)
quest.multiControlledUnitary(qureg, pointer,
number_controls,
qubit,
mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: Unitary matrix of the one qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError
class controlledMultiQubitUnitary(_PYQUEST):
r"""Controlled general unitary gate acting on N qubits
Implements a general N-qubit gate defined by a matrix and controlled by a single control qubit
If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
the least significant qubit is the right qubit and the most
significant qubit is the left qubit
Args:
qureg: quantum register
control: control qubit
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
"""
def call_interactive(self,
qureg: tqureg, control: int,
targets: Sequence[int],
matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: control qubit
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Raises:
RuntimeError: Shape of matrix and length of targets are different
"""
if 2**len(targets) != matrix.shape[0] or 2**len(targets) != matrix.shape[1]:
raise RuntimeError("Shape of matrix and length of targets are different")
dim = matrix.shape[0]
mat = quest.createComplexMatrixN(len(targets))
for i in range(dim):
for j in range(dim):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer = ffi_quest.new("int[{}]".format(len(targets)))
for co, target in enumerate(targets):
pointer[co] = target
quest.controlledMultiQubitUnitary(qureg,
control,
pointer,
len(targets),
mat)
quest.destroyComplexMatrixN(mat)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
The control qubit is always assumed to be the most significant
qubit: |0xy> -> |0>|xy>, |1xy> -> |1> U |xy>
Args:
matrix: N by N matrix that defines the N qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
dim = matrix.shape[0]
return np.block([[np.eye(dim), np.zeros((dim, dim))],
[np.zeros((dim, dim)), matrix]])
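The block construction returned above can be checked standalone with plain numpy (independent of the bindings): embedding any unitary into the lower-right block, with the identity in the upper-left, yields a controlled unitary that is itself unitary. Here the SWAP gate stands in for an arbitrary N-qubit matrix:

```python
import numpy as np

# A sample 4x4 unitary (the SWAP gate) standing in for an arbitrary N-qubit matrix
u = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]], dtype=complex)
dim = u.shape[0]

# Same block construction as in controlledMultiQubitUnitary.matrix:
# identity when the control is |0>, u when the control is |1>
controlled_u = np.block([[np.eye(dim), np.zeros((dim, dim))],
                         [np.zeros((dim, dim)), u]])
```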
class multiControlledMultiQubitUnitary(_PYQUEST):
r"""General N-qubit unitary gate controlled by multiple qubits
Implements a general N-qubit gate defined by a matrix controlled by multiple qubits
If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
the least significant qubit is the right qubit and the most
significant qubit is the left qubit
Args:
qureg: quantum register
controls: control qubits
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
"""
def call_interactive(self,
qureg: tqureg,
controls: Sequence[int],
targets: Sequence[int],
matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: control qubits
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Raises:
RuntimeError: Shape of matrix and length of targets are different
"""
if 2**len(targets) != matrix.shape[0] or 2**len(targets) != matrix.shape[1]:
raise RuntimeError("Shape of matrix and length of targets are different")
dim = matrix.shape[0]
mat = quest.createComplexMatrixN(len(targets))
for i in range(dim):
for j in range(dim):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer_c = ffi_quest.new("int[{}]".format(len(controls)))
for co, control in enumerate(controls):
pointer_c[co] = control
number_controls = len(controls)
pointer = ffi_quest.new("int[{}]".format(len(targets)))
for co, t in enumerate(targets):
pointer[co] = t
quest.multiControlledMultiQubitUnitary(qureg,
pointer_c,
number_controls,
pointer,
len(targets),
mat)
quest.destroyComplexMatrixN(mat)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: N by N matrix that defines the N qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class multiQubitUnitary(_PYQUEST):
r"""General unitary gate acting on N qubits
Implements a general N-qubit gate defined by a matrix
If the matrix basis states are given by 0=|00> 1=|01> 2=|10> 3=|11>
the least significant qubit is the right qubit and the most
significant qubit is the left qubit
Args:
qureg: quantum register
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
"""
def call_interactive(self, qureg: tqureg, targets: Sequence[int], matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Raises:
RuntimeError: Shape of matrix and length of targets are different
"""
if 2**len(targets) != matrix.shape[0] or 2**len(targets) != matrix.shape[1]:
raise RuntimeError("Shape of matrix and length of targets are different")
dim = matrix.shape[0]
mat = quest.createComplexMatrixN(len(targets))
for i in range(dim):
for j in range(dim):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer = ffi_quest.new("int[{}]".format(len(targets)))
for co, target in enumerate(targets):
pointer[co] = target
quest.multiQubitUnitary(qureg,
pointer,
len(targets),
mat)
quest.destroyComplexMatrixN(mat)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: N by N matrix that defines the N qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
return matrix
class multiRotateZ(_PYQUEST):
r"""Applying a Z-Rotation to multiple qubits
A Z-Rotation with a given angle is applied to multiple qubits
Args:
qureg: quantum register
qubits: target qubits
angle: Angle of rotation of the RotateZ gate
"""
def call_interactive(self,
qureg: tqureg,
qubits: Sequence[int],
angle: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubits: target qubits
angle: Angle of rotation of the RotateZ gate
"""
number_qubits = len(qubits)
pointer = ffi_quest.new("int[{}]".format(len(qubits)))
for co, q in enumerate(qubits):
pointer[co] = q
quest.multiRotateZ(qureg,
pointer,
number_qubits,
angle)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 4 by 4 matrix that defines the two qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class multiRotatePauli(_PYQUEST):
r"""Applying a set of different Pauli rotations to multiple qubits
A set of Pauli rotations with a given angle is applied to multiple qubits
Args:
qureg: quantum register
qubits: target qubits
paulis: Pauli operators encoded as int via IDENTITY=0, PAULI_X=1, PAULI_Y=2, PAULI_Z=3
angle: Angle of rotation of paulis
"""
def call_interactive(self,
qureg: tqureg,
qubits: Sequence[int],
paulis: Sequence[int],
angle: float) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubits: target qubits
paulis: Pauli operators encoded as int via IDENTITY=0, PAULI_X=1, PAULI_Y=2, PAULI_Z=3
angle: Angle of rotation of paulis
Raises:
RuntimeError: Number of qubits different from number of applied Paulis
"""
if len(qubits) != len(paulis):
raise RuntimeError("Number of qubits different from number of applied Paulis")
number_qubits = len(qubits)
pointer = ffi_quest.new("int[{}]".format(len(qubits)))
for co, q in enumerate(qubits):
pointer[co] = q
pointer_paulis = ffi_quest.new("enum pauliOpType[{}]".format(len(qubits)))
for co, p in enumerate(paulis):
pointer_paulis[co] = p
quest.multiRotatePauli(qureg,
pointer,
pointer_paulis,
number_qubits,
angle)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 4 by 4 matrix that defines the two qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class multiStateControlledUnitary(_PYQUEST):
r"""One qubit unitary controlled by multiple states
Implements a general one-qubit gate defined by a matrix controlled by
the state of multiple qubits
In contrast to multiControlledUnitary the unitary operation here can be executed
either when the controlling qubit is in state |0> or in state |1> depending
on the control_states
Args:
qureg: quantum register
controls: control qubits
control_states: list of ints defining whether each controlling qubit acts as
a normal control or an anti-control (unitary is applied when the state is |0>)
For each entry: 1 -> normal controlled, 0 -> anti-controlled
qubit: The qubit the unitary is acting on
matrix: 2 by 2 matrix that defines the one qubit gate
"""
def call_interactive(self,
qureg: tqureg,
controls: Sequence[int],
control_states: Sequence[int],
qubit: int,
matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: control qubits
control_states: list of ints defining whether each controlling qubit acts as
a normal control or an anti-control (unitary is applied when the state is |0>)
For each entry: 1 -> normal controlled, 0 -> anti-controlled
qubit: The qubit the unitary is acting on
matrix: 2 by 2 matrix that defines the one qubit gate
Raises:
RuntimeError: Different Number of controls and control states
"""
if len(controls) != len(control_states):
raise RuntimeError("Different Number of controls and control states")
mat = ffi_quest.new("ComplexMatrix2 *")
for i in range(2):
for j in range(2):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer_controls = ffi_quest.new("int[{}]".format(len(controls)))
for co, control in enumerate(controls):
pointer_controls[co] = control
number_controls = len(controls)
pointer_states = ffi_quest.new("int[{}]".format(len(control_states)))
for co, state in enumerate(control_states):
pointer_states[co] = state
quest.multiStateControlledUnitary(qureg,
pointer_controls,
pointer_states,
number_controls,
qubit,
mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 2 by 2 matrix that defines the one qubit gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
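For intuition, the control_states convention above (1 -> normal control, 0 -> anti-control) reduces to matching each control qubit's value against its requested state. A minimal standalone sketch (illustrative helper, not part of pyquest-cffi):

```python
def controls_satisfied(qubit_values, control_states):
    """Return True when every control qubit matches its requested state.

    Convention from the docstring above: 1 -> normal control (fires on |1>),
    0 -> anti-control (fires on |0>).
    """
    return all(v == s for v, s in zip(qubit_values, control_states))


# Anti-control on the second qubit: the unitary fires only when it is |0>.
assert controls_satisfied([1, 0], [1, 0])
assert not controls_satisfied([1, 1], [1, 0])
```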
# Extra gates
class MolmerSorensenXX(_PYQUEST):
r"""Molmer Sorensen gate
Implements a fixed phase MolmerSorensen XX gate (http://arxiv.org/abs/1705.02771)
Uses decomposition according to http://arxiv.org/abs/quant-ph/0507171
.. math::
U = \frac{1}{\sqrt{2}} \begin{pmatrix}
1 & 0 & 0 & i\\
0 & 1 & i & 0\\
0 & i & 1 & 0\\
i & 0 & 0 & 1
\end{pmatrix}
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
def call_interactive(self, qureg: tqureg, control: int, qubit: int) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the unitary
qubit: qubit the unitary gate is applied to
"""
matrix = self.matrix()
mat = ffi_quest.new("ComplexMatrix4 *")
for i in range(4):
for j in range(4):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.twoQubitUnitary(qureg,
control,
qubit,
mat[0])
def matrix(self, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
matrix = np.array([[1, 0, 0, 1j],
[0, 1, 1j, 0],
[0, 1j, 1, 0],
[1j, 0, 0, 1]], dtype=complex) * (1 - 1j) / 2
return matrix
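As a quick sanity check (illustrative only, not part of the library), the matrix returned above is unitary; note that (1 - 1j)/2 has modulus 1/sqrt(2), so it matches the docstring's prefactor up to a global phase of exp(-i*pi/4):

```python
import numpy as np

# Rebuild the MolmerSorensenXX matrix exactly as in matrix() above.
u = np.array([[1, 0, 0, 1j],
              [0, 1, 1j, 0],
              [0, 1j, 1, 0],
              [1j, 0, 0, 1]], dtype=complex) * (1 - 1j) / 2

# Unitarity: U @ U^dagger equals the identity up to floating-point error.
assert np.allclose(u @ u.conj().T, np.eye(4))
```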
# Apply operations
class applyDiagonalOp(_PYQUEST):
r"""Applying a diagonal operator to state
Apply a diagonal complex operator, which is possibly non-unitary
and non-Hermitian, on the entire quantum register.
Args:
qureg: quantum register input, is not changed
operator: operator acting on a certain number of qubits (operator[0]: int)
and in a certain QuEST environment (operator[1]: tquestenv)
"""
def call_interactive(self,
qureg: tqureg,
operator: Tuple[int, tquestenv],
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
operator: operator acting on a certain number of qubits (operator[0]: int)
and in a certain QuEST environment (operator[1]: tquestenv)
Raises:
RuntimeError: Qureg and DiagonalOp must be defined for the same number of qubits
"""
diagonal_op = quest.createDiagonalOp(operator[0], operator[1])
if not (cheat.getNumQubits()(qureg=qureg) == diagonal_op.numQubits):
raise RuntimeError("Qureg and DiagonalOp must be defined for the "
+ "same number of qubits")
quest.applyDiagonalOp(qureg, diagonal_op)
quest.destroyDiagonalOp(diagonal_op, operator[1])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: matrix that defines the gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class applyMatrix2(_PYQUEST):
r"""Applying a general 2-by-2 matrix, which may be non-unitary
The matrix is left-multiplied onto the state, for both
state-vectors and density matrices. Hence, this function differs
from unitary() by more than just permitting a non-unitary matrix.
Args:
qureg: quantum register input, is not changed
qubit: qubit to operate the matrix upon
matrix: matrix to apply
Warning:
After applyMatrix2 the quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
qubit: int,
matrix: np.ndarray,
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubit: qubit to operate the matrix upon
matrix: matrix to apply
"""
mat = ffi_quest.new("ComplexMatrix2 *")
for i in range(2):
for j in range(2):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.applyMatrix2(qureg, qubit, mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 2 by 2 matrix that defines the one qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
return matrix
class applyMatrix4(_PYQUEST):
r"""Applying a general 4-by-4 matrix, which may be non-unitary
The matrix is left-multiplied onto the state, for both
state-vectors and density matrices. Hence, this function differs from
twoQubitUnitary() by more than just permitting a non-unitary matrix.
Args:
qureg: quantum register input, is not changed
control: qubit that controls the application of the matrix
qubit: qubit to operate the matrix upon
matrix: matrix to apply
Warning:
After applyMatrix4 the quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
control: int,
qubit: int,
matrix: np.ndarray,
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
control: qubit that controls the application of the matrix
qubit: qubit to operate the matrix upon
matrix: matrix to apply
"""
mat = ffi_quest.new("ComplexMatrix4 *")
for i in range(4):
for j in range(4):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
quest.applyMatrix4(qureg, control, qubit, mat[0])
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: 4 by 4 matrix that defines the two qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
return matrix
class applyMatrixN(_PYQUEST):
r"""Applying a general N-by-N matrix, which may be non-unitary, on any number of target qubits
The matrix is left-multiplied onto the state, for both
state-vectors and density matrices. Hence, this function differs
from multiQubitUnitary() by more than just permitting a non-unitary matrix.
Args:
qureg: quantum register input, is not changed
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Warning:
After applyMatrixN the quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
targets: Sequence[int],
matrix: np.ndarray,
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Raises:
RuntimeError: Shape of matrix and length of targets are different
"""
if 2**len(targets) != matrix.shape[0] or 2**len(targets) != matrix.shape[1]:
raise RuntimeError("Shape of matrix and length of targets are different")
dim = matrix.shape[0]
mat = quest.createComplexMatrixN(len(targets))
for i in range(dim):
for j in range(dim):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer = ffi_quest.new("int[{}]".format(len(targets)))
for co, target in enumerate(targets):
pointer[co] = target
quest.applyMatrixN(qureg,
pointer,
len(targets),
mat)
quest.destroyComplexMatrixN(mat)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: N by N matrix that defines the N qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
return matrix
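The shape check in call_interactive above enforces that the matrix dimension is exactly 2**len(targets). The same validation can be sketched standalone (hypothetical helper, not part of pyquest-cffi):

```python
import numpy as np

def validate_matrix_n(targets, matrix):
    """Raise if matrix is not square with dimension 2**len(targets)."""
    dim = 2 ** len(targets)
    if matrix.shape != (dim, dim):
        raise RuntimeError("Shape of matrix and length of targets are different")
    return dim


# A two-target gate needs a 4-by-4 matrix; a 4-by-4 identity passes.
assert validate_matrix_n([0, 1], np.eye(4)) == 4
```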
class applyMultiControlledMatrixN(_PYQUEST):
r"""Apply a general N-by-N matrix, which may be non-unitary, with additional control qubits
The matrix is left-multiplied onto the target qubits of the state, for both
state-vectors and density matrices, conditioned on the control qubits.
Hence, this function differs from a multi-controlled multi-qubit unitary
by more than just permitting a non-unitary matrix.
Args:
qureg: quantum register
controls: control qubits
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Warning:
After applyMultiControlledMatrixN the quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
controls: Sequence[int],
targets: Sequence[int],
matrix: np.ndarray) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
controls: control qubits
targets: list of target qubits of the N qubit gate
the first qubit in targets is treated as the least significant one
the second as the second least significant one etc.
matrix: N by N matrix that defines the N qubit gate
Raises:
RuntimeError: Shape of matrix and length of targets are different
"""
if 2**len(targets) != matrix.shape[0] or 2**len(targets) != matrix.shape[1]:
raise RuntimeError("Shape of matrix and length of targets are different")
dim = matrix.shape[0]
mat = quest.createComplexMatrixN(len(targets))
for i in range(dim):
for j in range(dim):
mat.real[i][j] = np.real(matrix[i, j])
mat.imag[i][j] = np.imag(matrix[i, j])
pointer_c = ffi_quest.new("int[{}]".format(len(controls)))
for co, control in enumerate(controls):
pointer_c[co] = control
pointer = ffi_quest.new("int[{}]".format(len(targets)))
for co, target in enumerate(targets):
pointer[co] = target
quest.applyMultiControlledMatrixN(qureg,
pointer_c,
len(controls),
pointer,
len(targets),
mat)
quest.destroyComplexMatrixN(mat)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: N by N matrix that defines the N qubit gate
**kwargs: Additional keyword arguments
Returns:
np.ndarray
"""
return matrix
class applyPauliHamil(_PYQUEST):
r"""Applying PauliHamil (a Hermitian but not necessarily unitary operator) to state
This is merely an encapsulation of applyPauliSum(), which you can refer to for
more detailed documentation.
Applies each Pauli product in pauli_hamil to qureg in turn, adding the resulting
state to the initially-blanked qureg_out. Hence the runtime should scale with the
total number of Pauli operators specified (excluding identities) and with the
qureg dimension.
Args:
qureg: quantum register input, is not changed
pauli_hamil: PauliHamil instance to be applied
qureg_out: quantum register after application of the Pauli sum
Warning:
After applyPauliHamil the output quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
pauli_hamil: paulihamil,
qureg_out: tqureg
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
pauli_hamil: PauliHamil instance to be applied
qureg_out: quantum register after application of Pauli sum
Raises:
RuntimeError: Qureg and PauliHamil must be defined for the same number of qubits
"""
if not (cheat.getNumQubits()(qureg=qureg) == pauli_hamil.numQubits):
raise RuntimeError("Qureg and PauliHamil must be defined for the "
+ "same number of qubits")
quest.applyPauliHamil(qureg,
pauli_hamil,
qureg_out)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: matrix that defines the gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
class applyPauliSum(_PYQUEST):
r"""Applying a sum of Products of Pauli operators to state
A sum of products of Pauli operators (including Identity) is applied to a state.
The state is not changed but the corresponding copy with the Pauli sum applied is
written to qureg_out
For each qubit a Pauli operator must be given in each sum term (can be identity)
Args:
qureg: quantum register input, is not changed
paulis: List of Lists of Pauli operators in each product
encoded as int via IDENTITY=0, PAULI_X=1, PAULI_Y=2, PAULI_Z=3
coefficients: coefficients of the products in the sum
qureg_out: quantum register after application of Pauli sum
Warning:
After applyPauliSum the quantum register is in general no longer normalised
and no longer represents a physically valid state without normalisation.
"""
def call_interactive(self,
qureg: tqureg,
paulis: Sequence[Sequence[int]],
coefficients: Sequence[float],
qureg_out: tqureg
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
paulis: List of Lists of Pauli operators in each product
encoded as int via IDENTITY=0, PAULI_X=1, PAULI_Y=2, PAULI_Z=3
coefficients: coefficients of the paulis to be summed
qureg_out: quantum register after application of Pauli sum
"""
flat_list = [p for product in paulis for p in product]
pointer_paulis = ffi_quest.new("enum pauliOpType[{}]".format(len(flat_list)))
for co, p in enumerate(flat_list):
pointer_paulis[co] = p
pointer = ffi_quest.new("{}[{}]".format(qreal, len(coefficients)))
for co, c in enumerate(coefficients):
pointer[co] = c
quest.applyPauliSum(qureg,
pointer_paulis,
pointer,
len(coefficients),
qureg_out)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: matrix that defines the gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
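call_interactive above flattens the nested paulis list row by row before copying it into the C array, one Pauli code per qubit per product. The flattening itself is plain Python:

```python
# Pauli codes: IDENTITY=0, PAULI_X=1, PAULI_Y=2, PAULI_Z=3 (as documented above).
paulis = [[1, 0, 3],   # first product:  X on qubit 0, I on qubit 1, Z on qubit 2
          [0, 2, 0]]   # second product: I, Y, I
flat_list = [p for product in paulis for p in product]
assert flat_list == [1, 0, 3, 0, 2, 0]
```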
class applyTrotterCircuit(_PYQUEST):
r"""Applying a trotterisation of unitary evolution exp(-i*pauli_hamil*time) to qureg
This is a sequence of unitary operators, effected by multiRotatePauli(), which together
approximate the action of full unitary-time evolution under the given Hamiltonian.
These formulations are taken from 'Finding Exponential Product Formulas of Higher Orders',
Naomichi Hatano and Masuo Suzuki (2005).
Args:
qureg: the register to modify under the approximate unitary-time evolution
pauli_hamil: PauliHamil under which to approximate unitary-time evolution
time: the target evolution time, which is permitted to be both positive and negative
order: the order of Trotter-Suzuki decomposition to use
repetitions: the number of repetitions of the decomposition of the given order
"""
def call_interactive(self,
qureg: tqureg,
pauli_hamil: paulihamil,
time: float,
order: int,
repetitions: int
) -> None:
r"""Interactive call of PyQuest-cffi
Args:
qureg: the register to modify under the approximate unitary-time evolution
pauli_hamil: PauliHamil under which to approximate unitary-time evolution
time: the target evolution time, which is permitted to be both positive and negative
order: the order of Trotter-Suzuki decomposition to use
repetitions: the number of repetitions of the decomposition of the given order
"""
quest.applyTrotterCircuit(qureg,
pauli_hamil,
time,
order,
repetitions)
def matrix(self, matrix: np.ndarray, **kwargs) -> np.ndarray:
r"""The definition of the gate as a unitary matrix
Args:
matrix: matrix that defines the gate
**kwargs: Additional keyword arguments
Raises:
NotImplementedError: not implemented
"""
raise NotImplementedError()
# Measurement
class measure(_PYQUEST):
r"""Implements a one-qubit Measurement operation
Args:
qureg: quantum register
qubit: the measured qubit
readout: The readout register for static compilation
readout_index: The index in the readout register for static compilation
"""
def call_interactive(self, qureg: tqureg, qubit: int) -> int:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubit: the qubit that is measured
Returns:
int
"""
return quest.measure(qureg, qubit)
class measureWithStats(_PYQUEST):
r"""Measures a single qubit and gives the probability of that outcome.
Args:
qureg: quantum register
qubit: the measured qubit
outcome_proba: where to set the probability of the occurred outcome
"""
def call_interactive(self, qureg: tqureg, qubit: int, outcome_proba: float) -> int:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubit: the qubit that is measured
outcome_proba: where to set the probability of the occurred outcome
Returns:
int
"""
outcome_pointer = ffi_quest.new("{}[{}]".format(qreal, 1))
outcome_pointer[0] = outcome_proba
return quest.measureWithStats(qureg, qubit, outcome_pointer)
class collapseToOutcome(_PYQUEST):
r"""Updates qureg to be consistent with measuring the given qubit in the given outcome and returns the probability of that outcome.
Args:
qureg: quantum register
qubit: the measured qubit
outcome: the outcome (0 or 1) the qubit is collapsed into
"""
def call_interactive(self, qureg: tqureg, qubit: int, outcome: int) -> float:
r"""Interactive call of PyQuest-cffi
Args:
qureg: quantum register
qubit: the qubit to collapse
outcome: the outcome (0 or 1) the qubit is collapsed into
Returns:
float
"""
return quest.collapseToOutcome(qureg, qubit, outcome)
# File: SipMask-mmdetection/mmdet/models/utils/__init__.py
# Repo: anirudh-chakravarthy/SipMask (MIT)
from .weight_init import bias_init_with_prob
__all__ = ['bias_init_with_prob']
# File: tests/v2/test_incidents.py
# Repo: lupamo3/iReporter-Flask (MIT)
import os
import unittest
import json
import pytest
from tests.v2.test_base import BaseTestClass
class TestRedflags(BaseTestClass):
"""This class represents the test redflag case """
def test_get_all_records(self):
""" Test if API endpoint is able to get all records correctly """
post = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token}
)
response = self.client.get(
'/api/v2/incidents',
headers={'authorization': 'Bearer ' + self.auth_token})
self.assertEqual(response.status_code, 200)
def test_creation_of_records(self):
""" Test if API endpoint can create a redflag (POST)"""
response = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'Authorization': 'Bearer '+self.auth_token}
)
self.assertEqual(response.status_code, 201)
result = json.loads(response.data.decode())
self.assertEqual(result["message"], "Created redflag record")
def test_get_one_record(self):
""" Test if API is able to get a single ID record"""
response = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(response.status_code, 201)
response = self.client.get(
"/api/v2/incidents/1",
headers={'Authorization': 'Bearer ' + self.auth_token})
self.assertEqual(response.status_code, 200)
def test_records_deletion(self):
"""Test if API can delete existing records """
rv = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(rv.status_code, 201)
res = self.client.delete(
'/api/v2/incidents/1',
headers={'Authorization': 'Bearer ' + self.auth_token})
self.assertEqual(res.status_code, 200)
def test_record_without_comment(self):
""" Test if API can post with one field not filled"""
response = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.no_comment),
headers={"content-type": "application/json",
'authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(response.status_code, 400)
def test_creation_record_empty_fields(self):
""" Test if API can post with all fields empty"""
response = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.no_input),
headers={"content-type": "application/json",
'authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(response.status_code, 400)
result = json.loads(response.data.decode())
self.assertEqual(result["message"], "Kindly input user info")
def test_no_record_to_delete(self):
"""Test if API can delete existing records """
rv = self.client.post(
'/api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(rv.status_code, 201)
res = self.client.delete(
'/api/v2/incidents/1',
headers={'authorization': 'Bearer ' + self.auth_token})
self.assertEqual(res.status_code, 200)
def test_none_existent_record(self):
""" Test if API is able to get non-existent record"""
response = self.client.get(
'/api/v2/incidents/200',
headers={'Authorization': 'Bearer ' + self.auth_token}
)
self.assertEqual(response.status_code, 404)
result = json.loads(response.data.decode())
self.assertEqual(result["error"], "Incident with that id not found")
def test_editing_location(self):
""" Test if API is able to change location """
response = self.client.post(
'api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token}
)
patch_record = {
'location': 'Andela Uganda'
}
response = self.client.patch(
"/api/v2/incidents/1/location",
data=json.dumps(patch_record),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token})
self.assertEqual(response.status_code, 200)
result = json.loads(response.data.decode())
self.assertEqual(result["message"], "Updated location successfully")
def test_editing_comment(self):
""" Test if API is able to change comments """
response = self.client.post(
'api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
"Authorization": "Bearer " + self.auth_token}
)
patch_record = {
'comment': 'Once an Andelan forever one'
}
response = self.client.patch(
"api/v2/incidents/1/comment/",
data=json.dumps(patch_record),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token})
self.assertEqual(response.status_code, 200)
def test_editing_status(self):
""" Test if API is able to change comments """
response = self.client.post(
'api/v2/incidents',
data=json.dumps(self.data),
headers={"content-type": "application/json",
"Authorization": "Bearer " + self.auth_token}
)
patch_record = {
"status": "Draft"
}
response = self.client.patch(
"api/v2/incidents/1/status/",
data=json.dumps(patch_record),
headers={"content-type": "application/json",
'Authorization': 'Bearer ' + self.auth_token})
self.assertEqual(response.status_code, 200)
if __name__ == '__main__':
unittest.main()
# File: integration_tests/web/test_message_metadata.py
# Repo: slackapi/python-slackclient (MIT)
import logging
import os
import time
import unittest
from integration_tests.env_variable_names import SLACK_SDK_TEST_BOT_TOKEN
from slack_sdk.models.metadata import Metadata
from slack_sdk.web import WebClient
class TestWebClient(unittest.TestCase):
def setUp(self):
self.logger = logging.getLogger(__name__)
self.bot_token = os.environ[SLACK_SDK_TEST_BOT_TOKEN]
def tearDown(self):
pass
def test_publishing_message_metadata(self):
client: WebClient = WebClient(token=self.bot_token)
new_message = client.chat_postMessage(
channel='#random',
text="message with metadata",
metadata={
"event_type": "procurement-task",
"event_payload": {
"id": "11111",
"amount": 5000,
"tags": ["foo", "bar", "baz"]
},
}
)
self.assertIsNone(new_message.get("error"))
self.assertIsNotNone(new_message.get("message").get("metadata"))
history = client.conversations_history(
channel=new_message.get("channel"),
limit=1,
include_all_metadata=True,
)
self.assertIsNone(history.get("error"))
self.assertIsNotNone(history.get("messages")[0].get("metadata"))
modification = client.chat_update(
channel=new_message.get("channel"),
ts=new_message.get("ts"),
text="message with metadata (modified)",
metadata={
"event_type": "procurement-task",
"event_payload": {
"id": "11111",
"amount": 6000,
},
}
)
self.assertIsNone(modification.get("error"))
self.assertIsNotNone(modification.get("message").get("metadata"))
scheduled = client.chat_scheduleMessage(
channel=new_message.get("channel"),
post_at=int(time.time()) + 30,
text="message with metadata (scheduled)",
metadata={
"event_type": "procurement-task",
"event_payload": {
"id": "11111",
"amount": 10,
},
}
)
self.assertIsNone(scheduled.get("error"))
self.assertIsNotNone(scheduled.get("message").get("metadata"))
def test_publishing_message_metadata_using_models(self):
client: WebClient = WebClient(token=self.bot_token)
new_message = client.chat_postMessage(
channel='#random',
text="message with metadata",
metadata=Metadata(
event_type="procurement-task",
event_payload={
"id": "11111",
"amount": 5000,
"tags": ["foo", "bar", "baz"]
}
)
)
self.assertIsNone(new_message.get("error"))
self.assertIsNotNone(new_message.get("message").get("metadata"))
history = client.conversations_history(
channel=new_message.get("channel"),
limit=1,
include_all_metadata=True,
)
self.assertIsNone(history.get("error"))
self.assertIsNotNone(history.get("messages")[0].get("metadata"))
modification = client.chat_update(
channel=new_message.get("channel"),
ts=new_message.get("ts"),
text="message with metadata (modified)",
metadata=Metadata(
event_type="procurement-task",
event_payload={
"id": "11111",
"amount": 6000,
}
)
)
self.assertIsNone(modification.get("error"))
self.assertIsNotNone(modification.get("message").get("metadata"))
scheduled = client.chat_scheduleMessage(
channel=new_message.get("channel"),
post_at=int(time.time()) + 30,
text="message with metadata (scheduled)",
metadata=Metadata(
event_type="procurement-task",
event_payload={
"id": "11111",
"amount": 10,
}
)
)
self.assertIsNone(scheduled.get("error"))
self.assertIsNotNone(scheduled.get("message").get("metadata"))
# File: genomics_data_index/test/integration/storage/service/test_MLSTService.py
# Repo: apetkau/genomics-data-index (Apache-2.0)
import pytest
from genomics_data_index.storage.model.db import SampleMLSTAlleles, MLSTAllelesSamples, Sample, MLSTScheme
from genomics_data_index.storage.service import EntityExistsError
from genomics_data_index.storage.service.MLSTService import MLSTService


def test_insert_mlst_results(database, mlst_data_package_single_scheme, sample_service, filesystem_storage):
    num_loci = 7

    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme)

    samples = session.query(Sample).all()
    assert 2 == len(samples)
    assert {'CFSAN002349', 'CFSAN023463'} == {s.name for s in samples}

    sample_mlst_alleles = session.query(SampleMLSTAlleles).all()
    assert 2 == len(sample_mlst_alleles)

    mlst_alleles_samples = session.query(MLSTAllelesSamples).all()
    mlst_alleles_samples_id_allele = {allele.sla: allele for allele in mlst_alleles_samples}
    assert num_loci == len(mlst_alleles_samples)
    assert {'lmonocytogenes:abcZ:1', 'lmonocytogenes:bglA:51', 'lmonocytogenes:cat:11',
            'lmonocytogenes:dapE:13', 'lmonocytogenes:dat:2', 'lmonocytogenes:ldh:5',
            'lmonocytogenes:lhkA:5'} == set(mlst_alleles_samples_id_allele.keys())
    assert 2 == len(mlst_alleles_samples_id_allele['lmonocytogenes:abcZ:1'].sample_ids)


def test_insert_mlst_results_multiple_schemes(database, mlst_data_package_basic, sample_service, filesystem_storage):
    num_loci = 7
    num_schemes = 3

    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(data_package=mlst_data_package_basic)

    samples = session.query(Sample).all()
    assert 6 == len(samples)
    assert {'CFSAN002349', 'CFSAN023463', '2014C-3598', '2014C-3599',
            '2014D-0067', '2014D-0068'} == {s.name for s in samples}

    sample_mlst_alleles = session.query(SampleMLSTAlleles).all()
    assert 6 == len(sample_mlst_alleles)

    mlst_alleles_all = session.query(MLSTAllelesSamples).all()
    mlst_alleles_all_id = {allele.sla: allele for allele in mlst_alleles_all}
    assert num_loci * num_schemes + 2 == len(mlst_alleles_all)
    assert {'lmonocytogenes', 'ecoli', 'campylobacter'} == {allele.scheme for allele in mlst_alleles_all}
    assert 2 == len(mlst_alleles_all_id['lmonocytogenes:abcZ:1'].sample_ids)
    assert 2 == len(mlst_alleles_all_id['ecoli:fumC:23'].sample_ids)

    mlst_alleles_lmono = session.query(MLSTAllelesSamples).filter(MLSTAllelesSamples.scheme == 'lmonocytogenes').all()
    mlst_alleles_id_lmono = {allele.sla for allele in mlst_alleles_lmono}
    assert {'lmonocytogenes:abcZ:1', 'lmonocytogenes:bglA:51', 'lmonocytogenes:cat:11',
            'lmonocytogenes:dapE:13', 'lmonocytogenes:dat:2', 'lmonocytogenes:ldh:5',
            'lmonocytogenes:lhkA:4', 'lmonocytogenes:lhkA:5'} == mlst_alleles_id_lmono


def test_insert_mlst_results_multiple_schemes_override_scheme(database, mlst_data_package_basic,
                                                              sample_service, filesystem_storage):
    num_loci = 7
    num_alt_schemes = 3

    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_basic)

    sample_mlst_alleles = session.query(SampleMLSTAlleles).all()
    assert 6 == len(sample_mlst_alleles)

    mlst_alleles_samples = session.query(MLSTAllelesSamples).all()
    mlst_alleles_samples_id_allele = {allele.sla: allele for allele in mlst_alleles_samples}
    assert num_loci * num_alt_schemes + 2 == len(mlst_alleles_samples)
    assert 2 == len(mlst_alleles_samples_id_allele['lmonocytogenes:abcZ:1'].sample_ids)
    assert 2 == len(mlst_alleles_samples_id_allele['lmonocytogenes:fumC:23'].sample_ids)


def test_double_insert_mlst(database, mlst_data_package_single_scheme, sample_service, filesystem_storage):
    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme)

    samples = session.query(Sample).all()
    assert 2 == len(samples)
    assert {'CFSAN002349', 'CFSAN023463'} == {s.name for s in samples}

    sample_mlst_alleles = session.query(SampleMLSTAlleles).all()
    assert 2 == len(sample_mlst_alleles)

    with pytest.raises(EntityExistsError) as execinfo:
        mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme)
    assert 'Passed samples already have features for feature scope [lmonocytogenes]' in str(execinfo.value)

    sample_mlst_alleles = session.query(SampleMLSTAlleles).all()
    assert 2 == len(sample_mlst_alleles)


def test_multiple_insert_different_samples_same_scheme(database, mlst_data_package_single_scheme,
                                                       mlst_data_package_single_scheme2,
                                                       sample_service, filesystem_storage):
    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    # Insert first set of data
    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme)

    features_count = session.query(MLSTAllelesSamples).count()
    assert 7 == features_count

    sample_CFSAN002349 = session.query(Sample).filter(Sample.name == 'CFSAN002349').one()
    sample_CFSAN023463 = session.query(Sample).filter(Sample.name == 'CFSAN023463').one()

    ## Test feature where all new samples get added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'cat',
        'allele': '11',
    })
    assert m.id == 'lmonocytogenes:cat:11'
    assert m.sla == 'lmonocytogenes:cat:11'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id}

    ## Test feature where only one of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'abcZ',
        'allele': '1',
    })
    assert m.id == 'lmonocytogenes:abcZ:1'
    assert m.sla == 'lmonocytogenes:abcZ:1'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id}

    ## Test feature where none of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'dapE',
        'allele': '13',
    })
    assert m.id == 'lmonocytogenes:dapE:13'
    assert m.sla == 'lmonocytogenes:dapE:13'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id}

    ## Test feature that only exists after the 2nd addition
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'dapE',
        'allele': '12',
    })
    assert m is None

    # Insert second set of data
    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme2)

    features_count = session.query(MLSTAllelesSamples).count()
    assert 12 == features_count

    sample_CFSAN002349_2 = session.query(Sample).filter(Sample.name == 'CFSAN002349-2').one()
    sample_CFSAN023463_2 = session.query(Sample).filter(Sample.name == 'CFSAN023463-2').one()

    ## Test feature where all new samples get added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'cat',
        'allele': '11',
    })
    assert m.id == 'lmonocytogenes:cat:11'
    assert m.sla == 'lmonocytogenes:cat:11'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id,
                                 sample_CFSAN002349_2.id, sample_CFSAN023463_2.id}

    ## Test feature where only one of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'abcZ',
        'allele': '1',
    })
    assert m.id == 'lmonocytogenes:abcZ:1'
    assert m.sla == 'lmonocytogenes:abcZ:1'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id,
                                 sample_CFSAN002349_2.id}

    ## Test feature where none of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'dapE',
        'allele': '13',
    })
    assert m.id == 'lmonocytogenes:dapE:13'
    assert m.sla == 'lmonocytogenes:dapE:13'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id}

    ## Test feature that only exists after the 2nd addition
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'dapE',
        'allele': '12',
    })
    assert m.id == 'lmonocytogenes:dapE:12'
    assert m.sla == 'lmonocytogenes:dapE:12'
    assert set(m.sample_ids) == {sample_CFSAN002349_2.id, sample_CFSAN023463_2.id}


def test_multiple_inserts_different_schemes(database, mlst_data_package_basic,
                                            mlst_data_package_single_scheme2,
                                            sample_service, filesystem_storage):
    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(data_package=mlst_data_package_single_scheme2)
    mlst_service.insert(data_package=mlst_data_package_basic)

    features_count = session.query(MLSTAllelesSamples).count()
    assert 28 == features_count

    sample_2014C_3598 = session.query(Sample).filter(Sample.name == '2014C-3598').one()
    sample_2014C_3599 = session.query(Sample).filter(Sample.name == '2014C-3599').one()
    sample_CFSAN002349_2 = session.query(Sample).filter(Sample.name == 'CFSAN002349-2').one()
    sample_CFSAN023463_2 = session.query(Sample).filter(Sample.name == 'CFSAN023463-2').one()
    sample_CFSAN002349 = session.query(Sample).filter(Sample.name == 'CFSAN002349').one()
    sample_CFSAN023463 = session.query(Sample).filter(Sample.name == 'CFSAN023463').one()

    # Test feature where only some of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'abcZ',
        'allele': '1',
    })
    assert m.id == 'lmonocytogenes:abcZ:1'
    assert m.sla == 'lmonocytogenes:abcZ:1'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id,
                                 sample_CFSAN002349_2.id}

    # Test feature with unknown alleles
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'lhkA',
        'allele': '?',
    })
    assert m.id == 'lmonocytogenes:lhkA:?'
    assert m.sla == 'lmonocytogenes:lhkA:?'
    assert set(m.sample_ids) == {sample_CFSAN023463_2.id}

    # Test completely different scheme
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'ecoli',
        'locus': 'adk',
        'allele': '100',
    })
    assert m.id == 'ecoli:adk:100'
    assert m.sla == 'ecoli:adk:100'
    assert set(m.sample_ids) == {sample_2014C_3598.id, sample_2014C_3599.id}


def test_multiple_inserts_3_inserts(database, mlst_data_package_single_scheme,
                                    mlst_data_package_single_scheme2,
                                    mlst_data_package_single_scheme3,
                                    sample_service, filesystem_storage):
    mlst_service = MLSTService(database_connection=database,
                               sample_service=sample_service,
                               mlst_dir=filesystem_storage.mlst_dir)
    session = database.get_session()

    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme)
    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme2)
    mlst_service.insert(feature_scope_name='lmonocytogenes', data_package=mlst_data_package_single_scheme3)

    features_count = session.query(MLSTAllelesSamples).count()
    assert 13 == features_count

    sample_CFSAN002349 = session.query(Sample).filter(Sample.name == 'CFSAN002349').one()
    sample_CFSAN023463 = session.query(Sample).filter(Sample.name == 'CFSAN023463').one()
    sample_CFSAN002349_2 = session.query(Sample).filter(Sample.name == 'CFSAN002349-2').one()
    sample_CFSAN023463_2 = session.query(Sample).filter(Sample.name == 'CFSAN023463-2').one()
    sample_CFSAN002349_3 = session.query(Sample).filter(Sample.name == 'CFSAN002349-3').one()
    sample_CFSAN023463_3 = session.query(Sample).filter(Sample.name == 'CFSAN023463-3').one()

    # Test feature where only some of the new samples gets added
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'abcZ',
        'allele': '1',
    })
    assert m.id == 'lmonocytogenes:abcZ:1'
    assert m.sla == 'lmonocytogenes:abcZ:1'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id,
                                 sample_CFSAN002349_2.id,
                                 sample_CFSAN002349_3.id}

    # Test feature with unknown alleles
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'lhkA',
        'allele': '?',
    })
    assert m.id == 'lmonocytogenes:lhkA:?'
    assert m.sla == 'lmonocytogenes:lhkA:?'
    assert set(m.sample_ids) == {sample_CFSAN023463_2.id,
                                 sample_CFSAN023463_3.id}

    # Test valid alleles for this same locus
    m: MLSTAllelesSamples = session.query(MLSTAllelesSamples).get({
        'scheme': 'lmonocytogenes',
        'locus': 'lhkA',
        'allele': '5',
    })
    assert m.id == 'lmonocytogenes:lhkA:5'
    assert m.sla == 'lmonocytogenes:lhkA:5'
    assert set(m.sample_ids) == {sample_CFSAN002349.id, sample_CFSAN023463.id,
                                 sample_CFSAN002349_2.id,
                                 sample_CFSAN002349_3.id}


def test_get_mlst_schemes(mlst_service_loaded):
    mlst_schemes = mlst_service_loaded.get_mlst_schemes()
    assert 3 == len(mlst_schemes)
    assert isinstance(mlst_schemes[0], MLSTScheme)
    assert {'lmonocytogenes', 'ecoli', 'campylobacter'} == {s.name for s in mlst_schemes}


def test_get_all_mlst_features(mlst_service_loaded: MLSTService):
    # Test no unknown
    mlst_features = mlst_service_loaded.get_features(include_present=True, include_unknown=False)
    assert 22 == len(mlst_features)
    assert {'lmonocytogenes:abcZ:1', 'lmonocytogenes:bglA:51', 'lmonocytogenes:cat:11',
            'lmonocytogenes:dapE:13', 'lmonocytogenes:dat:2', 'lmonocytogenes:ldh:5',
            'lmonocytogenes:lhkA:4', 'lmonocytogenes:lhkA:5',
            'ecoli:adk:100', 'ecoli:fumC:23', 'ecoli:gyrB:68', 'ecoli:icd:45', 'ecoli:mdh:1',
            'ecoli:purA:35', 'ecoli:recA:7',
            'campylobacter:aspA:2', 'campylobacter:glnA:1', 'campylobacter:gltA:1',
            'campylobacter:glyA:3', 'campylobacter:pgm:2', 'campylobacter:tkt:1',
            'campylobacter:uncA:6'} == set(mlst_features.keys())
    assert isinstance(mlst_features['lmonocytogenes:abcZ:1'], MLSTAllelesSamples)

    # Test include unknown
    mlst_features = mlst_service_loaded.get_features(include_present=True, include_unknown=True)
    assert 23 == len(mlst_features)
    assert {'lmonocytogenes:abcZ:1', 'lmonocytogenes:bglA:51', 'lmonocytogenes:cat:11',
            'lmonocytogenes:dapE:13', 'lmonocytogenes:dat:2', 'lmonocytogenes:ldh:5',
            'lmonocytogenes:lhkA:4', 'lmonocytogenes:lhkA:5',
            'ecoli:adk:100', 'ecoli:fumC:23', 'ecoli:gyrB:68', 'ecoli:icd:45', 'ecoli:mdh:1',
            'ecoli:purA:35', 'ecoli:recA:7',
            'campylobacter:aspA:2', 'campylobacter:glnA:1', 'campylobacter:gltA:1',
            'campylobacter:glyA:3', 'campylobacter:pgm:2', 'campylobacter:tkt:1',
            'campylobacter:uncA:6', 'campylobacter:uncA:?'} == set(mlst_features.keys())
    assert isinstance(mlst_features['lmonocytogenes:abcZ:1'], MLSTAllelesSamples)

    # Test only unknown
    mlst_features = mlst_service_loaded.get_features(include_present=False, include_unknown=True)
    assert 1 == len(mlst_features)
    assert {'campylobacter:uncA:?'} == set(mlst_features.keys())
    assert isinstance(mlst_features['campylobacter:uncA:?'], MLSTAllelesSamples)


def test_get_features_for_scheme(mlst_service_loaded: MLSTService):
    # Test lmonocytogenes
    mlst_features = mlst_service_loaded.get_features('lmonocytogenes')
    assert 8 == len(mlst_features)
    assert {'lmonocytogenes:abcZ:1', 'lmonocytogenes:bglA:51', 'lmonocytogenes:cat:11',
            'lmonocytogenes:dapE:13', 'lmonocytogenes:dat:2', 'lmonocytogenes:ldh:5',
            'lmonocytogenes:lhkA:4', 'lmonocytogenes:lhkA:5'} == set(mlst_features.keys())
    assert isinstance(mlst_features['lmonocytogenes:abcZ:1'], MLSTAllelesSamples)

    # Test campylobacter
    mlst_features = mlst_service_loaded.get_features('campylobacter')
    assert 7 == len(mlst_features)
    assert {'campylobacter:aspA:2', 'campylobacter:glnA:1', 'campylobacter:gltA:1',
            'campylobacter:glyA:3', 'campylobacter:pgm:2', 'campylobacter:tkt:1',
            'campylobacter:uncA:6'} == set(mlst_features.keys())
    assert isinstance(mlst_features['campylobacter:glyA:3'], MLSTAllelesSamples)

    # Test lmonocytogenes with specific locus
    mlst_features = mlst_service_loaded.get_features('lmonocytogenes', locus='abcZ')
    assert 1 == len(mlst_features)
    assert {'lmonocytogenes:abcZ:1'} == set(mlst_features.keys())

    # Test lmonocytogenes with specific locus 2
    mlst_features = mlst_service_loaded.get_features('lmonocytogenes', locus='lhkA')
    assert 2 == len(mlst_features)
    assert {'lmonocytogenes:lhkA:4', 'lmonocytogenes:lhkA:5'} == set(mlst_features.keys())

    # Test include unknown
    mlst_features = mlst_service_loaded.get_features('campylobacter', include_unknown=True)
    assert 8 == len(mlst_features)
    assert {'campylobacter:aspA:2', 'campylobacter:glnA:1', 'campylobacter:gltA:1',
            'campylobacter:glyA:3', 'campylobacter:pgm:2', 'campylobacter:tkt:1',
            'campylobacter:uncA:6', 'campylobacter:uncA:?'} == set(mlst_features.keys())
    assert isinstance(mlst_features['campylobacter:glyA:3'], MLSTAllelesSamples)

    # Test include only unknown (not present)
    mlst_features = mlst_service_loaded.get_features('campylobacter', include_present=False,
                                                     include_unknown=True)
    assert 1 == len(mlst_features)
    assert {'campylobacter:uncA:?'} == set(mlst_features.keys())
    assert isinstance(mlst_features['campylobacter:uncA:?'], MLSTAllelesSamples)

    # Test include neither present nor unknown
    mlst_features = mlst_service_loaded.get_features('campylobacter', include_present=False,
                                                     include_unknown=False)
    assert 0 == len(mlst_features)

    # Test invalid scheme
    assert 0 == len(mlst_service_loaded.get_features('invalid_scheme'))


def test_get_all_alleles(mlst_service_loaded: MLSTService):
    assert {'1'} == mlst_service_loaded.get_all_alleles('lmonocytogenes', 'abcZ')


def test_get_all_alleles_multiple_alleles(mlst_service_loaded: MLSTService):
    assert {'4', '5'} == mlst_service_loaded.get_all_alleles('lmonocytogenes', 'lhkA')


def test_get_all_alleles_unknown_alleles(mlst_service_loaded_unknown: MLSTService):
    assert {'1', '?'} == mlst_service_loaded_unknown.get_all_alleles('lmonocytogenes', 'abcZ')


def test_get_all_alleles_unknown_alleles2(mlst_service_loaded_unknown: MLSTService):
    assert {'?'} == mlst_service_loaded_unknown.get_all_alleles('lmonocytogenes', 'bglA')


def test_get_all_loci_alleles(mlst_service_loaded: MLSTService):
    assert {('abcZ', '1'), ('bglA', '51'), ('cat', '11'),
            ('dapE', '13'), ('dat', '2'), ('ldh', '5'),
            ('lhkA', '4'), ('lhkA', '5')} == mlst_service_loaded.get_all_loci_alleles('lmonocytogenes')


def test_get_all_loci_alleles_unknown(mlst_service_loaded_unknown: MLSTService):
    assert {('abcZ', '1'), ('abcZ', '?'), ('bglA', '?'), ('cat', '11'),
            ('dapE', '13'), ('dat', '2'), ('ldh', '5'),
            ('lhkA', '5')} == mlst_service_loaded_unknown.get_all_loci_alleles('lmonocytogenes')
| 46.066524 | 118 | 0.683142 | 2,461 | 21,467 | 5.711906 | 0.066233 | 0.035996 | 0.032653 | 0.025397 | 0.901188 | 0.847265 | 0.803443 | 0.752579 | 0.7333 | 0.717009 | 0 | 0.043862 | 0.197093 | 21,467 | 465 | 119 | 46.165591 | 0.771699 | 0.047794 | 0 | 0.6997 | 0 | 0 | 0.188042 | 0.06567 | 0 | 0 | 0 | 0 | 0.309309 | 1 | 0.048048 | false | 0.003003 | 0.012012 | 0 | 0.06006 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
36997e4315c167ee210ba6de2c0b5caf987d85aa | 2,692 | py | Python | scanme.py | raunak-dx/network-recon-script | 6048525a02c0289cd36e777debda3802b52178fe | [
"MIT"
] | 5 | 2020-12-12T01:23:18.000Z | 2022-02-27T01:32:54.000Z | scanme.py | raunak-dx/network-recon-script | 6048525a02c0289cd36e777debda3802b52178fe | [
"MIT"
] | null | null | null | scanme.py | raunak-dx/network-recon-script | 6048525a02c0289cd36e777debda3802b52178fe | [
"MIT"
] | null | null | null | import nmap
from colored import fg, bg, attr
color = fg('green')
reset = attr('reset')
try:
file1 = open('scanme-header.txt', 'r')
print(' ')
print (color + file1.read() + reset)
file1.close()
except IOError:
print('\nBanner File not found!')
scanner = nmap.PortScanner()
target = input("Enter IP Address to scan: ")
type(target)
response = input("\nEnter the type of scan you want to run: \n\n1)TCP Scan \n2)UDP Scan \n3)Intense Scan \n\n")
print("\n")
if (response=='1'):
print("NMAP Version:\t ", scanner.nmap_version())
print("Scanning:\t ",target)
print("Please Wait !\n")
scanner.scan(target, '1-1000', '-v -sS')
hostname = scanner[target].hostname()
if (hostname == ""):
print("Hostname:\tUnknown")
else:
print("Hostname:\t", hostname)
print("State:\t ", scanner[target].state())
print("Scan Info:\t ", scanner.scaninfo())
print("Protocol(s):\t ", scanner[target].all_protocols())
#print("Discovered Port(s):\t ", scanner[target]['tcp'].keys())
for proto in scanner[target].all_protocols():
lport = scanner[target][proto].keys()
for port in lport:
print('Port:\t ', port)
print('State:\t ', scanner[target][proto][port]['state'])
print("\n")
elif (response=='2'):
print("NMAP Version:\t ", scanner.nmap_version())
print("Scanning:\t ", target)
print("Please Wait !\n")
scanner.scan(target, '1-1000', '-v -sU')
hostname = scanner[target].hostname()
if (hostname == ""):
print("Hostname:\tUnknown")
else:
print("Hostname:\t", hostname)
print("State:\t ", scanner[target].state())
print("Scan Info:\t ", scanner.scaninfo())
print("Protocol(s):\t ", scanner[target].all_protocols())
print("Discovered Port(s):\t ", scanner[target]['udp'].keys())
for proto in scanner[target].all_protocols():
lport = scanner[target][proto].keys()
for port in lport:
print('Port:\t ', port)
print('State:\t ', scanner[target][proto][port]['state'])
print("\n")
elif (response=='3'):
print("NMAP Version:\t ", scanner.nmap_version())
print("Scanning:\t ", target)
print("Please Wait !\n")
scanner.scan(target, '1-1000', '-T4 -A -v')
hostname = scanner[target].hostname()
if (hostname == ""):
print("Hostname:\tUnknown")
else:
print("Hostname:\t", hostname)
print("State:\t ", scanner[target].state())
print("Scan Info:\t ", scanner.scaninfo())
print("Protocol(s):\t ", scanner[target].all_protocols())
#print("Discovered Port(s):\t ", scanner[target]['tcp'].keys())
for proto in scanner[target].all_protocols():
lport = scanner[target][proto].keys()
for port in lport:
print('Port:\t ', port)
print('State:\t ', scanner[target][proto][port]['state'])
print("\n")
else:
print("Wrong Option!")
exit(0)
| 27.752577 | 111 | 0.649703 | 374 | 2,692 | 4.652406 | 0.23262 | 0.156897 | 0.096552 | 0.062069 | 0.792529 | 0.792529 | 0.792529 | 0.792529 | 0.792529 | 0.792529 | 0 | 0.011121 | 0.131501 | 2,692 | 96 | 112 | 28.041667 | 0.733105 | 0.046062 | 0 | 0.688312 | 0 | 0.012987 | 0.253702 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.025974 | 0 | 0.025974 | 0.506494 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
7fd79f6f302eb0851f15347da85449a4971ea108 | 7,923 | py | Python | dsbox-ta2/python/dsbox/planner/leveltwo/l1proxy.py | Rosna/P4ML-UI | edf0dd830588f03b197e4d6532830a5aedd88424 | [
"Apache-2.0"
] | 1 | 2021-11-05T17:42:47.000Z | 2021-11-05T17:42:47.000Z | dsbox-ta2/python/dsbox/planner/leveltwo/l1proxy.py | Rosna/P4ML-UI | edf0dd830588f03b197e4d6532830a5aedd88424 | [
"Apache-2.0"
] | null | null | null | dsbox-ta2/python/dsbox/planner/leveltwo/l1proxy.py | Rosna/P4ML-UI | edf0dd830588f03b197e4d6532830a5aedd88424 | [
"Apache-2.0"
] | 2 | 2019-02-21T18:29:51.000Z | 2019-09-02T21:21:26.000Z | import os
from dsbox.planner.levelone.planner import (LevelOnePlanner, get_d3m_primitives, AffinityPolicy)
from dsbox.planner.common.library import PrimitiveLibrary
from dsbox.planner.common.pipeline import Pipeline
from dsbox.schema.dataset_schema import VariableFileType
class LevelOnePlannerProxy(object):
"""
The Level-1 DSBox Proxy Planner.
This is here to integrate with Ke-Thia's L1 Planner until we come up with a consistent interface
"""
def __init__(self, libdir, helper):
self.models = PrimitiveLibrary(libdir + os.sep + "models.json")
self.features = PrimitiveLibrary(libdir + os.sep + "features.json")
self.primitives = get_d3m_primitives()
self.policy = AffinityPolicy(self.primitives)
self.media_type = None
if helper.data_manager.media_type is not None:
self.media_type = helper.data_manager.media_type
self.l1_planner = LevelOnePlanner(primitives=self.primitives, policy=self.policy,
task_type=helper.problem.task_type, task_subtype=helper.problem.task_subtype, media_type=self.media_type)
self.primitive_hash = {}
for model in self.models.primitives:
self.primitive_hash[model.name] = model
for feature in self.features.primitives:
self.primitive_hash[feature.name] = feature
self.pipeline_hash = {}
self.flag = False
def get_pipelines(self, data):
try:
l1_pipelines = self.l1_planner.generate_pipelines_with_hierarchy(level=2)
# If there is a media type, use featurisation-added pipes instead
# kyao: added check to skip if media_type is nested tables
if self.media_type and not (self.media_type==VariableFileType.TABULAR or self.media_type==VariableFileType.GRAPH):
new_pipes = []
for l1_pipeline in l1_pipelines:
refined_pipes = self.l1_planner.fill_feature_by_weights(l1_pipeline, 1)
new_pipes = new_pipes + refined_pipes
l1_pipelines = new_pipes
# print(l1_pipelines)
pipelines = []
for l1_pipeline in l1_pipelines:
pipeline = self.l1_to_proxy_pipeline(l1_pipeline)
if pipeline:
self.pipeline_hash[str(pipeline)] = l1_pipeline
pipelines.append(pipeline)
# print(pipelines)
return pipelines
except Exception as e:
return None
def l1_to_proxy_pipeline(self, l1_pipeline):
pipeline = Pipeline()
ok = True
for prim in l1_pipeline.get_primitives():
# print(type(prim))
l2prim = self.primitive_hash.get(prim.name, None)
# print(l2prim)
if not l2prim:
ok = False
break
pipeline.addPrimitive(l2prim)
# print(pipeline)
if ok:
return pipeline
return None
def get_related_pipelines(self, pipeline):
pipelines = []
l1_pipeline = self.pipeline_hash.get(str(pipeline), None)
if l1_pipeline:
l1_pipelines = self.l1_planner.find_similar_learner(l1_pipeline, include_siblings=True)
for l1_pipeline in l1_pipelines:
pipeline = self.l1_to_proxy_pipeline(l1_pipeline)
if pipeline:
self.pipeline_hash[str(pipeline)] = l1_pipeline
pipelines.append(pipeline)
return pipelines
def get_particular_pipelines(self, data, models):
try:
# print(models)
l1_pipelines = self.l1_planner.generate_pipelines_with_hierarchy_new(models, level=2)
# print(l1_pipelines)
# If there is a media type, use featurisation-added pipes instead
# kyao: added check to skip if media_type is nested tables
if self.media_type and not (self.media_type==VariableFileType.TABULAR or self.media_type==VariableFileType.GRAPH):
new_pipes = []
for l1_pipeline in l1_pipelines:
refined_pipes = self.l1_planner.fill_feature_by_weights(l1_pipeline, 1)
# print(refined_pipes)
new_pipes = new_pipes + refined_pipes
l1_pipelines = new_pipes
pipelines = []
for l1_pipeline in l1_pipelines:
# print(l1_pipeline)
pipeline = self.l1_to_proxy_pipeline(l1_pipeline)
if pipeline:
self.pipeline_hash[str(pipeline)] = l1_pipeline
pipelines.append(pipeline)
# print(pipelines)
return pipelines
except Exception as e:
return None
def get_particular_pipelines_by_feature_extration(self, data, models, feature_extraction):
try:
# print(models)
l1_pipelines = self.l1_planner.generate_pipelines_with_hierarchy_new(models, level=2)
# print(l1_pipelines)
# If there is a media type, use featurisation-added pipes instead
# kyao: added check to skip if media_type is nested tables
if self.media_type and not (self.media_type==VariableFileType.TABULAR or self.media_type==VariableFileType.GRAPH):
new_pipes = []
for l1_pipeline in l1_pipelines:
refined_pipes = self.l1_planner.fill_feature_by_particular_method(l1_pipeline, feature_extraction, 1)
# print(refined_pipes)
new_pipes = new_pipes + refined_pipes
l1_pipelines = new_pipes
pipelines = []
for l1_pipeline in l1_pipelines:
# print(l1_pipeline)
pipeline = self.l1_to_proxy_pipeline_new(l1_pipeline)
if pipeline:
self.pipeline_hash[str(pipeline)] = l1_pipeline
pipelines.append(pipeline)
# print(pipelines)
return pipelines
except Exception as e:
return None
def get_pipelines_by_feature_extration(self, data, feature_extraction):
try:
# print(models)
l1_pipelines = self.l1_planner.generate_pipelines_with_hierarchy(level=2)
# print(l1_pipelines)
# If there is a media type, use featurisation-added pipes instead
# kyao: added check to skip if media_type is nested tables
if self.media_type and not (self.media_type==VariableFileType.TABULAR or self.media_type==VariableFileType.GRAPH):
new_pipes = []
for l1_pipeline in l1_pipelines:
refined_pipes = self.l1_planner.fill_feature_by_particular_method(l1_pipeline, feature_extraction, 1)
# print(refined_pipes)
new_pipes = new_pipes + refined_pipes
l1_pipelines = new_pipes
pipelines = []
for l1_pipeline in l1_pipelines:
# print(l1_pipeline)
pipeline = self.l1_to_proxy_pipeline_new(l1_pipeline)
if pipeline:
self.pipeline_hash[str(pipeline)] = l1_pipeline
pipelines.append(pipeline)
# print(pipelines)
return pipelines
except Exception as e:
return None
def l1_to_proxy_pipeline_new(self, l1_pipeline):
pipeline = Pipeline()
ok = True
for prim in l1_pipeline.get_primitives():
l2prim = self.primitive_hash.get(prim.name, None)
# print(l2prim)
if not l2prim:
ok = False
break
pipeline.addPrimitive(l2prim)
# print(pipeline)
if ok:
return pipeline
return None
| 38.838235 | 126 | 0.607724 | 884 | 7,923 | 5.202489 | 0.135747 | 0.071755 | 0.042401 | 0.029354 | 0.738856 | 0.721679 | 0.706458 | 0.704501 | 0.704501 | 0.704501 | 0 | 0.017241 | 0.326518 | 7,923 | 203 | 127 | 39.029557 | 0.84464 | 0.126593 | 0 | 0.746154 | 0 | 0 | 0.003493 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061538 | false | 0 | 0.038462 | 0 | 0.207692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3d2b837353f53eb62c982b1bafa974a94390be05 | 43 | py | Python | attacks/__init__.py | ml-research/PyTorch-BayesianCNN | 7933d6d6523be7d54e2347ba1497f63317f04af6 | [
"MIT"
] | 1 | 2021-11-28T10:09:03.000Z | 2021-11-28T10:09:03.000Z | attacks/__init__.py | ml-research/PyTorch-BayesianCNN | 7933d6d6523be7d54e2347ba1497f63317f04af6 | [
"MIT"
] | null | null | null | attacks/__init__.py | ml-research/PyTorch-BayesianCNN | 7933d6d6523be7d54e2347ba1497f63317f04af6 | [
"MIT"
] | null | null | null | from .fgsm import FGSM
from .pgd import PGD | 21.5 | 22 | 0.790698 | 8 | 43 | 4.25 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 2 | 23 | 21.5 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3d2eccd68ea6902209f0239aa06ac8d8fa93cb81 | 25 | py | Python | acq4/devices/DAQGeneric/__init__.py | ablot/acq4 | ba7cd340d9d0282640adb501d3788f8c0837e4c4 | [
"MIT"
] | null | null | null | acq4/devices/DAQGeneric/__init__.py | ablot/acq4 | ba7cd340d9d0282640adb501d3788f8c0837e4c4 | [
"MIT"
] | null | null | null | acq4/devices/DAQGeneric/__init__.py | ablot/acq4 | ba7cd340d9d0282640adb501d3788f8c0837e4c4 | [
"MIT"
] | null | null | null | from DAQGeneric import *
| 12.5 | 24 | 0.8 | 3 | 25 | 6.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3d43e45bce01a83ac9acf8203133b182c5a4263e | 74 | py | Python | cerver/utils/utils.py | ermiry-com/py-cerver | b2db27a7e41a2dd2511882290a2d0de4b7ee82db | [
"MIT"
] | 5 | 2021-03-25T20:33:55.000Z | 2021-12-21T05:03:23.000Z | cerver/utils/utils.py | ermiry-com/py-cerver | b2db27a7e41a2dd2511882290a2d0de4b7ee82db | [
"MIT"
] | 5 | 2021-03-31T05:54:07.000Z | 2021-07-01T01:21:39.000Z | cerver/utils/utils.py | ermiry-com/py-cerver | b2db27a7e41a2dd2511882290a2d0de4b7ee82db | [
"MIT"
] | 7 | 2021-03-25T20:33:58.000Z | 2021-12-29T05:28:56.000Z | import sys
def printf(format, *args):
    sys.stdout.write(format % args)
| 14.8 | 33 | 0.702703 | 11 | 74 | 4.727273 | 0.727273 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162162 | 74 | 4 | 34 | 18.5 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e9e5079e07e1cd1b96b063867ae208df4caee68c | 2,941 | py | Python | tests/fixtures/config.py | andreroggeri/br-to-ynab | c5d0ef3804bb575badc05ac6dc771f6a9281f955 | [
"MIT"
] | 5 | 2021-09-20T13:15:37.000Z | 2022-03-01T01:03:27.000Z | tests/fixtures/config.py | andreroggeri/br-to-ynab | c5d0ef3804bb575badc05ac6dc771f6a9281f955 | [
"MIT"
] | 4 | 2021-04-28T14:11:42.000Z | 2021-10-09T16:18:15.000Z | tests/fixtures/config.py | andreroggeri/br-to-ynab | c5d0ef3804bb575badc05ac6dc771f6a9281f955 | [
"MIT"
] | 1 | 2021-09-27T15:13:30.000Z | 2021-09-27T15:13:30.000Z | import pytest


@pytest.fixture
def config_for_nubank():
    return {
        "ynab_token": "abc-123",
        "ynab_budget": "budget-name",
        "banks": [
            "Nubank"
        ],
        "start_import_date": "2021-04-27",
        "nubank_login": "12345678912",
        "nubank_token": "some-token",
"nubank_cert": "IyMKIyBIb3N0IERhdGFiYXNlCiMKIyBsb2NhbGhvc3QgaXMgdXNlZCB0byBjb25maWd1cmUgdGhlIGxvb3BiYWNrIGludGVyZmFjZQojIHdoZW4gdGhlIHN5c3RlbSBpcyBib290aW5nLiAgRG8gbm90IGNoYW5nZSB0aGlzIGVudHJ5LgojIwoxMjcuMC4wLjEJbG9jYWxob3N0CjI1NS4yNTUuMjU1LjI1NQlicm9hZGNhc3Rob3N0Cjo6MSAgICAgICAgICAgICBsb2NhbGhvc3QKCjE3Mi4yNy4yMTYuMTAyCWFwaWhtbC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi4xMTcJc29uYXJwb3J0b3ByZC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi4xMTgJc29uYXJwb3J0b2htbAoxNzIuMjcuMjE2LjExOCAgc29uYXJwb3J0b2htbC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi43NglnaXRwb3J0b3ByZC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi4xMzcJb2NwbWFzdGVyLnBvcnRvc2VndXJvLmJyYXNpbAoxNzIuMjcuMjEyLjE0Nglwb3J0b3NlbmhhLnBvcnRvc2VndXJvLmJyYXNpbAoxNzIuMjcuMjA0LjEwCW5leHVzcmVwby5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi43OQluZXh1c3BvcnRvcHJkCjE3Mi4yNy4yMTYuODEJamVua2luc3BvcnRvaG1sLnBvcnRvc2VndXJvLmJyYXNpbAoxNzIuMjcuMjE2LjIyNwlqZW5raW5zY2lwb3J0b3ByZC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi4yMjcJamVua2luc3BvcnRvcHJkCjE3Mi4yNy4yMTYuNzkJbmV4dXNwb3J0b3ByZC5wb3J0b3NlZ3Vyby5icmFzaWwKMTcyLjI3LjIxNi43OAluZXh1c3BvcnRvaG1sLnBvcnRvc2VndXJvLmJyYXNpbAoxNzIuMjcuMjAyLjIxNglhZGZzLnBvcnRvc2VndXJvLmNvbS5icgoxNzIuMjcuMjE2LjcyCWdyZWdobWwKMTcyLjI3LjIxNi43MQlncmVnCjE3Mi4yNy4yMTYuNTYgICBncmFmYW5hCjE3Mi4yNy4yMTIuMTI4ICBhcGFjaGVobWxyZQoxNzIuMjcuMjEyLjEwNyAgYXBhY2hlaG1sMnJlCjE3Mi4yNy4yMTYuMTUgICBvdGRpZ2hvbW0ucG9ydG9zZWd1cm8uYnJhc2lsCjE3Mi4yNy4yMDQuMTE4ICBvc2JobWxhdXRvLnBvcnRvc2VndXJvLmJyYXNpbAoKIyBBZGRlZCBieSBEb2NrZXIgRGVza3RvcAojIFRvIGFsbG93IHRoZSBzYW1lIGt1YmUgY29udGV4dCB0byB3b3JrIG9uIHRoZSBob3N0IGFuZCB0aGUgY29udGFpbmVyOgoxMjcuMC4wLjEga3ViZXJuZXRlcy5kb2NrZXIuaW50ZXJuYWwKIyBFbmQgb2Ygc2VjdGlvbgo=",
"nubank_credit_card_account": "Nubs",
"nubank_checking_account": "Nubs2"
}
@pytest.fixture
def config_for_bradesco():
return {
"ynab_token": "abc-123",
"ynab_budget": "budget-name",
"banks": [
"Bradesco"
],
"start_import_date": "2021-04-27",
"bradesco_branch": "123",
"bradesco_account_no": "456789",
"bradesco_account_digit": "9",
"bradesco_web_password": "5566",
"bradesco_credit_card_account": "Visa",
"bradesco_checking_account": "Conta Conta COrrente"
}
@pytest.fixture
def config_for_alelo():
return {
'ynab_token': 'abc-123',
'ynab_budget': 'budget-name',
'banks': ['Alelo'],
'start_import_date': '2021-04-27',
'login': '1234',
'alelo_password': 'abc123',
'alelo_flex_account': 'aaaa',
'alelo_refeicao_account': 'bbbc',
'alelo_alimentacao_account': 'cccc',
}
| 56.557692 | 1,650 | 0.801768 | 133 | 2,941 | 17.390977 | 0.390977 | 0.016861 | 0.020752 | 0.028534 | 0.12192 | 0.089494 | 0.059663 | 0.059663 | 0.059663 | 0.059663 | 0 | 0.109162 | 0.127848 | 2,941 | 51 | 1,651 | 57.666667 | 0.792593 | 0 | 0 | 0.355556 | 0 | 0 | 0.763006 | 0.617477 | 0 | 1 | 0 | 0 | 0 | 1 | 0.066667 | true | 0.044444 | 0.088889 | 0.066667 | 0.222222 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e9eb278e3b5297a206902444433140f9c9a03a43 | 152 | py | Python | pytorch_widedeep/bayesian_models/__init__.py | TangleSpace/pytorch-widedeep | ccc55a15c1b3205ffc8c054abc5cd25cba9ccdff | [
"MIT"
] | null | null | null | pytorch_widedeep/bayesian_models/__init__.py | TangleSpace/pytorch-widedeep | ccc55a15c1b3205ffc8c054abc5cd25cba9ccdff | [
"MIT"
] | null | null | null | pytorch_widedeep/bayesian_models/__init__.py | TangleSpace/pytorch-widedeep | ccc55a15c1b3205ffc8c054abc5cd25cba9ccdff | [
"MIT"
] | null | null | null | from pytorch_widedeep.bayesian_models import bayesian_nn
from pytorch_widedeep.bayesian_models.tabular import (
BayesianWide,
BayesianTabMlp,
)
| 25.333333 | 56 | 0.828947 | 17 | 152 | 7.117647 | 0.588235 | 0.181818 | 0.31405 | 0.446281 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 152 | 5 | 57 | 30.4 | 0.909774 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
e9ec7ec865e06f6956592e041c28b366eb8889b8 | 82 | py | Python | pirel/__init__.py | DanXYZ/pirel | 9a1d46510a225410a0d6e6b1079f361bcc7a0481 | [
"MIT"
] | 1 | 2021-06-24T22:06:53.000Z | 2021-06-24T22:06:53.000Z | pirel/__init__.py | giumc/PyResLayout | 9a1d46510a225410a0d6e6b1079f361bcc7a0481 | [
"MIT"
] | null | null | null | pirel/__init__.py | giumc/PyResLayout | 9a1d46510a225410a0d6e6b1079f361bcc7a0481 | [
"MIT"
] | 1 | 2021-06-21T21:15:27.000Z | 2021-06-21T21:15:27.000Z | import pirel.tools
import pirel.pcells
import pirel.modifiers
import pirel.sweeps
| 16.4 | 22 | 0.853659 | 12 | 82 | 5.833333 | 0.5 | 0.628571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 82 | 4 | 23 | 20.5 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
18049735cefad472c584f9d852f085f82105998a | 37 | py | Python | __init__.py | CoraDeFrancesco/build_duck | d3fe108b6075662a85fc9a910217b26c5d5ce135 | [
"MIT"
] | null | null | null | __init__.py | CoraDeFrancesco/build_duck | d3fe108b6075662a85fc9a910217b26c5d5ce135 | [
"MIT"
] | null | null | null | __init__.py | CoraDeFrancesco/build_duck | d3fe108b6075662a85fc9a910217b26c5d5ce135 | [
"MIT"
] | null | null | null | from build_duck.py import get_marker
| 18.5 | 36 | 0.864865 | 7 | 37 | 4.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 37 | 1 | 37 | 37 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
180f43e93306504fb2cafa0289a8a1057c4ea0a5 | 833 | py | Python | ggpy/cruft/autocode/GdlConstant.py | hobson/ggpy | 4e6e6e876c3a4294cd711647051da2d9c1836b60 | [
"MIT"
] | 1 | 2015-01-26T19:07:45.000Z | 2015-01-26T19:07:45.000Z | ggpy/cruft/autocode/GdlConstant.py | hobson/ggpy | 4e6e6e876c3a4294cd711647051da2d9c1836b60 | [
"MIT"
] | null | null | null | ggpy/cruft/autocode/GdlConstant.py | hobson/ggpy | 4e6e6e876c3a4294cd711647051da2d9c1836b60 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
""" generated source for module GdlConstant """
# package: org.ggp.base.util.gdl.grammar
@SuppressWarnings("serial")
class GdlConstant(GdlTerm):
""" generated source for class GdlConstant """
value = str()
    def __init__(self, value):
        """ generated source for method __init__ """
        super(GdlConstant, self).__init__()
        # str has no .intern() method in Python; sys.intern is the equivalent
        import sys
        self.value = sys.intern(value)
def getValue(self):
""" generated source for method getValue """
return self.value
def isGround(self):
""" generated source for method isGround """
return True
def toSentence(self):
""" generated source for method toSentence """
return GdlPool.getProposition(self)
def __str__(self):
""" generated source for method toString """
return self.value
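The constructor interns its value so that equal GDL constant names share one string object; in Python that is done with `sys.intern` (str has no `.intern()` method). A standalone illustration:

```python
import sys

# Interned strings with equal contents are the same object, so comparisons
# of constant names can short-circuit on identity instead of scanning chars.
a = sys.intern("".join(["gdl_", "constant"]))  # built dynamically on purpose
b = sys.intern("gdl_constant")
assert a is b
```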
| 27.766667 | 54 | 0.636255 | 89 | 833 | 5.775281 | 0.404494 | 0.20428 | 0.245136 | 0.233463 | 0.217899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.247299 | 833 | 29 | 55 | 28.724138 | 0.819777 | 0.398559 | 0 | 0.142857 | 1 | 0 | 0.013158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.357143 | false | 0 | 0 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
1846263780ae2e3d3abca145e5aa1e023e9d263b | 35 | py | Python | MemeGenerator/__init__.py | robocioaca/udacity_meme-generator-project | e5c7b4cbc14b1c1029e3a8e38b9cb7efcbe59b62 | [
"MIT"
] | null | null | null | MemeGenerator/__init__.py | robocioaca/udacity_meme-generator-project | e5c7b4cbc14b1c1029e3a8e38b9cb7efcbe59b62 | [
"MIT"
] | 3 | 2021-06-08T20:57:50.000Z | 2021-12-13T20:32:58.000Z | MemeGenerator/__init__.py | robocioaca/udacity_meme-generator-project | e5c7b4cbc14b1c1029e3a8e38b9cb7efcbe59b62 | [
"MIT"
] | 1 | 2021-09-30T19:10:31.000Z | 2021-09-30T19:10:31.000Z | from .MemeEngine import MemeEngine
| 17.5 | 34 | 0.857143 | 4 | 35 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a1159134f0dcdc4c6ab29efec4e2402ef281d640 | 631 | py | Python | python/ray/tune/utils/__init__.py | mgelbart/ray | 4cec2286572e368a4bd64aae467751a384eff62d | [
"Apache-2.0"
] | 1 | 2019-06-19T02:23:43.000Z | 2019-06-19T02:23:43.000Z | python/ray/tune/utils/__init__.py | mgelbart/ray | 4cec2286572e368a4bd64aae467751a384eff62d | [
"Apache-2.0"
] | 73 | 2021-09-25T07:11:39.000Z | 2022-03-26T07:10:59.000Z | python/ray/tune/utils/__init__.py | mgelbart/ray | 4cec2286572e368a4bd64aae467751a384eff62d | [
"Apache-2.0"
] | 1 | 2019-09-24T16:24:49.000Z | 2019-09-24T16:24:49.000Z | from ray.tune.utils.util import (
deep_update,
date_str,
flatten_dict,
merge_dicts,
unflattened_lookup,
UtilMonitor,
validate_save_restore,
warn_if_slow,
diagnose_serialization,
detect_checkpoint_function,
detect_reporter,
detect_config_single,
wait_for_gpu,
)
__all__ = [
"deep_update",
"date_str",
"flatten_dict",
"merge_dicts",
"unflattened_lookup",
"UtilMonitor",
"validate_save_restore",
"warn_if_slow",
"diagnose_serialization",
"detect_checkpoint_function",
"detect_reporter",
"detect_config_single",
"wait_for_gpu",
]
| 19.71875 | 33 | 0.683043 | 67 | 631 | 5.865672 | 0.522388 | 0.050891 | 0.071247 | 0.086514 | 0.926209 | 0.926209 | 0.926209 | 0.926209 | 0.926209 | 0.926209 | 0 | 0 | 0.22187 | 631 | 31 | 34 | 20.354839 | 0.800407 | 0 | 0 | 0 | 0 | 0 | 0.315372 | 0.10935 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.033333 | 0 | 0.033333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a1547239b43b11cb9f30b9cc9ce1b7aaff62bcf3 | 4,787 | py | Python | test/test_objects.py | scanfyu/czsc | 6a8dbbca8dd17930e84de6eda8f877c95de82d82 | [
"MIT"
] | null | null | null | test/test_objects.py | scanfyu/czsc | 6a8dbbca8dd17930e84de6eda8f877c95de82d82 | [
"MIT"
] | null | null | null | test/test_objects.py | scanfyu/czsc | 6a8dbbca8dd17930e84de6eda8f877c95de82d82 | [
"MIT"
] | null | null | null | # coding: utf-8
from collections import OrderedDict
from czsc.objects import Signal, Factor, Event, Freq, Operate, Position
def test_signal():
s = Signal(k1="1分钟", k3="倒1形态", v1="类一买", v2="七笔", v3="基础型", score=3)
assert str(s) == "Signal('1分钟_任意_倒1形态_类一买_七笔_基础型_3')"
assert s.key == "1分钟_倒1形态"
s1 = Signal(signal='1分钟_任意_倒1形态_类一买_七笔_基础型_3')
assert s == s1
assert s.is_match({"1分钟_倒1形态": "类一买_七笔_基础型_3"})
assert not s.is_match({"1分钟_倒1形态": "类一买_七笔_特例一_3"})
assert not s.is_match({"1分钟_倒1形态": "类一买_九笔_基础型_3"})
s = Signal(k1="1分钟", k2="倒1形态", k3="类一买", score=3)
assert str(s) == "Signal('1分钟_倒1形态_类一买_任意_任意_任意_3')"
assert s.key == "1分钟_倒1形态_类一买"
try:
s = Signal(k1="1分钟", k2="倒1形态", k3="类一买", score=101)
except ValueError as e:
assert str(e) == 'score 必须在0~100之间'
def test_factor():
freq = Freq.F15
s = OrderedDict()
default_signals = [
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="三K形态", v1="顶分型", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒1笔", k3="表里关系", v1="其他", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒1笔", k3="RSQ状态", v1="小于0.2", v2='其他', v3='其他'),
]
for signal in default_signals:
s[signal.key] = signal.value
factor = Factor(
name="单测",
signals_all=[
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他')
]
)
assert factor.is_match(s)
factor = Factor(
name="单测",
signals_all=[
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他')
],
signals_any=[
Signal(k1=str(freq.value), k2="倒1笔", k3="RSQ状态", v1="小于0.2", v2='其他', v3='其他')
]
)
assert factor.is_match(s)
factor = Factor(
name="单测",
signals_all=[
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他')
],
signals_any=[
Signal(k1=str(freq.value), k2="倒1笔", k3="RSQ状态", v1="小于0.8", v2='其他', v3='其他')
]
)
assert not factor.is_match(s)
def test_event():
freq = Freq.F15
s = OrderedDict()
default_signals = [
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="三K形态", v1="顶分型", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒1笔", k3="表里关系", v1="其他", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒1笔", k3="RSQ状态", v1="小于0.2", v2='其他', v3='其他'),
]
for signal in default_signals:
s[signal.key] = signal.value
event = Event(name="单测", operate=Operate.LO, factors=[
Factor(
name="测试",
signals_all=[
Signal(k1=str(freq.value), k2="倒0笔", k3="方向", v1="向上", v2='其他', v3='其他'),
Signal(k1=str(freq.value), k2="倒0笔", k3="长度", v1="大于5", v2='其他', v3='其他')]
)
])
m, f = event.is_match(s)
assert m and f
event = Event(name="单测", operate=Operate.LO, factors=[
Factor(
name="测试",
signals_all=[
Signal('15分钟_倒0笔_方向_向上_其他_其他_0'), Signal('15分钟_倒0笔_长度_任意_其他_其他_0')
]
)
])
m, f = event.is_match(s)
assert m and f
event = Event(name="单测", operate=Operate.LO, factors=[
Factor(
name="测试",
signals_all=[
Signal('15分钟_倒0笔_方向_向上_其他_其他_20'), Signal('15分钟_倒0笔_长度_任意_其他_其他_0')
]
)
])
m, f = event.is_match(s)
assert not m and not f
event = Event(name="单测", operate=Operate.LO, factors=[
Factor(
name="测试",
signals_all=[
Signal('15分钟_倒0笔_方向_向下_其他_其他_0'), Signal('15分钟_倒0笔_长度_任意_其他_其他_0')
]
)
])
m, f = event.is_match(s)
assert not m and not f
def test_position():
position = Position(symbol="000001.XSHG")
assert position.pos == 0
position.long_open()
assert position.pos == 0.5
position.long_add1()
assert position.pos == 0.8
position.long_add2()
assert position.pos == 1
position.long_reduce1()
assert position.pos == 0.8
position.long_reduce2()
assert position.pos == 0.5
position.long_exit()
assert position.pos == 0 | 32.564626 | 90 | 0.540004 | 727 | 4,787 | 3.415406 | 0.130674 | 0.074104 | 0.088603 | 0.120822 | 0.824406 | 0.818768 | 0.806283 | 0.732984 | 0.732984 | 0.687878 | 0 | 0.072095 | 0.261124 | 4,787 | 147 | 91 | 32.564626 | 0.629912 | 0.002716 | 0 | 0.598425 | 0 | 0 | 0.135764 | 0.046931 | 0 | 0 | 0 | 0 | 0.181102 | 1 | 0.031496 | false | 0 | 0.015748 | 0 | 0.047244 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
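`test_signal` above round-trips six fields plus a score through one underscore-joined string. A toy encoder/decoder with the same layout, purely to make the format visible; czsc's real parsing lives in `czsc.objects.Signal`:

```python
# Mirrors the "k1_k2_k3_v1_v2_v3_score" layout asserted in test_signal.
def encode_signal(k1, k2, k3, v1, v2, v3, score):
    return "_".join([k1, k2, k3, v1, v2, v3, str(score)])

def decode_signal(text):
    # The last underscore-separated field is the numeric score.
    *fields, score = text.split("_")
    return fields, int(score)

fields, score = decode_signal(
    encode_signal("1分钟", "任意", "倒1形态", "类一买", "七笔", "基础型", 3)
)
assert fields == ["1分钟", "任意", "倒1形态", "类一买", "七笔", "基础型"]
assert score == 3
```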
a1567caddf97c84dba07a8d9cdfdfe188d48e3d9 | 106 | py | Python | http_wrapper/__init__.py | NewKnowledge/__nk_unicorn | 4b9753723504b0b225faadcda9a3de888c4183cf | [
"MIT"
] | null | null | null | http_wrapper/__init__.py | NewKnowledge/__nk_unicorn | 4b9753723504b0b225faadcda9a3de888c4183cf | [
"MIT"
] | null | null | null | http_wrapper/__init__.py | NewKnowledge/__nk_unicorn | 4b9753723504b0b225faadcda9a3de888c4183cf | [
"MIT"
] | null | null | null | from .queries import get_visual_clusters, insert_clusters, remove_community_clusters, get_community_names
| 53 | 105 | 0.896226 | 14 | 106 | 6.285714 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066038 | 106 | 1 | 106 | 106 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a1c0692837db2102712b003908809871471dbf5c | 169 | py | Python | server/main/utils/__init__.py | jphacks/TK_1905 | f4af0a26bacedde415f9f873c917fbdb4910e386 | [
"MIT"
] | 7 | 2019-10-26T05:44:14.000Z | 2019-11-10T13:06:11.000Z | server/main/utils/__init__.py | jphacks/TK_1905 | f4af0a26bacedde415f9f873c917fbdb4910e386 | [
"MIT"
] | 2 | 2019-11-07T16:28:36.000Z | 2020-06-06T00:12:58.000Z | server/main/utils/__init__.py | jphacks/TK_1905 | f4af0a26bacedde415f9f873c917fbdb4910e386 | [
"MIT"
] | null | null | null | from .djangoutils import *
from .googleutils import *
from .doc2vec import *
from .funcs import *
from .macpickle import *
from .singleton import *
from .slack import *
| 21.125 | 26 | 0.751479 | 21 | 169 | 6.047619 | 0.428571 | 0.472441 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007092 | 0.16568 | 169 | 7 | 27 | 24.142857 | 0.893617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a1cc2d3b4e334191a4316817a39c35f842816ca8 | 4,784 | py | Python | opentimelineio_contrib/adapters/tests/test_fcpx_adapter.py | BadSingleton/OpenTimelineIO | 4ec5cb79c2a393ba92fefc87a7ca6217ca769a84 | [
"Apache-2.0"
] | null | null | null | opentimelineio_contrib/adapters/tests/test_fcpx_adapter.py | BadSingleton/OpenTimelineIO | 4ec5cb79c2a393ba92fefc87a7ca6217ca769a84 | [
"Apache-2.0"
] | null | null | null | opentimelineio_contrib/adapters/tests/test_fcpx_adapter.py | BadSingleton/OpenTimelineIO | 4ec5cb79c2a393ba92fefc87a7ca6217ca769a84 | [
"Apache-2.0"
] | null | null | null | import os
import unittest
import opentimelineio as otio
import opentimelineio.test_utils as otio_test_utils
SAMPLE_LIBRARY_XML = os.path.join(
os.path.dirname(__file__),
"sample_data",
"fcpx_library.fcpxml"
)
SAMPLE_PROJECT_XML = os.path.join(
os.path.dirname(__file__),
"sample_data",
"fcpx_project.fcpxml"
)
SAMPLE_EVENT_XML = os.path.join(
os.path.dirname(__file__),
"sample_data",
"fcpx_event.fcpxml"
)
SAMPLE_CLIPS_XML = os.path.join(
os.path.dirname(__file__),
"sample_data",
"fcpx_clips.fcpxml"
)
class AdaptersFcpXXmlTest(unittest.TestCase, otio_test_utils.OTIOAssertions):
"""
The test class for the FCP X XML adapter
"""
def __init__(self, *args, **kwargs):
super(AdaptersFcpXXmlTest, self).__init__(*args, **kwargs)
self.maxDiff = None
def test_library_roundtrip(self):
container = otio.adapters.read_from_file(SAMPLE_LIBRARY_XML)
timeline = next(
container.each_child(descended_from_type=otio.schema.Timeline)
)
self.assertIsNotNone(timeline)
self.assertEqual(len(timeline.tracks), 4)
self.assertEqual(len(timeline.video_tracks()), 3)
self.assertEqual(len(timeline.audio_tracks()), 1)
video_clip_names = (
(
'IMG_0715',
None,
'compound_clip_1',
'IMG_0233',
'IMG_0687',
'IMG_0268',
'compound_clip_1'
),
(None, 'IMG_0513', None, 'IMG_0268', 'IMG_0740'),
(None, 'IMG_0857')
)
for n, track in enumerate(timeline.video_tracks()):
self.assertTupleEqual(
tuple(c.name for c in track),
video_clip_names[n]
)
fcpx_xml = otio.adapters.write_to_string(container, "fcpx_xml")
self.assertIsNotNone(fcpx_xml)
new_timeline = otio.adapters.read_from_string(fcpx_xml, "fcpx_xml")
self.assertJsonEqual(container, new_timeline)
def test_event_roundtrip(self):
container = otio.adapters.read_from_file(SAMPLE_EVENT_XML)
timeline = next(
container.each_child(descended_from_type=otio.schema.Timeline)
)
self.assertIsNotNone(timeline)
self.assertEqual(len(timeline.tracks), 4)
self.assertEqual(len(timeline.video_tracks()), 3)
self.assertEqual(len(timeline.audio_tracks()), 1)
video_clip_names = (
(
'IMG_0715',
None,
'compound_clip_1',
'IMG_0233',
'IMG_0687',
'IMG_0268',
'compound_clip_1'
),
(None, 'IMG_0513', None, 'IMG_0268', 'IMG_0740'),
(None, 'IMG_0857')
)
for n, track in enumerate(timeline.video_tracks()):
self.assertTupleEqual(
tuple(c.name for c in track),
video_clip_names[n]
)
fcpx_xml = otio.adapters.write_to_string(container, "fcpx_xml")
self.assertIsNotNone(fcpx_xml)
new_timeline = otio.adapters.read_from_string(fcpx_xml, "fcpx_xml")
self.assertJsonEqual(container, new_timeline)
def test_project_roundtrip(self):
timeline = otio.adapters.read_from_file(SAMPLE_PROJECT_XML)
self.assertIsNotNone(timeline)
self.assertEqual(len(timeline.tracks), 4)
self.assertEqual(len(timeline.video_tracks()), 3)
self.assertEqual(len(timeline.audio_tracks()), 1)
video_clip_names = (
(
'IMG_0715',
None,
'compound_clip_1',
'IMG_0233',
'IMG_0687',
'IMG_0268',
'compound_clip_1'
),
(None, 'IMG_0513', None, 'IMG_0268', 'IMG_0740'),
(None, 'IMG_0857')
)
for n, track in enumerate(timeline.video_tracks()):
self.assertTupleEqual(
tuple(c.name for c in track),
video_clip_names[n]
)
fcpx_xml = otio.adapters.write_to_string(timeline, "fcpx_xml")
self.assertIsNotNone(fcpx_xml)
new_timeline = otio.adapters.read_from_string(fcpx_xml, "fcpx_xml")
self.assertJsonEqual(timeline, new_timeline)
def test_clips_roundtrip(self):
container = otio.adapters.read_from_file(SAMPLE_CLIPS_XML)
fcpx_xml = otio.adapters.write_to_string(container, "fcpx_xml")
self.assertIsNotNone(fcpx_xml)
new_timeline = otio.adapters.read_from_string(fcpx_xml, "fcpx_xml")
self.assertJsonEqual(container, new_timeline)
if __name__ == '__main__':
unittest.main()
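Each adapter test above follows the same read → serialize → re-read → compare round-trip. Stripped of OTIO, the pattern reduces to this (the sample dict stands in for a parsed timeline):

```python
import json

# Serialize, re-parse, and confirm nothing was lost -- the same structure
# the adapter tests apply to fcpxml files via write_to_string/read_from_string.
def roundtrip(obj):
    return json.loads(json.dumps(obj))

sample = {"tracks": [{"name": "IMG_0715"}, {"name": None}]}
assert roundtrip(sample) == sample
```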
| 30.08805 | 77 | 0.596781 | 529 | 4,784 | 5.049149 | 0.160681 | 0.052415 | 0.060651 | 0.087608 | 0.814676 | 0.811681 | 0.800449 | 0.800449 | 0.800449 | 0.742044 | 0 | 0.033036 | 0.297659 | 4,784 | 158 | 78 | 30.278481 | 0.761905 | 0.008361 | 0 | 0.65873 | 0 | 0 | 0.099408 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 1 | 0.039683 | false | 0 | 0.031746 | 0 | 0.079365 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a1ea56f577e661c06c35585252ddea30eee3b5fd | 10,621 | py | Python | run_keras_server.py | DLVIsualizer/dlvis-flask | e1e22028b2d57fb894d105bd716437a3de8e4e7f | [
"MIT"
] | null | null | null | run_keras_server.py | DLVIsualizer/dlvis-flask | e1e22028b2d57fb894d105bd716437a3de8e4e7f | [
"MIT"
] | 13 | 2020-01-28T22:20:14.000Z | 2022-03-11T23:20:14.000Z | run_keras_server.py | DLVIsualizer/dlvis-flask | e1e22028b2d57fb894d105bd716437a3de8e4e7f | [
"MIT"
] | null | null | null | # # USAGE
# # Start the server:
# # python run_keras_server.py
# # Submit a request via cURL:
# # curl -X POST -F image=@dog.jpg 'http://localhost:5000/predict'
# # Submit a request via Python:
# # python simple_request.py
#
# # import the necessary packages
# from flask_cors import CORS, cross_origin
# from keras.applications import ResNet50
# from keras.applications import InceptionV3
# from keras.preprocessing.image import img_to_array
# from keras.applications import imagenet_utils
# from PIL import Image
# from constants import MODELS
# import numpy as np
# import flask
# import io
# import json
#
# # initialize our Flask application and the Keras model
# app = flask.Flask(__name__)
# cors = CORS(app)
#
# resnetModel = ResNet50(weights="imagenet")
# inceptionV3Model = InceptionV3(weights="imagenet")
#
# # def load_model():
# # load the pre-trained Keras model (here we are using a model
# # pre-trained on ImageNet and provided by Keras, but you can
# # substitute in your own networks just as easily)
# # global model
# # model = ResNet50(weights="imagenet")
#
# def prepare_image(image, target):
# # if the image mode is not RGB, convert it
# if image.mode != "RGB":
# image = image.convert("RGB")
#
# # resize the input image and preprocess it
# image = image.resize(target)
# image = img_to_array(image)
# image = np.expand_dims(image, axis=0)
# image = imagenet_utils.preprocess_input(image)
#
# # return the processed image
# return image
#
#
# def build_html_with_layer(layer):
# layer_class = layer['class_name']
# layer_config = layer['config']
# html = ""
#
# if layer_class == 'InputLayer':
# html = "input shape " + str(layer_config['batch_input_shape']) + "<br>"
# elif layer_class == 'ZeroPadding2D':
# html = "padding " + str(layer_config['padding']) + "<br>"
# elif layer_class == 'Conv2D':
# html = "filters " + str(layer_config['filters']) + "<br>" \
# "kernel size " + str(layer_config['kernel_size']) + "<br>" \
# "strides " + str(
# layer_config['strides']) + "<br>"
# elif layer_class == 'BatchNormalization':
# html = ""
# elif layer_class == 'Activation':
# html = "activation func</b> " + str(layer_config['activation'])
# elif layer_class == 'MaxPooling2D':
# html = "pool size " + str(layer_config['pool_size']) + "<br>" \
# "strides " + str(layer_config['strides']) + "<br>"
#
# return html
#
#
# def create_model_graph(layers):
# data = []
# tooltip = {}
# links = []
# for idx in range(1, len(layers)):
# links.append({
# "source": idx - 1,
# "target": idx
# })
#
# for idx, layer in enumerate(layers):
# flag = False
# prior_node = ""
#
# inbound_nodes = layer["inbound_nodes"]
#
# if len(inbound_nodes) != 0:
# for inbound_node in inbound_nodes[0]:
# if inbound_node[0] != data[len(data)-1]["name"]:
# flag = True
# prior_node = inbound_node[0]
# break
# else:
# break
#
# if flag is True:
# for d in data:
# if d["name"] == prior_node:
# data.append({
# "name": layer['name'],
# "x": d["x"] + 1200,
# "y": d["y"],
# "value": layer['class_name']
# })
# else:
# data.append({
# "name": layer['name'],
# "x": 500,
# "y": idx * 200,
# "value": layer['class_name']
# })
#
# tooltip[layer['name']] = build_html_with_layer(layer)
#
#
#
# model_graph = {
# "graph": {
# "data": data,
# "links": links
# },
# "tooltip": tooltip
# }
#
# return model_graph
#
#
# @app.route("/predict", methods=["POST"])
# def predict():
# # initialize the data dictionary that will be returned from the
# # view
# data = {"success": False}
#
# # ensure an image was properly uploaded to our endpoint
# if flask.request.method == "POST":
# if flask.request.files.get("image"):
# # read the image in PIL format
# image = flask.request.files["image"].read()
# image = Image.open(io.BytesIO(image))
#
# # preprocess the image and prepare it for classification
# image = prepare_image(image, target=(224, 224))
#
# # classify the input image and then initialize the list
# # of predictions to return to the client
# preds = resnetModel.predict(image)
# results = imagenet_utils.decode_predictions(preds)
# data["predictions"] = []
#
# # loop over the results and add them to the list of
# # returned predictions
# for (imagenetID, label, prob) in results[0]:
# r = {"label": label, "probability": float(prob)}
# data["predictions"].append(r)
#
# # indicate that the request was a success
# data["success"] = True
#
# # return the data dictionary as a JSON response
# return flask.jsonify(data)
#
#
# @app.route("/layers/<int:model_id>", methods=["GET"])
# @cross_origin()
# def layers(model_id):
#
# if model_id == MODELS['ResNet50']:
# jmodel = json.loads(resnetModel.to_json())
# elif model_id == MODELS['InceptionV3']:
# jmodel = json.loads(inceptionV3Model.to_json())
# else:
# return ('',204) # No Content
#
# layers = jmodel["config"]["layers"]
#
# # print(json.dumps(layers, indent=2, sort_keys=True))
#
# model_graph = create_model_graph(layers)
# # print(json.dumps(model_graph, indent=2, sort_keys=True))
# return flask.jsonify(model_graph)
#
#
# # if this is the main thread of execution first load the model and
# # then start the server
# if __name__ == "__main__":
# print(("* Loading Keras model and Flask starting server..."
# "please wait until server has fully started"))
# app.run()
# USAGE
# Start the server:
# python run_keras_server.py
# Submit a request via cURL:
# curl -X POST -F image=@dog.jpg 'http://localhost:5000/predict'
# Submit a request via Python:
# python simple_request.py
# import the necessary packages
from flask_cors import CORS, cross_origin
from constants import MODELS
from keras.applications import ResNet50
from keras.applications import InceptionV3
from keras.preprocessing.image import img_to_array
from keras.applications import imagenet_utils
from PIL import Image
import numpy as np
import flask
import io
import json
import requests
# initialize our Flask application and the Keras model
app = flask.Flask(__name__)
cors = CORS(app)
# PBW: 0505_18
MODEL_ID_RESNET = 'ResNet50'
MODEL_ID_INCEPTIONV3 = 'InceptionV3'
currentModel = 0 # model pointer
resnetModel = ResNet50(weights="imagenet")
inceptionV3Model = InceptionV3(weights="imagenet")
# def load_model():
# load the pre-trained Keras model (here we are using a model
# pre-trained on ImageNet and provided by Keras, but you can
# substitute in your own networks just as easily)
# global model
# model = ResNet50(weights="imagenet")
def prepare_image(image, target):
# if the image mode is not RGB, convert it
if image.mode != "RGB":
image = image.convert("RGB")
# resize the input image and preprocess it
image = image.resize(target)
image = img_to_array(image)
image = np.expand_dims(image, axis=0)
image = imagenet_utils.preprocess_input(image)
# return the processed image
return image
def build_html_with_layer(layer):
layer_class = layer['class_name']
layer_config = layer['config']
html = ""
    # print(json.dumps(layer_config, indent=2, sort_keys=True))  # debug dump of every layer config; commented out to keep responses quiet
if layer_class == 'InputLayer':
html = "input shape " + str(layer_config['batch_input_shape']) + "<br>"
elif layer_class == 'ZeroPadding2D':
html = "padding " + str(layer_config['padding']) + "<br>"
elif layer_class == 'Conv2D':
html = "filters " + str(layer_config['filters']) + "<br>" \
"kernel size " + str(layer_config['kernel_size']) + "<br>" \
"strides " + str(
layer_config['strides']) + "<br>"
elif layer_class == 'BatchNormalization':
html = ""
elif layer_class == 'Activation':
html = "activation func</b> " + str(layer_config['activation'])
elif layer_class == 'MaxPooling2D':
html = "pool size " + str(layer_config['pool_size']) + "<br>" \
"strides " + str(layer_config['strides']) + "<br>"
return html
def create_model_graph(layers):
data = []
tooltip = {}
for idx, layer in enumerate(layers):
data.append({
"name": layer['name'],
"x": 500,
"y": idx * 200,
"value": layer['class_name']
})
tooltip[layer['name']] = build_html_with_layer(layer)
links = []
for idx in range(1, len(layers)):
links.append({
"source": idx - 1,
"target": idx
})
model_graph = {
"graph": {
"data": data,
"links": links
},
"tooltip": tooltip
}
return model_graph
@app.route("/predict", methods=["POST"])
def predict():
# initialize the data dictionary that will be returned from the
# view
data = {"success": False}
# ensure an image was properly uploaded to our endpoint
if flask.request.method == "POST":
if flask.request.files.get("image"):
# read the image in PIL format
image = flask.request.files["image"].read()
image = Image.open(io.BytesIO(image))
# preprocess the image and prepare it for classification
image = prepare_image(image, target=(224, 224))
# classify the input image and then initialize the list
# of predictions to return to the client
preds = resnetModel.predict(image)
results = imagenet_utils.decode_predictions(preds)
data["predictions"] = []
# loop over the results and add them to the list of
# returned predictions
for (imagenetID, label, prob) in results[0]:
r = {"label": label, "probability": float(prob)}
data["predictions"].append(r)
# indicate that the request was a success
data["success"] = True
# return the data dictionary as a JSON response
return flask.jsonify(data)
@app.route("/layers/<int:model_id>", methods=["GET"])
@cross_origin()
def layers(model_id):
if model_id == MODELS['ResNet50']:
jmodel = json.loads(resnetModel.to_json())
elif model_id == MODELS['InceptionV3']:
jmodel = json.loads(inceptionV3Model.to_json())
else:
return ('',204) # No Content
    # jmodel = requests.get('http://127.0.0.1:5001')  # dead experiment: the bare call lacked a URL scheme, returns a Response (not a dict), and would clobber the model JSON loaded above
layers = jmodel["config"]["layers"]
# print(json.dumps(layers, indent=2, sort_keys=True))
model_graph = create_model_graph(layers)
# print(json.dumps(model_graph, indent=2, sort_keys=True))
return flask.jsonify(model_graph)
# if this is the main thread of execution first load the model and
# then start the server
if __name__ == "__main__":
print(("* Loading Keras model and Flask starting server..."
"please wait until server has fully started"))
app.run() | 28.940054 | 124 | 0.654364 | 1,383 | 10,621 | 4.902386 | 0.16992 | 0.034071 | 0.033038 | 0.023894 | 0.936136 | 0.933333 | 0.921534 | 0.921534 | 0.921534 | 0.921534 | 0 | 0.013352 | 0.203182 | 10,621 | 367 | 125 | 28.940055 | 0.787782 | 0.620469 | 0 | 0.037383 | 0 | 0 | 0.165915 | 0.00584 | 0 | 0 | 0 | 0 | 0 | 1 | 0.046729 | false | 0 | 0.11215 | 0 | 0.214953 | 0.018692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
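The layout rule inside `create_model_graph` above, reduced to a pure function so the node/link shape is easy to see: node *n* sits at x=500, y=n*200, and consecutive layers are linked. Field names mirror the payload the `/layers` route returns; the layer names are illustrative.

```python
# Hedged sketch of create_model_graph's layout rule, without the tooltips.
def layout(layer_names):
    data = [{"name": n, "x": 500, "y": i * 200} for i, n in enumerate(layer_names)]
    links = [{"source": i - 1, "target": i} for i in range(1, len(layer_names))]
    return {"data": data, "links": links}

graph = layout(["input_1", "conv1", "pool1"])
assert graph["data"][2]["y"] == 400  # third node sits 2 * 200 px down
assert graph["links"] == [{"source": 0, "target": 1}, {"source": 1, "target": 2}]
```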
b819aae69e6f3a2bc1bfcfaed5535f6346cb54de | 126 | py | Python | setup.py | dobretony/python-start-environment | f5f64d81b796fb57ea10f669bef2a14c0076d5b6 | [
"MIT"
] | null | null | null | setup.py | dobretony/python-start-environment | f5f64d81b796fb57ea10f669bef2a14c0076d5b6 | [
"MIT"
] | 1 | 2018-12-06T10:31:27.000Z | 2018-12-06T10:46:28.000Z | setup.py | dobretony/python-start-environment | f5f64d81b796fb57ea10f669bef2a14c0076d5b6 | [
"MIT"
] | null | null | null | from app.app import main
if __name__ == "__main__":
import sys
if(sys.argv[1] == "start"):
main(sys.argv[1])
# --- tests/test_bits3.py (hille721/bits3, MIT) ---
import pytest
from bits3.core import bits3_cycle
from bits3.cli import run
def test_dummy():
    assert 1 == 1
# --- odoo/addons/auth_signup/models/__init__.py (gtfarng/Odoo_migrade, Apache-2.0) ---
# -*- coding: utf-8 -*-
import res_config
import res_users
import res_partner
# --- tests/test_links.py (dylanmccall/ricecooker, MIT) ---
import os
from ricecooker.utils.html import replace_links
def test_replace_absolute_links():
    a_content = '<a href="http://replace.me/link/to/page.html">'
    noscheme_a_content = '<a href="//replace.me/link/to/page.html">'
    root_a_content = '<a href="/link/to/page.html">'
    img_content = '<img src="http://replace.me/img/hello.jpg">'
    img_srcset_content = '<img srcset="http://replace.me/img/hello.jpg 1x, http://replace.me/img/hello.jpg 2x">'
    urls_to_replace = {
        'http://replace.me/img/hello.jpg': 'img/hello.jpg',
        'http://replace.me/link/to/page.html': 'link/to/page.html'
    }
    output = replace_links(img_content, urls_to_replace)
    assert output == '<img src="img/hello.jpg">'
    output = replace_links(a_content, urls_to_replace)
    assert output == '<a href="link/to/page.html">'
    output = replace_links(noscheme_a_content, urls_to_replace)
    assert output == '<a href="link/to/page.html">'
    output = replace_links(root_a_content, urls_to_replace)
    assert output == '<a href="link/to/page.html">'
    output = replace_links(img_srcset_content, urls_to_replace)
    assert output == '<img srcset="img/hello.jpg 1x, img/hello.jpg 2x">'
def test_replace_relative_links():
    a_content = '<a href="http://replace.me/link/to/page.html">'
    noscheme_a_content = '<a href="//replace.me/link/to/page.html">'
    root_a_content = '<a href="/link/to/page.html">'
    img_content = '<img src="http://replace.me/img/hello.jpg">'
    img_srcset_content = '<img srcset="http://replace.me/img/hello.jpg 1x, http://replace.me/img/hello.jpg 2x">'
    urls_to_replace = {
        'http://replace.me/img/hello.jpg': 'replace.me/img/hello.jpg',
        'http://replace.me/link/to/page.html': 'replace.me/link/to/page.html'
    }
    content_dir = os.path.join('replace.me', 'link', 'from')
    download_root = '.'
    output = replace_links(img_content, urls_to_replace, download_root=download_root, content_dir=content_dir, relative_links=True)
    assert output == '<img src="../../img/hello.jpg">'
    output = replace_links(a_content, urls_to_replace, download_root=download_root, content_dir=content_dir, relative_links=True)
    assert output == '<a href="../to/page.html">'
    output = replace_links(noscheme_a_content, urls_to_replace, download_root=download_root, content_dir=content_dir, relative_links=True)
    assert output == '<a href="../to/page.html">'
    output = replace_links(root_a_content, urls_to_replace, download_root=download_root, content_dir=content_dir, relative_links=True)
    assert output == '<a href="../to/page.html">'
    output = replace_links(img_srcset_content, urls_to_replace, download_root=download_root, content_dir=content_dir, relative_links=True)
    assert output == '<img srcset="../../img/hello.jpg 1x, ../../img/hello.jpg 2x">'
# --- configs/__init__.py (rafaat/audiencemanager-api-lab, Apache-2.0) ---
from app_configs import *
from aam_configs import *
# --- wechatpy/pay/api/__init__.py (fuh/wechatpy, MIT) ---
# -*- coding: utf-8 -*-
from wechatpy.pay.api.redpack import WeChatRedpack # NOQA
from wechatpy.pay.api.transfer import WeChatTransfer # NOQA
from wechatpy.pay.api.coupon import WeChatCoupon # NOQA
from wechatpy.pay.api.order import WeChatOrder # NOQA
from wechatpy.pay.api.refund import WeChatRefund # NOQA
from wechatpy.pay.api.tools import WeChatTools # NOQA
from wechatpy.pay.api.jsapi import WeChatJSAPI # NOQA
from wechatpy.pay.api.micropay import WeChatMicroPay # NOQA
from wechatpy.pay.api.withhold import WeChatWithhold # NOQA
from wechatpy.pay.api.appauth import WeChatAppAuth # NOQA
# --- tests/fixture_02.py (brianjbuck/noeval, MIT) ---
def test():
    eval("print('hello')")
# --- girder/cumulus/plugin_tests/volume_test.py (aronhelser/cumulus, Apache-2.0) ---
#!/usr/bin/env python
# -*- coding: utf-8 -*-
###############################################################################
# Copyright 2015 Kitware Inc.
#
# Licensed under the Apache License, Version 2.0 ( the "License" );
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
###############################################################################
from tests import base
import json
import mock
from cumulus.testing import AssertCallsMixin
import unittest
from girder.utility.model_importer import ModelImporter
def setUpModule():
    base.enabledPlugins.append('cumulus')
    base.startServer()


def tearDownModule():
    base.stopServer()
class VolumeTestCase(AssertCallsMixin, base.TestCase):
@mock.patch('cumulus.aws.ec2.tasks.key.generate_key_pair.delay')
@mock.patch('cumulus.ssh.tasks.key.generate_key_pair.delay')
@mock.patch('cumulus_plugin.models.aws.get_ec2_client')
def setUp(self, get_ec2_client, *args):
super(VolumeTestCase, self).setUp()
users = ({
'email': 'cumulus@email.com',
'login': 'cumulus',
'firstName': 'First',
'lastName': 'Last',
'password': 'goodpassword'
}, {
'email': 'regularuser@email.com',
'login': 'regularuser',
'firstName': 'First',
'lastName': 'Last',
'password': 'goodpassword'
}, {
'email': 'another@email.com',
'login': 'another',
'firstName': 'First',
'lastName': 'Last',
'password': 'goodpassword'
})
self._cumulus, self._user, self._another_user = \
[ModelImporter.model('user').createUser(**user) for user in users]
self._group = ModelImporter.model('group').createGroup('cumulus', self._cumulus)
# Create a traditional cluster
body = {
'config': {
'host': 'myhost',
'ssh': {
'user': 'myuser'
}
},
'name': 'test',
'type': 'trad'
}
json_body = json.dumps(body)
r = self.request('/clusters', method='POST',
type='application/json', body=json_body, user=self._user)
self.assertStatus(r, 201)
self._trad_cluster_id = str(r.json['_id'])
# Create a AWS profile
self._availability_zone = 'cornwall-2b'
body = {
'name': 'myprof',
'accessKeyId': 'mykeyId',
'secretAccessKey': 'mysecret',
'regionName': 'cornwall',
'availabilityZone': self._availability_zone
}
ec2_client = get_ec2_client.return_value
ec2_client.describe_regions.return_value = {
'Regions': [{
'RegionName': 'cornwall',
'Endpoint': 'cornwall.ec2.amazon.com'
}]
}
create_url = '/user/%s/aws/profiles' % str(self._user['_id'])
r = self.request(create_url, method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
self._profile_id = str(r.json['_id'])
create_url = '/user/%s/aws/profiles' % str(self._another_user['_id'])
r = self.request(create_url, method='POST',
type='application/json', body=json.dumps(body),
user=self._another_user)
self.assertStatus(r, 201)
self._another_profile_id = str(r.json['_id'])
# Create EC2 cluster
body = {
'profileId': self._profile_id,
'name': 'testing',
'cloudProvider': 'ec2'
}
json_body = json.dumps(body)
r = self.request('/clusters', method='POST',
type='application/json', body=json_body, user=self._user)
self.assertStatus(r, 201)
self._cluster_id = str(r.json['_id'])
def test_create(self):
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
expected = {
u'status': u'created',
u'name': u'test',
u'zone': u'us-west-2a',
u'ec2': {u'id': None},
u'profileId': self._profile_id,
u'type': u'ebs',
u'size': 20}
del r.json['_id']
self.assertEqual(r.json, expected, 'Unexpected volume returned')
# Add file system type
body = {
'name': 'test2',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'fs': 'ext4',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
expected = {
u'status': u'created',
u'name': u'test2',
u'zone': u'us-west-2a',
u'type': u'ebs',
u'size': 20,
u'fs': u'ext4',
u'ec2': {
u'id': None
},
u'profileId': self._profile_id
}
del r.json['_id']
self.assertEqual(r.json, expected, 'Unexpected volume returned')
# Try invalid type
body['type'] = 'bogus'
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
# Try invalid file system type
body['fs'] = 'bogus'
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
# Try create volume with same name
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
# Now try create volume with same name another user this should work
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._another_profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._another_user)
self.assertStatus(r, 201)
# Create a volume without a zone
body = {
'name': 'zoneless',
'size': 20,
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
self.assertEqual(r.json['zone'], self._availability_zone,
'Volume created in wrong zone')
# Try to create a volume with a invalid profile
body['aws'] = {
'profileId': 'bogus'
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
def test_get(self):
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
expected = {
u'name': u'test',
u'zone': u'us-west-2a',
u'ec2': {
u'id': None
},
u'type': u'ebs',
u'size': 20,
u'status': u'created',
u'profileId': self._profile_id
}
r = self.request('/volumes/%s' % volume_id, method='GET',
type='application/json',
user=self._user)
self.assertStatusOk(r)
del r.json['_id']
self.assertEqual(expected, r.json)
# Try to fetch a volume that doesn't exist
r = self.request('/volumes/55c3dbd9f65710591baefe60', method='GET',
type='application/json',
user=self._user)
self.assertStatus(r, 400)
@mock.patch('cumulus_plugin.volume.CloudProvider')
@mock.patch('cumulus.ansible.tasks.volume.attach_volume.delay')
@mock.patch('cumulus.ansible.tasks.volume.detach_volume.delay')
@mock.patch('cumulus.ansible.tasks.volume.delete_volume.delay')
def test_delete(self, delete_volume, detach_volume,
attach_volume, CloudProvider):
CloudProvider.return_value.get_volume.return_value = None
CloudProvider.return_value.get_master_instance.return_value = {
'instance_id': 'i-00000',
'private_ip': 'x.x.x.x',
'public_ip': 'x.x.x.x',
'state': 'running',
}
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume = r.json
volume_id = str(r.json['_id'])
body = {
'path': '/data'
}
url = '/volumes/%s/clusters/%s/attach' % (volume_id, self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatusOk(r)
# Patch back a fake volume ID (normally this would be called from ansible)
r = self.request('/volumes/%s' % (volume_id), method='PATCH',
type='application/json',
body=json.dumps({'ec2': {'id': 'vol-00000'}}),
user=self._user)
# Complete the attach operation (normally this would be called from ansible)
r = self.request('/volumes/%s/clusters/%s/attach/complete' % (volume_id, self._cluster_id),
method='PUT', type='application/json',
user=self._user, body=json.dumps({"path": "/data"}))
r = self.request('/volumes/%s' % volume_id, method='DELETE',
user=self._user)
self.assertStatus(r, 400)
self.assertEquals(delete_volume.call_count, 0)
# Mock out CloudProvider.get_volume to return an 'in-use' volume
# That is what calls to attach & attach/complete should have
# created.
CloudProvider.return_value.get_volume.return_value = {
'volume_id': 'vol-00000',
'state': 'in-use'
}
url = '/volumes/%s/detach' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
url = '/volumes/%s/detach/complete' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
# Mock out CloudProvider.get_volume to return an 'available' volume
# That is what calls to detach & detach/complete should have
# created.
CloudProvider.return_value.get_volume.return_value = {
'volume_id': 'vol-00000',
'state': 'available'
}
self.assertStatusOk(r)
r = self.request('/volumes/%s' % volume_id, method='DELETE',
user=self._user)
self.assertStatus(r, 200)
self.assertEquals(delete_volume.call_count, 1)
@mock.patch('cumulus_plugin.volume.CloudProvider')
@mock.patch('cumulus.ansible.tasks.volume.attach_volume.delay')
def test_attach_volume(self, attach_volume, CloudProvider):
CloudProvider.return_value.get_volume.return_value = None
CloudProvider.return_value.get_master_instance.return_value = {
'instance_id': 'i-00000',
'private_ip': 'x.x.x.x',
'public_ip': 'x.x.x.x',
'state': 'running',
}
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'fs': 'ext4',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
body = {
'path': '/data'
}
url = '/volumes/%s/clusters/%s/attach' % (volume_id, self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatusOk(r)
expected = {'status': 'attaching',
'fs': u'ext4',
'name': u'test',
'zone': u'us-west-2a',
'ec2': {u'id': None},
'profileId': self._profile_id,
'_id': volume_id,
'type': u'ebs',
'size': 20}
self.assertEqual(r.json, expected)
# Patch back a fake volume ID (normally this would be called from ansible)
r = self.request('/volumes/%s' % (volume_id), method='PATCH',
type='application/json',
body=json.dumps({'ec2': {'id': 'vol-00000'}}),
user=self._user)
# Complete the attach operation (normally this would be called from ansible)
r = self.request('/volumes/%s/clusters/%s/attach/complete' % (volume_id, self._cluster_id),
method='PUT', type='application/json',
user=self._user, body=json.dumps({"path": "/data"}))
# Test that the volume has been set up correctly
r = self.request('/volumes/%s' % volume_id, method='GET',
type='application/json', user=self._user)
expected = {u'status': u'in-use',
u'fs': u'ext4',
u'name': u'test',
u'zone': u'us-west-2a',
u'clusterId': self._cluster_id,
u'ec2': {u'id': u'vol-00000'},
u'profileId': self._profile_id,
u'path': u'/data',
u'_id': volume_id,
u'type': u'ebs',
u'size': 20}
self.assertEquals(r.json, expected)
# Test that the volume shows up on the cluster under 'volumes' attribute
r = self.request('/clusters/%s' % self._cluster_id, method='GET',
type='application/json', user=self._user)
self.assertStatusOk(r)
expected = {
u'profileId': str(self._profile_id),
u'status': u'created',
u'name': u'testing',
u'userId': str(self._user['_id']),
u'volumes': [volume_id],
u'type': u'ec2',
u'_id': self._cluster_id,
u'config': {
u'scheduler': {
u'type': u'sge'
},
u'ssh': {
u'user': u'ubuntu',
u'key': str(self._profile_id)
},
u'launch': {
u'spec': u'default',
u'params': {}
}
}
}
self.assertEqual(r.json, expected)
# Try to attach volume to a volume that is in use
CloudProvider.return_value.get_volume.return_value = {
'id': 'vol-00000', 'state': 'in-use'
}
url = '/volumes/%s/clusters/%s/attach' % (volume_id, self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
# Try to attach volume to a traditional cluster
# Patch back to 'attaching' so we don't error out on volume in use
r = self.request('/volumes/%s' % (volume_id), method='PATCH',
type='application/json',
body=json.dumps({'status': 'attaching'}),
user=self._user)
url = '/volumes/%s/clusters/%s/attach' % (
volume_id, self._trad_cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 400)
@mock.patch('cumulus_plugin.volume.CloudProvider')
@mock.patch('cumulus.ansible.tasks.volume.attach_volume.delay')
@mock.patch('cumulus.ansible.tasks.volume.detach_volume.delay')
def test_detach_volume(self, detach_volume, attach_volume, CloudProvider):
CloudProvider.return_value.get_volume.return_value = None
CloudProvider.return_value.get_master_instance.return_value = {
'instance_id': 'i-00000',
'private_ip': 'x.x.x.x',
'public_ip': 'x.x.x.x',
'state': 'running',
}
body = {
'name': 'testing me',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
# Try detaching volume not in use
url = '/volumes/%s/detach' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
self.assertStatus(r, 400)
body = {
'path': '/data'
}
url = '/volumes/%s/clusters/%s/attach' % (volume_id, self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatusOk(r)
# Patch back a fake volume ID (normally this would be called from ansible)
r = self.request('/volumes/%s' % (volume_id), method='PATCH',
type='application/json',
body=json.dumps({'ec2': {'id': 'vol-00000'}}),
user=self._user)
# Complete the attach operation (normally this would be called from ansible)
r = self.request('/volumes/%s/clusters/%s/attach/complete' % (volume_id, self._cluster_id),
method='PUT', type='application/json',
user=self._user, body=json.dumps({"path": "/data"}))
# Mock out CloudProvider.get_volume to return an 'in-use' volume
# That is what calls to attach & attach/complete should have
# created.
CloudProvider.return_value.get_volume.return_value = {
'volume_id': 'vol-00000',
'state': 'in-use'
}
url = '/volumes/%s/detach' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
self.assertStatusOk(r)
url = '/volumes/%s/detach/complete' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
self.assertStatusOk(r)
# Assert that detach was called on ec2 object
self.assertEqual(len(detach_volume.call_args_list),
1, "detach was not called")
r = self.request('/clusters/%s' % self._cluster_id, method='GET',
type='application/json', user=self._cumulus)
self.assertStatusOk(r)
expected = {
u'profileId': str(self._profile_id),
u'status': u'created',
u'name': u'testing',
u'userId': str(self._user['_id']),
u'volumes': [],
u'type': u'ec2',
u'_id': self._cluster_id,
u'config': {
u'scheduler': {
u'type': u'sge'
},
u'ssh': {
u'user': u'ubuntu',
u'key': str(self._profile_id)
},
u'launch': {
u'spec': u'default',
u'params': {}
}
}
}
self.assertEqual(r.json, expected)
@mock.patch('cumulus_plugin.volume.CloudProvider')
@mock.patch('cumulus.ansible.tasks.volume.attach_volume.delay')
def test_find_volume(self, attach_volume, CloudProvider):
# Create some test volumes
body = {
'name': 'testing me',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_1_id = r.json['_id']
body = {
'name': 'testing me2',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._another_profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._another_user)
self.assertStatus(r, 201)
volume_2_id = r.json['_id']
# Search with one user
r = self.request('/volumes', method='GET', user=self._user)
self.assertStatusOk(r)
self.assertEqual(len(r.json), 1, 'Wrong number of volumes returned')
self.assertEqual(r.json[0]['_id'], volume_1_id, 'Wrong volume returned')
# Now search with the other
r = self.request('/volumes', method='GET', user=self._another_user)
self.assertStatusOk(r)
self.assertEqual(len(r.json), 1, 'Wrong number of volumes returned')
self.assertEqual(r.json[0]['_id'], volume_2_id, 'Wrong volume returned')
# Search for volumes attached to a particular cluster
params = {
'clusterId': self._cluster_id
}
r = self.request('/volumes', method='GET', user=self._user,
params=params)
self.assertStatusOk(r)
self.assertEqual(len(r.json), 0, 'Wrong number of volumes returned')
body = {
'path': '/data'
}
CloudProvider.return_value.get_volume.return_value = None
CloudProvider.return_value.get_master_instance.return_value = {
'instance_id': 'i-00000',
'private_ip': 'x.x.x.x',
'public_ip': 'x.x.x.x',
'state': 'running',
}
# Attach a volume
url = '/volumes/%s/clusters/%s/attach' % (str(volume_1_id), self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatusOk(r)
# Patch back a fake volume ID (normally this would be called from ansible)
r = self.request('/volumes/%s' % (volume_1_id), method='PATCH',
type='application/json',
body=json.dumps({'ec2': {'id': 'vol-00000'}}),
user=self._user)
# Complete the attach operation (normally this would be called from ansible)
r = self.request('/volumes/%s/clusters/%s/attach/complete' % (volume_1_id, self._cluster_id),
method='PUT', type='application/json',
user=self._user, body=json.dumps({"path": "/data"}))
# Search again
r = self.request('/volumes', method='GET', user=self._user,
params=params)
self.assertStatusOk(r)
self.assertEqual(len(r.json), 1, 'Wrong number of volumes returned')
@mock.patch('cumulus_plugin.volume.CloudProvider')
@mock.patch('cumulus.ansible.tasks.volume.attach_volume.delay')
@mock.patch('cumulus.ansible.tasks.volume.detach_volume.delay')
def test_get_status(self, detach_volume, attach_volume, CloudProvider):
# Create some test volumes
CloudProvider.return_value.get_volume.return_value = None
CloudProvider.return_value.get_master_instance.return_value = {
'instance_id': 'i-00000',
'private_ip': 'x.x.x.x',
'public_ip': 'x.x.x.x',
'state': 'running',
}
body = {
'name': 'testing me',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
# Should initially be in status 'created'
url = '/volumes/%s/status' % volume_id
r = self.request(url, method='GET', user=self._user)
self.assertStatusOk(r)
expected = {
u'status': u'created'
}
self.assertEqual(r.json, expected, 'Unexpected status: {}'.format(r.json))
# Attach the volume to a fake cluster
body = {
'path': '/data'
}
url = '/volumes/%s/clusters/%s/attach' % (volume_id, self._cluster_id)
r = self.request(url, method='PUT',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatusOk(r)
# Patch back a fake volume ID (normally this would be called from ansible)
r = self.request('/volumes/%s' % (volume_id), method='PATCH',
type='application/json',
body=json.dumps({'ec2': {'id': 'vol-00000'}}),
user=self._user)
# Complete the attach operation (normally this would be called from ansible)
r = self.request('/volumes/%s/clusters/%s/attach/complete' % (volume_id, self._cluster_id),
method='PUT', type='application/json',
user=self._user, body=json.dumps({"path": "/data"}))
url = '/volumes/%s/status' % volume_id
r = self.request(url, method='GET', user=self._user)
self.assertStatusOk(r)
expected = {
u'status': u'in-use'
}
self.assertEqual(r.json, expected, 'Unexpected status: {}'.format(r.json))
# Detach the volume
# Mock out CloudProvider.get_volume to return an 'in-use' volume
# That is what calls to attach & attach/complete should have
# created.
CloudProvider.return_value.get_volume.return_value = {
'volume_id': 'vol-00000',
'state': 'in-use'
}
url = '/volumes/%s/detach' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
self.assertStatusOk(r)
url = '/volumes/%s/detach/complete' % (volume_id)
r = self.request(url, method='PUT', user=self._user)
self.assertStatusOk(r)
url = '/volumes/%s/status' % volume_id
r = self.request(url, method='GET', user=self._user)
self.assertStatusOk(r)
expected = {
u'status': u'available'
}
self.assertEqual(r.json, expected, 'Unexpected status: {}'.format(r.json))
def test_log(self):
volume_id = 'vol-1'
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
# Check that empty log exists for newly created volume
r = self.request('/volumes/%s/log' % str(volume_id), method='GET',
user=self._user)
self.assertStatusOk(r)
self.assertEqual(len(r.json['log']), 0)
log_entry = {
'msg': 'Some message'
}
r = self.request('/volumes/546a1844ff34c70456111185/log', method='GET',
user=self._user)
self.assertStatus(r, 404)
r = self.request('/volumes/%s/log' % str(volume_id), method='POST',
type='application/json', body=json.dumps(log_entry), user=self._user)
self.assertStatusOk(r)
r = self.request('/volumes/%s/log' % str(volume_id), method='GET',
user=self._user)
self.assertStatusOk(r)
expected_log = {u'log': [{u'msg': u'Some message'}]}
self.assertEqual(r.json, expected_log)
r = self.request('/volumes/%s/log' % str(volume_id), method='POST',
type='application/json', body=json.dumps(log_entry), user=self._user)
self.assertStatusOk(r)
r = self.request('/volumes/%s/log' % str(volume_id), method='GET',
user=self._user)
self.assertStatusOk(r)
self.assertEqual(len(r.json['log']), 2)
r = self.request('/volumes/%s/log' % str(volume_id), method='GET',
params={'offset': 1}, user=self._user)
self.assertStatusOk(r)
self.assertEqual(len(r.json['log']), 1)
def test_volume_sse(self):
body = {
'name': 'test',
'size': 20,
'zone': 'us-west-2a',
'type': 'ebs',
'profileId': self._profile_id
}
r = self.request('/volumes', method='POST',
type='application/json', body=json.dumps(body),
user=self._user)
self.assertStatus(r, 201)
volume_id = str(r.json['_id'])
# connect to volume notification stream
stream_r = self.request('/notification/stream', method='GET', user=self._user,
isJson=False, params={'timeout': 0})
self.assertStatusOk(stream_r)
# add a log entry
log_entry = {
'msg': 'Some message'
}
r = self.request('/volumes/%s/log' % str(volume_id), method='POST',
type='application/json', body=json.dumps(log_entry), user=self._user)
self.assertStatusOk(r)
notifications = self.getSseMessages(stream_r)
# we get 4 notifications in stream,
# 1 from cluster 'creating' 1 from cluster 'created' in setUp()
# 1 from the volume creation and 1 from the volume log
self.assertEqual(len(notifications), 4, 'Expecting four notifications, received %d' % len(notifications))
self.assertEqual(notifications[2]['type'], 'volume.status', 'Expecting a message with type \'volume.status\' got: %s' % notifications[2]['type'])
self.assertEqual(notifications[3]['type'], 'volume.log', 'Expecting a message with type \'volume.log\' got: %s' % notifications[3]['type'])
| 35.493492 | 154 | 0.516853 | 3,561 | 32,725 | 4.63016 | 0.089582 | 0.058709 | 0.050218 | 0.045609 | 0.788634 | 0.776747 | 0.744056 | 0.724466 | 0.707424 | 0.691412 | 0 | 0.015097 | 0.340168 | 32,725 | 921 | 155 | 35.53203 | 0.748483 | 0.094607 | 0 | 0.712389 | 0 | 0 | 0.202749 | 0.047047 | 0 | 0 | 0 | 0 | 0.125369 | 1 | 0.017699 | false | 0.004425 | 0.011799 | 0 | 0.030973 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3d9084f4add05f08ef3c362ac3292ec1b6ea021b | 38 | py | Python | src/tabs/dag_inspector/widgets/__init__.py | sisoe24/ProfileInspector | 38cfefe68c09b153f7380447c4c486728feecb9d | [
"MIT"
] | 4 | 2021-07-22T10:23:56.000Z | 2021-11-03T19:01:21.000Z | src/tabs/dag_inspector/widgets/__init__.py | Ripax/ProfileInspector | 38cfefe68c09b153f7380447c4c486728feecb9d | [
"MIT"
] | null | null | null | src/tabs/dag_inspector/widgets/__init__.py | Ripax/ProfileInspector | 38cfefe68c09b153f7380447c4c486728feecb9d | [
"MIT"
] | 1 | 2022-02-16T07:23:31.000Z | 2022-02-16T07:23:31.000Z | from .profiling import ProfilingWidget | 38 | 38 | 0.894737 | 4 | 38 | 8.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 38 | 1 | 38 | 38 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b1010456a2d30a5d7d1bd8a6a3aa3c0b6eb8229a | 17,880 | py | Python | pystrath_rfsoc/interactive_plots.py | lbrown65/rfsoc_notebooks | 36521083a1ac814411b2ec74bff43b77d1699d49 | [
"BSD-3-Clause"
] | 1 | 2021-03-10T16:37:28.000Z | 2021-03-10T16:37:28.000Z | pystrath_rfsoc/interactive_plots.py | lbrown65/rfsoc_notebooks | 36521083a1ac814411b2ec74bff43b77d1699d49 | [
"BSD-3-Clause"
] | null | null | null | pystrath_rfsoc/interactive_plots.py | lbrown65/rfsoc_notebooks | 36521083a1ac814411b2ec74bff43b77d1699d49 | [
"BSD-3-Clause"
] | null | null | null | import plotly.graph_objs as go
import numpy as np
import ipywidgets as ipw
from pynq.overlays.base import BaseOverlay
import xrfdc
class ComplexFrequencyPlot():
def __init__(self,
configuration=None):
# Avoid a mutable default argument: the dict literal would be shared
# between instances and mutated by the loop below.
configuration = configuration if configuration is not None else {}
default_config = {'sampling-freq' : 2048e6,
'number-samples' : 256,
'centre-freq' : 1024,
'height' : None,
'width' : None,
'data' : 0,
'title' : 'Complex Frequency Plot',
'x-axis-title' : 'Frequency (Hz)',
'y-axis-title' : 'Amplitude'}
for default_key in default_config.keys():
if default_key not in configuration:
configuration[default_key] = default_config[default_key]
self._config = configuration
data = go.Scatter(
x=np.arange(-self._config['sampling-freq']/2,
self._config['sampling-freq']/2,
self._config['sampling-freq']/self._config['number-samples']),
#+ self._config['centre-freq']*1e6,
y= self._config['data'])
self._plot = go.FigureWidget(
data=data,
layout={'title' : self._config['title'],
'height': self._config['height'],
'width' : self._config['width'],
'xaxis' : {'title' : self._config['x-axis-title']},
'yaxis' : {'title' : self._config['y-axis-title']}})
@property
def configuration(self):
return self._config
@configuration.setter
def configuration(self, configuration):
for key in configuration.keys():
if key not in self._config.keys():
raise KeyError(''.join(['The key ', key, ' is not found in the class configuration.']))
else:
self._config.update({key : configuration[key]})
self._update_plot()
def _update_plot(self):
self._plot.layout.height = self._config['height']
self._plot.layout.width = self._config['width']
self._plot.layout.xaxis.title = self._config['x-axis-title']
self._plot.layout.yaxis.title = self._config['y-axis-title']
self._plot.layout.title = self._config['title']
# The centre-freq offset is intentionally not applied here, matching
# the constructor; a trailing '+ centre-freq' would be a no-op
# expression statement rather than part of this assignment.
self._plot.data[0].x = np.arange(-self._config['sampling-freq']/2,
self._config['sampling-freq']/2,
self._config['sampling-freq']/self._config['number-samples'])
def update_data(self, data):
if len(data) != self._config['number-samples']:
raise ValueError('Length of data must be the same as the plot.')
else:
self._plot.data[0].y = data
def get_plot(self):
return self._plot
class ComplexTimePlot():
def __init__(self,
configuration=None):
# Avoid a mutable default argument: the dict literal would be shared
# between instances and mutated by the loop below.
configuration = configuration if configuration is not None else {}
default_config = {'sampling-freq' : 4096e6,
'number-samples' : 256,
'height' : None,
'width' : None,
'c_data' : 0,
'title' : 'Complex Time Plot',
'x-axis-title' : 'Time (s)',
'y-axis-title' : 'Amplitude'}
for default_key in default_config.keys():
if default_key not in configuration:
configuration[default_key] = default_config[default_key]
self._config = configuration
data_re = go.Scatter(
x=np.arange(0, self._config['number-samples']/self._config['sampling-freq'], 1/self._config['sampling-freq']),
y=np.real(self._config['c_data']),
name='Real')
data_im = go.Scatter(
x=np.arange(0, self._config['number-samples']/self._config['sampling-freq'], 1/self._config['sampling-freq']),
y=np.imag(self._config['c_data']),
name='Imag')
self._plot = go.FigureWidget(
data=[data_re, data_im],
layout={'title' : self._config['title'],
'height': self._config['height'],
'width' : self._config['width'],
'xaxis' : {'title' : self._config['x-axis-title']},
'yaxis' : {'title' : self._config['y-axis-title']}})
@property
def configuration(self):
return self._config
@configuration.setter
def configuration(self, configuration):
for key in configuration.keys():
if key not in self._config.keys():
raise KeyError(''.join(['The key ', key, ' is not found in the class configuration.']))
else:
self._config.update({key : configuration[key]})
self._update_plot()
def _update_plot(self):
self._plot.layout.height = self._config['height']
self._plot.layout.width = self._config['width']
self._plot.layout.xaxis.title = self._config['x-axis-title']
self._plot.layout.yaxis.title = self._config['y-axis-title']
self._plot.layout.title = self._config['title']
new_x = np.arange(0,
self._config['number-samples']/self._config['sampling-freq'],
1/self._config['sampling-freq'])
# Update both the real and imaginary traces so they stay in sync
self._plot.data[0].x = new_x
self._plot.data[1].x = new_x
def update_data(self, c_data):
if len(c_data) != self._config['number-samples']:
raise ValueError('Length of data must be the same as the plot.')
self._plot.data[0].y = np.real(c_data)
self._plot.data[1].y = np.imag(c_data)
def get_plot(self):
return self._plot
class DAC_ToneGenerator():
def __init__(self,
channel,
centre_frequency=0):
self._channel = channel
#To do: DAC block check
self.centre_frequency = centre_frequency
@property
def centre_frequency(self):
return abs(self._channel.dac_block.MixerSettings['Freq'])
@centre_frequency.setter
def centre_frequency(self, centre_frequency):
block = self._channel.dac_block
if (centre_frequency > block.BlockStatus['SamplingFreq']*1e3) \
or (centre_frequency < 1):
raise ValueError('Centre frequency out of range')
zone = block.NyquistZone
even = True if ((zone % 2) == 0) else False
req_zone = int(np.ceil(abs(centre_frequency)/((block.BlockStatus['SamplingFreq']*1e3)/2)))
if req_zone != zone:
block.NyquistZone = req_zone
if even:
block.MixerSettings['Freq'] = -centre_frequency
else:
block.MixerSettings['Freq'] = centre_frequency
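The Nyquist-zone bookkeeping in the setter above reduces to two lines. A minimal standalone sketch (`required_zone_and_nco` is a hypothetical helper, not part of this module; frequencies are in MHz):

```python
import math

def required_zone_and_nco(centre_frequency, sampling_freq_mhz):
    """Sketch of the logic in DAC_ToneGenerator.centre_frequency: the
    required Nyquist zone is ceil(|f| / (fs/2)); even zones are
    spectrally inverted, so the NCO frequency sign is flipped."""
    zone = int(math.ceil(abs(centre_frequency) / (sampling_freq_mhz / 2.0)))
    nco = -centre_frequency if zone % 2 == 0 else centre_frequency
    return zone, nco
```

For example, a 3000 MHz tone on a 4096 MSPS converter falls in zone 2, so the NCO would be programmed with -3000 MHz.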
class ADC_ToneGenerator():
def __init__(self,
channel,
adc_centre_frequency=0):
self._channel = channel
#To do: ADC block check
self.adc_centre_frequency = adc_centre_frequency
@property
def adc_centre_frequency(self):
return abs(self._channel.adc_block.MixerSettings['Freq'])
@adc_centre_frequency.setter
def adc_centre_frequency(self, adc_centre_frequency):
block = self._channel.adc_block
if (adc_centre_frequency < -block.BlockStatus['SamplingFreq']*1e3) \
or (adc_centre_frequency > block.BlockStatus['SamplingFreq']*1e3):
raise ValueError('ADC Centre frequency out of range')
if (adc_centre_frequency == 0):
zone = 1
even = True if ((zone % 2) == 0) else False
else:
zone = block.NyquistZone
req_zone = int(np.ceil(abs(adc_centre_frequency)/((block.BlockStatus['SamplingFreq']*1e3)/2)))
if req_zone != zone:
block.NyquistZone = req_zone
even = True if ((zone % 2) == 0) else False
if even:
block.MixerSettings['Freq'] = adc_centre_frequency
else:
block.MixerSettings['Freq'] = -adc_centre_frequency
block.UpdateEvent(1)
class FrequencyProcessor():
def __init__(self,
configuration=None):
# Avoid a mutable default argument: the dict literal would be shared
# between instances and mutated by the loop below.
configuration = configuration if configuration is not None else {}
default_config = {'sampling-freq' : 2048e6,
'window' : 'blackman'}
for default_key in default_config.keys():
if default_key not in configuration:
configuration[default_key] = default_config[default_key]
self._config = configuration
def _window(self, data):
return data * getattr(np, self._config['window'])(len(data))
def _fft(self, data):
return np.fft.fftshift(np.fft.fft(data))
def _psd(self, data):
return (abs(data)**2)/(self._config['sampling-freq']*np.sum(getattr(np, self._config['window'])(len(data))**2))
def _decibel(self, data):
return 10*np.where(data > 0, np.log10(data), 0)
def convert_to_freq(self, data):
data = self._window(data)
data = self._fft(data)
data = self._psd(data)
return self._decibel(data)
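The window → FFT → PSD → dB pipeline of FrequencyProcessor can be exercised on a synthetic tone. The numbers below are illustrative assumptions (the sampling rate and the 128 MHz tone are chosen so the tone sits exactly on an FFT bin), not values read from hardware:

```python
import numpy as np

fs = 2048e6                              # assumed sampling rate
n = 256
t = np.arange(n) / fs
tone = np.exp(2j * np.pi * 128e6 * t)    # 128 MHz lands exactly on a bin

win = np.blackman(n)                     # same window family as the class
spectrum = np.fft.fftshift(np.fft.fft(tone * win))
psd = np.abs(spectrum) ** 2 / (fs * np.sum(win ** 2))
psd_db = 10 * np.where(psd > 0, np.log10(psd), 0)

freqs = np.arange(-fs / 2, fs / 2, fs / n)
peak = freqs[np.argmax(psd_db)]          # expected near +128 MHz
```

This mirrors `convert_to_freq` step for step, and the frequency axis matches the one built by ComplexFrequencyPlot.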
class CoarseMixerApplication():
def __init__(self,
tx_channel,
rx_channel,
sample_frequency=4096e6,
number_samples=2048,
centre_frequency=1024,
window='blackman',
height=None,
width=None):
tx_channel.dac_block.MixerSettings.update({
'MixerType': xrfdc.MIXER_TYPE_FINE,
'FineMixerScale': xrfdc.MIXER_SCALE_0P7,
'Freq': 1024,
})
rx_channel.adc_block.MixerSettings.update({
'CoarseMixFreq' : xrfdc.COARSE_MIX_SAMPLE_FREQ_BY_FOUR,
'MixerType': xrfdc.MIXER_TYPE_COARSE,
})
tx_channel.control.gain = 0.5
tx_channel.control.enable = True
def set_CoarseMixFreq(widget):
desired_CoarseMixFreq = widget['new']
rx_channel.adc_block.MixerSettings.update({
'CoarseMixFreq' : desired_CoarseMixFreq,
'MixerType': xrfdc.MIXER_TYPE_COARSE,
})
c_data = rx_channel.transfer(packetsize = number_samples)
self.time_plot.update_data(c_data)
freq = self.frequency_processor.convert_to_freq(c_data)
self.frequency_plot.update_data(freq)
def set_desired_freq(widget):
centre_frequency = widget['new']
self.dac_tone_generator.centre_frequency = centre_frequency
c_data = rx_channel.transfer(packetsize = number_samples)
self.time_plot.update_data(c_data)
freq = self.frequency_processor.convert_to_freq(c_data)
self.frequency_plot.update_data(freq)
self.dac_tone_generator = DAC_ToneGenerator(tx_channel, centre_frequency)
self.frequency_processor = FrequencyProcessor(configuration={
'sampling-freq' : sample_frequency/2,
'window' : window})
c_data = rx_channel.transfer(packetsize=number_samples)
freq = self.frequency_processor.convert_to_freq(c_data)
self.time_plot = ComplexTimePlot({
'sampling-freq' : sample_frequency,
'number-samples' : number_samples,
'height' : height,
'width' : width,
'c_data' : c_data,
'title' : 'Receiver: Complex Time Plot',
'x-axis-title' : 'Time (s)',
'y-axis-title' : 'Amplitude'})
self.frequency_plot = ComplexFrequencyPlot({
'sampling-freq' : sample_frequency/2,
'number-samples' : number_samples,
'centre-freq' : centre_frequency,
'height' : height,
'width' : width,
'data' : freq,
'title' : 'Receiver: Complex Frequency Plot',
'x-axis-title' : 'Frequency (Hz)',
'y-axis-title' : 'Power Spectral Density (dB)'})
self.desired_freq_slider = ipw.FloatSlider(
value=(centre_frequency),
min=1,
max=(sample_frequency/2)*1e-6,
step=1,
description='Transmitter Frequency:',
disabled=False,
continuous_update=True,
orientation='horizontal',
readout=True,
style = {'description_width': 'initial'})
self.desired_coarse_mix_freq_dropdown = ipw.Dropdown(
options=[('fs/2', xrfdc.COARSE_MIX_SAMPLE_FREQ_BY_TWO),
('fs/4', xrfdc.COARSE_MIX_SAMPLE_FREQ_BY_FOUR),
('-fs/4', xrfdc.COARSE_MIX_MIN_SAMPLE_FREQ_BY_FOUR)],
value=xrfdc.COARSE_MIX_SAMPLE_FREQ_BY_FOUR,
description='Receiver Frequency:',
disabled=False,
continuous_update=False,
orientation='horizontal',
readout=True,
style = {'description_width': 'initial'})
self.desired_freq_slider.observe(set_desired_freq, 'value')
self.desired_coarse_mix_freq_dropdown.observe(set_CoarseMixFreq, 'value')
def display(self):
return ipw.VBox([self.time_plot.get_plot(),
self.frequency_plot.get_plot(),
self.desired_freq_slider, self.desired_coarse_mix_freq_dropdown])
class FineMixerApplication():
def __init__(self,
tx_channel,
rx_channel,
sample_frequency=4096e6,
number_samples=2048,
centre_frequency=1024,
adc_centre_frequency = 1024,
window='blackman',
height=None,
width=None):
tx_channel.dac_block.MixerSettings.update({
'MixerType': xrfdc.MIXER_TYPE_FINE,
'FineMixerScale': xrfdc.MIXER_SCALE_0P7,
'Freq': 1024,
})
rx_channel.adc_block.MixerSettings.update({
'MixerType': xrfdc.MIXER_TYPE_FINE,
'FineMixerScale': xrfdc.MIXER_SCALE_1P0,
'Freq': -1024
})
tx_channel.control.gain = 0.5
tx_channel.control.enable = True
def set_desired_freq(widget):
centre_frequency = widget['new']
self.dac_tone_generator.centre_frequency = centre_frequency
c_data = rx_channel.transfer(packetsize = number_samples)
self.time_plot.update_data(c_data)
freq = self.frequency_processor.convert_to_freq(c_data)
self.frequency_plot.update_data(freq)
def set_adc_desired_freq(widget):
adc_centre_frequency = widget['new']
self.adc_tone_generator.adc_centre_frequency = adc_centre_frequency
c_data = rx_channel.transfer(packetsize = number_samples)
self.time_plot.update_data(c_data)
freq = self.frequency_processor.convert_to_freq(c_data)
self.frequency_plot.update_data(freq)
self.dac_tone_generator = DAC_ToneGenerator(tx_channel, centre_frequency)
self.adc_tone_generator = ADC_ToneGenerator(rx_channel, adc_centre_frequency)
self.frequency_processor = FrequencyProcessor(configuration={
'sampling-freq' : sample_frequency/2,
'window' : window})
c_data = rx_channel.transfer(packetsize=number_samples)
freq = self.frequency_processor.convert_to_freq(c_data)
self.time_plot = ComplexTimePlot({
'sampling-freq' : sample_frequency,
'number-samples' : number_samples,
'height' : height,
'width' : width,
'c_data' : c_data,
'title' : 'Receiver: Complex Time Plot',
'x-axis-title' : 'Time (s)',
'y-axis-title' : 'Amplitude'})
self.frequency_plot = ComplexFrequencyPlot({
'sampling-freq' : sample_frequency/2,
'number-samples' : number_samples,
'centre-freq' : centre_frequency,
'height' : height,
'width' : width,
'data' : freq,
'title' : 'Receiver: Complex Frequency Plot',
'x-axis-title' : 'Frequency (Hz)',
'y-axis-title' : 'Power Spectral Density (dB)'})
self.desired_freq_slider = ipw.FloatSlider(
value=(centre_frequency),
min=1,
max=(sample_frequency/2)*1e-6,
step=1,
description='Transmitter Frequency:',
disabled=False,
continuous_update=True,
orientation='horizontal',
readout=True,
style = {'description_width': 'initial'})
self.desired_adc_freq_slider = ipw.FloatSlider(
value=(adc_centre_frequency),
min=-(sample_frequency/2)*1e-6,
max=(sample_frequency/2)*1e-6,
step=1,
description='Receiver Frequency:',
disabled=False,
continuous_update=True,
orientation='horizontal',
readout=True,
style = {'description_width': 'initial'})
self.desired_freq_slider.observe(set_desired_freq, 'value')
self.desired_adc_freq_slider.observe(set_adc_desired_freq, 'value')
def display(self):
return ipw.VBox([self.time_plot.get_plot(),
self.frequency_plot.get_plot(),
self.desired_freq_slider, self.desired_adc_freq_slider])
| 38.617711 | 122 | 0.558389 | 1,846 | 17,880 | 5.152221 | 0.102925 | 0.058879 | 0.034066 | 0.03007 | 0.876249 | 0.796551 | 0.759542 | 0.705919 | 0.684891 | 0.66155 | 0 | 0.012699 | 0.330593 | 17,880 | 462 | 123 | 38.701299 | 0.781937 | 0.004362 | 0 | 0.706667 | 0 | 0 | 0.119395 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.085333 | false | 0 | 0.013333 | 0.032 | 0.152 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b15a99cd85402012ef6f70aec2a69b2ef3897e62 | 74 | py | Python | two-fer/two_fer.py | SurgicalSteel/python-exercism | 7f478331be3fd9e80df9fdee010082f8b58f1a22 | [
"WTFPL"
] | null | null | null | two-fer/two_fer.py | SurgicalSteel/python-exercism | 7f478331be3fd9e80df9fdee010082f8b58f1a22 | [
"WTFPL"
] | null | null | null | two-fer/two_fer.py | SurgicalSteel/python-exercism | 7f478331be3fd9e80df9fdee010082f8b58f1a22 | [
"WTFPL"
] | null | null | null | def two_fer(name="you"):
return f"One for {name}, one for me."
b1748e0b84fdbb7282199c2020b9ba56fbe354c4 | 5,018 | py | Python | scripts/frontend/custom_widgets/CustomLabels.py | MichaelLapshin/Virtual-Hand-Application | 7c27317feae3d54fd10e616858ab0ab79bda6338 | [
"MIT"
] | 1 | 2021-08-31T05:22:04.000Z | 2021-08-31T05:22:04.000Z | scripts/frontend/custom_widgets/CustomLabels.py | MichaelLapshin/Virtual-Hand-Application | 7c27317feae3d54fd10e616858ab0ab79bda6338 | [
"MIT"
] | null | null | null | scripts/frontend/custom_widgets/CustomLabels.py | MichaelLapshin/Virtual-Hand-Application | 7c27317feae3d54fd10e616858ab0ab79bda6338 | [
"MIT"
] | null | null | null | import tkinter
import tkinter.font
# Custom Labels constants
from scripts import General, Parameters, Constants
from scripts.frontend.custom_widgets.WidgetInterface import WidgetInterface
"""
Parent generic label
"""
class Label(tkinter.Label, WidgetInterface):
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
tkinter.Label.__init__(self, root, text=text,
padx=padx, pady=pady)
self.grid(column=column, row=row)
self.grid(columnspan=columnspan, rowspan=rowspan)
self.grid(padx=Constants.STANDARD_SPACING, pady=Constants.STANDARD_SPACING)
self.grid(sticky=tkinter.NSEW)
# Config
self.anchor(tkinter.CENTER)
self.config(bd=1, relief=tkinter.RIDGE)
self.config(padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING)
def update_colour(self):
super().update_colour()
self.config(bg=General.washed_colour_hex(Parameters.COLOUR_ALPHA, Parameters.ColourGrad_C))
"""
Custom label
"""
class NavigationLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
padx=padx, pady=pady,
text=text)
class AccountLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
padx=padx, pady=pady,
text=text)
# Padding
self.config(padx=12, pady=10)
self.grid(padx=16, pady=16)
class InformationLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
text=text)
# Grid
self.grid(sticky=tkinter.NSEW)
self.grid(padx=Constants.STANDARD_SPACING, pady=Constants.STANDARD_SPACING)
def update_colour(self):
super().update_colour()
# Colour
self.config(bg=General.washed_colour_hex(Parameters.COLOUR_BRAVO, Parameters.ColourGrad_C))
class InfoEntryLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
text=text)
# Grid
self.grid(sticky=tkinter.NSEW)
self.grid(padx=Constants.LONG_SPACING, pady=Constants.STANDARD_SPACING)
def update_colour(self):
super().update_colour()
# Colour
self.config(bg=General.washed_colour_hex(Parameters.COLOUR_BRAVO, Parameters.ColourGrad_C))
class SearchLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
padx=padx, pady=pady,
text=text)
self.config(bd=1, relief=tkinter.RIDGE)
self.grid(sticky=tkinter.NSEW)
def update_colour(self):
super().update_colour()
self.config(bg=General.washed_colour_hex(Parameters.COLOUR_BRAVO, Parameters.ColourGrad_C))
class PlotLabel(Label):
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
padx=padx, pady=pady,
text=text)
class TitleLabel(Label):
TITLE_FONT_SIZE = 16
def __init__(self, root, column, row, columnspan=1, rowspan=1,
padx=Constants.SHORT_SPACING, pady=Constants.SHORT_SPACING, text=None):
Label.__init__(self, root, column=column, row=row,
columnspan=columnspan, rowspan=rowspan,
padx=padx, pady=pady,
text=text)
# Title configurations
self.config(font=tkinter.font.Font(size=TitleLabel.TITLE_FONT_SIZE))
self.config(padx=Constants.LONG_SPACING, pady=Constants.STANDARD_SPACING)
self.grid(padx=Constants.LONG_SPACING, pady=Constants.STANDARD_SPACING)
def update_colour(self):
super().update_colour()
self.config(bg=General.washed_colour_hex(Parameters.COLOUR_ALPHA, Parameters.ColourGrad_B))
| 38.305344 | 99 | 0.643882 | 571 | 5,018 | 5.443082 | 0.119089 | 0.041184 | 0.061776 | 0.086873 | 0.796654 | 0.784427 | 0.784427 | 0.781853 | 0.728764 | 0.728764 | 0 | 0.007457 | 0.251694 | 5,018 | 130 | 100 | 38.6 | 0.82024 | 0.01654 | 0 | 0.715909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.147727 | false | 0 | 0.045455 | 0 | 0.295455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
49205bae4af24820f6b1c7c3ab8f8a119b2df0c5 | 382 | py | Python | terrascript/data/azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | terrascript/data/azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | terrascript/data/azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # terrascript/data/azurestack.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:13:15 UTC)
#
# For imports without namespace, e.g.
#
# >>> import terrascript.data.azurestack
#
# instead of
#
# >>> import terrascript.data.hashicorp.azurestack
#
# This is only available for 'official' and 'partner' providers.
from terrascript.data.hashicorp.azurestack import *
| 25.466667 | 73 | 0.746073 | 49 | 382 | 5.816327 | 0.693878 | 0.210526 | 0.175439 | 0.238596 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036254 | 0.133508 | 382 | 14 | 74 | 27.285714 | 0.824773 | 0.795812 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
498b52396bc8baf95be8cbfac88447e99390594f | 46,479 | py | Python | PyCidX/finiteDifferences.py | UCL/cid-X | c718a666659164094af9bcf93ddb798a7dcb7b42 | [
"BSD-3-Clause"
] | 9 | 2020-02-19T12:51:06.000Z | 2021-12-14T12:01:59.000Z | PyCidX/finiteDifferences.py | UCL/cid-X | c718a666659164094af9bcf93ddb798a7dcb7b42 | [
"BSD-3-Clause"
] | 9 | 2020-02-19T09:50:04.000Z | 2021-08-15T16:07:26.000Z | PyCidX/finiteDifferences.py | UCL/cid-X | c718a666659164094af9bcf93ddb798a7dcb7b42 | [
"BSD-3-Clause"
] | 1 | 2020-09-07T11:57:24.000Z | 2020-09-07T11:57:24.000Z | #!/usr/bin/env python3
'''
@author: Bjoern Eiben
@summary: Classes which collect the 2D (finiteDifferences2D) and 3D (finiteDifferences3D) finite difference operations.
'''
import numpy as np
class finiteDifferences2D:
def __init__( self, dx=1.0, dy=1.0, differentiationScheme='central' ):
'''
@summary: Define the regular grid spacing in x (1st array axis) and y (2nd array axis) direction
@note: When the parameters dx and dy are changed after initialisation, call .initialise() to
update pre-calculated values!
Set .diffScheme to 'central', 'fwdBwd', 'fwd', 'bwd' or 'central4'
@param dx: Grid spacing in the direction of the first array axis
@param dy: Grid spacing in the direction of the second array axis
'''
self.dx = dx
self.dy = dy
self.diffScheme = differentiationScheme
# Pre-calculate derived values
self.initialise()
def initialise( self ):
'''
@summary: Pre-calculate values derived from dx and dy
'''
#
# Note: The multiplication is much more efficient than the division. Hence the reciprocal values are pre-calculated.
# The _oo_ means one-over
#
self._oo_dx = 1.0 / self.dx
self._oo_dy = 1.0 / self.dy
self._oo_2dx = 1.0 / (2.0 * self.dx)
self._oo_2dy = 1.0 / (2.0 * self.dy)
self._oo_dxdy = 1.0 / (self.dx * self.dy)
self._oo_2dxdy = 1.0 / (2.0 * self.dx * self.dy)
self._oo_4dxdy = 1.0 / (4.0 * self.dx * self.dy)
self._oo_12dx = 1.0 / (12.0 * self.dx)
self._oo_12dy = 1.0 / (12.0 * self.dy)
self._oo_dx2 = 1.0 / (self.dx * self.dx)
self._oo_dy2 = 1.0 / (self.dy * self.dy)
def diff( self, i, axis1, axis2=None ):
'''
@summary: Differentiation into a certain direction. Central differences are always used here.
@return: The differentiation result
@param i: The input
@param axis1: First differentiation into this direction
@param axis2: Second differentiation direction. Set to None if first derivative is required only
'''
o = np.zeros_like(i)
if axis2 is None:
# first derivative
if axis1 == 0:
self.diffXC(i, o)
elif axis1 == 1:
self.diffYC(i, o)
# second/mixed derivative...
elif axis2 == 0:
if axis1 == 0:
self.diffXX(i, o)
elif axis1 == 1:
self.diffXY(i, o)
elif axis2 == 1:
if axis1 == 0:
self.diffXY(i, o)
elif axis1 == 1:
self.diffYY(i, o)
return o
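As a quick sanity check of the central stencil used throughout this class, here is a standalone numpy sketch (independent of the class itself): the interior estimate (f[i+1] - f[i-1]) / (2*dx) is exact for a quadratic.

```python
import numpy as np

dx = 0.5
x = np.arange(0.0, 5.0 + dx, dx)
f = x ** 2

d = np.zeros_like(f)
d[1:-1] = (f[2:] - f[:-2]) / (2.0 * dx)  # central difference, interior
d[0] = (f[1] - f[0]) / dx                # one-sided at the boundaries
d[-1] = (f[-1] - f[-2]) / dx
# for f(x) = x**2 the interior result equals the true derivative 2*x
```

The boundary estimates are only first-order accurate, which is why diffXC above falls back to one-sided differences at the first and last rows.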
def diffX( self, i, o, it=0 ):
'''
@summary: Differentiation in the X direction of a given input array according to the specified diffScheme.
@note: The output array must be pre-allocated by the calling function.
@param i: Input array
@param o: Output array
'''
if self.diffScheme =='central':
self.diffXC(i, o)
elif self.diffScheme == 'fwd':
self.diffXF(i, o)
elif self.diffScheme == 'bwd':
self.diffXB(i, o)
elif self.diffScheme == 'central4':
self.diffXC2(i, o)
else :
if (it % 2) == 0:
self.diffXF(i, o)
return
else:
self.diffXB(i, o)
return
def diffY(self, i, o, it=0):
'''
@summary: Differentiation in the Y direction of a given input array according to the specified diffScheme.
@note: The output array must be pre-allocated by the calling function.
@param i: Input array
@param o: Output array
'''
if self.diffScheme =='central':
self.diffYC(i, o)
elif self.diffScheme == 'fwd':
self.diffYF(i, o)
elif self.diffScheme == 'bwd':
self.diffYB(i, o)
elif self.diffScheme == 'central4':
self.diffYC2(i, o)
else :
if (it % 2) == 0:
self.diffYF(i, o)
return
else:
self.diffYB(i, o)
return
def diffXC( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
o[1:-1,:] = ( - i[ 0:-2, : ]
+ i[ 2: , : ] ) * self._oo_2dx #/ (2.0*self.dx)
o[ 0,:] = ( - i[ 0, : ]
+ i[ 1, : ] ) * self._oo_dx #/ (self.dx)
o[ -1,:] = ( - i[ -2, : ]
+ i[ -1, : ] ) * self._oo_dx #/ (self.dx)
def diffYC( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function
'''
o[:,1:-1] = ( - i[ :, 0:-2 ]
+ i[ :, 2: ] ) * self._oo_2dy #/ (2.0*self.dy)
o[ :, 0] = ( - i[ :, 0 ]
+ i[ :, 1 ] ) * self._oo_dy #/ (self.dy)
o[ :, -1] = ( - i[ :, -2 ]
+ i[ :, -1 ] ) * self._oo_dy #/ (self.dy)
def diffXF( self, i, o ):
'''
@summary: Calculates the forward difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
# Forward difference used:
# pos. | x- | x0 | x+ |
# coeff. | 0 | -1 | +1 |
#
o[0:-1,:] = ( - i[ 0:-1, :]
+ i[ 1: , :] ) * self._oo_dx #/ (self.dx)
# Switch to backward difference only at upper end
o[ -1,:] = ( - i[ -2, : ]
+ i[ -1, : ] ) * self._oo_dx # / (self.dx)
def diffYF( self, i, o ):
'''
@summary: Calculates the forward difference of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
o[:,0:-1] = ( - i[ :, 0:-1 ]
+ i[ :, 1: ] ) * self._oo_dy #/ (self.dy)
o[:,-1] = ( - i[ :, -2 ]
+ i[ :, -1 ] ) * self._oo_dy #/ (self.dy)
def diffXB( self, i, o ):
'''
@summary: Calculates the backward difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
# Backward difference used:
# pos. | x- | x0 | x+ |
# coeff. | -1 | +1 | 0 |
#
o[1:,:] = ( - i[ 0:-1, : ]
+ i[ 1: , : ] ) * self._oo_dx #/ (self.dx)
# Switch to forward difference only at lower end
o[0,:] = ( - i[ 0, : ]
+ i[ 1, : ] ) * self._oo_dx #/ (self.dx)
def diffYB( self, i, o ):
'''
@summary: Calculates the backward difference of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
o[:,1:] = ( - i[ :, 0:-1 ]
+ i[ :, 1: ] ) * self._oo_dy #/ (self.dy)
# Switch to forward difference only at lower end
o[:,0] = ( - i[ :, 0 ]
+ i[ :, 1 ] ) * self._oo_dy #/ (self.dy)
def diffXC2( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the x-direction
with 4th order accuracy using a wider kernel of size 5 and writes it into the
output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
#
# 4th order accuracy everywhere it is possible
#
        #i[4:  ,:] * ( -1.0 /12.0 )  # -1/12 -> -1   x+2 element
        #i[3:-1,:] * (  2.0 / 3.0 )  #  +2/3 -> +8   x+1 element
        #i[2:-2,:] * (  0.0 )        #   0   ->  0   x (central) element
        #i[1:-3,:] * ( -2.0 / 3.0 )  #  -2/3 -> -8   x-1 element
        #i[0:-4,:] * (  1.0 /12.0 )  # +1/12 -> +1   x-2 element
o[2:-2,:] = ( - 1.0 * i[4: ,:]
+ 8.0 * i[3:-1,:]
- 8.0 * i[1:-3,:]
+ 1.0 * i[0:-4,:] ) * self._oo_12dx #/ (12.0 * self.dx)
# simple central difference at borders...
o[ 1,:] = ( - i[ 0, : ]
+ i[ 2, : ] ) * self._oo_2dx #/ (2.0*self.dx)
o[-2,:] = ( - i[ -3, : ]
+ i[ -1, : ] ) * self._oo_2dx #/ (2.0*self.dx)
# and forward/backward difference everywhere else
o[ 0,:] = ( - i[ 0, : ]
+ i[ 1, : ] ) * self._oo_dx #/ (self.dx)
o[ -1,:] = ( - i[ -2, : ]
+ i[ -1, : ] ) * self._oo_dx #/ (self.dx)
def diffYC2( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the y-direction
with 4th order accuracy using a wider kernel of size 5 and writes it into the
output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
#
# 4th order accuracy everywhere it is possible
#
o[:,2:-2] = ( - 1.0 * i[:,4: ]
+ 8.0 * i[:,3:-1]
- 8.0 * i[:,1:-3]
                      + 1.0 * i[:,0:-4] ) * self._oo_12dy  #/ (12.0 * self.dy)
# simple central difference at borders...
o[:, 1] = ( - i[ :, 0]
+ i[ :, 2] ) * self._oo_2dy #/ (2.0*self.dy)
o[:,-2] = ( - i[ :,-3]
+ i[ :,-1] ) * self._oo_2dy #/ (2.0*self.dy)
# and forward/backward difference everywhere else
o[ :, 0] = ( - i[ :, 0 ]
+ i[ :, 1 ] ) * self._oo_dy #/ (self.dy)
o[ :, -1] = ( - i[ :, -2 ]
+ i[ :, -1 ] ) * self._oo_dy #/ (self.dy)
def diffXX( self, i, o ):
'''
@summary: Calculates the second difference (1, -2, 1) of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
# central
o[1:-1,:] = ( 1.0 * i[ 0:-2, : ]
- 2.0 * i[ 1:-1, : ]
+ 1.0 * i[ 2: , : ] ) * self._oo_dx2 #/ self.dx2
        # switch to forward/backward difference, which is equivalent to copying the data
o[ 0,:] = o[ 1,:]
o[-1,:] = o[-2,:]
def diffYY( self, i, o ):
'''
@summary: Calculates the second difference (1, -2, 1) of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
o[:,1:-1] = ( 1.0 * i[ :, 0:-2 ]
- 2.0 * i[ :, 1:-1 ]
+ 1.0 * i[ :, 2: ] ) * self._oo_dy2 #/ self.dy2
        # switch to forward/backward difference, which is equivalent to copying the data
o[ :,0] = o[:, 1]
o[:,-1] = o[:,-2]
def diffXY( self, i, o ):
'''
@summary: Calculates the mixed discrete derivative difference of an input scalar field in the xy-direction
and writes it into the output field provided.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
o[1:-1, 1:-1 ] = ( i[ 2: , 2: ]
+ i[ 0:-2, 0:-2 ]
- i[ 2: , 0:-2 ]
- i[ 0:-2, 2: ] ) * self._oo_4dxdy #/ (4.0 * self.dxy)
o[0, 1:-1 ] = ( i[ 1, 2: ]
+ i[ 0, 0:-2 ]
- i[ 1, 0:-2 ]
- i[ 0, 2: ] ) * self._oo_2dxdy #/ (2.0*self.dxy)
o[-1, 1:-1 ] = ( i[ -1, 2: ]
+ i[ -2, 0:-2 ]
- i[ -1, 0:-2 ]
- i[ -2, 2: ] ) * self._oo_2dxdy #/ (2.0*self.dxy)
o[1:-1, 0 ] = ( i[2: , 1 ]
+ i[0:-2, 0 ]
- i[0:-2, 1 ]
- i[2: , 0 ] ) * self._oo_2dxdy #/ (2.0*self.dxy)
o[1:-1, -1 ] = ( i[ 2: , -1 ]
+ i[ 0:-2, -2 ]
- i[ 0:-2, -1 ]
- i[ 2: , -2 ] ) * self._oo_2dxdy #/ (2.0*self.dxy)
o[0, 0 ] = ( i[ 1, 1 ]
+ i[ 0, 0 ]
- i[ 1, 0 ]
- i[ 0, 1 ] ) * self._oo_dxdy #/ (self.dxy)
o[-1, -1 ] = ( i[-1,-1 ]
+ i[-2,-2 ]
- i[-1,-2 ]
- i[-2,-1 ] ) * self._oo_dxdy #/ (self.dxy)
o[0, -1 ] = ( i[ 1,-1 ]
+ i[ 0,-2 ]
- i[ 1,-2 ]
- i[ 0,-1 ] ) * self._oo_dxdy #/ (self.dxy)
o[-1, 0 ] = ( i[ -1,1 ]
+ i[ -2,0 ]
- i[ -2,1 ]
- i[ -1,0 ] ) * self._oo_dxdy #/ (self.dxy)
def avgXF( self, i, o ):
'''
        @summary: Calculates the forward average of an input scalar field in the x-direction
                  and writes it into the output field provided. This can be interpreted as the
                  averaged value being located at the inter-node positions.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
# Forward average used:
# pos. | x- | x0 | x+ |
# coeff. | 0 | .5 | .5 |
#
o[0:-1,:] = ( i[ 0:-1, : ]
+ i[ 1: , : ] ) * 0.5
# Switch to backward average at upper end
o[ -1,:] = ( i[ -2, : ]
+ i[ -1, : ] ) * 0.5
def avgYF( self, i, o ):
'''
        @summary: Calculates the forward average of an input scalar field in the y-direction
                  and writes it into the output field provided. This can be interpreted as the
                  averaged value being located at the inter-node positions.
@param i: Input scalar field (2D np.array)
@param o: Output scalar field (2D np.array). Must have been allocated by calling function.
'''
        o[:,0:-1] = (   i[ :, 0:-1 ]
                      + i[ :, 1:   ] ) * 0.5
        # Switch to backward average at upper end
        o[:, -1]  = (   i[ :, -2 ]
                      + i[ :, -1 ] ) * 0.5
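The interior stencil in `diffXC2`/`diffYC2` above is the standard fourth-order central difference. As a quick stand-alone check (plain NumPy, independent of the class, whose constructor lies outside this excerpt), the stencil can be compared against the second-order scheme on a smooth test function:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
f = np.sin(2.0 * np.pi * x)
exact = 2.0 * np.pi * np.cos(2.0 * np.pi * x)

# Interior 5-point stencil used by diffXC2: (-1, +8, 0, -8, +1) / (12 dx)
d4 = (-f[4:] + 8.0 * f[3:-1] - 8.0 * f[1:-3] + f[0:-4]) / (12.0 * dx)
err4 = np.max(np.abs(d4 - exact[2:-2]))

# Plain second-order central difference for comparison
d2 = (f[2:] - f[:-2]) / (2.0 * dx)
err2 = np.max(np.abs(d2 - exact[1:-1]))
```

On this grid the fourth-order error sits roughly three orders of magnitude below the second-order one, which is why the class pays for the wider kernel away from the borders.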
class finiteDifferences3D:
def __init__( self, dx=1.0, dy=1.0, dz=1.0, differentiationScheme='central' ):
        ''' Define the regular grid spacing in x (1st array axis), y (2nd array axis) and z (3rd array axis) direction
:note: When the parameters ``dx``, ``dy`` or ``dz`` are changed after initialisation, call .initialise() to
update pre-calculated values!
Set .diffScheme to ``central``, ``fwdBwd``, ``fwd``, ``bwd`` or ``central4``
:param dx: Grid spacing in the direction of the first array axis.
:type dx: float
:param dy: Grid spacing in the direction of the second array axis.
:type dy: float
:param dz: Grid spacing in the direction of the third array axis.
:type dz: float
'''
self.dx = dx
self.dy = dy
self.dz = dz
self.diffScheme = differentiationScheme
# Pre-calculate derived values
self.initialise()
def resetSpacing(self, dx, dy, dz):
''' Reset the spacing of the finite difference class
:param dx: Spacing along the first coordinate axis
:type dx: float
:param dy: Spacing along the second coordinate axis
:type dy: float
:param dz: Spacing along the third coordinate axis
:type dz: float
'''
self.dx = dx
self.dy = dy
self.dz = dz
self.initialise()
def initialise( self ):
        ''' Pre-calculate values derived from dx, dy and dz
'''
#
# Note: The multiplication is much more efficient than the division. Hence the reciprocal values are pre-calculated.
# The _oo_ means one-over
#
self._oo_dx = 1.0 / self.dx
self._oo_dy = 1.0 / self.dy
self._oo_dz = 1.0 / self.dz
self._oo_2dx = 1.0 / (2.0 * self.dx)
self._oo_2dy = 1.0 / (2.0 * self.dy)
self._oo_2dz = 1.0 / (2.0 * self.dz)
self._oo_dxdy = 1.0 / (self.dx * self.dy)
self._oo_dxdz = 1.0 / (self.dx * self.dz)
self._oo_dydz = 1.0 / (self.dy * self.dz)
self._oo_2dxdy = 1.0 / (2.0 * self.dx * self.dy)
self._oo_2dxdz = 1.0 / (2.0 * self.dx * self.dz)
self._oo_2dydz = 1.0 / (2.0 * self.dy * self.dz)
self._oo_4dxdy = 1.0 / (4.0 * self.dx * self.dy)
self._oo_4dxdz = 1.0 / (4.0 * self.dx * self.dz)
self._oo_4dydz = 1.0 / (4.0 * self.dy * self.dz)
self._oo_12dx = 1.0 / (12.0 * self.dx)
self._oo_12dy = 1.0 / (12.0 * self.dy)
        self._oo_12dz  = 1.0 / (12.0 * self.dz)  # used by diffZC2
self._oo_dx2 = 1.0 / (self.dx * self.dx)
self._oo_dy2 = 1.0 / (self.dy * self.dy)
self._oo_dz2 = 1.0 / (self.dz * self.dz)
def diff( self, i, axis1, axis2=None ):
''' Differentiation into a certain direction. Central differences are always used here.
:param i: The input array
:type i: np.array
:param axis1: First differentiation into this direction. Must be :math:`\\in \\{ 0,1,2\\}`.
:type axis1: int
:param axis2: Second differentiation direction. Set to None if first derivative is required only
:return: The differentiation result
'''
o = np.zeros_like(i)
        if axis2 is None:
# first derivative
if axis1 == 0:
self.diffXC(i, o)
elif axis1 == 1:
self.diffYC(i, o)
elif axis1 == 2:
self.diffZC(i, o)
# second/mixed derivative...
elif axis2 == 0:
if axis1 == 0:
self.diffXX(i, o)
elif axis1 == 1:
self.diffXY(i, o)
elif axis1 == 2:
self.diffXZ(i, o)
elif axis2 == 1:
if axis1 == 0:
self.diffXY(i, o)
elif axis1 == 1:
self.diffYY(i, o)
elif axis1 == 2:
self.diffYZ(i, o)
elif axis2 == 2:
if axis1 == 0:
self.diffXZ(i, o)
elif axis1 == 1:
self.diffYZ(i, o)
elif axis1 == 2:
self.diffZZ(i, o)
return o
def diffX( self, i, o, it=0 ):
'''
        @summary: Differentiation in the X direction of a given input array according to the specified diffScheme.
        @note: The output array is not allocated here; it is written in place and must be pre-allocated by the caller.
@param i: Input array
@param o: Output array
'''
if self.diffScheme =='central':
self.diffXC(i, o)
elif self.diffScheme == 'fwd':
self.diffXF(i, o)
elif self.diffScheme == 'bwd':
self.diffXB(i, o)
elif self.diffScheme == 'central4':
self.diffXC2(i, o)
else :
if (it % 2) == 0:
self.diffXF(i, o)
return
else:
self.diffXB(i, o)
return
def diffY(self, i, o, it=0):
'''
        @summary: Differentiation in the Y direction of a given input array according to the specified diffScheme.
        @note: The output array is not allocated here; it is written in place and must be pre-allocated by the caller.
@param i: Input array
@param o: Output array
'''
if self.diffScheme =='central':
self.diffYC(i, o)
elif self.diffScheme == 'fwd':
self.diffYF(i, o)
elif self.diffScheme == 'bwd':
self.diffYB(i, o)
elif self.diffScheme == 'central4':
self.diffYC2(i, o)
else :
if (it % 2) == 0:
self.diffYF(i, o)
return
else:
self.diffYB(i, o)
return
def diffZ(self, i, o, it=0):
'''
        @summary: Differentiation in the Z direction of a given input array according to the specified diffScheme.
        @note: The output array is not allocated here; it is written in place and must be pre-allocated by the caller.
@param i: Input array
@param o: Output array
'''
if self.diffScheme == 'central':
self.diffZC(i, o)
elif self.diffScheme == 'fwd':
self.diffZF(i, o)
elif self.diffScheme == 'bwd':
self.diffZB(i, o)
elif self.diffScheme == 'central4':
self.diffZC2(i, o)
else :
if (it % 2) == 0:
self.diffZF(i, o)
return
else:
self.diffZB(i, o)
return
def diffXC( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[1:-1,:,:] = ( - i[ 0:-2, :, : ]
+ i[ 2: , :, : ] ) * self._oo_2dx
o[ 0,:,:] = ( - i[ 0, :, : ]
+ i[ 1, :, : ] ) * self._oo_dx
o[ -1,:,:] = ( - i[ -2, :, : ]
+ i[ -1, :, : ] ) * self._oo_dx
def diffYC( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the y-direction
and writes it into the output field provided.
        @param i: Input scalar field (3D np.array)
        @param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,1:-1,:] = ( - i[ :, 0:-2, : ]
+ i[ :, 2: , : ] ) * self._oo_2dy
o[ :, 0,:] = ( - i[ :, 0, : ]
+ i[ :, 1, : ] ) * self._oo_dy
o[ :, -1,:] = ( - i[ :, -2, : ]
+ i[ :, -1, : ] ) * self._oo_dy
def diffZC(self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the z-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
        @param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,:,1:-1] = ( - i[ :, :, 0:-2 ]
+ i[ :, :, 2: ] ) * self._oo_2dz
o[:, :, 0] = ( - i[ :, :, 0 ]
+ i[ :, :, 1 ] ) * self._oo_dz
o[:, :, -1] = ( - i[ :, :, -2 ]
+ i[ :, :, -1 ] ) * self._oo_dz
def diffXF( self, i, o ):
'''
@summary: Calculates the forward difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
# Forward difference used:
# pos. | x- | x0 | x+ |
# coeff. | 0 | -1 | +1 |
#
o[0:-1,:,:] = ( - i[ 0:-1, :, : ]
+ i[ 1: , :, : ] ) * self._oo_dx
# Switch to backward difference only at upper end
o[ -1,:,:] = ( - i[ -2, :, : ]
+ i[ -1, :, : ] ) * self._oo_dx
def diffYF( self, i, o ):
'''
@summary: Calculates the forward difference of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,0:-1,:] = ( - i[ :, 0:-1, : ]
+ i[ :, 1: , : ] ) * self._oo_dy
o[:,-1,:] = ( - i[ :, -2, : ]
+ i[ :, -1, : ] ) * self._oo_dy
def diffZF( self, i, o ):
'''
@summary: Calculates the forward difference of an input scalar field in the z-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,:,0:-1] = ( - i[ :, :, 0:-1]
+ i[ :, :, 1: ] ) * self._oo_dz
o[:,:,-1] = ( - i[ :, :, -2]
+ i[ :, :, -1] ) * self._oo_dz
def diffXB( self, i, o ):
'''
@summary: Calculates the backward difference of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
        # Backward difference used:
# pos. | x- | x0 | x+ |
# coeff. | -1 | +1 | 0 |
#
o[1:,:,:] = ( - i[ 0:-1, :, : ]
+ i[ 1: , :, : ] ) * self._oo_dx
# Switch to forward difference only at lower end
o[0,:,:] = ( - i[ 0, :, : ]
+ i[ 1, :, : ] ) * self._oo_dx
def diffYB( self, i, o ):
'''
@summary: Calculates the backward difference of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,1:,:] = ( - i[ :, 0:-1, : ]
+ i[ :, 1: , : ] ) * self._oo_dy
# Switch to forward difference only at lower end
o[:,0,:] = ( - i[ :, 0, : ]
+ i[ :, 1, : ] ) * self._oo_dy
def diffZB( self, i, o ):
'''
@summary: Calculates the backward difference of an input scalar field in the z-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,:,1:] = ( - i[ :, :, 0:-1 ]
+ i[ :, :, 1: ] ) * self._oo_dz
# Switch to forward difference only at lower end
o[:,:,0] = ( - i[ :, :, 0 ]
+ i[ :, :, 1 ] ) * self._oo_dz
def diffXC2( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the x-direction
with 4th order accuracy using a wider kernel of size 5 and writes it into the
output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
#
# 4th order accuracy everywhere it is possible
#
#i[4: ,:,:] * ( -1.0 /12.0 ) # -1/12 -> -1 x+2 element
#i[3:-1,:,:] * ( 2.0 / 3.0 ) # +2/3 -> +8 x+1 element
#i[2:-2,:,:] * ( 0.0 ) # 0 -> 0 x (central) element
#i[1:-3,:,:] * ( -2.0 / 3.0 ) # -2/3 -> -8 x-1 element
#i[0:-4,:,:] * ( 1.0 /12.0 ) # +1/12 -> +1 x-2 element
o[2:-2,:,:] = ( - 1.0 * i[4: ,:,:]
+ 8.0 * i[3:-1,:,:]
- 8.0 * i[1:-3,:,:]
+ 1.0 * i[0:-4,:,:] ) * self._oo_12dx
# simple central difference at borders...
o[ 1,:,:] = ( - i[ 0, :, : ]
+ i[ 2, :, : ] ) * self._oo_2dx
o[-2,:,:] = ( - i[ -3, :, : ]
+ i[ -1, :, : ] ) * self._oo_2dx
# and forward/backward difference everywhere else
o[ 0,:,:] = ( - i[ 0, :, : ]
+ i[ 1, :, : ] ) * self._oo_dx
o[ -1,:,:] = ( - i[ -2, :, : ]
+ i[ -1, :, : ] ) * self._oo_dx
def diffYC2( self, i, o ):
'''
@summary: Calculates the central difference of an input scalar field in the y-direction
with 4th order accuracy using a wider kernel of size 5 and writes it into the
output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
#
# 4th order accuracy everywhere it is possible
#
o[:,2:-2,:] = ( - 1.0 * i[:,4: ,:]
+ 8.0 * i[:,3:-1,:]
- 8.0 * i[:,1:-3,:]
+ 1.0 * i[:,0:-4,:] ) * self._oo_12dy
# simple central difference at borders...
o[:, 1,:] = ( - i[ :,0, :]
+ i[ :,2, :] ) * self._oo_2dy
o[:,-2,:] = ( - i[ :,-3,:]
+ i[ :,-1,:] ) * self._oo_2dy
# and forward/backward difference everywhere else
o[ :, 0,:] = ( - i[ :, 0, : ]
+ i[ :, 1, : ] ) * self._oo_dy
o[ :, -1,:] = ( - i[ :, -2, : ]
+ i[ :, -1, : ] ) * self._oo_dy
def diffZC2( self, i, o ):
'''
        @summary: Calculates the central difference of an input scalar field in the z-direction
with 4th order accuracy using a wider kernel of size 5 and writes it into the
output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
#
# 4th order accuracy everywhere it is possible
#
o[:,:,2:-2] = ( - 1.0 * i[:,:,4: ]
+ 8.0 * i[:,:,3:-1]
- 8.0 * i[:,:,1:-3]
+ 1.0 * i[:,:,0:-4] ) * self._oo_12dz
# simple central difference at borders...
o[:,:, 1] = ( - i[ :,:,0]
+ i[ :,:,2] ) * self._oo_2dz
o[:,:,-2] = ( - i[ :,:,-3]
+ i[ :,:,-1] ) * self._oo_2dz
# and forward/backward difference everywhere else
o[ :, :, 0] = ( - i[ :, :, 0 ]
+ i[ :, :, 1 ] ) * self._oo_dz
o[ :, :,-1] = ( - i[ :, :, -2 ]
+ i[ :, :, -1 ] ) * self._oo_dz
def diffXX( self, i, o ):
'''
@summary: Calculates the second difference (1, -2, 1) of an input scalar field in the x-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
# central
o[1:-1,:,:] = ( 1.0 * i[ 0:-2, :, : ]
- 2.0 * i[ 1:-1, :, : ]
+ 1.0 * i[ 2: , :, : ] ) * self._oo_dx2
        # switch to forward/backward difference, which is equivalent to copying the data
o[ 0,:,:] = o[ 1,:,:]
o[-1,:,:] = o[-2,:,:]
def diffYY( self, i, o ):
'''
@summary: Calculates the second difference (1, -2, 1) of an input scalar field in the y-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,1:-1,:] = ( 1.0 * i[ :, 0:-2, : ]
- 2.0 * i[ :, 1:-1, : ]
+ 1.0 * i[ :, 2: , : ] ) * self._oo_dy2
        # switch to forward/backward difference, which is equivalent to copying the data
o[ :,0,:] = o[:, 1,:]
o[:,-1,:] = o[:,-2,:]
def diffZZ( self, i, o ):
'''
@summary: Calculates the second difference (1, -2, 1) of an input scalar field in the z-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:,:,1:-1] = ( 1.0 * i[ :, :, 0:-2 ]
- 2.0 * i[ :, :, 1:-1 ]
+ 1.0 * i[ :, :, 2: ] ) * self._oo_dz2
        # switch to forward/backward difference, which is equivalent to copying the data
o[ :,:,0] = o[:,:, 1]
o[:,:,-1] = o[:,:,-2]
def diffXY( self, i, o ):
'''
@summary: Calculates the mixed discrete derivatives of an input scalar field in the xy-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[1:-1, 1:-1, : ] = ( i[ 2: , 2: , : ]
+ i[ 0:-2, 0:-2, : ]
- i[ 2: , 0:-2, : ]
- i[ 0:-2, 2: , : ] ) * self._oo_4dxdy
o[0, 1:-1, : ] = ( i[ 1, 2: , : ]
+ i[ 0, 0:-2, : ]
- i[ 1, 0:-2, : ]
- i[ 0, 2: , : ] ) * self._oo_2dxdy
o[-1, 1:-1, : ] = ( i[ -1, 2: , : ]
+ i[ -2, 0:-2, : ]
- i[ -1, 0:-2, : ]
- i[ -2, 2: , : ] ) * self._oo_2dxdy
o[1:-1, 0, : ] = ( i[2: , 1, : ]
+ i[0:-2, 0, : ]
- i[0:-2, 1, : ]
- i[2: , 0, : ] ) * self._oo_2dxdy
o[1:-1, -1, : ] = ( i[ 2: , -1, : ]
+ i[ 0:-2, -2, : ]
- i[ 0:-2, -1, : ]
- i[ 2: , -2, : ] ) * self._oo_2dxdy
o[0, 0, : ] = ( i[ 1, 1, : ]
+ i[ 0, 0, : ]
- i[ 1, 0, : ]
- i[ 0, 1, : ] ) * self._oo_dxdy
o[-1, -1, : ] = ( i[-1,-1, : ]
+ i[-2,-2, : ]
- i[-1,-2, : ]
- i[-2,-1, : ] ) * self._oo_dxdy
o[0, -1, : ] = ( i[ 1,-1, : ]
+ i[ 0,-2, : ]
- i[ 1,-2, : ]
- i[ 0,-1, : ] ) * self._oo_dxdy
o[-1, 0, : ] = ( i[ -1,1, : ]
+ i[ -2,0, : ]
- i[ -2,1, : ]
- i[ -1,0, : ] ) * self._oo_dxdy
def diffXZ( self, i, o ):
'''
@summary: Calculates the mixed discrete derivatives of an input scalar field in the xz-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[1:-1, :, 1:-1] = ( i[ 2: , :, 2: ]
+ i[ 0:-2, :, 0:-2]
- i[ 2: , :, 0:-2]
- i[ 0:-2, :, 2: ] ) * self._oo_4dxdz
o[0, :, 1:-1 ] = ( i[ 1, :, 2: ]
+ i[ 0, :, 0:-2]
- i[ 1, :, 0:-2]
- i[ 0, :, 2: ] ) * self._oo_2dxdz
o[-1, :, 1:-1] = ( i[ -1, :, 2: ]
+ i[ -2, :, 0:-2]
- i[ -1, :, 0:-2]
- i[ -2, :, 2: ] ) * self._oo_2dxdz
o[1:-1, :, 0] = ( i[2: , :, 1]
+ i[0:-2, :, 0]
- i[0:-2, :, 1]
- i[2: , :, 0] ) * self._oo_2dxdz
o[1:-1, :, -1] = ( i[ 2: , :, -1]
+ i[ 0:-2, :, -2]
- i[ 0:-2, :, -1]
- i[ 2: , :, -2] ) * self._oo_2dxdz
o[0, :, 0] = ( i[ 1, :, 1]
+ i[ 0, :, 0]
- i[ 1, :, 0]
- i[ 0, :, 1] ) * self._oo_dxdz
o[-1, :, -1] = ( i[-1, :, -1]
+ i[-2, :, -2]
- i[-1, :, -2]
- i[-2, :, -1] ) * self._oo_dxdz
o[0, :, -1] = ( i[ 1, :, -1]
+ i[ 0, :, -2]
- i[ 1, :, -2]
- i[ 0, :, -1] ) * self._oo_dxdz
o[-1, :, 0] = ( i[ -1, :, 1]
+ i[ -2, :, 0]
- i[ -2, :, 1]
- i[ -1, :, 0] ) * self._oo_dxdz
def diffYZ( self, i, o ):
'''
@summary: Calculates the mixed discrete derivatives of an input scalar field in the yz-direction
and writes it into the output field provided.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
o[:, 1:-1, 1:-1] = ( i[ :, 2: , 2: ]
+ i[ :, 0:-2, 0:-2]
- i[ :, 2: , 0:-2]
- i[ :, 0:-2, 2: ] ) * self._oo_4dydz
o[:, 0, 1:-1 ] = ( i[:, 1, 2: ]
+ i[ :, 0, 0:-2]
- i[ :, 1, 0:-2]
- i[ :, 0, 2: ] ) * self._oo_2dydz
o[:, -1, 1:-1] = ( i[ :, -1, 2: ]
+ i[ :, -2, 0:-2]
- i[ :, -1, 0:-2]
- i[ :, -2, 2: ] ) * self._oo_2dydz
o[:, 1:-1, 0] = ( i[:, 2: , 1]
+ i[:, 0:-2, 0]
- i[:, 0:-2, 1]
- i[:, 2: , 0] ) * self._oo_2dydz
o[:, 1:-1, -1] = ( i[:, 2: , -1]
+ i[:, 0:-2, -2]
- i[:, 0:-2, -1]
- i[:, 2: , -2] ) * self._oo_2dydz
o[:, 0, 0] = ( i[:, 1, 1]
+ i[ :, 0, 0]
- i[ :, 1, 0]
- i[ :, 0, 1] ) * self._oo_dydz
o[:, -1, -1] = ( i[:, -1, -1]
+ i[:, -2, -2]
- i[:, -1, -2]
- i[:, -2, -1] ) * self._oo_dydz
o[:, 0, -1] = ( i[ :, 1, -1]
+ i[ :, 0, -2]
- i[ :, 1, -2]
- i[ :, 0, -1] ) * self._oo_dydz
o[:, -1, 0] = ( i[ :, -1, 1]
+ i[ :, -2, 0]
- i[ :, -2, 1]
- i[ :, -1, 0] ) * self._oo_dydz
def avgXF( self, i, o ):
'''
        @summary: Calculates the forward average of an input scalar field in the x-direction
                  and writes it into the output field provided. This can be interpreted as the
                  averaged value being located at the inter-node positions.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
# Forward average used:
# pos. | x- | x0 | x+ |
# coeff. | 0 | .5 | .5 |
#
o[0:-1,:,:] = ( i[ 0:-1, :, : ]
+ i[ 1: , :, : ] ) * 0.5
# Switch to backward average at upper end
o[ -1,:,:] = ( i[ -2, :, : ]
+ i[ -1, :, : ] ) * 0.5
def avgYF( self, i, o ):
'''
        @summary: Calculates the forward average of an input scalar field in the y-direction
                  and writes it into the output field provided. This can be interpreted as the
                  averaged value being located at the inter-node positions.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
        o[:,0:-1,:] = (   i[ :, 0:-1, : ]
                        + i[ :, 1:  , : ] ) * 0.5
        # Switch to backward average at upper end
        o[:, -1,:]  = (   i[ :, -2, : ]
                        + i[ :, -1, : ] ) * 0.5
def avgZF( self, i, o ):
'''
        @summary: Calculates the forward average of an input scalar field in the z-direction
                  and writes it into the output field provided. This can be interpreted as the
                  averaged value being located at the inter-node positions.
@param i: Input scalar field (3D np.array)
@param o: Output scalar field (3D np.array). Must have been allocated by calling function.
'''
        o[:,:,0:-1] = (   i[ :, :, 0:-1 ]
                        + i[ :, :, 1:   ] ) * 0.5
        # Switch to backward average at upper end
        o[:,:, -1]  = (   i[ :, :, -2 ]
                        + i[ :, :, -1 ] ) * 0.5
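The interior cross stencil used by `diffXY`/`diffXZ`/`diffYZ` is exact for bilinear fields, which makes a convenient sanity check. A small stand-alone sketch (plain NumPy, independent of the class) illustrates this on f = x·y, whose mixed derivative is 1 everywhere:

```python
import numpy as np

dx, dy = 0.1, 0.2
x = np.arange(0.0, 1.0 + dx / 2, dx)
y = np.arange(0.0, 1.0 + dy / 2, dy)
X, Y = np.meshgrid(x, y, indexing="ij")
f = X * Y  # d2f/dxdy == 1 everywhere

# Interior 4-point cross stencil used by diffXY:
# (f[i+1,j+1] + f[i-1,j-1] - f[i+1,j-1] - f[i-1,j+1]) / (4 dx dy)
mixed = (f[2:, 2:] + f[:-2, :-2] - f[2:, :-2] - f[:-2, 2:]) / (4.0 * dx * dy)
```

The one-sided variants at the borders follow the same pattern with the `_oo_2dxdy` and `_oo_dxdy` scalings.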
# ============================================================================
# File: plugins/cisco_asa/unit_test/test_get_blocked_hosts.py
# Repo: lukaszlaszuk/insightconnect-plugins (MIT license)
# ============================================================================
import os
import sys

# Make the plugin package importable before the icon_cisco_asa/unit_test imports
sys.path.append(os.path.abspath("../"))

from unittest import TestCase
from unittest.mock import patch

from icon_cisco_asa.actions.get_blocked_hosts import GetBlockedHosts
from icon_cisco_asa.actions.get_blocked_hosts.schema import Output
from unit_test.util import Util
class TestGetBlockedHosts(TestCase):
@classmethod
def setUpClass(cls) -> None:
cls.action = Util.default_connector(GetBlockedHosts())
@patch("requests.request", side_effect=Util.mocked_requests)
def test_get_blocked_hosts(self, mock_post):
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "1.1.1.1",
"dest_ip": "2.2.2.2",
"source_port": "444",
"dest_port": "555",
"protocol": "6",
},
{
"source_ip": "3.3.3.3",
"dest_ip": "4.4.4.4",
"source_port": "333",
"dest_port": "444",
"protocol": "6",
},
]
}
self.assertEqual(actual, expected)
class Response:
def __init__(self, text, status_code):
self.status_code = status_code
self.text = text
def json(self):
return self.text
@patch("requests.request")
def test_get_blocked_hosts_single_host(self, mock_post):
text = {"response": ["shun (management) 1.1.1.1 2.2.2.2 444 555 6\n"]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "1.1.1.1",
"dest_ip": "2.2.2.2",
"source_port": "444",
"dest_port": "555",
"protocol": "6",
}
]
}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_without_new_line(self, mock_post):
text = {"response": ["shun (management) 1.1.1.1 2.2.2.2 444 555 6"]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "1.1.1.1",
"dest_ip": "2.2.2.2",
"source_port": "444",
"dest_port": "555",
"protocol": "6",
}
]
}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_incomplete_response(self, mock_post):
text = {"response": ["shun (management) 1.1.1.1 2.2.2.2\n"]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {Output.HOSTS: []}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_incomplete_response2(self, mock_post):
text = {"response": ["shun (management) 1.1.1.1 2.2.2.2\nshun (management) 3.3.3.3 4.4.4.4 333 444 6\n"]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "3.3.3.3",
"dest_ip": "4.4.4.4",
"source_port": "333",
"dest_port": "444",
"protocol": "6",
}
]
}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_incomplete_response3(self, mock_post):
text = {"response": ["shun (management) 1.1.1.1 2.2.2.2 444 555 6\nshun (management) 3.3.3.3 4.4.4.4\n"]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "1.1.1.1",
"dest_ip": "2.2.2.2",
"source_port": "444",
"dest_port": "555",
"protocol": "6",
}
]
}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_bad_response(self, mock_post):
text = {
"response": [
"shun (management) 1.1.1.1 2.2.2.2 444 555 6\n\n \n \n\n \nshun (management) 3.3.3.3 "
"4.4.4.4 333 444 6\n"
]
}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {
Output.HOSTS: [
{
"source_ip": "1.1.1.1",
"dest_ip": "2.2.2.2",
"source_port": "444",
"dest_port": "555",
"protocol": "6",
},
{
"source_ip": "3.3.3.3",
"dest_ip": "4.4.4.4",
"source_port": "333",
"dest_port": "444",
"protocol": "6",
},
]
}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_no_blocked_hosts(self, mock_post):
text = {"response": [""]}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {Output.HOSTS: []}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_empty_list(self, mock_post):
text = {"response": []}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {Output.HOSTS: []}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_response_as_string(self, mock_post):
text = {"response": "test"}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {Output.HOSTS: []}
self.assertEqual(actual, expected)
@patch("requests.request")
def test_get_blocked_hosts_no_response(self, mock_post):
text = {"response": None}
mock_post.return_value = TestGetBlockedHosts.Response(text, 200)
actual = self.action.run()
expected = {Output.HOSTS: []}
self.assertEqual(actual, expected)
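The tests above feed the action raw `shun` lines and expect structured host records, with incomplete lines (missing ports/protocol) silently dropped. The parsing logic itself lives in the action module, not in this file; a minimal sketch of one way such a line could be split (the regex and helper name are assumptions, not the plugin's actual code) is:

```python
import re

# Hypothetical pattern for ASA "shun" output lines; the plugin's real parser
# lives in icon_cisco_asa.actions.get_blocked_hosts and is not shown here.
SHUN_PATTERN = re.compile(
    r"shun \(\S+\) (?P<source_ip>\S+) (?P<dest_ip>\S+) "
    r"(?P<source_port>\d+) (?P<dest_port>\d+) (?P<protocol>\d+)"
)


def parse_shun_line(line):
    """Return a host dict for a complete shun line, or None if fields are missing."""
    match = SHUN_PATTERN.match(line.strip())
    return match.groupdict() if match else None


raw = "shun (management) 1.1.1.1 2.2.2.2 444 555 6\nshun (management) 3.3.3.3 4.4.4.4\n"
hosts = [h for h in map(parse_shun_line, raw.splitlines()) if h is not None]
```

Filtering out `None` results reproduces the behaviour exercised by the incomplete-response tests above.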
| 31.632075 | 114 | 0.511631 | 732 | 6,706 | 4.498634 | 0.11612 | 0.020043 | 0.020043 | 0.056787 | 0.849074 | 0.827817 | 0.818099 | 0.818099 | 0.782265 | 0.782265 | 0 | 0.061617 | 0.356248 | 6,706 | 211 | 115 | 31.781991 | 0.701181 | 0 | 0 | 0.575581 | 0 | 0.02907 | 0.176111 | 0 | 0 | 0 | 0 | 0 | 0.063953 | 1 | 0.081395 | false | 0 | 0.040698 | 0.005814 | 0.139535 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
622c35ba4a3c9f48a0216080af12dec4d3d2b215 | 5,755 | py | Python | Map.py | harshkakashaniya/Dijkstra-and-A-Algorithm | 4913bc17b98fb6d7a282e3e3c11d23d55d9fca06 | [
"MIT"
] | 1 | 2022-02-01T11:36:11.000Z | 2022-02-01T11:36:11.000Z | Map.py | harshkakashaniya/Dijkstra-and-A-Algorithm | 4913bc17b98fb6d7a282e3e3c11d23d55d9fca06 | [
"MIT"
] | null | null | null | Map.py | harshkakashaniya/Dijkstra-and-A-Algorithm | 4913bc17b98fb6d7a282e3e3c11d23d55d9fca06 | [
"MIT"
] | null | null | null | import argparse
import numpy as np
import os, sys
from numpy import linalg as LA
import math
from PIL import Image
import random
try:
    # Drop the ROS Kinetic path so the pip-installed OpenCV is imported instead
    sys.path.remove('/opt/ros/kinetic/lib/python2.7/dist-packages')
except ValueError:
    # Path was not present; nothing to do
    pass
import cv2
def Map(clearance, radius, resolution):
    """Build the obstacle map: free cells are white (255), obstacle cells are set to [255, 0, 0].

    The robot radius is folded into the clearance, and all dimensions are
    scaled by the map resolution (pixels per unit length).
    """
    clearance = (clearance + radius) * resolution
    Map = 255 * np.ones((int(150 * resolution), int(250 * resolution), 3))
for i in range(Map.shape[0]):
for j in range(Map.shape[1]):
#for polygon right
c1=clearance/math.sin(math.atan(-0.54)-math.pi/2)
c2=clearance/math.sin(math.atan(0.60)-math.pi/2)
c3=clearance/math.sin(math.atan(-0.18)-math.pi/2)
if(i-135*resolution-clearance<=0 and j+0.54*i-245.97*resolution+c1<=0 and j-0.60*i-133.68*resolution+c2<=0 and j+0.18*i-181.05*resolution-c3>=0):
Map[i,j]=[255,0,0]
if(i-135*resolution-clearance<=0 and i-135*resolution-0.6*clearance>=0 and j+0.54*i-245.97*resolution+c1<=0 and j+0.54*i-245.97*resolution-0.6*clearance>=0 and math.pow(i-135*resolution,2)+math.pow(j-173*resolution,2)-math.pow(clearance,2)>0):
Map[i,j]=[255,255,255]
if(j-0.60*i-133.68*resolution-0.6*clearance>=0 and j-0.60*i-133.68*resolution+c2<=0 and j+0.54*i-245.97*resolution+c1<=0 and j+0.54*i-245.97*resolution-0.6*clearance>=0 and math.pow(i-150*resolution+52*resolution,2)+math.pow(j-193*resolution,2)-math.pow(clearance,2)>0):
Map[i,j]=[255,255,255]
if(j-0.60*i-(133.68)*resolution+0.6*clearance>=0 and j-0.60*i-(133.68)*resolution+c2<=0 and j+0.18*i-(181.05)*resolution-c3>=0 and j+0.18*i-(181.05)*resolution-0.6*clearance<=0 and math.pow(i-150*resolution+90*resolution,2)+math.pow(j-170*resolution,2)-math.pow(clearance,2)>=0 ):
Map[i,j]=[255,255,255]
c4=clearance/math.sin(math.atan(-0.18)-math.pi/2)
c5=clearance/math.sin(math.atan(9.5)-math.pi/2)
c6=clearance/math.sin(math.atan(0.6)-math.pi/2)
#for polygon left
if(i-135*resolution-clearance<=0 and j+0.18*i-181.05*resolution+c4<=0 and j-9.5*i+768.0*resolution+c5<=0 and j-0.6*i-67.68*resolution-c6>=0):
Map[i,j]=[255,0,0]
if(i-135*resolution-clearance-0.2*clearance<=0 and i-(135)*resolution-0.5*clearance>=0 and j-0.6*i-67.68*resolution-c6+0.2*clearance>=0 and j-0.6*i-(67.68)*resolution+0.3*clearance<=0 and math.pow(i-135*resolution,2)+math.pow(j-150*resolution,2)-math.pow(clearance,2)>=0):
Map[i,j]=[255,255,255]
if(j-9.5*i+(768.0)*resolution+c5<=0 and j-9.5*i+(768.0)*resolution+5*clearance>=0 and j-0.6*i-(67.68)*resolution+0.6*clearance-c6>=0 and j-0.6*i-(67.68)*resolution-0.6*clearance<=0 and math.pow(i-150*resolution+56*resolution,2)+math.pow(j-125*resolution,2)-math.pow(clearance,2)>=0):
Map[i,j]=[255,255,255]
#for rectrangle
if(j>=50*resolution-clearance and j<=100*resolution+clearance and i>=37.5*resolution-clearance and i<=37.5*resolution+45*resolution+clearance):
Map[i,j]=[255,0,0]
if(j>=50*resolution-clearance and j<=50*resolution and i>=37.5*resolution-clearance and i<=37.5*resolution and math.pow(i-37.5*resolution,2)+math.pow(j-50*resolution,2)-math.pow(clearance,2)>0):
Map[i,j]=[255,255,255]
if(j>=50*resolution-clearance and j<=50*resolution and i<=37.5*resolution+45*resolution+clearance and i>=37.5*resolution+45*resolution and math.pow(i-37.5*resolution-45*resolution,2)+math.pow(j-50*resolution,2)-math.pow(clearance,2)>0):
Map[i,j]=[255,255,255]
if(j<=100*resolution+clearance and j>=100*resolution and i<=37.5*resolution+45*resolution+clearance and i>=37.5*resolution+45*resolution and math.pow(i-37.5*resolution-45*resolution,2)+math.pow(j-100*resolution,2)-math.pow(clearance,2)>0 and radius<14 ):
Map[i,j]=[255,255,255]
if(j<=100*resolution+clearance and j>=100*resolution and i>=37.5*resolution-clearance and i<=37.5*resolution and math.pow(i-37.5*resolution,2)+math.pow(j-100*resolution,2)-math.pow(clearance,2)>0):
Map[i,j]=[255,255,255]
#for circle
if((math.pow(i-20*resolution,2)+math.pow(j-190*resolution,2))-math.pow(15*resolution+clearance,2)<=0):
Map[i,j]=[255,0,0]
#for ellipse
if((math.pow(i-30*resolution,2)/math.pow(6*resolution+clearance,2))+math.pow(j-140*resolution,2)/math.pow(15*resolution+clearance,2)<=1):
Map[i,j]=[255,0,0]
for i in range(Map.shape[0]):
for j in range(Map.shape[1]):
if(j>50*resolution and j<100*resolution and i>37.5*resolution and i<37.5*resolution+45*resolution):
Map[i,j]=[0,0,255]
if((math.pow(i-20*resolution,2)+math.pow(j-190*resolution,2))<math.pow(15*resolution,2)):
Map[i,j]=[0,0,255]
if((math.pow(i-30*resolution,2)/math.pow(6*resolution,2))+math.pow(j-140*resolution,2)/math.pow(15*resolution,2)<1):
Map[i,j]=[0,0,255]
if(i-135*resolution<0 and j+0.54*i-245.97*resolution<0 and j-0.60*i-133.68*resolution<0 and j+0.18*i-181.05*resolution>0):
Map[i,j]=[0,0,255]
if(i-135*resolution<0 and j+0.18*i-181.05*resolution<0 and j-9.5*i+768.0*resolution<0 and j-0.6*i-67.68*resolution>0):
Map[i,j]=[0,0,255]
'''
Y1=170
X1=150-90
Y2=163
X2=150-52
slope=(Y2-Y1)/(X2-X1)
C=Y1-slope*X1
print(slope)
print(C)
'''
cv2.namedWindow('Map',cv2.WINDOW_NORMAL)
cv2.imshow('Map',Map)
cv2.waitKey()
cv2.destroyAllWindows()
#cv2.show()
return Map
koko=Map(0,10,1)
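# The c1-c6 terms above shift a line's intercept so that the line moves a
# perpendicular distance `clearance` away. Since sin(atan(m) - pi/2) equals
# -1/sqrt(1 + m**2), the magnitude of that shift is clearance*sqrt(1 + m**2).
# A standalone check of the identity (this helper is mine, not from the repo):

```python
import math

def intercept_offset(slope, clearance):
    # Same formula as c1..c6 in Map(): intercept shift that moves the
    # line j = slope*i + b a perpendicular distance `clearance` away.
    return clearance / math.sin(math.atan(slope) - math.pi / 2)

# For every slope used in Map(), |shift| == clearance * sqrt(1 + m**2)
for m in (-0.54, 0.60, -0.18, 9.5):
    shift = intercept_offset(m, 5.0)
    assert abs(abs(shift) - 5.0 * math.sqrt(1 + m * m)) < 1e-9
```

# Note the shift comes out negative for every slope, which is why the code
# adds c1..c6 on the side it wants to grow.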
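# The per-pixel double loop above does Python-level work for every cell; the
# same membership tests can be evaluated for the whole grid at once with
# NumPy. A minimal sketch for the circular obstacle only (the helper name
# `circle_mask` is my own, not part of the repo):

```python
import numpy as np

def circle_mask(shape, center, r):
    # Boolean mask of grid cells inside a circle, computed without loops.
    ii, jj = np.ogrid[:shape[0], :shape[1]]
    return (ii - center[0]) ** 2 + (jj - center[1]) ** 2 <= r ** 2

grid = 255 * np.ones((150, 250, 3), np.uint8)                    # white free space
grid[circle_mask(grid.shape[:2], (20, 190), 15)] = (0, 0, 255)   # red obstacle
```

# The broadcasted ogrid comparison replaces 150*250 interpreted iterations
# with a handful of vectorized array operations.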