hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4473dea74ad824b410db2d46eaf6858a8677dd85 | 857 | py | Python | src/bin_print.py | wykys/MIKS-FSK | a28255a1a184fb0b9753fcb133ea12e1b75ae93d | [
"MIT"
] | null | null | null | src/bin_print.py | wykys/MIKS-FSK | a28255a1a184fb0b9753fcb133ea12e1b75ae93d | [
"MIT"
] | null | null | null | src/bin_print.py | wykys/MIKS-FSK | a28255a1a184fb0b9753fcb133ea12e1b75ae93d | [
"MIT"
] | null | null | null | # wykys 2019
def bin_print(byte_array: list, num_in_line: int = 8, space: str = ' | '):
def bin_to_str(byte_array: list) -> str:
return ''.join([
chr(c) if c > 32 and c < 127 else '.' for c in byte_array
])
tmp = ''
for i, byte in enumerate(byte_array):
tmp = ''.join([tmp, f'{byte:02X}'])
if (i+1) % num_in_line:
tmp = ''.join([tmp, ' '])
else:
tmp = ''.join([
tmp,
space,
bin_to_str(byte_array[i-num_in_line+1:i+1]),
'\n'
])
if (i+1) % num_in_line:
tmp = ''.join([
tmp,
' '*(3*(num_in_line - ((i+1) % num_in_line)) - 1),
space,
bin_to_str(byte_array[i - ((i+1) % num_in_line) + 1:]),
'\n'
])
print(tmp)
| 26.78125 | 74 | 0.424737 | 115 | 857 | 2.930435 | 0.295652 | 0.103858 | 0.186944 | 0.083086 | 0.394659 | 0.344214 | 0.272997 | 0.136499 | 0.136499 | 0 | 0 | 0.041257 | 0.406068 | 857 | 31 | 75 | 27.645161 | 0.620825 | 0.011669 | 0 | 0.5 | 0 | 0 | 0.023669 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0 | 0.038462 | 0.115385 | 0.076923 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
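The `bin_print` helper above builds a hex dump with an ASCII gutter. A self-contained Python 3 sketch of the same idea (function and parameter names here are illustrative, not taken from the source file):

```python
def hex_dump(data, width=8, sep=' | '):
    """Format bytes as hex columns plus an ASCII gutter, similar in spirit
    to bin_print above. Bytes outside 33..126 render as '.'."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        # pad the hex column so the ASCII gutter lines up on partial rows
        hex_part = ' '.join(f'{b:02X}' for b in chunk).ljust(3 * width - 1)
        ascii_part = ''.join(chr(b) if 32 < b < 127 else '.' for b in chunk)
        lines.append(hex_part + sep + ascii_part)
    return '\n'.join(lines)

print(hex_dump(b'Hello, world!'))
```

Unlike the original, this accumulates rows in a list and joins once, which avoids the repeated `''.join` string rebuilding.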
4476220506d945bb783299ab9c689b2af1b83a67 | 1,363 | py | Python | tests/test_modifiers.py | mbillingr/friendly-iter | 77e1ce72100f592b6155a2152fcc03165af22714 | [
"MIT"
] | null | null | null | tests/test_modifiers.py | mbillingr/friendly-iter | 77e1ce72100f592b6155a2152fcc03165af22714 | [
"MIT"
] | null | null | null | tests/test_modifiers.py | mbillingr/friendly-iter | 77e1ce72100f592b6155a2152fcc03165af22714 | [
"MIT"
] | null | null | null | from unittest.mock import Mock
import pytest
from friendly_iter.iterator_modifiers import flatten, take, skip, step
def test_flatten():
result = flatten([range(4), [], [4, 5]])
assert list(result) == [0, 1, 2, 3, 4, 5]
def test_take_limits_number_of_resulting_items():
result = take(3, range(10))
assert list(result) == [0, 1, 2]
def test_take_works_if_iterator_is_too_short():
result = take(10, range(3))
assert list(result) == [0, 1, 2]
def test_skip_drops_first_n_elements():
result = skip(2, [1, 2, 3, 4, 5])
assert list(result) == [3, 4, 5]
def test_skipping_too_many_results_in_empty_iterator():
result = skip(3, [1, 2])
assert list(result) == []
def test_skip_advanced_iterator_lazily():
skip(3, FailingIter()) # should not raise
def test_refuse_stepsize_less_than_one():
with pytest.raises(ValueError):
step(0, [])
def test_step_size_one_is_an_identity_operation():
it = Mock()
result = step(1, it)
assert result is it
def test_step_always_yields_first_element():
result = step(2, [1])
assert list(result) == [1]
def test_step_yields_every_nth_item():
result = step(2, [1, 2, 3, 4])
assert list(result) == [1, 3]
class FailingIter:
def __iter__(self):
return self
def __next__(self):
pytest.fail("Iterator was advanced")
| 21.296875 | 70 | 0.663977 | 205 | 1,363 | 4.112195 | 0.370732 | 0.083037 | 0.132859 | 0.060498 | 0.142349 | 0.084223 | 0.061684 | 0.061684 | 0 | 0 | 0 | 0.042357 | 0.203228 | 1,363 | 63 | 71 | 21.634921 | 0.733886 | 0.011739 | 0 | 0.052632 | 0 | 0 | 0.015613 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 1 | 0.315789 | false | 0 | 0.078947 | 0.026316 | 0.447368 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
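The tests above pin down the contract of `flatten`, `take`, `skip`, and `step`. Plausible reference implementations built on `itertools` that satisfy those tests (the real `friendly_iter` internals may differ):

```python
from itertools import chain, islice

def flatten(iterables):
    # one flat stream over a sequence of iterables
    return chain.from_iterable(iterables)

def take(n, iterable):
    # at most the first n items; short inputs are fine
    return islice(iterable, n)

def skip(n, iterable):
    # lazily drop the first n items; nothing is consumed until iteration starts
    return islice(iterable, n, None)

def step(n, iterable):
    if n < 1:
        raise ValueError("step size must be >= 1")
    if n == 1:
        return iterable          # identity, so `step(1, it) is it` holds
    return islice(iterable, 0, None, n)
```

Because `islice` is lazy, `skip(3, FailingIter())` builds the slice without advancing the underlying iterator, matching `test_skip_advanced_iterator_lazily`.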
447c96e01f699a1d061bedede73a165b4923aa49 | 404 | py | Python | backend/apps/role/migrations/0003_auto_20200329_1414.py | highproformas-friends/curaSWISS | cf6c1ac9c0c80026f1667a7155290c37be8dec7c | [
"MIT"
] | 3 | 2020-03-27T20:39:31.000Z | 2020-03-31T20:24:55.000Z | backend/apps/role/migrations/0003_auto_20200329_1414.py | highproformas-friends/curaSWISS | cf6c1ac9c0c80026f1667a7155290c37be8dec7c | [
"MIT"
] | 21 | 2020-03-28T09:57:15.000Z | 2020-03-31T11:38:00.000Z | backend/apps/role/migrations/0003_auto_20200329_1414.py | highproformas-friends/curaSWISS | cf6c1ac9c0c80026f1667a7155290c37be8dec7c | [
"MIT"
] | null | null | null | # Generated by Django 3.0.4 on 2020-03-29 14:14
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('role', '0002_auto_20200329_1412'),
]
operations = [
migrations.AlterField(
model_name='role',
name='name',
field=models.CharField(default='', max_length=50, unique=True),
),
]
| 21.263158 | 75 | 0.596535 | 45 | 404 | 5.244444 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113402 | 0.279703 | 404 | 18 | 76 | 22.444444 | 0.697595 | 0.111386 | 0 | 0 | 1 | 0 | 0.098039 | 0.064426 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
447db70fcaa354f16105d556a4943d598b33148f | 816 | py | Python | scifiweb/urls.py | project-scifi/scifiweb | cc51d9ea6e7f302c503174e92029188a7e252753 | [
"Apache-2.0"
] | 1 | 2018-04-18T04:37:43.000Z | 2018-04-18T04:37:43.000Z | scifiweb/urls.py | project-scifi/scifiweb | cc51d9ea6e7f302c503174e92029188a7e252753 | [
"Apache-2.0"
] | 12 | 2017-06-26T05:20:28.000Z | 2022-01-13T00:48:58.000Z | scifiweb/urls.py | project-scifi/scifiweb | cc51d9ea6e7f302c503174e92029188a7e252753 | [
"Apache-2.0"
] | null | null | null | from django.conf.urls import include
from django.conf.urls import url
from django.shortcuts import redirect
from django.shortcuts import reverse
import scifiweb.about.urls
import scifiweb.news.urls
from scifiweb.home import home
from scifiweb.robots import robots_dot_txt
urlpatterns = [
url(r'^$', home, name='home'),
url(r'^robots\.txt$', robots_dot_txt, name='robots.txt'),
url(r'^about/', include(scifiweb.about.urls.urlpatterns)),
url(r'^news/', include(scifiweb.news.urls.urlpatterns)),
# Legacy redirects to /about/
url(r'^info/about/$', lambda _: redirect(reverse('about'), permanent=True)),
url(r'^info/about/contact$', lambda _: redirect(reverse('about/contact'), permanent=True)),
url(r'^info/about/team$', lambda _: redirect(reverse('about/team'), permanent=True)),
]
| 35.478261 | 95 | 0.720588 | 112 | 816 | 5.1875 | 0.258929 | 0.048193 | 0.041308 | 0.067126 | 0.172117 | 0.089501 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120098 | 816 | 22 | 96 | 37.090909 | 0.809192 | 0.033088 | 0 | 0 | 0 | 0 | 0.152478 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.470588 | 0 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
447ff5764e0b94f97e6c66b9deb62f15c5007620 | 535 | py | Python | solutions/Array-02.py | mrocklin/dask-tutorial | efb87e83eefa816ef23083fc1329af1d7da452a8 | [
"BSD-3-Clause"
] | 2 | 2018-03-17T16:41:28.000Z | 2019-06-19T06:38:06.000Z | solutions/Array-02.py | mrocklin/dask-tutorial | efb87e83eefa816ef23083fc1329af1d7da452a8 | [
"BSD-3-Clause"
] | null | null | null | solutions/Array-02.py | mrocklin/dask-tutorial | efb87e83eefa816ef23083fc1329af1d7da452a8 | [
"BSD-3-Clause"
] | 3 | 2018-07-13T15:33:55.000Z | 2020-11-29T14:27:45.000Z | import h5py
from glob import glob
import os
filenames = sorted(glob(os.path.join('data', 'weather-big', '*.hdf5')))
dsets = [h5py.File(filename)['/t2m'] for filename in filenames]
import dask.array as da
arrays = [da.from_array(dset, chunks=(500, 500)) for dset in dsets]
x = da.stack(arrays, axis=0)
result = x.mean(axis=0)
from matplotlib import pyplot as plt
fig = plt.figure(figsize=(16, 8))
plt.imshow(result, cmap='RdBu_r')
result = x[0] - x.mean(axis=0)
fig = plt.figure(figsize=(16, 8))
plt.imshow(result, cmap='RdBu_r')
| 24.318182 | 71 | 0.699065 | 92 | 535 | 4.032609 | 0.48913 | 0.040431 | 0.048518 | 0.053908 | 0.247978 | 0.247978 | 0.247978 | 0.247978 | 0.247978 | 0.247978 | 0 | 0.042918 | 0.128972 | 535 | 21 | 72 | 25.47619 | 0.753219 | 0 | 0 | 0.266667 | 0 | 0 | 0.069159 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
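The dask calls above mirror NumPy semantics: `da.stack` behaves like `np.stack`, and `x.mean(axis=0)` reduces across the new leading axis — dask just evaluates lazily and chunk-by-chunk. An eager NumPy analogue (shapes and fill values here are made up, not the real weather data):

```python
import numpy as np

# One 2x3 "slab" per input file, filled with 0.0, 1.0, 2.0, 3.0.
arrays = [np.full((2, 3), float(i)) for i in range(4)]
x = np.stack(arrays, axis=0)   # shape (4, 2, 3), like da.stack above
mean = x.mean(axis=0)          # average across the stacked axis -> (2, 3)
anomaly = x[0] - mean          # deviation of the first slab from the mean
print(x.shape, mean[0, 0], anomaly[0, 0])
```

With mean of 0..3 equal to 1.5, each cell of `anomaly` is -1.5, which is the same computation the second `plt.imshow` in the script visualizes.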
4485c2de2544e40db08213560e7dfbba9933c235 | 245 | py | Python | examples/python/corepy/userandom.py | airgiser/ucb | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | 1 | 2022-01-08T14:59:44.000Z | 2022-01-08T14:59:44.000Z | examples/python/corepy/userandom.py | airgiser/just-for-fun | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | null | null | null | examples/python/corepy/userandom.py | airgiser/just-for-fun | d03e62a17f35a9183ed36662352f603f0f673194 | [
"MIT"
] | null | null | null | #!/usr/bin/python
from random import Random
onelist = [1, 2, 3, 4, 5, 6, 7]
rd = Random()
for i in range(5):
print(rd.randint(0, 100))
print(rd.uniform(0, 100))
print(rd.random())
print(rd.choice(onelist))
print('-' * 30)
| 17.5 | 31 | 0.587755 | 41 | 245 | 3.512195 | 0.634146 | 0.194444 | 0.125 | 0.152778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 0.216327 | 245 | 13 | 32 | 18.846154 | 0.65625 | 0.065306 | 0 | 0 | 0 | 0 | 0.004386 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.555556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
44877d856034bf8fe04f668f3d82b679a04011e9 | 516 | py | Python | productdb/testing/unittests/products.py | tspycher/python-productdb | 17970a681b32eb249b78fab7dbeaee9d63ca7c05 | [
"MIT"
] | null | null | null | productdb/testing/unittests/products.py | tspycher/python-productdb | 17970a681b32eb249b78fab7dbeaee9d63ca7c05 | [
"MIT"
] | null | null | null | productdb/testing/unittests/products.py | tspycher/python-productdb | 17970a681b32eb249b78fab7dbeaee9d63ca7c05 | [
"MIT"
] | null | null | null | from . import BasicTestCase
class ProductsTestCase(BasicTestCase):
def test_basic(self):
rv = self.client.get('/')
assert rv.status_code == 200
data = self.parseJsonResponse(rv)
assert '_links' in data
def test_get_all_products(self):
rv = self.client.get('/product')
assert rv.status_code == 200
data = self.parseJsonResponse(rv)
assert '_items' in data
# add further data tests here, load basic data by fixtures
pass
| 21.5 | 66 | 0.625969 | 62 | 516 | 5.080645 | 0.516129 | 0.044444 | 0.063492 | 0.101587 | 0.463492 | 0.342857 | 0.342857 | 0.342857 | 0.342857 | 0.342857 | 0 | 0.01626 | 0.284884 | 516 | 23 | 67 | 22.434783 | 0.837398 | 0.108527 | 0 | 0.307692 | 0 | 0 | 0.046053 | 0 | 0 | 0 | 0 | 0 | 0.307692 | 1 | 0.153846 | false | 0.076923 | 0.076923 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
448799a4fc1952a7fcd243848401d2c0a08268d0 | 196 | py | Python | src/main.py | Quin-Darcy/Crawler | 3e0d9f0b5dce43206606b19e5fb4dd84f4614fb1 | [
"MIT"
] | null | null | null | src/main.py | Quin-Darcy/Crawler | 3e0d9f0b5dce43206606b19e5fb4dd84f4614fb1 | [
"MIT"
] | null | null | null | src/main.py | Quin-Darcy/Crawler | 3e0d9f0b5dce43206606b19e5fb4dd84f4614fb1 | [
"MIT"
] | null | null | null | import crawler
import os
def main():
root = crawler.Crawler()
root.set_start()
root.burrow()
root.show()
os.system('killall firefox')
if __name__ == '__main__':
main()
| 13.066667 | 32 | 0.617347 | 24 | 196 | 4.666667 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.239796 | 196 | 14 | 33 | 14 | 0.751678 | 0 | 0 | 0 | 0 | 0 | 0.117347 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.2 | 0 | 0.3 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
448a636d024dfa5f84fbc271a201ee5abe77a7d5 | 2,118 | py | Python | jerk_agent_for_understanding/scripts/map-paths.py | tristansokol/Bobcats | 71461b5c969b24e5379a63d2bc22bf173ae76f9b | [
"MIT"
] | 2 | 2018-06-20T13:36:54.000Z | 2018-10-28T17:06:31.000Z | jerk_agent_for_understanding/scripts/map-paths.py | tristansokol/Bobcats | 71461b5c969b24e5379a63d2bc22bf173ae76f9b | [
"MIT"
] | null | null | null | jerk_agent_for_understanding/scripts/map-paths.py | tristansokol/Bobcats | 71461b5c969b24e5379a63d2bc22bf173ae76f9b | [
"MIT"
] | 1 | 2019-04-30T06:24:03.000Z | 2019-04-30T06:24:03.000Z | #!/usr/bin/python
import sys
import retro
import numpy as np
from os import listdir
from os.path import isfile, join, isdir, dirname, realpath
from PIL import Image
# find level maps here: http://info.sonicretro.org/Sonic_the_Hedgehog_(16-bit)_level_maps
mp = Image.open(dirname(realpath(__file__))+"/01.PNG")
mp.load()
level_map = np.array(mp.convert(mode='RGB'), dtype="uint8")
hf = 10 # highlight factor
def render(file):
movie = retro.Movie(file)
movie.step()
env = retro.make(game=movie.get_game(), state=retro.STATE_NONE, use_restricted_actions=retro.ACTIONS_ALL)
env.initial_state = movie.get_state()
env.reset()
while movie.step():
keys = []
for i in range(env.NUM_BUTTONS):
keys.append(movie.get_key(i))
_obs, _rew, _done, _info = env.step(keys)
y = _info['y']
x = _info['x']
highlight = [[[min(x[0][0]+hf,255), min(x[0][1]+hf,255), min(x[0][2]+hf,255)],
[min(x[1][0]+hf,255), min(x[1][1]+hf,255), min(x[1][2]+hf,255)],
[min(x[2][0]+hf,255), min(x[2][1]+hf,255), min(x[2][2]+hf,255)],
[min(x[3][0]+hf,255), min(x[3][1]+hf,255), min(x[3][2]+hf,255)],
[min(x[4][0]+hf,255), min(x[4][1]+hf,255), min(x[4][2]+hf,255)],
[min(x[5][0]+hf,255), min(x[5][1]+hf,255), min(x[5][2]+hf,255)],
[min(x[6][0]+hf,255), min(x[6][1]+hf,255), min(x[6][2]+hf,255)],
[min(x[7][0]+hf,255), min(x[7][1]+hf,255), min(x[7][2]+hf,255)],
] for x in level_map[y:(y+8), x:(x+8)]]
level_map[y:(y+8), x:(x+8)] = highlight
env.close()
if isdir(sys.argv[1]):
onlyfiles = [f for f in listdir(sys.argv[1]) if isfile(join(sys.argv[1], f))]
onlyfiles.sort()
c = 0
for file in onlyfiles:
if ".bk2" in file :
print('playing', file)
render(sys.argv[1]+file)
if c % 5==0:
lm = Image.fromarray(level_map)
lm.save('levelmapv9.jpeg')
c+=1
lm.show()
else:
print('playing', sys.argv[1])
render(sys.argv[1])
lm = Image.fromarray(level_map)
# lm.save('levelmap.jpeg')
lm.show()
| 33.09375 | 109 | 0.566572 | 377 | 2,118 | 3.106101 | 0.291777 | 0.081981 | 0.157131 | 0.176772 | 0.291204 | 0.075149 | 0.075149 | 0.023911 | 0 | 0 | 0 | 0.085578 | 0.211048 | 2,118 | 63 | 110 | 33.619048 | 0.6152 | 0.068933 | 0 | 0.076923 | 0 | 0 | 0.025407 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019231 | false | 0 | 0.115385 | 0 | 0.134615 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
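The nested list comprehension in `render()` brightens an 8×8 RGB patch pixel-by-pixel with `min(value + hf, 255)`. The same saturating add can be written in a few vectorized NumPy lines (toy array sizes here, not the real level map):

```python
import numpy as np

hf = 10  # highlight factor, as in the script above
level_map = np.zeros((32, 32, 3), dtype="uint8")
y, x = 8, 8
# Widen to int16 so the add cannot wrap around, then clip back to uint8:
patch = level_map[y:y + 8, x:x + 8].astype(np.int16) + hf
level_map[y:y + 8, x:x + 8] = np.clip(patch, 0, 255).astype("uint8")
print(level_map[y, x])
```

The intermediate widening matters: adding `hf` directly to a `uint8` array would overflow silently for pixels above 245 instead of saturating at 255.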
4498797169e379e4f71ed514fe503e98825e7c72 | 4,897 | py | Python | languageModel.py | Zgjszjggjt/DeepLearning | 51889e57f8c7a0a861c356a0e3454c3c3625b0a7 | [
"MIT"
] | null | null | null | languageModel.py | Zgjszjggjt/DeepLearning | 51889e57f8c7a0a861c356a0e3454c3c3625b0a7 | [
"MIT"
] | null | null | null | languageModel.py | Zgjszjggjt/DeepLearning | 51889e57f8c7a0a861c356a0e3454c3c3625b0a7 | [
"MIT"
] | null | null | null | #!/usr/bin/evn python
#-*- coding: utf-8 -*-
# ===================================
# Filename : languageModel.py
# Author : GT
# Create date : 17-09-20 18:33:43
# Description:
# ===================================
# Script starts from here
# this is for chinese characters
# import sys
# reload(sys)
# sys.setdefaultencoding('utf-8')
import numpy as np
import nltk
import csv
import itertools
def softmax(x):
xt = np.exp(x - np.max(x))
return xt / np.sum(xt)
class getData(object):
def __init__(self):
self.vocabulary_size = 8000
self.unknown_token = 'UNKNOWN_TOKEN'
self.sentence_start_token = 'SENTENCE_START'
self.sentence_end_token = "SENTENCE_END"
def encode(self, path):
print 'Reading csv file %s' % path
with open(path, 'rb') as f:
reader = csv.reader(f, skipinitialspace = True)
reader.next()
self.sentences = itertools.chain(*[nltk.sent_tokenize(x[0].decode('utf-8').lower()) for x in reader])
self.sentences = ['%s %s %s' % (self.sentence_start_token, x, self.sentence_end_token) for x in self.sentences]
print 'Parsed %d sentences' % len(self.sentences)
self.tokenize_sentences = [nltk.word_tokenize(x) for x in self.sentences]
self.word_freq = nltk.FreqDist(itertools.chain(*self.tokenize_sentences))
print 'Found %d unique words tokens.' % len(self.word_freq)
vocab = self.word_freq.most_common(self.vocabulary_size - 1)
self.index_to_word = [x[0] for x in vocab]
self.index_to_word.append(self.unknown_token)
self.word_to_index = dict([(w, i) for i, w in enumerate(self.index_to_word)])
print 'Using vocabulary size %d .' % self.vocabulary_size
print 'The least frequent word in our vocabulary is %s and appear %d times.' % (vocab[-1][0], vocab[-1][1])
for i, sent in enumerate(self.tokenize_sentences):
self.tokenize_sentences[i] = [w if w in self.word_to_index else self.unknown_token for w in sent]
print '\nExample sentence: %s' % self.sentences[0]
print '\nExample sentence after encoding: %s' % self.tokenize_sentences[0]
self.x_train = np.asarray([[self.word_to_index[w] for w in sent[:-1]] for sent in self.tokenize_sentences])
self.y_train = np.asarray([[self.word_to_index[w] for w in sent[1:]] for sent in self.tokenize_sentences])
return self.x_train, self.y_train
class RNN(object):
def __init__(self, vocabulary_size, hidden_size, bptt_turns):
self.vocabulary_size = vocabulary_size
self.hidden_size = hidden_size
self.bptt_turns = bptt_turns
self.U = np.random.uniform(-np.sqrt(1. / vocabulary_size), np.sqrt(1. / vocabulary_size), (hidden_size, vocabulary_size))
self.V = np.random.uniform(-np.sqrt(1. / hidden_size), np.sqrt(1. / hidden_size), (vocabulary_size, hidden_size))
self.W = np.random.uniform(-np.sqrt(1. / hidden_size), np.sqrt(1. / hidden_size), (hidden_size, hidden_size))
def forward(self, x):
C_size = len(x)
s = np.zeros((C_size + 1, self.hidden_size))
s[-1] = np.zeros(self.hidden_size)
o = np.zeros((C_size, self.vocabulary_size))
for i in np.arange(C_size):
s[i] = np.tanh(self.U[:, x[i]] + self.W.dot(s[i - 1]))
o[i] = softmax(self.V.dot(s[i]))
return s, o
def predict(self):
maxIndex = np.argmax(self.o, axis = 1)
print maxIndex
def get_s(self):
return self.s
def get_o(self):
return self.o
def caculate_total_loss(self, x, y):
loss = 0.
for i in range(len(y)):
s, o = self.forward(x[i])
correct_word_predictions = o[np.arange(len(y[i])), y[i]]
print correct_word_predictions.shape
print correct_word_predictions
loss += -1 * np.sum(np.log(correct_word_predictions))
return loss
def caculate_loss(self, x, y):
N = np.sum((len(y_i) for y_i in y))
print self.caculate_total_loss(x, y)/N
def bptt(self, x, y):
length = len(y)
s, o = self.forward(x)
dLdU = np.zeros_like(self.U)
dLdV = np.zeros_like(self.V)
dLdW = np.zeros_like(self.W)
delta_o = o
delta_o[np.arange(len(y)), y] -= 1.
for t in np.arange(length)[::-1]:
dLdV += np.outer(delta_o[t], s[t].T)
delta_t = self.V.T.dot(delta_o[t]) * (1 - (s[t] ** 2))
if __name__ == '__main__':
getData = getData()
x, y = getData.encode('./DateSet/reddit-comments-2015-08.csv')
# print x.shape
# print y.shape
# print x[0]
# print y[0]
rnn = RNN(8000, 100, 4)
# rnn.forward(x[:1000])
# rnn,predict()
# print rnn.get_o().shape
# print rnn.get_o()
rnn.caculate_loss(x[:1000], y[:1000])
| 37.381679 | 129 | 0.604656 | 732 | 4,897 | 3.881148 | 0.230874 | 0.045759 | 0.051742 | 0.021119 | 0.13974 | 0.104893 | 0.097149 | 0.084477 | 0.084477 | 0.084477 | 0 | 0.019989 | 0.244027 | 4,897 | 130 | 130 | 37.669231 | 0.747434 | 0.089034 | 0 | 0 | 0 | 0 | 0.071847 | 0.008333 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.044944 | null | null | 0.123596 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
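The `softmax` at the top of `languageModel.py` subtracts `np.max(x)` before exponentiating. That shift leaves the result mathematically unchanged — numerator and denominator are scaled by the same factor `exp(-max(x))` — but keeps `exp()` from overflowing on large logits. A Python 3 demonstration of the same function:

```python
import numpy as np

def softmax(x):
    # max-subtraction keeps exp() in a safe range without changing the result
    xt = np.exp(x - np.max(x))
    return xt / np.sum(xt)

p = softmax(np.array([1.0, 2.0, 3.0]))
print(p, p.sum())
big = softmax(np.array([1000.0, 1001.0]))   # naive exp(1000) would overflow
print(big)
```

Without the shift, `np.exp(1000)` returns `inf` and the division produces NaNs; with it, the two-logit case reduces to an ordinary sigmoid of the difference.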
92254e7c44b4d02b62556b08a59fb86995acbb5b | 2,463 | py | Python | openstack/tests/unit/network/v2/test_firewall_v1_rule.py | morganseznec/openstacksdk | 7b245c16556a04497ce701d959a889eca6f26a83 | [
"Apache-2.0"
] | null | null | null | openstack/tests/unit/network/v2/test_firewall_v1_rule.py | morganseznec/openstacksdk | 7b245c16556a04497ce701d959a889eca6f26a83 | [
"Apache-2.0"
] | null | null | null | openstack/tests/unit/network/v2/test_firewall_v1_rule.py | morganseznec/openstacksdk | 7b245c16556a04497ce701d959a889eca6f26a83 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2019 Morgan Seznec <morgan.s134@gmail.com>
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import testtools
from openstack.network.v2 import firewall_v1_rule
EXAMPLE = {
'action': 'allow',
'description': '1',
'destination_ip_address': '10.0.0.2/24',
'destination_port': '2',
'name': '3',
'enabled': True,
'ip_version': 4,
'protocol': 'tcp',
'shared': True,
'source_ip_address': '10.0.1.2/24',
'source_port': '5',
'project_id': '6',
}
class TestFirewallV1Rule(testtools.TestCase):
def test_basic(self):
sot = firewall_v1_rule.FirewallV1Rule()
self.assertEqual('firewall_v1_rule', sot.resource_key)
self.assertEqual('firewall_v1_rules', sot.resources_key)
self.assertEqual('/fw/firewall_rules', sot.base_path)
self.assertTrue(sot.allow_create)
self.assertTrue(sot.allow_fetch)
self.assertTrue(sot.allow_commit)
self.assertTrue(sot.allow_delete)
self.assertTrue(sot.allow_list)
def test_make_it(self):
sot = firewall_v1_rule.FirewallV1Rule(**EXAMPLE)
self.assertEqual(EXAMPLE['action'], sot.action)
self.assertEqual(EXAMPLE['description'], sot.description)
self.assertEqual(EXAMPLE['destination_ip_address'],
sot.destination_ip_address)
self.assertEqual(EXAMPLE['destination_port'], sot.destination_port)
self.assertEqual(EXAMPLE['name'], sot.name)
self.assertEqual(EXAMPLE['enabled'], sot.enabled)
self.assertEqual(EXAMPLE['ip_version'], sot.ip_version)
self.assertEqual(EXAMPLE['protocol'], sot.protocol)
self.assertEqual(EXAMPLE['shared'], sot.shared)
self.assertEqual(EXAMPLE['source_ip_address'],
sot.source_ip_address)
self.assertEqual(EXAMPLE['source_port'], sot.source_port)
self.assertEqual(EXAMPLE['project_id'], sot.project_id)
| 37.892308 | 75 | 0.686561 | 310 | 2,463 | 5.306452 | 0.412903 | 0.136778 | 0.160486 | 0.066869 | 0.080243 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0.020131 | 0.19326 | 2,463 | 64 | 76 | 38.484375 | 0.80775 | 0.243605 | 0 | 0 | 0 | 0 | 0.185065 | 0.02381 | 0 | 0 | 0 | 0 | 0.465116 | 1 | 0.046512 | false | 0 | 0.046512 | 0 | 0.116279 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
922b49321572191d7b66c10a4c4bde3685820be0 | 2,598 | py | Python | OpenSources/GDAL_moreExamples/class08demos_ogr_gdal/class08_ogr_buffer.py | mehran66/python-geospatial-open-sources | 6c56922bdd2a815a98faeb5ac65f674c22486923 | [
"MIT"
] | null | null | null | OpenSources/GDAL_moreExamples/class08demos_ogr_gdal/class08_ogr_buffer.py | mehran66/python-geospatial-open-sources | 6c56922bdd2a815a98faeb5ac65f674c22486923 | [
"MIT"
] | null | null | null | OpenSources/GDAL_moreExamples/class08demos_ogr_gdal/class08_ogr_buffer.py | mehran66/python-geospatial-open-sources | 6c56922bdd2a815a98faeb5ac65f674c22486923 | [
"MIT"
] | null | null | null | '''*********************************************
author: Galen Maclaurin
Date: 12/11/2012
Updated: 03/14/2016 , Stefan Leyk
Purpose: Simple Buffer example using OGR
*********************************************'''
from time import clock
start = clock()
import os
from osgeo import ogr
# Define variables you need
path = r'C:\GIS3\data\cl08data'
#change the working directory
os.chdir(path)
theme = 'areapoints.shp'
buffTheme = 'buffOutput.shp'
buffDist = 500
# Open the feature class, creating an 'ogr' object
ds = ogr.Open(theme)
# Get the driver from the ogr object. The driver object is an interface to work
# with a specific vector data format (i.e. ESRI shapefile in this case).
dvr = ds.GetDriver()
# Check to see if the output file exists and delete it if so.
if os.path.exists(buffTheme):
dvr.DeleteDataSource(buffTheme)
print buffTheme, "existed and has been deleted"
# Get the layer object from the ogr object. This is kind of like a cursor object.
lyr = ds.GetLayer()
# Get the number of features. the layer object is an iterable object.
numFeat = len(lyr)
# Create an empty feature class to populate with buffer features. This is stored
# in memory as an emtpy ogr object.
buff_ds = dvr.CreateDataSource(buffTheme)
# Create a layer object from the empty ogr object.
outLyr = buff_ds.CreateLayer(buffTheme[:-4],lyr.GetSpatialRef(),ogr.wkbPolygon)
# Adding a field takes two steps: create a field definition and then use that
# to create the field
# Create a field definition
fd = ogr.FieldDefn('myField',ogr.OFTString)
# Create field with the field definition
outLyr.CreateField(fd)
# Similarly, you need to use a layer definition to create a feature
lyrDef = outLyr.GetLayerDefn()
# Iterate through the features in the layer object (like with a cursor)
for feat in lyr:
#get the geometry object from the feature object.
geom = feat.GetGeometryRef()
#create a new feature using the layer definition create outside the loop
outFeat = ogr.Feature(lyrDef)
#set the feature's geometry as the buffered geometry
outFeat.SetGeometry(geom.Buffer(buffDist))
#set field value for feature
outFeat.SetField('myField','someText'+str(feat.GetFID()))
#save the new feature in the output layer created outside the loop
outLyr.CreateFeature(outFeat)
# Clean up, remove reference to the datasource objects, this is like deleting
# the cursor and row objects.
buff_ds = None
ds = None
print 'Buffer vector features complete'
print 'Elapsed time: ',round(clock()-start,2),' seconds'
| 37.652174 | 82 | 0.706697 | 376 | 2,598 | 4.875 | 0.454787 | 0.02455 | 0.022913 | 0.017458 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011315 | 0.183603 | 2,598 | 68 | 83 | 38.205882 | 0.8529 | 0.494226 | 0 | 0 | 0 | 0 | 0.150794 | 0.020833 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.096774 | null | null | 0.096774 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
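`geom.Buffer(buffDist)` in the script above expands each geometry by a radius; for a point the result is (approximately) a circle, which OGR segmentizes into a polygon ring. A dependency-free sketch of that approximation — the vertex count and function name are made up for illustration, not OGR's actual defaults:

```python
import math

def point_buffer(x, y, dist, segments=16):
    # Ring of vertices on a circle of radius `dist` around (x, y); a real
    # Buffer() result is a polygon built from a similar approximation.
    ring = [(x + dist * math.cos(2 * math.pi * i / segments),
             y + dist * math.sin(2 * math.pi * i / segments))
            for i in range(segments)]
    ring.append(ring[0])          # close the ring: first vertex == last
    return ring

ring = point_buffer(0.0, 0.0, 500.0)
print(len(ring), ring[0])
```

More segments give a smoother circle at the cost of more vertices per buffered feature, the usual accuracy/size trade-off when buffering many points.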
#!/usr/bin/env python
# -*- coding:utf-8 -*-
# FileName adbtools.py
# Author: HeyNiu
# Created Time: 2016/9/19
"""
adb utility class
"""
import os
import platform
import re
import time
import utils.timetools
class AdbTools(object):
def __init__(self, device_id=''):
self.__system = platform.system()
self.__find = ''
self.__command = ''
self.__device_id = device_id
self.__get_find()
self.__check_adb()
self.__connection_devices()
    def __get_find(self):
        """
        Detect the OS type: Windows uses findstr, Linux uses grep
        :return:
        """
        if self.__system == "Windows":
            self.__find = "findstr"
        else:
            self.__find = "grep"
    def __check_adb(self):
        """
        Check for adb.
        Verifies that the ANDROID_HOME environment variable is set.
        :return:
        """
        if "ANDROID_HOME" in os.environ:
            if self.__system == "Windows":
                path = os.path.join(os.environ["ANDROID_HOME"], "platform-tools", "adb.exe")
            else:
                path = os.path.join(os.environ["ANDROID_HOME"], "platform-tools", "adb")
            if os.path.exists(path):
                self.__command = path
            else:
                raise EnvironmentError(
                    "Adb not found in $ANDROID_HOME path: %s." % os.environ["ANDROID_HOME"])
        else:
            raise EnvironmentError("The environment variable $ANDROID_HOME is not set.")
    def __connection_devices(self):
        """
        Connect to a specific device; device_id may be omitted when only one device is attached.
        :return:
        """
        if self.__device_id == "":
            return
        self.__device_id = "-s %s" % self.__device_id
    def adb(self, args):
        """
        Run an adb command
        :param args: arguments
        :return:
        """
        cmd = "%s %s %s" % (self.__command, self.__device_id, str(args))
        # print(cmd)
        return os.popen(cmd)
    def shell(self, args):
        """
        Run an adb shell command
        :param args: arguments
        :return:
        """
        cmd = "%s %s shell %s" % (self.__command, self.__device_id, str(args))
        # print(cmd)
        return os.popen(cmd)
    def mkdir(self, path):
        """
        Create a directory
        :param path: path
        :return:
        """
        return self.shell('mkdir %s' % path)
    def get_devices(self):
        """
        Get the list of attached devices
        :return:
        """
        l = self.adb('devices').readlines()
        return (i.split()[0] for i in l if 'devices' not in i and len(i) > 5)
    def get_current_application(self):
        """
        Get information about the currently running application
        :return:
        """
        return self.shell('dumpsys window w | %s \/ | %s name=' % (self.__find, self.__find)).read()
    def get_current_package(self):
        """
        Get the package name of the currently running app
        :return:
        """
        reg = re.compile(r'name=(.+?)/')
        return re.findall(reg, self.get_current_application())[0]
    def get_current_activity(self):
        """
        Get the currently running activity
        :return: package/activity
        """
        reg = re.compile(r'name=(.+?)\)')
        return re.findall(reg, self.get_current_application())[0]
    def __get_process(self, package_name):
        """
        Get process information
        :param package_name:
        :return:
        """
        if self.__system == "Windows":
            pid_command = self.shell("ps | %s %s$" % (self.__find, package_name)).read()
        else:
            pid_command = self.shell("ps | %s -w %s" % (self.__find, package_name)).read()
        return pid_command
    def process_exists(self, package_name):
        """
        Return whether the process exists
        :param package_name:
        :return:
        """
        process = self.__get_process(package_name)
        return package_name in process
    def get_pid(self, package_name):
        """
        Get the pid of a package's process
        :return:
        """
        pid_command = self.__get_process(package_name)
        if pid_command == '':
            print("The process doesn't exist.")
            return pid_command
        req = re.compile(r"\d+")
        result = str(pid_command).split()
        result.remove(result[0])
        return req.findall(" ".join(result))[0]
    def get_uid(self, pid):
        """
        Get the uid for a pid
        :param pid:
        :return:
        """
        result = self.shell("cat /proc/%s/status" % pid).readlines()
        for i in result:
            if 'uid' in i.lower():
                return i.split()[1]
    def get_flow_data_tcp(self, uid):
        """
        Get the app's TCP traffic
        :return: (received, sent)
        """
        tcp_rcv = self.shell("cat proc/uid_stat/%s/tcp_rcv" % uid).read().split()[0]
        tcp_snd = self.shell("cat proc/uid_stat/%s/tcp_snd" % uid).read().split()[0]
        return tcp_rcv, tcp_snd
    def get_flow_data_all(self, uid):
        """
        Get all of the app's traffic data,
        including tcp, udp, etc. for all of the app's processes.
        (rx_bytes, tx_bytes) >> (received, sent)
        :param uid:
        :return: list(dict)
        """
        all_data = []
        d = {}
        data = self.shell("cat /proc/net/xt_qtaguid/stats | %s %s" % (self.__find, uid)).readlines()
        for i in data:
            if not i.startswith('\n'):
                item = i.strip().split()
                d['idx'] = item[0]
                d['iface'] = item[1]
                d['acct_tag_hex'] = item[2]
                d['uid_tag_int'] = item[3]
                d['cnt_set'] = item[4]
                d['rx_bytes'] = item[5]
                d['rx_packets'] = item[6]
                d['tx_bytes'] = item[7]
                d['tx_packets'] = item[8]
                d['rx_tcp_bytes'] = item[9]
                d['rx_tcp_packets'] = item[10]
                d['rx_udp_bytes'] = item[11]
                d['rx_udp_packets'] = item[12]
                d['rx_other_bytes'] = item[13]
                d['rx_other_packets'] = item[14]
                d['tx_tcp_bytes'] = item[15]
                d['tx_tcp_packets'] = item[16]
                d['tx_udp_bytes'] = item[17]
                d['tx_udp_packets'] = item[18]
                d['tx_other_bytes'] = item[19]
                d['tx_other_packets'] = item[20]
                all_data.append(d)
                d = {}
        return all_data
    @staticmethod
    def dump_apk(path):
        """
        Dump an apk file
        :param path: apk path
        :return:
        """
        # Check whether build-tools is on the PATH;
        # the aapt command inside it is required.
        l = os.environ['PATH'].split(os.pathsep)
        build_tools = False
        for i in l:
            if 'build-tools' in i:
                build_tools = True
        if not build_tools:
            raise EnvironmentError("ANDROID_HOME BUILD-TOOLS COMMAND NOT FOUND.\nPlease set the environment variable.")
        return os.popen('aapt dump badging %s' % (path,))
    @staticmethod
    def dump_xml(path, filename):
        """
        Dump an xml file from an apk
        :return:
        """
        return os.popen('aapt dump xmlstrings %s %s' % (path, filename))
    def uiautomator_dump(self):
        """
        Dump the screen's uiautomator xml file
        :return:
        """
        return self.shell('uiautomator dump').read().split()[-1]
    def pull(self, source, target):
        """
        Pull a file from the phone to the computer
        :return:
        """
        self.adb('pull %s %s' % (source, target))
    def push(self, source, target):
        """
        Push a file from the computer to the phone
        :param source:
        :param target:
        :return:
        """
        self.adb('push %s %s' % (source, target))
    def remove(self, path):
        """
        Delete a file from the phone
        :return:
        """
        self.shell('rm %s' % (path,))
    def clear_app_data(self, package):
        """
        Clear an application's data
        :return:
        """
        self.shell('pm clear %s' % (package,))
    def install(self, path):
        """
        Install an apk file
        :return:
        """
        # Common adb install error codes
        errors = {'INSTALL_FAILED_ALREADY_EXISTS': 'Application already exists',
                  'INSTALL_DEVICES_NOT_FOUND': 'Device not found',
                  'INSTALL_FAILED_DEVICE_OFFLINE': 'Device offline',
                  'INSTALL_FAILED_INVALID_APK': 'Invalid APK',
                  'INSTALL_FAILED_INVALID_URI': 'Invalid URI',
                  'INSTALL_FAILED_INSUFFICIENT_STORAGE': 'Insufficient storage space',
                  'INSTALL_FAILED_DUPLICATE_PACKAGE': 'A package with the same name already exists',
                  'INSTALL_FAILED_NO_SHARED_USER': 'The requested shared user does not exist',
                  'INSTALL_FAILED_UPDATE_INCOMPATIBLE': 'Versions cannot coexist',
                  'INSTALL_FAILED_SHARED_USER_INCOMPATIBLE': 'The requested shared user has a bad signature',
                  'INSTALL_FAILED_MISSING_SHARED_LIBRARY': 'A required shared library is missing',
                  'INSTALL_FAILED_REPLACE_COULDNT_DELETE': 'A required shared library is invalid',
                  'INSTALL_FAILED_DEXOPT': 'dex optimization/verification failed',
                  'INSTALL_FAILED_DEVICE_NOSPACE': 'Copying the apk failed due to insufficient device storage',
                  'INSTALL_FAILED_DEVICE_COPY_FAILED': 'File copy failed',
                  'INSTALL_FAILED_OLDER_SDK': 'System version too old',
                  'INSTALL_FAILED_CONFLICTING_PROVIDER': 'A content provider with the same name already exists',
                  'INSTALL_FAILED_NEWER_SDK': 'System version too new',
                  'INSTALL_FAILED_TEST_ONLY': 'Caller is not allowed to install this test-only package',
                  'INSTALL_FAILED_CPU_ABI_INCOMPATIBLE': 'The included native code is incompatible',
                  'CPU_ABIINSTALL_FAILED_MISSING_FEATURE': 'An invalid feature is used',
                  'INSTALL_FAILED_CONTAINER_ERROR': 'SD card access failed',
                  'INSTALL_FAILED_INVALID_INSTALL_LOCATION': 'Invalid install location',
                  'INSTALL_FAILED_MEDIA_UNAVAILABLE': 'SD card does not exist',
                  'INSTALL_FAILED_INTERNAL_ERROR': 'Installation failed due to a system problem',
                  'INSTALL_PARSE_FAILED_NO_CERTIFICATES': 'File is not certified >> enable unknown sources in settings',
                  'INSTALL_PARSE_FAILED_INCONSISTENT_CERTIFICATES': 'Inconsistent certificates >> uninstall the old version first',
                  'INSTALL_FAILED_INVALID_ZIP_FILE': 'Invalid zip file >> uninstall the old version first',
                  'INSTALL_CANCELED_BY_USER': 'Installation requires user confirmation',
                  'INSTALL_FAILED_VERIFICATION_FAILURE': 'Verification failed >> try rebooting the phone',
                  'DEFAULT': 'Unknown error'
                  }
        print('Installing...')
        l = self.adb('install -r %s' % (path,)).read()
        if 'Success' in l:
            print('Install Success')
        if 'Failure' in l:
            reg = re.compile('\\[(.+?)\\]')
            key = re.findall(reg, l)[0]
            try:
                print('Install Failure >> %s' % errors[key])
            except KeyError:
                print('Install Failure >> %s' % key)
        return l
    def uninstall(self, package):
        """
        Uninstall an apk
        :param package: package name
        :return:
        """
        print('Uninstalling...')
        l = self.adb('uninstall %s' % (package,)).read()
        print(l)
    def screenshot(self, target_path=''):
        """
        Take a screenshot on the phone
        :param target_path: target path
        :return:
        """
        format_time = utils.timetools.timestamp('%Y%m%d%H%M%S')
        self.shell('screencap -p /sdcard/%s.png' % (format_time,))
        time.sleep(1)
        if target_path == '':
            self.pull('/sdcard/%s.png' % (format_time,), os.path.expanduser('~'))
        else:
            self.pull('/sdcard/%s.png' % (format_time,), target_path)
        self.remove('/sdcard/%s.png' % (format_time,))
    def get_cache_logcat(self):
        """
        Export the buffered log
        :return:
        """
        return self.adb('logcat -v time -d')
    def get_crash_logcat(self):
        """
        Export the crash log
        :return:
        """
        return self.adb('logcat -v time -d | %s AndroidRuntime' % (self.__find,))
    def clear_cache_logcat(self):
        """
        Clear the log buffer
        :return:
        """
        self.adb('logcat -c')
    def get_device_time(self):
        """
        Get the device time
        :return:
        """
        return self.shell('date').read().strip()
    def ls(self, command):
        """
        shell ls command
        :return:
        """
        return self.shell('ls %s' % (command,)).readlines()
    def file_exists(self, target):
        """
        Check whether a file exists at the target path
        :return:
        """
        l = self.ls(target)
        for i in l:
            if i.strip() == target:
                return True
        return False
    def is_install(self, target_app):
        """
        Check whether the target app is installed on the device
        :param target_app: target app package name
        :return: bool
        """
        return target_app in self.shell('pm list packages %s' % (target_app,)).read()
    def get_device_model(self):
        """
        Get the device model
        :return:
        """
        return self.shell('getprop ro.product.model').read().strip()
    def get_device_id(self):
        """
        Get the device id
        :return:
        """
        return self.adb('get-serialno').read().strip()
    def get_device_android_version(self):
        """
        Get the device's Android version
        :return:
        """
        return self.shell('getprop ro.build.version.release').read().strip()
    def get_device_sdk_version(self):
        """
        Get the device's SDK version
        :return:
        """
        return self.shell('getprop ro.build.version.sdk').read().strip()
    def get_device_mac_address(self):
        """
        Get the device's MAC address
        :return:
        """
        return self.shell('cat /sys/class/net/wlan0/address').read().strip()
    def get_device_ip_address(self):
        """
        Get the device's IP address
        pass: works for WiFi and cellular data
        :return:
        """
        if not self.get_wifi_state() and not self.get_data_state():
            return
        l = self.shell('ip addr | %s global' % self.__find).read()
        reg = re.compile('\d+\.\d+\.\d+\.\d+')
        return re.findall(reg, l)[0]
    def get_device_imei(self):
        """
        Get the device IMEI
        :return:
        """
        sdk = self.get_device_sdk_version()
        # Method for Android below 5.0
        if int(sdk) < 21:
            l = self.shell('dumpsys iphonesubinfo').read()
            reg = re.compile('[0-9]{15}')
            return re.findall(reg, l)[0]
        elif self.root():
            l = self.shell('service call iphonesubinfo 1').read()
            print(l)
            print(re.findall(re.compile("'.+?'"), l))
            imei = ''
            for i in re.findall(re.compile("'.+?'"), l):
                imei += i.replace('.', '').replace("'", '').replace(' ', '')
            return imei
        else:
            print('The device is not rooted.')
            return ''
    def check_sim_card(self):
        """
        Check the device's SIM card
        :return:
        """
        return len(self.shell('getprop | %s gsm.operator.alpha]' % self.__find).read().strip().split()[-1]) > 2
    def get_device_operators(self):
        """
        Get the carrier
        :return:
        """
        return self.shell('getprop | %s gsm.operator.alpha]' % self.__find).read().strip().split()[-1]
    def get_device_state(self):
        """
        Get the device state
        :return:
        """
        return self.adb('get-state').read().strip()
    def get_display_state(self):
        """
        Get the screen state
        :return: screen on / screen off
        """
        l = self.shell('dumpsys power').readlines()
        for i in l:
            if 'mScreenOn=' in i:
                return i.split()[-1] == 'mScreenOn=true'
            if 'Display Power' in i:
                return 'ON' in i.split('=')[-1].upper()
    def get_screen_normal_size(self):
        """
        Get the device's nominal screen resolution
        :return:
        """
        return self.shell('wm size').read().strip().split()[-1].split('x')
    def get_screen_reality_size(self):
        """
        Get the device's actual screen resolution
        :return:
        """
        x = 0
        y = 0
        l = self.shell(r'getevent -p | %s -e "0"' % self.__find).readlines()
        for n in l:
            if len(n.split()) > 0:
                if n.split()[0] == '0035':
                    x = int(n.split()[7].split(',')[0])
                elif n.split()[0] == '0036':
                    y = int(n.split()[7].split(',')[0])
        return x, y
    def get_device_interior_sdcard(self):
        """
        Get internal SD card space
        :return: (path,total,used,free,block)
        """
        return self.shell('df | %s \/mnt\/shell\/emulated' % self.__find).read().strip().split()
    def get_device_external_sdcard(self):
        """
        Get external SD card space
        :return: (path,total,used,free,block)
        """
        return self.shell('df | %s \/storage' % self.__find).read().strip().split()
    def __fill_rom(self, path, stream, count):
        """
        Fill with data
        :param path: fill target path
        :param stream: block size of the fill stream
        :param count: number of fill iterations
        :return:
        """
        self.shell('dd if=/dev/zero of=%s bs=%s count=%s' % (path, stream, count)).read().strip()
    def fill_interior_sdcard(self, filename, size):
        """
        Fill the internal SD card
        :param filename: file name
        :param size: fill size, in bytes
        :return:
        """
        if size > 10485760:  # 10m
            self.__fill_rom('sdcard/%s' % filename, 10485760, size / 10485760)
        else:
            self.__fill_rom('sdcard/%s' % filename, size, 1)
    def fill_external_sdcard(self, filename, size):
        """
        Fill the external SD card
        :param filename: file name
        :param size: fill size, in bytes
        :return:
        """
        path = self.get_device_external_sdcard()[0]
        if size > 10485760:  # 10m
            self.__fill_rom('%s/%s' % (path, filename), 10485760, size / 10485760)
        else:
            self.__fill_rom('%s/%s' % (path, filename), size, 1)
    def kill_process(self, pid):
        """
        Kill a process
        pass: usually requires root permissions; not recommended
        :return:
        """
        return self.shell('kill %s' % pid).read().strip()
    def quit_app(self, package):
        """
        Quit an application
        :return:
        """
        return self.shell('am force-stop %s' % package).read().strip()
    def reboot(self):
        """
        Reboot the device
        :return:
        """
        self.adb('reboot')
    def recovery(self):
        """
        Reboot the device into recovery mode
        :return:
        """
        self.adb('reboot recovery')
    def fastboot(self):
        """
        Reboot the device into fastboot mode
        :return:
        """
        self.adb('reboot bootloader')
    def root(self):
        """
        Get root status
        :return:
        """
        return 'not found' not in self.shell('su -c ls -l /data/').read().strip()
    def wifi(self, power):
        """
        Turn WiFi on/off
        pass: requires root permissions
        :return:
        """
        if not self.root():
            print('The device is not rooted.')
            return
        if power:
            self.shell('su -c svc wifi enable').read().strip()
        else:
            self.shell('su -c svc wifi disable').read().strip()
    def data(self, power):
        """
        Turn cellular data on/off
        pass: requires root permissions
        :return:
        """
        if not self.root():
            print('The device is not rooted.')
            return
        if power:
            self.shell('su -c svc data enable').read().strip()
        else:
            self.shell('su -c svc data disable').read().strip()
    def get_wifi_state(self):
        """
        Get the WiFi connection state
        :return:
        """
        return 'enabled' in self.shell('dumpsys wifi | %s ^Wi-Fi' % self.__find).read().strip()
    def get_data_state(self):
        """
        Get the mobile network connection state
        :return:
        """
        return '2' in self.shell('dumpsys telephony.registry | %s mDataConnectionState' % self.__find).read().strip()
    def get_network_state(self):
        """
        Check whether the device is connected to the Internet
        :return:
        """
        return 'unknown host' not in self.shell('ping -w 1 www.baidu.com').read().strip()
    def get_wifi_password_list(self):
        """
        Get the list of saved WiFi passwords
        :return:
        """
        if not self.root():
            print('The device is not rooted.')
            return []
        l = re.findall(re.compile('ssid=".+?"\s{3}psk=".+?"'), self.shell('su -c cat /data/misc/wifi/*.conf').read())
        return [re.findall(re.compile('".+?"'), i) for i in l]
    def call(self, number):
        """
        Make a phone call
        :param number:
        :return:
        """
        self.shell('am start -a android.intent.action.CALL -d tel:%s' % number)
    def open_url(self, url):
        """
        Open a web page
        :return:
        """
        self.shell('am start -a android.intent.action.VIEW -d %s' % url)
    def start_application(self, component):
        """
        Start an application
        e.g: com.android.settings/com.android.settings.Settings
        """
        self.shell("am start -n %s" % component)
    def send_keyevent(self, keycode):
        """
        Send a key event
        https://developer.android.com/reference/android/view/KeyEvent.html
        :return:
        """
        self.shell('input keyevent %s' % keycode)
    def rotation_screen(self, param):
        """
        Rotate the screen
        :param param: 0 >> portrait, disable auto-rotation; 1 >> auto-rotate
        :return:
        """
        self.shell('/system/bin/content insert --uri content://settings/system --bind '
                   'name:s:accelerometer_rotation --bind value:i:%s' % param)
    def instrument(self, command):
        """
        Launch an instrumented app
        :param command: command
        :return:
        """
        return self.shell('am instrument %s' % command).read()
    def export_apk(self, package, target_path='', timeout=5000):
        """
        Export an application from the device
        :param timeout: timeout, in polling iterations
        :param target_path: path where the exported apk is stored
        :param package: package name
        :return:
        """
        num = 0
        if target_path == '':
            target_path = os.path.expanduser('~')
        self.adb('pull /data/app/%s-1/base.apk %s' % (package, target_path))
        # Poll until the pulled file shows up, rename it, then stop; give up after timeout
        while num <= timeout:
            num += 1
            if os.path.exists(os.path.join(target_path, 'base.apk')):
                os.rename(os.path.join(target_path, 'base.apk'),
                          os.path.join(target_path, '%s.apk' % package))
                break
class KeyCode:
    KEYCODE_CALL = 5  # Call key
    KEYCODE_ENDCALL = 6  # End-call key
    KEYCODE_HOME = 3  # Home key
    KEYCODE_MENU = 82  # Menu key
    KEYCODE_BACK = 4  # Back key
    KEYCODE_SEARCH = 84  # Search key
    KEYCODE_CAMERA = 27  # Camera key
    KEYCODE_FOCUS = 80  # Focus key
    KEYCODE_POWER = 26  # Power key
    KEYCODE_NOTIFICATION = 83  # Notification key
    KEYCODE_MUTE = 91  # Microphone mute key
    KEYCODE_VOLUME_MUTE = 164  # Speaker mute key
    KEYCODE_VOLUME_UP = 24  # Volume up key
    KEYCODE_VOLUME_DOWN = 25  # Volume down key
    KEYCODE_ENTER = 66  # Enter key
    KEYCODE_ESCAPE = 111  # ESC key
    KEYCODE_DPAD_CENTER = 23  # D-pad >> confirm key
    KEYCODE_DPAD_UP = 19  # D-pad >> up
    KEYCODE_DPAD_DOWN = 20  # D-pad >> down
    KEYCODE_DPAD_LEFT = 21  # D-pad >> left
    KEYCODE_DPAD_RIGHT = 22  # D-pad >> right
    KEYCODE_MOVE_HOME = 122  # Move cursor to start key
    KEYCODE_MOVE_END = 123  # Move cursor to end key
    KEYCODE_PAGE_UP = 92  # Page up key
    KEYCODE_PAGE_DOWN = 93  # Page down key
    KEYCODE_DEL = 67  # Backspace key
    KEYCODE_FORWARD_DEL = 112  # Delete key
    KEYCODE_INSERT = 124  # Insert key
    KEYCODE_TAB = 61  # Tab key
    KEYCODE_NUM_LOCK = 143  # Num lock key
    KEYCODE_CAPS_LOCK = 115  # Caps lock key
    KEYCODE_BREAK = 121  # Break / Pause key
    KEYCODE_SCROLL_LOCK = 116  # Scroll lock key
    KEYCODE_ZOOM_IN = 168  # Zoom in key
    KEYCODE_ZOOM_OUT = 169  # Zoom out key
KEYCODE_0 = 7
KEYCODE_1 = 8
KEYCODE_2 = 9
KEYCODE_3 = 10
KEYCODE_4 = 11
KEYCODE_5 = 12
KEYCODE_6 = 13
KEYCODE_7 = 14
KEYCODE_8 = 15
KEYCODE_9 = 16
KEYCODE_A = 29
KEYCODE_B = 30
KEYCODE_C = 31
KEYCODE_D = 32
KEYCODE_E = 33
KEYCODE_F = 34
KEYCODE_G = 35
KEYCODE_H = 36
KEYCODE_I = 37
KEYCODE_J = 38
KEYCODE_K = 39
KEYCODE_L = 40
KEYCODE_M = 41
KEYCODE_N = 42
KEYCODE_O = 43
KEYCODE_P = 44
KEYCODE_Q = 45
KEYCODE_R = 46
KEYCODE_S = 47
KEYCODE_T = 48
KEYCODE_U = 49
KEYCODE_V = 50
KEYCODE_W = 51
KEYCODE_X = 52
KEYCODE_Y = 53
KEYCODE_Z = 54
    KEYCODE_PLUS = 81  # +
    KEYCODE_MINUS = 69  # -
    KEYCODE_STAR = 17  # *
    KEYCODE_SLASH = 76  # /
    KEYCODE_EQUALS = 70  # =
    KEYCODE_AT = 77  # @
    KEYCODE_POUND = 18  # #
    KEYCODE_APOSTROPHE = 75  # '
    KEYCODE_BACKSLASH = 73  # \
    KEYCODE_COMMA = 55  # ,
    KEYCODE_PERIOD = 56  # .
    KEYCODE_LEFT_BRACKET = 71  # [
    KEYCODE_RIGHT_BRACKET = 72  # ]
    KEYCODE_SEMICOLON = 74  # ;
    KEYCODE_GRAVE = 68  # `
    KEYCODE_SPACE = 62  # Space key
    KEYCODE_MEDIA_PLAY = 126  # Media key >> play
    KEYCODE_MEDIA_STOP = 86  # Media key >> stop
    KEYCODE_MEDIA_PAUSE = 127  # Media key >> pause
    KEYCODE_MEDIA_PLAY_PAUSE = 85  # Media key >> play / pause
    KEYCODE_MEDIA_FAST_FORWARD = 90  # Media key >> fast forward
    KEYCODE_MEDIA_REWIND = 89  # Media key >> rewind
    KEYCODE_MEDIA_NEXT = 87  # Media key >> next track
    KEYCODE_MEDIA_PREVIOUS = 88  # Media key >> previous track
    KEYCODE_MEDIA_CLOSE = 128  # Media key >> close
    KEYCODE_MEDIA_EJECT = 129  # Media key >> eject
    KEYCODE_MEDIA_RECORD = 130  # Media key >> record
if __name__ == '__main__':
a = AdbTools()
    pass
# Generated by Django 4.0.1 on 2022-04-17 22:48
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('company', '0017_inspection_description'),
]
operations = [
migrations.AlterField(
model_name='inspection',
name='is_inspection_successful',
field=models.BooleanField(default=None, null=True),
),
]
# use sys._getframe() -- it returns a frame object, whose attribute
# f_code is a code object, whose attribute co_name is the name:
import sys
this_function_name = sys._getframe().f_code.co_name
# the frame and code objects also offer other useful information:
this_line_number = sys._getframe().f_lineno
this_filename = sys._getframe().f_code.co_filename
# also, by calling sys._getframe(1), you can get this information
# for the *caller* of the current function. So you can package
# this functionality up into your own handy functions:
def whoami():
import sys
return sys._getframe(1).f_code.co_name
me = whoami()
# this uses argument 1, because the call to whoami is now frame 0.
# and similarly:
def callersname():
import sys
return sys._getframe(2).f_code.co_name
him = callersname()
#!/usr/bin/python
##############################################################################################
# Copyright (C) 2014 Pier Luigi Ventre - (Consortium GARR and University of Rome "Tor Vergata")
# Copyright (C) 2014 Giuseppe Siracusano, Stefano Salsano - (CNIT and University of Rome "Tor Vergata")
# www.garr.it - www.uniroma2.it/netgroup - www.cnit.it
#
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Virtual Leased Line Pusher.
#
# @author Pier Luigi Ventre <pl.ventre@gmail.com>
# @author Giuseppe Siracusano <a_siracusano@tin.it>
# @author Stefano Salsano <stefano.salsano@uniroma2.it>
#
#
import os
import sys
import subprocess
import json
import argparse
import io
import time
import re
import siphash
# XXX Be Careful, For Now The Vll_Pusher Depends On vll_pusher.cfg; This file should be created by the [x] Deployer
# (x = Mininet Deployer, TestBeds Deployer)
# Parse vll options. Currently supports add and delete actions.
# Syntax:
# vll_pusher --controller {IP:REST_PORT} --add
# vll_pusher --controller {IP:REST_PORT} --delete
def parse_cmd_line():
parser = argparse.ArgumentParser(description='Virtual Leased Line Pusher')
parser.add_argument('--controller', dest='controllerRestIp', action='store', default='localhost:8080', help='controller IP:RESTport, e.g., localhost:8080 or A.B.C.D:8080')
parser.add_argument('--add', dest='action', action='store_const', const='add', default='add', help='action: add')
parser.add_argument('--delete', dest='action', action='store_const', const='delete', default='add', help='action: delete')
args = parser.parse_args()
if len(sys.argv)==1:
parser.print_help()
sys.exit(1)
return args
# Read From vll_pusher.cfg The Configuration For The Vlls
def read_conf_file():
    global pusher_cfg
    print "*** Read Configuration File For Vll Pusher"
    path = "vll_pusher.cfg"
    if os.path.exists(path):
        conf = open(path,'r')
        pusher_cfg = json.load(conf)
        conf.close()
    else:
        print "No Configuration File Found In %s" % path
        sys.exit(-2)
    print "*** PUSHER_CFG", json.dumps(pusher_cfg, sort_keys=True, indent=4)
# Utility function for the vlls persistence
def store_vll(name, dpid):
    # Store created vll attributes in local ./vlls.json
    datetime = time.asctime()
    vllParams = {'name': name, 'Dpid':dpid, 'datetime':datetime}
    entry = json.dumps(vllParams)
    vllsDb = open('./vlls.json','a+')
    vllsDb.write(entry+"\n")
    vllsDb.close()
intf_to_port_number = {}
def convert_intf_to_port_number(controllerRestIP):
global intf_to_port_number
command = "curl -s http://%s/wm/core/controller/switches/json | python -mjson.tool" % (controllerRestIP)
result = os.popen(command).read()
parsedResult = json.loads(result)
default = None
for vll in pusher_cfg['vlls']:
lhs_intf = vll['lhs_intf']
lhs_dpid = vll['lhs_dpid']
port_number = intf_to_port_number.get("%s-%s" % (lhs_dpid, lhs_intf), default)
if port_number == None :
for switch in parsedResult:
if switch["dpid"] == lhs_dpid:
for port in switch["ports"]:
if port["name"] == lhs_intf:
port_number = str(port["portNumber"])
intf_to_port_number["%s-%s" % (lhs_dpid, lhs_intf)] = port_number
vll['lhs_intf'] = port_number
rhs_intf = vll['rhs_intf']
rhs_dpid = vll['rhs_dpid']
port_number = intf_to_port_number.get("%s-%s" % (rhs_dpid, rhs_intf), default)
if port_number == None :
for switch in parsedResult:
if switch["dpid"] == rhs_dpid:
for port in switch["ports"]:
if port["name"] == rhs_intf:
port_number = str(port["portNumber"])
intf_to_port_number["%s-%s" % (rhs_dpid, rhs_intf)] = port_number
vll['rhs_intf'] = port_number
print "*** PUSHER_CFG", json.dumps(pusher_cfg, sort_keys=True, indent=4)
print "*** INTFS", json.dumps(intf_to_port_number, sort_keys=True, indent=4)
# Add Vlls Reading All the Information From Configuration File
def add_command(args):
print "*** Add Vlls From Configuration File"
print "*** Read Previous Vlls Inserted"
if os.path.exists('./vlls.json'):
vllsDb = open('./vlls.json','r')
vlllines = vllsDb.readlines()
vllsDb.close()
else:
vlllines={}
read_conf_file()
# We use this algorithm for the name generation
key = '0123456789ABCDEF'
sip = siphash.SipHash_2_4(key)
    # Extract the controller information from the cmd line options
controllerRestIp = args.controllerRestIp
# Dictionary that stores the mapping port:next_label
# We allocate the label using a counter, and we associate for each port used in this execution the next usable label
# Probably in future we can add the persistence for the label
sw_port_tag = {}
convert_intf_to_port_number(controllerRestIp)
# We can have more than one vlls
for vll in pusher_cfg['vlls']:
# Retrieve the information
srcSwitch = vll['lhs_dpid']
srcPort = vll['lhs_intf']
dstSwitch = vll['rhs_dpid']
dstPort = vll['rhs_intf']
srcLabel = vll['lhs_label']
dstLabel = vll['rhs_label']
print "*** Generate Name From VLL (%s-%s-%s) - (%s-%s-%s)" % (srcSwitch, srcPort, srcLabel, dstSwitch, dstPort, dstLabel)
sip.update(srcSwitch + "$" + srcPort + "$" + dstSwitch + "$" + dstPort + "$" + srcLabel + "$" + dstLabel)
# Generate the name
digest = sip.hash()
digest = str(digest)
print "*** Vll Name", digest
vllExists = False
# if the vll exists in the vllDb, we don't insert the flow
for line in vlllines:
data = json.loads(line)
if data['name']==(digest):
print "Vll %s exists already Skip" % digest
vllExists = True
break
if vllExists == True:
continue
print "*** Create Vll:"
print "*** From Source Device OSHI-PE %s Port %s" % (srcSwitch,srcPort)
print "*** To Destination Device OSHI-PE %s Port %s"% (dstSwitch,dstPort)
# Retrieving route from source to destination
# using Routing rest API
command = "curl -s http://%s/wm/topology/route/%s/%s/%s/%s/json | python -mjson.tool" % (controllerRestIp, srcSwitch, srcPort, dstSwitch, dstPort)
result = os.popen(command).read()
parsedResult = json.loads(result)
print
#print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
        # Dictionary used to store the labels of the current vll
        temp_sw_port_tag = {}
        # We insert a rule for every two json items, because floodlight's getRoute provides,
        # for each dpid, a pair of items: the in/out port and the out/in port for the
        # forward/reverse rules - see the output of the previous command
temp_key1 = None
temp_key2 = None
temp_tag1 = None
temp_tag2 = None
ap1Dpid = None
ap1Port = None
ap2Dpid = None
ap2Port = None
default = 2
max_value = 4095
if int(srcLabel) > max_value or int(dstLabel) > max_value:
print "Ingress or Egress Label Not Allowable"
sys.exit(-2)
        # We generate the labels associated with each port, while the ingress/egress and egress/ingress labels
        # come from the configuration file, because they depend on the local network choice
for j in range(0, (len(parsedResult))):
# Label for the LHS port
if j == 0:
temp_key1 = srcSwitch + "-" + srcPort
temp_sw_port_tag[temp_key1] = int(srcLabel)
if sw_port_tag.get(temp_key1,default) <= int(srcLabel):
sw_port_tag[temp_key1] = int(srcLabel)
# Label for the RHS port
elif j == (len(parsedResult)-1):
temp_key1 = dstSwitch + "-" + dstPort
temp_sw_port_tag[temp_key1] = int(dstLabel)
if sw_port_tag.get(temp_key1,default) <= int(dstLabel):
sw_port_tag[temp_key1] = int(dstLabel)
# Middle ports
else :
apDPID = parsedResult[j]['switch']
apPORT = parsedResult[j]['port']
temp_key1 = apDPID + "-" + str(apPORT)
value = sw_port_tag.get(temp_key1, default)
temp_sw_port_tag[temp_key1] = value
value = value + 1
sw_port_tag[temp_key1] = value
print "*** Current Route Tag:"
print json.dumps(temp_sw_port_tag, sort_keys=True, indent=4)
print
print "*** Global Routes Tag:"
print json.dumps(sw_port_tag, sort_keys=True, indent=4)
print
# Manage the special case of one hop
if len(parsedResult) == 2:
print "*** One Hop Route"
# The Switch, where we insert the rule
ap1Dpid = parsedResult[0]['switch']
# In port
ap1Port = str(parsedResult[0]['port'])
temp_key1 = ap1Dpid + "-" + ap1Port
tag1 = temp_sw_port_tag[temp_key1]
# ap1Dpid == ap2Dpid
ap2Dpid = parsedResult[1]['switch']
# Out port
ap2Port = str(parsedResult[1]['port'])
temp_key2 = ap2Dpid + "-" + ap2Port
tag2 = temp_sw_port_tag[temp_key2]
if tag1 == 0 and tag2 ==0:
            # Forward rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", ap1Port, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 !=0 and tag2==0:
            # Forward rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"strip-vlan,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 ==0 and tag2 !=0:
            # Forward rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", "0xffff", ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            # Forward rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
if tag2 == 0 and tag1 ==0:
            # Reverse rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", ap2Port, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag2 != 0 and tag1 ==0:
            # Reverse rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"strip-vlan,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag2, ap2Port, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag2 == 0 and tag1 !=0:
            # Reverse rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", "0xffff", ap2Port, tag1, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            # Reverse rule
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag2, ap2Port, tag1, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
store_vll(digest, ap1Dpid)
# see the image one_hop for details on the switching label procedure
else:
        # In the other cases we use a different approach for the rules. Above, with one hop, we looked at the
        # labels of the in port and the out port on the same dpid; with more than one hop, the forward rule
        # generally uses the label of the in port on the next switch, while the reverse rule uses the label
        # of the in port on the previous switch. That logic runs in a for loop over the middle dpids, while
        # the ingress/egress nodes are handled as special cases, because their rules are different
print "*** %s Hop Route" % (len(parsedResult)/2)
# We manage first ingress/egress node
print "*** Create Ingress Rules For LHS Of The Vll - %s" % (srcSwitch)
# see the image more_than_one_hop for details on the switching label procedure
ap1Dpid = parsedResult[0]['switch']
ap1Port = parsedResult[0]['port']
temp_key1 = ap1Dpid + "-" + str(ap1Port)
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key1, tag1)
ap2Dpid = parsedResult[1]['switch']
ap2Port = parsedResult[1]['port']
temp_key2 = parsedResult[2]['switch'] + "-" + str(parsedResult[2]['port'])
tag2 = temp_sw_port_tag[temp_key2]
print "*** outKey: %s, outTag: %s" % (temp_key2, tag2)
print
if tag1 == 0 and tag2 !=0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", "0xffff", ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 != 0 and tag2 !=0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            print "Error Tag"
sys.exit(-2)
print "*** Create Egress Rules For LHS Of The Vll - %s" % (srcSwitch)
temp_key2 = temp_key1
tag2 = tag1
temp_key1 = ap2Dpid + "-" + str(ap2Port)
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key1, tag1)
print "*** outKey: %s, outTag: %s" % (temp_key2, tag2)
print
if tag1 != 0 and tag2 ==0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"strip-vlan,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag1, ap2Port, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 != 0 and tag2 !=0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag1, ap2Port, tag2, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            print "Error Tag"
sys.exit(-2)
store_vll(digest, ap1Dpid)
print "*** Create Egress Rules For RHS Of The Vll - %s" % (dstSwitch)
ap1Dpid = parsedResult[len(parsedResult)-2]['switch']
ap1Port = parsedResult[len(parsedResult)-2]['port']
temp_key1 = ap1Dpid + "-" + str(ap1Port)
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key1, tag1)
ap2Dpid = parsedResult[len(parsedResult)-1]['switch']
ap2Port = parsedResult[len(parsedResult)-1]['port']
temp_key2 = ap2Dpid + "-" + str(ap2Port)
tag2 = temp_sw_port_tag[temp_key2]
print "*** outKey: %s, outTag: %s" % (temp_key2, tag2)
print
if tag1 != 0 and tag2 ==0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"strip-vlan,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 != 0 and tag2 !=0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            print "Error Tag"
sys.exit(-2)
print "*** Create Ingress Rules For RHS Of The Vll - %s" % (dstSwitch)
temp_key1 = parsedResult[len(parsedResult)-3]['switch'] + "-" + str(parsedResult[len(parsedResult)-3]['port'])
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key2, tag2)
print "*** outKey: %s, outTag: %s" % (temp_key1, tag1)
print
if tag1 != 0 and tag2 ==0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", "0xffff", ap2Port, tag1, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
elif tag1 != 0 and tag2 !=0:
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag2, ap2Port, tag1, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
else:
            print "Error Tag"
sys.exit(-2)
store_vll(digest, ap1Dpid)
# Now we manage the middle nodes
for i in range(2, (len(parsedResult)-2)):
print "index:", i
if i % 2 == 0:
ap1Dpid = parsedResult[i]['switch']
ap1Port = parsedResult[i]['port']
print ap1Dpid, ap1Port
else:
ap2Dpid = parsedResult[i]['switch']
ap2Port = parsedResult[i]['port']
print ap2Dpid, ap2Port
print "*** Create Rules For %s" % ap1Dpid
# send one flow mod per pair in route
# using StaticFlowPusher rest API
temp_key1 = ap1Dpid + "-" + str(ap1Port)
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key1, tag1)
temp_key2 = parsedResult[i+1]['switch'] + "-" + str(parsedResult[i+1]['port'])
tag2 = temp_sw_port_tag[temp_key2]
print "*** outKey: %s, outTag: %s" % (temp_key2, tag2)
print
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".f", tag1, ap1Port, tag2, ap2Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
temp_key1 = ap2Dpid + "-" + str(ap2Port)
tag1 = temp_sw_port_tag[temp_key1]
print "*** inKey: %s, inTag: %s" % (temp_key1, tag1)
temp_key2 = parsedResult[i-2]['switch'] + "-" + str(parsedResult[i-2]['port'])
tag2 = temp_sw_port_tag[temp_key2]
print "*** outKey: %s, outTag: %s" % (temp_key2, tag2)
print
command = "curl -s -d '{\"switch\": \"%s\", \"name\":\"%s\", \"vlan-id\":\"%s\", \"cookie\":\"0\", \"priority\":\"32768\", \"ingress-port\":\"%s\",\"active\":\"true\", \"actions\":\"set-vlan-id=%s,output=%s\"}' http://%s/wm/staticflowentrypusher/json | python -mjson.tool" % (ap1Dpid, ap1Dpid + "." + digest + ".r", tag1, ap2Port, tag2, ap1Port, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
store_vll(digest, ap1Dpid)
def del_command(args):
print "*** Delete Vlls From Configuration File"
print "*** Read Previous Vlls Inserted"
if os.path.exists('vlls.json'):
vllsDb = open('vlls.json','r')
lines = vllsDb.readlines()
vllsDb.close()
vllsDb = open('vlls.json','w')
else:
        lines = []
print "*** No Vlls Inserted"
return
    # Removing previously created flows from the switches
    # using the StaticFlowPusher REST API
    # currently, this script records created vlls in the local file ./vlls.json
    # with the vll name and the switch each rule was installed on
controllerRestIp = args.controllerRestIp
for line in lines:
data = json.loads(line)
sw = data['Dpid']
digest = data['name']
print "*** Deleting Vll: %s - Switch %s" % (digest,sw)
command = "curl -X DELETE -d '{\"name\":\"%s\", \"switch\":\"%s\"}' http://%s/wm/staticflowentrypusher/json 2> /dev/null | python -mjson.tool" % (sw + "." + digest + ".f", sw, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
command = "curl -X DELETE -d '{\"name\":\"%s\", \"switch\":\"%s\"}' http://%s/wm/staticflowentrypusher/json 2> /dev/null | python -mjson.tool" % (sw + "." + digest +".r", sw, controllerRestIp)
result = os.popen(command).read()
print "*** Sent Command:", command + "\n"
print "*** Received Result:", result + "\n"
vllsDb.close()
def run_command(args):
    if args.action == 'add':
        add_command(args)
    elif args.action == 'delete':
        del_command(args)
if __name__ == '__main__':
args = parse_cmd_line()
run_command(args)
# === iot_services_sdk/session.py (sap-archive/iot-services-sdk) ===
""" Author: Philipp Steinrötter (steinroe) """
from .iot_service import IoTService
from .response import Response
class SessionService(IoTService):
def __init__(self,
instance,
user,
password):
"""Instantiate SessionService object
Arguments:
instance {string} -- IoT Services instance
user {string} -- IoT Services user
password {string} -- IoT Services password
"""
self.service = ''
IoTService.__init__(
self,
instance=instance,
user=user,
password=password
)
def logout(self) -> Response:
"""Logs out the user by invalidating the session. The user is identified via session cookie or the
Authorization header. """
service = '/logout'
response = self.request_core(method='POST', service=service, accept_json=True)
return response
def me(self) -> Response:
"""The current user is identified via session cookie or the Authorization header."""
service = '/me'
response = self.request_core(method='POST', service=service, accept_json=True)
return response
# === tests/test_pieces.py (trslater/chess) ===
from chess.pieces import Pawn, Knight, Bishop, Rook, Queen, King
class TestPiece:
def test_sum(self):
groups = ((Pawn(), Knight(), Bishop()),
(Knight(), Bishop(), Queen()),
(Pawn(), Pawn(), Pawn(), Pawn()))
actual_sums = tuple(map(sum, groups))
expected_sums = (7, 15, 4)
for actual_sum, expected_sum in zip(actual_sums, expected_sums):
assert actual_sum == expected_sum
def test_compare(self):
assert Knight() == Knight()
assert Bishop() == Knight()
assert King() <= King()
assert Bishop() <= Rook()
assert King() > Queen()
assert Pawn() < Knight()
assert Queen() != Rook()
# === tarpn/netrom/router.py (rxt1077/tarpn-node-controller) ===
import datetime
import os
from dataclasses import dataclass, field
from operator import attrgetter
from typing import List, Dict, Optional, cast, Set
from tarpn.ax25 import AX25Call
from tarpn.netrom import NetRomPacket, NetRomNodes, NodeDestination
from tarpn.network import L3RoutingTable, L3Address
import tarpn.network.netrom_l3 as l3
from tarpn.util import json_dump, json_load
@dataclass
class Neighbor:
call: AX25Call
port: int
quality: int
def __hash__(self):
return hash(self.call)
def to_safe_dict(self):
return {
"call": str(self.call),
"port": self.port,
"quality": self.quality
}
@classmethod
def from_safe_dict(cls, d):
return cls(call=AX25Call.parse(d["call"]), port=d["port"], quality=d["quality"])
@dataclass
class Route:
neighbor: AX25Call
dest: AX25Call
next_hop: AX25Call
quality: int
obsolescence: int
def to_safe_dict(self):
return {
"neighbor": str(self.neighbor),
"destination": str(self.dest),
"next_hop": str(self.next_hop),
"quality": self.quality,
"obsolescence": self.obsolescence
}
@classmethod
def from_safe_dict(cls, d):
return cls(neighbor=AX25Call.parse(d["neighbor"]), dest=AX25Call.parse(d["destination"]),
next_hop=AX25Call.parse(d["next_hop"]), quality=d["quality"], obsolescence=d["obsolescence"])
def __hash__(self):
return hash((self.neighbor, self.dest))
@dataclass
class Destination:
node_call: AX25Call
node_alias: str
neighbor_map: Dict[str, Route] = field(default_factory=dict, compare=False, hash=False)
freeze: bool = False
def __hash__(self):
return hash((self.node_call, self.node_alias))
def to_safe_dict(self):
return {
"call": str(self.node_call),
"alias": self.node_alias,
"freeze": self.freeze,
"routes": [route.to_safe_dict() for route in self.neighbor_map.values()]
}
@classmethod
def from_safe_dict(cls, d):
instance = cls(node_call=AX25Call.parse(d["call"]), node_alias=d["alias"], freeze=d["freeze"])
instance.neighbor_map = {
route_dict["neighbor"]: Route.from_safe_dict(route_dict) for route_dict in d["routes"]
}
return instance
def sorted_neighbors(self):
return sorted(self.neighbor_map.values(), key=attrgetter("quality"), reverse=True)
@dataclass
class NetRomRoutingTable(L3RoutingTable):
node_alias: str
updated_at: datetime.datetime = field(default_factory=datetime.datetime.now)
our_calls: Set[AX25Call] = field(default_factory=set, compare=False, hash=False)
# Neighbors is a map of direct neighbors we have, i.e., who we have heard NODES from
neighbors: Dict[str, Neighbor] = field(default_factory=dict, compare=False, hash=False)
# Destinations is the content of the NODES table, what routes exist to other nodes through which neighbors
destinations: Dict[str, Destination] = field(default_factory=dict, compare=False, hash=False)
# TODO config all these
default_obs: int = 100
default_quality: int = 255
min_quality: int = 50
min_obs: int = 4
def __repr__(self):
s = "Neighbors:\n"
for neighbor in self.neighbors.values():
s += f"\t{neighbor}\n"
s += "Destinations:\n"
for dest in self.destinations.values():
s += f"\t{dest}\n"
return s.strip()
def __hash__(self):
return hash((self.node_alias, self.updated_at))
def save(self, filename: str):
d = {
"node_alias": self.node_alias,
"updated_at": self.updated_at.isoformat(),
"our_calls": [str(call) for call in self.our_calls],
"neighbors": [n.to_safe_dict() for n in self.neighbors.values()],
"destinations": [d.to_safe_dict() for d in self.destinations.values()]
}
json_dump(filename, d)
@classmethod
def load(cls, filename: str, node_alias: str):
if not os.path.exists(filename):
return NetRomRoutingTable(node_alias=node_alias, updated_at=datetime.datetime.now())
d = json_load(filename)
return NetRomRoutingTable(node_alias=d["node_alias"],
updated_at=datetime.datetime.fromisoformat(d["updated_at"]),
our_calls={AX25Call.parse(call) for call in d["our_calls"]},
neighbors={n_dict["call"]: Neighbor.from_safe_dict(n_dict) for n_dict in d["neighbors"]},
destinations={d_dict["call"]: Destination.from_safe_dict(d_dict) for d_dict in d["destinations"]})
def route(self, packet: NetRomPacket) -> List[AX25Call]:
"""
If a packet's destination is a known neighbor, route to it. Otherwise look up the route with the highest
quality and send the packet to the neighbor which provided that route
:param packet:
:return: list of neighbor callsign's in sorted order of route quality
"""
if packet.dest in self.neighbors:
return [packet.dest]
else:
dest = self.destinations.get(str(packet.dest))
if dest:
return [n.neighbor for n in dest.sorted_neighbors()]
else:
return []
def route1(self, destination: L3Address) -> Optional[int]:
if not isinstance(destination, l3.NetRomAddress):
print(f"Wrong address family, expected NET/ROM got {destination.__class__}")
return None
netrom_dest = cast(l3.NetRomAddress, destination)
packet_dest = AX25Call(netrom_dest.callsign, netrom_dest.ssid)
# TODO handle alias here
if packet_dest in self.neighbors:
return self.neighbors.get(str(packet_dest)).port
else:
dest = self.destinations.get(str(packet_dest))
if dest:
neighbors = dest.sorted_neighbors()
if len(neighbors) > 0:
return self.neighbors.get(str(neighbors[0].neighbor)).port
else:
return None
else:
return None
def listen_for_address(self, app_call: AX25Call, app_alias: str):
app_routes = {}
for our_call in self.our_calls:
app_routes[str(our_call)] = Route(our_call, app_call, our_call, 95, 100)
self.destinations[str(app_call)] = Destination(app_call, app_alias, app_routes, True)
def refresh_route(self, heard_from: str, node: str):
"""
Refresh the obsolescence for a route
"""
if node in self.destinations:
route = self.destinations[node].neighbor_map.get(heard_from)
if route is not None:
route.obsolescence = self.default_obs
else:
print(f"Cannot refresh route to {node} via {heard_from}. {heard_from} is not in our neighbor map.")
else:
print(f"Cannot refresh route to {node}. It is not in our destination map.")
def update_routes(self, heard_from: AX25Call, heard_on_port: int, nodes: NetRomNodes):
"""
Update the routing table with a NODES broadcast.
This method is not thread-safe.
"""
# Get or create the neighbor and destination
neighbor = self.neighbors.get(str(heard_from), Neighbor(heard_from, heard_on_port, self.default_quality))
self.neighbors[str(heard_from)] = neighbor
# Add direct route to whoever sent the NODES
dest = self.destinations.get(str(heard_from), Destination(heard_from, nodes.sending_alias))
dest.neighbor_map[str(heard_from)] = Route(heard_from, heard_from, heard_from,
self.default_quality, self.default_obs)
self.destinations[str(heard_from)] = dest
for destination in nodes.destinations:
# Filter out ourselves
route_quality = 0
if destination.best_neighbor in self.our_calls:
# Best neighbor is us, this is a "trivial loop", quality is zero
continue
else:
# Otherwise compute this route's quality based on the NET/ROM spec
route_quality = (destination.quality * neighbor.quality + 128.) / 256.
# Only add routes which are above the minimum quality to begin with TODO check this logic
if route_quality > self.min_quality:
new_dest = self.destinations.get(str(destination.dest_node),
Destination(destination.dest_node, destination.dest_alias))
new_route = new_dest.neighbor_map.get(
str(neighbor.call), Route(neighbor.call, destination.dest_node, destination.best_neighbor,
int(route_quality), self.default_obs))
new_route.quality = route_quality
new_route.obsolescence = self.default_obs
new_dest.neighbor_map[str(neighbor.call)] = new_route
self.destinations[str(destination.dest_node)] = new_dest
else:
# print(f"Saw new route for {destination}, but quality was too low")
pass
self.updated_at = datetime.datetime.now()
def prune_routes(self) -> None:
"""
Prune any routes which we haven't heard about in a while.
This method is not thread-safe.
"""
# print("Pruning routes")
for call, destination in list(self.destinations.items()):
if destination.freeze:
# Don't prune frozen routes
continue
for neighbor, route in list(destination.neighbor_map.items()):
route.obsolescence -= 1
if route.obsolescence <= 0:
# print(f"Removing {neighbor} from {destination} neighbor's list")
del destination.neighbor_map[neighbor]
if len(destination.neighbor_map.keys()) == 0:
# print(f"No more routes to {call}, removing from routing table")
del self.destinations[call]
if call in self.neighbors.keys():
del self.neighbors[call]
self.updated_at = datetime.datetime.now()
def clear_routes(self) -> None:
self.destinations.clear()
self.neighbors.clear()
self.updated_at = datetime.datetime.now()
def get_nodes(self) -> NetRomNodes:
node_destinations = []
for destination in self.destinations.values():
# Otherwise find best neighbor route
best_neighbor = None
for neighbor in destination.sorted_neighbors():
if neighbor.obsolescence >= self.min_obs:
best_neighbor = neighbor
break
else:
# print(f"Not including {neighbor} in NODES, obsolescence below threshold")
pass
if best_neighbor:
node_destinations.append(NodeDestination(destination.node_call, destination.node_alias,
best_neighbor.next_hop, best_neighbor.quality))
else:
# print(f"No good neighbor was found for {destination}")
pass
return NetRomNodes(self.node_alias, node_destinations)
| 40.197232 | 132 | 0.608419 | 1,379 | 11,617 | 4.971719 | 0.165337 | 0.019691 | 0.008751 | 0.021879 | 0.176634 | 0.129522 | 0.101663 | 0.063886 | 0.034714 | 0.013419 | 0 | 0.008692 | 0.296892 | 11,617 | 288 | 133 | 40.336806 | 0.830681 | 0.123698 | 0 | 0.232227 | 0 | 0.004739 | 0.055644 | 0.002294 | 0 | 0 | 0 | 0.010417 | 0 | 1 | 0.104265 | false | 0.014218 | 0.047393 | 0.047393 | 0.379147 | 0.014218 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
924ad870dca0b68ac369ce463d3a75f82ddca91b | 1,119 | py | Python | blacktape/pipeline.py | carascap/blacktape | 52e0b912f4c67899911d10d2d6e3770671db02fa | [
"MIT"
] | null | null | null | blacktape/pipeline.py | carascap/blacktape | 52e0b912f4c67899911d10d2d6e3770671db02fa | [
"MIT"
] | 1 | 2022-02-22T19:45:27.000Z | 2022-02-22T19:45:27.000Z | blacktape/pipeline.py | carascap/blacktape | 52e0b912f4c67899911d10d2d6e3770671db02fa | [
"MIT"
] | null | null | null | from concurrent.futures import ProcessPoolExecutor, as_completed
from typing import Iterable, Optional
from blacktape.lib import match_entities_in_text, match_pattern_in_text
from blacktape.util import worker_init
class Pipeline:
"""
Wrapper around ProcessPoolExecutor
"""
def __init__(self, spacy_model: Optional[str] = None):
self.spacy_model = spacy_model
self.executor = ProcessPoolExecutor(initializer=worker_init)
self.futures = []
def results(self):
for future in as_completed(self.futures):
yield future.result()
def __enter__(self):
return self
def __exit__(self, exc_type, exc_val, exc_tb):
self.executor.shutdown()
def submit_ner_job(self, text: str, entity_types: Optional[Iterable[str]] = None):
self.futures.append(
self.executor.submit(
match_entities_in_text, text, self.spacy_model, entity_types
)
)
def submit_regex_job(self, text: str, pattern: str):
self.futures.append(self.executor.submit(match_pattern_in_text, text, pattern))
# === graph.py (tannerb/genetic_math) ===
# graph
from datetime import date
import numpy as np
from bokeh.client import push_session
from bokeh.io import output_server, show, vform
from bokeh.palettes import RdYlBu3
from bokeh.plotting import figure, curdoc, vplot
from bokeh.models import ColumnDataSource
from bokeh.models.widgets import DataTable, DateFormatter, TableColumn
from random import randint
# create a plot and style its properties
p = figure(x_range=(0, 100), y_range=(0, 100))
p.border_fill_color = 'black'
p.background_fill_color = 'black'
p.outline_line_color = None
p.grid.grid_line_color = None
# add a text renderer to out plot (no data yet)
r = p.text(x=[], y=[], text=[], text_color=[], text_font_size="20pt",
text_baseline="middle", text_align="center")
session = push_session(curdoc())
data = dict(
dates=[date(2014, 3, i+1) for i in range(10)],
downloads=[randint(0, 100) for i in range(10)],
)
source = ColumnDataSource(data)
columns = [
TableColumn(field="dates", title="Date", formatter=DateFormatter()),
TableColumn(field="downloads", title="Downloads"),
]
data_table = DataTable(source=source, columns=columns, width=400, height=280)
curdoc().add_root(vform(data_table))
session.show()
| 28.23913 | 78 | 0.69746 | 181 | 1,299 | 4.883978 | 0.497238 | 0.061086 | 0.033937 | 0.033937 | 0.029412 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029273 | 0.184758 | 1,299 | 45 | 79 | 28.866667 | 0.805477 | 0.069284 | 0 | 0 | 0 | 0 | 0.045729 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.310345 | 0 | 0.310345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
9259b7ed50eae624a0ff1eb302547c63e7d7b373 | 1,113 | py | Python | tests/test_default_currencies_provider.py | pedroburon/python-monon | 670a5c5e98171838d43fd1f72d0afbf7a70b937b | [
"MIT"
] | 1 | 2017-07-10T15:32:48.000Z | 2017-07-10T15:32:48.000Z | tests/test_default_currencies_provider.py | pedroburon/python-monon | 670a5c5e98171838d43fd1f72d0afbf7a70b937b | [
"MIT"
] | 5 | 2017-07-10T16:34:28.000Z | 2017-07-10T16:41:38.000Z | tests/test_default_currencies_provider.py | pedroburon/python-monon | 670a5c5e98171838d43fd1f72d0afbf7a70b937b | [
"MIT"
] | null | null | null |
from unittest import TestCase
from decimal import Decimal, ROUND_UP
from monon.currency import DefaultCurrenciesProvider
class DefaultCurrenciesProviderTestCase(TestCase):
def setUp(self):
self.provider = DefaultCurrenciesProvider()
self.isocode = 'USD'
def test_decimal_places(self):
self.assertEqual(2, self.provider.get_decimal_places(self.isocode))
def test_symbol(self):
self.assertEqual('$', self.provider.get_symbol(self.isocode))
def test_validate_currency(self):
self.assertIsNone(self.provider.validate_currency(self.isocode))
def test_format_positive_amount(self):
amount = Decimal('43321.123')
expected = '$43321.123'
self.assertEqual(expected, self.provider.format_amount(self.isocode, amount))
def test_format_negative_amount(self):
amount = Decimal('-1234.123')
expected = '-$1234.123'
self.assertEqual(expected, self.provider.format_amount(self.isocode, amount))
def test_rounding(self):
self.assertEqual(ROUND_UP, self.provider.get_rounding(self.isocode))
| 30.916667 | 89 | 0.716083 | 127 | 1,113 | 6.110236 | 0.275591 | 0.108247 | 0.073454 | 0.069588 | 0.190722 | 0.190722 | 0.190722 | 0.190722 | 0.190722 | 0.190722 | 0 | 0.033954 | 0.179695 | 1,113 | 35 | 90 | 31.8 | 0.815991 | 0 | 0 | 0.086957 | 0 | 0 | 0.03777 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 1 | 0.304348 | false | 0 | 0.130435 | 0 | 0.478261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
925d10b9338d7e7d834dd03a2a709b1fb7059381 | 5,913 | py | Python | combinatorial_gwas/phenotypes/__init__.py | hoangthienan95/combinatorial_GWAS | c6d51f6214f96773b2a271c706ed7026152e0fbb | [
"Apache-2.0"
] | null | null | null | combinatorial_gwas/phenotypes/__init__.py | hoangthienan95/combinatorial_GWAS | c6d51f6214f96773b2a271c706ed7026152e0fbb | [
"Apache-2.0"
] | null | null | null | combinatorial_gwas/phenotypes/__init__.py | hoangthienan95/combinatorial_GWAS | c6d51f6214f96773b2a271c706ed7026152e0fbb | [
"Apache-2.0"
] | null | null | null | # AUTOGENERATED! DO NOT EDIT! File to edit: notebooks/package/phenotypes.ipynb (unless otherwise specified).
__all__ = ['QueryDataframe', 'parameters', 'catalog_all', 'catalog_all', 'read_csv_compressed', 'get_GWAS_result_link',
'heritability_Neale', 'display_cols', 'quality_heritability_phenos', 'quality_heritability_phenos',
'icd10_pheno_matrix', 'icd10_primary_cols', 'icd10_pheno_matrix', 'upsample_pheno', 'get_phenotype',
'get_GWAS_snps_for_trait']
# Cell
from ..data_catalog import get_catalog, get_parameters
import combinatorial_gwas
from pathlib import Path
import pandas as pd
from dataclasses import dataclass
from functools import partial
import numpy as np
from typing import List, Union
from fastcore.utils import partialler
import logging
# Cell
@pd.api.extensions.register_dataframe_accessor("pheno")
@dataclass
class QueryDataframe():
df: pd.DataFrame
def query(self, **column_dict:dict):
query_str = " and ".join([f"({col} {cond})" for col, cond in column_dict.items()])
return self.df.query(query_str)
# Cell
parameters = get_parameters()
parameters
# Cell
catalog_all = get_catalog()
catalog_all = catalog_all.reload()
catalog_all.list()
# Cell
read_csv_compressed = partialler(pd.read_csv, sep="\t", compression="gzip")
get_GWAS_result_link = partialler(parameters['template_gwas_result_file_link'].format)
# Cell
heritability_Neale = catalog_all.load("heritability_trait_level_summary")
heritability_Neale.head()
# Cell
display_cols = ['description', 'h2_liability', 'h2_sig', 'confidence', 'n_cases', 'n_controls', 'prevalence']
# Cell
quality_heritability_phenos = heritability_Neale.pheno.query(h2_sig = "in ['z7', 'z4']", source= " == 'icd10'", confidence= "in ['medium', 'high']").sort_values("h2_liability", ascending = False)
quality_heritability_phenos = quality_heritability_phenos.set_index("phenotype")
quality_heritability_phenos.head()[display_cols]
# Cell
logging.warning("Loading ICD phenotype matrix, this might take a while")
icd10_pheno_matrix = catalog_all.load("ICD10_pheno_matrix")
#get the first 3 character of ICD code
icd10_primary_cols = icd10_pheno_matrix.columns[icd10_pheno_matrix.columns.str.contains("primary")]
icd10_pheno_matrix = icd10_pheno_matrix.astype(str).apply(lambda x: x.str.slice(0,3))
logging.warning("Finished loading ICD10 matrix")
# Cell
def upsample_pheno(pheno_df, balance_pheno, max_samples, random_state):
weights = pheno_df[balance_pheno].replace(pheno_df[balance_pheno].value_counts(normalize=True).to_dict())
pheno_df_upsampled = pheno_df.sample(max_samples, replace=True, weights = 1/weights, random_state=random_state)
return pheno_df_upsampled
def get_phenotype(icd10_codes: Union[str, List[str]] ="I84", samples:np.array=None, max_samples:int = None, balance_pheno: str = None, random_state=42):
"""
if samples argument is provided from genetic file, then find common set of samples and output ordered phenotype
if `max_samples` is provided, then over-sample the data so that we have `max_samples/2` for both cases and controls
"""
icd10_codes = [icd10_codes] if not isinstance(icd10_codes, list) else icd10_codes
pheno_df_list = [icd10_pheno_matrix[icd10_primary_cols].isin([icd10_code]).any(axis=1).astype(int) for icd10_code in icd10_codes]
pheno_df = pd.concat(pheno_df_list, axis=1)
pheno_df.columns = icd10_codes
if samples is not None:
geno_pheno_sample_index_mask = np.isin(samples.astype(int), pheno_df.index)
pheno_geno_samples_common_set = samples[geno_pheno_sample_index_mask].astype(int)
pheno_df_ordered = pheno_df.loc[list(pheno_geno_samples_common_set), :]
pheno_df_ordered = pheno_df_ordered.loc[~pheno_df_ordered.index.duplicated(keep="first"),:]
sample_index = np.argwhere(geno_pheno_sample_index_mask).reshape(-1)
if max_samples is not None:
if balance_pheno is None:
raise ValueError("Need to specify `balance_pheno` param: phenotype to balance during subsampling")
pheno_df_ordered = upsample_pheno(pheno_df_ordered, balance_pheno, max_samples, random_state)
sorter = np.argsort(samples.astype(int))
sample_index = sorter[np.searchsorted(samples.astype(int), pheno_df_ordered.index, sorter=sorter)]
assert np.allclose(samples[sample_index].astype(int), pheno_df_ordered.index), "sample mismatch between genotype and phenotype, something wrong with the `get_phenotype` function!"
pheno_df_ordered.index = pheno_df_ordered.index.astype(str)
return pheno_df_ordered
return pheno_df
# Cell
def get_GWAS_snps_for_trait(phenotype_code= "I84", chromosome:Union[int, List[int]] = 21, sort_val_cols_list: List[str] = ["pval"], ascending_bool_list: List[bool] = [True], id_only= True):
chromosome = [chromosome] if not isinstance(chromosome, list) else chromosome
chromosome_str = [f"{single_chromosome}:" for single_chromosome in chromosome]
gwas_result_df = read_csv_compressed(get_GWAS_result_link(phenotype_code=phenotype_code))
gwas_result_df = gwas_result_df.loc[gwas_result_df["variant"].str.startswith(tuple(chromosome_str)), :]
gwas_result_df = gwas_result_df.reset_index(drop=True).reset_index().rename(columns = {"index":"position_rank"})
gwas_result_df = gwas_result_df.sort_values(sort_val_cols_list, ascending = ascending_bool_list)
variant_id_df = gwas_result_df["variant"].str.split(":",expand=True)
variant_id_df["chr1_4"] = variant_id_df[[1, 2, 3]].apply("_".join, axis=1)
variant_id_df[1] = variant_id_df[1].astype(int)
gwas_result_df[["chr", "position", "major_allele"]] = variant_id_df[[0, 1, 2]]
gwas_result_df["full_id"] = variant_id_df[[0, "chr1_4"]].apply(":".join, axis=1)
if id_only:
return gwas_result_df["full_id"].values
else:
return gwas_result_df
| 46.559055 | 195 | 0.753425 | 839 | 5,913 | 4.984505 | 0.274136 | 0.040172 | 0.037303 | 0.022716 | 0.196078 | 0.113343 | 0.01626 | 0 | 0 | 0 | 0 | 0.016205 | 0.133773 | 5,913 | 126 | 196 | 46.928571 | 0.800273 | 0.072383 | 0 | 0 | 1 | 0 | 0.16523 | 0.025491 | 0 | 0 | 0 | 0 | 0.0125 | 1 | 0.05 | false | 0 | 0.125 | 0 | 0.275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9273889db8121468c67067e3c3728797e69262dd | 1,251 | py | Python | server/py/dummy.py | sreyas/Attendance-management-system | eeb57bcc942f407151b71bfab528e817c6806c74 | [
"MIT"
] | null | null | null | server/py/dummy.py | sreyas/Attendance-management-system | eeb57bcc942f407151b71bfab528e817c6806c74 | [
"MIT"
] | null | null | null | server/py/dummy.py | sreyas/Attendance-management-system | eeb57bcc942f407151b71bfab528e817c6806c74 | [
"MIT"
] | null | null | null | import numpy as np
import glob, os
import datetime
from pathlib import Path
from pymongo import MongoClient
from flask_mongoengine import MongoEngine
from bson.objectid import ObjectId
client = MongoClient(port=27017)
db = client.GetMeThrough
binary = 0
home = str(os.path.dirname(os.path.abspath(__file__))) + "/../../"
file_names = glob.glob(home + "/known_people/*.jp*g")
home = str(os.path.dirname(os.path.abspath(__file__))) + "/../../"
known_encodings_file_path = home + "/data/known_encodings_file.csv"
people_file_path = home + "/data/people_file.csv"
known_encodings_file = Path(known_encodings_file_path)
date_format = "%Y-%m-%d %H:%M:%S.%f"
current_date = str(datetime.datetime.now())
attendance_system = db.attendance.find({"user": ObjectId("5ab221040784981645037c3a")})
res = [col.encode('utf8') if isinstance(col, unicode) else col for col in attendance_system]
for attendance_doc in res:
date_time = attendance_doc['date_time']
time1 = datetime.datetime.strptime(date_time.encode('utf8'), date_format)
time2 = datetime.datetime.strptime(str(datetime.datetime.now()), date_format)
diff = time2 - time1
    minutes = diff.seconds / 60
    print minutes
    if minutes >= 30:
        print "hjhk"
| 37.909091 | 92 | 0.723421 | 172 | 1,251 | 5.05814 | 0.424419 | 0.027586 | 0.082759 | 0.075862 | 0.085057 | 0.085057 | 0.085057 | 0.085057 | 0.085057 | 0 | 0 | 0.033364 | 0.13749 | 1,251 | 32 | 93 | 39.09375 | 0.772938 | 0 | 0 | 0.068966 | 0 | 0 | 0.123102 | 0.059952 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.241379 | null | null | 0.068966 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9280f582ef5d54e9016b69411e4065df25b247b7 | 2,650 | py | Python | indicators/data.py | WPRDC/community-simulacrum | c463de3dac1749c82537c6c6ac74c50eb5b9be75 | [
"MIT"
] | null | null | null | indicators/data.py | WPRDC/community-simulacrum | c463de3dac1749c82537c6c6ac74c50eb5b9be75 | [
"MIT"
] | 6 | 2020-12-18T17:21:35.000Z | 2021-03-03T21:08:44.000Z | indicators/data.py | WPRDC/community-simulacrum | c463de3dac1749c82537c6c6ac74c50eb5b9be75 | [
"MIT"
] | null | null | null | import dataclasses
import typing
from dataclasses import dataclass
from typing import List
from typing import Optional
from profiles.settings import DENOM_DKEY, VALUE_DKEY, GEOG_DKEY, TIME_DKEY
if typing.TYPE_CHECKING:
from indicators.models import CensusVariable, CKANVariable
@dataclass
class Datum:
variable: str
geog: str
time: str
value: Optional[float] = None
moe: Optional[float] = None
percent: Optional[float] = None
denom: Optional[float] = None
@staticmethod
def from_census_response_datum(variable: 'CensusVariable', census_datum) -> 'Datum':
return Datum(
variable=variable.slug,
geog=census_datum.get('geog'),
time=census_datum.get('time'),
value=census_datum.get('value'),
moe=census_datum.get('moe'),
denom=census_datum.get('denom'),
percent=census_datum.get('percent'), )
@staticmethod
    def from_census_response_data(variable: 'CensusVariable', census_data: List[dict]) -> List['Datum']:
return [Datum.from_census_response_datum(variable, census_datum) for census_datum in census_data]
@staticmethod
def from_ckan_response_datum(variable: 'CKANVariable', ckan_datum) -> 'Datum':
denom, percent = None, None
if DENOM_DKEY in ckan_datum:
denom = ckan_datum[DENOM_DKEY]
percent = (ckan_datum[VALUE_DKEY] / ckan_datum[DENOM_DKEY])
return Datum(variable=variable.slug,
geog=ckan_datum[GEOG_DKEY],
time=ckan_datum[TIME_DKEY],
value=ckan_datum[VALUE_DKEY],
denom=denom,
percent=percent)
@staticmethod
    def from_ckan_response_data(variable: 'CKANVariable', ckan_data: List[dict]) -> List['Datum']:
return [Datum.from_ckan_response_datum(variable, ckan_datum) for ckan_datum in ckan_data]
def update(self, **kwargs):
""" Creates new Datum similar to the instance with new values from kwargs """
return Datum(**{**self.as_dict(), **kwargs})
def with_denom_val(self, denom_val: Optional[float]):
""" Merge the denom value and generate the percent """
return dataclasses.replace(self, denom=denom_val, percent=(self.value / denom_val))
def as_dict(self):
return {'variable': self.variable, 'geog': self.geog, 'time': self.time,
'value': self.value, 'moe': self.moe, 'percent': self.percent, 'denom': self.denom}
def as_value_dict(self):
return {'value': self.value, 'moe': self.moe, 'percent': self.percent, 'denom': self.denom}
| 37.857143 | 105 | 0.655472 | 321 | 2,650 | 5.218069 | 0.180685 | 0.053731 | 0.050149 | 0.029851 | 0.274627 | 0.151642 | 0.109851 | 0.109851 | 0.066866 | 0.066866 | 0 | 0 | 0.233585 | 2,650 | 69 | 106 | 38.405797 | 0.824717 | 0.044151 | 0 | 0.074074 | 0 | 0 | 0.061929 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148148 | false | 0 | 0.12963 | 0.092593 | 0.574074 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
9289e747472df431fa07919c97c9bde9926f3780 | 636 | py | Python | 1-Algorithmic-Toolbox/week3/assignments/car_fueling.py | Helianus/Data-Structures-and-Algorithms-Coursera | 1fe8961cdd77fdeb88490f127e4369680419c428 | [
"MIT"
] | null | null | null | 1-Algorithmic-Toolbox/week3/assignments/car_fueling.py | Helianus/Data-Structures-and-Algorithms-Coursera | 1fe8961cdd77fdeb88490f127e4369680419c428 | [
"MIT"
] | null | null | null | 1-Algorithmic-Toolbox/week3/assignments/car_fueling.py | Helianus/Data-Structures-and-Algorithms-Coursera | 1fe8961cdd77fdeb88490f127e4369680419c428 | [
"MIT"
] | null | null | null | # python3
import sys
def compute_min_refills(distance, tank, stops):
    # greedy strategy: drive as far as the tank allows, refuel only when the next stop is out of range
if distance <= tank:
return 0
else:
stops.append(distance)
n_stops = len(stops) - 1
count = 0
refill = tank
for i in range(n_stops):
if refill < stops[i] or (stops[-2] + tank < distance):
return -1
if refill < stops[i + 1]:
refill = tank + stops[i]
count += 1
return count
if __name__ == '__main__':
d, m, _, *stops = map(int, sys.stdin.read().split())
print(compute_min_refills(d, m, stops))
| 22.714286 | 66 | 0.52673 | 81 | 636 | 3.950617 | 0.493827 | 0.05625 | 0.10625 | 0.0875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019656 | 0.360063 | 636 | 27 | 67 | 23.555556 | 0.766585 | 0.044025 | 0 | 0 | 0 | 0 | 0.013223 | 0 | 0 | 0 | 0 | 0.037037 | 0 | 1 | 0.052632 | false | 0 | 0.052632 | 0 | 0.263158 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
928bbe7491fdbc0a0f928ea52f2ec3bc8d5bb842 | 323 | py | Python | python/python.py | TimVan1596/ACM-ICPC | 07f7d728db1ecd09c5a3d0f05521930b14eb9883 | [
"Apache-2.0"
] | 1 | 2019-05-22T07:12:34.000Z | 2019-05-22T07:12:34.000Z | python/python.py | TimVan1596/ACM-ICPC | 07f7d728db1ecd09c5a3d0f05521930b14eb9883 | [
"Apache-2.0"
] | 3 | 2021-12-10T01:13:54.000Z | 2021-12-14T21:18:42.000Z | python/python.py | TimVan1596/ACM-ICPC | 07f7d728db1ecd09c5a3d0f05521930b14eb9883 | [
"Apache-2.0"
] | null | null | null | import xlwt
if __name__ == '__main__':
    workbook = xlwt.Workbook(encoding='utf-8')  # create a Workbook object
    worksheet = workbook.add_sheet('sheet1')  # create a worksheet named 'sheet1'
    # write a cell: the arguments are row, column, and content
worksheet.write(0, 0, 'hello world')
worksheet.write(0, 1, '你好')
    workbook.save('first.xls')  # save the workbook as first.xls
| 32.3 | 63 | 0.674923 | 41 | 323 | 5.097561 | 0.756098 | 0.133971 | 0.143541 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022642 | 0.179567 | 323 | 9 | 64 | 35.888889 | 0.766038 | 0.219814 | 0 | 0 | 0 | 0 | 0.165992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
928bed2cda12bb44aa28928e0571f9042f5e1006 | 736 | py | Python | kiestze_django/kiestze/migrations/0004_auto_20180719_1438.py | oSoc18/kiest_ze | 842eefcf3f6002ea90c2917682f625749398b33e | [
"Apache-2.0"
] | 3 | 2018-07-11T07:59:15.000Z | 2018-07-26T19:58:44.000Z | kiestze_django/kiestze/migrations/0004_auto_20180719_1438.py | oSoc18/kiest_ze | 842eefcf3f6002ea90c2917682f625749398b33e | [
"Apache-2.0"
] | 4 | 2018-08-11T13:55:50.000Z | 2019-12-28T15:41:30.000Z | kiestze_django/kiestze/migrations/0004_auto_20180719_1438.py | oSoc18/kiest_ze | 842eefcf3f6002ea90c2917682f625749398b33e | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.7 on 2018-07-19 14:38
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('kiestze', '0003_gemeente'),
]
operations = [
migrations.RemoveField(
model_name='gemeente',
name='id',
),
migrations.AlterField(
model_name='gemeente',
name='nis',
field=models.IntegerField(primary_key=True, serialize=False),
),
migrations.AlterField(
model_name='partij',
name='nis',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='kiestze.Gemeente'),
),
]
| 25.37931 | 104 | 0.586957 | 74 | 736 | 5.756757 | 0.581081 | 0.056338 | 0.065728 | 0.103286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036538 | 0.293478 | 736 | 28 | 105 | 26.285714 | 0.782692 | 0.061141 | 0 | 0.409091 | 1 | 0 | 0.095791 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92939cd40ec3f894a9ca9fdc8baa64317efc2816 | 768 | py | Python | Deck.py | Harel92/BlackJack-Game | adba557a025d9633a2cdaca4f59bb7aa4c566f6c | [
"MIT"
] | null | null | null | Deck.py | Harel92/BlackJack-Game | adba557a025d9633a2cdaca4f59bb7aa4c566f6c | [
"MIT"
] | null | null | null | Deck.py | Harel92/BlackJack-Game | adba557a025d9633a2cdaca4f59bb7aa4c566f6c | [
"MIT"
] | null | null | null | import random
from Card import Card
suits = ('Hearts', 'Diamonds', 'Spades', 'Clubs')
ranks = ('Two', 'Three', 'Four', 'Five', 'Six', 'Seven', 'Eight', 'Nine', 'Ten', 'Jack', 'Queen', 'King', 'Ace')
class Deck:
def __init__(self):
# Note this only happens once upon creation of a new Deck
self.all_cards = []
for suit in suits:
for rank in ranks:
# This assumes the Card class has already been defined!
self.all_cards.append(Card(suit, rank))
def shuffle(self):
# Note this doesn't return anything
random.shuffle(self.all_cards)
def deal(self):
# Note we remove one card from the list of all_cards
return self.all_cards.pop()
| 30.72 | 113 | 0.578125 | 101 | 768 | 4.306931 | 0.613861 | 0.091954 | 0.110345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.302083 | 768 | 24 | 114 | 32 | 0.811567 | 0.252604 | 0 | 0 | 0 | 0 | 0.141284 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.142857 | 0.071429 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9294963c5c0383860551b27b39a0b277aa9e0e8f | 818 | py | Python | setup.py | lanius/chord | 48e129367080c95116600a80d56b310d06322b21 | [
"MIT"
] | null | null | null | setup.py | lanius/chord | 48e129367080c95116600a80d56b310d06322b21 | [
"MIT"
] | null | null | null | setup.py | lanius/chord | 48e129367080c95116600a80d56b310d06322b21 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from setuptools import setup
setup(
name='chord',
version='0.0.1',
url='https://github.com/lanius/chord/',
packages=['chord'],
license='MIT',
author='lanius',
author_email='lanius@nirvake.org',
description='Captures current status of keyboard.',
install_requires=['pyHook', 'pywin32'],
classifiers=[
'Development Status :: 3 - Alpha',
'Environment :: Win32 (MS Windows)',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Operating System :: Microsoft :: Windows',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Topic :: Software Development :: Libraries :: Python Modules'
],
)
| 29.214286 | 70 | 0.5978 | 80 | 818 | 6.0875 | 0.7375 | 0.117043 | 0.154004 | 0.106776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019386 | 0.243276 | 818 | 27 | 71 | 30.296296 | 0.767367 | 0.025672 | 0 | 0 | 0 | 0 | 0.576101 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.043478 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92950fc24b8bc2b382fb2963eb44444a27e250ed | 4,669 | py | Python | modules/deepspell/baseline/symspell_gendawg.py | Klebert-Engineering/deep-spell-9 | cabfcbf8238085b139e6b4b0e43f459230ead963 | [
"MIT"
] | 3 | 2018-12-16T17:20:39.000Z | 2019-01-11T19:45:29.000Z | modules/deepspell/baseline/symspell_gendawg.py | Klebert-Engineering/deep-spell-9 | cabfcbf8238085b139e6b4b0e43f459230ead963 | [
"MIT"
] | 2 | 2021-11-30T16:35:36.000Z | 2022-01-20T12:33:10.000Z | modules/deepspell/baseline/symspell_gendawg.py | Klebert-Engineering/deep-spell-9 | cabfcbf8238085b139e6b4b0e43f459230ead963 | [
"MIT"
] | null | null | null | # (C) 2018-present Klebert Engineering
"""
Opens a TSV FTS corpus file and generates deletion-based lookup entries
for each FTS token, up to a given maximum edit distance.
Takes two arguments:
(1) The corpus file
(2) The output file. Two output files will be generated from this argument:
-> <output file>.refs : Contains pickled DAWG with misspelled tokens and correct token references
-> <output file>.tokens : Contains the correctly spelled tokens, where line-number=reference-index,
in the form of <token> <frequency>
"""
import codecs
import sys
import string
from hat_trie import Trie
from dawg import BytesDAWG
def generate_lookup_entries(w, max_edit_distance=0):
"""given a word, derive strings with up to max_edit_distance characters
deleted"""
result = {w}
queue = {w}
for d in range(max_edit_distance):
temp_queue = set()
for word in queue:
if len(word) > 1:
for c in range(len(word)): # character index
word_minus_c = word[:c] + word[c + 1:]
if word_minus_c not in result:
result.add(word_minus_c)
if word_minus_c not in temp_queue:
temp_queue.add(word_minus_c)
queue = temp_queue
return result
def print_progress(iteration, total, prefix='', suffix='', decimals=1, bar_length=10):
"""
Call in a loop to create terminal progress bar
@params:
iteration - Required : current iteration (Int)
total - Required : total iterations (Int)
prefix - Optional : prefix string (Str)
suffix - Optional : suffix string (Str)
decimals - Optional : positive number of decimals in percent complete (Int)
bar_length - Optional : character length of bar (Int)
"""
str_format = "{0:." + str(decimals) + "f}"
percents = str_format.format(100 * (iteration / float(total)))
filled_length = int(round(bar_length * iteration / float(total)))
bar = '#' * filled_length + '-' * (bar_length - filled_length)
    sys.stdout.write('\r%s |%s| %s%s %s' % (prefix, bar, percents, '%', suffix))
if iteration == total:
sys.stdout.write('\n')
sys.stdout.flush()
input_file_path = sys.argv[1]
output_file_path = sys.argv[2]
bytes_for_token = Trie() # charset
token_and_freq_for_index = []
longest_word_length = 0
bytes_per_index = 3
# Count total lines in corpus file
with codecs.open(input_file_path, encoding="utf-8") as corpus_file:
total = sum(1 for _ in corpus_file)
done = 0
print("Loading completion tokens from '{}'...".format(input_file_path))
with codecs.open(input_file_path, encoding="utf-8") as input_file:
index_for_token = Trie() # charset
for line in input_file:
parts = line.split("\t")
done += 1
print_progress(done, total)
if len(parts) < 6:
continue
token = parts[2].lower() # unidecode.unidecode()
# check if word is already in dictionary
# dictionary entries are in the form: (list of suggested corrections, frequency of word in corpus)
if token in index_for_token:
token_index = index_for_token[token]
else:
token_index = len(token_and_freq_for_index)
index_for_token[token] = token_index
longest_word_length = max(len(token), longest_word_length)
token_and_freq_for_index.append([token, 0])
# first appearance of word in corpus
# n.b. word may already be in dictionary as a derived word, but
# counter of frequency of word in corpus is not incremented in those cases.
deletes = generate_lookup_entries(token)
word_index_bytes = token_index.to_bytes(bytes_per_index, 'big')
for entry in deletes:
if entry in bytes_for_token:
bytes_for_token[entry] += word_index_bytes
else:
bytes_for_token[entry] = word_index_bytes
# increment count of token in corpus
token_and_freq_for_index[token_index][1] += 1
print("\n ...done.")
print("Creating DAWG...")
dawg_dict = BytesDAWG(([token, bytes_for_token[token]] for token in bytes_for_token.iterkeys()))
print(" ...done.")
print("Writing output files {}.refs and {}.tokens ...".format(output_file_path, output_file_path))
dawg_dict.save(output_file_path + ".refs")
with codecs.open(output_file_path + ".tokens", "w", encoding="utf-8") as output_tokens:
for token, freq in token_and_freq_for_index:
output_tokens.write("{}\t{}\n".format(token, freq))
print(" ...done.")
| 39.567797 | 106 | 0.644892 | 636 | 4,669 | 4.537736 | 0.284591 | 0.033264 | 0.027027 | 0.025988 | 0.127166 | 0.06237 | 0.050589 | 0.028413 | 0.028413 | 0.028413 | 0 | 0.008895 | 0.253587 | 4,669 | 117 | 107 | 39.905983 | 0.819225 | 0.30649 | 0 | 0.054795 | 1 | 0 | 0.063547 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027397 | false | 0 | 0.068493 | 0 | 0.109589 | 0.109589 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9296c38d6179d914971335cddde811e8c8cfc78b | 347 | py | Python | cfg/launcher/__main__.py | rr-/dotfiles | 4a684c43a5714a3312b42b445e5ba9ae1fab0d1a | [
"MIT"
] | 16 | 2015-06-05T12:57:44.000Z | 2021-08-05T23:49:42.000Z | cfg/launcher/__main__.py | rr-/dotfiles | 4a684c43a5714a3312b42b445e5ba9ae1fab0d1a | [
"MIT"
] | 6 | 2015-11-01T18:18:26.000Z | 2020-10-06T09:17:29.000Z | cfg/launcher/__main__.py | rr-/dotfiles | 4a684c43a5714a3312b42b445e5ba9ae1fab0d1a | [
"MIT"
] | 6 | 2015-10-31T18:53:12.000Z | 2020-11-30T18:03:06.000Z | import os
from libdotfiles.util import (
HOME_DIR,
PKG_DIR,
REPO_ROOT_DIR,
create_symlink,
run,
)
create_symlink(
PKG_DIR / "launcher.json", HOME_DIR / ".config" / "launcher.json"
)
os.chdir(REPO_ROOT_DIR / "opt" / "launcher")
run(
["python3", "-m", "pip", "install", "--user", "--upgrade", "."],
check=False,
)
| 17.35 | 69 | 0.605187 | 43 | 347 | 4.651163 | 0.604651 | 0.07 | 0.11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003636 | 0.207493 | 347 | 19 | 70 | 18.263158 | 0.723636 | 0 | 0 | 0 | 0 | 0 | 0.227666 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
9296fb6b24dc3eedf52b5f0b91836cb7ef50d404 | 2,456 | py | Python | problems/737.Sentence-Similarity-II/li.py | subramp-prep/leetcode | d125201d9021ab9b1eea5e5393c2db4edd84e740 | [
"Unlicense"
] | null | null | null | problems/737.Sentence-Similarity-II/li.py | subramp-prep/leetcode | d125201d9021ab9b1eea5e5393c2db4edd84e740 | [
"Unlicense"
] | null | null | null | problems/737.Sentence-Similarity-II/li.py | subramp-prep/leetcode | d125201d9021ab9b1eea5e5393c2db4edd84e740 | [
"Unlicense"
] | null | null | null | # coding=utf-8
# Author: Jianghan LI
# Question: 737.Sentence-Similarity-II
# Complexity: O(N)
# Date: 2018-05 14:50 - 14:56, 1 wrong try
class Solution(object):
def areSentencesSimilarTwo(self, words1, words2, pairs):
"""
:type words1: List[str]
:type words2: List[str]
:type pairs: List[List[str]]
:rtype: bool
"""
parents = {}
self.count = 0
def add(x):
if x not in parents:
parents[x] = x
self.count += 1
def find(x):
if x not in parents:
return x
if x != parents[x]:
parents[x] = find(parents[x])
return parents[x]
def union(x, y):
x, y = find(x), find(y)
if x != y:
parents[x] = y
self.count -= 1
return True
return False
for x, y in pairs:
add(x)
add(y)
union(x, y)
if len(words1) != len(words2):
return False
for x, y in zip(words1, words2):
# print find(x), find(y)
if find(x) != find(y):
return False
return True
############ test case ###########
s = Solution()
words1 = ["great", "acting", "skills"]
words2 = ["fine", "drama", "talent"]
pairs = [["great", "good"], ["fine", "good"], ["acting", "drama"], ["skills", "talent"]]
print(s.areSentencesSimilarTwo(words1, words2, pairs))
words1 = ["an", "extraordinary", "meal", "meal"]
words2 = ["one", "good", "dinner"]
pairs = [["great", "good"], ["extraordinary", "good"], ["well", "good"], ["wonderful", "good"], ["excellent", "good"], ["fine", "good"], ["nice", "good"], ["any", "one"], ["some", "one"], ["unique", "one"], ["the", "one"], ["an", "one"], ["single", "one"], ["a", "one"], ["truck", "car"], ["wagon", "car"], ["automobile", "car"], ["auto", "car"], ["vehicle", "car"], [
"entertain", "have"], ["drink", "have"], ["eat", "have"], ["take", "have"], ["fruits", "meal"], ["brunch", "meal"], ["breakfast", "meal"], ["food", "meal"], ["dinner", "meal"], ["super", "meal"], ["lunch", "meal"], ["possess", "own"], ["keep", "own"], ["have", "own"], ["extremely", "very"], ["actually", "very"], ["really", "very"], ["super", "very"]]
print(s.areSentencesSimilarTwo(words1, words2, pairs))
############ comments ############
# 1 wrong try for case len(words1) != len(words2)
| 38.375 | 368 | 0.482085 | 278 | 2,456 | 4.258993 | 0.377698 | 0.011824 | 0.043074 | 0.025338 | 0.153716 | 0.133446 | 0 | 0 | 0 | 0 | 0 | 0.023177 | 0.279723 | 2,456 | 63 | 369 | 38.984127 | 0.646128 | 0.089169 | 0 | 0.219512 | 0 | 0 | 0.220049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
929e35be958ee3633c514ae680ee6e40e159bff6 | 1,611 | py | Python | migrations/versions/f6ee6f9df554_.py | d-demirci/blockpy-server | f596f539d7809054a430a898bee877a0dbcb15b5 | [
"MIT"
] | 18 | 2019-10-14T13:56:15.000Z | 2022-03-12T23:49:14.000Z | migrations/versions/f6ee6f9df554_.py | acbart/blockpy-server | c504be4dffc624b7161c2c976d2d195f5f08cf9a | [
"MIT"
] | 26 | 2019-08-13T18:17:45.000Z | 2021-09-06T12:31:48.000Z | migrations/versions/f6ee6f9df554_.py | DigitalDerrickcs/blockpy-server | 4d3da5830ec802ec9d018e2fde91f33e2be38f10 | [
"MIT"
] | 9 | 2019-08-27T10:52:31.000Z | 2021-07-27T16:10:17.000Z | """Add Review Table
Revision ID: f6ee6f9df554
Revises:
Create Date: 2019-08-07 13:09:49.691184
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'f6ee6f9df554'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('review',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('date_created', sa.DateTime(), nullable=True),
sa.Column('date_modified', sa.DateTime(), nullable=True),
sa.Column('comment', sa.Text(), nullable=True),
sa.Column('tag_id', sa.Integer(), nullable=True),
sa.Column('score', sa.Integer(), nullable=True),
sa.Column('submission_id', sa.Integer(), nullable=True),
sa.Column('author_id', sa.Integer(), nullable=True),
sa.Column('assignment_version', sa.Integer(), nullable=True),
sa.Column('submission_version', sa.Integer(), nullable=True),
sa.Column('version', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(('author_id',), ['user.id'], ),
sa.ForeignKeyConstraint(('submission_id',), ['submission.id'], ),
sa.ForeignKeyConstraint(('tag_id',), ['assignment_tag.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_foreign_key(None, 'submission', 'assignment_group', ['assignment_group_id'], ['id'])
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint(None, 'submission', type_='foreignkey')
op.drop_table('review')
# ### end Alembic commands ###
| 34.276596 | 98 | 0.676598 | 197 | 1,611 | 5.416244 | 0.340102 | 0.082474 | 0.131209 | 0.168697 | 0.367385 | 0.367385 | 0.283037 | 0.082474 | 0 | 0 | 0 | 0.023256 | 0.145872 | 1,611 | 46 | 99 | 35.021739 | 0.75218 | 0.177529 | 0 | 0 | 0 | 0 | 0.208075 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.071429 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92adae391b5af1e46c2cbd108fadd2ff78e87772 | 384 | py | Python | app/app/services/forms.py | dtcooper/crazyarms | 71ea0e58958233daaceb8750043f74ef1a141079 | [
"MIT"
] | 15 | 2021-01-18T17:16:51.000Z | 2022-03-28T22:16:19.000Z | app/app/services/forms.py | dtcooper/carb | 71ea0e58958233daaceb8750043f74ef1a141079 | [
"MIT"
] | 4 | 2021-03-14T16:28:40.000Z | 2021-03-31T16:48:49.000Z | app/app/services/forms.py | dtcooper/carb | 71ea0e58958233daaceb8750043f74ef1a141079 | [
"MIT"
] | 3 | 2021-07-15T02:24:19.000Z | 2022-03-18T11:50:05.000Z | from django import forms
from .services import HarborService
class HarborCustomConfigForm(forms.Form):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for section_number in range(1, HarborService.CUSTOM_CONFIG_NUM_SECTIONS + 1):
self.fields[f"section{section_number}"] = forms.CharField(widget=forms.Textarea, required=False)
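The loop above registers `CUSTOM_CONFIG_NUM_SECTIONS` textarea fields named `section1`, `section2`, … A minimal Django-free sketch of the same naming pattern (the section count is an assumed stand-in, not the real `HarborService` constant):

```python
NUM_SECTIONS = 3  # stand-in for HarborService.CUSTOM_CONFIG_NUM_SECTIONS

fields = {}
for section_number in range(1, NUM_SECTIONS + 1):
    # Mirrors the form's f-string key construction
    fields[f"section{section_number}"] = "CharField(widget=Textarea, required=False)"

print(sorted(fields))  # ['section1', 'section2', 'section3']
```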
| 34.909091 | 108 | 0.726563 | 46 | 384 | 5.782609 | 0.695652 | 0.075188 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006211 | 0.161458 | 384 | 10 | 109 | 38.4 | 0.819876 | 0 | 0 | 0 | 0 | 0 | 0.059896 | 0.059896 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
92adb85a4eb3efaf7dd6076d356a071ec17c7036 | 26,816 | py | Python | deepmars/models/train_model_sys.py | utplanets/deepmars | ba306aa9b25b654636b61cf952af2791b7ed0e56 | [
"MIT"
] | 2 | 2021-08-08T03:06:58.000Z | 2021-11-25T04:06:00.000Z | deepmars/models/train_model_sys.py | utplanets/deepmars | ba306aa9b25b654636b61cf952af2791b7ed0e56 | [
"MIT"
] | null | null | null | deepmars/models/train_model_sys.py | utplanets/deepmars | ba306aa9b25b654636b61cf952af2791b7ed0e56 | [
"MIT"
] | 2 | 2020-11-23T09:38:26.000Z | 2021-02-26T01:14:28.000Z | #!/usr/bin/env python
"""Convolutional Neural Network Training Functions
Functions for building and training a (UNET) Convolutional Neural Network on
images of Mars and binary ring targets.
"""
from __future__ import absolute_import, division, print_function
import numpy as np
import pandas as pd
import h5py
from keras.models import Model
from keras.layers.core import Dropout, Reshape
from keras.regularizers import l2
from keras.optimizers import Adam
from keras.callbacks import EarlyStopping
from keras import backend as K
import deepmars.features.template_match_target as tmt
import deepmars.utils.processing as proc
import click
import logging
from pathlib import Path
from dotenv import find_dotenv, load_dotenv
import os
from joblib import Parallel, delayed
from tqdm import tqdm, trange
# Check Keras version - code will switch API if needed.
from keras import __version__ as keras_version
K.set_image_dim_ordering('tf')
k2 = keras_version[0] == '2'
# If Keras is v2.x.x, create Keras 1-syntax wrappers.
if not k2:
from keras.models import load_model
from keras.layers import merge, Input
from keras.layers.convolutional import (Convolution2D, MaxPooling2D,
UpSampling2D)
else:
from keras.models import load_model
from keras.layers import Concatenate, Input
from keras.layers.convolutional import (Conv2D, MaxPooling2D,
UpSampling2D)
def merge(layers, mode=None, concat_axis=None):
"""Wrapper for Keras 2's Concatenate class (`mode` is discarded)."""
return Concatenate(axis=concat_axis)(list(layers))
def Convolution2D(n_filters, FL, FLredundant, activation=None,
init=None, W_regularizer=None, border_mode=None):
"""Wrapper for Keras 2's Conv2D class."""
return Conv2D(n_filters, FL, activation=activation,
kernel_initializer=init,
kernel_regularizer=W_regularizer,
padding=border_mode)
minrad_ = 5
maxrad_ = 40
longlat_thresh2_ = 1.8
rad_thresh_ = 1.0
template_thresh_ = 0.5
target_thresh_ = 0.1
@click.group()
def dl():
log_fmt = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
logging.basicConfig(level=logging.INFO, format=log_fmt)
# not used in this stub but often useful for finding various files
project_dir = Path(__file__).resolve().parents[2]
# find .env automagically by walking up directories until it's found, then
# load up the .env entries as environment variables
load_dotenv(find_dotenv())
    import sys
    sys.path.append(os.getenv("DM_ROOTDIR"))
########################
def get_param_i(param, i):
"""Gets correct parameter for iteration i.
Parameters
----------
param : list
List of model hyperparameters to be iterated over.
i : integer
Hyperparameter iteration.
Returns
-------
Correct hyperparameter for iteration i.
"""
if len(param) > i:
return param[i]
else:
return param[0]
########################
def custom_image_generator(data, target, batch_size=32):
"""Custom image generator that manipulates image/target pairs to prevent
overfitting in the Convolutional Neural Network.
Parameters
----------
data : array
Input images.
target : array
Target images.
batch_size : int, optional
Batch size for image manipulation.
Yields
------
Manipulated images and targets.
"""
D, L, W = data.shape[0], data[0].shape[0], data[0].shape[1]
while True:
shuffle_index = np.arange(D)
# only shuffle once each loop through the data
np.random.shuffle(shuffle_index)
for i in np.arange(0, len(data), batch_size):
index = shuffle_index[i:i + batch_size]
d, t = data[index].copy(), target[index].copy()
# Random color inversion
# for j in np.where(np.random.randint(0, 2, batch_size) == 1)[0]:
# d[j][d[j] > 0.] = 1. - d[j][d[j] > 0.]
# Horizontal/vertical flips
for j in np.where(np.random.randint(0, 2, batch_size) == 1)[0]:
d[j], t[j] = np.fliplr(d[j]), np.fliplr(t[j]) # left/right
for j in np.where(np.random.randint(0, 2, batch_size) == 1)[0]:
d[j], t[j] = np.flipud(d[j]), np.flipud(t[j]) # up/down
# Random up/down & left/right pixel shifts, 90 degree rotations
npix = 15
# Horizontal shift
h = np.random.randint(-npix, npix + 1, batch_size)
# Vertical shift
v = np.random.randint(-npix, npix + 1, batch_size)
# 90 degree rotations
r = np.random.randint(0, 4, batch_size)
for j in range(batch_size):
d[j] = np.pad(d[j], ((npix, npix), (npix, npix), (0, 0)),
mode='constant')[npix + h[j]:L + h[j] + npix,
npix + v[j]:W + v[j] + npix, :]
sh, sv = slice(npix + h[j], L + h[j] + npix),\
slice(npix + v[j], W + v[j] + npix)
t[j] = np.pad(t[j], (npix,), mode='constant')[sh, sv]
d[j], t[j] = np.rot90(d[j], r[j]), np.rot90(t[j], r[j])
yield (d, t)
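The flip/shift/rotate manipulations above can be sketched on a single dummy image pair (image size and shapes are assumed for illustration; the real generator applies the same indexing per batch):

```python
import numpy as np

npix = 15                      # maximum pixel shift, as in the generator
L = W = 32                     # dummy image size (assumed for illustration)
d = np.random.rand(L, W, 1)    # one input image with a channel axis
t = np.random.rand(L, W)       # matching 2-D target mask

# Horizontal flip applied identically to image and target
d, t = np.fliplr(d), np.fliplr(t)

# Random pixel shift via pad-then-crop, mirroring the generator's indexing
h, v = np.random.randint(-npix, npix + 1, 2)
d = np.pad(d, ((npix, npix), (npix, npix), (0, 0)),
           mode='constant')[npix + h:L + h + npix, npix + v:W + v + npix, :]
t = np.pad(t, (npix,), mode='constant')[npix + h:L + h + npix,
                                        npix + v:W + v + npix]

# 90-degree rotation with the same k for image and target
k = np.random.randint(0, 4)
d, t = np.rot90(d, k), np.rot90(t, k)

print(d.shape, t.shape)  # shapes are preserved: (32, 32, 1) (32, 32)
```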
def t2c(pred, csv, i,
minrad=minrad_,
maxrad=maxrad_,
longlat_thresh2=longlat_thresh2_,
rad_thresh=rad_thresh_,
template_thresh=template_thresh_,
target_thresh=target_thresh_):
return np.hstack([i,
tmt.template_match_t2c(pred, csv,
minrad=minrad,
                                             maxrad=maxrad,
longlat_thresh2=longlat_thresh2,
rad_thresh=rad_thresh,
template_thresh=template_thresh,
target_thresh=target_thresh)])
def diagnostic(res, beta):
"""Calculate the metrics from the predictions compared to the CSV.
Parameters
------------
res: list of results containing:
image number, number of matched, number of existing craters, number of
detected craters, maximum radius detected, mean error in longitude,
mean error in latitude, mean error in radius, fraction of duplicates
in detections.
beta : int
Beta value when calculating F-beta score.
Returns
-------
dictionary : metrics stored in a dictionary
"""
counter, N_match, N_csv, N_detect,\
mrad, err_lo, err_la, err_r, frac_duplicates = np.array(res).T
    w = np.where(N_match > 0)  # keep only images with at least one match
    counter, N_match, N_csv, N_detect,\
        mrad, err_lo, err_la, err_r, frac_dupes =\
        counter[w], N_match[w], N_csv[w], N_detect[w],\
        mrad[w], err_lo[w], err_la[w], err_r[w], frac_duplicates[w]
precision = N_match / (N_match + (N_detect - N_match))
recall = N_match / N_csv
fscore = (1 + beta**2) * (recall * precision) / \
(precision * beta**2 + recall)
diff = N_detect - N_match
frac_new = diff / (N_detect + diff)
frac_new2 = diff / (N_csv + diff)
frac_duplicates = frac_dupes
return dict(precision=precision,
recall=recall,
fscore=fscore,
frac_new=frac_new,
frac_new2=frac_new2,
err_lo=err_lo,
err_la=err_la,
err_r=err_r,
frac_duplicates=frac_duplicates,
maxrad=mrad,
counter=counter, N_match=N_match, N_csv=N_csv)
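The precision/recall/F-beta arithmetic in `diagnostic` can be sanity-checked with toy counts (illustrative values, not real detections):

```python
# Toy counts: 10 matched craters, 12 human-labelled craters, 15 detections.
N_match, N_csv, N_detect, beta = 10.0, 12.0, 15.0, 1

precision = N_match / (N_match + (N_detect - N_match))  # reduces to N_match / N_detect
recall = N_match / N_csv
fscore = (1 + beta**2) * (recall * precision) / (precision * beta**2 + recall)

print(round(precision, 4), round(recall, 4), round(fscore, 4))  # 0.6667 0.8333 0.7407
```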
def get_metrics(data, craters_images, dim, model, name, beta=1, offset=0,
minrad=minrad_, maxrad=maxrad_,
longlat_thresh2=longlat_thresh2_,
rad_thresh=rad_thresh_, template_thresh=template_thresh_,
target_thresh=target_thresh_, rmv_oor_csvs=0):
"""Function that prints pertinent metrics at the end of each epoch.
Parameters
----------
data : hdf5
Input images.
craters : hdf5
Pandas arrays of human-counted crater data.
dim : int
Dimension of input images (assumes square).
model : keras model object
Keras model
beta : int, optional
Beta value when calculating F-beta score. Defaults to 1.
"""
X, Y = data[0], data[1]
craters, images = craters_images
# Get csvs of human-counted craters
csvs = []
# minrad, maxrad = 3, 50
cutrad, n_csvs = 0.8, len(X)
diam = 'Diameter (pix)'
for i in range(len(X)):
imname = images[i] # name = "img_{0:05d}".format(i)
found = False
for crat in craters:
if imname in crat:
csv = crat[imname]
found = True
if not found:
csvs.append([-2])
continue
# remove small/large/half craters
csv = csv[(csv[diam] < 2 * maxrad) & (csv[diam] > 2 * minrad)]
csv = csv[(csv['x'] + cutrad * csv[diam] / 2 <= dim)]
csv = csv[(csv['y'] + cutrad * csv[diam] / 2 <= dim)]
csv = csv[(csv['x'] - cutrad * csv[diam] / 2 > 0)]
csv = csv[(csv['y'] - cutrad * csv[diam] / 2 > 0)]
if len(csv) < 3: # Exclude csvs with few craters
csvs.append([-1])
else:
csv_coords = np.asarray((csv['x'], csv['y'], csv[diam] / 2)).T
csvs.append(csv_coords)
# Calculate custom metrics
print("csvs: {}".format(len(csvs)))
print("")
print("*********Custom Loss*********")
recall, precision, fscore = [], [], []
frac_new, frac_new2, mrad = [], [], []
err_lo, err_la, err_r = [], [], []
frac_duplicates = []
if isinstance(model, Model):
preds = None
# print(X[6].min(),X[6].max(),X.dtype,np.percentile(X[6],99))
preds = model.predict(X, verbose=1)
# save
h5f = h5py.File("predictions.hdf5", 'w')
h5f.create_dataset(name, data=preds)
print("Successfully generated and saved model predictions.")
else:
preds = model
# print(csvs)
countme = [i for i in range(n_csvs) if len(csvs[i]) >= 3]
print("Processing {} fields".format(len(countme)))
# preds contains a large number of predictions,
# so we run the template code in parallel.
res = Parallel(n_jobs=24,
verbose=5)(delayed(t2c)(preds[i], csvs[i], i,
minrad=minrad,
maxrad=maxrad,
longlat_thresh2=longlat_thresh2,
rad_thresh=rad_thresh,
template_thresh=template_thresh,
target_thresh=target_thresh)
for i in range(n_csvs) if len(csvs[i]) >= 3)
if len(res) == 0:
print("No valid results: ", res)
return None
# At this point we've processed the predictions with the template matching
# algorithm, now calculate the metrics from the data.
diag = diagnostic(res, beta)
print(len(diag["recall"]))
# print("binary XE score = %f" % model.evaluate(X, Y))
if len(diag["recall"]) > 3:
metric_data = [("N_match/N_csv (recall)", diag["recall"]),
("N_match/(N_match + (N_detect-N_match)) (precision)",
diag["precision"]),
("F_{} score".format(beta), diag["fscore"]),
("(N_detect - N_match)/N_detect" +
"(fraction of craters that are new)",
diag["frac_new"]),
("(N_detect - N_match)/N_csv (fraction" +
"of craters that are new, 2)", diag["frac_new2"])]
for fname, data in metric_data:
print("mean and std of %s = %f, %f" %
(fname, np.mean(data), np.std(data)))
for fname, data in [("fractional longitude diff", diag["err_lo"]),
("fractional latitude diff", diag["err_la"]),
("fractional radius diff", diag["err_r"]),
]:
print("median and IQR %s = %f, 25:%f, 75:%f" %
(fname,
np.median(data),
np.percentile(data, 25),
np.percentile(data, 75)))
print("""mean and std of maximum detected pixel radius in an image =
%f, %f""" % (np.mean(diag["maxrad"]), np.std(diag["maxrad"])))
print("""absolute maximum detected pixel radius over all images =
%f""" % np.max(diag["maxrad"]))
print("")
return diag
########################
def build_model(dim, learn_rate, lmbda, drop, FL, init, n_filters):
"""Function that builds the (UNET) convolutional neural network.
Parameters
----------
dim : int
Dimension of input images (assumes square).
learn_rate : float
Learning rate.
lmbda : float
Convolution2D regularization parameter.
drop : float
Dropout fraction.
FL : int
Filter length.
init : string
Weight initialization type.
n_filters : int
Number of filters in each layer.
Returns
-------
model : keras model object
Constructed Keras model.
"""
print('Making UNET model...')
img_input = Input(batch_shape=(None, dim, dim, 1))
a1 = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(img_input)
a1 = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a1)
a1P = MaxPooling2D((2, 2), strides=(2, 2))(a1)
a2 = Convolution2D(n_filters * 2, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a1P)
a2 = Convolution2D(n_filters * 2, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a2)
a2P = MaxPooling2D((2, 2), strides=(2, 2))(a2)
a3 = Convolution2D(n_filters * 4, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a2P)
a3 = Convolution2D(n_filters * 4, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a3)
a3P = MaxPooling2D((2, 2), strides=(2, 2),)(a3)
u = Convolution2D(n_filters * 4, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(a3P)
u = Convolution2D(n_filters * 4, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = UpSampling2D((2, 2))(u)
u = merge((a3, u), mode='concat', concat_axis=3)
u = Dropout(drop)(u)
u = Convolution2D(n_filters * 2, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = Convolution2D(n_filters * 2, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = UpSampling2D((2, 2))(u)
u = merge((a2, u), mode='concat', concat_axis=3)
u = Dropout(drop)(u)
u = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = UpSampling2D((2, 2))(u)
u = merge((a1, u), mode='concat', concat_axis=3)
u = Dropout(drop)(u)
u = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = Convolution2D(n_filters, FL, FL, activation='relu', init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
# Final output
final_activation = 'sigmoid'
u = Convolution2D(1, 1, 1, activation=final_activation, init=init,
W_regularizer=l2(lmbda), border_mode='same')(u)
u = Reshape((dim, dim))(u)
if k2:
model = Model(inputs=img_input, outputs=u)
else:
model = Model(input=img_input, output=u)
optimizer = Adam(lr=learn_rate)
model.compile(loss='binary_crossentropy', optimizer=optimizer)
print(model.summary())
return model
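Because the network stacks three 2x2 MaxPooling stages mirrored by three UpSampling stages, `dim` must be divisible by 2**3 for the skip connections to line up. A quick, Keras-free arithmetic check:

```python
dim = 256                       # input width/height, as set in train_model
for _ in range(3):              # three MaxPooling2D((2, 2)) stages halve each axis
    assert dim % 2 == 0, "dim must be divisible by 2**3 for clean pooling"
    dim //= 2
bottleneck = dim                # spatial size at the bottom of the U
for _ in range(3):              # three UpSampling2D((2, 2)) stages restore it
    dim *= 2
print(bottleneck, dim)  # 32 256
```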
########################
def test_model(Data, Craters, MP, i_MP):
# Static params
dim, nb_epoch, bs = MP['dim'], MP['epochs'], MP['bs']
# Iterating params
FL = get_param_i(MP['filter_length'], i_MP)
learn_rate = get_param_i(MP['lr'], i_MP)
n_filters = get_param_i(MP['n_filters'], i_MP)
init = get_param_i(MP['init'], i_MP)
lmbda = get_param_i(MP['lambda'], i_MP)
drop = get_param_i(MP['dropout'], i_MP)
model = load_model(MP["model"])
get_metrics(Data[MP["test_dataset"]],
Craters[MP["test_dataset"]], dim, model, MP["test_dataset"])
def train_and_test_model(Data, Craters, MP, i_MP):
"""Function that trains, tests and saves the model, printing out metrics
after each model.
Parameters
----------
Data : dict
        Inputs and Target Mars data.
Craters : dict
Human-counted crater data.
MP : dict
Contains all relevant parameters.
i_MP : int
Iteration number (when iterating over hypers).
"""
# Static params
dim, nb_epoch, bs = MP['dim'], MP['epochs'], MP['bs']
# Iterating params
FL = get_param_i(MP['filter_length'], i_MP)
learn_rate = get_param_i(MP['lr'], i_MP)
n_filters = get_param_i(MP['n_filters'], i_MP)
init = get_param_i(MP['init'], i_MP)
lmbda = get_param_i(MP['lambda'], i_MP)
drop = get_param_i(MP['dropout'], i_MP)
# Build model
if MP["model"] is not None:
model = load_model(MP["model"])
else:
model = build_model(dim, learn_rate, lmbda, drop, FL, init, n_filters)
# Main loop
n_samples = MP['n_train']
for nb in range(nb_epoch):
if k2:
model.fit_generator(
custom_image_generator(Data['train'][0], Data['train'][1],
batch_size=bs),
steps_per_epoch=n_samples / bs, epochs=1, verbose=1,
# validation_data=(Data['dev'][0],Data['dev'][1]), #no gen
validation_data=custom_image_generator(Data['dev'][0],
Data['dev'][1],
batch_size=bs),
validation_steps=MP['n_dev'] / bs,
callbacks=[
EarlyStopping(monitor='val_loss', patience=3, verbose=0)])
else:
model.fit_generator(
custom_image_generator(Data['train'][0], Data['train'][1],
batch_size=bs),
samples_per_epoch=n_samples, nb_epoch=1, verbose=1,
# validation_data=(Data['dev'][0],Data['dev'][1]), #no gen
validation_data=custom_image_generator(Data['dev'][0],
Data['dev'][1],
batch_size=bs),
nb_val_samples=n_samples,
callbacks=[
EarlyStopping(monitor='val_loss', patience=3, verbose=0)])
suffix = "{}_{}_{}_{}_{}_{}_{}.hdf5".format(learn_rate,
n_filters,
init,
lmbda,
drop,
nb,
nb_epoch)
        model_save_name = os.path.join(MP["save_dir"],
                                       "model_{}".format(suffix))
if MP['save_models']:
model.save(model_save_name)
if MP["calculate_custom_loss"]:
get_metrics(Data['dev'], Craters['dev'], dim, model, "dev")
if MP["save_models"] == 1:
model.save(os.path.join(MP["save_dir"], MP["final_save_name"]))
print("###################################")
print("##########END_OF_RUN_INFO##########")
print("""learning_rate=%e, batch_size=%d, filter_length=%e, n_epoch=%d
n_train=%d, img_dimensions=%d, init=%s, n_filters=%d, lambda=%e
dropout=%f""" % (learn_rate, bs, FL, nb_epoch, MP['n_train'],
MP['dim'], init, n_filters, lmbda, drop))
if MP["calculate_custom_loss"]:
get_metrics(Data['test'], Craters['test'], dim, model, "test")
print("###################################")
print("###################################")
########################
def get_models(MP):
"""Top-level function that loads data files and calls train_and_test_model.
Parameters
----------
MP : dict
Model Parameters.
"""
dir = MP['dir']
n_train, n_dev, n_test = MP['n_train'], MP['n_dev'], MP['n_test']
# Load data
def load_files(numbers, test, this_dataset):
res0 = []
res1 = []
files = []
craters = []
images = []
npic = 0
if not test or (test and this_dataset):
for n in tqdm(numbers):
files.append(h5py.File(os.path.join(
dir, "sys_images_{0:05d}.hdf5".format(n)), 'r'))
images.extend(["img_{0:05d}".format(a)
for a in np.arange(n, n + 1000)])
res0.append(files[-1]["input_images"][:].astype('float32'))
npic = npic + len(res0[-1])
res1.append(files[-1]["target_masks"][:].astype('float32'))
files[-1].close()
craters.append(pd.HDFStore(os.path.join(
dir, "sys_craters_{0:05d}.hdf5".format(n)), 'r'))
res0 = np.vstack(res0)
res1 = np.vstack(res1)
return files, res0, res1, npic, craters, images
train_files,\
train0,\
train1,\
Ntrain,\
train_craters,\
train_images = load_files(MP["train_indices"],
MP["test"],
MP["test_dataset"] == "train")
print(Ntrain, n_train)
dev_files,\
dev0,\
dev1,\
Ndev,\
dev_craters,\
dev_images = load_files(MP["dev_indices"],
MP["test"],
MP["test_dataset"] == "dev")
print(Ndev, n_dev)
test_files,\
test0,\
test1,\
Ntest,\
test_craters,\
test_images = load_files(MP["test_indices"],
MP["test"],
MP["test_dataset"] == "test")
print(Ntest, n_test)
Data = {
"train": [train0, train1],
"dev": [dev0, dev1],
"test": [test0[:n_test], test1[:n_test]]
}
# Rescale, normalize, add extra dim
proc.preprocess(Data)
# Load ground-truth craters
Craters = {
'train': [train_craters, train_images],
'dev': [dev_craters, dev_images],
'test': [test_craters, test_images]
}
# Iterate over parameters
if MP["test"]:
test_model(Data, Craters, MP, 0)
return
else:
for i in range(MP['N_runs']):
train_and_test_model(Data, Craters, MP, i)
@dl.command()
@click.option("--test", is_flag=True, default=False)
@click.option("--test_dataset", default="dev")
@click.option("--model", default=None)
def train_model(test, test_dataset, model):
"""Run Convolutional Neural Network Training
Execute the training of a (UNET) Convolutional Neural Network on
    images of Mars and binary ring targets.
"""
# Model Parameters
MP = {}
# Directory of train/dev/test image and crater hdf5 files.
MP['dir'] = os.path.join(os.getenv("DM_ROOTDIR"), 'data/processed/')
# Image width/height, assuming square images.
MP['dim'] = 256
# Batch size: smaller values = less memory, less accurate gradient estimate
MP['bs'] = 10
# Number of training epochs.
MP['epochs'] = 30
# Number of train/valid/test samples, needs to be a multiple of batch size.
    # Sample every even-numbered image file for training,
    # half of the odd-numbered files for testing,
    # and half of the odd-numbered files for validation.
MP['train_indices'] = list(np.arange(162000, 208000, 2000))
MP['dev_indices'] = list(np.arange(161000, 206000, 4000))
MP['test_indices'] = list(np.arange(163000, 206000, 4000))
# MP['test_indices'] = 90000#list(np.arange(10000,184000,8000))
MP['n_train'] = len(MP["train_indices"]) * 1000
MP['n_dev'] = len(MP["dev_indices"]) * 1000
MP['n_test'] = len(MP["test_indices"]) * 1000
print(MP["n_train"], MP["n_dev"], MP["n_test"])
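The sample counts follow directly from the index ranges, since each hdf5 file contributes 1000 images; a standalone check of that arithmetic:

```python
import numpy as np

# Even-numbered file offsets for training; odd offsets split between dev/test,
# exactly as in train_model above.
train_indices = list(np.arange(162000, 208000, 2000))
dev_indices = list(np.arange(161000, 206000, 4000))
test_indices = list(np.arange(163000, 206000, 4000))

# n_train = number_of_files * 1000, etc.
print(len(train_indices), len(dev_indices), len(test_indices))  # 23 12 11
```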
# Save model (binary flag) and directory.
MP['save_models'] = 1
MP["calculate_custom_loss"] = False
MP['save_dir'] = 'models'
MP['final_save_name'] = 'model.h5'
# initial model
MP["model"] = model
# testing only
MP["test"] = test
MP["test_dataset"] = test_dataset
# Model Parameters (to potentially iterate over, keep in lists).
# runs.csv looks like
# filter_length,lr,n_filters,init,lambda,dropout
# 3,0.0001,112,he_normal,1e-6,0.15
#
# each line is a new run.
df = pd.read_csv("runs.csv")
for na, ty in [("filter_length", int),
("lr", float),
("n_filters", int),
("init", str),
("lambda", float),
("dropout", float)]:
MP[na] = df[na].astype(ty).values
MP['N_runs'] = len(MP['lambda']) # Number of runs
MP['filter_length'] = [3] # Filter length
# MP['lr'] = [0.0001] # Learning rate
# MP['n_filters'] = [112] # Number of filters
# MP['init'] = ['he_normal'] # Weight initialization
# MP['lambda'] = [1e-6] # Weight regularization
# MP['dropout'] = [0.15] # Dropout fraction
# Iterating over parameters example.
# MP['N_runs'] = 2
# MP['lambda']=[1e-4,1e-4]
print(MP)
get_models(MP)
if __name__ == '__main__':
dl()
| 36.237838 | 79 | 0.545085 | 3,310 | 26,816 | 4.263444 | 0.171601 | 0.01644 | 0.022321 | 0.021259 | 0.335176 | 0.310091 | 0.280258 | 0.269841 | 0.238946 | 0.218679 | 0 | 0.023665 | 0.316117 | 26,816 | 739 | 80 | 36.286874 | 0.745842 | 0.097479 | 0 | 0.217295 | 0 | 0 | 0.10883 | 0.01319 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.002217 | 0.059867 | null | null | 0.055432 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92b84f5fd97fe6052eddaf4fb13a8cbc0248d07a | 270 | py | Python | sample/rubrik_polaris/get_storage_object_ids_ebs.py | talmo77/rubrik-polaris-sdk-for-python | 505ce03b7995e7b86206c728be594d56d4431050 | [
"MIT"
] | 2 | 2021-07-14T12:54:53.000Z | 2022-03-03T21:55:28.000Z | sample/rubrik_polaris/get_storage_object_ids_ebs.py | talmo77/rubrik-polaris-sdk-for-python | 505ce03b7995e7b86206c728be594d56d4431050 | [
"MIT"
] | 8 | 2021-03-09T13:02:15.000Z | 2022-02-24T08:46:50.000Z | sample/rubrik_polaris/get_storage_object_ids_ebs.py | talmo77/rubrik-polaris-sdk-for-python | 505ce03b7995e7b86206c728be594d56d4431050 | [
"MIT"
] | 4 | 2021-04-16T15:49:36.000Z | 2021-11-09T17:58:21.000Z | from rubrik_polaris import PolarisClient
domain = 'my-company'
username = 'john.doe@example.com'
password = 's3cr3tP_a55w0R)'
client = PolarisClient(domain, username, password, insecure=True)
print(client.get_storage_object_ids_ebs(tags={"Class": "Management"}))
| 22.5 | 72 | 0.77037 | 33 | 270 | 6.121212 | 0.848485 | 0.188119 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020661 | 0.103704 | 270 | 11 | 73 | 24.545455 | 0.81405 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.333333 | 0.166667 | 0 | 0.166667 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
92b86ef8042175ced0f5be6c5fb6656b4c301369 | 452 | py | Python | lizardanalysis.py | JojoReikun/ClimbingLizardDLCAnalysis | 6cc38090217a3ffd4860ef6d06ba7967d3c10b7c | [
"MIT"
] | 1 | 2021-03-09T19:12:44.000Z | 2021-03-09T19:12:44.000Z | lizardanalysis.py | JojoReikun/ClimbingLizardDLCAnalysis | 6cc38090217a3ffd4860ef6d06ba7967d3c10b7c | [
"MIT"
] | null | null | null | lizardanalysis.py | JojoReikun/ClimbingLizardDLCAnalysis | 6cc38090217a3ffd4860ef6d06ba7967d3c10b7c | [
"MIT"
] | null | null | null | """
LizardDLCAnalysis Toolbox
© Johanna T. Schultz
© Fabian Plum
Licensed under MIT License
----------------------------------------------------------
for testing and debugging in pycharm:
---> Tools
---> Python Console
---> (with ipython installed):
IN[1]: import lizardanalysis
---> run commands:
IN[2]: lizardanalysis.command(*args, **kwargs)
"""
from lizardanalysis import cli
def main():
cli.main()
if __name__ == '__main__':
main()
| 17.384615 | 58 | 0.60177 | 49 | 452 | 5.428571 | 0.795918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005181 | 0.146018 | 452 | 25 | 59 | 18.08 | 0.678756 | 0.756637 | 0 | 0 | 0 | 0 | 0.078431 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92b908d1f507caf2b00b09a9d6cc2b497334bead | 3,535 | py | Python | logic/auth.py | enisimsar/watchtower-news | 222d2e52e76ef32ebb78eb325f4c32b64c0ba1a6 | [
"MIT"
] | 2 | 2019-02-21T18:29:09.000Z | 2021-01-27T14:52:46.000Z | logic/auth.py | enisimsar/watchtower-news | 222d2e52e76ef32ebb78eb325f4c32b64c0ba1a6 | [
"MIT"
] | 3 | 2018-11-22T08:34:04.000Z | 2021-06-01T22:47:19.000Z | logic/auth.py | enisimsar/watchtower-news | 222d2e52e76ef32ebb78eb325f4c32b64c0ba1a6 | [
"MIT"
] | 1 | 2019-06-13T10:45:46.000Z | 2019-06-13T10:45:46.000Z | import hashlib
import logging
import random
import string
import uuid
from mongoengine import DoesNotExist, NotUniqueError
from models.Invitation import Invitation
from models.User import User
__author__ = 'Enis Simsar'
# from http://www.pythoncentral.io/hashing-strings-with-python/
def hash_password(password):
# uuid is used to generate a random number
salt = uuid.uuid4().hex
return hashlib.sha256(salt.encode() + password.encode()).hexdigest() + ':' + salt
# from http://www.pythoncentral.io/hashing-strings-with-python/
def check_password(hashed_password, user_password):
password, salt = hashed_password.split(':')
return password == hashlib.sha256(salt.encode() + user_password.encode()).hexdigest()
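These two helpers round-trip cleanly; a self-contained restatement of them with a quick check (password values are illustrative):

```python
import hashlib
import uuid

def hash_password(password):
    # uuid4 hex is used as a per-password random salt, stored after ':'
    salt = uuid.uuid4().hex
    return hashlib.sha256(salt.encode() + password.encode()).hexdigest() + ':' + salt

def check_password(hashed_password, user_password):
    # Split out the stored salt and re-hash the candidate password with it
    password, salt = hashed_password.split(':')
    return password == hashlib.sha256(salt.encode() + user_password.encode()).hexdigest()

stored = hash_password('s3cr3t')
print(check_password(stored, 's3cr3t'))  # True
print(check_password(stored, 'wrong'))   # False
```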
def register_user(invitation_code, username, password):
logging.info("username: {0}".format(username))
password = hash_password(password)
try:
invitation = Invitation.objects.get(code=invitation_code, is_active=True)
except DoesNotExist:
return {'response': False, 'error': 'Invitation Code is invalid.'}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
try:
user = User(username=username, password=password)
user.save()
except NotUniqueError:
return {'response': False, 'error': 'Username is already registered!'}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
invitation.is_active = False
invitation.save()
return {'response': True, 'api_token': user.api_token}
def login_user(username, password):
logging.info("username: {0}".format(username))
try:
user = User.objects.get(username=username)
if not check_password(user.password, str(password)):
raise DoesNotExist
except DoesNotExist:
return {'response': False, 'error': 'Credentials are not correct!'}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
return {'response': True, 'api_token': user.api_token}
def get_user_profile(user_id):
logging.info("user_id: {0}".format(user_id))
try:
user = User.objects.get(id=user_id)
except DoesNotExist:
return {'response': False}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
if user:
logging.info(user.to_dict())
return user.to_dict()
return {'response': False}
def refresh_api_token(user_id):
logging.info("user_id: {0}".format(user_id))
try:
user = User.objects.get(id=user_id)
except DoesNotExist:
return {'response': False}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
if user:
new_token = ''.join([random.choice(string.ascii_letters + string.digits) for _ in range(40)])
user.api_token = new_token
user.save()
return {'api_token': new_token, 'response': True}
return {'response': False}
def get_user_with_api_token(api_token):
try:
user = User.objects.get(api_token=api_token)
except DoesNotExist:
return {'response': False}
except Exception as e:
logging.error("exception: {0}".format(str(e)))
return {'error': str(e)}
if user:
return {
'response': True,
'id': user.id
}
return {'response': False}
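The API-token scheme used in `refresh_api_token` boils down to a 40-character alphanumeric draw; a standalone sketch (the helper name is hypothetical, introduced only for illustration):

```python
import random
import string

def make_api_token(length=40):
    # Same draw as refresh_api_token: ASCII letters + digits, fixed length
    return ''.join(random.choice(string.ascii_letters + string.digits)
                   for _ in range(length))

token = make_api_token()
print(len(token), token.isalnum())  # 40 True
```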
| 28.97541 | 101 | 0.647808 | 434 | 3,535 | 5.16129 | 0.205069 | 0.075 | 0.076339 | 0.048214 | 0.474554 | 0.455804 | 0.418304 | 0.418304 | 0.373661 | 0.337054 | 0 | 0.006869 | 0.217539 | 3,535 | 121 | 102 | 29.214876 | 0.802965 | 0.046393 | 0 | 0.533333 | 0 | 0 | 0.122067 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.077778 | false | 0.111111 | 0.088889 | 0 | 0.411111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
2b7ba4ab1cf3dd66d4fbb9e770b9f307a58c473f | 8,055 | py | Python | UCP/discussion/functions.py | BuildmLearn/University-Campus-Portal-UCP | BSD-3-Clause
"""
Functions file for discussion app

consists of common functions used by both api.py and views.py file
"""
from django.contrib.auth.models import User
from django.contrib.auth import authenticate, login
from django.core.mail import send_mail
from django.shortcuts import render
from django.template import Context
from django.utils import timezone
from django.views.generic import View

from rest_framework import status
from rest_framework.authtoken.models import Token
from rest_framework.decorators import api_view
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from rest_framework.views import APIView

from login.models import UserProfile
import login.serializers as Serializers
from discussion.models import DiscussionThread, Reply, Attachment, Tag
from discussion.serializers import DiscussionThreadSerializer, DiscussionThreadFullSerializer, ReplySerializer, ReplyFullSerializer, TagSerializer

from UCP.constants import result, message
from UCP.settings import EMAIL_HOST_USER, BASE_URL, PAGE_SIZE
from UCP.functions import send_parallel_mail


def get_all_tags():
    """
    returns a list of all available tags
    """
    tags = Tag.objects.all()
    serializer = TagSerializer(tags, many=True)
    return serializer.data


def get_top_discussions(tags):
    """
    returns top 5 recent discussions from tags followed by the user
    """
    return DiscussionThread.objects.filter(tags__in=tags)[:5]
def add_discussion_thread(request):
    """
    creates a discussion thread from the POSTed data and attaches its tags
    """
    response = {}
    serializer = DiscussionThreadSerializer(data=request.POST)
    if serializer.is_valid():
        user_profile = UserProfile.objects.get(user=request.user)
        discussion = serializer.save(
            posted_by=user_profile,
            posted_at=timezone.now()
        )
        tags = request.POST["tag"].split(',')
        for tag_name in tags:
            if Tag.objects.filter(name=tag_name).exists():
                tag = Tag.objects.get(name=tag_name)
            else:
                tag = Tag(name=tag_name)
                tag.save()
            discussion.tags.add(tag)
        response["result"] = result.RESULT_SUCCESS
        response["error"] = []
        response["message"] = "Discussion Thread added successfully"
    else:
        response["result"] = result.RESULT_FAILURE
        response["error"] = serializer.errors
    return response
def get_discussion_list(request):
    """
    Return a list of discussion threads filtered by page number and tag
    """
    response = {}
    threads = DiscussionThread.objects.all()
    if "tag" in request.GET:
        #return a filtered list
        threads = DiscussionThread.objects.filter(tags__name=request.GET["tag"])

    #code for pagination
    count = len(threads)
    page_count = count // PAGE_SIZE + 1  # floor division keeps the Python 2 behaviour and also works on Python 3
    if "page" in request.GET:
        page_no = int(request.GET["page"]) - 1
    else:
        page_no = 0
    offset = page_no * PAGE_SIZE
    threads = threads[offset:offset + PAGE_SIZE]

    serializer = DiscussionThreadFullSerializer(threads, many=True)
    response["page_count"] = page_count
    response["data"] = serializer.data
    return response
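The same page-count and offset arithmetic recurs in `get_replies` below. A hedged sketch of a shared helper (the name `paginate` and its signature are assumptions, not part of the original codebase) that also avoids reporting an extra empty page when the item count divides evenly by the page size:

```python
import math


def paginate(items, page_size, page_no=1):
    """Hypothetical helper mirroring the views' pagination: 1-based
    page number, ceiling page count (minimum one page), and an empty
    list for out-of-range pages."""
    page_count = max(1, math.ceil(len(items) / page_size))
    offset = (page_no - 1) * page_size
    return items[offset:offset + page_size], page_count
```

For example, `paginate(list(range(10)), 4, 3)` yields the final two items and a page count of 3, whereas `count // PAGE_SIZE + 1` would report 3 pages even for exactly 8 items.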
def subscribe(request, pk):
    """
    subscribe request.user to a discussion thread with id pk
    """
    response = {}
    if DiscussionThread.objects.filter(id=pk).exists():
        discussion = DiscussionThread.objects.get(id=pk)
        user_profile = UserProfile.objects.get(user=request.user)
        discussion.subscribed.add(user_profile)
        discussion.save()
        response["result"] = result.RESULT_SUCCESS
        return response
    else:
        response["result"] = result.RESULT_FAILURE
        response["error"] = "This discussion id does not exist"
        return response  # was missing: the error branch previously returned None


def unsubscribe(request, pk):
    """
    unsubscribe request.user from a discussion thread with id pk
    """
    response = {}
    if DiscussionThread.objects.filter(id=pk).exists():
        discussion = DiscussionThread.objects.get(id=pk)
        user_profile = UserProfile.objects.get(user=request.user)
        discussion.subscribed.remove(user_profile)
        discussion.save()
        response["result"] = result.RESULT_SUCCESS
        return response
    else:
        response["result"] = result.RESULT_FAILURE
        response["error"] = "This discussion id does not exist"
        return response  # was missing: the error branch previously returned None
def get_tags(query):
    """
    returns a list of tags whose name match the query
    """
    tags = Tag.objects.filter(name__icontains=query)
    data = []
    for tag in tags:
        item = {}
        item["id"] = tag.id
        item["value"] = tag.name
        item["label"] = tag.name
        data.append(item)
    return data
def get_replies(pk, request):
    """
    returns all replies of a discussion pk based on the page number
    """
    response = {}
    if DiscussionThread.objects.filter(id=pk).exists():
        discussion = DiscussionThread.objects.get(id=pk)
        replies = Reply.objects.filter(thread=discussion)

        #pagination
        page_count = len(replies) // PAGE_SIZE + 1
        if "page" in request.GET:
            page_no = int(request.GET["page"]) - 1
        else:
            page_no = 0
        offset = page_no * PAGE_SIZE
        replies = replies[offset:offset + PAGE_SIZE]

        reply_serializer = ReplyFullSerializer(replies, many=True)
        discussion_serializer = DiscussionThreadFullSerializer(discussion)
        response["page_count"] = page_count
        response["data"] = {}
        response["data"]["discussion"] = discussion_serializer.data
        response["data"]["replies"] = reply_serializer.data
        return response
    else:
        response["error"] = "This discussion id does not exist"
        return response  # was missing: the error branch previously returned None
def get_discussion(pk):
    """
    Return the discussion with id pk
    """
    response = {}
    discussion = DiscussionThread.objects.get(id=pk)
    serializer = DiscussionThreadSerializer(discussion)
    response["data"] = serializer.data
    return response


def send_notification(discussion):
    """
    send an email notification to people subscribed to a thread
    """
    for user in discussion.subscribed.all():
        discussion_url = "http://" + BASE_URL + "/discussions/" + str(discussion.id)
        message = "Hey " + user.user.first_name + "!\n"
        message += "A new reply was added to this discussion\n"
        message += 'To view the discussions click here - ' + discussion_url
        send_parallel_mail(discussion.title + " - new reply", message, [user.user.email])
def add_reply(pk, request):
    """
    add a reply to the discussion thread with id pk
    """
    response = {}
    discussion = DiscussionThread.objects.get(id=pk)
    discussion_serializer = DiscussionThreadSerializer(discussion)
    serializer = ReplySerializer(data=request.POST)
    if serializer.is_valid():
        user_profile = UserProfile.objects.get(user=request.user)
        reply = serializer.save(
            posted_by=user_profile,
            posted_at=timezone.now(),
            thread=discussion
        )
        for _file in request.FILES.getlist('attachments'):
            print _file
            attachment = Attachment(
                reply=reply,
                uploaded_file=_file
            )
            attachment.save()
        discussion.no_of_replies += 1
        #automatically subscribe the person adding the reply to the discussion
        discussion.subscribed.add(user_profile)
        send_notification(discussion)
        discussion.save()
        response["result"] = result.RESULT_SUCCESS
        response["data"] = discussion_serializer.data
    else:
        response["result"] = result.RESULT_FAILURE
        response["error"] = serializer.errors
        response["data"] = discussion_serializer.data
    return response
2b809f699739184b5a2cfa529c9aff5901c0adf0 | 2,816 | py | Python | src/third_party/pcap2har/main.py | ashumeow/pcaphar | Apache-2.0
# Copyright 2010 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Command line tool to convert PCAP file to HAR format.
This is command line to for pcaphar app engine. A user can convert a PCAP file
to HAR format file.
"""
__author__ = 'lsong@google.com (Libo Song)'
import os
import sys
# add third_party directory to sys.path for global import
path = os.path.join(os.path.dirname(__file__), "..")
sys.path.append(os.path.abspath(path))
dpkt_path = os.path.join(path, "dpkt")
sys.path.append(os.path.abspath(dpkt_path))
import heapq
import logging
import StringIO
import time
import convert
def PrintUsage():
    print __file__, "[options] <pcap file> [<har file>]"
    print "options: -l[diwe]  log level"
    print "         --port    filter out port"
def main(argv=None):
    logging_level = logging.WARNING
    filter_port = -1
    if argv is None:
        argv = sys.argv
    filenames = []
    idx = 1
    while idx < len(argv):
        if argv[idx] == '-h' or argv[idx] == '--help':
            PrintUsage()
            return 0
        elif argv[idx] == '--port':
            idx += 1
            if idx >= len(argv):
                PrintUsage()
                return 1
            filter_port = int(argv[idx])
        elif argv[idx] == '-ld':
            logging_level = logging.DEBUG
        elif argv[idx] == '-li':
            logging_level = logging.INFO
        elif argv[idx] == '-lw':
            logging_level = logging.WARN
        elif argv[idx] == '-le':
            logging_level = logging.ERROR
        elif argv[idx][0:1] == '-':
            print "Unknown option:", argv[idx]
            PrintUsage()
            return 1
        else:
            filenames.append(argv[idx])
        idx += 1

    # set the logging level
    logging.basicConfig(level=logging_level)

    if len(filenames) == 1:
        pcap_file = filenames[0]
        har_file = pcap_file + ".har"
    elif len(filenames) == 2:
        pcap_file = filenames[0]
        har_file = filenames[1]
    else:
        PrintUsage()
        return 1

    # If an exception is raised, do not catch it; let it terminate the program.
    inf = open(pcap_file, 'r')
    pcap_in = inf.read()
    inf.close()  # note: the original `inf.close` was missing the call parentheses

    har_out = StringIO.StringIO()
    options = convert.Options()
    #options.remove_cookie = False
    convert.convert(pcap_in, har_out, options)
    har_out_str = har_out.getvalue()

    outf = open(har_file, 'w')
    outf.write(har_out_str)
    outf.close()


if __name__ == "__main__":
    sys.exit(main())
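The hand-rolled option loop above predates widespread `argparse` use. A sketch of an equivalent Python 3 CLI definition (hypothetical port, not part of the original project; `parse_args` and its flag spellings are assumptions):

```python
import argparse


def parse_args(argv=None):
    """Mirror of the hand-rolled loop above: log-level flags, --port,
    a required pcap file and an optional har file defaulting to
    <pcap>.har."""
    parser = argparse.ArgumentParser(description='Convert a PCAP capture to HAR.')
    parser.add_argument('pcap_file')
    parser.add_argument('har_file', nargs='?')
    parser.add_argument('--port', type=int, default=-1, help='filter out port')
    parser.add_argument('-l', choices='diwe', default='w', dest='log_level',
                        help='log level: d=debug, i=info, w=warn, e=error')
    args = parser.parse_args(argv)
    if args.har_file is None:
        args.har_file = args.pcap_file + '.har'
    return args
```

Besides shrinking the loop, argparse generates `-h/--help` output and rejects unknown flags automatically.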
2b85b84ca70ad679c297d70925380a546779f12d | 588 | py | Python | kunai/torch_utils/seed.py | mjun0812/kunai | MIT
import random
import numpy as np
import torch


def worker_init_fn(worker_id):
    """Reset numpy random seed in PyTorch Dataloader

    Args:
        worker_id (int): random seed value
    """
    np.random.seed(np.random.get_state()[1][0] + worker_id)


def fix_seed(seed):
    """fix seed on random, numpy, torch module

    Args:
        seed (int): seed parameter

    Returns:
        int: seed parameter
    """
    # random
    random.seed(seed)
    # Numpy
    np.random.seed(seed)
    # Pytorch
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    return seed
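The expression `np.random.get_state()[1][0]` in `worker_init_fn` reads the first word of NumPy's Mersenne Twister state, which after `np.random.seed(...)` acts as the process's base seed. Adding the worker id gives each DataLoader worker a distinct but reproducible stream. A minimal illustration of just that derivation (the helper name `per_worker_seed` is an assumption for demonstration; it is not part of the module):

```python
import numpy as np


def per_worker_seed(worker_id):
    """What worker_init_fn above derives: the base seed (first word of
    NumPy's Mersenne Twister state) plus the worker id."""
    base_seed = np.random.get_state()[1][0]
    return base_seed + worker_id
```

Since `get_state()` does not advance the generator, calling this repeatedly after the same `np.random.seed(...)` always yields the same per-worker seeds.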
2b8826183338e351fe4dbe3ffb007fda739ce1c8 | 467 | py | Python | datastrucutre/array/find_given_sum_in_array.py | abhishektyagi2912/python-dsa | MIT
def find_sum(arr, s):
    curr_sum = arr[0]
    start = 0
    n = len(arr) - 1
    i = 1
    while i <= n:
        while curr_sum > s and start < i:
            curr_sum = curr_sum - arr[start]
            start += 1
        if curr_sum == s:
            return "Found between {} and {}".format(start, i - 1)
        curr_sum = curr_sum + arr[i]
        i += 1
    # check the window that ends at the last element, which the loop
    # above never tests (e.g. [1, 2, 3] with s = 5 was reported missing)
    while curr_sum > s and start < n:
        curr_sum = curr_sum - arr[start]
        start += 1
    if curr_sum == s:
        return "Found between {} and {}".format(start, n)
    return "Sum not found"


arr = [15, 2, 4, 8, 9, 5, 10, 23]
print(find_sum(arr, 6))
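The sliding-window approach above assumes non-negative numbers: shrinking from the left is only valid when adding elements never decreases the sum. A hedged alternative using prefix sums and a hash map handles negatives too, still in O(n) time (the function `find_sum_any` is an illustrative addition, not part of the original file):

```python
def find_sum_any(arr, s):
    """Prefix-sum + hash-map variant of the subarray-sum search.
    Also works with negative numbers. Returns (start, end) inclusive
    indices of the first matching window, or None."""
    seen = {0: -1}          # prefix sum -> earliest index where it occurred
    prefix = 0
    for i, x in enumerate(arr):
        prefix += x
        # a window (j+1 .. i) sums to s iff prefix[i] - prefix[j] == s
        if prefix - s in seen:
            return seen[prefix - s] + 1, i
        seen.setdefault(prefix, i)
    return None
```

On the sample input it finds the same window: `find_sum_any([15, 2, 4, 8, 9, 5, 10, 23], 6)` returns `(1, 2)`.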
2b8a843980224263fdf89972e3e7e0691943f2f7 | 216 | py | Python | tests/test_winterspringbl.py | lamter/slavewg | Apache-2.0
import slavewg
from threading import Event
from queue import Queue


def test_runLoop():
    q = Queue()
    s = Event()
    s.set()
    lbl = slavewg.LootBlackLotus(s, q)
    lbl.do(lbl.pos_winterspring_mountain)
2b8cd12ab30cf14546818a1c84e84c64173a25c5 | 2,218 | py | Python | thesis_scripts/train_probs_plot.py | jizongFox/kaggle-seizure-prediction | MIT
import numpy as np
import json
import cPickle
import matplotlib.pyplot as plt
from theano import config
import matplotlib.cm as cmx
import matplotlib.colors as colors
from sklearn.metrics import roc_curve

from utils.loader import load_train_data
from utils.config_name_creator import *
from utils.data_scaler import scale_across_features, scale_across_time
from cnn.conv_net import ConvNet

config.floatX = 'float32'


def get_cmap(N):
    color_norm = colors.Normalize(vmin=0, vmax=N - 1)
    scalar_map = cmx.ScalarMappable(norm=color_norm, cmap='hsv')

    def map_index_to_rgb_color(index):
        return scalar_map.to_rgba(index)

    return map_index_to_rgb_color


def plot_train_probs(subject, data_path, model_path):
    with open(model_path + '/' + subject + '.pickle', 'rb') as f:
        state_dict = cPickle.load(f)
    cnn = ConvNet(state_dict['params'])
    cnn.set_weights(state_dict['weights'])
    scalers = state_dict['scalers']

    d = load_train_data(data_path, subject)
    x, y = d['x'], d['y']
    x, _ = scale_across_time(x, x_test=None, scalers=scalers) if state_dict['params']['scale_time'] \
        else scale_across_features(x, x_test=None, scalers=scalers)

    cnn.batch_size.set_value(x.shape[0])
    probs = cnn.get_test_proba(x)

    fpr, tpr, threshold = roc_curve(y, probs)
    c = np.sqrt((1 - tpr) ** 2 + fpr ** 2)
    opt_threshold = threshold[np.where(c == np.min(c))[0]]
    print opt_threshold

    x_coords = np.zeros(len(y), dtype='float64')
    rng = np.random.RandomState(42)
    x_coords += rng.normal(0.0, 0.08, size=len(x_coords))
    plt.scatter(x_coords, probs, c=y, s=60)
    plt.title(subject)
    plt.show()


if __name__ == '__main__':
    with open('SETTINGS.json') as f:
        settings_dict = json.load(f)

    data_path = settings_dict['path']['processed_data_path'] + '/' + create_fft_data_name(settings_dict)
    model_path = settings_dict['path']['model_path'] + '/' + create_cnn_model_name(settings_dict)

    subjects = ['Patient_1', 'Patient_2', 'Dog_1', 'Dog_2', 'Dog_3', 'Dog_4', 'Dog_5']
    for subject in subjects:
        print '***********************', subject, '***************************'
        plot_train_probs(subject, data_path, model_path)
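The `opt_threshold` computation in `plot_train_probs` picks the ROC operating point closest to the ideal corner `(fpr=0, tpr=1)` in Euclidean distance. Isolated as a small NumPy helper for clarity (the function name `optimal_roc_threshold` is an illustrative addition, not part of the script):

```python
import numpy as np


def optimal_roc_threshold(fpr, tpr, thresholds):
    """The threshold selection used in the script above: choose the
    ROC point minimising the distance to the perfect classifier
    corner (fpr=0, tpr=1)."""
    dist = np.sqrt((1 - tpr) ** 2 + fpr ** 2)
    return thresholds[np.argmin(dist)]
```

The arrays would come straight from `sklearn.metrics.roc_curve`, exactly as in the script.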
2b8eb6f4aea8446268376f43354cab9bc858fe41 | 9,313 | py | Python | jaeun.py | catubc/ensembles | MIT
# Kang Miller et al 2014 method for computing ensembles
#
#
#
import numpy as np
import matplotlib.pyplot as plt
import os
import scipy.stats
def PCA(X, n_components):
    from sklearn import decomposition

    #pca = decomposition.SparsePCA(n_components=3, n_jobs=1)
    pca = decomposition.PCA(n_components=n_components)

    print "... fitting PCA ..."
    pca.fit(X)

    for k in range(n_components):
        print "... explained variance: ", pca.explained_variance_[k]

    print "... pca transform..."
    return pca.transform(X)
def corr2_coeff(A, B):
    # Rowwise mean of input arrays & subtract from input arrays themselves
    A_mA = A - A.mean(1)[:, None]
    B_mB = B - B.mean(1)[:, None]

    # Sum of squares across rows
    ssA = (A_mA ** 2).sum(1)
    ssB = (B_mB ** 2).sum(1)

    # Finally get corr coeff
    return np.dot(A_mA, B_mB.T) / np.sqrt(np.dot(ssA[:, None], ssB[None]))
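The vectorized cross-correlation above can be sanity-checked against NumPy's own `np.corrcoef`, which, given two row-wise matrices, returns the full joint correlation matrix containing the same A-rows-versus-B-rows block. A self-contained check (the helper is restated here so the snippet runs on its own):

```python
import numpy as np


def corr2_coeff(A, B):
    # Same vectorized computation as above: rows of A vs rows of B.
    A_mA = A - A.mean(1)[:, None]
    B_mB = B - B.mean(1)[:, None]
    ssA = (A_mA ** 2).sum(1)
    ssB = (B_mB ** 2).sum(1)
    return np.dot(A_mA, B_mB.T) / np.sqrt(np.dot(ssA[:, None], ssB[None]))


rng = np.random.RandomState(0)
A = rng.randn(4, 50)
B = rng.randn(3, 50)
C = corr2_coeff(A, B)              # shape (4, 3)
full = np.corrcoef(A, B)[:4, 4:]   # same block from the stacked 7x7 matrix
assert np.allclose(C, full)
```

The vectorized version avoids building the full (nA+nB) x (nA+nB) matrix, which matters when only the cross block is needed.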
def jaeun_detect(rasters, list_filename):
    print rasters.shape
    activity = np.sum(rasters, axis=0)
    #plt.plot(activity)

    std = np.std(activity) * 4
    #plt.plot([0,len(rasters[0])],[std,std])

    #determine neurons in each ensemble:
    indexes = np.where(activity > std)[0]
    ensembles = []
    vectors = []
    for index in indexes:
        ensembles.append(np.where(rasters[:, index] > 0)[0])
        print len(ensembles)
        vectors.append(rasters[:, index])
        #plt.plot([index,index],[0,40])
    #plt.show()

    #Normalize vectors
    vec_matrix = np.float32(np.vstack(vectors)).T
    print vec_matrix.shape
    for k in range(len(vec_matrix)):
        vec_matrix[k] = vec_matrix[k] / np.sum(vec_matrix[k])
    vec_matrix = np.nan_to_num(vec_matrix).T

    plt.imshow(vec_matrix)
    plt.show()

    #Compute correlation:
    corr_matrix = np.zeros((len(vectors), len(vectors)), dtype=np.float32)
    corr_array = []
    for e in range(len(vectors)):
        for f in range(0, len(vectors), 1):
            print len(vectors[e]), len(vectors[f])
            print scipy.stats.pearsonr(vectors[e], vectors[f])
            # NOTE: pearsonr()[1] is the p-value; index [0] is the
            # correlation coefficient that a Fisher z-transform expects
            r = scipy.stats.pearsonr(vectors[e], vectors[f])[1]
            corr_matrix[e, f] = 0.5 * np.log((1 + r) / (1 - r))
            corr_array.append(0.5 * np.log((1 + r) / (1 - r)))
    print corr_matrix
    print np.min(corr_matrix), np.max(corr_matrix)
    plt.imshow(corr_matrix)
    plt.show()

    #from scipy.cluster.hierarchy import linkage, dendrogram
    #linkage_matrix = linkage(vectors, 'single')
    #dendogram = dendrogram(linkage_matrix,truncate_mode='none')

    from sklearn.cluster import SpectralClustering
    mat = corr_matrix
    ids = SpectralClustering(10).fit_predict(mat)
    #print img

    if False:  #Use PCA to cluster highly-active frames:
        data = PCA(vectors, 3)
        plt.scatter(data[:, 0], data[:, 1])
        plt.show()

    if False:  #Use hyperangles to compute difference
        matrix = np.zeros((len(vectors), len(vectors)), dtype=np.float32)
        vector0 = np.zeros(len(vectors[0]), dtype=np.float32) + 1
        dists = []
        for e in range(len(vectors)):
            p = np.dot(vectors[e], vector0) / np.linalg.norm(vectors[e]) / np.linalg.norm(vector0)  # -> cosine of the angle
            dists.append(np.degrees(np.arccos(np.clip(p, -1, 1))))
            for f in range(len(vectors)):
                c = np.dot(vectors[e], vectors[f]) / np.linalg.norm(vectors[e]) / np.linalg.norm(vectors[f])  # -> cosine of the angle
                #c = np.degrees(np.arccos(np.clip(c, -1, 1)))
                matrix[e, f] = np.degrees(np.arccos(np.clip(c, -1, 1)))
                #matrix[e,f] = c
                print "...angle: ", matrix[e, f]
                #temp_angle_array.append(c)
        plt.imshow(matrix)
        plt.show()

        bin_width = 10  # histogram bin width in usec
        y = np.histogram(dists, bins=np.arange(0, 90, bin_width))
        plt.bar(y[1][:-1], y[0], bin_width, color='b', alpha=1)
        plt.show()

    if False:  #Use SVD on all data;
        pass

    #quit()
    return ensembles, ids
#Load ROI contour data first
roi_filename = '/media/cat/250GB/in_vivo/alejandro/G2M4/joint/all_registered_processed_ROIs.npz'
data_in = np.load(roi_filename, encoding='latin1', mmap_mode='c')

Bmat_array = data_in['Bmat_array']
cm = data_in['cm']                  #Centre of mass
thr_array = data_in['thr_array']    #Threshold array original data, usually 0.2
traces = data_in['traces']
x_array = data_in['x_array']
y_array = data_in['y_array']
colors = 'b'

list_filename = '/media/cat/250GB/in_vivo/alejandro/G2M4/ch1_file_list.txt'
filenames = np.loadtxt(list_filename, dtype='str')

thr_fixed = .5
ctr = 0
modularity_levels = np.arange(0, 25, 1)
colors = ['gold', 'mediumslateblue', 'grey', 'thistle', 'teal', 'palegreen', 'violet', 'deepskyblue', 'blue', 'green', 'cyan', 'orange', 'red']

for s, filename in enumerate(filenames):
    print (filename)

    if '000' in filename:
        print ctr, (ctr * 7) % 21 + int(ctr / 3.) + 1
        print ("... spontaneous recording ...")
        rasters = np.load(filename[:-4] + "_rasters.npy")

        ensembles, ids = jaeun_detect(rasters, list_filename)
        print ensembles
        print ids

        #frame_array, weight_array = luis_detect(rasters, list_filename)
        #main_ensembles, other_ensembles = luis_ensembles (frame_array, weight_array, rasters)
        # NOTE: main_ensembles/other_ensembles are only defined by the
        # luis_* calls commented out above; the plotting below raises a
        # NameError unless those lines are re-enabled
        print len(main_ensembles)
        for k in range(len(main_ensembles)):
            print "...plotting ensemble: ", k
            ax = plt.subplot(3, 3, k + 1)
            plt.title("Ensemble: " + str(k), fontsize=20)
            ax.set_xticks([]); ax.set_yticks([])

            ##Draw all neurons first
            #for i, (y,x,Bmat,thr) in enumerate(zip(y_array,x_array,Bmat_array,thr_array)):
            #    cs = plt.contour(y, x, Bmat, [thr_fixed], colors='black',alpha=0.3)

            #Draw neurons at each modularity
            unique_indexes = main_ensembles[k]  #Select neurons at this level
            for i in unique_indexes:
                #cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors=colors[k%13],linewidth=15, alpha=1)
                cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors='red', linewidth=15, alpha=1)

            #Draw background neurons
            unique_indexes = other_ensembles[k]  #Select neurons at this level
            for i in unique_indexes:
                #cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors=colors[k%13],linewidth=15, alpha=1)
                cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors='black', linewidth=15, alpha=0.5)

        plt.show()

    if '001' in filename:
        print ("... stim recording...")

        #Draw stim1 ensembles
        network_stim1 = np.load(filename[:-4] + "_networks_stim1.npy")
        ax = plt.subplot(3, 7, (ctr * 7) % 21 + int(ctr / 3.) + 1)
        ax.set_xticks([]); ax.set_yticks([])
        if ctr == 1:
            plt.ylabel("Horizontal", fontsize=15)
        ctr += 1
        plt.title(os.path.split(filename)[1][:-4].replace('_C1V1_GCaMP6s', '') + ", #: " + str(len(network_stim1)), fontsize=9)

        #Draw all neurons first
        for i, (y, x, Bmat, thr) in enumerate(zip(y_array, x_array, Bmat_array, thr_array)):
            cs = plt.contour(y, x, Bmat, [thr_fixed], colors='black', alpha=0.3)

        #Draw neurons at each modularity
        for k in modularity_levels:
            if k > (len(network_stim1) - 1): break
            index_array = network_stim1[k]  #Select neurons at this level
            unique_indexes = np.unique(index_array)  #THIS is redundant as neurons are uniquely assigned to modularity levels
            print ("... modularity: ", k, " # neurons: ", len(unique_indexes))
            for i in unique_indexes:
                cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors=colors[k], linewidth=15, alpha=1)

        #Draw stim2 ensembles
        network_stim2 = np.load(filename[:-4] + "_networks_stim2.npy")
        ax = plt.subplot(3, 7, (ctr * 7) % 21 + int(ctr / 3.) + 1)
        ax.set_xticks([]); ax.set_yticks([])
        if ctr == 2:
            plt.ylabel("Vertical", fontsize=15)
        ctr += 1
        plt.title(os.path.split(filename)[1][:-4].replace('_C1V1_GCaMP6s', '') + ", #: " + str(len(network_stim2)), fontsize=9)

        #Draw all neurons first
        for i, (y, x, Bmat, thr) in enumerate(zip(y_array, x_array, Bmat_array, thr_array)):
            cs = plt.contour(y, x, Bmat, [thr_fixed], colors='black', alpha=0.3)

        #Draw neurons at each modularity
        for k in modularity_levels:
            if k > (len(network_stim2) - 1): break
            index_array = network_stim2[k]  #Select neurons at this level
            unique_indexes = np.unique(index_array)  #THIS is redundant as neurons are uniquely assigned to modularity levels
            print ("... modularity: ", k, " # neurons: ", len(unique_indexes))
            for i in unique_indexes:
                cs = plt.contour(y_array[i], x_array[i], Bmat_array[i], [thr_fixed], colors=colors[k], linewidth=15, alpha=1)

plt.show()
2b91b64bcf9fb63e83545e349af03a180f049619 | 3,012 | py | Python | fun/fnotification/migrations/0001_initial.py | larryw3i/osp | MIT
# Generated by Django 4.0 on 2022-01-13 10:17
import uuid

import ckeditor_uploader.fields
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('funuser', '0004_alter_funuser_avatar'),
        ('auth', '0012_alter_user_first_name_max_length'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Fnotification',
            fields=[
                ('id', models.UUIDField(
                    default=uuid.uuid4,
                    editable=False,
                    primary_key=True,
                    serialize=False,
                    unique=True)),
                ('title', models.CharField(
                    max_length=64,
                    verbose_name='Title')),
                ('content', ckeditor_uploader.fields.RichTextUploadingField(
                    max_length=2048,
                    verbose_name='Content')),
                ('additional_files', models.FileField(
                    help_text='If you have more than one file, please package them and upload them.',
                    upload_to='',
                    verbose_name='Additional files')),
                ('DOC', models.DateTimeField(
                    auto_now_add=True,
                    verbose_name='Date of creating')),
                ('DOU', models.DateTimeField(
                    auto_now=True,
                    verbose_name='Date of updating')),
                ('comment', models.TextField(
                    max_length=128,
                    verbose_name='Comment')),
                ('groups', models.ManyToManyField(
                    blank=True,
                    help_text='The groups this notification belongs to. all user of specific groups will receive notification. for all users if groups is null',
                    related_name='notification_set',
                    related_query_name='notification',
                    to='auth.Group',
                    verbose_name='groups')),
                ('poster', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    to='funuser.funuser',
                    verbose_name='Author')),
                ('readers', models.ManyToManyField(
                    blank=True,
                    related_name='reader_set',
                    related_query_name='reader',
                    to=settings.AUTH_USER_MODEL,
                    verbose_name='Reader')),
            ],
            options={
                'verbose_name': 'Notification',
                'verbose_name_plural': 'Notifications',
            },
        ),
    ]
2b96e0991964b2437d682f1fd12e74f747085a34 | 302 | py | Python | skippa/__init__.py | data-science-lab-amsterdam/skippa | BSD-3-Clause
"""Top-level package for skippa.
The pipeline module defines the main Skippa methods
The transformers subpackage contains various transformers used in the pipeline.
"""
__author__ = """Robert van Straalen"""
__email__ = 'tech@datasciencelab.nl'
from .pipeline import Skippa, SkippaPipeline, columns
2ba20cdbe21eded23a10e339852b7da8fa1eb902 | 948 | py | Python | conftest.py | dasap89/rest_accounts | MIT
"""Additional configuration for pytest"""
import datetime
import os
import pytest
from django.contrib.auth import get_user_model
from rest_framework.authtoken.models import Token
User = get_user_model()
# pylint: disable=redefined-outer-name,unused-argument,no-member
@pytest.fixture(scope='session')
def django_db_setup(django_db_setup, django_db_blocker):
"""Set up the database for tests that need it"""
with django_db_blocker.unblock():
user_one = User.objects.create_user(
email='user_one@example.com',
password='test_123'
)
Token.objects.create(user=user_one)
@pytest.mark.django_db
def get_token(email='user_one@example.com'):
"""
Get user token from db
    :param email: User email address
:return: Key value from Token model object
"""
test_user = User.objects.get(email=email)
# pylint: disable=no-member
return Token.objects.get(user_id=test_user.id).key
# lib/rucio/db/sqla/migrate_repo/versions/35ef10d1e11b_change_index_on_table_requests.py (repo: balrampariyarath/rucio, license: Apache-2.0)
"""
Copyright European Organization for Nuclear Research (CERN)
Licensed under the Apache License, Version 2.0 (the "License");
You may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
Authors:
- Vincent Garonne, <vincent.garonne@cern.ch>, 2014-2017
change index on table requests
Revision ID: 35ef10d1e11b
Revises: 3152492b110b
Create Date: 2014-06-20 09:01:52.704794
"""
from alembic.op import create_index, drop_index
# revision identifiers, used by Alembic.
revision = '35ef10d1e11b' # pylint:disable=invalid-name
down_revision = '3152492b110b' # pylint:disable=invalid-name
def upgrade():
'''
upgrade method
'''
create_index('REQUESTS_TYP_STA_UPD_IDX', 'requests', ["request_type", "state", "updated_at"])
drop_index('REQUESTS_TYP_STA_CRE_IDX', 'requests')
def downgrade():
'''
downgrade method
'''
create_index('REQUESTS_TYP_STA_CRE_IDX', 'requests', ["request_type", "state", "created_at"])
drop_index('REQUESTS_TYP_STA_UPD_IDX', 'requests')
# dm_verity_make_ext4fs.py (repo: bigb123/samsung-android_bootable_recovery_libdmverity, license: MIT)
#! /usr/bin/env python
# make_ext4fs -s -S /home/swei/p4/STA-ESG_SWEI_KLTE_ATT-TRUNK_DMV/android/out/target/product/klteatt/root/file_contexts -l 2654994432 -a system system.img.ext4 system
import os,posixpath,sys,getopt
reserve=1024*1024*32
def run(cmd):
    print(cmd)
    # return 0
    return os.system(cmd)
def main():
    d = posixpath.dirname(sys.argv[0])
    make_ext4fs_opt_list = []
    optlist, args = getopt.getopt(sys.argv[1:], 'l:j:b:g:i:I:L:a:G:fwzJsctrvS:X:')
    if len(args) < 1:
        print('image file not specified')
        return -1
    image_file = args[0]
    length = None
    sparse = False
    for o, a in optlist:
        if '-l' == o:
            length = int(a)
            make_ext4fs_opt_list.append(o)
            make_ext4fs_opt_list.append(str(length - reserve))
        elif '-s' == o:
            sparse = True
            make_ext4fs_opt_list.append(o)
        else:
            make_ext4fs_opt_list.append(o)
            if len(a) > 0:
                make_ext4fs_opt_list.append(a)
    if not sparse:
        print('we can only handle sparse image format for server generated dmverity for now')
        return -1
    if length is None:
        print('size of system image not taken')
        return -1
    make_ext4fs_opt_list.extend(args)
    cmd = os.path.join(d, 'make_ext4fs') + ' ' + ' '.join(make_ext4fs_opt_list)
    if run(cmd) != 0:
        print('failed!')
        return -1
    cmd = ' '.join(['img_dm_verity', '/dev/block/platform/15540000.dwmmc0/by-name/SYSTEM', str(length), image_file, image_file + '.tmp'])
    if run(cmd) != 0:
        print('failed!')
        return -1
    cmd = ' '.join(['mv', image_file + '.tmp', image_file])
    if run(cmd) != 0:
        print('failed!')
        return -1
    return 0
if __name__ == "__main__":
ret = main()
sys.exit(ret)
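For reference, the same `getopt` option string can be exercised standalone; the argument values below are made up and mirror the make_ext4fs example in the header comment:

```python
import getopt

# Option string copied from main(); 'l:' and 'S:' take an argument,
# the bare letters ('s', 'a' takes an argument too via 'a:') are flags.
argv = ['-s', '-S', '/tmp/file_contexts', '-l', '2654994432',
        '-a', 'system', 'system.img.ext4', 'system']
optlist, args = getopt.getopt(argv, 'l:j:b:g:i:I:L:a:G:fwzJsctrvS:X:')

# Flags come back paired with '' while argument-taking options keep their value;
# the trailing positionals (image file and source dir) land in args.
print(optlist)  # [('-s', ''), ('-S', '/tmp/file_contexts'), ('-l', '2654994432'), ('-a', 'system')]
print(args)     # ['system.img.ext4', 'system']
```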
# mcp2515.py (repo: sifosifo/MCP2515LinuxDriver, license: BSD-3-Clause)
#!/usr/bin/python
import spidev
class mcp2515:
SPI_RESET = 0xC0
SPI_READ = 0x03
SPI_READ_RX = 0x90
SPI_WRITE = 0x02
SPI_WRITE_TX = 0x40
SPI_RTS = 0x80
SPI_READ_STATUS = 0xA0
SPI_RX_STATUS = 0xB0
SPI_BIT_MODIFY = 0x05
#/* Configuration Registers */
CANSTAT = 0x0E
CANCTRL = 0x0F
BFPCTRL = 0x0C
TEC = 0x1C
REC = 0x1D
CNF3 = 0x28
CNF2 = 0x29
CNF1 = 0x2A
CANINTE = 0x2B
CANINTF = 0x2C
EFLG = 0x2D
TXRTSCTRL = 0x0D
#/* Recieve Filters */
RXF0SIDH = 0x00
RXF0SIDL = 0x01
RXF0EID8 = 0x02
RXF0EID0 = 0x03
RXF1SIDH = 0x04
RXF1SIDL = 0x05
RXF1EID8 = 0x06
RXF1EID0 = 0x07
RXF2SIDH = 0x08
RXF2SIDL = 0x09
RXF2EID8 = 0x0A
RXF2EID0 = 0x0B
RXF3SIDH = 0x10
RXF3SIDL = 0x11
RXF3EID8 = 0x12
RXF3EID0 = 0x13
RXF4SIDH = 0x14
RXF4SIDL = 0x15
RXF4EID8 = 0x16
RXF4EID0 = 0x17
RXF5SIDH = 0x18
RXF5SIDL = 0x19
RXF5EID8 = 0x1A
RXF5EID0 = 0x1B
#/* Receive Masks */
RXM0SIDH = 0x20
RXM0SIDL = 0x21
RXM0EID8 = 0x22
RXM0EID0 = 0x23
RXM1SIDH = 0x24
RXM1SIDL = 0x25
RXM1EID8 = 0x26
RXM1EID0 = 0x27
#/* Tx Buffer 0 */
TXB0CTRL = 0x30
TXB0SIDH = 0x31
TXB0SIDL = 0x32
TXB0EID8 = 0x33
TXB0EID0 = 0x34
TXB0DLC = 0x35
TXB0D0 = 0x36
TXB0D1 = 0x37
TXB0D2 = 0x38
TXB0D3 = 0x39
TXB0D4 = 0x3A
TXB0D5 = 0x3B
TXB0D6 = 0x3C
TXB0D7 = 0x3D
#/* Tx Buffer 1 */
TXB1CTRL = 0x40
TXB1SIDH = 0x41
TXB1SIDL = 0x42
TXB1EID8 = 0x43
TXB1EID0 = 0x44
TXB1DLC = 0x45
TXB1D0 = 0x46
TXB1D1 = 0x47
TXB1D2 = 0x48
TXB1D3 = 0x49
TXB1D4 = 0x4A
TXB1D5 = 0x4B
TXB1D6 = 0x4C
TXB1D7 = 0x4D
#/* Tx Buffer 2 */
TXB2CTRL = 0x50
TXB2SIDH = 0x51
TXB2SIDL = 0x52
TXB2EID8 = 0x53
TXB2EID0 = 0x54
TXB2DLC = 0x55
TXB2D0 = 0x56
TXB2D1 = 0x57
TXB2D2 = 0x58
TXB2D3 = 0x59
TXB2D4 = 0x5A
TXB2D5 = 0x5B
TXB2D6 = 0x5C
TXB2D7 = 0x5D
#/* Rx Buffer 0 */
RXB0CTRL = 0x60
RXB0SIDH = 0x61
RXB0SIDL = 0x62
RXB0EID8 = 0x63
RXB0EID0 = 0x64
RXB0DLC = 0x65
RXB0D0 = 0x66
RXB0D1 = 0x67
RXB0D2 = 0x68
RXB0D3 = 0x69
RXB0D4 = 0x6A
RXB0D5 = 0x6B
RXB0D6 = 0x6C
RXB0D7 = 0x6D
#/* Rx Buffer 1 */
RXB1CTRL = 0x70
RXB1SIDH = 0x71
RXB1SIDL = 0x72
RXB1EID8 = 0x73
RXB1EID0 = 0x74
RXB1DLC = 0x75
RXB1D0 = 0x76
RXB1D1 = 0x77
RXB1D2 = 0x78
RXB1D3 = 0x79
RXB1D4 = 0x7A
RXB1D5 = 0x7B
RXB1D6 = 0x7C
RXB1D7 = 0x7D
#/*******************************************************************
# * Bit register masks *
# *******************************************************************/
#/* TXBnCTRL */
TXREQ = 0x08
TXP = 0x03
#/* RXBnCTRL */
RXM = 0x60
BUKT = 0x04
#/* CANCTRL */
REQOP = 0xE0
ABAT = 0x10
OSM = 0x08
CLKEN = 0x04
CLKPRE = 0x03
#/* CANSTAT */
REQOP = 0xE0
ICOD = 0x0E
#/* CANINTE */
RX0IE = 0x01
RX1IE = 0x02
TX0IE = 0x04
    TX1IE = 0x08
TX2IE = 0x10
ERRIE = 0x20
WAKIE = 0x40
MERRE = 0x80
#/* CANINTF */
RX0IF = 0x01
RX1IF = 0x02
TX0IF = 0x04
    TX1IF = 0x08
TX2IF = 0x10
ERRIF = 0x20
WAKIF = 0x40
MERRF = 0x80
#/* BFPCTRL */
B1BFS = 0x20
B0BFS = 0x10
B1BFE = 0x08
B0BFE = 0x04
B1BFM = 0x02
B0BFM = 0x01
#/* CNF1 Masks */
SJW = 0xC0
BRP = 0x3F
#/* CNF2 Masks */
BTLMODE = 0x80
SAM = 0x40
PHSEG1 = 0x38
PRSEG = 0x07
#/* CNF3 Masks */
WAKFIL = 0x40
PHSEG2 = 0x07
#/* TXRTSCTRL Masks */
TXB2RTS = 0x04
TXB1RTS = 0x02
TXB0RTS = 0x01
#/*******************************************************************
# * Bit Timing Configuration *
# *******************************************************************/
#/* CNF1 */
SJW_1TQ = 0x40
SJW_2TQ = 0x80
SJW_3TQ = 0x90
SJW_4TQ = 0xC0
#/* CNF2 */
BTLMODE_CNF3 = 0x80
BTLMODE_PH1_IPT = 0x00
SMPL_3X = 0x40
SMPL_1X = 0x00
PHSEG1_8TQ = 0x38
PHSEG1_7TQ = 0x30
PHSEG1_6TQ = 0x28
PHSEG1_5TQ = 0x20
PHSEG1_4TQ = 0x18
PHSEG1_3TQ = 0x10
PHSEG1_2TQ = 0x08
PHSEG1_1TQ = 0x00
PRSEG_8TQ = 0x07
PRSEG_7TQ = 0x06
PRSEG_6TQ = 0x05
PRSEG_5TQ = 0x04
PRSEG_4TQ = 0x03
PRSEG_3TQ = 0x02
PRSEG_2TQ = 0x01
PRSEG_1TQ = 0x00
#/* CNF3 */
PHSEG2_8TQ = 0x07
PHSEG2_7TQ = 0x06
PHSEG2_6TQ = 0x05
PHSEG2_5TQ = 0x04
PHSEG2_4TQ = 0x03
PHSEG2_3TQ = 0x02
PHSEG2_2TQ = 0x01
PHSEG2_1TQ = 0x00
SOF_ENABLED = 0x80
WAKFIL_ENABLED = 0x40
WAKFIL_DISABLED = 0x00
#/*******************************************************************
# * Control/Configuration Registers *
# *******************************************************************/
#/* CANINTE */
RX0IE_ENABLED = 0x01
RX0IE_DISABLED = 0x00
RX1IE_ENABLED = 0x02
RX1IE_DISABLED = 0x00
G_RXIE_ENABLED = 0x03
G_RXIE_DISABLED = 0x00
TX0IE_ENABLED = 0x04
TX0IE_DISABLED = 0x00
TX1IE_ENABLED = 0x08
    TX1IE_DISABLED = 0x00
TX2IE_ENABLED = 0x10
TX2IE_DISABLED = 0x00
G_TXIE_ENABLED = 0x1C
G_TXIE_DISABLED = 0x00
ERRIE_ENABLED = 0x20
ERRIE_DISABLED = 0x00
WAKIE_ENABLED = 0x40
WAKIE_DISABLED = 0x00
IVRE_ENABLED = 0x80
IVRE_DISABLED = 0x00
#/* CANINTF */
RX0IF_SET = 0x01
RX0IF_RESET = 0x00
RX1IF_SET = 0x02
RX1IF_RESET = 0x00
TX0IF_SET = 0x04
TX0IF_RESET = 0x00
TX1IF_SET = 0x08
    TX1IF_RESET = 0x00
TX2IF_SET = 0x10
TX2IF_RESET = 0x00
ERRIF_SET = 0x20
ERRIF_RESET = 0x00
WAKIF_SET = 0x40
WAKIF_RESET = 0x00
IVRF_SET = 0x80
IVRF_RESET = 0x00
#/* CANCTRL */
REQOP_CONFIG = 0x80
REQOP_LISTEN = 0x60
REQOP_LOOPBACK = 0x40
REQOP_SLEEP = 0x20
REQOP_NORMAL = 0x00
ABORT = 0x10
OSM_ENABLED = 0x08
CLKOUT_ENABLED = 0x04
CLKOUT_DISABLED = 0x00
CLKOUT_PRE_8 = 0x03
CLKOUT_PRE_4 = 0x02
CLKOUT_PRE_2 = 0x01
CLKOUT_PRE_1 = 0x00
#/* CANSTAT */
OPMODE_CONFIG = 0x80
OPMODE_LISTEN = 0x60
OPMODE_LOOPBACK = 0x40
OPMODE_SLEEP = 0x20
OPMODE_NORMAL = 0x00
#/* RXBnCTRL */
RXM_RCV_ALL = 0x60
RXM_VALID_EXT = 0x40
RXM_VALID_STD = 0x20
RXM_VALID_ALL = 0x00
RXRTR_REMOTE = 0x08
RXRTR_NO_REMOTE = 0x00
BUKT_ROLLOVER = 0x04
BUKT_NO_ROLLOVER = 0x00
FILHIT0_FLTR_1 = 0x01
FILHIT0_FLTR_0 = 0x00
FILHIT1_FLTR_5 = 0x05
FILHIT1_FLTR_4 = 0x04
FILHIT1_FLTR_3 = 0x03
FILHIT1_FLTR_2 = 0x02
FILHIT1_FLTR_1 = 0x01
FILHIT1_FLTR_0 = 0x00
#/* TXBnCTRL */
TXREQ_SET = 0x08
TXREQ_CLEAR = 0x00
TXP_HIGHEST = 0x03
TXP_INTER_HIGH = 0x02
TXP_INTER_LOW = 0x01
TXP_LOWEST = 0x00
#/*******************************************************************
# * Register Bit Masks *
# *******************************************************************/
DLC_0 = 0x00
DLC_1 = 0x01
DLC_2 = 0x02
DLC_3 = 0x03
DLC_4 = 0x04
DLC_5 = 0x05
DLC_6 = 0x06
DLC_7 = 0x07
DLC_8 = 0x08
#/*******************************************************************
# * CAN SPI commands *
# *******************************************************************/
CAN_RESET = 0xC0
CAN_READ = 0x03
CAN_WRITE = 0x02
CAN_RTS = 0x80
CAN_RTS_TXB0 = 0x81
CAN_RTS_TXB1 = 0x82
CAN_RTS_TXB2 = 0x84
CAN_RD_STATUS = 0xA0
CAN_BIT_MODIFY = 0x05
CAN_RX_STATUS = 0xB0
CAN_RD_RX_BUFF = 0x90
CAN_LOAD_TX = 0x40
#/*******************************************************************
# * Miscellaneous *
# *******************************************************************/
DUMMY_BYTE = 0x00
TXB0 = 0x31
TXB1 = 0x41
TXB2 = 0x51
RXB0 = 0x61
RXB1 = 0x71
EXIDE_SET = 0x08
EXIDE_RESET = 0x00
#MCP2515
CAN_10Kbps = 0x31
CAN_25Kbps = 0x13
CAN_50Kbps = 0x09
CAN_100Kbps = 0x04
CAN_125Kbps = 0x03
CAN_250Kbps = 0x01
CAN_500Kbps = 0x00
def __init__(self):
self.spi = spidev.SpiDev()
self.spi.open(0, 0)
self.spi.max_speed_hz = 500000
self.spi.mode = 0b11
command = [self.SPI_RESET]
self.spi.writebytes(command)
def WriteRegister(self, Register, Data):
command = [self.SPI_WRITE, Register] + Data
self.spi.writebytes(command)
def ReadRegister(self, Register, n):
Data = [self.SPI_READ, Register] + [0]*n
#print Data
self.spi.xfer(Data)
#print Data
return(Data[2:])
def BitModify(self, Register, Mask, Value):
# print "BitModify(Reg=0x%x, Mask=0x%x, Value=0x%x)" % (Register, Mask, Value)
command = [self.SPI_BIT_MODIFY, Register] + [Mask, Value]
self.spi.writebytes(command)
def ReadStatus(self, type):
Data = [type, 0xFF]
self.spi.xfer(Data)
return(Data[1])
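The driver's SPI traffic is just short byte lists; this standalone sketch rebuilds the frames that `ReadRegister` and `BitModify` clock out, without needing spidev hardware (the constants are copied from the class above):

```python
# SPI command framing used by the driver, shown without hardware.
SPI_READ = 0x03
SPI_BIT_MODIFY = 0x05
CANSTAT = 0x0E

def read_frame(register, n):
    """Bytes clocked out for a register read: command, address, then n dummy bytes."""
    return [SPI_READ, register] + [0] * n

def bit_modify_frame(register, mask, value):
    """Bytes for the BIT MODIFY command: only bits set in mask get changed."""
    return [SPI_BIT_MODIFY, register, mask, value]

print(read_frame(CANSTAT, 1))                  # [3, 14, 0]
print(bit_modify_frame(CANSTAT, 0xE0, 0x80))   # [5, 14, 224, 128]
```

During a real transfer the chip shifts the register contents back into the dummy byte positions, which is why `ReadRegister` returns `Data[2:]`.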
# model/grouptrack.py (repo: janhradek/regaudio, license: Zlib)
import sqlalchemy
from .base import Base
from .track import Track
import model.tracktime
class GroupTrack(Base):
'''
a link between group and track (association pattern)
backrefs group and track (not listed here)
To use this first create the group, then group tracks,
then tracks and add them to grouptracks,
then add grouptracks to group
Association Object
'''
__tablename__ = "grouptracks"
idno = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
trackid = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey("tracks.idno"))#, primary_key = True)
groupid = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey("groups.idno"))#, primary_key = True)
no = sqlalchemy.Column(sqlalchemy.Integer)
    # time doesn't work well with Qt MVC and with Python
fromms = sqlalchemy.Column(sqlalchemy.Integer)
lengthms = sqlalchemy.Column(sqlalchemy.Integer)
#track = sqlalchemy.orm.relationship("Track", backref="grouptracks")
headers = Track.headers[:]
headers.insert(0, "No")
headers.append("From")
headers.append("Length")
innerheaders = 1 # the track headers start from second index
def __init__(self, no, froms, lengths):
self.no = no
self.fromms = froms
self.lengthms = lengths
def bycol(self, col, newvalue=None, edit=False):
col = GroupTrack.translatecol(col)
if col < 0:
return self.track.bycol(-1-col, newvalue, edit)
if col == 0:
if newvalue != None:
self.no = newvalue
return True
return self.no
elif col == 1:
if newvalue != None:
nw, ok = model.tracktime.strtototal(newvalue)
if ok: self.fromms = nw
return ok
return model.tracktime.totaltostr(self.fromms, edit)
elif col == 2:
if newvalue != None:
nw, ok = model.tracktime.strtototal(newvalue)
if ok: self.lengthms = nw
return ok
return model.tracktime.totaltostr(self.lengthms, edit)
return None
def tipbycol(self, col):
col = GroupTrack.translatecol(col)
if col < 0:
return self.track.tipbycol(-1-col)
return None
@classmethod
def colbycol(cls, col):
col = cls.translatecol(col)
if col < 0:
return Track.colbycol(-1-col)
if col == 0:
return cls.no
elif col == 1:
return cls.fromms
elif col == 2:
return cls.lengthms
@classmethod
def isStar(cls,col):
col = cls.translatecol(col)
if col < 0:
return Track.isStar(-1-col)
else:
return False
@classmethod
def isCheck(cls, col):
col = cls.translatecol(col)
if col < 0:
return Track.isCheck(-1-col)
else:
return False
@classmethod
def translatecol(cls, col):
'''
"translate" the column number to the Track or to local index
a column in grouptrack (positive) or track (negative - 1)
'''
if col >= cls.innerheaders and col < cls.innerheaders + len(Track.headers):
return cls.innerheaders - col - 1
elif col >= cls.innerheaders + len(Track.headers):
return col - len(Track.headers)
return col
@classmethod
def translateorder(cls, direction, col):
"""translate the column number to/from Track
if direction then translate to track
return < 0 if the col doesn't have counterpart
"""
if direction: # gt -> t
return -1 - cls.translatecol(col)
else: # t -> gt
return cls.innerheaders + col
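The `translatecol` arithmetic can be checked standalone; this sketch copies it with `innerheaders = 1` and a hypothetical three-entry `Track.headers` list:

```python
INNERHEADERS = 1
TRACK_HEADERS = ['Artist', 'Title', 'Rating']  # hypothetical Track.headers

def translatecol(col):
    # columns [INNERHEADERS, INNERHEADERS + len) belong to Track and are
    # encoded as negative numbers; later columns shift back to local indices
    if INNERHEADERS <= col < INNERHEADERS + len(TRACK_HEADERS):
        return INNERHEADERS - col - 1
    elif col >= INNERHEADERS + len(TRACK_HEADERS):
        return col - len(TRACK_HEADERS)
    return col

print([translatecol(c) for c in range(6)])  # [0, -1, -2, -3, 1, 2]
```

Track columns come back as negative values, which callers like `bycol` decode again with `-1 - col` before delegating to the underlying `Track`.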
# usaspending_api/common/tests/test_limitable_serializer.py (repo: truthiswill/usaspending-api, license: CC0-1.0)
import pytest
import json
from model_mommy import mommy
from usaspending_api.awards.models import Award
@pytest.fixture
def mock_limitable_data():
mommy.make(Award, _fill_optional=True)
@pytest.mark.django_db
def test_nested_field_limiting(client, mock_limitable_data):
request_object = {
"fields": ["piid", "recipient__recipient_name"]
}
response = client.post(
"/api/v1/awards/",
content_type='application/json',
data=json.dumps(request_object),
format='json')
results = response.data["results"][0]
assert "piid" in results.keys()
assert "recipient" in results.keys()
assert "recipient_name" in results.get("recipient", {}).keys()
@pytest.mark.django_db
def test_nested_field_exclusion(client, mock_limitable_data):
request_object = {
"exclude": ["piid", "recipient__recipient_name"]
}
response = client.post(
"/api/v1/awards/",
content_type='application/json',
data=json.dumps(request_object),
format='json')
results = response.data["results"][0]
assert "piid" not in results.keys()
assert "recipient" in results.keys()
assert "recipient_name" not in results.get("recipient").keys()
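These tests exercise `fields`/`exclude` lists with `__`-separated nested paths; the helper below is an illustrative dict-based sketch of that selection, not the serializer's actual implementation:

```python
def limit_fields(data, fields):
    """Keep only the given '__'-separated paths from a nested dict."""
    out = {}
    for path in fields:
        head, _, rest = path.partition('__')
        if head not in data:
            continue
        if rest:
            # recurse into the nested dict for the remainder of the path
            nested = limit_fields(data[head], [rest])
            out.setdefault(head, {}).update(nested)
        else:
            out[head] = data[head]
    return out

row = {'piid': 'ABC', 'total': 10,
       'recipient': {'recipient_name': 'ACME', 'duns': '1'}}
print(limit_fields(row, ['piid', 'recipient__recipient_name']))
# {'piid': 'ABC', 'recipient': {'recipient_name': 'ACME'}}
```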
# yasql/apps/sqlorders/urls.py (repo: Fanduzi/YaSQL, license: Apache-2.0)
# -*- coding:utf-8 -*-
# edit by fuzongfei
from django.urls import path
from sqlorders import views
urlpatterns = [
    # SQL orders
path('envs', views.GetDBEnvironment.as_view(), name='v1.sqlorders.db-environment'),
path('schemas', views.GetDbSchemas.as_view(), name='v1.sqlorders.db-schemas'),
path('incep/syntaxcheck', views.IncepSyntaxCheckView.as_view(), name='v1.sqlorders.incep.syntaxcheck'),
path('commit', views.SqlOrdersCommit.as_view(), name='v1.sqlorders.commit'),
path('list', views.SqlOrdersList.as_view(), name='v1.sqlorders.list'),
path('detail/<str:order_id>', views.SqlOrdersDetail.as_view(), name='v1.sqlorders.detail'),
path('op/approve/<int:pk>', views.OpSqlOrderView.as_view({"put": "approve"}), name='v1.sqlorders.approve'),
path('op/feedback/<int:pk>', views.OpSqlOrderView.as_view({"put": "feedback"}), name='v1.sqlorders.feedback'),
path('op/close/<int:pk>', views.OpSqlOrderView.as_view({"put": "close"}), name='v1.sqlorders.close'),
path('op/review/<int:pk>', views.OpSqlOrderView.as_view({"put": "review"}), name='v1.sqlorders.review'),
    # Generate order tasks
path('tasks/generate', views.GenerateTasksView.as_view(), name='v1.sqlorders.generate-tasks'),
path('tasks/get/<str:order_id>', views.GetTaskIdView.as_view(), name='v1.sqlorders.get-task-id'),
path('tasks/list/<str:task_id>', views.GetTasksListView.as_view(), name='v1.sqlorders.get-tasks-list'),
path('tasks/preview/<str:task_id>', views.GetTasksPreviewView.as_view(),
name='v1.sqlorders.get-tasks-preview'),
    # Execute tasks
path('tasks/execute/single', views.ExecuteSingleTaskView.as_view(), name='v1.sqlorders.execute-single-task'),
path('tasks/execute/multi', views.ExecuteMultiTasksView.as_view(), name='v1.sqlorders.execute-multi-tasks'),
path('tasks/throttle', views.ThrottleTaskView.as_view(), name='v1.sqlorders.throttle-task'),
path('tasks/result/<int:id>', views.GetTasksResultView.as_view(), name='v1.sqlorders.get-tasks-result'),
# Hook
path('hook', views.HookSqlOrdersView.as_view(), name='v1.sqlorders.hook-sqlorders'),
# download export files
path('export/download/<str:base64_filename>', views.DownloadExportFilesView.as_view(),
name='v1.sqlorders.download-export-files'),
    # Release versions
path('versions/get', views.ReleaseVersionsGet.as_view(), name='v1.sqlorders.versions.get'),
path('versions/list', views.ReleaseVersionsList.as_view(), name='v1.sqlorders.versions.list'),
path('versions/create', views.ReleaseVersionsCreate.as_view(),
name='v1.sqlorders.versions.create'),
path('versions/update/<int:key>', views.ReleaseVersionsUpdate.as_view(),
name='v1.sqlorders.versions.update'),
path('versions/delete/<int:id>', views.ReleaseVersionsDelete.as_view(),
name='v1.sqlorders.versions.delete'),
path('versions/view/<str:version>', views.ReleaseVersionsView.as_view(),
name='v1.sqlorders.versions.view'),
]
# LAB/05/0530_PyMongo.py (repo: LegenDad/KTM_Lab, license: MIT)
# -*- coding: utf-8 -*-
"""
Created on Thu May 31 09:09:26 2018
@author: Jeon
"""
# package installation, run from an IPython/Spyder console (not plain Python):
# !pip search pymongo
# !pip install pymongo
import pymongo
mgclient = pymongo.MongoClient("localhost", 27017)
# check start mogodb (mongod)
mgclient.database_names()
testdb = mgclient.testdb
testdb_col = testdb.items
data= { "name":"cola", "pty":5, "price":500}
testdb_col.insert(data)
testdb.supermarket.insert(data)
print(mgclient.database_names())
for doc in testdb.supermarket.find():
print(doc)
# example document fields: games, hero (e.g. Tracer), userNum
tracer = {"games":"Overwatch", "hero":"Tracer", "userNum":8888}
testdb.games.insert(tracer)
for doc in testdb.games.find():
print(doc)
for post in testdb.supermarket.find():
print(post)
print(testdb.supermarket.find({"price":500}).count())
print(testdb.supermarket.find({"name":'GS'}).count())
print(testdb.supermarket.find({"pty":1}).count())
print(testdb.supermarket.find({"pty": {"$gt":3} }).count())
print(testdb.supermarket.find({"price": {"$gt":3} }).count())
for post in testdb.supermarket.find().sort('name'):
    print(post)
# src/animasnd/image.py (repo: N-z0/commonz, license: Unlicense)
#!/usr/bin/env python3
#coding: utf-8
### 1st line allows executing this script by typing only its name in a terminal, with no need to precede it with the python command
### 2nd line declaring the source code charset should not be necessary, but pydoc for example requires it
__doc__ = "provide images support"#information describing the purpose of this module
__status__ = "Development"#should be one of 'Prototype' 'Development' 'Production' 'Deprecated' 'Release'
__version__ = "3.0.0"# version number,date or about last modification made compared to the previous version
__license__ = "public domain"# ref to an official existing License
__date__ = "2008"#started creation date / year month day
__author__ = "N-zo syslog@laposte.net"#the original creator of this program
__maintainer__ = "Nzo"#person who currently makes improvements, replacing the author
__credits__ = []#past maintainers and any other helpers
__contact__ = "syslog@laposte.net"# current contact address for more info about this file
### images modules
#import pygame # designed for writing games. (is not specific to images)
#import scipy # deprecating image I/O functionality and will be removed
### Pillow was the main library used by Scipy for images.
#import imageio # variety of plugins for many images formats.
### Pillow is also the main plugin of imageio for common images
### PIL (Python Image Library) was adapted to Python 3 only late,
### so a fork for Python 3 named Pillow was made;
### Pillow and PIL are therefore almost the same
from PIL import Image
from PIL import ImageChops
from PIL import ImageDraw
from PIL import ImageFont
from PIL import ImageOps
from PIL import ImageFilter
### OpenCV (OpenSource Computer Vision) is available in Debian repo
### But no official OpenCV packages released by OpenCV.org on PyPI
### But on PyPI the not official opencv-contrib-python includes all OpenCV functionality.
import cv2
#import cairo # 2D graphics library
### others required modules
import numpy # use for exporting image as array
### PIL Image modes:
BITMAP_MODE = '1' #1b this binary mode is not respected, pixels values are stored with 0 or 255.
GREY_MODE = 'L' #1o shade of grey varies from 0 to 256
LA_MODE = 'LA' # same as L mode with alpha
INDEX_MODE = 'P' #1o the number of colors in a palette may vary, it is not always 256 colors,
RGB_MODE = 'RGB' #3o true color
HSV_MODE = 'HSV' #3o Hue, Saturation, Value
RGBA_MODE = 'RGBA' #4o true color with transparency
INT_MODE = 'I' #4o(signed integer) why signed ?? anyway, I did not find any image format with this mode
FLOAT_MODE = 'F' #4o(floating point) I did not find any image format with this mode.
### resize resample:
### If omitted, or if the image has mode “1” or “P”, it is set PIL.Image.NEAREST.
NEAREST=Image.NEAREST# (use nearest pixels neighbour)
BILINEAR=Image.BILINEAR# (linear pixels interpolation)
BICUBIC=Image.BICUBIC# (cubic spline pixels interpolation) **the most interesting***
LANCZOS=Image.LANCZOS# (a high-quality downsampling pixels filter)
### transpose methods:
FLIP_HORIZONTALLY=Image.FLIP_LEFT_RIGHT
FLIP_VERTICALLY=Image.FLIP_TOP_BOTTOM
ROTATE_90=Image.ROTATE_90
ROTATE_180=Image.ROTATE_180
ROTATE_270=Image.ROTATE_270
TRANSPOSE=Image.TRANSPOSE
### text alignement
ALIGN_LEFT="left"
ALIGN_CENTER="center"
ALIGN_RIGHT="right"
### Direction of the text.
### not supported without libraqm
DIR_RTL="rtl" # (right to left)
DIR_LTR="ltr" # (left to right)
DIR_TTB="ttb" # (top to bottom). Requires libraqm.
### pixel location Indexes
X=0
Y=1
### pixel compo Indexes
R=0
G=1
B=2
A=3
### RGB basics colors
RED=(255,0,0)
GREEN=(0,255,0)
BLUE=(0,0,255)
BLACK=(0,0,0)
WHITE=(255,255,255)
### disable DecompressionBomb safety
#Image.MAX_IMAGE_PIXELS = 200000**2
#Image.warnings.simplefilter('ignore', Image.DecompressionBombWarning)
def image_show(imagefile,title=None):
"""Load and image file and show it in window"""
img = cv2.imread(imagefile,cv2.IMREAD_UNCHANGED)
cv2.imshow(title,img)
cv2.waitKey(0)#display the window infinitely until any keypress.
cv2.destroyAllWindows()
class Bitmap :
"""allow bitmap image manipulation"""
def __init__(self, mode, size, color):
"""mode is the format of pixels, size contains x and y image dimension, color concern the image background"""
self.img= Image.new( mode, size, color )
#self.filename= self.img.filename
#self.size= self.img.size
#self.fmt = self.img.format
#self.mode = self.img.mode
#self.info = self.img.info
def get_info(self):
"""get extra info about the image"""
return self.img.info
def get_size(self):
"""get the x,y size of the image"""
return self.img.size
def get_mode(self):
"""get image pixel format info"""
return self.img.mode
def get_colors(self):
"""return the quantity of pixels for each colors"""
return self.img.getcolors()
def show(self,title=None):
"""display the image in new window"""
self.img.show(title)
	def get_pixel(self,position):
		"""get the value of an image pixel"""
		return self.img.getpixel(position)
	def set_pixel(self,position,value):
		"""change the value of an image pixel"""
		self.img.putpixel(position,value)
def set_alpha(self,alpha):
"""change the transparency of the image"""
self.img.putalpha(alpha)
def resize(self,new_size,resample=NEAREST):
"""resize the image by x,y new size"""
self.img = self.img.resize(new_size,resample)
def transpose(self,direction):
"""rotate flip the image
FLIP_VERTICALLY adapt image for openGL
ROTATE_270 FLIP_VERTICALLY adapt image for numpy array"""
self.img = self.img.transpose(direction)
def offset(self,offset):
"""displace the image by x,y pixel move"""
self.img= ImageChops.offset(self.img,offset[X],offset[Y])
def crop(self,cut):
"""cut the image by left,top,right,bottom"""
self.img= self.img.crop(cut)
def convert(self,mode):
"""convert image into the given mode"""
# ImageOps.grayscale(self.img) = self.img.convert('L')
# image.convert(mode='F') # transforms the image values into float, but without putting them between 0.0 and 1.0
self.img= self.img.convert(mode)
def save(self,output_path):
"""save image in default format"""
self.save_png(output_path)
def save_gif(self,output_path):
"""save image in .gif format"""
self.img.save(output_path)
def save_bmp(self,output_path):
"""save image in .bmp format"""
self.img.save(output_path,compression='bmp_rle')
def save_tga(self,output_path):
"""save image in .tga format"""
self.img.save(output_path)
def save_tiff(self,output_path):
"""save image in .tif format"""
self.img.save(output_path,compression="tiff_deflate")
def save_png(self,output_path):
"""save image in .png format"""
self.img.save(output_path,"PNG",optimize=True)
def tile(self,scale):#scale mini is (1,1)
"""repeat the image horizontally and vertically"""
new_size=( self.img.size[X]*int(scale[X]) , self.img.size[Y]*int(scale[Y]) )
result = Image.new(self.img.mode,new_size)# create a new image
#print(self.img.size,scale)
for left in range(0,new_size[X],self.img.size[X]):
for top in range(0,new_size[Y],self.img.size[Y]):
#print(left, top)
result.paste(self.img, (left,top))
self.img= result
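# The paste() loop above amounts to simple repetition arithmetic; a
# pure-Python sketch of it on a nested-list "image" (hypothetical helper,
# not part of this class):

```python
def tile_grid(grid, scale):
    # repeat each row scale[0] times horizontally, then the whole block
    # scale[1] times vertically -- the same effect as the paste loop
    sx, sy = int(scale[0]), int(scale[1])
    return [list(row) * sx for _ in range(sy) for row in grid]

# a 1x2 "image" tiled (2,2) becomes a 2x4 grid
assert tile_grid([["a", "b"]], (2, 2)) == [
    ["a", "b", "a", "b"],
    ["a", "b", "a", "b"],
]
```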
    def get_gl_data(self):
        """return image data as an OpenGL image"""
        data = self.img.tobytes("raw", self.img.mode)  # tostring() has been removed; call tobytes() instead
        #data = self.img.getData()
        return data
    def get_array(self, tip):
        """return image data as a numpy array"""
        return numpy.asarray(self.img, dtype=numpy.dtype(tip))
    def mask(self, mask_img):
        """the grayscale mask_img pixels are used to set the image transparency"""
        self.img.putalpha(mask_img.img)
    def blend(self, other_img, mix_factor=0.5):
        """no change if mix_factor=0; if mix_factor=1 the image is completely replaced by other_img"""
        self.img = Image.blend(self.img, other_img.img, mix_factor)
    def compose(self, other_img, alpha_img):
        """same as blend, but uses the alpha pixel values of alpha_img instead of mix_factor"""
        self.img = Image.composite(self.img, other_img.img, alpha_img.img)
    def overwrite(self, other_img):
        """write the other image on top of the self image"""
        self.img = Image.alpha_composite(self.img, other_img.img)
    def smooth(self, quantum):
        """makes image edges and points less sharp"""
        for q in range(quantum):
            self.img = self.img.filter(ImageFilter.SMOOTH_MORE)
            #self.img = self.img.filter(ImageFilter.BLUR)

class Bitmap_File(Bitmap):
    """allows bitmap image file manipulation"""
    def __init__(self, image_file):
        """needs an image file pathname"""
        img = Image.open(image_file)
        ### verifies the file contents without decoding the image data; raises an exception on any problem
        img.verify()
        ### after using image.verify(), the image file needs to be reopened
        self.img = Image.open(image_file)

class Bitmap_Text(Bitmap):
    """allows writing text into a bitmap"""
    def __init__(self, font_name, font_size, background_color, text_color, contour_color, contour_size):
        """needs the font and color parameters used to render the text"""
        self.font = ImageFont.truetype(font=font_name, size=font_size)
        self.background_color = background_color
        self.text_color = text_color
        self.contour_color = contour_color
        self.contour_size = contour_size
        self.img = Image.new(RGBA_MODE, (1, 1), background_color)
    def write(self, text, dir, align, spacing):
        """create an appropriately sized image and draw the text into it with the previously configured font"""
        total_size = [0, 0]
        quantum = 0
        for line in text.splitlines():
            line_size = self.font.getsize(line)
            total_size[X] = max(total_size[X], line_size[X])
            total_size[Y] += line_size[Y]
            quantum += 1
            #print(line_size, total_size)
        total_size[X] += self.contour_size*2
        total_size[Y] += self.contour_size*2 + (quantum-1)*spacing
        #other_total_size = ImageDraw.textsize(text, font=self.font, spacing=spacing, direction=dir, stroke_width=self.contour_size)
        #print(other_total_size, total_size)
        self.img = Image.new(RGBA_MODE, total_size, self.background_color)
        draw = ImageDraw.Draw(self.img)
        draw.text((self.contour_size, self.contour_size), text, font=self.font, fill=self.text_color, spacing=spacing, direction=dir, align=align)  #, stroke_width=self.contour_size, stroke_fill=self.contour_color
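# The canvas-size computation in write() can be isolated as a pure
# function (hypothetical helper mirroring the loop above; sizes are
# (width, height) pixel tuples as returned by font.getsize()):

```python
def text_canvas_size(line_sizes, contour_size, spacing):
    # widest line sets the width, line heights stack vertically,
    # plus the contour border on every side and spacing between lines
    width = max(w for w, h in line_sizes)
    height = sum(h for w, h in line_sizes)
    width += contour_size * 2
    height += contour_size * 2 + (len(line_sizes) - 1) * spacing
    return (width, height)

# two lines of 40x10 and 60x10 px, 2 px contour, 4 px line spacing
assert text_canvas_size([(40, 10), (60, 10)], 2, 4) == (64, 28)
```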
2bd67d5e723b9c06ff50baa0fbd6ffdaf7fb05cb | 3,752 | py | Python | python/ts/flint/utils.py | mattomatic/flint | ee1dc08b5a7f2c84e41bfbc7a02e069d23d02c72 | [
"Apache-2.0"
] | 972 | 2016-10-25T20:56:50.000Z | 2022-03-23T06:05:59.000Z | python/ts/flint/utils.py | mattomatic/flint | ee1dc08b5a7f2c84e41bfbc7a02e069d23d02c72 | [
"Apache-2.0"
] | 66 | 2016-11-02T15:27:35.000Z | 2022-02-15T16:48:48.000Z | python/ts/flint/utils.py | jaewanbahk/flint | eda21faace03ed90258d1008071e9ac7033f5f48 | [
"Apache-2.0"
] | 218 | 2016-11-04T11:03:24.000Z | 2022-01-21T21:31:59.000Z | #
# Copyright 2017 TWO SIGMA OPEN SOURCE, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
_UNIT_TO_JUNIT = {
    "s": "SECONDS",
    "ms": "MILLISECONDS",
    "us": "MICROSECONDS",
    "ns": "NANOSECONDS"
}

def jsc(sc):
    """Returns the underlying Scala SparkContext

    :param sc: SparkContext
    :return: :class:`py4j.java_gateway.JavaObject` (org.apache.spark.SparkContext)
    """
    return sc._jsc.sc()

def jvm(sc):
    """Returns the PySpark JVM handle

    :param sc: SparkContext
    :return: :class:`py4j.java_gateway.JavaView`
    """
    return sc._jvm

def scala_object(jpkg, obj):
    return jpkg.__getattr__(obj + "$").__getattr__("MODULE$")

def scala_package_object(jpkg):
    return scala_object(jpkg, "package")

def pyutils(sc):
    """Returns a handle to ``com.twosigma.flint.rdd.PythonUtils``

    :param sc: SparkContext
    :return: :class:`py4j.java_gateway.JavaPackage` (com.twosigma.flint.rdd.PythonUtils)
    """
    return jvm(sc).com.twosigma.flint.rdd.PythonUtils

def copy_jobj(sc, obj):
    """Returns a Java object ``obj`` with an additional reference count

    :param sc: Spark Context
    :param obj: :class:`py4j.java_gateway.JavaObject`
    :return: ``obj`` (:class:`py4j.java_gateway.JavaObject`) with an additional reference count
    """
    return pyutils(sc).makeCopy(obj)

def to_list(lst):
    """Make sure the object is wrapped in a list

    :return: a ``list`` object, either lst or lst in a list
    """
    if isinstance(lst, str):
        lst = [lst]
    elif not isinstance(lst, list):
        try:
            lst = list(lst)
        except TypeError:
            lst = [lst]
    return lst
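# Since to_list() needs no SparkContext, its contract can be demonstrated
# standalone (the function is repeated here verbatim so the example is
# self-contained):

```python
def to_list(lst):
    # strings stay whole, other iterables are expanded,
    # everything else becomes a one-element list
    if isinstance(lst, str):
        lst = [lst]
    elif not isinstance(lst, list):
        try:
            lst = list(lst)
        except TypeError:
            lst = [lst]
    return lst

assert to_list("abc") == ["abc"]   # strings are NOT split into characters
assert to_list((1, 2)) == [1, 2]   # tuples are expanded
assert to_list(5) == [5]           # scalars are wrapped
```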
def list_to_seq(sc, lst, preserve_none=False):
    """Converts a Python object to a Scala Seq via the PythonUtils Java package

    If lst is a Python None, returns None or an empty Scala Seq (depending on preserve_none).
    If lst is a Python object, such as str, returns a Scala Seq containing the object.
    If lst is a Python tuple/list, returns a Scala Seq containing the objects in the tuple/list.

    :return: A copy of ``lst`` as a ``scala.collection.Seq``
    """
    if lst is None:
        if preserve_none:
            return None
        else:
            lst = []
    return jvm(sc).org.apache.spark.api.python.PythonUtils.toSeq(to_list(lst))

def py_col_to_scala_col(sc, py_col):
    converters = {
        list: list_to_seq,
        tuple: list_to_seq
    }
    convert = converters.get(type(py_col))
    if convert:
        return convert(sc, py_col)
    else:
        return py_col

def junit(sc, unit):
    """Converts a Pandas time unit to a scala.concurrent.duration object

    :return: Scala equivalent of ``unit`` as a ``scala.concurrent.duration`` object
    """
    if unit not in _UNIT_TO_JUNIT:
        raise ValueError("unit must be in {}".format(_UNIT_TO_JUNIT.keys()))
    return scala_package_object(jvm(sc).scala.concurrent.duration).__getattr__(_UNIT_TO_JUNIT[unit])()

def jschema(sc, schema):
    """Converts a Python schema (StructType) to a Scala schema ``org.apache.spark.sql.types.StructType``

    :return: :class:``org.apache.spark.sql.types.StructType``
    """
    import json
    return jvm(sc).org.apache.spark.sql.types.StructType.fromString(json.dumps(schema.jsonValue()))
2bdb91041eec35c1c58197d76e4a723d39aceee0 | 18,798 | py | Python | ttbd/ttbl/quartus.py | inakypg/tcf | 569e21b25c8ee72ebad0c80d0a7de0714411185f | [
"Apache-2.0"
] | 1 | 2018-08-31T06:48:14.000Z | 2018-08-31T06:48:14.000Z | ttbd/ttbl/quartus.py | inakypg/tcf | 569e21b25c8ee72ebad0c80d0a7de0714411185f | [
"Apache-2.0"
] | null | null | null | ttbd/ttbl/quartus.py | inakypg/tcf | 569e21b25c8ee72ebad0c80d0a7de0714411185f | [
"Apache-2.0"
] | null | null | null | #! /usr/bin/python3
#
# Copyright (c) 2021 Intel Corporation
#
# SPDX-License-Identifier: Apache-2.0
#
import copy
import os
import subprocess
import commonl
import ttbl
import ttbl.images
import ttbl.power
class pgm_c(ttbl.images.flash_shell_cmd_c):
"""Flash using Intel's Quartus PGM tool
This allows to flash images to an Altera MAX10, using the Quartus
tools, freely downloadable from
https://www.intel.com/content/www/us/en/collections/products/fpga/software/downloads.html?s=Newest
Exports the following interfaces:
- power control (using any AC power switch, such as the
:class:`Digital Web Power Switch 7 <ttbl.pc.dlwps7>`)
- serial console
- image (in hex format) flashing (using the Quartus Prime tools
package)
Multiple instances at the same time are supported; however, due to
the JTAG interface not exporting a serial number, addressing has
to be done by USB path, which is risky (as it will change when the
cable is plugged to another port or might be enumerated in a
different number).
:param str usb_serial_number: USB serial number of the USB device to use
(USB-BlasterII or similar)
:param dict image_map:
    :param str name: (optional; default 'Intel Quartus PGM #<DEVICEID>')
instrument's name.
:param dict args: (optional) dictionary of extra command line options to
*quartus_pgm*; these are expanded with the target keywords with
*%(FIELD)s* templates, with fields being the target's
:ref:`metadata <finding_testcase_metadata>`:
FIXME: move to common flash_shell_cmd_c
:param dict jtagconfig: (optional) jtagconfig --setparam commands
to run before starting.
These are expanded with the target keywords with
*%(FIELD)s* templates, with fields being the target's
:ref:`metadata <finding_testcase_metadata>` and then run as::
jtagconfig --setparam CABLENAME KEY VALUE
:param int tcp_port: (optional, default *None*) if a TCP port
number is given, it is a assumed the flashing server is in
localhost in the given TCP port.
    :param str sibling_serial_number: (optional, default *None*) USB serial
      number of the USB device that is a sibling to the one defined by
      usb_serial_number
    :param int usb_port: (optional, default *None*) port that the USB device is
      connected to, used in combination with sibling_serial_number to find
      the USB path for devices that do not have unique serial numbers (USB
      Blaster I)
Other parameters described in :class:ttbl.images.impl_c.
**Command line reference**
https://www.intel.com/content/dam/www/programmable/us/en/pdfs/literature/manual/tclscriptrefmnl.pdf
Section Quartus_PGM (2-50)
**System setup**
- Download and install Quartus Programmer::
$ wget http://download.altera.com/akdlm/software/acdsinst/20.1std/711/ib_installers/QuartusProgrammerSetup-20.1.0.711-linux.run
# chmod a+x QuartusProgrammerSetup-20.1.0.711-linux.run
# ./QuartusProgrammerSetup-20.1.0.711-linux.run --unattendedmodeui none --mode unattended --installdir /opt/quartus --accept_eula 1
- if installing to a different location than */opt/quartus*,
adjust the value of :data:`path` in a FIXME:ttbd configuration
file.
**Troubleshooting**
When it fails to flash, the error log is reported in the server in
a file called *flash-COMPONENTS.log* in the target's state
directory (FIXME: we need a better way for this--the admin shall
be able to read it, but not the users as it might leak sensitive
information?).
Common error messages:
- *Error (213019): Can't scan JTAG chain. Error code 87*
Also seen when manually running in the server::
$ /opt/quartus/qprogrammer/bin/jtagconfig
1) USB-BlasterII [3-1.4.4.3]
Unable to read device chain - JTAG chain broken
In many cases this has been:
- a powered off main board: power it on
- a misconnected USB-BlasterII: reconnect properly
- a broken USB-BlasterII: replace unit
- *Error (209012): Operation failed*
this usually happens when flashing one component of a multiple
component chain; the log might read something like::
Info (209060): Started Programmer operation at Mon Jul 20 12:05:22 2020
Info (209017): Device 2 contains JTAG ID code 0x038301DD
Info (209060): Started Programmer operation at Mon Jul 20 12:05:22 2020
Info (209016): Configuring device index 2
Info (209017): Device 2 contains JTAG ID code 0x018303DD
Info (209007): Configuration succeeded -- 1 device(s) configured
Info (209011): Successfully performed operation(s)
Info (209061): Ended Programmer operation at Mon Jul 20 12:05:22 2020
Error (209012): Operation failed
Info (209061): Ended Programmer operation at Mon Jul 20 12:05:22 2020
Error: Quartus Prime Programmer was unsuccessful. 1 error, 0 warnings
This case has been found to be because the **--bgp** option is
needed (which seems to map to the *Enable Realtime ISP
programming* in the Quartus UI, *quartus_pgmw*)
- *Warning (16328): The real-time ISP option for Max 10 is
selected. Ensure all Max 10 devices being programmed are in user
mode when requesting this programming option*
Followed by:
*Error (209012): Operation failed*
This case comes when a previous flashing process was interrupted
half way or the target is corrupted.
It needs a special one-time recovery; currently the
workaround seems to run the flashing with out the *--bgp* switch
that as of now is hardcoded.
FIXME: move the --bgp and --mode=JTAG switches to the args (vs
hardcoded) so a recovery target can be implemented as
NAME-nobgp
*Using Quartus tool with a remote jtagd*
The service port for *jtagd* can be tunneled in and used by the
Quartus toolsuite::
$ tcf property-get r013s001 interfaces.power.jtagd.tcp_port
5337
$ tcf power-on -c jtagd TARGET
      $ tcf tunnel-add TARGET 5337 tcp 127.0.0.1
SERVERNAME:1234
Now the Quartus Qprogrammer tools need to be told which server to
add::
      $ jtagconfig --addserver SERVERNAME:1234 ""
(second entry is an empty password); this adds an entry to
*~/.jtagd.conf*::
# /home/USERNAME/.jtag.conf
Remote1 {
Host = "SERVERNAME:1234";
Password = "";
}
Note the port number changes with each tunnel, you will have to
*jtagconfig --addserver* and delete the old one (you can edit the
file by hand too).
Now list remote targets::
$ jtagconfig
1) USB-BlasterII on SERVERNAME:1234 [3-1.4.1]
031050DD 10M50DA(.|ES)/10M50DC
031040DD 10M25D(A|C)
Note this connection is open to anyone until the tunnel is removed
or the allocation is released with *tcf alloc-rm* or
equivalent. *PENDING* use SSL to secure access.
[ see also for the Quartus GUI, follow
https://www.intel.com/content/www/us/en/programmable/quartushelp/13.0/mergedProjects/program/pgm/pgm_pro_add_server.htm ]
**Quartus Lite**
Download from https://www.intel.com/content/www/us/en/software-kit/684215/intel-quartus-prime-lite-edition-design-software-version-21-1-for-linux.html?
Install with::
$ tar xf Quartus-lite-21.1.0.842-linux.tar
$ cd components
$ chmod a+x ./Quartus-lite-21.1.0.842-linux.tar
$ ./Quartus-lite-21.1.0.842-linux.tar
Quartus will use the same *~/.jtagd.conf* if you have used
*jtagconfig* to configure as above
1. Start Quartus::
$ INSTALLPATH/intelFPGA_lite/21.1/quartus/bin/quartus
2. Go to Programmer > Edit > Hardware Setup
3. Click on *Add Hardware*
4. Enter as *Server Name* and *Server Port* the name of the server
that is doing the tunnel (as printed by *tcf tunnel-add*
above); leave the password blank.
5. Click *OK*
**Troubleshooting**
- can't connect to port::
$ ./jtagconfig
1) Remote server SERVERNAME:1234: Unable to connect
- ensure jtagd in the target is on
- ensure the tunnel is on
"""
#: Path to *quartus_pgm*
#:
#: We need to use an ABSOLUTE PATH if the tool is not in the
#: normal search path (which usually won't).
#:
#: Change by setting, in a :ref:`server configuration file
#: <ttbd_configuration>`:
#:
#: >>> ttbl.quartus.pgm_c.path = "/opt/quartus/qprogrammer/bin/quartus_pgm"
#:
#: or for a single instance that then will be added to config:
#:
#: >>> imager = ttbl.quartus.pgm_c(...)
#: >>> imager.path = "/opt/quartus/qprogrammer/bin/quartus_pgm"
path = "/opt/quartus/qprogrammer/bin/quartus_pgm"
path_jtagconfig = "/opt/quartus/qprogrammer/bin/jtagconfig"
def __init__(self, usb_serial_number, image_map, args = None, name = None,
jtagconfig = None, tcp_port = None,
sibling_serial_number = None, usb_port = None,
**kwargs):
assert isinstance(usb_serial_number, str)
commonl.assert_dict_of_ints(image_map, "image_map")
commonl.assert_none_or_dict_of_strings(jtagconfig, "jtagconfig")
assert name == None or isinstance(name, str)
assert tcp_port == None or isinstance(tcp_port, int)
self.usb_serial_number = usb_serial_number
self.tcp_port = tcp_port
self.image_map = image_map
self.jtagconfig = jtagconfig
self.sibling_serial_number = sibling_serial_number
self.usb_port = usb_port
if args:
commonl.assert_dict_of_strings(args, "args")
self.args = args
else:
self.args = {}
cmdline = [
"stdbuf", "-o0", "-e0", "-i0",
self.path,
# FIXME: move this to args, enable value-less args (None)
"--bgp", # Real time background programming
"--mode=JTAG", # this is a JTAG
# when using a server, if the target is called
# SOMETHING in SERVERNAME:PORT CABLENAME, it seems PGM
# goes straight there. Weird
"-c", "%(device_path)s", # will resolve in flash_start()
# in flash_start() call we'll map the image names to targets
# to add these
#
#'--operation=PVB;%(image.NAME)s@1',
#'--operation=PVB;%(image.NAME)s@2',
#...
# (P)rogram (V)erify, (B)lank-check
#
# note like this we can support burning multiple images into the
# same chain with a single call
]
if args:
for arg, value in args.items():
if value != None:
cmdline += [ arg, value ]
# we do this because in flash_start() we need to add
# --operation as we find images we are supposed to flash
self.cmdline_orig = cmdline
ttbl.images.flash_shell_cmd_c.__init__(self, cmdline, cwd = '%(file_path)s',
**kwargs)
if name == None:
self.name = "quartus"
self.upid_set(
f"Intel Quartus PGM @ USB#{usb_serial_number}",
usb_serial_number = usb_serial_number)
def flash_start(self, target, images, context):
# Finalize preparing the command line for flashing the images
# find the device path; quartus_pgm doesn't seem to be able to
# address by serial and expects a cable name as 'PRODUCT NAME
# [PATH]', like 'USB BlasterII [1-3.3]'; we can't do this on
# object creation because the USB path might change when we power
# it on/off (rare, but could happen). Since USB Blaster I do not
# have unique serial numbers we use a combination of usb_port
# and sibling_serial_number to find the correct usb_path
if self.usb_port != None:
usb_path, _vendor, product = ttbl.usb_serial_to_path(
self.sibling_serial_number, self.usb_port)
else:
usb_path, _vendor, product = ttbl.usb_serial_to_path(
self.usb_serial_number)
if self.tcp_port:
# server based cable name
device_path = f"{product} on localhost:{self.tcp_port} [{usb_path}]"
jtag_config_filename = f"{target.state_dir}/jtag-{'_'.join(images.keys())}.conf"
# Create the jtag client config file to ensure that
# the correct jtag daemon is connected to, then use the
# environment variable QUARTUS_JTAG_CLIENT_CONFIG to have
# the quartus software find it
with open(jtag_config_filename, "w+") as jtag_config:
jtag_config.write(
f'ReplaceLocalJtagServer = "localhost:{self.tcp_port}";')
self.env_add["QUARTUS_JTAG_CLIENT_CONFIG"] = jtag_config_filename
else:
# local cable name, starts sever on its own
device_path = f"{product} [{usb_path}]"
context['kws'] = {
# HACK: we assume all images are in the same directory, so
# we are going to cwd there (see in __init__ how we set
# cwd to %(file_path)s. Reason is some of our paths might
# include @, which the tool considers illegal as it uses
# it to separate arguments--see below --operation
'file_path': os.path.dirname(list(images.values())[0]),
'device_path': device_path,
# flash_shell_cmd_c.flash_start() will add others
}
# for each image we are burning, map it to a target name in
# the cable (@NUMBER)
# make sure we don't modify the originals
cmdline = copy.deepcopy(self.cmdline_orig)
for image_type, filename in images.items():
target_index = self.image_map.get(image_type, None)
            # pass only the relative filename, as we are going to
# change working dir into the path (see above in
# context[kws][file_path]
cmdline.append("--operation=PVB;%s@%d" % (
os.path.basename(filename), target_index))
# now set it for flash_shell_cmd_c.flash_start()
self.cmdline = cmdline
if self.jtagconfig:
for option, value in self.jtagconfig.items():
cmdline = [
self.path_jtagconfig,
"--addserver", f"localhost:{self.tcp_port}", "", # empty password
"--setparam",
device_path,
option, value
]
target.log.info("running per-config: %s" % " ".join(cmdline))
subprocess.check_output(
cmdline, shell = False, stderr = subprocess.STDOUT)
ttbl.images.flash_shell_cmd_c.flash_start(self, target, images, context)
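# The per-image '--operation' argument construction above can be sketched
# without any flashing hardware (hypothetical helper, not part of this
# driver; PVB = Program, Verify, Blank-check):

```python
import os.path

def operation_args(images, image_map):
    # map each image type to its device index in the JTAG chain and
    # build the quartus_pgm "--operation=PVB;FILE@INDEX" arguments
    args = []
    for image_type, filename in images.items():
        target_index = image_map[image_type]
        args.append("--operation=PVB;%s@%d"
                    % (os.path.basename(filename), target_index))
    return args

assert operation_args({"fw": "/images/fw.hex"}, {"fw": 1, "fpga": 2}) \
    == ["--operation=PVB;fw.hex@1"]
```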
class jtagd_c(ttbl.power.daemon_c):
"""Driver for the jtag daemon
This driver starts the jtag daemon on the server for a specific
USB Blaster II
Does not override any of the default methods except for verify
    **Arguments**
:param str usb_serial_number: serial number of the USB Blaster II
:param int tcp_port: (1024 - 65536) Number of the TCP port on
localhost where the daemon will listen
    :param str jtagd_path: (optional) override :data:`jtagd_path`;
:param str explicit: (optional; default *off*) control when this
is started on/off:
- *None*: for normal behaviour; component will be
powered-on/started with the whole power rail
- *both*: explicit for both powering on and off: only
      power-on/start and power-off/stop if explicitly called by
      name
    - *on*: explicit for powering on: only power-on/start if explicitly
      powered on by name, power off normally
    - *off*: explicit for powering off: only power-off/stop if explicitly
      powered off by name, power on normally
    By default it is set to *off*, so that when the target is powered
    off, existing network connections to the daemon are maintained.
Any other arguments as taken by :class:ttbl.power.daemon_c and
:class:ttbl.power.impl_c.
"""
jtagd_path = "/opt/quartus/qprogrammer/bin/jtagd"
def __init__(self, usb_serial_number, tcp_port, jtagd_path = None,
check_path = None, explicit = "off", **kwargs):
assert isinstance(usb_serial_number, str), \
"usb_serial_number: expected a string, got %s" % type(usb_serial_number)
assert isinstance(tcp_port, int), \
"tcp_port: expected an integer between 1024 and 65536, got %s" \
% type(usb_serial_number)
if jtagd_path:
self.jtagd_path = jtagd_path
assert isinstance(self.jtagd_path, str), \
"openipc_path: expected a string, got %s" % type(jtagd_path)
self.usb_serial_number = usb_serial_number
self.tcp_port = tcp_port
cmdline = [
self.jtagd_path,
"--no-config",
"--auto-detect-filter", usb_serial_number,
"--port", str(tcp_port),
"--debug",
"--foreground",
]
ttbl.power.daemon_c.__init__(
self, cmdline, precheck_wait = 0.5, mkpidfile = True,
name = "jtagd", explicit = explicit,
# ...linux64/jtagd renames itself to jtagd and it makes it hard to kill
path = "jtagd",
check_path = "/opt/quartus/qprogrammer/linux64/jtagd",
**kwargs)
# Register the instrument like this, so it matches pgm_c and
# others and they all point to the same instrument
self.upid_set(
f"Intel Quartus PGM @ USB#{usb_serial_number}",
usb_serial_number = usb_serial_number)
def target_setup(self, target, iface_name, component):
target.fsdb.set(f"interfaces.{iface_name}.{component}.tcp_port",
self.tcp_port)
        # Set the local port that can be reached via tunneling
target.tunnel.allowed_local_ports.add(("127.0.0.1", "tcp",
self.tcp_port))
ttbl.power.daemon_c.target_setup(self, target, iface_name, component)
def verify(self, target, component, cmdline_expanded):
pidfile = os.path.join(target.state_dir, component + "-jtagd.pid")
return commonl.process_alive(pidfile, self.check_path) \
and commonl.tcp_port_busy(self.tcp_port)
def on(self, target, component):
return ttbl.power.daemon_c.on(self, target, component)
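# A minimal server-configuration sketch wiring both drivers to the same
# cable (illustrative only: the serial number, TCP port and image map are
# hypothetical and must be adapted to the local ttbd configuration):

```python
# conf_example.py -- hypothetical ttbd configuration fragment
import ttbl.quartus

jtagd = ttbl.quartus.jtagd_c("USBBLASTERSERIAL", tcp_port = 5337)
imager = ttbl.quartus.pgm_c(
    "USBBLASTERSERIAL",
    image_map = { "fw": 1, "fpga": 2 },  # image type -> JTAG chain index
    tcp_port = 5337)                     # flash through the jtagd above
```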
2bdcbb9196f7ddd57e7b4ad584ef8f3fd81fbbd6 | 565 | py | Python | delivery/delivery/exts/cli/__init__.py | all0cer/flask | a72bee18fd8d4ccf0f16e0e637f7e0b1779e7d85 | [
"Unlicense"
] | null | null | null | delivery/delivery/exts/cli/__init__.py | all0cer/flask | a72bee18fd8d4ccf0f16e0e637f7e0b1779e7d85 | [
"Unlicense"
] | null | null | null | delivery/delivery/exts/cli/__init__.py | all0cer/flask | a72bee18fd8d4ccf0f16e0e637f7e0b1779e7d85 | [
"Unlicense"
] | null | null | null | from enum import Flag
import click
from delivery.exts.db import db
from delivery.exts.db import models
def init_app(app):
@app.cli.command()
def create_db():
db.create_all()
@app.cli.command()
@click.option("--email", "-e")
@click.option("--passwd", "-p")
@click.option("--admin", "-a", is_flag=True, default=False)
def add_new_user(email, passwd, admin):
        user = models.User(
email = email,
passwd = passwd,
admin = admin
)
db.session.add(user)
        db.session.commit()
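# With init_app() registered on the Flask application factory (hypothetical
# setup), click derives the CLI command names from the function names:

```shell
$ flask create-db
$ flask add-new-user --email admin@example.com --passwd secret --admin
```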
2bf2e3db2a5c23652d59aea6af613c4b44bec6ab | 948 | py | Python | Medium/54_spiralOrder.py | a-shah8/LeetCode | a654e478f51b2254f7b49055beba6b5675bc5223 | [
"MIT"
] | 1 | 2021-06-02T15:03:41.000Z | 2021-06-02T15:03:41.000Z | Medium/54_spiralOrder.py | a-shah8/LeetCode | a654e478f51b2254f7b49055beba6b5675bc5223 | [
"MIT"
] | null | null | null | Medium/54_spiralOrder.py | a-shah8/LeetCode | a654e478f51b2254f7b49055beba6b5675bc5223 | [
"MIT"
] | null | null | null | from typing import List
class Solution:
def spiralOrder(self, matrix: List[List[int]]) -> List[int]:
result = []
rows, columns = len(matrix), len(matrix[0])
up = left = 0
right = columns-1
down = rows-1
while len(result) < rows*columns:
for col in range(left, right+1):
result.append(matrix[up][col])
for row in range(up+1, down+1):
result.append(matrix[row][right])
if up != down:
for col in range(right-1, left-1, -1):
result.append(matrix[down][col])
if left != right:
for row in range(down-1, up, -1):
result.append(matrix[row][left])
up += 1
down -= 1
left += 1
right -= 1
return result
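# A standalone copy of the traversal (hypothetical free-function name) that
# can be exercised without the LeetCode harness:

```python
from typing import List

def spiral_order(matrix: List[List[int]]) -> List[int]:
    # same layer-by-layer clockwise walk as Solution.spiralOrder above
    result = []
    rows, columns = len(matrix), len(matrix[0])
    up = left = 0
    right, down = columns - 1, rows - 1
    while len(result) < rows * columns:
        for col in range(left, right + 1):
            result.append(matrix[up][col])
        for row in range(up + 1, down + 1):
            result.append(matrix[row][right])
        if up != down:
            for col in range(right - 1, left - 1, -1):
                result.append(matrix[down][col])
        if left != right:
            for row in range(down - 1, up, -1):
                result.append(matrix[row][left])
        up += 1
        down -= 1
        left += 1
        right -= 1
    return result

assert spiral_order([[1, 2, 3], [4, 5, 6], [7, 8, 9]]) == [1, 2, 3, 6, 9, 8, 7, 4, 5]
```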
2bf566fec30c9deeff3c69426f515c5625a28010 | 1,661 | py | Python | scripts/run-and-rename.py | mfs6174/Deep6174 | 92e2ceb48134e0cf003f130aef8d838a7a16c27d | [
"Apache-2.0"
] | null | null | null | scripts/run-and-rename.py | mfs6174/Deep6174 | 92e2ceb48134e0cf003f130aef8d838a7a16c27d | [
"Apache-2.0"
] | null | null | null | scripts/run-and-rename.py | mfs6174/Deep6174 | 92e2ceb48134e0cf003f130aef8d838a7a16c27d | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python2
# -*- coding: UTF-8 -*-
# File: run-and-rename.py
# Date: Thu Sep 18 15:43:36 2014 -0700
# Author: Yuxin Wu <ppwwyyxxc@gmail.com>
import numpy as np
from scipy.misc import imread, imsave
from itertools import izip
import sys, os
import shutil
import os.path
import glob
if len(sys.argv) != 3:
print "Usage: {0} <input directory with images> <model>".format(sys.argv[0])
sys.exit(0)
sys.path.insert(0, os.path.realpath(os.path.join(os.path.dirname(__file__), '../')))
from network_runner import NetworkRunner, get_nn
from lib.imageutil import get_image_matrix
from dataio import read_data
from lib.progress import Progressor
input_dir = sys.argv[1]
output_dir = os.path.join(input_dir, 'predicted')
shutil.rmtree(output_dir, ignore_errors=True)
os.mkdir(output_dir)
print "Reading images from {0}".format(input_dir)
print "Writing predicted results to {0}".format(output_dir)
model_file = sys.argv[2]
nn = get_nn(model_file)
print "Running network with model {0}".format(model_file)
# Run the network against a directory of images,
# and put predicted label in the filename
tot, corr = 0, 0
for f in glob.glob(input_dir + '/*'):
if not os.path.isfile(f):
continue
img = imread(f) / 255.0
pred = nn.predict(img)
label = f.split('-')[-1].split('.')[0]
new_fname = "{:04d}:{}-{},{}.png".format(tot, label, pred[0],
''.join(map(str, pred[1:])))
imsave(os.path.join(output_dir, new_fname), img)
tot += 1
corr += label == ''.join(map(str, pred[1:1+pred[0]]))
if tot > 0 and tot % 1000 == 0:
print "Progress:", tot
print corr, tot
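# The predicted-filename scheme above ("NNNN:label-count,digits.png") and
# the correctness check can be verified in isolation (sample values are
# hypothetical):

```python
idx = 7
true_label = "2345"
preds = [4, 2, 3, 4, 5]   # preds[0] = digit count, preds[1:] = digits
fname = "{:04d}:{}-{},{}.png".format(
    idx, true_label, preds[0], ''.join(map(str, preds[1:])))
assert fname == "0007:2345-4,2345.png"
# a prediction counts as correct when its first preds[0] digits match the label
assert true_label == ''.join(map(str, preds[1:1 + preds[0]]))
```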
2bf7cbd3af3aad19347edaabc6dc3814ebd7e30c | 950 | py | Python | wk8_hw/ex3_create_2_new_devs.py | philuu12/PYTHON_4_NTWK_ENGRS | ac0126ed687a5201031a6295d0094a536547cb92 | [
"Apache-2.0"
] | 1 | 2016-03-01T14:39:17.000Z | 2016-03-01T14:39:17.000Z | wk8_hw/ex3_create_2_new_devs.py | philuu12/PYTHON_4_NTWK_ENGRS | ac0126ed687a5201031a6295d0094a536547cb92 | [
"Apache-2.0"
] | null | null | null | wk8_hw/ex3_create_2_new_devs.py | philuu12/PYTHON_4_NTWK_ENGRS | ac0126ed687a5201031a6295d0094a536547cb92 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
"""
3. Create two new test NetworkDevices in the database. Use both direct
object creation and the .get_or_create() method to create the devices.
"""
from net_system.models import NetworkDevice
import django
def main():
django.setup()
brocade_rtr1 = NetworkDevice(
device_name='Brocade-rtr1',
device_type='fc_san',
ip_address='50.76.53.28',
port=8022,
)
hp_sw1 = NetworkDevice(
device_name='HP-sw1',
device_type='stratus',
ip_address='50.76.53.29',
port=8022,
)
# Save new device information in database
brocade_rtr1.save()
hp_sw1.save()
# Print out devices just added
for a_dev in (brocade_rtr1, hp_sw1):
print a_dev
print "Display devices in the database"
devices = NetworkDevice.objects.all()
for a_device in devices:
print a_device.device_name
if __name__ == "__main__":
main()
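The docstring asks for `.get_or_create()` as well, but this script only uses direct object creation. Django's manager method returns an `(object, created)` pair; a minimal in-memory stand-in (no Django required, class and device names are hypothetical) showing the idiom:

```python
class FakeDeviceManager(object):
    """In-memory stand-in for a Django model manager, for illustration only."""

    def __init__(self):
        self._store = {}

    def get_or_create(self, **kwargs):
        # Return (existing_object, False) or (new_object, True), like Django does.
        key = tuple(sorted(kwargs.items()))
        if key in self._store:
            return self._store[key], False
        self._store[key] = dict(kwargs)
        return self._store[key], True


objects = FakeDeviceManager()
dev, created = objects.get_or_create(device_name='Juniper-srx1', device_type='srx')
dev_again, created_again = objects.get_or_create(device_name='Juniper-srx1', device_type='srx')
```

The second call returns the same object with `created` set to false, which is exactly how the real `NetworkDevice.objects.get_or_create(...)` avoids duplicate rows.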

# ==== FILE: applications/MultilevelMonteCarloApplication/external_libraries/XMC/xmc/methodDefs_momentEstimator/updateCombinedPowerSums.py (repo: HubertBalcerzak/Kratos, license: BSD-4-Clause) ====

# Import PyCOMPSs
# from exaqute.ExaquteTaskPyCOMPSs import *   # to execute with runcompss
# from exaqute.ExaquteTaskHyperLoom import *  # to execute with the IT4 scheduler
from exaqute.ExaquteTaskLocal import *        # to execute with python3


def updatePowerSumsOrder1Dimension0():
    pass

@ExaquteTask(samples={Type: COLLECTION_IN, Depth: 4}, returns=1)
def updatePowerSumsOrder1Dimension0_Task(samples, *args):
    return updatePowerSumsOrder1Dimension0(samples, *args)


def updatePowerSumsOrder2Dimension0(old_sample_counter, samples, power_sum_1, power_sum_2):
    sample_counter = old_sample_counter
    if type(samples) is tuple:
        samples = [samples]
    for i in range(len(samples)):
        sample = samples[i][0]
        if power_sum_1 is None:
            power_sum_1 = sample[0][0]
            power_sum_2 = sample[1][0]
            sample_counter = sample_counter + sample[-1]
        else:
            power_sum_1 = power_sum_1 + sample[0][0]
            power_sum_2 = power_sum_2 + sample[1][0]
            sample_counter = sample_counter + sample[-1]
    return sample_counter, power_sum_1, power_sum_2

@ExaquteTask(samples={Type: COLLECTION_IN, Depth: 4}, returns=3)
def updatePowerSumsOrder2Dimension0_Task(counter, samples, *args):
    return updatePowerSumsOrder2Dimension0(counter, samples, *args)


def updatePowerSumsOrder10Dimension0(old_sample_counter, samples, power_sum_1, power_sum_2, power_sum_3, power_sum_4, power_sum_5, power_sum_6, power_sum_7, power_sum_8, power_sum_9, power_sum_10):
    sample_counter = old_sample_counter
    if type(samples) is tuple:
        samples = [samples]
    for i in range(len(samples)):
        sample = samples[i][0]
        if power_sum_1 is None:
            power_sum_1 = sample[0][0]
            power_sum_2 = sample[1][0]
            power_sum_3 = sample[2][0]
            power_sum_4 = sample[3][0]
            power_sum_5 = sample[4][0]
            power_sum_6 = sample[5][0]
            power_sum_7 = sample[6][0]
            power_sum_8 = sample[7][0]
            power_sum_9 = sample[8][0]
            power_sum_10 = sample[9][0]
            sample_counter = sample_counter + sample[-1]
        else:
            power_sum_1 = power_sum_1 + sample[0][0]
            power_sum_2 = power_sum_2 + sample[1][0]
            power_sum_3 = power_sum_3 + sample[2][0]
            power_sum_4 = power_sum_4 + sample[3][0]
            power_sum_5 = power_sum_5 + sample[4][0]
            power_sum_6 = power_sum_6 + sample[5][0]
            power_sum_7 = power_sum_7 + sample[6][0]
            power_sum_8 = power_sum_8 + sample[7][0]
            power_sum_9 = power_sum_9 + sample[8][0]
            power_sum_10 = power_sum_10 + sample[9][0]
            sample_counter = sample_counter + sample[-1]
    return sample_counter, power_sum_1, power_sum_2, power_sum_3, power_sum_4, power_sum_5, power_sum_6, power_sum_7, power_sum_8, power_sum_9, power_sum_10

@ExaquteTask(samples={Type: COLLECTION_IN, Depth: 4}, returns=11)
def updatePowerSumsOrder10Dimension0_Task(counter, samples, *args):
    return updatePowerSumsOrder10Dimension0(counter, samples, *args)


def updatePowerSumsOrder2Dimension1(old_sample_counter, samples, power_sum_upper_1, power_sum_lower_1, power_sum_upper_2, power_sum_lower_2):
    sample_counter = old_sample_counter
    if type(samples) is tuple:
        samples = [samples]
    for i in range(len(samples)):
        sample_upper = samples[i][0]
        if type(samples[i][1]) is list:  # index > 0
            sample_lower = samples[i][1]
        else:  # index == 0
            sample_lower = [[0.0], [0.0]]
        if power_sum_upper_1 is None:
            power_sum_upper_1 = sample_upper[0][0]
            power_sum_upper_2 = sample_upper[1][0]
            power_sum_lower_1 = sample_lower[0][0]
            power_sum_lower_2 = sample_lower[1][0]
            sample_counter = sample_counter + sample_upper[-1]
        else:
            power_sum_upper_1 = power_sum_upper_1 + sample_upper[0][0]
            power_sum_upper_2 = power_sum_upper_2 + sample_upper[1][0]
            power_sum_lower_1 = power_sum_lower_1 + sample_lower[0][0]
            power_sum_lower_2 = power_sum_lower_2 + sample_lower[1][0]
            sample_counter = sample_counter + sample_upper[-1]
    return sample_counter, power_sum_upper_1, power_sum_lower_1, power_sum_upper_2, power_sum_lower_2

@ExaquteTask(samples={Type: COLLECTION_IN, Depth: 4}, returns=5)
def updatePowerSumsOrder2Dimension1_Task(counter, samples, *args):
    return updatePowerSumsOrder2Dimension1(counter, samples, *args)
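Stripped of the `@ExaquteTask` decorators, the order-2 update is just a running accumulation of the power sums S1 = Σx and S2 = Σx². A self-contained sketch using the sample layout implied by the code above (sample[0][0] holds the power-1 contribution, sample[1][0] the power-2 contribution, sample[-1] the sample count):

```python
def update_power_sums_order2(counter, samples, s1, s2):
    # Same control flow as updatePowerSumsOrder2Dimension0, minus the task machinery.
    if isinstance(samples, tuple):
        samples = [samples]
    for entry in samples:
        sample = entry[0]
        if s1 is None:
            s1, s2 = sample[0][0], sample[1][0]
        else:
            s1 = s1 + sample[0][0]
            s2 = s2 + sample[1][0]
        counter = counter + sample[-1]
    return counter, s1, s2


# Two batches contributing (x, x**2, count) = (3, 9, 1) and (2, 4, 1).
batches = [([[3.0], [9.0], 1],), ([[2.0], [4.0], 1],)]
counter, s1, s2 = update_power_sums_order2(0, batches, None, None)
```

After both batches, `counter` is 2, `s1` is 5.0 and `s2` is 13.0: exactly the incremental bookkeeping the higher-order variants repeat for each additional power.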

# ==== FILE: lista02Exec01.py (repo: marcelocmedeiros/PrimeiraAvalaizcaoPython, license: MIT) ====

#MARCELO CAMPOS DE MEDEIROS
#ADS UNIFIP P1 2020
#LIST 02
'''
1 - Write a program that asks the user for the price of a liter of fuel
(e.g. 4.75) and how much money they want to spend on fuel (e.g. 50.00).
Compute how many liters of fuel the user gets for that amount.
'''
valor_gas = float(input('What is the price of fuel? R$ '))
num = float(input('How much do you want to spend? R$ '))
valor_tot = num / valor_gas
print('-=' * 35)
print(f'You paid R${num:.2f} and received {valor_tot:.2f} liters of fuel.')
print('-=' * 35)
print(' THANK YOU, COME AGAIN! ')
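The liters calculation is a single division of money by the price per liter; a minimal check with the example values from the docstring:

```python
price_per_liter = 4.75   # example price from the exercise statement
money = 50.00            # example amount to spend
liters = money / price_per_liter
```

With these inputs the program reports roughly 10.53 liters.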

# ==== FILE: spiders/spider.py (repo: gt11799/xueqiu_spider, license: MIT) ====

# -*- coding: utf-8 -*-
import sys
import time
import json
import pickle
import hashlib
import requests
from urlparse import urljoin

from config import *
from spiders.common import *
from spiders.html_parser import *
from logs.log import logger

reload(sys)
sys.setdefaultencoding('utf8')


class Spider(object):

    def __init__(self, user_name=None, password=None):
        self.session = requests.Session()
        self.uid = None
        self.user_name = user_name
        self.password = password

    def get_hash(self, string):
        m = hashlib.md5()
        m.update(string)
        return m.hexdigest()

    def _request(self, url, params={}):
        # Requests should go through one unified helper; refactoring pending.
        try:
            response = self.session.get(url, headers=FOLLOWER_HEADER, params=params, timeout=10)
            return response
        # Note: the original `except A, B:` bound the second exception type to a
        # name instead of catching it; a tuple is what was intended.
        except (requests.ConnectionError, requests.ConnectTimeout):
            logger.error('%s request timed out' % url)

    def visit_index(self):
        self.session.get(BASE_URL, headers=BASE_HEADER)

    def login(self):
        url = urljoin(BASE_URL, LOGIN_URL)
        if self.check_login():
            logger.info('Already logged in')
            return
        data = {
            'areacode': 86,
            'remember_me': 'on',
            'username': self.user_name,
            'password': self.get_hash(self.password),
        }
        if if_int(self.user_name):
            data['telephone'] = data.pop('username')
        response = self.session.post(url, headers=BASE_HEADER, data=data)
        logger.debug(response.content)
        if self.check_login():
            logger.info('Login succeeded')
            self.get_people_id('8276760920')
            self.save_cookies()
            return
        raise ValueError('Login failed')

    def save_cookies(self):
        result = self.load_data()
        with open('spiders/.session', 'wb') as f:
            cookies = requests.utils.dict_from_cookiejar(self.session.cookies)
            data = {
                'cookies': cookies,
                'uid': self.uid,
                'user_name': self.user_name,
            }
            result[self.user_name] = data
            pickle.dump(result, f)

    @classmethod
    def clear_cookies(cls):
        with open('spiders/.session', 'wb') as f:
            pickle.dump({}, f)

    def load_data(self):
        with open('spiders/.session') as f:
            try:
                return pickle.load(f)
            except EOFError:
                return {}

    def load_cookies(self):
        with open('spiders/.session') as f:
            try:
                data = pickle.load(f)
            except EOFError:
                return {}
        result = data.get(self.user_name)
        if not result:
            logger.info("Account is not logged in")
            return {}
        self.uid = result['uid']
        cookies = result['cookies']
        return cookies

    def check_login(self, load_cookie=True):
        if load_cookie:
            cookies = self.load_cookies()
            response = self.session.get(BASE_URL, headers=BASE_HEADER,
                                        cookies=cookies, allow_redirects=False)
        else:
            response = self.session.get(BASE_URL, headers=BASE_HEADER,
                                        allow_redirects=False)
        if response.status_code == 302:
            if self.uid is not None:
                return True
            location = response.headers['Location']
            uid = get_uid_from_url(location)
            if uid:
                self.uid = uid
                return True
            else:
                logger.error(u"Failed to parse the uid from the redirect URL")
        return False

    def get_people(self):
        url = urljoin(BASE_URL, PEOPLE_URL)
        respond = self.session.get(url, headers=BASE_HEADER)
        result = get_people(respond.content)
        logger.info('Scraped %s key opinion leaders' % len(result))
        return result

    def get_people_id(self, path):
        url = urljoin(BASE_URL, path)
        respond = self.session.get(url, headers=BASE_HEADER)
        if respond.status_code == 200:
            uid = get_people_id(respond.content)
            return uid
        else:
            logger.error(u"Failed to fetch the id of user '%s'" % path)

    def get_followers(self, uid):
        size = 1000
        url = urljoin(BASE_URL, FOLLOWERS_URL)
        params = {
            'size': size,
            'pageNo': 1,
            'uid': uid,
            '_': int(time.time() * 1000)
        }
        respond = self._request(url, params=params)
        if not respond:
            return []
        data = respond.json()
        max_page = data.get('maxPage')
        if not max_page:
            logger.error("Failed to fetch followers")
            logger.error(data)
            raise ValueError("Failed to fetch followers")
        result = data['followers']
        for page in range(1, max_page):
            time.sleep(FOLLOWER_PAGE_INTEVAL)
            logger.info('Fetching page %s of followers' % page)
            params['pageNo'] = page
            params['_'] = int(time.time() * 1000)
            respond = self._request(url, params=params)
            if not respond:
                continue
            data = respond.json()
            result += data['followers']
        return self.handle_followers(result)

    def handle_followers(self, data):
        return [(_['id'], _['screen_name']) for _ in data]

    def get_chat_sequence_id(self, uid):
        url = CHAT_HISTORY_URL % uid
        params = {
            'user_id': self.uid,
            'limit': 30,
            '_': int(time.time() * 1000)
        }
        cookies = self.load_cookies()
        respond = self.session.get(url, headers=CHAT_HEADER, params=params, cookies=cookies)
        if respond.status_code == 200:
            data = respond.json()
            if len(data) > 1:
                return data[-1]['sequenceId']
            else:
                return 96878141
        logger.error('Failed to get the chat sequence id')
        logger.error(respond.content)
        return False

    def chat(self, uid, msg):
        sequenceId = self.get_chat_sequence_id(uid)
        if not sequenceId:
            return False
        data = {
            'plain': msg,
            'to_group': False,
            'toId': uid,
            'sequenceId': sequenceId + 1
        }
        params = {'user_id': self.uid}
        cookies = self.load_cookies()
        respond = self.session.post(CHAT_URL, headers=CHAT_HEADER, cookies=cookies,
                                    params=params, data=json.dumps(data))
        if respond.status_code == 200:
            result = respond.json()
            error = result.get('error')
            if error:
                print 'Error sending the message'
                logger.debug(respond.content)
                raise ValueError(error.encode('utf8'))
            return True
        logger.debug(respond.status_code)
        logger.debug(respond.content)
        return False

    def post(self, msg, audience=[]):
        p = {"api": "/statuses/update.json", "_": int(time.time() * 1000)}
        cookie = self.load_cookies()
        url = urljoin(BASE_URL, TOKEN_URL)
        r = self.session.get(url, params=p, cookies=cookie,
                             headers=BASE_HEADER)
        try:
            token = r.json()['token']
        except (IndexError, TypeError, ValueError):
            logger.error("Damn, something went wrong!")
            logger.error("\n%s\n", r.text)
            return
        audience = ' @'.join(audience)
        audience = ' @' + audience.strip()
        msg = '%s %s' % (msg, audience)
        logger.info('Posting message: %s' % msg)
        msg = msg.encode().decode()
        data = {"status": "<p>%s</p>" % msg, "session_token": token}
        url = urljoin(BASE_URL, POST_URL)
        r = self.session.post(url, data=data, cookies=cookie,
                              headers=BASE_HEADER)
        if r.status_code == 200:
            data = r.json()
            if not data.get('error_code') > -1:
                logger.debug("All done.")
                return
        logger.error("Damn, it went wrong again!")
        logger.error("\n%s\n", r.text)
        raise ValueError('Failed to post the broadcast')


def if_int(item):
    try:
        int(item)
    except ValueError:
        return False
    return True
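`Spider.get_hash` hex-encodes an MD5 digest of the raw password string before it is posted to the login endpoint. A standalone equivalent (the sample password is made up):

```python
import hashlib


def get_hash(string):
    # Mirrors Spider.get_hash: hex MD5 digest of the input string.
    m = hashlib.md5()
    m.update(string.encode('utf8'))
    return m.hexdigest()


digest = get_hash('secret')  # 32 lowercase hex characters
```

Note this is client-side obfuscation, not security: MD5 of a short password is trivially reversible with a lookup table, so the scheme only matches what the site's login form happens to expect.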

# ==== FILE: sympy/diffgeom/tests/test_class_structure.py (repo: shilpiprd/sympy, license: BSD-3-Clause) ====

from sympy.diffgeom import Manifold, Patch, CoordSystem, Point
from sympy import symbols, Function
from sympy.testing.pytest import warns_deprecated_sympy

m = Manifold('m', 2)
p = Patch('p', m)
a, b = symbols('a b')
cs = CoordSystem('cs', p, [a, b])
x, y = symbols('x y')
f = Function('f')
s1, s2 = cs.coord_functions()
v1, v2 = cs.base_vectors()
f1, f2 = cs.base_oneforms()


def test_point():
    point = Point(cs, [x, y])
    assert point != Point(cs, [2, y])
    #TODO assert point.subs(x, 2) == Point(cs, [2, y])
    #TODO assert point.free_symbols == set([x, y])


def test_subs():
    assert s1.subs(s1, s2) == s2
    assert v1.subs(v1, v2) == v2
    assert f1.subs(f1, f2) == f2
    assert (x*f(s1) + y).subs(s1, s2) == x*f(s2) + y
    assert (f(s1)*v1).subs(v1, v2) == f(s1)*v2
    assert (y*f(s1)*f1).subs(f1, f2) == y*f(s1)*f2


def test_deprecated():
    with warns_deprecated_sympy():
        cs_wname = CoordSystem('cs', p, ['a', 'b'])
        assert cs_wname == cs_wname.func(*cs_wname.args)

# ==== FILE: tests/test_oauth.py (repo: Dephilia/poaurk, license: MIT) ====

#! /usr/bin/env python3
# -*- coding: utf-8 -*-
# vim:fenc=utf-8
#
# Copyright © 2021 dephilia <dephilia@MacBook-Pro.local>
#
# Distributed under terms of the MIT license.
"""
"""
import unittest
from poaurk import (PlurkAPI, PlurkOAuth)


class TestOauthMethods(unittest.TestCase):

    def test_class(self):
        oauth = PlurkOAuth("z3kiB2tbqrlC", "u8mCwet8BQNjROfUZU8A6BHc1o9rx1AE")
        oauth.authorize()

    def test_upper(self):
        self.assertEqual('foo'.upper(), 'FOO')

    def test_isupper(self):
        self.assertTrue('FOO'.isupper())
        self.assertFalse('Foo'.isupper())

    def test_split(self):
        s = 'hello world'
        self.assertEqual(s.split(), ['hello', 'world'])
        # check that s.split fails when the separator is not a string
        with self.assertRaises(TypeError):
            s.split(2)


if __name__ == '__main__':
    unittest.main()

# ==== FILE: datacite_rest/__init__.py (repo: gu-eresearch/datacite-rest, license: MIT) ====

""" root module with metadata """
__title__ = 'datacite-rest'
__version__ = '0.0.1-dev0'
__author__ = 'Gary Burgmann'
__author_email__ = 'g.burgmann@griffith.edu.au'
__description__ = 'a package for managing dois'
__license__ = 'MIT'

try:
    from .datacite_rest import DataCiteREST  # noqa
except Exception:
    # Preserve the import here, but stop setup.py from breaking due to missing dependencies.
    pass

# ==== FILE: qiskit/aqua/algorithms/education/__init__.py (repo: hushaohan/aqua, license: Apache-2.0) ====

# -*- coding: utf-8 -*-
# This code is part of Qiskit.
#
# (C) Copyright IBM 2020.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
#
# Any modifications or derivative works of this code must retain this
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.
""" Education Package """
from .eoh import EOH
from .simon import Simon
from .deutsch_jozsa import DeutschJozsa
from .bernstein_vazirani import BernsteinVazirani
__all__ = ['EOH',
           'Simon',
           'DeutschJozsa',
           'BernsteinVazirani']

# ==== FILE: module3-nosql-and-document-oriented-databases/mongo_queries.py (repo: ayarelif/DS-Unit-3-Sprint-2-SQL-and-Databases, license: MIT) ====
# BG_URI="mongodb+srv://elifayar:<password>@clusters.lcjcx.mongodb.net/<dbname>?retryWrites=true&w=majority")
# client = pymongo.MongoClient(DB_URI)
# db = client.test
from pymongo import MongoClient
import os
from dotenv import load_dotenv
load_dotenv()
DB_USER = os.getenv("MONGO_USER", default="OOPS")
DB_PASSWORD = os.getenv("MONGO_PASSWORD", default="OOPS")
CLUSTER_NAME = os.getenv("MONGO_CLUSTER_NAME", default="OOPS")
connection_uri = f"mongodb+srv://{DB_USER}:{DB_PASSWORD}@{CLUSTER_NAME}.mongodb.net/test?retryWrites=true&w=majority"
print("----------------")
print("URI:", connection_uri)
client = MongoClient(connection_uri)
print("----------------")
print("CLIENT:", type(client), client)
print("DATABASES", client.list_database_names())
db = client.my_test_database # "test_database" or whatever you want to call it
print("----------------")
print("DB:", type(db), db)
collection = db.pokemon_test # "pokemon_test" or whatever you want to call it
print("----------------")
print("COLLECTION:", type(collection), collection)
print("DOCUMENTS COUNT:", collection.count_documents({}))
collection.insert_one({
    "name": "Pikachu",
    "level": 30,
    "exp": 76000000000,
    "hp": 400,
    "parents": ['Pikachu', "Raichu"],
    "other_attr": {
        "a": 1,
        "b": 2,
        "c": 3
    }
})
print("DOCS:", collection.count_documents({}))
print(collection.count_documents({"name": "Pikachu"}))
#from pprint import pprint
#breakpoint()
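The `connection_uri` built above is ordinary f-string interpolation; with hypothetical credentials (the real ones come from the environment) it composes like this:

```python
DB_USER = "demo_user"        # hypothetical values, for illustration only
DB_PASSWORD = "demo_pass"
CLUSTER_NAME = "cluster0"
uri = f"mongodb+srv://{DB_USER}:{DB_PASSWORD}@{CLUSTER_NAME}.mongodb.net/test?retryWrites=true&w=majority"
```

Keeping the credentials in a `.env` file and loading them with `load_dotenv()`, as the script does, keeps secrets out of source control.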
#dir(collection)

# ==== FILE: Scripts/simulation/careers/acting/performance_object_data.py (repo: velocist/TS4CheatsInfo, license: Apache-2.0) ====

# uncompyle6 version 3.7.4
# Python bytecode 3.7 (3394)
# Decompiled from: Python 3.7.9 (tags/v3.7.9:13c94747c7, Aug 17 2020, 18:58:18) [MSC v.1900 64 bit (AMD64)]
# Embedded file name: T:\InGame\Gameplay\Scripts\Server\careers\acting\performance_object_data.py
# Compiled at: 2018-09-18 00:30:33
# Size of source mod 2**32: 2272 bytes
import services


class PerformanceObjectData:

    def __init__(self, objects, pre_performance_states, performance_states, post_performance_states):
        self._objects = objects
        self._pre_performance_states = pre_performance_states
        self._performance_states = performance_states
        self._post_performance_states = post_performance_states

    def set_performance_states(self):
        self._set_states(self._performance_states)

    def set_pre_performance_states(self):
        bucks_tracker = services.active_sim_info().get_bucks_tracker()
        for state_data in self._pre_performance_states:
            skip_perk = state_data.skip_with_perk
            state_value = state_data.state_value
            if skip_perk is not None:
                if bucks_tracker is not None:
                    if bucks_tracker.is_perk_unlocked(skip_perk):
                        continue
            for obj in self._objects:
                if obj.has_state(state_value.state):
                    obj.set_state((state_value.state), state_value, immediate=True, force_update=True)

    def set_post_performance_states(self):
        self._set_states(self._post_performance_states)

    def _set_states(self, states):
        for state_value in states:
            for obj in self._objects:
                if obj.has_state(state_value.state):
                    obj.set_state((state_value.state), state_value, immediate=True, force_update=True)

# ==== FILE: python/3D-rrt/pvtrace/Geometry.py (repo: siddhu95/mcclanahoochie, license: Apache-2.0) ====

# pvtrace is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
# pvtrace is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import division
import numpy as np
import scipy as sp
import scipy.linalg
from external.transformations import translation_matrix, rotation_matrix
import external.transformations as tf
from external.quickhull import qhull3d
import logging
import pdb
#logging.basicConfig(filename="/tmp/geom-debug.txt", level=logging.DEBUG, filemode="w")


def cmp_floats(a, b):
    abs_diff = abs(a - b)
    if abs_diff < 1e-12:
        return True
    else:
        return False


def cmp_floats_range(a, b):
    if cmp_floats(a, b):
        return 0
    elif a < b:
        return -1
    else:
        return 1


def intervalcheck(a, b, c):
    """
    Returns whether a <= b <= c is True or False
    """
    if cmp_floats(a, b) == True or cmp_floats(b, c) == True:
        return True
    if a < b and b < c:
        return True
    else:
        return False


def intervalcheckstrict(a, b, c):
    """
    Returns whether a < b < c is True or False
    """
    if a < b and b < c:
        return True
    else:
        return False


def smallerequalto(a, b):
    """
    Returns whether a <= b is True or False
    """
    if cmp_floats(a, b) == True:
        return True
    if a < b:
        return True
    else:
        return False
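`cmp_floats` exists because exact `==` on binary floats is brittle; a quick demonstration of the tolerance-based comparison (the function is redefined here so the snippet is self-contained):

```python
def cmp_floats(a, b):
    # Same 1e-12 absolute tolerance as Geometry.cmp_floats.
    return abs(a - b) < 1e-12


exact_equal = (0.1 + 0.2) == 0.3             # False: binary rounding error
tolerant_equal = cmp_floats(0.1 + 0.2, 0.3)  # True within tolerance
```

An absolute tolerance like this is fine for the unit-scale geometry used in this module; comparisons of very large or very small magnitudes would want a relative tolerance instead.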
def round_zero_elements(point):
    for i in range(0, len(point)):
        if cmp_floats(0.0, point[i]):
            point[i] = 0.0
    return point


def cmp_points(a, b):
    if a is None:
        return False
    if b is None:
        return False
    al = list(a)
    bl = list(b)
    for e1, e2 in zip(al, bl):
        ans = cmp_floats(e1, e2)
        if ans is False:
            return False
    return True


def flatten(l, ltypes=(list, tuple)):
    ltype = type(l)
    l = list(l)
    i = 0
    while i < len(l):
        while isinstance(l[i], ltypes):
            if not l[i]:
                l.pop(i)
                i -= 1
                break
            else:
                l[i:i + 1] = l[i]
        i += 1
    return ltype(l)
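`flatten` expands nested lists and tuples by splicing each nested sequence into its parent in place. A usage check (the body is copied from above so the snippet runs standalone):

```python
def flatten(l, ltypes=(list, tuple)):
    ltype = type(l)
    l = list(l)
    i = 0
    while i < len(l):
        while isinstance(l[i], ltypes):
            if not l[i]:       # drop empty nested sequences
                l.pop(i)
                i -= 1
                break
            else:              # splice the nested sequence in place
                l[i:i + 1] = l[i]
        i += 1
    return ltype(l)


result = flatten([1, [2, [3, 4]], (5,), []])
```

Arbitrarily deep nesting and empty sequences are both handled, and the result comes back in the same container type as the input.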
def separation(beginning, end):
    return magnitude(np.array(end) - np.array(beginning))


def magnitude(vector):
    return np.sqrt(np.dot(np.array(vector), np.array(vector)))


def norm(vector):
    return np.array(vector) / magnitude(np.array(vector))


def angle(normal, vector):
    assert cmp_floats(magnitude(normal), 1.0), "The normal vector is not normalised."
    dot = np.dot(normal, vector)
    return np.arccos(dot / magnitude(vector))


def reflect_vector(normal, vector):
    d = np.dot(normal, vector)
    return vector - 2 * d * normal
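`reflect_vector` implements specular reflection, v' = v - 2(n.v)n, for a unit normal n. A standalone sanity check:

```python
import numpy as np


def reflect_vector(normal, vector):
    # Specular reflection about the plane with unit normal `normal`.
    d = np.dot(normal, vector)
    return vector - 2 * d * normal


# A ray travelling downward at 45 degrees bounces off the z = 0 plane.
reflected = reflect_vector(np.array([0., 0., 1.]), np.array([1., 0., -1.]))
```

The in-plane component is unchanged while the normal component flips sign, so the incident angle equals the reflected angle, which is exactly what the ray tracer relies on at each surface hit.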
def closest_point(reference, point_list):
"""Return the closest point in the list of reference points."""
separations = []
for point in point_list:
t = separation(reference, point)
separations.append(t)
sort_index = np.array(separations).argsort()
return point_list[sort_index[0]]
def transform_point(point, transform):
return np.array(np.dot(transform, np.matrix(np.concatenate((point, [1.]))).transpose()).transpose()[0,0:3]).squeeze()
def transform_direction(direction, transform):
angle, axis, point = tf.rotation_from_matrix(transform)
rotation_transform = tf.rotation_matrix(angle, axis)
return np.array(np.dot(rotation_transform, np.matrix(np.concatenate((direction, [1.]))).transpose()).transpose()[0,0:3]).squeeze()
def rotation_matrix_from_vector_alignment(before, after):
"""
>>> # General input/output test
>>> V1 = norm(np.random.random(3))
>>> V2 = norm([1,1,1])
>>> R = rotation_matrix_from_vector_alignment(V1, V2)
>>> V3 = transform_direction(V1, R)
>>> cmp_points(V2, V3)
True
>>> # Catch the special case in which we cannot take the cross product
>>> V1 = [0,0,1]
>>> V2 = [0,0,-1]
>>> R = rotation_matrix_from_vector_alignment(V1, V2)
>>> V3 = transform_direction(V1, R)
>>> cmp_points(V2, V3)
True
"""
# The angle between the vectors must not be 0 or 180 (i.e. so we can take a cross product)
thedot = np.dot(before, after)
if cmp_floats(thedot, 1.):
    # Vectors are parallel
    return tf.identity_matrix()
if cmp_floats(thedot, -1.):
    # Vectors are anti-parallel: the cross product is zero here, so
    # rotate 180 degrees about any axis perpendicular to `before`.
    perpendicular = np.cross(before, [1, 0, 0])
    if cmp_floats(magnitude(perpendicular), 0.):
        perpendicular = np.cross(before, [0, 1, 0])
    return rotation_matrix(np.pi, norm(perpendicular))
axis = np.cross(before, after) # get the axis of rotation
angle = np.arccos(np.dot(before, after)) # get the rotation angle
return rotation_matrix(angle, axis)
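The axis/angle construction above (axis from the cross product, angle from the dot product) can be checked independently with Rodrigues' rotation formula. This is a self-contained numpy sketch; `rodrigues_rotate` is a name introduced here for illustration:

```python
import numpy as np

def rodrigues_rotate(v, axis, theta):
    # Rodrigues' formula: rotate v about the unit axis by angle theta
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    v = np.asarray(v, dtype=float)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1.0 - np.cos(theta)))

before = np.array([1.0, 0.0, 0.0])
after = np.array([0.0, 1.0, 0.0])
axis = np.cross(before, after)             # axis of rotation
theta = np.arccos(np.dot(before, after))   # rotation angle
print(np.allclose(rodrigues_rotate(before, axis, theta), after))  # True
```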
class Ray(object):
"""A ray in the global cartesian frame."""
def __init__(self, position=[0.,0.,0.], direction=[0.,0.,1.]):
self.__position = np.array(position)
self.__direction = np.array(direction)/np.sqrt(np.dot(direction,np.array(direction).conj()))
def getPosition(self):
return self.__position
def setPosition(self, position):
self.__position = round_zero_elements(position)
def getDirection(self):
return self.__direction
def setDirection(self, direction):
self.__direction = np.array(direction)/np.sqrt(np.dot(direction,np.array(direction).conj()))
def stepForward(self, distance):
self.position = self.position + distance * self.direction
def behind(self, point):
# Create vector from position to point, if the angle is
# greater than 90 degrees then the point is behind the ray.
v = np.array(point) - np.array(self.position)
if cmp_points([0,0,0], v):
# The ray is at the point
return False
if angle(self.direction, v) > np.pi*.5:
return True
return False
# Define properties
direction = property(getDirection, setDirection)
position = property(getPosition, setPosition)
class Intersection(object):
"""Defines the intersection between a ray and a geometrical objects."""
def __init__(self, ray, point, receiver):
"""An intersection is defined as the point that a ray and receiver meet. This class is simiply a wapper for these deatils. Intersection objects can be sorted with respect to their separations (distance from the ray.position to the point of intersection), this length is returned with intersection_obj.separation."""
super(Intersection, self).__init__()
self.ray = ray
self.point = point
self.receiver = receiver
self.separation = separation(point, ray.position)
def __str__(self):
return str(' point ' + str(self.point) + ' receiver ' + str(self.receiver))
def __cmp__(self, other):
return cmp(self.separation, other.separation)
class Plane(object):
"""A infinite plane going though the origin point along the positive z axis. At 4x4 transformation matrix can be applied to the generated other planes."""
def __init__(self, transform=None):
'''Transform is a 4x4 transformation matrix that rotates and translates the plane into the global frame (from a plane in the xy plane with its normal along +z).'''
super(Plane, self).__init__()
self.transform = transform
if self.transform is None:
self.transform = tf.identity_matrix()
def append_transform(self, new_transform):
self.transform = np.dot(self.transform, new_transform)
def contains(self, point):
return False
def on_surface(self, point):
"""Returns True is the point is on the plane's surface and false otherwise."""
inv_transform = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, inv_transform)
if cmp_floats(rpos, 0.):
return True
return False
def surface_identifier(self, surface_point, assert_on_surface = True):
return 'planarsurf'
def surface_normal(self, ray, acute=True):
normal = transform_direction((0,0,1), self.transform)
if acute:
if angle(normal, ray.direction) > np.pi/2:
normal = normal * -1.0
return normal
def intersection(self, ray):
"""
Returns the intersection point of the ray with the plane. If no intersection occurs None is returned.
>>> ray = Ray(position=[0.5, 0.5, -0.5], direction=[0,0,1])
>>> plane = Plane()
>>> plane.intersection(ray)
[array([ 0.5, 0.5, 0. ])]
>>> ray = Ray(position=[0.5, 0.5, -0.5], direction=[0,0,1])
>>> plane = Plane()
>>> plane.transform = tf.translation_matrix([0,0,1])
>>> plane.intersection(ray)
[array([ 0.5, 0.5, 1. ])]
>>> ray = Ray(position=[0.5, 0.5, -0.5], direction=[0,0,1])
>>> plane = Plane()
>>> plane.append_transform(tf.translation_matrix([0,0,1]))
>>> plane.append_transform(tf.rotation_matrix(np.pi,[1,0,0]))
>>> plane.intersection(ray)
[array([ 0.5, 0.5, 1. ])]
"""
# We need to apply the inverse transform of the plane to the ray. This gets the ray into the local frame of the plane.
inv_transform = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, inv_transform)
rdir = transform_direction(ray.direction, inv_transform)
# Ray is parallel to the plane -- there is no intersection
if rdir[2] == 0.0:
return None
t = -rpos[2]/rdir[2]
# Intersection point is behind the ray
if t < 0.0:
return None
# Convert local frame to world frame
point = rpos + t*rdir
return [transform_point(point, self.transform)]
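In the plane's local frame the intersection reduces to solving rpos.z + t * rdir.z = 0, exactly as in the body above. A minimal self-contained numpy sketch of that local-frame step (the helper name `local_plane_hit` is introduced here for illustration):

```python
import numpy as np

def local_plane_hit(rpos, rdir):
    # In the plane's local frame the plane is z = 0, so the hit
    # parameter t solves rpos.z + t * rdir.z = 0.
    rpos, rdir = np.asarray(rpos, float), np.asarray(rdir, float)
    if rdir[2] == 0.0:
        return None            # ray parallel to the plane
    t = -rpos[2] / rdir[2]
    if t < 0.0:
        return None            # plane is behind the ray
    return rpos + t * rdir

print(local_plane_hit([0.5, 0.5, -0.5], [0.0, 0.0, 1.0]))  # hits at [0.5, 0.5, 0.0]
```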
class FinitePlane(Plane):
"""A subclass of Plane but that has a finite size. The size of the plane
is specified as the the plane was sitting in the xy-plane of a Cartesian
system. The transformations are used dor the positioning.
>>> fp = FinitePlane(length=1, width=1)
>>> fp.intersection(Ray(position=(0,0,1), direction=(0,0,-1)))
[array([ 0., 0., 0.])]
>>> fp = FinitePlane(length=1, width=1)
>>> fp.intersection(Ray(position=(0,0,1), direction=(.5,.25,-1)))
[array([ 0.5 , 0.25, 0. ])]
>>> fp = FinitePlane(length=1, width=1)
>>> fp.append_transform(translation_matrix((2,0,0)))
>>> fp.intersection(Ray(position=(0,0,1), direction=(0,0,-1)))
"""
def __init__(self, length=1, width=1):
super(FinitePlane, self).__init__()
self.length = length
self.width = width
def append_transform(self, new_transform):
super(FinitePlane, self).append_transform(new_transform)
def on_surface(self, point):
"""Returns True if the point is on the plane's surface and false otherwise."""
inv_transform = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, inv_transform)
if cmp_floats(rpos, 0.) and (0. < rpos[0] <= self.length ) and (0. < rpos[1] <= self.width):
return True
return False
def intersection(self, ray):
"""Returns a intersection point with a ray and the finte plane."""
points = super(FinitePlane, self).intersection(ray)
# Is point in the finite plane bounds
local_point = transform_point(points[0], self.transform)
if (0. <= local_point[0] <= self.length ) and (0. <= local_point[1] <= self.width):
return points
return None
class Polygon(object):
"""
A (2D) polygon with n (>2) points.
Only convex polygons are allowed! The order of the points is important!
"""
def __init__(self, points):
super(Polygon, self).__init__()
self.pts = points
#check if points are in one plane
assert len(self.pts) >= 3, "You need at least 3 points to build a Polygon"
if len(self.pts) > 3:
x_0 = np.array(self.pts[0])
for i in range(1,len(self.pts)-2):
#the determinant of the vectors (volume) must always be 0
x_i = np.array(self.pts[i])
x_i1 = np.array(self.pts[i+1])
x_i2 = np.array(self.pts[i+2])
det = np.linalg.det([x_0-x_i, x_0-x_i1, x_0-x_i2])
assert cmp_floats( det, 0.0 ), "Points must be in a plane to create a Polygon"
def on_surface(self, point):
"""Returns True if the point is on the polygon's surface and false otherwise."""
n = len(self.pts)
anglesum = 0
p = np.array(point)
for i in range(n):
v1 = np.array(self.pts[i]) - p
v2 = np.array(self.pts[(i+1)%n]) - p
m1 = magnitude(v1)
m2 = magnitude(v2)
if cmp_floats( m1*m2 , 0. ):
return True #point is one of the nodes
else:
# angle(normal, vector)
costheta = np.dot(v1,v2)/(m1*m2)
anglesum = anglesum + np.arccos(costheta)
return cmp_floats( anglesum , 2*np.pi )
def contains(self, point):
return False
def surface_identifier(self, surface_point, assert_on_surface = True):
return "polygon"
def surface_normal(self, ray, acute=False):
vec1 = np.array(self.pts[0])-np.array(self.pts[1])
vec2 = np.array(self.pts[0])-np.array(self.pts[2])
normal = norm( np.cross(vec1,vec2) )
return normal
def intersection(self, ray):
"""Returns a intersection point with a ray and the polygon."""
n = self.surface_normal(ray)
#Ray is parallel to the polygon
if cmp_floats( np.dot( np.array(ray.direction), n ), 0. ):
return None
t = 1/(np.dot(np.array(ray.direction),n)) * ( np.dot(n,np.array(self.pts[0])) - np.dot(n,np.array(ray.position)) )
#Intersection point is behind the ray
if t < 0.0:
return None
#Calculate intersection point
point = np.array(ray.position) + t*np.array(ray.direction)
#Check if intersection point is really in the polygon or only on the (infinite) plane
if self.on_surface(point):
return [list(point)]
return None
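The hit parameter above is the standard ray/plane solution t = (n . p0 - n . o) / (n . d) for a plane through p0 with normal n. A self-contained numpy sketch of just that step (the name `plane_hit` and the epsilon tolerance are illustrative choices):

```python
import numpy as np

def plane_hit(origin, direction, plane_point, normal):
    # Solve n . (o + t d) = n . p0 for the ray parameter t
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    denom = np.dot(direction, normal)
    if abs(denom) < 1e-12:
        return None                 # ray parallel to the plane
    t = (np.dot(normal, plane_point) - np.dot(normal, origin)) / denom
    if t < 0.0:
        return None                 # plane is behind the ray
    return origin + t * direction

print(plane_hit([0, 0, 2], [0, 0, -1], [0, 0, 0], [0, 0, 1]))  # hits the origin, [0, 0, 0]
```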
class Box(object):
"""An axis aligned box defined by an minimum and extend points (array/list like values)."""
def __init__(self, origin=(0,0,0), extent=(1,1,1)):
super(Box, self).__init__()
self.origin = np.array(origin)
self.extent = np.array(extent)
self.points = [origin, extent]
self.transform = tf.identity_matrix()
def append_transform(self, new_transform):
self.transform = tf.concatenate_matrices(new_transform, self.transform)
def contains(self, point):
"""Returns True is the point is inside the box or False if it is not or is on the surface.
>>> box = Box([1,1,1], [2,2,2])
>>> box.contains([2,2,2])
False
>>> box = Box([1,1,1], [2,2,2])
>>> box.contains([3,3,3])
False
>>> # This point is on the surface, to within rounding error
>>> box = Box([0,0,0], [1,1,1])
>>> box.contains([ 0.04223342, 0.99999999999999989 , 0.35692177])
False
>>> box = Box([0,0,0], [1,1,1])
>>> box.contains([ 0.04223342, 0.5 , 0.35692177])
True
"""
local_point = transform_point(point, tf.inverse_matrix(self.transform))
for i in range(3):
    # Want to make this comparison: self.origin[i] < local_point[i] < self.extent[i]
    below = cmp_floats_range(self.origin[i], local_point[i]) == -1
    above = cmp_floats_range(local_point[i], self.extent[i]) == -1
    if not (below and above):
        return False
return True
""" # Alternatively:
local_point = transform_point(point, tf.inverse_matrix(self.transform))
def_points = np.concatenate((np.array(self.origin), np.array(self.extent)))
containbool = True
for i in range(0,3):
if intervalcheckstrict(def_points[i],local_point[i],def_points[i+3]) == False:
containbool = False
return containbool
"""
def surface_identifier(self, surface_point, assert_on_surface = True):
"""
Returns a unique identifier that specifies the surface which holds surface_point.
self.on_surface(surface_point) must return True, otherwise an assert error is thrown.
For example, a Box with origin=(X,Y,Z) and size=(L,W,H) has the identifiers:
"left":(X,y,z)
"right":(X+L,y,z)
"near":(x,Y,z)
"far":(x,Y+W,z)
"bottom":(x,y,Z)
"top":(x,y,Z+H)
"""
# Get an axis-aligned point... then this is really easy.
local_point = transform_point(surface_point, tf.inverse_matrix(self.transform))
# the local point must have at least one common point with the surface definition points
def_points = np.concatenate((np.array(self.origin), np.array(self.extent)))
#surface_id[0]=0 => left
#surface_id[1]=0 => near
#surface_id[2]=0 => bottom
#surface_id[3]=0 => right
#surface_id[4]=0 => far
#surface_id[5]=0 => top
surface_id_array = [0,0,0,0,0,0]
boolarray = [False, False, False]
for i in range(0,3):
if cmp_floats(def_points[i], local_point[i]):
for j in range(0,3):
if intervalcheck(def_points[j],local_point[j],def_points[j+3]):
surface_id_array[i] = 1
boolarray[j] = True
if cmp_floats(def_points[i+3], local_point[i]):
for j in range(0,3):
if intervalcheck(def_points[j],local_point[j],def_points[j+3]):
surface_id_array[i+3] = 1
boolarray[j] = True
if assert_on_surface == True:
assert boolarray[0] == boolarray[1] == boolarray[2] == True
surface_name = []
if surface_id_array[0] == 1:
surface_name.append('left')
if surface_id_array[1] == 1:
surface_name.append('near')
if surface_id_array[2] == 1:
surface_name.append('bottom')
if surface_id_array[3] == 1:
surface_name.append('right')
if surface_id_array[4] == 1:
surface_name.append('far')
if surface_id_array[5] == 1:
surface_name.append('top')
"""
The following helps to specify if the local_point is located on a corner
or edge of the box. If that is not desired, simply return surface_name[0].
"""
# return surface_name[0]
return ''.join(surface_name)
def on_surface(self, point):
"""Returns True if the point is on the surface False otherwise.
>>> box = Box([1,1,1], [2,2,2])
>>> box.on_surface([2,2,2])
True
>>> box = Box([1,1,1], [2,2,2])
>>> box.on_surface([4,4,4])
False
>>> box = Box(origin=(0, 0, 1.1000000000000001), extent=np.array([ 1. , 1. , 2.1]))
>>> ray = Ray(position=(.5,.5, 2.1), direction=(0,0,1))
>>> box.on_surface(ray.position)
True
"""
if self.contains(point) == True:
return False
# Get an axis-aligned point... then this is really easy.
local_point = transform_point(point, tf.inverse_matrix(self.transform))
# the local point must have at least one common point with the surface definition points
def_points = np.concatenate((np.array(self.origin), np.array(self.extent)))
boolarray = [False, False, False]
for i in range(0,3):
if cmp_floats(def_points[i], local_point[i]):
for j in range(0,3):
if intervalcheck(def_points[j],local_point[j],def_points[j+3]):
boolarray[j] = True
if cmp_floats(def_points[i+3], local_point[i]):
for j in range(0,3):
if intervalcheck(def_points[j],local_point[j],def_points[j+3]):
boolarray[j] = True
if boolarray[0] == boolarray[1] == boolarray[2] == True:
return True
return False
def surface_normal(self, ray, acute=True):
"""
Returns the normalised vector of which is the acute surface normal (0<~ theta <~ 90)
with respect to ray direction. If acute=False is specified the reflex
normal is returned (0<~ theta < 360) The ray must be on the surface
otherwise an error is raised.
>>> box = Box([0,0,0], [1,1,1])
>>> ray = Ray([0.5,0.5,1], [0,0,1])
>>> box.surface_normal(ray)
array([ 0., 0., 1.])
>>> box = Box([1,1,1], [2,2,2])
>>> ray = Ray([1.5,1.5,2], [0,0,1])
>>> box.surface_normal(ray)
array([ 0., 0., 1.])
"""
assert self.on_surface(ray.position), "The point is not on the surface of the box."
invtrans = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, invtrans)
rdir = transform_direction(ray.direction, invtrans)
# To define a flat surface, 3 points are needed.
common_index = None
exit = False
reference_point = list(self.origin)
for ref in reference_point:
if not exit:
for val in rpos:
if cmp_floats(ref,val):
#logging.debug("Common value found, " + str(val) + " at index" + str(list(rpos).index(val)))
common_index = list(rpos).index(val)
exit = True
break
exit = False
if common_index is None:
reference_point = list(self.extent)
for ref in reference_point:
if not exit:
for val in rpos:
if cmp_floats(ref,val):
#logging.debug("Common value found, " + str(val) + " at index" + str(list(rpos).index(val)))
common_index = list(rpos).index(val)
exit = True
break
assert common_index is not None, "The intersection point %s doesn't share an element with either the origin %s or extent points %s (all points transformed into local frame)." % (rpos, self.origin, self.extent)
normal = np.zeros(3)
if list(self.origin) == list(reference_point):
normal[common_index] = -1.
else:
normal[common_index] = 1.
if acute:
if angle(normal, rdir) > np.pi/2:
normal = normal * -1.0
assert 0 <= angle(normal, rdir) <= np.pi/2, "The normal vector needs to be pointing in the same direction quadrant as the ray, so the angle between them is between 0 and 90"
# Remove signed zeros; signed zeros shouldn't affect the maths, but this keeps the doctest output neat.
for i in range(0,3):
if normal[i] == 0.0:
normal[i] = 0.0
return transform_direction(normal, self.transform)
def intersection(self, ray):
'''Returns an array intersection points with the ray and box. If no intersection occurs
this function returns None.
# Inside-out single intersection
>>> ray = Ray(position=[0.5,0.5,0.5], direction=[0,0,1])
>>> box = Box()
>>> box.intersection(ray)
[array([ 0.5, 0.5, 1. ])]
# Inside-out single intersection with translation
>>> ray = Ray(position=[0.5,0.5,0.5], direction=[0,0,1])
>>> box = Box()
>>> box.transform = tf.translation_matrix([0,0,1])
>>> box.intersection(ray)
[array([ 0.5, 0.5, 1. ]), array([ 0.5, 0.5, 2. ])]
>>> ray = Ray(position=[0.5,0.5,0.5], direction=[0,0,1])
>>> box = Box()
>>> box.append_transform(tf.rotation_matrix(2*np.pi, [0,0,1]))
>>> box.intersection(ray)
[array([ 0.5, 0.5, 1. ])]
>>> ray = Ray(position=[0.5,0.5,0.5], direction=[0,0,1])
>>> box = Box()
>>> box.append_transform(tf.rotation_matrix(2*np.pi, norm([1,1,0])))
>>> box.append_transform(tf.translation_matrix([0,0,1]))
>>> box.intersection(ray)
[array([ 0.5, 0.5, 1. ]), array([ 0.5, 0.5, 2. ])]
This uses the work of Amy Williams, Steve Barrus, R. Keith Morley, and
Peter Shirley, "An Efficient and Robust Ray-Box Intersection Algorithm", Journal of
Graphics Tools, 10(1):49-54, 2005.'''
invtrans = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, invtrans)
rdir = transform_direction(ray.direction, invtrans)
pts = [np.array(self.points[0]), np.array(self.points[1])]
# Relies on IEEE division: axis-parallel rays give +/-inf components here.
rinvd = [1.0/rdir[0], 1.0/rdir[1], 1.0/rdir[2]]
rsgn = [rdir[0] < 0.0, rdir[1] < 0.0, rdir[2] < 0.0]
tmin = (pts[rsgn[0]][0] - rpos[0]) * rinvd[0]
tmax = (pts[1-rsgn[0]][0] - rpos[0]) * rinvd[0]
tymin = (pts[rsgn[1]][1] - rpos[1]) * rinvd[1]
tymax = (pts[1-rsgn[1]][1] - rpos[1]) * rinvd[1]
if (tmin > tymax) or (tymin > tmax):
return None
if tymin > tmin:
tmin = tymin
if tymax < tmax:
tmax = tymax
tzmin = (pts[rsgn[2]][2] - rpos[2]) * rinvd[2]
tzmax = (pts[1-rsgn[2]][2] - rpos[2]) * rinvd[2]
if (tmin > tzmax) or (tzmin > tmax):
return None
if tzmin > tmin:
tmin = tzmin
if tzmax < tmax:
tmax = tzmax
# Calculate the hit coordinates then if the solution is in the forward direction append to the hit list.
hit_coordinates = []
pt1 = rpos + tmin * rdir
pt2 = rpos + tmax * rdir
if tmin >= 0.0:
hit_coordinates.append(pt1)
if tmax >= 0.0:
hit_coordinates.append(pt2)
if len(hit_coordinates) == 0:
return None
# Convert hit coordinate back to the world frame
hit_coords_world = []
for point in hit_coordinates:
hit_coords_world.append(transform_point(point, self.transform))
return hit_coords_world
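The Williams et al. slab test used above intersects per-axis parameter intervals [tmin, tmax]; an empty intersection means the ray misses the box. A compact, transform-free numpy sketch of the same idea (the helper name `slab_hit` is introduced here; note the 0*inf = NaN corner case when the ray starts exactly on a slab boundary is not handled):

```python
import numpy as np

def slab_hit(origin, direction, box_min, box_max):
    # Intersect the per-axis [t1, t2] intervals; keep the largest entry
    # parameter (tmin) and the smallest exit parameter (tmax).
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    with np.errstate(divide='ignore'):
        inv = 1.0 / direction          # axis-parallel rays give +/-inf
    t1 = (np.asarray(box_min, float) - origin) * inv
    t2 = (np.asarray(box_max, float) - origin) * inv
    tmin = np.max(np.minimum(t1, t2))
    tmax = np.min(np.maximum(t1, t2))
    if tmax < max(tmin, 0.0):
        return None                    # no forward intersection
    return float(tmin), float(tmax)

print(slab_hit([0.5, 0.5, 0.5], [0, 0, 1], [0, 0, 0], [1, 1, 1]))
```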
class Cylinder(object):
"""
Parameterised standard representation of a cylinder. The axis is aligned along z but the radius
and the length of the cylinder can be specified. A transformation must be applied to use
centered at a different location of angle.
"""
def __init__(self, radius=1, length=1):
super(Cylinder, self).__init__()
self.radius = radius
self.length = length
self.transform = tf.identity_matrix()
def append_transform(self, new_transform):
self.transform = tf.concatenate_matrices(new_transform, self.transform)
def contains(self, point):
"""
Returns True if the point is inside the cylinder and False if it is on the surface or outside.
>>> # Inside
>>> cylinder = Cylinder(.5, 2)
>>> cylinder.contains([.25, .25, 1])
True
>>> # On surface
>>> cylinder.contains([.0, .0, 2.])
False
>>> # Outside
>>> cylinder.contains([-1,-1,-1])
False
"""
if self.on_surface(point) == True:
return False
local_point = transform_point(point, tf.inverse_matrix(self.transform))
origin_z = 0.
xydistance = np.sqrt(local_point[0]**2 + local_point[1]**2)
return intervalcheckstrict(origin_z, local_point[2], self.length) and xydistance < self.radius
def surface_normal(self, ray, acute=True):
"""
Return the surface normal for a ray on the shape surface.
An assert error is raised if the ray is not on the surface.
>>> cylinder = Cylinder(2, 2)
>>> #Bottom cap in
>>> ray = Ray([0,0,0], [0,0,1])
>>> cylinder.surface_normal(ray)
array([ 0., 0., 1.])
>>> #Bottom cap out
>>> ray = Ray([0,0,0], [0,0,-1])
>>> cylinder.surface_normal(ray)
array([ 0., 0., -1.])
>>> # End cap in
>>> ray = Ray([0,0,2], [0,0,-1])
>>> cylinder.surface_normal(ray)
array([ 0., 0., -1.])
>>> # End cap out
>>> ray = Ray([0,0,2], [0,0,1])
>>> cylinder.surface_normal(ray)
array([ 0., 0., 1.])
>>> # Radial
>>> ray = Ray([2, 0, 1], [1,0,0])
>>> cylinder.surface_normal(ray)
array([ 1., 0., 0.])
"""
assert self.on_surface(ray.position), "The ray is not on the surface."
invtrans = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, invtrans)
rdir = transform_direction(ray.direction, invtrans)
# point on radius surface
pt_radius = np.sqrt(rpos[0]**2 + rpos[1]**2)
c0 = cmp_floats(pt_radius, self.radius)
#point on end caps
c1 = cmp_floats(rpos[2], .0)
c2 = cmp_floats(rpos[2], self.length)
# Check the radial surface first (c1 == c2 can only mean both are False,
# i.e. the point is on neither end cap). The radial normal points from
# the cylinder axis to the ray position.
if c0 and (c1 == c2):
    normal = norm(np.array(rpos) - np.array([0, 0, rpos[2]]))
elif c1:
    # Bottom cap
    normal = np.array([0, 0, -1])
else:
    # Top cap
    normal = np.array([0, 0, 1])
if acute:
if angle(normal, rdir) > np.pi*0.5:
normal = normal * -1.
return transform_direction(normal, self.transform)
def on_surface(self, point):
"""
>>> # On surface
>>> cylinder = Cylinder(.5, 2.)
>>> cylinder.on_surface([.0, .0, 2.])
True
"""
""" # !!! Old version !!!
local_point = transform_point(point, tf.inverse_matrix(self.transform))
# xy-component is equal to radius
pt_radius = np.sqrt(local_point[0]**2 + local_point[1]**2)
c0 = cmp_floats(pt_radius, self.radius)
#z-component is equal to zero or length
c1 = cmp_floats(local_point[2], .0)
c2 = cmp_floats(local_point[2], self.length)
if c1 or c2:
return True
elif c0:
return True
else:
return False
"""
local_point = transform_point(point, tf.inverse_matrix(self.transform))
origin_z = 0.
xydistance = np.sqrt(local_point[0]**2 + local_point[1]**2)
if intervalcheck(origin_z, local_point[2], self.length) == True:
if cmp_floats(xydistance, self.radius) == True:
return True
if smallerequalto(xydistance,self.radius):
if cmp_floats(local_point[2], origin_z) == True or cmp_floats(local_point[2], self.length) == True:
return True
return False
def surface_identifier(self, surface_point, assert_on_surface = True):
local_point = transform_point(surface_point, tf.inverse_matrix(self.transform))
origin_z = 0.
xydistance = np.sqrt(local_point[0]**2 + local_point[1]**2)
"""
Assert surface_point on surface
"""
assertbool = False
if intervalcheck(origin_z, local_point[2], self.length) == True:
if cmp_floats(xydistance, self.radius) == True:
surfacename = 'hull'
assertbool = True
if smallerequalto(xydistance,self.radius):
if cmp_floats(local_point[2], origin_z) == True:
surfacename = 'base'
assertbool = True
if cmp_floats(local_point[2], self.length) == True:
surfacename = 'cap'
assertbool = True
if assert_on_surface == True:
    assert assertbool, "The point is not on the surface of the cylinder."
return surfacename
def intersection(self, ray):
"""
Returns all forward intersection points of the ray with the capped cylinder.
The intersection algorithm is taken from "Intersecting a Ray with a Cylinder",
Joseph M. Cychosz and Warren N. Waggenspack, Jr., in "Graphics Gems IV",
Academic Press, 1994.
>>> cld = Cylinder(1.0, 1.0)
>>> cld.intersection(Ray([0.0,0.0,0.5], [1,0,0]))
[array([ 1. , 0. , 0.5])]
>>> cld.intersection(Ray([-5,0.0,0.5], [1,0,0]))
[array([-1. , 0. , 0.5]), array([ 1. , 0. , 0.5])]
>>> cld.intersection(Ray([.5,.5,-1], [0,0,1]))
[array([ 0.5, 0.5, 1. ]), array([ 0.5, 0.5, 0. ])]
>>> cld.intersection( Ray([0.0,0.0,2.0], [0,0,-1]))
[array([ 0., 0., 1.]), array([ 0., 0., 0.])]
>>> cld.intersection(Ray([-0.2, 1.2,0.5],[0.75498586, -0.53837322, 0.37436697]))
[array([ 0.08561878, 0.99632797, 0.64162681]), array([ 0.80834999, 0.48095523, 1. ])]
>>> cld.intersection(Ray(position=[ 0.65993112596983427575736414, -0.036309587083015459896273569, 1. ], direction=[ 0.24273873128664008591570678, -0.81399482405912471083553328, 0.52772183462341881732271531]))
[array([ 0.65993113, -0.03630959, 1. ])]
>>> cld.transform = tf.translation_matrix([0,0,1])
>>> cld.intersection(Ray([-5,0.0,1.5], [1,0,0]))
[array([-1. , 0. , 1.5]), array([ 1. , 0. , 1.5])]
>>> cld.transform = tf.identity_matrix()
>>> cld.transform = tf.rotation_matrix(0.25*np.pi, [1,0,0])
>>> cld.intersection(Ray([-5,-.5,-0.25], [1,0,0]))
[array([-0.84779125, -0.5 , -0.25 ]), array([ 0.84779125, -0.5 , -0.25 ])]
"""
# Inverse transform the ray to get it into the cylinders local frame
inv_transform = tf.inverse_matrix(self.transform)
rpos = transform_point(ray.position, inv_transform)
rdir = transform_direction(ray.direction, inv_transform)
direction = np.array([0,0,1])
normal = np.cross(rdir, direction)
normal_magnitude = magnitude(normal)
if cmp_floats(normal_magnitude, .0):
# Ray is parallel to the cylinder axis, so it can only hit the end caps.
bottom = Plane()
top = Plane()
top.transform = tf.translation_matrix([0,0,self.length])
p0 = top.intersection(Ray(rpos, rdir))
p1 = bottom.intersection(Ray(rpos, rdir))
cap_intersections = []
if p0 != None:
cap_intersections.append(p0)
if p1 != None:
cap_intersections.append(p1)
points = []
for point in cap_intersections:
if point[0] != None:
point = point[0]
point_radius = np.sqrt(point[0]**2 + point[1]**2)
if point_radius <= self.radius:
#print "Hit cap at point:"
#print point
#print ""
points.append(point)
if len(points) > 0:
world_points = []
for pt in points:
world_points.append(transform_point(pt, self.transform))
#print "Local points", points
#print "World points", world_points
return world_points
return None
# finish axis parallel branch
#print "Not parallel to cylinder axis."
#print ""
normal = norm(normal)
d = abs(np.dot(rpos, normal))
if d <= self.radius:
#Hit quadratic surface
O = np.cross(rpos, direction)
t = - np.dot(O,normal) / normal_magnitude
O = np.cross(normal, direction)
O = norm(O)
s = abs(np.sqrt(self.radius**2 - d**2) / np.dot(rdir, O))
t0 = t - s
p0 = rpos + t0 * rdir
t1 = t + s
p1 = rpos + t1 * rdir
points = []
if (t0 >= 0.0) and (.0 <= p0[2] <= self.length):
points.append(p0)
if (t1 >= 0.0) and (.0 <= p1[2] <= self.length):
points.append(p1)
#print "Hits quadratic surface with t0 and t1, ", t0, t1
#print ""
#print "Intersection points:"
#p0 = rpos + t0 * rdir
#p1 = rpos + t1 * rdir
# Check that hit quadratic surface in the length range
#points = []
#if (.0 <= p0[2] <= self.length) and not Ray(rpos, rdir).behind(p0):
# points.append(p0)
#
#if (.0 <= p1[2] <= self.length) and not Ray(rpos, rdir).behind(p1):
# points.append(p1)
#print points
#Now compute intersection with end caps
#print "Now to calculate caps intersections"
bottom = Plane()
top = Plane()
top.transform = tf.translation_matrix([0,0,self.length])
p2 = top.intersection(Ray(rpos, rdir))
p3 = bottom.intersection(Ray(rpos, rdir))
cap_intersections = []
if p2 != None:
cap_intersections.append(p2)
if p3 != None:
cap_intersections.append(p3)
for point in cap_intersections:
if point[0] != None:
point = point[0]
point_radius = np.sqrt(point[0]**2 + point[1]**2)
if point_radius <= self.radius:
#print "Hit cap at point:"
#print point
#print ""
points.append(point)
if len(points) > 0:
world_points = []
for pt in points:
world_points.append(transform_point(pt, self.transform))
return world_points
return None
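The "hit quadratic surface" branch above solves for where the ray meets the infinite cylinder hull x^2 + y^2 = r^2. Substituting o + t d into that equation gives an ordinary quadratic in t, which this self-contained sketch solves directly (the name `infinite_cylinder_hits` is introduced here; end caps and the finite length are deliberately ignored):

```python
import numpy as np

def infinite_cylinder_hits(origin, direction, radius):
    # Substitute the ray o + t d into x^2 + y^2 = r^2 and solve
    # the resulting quadratic a t^2 + b t + c = 0 for t >= 0.
    ox, oy = origin[0], origin[1]
    dx, dy = direction[0], direction[1]
    a = dx * dx + dy * dy
    if a == 0.0:
        return []                      # ray parallel to the axis
    b = 2.0 * (ox * dx + oy * dy)
    c = ox * ox + oy * oy - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return []                      # ray misses the hull entirely
    sq = np.sqrt(disc)
    roots = ((-b - sq) / (2 * a), (-b + sq) / (2 * a))
    return sorted(float(t) for t in roots if t >= 0.0)

print(infinite_cylinder_hits([-5.0, 0.0, 0.5], [1.0, 0.0, 0.0], 1.0))  # -> [4.0, 6.0]
```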
class Convex(object):
"""docstring for Convex"""
def __init__(self, points):
super(Convex, self).__init__()
self.points = points
verts, triangles = qhull3d(points)
self.faces = [Polygon([verts[a], verts[b], verts[c]]) for a, b, c in triangles]
def on_surface(self, point):
for face in self.faces:
if face.on_surface(point):
return True
return False
def surface_normal(self, ray, acute=False):
for face in self.faces:
if face.on_surface(ray.position):
normal = face.surface_normal(ray, acute=acute)
if angle(normal , ray.direction) > np.pi/2:
normal = normal * -1
return normal
assert("Have not found the surface normal for this ray. Are you sure the ray is on the surface of this object?")
def surface_identifier(self, surface_point, assert_on_surface=True):
return "Convex"
def intersection(self, ray):
points = []
for face in self.faces:
pt = face.intersection(ray)
if pt != None:
points.append(np.array(pt[0]))
if len(points) > 0:
return points
return None
def contains(self, point):
ray = Ray(position=point, direction=norm(np.random.random(3)))
hit_counter = 0
for face in self.faces:
if face.on_surface(ray.position):
return False
pt = face.intersection(ray)
if pt != None:
hit_counter = hit_counter + 1
even_or_odd = hit_counter % 2
if even_or_odd == 0:
return False
return True
def centroid(self):
"""Credit:
http://orion.math.iastate.edu:80/burkardt/c_src/geometryc/geometryc.html
Returns the 'centroid' of the Convex polynomial.
"""
raise NotImplementedError("The centroid method of the Convex class is not yet implemented.")
#area = 0.0;
#for ( i = 0; i < n - 2; i++ ) {
#areat = triangle_area_3d ( x[i], y[i], z[i], x[i+1],
# y[i+1], z[i+1], x[n-1], y[n-1], z[n-1] );
#
#area = area + areat;
#*cx = *cx + areat * ( x[i] + x[i+1] + x[n-1] ) / 3.0;
#*cy = *cy + areat * ( y[i] + y[i+1] + y[n-1] ) / 3.0;
#*cz = *cz + areat * ( z[i] + z[i+1] + z[n-1] ) / 3.0;
#
#}
#
#*cx = *cx / area;
#*cy = *cy / area;
#*cz = *cz / area;
#
if __name__ == "__main__":
import doctest
doctest.testmod()
# Copyright 2015 Zoosk, Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
class EnvironmentSettings(object):
    """Read-only accessor for the ZENV_* environment variables."""
#: The path to the build properties file
buildprops = None
#: The path to the common properties file
buildprops_common = None
#: The build command used to start a build in this checkout.
build_command = None
#: The command that runs on completion of a build.
complete_command = None
#: The path to the current workspace.
current_work = None
#: The IP address of the database box.
dbip = None
#: The dev ID for this checkout.
devid = None
#: The 3-digit, 0-padded devid
devid_padded = None
#: The hostname of the dev server for this checkout.
devserver = None
#: The command that runs on build failure.
failed_command = None
#: Whether or not the user is in ZEnv.
initialized = False
#: The user's LDAP username.
ldap_username = None
#: The path to the root of the local deploy dir.
local_deploy_dir = None
#: The root of the ZEnv checkout.
root = None
#: The directories to rsync after a build completes.
rsync_directories = None
#: The path to the deployed version of this checkout.
serverdir = None
#: The path to the global settings file.
settings = None
#: The path to the workspace folder that contains the checkouts.
workspace = None
#: The name of the checkout settings file.
workspace_settings = None
# Note that other properties will be set on this class based on the actual environment variables that you declare.
def __init__(self):
env_vars = os.environ
# Load all the vars
for key, value in env_vars.iteritems():
if key.startswith('ZENV_'):
attr_name = key[5:].lower()
setattr(self, attr_name, value)
# Add data types to non-string properties
self.initialized = (self.initialized == '1')
if self.rsync_directories is not None:
self.rsync_directories = self.rsync_directories.split(' ')
else:
self.rsync_directories = []
if self.buildprops_common == '':
self.buildprops_common = None
self._env_loaded = True
def __setattr__(self, key, value):
""" Override __setattr__ to make all properties read-only """
if hasattr(self, '_env_loaded'):
raise Exception('Properties of EnvironmentSettings objects are read-only')
else:
self.__dict__[key] = value
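The ZENV_-prefix loading and freeze-after-init behaviour above can be sketched standalone (`FrozenSettings` is a hypothetical stand-in, written with `.items()` so the sketch also runs on Python 3):

```python
class FrozenSettings(object):
    """Hypothetical stand-in showing the same pattern as EnvironmentSettings:
    copy ZENV_-prefixed variables in, then make every attribute read-only."""
    def __init__(self, environ):
        for key, value in environ.items():  # .iteritems() on Python 2
            if key.startswith('ZENV_'):
                # e.g. ZENV_DEVID=7 becomes self.devid == '7'
                self.__dict__[key[5:].lower()] = value
        self._env_loaded = True  # the last write that __setattr__ allows

    def __setattr__(self, key, value):
        # Once _env_loaded exists, every further assignment is rejected.
        if hasattr(self, '_env_loaded'):
            raise Exception('read-only after load')
        self.__dict__[key] = value

s = FrozenSettings({'ZENV_DEVID': '7', 'PATH': '/usr/bin'})
assert s.devid == '7' and not hasattr(s, 'path')
```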
| 28.88785 | 118 | 0.659657 | 414 | 3,091 | 4.823672 | 0.415459 | 0.056084 | 0.031547 | 0.042063 | 0.072108 | 0.024036 | 0 | 0 | 0 | 0 | 0 | 0.005331 | 0.271757 | 3,091 | 106 | 119 | 29.160377 | 0.88183 | 0.516014 | 0 | 0.05 | 0 | 0 | 0.050206 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.025 | 0 | 0.575 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
ecf908c2677c35a6ed13615d3ef45501f7f7362a | 841 | py | Python | brownbags/urls.py | openkawasaki/brownbag-django | ecdd4d2233a77922ead14afcaec289d4a0f43a1b | [
"MIT"
] | 2 | 2020-04-18T12:36:00.000Z | 2020-07-06T03:32:42.000Z | brownbags/urls.py | openkawasaki/brownbag-django | ecdd4d2233a77922ead14afcaec289d4a0f43a1b | [
"MIT"
] | 16 | 2020-04-12T13:24:26.000Z | 2020-04-12T15:54:40.000Z | brownbags/urls.py | openkawasaki/brownbag-django | ecdd4d2233a77922ead14afcaec289d4a0f43a1b | [
"MIT"
] | 3 | 2020-04-13T13:56:02.000Z | 2020-07-06T03:32:51.000Z | from django.urls import include, path
from . import views
from django.views.generic.base import RedirectView
from rest_framework import routers
from . import apis
# Application namespace
app_name = 'brownbags'
urlpatterns = [
path('', views.index, name='index'),
path('edit/', views.edit, name='edit'),
path('api/v1.0/shop/list/', apis.shop_list.as_view(), name='apis_shop_list'),
path('api/v1.0/shop/', apis.shop.as_view(), name='apis_shop'),
#path('api-auth/', include('rest_framework.urls', namespace='rest_framework'))
]
from .apis import ShopViewSet, ImageDataViewSet
#---------------------------------------------
router = routers.DefaultRouter()
router.register('api/1.0/data/shop', ShopViewSet)
router.register('api/1.0/data/image', ImageDataViewSet)
urlpatterns += router.urls | 28.033333 | 82 | 0.692033 | 109 | 841 | 5.238532 | 0.348624 | 0.091068 | 0.059545 | 0.08056 | 0.311734 | 0.19965 | 0 | 0 | 0 | 0 | 0 | 0.010695 | 0.110583 | 841 | 30 | 83 | 28.033333 | 0.752674 | 0.160523 | 0 | 0.111111 | 0 | 0 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.388889 | 0 | 0.388889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
ecfeefcda553063068069aa70f84508ced468409 | 12,003 | py | Python | config/settings/base.py | sul-cidr/histonets-arch | 6105de90905d54db604b0606d517f53782aae16d | [
"MIT"
] | null | null | null | config/settings/base.py | sul-cidr/histonets-arch | 6105de90905d54db604b0606d517f53782aae16d | [
"MIT"
] | 19 | 2018-04-19T19:32:59.000Z | 2018-06-04T23:20:03.000Z | config/settings/base.py | sul-cidr/histonets-arch | 6105de90905d54db604b0606d517f53782aae16d | [
"MIT"
] | null | null | null | """
Base settings to build other settings files upon.
"""
import environ
ROOT_DIR = environ.Path(__file__) - 3 # (histonets/config/settings/base.py - 3 = histonets/)
APPS_DIR = ROOT_DIR.path('histonets')
env = environ.Env()
READ_DOT_ENV_FILE = env.bool('DJANGO_READ_DOT_ENV_FILE', default=False)
if READ_DOT_ENV_FILE:
# OS environment variables take precedence over variables from .env
env.read_env(str(ROOT_DIR.path('.env')))
# GENERAL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = env.bool('DJANGO_DEBUG', False)
# Local time zone. Choices are
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# though not all of them may be available with every OS.
# In Windows, this must be set to your system time zone.
TIME_ZONE = 'UTC'
# https://docs.djangoproject.com/en/dev/ref/settings/#language-code
LANGUAGE_CODE = 'en-us'
# https://docs.djangoproject.com/en/dev/ref/settings/#site-id
SITE_ID = 1
# https://docs.djangoproject.com/en/dev/ref/settings/#use-i18n
USE_I18N = True
# https://docs.djangoproject.com/en/dev/ref/settings/#use-l10n
USE_L10N = True
# https://docs.djangoproject.com/en/dev/ref/settings/#use-tz
USE_TZ = True
# DATABASES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#databases
DATABASES = {
'default': env.db('DATABASE_URL'),
}
DATABASES['default']['ATOMIC_REQUESTS'] = True
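# `env.db()` expands a 12-factor style DATABASE_URL into the dict Django
# expects. Roughly what that parsing does (a simplified sketch, not
# django-environ's actual implementation; the engine mapping is illustrative):

```python
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse  # Python 2

def parse_database_url(url):
    # e.g. postgres://user:secret@db.example.com:5432/histonets
    parts = urlparse(url)
    engines = {'postgres': 'django.db.backends.postgresql_psycopg2',
               'mysql': 'django.db.backends.mysql',
               'sqlite': 'django.db.backends.sqlite3'}
    return {
        'ENGINE': engines.get(parts.scheme, parts.scheme),
        'NAME': parts.path.lstrip('/'),
        'USER': parts.username,
        'PASSWORD': parts.password,
        'HOST': parts.hostname,
        'PORT': parts.port,
    }
```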
# URLS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#root-urlconf
ROOT_URLCONF = 'config.urls'
# https://docs.djangoproject.com/en/dev/ref/settings/#wsgi-application
WSGI_APPLICATION = 'config.wsgi.application'
# APPS
# ------------------------------------------------------------------------------
DJANGO_APPS = [
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.sites',
'django.contrib.messages',
'django.contrib.staticfiles',
# 'django.contrib.humanize', # Handy template tags
'django.contrib.admin',
]
THIRD_PARTY_APPS = [
'crispy_forms',
'allauth',
'allauth.account',
'allauth.socialaccount',
'allauth.socialaccount.providers.box',
'webpack_loader',
'health_check',
'health_check.db',
# 'health_check.cache',
# 'health_check.storage',
# 'health_check.contrib.celery', # requires celery
# 'health_check.contrib.psutil', # disk and memory utilization; requires psutil
# 'health_check.contrib.s3boto_storage', # requires boto and S3BotoStorage backend
]
LOCAL_APPS = [
'histonets.users.apps.UsersConfig',
'histonets.apps.HistonetsConfig',
'histonets.collections.apps.CollectionsConfig',
]
# https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
INSTALLED_APPS = DJANGO_APPS + THIRD_PARTY_APPS + LOCAL_APPS
# MIGRATIONS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#migration-modules
MIGRATION_MODULES = {
'sites': 'histonets.contrib.sites.migrations'
}
# AUTHENTICATION
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#authentication-backends
AUTHENTICATION_BACKENDS = [
'django.contrib.auth.backends.ModelBackend',
'allauth.account.auth_backends.AuthenticationBackend',
]
# https://docs.djangoproject.com/en/dev/ref/settings/#auth-user-model
AUTH_USER_MODEL = 'users.User'
# https://docs.djangoproject.com/en/dev/ref/settings/#login-redirect-url
LOGIN_REDIRECT_URL = 'users:redirect'
# https://docs.djangoproject.com/en/dev/ref/settings/#login-url
LOGIN_URL = 'account_login'
# PASSWORDS
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#password-hashers
PASSWORD_HASHERS = [
# https://docs.djangoproject.com/en/dev/topics/auth/passwords/#using-argon2-with-django
'django.contrib.auth.hashers.Argon2PasswordHasher',
'django.contrib.auth.hashers.PBKDF2PasswordHasher',
'django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher',
'django.contrib.auth.hashers.BCryptSHA256PasswordHasher',
'django.contrib.auth.hashers.BCryptPasswordHasher',
]
# https://docs.djangoproject.com/en/dev/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# MIDDLEWARE
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#middleware
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
# STATIC
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#static-root
STATIC_ROOT = str(ROOT_DIR('static'))
# https://docs.djangoproject.com/en/dev/ref/settings/#static-url
STATIC_URL = '/static/'
# https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#std:setting-STATICFILES_DIRS
STATICFILES_DIRS = [
str(APPS_DIR.path('static')),
str(ROOT_DIR('assets'))
]
# https://docs.djangoproject.com/en/dev/ref/contrib/staticfiles/#staticfiles-finders
STATICFILES_FINDERS = [
'django.contrib.staticfiles.finders.FileSystemFinder',
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
]
# webpack loader
WEBPACK_LOADER = {
'DEFAULT': {
'BUNDLE_DIR_NAME': 'bundles/',
'STATS_FILE': str(ROOT_DIR('webpack-stats.json')),
}
}
# MEDIA
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#media-root
MEDIA_ROOT = str(ROOT_DIR('media'))
# https://docs.djangoproject.com/en/dev/ref/settings/#media-url
MEDIA_URL = '/media/'
# IIIF
# ------------------------------------------------------------------------------
IIIF_DIR = 'iiif' # Relative to the default storage, e.g., /media/iiif
IIIF_CANONICAL_URI_PATTERN = "{}/iiif/2/{{}}/full/max/0/default.jpg"
IIIF_CANONICAL_URI = IIIF_CANONICAL_URI_PATTERN.format(env('CANTALOUPE_SERVER', default='http://localhost'))
IIIF_CANONICAL_CONTAINER_URI = None
if env('CANTALOUPE_CONTAINER_SERVER', default=False):
IIIF_CANONICAL_CONTAINER_URI = IIIF_CANONICAL_URI_PATTERN.format(
env('CANTALOUPE_CONTAINER_SERVER', default='http://localhost')
)
IIIF_IMAGE_FORMATS = ["jpg", "jpeg", "tif", "tiff", "gif", "png"]
# TEMPLATES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#templates
TEMPLATES = [
{
# https://docs.djangoproject.com/en/dev/ref/settings/#std:setting-TEMPLATES-BACKEND
'BACKEND': 'django.template.backends.django.DjangoTemplates',
# https://docs.djangoproject.com/en/dev/ref/settings/#template-dirs
'DIRS': [
str(APPS_DIR.path('templates')),
],
'OPTIONS': {
# https://docs.djangoproject.com/en/dev/ref/settings/#template-debug
'debug': DEBUG,
# https://docs.djangoproject.com/en/dev/ref/settings/#template-loaders
# https://docs.djangoproject.com/en/dev/ref/templates/api/#loader-types
'loaders': [
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
],
# https://docs.djangoproject.com/en/dev/ref/settings/#template-context-processors
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.template.context_processors.i18n',
'django.template.context_processors.media',
'django.template.context_processors.static',
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
],
},
},
]
# http://django-crispy-forms.readthedocs.io/en/latest/install.html#template-packs
CRISPY_TEMPLATE_PACK = 'bootstrap4'
# FIXTURES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#fixture-dirs
FIXTURE_DIRS = (
str(APPS_DIR.path('fixtures')),
)
# EMAIL
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = env('DJANGO_EMAIL_BACKEND', default='django.core.mail.backends.smtp.EmailBackend')
# ADMIN
# ------------------------------------------------------------------------------
# Django Admin URL regex.
ADMIN_URL = r'^admin/'
# https://docs.djangoproject.com/en/dev/ref/settings/#admins
ADMINS = [
("""Center for Interdisciplinary Digital Research (CIDR)""", 'contact-cidr@stanford.edu'),
]
# https://docs.djangoproject.com/en/dev/ref/settings/#managers
MANAGERS = ADMINS
# Celery
# ------------------------------------------------------------------------------
INSTALLED_APPS += ['histonets.taskapp.celery.CeleryConfig']
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-broker_url
CELERY_BROKER_URL = env('CELERY_BROKER_URL', default='django://')
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
if CELERY_BROKER_URL == 'django://':
CELERY_RESULT_BACKEND = 'redis://'
else:
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-accept_content
CELERY_ACCEPT_CONTENT = ['json']
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-task_serializer
CELERY_TASK_SERIALIZER = 'json'
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_serializer
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ALWAYS_EAGER = False # set to True for emulation
CELERYD_TASK_TERMINATES_WORKER = True # custom option
CELERYD_MAX_TASKS_PER_CHILD = 1
# django-allauth
# ------------------------------------------------------------------------------
# https://django-allauth.readthedocs.io/en/latest/configuration.html
ACCOUNT_ALLOW_REGISTRATION = env.bool('DJANGO_ACCOUNT_ALLOW_REGISTRATION', True)
ACCOUNT_AUTHENTICATION_METHOD = 'username_email'
ACCOUNT_EMAIL_REQUIRED = env.bool('DJANGO_ACCOUNT_EMAIL_REQUIRED', default=True)
ACCOUNT_EMAIL_VERIFICATION = env.bool('DJANGO_ACCOUNT_EMAIL_VERIFICATION', default="mandatory")
ACCOUNT_ADAPTER = 'histonets.users.adapters.AccountAdapter'
SOCIALACCOUNT_ADAPTER = 'histonets.users.adapters.SocialAccountAdapter'
# django-compressor
# ------------------------------------------------------------------------------
# https://django-compressor.readthedocs.io/en/latest/quickstart/#installation
INSTALLED_APPS += ['compressor']
STATICFILES_FINDERS += ['compressor.finders.CompressorFinder']
COMPRESS_PRECOMPILERS = (
('text/x-sass', 'sass {infile} {outfile}'),
)
# Histonets
# ------------------------------------------------------------------------------
| 40.550676 | 108 | 0.637591 | 1,240 | 12,003 | 6.021774 | 0.245161 | 0.043391 | 0.106067 | 0.12053 | 0.325968 | 0.298112 | 0.273202 | 0.268783 | 0.184545 | 0.088925 | 0 | 0.002535 | 0.112472 | 12,003 | 295 | 109 | 40.688136 | 0.698395 | 0.466633 | 0 | 0.017647 | 0 | 0 | 0.478477 | 0.362938 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.064706 | 0.005882 | 0 | 0.005882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
ecffd858ad8cd1dedc6a42ad347e664e218138b6 | 10,706 | py | Python | groupman.py | yteraoka/googleapps-directory-tools | ea59f0602ddafcc850cf7e36f6021be0ee60c2a3 | [
"Apache-2.0"
] | 20 | 2015-02-20T04:58:17.000Z | 2020-12-30T23:43:29.000Z | groupman.py | yteraoka/googleapps-directory-tools | ea59f0602ddafcc850cf7e36f6021be0ee60c2a3 | [
"Apache-2.0"
] | 2 | 2015-03-02T14:33:25.000Z | 2017-09-20T11:23:19.000Z | groupman.py | yteraoka/googleapps-directory-tools | ea59f0602ddafcc850cf7e36f6021be0ee60c2a3 | [
"Apache-2.0"
] | 11 | 2015-03-02T14:16:01.000Z | 2021-10-03T14:28:10.000Z | #!/apps/python-2.7/bin/python
# -*- coding: utf-8 -*-
import os
import os.path
import glob
import sys
from apiclient.discovery import build
from apiclient.errors import HttpError
import httplib2
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client import tools
import argparse
import simplejson as json
import pprint
import codecs
import yaml
import re
from termcolor import colored
from const import *
from utils import *
GROUP_PARAMS = ['name', 'description', 'aliases', 'allowExternalMembers',
'allowGoogleCommunication',
'allowWebPosting', 'archiveOnly', 'customReplyTo',
'includeInGlobalAddressList', 'isArchived',
'maxMessageBytes', 'membersCanPostAsTheGroup',
'messageDisplayFont', 'messageModerationLevel',
'primaryLanguage', 'replyTo',
'sendMessageDenyNotification', 'showInGroupDirectory',
'spamModerationLevel', 'whoCanContactOwner',
'whoCanInvite', 'whoCanJoin', 'whoCanLeaveGroup',
'whoCanPostMessage', 'whoCanViewGroup',
'whoCanViewMembership']
class GaService(object):
def __init__(self, cred_path = CREDENTIALS_PATH):
storage = Storage(cred_path)
credentials = storage.get()
if credentials is None or credentials.invalid:
sys.exit(1)
http = httplib2.Http()
http = credentials.authorize(http)
sv1 = build('admin', 'directory_v1', http=http)
sv2 = build('groupssettings', 'v1', http=http)
self.service = {}
self.service['group'] = sv1.groups()
self.service['member'] = sv1.members()
self.service['settings'] = sv2.groups()
def group_sv(self):
return self.service['group']
def member_sv(self):
return self.service['member']
def settings_sv(self):
return self.service['settings']
def list_local_groups(self, domain, dir):
groups = []
for f in glob.glob("%s/*@%s.yml" % (dir, domain)):
email = os.path.splitext(os.path.basename(f))[0]
group_obj = GaGroup()
group_obj.set_group_key(email)
groups.append(group_obj)
return groups
def list_cloud_groups(self, domain):
groups = []
pageToken = None
while True:
params = { 'domain': domain }
if pageToken:
params['pageToken'] = pageToken
r = self.service['group'].list(**params).execute()
if r.has_key('groups'):
for group in r['groups']:
group_obj = GaGroup()
group_obj.set_group_key(group['email'])
groups.append(group_obj)
if r.has_key('nextPageToken'):
pageToken = r['nextPageToken']
else:
break
return groups
class GaGroup(object):
def __init__(self):
self.local_dir = '.'
self.local = {}
self.cloud = {}
self.group_key = None
def set_group_key(self, group_key):
self.group_key = group_key
def set_local_dir(self, local_dir):
self.local_dir = local_dir
def group_key(self):
return self.group_key
def load_cloud(self, sv):
r = sv.settings_sv().get(groupUniqueId=self.group_key).execute()
self.cloud = r
members = self.load_cloud_member(sv)
self.cloud['members'] = []
self.cloud['owners'] = []
self.cloud['managers'] = []
for member in members:
if member['role'] == 'MEMBER':
self.cloud['members'].append(member['email'])
elif member['role'] == 'MANAGER':
self.cloud['managers'].append(member['email'])
elif member['role'] == 'OWNER':
self.cloud['owners'].append(member['email'])
self.cloud['members'].sort()
self.cloud['owners'].sort()
self.cloud['managers'].sort()
r = sv.group_sv().get(groupKey=self.group_key).execute()
if r.has_key('aliases'):
self.cloud['aliases'] = r['aliases']
def load_cloud_member(self, sv):
members = []
pageToken = None
while True:
params = { 'groupKey': self.group_key }
if pageToken:
params['pageToken'] = pageToken
r = sv.member_sv().list(**params).execute()
if r.has_key('members'):
for member in r['members']:
members.append(member)
if r.has_key('nextPageToken'):
pageToken = r['nextPageToken']
else:
break
return members
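The nextPageToken loop above is the standard Google Directory API pagination idiom; stripped of the API client it reduces to the following (`fetch_page` is a hypothetical stand-in for the `sv.member_sv().list(...).execute()` call):

```python
def paginate(fetch_page):
    """Generic sketch of the pageToken loop used in load_cloud_member."""
    items, token = [], None
    while True:
        page = fetch_page(token)
        items.extend(page.get('members', []))
        token = page.get('nextPageToken')
        if not token:
            break
    return items

# Fake two-page response for illustration:
pages = {None: {'members': [1, 2], 'nextPageToken': 't'},
         't': {'members': [3]}}
print(paginate(lambda tok: pages[tok]))  # -> [1, 2, 3]
```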
def dump_data(self, data, stream):
stream.write("email: %s\n" % data['email'])
for key in GROUP_PARAMS:
if data.has_key(key):
if key in ['name', 'description']:
stream.write("%s: \"%s\"\n" % (key, re.sub(r'"', '\\"', data[key]).encode('utf-8')))
elif key in ['maxMessageBytes']:
stream.write("%s: %s\n" % (key, data[key]))
elif key in ['aliases']:
if len(data[key]):
stream.write("%s:\n" % key)
for val in data[key]:
stream.write(" - %s\n" % val)
else:
stream.write("%s: []\n" % key)
else:
stream.write("%s: \"%s\"\n" % (key, data[key]))
if len(data['members']):
stream.write("members:\n")
for member in data['members']:
stream.write(" - %s\n" % member)
else:
stream.write("members: []\n")
if len(data['managers']):
stream.write("managers:\n")
for member in data['managers']:
stream.write(" - %s\n" % member)
else:
stream.write("managers: []\n")
if len(data['owners']):
stream.write("owners:\n")
for member in data['owners']:
stream.write(" - %s\n" % member)
else:
stream.write("owners: []\n")
def dump_cloud(self):
self.dump_data(self.cloud, sys.stdout)
def local_file(self):
file = "%s/%s.yml" % (self.local_dir, self.group_key)
return file
def export(self):
f = open(self.local_file(), 'w')
self.dump_data(self.cloud, f)
f.close()
def load_local(self):
file = self.local_file()
if os.path.exists(file):
self.local = yaml.load(open(file).read().decode('utf-8'))
def diff(self):
if not self.local.has_key('name'):
self.load_local()
for key in GROUP_PARAMS:
if self.local.has_key(key) and self.cloud.has_key(key):
if self.local[key] != self.cloud[key]:
print colored("-%s: %s (cloud)" % (key, self.cloud[key]), 'red')
print colored("+%s: %s (local)" % (key, self.local[key]), 'green')
elif self.local.has_key(key):
print colored("+%s: %s (local)" % (key, self.local[key]), 'green')
elif self.cloud.has_key(key):
print colored("-%s: %s (cloud)" % (key, self.cloud[key]), 'red')
for key in ['members', 'managers', 'owners']:
only_cloud = [x for x in self.cloud[key] if x not in self.local[key]]
only_local = [x for x in self.local[key] if x not in self.cloud[key]]
if len(only_cloud) or len(only_local):
print "%s:" % key
for x in only_cloud:
print colored("- - %s (cloud)" % x, 'red')
for x in only_local:
print colored("+ - %s (local)" % x, 'green')
def apply(self, sv):
if not self.local.has_key('name'):
self.load_local()
body = {}
update_keys = []
for key in GROUP_PARAMS:
if key not in ['name', 'description', 'aliases']:
if self.cloud[key] != self.local[key]:
body[key] = self.local[key]
if len(body) > 0:
r = sv.settings_sv().update(groupUniqueId=self.group_key, body=body).execute()
print "updated"
else:
print "no changes"
def csv(self):
if not self.local.has_key('name'):
self.load_local()
description = re.sub(r'\s*\[sateraito.*$', '', self.local['description'])
return '"IU","%s","%s","%s","%s","%s"' % (self.local['email'],
self.local['name'],
','.join(self.local['members']),
','.join(self.local['owners']),
re.sub(r'"', '""', description))
def csv_header():
return '"command","email","name","members","owners","comment"'
def main():
parser = argparse.ArgumentParser()
parser.add_argument('operation',
choices=['show', 'diff', 'export', 'apply', 'csv'],
help='operation')
parser.add_argument('targets', nargs='+', help='domain or email list')
parser.add_argument('--dir', help='local data directory', default='.')
parser.add_argument('--encoding',
choices=['utf-8', 'sjis'],
help='csv output encoding',
default='utf-8')
args = parser.parse_args()
sv = GaService()
groups = []
for target in args.targets:
if target.find('@') >= 0:
g = GaGroup()
g.set_group_key(target)
groups.append(g)
else:
if args.operation == 'csv':
groups.extend(sv.list_local_groups(target, args.dir))
else:
groups.extend(sv.list_cloud_groups(target))
if args.operation == 'csv':
print csv_header()
for group in groups:
group.set_local_dir(args.dir)
if args.operation != 'csv':
print group.group_key
group.load_cloud(sv)
if args.operation == 'show':
group.dump_cloud()
elif args.operation == 'export':
group.export()
elif args.operation == 'diff':
group.diff()
elif args.operation == 'apply':
group.apply(sv)
elif args.operation == 'csv':
print group.csv().encode(args.encoding)
if __name__ == '__main__':
main()
| 35.217105 | 104 | 0.519055 | 1,167 | 10,706 | 4.658098 | 0.168809 | 0.041391 | 0.019868 | 0.014349 | 0.280905 | 0.180648 | 0.128403 | 0.116078 | 0.075975 | 0.075975 | 0 | 0.003244 | 0.337848 | 10,706 | 303 | 105 | 35.333333 | 0.763577 | 0.00467 | 0 | 0.194656 | 0 | 0 | 0.141449 | 0.019242 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.072519 | null | null | 0.049618 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a60183c4a0d01fe278e2d3f101040935e10e0734 | 3,716 | py | Python | icekit/utils/search/search_indexes.py | ic-labs/django-icekit | c507ea5b1864303732c53ad7c5800571fca5fa94 | [
"MIT"
] | 52 | 2016-09-13T03:50:58.000Z | 2022-02-23T16:25:08.000Z | icekit/utils/search/search_indexes.py | ic-labs/django-icekit | c507ea5b1864303732c53ad7c5800571fca5fa94 | [
"MIT"
] | 304 | 2016-08-11T14:17:30.000Z | 2020-07-22T13:35:18.000Z | icekit/utils/search/search_indexes.py | ic-labs/django-icekit | c507ea5b1864303732c53ad7c5800571fca5fa94 | [
"MIT"
] | 12 | 2016-09-21T18:46:35.000Z | 2021-02-15T19:37:50.000Z | from django.utils.text import capfirst
from easy_thumbnails.exceptions import InvalidImageFormatError
from easy_thumbnails.files import get_thumbnailer
from haystack import indexes
from haystack.utils import get_model_ct
# Doesn't extend `indexes.Indexable` to avoid auto-detection for 'Search In'
class AbstractLayoutIndex(indexes.SearchIndex):
"""
A search index for a publishable polymorphic model that implements
ListableMixin and LayoutFieldMixin.
Subclasses will need to mix in `indexes.Indexable` and implement
`get_model(self)`. They may need to override the `text` field to specify
a different template name.
Derived classes must override the `get_model()` method to return the
specific class (not an instance) that the search index will use.
"""
# Content
text = indexes.CharField(document=True, use_template=True, template_name="search/indexes/icekit/default.txt")
get_type = indexes.CharField()
get_title = indexes.CharField(model_attr='get_title', boost=2.0)
get_oneliner = indexes.CharField(model_attr='get_oneliner')
boosted_search_terms = indexes.CharField(model_attr="get_boosted_search_terms", boost=2.0, null=True)
# Meta
get_absolute_url = indexes.CharField(model_attr='get_absolute_url')
get_list_image_url = indexes.CharField()
modification_date = indexes.DateTimeField()
language_code = indexes.CharField()
# SEO Translations
meta_keywords = indexes.CharField()
meta_description = indexes.CharField()
meta_title = indexes.CharField()
# We add this for autocomplete.
content_auto = indexes.EdgeNgramField(model_attr='get_title')
# facets
# top-level result type
search_types = indexes.MultiValueField(faceted=True)
def index_queryset(self, using=None):
"""
Index published objects.
"""
return self.get_model().objects.published().select_related()
def full_prepare(self, obj):
"""
Make django_ct equal to the type of get_model, to make polymorphic
children show up in results.
"""
prepared_data = super(AbstractLayoutIndex, self).full_prepare(obj)
prepared_data['django_ct'] = get_model_ct(self.get_model())
return prepared_data
def prepare_get_type(self, obj):
if hasattr(obj, 'get_type'):
return unicode(obj.get_type())
return ""
def prepare_get_list_image_url(self, obj):
list_image = getattr(obj, "get_list_image", lambda: None)()
if list_image:
# resize according to the `list_image` alias
try:
return get_thumbnailer(list_image)['list_image'].url
except InvalidImageFormatError:
pass
return ""
def prepare_modification_date(self, obj):
return getattr(obj, "modification_date", None)
def prepare_language_code(self, obj):
return getattr(obj, "language_code", None)
def prepare_meta_keywords(self, obj):
return getattr(obj, "meta_keywords", None)
def prepare_meta_description(self, obj):
return getattr(obj, "meta_description", None)
def prepare_meta_title(self, obj):
return getattr(obj, "meta_title", None)
def prepare_search_types(self, obj):
r = [capfirst(obj.get_type_plural())]
if hasattr(obj, 'is_educational') and obj.is_educational():
r.append('Education')
return r
def prepare(self, obj):
data = super(AbstractLayoutIndex, self).prepare(obj)
# ensure default boost amount for field_value_factor calculations.
if not data.has_key('boost'):
data['boost'] = 1.0
return data
| 35.730769 | 113 | 0.689451 | 462 | 3,716 | 5.344156 | 0.335498 | 0.071284 | 0.024301 | 0.040502 | 0.0968 | 0.032807 | 0 | 0 | 0 | 0 | 0 | 0.002068 | 0.219322 | 3,716 | 103 | 114 | 36.07767 | 0.849018 | 0.214478 | 0 | 0.034483 | 0 | 0 | 0.087358 | 0.020241 | 0 | 0 | 0 | 0 | 0 | 1 | 0.189655 | false | 0.017241 | 0.086207 | 0.086207 | 0.758621 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
a60246e782926af300cad036aa0701ca98dcd72d | 1,655 | py | Python | CSC-291/Projects/bsearch_timer.py | FrancesCoronel/cs-hu | ecd103a525fd312146d3b6c69ee7c1452548c5e2 | [
"MIT"
] | 2 | 2016-12-05T06:15:34.000Z | 2016-12-15T10:56:50.000Z | CSC-291/Projects/bsearch_timer.py | fvcproductions/CS-HU | ecd103a525fd312146d3b6c69ee7c1452548c5e2 | [
"MIT"
] | null | null | null | CSC-291/Projects/bsearch_timer.py | fvcproductions/CS-HU | ecd103a525fd312146d3b6c69ee7c1452548c5e2 | [
"MIT"
] | 3 | 2019-04-06T01:45:54.000Z | 2020-04-24T16:55:32.000Z | '''
FVCproductions
September 18, 2014
Python
CSC291_Project3
'''
# Binary Search: (20 pts)
# Implement binary search, submit it to the course website. You must write a
# function bsearch that takes and list and an element to search for. This
# function should return the index of the element if found and -1 otherwise.
#
# Note: Pay attention to the order of the parameters specified above.
#
# Challenge (not mandatory): Learn about recursion and also write a function bsearchRecursive that
# uses recursion to find the index of the value.
from datetime import datetime
# timer - starting time defined
start_time = datetime.now()
sortedList = [1,2,3,4,5] #sorted list
target = 3 #element to search for
print "sorted list: " + str(sortedList)
print "\nelement to search for: " + str(target)
# required bsearch: iterative binary search over a sorted list
def bsearch(blist, element):
low = 0
high = len(blist)-1
while low <= high:
mid = (low+high)/2
test = blist[mid]
if element == test:
return mid
elif element < test:
high = mid -1
else:
low = mid +1
return -1
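The challenge mentioned in the header comments asks for a recursive variant; a sketch (`bsearchRecursive` is new code, not part of the original submission):

```python
def bsearchRecursive(blist, element, low=0, high=None):
    # Recursive binary search: same contract as bsearch above,
    # returns the index of the element if found and -1 otherwise.
    if high is None:
        high = len(blist) - 1
    if low > high:
        return -1
    mid = (low + high) // 2  # // is integer division on Python 2 and 3
    if blist[mid] == element:
        return mid
    elif element < blist[mid]:
        return bsearchRecursive(blist, element, low, mid - 1)
    else:
        return bsearchRecursive(blist, element, mid + 1, high)
```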
output = bsearch(sortedList, target)
print "\nbsearch - index of element: " + str(output)
finish_time = datetime.now()
difference_time = finish_time - start_time
micro = difference_time.microseconds
mili = (difference_time.microseconds)/(1000)
print "\ntime it took to complete program in microseconds: " + str(micro)
print "\ntime it took to complete program in milliseconds: " + str(mili) | 22.671233 | 100 | 0.650151 | 215 | 1,655 | 4.95814 | 0.483721 | 0.022514 | 0.030957 | 0.033771 | 0.065666 | 0.065666 | 0.065666 | 0.065666 | 0 | 0 | 0 | 0.024007 | 0.270091 | 1,655 | 73 | 101 | 22.671233 | 0.858444 | 0.346224 | 0 | 0 | 0 | 0 | 0.1801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.037037 | null | null | 0.185185 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6037da7a2fe3008ebf40e96199ded29a7213caa | 15,599 | py | Python | src/stages.py | khalidm/vcf_annotation_pipeline | 4939d1a92b87f29bacfa1521de63dd8d4283acdc | [
"BSD-3-Clause"
] | null | null | null | src/stages.py | khalidm/vcf_annotation_pipeline | 4939d1a92b87f29bacfa1521de63dd8d4283acdc | [
"BSD-3-Clause"
] | null | null | null | src/stages.py | khalidm/vcf_annotation_pipeline | 4939d1a92b87f29bacfa1521de63dd8d4283acdc | [
"BSD-3-Clause"
] | null | null | null | '''
Individual stages of the pipeline implemented as functions from
input files to output files.
The run_stage function knows everything about submitting jobs and, given
the state parameter, has full access to the state of the pipeline, such
as config, options, DRMAA and the logger.
'''
from utils import safe_make_dir
from runner import run_stage
import os
PICARD_JAR = '$PICARD_HOME/lib/picard.jar'
GATK_JAR = '$GATK_HOME/GenomeAnalysisTK.jar'
def java_command(jar_path, mem_in_gb, command_args):
'''Build a string for running a java command'''
# Bit of room between Java's max heap memory and what was requested.
# Allows for other Java memory usage, such as stack.
java_mem = mem_in_gb - 2
return 'java -Xmx{mem}g -jar {jar_path} {command_args}'.format(
jar_path=jar_path, mem=java_mem, command_args=command_args)
def run_java(state, stage, jar_path, mem, args):
command = java_command(jar_path, mem, args)
run_stage(state, stage, command)
class Stages(object):
def __init__(self, state):
self.state = state
self.reference = self.get_options('ref_grch37')
self.dbsnp_grch37 = self.get_options('dbsnp_grch37')
self.mills_grch37 = self.get_options('mills_grch37')
self.one_k_g_snps = self.get_options('one_k_g_snps')
self.one_k_g_indels = self.get_options('one_k_g_indels')
self.one_k_g_highconf_snps = self.get_options('one_k_g_highconf_snps')
self.hapmap = self.get_options('hapmap')
self.interval_grch37 = self.get_options('interval_grch37')
self.CEU_mergeGvcf = self.get_options('CEU_mergeGvcf')
self.GBR_mergeGvcf = self.get_options('GBR_mergeGvcf')
self.FIN_mergeGvcf = self.get_options('FIN_mergeGvcf')
self.SNPEFFJAR = self.get_options('SNPEFFJAR')
self.SNPEFFCONF = self.get_options('SNPEFFCONF')
def run_picard(self, stage, args):
mem = int(self.state.config.get_stage_options(stage, 'mem'))
return run_java(self.state, stage, PICARD_JAR, mem, args)
def run_gatk(self, stage, args):
mem = int(self.state.config.get_stage_options(stage, 'mem'))
return run_java(self.state, stage, GATK_JAR, mem, args)
def get_stage_options(self, stage, *options):
return self.state.config.get_stage_options(stage, *options)
def get_options(self, *options):
return self.state.config.get_options(*options)
def original_vcf(self, output):
'''Original vcf file'''
pass
def decompose_vcf(self, inputs, vcf_out, sample_id):
'''Decompose and normalize the input raw VCF file using Vt'''
vcf_in = inputs
cores = self.get_stage_options('decompose_vcf', 'cores')
        command = 'cat {vcf_in} | vt decompose -s - | ' \
                  'vt normalize -r {reference} - > {vcf_out}'.format(
                      vcf_in=vcf_in,
                      reference=self.reference,
                      vcf_out=vcf_out)
        run_stage(self.state, 'decompose_vcf', command)
def annotate_vep(self, vcf_in, vcf_out):
'''Annotate VCF file using VEP'''
cores = self.get_stage_options('annotate_vep', 'cores')
command = 'perl $VEPPATH/variant_effect_predictor.pl ' \
'--cache -i {vcf_in} --format vcf -o {vcf_out} --force_overwrite '\
'--vcf --fork {threads} --everything --offline --coding_only --no_intergenic '\
'--plugin LoF,human_ancestor_fa:~/.vep/homo_sapiens/77_GRCh37/human_ancestor.fa.gz '\
'-custom ~/reference/ExAC0.3/ExAC.r0.3.sites.vep.vcf.gz,ExAC,vcf,exact,0,AF,AC,AC_AFR,'\
'AC_AMR,AC_Adj,AC_EAS,AC_FIN,AC_Het,AC_Hom,AC_NFE,AC_OTH,AC_SAS,AF,AN,AN_AFR,AN_AMR,AN_Adj,'\
'AN_EAS,AN_FIN,AN_NFE,AN_OTH,AN_SAS'.format(vcf_in=vcf_in, vcf_out=vcf_out, threads=cores)
        # The command is a complete perl invocation; run it directly rather than
        # wrapping it in a Picard java call.
        run_stage(self.state, 'annotate_vep', command)
def annotate_snpeff(self, vcf_in, vcf_out):
'''Annotate VCF file using SnpEff'''
cores = self.get_stage_options('annotate_snpeff', 'cores')
        command = 'java -jar {snpeffjar} eff -c {snpeffconf} -canon hg19 {vcf_in} > '\
            '{vcf_out}'.format(snpeffjar=self.SNPEFFJAR, snpeffconf=self.SNPEFFCONF,
                               vcf_in=vcf_in, vcf_out=vcf_out)
        # The command is already a complete java invocation; run it directly rather
        # than wrapping it in a Picard java call.
        run_stage(self.state, 'annotate_snpeff', command)
def mark_duplicates_picard(self, bam_in, outputs):
'''Mark duplicate reads using Picard'''
dedup_bam_out, metrics_out = outputs
picard_args = 'MarkDuplicates INPUT={bam_in} OUTPUT={dedup_bam_out} ' \
'METRICS_FILE={metrics_out} VALIDATION_STRINGENCY=LENIENT ' \
'MAX_RECORDS_IN_RAM=5000000 ASSUME_SORTED=True ' \
'CREATE_INDEX=True'.format(bam_in=bam_in, dedup_bam_out=dedup_bam_out,
metrics_out=metrics_out)
self.run_picard('mark_duplicates_picard', picard_args)
def chrom_intervals_gatk(self, inputs, intervals_out):
'''Generate chromosome intervals using GATK'''
bam_in, _metrics_dup = inputs
cores = self.get_stage_options('chrom_intervals_gatk', 'cores')
gatk_args = '-T RealignerTargetCreator -R {reference} -I {bam} ' \
'--num_threads {threads} --known {mills_grch37} ' \
'--known {one_k_g_indels} -L {interval_grch37} ' \
'-o {out}'.format(reference=self.reference, bam=bam_in,
threads=cores, mills_grch37=self.mills_grch37,
one_k_g_indels=self.one_k_g_indels,
interval_grch37=self.interval_grch37,
out=intervals_out)
self.run_gatk('chrom_intervals_gatk', gatk_args)
def local_realignment_gatk(self, inputs, bam_out):
'''Local realign reads using GATK'''
target_intervals_in, bam_in = inputs
gatk_args = "-T IndelRealigner -R {reference} -I {bam} -L {interval_grch37} " \
"-targetIntervals {target_intervals} -known {mills_grch37} " \
"-known {one_k_g_indels} " \
"-o {out}".format(reference=self.reference, bam=bam_in,
mills_grch37=self.mills_grch37,
one_k_g_indels=self.one_k_g_indels,
interval_grch37=self.interval_grch37,
target_intervals=target_intervals_in,
out=bam_out)
self.run_gatk('local_realignment_gatk', gatk_args)
# XXX I'm not sure that --num_cpu_threads_per_data_thread has any benefit here
def base_recalibration_gatk(self, bam_in, outputs):
'''Base recalibration using GATK'''
csv_out, log_out = outputs
gatk_args = "-T BaseRecalibrator -R {reference} -I {bam} " \
"--num_cpu_threads_per_data_thread 4 --knownSites {dbsnp_grch37} " \
"--knownSites {mills_grch37} --knownSites {one_k_g_indels} " \
"-log {log} -o {out}".format(reference=self.reference, bam=bam_in,
mills_grch37=self.mills_grch37, dbsnp_grch37=self.dbsnp_grch37,
one_k_g_indels=self.one_k_g_indels,
log=log_out, out=csv_out)
self.run_gatk('base_recalibration_gatk', gatk_args)
# XXX I'm not sure that --num_cpu_threads_per_data_thread has any benefit here
def print_reads_gatk(self, inputs, bam_out):
'''Print reads using GATK'''
[csv_in, _log], bam_in = inputs
gatk_args = "-T PrintReads -R {reference} -I {bam} --BQSR {recal_csv} " \
"-o {out} --num_cpu_threads_per_data_thread 4".format(reference=self.reference,
bam=bam_in, recal_csv=csv_in, out=bam_out)
self.run_gatk('print_reads_gatk', gatk_args)
def call_variants_gatk(self, bam_in, vcf_out):
'''Call variants using GATK'''
gatk_args = "-T HaplotypeCaller -R {reference} --min_base_quality_score 20 " \
"--variant_index_parameter 128000 --emitRefConfidence GVCF " \
"--standard_min_confidence_threshold_for_calling 30.0 " \
"--num_cpu_threads_per_data_thread 8 " \
"--variant_index_type LINEAR " \
"--standard_min_confidence_threshold_for_emitting 30.0 " \
"-I {bam} -L {interval_list} -o {out}".format(reference=self.reference,
bam=bam_in, interval_list=self.interval_grch37, out=vcf_out)
self.run_gatk('call_variants_gatk', gatk_args)
def combine_gvcf_gatk(self, vcf_files_in, vcf_out):
'''Combine G.VCF files for all samples using GATK'''
g_vcf_files = ' '.join(['--variant ' + vcf for vcf in vcf_files_in])
gatk_args = "-T CombineGVCFs -R {reference} " \
"--disable_auto_index_creation_and_locking_when_reading_rods " \
"{g_vcf_files} -o {vcf_out}".format(reference=self.reference,
g_vcf_files=g_vcf_files, vcf_out=vcf_out)
self.run_gatk('combine_gvcf_gatk', gatk_args)
def genotype_gvcf_gatk(self, merged_vcf_in, vcf_out):
'''Genotype G.VCF files using GATK'''
cores = self.get_stage_options('genotype_gvcf_gatk', 'cores')
gatk_args = "-T GenotypeGVCFs -R {reference} " \
"--disable_auto_index_creation_and_locking_when_reading_rods " \
"--num_threads {cores} --variant {merged_vcf} --out {vcf_out} " \
"--variant {CEU_mergeGvcf} --variant {GBR_mergeGvcf} " \
"--variant {FIN_mergeGvcf}".format(reference=self.reference,
cores=cores, merged_vcf=merged_vcf_in, vcf_out=vcf_out,
CEU_mergeGvcf=self.CEU_mergeGvcf, GBR_mergeGvcf=self.GBR_mergeGvcf,
FIN_mergeGvcf=self.FIN_mergeGvcf)
self.run_gatk('genotype_gvcf_gatk', gatk_args)
def snp_recalibrate_gatk(self, genotype_vcf_in, outputs):
'''SNP recalibration using GATK'''
recal_snp_out, tranches_snp_out, snp_plots_r_out = outputs
cores = self.get_stage_options('snp_recalibrate_gatk', 'cores')
gatk_args = "-T VariantRecalibrator --disable_auto_index_creation_and_locking_when_reading_rods " \
"-R {reference} --minNumBadVariants 5000 --num_threads {cores} " \
"-resource:hapmap,known=false,training=true,truth=true,prior=15.0 {hapmap} " \
"-resource:omni,known=false,training=true,truth=true,prior=12.0 {one_k_g_snps} " \
"-resource:1000G,known=false,training=true,truth=false,prior=10.0 {one_k_g_highconf_snps} " \
"-an QD -an MQRankSum -an ReadPosRankSum -an FS -an InbreedingCoeff " \
"-input {genotype_vcf} --recal_file {recal_snp} --tranches_file {tranches_snp} " \
"-rscriptFile {snp_plots} -mode SNP".format(reference=self.reference,
cores=cores, hapmap=self.hapmap, one_k_g_snps=self.one_k_g_snps,
one_k_g_highconf_snps=self.one_k_g_highconf_snps, genotype_vcf=genotype_vcf_in,
recal_snp=recal_snp_out, tranches_snp=tranches_snp_out, snp_plots=snp_plots_r_out)
self.run_gatk('snp_recalibrate_gatk', gatk_args)
def indel_recalibrate_gatk(self, genotype_vcf_in, outputs):
'''INDEL recalibration using GATK'''
recal_indel_out, tranches_indel_out, indel_plots_r_out = outputs
cores = self.get_stage_options('indel_recalibrate_gatk', 'cores')
gatk_args = "-T VariantRecalibrator --disable_auto_index_creation_and_locking_when_reading_rods " \
"-R {reference} --minNumBadVariants 5000 --num_threads {cores} " \
"-resource:mills,known=false,training=true,truth=true,prior=12.0 {mills_grch37} " \
"-resource:1000G,known=false,training=true,truth=true,prior=10.0 {one_k_g_indels} " \
"-an MQRankSum -an ReadPosRankSum -an FS -input {genotype_vcf} -recalFile {recal_indel} " \
"-tranchesFile {tranches_indel} -rscriptFile {indel_plots} " \
" -mode INDEL".format(reference=self.reference,
cores=cores, mills_grch37=self.mills_grch37, one_k_g_indels=self.one_k_g_indels,
genotype_vcf=genotype_vcf_in, recal_indel=recal_indel_out,
tranches_indel=tranches_indel_out, indel_plots=indel_plots_r_out)
self.run_gatk('indel_recalibrate_gatk', gatk_args)
def apply_snp_recalibrate_gatk(self, inputs, vcf_out):
'''Apply SNP recalibration using GATK'''
genotype_vcf_in, [recal_snp, tranches_snp] = inputs
cores = self.get_stage_options('apply_snp_recalibrate_gatk', 'cores')
gatk_args = "-T ApplyRecalibration --disable_auto_index_creation_and_locking_when_reading_rods " \
"-R {reference} --ts_filter_level 99.5 --excludeFiltered --num_threads {cores} " \
"-input {genotype_vcf} -recalFile {recal_snp} -tranchesFile {tranches_snp} " \
"-mode SNP -o {vcf_out}".format(reference=self.reference,
cores=cores, genotype_vcf=genotype_vcf_in, recal_snp=recal_snp,
tranches_snp=tranches_snp, vcf_out=vcf_out)
self.run_gatk('apply_snp_recalibrate_gatk', gatk_args)
def apply_indel_recalibrate_gatk(self, inputs, vcf_out):
'''Apply INDEL recalibration using GATK'''
genotype_vcf_in, [recal_indel, tranches_indel] = inputs
cores = self.get_stage_options('apply_indel_recalibrate_gatk', 'cores')
gatk_args = "-T ApplyRecalibration --disable_auto_index_creation_and_locking_when_reading_rods " \
"-R {reference} --ts_filter_level 99.0 --excludeFiltered --num_threads {cores} " \
"-input {genotype_vcf} -recalFile {recal_indel} -tranchesFile {tranches_indel} " \
"-mode INDEL -o {vcf_out}".format(reference=self.reference,
cores=cores, genotype_vcf=genotype_vcf_in, recal_indel=recal_indel,
tranches_indel=tranches_indel, vcf_out=vcf_out)
self.run_gatk('apply_indel_recalibrate_gatk', gatk_args)
def combine_variants_gatk(self, inputs, vcf_out):
'''Combine variants using GATK'''
recal_snp, [recal_indel] = inputs
cores = self.get_stage_options('combine_variants_gatk', 'cores')
gatk_args = "-T CombineVariants -R {reference} --disable_auto_index_creation_and_locking_when_reading_rods " \
"--num_threads {cores} --genotypemergeoption UNSORTED --variant {recal_snp} " \
"--variant {recal_indel} -o {vcf_out}".format(reference=self.reference,
cores=cores, recal_snp=recal_snp, recal_indel=recal_indel,
vcf_out=vcf_out)
self.run_gatk('combine_variants_gatk', gatk_args)
def select_variants_gatk(self, combined_vcf, vcf_out):
'''Select variants using GATK'''
gatk_args = "-T SelectVariants -R {reference} --disable_auto_index_creation_and_locking_when_reading_rods " \
"--variant {combined_vcf} -select 'DP > 100' -o {vcf_out}".format(reference=self.reference,
combined_vcf=combined_vcf, vcf_out=vcf_out)
self.run_gatk('select_variants_gatk', gatk_args)
| 55.710714 | 118 | 0.640233 | 1,973 | 15,599 | 4.708059 | 0.156614 | 0.024545 | 0.012919 | 0.016579 | 0.526321 | 0.451071 | 0.365917 | 0.281516 | 0.250296 | 0.182689 | 0 | 0.011508 | 0.253542 | 15,599 | 279 | 119 | 55.910394 | 0.786242 | 0.017373 | 0 | 0.073892 | 0 | 0.034483 | 0.3365 | 0.129165 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.004926 | 0.014778 | null | null | 0.009852 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a603ad8008600d2c8038cd2fcf8557be817d480f | 1,959 | py | Python | chrome/test/pyautolib/generate_docs.py | nagineni/chromium-crosswalk | 5725642f1c67d0f97e8613ec1c3e8107ab53fdf8 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | 231 | 2015-01-08T09:04:44.000Z | 2021-12-30T03:03:10.000Z | chrome/test/pyautolib/generate_docs.py | 1065672644894730302/Chromium | 239dd49e906be4909e293d8991e998c9816eaa35 | [
"BSD-3-Clause"
] | 5 | 2015-03-27T14:29:23.000Z | 2019-09-25T13:23:12.000Z | chrome/test/pyautolib/generate_docs.py | 1065672644894730302/Chromium | 239dd49e906be4909e293d8991e998c9816eaa35 | [
"BSD-3-Clause"
] | 268 | 2015-01-21T05:53:28.000Z | 2022-03-25T22:09:01.000Z | #!/usr/bin/env python
# Copyright (c) 2011 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import optparse
import os
import pydoc
import shutil
import sys
def main():
parser = optparse.OptionParser()
parser.add_option('-w', '--write', dest='dir', metavar='FILE',
default=os.path.join(os.getcwd(), 'pyauto_docs'),
help=('Directory path to write all of the documentation. '
'Defaults to "pyauto_docs" in current directory.'))
parser.add_option('-p', '--pyautolib', dest='pyautolib', metavar='FILE',
default=os.getcwd(),
help='Location of pyautolib directory')
(options, args) = parser.parse_args()
if not os.path.isdir(options.dir):
os.makedirs(options.dir)
# Add these paths so pydoc can find everything
sys.path.append(os.path.join(options.pyautolib,
'../../../third_party/'))
sys.path.append(options.pyautolib)
# Get a snapshot of the current directory where pydoc will export the files
previous_contents = set(os.listdir(os.getcwd()))
pydoc.writedocs(options.pyautolib)
current_contents = set(os.listdir(os.getcwd()))
if options.dir == os.getcwd():
print 'Export complete, files are located in %s' % options.dir
return 1
new_files = current_contents.difference(previous_contents)
for file_name in new_files:
basename, extension = os.path.splitext(file_name)
if extension == '.html':
# Build the complete path
full_path = os.path.join(os.getcwd(), file_name)
existing_file_path = os.path.join(options.dir, file_name)
if os.path.isfile(existing_file_path):
os.remove(existing_file_path)
shutil.move(full_path, options.dir)
print 'Export complete, files are located in %s' % options.dir
return 0
if __name__ == '__main__':
sys.exit(main())
| 33.775862 | 78 | 0.671261 | 267 | 1,959 | 4.805243 | 0.419476 | 0.032736 | 0.031177 | 0.031177 | 0.154326 | 0.126267 | 0.082619 | 0.082619 | 0.082619 | 0.082619 | 0 | 0.003868 | 0.20827 | 1,959 | 57 | 79 | 34.368421 | 0.82334 | 0.16488 | 0 | 0.051282 | 0 | 0 | 0.181093 | 0.012891 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.128205 | null | null | 0.051282 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6053be31232e1f11d342234014c05d2dc6f686a | 1,112 | py | Python | courses/migrations/0009_enrollment_statuses.py | mitodl/mit-xpro | 981d6c87d963837f0b9ccdd996067fe81394dba4 | [
"BSD-3-Clause"
] | 10 | 2019-02-20T18:41:32.000Z | 2021-07-26T10:39:58.000Z | courses/migrations/0009_enrollment_statuses.py | mitodl/mit-xpro | 981d6c87d963837f0b9ccdd996067fe81394dba4 | [
"BSD-3-Clause"
] | 2,226 | 2019-02-20T20:03:57.000Z | 2022-03-31T11:18:56.000Z | courses/migrations/0009_enrollment_statuses.py | mitodl/mit-xpro | 981d6c87d963837f0b9ccdd996067fe81394dba4 | [
"BSD-3-Clause"
] | 4 | 2020-08-26T19:26:02.000Z | 2021-03-09T17:46:47.000Z | # Generated by Django 2.1.7 on 2019-05-24 19:17
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("courses", "0008_enrollment_company")]
operations = [
migrations.AddField(
model_name="courserunenrollment",
name="active",
field=models.BooleanField(
default=True,
help_text="Indicates whether or not this enrollment should be considered active",
),
),
migrations.AddField(
model_name="courserunenrollment",
name="edx_enrolled",
field=models.BooleanField(
default=False,
help_text="Indicates whether or not the request succeeded to enroll via the edX API",
),
),
migrations.AddField(
model_name="programenrollment",
name="active",
field=models.BooleanField(
default=True,
help_text="Indicates whether or not this enrollment should be considered active",
),
),
]
| 30.888889 | 101 | 0.571942 | 104 | 1,112 | 6.028846 | 0.528846 | 0.086124 | 0.110048 | 0.129187 | 0.553429 | 0.553429 | 0.354067 | 0.354067 | 0.354067 | 0.354067 | 0 | 0.026027 | 0.343525 | 1,112 | 35 | 102 | 31.771429 | 0.832877 | 0.040468 | 0 | 0.689655 | 1 | 0 | 0.297653 | 0.021596 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.034483 | 0 | 0.137931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a611cacdfe5dbb06f96220faa6f86ba176c65626 | 918 | py | Python | Pacote Download/Ex043_12_IMC.py | BrunoCruzIglesias/Python | 01465632a8471271e994eb4565a14a547db6578d | [
"MIT"
] | null | null | null | Pacote Download/Ex043_12_IMC.py | BrunoCruzIglesias/Python | 01465632a8471271e994eb4565a14a547db6578d | [
"MIT"
] | null | null | null | Pacote Download/Ex043_12_IMC.py | BrunoCruzIglesias/Python | 01465632a8471271e994eb4565a14a547db6578d | [
"MIT"
] | null | null | null | # Cálculo de IMC
print("Vamos calcular seu IMC?")
nome = input("Digite o seu nome: ")
peso = float(input("Olá, {}, agora digite seu peso (em Kg, Ex: 68.9): " .format(nome)))
altura = float(input("Digite sua altura (em m, Ex: 1.80): "))
media = peso / (altura * altura)
print('Seu IMC é: {:.1f}'.format(media))
if media < 18.5:
    print("Abaixo do peso")
elif 18.5 <= media < 25:  # chained comparisons close the gaps left by "media <= 24.9"
    print("Você está no seu peso ideal")  # e.g. an IMC of 24.95 previously fell through to the final else
elif 25 <= media < 30:
    print("Você está levemente acima do peso")
elif 30 <= media < 35:
    print("Obesidade Grau 1")
elif 35 <= media < 40:
    print("Obesidade Grau 2")
else:
print("Obesidade grau 3(mórbida)") | 45.9 | 117 | 0.581699 | 144 | 918 | 3.708333 | 0.444444 | 0.067416 | 0.050562 | 0.05618 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079646 | 0.261438 | 918 | 20 | 118 | 45.9 | 0.707965 | 0.236383 | 0 | 0 | 0 | 0 | 0.396552 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.444444 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
a611d310e0340c190260eaa06e81db9198238521 | 399 | py | Python | 2020/day2/password.py | DanielKillenberger/AdventOfCode | e9b40c1ae09ee4bffbbf6acca1a2778aed5f1561 | [
"MIT"
] | null | null | null | 2020/day2/password.py | DanielKillenberger/AdventOfCode | e9b40c1ae09ee4bffbbf6acca1a2778aed5f1561 | [
"MIT"
] | null | null | null | 2020/day2/password.py | DanielKillenberger/AdventOfCode | e9b40c1ae09ee4bffbbf6acca1a2778aed5f1561 | [
"MIT"
] | null | null | null | with open("input.txt", "r") as input_file:
input = input_file.read().split("\n")
passwords = list(map(lambda line: [list(map(int, line.split(" ")[0].split("-"))), line.split(" ")[1][0], line.split(" ")[2]], input))
valid = 0
for password in passwords:
count_letter = password[2].count(password[1])
if password[0][0] <= count_letter <= password[0][1]:
valid += 1
print(valid)
| 28.5 | 133 | 0.611529 | 60 | 399 | 4 | 0.45 | 0.1125 | 0.158333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035821 | 0.160401 | 399 | 13 | 134 | 30.692308 | 0.680597 | 0 | 0 | 0 | 0 | 0 | 0.0401 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.444444 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
a611ec7c6de69b5cd4939c8bb49718b4e41ba387 | 4,361 | py | Python | scripts/gen_cert.py | SUNET/sunet-auth-server | a85256917c4159fd1c8eb1f6a6ed28777b64d403 | [
"BSD-2-Clause"
] | 1 | 2021-05-26T02:37:10.000Z | 2021-05-26T02:37:10.000Z | scripts/gen_cert.py | SUNET/sunet-auth-server | a85256917c4159fd1c8eb1f6a6ed28777b64d403 | [
"BSD-2-Clause"
] | 1 | 2021-05-17T08:32:15.000Z | 2021-05-17T08:32:16.000Z | scripts/gen_cert.py | SUNET/sunet-auth-server | a85256917c4159fd1c8eb1f6a6ed28777b64d403 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
import argparse
import os
import sys
from base64 import b64encode
from datetime import datetime, timedelta
from cryptography import x509
from cryptography.hazmat.primitives import serialization, hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.x509 import NameOID
__author__ = 'lundberg'
def main(args: argparse.Namespace):
# Generate key
key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
passphrase = serialization.NoEncryption()
if args.passphrase is not None:
passphrase = serialization.BestAvailableEncryption(args.passphrase.encode())
private_bytes = key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=passphrase,
)
# Write key
if args.out is not None:
key_path = f'{args.out}{os.sep}{args.common_name}.key'
if os.path.exists(key_path):
sys.stderr.write(f'{key_path} already exists\n')
sys.exit(1)
with open(key_path, 'wb') as f:
f.write(private_bytes)
else:
sys.stdout.writelines(f'Private key for {args.common_name}:\n')
sys.stdout.writelines(private_bytes.decode('utf-8'))
sys.stdout.writelines('\n')
# Various details about who we are. For a self-signed certificate the
# subject and issuer are always the same.
subject = issuer = x509.Name(
[
x509.NameAttribute(NameOID.COUNTRY_NAME, args.country),
x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, args.province),
x509.NameAttribute(NameOID.LOCALITY_NAME, args.locality),
x509.NameAttribute(NameOID.ORGANIZATION_NAME, args.organization),
x509.NameAttribute(NameOID.COMMON_NAME, args.common_name),
]
)
alt_names = [x509.DNSName(alt_name) for alt_name in args.alt_names]
cert = (
x509.CertificateBuilder()
.subject_name(subject)
.issuer_name(issuer)
.public_key(key.public_key())
.serial_number(x509.random_serial_number())
.not_valid_before(datetime.utcnow())
.not_valid_after(datetime.utcnow() + timedelta(days=args.expires))
.add_extension(
x509.SubjectAlternativeName(alt_names),
critical=False,
# Sign our certificate with our private key
)
.sign(key, hashes.SHA256())
)
public_bytes = cert.public_bytes(serialization.Encoding.PEM)
# Write certificate
if args.out is not None:
cert_path = f'{args.out}{os.sep}{args.common_name}.crt'
if os.path.exists(cert_path):
sys.stderr.write(f'{cert_path} already exists\n')
sys.exit(1)
with open(cert_path, 'wb') as f:
f.write(public_bytes)
else:
sys.stdout.writelines(f'Certificate for {args.common_name}:\n')
sys.stdout.writelines(public_bytes.decode('utf-8'))
sys.stdout.writelines('\n')
# Print additional info
sys.stdout.writelines('cert#S256 fingerprint:\n')
sys.stdout.writelines(b64encode(cert.fingerprint(algorithm=SHA256())).decode('utf-8'))
sys.stdout.writelines('\n')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Generate key and cert')
parser.add_argument('--country', '-c', default='SE', help='country (default: SE)', type=str)
parser.add_argument('--province', '-p', default='Stockholm', help='province (default: Stockholm)', type=str)
parser.add_argument('--locality', '-l', default='Stockholm', help='locality (default: Stockholm)', type=str)
parser.add_argument('--organization', '-o', default='Sunet', help='organization (default: Sunet)', type=str)
parser.add_argument('--common-name', '-cn', help='common name', type=str, required=True)
parser.add_argument('--expires', '-e', default=365, help='expires in X days (default: 365)', type=int)
parser.add_argument('--alt-names', help='alternative names', nargs='*', default=[], type=str)
parser.add_argument('--passphrase', help='passphrase for key', nargs='?', default=None, type=str)
parser.add_argument('--out', help='output directory', nargs='?', default=None, type=str)
main(args=parser.parse_args())
| 43.178218 | 112 | 0.677368 | 536 | 4,361 | 5.376866 | 0.283582 | 0.028105 | 0.059334 | 0.03331 | 0.236294 | 0.176266 | 0.133241 | 0.095073 | 0.045108 | 0 | 0 | 0.021234 | 0.190094 | 4,361 | 100 | 113 | 43.61 | 0.794734 | 0.053657 | 0 | 0.108434 | 0 | 0 | 0.15323 | 0.029626 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012048 | false | 0.060241 | 0.120482 | 0 | 0.13253 | 0.024096 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
a611f0ca7daf79a9c49d9bcc01c2d037698b5b57 | 574 | py | Python | tests/apis/test_files.py | ninoseki/uzen | 93726f22f43902e17b22dd36142dac05171d0d84 | [
"MIT"
] | 76 | 2020-02-27T06:36:27.000Z | 2022-03-10T20:18:03.000Z | tests/apis/test_files.py | ninoseki/uzen | 93726f22f43902e17b22dd36142dac05171d0d84 | [
"MIT"
] | 33 | 2020-03-13T02:04:14.000Z | 2022-03-04T02:06:11.000Z | tests/apis/test_files.py | ninoseki/uzen | 93726f22f43902e17b22dd36142dac05171d0d84 | [
"MIT"
] | 6 | 2020-03-17T16:42:25.000Z | 2021-04-27T06:35:46.000Z | import asyncio
import pytest
from fastapi.testclient import TestClient
from app.models.script import Script
@pytest.mark.usefixtures("scripts_setup")
def test_files(client: TestClient, event_loop: asyncio.AbstractEventLoop):
first = event_loop.run_until_complete(Script.all().first())
sha256 = first.file_id
response = client.get(f"/api/files/{sha256}")
assert response.status_code == 200
def test_files_404(client: TestClient, event_loop: asyncio.AbstractEventLoop):
response = client.get("/api/files/404")
assert response.status_code == 404
| 27.333333 | 78 | 0.763066 | 75 | 574 | 5.68 | 0.493333 | 0.06338 | 0.056338 | 0.117371 | 0.230047 | 0.230047 | 0 | 0 | 0 | 0 | 0 | 0.036 | 0.12892 | 574 | 20 | 79 | 28.7 | 0.816 | 0 | 0 | 0 | 0 | 0 | 0.080139 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.153846 | false | 0 | 0.307692 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
a616479ff75877116910e729d7edffd6c5271108 | 278 | py | Python | 21. generadores.py | JSNavas/CursoPython2.7 | d1f9170dbf897b6eb729f9696a208880e33c550b | [
"MIT"
] | null | null | null | 21. generadores.py | JSNavas/CursoPython2.7 | d1f9170dbf897b6eb729f9696a208880e33c550b | [
"MIT"
] | null | null | null | 21. generadores.py | JSNavas/CursoPython2.7 | d1f9170dbf897b6eb729f9696a208880e33c550b | [
"MIT"
] | null | null | null | lista = ["bienvenido "]
ciclo = (c * 4 for c in lista)
print ciclo
print ciclo.next()
for cadena in ciclo:
print cadena
print
n = input("Factorial de: ")
def factorial(n):
i = 1
while n > 1:
i = n * i
yield i
n -= 1
for fact in factorial(n):
print fact
| 8.6875 | 30 | 0.600719 | 47 | 278 | 3.553191 | 0.425532 | 0.11976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020101 | 0.284173 | 278 | 31 | 31 | 8.967742 | 0.819095 | 0 | 0 | 0 | 0 | 0 | 0.092251 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.3125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6210dfe559293cf9f1af6eef76873ac94d75c61 | 468 | py | Python | submissions/Flanagin/myLogic.py | dysomni/aima-python | c67104e50007ec5ac2a9aa37f0cb972cb6315528 | [
"MIT"
] | null | null | null | submissions/Flanagin/myLogic.py | dysomni/aima-python | c67104e50007ec5ac2a9aa37f0cb972cb6315528 | [
"MIT"
] | null | null | null | submissions/Flanagin/myLogic.py | dysomni/aima-python | c67104e50007ec5ac2a9aa37f0cb972cb6315528 | [
"MIT"
] | null | null | null |
music = {
'kb': '''
Instrument(Flute)
Piece(Undine, Reinecke)
Piece(Carmen, Bourne)
(Instrument(x) & Piece(w, c) & Era(c, r)) ==> Program(w)
Era(Reinecke, Romantic)
Era(Bourne, Romantic)
''',
'queries': '''
Program(x)
''',
}
life = {
'kb': '''
Musician(x) ==> Stressed(x)
(Student(x) & Text(y)) ==> Stressed(x)
Musician(Heather)
''',
'queries': '''
Stressed(x)
'''
}
Examples = {
'music': music,
'life': life
} | 11.414634 | 56 | 0.519231 | 51 | 468 | 4.764706 | 0.490196 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.24359 | 468 | 41 | 57 | 11.414634 | 0.686441 | 0 | 0 | 0.259259 | 0 | 0.037037 | 0.708779 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a62f20a3565292350f831b63d64b71a76ad1a913 | 354 | py | Python | modules/seloger/constants.py | Phyks/Flatisfy | 9e495bb63ec32686e6dc6be566da9672ad014880 | [
"MIT"
] | 15 | 2017-06-07T07:17:47.000Z | 2021-04-22T21:04:32.000Z | modules/seloger/constants.py | Phyks/Flatisfy | 9e495bb63ec32686e6dc6be566da9672ad014880 | [
"MIT"
] | 15 | 2017-06-13T11:12:02.000Z | 2021-03-27T12:28:42.000Z | modules/seloger/constants.py | Phyks/Flatisfy | 9e495bb63ec32686e6dc6be566da9672ad014880 | [
"MIT"
] | 5 | 2017-09-23T20:13:34.000Z | 2021-01-16T09:17:09.000Z | from woob.capabilities.housing import POSTS_TYPES, HOUSE_TYPES
TYPES = {POSTS_TYPES.RENT: 1,
POSTS_TYPES.SALE: 2,
POSTS_TYPES.FURNISHED_RENT: 1,
POSTS_TYPES.VIAGER: 5}
RET = {HOUSE_TYPES.HOUSE: '2',
HOUSE_TYPES.APART: '1',
HOUSE_TYPES.LAND: '4',
HOUSE_TYPES.PARKING: '3',
HOUSE_TYPES.OTHER: '10'}
| 27.230769 | 62 | 0.638418 | 48 | 354 | 4.458333 | 0.479167 | 0.280374 | 0.093458 | 0.140187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.237288 | 354 | 12 | 63 | 29.5 | 0.755556 | 0 | 0 | 0 | 0 | 0 | 0.016949 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6340747773815299adf15817934a33c2b1bc670 | 314 | py | Python | Chapter 07/Chap07_Example7.71.py | Anancha/Programming-Techniques-using-Python | e80c329d2a27383909d358741a5cab03cb22fd8b | [
"MIT"
] | null | null | null | Chapter 07/Chap07_Example7.71.py | Anancha/Programming-Techniques-using-Python | e80c329d2a27383909d358741a5cab03cb22fd8b | [
"MIT"
] | null | null | null | Chapter 07/Chap07_Example7.71.py | Anancha/Programming-Techniques-using-Python | e80c329d2a27383909d358741a5cab03cb22fd8b | [
"MIT"
] | null | null | null | myl1 = []
num = int(input("Enter the number of elements: "))
for loop in range(num):
myl1.append(input(f"Enter element at index {loop} : "))
print(myl1)
print(type(myl1))
myt1 = tuple(myl1)
print(myt1)
print(type(myt1))
print("The elements of tuple object are: ")
for loop in myt1:
print(loop) | 20.933333 | 61 | 0.656051 | 49 | 314 | 4.204082 | 0.489796 | 0.131068 | 0.087379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035573 | 0.194268 | 314 | 15 | 62 | 20.933333 | 0.778656 | 0 | 0 | 0 | 0 | 0 | 0.304762 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
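A non-interactive variant of the example above: instead of reading elements with `input()`, the list is built directly and then converted with `tuple()`, which copies the elements in order and yields an object that rejects item assignment.

```python
# Build a list, convert it to a tuple, and show that the tuple is immutable.
myl1 = ["a", "b", "c"]
myt1 = tuple(myl1)

immutable = False
try:
    myt1[0] = "z"          # tuples reject item assignment
except TypeError:
    immutable = True
```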
a634a7d59658a71a40ac1013ecebacf7bdf8218a | 967 | py | Python | packages/artfx/mayaLib/tests/maya_test.py | Soulayrol/Pipeline | d0a2c834c3772198f0ca5f0ba6ea8b5e41a419e7 | [
"MIT"
] | null | null | null | packages/artfx/mayaLib/tests/maya_test.py | Soulayrol/Pipeline | d0a2c834c3772198f0ca5f0ba6ea8b5e41a419e7 | [
"MIT"
] | null | null | null | packages/artfx/mayaLib/tests/maya_test.py | Soulayrol/Pipeline | d0a2c834c3772198f0ca5f0ba6ea8b5e41a419e7 | [
"MIT"
] | null | null | null | import os
import sys
import maya.standalone
import mayaLib
print("=" * 30)
print("This is mayaLib package test")
print("=" * 30)
print("Initializing maya standalone ...")
maya.standalone.initialize(name="python")
# Create engine
maya_engine = mayaLib.MayaEngine()
print("Engine : " + str(maya_engine))
# Get engine path
print("Current file location : " + str(maya_engine.get_file_path()))
# Save
maya_engine_scene = os.path.join(os.environ["USERPROFILE"], "Desktop", "test.ma")
maya_engine.save(maya_engine_scene)
print("Current file location after save : " + maya_engine.get_file_path())
# Open as
maya_engine.open_as(maya_engine.get_file_path())
print("Open as ")
print("Current file location after open as : " + maya_engine.get_file_path())
# Open
maya_engine.open(maya_engine_scene)
print("Current file location after open : " + maya_engine.get_file_path())
print("Uninitializing maya standalone ...")
maya.standalone.uninitialize()
sys.exit(0)
| 28.441176 | 95 | 0.75181 | 139 | 967 | 5.035971 | 0.280576 | 0.185714 | 0.111429 | 0.121429 | 0.365714 | 0.335714 | 0.202857 | 0.125714 | 0 | 0 | 0 | 0.005787 | 0.106515 | 967 | 33 | 96 | 29.30303 | 0.804398 | 0.048604 | 0 | 0.086957 | 0 | 0 | 0.300875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.173913 | 0 | 0.173913 | 0.478261 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
a6359dcbaf8a3e9343cefe0d8af4e3a9e190144b | 6,094 | py | Python | examples/openmdao.examples.mdao/openmdao/examples/mdao/sellar_BLISS.py | swryan/OpenMDAO-Framework | f50d60e1a8cadac7fe03d26ffad5fb660b2a15ec | [
"Apache-2.0"
] | null | null | null | examples/openmdao.examples.mdao/openmdao/examples/mdao/sellar_BLISS.py | swryan/OpenMDAO-Framework | f50d60e1a8cadac7fe03d26ffad5fb660b2a15ec | [
"Apache-2.0"
] | null | null | null | examples/openmdao.examples.mdao/openmdao/examples/mdao/sellar_BLISS.py | swryan/OpenMDAO-Framework | f50d60e1a8cadac7fe03d26ffad5fb660b2a15ec | [
"Apache-2.0"
] | null | null | null | """
Solution of the Sellar analytical problem using classic BLISS.
(Bi-Level Integrated System Synthesis)
MDA solved with a Broyden solver.
Global sensitivity calculated by finite-differencing the MDA-coupled
system. The MDA should be replaced with solution of the GSE to fully
match the original Sobieski-Agte implementation.
"""
from openmdao.main.api import Assembly, SequentialWorkflow
from openmdao.lib.datatypes.api import Float, Array
from openmdao.lib.differentiators.finite_difference import FiniteDifference
from openmdao.lib.drivers.api import CONMINdriver, BroydenSolver, \
SensitivityDriver, FixedPointIterator
from openmdao.lib.optproblems import sellar
class SellarBLISS(Assembly):
""" Optimization of the Sellar problem using the BLISS algorithm
Disciplines coupled with FixedPointIterator.
"""
z_store = Array([0,0],dtype=Float)
x1_store = Float(0.0)
def configure(self):
""" Creates a new Assembly with this problem
Optimal Design at (1.9776, 0, 0)
Optimal Objective = 3.18339"""
# Disciplines
self.add('dis1', sellar.Discipline1())
self.add('dis2', sellar.Discipline2())
objective = '(dis1.x1)**2 + dis1.z2 + dis1.y1 + exp(-dis2.y2)'
constraint1 = 'dis1.y1 > 3.16'
constraint2 = 'dis2.y2 < 24.0'
# Top level is Fixed-Point Iteration
self.add('driver', FixedPointIterator())
self.driver.add_parameter('dis1.x1', low= 0.0, high=10.0, start=1.0)
self.driver.add_parameter(['dis1.z1','dis2.z1'], low=-10.0, high=10.0, start=5.0)
self.driver.add_parameter(['dis1.z2','dis2.z2'], low= 0.0, high=10.0,start=2.0)
self.driver.add_constraint('x1_store = dis1.x1')
self.driver.add_constraint('z_store[0] = dis1.z1')
self.driver.add_constraint('z_store[1] = dis1.z2')
self.driver.max_iteration = 50
self.driver.tolerance = .001
# Multidisciplinary Analysis
self.add('mda', BroydenSolver())
self.mda.add_parameter('dis1.y2', low=-9.e99, high=9.e99,start=0.0)
self.mda.add_constraint('dis2.y2 = dis1.y2')
self.mda.add_parameter('dis2.y1', low=-9.e99, high=9.e99,start=3.16)
self.mda.add_constraint('dis2.y1 = dis1.y1')
# Discipline 1 Sensitivity Analysis
self.add('sa_dis1', SensitivityDriver())
self.sa_dis1.workflow.add(['dis1'])
self.sa_dis1.add_parameter('dis1.x1', low= 0.0, high=10.0, fd_step=.001)
self.sa_dis1.add_constraint(constraint1)
self.sa_dis1.add_constraint(constraint2)
self.sa_dis1.add_objective(objective, name='obj')
self.sa_dis1.differentiator = FiniteDifference()
self.sa_dis1.default_stepsize = 1.0e-6
# Discipline 2 Sensitivity Analysis
# dis2 has no local parameter, so there is no need to treat it as
# a subsystem.
# System Level Sensitivity Analysis
# Note, we cheat here and run an MDA instead of solving the
# GSE equations. Have to put this on the TODO list.
self.add('ssa', SensitivityDriver())
self.ssa.workflow.add(['mda'])
self.ssa.add_parameter(['dis1.z1','dis2.z1'], low=-10.0, high=10.0)
self.ssa.add_parameter(['dis1.z2','dis2.z2'], low= 0.0, high=10.0)
self.ssa.add_constraint(constraint1)
self.ssa.add_constraint(constraint2)
self.ssa.add_objective(objective, name='obj')
self.ssa.differentiator = FiniteDifference()
self.ssa.default_stepsize = 1.0e-6
# Discipline Optimization
# (Only discipline1 has an optimization input)
self.add('bbopt1', CONMINdriver())
self.bbopt1.add_parameter('x1_store', low=0.0, high=10.0, start=1.0)
self.bbopt1.add_objective('sa_dis1.F[0] + sa_dis1.dF[0][0]*(x1_store-dis1.x1)')
self.bbopt1.add_constraint('sa_dis1.G[0] + sa_dis1.dG[0][0]*(x1_store-dis1.x1) < 0')
        # this one is technically unnecessary
self.bbopt1.add_constraint('sa_dis1.G[1] + sa_dis1.dG[1][0]*(x1_store-dis1.x1) < 0')
self.bbopt1.add_constraint('(x1_store-dis1.x1)<.5')
self.bbopt1.add_constraint('(x1_store-dis1.x1)>-.5')
self.bbopt1.iprint = 0
self.bbopt1.linobj = True
# Global Optimization
self.add('sysopt', CONMINdriver())
self.sysopt.add_parameter('z_store[0]', low=-10.0, high=10.0, start=5.0)
self.sysopt.add_parameter('z_store[1]', low=0.0, high=10.0, start=2.0)
self.sysopt.add_objective('ssa.F[0]+ ssa.dF[0][0]*(z_store[0]-dis1.z1) + ssa.dF[0][1]*(z_store[1]-dis1.z2)')
self.sysopt.add_constraint('ssa.G[0] + ssa.dG[0][0]*(z_store[0]-dis1.z1) + ssa.dG[0][1]*(z_store[1]-dis1.z2) < 0')
self.sysopt.add_constraint('ssa.G[1] + ssa.dG[1][0]*(z_store[0]-dis1.z1) + ssa.dG[1][1]*(z_store[1]-dis1.z2) < 0')
self.sysopt.add_constraint('z_store[0]-dis1.z1<.5')
self.sysopt.add_constraint('z_store[0]-dis1.z1>-.5')
self.sysopt.add_constraint('z_store[1]-dis1.z2<.5')
self.sysopt.add_constraint('z_store[1]-dis1.z2>-.5')
self.sysopt.iprint = 0
self.sysopt.linobj = True
self.driver.workflow = SequentialWorkflow()
self.driver.workflow.add(['ssa', 'sa_dis1', 'bbopt1', 'sysopt'])
if __name__ == "__main__": # pragma: no cover
import time
import math
prob = SellarBLISS()
prob.name = "top"
tt = time.time()
prob.run()
print "\n"
print "Minimum found at (%f, %f, %f)" % (prob.dis1.z1, \
prob.dis1.z2, \
prob.dis1.x1)
    print "Coupling vars: %f, %f" % (prob.dis1.y1, prob.dis2.y2)
print "Minimum objective: ", (prob.dis1.x1)**2 + prob.dis1.z2 + prob.dis1.y1 + math.exp(-prob.dis2.y2)
print "Elapsed time: ", time.time()-tt, "seconds" | 44.15942 | 122 | 0.614539 | 850 | 6,094 | 4.305882 | 0.217647 | 0.067486 | 0.017213 | 0.019672 | 0.356831 | 0.316667 | 0.231148 | 0.180601 | 0.165027 | 0.165027 | 0 | 0.067372 | 0.240072 | 6,094 | 138 | 123 | 44.15942 | 0.722954 | 0.082048 | 0 | 0 | 0 | 0.084337 | 0.199718 | 0.087149 | 0 | 0 | 0 | 0.007246 | 0 | 0 | null | null | 0 | 0.084337 | null | null | 0.084337 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a63cff634a2350a5c9d905417fc77ffe0cbb8d60 | 833 | py | Python | migrations/versions/b0c12eb8ae59_initial_migration.py | edumorris/pomodoro | cde372be1d5c37dd8221ebd40b684d07fbb472b5 | [
"MIT"
] | 1 | 2022-01-10T14:48:16.000Z | 2022-01-10T14:48:16.000Z | migrations/versions/b0c12eb8ae59_initial_migration.py | edumorris/pomodoro | cde372be1d5c37dd8221ebd40b684d07fbb472b5 | [
"MIT"
] | null | null | null | migrations/versions/b0c12eb8ae59_initial_migration.py | edumorris/pomodoro | cde372be1d5c37dd8221ebd40b684d07fbb472b5 | [
"MIT"
] | null | null | null | """Initial Migration
Revision ID: b0c12eb8ae59
Revises: 7cb850d9441c
Create Date: 2020-07-15 11:44:46.190193
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'b0c12eb8ae59'
down_revision = '7cb850d9441c'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('users', sa.Column('password_hash', sa.String(length=255), nullable=True))
op.drop_column('users', 'pass_secure')
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('users', sa.Column('pass_secure', sa.VARCHAR(length=255), autoincrement=False, nullable=True))
op.drop_column('users', 'password_hash')
# ### end Alembic commands ###
| 26.870968 | 112 | 0.704682 | 106 | 833 | 5.433962 | 0.537736 | 0.076389 | 0.072917 | 0.079861 | 0.329861 | 0.329861 | 0.229167 | 0.229167 | 0.229167 | 0.229167 | 0 | 0.076705 | 0.154862 | 833 | 30 | 113 | 27.766667 | 0.741477 | 0.358944 | 0 | 0 | 0 | 0 | 0.185111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.333333 | 0.166667 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
a63f4ba25cb878a66fcd7170945ef4034691b3de | 7,811 | py | Python | datasets/hope_edi/hope_edi.py | WojciechKusa/datasets | 1406a04c3e911cec2680d8bc513653e0cafcaaa4 | [
"Apache-2.0"
] | 10,608 | 2020-09-10T15:47:50.000Z | 2022-03-31T22:51:47.000Z | datasets/hope_edi/hope_edi.py | WojciechKusa/datasets | 1406a04c3e911cec2680d8bc513653e0cafcaaa4 | [
"Apache-2.0"
] | 2,396 | 2020-09-10T14:55:31.000Z | 2022-03-31T19:41:04.000Z | datasets/hope_edi/hope_edi.py | WojciechKusa/datasets | 1406a04c3e911cec2680d8bc513653e0cafcaaa4 | [
"Apache-2.0"
] | 1,530 | 2020-09-10T21:43:10.000Z | 2022-03-31T01:59:12.000Z | # coding=utf-8
# Copyright 2020 The HuggingFace Datasets Authors and the current dataset script contributor.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Hope Speech dataset for Equality, Diversity and Inclusion (HopeEDI)"""
import csv
import datasets
_HOMEPAGE = "https://competitions.codalab.org/competitions/27653#learn_the_details"
_CITATION = """\
@inproceedings{chakravarthi-2020-hopeedi,
title = "{H}ope{EDI}: A Multilingual Hope Speech Detection Dataset for Equality, Diversity, and Inclusion",
author = "Chakravarthi, Bharathi Raja",
booktitle = "Proceedings of the Third Workshop on Computational Modeling of People's Opinions, Personality, and Emotion's in Social Media",
month = dec,
year = "2020",
address = "Barcelona, Spain (Online)",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.peoples-1.5",
pages = "41--53",
abstract = "Over the past few years, systems have been developed to control online content and eliminate abusive, offensive or hate speech content. However, people in power sometimes misuse this form of censorship to obstruct the democratic right of freedom of speech. Therefore, it is imperative that research should take a positive reinforcement approach towards online content that is encouraging, positive and supportive contents. Until now, most studies have focused on solving this problem of negativity in the English language, though the problem is much more than just harmful content. Furthermore, it is multilingual as well. Thus, we have constructed a Hope Speech dataset for Equality, Diversity and Inclusion (HopeEDI) containing user-generated comments from the social media platform YouTube with 28,451, 20,198 and 10,705 comments in English, Tamil and Malayalam, respectively, manually labelled as containing hope speech or not. To our knowledge, this is the first research of its kind to annotate hope speech for equality, diversity and inclusion in a multilingual setting. We determined that the inter-annotator agreement of our dataset using Krippendorff{'}s alpha. Further, we created several baselines to benchmark the resulting dataset and the results have been expressed using precision, recall and F1-score. The dataset is publicly available for the research community. We hope that this resource will spur further research on encouraging inclusive and responsive speech that reinforces positiveness.",
}
"""
_DESCRIPTION = """\
A Hope Speech dataset for Equality, Diversity and Inclusion (HopeEDI) containing user-generated comments from the social media platform YouTube with 28,451, 20,198 and 10,705 comments in English, Tamil and Malayalam, respectively, manually labelled as containing hope speech or not.
"""
_LICENSE = "Creative Commons Attribution 4.0 International Licence"
_URLs = {
"english": {
"TRAIN_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1ydsOTvBZXKqcRvXawOuePrJ99slOEbkk&export=download",
"VALIDATION_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1pvpPA97kybx5IyotR9HNuqP4T5ktEtr4&export=download",
},
"tamil": {
"TRAIN_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1R1jR4DcH2UEaM1ZwDSRHdfTGvkCNu6NW&export=download",
"VALIDATION_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1cTaA6OCZUaepl5D-utPw2ZmbonPcw52v&export=download",
},
"malayalam": {
"TRAIN_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1wxwqnWGRzwvc_-ugRoFX8BPgpO3Q7sch&export=download",
"VALIDATION_DOWNLOAD_URL": "https://drive.google.com/u/0/uc?id=1uZ0U9VJQEUPQItPpTJKXH8u_6jXppvJ1&export=download",
},
}
class HopeEdi(datasets.GeneratorBasedBuilder):
"""HopeEDI dataset."""
VERSION = datasets.Version("1.0.0")
BUILDER_CONFIGS = [
datasets.BuilderConfig(
name="english", version=VERSION, description="This part of my dataset covers English dataset"
),
datasets.BuilderConfig(
name="tamil", version=VERSION, description="This part of my dataset covers Tamil dataset"
),
datasets.BuilderConfig(
            name="malayalam", version=VERSION, description="This part of my dataset covers Malayalam dataset"
),
]
def _info(self):
if self.config.name == "english": # This is the name of the configuration selected in BUILDER_CONFIGS above
features = datasets.Features(
{
"text": datasets.Value("string"),
"label": datasets.features.ClassLabel(names=["Hope_speech", "Non_hope_speech", "not-English"]),
}
)
elif self.config.name == "tamil":
features = datasets.Features(
{
"text": datasets.Value("string"),
"label": datasets.features.ClassLabel(names=["Hope_speech", "Non_hope_speech", "not-Tamil"]),
}
)
# else self.config.name == "malayalam":
else:
features = datasets.Features(
{
"text": datasets.Value("string"),
"label": datasets.features.ClassLabel(names=["Hope_speech", "Non_hope_speech", "not-malayalam"]),
}
)
return datasets.DatasetInfo(
# This is the description that will appear on the datasets page.
description=_DESCRIPTION,
# This defines the different columns of the dataset and their types
features=features, # Here we define them above because they are different between the two configurations
# If there's a common (input, target) tuple from the features,
# specify them here. They'll be used if as_supervised=True in
# builder.as_dataset.
supervised_keys=None,
# Homepage of the dataset for documentation
homepage=_HOMEPAGE,
# License for the dataset if available
license=_LICENSE,
# Citation for the dataset
citation=_CITATION,
)
def _split_generators(self, dl_manager):
"""Returns SplitGenerators."""
my_urls = _URLs[self.config.name]
train_path = dl_manager.download_and_extract(my_urls["TRAIN_DOWNLOAD_URL"])
validation_path = dl_manager.download_and_extract(my_urls["VALIDATION_DOWNLOAD_URL"])
return [
datasets.SplitGenerator(
name=datasets.Split.TRAIN,
gen_kwargs={
"filepath": train_path,
"split": "train",
},
),
datasets.SplitGenerator(
name=datasets.Split.VALIDATION,
gen_kwargs={
"filepath": validation_path,
"split": "validation",
},
),
]
def _generate_examples(self, filepath, split):
"""Generate HopeEDI examples."""
with open(filepath, encoding="utf-8") as csv_file:
csv_reader = csv.reader(
csv_file, quotechar='"', delimiter="\t", quoting=csv.QUOTE_NONE, skipinitialspace=False
)
for id_, row in enumerate(csv_reader):
text, label, dummy = row
yield id_, {"text": text, "label": label}
| 49.125786 | 1,525 | 0.673921 | 921 | 7,811 | 5.637351 | 0.386536 | 0.025039 | 0.01849 | 0.024268 | 0.297958 | 0.276772 | 0.26926 | 0.26926 | 0.255008 | 0.234592 | 0 | 0.018118 | 0.236845 | 7,811 | 158 | 1,526 | 49.436709 | 0.852877 | 0.170529 | 0 | 0.188679 | 0 | 0.103774 | 0.550824 | 0.020671 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028302 | false | 0 | 0.018868 | 0 | 0.09434 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
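A small, self-contained sketch of the parsing strategy `_generate_examples` uses above: tab-delimited rows read with `csv.QUOTE_NONE`, so embedded quote characters pass through untouched instead of being interpreted as field delimiters. The sample rows below are invented for illustration, not taken from the HopeEDI data.

```python
import csv
import io

# Two invented tab-separated rows of the shape (text, label, dummy).
sample = (
    'I hope "things" improve\tHope_speech\tnone\n'
    'random text\tNon_hope_speech\tnone\n'
)

reader = csv.reader(
    io.StringIO(sample),
    quotechar='"',
    delimiter="\t",
    quoting=csv.QUOTE_NONE,      # quotes are ordinary characters here
    skipinitialspace=False,
)
rows = [{"text": text, "label": label} for text, label, _dummy in reader]
```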
a641a9f31d2309805597aee4c338bb8ecb70c725 | 5,310 | py | Python | code_legacy/PostfixLogSummary.py | rhymeswithmogul/starttls-everywhere | cdc6cce552d44e17a3178eba987ae5d2b6a22e75 | [
"Apache-2.0"
] | 339 | 2015-01-01T06:19:34.000Z | 2021-12-10T17:24:52.000Z | code_legacy/PostfixLogSummary.py | rhymeswithmogul/starttls-everywhere | cdc6cce552d44e17a3178eba987ae5d2b6a22e75 | [
"Apache-2.0"
] | 95 | 2015-06-07T21:26:16.000Z | 2021-09-28T13:11:00.000Z | code_legacy/PostfixLogSummary.py | rhymeswithmogul/starttls-everywhere | cdc6cce552d44e17a3178eba987ae5d2b6a22e75 | [
"Apache-2.0"
] | 50 | 2015-03-18T17:41:32.000Z | 2021-03-19T07:44:54.000Z | #!/usr/bin/env python
import argparse
import collections
import os
import re
import sys
import time
import Config
TIME_FORMAT = "%b %d %H:%M:%S"
# TODO: There's more to be learned from postfix logs! Here's one sample
# observed during failures from the sender vagrant vm:
# Jun 6 00:21:31 precise32 postfix/smtpd[3648]: connect from localhost[127.0.0.1]
# Jun 6 00:21:34 precise32 postfix/smtpd[3648]: lost connection after STARTTLS from localhost[127.0.0.1]
# Jun 6 00:21:34 precise32 postfix/smtpd[3648]: disconnect from localhost[127.0.0.1]
# Jun 6 00:21:56 precise32 postfix/master[3001]: reload -- version 2.9.6, configuration /etc/postfix
# Jun 6 00:22:01 precise32 postfix/pickup[3674]: AF3B6480475: uid=0 from=<root>
# Jun 6 00:22:01 precise32 postfix/cleanup[3680]: AF3B6480475: message-id=<20140606002201.AF3B6480475@sender.example.com>
# Jun 6 00:22:01 precise32 postfix/qmgr[3673]: AF3B6480475: from=<root@sender.example.com>, size=576, nrcpt=1 (queue active)
# Jun 6 00:22:01 precise32 postfix/smtp[3682]: SSL_connect error to valid-example-recipient.com[192.168.33.7]:25: -1
# Jun 6 00:22:01 precise32 postfix/smtp[3682]: warning: TLS library problem: 3682:error:140740BF:SSL routines:SSL23_CLIENT_HELLO:no protocols available:s23_clnt.c:381:
# Jun 6 00:22:01 precise32 postfix/smtp[3682]: AF3B6480475: to=<vagrant@valid-example-recipient.com>, relay=valid-example-recipient.com[192.168.33.7]:25, delay=0.06, delays=0.03/0.03/0/0, dsn=4.7.5, status=deferred (Cannot start TLS: handshake failure)
#
# Also:
# Oct 10 19:12:13 sender postfix/smtp[1711]: 62D3F481249: to=<vagrant@valid-example-recipient.com>, relay=valid-example-recipient.com[192.168.33.7]:25, delay=0.07, delays=0.03/0.01/0.03/0, dsn=4.7.4, status=deferred (TLS is required, but was not offered by host valid-example-recipient.com[192.168.33.7])
def get_counts(input, config, earliest_timestamp):
seen_trusted = False
counts = collections.defaultdict(lambda: collections.defaultdict(int))
tls_deferred = collections.defaultdict(int)
  # Log lines for when a message is deferred for a TLS-related reason. These
  # indicate a problem that should be alerted on.
  deferred_re = re.compile("relay=([^[ ]*).* status=deferred.*TLS")
  # Log lines for when a TLS connection was successfully established. These can
  # indicate the difference between Untrusted, Trusted, and Verified certs.
  # Typical line looks like:
  # Jun 12 06:24:14 sender postfix/smtp[9045]: Untrusted TLS connection established to valid-example-recipient.com[192.168.33.7]:25: TLSv1.1 with cipher AECDH-AES256-SHA (256/256 bits)
  # ([^[]*) <--- any group of characters that is not "["
  connected_re = re.compile("([A-Za-z]+) TLS connection established to ([^[]*)")
mx_to_domain_mapping = config.get_mx_to_domain_policy_map()
timestamp = 0
for line in sys.stdin:
timestamp = time.strptime(line[0:15], TIME_FORMAT)
if timestamp < earliest_timestamp:
continue
deferred = deferred_re.search(line)
connected = connected_re.search(line)
if connected:
validation = connected.group(1)
mx_hostname = connected.group(2).lower()
if validation == "Trusted" or validation == "Verified":
seen_trusted = True
address_domains = config.get_address_domains(mx_hostname, mx_to_domain_mapping)
if address_domains:
domains_str = [ a.domain for a in address_domains ]
d = ', '.join(domains_str)
counts[d][validation] += 1
counts[d]["all"] += 1
elif deferred:
mx_hostname = deferred.group(1).lower()
tls_deferred[mx_hostname] += 1
return (counts, tls_deferred, seen_trusted, timestamp)
def print_summary(counts):
for mx_hostname, validations in counts.items():
for validation, validation_count in validations.items():
if validation == "all":
continue
      print mx_hostname, validation, float(validation_count) / validations["all"], "of", validations["all"]
if __name__ == "__main__":
arg_parser = argparse.ArgumentParser(description='Detect delivery problems'
' in Postfix log files that may be caused by security policies')
arg_parser.add_argument('-c', action="store_true", dest="cron", default=False)
arg_parser.add_argument("policy_file", nargs='?',
default=os.path.join("examples", "starttls-everywhere.json"),
help="STARTTLS Everywhere policy file")
args = arg_parser.parse_args()
config = Config.Config()
config.load_from_json_file(args.policy_file)
last_timestamp_processed = 0
timestamp_file = '/tmp/starttls-everywhere-last-timestamp-processed.txt'
if os.path.isfile(timestamp_file):
last_timestamp_processed = time.strptime(open(timestamp_file).read(), TIME_FORMAT)
(counts, tls_deferred, seen_trusted, latest_timestamp) = get_counts(sys.stdin, config, last_timestamp_processed)
with open(timestamp_file, "w") as f:
f.write(time.strftime(TIME_FORMAT, latest_timestamp))
# If not running in cron, print an overall summary of log lines seen from known hosts.
if not args.cron:
print_summary(counts)
if not seen_trusted:
print 'No Trusted connections seen! Probably need to install a CAfile.'
if len(tls_deferred) > 0:
print "Some mail was deferred due to TLS problems:"
for (k, v) in tls_deferred.iteritems():
print "%s: %s" % (k, v)
| 50.571429 | 304 | 0.728814 | 799 | 5,310 | 4.733417 | 0.364205 | 0.010576 | 0.015865 | 0.044421 | 0.176626 | 0.153358 | 0.153358 | 0.132734 | 0.124008 | 0.097039 | 0 | 0.081231 | 0.149153 | 5,310 | 104 | 305 | 51.057692 | 0.755865 | 0.431638 | 0 | 0.029412 | 0 | 0 | 0.160655 | 0.025718 | 0 | 0 | 0 | 0.009615 | 0 | 0 | null | null | 0 | 0.102941 | null | null | 0.088235 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
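A self-contained check of the two regexes `get_counts()` builds above, run against log lines of the shape shown in the file's own comments. The sample lines below are paraphrased from those comments with shortened hostnames; they are not real mail logs.

```python
import re

# Same patterns as in get_counts() above.
deferred_re = re.compile(r"relay=([^[ ]*).* status=deferred.*TLS")
connected_re = re.compile(r"([A-Za-z]+) TLS connection established to ([^[]*)")

deferred_line = (
    "postfix/smtp[1711]: 62D3F481249: to=<vagrant@example.com>, "
    "relay=example.com[192.168.33.7]:25, status=deferred "
    "(Cannot start TLS: handshake failure)"
)
connected_line = (
    "postfix/smtp[9045]: Untrusted TLS connection established "
    "to example.com[192.168.33.7]:25: TLSv1.1"
)

relay = deferred_re.search(deferred_line).group(1)          # hostname before "["
validation, mx = connected_re.search(connected_line).groups()
```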
a643184953f977a187e7f1cbf609b9b4da054af6 | 1,834 | py | Python | morph_net/framework/op_handler_decorator_test.py | nmoezzi/morph-net | fb25044ac06fc6e3b681911fc3dffe65a2b6a0a4 | [
"Apache-2.0"
] | 1 | 2019-04-25T08:23:52.000Z | 2019-04-25T08:23:52.000Z | morph_net/framework/op_handler_decorator_test.py | psyhicborg/morph-net | 0fb096d8d3b33eda9ab86c700cb6c07c9dbf10ee | [
"Apache-2.0"
] | null | null | null | morph_net/framework/op_handler_decorator_test.py | psyhicborg/morph-net | 0fb096d8d3b33eda9ab86c700cb6c07c9dbf10ee | [
"Apache-2.0"
] | 1 | 2019-04-26T14:50:13.000Z | 2019-04-26T14:50:13.000Z | """Tests for morph_net.framework.op_regularizer_decorator."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from morph_net.framework import conv2d_source_op_handler
from morph_net.framework import generic_regularizers
from morph_net.framework import op_handler_decorator
from morph_net.framework import op_regularizer_manager as orm
import numpy as np
import tensorflow as tf
class DummyDecorator(generic_regularizers.OpRegularizer):
  """A dummy decorator that multiplies the regularization vector by 0.5.
"""
def __init__(self, regularizer_object):
"""Creates an instance.
Accept an OpRegularizer that is decorated by this class.
Args:
regularizer_object: OpRegularizer to decorate.
"""
self._regularization_vector = regularizer_object.regularization_vector * 0.5
self._alive_vector = regularizer_object.alive_vector
@property
def regularization_vector(self):
return self._regularization_vector
@property
def alive_vector(self):
return self._alive_vector
class OpHandlerDecoratorTest(tf.test.TestCase):
"""Test class for OpHandlerDecorator."""
def testOpHandlerDecorator(self):
image = tf.constant(0.0, shape=[1, 17, 19, 3])
kernel = tf.ones([5, 5, 3, 3])
output = tf.nn.conv2d(image, kernel, strides=[1, 1, 1, 1], padding='SAME')
decorated_op_handler = op_handler_decorator.OpHandlerDecorator(
conv2d_source_op_handler.Conv2DSourceOpHandler(1e-3, 0), DummyDecorator)
op_slice = orm.OpSlice(output.op, orm.Slice(0, 3))
regularizer = decorated_op_handler.create_regularizer(op_slice)
self.assertAllClose(0.5 * np.ones(3), regularizer.regularization_vector)
self.assertAllClose(np.ones(3), regularizer.alive_vector)
if __name__ == '__main__':
tf.test.main()
| 30.065574 | 80 | 0.767176 | 237 | 1,834 | 5.632911 | 0.35865 | 0.040449 | 0.06367 | 0.062921 | 0.083895 | 0.043446 | 0 | 0 | 0 | 0 | 0 | 0.021033 | 0.144493 | 1,834 | 60 | 81 | 30.566667 | 0.829828 | 0.160851 | 0 | 0.0625 | 0 | 0 | 0.007995 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.125 | false | 0 | 0.28125 | 0.0625 | 0.53125 | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
a6554e28fae19027e74ff048a0c85dcdc8783d73 | 689 | py | Python | setup.py | thomasms/filecompare | 393af84939689481da27460cccb52040e6171e01 | [
"MIT"
] | null | null | null | setup.py | thomasms/filecompare | 393af84939689481da27460cccb52040e6171e01 | [
"MIT"
] | null | null | null | setup.py | thomasms/filecompare | 393af84939689481da27460cccb52040e6171e01 | [
"MIT"
] | null | null | null | from setuptools import setup
setup(name='filecompare',
version='0.1',
description='A package for comparing text and JSON files.',
url='https://github.com/thomasms/filecompare',
author='Thomas Stainer',
author_email='stainer.tom+github@gmail.com',
license='MIT',
packages=[
'filecompare',
'filecompare.compare',
'filecompare.tools',
'filecompare.utils'
],
install_requires=[],
python_requires='>=3',
scripts=['filecompare/tools/docompare.py'],
setup_requires=['pytest-runner'],
test_suite='tests.testsuite',
tests_require=['pytest'],
zip_safe=False)
| 28.708333 | 65 | 0.608128 | 69 | 689 | 5.971014 | 0.753623 | 0.07767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005803 | 0.249637 | 689 | 23 | 66 | 29.956522 | 0.791103 | 0 | 0 | 0 | 0 | 0 | 0.396226 | 0.08418 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.047619 | 0 | 0.047619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a66116aa53fbe883f4e06b84d6c1d21984e8b026 | 826 | py | Python | manage.py | Vanzct/xp | 75794c283f8680cac84edf0e184e8d1fdaed2b9c | [
"MIT"
] | null | null | null | manage.py | Vanzct/xp | 75794c283f8680cac84edf0e184e8d1fdaed2b9c | [
"MIT"
] | null | null | null | manage.py | Vanzct/xp | 75794c283f8680cac84edf0e184e8d1fdaed2b9c | [
"MIT"
] | null | null | null | # coding=utf-8
__author__ = 'Van'
import os
import sys
from flask.ext.script import Manager, Shell
# from flask.ext.migrate import Migrate, MigrateCommand
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
from app import create_app
mode = os.getenv('APP_CONFIG_MODE') or 'default'
if mode:
mode = mode.lower()
print 'current config mode %s' % mode
app = create_app(mode)
manager = Manager(app)
# manager.add_command("shell", Shell(make_context=make_shell_context))
# manager.add_command('db', MigrateCommand)
@manager.command
def test():
    """Run the unit tests."""
import unittest
tests = unittest.TestLoader().discover('tests')
unittest.TextTestRunner(verbosity=2).run(tests)
if __name__ == '__main__':
app.debug = True
app.run(host='0.0.0.0', port=5001)
# manager.run()
| 23.6 | 70 | 0.713075 | 117 | 826 | 4.820513 | 0.504274 | 0.010638 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014205 | 0.1477 | 826 | 34 | 71 | 24.294118 | 0.786932 | 0.231235 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.25 | null | null | 0.05 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a66eed49a3b39d2e0d734ff021df1302bfabe786 | 1,598 | py | Python | tests/test_times.py | Invarato/sort_in_disk_project | 8e725e683999aa6cf9db52711309b6a58099c3e2 | [
"MIT"
] | 3 | 2020-11-12T16:59:04.000Z | 2021-12-03T18:57:27.000Z | tests/test_times.py | Invarato/sort_in_disk_project | 8e725e683999aa6cf9db52711309b6a58099c3e2 | [
"MIT"
] | null | null | null | tests/test_times.py | Invarato/sort_in_disk_project | 8e725e683999aa6cf9db52711309b6a58099c3e2 | [
"MIT"
] | 1 | 2021-04-11T11:21:42.000Z | 2021-04-11T11:21:42.000Z | # -*- coding: utf-8 -*-
#
# @autor: Ramón Invarato Menéndez
# @version 1.0
from datetime import datetime
"""
Several tests
"""
count = 20000000
if __name__ == "__main__":
    start = datetime.now()
    print("[if] start: {}".format(start))

    val = True
    for _ in range(1, count):
        if val:
            v = "aaa|bbb".split("|")
        else:
            v = "ccc|ddd".split("|")

    finish = datetime.now()
    print("[if] finish: {} | diff finish-start: {}".format(finish, finish - start))

    # ===============================================
    start = datetime.now()
    print("[function] start: {}".format(start))

    def mi_func():
        "aaa|bbb".split("|")

    for _ in range(1, count):
        mi_func()

    finish = datetime.now()
    print("[function] finish: {} | diff finish-start: {}".format(finish, finish - start))

    # ===============================================
    start = datetime.now()
    print("[function arg] start: {}".format(start))

    def mi_func(ar):
        del ar
        "aaa|bbb".split("|")

    for _ in range(1, count):
        mi_func("ccc")

    finish = datetime.now()
    print("[function arg] finish: {} | diff finish-start: {}".format(finish, finish - start))

    # ===============================================
    start = datetime.now()
    print("[function return] start: {}".format(start))

    def mi_func(ar):
        return "aaa|bbb".split("|")

    for _ in range(1, count):
        mi_func("ccc")

    finish = datetime.now()
    print("[function return] finish: {} | diff finish-start: {}".format(finish, finish - start))
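The hand-rolled `datetime.now()` deltas above can also be measured with the stdlib `timeit` module, which repeats the statement for you and avoids clock-resolution noise. This is a minimal sketch of the same inline-vs-function-call comparison, not part of the original file, with the iteration count reduced for brevity:

```python
import timeit

def mi_func():
    """Same body as the benchmarked function above."""
    "aaa|bbb".split("|")

n = 100_000  # far fewer iterations than the original 20M, for a quick run
t_inline = timeit.timeit('"aaa|bbb".split("|")', number=n)
t_call = timeit.timeit(mi_func, number=n)
print("inline: {:.4f}s | function call: {:.4f}s".format(t_inline, t_call))
```

`timeit.timeit` accepts either a statement string or a callable, so the function-call overhead being probed above is measured directly.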
# duffel_api/api/booking/seat_maps.py from duffelhq/duffel-api-python (MIT license)
from ...http_client import HttpClient
from ...models import SeatMap


class SeatMapClient(HttpClient):
    """Client to interact with Seat Maps"""

    def __init__(self, **kwargs):
        self._url = "/air/seat_maps"
        super().__init__(**kwargs)

    def get(self, offer_id):
        """GET /air/seat_maps"""
        res = self.do_get(self._url, query_params={"offer_id": offer_id})
        if res is not None:
            return [SeatMap.from_json(m) for m in res["data"]]
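The `get()` flow above (build query params, call the transport, map each `"data"` entry through a model's `from_json`) can be exercised without the real Duffel API by stubbing the transport. All names below are illustrative stand-ins, not the actual `duffel_api` classes:

```python
class FakeSeatMap:
    """Stand-in for the SeatMap model: wraps the raw JSON dict."""

    def __init__(self, raw):
        self.raw = raw

    @classmethod
    def from_json(cls, json_dict):
        return cls(json_dict)


class FakeSeatMapClient:
    """Stubbed client: do_get returns canned data instead of hitting the API."""

    def __init__(self):
        self._url = "/air/seat_maps"

    def do_get(self, url, query_params=None):
        # Pretend the API echoed back one seat map for the requested offer.
        return {"data": [{"id": "sea_1", "offer_id": query_params["offer_id"]}]}

    def get(self, offer_id):
        # Same shape as SeatMapClient.get() above.
        res = self.do_get(self._url, query_params={"offer_id": offer_id})
        if res is not None:
            return [FakeSeatMap.from_json(m) for m in res["data"]]


maps = FakeSeatMapClient().get("off_123")
print(maps[0].raw["offer_id"])  # off_123
```

The stub keeps the same method shape as the client, so the list-comprehension mapping from raw JSON to model objects is what gets tested.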