hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2937dffa934589643ef47adc036dca58a6d11082 | 31 | py | Python | src/cuda_slic/__init__.py | abonawas/cuda-slic | 5264b81b12fa3049ccb9cab59257740422a13d21 | [
"Apache-2.0"
] | 6 | 2020-10-11T20:58:43.000Z | 2022-03-14T15:19:05.000Z | src/cuda_slic/__init__.py | abonawas/cuda-slic | 5264b81b12fa3049ccb9cab59257740422a13d21 | [
"Apache-2.0"
] | 1 | 2020-09-19T11:08:42.000Z | 2020-09-26T20:14:30.000Z | src/cuda_slic/__init__.py | abonawas/cuda-slic | 5264b81b12fa3049ccb9cab59257740422a13d21 | [
"Apache-2.0"
] | 1 | 2021-02-28T11:05:45.000Z | 2021-02-28T11:05:45.000Z | from .slic import slic # noqa
| 15.5 | 30 | 0.709677 | 5 | 31 | 4.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225806 | 31 | 1 | 31 | 31 | 0.916667 | 0.129032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2938ee7690820b2b52f442a5903670cac0107602 | 3,896 | py | Python | tests/e2e/test_navigation_page_non_student.py | elihschiff/Submitty | 8b980997b6f1dfcd73eb4cf4cca43398e67f96dc | [
"BSD-3-Clause"
] | 1 | 2019-02-27T21:20:14.000Z | 2019-02-27T21:20:14.000Z | tests/e2e/test_navigation_page_non_student.py | elihschiff/Submitty | 8b980997b6f1dfcd73eb4cf4cca43398e67f96dc | [
"BSD-3-Clause"
] | 2 | 2021-05-10T14:33:39.000Z | 2022-01-06T19:47:03.000Z | tests/e2e/test_navigation_page_non_student.py | elihschiff/Submitty | 8b980997b6f1dfcd73eb4cf4cca43398e67f96dc | [
"BSD-3-Clause"
] | 1 | 2020-06-25T22:45:25.000Z | 2020-06-25T22:45:25.000Z | from .base_testcase import BaseTestCase
class TestNavigationPageNonStudent(BaseTestCase):
def __init__(self, testname):
super().__init__(testname, log_in=False)
def test_instructor(self):
self.log_in(user_id="instructor", user_name="Quinn")
self.click_class('sample')
elements = self.driver.find_elements_by_class_name('course-section-heading')
self.assertEqual(6, len(elements))
self.assertEqual("future", elements[0].get_attribute('id'))
self.assertEqual(3, len(self.driver
.find_element_by_id('future-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("beta", elements[1].get_attribute('id'))
self.assertEqual(3, len(self.driver
.find_element_by_id('beta-section')
.find_elements_by_class_name('gradeable-row')))
self.assertEqual("open", elements[2].get_attribute('id'))
self.assertEqual(2, len(self.driver
.find_element_by_id('open-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("closed", elements[3].get_attribute('id'))
self.assertEqual(2, len(self.driver
.find_element_by_id('closed-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("items_being_graded", elements[4].get_attribute('id'))
self.assertEqual(6, len(self.driver
.find_element_by_id('items_being_graded-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("graded", elements[5].get_attribute('id'))
self.assertEqual(9, len(self.driver
.find_element_by_id('graded-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual(4, len(self.driver.find_element_by_class_name(
'gradeable-row').find_elements_by_class_name('course-button')))
def test_ta(self):
self.log_in(user_id="ta", user_name="Jill")
self.click_class('sample')
elements = self.driver.find_elements_by_class_name('course-section-heading')
self.assertEqual(5, len(elements))
self.assertEqual("beta", elements[0].get_attribute('id'))
self.assertEqual(3, len(self.driver
.find_element_by_id('beta-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("open", elements[1].get_attribute('id'))
self.assertEqual(2, len(self.driver
.find_element_by_id('open-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("closed", elements[2].get_attribute('id'))
self.assertEqual(2, len(self.driver
.find_element_by_id('closed-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("items_being_graded", elements[3].get_attribute('id'))
self.assertEqual(6, len(self.driver
.find_element_by_id('items_being_graded-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual("graded", elements[4].get_attribute('id'))
self.assertEqual(9, len(self.driver
.find_element_by_id('graded-section')
.find_elements_by_class_name("gradeable-row")))
self.assertEqual(3, len(self.driver.find_element_by_class_name(
'gradeable-row').find_elements_by_class_name('course-button')))
if __name__ == "__main__":
import unittest
unittest.main()
| 53.369863 | 84 | 0.601129 | 436 | 3,896 | 5.03211 | 0.133028 | 0.177758 | 0.085232 | 0.1299 | 0.863263 | 0.863263 | 0.845943 | 0.840474 | 0.839562 | 0.839107 | 0 | 0.009217 | 0.275924 | 3,896 | 72 | 85 | 54.111111 | 0.768522 | 0 | 0 | 0.569231 | 0 | 0 | 0.142197 | 0.024641 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.046154 | false | 0 | 0.030769 | 0 | 0.092308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2959fa74380a5e98bc9c3139b893878f6f0f467d | 46 | py | Python | clisops/utils/__init__.py | aulemahal/clisops | 20bbd4662b6272ec0115425f009059b85c83121e | [
"BSD-3-Clause"
] | 18 | 2020-05-19T21:22:37.000Z | 2022-02-04T08:10:21.000Z | clisops/utils/__init__.py | aulemahal/clisops | 20bbd4662b6272ec0115425f009059b85c83121e | [
"BSD-3-Clause"
] | 166 | 2020-04-22T11:04:57.000Z | 2022-03-31T11:14:21.000Z | clisops/utils/__init__.py | aulemahal/clisops | 20bbd4662b6272ec0115425f009059b85c83121e | [
"BSD-3-Clause"
] | 6 | 2020-04-02T14:30:21.000Z | 2021-12-04T03:51:12.000Z | from .common import *
from .tutorial import *
| 15.333333 | 23 | 0.73913 | 6 | 46 | 5.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 46 | 2 | 24 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
465d2242e3389ca07a33cbd1e275d2781af771cd | 4,460 | py | Python | ietf/doc/migrations/0030_author_revamp_and_extra_attributes.py | ekr/ietfdb | 8d936836b0b9ff31cda415b0a423e3f5b33ab695 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | ietf/doc/migrations/0030_author_revamp_and_extra_attributes.py | ekr/ietfdb | 8d936836b0b9ff31cda415b0a423e3f5b33ab695 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | ietf/doc/migrations/0030_author_revamp_and_extra_attributes.py | ekr/ietfdb | 8d936836b0b9ff31cda415b0a423e3f5b33ab695 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('name', '0020_formallanguagename'),
('doc', '0029_update_rfc_authors'),
]
operations = [
migrations.AddField(
model_name='dochistory',
name='words',
field=models.IntegerField(null=True, blank=True),
),
migrations.AddField(
model_name='document',
name='words',
field=models.IntegerField(null=True, blank=True),
),
migrations.AddField(
model_name='dochistory',
name='formal_languages',
field=models.ManyToManyField(help_text=b'Formal languages used in document', to='name.FormalLanguageName', blank=True),
),
migrations.AddField(
model_name='document',
name='formal_languages',
field=models.ManyToManyField(help_text=b'Formal languages used in document', to='name.FormalLanguageName', blank=True),
),
migrations.RemoveField(
model_name='dochistory',
name='authors',
),
migrations.RemoveField(
model_name='document',
name='authors',
),
migrations.AddField(
model_name='dochistoryauthor',
name='affiliation',
field=models.CharField(help_text=b'Organization/company used by author for submission', max_length=100, blank=True),
),
migrations.AddField(
model_name='dochistoryauthor',
name='country',
field=models.CharField(blank=True, help_text=b'Country used by author for submission', max_length=255),
),
migrations.RenameField(
model_name='dochistoryauthor',
old_name='author',
new_name='email',
),
migrations.AlterField(
model_name='dochistoryauthor',
name='email',
field=models.ForeignKey(blank=True, to='person.Email', help_text=b'Email address used by author for submission', null=True),
),
migrations.AddField(
model_name='dochistoryauthor',
name='person',
field=models.ForeignKey(blank=True, to='person.Person', null=True),
),
migrations.AddField(
model_name='documentauthor',
name='affiliation',
field=models.CharField(help_text=b'Organization/company used by author for submission', max_length=100, blank=True),
),
migrations.AddField(
model_name='documentauthor',
name='country',
field=models.CharField(blank=True, help_text=b'Country used by author for submission', max_length=255),
),
migrations.RenameField(
model_name='documentauthor',
old_name='author',
new_name='email',
),
migrations.AlterField(
model_name='documentauthor',
name='email',
field=models.ForeignKey(blank=True, to='person.Email', help_text=b'Email address used by author for submission', null=True),
),
migrations.AddField(
model_name='documentauthor',
name='person',
field=models.ForeignKey(blank=True, to='person.Person', null=True),
),
migrations.AlterField(
model_name='dochistoryauthor',
name='document',
field=models.ForeignKey(related_name='documentauthor_set', to='doc.DocHistory'),
),
migrations.AlterField(
model_name='dochistoryauthor',
name='order',
field=models.IntegerField(default=1),
),
migrations.RunSQL("update doc_documentauthor a inner join person_email e on a.email_id = e.address set a.person_id = e.person_id;", migrations.RunSQL.noop),
migrations.RunSQL("update doc_dochistoryauthor a inner join person_email e on a.email_id = e.address set a.person_id = e.person_id;", migrations.RunSQL.noop),
migrations.AlterField(
model_name='documentauthor',
name='person',
field=models.ForeignKey(to='person.Person'),
),
migrations.AlterField(
model_name='dochistoryauthor',
name='person',
field=models.ForeignKey(to='person.Person'),
),
]
| 38.448276 | 166 | 0.595964 | 435 | 4,460 | 5.974713 | 0.183908 | 0.069257 | 0.088496 | 0.103886 | 0.816468 | 0.816468 | 0.724894 | 0.713351 | 0.624086 | 0.624086 | 0 | 0.006944 | 0.289686 | 4,460 | 115 | 167 | 38.782609 | 0.813447 | 0.004709 | 0 | 0.862385 | 0 | 0.018349 | 0.266396 | 0.020735 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.018349 | 0 | 0.045872 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4666cfaea01ade273e666c93e968af98aa8c0522 | 59 | py | Python | lambda_assistant/mysql/__init__.py | matiasvallejosdev/py-aws-lambda-handlers | 4643042bc02e557bb4a2953118de5f4eb5320d70 | [
"Apache-2.0"
] | null | null | null | lambda_assistant/mysql/__init__.py | matiasvallejosdev/py-aws-lambda-handlers | 4643042bc02e557bb4a2953118de5f4eb5320d70 | [
"Apache-2.0"
] | null | null | null | lambda_assistant/mysql/__init__.py | matiasvallejosdev/py-aws-lambda-handlers | 4643042bc02e557bb4a2953118de5f4eb5320d70 | [
"Apache-2.0"
] | null | null | null | from .client_handler import *
from .query_handler import * | 29.5 | 30 | 0.79661 | 8 | 59 | 5.625 | 0.625 | 0.577778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135593 | 59 | 2 | 31 | 29.5 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4673e4345ba185c95a2c882c41b167234bf6fb32 | 110 | py | Python | info.py | noahjepstein/textbot | de18f929356b1a059b3cee93348d14a689497684 | [
"MIT"
] | null | null | null | info.py | noahjepstein/textbot | de18f929356b1a059b3cee93348d14a689497684 | [
"MIT"
] | null | null | null | info.py | noahjepstein/textbot | de18f929356b1a059b3cee93348d14a689497684 | [
"MIT"
] | null | null | null | MY_NUM = +19788613167
ACC_SID = "ACa67744d6b745da5a52af5e441bbba8d1"  # quoted: the bare value is not valid Python
ACC_KEY = "6d3d93eeacb9cb2d8893b3321c8e4691"  # quoted: a bare hex token is a SyntaxError
| 27.5 | 44 | 0.881818 | 9 | 110 | 10.444444 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.475248 | 0.081818 | 110 | 3 | 45 | 36.666667 | 0.455446 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d3bdc17e74d5faaf566fbf3894cdcb6c74fbc2ad | 25 | py | Python | modules/__init__.py | Infosecurity-LLC/unicon_v3 | 12009016e74328f7f90be84e334659b80240266e | [
"Apache-2.0"
] | 1 | 2022-02-04T10:01:45.000Z | 2022-02-04T10:01:45.000Z | modules/__init__.py | Infosecurity-LLC/unicon_v3 | 12009016e74328f7f90be84e334659b80240266e | [
"Apache-2.0"
] | null | null | null | modules/__init__.py | Infosecurity-LLC/unicon_v3 | 12009016e74328f7f90be84e334659b80240266e | [
"Apache-2.0"
] | 1 | 2022-02-04T10:01:46.000Z | 2022-02-04T10:01:46.000Z | from . import connectors
| 12.5 | 24 | 0.8 | 3 | 25 | 6.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d3c3c82cfc775b429e15bd3197a9dbcc97f75bef | 172 | py | Python | credentials.py | LucioZanette/vacibot | 06b738639d2592cdeda538a4e006220b5ed9be87 | [
"MIT"
] | null | null | null | credentials.py | LucioZanette/vacibot | 06b738639d2592cdeda538a4e006220b5ed9be87 | [
"MIT"
] | null | null | null | credentials.py | LucioZanette/vacibot | 06b738639d2592cdeda538a4e006220b5ed9be87 | [
"MIT"
] | null | null | null | connection_params = 'user/password@hostname/oracledbname' #credenciais do oracle DB
login_vacivida= '{"Data":{"Login":"xxxxxx","Senha":"yyyyyy"}}' #credenciais do Vacivida
| 57.333333 | 87 | 0.761628 | 20 | 172 | 6.45 | 0.8 | 0.20155 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 172 | 2 | 88 | 86 | 0.80625 | 0.273256 | 0 | 0 | 0 | 0 | 0.642276 | 0.642276 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.5 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
31aaf08b283ee4df722d57b2179dbf24d8a8adfe | 78 | py | Python | notebook/book/python/Learn-OOP-with-Python/Chapter-2/python_init_test/module_2/module_2_module_1/__init__.py | JMwill/note | 30e931f18c9ba942f5e5040b524047a996cf0c6c | [
"MIT"
] | null | null | null | notebook/book/python/Learn-OOP-with-Python/Chapter-2/python_init_test/module_2/module_2_module_1/__init__.py | JMwill/note | 30e931f18c9ba942f5e5040b524047a996cf0c6c | [
"MIT"
] | 2 | 2018-11-27T10:45:45.000Z | 2018-12-13T14:44:54.000Z | notebook/book/python/Learn-OOP-with-Python/Chapter-2/python_init_test/module_2/module_2_module_1/__init__.py | JMwill/note | 30e931f18c9ba942f5e5040b524047a996cf0c6c | [
"MIT"
] | null | null | null | from .say import say
from .module_2_module_1_module_1 import say as module_say | 39 | 57 | 0.858974 | 16 | 78 | 3.8125 | 0.4375 | 0.295082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 0.115385 | 78 | 2 | 57 | 39 | 0.84058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
31c4bbe6aa76515b8fa23edd69e4de916b89503e | 46 | py | Python | deep_nlp/__init__.py | FabianBell/deepl_framework | c62ac799f5c98f7de07796c4df6d046080f5b096 | [
"MIT"
] | 1 | 2021-05-05T10:05:50.000Z | 2021-05-05T10:05:50.000Z | deep_nlp/__init__.py | FabianBell/deepl_framework | c62ac799f5c98f7de07796c4df6d046080f5b096 | [
"MIT"
] | null | null | null | deep_nlp/__init__.py | FabianBell/deepl_framework | c62ac799f5c98f7de07796c4df6d046080f5b096 | [
"MIT"
] | null | null | null | from .framework import *
from .utils import *
| 15.333333 | 24 | 0.73913 | 6 | 46 | 5.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 46 | 2 | 25 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9efd56d50cff2cb6d29b2c6a198fe36620bddf43 | 31 | py | Python | cracking_the_coding_interview_qs/16.5/zeros.py | angelusualle/algorithms | 86286a49db2a755bc57330cb455bcbd8241ea6be | [
"Apache-2.0"
] | null | null | null | cracking_the_coding_interview_qs/16.5/zeros.py | angelusualle/algorithms | 86286a49db2a755bc57330cb455bcbd8241ea6be | [
"Apache-2.0"
] | null | null | null | cracking_the_coding_interview_qs/16.5/zeros.py | angelusualle/algorithms | 86286a49db2a755bc57330cb455bcbd8241ea6be | [
"Apache-2.0"
] | null | null | null | def zeros(n):
    # trailing zeros of n! = total factors of 5 in 1..n (5, 25, 125, ...)
    return 0 if n < 5 else n // 5 + zeros(n // 5) | 15.5 | 17 | 0.548387 | 6 | 31 | 2.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.290323 | 31 | 2 | 17 | 15.5 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
7322abe1b08a4e4aab625a53f2ce088299aaeb55 | 36 | py | Python | src/lib/stat.py | DTenore/skulpt | 098d20acfb088d6db85535132c324b7ac2f2d212 | [
"MIT"
] | 2,671 | 2015-01-03T08:23:25.000Z | 2022-03-31T06:15:48.000Z | src/lib/stat.py | wakeupmuyunhe/skulpt | a8fb11a80fb6d7c016bab5dfe3712517a350b347 | [
"MIT"
] | 972 | 2015-01-05T08:11:00.000Z | 2022-03-29T13:47:15.000Z | src/lib/stat.py | wakeupmuyunhe/skulpt | a8fb11a80fb6d7c016bab5dfe3712517a350b347 | [
"MIT"
] | 845 | 2015-01-03T19:53:36.000Z | 2022-03-29T18:34:22.000Z | import _sk_fail; _sk_fail._("stat")
| 18 | 35 | 0.75 | 6 | 36 | 3.666667 | 0.666667 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
73318c8a98ad8c2d976bbf1112691e7ffa4eb113 | 4,083 | py | Python | flextensor/baselines/unpooling1d_baseline.py | imxian/FlexTensor | 311af3362856ea1b0073404fffad42c54585c205 | [
"MIT"
] | 135 | 2020-03-15T11:28:48.000Z | 2022-03-26T00:54:32.000Z | flextensor/baselines/unpooling1d_baseline.py | imxian/FlexTensor | 311af3362856ea1b0073404fffad42c54585c205 | [
"MIT"
] | 11 | 2020-03-23T11:06:38.000Z | 2022-01-24T06:25:41.000Z | flextensor/baselines/unpooling1d_baseline.py | imxian/FlexTensor | 311af3362856ea1b0073404fffad42c54585c205 | [
"MIT"
] | 32 | 2020-03-17T05:12:59.000Z | 2022-03-26T00:54:33.000Z | import argparse
import timeit
import torch
import time
import tvm
import topi
import numpy as np
from flextensor.configs.maxunpooling1d_config import maxunpooling1d_shape
from flextensor.task import maxunpooling1d
torch.backends.cudnn.enabled = False
def pytorch_cpu(B, C, L, kernel_size, stride, padding, number=10, dev=0):
Input = torch.rand([B, C, L], dtype=torch.float32)
maxpool = torch.nn.MaxPool1d(kernel_size, stride=stride, padding=padding, return_indices=True)
Input, indices = maxpool(Input)
begin_time = time.time()
unpool = torch.nn.MaxUnpool1d(kernel_size, stride=stride, padding=padding)
for i in range(number):
output = unpool(Input, indices)
end_time = time.time()
# ms
return (end_time - begin_time) * 1e3 / number
def pytorch_cuda(B, C, L, kernel_size, stride, padding, number=10, dev=0):
Input = torch.rand([B, C, L], dtype=torch.float32).cuda("cuda:" + str(dev))
maxpool = torch.nn.MaxPool1d(kernel_size, stride=stride, padding=padding, return_indices=True).cuda("cuda:" + str(dev))
Input, indices = maxpool(Input)
begin_time = time.time()
unpool = torch.nn.MaxUnpool1d(kernel_size, stride=stride, padding=padding).cuda("cuda:" + str(dev))
for i in range(number):
output = unpool(Input, indices)
end_time = time.time()
# ms
return (end_time - begin_time) * 1e3 / number
def tvm_unpool1d_cpu(B, C, L, kernel_size, stride, padding, number=10, dev=0):
Input = torch.rand([B, C, L], dtype=torch.float32).cuda("cuda:" + str(dev))
maxpool = torch.nn.MaxPool1d(kernel_size, stride=stride, padding=padding, return_indices=True).cuda("cuda:" + str(dev))
Input, indices = maxpool(Input)
Input = Input.cpu()
indices = indices.cpu()
s, bufs = maxunpooling1d(B, C, Input.shape[2], kernel_size, stride, padding)
s = tvm.te.create_schedule(s)
ctx = tvm.cpu(dev)
f = tvm.build(s, bufs, 'llvm')
im = tvm.nd.array(Input.numpy().astype(np.float32), ctx)
fi = tvm.nd.array(indices.numpy().astype(np.float32), ctx)
in_length = Input.shape[2]
out_length = (in_length - 1) * stride - 2 * padding + kernel_size
output_shape = (B, C, out_length)
un = tvm.nd.array(np.zeros(output_shape).astype(np.float32), ctx)
start_time = time.time()
for i in range(number):
f(im, fi, un)
end_time = time.time()
return (end_time - start_time) * 1e3 / number
def tvm_unpool1d_cuda(B, C, L, kernel_size, stride, padding, number=10, dev=0):
Input = torch.rand([B, C, L], dtype=torch.float32).cuda("cuda:" + str(dev))
maxpool = torch.nn.MaxPool1d(kernel_size, stride=stride, padding=padding, return_indices=True).cuda("cuda:" + str(dev))
Input, indices = maxpool(Input)
Input = Input.cpu()
indices = indices.cpu()
s, bufs = maxunpooling1d(B, C, Input.shape[2], kernel_size, stride, padding)
s = tvm.te.create_schedule(s)
f = tvm.build(s, bufs, "cuda")
ctx = tvm.context("cuda", dev_id=dev)
im = tvm.nd.array(Input.numpy().astype(np.float32), ctx)
fi = tvm.nd.array(indices.numpy().astype(np.float32), ctx)
in_length = Input.shape[2]
out_length = (in_length - 1) * stride - 2 * padding + kernel_size
output_shape = (B, C, out_length)
un = tvm.nd.array(np.zeros(output_shape).astype(np.float32), ctx)
start_time = time.time()
for i in range(number):
f(im, fi, un)
end_time = time.time()
return (end_time - start_time) * 1e3 / number
if __name__ == "__main__":
shapes = maxunpooling1d_shape
"""warm up"""
cost = pytorch_cpu(*shapes[0])
cost = pytorch_cuda(*shapes[0])
cost = tvm_unpool1d_cpu(*shapes[0])
# cost = tvm_unpool1d_cuda(*shapes[0])
for shape in shapes:
print("Shape", shape)
cost = pytorch_cpu(*shape)
print("Pytorch cost on cpu: {}ms".format(cost))
cost = pytorch_cuda(*shape)
print("Pytorch cost on cuda: {}ms".format(cost))
cost = tvm_unpool1d_cpu(*shape)
print("Tvm cost on cpu: {}ms".format(cost))
print("Done!")
| 35.815789 | 123 | 0.658584 | 595 | 4,083 | 4.391597 | 0.156303 | 0.048986 | 0.073479 | 0.037505 | 0.809414 | 0.766552 | 0.740911 | 0.740911 | 0.740911 | 0.740911 | 0 | 0.020966 | 0.193975 | 4,083 | 113 | 124 | 36.132743 | 0.773017 | 0.010287 | 0 | 0.581395 | 0 | 0 | 0.034046 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.046512 | false | 0 | 0.104651 | 0 | 0.197674 | 0.05814 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
733702c8f7c3cd5c9c8aeda2352b12ff742b2380 | 96 | py | Python | venv/lib/python3.8/site-packages/setuptools/_distutils/command/bdist_wininst.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/setuptools/_distutils/command/bdist_wininst.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/setuptools/_distutils/command/bdist_wininst.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/88/69/5a/23e55f1251ce9de79ccca1d69d23796b5d3eec831c25a5ee47599d4b77 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.416667 | 0 | 96 | 1 | 96 | 96 | 0.479167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7355caf50314da06555fbe7d5d103233e9a65ea8 | 289 | py | Python | bitmovin_api_sdk/encoding/encodings/muxings/mxf/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 11 | 2019-07-03T10:41:16.000Z | 2022-02-25T21:48:06.000Z | bitmovin_api_sdk/encoding/encodings/muxings/mxf/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 8 | 2019-11-23T00:01:25.000Z | 2021-04-29T12:30:31.000Z | bitmovin_api_sdk/encoding/encodings/muxings/mxf/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 13 | 2020-01-02T14:58:18.000Z | 2022-03-26T12:10:30.000Z | from bitmovin_api_sdk.encoding.encodings.muxings.mxf.mxf_api import MxfApi
from bitmovin_api_sdk.encoding.encodings.muxings.mxf.customdata.customdata_api import CustomdataApi
from bitmovin_api_sdk.encoding.encodings.muxings.mxf.mxf_muxing_list_query_params import MxfMuxingListQueryParams
| 72.25 | 113 | 0.903114 | 40 | 289 | 6.225 | 0.425 | 0.144578 | 0.180723 | 0.216867 | 0.566265 | 0.566265 | 0.566265 | 0.566265 | 0.385542 | 0 | 0 | 0 | 0.041522 | 289 | 3 | 114 | 96.333333 | 0.898917 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b41b65da9874c783f2f946a81844417a51f56410 | 32 | py | Python | hatesonar/__init__.py | r00tr00tr00t/HateSonar | ede963e22ada7e0a68e55c7bf4bc0e04d0e78919 | [
"MIT"
] | 155 | 2018-01-26T12:22:49.000Z | 2022-03-29T05:33:52.000Z | hatesonar/__init__.py | r00tr00tr00t/HateSonar | ede963e22ada7e0a68e55c7bf4bc0e04d0e78919 | [
"MIT"
] | 78 | 2018-02-21T12:50:42.000Z | 2022-03-28T20:49:32.000Z | hatesonar/__init__.py | r00tr00tr00t/HateSonar | ede963e22ada7e0a68e55c7bf4bc0e04d0e78919 | [
"MIT"
] | 38 | 2018-02-22T13:48:28.000Z | 2022-03-02T20:13:37.000Z | from hatesonar.api import Sonar
| 16 | 31 | 0.84375 | 5 | 32 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b44900768f0195fe417b4844443ab5a7ef7e2d99 | 74 | py | Python | src/models/__init__.py | FHellmann/Deformable_Dilated_Faster-RCNN | 53e7ddcd6b3b8c7c38451cf08529d2792494c658 | [
"MIT"
] | 1 | 2021-10-09T03:05:16.000Z | 2021-10-09T03:05:16.000Z | src/models/__init__.py | FHellmann/Deformable_Dilated_Faster-RCNN | 53e7ddcd6b3b8c7c38451cf08529d2792494c658 | [
"MIT"
] | null | null | null | src/models/__init__.py | FHellmann/Deformable_Dilated_Faster-RCNN | 53e7ddcd6b3b8c7c38451cf08529d2792494c658 | [
"MIT"
] | 2 | 2021-03-02T12:06:14.000Z | 2021-11-20T16:02:43.000Z | from .faster_rcnn_models import FasterRCNNType, faster_rcnn_model_builder
| 37 | 73 | 0.905405 | 10 | 74 | 6.2 | 0.8 | 0.322581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067568 | 74 | 1 | 74 | 74 | 0.898551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c372a3af13901bfb1319bd7474439ce9d3dff170 | 122 | py | Python | vol1/29.py | EdisonAlgorithms/ProjectEuler | 95025ede2c92dbd3ed2dccc0f8a97e9a3db95ef0 | [
"MIT"
] | null | null | null | vol1/29.py | EdisonAlgorithms/ProjectEuler | 95025ede2c92dbd3ed2dccc0f8a97e9a3db95ef0 | [
"MIT"
] | null | null | null | vol1/29.py | EdisonAlgorithms/ProjectEuler | 95025ede2c92dbd3ed2dccc0f8a97e9a3db95ef0 | [
"MIT"
] | null | null | null | if __name__ == '__main__':
    s = set()
    for i in range(2, 101):
        for j in range(2, 101):
            s.add(i ** j)
    print(len(s)) | 20.333333 | 27 | 0.54918 | 24 | 122 | 2.458333 | 0.625 | 0.237288 | 0.271186 | 0.372881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 0.262295 | 122 | 6 | 28 | 20.333333 | 0.566667 | 0 | 0 | 0 | 0 | 0 | 0.065041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.166667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c377044a58977c130aa520c9881f712dc8dcb14c | 1,847 | py | Python | tests/testdata.py | pombredanne/cdifflib | a89906a0ac8f5f4fba2349e872a9259bf7c20e83 | [
"BSD-3-Clause"
] | 18 | 2016-01-16T08:46:40.000Z | 2022-03-18T12:45:33.000Z | tests/testdata.py | mduggan/cdifflib | e2f561306d50880930da246517d2a5f5eb8006ae | [
"BSD-3-Clause"
] | 6 | 2017-07-07T15:53:59.000Z | 2019-12-05T11:18:51.000Z | tests/testdata.py | pombredanne/cdifflib | a89906a0ac8f5f4fba2349e872a9259bf7c20e83 | [
"BSD-3-Clause"
] | 5 | 2018-02-07T13:11:06.000Z | 2022-02-11T12:07:02.000Z | """
Test data from bug https://github.com/mduggan/cdifflib/issues/5
Revealed some bugs with autojunk handling
"""
a5 = [
12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12,
12, 124, 16, 12, 12, 12, 108, 588, 1316, 12, 8, 42, 6, 168, 36, 12, 10, 10,
158, 36, 10, 24, 152, 914, 84, 216, 4, 10, 254, 8, 40, 54, 20, 12, 54, 38,
10, 8, 310, 6, 580, 28, 20, 44, 12, 24, 34, 44, 4, 20, 8, 16, 14, 12, 8,
12, 20, 14, 28, 12, 24, 6, 12, 372, 544, 1212, 28, 64, 12, 16, 16, 34, 146,
70, 284, 110, 206, 354, 612, 16, 12, 18, 6, 18, 6, 6, 20, 6, 12, 12, 12,
20, 12, 12, 12, 20, 12, 12, 358, 258, 12, 54, 20, 8, 8, 6, 16, 12, 6, 112,
130, 16, 8, 26, 8, 8, 44, 44, 22, 88, 314, 394, 588, 122, 6, 644, 6, 32,
24, 924, 10, 66, 22, 270, 16, 1340, 2408, 54, 452, 158, 1950, 382, 594, 38,
110, 106, 40, 56, 5302, 1398, 6, 1016, 814, 46, 112, 14, 6, 14, 12, 6, 6,
46, 16, 80, 80, 68, 84, 82, 6, 1224, 518
]
b5 = [
284, 6, 528, 64, 16, 230, 254, 6, 162, 350, 28, 22, 88, 18, 136, 64, 36,
32, 102, 14, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12,
12, 12, 12, 12, 12, 12, 124, 16, 12, 12, 12, 108, 588, 1312, 12, 8, 42, 6,
168, 36, 12, 10, 10, 158, 36, 10, 26, 154, 906, 82, 214, 4, 10, 242, 8, 38,
54, 20, 14, 54, 38, 10, 8, 314, 6, 574, 28, 20, 38, 12, 24, 22, 44, 4, 20,
8, 16, 12, 12, 8, 12, 20, 14, 24, 12, 24, 6, 12, 382, 544, 1212, 28, 64,
12, 16, 16, 34, 146, 70, 284, 110, 206, 354, 612, 16, 12, 18, 6, 18, 6, 6,
20, 6, 12, 12, 12, 20, 12, 12, 12, 20, 12, 12, 358, 258, 12, 54, 20, 8, 8,
6, 16, 12, 6, 112, 130, 16, 8, 26, 8, 8, 44, 44, 22, 88, 314, 394, 588,
122, 6, 644, 6, 32, 24, 924, 10, 66, 22, 270, 16, 1340, 2408, 54, 452, 158,
1950, 382, 594, 38, 110, 106, 40, 56, 5302, 1398, 6, 1016, 814, 46, 112,
14, 6, 14, 12, 6
]
| 54.323529 | 79 | 0.505685 | 409 | 1,847 | 2.283619 | 0.259169 | 0.231263 | 0.276231 | 0.299786 | 0.674518 | 0.638116 | 0.638116 | 0.638116 | 0.638116 | 0.638116 | 0 | 0.630402 | 0.273416 | 1,847 | 33 | 80 | 55.969697 | 0.065574 | 0.05739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6f156b34e4fe782d12d3b38df6d0583759ae93f5 | 30,685 | py | Python | results/tests/test_views_competitions.py | sal-kiti/sal-k | ac1e4ea1a7b5edfc1088666fba246a4e7042cac0 | [
"MIT"
] | 1 | 2021-06-12T08:46:32.000Z | 2021-06-12T08:46:32.000Z | results/tests/test_views_competitions.py | sal-kiti/sal-k | ac1e4ea1a7b5edfc1088666fba246a4e7042cac0 | [
"MIT"
] | 8 | 2020-07-01T15:06:52.000Z | 2022-02-20T09:11:23.000Z | results/tests/test_views_competitions.py | sal-kiti/sal-k | ac1e4ea1a7b5edfc1088666fba246a4e7042cac0 | [
"MIT"
] | 3 | 2020-03-01T17:02:24.000Z | 2020-07-05T14:37:59.000Z | from django.contrib.auth.models import Group, User
from django.test import TestCase, override_settings
from rest_framework import status
from rest_framework.test import APIRequestFactory
from rest_framework.test import force_authenticate
from results.models.competitions import CompetitionLevel, CompetitionType, CompetitionResultType, Competition
from results.models.competitions import CompetitionLayout
from results.tests.factories.competitions import CompetitionFactory, CompetitionLevelFactory, CompetitionTypeFactory
from results.tests.factories.competitions import CompetitionResultTypeFactory, CompetitionLayoutFactory
from results.tests.utils import ResultsTestCase
from results.views.competitions import CompetitionLevelViewSet, CompetitionTypeViewSet, CompetitionResultTypeViewSet
from results.views.competitions import CompetitionViewSet, CompetitionLayoutViewSet
class CompetitionLevelTestCase(ResultsTestCase):
def setUp(self):
self.factory = APIRequestFactory()
self.user = User.objects.create(username='tester')
self.staff_user = User.objects.create(username="staffuser", is_staff=True)
self.superuser = User.objects.create(username="superuser", is_superuser=True)
self.object = CompetitionLevelFactory.create()
self.data = {'name': self.object.name, 'abbreviation': self.object.abbreviation,
'historical': self.object.historical}
self.newdata = {'name': 'Championships', 'abbreviation': 'Champ'}
self.url = '/api/competitionlevels/'
self.viewset = CompetitionLevelViewSet
self.model = CompetitionLevel
def test_competition_level_access_list(self):
request = self.factory.get(self.url)
view = self.viewset.as_view(actions={'get': 'list'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_level_access_object_without_user(self):
response = self._test_access(user=None)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_level_access_object_with_normal_user(self):
response = self._test_access(user=self.user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_level_update_without_user(self):
response = self._test_update(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_level_update_with_superuser(self):
response = self._test_update(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
    def test_competition_level_update_with_staffuser(self):
response = self._test_update(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_level_update_with_normal_user(self):
response = self._test_update(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_level_create_without_user(self):
response = self._test_create(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_level_create_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_level_create_existing_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_competition_level_create_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_level_create_existing_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_competition_level_create_with_normal_user(self):
response = self._test_create(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_level_delete_with_user(self):
response = self._test_delete(user=self.user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_level_delete_with_superuser(self):
response = self._test_delete(user=self.superuser)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_competition_level_delete_with_staffuser(self):
response = self._test_delete(user=self.staff_user)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
class CompetitionTypeTestCase(ResultsTestCase):
def setUp(self):
self.factory = APIRequestFactory()
self.user = User.objects.create(username='tester')
self.staff_user = User.objects.create(username="staffuser", is_staff=True)
self.superuser = User.objects.create(username="superuser", is_superuser=True)
self.object = CompetitionTypeFactory.create()
self.data = {'name': self.object.name, 'abbreviation': self.object.abbreviation,
'number_of_rounds': self.object.number_of_rounds, 'max_result': self.object.max_result,
'min_result': self.object.min_result, 'historical': self.object.historical}
self.newdata = {'name': '50 yards', 'abbreviation': '50y', 'number_of_rounds': 2}
self.url = '/api/competitiontypes/'
self.viewset = CompetitionTypeViewSet
self.model = CompetitionType
def test_competition_type_access_list(self):
request = self.factory.get(self.url)
view = self.viewset.as_view(actions={'get': 'list'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_type_access_object_without_user(self):
response = self._test_access(user=None)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
if key in ['max_result', 'min_result']:
self.assertEqual(response.data[key], str(self.data[key]))
else:
self.assertEqual(response.data[key], self.data[key])
def test_competition_type_access_object_with_normal_user(self):
response = self._test_access(user=self.user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
if key in ['max_result', 'min_result']:
self.assertEqual(response.data[key], str(self.data[key]))
else:
self.assertEqual(response.data[key], self.data[key])
def test_competition_type_update_without_user(self):
response = self._test_update(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_type_update_with_superuser(self):
response = self._test_update(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
    def test_competition_type_update_with_staffuser(self):
response = self._test_update(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_type_update_with_normal_user(self):
response = self._test_update(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_create_without_user(self):
response = self._test_create(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_type_create_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_type_create_existing_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_competition_type_create_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_type_create_existing_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_competition_type_create_with_normal_user(self):
response = self._test_create(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_delete_with_user(self):
response = self._test_delete(user=self.user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_delete_with_superuser(self):
response = self._test_delete(user=self.superuser)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_competition_type_delete_with_staffuser(self):
response = self._test_delete(user=self.staff_user)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
class CompetitionResultTypeTestCase(ResultsTestCase):
def setUp(self):
self.factory = APIRequestFactory()
self.user = User.objects.create(username='tester')
self.staff_user = User.objects.create(username="staffuser", is_staff=True)
self.superuser = User.objects.create(username="superuser", is_superuser=True)
self.object = CompetitionResultTypeFactory.create()
self.data = {'competition_type': self.object.competition_type.id, 'name': self.object.name,
'abbreviation': self.object.abbreviation, 'max_result': self.object.max_result,
'min_result': self.object.min_result}
self.newdata = {'competition_type': self.object.competition_type.id, 'name': 'Finals', 'abbreviation': "fin"}
self.url = '/api/competitionresulttypes/'
self.viewset = CompetitionResultTypeViewSet
self.model = CompetitionResultType
def test_competition_result_type_access_list(self):
request = self.factory.get(self.url)
view = self.viewset.as_view(actions={'get': 'list'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_result_type_access_object_without_user(self):
response = self._test_access(user=None)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
if key in ['max_result', 'min_result']:
self.assertEqual(response.data[key], str(self.data[key]))
else:
self.assertEqual(response.data[key], self.data[key])
def test_competition_result_type_access_object_with_normal_user(self):
response = self._test_access(user=self.user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
if key in ['max_result', 'min_result']:
self.assertEqual(response.data[key], str(self.data[key]))
else:
self.assertEqual(response.data[key], self.data[key])
def test_competition_result_type_update_without_user(self):
response = self._test_update(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_result_type_update_with_superuser(self):
response = self._test_update(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
    def test_competition_result_type_update_with_staffuser(self):
response = self._test_update(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_result_type_update_with_normal_user(self):
response = self._test_update(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_result_type_create_without_user(self):
response = self._test_create(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_result_type_create_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_result_type_create_existing_with_superuser(self):
response = self._test_create(user=self.superuser, data=self.data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_competition_result_type_create_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
for key in self.newdata:
self.assertEqual(response.data[key], self.newdata[key])
def test_competition_result_type_create_existing_with_staffuser(self):
response = self._test_create(user=self.staff_user, data=self.data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_competition_result_type_create_with_normal_user(self):
response = self._test_create(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_result_type_delete_with_user(self):
response = self._test_delete(user=self.user)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_result_type_delete_with_superuser(self):
response = self._test_delete(user=self.superuser)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_competition_result_type_delete_with_staffuser(self):
response = self._test_delete(user=self.staff_user)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
class CompetitionLayoutTestCase(TestCase):
def setUp(self):
self.factory = APIRequestFactory()
self.user = User.objects.create(username='tester')
self.superuser = User.objects.create(username="superuser", is_superuser=True)
self.object = CompetitionLayoutFactory.create()
self.data = {'type': self.object.type, 'name': self.object.name,
'label': self.object.label, 'block': self.object.block, 'row': self.object.row,
'col': self.object.col, 'order': self.object.order, 'hide': self.object.hide,
'show': self.object.show}
self.newdata = {'type': self.object.type, 'name': self.object.name,
'label': self.object.label, 'block': 2, 'row': 2, 'col': 2, 'order': 2,
'hide': self.object.hide, 'show': self.object.show}
self.url = '/api/competitiontypelayouts/'
self.viewset = CompetitionLayoutViewSet
self.model = CompetitionLayout
def test_competition_type_layout_access_list(self):
request = self.factory.get(self.url)
view = self.viewset.as_view(actions={'get': 'list'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_type_layout_access_object_without_user(self):
request = self.factory.get(self.url + '1/')
view = self.viewset.as_view(actions={'get': 'retrieve'})
response = view(request, pk=self.object.pk)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_type_layout_update_without_user(self):
request = self.factory.post(self.url + '1/', self.newdata)
view = self.viewset.as_view(actions={'put': 'update'})
response = view(request, pk=1)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_type_layout_create_without_user(self):
request = self.factory.post(self.url, self.newdata)
view = self.viewset.as_view(actions={'post': 'create'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_type_layout_access_object_with_normal_user(self):
request = self.factory.get(self.url + '1/')
force_authenticate(request, user=self.user)
view = self.viewset.as_view(actions={'get': 'retrieve'})
response = view(request, pk=self.object.pk)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_type_layout_update_with_normal_user(self):
request = self.factory.post(self.url + '1/', self.newdata)
force_authenticate(request, user=self.user)
view = self.viewset.as_view(actions={'put': 'update'})
response = view(request, pk=1)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_layout_create_with_normal_user(self):
request = self.factory.post(self.url, self.newdata)
force_authenticate(request, user=self.user)
view = self.viewset.as_view(actions={'post': 'create'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_layout_update_with_superuser(self):
request = self.factory.put(self.url + '1/', self.newdata)
force_authenticate(request, user=self.superuser)
view = self.viewset.as_view(actions={'put': 'update'})
response = view(request, pk=1)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_type_layout_create_with_superuser(self):
request = self.factory.post(self.url, self.newdata)
force_authenticate(request, user=self.superuser)
view = self.viewset.as_view(actions={'post': 'create'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(self.model.objects.all().count(), 2)
def test_competition_type_layout_create_existing_with_superuser(self):
request = self.factory.post(self.url, self.data)
force_authenticate(request, user=self.superuser)
view = self.viewset.as_view(actions={'post': 'create'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(self.model.objects.all().count(), 1)
def test_competition_type_layout_delete_with_user(self):
request = self.factory.delete(self.url + '1/')
force_authenticate(request, user=self.user)
view = self.viewset.as_view(actions={'delete': 'destroy'})
response = view(request, pk=1)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_type_layout_delete_with_superuser(self):
request = self.factory.delete(self.url + '1/')
force_authenticate(request, user=self.superuser)
view = self.viewset.as_view(actions={'delete': 'destroy'})
response = view(request, pk=1)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
class CompetitionTestCase(ResultsTestCase):
def setUp(self):
self.factory = APIRequestFactory()
self.user = User.objects.create(username='tester')
self.group = Group.objects.create(name="testgroup")
self.organization_user = User.objects.create(username='tester_2')
self.organization_user.groups.add(self.group)
self.staff_user = User.objects.create(username="staffuser", is_staff=True)
self.superuser = User.objects.create(username="superuser", is_superuser=True)
self.object = CompetitionFactory.create()
self.object.organization.group = self.group
self.object.organization.save()
self.object.event.organization.group = self.group
        self.object.event.organization.save()
self.data = {'name': self.object.name, 'date_start': self.object.date_start.strftime('%Y-%m-%d'),
'date_end': self.object.date_end.strftime('%Y-%m-%d'), 'location': self.object.location,
'type': self.object.type.pk, 'level': self.object.level.pk,
'organization': self.object.organization.pk, 'event': self.object.event.pk,
'locked': self.object.locked, 'public': self.object.public}
self.newdata = {'name': 'Village Yearly', 'date_start': self.object.date_start.strftime('%Y-%m-%d'),
'date_end': self.object.date_end.strftime('%Y-%m-%d'), 'location': 'Village Field',
'type': self.object.type.pk, 'level': self.object.level.pk,
'organization': self.object.organization.pk, 'event': self.object.event.pk}
self.updatedata = {'name': 'Change Competition'}
self.url = '/api/competitions/'
self.viewset = CompetitionViewSet
self.model = Competition
def _test_access(self, user):
request = self.factory.get(self.url + '1/')
force_authenticate(request, user=user)
view = self.viewset.as_view(actions={'get': 'retrieve'})
return view(request, pk=self.object.pk)
def _test_create(self, user, data, locked=True):
if not locked:
self.object.event.locked = False
self.object.event.save()
request = self.factory.post(self.url, data)
if user:
force_authenticate(request, user=user)
view = self.viewset.as_view(actions={'post': 'create'})
return view(request)
def _test_delete(self, user, locked=True):
if not locked:
self.object.event.locked = False
self.object.event.save()
request = self.factory.delete(self.url + '1/')
if user:
force_authenticate(request, user=user)
view = self.viewset.as_view(actions={'delete': 'destroy'})
return view(request, pk=1)
def _test_update(self, user, data, locked=True):
if not locked:
self.object.event.locked = False
self.object.event.save()
request = self.factory.put(self.url + '1/', data)
if user:
force_authenticate(request, user=user)
view = self.viewset.as_view(actions={'put': 'update'})
return view(request, pk=1)
def _test_patch(self, user, data, locked=True):
if not locked:
self.object.event.locked = False
self.object.event.save()
request = self.factory.patch(self.url + '1/', data)
if user:
force_authenticate(request, user=user)
view = self.viewset.as_view(actions={'patch': 'partial_update'})
return view(request, pk=1)
def test_competition_access_list(self):
request = self.factory.get(self.url)
view = self.viewset.as_view(actions={'get': 'list'})
response = view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_access_object_without_user(self):
response = self._test_access(user=None)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_access_object_with_normal_user(self):
response = self._test_access(user=self.user)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for key in self.data:
self.assertEqual(response.data[key], self.data[key])
def test_competition_update_without_user(self):
response = self._test_update(user=None, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_competition_update_with_superuser(self):
response = self._test_update(user=self.superuser, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
    def test_competition_update_with_staffuser(self):
response = self._test_update(user=self.staff_user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_update_with_organizational_user(self):
self.object.locked = False
self.object.save()
response = self._test_update(user=self.organization_user, data=self.newdata, locked=False)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_competition_update_with_organizational_user_locked(self):
response = self._test_update(user=self.organization_user, data=self.newdata, locked=False)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_update_with_organizational_user_event_locked(self):
self.object.locked = False
self.object.save()
response = self._test_update(user=self.organization_user, data=self.newdata, locked=True)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_competition_update_with_normal_user(self):
response = self._test_update(user=self.user, data=self.newdata)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
    def test_competition_publish_with_staffuser(self):
self.object.public = False
self.object.save()
response = self._test_patch(user=self.staff_user, data={"public": True})
self.assertEqual(response.status_code, status.HTTP_200_OK)
@override_settings(COMPETITION_PUBLISH_REQUIRES_STAFF=False)
def test_competition_publish_with_organizational_user_permitted(self):
self.object.locked = False
self.object.public = False
self.object.save()
response = self._test_patch(user=self.organization_user, data={"public": True}, locked=False)
self.assertEqual(response.status_code, status.HTTP_200_OK)
@override_settings(COMPETITION_PUBLISH_REQUIRES_STAFF=True)
def test_competition_publish_with_organizational_user_not_permitted(self):
self.object.locked = False
self.object.public = False
self.object.save()
        response = self._test_patch(user=self.organization_user, data={"public": True}, locked=False)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_competition_create_without_user(self):
        response = self._test_create(user=None, data=self.newdata)
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

    def test_competition_create_with_superuser(self):
        response = self._test_create(user=self.superuser, data=self.newdata)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)
        self.assertEqual(self.model.objects.all().count(), 2)
        for key in self.newdata:
            self.assertEqual(response.data[key], self.newdata[key])

    def test_competition_create_existing_with_superuser(self):
        response = self._test_create(user=self.superuser, data=self.data)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    def test_competition_create_with_staffuser(self):
        response = self._test_create(user=self.staff_user, data=self.newdata)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)
        self.assertEqual(self.model.objects.all().count(), 2)
        for key in self.newdata:
            self.assertEqual(response.data[key], self.newdata[key])

    def test_competition_create_existing_with_staffuser(self):
        response = self._test_create(user=self.staff_user, data=self.data)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    def test_competition_create_with_organization_user(self):
        response = self._test_create(user=self.organization_user, data=self.newdata, locked=True)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_competition_create_with_organization_user_not_locked_competition(self):
        response = self._test_create(user=self.organization_user, data=self.newdata, locked=False)
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    def test_competition_create_with_normal_user(self):
        response = self._test_create(user=self.user, data=self.newdata)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    def test_competition_delete_with_user(self):
        response = self._test_delete(user=self.user)
        self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)

    def test_competition_delete_with_superuser(self):
        response = self._test_delete(user=self.superuser)
        self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)

    def test_competition_delete_with_staffuser(self):
        response = self._test_delete(user=self.staff_user)
        self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
| 50.971761 | 117 | 0.714551 | 3,817 | 30,685 | 5.48153 | 0.045586 | 0.083162 | 0.116522 | 0.116427 | 0.909764 | 0.895139 | 0.87693 | 0.85504 | 0.845099 | 0.8364 | 0 | 0.011557 | 0.179436 | 30,685 | 601 | 118 | 51.056572 | 0.819413 | 0 | 0 | 0.656687 | 0 | 0 | 0.03637 | 0.003292 | 0 | 0 | 0 | 0 | 0.231537 | 1 | 0.187625 | false | 0 | 0.023952 | 0 | 0.231537 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6f2f26b8f699a19c029d87fadcc8d1c0dd0a0795 | 23 | py | Python | awscfncli2/cli/commands/status/__init__.py | alytle/awscfncli | 62075846804adf00ab726895f97f931dbd581927 | [
"MIT"
] | 60 | 2017-01-16T09:52:36.000Z | 2021-09-07T23:27:01.000Z | awscfncli2/cli/commands/status/__init__.py | alytle/awscfncli | 62075846804adf00ab726895f97f931dbd581927 | [
"MIT"
] | 103 | 2017-08-22T17:01:31.000Z | 2021-09-02T15:32:34.000Z | awscfncli2/cli/commands/status/__init__.py | alytle/awscfncli | 62075846804adf00ab726895f97f931dbd581927 | [
"MIT"
] | 16 | 2017-08-22T16:24:11.000Z | 2021-06-30T11:45:51.000Z | from .status import cli | 23 | 23 | 0.826087 | 4 | 23 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6f693645a0935d8eabb2e3082aef8fbbb0cd39fc | 92 | py | Python | pruning/unstructured/__init__.py | Krishnkant-Swarnkar/Pytorch-pruning | 17cabe8a2e3fd38434a4064a7fb060b4dde74bd3 | [
"MIT"
] | 1 | 2020-12-15T04:32:19.000Z | 2020-12-15T04:32:19.000Z | pruning/unstructured/__init__.py | Krishnkant-Swarnkar/Pytorch-pruning | 17cabe8a2e3fd38434a4064a7fb060b4dde74bd3 | [
"MIT"
] | null | null | null | pruning/unstructured/__init__.py | Krishnkant-Swarnkar/Pytorch-pruning | 17cabe8a2e3fd38434a4064a7fb060b4dde74bd3 | [
"MIT"
] | null | null | null | from .one_shot_pruning import OneShotPruning
from .iterative_pruning import IterativePruning | 46 | 47 | 0.902174 | 11 | 92 | 7.272727 | 0.727273 | 0.325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076087 | 92 | 2 | 47 | 46 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
48a1fd0a5713bf8da31a3ecd939dbfdb7d3d827f | 9,307 | py | Python | app/tests/algorithms_tests/test_signals.py | kaczmarj/grand-challenge.org | 8dc8a2170e51072354f7e94f2a22578805a67b94 | [
"Apache-2.0"
] | 7 | 2016-11-05T07:16:30.000Z | 2017-11-23T03:38:03.000Z | app/tests/algorithms_tests/test_signals.py | kaczmarj/grand-challenge.org | 8dc8a2170e51072354f7e94f2a22578805a67b94 | [
"Apache-2.0"
] | 113 | 2015-05-26T09:27:59.000Z | 2018-03-21T10:45:56.000Z | app/tests/algorithms_tests/test_signals.py | kaczmarj/grand-challenge.org | 8dc8a2170e51072354f7e94f2a22578805a67b94 | [
"Apache-2.0"
] | 7 | 2015-07-16T20:11:22.000Z | 2017-06-06T02:41:24.000Z | import pytest
from guardian.shortcuts import get_perms

from tests.algorithms_tests.factories import AlgorithmJobFactory
from tests.algorithms_tests.utils import TwoAlgorithms
from tests.components_tests.factories import ComponentInterfaceValueFactory
from tests.factories import GroupFactory, ImageFactory, UserFactory
from tests.utils import get_view_for_user


@pytest.mark.django_db
@pytest.mark.parametrize("reverse", [True, False])
def test_user_can_download_images(client, reverse):
    alg_set = TwoAlgorithms()

    j1_creator, j2_creator = UserFactory(), UserFactory()

    alg1_job = AlgorithmJobFactory(
        algorithm_image__algorithm=alg_set.alg1, creator=j1_creator
    )
    alg2_job = AlgorithmJobFactory(
        algorithm_image__algorithm=alg_set.alg2, creator=j2_creator
    )

    alg1_job.viewer_groups.add(alg_set.alg1.editors_group)
    alg2_job.viewer_groups.add(alg_set.alg2.editors_group)

    iv1, iv2, iv3, iv4 = (
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
    )

    if reverse:
        for im in [iv1, iv2, iv3, iv4]:
            im.algorithms_jobs_as_output.add(alg1_job, alg2_job)
        for im in [iv3, iv4]:
            im.algorithms_jobs_as_output.remove(alg1_job, alg2_job)
        for im in [iv1, iv2]:
            im.algorithms_jobs_as_output.remove(alg2_job)
    else:
        # Test that adding images works
        alg1_job.outputs.add(iv1, iv2, iv3, iv4)
        # Test that removing images works
        alg1_job.outputs.remove(iv3, iv4)

    tests = (
        (None, 200, []),
        (alg_set.creator, 200, []),
        (
            alg_set.editor1,
            200,
            [
                *[i.image.pk for i in alg1_job.inputs.all()],
                iv1.image.pk,
                iv2.image.pk,
            ],
        ),
        (alg_set.user1, 200, []),
        (
            j1_creator,
            200,
            [
                *[i.image.pk for i in alg1_job.inputs.all()],
                iv1.image.pk,
                iv2.image.pk,
            ],
        ),
        (alg_set.editor2, 200, [i.image.pk for i in alg2_job.inputs.all()]),
        (alg_set.user2, 200, []),
        (j2_creator, 200, [i.image.pk for i in alg2_job.inputs.all()]),
        (alg_set.u, 200, []),
    )

    for test in tests:
        response = get_view_for_user(
            viewname="api:image-list",
            client=client,
            user=test[0],
            content_type="application/json",
        )
        assert response.status_code == test[1]
        assert response.json()["count"] == len(test[2])

        pks = {obj["pk"] for obj in response.json()["results"]}
        assert {str(pk) for pk in test[2]} == pks

    # Test clearing
    if reverse:
        iv1.algorithms_jobs_as_output.clear()
        iv2.algorithms_jobs_as_output.clear()
    else:
        alg1_job.outputs.clear()

    response = get_view_for_user(
        viewname="api:image-list",
        client=client,
        user=j1_creator,
        content_type="application/json",
    )
    assert response.status_code == 200
    assert response.json()["count"] == 1


@pytest.mark.django_db
@pytest.mark.parametrize("reverse", [True, False])
def test_user_can_download_input_images(client, reverse):
    alg_set = TwoAlgorithms()

    j1_creator, j2_creator = UserFactory(), UserFactory()

    alg1_job = AlgorithmJobFactory(
        algorithm_image__algorithm=alg_set.alg1, creator=j1_creator
    )
    alg2_job = AlgorithmJobFactory(
        algorithm_image__algorithm=alg_set.alg2, creator=j2_creator
    )

    alg1_job.viewer_groups.add(alg_set.alg1.editors_group)
    alg2_job.viewer_groups.add(alg_set.alg2.editors_group)

    iv1, iv2, iv3, iv4 = (
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
        ComponentInterfaceValueFactory(image=ImageFactory()),
    )

    alg1_origin_input = [i.image.pk for i in alg1_job.inputs.all()]
    alg2_origin_input = [i.image.pk for i in alg2_job.inputs.all()]

    if reverse:
        for iv in [iv1, iv2, iv3, iv4]:
            iv.algorithms_jobs_as_input.add(alg1_job, alg2_job)
        for iv in [iv3, iv4]:
            iv.algorithms_jobs_as_input.remove(alg1_job, alg2_job)
        for iv in [iv1, iv2]:
            iv.algorithms_jobs_as_input.remove(alg2_job)
    else:
        # Test that adding images works
        alg1_job.inputs.add(iv1, iv2, iv3, iv4)
        # Test that removing images works
        alg1_job.inputs.remove(iv3, iv4)

    tests = (
        (None, 200, []),
        (alg_set.creator, 200, []),
        (
            alg_set.editor1,
            200,
            [*alg1_origin_input, iv1.image.pk, iv2.image.pk],
        ),
        (alg_set.user1, 200, []),
        (j1_creator, 200, [*alg1_origin_input, iv1.image.pk, iv2.image.pk]),
        (alg_set.editor2, 200, alg2_origin_input),
        (alg_set.user2, 200, []),
        (j2_creator, 200, alg2_origin_input),
        (alg_set.u, 200, []),
    )

    for test in tests:
        response = get_view_for_user(
            viewname="api:image-list",
            client=client,
            user=test[0],
            content_type="application/json",
        )
        assert response.status_code == test[1]
        assert response.json()["count"] == len(test[2])

        pks = {obj["pk"] for obj in response.json()["results"]}
        assert {str(pk) for pk in test[2]} == pks

    # Test clearing
    if reverse:
        iv1.algorithms_jobs_as_input.clear()
        iv2.algorithms_jobs_as_input.clear()
    else:
        alg1_job.inputs.clear()

    response = get_view_for_user(
        viewname="api:image-list",
        client=client,
        user=j1_creator,
        content_type="application/json",
    )
    assert response.status_code == 200
    if reverse:
        assert response.json()["count"] == 1
    else:
        assert response.json()["count"] == 0


@pytest.mark.django_db
class TestAlgorithmJobViewersGroup:
    def test_view_permissions_are_assigned(self):
        job = AlgorithmJobFactory()
        viewer_groups = {*job.viewer_groups.all()}
        assert viewer_groups == {job.viewers}
        for group in viewer_groups:
            assert "view_job" in get_perms(group, job)

    @pytest.mark.parametrize("reverse", [True, False])
    def test_group_addition(self, reverse):
        job = AlgorithmJobFactory()
        group = GroupFactory()

        civ_in, civ_out = (
            ComponentInterfaceValueFactory(image=ImageFactory()),
            ComponentInterfaceValueFactory(image=ImageFactory()),
        )
        job.inputs.add(civ_in)
        job.outputs.add(civ_out)

        assert "view_job" not in get_perms(group, job)
        assert "view_image" not in get_perms(group, civ_in.image)
        assert "view_image" not in get_perms(group, civ_out.image)

        if reverse:
            group.job_set.add(job)
        else:
            job.viewer_groups.add(group)

        assert "view_job" in get_perms(group, job)
        assert "view_image" in get_perms(group, civ_in.image)
        assert "view_image" in get_perms(group, civ_out.image)

    @pytest.mark.parametrize("reverse", [True, False])
    def test_group_removal(self, reverse):
        job = AlgorithmJobFactory()

        civ_in, civ_out = (
            ComponentInterfaceValueFactory(image=ImageFactory()),
            ComponentInterfaceValueFactory(image=ImageFactory()),
        )
        job.inputs.add(civ_in)
        job.outputs.add(civ_out)

        group = job.viewer_groups.first()

        assert "view_job" in get_perms(group, job)
        assert "view_image" in get_perms(group, civ_in.image)
        assert "view_image" in get_perms(group, civ_out.image)

        if reverse:
            group.job_set.remove(job)
        else:
            job.viewer_groups.remove(group)

        assert "view_job" not in get_perms(group, job)
        assert "view_image" not in get_perms(group, civ_in.image)
        assert "view_image" not in get_perms(group, civ_out.image)

    @pytest.mark.parametrize("reverse", [True, False])
    def test_group_clearing(self, reverse):
        job = AlgorithmJobFactory()

        civ_in, civ_out = (
            ComponentInterfaceValueFactory(image=ImageFactory()),
            ComponentInterfaceValueFactory(image=ImageFactory()),
        )
        job.inputs.add(civ_in)
        job.outputs.add(civ_out)

        groups = job.viewer_groups.all()
        assert len(groups) > 0

        for group in groups:
            assert "view_job" in get_perms(group, job)
            assert "view_image" in get_perms(group, civ_in.image)
            assert "view_image" in get_perms(group, civ_out.image)

        if reverse:
            for group in groups:
                group.job_set.clear()
        else:
            job.viewer_groups.clear()

        for group in groups:
            assert "view_job" not in get_perms(group, job)
            assert "view_image" not in get_perms(group, civ_in.image)
            assert "view_image" not in get_perms(group, civ_out.image)
| 33.003546 | 76 | 0.623294 | 1,126 | 9,307 | 4.928952 | 0.102131 | 0.023784 | 0.034234 | 0.051351 | 0.862342 | 0.81982 | 0.791892 | 0.744324 | 0.739099 | 0.722523 | 0 | 0.027289 | 0.267648 | 9,307 | 281 | 77 | 33.120996 | 0.786972 | 0.016224 | 0 | 0.659483 | 0 | 0 | 0.040879 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 1 | 0.025862 | false | 0 | 0.030172 | 0 | 0.060345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
48a41a79acbc05e559179dc403f49d87e8933b17 | 26 | py | Python | qmpy/web/views/data/__init__.py | tachyontraveler/qmpy | f024de3aa85d4367cd31775bd53eede30c74c083 | [
"MIT"
] | 103 | 2015-02-13T16:51:59.000Z | 2022-03-24T22:08:54.000Z | qmpy/web/views/data/__init__.py | tachyontraveler/qmpy | f024de3aa85d4367cd31775bd53eede30c74c083 | [
"MIT"
] | 59 | 2015-12-02T22:43:21.000Z | 2022-03-28T03:54:44.000Z | qmpy/web/views/data/__init__.py | tachyontraveler/qmpy | f024de3aa85d4367cd31775bd53eede30c74c083 | [
"MIT"
] | 62 | 2015-02-24T21:58:59.000Z | 2022-03-21T16:49:09.000Z | from .references import *
| 13 | 25 | 0.769231 | 3 | 26 | 6.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
48c94002b3e6b0473df2fd8065d543d5a44f7b6a | 173 | py | Python | src/z3c/__init__.py | pretaweb/z3c.form | 757300d6e15a5ca8427946dbdc3952ba8978f132 | [
"ZPL-2.1"
] | null | null | null | src/z3c/__init__.py | pretaweb/z3c.form | 757300d6e15a5ca8427946dbdc3952ba8978f132 | [
"ZPL-2.1"
] | null | null | null | src/z3c/__init__.py | pretaweb/z3c.form | 757300d6e15a5ca8427946dbdc3952ba8978f132 | [
"ZPL-2.1"
] | null | null | null | try:
    # Declare this a namespace package if pkg_resources is available.
    import pkg_resources
    pkg_resources.declare_namespace('z3c')
except ImportError:
    pass
| 21.625 | 69 | 0.751445 | 22 | 173 | 5.727273 | 0.727273 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007194 | 0.196532 | 173 | 7 | 70 | 24.714286 | 0.899281 | 0.364162 | 0 | 0 | 0 | 0 | 0.028037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
48c9411c5fb505d45f275acdd41e540586fee018 | 94 | py | Python | stylegan2_pytorch/__init__.py | Gokkulnath/stylegan2-pytorch | 4465ba38f4ecfff14500867d1fe07345a1e02eb3 | [
"MIT"
] | 2,954 | 2020-01-09T21:21:16.000Z | 2022-03-31T21:10:44.000Z | stylegan2_pytorch/__init__.py | Gokkulnath/stylegan2-pytorch | 4465ba38f4ecfff14500867d1fe07345a1e02eb3 | [
"MIT"
] | 259 | 2020-01-14T01:04:08.000Z | 2022-03-17T07:14:52.000Z | stylegan2_pytorch/__init__.py | Gokkulnath/stylegan2-pytorch | 4465ba38f4ecfff14500867d1fe07345a1e02eb3 | [
"MIT"
] | 558 | 2020-01-12T14:12:40.000Z | 2022-03-31T02:25:36.000Z | from stylegan2_pytorch.stylegan2_pytorch import Trainer, StyleGAN2, NanException, ModelLoader
| 47 | 93 | 0.882979 | 10 | 94 | 8.1 | 0.7 | 0.395062 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0.074468 | 94 | 1 | 94 | 94 | 0.896552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
48d844df5a4b0431df621c62454b459c4215af6b | 27 | py | Python | app.py | crclz/hand2 | 76db4c8b91bcdca7281343f40802b753baee9a18 | [
"MIT"
] | null | null | null | app.py | crclz/hand2 | 76db4c8b91bcdca7281343f40802b753baee9a18 | [
"MIT"
] | null | null | null | app.py | crclz/hand2 | 76db4c8b91bcdca7281343f40802b753baee9a18 | [
"MIT"
] | null | null | null | print("This is web app 1")
| 13.5 | 26 | 0.666667 | 6 | 27 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.185185 | 27 | 1 | 27 | 27 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0.62963 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
5b173f333cb2da8c2f66c586e97d85f83662c0ff | 46 | py | Python | example/runcmd/__init__.py | aman-atakama/atakama_sdk | 6e917e81c07495324fd5ab1208a63217b7e4c3fd | [
"BSD-3-Clause"
] | null | null | null | example/runcmd/__init__.py | aman-atakama/atakama_sdk | 6e917e81c07495324fd5ab1208a63217b7e4c3fd | [
"BSD-3-Clause"
] | 11 | 2021-08-04T23:40:55.000Z | 2022-03-23T19:34:30.000Z | example/runcmd/__init__.py | aman-atakama/atakama_sdk | 6e917e81c07495324fd5ab1208a63217b7e4c3fd | [
"BSD-3-Clause"
] | null | null | null | from .subproc_detector import SubprocDetector
| 23 | 45 | 0.891304 | 5 | 46 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 1 | 46 | 46 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5b1fa1f2a0fe98ee890e1e8647b3efa9f7237a07 | 41 | py | Python | gluon_utils/logging/__init__.py | kbvatral/gluon-utils | bc4f54bff0e5b6a7d5306844f889ec8c3535604a | [
"Apache-2.0"
] | null | null | null | gluon_utils/logging/__init__.py | kbvatral/gluon-utils | bc4f54bff0e5b6a7d5306844f889ec8c3535604a | [
"Apache-2.0"
] | null | null | null | gluon_utils/logging/__init__.py | kbvatral/gluon-utils | bc4f54bff0e5b6a7d5306844f889ec8c3535604a | [
"Apache-2.0"
] | null | null | null | from .history_logger import HistoryLogger | 41 | 41 | 0.902439 | 5 | 41 | 7.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.073171 | 41 | 1 | 41 | 41 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d2958166b390200562ea5565dd49f3183e445dca | 23 | py | Python | misc/config_tools/configurator/pyodide/__init__.py | jackwhich/acrn-hypervisor-1 | 2ff11c2ef04a2668979b3e363e25f13cf48376ac | [
"BSD-3-Clause"
] | null | null | null | misc/config_tools/configurator/pyodide/__init__.py | jackwhich/acrn-hypervisor-1 | 2ff11c2ef04a2668979b3e363e25f13cf48376ac | [
"BSD-3-Clause"
] | null | null | null | misc/config_tools/configurator/pyodide/__init__.py | jackwhich/acrn-hypervisor-1 | 2ff11c2ef04a2668979b3e363e25f13cf48376ac | [
"BSD-3-Clause"
] | null | null | null | from .pyodide import *
| 11.5 | 22 | 0.73913 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8252c34b1e5e144d55a02e656a14a16db2293180 | 11,849 | py | Python | services/core-api/tests/mines/mine/resources/test_mine_resource.py | parc-jason/mds | 8f181a429442208a061ed72065b71e6c2bd0f76f | [
"Apache-2.0"
] | null | null | null | services/core-api/tests/mines/mine/resources/test_mine_resource.py | parc-jason/mds | 8f181a429442208a061ed72065b71e6c2bd0f76f | [
"Apache-2.0"
] | null | null | null | services/core-api/tests/mines/mine/resources/test_mine_resource.py | parc-jason/mds | 8f181a429442208a061ed72065b71e6c2bd0f76f | [
"Apache-2.0"
] | null | null | null | import json, uuid, pytest
from tests.factories import MineFactory


# GET
def test_get_mine_not_found(test_client, db_session, auth_headers):
    get_resp = test_client.get(f'/mines/{uuid.uuid4()}', headers=auth_headers['full_auth_header'])
    get_data = json.loads(get_resp.data.decode())
    assert 'Mine not found' in get_data['message']
    assert get_resp.status_code == 404


def test_get_mine_by_mine_no(test_client, db_session, auth_headers):
    mine_no = MineFactory().mine_no
    get_resp = test_client.get(f'/mines/{mine_no}', headers=auth_headers['full_auth_header'])
    get_data = json.loads(get_resp.data.decode())
    assert get_data['mine_no'] == mine_no
    assert get_resp.status_code == 200


def test_get_mine_by_mine_guid(test_client, db_session, auth_headers):
    mine_guid = MineFactory().mine_guid
    get_resp = test_client.get(f'/mines/{mine_guid}', headers=auth_headers['full_auth_header'])
    get_data = json.loads(get_resp.data.decode())
    assert get_data['mine_guid'] == str(mine_guid)
    assert get_resp.status_code == 200


# POST
def test_post_mine_invalid_url(test_client, db_session, auth_headers):
    test_mine_data = {"mine_name": "test_create_mine"}
    post_resp = test_client.post(
        '/mines/some_mine_no', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 405


def test_post_mine_no_name(test_client, db_session, auth_headers):
    test_mine_data = {}
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 400, post_resp.response
    assert 'validation failed' in post_data['message'], post_data


def test_post_mine_name_exceed_chars(test_client, db_session, auth_headers):
    test_mine_data = {'mine_name': '6' * 61, "mine_status": "CLD,REC,LWT", "mine_region": "SW"}
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 400
    assert 'not exceed 60' in post_data['message']


def test_post_mine_name_only_success(test_client, db_session, auth_headers):
    test_mine_data = {"mine_name": "test_create_mine2", "mine_status": "CLD,REC,LWT", "mine_region": "SW"}
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 200
    assert post_data['mine_name'] == test_mine_data['mine_name']


def test_post_mine_name_and_note(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_and_note",
        "mine_note": "This is a note",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 200
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_note'] == test_mine_data['mine_note']


def test_post_mine_name_and_coord(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 200
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_location']['latitude'] == test_mine_data['latitude']
    assert post_data['mine_location']['longitude'] == test_mine_data['longitude']


def test_post_mine_success_all(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_2",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_note": "This is a note",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
        "union_ind": True,
        "ohsc_ind": False
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 200
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_location']['latitude'] == test_mine_data['latitude']
    assert post_data['mine_location']['longitude'] == test_mine_data['longitude']
    assert post_data['mine_note'] == test_mine_data['mine_note']
    assert post_data['union_ind'] == test_mine_data['union_ind']
    assert post_data['ohsc_ind'] == test_mine_data['ohsc_ind']


def test_post_mine_redundant_name(test_client, db_session, auth_headers):
    mine_name = MineFactory().mine_name
    test_mine_data = {
        "mine_name": mine_name,
        "latitude": "44.2827000",
        "longitude": "126.1207000",
        "mine_note": "This is a new note",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 400


def test_post_mine_major_invalid_input(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_major",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_note": "This is a note",
        "major_mine_ind": "blah",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_resp.status_code == 400


def test_post_mine_major_true(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_major",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_note": "This is a note",
        "major_mine_ind": "true",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_location']['latitude'] == test_mine_data['latitude']
    assert post_data['mine_location']['longitude'] == test_mine_data['longitude']
    assert post_data['mine_note'] == test_mine_data['mine_note']
    assert post_data['major_mine_ind'] == True
    assert post_resp.status_code == 200


def test_post_mine_major_false(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_major_2",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_note": "This is a note",
        "major_mine_ind": "false",
        "mine_region": "SW",
        "mine_status": "CLD,REC,LWT",
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_location']['latitude'] == test_mine_data['latitude']
    assert post_data['mine_location']['longitude'] == test_mine_data['longitude']
    assert post_data['mine_note'] == test_mine_data['mine_note']
    assert post_data['major_mine_ind'] == False
    assert post_resp.status_code == 200


def test_post_mine_mine_status(test_client, db_session, auth_headers):
    test_mine_data = {
        "mine_name": "test_create_mine_status",
        "latitude": "49.2827000",
        "longitude": "123.1207000",
        "mine_note": "This is a note",
        "mine_status": "CLD, CM",
        "mine_region": "SW"
    }
    post_resp = test_client.post(
        '/mines', json=test_mine_data, headers=auth_headers['full_auth_header'])
    post_data = json.loads(post_resp.data.decode())
    assert post_data['mine_name'] == test_mine_data['mine_name']
    assert post_data['mine_location']['latitude'] == test_mine_data['latitude']
    assert post_data['mine_location']['longitude'] == test_mine_data['longitude']
    assert post_data['mine_note'] == test_mine_data['mine_note']
    assert post_data['mine_region'] == test_mine_data['mine_region']
    assert post_resp.status_code == 200


# PUT
def test_put_mine_name(test_client, db_session, auth_headers):
    mine_guid = MineFactory().mine_guid
    test_tenure_data = {"mine_name": "mine_name", "mine_note": ""}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_tenure_data, headers=auth_headers['full_auth_header'])
    put_data = json.loads(put_resp.data.decode())
    assert put_data['mine_name'] == test_tenure_data['mine_name']
    assert put_resp.status_code == 200


def test_put_redundant_mine_name(test_client, db_session, auth_headers):
    existing_name = MineFactory().mine_name
    mine = MineFactory()
    test_tenure_data = {
        "mine_name": existing_name,
    }
    put_resp = test_client.put(
        f'/mines/{mine.mine_guid}', json=test_tenure_data, headers=auth_headers['full_auth_header'])
    assert put_resp.status_code == 400


def test_put_mine_major_true(test_client, db_session, auth_headers):
    mine_guid = MineFactory(major_mine_ind=False).mine_guid
    test_mine_data = {"major_mine_ind": "true", "mine_note": ""}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_mine_data, headers=auth_headers['full_auth_header'])
    put_data = json.loads(put_resp.data.decode())
    assert put_data['major_mine_ind'] == True
    assert put_resp.status_code == 200


def test_put_mine_major_false(test_client, db_session, auth_headers):
    mine_guid = MineFactory(major_mine_ind=True).mine_guid
    test_mine_data = {"major_mine_ind": "false", "mine_note": ""}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_mine_data, headers=auth_headers['full_auth_header'])
    put_data = json.loads(put_resp.data.decode())
    assert put_data['major_mine_ind'] == False
    assert put_resp.status_code == 200


def test_put_mine_note(test_client, db_session, auth_headers):
    mine_guid = MineFactory().mine_guid
    test_tenure_data = {"mine_note": "new_note"}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_tenure_data, headers=auth_headers['full_auth_header'])
    put_data = json.loads(put_resp.data.decode())
    assert test_tenure_data['mine_note'] == put_data['mine_note']
    assert put_resp.status_code == 200


def test_put_mine_mine_status(test_client, db_session, auth_headers):
    mine_guid = MineFactory().mine_guid
    test_mine_data = {"mine_status": "CLD, CM", "mine_note": ""}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_mine_data, headers=auth_headers['full_auth_header'])
    assert put_resp.status_code == 200


def test_put_mine_region(test_client, db_session, auth_headers):
    mine_guid = MineFactory().mine_guid
    test_mine_data = {"mine_region": 'NE', "mine_note": ""}
    put_resp = test_client.put(
        f'/mines/{mine_guid}', json=test_mine_data, headers=auth_headers['full_auth_header'])
    assert put_resp.status_code == 200
    put_data = json.loads(put_resp.data.decode())
    assert put_data['mine_region'] == test_mine_data['mine_region']
| 40.302721 | 106 | 0.69854 | 1,694 | 11,849 | 4.469303 | 0.05608 | 0.0634 | 0.09193 | 0.05706 | 0.908995 | 0.880993 | 0.845331 | 0.827368 | 0.788007 | 0.770176 | 0 | 0.021033 | 0.165415 | 11,849 | 293 | 107 | 40.440273 | 0.744565 | 0.000928 | 0 | 0.611111 | 0 | 0 | 0.226382 | 0.01352 | 0 | 0 | 0 | 0 | 0.252137 | 1 | 0.094017 | false | 0 | 0.008547 | 0 | 0.102564 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
826a305e6679c6ccf156aea4599b7554b4b6328c | 344 | py | Python | root/Base.py | shubham2803/todo | 2d7562a4d793f22322f2b284056d2e5a810d9595 | [
"MIT"
] | null | null | null | root/Base.py | shubham2803/todo | 2d7562a4d793f22322f2b284056d2e5a810d9595 | [
"MIT"
] | null | null | null | root/Base.py | shubham2803/todo | 2d7562a4d793f22322f2b284056d2e5a810d9595 | [
"MIT"
] | null | null | null | class BaseClass:
    def __init__(self, file_path=None):
        self.file_path = file_path

    def get_file_path(self, file_path=None):
        return self.file_path or '/home/admin1/Documents/todo/root/input.txt'

    def get_file_path_2(self, file_path=None):
        return self.file_path or '/home/admin1/Documents/todo/root/input_2.txt'
| 34.4 | 79 | 0.715116 | 55 | 344 | 4.163636 | 0.345455 | 0.31441 | 0.31441 | 0.209607 | 0.593886 | 0.593886 | 0.593886 | 0.593886 | 0.593886 | 0.593886 | 0 | 0.014085 | 0.174419 | 344 | 9 | 80 | 38.222222 | 0.792254 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.25 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0.285714 | 0.857143 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 6 |
82747fbba08a1f9a43f4647bb10b77257ddeb1f2 | 33 | py | Python | hardware/alpha_genesis/sensor-node/InteractiveHtmlBom/__init__.py | kaiote/OpenHAP | ada812f8451b3463e355a62f3f5bb31ed226630b | [
"MIT"
] | 1 | 2020-02-13T04:37:06.000Z | 2020-02-13T04:37:06.000Z | hardware/alpha_genesis/sensor-node/InteractiveHtmlBom/__init__.py | kaiote/OpenHAP | ada812f8451b3463e355a62f3f5bb31ed226630b | [
"MIT"
] | null | null | null | hardware/alpha_genesis/sensor-node/InteractiveHtmlBom/__init__.py | kaiote/OpenHAP | ada812f8451b3463e355a62f3f5bb31ed226630b | [
"MIT"
] | null | null | null | from . import InteractiveHtmlBom
| 16.5 | 32 | 0.848485 | 3 | 33 | 9.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
827ad2fc97831a64c0d1d4d0cb2759b7f2b7c5f7 | 3,141 | py | Python | sceptre/exceptions.py | lukeplausin/sceptre | bad46d1a0d208dd14f0be2c776874ed5020ffaa7 | [
"Apache-2.0"
] | 1 | 2019-07-26T19:03:50.000Z | 2019-07-26T19:03:50.000Z | sceptre/exceptions.py | lukeplausin/sceptre | bad46d1a0d208dd14f0be2c776874ed5020ffaa7 | [
"Apache-2.0"
] | null | null | null | sceptre/exceptions.py | lukeplausin/sceptre | bad46d1a0d208dd14f0be2c776874ed5020ffaa7 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
class SceptreException(Exception):
"""
Base class for all Sceptre errors
"""
pass
class ProjectAlreadyExistsError(SceptreException):
"""
Error raised when Sceptre project already exists.
"""
pass
class InvalidSceptreDirectoryError(SceptreException):
"""
Error raised if a sceptre directory is invalid.
"""
pass
class UnsupportedTemplateFileTypeError(SceptreException):
"""
Error raised if an unsupported template file type is used.
"""
pass
class TemplateSceptreHandlerError(SceptreException):
"""
Error raised if sceptre_handler() is not defined correctly in the template.
"""
pass
class DependencyStackNotLaunchedError(SceptreException):
    """
    Error raised when a dependency stack has not been launched.
    """
    pass
class DependencyStackMissingOutputError(SceptreException):
"""
Error raised if a dependency stack does not have the correct outputs.
"""
pass
class CircularDependenciesError(SceptreException):
    """
    Error raised if there are circular dependencies.
    """
    pass
class UnknownStackStatusError(SceptreException):
"""
Error raised if an unknown stack status is received.
"""
pass
class RetryLimitExceededError(SceptreException):
"""
Error raised if the request limit is exceeded.
"""
pass
class UnknownHookTypeError(SceptreException):
    """
    Error raised if an unrecognised hook type is received.
    """
    pass
class VersionIncompatibleError(SceptreException):
    """
    Error raised if the configuration is incompatible with the running version.
    """
    pass
class ProtectedStackError(SceptreException):
    """
    Error raised upon execution of an action under active protection.
    """
    pass
class UnknownStackChangeSetStatusError(SceptreException):
"""
Error raised if an unknown stack change set status is received.
"""
pass
class InvalidHookArgumentTypeError(SceptreException):
"""
Error raised if a hook's argument type is invalid.
"""
pass
class InvalidHookArgumentSyntaxError(SceptreException):
"""
Error raised if a hook's argument syntax is invalid.
"""
pass
class InvalidHookArgumentValueError(SceptreException):
"""
Error raised if a hook's argument value is invalid.
"""
pass
class CannotUpdateFailedStackError(SceptreException):
"""
Error raised when a failed stack is updated.
"""
pass
class StackDoesNotExistError(SceptreException):
"""
Error raised when a stack does not exist.
"""
pass
class ConfigFileNotFoundError(SceptreException):
"""
Error raised when a config file does not exist.
"""
pass
class InvalidConfigFileError(SceptreException):
"""
Error raised when a config file lacks mandatory keys.
"""
pass
class PathConversionError(SceptreException):
"""
Error raised when a path is unable to be converted.
"""
pass
class InvalidAWSCredentialsError(SceptreException):
"""
Error raised when AWS credentials are invalid.
"""
pass
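All of the errors above derive from `SceptreException`, so callers can catch the entire family with a single handler while still matching specific subclasses when needed. A self-contained sketch reproducing two of the classes (the `describe_stack` helper is hypothetical, added only to demonstrate the pattern):

```python
class SceptreException(Exception):
    """Base class for all Sceptre errors."""
    pass

class StackDoesNotExistError(SceptreException):
    """Error raised when a stack does not exist."""
    pass

def describe_stack(name, stacks):
    # Hypothetical helper: look up a stack by name or raise the specific error.
    if name not in stacks:
        raise StackDoesNotExistError("no such stack: %s" % name)
    return stacks[name]

try:
    describe_stack("vpc", {})
except SceptreException as exc:  # catches any error in the hierarchy
    print(type(exc).__name__, exc)
```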
| 19.388889 | 79 | 0.690226 | 302 | 3,141 | 7.175497 | 0.364238 | 0.213198 | 0.274112 | 0.173973 | 0.295801 | 0.137979 | 0.137979 | 0.059529 | 0 | 0 | 0 | 0.000415 | 0.233365 | 3,141 | 161 | 80 | 19.509317 | 0.899502 | 0.400828 | 0 | 0.488889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.488889 | 0 | 0 | 0.511111 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
82c48443fa76f01c008dd749966e80470e98d90e | 27,743 | py | Python | pybind/slxos/v17r_2_00/igmp_snooping_state/pim_snp_groups/pim_snp_groups_/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17r_2_00/igmp_snooping_state/pim_snp_groups/pim_snp_groups_/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17r_2_00/igmp_snooping_state/pim_snp_groups/pim_snp_groups_/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import pim_snp_sources
import pim_snp_wg_member_ports
class pim_snp_groups(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-mc-hms-operational - based on the path /igmp-snooping-state/pim-snp-groups/pim-snp-groups. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: Pim Snooping Group Information
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__group_addr','__vlan_id','__uptime','__expiry_time','__last_reporter','__pim_snp_sources','__pim_snp_wg_member_ports',)
_yang_name = 'pim-snp-groups'
_rest_name = 'pim-snp-groups'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__uptime = YANGDynClass(base=unicode, is_leaf=True, yang_name="uptime", rest_name="uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
self.__expiry_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="expiry-time", rest_name="expiry-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
self.__group_addr = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="group-addr", rest_name="group-addr", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
self.__pim_snp_sources = YANGDynClass(base=YANGListType("src_addr",pim_snp_sources.pim_snp_sources, yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='src-addr', extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}), is_container='list', yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
self.__pim_snp_wg_member_ports = YANGDynClass(base=YANGListType("interface_name",pim_snp_wg_member_ports.pim_snp_wg_member_ports, yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='interface-name', extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}), is_container='list', yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
self.__last_reporter = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="last-reporter", rest_name="last-reporter", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
self.__vlan_id = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="vlan-id", rest_name="vlan-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'igmp-snooping-state', u'pim-snp-groups', u'pim-snp-groups']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'igmp-snooping-state', u'pim-snp-groups', u'pim-snp-groups']
def _get_group_addr(self):
"""
Getter method for group_addr, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/group_addr (inet:ipv4-address)
YANG Description: group ip address
"""
return self.__group_addr
def _set_group_addr(self, v, load=False):
"""
Setter method for group_addr, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/group_addr (inet:ipv4-address)
If this variable is read-only (config: false) in the
source YANG file, then _set_group_addr is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_group_addr() directly.
YANG Description: group ip address
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="group-addr", rest_name="group-addr", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """group_addr must be of a type compatible with inet:ipv4-address""",
'defined-type': "inet:ipv4-address",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="group-addr", rest_name="group-addr", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)""",
})
self.__group_addr = t
if hasattr(self, '_set'):
self._set()
def _unset_group_addr(self):
self.__group_addr = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="group-addr", rest_name="group-addr", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
def _get_vlan_id(self):
"""
Getter method for vlan_id, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/vlan_id (uint32)
YANG Description: vlan id
"""
return self.__vlan_id
def _set_vlan_id(self, v, load=False):
"""
Setter method for vlan_id, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/vlan_id (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_vlan_id is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_vlan_id() directly.
YANG Description: vlan id
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="vlan-id", rest_name="vlan-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """vlan_id must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="vlan-id", rest_name="vlan-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__vlan_id = t
if hasattr(self, '_set'):
self._set()
def _unset_vlan_id(self):
self.__vlan_id = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="vlan-id", rest_name="vlan-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_uptime(self):
"""
Getter method for uptime, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/uptime (string)
YANG Description: group up time
"""
return self.__uptime
def _set_uptime(self, v, load=False):
"""
Setter method for uptime, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/uptime (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_uptime is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_uptime() directly.
YANG Description: group up time
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="uptime", rest_name="uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """uptime must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="uptime", rest_name="uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)""",
})
self.__uptime = t
if hasattr(self, '_set'):
self._set()
def _unset_uptime(self):
self.__uptime = YANGDynClass(base=unicode, is_leaf=True, yang_name="uptime", rest_name="uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
def _get_expiry_time(self):
"""
Getter method for expiry_time, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/expiry_time (string)
YANG Description: group expiry time
"""
return self.__expiry_time
def _set_expiry_time(self, v, load=False):
"""
Setter method for expiry_time, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/expiry_time (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_expiry_time is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_expiry_time() directly.
YANG Description: group expiry time
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="expiry-time", rest_name="expiry-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """expiry_time must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="expiry-time", rest_name="expiry-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)""",
})
self.__expiry_time = t
if hasattr(self, '_set'):
self._set()
def _unset_expiry_time(self):
self.__expiry_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="expiry-time", rest_name="expiry-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
def _get_last_reporter(self):
"""
Getter method for last_reporter, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/last_reporter (inet:ipv4-address)
YANG Description: last reporter
"""
return self.__last_reporter
def _set_last_reporter(self, v, load=False):
"""
Setter method for last_reporter, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/last_reporter (inet:ipv4-address)
If this variable is read-only (config: false) in the
source YANG file, then _set_last_reporter is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_last_reporter() directly.
YANG Description: last reporter
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="last-reporter", rest_name="last-reporter", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """last_reporter must be of a type compatible with inet:ipv4-address""",
'defined-type': "inet:ipv4-address",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="last-reporter", rest_name="last-reporter", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)""",
})
self.__last_reporter = t
if hasattr(self, '_set'):
self._set()
def _unset_last_reporter(self):
self.__last_reporter = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_dict={'pattern': u'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), is_leaf=True, yang_name="last-reporter", rest_name="last-reporter", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='inet:ipv4-address', is_config=False)
def _get_pim_snp_sources(self):
"""
Getter method for pim_snp_sources, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/pim_snp_sources (list)
YANG Description: pim snooping source instance
"""
return self.__pim_snp_sources
def _set_pim_snp_sources(self, v, load=False):
"""
Setter method for pim_snp_sources, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/pim_snp_sources (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_pim_snp_sources is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pim_snp_sources() directly.
YANG Description: pim snooping source instance
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("src_addr",pim_snp_sources.pim_snp_sources, yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='src-addr', extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}), is_container='list', yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pim_snp_sources must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("src_addr",pim_snp_sources.pim_snp_sources, yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='src-addr', extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}), is_container='list', yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)""",
})
self.__pim_snp_sources = t
if hasattr(self, '_set'):
self._set()
def _unset_pim_snp_sources(self):
self.__pim_snp_sources = YANGDynClass(base=YANGListType("src_addr",pim_snp_sources.pim_snp_sources, yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='src-addr', extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}), is_container='list', yang_name="pim-snp-sources", rest_name="pim-snp-sources", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc-hms-pim-snp-source', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
def _get_pim_snp_wg_member_ports(self):
"""
Getter method for pim_snp_wg_member_ports, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/pim_snp_wg_member_ports (list)
"""
return self.__pim_snp_wg_member_ports
def _set_pim_snp_wg_member_ports(self, v, load=False):
"""
Setter method for pim_snp_wg_member_ports, mapped from YANG variable /igmp_snooping_state/pim_snp_groups/pim_snp_groups/pim_snp_wg_member_ports (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_pim_snp_wg_member_ports is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pim_snp_wg_member_ports() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("interface_name",pim_snp_wg_member_ports.pim_snp_wg_member_ports, yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='interface-name', extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}), is_container='list', yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pim_snp_wg_member_ports must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("interface_name",pim_snp_wg_member_ports.pim_snp_wg_member_ports, yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='interface-name', extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}), is_container='list', yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)""",
})
self.__pim_snp_wg_member_ports = t
if hasattr(self, '_set'):
self._set()
def _unset_pim_snp_wg_member_ports(self):
self.__pim_snp_wg_member_ports = YANGDynClass(base=YANGListType("interface_name",pim_snp_wg_member_ports.pim_snp_wg_member_ports, yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='interface-name', extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}), is_container='list', yang_name="pim-snp-wg-member-ports", rest_name="pim-snp-wg-member-ports", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'mc_hms-pim-snp-member-port-pim-snp-wg-member-ports-2'}}, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='list', is_config=False)
group_addr = __builtin__.property(_get_group_addr)
vlan_id = __builtin__.property(_get_vlan_id)
uptime = __builtin__.property(_get_uptime)
expiry_time = __builtin__.property(_get_expiry_time)
last_reporter = __builtin__.property(_get_last_reporter)
pim_snp_sources = __builtin__.property(_get_pim_snp_sources)
pim_snp_wg_member_ports = __builtin__.property(_get_pim_snp_wg_member_ports)
_pyangbind_elements = {'group_addr': group_addr, 'vlan_id': vlan_id, 'uptime': uptime, 'expiry_time': expiry_time, 'last_reporter': last_reporter, 'pim_snp_sources': pim_snp_sources, 'pim_snp_wg_member_ports': pim_snp_wg_member_ports, }
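The generated class above follows one pattern throughout: each YANG leaf is stored in a name-mangled private attribute, mutated only through a validating `_set_*` method, and exposed read-only through a property (since the container is `config: false`). The real code requires pyangbind and targets Python 2 (`unicode`, `__builtin__`); a minimal Python 3 sketch of just that pattern, independent of pyangbind:

```python
class Leaf:
    """Illustrative stand-in for one generated leaf (e.g. uptime)."""

    def __init__(self, value=None):
        self.__value = value  # name-mangled to _Leaf__value

    def _set_value(self, v):
        # Validation mirrors the generated setters, which raise ValueError
        # with an 'error-string' when the type is incompatible.
        if not isinstance(v, str):
            raise ValueError({'error-string': 'value must be of a type compatible with string'})
        self.__value = v

    def _unset_value(self):
        self.__value = None

    # Read-only access, as with the __builtin__.property(...) lines above.
    value = property(lambda self: self.__value)

leaf = Leaf()
leaf._set_value('10.0.0.1')
print(leaf.value)  # -> 10.0.0.1
```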
# --- file: onadata/apps/sms_support/tests/__init__.py (repo: ubpd/kobocat, license: BSD-2-Clause) ---
# coding: utf-8
from __future__ import unicode_literals, print_function, division, absolute_import
# from test_parser import TestParser
# from test_notallowed import TestNotAllowed
# --- file: gen-py/Services_old/GameService.py (repo: afshelburn/irpy, license: MIT) ---
#
# Autogenerated by Thrift Compiler (0.9.1)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TException, TApplicationException
from ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
from thrift.protocol import TBinaryProtocol, TProtocol
try:
    from thrift.protocol import fastbinary
except ImportError:
    fastbinary = None
class Iface:
def update(self, source, target, action, data):
"""
Parameters:
- source
- target
- action
- data
"""
pass
def message(self, source, target, message):
"""
Parameters:
- source
- target
- message
"""
pass
def initGame(self, gameMode):
"""
Parameters:
- gameMode
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def update(self, source, target, action, data):
"""
Parameters:
- source
- target
- action
- data
"""
self.send_update(source, target, action, data)
self.recv_update()
def send_update(self, source, target, action, data):
self._oprot.writeMessageBegin('update', TMessageType.CALL, self._seqid)
args = update_args()
args.source = source
args.target = target
args.action = action
args.data = data
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_update(self):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = update_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
return
def message(self, source, target, message):
"""
Parameters:
- source
- target
- message
"""
self.send_message(source, target, message)
self.recv_message()
def send_message(self, source, target, message):
self._oprot.writeMessageBegin('message', TMessageType.CALL, self._seqid)
args = message_args()
args.source = source
args.target = target
args.message = message
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_message(self):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = message_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
return
def initGame(self, gameMode):
"""
Parameters:
- gameMode
"""
self.send_initGame(gameMode)
self.recv_initGame()
def send_initGame(self, gameMode):
self._oprot.writeMessageBegin('initGame', TMessageType.CALL, self._seqid)
args = initGame_args()
args.gameMode = gameMode
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_initGame(self):
(fname, mtype, rseqid) = self._iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(self._iprot)
self._iprot.readMessageEnd()
raise x
result = initGame_result()
result.read(self._iprot)
self._iprot.readMessageEnd()
return
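
# Hypothetical usage sketch (not part of the generated file): a Client is
# normally wired to a socket transport and a binary protocol. The TSocket
# import, host, and port below are assumptions for illustration only.
#
#   from thrift.transport import TSocket
#   transport = TTransport.TBufferedTransport(TSocket.TSocket('localhost', 9090))
#   protocol = TBinaryProtocol.TBinaryProtocol(transport)
#   client = Client(protocol)
#   transport.open()
#   client.initGame(1)
#   transport.close()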
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["update"] = Processor.process_update
self._processMap["message"] = Processor.process_message
self._processMap["initGame"] = Processor.process_initGame
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_update(self, seqid, iprot, oprot):
args = update_args()
args.read(iprot)
iprot.readMessageEnd()
result = update_result()
self._handler.update(args.source, args.target, args.action, args.data)
oprot.writeMessageBegin("update", TMessageType.REPLY, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_message(self, seqid, iprot, oprot):
args = message_args()
args.read(iprot)
iprot.readMessageEnd()
result = message_result()
self._handler.message(args.source, args.target, args.message)
oprot.writeMessageBegin("message", TMessageType.REPLY, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_initGame(self, seqid, iprot, oprot):
args = initGame_args()
args.read(iprot)
iprot.readMessageEnd()
result = initGame_result()
self._handler.initGame(args.gameMode)
oprot.writeMessageBegin("initGame", TMessageType.REPLY, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class update_args:
"""
Attributes:
- source
- target
- action
- data
"""
thrift_spec = (
None, # 0
(1, TType.I16, 'source', None, None, ), # 1
(2, TType.I16, 'target', None, None, ), # 2
(3, TType.I16, 'action', None, None, ), # 3
(4, TType.I16, 'data', None, None, ), # 4
)
def __init__(self, source=None, target=None, action=None, data=None,):
self.source = source
self.target = target
self.action = action
self.data = data
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.source = iprot.readI16();
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.target = iprot.readI16();
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I16:
self.action = iprot.readI16();
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I16:
self.data = iprot.readI16();
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('update_args')
if self.source is not None:
oprot.writeFieldBegin('source', TType.I16, 1)
oprot.writeI16(self.source)
oprot.writeFieldEnd()
if self.target is not None:
oprot.writeFieldBegin('target', TType.I16, 2)
oprot.writeI16(self.target)
oprot.writeFieldEnd()
if self.action is not None:
oprot.writeFieldBegin('action', TType.I16, 3)
oprot.writeI16(self.action)
oprot.writeFieldEnd()
if self.data is not None:
oprot.writeFieldBegin('data', TType.I16, 4)
oprot.writeI16(self.data)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class update_result:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('update_result')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class message_args:
"""
Attributes:
- source
- target
- message
"""
thrift_spec = (
None, # 0
(1, TType.I16, 'source', None, None, ), # 1
(2, TType.I16, 'target', None, None, ), # 2
(3, TType.STRING, 'message', None, None, ), # 3
)
def __init__(self, source=None, target=None, message=None,):
self.source = source
self.target = target
self.message = message
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.source = iprot.readI16();
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.target = iprot.readI16();
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.message = iprot.readString();
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('message_args')
if self.source is not None:
oprot.writeFieldBegin('source', TType.I16, 1)
oprot.writeI16(self.source)
oprot.writeFieldEnd()
if self.target is not None:
oprot.writeFieldBegin('target', TType.I16, 2)
oprot.writeI16(self.target)
oprot.writeFieldEnd()
if self.message is not None:
oprot.writeFieldBegin('message', TType.STRING, 3)
oprot.writeString(self.message)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class message_result:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('message_result')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initGame_args:
"""
Attributes:
- gameMode
"""
thrift_spec = (
None, # 0
(1, TType.I16, 'gameMode', None, None, ), # 1
)
def __init__(self, gameMode=None,):
self.gameMode = gameMode
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.gameMode = iprot.readI16();
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initGame_args')
if self.gameMode is not None:
oprot.writeFieldBegin('gameMode', TType.I16, 1)
oprot.writeI16(self.gameMode)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class initGame_result:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('initGame_result')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
# --- file: pydepend/tests/assets/testfile.py (repo: herczy/pydepend, license: BSD-3-Clause) ---
def outerfunc():
    pass
# --- file: python/src/test/resources/pyfunc/numpy_power_test.py (repo: maropu/lljvm-translator, license: Apache-2.0) ---
import numpy as np
def numpy_power_test(x, y):
    return np.power(-x, 4) / y
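
# Worked example (added for illustration): numpy_power_test(2.0, 4.0)
# computes np.power(-2.0, 4) / 4.0 = 16.0 / 4.0 = 4.0.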
# --- file: src/rrsm/rrsmi/forms.py (repo: jbienkowski/rrsm, license: MIT) ---
from django import forms
from .models import \
SearchEvent, SearchPeakMotions, SearchCombined, SearchCustom, \
COORD_INTEGERS, COORD_DECIMALS, \
PGA_PGV_INTEGERS, PGA_PGV_DECIMALS, \
MAG_INTEGERS, MAG_DECIMALS
LABEL_EVENT_ID = 'Event Id'
LABEL_DATE_START = 'Start date (YYYY-MM-DD)'
LABEL_DATE_END = 'End date (YYYY-MM-DD)'
LABEL_MAGNITUDE_MIN = 'Magnitude minimum'
LABEL_NETWORK_CODE = 'Network code'
LABEL_STATION_CODE = 'Station code'
LABEL_PGA_MIN = 'PGA minimum [cm/s^2]'
LABEL_PGA_MAX = 'PGA maximum [cm/s^2]'
LABEL_PGV_MIN = 'PGV minimum [cm/s]'
LABEL_PGV_MAX = 'PGV maximum [cm/s]'
LABEL_STAT_LAT_MIN = 'Station latitude minimum'
LABEL_STAT_LAT_MAX = 'Station latitude maximum'
LABEL_STAT_LON_MIN = 'Station longitude minimum'
LABEL_STAT_LON_MAX = 'Station longitude maximum'
LABEL_EVENT_LAT_MIN = 'Event latitude minimum'
LABEL_EVENT_LAT_MAX = 'Event latitude maximum'
LABEL_EVENT_LON_MIN = 'Event longitude minimum'
LABEL_EVENT_LON_MAX = 'Event longitude maximum'
TEXT_EVENT_ID = 'Alphanumeric string, e.g. "20180414_0000061"'
TEXT_DATE_START = 'Date field, e.g. "2018-01-21"'
TEXT_DATE_END = 'Date field, e.g. "2018-01-21"'
TEXT_MAGNITUDE_MIN = 'Maximum {} digits including {} decimal place'.format(
MAG_INTEGERS + MAG_DECIMALS, MAG_DECIMALS
)
TEXT_NETWORK_CODE = 'Alphanumeric string'
TEXT_STATION_CODE = 'Alphanumeric string'
TEXT_PGA_MIN = 'Maximum {} digits including {} decimal places'.format(
PGA_PGV_INTEGERS + PGA_PGV_DECIMALS, PGA_PGV_DECIMALS
)
TEXT_PGA_MAX = 'Maximum {} digits including {} decimal places'.format(
PGA_PGV_INTEGERS + PGA_PGV_DECIMALS, PGA_PGV_DECIMALS
)
TEXT_PGV_MIN = 'Maximum {} digits including {} decimal places'.format(
PGA_PGV_INTEGERS + PGA_PGV_DECIMALS, PGA_PGV_DECIMALS
)
TEXT_PGV_MAX = 'Maximum {} digits including {} decimal places'.format(
PGA_PGV_INTEGERS + PGA_PGV_DECIMALS, PGA_PGV_DECIMALS
)
TEXT_STAT_LAT_MIN = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_STAT_LAT_MAX = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_STAT_LON_MIN = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_STAT_LON_MAX = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_EVENT_LAT_MIN = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_EVENT_LAT_MAX = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_EVENT_LON_MIN = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
TEXT_EVENT_LON_MAX = 'Maximum {} digits including {} decimal places'.format(
COORD_INTEGERS + COORD_DECIMALS, COORD_DECIMALS
)
class SearchEventsForm(forms.ModelForm):
class Meta:
model = SearchEvent
fields = (
'event_id',
'date_start',
'date_end',
'magnitude_min',
'network_code',
'station_code',
'event_lat_min',
'event_lat_max',
'event_lon_min',
'event_lon_max',
)
labels = {
'event_id': LABEL_EVENT_ID,
'date_start': LABEL_DATE_START,
'date_end': LABEL_DATE_END,
'magnitude_min': LABEL_MAGNITUDE_MIN,
'network_code': LABEL_NETWORK_CODE,
'station_code': LABEL_STATION_CODE,
'event_lat_min': LABEL_EVENT_LAT_MIN,
'event_lat_max': LABEL_EVENT_LAT_MAX,
'event_lon_min': LABEL_EVENT_LON_MIN,
'event_lon_max': LABEL_EVENT_LON_MAX,
}
help_texts = {
'event_id': TEXT_EVENT_ID,
'date_start': TEXT_DATE_START,
'date_end': TEXT_DATE_END,
'magnitude_min': TEXT_MAGNITUDE_MIN,
'network_code': TEXT_NETWORK_CODE,
'station_code': TEXT_STATION_CODE,
'event_lat_min': TEXT_EVENT_LAT_MIN,
'event_lat_max': TEXT_EVENT_LAT_MAX,
'event_lon_min': TEXT_EVENT_LON_MIN,
'event_lon_max': TEXT_EVENT_LON_MAX,
}
class SearchPeakMotionsForm(forms.ModelForm):
class Meta:
model = SearchPeakMotions
fields = (
'pga_min',
'pga_max',
'pgv_min',
'pgv_max',
)
labels = {
'pga_min': LABEL_PGA_MIN,
'pga_max': LABEL_PGA_MAX,
'pgv_min': LABEL_PGV_MIN,
'pgv_max': LABEL_PGV_MAX,
}
help_texts = {
'pga_min': TEXT_PGA_MIN,
'pga_max': TEXT_PGA_MAX,
'pgv_min': TEXT_PGV_MIN,
'pgv_max': TEXT_PGV_MAX,
}
class SearchCombinedForm(forms.ModelForm):
class Meta:
model = SearchCombined
fields = (
'magnitude_min',
'pga_min',
'pga_max',
'pgv_min',
'pgv_max',
'stat_lat_min',
'stat_lat_max',
'stat_lon_min',
'stat_lon_max',
'event_lat_min',
'event_lat_max',
'event_lon_min',
'event_lon_max',
)
labels = {
'magnitude_min': LABEL_MAGNITUDE_MIN,
'pga_min': LABEL_PGA_MIN,
'pga_max': LABEL_PGA_MAX,
'pgv_min': LABEL_PGV_MIN,
'pgv_max': LABEL_PGV_MAX,
'stat_lat_min': LABEL_STAT_LAT_MIN,
'stat_lat_max': LABEL_STAT_LAT_MAX,
'stat_lon_min': LABEL_STAT_LON_MIN,
'stat_lon_max': LABEL_STAT_LON_MAX,
'event_lat_min': LABEL_EVENT_LAT_MIN,
'event_lat_max': LABEL_EVENT_LAT_MAX,
'event_lon_min': LABEL_EVENT_LON_MIN,
'event_lon_max': LABEL_EVENT_LON_MAX,
}
help_texts = {
'magnitude_min': TEXT_MAGNITUDE_MIN,
'pga_min': TEXT_PGA_MIN,
'pga_max': TEXT_PGA_MAX,
'pgv_min': TEXT_PGV_MIN,
'pgv_max': TEXT_PGV_MAX,
'stat_lat_min': TEXT_STAT_LAT_MIN,
'stat_lat_max': TEXT_STAT_LAT_MAX,
'stat_lon_min': TEXT_STAT_LON_MIN,
'stat_lon_max': TEXT_STAT_LON_MAX,
'event_lat_min': TEXT_EVENT_LAT_MIN,
'event_lat_max': TEXT_EVENT_LAT_MAX,
'event_lon_min': TEXT_EVENT_LON_MIN,
'event_lon_max': TEXT_EVENT_LON_MAX,
}
class SearchCustomForm(forms.ModelForm):
class Meta:
model = SearchCustom
fields = (
'event_id',
'date_start',
'date_end',
'magnitude_min',
'network_code',
'station_code',
'pga_min',
'pga_max',
'pgv_min',
'pgv_max',
'stat_lat_min',
'stat_lat_max',
'stat_lon_min',
'stat_lon_max',
'event_lat_min',
'event_lat_max',
'event_lon_min',
'event_lon_max',
)
labels = {
'event_id': LABEL_EVENT_ID,
'date_start': LABEL_DATE_START,
'date_end': LABEL_DATE_END,
'magnitude_min': LABEL_MAGNITUDE_MIN,
'network_code': LABEL_NETWORK_CODE,
'station_code': LABEL_STATION_CODE,
'pga_min': LABEL_PGA_MIN,
'pga_max': LABEL_PGA_MAX,
'pgv_min': LABEL_PGV_MIN,
'pgv_max': LABEL_PGV_MAX,
'stat_lat_min': LABEL_STAT_LAT_MIN,
'stat_lat_max': LABEL_STAT_LAT_MAX,
'stat_lon_min': LABEL_STAT_LON_MIN,
'stat_lon_max': LABEL_STAT_LON_MAX,
'event_lat_min': LABEL_EVENT_LAT_MIN,
'event_lat_max': LABEL_EVENT_LAT_MAX,
'event_lon_min': LABEL_EVENT_LON_MIN,
'event_lon_max': LABEL_EVENT_LON_MAX,
}
help_texts = {
'event_id': TEXT_EVENT_ID,
'date_start': TEXT_DATE_START,
'date_end': TEXT_DATE_END,
'magnitude_min': TEXT_MAGNITUDE_MIN,
'network_code': TEXT_NETWORK_CODE,
'station_code': TEXT_STATION_CODE,
'pga_min': TEXT_PGA_MIN,
'pga_max': TEXT_PGA_MAX,
'pgv_min': TEXT_PGV_MIN,
'pgv_max': TEXT_PGV_MAX,
'stat_lat_min': TEXT_STAT_LAT_MIN,
'stat_lat_max': TEXT_STAT_LAT_MAX,
'stat_lon_min': TEXT_STAT_LON_MIN,
'stat_lon_max': TEXT_STAT_LON_MAX,
'event_lat_min': TEXT_EVENT_LAT_MIN,
'event_lat_max': TEXT_EVENT_LAT_MAX,
'event_lon_min': TEXT_EVENT_LON_MIN,
'event_lon_max': TEXT_EVENT_LON_MAX,
}
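
# Hypothetical usage sketch: binding one of these search forms to GET
# parameters in a Django view. The view name and response handling below are
# illustrative assumptions, not code from this repository.
#
#   def search_events(request):
#       form = SearchEventsForm(request.GET or None)
#       if form.is_valid():
#           params = form.cleaned_data  # e.g. params['event_id'], params['date_start']
#           ...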
# --- file: deepcell/dc_training_functions.py (repo: hftsai/deepcell-tf_OIST, license: MIT) ---
"""
dc_training_functions.py
Functions for training convolutional neural networks
@author: David Van Valen
"""
"""
Import python packages
"""
import numpy as np
from numpy import array
import matplotlib
import matplotlib.pyplot as plt
import shelve
from contextlib import closing

import os
import glob
import re
import fnmatch
import tifffile as tiff
from numpy.fft import fft2, ifft2, fftshift
from skimage.io import imread
from scipy import ndimage
import threading
import scipy.ndimage as ndi
from scipy import linalg
import random
import itertools
import h5py
import datetime

from skimage.measure import label, regionprops
from skimage.segmentation import clear_border
from scipy.ndimage.morphology import binary_fill_holes
from skimage import morphology as morph
from skimage.filters import threshold_otsu
import skimage as sk
from sklearn.utils.linear_assignment_ import linear_assignment
from sklearn.utils import class_weight

import tensorflow as tf
from tensorflow import keras
from tensorflow.python.keras import backend as K
from tensorflow.python.keras.layers import Layer, InputSpec, Input, Activation, Dense, Flatten, BatchNormalization, \
    Conv2D, MaxPool2D, AvgPool2D, Concatenate
from tensorflow.python.keras.preprocessing.image import random_rotation, random_shift, random_shear, random_zoom, \
    random_channel_shift, apply_transform, flip_axis, array_to_img, img_to_array, load_img, ImageDataGenerator, \
    Iterator, NumpyArrayIterator, DirectoryIterator
from tensorflow.python.keras.callbacks import ModelCheckpoint, LearningRateScheduler
from tensorflow.python.keras import activations, initializers, losses, regularizers, constraints
from tensorflow.python.keras._impl.keras.utils import conv_utils

from dc_helper_functions import *
from dc_image_generators import *
"""
Training convnets
"""
def train_model_sample(model=None, dataset=None, optimizer=None,
                       expt="", it=0, batch_size=32, n_epoch=100,
                       direc_save="/home/vanvalen/ImageAnalysis/DeepCell2/trained_networks/",
                       direc_data="/home/vanvalen/ImageAnalysis/DeepCell2/training_data_npz/",
                       lr_sched=rate_scheduler(lr=0.01, decay=0.95),
                       rotation_range=0, flip=True, shear=0, class_weight=None, data_format=None):
    training_data_file_name = os.path.join(direc_data, dataset + ".npz")
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")

    file_name_save = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".h5")
    file_name_save_loss = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".npz")

    train_dict, (X_test, Y_test) = get_data(training_data_file_name)

    # the data, shuffled and split between train and test sets
    print('X_train shape:', train_dict["channels"].shape)
    print(train_dict["pixels_x"].shape[0], 'train samples')
    print(X_test.shape[0], 'test samples')

    # determine the number of classes
    output_shape = model.layers[-1].output_shape
    n_classes = output_shape[1]
    print(output_shape, n_classes)

    # convert class vectors to binary class matrices
    train_dict["labels"] = to_categorical(train_dict["labels"], n_classes)
    Y_test = to_categorical(Y_test, n_classes)

    model.compile(loss='categorical_crossentropy',
                  optimizer=optimizer,
                  metrics=['accuracy'])

    print('Using real-time data augmentation.')

    # this will do preprocessing and realtime data augmentation
    datagen = SampleDataGenerator(
        rotation_range=rotation_range,  # randomly rotate images by 0 to rotation_range degrees
        shear_range=shear,  # randomly shear images in the range (radians, -shear_range to shear_range)
        horizontal_flip=flip,  # randomly flip images horizontally
        vertical_flip=flip,  # randomly flip images vertically
        data_format=data_format)

    # fit the model on the batches generated by datagen.sample_flow()
    loss_history = model.fit_generator(
        datagen.sample_flow(train_dict, batch_size=batch_size, data_format=data_format),
        steps_per_epoch=len(train_dict["labels"]) // batch_size,
        epochs=n_epoch,
        validation_data=(X_test, Y_test),
        validation_steps=X_test.shape[0] // batch_size,
        class_weight=class_weight,
        callbacks=[ModelCheckpoint(file_name_save, monitor='val_loss', verbose=0,
                                   save_best_only=True, mode='auto'),
                   LearningRateScheduler(lr_sched)])

    np.savez(file_name_save_loss, loss_history=loss_history.history)
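
# Hypothetical usage sketch: the model variable, dataset name, and optimizer
# settings below are assumptions for illustration, not values taken from this
# repository.
#
#   from tensorflow.python.keras.optimizers import SGD
#   train_model_sample(model=my_sample_cnn, dataset="my_cells_61x61",
#                      optimizer=SGD(lr=0.01, decay=1e-6, momentum=0.9),
#                      expt="sample_test", batch_size=32, n_epoch=25)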
def train_model_conv(model=None, dataset=None, optimizer=None,
                     expt="", it=0, batch_size=1, n_epoch=100,
                     direc_save="/home/vanvalen/ImageAnalysis/DeepCell2/trained_networks/",
                     direc_data="/home/vanvalen/ImageAnalysis/DeepCell2/training_data_npz/",
                     lr_sched=rate_scheduler(lr=0.01, decay=0.95),
                     rotation_range=0, flip=True, shear=0, class_weight=None, data_format=None):

    training_data_file_name = os.path.join(direc_data, dataset + ".npz")
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")

    file_name_save = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".h5")
    file_name_save_loss = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".npz")

    train_dict, (X_test, Y_test) = get_data(training_data_file_name, mode='conv')
    class_weights = None  # previously class_weight or train_dict["class_weights"]

    # the data, shuffled and split between train and test sets
    print('Training data shape:', train_dict["channels"].shape)
    print('Training labels shape:', train_dict["labels"].shape)
    print('Testing data shape:', X_test.shape)
    print('Testing labels shape:', Y_test.shape)

    # determine the number of classes from the model's output layer
    output_shape = model.layers[-1].output_shape
    n_classes = output_shape[-1]
    print(output_shape, n_classes)
    print(class_weights)

    def loss_function(y_true, y_pred):
        return weighted_categorical_crossentropy(y_true, y_pred, n_classes=n_classes, from_logits=False)

    model.compile(loss=loss_function,
                  optimizer=optimizer,
                  metrics=['accuracy'])
    model.summary()

    print('Using real-time data augmentation.')

    # this will do preprocessing and real-time data augmentation
    datagen = ImageFullyConvDataGenerator(
        rotation_range=rotation_range,  # randomly rotate images by 0 to rotation_range degrees
        shear_range=shear,  # randomly shear images in the range (radians, -shear_range to shear_range)
        horizontal_flip=flip,  # randomly flip images horizontally
        vertical_flip=flip,  # randomly flip images vertically
        data_format=data_format,
        target_size=output_shape[1])

    # draw one batch to initialize the generator
    x, y = next(datagen.flow(train_dict, batch_size=1))

    # resize the test labels to the model's output size, then make them channels-last
    reshaped_y_test = np.zeros((Y_test.shape[0], Y_test.shape[1], output_shape[1], output_shape[1]))
    for i in range(Y_test.shape[0]):
        reshaped_y_test[i] = reshape_targets(Y_test[i], output_shape[1])
    Y_test = np.rollaxis(reshaped_y_test, 1, 4)

    # fit the model on the batches generated by datagen.flow()
    loss_history = model.fit_generator(
        datagen.flow(train_dict, batch_size=batch_size, data_format=data_format),
        steps_per_epoch=train_dict["labels"].shape[0] // batch_size,
        epochs=n_epoch,
        validation_data=(X_test, Y_test),
        validation_steps=X_test.shape[0] // batch_size,
        callbacks=[ModelCheckpoint(file_name_save, monitor='val_loss', verbose=1,
                                   save_best_only=True, mode='auto'),
                   LearningRateScheduler(lr_sched)])

    model.save_weights(file_name_save)
    np.savez(file_name_save_loss, loss_history=loss_history.history)

    return model
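`train_model_conv` converts the channels-first test labels to channels-last with `np.rollaxis(..., 1, 4)` before validation. A minimal sketch of that axis move on hypothetical shapes:

```python
import numpy as np

# Hypothetical label batch in channels-first layout: (batch, classes, rows, cols).
y = np.zeros((8, 3, 64, 64))

# np.rollaxis moves axis 1 (the class/channel axis) until it sits before
# position 4, i.e. to the last position, giving the (batch, rows, cols,
# classes) layout that the model's softmax output uses.
y_last = np.rollaxis(y, 1, 4)
print(y_last.shape)  # (8, 64, 64, 3)
```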
def train_model_disc(model=None, dataset=None, optimizer=None,
                     expt="", it=0, batch_size=1, n_epoch=100,
                     direc_save="/home/vanvalen/ImageAnalysis/DeepCell2/trained_networks/",
                     direc_data="/home/vanvalen/ImageAnalysis/DeepCell2/training_data_npz/",
                     lr_sched=rate_scheduler(lr=0.01, decay=0.95),
                     rotation_range=0, flip=True, shear=0, class_weight=None):

    training_data_file_name = os.path.join(direc_data, dataset + ".npz")
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")

    file_name_save = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".h5")
    file_name_save_loss = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".npz")

    train_dict, (X_test, Y_test) = get_data(training_data_file_name, mode='conv')

    # the data, shuffled and split between train and test sets
    print('Training data shape:', train_dict["channels"].shape)
    print('Training labels shape:', train_dict["labels"].shape)
    print('Testing data shape:', X_test.shape)
    print('Testing labels shape:', Y_test.shape)

    # determine the number of classes from the model's output layer
    output_shape = model.layers[-1].output_shape
    n_classes = output_shape[-1]
    print(output_shape, n_classes)

    def loss_function(y_true, y_pred):
        return discriminative_instance_loss(y_true, y_pred)

    model.compile(loss=loss_function,
                  optimizer=optimizer)

    print('Using real-time data augmentation.')

    # this will do preprocessing and real-time data augmentation
    datagen = ImageFullyConvDataGenerator(
        rotation_range=rotation_range,  # randomly rotate images by 0 to rotation_range degrees
        shear_range=shear,  # randomly shear images in the range (radians, -shear_range to shear_range)
        horizontal_flip=flip,  # randomly flip images horizontally
        vertical_flip=flip)  # randomly flip images vertically

    # move the channel axis to the last position for validation
    Y_test = np.rollaxis(Y_test, 1, 4)

    loss_history = model.fit_generator(
        datagen.flow(train_dict, batch_size=batch_size),
        steps_per_epoch=train_dict["labels"].shape[0] // batch_size,
        epochs=n_epoch,
        validation_data=(X_test, Y_test),
        validation_steps=X_test.shape[0] // batch_size,
        callbacks=[ModelCheckpoint(file_name_save, monitor='val_loss', verbose=1,
                                   save_best_only=True, mode='auto'),
                   LearningRateScheduler(lr_sched)])

    model.save_weights(file_name_save)
    np.savez(file_name_save_loss, loss_history=loss_history.history)

    return model
def train_model_conv_sample(model=None, dataset=None, optimizer=None,
                            expt="", it=0, batch_size=1, n_epoch=100,
                            direc_save="/home/vanvalen/ImageAnalysis/DeepCell2/trained_networks/",
                            direc_data="/home/vanvalen/ImageAnalysis/DeepCell2/training_data_npz/",
                            lr_sched=rate_scheduler(lr=0.01, decay=0.95),
                            rotation_range=0, flip=True, shear=0, class_weights=None):

    training_data_file_name = os.path.join(direc_data, dataset + ".npz")
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")

    file_name_save = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".h5")
    file_name_save_loss = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".npz")

    train_dict, (X_test, Y_test) = get_data(training_data_file_name, mode='conv_sample')
    class_weights = class_weights  # could instead come from train_dict["class_weights"]

    # the data, shuffled and split between train and test sets
    print('Training data shape:', train_dict["channels"].shape)
    print('Training labels shape:', train_dict["labels"].shape)
    print('Testing data shape:', X_test.shape)
    print('Testing labels shape:', Y_test.shape)

    # determine the number of classes from the model's output layer
    output_shape = model.layers[-1].output_shape
    n_classes = output_shape[-1]
    print(output_shape, n_classes)

    # NOTE: this overrides the class_weights argument with uniform weights
    class_weights = np.array([1, 1, 1], dtype=K.floatx())

    def loss_function(y_true, y_pred):
        return sample_categorical_crossentropy(y_true, y_pred, axis=3, class_weights=class_weights, from_logits=False)

    model.compile(loss=loss_function,
                  optimizer=optimizer,
                  metrics=['accuracy'])

    print('Using real-time data augmentation.')

    # this will do preprocessing and real-time data augmentation
    datagen = ImageFullyConvDataGenerator(
        rotation_range=rotation_range,  # randomly rotate images by 0 to rotation_range degrees
        shear_range=shear,  # randomly shear images in the range (radians, -shear_range to shear_range)
        horizontal_flip=flip,  # randomly flip images horizontally
        vertical_flip=flip)  # randomly flip images vertically

    # draw one batch to initialize the generator
    x, y = next(datagen.flow(train_dict, batch_size=1))

    # move the channel axis to the last position for validation
    Y_test = np.rollaxis(Y_test, 1, 4)

    # fit the model on the batches generated by datagen.flow()
    loss_history = model.fit_generator(
        datagen.flow(train_dict, batch_size=batch_size),
        steps_per_epoch=train_dict["labels"].shape[0] // batch_size,
        epochs=n_epoch,
        validation_data=(X_test, Y_test),
        validation_steps=X_test.shape[0] // batch_size,
        callbacks=[ModelCheckpoint(file_name_save, monitor='val_loss', verbose=1,
                                   save_best_only=True, mode='auto'),
                   LearningRateScheduler(lr_sched)])

    model.save_weights(file_name_save)
    np.savez(file_name_save_loss, loss_history=loss_history.history)

    # run the trained model on one example image set and save the class features
    data_location = '/home/vanvalen/Data/RAW_40X_tube/set1/'
    channel_names = ["channel004", "channel001"]
    image_list = get_images_from_directory(data_location, channel_names)
    image = image_list[0]
    for j in range(image.shape[1]):
        image[0, j, :, :] = process_image(image[0, j, :, :], 30, 30, False)

    pred = model.predict(image)
    for j in range(3):
        save_name = 'feature_' + str(j) + '.tiff'
        tiff.imsave(save_name, pred[0, :, :, j])

    return model
def train_model_movie(model=None, dataset=None, optimizer=None,
                      expt="", it=0, batch_size=1, n_epoch=100,
                      direc_save="/data/trained_networks/nuclear_movie",
                      direc_data="/data/training_data_npz/nuclear_movie",
                      lr_sched=rate_scheduler(lr=0.01, decay=0.95),
                      number_of_frames=10,
                      rotation_range=0, flip=True, shear=0, class_weight=None):

    training_data_file_name = os.path.join(direc_data, dataset + ".npz")
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")

    file_name_save = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".h5")
    file_name_save_loss = os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ".npz")

    train_dict, (X_test, Y_test) = get_data(training_data_file_name, mode='movie')
    class_weights = None  # previously class_weight or train_dict["class_weights"]

    # the data, shuffled and split between train and test sets
    print('Training data shape:', train_dict["channels"].shape)
    print('Training labels shape:', train_dict["labels"].shape)
    print('Testing data shape:', X_test.shape)
    print('Testing labels shape:', Y_test.shape)

    # determine the number of classes from the model's output layer
    output_shape = model.layers[-1].output_shape
    n_classes = output_shape[-1]
    print(output_shape, n_classes)

    def loss_function(y_true, y_pred):
        return discriminative_instance_loss_3D(y_true, y_pred)

    model.compile(loss=loss_function,
                  optimizer=optimizer)

    print('Using real-time data augmentation.')

    # this will do preprocessing and real-time data augmentation
    datagen = MovieDataGenerator(
        rotation_range=rotation_range,  # randomly rotate images by 0 to rotation_range degrees
        shear_range=shear,  # randomly shear images in the range (radians, -shear_range to shear_range)
        horizontal_flip=flip,  # randomly flip images horizontally
        vertical_flip=flip)  # randomly flip images vertically

    print(train_dict["channels"].shape)

    # keep only the first number_of_frames frames, then make the labels channels-last
    X_test = X_test[:, :, 0:number_of_frames, :, :]
    Y_test = Y_test[:, :, 0:number_of_frames, :, :]
    Y_test = np.rollaxis(Y_test, 1, 5)

    # fit the model on the batches generated by datagen.flow()
    loss_history = model.fit_generator(
        datagen.flow(train_dict, batch_size=batch_size, number_of_frames=number_of_frames),
        steps_per_epoch=train_dict["labels"].shape[0] // batch_size,
        epochs=n_epoch,
        validation_data=(X_test, Y_test),
        validation_steps=X_test.shape[0] // batch_size,
        callbacks=[ModelCheckpoint(file_name_save, monitor='val_loss', verbose=1, save_best_only=True, mode='auto'),
                   LearningRateScheduler(lr_sched)])

    model.save_weights(file_name_save)
    np.savez(file_name_save_loss, loss_history=loss_history.history)

    return model
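All of the trainers above build their checkpoint and loss-history paths the same way: date stamp, dataset, experiment tag, and iteration index joined with underscores. A small helper capturing that convention (`make_save_name` is a hypothetical name introduced here, not part of this module):

```python
import datetime
import os

def make_save_name(direc_save, dataset, expt, it, ext=".h5"):
    # Mirrors the naming used by the trainers above:
    # <YYYY-MM-DD>_<dataset>_<expt>_<it><ext> inside the save directory.
    todays_date = datetime.datetime.now().strftime("%Y-%m-%d")
    return os.path.join(direc_save, todays_date + "_" + dataset + "_" + expt + "_" + str(it) + ext)
```

Passing `ext=".npz"` gives the companion loss-history filename for the same run.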
# File: core/leras/__init__.py (repo: chuanli11/GFPGAN, license: BSD-3-Clause)
from .nn import nn
# File: unittests.py (repo: bclephas/testfile, license: MIT)
import mock
import unittest

import testfile


@mock.patch('testfile._execute_commandline', return_value=('', '', 0))
@mock.patch('testfile._print_verbose')
@mock.patch('testfile._print_result')
@mock.patch('testfile._terminate_if_required')
class Testfile_tests(unittest.TestCase):

    def setUp(self):
        pass

    def test_single_test__no_testcases__executes_ok(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {'tests': {}}
        returncode = testfile.execute_testfile(config)
        self.assertEqual(0, returncode)

    def test_single_test__single_testcase__executes_ok(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                }
            ]
        }
        returncode = testfile.execute_testfile(config)
        self.assertEqual(0, returncode)

    def test_single_test__multiple_testcases__executes_ok(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        returncode = testfile.execute_testfile(config)
        self.assertEqual(0, returncode)

    def test_single_test__no_fixture__ok(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                }
            ]
        }
        returncode = testfile.execute_testfile(config)
        self.assertEqual(0, returncode)

    def test_single_test__no_tests__should_not_break(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {}
        returncode = testfile.execute_testfile(config)
        self.assertEqual(0, returncode)

    def test_single_test__one_time_setup__called_exactly_once(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'fixture': {
                'onetime_setup': 'foo',
            },
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        self.assertEqual(3, mock_execute.call_count)
        mock_execute.assert_has_calls([mock.call('foo'),
                                       mock.call('echo foo'),
                                       mock.call('echo bar')])

    def test_single_test__one_time_teardown__called_exactly_once(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'fixture': {
                'onetime_teardown': 'foo'
            },
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        self.assertEqual(3, mock_execute.call_count)
        mock_execute.assert_has_calls([mock.call('echo foo'),
                                       mock.call('echo bar'),
                                       mock.call('foo')])

    def test_single_test__teardown_called_for_each_test(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'fixture': {
                'teardown': 'foo'
            },
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        self.assertEqual(4, mock_execute.call_count)
        mock_execute.assert_has_calls([mock.call('echo foo'),
                                       mock.call('foo'),
                                       mock.call('echo bar'),
                                       mock.call('foo')])

    def test_single_test__setup_called_for_each_test(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'fixture': {
                'setup': 'foo'
            },
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        self.assertEqual(4, mock_execute.call_count)
        mock_execute.assert_has_calls([mock.call('foo'),
                                       mock.call('echo foo'),
                                       mock.call('foo'),
                                       mock.call('echo bar')])

    def test_single_test__optional_description(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'description': 'foobarbaz',
                    'steps': [
                        'echo foo'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_result.assert_has_calls([mock.call('test_1', mock.ANY, 0)])

    def test_single_test__test_fails__print_error(self, mock_term, mock_result, mock_verbose, mock_execute):
        mock_execute.return_value = ('', '', 3)
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_result.assert_has_calls([mock.call('test_1', 'echo foo', 3)])

    def test_single_test__test_succeeds__print_pass(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_result.assert_has_calls([mock.call('test_1', mock.ANY, 0)])

    def test_single_test__multiple_tests_one_fails__print_correct_error(self, mock_term, mock_result, mock_verbose, mock_execute):
        mock_execute.side_effect = [('', '', 0), ('', '', 3)]
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo'
                    ]
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_result.assert_has_calls([mock.call('test_1', mock.ANY, 0), mock.call('test_2', 'echo bar', 3)])

    def test_single_test__test_multiple_steps__print_pass(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'test': 'test_1',
                    'steps': [
                        'echo foo',
                        'echo bar',
                        'echo baz'
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_execute.assert_has_calls([mock.call('echo foo;echo bar;echo baz')])

    def test_disabled_tests__correct_results_shown(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'disabled_test': 'test_1',
                    'steps': [
                        'echo foo',
                    ],
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar',
                    ]
                },
            ]
        }
        testfile.execute_testfile(config, verbose=True)
        mock_result.assert_has_calls([mock.call('test_1', '', 0, ignored=True),
                                      mock.call('test_2', 'echo bar', 0)])

    def test_disabled_tests__setup_and_teardown_not_executed(self, mock_term, mock_result, mock_verbose, mock_execute):
        config = {
            'tests': [
                {
                    'disabled_test': 'test_1',
                    'steps': [
                        'echo foo',
                    ],
                },
                {
                    'test': 'test_2',
                    'steps': [
                        'echo bar',
                    ]
                },
            ]
        }
        testfile.execute_testfile(config)
        mock_verbose.assert_has_calls([mock.call(False, '', '', 0)])


if __name__ == '__main__':
    unittest.main()
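The four stacked `mock.patch` decorators above are applied bottom-up, so the decorator nearest the class (`_terminate_if_required`) supplies the first injected argument (`mock_term`) and the top one (`_execute_commandline`) the last (`mock_execute`). A minimal sketch of that ordering, using the stdlib `unittest.mock` equivalent and two real `os.path` functions:

```python
from unittest import mock
import os.path

@mock.patch('os.path.isdir')   # applied last  -> second injected argument
@mock.patch('os.path.isfile')  # applied first -> first injected argument
def check_order(mock_isfile, mock_isdir):
    os.path.isfile('x')
    # Only the patched isfile was touched; the patched isdir stays uncalled.
    assert mock_isfile.called
    assert not mock_isdir.called
    return True
```

Calling `check_order()` with no arguments works because each patch injects its mock before the wrapped function runs.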
# File: app/tests/main_test.py (repo: glushko311/docker-flask-postgres, license: MIT)
from unittest import TestCase
class TestSuit1(TestCase):

    def test_one(self):
        return self.assertEqual(1, 1)

    def test_two(self):
        return self.assertEqual(2, 2)

    def test_three(self):
        return self.assertEqual(3, 3)

    def test_smoke(self):
        pass


if __name__ == "__main__":
    TestSuit1()
# File: util/result.py (repo: qihqi/FoodCaloriesPrediction, license: BSD-2-Clause)
{(13, 108): {'restaurant': 'Wendys', 'food_name': 'Quarter Pound Single', 'calorie': '430', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Quarter Pound Single', '<br/>', '\n', '<b>Calories: </b>', '430', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 44): {'restaurant': 'KFC', 'food_name': 'KFC Snacker Sandwich', 'calorie': '210', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'KFC Snacker Sandwich', '<br/>', '\n', '<b>Calories: </b>', '210', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 63): {'restaurant': 'Panera', 'food_name': 'Italian Combo Sandwich', 'calorie': '1070', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Italian Combo Sandwich', '<br/>', '\n', '<b>Calories: </b>', '1070', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 90): {'restaurant': 'Subway', 'food_name': 'Spicy Italian Sandwich', 'calorie': '480', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Spicy Italian Sandwich', '<br/>', '\n', '<b>Calories: </b>', '480', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 6): {'restaurant': 'Arbys', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 11): {'restaurant': 'Arbys', 'food_name': 'Turkey Bacon Club Toasted Sub', 'calorie': '619', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>',
'Turkey Bacon Club Toasted Sub', '<br/>', '\n', '<b>Calories: </b>', '619', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 55): {'restaurant': 'McDonalds', 'food_name': 'Filet-O-Fish', 'calorie': '380', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Filet-O-Fish', '<br/>', '\n', '<b>Calories: </b>', '380', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 24): {'restaurant': 'Dunkin Donuts', 'food_name': 'Chocolate Frosted Donut', 'calorie': '230', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Chocolate Frosted Donut', '<br/>', '\n', '<b>Calories: </b>', '230', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 77): {'restaurant': 'Pizza Hut', 'food_name': 'Pepperoni Pizza', 'calorie': '280', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Pepperoni Pizza', '<br/>', '\n', '<b>Calories: </b>', '280', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 80): {'restaurant': 'Quiznos', 'food_name': 'Prime Rib Cheesesteak Sandwich', 'calorie': '830', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Prime Rib Cheesesteak Sandwich', '<br/>', '\n', '<b>Calories: </b>', '830', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 111): {'restaurant': 'Wendys', 'food_name': 'Ultimate Chicken Grill', 'calorie': '320', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Ultimate Chicken Grill', '<br/>', '\n', '<b>Calories: </b>', '320', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 
60): {'restaurant': 'Panera', 'food_name': 'Asian Sesame Chicken Salad', 'calorie': '410', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Asian Sesame Chicken Salad', '<br/>', '\n', '<b>Calories: </b>', '410', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 89): {'restaurant': 'Subway', 'food_name': 'Meatball Marinara Sandwich', 'calorie': '560', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Meatball Marinara Sandwich', '<br/>', '\n', '<b>Calories: </b>', '560', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 1): {'restaurant': 'Arbys', 'food_name': 'Bacon Beef and Cheddar Sandwich', 'calorie': '521', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Bacon Beef and Cheddar Sandwich', '<br/>', '\n', '<b>Calories: </b>', '521', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 58): {'restaurant': 'McDonalds', 'food_name': 'Quarter Pounder with Cheese', 'calorie': '740', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Quarter Pounder with Cheese', '<br/>', '\n', '<b>Calories: </b>', '740', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 23): {'restaurant': 'Dunkin Donuts', 'food_name': 'Butternut Donut', 'calorie': '403', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Butternut Donut', '<br/>', '\n', '<b>Calories: </b>', '403', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 72): {'restaurant': 'Pizza Hut', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', 
'<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 101): {'restaurant': 'Wendys', 'food_name': 'Baconator', 'calorie': '830', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Baconator', '<br/>', '\n', '<b>Calories: </b>', '830', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 85): {'restaurant': 'Quiznos', 'food_name': 'Veggie', 'calorie': '590', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Veggie', '<br/>', '\n', '<b>Calories: </b>', '590', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 17): {'restaurant': 'Bruggers Bagels', 'food_name': 'Pumpernickel Bagel', 'calorie': '330', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Pumpernickel Bagel', '<br/>', '\n', '<b>Calories: </b>', '330', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (12, 97): {'restaurant': 'Taco Bell', 'food_name': 'Chalupa', 'calorie': '350', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Taco Bell', '<br/>', '\n', '<b>Food: </b>', 'Chalupa', '<br/>', '\n', '<b>Calories: </b>', '350', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 65): {'restaurant': 'Panera', 'food_name': 'Sierra turkey', 'calorie': '1000', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Sierra turkey', '<br/>', '\n', '<b>Calories: </b>', '1000', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 57): {'restaurant': 'McDonalds', 'food_name': 'Grilled Chicken Ranch BLT Sandwich', 'calorie': '580', 
'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Grilled Chicken Ranch BLT Sandwich', '<br/>', '\n', '<b>Calories: </b>', '580', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 75): {'restaurant': 'Pizza Hut', 'food_name': 'Mushroom Pizza', 'calorie': '250', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Mushroom Pizza', '<br/>', '\n', '<b>Calories: </b>', '250', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 104): {'restaurant': 'Wendys', 'food_name': 'Chilli Soup', 'calorie': '280', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Chilli Soup', '<br/>', '\n', '<b>Calories: </b>', '280', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 40): {'restaurant': 'KFC', 'food_name': 'Drink', 'calorie': '180', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '180', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 37): {'restaurant': 'KFC', 'food_name': 'Crispy Chicken Breasts', 'calorie': '460', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Crispy Chicken Breasts', '<br/>', '\n', '<b>Calories: </b>', '460', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 94): {'restaurant': 'Subway', 'food_name': 'Turkey Breast Sandwich', 'calorie': '280', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Turkey Breast Sandwich', '<br/>', '\n', '<b>Calories: </b>', '280', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 
1</b>']}, (2, 18): {'restaurant': 'Bruggers Bagels', 'food_name': 'Sesame Bagel', 'calorie': '360', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Sesame Bagel', '<br/>', '\n', '<b>Calories: </b>', '360', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 51): {'restaurant': 'McDonalds', 'food_name': 'Big n Tasty', 'calorie': '510', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Big n Tasty', '<br/>', '\n', '<b>Calories: </b>', '510', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 28): {'restaurant': 'Dunkin Donuts', 'food_name': 'Jelly Filled Donut', 'calorie': '270', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Jelly Filled Donut', '<br/>', '\n', '<b>Calories: </b>', '270', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 12): {'restaurant': 'Bruggers Bagels', 'food_name': 'Cinnamon Raisin Bagel', 'calorie': '320', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Cinnamon Raisin Bagel', '<br/>', '\n', '<b>Calories: </b>', '320', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 70): {'restaurant': 'Pizza Hut', 'food_name': 'Cheese Pizza', 'calorie': '270', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Cheese Pizza', '<br/>', '\n', '<b>Calories: </b>', '270', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 107): {'restaurant': 'Wendys', 'food_name': 'Mandarian Chicken Salad', 'calorie': '180', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Mandarian Chicken 
Salad', '<br/>', '\n', '<b>Calories: </b>', '180', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 43): {'restaurant': 'KFC', 'food_name': 'Honey BBQ Wings', 'calorie': '80', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Honey BBQ Wings', '<br/>', '\n', '<b>Calories: </b>', '80', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 93): {'restaurant': 'Subway', 'food_name': 'Tuna Sandwich', 'calorie': '530', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Tuna Sandwich', '<br/>', '\n', '<b>Calories: </b>', '530', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 5): {'restaurant': 'Arbys', 'food_name': 'Classic Italian Toasted Sub', 'calorie': '787', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Classic Italian Toasted Sub', '<br/>', '\n', '<b>Calories: </b>', '787', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 10): {'restaurant': 'Arbys', 'food_name': 'Roast Turkey Ranch Sandwich', 'calorie': '818', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Roast Turkey Ranch Sandwich', '<br/>', '\n', '<b>Calories: </b>', '818', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 54): {'restaurant': 'McDonalds', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 27): {'restaurant': 'Dunkin Donuts', 'food_name': 'Glazed Donut', 'calorie': '230', 'raw': 
['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Glazed Donut', '<br/>', '\n', '<b>Calories: </b>', '230', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 76): {'restaurant': 'Pizza Hut', 'food_name': 'Onion Pizza', 'calorie': '250', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Onion Pizza', '<br/>', '\n', '<b>Calories: </b>', '250', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 81): {'restaurant': 'Quiznos', 'food_name': 'Raspberry Chipotle Chicken Salad', 'calorie': '620', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Raspberry Chipotle Chicken Salad', '<br/>', '\n', '<b>Calories: </b>', '620', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 110): {'restaurant': 'Wendys', 'food_name': 'Spicy Chicken Sandwich', 'calorie': '230', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Spicy Chicken Sandwich', '<br/>', '\n', '<b>Calories: </b>', '230', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 33): {'restaurant': 'Dunkin Donuts', 'food_name': 'Sugar Raised Donut', 'calorie': '210', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Sugar Raised Donut', '<br/>', '\n', '<b>Calories: </b>', '210', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 87): {'restaurant': 'Subway', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', 
'<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 61): {'restaurant': 'Panera', 'food_name': 'Caesar Salad', 'calorie': '400', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Caesar Salad', '<br/>', '\n', '<b>Calories: </b>', '400', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 88): {'restaurant': 'Subway', 'food_name': 'Italian BMT Sandwich', 'calorie': '410', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Italian BMT Sandwich', '<br/>', '\n', '<b>Calories: </b>', '410', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (6, 48): {'restaurant': 'Krispy Kreme Doughnuts', 'food_name': 'Granola Bar', 'calorie': '180', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Krispy Kreme Doughnuts', '<br/>', '\n', '<b>Food: </b>', 'Granola Bar', '<br/>', '\n', '<b>Calories: </b>', '180', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 53): {'restaurant': 'McDonalds', 'food_name': 'Chicken Selects Breast Strips', 'calorie': '133', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Chicken Selects Breast Strips', '<br/>', '\n', '<b>Calories: </b>', '133', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 22): {'restaurant': 'Dunkin Donuts', 'food_name': 'Boston Kreme Donut', 'calorie': '270', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Boston Kreme Donut', '<br/>', '\n', '<b>Calories: </b>', '270', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 82): {'restaurant': 'Quiznos', 'food_name': 'Roasted Turkey and Cheddar Sandwich', 'calorie': '510', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 
'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Roasted Turkey and Cheddar Sandwich', '<br/>', '\n', '<b>Calories: </b>', '510', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (6, 46): {'restaurant': 'Krispy Kreme Doughnuts', 'food_name': 'Chocolate Iced Kreme Filled Doughnut', 'calorie': '350', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Krispy Kreme Doughnuts', '<br/>', '\n', '<b>Food: </b>', 'Chocolate Iced Kreme Filled Doughnut', '<br/>', '\n', '<b>Calories: </b>', '350', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (12, 98): {'restaurant': 'Taco Bell', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Taco Bell', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 3): {'restaurant': 'Arbys', 'food_name': 'Chicken Club Salad', 'calorie': '425', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Chicken Club Salad', '<br/>', '\n', '<b>Calories: </b>', '425', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 66): {'restaurant': 'Panera', 'food_name': 'Smoked Turkey Breast Sandwich', 'calorie': '620', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Smoked Turkey Breast Sandwich', '<br/>', '\n', '<b>Calories: </b>', '620', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 56): {'restaurant': 'McDonalds', 'food_name': 'Grilled Chicken Club Sandwich', 'calorie': '590', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Grilled Chicken Club Sandwich', '<br/>', '\n', '<b>Calories: </b>', '590', '<br/>', '<br/>', '\n', 
'<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 21): {'restaurant': 'Dunkin Donuts', 'food_name': 'Bavarian Kreme Donut', 'calorie': '250', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Bavarian Kreme Donut', '<br/>', '\n', '<b>Calories: </b>', '250', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 74): {'restaurant': 'Pizza Hut', 'food_name': 'Ham Meat Pizza', 'calorie': '370', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Ham Meat Pizza', '<br/>', '\n', '<b>Calories: </b>', '370', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 103): {'restaurant': 'Wendys', 'food_name': 'Chicken Nuggets', 'calorie': '190', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Chicken Nuggets', '<br/>', '\n', '<b>Calories: </b>', '190', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 36): {'restaurant': 'KFC', 'food_name': 'Chicken Pot Pie', 'calorie': '690', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Chicken Pot Pie', '<br/>', '\n', '<b>Calories: </b>', '690', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 19): {'restaurant': 'Bruggers Bagels', 'food_name': 'Sesame Square Bagel', 'calorie': '360', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Sesame Square Bagel', '<br/>', '\n', '<b>Calories: </b>', '360', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 50): {'restaurant': 'McDonalds', 'food_name': 'Big Mac', 'calorie': '540', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', 
'<br/>', '\n', '<b>Food: </b>', 'Big Mac', '<br/>', '\n', '<b>Calories: </b>', '540', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 31): {'restaurant': 'Dunkin Donuts', 'food_name': 'Special Frosted Donut', 'calorie': 'N/A', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Special Frosted Donut', '<br/>', '\n', '<b>Calories: </b>', 'N/A', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 13): {'restaurant': 'Bruggers Bagels', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 78): {'restaurant': 'Quiznos', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 69): {'restaurant': 'Pizza Hut', 'food_name': 'Buffalo Chicken Tomato Pizza', 'calorie': '310', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Buffalo Chicken Tomato Pizza', '<br/>', '\n', '<b>Calories: </b>', '310', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 106): {'restaurant': 'Wendys', 'food_name': 'Half Pound Double', 'calorie': '700', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Half Pound Double', '<br/>', '\n', '<b>Calories: </b>', '700', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 42): {'restaurant': 'KFC', 
'food_name': 'Honey BBQ Chicken Sandwich', 'calorie': '300', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Honey BBQ Chicken Sandwich', '<br/>', '\n', '<b>Calories: </b>', '300', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 39): {'restaurant': 'KFC', 'food_name': 'Crispy Whole Chicken Wing', 'calorie': '160', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Crispy Whole Chicken Wing', '<br/>', '\n', '<b>Calories: </b>', '160', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 92): {'restaurant': 'Subway', 'food_name': 'Subway Melt Salad', 'calorie': '380', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Subway Melt Salad', '<br/>', '\n', '<b>Calories: </b>', '380', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 20): {'restaurant': 'Bruggers Bagels', 'food_name': 'Whole Wheat Square Bagel', 'calorie': '390', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Whole Wheat Square Bagel', '<br/>', '\n', '<b>Calories: </b>', '390', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 4): {'restaurant': 'Arbys', 'food_name': 'Chicken Fillet Sandwich', 'calorie': '510', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Chicken Fillet Sandwich', '<br/>', '\n', '<b>Calories: </b>', '510', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (12, 100): {'restaurant': 'Taco Bell', 'food_name': 'Taco', 'calorie': '260', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Taco Bell', '<br/>', '\n', '<b>Food: </b>', 'Taco', '<br/>', '\n', '<b>Calories: </b>', '260', 
'<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 9): {'restaurant': 'Arbys', 'food_name': 'Roast Turkey and Swiss Sandwich', 'calorie': '708', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Roast Turkey and Swiss Sandwich', '<br/>', '\n', '<b>Calories: </b>', '708', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 26): {'restaurant': 'Dunkin Donuts', 'food_name': 'Double Chocolate Cake Donut', 'calorie': '340', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Double Chocolate Cake Donut', '<br/>', '\n', '<b>Calories: </b>', '340', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 14): {'restaurant': 'Bruggers Bagels', 'food_name': 'Jalapeno Cheese Bagel', 'calorie': '430', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Jalapeno Cheese Bagel', '<br/>', '\n', '<b>Calories: </b>', '430', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 109): {'restaurant': 'Wendys', 'food_name': 'South West Taco Salad', 'calorie': '400', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'South West Taco Salad', '<br/>', '\n', '<b>Calories: </b>', '400', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 32): {'restaurant': 'Dunkin Donuts', 'food_name': 'Special Pittsburgh Kreme Donut', 'calorie': 'N/A', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Special Pittsburgh Kreme Donut', '<br/>', '\n', '<b>Calories: </b>', 'N/A', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 45): 
{'restaurant': 'KFC', 'food_name': 'Sweet Kernel Corn', 'calorie': '110', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Sweet Kernel Corn', '<br/>', '\n', '<b>Calories: </b>', '110', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 86): {'restaurant': 'Subway', 'food_name': 'Chicken and Bacon Ranch Wrap', 'calorie': '580', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Chicken and Bacon Ranch Wrap', '<br/>', '\n', '<b>Calories: </b>', '580', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 62): {'restaurant': 'Panera', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 91): {'restaurant': 'Subway', 'food_name': 'Steak and Cheese Salad', 'calorie': '600', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Steak and Cheese Salad', '<br/>', '\n', '<b>Calories: </b>', '600', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 7): {'restaurant': 'Arbys', 'food_name': 'Philly Beef Toasted Sub', 'calorie': '739', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Philly Beef Toasted Sub', '<br/>', '\n', '<b>Calories: </b>', '739', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (6, 49): {'restaurant': 'Krispy Kreme Doughnuts', 'food_name': 'Twinkie', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Krispy Kreme Doughnuts', '<br/>', '\n', '<b>Food: </b>', 'Twinkie', '<br/>', '\n', '<b>Calories: </b>', 
'150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 52): {'restaurant': 'McDonalds', 'food_name': 'Chicken McNuggets', 'calorie': '170', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Chicken McNuggets', '<br/>', '\n', '<b>Calories: </b>', '170', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 25): {'restaurant': 'Dunkin Donuts', 'food_name': 'Cinnamon Cake Donut', 'calorie': '310', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Cinnamon Cake Donut', '<br/>', '\n', '<b>Calories: </b>', '310', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 67): {'restaurant': 'Pizza Hut', 'food_name': 'Black Olives Pizza', 'calorie': '250', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Black Olives Pizza', '<br/>', '\n', '<b>Calories: </b>', '250', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 83): {'restaurant': 'Quiznos', 'food_name': 'Sonoma Turkey Sammie', 'calorie': '300', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Sonoma Turkey Sammie', '<br/>', '\n', '<b>Calories: </b>', '300', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (6, 47): {'restaurant': 'Krispy Kreme Doughnuts', 'food_name': 'Glazed Chocolate Cake Doughnut', 'calorie': '300', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Krispy Kreme Doughnuts', '<br/>', '\n', '<b>Food: </b>', 'Glazed Chocolate Cake Doughnut', '<br/>', '\n', '<b>Calories: </b>', '300', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (12, 99): {'restaurant': 'Taco Bell', 
'food_name': 'Gordita', 'calorie': '280', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Taco Bell', '<br/>', '\n', '<b>Food: </b>', 'Gordita', '<br/>', '\n', '<b>Calories: </b>', '280', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 2): {'restaurant': 'Arbys', 'food_name': 'Beef and Cheddar Sandwich', 'calorie': '445', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Beef and Cheddar Sandwich', '<br/>', '\n', '<b>Calories: </b>', '445', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (7, 59): {'restaurant': 'McDonalds', 'food_name': 'Southern Style Crispy Chicken Sandwich', 'calorie': '400', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'McDonalds', '<br/>', '\n', '<b>Food: </b>', 'Southern Style Crispy Chicken Sandwich', '<br/>', '\n', '<b>Calories: </b>', '400', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 73): {'restaurant': 'Pizza Hut', 'food_name': 'Green Pepper Onion Meat Pizza', 'calorie': '310', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Green Pepper Onion Meat Pizza', '<br/>', '\n', '<b>Calories: </b>', '310', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 102): {'restaurant': 'Wendys', 'food_name': 'Chicken Club', 'calorie': '540', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Chicken Club', '<br/>', '\n', '<b>Calories: </b>', '540', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 84): {'restaurant': 'Quiznos', 'food_name': 'Turkey Ranch and Swiss Sandwich', 'calorie': '510', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Turkey Ranch and Swiss 
Sandwich', '<br/>', '\n', '<b>Calories: </b>', '510', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (4, 34): {'restaurant': 'Giant Eagle', 'food_name': 'Chocolate Chip Classic Muffin', 'calorie': '270', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Giant Eagle', '<br/>', '\n', '<b>Food: </b>', 'Chocolate Chip Classic Muffin', '<br/>', '\n', '<b>Calories: </b>', '270', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 35): {'restaurant': 'KFC', 'food_name': 'Apple Pie Minis', 'calorie': '130', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Apple Pie Minis', '<br/>', '\n', '<b>Calories: </b>', '130', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 16): {'restaurant': 'Bruggers Bagels', 'food_name': 'Poppy Seed Bagel', 'calorie': '320', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Poppy Seed Bagel', '<br/>', '\n', '<b>Calories: </b>', '320', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (12, 96): {'restaurant': 'Taco Bell', 'food_name': 'Burrito', 'calorie': '350', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Taco Bell', '<br/>', '\n', '<b>Food: </b>', 'Burrito', '<br/>', '\n', '<b>Calories: </b>', '350', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (8, 64): {'restaurant': 'Panera', 'food_name': 'Mediterranean Veggie Sandwich', 'calorie': '610', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Panera', '<br/>', '\n', '<b>Food: </b>', 'Mediterranean Veggie Sandwich', '<br/>', '\n', '<b>Calories: </b>', '610', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 30): {'restaurant': 'Dunkin Donuts', 
'food_name': 'Old Fashioned Cake Donut', 'calorie': '280', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Old Fashioned Cake Donut', '<br/>', '\n', '<b>Calories: </b>', '280', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (10, 79): {'restaurant': 'Quiznos', 'food_name': 'Honey Bourbon Chicken Sandwich', 'calorie': '480', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Quiznos', '<br/>', '\n', '<b>Food: </b>', 'Honey Bourbon Chicken Sandwich', '<br/>', '\n', '<b>Calories: </b>', '480', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 68): {'restaurant': 'Pizza Hut', 'food_name': 'Bread Stick', 'calorie': '160', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Bread Stick', '<br/>', '\n', '<b>Calories: </b>', '160', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (13, 105): {'restaurant': 'Wendys', 'food_name': 'Drink', 'calorie': '150', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Wendys', '<br/>', '\n', '<b>Food: </b>', 'Drink', '<br/>', '\n', '<b>Calories: </b>', '150', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 41): {'restaurant': 'KFC', 'food_name': 'Home-Style Biscuits', 'calorie': '180', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Home-Style Biscuits', '<br/>', '\n', '<b>Calories: </b>', '180', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (5, 38): {'restaurant': 'KFC', 'food_name': 'Crispy Chicken Thighs', 'calorie': '360', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'KFC', '<br/>', '\n', '<b>Food: </b>', 'Crispy Chicken Thighs', '<br/>', '\n', '<b>Calories: </b>', '360', '<br/>', '<br/>', '\n', 
'<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (11, 95): {'restaurant': 'Subway', 'food_name': 'Veggie Delite Sandwich', 'calorie': '230', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Subway', '<br/>', '\n', '<b>Food: </b>', 'Veggie Delite Sandwich', '<br/>', '\n', '<b>Calories: </b>', '230', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (1, 8): {'restaurant': 'Arbys', 'food_name': 'Reuben Turkey Sandwich', 'calorie': '594', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Arbys', '<br/>', '\n', '<b>Food: </b>', 'Reuben Turkey Sandwich', '<br/>', '\n', '<b>Calories: </b>', '594', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (3, 29): {'restaurant': 'Dunkin Donuts', 'food_name': 'Marble Frosted Donut', 'calorie': '240', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Dunkin Donuts', '<br/>', '\n', '<b>Food: </b>', 'Marble Frosted Donut', '<br/>', '\n', '<b>Calories: </b>', '240', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (2, 15): {'restaurant': 'Bruggers Bagels', 'food_name': 'Plain Bagel', 'calorie': '320', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Bruggers Bagels', '<br/>', '\n', '<b>Food: </b>', 'Plain Bagel', '<br/>', '\n', '<b>Calories: </b>', '320', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}, (9, 71): {'restaurant': 'Pizza Hut', 'food_name': 'Desert Cherry Nut Pizza', 'calorie': 'N/A', 'raw': ['\n\n', '<br/>', '<b>Restaurant: </b>', 'Pizza Hut', '<br/>', '\n', '<b>Food: </b>', 'Desert Cherry Nut Pizza', '<br/>', '\n', '<b>Calories: </b>', 'N/A', '<br/>', '<br/>', '\n', '<a href="./index.php">Search again</a>', '\n', '<br/>', '<br/>', '<b>Inst 1</b>']}}
| 19,889.5 | 39,778 | 0.470525 | 5,632 | 39,779 | 3.303622 | 0.061967 | 0.053692 | 0.047727 | 0.041761 | 0.790498 | 0.696012 | 0.64189 | 0.625121 | 0.620445 | 0.594163 | 0 | 0.031967 | 0.110586 | 39,779 | 1 | 39,779 | 39,779 | 0.493923 | 0 | 0 | 0 | 0 | 0 | 0.661404 | 0.06976 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7dec2bb361d9fc498b207a41189a8cceaeaa5bc6 | 42 | py | Python | groot/__init__.py | coolbromcdude/groot | 68d47ac7341e05a5530c7f8a0682cfbf8b3cb5fe | [
"MIT"
] | null | null | null | groot/__init__.py | coolbromcdude/groot | 68d47ac7341e05a5530c7f8a0682cfbf8b3cb5fe | [
"MIT"
] | null | null | null | groot/__init__.py | coolbromcdude/groot | 68d47ac7341e05a5530c7f8a0682cfbf8b3cb5fe | [
"MIT"
] | null | null | null | from .groot import Groot, HelperFunctions
| 21 | 41 | 0.833333 | 5 | 42 | 7 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119048 | 42 | 1 | 42 | 42 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7def31bc00a6806fbdc6b02d3206dcbe66a12c80 | 45 | py | Python | tasks/cv_tasks/__init__.py | zhaoguangxiang/OFA | cc1719df2713f0a046f34acb0afd8782e08ea6be | [
"Apache-2.0"
] | 367 | 2022-02-07T10:46:36.000Z | 2022-03-31T14:20:57.000Z | tasks/cv_tasks/__init__.py | zhaoguangxiang/OFA | cc1719df2713f0a046f34acb0afd8782e08ea6be | [
"Apache-2.0"
] | 29 | 2022-02-16T03:43:33.000Z | 2022-03-31T03:23:35.000Z | tasks/cv_tasks/__init__.py | zhaoguangxiang/OFA | cc1719df2713f0a046f34acb0afd8782e08ea6be | [
"Apache-2.0"
] | 44 | 2022-02-11T05:14:59.000Z | 2022-03-30T19:54:33.000Z | from .image_classify import ImageClassifyTask | 45 | 45 | 0.911111 | 5 | 45 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 45 | 1 | 45 | 45 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
816eade17175697fbb1940eef99d1c92eb16ef25 | 32 | py | Python | yolact/__init__.py | plarr2020-team1/yolact | 0256dfc48b63e9cde7fdfb0b340ae4213f2b5a5b | [
"MIT"
] | null | null | null | yolact/__init__.py | plarr2020-team1/yolact | 0256dfc48b63e9cde7fdfb0b340ae4213f2b5a5b | [
"MIT"
] | null | null | null | yolact/__init__.py | plarr2020-team1/yolact | 0256dfc48b63e9cde7fdfb0b340ae4213f2b5a5b | [
"MIT"
] | null | null | null | from yolact.yolact import Yolact | 32 | 32 | 0.875 | 5 | 32 | 5.6 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
818d7d635823dadcdff23f136553a1fec4689913 | 49 | py | Python | kalliope/players/sounddeviceplayer/__init__.py | joshuaboniface/kalliope | 0e040be3165e838485d1e5addc4d2c5df12bfd84 | [
"MIT"
] | 1 | 2020-03-30T15:03:19.000Z | 2020-03-30T15:03:19.000Z | kalliope/players/sounddeviceplayer/__init__.py | joshuaboniface/kalliope | 0e040be3165e838485d1e5addc4d2c5df12bfd84 | [
"MIT"
] | null | null | null | kalliope/players/sounddeviceplayer/__init__.py | joshuaboniface/kalliope | 0e040be3165e838485d1e5addc4d2c5df12bfd84 | [
"MIT"
] | 1 | 2021-11-21T19:08:15.000Z | 2021-11-21T19:08:15.000Z | from .sounddeviceplayer import Sounddeviceplayer
| 24.5 | 48 | 0.897959 | 4 | 49 | 11 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 49 | 1 | 49 | 49 | 0.977778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
81a9d7a6281fe0dd958d3b94f479064c5cbb8262 | 114 | py | Python | pyleecan/Methods/Slot/SlotWLSRPM/__init__.py | tobsen2code/pyleecan | 5b1ded9e389e0c79ed7b7c878b6e939f2d9962e9 | [
"Apache-2.0"
] | 95 | 2019-01-23T04:19:45.000Z | 2022-03-17T18:22:10.000Z | pyleecan/Methods/Slot/SlotWLSRPM/__init__.py | ecs-kev/pyleecan | 1faedde4b24acc6361fa1fdd4e980eaec4ca3a62 | [
"Apache-2.0"
] | 366 | 2019-02-20T07:15:08.000Z | 2022-03-31T13:37:23.000Z | pyleecan/Methods/Slot/SlotWLSRPM/__init__.py | ecs-kev/pyleecan | 1faedde4b24acc6361fa1fdd4e980eaec4ca3a62 | [
"Apache-2.0"
] | 74 | 2019-01-24T01:47:31.000Z | 2022-02-25T05:44:42.000Z | from ....Methods.Slot.Slot import SlotCheckError
class SLSRPMOutterError(SlotCheckError):
""" """
pass
| 14.25 | 48 | 0.692982 | 10 | 114 | 7.9 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175439 | 114 | 7 | 49 | 16.285714 | 0.840426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
81aa2a5a75193ed95355dcd623c94a317bcc959e | 15,888 | py | Python | topside/visualization/optimization/cost_terms.py | roguextech/Waterloo-Rocketry-topside | 345e7d47efdac04c2c5f70d55f83bd77acdbb511 | [
"MIT"
] | 4 | 2020-04-18T00:40:55.000Z | 2021-06-10T04:04:09.000Z | topside/visualization/optimization/cost_terms.py | roguextech/Waterloo-Rocketry-topside | 345e7d47efdac04c2c5f70d55f83bd77acdbb511 | [
"MIT"
] | 82 | 2020-04-15T21:26:04.000Z | 2022-02-04T04:50:07.000Z | topside/visualization/optimization/cost_terms.py | roguextech/Waterloo-Rocketry-topside | 345e7d47efdac04c2c5f70d55f83bd77acdbb511 | [
"MIT"
] | 8 | 2020-04-21T17:54:36.000Z | 2022-02-28T16:14:21.000Z | import numpy as np
class NeighboringDistance():
"""
Encourages a nominal distance between neighboring nodes.
For two nodes [x1, y1] and [x2, y2] with nominal separation D, the
neighboring distance cost C is defined as:
C = W * (D - sqrt((x2 - x1)^2 + (y2 - y1)^2))^2
where W is a constant used to scale the relative "importance" of
this cost term.
"""
def __init__(self, g, node_indices, node_component_neighbors, settings, internal):
"""
Initialize the cost term.
Parameters
----------
g: graph
The graph for which this cost term will be evaluated.
Typically, this is a NetworkX graph, but it can be any type
with a .neighbors(n) method returning the neighbors of node
n. Calling this directly with a PlumbingEngine graph may
lead to unexpected results; it is best to use a terminal
graph instead (topside.terminal_graph).
node_indices: dict
Dict where node_indices[n] is the index i of node n in the
cost vector.
node_component_neighbors: dict
Dict where, for a given node n, node_component_neighbors[n]
is a list of all nodes in g that are both neighbors of n
and in the same component as n.
settings: OptimizerSettings
This can be any type that provides the same attributes as
topside.visualization.optimization.OptimizerSettings.
internal: bool
if True, this cost term penalizes only deviation from
nominal distance within a component. If False, this cost
term instead penalizes deviation from nominal distance
between two nodes that are not part of the same component.
"""
self.num_nodes = len(node_indices)
self.mask = np.ones((self.num_nodes, self.num_nodes))
for node in g:
i = node_indices[node]
for neighbor in g.neighbors(node):
j = node_indices[neighbor]
if internal is (neighbor in node_component_neighbors[node]):
self.mask[i, j] = 0
self.mask[j, i] = 0
if internal:
self.nominal_dist = settings.nominal_dist_internal
self.weight = settings.internal_weight
else:
self.nominal_dist = settings.nominal_dist_neighbors
self.weight = settings.neighbors_weight
def evaluate(self, costargs):
"""
Evaluate the cost term and return a tuple of (cost, gradient).
Arguments:
costargs: dict
Expected to contain:
- 'deltas': N x N x 2 numpy array, where:
deltas[i, j, :] == [xj - xi, yj - yi]
- 'norms': N x N numpy array, where:
norms[i, j] == sqrt[(xj - xi)^2 + (yj - yi)^2]
"""
cost = 0
grad = np.zeros(self.num_nodes * 2)
norms = np.ma.masked_array(costargs['norms'], mask=self.mask)
cost_matrix = self.weight * (self.nominal_dist - norms) ** 2
cost += np.sum(cost_matrix.filled(0))
grad_common = self.weight * (norms - self.nominal_dist) * (4 / norms)
grad_matrix = costargs['deltas'] * grad_common[:, :, None]
grad += np.reshape(np.sum(grad_matrix.filled(0), axis=1), grad.shape)
return (cost, grad)
class NonNeighboringDistance:
"""
Encourages a minimum distance between non-neighboring nodes.
For two nodes [x1, y1] and [x2, y2] with minimum separation D, the
non-neighboring distance cost C can be calculated with the following
pseudocode algorithm:
d = sqrt((x2 - x1)^2 + (y2 - y1)^2)
if d < D:
C = W * (D - d)^2
else:
C = 0
where W is a constant used to scale the relative "importance" of
this cost term.
"""
def __init__(self, g, node_indices, node_component_neighbors, settings):
"""
Initialize the cost term.
Parameters
----------
g: graph
The graph for which this cost term will be evaluated.
Typically, this is a NetworkX graph, but it can be any type
with a .neighbors(n) method returning the neighbors of node
n. Calling this directly with a PlumbingEngine graph may
lead to unexpected results; it is best to use a terminal
graph instead (topside.terminal_graph).
node_indices: dict
Dict where node_indices[n] is the index i of node n in the
cost vector.
node_component_neighbors: dict
Dict where, for a given node n, node_component_neighbors[n]
is a list of all nodes in g that are both neighbors of n
and in the same component as n.
settings: OptimizerSettings
This can be any type that provides the same attributes as
topside.visualization.optimization.OptimizerSettings.
"""
self.num_nodes = len(node_indices)
self.mask = np.identity(self.num_nodes)
for node in g:
i = node_indices[node]
for neighbor in g.neighbors(node):
j = node_indices[neighbor]
self.mask[i, j] = 1
self.mask[j, i] = 1
self.weight = settings.others_weight
self.minimum_dist = settings.minimum_dist_others
def evaluate(self, costargs):
"""
Evaluate the cost term and return a tuple of (cost, gradient).
Arguments:
costargs: dict
Expected to contain:
- 'deltas': N x N x 2 numpy array, where:
deltas[i, j, :] == [xj - xi, yj - yi]
- 'norms': N x N numpy array, where:
norms[i, j] == sqrt[(xj - xi)^2 + (yj - yi)^2]
"""
cost = 0
grad = np.zeros(self.num_nodes * 2)
nodes_to_ignore = (costargs['norms'] >= self.minimum_dist)
mask = np.logical_or(self.mask, nodes_to_ignore)
masked_norms = np.ma.masked_array(costargs['norms'], mask=mask)
cost_matrix = self.weight * (self.minimum_dist - masked_norms) ** 2
cost += np.sum(cost_matrix.filled(0))
grad_common_term = self.weight * (masked_norms - self.minimum_dist) * (4 / masked_norms)
grad_matrix = costargs['deltas'] * grad_common_term[:, :, None]
grad += np.reshape(np.sum(grad_matrix.filled(0), axis=1), grad.shape)
return (cost, grad)
class CentroidDeviation:
"""
Encourages a node to remain close to the centroid of its neighbors.
The centroid deviation cost term C can be calculated with the
following pseudocode algorithm:
if num_neighbors(n) < 2:
C = 0
else:
v = centroid(neighbors(n))
C = W * ((vx - nx)^2 + (vy - ny)^2)
where W is a constant used to scale the relative "importance" of
this cost term.
"""
def __init__(self, g, node_indices, node_component_neighbors, settings):
"""
Initialize the cost term.
Parameters
----------
g: graph
The graph for which this cost term will be evaluated.
Typically, this is a NetworkX graph, but it can be any type
with a .neighbors(n) method returning the neighbors of node
n. Calling this directly with a PlumbingEngine graph may
lead to unexpected results; it is best to use a terminal
graph instead (topside.terminal_graph).
node_indices: dict
Dict where node_indices[n] is the index i of node n in the
cost vector.
node_component_neighbors: dict
Dict where, for a given node n, node_component_neighbors[n]
is a list of all nodes in g that are both neighbors of n
and in the same component as n.
settings: OptimizerSettings
This can be any type that provides the same attributes as
topside.visualization.optimization.OptimizerSettings.
"""
self.num_nodes = len(node_indices)
self.cost_mask = np.ones((self.num_nodes, self.num_nodes, 2))
self.grad_mask = np.ones((self.num_nodes, self.num_nodes, 2))
self.num_neighbors = np.ones((self.num_nodes, 1))
for node in g:
i = node_indices[node]
neighbors = list(g.neighbors(node))
if len(neighbors) > 1:
self.grad_mask[i, i] = 0
self.num_neighbors[i] = len(neighbors)
for neighbor in neighbors:
j = node_indices[neighbor]
self.cost_mask[i, j] = 0
self.grad_mask[j, i] = 0
self.weight = settings.centroid_deviation_weight
grad_coeffs_diag = self.weight * 2 * np.identity(self.num_nodes)
grad_coeffs_off_diag = self.weight * -2 * \
np.ones((self.num_nodes, self.num_nodes, 2)) / self.num_neighbors
np.fill_diagonal(grad_coeffs_off_diag[:, :, 0], 0)
np.fill_diagonal(grad_coeffs_off_diag[:, :, 1], 0)
self.grad_coeffs = grad_coeffs_off_diag + grad_coeffs_diag[:, :, None]
def evaluate(self, costargs):
"""
Evaluate the cost term and return a tuple of (cost, gradient).
Arguments:
costargs: dict
Expected to contain:
- 'deltas': N x N x 2 numpy array, where:
deltas[i, j, :] == [xj - xi, yj - yi]
- 'norms': N x N numpy array, where:
norms[i, j] == sqrt[(xj - xi)^2 + (yj - yi)^2]
"""
cost = 0
grad = np.zeros(self.num_nodes * 2)
masked_deltas = np.ma.masked_array(costargs['deltas'], mask=self.cost_mask)
centroid_deviations = np.sum(masked_deltas, axis=1) / self.num_neighbors
cost_matrix = self.weight * centroid_deviations ** 2
cost += np.sum(cost_matrix.filled(0))
reshaped = np.reshape(centroid_deviations, (1, self.num_nodes, 2))
repeated = np.repeat(reshaped, self.num_nodes, axis=0)
grad_deviations = np.ma.masked_array(repeated, mask=self.grad_mask)
grad_matrix = self.grad_coeffs * grad_deviations
grad += np.reshape(np.sum(grad_matrix.filled(0), axis=1), grad.shape)
return (cost, grad)
class RightAngleDeviation:
"""
Encourages a component to be oriented horizontally or vertically.
The right angle deviation cost term C is defined as follows:
C = W * (x2 - x1)^2 * (y2 - y1)^2
where W is a constant used to scale the relative "importance" of
this cost term.
"""
def __init__(self, g, node_indices, node_component_neighbors, settings):
"""
Initialize the cost term.
Parameters
----------
g: graph
The graph for which this cost term will be evaluated.
Typically, this is a NetworkX graph, but it can be any type
with a .neighbors(n) method returning the neighbors of node
n. Calling this directly with a PlumbingEngine graph may
lead to unexpected results; it is best to use a terminal
graph instead (topside.terminal_graph).
node_indices: dict
Dict where node_indices[n] is the index i of node n in the
cost vector.
node_component_neighbors: dict
Dict where, for a given node n, node_component_neighbors[n]
is a list of all nodes in g that are both neighbors of n
and in the same component as n.
settings: OptimizerSettings
This can be any type that provides the same attributes as
topside.visualization.optimization.OptimizerSettings.
"""
self.num_nodes = len(node_indices)
self.mask = np.ones((self.num_nodes, self.num_nodes, 2))
for node in g:
i = node_indices[node]
for neighbor in node_component_neighbors[node]:
j = node_indices[neighbor]
self.mask[i, j] = 0
self.mask[j, i] = 0
self.weight = settings.right_angle_weight
def evaluate(self, costargs):
"""
Evaluate the cost term and return a tuple of (cost, gradient).
Arguments:
costargs: dict
Expected to contain:
- 'deltas': N x N x 2 numpy array, where:
deltas[i, j, :] == [xj - xi, yj - yi]
- 'norms': N x N numpy array, where:
norms[i, j] == sqrt[(xj - xi)^2 + (yj - yi)^2]
"""
cost = 0
grad = np.zeros(self.num_nodes * 2)
internal_deltas = np.ma.masked_array(costargs['deltas'], mask=self.mask)
dxdy = np.prod(internal_deltas, axis=2)
cost_matrix = self.weight * dxdy ** 2
cost += np.sum(cost_matrix.filled(0))
grad_common = self.weight * 4 * dxdy[:, :, None]
grad_matrix = grad_common * np.flip(internal_deltas, axis=2)
grad += np.reshape(np.sum(grad_matrix.filled(0), axis=1), grad.shape)
return (cost, grad)
class HorizontalDeviation:
"""
Encourages a component to be oriented horizontally.
The horizontal deviation cost term C is defined as follows:
C = W * (y2 - y1)^2
where W is a constant used to scale the relative "importance" of
this cost term.
"""
def __init__(self, g, node_indices, node_component_neighbors, settings):
"""
Initialize the cost term.
Parameters
----------
g: graph
The graph for which this cost term will be evaluated.
Typically, this is a NetworkX graph, but it can be any type
with a .neighbors(n) method returning the neighbors of node
n. Calling this directly with a PlumbingEngine graph may
lead to unexpected results; it is best to use a terminal
graph instead (topside.terminal_graph).
node_indices: dict
Dict where node_indices[n] is the index i of node n in the
cost vector.
node_component_neighbors: dict
Dict where, for a given node n, node_component_neighbors[n]
is a list of all nodes in g that are both neighbors of n
and in the same component as n.
settings: OptimizerSettings
This can be any type that provides the same attributes as
topside.visualization.optimization.OptimizerSettings.
"""
self.num_nodes = len(node_indices)
self.mask = np.ones((self.num_nodes, self.num_nodes))
for node in g:
i = node_indices[node]
for neighbor in node_component_neighbors[node]:
j = node_indices[neighbor]
self.mask[i, j] = 0
self.mask[j, i] = 0
self.weight = settings.horizontal_weight
def evaluate(self, costargs):
"""
Evaluate the cost term and return a tuple of (cost, gradient).
Arguments:
costargs: dict
Expected to contain:
- 'deltas': N x N x 2 numpy array, where:
deltas[i, j, :] == [xj - xi, yj - yi]
- 'norms': N x N numpy array, where:
norms[i, j] == sqrt[(xj - xi)^2 + (yj - yi)^2]
"""
cost = 0
grad = np.zeros(self.num_nodes * 2)
delta_y = costargs['deltas'][:, :, 1]
internal_delta_y = np.ma.masked_array(delta_y, mask=self.mask)
cost_matrix = self.weight * internal_delta_y ** 2
cost += np.sum(cost_matrix.filled(0))
grad_matrix = np.ma.masked_array(np.zeros((self.num_nodes, self.num_nodes, 2)))
grad_matrix[:, :, 1] = self.weight * 4 * internal_delta_y
grad += np.reshape(np.sum(grad_matrix.filled(0), axis=1), grad.shape)
return (cost, grad)
| 34.464208 | 96 | 0.590949 | 2,115 | 15,888 | 4.327187 | 0.091253 | 0.02524 | 0.038024 | 0.015625 | 0.802666 | 0.789773 | 0.774366 | 0.737653 | 0.720498 | 0.696569 | 0 | 0.010828 | 0.319927 | 15,888 | 460 | 97 | 34.53913 | 0.836187 | 0.476586 | 0 | 0.5 | 0 | 0 | 0.006616 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075758 | false | 0 | 0.007576 | 0 | 0.159091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
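The `NeighboringDistance` docstring above defines the cost C = W * (D - d)^2. A standalone sketch — with made-up example values of W and D, independent of the vectorized NumPy implementation — computes that cost and its analytic gradient for a single pair of nodes and checks it against a central finite difference:

```python
# Numeric sketch of C = W * (D - d)^2 and its gradient w.r.t. one node's
# x-coordinate, verified by a central finite difference. W and D are
# illustrative values, not the OptimizerSettings defaults.
import math

W, D = 2.0, 1.5  # weight and nominal separation (example values)

def cost(p1, p2):
    d = math.dist(p1, p2)
    return W * (D - d) ** 2

def grad_x1(p1, p2):
    # dC/dx1 = 2 * W * (d - D) * (x1 - x2) / d
    d = math.dist(p1, p2)
    return 2 * W * (d - D) * (p1[0] - p2[0]) / d

p1, p2 = (0.0, 0.0), (1.0, 0.0)
h = 1e-6
numeric = (cost((p1[0] + h, p1[1]), p2) - cost((p1[0] - h, p1[1]), p2)) / (2 * h)
print(abs(grad_x1(p1, p2) - numeric) < 1e-5)  # gradients agree
```

The vectorized code sums each symmetric (i, j)/(j, i) pair twice, which is where its combined factor of 4 in `grad_common` comes from relative to the per-pair factor of 2 here.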
81dd7bf93e1b979a9c38544720897fb0c67f3793 | 110 | py | Python | SalavatiHolidayCheckup/__init__.py | khalooei/holiday-check | 683ea8aeb9b3103ffd1c9bb3d21101cd08d431a9 | [
"MIT"
] | null | null | null | SalavatiHolidayCheckup/__init__.py | khalooei/holiday-check | 683ea8aeb9b3103ffd1c9bb3d21101cd08d431a9 | [
"MIT"
] | null | null | null | SalavatiHolidayCheckup/__init__.py | khalooei/holiday-check | 683ea8aeb9b3103ffd1c9bb3d21101cd08d431a9 | [
"MIT"
] | 2 | 2021-05-06T21:19:44.000Z | 2021-05-06T21:31:25.000Z | from .HolidayCheck import *
from datetime import datetime
import jdatetime
from hijri_converter import convert | 27.5 | 35 | 0.863636 | 14 | 110 | 6.714286 | 0.571429 | 0.297872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118182 | 110 | 4 | 35 | 27.5 | 0.969072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c493db78068baf0d919e5d32b6b88fdfb9fe0e2a | 317 | py | Python | lang/py/cookbook/v2/source/cb2_11_6_sol_2.py | ch1huizong/learning | 632267634a9fd84a5f5116de09ff1e2681a6cc85 | [
"MIT"
] | null | null | null | lang/py/cookbook/v2/source/cb2_11_6_sol_2.py | ch1huizong/learning | 632267634a9fd84a5f5116de09ff1e2681a6cc85 | [
"MIT"
] | null | null | null | lang/py/cookbook/v2/source/cb2_11_6_sol_2.py | ch1huizong/learning | 632267634a9fd84a5f5116de09ff1e2681a6cc85 | [
"MIT"
] | null | null | null | icon='''R0lGODdhFQAVAPMAAAQ2PESapISCBASCBMTCxPxmNCQiJJya/ISChGRmzPz+/PxmzDQyZ
DQyZDQyZDQyZCwAAAAAFQAVAAAElJDISau9Vh2WMD0gqHHelJwnsXVloqDd2hrMm8pYYiSHYfMMRm
53ULlQHGFFx1MZCciUiVOsPmEkKNVp3UBhJ4Ohy1UxerSgJGZMMBbcBACQlVhRiHvaUsXHgywTdyc
LdxyB gm1vcTyIZW4MeU6NgQEBXEGRcQcIlwQIAwEHoioCAgWmCZ0Iq5+hA6wIpqislgGhthEAOw==
'''
| 52.833333 | 78 | 0.940063 | 9 | 317 | 33.111111 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061093 | 0.018927 | 317 | 5 | 79 | 63.4 | 0.897106 | 0 | 0 | 0 | 0 | 0 | 0.962145 | 0.930599 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c4a01a91b0fce925e5fb9acd513f594ad72f9537 | 104 | py | Python | Old/src/com/basic/ruboob/jisuan.py | exchris/Pythonlearn | 174f38a86cf1c85d6fc099005aab3568e7549cd0 | [
"MIT"
] | null | null | null | Old/src/com/basic/ruboob/jisuan.py | exchris/Pythonlearn | 174f38a86cf1c85d6fc099005aab3568e7549cd0 | [
"MIT"
] | 1 | 2018-11-27T09:58:54.000Z | 2018-11-27T09:58:54.000Z | Old/src/com/basic/ruboob/jisuan.py | exchris/pythonlearn | 174f38a86cf1c85d6fc099005aab3568e7549cd0 | [
"MIT"
] | null | null | null | # Numeric operations
5 + 4  # addition
4.3 - 2  # subtraction
3 * 7  # multiplication
2 / 4  # division, yields a float
2 // 4  # division, yields an integer
17 % 3  # modulo (remainder)
2 ** 5  # exponentiation | 13 | 18 | 0.451923 | 25 | 104 | 1.88 | 0.6 | 0.085106 | 0.170213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238806 | 0.355769 | 104 | 8 | 19 | 13 | 0.462687 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
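The record above demonstrates Python's arithmetic operators as bare expressions. The same operators with their results captured in one place — note that `/` always yields a float while `//` floors:

```python
# The arithmetic operators above, with results captured for inspection.
results = {
    "add": 5 + 4,        # 9
    "sub": 4.3 - 2,      # ~2.3 (float subtraction)
    "mul": 3 * 7,        # 21
    "truediv": 2 / 4,    # 0.5 (/ always returns a float)
    "floordiv": 2 // 4,  # 0 (// floors; int result for int operands)
    "mod": 17 % 3,       # 2
    "pow": 2 ** 5,       # 32
}
print(results)
```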
c4d260332024beff0f03ea02bd502430fc3d0a83 | 201 | py | Python | pizza/admin.py | Neo945/PizzaShop | 2a3a365d68ebdc0152f9e9e5176fe78422e51640 | [
"MIT"
] | null | null | null | pizza/admin.py | Neo945/PizzaShop | 2a3a365d68ebdc0152f9e9e5176fe78422e51640 | [
"MIT"
] | null | null | null | pizza/admin.py | Neo945/PizzaShop | 2a3a365d68ebdc0152f9e9e5176fe78422e51640 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Pizza,Topping,Topping_pizza
# Register your models here.
admin.site.register(Pizza)
admin.site.register(Topping)
admin.site.register(Topping_pizza) | 28.714286 | 47 | 0.830846 | 29 | 201 | 5.689655 | 0.413793 | 0.163636 | 0.309091 | 0.290909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079602 | 201 | 7 | 48 | 28.714286 | 0.891892 | 0.129353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
c4de07e8a1cb717f09e96ebeb64239a53f38a241 | 27 | py | Python | ConverttoDDB/__init__.py | mailhexu/ConverttoDDB | 03ca131d8a62b814ce6daa3b5d35db7922a148a2 | [
"MIT"
] | null | null | null | ConverttoDDB/__init__.py | mailhexu/ConverttoDDB | 03ca131d8a62b814ce6daa3b5d35db7922a148a2 | [
"MIT"
] | null | null | null | ConverttoDDB/__init__.py | mailhexu/ConverttoDDB | 03ca131d8a62b814ce6daa3b5d35db7922a148a2 | [
"MIT"
] | null | null | null | from ConverttoDDB import *
| 13.5 | 26 | 0.814815 | 3 | 27 | 7.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 27 | 1 | 27 | 27 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c4ec0e9259a4f03ae940a5f00ed55ffdbe18a231 | 27 | py | Python | src/__init__.py | ScienceStacks/FittingSurface | 7994995c7155817ea4334f10dcd21e691cee46da | [
"MIT"
] | null | null | null | src/__init__.py | ScienceStacks/FittingSurface | 7994995c7155817ea4334f10dcd21e691cee46da | [
"MIT"
] | null | null | null | src/__init__.py | ScienceStacks/FittingSurface | 7994995c7155817ea4334f10dcd21e691cee46da | [
"MIT"
] | null | null | null | import src.constants as cn
| 13.5 | 26 | 0.814815 | 5 | 27 | 4.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 27 | 1 | 27 | 27 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4801b3a8597330db6455106176750c09e5dae1ea | 39 | py | Python | dashserve/tests/__init__.py | omegaml/dashserve | 4ab09bc70bed4070cd7dca8cf9b976212546cff3 | [
"MIT"
] | 7 | 2019-04-01T19:31:33.000Z | 2020-06-08T13:23:39.000Z | dashserve/tests/__init__.py | omegaml/dashserve | 4ab09bc70bed4070cd7dca8cf9b976212546cff3 | [
"MIT"
] | null | null | null | dashserve/tests/__init__.py | omegaml/dashserve | 4ab09bc70bed4070cd7dca8cf9b976212546cff3 | [
"MIT"
] | null | null | null | from dashserve.tests import myapp
| 5.571429 | 33 | 0.74359 | 5 | 39 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 39 | 6 | 34 | 6.5 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
483af905d7f6302c29d48faf91ebdc055eeabafc | 119 | py | Python | CodeWars/Python/5 kyu/Where my anagrams at?/main.py | opastushkov/codewars-solutions | 0132a24259a4e87f926048318332dcb4d94858ca | [
"MIT"
] | null | null | null | CodeWars/Python/5 kyu/Where my anagrams at?/main.py | opastushkov/codewars-solutions | 0132a24259a4e87f926048318332dcb4d94858ca | [
"MIT"
] | null | null | null | CodeWars/Python/5 kyu/Where my anagrams at?/main.py | opastushkov/codewars-solutions | 0132a24259a4e87f926048318332dcb4d94858ca | [
"MIT"
] | null | null | null | from collections import Counter
def anagrams(word, words):
return [x for x in words if Counter(word) == Counter(x)] | 39.666667 | 60 | 0.731092 | 19 | 119 | 4.578947 | 0.684211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168067 | 119 | 3 | 60 | 39.666667 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
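The one-liner above filters a word list down to anagrams by comparing letter multisets: two words are anagrams exactly when their `Counter`s are equal. A usage sketch:

```python
# Usage sketch of the Counter-based anagram filter defined above.
from collections import Counter

def anagrams(word, words):
    return [x for x in words if Counter(word) == Counter(x)]

print(anagrams("abba", ["aabb", "abcd", "bbaa", "dada"]))  # ['aabb', 'bbaa']
```

Building a `Counter` is O(n) per word, so this avoids the sort that the classic `sorted(word) == sorted(x)` comparison would do.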
4876d919b7a112e32de20a167b6f53b1dbd9b0db | 7,606 | py | Python | tests/data_providers/test_frankfurter.py | N0081K/gold-digger | 3a68084bf65565a61009f5e01da643499a5e06e6 | [
"Apache-2.0"
] | 7 | 2016-05-04T17:13:58.000Z | 2017-11-07T07:29:16.000Z | tests/data_providers/test_frankfurter.py | N0081K/gold-digger | 3a68084bf65565a61009f5e01da643499a5e06e6 | [
"Apache-2.0"
] | 94 | 2019-07-03T15:33:29.000Z | 2022-03-28T01:17:41.000Z | tests/data_providers/test_frankfurter.py | N0081K/gold-digger | 3a68084bf65565a61009f5e01da643499a5e06e6 | [
"Apache-2.0"
] | 2 | 2017-05-30T13:55:01.000Z | 2017-08-20T19:52:45.000Z | from datetime import date
from decimal import Decimal
import pytest
from requests import Response
API_RESPONSE_USD = b"""
{
"base": "USD",
"date": "2019-04-15",
"rates": {
"BGN": 1.7288075665,
"NZD": 1.4787412711,
"ILS": 3.5618315213,
"RUB": 64.2633253779,
"CAD": 1.3312118801,
"PHP": 51.6892071069,
"CHF": 1.0028286043,
"AUD": 1.3931759922,
"JPY": 111.9596923893,
"TRY": 5.8019093079,
"HKD": 7.8392115266,
"MYR": 4.1167683196,
"HRK": 6.5729691505,
"CZK": 22.6509325555,
"IDR": 14060.0017678777,
"DKK": 6.5976310439,
"NOK": 8.4874038717,
"HUF": 283.0814107664,
"GBP": 0.7628834085,
"MXN": 18.7834349863,
"THB": 31.7652258464,
"ISK": 119.8621055423,
"ZAR": 13.9832051622,
"BRL": 3.8880049501,
"SGD": 1.3523380182,
"PLN": 3.7781313533,
"INR": 69.4249977902,
"KRW": 1132.6527004331,
"RON": 4.2091399275,
"CNY": 6.7061787324,
"SEK": 9.2467073279,
"EUR": 0.8839388314
}
}
"""
@pytest.fixture
def response():
"""
:rtype: requests.Response
"""
return Response()
class TestGetByDate:
@staticmethod
def test_get_by_date__available(frankfurter, response, logger):
"""
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rate = frankfurter.get_by_date(date(2019, 4, 15), "CZK", logger)
assert converted_rate == Decimal(22.6509325555)
@staticmethod
def test_get_by_date__date_unavailable(frankfurter, response, logger):
"""
Frankfurter API returns rates from last available date when asked for an unavailable one.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rate = frankfurter.get_by_date(date(2019, 4, 16), "CZK", logger)
assert converted_rate == Decimal(22.6509325555)
@staticmethod
def test_get_by_date__date_too_old(frankfurter, response, logger):
"""
Frankfurter API returns error when the specified date is before 1999-01-04.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 400
response._content = b'{"error": "Error message"}'
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_by_date(date(1000, 1, 11), "CZK", logger)
assert converted_rates is None
@staticmethod
def test_get_by_date__currency_unavailable(frankfurter, response, logger):
"""
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 400
response._content = b'{"error": "Error message"}'
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_by_date(date(2019, 4, 16), "XXX", logger)
assert converted_rates is None
@staticmethod
def test_get_by_date__base_currency_is_same_as_target_currency(frankfurter, response, base_currency, logger):
"""
Frankfurter API returns error when base and target currencies are same.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type base_currency: str
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_by_date(date(2019, 4, 16), base_currency, logger)
assert converted_rates == Decimal("1")
class TestGetAllByDate:
@staticmethod
def test_get_all_by_date__available(frankfurter, response, logger):
"""
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_all_by_date(date(2019, 4, 15), {"CZK", "EUR"}, logger)
assert converted_rates == {
"CZK": Decimal(22.6509325555),
"EUR": Decimal(0.8839388314),
}
@staticmethod
def test_get_all_by_date__date_unavailable(frankfurter, response, logger):
"""
Frankfurter API returns rates from last available date when asked for an unavailable one.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_all_by_date(date(2019, 4, 16), {"CZK", "EUR"}, logger)
assert converted_rates == {
"CZK": Decimal(22.6509325555),
"EUR": Decimal(0.8839388314),
}
@staticmethod
def test_get_all_by_date__date_too_old(frankfurter, response, logger):
"""
Frankfurter API returns error when the specified date is before 1999-01-04.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 400
response._content = b'{"error": "Error message"}'
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_all_by_date(date(1000, 1, 11), {"CZK"}, logger)
assert converted_rates == {}
@staticmethod
def test_get_all_by_date__currency_unavailable(frankfurter, response, logger):
"""
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 400
response._content = b'{"error": "Error message"}'
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_all_by_date(date(2019, 4, 16), {"XXX"}, logger)
assert converted_rates == {}
@staticmethod
def test_get_all_by_date__base_currency_is_same_as_target_currency(frankfurter, response, base_currency, logger):
"""
Frankfurter API returns error when base and target currencies are same.
:type frankfurter: gold_digger.data_providers.Frankfurter
:type response: requests.Response
:type base_currency: str
:type logger: gold_digger.utils.ContextLogger
"""
response.status_code = 200
response._content = API_RESPONSE_USD
frankfurter._get = lambda url, **kw: response
converted_rates = frankfurter.get_all_by_date(date(2019, 4, 16), {base_currency, "CZK"}, logger)
assert converted_rates == {
base_currency: Decimal(1),
"CZK": Decimal(22.6509325555),
}
| 32.643777 | 117 | 0.65593 | 862 | 7,606 | 5.554524 | 0.191415 | 0.025063 | 0.02924 | 0.045948 | 0.838555 | 0.832498 | 0.827903 | 0.821846 | 0.816416 | 0.816416 | 0 | 0.100311 | 0.239811 | 7,606 | 232 | 118 | 32.784483 | 0.727776 | 0.250197 | 0 | 0.421875 | 0 | 0 | 0.197481 | 0 | 0 | 0 | 0 | 0 | 0.078125 | 1 | 0.085938 | false | 0 | 0.03125 | 0 | 0.140625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
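The tests above repeatedly replace the provider's `_get` helper with a lambda returning a canned response. A self-contained sketch of that stubbing pattern — `FakeProvider` and `FakeResponse` are simplified stand-ins, not the real gold_digger or requests classes:

```python
# Sketch of the stubbing pattern from the tests above: swap the provider's
# HTTP helper (_get) for a lambda returning a canned response object.
# FakeProvider/FakeResponse are stand-ins for the real classes.
import json
from decimal import Decimal

class FakeResponse:
    def __init__(self, status_code, payload):
        self.status_code = status_code
        self._payload = payload

    def json(self):
        return json.loads(self._payload)

class FakeProvider:
    def _get(self, url, **kw):
        raise NotImplementedError  # replaced per-test, as above

    def get_by_date(self, currency):
        response = self._get("https://example.invalid/latest")
        if response.status_code != 200:
            return None
        rate = response.json()["rates"].get(currency)
        return Decimal(str(rate)) if rate is not None else None

provider = FakeProvider()
canned = FakeResponse(200, '{"rates": {"CZK": 22.65}}')
provider._get = lambda url, **kw: canned  # the stubbing trick from the tests
print(provider.get_by_date("CZK"))
```

Assigning the lambda on the instance shadows the class method, so no HTTP request is ever made and each test fully controls the status code and body the provider sees.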
6f95f31b7eba0c5f2f9682f9a8079675af4301e6 | 373 | py | Python | shared/tests/unit/util/test_key_strings.py | ostcar/openslides-datastore-service | 5b17e162c477e3d19b59b2dcfcf307538e5ce90b | [
"MIT"
] | null | null | null | shared/tests/unit/util/test_key_strings.py | ostcar/openslides-datastore-service | 5b17e162c477e3d19b59b2dcfcf307538e5ce90b | [
"MIT"
] | null | null | null | shared/tests/unit/util/test_key_strings.py | ostcar/openslides-datastore-service | 5b17e162c477e3d19b59b2dcfcf307538e5ce90b | [
"MIT"
] | null | null | null | from shared.util import is_reserved_field
def test_is_reserved_field_1():
assert is_reserved_field("meta_something")
def test_is_reserved_field_2():
assert is_reserved_field("meta")
def test_is_reserved_field_None():
assert is_reserved_field(None) is False
def test_is_reserved_field_other_string():
assert is_reserved_field("some_string") is False
| 20.722222 | 52 | 0.798928 | 58 | 373 | 4.637931 | 0.327586 | 0.334572 | 0.501859 | 0.252788 | 0.513011 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006154 | 0.128686 | 373 | 17 | 53 | 21.941176 | 0.821538 | 0 | 0 | 0 | 0 | 0 | 0.077748 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 1 | 0.444444 | true | 0 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6f9e78e8624fe6b515da43e4230ba39f621331f3 | 27 | py | Python | python/prism_adacs/prism_adacs/_internal/tests/test_setup.py | ADACS-Australia/ADACS-SS18B-EvdVelden | b995797f06cf659076f716f2a2ee8435538badfc | [
"MIT"
] | null | null | null | python/prism_adacs/prism_adacs/_internal/tests/test_setup.py | ADACS-Australia/ADACS-SS18B-EvdVelden | b995797f06cf659076f716f2a2ee8435538badfc | [
"MIT"
] | 1 | 2022-01-19T16:19:02.000Z | 2022-01-19T16:19:02.000Z | library/tests/test_setup.py | pimoroni/apds9500-python | 8eb3dde36b4639be7ca1b253dab2aef982d111f3 | [
"MIT"
] | null | null | null | def test_setup():
pass
| 9 | 17 | 0.62963 | 4 | 27 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.259259 | 27 | 2 | 18 | 13.5 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
6fa500e42625c19d800cd89e5b39cdf8b51c29b1 | 15,131 | py | Python | tensorflow/python/keras/layers/preprocessing/category_encoding_test.py | sgautham2k/tensorflow | d99a093ba8cf2bf83d01e585e8e1805c3e6145ba | [
"Apache-2.0"
] | 1 | 2021-06-17T17:07:40.000Z | 2021-06-17T17:07:40.000Z | tensorflow/python/keras/layers/preprocessing/category_encoding_test.py | dfki-thsc/tensorflow | 8d746f768196a2434d112e98fc26c99590986d73 | [
"Apache-2.0"
] | 2 | 2021-11-10T20:10:39.000Z | 2022-02-10T05:15:31.000Z | tensorflow/python/keras/layers/preprocessing/category_encoding_test.py | dfki-thsc/tensorflow | 8d746f768196a2434d112e98fc26c99590986d73 | [
"Apache-2.0"
] | 1 | 2021-04-20T18:26:18.000Z | 2021-04-20T18:26:18.000Z | # Copyright 2020 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for Keras text category_encoding preprocessing layer."""
from absl.testing import parameterized
import numpy as np
from tensorflow.python import keras
from tensorflow.python.framework import constant_op
from tensorflow.python.framework import dtypes
from tensorflow.python.framework import errors
from tensorflow.python.framework import sparse_tensor
from tensorflow.python.keras import backend
from tensorflow.python.keras import keras_parameterized
from tensorflow.python.keras.layers import core
from tensorflow.python.keras.layers.preprocessing import category_encoding
from tensorflow.python.keras.layers.preprocessing import preprocessing_test_utils
from tensorflow.python.ops import sparse_ops
from tensorflow.python.ops.ragged import ragged_factory_ops
from tensorflow.python.platform import test
@keras_parameterized.run_all_keras_modes(always_skip_v1=True)
class CategoryEncodingInputTest(keras_parameterized.TestCase,
preprocessing_test_utils.PreprocessingLayerTest
):
def test_dense_input_sparse_output(self):
input_array = constant_op.constant([[1, 2, 3], [3, 3, 0]])
# The expected output should be (X for missing value):
# [[X, 1, 1, 1, X, X]
# [1, X, X, 2, X, X]]
expected_indices = [[0, 1], [0, 2], [0, 3], [1, 0], [1, 3]]
expected_values = [1, 1, 1, 1, 2]
num_tokens = 6
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT, sparse=True)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
sp_output_dataset = model.predict(input_array, steps=1)
self.assertAllEqual(expected_values, sp_output_dataset.values)
self.assertAllEqual(expected_indices, sp_output_dataset.indices)
# Assert sparse output is same as dense output.
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens,
output_mode=category_encoding.COUNT,
sparse=False)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array, steps=1)
self.assertAllEqual(
sparse_ops.sparse_tensor_to_dense(sp_output_dataset, default_value=0),
output_dataset)
def test_sparse_input(self):
input_array = np.array([[1, 2, 3, 0], [0, 3, 1, 0]], dtype=np.int64)
sparse_tensor_data = sparse_ops.from_dense(input_array)
# pyformat: disable
expected_output = [[0, 1, 1, 1, 0, 0],
[0, 1, 0, 1, 0, 0]]
# pyformat: enable
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int64, sparse=True)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.BINARY)
int_data = layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(sparse_tensor_data, steps=1)
self.assertAllEqual(expected_output, output_dataset)
def test_sparse_input_with_weights(self):
input_array = np.array([[1, 2, 3, 4], [4, 3, 1, 4]], dtype=np.int64)
weights_array = np.array([[.1, .2, .3, .4], [.2, .1, .4, .3]])
sparse_tensor_data = sparse_ops.from_dense(input_array)
sparse_weight_data = sparse_ops.from_dense(weights_array)
# pyformat: disable
expected_output = [[0, .1, .2, .3, .4, 0],
[0, .4, 0, .1, .5, 0]]
# pyformat: enable
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int64, sparse=True)
weight_data = keras.Input(shape=(None,), dtype=dtypes.float32, sparse=True)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT)
int_data = layer(input_data, count_weights=weight_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=[input_data, weight_data], outputs=int_data)
output_dataset = model.predict([sparse_tensor_data, sparse_weight_data],
steps=1)
self.assertAllClose(expected_output, output_dataset)
def test_sparse_input_sparse_output(self):
sp_inp = sparse_tensor.SparseTensor(
indices=[[0, 0], [1, 1], [2, 0], [2, 1], [3, 1]],
values=[0, 2, 1, 1, 0],
dense_shape=[4, 2])
input_data = keras.Input(shape=(None,), dtype=dtypes.int64, sparse=True)
# The expected output should be (X for missing value):
# [[1, X, X, X]
# [X, X, 1, X]
# [X, 2, X, X]
# [1, X, X, X]]
expected_indices = [[0, 0], [1, 2], [2, 1], [3, 0]]
expected_values = [1, 1, 2, 1]
num_tokens = 6
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT, sparse=True)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
sp_output_dataset = model.predict(sp_inp, steps=1)
self.assertAllEqual(expected_values, sp_output_dataset.values)
self.assertAllEqual(expected_indices, sp_output_dataset.indices)
# Assert sparse output is same as dense output.
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens,
output_mode=category_encoding.COUNT,
sparse=False)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(sp_inp, steps=1)
self.assertAllEqual(
sparse_ops.sparse_tensor_to_dense(sp_output_dataset, default_value=0),
output_dataset)
def test_sparse_input_sparse_output_with_weights(self):
indices = [[0, 0], [1, 1], [2, 0], [2, 1], [3, 1]]
sp_inp = sparse_tensor.SparseTensor(
indices=indices, values=[0, 2, 1, 1, 0], dense_shape=[4, 2])
input_data = keras.Input(shape=(None,), dtype=dtypes.int64, sparse=True)
sp_weight = sparse_tensor.SparseTensor(
indices=indices, values=[.1, .2, .4, .3, .2], dense_shape=[4, 2])
weight_data = keras.Input(shape=(None,), dtype=dtypes.float32, sparse=True)
# The expected output should be (X for missing value):
# [[1, X, X, X]
# [X, X, 1, X]
# [X, 2, X, X]
# [1, X, X, X]]
expected_indices = [[0, 0], [1, 2], [2, 1], [3, 0]]
expected_values = [.1, .2, .7, .2]
num_tokens = 6
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT, sparse=True)
int_data = layer(input_data, count_weights=weight_data)
model = keras.Model(inputs=[input_data, weight_data], outputs=int_data)
sp_output_dataset = model.predict([sp_inp, sp_weight], steps=1)
self.assertAllClose(expected_values, sp_output_dataset.values)
self.assertAllEqual(expected_indices, sp_output_dataset.indices)
def test_ragged_input(self):
input_array = ragged_factory_ops.constant([[1, 2, 3], [3, 1]])
# pyformat: disable
expected_output = [[0, 1, 1, 1, 0, 0],
[0, 1, 0, 1, 0, 0]]
# pyformat: enable
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int32, ragged=True)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.BINARY)
int_data = layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array, steps=1)
self.assertAllEqual(expected_output, output_dataset)
def test_ragged_input_sparse_output(self):
input_array = ragged_factory_ops.constant([[1, 2, 3], [3, 3]])
# The expected output should be (X for missing value):
# [[X, 1, 1, 1]
# [X, X, X, 2]]
expected_indices = [[0, 1], [0, 2], [0, 3], [1, 3]]
expected_values = [1, 1, 1, 2]
num_tokens = 6
input_data = keras.Input(shape=(None,), dtype=dtypes.int32, ragged=True)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT, sparse=True)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
sp_output_dataset = model.predict(input_array, steps=1)
self.assertAllEqual(expected_values, sp_output_dataset.values)
self.assertAllEqual(expected_indices, sp_output_dataset.indices)
# Assert sparse output is same as dense output.
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens,
output_mode=category_encoding.COUNT,
sparse=False)
int_data = layer(input_data)
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array, steps=1)
self.assertAllEqual(
sparse_ops.sparse_tensor_to_dense(sp_output_dataset, default_value=0),
output_dataset)
def test_sparse_output_and_dense_layer(self):
input_array = constant_op.constant([[1, 2, 3], [3, 3, 0]])
num_tokens = 4
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
encoding_layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.COUNT, sparse=True)
int_data = encoding_layer(input_data)
dense_layer = keras.layers.Dense(units=1)
output_data = dense_layer(int_data)
model = keras.Model(inputs=input_data, outputs=output_data)
_ = model.predict(input_array, steps=1)
def test_dense_oov_input(self):
input_array = constant_op.constant([[0, 1, 2], [2, 3, 1]])
num_tokens = 3
expected_output_shape = [None, num_tokens]
encoder_layer = category_encoding.CategoryEncoding(num_tokens)
input_data = keras.Input(shape=(3,), dtype=dtypes.int32)
int_data = encoder_layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
with self.assertRaisesRegex(
errors.InvalidArgumentError,
".*must be in the range 0 <= values < num_tokens.*"):
_ = model.predict(input_array, steps=1)
def test_dense_negative(self):
input_array = constant_op.constant([[1, 2, 0], [2, 2, -1]])
num_tokens = 3
expected_output_shape = [None, num_tokens]
encoder_layer = category_encoding.CategoryEncoding(num_tokens)
input_data = keras.Input(shape=(3,), dtype=dtypes.int32)
int_data = encoder_layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
with self.assertRaisesRegex(
errors.InvalidArgumentError,
".*must be in the range 0 <= values < num_tokens.*"):
_ = model.predict(input_array, steps=1)
def test_legacy_max_tokens_arg(self):
input_array = np.array([[1, 2, 3, 1]])
expected_output = [[0, 1, 1, 1, 0, 0]]
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
layer = category_encoding.CategoryEncoding(
max_tokens=num_tokens, output_mode=category_encoding.BINARY)
int_data = layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array)
self.assertAllEqual(expected_output, output_dataset)
@keras_parameterized.run_all_keras_modes
class CategoryEncodingOutputTest(keras_parameterized.TestCase,
preprocessing_test_utils.PreprocessingLayerTest
):
def test_binary_output(self):
input_array = np.array([[1, 2, 3, 1], [0, 3, 1, 0]])
# pyformat: disable
expected_output = [[0, 1, 1, 1, 0, 0],
[1, 1, 0, 1, 0, 0]]
# pyformat: enable
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=category_encoding.BINARY)
int_data = layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array)
self.assertAllEqual(expected_output, output_dataset)
def test_count_output(self):
input_array = np.array([[1, 2, 3, 1], [0, 3, 1, 0]])
# pyformat: disable
expected_output = [[0, 2, 1, 1, 0, 0],
[2, 1, 0, 1, 0, 0]]
# pyformat: enable
num_tokens = 6
expected_output_shape = [None, num_tokens]
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
layer = category_encoding.CategoryEncoding(
num_tokens=6, output_mode=category_encoding.COUNT)
int_data = layer(input_data)
self.assertAllEqual(expected_output_shape, int_data.shape.as_list())
model = keras.Model(inputs=input_data, outputs=int_data)
output_dataset = model.predict(input_array)
self.assertAllEqual(expected_output, output_dataset)
class CategoryEncodingModelBuildingTest(
keras_parameterized.TestCase,
preprocessing_test_utils.PreprocessingLayerTest):
@parameterized.named_parameters(
{
"testcase_name": "count_output",
"num_tokens": 5,
"output_mode": category_encoding.COUNT
}, {
"testcase_name": "binary_output",
"num_tokens": 5,
"output_mode": category_encoding.BINARY
})
def test_end_to_end_bagged_modeling(self, output_mode, num_tokens):
input_array = np.array([[1, 2, 3, 1], [0, 3, 1, 0]])
input_data = keras.Input(shape=(None,), dtype=dtypes.int32)
layer = category_encoding.CategoryEncoding(
num_tokens=num_tokens, output_mode=output_mode)
weights = []
if num_tokens is None:
layer.set_num_elements(5)
layer.set_weights(weights)
int_data = layer(input_data)
float_data = backend.cast(int_data, dtype="float32")
output_data = core.Dense(64)(float_data)
model = keras.Model(inputs=input_data, outputs=output_data)
_ = model.predict(input_array)
if __name__ == "__main__":
test.main()
| 39.818421 | 81 | 0.693345 | 2,054 | 15,131 | 4.855404 | 0.091529 | 0.051439 | 0.052141 | 0.06307 | 0.841171 | 0.807179 | 0.782011 | 0.756442 | 0.728467 | 0.686554 | 0 | 0.028274 | 0.186571 | 15,131 | 379 | 82 | 39.923483 | 0.782012 | 0.094508 | 0 | 0.634328 | 0 | 0 | 0.015088 | 0 | 0 | 0 | 0 | 0 | 0.100746 | 1 | 0.052239 | false | 0 | 0.05597 | 0 | 0.119403 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6fb35c6a0773bc5777e0c8acf34ec552b189a0de | 43 | py | Python | mfp/mfp_sub/__init__.py | benvliet/mfp | 50b1e3412015086e5684b2f4be4080dee4bfd30f | [
"MIT"
] | null | null | null | mfp/mfp_sub/__init__.py | benvliet/mfp | 50b1e3412015086e5684b2f4be4080dee4bfd30f | [
"MIT"
] | 2 | 2021-11-11T11:25:38.000Z | 2021-12-05T18:47:26.000Z | mfp/mfp_sub/__init__.py | benvliet/mfp | 50b1e3412015086e5684b2f4be4080dee4bfd30f | [
"MIT"
] | null | null | null | from .times import times_two # noqa: F401
| 21.5 | 42 | 0.744186 | 7 | 43 | 4.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 0.186047 | 43 | 1 | 43 | 43 | 0.8 | 0.232558 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6fded39ff8ef89fb844a30d621abe2c13797a5bf | 126 | py | Python | example/controller/tests/helper/numeric/__init__.py | donghak-shin/dp-tornado | 095bb293661af35cce5f917d8a2228d273489496 | [
"MIT"
] | 18 | 2015-04-07T14:28:39.000Z | 2020-02-08T14:03:38.000Z | example/controller/tests/helper/numeric/__init__.py | donghak-shin/dp-tornado | 095bb293661af35cce5f917d8a2228d273489496 | [
"MIT"
] | 7 | 2016-10-05T05:14:06.000Z | 2021-05-20T02:07:22.000Z | example/controller/tests/helper/numeric/__init__.py | donghak-shin/dp-tornado | 095bb293661af35cce5f917d8a2228d273489496 | [
"MIT"
] | 11 | 2015-12-15T09:49:39.000Z | 2021-09-06T18:38:21.000Z | # -*- coding: utf-8 -*-
from dp_tornado.engine.controller import Controller
class NumericController(Controller):
pass
| 14 | 51 | 0.730159 | 14 | 126 | 6.5 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009434 | 0.15873 | 126 | 8 | 52 | 15.75 | 0.849057 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
82f2be33a7648a3ed0fc9534a33d9ebb99637f2a | 2,161 | py | Python | tests/test_adapter_validation.py | ingeniousambivert/chatbot | fb1d9659df6c1b6eddd8ee9349f5a65a0530db2a | [
"BSD-3-Clause"
] | null | null | null | tests/test_adapter_validation.py | ingeniousambivert/chatbot | fb1d9659df6c1b6eddd8ee9349f5a65a0530db2a | [
"BSD-3-Clause"
] | null | null | null | tests/test_adapter_validation.py | ingeniousambivert/chatbot | fb1d9659df6c1b6eddd8ee9349f5a65a0530db2a | [
"BSD-3-Clause"
] | null | null | null | from chatterbot import ChatBot
from chatterbot.adapters import Adapter
from tests.base_case import ChatBotTestCase
class AdapterValidationTests(ChatBotTestCase):
def test_invalid_storage_adapter(self):
kwargs = self.get_kwargs()
kwargs['storage_adapter'] = 'chatterbot.logic.LogicAdapter'
with self.assertRaises(Adapter.InvalidAdapterTypeException):
self.chatbot = ChatBot('Test Bot', **kwargs)
def test_valid_storage_adapter(self):
kwargs = self.get_kwargs()
kwargs['storage_adapter'] = 'chatterbot.storage.SQLStorageAdapter'
try:
self.chatbot = ChatBot('Test Bot', **kwargs)
except Adapter.InvalidAdapterTypeException:
self.fail('Test raised InvalidAdapterException unexpectedly!')
def test_invalid_logic_adapter(self):
kwargs = self.get_kwargs()
kwargs['logic_adapters'] = ['chatterbot.storage.StorageAdapter']
with self.assertRaises(Adapter.InvalidAdapterTypeException):
self.chatbot = ChatBot('Test Bot', **kwargs)
def test_valid_logic_adapter(self):
kwargs = self.get_kwargs()
kwargs['logic_adapters'] = ['chatterbot.logic.BestMatch']
try:
self.chatbot = ChatBot('Test Bot', **kwargs)
except Adapter.InvalidAdapterTypeException:
self.fail('Test raised InvalidAdapterException unexpectedly!')
def test_valid_adapter_dictionary(self):
kwargs = self.get_kwargs()
kwargs['storage_adapter'] = {
'import_path': 'chatterbot.storage.SQLStorageAdapter'
}
try:
self.chatbot = ChatBot('Test Bot', **kwargs)
except Adapter.InvalidAdapterTypeException:
self.fail('Test raised InvalidAdapterException unexpectedly!')
def test_invalid_adapter_dictionary(self):
kwargs = self.get_kwargs()
kwargs['storage_adapter'] = {
'import_path': 'chatterbot.logic.BestMatch'
}
with self.assertRaises(Adapter.InvalidAdapterTypeException):
self.chatbot = ChatBot('Test Bot', **kwargs)
| 40.773585 | 75 | 0.659417 | 199 | 2,161 | 6.994975 | 0.18593 | 0.030172 | 0.060345 | 0.073276 | 0.836925 | 0.836925 | 0.836925 | 0.836925 | 0.836925 | 0.836925 | 0 | 0 | 0.242943 | 2,161 | 52 | 76 | 41.557692 | 0.850856 | 0 | 0 | 0.590909 | 0 | 0 | 0.232812 | 0.12091 | 0 | 0 | 0 | 0 | 0.068182 | 1 | 0.136364 | false | 0 | 0.113636 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
82f4ac14a4bfd3dd663a8fa2e2c1aa454f79c795 | 94 | py | Python | luminoth/utils/hooks/__init__.py | jsdussanc/luminoth | 7637c52cc01d2826a231fef43746aa10951f99f0 | [
"BSD-3-Clause"
] | 2,584 | 2017-08-16T20:31:52.000Z | 2022-03-16T07:53:54.000Z | luminoth/utils/hooks/__init__.py | dun933/Tabulo | dc1c1203a40e1ecf2aaca9647f3008ab72b41438 | [
"BSD-3-Clause"
] | 197 | 2017-08-17T14:49:18.000Z | 2022-02-10T01:50:50.000Z | luminoth/utils/hooks/__init__.py | czbiohub/luminoth | 3b4d57a9b4c3704c64816bbcbd6126a2ac23a069 | [
"BSD-3-Clause"
] | 462 | 2017-08-16T22:00:23.000Z | 2022-03-08T19:14:00.000Z | from .image_vis_hook import ImageVisHook # noqa
from .var_vis_hook import VarVisHook # noqa
| 31.333333 | 48 | 0.808511 | 14 | 94 | 5.142857 | 0.642857 | 0.194444 | 0.361111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 94 | 2 | 49 | 47 | 0.9 | 0.095745 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d207fb5ad1ff865dfb9ff53929b56029a04edfee | 142 | py | Python | liberator/__init__.py | root795/libresbc | 79d80e2a88f24f984a654855d45f5db8b18fc134 | [
"MIT"
] | null | null | null | liberator/__init__.py | root795/libresbc | 79d80e2a88f24f984a654855d45f5db8b18fc134 | [
"MIT"
] | null | null | null | liberator/__init__.py | root795/libresbc | 79d80e2a88f24f984a654855d45f5db8b18fc134 | [
"MIT"
] | null | null | null | from .configuration import *
from .utilities import *
from .bases import *
from .libreapi import *
from .fsxmlapi import *
from .api import *
| 20.285714 | 28 | 0.746479 | 18 | 142 | 5.888889 | 0.444444 | 0.471698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169014 | 142 | 6 | 29 | 23.666667 | 0.898305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d2207224c4437e9a32a07f46aef2970b21033f49 | 129 | py | Python | deepy/preprocessing/__init__.py | uaca/deepy | 090fbad22a08a809b12951cd0d4984f5bd432698 | [
"MIT"
] | 260 | 2015-05-16T07:58:29.000Z | 2016-01-07T09:10:47.000Z | deepy/preprocessing/__init__.py | uaca/deepy | 090fbad22a08a809b12951cd0d4984f5bd432698 | [
"MIT"
] | 20 | 2015-04-21T01:46:46.000Z | 2015-12-20T00:04:23.000Z | deepy/preprocessing/__init__.py | zomux/deepy | 090fbad22a08a809b12951cd0d4984f5bd432698 | [
"MIT"
] | 50 | 2016-01-27T03:45:25.000Z | 2020-12-16T07:02:57.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from padding import pad_sequence
from elastic_distortion import elastic_distortion | 25.8 | 49 | 0.775194 | 18 | 129 | 5.388889 | 0.777778 | 0.350515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008772 | 0.116279 | 129 | 5 | 49 | 25.8 | 0.842105 | 0.325581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d248af9e25738999a8ceb3d3472008e85ac18877 | 33 | py | Python | sokoban/__init__.py | tcosmo/sokoban | cc39506412aea91df84c8232280074b223fa0883 | [
"MIT"
] | null | null | null | sokoban/__init__.py | tcosmo/sokoban | cc39506412aea91df84c8232280074b223fa0883 | [
"MIT"
] | null | null | null | sokoban/__init__.py | tcosmo/sokoban | cc39506412aea91df84c8232280074b223fa0883 | [
"MIT"
] | null | null | null | from sokoban.game_loop import run | 33 | 33 | 0.878788 | 6 | 33 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 1 | 33 | 33 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
962d33280ffd3517d990250dfb0485c2992b6cf7 | 160 | py | Python | modules/xatlas_unwrap/config.py | aBARICHELLO/godot | 6e0de76746783433cb62511696f6a967567cb001 | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | modules/xatlas_unwrap/config.py | aBARICHELLO/godot | 6e0de76746783433cb62511696f6a967567cb001 | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | modules/xatlas_unwrap/config.py | aBARICHELLO/godot | 6e0de76746783433cb62511696f6a967567cb001 | [
"CC-BY-3.0",
"Apache-2.0",
"MIT"
] | null | null | null | def can_build(env, platform):
return False #xatlas is buggy
#return (env['tools'] and platform not in ["android", "ios"])
def configure(env):
pass
| 22.857143 | 65 | 0.6625 | 23 | 160 | 4.565217 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 160 | 6 | 66 | 26.666667 | 0.820313 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.25 | 0 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
96639550cbc69edf1bb33fef049884008a61af7d | 21 | py | Python | sampro/__init__.py | doublereedkurt/sampro | 368aa0cd1a3683d12483925c4e97d225f40c8d36 | [
"MIT"
] | 3 | 2015-03-18T14:54:58.000Z | 2015-11-01T12:29:42.000Z | sampro/__init__.py | kurtbrose/sampro | 368aa0cd1a3683d12483925c4e97d225f40c8d36 | [
"MIT"
] | null | null | null | sampro/__init__.py | kurtbrose/sampro | 368aa0cd1a3683d12483925c4e97d225f40c8d36 | [
"MIT"
] | null | null | null | from sampro import *
| 10.5 | 20 | 0.761905 | 3 | 21 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
96782e74d86b48f26884334d40d1bb5f3ff4ad4c | 9,846 | py | Python | pytests/tuqquery/tuq_precedence.py | sumedhpb/testrunner | 9ff887231c75571624abc31a3fb5248110e01203 | [
"Apache-2.0"
] | 14 | 2015-02-06T02:47:57.000Z | 2020-03-14T15:06:05.000Z | pytests/tuqquery/tuq_precedence.py | sumedhpb/testrunner | 9ff887231c75571624abc31a3fb5248110e01203 | [
"Apache-2.0"
] | 3 | 2019-02-27T19:29:11.000Z | 2021-06-02T02:14:27.000Z | pytests/tuqquery/tuq_precedence.py | sumedhpb/testrunner | 9ff887231c75571624abc31a3fb5248110e01203 | [
"Apache-2.0"
] | 108 | 2015-03-26T08:58:49.000Z | 2022-03-21T05:21:39.000Z | from tuqquery.tuq import QueryTests
class PrecedenceTests(QueryTests):
def setUp(self):
super(PrecedenceTests, self).setUp()
self.log.info("============== PrecedenceTests setup has started ==============")
self.log.info("============== PrecedenceTests setup has completed ==============")
self.log_config_info()
self.query_buckets = self.get_query_buckets(check_all_buckets=True)
def suite_setUp(self):
super(PrecedenceTests, self).suite_setUp()
self.log.info("============== PrecedenceTests suite_setup has started ==============")
self.log.info("============== PrecedenceTests suite_setup has completed ==============")
def tearDown(self):
self.log.info("============== PrecedenceTests teardown has started ==============")
self.log.info("============== PrecedenceTests teardown has completed ==============")
super(PrecedenceTests, self).tearDown()
def suite_tearDown(self):
self.log.info("============== PrecedenceTests suite_teardown has started ==============")
self.log.info("============== PrecedenceTests suite_teardown has completed ==============")
super(PrecedenceTests, self).suite_tearDown()
def test_case_and_like(self):
self.fail_if_no_buckets()
for query_bucket in self.query_buckets:
self.query = "SELECT name, CASE WHEN join_mo < 3 OR join_mo > 11 THEN" + \
" 'winter' ELSE 'other' END AS period FROM %s WHERE CASE WHEN" % query_bucket + \
" join_mo < 3 OR join_mo > 11 THEN 'winter' ELSE 'other' END LIKE 'win%'"
actual_result = self.run_cbq_query()
actual_result = sorted(actual_result['results'], key=lambda doc: (
doc['name'], doc['period']))
expected_result = [{"name": doc['name'],
"period": ('other', 'winter')
[doc['join_mo'] in [12, 1, 2]]}
for doc in self.full_list
if ('other', 'winter')[doc['join_mo'] in [12, 1, 2]].startswith(
'win')]
expected_result = sorted(expected_result, key=lambda doc: (doc['name'],
doc['period']))
self._verify_results(actual_result, expected_result)
def test_case_and_logic_exp(self):
for query_bucket in self.query_buckets:
self.query = "SELECT DISTINCT name, CASE WHEN join_mo < 3 OR join_mo > 11 THEN" + \
" 'winter' ELSE 'other' END AS period FROM %s WHERE CASE WHEN join_mo < 3" % query_bucket + \
" OR join_mo > 11 THEN 1 ELSE 0 END > 0 AND job_title='Sales'"
actual_result = self.run_cbq_query()
actual_result = sorted(actual_result['results'], key=lambda doc: (
doc['name'], doc['period']))
expected_result = [{"name": doc['name'],
"period": ('other', 'winter')
[doc['join_mo'] in [12, 1, 2]]}
for doc in self.full_list
if (0, 1)[doc['join_mo'] in [12, 1, 2]] > 0 and
doc['job_title'] == 'Sales']
expected_result = [dict(y) for y in set(tuple(x.items()) for x in expected_result)]
expected_result = sorted(expected_result, key=lambda doc: (doc['name'],
doc['period']))
self._verify_results(actual_result, expected_result)
def test_prepared_case_and_logic_exp(self):
for query_bucket in self.query_buckets:
self.query = "SELECT DISTINCT name, CASE WHEN join_mo < 3 OR join_mo > 11 THEN" + \
" 'winter' ELSE 'other' END AS period FROM %s WHERE CASE WHEN join_mo < 3" % query_bucket + \
" OR join_mo > 11 THEN 1 ELSE 0 END > 0 AND job_title='Sales'"
self.prepared_common_body()
def test_case_and_comparision_exp(self):
for query_bucket in self.query_buckets:
self.query = "SELECT DISTINCT name, CASE WHEN join_mo < 3 OR join_mo > 11 THEN" + \
" 'winter' ELSE 'other' END AS period FROM %s WHERE CASE WHEN join_mo < 3" % query_bucket + \
" OR join_mo > 11 THEN 1 END = 1 AND job_title='Sales'"
actual_result = self.run_cbq_query()
actual_result = sorted(actual_result['results'], key=lambda doc: (
doc['name'], doc['period']))
expected_result = [{"name": doc['name'],
"period": ('other', 'winter')
[doc['join_mo'] in [12, 1, 2]]}
for doc in self.full_list
if (doc['join_mo'], 1)[doc['join_mo'] in [12, 1, 2]] == 1 and
doc['job_title'] == 'Sales']
expected_result = [dict(y) for y in set(tuple(x.items()) for x in expected_result)]
expected_result = sorted(expected_result, key=lambda doc: (doc['name'],
doc['period']))
self._verify_results(actual_result, expected_result)
def test_arithm_and_comparision_exp(self):
for query_bucket in self.query_buckets:
self.query = "SELECT name from %s WHERE join_mo > 3 + 1" % query_bucket
actual_result = self.run_cbq_query()
actual_result = sorted(actual_result['results'], key=lambda doc: (
doc['name']))
expected_result = [{"name": doc['name']}
for doc in self.full_list
if doc['join_mo'] > (3 + 1)]
expected_result = sorted(expected_result, key=lambda doc: (doc['name']))
self._verify_results(actual_result, expected_result)
            self.query = "SELECT name from %s WHERE join_mo = 3 + 1" % query_bucket
            actual_result = self.run_cbq_query()
            actual_result = sorted(actual_result['results'], key=lambda doc: (
                doc['name']))
            expected_result = [{"name": doc['name']}
                               for doc in self.full_list
                               if doc['join_mo'] == (3 + 1)]
            expected_result = sorted(expected_result, key=lambda doc: (doc['name']))
            self._verify_results(actual_result, expected_result)

    def test_arithm_and_like_exp(self):
        for query_bucket in self.query_buckets:
            self.query = "SELECT name from {0} WHERE job_title LIKE 'S%' = TRUE".format(query_bucket)
            actual_result = self.run_cbq_query()
            actual_result = sorted(actual_result['results'], key=lambda doc: (
                doc['name']))
            expected_result = [{"name": doc['name']}
                               for doc in self.full_list
                               if doc['job_title'].startswith('S')]
            expected_result = sorted(expected_result, key=lambda doc: (doc['name']))
            self._verify_results(actual_result, expected_result)

    def test_logic_exp(self):
        for query_bucket in self.query_buckets:
            self.query = "SELECT name, join_mo from %s WHERE NOT join_mo>10 AND" % query_bucket + \
                         " job_title='Sales' OR join_mo<2"
            actual_result = self.run_cbq_query()
            actual_result = sorted(actual_result['results'], key=lambda doc: (
                doc['name'], doc['join_mo']))
            expected_result = [{"name": doc['name'], "join_mo": doc['join_mo']}
                               for doc in self.full_list
                               if ((not (doc['join_mo'] > 10)) and doc['job_title'] == 'Sales') or
                               (doc['join_mo'] < 2)]
            expected_result = sorted(expected_result, key=lambda doc: (doc['name'], doc['join_mo']))
            self._verify_results(actual_result, expected_result)

            self.query = "SELECT DISTINCT email from %s WHERE NOT join_mo<10 OR join_mo<2" % query_bucket
            actual_result = self.run_cbq_query()
            actual_result = sorted(actual_result['results'], key=lambda doc: (
                doc['email']))
            expected_result = [{"email": doc['email']}
                               for doc in self.full_list
                               if (not (doc['join_mo'] < 10)) or (doc['join_mo'] < 2)]
            expected_result = [dict(y) for y in set(tuple(x.items()) for x in expected_result)]
            expected_result = sorted(expected_result, key=lambda doc: (doc['email']))
            self._verify_results(actual_result, expected_result)

    def test_prepared_logic_exp(self):
        for query_bucket in self.query_buckets:
            self.query = "SELECT name, join_mo from %s WHERE NOT join_mo>10 AND" % query_bucket + \
                         " job_title='Sales' OR join_mo<2"
            self.prepared_common_body()

    def test_logic_exp_nulls(self):
        for query_bucket in self.query_buckets:
            self.query = "SELECT name, join_mo from %s WHERE NOT join_mo IS NULL" % query_bucket
            actual_result = self.run_cbq_query()
            actual_result = sorted(actual_result['results'], key=lambda doc: (
                doc['name'], doc['join_mo']))
            expected_result = [{"name": doc['name'], "join_mo": doc['join_mo']}
                               for doc in self.full_list]
            expected_result = sorted(expected_result, key=lambda doc: (doc['name'], doc['join_mo']))
            self._verify_results(actual_result, expected_result)
| 59.313253 | 118 | 0.537579 | 1,153 | 9,846 | 4.360798 | 0.082394 | 0.058473 | 0.03401 | 0.053699 | 0.920247 | 0.899364 | 0.855807 | 0.794948 | 0.788584 | 0.783413 | 0 | 0.012879 | 0.329677 | 9,846 | 165 | 119 | 59.672727 | 0.748939 | 0 | 0 | 0.644295 | 0 | 0 | 0.228925 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.087248 | false | 0 | 0.006711 | 0 | 0.100671 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
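A minimal, self-contained sketch of the comparison pattern the tests above rely on: build an expected list with a comprehension, emulate `DISTINCT` by round-tripping the (unhashable) dicts through tuples of items, then sort both sides on a stable key before comparing. The sample documents below are illustrative only, not data from the test suite.

```python
# Sample documents standing in for self.full_list (hypothetical values).
docs = [
    {"email": "a@x.io", "join_mo": 11},
    {"email": "b@x.io", "join_mo": 1},
    {"email": "a@x.io", "join_mo": 12},
]

# Mirror of: SELECT DISTINCT email ... WHERE NOT join_mo<10 OR join_mo<2
expected = [{"email": d["email"]} for d in docs
            if not (d["join_mo"] < 10) or d["join_mo"] < 2]

# DISTINCT emulation: dicts are unhashable, so deduplicate via item tuples.
expected = [dict(t) for t in set(tuple(d.items()) for d in expected)]

# Sort deterministically before comparing against the query result.
expected = sorted(expected, key=lambda d: d["email"])
print(expected)  # [{'email': 'a@x.io'}, {'email': 'b@x.io'}]
```

The same sort-then-compare shape is what makes the assertions order-independent with respect to the query engine's row ordering.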
9679718c0bcfbb95980fb915ce154495710d42b0 | 7,399 | py | Python | dnacentercli/cli/v1_2_10/networks.py | AltusConsulting/dnacentercli | 26ea46fdbd40fc30649ea1d8803158655aa545aa | [
"MIT"
] | null | null | null | dnacentercli/cli/v1_2_10/networks.py | AltusConsulting/dnacentercli | 26ea46fdbd40fc30649ea1d8803158655aa545aa | [
"MIT"
] | null | null | null | dnacentercli/cli/v1_2_10/networks.py | AltusConsulting/dnacentercli | 26ea46fdbd40fc30649ea1d8803158655aa545aa | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import click
import json

from ..utils.spinner import (
    init_spinner,
    start_spinner,
    stop_spinner,
)
from ..utils.print import (
    tbprint,
    eprint,
    oprint,
    opprint,
)
@click.group()
@click.pass_obj
@click.pass_context
def networks(ctx, obj):
    """DNA Center Networks API (version: 1.2.10).

    Wraps the DNA Center Networks API and exposes the API as native Python commands.
    """
    ctx.obj = obj.networks
@networks.command()
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_vlan_details(obj, pretty_print, beep,
                     headers):
    """Returns the list of VLAN names.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_vlan_details(
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
@networks.command()
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_site_topology(obj, pretty_print, beep,
                      headers):
    """Returns site topology.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_site_topology(
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
@networks.command()
@click.option('--node_type', type=str,
help='''nodeType query parameter.''',
show_default=True)
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_physical_topology(obj, pretty_print, beep,
                          node_type,
                          headers):
    """Returns the raw physical topology by specified criteria of nodeType.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_physical_topology(
            node_type=node_type,
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
@networks.command()
@click.option('--topology_type', type=str,
help='''Type of topology(OSPF,ISIS,etc).''',
required=True,
show_default=True)
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_l3_topology_details(obj, pretty_print, beep,
                            topology_type,
                            headers):
    """Returns the Layer 3 network topology by routing protocol.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_l3_topology_details(
            topology_type=topology_type,
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
@networks.command()
@click.option('--vlan_id', type=str,
help='''Vlan Name for e.g Vlan1, Vlan23 etc.''',
required=True,
show_default=True)
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_topology_details(obj, pretty_print, beep,
                         vlan_id,
                         headers):
    """Returns Layer 2 network topology by specified VLAN ID.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_topology_details(
            vlan_id=vlan_id,
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
@networks.command()
@click.option('--timestamp', type=str,
help='''Epoch time(in milliseconds) when the Network health data is required.''',
show_default=True)
@click.option('--headers', type=str, help='''Dictionary of HTTP Headers to send with the Request.''',
default=None,
show_default=True)
@click.option('-pp', '--pretty_print', type=int, help='''Pretty print indent''',
default=None,
show_default=True)
@click.option('--beep', is_flag=True, help='''Spinner beep (on)''')
@click.pass_obj
def get_overall_network_health(obj, pretty_print, beep,
                               timestamp,
                               headers):
    """Returns Overall Network Health information by Device category (Access, Distribution, Core, Router, Wireless) for any given point of time.
    """
    spinner = init_spinner(beep=beep)
    start_spinner(spinner)
    try:
        if headers is not None:
            headers = json.loads(headers)
        result = obj.get_overall_network_health(
            timestamp=timestamp,
            headers=headers)
        stop_spinner(spinner)
        opprint(result, indent=pretty_print)
    except Exception as e:
        stop_spinner(spinner)
        tbprint()
        eprint('Error:', e)
        click.Context.exit(-1)
| 33.179372 | 144 | 0.596702 | 863 | 7,399 | 4.982619 | 0.147161 | 0.061395 | 0.055814 | 0.074419 | 0.75093 | 0.741395 | 0.711163 | 0.711163 | 0.711163 | 0.711163 | 0 | 0.003368 | 0.277605 | 7,399 | 222 | 145 | 33.328829 | 0.801123 | 0.074605 | 0 | 0.763158 | 0 | 0 | 0.141744 | 0.003529 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036842 | false | 0.042105 | 0.021053 | 0 | 0.057895 | 0.184211 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
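Every command in the CLI above follows the same shape: parse optional JSON headers, call the wrapped API method, pretty-print the result, and convert any exception into an error message. A minimal stdlib-only sketch of that pattern (the `run_command` helper and the lambda API call below are hypothetical stand-ins, not part of dnacentercli):

```python
import json

def run_command(api_call, headers=None, pretty_print=None):
    """Run an API call with optional JSON-encoded headers, mirroring the
    try/parse/call/print shape used by each click command above."""
    try:
        if headers is not None:
            headers = json.loads(headers)  # headers arrive as a JSON string
        result = api_call(headers=headers)
        return json.dumps(result, indent=pretty_print)
    except Exception as e:
        return "Error: {}".format(e)

# Illustrative usage with a fake API call returning canned data.
result = run_command(lambda headers=None: {"vlanNames": ["Vlan1"]},
                     headers='{"X-Auth-Token": "abc"}')
print(result)
```

The real commands additionally start/stop a spinner around the call and exit with a non-zero status on failure; those concerns are omitted here to keep the sketch dependency-free.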
96917dfba98f97f5da1092af6c08046251e5be46 | 194 | py | Python | lithops/libs/cloudpickle/__init__.py | gfinol/lithops | e165a78e0facbb70c852d7627a7407e8a8d1b946 | [
"Apache-2.0"
] | null | null | null | lithops/libs/cloudpickle/__init__.py | gfinol/lithops | e165a78e0facbb70c852d7627a7407e8a8d1b946 | [
"Apache-2.0"
] | null | null | null | lithops/libs/cloudpickle/__init__.py | gfinol/lithops | e165a78e0facbb70c852d7627a7407e8a8d1b946 | [
"Apache-2.0"
] | null | null | null | import sys
if sys.version_info < (3, 8):
    from .cloudpickle import CloudPickler
    __version__ = '1.2.2'
else:
    from .cloudpickle_160_fast import CloudPickler
    __version__ = '1.6.0'
| 19.4 | 50 | 0.695876 | 27 | 194 | 4.592593 | 0.62963 | 0.241935 | 0.403226 | 0.419355 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0.206186 | 194 | 9 | 51 | 21.555556 | 0.733766 | 0 | 0 | 0 | 0 | 0 | 0.051546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
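The shim above picks a cloudpickle implementation at import time based on the interpreter version. The same version-gated selection pattern can be sketched without the package-relative imports (the `impl` / `version` names below are illustrative stand-ins for the two vendored submodules):

```python
import sys

# Select an implementation at import time based on the interpreter version,
# mirroring the if/else import gate in lithops' cloudpickle shim.
if sys.version_info < (3, 8):
    impl, version = "legacy", "1.2.2"
else:
    impl, version = "fast", "1.6.0"

print(impl, version)
```

Gating at module import (rather than at call time) means downstream code can simply `from ... import CloudPickler` and stay unaware of which backend it got.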
739d6fe612a5d6db62bcea3b0fd24fba95ddda83 | 58 | py | Python | sample/pytest/tests/submodule/test_nesting.py | mohit-cyberstar/cricket | cfe08c4ea2419f3ac746eea23680dc1a7883eb22 | [
"BSD-3-Clause"
] | 98 | 2015-05-28T10:41:52.000Z | 2019-03-08T09:14:35.000Z | sample/pytest/tests/submodule/test_nesting.py | SujeetGautam/cricket | 1476b597f499c1b9b34c9d21eeef0b4900892760 | [
"BSD-3-Clause"
] | 33 | 2015-02-11T12:39:55.000Z | 2019-03-29T23:23:00.000Z | sample/pytest/tests/submodule/test_nesting.py | SujeetGautam/cricket | 1476b597f499c1b9b34c9d21eeef0b4900892760 | [
"BSD-3-Clause"
] | 49 | 2015-03-25T05:55:14.000Z | 2019-03-23T15:30:38.000Z |
def test_stuff():
    pass


def test_things():
    pass
| 7.25 | 18 | 0.603448 | 8 | 58 | 4.125 | 0.625 | 0.424242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.293103 | 58 | 7 | 19 | 8.285714 | 0.804878 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
fb9acccc8706b5a3131b8f3a3d6346d8cc34210c | 1,388 | py | Python | test/container/test_php_dev.py | renatomefi/docker-php | d319d34ef46902907cd4f32080a70aafaecf0e42 | [
"MIT"
] | null | null | null | test/container/test_php_dev.py | renatomefi/docker-php | d319d34ef46902907cd4f32080a70aafaecf0e42 | [
"MIT"
] | null | null | null | test/container/test_php_dev.py | renatomefi/docker-php | d319d34ef46902907cd4f32080a70aafaecf0e42 | [
"MIT"
] | null | null | null | import pytest
@pytest.mark.php_dev
def test_configuration_is_present(host):
    assert host.file('/usr/local/etc/php/conf.d/zzz_xdebug.ini').exists is True
    assert host.file('/usr/local/etc/php/conf.d/zzz_dev.ini').exists is True


@pytest.mark.php_dev
def test_configuration_is_effective(host):
    configuration = host.run('php -i').stdout
    assert u'expose_php => On => On' in configuration


@pytest.mark.php_dev
def test_xdebug_is_loaded(host):
    assert 'Xdebug' in host.run('php -m').stdout


@pytest.mark.php_no_dev
def test_configuration_is_not_present(host):
    assert host.file('/usr/local/etc/php/conf.d/zzz_xdebug.ini').exists is False
    assert host.file('/usr/local/etc/php/conf.d/zzz_dev.ini').exists is False


@pytest.mark.php_no_dev
def test_configuration_is_not_effective(host):
    configuration = host.run('php -i').stdout
    assert u'expose_php => Off => Off' in configuration


@pytest.mark.php_no_dev
def test_xdebug_is_not_loaded(host):
    assert 'Xdebug' not in host.run('php -m').stdout


@pytest.mark.php_dev
def test_php_meminfo_is_enabled(host):
    output = host.run('php -r "exit(function_exists(\'meminfo_dump\') ? 0 : 255);"')
    assert output.rc == 0


@pytest.mark.php_no_dev
def test_php_meminfo_is_not_enabled(host):
    output = host.run('php -r "exit(function_exists(\'meminfo_dump\') ? 0 : 255);"')
    assert output.rc == 255
| 33.047619 | 84 | 0.728386 | 229 | 1,388 | 4.187773 | 0.20524 | 0.08342 | 0.108446 | 0.066736 | 0.898853 | 0.850886 | 0.791449 | 0.739312 | 0.660063 | 0.606882 | 0 | 0.009942 | 0.130403 | 1,388 | 41 | 85 | 33.853659 | 0.78459 | 0 | 0 | 0.387097 | 0 | 0 | 0.233429 | 0.144092 | 0 | 0 | 0 | 0 | 0.322581 | 1 | 0.258065 | false | 0 | 0.032258 | 0 | 0.290323 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fbbc14cd4655e2c2d27159e2eda5aa0b3858b5a6 | 48 | py | Python | ansa/__init__.py | Iskandar-Ki/AnsaRSS | 00e4c49114ba54078528967d8ddb0bf3efa9187e | [
"Unlicense"
] | 1 | 2018-09-19T09:26:34.000Z | 2018-09-19T09:26:34.000Z | ansa/__init__.py | Iskandar-Ki/AnsaRSS | 00e4c49114ba54078528967d8ddb0bf3efa9187e | [
"Unlicense"
] | null | null | null | ansa/__init__.py | Iskandar-Ki/AnsaRSS | 00e4c49114ba54078528967d8ddb0bf3efa9187e | [
"Unlicense"
] | null | null | null | from .ansa import Ansa
from .constants import *
| 16 | 24 | 0.770833 | 7 | 48 | 5.285714 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 48 | 2 | 25 | 24 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fbbc66fffe64482a0ac00eff4ca15d97f82cf3f8 | 72 | py | Python | models/__init__.py | tsurumeso/chainer-desalinet | 5e9b5813f718f31128cf7f7252b264204a2e8ec9 | [
"MIT"
] | 1 | 2018-01-22T05:52:50.000Z | 2018-01-22T05:52:50.000Z | models/__init__.py | tsurumeso/chainer-desalinet | 5e9b5813f718f31128cf7f7252b264204a2e8ec9 | [
"MIT"
] | null | null | null | models/__init__.py | tsurumeso/chainer-desalinet | 5e9b5813f718f31128cf7f7252b264204a2e8ec9 | [
"MIT"
] | null | null | null | from models.alex import Alex # NOQA
from models.vgg import VGG # NOQA
| 24 | 36 | 0.75 | 12 | 72 | 4.5 | 0.5 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194444 | 72 | 2 | 37 | 36 | 0.931034 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fbc8a3affcf704c3ea20969a037d699c0c022c88 | 32 | py | Python | vformer/models/classification/__init__.py | gchhablani/vformer | c7dc7d14e33aa5b2974667d281e7910e17538b34 | [
"MIT"
] | null | null | null | vformer/models/classification/__init__.py | gchhablani/vformer | c7dc7d14e33aa5b2974667d281e7910e17538b34 | [
"MIT"
] | null | null | null | vformer/models/classification/__init__.py | gchhablani/vformer | c7dc7d14e33aa5b2974667d281e7910e17538b34 | [
"MIT"
] | null | null | null | from .vanilla import VanillaViT
| 16 | 31 | 0.84375 | 4 | 32 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8388101404db6a52273d2d139f3c08ada889fc86 | 19,645 | py | Python | src/genie/libs/parser/viptela/tests/test_show_omp.py | danielgraziano/genieparser | 74d5e1ded9794561af1ac3284307c58365617673 | [
"Apache-2.0"
] | 4 | 2020-08-20T12:23:12.000Z | 2021-06-15T14:10:02.000Z | src/genie/libs/parser/viptela/tests/test_show_omp.py | dalwar23/genieparser | a9df45d3ee23f107bfb55915068e90782f92fc99 | [
"Apache-2.0"
] | 119 | 2020-07-10T22:37:51.000Z | 2021-03-18T02:40:05.000Z | src/genie/libs/parser/viptela/tests/test_show_omp.py | dalwar23/genieparser | a9df45d3ee23f107bfb55915068e90782f92fc99 | [
"Apache-2.0"
] | 2 | 2020-07-10T15:33:42.000Z | 2021-04-05T09:48:56.000Z | import unittest
from unittest.mock import Mock

# ATS
from pyats.topology import Device

# Metaparser
from genie.metaparser.util.exceptions import SchemaEmptyParserError, \
    SchemaMissingKeyError

# Parser
from genie.libs.parser.viptela.show_omp import (ShowOmpSummary,
                                                ShowOmpTlocs,
                                                ShowOmpPeers,
                                                ShowOmpTlocPath)
# ============================================
# Parser for the following commands
#   * 'show omp summary'
#   * 'show omp tlocs'
#   * 'show omp peers'
#   * 'show omp tloc-paths'
# ============================================
class TestShowOmpSummary(unittest.TestCase):
    device = Device(name='aDevice')
    maxDiff = None

    empty_output = {'execute.return_value': ''}

    golden_output = {'execute.return_value': '''
#show sdwan omp summary
oper-state UP
admin-state UP
personality vedge
omp-uptime 34:03:00:35
routes-received 5
routes-installed 3
routes-sent 2
tlocs-received 3
tlocs-installed 2
tlocs-sent 1
services-received 3
services-installed 0
services-sent 3
mcast-routes-received 0
mcast-routes-installed 0
mcast-routes-sent 0
hello-sent 146344
hello-received 146337
handshake-sent 2
handshake-received 2
alert-sent 1
alert-received 0
inform-sent 16
inform-received 16
update-sent 79
update-received 157
policy-sent 0
policy-received 2
total-packets-sent 146442
total-packets-received 146514
vsmart-peers 1
'''}
    golden_parsed_output = {
'oper_state': 'UP',
'admin_state': 'UP',
'personality': 'vedge',
'omp_uptime': '34:03:00:35',
'routes_received': 5,
'routes_installed': 3,
'routes_sent': 2,
'tlocs_received': 3,
'tlocs_installed': 2,
'tlocs_sent': 1,
'services_received': 3,
'services_installed': 0,
'services_sent': 3,
'mcast_routes_received': 0,
'mcast_routes_sent': 0,
'hello_sent': 146344,
'hello_received': 146337,
'handshake_sent': 2,
'handshake_received': 2,
'alert_sent': 1,
'alert_received': 0,
'inform_sent': 16,
'inform_received': 16,
'update_sent': 79,
'update_received': 157,
'policy_sent': 0,
'policy_received': 2,
'total_packets_sent': 146442,
'vsmart_peers': 1}
    def test_empty(self):
        self.device = Mock(**self.empty_output)
        obj = ShowOmpSummary(device=self.device)
        with self.assertRaises(SchemaEmptyParserError):
            parsed_output = obj.parse()

    def test_golden(self):
        self.device = Mock(**self.golden_output)
        obj = ShowOmpSummary(device=self.device)
        parsed_output = obj.parse()
        self.assertDictEqual(parsed_output, self.golden_parsed_output)
class TestShowOmpTlocPath(unittest.TestCase):
    device = Device(name='aDevice')
    maxDiff = None

    empty_output = {'execute.return_value': ''}

    golden_output = {'execute.return_value': '''
show omp tloc-paths
tloc-paths entries 10.220.100.10 default ipsec
tloc-paths entries 10.220.100.20 default ipsec
tloc-paths entries 10.220.100.30 default ipsec
'''}
    golden_parsed_output = {
'tloc_path': {
'10.220.100.10': {
'tloc': {
'default': {
'transport': 'ipsec'
}
}
},
'10.220.100.20': {
'tloc': {
'default': {
'transport': 'ipsec'
}
}
},
'10.220.100.30': {
'tloc': {
'default': {
'transport': 'ipsec'
}
}
}
}
}
    def test_empty(self):
        self.device = Mock(**self.empty_output)
        obj = ShowOmpTlocPath(device=self.device)
        with self.assertRaises(SchemaEmptyParserError):
            parsed_output = obj.parse()

    def test_golden(self):
        self.device = Mock(**self.golden_output)
        obj = ShowOmpTlocPath(device=self.device)
        parsed_output = obj.parse()
        self.assertEqual(parsed_output, self.golden_parsed_output)
class TestShowOmpPeers(unittest.TestCase):
    device = Device(name='aDevice')
    maxDiff = None

    empty_output = {'execute.return_value': ''}

    golden_output = {'execute.return_value': '''
R -> routes received
I -> routes installed
S -> routes sent
DOMAIN OVERLAY SITE
PEER TYPE ID ID ID STATE UPTIME R/I/S
------------------------------------------------------------------------------------------
10.4.1.4 vsmart 1 1 4 up 6:13:57:28 4/0/4
10.115.55.5 vedge 1 1 55 up 0:01:24:29 1/0/1
10.240.105.6 vedge 1 1 6 up 6:13:58:46 1/0/1
172.16.106.170 vedge 1 1 170 up 6:13:58:47 0/0/2
192.168.254.100 vedge 1 1 100 up 0:09:28:48 0/0/0
192.168.254.101 vedge 1 1 101 up 0:09:27:33 0/0/0
192.168.254.102 vedge 1 1 102 up 0:09:29:00 0/0/0
192.168.255.2 vedge 1 1 200 up 0:04:14:12 2/0/0
'''}
    golden_parsed_output = {
'peer': {
'10.4.1.4': {
'type': 'vsmart',
'domain_id': 1,
'overlay_id': 1,
'site_id': 4,
'state': 'up',
'uptime': '6:13:57:28',
'route': {
'recv': 4,
'install': 0,
'sent': 4
}
},
'10.115.55.5': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 55,
'state': 'up',
'uptime': '0:01:24:29',
'route': {
'recv': 1,
'install': 0,
'sent': 1
}
},
'10.240.105.6': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 6,
'state': 'up',
'uptime': '6:13:58:46',
'route': {
'recv': 1,
'install': 0,
'sent': 1
}
},
'172.16.106.170': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 170,
'state': 'up',
'uptime': '6:13:58:47',
'route': {
'recv': 0,
'install': 0,
'sent': 2
}
},
'192.168.254.100': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 100,
'state': 'up',
'uptime': '0:09:28:48',
'route': {
'recv': 0,
'install': 0,
'sent': 0
}
},
'192.168.254.101': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 101,
'state': 'up',
'uptime': '0:09:27:33',
'route': {
'recv': 0,
'install': 0,
'sent': 0
}
},
'192.168.254.102': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 102,
'state': 'up',
'uptime': '0:09:29:00',
'route': {
'recv': 0,
'install': 0,
'sent': 0
}
},
'192.168.255.2': {
'type': 'vedge',
'domain_id': 1,
'overlay_id': 1,
'site_id': 200,
'state': 'up',
'uptime': '0:04:14:12',
'route': {
'recv': 2,
'install': 0,
'sent': 0
}
}
}
}
    def test_empty(self):
        self.device = Mock(**self.empty_output)
        obj = ShowOmpPeers(device=self.device)
        with self.assertRaises(SchemaEmptyParserError):
            parsed_output = obj.parse()

    def test_golden(self):
        self.device = Mock(**self.golden_output)
        obj = ShowOmpPeers(device=self.device)
        parsed_output = obj.parse()
        self.assertEqual(parsed_output, self.golden_parsed_output)
class TestShowOmpTlocs(unittest.TestCase):
    device = Device(name='aDevice')
    maxDiff = None

    empty_output = {'execute.return_value': ''}

    golden_output = {'execute.return_value': '''
---------------------------------------------------
tloc entries for 10.220.100.10
default
ipsec
---------------------------------------------------
RECEIVED FROM:
peer 0.0.0.0
status C,Red,R
loss-reason not set
lost-to-peer not set
lost-to-path-id not set
Attributes:
attribute-type installed
encap-key not set
encap-proto 0
encap-spi 365
encap-auth sha1-hmac,ah-sha1-hmac
encap-encrypt aes256
public-ip 10.66.12.2
public-port 12426
private-ip 10.66.12.2
private-port 12426
public-ip ::
public-port 0
private-ip ::
private-port 0
bfd-status up
domain-id not set
site-id 101
overlay-id not set
preference 0
tag not set
stale not set
weight 1
version 3
gen-id 0x80000003
carrier default
restrict 0
on-demand 0
groups [ 0 ]
bandwidth 0
qos-group default-group
border not set
unknown-attr-len not set
---------------------------------------------------
tloc entries for 10.220.100.20
default
ipsec
---------------------------------------------------
RECEIVED FROM:
peer 10.220.100.3
status C,I,R
loss-reason not set
lost-to-peer not set
lost-to-path-id not set
Attributes:
attribute-type installed
encap-key not set
encap-proto 0
encap-spi 355
encap-auth sha1-hmac,ah-sha1-hmac
encap-encrypt aes256
public-ip 10.66.13.2
public-port 12426
private-ip 10.66.13.2
private-port 12426
public-ip ::
public-port 0
private-ip ::
private-port 0
bfd-status up
domain-id not set
site-id 102
overlay-id not set
preference 0
tag not set
stale not set
weight 1
version 3
gen-id 0x80000011
carrier default
restrict 0
on-demand 0
groups [ 0 ]
bandwidth 0
qos-group default-group
border not set
unknown-attr-len not set
---------------------------------------------------
tloc entries for 10.220.100.30
default
ipsec
---------------------------------------------------
RECEIVED FROM:
peer 10.220.100.3
status C,I,R
loss-reason not set
lost-to-peer not set
lost-to-path-id not set
Attributes:
attribute-type installed
encap-key not set
encap-proto 0
encap-spi 359
encap-auth sha1-hmac,ah-sha1-hmac
encap-encrypt aes256
public-ip 10.229.11.10
public-port 12426
private-ip 10.229.11.10
private-port 12426
public-ip ::
public-port 0
private-ip ::
private-port 0
bfd-status up
domain-id not set
site-id 103
overlay-id not set
preference 0
tag not set
stale not set
weight 1
version 3
gen-id 0x80000022
carrier default
restrict 0
on-demand 0
groups [ 0 ]
bandwidth 0
qos-group default-group
border not set
unknown-attr-len not set
'''}
    golden_parsed_output = {
'tloc_data': {
'10.220.100.10': {
'tloc': {
'default': {
'transport': 'ipsec',
'received_from': {
'peer': '0.0.0.0',
'status': ['C', 'Red', 'R'],
'loss_reason': 'not_set',
'lost_to_peer': 'not_set',
'lost_to_path_id': 'not_set',
'attributes': {
'attribute_type': 'installed',
'encap_key': 'not_set',
'encap_proto': 0,
'encap_spi': 365,
'encap_auth': ['sha1-hmac', 'ah-sha1-hmac'],
'encap_encrypt': 'aes256',
'public_ip': '::',
'public_port': 0,
'private_ip': '::',
'private_port': 0,
'bfd_status': 'up',
'site_id': 101,
'preference': 0,
'tag': 'not_set',
'stale': 'not_set',
'weight': 1,
'version': 3,
'gen_id': '0x80000003',
'carrier': 'default',
'restrict': 0,
'on_demand': 0,
'groups': [0],
'bandwidth': 0,
'qos_group': 'default_group',
'border': 'not_set',
'unknown_attr_len': 'not_set'
}
}
}
}
},
'10.220.100.20': {
'tloc': {
'default': {
'transport': 'ipsec',
'received_from': {
'peer': '10.220.100.3',
'status': ['C', 'I', 'R'],
'loss_reason': 'not_set',
'lost_to_peer': 'not_set',
'lost_to_path_id': 'not_set',
'attributes': {
'attribute_type': 'installed',
'encap_key': 'not_set',
'encap_proto': 0,
'encap_spi': 355,
'encap_auth': ['sha1-hmac', 'ah-sha1-hmac'],
'encap_encrypt': 'aes256',
'public_ip': '::',
'public_port': 0,
'private_ip': '::',
'private_port': 0,
'bfd_status': 'up',
'site_id': 102,
'preference': 0,
'tag': 'not_set',
'stale': 'not_set',
'weight': 1,
'version': 3,
'gen_id': '0x80000011',
'carrier': 'default',
'restrict': 0,
'on_demand': 0,
'groups': [0],
'bandwidth': 0,
'qos_group': 'default_group',
'border': 'not_set',
'unknown_attr_len': 'not_set'
}
}
}
}
},
'10.220.100.30': {
'tloc': {
'default': {
'transport': 'ipsec',
'received_from': {
'peer': '10.220.100.3',
'status': ['C', 'I', 'R'],
'loss_reason': 'not_set',
'lost_to_peer': 'not_set',
'lost_to_path_id': 'not_set',
'attributes': {
'attribute_type': 'installed',
'encap_key': 'not_set',
'encap_proto': 0,
'encap_spi': 359,
'encap_auth': ['sha1-hmac', 'ah-sha1-hmac'],
'encap_encrypt': 'aes256',
'public_ip': '::',
'public_port': 0,
'private_ip': '::',
'private_port': 0,
'bfd_status': 'up',
'site_id': 103,
'preference': 0,
'tag': 'not_set',
'stale': 'not_set',
'weight': 1,
'version': 3,
'gen_id': '0x80000022',
'carrier': 'default',
'restrict': 0,
'on_demand': 0,
'groups': [0],
'bandwidth': 0,
'qos_group': 'default_group',
'border': 'not_set',
'unknown_attr_len': 'not_set'
}
}
}
}
}
}
}
    def test_empty(self):
        self.device = Mock(**self.empty_output)
        obj = ShowOmpTlocs(device=self.device)
        with self.assertRaises(SchemaEmptyParserError):
            parsed_output = obj.parse()

    def test_golden(self):
        self.device = Mock(**self.golden_output)
        obj = ShowOmpTlocs(device=self.device)
        parsed_output = obj.parse()
        self.assertEqual(parsed_output, self.golden_parsed_output)
if __name__ == '__main__':
    unittest.main()
| 33.409864 | 94 | 0.391754 | 1,755 | 19,645 | 4.251852 | 0.120228 | 0.04342 | 0.017154 | 0.019298 | 0.867596 | 0.840659 | 0.801662 | 0.795497 | 0.756098 | 0.752211 | 0 | 0.090356 | 0.48114 | 19,645 | 587 | 95 | 33.46678 | 0.641715 | 0.017714 | 0 | 0.635209 | 0 | 0.014519 | 0.492585 | 0.028414 | 0 | 0 | 0.003111 | 0 | 0.014519 | 1 | 0.014519 | false | 0 | 0.009074 | 0 | 0.067151 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
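All four test classes above drive their parser through the same fixture trick: the device is a `unittest.mock.Mock` configured via the `{'execute.return_value': ...}` dict, so `device.execute()` returns canned CLI text instead of contacting real hardware. A minimal self-contained sketch of that pattern (the toy key/value parser below is illustrative, not the genieparser implementation):

```python
from unittest.mock import Mock

# Configure the mock exactly as the golden_output fixtures do: keyword
# expansion of a dict whose key sets execute()'s return value.
golden_output = {"execute.return_value": "oper-state UP\nadmin-state UP\n"}
device = Mock(**golden_output)

# A toy parser: split each "key value" line of the canned CLI output.
parsed = {}
for line in device.execute().splitlines():
    key, _, value = line.partition(" ")
    if key:
        parsed[key.replace("-", "_")] = value
print(parsed)  # {'oper_state': 'UP', 'admin_state': 'UP'}
```

Because `Mock(**{'execute.return_value': ''})` makes `execute()` return an empty string, the `empty_output` fixtures exercise the parser's empty-input path and let the tests assert `SchemaEmptyParserError` is raised.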
83d731ea0e9550bd79f614d6f1d9896ac9f7e156 | 20 | py | Python | simy/__init__.py | faical-yannick-congo/similarity | 4b447a69294e89eb573af16e1153ede0cbdb3b9e | [
"MIT"
] | null | null | null | simy/__init__.py | faical-yannick-congo/similarity | 4b447a69294e89eb573af16e1153ede0cbdb3b9e | [
"MIT"
] | 10 | 2019-05-01T13:50:30.000Z | 2019-05-09T18:11:24.000Z | simy/__init__.py | faical-yannick-congo/similarity | 4b447a69294e89eb573af16e1153ede0cbdb3b9e | [
"MIT"
] | 2 | 2019-05-01T13:47:34.000Z | 2019-05-01T14:03:52.000Z | from . import record | 20 | 20 | 0.8 | 3 | 20 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
83ee0af7612a812c3616da9b7125c0b74c9ffd44 | 33,625 | py | Python | tests/unit/test_charm.py | canonical/cos-proxy-operator | 9009af012274106e218b47db9e96bdeee9bd4714 | [
"Apache-2.0"
] | null | null | null | tests/unit/test_charm.py | canonical/cos-proxy-operator | 9009af012274106e218b47db9e96bdeee9bd4714 | [
"Apache-2.0"
] | 6 | 2022-01-28T08:54:32.000Z | 2022-03-21T12:43:09.000Z | tests/unit/test_charm.py | canonical/cos-proxy-operator | 9009af012274106e218b47db9e96bdeee9bd4714 | [
"Apache-2.0"
] | 2 | 2021-09-15T10:25:24.000Z | 2021-11-24T18:59:07.000Z | # Copyright 2021 Canonical Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Learn more at: https://juju.is/docs/sdk
# Learn more about testing at: https://juju.is/docs/sdk/testing
import base64
import json
import lzma
import unittest
import uuid
from unittest.mock import patch

from ops.model import BlockedStatus
from ops.testing import Harness

from charm import COSProxyCharm
ALERT_RULE_1 = """- alert: CPU_Usage
  expr: cpu_usage_idle{is_container!=\"True\", group=\"promoagents-juju\"} < 10
  for: 5m
  labels:
    override_group_by: host
    severity: page
    cloud: juju
  annotations:
    description: |
      Host {{ $labels.host }} has had < 10% idle cpu for the last 5m
    summary: Host {{ $labels.host }} CPU free is less than 10%
"""
ALERT_RULE_2 = """- alert: DiskFull
  expr: disk_free{is_container!=\"True\", fstype!~\".*tmpfs|squashfs|overlay\"} <1024
  for: 5m
  labels:
    override_group_by: host
    severity: page
  annotations:
    description: |
      Host {{ $labels.host}} {{ $labels.path }} is full
    summary: Host {{ $labels.host }} {{ $labels.path}} is full
"""
RELABEL_INSTANCE_CONFIG = {
"source_labels": [
"juju_model",
"juju_model_uuid",
"juju_application",
"juju_unit",
],
"separator": "_",
"target_label": "instance",
"regex": "(.*)",
}
DASHBOARD_DUMMY_DATA_1 = {
"request_12345678": json.dumps(
{
"dashboard": {
"dashboard": {
"__inputs": [
{"pluginName": "Prometheus"},
],
"templating": {
"list": [
{"datasource": "Juju data"},
],
},
"panels": {"data": "some_data_to_hash_across"},
},
},
}
)
}
DUMMY_FIXED_1 = {
"charm": "dashboard-app-1",
"content": '{"__inputs": [], "templating": {"list": [{"datasource": '
'"${prometheusds}"}]}, "panels": {"data": '
'"some_data_to_hash_across"}}',
"juju_topology": {
"application": "dashboard-app-1",
"model": "testmodel",
"model_uuid": "1234567890",
"unit": "dashboard-app-1/0",
},
}
DASHBOARD_DUMMY_DATA_2 = {
"request_87654321": json.dumps(
{
"dashboard": {
"dashboard": {
"templating": {
"list": [
{"name": "host"},
],
},
"panels": {"data": "different_enough_to_rehash"},
},
},
}
)
}
DUMMY_FIXED_2 = {
"charm": "dashboard-app-2",
"content": '{"templating": {"list": [{"allValue": null, "datasource": '
'"${prometheusds}", "definition": '
'"label_values(up{juju_model=\\"$juju_model\\",juju_model_uuid=\\"$juju_model_uuid\\",juju_application=\\"$juju_application\\"},host)", '
'"description": null, "error": null, "hide": 0, "includeAll": '
'false, "label": "hosts", "multi": true, "name": "host", '
'"options": [], "query": {"query": '
'"label_values(up{juju_model=\\"$juju_model\\",juju_model_uuid=\\"$juju_model_uuid\\",juju_application=\\"$juju_application\\"},host)", '
'"refId": "StandardVariableQuery"}, "refresh": 1, "regex": "", '
'"skipUrlSync": false, "sort": 1, "tagValuesQuery": "", "tags": '
'[], "tagsQuery": "", "type": "query", "useTags": false}]}, '
'"panels": {"data": "different_enough_to_rehash"}}',
"juju_topology": {
"application": "dashboard-app-2",
"model": "testmodel",
"model_uuid": "1234567890",
"unit": "dashboard-app-2/0",
},
}
@patch.object(lzma, "compress", new=lambda x, *args, **kwargs: x)
@patch.object(lzma, "decompress", new=lambda x, *args, **kwargs: x)
@patch.object(uuid, "uuid4", new=lambda: "12345678")
@patch.object(base64, "b64encode", new=lambda x: x)
@patch.object(base64, "b64decode", new=lambda x: x)
class COSProxyCharmTest(unittest.TestCase):
def setUp(self):
self.harness = Harness(COSProxyCharm)
self.harness.set_model_info(name="testmodel", uuid="1234567890")
self.addCleanup(self.harness.cleanup)
self.harness.begin()
def test_scrape_target_relation_without_downstream_prometheus_blocks(self):
self.harness.set_leader(True)
rel_id = self.harness.add_relation("prometheus-target", "target-app")
self.harness.add_relation_unit(rel_id, "target-app/0")
self.harness.update_relation_data(
rel_id,
"target-app/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
self.assertEqual(
self.harness.model.unit.status,
BlockedStatus("Missing one of (Prometheus|target|nrpe) relation(s)"),
)
def test_prometheus_relation_without_scrape_target_blocks(self):
self.harness.set_leader(True)
downstream_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(downstream_rel_id, "cos-prometheus/0")
self.assertEqual(
self.harness.model.unit.status,
BlockedStatus("Missing one of (Prometheus|target|nrpe) relation(s)"),
)
def test_grafana_relation_without_dashboards_blocks(self):
self.harness.set_leader(True)
downstream_rel_id = self.harness.add_relation(
"downstream-grafana-dashboard", "cos-grafana"
)
        self.harness.add_relation_unit(downstream_rel_id, "cos-grafana/0")
self.assertEqual(
self.harness.model.unit.status,
BlockedStatus("Missing one of (Grafana|dashboard) relation(s)"),
)
def test_dashboards_without_grafana_relations_blocks(self):
self.harness.set_leader(True)
        dashboard_rel_id = self.harness.add_relation("dashboards", "target-app")
        self.harness.add_relation_unit(dashboard_rel_id, "target-app/0")
self.assertEqual(
self.harness.model.unit.status,
BlockedStatus("Missing one of (Grafana|dashboard) relation(s)"),
)
def test_scrape_jobs_are_forwarded_on_adding_prometheus_then_targets(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
target_rel_id = self.harness.add_relation("prometheus-target", "target-app")
self.harness.add_relation_unit(target_rel_id, "target-app/0")
self.harness.update_relation_data(
target_rel_id,
"target-app/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
expected_jobs = [
{
"job_name": "juju_testmodel_1234567_target-app_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_0:1234"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app",
"juju_unit": "target-app/0",
"host": "scrape_target_0",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
}
]
self.assertListEqual(scrape_jobs, expected_jobs)
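The expected job above is mechanical to derive from the unit's relation data plus Juju topology. A hedged sketch of that construction; `build_scrape_job` is an illustrative name, not the charm's actual helper:

```python
def build_scrape_job(model, model_uuid, app, unit, hostname, port, relabel_configs):
    """Assemble a Prometheus scrape job dict shaped like the ones forwarded downstream."""
    return {
        # Job names embed a truncated (7-character) model UUID.
        "job_name": "juju_{}_{}_{}_prometheus_scrape".format(model, model_uuid[:7], app),
        "static_configs": [
            {
                "targets": ["{}:{}".format(hostname, port)],
                "labels": {
                    "juju_model": model,
                    "juju_model_uuid": model_uuid,
                    "juju_application": app,
                    "juju_unit": unit,
                    "host": hostname,
                },
            }
        ],
        "relabel_configs": relabel_configs,
    }

job = build_scrape_job("testmodel", "1234567890", "target-app", "target-app/0",
                       "scrape_target_0", "1234", [])
```

Note the asymmetry the assertions encode: the job name truncates the model UUID, while the `juju_model_uuid` label keeps it in full.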
def test_scrape_jobs_are_forwarded_on_adding_targets_then_prometheus(self):
self.harness.set_leader(True)
target_rel_id = self.harness.add_relation("prometheus-target", "target-app")
self.harness.add_relation_unit(target_rel_id, "target-app/0")
self.harness.update_relation_data(
target_rel_id,
"target-app/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
expected_jobs = [
{
"job_name": "juju_testmodel_1234567_target-app_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_0:1234"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app",
"juju_unit": "target-app/0",
"host": "scrape_target_0",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
}
]
self.assertListEqual(scrape_jobs, expected_jobs)
def test_alert_rules_are_forwarded_on_adding_prometheus_then_targets(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
alert_rules_rel_id = self.harness.add_relation("prometheus-rules", "rules-app")
self.harness.add_relation_unit(alert_rules_rel_id, "rules-app/0")
self.harness.update_relation_data(
alert_rules_rel_id,
"rules-app/0",
{"groups": ALERT_RULE_1},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 1)
group = groups[0]
expected_group = {
"name": "juju_testmodel_1234567_rules-app_alert_rules",
"rules": [
{
"alert": "CPU_Usage",
"expr": 'cpu_usage_idle{is_container!="True", group="promoagents-juju"} < 10',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"cloud": "juju",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app",
"juju_unit": "rules-app/0",
},
"annotations": {
"description": "Host {{ $labels.host }} has had < 10% idle cpu for the last 5m\n",
"summary": "Host {{ $labels.host }} CPU free is less than 10%",
},
}
],
}
self.assertDictEqual(group, expected_group)
def test_alert_rules_are_forwarded_on_adding_targets_then_prometheus(self):
self.harness.set_leader(True)
alert_rules_rel_id = self.harness.add_relation("prometheus-rules", "rules-app")
self.harness.add_relation_unit(alert_rules_rel_id, "rules-app/0")
self.harness.update_relation_data(
alert_rules_rel_id,
"rules-app/0",
{"groups": ALERT_RULE_1},
)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 1)
group = groups[0]
expected_group = {
"name": "juju_testmodel_1234567_rules-app_alert_rules",
"rules": [
{
"alert": "CPU_Usage",
"expr": 'cpu_usage_idle{is_container!="True", group="promoagents-juju"} < 10',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"cloud": "juju",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app",
"juju_unit": "rules-app/0",
},
"annotations": {
"description": "Host {{ $labels.host }} has had < 10% idle cpu for the last 5m\n",
"summary": "Host {{ $labels.host }} CPU free is less than 10%",
},
}
],
}
self.assertDictEqual(group, expected_group)
def test_multiple_scrape_jobs_are_forwarded(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
target_rel_id_1 = self.harness.add_relation("prometheus-target", "target-app-1")
self.harness.add_relation_unit(target_rel_id_1, "target-app-1/0")
self.harness.update_relation_data(
target_rel_id_1,
"target-app-1/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
target_rel_id_2 = self.harness.add_relation("prometheus-target", "target-app-2")
self.harness.add_relation_unit(target_rel_id_2, "target-app-2/0")
self.harness.update_relation_data(
target_rel_id_2,
"target-app-2/0",
{
"hostname": "scrape_target_1",
"port": "5678",
},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
self.assertEqual(len(scrape_jobs), 2)
expected_jobs = [
{
"job_name": "juju_testmodel_1234567_target-app-1_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_0:1234"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app-1",
"juju_unit": "target-app-1/0",
"host": "scrape_target_0",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
},
{
"job_name": "juju_testmodel_1234567_target-app-2_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_1:5678"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app-2",
"juju_unit": "target-app-2/0",
"host": "scrape_target_1",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
},
]
self.assertListEqual(scrape_jobs, expected_jobs)
def test_multiple_alert_rules_are_forwarded(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
alert_rules_rel_id_1 = self.harness.add_relation("prometheus-rules", "rules-app-1")
self.harness.add_relation_unit(alert_rules_rel_id_1, "rules-app-1/0")
self.harness.update_relation_data(
alert_rules_rel_id_1,
"rules-app-1/0",
{"groups": ALERT_RULE_1},
)
alert_rules_rel_id_2 = self.harness.add_relation("prometheus-rules", "rules-app-2")
self.harness.add_relation_unit(alert_rules_rel_id_2, "rules-app-2/0")
self.harness.update_relation_data(
alert_rules_rel_id_2,
"rules-app-2/0",
{"groups": ALERT_RULE_2},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 2)
expected_groups = [
{
"name": "juju_testmodel_1234567_rules-app-1_alert_rules",
"rules": [
{
"alert": "CPU_Usage",
"expr": 'cpu_usage_idle{is_container!="True", group="promoagents-juju"} < 10',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"cloud": "juju",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app-1",
"juju_unit": "rules-app-1/0",
},
"annotations": {
"description": "Host {{ $labels.host }} has had < 10% idle cpu for the last 5m\n",
"summary": "Host {{ $labels.host }} CPU free is less than 10%",
},
}
],
},
{
"name": "juju_testmodel_1234567_rules-app-2_alert_rules",
"rules": [
{
"alert": "DiskFull",
"expr": 'disk_free{is_container!="True", fstype!~".*tmpfs|squashfs|overlay"} <1024',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app-2",
"juju_unit": "rules-app-2/0",
},
"annotations": {
"description": "Host {{ $labels.host}} {{ $labels.path }} is full\nsummary: Host {{ $labels.host }} {{ $labels.path}} is full\n"
},
}
],
},
]
self.assertListEqual(groups, expected_groups)
def test_scrape_job_removal_differentiates_between_applications(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
target_rel_id_1 = self.harness.add_relation("prometheus-target", "target-app-1")
self.harness.add_relation_unit(target_rel_id_1, "target-app-1/0")
self.harness.update_relation_data(
target_rel_id_1,
"target-app-1/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
target_rel_id_2 = self.harness.add_relation("prometheus-target", "target-app-2")
self.harness.add_relation_unit(target_rel_id_2, "target-app-2/0")
self.harness.update_relation_data(
target_rel_id_2,
"target-app-2/0",
{
"hostname": "scrape_target_1",
"port": "5678",
},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
self.assertEqual(len(scrape_jobs), 2)
self.harness.remove_relation_unit(target_rel_id_2, "target-app-2/0")
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
self.assertEqual(len(scrape_jobs), 1)
expected_jobs = [
{
"job_name": "juju_testmodel_1234567_target-app-1_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_0:1234"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app-1",
"juju_unit": "target-app-1/0",
"host": "scrape_target_0",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
}
]
self.assertListEqual(scrape_jobs, expected_jobs)
def test_alert_rules_removal_differentiates_between_applications(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
alert_rules_rel_id_1 = self.harness.add_relation("prometheus-rules", "rules-app-1")
self.harness.add_relation_unit(alert_rules_rel_id_1, "rules-app-1/0")
self.harness.update_relation_data(
alert_rules_rel_id_1,
"rules-app-1/0",
{"groups": ALERT_RULE_1},
)
alert_rules_rel_id_2 = self.harness.add_relation("prometheus-rules", "rules-app-2")
self.harness.add_relation_unit(alert_rules_rel_id_2, "rules-app-2/0")
self.harness.update_relation_data(
alert_rules_rel_id_2,
"rules-app-2/0",
{"groups": ALERT_RULE_2},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 2)
self.harness.remove_relation_unit(alert_rules_rel_id_2, "rules-app-2/0")
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 1)
expected_groups = [
{
"name": "juju_testmodel_1234567_rules-app-1_alert_rules",
"rules": [
{
"alert": "CPU_Usage",
"expr": 'cpu_usage_idle{is_container!="True", group="promoagents-juju"} < 10',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"cloud": "juju",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app-1",
"juju_unit": "rules-app-1/0",
},
"annotations": {
"description": "Host {{ $labels.host }} has had < 10% idle cpu for the last 5m\n",
"summary": "Host {{ $labels.host }} CPU free is less than 10%",
},
}
],
},
]
self.assertListEqual(groups, expected_groups)
def test_removing_scrape_jobs_differentiates_between_units(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
target_rel_id = self.harness.add_relation("prometheus-target", "target-app")
self.harness.add_relation_unit(target_rel_id, "target-app/0")
self.harness.update_relation_data(
target_rel_id,
"target-app/0",
{
"hostname": "scrape_target_0",
"port": "1234",
},
)
self.harness.add_relation_unit(target_rel_id, "target-app/1")
self.harness.update_relation_data(
target_rel_id,
"target-app/1",
{
"hostname": "scrape_target_1",
"port": "5678",
},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
self.assertEqual(len(scrape_jobs), 1)
self.assertEqual(len(scrape_jobs[0].get("static_configs")), 2)
self.harness.remove_relation_unit(target_rel_id, "target-app/1")
scrape_jobs = json.loads(prometheus_rel_data.get("scrape_jobs", "[]"))
self.assertEqual(len(scrape_jobs), 1)
self.assertEqual(len(scrape_jobs[0].get("static_configs")), 1)
expected_jobs = [
{
"job_name": "juju_testmodel_1234567_target-app_prometheus_scrape",
"static_configs": [
{
"targets": ["scrape_target_0:1234"],
"labels": {
"juju_model": "testmodel",
"juju_model_uuid": "1234567890",
"juju_application": "target-app",
"juju_unit": "target-app/0",
"host": "scrape_target_0",
},
}
],
"relabel_configs": [RELABEL_INSTANCE_CONFIG],
}
]
self.assertListEqual(scrape_jobs, expected_jobs)
def test_removing_alert_rules_differentiates_between_units(self):
self.harness.set_leader(True)
prometheus_rel_id = self.harness.add_relation(
"downstream-prometheus-scrape", "cos-prometheus"
)
self.harness.add_relation_unit(prometheus_rel_id, "cos-prometheus/0")
alert_rules_rel_id = self.harness.add_relation("prometheus-rules", "rules-app")
self.harness.add_relation_unit(alert_rules_rel_id, "rules-app/0")
self.harness.update_relation_data(
alert_rules_rel_id,
"rules-app/0",
{"groups": ALERT_RULE_1},
)
self.harness.add_relation_unit(alert_rules_rel_id, "rules-app/1")
self.harness.update_relation_data(
alert_rules_rel_id,
"rules-app/1",
{"groups": ALERT_RULE_2},
)
prometheus_rel_data = self.harness.get_relation_data(
prometheus_rel_id, self.harness.model.app.name
)
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 1)
self.harness.remove_relation_unit(alert_rules_rel_id, "rules-app/1")
alert_rules = json.loads(prometheus_rel_data.get("alert_rules", "{}"))
groups = alert_rules.get("groups", [])
self.assertEqual(len(groups), 1)
expected_groups = [
{
"name": "juju_testmodel_1234567_rules-app_alert_rules",
"rules": [
{
"alert": "CPU_Usage",
"expr": 'cpu_usage_idle{is_container!="True", group="promoagents-juju"} < 10',
"for": "5m",
"labels": {
"override_group_by": "host",
"severity": "page",
"cloud": "juju",
"juju_model": "testmodel",
"juju_model_uuid": "1234567",
"juju_application": "rules-app",
"juju_unit": "rules-app/0",
},
"annotations": {
"description": "Host {{ $labels.host }} has had < 10% idle cpu for the last 5m\n",
"summary": "Host {{ $labels.host }} CPU free is less than 10%",
},
}
],
},
]
self.assertListEqual(groups, expected_groups)
    def test_dashboards_are_forwarded(self):
self.harness.set_leader(True)
grafana_rel_id = self.harness.add_relation("downstream-grafana-dashboard", "cos-grafana")
self.harness.add_relation_unit(grafana_rel_id, "cos-grafana/0")
target_rel_id = self.harness.add_relation("dashboards", "dashboard-app")
self.harness.add_relation_unit(target_rel_id, "dashboard-app/0")
self.harness.update_relation_data(target_rel_id, "dashboard-app/0", DASHBOARD_DUMMY_DATA_1)
grafana_rel_data = self.harness.get_relation_data(
grafana_rel_id, self.harness.model.app.name
)
dashboards = json.loads(grafana_rel_data.get("dashboards", "{}"))
self.assertEqual(len(dashboards["templates"]), 1)
def test_multiple_dashboards_are_forwarded(self):
self.harness.set_leader(True)
grafana_rel_id = self.harness.add_relation("downstream-grafana-dashboard", "cos-grafana")
self.harness.add_relation_unit(grafana_rel_id, "cos-grafana/0")
target_rel_id_1 = self.harness.add_relation("dashboards", "dashboard-app-1")
self.harness.add_relation_unit(target_rel_id_1, "dashboard-app-1/0")
self.harness.update_relation_data(
target_rel_id_1, "dashboard-app-1/0", DASHBOARD_DUMMY_DATA_1
)
target_rel_id_2 = self.harness.add_relation("dashboards", "dashboard-app-2")
self.harness.add_relation_unit(target_rel_id_2, "dashboard-app-2/0")
self.harness.update_relation_data(
target_rel_id_2, "dashboard-app-2/0", DASHBOARD_DUMMY_DATA_2
)
grafana_rel_data = self.harness.get_relation_data(
grafana_rel_id, self.harness.model.app.name
)
dashboards = json.loads(grafana_rel_data.get("dashboards", "{}"))
self.assertEqual(len(dashboards["templates"]), 2)
def test_dashboards_are_converted(self):
self.harness.set_leader(True)
grafana_rel_id = self.harness.add_relation("downstream-grafana-dashboard", "cos-grafana")
self.harness.add_relation_unit(grafana_rel_id, "cos-grafana/0")
target_rel_id_1 = self.harness.add_relation("dashboards", "dashboard-app-1")
self.harness.add_relation_unit(target_rel_id_1, "dashboard-app-1/0")
self.harness.update_relation_data(
target_rel_id_1, "dashboard-app-1/0", DASHBOARD_DUMMY_DATA_1
)
target_rel_id_2 = self.harness.add_relation("dashboards", "dashboard-app-2")
self.harness.add_relation_unit(target_rel_id_2, "dashboard-app-2/0")
self.harness.update_relation_data(
target_rel_id_2, "dashboard-app-2/0", DASHBOARD_DUMMY_DATA_2
)
grafana_rel_data = self.harness.get_relation_data(
grafana_rel_id, self.harness.model.app.name
)
dashboards = json.loads(grafana_rel_data.get("dashboards", "{}"))
self.assertEqual(len(dashboards["templates"]), 2)
self.maxDiff = None
self.assertEqual(dashboards["templates"]["prog:e_data_t"], DUMMY_FIXED_1)
self.assertEqual(dashboards["templates"]["prog:rent_eno"], DUMMY_FIXED_2)
83eee7d0b946165bbcf5482d022e3ac5d0396b14 | 28 | py | Python | diy_gym/__init__.py | ktlichkid/diy-gym | 8783f15e2cb203829f0f1e1eac06c3310065e7f9 | [
"MIT"
] | 22 | 2019-07-22T11:56:57.000Z | 2022-01-07T09:16:20.000Z | diy_gym/__init__.py | ktlichkid/diy-gym | 8783f15e2cb203829f0f1e1eac06c3310065e7f9 | [
"MIT"
] | 6 | 2019-08-05T00:55:16.000Z | 2021-03-11T19:45:23.000Z | diy_gym/__init__.py | ktlichkid/diy-gym | 8783f15e2cb203829f0f1e1eac06c3310065e7f9 | [
"MIT"
] | 5 | 2019-07-29T00:56:51.000Z | 2021-01-22T19:16:09.000Z | from .diy_gym import DIYGym
83f95247bfc5263c05c066194c4ec7a6a4139788 | 148 | py | Python | frappe/patches/v11_0/get_docs_apps_if_not_present.py | AKedar21/frappe | 4c9ce1701caea07e595f81414af3a9f219cccb65 | [
"MIT"
] | 2 | 2017-08-24T20:25:13.000Z | 2017-10-15T13:14:31.000Z | frappe/patches/v11_0/get_docs_apps_if_not_present.py | AKedar21/frappe | 4c9ce1701caea07e595f81414af3a9f219cccb65 | [
"MIT"
] | 89 | 2017-09-19T15:17:44.000Z | 2022-03-31T00:52:42.000Z | frappe/patches/v11_0/get_docs_apps_if_not_present.py | AKedar21/frappe | 4c9ce1701caea07e595f81414af3a9f219cccb65 | [
"MIT"
] | 3 | 2019-08-09T17:52:18.000Z | 2020-07-29T08:23:46.000Z | import frappe
from frappe.utils.help import setup_apps_for_docs
def execute():
for app in frappe.get_installed_apps():
setup_apps_for_docs(app)
f7bfcd0f35c9a82c9fe7420add22a8f45b10c218 | 24 | py | Python | python/testData/psi/FStringTerminatedByLineBreakInNestedExpressionInFormatPart.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/psi/FStringTerminatedByLineBreakInNestedExpressionInFormatPart.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/psi/FStringTerminatedByLineBreakInNestedExpressionInFormatPart.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | s = f"{f'{42:{1 +
2}}'}"
f7f88134a2d2774c5cd6a84b20a1928fbeea771a | 153 | py | Python | sanjip/__init__.py | Sanji-IO/sanjip | 5ab77263d8190f803f6f4bd063459873ac9bcabb | [
"MIT"
] | null | null | null | sanjip/__init__.py | Sanji-IO/sanjip | 5ab77263d8190f803f6f4bd063459873ac9bcabb | [
"MIT"
] | 1 | 2019-09-23T20:58:57.000Z | 2019-09-23T20:58:57.000Z | sanjip/__init__.py | Sanji-IO/sanjip | 5ab77263d8190f803f6f4bd063459873ac9bcabb | [
"MIT"
] | 1 | 2019-09-23T00:23:02.000Z | 2019-09-23T00:23:02.000Z | from __future__ import absolute_import
import sanjip.ip as ip
import sanjip.ip.addr
import sanjip.ip.route # noqa: F401
# __all__ must contain strings naming attributes of this package, not module objects.
__all__ = ["ip"]
f7fd5df126bc1afb609fc7078534d1b8c043c1b8 | 367 | py | Python | spiketools/tests/utils/test_base.py | claire98han/SpikeTools | f1cdffd50e2cbdb75961a716425c4665aa930f54 | [
"Apache-2.0"
] | 1 | 2022-03-09T19:40:37.000Z | 2022-03-09T19:40:37.000Z | spiketools/tests/utils/test_base.py | claire98han/SpikeTools | f1cdffd50e2cbdb75961a716425c4665aa930f54 | [
"Apache-2.0"
] | 35 | 2021-09-28T15:13:31.000Z | 2021-11-26T04:38:08.000Z | spiketools/tests/utils/test_base.py | claire98han/SpikeTools | f1cdffd50e2cbdb75961a716425c4665aa930f54 | [
"Apache-2.0"
] | 4 | 2021-09-28T14:56:24.000Z | 2022-03-09T21:00:31.000Z | """Tests for spiketools.utils.base"""
from spiketools.utils.base import *
###################################################################################################
###################################################################################################
def test_flatten():
lsts = [[1, 2], [3, 4]]
assert flatten(lsts) == [1, 2, 3, 4]
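A one-liner consistent with the behavior this test asserts, shown as a sketch rather than the package's actual implementation:

```python
def flatten(lsts):
    """Flatten one level of nesting: a list of lists becomes a single list."""
    return [item for sub in lsts for item in sub]

flat = flatten([[1, 2], [3, 4]])  # [1, 2, 3, 4]
```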
790dea0f54387804d817c82c1733143d3d140ea9 | 220 | py | Python | filter_plugins/env_json_map.py | paulrbr-fl/ansible-clever | b731b96649a95825576060e8821e247b99aa8f2d | [
"MIT"
] | 7 | 2020-10-12T16:25:30.000Z | 2021-02-26T15:47:17.000Z | filter_plugins/env_json_map.py | paulrbr-fl/ansible-clever | b731b96649a95825576060e8821e247b99aa8f2d | [
"MIT"
] | 1 | 2020-10-12T16:00:35.000Z | 2020-10-12T16:00:35.000Z | filter_plugins/env_json_map.py | paulrbr-fl/ansible-clever | b731b96649a95825576060e8821e247b99aa8f2d | [
"MIT"
] | 2 | 2020-12-08T10:17:41.000Z | 2021-06-03T09:32:49.000Z | #!/usr/bin/env python
class FilterModule(object):
def filters(self):
return {'json_env_map': self.json_env_map}
def json_env_map(self, env):
return [{'name': k, 'value': str(v)} for k,v in env.items()]
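Outside Ansible, the filter's transformation can be replicated directly; this standalone sketch mirrors its logic on a sample mapping:

```python
def json_env_map(env):
    """Mirror of the filter's logic: mapping -> list of {'name', 'value'} dicts, values stringified."""
    return [{"name": k, "value": str(v)} for k, v in env.items()]

result = json_env_map({"PORT": 8080, "DEBUG": True})
```

Non-string values are coerced with `str()`, so booleans become the strings `"True"`/`"False"` rather than JSON `true`/`false`.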
793346b060087845a833c65f4c3392d21f0161bb | 96 | py | Python | venv/lib/python3.8/site-packages/cachy/stores/redis_store.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/cachy/stores/redis_store.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/cachy/stores/redis_store.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/d2/a1/cc/0c40d7c68d012303dad648eb48225a7854d38a969ba38c904d34b38afb | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.416667 | 0 | 96 | 1 | 96 | 96 | 0.479167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f7160d0ab638df715f56f4ffaaf4cc3e1943ef2c | 1,835 | py | Python | project/server/auth/wrapper.py | RaihanSabique/Flask-Restful-JWT-Auth | a6be0cc72d4f697ac3cdfa41551de9633f6feb35 | [
"MIT"
] | null | null | null | project/server/auth/wrapper.py | RaihanSabique/Flask-Restful-JWT-Auth | a6be0cc72d4f697ac3cdfa41551de9633f6feb35 | [
"MIT"
] | null | null | null | project/server/auth/wrapper.py | RaihanSabique/Flask-Restful-JWT-Auth | a6be0cc72d4f697ac3cdfa41551de9633f6feb35 | [
"MIT"
] | null | null | null | import functools
from flask import Flask, request, make_response, jsonify
from flask_restful import Resource, Api, abort
from project.server.models import User
def login_required(method):
@functools.wraps(method)
def wrapper(self):
auth_header = request.headers.get('Authorization')
if auth_header:
try:
auth_token = auth_header.split(" ")[1]
except IndexError:
abort(400, message='Bearer token malformed.')
else:
auth_token = ''
if auth_token:
resp = User.decode_auth_token(auth_token)
if not isinstance(resp, str):
user = User.query.filter_by(id=resp).first()
if(user.is_active):
return method(self, user)
abort(400, message='Provide a valid auth token.')
else:
abort(400, message='No auth token')
return wrapper
def admin_required(method):
    @functools.wraps(method)
    def wrapper(self):
        auth_header = request.headers.get('Authorization')
        if auth_header:
            try:
                auth_token = auth_header.split(" ")[1]
            except IndexError:
                abort(400, message='Bearer token malformed.')
        else:
            auth_token = ''
        if auth_token:
            resp = User.decode_auth_token(auth_token)
            print(resp)
            if not isinstance(resp, str):
                user = User.query.filter_by(id=resp).first()
                # Guard against a missing user record before checking the flag
                if user and user.admin:
                    return method(self, user)
                else:
                    abort(400, message='Admin required.')
            abort(400, message='Provide a valid auth token.')
        else:
            abort(400, message='No auth token')
    return wrapper | 35.288462 | 61 | 0.559128 | 202 | 1,835 | 4.955446 | 0.292079 | 0.125874 | 0.104895 | 0.056943 | 0.767233 | 0.767233 | 0.767233 | 0.767233 | 0.767233 | 0.767233 | 0 | 0.019151 | 0.345504 | 1,835 | 52 | 62 | 35.288462 | 0.814321 | 0 | 0 | 0.82 | 0 | 0 | 0.092048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.08 | 0 | 0.24 | 0.04 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f7748fb044ae244e7c7006a34dc0350d773f29bb | 162 | py | Python | pyro/infer/mcmc/__init__.py | fluffybird2323/pyro | 9e74e499dbda76c28f12528235dac25bd17f0b1b | [
"MIT"
] | 2 | 2019-01-26T01:53:31.000Z | 2020-02-26T17:39:17.000Z | pyro/infer/mcmc/__init__.py | fluffybird2323/pyro | 9e74e499dbda76c28f12528235dac25bd17f0b1b | [
"MIT"
] | 1 | 2017-12-15T14:01:01.000Z | 2017-12-17T03:09:06.000Z | pyro/infer/mcmc/__init__.py | fluffybird2323/pyro | 9e74e499dbda76c28f12528235dac25bd17f0b1b | [
"MIT"
] | 1 | 2018-10-02T18:50:33.000Z | 2018-10-02T18:50:33.000Z | from pyro.infer.mcmc.hmc import HMC
from pyro.infer.mcmc.mcmc import MCMC
from pyro.infer.mcmc.nuts import NUTS
__all__ = [
    "HMC",
    "MCMC",
    "NUTS",
]
| 16.2 | 37 | 0.67284 | 25 | 162 | 4.2 | 0.32 | 0.228571 | 0.371429 | 0.485714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197531 | 162 | 9 | 38 | 18 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0.067901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
e391099078d0e93f329085a5789c325cb3fff32c | 47 | py | Python | models/__init__.py | rentainhe/Swin-Transformer | 3405655613bd74eb837694f80eaaed4678b7f6fc | [
"MIT"
] | null | null | null | models/__init__.py | rentainhe/Swin-Transformer | 3405655613bd74eb837694f80eaaed4678b7f6fc | [
"MIT"
] | null | null | null | models/__init__.py | rentainhe/Swin-Transformer | 3405655613bd74eb837694f80eaaed4678b7f6fc | [
"MIT"
] | null | null | null | from .build import build_model, build_vit_model | 47 | 47 | 0.87234 | 8 | 47 | 4.75 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 47 | 1 | 47 | 47 | 0.883721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e3aa43b858cee759f67636336fec1f16461c431c | 42 | py | Python | sedldata/__init__.py | OpenDataServices/sedldata | c7f3b13969bb9c9a494a5fadf1456cc85e9bf2cc | [
"BSD-3-Clause"
] | null | null | null | sedldata/__init__.py | OpenDataServices/sedldata | c7f3b13969bb9c9a494a5fadf1456cc85e9bf2cc | [
"BSD-3-Clause"
] | null | null | null | sedldata/__init__.py | OpenDataServices/sedldata | c7f3b13969bb9c9a494a5fadf1456cc85e9bf2cc | [
"BSD-3-Clause"
] | 1 | 2019-01-20T19:39:11.000Z | 2019-01-20T19:39:11.000Z | from sedldata.lib import Session # noqa
| 10.5 | 39 | 0.761905 | 6 | 42 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 42 | 3 | 40 | 14 | 0.941176 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e3de6128ad1c904dd3a9666b18ca38f18450812f | 129 | py | Python | django_file_form/migration.py | tonibagur/django-file-form | 5c4d439aa4253907d4ce8b175511c02b19ca4878 | [
"Apache-2.0"
] | null | null | null | django_file_form/migration.py | tonibagur/django-file-form | 5c4d439aa4253907d4ce8b175511c02b19ca4878 | [
"Apache-2.0"
] | null | null | null | django_file_form/migration.py | tonibagur/django-file-form | 5c4d439aa4253907d4ce8b175511c02b19ca4878 | [
"Apache-2.0"
] | null | null | null | from django.db import connection
def table_exists(table_name):
    return table_name in connection.introspection.table_names()
| 21.5 | 63 | 0.813953 | 18 | 129 | 5.611111 | 0.722222 | 0.178218 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.124031 | 129 | 5 | 64 | 25.8 | 0.893805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
e3f41e08bbe01aa86609397f3d4ed4931a9b184d | 145 | py | Python | apps/search/forms.py | Mozilla-GitHub-Standards/93f18f14efcf5fdfc0e04f9bf247f66baf46663f37b1d2087ab8d850abc90803 | 4e374b4d52dfb9039ebe543e7f27682189022307 | [
"BSD-3-Clause"
] | 2 | 2015-04-06T15:20:29.000Z | 2016-12-30T12:25:11.000Z | apps/search/forms.py | Mozilla-GitHub-Standards/93f18f14efcf5fdfc0e04f9bf247f66baf46663f37b1d2087ab8d850abc90803 | 4e374b4d52dfb9039ebe543e7f27682189022307 | [
"BSD-3-Clause"
] | 2 | 2019-02-17T17:38:02.000Z | 2019-03-28T03:49:16.000Z | apps/search/forms.py | Mozilla-GitHub-Standards/93f18f14efcf5fdfc0e04f9bf247f66baf46663f37b1d2087ab8d850abc90803 | 4e374b4d52dfb9039ebe543e7f27682189022307 | [
"BSD-3-Clause"
] | 1 | 2019-03-28T03:49:18.000Z | 2019-03-28T03:49:18.000Z | from haystack.forms import FacetedSearchForm
class CustomFacetedSearchForm(FacetedSearchForm):
"""Override the results settings"""
pass
| 24.166667 | 49 | 0.793103 | 13 | 145 | 8.846154 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 145 | 5 | 50 | 29 | 0.92 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
540d88e4612374e85eec70dc198a7143a06621a7 | 195 | py | Python | scraper_meta.py | altanner/snax2 | 7c1cede46806ca434516a57a00c36af3e2b244ed | [
"MIT"
] | null | null | null | scraper_meta.py | altanner/snax2 | 7c1cede46806ca434516a57a00c36af3e2b244ed | [
"MIT"
] | null | null | null | scraper_meta.py | altanner/snax2 | 7c1cede46806ca434516a57a00c36af3e2b244ed | [
"MIT"
] | null | null | null | # user_agent = {"User-Agent": "python-requests/2.25.1"}
user_agent = {"User-Agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.14 (KHTML, like Gecko) Chrome/24.0.1292.0 Safari/537.14"}
| 65 | 138 | 0.697436 | 35 | 195 | 3.828571 | 0.685714 | 0.268657 | 0.19403 | 0.268657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159091 | 0.097436 | 195 | 2 | 139 | 97.5 | 0.602273 | 0.271795 | 0 | 0 | 0 | 1 | 0.835714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
58923f7302ddc8b9f40b2b9cff6f4d873b2d263c | 154 | py | Python | brainbox/tests/test_metrics.py | SebastianBruijns/ibllib | 49f2091b7a53430c00c339b862dfc1a53aab008b | [
"MIT"
] | null | null | null | brainbox/tests/test_metrics.py | SebastianBruijns/ibllib | 49f2091b7a53430c00c339b862dfc1a53aab008b | [
"MIT"
] | null | null | null | brainbox/tests/test_metrics.py | SebastianBruijns/ibllib | 49f2091b7a53430c00c339b862dfc1a53aab008b | [
"MIT"
] | null | null | null |
def test_unit_stability():
    pass
def test_feat_cutoff():
    pass
def test_wf_similarity():
    pass
def test_firing_rate_coeff_var():
    pass
| 9.625 | 33 | 0.694805 | 22 | 154 | 4.409091 | 0.590909 | 0.28866 | 0.340206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.227273 | 154 | 15 | 34 | 10.266667 | 0.815126 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
54508a41a36a513f90ea1e6deb97390695f1d32d | 180 | py | Python | python/8kyu/sum_of_positive.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | 3 | 2021-06-08T01:57:13.000Z | 2021-06-26T10:52:47.000Z | python/8kyu/sum_of_positive.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | null | null | null | python/8kyu/sum_of_positive.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | 2 | 2021-06-10T21:20:13.000Z | 2021-06-30T10:13:26.000Z | """Kata url: https://www.codewars.com/kata/5715eaedb436cf5606000381."""
from typing import List
def positive_sum(arr: List[int]) -> int:
    return sum(x for x in arr if x > 0)
| 22.5 | 71 | 0.694444 | 29 | 180 | 4.275862 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119205 | 0.161111 | 180 | 7 | 72 | 25.714286 | 0.701987 | 0.361111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
546274028e4cfa1271329a5c85c288e4240f78b8 | 65 | py | Python | payrun/api/employers.py | Zingeon/payrun-python | 1fbac0ee2556641840bf0b34d6da44437d91dc80 | [
"MIT"
] | null | null | null | payrun/api/employers.py | Zingeon/payrun-python | 1fbac0ee2556641840bf0b34d6da44437d91dc80 | [
"MIT"
] | null | null | null | payrun/api/employers.py | Zingeon/payrun-python | 1fbac0ee2556641840bf0b34d6da44437d91dc80 | [
"MIT"
] | null | null | null | class Employers():
def getItems():
return 'employers' | 21.666667 | 26 | 0.615385 | 6 | 65 | 6.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.261538 | 65 | 3 | 26 | 21.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |