hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
46b4b0321354077553576ed17ffceecc515fd04d | 35 | py | Python | scripts/assert.py | liwt31/unexpected | f973896812ad54494a8644fff431fc5202eaef41 | [
"MIT"
] | null | null | null | scripts/assert.py | liwt31/unexpected | f973896812ad54494a8644fff431fc5202eaef41 | [
"MIT"
] | null | null | null | scripts/assert.py | liwt31/unexpected | f973896812ad54494a8644fff431fc5202eaef41 | [
"MIT"
] | null | null | null | AssertionError = None
assert False
| 11.666667 | 21 | 0.828571 | 4 | 35 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 35 | 2 | 22 | 17.5 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
d3ca7906f906df31f8147b04179f0d938cc33965 | 138 | py | Python | ctgan/__main__.py | ljk423/ctgan-tf | 916ae47e1932d5dd76152fc9e59e9de88142590d | [
"MIT"
] | 2 | 2020-11-26T18:59:04.000Z | 2021-06-05T04:39:51.000Z | ctgan/__main__.py | ljk423/ctgan-tf | 916ae47e1932d5dd76152fc9e59e9de88142590d | [
"MIT"
] | null | null | null | ctgan/__main__.py | ljk423/ctgan-tf | 916ae47e1932d5dd76152fc9e59e9de88142590d | [
"MIT"
] | 1 | 2021-01-17T15:20:39.000Z | 2021-01-17T15:20:39.000Z | """
Module that proxies the execution to the command-line interface.
"""
from ctgan import cli
if __name__ == '__main__':
    cli.cli()
| 15.333333 | 64 | 0.695652 | 19 | 138 | 4.631579 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188406 | 138 | 8 | 65 | 17.25 | 0.785714 | 0.463768 | 0 | 0 | 0 | 0 | 0.121212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
d3fae4615a096c53ef6a5fe1288d7bfeeebfe493 | 154 | py | Python | python-ds-practice/01_product/product.py | SteviBee/SpringBoard_Exercises | 5366390a2dad85b0a243f24a7f8536ff4a12bc7a | [
"Apache-2.0"
] | null | null | null | python-ds-practice/01_product/product.py | SteviBee/SpringBoard_Exercises | 5366390a2dad85b0a243f24a7f8536ff4a12bc7a | [
"Apache-2.0"
] | null | null | null | python-ds-practice/01_product/product.py | SteviBee/SpringBoard_Exercises | 5366390a2dad85b0a243f24a7f8536ff4a12bc7a | [
"Apache-2.0"
] | null | null | null | def product(a, b):
"""Return product of a and b.
>>> product(2, 2)
4
>>> product(2, -2)
-4
"""
return a * b | 14 | 33 | 0.38961 | 21 | 154 | 2.857143 | 0.428571 | 0.066667 | 0.3 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 0.441558 | 154 | 11 | 34 | 14 | 0.627907 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
31063b1ddb78a10b05ae8b60ab12ccc6d85e9613 | 251 | py | Python | spherov2/toy/bb8.py | wwj718/spherov2.py | baa62211ad4691e7eb6593ff77d6b8a73856e695 | [
"MIT"
] | null | null | null | spherov2/toy/bb8.py | wwj718/spherov2.py | baa62211ad4691e7eb6593ff77d6b8a73856e695 | [
"MIT"
] | null | null | null | spherov2/toy/bb8.py | wwj718/spherov2.py | baa62211ad4691e7eb6593ff77d6b8a73856e695 | [
"MIT"
] | null | null | null | from spherov2.commands.core import Core
from spherov2.toy.ollie import Ollie
from spherov2.types import ToyType
class BB8(Ollie):
    toy_type = ToyType('BB-8', 'BB-', 'BB', .06)

    get_factory_config_block_crc = Core.get_factory_config_block_crc
| 25.1 | 68 | 0.76494 | 39 | 251 | 4.692308 | 0.512821 | 0.196721 | 0.174863 | 0.229508 | 0.262295 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0.135458 | 251 | 9 | 69 | 27.888889 | 0.81106 | 0 | 0 | 0 | 0 | 0 | 0.035857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
31135e00934fe5c0c0dafb3601cebf2255d94a8f | 6,592 | py | Python | Verkefni2/verk3.py | Kristberg/VEF-2VF05CU | f1d82106a6d2488412896e00e3bb64a3e70b1dd1 | [
"MIT"
] | null | null | null | Verkefni2/verk3.py | Kristberg/VEF-2VF05CU | f1d82106a6d2488412896e00e3bb64a3e70b1dd1 | [
"MIT"
] | null | null | null | Verkefni2/verk3.py | Kristberg/VEF-2VF05CU | f1d82106a6d2488412896e00e3bb64a3e70b1dd1 | [
"MIT"
] | null | null | null | import os
from flask import *
app = Flask(__name__)
@app.route("/")
def home():
    return render_template_string("""
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="{{ url_for('static', filename='main.css')}}"/>
<title>Hasarfréttir - VEFÞ2VF05CU Verkefni 3</title>
</head>
<body>
<header class="header">
<h1></h1>
<nav><a href="/">Fréttir</a>
| <a href="http://jinja.pocoo.org/docs/2.10/">Jinja Templates</a> | <a href="https://github.com/vefthroun/vefthroun.github.io/wiki">Bjargir</a> | <a href="/kvikmyndir">Kvikmyndir í Bíó</a> <!-- þessi síða er ekki til ennþá -->
</nav>
</header>
<main>
<div class="row">
<section>
<h1 class="imp">Fellibylurinn "Florence" veldur óreiðu í Flórída</h1>
<img src="/static/mynd0.jpg">
</section>
<section class="top">
<h3>Hasarfréttir dagsins</h3>
<ul>
<li><a href="/frett1"> Fellibylurinn "Florence" veldur óreiðu í Flórída </a></li>
<li><a href="/frett2"> Veiðin er dræm þetta haustið </a></li>
<li><a href="/frett3"> Ólafía stendur sig vel </a></li>
<li><a href="/frett4"> Ísland dottið úr leik </a></li>
</ul>
</section>
</div>
</main>
<footer class="footer">
<h5>hasarfrettir.is - aðalstræti 1 - 101 reykjavík - hasar@frettir.is</h5>
<h6>© Copyright 2019 by <a href="https://gjg.github.io/">GJG</a>.
</footer>
</body>
</html>""")
@app.route("/frett1")
def frett1():
return """
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="/static/styles.css" />
<title> Fellibylurinn "Florence" veldur óreiðu á Flórída - VEFÞ2VF05CU</title>
</head>
<body>
<header class="haus">
<h1 >VEFÞ2VF05CU</h1>
<nav><a href="/">Fréttir</a>
| <a href="http://jinja.pocoo.org/docs/2.10/">Jinja Templates</a>
| <a href="https://github.com/vefthroun/vefthroun.github.io/wiki">Bjargir</a>
| <a href="/kvikmyndir">Kvikmyndir í Bíó</a> <!-- þessi síða er ekki til ennþá -->
</nav>
</header>
<main>
<div class="row">
<section>
<h1>Fellibylurinn "Florence" veldur óreiðu á Flórída</h1>
<img src="/static/mynd0.jpg">
</section>
<section class="top">
<p>Það er bara helv... vesen á fellibylnum og allt í klessu í Flórída. Milljónir manna þurftu að yfirgefa heimili sin vegna yfirvofandi eyðileggingar fellibylsins "Florence"...<p>
<p>dsg@frettir.is</p>
<h5><a href="/">Forsíða</a></h5>
</section>
</div>
</main>
<footer class="footer">
<h5>hasarfrettir.is - aðalstræti 1 - 101 reykjavík - hasar@frettir.is</h5>
<h6>© Copyright 2019 by <a href="https://gjg.github.io/">GJG</a>.
</footer>
</body>
</html>
"""
@app.route("/frett2")
def frett2():
return """
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="/static/styles.css" />
<title> Veiðin er dræm þetta haustið - VEFÞ2VF05CU</title>
</head>
<body>
<header class="header">
<h1 >VEFÞ2VF05CU</h1>
<nav><a href="/">Fréttir</a>
| <a href="http://jinja.pocoo.org/docs/2.10/">Jinja Templates</a>
| <a href="https://github.com/vefthroun/vefthroun.github.io/wiki">Bjargir</a>
| <a href="/kvikmyndir">Kvikmyndir í Bíó</a> <!-- þessi síða er ekki til ennþá -->
</nav>
</header>
<main>
<div class="row">
<section>
<h1>Veiðin er dræm þetta haustið</h1>
<img src="/static/mynd1.jpg">
</section>
<section class="top">
<p>Veiðin hefur heldur verið döpur þetta haustið þrátt fyrir ágætis rigninar upp á síðkastið...<p>
<p>est@frettir.is</p>
<h5><a href="/">Forsíða</a></h5>
</section>
</div>
</main>
<footer class="footer">
<h5>hasarfrettir.is - aðalstræti 1 - 101 reykjavík - hasar@frettir.is</h5>
<h6>© Copyright 2019 by <a href="https://gjg.github.io/">GJG</a>.
</footer>
</body>
</html>
"""
@app.route("/frett3")
def frett3():
return """
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="/static/styles.css" />
<title> Ólafía stendur sig vel - VEFÞ2VF05CU</title>
</head>
<body>
<header class="header">
<h1>VEFÞ2VF05CU</h1>
<nav><a href="/">Fréttir</a>
| <a href="http://jinja.pocoo.org/docs/2.10/">Jinja Templates</a>
| <a href="https://github.com/vefthroun/vefthroun.github.io/wiki">Bjargir</a>
| <a href="/kvikmyndir">Kvikmyndir í Bíó</a> <!-- þessi síða er ekki til ennþá -->
</nav>
</header>
<main>
<div class="row">
<section>
<h1>Ólafía stendur sig vel</h1>
<img src="/static/mynd2.jpg">
</section>
<section class="top">
<p>Ólafía er komin í 65 sæti peningalistans og hefur því tryggt sér keppnisrétt á LPG mótaröðinni á komandi keppnistimabili...<p>
<p>htg@frettir.is</p>
<h5><a href="/">Forsíða</a></h5>
</section>
</div>
</main>
<footer class="footer">
<h5>hasarfrettir.is - aðalstræti 1 - 101 reykjavík - hasar@frettir.is</h5>
<h6>© Copyright 2019 by <a href="https://gjg.github.io/">GJG</a>.
</footer>
"""
@app.route("/frett4")
def frett4():
return """
<head>
<meta charset="utf-8">
<link rel="stylesheet" href="/static/styles.css" />
<title> Ísland dottið úr leik - VEFÞ2VF05CU</title>
</head>
<body>
<header class="header">
<h1> VEFÞ2VF05CU </h1>
<nav><a href="/">Fréttir</a>
| <a href="http://jinja.pocoo.org/docs/2.10/">Jinja Templates</a>
| <a href="https://github.com/vefthroun/vefthroun.github.io/wiki">Bjargir</a>
| <a href="/kvikmyndir">Kvikmyndir í Bíó</a> <!-- þessi síða er ekki til ennþá -->
</nav>
</header>
<main>
<div class="row">
<section>
<h1>Ísland dottið úr leik</h1>
<img src="/static/mynd3.jpg">
</section>
<section class="top">
<p>Íslenska karlalandsliðið í körfubolta er dottið úr leik a Eurobasket þrátt fyrir ágætis spretti inn a milli. Ísland spilaði lokaleik sinn á mótinu fyrir troðfullri höll gegn heimamönnum Finnum..<p>
<p>dsg@frettir.is</p>
<h5><a href="/">Forsíða</a></h5>
</section>
</div>
</main>
<footer class="footer">
<h5>hasarfrettir.is - aðalstræti 1 - 101 reykjavík - hasar@frettir.is</h5>
<h6>© Copyright 2019 by <a href="https://gjg.github.io/">GJG</a>.
</footer>
"""
@app.errorhandler(404)
def page_not_found(e):
return """<h1>Error 404, vefsíða ekki fundin eða ekki til.</h1>""", 404
if __name__ == "__main__":
    # app.run(debug=True, use_reloader=True)
    app.run() | 29.560538 | 231 | 0.598604 | 887 | 6,592 | 4.42841 | 0.224352 | 0.042006 | 0.022912 | 0.022912 | 0.729633 | 0.703666 | 0.636456 | 0.62831 | 0.62831 | 0.618126 | 0 | 0.03116 | 0.206462 | 6,592 | 223 | 232 | 29.560538 | 0.719748 | 0.005765 | 0 | 0.736842 | 0 | 0.142105 | 0.932206 | 0.11457 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031579 | false | 0 | 0.010526 | 0.031579 | 0.073684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
311de5bb5b76966c776cfbef10061b773458d76e | 270 | py | Python | parlai/agents/programr/config/brain/semantic_similarity.py | roholazandie/ParlAI | 32352cab81ecb666aefd596232c5ed9f33cbaeb9 | [
"MIT"
] | null | null | null | parlai/agents/programr/config/brain/semantic_similarity.py | roholazandie/ParlAI | 32352cab81ecb666aefd596232c5ed9f33cbaeb9 | [
"MIT"
] | null | null | null | parlai/agents/programr/config/brain/semantic_similarity.py | roholazandie/ParlAI | 32352cab81ecb666aefd596232c5ed9f33cbaeb9 | [
"MIT"
] | null | null | null | from dataclasses import dataclass, field
from parlai.agents.programr.config.base import BaseConfigurationData
@dataclass
class BrainSemanticSimilarityConfiguration(BaseConfigurationData):
    method: None = field(default=None)
    model_dir: str = field(default=".")
| 27 | 68 | 0.803704 | 27 | 270 | 8 | 0.703704 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114815 | 270 | 9 | 69 | 30 | 0.903766 | 0 | 0 | 0 | 0 | 0 | 0.003704 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
315eb684f31c9e1164adfd440362bbcafc086cdc | 112 | py | Python | src/api/pdi/application/operation/GetDataOperation/GetDataOperationRequest.py | ahmetcagriakca/pythondataintegrator | 079b968d6c893008f02c88dbe34909a228ac1c7b | [
"MIT"
] | 1 | 2020-12-18T21:37:28.000Z | 2020-12-18T21:37:28.000Z | src/api/pdi/application/operation/GetDataOperation/GetDataOperationRequest.py | ahmetcagriakca/pythondataintegrator | 079b968d6c893008f02c88dbe34909a228ac1c7b | [
"MIT"
] | null | null | null | src/api/pdi/application/operation/GetDataOperation/GetDataOperationRequest.py | ahmetcagriakca/pythondataintegrator | 079b968d6c893008f02c88dbe34909a228ac1c7b | [
"MIT"
] | 1 | 2020-12-18T21:37:31.000Z | 2020-12-18T21:37:31.000Z | from pdip.cqrs.decorators import requestclass
@requestclass
class GetDataOperationRequest:
    Id: int = None
| 16 | 45 | 0.794643 | 12 | 112 | 7.416667 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151786 | 112 | 6 | 46 | 18.666667 | 0.936842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
31882755e774f4fef5492f08539a75995dffae43 | 132 | py | Python | example/07.Light_matrix/light_matrix.off.py | rundhall/PC-LEGO-SPIKE-Simulator | 5b2fae19293875b2f60d599940d77237700798d3 | [
"MIT"
] | null | null | null | example/07.Light_matrix/light_matrix.off.py | rundhall/PC-LEGO-SPIKE-Simulator | 5b2fae19293875b2f60d599940d77237700798d3 | [
"MIT"
] | null | null | null | example/07.Light_matrix/light_matrix.off.py | rundhall/PC-LEGO-SPIKE-Simulator | 5b2fae19293875b2f60d599940d77237700798d3 | [
"MIT"
] | null | null | null | off()
Turns off all of the pixels on the Light Matrix.
Example
from spike import PrimeHub
hub = PrimeHub()
hub.light_matrix.off()
| 14.666667 | 48 | 0.757576 | 22 | 132 | 4.5 | 0.681818 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 132 | 8 | 49 | 16.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
3198e6fdef076299f8559090c6baa6bf129c3732 | 1,604 | py | Python | parsers/pddl/solver_planning_domains/SPDGrammarVisitor.py | MikeCoder96/EmbASP-Python | 60898633db9fa3786989bfe7b4c77c5381315e7a | [
"MIT"
] | 1 | 2020-03-28T14:28:40.000Z | 2020-03-28T14:28:40.000Z | parsers/pddl/solver_planning_domains/SPDGrammarVisitor.py | MikeCoder96/EmbASP-Python | 60898633db9fa3786989bfe7b4c77c5381315e7a | [
"MIT"
] | null | null | null | parsers/pddl/solver_planning_domains/SPDGrammarVisitor.py | MikeCoder96/EmbASP-Python | 60898633db9fa3786989bfe7b4c77c5381315e7a | [
"MIT"
] | 5 | 2020-03-29T15:48:05.000Z | 2021-09-07T22:05:04.000Z | # Generated from SPDGrammar.g4 by ANTLR 4.7
from antlr4 import *
# This class defines a complete generic visitor for a parse tree produced by SPDGrammarParser.
class SPDGrammarVisitor(ParseTreeVisitor):

    # Visit a parse tree produced by SPDGrammarParser#array.
    def visitArray(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#json.
    def visitJson(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#oBjEcT.
    def visitOBjEcT(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#pair.
    def visitPair(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#ArrayValue.
    def visitArrayValue(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#BooleanValue.
    def visitBooleanValue(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#IntegerValue.
    def visitIntegerValue(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#NullValue.
    def visitNullValue(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#ObjectValue.
    def visitObjectValue(self, ctx):
        return self.visitChildren(ctx)

    # Visit a parse tree produced by SPDGrammarParser#StringValue.
    def visitStringValue(self, ctx):
        return self.visitChildren(ctx)
| 27.655172 | 94 | 0.719451 | 188 | 1,604 | 6.138298 | 0.265957 | 0.057192 | 0.095321 | 0.171577 | 0.672444 | 0.672444 | 0.612652 | 0.577123 | 0.577123 | 0.577123 | 0 | 0.003167 | 0.212594 | 1,604 | 57 | 95 | 28.140351 | 0.91053 | 0.441397 | 0 | 0.454545 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.454545 | false | 0 | 0.045455 | 0.454545 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 4 |
31adbc7b7c39ef8e9e1693031b88da23beffe360 | 28 | py | Python | pinax_theme_bootstrap/__init__.py | dstufft/pinax-theme-bootstrap | 6c30b67aad7a626ab7b3eee75b3db59df056b97e | [
"Apache-2.0"
] | 1 | 2020-02-08T11:04:18.000Z | 2020-02-08T11:04:18.000Z | pinax_theme_bootstrap/__init__.py | dstufft/pinax-theme-bootstrap | 6c30b67aad7a626ab7b3eee75b3db59df056b97e | [
"Apache-2.0"
] | null | null | null | pinax_theme_bootstrap/__init__.py | dstufft/pinax-theme-bootstrap | 6c30b67aad7a626ab7b3eee75b3db59df056b97e | [
"Apache-2.0"
] | null | null | null | __version__ = "2.0.2.post1"
| 14 | 27 | 0.678571 | 5 | 28 | 3 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 0.107143 | 28 | 1 | 28 | 28 | 0.44 | 0 | 0 | 0 | 0 | 0 | 0.392857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
31e981d24c03ac22951eef6fdc046e31db761a1d | 235 | py | Python | Trakttv.bundle/Contents/Libraries/Shared/plugin/sync/modes/__init__.py | disrupted/Trakttv.bundle | 24712216c71f3b22fd58cb5dd89dad5bb798ed60 | [
"RSA-MD"
] | 1,346 | 2015-01-01T14:52:24.000Z | 2022-03-28T12:50:48.000Z | Trakttv.bundle/Contents/Libraries/Shared/plugin/sync/modes/__init__.py | alcroito/Plex-Trakt-Scrobbler | 4f83fb0860dcb91f860d7c11bc7df568913c82a6 | [
"RSA-MD"
] | 474 | 2015-01-01T10:27:46.000Z | 2022-03-21T12:26:16.000Z | Trakttv.bundle/Contents/Libraries/Shared/plugin/sync/modes/__init__.py | alcroito/Plex-Trakt-Scrobbler | 4f83fb0860dcb91f860d7c11bc7df568913c82a6 | [
"RSA-MD"
] | 191 | 2015-01-02T18:27:22.000Z | 2022-03-29T10:49:48.000Z | from plugin.sync.modes.fast_pull import FastPull
from plugin.sync.modes.pull import Pull
from plugin.sync.modes.push import Push
from plugin.sync.modes.full import Full
__all__ = [
    'FastPull',
    'Pull',
    'Push',
    'Full'
] | 19.583333 | 48 | 0.714894 | 34 | 235 | 4.794118 | 0.323529 | 0.245399 | 0.343558 | 0.466258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.174468 | 235 | 12 | 49 | 19.583333 | 0.840206 | 0 | 0 | 0 | 0 | 0 | 0.084746 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
9ece944b1fa86147476ed01c0ea5b0fc1e6295ee | 402 | py | Python | samples/LuceneInAction/PositionalPorterStopAnalyzerTest.py | romanchyla/pylucene-trunk | 990079ff0c76b972ce5ef2bac9b85334a0a1f27a | [
"Apache-2.0"
] | 15 | 2015-05-21T09:28:01.000Z | 2022-03-18T23:41:49.000Z | samples/LuceneInAction/PositionalPorterStopAnalyzerTest.py | fnp/pylucene | fb16ac375de5479dec3919a5559cda02c899e387 | [
"Apache-2.0"
] | 1 | 2021-09-30T03:59:43.000Z | 2021-09-30T03:59:43.000Z | samples/LuceneInAction/PositionalPorterStopAnalyzerTest.py | romanchyla/pylucene-trunk | 990079ff0c76b972ce5ef2bac9b85334a0a1f27a | [
"Apache-2.0"
] | 13 | 2015-04-18T23:05:11.000Z | 2021-11-29T21:23:26.000Z |
import os, sys, unittest, lucene
lucene.initVM()
sys.path.append(os.path.dirname(os.path.abspath(sys.argv[0])))
from lia.analysis.positional.PositionalPorterStopAnalyzerTest import \
PositionalPorterStopAnalyzerTest
PositionalPorterStopAnalyzerTest.main()
import lia.analysis.positional.PositionalPorterStopAnalyzerTest
unittest.main(lia.analysis.positional.PositionalPorterStopAnalyzerTest)
| 28.714286 | 71 | 0.843284 | 39 | 402 | 8.692308 | 0.461538 | 0.097345 | 0.185841 | 0.469027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00266 | 0.064677 | 402 | 13 | 72 | 30.923077 | 0.898936 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
730558a064c2d4d1a2dff57b1b3640d308d4583b | 45,987 | py | Python | methods.py | SystemFail/godot | 926495d8eb11de2878319ba54f30812fa1637d72 | [
"CC-BY-3.0",
"MIT"
] | null | null | null | methods.py | SystemFail/godot | 926495d8eb11de2878319ba54f30812fa1637d72 | [
"CC-BY-3.0",
"MIT"
] | null | null | null | methods.py | SystemFail/godot | 926495d8eb11de2878319ba54f30812fa1637d72 | [
"CC-BY-3.0",
"MIT"
] | null | null | null | import os
def add_source_files(self, sources, filetype, lib_env=None, shared=False):
    import glob
    import string

    # if not lib_objects:
    if not lib_env:
        lib_env = self

    if type(filetype) == type(""):
        dir = self.Dir('.').abspath
        list = glob.glob(dir + "/" + filetype)
        for f in list:
            sources.append(self.Object(f))
    else:
        for f in filetype:
            sources.append(self.Object(f))
def build_shader_header(target, source, env):
    for x in source:
        print(x)
        name = str(x)
        name = name[name.rfind("/") + 1:]
        name = name[name.rfind("\\") + 1:]
        name = name.replace(".", "_")
        fs = open(str(x), "r")
        fd = open(str(x) + ".h", "w")
        fd.write("/* this file has been generated by SCons, do not edit! */\n")
        fd.write("static const char *" + name + "=\n")
        line = fs.readline()
        while line:
            line = line.replace("\r", "")
            line = line.replace("\n", "")
            line = line.replace("\\", "\\\\")
            line = line.replace("\"", "\\\"")
            fd.write("\"" + line + "\\n\"\n")
            line = fs.readline()
        fd.write(";\n")

    return 0
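The per-line escaping that `build_shader_header` applies when embedding shader source as a C string literal can be isolated as a small sketch (`escape_shader_line` is a hypothetical helper name, not part of the original file):

```python
# Minimal sketch of the escaping loop in build_shader_header above.
def escape_shader_line(line):
    line = line.replace("\r", "")
    line = line.replace("\n", "")
    # Backslashes must be doubled before quotes are escaped, otherwise the
    # backslash introduced by the quote escape would itself get doubled.
    line = line.replace("\\", "\\\\")
    line = line.replace("\"", "\\\"")
    # Each source line becomes one C string-literal fragment ending in '\n'.
    return "\"" + line + "\\n\"\n"

print(escape_shader_line('gl_FragColor = vec4(1.0);\n'))
```

The ordering matters: doing the quote escape first would corrupt the output, since the backslash it inserts would then be doubled by the backslash escape.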
def build_glsl_header(filename):

    fs = open(filename, "r")
    line = fs.readline()

    vertex_lines = []
    fragment_lines = []
    uniforms = []
    attributes = []
    fbos = []
    conditionals = []
    texunits = []
    texunit_names = []
    ubos = []
    ubo_names = []

    reading = ""
    line_offset = 0
    vertex_offset = 0
    fragment_offset = 0

    while line:

        if line.find("[vertex]") != -1:
            reading = "vertex"
            line = fs.readline()
            line_offset += 1
            vertex_offset = line_offset
            continue

        if line.find("[fragment]") != -1:
            reading = "fragment"
            line = fs.readline()
            line_offset += 1
            fragment_offset = line_offset
            continue

        if line.find("#ifdef ") != -1:
            ifdefline = line.replace("#ifdef ", "").strip()
            if not ifdefline in conditionals:
                conditionals += [ifdefline]

        if line.find("#elif defined(") != -1:
            ifdefline = line.replace("#elif defined(", "").strip()
            ifdefline = ifdefline.replace(")", "").strip()
            if not ifdefline in conditionals:
                conditionals += [ifdefline]

        import re
        if re.search(r"^\s*uniform", line):

            if line.lower().find("texunit:") != -1:
                # texture unit
                texunit = str(int(line[line.find(":") + 1:].strip()))
                uline = line[:line.lower().find("//")]
                uline = uline.replace("uniform", "")
                uline = uline.replace(";", "")
                lines = uline.split(",")
                for x in lines:
                    x = x.strip()
                    x = x[x.rfind(" ") + 1:]
                    if x.find("[") != -1:
                        # uniform array
                        x = x[:x.find("[")]
                    if not x in texunit_names:
                        texunits += [(x, texunit)]
                        texunit_names += [x]
            elif line.lower().find("ubo:") != -1:
                # ubo
                uboidx = str(int(line[line.find(":") + 1:].strip()))
                uline = line[:line.lower().find("//")]
                uline = uline[uline.find("uniform") + len("uniform"):]
                uline = uline.replace(";", "")
                uline = uline.replace("{", "")
                lines = uline.split(",")
                for x in lines:
                    x = x.strip()
                    x = x[x.rfind(" ") + 1:]
                    if x.find("[") != -1:
                        # uniform array
                        x = x[:x.find("[")]
                    if not x in ubo_names:
                        ubos += [(x, uboidx)]
                        ubo_names += [x]
            else:
                uline = line.replace("uniform", "")
                uline = uline.replace(";", "")
                lines = uline.split(",")
                for x in lines:
                    x = x.strip()
                    x = x[x.rfind(" ") + 1:]
                    if x.find("[") != -1:
                        # uniform array
                        x = x[:x.find("[")]
                    if not x in uniforms:
                        uniforms += [x]

        if (line.strip().find("in ") == 0 or line.strip().find("attribute ") == 0) and line.find("attrib:") != -1:
            uline = line.replace("in ", "")
            uline = uline.replace("attribute ", "")
            uline = uline.replace(";", "")
            uline = uline[uline.find(" "):].strip()
            if uline.find("//") != -1:
                name, bind = uline.split("//")
                if bind.find("attrib:") != -1:
                    name = name.strip()
                    bind = bind.replace("attrib:", "").strip()
                    attributes += [(name, bind)]

        if line.strip().find("out ") == 0:
            uline = line.replace("out", "").strip()
            uline = uline.replace(";", "")
            uline = uline[uline.find(" "):].strip()
            if uline.find("//") != -1:
                name, bind = uline.split("//")
                if bind.find("drawbuffer:") != -1:
                    name = name.strip()
                    bind = bind.replace("drawbuffer:", "").strip()
                    fbos += [(name, bind)]

        line = line.replace("\r", "")
        line = line.replace("\n", "")
        line = line.replace("\\", "\\\\")
        line = line.replace("\"", "\\\"")
        # line = line + "\\n\\"  # no need to anymore

        if reading == "vertex":
            vertex_lines += [line]

        if reading == "fragment":
            fragment_lines += [line]

        line = fs.readline()
        line_offset += 1

    fs.close()
    out_file = filename + ".h"
    fd = open(out_file, "w")
    fd.write("/* WARNING, THIS FILE WAS GENERATED, DO NOT EDIT */\n")

    out_file_base = out_file
    out_file_base = out_file_base[out_file_base.rfind("/") + 1:]
    out_file_base = out_file_base[out_file_base.rfind("\\") + 1:]
    # print("out file " + out_file + " base " + out_file_base)
    out_file_ifdef = out_file_base.replace(".", "_").upper()
    fd.write("#ifndef " + out_file_ifdef + "\n")
    fd.write("#define " + out_file_ifdef + "\n")
    out_file_class = out_file_base.replace(".glsl.h", "").title().replace("_", "").replace(".", "") + "ShaderGL"
    fd.write("\n\n")
    fd.write("#include \"drivers/opengl/shader_gl.h\"\n\n\n")
    fd.write("class " + out_file_class + " : public ShaderGL {\n\n")
    fd.write("\t virtual String get_shader_name() const { return \"" + out_file_class + "\"; }\n")
    fd.write("public:\n\n")

    if len(conditionals):
        fd.write("\tenum Conditionals {\n")
        for x in conditionals:
            fd.write("\t\t" + x + ",\n")
        fd.write("\t};\n\n")
    if len(uniforms):
        fd.write("\tenum Uniforms {\n")
        for x in uniforms:
            fd.write("\t\t" + x.upper() + ",\n")
        fd.write("\t};\n\n")

    fd.write("\t_FORCE_INLINE_ int get_uniform(Uniforms p_uniform) const { return _get_uniform(p_uniform); }\n\n")
    if len(conditionals):
        fd.write("\t_FORCE_INLINE_ void set_conditional(Conditionals p_conditional,bool p_enable) { _set_conditional(p_conditional,p_enable); }\n\n")
    fd.write("\t#define _FU if (get_uniform(p_uniform)<0) return; ERR_FAIL_COND( get_active()!=this );\n\n ")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, double p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    # fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint64_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    # fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int64_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, unsigned long p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, long p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Color& p_color) { _FU GLfloat col[4]={p_color.r,p_color.g,p_color.b,p_color.a}; glUniform4fv(get_uniform(p_uniform),1,col); }\n\n")
    fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector2& p_vec2) { _FU GLfloat vec2[2]={p_vec2.x,p_vec2.y}; glUniform2fv(get_uniform(p_uniform),1,vec2); }\n\n")
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector3& p_vec3) { _FU GLfloat vec3[3]={p_vec3.x,p_vec3.y,p_vec3.z}; glUniform3fv(get_uniform(p_uniform),1,vec3); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b) { _FU glUniform2f(get_uniform(p_uniform),p_a,p_b); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c) { _FU glUniform3f(get_uniform(p_uniform),p_a,p_b,p_c); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c, float p_d) { _FU glUniform4f(get_uniform(p_uniform),p_a,p_b,p_c,p_d); }\n\n");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform& p_transform) { _FU
const Transform &tr = p_transform;
GLfloat matrix[16]={ /* build a 4x4 matrix, column-major */
tr.basis.elements[0][0],
tr.basis.elements[1][0],
tr.basis.elements[2][0],
0,
tr.basis.elements[0][1],
tr.basis.elements[1][1],
tr.basis.elements[2][1],
0,
tr.basis.elements[0][2],
tr.basis.elements[1][2],
tr.basis.elements[2][2],
0,
tr.origin.x,
tr.origin.y,
tr.origin.z,
1
};
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}
""");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Matrix32& p_transform) { _FU
const Matrix32 &tr = p_transform;
GLfloat matrix[16]={ /* build a 4x4 matrix, column-major */
tr.elements[0][0],
tr.elements[0][1],
0,
0,
tr.elements[1][0],
tr.elements[1][1],
0,
0,
0,
0,
1,
0,
tr.elements[2][0],
tr.elements[2][1],
0,
1
};
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}
""");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const CameraMatrix& p_matrix) { _FU
GLfloat matrix[16];
for (int i=0;i<4;i++) {
for (int j=0;j<4;j++) {
matrix[i*4+j]=p_matrix.matrix[i][j];
}
}
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}; """);
fd.write("\n\n#undef _FU\n\n\n");
fd.write("\tvirtual void init() {\n\n");
if (len(conditionals)):
fd.write("\t\tstatic const char* _conditional_strings[]={\n")
if (len(conditionals)):
for x in conditionals:
fd.write("\t\t\t\"#define "+x+"\\n\",\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_conditional_strings=NULL;\n")
if (len(uniforms)):
fd.write("\t\tstatic const char* _uniform_strings[]={\n")
if (len(uniforms)):
for x in uniforms:
fd.write("\t\t\t\""+x+"\",\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_uniform_strings=NULL;\n")
if (len(attributes)):
fd.write("\t\tstatic AttributePair _attribute_pairs[]={\n")
for x in attributes:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic AttributePair *_attribute_pairs=NULL;\n")
if (len(fbos)):
fd.write("\t\tstatic FBOPair _fbo_pairs[]={\n")
for x in fbos:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic FBOPair *_fbo_pairs=NULL;\n")
if (len(ubos)):
fd.write("\t\tstatic UBOPair _ubo_pairs[]={\n")
for x in ubos:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic UBOPair *_ubo_pairs=NULL;\n")
if (len(texunits)):
fd.write("\t\tstatic TexUnitPair _texunit_pairs[]={\n")
for x in texunits:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic TexUnitPair *_texunit_pairs=NULL;\n")
fd.write("\t\tstatic const char* _vertex_code=\"\\\n")
for x in vertex_lines:
fd.write("\t\t\t"+x+"\n");
fd.write("\t\t\";\n\n");
fd.write("\t\tstatic const int _vertex_code_start="+str(vertex_offset)+";\n")
fd.write("\t\tstatic const char* _fragment_code=\"\\\n")
for x in fragment_lines:
fd.write("\t\t\t"+x+"\n");
fd.write("\t\t\";\n\n");
fd.write("\t\tstatic const int _fragment_code_start="+str(fragment_offset)+";\n")
fd.write("\t\tsetup(_conditional_strings,"+str(len(conditionals))+",_uniform_strings,"+str(len(uniforms))+",_attribute_pairs,"+str(len(attributes))+",_fbo_pairs,"+str(len(fbos))+",_ubo_pairs,"+str(len(ubos))+",_texunit_pairs,"+str(len(texunits))+",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
fd.write("\t};\n\n")
fd.write("};\n\n");
fd.write("#endif\n\n");
fd.close();
def build_glsl_headers( target, source, env ):
for x in source:
build_glsl_header(str(x));
return 0
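A minimal standalone sketch of the name mangling performed by build_glsl_header above, assuming the usual `*.glsl` input names (the sample paths are hypothetical and for illustration only):

```python
def glsl_class_name(out_file):
    # Mirror of the class-name derivation in build_glsl_header: strip the
    # directory, drop the ".glsl.h" suffix, title-case, remove separators.
    base = out_file[out_file.rfind("/") + 1:]   # strip directory (POSIX)
    base = base[base.rfind("\\") + 1:]          # strip directory (win32)
    return base.replace(".glsl.h", "").title().replace("_", "").replace(".", "") + "ShaderGL"
```

For example, a hypothetical `drivers/gles2/material.glsl.h` yields `MaterialShaderGL`.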
def build_hlsl_dx9_header( filename ):
fs = open(filename,"r")
line=fs.readline()
vertex_lines=[]
fragment_lines=[]
uniforms=[]
fragment_uniforms=[]
attributes=[]
fbos=[]
conditionals=[]
reading=""
line_offset=0
vertex_offset=0
fragment_offset=0
while(line):
if (line.find("[vertex]")!=-1):
reading="vertex"
line=fs.readline()
line_offset+=1
vertex_offset=line_offset
continue
if (line.find("[fragment]")!=-1):
reading="fragment"
line=fs.readline()
line_offset+=1
fragment_offset=line_offset
continue
if (line.find("#ifdef ")!=-1):
ifdefline = line.replace("#ifdef ","").strip()
if (not ifdefline in conditionals):
conditionals+=[ifdefline]
if (line.find("#elif defined(")!=-1):
ifdefline = line.replace("#elif defined(","").strip()
ifdefline = ifdefline.replace(")","").strip()
if (not ifdefline in conditionals):
conditionals+=[ifdefline]
if (line.find("uniform")!=-1):
uline = line.replace("uniform","");
uline = uline.replace(";","");
lines = uline.split(",")
for x in lines:
x = x.strip()
x = x[ x.rfind(" ")+1: ]
if (x.find("[")!=-1):
#uniform array
x = x[ :x.find("[") ]
if (not x in uniforms):
uniforms+=[x]
fragment_uniforms+=[reading=="fragment"]
line=line.replace("\r","")
line=line.replace("\n","")
line=line.replace("\\","\\\\")
line=line.replace("\"","\\\"")
line=line+"\\n\\"
if (reading=="vertex"):
vertex_lines+=[line]
if (reading=="fragment"):
fragment_lines+=[line]
line=fs.readline()
line_offset+=1
fs.close();
out_file = filename+".h"
fd = open(out_file,"w")
fd.write("/* WARNING, THIS FILE WAS GENERATED, DO NOT EDIT */\n");
out_file_base = out_file
out_file_base = out_file_base[ out_file_base.rfind("/")+1: ]
out_file_base = out_file_base[ out_file_base.rfind("\\")+1: ]
# print("out file "+out_file+" base " +out_file_base)
out_file_ifdef = out_file_base.replace(".","_").upper()
fd.write("#ifndef "+out_file_ifdef+"\n")
fd.write("#define "+out_file_ifdef+"\n")
out_file_class = out_file_base.replace(".hlsl.h","").title().replace("_","").replace(".","")+"ShaderDX9";
fd.write("\n\n");
fd.write("#include \"drivers/directx9/shader_dx9.h\"\n\n\n");
fd.write("class "+out_file_class+" : public ShaderDX9 {\n\n");
fd.write("\t virtual String get_shader_name() const { return \""+out_file_class+"\"; }\n");
fd.write("public:\n\n");
if (len(conditionals)):
fd.write("\tenum Conditionals {\n");
for x in conditionals:
fd.write("\t\t"+x+",\n");
fd.write("\t};\n\n");
if (len(uniforms)):
fd.write("\tenum Uniforms {\n");
for x in uniforms:
fd.write("\t\t"+x.upper()+",\n");
fd.write("\t};\n\n");
if (len(conditionals)):
fd.write("\t_FORCE_INLINE_ void set_conditional(Conditionals p_conditional,bool p_enable) { _set_conditional(p_conditional,p_enable); }\n\n");
fd.write("\t#define _FU if (!_uniform_valid(p_uniform)) return; ERR_FAIL_COND( get_active()!=this );\n\n ");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, bool p_value) { _FU set_uniformb(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_value) { _FU set_uniformf(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, double p_value) { _FU set_uniformf(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint8_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int8_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint16_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int16_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint32_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int32_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint64_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int64_t p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, unsigned long p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, long p_value) { _FU set_uniformi(p_uniform,p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Color& p_color) { _FU float col[4]={p_color.r,p_color.g,p_color.b,p_color.a}; set_uniformfv(p_uniform,col); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector2& p_vec2) { _FU float vec2[4]={p_vec2.x,p_vec2.y,0,0}; set_uniformfv(p_uniform,vec2); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector3& p_vec3) { _FU float vec3[4]={p_vec3.x,p_vec3.y,p_vec3.z,0}; set_uniformfv(p_uniform,vec3); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b) { _FU float vec2[4]={p_a,p_b,0,0}; set_uniformfv(p_uniform,vec2); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c) { _FU float vec3[4]={p_a,p_b,p_c,0}; set_uniformfv(p_uniform,vec3); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c, float p_d) { _FU float vec4[4]={p_a,p_b,p_c,p_d}; set_uniformfv(p_uniform,vec4); }\n\n");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform& p_transform) { _FU
const Transform &tr = p_transform;
float matrix[16]={ /* build a 4x4 matrix, row-major */
tr.basis.elements[0][0],
tr.basis.elements[0][1],
tr.basis.elements[0][2],
tr.origin.x,
tr.basis.elements[1][0],
tr.basis.elements[1][1],
tr.basis.elements[1][2],
tr.origin.y,
tr.basis.elements[2][0],
tr.basis.elements[2][1],
tr.basis.elements[2][2],
tr.origin.z,
0,
0,
0,
1
};
set_uniformfv(p_uniform,&matrix[0],4);
}
""");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const CameraMatrix& p_matrix) { _FU
float matrix[16];
for (int i=0;i<4;i++) {
for (int j=0;j<4;j++) {
matrix[i*4+j]=p_matrix.matrix[j][i];
}
}
set_uniformfv(p_uniform,&matrix[0],4);
}; """);
fd.write("\n\n#undef _FU\n\n\n");
fd.write("\tvirtual void init(IDirect3DDevice9 *p_device,ShaderSupport p_version) {\n\n");
if (len(conditionals)):
fd.write("\t\tstatic const char* _conditional_strings[]={\n")
if (len(conditionals)):
for x in conditionals:
fd.write("\t\t\t\""+x+"\",\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_conditional_strings=NULL;\n")
if (len(uniforms)):
fd.write("\t\tstatic const char* _uniform_strings[]={\n")
if (len(uniforms)):
for x in uniforms:
fd.write("\t\t\t\""+x+"\",\n");
fd.write("\t\t};\n\n");
fd.write("\t\tstatic const bool _fragment_uniforms[]={\n")
if (len(uniforms)):
for x in fragment_uniforms:
if (x):
fd.write("\t\t\ttrue,\n");
else:
fd.write("\t\t\tfalse,\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_uniform_strings=NULL;\n")
fd.write("\t\tstatic const bool *_fragment_uniforms=NULL;\n")
fd.write("\t\tstatic const char* _vertex_code=\"\\\n")
for x in vertex_lines:
fd.write("\t\t\t"+x+"\n");
fd.write("\t\t\";\n\n");
fd.write("\t\tstatic const int _vertex_code_start="+str(vertex_offset)+";\n")
fd.write("\t\tstatic const char* _fragment_code=\"\\\n")
for x in fragment_lines:
fd.write("\t\t\t"+x+"\n");
fd.write("\t\t\";\n\n");
fd.write("\t\tstatic const int _fragment_code_start="+str(fragment_offset)+";\n")
fd.write("\t\tsetup(p_device,p_version,_conditional_strings,"+str(len(conditionals))+",_uniform_strings,"+str(len(uniforms))+",_fragment_uniforms,_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
fd.write("\t};\n\n")
fd.write("};\n\n");
fd.write("#endif\n\n");
fd.close();
def build_hlsl_dx9_headers( target, source, env ):
for x in source:
build_hlsl_dx9_header(str(x));
return 0
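Both parsers above embed shader source into C string literals with the same escaping steps; a minimal sketch of that transformation, extracted for illustration:

```python
def escape_for_c_string(line):
    # Mirror of the per-line escaping in the header builders: strip the line
    # ending, then escape backslashes and double quotes so the text can be
    # embedded safely in a C string literal.
    line = line.replace("\r", "").replace("\n", "")
    line = line.replace("\\", "\\\\")
    line = line.replace("\"", "\\\"")
    return line
```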
class LegacyGLHeaderStruct:
def __init__(self):
self.vertex_lines=[]
self.fragment_lines=[]
self.uniforms=[]
self.attributes=[]
self.fbos=[]
self.conditionals=[]
self.enums={}
self.texunits=[]
self.texunit_names=[]
self.ubos=[]
self.ubo_names=[]
self.vertex_included_files=[]
self.fragment_included_files=[]
self.reading=""
self.line_offset=0
self.vertex_offset=0
self.fragment_offset=0
def include_file_in_legacygl_header( filename, header_data, depth ):
fs = open(filename,"r")
line=fs.readline()
while(line):
if (line.find("[vertex]")!=-1):
header_data.reading="vertex"
line=fs.readline()
header_data.line_offset+=1
header_data.vertex_offset=header_data.line_offset
continue
if (line.find("[fragment]")!=-1):
header_data.reading="fragment"
line=fs.readline()
header_data.line_offset+=1
header_data.fragment_offset=header_data.line_offset
continue
while(line.find("#include ")!=-1):
includeline = line.replace("#include ","").strip()[1:-1]
import os.path
included_file = os.path.relpath(os.path.dirname(filename) + "/" + includeline)
if (not included_file in header_data.vertex_included_files and header_data.reading=="vertex"):
header_data.vertex_included_files+=[included_file]
if(include_file_in_legacygl_header( included_file, header_data, depth + 1 ) == None):
print "Error in file '" + filename + "': #include " + includeline + " could not be found!"
elif (not included_file in header_data.fragment_included_files and header_data.reading=="fragment"):
header_data.fragment_included_files+=[included_file]
if(include_file_in_legacygl_header( included_file, header_data, depth + 1 ) == None):
print "Error in file '" + filename + "': #include " + includeline + " could not be found!"
line=fs.readline()
if (line.find("#ifdef ")!=-1 or line.find("#elif defined(")!=-1):
if (line.find("#ifdef ")!=-1):
ifdefline = line.replace("#ifdef ","").strip()
else:
ifdefline = line.replace("#elif defined(","").strip()
ifdefline = ifdefline.replace(")","").strip()
if (line.find("_EN_")!=-1):
enumbase = ifdefline[:ifdefline.find("_EN_")];
ifdefline = ifdefline.replace("_EN_","_")
line = line.replace("_EN_","_")
# print(enumbase+":"+ifdefline);
if (enumbase not in header_data.enums):
header_data.enums[enumbase]=[]
if (ifdefline not in header_data.enums[enumbase]):
header_data.enums[enumbase].append(ifdefline);
elif (not ifdefline in header_data.conditionals):
header_data.conditionals+=[ifdefline]
if (line.find("uniform")!=-1 and line.lower().find("texunit:")!=-1):
#texture unit
texunitstr = line[line.find(":")+1:].strip()
if (texunitstr=="auto"):
texunit="-1"
else:
texunit = str(int(texunitstr))
uline=line[:line.lower().find("//")]
uline = uline.replace("uniform","");
uline = uline.replace("highp","");
uline = uline.replace(";","");
lines = uline.split(",")
for x in lines:
x = x.strip()
x = x[ x.rfind(" ")+1: ]
if (x.find("[")!=-1):
#uniform array
x = x[ :x.find("[") ]
if (not x in header_data.texunit_names):
header_data.texunits+=[(x,texunit)]
header_data.texunit_names+=[x]
elif (line.find("uniform")!=-1):
uline = line.replace("uniform","");
uline = uline.replace(";","");
lines = uline.split(",")
for x in lines:
x = x.strip()
x = x[ x.rfind(" ")+1: ]
if (x.find("[")!=-1):
#uniform array
x = x[ :x.find("[") ]
if (not x in header_data.uniforms):
header_data.uniforms+=[x]
if ((line.strip().find("in ")==0 or line.strip().find("attribute ")==0) and line.find("attrib:")!=-1):
uline = line.replace("in ","");
uline = uline.replace("attribute ","");
uline = uline.replace("highp ","");
uline = uline.replace(";","");
uline = uline[ uline.find(" "): ].strip()
if (uline.find("//")!=-1):
name,bind = uline.split("//")
if (bind.find("attrib:")!=-1):
name=name.strip()
bind=bind.replace("attrib:","").strip()
header_data.attributes+=[(name,bind)]
line=line.replace("\r","")
line=line.replace("\n","")
#line=line.replace("\\","\\\\")
#line=line.replace("\"","\\\"")
#line=line+"\\n\\"
if (header_data.reading=="vertex"):
header_data.vertex_lines+=[line]
if (header_data.reading=="fragment"):
header_data.fragment_lines+=[line]
line=fs.readline()
header_data.line_offset+=1
fs.close();
return header_data
def build_legacygl_header( filename, include, class_suffix, output_attribs ):
header_data = LegacyGLHeaderStruct()
include_file_in_legacygl_header( filename, header_data, 0 )
out_file = filename+".h"
fd = open(out_file,"w")
enum_constants=[]
fd.write("/* WARNING, THIS FILE WAS GENERATED, DO NOT EDIT */\n");
out_file_base = out_file
out_file_base = out_file_base[ out_file_base.rfind("/")+1: ]
out_file_base = out_file_base[ out_file_base.rfind("\\")+1: ]
# print("out file "+out_file+" base " +out_file_base)
out_file_ifdef = out_file_base.replace(".","_").upper()
fd.write("#ifndef "+out_file_ifdef+class_suffix+"_120\n")
fd.write("#define "+out_file_ifdef+class_suffix+"_120\n")
out_file_class = out_file_base.replace(".glsl.h","").title().replace("_","").replace(".","")+"Shader"+class_suffix;
fd.write("\n\n");
fd.write("#include \"" + include + "\"\n\n\n");
fd.write("class "+out_file_class+" : public Shader"+class_suffix+" {\n\n");
fd.write("\t virtual String get_shader_name() const { return \""+out_file_class+"\"; }\n");
fd.write("public:\n\n");
if (len(header_data.conditionals)):
fd.write("\tenum Conditionals {\n");
for x in header_data.conditionals:
fd.write("\t\t"+x.upper()+",\n");
fd.write("\t};\n\n");
if (len(header_data.uniforms)):
fd.write("\tenum Uniforms {\n");
for x in header_data.uniforms:
fd.write("\t\t"+x.upper()+",\n");
fd.write("\t};\n\n");
fd.write("\t_FORCE_INLINE_ int get_uniform(Uniforms p_uniform) const { return _get_uniform(p_uniform); }\n\n");
if (len(header_data.conditionals)):
fd.write("\t_FORCE_INLINE_ void set_conditional(Conditionals p_conditional,bool p_enable) { _set_conditional(p_conditional,p_enable); }\n\n");
fd.write("\t#define _FU if (get_uniform(p_uniform)<0) return; ERR_FAIL_COND( get_active()!=this );\n\n ");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, double p_value) { _FU glUniform1f(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int8_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int16_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int32_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, uint64_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, int64_t p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, unsigned long p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
#fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, long p_value) { _FU glUniform1i(get_uniform(p_uniform),p_value); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Color& p_color) { _FU GLfloat col[4]={p_color.r,p_color.g,p_color.b,p_color.a}; glUniform4fv(get_uniform(p_uniform),1,col); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector2& p_vec2) { _FU GLfloat vec2[2]={p_vec2.x,p_vec2.y}; glUniform2fv(get_uniform(p_uniform),1,vec2); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Vector3& p_vec3) { _FU GLfloat vec3[3]={p_vec3.x,p_vec3.y,p_vec3.z}; glUniform3fv(get_uniform(p_uniform),1,vec3); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b) { _FU glUniform2f(get_uniform(p_uniform),p_a,p_b); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c) { _FU glUniform3f(get_uniform(p_uniform),p_a,p_b,p_c); }\n\n");
fd.write("\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, float p_a, float p_b, float p_c, float p_d) { _FU glUniform4f(get_uniform(p_uniform),p_a,p_b,p_c,p_d); }\n\n");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Transform& p_transform) { _FU
const Transform &tr = p_transform;
GLfloat matrix[16]={ /* build a 4x4 matrix, column-major */
tr.basis.elements[0][0],
tr.basis.elements[1][0],
tr.basis.elements[2][0],
0,
tr.basis.elements[0][1],
tr.basis.elements[1][1],
tr.basis.elements[2][1],
0,
tr.basis.elements[0][2],
tr.basis.elements[1][2],
tr.basis.elements[2][2],
0,
tr.origin.x,
tr.origin.y,
tr.origin.z,
1
};
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}
""");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const Matrix32& p_transform) { _FU
const Matrix32 &tr = p_transform;
GLfloat matrix[16]={ /* build a 4x4 matrix, column-major */
tr.elements[0][0],
tr.elements[0][1],
0,
0,
tr.elements[1][0],
tr.elements[1][1],
0,
0,
0,
0,
1,
0,
tr.elements[2][0],
tr.elements[2][1],
0,
1
};
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}
""");
fd.write("""\t_FORCE_INLINE_ void set_uniform(Uniforms p_uniform, const CameraMatrix& p_matrix) { _FU
GLfloat matrix[16];
for (int i=0;i<4;i++) {
for (int j=0;j<4;j++) {
matrix[i*4+j]=p_matrix.matrix[i][j];
}
}
glUniformMatrix4fv(get_uniform(p_uniform),1,false,matrix);
}; """);
fd.write("\n\n#undef _FU\n\n\n");
fd.write("\tvirtual void init() {\n\n");
enum_value_count=0;
if (len(header_data.enums)):
fd.write("\t\t//Written using math, given the non-standard handling of 64-bit integer constants..\n");
fd.write("\t\tstatic const Enum _enums[]={\n")
bitofs=len(header_data.conditionals)
enum_vals=[]
for xv in header_data.enums:
x=header_data.enums[xv]
bits=1
amt = len(x);
# print(x)
while(2**bits < amt):
bits+=1
# print("amount: "+str(amt)+" bits "+str(bits));
strs="{"
for i in range(amt):
strs+="\"#define "+x[i]+"\\n\","
v={}
v["set_mask"]="uint64_t("+str(i)+")<<"+str(bitofs)
v["clear_mask"]="((uint64_t(1)<<40)-1) ^ (((uint64_t(1)<<"+str(bits)+") - 1)<<"+str(bitofs)+")"
enum_vals.append(v)
enum_constants.append(x[i])
strs+="NULL}"
fd.write("\t\t\t{(uint64_t(1<<"+str(bits)+")-1)<<"+str(bitofs)+","+str(bitofs)+","+strs+"},\n");
bitofs+=bits
fd.write("\t\t};\n\n");
fd.write("\t\tstatic const EnumValue _enum_values[]={\n")
enum_value_count=len(enum_vals);
for x in enum_vals:
fd.write("\t\t\t{"+x["set_mask"]+","+x["clear_mask"]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const Enum *_enums=NULL;\n")
fd.write("\t\tstatic const EnumValue *_enum_values=NULL;\n")
if (len(header_data.conditionals)):
fd.write("\t\tstatic const char* _conditional_strings[]={\n")
if (len(header_data.conditionals)):
for x in header_data.conditionals:
fd.write("\t\t\t\"#define "+x+"\\n\",\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_conditional_strings=NULL;\n")
if (len(header_data.uniforms)):
fd.write("\t\tstatic const char* _uniform_strings[]={\n")
if (len(header_data.uniforms)):
for x in header_data.uniforms:
fd.write("\t\t\t\""+x+"\",\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic const char **_uniform_strings=NULL;\n")
if output_attribs:
if (len(header_data.attributes)):
fd.write("\t\tstatic AttributePair _attribute_pairs[]={\n")
for x in header_data.attributes:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic AttributePair *_attribute_pairs=NULL;\n")
if (len(header_data.texunits)):
fd.write("\t\tstatic TexUnitPair _texunit_pairs[]={\n")
for x in header_data.texunits:
fd.write("\t\t\t{\""+x[0]+"\","+x[1]+"},\n");
fd.write("\t\t};\n\n");
else:
fd.write("\t\tstatic TexUnitPair *_texunit_pairs=NULL;\n")
fd.write("\t\tstatic const char _vertex_code[]={\n")
for x in header_data.vertex_lines:
for i in range(len(x)):
fd.write(str(ord(x[i]))+",");
fd.write(str(ord('\n'))+",");
fd.write("\t\t0};\n\n");
fd.write("\t\tstatic const int _vertex_code_start="+str(header_data.vertex_offset)+";\n")
fd.write("\t\tstatic const char _fragment_code[]={\n")
for x in header_data.fragment_lines:
for i in range(len(x)):
fd.write(str(ord(x[i]))+",");
fd.write(str(ord('\n'))+",");
fd.write("\t\t0};\n\n");
fd.write("\t\tstatic const int _fragment_code_start="+str(header_data.fragment_offset)+";\n")
if output_attribs:
fd.write("\t\tsetup(_conditional_strings,"+str(len(header_data.conditionals))+",_uniform_strings,"+str(len(header_data.uniforms))+",_attribute_pairs,"+str(len(header_data.attributes))+", _texunit_pairs,"+str(len(header_data.texunits))+",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
else:
fd.write("\t\tsetup(_conditional_strings,"+str(len(header_data.conditionals))+",_uniform_strings,"+str(len(header_data.uniforms))+",_texunit_pairs,"+str(len(header_data.texunits))+",_enums,"+str(len(header_data.enums))+",_enum_values,"+str(enum_value_count)+",_vertex_code,_fragment_code,_vertex_code_start,_fragment_code_start);\n")
fd.write("\t};\n\n")
if (len(enum_constants)):
fd.write("\tenum EnumConditionals {\n")
for x in enum_constants:
fd.write("\t\t"+x.upper()+",\n");
fd.write("\t};\n\n");
fd.write("\tvoid set_enum_conditional(EnumConditionals p_cond) { _set_enum_conditional(p_cond); }\n")
fd.write("};\n\n");
fd.write("#endif\n\n");
fd.close();
def build_legacygl_headers( target, source, env ):
for x in source:
build_legacygl_header(str(x), include = "drivers/legacygl/shader_lgl.h", class_suffix = "LGL", output_attribs = False);
return 0
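The enum packing in build_legacygl_header gives each `_EN_` enum group the smallest number of bits that can encode its options, allocated sequentially after the plain conditionals in the 64-bit version key. A hedged sketch of that width computation:

```python
def enum_bits(amt):
    # Bits needed to encode `amt` mutually exclusive enum options, matching
    # the `while (2**bits < amt)` loop in build_legacygl_header.
    bits = 1
    while 2 ** bits < amt:
        bits += 1
    return bits
```

So two options fit in one bit, three or four options in two bits, and so on.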
def build_gles2_headers( target, source, env ):
for x in source:
build_legacygl_header(str(x), include="drivers/gles2/shader_gles2.h", class_suffix = "GLES2", output_attribs = True)
return 0
def update_version():
rev = "custom_build"
if (os.getenv("BUILD_REVISION")!=None):
rev=os.getenv("BUILD_REVISION")
print("Using custom revision: "+rev)
import version
f=open("core/version.h","wb")
f.write("#define VERSION_SHORT_NAME "+str(version.short_name)+"\n")
f.write("#define VERSION_NAME "+str(version.name)+"\n")
f.write("#define VERSION_MAJOR "+str(version.major)+"\n")
f.write("#define VERSION_MINOR "+str(version.minor)+"\n")
if (hasattr(version, 'patch')):
f.write("#define VERSION_PATCH "+str(version.patch)+"\n")
f.write("#define VERSION_REVISION "+str(rev)+"\n")
f.write("#define VERSION_STATUS "+str(version.status)+"\n")
import datetime
f.write("#define VERSION_YEAR "+str(datetime.datetime.now().year)+"\n")
f.close()
def parse_cg_file(fname, uniforms, sizes, conditionals):
import re
fs = open(fname, "r")
line=fs.readline()
while line:
if re.match(r"^\s*uniform", line):
res = re.match(r"uniform ([\d\w]*) ([\d\w]*)", line)
type = res.group(1)
name = res.group(2)
uniforms.append(name);
if (type.find("texobj") != -1):
sizes.append(1);
else:
t = re.match(r"float(\d)x(\d)", type);
if t:
sizes.append(int(t.group(1)) * int(t.group(2)))
else:
t = re.match(r"float(\d)", type);
sizes.append(int(t.group(1)))
if line.find("[branch]") != -1:
conditionals.append(name);
line = fs.readline();
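For reference, the uniform regex used in parse_cg_file captures the type and name of a Cg uniform declaration; a small standalone check (the sample declaration is made up):

```python
import re

# Same pattern as in parse_cg_file, applied to a hypothetical Cg uniform line.
m = re.match(r"uniform ([\d\w]*) ([\d\w]*)", "uniform float4x4 modelview")
cg_type, cg_name = m.group(1), m.group(2)
```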
def build_cg_shader(sname):
vp_uniforms = []
vp_uniform_sizes = []
vp_conditionals = []
parse_cg_file("vp_"+sname+".cg", vp_uniforms, vp_uniform_sizes, vp_conditionals);
fp_uniforms = []
fp_uniform_sizes = []
fp_conditionals = []
parse_cg_file("fp_"+sname+".cg", fp_uniforms, fp_uniform_sizes, fp_conditionals);
fd = open("shader_"+sname+".cg.h", "w");
fd.write('\n#include "shader_cell.h"\n');
fd.write("\nclass Shader_" + sname + " : public ShaderCell {\n");
fd.write("\n\tstatic struct VertexUniforms[] = {\n");
offset = 0;
for i in range(0, len(vp_uniforms)):
fd.write('\t\t{ "%s", %d, %d },\n' % (vp_uniforms[i], offset, vp_uniform_sizes[i]))
offset = offset + vp_uniform_sizes[i];
fd.write("\t};\n\n");
fd.write("public:\n\n");
fd.write("\tenum {\n");
for i in range(0, len(vp_uniforms)):
fd.write('\t\tVP_%s,\n' % vp_uniforms[i].upper())
fd.write("\t};\n");
import glob
def detect_modules():
module_list=[]
includes_cpp=""
register_cpp=""
unregister_cpp=""
for x in glob.glob("modules/*"):
if (not os.path.isdir(x)):
continue
x=x.replace("modules/","") # rest of world
x=x.replace("modules\\","") # win32
module_list.append(x)
try:
with open("modules/"+x+"/register_types.h"):
includes_cpp+='#include "modules/'+x+'/register_types.h"\n'
register_cpp+='#ifdef MODULE_'+x.upper()+'_ENABLED\n'
register_cpp+='\tregister_'+x+'_types();\n'
register_cpp+='#endif\n'
unregister_cpp+='#ifdef MODULE_'+x.upper()+'_ENABLED\n'
unregister_cpp+='\tunregister_'+x+'_types();\n'
unregister_cpp+='#endif\n'
except IOError:
pass
modules_cpp="""
// modules.cpp - THIS FILE IS GENERATED, DO NOT EDIT!!!!!!!
#include "register_module_types.h"
"""+includes_cpp+"""
void register_module_types() {
"""+register_cpp+"""
}
void unregister_module_types() {
"""+unregister_cpp+"""
}
"""
f=open("modules/register_module_types.cpp","wb")
f.write(modules_cpp)
f.close()
return module_list
def win32_spawn(sh, escape, cmd, args, env):
    import subprocess
    newargs = ' '.join(args[1:])
    cmdline = cmd + " " + newargs
    startupinfo = subprocess.STARTUPINFO()
    # startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
    for e in env:
        if type(env[e]) != type(""):
            env[e] = str(env[e])
    proc = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, startupinfo=startupinfo, shell=False, env=env)
    data, err = proc.communicate()
    rv = proc.wait()
    if rv:
        print "====="
        print err
        print "====="
    return rv
"""
def win32_spawn(sh, escape, cmd, args, spawnenv):
import win32file
import win32event
import win32process
import win32security
for var in spawnenv:
spawnenv[var] = spawnenv[var].encode('ascii', 'replace')
sAttrs = win32security.SECURITY_ATTRIBUTES()
StartupInfo = win32process.STARTUPINFO()
newargs = ' '.join(map(escape, args[1:]))
cmdline = cmd + " " + newargs
# check for any special operating system commands
if cmd == 'del':
for arg in args[1:]:
win32file.DeleteFile(arg)
exit_code = 0
else:
# otherwise execute the command.
hProcess, hThread, dwPid, dwTid = win32process.CreateProcess(None, cmdline, None, None, 1, 0, spawnenv, None, StartupInfo)
win32event.WaitForSingleObject(hProcess, win32event.INFINITE)
exit_code = win32process.GetExitCodeProcess(hProcess)
win32file.CloseHandle(hProcess);
win32file.CloseHandle(hThread);
return exit_code
"""
def android_add_maven_repository(self, url):
    self.android_maven_repos.append(url)


def android_add_dependency(self, depline):
    self.android_dependencies.append(depline)


def android_add_java_dir(self, subpath):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + subpath
    self.android_java_dirs.append(base_path)


def android_add_res_dir(self, subpath):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + subpath
    self.android_res_dirs.append(base_path)


def android_add_aidl_dir(self, subpath):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + subpath
    self.android_aidl_dirs.append(base_path)


def android_add_jni_dir(self, subpath):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + subpath
    self.android_jni_dirs.append(base_path)


def android_add_to_manifest(self, file):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + file
    f = open(base_path, "rb")
    self.android_manifest_chunk += f.read()


def android_add_to_permissions(self, file):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + file
    f = open(base_path, "rb")
    self.android_permission_chunk += f.read()


def android_add_to_attributes(self, file):
    base_path = self.Dir(".").abspath + "/modules/" + self.current_module + "/" + file
    f = open(base_path, "rb")
    self.android_appattributes_chunk += f.read()


def disable_module(self):
    self.disabled_modules.append(self.current_module)
def use_windows_spawn_fix(self):
    if os.name != "nt":
        return  # not needed, only for windows

    self.split_drivers = True

    import subprocess

    def mySubProcess(cmdline, env):
        # print "SPAWNED : " + cmdline
        startupinfo = subprocess.STARTUPINFO()
        startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
        proc = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE, startupinfo=startupinfo, shell=False, env=env)
        data, err = proc.communicate()
        rv = proc.wait()
        if rv:
            print "====="
            print err
            print "====="
        return rv

    def mySpawn(sh, escape, cmd, args, env):
        newargs = ' '.join(args[1:])
        cmdline = cmd + " " + newargs

        rv = 0
        # work around the 32k command-line length limit when archiving many objects
        if len(cmdline) > 32000 and cmd.endswith("ar"):
            cmdline = cmd + " " + args[1] + " " + args[2] + " "
            for i in range(3, len(args)):
                rv = mySubProcess(cmdline + args[i], env)
                if rv:
                    break
        else:
            rv = mySubProcess(cmdline, env)

        return rv

    self['SPAWN'] = mySpawn
def save_active_platforms(apnames, ap):
    for x in ap:
        pth = x + "/logo.png"
        # print("open path: " + pth)
        pngf = open(pth, "rb")
        b = pngf.read(1)
        str = " /* AUTOGENERATED FILE, DO NOT EDIT */ \n"
        str += " static const unsigned char _" + x[9:] + "_logo[]={"
        while len(b) == 1:
            str += hex(ord(b))
            b = pngf.read(1)
            if len(b) == 1:
                str += ","
        str += "};\n"
        wf = x + "/logo.h"
        logow = open(wf, "wb")
        logow.write(str)
def colored(sys, env):
    # If the output is not a terminal, do nothing
    if not sys.stdout.isatty():
        return

    colors = {}
    colors['cyan'] = '\033[96m'
    colors['purple'] = '\033[95m'
    colors['blue'] = '\033[94m'
    colors['green'] = '\033[92m'
    colors['yellow'] = '\033[93m'
    colors['red'] = '\033[91m'
    colors['end'] = '\033[0m'

    compile_source_message = '%sCompiling %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
    java_compile_source_message = '%sCompiling %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
    compile_shared_source_message = '%sCompiling shared %s==> %s$SOURCE%s' % (colors['blue'], colors['purple'], colors['yellow'], colors['end'])
    link_program_message = '%sLinking Program %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
    link_library_message = '%sLinking Static Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
    ranlib_library_message = '%sRanlib Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
    link_shared_library_message = '%sLinking Shared Library %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])
    java_library_message = '%sCreating Java Archive %s==> %s$TARGET%s' % (colors['red'], colors['purple'], colors['yellow'], colors['end'])

    env.Append(CXXCOMSTR=[compile_source_message])
    env.Append(CCCOMSTR=[compile_source_message])
    env.Append(SHCCCOMSTR=[compile_shared_source_message])
    env.Append(SHCXXCOMSTR=[compile_shared_source_message])
    env.Append(ARCOMSTR=[link_library_message])
    env.Append(RANLIBCOMSTR=[ranlib_library_message])
    env.Append(SHLINKCOMSTR=[link_shared_library_message])
    env.Append(LINKCOMSTR=[link_program_message])
    env.Append(JARCOMSTR=[java_library_message])
    env.Append(JAVACCOMSTR=[java_compile_source_message])
# === file: catalyst/data/session_bars.py (repo: guilhermeprokisch/catalyst) ===
# Copyright 2016 Quantopian, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from abc import abstractproperty

from catalyst.data.bar_reader import BarReader


class SessionBarReader(BarReader):
    """
    Reader for OHLCV pricing data at a session frequency.
    """
    @property
    def data_frequency(self):
        return 'session'

    @abstractproperty
    def sessions(self):
        """
        Returns
        -------
        sessions : DatetimeIndex
            All session labels (unioning the range for all assets) which the
            reader can provide.
        """
        pass
# === file: lib/python3.8/site-packages/ansible_collections/vyos/vyos/plugins/modules/vyos_ospf_interfaces.py (repo: cjsteel/python3-venv-ansible-2.10.5) ===
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright 2020 Red Hat
# GNU General Public License v3.0+
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
#############################################
# WARNING #
#############################################
#
# This file is auto generated by the resource
# module builder playbook.
#
# Do not edit this file manually.
#
# Changes to this file will be over written
# by the resource module builder.
#
# Changes should be made in the model used to
# generate this file or in the resource module
# builder template.
#
#############################################
"""
The module file for vyos_ospf_interfaces
"""
from __future__ import absolute_import, division, print_function
__metaclass__ = type
DOCUMENTATION = """
module: vyos_ospf_interfaces
version_added: 1.2.0
short_description: OSPF Interfaces Resource Module.
description:
- This module manages OSPF configuration of interfaces on devices running VYOS.
author: Gomathi Selvi Srinivasan (@GomathiselviS)
options:
  config:
    description: A list of OSPF configuration for interfaces.
    type: list
    elements: dict
    suboptions:
      name:
        description:
        - Name/Identifier of the interface.
        type: str
      address_family:
        description:
        - OSPF settings on the interfaces in address-family context.
        type: list
        elements: dict
        suboptions:
          afi:
            description:
            - Address Family Identifier (AFI) for OSPF settings on the interfaces.
            type: str
            choices: ['ipv4', 'ipv6']
            required: True
          authentication:
            description:
            - Authentication settings on the interface.
            type: dict
            suboptions:
              plaintext_password:
                description:
                - Plain Text password.
                type: str
              md5_key:
                description:
                - md5 parameters.
                type: dict
                suboptions:
                  key_id:
                    description:
                    - key id.
                    type: int
                  key:
                    description:
                    - md5 key.
                    type: str
          bandwidth:
            description:
            - Bandwidth of interface (kilobits/sec)
            type: int
          cost:
            description:
            - metric associated with interface.
            type: int
          dead_interval:
            description:
            - Time interval to detect a dead router.
            type: int
          hello_interval:
            description:
            - Timer interval between transmission of hello packets.
            type: int
          mtu_ignore:
            description:
            - if True, Disable MTU check for Database Description packets.
            type: bool
          network:
            description:
            - Interface type.
            type: str
          priority:
            description:
            - Interface priority.
            type: int
          retransmit_interval:
            description:
            - LSA retransmission interval.
            type: int
          transmit_delay:
            description:
            - LSA transmission delay.
            type: int
          ifmtu:
            description:
            - interface MTU.
            type: int
          instance:
            description:
            - Instance ID.
            type: str
          passive:
            description:
            - If True, disables forming adjacency.
            type: bool
  running_config:
    description:
    - This option is used only with state I(parsed).
    - The value of this option should be the output received from the IOS device by
      executing the command B(sh running-config | section ^interface).
    - The state I(parsed) reads the configuration from C(running_config) option and
      transforms it into Ansible structured data as per the resource module's argspec
      and the value is then returned in the I(parsed) key within the result.
    type: str
  state:
    description:
    - The state the configuration should be left in.
    type: str
    choices:
    - merged
    - replaced
    - overridden
    - deleted
    - gathered
    - parsed
    - rendered
    default: merged
"""
EXAMPLES = """
# Using merged
#
# Before state:
# -------------
#
# @vyos:~$ show configuration commands | match "ospf"
- name: Merge provided configuration with device configuration
  vyos.vyos.vyos_ospf_interfaces:
    config:
      - name: "eth1"
        address_family:
          - afi: "ipv4"
            transmit_delay: 50
            priority: 26
            network: "point-to-point"
          - afi: "ipv6"
            dead_interval: 39
      - name: "bond2"
        address_family:
          - afi: "ipv4"
            transmit_delay: 45
            bandwidth: 70
            authentication:
              md5_key:
                key_id: 10
                key: "1111111111232345"
          - afi: "ipv6"
            passive: True
    state: merged
# After State:
# --------------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
# "after": [
# "
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "before": [
# {
# "name": "eth0"
# },
# {
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "changed": true,
# "commands": [
# "set interfaces ethernet eth1 ip ospf transmit-delay 50",
# "set interfaces ethernet eth1 ip ospf priority 26",
# "set interfaces ethernet eth1 ip ospf network point-to-point",
# "set interfaces ethernet eth1 ipv6 ospfv3 dead-interval 39",
# "set interfaces bonding bond2 ip ospf transmit-delay 45",
# "set interfaces bonding bond2 ip ospf bandwidth 70",
# "set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key 1111111111232345",
# "set interfaces bonding bond2 ipv6 ospfv3 passive"
# ],
# Using replaced:
# Before State:
# ------------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
- name: Replace provided configuration with device configuration
  vyos.vyos.vyos_ospf_interfaces:
    config:
      - name: "eth1"
        address_family:
          - afi: "ipv4"
            cost: 100
          - afi: "ipv6"
            ifmtu: 33
      - name: "bond2"
        address_family:
          - afi: "ipv4"
            transmit_delay: 45
          - afi: "ipv6"
            passive: True
    state: replaced
# After State:
# -----------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf cost '100'
# set interfaces ethernet eth1 ipv6 ospfv3 ifmtu '33'
# vyos@vyos:~$
# Module Execution
# ----------------
# "after": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "cost": 100
# },
# {
# "afi": "ipv6",
# "ifmtu": 33
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "before": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "changed": true,
# "commands": [
# "set interfaces ethernet eth1 ip ospf cost 100",
# "set interfaces ethernet eth1 ipv6 ospfv3 ifmtu 33",
# "delete interfaces ethernet eth1 ip ospf network point-to-point",
# "delete interfaces ethernet eth1 ip ospf priority 26",
# "delete interfaces ethernet eth1 ip ospf transmit-delay 50",
# "delete interfaces ethernet eth1 ipv6 ospfv3 dead-interval 39",
# "delete interfaces bonding bond2 ip ospf authentication",
# "delete interfaces bonding bond2 ip ospf bandwidth 70"
# ],
#
# Using Overridden:
# -----------------
# Before State:
# ------------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf cost '100'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
# set interfaces ethernet eth1 ipv6 ospfv3 ifmtu '33'
# vyos@vyos:~$
- name: Override device configuration with provided configuration
  vyos.vyos.vyos_ospf_interfaces:
    config:
      - name: "eth0"
        address_family:
          - afi: "ipv4"
            cost: 100
          - afi: "ipv6"
            ifmtu: 33
            passive: True
    state: overridden
# After State:
# -----------
# 200~vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces ethernet eth0 ip ospf cost '100'
# set interfaces ethernet eth0 ipv6 ospfv3 ifmtu '33'
# set interfaces ethernet eth0 ipv6 ospfv3 'passive'
# vyos@vyos:~$
#
#
# "after": [
# {
# "name": "bond2"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "cost": 100
# },
# {
# "afi": "ipv6",
# "ifmtu": 33,
# "passive": true
# }
# ],
# "name": "eth0"
# },
# {
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "before": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "cost": 100,
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39,
# "ifmtu": 33
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "changed": true,
# "commands": [
# "delete interfaces bonding bond2 ip ospf",
# "delete interfaces bonding bond2 ipv6 ospfv3",
# "delete interfaces ethernet eth1 ip ospf",
# "delete interfaces ethernet eth1 ipv6 ospfv3",
# "set interfaces ethernet eth0 ip ospf cost 100",
# "set interfaces ethernet eth0 ipv6 ospfv3 ifmtu 33",
# "set interfaces ethernet eth0 ipv6 ospfv3 passive"
# ],
#
# Using deleted:
# -------------
# before state:
# -------------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth0 ip ospf cost '100'
# set interfaces ethernet eth0 ipv6 ospfv3 ifmtu '33'
# set interfaces ethernet eth0 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
# vyos@vyos:~$
- name: Delete device configuration
  vyos.vyos.vyos_ospf_interfaces:
    config:
      - name: "eth0"
    state: deleted
# After State:
# -----------
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
# vyos@vyos:~$
#
#
# "after": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "before": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "cost": 100
# },
# {
# "afi": "ipv6",
# "ifmtu": 33,
# "passive": true
# }
# ],
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
# "changed": true,
# "commands": [
# "delete interfaces ethernet eth0 ip ospf",
# "delete interfaces ethernet eth0 ipv6 ospfv3"
# ],
#
# Using parsed:
# parsed.cfg:
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth0 ip ospf cost '50'
# set interfaces ethernet eth0 ip ospf priority '26'
# set interfaces ethernet eth0 ipv6 ospfv3 instance-id '33'
# set interfaces ethernet eth0 ipv6 ospfv3 'mtu-ignore'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
#
- name: parse configs
  vyos.vyos.vyos_ospf_interfaces:
    running_config: "{{ lookup('file', './parsed.cfg') }}"
    state: parsed
# Module Execution:
# ----------------
# "parsed": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "cost": 50,
# "priority": 26
# },
# {
# "afi": "ipv6",
# "instance": "33",
# "mtu_ignore": true
# }
# ],
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# }
# ]
# Using rendered:
# --------------
- name: Render
  vyos.vyos.vyos_ospf_interfaces:
    config:
      - name: "eth1"
        address_family:
          - afi: "ipv4"
            transmit_delay: 50
            priority: 26
            network: "point-to-point"
          - afi: "ipv6"
            dead_interval: 39
      - name: "bond2"
        address_family:
          - afi: "ipv4"
            transmit_delay: 45
            bandwidth: 70
            authentication:
              md5_key:
                key_id: 10
                key: "1111111111232345"
          - afi: "ipv6"
            passive: True
    state: rendered
# Module Execution:
# ----------------
# "rendered": [
# "set interfaces ethernet eth1 ip ospf transmit-delay 50",
# "set interfaces ethernet eth1 ip ospf priority 26",
# "set interfaces ethernet eth1 ip ospf network point-to-point",
# "set interfaces ethernet eth1 ipv6 ospfv3 dead-interval 39",
# "set interfaces bonding bond2 ip ospf transmit-delay 45",
# "set interfaces bonding bond2 ip ospf bandwidth 70",
# "set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key 1111111111232345",
# "set interfaces bonding bond2 ipv6 ospfv3 passive"
# ]
#
# Using Gathered:
# --------------
# Native Config:
# vyos@vyos:~$ show configuration commands | match "ospf"
# set interfaces bonding bond2 ip ospf authentication md5 key-id 10 md5-key '1111111111232345'
# set interfaces bonding bond2 ip ospf bandwidth '70'
# set interfaces bonding bond2 ip ospf transmit-delay '45'
# set interfaces bonding bond2 ipv6 ospfv3 'passive'
# set interfaces ethernet eth1 ip ospf network 'point-to-point'
# set interfaces ethernet eth1 ip ospf priority '26'
# set interfaces ethernet eth1 ip ospf transmit-delay '50'
# set interfaces ethernet eth1 ipv6 ospfv3 dead-interval '39'
# vyos@vyos:~$
- name: gather configs
  vyos.vyos.vyos_ospf_interfaces:
    state: gathered
# Module Execution:
# -----------------
# "gathered": [
# {
# "address_family": [
# {
# "afi": "ipv4",
# "authentication": {
# "md5_key": {
# "key": "1111111111232345",
# "key_id": 10
# }
# },
# "bandwidth": 70,
# "transmit_delay": 45
# },
# {
# "afi": "ipv6",
# "passive": true
# }
# ],
# "name": "bond2"
# },
# {
# "name": "eth0"
# },
# {
# "address_family": [
# {
# "afi": "ipv4",
# "network": "point-to-point",
# "priority": 26,
# "transmit_delay": 50
# },
# {
# "afi": "ipv6",
# "dead_interval": 39
# }
# ],
# "name": "eth1"
# },
# {
# "name": "eth2"
# },
# {
# "name": "eth3"
# }
# ],
"""
from ansible.module_utils.basic import AnsibleModule
from ansible_collections.vyos.vyos.plugins.module_utils.network.vyos.argspec.ospf_interfaces.ospf_interfaces import (
    Ospf_interfacesArgs,
)
from ansible_collections.vyos.vyos.plugins.module_utils.network.vyos.config.ospf_interfaces.ospf_interfaces import (
    Ospf_interfaces,
)


def main():
    """
    Main entry point for module execution

    :returns: the result from module invocation
    """
    module = AnsibleModule(
        argument_spec=Ospf_interfacesArgs.argument_spec,
        mutually_exclusive=[],
        required_if=[],
        supports_check_mode=False,
    )

    result = Ospf_interfaces(module).execute_module()
    module.exit_json(**result)


if __name__ == "__main__":
    main()
# === file: yenviron/test/test.py (repo: javiermatos/yenviron) ===
import os
import unittest

from .. import Yenviron
from .. import exceptions
from ..yenviron import yenviron_read

TEST_FILE = os.path.join(os.path.dirname(__file__), 'yenviron.yml')


class YenvironTests(unittest.TestCase):

    def setUp(self):
        self.yenv = yenviron_read(TEST_FILE)

    def test_constructor_no_dictionary(self):
        with self.assertRaises(exceptions.YenvironError):
            Yenviron(1)

    def test_constructor(self):
        Yenviron({})

    def test_getitem_integer(self):
        self.assertIsInstance(self.yenv['INTEGER_1'], int)
        self.assertIsInstance(self.yenv.get('INTEGER_1'), int)
        self.assertIsInstance(self.yenv.get('INTEGER_314', 1), int)

    def test_getitem_string(self):
        self.assertIsInstance(self.yenv['STRING_1'], str)
        self.assertIsInstance(self.yenv.get('STRING_1'), str)
        self.assertIsInstance(self.yenv.get('STRING_314', 'a'), str)

    def test_getitem_list(self):
        self.assertIsInstance(self.yenv['LIST_INTEGER'], list)
        self.assertIsInstance(self.yenv.get('LIST_INTEGER'), list)
        self.assertIsInstance(self.yenv.get('LIST_INTEGER_314', []), list)

    def test_getitem_dict(self):
        self.assertIsInstance(self.yenv['DICTIONARY'], dict)
        self.assertIsInstance(self.yenv.get('DICTIONARY'), dict)
        self.assertIsInstance(self.yenv.get('DICTIONARY_314', {}), dict)

    def test_getitem_missing(self):
        with self.assertRaises(exceptions.YenvironKeyError):
            value = self.yenv['DICTIONARY_314']

    def test_setitem(self):
        with self.assertRaises(TypeError):
            self.yenv['ANY_VALUE'] = 1

    def test_delitem(self):
        with self.assertRaises(TypeError):
            del self.yenv['ANY_VALUE']

    def test_contains(self):
        self.assertTrue('INTEGER_1' in self.yenv)
        self.assertFalse('INTEGER_314' in self.yenv)
# === file: comp_493/apps.py (repo: hackeralistarsays/Toxic-Words-Identification) ===
from django.apps import AppConfig


class Comp493Config(AppConfig):
    name = 'comp_493'
# === file: 01_Modelos_Supervisionados/1.11_Ensemble_Methods/1.11.3_AdaBoost.py (repo: BrunoBertti/Scikit_Learning) ===
########## 1.11.3 AdaBoost ##########
# The sklearn.ensemble module includes the popular boosting algorithm AdaBoost, introduced in 1995 by Freund and Schapire [FS1995].

# The core principle of AdaBoost is to fit a sequence of weak learners (i.e., models that are only slightly better than random guessing, such as small decision trees) on repeatedly modified versions of the data. The predictions from all of them are then combined through a weighted majority vote (or sum) to produce the final prediction. The data modifications at each so-called boosting iteration consist of applying weights w_1, w_2, ..., w_N to each of the training samples. Initially, those weights are all set to w_i = 1/N, so that the first step simply trains a weak learner on the original data. For each successive iteration, the sample weights are individually modified and the learning algorithm is reapplied to the reweighted data. At a given step, the training examples that were incorrectly predicted by the boosted model induced at the previous step have their weights increased, whereas the weights are decreased for those that were predicted correctly. As the iterations proceed, examples that are difficult to predict receive ever-increasing influence. Each subsequent weak learner is thereby forced to concentrate on the examples that were missed by the previous ones in the sequence [HTF].
# https://scikit-learn.org/stable/auto_examples/ensemble/plot_adaboost_hastie_10_2.html
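# The reweighting scheme described above can be sketched in a few lines. This is an
# illustrative toy implementation of discrete AdaBoost with decision stumps, not
# scikit-learn's actual code; the synthetic dataset and the number of boosting
# rounds are arbitrary choices for demonstration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
y_pm = np.where(y == 0, -1, 1)           # encode labels as {-1, +1}

n = len(y_pm)
w = np.full(n, 1.0 / n)                  # w_i = 1/N initially
stumps, alphas = [], []

for _ in range(20):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(X, y_pm, sample_weight=w)  # weak learner on reweighted data
    pred = stump.predict(X)
    err = w[pred != y_pm].sum() / w.sum()
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
    w = w * np.exp(-alpha * y_pm * pred)  # misclassified samples gain weight
    w = w / w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction: weighted vote over all weak learners
agg = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
train_acc = float(np.mean(np.sign(agg) == y_pm))
```

# After a few rounds the weighted vote typically separates the training data far
# better than any single stump could.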
# AdaBoost can be used for both classification and regression problems:
# For multi-class classification, AdaBoostClassifier implements AdaBoost-SAMME and AdaBoost-SAMME.R [ZZRH2009].
# For regression, AdaBoostRegressor implements AdaBoost.R2 [D1997].
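# A minimal sketch of the regression variant (illustrative only; the synthetic
# dataset and parameter values below are assumptions for demonstration, not from
# the original text):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import cross_val_score

# Synthetic regression problem, chosen arbitrarily for the example
X, y = make_regression(n_samples=300, n_features=8, noise=10.0, random_state=0)

# AdaBoostRegressor implements AdaBoost.R2; default base learner is a decision tree
reg = AdaBoostRegressor(n_estimators=50, random_state=0)
scores = cross_val_score(reg, X, y, cv=5)  # default scoring is R^2
```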
##### 1.11.3.1. Usage
# The following example shows how to fit an AdaBoost classifier with 100 weak learners:
from sklearn.model_selection import cross_val_score
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
X, y = load_iris(return_X_y=True)
clf = AdaBoostClassifier(n_estimators=100)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
# The number of weak learners is controlled by the parameter n_estimators. The learning_rate parameter controls the contribution of the weak learners in the final combination. By default, weak learners are decision stumps. Different weak learners can be specified through the base_estimator parameter. The main parameters to tune for good results are n_estimators and the complexity of the base estimators (e.g., their depth max_depth or the minimum required number of samples to consider a split min_samples_split).
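# A sketch of tuning those parameters (illustrative values; note that newer
# scikit-learn releases renamed base_estimator to estimator, so both spellings
# are tried here):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
# A slightly deeper weak learner than the default decision stump
base = DecisionTreeClassifier(max_depth=2, min_samples_split=4)
try:
    clf = AdaBoostClassifier(estimator=base, n_estimators=50, learning_rate=0.5)
except TypeError:  # scikit-learn < 1.2 spells the parameter base_estimator
    clf = AdaBoostClassifier(base_estimator=base, n_estimators=50, learning_rate=0.5)
scores = cross_val_score(clf, X, y, cv=5)
```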
## Examples:
## Discrete versus Real AdaBoost compares the classification error of a decision stump, decision tree, and a boosted decision stump using AdaBoost-SAMME and AdaBoost-SAMME.R. (https://scikit-learn.org/stable/auto_examples/ensemble/plot_adaboost_hastie_10_2.html#sphx-glr-auto-examples-ensemble-plot-adaboost-hastie-10-2-py)
## Multi-class AdaBoosted Decision Trees shows the performance of AdaBoost-SAMME and AdaBoost-SAMME.R on a multi-class problem. (https://scikit-learn.org/stable/auto_examples/ensemble/plot_adaboost_multiclass.html#sphx-glr-auto-examples-ensemble-plot-adaboost-multiclass-py)
## Two-class AdaBoost shows the decision boundary and decision function values for a non-linearly separable two-class problem using AdaBoost-SAMME. (https://scikit-learn.org/stable/auto_examples/ensemble/plot_adaboost_twoclass.html#sphx-glr-auto-examples-ensemble-plot-adaboost-twoclass-py)
## Decision Tree Regression with AdaBoost demonstrates regression with the AdaBoost.R2 algorithm. (https://scikit-learn.org/stable/auto_examples/ensemble/plot_adaboost_regression.html#sphx-glr-auto-examples-ensemble-plot-adaboost-regression-py)
## References:
## FS1995 Y. Freund, and R. Schapire, “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting”, 1997. (https://scikit-learn.org/stable/modules/ensemble.html#id9)
## ZZRH2009 J. Zhu, H. Zou, S. Rosset, T. Hastie. “Multi-class AdaBoost”, 2009. (https://scikit-learn.org/stable/modules/ensemble.html#id11)
## D1997 Drucker. “Improving Regressors using Boosting Techniques”, 1997. (https://scikit-learn.org/stable/modules/ensemble.html#id12)
## HTF(1,2,3) T. Hastie, R. Tibshirani and J. Friedman, “Elements of Statistical Learning Ed. 2”, Springer, 2009. 1(https://scikit-learn.org/stable/modules/ensemble.html#id10), 2 (https://scikit-learn.org/stable/modules/ensemble.html#id20), 3(https://scikit-learn.org/stable/modules/ensemble.html#id34)
| 83.431034 | 1,322 | 0.786733 | 704 | 4,839 | 5.357955 | 0.431818 | 0.032078 | 0.04666 | 0.055408 | 0.232238 | 0.232238 | 0.201485 | 0.201485 | 0.1079 | 0.08245 | 0 | 0.021999 | 0.135772 | 4,839 | 57 | 1,323 | 84.894737 | 0.879244 | 0.898119 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
734e711fb7fcf8bf86d10a3b11203c5e3efd8327 | 236 | py | Python | lldb/packages/Python/lldbsuite/test/functionalities/data-formatter/dump_dynamic/TestDumpDynamic.py | medismailben/llvm-project | e334a839032fe500c3bba22bf976ab7af13ce1c1 | [
"Apache-2.0"
] | 2,338 | 2018-06-19T17:34:51.000Z | 2022-03-31T11:00:37.000Z | packages/Python/lldbsuite/test/functionalities/data-formatter/dump_dynamic/TestDumpDynamic.py | DalavanCloud/lldb | e913eaf2468290fb94c767d474d611b41a84dd69 | [
"Apache-2.0"
] | 3,740 | 2019-01-23T15:36:48.000Z | 2022-03-31T22:01:13.000Z | packages/Python/lldbsuite/test/functionalities/data-formatter/dump_dynamic/TestDumpDynamic.py | DalavanCloud/lldb | e913eaf2468290fb94c767d474d611b41a84dd69 | [
"Apache-2.0"
] | 500 | 2019-01-23T07:49:22.000Z | 2022-03-30T02:59:37.000Z | from __future__ import absolute_import
from lldbsuite.test import lldbinline
lldbinline.MakeInlineTest(
__file__, globals(), [
lldbinline.expectedFailureAll(
oslist=["windows"], bugnumber="llvm.org/pr24663")])
| 26.222222 | 63 | 0.724576 | 22 | 236 | 7.363636 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0.173729 | 236 | 8 | 64 | 29.5 | 0.805128 | 0 | 0 | 0 | 0 | 0 | 0.097458 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
73640623a5ba6d309c8a94173a1ac23be655be3e | 221 | py | Python | sample/data_generator.py | YuMurata/ParameterOptimizer | 60b172a7a9d3f54213ac3d59e15ebb4d707475c3 | [
"MIT"
] | null | null | null | sample/data_generator.py | YuMurata/ParameterOptimizer | 60b172a7a9d3f54213ac3d59e15ebb4d707475c3 | [
"MIT"
] | null | null | null | sample/data_generator.py | YuMurata/ParameterOptimizer | 60b172a7a9d3f54213ac3d59e15ebb4d707475c3 | [
"MIT"
] | null | null | null | import numpy as np
import os
import sys
sys.path.append(os.getcwd())
from ParameterOptimizer import DataGenerator
class SampleGenerator(DataGenerator):
def generate(self, param: dict) -> dict:
return param
| 18.416667 | 44 | 0.751131 | 28 | 221 | 5.928571 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171946 | 221 | 11 | 45 | 20.090909 | 0.907104 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.5 | 0.125 | 0.875 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 4 |
7dfa7cfa68d826aa8b9a1c3fa184801fdd079c67 | 58 | py | Python | instwitter/rest/partial/__init__.py | sovnarkom/instwitter-py | 971ec9db5c75476b8fc51adc90eb368951152341 | [
"MIT"
] | null | null | null | instwitter/rest/partial/__init__.py | sovnarkom/instwitter-py | 971ec9db5c75476b8fc51adc90eb368951152341 | [
"MIT"
] | 1 | 2017-11-27T04:13:29.000Z | 2017-11-27T04:13:29.000Z | instwitter/rest/partial/__init__.py | sovnarkom/instwitter-py | 971ec9db5c75476b8fc51adc90eb368951152341 | [
"MIT"
] | null | null | null | '''
Created on Aug 9, 2009
@author: aleksandrcicenin
'''
| 9.666667 | 25 | 0.672414 | 7 | 58 | 5.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104167 | 0.172414 | 58 | 5 | 26 | 11.6 | 0.708333 | 0.844828 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
b40c94e6bd3e1d1d11dd0cf946103ea5e29ce7ba | 37,194 | py | Python | docker-images/taigav2/taiga-back/tests/integration/test_hooks_gitlab.py | mattcongy/itshop | 6be025a9eaa7fe7f495b5777d1f0e5a3184121c9 | [
"MIT"
] | 1 | 2017-05-29T19:01:06.000Z | 2017-05-29T19:01:06.000Z | docker-images/taigav2/taiga-back/tests/integration/test_hooks_gitlab.py | mattcongy/itshop | 6be025a9eaa7fe7f495b5777d1f0e5a3184121c9 | [
"MIT"
] | null | null | null | docker-images/taigav2/taiga-back/tests/integration/test_hooks_gitlab.py | mattcongy/itshop | 6be025a9eaa7fe7f495b5777d1f0e5a3184121c9 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (C) 2014-2016 Andrey Antukh <niwi@niwi.nz>
# Copyright (C) 2014-2016 Jesús Espino <jespinog@gmail.com>
# Copyright (C) 2014-2016 David Barragán <bameda@dbarragan.com>
# Copyright (C) 2014-2016 Alejandro Alonso <alejandro.alonso@kaleidos.net>
# Copyright (C) 2014-2016 Anler Hernández <hello@anler.me>
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import pytest
from copy import deepcopy
from unittest import mock
from django.core.urlresolvers import reverse
from django.core import mail
from taiga.base.utils import json
from taiga.hooks.gitlab import event_hooks
from taiga.hooks.gitlab.api import GitLabViewSet
from taiga.hooks.exceptions import ActionSyntaxException
from taiga.projects import choices as project_choices
from taiga.projects.epics.models import Epic
from taiga.projects.issues.models import Issue
from taiga.projects.tasks.models import Task
from taiga.projects.userstories.models import UserStory
from taiga.projects.models import Membership
from taiga.projects.history.services import get_history_queryset_by_model_instance, take_snapshot
from taiga.projects.notifications.choices import NotifyLevel
from taiga.projects.notifications.models import NotifyPolicy
from taiga.projects import services
from .. import factories as f
pytestmark = pytest.mark.django_db
push_base_payload = {
"object_kind": "push",
"before": "95790bf891e76fee5e1747ab589903a6a1f80f22",
"after": "da1560886d4f094c3e6c9ef40349f7d38b5d27d7",
"ref": "refs/heads/master",
"checkout_sha": "da1560886d4f094c3e6c9ef40349f7d38b5d27d7",
"user_id": 4,
"user_name": "John Smith",
"user_email": "john@example.com",
    "user_avatar": "https://s.gravatar.com/avatar/d4c74594d841139328695756648b6bd6?s=80",
"project_id": 15,
"project": {
"name": "Diaspora",
"description": "",
"web_url": "http://example.com/mike/diaspora",
"avatar_url": None,
"git_ssh_url": "git@example.com:mike/diaspora.git",
"git_http_url": "http://example.com/mike/diaspora.git",
"namespace": "Mike",
"visibility_level": 0,
"path_with_namespace": "mike/diaspora",
"default_branch": "master",
"homepage": "http://example.com/mike/diaspora",
"url": "git@example.com:mike/diaspora.git",
"ssh_url": "git@example.com:mike/diaspora.git",
"http_url": "http://example.com/mike/diaspora.git"
},
"repository": {
"name": "Diaspora",
"url": "git@example.com:mike/diaspora.git",
"description": "",
"homepage": "http://example.com/mike/diaspora",
"git_http_url": "http://example.com/mike/diaspora.git",
"git_ssh_url": "git@example.com:mike/diaspora.git",
"visibility_level": 0
},
"commits": [
{
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"message": "Update Catalan translation to e38cb41.",
"timestamp": "2011-12-12T14:27:31+02:00",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"author": {
"name": "Jordi Mallach",
"email": "jordi@softcatala.org"
},
"added": ["CHANGELOG"],
"modified": ["app/controller/application.rb"],
"removed": []
},
{
"id": "da1560886d4f094c3e6c9ef40349f7d38b5d27d7",
"message": "fixed readme",
"timestamp": "2012-01-03T23:36:29+02:00",
"url": "http://example.com/mike/diaspora/commit/da1560886d4f094c3e6c9ef40349f7d38b5d27d7",
"author": {
"name": "GitLab dev user",
"email": "gitlabdev@dv6700.(none)"
},
"added": ["CHANGELOG"],
"modified": ["app/controller/application.rb"],
"removed": []
}
],
"total_commits_count": 4
}
new_issue_base_payload = {
"object_kind": "issue",
"user": {
"name": "Administrator",
"username": "root",
"avatar_url": "http://www.gravatar.com/avatar/e64c7d89f26bd1972efa854d13d7dd61?s=40\u0026d=identicon"
},
"project": {
"name": "Gitlab Test",
"description": "Aut reprehenderit ut est.",
"web_url": "http://example.com/gitlabhq/gitlab-test",
"avatar_url": None,
"git_ssh_url": "git@example.com:gitlabhq/gitlab-test.git",
"git_http_url": "http://example.com/gitlabhq/gitlab-test.git",
"namespace": "GitlabHQ",
"visibility_level": 20,
"path_with_namespace": "gitlabhq/gitlab-test",
"default_branch": "master",
"homepage": "http://example.com/gitlabhq/gitlab-test",
"url": "http://example.com/gitlabhq/gitlab-test.git",
"ssh_url": "git@example.com:gitlabhq/gitlab-test.git",
"http_url": "http://example.com/gitlabhq/gitlab-test.git"
},
"repository": {
"name": "Gitlab Test",
"url": "http://example.com/gitlabhq/gitlab-test.git",
"description": "Aut reprehenderit ut est.",
"homepage": "http://example.com/gitlabhq/gitlab-test"
},
"object_attributes": {
"id": 301,
"title": "New API: create/update/delete file",
"assignee_id": 51,
"author_id": 51,
"project_id": 14,
"created_at": "2013-12-03T17:15:43Z",
"updated_at": "2013-12-03T17:15:43Z",
"position": 0,
"branch_name": None,
"description": "Create new API for manipulations with repository",
"milestone_id": None,
"state": "opened",
"iid": 23,
"url": "http://example.com/diaspora/issues/23",
"action": "open"
},
"assignee": {
"name": "User1",
"username": "user1",
"avatar_url": "http://www.gravatar.com/avatar/e64c7d89f26bd1972efa854d13d7dd61?s=40\u0026d=identicon"
}
}
issue_comment_base_payload = {
"object_kind": "note",
"user": {
"name": "Administrator",
"username": "root",
"avatar_url": "http://www.gravatar.com/avatar/e64c7d89f26bd1972efa854d13d7dd61?s=40\u0026d=identicon"
},
"project_id": 5,
"project": {
"name": "Gitlab Test",
"description": "Aut reprehenderit ut est.",
"web_url": "http://example.com/gitlab-org/gitlab-test",
"avatar_url": None,
"git_ssh_url": "git@example.com:gitlab-org/gitlab-test.git",
"git_http_url": "http://example.com/gitlab-org/gitlab-test.git",
"namespace": "Gitlab Org",
"visibility_level": 10,
"path_with_namespace": "gitlab-org/gitlab-test",
"default_branch": "master",
"homepage": "http://example.com/gitlab-org/gitlab-test",
"url": "http://example.com/gitlab-org/gitlab-test.git",
"ssh_url": "git@example.com:gitlab-org/gitlab-test.git",
"http_url": "http://example.com/gitlab-org/gitlab-test.git"
},
"repository": {
"name": "diaspora",
"url": "git@example.com:mike/diaspora.git",
"description": "",
"homepage": "http://example.com/mike/diaspora"
},
"object_attributes": {
"id": 1241,
"note": "Hello world",
"noteable_type": "Issue",
"author_id": 1,
"created_at": "2015-05-17 17:06:40 UTC",
"updated_at": "2015-05-17 17:06:40 UTC",
"project_id": 5,
"attachment": None,
"line_code": None,
"commit_id": "",
"noteable_id": 92,
"system": False,
"st_diff": None,
"url": "http://example.com/gitlab-org/gitlab-test/issues/17#note_1241"
},
"issue": {
"id": 92,
"title": "test",
"assignee_id": None,
"author_id": 1,
"project_id": 5,
"created_at": "2015-04-12 14:53:17 UTC",
"updated_at": "2015-04-26 08:28:42 UTC",
"position": 0,
"branch_name": None,
"description": "test",
"milestone_id": None,
"state": "closed",
"iid": 17
}
}
def test_bad_signature(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e"
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "badbadbad")
data = {}
response = client.post(url, json.dumps(data), content_type="application/json")
response_content = response.data
assert response.status_code == 400
assert "Bad signature" in response_content["_error_message"]
def test_ok_signature(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["111.111.111.111"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.111")
assert response.status_code == 204
def test_ok_empty_payload(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["111.111.111.111"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
response = client.post(url, "null", content_type="application/json",
REMOTE_ADDR="111.111.111.111")
assert response.status_code == 204
def test_ok_signature_ip_in_network(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["111.111.111.0/24"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url, json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.112")
assert response.status_code == 204
def test_ok_signature_invalid_network(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["131.103.20.160/27;165.254.145.0/26;104.192.143.0/24"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = json.dumps({"push": {"changes": [{"new": {"target": { "message": "test message"}}}]}})
response = client.post(url,
data,
content_type="application/json",
HTTP_X_EVENT_KEY="repo:push",
REMOTE_ADDR="104.192.143.193")
assert response.status_code == 400
assert "Bad signature" in response.data["_error_message"]
def test_blocked_project(client):
project = f.ProjectFactory(blocked_code=project_choices.BLOCKED_BY_STAFF)
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["111.111.111.111"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.111")
assert response.status_code == 451
def test_invalid_ip(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["111.111.111.111"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.112")
assert response.status_code == 400
def test_invalid_origin_ip_settings(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["testing"]
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.112")
assert response.status_code == 400
def test_valid_local_network_ip(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": ["192.168.1.1"],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="192.168.1.1")
assert response.status_code == 204
def test_not_ip_filter(client):
project = f.ProjectFactory()
f.ProjectModulesConfigFactory(project=project, config={
"gitlab": {
"secret": "tpnIwJDz4e",
"valid_origin_ips": [],
}
})
url = reverse("gitlab-hook-list")
url = "{}?project={}&key={}".format(url, project.id, "tpnIwJDz4e")
data = {"test:": "data"}
response = client.post(url,
json.dumps(data),
content_type="application/json",
REMOTE_ADDR="111.111.111.111")
assert response.status_code == 204
def test_push_event_detected(client):
project = f.ProjectFactory()
url = reverse("gitlab-hook-list")
url = "%s?project=%s" % (url, project.id)
data = deepcopy(push_base_payload)
data["commits"] = [{
"message": "test message",
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
data["total_commits_count"] = 1
GitLabViewSet._validate_signature = mock.Mock(return_value=True)
with mock.patch.object(event_hooks.PushEventHook, "process_event") as process_event_mock:
response = client.post(url, json.dumps(data),
HTTP_X_GITHUB_EVENT="push",
content_type="application/json")
assert process_event_mock.call_count == 1
assert response.status_code == 204
def test_push_event_epic_processing(client):
creation_status = f.EpicStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_epics"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.EpicStatusFactory(project=creation_status.project)
epic = f.EpicFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s #%s ok
bye!
""" % (epic.ref, new_status.slug),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook = event_hooks.PushEventHook(epic.project, payload)
ev_hook.process_event()
epic = Epic.objects.get(id=epic.id)
assert epic.status.id == new_status.id
assert len(mail.outbox) == 1
def test_push_event_issue_processing(client):
creation_status = f.IssueStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_issues"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.IssueStatusFactory(project=creation_status.project)
issue = f.IssueFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s #%s ok
bye!
""" % (issue.ref, new_status.slug),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook = event_hooks.PushEventHook(issue.project, payload)
ev_hook.process_event()
issue = Issue.objects.get(id=issue.id)
assert issue.status.id == new_status.id
assert len(mail.outbox) == 1
def test_push_event_task_processing(client):
creation_status = f.TaskStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_tasks"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.TaskStatusFactory(project=creation_status.project)
task = f.TaskFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s #%s ok
bye!
""" % (task.ref, new_status.slug),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook = event_hooks.PushEventHook(task.project, payload)
ev_hook.process_event()
task = Task.objects.get(id=task.id)
assert task.status.id == new_status.id
assert len(mail.outbox) == 1
def test_push_event_user_story_processing(client):
creation_status = f.UserStoryStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_us"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.UserStoryStatusFactory(project=creation_status.project)
user_story = f.UserStoryFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s #%s ok
bye!
""" % (user_story.ref, new_status.slug),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook = event_hooks.PushEventHook(user_story.project, payload)
ev_hook.process_event()
user_story = UserStory.objects.get(id=user_story.id)
assert user_story.status.id == new_status.id
assert len(mail.outbox) == 1
def test_push_event_issue_mention(client):
creation_status = f.IssueStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_issues"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
issue = f.IssueFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
take_snapshot(issue, user=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s ok
bye!
""" % (issue.ref),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
mail.outbox = []
ev_hook = event_hooks.PushEventHook(issue.project, payload)
ev_hook.process_event()
issue_history = get_history_queryset_by_model_instance(issue)
assert issue_history.count() == 1
assert issue_history[0].comment.startswith("This issue has been mentioned by")
assert len(mail.outbox) == 1
def test_push_event_task_mention(client):
creation_status = f.TaskStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_tasks"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
task = f.TaskFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
take_snapshot(task, user=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s ok
bye!
""" % (task.ref),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
mail.outbox = []
ev_hook = event_hooks.PushEventHook(task.project, payload)
ev_hook.process_event()
task_history = get_history_queryset_by_model_instance(task)
assert task_history.count() == 1
assert task_history[0].comment.startswith("This task has been mentioned by")
assert len(mail.outbox) == 1
def test_push_event_user_story_mention(client):
creation_status = f.UserStoryStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_us"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
user_story = f.UserStoryFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
take_snapshot(user_story, user=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s ok
bye!
""" % (user_story.ref),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
mail.outbox = []
ev_hook = event_hooks.PushEventHook(user_story.project, payload)
ev_hook.process_event()
us_history = get_history_queryset_by_model_instance(user_story)
assert us_history.count() == 1
assert us_history[0].comment.startswith("This user story has been mentioned by")
assert len(mail.outbox) == 1
def test_push_event_multiple_actions(client):
creation_status = f.IssueStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_issues"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.IssueStatusFactory(project=creation_status.project)
issue1 = f.IssueFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
issue2 = f.IssueFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test TG-%s #%s ok
test TG-%s #%s ok
bye!
""" % (issue1.ref, new_status.slug, issue2.ref, new_status.slug),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook1 = event_hooks.PushEventHook(issue1.project, payload)
ev_hook1.process_event()
issue1 = Issue.objects.get(id=issue1.id)
issue2 = Issue.objects.get(id=issue2.id)
assert issue1.status.id == new_status.id
assert issue2.status.id == new_status.id
assert len(mail.outbox) == 2
def test_push_event_processing_case_insensitive(client):
creation_status = f.TaskStatusFactory()
role = f.RoleFactory(project=creation_status.project, permissions=["view_tasks"])
f.MembershipFactory(project=creation_status.project, role=role, user=creation_status.project.owner)
new_status = f.TaskStatusFactory(project=creation_status.project)
task = f.TaskFactory.create(status=creation_status, project=creation_status.project, owner=creation_status.project.owner)
payload = deepcopy(push_base_payload)
payload["commits"] = [{
"message": """test message
test tg-%s #%s ok
bye!
""" % (task.ref, new_status.slug.upper()),
"id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
"url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
}]
payload["total_commits_count"] = 1
mail.outbox = []
ev_hook = event_hooks.PushEventHook(task.project, payload)
ev_hook.process_event()
task = Task.objects.get(id=task.id)
assert task.status.id == new_status.id
assert len(mail.outbox) == 1

def test_push_event_task_bad_processing_non_existing_ref(client):
    issue_status = f.IssueStatusFactory()

    payload = deepcopy(push_base_payload)
    payload["commits"] = [{
        "message": """test message
test TG-6666666 #%s ok
bye!
""" % (issue_status.slug),
        "id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
        "url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
    }]
    payload["total_commits_count"] = 1
    mail.outbox = []

    ev_hook = event_hooks.PushEventHook(issue_status.project, payload)

    with pytest.raises(ActionSyntaxException) as excinfo:
        ev_hook.process_event()

    assert str(excinfo.value) == "The referenced element doesn't exist"
    assert len(mail.outbox) == 0

def test_push_event_us_bad_processing_non_existing_status(client):
    user_story = f.UserStoryFactory.create()

    payload = deepcopy(push_base_payload)
    payload["commits"] = [{
        "message": """test message
test TG-%s #non-existing-slug ok
bye!
""" % (user_story.ref),
        "id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
        "url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
    }]
    payload["total_commits_count"] = 1
    mail.outbox = []

    ev_hook = event_hooks.PushEventHook(user_story.project, payload)

    with pytest.raises(ActionSyntaxException) as excinfo:
        ev_hook.process_event()

    assert str(excinfo.value) == "The status doesn't exist"
    assert len(mail.outbox) == 0

def test_push_event_bad_processing_non_existing_status(client):
    issue = f.IssueFactory.create()

    payload = deepcopy(push_base_payload)
    payload["commits"] = [{
        "message": """test message
test TG-%s #non-existing-slug ok
bye!
""" % (issue.ref),
        "id": "b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
        "url": "http://example.com/mike/diaspora/commit/b6568db1bc1dcd7f8b4d5a946b0b91f9dacd7327",
    }]
    payload["total_commits_count"] = 1
    mail.outbox = []

    ev_hook = event_hooks.PushEventHook(issue.project, payload)

    with pytest.raises(ActionSyntaxException) as excinfo:
        ev_hook.process_event()

    assert str(excinfo.value) == "The status doesn't exist"
    assert len(mail.outbox) == 0

def test_issues_event_opened_issue(client):
    issue = f.IssueFactory.create()
    issue.project.default_issue_status = issue.status
    issue.project.default_issue_type = issue.type
    issue.project.default_severity = issue.severity
    issue.project.default_priority = issue.priority
    issue.project.save()

    Membership.objects.create(user=issue.owner, project=issue.project, role=f.RoleFactory.create(project=issue.project), is_admin=True)
    notify_policy = NotifyPolicy.objects.get(user=issue.owner, project=issue.project)
    notify_policy.notify_level = NotifyLevel.all
    notify_policy.save()

    payload = deepcopy(new_issue_base_payload)
    payload["object_attributes"]["title"] = "test-title"
    payload["object_attributes"]["description"] = "test-body"
    payload["object_attributes"]["url"] = "http://gitlab.com/test/project/issues/11"
    payload["object_attributes"]["action"] = "open"
    payload["repository"]["homepage"] = "test"

    mail.outbox = []

    ev_hook = event_hooks.IssuesEventHook(issue.project, payload)
    ev_hook.process_event()

    assert Issue.objects.count() == 2
    assert len(mail.outbox) == 1

def test_issues_event_other_than_opened_issue(client):
    issue = f.IssueFactory.create()
    issue.project.default_issue_status = issue.status
    issue.project.default_issue_type = issue.type
    issue.project.default_severity = issue.severity
    issue.project.default_priority = issue.priority
    issue.project.save()

    payload = deepcopy(new_issue_base_payload)
    payload["object_attributes"]["title"] = "test-title"
    payload["object_attributes"]["description"] = "test-body"
    payload["object_attributes"]["url"] = "http://gitlab.com/test/project/issues/11"
    payload["object_attributes"]["action"] = "update"
    payload["repository"]["homepage"] = "test"

    mail.outbox = []

    ev_hook = event_hooks.IssuesEventHook(issue.project, payload)
    ev_hook.process_event()

    assert Issue.objects.count() == 1
    assert len(mail.outbox) == 0

def test_issues_event_bad_issue(client):
    issue = f.IssueFactory.create()
    issue.project.default_issue_status = issue.status
    issue.project.default_issue_type = issue.type
    issue.project.default_severity = issue.severity
    issue.project.default_priority = issue.priority
    issue.project.save()

    payload = deepcopy(new_issue_base_payload)
    del payload["object_attributes"]["title"]
    del payload["object_attributes"]["description"]
    del payload["object_attributes"]["url"]
    payload["object_attributes"]["action"] = "open"
    payload["repository"]["homepage"] = "test"

    mail.outbox = []

    ev_hook = event_hooks.IssuesEventHook(issue.project, payload)

    with pytest.raises(ActionSyntaxException) as excinfo:
        ev_hook.process_event()

    assert str(excinfo.value) == "Invalid issue information"
    assert Issue.objects.count() == 1
    assert len(mail.outbox) == 0

def test_issue_comment_event_on_existing_issue_task_and_us(client):
    project = f.ProjectFactory()
    role = f.RoleFactory(project=project, permissions=["view_tasks", "view_issues", "view_us"])
    f.MembershipFactory(project=project, role=role, user=project.owner)
    user = f.UserFactory()

    issue = f.IssueFactory.create(external_reference=["gitlab", "http://gitlab.com/test/project/issues/11"], owner=project.owner, project=project)
    take_snapshot(issue, user=user)
    task = f.TaskFactory.create(external_reference=["gitlab", "http://gitlab.com/test/project/issues/11"], owner=project.owner, project=project)
    take_snapshot(task, user=user)
    us = f.UserStoryFactory.create(external_reference=["gitlab", "http://gitlab.com/test/project/issues/11"], owner=project.owner, project=project)
    take_snapshot(us, user=user)

    payload = deepcopy(issue_comment_base_payload)
    payload["user"]["username"] = "test"
    payload["issue"]["iid"] = "11"
    payload["issue"]["title"] = "test-title"
    payload["object_attributes"]["noteable_type"] = "Issue"
    payload["object_attributes"]["note"] = "Test body"
    payload["repository"]["homepage"] = "http://gitlab.com/test/project"

    mail.outbox = []

    assert get_history_queryset_by_model_instance(issue).count() == 0
    assert get_history_queryset_by_model_instance(task).count() == 0
    assert get_history_queryset_by_model_instance(us).count() == 0

    ev_hook = event_hooks.IssueCommentEventHook(issue.project, payload)
    ev_hook.process_event()

    issue_history = get_history_queryset_by_model_instance(issue)
    assert issue_history.count() == 1
    assert "Test body" in issue_history[0].comment

    task_history = get_history_queryset_by_model_instance(task)
    assert task_history.count() == 1
    assert "Test body" in task_history[0].comment

    us_history = get_history_queryset_by_model_instance(us)
    assert us_history.count() == 1
    assert "Test body" in us_history[0].comment

    assert len(mail.outbox) == 3

def test_issue_comment_event_on_not_existing_issue_task_and_us(client):
    issue = f.IssueFactory.create(external_reference=["gitlab", "10"])
    take_snapshot(issue, user=issue.owner)
    task = f.TaskFactory.create(project=issue.project, external_reference=["gitlab", "10"])
    take_snapshot(task, user=task.owner)
    us = f.UserStoryFactory.create(project=issue.project, external_reference=["gitlab", "10"])
    take_snapshot(us, user=us.owner)

    payload = deepcopy(issue_comment_base_payload)
    payload["user"]["username"] = "test"
    payload["issue"]["iid"] = "99999"
    payload["issue"]["title"] = "test-title"
    payload["object_attributes"]["noteable_type"] = "Issue"
    payload["object_attributes"]["note"] = "test comment"
    payload["repository"]["homepage"] = "test"

    mail.outbox = []

    assert get_history_queryset_by_model_instance(issue).count() == 0
    assert get_history_queryset_by_model_instance(task).count() == 0
    assert get_history_queryset_by_model_instance(us).count() == 0

    ev_hook = event_hooks.IssueCommentEventHook(issue.project, payload)
    ev_hook.process_event()

    assert get_history_queryset_by_model_instance(issue).count() == 0
    assert get_history_queryset_by_model_instance(task).count() == 0
    assert get_history_queryset_by_model_instance(us).count() == 0
    assert len(mail.outbox) == 0

def test_issues_event_bad_comment(client):
    issue = f.IssueFactory.create(external_reference=["gitlab", "10"])
    take_snapshot(issue, user=issue.owner)

    payload = deepcopy(issue_comment_base_payload)
    payload["user"]["username"] = "test"
    payload["issue"]["iid"] = "10"
    payload["issue"]["title"] = "test-title"
    payload["object_attributes"]["noteable_type"] = "Issue"
    del payload["object_attributes"]["note"]
    payload["repository"]["homepage"] = "test"

    ev_hook = event_hooks.IssueCommentEventHook(issue.project, payload)
    mail.outbox = []

    with pytest.raises(ActionSyntaxException) as excinfo:
        ev_hook.process_event()

    assert str(excinfo.value) == "Invalid issue comment information"
    assert Issue.objects.count() == 1
    assert len(mail.outbox) == 0

def test_api_get_project_modules(client):
    project = f.create_project()
    f.MembershipFactory(project=project, user=project.owner, is_admin=True)

    url = reverse("projects-modules", args=(project.id,))

    client.login(project.owner)
    response = client.get(url)
    assert response.status_code == 200
    content = response.data
    assert "gitlab" in content
    assert content["gitlab"]["secret"] != ""
    assert content["gitlab"]["webhooks_url"] != ""

def test_api_patch_project_modules(client):
    project = f.create_project()
    f.MembershipFactory(project=project, user=project.owner, is_admin=True)

    url = reverse("projects-modules", args=(project.id,))

    client.login(project.owner)
    data = {
        "gitlab": {
            "secret": "test_secret",
            "url": "test_url",
        }
    }
    response = client.patch(url, json.dumps(data), content_type="application/json")
    assert response.status_code == 204

    config = services.get_modules_config(project).config
    assert "gitlab" in config
    assert config["gitlab"]["secret"] == "test_secret"
    assert config["gitlab"]["webhooks_url"] != "test_url"

def test_replace_gitlab_references():
    ev_hook = event_hooks.BaseGitLabEventHook
    assert ev_hook.replace_gitlab_references(None, "project-url", "#2") == "[GitLab#2](project-url/issues/2)"
    assert ev_hook.replace_gitlab_references(None, "project-url", "#2 ") == "[GitLab#2](project-url/issues/2) "
    assert ev_hook.replace_gitlab_references(None, "project-url", " #2 ") == " [GitLab#2](project-url/issues/2) "
    assert ev_hook.replace_gitlab_references(None, "project-url", " #2") == " [GitLab#2](project-url/issues/2)"
    assert ev_hook.replace_gitlab_references(None, "project-url", "#test") == "#test"
    assert ev_hook.replace_gitlab_references(None, "project-url", None) == ""
| 38.503106 | 148 | 0.674033 | 4,315 | 37,194 | 5.632213 | 0.103824 | 0.043205 | 0.05703 | 0.039172 | 0.777846 | 0.758384 | 0.729293 | 0.716949 | 0.698556 | 0.677365 | 0 | 0.048428 | 0.185003 | 37,194 | 965 | 149 | 38.543005 | 0.753307 | 0.025676 | 0 | 0.625156 | 0 | 0.002491 | 0.287217 | 0.040144 | 0 | 0 | 0 | 0 | 0.103362 | 1 | 0.039851 | false | 0 | 0.024907 | 0 | 0.064757 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
b437c1b4233eb991a4302ae2b36a6531c92e9041 | 83 | py | Python | statusmsg/movement.py | bmoscon/statusmsg | 1418c87503a9092201de53a11505f29bf3e83d5b | [
"MIT"
] | 2 | 2020-03-25T05:07:26.000Z | 2020-03-25T05:51:23.000Z | statusmsg/movement.py | bmoscon/statusmsg | 1418c87503a9092201de53a11505f29bf3e83d5b | [
"MIT"
] | null | null | null | statusmsg/movement.py | bmoscon/statusmsg | 1418c87503a9092201de53a11505f29bf3e83d5b | [
"MIT"
] | 1 | 2020-03-25T03:55:30.000Z | 2020-03-25T03:55:30.000Z | UP = u"\u001b[{}A"
DOWN = u"\u001b[{}B"
RIGHT = u"\u001b[{}C"
LEFT = u"\u001b[{}D"
| 16.6 | 21 | 0.518072 | 16 | 83 | 2.6875 | 0.625 | 0.55814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169014 | 0.144578 | 83 | 4 | 22 | 20.75 | 0.43662 | 0 | 0 | 0 | 0 | 0 | 0.481928 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c30c0290eedbadf9d5fe6c049917f1bd37fc880e | 806 | py | Python | test/apps/minermedic/resources/mining_pools/ethermine_etc.py | holitics/minermedic | 39dd03e58d665bea23f1e9c9ab134a794d092cb1 | [
"Apache-2.0"
] | 2 | 2020-02-13T15:32:43.000Z | 2020-04-08T04:10:10.000Z | test/apps/minermedic/resources/mining_pools/ethermine_etc.py | alimogh/minermedic | 39dd03e58d665bea23f1e9c9ab134a794d092cb1 | [
"Apache-2.0"
] | 11 | 2019-11-23T00:20:23.000Z | 2020-01-02T02:17:55.000Z | test/apps/minermedic/resources/mining_pools/ethermine_etc.py | alimogh/minermedic | 39dd03e58d665bea23f1e9c9ab134a794d092cb1 | [
"Apache-2.0"
] | 2 | 2020-06-15T22:32:43.000Z | 2020-07-17T18:40:58.000Z | routes = {
    '/data/price?fsym=ETC&tsyms=USD': {'USD': 4.88},
    '/miner/:2da4e946c0ee6977bc44fbba9019b3931952cfff/worker/:holitics1/currentStats': {"status":"OK","data":{"time":1553352000,"lastSeen":1553351963,"reportedHashrate":314146860,"currentHashrate":188888888.8888889,"validShares":170,"invalidShares":0,"staleShares":0,"averageHashrate":112500000}},
    '/miner/:2da4e946c0ee6977bc44fbba9019b3931952cfff/currentStats': {"status":"OK","data":{"time":1553352000,"lastSeen":1553351963,"reportedHashrate":314146860,"currentHashrate":188888888.8888889,"validShares":170,"invalidShares":0,"staleShares":0,"averageHashrate":112500000,"activeWorkers":1,"unpaid":67943930585283656,"unconfirmed":"null","coinsPerMin":0.0002075227774157812,"usdPerMin":0.00101478638156317,"btcPerMin":2.5317778844725305e-7}}
} | 161.2 | 444 | 0.782878 | 74 | 806 | 8.527027 | 0.621622 | 0.142631 | 0.063391 | 0.07607 | 0.557845 | 0.557845 | 0.557845 | 0.557845 | 0.557845 | 0.557845 | 0 | 0.307107 | 0.022333 | 806 | 5 | 445 | 161.2 | 0.493655 | 0 | 0 | 0 | 0 | 0 | 0.552664 | 0.210657 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c3349a27f06fd92494a7b97c7f209f0da24cc31a | 128 | py | Python | Darlington/phase1/python Basic 1/day 11 solution/qtn3.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 6 | 2020-05-23T19:53:25.000Z | 2021-05-08T20:21:30.000Z | Darlington/phase1/python Basic 1/day 11 solution/qtn3.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 8 | 2020-05-14T18:53:12.000Z | 2020-07-03T00:06:20.000Z | Darlington/phase1/python Basic 1/day 11 solution/qtn3.py | CodedLadiesInnovateTech/-python-challenge-solutions | 430cd3eb84a2905a286819eef384ee484d8eb9e7 | [
"MIT"
] | 39 | 2020-05-10T20:55:02.000Z | 2020-09-12T17:40:59.000Z | # program to get the identity of an object
import sys
obj1 = object()
obj1_address = id(obj1)
print()
print(obj1_address)
print() | 21.333333 | 57 | 0.757813 | 21 | 128 | 4.52381 | 0.666667 | 0.231579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036036 | 0.132813 | 128 | 6 | 58 | 21.333333 | 0.81982 | 0.4375 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.6 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
c3549d47b1e67094a300675a8a737d848bffec56 | 3,966 | py | Python | authors/apps/authentication/tests/test_social_auth.py | andela/Ah-backend-guardians | cc84c18f7c222bc69cf4a263a1c2296b6d335c8b | [
"BSD-3-Clause"
] | null | null | null | authors/apps/authentication/tests/test_social_auth.py | andela/Ah-backend-guardians | cc84c18f7c222bc69cf4a263a1c2296b6d335c8b | [
"BSD-3-Clause"
] | 32 | 2019-01-09T07:52:32.000Z | 2022-01-13T01:01:55.000Z | authors/apps/authentication/tests/test_social_auth.py | andela/Ah-backend-guardians | cc84c18f7c222bc69cf4a263a1c2296b6d335c8b | [
"BSD-3-Clause"
] | 3 | 2019-01-03T12:05:53.000Z | 2019-09-24T11:41:14.000Z | from rest_framework.test import APITestCase
from .data import wrong_fb_token, fb_token, google_token, twitter_token
from rest_framework.views import status
from django.urls import reverse
from unittest.mock import patch


class TestSocialLogin(APITestCase):
    def setUp(self):
        self.fb_url = reverse("authentication:fb_login")
        self.google_url = reverse("authentication:google_login")
        self.twitter_url = reverse("authentication:twitter_login")

    @patch('facebook.GraphAPI.get_object')
    def test_fb_login_new(self, get_object):
        get_object.return_value = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        response = self.client.post(self.fb_url, fb_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    @patch('facebook.GraphAPI.get_object')
    def test_fb_login_return_user(self, get_object):
        get_object.return_value = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        self.client.post(self.fb_url, fb_token, format="json")
        response = self.client.post(self.fb_url, fb_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    @patch('facebook.GraphAPI', side_effect=Exception())
    def test_fb_login_wrong_token(self, GraphAPI):
        response = self.client.post(self.fb_url, wrong_fb_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    @patch('google.oauth2.id_token.verify_oauth2_token')
    def test_google_login_new_user(self, verify_oauth2_token):
        verify_oauth2_token.return_value = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        response = self.client.post(
            self.google_url, google_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    @patch('google.oauth2.id_token.verify_oauth2_token')
    def test_google_login_return_user(self, verify_oauth2_token):
        verify_oauth2_token.return_value = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        self.client.post(self.google_url, google_token, format="json")
        response = self.client.post(
            self.google_url, google_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    @patch('google.oauth2.id_token.verify_oauth2_token', side_effect=Exception())
    def test_google_login_wrong_token(self, verify_oauth2_token):
        response = self.client.post(
            self.google_url, google_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)

    @patch('twitter.Api.VerifyCredentials')
    def test_twitter_login_new_user(self, VerifyCredentials):
        VerifyCredentials.return_value.__dict__ = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        response = self.client.post(
            self.twitter_url, twitter_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)

    @patch('twitter.Api.VerifyCredentials')
    def test_twitter_login_return_user(self, VerifyCredentials):
        VerifyCredentials.return_value.__dict__ = dict(
            email="someemail@example.com",
            name="Author's Haven"
        )
        self.client.post(self.twitter_url, twitter_token, format="json")
        response = self.client.post(
            self.twitter_url, twitter_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_200_OK)

    @patch('twitter.Api.VerifyCredentials', side_effect=Exception())
    def test_twitter_login_wrong_token(self, VerifyCredentials):
        response = self.client.post(self.twitter_url, fb_token, format="json")
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
c381d4455cdc44a19c0c1e3b1893d49b9d498397 | 125 | py | Python | buuctf/19-diff/exp.py | RoderickChan/ctf_tasks | a021c6d86cade26448d099933f3caa856ed28360 | [
"MIT"
] | null | null | null | buuctf/19-diff/exp.py | RoderickChan/ctf_tasks | a021c6d86cade26448d099933f3caa856ed28360 | [
"MIT"
] | null | null | null | buuctf/19-diff/exp.py | RoderickChan/ctf_tasks | a021c6d86cade26448d099933f3caa856ed28360 | [
"MIT"
] | null | null | null | from pwn import *
sh = ssh(user='ctf', host='node3.buuoj.cn', port=25102, password='guest', level='debug')
sh.interactive() | 25 | 88 | 0.688 | 19 | 125 | 4.526316 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053571 | 0.104 | 125 | 5 | 89 | 25 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.333333 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
5edd1ab27ba5b4f789ed6757409d3efdb9438ef1 | 201 | py | Python | polyaxon/checks/logs.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | polyaxon/checks/logs.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | polyaxon/checks/logs.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | from checks.worker import WorkerCheck
from polyaxon.config_settings import LogsCeleryTasks


class LogsCheck(WorkerCheck):
    WORKER_HEALTH_TASK = LogsCeleryTasks.LOGS_HEALTH
    WORKER_NAME = 'LOGS'
| 25.125 | 52 | 0.820896 | 23 | 201 | 6.956522 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129353 | 201 | 7 | 53 | 28.714286 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0.019901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
5ee2f3c0cd1ce632622707ae901d53b5ee0b8c35 | 50 | py | Python | supervisor/plugins/__init__.py | pnjongang/supervisor | 2a006ae76de4b06e3e291b37aa2a4e14dc272445 | [
"Apache-2.0"
] | 597 | 2017-04-27T15:10:08.000Z | 2019-12-18T16:02:57.000Z | supervisor/plugins/__init__.py | BigElkHunter/cyberockit | fa7140fd9a5ee1316d103628f1f7f4c6db05b158 | [
"Apache-2.0"
] | 1,056 | 2020-01-30T09:59:44.000Z | 2022-03-31T10:15:32.000Z | supervisor/plugins/__init__.py | BigElkHunter/cyberockit | fa7140fd9a5ee1316d103628f1f7f4c6db05b158 | [
"Apache-2.0"
] | 295 | 2020-02-03T11:30:42.000Z | 2022-03-31T18:53:14.000Z | """Supervisor plugins to extend functionality."""
| 25 | 49 | 0.76 | 5 | 50 | 7.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 50 | 1 | 50 | 50 | 0.844444 | 0.86 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6f0019a57d5428d442b78defbb534d9e85b991a5 | 117 | py | Python | sample.py | Batake/wavegan | 8ba09b68717a29829c061083803b7d21f7004e19 | [
"MIT"
] | null | null | null | sample.py | Batake/wavegan | 8ba09b68717a29829c061083803b7d21f7004e19 | [
"MIT"
] | null | null | null | sample.py | Batake/wavegan | 8ba09b68717a29829c061083803b7d21f7004e19 | [
"MIT"
] | null | null | null | import tensorflow as tf
import numpy

# A plain Python list has no set_shape(); build a tensor and reshape it
# to the intended [2, 3, 2] shape instead (2 * 3 * 2 = 12 elements).
a = tf.constant([3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14])
a = tf.reshape(a, [2, 3, 2])
print(a)
| 16.714286 | 45 | 0.581197 | 27 | 117 | 2.481481 | 0.814815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21978 | 0.222222 | 117 | 6 | 46 | 19.5 | 0.516484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
6f07b1b43182a9afc92fcabcd43fe77cb308fced | 218 | py | Python | backend-project/small_eod/notes/views.py | kostyrko/small_eod | c12fe36f2233bdd297d32d61d9cd25ddbfca897b | [
"MIT"
] | 1 | 2021-02-10T19:58:51.000Z | 2021-02-10T19:58:51.000Z | backend-project/small_eod/notes/views.py | kostyrko/small_eod | c12fe36f2233bdd297d32d61d9cd25ddbfca897b | [
"MIT"
] | null | null | null | backend-project/small_eod/notes/views.py | kostyrko/small_eod | c12fe36f2233bdd297d32d61d9cd25ddbfca897b | [
"MIT"
] | 1 | 2021-02-10T19:38:35.000Z | 2021-02-10T19:38:35.000Z | from rest_framework import viewsets
from .models import Note
from .serializers import NoteSerializer


class NoteViewSet(viewsets.ModelViewSet):
    queryset = Note.objects.all()
    serializer_class = NoteSerializer
| 21.8 | 41 | 0.802752 | 24 | 218 | 7.208333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142202 | 218 | 9 | 42 | 24.222222 | 0.925134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
6f35634adb7d4297e95a4478cb429275ee04f57e | 8,199 | py | Python | signing_today_client/__init__.py | signingtoday/signingtoday-sdk-python | ed267279622fb59f2ad8fa289157fc9cdf9d8a5b | [
"MIT"
] | null | null | null | signing_today_client/__init__.py | signingtoday/signingtoday-sdk-python | ed267279622fb59f2ad8fa289157fc9cdf9d8a5b | [
"MIT"
] | null | null | null | signing_today_client/__init__.py | signingtoday/signingtoday-sdk-python | ed267279622fb59f2ad8fa289157fc9cdf9d8a5b | [
"MIT"
] | null | null | null | # coding: utf-8
# flake8: noqa
"""
Signing Today Web
*Signing Today* is the perfect Digital Signature Gateway. Whenever in Your workflow You need to add one or more Digital Signatures to Your document, *Signing Today* is the right choice. You prepare Your documents, *Signing Today* takes care of all the rest: send invitations (`signature tickets`) to signers, collects their signatures, send You back the signed document. Integrating *Signing Today* in Your existing applications is very easy. Just follow these API specifications and get inspired by the many examples presented hereafter. # noqa: E501
The version of the OpenAPI document: 2.0.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
__version__ = "1.0.0"
# import apis into sdk package
from signing_today_client.api.backoffice_api import BackofficeApi
from signing_today_client.api.dst_note_api import DSTNoteApi
from signing_today_client.api.devices_api import DevicesApi
from signing_today_client.api.digital_signature_transactions_api import DigitalSignatureTransactionsApi
from signing_today_client.api.notifications_api import NotificationsApi
from signing_today_client.api.resources_api import ResourcesApi
from signing_today_client.api.robot_api import RobotApi
from signing_today_client.api.robots_api import RobotsApi
from signing_today_client.api.services_api import ServicesApi
from signing_today_client.api.signing_services_api import SigningServicesApi
from signing_today_client.api.users_api import UsersApi
from signing_today_client.api.bit4id_pathgroup_dst_note_api import Bit4idPathgroupDSTNoteApi
from signing_today_client.api.bit4id_pathgroup_devices_api import Bit4idPathgroupDevicesApi
from signing_today_client.api.bit4id_pathgroup_digital_signature_transactions_api import Bit4idPathgroupDigitalSignatureTransactionsApi
from signing_today_client.api.bit4id_pathgroup_notifications_api import Bit4idPathgroupNotificationsApi
from signing_today_client.api.bit4id_pathgroup_resources_api import Bit4idPathgroupResourcesApi
from signing_today_client.api.bit4id_pathgroup_robots_api import Bit4idPathgroupRobotsApi
from signing_today_client.api.bit4id_pathgroup_services_api import Bit4idPathgroupServicesApi
from signing_today_client.api.bit4id_pathgroup_users_api import Bit4idPathgroupUsersApi
# import ApiClient
from signing_today_client.api_client import ApiClient
from signing_today_client.configuration import Configuration
from signing_today_client.exceptions import OpenApiException
from signing_today_client.exceptions import ApiTypeError
from signing_today_client.exceptions import ApiValueError
from signing_today_client.exceptions import ApiKeyError
from signing_today_client.exceptions import ApiException
# import models into sdk package
from signing_today_client.models.alfresco_sync import AlfrescoSync
from signing_today_client.models.audit_record import AuditRecord
from signing_today_client.models.auth_credential import AuthCredential
from signing_today_client.models.create_digital_signature_transaction import CreateDigitalSignatureTransaction
from signing_today_client.models.create_document import CreateDocument
from signing_today_client.models.create_document_resource import CreateDocumentResource
from signing_today_client.models.create_document_source import CreateDocumentSource
from signing_today_client.models.create_user_request import CreateUserRequest
from signing_today_client.models.dst_note import DSTNote
from signing_today_client.models.dst_signing_address_response import DSTSigningAddressResponse
from signing_today_client.models.dst_status_changed_notification import DSTStatusChangedNotification
from signing_today_client.models.ds_ts_get_response import DSTsGetResponse
from signing_today_client.models.device_authorization_response import DeviceAuthorizationResponse
from signing_today_client.models.digital_signature_transaction import DigitalSignatureTransaction
from signing_today_client.models.document import Document
from signing_today_client.models.error_response import ErrorResponse
from signing_today_client.models.fillable_form import FillableForm
from signing_today_client.models.identity import Identity
from signing_today_client.models.identity_provider_data import IdentityProviderData
from signing_today_client.models.identity_provider_data_token_info import IdentityProviderDataTokenInfo
from signing_today_client.models.inline_object import InlineObject
from signing_today_client.models.inline_object1 import InlineObject1
from signing_today_client.models.inline_object2 import InlineObject2
from signing_today_client.models.inline_object3 import InlineObject3
from signing_today_client.models.inline_object4 import InlineObject4
from signing_today_client.models.inline_object5 import InlineObject5
from signing_today_client.models.inline_object6 import InlineObject6
from signing_today_client.models.inline_object7 import InlineObject7
from signing_today_client.models.inline_object8 import InlineObject8
from signing_today_client.models.inline_object9 import InlineObject9
from signing_today_client.models.inline_response200 import InlineResponse200
from signing_today_client.models.instantiate_dst_template import InstantiateDSTTemplate
from signing_today_client.models.lf_resource import LFResource
from signing_today_client.models.notification_event import NotificationEvent
from signing_today_client.models.notifications_response import NotificationsResponse
from signing_today_client.models.organization import Organization
from signing_today_client.models.organization_private_settings import OrganizationPrivateSettings
from signing_today_client.models.organization_public_settings import OrganizationPublicSettings
from signing_today_client.models.organization_settings import OrganizationSettings
from signing_today_client.models.organization_settings_alfresco_properties import OrganizationSettingsAlfrescoProperties
from signing_today_client.models.organizations_get_response import OrganizationsGetResponse
from signing_today_client.models.robot_authentication_token import RobotAuthenticationToken
from signing_today_client.models.robot_configuration import RobotConfiguration
from signing_today_client.models.robot_configuration_authentication import RobotConfigurationAuthentication
from signing_today_client.models.robot_configuration_webhooks import RobotConfigurationWebhooks
from signing_today_client.models.robot_id_instantiate_roles_mapping import RobotIdInstantiateRolesMapping
from signing_today_client.models.saml_token import SAMLToken
from signing_today_client.models.saml_token_edu_person_targeted_id import SAMLTokenEduPersonTargetedID
from signing_today_client.models.service_failure_response import ServiceFailureResponse
from signing_today_client.models.signature import Signature
from signing_today_client.models.signature_request import SignatureRequest
from signing_today_client.models.signature_restriction import SignatureRestriction
from signing_today_client.models.signature_status_changed_notification import SignatureStatusChangedNotification
from signing_today_client.models.signature_status_changed_notification_document import SignatureStatusChangedNotificationDocument
from signing_today_client.models.signature_status_changed_notification_dst import SignatureStatusChangedNotificationDst
from signing_today_client.models.signer import Signer
from signing_today_client.models.signer_instance import SignerInstance
from signing_today_client.models.signer_record import SignerRecord
from signing_today_client.models.signers_group import SignersGroup
from signing_today_client.models.trusted_device import TrustedDevice
from signing_today_client.models.trusted_devices_get_response import TrustedDevicesGetResponse
from signing_today_client.models.user import User
from signing_today_client.models.user_group import UserGroup
from signing_today_client.models.user_group_get_response import UserGroupGetResponse
from signing_today_client.models.user_sync_report import UserSyncReport
from signing_today_client.models.user_sync_report_users import UserSyncReportUsers
from signing_today_client.models.users_get_response import UsersGetResponse
| 70.076923 | 557 | 0.90755 | 1,018 | 8,199 | 6.966601 | 0.246562 | 0.165821 | 0.209814 | 0.288494 | 0.477439 | 0.366892 | 0.178511 | 0.051607 | 0.026227 | 0 | 0 | 0.006648 | 0.064276 | 8,199 | 116 | 558 | 70.681034 | 0.917753 | 0.093426 | 0 | 0 | 0 | 0 | 0.000675 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.989474 | 0 | 0.989474 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
6f4015d6ea8eccc96fd691f2cd556d5572154186 | 107 | py | Python | crabageprediction/venv/Lib/site-packages/matplotlib/backends/__init__.py | 13rianlucero/CrabAgePrediction | 92bc7fbe1040f49e820473e33cc3902a5a7177c7 | [
"MIT"
] | 603 | 2020-12-23T13:49:32.000Z | 2022-03-31T23:38:03.000Z | crabageprediction/venv/Lib/site-packages/matplotlib/backends/__init__.py | 13rianlucero/CrabAgePrediction | 92bc7fbe1040f49e820473e33cc3902a5a7177c7 | [
"MIT"
] | 387 | 2020-12-15T14:54:04.000Z | 2022-03-31T07:00:21.000Z | crabageprediction/venv/Lib/site-packages/matplotlib/backends/__init__.py | 13rianlucero/CrabAgePrediction | 92bc7fbe1040f49e820473e33cc3902a5a7177c7 | [
"MIT"
] | 35 | 2021-03-26T03:12:04.000Z | 2022-03-23T10:15:10.000Z | # NOTE: plt.switch_backend() (called at import time) will add a "backend"
# attribute here for backcompat.
| 35.666667 | 73 | 0.747664 | 16 | 107 | 4.9375 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149533 | 107 | 2 | 74 | 53.5 | 0.868132 | 0.953271 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6f69abc29e15677f3ac4fe0579559b5b487bc0a0 | 1,160 | py | Python | account/migrations/0002_auto_20210805_2019.py | Gandabh/E-commerce-Site | 8bc4ca85c9cd6f3ed1435e5767aef4ab315df559 | [
"MIT"
] | 1 | 2022-01-01T21:46:48.000Z | 2022-01-01T21:46:48.000Z | account/migrations/0002_auto_20210805_2019.py | Gandabh/E-commerce-Site | 8bc4ca85c9cd6f3ed1435e5767aef4ab315df559 | [
"MIT"
] | null | null | null | account/migrations/0002_auto_20210805_2019.py | Gandabh/E-commerce-Site | 8bc4ca85c9cd6f3ed1435e5767aef4ab315df559 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.4 on 2021-08-05 20:19
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('account', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='checkout',
old_name='shopping_address',
new_name='shipping_address',
),
migrations.RenameField(
model_name='checkout',
old_name='shopping_city',
new_name='shipping_city',
),
migrations.RenameField(
model_name='checkout',
old_name='shopping_company',
new_name='shipping_company',
),
migrations.RenameField(
model_name='checkout',
old_name='shopping_country',
new_name='shipping_country',
),
migrations.RenameField(
model_name='checkout',
old_name='shopping_email',
new_name='shipping_email',
),
migrations.RenameField(
model_name='checkout',
old_name='shopping_phone',
new_name='shipping_phone',
),
]
| 26.363636 | 47 | 0.555172 | 104 | 1,160 | 5.894231 | 0.355769 | 0.205546 | 0.254486 | 0.293638 | 0.51876 | 0.51876 | 0.51876 | 0.51876 | 0 | 0 | 0 | 0.024837 | 0.340517 | 1,160 | 43 | 48 | 26.976744 | 0.776471 | 0.038793 | 0 | 0.486486 | 1 | 0 | 0.220126 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027027 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6f6a92b5138c79573f9781e8fbb9c922fdd9a951 | 150 | py | Python | src/datapane/runner/__init__.py | philopon/datapane | d7d69865d4def0cbe6eb334acd9edeb829dd67e6 | [
"Apache-2.0"
] | 481 | 2020-04-25T05:40:21.000Z | 2022-03-30T22:04:35.000Z | src/datapane/runner/__init__.py | tig/datapane | defae6776e73b07191c0a5804a50b284ec3c9a63 | [
"Apache-2.0"
] | 74 | 2020-04-28T10:47:35.000Z | 2022-03-14T15:50:55.000Z | src/datapane/runner/__init__.py | admariner/datapane | c440eaf07bd1c1f2de3ff952e0fd8c78d636aa8f | [
"Apache-2.0"
] | 41 | 2020-07-21T16:30:21.000Z | 2022-02-21T22:50:27.000Z | # Copyright 2020 StackHut Limited (trading as Datapane)
# SPDX-License-Identifier: Apache-2.0
import os
os.environ["DATAPANE_BY_DATAPANE"] = "true"
| 21.428571 | 55 | 0.766667 | 21 | 150 | 5.380952 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.12 | 150 | 6 | 56 | 25 | 0.810606 | 0.593333 | 0 | 0 | 0 | 0 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
4889998c94ff2f393e577e28645f3246160f9d0a | 95 | py | Python | network_machine_learning_in_python/_build/jupyter_execute/foundations/ch2/ch2.py | Laknath1996/graph-stats-book | 4b10c2f99dbfb5e05a72c98130f8c4338d7c9a21 | [
"MIT"
] | 10 | 2020-09-15T19:09:53.000Z | 2022-03-17T21:24:14.000Z | network_machine_learning_in_python/_build/jupyter_execute/foundations/ch2/ch2.py | Laknath1996/graph-stats-book | 4b10c2f99dbfb5e05a72c98130f8c4338d7c9a21 | [
"MIT"
] | 30 | 2020-09-15T19:15:11.000Z | 2022-03-10T15:33:24.000Z | network_machine_learning_in_python/_build/jupyter_execute/foundations/ch2/ch2.py | Laknath1996/graph-stats-book | 4b10c2f99dbfb5e05a72c98130f8c4338d7c9a21 | [
"MIT"
] | 2 | 2021-04-12T05:08:00.000Z | 2021-10-04T09:42:21.000Z | #!/usr/bin/env python
# coding: utf-8
# # End-to-end Biology Network Machine Learning Project
| 19 | 55 | 0.726316 | 15 | 95 | 4.6 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012346 | 0.147368 | 95 | 4 | 56 | 23.75 | 0.839506 | 0.915789 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
489464c8c1191fcfcd0df2ba333e56f27bcb9f40 | 345 | py | Python | machine_comprehension_baselines/Peter/feature_extractor_interface.py | recski/semantic_parsing_szte_bme | b775614ac33882a09c0cce94bb40bbd8744df336 | [
"MIT"
] | null | null | null | machine_comprehension_baselines/Peter/feature_extractor_interface.py | recski/semantic_parsing_szte_bme | b775614ac33882a09c0cce94bb40bbd8744df336 | [
"MIT"
] | null | null | null | machine_comprehension_baselines/Peter/feature_extractor_interface.py | recski/semantic_parsing_szte_bme | b775614ac33882a09c0cce94bb40bbd8744df336 | [
"MIT"
] | 2 | 2018-03-29T14:05:12.000Z | 2018-04-05T16:57:00.000Z | from feature_extractor import get_features_wordvec as gfw
from feature_extractor import get_features_bow as gfb
def get_features_bow(train, dev, outfile="features_bow.pickle"):
return gfb(train, dev, outfile)
def get_features_wordvec(train, dev, model_path, outfile="features_wv.pickle"):
return gfw(train, dev, model_path, outfile)
| 31.363636 | 79 | 0.797101 | 52 | 345 | 5.019231 | 0.384615 | 0.168582 | 0.153257 | 0.199234 | 0.467433 | 0.283525 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121739 | 345 | 10 | 80 | 34.5 | 0.861386 | 0 | 0 | 0 | 0 | 0 | 0.107246 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 4 |
48a3af11a2099b33aa24ede8460a661547932a4a | 184 | py | Python | pyf/_lcfirst.py | snoopyjc/pythonizer | 6b3683084f41f0aa06b1b4e652a0f00b19cceac1 | [
"Artistic-2.0"
] | 1 | 2022-03-13T22:08:25.000Z | 2022-03-13T22:08:25.000Z | pyf/_lcfirst.py | snoopyjc/pythonizer | 6b3683084f41f0aa06b1b4e652a0f00b19cceac1 | [
"Artistic-2.0"
] | 21 | 2022-03-17T16:53:04.000Z | 2022-03-31T23:55:24.000Z | pyf/_lcfirst.py | snoopyjc/pythonizer | 6b3683084f41f0aa06b1b4e652a0f00b19cceac1 | [
"Artistic-2.0"
] | null | null | null |
def _lcfirst(string):
"""Implementation of lcfirst and \l in interpolated strings: lowercase the first char of the given string"""
return string[0:1].lower() + string[1:]
| 36.8 | 113 | 0.695652 | 26 | 184 | 4.884615 | 0.730769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020134 | 0.190217 | 184 | 4 | 114 | 46 | 0.832215 | 0.554348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
48c1f8df48e36abc529fbc3430c006edd6e1bbab | 299 | py | Python | molsysmt/tools/string_aminoacids1/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/tools/string_aminoacids1/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/tools/string_aminoacids1/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | from .is_string_aminoacids1 import is_string_aminoacids1
from .to_file_fasta import to_file_fasta
from .to_file_pir import to_file_pir
from .to_string_aminoacids3 import to_string_aminoacids3
from .to_biopython_Seq import to_biopython_Seq
from .to_biopython_SeqRecord import to_biopython_SeqRecord
| 37.375 | 58 | 0.896321 | 48 | 299 | 5.083333 | 0.270833 | 0.122951 | 0.155738 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014599 | 0.083612 | 299 | 7 | 59 | 42.714286 | 0.875912 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
48cb3517940af0c6055eeeda75791bbf631123cb | 73 | py | Python | tests/__init__.py | abhi-parmar/phishing_detection | 7442ad7030ea986af0dabf72600dfb8ef16a6dfa | [
"MIT"
] | 5 | 2019-07-20T19:39:33.000Z | 2020-10-08T14:16:53.000Z | tests/__init__.py | abhi-parmar/phishing_detection | 7442ad7030ea986af0dabf72600dfb8ef16a6dfa | [
"MIT"
] | 6 | 2019-07-20T18:03:38.000Z | 2021-02-02T22:05:41.000Z | tests/__init__.py | abhi-parmar/phishing_detection | 7442ad7030ea986af0dabf72600dfb8ef16a6dfa | [
"MIT"
] | 1 | 2019-07-20T17:57:55.000Z | 2019-07-20T17:57:55.000Z | # -*- coding: utf-8 -*-
"""Unit test package for phishing_detection."""
| 18.25 | 47 | 0.630137 | 9 | 73 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016129 | 0.150685 | 73 | 3 | 48 | 24.333333 | 0.709677 | 0.876712 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
48fdbac5759068eb0d7164e1f9c8f82232902593 | 44 | py | Python | HARK/__init__.py | fangli-DX3906/HARK | a55d06f4e47e9564b3bcc2250c8d8012cc758761 | [
"Apache-2.0"
] | null | null | null | HARK/__init__.py | fangli-DX3906/HARK | a55d06f4e47e9564b3bcc2250c8d8012cc758761 | [
"Apache-2.0"
] | null | null | null | HARK/__init__.py | fangli-DX3906/HARK | a55d06f4e47e9564b3bcc2250c8d8012cc758761 | [
"Apache-2.0"
] | null | null | null | from .core import *
__version__ = "0.10.6"
| 11 | 22 | 0.659091 | 7 | 44 | 3.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0.181818 | 44 | 3 | 23 | 14.666667 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
5b0379501019e33fc3e4d3ef24fd717a1c458388 | 177 | py | Python | OpenDataCatalog/suggestions/forms.py | timwis/Open-Data-Catalog | 0ccdc71f28773508c337875fd32478dd4324a50c | [
"MIT"
] | 3 | 2016-08-07T17:25:56.000Z | 2019-11-12T00:51:14.000Z | suggestions/forms.py | opensandiego/Open-Data-Catalog | 06f93bab36d22431ff86a87faea4e388d0491846 | [
"MIT"
] | 1 | 2021-04-17T10:52:53.000Z | 2021-04-17T10:52:53.000Z | suggestions/forms.py | opensandiego/Open-Data-Catalog | 06f93bab36d22431ff86a87faea4e388d0491846 | [
"MIT"
] | 2 | 2016-10-28T14:20:27.000Z | 2021-04-17T10:52:28.000Z | from django import forms
from models import *
class SuggestionForm(forms.Form):
text = forms.CharField(widget=forms.Textarea(), max_length=255, label="My Nomination")
| 25.285714 | 90 | 0.745763 | 23 | 177 | 5.695652 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019868 | 0.146893 | 177 | 7 | 91 | 25.285714 | 0.847682 | 0 | 0 | 0 | 0 | 0 | 0.073034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
5b10c17d68462a6a3c8d1c991deebe207414cf8a | 183 | py | Python | Mundo 2/aula14/desafio1.py | mandamg/Exercicios-de-Python-do-Curso-em-Video | 3f818c11c3c10213bebc1dfb6a740adee468ea3a | [
"MIT"
] | null | null | null | Mundo 2/aula14/desafio1.py | mandamg/Exercicios-de-Python-do-Curso-em-Video | 3f818c11c3c10213bebc1dfb6a740adee468ea3a | [
"MIT"
] | null | null | null | Mundo 2/aula14/desafio1.py | mandamg/Exercicios-de-Python-do-Curso-em-Video | 3f818c11c3c10213bebc1dfb6a740adee468ea3a | [
"MIT"
] | null | null | null | sexo = input('Digite seu sexo[M/F]:').upper()
while sexo not in 'MmFf':
sexo = input('Digite novamente').upper()
idade = input('Digite sua idade:')
print('registrado com sucesso') | 36.6 | 45 | 0.68306 | 27 | 183 | 4.62963 | 0.666667 | 0.264 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136612 | 183 | 5 | 46 | 36.6 | 0.791139 | 0 | 0 | 0 | 0 | 0 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
d2a7e69486ef02d7e6f575978099b251d27f2793 | 129 | py | Python | tests/__main__.py | utdb/judged | 383c08df1ded265ff26c344e1df2f1a623ffe86d | [
"MIT"
] | 20 | 2015-12-26T12:58:14.000Z | 2021-07-31T21:34:00.000Z | tests/__main__.py | utdb/judged | 383c08df1ded265ff26c344e1df2f1a623ffe86d | [
"MIT"
] | null | null | null | tests/__main__.py | utdb/judged | 383c08df1ded265ff26c344e1df2f1a623ffe86d | [
"MIT"
] | 5 | 2016-12-22T18:49:48.000Z | 2020-12-29T07:47:58.000Z | #!/usr/bin/env python3.4
from tests.lawful import run_tests
def main():
run_tests()
if __name__ == '__main__':
main()
| 12.9 | 34 | 0.658915 | 19 | 129 | 3.947368 | 0.736842 | 0.213333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 0.193798 | 129 | 9 | 35 | 14.333333 | 0.701923 | 0.178295 | 0 | 0 | 0 | 0 | 0.07619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
d2b81dfc98fa74368c9b5b82eee79f3d69a63c12 | 112 | py | Python | job_scheduler/logging/__init__.py | konkolorado/job-scheduler | e76b24d0592d9d1f62b5a1525b6a152b9983b2fa | [
"MIT"
] | null | null | null | job_scheduler/logging/__init__.py | konkolorado/job-scheduler | e76b24d0592d9d1f62b5a1525b6a152b9983b2fa | [
"MIT"
] | null | null | null | job_scheduler/logging/__init__.py | konkolorado/job-scheduler | e76b24d0592d9d1f62b5a1525b6a152b9983b2fa | [
"MIT"
] | 1 | 2021-08-09T15:28:49.000Z | 2021-08-09T15:28:49.000Z | from job_scheduler.logging.main import logging_config, setup_logging
all = ["setup_logging", "logging_config"]
| 28 | 68 | 0.8125 | 15 | 112 | 5.733333 | 0.6 | 0.302326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089286 | 112 | 3 | 69 | 37.333333 | 0.843137 | 0 | 0 | 0 | 0 | 0 | 0.241071 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
d2dc1b96e9438dfa4003d16ae801702a9e076190 | 106 | py | Python | python/zp/primjer_17.08.py | jasarsoft/examples | d6fddfcb8c50c31fbfe170a3edd2b6c07890f13e | [
"MIT"
] | null | null | null | python/zp/primjer_17.08.py | jasarsoft/examples | d6fddfcb8c50c31fbfe170a3edd2b6c07890f13e | [
"MIT"
] | null | null | null | python/zp/primjer_17.08.py | jasarsoft/examples | d6fddfcb8c50c31fbfe170a3edd2b6c07890f13e | [
"MIT"
] | null | null | null | # assert nareba
mojalista = ['stvar']
assert len(mojalista) >= 1
print(mojalista.pop())
print(mojalista)
| 15.142857 | 26 | 0.716981 | 13 | 106 | 5.846154 | 0.615385 | 0.368421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010753 | 0.122642 | 106 | 6 | 27 | 17.666667 | 0.806452 | 0.122642 | 0 | 0 | 0 | 0 | 0.054945 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
d2f4df5b21e4fd04b50c45008a3f65aa35a4a280 | 126 | py | Python | terrascript/data/Sighery/njalla.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | terrascript/data/Sighery/njalla.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | terrascript/data/Sighery/njalla.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # terrascript/data/Sighery/njalla.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:22:37 UTC)
__all__ = []
| 21 | 73 | 0.746032 | 19 | 126 | 4.736842 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 0.119048 | 126 | 5 | 74 | 25.2 | 0.702703 | 0.84127 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
826d05ccbf003a93b78441d93a7b1a39ed54662b | 252 | py | Python | ncurses/__init__.py | siju-samuel/tvm-curses-ui | c19ece357fa103f2ca05c3c9ee3bb62016f358cb | [
"Apache-2.0"
] | null | null | null | ncurses/__init__.py | siju-samuel/tvm-curses-ui | c19ece357fa103f2ca05c3c9ee3bb62016f358cb | [
"Apache-2.0"
] | null | null | null | ncurses/__init__.py | siju-samuel/tvm-curses-ui | c19ece357fa103f2ca05c3c9ee3bb62016f358cb | [
"Apache-2.0"
] | null | null | null | """Public Python API of TVM Debugger (tvmdbg)."""
#from __future__ import absolute_import
#from __future__ import division
#from __future__ import print_function
#from tvm.contrib.debugger.curses.wrappers.ui_wrapper import LocalCLIDebugWrapperModule
| 31.5 | 87 | 0.829365 | 31 | 252 | 6.258065 | 0.645161 | 0.154639 | 0.247423 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099206 | 252 | 7 | 88 | 36 | 0.854626 | 0.936508 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
82765802c5e3b78ec10e3e8150d127da4b0fb9c3 | 1,180 | py | Python | tests/test_article.py | AhmadSAshraf/News-Highlight-IP | 78448c61601bdfff85dabe0b884181689aae6c42 | [
"MIT"
] | null | null | null | tests/test_article.py | AhmadSAshraf/News-Highlight-IP | 78448c61601bdfff85dabe0b884181689aae6c42 | [
"MIT"
] | 2 | 2019-11-17T11:45:08.000Z | 2021-06-02T00:41:00.000Z | tests/test_article.py | AhmadSAshraf/News-Highlight-IP | 78448c61601bdfff85dabe0b884181689aae6c42 | [
"MIT"
] | null | null | null | import unittest
from app.models import Article
class TestArticle(unittest.TestCase):
'''
Test Case to test the behaviour of the Article Model
Args:
unittest.TestCase - helps in creating Test Cases
'''
def setUp(self):
'''
Inbuilt function that runs before each test is executed
'''
self.new_article = Article("Panos Mourdoukoutas, Contributor, Panos Mourdoukoutas, Contributor https://www.forbes.com/sites/panosmourdoukoutas/", "XRP Keeps On Rallying, As Bitcoin, ETH, And XLM Are Catching Up -- What's Next?", "Bitcoin, XRP, ETH, XLM rise, shaking off bad news--watch BTC and ETH price action for what comes next.", "https://www.forbes.com/sites/panosmourdoukoutas/2019/10/13/xrp-keeps-on-rallying-as-bitcoin-eth-and-xlm-are-catching-up-whats-next/", "https://thumbor.forbes.com/thumbor/600x315/https%3A%2F%2Fspecials-images.forbesimg.com%2Fdam%2Fimageserve%2F915943332%2F960x0.jpg%3Ffit%3Dscale", "2019-10-13T11:48:00Z")
def test_isArticleInstance(self):
'''
        Function to test if the object created in setUp is indeed an Article object
'''
self.assertTrue(isinstance(self.new_article,Article))
if __name__ == '__main__':
unittest.main(verbosity=2) | 51.304348 | 642 | 0.759322 | 172 | 1,180 | 5.145349 | 0.593023 | 0.030508 | 0.031638 | 0.047458 | 0.20113 | 0.20113 | 0.110734 | 0.110734 | 0.110734 | 0.110734 | 0 | 0.049038 | 0.118644 | 1,180 | 23 | 643 | 51.304348 | 0.801923 | 0.211017 | 0 | 0 | 0 | 0.555556 | 0.660773 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
8286317b36f193ad62f2b4776d1b2d34a528550f | 190 | py | Python | anprx/__init__.py | ppintosilva/anprx | a37fa36d735b00261efbefbf14c6224323ee1169 | [
"Apache-2.0"
] | 6 | 2019-07-05T04:43:01.000Z | 2021-02-25T07:06:16.000Z | anprx/__init__.py | ppintosilva/anprx | a37fa36d735b00261efbefbf14c6224323ee1169 | [
"Apache-2.0"
] | null | null | null | anprx/__init__.py | ppintosilva/anprx | a37fa36d735b00261efbefbf14c6224323ee1169 | [
"Apache-2.0"
] | 1 | 2020-01-14T23:04:59.000Z | 2020-01-14T23:04:59.000Z | __version__ = '0.1.3'
import anprx.core
import anprx.helpers
import anprx.nominatim
import anprx.constants
import anprx.utils
import anprx.plot
import anprx.animate
import anprx.exceptions
| 17.272727 | 23 | 0.826316 | 28 | 190 | 5.464286 | 0.5 | 0.575163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017647 | 0.105263 | 190 | 10 | 24 | 19 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0.026316 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.888889 | 0 | 0.888889 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
828ac4b71d30059b88e5b0387d57756d5059297a | 27 | py | Python | mitty/version.py | latticelabs/Mitty-deprecated- | bf192600233daea8a42a1f995c60b1e883cbaaba | [
"Apache-2.0"
] | 1 | 2015-10-21T23:43:34.000Z | 2015-10-21T23:43:34.000Z | mitty/version.py | latticelabs/Mitty | bf192600233daea8a42a1f995c60b1e883cbaaba | [
"Apache-2.0"
] | null | null | null | mitty/version.py | latticelabs/Mitty | bf192600233daea8a42a1f995c60b1e883cbaaba | [
"Apache-2.0"
] | null | null | null | __version__ = '1.40.0.dev0' | 27 | 27 | 0.703704 | 5 | 27 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0.074074 | 27 | 1 | 27 | 27 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0.392857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
8298b1f8e47eb7a013401b584f7e880a83b16915 | 943 | py | Python | data/models/__init__.py | SIXMON/peps | 48c09a951a0193ada7b91c8bb6efc4b1232c3520 | [
"MIT"
] | 5 | 2019-08-29T13:55:47.000Z | 2021-11-15T08:30:33.000Z | data/models/__init__.py | SIXMON/peps | 48c09a951a0193ada7b91c8bb6efc4b1232c3520 | [
"MIT"
] | 295 | 2019-08-19T12:40:29.000Z | 2022-01-24T14:03:20.000Z | data/models/__init__.py | SIXMON/peps | 48c09a951a0193ada7b91c8bb6efc4b1232c3520 | [
"MIT"
] | 7 | 2020-05-27T06:28:48.000Z | 2021-11-17T10:00:54.000Z | from .simulator.practice import Practice
from .simulator.simulatorculture import SimulatorCulture
from .simulator.pest import Pest
from .simulator.practicetype import PracticeType, PracticeTypeCategory
from .simulator.problem import Problem
from .simulator.soiltype import SoilType
from .simulator.weed import Weed
from .simulator.practicegroup import PracticeGroup
from .simulator.mechanism import Mechanism
from .simulator.resource import Resource, ResourceType
from .simulator.pepsenum import PepsEnum
from .simulator.glyphosateuses import GlyphosateUses
from .simulator.discardaction import DiscardAction
from .simulator.category import Category
from .simulator.groupcount import GroupCount
from .simulator.referercount import RefererCount
from .farmer import Farmer, FarmImage
from .experiment import Experiment, ExperimentImage, ExperimentVideo, TAGS
from .message import Message
from .cultures import CULTURES
from .theme import Theme
| 41 | 74 | 0.8579 | 106 | 943 | 7.632075 | 0.283019 | 0.257108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096501 | 943 | 22 | 75 | 42.863636 | 0.949531 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
82de4075bd68d725d35956589d2a7c4dccb1bf54 | 208 | py | Python | pytpp/attributes/agent_container.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | 4 | 2022-02-04T23:58:55.000Z | 2022-02-15T18:53:08.000Z | pytpp/attributes/agent_container.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | null | null | null | pytpp/attributes/agent_container.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | null | null | null | from pytpp.attributes._helper import IterableMeta
from pytpp.attributes.top import TopAttributes
class AgentContainerAttributes(TopAttributes, metaclass=IterableMeta):
__config_class__ = "Agent Container"
| 29.714286 | 70 | 0.855769 | 21 | 208 | 8.190476 | 0.666667 | 0.104651 | 0.22093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086538 | 208 | 6 | 71 | 34.666667 | 0.905263 | 0 | 0 | 0 | 0 | 0 | 0.072115 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
7d6353775087371a1620dd64e5c3fb3327b8766b | 70 | py | Python | run.py | eloipereira/multimeter_reader | 30a5255de5a5a7e9644fb432298347d6d96a41f0 | [
"Apache-2.0"
] | null | null | null | run.py | eloipereira/multimeter_reader | 30a5255de5a5a7e9644fb432298347d6d96a41f0 | [
"Apache-2.0"
] | null | null | null | run.py | eloipereira/multimeter_reader | 30a5255de5a5a7e9644fb432298347d6d96a41f0 | [
"Apache-2.0"
] | null | null | null | from seven_seg_rec import SevenSegRec
app = SevenSegRec()
app.run()
| 11.666667 | 37 | 0.771429 | 10 | 70 | 5.2 | 0.8 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 70 | 5 | 38 | 14 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
7d74815edf63f4b28b364d00763a2f2c340c4b50 | 16,194 | py | Python | apps/app01/migrations/0001_initial.py | chenhuxy/myweb | eb9373434d1ca069fd37ae6688d140d99a319294 | [
"Apache-2.0"
] | 7 | 2021-04-13T02:57:11.000Z | 2022-02-07T09:08:18.000Z | apps/app01/migrations/0001_initial.py | chenhuxy/myweb | eb9373434d1ca069fd37ae6688d140d99a319294 | [
"Apache-2.0"
] | 1 | 2021-06-27T15:06:15.000Z | 2021-06-28T08:33:53.000Z | apps/app01/migrations/0001_initial.py | chenhuxy/myweb | eb9373434d1ca069fd37ae6688d140d99a319294 | [
"Apache-2.0"
] | 1 | 2021-04-16T06:47:15.000Z | 2021-04-16T06:47:15.000Z | # Generated by Django 2.2.17 on 2020-11-09 19:02
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Asset',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cabinet_num', models.CharField(blank=True, max_length=256, null=True, verbose_name='机柜号')),
('cabinet_order', models.CharField(blank=True, max_length=256, null=True, verbose_name='机架号')),
('create_time', models.DateTimeField(auto_now_add=True, null=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, null=True, verbose_name='更新时间')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
],
options={
'verbose_name_plural': '资产总表',
'verbose_name': '资产总表',
},
),
migrations.CreateModel(
name='Blog',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('title', models.CharField(max_length=256)),
('content', models.TextField()),
],
),
migrations.CreateModel(
name='Contract',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('sn', models.CharField(max_length=128, unique=True, verbose_name='合同号')),
('name', models.CharField(max_length=256, verbose_name='合同名称')),
('cost', models.IntegerField(verbose_name='合同金额')),
('start_date', models.DateTimeField(blank=True)),
('end_date', models.DateTimeField(blank=True)),
('license_num', models.IntegerField(blank=True, verbose_name='license数量')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
],
options={
'verbose_name_plural': '合同',
'verbose_name': '合同',
},
),
migrations.CreateModel(
name='DeviceStatus',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(default='未上线', max_length=256, verbose_name='名字')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
],
options={
'verbose_name_plural': '设备状态',
'verbose_name': '设备状态',
},
),
migrations.CreateModel(
name='DeviceType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=256, verbose_name='名称')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
],
options={
'verbose_name_plural': '设备类型',
'verbose_name': '设备类型',
},
),
migrations.CreateModel(
name='IDC',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('region_display_name', models.CharField(default=None, max_length=256, verbose_name='区域名称')),
('display_name', models.CharField(default=None, max_length=256, verbose_name='机房名称')),
('floor', models.IntegerField(default=1, verbose_name='楼层')),
('memo', models.TextField(blank=True, verbose_name='备注')),
],
options={
'verbose_name_plural': '机房',
'verbose_name': '机房',
},
),
migrations.CreateModel(
name='Tag',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=256, verbose_name='标签名')),
('memo', models.TextField(blank=True, verbose_name='备注')),
],
options={
'verbose_name_plural': '标签',
'verbose_name': '标签',
},
),
migrations.CreateModel(
name='UserProfile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=256, verbose_name='名字')),
('email', models.EmailField(max_length=256, verbose_name='邮箱')),
('mobile', models.CharField(max_length=256, verbose_name='手机')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
],
options={
'verbose_name_plural': '用户信息',
'verbose_name': '用户信息',
},
),
migrations.CreateModel(
name='Server',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('hostname', models.CharField(blank=True, max_length=128, unique=True, verbose_name='主机名')),
('sn', models.CharField(max_length=256, verbose_name='SN号')),
('manufactory', models.CharField(blank=True, max_length=256, null=True, verbose_name='厂商')),
('model', models.CharField(blank=True, max_length=256, null=True, verbose_name='型号')),
('bios', models.CharField(blank=True, max_length=256, null=True, verbose_name='BIOS')),
('type', models.BooleanField(default=False, verbose_name='虚拟机')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('asset', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to='app01.Asset')),
],
options={
'index_together': {('sn', 'asset')},
'verbose_name_plural': '服务器信息',
'verbose_name': '服务器信息',
},
),
migrations.CreateModel(
name='NIC',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(blank=True, max_length=256, verbose_name='网卡名称')),
('model', models.CharField(blank=True, max_length=256, verbose_name='网卡型号')),
('ipaddr', models.GenericIPAddressField(verbose_name='ip地址')),
('mac', models.CharField(max_length=256, verbose_name='MAC地址')),
('netmask', models.CharField(blank=True, max_length=256)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('server_info', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.Server')),
],
options={
'verbose_name_plural': '网卡信息',
'verbose_name': '网卡信息',
},
),
migrations.CreateModel(
name='NetworkDevice',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(blank=True, max_length=256, verbose_name='设备名称')),
('sn', models.CharField(max_length=256, verbose_name='SN号')),
('manufactory', models.CharField(blank=True, max_length=256, null=True, verbose_name='厂商')),
('model', models.CharField(blank=True, max_length=256, null=True, verbose_name='型号')),
('memo', models.TextField(blank=True, verbose_name='备注')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('asset', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to='app01.Asset')),
],
options={
'verbose_name_plural': '网络设备信息',
'verbose_name': '网络设备信息',
},
),
migrations.CreateModel(
name='Memory',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('slot', models.CharField(blank=True, max_length=256, verbose_name='插槽名称')),
('model', models.CharField(blank=True, max_length=256, verbose_name='内存型号')),
('capacity', models.FloatField(blank=True, verbose_name='内存容量')),
('ifac_type', models.CharField(blank=True, max_length=256, verbose_name='接口类型')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('server_info', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.Server')),
],
options={
'verbose_name_plural': '内存信息',
'verbose_name': '内存信息',
},
),
migrations.CreateModel(
name='HandleLog',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('handle_type', models.CharField(max_length=256, verbose_name='操作类型')),
('summary', models.CharField(max_length=256)),
('detail', models.TextField()),
('create_at', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('memo', models.TextField(blank=True, verbose_name='备注')),
('creater', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.UserProfile')),
],
options={
'verbose_name_plural': '操作日志',
'verbose_name': '操作日志',
},
),
migrations.CreateModel(
name='Disk',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('slot', models.CharField(blank=True, max_length=256, verbose_name='插槽名称')),
('model', models.CharField(blank=True, max_length=256, verbose_name='磁盘型号')),
('capacity', models.FloatField(blank=True, verbose_name='磁盘容量')),
('ifac_type', models.CharField(blank=True, max_length=256, verbose_name='接口类型')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('server_info', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.Server')),
],
options={
'verbose_name_plural': '磁盘信息',
'verbose_name': '磁盘信息',
},
),
migrations.CreateModel(
name='CPU',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(blank=True, max_length=256, verbose_name='CPU名称')),
('model', models.CharField(blank=True, max_length=256, verbose_name='CPU型号')),
('core_num', models.IntegerField(blank=True, default=1, verbose_name='CPU核数')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
('update_time', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('server_info', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.Server')),
],
options={
'verbose_name_plural': 'CPU信息',
'verbose_name': 'CPU信息',
},
),
migrations.CreateModel(
name='BusinessUnit',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=128, unique=True, verbose_name='业务线')),
('memo', models.TextField(blank=True, null=True, verbose_name='备注')),
('contact', models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, to='app01.UserProfile')),
],
options={
'verbose_name_plural': '业务线',
'verbose_name': '业务线',
},
),
migrations.AddField(
model_name='asset',
name='admin',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='+', to='app01.UserProfile', verbose_name='设备管理员'),
),
migrations.AddField(
model_name='asset',
name='business_unit',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='app01.BusinessUnit', verbose_name='所属业务线'),
),
migrations.AddField(
model_name='asset',
name='contract',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='app01.Contract', verbose_name='合同'),
),
migrations.AddField(
model_name='asset',
name='device_status',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.DeviceStatus'),
),
migrations.AddField(
model_name='asset',
name='device_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='app01.DeviceType'),
),
migrations.AddField(
model_name='asset',
name='idc',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='app01.IDC', verbose_name='idc机房'),
),
migrations.AddField(
model_name='asset',
name='tag',
field=models.ManyToManyField(blank=True, to='app01.Tag', verbose_name='标签'),
),
migrations.CreateModel(
name='Admininfo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=256, verbose_name='用户名')),
('password', models.CharField(max_length=256, verbose_name='密码')),
('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to='app01.UserProfile')),
],
options={
'verbose_name_plural': '用户',
'verbose_name': '用户',
},
),
]
| 51.903846 | 168 | 0.562307 | 1,624 | 16,194 | 5.415025 | 0.118227 | 0.165113 | 0.080168 | 0.056175 | 0.798954 | 0.77496 | 0.74005 | 0.691039 | 0.673982 | 0.660337 | 0 | 0.014449 | 0.286279 | 16,194 | 311 | 169 | 52.07074 | 0.746409 | 0.002841 | 0 | 0.555921 | 1 | 0 | 0.129444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.003289 | 0.006579 | 0 | 0.019737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
7d9b7b39866f25631ff6a01f00fdb1f171abec72 | 185 | py | Python | importing_questions.py | mohamadaref/telegram_bot | 80eaa1da29c8c2f2990f26315b667d5a85e57d14 | [
"MIT"
] | 1 | 2019-09-19T07:05:05.000Z | 2019-09-19T07:05:05.000Z | importing_questions.py | mohamadaref/telegram_bot | 80eaa1da29c8c2f2990f26315b667d5a85e57d14 | [
"MIT"
] | null | null | null | importing_questions.py | mohamadaref/telegram_bot | 80eaa1da29c8c2f2990f26315b667d5a85e57d14 | [
"MIT"
] | null | null | null | import numpy
import xlsxwriter
from pandas import read_excel
import xlrd
import pandas
data = read_excel('data/questions_and_choices.xlsx')
print(data.columns[2])
# print(data.head())
| 18.5 | 52 | 0.8 | 28 | 185 | 5.142857 | 0.607143 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006024 | 0.102703 | 185 | 9 | 53 | 20.555556 | 0.861446 | 0.097297 | 0 | 0 | 0 | 0 | 0.187879 | 0.187879 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.714286 | 0 | 0.714286 | 0.142857 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
7da8c2fac293a6a76a53044863c61082cb3a1956 | 4,346 | py | Python | os_api_ref/tests/test_microversions.py | openstack/os-api-ref | 3d4f056ca3fa870000391f02cea2120392ddfbfd | [
"Apache-2.0"
] | 16 | 2016-05-25T08:18:08.000Z | 2019-03-22T05:49:20.000Z | os_api_ref/tests/test_microversions.py | openstack/os-api-ref | 3d4f056ca3fa870000391f02cea2120392ddfbfd | [
"Apache-2.0"
] | null | null | null | os_api_ref/tests/test_microversions.py | openstack/os-api-ref | 3d4f056ca3fa870000391f02cea2120392ddfbfd | [
"Apache-2.0"
] | 1 | 2018-08-09T09:04:07.000Z | 2018-08-09T09:04:07.000Z | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
test_os_api_ref
----------------------------------
Tests for `os_api_ref` module.
"""
from bs4 import BeautifulSoup
from sphinx_testing import with_app
from os_api_ref.tests import base
class TestMicroversions(base.TestCase):
"""Test basic rendering.
This can be used to test that basic rendering works for these
examples, so if someone breaks something we know.
"""
@with_app(buildername='html', srcdir=base.example_dir('microversions'),
copy_srcdir_to_tmpdir=True)
def setUp(self, app, status, warning):
super(TestMicroversions, self).setUp()
self.app = app
self.app.build()
self.status = status.getvalue()
self.warning = warning.getvalue()
self.html = (app.outdir / 'index.html').read_text(encoding='utf-8')
self.soup = BeautifulSoup(self.html, 'html.parser')
self.content = str(self.soup)
def test_rest_method(self):
"""Test that min / max mv css class attributes are set"""
content = self.soup.find_all(class_='rp_min_ver_2_17')
self.assertRegexpMatches(
str(content[0]),
'^<div class="operation-grp rp_min_ver_2_17 rp_max_ver_2_19 ?"')
content = self.soup.find_all(class_='rp_max_ver_2_19')
self.assertRegexpMatches(
str(content[0]),
'^<div class="operation-grp rp_min_ver_2_17 rp_max_ver_2_19 ?"')
def test_parameters_table(self):
"""Test that min / max mv css class attributes are set in params"""
table = """
<table class="docutils align-default">
<colgroup>
<col style="width: 20%"/>
<col style="width: 10%"/>
<col style="width: 10%"/>
<col style="width: 60%"/>
</colgroup>
<thead>
<tr class="row-odd"><th class="head"><p>Name</p></th>
<th class="head"><p>In</p></th>
<th class="head"><p>Type</p></th>
<th class="head"><p>Description</p></th>
</tr>
</thead>
<tbody>
<tr class="row-even"><td><p>name</p></td>
<td><p>body</p></td>
<td><p>string</p></td>
<td><p>The name of things</p></td>
</tr>
<tr class="rp_min_ver_2_11 row-odd"><td><p>name2</p></td>
<td><p>body</p></td>
<td><p>string</p></td>
<td><p>The name of things</p>
<p><strong>New in version 2.11</strong></p>
</td>
</tr>
<tr class="rp_max_ver_2_20 row-even"><td><p>name3</p></td>
<td><p>body</p></td>
<td><p>string</p></td>
<td><p>The name of things</p>
<p><strong>Available until version 2.20</strong></p>
</td>
</tr>
</tbody>
</table>
"""
self.assertIn(table, self.content)
def test_mv_selector(self):
button_selectors = '<option selected="selected" value="">All</option><option value="2.1">2.1</option><option value="2.2">2.2</option><option value="2.3">2.3</option><option value="2.4">2.4</option><option value="2.5">2.5</option><option value="2.6">2.6</option><option value="2.7">2.7</option><option value="2.8">2.8</option><option value="2.9">2.9</option><option value="2.10">2.10</option><option value="2.11">2.11</option><option value="2.12">2.12</option><option value="2.13">2.13</option><option value="2.14">2.14</option><option value="2.15">2.15</option><option value="2.16">2.16</option><option value="2.17">2.17</option><option value="2.18">2.18</option><option value="2.19">2.19</option><option value="2.20">2.20</option><option value="2.21">2.21</option><option value="2.22">2.22</option><option value="2.23">2.23</option><option value="2.24">2.24</option><option value="2.25">2.25</option><option value="2.26">2.26</option><option value="2.27">2.27</option><option value="2.28">2.28</option><option value="2.29">2.29</option><option value="2.30">2.30</option>' # noqa
self.assertIn(button_selectors, self.content)
def test_js_declares(self):
self.assertIn("os_max_mv = 30;", self.content)
self.assertIn("os_min_mv = 1;", self.content)
| 41 | 1,088 | 0.654625 | 723 | 4,346 | 3.84509 | 0.286307 | 0.129496 | 0.183453 | 0.194245 | 0.21259 | 0.201079 | 0.17518 | 0.138849 | 0.138849 | 0.138849 | 0 | 0.057227 | 0.14358 | 4,346 | 105 | 1,089 | 41.390476 | 0.689683 | 0.197883 | 0 | 0.314286 | 0 | 0.028571 | 0.629543 | 0.384414 | 0 | 0 | 0 | 0 | 0.085714 | 1 | 0.071429 | false | 0 | 0.042857 | 0 | 0.128571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
7dad24d1431882acd62f256b4f14165f12b4da0f | 646 | py | Python | psdaq/psdaq/pyxpm/LclsTimingCore/__init__.py | JBlaschke/lcls2 | 30523ef069e823535475d68fa283c6387bcf817b | [
"BSD-3-Clause-LBNL"
] | 16 | 2017-11-09T17:10:56.000Z | 2022-03-09T23:03:10.000Z | psdaq/psdaq/pyxpm/LclsTimingCore/__init__.py | JBlaschke/lcls2 | 30523ef069e823535475d68fa283c6387bcf817b | [
"BSD-3-Clause-LBNL"
] | 6 | 2017-12-12T19:30:05.000Z | 2020-07-09T00:28:33.000Z | psdaq/psdaq/pyxpm/LclsTimingCore/__init__.py | JBlaschke/lcls2 | 30523ef069e823535475d68fa283c6387bcf817b | [
"BSD-3-Clause-LBNL"
] | 25 | 2017-09-18T20:02:43.000Z | 2022-03-27T22:27:42.000Z | #!/usr/bin/env python
from LclsTimingCore.EvrV1Isr import *
from LclsTimingCore.EvrV1Reg import *
from LclsTimingCore.EvrV2ChannelReg import *
from LclsTimingCore.EvrV2Core import *
from LclsTimingCore.EvrV2CoreTriggers import *
from LclsTimingCore.EvrV2TriggerReg import *
from LclsTimingCore.GthRxAlignCheck import *
from LclsTimingCore.LclsTriggerPulse import *
from LclsTimingCore.TimingFrameRx import *
from LclsTimingCore.TPG import *
from LclsTimingCore.TPGControl import *
from LclsTimingCore.TPGMiniCore import *
from LclsTimingCore.TPGSeqJump import *
from LclsTimingCore.TPGSeqState import *
from LclsTimingCore.TPGStatus import *
| 30.761905 | 46 | 0.843653 | 64 | 646 | 8.515625 | 0.34375 | 0.495413 | 0.616514 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010345 | 0.102167 | 646 | 20 | 47 | 32.3 | 0.92931 | 0.03096 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
7db5c79abec8743037120f8ec171f35d3ed2e6b2 | 66 | py | Python | Week_3_Labs/vfleet2210_cloudmesh_ex2.py | futuresystems/465-rvanfleet | 3fb4de7589de06947eb5eee6786cb70badd21205 | [
"Apache-2.0"
] | null | null | null | Week_3_Labs/vfleet2210_cloudmesh_ex2.py | futuresystems/465-rvanfleet | 3fb4de7589de06947eb5eee6786cb70badd21205 | [
"Apache-2.0"
] | null | null | null | Week_3_Labs/vfleet2210_cloudmesh_ex2.py | futuresystems/465-rvanfleet | 3fb4de7589de06947eb5eee6786cb70badd21205 | [
"Apache-2.0"
] | null | null | null | import cloudmesh
user = cloudmesh.load()
print(user.cloudnames())
| 13.2 | 23 | 0.772727 | 8 | 66 | 6.375 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 66 | 4 | 24 | 16.5 | 0.87931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
7dbb14d6e00580e7f1cbb12355206815f91e7875 | 28,103 | py | Python | Reinforcement-Learning/Python-Model/venv/lib/python3.8/site-packages/tensorflow/core/profiler/protobuf/xplane_pb2.py | lawrence910426/ProgrammingII_FinalProject | 493183dc2a674310e65bffe3a5e00395e8bebb4b | [
"MIT"
] | null | null | null | Reinforcement-Learning/Python-Model/venv/lib/python3.8/site-packages/tensorflow/core/profiler/protobuf/xplane_pb2.py | lawrence910426/ProgrammingII_FinalProject | 493183dc2a674310e65bffe3a5e00395e8bebb4b | [
"MIT"
] | null | null | null | Reinforcement-Learning/Python-Model/venv/lib/python3.8/site-packages/tensorflow/core/profiler/protobuf/xplane_pb2.py | lawrence910426/ProgrammingII_FinalProject | 493183dc2a674310e65bffe3a5e00395e8bebb4b | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: tensorflow/core/profiler/protobuf/xplane.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='tensorflow/core/profiler/protobuf/xplane.proto',
package='tensorflow.profiler',
syntax='proto3',
serialized_options=_b('\370\001\001'),
  serialized_pb=_b('\n.tensorflow/core/profiler/protobuf/xplane.proto\x12\x13tensorflow.profiler\"j\n\x06XSpace\x12+\n\x06planes\x18\x01 \x03(\x0b\x32\x1b.tensorflow.profiler.XPlane\x12\x0e\n\x06\x65rrors\x18\x02 \x03(\t\x12\x10\n\x08warnings\x18\x03 \x03(\t\x12\x11\n\thostnames\x18\x04 \x03(\t\"\xba\x03\n\x06XPlane\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x0c\n\x04name\x18\x02 \x01(\t\x12)\n\x05lines\x18\x03 \x03(\x0b\x32\x1a.tensorflow.profiler.XLine\x12\x46\n\x0e\x65vent_metadata\x18\x04 \x03(\x0b\x32..tensorflow.profiler.XPlane.EventMetadataEntry\x12\x44\n\rstat_metadata\x18\x05 \x03(\x0b\x32-.tensorflow.profiler.XPlane.StatMetadataEntry\x12)\n\x05stats\x18\x06 \x03(\x0b\x32\x1a.tensorflow.profiler.XStat\x1aY\n\x12\x45ventMetadataEntry\x12\x0b\n\x03key\x18\x01 \x01(\x03\x12\x32\n\x05value\x18\x02 \x01(\x0b\x32#.tensorflow.profiler.XEventMetadata:\x02\x38\x01\x1aW\n\x11StatMetadataEntry\x12\x0b\n\x03key\x18\x01 \x01(\x03\x12\x31\n\x05value\x18\x02 \x01(\x0b\x32\".tensorflow.profiler.XStatMetadata:\x02\x38\x01\"\xbb\x01\n\x05XLine\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x12\n\ndisplay_id\x18\n \x01(\x03\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x0b \x01(\t\x12\x14\n\x0ctimestamp_ns\x18\x03 \x01(\x03\x12\x13\n\x0b\x64uration_ps\x18\t \x01(\x03\x12+\n\x06\x65vents\x18\x04 \x03(\x0b\x32\x1b.tensorflow.profiler.XEventJ\x04\x08\x05\x10\x06J\x04\x08\x06\x10\x07J\x04\x08\x07\x10\x08J\x04\x08\x08\x10\t\"\x95\x01\n\x06XEvent\x12\x13\n\x0bmetadata_id\x18\x01 \x01(\x03\x12\x13\n\toffset_ps\x18\x02 \x01(\x03H\x00\x12\x19\n\x0fnum_occurrences\x18\x05 \x01(\x03H\x00\x12\x13\n\x0b\x64uration_ps\x18\x03 \x01(\x03\x12)\n\x05stats\x18\x04 \x03(\x0b\x32\x1a.tensorflow.profiler.XStatB\x06\n\x04\x64\x61ta\"\xad\x01\n\x05XStat\x12\x13\n\x0bmetadata_id\x18\x01 \x01(\x03\x12\x16\n\x0c\x64ouble_value\x18\x02 \x01(\x01H\x00\x12\x16\n\x0cuint64_value\x18\x03 \x01(\x04H\x00\x12\x15\n\x0bint64_value\x18\x04 \x01(\x03H\x00\x12\x13\n\tstr_value\x18\x05 \x01(\tH\x00\x12\x15\n\x0b\x62ytes_value\x18\x06 \x01(\x0cH\x00\x12\x13\n\tref_value\x18\x07 \x01(\x04H\x00\x42\x07\n\x05value\"}\n\x0eXEventMetadata\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x14\n\x0c\x64isplay_name\x18\x04 \x01(\t\x12\x10\n\x08metadata\x18\x03 \x01(\x0c\x12)\n\x05stats\x18\x05 \x03(\x0b\x32\x1a.tensorflow.profiler.XStat\">\n\rXStatMetadata\x12\n\n\x02id\x18\x01 \x01(\x03\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x03 \x01(\tB\x03\xf8\x01\x01\x62\x06proto3')
)
_XSPACE = _descriptor.Descriptor(
name='XSpace',
full_name='tensorflow.profiler.XSpace',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='planes', full_name='tensorflow.profiler.XSpace.planes', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='errors', full_name='tensorflow.profiler.XSpace.errors', index=1,
number=2, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='warnings', full_name='tensorflow.profiler.XSpace.warnings', index=2,
number=3, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hostnames', full_name='tensorflow.profiler.XSpace.hostnames', index=3,
number=4, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=71,
serialized_end=177,
)
_XPLANE_EVENTMETADATAENTRY = _descriptor.Descriptor(
name='EventMetadataEntry',
full_name='tensorflow.profiler.XPlane.EventMetadataEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='tensorflow.profiler.XPlane.EventMetadataEntry.key', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='tensorflow.profiler.XPlane.EventMetadataEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=444,
serialized_end=533,
)
_XPLANE_STATMETADATAENTRY = _descriptor.Descriptor(
name='StatMetadataEntry',
full_name='tensorflow.profiler.XPlane.StatMetadataEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='tensorflow.profiler.XPlane.StatMetadataEntry.key', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='tensorflow.profiler.XPlane.StatMetadataEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=535,
serialized_end=622,
)
_XPLANE = _descriptor.Descriptor(
name='XPlane',
full_name='tensorflow.profiler.XPlane',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='tensorflow.profiler.XPlane.id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='tensorflow.profiler.XPlane.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='lines', full_name='tensorflow.profiler.XPlane.lines', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='event_metadata', full_name='tensorflow.profiler.XPlane.event_metadata', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stat_metadata', full_name='tensorflow.profiler.XPlane.stat_metadata', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stats', full_name='tensorflow.profiler.XPlane.stats', index=5,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_XPLANE_EVENTMETADATAENTRY, _XPLANE_STATMETADATAENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=180,
serialized_end=622,
)
_XLINE = _descriptor.Descriptor(
name='XLine',
full_name='tensorflow.profiler.XLine',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='tensorflow.profiler.XLine.id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='display_id', full_name='tensorflow.profiler.XLine.display_id', index=1,
number=10, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='tensorflow.profiler.XLine.name', index=2,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='display_name', full_name='tensorflow.profiler.XLine.display_name', index=3,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='timestamp_ns', full_name='tensorflow.profiler.XLine.timestamp_ns', index=4,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='duration_ps', full_name='tensorflow.profiler.XLine.duration_ps', index=5,
number=9, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='events', full_name='tensorflow.profiler.XLine.events', index=6,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=625,
serialized_end=812,
)
_XEVENT = _descriptor.Descriptor(
name='XEvent',
full_name='tensorflow.profiler.XEvent',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='metadata_id', full_name='tensorflow.profiler.XEvent.metadata_id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='offset_ps', full_name='tensorflow.profiler.XEvent.offset_ps', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='num_occurrences', full_name='tensorflow.profiler.XEvent.num_occurrences', index=2,
number=5, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='duration_ps', full_name='tensorflow.profiler.XEvent.duration_ps', index=3,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stats', full_name='tensorflow.profiler.XEvent.stats', index=4,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='data', full_name='tensorflow.profiler.XEvent.data',
index=0, containing_type=None, fields=[]),
],
serialized_start=815,
serialized_end=964,
)
_XSTAT = _descriptor.Descriptor(
name='XStat',
full_name='tensorflow.profiler.XStat',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='metadata_id', full_name='tensorflow.profiler.XStat.metadata_id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='double_value', full_name='tensorflow.profiler.XStat.double_value', index=1,
number=2, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='uint64_value', full_name='tensorflow.profiler.XStat.uint64_value', index=2,
number=3, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='int64_value', full_name='tensorflow.profiler.XStat.int64_value', index=3,
number=4, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='str_value', full_name='tensorflow.profiler.XStat.str_value', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='bytes_value', full_name='tensorflow.profiler.XStat.bytes_value', index=5,
number=6, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='ref_value', full_name='tensorflow.profiler.XStat.ref_value', index=6,
number=7, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='value', full_name='tensorflow.profiler.XStat.value',
index=0, containing_type=None, fields=[]),
],
serialized_start=967,
serialized_end=1140,
)
_XEVENTMETADATA = _descriptor.Descriptor(
name='XEventMetadata',
full_name='tensorflow.profiler.XEventMetadata',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='tensorflow.profiler.XEventMetadata.id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='tensorflow.profiler.XEventMetadata.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='display_name', full_name='tensorflow.profiler.XEventMetadata.display_name', index=2,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='metadata', full_name='tensorflow.profiler.XEventMetadata.metadata', index=3,
number=3, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stats', full_name='tensorflow.profiler.XEventMetadata.stats', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1142,
serialized_end=1267,
)
_XSTATMETADATA = _descriptor.Descriptor(
name='XStatMetadata',
full_name='tensorflow.profiler.XStatMetadata',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='tensorflow.profiler.XStatMetadata.id', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='name', full_name='tensorflow.profiler.XStatMetadata.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='description', full_name='tensorflow.profiler.XStatMetadata.description', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1269,
serialized_end=1331,
)
_XSPACE.fields_by_name['planes'].message_type = _XPLANE
_XPLANE_EVENTMETADATAENTRY.fields_by_name['value'].message_type = _XEVENTMETADATA
_XPLANE_EVENTMETADATAENTRY.containing_type = _XPLANE
_XPLANE_STATMETADATAENTRY.fields_by_name['value'].message_type = _XSTATMETADATA
_XPLANE_STATMETADATAENTRY.containing_type = _XPLANE
_XPLANE.fields_by_name['lines'].message_type = _XLINE
_XPLANE.fields_by_name['event_metadata'].message_type = _XPLANE_EVENTMETADATAENTRY
_XPLANE.fields_by_name['stat_metadata'].message_type = _XPLANE_STATMETADATAENTRY
_XPLANE.fields_by_name['stats'].message_type = _XSTAT
_XLINE.fields_by_name['events'].message_type = _XEVENT
_XEVENT.fields_by_name['stats'].message_type = _XSTAT
_XEVENT.oneofs_by_name['data'].fields.append(
_XEVENT.fields_by_name['offset_ps'])
_XEVENT.fields_by_name['offset_ps'].containing_oneof = _XEVENT.oneofs_by_name['data']
_XEVENT.oneofs_by_name['data'].fields.append(
_XEVENT.fields_by_name['num_occurrences'])
_XEVENT.fields_by_name['num_occurrences'].containing_oneof = _XEVENT.oneofs_by_name['data']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['double_value'])
_XSTAT.fields_by_name['double_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['uint64_value'])
_XSTAT.fields_by_name['uint64_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['int64_value'])
_XSTAT.fields_by_name['int64_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['str_value'])
_XSTAT.fields_by_name['str_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['bytes_value'])
_XSTAT.fields_by_name['bytes_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XSTAT.oneofs_by_name['value'].fields.append(
_XSTAT.fields_by_name['ref_value'])
_XSTAT.fields_by_name['ref_value'].containing_oneof = _XSTAT.oneofs_by_name['value']
_XEVENTMETADATA.fields_by_name['stats'].message_type = _XSTAT
DESCRIPTOR.message_types_by_name['XSpace'] = _XSPACE
DESCRIPTOR.message_types_by_name['XPlane'] = _XPLANE
DESCRIPTOR.message_types_by_name['XLine'] = _XLINE
DESCRIPTOR.message_types_by_name['XEvent'] = _XEVENT
DESCRIPTOR.message_types_by_name['XStat'] = _XSTAT
DESCRIPTOR.message_types_by_name['XEventMetadata'] = _XEVENTMETADATA
DESCRIPTOR.message_types_by_name['XStatMetadata'] = _XSTATMETADATA
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
XSpace = _reflection.GeneratedProtocolMessageType('XSpace', (_message.Message,), {
'DESCRIPTOR' : _XSPACE,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XSpace)
})
_sym_db.RegisterMessage(XSpace)
XPlane = _reflection.GeneratedProtocolMessageType('XPlane', (_message.Message,), {
'EventMetadataEntry' : _reflection.GeneratedProtocolMessageType('EventMetadataEntry', (_message.Message,), {
'DESCRIPTOR' : _XPLANE_EVENTMETADATAENTRY,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XPlane.EventMetadataEntry)
})
,
'StatMetadataEntry' : _reflection.GeneratedProtocolMessageType('StatMetadataEntry', (_message.Message,), {
'DESCRIPTOR' : _XPLANE_STATMETADATAENTRY,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XPlane.StatMetadataEntry)
})
,
'DESCRIPTOR' : _XPLANE,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XPlane)
})
_sym_db.RegisterMessage(XPlane)
_sym_db.RegisterMessage(XPlane.EventMetadataEntry)
_sym_db.RegisterMessage(XPlane.StatMetadataEntry)
XLine = _reflection.GeneratedProtocolMessageType('XLine', (_message.Message,), {
'DESCRIPTOR' : _XLINE,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XLine)
})
_sym_db.RegisterMessage(XLine)
XEvent = _reflection.GeneratedProtocolMessageType('XEvent', (_message.Message,), {
'DESCRIPTOR' : _XEVENT,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XEvent)
})
_sym_db.RegisterMessage(XEvent)
XStat = _reflection.GeneratedProtocolMessageType('XStat', (_message.Message,), {
'DESCRIPTOR' : _XSTAT,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XStat)
})
_sym_db.RegisterMessage(XStat)
XEventMetadata = _reflection.GeneratedProtocolMessageType('XEventMetadata', (_message.Message,), {
'DESCRIPTOR' : _XEVENTMETADATA,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XEventMetadata)
})
_sym_db.RegisterMessage(XEventMetadata)
XStatMetadata = _reflection.GeneratedProtocolMessageType('XStatMetadata', (_message.Message,), {
'DESCRIPTOR' : _XSTATMETADATA,
'__module__' : 'tensorflow.core.profiler.protobuf.xplane_pb2'
# @@protoc_insertion_point(class_scope:tensorflow.profiler.XStatMetadata)
})
_sym_db.RegisterMessage(XStatMetadata)
DESCRIPTOR._options = None
_XPLANE_EVENTMETADATAENTRY._options = None
_XPLANE_STATMETADATAENTRY._options = None
# @@protoc_insertion_point(module_scope)
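The wiring above appends `double_value`, `uint64_value`, `int64_value`, `str_value`, `bytes_value`, and `ref_value` to `_XSTAT.oneofs_by_name['value']`, making XStat's six value fields members of a single proto3 `value` oneof: setting any one of them clears whichever was set before. A minimal pure-Python sketch of that last-writer-wins behaviour (a hypothetical stand-in class, not the generated protobuf API):

```python
class OneofValue:
    """Mimics proto3 oneof semantics: setting one field clears the others."""

    FIELDS = ("double_value", "uint64_value", "int64_value",
              "str_value", "bytes_value", "ref_value")

    def __init__(self):
        self._which = None   # name of the currently set field
        self._value = None

    def set(self, field, value):
        if field not in self.FIELDS:
            raise ValueError("unknown oneof field: %s" % field)
        self._which = field  # last writer wins; the previous field is cleared
        self._value = value

    def which_oneof(self):
        # analogous to Message.WhichOneof('value') on the generated class
        return self._which


stat = OneofValue()
stat.set("int64_value", 42)
stat.set("str_value", "some stat")  # implicitly clears int64_value
assert stat.which_oneof() == "str_value"
```

On the real generated `XStat` message, the same check is spelled `stat.WhichOneof('value')`.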
# omnitool/config.py (ojhall94/omnitool, MIT)
import os
import inspect
from pkg_resources import resource_filename
__ROOT__ = '/'.join(os.path.abspath(inspect.getfile(inspect.currentframe())).split('/')[:-1])
# default data directory
datadir = resource_filename('omnitool', 'data')
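`__ROOT__` above is the directory containing the module, computed by splitting the absolute path on `'/'`. On POSIX-style paths this is equivalent to `os.path.dirname`, sketched here with a hypothetical literal path rather than `inspect.currentframe()`:

```python
import os

# A hypothetical absolute module path; os.path.dirname is the portable
# equivalent of splitting on '/' and dropping the last component.
path = "/tmp/omnitool/config.py"
root_via_split = '/'.join(path.split('/')[:-1])
root_via_dirname = os.path.dirname(path)
assert root_via_split == root_via_dirname == "/tmp/omnitool"
```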
# config.py (dcesarz/design_patterns_factories, MIT)
enable_prints = False
# apps/result/admin.py (abdulwaliyahaya/resultLab, MIT)
from django.contrib import admin
from apps.result.models import *
from apps.accounts.models import *
admin.site.register(Subject)
admin.site.register(Class)
admin.site.register(ResultSheet)
admin.site.register(StudentResultSheet)
admin.site.register(StudentSubjectResult)
# not_used/scripts/get_last_working_day.py (flaireclair/DeepLearning-stock, Apache-2.0)
import workdays
import datetime
def return_last_day_of_working(year, month) :
    """Approximate the last working day of the given month (holidays ignored)."""
if month == 3 :
        if year % 4 == 0 :  # simplistic leap-year test; century years (1900, 2100) not handled
start_date = datetime.datetime(year, month-1, 29)
end_date = datetime.datetime(year, month, 31)
else :
start_date = datetime.datetime(year, month-1, 28)
end_date = datetime.datetime(year, month, 31)
    elif month in (5, 7, 10, 12) :  # previous month (Apr/Jun/Sep/Nov) has 30 days
start_date = datetime.datetime(year, month-1, 30)
end_date = datetime.datetime(year, month, 31)
elif month == 2 :
if year % 4 == 0 :
start_date = datetime.datetime(year, month-1, 31)
end_date = datetime.datetime(year, month, 29)
else :
start_date = datetime.datetime(year, month-1, 31)
end_date = datetime.datetime(year, month, 28)
    elif month == 1 :
        # January follows December of the previous year (31 days)
        start_date = datetime.datetime(year - 1, 12, 31)
        end_date = datetime.datetime(year, month, 31)
    elif month == 8 :
        # August has 31 days and follows July (also 31 days)
        start_date = datetime.datetime(year, month - 1, 31)
        end_date = datetime.datetime(year, month, 31)
    else :
        # remaining months (4, 6, 9, 11) have 30 days after a 31-day month
        start_date = datetime.datetime(year, month - 1, 31)
        end_date = datetime.datetime(year, month, 30)
day_num = workdays.networkdays(start_date, end_date)
return workdays.workday(start_date, days=day_num)
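All the branching above exists only to find the month's last calendar day before counting working days. A self-contained alternative sketch (stdlib only, no `workdays` dependency, holidays still ignored) walks back from `calendar.monthrange` to the last weekday:

```python
import calendar
import datetime

def last_weekday_of_month(year, month):
    """Return the last Mon-Fri date of the month (holidays ignored)."""
    last = calendar.monthrange(year, month)[1]  # number of days in the month
    day = datetime.date(year, month, last)
    while day.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        day -= datetime.timedelta(days=1)
    return day

# February 2020 is a leap month; 29 Feb 2020 was a Saturday,
# so the last working day is Friday 28 Feb 2020.
assert last_weekday_of_month(2020, 2) == datetime.date(2020, 2, 28)
```

`calendar.monthrange` also removes the need for the hand-rolled leap-year check.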
# exercicios/ex012.py (Matheus1199/python, MIT)
n1 = float(input('Product price: R$'))
print('The product price with a 5% discount is: R${:.2f}'.format(n1 - n1 * (5 / 100)))
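As a quick sanity check of the 5% discount formula used above (with a hypothetical price of 200.0 in place of `input()`):

```python
# hypothetical price; the script reads it interactively via input()
price = 200.0
discounted = price - price * (5 / 100)
assert abs(discounted - 190.0) < 1e-9  # 5% of 200 is 10
```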
# apartment/apps/users/__init__.py (Hardy7/CS465_Final, MIT)
default_app_config = "users.apps.UserConfig"
# ddtrace/monkey.py (KDWSS/dd-trace-py, Apache-2.0 / BSD-3-Clause)
"""Patch libraries to be automatically instrumented.
It can monkey patch supported standard libraries and third party modules.
A patched module will automatically report spans with its default configuration.
A library instrumentation can be configured (for instance, to report as another service)
using Pin. For that, check its documentation.
"""
from ._monkey import ModuleNotFoundException # noqa
from ._monkey import PATCH_MODULES # noqa
from ._monkey import PatchException # noqa
from ._monkey import get_patched_modules # noqa
from ._monkey import patch # noqa
from ._monkey import patch_all # noqa
from ._monkey import patch_module # noqa
from .utils.deprecation import deprecation
deprecation(
name="ddtrace.monkey",
message="Import the patch and patch_all functions directly from the ddtrace module instead",
version="1.0.0",
)
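For readers unfamiliar with the term, monkey patching means swapping a module attribute for a wrapper that records a call before delegating to the original. A minimal pure-Python illustration of the idea (a hypothetical module and tracer, not ddtrace's actual mechanism):

```python
import types

# a stand-in for a third-party library module (hypothetical)
fakelib = types.ModuleType("fakelib")
fakelib.fetch = lambda url: "body of %s" % url

calls = []  # stands in for a tracer's span log

def patch(module):
    """Replace module.fetch with a recording wrapper, keeping behaviour."""
    original = module.fetch
    def traced_fetch(url):
        calls.append(("fetch", url))  # "report a span"
        return original(url)
    module.fetch = traced_fetch

patch(fakelib)
result = fakelib.fetch("http://example.com")
assert result == "body of http://example.com"
assert calls == [("fetch", "http://example.com")]
```

The real `patch()`/`patch_all()` do this per supported library, and `Pin` configures how the resulting spans are reported.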
# tests/test_crypto/test_ledger_apis.py (8ball030/agents-aea, Apache-2.0)
# -*- coding: utf-8 -*-
# ------------------------------------------------------------------------------
#
# Copyright 2018-2019 Fetch.AI Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# ------------------------------------------------------------------------------
"""This module contains the tests for the crypto/helpers module."""
import logging
import os
from typing import Dict
from unittest import mock
import pytest
from hexbytes import HexBytes
from aea.crypto.ethereum import ETHEREUM, EthereumCrypto
from aea.crypto.fetchai import FETCHAI, FetchAICrypto
from aea.crypto.ledger_apis import LedgerApis, DEFAULT_FETCHAI_CONFIG, \
_try_to_instantiate_fetchai_ledger_api, \
_try_to_instantiate_ethereum_ledger_api
from tests.conftest import CUR_PATH
logger = logging.getLogger(__name__)
DEFAULT_ETHEREUM_CONFIG = ("https://ropsten.infura.io/v3/f00f7b3ba0e848ddbdc8941c527447fe", 3)
fet_address = "B3t9pv4rYccWqCjeuoXsDoeXLiKxVAQh6Q3CLAiNZZQ2mtqF1"
eth_address = "0x21795D753752ccC1AC728002D23Ba33cbF13b8b0"
GAS_PRICE = '50'
GAS_ID = 'gwei'
class TestLedgerApis:
"""Test the ledger_apis module."""
def test_initialisation(self):
"""Test the initialisation of the ledger APIs."""
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
assert ledger_apis.configs.get(ETHEREUM) == DEFAULT_ETHEREUM_CONFIG
assert ledger_apis.has_fetchai
assert ledger_apis.has_ethereum
assert isinstance(ledger_apis.last_tx_statuses, Dict)
unknown_config = ("UknownPath", 8080)
with pytest.raises(ValueError):
LedgerApis({"UNKNOWN": unknown_config})
def test_eth_token_balance(self):
"""Test the token_balance for the eth tokens."""
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
api = ledger_apis.apis[ETHEREUM]
with mock.patch.object(api.eth, 'getBalance', return_value=10):
balance = ledger_apis.token_balance(ETHEREUM, eth_address)
assert balance == 10
assert ledger_apis.last_tx_statuses[ETHEREUM] == 'OK'
with mock.patch.object(api.eth, 'getBalance', return_value=0, side_effect=Exception):
balance = ledger_apis.token_balance(ETHEREUM, fet_address)
assert balance == 0, "This must be 0 since the address is wrong"
assert ledger_apis.last_tx_statuses[ETHEREUM] == 'ERROR'
def test_unknown_token_balance(self):
"""Test the token_balance for the unknown tokens."""
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
with pytest.raises(AssertionError):
balance = ledger_apis.token_balance("UNKNOWN", fet_address)
assert balance == 0, "Unknown identifier so it will return 0"
def test_fet_token_balance(self):
"""Test the token_balance for the fet tokens."""
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
api = ledger_apis.apis[FETCHAI]
with mock.patch.object(api.tokens, 'balance', return_value=10):
balance = ledger_apis.token_balance(FETCHAI, fet_address)
assert balance == 10
assert ledger_apis.last_tx_statuses[FETCHAI] == 'OK'
with mock.patch.object(api.tokens, 'balance', return_value=0, side_effect=Exception):
balance = ledger_apis.token_balance(FETCHAI, eth_address)
assert balance == 0, "This must be 0 since the address is wrong"
assert ledger_apis.last_tx_statuses[FETCHAI] == 'ERROR'
def test_transfer_fetchai(self):
"""Test the transfer function for fetchai token."""
private_key_path = os.path.join(CUR_PATH, 'data', "fet_private_key.txt")
fet_obj = FetchAICrypto(private_key_path=private_key_path)
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
with mock.patch.object(ledger_apis.apis.get(FETCHAI).tokens, 'transfer',
return_value="97fcacaaf94b62318c4e4bbf53fd2608c15062f17a6d1bffee0ba7af9b710e35"):
with mock.patch.object(ledger_apis.apis.get(FETCHAI), 'sync'):
tx_digest = ledger_apis.transfer(FETCHAI, fet_obj, fet_address, amount=10, tx_fee=10)
assert tx_digest is not None
assert ledger_apis.last_tx_statuses[FETCHAI] == 'OK'
def test_failed_transfer_fetchai(self):
"""Test the transfer function for fetchai token fails."""
private_key_path = os.path.join(CUR_PATH, 'data', "fet_private_key.txt")
fet_obj = FetchAICrypto(private_key_path=private_key_path)
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
with mock.patch.object(ledger_apis.apis.get(FETCHAI).tokens, 'transfer',
return_value="97fcacaaf94b62318c4e4bbf53fd2608c15062f17a6d1bffee0ba7af9b710e35"):
with mock.patch.object(ledger_apis.apis.get(FETCHAI), 'sync', side_effect=Exception):
tx_digest = ledger_apis.transfer(FETCHAI, fet_obj, fet_address, amount=10, tx_fee=10)
assert tx_digest is None
assert ledger_apis.last_tx_statuses[FETCHAI] == 'ERROR'
def test_transfer_ethereum(self):
"""Test the transfer function for ethereum token."""
private_key_path = os.path.join(CUR_PATH, "data", "eth_private_key.txt")
eth_obj = EthereumCrypto(private_key_path=private_key_path)
ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
FETCHAI: DEFAULT_FETCHAI_CONFIG})
with mock.patch.object(ledger_apis.apis.get(ETHEREUM).eth, 'getTransactionCount', return_value=5):
with mock.patch.object(ledger_apis.apis.get(ETHEREUM).eth.account, 'signTransaction',
return_value=mock.Mock()):
result = HexBytes('0xf85f808082c35094d898d5e829717c72e7438bad593076686d7d164a80801ba005c2e99ecee98a12fbf28ab9577423f42e9e88f2291b3acc8228de743884c874a077d6bc77a47ad41ec85c96aac2ad27f05a039c4787fca8a1e5ee2d8c7ec1bb6a')
with mock.patch.object(ledger_apis.apis.get(ETHEREUM).eth, 'sendRawTransaction',
return_value=result):
with mock.patch.object(ledger_apis.apis.get(ETHEREUM).eth, "getTransactionReceipt",
                return_value=b'0xa13f2f926233bc4638a20deeb8aaa7e8d6a96e487392fa55823f925220f6efed'):
            tx_digest = ledger_apis.transfer(ETHEREUM, eth_obj, eth_address, amount=10, tx_fee=200000)
            assert tx_digest is not None
            assert ledger_apis.last_tx_statuses[ETHEREUM] == 'OK'

    def test_failed_transfer_ethereum(self):
        """Test the transfer function for ethereum token fails."""
        private_key_path = os.path.join(CUR_PATH, "data", "eth_private_key.txt")
        eth_obj = EthereumCrypto(private_key_path=private_key_path)
        ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
                                  FETCHAI: DEFAULT_FETCHAI_CONFIG})
        with mock.patch.object(ledger_apis.apis.get(ETHEREUM).eth, 'getTransactionCount',
                               return_value=5, side_effect=Exception):
            tx_digest = ledger_apis.transfer(ETHEREUM, eth_obj, eth_address, amount=10, tx_fee=200000)
            assert tx_digest is None
            assert ledger_apis.last_tx_statuses[ETHEREUM] == 'ERROR'

    def test_is_tx_settled_fetchai(self):
        """Test if the transaction is settled for fetchai."""
        ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
                                  FETCHAI: DEFAULT_FETCHAI_CONFIG})
        tx_digest = "97fcacaaf94b62318c4e4bbf53fd2608c15062f17a6d1bffee0ba7af9b710e35"
        with pytest.raises(AssertionError):
            ledger_apis.is_tx_settled("Unknown", tx_digest=tx_digest, amount=10)
        with mock.patch.object(ledger_apis.apis[FETCHAI].tx, "status", return_value='Submitted'):
            is_successful = ledger_apis.is_tx_settled(FETCHAI, tx_digest=tx_digest, amount=10)
            assert is_successful
            assert ledger_apis.last_tx_statuses[FETCHAI] == 'OK'
        with mock.patch.object(ledger_apis.apis[FETCHAI].tx, "status", side_effect=Exception):
            is_successful = ledger_apis.is_tx_settled(FETCHAI, tx_digest=tx_digest, amount=10)
            assert not is_successful
            assert ledger_apis.last_tx_statuses[FETCHAI] == 'ERROR'

    def test_is_tx_settled_ethereum(self):
        """Test if the transaction is settled for eth."""
        ledger_apis = LedgerApis({ETHEREUM: DEFAULT_ETHEREUM_CONFIG,
                                  FETCHAI: DEFAULT_FETCHAI_CONFIG})
        tx_digest = "97fcacaaf94b62318c4e4bbf53fd2608c15062f17a6d1bffee0ba7af9b710e35"
        result = HexBytes(
            '0xf85f808082c35094d898d5e829717c72e7438bad593076686d7d164a80801ba005c2e99ecee98a12fbf28ab9577423f42e9e88f2291b3acc8228de743884c874a077d6bc77a47ad41ec85c96aac2ad27f05a039c4787fca8a1e5ee2d8c7ec1bb6a')
        with mock.patch.object(ledger_apis.apis[ETHEREUM].eth, "getTransactionReceipt", return_value=result):
            is_successful = ledger_apis.is_tx_settled(ETHEREUM, tx_digest=tx_digest, amount=10)
            assert is_successful
            assert ledger_apis.last_tx_statuses[ETHEREUM] == 'OK'
        with mock.patch.object(ledger_apis.apis[ETHEREUM].eth, "getTransactionReceipt", side_effect=Exception):
            is_successful = ledger_apis.is_tx_settled(ETHEREUM, tx_digest=tx_digest, amount=10)
            assert not is_successful
            assert ledger_apis.last_tx_statuses[ETHEREUM] == 'ERROR'

    def test_try_to_instantiate_fetchai_ledger_api(self):
        """Test the instantiation of the fetchai ledger api."""
        _try_to_instantiate_fetchai_ledger_api(addr="127.0.0.1", port=80)
        from fetchai.ledger.api import LedgerApi
        with mock.patch.object(LedgerApi, "__init__", side_effect=Exception):
            with pytest.raises(SystemExit):
                _try_to_instantiate_fetchai_ledger_api(addr="127.0.0.1", port=80)

    def test__try_to_instantiate_ethereum_ledger_api(self):
        """Test the instantiation of the ethereum ledger api."""
        _try_to_instantiate_ethereum_ledger_api(addr="127.0.0.1", port=80)
        from web3 import Web3
        with mock.patch.object(Web3, "__init__", side_effect=Exception):
            with pytest.raises(SystemExit):
                _try_to_instantiate_ethereum_ledger_api(addr="127.0.0.1", port=80)
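The failure-path tests above all hinge on one device: `unittest.mock.patch.object` swapping in a `return_value` for the happy path or a `side_effect` for the error path, with the original attribute restored on context exit. A minimal self-contained sketch of that behaviour (the `Api` class here is our own illustration, not part of the code base under test):

```python
from unittest import mock


class Api:
    def status(self):
        return "real"


api = Api()

# return_value: the patched attribute becomes a MagicMock returning this value.
with mock.patch.object(api, "status", return_value="Submitted"):
    assert api.status() == "Submitted"

# side_effect=Exception: calling the patched attribute raises instead,
# which is how the ERROR branches of the tests above are driven.
raised = False
with mock.patch.object(api, "status", side_effect=Exception):
    try:
        api.status()
    except Exception:
        raised = True
assert raised

# Outside both contexts the original method is back in place.
assert api.status() == "real"
```

Passing `side_effect` an exception class (rather than an instance) is enough: mock instantiates and raises it on each call.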


# ===========================================================================
# File: python codes/Leapyear.py
# Repo: vatsla1601/Hacktoberfest2020-1 @ 2f65f26bd5bba3ba518cac95b7bf080e295c11e0
# License: MIT
# ===========================================================================

# User enters the year
year = int(input("Enter Year: "))
# Leap Year Check
# Note: the % 400 test must come first; otherwise years like 2000
# (divisible by 100 AND 400) are wrongly reported as non-leap.
if year % 400 == 0:
    print(year, "is a Leap Year")
elif year % 100 == 0:
    print(year, "is not a Leap Year")
elif year % 4 == 0:
    print(year, "is a Leap Year")
else:
    print(year, "is not a Leap Year")
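For reference, the Gregorian leap-year rule packs into a single reusable predicate (the helper name below is our own, not part of the original script):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is leap, except centuries
    that are not divisible by 400."""
    return year % 400 == 0 or (year % 4 == 0 and year % 100 != 0)


# 2000 is leap (divisible by 400); 1900 is not (century, not /400).
print(is_leap_year(2000), is_leap_year(1900))
```

Ordering the `% 400` check first mirrors the fix needed in the branching version: the most specific divisibility condition must win.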


# ===========================================================================
# File: md2html/node.py
# Repo: dosisod/dosisod.github.io @ 8fca74392cbbe53a761f60cbdb05513dc663f768
# License: MIT
# ===========================================================================

from enum import auto, Enum
from dataclasses import dataclass, field
from typing import List


@dataclass(kw_only=True)
class Node:
    contents: str = ""


@dataclass(kw_only=True)
class CommentNode(Node):
    pass


@dataclass(kw_only=True)
class DataNode(Node):
    data: List[str] = field(default_factory=list)


@dataclass(kw_only=True)
class ListNode(DataNode):
    pass


@dataclass(kw_only=True)
class BulletNode(ListNode):
    pass


@dataclass(kw_only=True)
class NumListNode(ListNode):
    pass


@dataclass(kw_only=True)
class CheckboxNode(Node):
    checked: bool


@dataclass(kw_only=True)
class TextNode(Node):
    pass


@dataclass(kw_only=True)
class CodeblockNode(DataNode):
    pass


@dataclass(kw_only=True)
class PythonNode(Node):
    pass


@dataclass(kw_only=True)
class HtmlNode(Node):
    pass


@dataclass(kw_only=True)
class HeaderNode(Node):
    level: int = 1


@dataclass(kw_only=True)
class NewlineNode(Node):
    contents: str = ""


@dataclass(kw_only=True)
class BlockquoteNode(Node):
    pass


class HeaderAlignment(Enum):
    DEFAULT = auto()
    LEFT = auto()
    CENTER = auto()
    RIGHT = auto()


@dataclass
class HeaderCell:
    name: str
    alignment: HeaderAlignment = HeaderAlignment.DEFAULT


@dataclass(kw_only=True)
class TableNode(Node):
    header: List[HeaderCell]
    rows: List[List[str]]


# ===========================================================================
# File: models_nonconvex_simple2/sfacloc2_2_95.py
# Repo: grossmann-group/pyomo-MINLP-benchmarking @ 714f0a0dffd61675649a805683c0627af6b4929e
# License: MIT
# ===========================================================================

# MINLP written by GAMS Convert at 08/20/20 01:30:44
#
# Equation counts
# Total E G L N X C B
# 240 46 162 32 0 0 0 0
#
# Variable counts
# x b i s1s s2s sc si
# Total cont binary integer sos1 sos2 scont sint
# 187 148 39 0 0 0 0 0
# FX 0 0 0 0 0 0 0 0
#
# Nonzero counts
# Total const NL DLL
# 596 520 76 0
#
# Reformulation has removed 1 variable and 1 equation
from pyomo.environ import *
model = m = ConcreteModel()
m.x1 = Var(within=Reals,bounds=(0,0.26351883),initialize=0)
m.x2 = Var(within=Reals,bounds=(0,0.26351883),initialize=0)
m.x3 = Var(within=Reals,bounds=(0,0.22891574),initialize=0)
m.x4 = Var(within=Reals,bounds=(0,0.22891574),initialize=0)
m.x5 = Var(within=Reals,bounds=(0,0.21464835),initialize=0)
m.x6 = Var(within=Reals,bounds=(0,0.21464835),initialize=0)
m.x7 = Var(within=Reals,bounds=(0,0.17964414),initialize=0)
m.x8 = Var(within=Reals,bounds=(0,0.17964414),initialize=0)
m.x9 = Var(within=Reals,bounds=(0,0.17402843),initialize=0)
m.x10 = Var(within=Reals,bounds=(0,0.17402843),initialize=0)
m.x11 = Var(within=Reals,bounds=(0,0.15355962),initialize=0)
m.x12 = Var(within=Reals,bounds=(0,0.15355962),initialize=0)
m.x13 = Var(within=Reals,bounds=(0,0.1942283),initialize=0)
m.x14 = Var(within=Reals,bounds=(0,0.1942283),initialize=0)
m.x15 = Var(within=Reals,bounds=(0,0.25670555),initialize=0)
m.x16 = Var(within=Reals,bounds=(0,0.25670555),initialize=0)
m.x17 = Var(within=Reals,bounds=(0,0.27088619),initialize=0)
m.x18 = Var(within=Reals,bounds=(0,0.27088619),initialize=0)
m.x19 = Var(within=Reals,bounds=(0,0.28985675),initialize=0)
m.x20 = Var(within=Reals,bounds=(0,0.28985675),initialize=0)
m.x21 = Var(within=Reals,bounds=(0,0.25550303),initialize=0)
m.x22 = Var(within=Reals,bounds=(0,0.25550303),initialize=0)
m.x23 = Var(within=Reals,bounds=(0,0.19001726),initialize=0)
m.x24 = Var(within=Reals,bounds=(0,0.19001726),initialize=0)
m.x25 = Var(within=Reals,bounds=(0,0.23803143),initialize=0)
m.x26 = Var(within=Reals,bounds=(0,0.23803143),initialize=0)
m.x27 = Var(within=Reals,bounds=(0,0.23312962),initialize=0)
m.x28 = Var(within=Reals,bounds=(0,0.23312962),initialize=0)
m.x29 = Var(within=Reals,bounds=(0,0.27705307),initialize=0)
m.x30 = Var(within=Reals,bounds=(0,0.27705307),initialize=0)
m.x31 = Var(within=Reals,bounds=(5.68,5.96),initialize=5.68)
m.x32 = Var(within=Reals,bounds=(40.18,42.0933333333333),initialize=40.18)
m.x33 = Var(within=Reals,bounds=(94.7666666666667,99.28),initialize=94.7666666666667)
m.x34 = Var(within=Reals,bounds=(59.0533333333333,61.8666666666667),initialize=59.0533333333333)
m.x35 = Var(within=Reals,bounds=(53.7333333333333,56.2866666666667),initialize=53.7333333333333)
m.x36 = Var(within=Reals,bounds=(37.7266666666667,41.5),initialize=37.7266666666667)
m.x37 = Var(within=Reals,bounds=(59.6466666666667,62.4933333333333),initialize=59.6466666666667)
m.x38 = Var(within=Reals,bounds=(59.2733333333333,62.24),initialize=59.2733333333333)
m.b39 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b40 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b41 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b42 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b43 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b44 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b45 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b46 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b47 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b48 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b49 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b50 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b51 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b52 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b53 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b54 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b55 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b56 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b57 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b58 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b59 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b60 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b61 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b62 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b63 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b64 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b65 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b66 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b67 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b68 = Var(within=Binary,bounds=(0,1),initialize=0)
m.x69 = Var(within=Reals,bounds=(0,0.5323080366),initialize=0)
m.x70 = Var(within=Reals,bounds=(0,0.918715169866666),initialize=0)
m.x71 = Var(within=Reals,bounds=(0,1.021726146),initialize=0)
m.x72 = Var(within=Reals,bounds=(0,1.0706790744),initialize=0)
m.x73 = Var(within=Reals,bounds=(0,7.32543671346667),initialize=0)
m.x74 = Var(within=Reals,bounds=(0,15.2453990736),initialize=0)
m.x75 = Var(within=Reals,bounds=(0,1.28061192466667),initialize=0)
m.x76 = Var(within=Reals,bounds=(0,15.8815166933333),initialize=0)
m.x77 = Var(within=Reals,bounds=(0,15.2472806811333),initialize=0)
m.x78 = Var(within=Reals,bounds=(0,12.029055125),initialize=0)
m.x79 = Var(within=Reals,bounds=(0,15.9672360214667),initialize=0)
m.x80 = Var(within=Reals,bounds=(0,15.3736631157333),initialize=0)
m.x81 = Var(within=Reals,bounds=(0,6.2237284564),initialize=0)
m.x82 = Var(within=Reals,bounds=(0,8.85892556),initialize=0)
m.x83 = Var(within=Reals,bounds=(0,17.2437830768),initialize=0)
m.x84 = Var(within=Reals,bounds=(0.25788969,0.35227087),initialize=0.25788969)
m.x85 = Var(within=Reals,bounds=(0.25788969,0.35227087),initialize=0.25788969)
m.x86 = Var(within=Reals,bounds=(-0.98493628,-0.7794471),initialize=-0.7794471)
m.x87 = Var(within=Reals,bounds=(-0.98493628,-0.7794471),initialize=-0.7794471)
m.x88 = Var(within=Reals,bounds=(0,0.0580296499999999),initialize=0)
m.x89 = Var(within=Reals,bounds=(0,0.0580296499999999),initialize=0)
m.x90 = Var(within=Reals,bounds=(0,0.0546689399999999),initialize=0)
m.x91 = Var(within=Reals,bounds=(0,0.0546689399999999),initialize=0)
m.x92 = Var(within=Reals,bounds=(0,0.09360565),initialize=0)
m.x93 = Var(within=Reals,bounds=(0,0.09360565),initialize=0)
m.x94 = Var(within=Reals,bounds=(0,0.0476880399999999),initialize=0)
m.x95 = Var(within=Reals,bounds=(0,0.0476880399999999),initialize=0)
m.x96 = Var(within=Reals,bounds=(0,0.05276021),initialize=0)
m.x97 = Var(within=Reals,bounds=(0,0.05276021),initialize=0)
m.x98 = Var(within=Reals,bounds=(0,0.04905388),initialize=0)
m.x99 = Var(within=Reals,bounds=(0,0.04905388),initialize=0)
m.x100 = Var(within=Reals,bounds=(0,0.07731692),initialize=0)
m.x101 = Var(within=Reals,bounds=(0,0.07731692),initialize=0)
m.x102 = Var(within=Reals,bounds=(0,0.08211741),initialize=0)
m.x103 = Var(within=Reals,bounds=(0,0.08211741),initialize=0)
m.x104 = Var(within=Reals,bounds=(0,0.09438118),initialize=0)
m.x105 = Var(within=Reals,bounds=(0,0.09438118),initialize=0)
m.x106 = Var(within=Reals,bounds=(0,0.08436757),initialize=0)
m.x107 = Var(within=Reals,bounds=(0,0.08436757),initialize=0)
m.x108 = Var(within=Reals,bounds=(0,0.06987597),initialize=0)
m.x109 = Var(within=Reals,bounds=(0,0.06987597),initialize=0)
m.x110 = Var(within=Reals,bounds=(0,0.04788831),initialize=0)
m.x111 = Var(within=Reals,bounds=(0,0.04788831),initialize=0)
m.x112 = Var(within=Reals,bounds=(0,0.0668875099999999),initialize=0)
m.x113 = Var(within=Reals,bounds=(0,0.0668875099999999),initialize=0)
m.x114 = Var(within=Reals,bounds=(0,0.07276512),initialize=0)
m.x115 = Var(within=Reals,bounds=(0,0.07276512),initialize=0)
m.x116 = Var(within=Reals,bounds=(0,0.09438118),initialize=0)
m.x117 = Var(within=Reals,bounds=(0,0.09438118),initialize=0)
m.x118 = Var(within=Reals,bounds=(0,0.20548918),initialize=0)
m.x119 = Var(within=Reals,bounds=(0,0.20548918),initialize=0)
m.x120 = Var(within=Reals,bounds=(0,0.1742468),initialize=0)
m.x121 = Var(within=Reals,bounds=(0,0.1742468),initialize=0)
m.x122 = Var(within=Reals,bounds=(0,0.1210427),initialize=0)
m.x123 = Var(within=Reals,bounds=(0,0.1210427),initialize=0)
m.x124 = Var(within=Reals,bounds=(0,0.1319561),initialize=0)
m.x125 = Var(within=Reals,bounds=(0,0.1319561),initialize=0)
m.x126 = Var(within=Reals,bounds=(0,0.12126822),initialize=0)
m.x127 = Var(within=Reals,bounds=(0,0.12126822),initialize=0)
m.x128 = Var(within=Reals,bounds=(0,0.10450574),initialize=0)
m.x129 = Var(within=Reals,bounds=(0,0.10450574),initialize=0)
m.x130 = Var(within=Reals,bounds=(0,0.11691138),initialize=0)
m.x131 = Var(within=Reals,bounds=(0,0.11691138),initialize=0)
m.x132 = Var(within=Reals,bounds=(0,0.17458814),initialize=0)
m.x133 = Var(within=Reals,bounds=(0,0.17458814),initialize=0)
m.x134 = Var(within=Reals,bounds=(0,0.17650501),initialize=0)
m.x135 = Var(within=Reals,bounds=(0,0.17650501),initialize=0)
m.x136 = Var(within=Reals,bounds=(0,0.20548918),initialize=0)
m.x137 = Var(within=Reals,bounds=(0,0.20548918),initialize=0)
m.x138 = Var(within=Reals,bounds=(0,0.18562706),initialize=0)
m.x139 = Var(within=Reals,bounds=(0,0.18562706),initialize=0)
m.x140 = Var(within=Reals,bounds=(0,0.14212895),initialize=0)
m.x141 = Var(within=Reals,bounds=(0,0.14212895),initialize=0)
m.x142 = Var(within=Reals,bounds=(0,0.17114392),initialize=0)
m.x143 = Var(within=Reals,bounds=(0,0.17114392),initialize=0)
m.x144 = Var(within=Reals,bounds=(0,0.1603645),initialize=0)
m.x145 = Var(within=Reals,bounds=(0,0.1603645),initialize=0)
m.x146 = Var(within=Reals,bounds=(0,0.18267189),initialize=0)
m.x147 = Var(within=Reals,bounds=(0,0.18267189),initialize=0)
m.x148 = Var(within=Reals,bounds=(0,0.5323080366),initialize=0)
m.x149 = Var(within=Reals,bounds=(0,0.5323080366),initialize=0)
m.x150 = Var(within=Reals,bounds=(0,0.918715169866666),initialize=0)
m.x151 = Var(within=Reals,bounds=(0,0.918715169866666),initialize=0)
m.x152 = Var(within=Reals,bounds=(0,1.021726146),initialize=0)
m.x153 = Var(within=Reals,bounds=(0,1.021726146),initialize=0)
m.x154 = Var(within=Reals,bounds=(0,1.0706790744),initialize=0)
m.x155 = Var(within=Reals,bounds=(0,1.0706790744),initialize=0)
m.x156 = Var(within=Reals,bounds=(0,7.32543671346667),initialize=0)
m.x157 = Var(within=Reals,bounds=(0,7.32543671346667),initialize=0)
m.x158 = Var(within=Reals,bounds=(0,15.2453990736),initialize=0)
m.x159 = Var(within=Reals,bounds=(0,15.2453990736),initialize=0)
m.x160 = Var(within=Reals,bounds=(0,1.28061192466667),initialize=0)
m.x161 = Var(within=Reals,bounds=(0,1.28061192466667),initialize=0)
m.x162 = Var(within=Reals,bounds=(0,15.8815166933333),initialize=0)
m.x163 = Var(within=Reals,bounds=(0,15.8815166933333),initialize=0)
m.x164 = Var(within=Reals,bounds=(0,15.2472806811333),initialize=0)
m.x165 = Var(within=Reals,bounds=(0,15.2472806811333),initialize=0)
m.x166 = Var(within=Reals,bounds=(0,12.029055125),initialize=0)
m.x167 = Var(within=Reals,bounds=(0,12.029055125),initialize=0)
m.x168 = Var(within=Reals,bounds=(0,15.9672360214667),initialize=0)
m.x169 = Var(within=Reals,bounds=(0,15.9672360214667),initialize=0)
m.x170 = Var(within=Reals,bounds=(0,15.3736631157333),initialize=0)
m.x171 = Var(within=Reals,bounds=(0,15.3736631157333),initialize=0)
m.x172 = Var(within=Reals,bounds=(0,6.2237284564),initialize=0)
m.x173 = Var(within=Reals,bounds=(0,6.2237284564),initialize=0)
m.x174 = Var(within=Reals,bounds=(0,8.85892556),initialize=0)
m.x175 = Var(within=Reals,bounds=(0,8.85892556),initialize=0)
m.x176 = Var(within=Reals,bounds=(0,17.2437830768),initialize=0)
m.x177 = Var(within=Reals,bounds=(0,17.2437830768),initialize=0)
m.b178 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b179 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b180 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b181 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b182 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b183 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b184 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b185 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b186 = Var(within=Binary,bounds=(0,1),initialize=0)
m.obj = Objective(expr= m.x69 + m.x70 + m.x71 + m.x72 + m.x73 + m.x74 + m.x75 + m.x76 + m.x77 + m.x78 + m.x79 + m.x80
                        + m.x81 + m.x82 + m.x83, sense=minimize)
m.c2 = Constraint(expr=(-1.01*m.x1*m.b39) - 1.01*m.b39*m.x1 + m.x148 >= 0)
m.c3 = Constraint(expr=(-1.01*m.x2*m.b40) - 1.01*m.b40*m.x2 + m.x149 >= 0)
m.c4 = Constraint(expr=(-2.00666666666667*m.x3*m.b41) - 2.00666666666667*m.b41*m.x3 + m.x150 >= 0)
m.c5 = Constraint(expr=(-2.00666666666667*m.x4*m.b42) - 2.00666666666667*m.b42*m.x4 + m.x151 >= 0)
m.c6 = Constraint(expr=(-2.38*m.x5*m.b43) - 2.38*m.b43*m.x5 + m.x152 >= 0)
m.c7 = Constraint(expr=(-2.38*m.x6*m.b44) - 2.38*m.b44*m.x6 + m.x153 >= 0)
m.c8 = Constraint(expr=-m.x31*m.x7*m.b45 + m.x154 >= 0)
m.c9 = Constraint(expr=-m.x31*m.x8*m.b46 + m.x155 >= 0)
m.c10 = Constraint(expr=-m.x32*m.x9*m.b47 + m.x156 >= 0)
m.c11 = Constraint(expr=-m.x32*m.x10*m.b48 + m.x157 >= 0)
m.c12 = Constraint(expr=-m.x33*m.x11*m.b49 + m.x158 >= 0)
m.c13 = Constraint(expr=-m.x33*m.x12*m.b50 + m.x159 >= 0)
m.c14 = Constraint(expr=(-3.29666666666667*m.x13*m.b51) - 3.29666666666667*m.b51*m.x13 + m.x160 >= 0)
m.c15 = Constraint(expr=(-3.29666666666667*m.x14*m.b52) - 3.29666666666667*m.b52*m.x14 + m.x161 >= 0)
m.c16 = Constraint(expr=-m.x34*m.x15*m.b53 + m.x162 >= 0)
m.c17 = Constraint(expr=-m.x34*m.x16*m.b54 + m.x163 >= 0)
m.c18 = Constraint(expr=-m.x35*m.x17*m.b55 + m.x164 >= 0)
m.c19 = Constraint(expr=-m.x35*m.x18*m.b56 + m.x165 >= 0)
m.c20 = Constraint(expr=-m.x36*m.x19*m.b57 + m.x166 >= 0)
m.c21 = Constraint(expr=-m.x36*m.x20*m.b58 + m.x167 >= 0)
m.c22 = Constraint(expr=-m.x37*m.x21*m.b59 + m.x168 >= 0)
m.c23 = Constraint(expr=-m.x37*m.x22*m.b60 + m.x169 >= 0)
m.c24 = Constraint(expr=(-40.4533333333333*m.x23*m.b61) - 40.4533333333333*m.b61*m.x23 + m.x170 >= 0)
m.c25 = Constraint(expr=(-40.4533333333333*m.x24*m.b62) - 40.4533333333333*m.b62*m.x24 + m.x171 >= 0)
m.c26 = Constraint(expr=(-13.0733333333333*m.x25*m.b63) - 13.0733333333333*m.b63*m.x25 + m.x172 >= 0)
m.c27 = Constraint(expr=(-13.0733333333333*m.x26*m.b64) - 13.0733333333333*m.b64*m.x26 + m.x173 >= 0)
m.c28 = Constraint(expr=(-19*m.x27*m.b65) - 19*m.b65*m.x27 + m.x174 >= 0)
m.c29 = Constraint(expr=(-19*m.x28*m.b66) - 19*m.b66*m.x28 + m.x175 >= 0)
m.c30 = Constraint(expr=-m.x38*m.x29*m.b67 + m.x176 >= 0)
m.c31 = Constraint(expr=-m.x38*m.x30*m.b68 + m.x177 >= 0)
m.c32 = Constraint(expr= m.b39 + m.b40 == 1)
m.c33 = Constraint(expr= m.b41 + m.b42 == 1)
m.c34 = Constraint(expr= m.b43 + m.b44 == 1)
m.c35 = Constraint(expr= m.b45 + m.b46 == 1)
m.c36 = Constraint(expr= m.b47 + m.b48 == 1)
m.c37 = Constraint(expr= m.b49 + m.b50 == 1)
m.c38 = Constraint(expr= m.b51 + m.b52 == 1)
m.c39 = Constraint(expr= m.b53 + m.b54 == 1)
m.c40 = Constraint(expr= m.b55 + m.b56 == 1)
m.c41 = Constraint(expr= m.b57 + m.b58 == 1)
m.c42 = Constraint(expr= m.b59 + m.b60 == 1)
m.c43 = Constraint(expr= m.b61 + m.b62 == 1)
m.c44 = Constraint(expr= m.b63 + m.b64 == 1)
m.c45 = Constraint(expr= m.b65 + m.b66 == 1)
m.c46 = Constraint(expr= m.b67 + m.b68 == 1)
m.c47 = Constraint(expr= 2.02*m.b39 + 4.01333333333333*m.b41 + 4.76*m.b43 + 5.96*m.b45 + 42.0933333333333*m.b47
                         + 99.28*m.b49 + 6.59333333333333*m.b51 + 61.8666666666667*m.b53 + 56.2866666666667*m.b55
                         + 41.5*m.b57 + 62.4933333333333*m.b59 + 80.9066666666667*m.b61 + 26.1466666666667*m.b63
                         + 38*m.b65 + 62.24*m.b67 <= 302.08)
m.c48 = Constraint(expr= 2.02*m.b40 + 4.01333333333333*m.b42 + 4.76*m.b44 + 5.96*m.b46 + 42.0933333333333*m.b48
                         + 99.28*m.b50 + 6.59333333333333*m.b52 + 61.8666666666667*m.b54 + 56.2866666666667*m.b56
                         + 41.5*m.b58 + 62.4933333333333*m.b60 + 80.9066666666667*m.b62 + 26.1466666666667*m.b64
                         + 38*m.b66 + 62.24*m.b68 <= 302.08)
m.c49 = Constraint(expr= m.x84 + m.x88 >= 0.29424122)
m.c50 = Constraint(expr= m.x85 + m.x89 >= 0.29424122)
m.c51 = Constraint(expr= m.x84 + m.x90 >= 0.29760193)
m.c52 = Constraint(expr= m.x85 + m.x91 >= 0.29760193)
m.c53 = Constraint(expr= m.x84 + m.x92 >= 0.35149534)
m.c54 = Constraint(expr= m.x85 + m.x93 >= 0.35149534)
m.c55 = Constraint(expr= m.x84 + m.x94 >= 0.30458283)
m.c56 = Constraint(expr= m.x85 + m.x95 >= 0.30458283)
m.c57 = Constraint(expr= m.x84 + m.x96 >= 0.29951066)
m.c58 = Constraint(expr= m.x85 + m.x97 >= 0.29951066)
m.c59 = Constraint(expr= m.x84 + m.x98 >= 0.30694357)
m.c60 = Constraint(expr= m.x85 + m.x99 >= 0.30694357)
m.c61 = Constraint(expr= m.x84 + m.x100 >= 0.33520661)
m.c62 = Constraint(expr= m.x85 + m.x101 >= 0.33520661)
m.c63 = Constraint(expr= m.x84 + m.x102 >= 0.3400071)
m.c64 = Constraint(expr= m.x85 + m.x103 >= 0.3400071)
m.c65 = Constraint(expr= m.x84 + m.x104 >= 0.35227087)
m.c66 = Constraint(expr= m.x85 + m.x105 >= 0.35227087)
m.c67 = Constraint(expr= m.x84 + m.x106 >= 0.34225726)
m.c68 = Constraint(expr= m.x85 + m.x107 >= 0.34225726)
m.c69 = Constraint(expr= m.x84 + m.x108 >= 0.32776566)
m.c70 = Constraint(expr= m.x85 + m.x109 >= 0.32776566)
m.c71 = Constraint(expr= m.x84 + m.x110 >= 0.30438256)
m.c72 = Constraint(expr= m.x85 + m.x111 >= 0.30438256)
m.c73 = Constraint(expr= m.x84 + m.x112 >= 0.28538336)
m.c74 = Constraint(expr= m.x85 + m.x113 >= 0.28538336)
m.c75 = Constraint(expr= m.x84 + m.x114 >= 0.27950575)
m.c76 = Constraint(expr= m.x85 + m.x115 >= 0.27950575)
m.c77 = Constraint(expr= - m.x84 + m.x88 >= -0.29424122)
m.c78 = Constraint(expr= - m.x85 + m.x89 >= -0.29424122)
m.c79 = Constraint(expr= - m.x84 + m.x90 >= -0.29760193)
m.c80 = Constraint(expr= - m.x85 + m.x91 >= -0.29760193)
m.c81 = Constraint(expr= - m.x84 + m.x92 >= -0.35149534)
m.c82 = Constraint(expr= - m.x85 + m.x93 >= -0.35149534)
m.c83 = Constraint(expr= - m.x84 + m.x94 >= -0.30458283)
m.c84 = Constraint(expr= - m.x85 + m.x95 >= -0.30458283)
m.c85 = Constraint(expr= - m.x84 + m.x96 >= -0.29951066)
m.c86 = Constraint(expr= - m.x85 + m.x97 >= -0.29951066)
m.c87 = Constraint(expr= - m.x84 + m.x98 >= -0.30694357)
m.c88 = Constraint(expr= - m.x85 + m.x99 >= -0.30694357)
m.c89 = Constraint(expr= - m.x84 + m.x100 >= -0.33520661)
m.c90 = Constraint(expr= - m.x85 + m.x101 >= -0.33520661)
m.c91 = Constraint(expr= - m.x84 + m.x102 >= -0.3400071)
m.c92 = Constraint(expr= - m.x85 + m.x103 >= -0.3400071)
m.c93 = Constraint(expr= - m.x84 + m.x106 >= -0.34225726)
m.c94 = Constraint(expr= - m.x85 + m.x107 >= -0.34225726)
m.c95 = Constraint(expr= - m.x84 + m.x108 >= -0.32776566)
m.c96 = Constraint(expr= - m.x85 + m.x109 >= -0.32776566)
m.c97 = Constraint(expr= - m.x84 + m.x110 >= -0.30438256)
m.c98 = Constraint(expr= - m.x85 + m.x111 >= -0.30438256)
m.c99 = Constraint(expr= - m.x84 + m.x112 >= -0.28538336)
m.c100 = Constraint(expr= - m.x85 + m.x113 >= -0.28538336)
m.c101 = Constraint(expr= - m.x84 + m.x114 >= -0.27950575)
m.c102 = Constraint(expr= - m.x85 + m.x115 >= -0.27950575)
m.c103 = Constraint(expr= - m.x84 + m.x116 >= -0.25788969)
m.c104 = Constraint(expr= - m.x85 + m.x117 >= -0.25788969)
m.c105 = Constraint(expr= m.x86 + m.x120 >= -0.9536939)
m.c106 = Constraint(expr= m.x87 + m.x121 >= -0.9536939)
m.c107 = Constraint(expr= m.x86 + m.x122 >= -0.9004898)
m.c108 = Constraint(expr= m.x87 + m.x123 >= -0.9004898)
m.c109 = Constraint(expr= m.x86 + m.x124 >= -0.9114032)
m.c110 = Constraint(expr= m.x87 + m.x125 >= -0.9114032)
m.c111 = Constraint(expr= m.x86 + m.x126 >= -0.90071532)
m.c112 = Constraint(expr= m.x87 + m.x127 >= -0.90071532)
m.c113 = Constraint(expr= m.x86 + m.x128 >= -0.88043054)
m.c114 = Constraint(expr= m.x87 + m.x129 >= -0.88043054)
m.c115 = Constraint(expr= m.x86 + m.x130 >= -0.8680249)
m.c116 = Constraint(expr= m.x87 + m.x131 >= -0.8680249)
m.c117 = Constraint(expr= m.x86 + m.x132 >= -0.81034814)
m.c118 = Constraint(expr= m.x87 + m.x133 >= -0.81034814)
m.c119 = Constraint(expr= m.x86 + m.x134 >= -0.80843127)
m.c120 = Constraint(expr= m.x87 + m.x135 >= -0.80843127)
m.c121 = Constraint(expr= m.x86 + m.x136 >= -0.7794471)
m.c122 = Constraint(expr= m.x87 + m.x137 >= -0.7794471)
m.c123 = Constraint(expr= m.x86 + m.x138 >= -0.79930922)
m.c124 = Constraint(expr= m.x87 + m.x139 >= -0.79930922)
m.c125 = Constraint(expr= m.x86 + m.x140 >= -0.84280733)
m.c126 = Constraint(expr= m.x87 + m.x141 >= -0.84280733)
m.c127 = Constraint(expr= m.x86 + m.x142 >= -0.81379236)
m.c128 = Constraint(expr= m.x87 + m.x143 >= -0.81379236)
m.c129 = Constraint(expr= m.x86 + m.x144 >= -0.82457178)
m.c130 = Constraint(expr= m.x87 + m.x145 >= -0.82457178)
m.c131 = Constraint(expr= m.x86 + m.x146 >= -0.80226439)
m.c132 = Constraint(expr= m.x87 + m.x147 >= -0.80226439)
m.c133 = Constraint(expr= - m.x86 + m.x118 >= 0.98493628)
m.c134 = Constraint(expr= - m.x87 + m.x119 >= 0.98493628)
m.c135 = Constraint(expr= - m.x86 + m.x120 >= 0.9536939)
m.c136 = Constraint(expr= - m.x87 + m.x121 >= 0.9536939)
m.c137 = Constraint(expr= - m.x86 + m.x122 >= 0.9004898)
m.c138 = Constraint(expr= - m.x87 + m.x123 >= 0.9004898)
m.c139 = Constraint(expr= - m.x86 + m.x124 >= 0.9114032)
m.c140 = Constraint(expr= - m.x87 + m.x125 >= 0.9114032)
m.c141 = Constraint(expr= - m.x86 + m.x126 >= 0.90071532)
m.c142 = Constraint(expr= - m.x87 + m.x127 >= 0.90071532)
m.c143 = Constraint(expr= - m.x86 + m.x128 >= 0.88043054)
m.c144 = Constraint(expr= - m.x87 + m.x129 >= 0.88043054)
m.c145 = Constraint(expr= - m.x86 + m.x130 >= 0.8680249)
m.c146 = Constraint(expr= - m.x87 + m.x131 >= 0.8680249)
m.c147 = Constraint(expr= - m.x86 + m.x132 >= 0.81034814)
m.c148 = Constraint(expr= - m.x87 + m.x133 >= 0.81034814)
m.c149 = Constraint(expr= - m.x86 + m.x134 >= 0.80843127)
m.c150 = Constraint(expr= - m.x87 + m.x135 >= 0.80843127)
m.c151 = Constraint(expr= - m.x86 + m.x138 >= 0.79930922)
m.c152 = Constraint(expr= - m.x87 + m.x139 >= 0.79930922)
m.c153 = Constraint(expr= - m.x86 + m.x140 >= 0.84280733)
m.c154 = Constraint(expr= - m.x87 + m.x141 >= 0.84280733)
m.c155 = Constraint(expr= - m.x86 + m.x142 >= 0.81379236)
m.c156 = Constraint(expr= - m.x87 + m.x143 >= 0.81379236)
m.c157 = Constraint(expr= - m.x86 + m.x144 >= 0.82457178)
m.c158 = Constraint(expr= - m.x87 + m.x145 >= 0.82457178)
m.c159 = Constraint(expr= - m.x86 + m.x146 >= 0.80226439)
m.c160 = Constraint(expr= - m.x87 + m.x147 >= 0.80226439)
m.c161 = Constraint(expr= m.x1 - m.x88 - m.x118 == 0)
m.c162 = Constraint(expr= m.x2 - m.x89 - m.x119 == 0)
m.c163 = Constraint(expr= m.x3 - m.x90 - m.x120 == 0)
m.c164 = Constraint(expr= m.x4 - m.x91 - m.x121 == 0)
m.c165 = Constraint(expr= m.x5 - m.x92 - m.x122 == 0)
m.c166 = Constraint(expr= m.x6 - m.x93 - m.x123 == 0)
m.c167 = Constraint(expr= m.x7 - m.x94 - m.x124 == 0)
m.c168 = Constraint(expr= m.x8 - m.x95 - m.x125 == 0)
m.c169 = Constraint(expr= m.x9 - m.x96 - m.x126 == 0)
m.c170 = Constraint(expr= m.x10 - m.x97 - m.x127 == 0)
m.c171 = Constraint(expr= m.x11 - m.x98 - m.x128 == 0)
m.c172 = Constraint(expr= m.x12 - m.x99 - m.x129 == 0)
m.c173 = Constraint(expr= m.x13 - m.x100 - m.x130 == 0)
m.c174 = Constraint(expr= m.x14 - m.x101 - m.x131 == 0)
m.c175 = Constraint(expr= m.x15 - m.x102 - m.x132 == 0)
m.c176 = Constraint(expr= m.x16 - m.x103 - m.x133 == 0)
m.c177 = Constraint(expr= m.x17 - m.x104 - m.x134 == 0)
m.c178 = Constraint(expr= m.x18 - m.x105 - m.x135 == 0)
m.c179 = Constraint(expr= m.x19 - m.x106 - m.x136 == 0)
m.c180 = Constraint(expr= m.x20 - m.x107 - m.x137 == 0)
m.c181 = Constraint(expr= m.x21 - m.x108 - m.x138 == 0)
m.c182 = Constraint(expr= m.x22 - m.x109 - m.x139 == 0)
m.c183 = Constraint(expr= m.x23 - m.x110 - m.x140 == 0)
m.c184 = Constraint(expr= m.x24 - m.x111 - m.x141 == 0)
m.c185 = Constraint(expr= m.x25 - m.x112 - m.x142 == 0)
m.c186 = Constraint(expr= m.x26 - m.x113 - m.x143 == 0)
m.c187 = Constraint(expr= m.x27 - m.x114 - m.x144 == 0)
m.c188 = Constraint(expr= m.x28 - m.x115 - m.x145 == 0)
m.c189 = Constraint(expr= m.x29 - m.x116 - m.x146 == 0)
m.c190 = Constraint(expr= m.x30 - m.x117 - m.x147 == 0)
m.c191 = Constraint(expr= m.b179 + m.b180 >= 1)
m.c192 = Constraint(expr= m.b178 + m.b180 >= 1)
m.c193 = Constraint(expr= m.b178 + m.b179 >= 1)
m.c194 = Constraint(expr= m.b180 + m.b182 >= 1)
m.c195 = Constraint(expr= m.b180 + m.b181 >= 1)
m.c196 = Constraint(expr= m.b179 + m.b182 >= 1)
m.c197 = Constraint(expr= m.b179 + m.b181 >= 1)
m.c198 = Constraint(expr= m.b178 + m.b182 >= 1)
m.c199 = Constraint(expr= m.b178 + m.b181 >= 1)
m.c200 = Constraint(expr= m.x31 - 5.96*m.b178 >= 0)
m.c201 = Constraint(expr= m.x32 - 42.0933333333333*m.b179 >= 0)
m.c202 = Constraint(expr= m.x33 - 99.28*m.b180 >= 0)
m.c203 = Constraint(expr= m.x34 - 61.8666666666667*m.b181 >= 0)
m.c204 = Constraint(expr= m.x35 - 56.2866666666667*m.b182 >= 0)
m.c205 = Constraint(expr= m.x36 - 39.6133333333333*m.b183 >= 0)
m.c206 = Constraint(expr= m.x36 - 41.5*m.b184 >= 0)
m.c207 = Constraint(expr= m.x37 - 62.4933333333333*m.b185 >= 0)
m.c208 = Constraint(expr= m.x38 - 62.24*m.b186 >= 0)
m.c209 = Constraint(expr= - m.x69 + m.x148 <= 0)
m.c210 = Constraint(expr= - m.x69 + m.x149 <= 0)
m.c211 = Constraint(expr= - m.x70 + m.x150 <= 0)
m.c212 = Constraint(expr= - m.x70 + m.x151 <= 0)
m.c213 = Constraint(expr= - m.x71 + m.x152 <= 0)
m.c214 = Constraint(expr= - m.x71 + m.x153 <= 0)
m.c215 = Constraint(expr= - m.x72 + m.x154 <= 0)
m.c216 = Constraint(expr= - m.x72 + m.x155 <= 0)
m.c217 = Constraint(expr= - m.x73 + m.x156 <= 0)
m.c218 = Constraint(expr= - m.x73 + m.x157 <= 0)
m.c219 = Constraint(expr= - m.x74 + m.x158 <= 0)
m.c220 = Constraint(expr= - m.x74 + m.x159 <= 0)
m.c221 = Constraint(expr= - m.x75 + m.x160 <= 0)
m.c222 = Constraint(expr= - m.x75 + m.x161 <= 0)
m.c223 = Constraint(expr= - m.x76 + m.x162 <= 0)
m.c224 = Constraint(expr= - m.x76 + m.x163 <= 0)
m.c225 = Constraint(expr= - m.x77 + m.x164 <= 0)
m.c226 = Constraint(expr= - m.x77 + m.x165 <= 0)
m.c227 = Constraint(expr= - m.x78 + m.x166 <= 0)
m.c228 = Constraint(expr= - m.x78 + m.x167 <= 0)
m.c229 = Constraint(expr= - m.x79 + m.x168 <= 0)
m.c230 = Constraint(expr= - m.x79 + m.x169 <= 0)
m.c231 = Constraint(expr= - m.x80 + m.x170 <= 0)
m.c232 = Constraint(expr= - m.x80 + m.x171 <= 0)
m.c233 = Constraint(expr= - m.x81 + m.x172 <= 0)
m.c234 = Constraint(expr= - m.x81 + m.x173 <= 0)
m.c235 = Constraint(expr= - m.x82 + m.x174 <= 0)
m.c236 = Constraint(expr= - m.x82 + m.x175 <= 0)
m.c237 = Constraint(expr= - m.x83 + m.x176 <= 0)
m.c238 = Constraint(expr= - m.x83 + m.x177 <= 0)
m.c239 = Constraint(expr= m.b183 - m.b184 >= 0)
m.c240 = Constraint(expr= m.x86 - m.x87 >= 0)
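Rows `m.c200`–`m.c208` above all follow the same linking pattern `x - M*b >= 0` with a binary `b`: when the unit is selected (`b = 1`) the continuous variable must reach its coefficient, and when `b = 0` the constraint collapses to `x >= 0`. A minimal plain-Python sketch of what one such row enforces (hypothetical names, not Pyomo):

```python
def lower_bound_link(x, b, m_coef):
    """Check the linking constraint x - m_coef*b >= 0.

    With binary b = 1, x must be at least m_coef;
    with b = 0, the constraint reduces to x >= 0.
    """
    return x - m_coef * b >= 0

# Mirrors m.c202: m.x33 - 99.28*m.b180 >= 0
assert lower_bound_link(99.28, 1, 99.28)      # exactly at the bound: feasible
assert not lower_bound_link(50.0, 1, 99.28)   # selected but below the bound: infeasible
assert lower_bound_link(0.0, 0, 99.28)        # unit off: trivially satisfied
```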
| 39.028653 | 119 | 0.655495 | 4,809 | 27,242 | 3.713246 | 0.132668 | 0.030688 | 0.187322 | 0.164641 | 0.701742 | 0.609677 | 0.609677 | 0.609229 | 0.609229 | 0.343059 | 0 | 0.279151 | 0.140261 | 27,242 | 697 | 120 | 39.084648 | 0.483284 | 0.024961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.002299 | 0 | 0.002299 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c4ba4380258671989c146897f898e573b64695b4 | 1,896 | py | Python | test/test_game_over.py | jasondaming/bobail | cbc3796c5da9ff1daae0ed6729e1c336c79322b2 | [
"MIT"
] | null | null | null | test/test_game_over.py | jasondaming/bobail | cbc3796c5da9ff1daae0ed6729e1c336c79322b2 | [
"MIT"
] | null | null | null | test/test_game_over.py | jasondaming/bobail | cbc3796c5da9ff1daae0ed6729e1c336c79322b2 | [
"MIT"
] | null | null | null | import unittest
from bobail.game import Game
import random
class TestGameOver(unittest.TestCase):
def setUp(self):
self.game = Game()
def test_new_game_not_over(self):
self.expect(False)
def test_win_by_bobail_end_row(self):
self.make_non_final_moves([[3, 11], [13, 8], [25, 10]])
self.move([8, 3]).expect(True)
def test_win_by_bobail_trapped(self):
self.make_non_final_moves([[3, 11], [13, 12], [23, 17], [12, 16]])
self.move([4, 12]).expect(True)
"""
def test_move_limit_draw(self):
self.make_non_final_moves([[10, 14], [22, 17], [9, 13], [17, 10], [7, 14], [25, 22], [6, 10], [29, 25], [1, 6], [22, 18], [6, 9], [24, 19], [2, 6], [28, 24],
[11, 16], [24, 20], [8, 11], [32, 28], [4, 8], [27, 24], [3, 7], [31, 27], [13, 17], [25, 22], [9, 13], [18, 9], [9, 2], [10, 14], [22, 18], [5, 9], [19, 15],
[16, 19], [23, 16], [12, 19], [30, 25], [14, 23], [23, 32], [21, 14], [14, 5], [11, 18], [2, 11], [11, 4], [19, 23], [26, 19], [13, 17], [25, 21], [17, 22],
[21, 17], [22, 25], [17, 14], [18, 22], [5, 1], [22, 26], [4, 8], [26, 31], [19, 15], [25, 30], [8, 11], [31, 26], [1, 6], [26, 23], [24, 19], [23, 16],
[16, 7], [14, 10], [7, 14], [15, 10], [14, 7], [28, 24], [32, 28], [20, 16], [28, 19], [19, 12], [6, 9], [7, 10], [9, 13], [10, 7], [13, 9], [7, 3], [9, 6],
[3, 7], [6, 1], [7, 11], [1, 6], [11, 8], [6, 9], [8, 11], [9, 6], [11, 8], [6, 9], [8, 11], [9, 6], [11, 8], [6, 9], [8, 11], [9, 6], [11, 8], [6, 9], [8, 11],
[9, 6], [11, 8], [6, 9], [8, 11], [9, 6], [11, 8], [6, 9], [8, 11], [9, 6], [11, 8], [6, 9], [8, 11], [9, 6]])
self.move([11, 8]).expect(True)
"""
def make_non_final_moves(self, moves):
for move in moves:
self.move(move).expect(False)
def move(self, move):
self.game.move(move)
return self
def expect(self, value):
self.assertIs(self.game.is_over(), value) | 43.090909 | 164 | 0.484705 | 358 | 1,896 | 2.486034 | 0.192737 | 0.020225 | 0.031461 | 0.039326 | 0.21573 | 0.175281 | 0.147191 | 0.147191 | 0.147191 | 0.079775 | 0 | 0.25968 | 0.209916 | 1,896 | 44 | 165 | 43.090909 | 0.334446 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 1 | 0.318182 | false | 0 | 0.136364 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
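The `self.move(...).expect(...)` chains in this suite work because `move` returns `self`. A self-contained sketch of that fluent-assertion pattern, using a stubbed game object rather than the real `bobail` package (the three-move ending rule here is invented for illustration):

```python
class FakeGame:
    """Stand-in for bobail.game.Game: declared over after three moves."""
    def __init__(self):
        self.moves = 0

    def move(self, mv):
        self.moves += 1

    def is_over(self):
        return self.moves >= 3


class Checker:
    def __init__(self, game):
        self.game = game

    def move(self, mv):
        self.game.move(mv)
        return self          # returning self is what enables chaining

    def expect(self, value):
        assert self.game.is_over() is value
        return self


c = Checker(FakeGame())
c.move([3, 11]).expect(False).move([13, 8]).expect(False).move([25, 10]).expect(True)
```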
c4cdd3fa30a063ec12b40b110fdad2bdeef08314 | 269 | py | Python | src/utils/env_utils.py | super6liu/technical-analysis-julian | dd8868b65d80f78e536f3471d4dc09440de48e62 | [
"MIT"
] | null | null | null | src/utils/env_utils.py | super6liu/technical-analysis-julian | dd8868b65d80f78e536f3471d4dc09440de48e62 | [
"MIT"
] | null | null | null | src/utils/env_utils.py | super6liu/technical-analysis-julian | dd8868b65d80f78e536f3471d4dc09440de48e62 | [
"MIT"
] | 1 | 2021-10-03T13:18:09.000Z | 2021-10-03T13:18:09.000Z | from src.configs import CONFIGS
from src.constants import Env
def get_env():
return Env(CONFIGS["env"] or Env.PRODUCETION)
def set_env(env: Env):
CONFIGS["env"] = env
def is_test():
env = get_env()
return env == Env.TEST or env == Env.DEVELOPMENT
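`src.constants.Env` and `CONFIGS` are not shown in this file; a self-contained sketch of the same get/set/is-test pattern, with a hypothetical stdlib `enum.Enum` and a plain dict standing in for the project's config object:

```python
from enum import Enum

class Env(Enum):
    PRODUCTION = "production"
    DEVELOPMENT = "development"
    TEST = "test"

CONFIGS = {"env": None}

def get_env():
    # Fall back to production when no environment is configured.
    return Env(CONFIGS["env"] or Env.PRODUCTION)

def set_env(env):
    CONFIGS["env"] = env

def is_test():
    return get_env() in (Env.TEST, Env.DEVELOPMENT)

set_env(Env.TEST)
assert is_test()
```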
| 16.8125 | 52 | 0.672862 | 42 | 269 | 4.214286 | 0.357143 | 0.169492 | 0.135593 | 0.169492 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204461 | 269 | 15 | 53 | 17.933333 | 0.827103 | 0 | 0 | 0 | 0 | 0 | 0.022305 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.222222 | 0.111111 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
c4eae706f8fcbe1ab1e4daff9460b2af2e60b285 | 87 | py | Python | lablog/apps.py | raylab/EmpathyLab | ae1bde75103eb38c979694ee76d13a319196c36b | [
"CC0-1.0"
] | null | null | null | lablog/apps.py | raylab/EmpathyLab | ae1bde75103eb38c979694ee76d13a319196c36b | [
"CC0-1.0"
] | 6 | 2018-03-30T02:50:04.000Z | 2018-04-15T16:11:52.000Z | lablog/apps.py | raylab/EmpathyLab | ae1bde75103eb38c979694ee76d13a319196c36b | [
"CC0-1.0"
] | null | null | null | from django.apps import AppConfig
class LablogConfig(AppConfig):
name = 'lablog'
| 14.5 | 33 | 0.747126 | 10 | 87 | 6.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172414 | 87 | 5 | 34 | 17.4 | 0.902778 | 0 | 0 | 0 | 0 | 0 | 0.068966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
f2068df91c4edfb351c4c9e5aff891396f2adac1 | 17 | py | Python | commands/tests/__init__.py | MaferMazu/blockchain-network-simulator | 3f2184b2c05681184eee40168b142e5d0af581b1 | [
"MIT"
] | 1 | 2021-03-30T09:29:13.000Z | 2021-03-30T09:29:13.000Z | tests/__init__.py | alueschow/srupy | a73fa62549bb3e62b07fe2be4a96926a41088ae2 | [
"MIT"
] | 1 | 2021-03-24T10:28:21.000Z | 2021-03-24T10:28:21.000Z | commands/tests/__init__.py | MaferMazu/blockchain-network-simulator | 3f2184b2c05681184eee40168b142e5d0af581b1 | [
"MIT"
] | 1 | 2021-03-20T20:23:26.000Z | 2021-03-20T20:23:26.000Z | """Docstring."""
| 8.5 | 16 | 0.529412 | 1 | 17 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 17 | 1 | 17 | 17 | 0.5625 | 0.588235 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
f20ae42a83d651cabcf43872caa538c600105c34 | 8,220 | py | Python | readthedocs/rtd_tests/tests/test_version_querysets.py | darrowco/readthedocs.org | fa7fc5a24306f1f6a27c7393f381c594ab29b357 | [
"MIT"
] | null | null | null | readthedocs/rtd_tests/tests/test_version_querysets.py | darrowco/readthedocs.org | fa7fc5a24306f1f6a27c7393f381c594ab29b357 | [
"MIT"
] | null | null | null | readthedocs/rtd_tests/tests/test_version_querysets.py | darrowco/readthedocs.org | fa7fc5a24306f1f6a27c7393f381c594ab29b357 | [
"MIT"
] | null | null | null | from django_dynamic_fixture import get
from django.contrib.auth.models import User
from django.test import TestCase
from readthedocs.projects.models import Project
from readthedocs.projects.constants import PRIVATE, PUBLIC, PROTECTED
from readthedocs.builds.constants import LATEST
from readthedocs.builds.models import Version
class VersionQuerySetTests(TestCase):
def setUp(self):
self.user = get(User)
self.another_user = get(User)
self.project = get(
Project,
privacy_level=PUBLIC,
users=[self.user],
main_language_project=None,
versions=[],
)
self.version_latest = self.project.versions.get(slug=LATEST)
self.version = get(
Version,
privacy_level=PUBLIC,
project=self.project,
active=True,
)
self.version_private = get(
Version,
privacy_level=PRIVATE,
project=self.project,
active=True,
)
self.version_protected = get(
Version,
privacy_level=PROTECTED,
project=self.project,
active=True,
)
self.another_project = get(
Project,
privacy_level=PUBLIC,
users=[self.another_user],
main_language_project=None,
versions=[],
)
self.another_version_latest = self.another_project.versions.get(slug=LATEST)
self.another_version = get(
Version,
privacy_level=PUBLIC,
project=self.another_project,
active=True,
)
self.another_version_private = get(
Version,
privacy_level=PRIVATE,
project=self.another_project,
active=True,
)
self.another_version_protected = get(
Version,
privacy_level=PROTECTED,
project=self.another_project,
active=True,
)
self.shared_project = get(
Project,
privacy_level=PUBLIC,
users=[self.user, self.another_user],
main_language_project=None,
versions=[],
)
self.shared_version_latest = self.shared_project.versions.get(slug=LATEST)
self.shared_version = get(
Version,
privacy_level=PUBLIC,
project=self.shared_project,
active=True,
)
self.shared_version_private = get(
Version,
privacy_level=PRIVATE,
project=self.shared_project,
active=True,
)
self.shared_version_protected = get(
Version,
privacy_level=PROTECTED,
project=self.shared_project,
active=True,
)
self.user_versions = {
self.version,
self.version_latest,
self.version_private,
self.version_protected,
self.shared_version,
self.shared_version_latest,
self.shared_version_private,
self.shared_version_protected,
}
self.another_user_versions = {
self.another_version_latest,
self.another_version,
self.another_version_private,
self.another_version_protected,
self.shared_version,
self.shared_version_latest,
self.shared_version_private,
self.shared_version_protected,
}
def test_public(self):
query = Version.objects.public()
versions = {
self.version_latest,
self.version,
self.another_version,
self.another_version_latest,
self.shared_version,
self.shared_version_latest,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_public_user(self):
query = Version.objects.public(user=self.user)
versions = (
self.user_versions |
{self.another_version_latest, self.another_version}
)
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_public_project(self):
query = Version.objects.public(user=self.user, project=self.project)
versions = {
self.version,
self.version_latest,
self.version_private,
self.version_protected,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_protected(self):
query = Version.objects.protected()
versions = {
self.version,
self.version_latest,
self.version_protected,
self.another_version,
self.another_version_latest,
self.another_version_protected,
self.shared_version,
self.shared_version_latest,
self.shared_version_protected,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_protected_user(self):
query = Version.objects.protected(user=self.user)
versions = (
self.user_versions |
{
self.another_version,
self.another_version_latest,
self.another_version_protected,
}
)
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_protected_project(self):
query = Version.objects.protected(user=self.user, project=self.project)
versions = {
self.version,
self.version_latest,
self.version_private,
self.version_protected,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_private(self):
query = Version.objects.private()
versions = {
self.version_private,
self.another_version_private,
self.shared_version_private,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_private_user(self):
query = Version.objects.private(user=self.user)
versions = (
self.user_versions |
{self.another_version_private}
)
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_private_project(self):
query = Version.objects.private(user=self.user, project=self.project)
versions = {
self.version,
self.version_latest,
self.version_private,
self.version_protected,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_api(self):
query = Version.objects.api()
versions = {
self.version_latest,
self.version,
self.another_version,
self.another_version_latest,
self.shared_version,
self.shared_version_latest,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_api_user(self):
query = Version.objects.api(user=self.user, detail=False)
versions = self.user_versions
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
def test_for_project(self):
self.another_project.main_language_project = self.project
self.another_project.save()
query = Version.objects.for_project(self.project)
versions = {
self.version,
self.version_latest,
self.version_protected,
self.version_private,
self.another_version,
self.another_version_latest,
self.another_version_protected,
self.another_version_private,
}
self.assertEqual(query.count(), len(versions))
self.assertEqual(set(query), versions)
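The queryset methods exercised above (`public`, `protected`, `private`) each widen the visible set as the privacy level rises, with project members seeing everything on their own projects. A minimal set-based sketch of that visibility logic in plain Python (not the real Read the Docs `VersionQuerySet`; names are illustrative):

```python
PUBLIC, PROTECTED, PRIVATE = "public", "protected", "private"

versions = [
    ("v-public", PUBLIC, "alice"),
    ("v-protected", PROTECTED, "alice"),
    ("v-private", PRIVATE, "alice"),
    ("other-public", PUBLIC, "bob"),
]

def visible(versions, level, user=None):
    # Privacy levels form a ladder: public < protected < private.
    ladder = {PUBLIC: 0, PROTECTED: 1, PRIVATE: 2}
    allowed = ladder[level]
    return {
        name
        for name, lvl, owner in versions
        if ladder[lvl] <= allowed or owner == user  # owners see their own
    }

assert visible(versions, PUBLIC) == {"v-public", "other-public"}
assert visible(versions, PROTECTED, user="alice") == {
    "v-public", "v-protected", "v-private", "other-public"
}
```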
| 31.860465 | 84 | 0.587105 | 797 | 8,220 | 5.846926 | 0.062735 | 0.087339 | 0.100429 | 0.064378 | 0.883691 | 0.827468 | 0.782403 | 0.747425 | 0.674893 | 0.526824 | 0 | 0 | 0.326399 | 8,220 | 257 | 85 | 31.984436 | 0.841611 | 0 | 0 | 0.631356 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101695 | 1 | 0.055085 | false | 0 | 0.029661 | 0 | 0.088983 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
1eed754f4ffcc83268d48e26138477f856bd5b83 | 200 | py | Python | src/custom_logging/middlewares.py | sh-cho/django-custom-logging | a016fd1ede6f81e3b0ca4ccbe170ffe2015ecb76 | [
"MIT"
] | 4 | 2021-04-05T14:24:09.000Z | 2021-11-01T09:34:23.000Z | src/custom_logging/middlewares.py | sh-cho/django-custom-logging | a016fd1ede6f81e3b0ca4ccbe170ffe2015ecb76 | [
"MIT"
] | null | null | null | src/custom_logging/middlewares.py | sh-cho/django-custom-logging | a016fd1ede6f81e3b0ca4ccbe170ffe2015ecb76 | [
"MIT"
] | null | null | null | from . import local_thread
def capture_request(get_response):
def middleware(request):
local_thread.request = request or None
return get_response(request)
return middleware
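`local_thread` here comes from the package's own module; the usual way to make such per-request storage safe across threads is `threading.local()`. A dependency-free sketch of the same capture pattern, with the `get_response`/`middleware` shape mirroring Django's middleware protocol and a plain string standing in for the request object:

```python
import threading

local_thread = threading.local()

def capture_request(get_response):
    def middleware(request):
        # Stash the current request where downstream code can reach it
        # without threading it through every call signature.
        local_thread.request = request or None
        return get_response(request)
    return middleware

handler = capture_request(lambda req: "handled " + req)
assert handler("GET /") == "handled GET /"
assert local_thread.request == "GET /"
```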
| 20 | 46 | 0.725 | 24 | 200 | 5.833333 | 0.541667 | 0.157143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.22 | 200 | 9 | 47 | 22.222222 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
1eefb0b2f5d5177d57174c190054a13509188ae0 | 4,491 | py | Python | utils/interfaces/_library/appsec_data.py | DataDog/system-tests | 04f94312fddb135830dbe2df6d51d9246561ea6e | [
"Apache-2.0"
] | 3 | 2021-11-15T20:28:25.000Z | 2022-01-27T18:33:15.000Z | utils/interfaces/_library/appsec_data.py | DataDog/system-tests | 04f94312fddb135830dbe2df6d51d9246561ea6e | [
"Apache-2.0"
] | 25 | 2021-11-08T15:50:38.000Z | 2022-03-29T12:16:17.000Z | utils/interfaces/_library/appsec_data.py | DataDog/system-tests | 04f94312fddb135830dbe2df6d51d9246561ea6e | [
"Apache-2.0"
] | 1 | 2021-11-15T20:28:28.000Z | 2021-11-15T20:28:28.000Z | # Unless explicitly stated otherwise all files in this repository are licensed under the the Apache License Version 2.0.
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2021 Datadog, Inc.
# Automatic generation from:
# python utils/scripts/extract_appsec_waf_rules.py nodejs
rule_id_to_type = {
"crs-913-110": "security_scanner",
"crs-913-120": "security_scanner",
"ua0-600-0xx": "security_scanner",
"ua0-600-10x": "security_scanner",
"ua0-600-12x": "security_scanner",
"ua0-600-13x": "security_scanner",
"ua0-600-14x": "security_scanner",
"ua0-600-15x": "security_scanner",
"ua0-600-16x": "security_scanner",
"ua0-600-18x": "security_scanner",
"ua0-600-19x": "security_scanner",
"ua0-600-1xx": "security_scanner",
"ua0-600-20x": "security_scanner",
"ua0-600-22x": "security_scanner",
"ua0-600-23x": "security_scanner",
"ua0-600-25x": "security_scanner",
"ua0-600-26x": "security_scanner",
"ua0-600-27x": "security_scanner",
"ua0-600-28x": "security_scanner",
"ua0-600-29x": "security_scanner",
"ua0-600-2xx": "security_scanner",
"ua0-600-30x": "security_scanner",
"ua0-600-31x": "security_scanner",
"ua0-600-32x": "security_scanner",
"ua0-600-33x": "security_scanner",
"ua0-600-34x": "security_scanner",
"ua0-600-35x": "security_scanner",
"ua0-600-36x": "security_scanner",
"ua0-600-37x": "security_scanner",
"ua0-600-39x": "security_scanner",
"ua0-600-3xx": "security_scanner",
"ua0-600-40x": "security_scanner",
"ua0-600-41x": "security_scanner",
"ua0-600-42x": "security_scanner",
"ua0-600-43x": "security_scanner",
"ua0-600-44x": "security_scanner",
"ua0-600-45x": "security_scanner",
"ua0-600-46x": "security_scanner",
"ua0-600-47x": "security_scanner",
"ua0-600-48x": "security_scanner",
"ua0-600-49x": "security_scanner",
"ua0-600-4xx": "security_scanner",
"ua0-600-51x": "security_scanner",
"ua0-600-52x": "security_scanner",
"ua0-600-53x": "security_scanner",
"ua0-600-54x": "security_scanner",
"ua0-600-5xx": "security_scanner",
"ua0-600-6xx": "security_scanner",
"ua0-600-7xx": "security_scanner",
"ua0-600-9xx": "security_scanner",
"crs-920-260": "http_protocol_violation",
"crs-921-110": "http_protocol_violation",
"crs-921-140": "http_protocol_violation",
"crs-921-160": "http_protocol_violation",
"crs-943-100": "http_protocol_violation",
"crs-930-100": "lfi",
"crs-930-110": "lfi",
"crs-930-120": "lfi",
"crs-931-110": "rfi",
"crs-931-120": "rfi",
"crs-932-160": "command_injection",
"crs-932-171": "command_injection",
"crs-932-180": "command_injection",
"sqr-000-008": "command_injection",
"sqr-000-009": "command_injection",
"sqr-000-010": "command_injection",
"crs-933-111": "unrestricted_file_upload",
"crs-933-130": "php_code_injection",
"crs-933-131": "php_code_injection",
"crs-933-140": "php_code_injection",
"crs-933-150": "php_code_injection",
"crs-933-160": "php_code_injection",
"crs-933-170": "php_code_injection",
"crs-933-200": "php_code_injection",
"crs-934-100": "js_code_injection",
"sqr-000-002": "js_code_injection",
"crs-941-100": "xss",
"crs-941-110": "xss",
"crs-941-120": "xss",
"crs-941-140": "xss",
"crs-941-180": "xss",
"crs-941-200": "xss",
"crs-941-210": "xss",
"crs-941-220": "xss",
"crs-941-230": "xss",
"crs-941-240": "xss",
"crs-941-270": "xss",
"crs-941-280": "xss",
"crs-941-290": "xss",
"crs-941-300": "xss",
"crs-941-350": "xss",
"crs-941-360": "xss",
"crs-942-100": "sql_injection",
"crs-942-140": "sql_injection",
"crs-942-160": "sql_injection",
"crs-942-190": "sql_injection",
"crs-942-220": "sql_injection",
"crs-942-240": "sql_injection",
"crs-942-250": "sql_injection",
"crs-942-270": "sql_injection",
"crs-942-280": "sql_injection",
"crs-942-360": "sql_injection",
"crs-942-500": "sql_injection",
"crs-942-290": "nosql_injection",
"sqr-000-007": "nosql_injection",
"crs-944-100": "java_code_injection",
"crs-944-110": "java_code_injection",
"crs-944-130": "java_code_injection",
"sqr-000-001": "ssrf",
"sqr-000-011": "ssrf",
"sqr-000-012": "ssrf",
"sqr-000-013": "ssrf",
"sqr-000-014": "ssrf",
"sqr-000-015": "ssrf",
}
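A flat id-to-type dict like the one above is easy to invert when a test needs every rule id for a given attack class. A small sketch using `collections.defaultdict`, shown on a three-entry sample of the table rather than the full mapping:

```python
from collections import defaultdict

sample_rule_id_to_type = {
    "crs-930-100": "lfi",
    "crs-930-110": "lfi",
    "sqr-000-001": "ssrf",
}

def invert(mapping):
    grouped = defaultdict(list)
    # Sort first so each value list comes out in a stable order.
    for rule_id, attack_type in sorted(mapping.items()):
        grouped[attack_type].append(rule_id)
    return dict(grouped)

assert invert(sample_rule_id_to_type) == {
    "lfi": ["crs-930-100", "crs-930-110"],
    "ssrf": ["sqr-000-001"],
}
```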
| 35.928 | 120 | 0.6277 | 611 | 4,491 | 4.432079 | 0.278232 | 0.276957 | 0.319055 | 0.37223 | 0.095643 | 0 | 0 | 0 | 0 | 0 | 0 | 0.18131 | 0.163661 | 4,491 | 124 | 121 | 36.217742 | 0.53967 | 0.07014 | 0 | 0 | 0 | 0 | 0.664748 | 0.033333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
4839394dc53ab72ad0fa51ddfd0f73a119654e54 | 112 | py | Python | plugins/hello.py | takaakiaoki/ilas-slackbot-aoki | 001c13340948a1b80ec4f6bc66ceb093d6a7ca66 | [
"MIT"
] | null | null | null | plugins/hello.py | takaakiaoki/ilas-slackbot-aoki | 001c13340948a1b80ec4f6bc66ceb093d6a7ca66 | [
"MIT"
] | null | null | null | plugins/hello.py | takaakiaoki/ilas-slackbot-aoki | 001c13340948a1b80ec4f6bc66ceb093d6a7ca66 | [
"MIT"
] | null | null | null | import slackbot.bot
@slackbot.bot.respond_to('hello')
def resp_hello(message):
message.reply('こんにちは')
| 18.666667 | 34 | 0.714286 | 15 | 112 | 5.2 | 0.733333 | 0.282051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 112 | 5 | 35 | 22.4 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0.093458 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
48565e754f39efb7bc7362b9e5136fa99ea54b01 | 767 | py | Python | neukivy/factory_registers.py | kengoon/NeuKivy | 1ad64b0dc32cef870d727034fa2313045f3ce4de | [
"MIT"
] | 1 | 2021-06-25T07:04:04.000Z | 2021-06-25T07:04:04.000Z | neukivy/factory_registers.py | kengoon/NeuKivy | 1ad64b0dc32cef870d727034fa2313045f3ce4de | [
"MIT"
] | null | null | null | neukivy/factory_registers.py | kengoon/NeuKivy | 1ad64b0dc32cef870d727034fa2313045f3ce4de | [
"MIT"
] | null | null | null | from kivy.factory import Factory
r = Factory.register
r("NeuButton", module="neukivy.uix.button")
r("NeuRoundedButton", module="neukivy.uix.button")
r("NeuCircularButton", module="neukivy.uix.button")
r("NeuIconButton", module="neukivy.uix.button")
r("NeuRoundedIconButton", module="neukivy.uix.button")
r("NeuCircularIconButton", module="neukivy.uix.button")
r("NeuIconTextButton", module="neukivy.uix.button")
r("NeuRoundedIconTextButton", module="neukivy.uix.button")
r("NeuCard", module="neukivy.uix.card")
r("NeuBackdrop", module="neukivy.uix.backdrop")
r("NeuBackdropFrontLayer", module="neukivy.uix.backdrop")
r("NeuBanner", module="neukivy.uix.banner")
r("NeuBannerLeftWidget", module="neukivy.uix.banner")
r("NeuBannerAction", module="neukivy.uix.banner")
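Kivy's `Factory.register` records only a class name and a module path, deferring the import until the widget is first used. A stdlib-only sketch of that lazy-registration idea using `importlib`, with a standard-library class standing in for the NeuKivy widgets:

```python
import importlib

_registry = {}

def register(name, module):
    # Record where the object lives; the import happens on first lookup.
    _registry[name] = module

def resolve(name):
    module = importlib.import_module(_registry[name])
    return getattr(module, name)

register("OrderedDict", module="collections")
cls = resolve("OrderedDict")
assert cls.__name__ == "OrderedDict"
```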
| 40.368421 | 58 | 0.765319 | 92 | 767 | 6.380435 | 0.293478 | 0.310051 | 0.381601 | 0.29983 | 0.477002 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04824 | 767 | 18 | 59 | 42.611111 | 0.80411 | 0 | 0 | 0 | 0 | 0 | 0.616688 | 0.08605 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
4858c4478ad63d18f082b63bb99e796851f071a7 | 325 | py | Python | arxpy/smt/tests/test_verification_differential.py | ranea/RX-difference-tool | 28b4a42bc0c602956f980a8b0d7d758f4147ff5e | [
"MIT"
] | null | null | null | arxpy/smt/tests/test_verification_differential.py | ranea/RX-difference-tool | 28b4a42bc0c602956f980a8b0d7d758f4147ff5e | [
"MIT"
] | null | null | null | arxpy/smt/tests/test_verification_differential.py | ranea/RX-difference-tool | 28b4a42bc0c602956f980a8b0d7d758f4147ff5e | [
"MIT"
] | null | null | null | """Tests for the verification module."""
import unittest
import doctest
import arxpy.smt.verification_differential
class EmptyTest(unittest.TestCase):
pass
def load_tests(loader, tests, ignore):
"""Add doctests."""
tests.addTests(doctest.DocTestSuite(arxpy.smt.verification_differential))
return tests
| 18.055556 | 77 | 0.756923 | 37 | 325 | 6.567568 | 0.648649 | 0.065844 | 0.164609 | 0.263374 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141538 | 325 | 17 | 78 | 19.117647 | 0.870968 | 0.147692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 4 |
485be2a332a52e0c6d1f38bdc388d47757ba78fb | 211 | py | Python | tests/test_pydisp.py | dimatura/pydisplay | 98addd643372c19cbdab2226b5b600eaac5953a5 | [
"MIT"
] | 2 | 2017-03-09T19:06:39.000Z | 2017-03-15T04:10:22.000Z | tests/test_pydisp.py | dimatura/pydisplay | 98addd643372c19cbdab2226b5b600eaac5953a5 | [
"MIT"
] | null | null | null | tests/test_pydisp.py | dimatura/pydisplay | 98addd643372c19cbdab2226b5b600eaac5953a5 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Tests for `pydisp` module.
"""
import pytest
from contextlib import contextmanager
from click.testing import CliRunner
import pydisp
from pydisp import cli
| 13.1875 | 37 | 0.725118 | 28 | 211 | 5.464286 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00565 | 0.161137 | 211 | 15 | 38 | 14.066667 | 0.858757 | 0.327014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
4868ef2e203da395641db74123cb18e7b219ac72 | 15,821 | py | Python | cairis/controllers/RequirementController.py | RAIJ95/https-github.com-failys-cairis | 86601347ea016f4a3f90b6942093d63e91de5f74 | [
"Apache-2.0"
] | null | null | null | cairis/controllers/RequirementController.py | RAIJ95/https-github.com-failys-cairis | 86601347ea016f4a3f90b6942093d63e91de5f74 | [
"Apache-2.0"
] | null | null | null | cairis/controllers/RequirementController.py | RAIJ95/https-github.com-failys-cairis | 86601347ea016f4a3f90b6942093d63e91de5f74 | [
"Apache-2.0"
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import httplib
from flask import session, request, make_response
from flask_restful import Resource
from flask_restful_swagger import swagger
from cairis.daemon.CairisHTTPError import ObjectNotFoundHTTPError, ARMHTTPError, MalformedJSONHTTPError, MissingParameterHTTPError
from cairis.data.RequirementDAO import RequirementDAO
from cairis.tools.MessageDefinitions import RequirementMessage
from cairis.tools.ModelDefinitions import RequirementModel
from cairis.tools.SessionValidator import get_session_id
from cairis.tools.JsonConverter import json_serialize
__author__ = 'Robin Quetin'
class RequirementsAPI(Resource):
# region Swagger Doc
@swagger.operation(
notes='Get all requirements',
nickname='requirements-get',
responseClass=RequirementModel.__name__,
responseContainer='List',
parameters=[
{
"name": "ordered",
"description": "Defines if the list has to be order",
"default": 1,
"required": False,
"allowMultiple": False,
"dataType": int.__name__,
"paramType": "query"
},
{
"name": "constraint_id",
"description": "The constraint used as filter to query the database",
"default": "",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
},
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
"code": httplib.BAD_REQUEST,
"message": "The database connection was not properly set up"
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
}
]
)
# endregion
def get(self):
session_id = get_session_id(session, request)
ordered = request.args.get('ordered', '1')
constraint_id = request.args.get('constraint_id', '')
dao = RequirementDAO(session_id)
reqs = dao.get_requirements(constraint_id=constraint_id, ordered=(ordered=='1'))
dao.close()
resp = make_response(json_serialize(reqs, session_id=session_id), httplib.OK)
resp.headers['Content-type'] = 'application/json'
resp.headers['Access-Control-Allow-Origin'] = "*"
return resp
# region Swagger Doc
@swagger.operation(
notes='Creates a new requirement',
nickname='requirements-post',
parameters=[
{
"name": "body",
"description": "The serialized version of the new requirement to be added",
"required": True,
"allowMultiple": False,
"type": RequirementMessage.__name__,
"paramType": "body"
},
{
"name": "asset",
"description": "The name of the asset which is associated to the new requirement",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
},
{
"name": "environment",
"description": "The name of the environment which is associated to the new requirement",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
},
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
},
{
'code': MalformedJSONHTTPError.status_code,
'message': MalformedJSONHTTPError.status
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
}
]
)
# endregion
def post(self):
session_id = get_session_id(session, request)
asset_name = request.args.get('asset', None)
environment_name = request.args.get('environment', None)
dao = RequirementDAO(session_id)
new_req = dao.from_json(request)
req_id = dao.add_requirement(new_req, asset_name=asset_name, environment_name=environment_name)
dao.close()
resp_dict = {'message': 'Requirement successfully added', 'requirement_id': req_id}
resp = make_response(json_serialize(resp_dict), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
# region Swagger Docs
@swagger.operation(
notes='Updates a requirement',
nickname='requirement-update-put',
parameters=[
{
'name': 'body',
"description": "The new updated requirement",
"required": True,
"allowMultiple": False,
'type': RequirementMessage.__name__,
'paramType': 'body'
}
],
responseMessages=[
{
'code': ObjectNotFoundHTTPError.status_code,
'message': ObjectNotFoundHTTPError.status
},
{
'code': MalformedJSONHTTPError.status_code,
'message': MalformedJSONHTTPError.status
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
}
]
)
# endregion
def put(self):
session_id = get_session_id(session, request)
dao = RequirementDAO(session_id)
req = dao.from_json(request)
dao.update_requirement(req, req_id=req.theId)
dao.close()
resp_dict = {'message': 'Requirement successfully updated'}
resp = make_response(json_serialize(resp_dict), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
class RequirementsByAssetAPI(Resource):
# region Swagger Doc
@swagger.operation(
notes='Get the requirements associated with an asset',
nickname='requirements-by-asset-get',
responseClass=RequirementModel.__name__,
responseContainer='List',
parameters=[
{
"name": "ordered",
"description": "Defines if the list has to be order",
"default": 1,
"required": False,
"allowMultiple": False,
"dataType": int.__name__,
"paramType": "query"
},
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
"code": httplib.BAD_REQUEST,
"message": "The database connection was not properly set up"
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
}
]
)
# endregion
def get(self, name):
session_id = get_session_id(session, request)
ordered = request.args.get('ordered', '1')
dao = RequirementDAO(session_id)
reqs = dao.get_requirements(constraint_id=name, is_asset=True, ordered=(ordered=='1'))
dao.close()
resp = make_response(json_serialize(reqs, session_id=session_id), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
class RequirementsByEnvironmentAPI(Resource):
# region Swagger Doc
@swagger.operation(
notes='Get the requirements associated with an environment',
nickname='requirements-by-environment-get',
responseClass=RequirementModel.__name__,
responseContainer='List',
parameters=[
{
"name": "ordered",
                "description": "Defines whether the list should be ordered",
"default": 1,
"required": False,
"allowMultiple": False,
"dataType": int.__name__,
"paramType": "query"
},
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
"code": httplib.BAD_REQUEST,
"message": "The database connection was not properly set up"
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
}
]
)
# endregion
def get(self, name):
session_id = get_session_id(session, request)
ordered = request.args.get('ordered', '1')
dao = RequirementDAO(session_id)
reqs = dao.get_requirements(constraint_id=name, is_asset=False, ordered=(ordered=='1'))
dao.close()
resp = make_response(json_serialize(reqs, session_id=session_id), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
class RequirementByNameAPI(Resource):
# region Swagger Doc
@swagger.operation(
notes='Get a requirement by name',
nickname='requirement-by-name-get',
responseClass=RequirementModel.__name__,
parameters=[
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
"code": httplib.BAD_REQUEST,
"message": "The database connection was not properly set up"
},
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
},
{
'code': ObjectNotFoundHTTPError.status_code,
'message': ObjectNotFoundHTTPError.status
}
]
)
# endregion
def get(self, name):
session_id = get_session_id(session, request)
dao = RequirementDAO(session_id)
req = dao.get_requirement_by_name(name)
dao.close()
resp = make_response(json_serialize(req, session_id=session_id), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
# region Swagger Doc
@swagger.operation(
notes='Deletes an existing requirement',
nickname='requirement-by-name-delete',
parameters=[
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MalformedJSONHTTPError.status_code,
'message': MalformedJSONHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
},
{
'code': ObjectNotFoundHTTPError.status_code,
'message': ObjectNotFoundHTTPError.status
}
]
)
# endregion
def delete(self, name):
session_id = get_session_id(session, request)
dao = RequirementDAO(session_id)
dao.delete_requirement(name=name)
dao.close()
resp_dict = {'message': 'Requirement successfully deleted'}
resp = make_response(json_serialize(resp_dict), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
class RequirementByShortcodeAPI(Resource):
# region Swagger Doc
@swagger.operation(
        notes='Get a requirement by shortcode',
nickname='requirement-by-id-get',
responseClass=RequirementModel.__name__,
parameters=[
{
"name": "session_id",
"description": "The ID of the user's session",
"required": False,
"allowMultiple": False,
"dataType": str.__name__,
"paramType": "query"
}
],
responseMessages=[
{
'code': ARMHTTPError.status_code,
'message': ARMHTTPError.status
},
{
'code': MissingParameterHTTPError.status_code,
'message': MissingParameterHTTPError.status
},
{
'code': ObjectNotFoundHTTPError.status_code,
'message': ObjectNotFoundHTTPError.status
}
]
)
# endregion
def get(self, shortcode):
session_id = get_session_id(session, request)
dao = RequirementDAO(session_id)
req = dao.get_requirement_by_shortcode(shortcode)
dao.close()
resp = make_response(json_serialize(req, session_id=session_id), httplib.OK)
resp.headers['Content-type'] = 'application/json'
return resp
| 34.543668 | 130 | 0.544593 | 1,333 | 15,821 | 6.295574 | 0.152288 | 0.045043 | 0.046592 | 0.048022 | 0.73868 | 0.719852 | 0.715324 | 0.701859 | 0.661106 | 0.637393 | 0 | 0.001279 | 0.357689 | 15,821 | 457 | 131 | 34.619256 | 0.824542 | 0.062954 | 0 | 0.568182 | 0 | 0 | 0.198837 | 0.011832 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020202 | false | 0 | 0.025253 | 0 | 0.078283 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
487eda7246d5065a6326fdfdfeae8ef4e829b161 | 260 | py | Python | server/server.py | MRtecno98/Juno | 18576ad6b14000f40ddcdbe14434bc1986f7b64b | [
"MIT"
] | null | null | null | server/server.py | MRtecno98/Juno | 18576ad6b14000f40ddcdbe14434bc1986f7b64b | [
"MIT"
] | null | null | null | server/server.py | MRtecno98/Juno | 18576ad6b14000f40ddcdbe14434bc1986f7b64b | [
"MIT"
] | null | null | null | from .agent_operator import AgentOperator
from .router import Router
class LogisticalRouter(Router):
def __init__(self, agents_configuration):
super().__init__()
self.operator = AgentOperator(self, "localhost", 7777, agents_configuration)
| 32.5 | 84 | 0.753846 | 27 | 260 | 6.851852 | 0.592593 | 0.086486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018265 | 0.157692 | 260 | 7 | 85 | 37.142857 | 0.826484 | 0 | 0 | 0 | 0 | 0 | 0.034615 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
488025cf27dd390997ea57a79914d14c58d17a96 | 252 | py | Python | tests/dags/dag_dagfile_has_dag.py | politools/dag-checks | 53b587d29d5e985698d9b08627af9231647de316 | [
"Apache-2.0"
] | 5 | 2021-02-06T14:01:08.000Z | 2021-12-02T00:06:07.000Z | tests/dags/dag_dagfile_has_dag.py | Polidea/dag-checks | 53b587d29d5e985698d9b08627af9231647de316 | [
"Apache-2.0"
] | null | null | null | tests/dags/dag_dagfile_has_dag.py | Polidea/dag-checks | 53b587d29d5e985698d9b08627af9231647de316 | [
"Apache-2.0"
] | 3 | 2021-02-01T08:09:40.000Z | 2021-11-01T18:29:23.000Z | from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago
with DAG(dag_id="dag_dagfile_has_dag_fail", schedule_interval=None, start_date=days_ago(1)):
DummyOperator(task_id="test")
| 36 | 92 | 0.829365 | 39 | 252 | 5.076923 | 0.641026 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004348 | 0.087302 | 252 | 6 | 93 | 42 | 0.856522 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
6f890edd1d8e95fd92f0e4244480ea7056ddfb96 | 18 | py | Python | test/integration/Lambda/two parameters.py | HighSchoolHacking/GLS-Draft | 9e418b6290e7c8e3f2da87668784bdba1cde5a76 | [
"MIT"
] | 30 | 2019-10-29T12:47:50.000Z | 2022-02-12T06:41:39.000Z | test/integration/Lambda/two parameters.py | HighSchoolHacking/GLS-Draft | 9e418b6290e7c8e3f2da87668784bdba1cde5a76 | [
"MIT"
] | 247 | 2017-09-21T17:11:18.000Z | 2019-10-08T12:59:07.000Z | test/integration/Lambda/two parameters.py | HighSchoolHacking/GLS-Draft | 9e418b6290e7c8e3f2da87668784bdba1cde5a76 | [
"MIT"
] | 17 | 2017-10-01T16:53:20.000Z | 2018-11-28T07:20:35.000Z | #
abc(def, ghi)
#
| 4.5 | 13 | 0.5 | 3 | 18 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 18 | 3 | 14 | 6 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6fb01fb5e9d710542100f90f0bea37a4da3c63c8 | 141 | py | Python | 8-9.py | Holaplace/path_to_python | 8fae2aca8d6da04c39a67514948fdf50e883750a | [
"MIT"
] | 1 | 2019-02-06T01:49:18.000Z | 2019-02-06T01:49:18.000Z | 8-9.py | Holaplace/path_to_python | 8fae2aca8d6da04c39a67514948fdf50e883750a | [
"MIT"
] | null | null | null | 8-9.py | Holaplace/path_to_python | 8fae2aca8d6da04c39a67514948fdf50e883750a | [
"MIT"
] | null | null | null | def show_magicians(names):
for name in names:
print(name)
unprinted_name = ['a','b','c','d']
show_magicians(unprinted_name) | 23.5 | 35 | 0.64539 | 20 | 141 | 4.35 | 0.65 | 0.298851 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.198582 | 141 | 6 | 36 | 23.5 | 0.769912 | 0 | 0 | 0 | 0 | 0 | 0.029197 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.2 | 0.6 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
6fca6a0ffa9a3a0bbd2f44aa41088d7773521122 | 76 | py | Python | framework/utils/__init__.py | maxim-filkov/mobile-test-helper | 46dcb27b0ee25153b697d19c17801cee35e136ce | [
"Apache-2.0"
] | 1 | 2018-12-14T02:13:14.000Z | 2018-12-14T02:13:14.000Z | framework/utils/__init__.py | maxim-filkov/mobile-test-helper | 46dcb27b0ee25153b697d19c17801cee35e136ce | [
"Apache-2.0"
] | null | null | null | framework/utils/__init__.py | maxim-filkov/mobile-test-helper | 46dcb27b0ee25153b697d19c17801cee35e136ce | [
"Apache-2.0"
] | 1 | 2018-12-14T02:13:21.000Z | 2018-12-14T02:13:21.000Z | """
Define here anything what is needed for the package framework.utils.
""" | 25.333333 | 68 | 0.75 | 11 | 76 | 5.181818 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144737 | 76 | 3 | 69 | 25.333333 | 0.876923 | 0.894737 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6fd7cc5b707fc4743a9bf24846c1994cce369414 | 187 | py | Python | Dataset/Leetcode/test/20/206.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/test/20/206.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/test/20/206.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | class Solution:
def XXX(self, s):
while "()" in s or "[]" in s or "{}" in s:
s = s.replace('[]','').replace('()','').replace('{}','')
return len(s) == 0
| 26.714286 | 68 | 0.406417 | 24 | 187 | 3.166667 | 0.541667 | 0.118421 | 0.131579 | 0.184211 | 0.171053 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007752 | 0.31016 | 187 | 6 | 69 | 31.166667 | 0.581395 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6fdd47e78f89d37feeb24f469a0543a6ee9f590a | 81 | py | Python | code/answer_4-2-37.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | 1 | 2022-03-29T13:50:12.000Z | 2022-03-29T13:50:12.000Z | code/answer_4-2-37.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | code/answer_4-2-37.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | S = input()
print(S[::-1].replace("6", "x").replace("9", "6").replace("x", "9"))
| 27 | 68 | 0.493827 | 14 | 81 | 2.857143 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067568 | 0.08642 | 81 | 2 | 69 | 40.5 | 0.472973 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
6fea3bf47a39cd0d19d90ae0f406360a9a2cbe4e | 162 | py | Python | test/test_roadrunner.py | stanleygu/tellurium | bfa6898eb4b632b31c4d12c0b0c78ce704a1d898 | [
"Apache-2.0"
] | null | null | null | test/test_roadrunner.py | stanleygu/tellurium | bfa6898eb4b632b31c4d12c0b0c78ce704a1d898 | [
"Apache-2.0"
] | null | null | null | test/test_roadrunner.py | stanleygu/tellurium | bfa6898eb4b632b31c4d12c0b0c78ce704a1d898 | [
"Apache-2.0"
] | null | null | null | import roadrunner
import roadrunner.testing
rr = roadrunner.RoadRunner(roadrunner.testing.getData('feedback.xml'))
result = rr.simulate()
roadrunner.plot(result)
| 27 | 70 | 0.814815 | 19 | 162 | 6.947368 | 0.526316 | 0.242424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067901 | 162 | 5 | 71 | 32.4 | 0.874172 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
b5121a1ca5582d503c0d07016451cf6761eb41d3 | 269 | py | Python | random_fun/python_challange/c05.py | mmmonk/crap | 96ba81723f043503e7ed2f96ea727b524d22b83f | [
"MIT"
] | 14 | 2015-01-14T15:53:22.000Z | 2019-06-21T06:15:47.000Z | random_fun/python_challange/c05.py | mmmonk/crap | 96ba81723f043503e7ed2f96ea727b524d22b83f | [
"MIT"
] | 1 | 2018-04-01T08:40:17.000Z | 2020-06-24T10:05:33.000Z | random_fun/python_challange/c05.py | mmmonk/crap | 96ba81723f043503e7ed2f96ea727b524d22b83f | [
"MIT"
] | 12 | 2015-05-13T10:52:04.000Z | 2020-10-07T14:49:37.000Z | #!/usr/bin/env python
# http://www.pythonchallenge.com/pc/def/peak.html
import urllib
import pickle
import sys
f = urllib.URLopener().open("http://www.pythonchallenge.com/pc/def/banner.p")
a = pickle.load(f)
for b in a:
print "".join([c[0]*int(c[1]) for c in b])
| 19.214286 | 77 | 0.687732 | 49 | 269 | 3.77551 | 0.653061 | 0.075676 | 0.237838 | 0.27027 | 0.324324 | 0.324324 | 0 | 0 | 0 | 0 | 0 | 0.008439 | 0.118959 | 269 | 13 | 78 | 20.692308 | 0.772152 | 0.252788 | 0 | 0 | 0 | 0 | 0.231156 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.428571 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
b51221aa9b2ae9dbe7650525cc7ddc80faa8b9e7 | 104 | py | Python | sitemap_urls_auditor/sitemap/__init__.py | alena-kono/sitemap-urls-auditor | b9f1651c48fd8e4131eca8ee44122ffa54a4576e | [
"MIT"
] | null | null | null | sitemap_urls_auditor/sitemap/__init__.py | alena-kono/sitemap-urls-auditor | b9f1651c48fd8e4131eca8ee44122ffa54a4576e | [
"MIT"
] | null | null | null | sitemap_urls_auditor/sitemap/__init__.py | alena-kono/sitemap-urls-auditor | b9f1651c48fd8e4131eca8ee44122ffa54a4576e | [
"MIT"
] | null | null | null | """Sitemap package.
Package is responsible for getting and processing urls from website's sitemap.
"""
| 20.8 | 78 | 0.769231 | 14 | 104 | 5.714286 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144231 | 104 | 4 | 79 | 26 | 0.898876 | 0.923077 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
82eb341ad33a59915ff901fe669b2e8e08f584f8 | 218 | py | Python | Palantir/problem_3.py | HouariZegai/DailyCodingProblem | bf625ef461863b2b149949d01477b5ea054c5d9f | [
"MIT"
] | 54 | 2019-03-14T21:27:13.000Z | 2021-12-08T12:58:08.000Z | Palantir/problem_3.py | HouariZegai/DailyCodingProblem | bf625ef461863b2b149949d01477b5ea054c5d9f | [
"MIT"
] | null | null | null | Palantir/problem_3.py | HouariZegai/DailyCodingProblem | bf625ef461863b2b149949d01477b5ea054c5d9f | [
"MIT"
] | 8 | 2020-02-06T23:28:01.000Z | 2021-08-05T20:33:16.000Z | """ Asked by: Palantir [Easy].
Write a program that checks whether an integer is a palindrome. For example, 121 is a palindrome,
as well as 888. 678 is not a palindrome. Do not convert the integer into a string.
""" | 36.333333 | 98 | 0.733945 | 38 | 218 | 4.210526 | 0.710526 | 0.20625 | 0.1625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051136 | 0.192661 | 218 | 6 | 99 | 36.333333 | 0.857955 | 0.958716 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
d23177e3400dbfd375eaf3af86095ce23dce1c57 | 208 | py | Python | Chapter10/deque_rotation.py | kaushalkumarshah/Learn-Python-in-7-Days | 2663656767c8959ace836f0c0e272f3e501bbe6e | [
"MIT"
] | 12 | 2018-07-09T16:20:31.000Z | 2022-03-21T22:52:15.000Z | Chapter10/deque_rotation.py | kaushalkumarshah/Learn-Python-in-7-Days | 2663656767c8959ace836f0c0e272f3e501bbe6e | [
"MIT"
] | null | null | null | Chapter10/deque_rotation.py | kaushalkumarshah/Learn-Python-in-7-Days | 2663656767c8959ace836f0c0e272f3e501bbe6e | [
"MIT"
] | 19 | 2018-01-09T12:49:06.000Z | 2021-11-23T08:05:55.000Z | import collections
d = collections.deque(xrange(6))
print "Normal queue", d
d.rotate(2)
print "\nRight rotation :", d
d1 = collections.deque(xrange(6))
d1.rotate(-2)
print "\nleft rotation :", d1 | 18.909091 | 34 | 0.673077 | 30 | 208 | 4.666667 | 0.5 | 0.228571 | 0.314286 | 0.328571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040698 | 0.173077 | 208 | 11 | 35 | 18.909091 | 0.773256 | 0 | 0 | 0 | 0 | 0 | 0.236181 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.125 | null | null | 0.375 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
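The script above is Python 2 (`print` statements, `xrange`); for reference, a Python 3 equivalent of the same demonstration might look like:

```python
import collections

d = collections.deque(range(6))
print("Normal queue", d)

d.rotate(2)   # positive n rotates right: the last 2 items move to the front
print("Right rotation:", d)

d1 = collections.deque(range(6))
d1.rotate(-2)  # negative n rotates left: the first 2 items move to the back
print("Left rotation:", d1)
```

`deque.rotate(n)` is equivalent to popping from one end and appending to the other n times, and behaves identically in both Python versions.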
d24960f94b9f9717ec1c57f09484fd4c020e671f | 16,562 | py | Python | app/project/logics.py | Pechsopha/KITPoint | 076890838ca7f57b76f7c9a9a4101c9e90b13d8b | [
"MIT"
] | 1 | 2019-10-16T14:27:29.000Z | 2019-10-16T14:27:29.000Z | app/project/logics.py | Pechsopha/KITPoint | 076890838ca7f57b76f7c9a9a4101c9e90b13d8b | [
"MIT"
] | null | null | null | app/project/logics.py | Pechsopha/KITPoint | 076890838ca7f57b76f7c9a9a4101c9e90b13d8b | [
"MIT"
] | null | null | null | from app import db
from app.base.models import KitPointValue, AverageInternHour, Semester, Batch
from app.student.models import StudentSummary, Student
from app.task.models import Report, Task
from app.stakeholder.models import StakeholderTask, StakeholderSummary, Stakeholder
from datetime import datetime
def generate_planning_hour(project):
total_hour = 0
student_point = 0
adviser_point = 0
for i in project.summaries:
StudentSummary.query.filter_by(id=i.id).delete()
db.session.commit()
# Estimate student cost of the project #
for m in project.members:
res = db.session.query(Semester).join(AverageInternHour) \
.filter(Semester.start_date <= project.deadline, Semester.end_date >= project.start_date) \
.filter(m.batch_id == Semester.batch_id)\
.order_by(Semester.name)\
.all()
if len(res) == 0:
return False
count = 1
for i in res:
if len(res) == 1:
diff = (project.deadline - project.start_date).days/7
else:
if count == 1:
diff = (i.end_date - project.start_date).days/7
elif count < len(res):
diff = (i.end_date - i.start_date).days/7
else:
diff = (project.deadline - i.start_date).days/7
count += 1
if diff == 26:
diff -= 4
elif diff == 25:
diff -= 3
elif diff == 24:
diff -= 2
s_hour = diff*i.average_intern_hour.intern_hour
s_point = s_hour*float(m.rank.chargeTable.price)/100
total_hour += s_hour
student_point += s_point
summary = StudentSummary(m.department_id, m.batch_id, i.id, m.id,
project.id, s_hour, 0, 0, s_point, 0, 0)
summary.created_at = datetime.now()
db.session.add(summary)
# Estimate advisers cost of the project #
for i in project.stakeholder_summaries:
StakeholderSummary.query.filter_by(id=i.id).delete()
db.session.commit()
coordinator = Stakeholder.query.filter_by(id=project.coordinator).first()
project.advisers.append(coordinator)
for adviser in project.advisers:
weeks = (project.deadline - project.start_date).days / 7
days = (project.deadline - project.start_date).days - weeks*2
hours = days*8
total_hour += hours
point = hours*float(adviser.charge_rate)/100
adviser_point += point
stakeholder_summary = StakeholderSummary(
adviser.id, project.id, point, 0, 0, 0
)
stakeholder_summary.created_at = datetime.now()
db.session.add(stakeholder_summary)
contingency = student_point*float(project.contingency)/100
budget = float(project.budget)/100
total_point = student_point + contingency + budget + adviser_point
project.estimate_point = total_point
project.planning_hour = total_hour
def calculate_point(project, propose_point, acquire_point):
total_actual = 0
point_value = KitPointValue.query.all()
point_value = float(point_value[0].value)
actual_point = float(project.actual_point)
acquire_point = float(acquire_point)
propose_point = float(propose_point)
    # Only count approved reports; ".is_(True)" builds a SQL comparison,
    # whereas the Python "is" operator would always evaluate to False here.
    summaries = db.session.query(Report.project_id,
                                 Report.student_id,
                                 Report.batch_id,
                                 Report.semester_id,
                                 (db.func.sum(Report.session) * 5 / 6).label('session'))\
        .group_by(Report.project_id,
                  Report.student_id,
                  Report.batch_id,
                  Report.semester_id)\
        .filter(Report.project_id == project.id)\
        .filter(Report.is_approved.is_(True))\
        .all()
stakeholder_summaries = db.session.query(StakeholderTask.assign_to, db.func.sum(StakeholderTask.actual_hour)) \
.filter(StakeholderTask.project_id == project.id) \
.group_by(StakeholderTask.assign_to) \
.all()
if acquire_point == 0 and propose_point == 0:
for i in project.members:
student = db.session.query(Student).get(i.id)
price = float(student.rank.chargeTable.price)
            # Label the scaled expression itself, so the result column keeps the name.
            hours = db.session.query(Report.semester_id,
                                     (db.func.sum(Report.session) * 5 / 6).label('session'))\
                .group_by(Report.semester_id)\
                .filter(Report.project_id == project.id, i.id == Report.student_id)\
                .order_by(Report.semester_id).all()
summaries = StudentSummary.query.filter_by(project_id=project.id, student_id=i.id).all()
sem_ids = []
for l in summaries:
sem_ids.append(l.semester_id)
for j in hours:
for k in summaries:
if k.semester_id == j[0]:
k.actual_point = price*float(j[1])/point_value
total_actual += k.actual_point
if j[0] not in sem_ids:
summary = StudentSummary(
i.department_id, i.batch_id, int(j[0]), i.id, project.id, float(j[1]),
float(j[1]), float(j[1] * price / point_value), float(j[1] * price / point_value), 0, 0)
db.session.add(summary)
total_actual += float(j[1] * price / point_value)
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
project.actual_point = total_actual
if propose_point > 0:
generate_propose_point(project, summaries, stakeholder_summaries, propose_point, actual_point, point_value)
if acquire_point > 0:
generate_acquire_point(project, summaries, stakeholder_summaries, acquire_point, actual_point, point_value)
def generate_propose_point(project, summaries, stakeholder_summaries, propose_point, actual_point, point_value):
total_actual = 0
total_propose = 0
if propose_point == actual_point:
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.propose_point = price * float(j[4]) / point_value
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.propose_point = price*float(j[1]) / point_value
i.actual_point = price*float(j[1]) / point_value
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
elif propose_point > actual_point:
percentage = actual_point/propose_point
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.propose_point = i.actual_point/percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.propose_point = i.actual_point / percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
else:
percentage = propose_point / actual_point
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.propose_point = i.actual_point * percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.propose_point = i.actual_point * percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_propose += i.propose_point
project.actual_point = total_actual
project.propose_point = total_propose
def generate_acquire_point(project, summaries, stakeholder_summaries, acquire_point, actual_point, point_value):
total_actual = 0
total_acquire = 0
if acquire_point == actual_point:
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.acquire_point = price * float(j[4]) / point_value
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.acquire_point = i.actual_point
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
elif acquire_point > actual_point:
percentage = actual_point / acquire_point
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.acquire_point = i.actual_point / percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.acquire_point = i.actual_point / percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
else:
percentage = acquire_point / actual_point
for i in project.summaries:
price = float(i.student.rank.chargeTable.price)
for j in summaries:
if i.semester_id == j[3] and i.student_id == j[1]:
i.actual_point = price * float(j[4]) / point_value
i.acquire_point = i.actual_point * percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.acquire_point = i.actual_point * percentage
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
total_acquire += i.acquire_point
project.actual_point = total_actual
project.acquire_point = total_acquire
def calculate_point_old(project, propose_point, acquire_point):
point_value = KitPointValue.query.all()
point_value = float(point_value[0].value)
actual_point = float(project.actual_point)
acquire_point = float(acquire_point)
propose_point = float(propose_point)
total_actual = 0
summaries = db.session\
.query(Task.batch_id, Task.assign_to, Task.project_id,
Semester.id, db.func.sum(Task.actual_hour))\
.group_by(Semester.id, Task.batch_id, Task.assign_to,
Task.project_id)\
.filter(Task.project_id == project.id)\
.filter(Task.start_date >= Semester.start_date, Task.deadline <= Semester.end_date)\
.filter(Semester.batch_id == Task.batch_id)\
.all()
stakeholder_summaries = db.session.query(StakeholderTask.assign_to, db.func.sum(StakeholderTask.actual_hour)) \
.filter(StakeholderTask.project_id == project.id) \
.group_by(StakeholderTask.assign_to) \
.all()
if acquire_point == 0 and propose_point == 0:
for i in project.members:
price = float(i.rank.chargeTable.price)
hours = db.session.query(Semester.id, db.func.sum(Task.actual_hour).label('session'))\
.group_by(Semester.id)\
.filter(Task.start_date <= Semester.end_date, Task.deadline >= Semester.start_date)\
.filter(Task.batch_id == Semester.batch_id)\
.filter(Task.assign_to == i.id)\
.filter(project.id == Task.project_id)\
.order_by(Semester.name).all()
summaries = StudentSummary.query.filter_by(project_id=project.id, student_id=i.id).all()
sem_ids = []
for l in summaries:
sem_ids.append(l.semester_id)
for j in hours:
for k in summaries:
if k.semester_id == j[0]:
k.actual_point = price*float(j[1]/len(hours))/point_value
total_actual += k.actual_point
if j[0] not in sem_ids:
summary = StudentSummary(
i.department_id, i.batch_id, int(j[0]), i.id, project.id, float(j[1]),
float(j[1]), float(j[1]/len(hours)) * price / point_value,
float(j[1]/len(hours)) * price / point_value, 0, 0)
db.session.add(summary)
total_actual += float(j[1]/len(hours)) * price / point_value
for i in project.stakeholder_summaries:
price = float(i.stakeholder.charge_rate)
for j in stakeholder_summaries:
if i.stakeholder_id == j[0]:
i.actual_point = price*float(j[1]) / point_value
i.updated_at = datetime.now()
db.session.merge(i)
total_actual += i.actual_point
project.actual_point = total_actual
if propose_point > 0:
generate_propose_point(project, summaries, stakeholder_summaries, propose_point, actual_point, point_value)
if acquire_point > 0:
generate_acquire_point(project, summaries, stakeholder_summaries, acquire_point, actual_point, point_value)
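The three branches inside `generate_propose_point` and `generate_acquire_point` all apply the same idea: rescale each member's actual points by the ratio between the actual total and the target total, so relative shares are preserved. A standalone sketch of that core calculation (hypothetical helper, not part of this module):

```python
def scale_points(actual_points, target_total):
    """Scale per-member point values so they sum to target_total,
    keeping each member's relative share unchanged."""
    current_total = sum(actual_points)
    if current_total == 0:
        # Nothing to distribute proportionally; everyone gets zero.
        return [0.0 for _ in actual_points]
    factor = target_total / current_total
    return [p * factor for p in actual_points]
```

For instance, scaling `[2.0, 3.0]` to a target total of `10.0` yields `[4.0, 6.0]`, mirroring how the module divides or multiplies each `actual_point` by `percentage`.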
| 46.522472 | 115 | 0.576319 | 1,982 | 16,562 | 4.60999 | 0.063068 | 0.075845 | 0.048594 | 0.033271 | 0.791945 | 0.761191 | 0.727263 | 0.702966 | 0.679216 | 0.659954 | 0 | 0.010972 | 0.328644 | 16,562 | 355 | 116 | 46.653521 | 0.810774 | 0.004528 | 0 | 0.640244 | 0 | 0 | 0.001274 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015244 | false | 0 | 0.018293 | 0 | 0.036585 | 0.003049 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
9637f0ed3e638cd3afbe37dcddde1600334abfa1 | 388 | py | Python | models/__init__.py | jtbai/glo-7027 | 956be4b9c474bade4f28c2b2d7267ae55002f0a5 | [
"MIT"
] | null | null | null | models/__init__.py | jtbai/glo-7027 | 956be4b9c474bade4f28c2b2d7267ae55002f0a5 | [
"MIT"
] | null | null | null | models/__init__.py | jtbai/glo-7027 | 956be4b9c474bade4f28c2b2d7267ae55002f0a5 | [
"MIT"
] | null | null | null | from models.gradient_boosting_regression import GradientBoostingRegression
from models.random_forest_regression import RandomForestRegression
from models.simple_linear_regression import SimpleLinearRegression
from models.support_vector_machine_regression import SupportVectorMachineRegression
from models.support_vector_machine_regression_poly3 import SupportVectorMachineRegressionPoly3 | 77.6 | 95 | 0.935567 | 38 | 388 | 9.210526 | 0.5 | 0.142857 | 0.097143 | 0.131429 | 0.228571 | 0.228571 | 0 | 0 | 0 | 0 | 0 | 0.005435 | 0.051546 | 388 | 5 | 95 | 77.6 | 0.945652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
96409e1f58060901b7d5cf857deaf1a76eea2cac | 106 | py | Python | pastebin_crawler/settings/__init__.py | mikeler216/pastebin_crawler | 9a5712fc139cf4f02b8f81398b458f75fe0f8e86 | [
"Apache-2.0"
] | null | null | null | pastebin_crawler/settings/__init__.py | mikeler216/pastebin_crawler | 9a5712fc139cf4f02b8f81398b458f75fe0f8e86 | [
"Apache-2.0"
] | 3 | 2021-01-30T17:57:37.000Z | 2021-01-31T17:59:08.000Z | pastebin_crawler/settings/__init__.py | mikeler216/pastebin_crawler | 9a5712fc139cf4f02b8f81398b458f75fe0f8e86 | [
"Apache-2.0"
] | null | null | null | from apscheduler.schedulers.blocking import BlockingScheduler
background_scheduler = BlockingScheduler()
| 26.5 | 61 | 0.877358 | 9 | 106 | 10.222222 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 106 | 3 | 62 | 35.333333 | 0.938776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
964358059a90f5670d5af22e074f7f19a9eea993 | 102 | py | Python | pyintesishome/exceptions.py | rklomp/pyIntesisHome | 0c2fefaadebb9bdde87217a0b612d55b577c2b20 | [
"MIT"
] | null | null | null | pyintesishome/exceptions.py | rklomp/pyIntesisHome | 0c2fefaadebb9bdde87217a0b612d55b577c2b20 | [
"MIT"
] | null | null | null | pyintesishome/exceptions.py | rklomp/pyIntesisHome | 0c2fefaadebb9bdde87217a0b612d55b577c2b20 | [
"MIT"
] | null | null | null | class IHConnectionError(Exception):
    pass


class IHAuthenticationError(ConnectionError):
    pass
| 14.571429 | 45 | 0.784314 | 8 | 102 | 10 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156863 | 102 | 6 | 46 | 17 | 0.930233 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
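The two exception classes above differ in ancestry: `IHAuthenticationError` subclasses the builtin `ConnectionError`, while `IHConnectionError` subclasses plain `Exception`. A hedged sketch of how a caller might distinguish them (the `connect` stub is hypothetical, not part of pyIntesisHome's API):

```python
class IHConnectionError(Exception):
    pass


class IHAuthenticationError(ConnectionError):
    pass


def connect(host):
    # Hypothetical stand-in for the library's connect logic.
    if not host:
        raise IHConnectionError("no host configured")
    return True


try:
    connect("")
except IHAuthenticationError:
    handled = "auth"
except IHConnectionError:
    handled = "connection"
```

Because the two classes share no common IntesisHome base, callers must catch each type explicitly; a shared base class would be a natural refinement.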
96620bccfe48a659a0d91f70c5e9466f6902dc44 | 262 | py | Python | apps/account/admin.py | Couapy/mc-manager | 9ea4b6d5e42a3261b364b729991e10717e966c73 | [
"MIT"
] | null | null | null | apps/account/admin.py | Couapy/mc-manager | 9ea4b6d5e42a3261b364b729991e10717e966c73 | [
"MIT"
] | 9 | 2021-04-09T06:22:50.000Z | 2022-03-12T00:57:19.000Z | apps/account/admin.py | Couapy/mc-manager | 9ea4b6d5e42a3261b364b729991e10717e966c73 | [
"MIT"
] | null | null | null | from django.contrib import admin
from social_django.models import Association, Nonce, UserSocialAuth
from .models import Profile
admin.site.register(Profile)
admin.site.unregister(Association)
admin.site.unregister(Nonce)
admin.site.unregister(UserSocialAuth)
| 26.2 | 67 | 0.843511 | 33 | 262 | 6.666667 | 0.424242 | 0.163636 | 0.259091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076336 | 262 | 9 | 68 | 29.111111 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |