# ===== src/settings/models.py (TolgaKara/prodai, Apache-2.0) =====
from django.db import models
from django.contrib.auth.models import User
# Create your models here.

class Setting(models.Model):
    user_id = models.ForeignKey(User, on_delete=models.CASCADE)


class TimeTrackingSetting(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    # note: max_length on a TextField is enforced only in forms, not at the database level
    max_daily_work_time = models.TextField(max_length=50)
    timetracking_name = models.CharField(max_length=100)
    workingtime = models.IntegerField()
    short_break = models.IntegerField()
    long_break = models.IntegerField()
    cycle = models.IntegerField()


class ActivitiesSetting(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    block_activities = models.TextField(max_length=400)

# ===== ansible/terminal_plugins/aos.py (xwjiang2021/sonic-mgmt, Apache-2.0) =====
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import re
import json
from ansible.plugins.terminal import TerminalBase
from ansible.errors import AnsibleConnectionFailure
from ansible.module_utils._text import to_bytes, to_text

class TerminalModule(TerminalBase):

    # prompts that mark the end of a command's stdout
    terminal_stdout_re = [
        re.compile(br"[\r\n]?[\w+\-\.:\/\[\]]+(?:\([^\)]+\)){,3}(?:>|#) ?$"),
        re.compile(br"\[\w+\@[\w\-\.]+(?: [^\]])\] ?[>#\$] ?$")
    ]

    # patterns that mark a failed command
    terminal_stderr_re = [
        re.compile(br"% ?Error"),
        re.compile(br"% User not present"),
        re.compile(br"% ?Bad secret"),
        re.compile(br"invalid input", re.I),
        re.compile(br"(?:incomplete|ambiguous) command", re.I),
        re.compile(br"connection timed out", re.I),
        # Strings like this regarding VLANs are not errors
        re.compile(br"[^\r\n]+ not found(?! in current VLAN)", re.I),
        re.compile(br"'[^']' +returned error code: ?\d+"),
        re.compile(br"[^\r\n](?<! shell )\/bin\/(?:ba)?sh"),
        re.compile(br"% More than \d+ OSPF instance", re.I),
        re.compile(br"% Subnet [0-9a-f.:/]+ overlaps", re.I),
        re.compile(br"Maximum number of pending sessions has been reached"),
    ]

    def on_open_shell(self):
        pass

    def on_become(self, passwd=None):
        if self._get_prompt().endswith(b'#'):
            return

        cmd = {u'command': u'enable'}
        if passwd:
            cmd[u'prompt'] = to_text(r"[\r\n]?password: $", errors='surrogate_or_strict')
            cmd[u'answer'] = passwd
            cmd[u'prompt_retry_check'] = True

        try:
            self._exec_cli_command(to_bytes(json.dumps(cmd), errors='surrogate_or_strict'))
            prompt = self._get_prompt()
            if prompt is None or not prompt.endswith(b'#'):
                raise AnsibleConnectionFailure('failed to elevate privilege to enable mode, still at prompt [%s]' % prompt)
        except AnsibleConnectionFailure as e:
            prompt = self._get_prompt()
            raise AnsibleConnectionFailure('unable to elevate privilege to enable mode, at prompt [%s] with error: %s' % (prompt, e.message))

    def on_unbecome(self):
        prompt = self._get_prompt()
        if prompt is None:
            # if prompt is None, most likely the terminal is hung up at a prompt
            return

        if b'(config' in prompt:
            self._exec_cli_command(b'end')
            self._exec_cli_command(b'disable')
        elif prompt.endswith(b'#'):
            self._exec_cli_command(b'disable')
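The two pattern lists above drive Ansible's terminal handling: `terminal_stdout_re` detects when the device prompt has returned, and `terminal_stderr_re` flags failed commands. A minimal standalone check of two of those patterns; the sample prompts are invented, not taken from a real device:

```python
import re

# Patterns copied from the plugin above; the prompts below are made-up samples.
stdout_prompt = re.compile(br"[\r\n]?[\w+\-\.:\/\[\]]+(?:\([^\)]+\)){,3}(?:>|#) ?$")
stderr_error = re.compile(br"% ?Error")

assert stdout_prompt.search(b"switch1#")           # privileged-mode prompt
assert stdout_prompt.search(b"switch1(config)#")   # configuration-mode prompt
assert not stdout_prompt.search(b"loading...")     # mid-output: keep reading
assert stderr_error.search(b"% Error: invalid command")
print("prompt patterns matched as expected")
```

Because the patterns are compiled from byte strings, they are matched against the raw bytes the SSH channel returns, without any decoding step.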

# ===== setup.py (annieapp/annie, MIT) =====
"""
Annie Modified MIT License
Copyright (c) 2019-present year Reece Dunham and the Annie Team
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, and/or distribute
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. SELLING THE SOFTWARE IS ALSO NOT ALLOWED WITHOUT WRITTEN PERMISSION
FROM THE ANNIE TEAM.
"""

import setuptools

CLASSIFIERS = [
    "Programming Language :: Python",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3.5",
    "Programming Language :: Python :: 3.6",
    "Programming Language :: Python :: 3.7",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: Implementation :: PyPy",
    "Operating System :: Microsoft",
    "Operating System :: Microsoft :: Windows :: Windows 10",
    "Operating System :: Microsoft :: Windows :: Windows 8",
    "Operating System :: Microsoft :: Windows :: Windows 8.1",
    "Operating System :: Microsoft :: Windows :: Windows 7",
    "Operating System :: MacOS",
    "Operating System :: MacOS :: MacOS X",
    "Operating System :: POSIX :: Linux",
    "Operating System :: Unix",
    "Operating System :: Other OS",
    "Intended Audience :: Developers",
    "Intended Audience :: System Administrators",
    "Intended Audience :: Information Technology",
    "Intended Audience :: Science/Research",
    "Topic :: Software Development :: Libraries",
    "Topic :: Software Development :: Libraries :: Python Modules",
    "Topic :: Utilities",
    "Topic :: System",
    "Topic :: Terminals",
    "Topic :: Text Processing",
    "Topic :: Internet",
    "Topic :: Internet :: WWW/HTTP :: WSGI",
    "Topic :: Internet :: WWW/HTTP :: WSGI :: Server",
    "Topic :: System :: Monitoring",
    "Topic :: System :: Software Distribution",
    "Development Status :: 4 - Beta",
    "Framework :: IDLE",
    "Framework :: Flask",
    "Natural Language :: English",
    "Environment :: Web Environment"
]

URLs = {
    "Bug Tracker": "https://github.com/annieapp/annie/issues",
    "Documentation": "https://docs.annieapp.co",
    "Source Code": "https://github.com/annieapp/annie",
    "License": "https://github.com/annieapp/annie/blob/master/LICENSE",
}

setuptools.setup(
    name='annie-server',
    version='1.4.0',
    author="Annie Team",
    author_email="support@rdil.rocks",
    description="Annie Server",
    license="See https://github.com/annieapp/annie/blob/master/LICENSE",
    url="https://annieapp.co",
    packages=setuptools.find_packages(exclude=["docs", "frontend"]),
    include_package_data=True,
    zip_safe=False,
    install_requires=[
        "Flask>=1.1.1",
        "lcbools>=1.0.2"
    ],
    classifiers=CLASSIFIERS,
    project_urls=URLs,
    download_url="https://github.com/annieapp/annie/releases",
    keywords=["annie", "server", "analytics", "monitoring"],
    long_description="See https://annieapp.co",
    long_description_content_type="text/markdown"
)

# ===== store/migrations/0011_watch_slug.py (StilyanMinchev/OnlineStore, MIT) =====
# Generated by Django 3.1.3 on 2020-12-13 08:36
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('store', '0010_auto_20201212_2058'),
    ]

    operations = [
        migrations.AddField(
            model_name='watch',
            name='slug',
            field=models.SlugField(default=1, editable=False),
            preserve_default=False,
        ),
    ]

# ===== mytravelblog/accounts/tests/forms/tests_EditPasswordForm.py (yetoshimo/my-travel-blog, MIT) =====
from django import test as django_tests
from django.contrib.auth import get_user_model
from mytravelblog.accounts.forms import EditPasswordForm
UserModel = get_user_model()

class EditPasswordFormTests(django_tests.TestCase):
    def setUp(self):
        self.username = 'testuser'
        self.password1 = 'P@ssword1'
        self.user = UserModel.objects.create_user(
            username=self.username,
            password=self.password1,
        )

    def test_form_populates_with_form_control_attribute(self):
        data = {
            'old_password': self.password1,
            'new_password1': self.password1,
            'new_password2': self.password1,
        }
        change_password_form = EditPasswordForm(data=data, user=self.user)
        self.assertTrue(change_password_form.is_valid())
        self.assertEqual('form-control', change_password_form.fields['old_password'].widget.attrs['class'])

# ===== Python/eight_kyu/abbrev_name.py (Brokenshire/codewars-projects, Apache-2.0) =====
# Python solution for the 'Abbreviate a Two Word Name' codewars question.
# Level: 8 kyu
# Tags: FUNDAMENTALS, STRINGS, and ARRAYS.
# Author: Jack Brokenshire
# Date: 16/05/2020
import unittest

def abbrev_name(name):
    """
    Converts a two-word name into initials.

    :param name: a string.
    :return: two capital letters with a dot separating them.
    """
    name = name.split()
    return "{}.{}".format(name[0][0].upper(), name[1][0].upper())


class TestAbbrevName(unittest.TestCase):
    """Class to test the 'abbrev_name' function"""

    def test_abbrev_name(self):
        self.assertEqual(abbrev_name("Sam Harris"), "S.H")
        self.assertEqual(abbrev_name("Patrick Feenan"), "P.F")
        self.assertEqual(abbrev_name("Evan Cole"), "E.C")
        self.assertEqual(abbrev_name("P Favuzzi"), "P.F")
        self.assertEqual(abbrev_name("David Mendieta"), "D.M")


if __name__ == '__main__':
    unittest.main()
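The split-and-index approach above generalizes to any number of words. A small sketch of that variant (the `initials` name is mine, not part of the kata):

```python
def initials(name):
    """Generalize abbrev_name: one dotted uppercase initial per word."""
    return ".".join(word[0].upper() for word in name.split())

print(initials("Sam Harris"))             # S.H
print(initials("Johann Sebastian Bach"))  # J.S.B
```

Unlike `abbrev_name`, this version does not assume exactly two words, so it never raises `IndexError` on longer names.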

# ===== assets/python_scripts/cvScale.py (weiSupreme/weiSupreme.github.io, CC-BY-4.0) =====
import os
import cv2
import random
src_dir = "select2000/"
name_idx = 6000
dest_img_dir = "scale/image/"
dest_txt_dir = "scale/txt/"
img_list = os.listdir(src_dir)

def write_label(old_name, new_name, hratio_, wratio_):
    # Rescale the eight polygon coordinates of every ground-truth line:
    # even indices are x values (scaled by the width ratio), odd indices
    # are y values (scaled by the height ratio).
    old_obj = open(src_dir + old_name)
    new_obj = open(dest_txt_dir + new_name, 'w')
    old_txt = old_obj.read()
    for gt_line in old_txt.split('\n'):
        gt_ind = gt_line.split(',')
        cordiante_str = []
        if len(gt_ind) > 8:
            for i in range(0, 8):
                if i % 2 == 0:
                    cordiante_str.append(str(float(gt_ind[i]) * wratio_))
                else:
                    cordiante_str.append(str(float(gt_ind[i]) * hratio_))
            writeline_str = ','.join(cordiante_str) + ',' + gt_ind[8] + '\n'
            new_obj.write(writeline_str)
    old_obj.close()
    new_obj.close()


for img_name in img_list:
    if '.txt' in img_name:
        continue
    print(img_name)
    # '.jpg' -> '.txt'; the '.' stops rstrip from eating into the base name
    txt_name = img_name.rstrip('jpg') + 'txt'
    new_txt_name = str(name_idx).zfill(6) + '.txt'
    img = cv2.imread(src_dir + img_name)
    height, width, c = img.shape
    # keep the aspect ratio with probability 0.7, otherwise pick an
    # independent horizontal scale; both scales are drawn from [0.5, 0.9]
    prob = random.choice(range(0, 10))
    hratio = random.choice(range(5, 10)) / 10.
    if prob < 7:
        wratio = hratio
    else:
        wratio = random.choice(range(5, 10)) / 10.
    scale_img = cv2.resize(img, (int(width * wratio), int(height * hratio)), interpolation=cv2.INTER_AREA)
    write_label(txt_name, new_txt_name, hratio, wratio)
    cv2.imwrite(dest_img_dir + str(name_idx).zfill(6) + '.jpg', scale_img)
    name_idx += 1
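The coordinate arithmetic in `write_label` is easy to check in isolation: x values (even indices) scale with the width ratio and y values (odd indices) with the height ratio. A standalone sketch of that step (the `scale_polygon` helper is mine, not from the script):

```python
def scale_polygon(coords, wratio, hratio):
    """Scale alternating x,y coordinates the same way write_label does."""
    return [c * (wratio if i % 2 == 0 else hratio) for i, c in enumerate(coords)]

# a 20x20 box, shrunk to half the width and 0.8 of the height
print(scale_polygon([10, 20, 30, 20, 30, 40, 10, 40], 0.5, 0.8))
# [5.0, 16.0, 15.0, 16.0, 15.0, 32.0, 5.0, 32.0]
```

Because `cv2.resize` takes its target size as `(width, height)`, the same `(wratio, hratio)` pair used on the image must be applied to the labels in that order, which is what the script does.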

# ===== mod_spotted_extended_light/mod_spotted_extended_light308.py (chipsi007/spoter-mods, WTFPL) =====
# -*- coding: utf-8 -*-
import os
import re
import json
import codecs
import datetime
import threading
import urllib
import urllib2
import string
import random
import BigWorld
from constants import AUTH_REALM
from gui.Scaleform.daapi.view.lobby.hangar.Hangar import Hangar
from gui.battle_control import g_sessionProvider
from Avatar import PlayerAvatar
from constants import BATTLE_EVENT_TYPE
import SoundGroups
from gui.app_loader import g_appLoader

class Config(object):
    def __init__(self):
        self.enable = True
        self.debug = False
        self.ru = True if 'RU' in AUTH_REALM else False
        self.version = 'v3.08(18.11.2015)'
        self.author = 'by spoter'
        self.description = 'spotted_extended_light'
        self.description_ru = 'Мод: "Маленький Светлячок"'
        self.author_ru = 'автор: spoter'
        self.name = 'spotted_extended_light'
        self.description_analytics = 'Мод: "Маленький Светлячок"'
        self.tid = 'UA-57975916-7'
        self.sys_mes = {}
        self.setup = {'MODIFIER': {'MODIFIER_NONE': 0, 'MODIFIER_SHIFT': 1, 'MODIFIER_CTRL': 2, 'MODIFIER_ALT': 4}}
        self._thread_analytics = None
        self.analytics_started = False
        self.language = None
        self.xvm_installed = False
        self.xvm_check()
        self.res_mods = self.res_mods_init()
        self.data = {}
        self.default_config()
        new_config = self.load_json(self.name, self.data)
        self.data = new_config
        if 'Русский' in self.data['config'].get('language'):
            self.ru = True
        if self.ru:
            self.description = self.description_ru
            self.author = self.author_ru

    @staticmethod
    def res_mods_init():
        # climb three directory levels up from this file to the res_mods root
        wd = os.path.dirname(os.path.realpath(__file__))
        wd = wd[0:wd.rfind('\\')]
        wd = wd[0:wd.rfind('\\')]
        wd = wd[0:wd.rfind('\\')]
        return wd
    def xvm_check(self):
        try:
            import xvm_main
            self.xvm_installed = True
        except StandardError:
            pass

    def default_config(self):
        self.data = {
            'config': {
                'enable': True, 'debug': False, 'sound': True, 'icon_size': [70, 24], 'language': 'Русский'
            },
            'sound': {
                '%s' % BATTLE_EVENT_TYPE.SPOTTED: '/GUI/notifications_FX/enemy_sighted_for_team',
                '%s' % BATTLE_EVENT_TYPE.RADIO_HIT_ASSIST: '/GUI/notifications_FX/gun_intuition',
                '%s' % BATTLE_EVENT_TYPE.RADIO_KILL_ASSIST: '/GUI/notifications_FX/cybersport_auto_search',
                '%s' % BATTLE_EVENT_TYPE.TRACK_ASSIST: '/GUI/notifications_FX/gun_intuition'
            },
            'language': {
                'Русский': {
                    'messages': {
                        '%s' % BATTLE_EVENT_TYPE.SPOTTED: 'Засвечены {icons}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_HIT_ASSIST: 'Урон по засвету в {icons_names}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_KILL_ASSIST: 'Убили по засвету {full}',
                        '%s' % BATTLE_EVENT_TYPE.TRACK_ASSIST: 'Ассист {icons_vehicles} за сбитые траки'
                    }
                },
                'English': {
                    'messages': {
                        '%s' % BATTLE_EVENT_TYPE.SPOTTED: 'Spotted {icons}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_HIT_ASSIST: 'Radio hit assist to {icons_names}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_KILL_ASSIST: 'Radio kill assist to {full}',
                        '%s' % BATTLE_EVENT_TYPE.TRACK_ASSIST: 'Tracks assist {icons_vehicles}'
                    }
                },
                'Deutsch': {
                    'messages': {
                        '%s' % BATTLE_EVENT_TYPE.SPOTTED: 'Gefunden {icons}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_HIT_ASSIST: 'Schaden von Licht in {icons_names}',
                        '%s' % BATTLE_EVENT_TYPE.RADIO_KILL_ASSIST: 'Bei gluht getotet {full}',
                        '%s' % BATTLE_EVENT_TYPE.TRACK_ASSIST: 'Assist {icons_vehicles} fur schwer verletzte tracks'
                    }
                }
            }
        }

    def do_config(self):
        self.enable = self.data['config'].get('enable', False)
        self.debug = self.data['config'].get('debug', False)
        if self.data['config'].get('language') in self.data['language']:
            self.language = self.data['language'].get(self.data['config'].get('language'))
        else:
            self.data['config']['language'] = 'English'
            self.language = self.data['language'].get('English')

    def byte_ify(self, inputs):
        # recursively encode unicode strings in nested containers to utf-8 bytes
        if inputs:
            if isinstance(inputs, dict):
                return {self.byte_ify(key): self.byte_ify(value) for key, value in inputs.iteritems()}
            elif isinstance(inputs, list):
                return [self.byte_ify(element) for element in inputs]
            elif isinstance(inputs, unicode):
                return inputs.encode('utf-8')
            else:
                return inputs
        return inputs
    @staticmethod
    def json_comments(text):
        # strip '#' and '//' comments (full-line and inline) before json.loads
        regex = r'\s*(#|\/{2}).*$'
        regex_inline = r'(:?(?:\s)*([A-Za-z\d\.{}]*)|((?<=\").*\"),?)(?:\s)*(((#|(\/{2})).*)|)$'
        lines = text.split('\n')
        excluded = []
        for index, line in enumerate(lines):
            if re.search(regex, line):
                if re.search(r'^' + regex, line, re.IGNORECASE):
                    excluded.append(lines[index])
                elif re.search(regex_inline, line):
                    lines[index] = re.sub(regex_inline, r'\1', line)
        for line in excluded:
            lines.remove(line)
        return '\n'.join(lines)

    def load_json(self, name, config_old, save=False):
        config_new = config_old
        path = './res_mods/configs/spoter_mods/%s/' % self.name
        if not os.path.exists(path):
            os.makedirs(path)
        new_path = '%s%s.json' % (path, name)
        if save:
            with codecs.open(new_path, 'w', encoding='utf-8-sig') as json_file:
                data = json.dumps(config_old, sort_keys=True, indent=4, ensure_ascii=False, encoding='utf-8-sig', separators=(',', ': '))
                json_file.write('%s' % self.byte_ify(data))
        else:
            if os.path.isfile(new_path):
                try:
                    with codecs.open(new_path, 'r', encoding='utf-8-sig') as json_file:
                        data = self.json_comments(json_file.read().decode('utf-8-sig'))
                        config_new = self.byte_ify(json.loads(data))
                except Exception as e:
                    self.sys_mess()
                    print '%s%s' % (self.sys_mes['ERROR'], e)
            else:
                self.sys_mess()
                print '%s[%s, %s %s]' % (self.sys_mes['ERROR'], self.code_pa(self.description), self.version, self.sys_mes['MSG_RECREATE_CONFIG'])
                with codecs.open(new_path, 'w', encoding='utf-8-sig') as json_file:
                    data = json.dumps(config_old, sort_keys=True, indent=4, ensure_ascii=False, encoding='utf-8-sig', separators=(',', ': '))
                    json_file.write('%s' % self.byte_ify(data))
                print '%s[%s, %s %s]' % (self.sys_mes['INFO'], self.code_pa(self.description), self.version, self.sys_mes['MSG_RECREATE_CONFIG_DONE'])
        return config_new

    @staticmethod
    def code_pa(text):
        try:
            return text.encode('windows-1251')
        except StandardError:
            return text

    def debugs(self, text):
        if self.debug:
            try:
                text = text.encode('windows-1251')
            except StandardError:
                pass
            print '%s%s [%s]: %s' % (datetime.datetime.now(), self.sys_mes['DEBUG'], self.code_pa(self.description), text)
    def analytics_do(self):
        if not self.analytics_started:
            player = BigWorld.player()
            param = urllib.urlencode({
                'v': 1,  # Version.
                'tid': '%s' % self.tid,  # Tracking ID / Property ID.
                'cid': player.databaseID,  # Anonymous Client ID.
                't': 'screenview',  # Screenview hit type.
                'an': '%s' % self.description_analytics,  # App name.
                'av': '%s %s' % (self.description_analytics, self.version),  # App version.
                'cd': 'start [%s]' % AUTH_REALM  # Screen name / content description.
            })
            self.debugs('http://www.google-analytics.com/collect?%s' % param)
            urllib2.urlopen(url='http://www.google-analytics.com/collect?', data=param).read()
            self.analytics_started = True

    def analytics(self):
        self._thread_analytics = threading.Thread(target=self.analytics_do, name='Thread')
        self._thread_analytics.start()

    def sys_mess(self):
        self.sys_mes = {
            'DEBUG': '[DEBUG]',
            'LOAD_MOD': self.code_pa('[ЗАГРУЗКА]: ') if self.ru else '[LOAD_MOD]: ',
            'INFO': self.code_pa('[ИНФО]: ') if self.ru else '[INFO]: ',
            'ERROR': self.code_pa('[ОШИБКА]: ') if self.ru else '[ERROR]: ',
            'MSG_RECREATE_CONFIG': self.code_pa('конфиг не найден, создаем заново') if self.ru else 'Config not found, recreating',
            'MSG_RECREATE_CONFIG_DONE': self.code_pa('конфиг создан УСПЕШНО') if self.ru else 'Config recreating DONE',
            'MSG_INIT': self.code_pa('применение настроек...') if self.ru else 'initialized ...',
            'MSG_LANGUAGE_SET': self.code_pa('Выбран язык:') if self.ru else 'Language set to:',
            'MSG_DISABLED': self.code_pa('отключен ...') if self.ru else 'disabled ...'
        }

    def load_mod(self):
        self.do_config()
        self.sys_mess()
        print ''
        print '%s[%s, %s]' % (self.sys_mes['LOAD_MOD'], self.code_pa(self.description), self.code_pa(self.author))
        if self.enable:
            self.debugs('Debug Activated ...')
            print '%s[%s %s %s...]' % (self.sys_mes['INFO'], self.code_pa(self.description), self.sys_mes['MSG_LANGUAGE_SET'], self.code_pa(self.data['config'].get('language')))
            print '%s[%s, %s %s]' % (self.sys_mes['INFO'], self.code_pa(self.description), self.version, self.sys_mes['MSG_INIT'])
        else:
            print '%s[%s, %s %s]' % (self.sys_mes['INFO'], self.code_pa(self.description), self.version, self.sys_mes['MSG_DISABLED'])
        print ''
class Assist(object):
def __init__(self):
self.format_str = {'icons': '', 'names': '', 'vehicles': '', 'icons_names': '', 'icons_vehicles': '', 'full': ''}
@staticmethod
def check_macros(macros):
if macros in config.language['messages']['0']: return True
if macros in config.language['messages']['1']: return True
if macros in config.language['messages']['2']: return True
if macros in config.language['messages']['3']: return True
def format_recreate(self):
self.format_str = {'icons': '', 'names': '', 'vehicles': '', 'icons_names': '', 'icons_vehicles': '', 'full': ''}
    @staticmethod
    def sound(assist_type):
        if '%s' % assist_type in config.data['sound']:
            if config.data['sound'][assist_type] != '':
                sound = SoundGroups.g_instance.getSound2D(config.data['sound'][assist_type])
                if sound:
                    sound.play()
    def post_message(self, assist_type, vehicles_ids):
        if assist_type in config.language['messages']:
            self.format_recreate()
            for i in vehicles_ids:
                # vehicle ids arrive as packed 64-bit values: prefer the high 32-bit word
                if i >> 32 & 4294967295L > 0:
                    i = i >> 32 & 4294967295L
                else:
                    i &= 4294967295L
                icon = '<img src="img://%s" width="%s" height="%s" />' % (
                    g_sessionProvider.getArenaDP().getVehicleInfo(i).vehicleType.iconPath.replace('..', 'gui'),
                    config.data['config'].get('icon_size')[0], config.data['config'].get('icon_size')[1])
                target_info = g_sessionProvider.getCtx().getFullPlayerNameWithParts(vID=i)
                if self.check_macros('{icons}'): self.format_str['icons'] += icon
                if self.check_macros('{names}'): self.format_str['names'] += '[%s]' % target_info[1] if target_info[1] else icon
                if self.check_macros('{vehicles}'): self.format_str['vehicles'] += '[%s]' % target_info[4] if target_info[4] else icon
                if self.check_macros('{icons_names}'): self.format_str['icons_names'] += '%s[%s]' % (icon, target_info[1]) if target_info[1] else icon
                if self.check_macros('{icons_vehicles}'): self.format_str['icons_vehicles'] += '%s[%s]' % (icon, target_info[4]) if target_info[4] else icon
                if self.check_macros('{full}'):
                    full = g_sessionProvider.getCtx().getFullPlayerName(vID=i)
                    self.format_str['full'] += '%s[%s]' % (icon, full) if full else icon
            msg = config.language['messages'][assist_type].format(**self.format_str)
            g_appLoader.getDefBattleApp().call('battle.PlayerMessagesPanel.ShowMessage',
                                               [msg + random.choice(string.ascii_letters), '%s' % msg.decode('utf-8-sig'), 'gold'])
# hook wrapper functions:
def hook_update_all(self):
    hooked_update_all(self)
    config.analytics()

def hook_on_battle_event(self, event_type, details):
    if config.enable:
        assist_type = '%s' % event_type
        if config.data['config'].get('sound'):
            assist.sound(assist_type)
        assist.post_message(assist_type, details)
    return hooked_on_battle_event(self, event_type, details)

# keep references to the original (hooked) methods
# noinspection PyProtectedMember
hooked_update_all = Hangar._Hangar__updateAll
hooked_on_battle_event = PlayerAvatar.onBattleEvent

# install hooks
Hangar._Hangar__updateAll = hook_update_all
PlayerAvatar.onBattleEvent = hook_on_battle_event
#start mod
assist = Assist()
config = Config()
config.load_mod()
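The `post_message` loop above splits each packed 64-bit vehicle id into its high and low 32-bit words. A standalone Python 3 sketch of that bit arithmetic (the composite values below are made-up examples, not real game ids):

```python
MASK32 = 0xFFFFFFFF  # same mask as the 4294967295L literal above


def unpack_vehicle_id(composite):
    """Return the high 32-bit word if it is non-zero, else the low word."""
    high = (composite >> 32) & MASK32
    return high if high > 0 else composite & MASK32


# high word 7, low word 123 -> the high word wins
assert unpack_vehicle_id((7 << 32) | 123) == 7
# no high word -> fall back to the low word
assert unpack_vehicle_id(123) == 123
```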
# ---- File: adjutantclient/osc/v1/tokens.py | repo: openstack/python-adjutantclient | license: Apache-2.0 ----
# Copyright (c) 2016 Catalyst IT Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
import logging
from osc_lib.command import command
from osc_lib.i18n import _
from adjutantclient import client as adjutant_client
LOG = logging.getLogger(__name__)
def _list_tokens(client, filters={}):
    tokens = client.tokens.list(filters=filters)
    headers = ['Token', 'Task', 'Expires on', 'Created on']
    rows = [[token.token, token.task, token.expires,
             token.created_on] for token in tokens]
    return headers, rows
class TokenList(command.Lister):
    """Lists adjutant tokens."""

    def get_parser(self, prog_name):
        parser = super(TokenList, self).get_parser(prog_name)
        parser.add_argument(
            '--filters', metavar='<filters>',
            required=False,
            help=_('JSON containing filters for tokens.'),
            default={})
        return parser

    def take_action(self, parsed_args):
        client = self.app.client_manager.admin_logic
        return _list_tokens(client, parsed_args.filters)
class TokenShow(command.ShowOne):
    """Show details of one token."""

    def get_parser(self, prog_name):
        parser = super(TokenShow, self).get_parser(prog_name)
        parser.add_argument(
            'token', metavar='<token_id>',
            help=_("The token."))
        parser.add_argument(
            '--bypass-url', metavar='<bypass-url>', default=None,
            help=_('Bypass URL for unauthenticated access to the endpoint.'))
        return parser

    def take_action(self, parsed_args):
        if not parsed_args.bypass_url:
            self.app.client_manager._auth_required = True
            self.app.client_manager.setup_auth()
            client = self.app.client_manager.admin_logic
        else:
            client = adjutant_client.Client("1", parsed_args.bypass_url)
        token = client.tokens.get(parsed_args.token)
        return zip(*(token.to_dict()).items())
class TokenSubmit(command.Command):
    """Submit token data."""

    def get_parser(self, prog_name):
        parser = super(TokenSubmit, self).get_parser(prog_name)
        parser.add_argument(
            'token', metavar='<token_id>', help=_('The token.'))
        parser.add_argument(
            'data', metavar='<token_data>',
            help=_('Submission data for the token. Must be valid json.'))
        return parser

    def take_action(self, parsed_args):
        client = self.app.client_manager.admin_logic
        resp = client.tokens.submit(
            parsed_args.token, json.loads(parsed_args.data))
        print('Success', ' '.join(resp.notes))
class TokenClear(command.Lister):
    """Clear expired tokens, admin only."""

    def take_action(self, parsed_args):
        client = self.app.client_manager.admin_logic
        resp = client.tokens.clear_expired()
        print('Success. ' + ' '.join(resp.json()['notes']))
        return _list_tokens(client)
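`TokenShow.take_action` returns `zip(*(token.to_dict()).items())`, which transposes the token dict into the pair of parallel tuples (column names, values) that osc-lib's `ShowOne` expects. A minimal sketch of that idiom with a made-up token dict:

```python
token = {'token': 'abc123', 'task': 42, 'expires': '2020-01-01'}

# zip(* ... .items()) turns [(k1, v1), (k2, v2), ...] into
# (k1, k2, ...) and (v1, v2, ...), i.e. columns and data rows.
columns, values = zip(*token.items())

assert set(columns) == {'token', 'task', 'expires'}
assert dict(zip(columns, values)) == token
```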
# ---- File: src/django/addr/addr/settings.py | repo: deshk04/addressparser | license: MIT ----
"""
Django settings for addr project.
Generated by 'django-admin startproject' using Django 1.11.29.
For more information on this file, see
https://docs.djangoproject.com/en/1.11/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.11/ref/settings/
"""
import os
from datetime import timedelta
from dotenv import load_dotenv
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.11/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
# SECRET_KEY = 'txynano6xva_dfzjhiopg5wak5bqk^eecw^3+0xv)09j23k%++'
load_dotenv('/addr/.env')
SECRET_KEY = os.getenv("DJANGO_SECRET_KEY")
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
PROJ_ENV = os.getenv("PROJ_ENV")
ALLOWED_HOSTS = ['*']
PROJECT_ROOT_PATH = '/addr/src/'
ADDR_TEMPLATE_FOLDER = PROJECT_ROOT_PATH + '/static/html'
ADDR_STATIC_ROOT = PROJECT_ROOT_PATH + '/static/'
ADDR_STATIC_FOLDER = PROJECT_ROOT_PATH + '/static/html'
DJANGO_LOGFILE = '/addr/ops/logs/uwsgi/django.log'
# Application definition
INSTALLED_APPS = []
if PROJ_ENV == 'dev':
    INSTALLED_APPS = [
        'django.contrib.admin',
        'django.contrib.auth',
        'django.contrib.contenttypes',
        'django.contrib.sessions',
        'django.contrib.messages',
        'django.contrib.staticfiles',
        'adr',
        'rest_framework',
        'rest_framework.authtoken',
        'corsheaders'
    ]
else:
    INSTALLED_APPS = [
        'django.contrib.admin',
        'django.contrib.auth',
        'django.contrib.contenttypes',
        'django.contrib.sessions',
        'django.contrib.messages',
        'django.contrib.staticfiles',
        'adr',
        'rest_framework',
        'rest_framework.authtoken',
    ]
REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': [
        'rest_framework_simplejwt.authentication.JWTAuthentication',
    ],
    # 'DEFAULT_AUTHENTICATION_CLASSES': (
    #     'rest_framework.authentication.TokenAuthentication',
    #     'rest_framework.authentication.SessionAuthentication',
    # ),
}
SIMPLE_JWT = {
    'ACCESS_TOKEN_LIFETIME': timedelta(minutes=240),
    'REFRESH_TOKEN_LIFETIME': timedelta(days=6),
    'SIGNING_KEY': 's1d'
}
MIDDLEWARE = []
if PROJ_ENV == 'dev':
    MIDDLEWARE = [
        'corsheaders.middleware.CorsMiddleware',
        'django.middleware.security.SecurityMiddleware',
        'django.contrib.sessions.middleware.SessionMiddleware',
        'django.middleware.common.CommonMiddleware',
        'django.middleware.csrf.CsrfViewMiddleware',
        'django.contrib.auth.middleware.AuthenticationMiddleware',
        'django.contrib.messages.middleware.MessageMiddleware',
        'django.middleware.clickjacking.XFrameOptionsMiddleware',
    ]
else:
    MIDDLEWARE = [
        'django.middleware.security.SecurityMiddleware',
        'django.contrib.sessions.middleware.SessionMiddleware',
        'django.middleware.common.CommonMiddleware',
        'django.middleware.csrf.CsrfViewMiddleware',
        'django.contrib.auth.middleware.AuthenticationMiddleware',
        'django.contrib.messages.middleware.MessageMiddleware',
        'django.middleware.clickjacking.XFrameOptionsMiddleware',
    ]
CORS_ORIGIN_ALLOW_ALL = True
ROOT_URLCONF = 'addr.urls'
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [ADDR_TEMPLATE_FOLDER],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]
WSGI_APPLICATION = 'addr.wsgi.application'
# Database
# https://docs.djangoproject.com/en/1.11/ref/settings/#databases
DATABASES = {
    'default': {
        'ENGINE': os.environ.get('DATABASE_ENGINE', 'django.db.backends.postgresql_psycopg2'),
        'NAME': os.environ.get('POSTGRES_DB', 'addrmain'),
        'USER': os.environ.get('POSTGRES_USER', 'addruser'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', ''),
        'HOST': os.environ.get('POSTGRES_SVCNM', os.environ.get('POSTGRES_HOST', 'addr_database')),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    },
    # 'logs': {
    #     'ENGINE': 'django.db.backends.postgresql_psycopg2',
    #     'NAME': 'logs',
    #     'USER': os.environ.get('PGUSER', 'dartuser'),
    #     'PASSWORD': os.environ.get('PGPASSWORD', ''),
    #     'HOST': os.environ.get('PGHOST', 'localhost'),
    #     'PORT': os.environ.get('PGPORT', '5432'),
    # },
}
# Password validation
# https://docs.djangoproject.com/en/1.11/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]
# Internationalization
# https://docs.djangoproject.com/en/1.11/topics/i18n/
LANGUAGE_CODE = 'en-us'
# TIME_ZONE = 'UTC'
TIME_ZONE = 'Australia/Melbourne'
USE_I18N = True
USE_L10N = True
# USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.11/howto/static-files/
STATIC_URL = '/static/'
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, "static"),
    ADDR_STATIC_ROOT,
]
DATA_UPLOAD_MAX_MEMORY_SIZE = 5242880
# 2.5MB - 2621440
# 5MB   - 5242880
# 10MB  - 10485760
# 20MB  - 20971520
# 50MB  - 52428800
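The byte counts in the comments above are just megabytes times 1024 squared, which is how Django's `DATA_UPLOAD_MAX_MEMORY_SIZE` is expressed. A quick check of the arithmetic:

```python
def mb(n):
    # n megabytes expressed in bytes, as Django size settings expect
    return int(n * 1024 * 1024)


assert mb(5) == 5242880   # the DATA_UPLOAD_MAX_MEMORY_SIZE value above
assert mb(2.5) == 2621440
assert mb(10) == 10485760
assert mb(20) == 20971520
assert mb(50) == 52428800
```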
# ---- File: xxhh/rank.py | repo: jannson/Similar | license: MIT ----
import sys, os, os.path, re
import codecs
import numpy as np
from scipy.sparse import *
from scipy import *
from sklearn.externals import joblib
import networkx as nx
django_path = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(13, django_path)
os.environ['DJANGO_SETTINGS_MODULE'] = 'xxhh.settings'
from django.db.models import Count
from django.db.models import Q
#from xxhh.models import TestLog as XhLogUd
from xxhh.models import XhLogUd
RATE = {}
RATE['u'] = 1
RATE['d'] = -1
def input_test():
    all_objs = []
    with codecs.open('trainsmall.txt', 'r', 'utf-8') as f:
        for line in f:
            items = line.split()
            if len(items) > 2:
                xh = XhLogUd()
                xh.guid = items[0]
                xh.post_id = int(items[1])
                score = int(items[2])
                if score == 1:
                    xh.uaction = 'd'
                else:
                    xh.uaction = 'u'
                xh.pos = 'z'
                xh.shiduan = 9
                xh.ctime = 9
                #xh.save()
                all_objs.append(xh)
    return all_objs
class Ratings(object):
    def __init__(self):
        #all_objs = list(XhLogUd.objects.exclude(guid=''))
        all_objs = input_test()
        guid2id = {}
        id2guid = []
        i = 0
        for obj in all_objs:
            if obj.guid not in guid2id:
                guid = obj.guid
                guid2id[guid] = i
                id2guid.append(guid)
                i += 1
        self.guid2id = guid2id
        self.id2guid = id2guid
        max_item = 0
        post2id = {}
        id2post = []
        i = 0
        for obj in all_objs:
            if obj.post_id not in post2id:
                post_id = obj.post_id
                post2id[post_id] = i
                id2post.append(post_id)
                i += 1
        self.post2id = post2id
        self.id2post = id2post
        guids = [[] for _ in id2guid]
        for obj in all_objs:
            guids[guid2id[obj.guid]].append((RATE[obj.uaction.strip()], self.guid2id[obj.guid], self.post2id[obj.post_id]))
        self.guids = guids
        posts1 = [set() for _ in id2post]
        posts2 = [set() for _ in id2post]
        for obj in all_objs:
            if obj.uaction == 'u':
                posts1[post2id[obj.post_id]].add(guid2id[obj.guid])
            else:
                posts2[post2id[obj.post_id]].add(guid2id[obj.guid])
        '''
        post_max = 0
        for guid_set in self.posts:
            if post_max < len(guid_set):
                post_max = len(guid_set)
        print "max guids in post", post_max
        '''
        print "all users", len(self.id2guid)
        print "all items", len(self.id2post)
        #sorting users
        #sorted_guids = [(g, len(self.guids[self.guid2id[g]])) for i, g in enumerate(self.id2guid)]
        #self.sorted_guids = [g for g, c in sorted(sorted_guids, key=lambda item: -item[1]) if c >= 3]
        #print self.sorted_guids[0:10]
        #print 'len and max rated guid', len(self.sorted_guids), self.sorted_guids[0], len(self.guids[self.guid2id[self.sorted_guids[0]]])
        weights1 = np.zeros((len(id2post), len(id2post)), dtype=float)
        weights2 = np.zeros((len(id2post), len(id2post)), dtype=float)
        #weights = lil_matrix((len(id2post), len(id2post)), dtype=float)
        for x in xrange(len(id2post)):
            for y in xrange(len(id2post)):
                if x < y:
                    weights1[x, y] = len(posts1[x] & posts1[y])
                    weights2[x, y] = len(posts2[x] & posts2[y])
                else:
                    weights1[x, y] = weights1[y, x]
                    weights2[x, y] = weights2[y, x]
        #weights1 = weights1/len(self.id2post)
        #weights2 = weights2/len(self.id2post)
        nx_graph = nx.from_numpy_matrix(weights1)
        scores1 = nx.pagerank_numpy(nx_graph)
        print 'score1 complete'
        nx_graph = nx.from_numpy_matrix(weights2)
        scores2 = nx.pagerank_numpy(nx_graph)
        print 'score2 complete'
        scores = [scores1[i]/scores2[i] for i in xrange(len(id2post))]
        #nx_graph = nx.from_scipy_sparse_matrix(self.weights)
        #scores = nx.pagerank_scipy(nx_graph)
        res = sorted([(scores[i], id2post[i]) for i in xrange(len(id2post))], reverse=True)
        res1 = sorted([(scores1[i], id2post[i]) for i in xrange(len(id2post))], reverse=True)
        res2 = sorted([(scores2[i], id2post[i]) for i in xrange(len(id2post))], reverse=True)
        with open('rank.out', 'w') as f:
            for r in res:
                f.write('%d %f\n' % (r[1], r[0]))
        with open('rank_1.out', 'w') as f:
            for r in res1:
                f.write('%d %f\n' % (r[1], r[0]))
        with open('rank_2.out', 'w') as f:
            for r in res2:
                f.write('%d %f\n' % (r[1], r[0]))
    def by_user(self, id):
        return self.guids[id]

    def all_users(self):
        # note: self.sorted_guids is only populated when the commented-out
        # user-sorting block in __init__ is enabled
        for guid in self.sorted_guids:
            yield self.guid2id[guid]


ratings = Ratings()
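The nested loop that fills `weights1` and `weights2` above is just a symmetric matrix of shared-voter counts between posts. A dependency-free sketch of that construction with tiny made-up voter sets:

```python
def co_rating_weights(posts):
    """Symmetric matrix of shared-voter counts between posts.

    Mirrors the intersection-counting loop in Ratings.__init__ above,
    using plain lists instead of numpy arrays.
    """
    n = len(posts)
    weights = [[0.0] * n for _ in range(n)]
    for x in range(n):
        for y in range(x + 1, n):
            weights[x][y] = weights[y][x] = float(len(posts[x] & posts[y]))
    return weights


posts = [{0, 1, 2}, {1, 2}, {3}]
w = co_rating_weights(posts)
assert w[0][1] == 2.0   # two users voted on both post 0 and post 1
assert w[0][2] == 0.0   # no shared voters
assert w[1][0] == w[0][1]  # symmetric
```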
# ---- File: temp/pattern.py | repo: BlueBeret/ComputerGraphics-Assignment | license: MIT ----
from tkinter import *
root = Tk()
canvas = Canvas(root,
                width=1000,
                height=1000,
                background="#FFFFFF")
canvas.pack(expand=YES, fill=BOTH)
def dot(x, y, size=1, color="#FF0000", outline=""):
    if size == 1:
        canvas.create_line(x, y, x + 1, y, fill=color, width=size)
        return
    canvas.create_oval(x - size/2, y - size/2, x + size/2, y + size/2, fill=color, width=0, outline=outline)
def onClick(event):
    global BERZIER_COORD
    if len(BERZIER_COORD) < 8:
        BERZIER_COORD.append(event.x)
        BERZIER_COORD.append(event.y)
    if len(BERZIER_COORD) == 8:
        kurvaBezier(BERZIER_COORD[0], BERZIER_COORD[1], BERZIER_COORD[2],
                    BERZIER_COORD[3], BERZIER_COORD[4], BERZIER_COORD[5], BERZIER_COORD[6], BERZIER_COORD[7])
        BERZIER_COORD = []
def kurvaBezier(x1, y1, x2, y2, x3, y3, x4, y4, warna="#FF0000"):
    panjang = max(abs(x1-x2), abs(y1-y2)) + max(abs(x2-x3), abs(y2-y3)) + max(abs(x3-x4), abs(y3-y4))
    u = 0
    for i in range(panjang):
        x = x1 * ((1-u) ** 3) + 3*x2*u*((1-u)**2) + 3*x3*(u**2)*(1-u) + x4*(u**3)
        y = y1 * ((1-u) ** 3) + 3*y2*u*((1-u)**2) + 3*y3*(u**2)*(1-u) + y4*(u**3)
        u += 1/panjang
        dot(round(x), round(y), color=warna)
BERZIER_COORD = []
canvas.bind("<Button-1>", onClick)
root.mainloop()
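`kurvaBezier` above samples the cubic Bernstein form along the curve. The point evaluation on its own, separated from the tkinter drawing code (one coordinate at a time; control values are arbitrary examples):

```python
def bezier_point(u, p0, p1, p2, p3):
    """Cubic Bezier value for parameter u in [0, 1], as in kurvaBezier above."""
    return (p0 * (1 - u) ** 3
            + 3 * p1 * u * (1 - u) ** 2
            + 3 * p2 * u ** 2 * (1 - u)
            + p3 * u ** 3)


# The curve starts at the first control point and ends at the last one.
assert bezier_point(0.0, 10, 20, 30, 40) == 10
assert bezier_point(1.0, 10, 20, 30, 40) == 40
```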
# ---- File: gadann/updater.py | repo: dpineo/gadann | license: MIT ----
#
# GADANN - GPU Accelerated Deep Artificial Neural Network
#
# Copyright (C) 2014 Daniel Pineo (daniel@pineo.net)
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
import logging
from . import kernels
logger = logging.getLogger(__name__)
class Updater(object):
    def __init__(self):
        pass

    def update(self, params, grads):
        # Concrete updaters implement the parameter update rule.
        raise NotImplementedError
class SgdUpdater(Updater):
    def __init__(self, learning_rate=0.1, weight_cost=0.01):
        self.weight_cost = weight_cost
        self.learning_rate = learning_rate

    def update(self, params, grads):
        for k, v in grads.items():
            params[k] = params[k] - self.learning_rate*v - params[k]*self.weight_cost

    def status(self):
        return ''
class MomentumUpdater(Updater):
    def __init__(self, learning_rate=0.1, inertia=0.9, weight_cost=0.00):
        self.inertia = inertia
        self.weight_cost = weight_cost
        self.learning_rate = learning_rate

    def update(self, params, grads):
        try:
            self.velocities = [self.inertia*v + (1-self.inertia)*g for (v, g) in zip(self.velocities, grads)]
        except AttributeError:
            # first call: no velocities accumulated yet
            self.velocities = [(1-self.inertia)*g for g in grads]
        for i in range(len(params)):
            params[i] = params[i] - self.learning_rate*self.velocities[i] - params[i]*self.weight_cost
        self.inertia += .001*(1-self.inertia)

    def status(self):
        return 'inertia:' + str(self.inertia)
class RmspropUpdater(Updater):
    def __init__(self, learning_rate=0.1, inertia=0.0, weight_cost=0.00):
        self.epsilon = 0.000001
        self.inertia = inertia
        self.weight_cost = weight_cost
        self.learning_rate = learning_rate

    def update(self, params, grads):
        try:
            self.accum = [self.inertia*a + (1-self.inertia)*(g**2) for (a, g) in zip(self.accum, grads)]
        except AttributeError:
            # first call: no running average of squared gradients yet
            self.accum = [(1-self.inertia)*(g**2) for g in grads]
        for i in range(len(params)):
            params[i] = params[i] - self.learning_rate * grads[i] / (kernels.sqrt(self.accum[i]) + self.epsilon) - params[i]*self.weight_cost
        self.inertia += .001*(1-self.inertia)

    def status(self):
        return 'inertia:' + str(self.inertia)
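`RmspropUpdater.update` keeps a leaky average of squared gradients and divides the step by its square root. A single-scalar numeric sketch of one step using plain `math` instead of the GPU kernels above (the weight-decay term is omitted here for brevity):

```python
import math


def rmsprop_step(param, grad, accum, lr=0.1, inertia=0.0, eps=1e-6):
    """One RMSProp update on a scalar, mirroring RmspropUpdater.update."""
    accum = inertia * accum + (1 - inertia) * grad ** 2
    param = param - lr * grad / (math.sqrt(accum) + eps)
    return param, accum


p, a = rmsprop_step(1.0, 0.5, 0.0)
# accum becomes grad**2, so the effective step is roughly lr * sign(grad)
assert abs(a - 0.25) < 1e-12
assert abs(p - 0.9) < 1e-4
```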
# ---- File: Aula37/View/testesquad.py | repo: PabloSchumacher/TrabalhosPython | licenses: bzip2-1.0.6, MIT ----
import sys
sys.path.append( r"C:\Users\900157\Documents\Github\TrabalhosPython\Aula37" )
from Controller.squad_controller import SquadController
from Model.squad import Squad
squad = Squad()
squad.nome = 'JooJ'
squad.descricao = 'Intermediário'
squad.npessoas = 20
squad.backend.nome = 'Lua'  # add
squad.backend.idbackend = 1  # update
squad.frontend.nome = 'Lua'  # add
squad.frontend.idfrontend = 1  # update
squad.sgbd.nome = 'Lua'  # add
squad.sgbd.idsgbd = 1  # update
squad.id = 6
controller = SquadController()
# controller.deletar(5)  # delete by id
# controller.salvar(squad)  # add
controller.alterar(squad)  # update
print(controller.listar_todos())  # fetch all
# print(controller.buscar_por_id(squad))  # fetch by id
# ---- File: restapi/resources/login.py | repo: fossabot/http-api | license: MIT ----
# -*- coding: utf-8 -*-
from datetime import datetime, timedelta
import pytz
from restapi.rest.definition import EndpointResource
from restapi.exceptions import RestApiException
from restapi.connectors.authentication import HandleSecurity
from restapi import decorators
from restapi.confs import WRAP_RESPONSE
from restapi.utilities.htmlcodes import hcodes
class Login(EndpointResource):
    """ Let a user login by using the configured method """

    baseuri = "/auth"
    depends_on = ["MAIN_LOGIN_ENABLE"]
    labels = ["authentication"]

    POST = {
        "/login": {
            "summary": "Login with basic credentials",
            "description": "Normal credentials (username and password) login endpoint",
            "parameters": [
                {
                    "name": "credentials",
                    "in": "body",
                    "schema": {"$ref": "#/definitions/Credentials"},
                }
            ],
            "responses": {
                "200": {"description": "Credentials are valid"},
                "401": {"description": "Invalid username or password"},
            },
        }
    }
    def verify_information(self, user, security, totp_auth, totp_code, now=None):
        message = {
            'actions': [],
            'errors': []
        }
        if totp_auth and totp_code is None:
            message['actions'].append(self.auth.SECOND_FACTOR_AUTHENTICATION)
            message['errors'].append("You did not provide a valid second factor")

        epoch = datetime.fromtimestamp(0, pytz.utc)
        last_pwd_change = user.last_password_change
        if last_pwd_change is None or last_pwd_change == 0:
            last_pwd_change = epoch

        if self.auth.FORCE_FIRST_PASSWORD_CHANGE and last_pwd_change == epoch:
            message['actions'].append('FIRST LOGIN')
            message['errors'].append("Please change your temporary password")
            if totp_auth:
                qr_code = security.get_qrcode(user)
                message["qr_code"] = qr_code
        elif self.auth.MAX_PASSWORD_VALIDITY > 0:
            if last_pwd_change == epoch:
                expired = True
            else:
                valid_until = last_pwd_change + timedelta(
                    days=self.auth.MAX_PASSWORD_VALIDITY
                )
                if now is None:
                    now = datetime.now(pytz.utc)
                expired = valid_until < now
            if expired:
                message['actions'].append('PASSWORD EXPIRED')
                message['errors'].append("Your password is expired, please change it")

        if not message['errors']:
            return None
        return self.response(
            errors=message, code=hcodes.HTTP_BAD_FORBIDDEN
        )
    @decorators.catch_errors()
    def post(self):

        jargs = self.get_input()
        username = jargs.get('username')
        if username is None:
            username = jargs.get('email')
        password = jargs.get('password')
        if password is None:
            password = jargs.get('pwd')

        # Now credentials are checked at every request
        if username is None or password is None:
            msg = "Missing username or password"
            raise RestApiException(msg, status_code=hcodes.HTTP_BAD_UNAUTHORIZED)

        username = username.lower()
        now = datetime.now(pytz.utc)

        new_password = jargs.get('new_password')
        password_confirm = jargs.get('password_confirm')

        totp_authentication = (
            self.auth.SECOND_FACTOR_AUTHENTICATION is not None
            and self.auth.SECOND_FACTOR_AUTHENTICATION == self.auth.TOTP
        )
        if totp_authentication:
            totp_code = jargs.get('totp_code')
        else:
            totp_code = None

        security = HandleSecurity(self.auth)

        # ##################################################
        # Authentication control
        security.verify_blocked_username(username)
        token, jti = self.auth.make_login(username, password)
        security.verify_token(username, token)
        user = self.auth.get_user()
        security.verify_blocked_user(user)
        security.verify_active_user(user)

        if totp_authentication and totp_code is not None:
            security.verify_totp(user, totp_code)

        # ##################################################
        # If requested, change the password
        if new_password is not None and password_confirm is not None:
            pwd_changed = security.change_password(
                user, password, new_password, password_confirm
            )
            if pwd_changed:
                password = new_password
                token, jti = self.auth.make_login(username, password)

        # ##################################################
        # Something is missing in the authentication, asking action to user
        ret = self.verify_information(
            user, security, totp_authentication, totp_code, now
        )
        if ret is not None:
            return ret

        # Everything is ok, let's save authentication information
        if user.first_login is None:
            user.first_login = now
        user.last_login = now
        self.auth.save_token(user, token, jti)

        if WRAP_RESPONSE:
            return self.response({'token': token})
        return self.response(token)
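`verify_information` marks a password as expired when `last_password_change` plus `MAX_PASSWORD_VALIDITY` days falls before the current time. The date arithmetic in isolation, using the stdlib `timezone` in place of `pytz` (function and variable names here are local to the sketch):

```python
from datetime import datetime, timedelta, timezone


def password_expired(last_change, max_validity_days, now=None):
    """True when the validity window after last_change is already over."""
    if now is None:
        now = datetime.now(timezone.utc)
    return last_change + timedelta(days=max_validity_days) < now


now = datetime(2020, 1, 31, tzinfo=timezone.utc)
changed = datetime(2020, 1, 1, tzinfo=timezone.utc)
assert password_expired(changed, 20, now=now) is True    # 30 days elapsed > 20
assert password_expired(changed, 60, now=now) is False
```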
# ---- File: Python/_10_09/2.3.19.py | repo: MBkkt/Homework | license: MIT ----
""" Combinations. Write a program combinations.py that takes a single
command-line argument n and prints all 2^n combinations of any size.
A combination is a subset of the n elements, regardless of order.
For example, when n = 3 you should get the following output:
a ab abc ac b bc c
Note that the program must also print the empty string
(the subset of size 0). """
import sys
from itertools import combinations
from string import ascii_lowercase
def sum_comb(n):
    # Subsets of the first n letters, grouped by size (0..n);
    # size 0 contributes the empty string.
    letters = ascii_lowercase[:n]
    return ' '.join(''.join(c) for i in range(n + 1)
                    for c in combinations(letters, i))
if __name__ == '__main__':
    n = int(sys.argv[1]) if len(sys.argv) > 1 else int(input())
    print(sum_comb(n))
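The power-set enumeration above can be sketched directly with itertools: every subset of an n-element set appears exactly once when the combinations of each size 0..n are chained together, for 2^n subsets in total. A standalone sketch over a fixed letter set:

```python
from itertools import chain, combinations

def power_set(items):
    """Yield every subset of items as a tuple, from size 0 up to len(items)."""
    return chain.from_iterable(
        combinations(items, size) for size in range(len(items) + 1)
    )

# 2**3 == 8 subsets of 'abc', starting with the empty string.
subsets = [''.join(c) for c in power_set('abc')]
```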
# File: src/rubrik_config/rubrik_config_base.py (repo: rubrikinc/rubrik-config-backup, license: MIT)
import abc
import json
import os
from rubrik_config import helpers
class RubrikConfigBase(abc.ABC):
def __init__(self, path, rubrik, logger):
self.path = path
self.rubrik = rubrik
self.logger = logger
self.cluster_version = self.rubrik.cluster_version()
self.cluster_name = helpers.cluster_name(self.rubrik)
self.config_name = helpers.config_name(self)
self.dependencies = set()
@abc.abstractmethod
def backup(self):
"""Backup all configuration items of this type.
Returns:
int: The number of items backed up.
"""
pass
@abc.abstractmethod
def restore(self, items):
"""Restore the given configuration items.
Args:
items ([str]): The configuration items to restore.
Returns:
list: A list of jobs that have been initiated on the cluster as part
of the recovery of the configuration.
"""
pass
@abc.abstractmethod
def status(self, job):
"""Return the status of the given job.
Args:
job (str): Job ID
Returns:
list: A list containing the status details of the given job.
"""
pass
# Private Methods
def _write(self, content, content_type, name_fn=lambda x: x['name']):
if len(content) == 0:
return
content_dir_name = content_type.split('.')[0]
path = f"{self.path}/{content_dir_name}"
os.makedirs(path, exist_ok=True)
for item in content:
filename = f"{path}/{helpers.secure_filename(name_fn(item))}.json"
with open(filename, 'w') as f:
file_content = {
'clusterName': self.cluster_name,
'clusterVersion': self.cluster_version,
'type': content_type,
'config': item
}
f.write(json.dumps(file_content, indent=4, sort_keys=True))
self.logger.info("'%s' successfully saved to %s" % (name_fn(item), filename))
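RubrikConfigBase is consumed by subclassing: each configuration type implements backup, restore, and status. A minimal sketch of the same abc pattern (hypothetical names, no Rubrik client involved), showing that an incomplete subclass cannot be instantiated:

```python
import abc

class ConfigBase(abc.ABC):
    @abc.abstractmethod
    def backup(self):
        """Back up all items of this type; return the number backed up."""

class SlaConfig(ConfigBase):
    def backup(self):
        return 0  # a real subclass would call the cluster API here

# A complete subclass instantiates normally...
sla = SlaConfig()

# ...but one that leaves an abstract method unimplemented raises
# TypeError at instantiation time, not at call time.
class Incomplete(ConfigBase):
    pass

try:
    Incomplete()
    incomplete_rejected = False
except TypeError:
    incomplete_rejected = True
```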
#!/usr/bin/env python
# File: prev_ob_models/exclude/GilraBhalla2015/synapses/mitral_granule_NMDA.py (repo: fameshpatel/olfactorybulb, license: MIT)
# -*- coding: utf-8 -*-
import sys
import math
# The PYTHONPATH should contain the location of moose.py and _moose.so
# files. Putting ".." with the assumption that moose.py and _moose.so
# has been generated in ${MOOSE_SOURCE_DIRECTORY}/pymoose/ (as default
# pymoose build does) and this file is located in
# ${MOOSE_SOURCE_DIRECTORY}/pymoose/examples
# sys.path.append('..\..')
try:
import moose
except ImportError:
    print("ERROR: Could not import moose. Please add the directory containing moose.py in your PYTHONPATH")
    sys.exit(1)
from synapseConstants import *
class mitral_granule_NMDA(moose.SynChan):
"""Non-saturating NMDA synapse from mitral to granule cell."""
def __init__(self, *args):
#### The Mg_block way
moose.SynChan.__init__(self,*args)
self.mgblock = moose.Mg_block(self.path+"/mgblock")
self.mgblock.CMg = mitral_granule_NMDA_MgConc
self.mgblock.KMg_A = mitral_granule_NMDA_KMg_A
## KMg_B has not been wrapped properly in pymoose,
## needed to set it via setField available in every pymoose object
#mgblock.KMg_B = mitral_granule_NMDA_KMg_B
self.mgblock.setField('KMg_B',str(mitral_granule_NMDA_KMg_B))
## connect source to destination.
## excsyn2 sends Gk and Ek to mgblock. other way around gives error.
self.connect("origChannel", self.mgblock, "origChannel")
self.addField('mgblock')
self.setField('mgblock','True')
#### The Mg_block way ends
##### The NMDAChan way
#moose.NMDAChan.__init__(self,*args)
#self.MgConc = mitral_granule_NMDA_MgConc
#connect this in the calling script in the usual way as below:
#granulecomp.connect("channel", excsyn2, "channel")
##### The NMDAChan way - ends
self.Ek = mitral_granule_NMDA_Ek
self.Gbar = mitral_granule_NMDA_Gbar
self.tau1 = mitral_granule_NMDA_tau1
self.tau2 = mitral_granule_NMDA_tau2
self.addField('graded')
self.setField('graded','False')
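The Mg_block parameters above (CMg, KMg_A, KMg_B) parameterize the voltage-dependent magnesium block of the NMDA conductance. A standard closed form for that dependence, the Jahr–Stevens expression (not necessarily MOOSE's exact internal formula), can be sketched as:

```python
import math

def mg_block(v_mv, mg_mm=1.0):
    """Fraction of NMDA conductance unblocked at membrane potential v_mv (mV),
    with extracellular [Mg2+] mg_mm in mM (Jahr & Stevens 1990 fit)."""
    return 1.0 / (1.0 + (mg_mm / 3.57) * math.exp(-0.062 * v_mv))

# Block is strong near rest and relieved by depolarization.
at_rest = mg_block(-65.0)
depolarized = mg_block(0.0)
```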
# File: hparams.py (repo: SuzukiDaishi/AutoVC.pytorch, license: MIT)
class hparams:
sample_rate = 16000
n_fft = 1024
#fft_bins = n_fft // 2 + 1
num_mels = 80
hop_length = 256
win_length = 1024
fmin = 90
fmax = 7600
min_level_db = -100
ref_level_db = 20
seq_len_factor = 64
bits = 12
seq_len = seq_len_factor * hop_length
dim_neck = 32
dim_emb = 256
dim_pre = 512
freq = 32
## wavenet vocoder
builder = 'wavenet'
hop_size = 256
log_scale_min = float(-32.23619130191664)
out_channels = 10 * 3
layers = 24
stacks = 4
residual_channels = 512
gate_channels = 512
skip_out_channels = 256
dropout = 1 - 0.95
kernel_size = 3
cin_channels = 80
upsample_conditional_features = True
upsample_scales = [4, 4, 4, 4]
freq_axis_kernel_size = 3
gin_channels = -1
n_speakers = -1
weight_normalization = True
legacy = True
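Several of the values above are derived rather than independent; a quick sanity sketch (standalone constants mirroring the ones above) shows the training window in samples, seconds, and mel frames:

```python
sample_rate = 16000
hop_length = 256
seq_len_factor = 64

seq_len = seq_len_factor * hop_length  # training window in samples
seq_seconds = seq_len / sample_rate    # duration of one window in seconds
n_frames = seq_len // hop_length       # mel frames per window == seq_len_factor
```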
# File: expRT/BTS_Material.py (repo: EJ-Chang/OSD-Study, license: MIT)
# Material
buttonType = ['Arrow button', 'Radio button', 'Switch button', 'Option']
requestLUT = [
{
'name' : 'Radio',
'on' : 'OSD_ImgFolder/radio_on.png',
'off' : 'OSD_ImgFolder/radio_off.png',
'default' : 'off',
'hint_path' : 'OSD_ImgFolder/L4 off.png',
'hint': 0
},
{
'name' : 'Switch',
'on' : 'OSD_ImgFolder/switch_on.png',
'off' : 'OSD_ImgFolder/switch_off.png',
'default' : 'off',
'hint_path' : 'OSD_ImgFolder/L4 off.png',
'hint' : 0
},
{
'name' : 'Switch_clue',
'on' : 'OSD_ImgFolder/switch_on.png',
'off' : 'OSD_ImgFolder/switch_off.png',
'default' : 'off',
'hint_path' : 'OSD_ImgFolder/L4 off.png',
'hint' : 1
}
]
backgroundLUT = [
{
'name' : 'Background',
'position' : (0, 0),
'path' : 'OSD_ImgFolder/MainFrame.png'
},
{
'name' : 'Lay_1',
'position' : (-405, -30),
'path' : 'OSD_ImgFolder/Layer_1.png'
},
{
'name' : 'Lay_2',
'position' : (-235, -30),
'path' : 'OSD_ImgFolder/Layer_2.png'
},
{
'name' : 'Lay_3',
'position' : (35, -30),
'path' : 'OSD_ImgFolder/Layer_3.png'
},
{
'name' : 'Lay_4',
'position' : (305, -30),
'path' : 'OSD_ImgFolder/Layer_4.png'
}
]
strLUT = [
{
'name' : 'L1 str',
'position' : (-405, -30),
'path' : 'OSD_ImgFolder/aLayer_1.png',
'hint' : 'OSD_ImgFolder/aLayer_1.png'
},
{
'name' : 'L2_str',
'position' : (-235, -30),
'path' : 'OSD_ImgFolder/L2 str.png',
'hint' : 'OSD_ImgFolder/L2 str.png'
},
{
'name' : 'L3_str',
'position' : (35, -30),
'path' : 'OSD_ImgFolder/L3 str.png',
'hint' : 'OSD_ImgFolder/L3 off.png'
},
{
'name' : 'L4_str',
'position' : (305, -30),
'path' : 'OSD_ImgFolder/L4 str.png',
'hint' : 'OSD_ImgFolder/L4 off.png'
}
]
indicatorLUT = [
{
'name' : 'L1 selector',
'width' : 68,
'height' : 70,
'position' : [(-405, 110), (-405, 40), (-405, -30), (-405, -100), (-405, -170)]
},
{
'name' : 'L2 selector',
'width' : 268,
'height' : 70,
'position' : [(-235, 110), (-235, 40), (-235, -30), (-235, -100), (-235, -170)]
},
{
'name' : 'L3 selector',
'width' : 268,
'height' : 70,
'position' : [(35, 110), (35, 40), (35, -30), (35, -100), (35, -170)]
},
{
'name' : 'L4 selector',
'width' : 268,
'height' : 70,
'position' : [(305, 110), (305, 40), (305, -30), (305, -100), (305, -170)]
}
]
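Lists of dicts like the LUTs above are typically re-keyed by 'name' for constant-time lookup at runtime; a short sketch with a hypothetical two-entry table:

```python
lut = [
    {'name': 'Radio', 'default': 'off'},
    {'name': 'Switch', 'default': 'off'},
]

# Build a name-keyed index once, then look entries up by name.
by_name = {entry['name']: entry for entry in lut}
radio_default = by_name['Radio']['default']
```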
# File: Chapter 9/code1.py (repo: PacktPublishing/Mastering-IPython-4, license: MIT)
"""
This is an abbreviated version of my random number generator test suite.
It uses the pytest framework. It does not do much in this form.
"""
import numpy as np
import scipy.stats
import random
class TestRandoms:
"""
This is the main class.
Normally it would hold all the tests, plus and setup and teardown fixtures.
"""
def test_builtin(self):
"""
Test the built-in random number generator on 10000 numbers.
"""
num_tests = 10000
vals = [0 for i in range(10)]
for i in range(num_tests):
tmp = random.randint(0, 9)
vals[tmp] = vals[tmp] + 1
        chi2, p = scipy.stats.chisquare(vals)
assert p > 0.05
def foo():
""" I just needed a function outside of a class as an example"""
pass
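The chi-square statistic used in the test can also be computed by hand, which makes the uniformity check explicit: with 10 bins and 10000 draws the expected count per bin is 1000, and the statistic sums (observed - expected)^2 / expected over the bins. 16.92 is approximately the 5% critical value for 9 degrees of freedom.

```python
import random

def chi_square_statistic(observed, expected):
    """Pearson chi-square statistic for equal expected counts per bin."""
    return sum((o - expected) ** 2 / expected for o in observed)

random.seed(0)
num_tests = 10000
counts = [0] * 10
for _ in range(num_tests):
    counts[random.randint(0, 9)] += 1

stat = chi_square_statistic(counts, num_tests / 10)
CRITICAL_5PCT_DF9 = 16.92  # approx. chi2.ppf(0.95, 9)
uniform_enough = stat < CRITICAL_5PCT_DF9
```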
# File: config.py (repo: mscienski/code-challenge-starter-python-flask, license: MIT)
# pylint: disable=no-member
import os
import logging
from flask import Flask
from flask_cors import CORS
from sqlalchemy.orm import sessionmaker
from flask_sqlalchemy import SQLAlchemy
host: str = os.getenv('DB_HOST', 'localhost')
port: int = int(os.getenv('DB_PORT', '5432'))
user: str = os.getenv('DB_USER', 'postgres')
password: str = os.getenv('DB_PASSWORD', '')
name = os.getenv('DB_NAME', 'challenge_starter_development')
db_url = f'postgresql://{user}:{password}@{host}:{port}/{name}'
flask_app = Flask(__name__)
flask_app.config['SQLALCHEMY_DATABASE_URI'] = db_url
flask_app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
DB = SQLAlchemy(flask_app)
Session = sessionmaker(bind=DB.engine)
CORS(flask_app)
gunicorn_logger = logging.getLogger('gunicorn.error')
gunicorn_logger.setLevel(logging.INFO)
flask_app.logger.handlers = gunicorn_logger.handlers
flask_app.logger.setLevel(gunicorn_logger.level)
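The env-with-default pattern above is easy to exercise in isolation; a sketch (hypothetical PG_* variable names, not the app's real ones) showing how the connection URL is assembled:

```python
import os

def build_db_url(env=os.environ):
    """Assemble a postgresql:// URL from environment-style settings."""
    host = env.get('PG_HOST', 'localhost')
    port = int(env.get('PG_PORT', '5432'))
    user = env.get('PG_USER', 'postgres')
    password = env.get('PG_PASSWORD', '')
    name = env.get('PG_NAME', 'app_development')
    return f'postgresql://{user}:{password}@{host}:{port}/{name}'

# Passing an empty mapping exercises all the defaults.
url = build_db_url(env={})
```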
# File: shared-data/python/tests/labware/__init__.py (repo: Opentrons/protocol_framework, license: Apache-2.0)
from typing import List, Tuple
from pathlib import Path
def get_ot_defs() -> List[Tuple[str, int]]:
def_files = (
Path(__file__).parent / ".." / ".." / ".." / "labware" / "definitions" / "2"
).glob("**/*.json")
# example filename
# shared-data/labware/definitions/2/opentrons_96_tiprack_300ul/1.json
return [(f.parent.name, int(f.stem)) for f in def_files]
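The parent-name/stem parsing above works on plain Path objects without touching the filesystem; for example (hypothetical path):

```python
from pathlib import Path

# For .../<loadname>/<version>.json, the definition name is the parent
# directory and the schema version is the filename stem.
p = Path("labware/definitions/2/opentrons_96_tiprack_300ul/1.json")
loadname, version = p.parent.name, int(p.stem)
```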
# File: noysim/geo.py (repo: bertdecoensel/noysim, license: MIT)
# Noysim -- Noise simulation tools for Aimsun.
"""
from django.core.management.base import BaseCommand
from django.contrib.auth.models import User
from rest_framework.authtoken.models import Token
SCRAPER_USERNAME = "scraper"
class Command(BaseCommand):
help = "Get or create a scraper user with a Django REST Framework token"
def add_arguments(self, parser):
pass
def handle(self, *args, **options):
user, created = User.objects.get_or_create(username=SCRAPER_USERNAME)
user.save()
if created:
self.stdout.write(f"Created new user with username {SCRAPER_USERNAME}")
else:
self.stdout.write(f"User {SCRAPER_USERNAME} already exists.")
token, created = Token.objects.get_or_create(user=user)
self.stdout.write(f"The token for the user {SCRAPER_USERNAME} is {token}")
| 33.148148 | 83 | 0.707263 | 120 | 895 | 5.183333 | 0.425 | 0.120579 | 0.053055 | 0.07717 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202235 | 895 | 26 | 84 | 34.423077 | 0.871148 | 0.081564 | 0 | 0 | 0 | 0 | 0.257669 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.058824 | 0.176471 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
f6be3f33c5cbe03b07efac65c5537f50b908734b | 573 | py | Python | grouper/ctl/base.py | aneeq009/merou | 7a87b43aaf64244932fa460842132a2d9329e704 | [
"Apache-2.0"
] | 58 | 2017-05-26T06:46:24.000Z | 2022-03-25T20:55:51.000Z | grouper/ctl/base.py | aneeq009/merou | 7a87b43aaf64244932fa460842132a2d9329e704 | [
"Apache-2.0"
] | 74 | 2017-06-16T17:48:37.000Z | 2022-03-28T23:09:54.000Z | grouper/ctl/base.py | aneeq009/merou | 7a87b43aaf64244932fa460842132a2d9329e704 | [
"Apache-2.0"
] | 43 | 2017-05-20T22:11:51.000Z | 2022-03-25T00:24:56.000Z | from abc import ABCMeta, abstractmethod
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from argparse import ArgumentParser, Namespace
class CtlCommand(metaclass=ABCMeta):
"""Implements a subcommand of grouper-ctl."""
@staticmethod
@abstractmethod
def add_arguments(parser):
# type: (ArgumentParser) -> None
"""Add the arguments for this command to the provided parser."""
pass
@abstractmethod
def run(self, args):
# type: (Namespace) -> None
"""Run a command with some arguments."""
pass
| 24.913043 | 72 | 0.666667 | 63 | 573 | 6.015873 | 0.619048 | 0.063325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.244328 | 573 | 22 | 73 | 26.045455 | 0.875289 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.166667 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
f6c144a6777f88e0431ff780adf19960dca5e7ab | 6,798 | py | Python | noysim/geo.py | bertdecoensel/noysim | 0f45958093e3db453db8be6edc54f4cb1e119be0 | [
"MIT"
] | 1 | 2016-06-12T08:27:58.000Z | 2016-06-12T08:27:58.000Z | noysim/geo.py | bertdecoensel/noysim | 0f45958093e3db453db8be6edc54f4cb1e119be0 | [
"MIT"
] | null | null | null | noysim/geo.py | bertdecoensel/noysim | 0f45958093e3db453db8be6edc54f4cb1e119be0 | [
"MIT"
] | 1 | 2019-01-16T13:16:56.000Z | 2019-01-16T13:16:56.000Z | # Noysim -- Noise simulation tools for Aimsun.
# Copyright (c) 2010-2011 by Bert De Coensel, Ghent University & Griffith University.
#
# Basic geometry functions and classes
import numpy
import pylab
EPSILON = 10e-12 # smallest difference for points/directions
#---------------------------------------------------------------------------------------------------
# Convenience functions
#---------------------------------------------------------------------------------------------------
def parse_coordinates(*args):
""" parse 2D/3D coordinates x,y(,z) in a variety of fashions, and return a 3-element tuple """
n = len(args)
if n == 0:
return (0.0,0.0,0.0)
if n == 1:
try: # try if a Point object is supplied
return args[0].coordinates()
except:
if type(args[0]) in (tuple,list):
# coordinates supplied as a tuple (x,y) or (x,y,z)
if len(args[0]) == 2:
return (args[0][0], args[0][1], 0.0)
if len(args[0]) == 3:
return (args[0][0], args[0][1], args[0][2])
if type(args[0]) is str:
# coordinates supplied as a string '(x,y,z)'
c = args[0].strip('()').split(',')
return (float(c[0]), float(c[1]), float(c[2]))
else:
# coordinates supplied as separate arguments x,y or x,y,z
if n == 2:
return (args[0], args[1], 0.0)
if n == 3:
return (args[0], args[1], args[2])
raise Exception('unable to parse coordinates: ' + str(args))
def asPoint(p):
""" create a point object from 2D/3D coordinates """
if isinstance(p, Point):
return p
else:
return Point(p)
def asDirection(d):
""" create a direction object from a tuple (bearing, gradient) """
if isinstance(d, Direction):
return d
else:
return Direction(bearing = d[0], gradient = d[1])
#---------------------------------------------------------------------------------------------------
# Point class
#---------------------------------------------------------------------------------------------------
class Point(object):
""" basic 3D point class """
def __init__(self, *xyz):
object.__init__(self)
self.x, self.y, self.z = parse_coordinates(*xyz)
def copy(self):
""" return a copy """
return Point(self.x, self.y, self.z)
def coordinates(self):
""" return the coordinates as a tuple (x,y,z) """
return (self.x, self.y, self.z)
def __getitem__(self, key):
""" implement list style access to coordinates: p[0], p[1], p[2] """
return self.coordinates()[key]
def __str__(self):
""" string representation of a point """
return '(%.2f,%.2f,%.2f)' % self.coordinates()
def middle(self, other):
""" return the middle point between self and another point """
return Point((self.x + other.x)/2.0, (self.y + other.y)/2.0, (self.z + other.z)/2.0)
def distanceSquared(self, other):
""" return the squared distance to another point """
return (self.x - other.x)**2 + (self.y - other.y)**2 + (self.z - other.z)**2
def distance(self, other):
""" return the distance to another point """
return numpy.sqrt(self.distanceSquared(other))
def distanceXY(self, other):
""" return the distance to another point, both projected to the xy-plane """
return numpy.sqrt((self.x - other.x)**2 + (self.y - other.y)**2)
def __eq__(self, other):
""" check if points coincide """
if other == None:
return False
return (self.distance(other) < EPSILON)
def __ne__(self, other):
""" check if points do not coincide """
return not self.__eq__(other)
  def __lt__(self, other):
    """ ordering on the coordinates, first x, then y, then z (Python 3 ignores __cmp__) """
if self.x == other.x:
if (self.y == other.y):
return (self.z < other.z)
else:
return (self.y < other.y)
else:
return (self.x < other.x)
def projectXY(self, z = 0.0):
""" return the projection of the point on the xy-plane """
return Point(self.x, self.y, z)
def transform(self, func):
""" perform a coordinate transformation with the given function (x,y,z) to (x',y',z') """
self.x, self.y, self.z = func((self.x, self.y, self.z))
def plot(self, color = 'black', size = 5):
""" plot the point in the xy-plane """
pylab.plot([self.x], [self.y], color = color, linestyle = 'None', marker = '.', markersize = size)
#---------------------------------------------------------------------------------------------------
# Direction class
#---------------------------------------------------------------------------------------------------
class Direction(object):
""" basic geometrical 3D direction class """
def __init__(self, bearing, gradient = 0.0):
object.__init__(self)
# both bearing and gradient are stored in degrees
self.bearing = bearing
self.gradient = gradient
def copy(self):
""" return a copy """
return Direction(self.bearing, self.gradient)
def __getitem__(self, key):
""" implement list style access to bearing and gradient """
return (self.bearing, self.gradient)[key]
def bearingRadians(self):
""" return the bearing (horizontal angle with the x-axis) in radians """
return numpy.radians(self.bearing)
def gradientRadians(self):
""" return the gradient (vertical angle with the xy-plane) in radians """
return numpy.radians(self.gradient)
def __str__(self):
""" return a string representation of the direction """
return '[%.2f,%.2f]' % (self.bearing, self.gradient)
def __eq__(self, other):
""" check if directions coincide """
if other == None:
return False
db = abs(self.bearing - other.bearing)
dg = abs(self.gradient - other.gradient)
return (db <= EPSILON) and (dg <= EPSILON)
def __ne__(self, other):
""" check if directions do not coincide """
return not self.__eq__(other)
def directionFromTo(p1, p2):
""" returns the direction from point 1 to point 2 """
(dx, dy, dz) = (p2.x - p1.x, p2.y - p1.y, p2.z - p1.z)
siz = p1.distance(p2)
return Direction(bearing = numpy.degrees(numpy.arctan2(dy, dx)), gradient = numpy.degrees(numpy.arcsin(dz/siz)))
#---------------------------------------------------------------------------------------------------
# Test code
#---------------------------------------------------------------------------------------------------
if __name__ == '__main__':
points = []
points.append(Point(1.2, 3.4))
points.append(Point([5.6, 7.8, 9.0]))
points.append(Point('(7.8, 9.0, 1.2)'))
pylab.figure()
for p in points:
p.plot()
try:
pylab.show()
except:
pass
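The bearing/gradient convention used by directionFromTo can be checked with a standalone computation (plain math, no numpy): the direction from the origin to (1, 1, 0) has a bearing of 45 degrees and a gradient of 0.

```python
import math

def bearing_gradient(p1, p2):
    """Bearing (angle with the x-axis in the xy-plane) and gradient (angle
    with the xy-plane), both in degrees, from point p1 to p2 (3-tuples)."""
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    return math.degrees(math.atan2(dy, dx)), math.degrees(math.asin(dz / dist))

b, g = bearing_gradient((0, 0, 0), (1, 1, 0))
```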
# File: pyrometheus/codegen/python.py (repo: pyrometheus/pyrometheus, license: MIT)
"""
Python code generation
----------------------
.. autofunction:: gen_thermochem_code
.. autofunction:: get_thermochem_class
"""
__copyright__ = """
Copyright (C) 2020 Esteban Cisneros
Copyright (C) 2020 Andreas Kloeckner
"""
__license__ = """
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
"""
from numbers import Number
import pymbolic.primitives as p
from pymbolic.mapper.stringifier import StringifyMapper, PREC_NONE, PREC_CALL
import cantera as ct
import numpy as np # noqa: F401
from mako.template import Template
import pyrometheus.chem_expr
file_extension = "py"
# {{{ code generation helpers
class CodeGenerationMapper(StringifyMapper):
def map_constant(self, expr, enclosing_prec):
return repr(expr)
OP_NAMES = {
">=": "greater_equal",
">": "greater",
"==": "equal",
"!=": "not_equal",
"<=": "less_equal",
"<": "less",
}
def map_comparison(self, expr, enclosing_prec, *args, **kwargs):
return (f"self.usr_np.{self.OP_NAMES[expr.operator]}"
f"({self.rec(expr.left, PREC_NONE, *args, **kwargs)}, "
f"{self.rec(expr.right, PREC_NONE, *args, **kwargs)})")
def map_if(self, expr, enclosing_prec, *args, **kwargs):
return "self.usr_np.where(%s, %s, %s)" % (
self.rec(expr.condition, PREC_NONE, *args, **kwargs),
self.rec(expr.then, PREC_NONE, *args, **kwargs),
self.rec(expr.else_, PREC_NONE, *args, **kwargs),
)
def map_call(self, expr, enclosing_prec, *args, **kwargs):
return self.format(
"self.usr_np.%s(%s)",
self.rec(expr.function, PREC_CALL, *args, **kwargs),
self.join_rec(", ", expr.parameters, PREC_NONE, *args, **kwargs),
)
def str_np_inner(ary):
if isinstance(ary, Number):
return repr(ary)
elif ary.shape:
return "[%s]" % (", ".join(str_np_inner(ary_i) for ary_i in ary))
raise TypeError("invalid argument to str_np_inner")
def str_np(ary):
return "np.array(%s)" % str_np_inner(ary)
# }}}
# {{{ main code template
code_tpl = Template(
"""\"""
.. autoclass:: Thermochemistry
\"""
import numpy as np
class Thermochemistry:
\"""
.. attribute:: model_name
.. attribute:: num_elements
.. attribute:: num_species
.. attribute:: num_reactions
.. attribute:: num_falloff
.. attribute:: one_atm
Returns 1 atm in SI units of pressure (Pa).
.. attribute:: gas_constant
.. attribute:: species_names
.. attribute:: species_indices
.. automethod:: get_specific_gas_constant
.. automethod:: get_density
.. automethod:: get_pressure
.. automethod:: get_mix_molecular_weight
.. automethod:: get_concentrations
.. automethod:: get_mixture_specific_heat_cp_mass
.. automethod:: get_mixture_specific_heat_cv_mass
.. automethod:: get_mixture_enthalpy_mass
.. automethod:: get_mixture_internal_energy_mass
.. automethod:: get_species_specific_heats_r
.. automethod:: get_species_enthalpies_rt
.. automethod:: get_species_entropies_r
.. automethod:: get_species_gibbs_rt
.. automethod:: get_equilibrium_constants
.. automethod:: get_temperature
.. automethod:: __init__
\"""
def __init__(self, usr_np=np):
\"""Initialize thermochemistry object for a mechanism.
Parameters
----------
usr_np
:mod:`numpy`-like namespace providing at least the following functions,
for any array ``X`` of the bulk array type:
- ``usr_np.log(X)`` (like :data:`numpy.log`)
- ``usr_np.log10(X)`` (like :data:`numpy.log10`)
- ``usr_np.exp(X)`` (like :data:`numpy.exp`)
- ``usr_np.where(X > 0, X_yes, X_no)`` (like :func:`numpy.where`)
- ``usr_np.linalg.norm(X, np.inf)`` (like :func:`numpy.linalg.norm`)
where the "bulk array type" is a type that offers arithmetic analogous
to :class:`numpy.ndarray` and is used to hold all types of (potentialy
volumetric) "bulk data", such as temperature, pressure, mass fractions,
etc. This parameter defaults to *actual numpy*, so it can be ignored
unless it is needed by the user (e.g. for purposes of
GPU processing or automatic differentiation).
\"""
self.usr_np = usr_np
self.model_name = ${repr(sol.source)}
self.num_elements = ${sol.n_elements}
self.num_species = ${sol.n_species}
self.num_reactions = ${sol.n_reactions}
self.num_falloff = ${
sum(1 if isinstance(r, ct.FalloffReaction) else 0
for r in sol.reactions())}
self.one_atm = ${ct.one_atm}
self.gas_constant = ${ct.gas_constant}
self.big_number = 1.0e300
self.species_names = ${sol.species_names}
self.species_indices = ${
dict([[sol.species_name(i), i]
for i in range(sol.n_species)])}
self.wts = ${str_np(sol.molecular_weights)}
self.iwts = 1/self.wts
def _pyro_zeros_like(self, argument):
# FIXME: This is imperfect, as a NaN will stay a NaN.
return 0 * argument
def _pyro_make_array(self, res_list):
\"""This works around (e.g.) numpy.exp not working with object
arrays of numpy scalars. It defaults to making object arrays, however
if an array consists of all scalars, it makes a "plain old"
:class:`numpy.ndarray`.
See ``this numpy bug <https://github.com/numpy/numpy/issues/18004>`__
for more context.
\"""
from numbers import Number
all_numbers = all(isinstance(e, Number) for e in res_list)
dtype = np.float64 if all_numbers else object
result = np.empty((len(res_list),), dtype=dtype)
# 'result[:] = res_list' may look tempting, however:
# https://github.com/numpy/numpy/issues/16564
for idx in range(len(res_list)):
result[idx] = res_list[idx]
return result
def _pyro_norm(self, argument, normord):
\"""This works around numpy.linalg norm not working with scalars.
If the argument is a regular ole number, it uses :func:`numpy.abs`,
otherwise it uses ``usr_np.linalg.norm``.
\"""
# Wrap norm for scalars
from numbers import Number
if isinstance(argument, Number):
return np.abs(argument)
return self.usr_np.linalg.norm(argument, normord)
def species_name(self, species_index):
return self.species_names[species_index]
def species_index(self, species_name):
return self.species_indices[species_name]
def get_specific_gas_constant(self, mass_fractions):
return self.gas_constant * (
%for i in range(sol.n_species):
+ self.iwts[${i}]*mass_fractions[${i}]
%endfor
)
def get_density(self, p, temperature, mass_fractions):
mmw = self.get_mix_molecular_weight(mass_fractions)
rt = self.gas_constant * temperature
return p * mmw / rt
def get_pressure(self, rho, temperature, mass_fractions):
mmw = self.get_mix_molecular_weight(mass_fractions)
rt = self.gas_constant * temperature
return rho * rt / mmw
def get_mix_molecular_weight(self, mass_fractions):
return 1/(
%for i in range(sol.n_species):
+ self.iwts[${i}]*mass_fractions[${i}]
%endfor
)
def get_concentrations(self, rho, mass_fractions):
return self.iwts * rho * mass_fractions
def get_mass_average_property(self, mass_fractions, spec_property):
return sum([mass_fractions[i] * spec_property[i] * self.iwts[i]
for i in range(self.num_species)])
def get_mixture_specific_heat_cp_mass(self, temperature, mass_fractions):
cp0_r = self.get_species_specific_heats_r(temperature)
cpmix = self.get_mass_average_property(mass_fractions, cp0_r)
return self.gas_constant * cpmix
def get_mixture_specific_heat_cv_mass(self, temperature, mass_fractions):
cp0_r = self.get_species_specific_heats_r(temperature) - 1.0
cpmix = self.get_mass_average_property(mass_fractions, cp0_r)
return self.gas_constant * cpmix
def get_mixture_enthalpy_mass(self, temperature, mass_fractions):
h0_rt = self.get_species_enthalpies_rt(temperature)
hmix = self.get_mass_average_property(mass_fractions, h0_rt)
return self.gas_constant * temperature * hmix
def get_mixture_internal_energy_mass(self, temperature, mass_fractions):
e0_rt = self.get_species_enthalpies_rt(temperature) - 1.0
emix = self.get_mass_average_property(mass_fractions, e0_rt)
return self.gas_constant * temperature * emix
def get_species_specific_heats_r(self, temperature):
return self._pyro_make_array([
% for sp in sol.species():
${cgm(ce.poly_to_expr(sp.thermo, "temperature"))},
% endfor
])
def get_species_enthalpies_rt(self, temperature):
return self._pyro_make_array([
% for sp in sol.species():
${cgm(ce.poly_to_enthalpy_expr(sp.thermo, "temperature"))},
% endfor
])
def get_species_entropies_r(self, temperature):
return self._pyro_make_array([
% for sp in sol.species():
${cgm(ce.poly_to_entropy_expr(sp.thermo, "temperature"))},
% endfor
])
def get_species_gibbs_rt(self, temperature):
h0_rt = self.get_species_enthalpies_rt(temperature)
s0_r = self.get_species_entropies_r(temperature)
return h0_rt - s0_r
def get_equilibrium_constants(self, temperature):
rt = self.gas_constant * temperature
c0 = self.usr_np.log(self.one_atm / rt)
g0_rt = self.get_species_gibbs_rt(temperature)
return self._pyro_make_array([
%for i, react in enumerate(sol.reactions()):
%if react.reversible:
${cgm(ce.equilibrium_constants_expr(
sol, i, Variable("g0_rt")))},
%else:
-0.17364695002734*temperature,
%endif
%endfor
])
def get_temperature(self, enthalpy_or_energy, t_guess, y, do_energy=False):
if do_energy is False:
pv_fun = self.get_mixture_specific_heat_cp_mass
he_fun = self.get_mixture_enthalpy_mass
else:
pv_fun = self.get_mixture_specific_heat_cv_mass
he_fun = self.get_mixture_internal_energy_mass
num_iter = 500
tol = 1.0e-6
ones = self._pyro_zeros_like(enthalpy_or_energy) + 1.0
t_i = t_guess * ones
for _ in range(num_iter):
f = enthalpy_or_energy - he_fun(t_i, y)
j = -pv_fun(t_i, y)
dt = -f / j
t_i += dt
if self._pyro_norm(dt, np.inf) < tol:
return t_i
raise RuntimeError("Temperature iteration failed to converge")
%if falloff_reactions:
def get_falloff_rates(self, temperature, concentrations, k_fwd):
ones = self._pyro_zeros_like(temperature) + 1.0
k_high = self._pyro_make_array([
%for _, react in falloff_reactions:
%if react.uses_legacy:
${cgm(ce.rate_coefficient_expr(
react.high_rate, Variable("temperature")))},
%else:
${cgm(ce.rate_coefficient_expr(
react.rate.high_rate, Variable("temperature")))},
%endif
%endfor
])
k_low = self._pyro_make_array([
%for _, react in falloff_reactions:
%if react.uses_legacy:
${cgm(ce.rate_coefficient_expr(
react.low_rate, Variable("temperature")))},
%else:
${cgm(ce.rate_coefficient_expr(
react.rate.low_rate, Variable("temperature")))},
%endif
%endfor
])
reduced_pressure = self._pyro_make_array([
%for i, (_, react) in enumerate(falloff_reactions):
(${cgm(ce.third_body_efficiencies_expr(
sol, react, Variable("concentrations")))})*k_low[${i}]/k_high[${i}],
%endfor
])
falloff_center = self._pyro_make_array([
%for _, react in falloff_reactions:
${cgm(ce.troe_falloff_expr(react, Variable("temperature")))},
%endfor
])
falloff_function = self._pyro_make_array([
%for i, (_, react) in enumerate(falloff_reactions):
${cgm(ce.falloff_function_expr(
react, i, Variable("temperature"), Variable("reduced_pressure"),
Variable("falloff_center")))},
%endfor
])*reduced_pressure/(1+reduced_pressure)
%for j, (i, react) in enumerate(falloff_reactions):
k_fwd[${i}] = k_high[${j}]*falloff_function[${j}]*ones
%endfor
return
%endif
def get_fwd_rate_coefficients(self, temperature, concentrations):
ones = self._pyro_zeros_like(temperature) + 1.0
k_fwd = [
%for react in sol.reactions():
%if isinstance(react, ct.FalloffReaction):
0*temperature,
%else:
${cgm(ce.rate_coefficient_expr(react.rate,
Variable("temperature")))} * ones,
%endif
%endfor
]
%if falloff_reactions:
self.get_falloff_rates(temperature, concentrations, k_fwd)
%endif
%for i, react in three_body_reactions:
k_fwd[${i}] *= (${cgm(ce.third_body_efficiencies_expr(
sol, react, Variable("concentrations")))})
%endfor
return self._pyro_make_array(k_fwd)
def get_net_rates_of_progress(self, temperature, concentrations):
k_fwd = self.get_fwd_rate_coefficients(temperature, concentrations)
log_k_eq = self.get_equilibrium_constants(temperature)
return self._pyro_make_array([
%for i in range(sol.n_reactions):
${cgm(ce.rate_of_progress_expr(sol, i,
Variable("concentrations"),
Variable("k_fwd"), Variable("log_k_eq")))},
%endfor
])
def get_net_production_rates(self, rho, temperature, mass_fractions):
c = self.get_concentrations(rho, mass_fractions)
r_net = self.get_net_rates_of_progress(temperature, c)
ones = self._pyro_zeros_like(r_net[0]) + 1.0
return self._pyro_make_array([
%for sp in sol.species():
${cgm(ce.production_rate_expr(
sol, sp.name, Variable("r_net")))} * ones,
%endfor
])""", strict_undefined=True)
# }}}
def gen_thermochem_code(sol: ct.Solution) -> str:
"""For the mechanism given by *sol*, return Python source code for a class conforming
to a module containing a class called ``Thermochemistry`` adhering to the
:class:`~pyrometheus.thermochem_example.Thermochemistry` interface.
"""
return code_tpl.render(
ct=ct,
sol=sol,
str_np=str_np,
cgm=CodeGenerationMapper(),
Variable=p.Variable,
ce=pyrometheus.chem_expr,
falloff_reactions=[(i, react) for i, react in enumerate(sol.reactions())
if isinstance(react, ct.FalloffReaction)],
three_body_reactions=[(i, react) for i, react in enumerate(sol.reactions())
if isinstance(react, ct.ThreeBodyReaction)],
)
def compile_class(code_str, class_name="Thermochemistry"):
exec_dict = {}
exec(compile(code_str, "<generated code>", "exec"), exec_dict)
exec_dict["_MODULE_SOURCE_CODE"] = code_str
return exec_dict[class_name]
def get_thermochem_class(sol: ct.Solution):
"""For the mechanism given by *sol*, return a class conforming to the
:class:`~pyrometheus.thermochem_example.Thermochemistry` interface.
"""
return compile_class(gen_thermochem_code(sol))
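# A minimal usage sketch (hypothetical mechanism file name "mech.yaml";
# assumes Cantera can load it and that ``mass_fractions`` is an array of
# length ``num_species``):
#
#     sol = ct.Solution("mech.yaml")
#     Thermochemistry = get_thermochem_class(sol)
#     thermo = Thermochemistry()  # usr_np defaults to plain numpy
#     mmw = thermo.get_mix_molecular_weight(mass_fractions)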
def cti_to_mech_file(cti_file_name, mech_file_name):
"""Write python file for mechanism specified by CTI file."""
with open(mech_file_name, "w") as outf:
code = gen_thermochem_code(ct.Solution(cti_file_name, "gas"))
print(code, file=outf)
# vim: foldmethod=marker
#!/usr/bin/env python
# python/dash_tools/staticdownloader.py (jainsat/media-tools, BSD-3-Clause)
import os
from common import fetch_file
import staticmpdparser
import client
import json
import pdb
def download(options, mpd_url=None, mpd_str=None, base_url=None, base_dst=""):
"Download MPD if url specified and then start downloading segments."
if mpd_url:
# First download the MPD file
mpd_str, _, _ = fetch_file(mpd_url)
base_url, file_name = os.path.split(mpd_url)
mpd_parser = staticmpdparser.StaticManifestParser(mpd_str)
if options.verbose:
print(str(mpd_parser.mpd))
if options.abr:
print("Starting ABR client")
client.AbrClient(mpd_parser.mpd, base_url, base_dst, options).download()
elif options.bola:
print("Starting BOLA client")
client.BolaClient(mpd_parser.mpd, base_url, base_dst, options).download()
elif options.bba0:
print("Starting BBA0 client")
client.BBAClient(mpd_parser.mpd, base_url, base_dst, options).download_bba0()
elif options.bba2:
print("Starting BBA2 client")
client.BBAClient(mpd_parser.mpd, base_url, base_dst, options).download_bba2()
elif options.pensieve:
print("Starting Pensieve client")
client.PensieveClient(mpd_parser.mpd, base_url, base_dst, options).download_pensieve()
else:
print("Starting Simple client")
client.SimpleClient(mpd_parser.mpd, base_url, base_dst).download()
def main():
"Parse command line and start the fetching."
from optparse import OptionParser
usage = "usage: %prog [options] mpdURL [dstDir]"
parser = OptionParser(usage)
parser.add_option("-v", "--verbose", action="store_true", dest="verbose")
parser.add_option("-a", "--abr", dest="abr", action="store_true")
parser.add_option("-b", "--bola", dest="bola", action="store_true")
parser.add_option("-B", "--bba0", dest="bba0", action="store_true")
parser.add_option("-X", "--bba2", dest="bba2", action="store_true")
parser.add_option("-p", "--pensieve", dest="pensieve", action="store_true")
parser.add_option("-g", "--gp", dest="gp", type="float", default=5,
help = 'Specify the (gamma p) product in seconds.')
parser.add_option("-s", "--buffer_size", dest="buffer_size", type="int", default=20,
help='Specify the buffer size in seconds')
parser.add_option("-C", "--bandwidthchangerscript", dest="bandwidth_changerscript_path", type="str",
default="./trigger_bandwidth_changer.sh", help='Specify the bandwidth changer script to trigger the remote program on server that runs tc on a network trace')
(options, args) = parser.parse_args()
if len(args) < 1:
parser.error("incorrect number of arguments")
# MPD url can be of the form http://10.128.0.33:5000/manifest.mpd
mpd_url = args[0]
base_dst = "download"
if len(args) >= 2:
base_dst = args[1]
download(options, mpd_url, base_dst=base_dst)
if __name__ == "__main__":
main()
# Python/Delta/Delta.py (jankupczyk/Proste-Programy-PY, MIT)
# Computes the delta (discriminant) of a quadratic equation
a = int(input("Podaj [a]:"))
b = int(input("Podaj [b]:"))
c = int(input("Podaj [c]:"))
d = b**2-4*a*c
if d > 0:
print("2 rozwiązania")
elif d == 0:
print("1 rozwiązanie")
else:
print("0 rozwiązań")
for i in range():
if i != 0:
print(i, end=" ")
# common/xrd-ui-tests-python/tests/xroad_global_groups_tests/XroadMemberRemoveFromGlobalGroup.py
# (nordic-institute/X-Road-tests, MIT)
import unittest
from helpers import auditchecker, xroad
from main.maincontroller import MainController
from tests.xroad_global_groups_tests import global_groups_tests
class XroadMemberRemoveFromGlobalGroup(unittest.TestCase):
"""
SERVICE_38 Remove an X-Road Member from a Global Group
RIA URL: https://jira.ria.ee/browse/XTKB-183
Depends on finishing other test(s): member add to global group
Requires helper scenarios:
X-Road version: 6.16.0
"""
def __init__(self, methodName='test_member_remove_from_global_group'):
unittest.TestCase.__init__(self, methodName)
def test_member_remove_from_global_group(self):
main = MainController(self)
cs_host = main.config.get('cs.host')
cs_user = main.config.get('cs.user')
cs_pass = main.config.get('cs.pass')
cs_ssh_host = main.config.get('cs.ssh_host')
cs_ssh_user = main.config.get('cs.ssh_user')
cs_ssh_pass = main.config.get('cs.ssh_pass')
group_name = main.config.get('cs.global_group')
log_checker = auditchecker.AuditChecker(cs_ssh_host, cs_ssh_user, cs_ssh_pass)
member_name = main.config.get('ss1.client_name')
member_code = xroad.split_xroad_id(main.config.get('ss1.client_id'))['code']
test_member_remove_from_global_group = global_groups_tests.test_member_remove_from_global_group(
    main, member_name, member_code, group_name, log_checker=log_checker)
try:
main.reload_webdriver(cs_host, cs_user, cs_pass)
test_member_remove_from_global_group()
except:
main.save_exception_data()
raise
finally:
main.tearDown()
# sentimental_analysis.py (akrakman/hackillinois, MIT)
import sys

from textblob import TextBlob


class ScaleUtilities:
    """Accumulates polarity scores and reports their running average."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def get_subjectivity_of(self, string):
        # Scale TextBlob's polarity ([-1, 1]) up to a [-5, 5] range.
        polarity = TextBlob(string).sentiment.polarity * 5
        self.count += 1
        self.total += polarity
        return polarity

    def average_opinion(self):
        if self.count == 0:
            print("You idiot")
            sys.exit(1)
        return self.total / self.count
# SeagateSenseCodes.py (ssmore98/siod, Unlicense)
seagate_sense_codes = {0: {0: {0: 'No error.', 'L1': 'No Sense', 'L2': 'No Sense'},
31: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Logical unit transitioning to another power condition'}},
94: {0: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Drive is in power save mode for unknown reasons'},
1: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle Condition Activated by timer'},
2: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Standby condition activated by timer'},
3: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle condition activated by host command'},
4: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Standby condition activated by host command'},
5: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle B condition activated by timer'},
6: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle B condition activated by host command'},
7: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle C condition activated by timer'},
8: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Idle C condition activated by host command'},
9: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Standby Y condition activated by timer'},
10: {0: 'No Specific FRU code.',
'L1': 'No Sense',
'L2': 'Standby Y condition activated by host command'}}},
1: {1: {0: {157: "Recovered Media Manager's anticipatory autoseek (ATS2) XFR error.",
'L1': 'Recovered Error',
'L2': 'No Index/Logical Block Signal'}},
3: {0: {0: 'FRU code comes from the contents of the lower 8-bit of the servo fault register (address 38h). A description of this register is attached at the end of this document.',
'L1': 'Recovered Error',
'L2': 'Peripheral Device Write Fault'}},
9: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Track Following Error'},
1: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Servo Fault'},
13: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Write to at least one copy of a redundant file failed'},
14: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Redundant files have < 50% good copies'},
248: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Calibration is needed but the QST is set without the Recal Only bit'},
255: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Servo cal completed as part of self-test'}},
11: {0: {6: 'Non Volatile Cache now is volatile',
48: 'Recovered Erase Error Rate Warning',
50: 'Recovered Read Error Rate Warning',
66: 'Recovered Program Error Rate Warning',
'L1': 'Recovered Error',
'L2': 'Recovered Error Rates Warnings'},
1: {0: 'Warning \xe2\x80\x93 Specified temperature exceeded.',
'L1': 'Recovered Error',
'L2': 'Warning \xe2\x80\x93 Specified temperature exceeded'},
2: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Warning, Enclosure Degraded'},
3: {0: 'Warning \xe2\x80\x93 Specified temperature exceeded.',
'L1': 'Recovered Error',
'L2': 'Warning \xe2\x80\x93 Flash temperature exceeded'},
4: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Warning - Flash Read Cache Capacity Degraded'},
6: {0: 'Warning \xe2\x80\x93 NVC now volatile. NVC specified temperature exceeded.',
'L1': 'Recovered Error',
'L2': 'Warning - Non-Volatile Cache now volatile'},
7: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Warning, spare sector margin exceeded. NVC_WCD disabled.'},
38: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Warning - Power loss management warning threshold exceeded'},
93: {0: 'Pre Warning.',
'L1': 'Recovered Error',
'L2': 'Pre-SMART Warning'},
225: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Drive is exiting RAW mode, returning from high temperature'},
226: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Drive is exiting RAW mode, returning from low temperature'},
241: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Drive is entering RAW mode due to high temperature'},
242: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Drive is entering RAW mode due to low temperature'}},
12: {1: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Write Error Recovered with Auto-Reallocation'},
2: {1: 'Data written with retries. Auto-reallocation failed.',
2: 'Data written with retries. Auto-reallocation failed with critical error, triggering Write Protection.',
'L1': 'Recovered Error',
'L2': 'Write Error Recovered, Auto-Reallocation failed'}},
17: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Unrecovered Read Error'}},
21: {1: {0: 'Mechanical Positioning Error.',
1: 'Mechanical positioning error - Recovered servo command.',
2: 'Mechanical positioning error - Recovered servo command during spinup.',
'L1': 'Recovered Error',
'L2': 'Mechanical Positioning Error'}},
22: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Data Synchronization Mark Error'}},
23: {1: {0: 'No specific FRU code.',
17: 'PRESCAN - Recovered data with error recovery.',
18: 'RAW - Recovered data with error recovery.',
'L1': 'Recovered Error',
'L2': 'Recovered Data Using Retries'},
2: {0: 'No specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Recovered Data Using Positive Offset'},
3: {0: 'No specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Recovered Data Using Negative Offset'}},
24: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with ECC, no retry attempted'},
1: {0: 'Recovered data with ECC and retries applied.',
1: 'F2 \xe2\x80\x93 Recovered data with ECC and retries applied, but did normal ECC during erasure correction step.',
2: 'BERP \xe2\x80\x93 Recovered data with BERP Erasure Recovery (Erasure Pointer) and retries applied.',
3: 'BERP \xe2\x80\x93 Recovered data with BERP Sliding Window and retries applied.',
4: 'BERP \xe2\x80\x93 Recovered data with BERP Extended Iterations and retries applied.',
5: 'BERP \xe2\x80\x93 Recovered data with BERP LLR Scaling and retries applied.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with ECC and Retries Applied'},
2: {0: 'No specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with ECC and/or Retries, Data Auto-Reallocated'},
3: {0: 'Data recovered with ATIC and retries applied.',
1: 'Data recovered with ATIC and retries applied, and auto-reallocated.',
2: 'Data recovered with ATIC and retries applied, and re-written.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with ATIC and Retries Applied'},
4: {0: 'Data recovered with SERV and retries applied.',
1: 'Data recovered with SERV and retries applied, and auto-reallocated.',
2: 'Data recovered with SERV and retries applied, and re-written.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with SERV and Retries Applied'},
5: {1: 'Data recovered. Auto-reallocation failed.',
2: 'Data recovered. Auto-reallocation failed with critical error, triggering Write Protection.',
'L1': 'Recovered Error',
'L2': 'Recovered Data with ECC and/or Retries, Auto-Reallocation Failed'},
7: {0: 'No specific FRU code.',
'L1': 'Recovered Error',
'L2': 'ECC and/or retries, data re-written'},
8: {0: 'Data Recovered with BIPS/SP',
1: 'Data Recovered with BIPS/SP, and auto-reallocated.',
2: 'Data Recovered with BIPS/SP, and re-written.',
3: 'Data Recovered with Intermediate Super Parity.',
4: 'Data Recovered with Intermediate Super Parity, and auto-reallocated.',
5: 'Data Recovered with Intermediate Super Parity, and re-written.',
'L1': 'Recovered Error',
'L2': 'Recovered Data With Intermediate Super Parity'},
9: {0: 'Recovered the sector found to be bad during IRAW scan with the IRAW process.',
'L1': 'Recovered Error',
'L2': 'Recovered the sector found to be bad during IRAW scan'}},
25: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Defect List Error'}},
28: {0: {0: 'Defect list not found.',
1: 'Invalid Defect list format data request.',
'L1': 'Recovered Error',
'L2': 'Defect List Not Found'}},
31: {0: {0: 'Partial defect list transfer.',
'L1': 'Recovered Error',
'L2': 'Number of defects overflows the allocated space that the Read Defect command can handle'}},
55: {0: {0: 'Parameter Rounded.',
1: 'Limit the BytesPerSector to Maximum Sector Size.',
2: 'Limit the BytesPerSector to Minimum Sector Size.',
3: 'Rounded the odd BytesPerSector.',
4: 'Parameter rounded in the mode page check.',
5: 'Rounded the VBAR size.',
'L1': 'Recovered Error',
'L2': 'Parameter Rounded'}},
63: {128: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Buffer Contents Have Changed'}},
64: {1: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'DRAM Parity Error'},
2: {128: 'Spinup error recovered with buzz retries.',
129: 'Spinup error recovered without buzz retries.',
'L1': 'Recovered Error',
'L2': 'Spinup Error recovered with retries'}},
68: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Internal Target Failure'}},
93: {0: {0: 'No Specific FRU code.',
1: 'Fail Max Recovered Error threshold during DST',
4: 'Reallocation.',
5: 'Reallocation AST table.',
6: 'Reallocation DDT table.',
8: 'Auto Reallocation Failure',
9: 'Impending Failure - Throughput Performance: Insufficient spare to back all NVC cache.',
10: 'NVC has failed to save Torn Write data more times than the threshold (currently 10)',
11: 'Failure Prediction Max Temperature Exceeded',
16: 'Hardware failure.',
20: 'Excessive reassigns.',
22: 'Start times failure.',
24: 'Instruction DBA error found during idle task. Fixed.',
32: 'General failure.',
39: 'SSD Early Retired Blocks Failure',
40: 'SSD Flash Life Left',
41: 'Exceeded time allocated to complete Zero Disk test (seq write)',
48: 'Recovered erase error rate (SSD-Jaeger only)',
49: 'Head failure.',
50: 'Recovered data error rate.',
55: 'Recovered TA.',
56: 'Hard TA event.',
64: 'Head flip.',
65: 'SSE (servo seek error).',
66: 'Write fault.',
67: 'Seek failure.',
68: 'Erase Error.',
69: 'Track following errors (Hit66).',
74: 'Seek performance failure.',
91: 'Spinup failure.',
96: 'Firmware Failure condition',
97: 'RVFF system failure.',
98: 'Gain adaptation failure.',
99: 'Fluid Dynamic Bearing Motor leakage detection test failed.',
100: 'Saving Media Cache Map Table (MCMT) to reserved zone failed.',
116: 'SED NOR Key store near to end of life',
117: 'Multiply threshold config.',
239: 'No control table on disk.',
'L1': 'Recovered Error',
'L2': 'Failure Prediction Threshold Exceeded'},
16: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'Failure Prediction Threshold Exceeded'},
255: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': 'False Failure Prediction Threshold Exceeded'}},
133: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': '5V threshold exceeded'}},
140: {0: {0: 'No Specific FRU code.',
'L1': 'Recovered Error',
'L2': '12V threshold exceeded'}}},
2: {4: {0: {0: 'Logical Unit Not Ready, Cause Not Reportable.',
1: 'No Specific FRU code.',
2: 'Logical unit not ready, Change Definition Command in progress.',
128: 'R/W system not ready.',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Cause Not Reportable'},
1: {0: 'No Specific FRU code.',
1: 'Logical unit not ready, excess particle sweep time required',
2: 'Wait for good power loss management status.',
'L1': 'Not Ready',
'L2': 'Logical Unit is in Process of Becoming Ready'},
2: {0: 'No Specific FRU code.',
1: 'Supply voltage levels not within spec',
'L1': 'Not Ready',
'L2': 'Drive Not Ready \xe2\x80\x93 START UNIT Required'},
3: {0: 'Logical Unit Not Ready, Manual Intervention Required.',
1: 'Logical unit not ready, manual intervention required, Servo doesn\xe2\x80\x99t support MDW Delta-L, Servo doesn\xe2\x80\x99t support VBAR.',
2: 'Logical unit not ready, manual intervention required \xe2\x80\x94CFW is configured for medium latency, but channel family is not capable of supporting medium latency.',
3: 'Logical unit not ready, R/W sub-system failure.',
4: 'Logical unit not ready, Drive is unmated',
5: 'Logical unit not ready, R/W sub-system init failed; Invalid Servo Firmware',
6: 'Logical unit not ready, R/W sub-system init failed; Invalid Servo Adaptives',
7: 'Logical unit not ready, Read failure on all System Data Table copies',
8: 'Logical unit not ready, Forward table read error during restore',
9: 'Logical unit not ready, Scram metadata read error during restore',
10: 'Logical unit not ready, GCU info restore failed',
11: 'Logical unit not ready, Defect table restore failed',
12: 'Logical unit not ready, Scram restore failed',
13: 'Logical unit not ready, Bxor Critical Failure',
15: 'Logical unit not ready, Media is corrupted but data successfully recovered via media scan',
16: 'Logical unit not ready, suspect list read error',
17: 'Logical unit not ready, reverse directory error during restore',
18: 'Logical unit not ready, system recovery format completed but drive lost critical system data',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Manual Intervention Required'},
4: {0: 'Logical unit not ready, Scram restore failed',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Format in Progress'},
7: {0: 'Logical unit not ready, Scram restore failed',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Operation in Progress'},
9: {0: 'Logical unit not ready, self-test in progress.',
1: 'Logical unit not ready, short background self-test in progress.',
2: 'Logical unit not ready, extended background self-test in progress.',
3: 'Logical unit not ready, short foreground self-test in progress.',
4: 'Logical unit not ready, extended foreground self-test in progress.',
5: 'Logical unit not ready, firmware download in progress.',
6: 'Logical unit not ready, initial volume download in progress.',
7: 'Logical unit not ready, session opened.',
8: 'No Specific FRU code',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, H2SAT measurement in Progress'},
12: {0: 'No Specific FRU code.',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, Field Adjustable Adaptive Fly Height (FAAFH) in progress'},
13: {7: 'Logical unit not ready, Session is already Open.',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, Session is already Open'},
17: {2: 'Logical Unit Not Ready, Notify (Enable Spinup) required',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Notify (Enable Spinup) required'},
26: {0: 'Logical Unit Not Ready, Start Stop Unit Command in Progress',
'L1': 'Not Ready',
'L2': 'Logical Unit Not Ready, Start Stop Unit Command in Progress'},
27: {0: 'Logical unit not ready, sanitize in progress.',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, sanitize in progress.'},
28: {0: 'No specific FRU code.',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, additional power use not yet granted'},
34: {1: 'Spindle Error (Spinup Failure)',
2: 'SMIF Training Failed after 5 attempts',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, power cycle required.'},
240: {0: 'Logical unit not ready, super certify in progress.',
1: 'Counterfeit attempt detected (ETF log SN or SAP SN mismatch)',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, super certify in progress'},
242: {0: 'Drive has been placed in special firmware due to assert storm threshold being exceeded.',
'L1': 'Not Ready',
'L2': 'Logical unit not ready, Assert Storm Threshold being exceeded.'}},
53: {2: {1: 'Enclosure not ready, no ENCL_ACK assert.',
'L1': 'Not Ready',
'L2': 'Enclosure Services Unavailable'}},
132: {0: {0: 'Remanufacturing State.',
'L1': 'Not Ready',
'L2': 'Remanufacturing State'}}},
3: {3: {0: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Peripheral Device Write Fault'}},
9: {0: {0: 'Track following error.',
254: 'Head flip during power cycle.',
255: 'Track following error.',
'L1': 'Medium Error',
'L2': 'Track Following Error'},
4: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Head Select Fault'}},
10: {1: {0: 'No Specific FRU code.',
1: 'Failed to write super certify log file from media backend',
'L1': 'Medium Error',
'L2': 'Failed to write super certify log file'},
2: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Failed to read super certify log file'}},
12: {0: {0: 'Peripheral device write fault.',
1: 'Write error during single sector recovery.',
2: 'Write error during gc relocation',
3: 'Write is revirtualized because of a channel error - SSD',
4: 'Read during preamp unsafe fault.',
5: 'Flash Media Request aborted due to Graceful Channel Reset',
7: 'Save SMART error log failed',
8: 'Write error from media backend',
9: 'SSD Uncorrectable Write Error from Media backend',
10: 'SSD Erase Error.',
11: 'SSD PSM Runt Buffer Page Mismatch Error',
17: 'Unrecovered read error on READ step of PRESCAN.',
18: 'Unrecovered read error on READ step of WRITE converted to WRITE VERIFY during RAW operation.',
22: 'Unrecovered read error on READ step of Read-modify-write',
128: 'BVD update error.',
129: 'BVD Correctable IOEDC error.',
'L1': 'Medium Error',
'L2': 'Write Error'},
2: {0: 'Unrecovered write error - Auto reallocation failed.',
1: 'Reallocate Block - Write alternate block failed, no servo defects.',
2: 'Reallocate Block - Alternate block compare test failed.',
3: 'Reallocate Block - Alternate block sync mark error.',
4: 'Reallocate Block - Maximum allowed alternate selection exhausted.',
5: 'Reallocate Block - Resource is not available for a repetitive reallocation.',
6: 'Reallocate Block Failed',
7: 'Reallocate Block Failed - Write Protect',
8: 'Write error, autoreallocation failed from media backend',
'L1': 'Medium Error',
'L2': 'Write Error – Auto Reallocation Failed'},
3: {0: 'Write Error – Recommend Reassignment',
'L1': 'Medium Error',
'L2': 'Write Error – Recommend Reassignment'},
4: {0: 'WORM Error - Invalid Overlapping Address Range.',
1: 'WORM Error - Written WORM Area Infringement.',
2: 'WORM Error - No further writes allowed on WORM drive',
3: 'WORM Error - SIM Registry Read Failed',
4: 'WORM Error - SIM Registry Write Failed',
5: 'WORM Error - Illegal write request after Lock',
'L1': 'Illegal Request',
'L2': 'Write Error - WORM'},
128: {3: 'Disc trace write (to clear it) failed 02.',
5: 'Disc trace write failed.',
6: 'Save UDS DRAM trace frames to disc failed.',
8: 'Write Long disc transfer failed.',
'L1': 'Medium Error',
'L2': 'Write Error – Unified Debug System'},
255: {1: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Write Error – Too many error recovery revs'}},
17: {0: {0: 'Unrecovered Read Error.',
1: 'Unrecovered Read Error, too many recovery revs.',
2: 'Unrecovered read error, could not read UDS save-on-update trace',
3: 'Unrecovered read error, could not read UDS finished frame index file',
4: 'Unrecovered read error, could not read SMART to include it in UDS',
5: 'Unrecovered read error, UDS read of a finished frame file failed',
6: 'Read SMART error log failed',
7: 'Unrecovered read BBM on partial reallocation',
8: 'Unrecovered read error on media backend',
9: 'SSD Unrecovered Read Error due to ECC Failure',
10: 'SSD Unrecovered Read Error due to Corrupt Bit',
11: 'Unrecovered read flagged error, created with flagged write uncorrectable command',
12: 'Unrecovered read error due to flash channel reset',
128: 'Read during preamp unsafe fault.',
129: 'EDAC HW uncorrectable error.',
130: 'EDAC overrun error.',
131: 'LBA corrupted with Write Long COR_DIS mode.',
132: 'LBA was in Media cache, hardened upon unrec. read error during cleaning',
133: 'EDAC HW uncorrectable error, Super parity or ISP valid and parity recovery attempted.',
134: 'EDAC HW uncorrectable error, Super parity and ISP Invalid.',
160: 'Read preamp unsafe fault with short/open fault set',
'L1': 'Medium Error',
'L2': 'Unrecovered Read Error'},
4: {0: 'Unrecovered Read Error – Auto Reallocation Failed',
128: 'Write alternate block failed, no servo defects.',
129: 'Alternate block compare test failed.',
130: 'Alternate block sync mark error.',
131: 'Maximum allowed alternate selection exhausted.',
132: 'Resource is not available for a repetitive reallocation.',
133: 'SERV HW EDAC failure.',
134: 'SERV SID failure.',
135: 'Number of reallocation pending Super Block exceeded limit.',
136: 'Reallocation pending sector encountered during Super Block read.',
137: 'Reallocation pending sector encountered during Super Block write.',
'L1': 'Medium Error',
'L2': 'Unrecovered Read Error – Auto Reallocation Failed'},
20: {0: 'Unrecovered read error - read pseudo-unrecovered from a WRITE LONG',
'L1': 'Medium Error',
'L2': 'Unrecovered Read Error – LBA marked bad by application'},
255: {1: 'Unrecovered read error - time limit exceeded.',
'L1': 'Medium Error',
'L2': 'Unrecovered Read Error – Too many error recovery revs'}},
20: {1: {0: 'Record Not Found.',
128: 'Search exhaust error or congen mode page directory not found',
129: 'Reallocation LBA is restricted from write access or congen compressed XML not found',
130: 'Reallocation LBA is restricted from read access.',
131: 'Read from or Write to log page data on reserved zone failed.',
'L1': 'Medium Error',
'L2': 'Record Not Found'}},
21: {1: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Mechanical Positioning Error'},
3: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Unrecovered write errors due to grown servo flaws'}},
22: {0: {0: 'Data synchronization mark error.',
128: 'Data sync timeout error.',
129: 'Formatter FIFO parity error 01.',
130: 'Formatter FIFO parity error 02.',
131: 'Super Sector - Data sync timeout error',
132: 'Disc Xfr - Data sync timeout error on sector splits',
'L1': 'Medium Error',
'L2': 'Data Synchronization Mark Error'},
1: {0: 'Data missed sync mark error. FRU code is a bit mask indicating which fragment(s) have a missed sync error. Bit n represents fragment n.',
'L1': 'Medium Error',
'L2': 'Data Synchronization Mark Error'}},
49: {0: {0: 'Medium Format Corrupted.',
1: 'Corruption result of a Mode Select command.',
2: 'Corruption result of a sparing changed condition.',
3: 'Corruption the result of a failed LBA pattern write in Format command.',
4: 'Corruption result of failed user table recovery.',
5: 'Corruption of NVC global header',
6: 'Medium format corrupted from media backend',
7: 'Medium Format Corrupt due to flash identify failure',
8: 'Medium Format Corrupt as a result of failed NVC write or invalid NVC meta data',
9: 'Medium Format Corrupt due to NVC WCD Meta data corruption',
10: 'Media Format Corrupt due to Write failure during MC WCD data restore to disc',
11: 'Medium Format Corrupt because NVC did not burn flash',
12: 'Medium Format Corrupt because Pseudo Error Masks lost after power loss',
13: 'Medium Format Corrupt because unexpected Media Cache Segment Sequence Number',
14: 'Medium Format Corrupt due to system reset without saving NVC data',
15: 'Medium Format Corrupt due to firmware reset (jump to 0) without saving NVC data',
16: 'Medium Format Corrupt due to incomplete burn but no actual power loss',
18: 'Medium Format Corrupt because watchdog timer reset',
19: 'Medium Format Corrupted On Assert (intentionally)',
20: 'Medium Format Corrupt due to IP timeout during SCRAM',
21: 'Medium Format Corrupt due to Write Parameter Error',
22: 'Medium Format Corrupt due to GCU Metadata Error',
24: 'Medium Format Corrupt due to System Metadata restore failure',
32: 'Format Corrupt due to Download changes.',
34: 'Scram user NVC restore failed',
'L1': 'Medium Error',
'L2': 'Medium Format Corrupted'},
1: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Corruption in R/W format request.'},
3: {0: 'Sanitize Command Failed',
1: 'Sanitize Command Failed due to file access error',
'L1': 'Medium Error',
'L2': 'Sanitize Command Failed'},
145: {13: 'Corrupt WWN in drive information file.',
'L1': 'Medium Error',
'L2': 'Corrupt WWN in drive information file'}},
50: {1: {0: 'Defect list update failure.',
128: 'Failed to save defect files.',
129: 'Failed to save defect files post format 01.',
130: 'Failed to save defect files post format 02.',
131: 'Failed to save defect files post format 03.',
'L1': 'Medium Error',
'L2': 'Defect List Update Error'},
3: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Defect list longer than allocated memory.'}},
51: {0: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Flash not ready for access.'}},
68: {0: {0: 'No Specific FRU code.',
'L1': 'Medium Error',
'L2': 'Internal Target Failure'}},
93: {0: {1: 'Max Unrecovered Read Error',
'L1': 'Medium Error',
'L2': 'Unrecovered Read Error'}}},
4: {1: {0: {0: 'No index or sector pulses found.',
128: 'Spin up - Media Manager error encountered.',
129: 'Data field timeout error.',
130: "Media Manager's TDT FIFO Counter error.",
131: "Media Manager's Servo Counter error.",
132: "Media Manager's Latency error.",
133: "Media Manager's Index error.",
134: "Media Manager's Servo error.",
135: 'Media Manager errors could not be cleared successfully.',
136: 'Clearing of MM errors due to a servo error failed.',
137: 'SWCE/SGate overlap error.',
138: 'Servo gate timeout error 01.',
139: 'Servo gate timeout error 02.',
140: 'Servo gate timeout error 03.',
141: 'Servo gate timeout error 04.',
142: 'Servo gate timeout error 05.',
143: 'Super Sector - Handshake error.',
144: 'Super Sector - Servo gate timeout error 01.',
145: 'Super Sector - Servo gate timeout error 02.',
146: 'Super Sector - Servo gate timeout error 03.',
147: 'Super Sector - Servo gate timeout error 04.',
148: 'Super Sector - Servo gate timeout error 05.',
149: 'Servo gate timeout error during generation of Aseek Req.',
150: 'BVD check timeout error.',
151: 'NRZ sequencer completion timeout error.',
152: 'Sequencer timeout on Media Manager event.',
153: 'NRZ xfr error on Media Manager event.',
154: 'Disc sequencer handshake error.',
155: 'Medium Latency channel synchronization handshake error.',
156: 'Fast dPES missed servo sample error.',
157: "Media Manager's anticipatory autoseek (ATS2) XFR error.",
158: 'When a reassigned sector is encountered, wait for the NRZ to finish the previous sector',
159: 'Fast IO Data Collection out of sync with sequencer',
160: 'Channel not ready rev count exhausted. Apply to LDPC LLI channels',
161: "Media Manager's anticipatory autoseek (ATS2) Servo error.",
162: "Media Manager's anticipatory autoseek (ATS2) Disc Pause Condition",
163: 'BERP infinite loop condition',
164: 'Brownout fault detected during write transfer.',
165: 'Sequencer completion timeout error at reassigned sector.',
166: 'Sequencer S-gate timeout error during start of sector read.',
167: 'Sequencer S-gate timeout error during skipping of a new sector.',
'L1': 'Hardware Error',
'L2': 'No Index/Logical Block Signal'}},
3: {0: {2: 'Gated Channel Fault',
3: 'Write Preamp Unsafe Fault',
4: 'Write Servo Unsafe Fault',
5: 'Read/write channel fault.',
6: 'SFF fault.',
7: 'Write servo field fault.',
8: 'Write Servo unsafe fault.',
9: 'SSD: Peripheral device write fault (flush cache failed)',
16: 'Write Servo sector fault.',
32: 'Read/Write channel fault.',
64: 'Servo fault.',
128: 'Detect of new servo flaws failed.',
129: 'PSG environment fault.',
130: 'Shock event occurred.',
131: 'Unexpected Extended WGATE fault.',
132: 'Channel detected fault during write.',
133: 'Disc locked clock fault detected.',
134: 'Skip Write Detect Dvgas fault',
135: 'Skip Write Detect Rvgas fault',
136: 'Skip Write Detect Fvgas fault',
137: 'Skip Write Detect Dvgas+Rvgas+Fvgas sum threshold exceeded - last SWD fault Dvgas',
138: 'Skip Write Detect Dvgas+Rvgas+Fvgas sum threshold exceeded - last SWD fault Rvgas',
139: 'Skip Write Detect Dvgas+Rvgas+Fvgas sum threshold exceeded - last SWD fault Fvgas',
140: 'Drive free-fall event occurred',
141: 'Large Shock event occurred',
144: 'NRZ Write Parity fault.',
145: 'Marvell 8830 TBG Unlock fault.',
146: 'Marvell 8830 WClk Loss fault.',
147: 'EBMS Fault Detect(EFD) Contact fault during write.',
148: 'EBMS Fault Detect(EFD) Contact fault during read.',
149: 'EBMS Fault Detect(EFD) SWOT fault.',
150: 'Marvell SRC SFG Unlock fault.',
255: "LSI 6 channel preamp attempting to write without heat, condition detected by servo and passed as servo fault (i.e. preamp error condition indicated by servo fault condition). This should be recovered by a 'seek away' performed as part of the recovery step",
'L1': 'Hardware Error',
'L2': 'Peripheral Device Write Fault'}},
9: {0: {0: 'Servo track following error.',
64: 'Servo fault, Normally 04/0900/80 would be changed to 04/0900/40 by the firmware.',
128: 'Servo fault, Normally 04/0900/80 would be changed to 04/0900/40 by the firmware.',
129: 'Servo unsafe fault during write.',
130: 'EDAC block address error.',
131: 'Missing MDW information reported by servo detection.',
132: 'Servo command timed out.',
133: 'Seek command timed out.',
134: 'Seek exceeded recovery time limit.',
135: 'Service drive free fall condition timed out.',
136: 'The altitude has exceeded the limit',
137: 'Seek command timed out on alternate sector',
138: 'Super Block marked dirty.',
139: 'Verify of Super Block data failed.',
140: 'Servo fatal error indicated',
141: 'Super Parity long word Cross Check error',
142: 'Super Parity low word Cross Check error',
143: 'Super Parity high word Cross Check error',
144: 'Super Parity data miscompare',
145: 'Invalid Anticipatory Track Seek request.',
146: 'Enhanced Super parity regeneration failure.',
147: 'Super parity regeneration failure.',
'L1': 'Hardware Error',
'L2': 'Track Following Error'},
1: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Servo Fault'},
4: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Head Select Fault'},
255: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Servo cal failed as part of self-test'}},
21: {1: {0: 'Mechanical Positioning Error.',
128: 'Servo error encountered during drive spin-up.',
129: 'Servo error encountered during drive spin-down.',
130: 'Spindle failed error.',
131: 'Unrecovered seek error encountered.',
132: 'Servo command failed.',
133: 'Servo heater timing failed.',
134: 'Servo Free-Fall Protection command failed.',
135: 'Servo Disc Slip Full TMFF recalibration failed.',
136: 'Servo Disc Slip Head Switch Timing recalibration failed.',
137: 'Servo Disc Slip Head Switch Track recalibration failed.',
138: 'Servo read heat fast I/O command failed.',
139: 'Spin-up attempt during G2P merge process failed.',
140: 'Spin-down attempt during PList processing failed.',
141: 'Spin-up attempt during PList processing failed.',
'L1': 'Hardware Error',
'L2': 'Mechanical Positioning Error'}},
22: {0: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Data Synchronization Mark Error'}},
25: {0: {0: 'Defect list error.',
1: 'Save user table error.',
2: 'SIM file transfer error.',
3: 'Persistent Reserve Save to disc fail.',
4: 'Defect list error media backend',
128: 'Format - Recover of saved Grown DST file failed.',
129: 'Recovery of saved Non-Resident DST failed.',
130: 'Clear R/W Slip List - Save of R/W Operating Parameters file failed.',
131: 'Restore Alt List File From media - Failed restoration from media file.',
132: 'Save of Servo Disc Slip Parms to media failed.',
133: 'Read of Servo Disc Slip Parms from media failed 1.',
134: 'Read of Servo Disc Slip Parms from media failed 2.',
135: 'Servo Disc Slip file - invalid format revision.',
136: 'GList to PList - Recover of saved Grown DST file failed.',
137: 'Clear Non-resident Grown DST - Save to media failed.',
'L1': 'Hardware Error',
'L2': 'Defect List Error'}},
28: {0: {0: 'Defect list not found.',
1: 'Defect list processing error.',
2: 'Read Manufacturing Info File failure.',
3: 'Read Manufacturing Info File failure.',
4: 'Defect list not found from media backend',
50: 'Read Manufacturing Info File failure.',
52: 'Read Manufacturing Info File failure.',
129: 'Failure to read Primary Defects file for reporting.',
130: 'Invalid entry count in Plist file.',
131: 'Invalid byte extent value in Plist entry.',
132: 'Process Defect Lists - Sort error due to invalid offset.',
133: 'Process Defect Lists - Sort error due to invalid head.',
134: 'Process Defect Lists - Sort error due to invalid cylinder.',
135: 'Process Defect Lists - Unable to recover the Primary Defect files.',
136: 'Failed to seek to defect files for reassign.',
137: 'Failed to seek to defect files for undo-reassign.',
138: 'Failure to write defects report lists file to media.',
139: 'Read of defects report file from media failed.',
140: 'An invalid defects report file is encountered 01.',
141: 'An invalid defects report file is encountered 02',
142: 'Restore of R/W User Operating Parameters file failed.',
143: 'Invalid Primary Servo Flaws data encountered.',
144: 'Failed to save defect files due to miscompare error.',
146: 'PList overflow error while merging PSFT and PList for reporting.',
147: 'Maximum certify passes of a zone exceeded.',
148: 'Maximum write passes of a zone exceeded.',
149: 'Primary Servo Flaws data retrieval - Unable to read file on disc.',
150: 'Primary Servo Flaws data retrieval - Invalid entry count in file.',
151: 'Defective Sectors List data retrieval - Unable to read file on disc.',
152: 'Defective Sectors List data retrieval - Invalid file header data.',
153: 'PList data retrieval - Invalid entry count in Plist file.',
154: 'PList data retrieval - Unable to read Plist file on disc.',
155: 'System Format - invalid entry count.',
156: 'Primary TA data retrieval - Unable to read file on disc.',
157: 'Primary TA data retrieval - Invalid count.',
158: 'Primary TA data retrieval - Invalid sort.',
159: "Process Defect Lists - Defect doesn't exist in audit space.",
160: 'Retrieve Defects Report List - Not All Entries Available',
161: 'Format - Invalid LBA range in PVT before update of dirty blocks.',
162: 'Format - Invalid Parity Validity Table after clean of dirty blocks.',
163: 'Format - Clean of dirty blocks failed.',
164: 'Format - Save of Parity Validity Table to media failed.',
'L1': 'Hardware Error',
'L2': 'Defect List Not Found'}},
38: {48: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Writing to the Flash Failed'},
49: {0: 'Failed to program PIC code with new firmware',
'L1': 'Hardware Error',
'L2': 'Writing to the PIC Failed'}},
41: {0: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Flashing LED occurred.'}},
50: {0: {8: 'No defect spare location available',
9: 'No defect spare location available for reassign block.',
128: 'Processing of pending reallocation failed.',
129: 'Failed to insert defect to DST.',
130: 'Failed to insert Plist defect to DST.',
131: 'Grown DST file full 01.',
132: 'Grown DST file full 02.',
133: 'Resident DST file full.',
134: 'Failed to insert defective sectors associated w/grown servo flaw.',
135: 'Failure to invalidate Defects Report disc files.',
136: 'Format System Partition – Failed to insert defective system sectors associated w/ grown servo flaw.',
137: 'Format System Partition – Failed to insert defective system sectors.',
138: 'Format System Partition – System Defects file full.',
139: 'Process Defect Lists – Failed to insert a client specified defect in the defect file.',
140: 'ASFT – Max # of servo flaws per track exceeded (path #1).',
141: 'ASFT – Max # of servo flaws per track exceeded (path #2).',
142: 'ASFT full (path #1).',
143: 'ASFT full (path #2).',
144: 'Addition to Reassign Pending List failed.',
145: 'Resource is not available for a new reallocation.',
146: 'No alternates available (path #1).',
147: 'Failed to insert defective sectors associated w/grown servo flaw.',
148: 'Failed to deallocate compromised defects.',
149: 'Format System Partition \xe2\x80\x93 Failed to deallocate compromised.',
150: 'Insertion of DDT entry failed.',
151: 'Compressed DDT file full.',
152: 'Format – Failed to insert defective sectors associated w/primary servo flaw.',
153: 'Defective Tracks List – Failed to insert grown defective sectors associated with defective track.',
154: 'Defective Tracks List – Failed to insert primary defective sectors associated with defective track.',
155: 'Defective Tracks List – Failed to add new entry to list.',
156: 'Reallocate Block – Resource is not available for a partial reallocation.',
157: 'Resource is not available for a partial reallocation.',
158: 'Not enough non-defective sectors to allocate for BIPS parity Sectors.',
159: 'BIPS defect table operation failed – case 1.',
160: 'BIPS defect table operation failed – case 2.',
161: 'Format – Failed to add defective track to DST.',
162: 'Format – Failed to allocate spare sectors.',
163: 'Pad and Fill Defects – Max number of skipped tracks exceeded.',
164: 'Format – Failed to allocate spare sectors.',
165: 'Format – More LBAs than PBAs.',
166: 'Format – Failed to allocate spare sectors.',
167: 'Format – Failed to allocate spare sectors.',
168: 'Format – Failed to allocate spare sectors.',
169: 'Format – Excessive number of slips not supported by hardware.',
170: 'Invalid HW parity data for parity sector reallocation.',
171: 'Format – Could not allocate required guard/pad around media cache area on disc',
172: 'Format – Will not be able to save ISP/MC metadata after the format (a misconfiguration that is problematic to address before this point)',
173: 'Format - Failed to allocate spare sectors.',
174: 'Format - Failed to allocate spare sectors.',
175: 'Format - Failed to allocate spare sectors.',
176: 'Format - Failed to allocate spare sectors.',
177: 'Format - Failed to update parity sectors slip list.',
178: 'Format - Invalid track sector range encountered.',
179: 'Format – Could not allocate required guard/pad around intermediate super parity area on disc',
180: 'Format – Media Cache starting DDT entry not found.',
193: 'Format - Attempt to add pad between user area and start/end of Distributed Media Cache failed',
'L1': 'Hardware Error',
'L2': 'DMC area padding failed'},
1: {0: 'Defect list update failure.',
23: 'Saving of the ASFT during idle time failed',
129: 'Plist file overflow error.',
130: 'PSFT file overflow error.',
131: 'Unable to write defect files.',
132: 'Unable to update operating parms file.',
133: 'Plist file overflow error.',
134: 'Plist file overflow error.',
'L1': 'Hardware Error',
'L2': 'Defect List Update Error'}},
53: {0: {8: 'LIP occurred during discovery.',
9: 'LIP occurred during an 8067 command.',
10: 'LIP occurred during an 8045 read.',
11: 'LIP occurred during an 8067 read.',
12: 'LIP occurred during an 8067 write.',
13: 'Parallel ESI deasserted during discovery.',
14: 'Parallel ESI deasserted during an 8067 command.',
15: 'Parallel ESI deasserted during an 8045 read.',
16: 'Parallel ESI deasserted during an 8067 read.',
17: 'Parallel ESI deasserted during an 8067 write.',
'L1': 'Hardware Error',
'L2': 'Unspecified Enclosure Services Failure'},
3: {2: 'Enclosure found but not ready – No Encl_Ack Negate.',
4: 'Read Data Transfer Enclosure Timeout.',
5: 'Write Data Transfer Enclosure Timeout.',
13: 'Read Data Transfer Bad Checksum.',
14: 'Write Data Transfer Enclosure Timeout.',
15: 'Read Data Transfer Enclosure Timeout.',
'L1': 'Hardware Error',
'L2': 'Enclosure Transfer Failure'},
4: {4: 'Read Data Transfer Refused by Enclosure.',
5: 'Write Data Transfer Refused by Enclosure.',
'L1': 'Hardware Error',
'L2': 'Enclosure Transfer Refused'}},
62: {3: {0: 'No Specific FRU code.',
1: 'Logical Unit Failed Self Test – TestWriteRead',
2: 'Logical Unit Failed Self Test – TestRandomRead',
3: 'Logical Unit Failed Self Test – ScanOuterDiameter',
4: 'Logical Unit Failed Self Test – ScanInnerDiameter',
5: 'Logical Unit Failed Self Test from media backend',
'L1': 'Hardware Error',
'L2': 'Logical Unit Failed Self Test'},
4: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'H2SAT Foreground Test Failure'}},
64: {0: {128: 'Format – Exceeded max number of track rewrites during certify retries.',
'L1': 'Hardware Error',
'L2': 'Miscellaneous Error'},
1: {0: 'Buffer memory parity error.',
1: 'Buffer FIFO parity error.',
2: 'IOEDC error.',
3: 'VBM parity error.',
'L1': 'Hardware Error',
'L2': 'DRAM Parity Error'},
145: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Cryptographic Hardware Power-On Self-Test Failure'},
146: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Cryptographic Algorithm Power-On Self-Test Failure'},
147: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Conditional Random Number Generation Self-Test Failure'},
148: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Hidden Root Key Error During Command Execution'},
149: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Entropy Power-On Self-Test Failure'},
150: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Conditional Entropy Self-Test Failure'},
151: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Boot Firmware SHA-256 Power-On Self-Test Failure'},
152: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Boot Firmware RSA Power-On Self-Test Failure'}},
66: {0: {0: 'Power-on or self-test failure.',
1: 'DST failure.',
2: 'SIM Spinning-up state transition failure.',
3: 'SIM Drive Initialization state transition failure.',
4: 'Read/write thread initialization failed.',
20: 'DIC exceeds the time limits consecutively over count limit',
'L1': 'Hardware Error',
'L2': 'Power-On or Self-Test Failure'},
10: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Port A failed loopback test'},
11: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Port B failed loopback test'}},
68: {0: {0: 'Internal target failure',
1: 'Self test buffer test failure.',
2: 'Write read test failure.',
3: 'Data sync timeout error.',
4: 'SSD Test access error (DST)',
5: 'Backend (SSD) DRAM failure (DST)',
6: 'Backend (SSD) SRAM failure (DST)',
7: 'Internal target failure META data test',
8: 'Internal target failure System Area Check',
9: 'Wait for ACP to get ready for reset took too long',
10: 'Wait for AMP to get ready for reset took too long',
11: 'FDB Motor Leakage detected.',
12: 'Wait for I2C to be available took too long',
128: 'Write during preamp unsafe fault.',
129: 'Read write channel fault.',
130: 'Small form factor fault.',
131: 'Write during servo field fault.',
132: "Media Manager's TPBA FIFO Counter error.",
133: "Media Manager's TPBA FIFO Under-run error.",
134: "Media Manager's DDT FIFO Counter error.",
135: "Media Manager's DDT FIFO Under-run error.",
136: "Media Manager's Parity error.",
137: "Media Manager's TDT FIFO Under-run error.",
138: "Media Manager's Skip Mask Under-run error.",
139: 'Get Temperature request resulted in invalid temperature.',
140: 'Detected unsupported H/W in a Set Voltage Margin request.',
141: 'Unused Error Code.',
142: 'SMART Initial buffer not ready',
143: 'Formatter EDAC correction memory parity error.',
144: 'NX – RLL1 error.',
145: 'Disc Buffer parity error.',
146: 'Sequencer encountered an EXE/SGATE overlap error.',
147: 'Formatter Correction Buffer underrun error.',
148: 'Formatter Correction Buffer overrun error.',
149: 'Formatter detected NRZ interface protocol error.',
150: "Media Manager's MX Overrun error.",
151: "Media Manager's NX Overrun error.",
152: "Media Manager's TDT Request error.",
153: "Media Manager's SST Overrun error.",
154: 'Servo PZT calibration failed.',
155: 'Fast I/O- Servo Data Update Timeout error.',
156: 'Fast I/O- First wedge Servo data Timeout error.',
157: 'Fast I/O- Max samples per collection exceeded.',
158: 'CR memory EDC error',
159: 'SP block detected an EDC error',
160: 'Preamp heater open/short fault.',
161: 'RW Channel fault- Memory buffer overflow or underflow or parity error during write.',
162: 'RW Channel fault- Memory buffer overflow or read data path FIFO underflow in legacy NRZ mode.',
163: 'RW Channel fault- Preamp fault during R/W.',
164: 'RW Channel fault- SGATE, RGATE, or WGATE overlap.',
165: 'RW Channel fault- Mismatch in split sector controls or sector size controls.',
166: 'RW Channel fault- Write clock or NRZ clock is not running.',
167: 'RW Channel fault- SGATE, RGATE, or WGATE asserted during calibration.',
168: 'RW Channel fault- RWBI changed during a read or write event.',
169: 'RW Channel fault- Mode overlap flag.',
170: 'RW Channel fault- Inappropriate WPLO or RPLO behavior.',
171: 'RW Channel fault- Write aborted.',
172: 'RW Channel fault- Bit count late.',
173: 'RW Channel fault- Servo overlap error',
174: 'RW Channel fault- Last data fault',
176: 'PES threshold in field is too far from the same value calculated in the factory.',
177: 'Not enough Harmonic Ratio samples were gathered',
178: 'Sigma of Harmonic Ratio samples after all discards exceeded the limit',
179: 'No EBMS contact fault, even at lowest threshold value',
180: 'EBMS fault still detected at highest threshold value',
181: 'Formatter detected BFI error.',
182: 'Formatter FIFO Interface error.',
183: 'Media sequencer- Disc sequencer Data transfer size mismatch.',
184: 'Correction buffer active while disc sequencer timeout error (this error code is used to fix the hardware skip mask read transfer issue).',
185: 'Seagate Iterative Decoder - Channel RSM fault',
186: 'Seagate Iterative Decoder - Channel WSM fault',
187: 'Seagate Iterative Decoder - Channel BCI fault',
188: 'Seagate Iterative Decoder - Channel SRC fault',
189: 'Seagate Iterative Decoder - Channel SAB fault',
190: 'Seagate Iterative Decoder - Channel read gate overflow error',
192: 'Seagate Iterative Decoder - Channel SMB Bus B parity error',
193: 'Seagate Iterative Decoder - Channel SMB buffer error on write',
194: 'Seagate Iterative Decoder - Channel SOB buffer error on write',
195: 'Seagate Iterative Decoder - Channel SOB parity error',
196: 'Seagate Iterative Decoder - Channel SAB buffer error',
197: 'Seagate Iterative Decoder - Channel SAB bend error',
198: 'Seagate Iterative Decoder - Channel LLI buffer sync error',
199: 'Seagate Iterative Decoder - Channel LLI data length error on write',
200: 'Seagate Iterative Decoder - Channel LLI framing error on write',
201: 'Seagate Iterative Decoder - Channel LLI write status error',
202: 'Seagate Iterative Decoder - Channel LLI pipe state error (Bonanza), - Channel RSM Gross Error (Caribou- Luxor)',
203: 'Seagate Iterative Decoder - Channel decoder microcode error',
204: 'Seagate Iterative Decoder - Channel encoder microcode error',
205: 'Seagate Iterative Decoder - Channel NRZ parity error',
206: 'Seagate Iterative Decoder - Symbols per Sector mismatch error',
207: 'Seagate Iterative Decoder - Channel SMB Bus A parity error',
208: 'Seagate Iterative Decoder - Channel SMB NRZ parity error',
209: 'Seagate Iterative Decoder - Channel SOB Buffer error on read',
210: 'Seagate Iterative Decoder - Channel SMB Buffer error on read',
211: 'Seagate Iterative Decoder - Channel LLI data length error on read',
212: 'Seagate Iterative Decoder - Channel LLI framing error on read',
217: 'Seagate Iterative Decoder - Channel WSM Gross error',
218: 'Seagate Iterative Decoder - Channel ERF buffer error',
224: 'Preamp low voltage error',
225: 'Preamp low write data frequency at common point error',
226: 'Preamp write head open error',
227: 'Preamp write head shorted to ground error',
228: 'Preamp TA sensor open error',
229: 'Preamp temperature error',
230: 'Preamp write without heat error',
231: 'Preamp writer off in write error',
232: 'Preamp writer output buffer error',
233: 'Preamp low write data frequency at the head error',
234: 'Preamp FOS error',
235: 'Preamp TA or contact detect error',
236: 'Preamp SWOT error',
237: 'Preamp serial port communication error',
238: 'HSC magnitude overflow error',
240: 'RW Channel - RDATA valid overlap fault',
241: 'RW Channel - RD valid gap fault',
244: 'RW Channel - W Parity not ready',
247: 'RW Channel - Wrong sector length',
248: 'RW Channel - Encoder overflow error',
249: 'RW Channel - Encoder early termination fault',
250: 'RW Channel - Iteration parameter error',
251: 'RW Channel - MXP write fault',
252: 'RW Channel - Symbol count error',
253: 'RW Channel - RD Incomplete error',
254: 'RW Channel - RD Data VGA error',
255: 'RW Channel - RD Data TA error',
'L1': 'Hardware Error',
'L2': 'Internal Target Failure'},
1: {2: 'RW Channel - RFM Wrong sector length',
3: 'RW Channel - RFM FIFO underflow',
4: 'RW Channel - RFM FIFO Overflow',
5: 'RW Channel - Vector flow errors',
32: 'HSC - An error occurred when attempting to open the file to be used for Harmonic Sensor Circuitry data collection.',
33: 'HSC - The Standard Deviation of the VGAS data collected by the Harmonic Sensor Circuitry was zero.',
34: 'HSC - The Standard Deviation of the 3rd Harmonic data collected by the Harmonic Sensor Circuitry was zero.',
35: 'HSC - The Servo Loop Code returned at the completion of Harmonic Sensor Circuitry data collection was not 0.',
36: 'HSC - An invalid write pattern was specified for Harmonic Sensor Circuitry data collection.',
37: 'AR Sensor - The AR Sensor DAC to Target calculation encountered the need to take the square root of a negative value.',
38: 'AR Sensor - The AR Sensor encountered an error when attempting to open the Background Task file.',
39: 'AR Sensor - The AR Sensor encountered an error when attempting to open the General Purpose Task file.',
40: "AR Sensor - The size of the Background Task file is inadequate to satisfy the AR Sensor's requirements.",
41: "AR Sensor - The size of the General Purpose Task file is inadequate to satisfy the AR Sensor's requirements.",
42: 'AR Sensor - The FAFH Parameter File revision is incompatible with the AR Sensor.',
43: 'AR Sensor - The AR Sensor Descriptor in the FAFH Parameter File is invalid.',
44: 'AR Sensor - The Iterative Call Index specified when invoking the AR Sensor exceeds the maximum supported value.',
45: 'AR Sensor - The AR Sensor encountered an error when performing a Track Position request.',
46: 'AR Sensor - The Servo Data Sample Count specified when invoking the AR Sensor exceeds the maximum supported value.',
47: 'AR Sensor - The AR Sensor encountered an error when attempting to set the read channel frequency.',
48: 'AR Sensor - The 3rd Harmonic value measured by the AR Sensor was 0.',
96: 'RW Channel - LOSSLOCKR fault',
97: 'RW Channel - BLICNT fault',
98: 'RW Channel - LLI ABORT fault',
99: 'RW Channel - WG FILLR fault',
100: 'RW Channel - WG FILLW fault',
101: 'RW Channel - CHAN fault',
102: 'RW Channel - FRAG NUM fault',
103: 'RW Channel - WTG fault',
104: 'RW Channel - CTG fault',
105: 'RW Channel - NZRCLR fault',
106: 'RW Channel - Read synthesizer prechange fail fault',
107: 'RW Channel - Servo synthesizer prechange fail fault',
108: 'RW Channel - Servo Error detected prior to halting Calibration Processor',
109: 'RW Channel - Unable to Halt Calibration Processor',
110: 'RW Channel - ADC Calibrations already disabled',
111: 'RW Channel - Calibration Processor Registers have already been saved',
112: 'RW Channel - Address where Calibration Processor Registers are to be saved is invalid',
113: 'RW Channel - Array for saving Calibration Processor Register values is too small',
114: 'RW Channel - Calibration Processor Register values to be used for AR are invalid',
115: 'RW Channel - Synchronous abort complete fault',
116: 'RW Channel - Preamble length fault',
117: 'RW Channel - TA or media defect event fault',
118: 'RW Channel - DPLL frequency overflow/underflow fault',
119: 'RW Channel - Zero gain threshold exceeded fault',
120: 'RW Channel - DPLL frequency deviation fault',
121: 'RW Channel - Extended EVGA overflow/underflow fault',
128: 'RW Channel - Read VGA gain fault',
129: 'RW Channel - Acquire Peak Amplitude flag fault',
130: 'RW Channel - Massive drop-out fault',
131: 'RW Channel - Low Quality sync mark fault',
132: 'RW Channel - NPLD load error fault',
133: 'RW Channel - Write path memory fault status bit fault',
134: 'RW Channel - WRPO disabled fault',
135: 'RW Channel - Preamble quality monitor fault',
136: 'RW Channel - Reset detection flag fault',
137: 'RW Channel - Packet write fault',
138: 'RW Channel - Gate command queue overflow fault',
139: 'RW Channel - Gate command queue underflow fault',
140: 'RW Channel - Ending write splice fault status fault',
141: 'RW Channel - Write-through gap servo collision fault',
142: 'RW Channel - Read Gate Fault',
143: 'Error reading the Preamp Gain register during an HSC operation',
144: 'Error writing the Preamp Gain register during an HSC operation',
145: 'RW Channel - Calibration Processor not halted',
146: 'RW Channel - Background Calibrations already stopped',
147: 'RW Channel - Background Calibrations not stopped',
148: 'RW Channel - Calibration Processor halt error',
149: 'RW Channel - Save AR Calibration Processor registers error',
150: 'RW Channel - Load AR Calibration Processor registers error',
151: 'RW Channel - Restore AR Calibration Processor registers error',
152: 'RW Channel - Write Markov Modulation Code Failure Type 0',
153: 'RW Channel - Write Markov Modulation Code Failure Type 1',
154: 'RW Channel - Write Markov Modulation Code Failure Type 2',
155: 'RW Formatter - NRZ Interface Parity Randomizer Nyquist Error',
156: 'RW Formatter - NRZ Interface Parity Randomizer Run Error',
157: 'RW Formatter - DLT Fifo Underrun Error',
158: 'RW Formatter - WDT Fifo Underrun Error',
159: 'RW Formatter - M2 MI error',
'L1': 'Hardware Error',
'L2': 'Internal Target Failure'},
224: {0: 'Failure writing firmware to disc.',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
225: {0: 'Failed to reinitialize the NVC Host',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
226: {0: 'Failed to erase the NVC Header',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
227: {0: 'Failed to write NVC client data to disc',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
228: {0: 'Failed to initialize the NVC header',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
229: {0: 'Failed to initialize the NVC Host',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
230: {0: 'Failed to write MCMT during initialization',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
231: {0: 'Failed to write the ISPT during format',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
232: {0: 'Failed to clear logs during format',
'L1': 'Hardware Error',
'L2': 'Writing to Disc Failed'},
242: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Data Integrity Check Failed on verify'},
246: {0: 'FRU 00 - 09 stand for errors on heads 0 - 9.',
16: 'Power-on self-test failed.',
'L1': 'Hardware Error',
'L2': 'Data Integrity Check Failed during write'},
251: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Failed to enter Raid Partial Copy Diagnostic mode'},
255: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'XOR CDB check error'}},
93: {0: {1: 'Number of Command Timeouts Exceeded',
'L1': 'Hardware Error',
'L2': 'Command Timeout'}},
101: {0: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Voltage fault.'}},
128: {134: {0: 'Host IOEDC Error on Read detected by host',
1: 'IOEDC error on read.',
2: 'IOECC error on read.',
3: 'FDE IOECC error on read.',
4: 'SSD IOEDC error on read',
5: 'SSD Erased Page Error',
6: 'FDE Sector-bypass datatype mismatch',
'L1': 'Hardware Error',
'L2': 'IOEDC - DataType Error on Read'},
135: {0: 'Host IOEDC Error on Write; this is unused.',
1: 'FDE IOEDC Error on Write detected by the FDE logic',
2: 'SSD IOEDC Error on Write',
128: 'Disk IOEDC parity error on write detected by formatter',
129: 'Both IOECC and IOEDC errors occurred; when IOECC is enabled, this is highly probable for single- or multiple-bit corruption.',
130: 'IOECC parity error on write.',
131: 'IOECC error (correctable).',
'L1': 'Hardware Error',
'L2': 'IOEDC Error on Write'},
136: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Host Parity Check Failed'},
137: {128: 'IOEDC parity error on read detected by formatter',
'L1': 'Hardware Error',
'L2': 'IOEDC error on read detected by formatter'},
138: {'L1': 'Hardware Error',
'L2': 'Host FIFO Parity Error detected by Common Buffer',
'fru': ['xx is 00, 01, 02 or 03 ( channel number )']},
139: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Host FIFO Parity Error detected by frame buffer logic'},
140: {0: 'No Specific FRU code.',
1: 'For Read Host Data Frame Buffer Parity Error.',
2: 'For Write Host Data Frame Buffer Parity Error.',
3: 'SSD Buffer Memory Parity Error',
4: 'Host Data Frame Buffer Uncorrectable ECC Error.',
'L1': 'Hardware Error',
'L2': 'Host Data Frame Buffer Uncorrectable ECC Error'},
141: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Host Data Frame Buffer Protection Error'},
142: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Host FIFO overrun or underrun error'},
143: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'Host FIFO unknown error'}},
129: {0: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'LA Check Error, LCM bit = 0'}},
130: {0: {0: 'No Specific FRU code.',
1: 'Insufficient return buffer.',
'L1': 'Hardware Error',
'L2': 'Diag internal client detected insufficient buffer'}},
131: {0: {0: 'No Specific FRU code.',
'L1': 'Hardware Error',
'L2': 'DOS Scalars are out of range'}}},
5: {26: {0: {0: 'Parameter list length error.',
1: 'Format Parameter list length error.',
2: 'Mode Select command Parameter list length error.',
3: 'Extended Mode select command Parameter list length error.',
4: 'Mode Select operation Parameter list length error.',
5: 'Check mode page Parameter list length error.',
6: 'Reassign Block command Parameter list length error.',
7: 'Parameter list length error.',
8: 'Parameter data list length error.',
'L1': 'Illegal Request',
'L2': 'Parameter List Length Error'}},
32: {0: {0: 'Invalid Command Operation Code',
1: 'Primary Invalid Command Operation Code',
2: 'Unique command not unlocked code.',
7: 'Glist to Plist Unlock command not unlocked code.',
8: 'Invalid Command Operation Code for SSD Backend',
'L1': 'Illegal Request',
'L2': 'Invalid Command Operation Code'},
243: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Invalid linked command operation code'}},
33: {0: {1: 'Logical block address out of range',
2: 'Invalid LBA in synchronize cache',
3: 'Invalid LBA in read capacity',
4: 'Invalid LBA in write same',
5: 'Invalid LBA in read/write long',
6: 'Invalid LBA in seek',
7: 'Logical block address out of range from media backend',
'L1': 'Illegal Request',
'L2': 'Logical Block Address Out of Range'}},
36: {0: {0: 'Invalid field in CDB',
1: 'Bad CDB error.',
2: 'Invalid field in CDB. (Format command)',
3: 'Invalid field in CDB. (Setup R/W Long command)',
4: 'Invalid field in CDB. (Log sense page)',
5: 'Invalid field in CDB. (Log sense parameter)',
6: 'Invalid field in CDB. (Log select command)',
7: 'Invalid Field in CDB - UDS trigger command.',
8: 'Invalid Field in CDB - buffer overflow check.',
9: 'Invalid power transition request',
10: 'Invalid power transition mode bit',
11: 'Invalid power transition PCM',
12: 'Invalid page and subpage combination (Log sense)',
13: 'Invalid field in CDB. (Report Zones command- SMR)',
22: 'Invalid field in CDB. (Skip Mask)',
48: 'Invalid Combination of CDB.',
49: 'Change Definition Illegal Parameter.',
50: 'Change Definition Illegal Password',
51: 'Change Definition Unlock Command Error',
52: 'Change Definition Not Supported',
53: 'Change Definition Mismatch in port mode (Single/Dual port: SAS only)',
54: 'Invalid field in CDB from media backend',
'L1': 'Illegal Request',
'L2': 'Invalid Field in CDB'},
1: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Illegal Queue Type for CDB (Low priority commands must be SIMPLE queue)'},
46: {0: 'The Byte Offset exceeds the length of the SMART frame data.',
'L1': 'Illegal Request',
'L2': 'Invalid field in CDB for E6 SMART Dump command, unique to NetApp.'},
240: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Invalid LBA in linked command'},
242: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Invalid linked command operation code'},
243: {128: 'G->P operation requested while drive was formatted w/o PLIST.',
129: 'Servo Flaw already exists in ASFT or PSFT.',
130: 'G->P operation encountered a G-list entry that overlaps an existing P-list entry.',
131: 'G->P operation encountered a Growth Servo Flaw which overlapped an existing Primary defect Servo Flaw.',
132: 'Defects report lists not available for retrieval.',
133: "Servo Flaw doesn't exist in ASFT.",
'L1': 'Illegal Request',
'L2': 'Illegal Servo Flaw operation request'}},
37: {0: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Logical unit not supported'}},
38: {0: {0: 'Invalid Field in Parameter List.',
1: 'Invalid Field in Format Parameter List.',
2: 'Invalid Field in RetrieveThirdPartyID Parameter List.',
3: 'Invalid Field in ModeSelectOperation Parameter List.',
4: 'Invalid Field in CheckModePage Parameter List.',
5: 'Invalid Field in ReassignBlocksCmd Parameter List.',
6: 'Invalid Field in PersistentReserveOutCmd Parameter List.',
7: 'Invalid Field in LogSelectCmd Parameter List.',
8: 'Invalid LogSelectCmd Parameter List length.',
9: 'Invalid Field in WriteBufferCmd Parameter List.',
10: 'Invalid Field in SendDiagnosticCmd Parameter List.',
11: 'Invalid Field in BuildTranslateAddrPage Parameter List.',
12: 'An E0 packet to UDS was too small',
13: 'Invalid FCN ID in E0 packet to UDS.',
14: 'Invalid Field in Retrieved Trace Information packet to UDS (E0).',
15: 'Cannot clear UDS trace, because UDS was not allowed to return it all',
16: 'Cannot enable/disable UDS trace, drive not ready',
17: 'Unsupported block size.',
18: 'UDS trigger command.',
19: 'Invalid remanufacturing command.',
20: 'Invalid command while SMART reporting disabled.',
21: 'Invalid field in parameter list from media backend',
128: 'Invalid input cylinder.',
129: 'Invalid input head.',
130: 'Invalid input sector.',
131: 'Input user LBA is invalid 01.',
132: 'Input user LBA is invalid 02.',
133: 'Input user LBA is invalid 03.',
134: 'Input system LBA is invalid.',
135: 'Client defect list size is invalid.',
136: 'Sort error due to invalid offset.',
137: 'Sort error due to invalid head.',
138: 'Sort error due to invalid cylinder.',
139: 'Failed to validate a client specified byte extent info.',
140: 'Failed to validate a client specified sector extent info.',
141: 'Invalid track in client defect list entry.',
142: 'Input track is invalid.',
143: 'First LBA of input track is invalid.',
144: 'Invalid servo data block length.',
145: 'Invalid servo program block length.',
146: 'Address translation - input PBA is invalid',
147: 'Address translation - input symbol extent is invalid.',
148: 'Super sector transfer - invalid wedge transfer size.',
149: 'Track ZLR Transfer - Invalid partition.',
150: 'Track ZLR Transfer - Invalid LBA range on target track.',
151: 'Track ZLR Transfer - Reallocated LBA found on target track.',
152: 'Input user LBA is invalid 04.',
153: 'Input user LBA is invalid 05.',
154: 'Convert Sector to RLL Data - Unsupported sector size.',
155: 'Add Servo Flaw - Invalid input specified.',
156: 'Invalid condition for enabling servo free fall protection (drive not spinning).',
157: 'Invalid condition for disabling servo free fall protection (drive not spinning).',
158: 'Invalid condition for disabling servo free fall protection (protection already disabled).',
159: 'Invalid condition for disabling servo free fall protection (protection already de-activated).',
160: 'Invalid condition for disabling servo free fall protection (free-fall condition is currently active).',
161: 'Invalid drive free-fall control option specified.',
162: 'Check free-fall event failed - protection not functional.',
163: 'Invalid sector range specified.',
164: 'Invalid count value specified for update.',
165: 'Invalid channel memory select specified for access.',
166: 'Invalid buffer index specified for read channel memory access.',
167: 'Invalid start address specified for read channel memory access.',
168: 'Invalid transfer length specified for read channel memory access.',
169: 'Invalid sector extent info',
175: 'Band translation - invalid input type specified',
176: 'Band translation - invalid output type specified',
177: 'Band translation - invalid input Band ID',
178: 'Band translation - invalid input Band ID',
179: 'Band translation - invalid input track position',
180: 'Band translation - invalid input RAP zone, head',
185: 'Invalid band number.',
186: 'Invalid band lba offset.',
187: 'Invalid user lba.',
189: 'Invalid parameter',
193: 'DITS Buffer ( Dummy Cache ) too small',
'L1': 'Illegal Request',
'L2': 'DITS Buffer ( Dummy Cache ) too small'},
1: {0: 'No Specific FRU code.',
1: 'Log pages unavailable for inclusion in UDS dump.',
'L1': 'Illegal Request',
'L2': 'Parameter Not Supported'},
2: {0: 'No Specific FRU code.',
1: 'DIAG: Invalid input cylinder.',
2: 'DIAG: Invalid input head.',
3: 'DIAG: Invalid input sector.',
4: 'DIAG: Invalid Wedge.',
5: 'DIAG: Invalid LBA.',
6: 'DIAG: Invalid file selection.',
7: 'DIAG: Invalid file length.',
8: 'DIAG: Invalid start offset.',
9: 'DIAG: Write Overflow.',
10: 'DIAG: Backplane Bypass selection invalid.',
11: 'DIAG: Invalid serial number.',
12: 'DIAG: Incomplete DFB.',
13: 'DIAG: Unsupported DFB revision.',
14: 'DIAG: Invalid Temperature selection.',
15: 'DIAG: Invalid Transfer Length.',
16: 'DIAG: Unsupported memory area.',
17: 'DIAG: Invalid command.',
18: 'DIAG: File copy invalid.',
19: 'DIAG: Insufficient data sent from initiator.',
20: 'DIAG: Unsupported DIAG command.',
21: 'DIAG: Flash segment invalid.',
22: 'DIAG: Req flash segment copy invalid.',
23: 'DIAG: Flash access failed.',
24: 'DIAG: Flash segment length invalid.',
25: 'DIAG: File checksum invalid.',
26: 'DIAG: Host DFB Length Invalid',
27: 'DIAG: Unaligned transfer.',
28: 'DIAG: Unsupported operation.',
29: 'DIAG: Backend invalid.',
30: 'DIAG: Flash plane invalid.',
31: 'DIAG: ISP node not found.',
32: 'DIAG: Invalid parameter.',
33: 'DIAG: Format corrupt condition required.',
34: 'DIAG: Clear all scan unit counts not allowed',
35: 'DIAG: Unsupported Flash Device',
36: 'DIAG: Raw flash blocks in MList',
37: 'DIAG: Raw flash format table mismatch',
38: 'DIAG: Raw flash Unused format slot',
39: 'DIAG: Raw flash cannot decide format table',
40: 'DIAG: Raw flash invalid error code',
41: 'DIAG: Write protect condition',
42: 'DIAG: Requested for a Pre-erased block in Nor flash',
57: 'Parameter Data out of range.',
58: 'Parameter Data over write.',
64: 'DIAG: Diag write failed',
65: 'DIAG: DIAG_DST_IS_IN_PROGRESS',
66: 'DIAG: DIAG_TEST_RANGE_IN_SET',
68: 'DIAG: DIAG_BMS_IS_ENABLED',
72: 'DIAG: DIAG_INVALID_START_LBA',
73: 'DIAG: DIAG_INVALID_END_LBA',
'L1': 'Illegal Request',
'L2': 'Parameter Value Invalid'},
3: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Threshold Parameter not supported'},
4: {0: 'Invalid Release of Active Persistent Reserve',
1: 'Invalid release of persistent reservation. (reservation type mismatch)',
'L1': 'Illegal Request',
'L2': 'Invalid Release of Active Persistent Reserve'},
5: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Fail to read valid log dump data'},
152: {0: 'No Specific FRU code.',
1: 'FDE checksum error.',
2: 'Failed Flash Verification on Newly Downloaded Component.',
'L1': 'Illegal Request',
'L2': 'Invalid Field Parameter - Check Sum'},
153: {1: 'Segment type mismatch.',
2: 'Customer ID mismatch.',
3: 'Drive type mismatch.',
4: 'HW configuration mismatch.',
5: 'Compatibility configuration mismatch.',
6: 'Servo firmware product family mismatch.',
7: 'QNR download is not supported.',
8: 'CAP product family mismatch.',
9: 'RAP product family mismatch.',
10: 'Download segment length too large.',
11: 'Download length invalid.',
12: 'CTPM missing.',
13: 'CFW and CAP mismatch 01.',
14: 'CFW and CAP mismatch 02.',
15: 'CFW and CAP mismatch 03.',
16: 'CFW and RAP mismatch 01.',
17: 'CFW and RAP mismatch 02.',
18: 'CFW and RAP mismatch 03.',
19: 'CFW and SAP mismatch 01.',
20: 'CFW and SAP mismatch 02.',
21: 'CFW and SAP mismatch 03.',
22: 'CFW and SFW mismatch 01.',
23: 'SAP Product family mismatch.',
24: 'CFW and SFW mismatch 03.',
25: 'Download buffer offset invalid.',
26: 'Address translation invalid.',
27: 'CFW and IAP mismatch.',
28: 'Quick Download in Progress.',
29: 'Invalid unlock tags - customer does not match',
30: 'Invalid unlock tags - customer does not match',
31: 'Invalid unlock tags - checksum failure',
32: 'Firmware not backward compatible.',
33: 'Download overlay incompatible.',
34: 'Overlay download failure 1.',
35: 'Overlay download failure 2.',
36: 'Overlay download failure 3.',
37: 'General download failure',
38: 'Trying to download bridge code for wrong product family',
39: 'Factory flags mismatch.',
40: 'Illegal combination - Missing BootFW module.',
41: 'Illegal combination - Missing Customer FW Feature Flags module.',
42: 'Illegal combination - Programmable Inquiry download not supported.',
43: 'Illegal combination - Missing CustomerFW module.',
44: 'Download Congen header failure',
46: 'Download Congen XML failure',
47: 'Download Congen version failure',
48: 'Download Congen XML SIM MakeLocalFile failure',
49: 'Download Congen mode data failure - could not save mode header.',
50: 'Download Congen mode data failure - mode page had sent length/spec length miscompare.',
51: 'Download Congen mode data failure - mode page had invalid contents.',
52: 'Download Congen mode data failure - mode page tried to change contents not allowed by change mask.',
53: 'Download Congen mode data failure - save all mode pages could not write to media.',
54: 'Download Congen mode data failure - save partial mode pages could not write to media.',
55: 'Download Congen mode data failure - mode change callbacks did not complete successfully.',
56: "Package Enforcement Failure - Package didn't contain valid SFW component",
57: 'Invalid link rate',
59: 'Unlock code not allowed to be downloaded if DETS is locked',
60: 'DETS is locked, code download is blocked',
61: 'Code download is blocked if DETS is locked',
62: 'Download is blocked due to system area incompatibility with new code',
63: 'Invalid SD&D customer family for customer cross-market-segment downloads.',
64: 'Unlock File failed to be written to the flash',
65: 'Unlock File security headers do not match',
80: 'Download header length invalid',
81: 'Download length is not a multiple of the buffer word size',
82: 'Download length and segment length mismatch',
161: 'Unknown firmware tag type.',
162: 'Attempt to R/W locked LBA band',
163: 'SSD download combined code has mismatched frontend and backend',
164: 'SSD download backend code - recovery required',
165: 'SSD download a firmware which is mismatched with resident firmware',
166: 'SSD download standalone (non-bundle) firmware in boot mode.',
'L1': 'Illegal Request',
'L2': 'Invalid Field Parameter - Firmware Tag'},
154: {0: 'Invalid Security Field Parameter in secure download packaging.',
1: 'Attempt to perform secure download with drive not spun up',
2: 'Attempt to download signed non-fde firmware in use state or fail state',
3: 'Attempt to download signed sed code onto a non-sed drive.',
4: 'Download inner signature key index does not match the outer signature key index.',
16: 'Inner firmware signature validation failure.',
18: 'Power Governor feature requires that both CFW and SFW support the same number of seek profiles. This sense code indicates an attempt to download code with a mismatching seek-profile count',
20: 'DOS Table Size has been reduced',
'L1': 'Illegal Request',
'L2': 'Invalid Field Parameter - Firmware Tag'},
155: {0: 'SSD download code mismatched with running frontend code; FRU indicates the running frontend compatibility number (00-FF)',
'L1': 'Illegal Request',
'L2': 'SSD Compatibility Error'}},
44: {0: {0: 'Command Sequence Error.',
1: 'Command Sequence Error. (R/W Buffer command)',
2: 'Command Sequence Error. (Retrieve SDBPP Packet)',
3: 'Command Sequence Error. (Diag Locked)',
4: 'Command Sequence Error. (Concurrent UDS service attempt)',
5: 'Command Sequence Error. (UDS retrieval: back-to-back E0)',
6: 'Command Sequence Error. (Unexpected retrieve trace packet received during non-handshaked UDS retrieval)',
7: "Command Sequence Error. (Back-to-back E1 commands, illegal during handshaked UDS retrieval and illegal during non-handshaked when it's time to retrieve the last trace packet)",
8: 'Command Sequence Error. (Send Diag cmd. Before Write Buffer cmd)',
9: 'Command Sequence Error. (Channel BCI logging in online mode)',
10: 'Stop command execution disallowed when Raid Rebuild mode is active/enabled',
11: 'Foreground H2SAT operation currently not allowed.',
'L1': 'Illegal Request',
'L2': 'Command Sequence Error'},
5: {0: 'No Specific FRU code.',
1: 'Power Management frozen (OBSOLETE)',
'L1': 'Illegal Request',
'L2': 'Illegal Power Condition Request'},
128: {0: 'Command Sequence Error. (Illegal to request MC flush while cleaning is disabled.)',
'L1': 'Illegal Request',
'L2': 'Command Sequence Error'}},
50: {1: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Defect List Update Error'}},
53: {1: {3: 'No enclosure found.',
7: 'Unsupported 8045 Enclosure Request.',
'L1': 'Illegal Request',
'L2': 'Unsupported Enclosure Function'}},
71: {6: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'SAS - Physical Test in Progress'}},
73: {0: {0: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Illegal request, Invalid message error'}},
85: {4: {1: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'PRKT table is full'}}},
6: {11: {1: {0: 'No Specific FRU code.',
1: 'Temperature is lower than the low temperature threshold.',
'L1': 'Unit Attention',
'L2': 'Warning - Specified temperature exceeded'}},
41: {0: {0: 'Power on, reset, or bus device reset occurred. (SPI flash LED)',
1: 'CDB trigger dump and reset occurred.',
2: 'LIP trigger dump and reset occurred.',
3: 'Performing some type of logout, either N_PORT, FCP, or both.',
202: 'The Flight Recorder area in FLASH contains data',
'L1': 'Unit Attention',
'L2': 'Power-On, Reset, or Bus Device Reset Occurred'},
1: {0: 'Power-on reset occurred. (SPI)',
1: 'Power-on reset occurred. (SSI)',
6: 'Power-on reset occurred when rezero with 0xEF in byte 1',
7: 'Power-on reset occurred due to HW controller watchdog expiration',
8: 'Power-on reset initiated by firmware (e.g. to remove lockup conditions)',
9: 'Power-on reset occurred when servo watchdog timer expires',
'L1': 'Unit Attention',
'L2': 'Power-On Reset Occurred'},
2: {0: 'SCSI bus reset occurred.',
2: 'Warm reset occurred.',
'L1': 'Unit Attention',
'L2': 'SCSI Bus Reset Occurred'},
3: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Bus Device Reset'},
4: {0: 'No Specific FRU code.',
1: 'Internal Reset due to Assert Storm Threshold being exceeded.',
3: 'NVC WCD has marked corrupted sector dirty.',
'L1': 'Unit Attention',
'L2': 'Internal Reset'},
5: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Transceiver Mode Changed to SE'},
6: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Transceiver Mode Changed to LVD'},
7: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'IT Nexus Loss'},
8: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Write Log Dump data to disk fail'},
9: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Write Log Dump Entry information fail'},
10: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Reserved disc space is full'},
11: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'SDBP test service contained an error, examine status packet(s) for details'},
12: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'SDBP incoming buffer overflow (incoming packet too big)'},
205: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Flashing LED occurred. (Cold reset)'},
206: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Flashing LED occurred. (Warm reset)'}},
42: {1: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Mode Parameters Changed'},
2: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Log Parameters Changed'},
3: {0: 'Reservations preempted.',
1: 'Reservations preempted. (Clear service action)',
'L1': 'Unit Attention',
'L2': 'Reservations Preempted'},
4: {0: 'Reservations released.',
1: 'Reservations released. (Registration with reg key = 0)',
2: 'Reservations Released. (Preempt service action)',
3: 'Reservations Released. (Release service action)',
'L1': 'Unit Attention',
'L2': 'Reservations Released'},
5: {0: 'Registrations preempted.',
1: 'Registrations preempted 01.',
'L1': 'Unit Attention',
'L2': 'Registrations Preempted'},
9: {0: 'Capacity data changed',
'L1': 'Unit Attention',
'L2': 'Capacity data changed'}},
47: {0: {0: 'No Specific FRU code.',
1: 'Target is already unlocked by another initiator',
'L1': 'Unit Attention',
'L2': 'Tagged Commands Cleared By another Initiator'},
1: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Commands cleared due to power-off warning'}},
63: {0: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Target operating conditions have changed'},
1: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Download Occurred'},
2: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Changed Operating Definition'},
3: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Inquiry Data Has Changed'},
5: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Device Identifier Changed'},
145: {1: 'WWN in ETFLOG does not match CAPM WWN.',
'L1': 'Unit Attention',
'L2': 'WWN in ETFLOG does not match CAPM WWN'}},
91: {0: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Log Exception'}},
92: {0: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'RPL Status Change'}},
93: {0: {0: 'No Specific FRU code.',
4: 'Reallocation.',
5: 'Reallocation AST table.',
6: 'Reallocation DDT table.',
16: 'Hardware failure.',
20: 'Excessive reassigns.',
32: 'General failure.',
40: 'Flash Life Left Failure',
49: 'Head failure.',
50: 'Recovered data error rate.',
51: 'Recovered data error rate during early life (Xiotech SSD Only)',
55: 'Recovered TA.',
56: 'Hard TA event.',
64: 'Head flip.',
65: 'SSE (servo seek error).',
66: 'Write fault.',
67: 'Seek failure.',
69: 'Track following errors (Hit66).',
74: 'Seek performance failure.',
91: 'Spinup failure.',
107: 'Flash spinup failure',
117: 'Multiply threshold config.',
239: 'No control table on disk.(OBSOLETE)',
'L1': 'Unit Attention',
'L2': 'Failure Prediction Threshold Exceeded'},
255: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'False Failure Prediction Threshold Exceeded'}},
128: {144: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Host Read Redundancy Check Error'},
145: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Host Write Redundancy Check Error'},
146: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Disc Read Redundancy Check Error'},
147: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Disc Write Redundancy Check Error'},
148: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Xor Redundancy Check Error'}},
180: {0: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'Unreported Deferred Errors have been logged on log page 34h'}},
255: {0: {0: 'No Specific FRU code.',
'L1': 'Unit Attention',
'L2': 'FC SEL_ID changed'}}},
7: {3: {0: {0: 'No Specific FRU code.',
'L1': 'Data Protect',
'L2': 'Peripheral device write fault'}},
32: {2: {0: 'No Access Rights. Attempt to access security locked LBA.',
1: 'No Access Rights. Sanitize command pre-condition not met.',
'L1': 'Data Protect',
'L2': 'Access Denied'}},
39: {0: {0: 'Write protected.',
1: 'Write protected during ready check.',
2: 'Write protected media backend',
3: 'Write protected due to PIC failure',
4: 'Write protected due to reserved blocks exceeded threshold',
5: 'Write protected due to defects per die exceeded threshold',
6: 'Write protected due to retired blocks exceeded threshold',
7: 'Write protected due to write failure (Nand error)',
8: 'Write Protected due to Auto status command write failure',
9: 'Write protected due to spare blocks exceeded threshold',
10: 'Write protected due to GCU for system data is not available',
11: 'Write protected due to defect table read error during restore',
12: 'Zone is read only (SMR Only)',
13: 'Write protected due to defect block list overflow',
'L1': 'Data Protect',
'L2': 'Write Protected'}}},
9: {8: {0: {0: 'No Specific FRU code.',
'L1': 'Firmware Error Constants',
'L2': 'Logical Unit Communication Failure'}},
128: {0: {0: 'General firmware error.',
1: 'Firmware error during CDB check.',
2: 'Error recovering log tables.',
3: 'UDS has no packet to send back, but it neglected to make this clear to the SDBP layer (E1 command).',
4: 'Processed unsupported UDS session (UDS should have rejected).',
5: 'Packet retrieval allowed by upper levels of UDS when no UDS trace retrieval was active.',
6: 'UDS trace retrieval is trying to include an empty finished frame trace file.',
7: 'Unexpected content split across UDS retrieved trace packets.',
8: 'UDS trace retrieval sense data without FAIL status or vice versa.',
9: 'UDS trace retrieval: Internal confusion over amount of trace.',
10: 'UDS \xe2\x80\x93 retrieval failure.',
11: '\xe2\x80\x9cDummy Cache\xe2\x80\x9d file request failed',
12: 'Failed to fix up system parameters during head depop.',
13: 'Write same command call to XOR data copy failed.',
14: 'Write same command call to create cache segment from buffer failed to allocate sufficient space.',
15: 'Request to read servo data timed out.',
16: 'Loading Disc Firmware failed',
17: 'Disc Firmware Signature Verification Failed.',
18: 'WriteAndOrVerifyCmd Xor copy failure',
19: 'WriteAndOrVerifyCmd request cache segment allocation failed',
20: 'Write Buffer detected an Unknown Error.',
21: 'Write Buffer detected a Corrupted Data Error.',
22: 'Write Buffer detected a Permanent Error.',
23: 'Write Buffer detected a Service Delivery/Target Failure Error.',
24: 'Phy Log Retrieval Failed',
25: 'failed to issue command to auxiliary processor',
26: "failed memory allocation on auxiliary processor's heap",
27: 'Loading PIC Firmware Failed',
28: 'Loading FME Firmware Failed',
29: 'LDevFormat() failed to allocate sufficient space.',
30: 'LDevFormat() call to XorCopyData() failed',
45: 'InitSurface() failed to allocate sufficient space.',
46: 'InitSurface() call to XorCopyData() failed',
61: 'Log Page cache not allocated',
62: 'Log Page cache not allocated',
63: 'Log Page not enough cache available',
64: 'SMART Frame Index corrupted on disc and not recoverable via f/w.',
65: 'NVC Disabled by error condition',
66: 'Log data save to disc failed',
67: 'Wait for phy after reset too long',
68: 'H2SAT unexpected condition occurred',
70: 'Memory allocation failed during Reassign Blocks command',
74: 'Diag command attempted to execute missing or incompatible Overlay code',
75: 'Wait for phy after reset too long',
80: 'Flash management access failed',
128: 'Invalid prime request.',
129: 'Request cannot be processed.',
130: 'Unsupported fault.',
131: 'Track address fault.',
132: 'Servo-Disc synchronization error.',
133: 'End of transfer reached prematurely.',
134: 'Unexpected sequencer timeout error.',
135: 'Unknown error in the NRZ Transfer logic.',
136: 'Unknown EDAC error.',
137: "Unknown Media Manager's error.",
138: 'Invalid disc halt.',
139: 'Unexpected sequencer halt condition.',
140: 'Unexpected sequencer halt.',
141: 'Unknown sequencer timeout error.',
142: 'Unknown NRZ interface error.',
143: 'Disc was soft halted.',
144: 'Fault condition error.',
145: 'Correct Buffer Completion timeout error.',
146: 'Maximum write passes of a zone exceeded. (Changed to 04/1C00/93)',
147: 'Maximum certify passes of a zone exceeded. (Changed to 04/1C00/94)',
148: 'Recovered seek error encountered.',
149: 'Forced to enter error recovery before error is encountered.',
150: 'Recovered servo command error.',
151: 'Partial reallocation performed.',
152: 'Transfer was truncated.',
153: 'Transfer completed.',
154: 'Track transfer completed.',
155: 'Scan Defect - Allocated scan time exceeded.',
156: 'IOEDIOECC parity error on write',
157: 'IOECC parity error on write',
158: 'IOECC error (correctable)',
159: 'EDAC stopped for FW erasure',
160: 'Reallocate Block - Input was not marked for pending reallocation.',
161: 'Input LBA was not found in the RST.',
162: 'Input PBA was not found in the resident DST 1',
163: 'Input PBA was not found in the resident DST 2',
164: 'DST Mgr - Skootch failed 1',
165: 'DST Mgr - Skootch failed 2',
166: 'DST Mgr - Insert failed',
167: 'Correction Buffer over-run, under-run, or EDC error',
168: 'Form FIFO over/under run error',
169: 'Failed to transition to active power',
170: 'Input LBA was marked as logged',
171: 'Format - Max number of servo flaws per track exceeded in servo coast',
172: 'Format - Write servo unsafe errors when the track already has multiple flaws',
173: "Formatter's parity RAM progress is not in sync with transfer.",
174: 'Disc Xfr - Conflict of R/W request resource.',
175: 'Conflict of R/W resource during write attempt of super block data.',
176: "Formatter's parity RAM progress not in sync with alt transfer.",
177: "Formatter's parity RAM is invalid for parity sectors update.",
178: "Formatter's parity RAM is invalid for parity sectors alt-update.",
179: 'Parity secs read of expected reallocated sectors not reallocated.',
180: 'Parity sectors write of expected reallocated sectors not reallocated.',
181: 'PVT not showing all super blocks valid on successful format.',
182: 'Sector Data Regen - Restart of transfer is required.',
183: 'Sector Data Regen - Restart of transfer failed on a reallocated blk.',
184: 'Sector Data Regeneration - Restart of transfer failed.',
185: 'Format - Dirty super blk on media not reported in PVT.',
186: 'Super Block Read - No user sectors available.',
187: 'Full R/W reallocation code support is not available.',
188: 'Full R/W reallocation code support is not available.',
189: 'Full R/W reallocation code support is not available.',
190: 'Super Block Read - Recovered Data using SuperC Block.',
191: 'ATIC DERPR Retry - Recovered Data using DERP ATIC retry.',
192: 'Unexpected Servo Response - Retry count equals zero for a non-PZT request',
193: 'Recovered Data using Intermediate Super Parity',
194: 'Overlapping Defect Blocks',
195: 'Missing Defect Blocks',
196: 'Input LBA was not protected from torn write',
197: 'Formatter transfer did not halt properly',
198: 'Servo DC calibration failed',
199: 'Invalid band LBA range encountered during dirty super blocks update attempt',
200: 'Detect Formatter FIFO pointer synchronization loss error',
201: 'Detect Formatter FIFO pointer synchronization loss error',
202: 'Full reallocation support not available',
203: 'Invalid block for unmark DART pending reallocation',
204: 'Mark pending DART skipped',
205: 'Outercode recovery scratchpad buffer size insufficient',
206: 'Recovered data using firmware Iterative OuterCode(IOC)',
207: 'AFH Heater DAC is <= 0',
208: 'AFH Calculated Heater DAC value is >= max allocated memory for heater DAC',
209: 'AFH DAC value supplied to the DAC actuation path is > -b/2a',
210: 'ATS2 Seek Error occurred along with Track address fault error',
211: 'Buffer overflow detected in Legacy mode read.',
'L1': 'Firmware Error Constants',
'L2': 'General Firmware Error Qualifier'},
82: {'L1': 'Firmware Error Constants',
'L2': 'General Firmware Error Qualifier',
'fru': ['Error byte returned by PMC code for various DITS APIs']}}},
11: {0: {30: {0: 'Invoke within a TCG session',
'L1': 'Aborted Command',
'L2': 'Sanitize command aborted'}},
8: {0: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Logical unit communication failure'},
1: {0: 'Logical Unit Communication Time-Out.',
128: 'Servo command timed out.',
129: 'Seek operation timed out.',
130: 'Seek operation has exceeded the recovery time limit.',
'L1': 'Aborted Command',
'L2': 'Logical Unit Communication Time-Out'}},
12: {16: {0: 'Write command requires initial access to a mapped out head.',
1: 'Write command attempted a seek to access a mapped out head.',
2: 'Write command encountered an alternate block mapped to a bad head.',
'L1': 'Aborted Command',
'L2': 'Command aborted due to multiple write errors'}},
14: {1: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'SAS abort command (10.2.3)'},
2: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'SAS abort command (9.2.6.3.3.8.1)'}},
16: {1: {0: 'No specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Logical Block guard check failed'},
2: {0: 'No specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Logical Block application tag check failed'},
3: {0: 'No specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Logical Block reference tag check failed'}},
17: {3: {1: 'Read command requires initial access to a mapped out head.',
2: 'Read command attempted a seek to access a mapped out head.',
3: 'Read command encountered an alternate block mapped to a bad head.',
4: 'Prefetch command for FIM has detected a failed LBA in the range',
'L1': 'Aborted Command',
'L2': 'Command aborted due to multiple read errors'}},
63: {15: {0: 'Echo buffer overwritten.',
1: 'Read buffer echo error.',
'L1': 'Aborted Command',
'L2': 'Echo buffer overwritten'}},
67: {0: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Message reject error'}},
68: {0: {0: 'Timed out while waiting in queue.',
1: 'Timed out during error recovery.',
2: 'Timed out while executing command.',
'L1': 'Aborted Command',
'L2': 'Overall Command Timeout'},
246: {0: 'FRU 00 \xe2\x80\x93 09 stand for error on head 0 \xe2\x80\x93 9.',
'L1': 'Aborted Command',
'L2': 'Data Integrity Check Failed during write'}},
69: {0: {0: 'Select/Reselection Failure.',
1: 'Select/Reselection time out.',
'L1': 'Aborted Command',
'L2': 'Select/Reselection Failure'}},
71: {0: {0: 'SCSI Parity Error in message phase.',
1: 'SCSI parity error in command phase.',
3: 'SCSI parity error in data phase.',
8: 'SCSI CRC error in data phase.',
'L1': 'Aborted Command',
'L2': 'SCSI Parity Error'},
3: {1: 'SCSI CRC error in command IU.',
8: 'SCSI CRC error in data (out) IU.',
'L1': 'Aborted Command',
'L2': 'Information Unit CRC Error'},
128: {9: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Fibre Channel Sequence Error'}},
72: {0: {1: 'Initiator detected error message received, selection path.',
2: 'Initiator detected error message received, reselection path.',
'L1': 'Aborted Command',
'L2': 'Initiator Detected Error Message Received'}},
73: {0: {1: 'Invalid message received, selection path.',
2: 'Invalid message received, reselection path.',
'L1': 'Aborted Command',
'L2': 'Invalid message received'}},
75: {0: {0: 'No Specific FRU code.',
2: 'Invalid source ID.',
3: 'Invalid destination ID.',
4: 'Running Disparity error.',
5: 'Invalid CRC.',
16: 'Invalid data frame during transfer and no xfr done.',
'L1': 'Aborted Command',
'L2': 'DATA phase error'},
1: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Invalid transfer tag'},
2: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Too much write data'},
3: {0: 'No Specific FRU code.',
1: 'Link reset occurred during transfer.',
5: 'Break received in the middle of a data frame',
6: 'Break received but unbalanced ACK/NAKs',
'L1': 'Aborted Command',
'L2': 'ACK NAK Timeout'},
4: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'NAK received'},
5: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Data offset error'},
6: {0: 'No Specific FRU code.',
1: 'Break Response Timeout',
2: 'Done Response Timeout',
3: 'SAS Credit Timeout',
'L1': 'Aborted Command',
'L2': 'Initiator Response Timeout'},
32: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'SAS Credit Timeout'},
255: {0: 'Drive type mismatch \xe2\x80\x93 download firmware',
'L1': 'Aborted Command',
'L2': 'Check FW Tags'}},
78: {0: {0: 'No Specific FRU code.',
1: 'SAS - Overlapped Commands Attempted.',
2: 'UDS trigger on non-queued cmd with outstanding NCQ cmds',
'L1': 'Aborted Command',
'L2': 'Overlapped Commands Attempted'}},
85: {4: {0: 'Cannot reassign if Media cache is not empty',
'L1': 'Aborted Command',
'L2': 'Insufficient Resources'}},
116: {8: {5: 'No Specific FRU code.',
'L1': 'Illegal Request',
'L2': 'Invalid Field Parameter \xe2\x80\x93 Check Sum'}},
128: {0: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Logical Unit Access Not Authorized.'}},
129: {0: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'LA Check Error.'},
1: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Unexpected Boot FW execution delay.'},
2: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Unexpected Customer FW execution delay.'},
3: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Unexpected FW download delay.'}},
251: {1: {0: 'No Specific FRU code.',
'L1': 'Aborted Command',
'L2': 'Command maps to a head marked as bad'}}},
13: {33: {0: {0: 'No Specific FRU code.',
1: 'UDS Trace retrieval complete.',
'L1': 'Volume Overflow Constants',
'L2': 'Logical Block Address Out of Range'}}},
14: {29: {0: {0: 'Miscompare During Verify Operation.',
128: 'Data miscompare error.',
129: 'Data miscompare error at erasure correction.',
'L1': 'Data Miscompare',
'L2': 'Miscompare During Verify Operation'}}}}
| 82.481225 | 297 | 0.40913 | 14,479 | 166,942 | 4.715519 | 0.111817 | 0.017327 | 0.031797 | 0.041581 | 0.427193 | 0.351588 | 0.295111 | 0.239396 | 0.189129 | 0.148149 | 0 | 0.064969 | 0.513651 | 166,942 | 2,023 | 298 | 82.521997 | 0.775951 | 0 | 0 | 0.227992 | 0 | 0.014837 | 0.459408 | 0.000928 | 0 | 0 | 0.000024 | 0 | 0.00544 | 1 | 0 | false | 0.003956 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
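The nested dictionary above maps SCSI sense key → ASC → ASCQ to an inner dict holding FRU-code texts plus an `'L1'` (sense-key name) and `'L2'` (ASC/ASCQ description) entry. A minimal sketch of a lookup helper over that same shape; the tiny `SENSE_TABLE` excerpt below is a hypothetical sample, not the full data:

```python
# Lookup over a sense-code table shaped {sense_key: {asc: {ascq: {fru: text, 'L1': ..., 'L2': ...}}}}.
# SENSE_TABLE is a tiny hypothetical excerpt of the table above.
SENSE_TABLE = {
    6: {41: {0: {0: 'Power on, reset, or bus device reset occurred. (SPI flash LED)',
                 'L1': 'Unit Attention',
                 'L2': 'Power-On, Reset, or Bus Device Reset Occurred'}}},
}

def describe_sense(table, key, asc, ascq, fru=0):
    """Return (sense-key name, ASC/ASCQ description, FRU text), or None if unknown."""
    entry = table.get(key, {}).get(asc, {}).get(ascq)
    if entry is None:
        return None
    # Unlisted FRU codes fall back to the generic text used throughout the table.
    return entry['L1'], entry['L2'], entry.get(fru, 'No Specific FRU code.')

print(describe_sense(SENSE_TABLE, 6, 41, 0, 0))
```

The fallback string mirrors the `'No Specific FRU code.'` convention the table itself uses for FRU 0.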
f6cfba446c1b7d27bb2983b109b1c4404b84a1da | 2,084 | py | Python | project_name/project_name/settings/dev.py | aexeagmbh/django-project-template | 123c4bd8b79320b460677d8df42895600ad99393 | [
"MIT"
] | null | null | null | project_name/project_name/settings/dev.py | aexeagmbh/django-project-template | 123c4bd8b79320b460677d8df42895600ad99393 | [
"MIT"
] | null | null | null | project_name/project_name/settings/dev.py | aexeagmbh/django-project-template | 123c4bd8b79320b460677d8df42895600ad99393 | [
"MIT"
] | null | null | null | # coding=utf-8
"""Development settings and globals."""
from .base import *
# ######### DEBUG CONFIGURATION
# See: https://docs.djangoproject.com/en/dev/ref/settings/#debug
DEBUG = True
# See: https://docs.djangoproject.com/en/dev/ref/settings/#template-debug
TEMPLATE_DEBUG = DEBUG
# ######### END DEBUG CONFIGURATION
# ######### EMAIL CONFIGURATION
# See: https://docs.djangoproject.com/en/dev/ref/settings/#email-backend
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
# ######### END EMAIL CONFIGURATION
# ######### DATABASE CONFIGURATION
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': '{{ project_name }}',
'USER': 'postgres',
'HOST': 'db',
'PORT': 5432,
}
}
# ######### TOOLBAR CONFIGURATION
# See: http://django-debug-toolbar.readthedocs.org/en/latest/installation.html#explicit-setup
INSTALLED_APPS += (
'debug_toolbar',
)
MIDDLEWARE_CLASSES += (
'debug_toolbar.middleware.DebugToolbarMiddleware',
)
DEBUG_TOOLBAR_PANELS = [
'debug_toolbar.panels.version.VersionDebugPanel',
'debug_toolbar.panels.timer.TimerDebugPanel',
'debug_toolbar.panels.headers.HeaderDebugPanel',
'debug_toolbar.panels.request_vars.RequestVarsDebugPanel',
'debug_toolbar.panels.sql.SQLDebugPanel',
'debug_toolbar.panels.template.TemplateDebugPanel',
'debug_toolbar.panels.cache.CacheDebugPanel',
# 'debug_toolbar.panels.signals.SignalDebugPanel',
'debug_toolbar.panels.profiling.ProfilingDebugPanel',
]
DEBUG_TOOLBAR_PATCH_SETTINGS = False
# boto (not use on dev):
DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
DEBUG_TOOLBAR_CONFIG = {
'SHOW_TOOLBAR_CALLBACK': '{}.show_toolbar'.format(__name__),
}
def show_toolbar(request):
return DEBUG
# ######### END TOOLBAR CONFIGURATION
BROKER_URL = 'amqp://guest@rabbitmq'
AMQP_HTTP_API_URL = 'rabbitmq:15672'
CELERY_RESULT_BACKEND = 'disabled'
ENVIRONMENT = 'dev'
try:
from .local_settings import *
except ImportError:
print('No local settings found')
| 26.717949 | 93 | 0.712092 | 224 | 2,084 | 6.4375 | 0.5 | 0.124827 | 0.124827 | 0.052011 | 0.10957 | 0.10957 | 0.10957 | 0.10957 | 0.10957 | 0.079057 | 0 | 0.006098 | 0.134357 | 2,084 | 77 | 94 | 27.064935 | 0.793237 | 0.277831 | 0 | 0 | 0 | 0 | 0.503876 | 0.410148 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022727 | false | 0 | 0.068182 | 0.022727 | 0.113636 | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f6d0cac352b221bf2f7870f056af97d65cf0f1e3 | 354 | py | Python | examples/unlock_antidotes.py | astitva22/Pixelate-22-Sample-Arena | 0bee7aea075d760e60456d38225bb3d6d1b68fed | [
"MIT"
] | 1 | 2022-03-01T20:39:25.000Z | 2022-03-01T20:39:25.000Z | examples/unlock_antidotes.py | astitva22/Pixelate-22-Sample-Arena | 0bee7aea075d760e60456d38225bb3d6d1b68fed | [
"MIT"
] | null | null | null | examples/unlock_antidotes.py | astitva22/Pixelate-22-Sample-Arena | 0bee7aea075d760e60456d38225bb3d6d1b68fed | [
"MIT"
] | 7 | 2022-03-01T20:37:14.000Z | 2022-03-09T06:27:38.000Z | import gym
import pixelate_arena
import time
import pybullet as p
import os
if __name__ == "__main__":
parent_path = os.path.dirname(os.getcwd())
os.chdir(parent_path)
env = gym.make("pixelate_arena-v0")
x=0
while True:
p.stepSimulation()
if x==10000:
env.unlock_antidotes()
x+=1
time.sleep(1) | 20.823529 | 46 | 0.629944 | 50 | 354 | 4.2 | 0.6 | 0.12381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034351 | 0.259887 | 354 | 17 | 47 | 20.823529 | 0.767176 | 0 | 0 | 0 | 0 | 0 | 0.070423 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3125 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
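The loop above counts simulation steps and fires `unlock_antidotes()` once the counter reaches 10000. The trigger-at-step pattern can be factored out and tested without pybullet; the callback and threshold below are illustrative stand-ins:

```python
def run_steps(n_steps, trigger_step, on_trigger, step_fn=lambda: None):
    """Step a simulation n_steps times, firing on_trigger once at trigger_step."""
    fired = False
    for step in range(n_steps):
        step_fn()  # stand-in for p.stepSimulation()
        if step == trigger_step and not fired:
            on_trigger()
            fired = True
    return fired

events = []
run_steps(20, 10, lambda: events.append('unlocked'))
print(events)  # -> ['unlocked']
```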
f6e28a13700fedee22ada39a3fa8792554af7720 | 11,213 | py | Python | mm2s5/mm2s5.py | scottkirkwood/mm2s5 | 05160ead97b28a82e3d6f529672c8cd2067ff27c | [
"Apache-2.0"
] | null | null | null | mm2s5/mm2s5.py | scottkirkwood/mm2s5 | 05160ead97b28a82e3d6f529672c8cd2067ff27c | [
"Apache-2.0"
] | null | null | null | mm2s5/mm2s5.py | scottkirkwood/mm2s5 | 05160ead97b28a82e3d6f529672c8cd2067ff27c | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- encoding: latin1 -*-
#
# Copyright 2010 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Patched by Iceberg Luo to support FreeMind 0.9.x 2009
"""Convert a Memory Map File into an S5 presentation
If you create a mind map with FreeMind the title of the mind-map (center circle)
will be the title of the slide.
A top level node called "__meta__" can be used to set the metadata for
the presentation. The immediate children are keys and it's first child is a value.
title: Title of presentation, not needed since I get it from the top node
subtitle: Witty subtitle of the presentation
author: You probably want to change this.
company: And this
template: which subdirectory to use under the "ui" directory, 'default' is default
presdate: Date of the presentation
content_type: defaults to 'application/xhtml+xml; charset=utf-8'
header:
footer:
If the first character of the first line is a '<' then we won't add
the <ul> list to the markup.
The icons can have special meaning:
The "Not OK" icon the slide will be skipped.
The "OK" icon will have no additional markup on the text (i.e. no <ul>)
The "Stop" icon will build the slide one line at a time.
The "Priority 1" icon will use an ordered list
"""
__author__ = 'scott@forusers.com (Scott Kirkwood)'
__version__ = '0.2.3'
import sys
import optparse
from xml.etree import ElementTree
import codecs
class Mm2S5:
def __init__(self):
self.et_in = None
self.meta = {
'title' : 'Title',
'subtitle': '',
'author' : """The Author.
You don't need to change this code.
Instead, you should write a __meta__ node in your mm file.
Refer to the docstring for details.""",
'company' : 'See above',
'template' : 'default',
'presdate' : 'Today',
'content_type' : 'application/xhtml+xml; charset=utf-8',
'header' : '',
'footer' : None,
'generator' : 'mm2s5.py',
}
def open(self, infilename):
""" Open the .mm file and create a S5 file as a list of lines """
infile = file(infilename).read()
self.et_in = self.xmlparse(infile)
lines = self.convert()
return lines
def write(self, outfilename, lines):
""" Write out the lines, written as a convenience function
Writing out the HTML in correct UTF-8 format is a little tricky."""
outfile = codecs.open(outfilename, 'w', 'utf-8')
outfile.write(u'\n'.join(lines))
outfile.close()
def xmlparse(self, text):
""" import the XML text into self.et_in """
return ElementTree.XML(text)
def convert(self):
""" Convert self.et_in to a HTML as a list of lines in S5 format """
self._grab_meta()
lines = []
lines.append("""<?xml version="1.0" encoding="UTF-8"?>""")
lines.append("""<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">""")
lines.append('<head>')
lines.append("""
<title>%(title)s</title>
<meta name="version" content="S5 1.1" />
<meta name="generator" content="%(generator)s" />
<meta name="presdate" content="%(presdate)s" />
<meta name="author" content="%(author)s" />
<meta name="company" content="%(company)s" />
<meta http-equiv="Content-type" content="%(content_type)s" />
<!-- S5 format see Eric A. Meyer, http://meyerweb.com/eric/tools/s5/ -->
<link rel="stylesheet" href="ui/%(template)s/slides.css" type="text/css"
media="projection" id="slideProj" />
<link rel="stylesheet" href="ui/%(template)s/outline.css" type="text/css"
media="screen" id="outlineStyle" />
<link rel="stylesheet" href="ui/%(template)s/print.css" type="text/css"
media="print" id="slidePrint" />
<link rel="stylesheet" href="ui/%(template)s/opera.css" type="text/css"
media="projection" id="operaFix" />
<script src="ui/%(template)s/slides.js" type="text/javascript"></script>
""" % self.meta)
lines.append('</head>')
lines.append('<body>')
lines.append("""<div class="layout">
<div id="controls"><!-- DO NOT EDIT --></div>
<div id="currentSlide"><!-- DO NOT EDIT --></div>
<div id="header">%(header)s</div>
<div id="footer">%(footer)s</div>
</div>""" % self.meta)
lines.append('<div class="presentation">')
presentation = self.et_in.find('node')
lines.append(' <div class="slide">')
lines.append(' <h1>%s</h1>' % (self.meta['title']))
lines.append(' <h2>%s</h2>' % (self.meta['subtitle']))
lines.append(' <h3>%s</h3>' % (self.meta['author']))
lines.append(' <h4>%s</h4>' % (self.meta['company']))
lines.append(' </div>')
for page in presentation.findall('node'):
# Skip the __meta__ node, if any
if page.attrib['TEXT'] == '__meta__':
continue
attribs = self._get_list_attributes(page)
if 'skip' in attribs:
continue
lines.append(' <div class="slide">')
lines.append(' <h1>%s</h1>' % (page.attrib['TEXT']))
lines.append(' <div class="slidecontent">')
self._doList(lines, page, 0)
lines.append(' </div>') # content
lines.append(' </div>') # slide
lines.append('</div>') # Presentation
lines.append('</body>')
lines.append('</html>')
return lines
def _get_list_attributes(self, page):
""" If there's a special icon, return some attributes
Also, handle HTML markup a bit differently
"""
ret = {}
for icon in page.findall('icon'):
icon_type = icon.attrib['BUILTIN']
if icon_type == 'button_ok':
ret['no_ul'] = True
elif icon_type == "stop": # Stop light icon
ret['ul_class'] = "incremental"
elif icon_type == 'button_cancel':
ret['skip'] = True
elif icon_type == 'full-1':
ret['ol'] = True
# Special case, if the first node starts with <
# Then we'll assume markup and not do
# a <ul> etc.
node = page.find('node')
if node != None and \
(node.attrib['TEXT'].startswith('<') or
node.attrib['TEXT'] == '__table__'):
ret['no_ul'] = True
return ret
def _grab_meta(self):
""" Grab a "page" called __meta__, if any """
titles = self.et_in.find('node').attrib['TEXT'].split('\n')
self.meta['title'] = titles[0]
if len(titles) > 1:
self.meta['subtitle'] = titles[1]
for cur_node in self.et_in.getiterator('node'):
if cur_node.attrib.get('TEXT') == '__meta__': # Probably due to FreeMind 0.9, we might not have TEXT attribute
for sub_attrib in cur_node.findall('node'):
key = sub_attrib.attrib['TEXT']
sub_value = sub_attrib.find('node')
          if sub_value is not None:
value = sub_value.attrib['TEXT']
self.meta[key] = value
if self.meta['footer'] == None:
      self.meta['footer'] = '<h1>%(company)s</h1><h2>%(title)s</h2>' % self.meta
def _doList(self, lines, sub, depth):
""" Recurse this list of items
Code is a little messier than I would like """
if sub == None or len(sub) == 0:
return
attribs = self._get_list_attributes(sub)
if 'ul_class' in attribs:
ul_class = ' class="%s"' % (attribs['ul_class'])
else:
ul_class = ''
indent = ' ' * (depth + 2)
if 'no_ul' not in attribs:
if 'ol' in attribs:
lines.append('%s<ol%s>' % (indent, ul_class,))
end = '%s</ol>' % (indent)
else:
lines.append('%s<ul%s>' % (indent, ul_class,))
end = '%s</ul>' % (indent)
else:
end = None
for line in sub.findall('node'):
text = line.attrib.get('TEXT') # Probably due to FreeMind 0.9, we might not have TEXT attribute
if not text: # FreeMind 0.9 's HTML node stores text in html format
p = (line
._children[0]#Element richcontent
._children[0]#Element html
._children[1]#Element body
._children[0])#Element p
        if p.text:
          text = p.text
        elif p.tag == 'img':
          text = '<img src="%s">' % p.get('src')
if text == '__table__':
lines += self._insert_table(text, line, depth)
else:
lines += self._insert_line_item(text, line, depth, attribs)
self._doList(lines, line, depth + 1)
if end:
lines.append(end)
def _insert_line_item(self, text, line, depth, attribs):
""" Insert a line item <li></li> """
indent = ' ' * (depth + 3)
lines = []
if not text:
return lines
text = text.replace('<html>', '')
    if 'LINK' in line.attrib:
text = '<a href="%s">%s</a>' % (line.attrib['LINK'], text)
if 'no_ul' not in attribs:
text = text.replace('\n', '<br/>\n')
lines.append('%s<li>%s</li>' % (indent, text))
else:
lines.append('%s' % (text))
return lines
def _insert_table(self, unused_text, line, depth):
""" If we get a special node called __table__ insert the children
as rows in a table (descendants are columns in that row) """
lines = []
indent = ' ' * (depth + 2)
table = line
lines.append('%s<table>' % (indent))
for row in table.findall('node'):
lines.append('%s <tr>' % (indent))
for col in row.iter('node'):
lines.append('%s <td>%s</td>' % (indent, col.attrib['TEXT']))
lines.append('%s </tr>' % (indent))
lines.append('%s</table>' % (indent))
return lines
def show_version():
print 'mm2s5 version %s.' % __version__
print 'Written by %s' % __author__
def parse_command_line():
usage = """%prog <mmfile> [<htmloutput>]
Convert a FreeMind (.mm) document (see http://freemind.sourceforge.net/wiki/index.php/Main_Page)
into an S5 presentation: the main node becomes the title page and the lower nodes become pages.
"""
parser = optparse.OptionParser(usage)
parser.add_option('-v', '--version', dest='version', action='store_true',
help='Show version information and exit.')
(options, args) = parser.parse_args()
if options.version:
show_version()
sys.exit(0)
if len(args) == 0:
parser.print_usage()
sys.exit(-1)
infile = args[0]
if not infile.endswith('.mm'):
print "Input file must end with '.mm'"
parser.print_usage()
sys.exit(-1)
if len(args) == 1:
outfile = infile.replace('.mm', '.html')
elif len(args) == 2:
outfile = args[1]
else:
parser.print_usage()
sys.exit(-1)
mm2s5 = Mm2S5()
lines = mm2s5.open(infile)
mm2s5.write(outfile, lines)
if __name__ == "__main__":
parse_command_line()
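Note that the outfile derivation above uses `infile.replace('.mm', '.html')`, which substitutes the first '.mm' found anywhere in the path (for example a directory named `dir.mm`). A minimal sketch of a suffix-only variant; the helper name is illustrative, not part of the original script:

```python
def derive_outfile(infile):
    # Strip only the trailing '.mm' extension, leaving earlier matches in the path alone.
    assert infile.endswith('.mm')
    return infile[:-len('.mm')] + '.html'

print(derive_outfile('dir.mm/slides.mm'))  # dir.mm/slides.html
```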
| 33.471642 | 116 | 0.605725 | 1,559 | 11,213 | 4.263631 | 0.244387 | 0.052956 | 0.018956 | 0.014292 | 0.14232 | 0.09508 | 0.068301 | 0.039717 | 0.02708 | 0.02708 | 0 | 0.011933 | 0.237671 | 11,213 | 334 | 117 | 33.571856 | 0.765676 | 0.09373 | 0 | 0.168889 | 0 | 0.035556 | 0.345503 | 0.061847 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.017778 | null | null | 0.035556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f6e364a4d8935e2990712e28b098a07099b88272 | 395 | py | Python | public/ViPER/modules/head.py | severnake/ViPER | e788f73bfe894f7fd03081782c87778de38a8df2 | [
"MIT"
] | 23 | 2016-11-26T06:55:44.000Z | 2021-07-27T19:28:05.000Z | public/ViPER/modules/head.py | severnake/ViPER | e788f73bfe894f7fd03081782c87778de38a8df2 | [
"MIT"
] | 2 | 2021-03-11T04:35:03.000Z | 2021-05-11T22:03:33.000Z | public/ViPER/modules/head.py | severnake/ViPER | e788f73bfe894f7fd03081782c87778de38a8df2 | [
"MIT"
] | 7 | 2017-08-12T10:44:41.000Z | 2022-03-22T05:49:49.000Z | import requests
from termcolor.termcolor import colored, cprint
class header:
"""
Class for extracting headers
"""
def __init__(self):
pass
def get_headers(self, target):
req = requests.head(target)
req = req.headers
for i in req.items():
cprint(i[0].ljust(60)+i[1].rjust(50),'blue')
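The cprint call above renders each header as a fixed-width two-column line: the name padded to 60 columns with ljust, the value right-aligned in a further 50 with rjust. A quick sketch of that layout with a made-up header value:

```python
# Hypothetical header pair, formatted the same way as in get_headers above.
key, value = 'Content-Type', 'text/html; charset=utf-8'
line = key.ljust(60) + value.rjust(50)
print(line)
```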
| 23.235294 | 60 | 0.531646 | 45 | 395 | 4.555556 | 0.644444 | 0.087805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0.362025 | 395 | 16 | 61 | 24.6875 | 0.789683 | 0.070886 | 0 | 0 | 0 | 0 | 0.011799 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.1 | 0.2 | 0 | 0.5 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
f6f037d5301de39668019ccf1a2625274e621032 | 14,051 | py | Python | Model/lookalike-model/tests/pipeline/test_main_clean.py | rangaswamymr/blue-marlin | 2ab39a6af01e14f40386f640fe087aeb284b5524 | [
"Apache-2.0"
] | null | null | null | Model/lookalike-model/tests/pipeline/test_main_clean.py | rangaswamymr/blue-marlin | 2ab39a6af01e14f40386f640fe087aeb284b5524 | [
"Apache-2.0"
] | null | null | null | Model/lookalike-model/tests/pipeline/test_main_clean.py | rangaswamymr/blue-marlin | 2ab39a6af01e14f40386f640fe087aeb284b5524 | [
"Apache-2.0"
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0.html
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import yaml
from pyspark import SparkContext
from pyspark.sql import SparkSession, HiveContext
from pyspark.sql.functions import col, udf, collect_set
from pyspark.sql.types import IntegerType, BooleanType
from lookalike_model.pipeline import main_clean, util
from data_generator import *
class TestMainClean(unittest.TestCase):
def setUp(self):
# Initialize the Spark session
self.spark = SparkSession.builder.appName('unit test').getOrCreate()
self.spark.sparkContext.setLogLevel('ERROR')
# The log/run tests below reference self.hive_context, so it must be created here.
self.hive_context = HiveContext(self.spark.sparkContext)
# Testing the method that tests user uniqueness and removes users with conflicting age/gender.
def compare_list(self, list1, list2):
if len(list1) != len(list2):
return False
def _key(x): return '-'.join([str(_) for _ in x])
return sorted(list1, key=_key) == sorted(list2, key=_key)
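compare_list treats the two row lists as unordered: each row is keyed by joining its stringified fields with '-' and the sorted lists are compared for equality. A standalone sketch of the same logic:

```python
def compare_list(list1, list2):
    # Order-insensitive comparison of two lists of row tuples.
    if len(list1) != len(list2):
        return False
    def _key(x):
        return '-'.join(str(v) for v in x)
    return sorted(list1, key=_key) == sorted(list2, key=_key)

assert compare_list([('0000001', 0, 0), ('0000002', 1, 1)],
                    [('0000002', 1, 1), ('0000001', 0, 0)])
assert not compare_list([('0000001', 0, 0)], [('0000001', 0, 1)])
```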
def _test_clean_persona(self):
print('*** Running test_clean_persona ***')
data = [
('0000001', 0, 0),
('0000001', 0, 0), # duplicate entry, duplicates will be removed
('0000002', 1, 1),
('0000003', 2, 2),
('0000003', 2, 3), # duplicate entry, duplicates will be removed
]
schema = StructType([
StructField("aid", StringType(), True),
StructField("gender", StringType(), True),
StructField("age", StringType(), True)
])
df = self.spark.createDataFrame(self.spark.sparkContext.parallelize(data), schema)
df = main_clean.clean_persona(df, 1).select('aid', 'gender', 'age')
expected_output = [
('0000001', 0, 0),
('0000002', 1, 1)
]
print("Original DataFrame df:")
print(df.show(100, False))
print("Expected DataFrame df_expected:")
print(expected_output)
self.assertTrue(self.compare_list(df.collect(), expected_output))
# Tests the method that adds the 'day' column with just the date from the action_time column.
def _test_add_day(self):
print('*** Running test_add_day ***')
data = [
('0000001', '2022-02-19 12:34:56.78'),
('0000002', '2022-02-20 12:34:56.78')
]
schema = StructType([
StructField("aid", StringType(), True),
StructField("action_time", StringType(), True)
])
df = self.spark.createDataFrame(self.spark.sparkContext.parallelize(data), schema)
# Run the method to be tested.
df = main_clean.add_day(df)
print(df.show(100, False))
expected_output = [
('0000001', '2022-02-19 12:34:56.78', '2022-02-19'),
('0000002', '2022-02-20 12:34:56.78', '2022-02-20')
]
schema_expected = StructType([
StructField("aid", StringType(), True),
StructField("action_time", StringType(), True),
StructField("day", StringType(), True),
])
df_expected = self.spark.createDataFrame(self.spark.sparkContext.parallelize(expected_output), schema_expected)
df_expected = main_clean.add_day(df_expected)
print("Original DataFrame df:")
print(df.show(100, False))
print("Expected DataFrame df_expected:")
print(df_expected.show(100, False))
result = sorted(df.collect()) == sorted(df_expected.collect())
self.assertTrue(result)
# Testing the addition of the 'aid_bucket' column with a hash mod bucket_num value.
def _test_add_aid_bucket(self):
print('*** Running test_add_aid_bucket ***')
# Input
data = [
('0000001', 0, 0),
('0000002', 1, 1),
('0000003', 2, 2),
]
schema = StructType([
StructField("aid", StringType(), True),
StructField("gender", StringType(), True),
StructField("age", StringType(), True)
]
)
df = self.spark.createDataFrame(self.spark.sparkContext.parallelize(data), schema)
# Run the method to be tested.
aid_bucket_num = 4
df = main_clean.add_aid_bucket(df, aid_bucket_num)
expected_output = [
('0000001', 0, 0),
('0000002', 1, 1),
('0000003', 2, 2)
]
df_expected = main_clean.add_aid_bucket(self.spark.createDataFrame(self.spark.sparkContext.parallelize(expected_output), schema), aid_bucket_num)
print("Original DataFrame df:")
print(df.show(100, False))
print("Expected DataFrame df_expected:")
print(df_expected.show(100, False))
result = sorted(df.collect()) == sorted(df_expected.collect())
self.assertTrue(result)
# Testing the method that joins the log rows with the user persona, keyword, and media category.
def _test_clean_batched_log(self):
print('*** Running test_clean_batched_log ***')
# Get the data inputs for the test.
df_log = create_raw_log(self.spark)
df_persona = create_cleaned_persona(self.spark)
df_keywords = create_keywords(self.spark)
conditions = {
'new_slot_id_list': [
'abcdef0', 'abcdef1', 'abcdef2', 'abcdef3', 'abcdef4',
'abcdef5', 'abcdef6', 'abcdef7', 'abcdef8', 'abcdef9'
],
'new_slot_id_app_name_list': [
'Huawei Magazine', 'Huawei Browser', 'Huawei Video', 'Huawei Music', 'Huawei Reading',
'Huawei Magazine', 'Huawei Browser', 'Huawei Video', 'Huawei Music', 'Huawei Reading'
]
}
# Run the method to be tested.
aid_bucket_num = 4
df = main_clean.clean_batched_log(df_log, df_persona, df_keywords, aid_bucket_num)
print(df.sort('aid').show(100, False))
# Validate the output.
self.validate_cleaned_log(df, conditions, df_persona, df_keywords, df_log, aid_bucket_num)
# Testing data look up and cleaning process for clicklog and showlog data.
def _test_clean_logs(self):
print('*** Running test_clean_logs ***')
with open('config_clean.yml', 'r') as ymlfile:
cfg = yaml.safe_load(ymlfile)
showlog_table = cfg['showlog_table_name']
showlog_output_table = cfg['pipeline']['main_clean']['showlog_output_table']
clicklog_table = cfg['clicklog_table_name']
clicklog_output_table = cfg['pipeline']['main_clean']['clicklog_output_table']
log_table_names = (showlog_table, showlog_output_table, clicklog_table, clicklog_output_table)
print(showlog_table)
print(clicklog_table)
print(showlog_output_table)
print(clicklog_output_table)
# Create the persona and keyword dataframes.
df_persona = create_cleaned_persona(self.spark)
df_keywords = create_keywords(self.spark)
# Create the clicklog and showlog tables.
create_clicklog_table(self.spark, clicklog_table)
create_showlog_table(self.spark, showlog_table)
# Drop the output tables
util.drop_table(self.hive_context, showlog_output_table)
util.drop_table(self.hive_context, clicklog_output_table)
# Run the method to be tested.
main_clean.clean_logs(cfg, df_persona, df_keywords, log_table_names)
# Validate the output tables.
conditions = cfg['pipeline']['main_clean']['conditions']
df_log = create_raw_log(self.spark)
# Validate the cleaned clicklog table.
df_clicklog = util.load_df(self.hive_context, clicklog_output_table)
print(df_clicklog.sort('action_time').show(100, False))
self.validate_cleaned_log(df_clicklog, conditions, df_persona, df_keywords, df_log, cfg['pipeline']['main_clean']['aid_bucket_num'])
# Validate the cleaned showlog table.
df_showlog = util.load_df(self.hive_context, showlog_output_table)
self.validate_cleaned_log(df_showlog, conditions, df_persona, df_keywords, df_log, cfg['pipeline']['main_clean']['aid_bucket_num'])
# Testing full data cleaning process for persona, clicklog, and showlog data.
def _test_run(self):
print('*** Running test_run ***')
with open('config_clean.yml', 'r') as ymlfile:
cfg = yaml.safe_load(ymlfile)
# Create the persona, keywords, clicklog and showlog tables.
persona_table = cfg['persona_table_name']
keywords_table = cfg['keywords_table']
showlog_table = cfg['showlog_table_name']
clicklog_table = cfg['clicklog_table_name']
effective_keywords_table = cfg['pipeline']['main_keywords']['keyword_output_table']
create_persona_table(self.spark, persona_table)
create_keywords_table(self.spark, keywords_table)
create_clicklog_table(self.spark, clicklog_table)
create_showlog_table(self.spark, showlog_table)
create_effective_keywords_table(self.spark, effective_keywords_table)
# Drop the output tables
showlog_output_table = cfg['pipeline']['main_clean']['showlog_output_table']
clicklog_output_table = cfg['pipeline']['main_clean']['clicklog_output_table']
persona_output_table = cfg['pipeline']['main_clean']['persona_output_table']
util.drop_table(self.hive_context, showlog_output_table)
util.drop_table(self.hive_context, clicklog_output_table)
util.drop_table(self.hive_context, persona_output_table)
# Run the method to be tested.
main_clean.run(self.hive_context, cfg)
# Validate the output tables.
conditions = cfg['pipeline']['main_clean']['conditions']
bucket_num = cfg['pipeline']['main_clean']['aid_bucket_num']
df_keywords = util.load_df(self.hive_context, keywords_table)
# run() does filtering on the effective keywords so we need to filter
# the raw logs with the spread app ids when validating the output.
effective_spread_app_ids = ['C000', 'C001', 'C002', 'C003', 'C004', 'C010', 'C011', 'C012', 'C013', 'C014', ]
df_log = create_raw_log(self.spark)
df_log = self.filter_spread_app_ids(df_log, effective_spread_app_ids)
# Validate the cleaned persona table.
df_persona = util.load_df(self.hive_context, persona_output_table)
self.validate_clean_persona(df_persona, bucket_num)
# Validate the cleaned clicklog table.
df_clicklog = util.load_df(self.hive_context, clicklog_output_table)
self.validate_cleaned_log(df_clicklog, conditions, df_persona, df_keywords, df_log, bucket_num)
print_df_generator_code(df_clicklog.sort('aid'))
# Validate the cleaned showlog table.
df_showlog = util.load_df(self.hive_context, showlog_output_table)
self.validate_cleaned_log(df_showlog, conditions, df_persona, df_keywords, df_log, bucket_num)
print_df_generator_code(df_showlog.sort('aid'))
def filter_spread_app_ids(self, df, spread_app_ids):
# User defined function to return if the keyword is in the inclusion set.
_udf = udf(lambda x: x in spread_app_ids, BooleanType())
# Return the filtered dataframe.
return df.filter(_udf(col('spread_app_id')))
# ========================================
# Helper methods
# ========================================
def validate_cleaned_log(self, df, conditions, df_persona, df_keywords, df_log, bucket_num):
# Verify the column names.
columns = ['spread_app_id', 'aid', 'adv_id', 'media', 'slot_id', 'device_name',
'net_type', 'price_model', 'action_time', 'gender', 'age',
'gender_index', 'keyword', 'day', 'aid_bucket']
for name in columns:
self.assertTrue(name in df.columns)
# Verify the number of rows.
# The raw log count has one entry that will be filtered out so adjusted accordingly.
self.assertEqual(df.count(), df_log.count() - 1)
# Helper method for verifying table joins.
def assert_row_value(row, df_match, field_name, join_field):
self.assertEqual(row[field_name], df_match.filter(col(join_field) == row[join_field]).collect()[0][field_name])
# Check the row values.
for row in df.collect():
self.assertTrue(row['slot_id'] in conditions['new_slot_id_list'])
self.assertEqual(row['day'], row['action_time'].split()[0])
self.assertTrue(int(row['aid_bucket']) < bucket_num)
assert_row_value(row, df_persona, 'gender', 'aid')
assert_row_value(row, df_persona, 'age', 'aid')
assert_row_value(row, df_keywords, 'keyword', 'spread_app_id')
assert_row_value(row, df_keywords, 'keyword_index', 'spread_app_id')
assert_row_value(row, df_log, 'adv_id', 'aid')
assert_row_value(row, df_log, 'media', 'aid')
assert_row_value(row, df_log, 'slot_id', 'aid')
assert_row_value(row, df_log, 'device_name', 'aid')
assert_row_value(row, df_log, 'net_type', 'aid')
assert_row_value(row, df_log, 'price_model', 'aid')
assert_row_value(row, df_log, 'action_time', 'aid')
# Runs the tests.
if __name__ == '__main__':
# Run the unit tests.
unittest.main()
| 44.46519 | 142 | 0.647925 | 1,756 | 14,051 | 4.937358 | 0.175968 | 0.02699 | 0.026298 | 0.023529 | 0.526528 | 0.489043 | 0.434833 | 0.410265 | 0.370473 | 0.359285 | 0 | 0.028375 | 0.235001 | 14,051 | 315 | 143 | 44.606349 | 0.778212 | 0.184542 | 0 | 0.4 | 0 | 0 | 0.159035 | 0.007807 | 0 | 0 | 0 | 0 | 0.090476 | 0 | null | null | 0 | 0.038095 | null | null | 0.138095 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f6f688d8a5f12a882c462a7d43fb10da64d6fa89 | 5,009 | py | Python | PiCam/picam.py | alexwtz/pciam | 7e90dd22a8e97e41fb480cb85cdf95cefd2d974e | [
"MIT"
] | null | null | null | PiCam/picam.py | alexwtz/pciam | 7e90dd22a8e97e41fb480cb85cdf95cefd2d974e | [
"MIT"
] | null | null | null | PiCam/picam.py | alexwtz/pciam | 7e90dd22a8e97e41fb480cb85cdf95cefd2d974e | [
"MIT"
] | null | null | null | import serial
import time
import atexit
from functools import wraps
from flask import Flask, render_template, request, Response
app = Flask(__name__)
def check_auth(username, password):
"""This function is called to check if a username /
password combination is valid.
"""
return username == 'admin' and password == 'secret'
def authenticate():
"""Sends a 401 response that enables basic auth"""
return Response(
'Could not verify your access level for that URL.\n'
'You have to login with proper credentials', 401,
{'WWW-Authenticate': 'Basic realm="Login Required"'})
def requires_auth(f):
@wraps(f)
def decorated(*args, **kwargs):
auth = request.authorization
if not auth or not check_auth(auth.username, auth.password):
return authenticate()
return f(*args, **kwargs)
return decorated
# This function maps the angle we want to move the servo to, to the needed PWM value
def angleMap(angle):
value = int((round((2000.0/180.0),0)*angle) -1000)
if(value > 1000):
return 1000
elif(value < -1000):
return -1000
else:
return value
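For angles in the expected 0-180 range, angleMap above reduces to int(11.0 * angle) - 1000 (since round(2000/180) is 11.0), so the clamps never fire at the endpoints. A standalone copy for spot-checking a few values:

```python
def angle_map(angle):
    # Same arithmetic as angleMap above: map 0..180 degrees to a servo pulse offset.
    value = int((round(2000.0 / 180.0, 0) * angle) - 1000)
    return max(-1000, min(1000, value))

print(angle_map(0), angle_map(90), angle_map(180))  # -1000 -10 980
```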
# Create a dictionary called pins to store the pin number, name, and angle
pins = {
23 : {'name' : 'pan', 'angle' : 90},
22 : {'name' : 'tilt', 'angle' : 90}
}
#defines the speed of the movement
speed = 5
#define inverse
paninverse = 1
tiltinverse = -1
def cleanup():
print("Exit app")
def sendPosition(motor, position):
#Initialise the serial interface
s=serial.Serial("/dev/ttyAMA0",9600)
if(s.isOpen()):
s.close()
s.open()
if motor == "pan":
s.write("s0 "+str(position)+" 5\n")
elif motor == "tilt":
s.write("s1 "+str(position)+" 5\n")
s.close()
return "Moved"
def sendBothPosition(positionPan, positionTilt):
#Initialise the serial interface
s=serial.Serial("/dev/ttyAMA0",9600)
if(s.isOpen()):
s.close()
s.open()
s.write("s0 "+str(positionPan)+" 5\n")
s.write("s1 "+str(positionTilt)+" 5\n")
s.close()
return "Moved"
# Load the main form template on webrequest for the root page
@app.route("/")
@requires_auth
def main():
# Create a template data dictionary to send any data to the template
templateData = {
'title' : 'PiCam'
}
# Pass the template data into the template picam.html and return it to the user
return render_template('picam.html', **templateData)
# The function below is executed when someone requests a URL with a move direction
@app.route("/<direction>")
def move(direction):
global speed
global tiltinverse
global paninverse
# Choose the direction of the request
if direction == 'left':
# Increment the angle by speed
na = pins[23]['angle'] + paninverse * speed
# Verify that the new angle is not too great
if int(na) <= 180:
# Change the angle of the servo
sendPosition(pins[23]['name'],angleMap(na))
print("Servo 23 at %s" % (angleMap(na)))
# Store the new angle in the pins dictionary
pins[23]['angle'] = na
else:
pins[23]['angle'] = 180
return str(na) + ' ' + str(angleMap(na))
elif direction == 'reset':
sendBothPosition(0,0)
pins[23]['angle'] = 90
pins[22]['angle'] = 90
return '0 0'
elif direction == 'right':
na = pins[23]['angle'] - paninverse *speed
if na >= 0:
sendPosition(pins[23]['name'],angleMap(na))
print("Servo 23 at %s" % (angleMap(na)))
pins[23]['angle'] = na
else :
pins[23]['angle'] = 0
return str(na) + ' ' + str(angleMap(na))
elif direction == 'up':
na = pins[22]['angle'] + tiltinverse * speed
if na <= 180:
sendPosition(pins[22]['name'],angleMap(na))
print("Servo 22 at %s" % (angleMap(na)))
pins[22]['angle'] = na
else :
pins[22]['angle'] = 180
return str(na) + ' ' + str(angleMap(na))
elif direction == 'down':
na = pins[22]['angle'] - tiltinverse * speed
if na >= 0:
sendPosition(pins[22]['name'],angleMap(na))
print("Servo 22 at %s" % (angleMap(na)))
pins[22]['angle'] = na
else :
pins[22]['angle'] = 0
return str(na) + ' ' + str(angleMap(na))
elif direction == 'speed1':
speed = 1
return str(speed)
elif direction == 'speed5':
speed = 5
return str(speed)
elif direction == 'speed10':
speed = 10
return str(speed)
elif direction == 'speed20':
speed = 20
return str(speed)
# Function to manually set a motor to a specific pluse width
@app.route("/<motor>/<pulsewidth>")
def manual(motor, pulsewidth):
# servoPan/servoTilt are undefined leftovers from an earlier GPIO-based version;
# route manual moves through the serial helper instead.
if motor in ("pan", "tilt"):
sendPosition(motor, int(pulsewidth))
return "Moved"
# Clean everything up when the app exits
atexit.register(cleanup)
if __name__ == "__main__":
app.run(host='0.0.0.0', port=80, debug=True)
| 28.787356 | 84 | 0.609503 | 669 | 5,009 | 4.533632 | 0.28849 | 0.039565 | 0.025387 | 0.025058 | 0.294428 | 0.267722 | 0.234751 | 0.234751 | 0.197824 | 0.197824 | 0 | 0.041267 | 0.25015 | 5,009 | 173 | 85 | 28.953757 | 0.766241 | 0.165502 | 0 | 0.345865 | 0 | 0 | 0.130478 | 0.005229 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.022556 | 0.037594 | null | null | 0.037594 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f6fc765a80d06b184d48204318ac290f02c0daa1 | 756 | py | Python | ozone-framework-python-server/people/urls.py | aamduka/ozone | 3fdbf232f5ea70661204a632e45310ca9d374973 | [
"Apache-2.0"
] | 6 | 2020-02-21T22:06:31.000Z | 2020-12-08T10:48:07.000Z | ozone-framework-python-server/people/urls.py | aamduka/ozone | 3fdbf232f5ea70661204a632e45310ca9d374973 | [
"Apache-2.0"
] | 12 | 2019-12-26T17:38:40.000Z | 2022-02-10T14:15:55.000Z | ozone-framework-python-server/people/urls.py | aamduka/ozone | 3fdbf232f5ea70661204a632e45310ca9d374973 | [
"Apache-2.0"
] | 4 | 2019-08-05T13:22:29.000Z | 2021-07-21T16:04:03.000Z | from django.urls import path
from rest_framework import routers
from .administration.views import AdministrationOfUserAPIView
from .views import PersonDetailView, PersonDashboardsWidgetsView, PersonWidgetDefinitionViewSet, PersonStackViewset
router = routers.SimpleRouter()
router.register(r'admin/users', AdministrationOfUserAPIView)
router.register(r'admin/users-widgets', PersonWidgetDefinitionViewSet, base_name='admin_users-widgets')
urlpatterns = [
path('me/', PersonDetailView.as_view(), name='user-detail'),
path('me/dashboards-widgets/', PersonDashboardsWidgetsView.as_view(), name='user-widgets-dashboards-detail'),
path('admin/users-stacks/', PersonStackViewset.as_view(), name='admin_users-stacks')
]
urlpatterns += router.urls
| 42 | 115 | 0.806878 | 78 | 756 | 7.730769 | 0.410256 | 0.082919 | 0.049751 | 0.066335 | 0.082919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078042 | 756 | 17 | 116 | 44.470588 | 0.865136 | 0 | 0 | 0 | 0 | 0 | 0.201058 | 0.068783 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
f6fee774c750964c054daa25511fd9a0fd69478c | 903 | py | Python | api/views/logs/resources.py | bdeprez/machinaris | 233d6e60ae6526b999fb660d9b87a895341de96f | [
"Apache-2.0"
] | null | null | null | api/views/logs/resources.py | bdeprez/machinaris | 233d6e60ae6526b999fb660d9b87a895341de96f | [
"Apache-2.0"
] | null | null | null | api/views/logs/resources.py | bdeprez/machinaris | 233d6e60ae6526b999fb660d9b87a895341de96f | [
"Apache-2.0"
] | null | null | null | import json
import re
import traceback
from flask import request, make_response, abort
from flask.views import MethodView
from api import app
from api.extensions.api import Blueprint
from api.commands import log_parser
blp = Blueprint(
'Log',
__name__,
url_prefix='/logs',
description="Operations on all logs"
)
@blp.route('/')
class Logs(MethodView):
def get(self):
response = make_response(json.dumps(['alerts', 'farming', 'plotting', 'archiving', 'webui', 'apisrv', 'pooling']), 200)
response.mimetype = "application/json"
return response
@blp.route('/<type>')
class LogByType(MethodView):
def get(self, type):
log = log_parser.get_log_lines(type, log_id=request.args.get('log_id'), blockchain=request.args.get('blockchain'))
response = make_response(log, 200)
response.mimetype = "text/plain"
return response
| 23.763158 | 127 | 0.687708 | 113 | 903 | 5.371681 | 0.469027 | 0.059308 | 0.052718 | 0.065898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008174 | 0.187154 | 903 | 37 | 128 | 24.405405 | 0.818801 | 0 | 0 | 0.074074 | 0 | 0 | 0.14175 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.296296 | 0 | 0.518519 | 0.074074 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
f6ff14536113c487785785cf8713fabd87f3391a | 34,386 | py | Python | W_conv.py | jeongjuns/Control_yolact | dad800dcc1aa0b02445e302256b4508b7688880c | [
"MIT"
] | null | null | null | W_conv.py | jeongjuns/Control_yolact | dad800dcc1aa0b02445e302256b4508b7688880c | [
"MIT"
] | null | null | null | W_conv.py | jeongjuns/Control_yolact | dad800dcc1aa0b02445e302256b4508b7688880c | [
"MIT"
] | null | null | null |
import math
import warnings
import torch
from torch import Tensor
from torch.nn.parameter import Parameter
import torch.nn.functional as F
import torch.nn.init as init
from torch.nn.modules.module import Module
from torch.nn.modules.utils import _single, _pair, _triple, _reverse_repeat_tuple
from torch.autograd import Variable
from torch.nn.common_types import _size_1_t, _size_2_t, _size_3_t
from typing import Optional, List, Tuple
#from torch.nn.modules.conv import _ConvNd
flcnt1=0
flcnt2=0
flcnt3=0
avgcnt1=0
avgcnt2=0
avgcnt3=0
#fpnlatlayercnt=0
flfpnlatlayercnt=0
bboxcnt=0
flbboxcnt=0
confcnt=0
flconfcnt=0
maskcnt=0
flmaskcnt=0
makenetcnt=0
flmakenetcnt=0
segcnt=0
flsegcnt=0
# Modified torch.nn.Conv2d
class W_ConvNd(Module):
__constants__ = ['stride', 'padding', 'dilation', 'groups',
'padding_mode', 'output_padding', 'in_channels',
'out_channels', 'kernel_size']
__annotations__ = {'bias': Optional[torch.Tensor]}
_in_channels: int
out_channels: int
kernel_size: Tuple[int, ...]
stride: Tuple[int, ...]
padding: Tuple[int, ...]
dilation: Tuple[int, ...]
transposed: bool
output_padding: Tuple[int, ...]
groups: int
padding_mode: str
weight: Tensor
bias: Optional[Tensor]
def __init__(self,
in_channels: int,
out_channels: int,
kernel_size: _size_1_t,
stride: _size_1_t,
padding: _size_1_t,
dilation: _size_1_t,
transposed: bool,
output_padding: _size_1_t,
groups: int,
bias: Optional[Tensor],
padding_mode: str) -> None:
super(W_ConvNd, self).__init__()
if in_channels % groups != 0:
raise ValueError('in_channels must be divisible by groups')
if out_channels % groups != 0:
raise ValueError('out_channels must be divisible by groups')
valid_padding_modes = {'zeros', 'reflect', 'replicate', 'circular'}
if padding_mode not in valid_padding_modes:
raise ValueError("padding_mode must be one of {}, but got padding_mode='{}'".format(
valid_padding_modes, padding_mode))
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = kernel_size
self.stride = stride
self.padding = padding
self.dilation = dilation
self.transposed = transposed
self.output_padding = output_padding
self.groups = groups
self.padding_mode = padding_mode
# `_reversed_padding_repeated_twice` is the padding to be passed to
# `F.pad` if needed (e.g., for non-zero padding types that are
# implemented as two ops: padding + conv). `F.pad` accepts paddings in
# reverse order than the dimension.
self._reversed_padding_repeated_twice = _reverse_repeat_tuple(self.padding, 2)
if transposed:
self.weight = Parameter(torch.Tensor(
in_channels, out_channels // groups, *kernel_size))
else:
self.weight = Parameter(torch.Tensor(
out_channels, in_channels // groups, *kernel_size))
if bias:
self.bias = Parameter(torch.Tensor(out_channels))
else:
self.register_parameter('bias', None)
self.reset_parameters()
def reset_parameters(self) -> None:
init.kaiming_uniform_(self.weight, a=math.sqrt(5))
if self.bias is not None:
fan_in, _ = init._calculate_fan_in_and_fan_out(self.weight)
bound = 1 / math.sqrt(fan_in)
init.uniform_(self.bias, -bound, bound)
def extra_repr(self):
s = ('{in_channels}, {out_channels}, kernel_size={kernel_size}'
', stride={stride}')
if self.padding != (0,) * len(self.padding):
s += ', padding={padding}'
if self.dilation != (1,) * len(self.dilation):
s += ', dilation={dilation}'
if self.output_padding != (0,) * len(self.output_padding):
s += ', output_padding={output_padding}'
if self.groups != 1:
s += ', groups={groups}'
if self.bias is None:
s += ', bias=False'
if self.padding_mode != 'zeros':
s += ', padding_mode={padding_mode}'
return s.format(**self.__dict__)
def __setstate__(self, state):
super(W_ConvNd, self).__setstate__(state)
if not hasattr(self, 'padding_mode'):
self.padding_mode = 'zeros'
class W_Conv2d1(W_ConvNd):
def __init__(
self,
in_channels: int,
out_channels: int,
kernel_size: _size_2_t,
stride: _size_2_t = 1,
padding: _size_2_t = 0,
dilation: _size_2_t = 1,
groups: int = 1,
bias: bool = True,
padding_mode: str = 'zeros' # TODO: refine this type
):
kernel_size = _pair(kernel_size)
stride = _pair(stride)
padding = _pair(padding)
dilation = _pair(dilation)
super(W_Conv2d1, self).__init__(
in_channels, out_channels, kernel_size, stride, padding, dilation,
False, _pair(0), groups, bias, padding_mode)
################################################# jj add
self.W1 = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
W_Conv2d1.fl = {}
W_Conv2d1.Wweight={}
################################################# jj end
def _conv_forward(self, input, weight):
if self.padding_mode != 'zeros':
return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
weight, self.bias, self.stride,
_pair(0), self.dilation, self.groups)
return F.conv2d(input, weight, self.bias, self.stride,
self.padding, self.dilation, self.groups)
def forward(self, input: Tensor) -> Tensor:
################################################# jj add
global flcnt1
global avgcnt1
if avgcnt1 == 34:
avgcnt1 = 1
if flcnt1 == 33:
avgcnt1 += 1
for i in range(33,66):
if flcnt1 == i:
W_Conv2d1.fl['{0}'.format(i-33)] = self.weight.clone().detach()
if flcnt1 > 32:
for i in range(1,34):
if avgcnt1 == i:
W_Conv2d1.Wweight['{0}'.format(i)] = mod_compute(W_Conv2d1.fl['{0}'.format(i-1)], self.W1)
if flcnt1 < 66:
flcnt1+=1
if 0 < avgcnt1 < 34:
avgcnt1+=1
if flcnt1 < 34:
return self._conv_forward(input, self.weight)
else :
return self._conv_forward(input, mod_compute(W_Conv2d1.fl['{0}'.format(avgcnt1-2)], self.W1))
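The flcnt/avgcnt bookkeeping above can be read as: calls 0-32 are the first pass over the 33 convs of this type (plain self.weight), calls 33-65 snapshot each layer's weight into fl, and from then on avgcnt cycles 1..33 to select the snapshot for the current layer. A small sketch of the cycling counter under that reading (the layer count of 33 is inferred from the constants in forward):

```python
NUM_LAYERS = 33  # inferred from the 33/34/66 constants used above

def next_slot(avgcnt):
    # Reproduce the avgcnt wrap-around: advance 1..33, then back to 1.
    avgcnt += 1
    if avgcnt == NUM_LAYERS + 1:
        avgcnt = 1
    return avgcnt

slots = []
a = 1
for _ in range(2 * NUM_LAYERS):
    slots.append(a)
    a = next_slot(a)
assert slots[:NUM_LAYERS] == list(range(1, 34))   # first pass hits slots 1..33
assert slots[NUM_LAYERS:] == list(range(1, 34))   # and then wraps around identically
```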
def mod_compute(fl, w):
# seungil modification
if fl.size(3) == 1:
fl = fl.squeeze(-1).squeeze(-1)
fla_tensor = w@fl
fla_tensor = fla_tensor.unsqueeze(-1).unsqueeze(-1)
elif fl.size(3) == 3:
fla_tensor = torch.zeros_like(fl) # match fl's device/dtype so GPU weights can be assigned
for i in range(3):
for j in range(3):
temp = fl[:,:,i,j].squeeze(-1).squeeze(-1)
temp = w@temp
fla_tensor[:,:,i,j] = temp
return fla_tensor
def make_mw(o_size, i_size, k_size):
# seungil modification
mw = torch.eye(o_size)
return mw
################################################# jj end
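Since make_mw returns an identity matrix, the `w @ fl` product inside mod_compute starts out as a no-op on the cached filters, and training is then free to learn a mixing of output channels. A dependency-free sketch of that matrix-product core, with plain lists standing in for the squeezed 1x1-kernel tensors:

```python
def matmul(w, m):
    # Plain-Python matrix product standing in for `w @ fl` in mod_compute.
    return [[sum(w[i][k] * m[k][j] for k in range(len(m))) for j in range(len(m[0]))]
            for i in range(len(w))]

def identity(n):
    # Mirrors make_mw: an n x n identity mixing matrix.
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

fl = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # 3 output channels x 2 input channels
assert matmul(identity(3), fl) == fl        # identity W leaves the cached filter unchanged
```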
class W_Conv2d2(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(W_Conv2d2, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        ################################################# jj add
        self.W2 = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        W_Conv2d2.fl = {}
        W_Conv2d2.Wweight = {}
        ################################################# jj end

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        ################################################# jj add
        global flcnt2
        global avgcnt2
        if avgcnt2 == 34:
            avgcnt2 = 1
        if flcnt2 == 33:
            avgcnt2 += 1
        for i in range(33, 66):
            if flcnt2 == i:
                W_Conv2d2.fl['{0}'.format(i - 33)] = self.weight.clone().detach()
        if flcnt2 > 32:
            for i in range(1, 34):
                if avgcnt2 == i:
                    W_Conv2d2.Wweight['{0}'.format(i)] = mod_compute(W_Conv2d2.fl['{0}'.format(i - 1)], self.W2)
        if flcnt2 < 66:
            flcnt2 += 1
        if 0 < avgcnt2 < 34:
            avgcnt2 += 1
        # if flcnt2 == 66:
        #     print(W_Conv2d2.fl['{0}'.format(32)][0][0])
        if flcnt2 < 34:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(W_Conv2d2.fl['{0}'.format(avgcnt2 - 2)], self.W2))
        ################################################# jj end
class W_Conv2d3(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(W_Conv2d3, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        ################################################# jj add
        self.W3 = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        W_Conv2d3.fl = {}
        W_Conv2d3.Wweight = {}
        ################################################# jj end

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        ################################################# jj add
        global flcnt3
        global avgcnt3
        if avgcnt3 == 34:
            avgcnt3 = 1
        if flcnt3 == 33:
            avgcnt3 += 1
        for i in range(33, 66):
            if flcnt3 == i:
                W_Conv2d3.fl['{0}'.format(i - 33)] = self.weight.clone().detach()
        if flcnt3 > 32:
            for i in range(1, 34):
                if avgcnt3 == i:
                    W_Conv2d3.Wweight['{0}'.format(i)] = mod_compute(W_Conv2d3.fl['{0}'.format(i - 1)], self.W3)
        if flcnt3 < 66:
            flcnt3 += 1
        if 0 < avgcnt3 < 34:
            avgcnt3 += 1
        if flcnt3 < 34:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(W_Conv2d3.fl['{0}'.format(avgcnt3 - 2)], self.W3))
        ################################################# jj end
class bbox_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(bbox_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        bbox_Conv2d.fl = {}
        bbox_Conv2d.Wweight = {}

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        global flbboxcnt
        global bboxcnt
        if bboxcnt == 6:
            bboxcnt = 1
        if flbboxcnt == 5:
            bboxcnt += 1
        for i in range(5, 10):
            if flbboxcnt == i:
                bbox_Conv2d.fl['{0}'.format(i - 5)] = self.weight.clone().detach()
        if flbboxcnt > 4:
            for i in range(1, 6):
                if bboxcnt == i:
                    bbox_Conv2d.Wweight['{0}'.format(i)] = mod_compute(bbox_Conv2d.fl['{0}'.format(i - 1)], self.mw)
        if flbboxcnt < 10:
            flbboxcnt += 1
        if 0 < bboxcnt < 6:
            bboxcnt += 1
        # if flbboxcnt == 10:
        #     print(bbox_Conv2d.fl['{0}'.format(0)][0][0])
        #     print(bbox_Conv2d.Wweight['{0}'.format(1)][0][0])
        if flbboxcnt < 6:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(bbox_Conv2d.fl['{0}'.format(bboxcnt - 2)], self.mw))
class conf_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(conf_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        conf_Conv2d.fl = {}
        conf_Conv2d.Wweight = {}

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        global flconfcnt
        global confcnt
        if confcnt == 6:
            confcnt = 1
        if flconfcnt == 5:
            confcnt += 1
        for i in range(5, 10):
            if flconfcnt == i:
                conf_Conv2d.fl['{0}'.format(i - 5)] = self.weight.clone().detach()
        if flconfcnt > 4:
            for i in range(1, 6):
                if confcnt == i:
                    conf_Conv2d.Wweight['{0}'.format(i)] = mod_compute(conf_Conv2d.fl['{0}'.format(i - 1)], self.mw)
        if flconfcnt < 10:
            flconfcnt += 1
        if 0 < confcnt < 6:
            confcnt += 1
        if flconfcnt < 6:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(conf_Conv2d.fl['{0}'.format(confcnt - 2)], self.mw))
class mask_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(mask_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        mask_Conv2d.fl = {}
        mask_Conv2d.Wweight = {}

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        global flmaskcnt
        global maskcnt
        if maskcnt == 6:
            maskcnt = 1
        if flmaskcnt == 5:
            maskcnt += 1
        for i in range(5, 10):
            if flmaskcnt == i:
                mask_Conv2d.fl['{0}'.format(i - 5)] = self.weight.clone().detach()
        if flmaskcnt > 4:
            for i in range(1, 6):
                if maskcnt == i:
                    mask_Conv2d.Wweight['{0}'.format(i)] = mod_compute(mask_Conv2d.fl['{0}'.format(i - 1)], self.mw)
        if flmaskcnt < 10:
            flmaskcnt += 1
        if 0 < maskcnt < 6:
            maskcnt += 1
        if flmaskcnt < 6:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(mask_Conv2d.fl['{0}'.format(maskcnt - 2)], self.mw))
class makenet_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(makenet_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        makenet_Conv2d.fl = {}
        makenet_Conv2d.Wweight = {}

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        global flmakenetcnt
        global makenetcnt
        if makenetcnt == 11:
            makenetcnt = 1
        if flmakenetcnt == 10:
            makenetcnt += 1
        for i in range(10, 20):
            if flmakenetcnt == i:
                makenet_Conv2d.fl['{0}'.format(i - 10)] = self.weight.clone().detach()
        if flmakenetcnt > 9:
            for i in range(1, 11):
                if makenetcnt == i:
                    makenet_Conv2d.Wweight['{0}'.format(i)] = mod_compute(makenet_Conv2d.fl['{0}'.format(i - 1)], self.mw)
        if flmakenetcnt < 20:
            flmakenetcnt += 1
        if 0 < makenetcnt < 11:
            makenetcnt += 1
        if flmakenetcnt < 11:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(makenet_Conv2d.fl['{0}'.format(makenetcnt - 2)], self.mw))
class seg_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(seg_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        seg_Conv2d.fl = {}
        seg_Conv2d.Wweight = {}

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        global flsegcnt
        global segcnt
        if segcnt == 2:
            segcnt = 1
        if flsegcnt == 1:
            segcnt += 1
        for i in range(1, 2):
            if flsegcnt == i:
                seg_Conv2d.fl['{0}'.format(i - 1)] = self.weight.clone().detach()
        if flsegcnt > 0:
            for i in range(1, 2):
                if segcnt == i:
                    seg_Conv2d.Wweight['{0}'.format(i)] = mod_compute(seg_Conv2d.fl['{0}'.format(i - 1)], self.mw)
        if flsegcnt < 2:
            flsegcnt += 1
        if 0 < segcnt < 2:
            segcnt += 1
        if flsegcnt < 2:
            return self._conv_forward(input, self.weight)
        else:
            return self._conv_forward(input, mod_compute(seg_Conv2d.fl['{0}'.format(segcnt - 2)], self.mw))
class fpn_lat_layers_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(fpn_lat_layers_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        self.fl_2048 = torch.ones(256, 2048, 1, 1)
        self.fl_1024 = torch.ones(256, 1024, 1, 1)
        self.fl_512 = torch.ones(256, 512, 1, 1)
        self.fla_2048 = torch.ones(256, 2048, 1, 1)
        self.fla_1024 = torch.ones(256, 1024, 1, 1)
        self.fla_512 = torch.ones(256, 512, 1, 1)
        self.in_channels = in_channels
        self.cnt = 0

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        if self.cnt < 2:
            self.cnt += 1
        if self.cnt == 2:
            if self.in_channels == 2048:
                self.fl_2048 = self.weight.clone().detach()
            elif self.in_channels == 1024:
                self.fl_1024 = self.weight.clone().detach()
            elif self.in_channels == 512:
                self.fl_512 = self.weight.clone().detach()
            self.cnt += 1
        if self.cnt > 2:
            if self.in_channels == 2048:
                self.fla_2048 = self.fl_2048.squeeze(-1).squeeze(-1)
                self.fla_2048 = self.mw @ self.fla_2048
                self.fla_2048 = self.fla_2048.unsqueeze(-1).unsqueeze(-1)
            elif self.in_channels == 1024:
                self.fla_1024 = self.fl_1024.squeeze(-1).squeeze(-1)
                self.fla_1024 = self.mw @ self.fla_1024
                self.fla_1024 = self.fla_1024.unsqueeze(-1).unsqueeze(-1)
            elif self.in_channels == 512:
                self.fla_512 = self.fl_512.squeeze(-1).squeeze(-1)
                self.fla_512 = self.mw @ self.fla_512
                self.fla_512 = self.fla_512.unsqueeze(-1).unsqueeze(-1)
        # print(self.fl_512[0][0])
        if self.cnt < 2:
            return self._conv_forward(input, self.weight)
        elif self.in_channels == 2048:
            return self._conv_forward(input, self.fla_2048)
        elif self.in_channels == 1024:
            return self._conv_forward(input, self.fla_1024)
        elif self.in_channels == 512:
            return self._conv_forward(input, self.fla_512)
        return self._conv_forward(input, self.weight)
class fpn_pred_layers_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        x_cnt: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(fpn_pred_layers_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        self.fl_512 = torch.ones(256, 256, 3, 3)
        self.fl_1024 = torch.ones(256, 256, 3, 3)
        self.fl_2048 = torch.ones(256, 256, 3, 3)
        self.fla_512 = torch.ones(256, 256, 3, 3)
        self.fla_1024 = torch.ones(256, 256, 3, 3)
        self.fla_2048 = torch.ones(256, 256, 3, 3)
        self.cnt = 0
        self.x_cnt = x_cnt

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        if self.cnt < 2:
            self.cnt += 1
        if self.cnt == 2:
            if self.x_cnt == 512:
                self.fl_512 = self.weight.clone().detach()
            elif self.x_cnt == 1024:
                self.fl_1024 = self.weight.clone().detach()
            elif self.x_cnt == 2048:
                self.fl_2048 = self.weight.clone().detach()
            self.cnt += 1
        if self.cnt > 2:
            if self.x_cnt == 512:
                self.fla_512 = torch.zeros(self.fl_512.size(0), self.fl_512.size(1), 3, 3).cuda()
                for i in range(3):
                    for j in range(3):
                        temp = self.fl_512[:, :, i, j].squeeze(-1).squeeze(-1)
                        temp = self.mw @ temp
                        self.fla_512[:, :, i, j] = temp
            if self.x_cnt == 1024:
                self.fla_1024 = torch.zeros(self.fl_1024.size(0), self.fl_1024.size(1), 3, 3).cuda()
                for i in range(3):
                    for j in range(3):
                        temp = self.fl_1024[:, :, i, j].squeeze(-1).squeeze(-1)
                        temp = self.mw @ temp
                        self.fla_1024[:, :, i, j] = temp
            if self.x_cnt == 2048:
                self.fla_2048 = torch.zeros(self.fl_2048.size(0), self.fl_2048.size(1), 3, 3).cuda()
                for i in range(3):
                    for j in range(3):
                        temp = self.fl_2048[:, :, i, j].squeeze(-1).squeeze(-1)
                        temp = self.mw @ temp
                        self.fla_2048[:, :, i, j] = temp
        if self.cnt < 2:
            return self._conv_forward(input, self.weight)
        elif self.x_cnt == 512:
            return self._conv_forward(input, self.fla_512)
        elif self.x_cnt == 1024:
            return self._conv_forward(input, self.fla_1024)
        elif self.x_cnt == 2048:
            return self._conv_forward(input, self.fla_2048)
        return self._conv_forward(input, self.weight)
class fpn_down_layers_Conv2d(W_ConvNd):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        x_cnt: int,
        kernel_size: _size_2_t,
        stride: _size_2_t = 1,
        padding: _size_2_t = 0,
        dilation: _size_2_t = 1,
        groups: int = 1,
        bias: bool = True,
        padding_mode: str = 'zeros'  # TODO: refine this type
    ):
        kernel_size = _pair(kernel_size)
        stride = _pair(stride)
        padding = _pair(padding)
        dilation = _pair(dilation)
        super(fpn_down_layers_Conv2d, self).__init__(
            in_channels, out_channels, kernel_size, stride, padding, dilation,
            False, _pair(0), groups, bias, padding_mode)
        self.mw = Parameter(make_mw(out_channels, in_channels, kernel_size[0]), requires_grad=True)
        self.fl_0 = torch.ones(2, 2, 2, 2)
        self.fl_1 = torch.ones(2, 2, 2, 2)
        self.fla_0 = torch.ones(2, 2, 2, 2)
        self.fla_1 = torch.ones(2, 2, 2, 2)
        self.cnt = 0
        self.x_cnt = x_cnt

    def _conv_forward(self, input, weight):
        if self.padding_mode != 'zeros':
            return F.conv2d(F.pad(input, self._reversed_padding_repeated_twice, mode=self.padding_mode),
                            weight, self.bias, self.stride,
                            _pair(0), self.dilation, self.groups)
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input: Tensor) -> Tensor:
        if self.cnt < 2:
            self.cnt += 1
        if self.cnt == 2:
            if self.x_cnt == 0:
                self.fl_0 = self.weight.clone().detach()
            elif self.x_cnt == 1:
                self.fl_1 = self.weight.clone().detach()
            self.cnt += 1
        if self.cnt > 2:
            if self.x_cnt == 0:
                self.fla_0 = torch.zeros(self.fl_0.size(0), self.fl_0.size(1), 3, 3).cuda()
                for i in range(3):
                    for j in range(3):
                        temp = self.fl_0[:, :, i, j].squeeze(-1).squeeze(-1)
                        temp = self.mw @ temp
                        self.fla_0[:, :, i, j] = temp
            if self.x_cnt == 1:
                self.fla_1 = torch.zeros(self.fl_1.size(0), self.fl_1.size(1), 3, 3).cuda()
                for i in range(3):
                    for j in range(3):
                        temp = self.fl_1[:, :, i, j].squeeze(-1).squeeze(-1)
                        temp = self.mw @ temp
                        self.fla_1[:, :, i, j] = temp
        if self.cnt < 2:
            return self._conv_forward(input, self.weight)
        elif self.x_cnt == 0:
            return self._conv_forward(input, self.fla_0)
        elif self.x_cnt == 1:
            return self._conv_forward(input, self.fla_1)
        return self._conv_forward(input, self.weight)

# ===== instapyper.py (repo: rriehle/instapyper, license: MIT) =====

# encoding: utf-8
import requests
from requests_oauthlib import OAuth1
class Instapyper:

    # Not sure this dict is necessary or even useful
    status_codes = {
        200: "Ok",
        201: "URL successfully added",
        400: "Bad Request",
        401: "Unauthorized",
        403: "Invalid username or password",
        405: "Method not allowed",
        500: "Service error, try again later",
        504: "Gateway timeout",
    }

    # Likewise unsure of the utility of this dict
    urls = {
        'add': "https://www.instapaper.com/api/add",
        'auth': "https://www.instapaper.com/api/authenticate",
        'bookmarks_list': "https://www.instapaper.com/api/1/bookmarks/list",
        'folders_list': "https://www.instapaper.com/api/1/folders/list",
        'oauth_access_token': "https://www.instapaper.com/api/1/oauth/access_token",
    }

    def __init__(self, user, password):
        self.user = user
        self.password = password

    def add(self, url, title=None, selection=None, redirect=None, jsonp=None):
        parameters = {
            'username': self.user,
            'password': self.password,
            'url': url,
        }
        if title:
            parameters['title'] = title
        if selection:
            parameters['selection'] = selection
        if redirect:
            parameters['redirect'] = redirect
        if jsonp:
            parameters['jsonp'] = jsonp
        # Default the attribute so the return below cannot raise AttributeError
        # when the request itself fails.
        self.response = None
        try:
            self.response = requests.post(
                Instapyper.urls['add'],
                data=parameters,
            )
        except Exception as e:
            print(e)
        return self.response

    def bookmarks_list(self):
        pass

    def oauth(self, jsonp=None):
        '''
        http://docs.python-requests.org/en/master/user/authentication/#oauth-1-authentication
        '''
        auth = OAuth1(
            'APP_KEY',
            'APP_SECRET',
            'USER_OAUTH_TOKEN',
            'USER_OAUTH_TOKEN_SECRET',
        )
        # Default the attribute so the return below cannot raise AttributeError
        # when the request itself fails.
        self.response = None
        try:
            self.response = requests.get(
                Instapyper.urls['oauth_access_token'],
                auth=auth,
            )
        except Exception as e:
            print(e)
        return self.response

# ===== gitea_api/models/internal_tracker.py (repo: awalker125/gitea-api, license: MIT) =====

# coding: utf-8
"""
Gitea API.
This documentation describes the Gitea API. # noqa: E501
OpenAPI spec version: 1.15.3
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
import pprint
import re # noqa: F401
import six
from gitea_api.configuration import Configuration
class InternalTracker(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
"""
"""
Attributes:
swagger_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
swagger_types = {
'allow_only_contributors_to_track_time': 'bool',
'enable_issue_dependencies': 'bool',
'enable_time_tracker': 'bool'
}
attribute_map = {
'allow_only_contributors_to_track_time': 'allow_only_contributors_to_track_time',
'enable_issue_dependencies': 'enable_issue_dependencies',
'enable_time_tracker': 'enable_time_tracker'
}
def __init__(self, allow_only_contributors_to_track_time=None, enable_issue_dependencies=None, enable_time_tracker=None, _configuration=None): # noqa: E501
"""InternalTracker - a model defined in Swagger""" # noqa: E501
if _configuration is None:
_configuration = Configuration()
self._configuration = _configuration
self._allow_only_contributors_to_track_time = None
self._enable_issue_dependencies = None
self._enable_time_tracker = None
self.discriminator = None
if allow_only_contributors_to_track_time is not None:
self.allow_only_contributors_to_track_time = allow_only_contributors_to_track_time
if enable_issue_dependencies is not None:
self.enable_issue_dependencies = enable_issue_dependencies
if enable_time_tracker is not None:
self.enable_time_tracker = enable_time_tracker
@property
def allow_only_contributors_to_track_time(self):
"""Gets the allow_only_contributors_to_track_time of this InternalTracker. # noqa: E501
Let only contributors track time (Built-in issue tracker) # noqa: E501
:return: The allow_only_contributors_to_track_time of this InternalTracker. # noqa: E501
:rtype: bool
"""
return self._allow_only_contributors_to_track_time
@allow_only_contributors_to_track_time.setter
def allow_only_contributors_to_track_time(self, allow_only_contributors_to_track_time):
"""Sets the allow_only_contributors_to_track_time of this InternalTracker.
Let only contributors track time (Built-in issue tracker) # noqa: E501
:param allow_only_contributors_to_track_time: The allow_only_contributors_to_track_time of this InternalTracker. # noqa: E501
:type: bool
"""
self._allow_only_contributors_to_track_time = allow_only_contributors_to_track_time
@property
def enable_issue_dependencies(self):
"""Gets the enable_issue_dependencies of this InternalTracker. # noqa: E501
Enable dependencies for issues and pull requests (Built-in issue tracker) # noqa: E501
:return: The enable_issue_dependencies of this InternalTracker. # noqa: E501
:rtype: bool
"""
return self._enable_issue_dependencies
@enable_issue_dependencies.setter
def enable_issue_dependencies(self, enable_issue_dependencies):
"""Sets the enable_issue_dependencies of this InternalTracker.
Enable dependencies for issues and pull requests (Built-in issue tracker) # noqa: E501
:param enable_issue_dependencies: The enable_issue_dependencies of this InternalTracker. # noqa: E501
:type: bool
"""
self._enable_issue_dependencies = enable_issue_dependencies
@property
def enable_time_tracker(self):
"""Gets the enable_time_tracker of this InternalTracker. # noqa: E501
Enable time tracking (Built-in issue tracker) # noqa: E501
:return: The enable_time_tracker of this InternalTracker. # noqa: E501
:rtype: bool
"""
return self._enable_time_tracker
@enable_time_tracker.setter
def enable_time_tracker(self, enable_time_tracker):
"""Sets the enable_time_tracker of this InternalTracker.
Enable time tracking (Built-in issue tracker) # noqa: E501
:param enable_time_tracker: The enable_time_tracker of this InternalTracker. # noqa: E501
:type: bool
"""
self._enable_time_tracker = enable_time_tracker
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
if issubclass(InternalTracker, dict):
for key, value in self.items():
result[key] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, InternalTracker):
return False
return self.to_dict() == other.to_dict()
def __ne__(self, other):
"""Returns true if both objects are not equal"""
if not isinstance(other, InternalTracker):
return True
return self.to_dict() != other.to_dict()

# ===== mentoring_app/migrations/0007_mentoringprogram_is_published.py (repo: ShouravAhmed/Luminar, license: MIT) =====

# Generated by Django 3.2.7 on 2021-12-28 05:56
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('mentoring_app', '0006_mentoringprogram_is_archived'),
    ]

    operations = [
        migrations.AddField(
            model_name='mentoringprogram',
            name='is_published',
            field=models.BooleanField(default=False),
        ),
    ]

# ===== RecordSpider/beian.py (repo: wjcIvan/oschinaLearning, license: MIT) =====

# coding:utf-8
import sys
from PyQt5 import QtWidgets
import window
import recordSpider
class MainWindow(object):
    def __init__(self):
        app = QtWidgets.QApplication(sys.argv)
        main_window = QtWidgets.QMainWindow()  # renamed locally so it does not shadow the class name
        self.ui = window.Ui_MainWindow()
        self.ui.setupUi(main_window)
        main_window.show()
        # Read the text box contents when the button is clicked
        self.ui.pushButton.clicked.connect(self.click_success)
        sys.exit(app.exec_())

    def click_success(self):
        domain = self.ui.textEdit.toPlainText()
        result = recordSpider.main(domain)
        if result is None:
            self.ui.lineEdit.setText("未找到备案")  # "No ICP filing record found"
            self.ui.lineEdit_2.setText("")
            self.ui.lineEdit_3.setText("")
            self.ui.lineEdit_4.setText("")
            self.ui.lineEdit_5.setText("")
            self.ui.lineEdit_6.setText("")
        else:
            self.ui.lineEdit.setText(result["main"])
            self.ui.lineEdit_2.setText(result["mainType"])
            self.ui.lineEdit_3.setText(result["record"])
            self.ui.lineEdit_4.setText(result["websiteName"])
            self.ui.lineEdit_5.setText(result["websiteHome"])
            self.ui.lineEdit_6.setText(result["time"])


if __name__ == "__main__":
    MainWindow()

# ===== relations/views.py (repo: Mansouroopi/DRF, license: MIT) =====
from snippets.permissions import IsOwnerOrReadOnly
from rest_framework import permissions
from rest_framework import viewsets
from .models import Album, Track, Student, Module
from .serializers import AlbumSerializer, StudentSerializer, ModuleSerializer, TrackSerializer
from rest_framework.response import Response
class AlbumViewSet(viewsets.ModelViewSet):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions.
"""
queryset = Album.objects.all()
serializer_class = AlbumSerializer
permission_classes = [permissions.IsAuthenticatedOrReadOnly,
IsOwnerOrReadOnly]
class TrackViewSet(viewsets.ModelViewSet):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions.
"""
queryset = Track.objects.all()
serializer_class = TrackSerializer
    def create(self, request, *args, **kwargs):
        """
        Override the default create to build the Track from raw request data.
        :param request: incoming HTTP request
        :param args:
        :param kwargs:
        :return: serialized representation of the created track
        """
        track_data = request.data
new_track = Track.objects.create(album=Album.objects.get(id=track_data['album']),
order=track_data['order'],
title=track_data['title'],
duration=track_data['duration'])
new_track.save()
serializer = TrackSerializer(new_track)
return Response(serializer.data)
class StudentViewSet(viewsets.ModelViewSet):
"""
    Student viewset automatically provides `list`, `create`, `retrieve`, `update` and `destroy` actions.
"""
queryset = Student.objects.all()
serializer_class = StudentSerializer
class ModuleViewSet(viewsets.ModelViewSet):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions.
"""
queryset = Module.objects.all()
serializer_class = ModuleSerializer

# python/examples/test_tds.py (dmillard/autogen, MIT)
import pytinydiffsim_ad as dp
import autogen as ag
import numpy as np
import math
TIME_STEPS = 20
def func(input_tau):
world = dp.TinyWorld()
world.friction = dp.ADScalar(1.0)
urdf_parser = dp.TinyUrdfParser()
urdf_data = urdf_parser.load_urdf("/root/tiny-differentiable-simulator/data/cartpole.urdf")
print("robot_name=",urdf_data.robot_name)
# b2vis = meshcat_utils_dp.convert_visuals(urdf_data, "~/tiny-differentiable-simulator/data/laikago/laikago_tex.jpg", vis, "../../data/laikago/")
is_floating=False
mb = dp.TinyMultiBody(is_floating)
urdf2mb = dp.UrdfToMultiBody2()
res = urdf2mb.convert2(urdf_data, world, mb)
dt = dp.ADScalar(1./1000.)
cost_output = np.ones(TIME_STEPS, dtype=ag.ADScalar)
for i in range(TIME_STEPS):
# print(type(input_tau[0]))
# convert to tds type
mb.tau[0] = input_tau[i].value()
dp.forward_dynamics(mb, world.gravity)
dp.integrate_euler(mb, dt)
# convert to ag type
pole_cost = math.sin(mb.q[1].value()) ** 2
cart_cost = (mb.q[0].value() / 2.4) ** 2
total_cost = pole_cost + cart_cost
cost_output[i] = ag.ADScalar(total_cost)
print('cost arr=', cost_output)
return cost_output
f = ag.trace(func, [1.0] * TIME_STEPS)
gen = ag.GeneratedCppAD(f)
x = [2.0] * TIME_STEPS
y = f.forward(x)
print("y = ", y)
J = f.jacobian(x)
print("j = ", J)

# heap/binary_heap_test.py (dyc-it/algorithm, Apache-2.0)
import unittest
import random
from binary_heap import BinaryHeap
class TestBinaryHeap(unittest.TestCase):
def setUp(self):
size = 8
self.random_list = random.sample(range(0, size), size)
        print("random list generated: " + str(self.random_list))
self.heap = BinaryHeap(size)
for key in self.random_list:
if not self.heap.is_full():
self.heap.insert(key)
def tearDown(self):
pass
def test_delete_min(self):
order_list = sorted(self.random_list)
index = 0
while not self.heap.is_empty():
min_value = self.heap.delete_min()
            print(min_value)
self.assertEqual(min_value, order_list[index])
index += 1
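The test above exercises `BinaryHeap` only through `insert`, `is_full`, `is_empty` and `delete_min`. The real `binary_heap` module is not included in this dump, so the following is a minimal stand-in (an assumption, backed by the stdlib `heapq`) that satisfies the same interface and shows the drain order the test expects:

```python
import heapq
import random

# Minimal sketch of the BinaryHeap interface used by TestBinaryHeap;
# the actual binary_heap module may differ internally.
class BinaryHeap(object):
    def __init__(self, capacity):
        self.capacity = capacity
        self._items = []

    def is_full(self):
        return len(self._items) >= self.capacity

    def is_empty(self):
        return len(self._items) == 0

    def insert(self, key):
        heapq.heappush(self._items, key)

    def delete_min(self):
        return heapq.heappop(self._items)

heap = BinaryHeap(8)
for key in random.sample(range(8), 8):
    heap.insert(key)
drained = []
while not heap.is_empty():
    drained.append(heap.delete_min())
print(drained)  # always [0, 1, 2, 3, 4, 5, 6, 7]
```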

# Fred/7_Gestures/Reactions.py (ProjectHewitt/Fred_Inmoov, Apache-2.0)
##############################################################
# Program Code for Fred Inmoov #
# Of the Cyber_One YouTube Channel #
# https://www.youtube.com/cyber_one #
# #
# This is version 5 #
# Divided up into sub programs #
# #
# Running on MyRobotLab (MRL) http://myrobotlab.org/ #
# Fred is a modified Inmoov robot, you can find all the      #
# original files on the Inmoov web site. http://inmoov.fr/   #
# #
# 7_Gestures/Reactions.py #
# This file perform standard actions such as nodding Yes #
# or shaking the head No #
# #
##############################################################
import time
print("Creating the various gestures to make the robot appear alive")
def Yes():
PanTilt(0, -40, 0)
time.sleep(0.3)
PanTilt(0, 30, 0)
time.sleep(0.3)
PanTilt(0, -20, 0)
time.sleep(0.4)
PanTilt(0, 0, 0)
def No():
PanTilt(40, 0, 0)
time.sleep(0.3)
PanTilt(-40, 0, 0)
time.sleep(0.3)
PanTilt(40, 0, 0)
time.sleep(0.3)
PanTilt(0, 0, 0)

# signin/jd_job/common.py (nujabse/simpleSignin, MIT)
#!/usr/bin/env python
# encoding: utf-8
# author: Vincent
# refer: https://github.com/vc5
import re
import time
from requests import Response
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.touch_actions import TouchActions
from ..chrome import mobile_emulation
from lib.settings import MOBILE_UA
class RequestError(Exception):
def __init__(self, message, code: str = None, response: Response = None):
self.message = message
self.code = code
self.response = response
def find_value(pattern, string, default=None, flags=0):
    """
    Search ``string`` with the regex ``pattern`` and return the first capture
    group, or ``default`` when there is no match.
    """
m = re.search(pattern, string, flags)
if m:
return m.group(1)
else:
return default
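`find_value` is a thin wrapper around `re.search` that returns the first capture group or a fallback. A standalone usage sketch (the function is repeated here, with hypothetical input strings, so the snippet runs on its own):

```python
import re

# Same behavior as find_value above: first capture group, or default.
def find_value(pattern, string, default=None, flags=0):
    m = re.search(pattern, string, flags)
    return m.group(1) if m else default

# Hypothetical cookie-like strings for illustration only.
token = find_value(r'token=(\w+)', 'uid=42; token=abc123; lang=zh', default='')
missing = find_value(r'sid=(\w+)', 'uid=42; token=abc123', default='n/a')
print(token, missing)  # abc123 n/a
```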
class Job:
    job_name = '签到任务名称demo'  # demo sign-in job name
    index_url = 'https://bk.jd.com/m/channel/login/daka.html'
    login_url = 'https://home.m.jd.com'
    sign_url = 'https://bk.jd.com/m/channel/login/clock.html'
    test_url = index_url
    job_gb_url = 'https://bk.jd.com/m/channel/login/recDakaGb.html'
    is_mobile = True  # defaults to True: emulate a mobile device when logging in
ua = MOBILE_UA
    # sess = requestium.Session()
    # sess.get(index_url)
    # sess.driver.close()
    # re-initialize the session's driver with either
    #   sess._driver = sess._driver_initializer()  or
    #   sess._driver = None, after which sess.driver.get can be used again
    # to point at a different driver binary:
    #   sess.webdriver_path = "bin/chromedriver_linux64"
def __init__(self, bot):
self.bot = bot
self.session = bot.session
self.job_success = False
self.logger = bot.user.logger
self.session.headers.update({'User-Agent': self.ua})
self.session.webdriver_options.add_argument('user-agent={0}'.format(self.ua))
def run(self):
        self.logger.info('Job Start: {}'.format(self.job_name))
        is_login = self.is_login()
        self.logger.info('Login status: {}'.format(is_login))
        if not is_login:
            self.logger.info('Logging in...')
            try:
                self.login(url=self.login_url)
                is_login = True
                self.logger.info('Login succeeded')
            except Exception as e:
                self.logger.error('Login failed: {}'.format(repr(e)))
if is_login:
if self.is_signed():
self.job_success = True
else:
self.job_success = self.sign()
self.logger.info('Job End.')
def is_login(self):
r = self.session.get(self.test_url, allow_redirects=False)
if r.is_redirect and 'passport' in r.headers['Location']:
return False
else:
return True
def login(self, url):
# cookies = browser.get_cookies(url=self.login_url, signbot=self.bot)
# self.session.cookies.update(cookies)
self.session._driver = None
if self.is_mobile:
self.session.webdriver_options.add_experimental_option("mobileEmulation", mobile_emulation)
driver = self.session.driver
            # simulate touch gestures
# https://seleniumhq.github.io/selenium/docs/api/py/webdriver/selenium.webdriver.common.touch_actions.html
tap_loginbtn = TouchActions(driver)
driver.get(url)
user_input = driver.find_element_by_id('username')
password_input = driver.find_element_by_id('password')
login_btn = driver.find_element_by_id('loginBtn')
user_input.send_keys(self.bot.user.username)
password_input.send_keys(self.bot.user.password)
tap_loginbtn.tap(login_btn).perform()
time.sleep(6)
nickname = driver.find_element_by_css_selector('#myHeader span[class$="name_text"]')
nickname = nickname.text
            self.logger.info('Login succeeded, welcome {}'.format(nickname))
            print('Login succeeded')
else:
self.login_pc(url)
self.session.transfer_driver_cookies_to_session()
self.session.driver.close()
def login_pc(self, url):
driver = self.session.driver
driver.get(url)
nickname = ''
switcher = driver.find_element_by_link_text('账户登录')
switcher.click()
user_input = driver.find_element_by_id('loginname')
password_input = driver.find_element_by_id('nloginpwd')
login_btn = driver.find_element_by_id('loginsubmit')
user_input.send_keys(self.bot.user.username)
password_input.send_keys(self.bot.user.password)
login_btn.click()
time.sleep(6)
try:
nickname = driver.find_element_by_css_selector('#shortcut-2014 a[class=nickname]')
nickname = nickname.text
            self.logger.info('Login succeeded, welcome {}'.format(nickname))
        except NoSuchElementException:
            self.logger.warning('Login failed unexpectedly; check whether a captcha is required')
return nickname
    def is_signed(self):
        '''
        Check whether the sign-in has already been done.
        :return: True if already signed in, otherwise False
        '''
        return False
    def sign(self):
        '''
        Perform the sign-in.
        :return:
        '''
        pass
    def report(self):
        '''
        Report the sign-in result.
        :return: the result string to show the user
        '''
        return ''

# Nmapscript.py (WhiteRedTHT/port-scann, MIT)
# -*- coding: utf-8 -*-
import os
import vulners
print("---------------------------------------------------------------")
print(""" Welcome to the P4RS Script
To use the program, just type the IP address.
To run the program you need the Medusa tools, searchsploit and brutespray. The BruteSpray application is provided here; the others you must install yourself.
""")
print("---------------------------------------------------------------")
hedef_ip = input("Please enter an IP address: ")
b = os.system("nmap " + hedef_ip + " -oX output.xml -v ")
c = os.system("searchsploit -v output.xml")
def vulners_scan():  # renamed: "vulners" would shadow the imported vulners module
    api = input("Enter your API key: ")
    servis = input("Enter the service you want to scan: ")
    vulners_api = vulners.Vulners(api_key=api)  # pass the variable, not the literal "api"
    tarama = vulners_api.search(servis, limit=3)  # search the entered service name
    print(tarama)
d = os.system("./brutespray.py " "--file output.xml -U user.txt -P wordlist.txt --threads 5 --hosts 5")

# iu_mongo/session.py (intelligenceunion/mongo-driver, MIT)
from pymongo.read_concern import ReadConcern
from pymongo.read_preferences import ReadPreference
from pymongo.write_concern import WriteConcern
from pymongo.errors import InvalidOperation
from iu_mongo.errors import TransactionError
__all__ = ['Session', 'TransactionContext']
DEFAULT_READ_CONCERN = ReadConcern('majority')
DEFAULT_WRITE_CONCERN = WriteConcern(w='majority', wtimeout=5000)
DEFAULT_READ_PREFERENCE = ReadPreference.PRIMARY
class TransactionContext(object):
def __init__(self, pymongo_transaction_context, pymongo_session):
self._pymongo_transaction_context = pymongo_transaction_context
self._pymongo_session = pymongo_session
def __enter__(self):
self._pymongo_transaction_context.__enter__()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
self._pymongo_transaction_context.__exit__(exc_type, exc_val, exc_tb)
@property
def _transaction(self):
return self._pymongo_session._transaction
@property
def transaction_id(self):
return self._transaction.transaction_id
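`TransactionContext` forwards the context-manager protocol to the pymongo transaction context it wraps. The delegation pattern can be sketched standalone, with hypothetical `Inner`/`Wrapper` classes in place of the pymongo objects:

```python
# Standalone sketch of the delegation pattern used by TransactionContext:
# the wrapper forwards __enter__/__exit__ to an inner context manager.
class Inner(object):
    def __init__(self):
        self.entered = False
        self.exited = False

    def __enter__(self):
        self.entered = True
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.exited = True
        return False  # do not suppress exceptions

class Wrapper(object):
    def __init__(self, inner):
        self._inner = inner

    def __enter__(self):
        self._inner.__enter__()
        return self  # hand the caller the wrapper, not the inner object

    def __exit__(self, exc_type, exc_val, exc_tb):
        return self._inner.__exit__(exc_type, exc_val, exc_tb)

inner = Inner()
with Wrapper(inner):
    pass
print(inner.entered, inner.exited)  # True True
```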
class Session(object):
def __init__(self, pymongo_client_session):
self._pymongo_client_session = pymongo_client_session
@property
def pymongo_session(self):
return self._pymongo_client_session
@property
def pymongo_client(self):
return self._pymongo_client_session.client
@property
def session_id(self):
return self._pymongo_client_session.session_id
def start_transaction(self):
try:
pymongo_transaction_context = self._pymongo_client_session.start_transaction(
read_concern=DEFAULT_READ_CONCERN,
write_concern=DEFAULT_WRITE_CONCERN,
read_preference=DEFAULT_READ_PREFERENCE
)
return TransactionContext(pymongo_transaction_context, self._pymongo_client_session)
except InvalidOperation as e:
raise TransactionError(str(e))
def abort_transaction(self):
try:
self._pymongo_client_session.abort_transaction()
except InvalidOperation as e:
raise TransactionError(str(e))
def commit_transaction(self):
self._pymongo_client_session.commit_transaction()
def __enter__(self):
self._pymongo_client_session.__enter__()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
self._pymongo_client_session.__exit__(exc_type, exc_val, exc_tb)

# bin/ingredient_phrase_tagger/training/cli.py (Deekshith1994/Recipes, Apache-2.0)
import re
import decimal
import optparse
import pandas as pd
from ingredient_phrase_tagger.training import utils
class Cli(object):
def __init__(self, argv):
self.opts = self._parse_args(argv)
self._upstream_cursor = None
def run(self):
self.generate_data(self.opts.count, self.opts.offset)
def generate_data(self, count, offset):
"""
Generates training data in the CRF++ format for the ingredient
tagging task
"""
df = pd.read_csv(self.opts.data_path)
df = df.fillna("")
start = int(offset)
end = int(offset) + int(count)
df_slice = df.iloc[start: end]
for index, row in df_slice.iterrows():
try:
# extract the display name
display_input = utils.cleanUnicodeFractions(row["input"])
tokens = utils.tokenize(display_input)
del(row["input"])
rowData = self.addPrefixes([(t, self.matchUp(t, row)) for t in tokens])
for i, (token, tags) in enumerate(rowData):
features = utils.getFeatures(token, i+1, tokens)
                    print(utils.joinLine([token] + features + [self.bestTag(tags)]))
# ToDo: deal with this
except UnicodeDecodeError:
pass
            print()  # blank line separates records in the CRF++ format
def parseNumbers(self, s):
"""
Parses a string that represents a number into a decimal data type so that
we can match the quantity field in the db with the quantity that appears
in the display name. Rounds the result to 2 places.
"""
ss = utils.unclump(s)
m3 = re.match('^\d+$', ss)
if m3 is not None:
return decimal.Decimal(round(float(ss), 2))
m1 = re.match(r'(\d+)\s+(\d)/(\d)', ss)
if m1 is not None:
num = int(m1.group(1)) + (float(m1.group(2)) / float(m1.group(3)))
return decimal.Decimal(str(round(num,2)))
m2 = re.match(r'^(\d)/(\d)$', ss)
if m2 is not None:
num = float(m2.group(1)) / float(m2.group(2))
return decimal.Decimal(str(round(num,2)))
return None
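`parseNumbers` recognizes three quantity spellings: whole numbers, mixed numbers such as `1 1/2`, and plain fractions such as `3/4`. Because it depends on `utils.unclump`, the regex branches are repeated here on already-unclumped strings so the logic can be checked in isolation:

```python
import re
import decimal

# Same three branches as parseNumbers, applied to an already-unclumped string.
def parse_quantity(ss):
    if re.match(r'^\d+$', ss):                      # whole number, e.g. "2"
        return decimal.Decimal(round(float(ss), 2))
    m1 = re.match(r'(\d+)\s+(\d)/(\d)', ss)         # mixed number, e.g. "1 1/2"
    if m1 is not None:
        num = int(m1.group(1)) + (float(m1.group(2)) / float(m1.group(3)))
        return decimal.Decimal(str(round(num, 2)))
    m2 = re.match(r'^(\d)/(\d)$', ss)               # plain fraction, e.g. "3/4"
    if m2 is not None:
        num = float(m2.group(1)) / float(m2.group(2))
        return decimal.Decimal(str(round(num, 2)))
    return None

print(parse_quantity('2'), parse_quantity('1 1/2'), parse_quantity('3/4'))
# 2 1.5 0.75
```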
def matchUp(self, token, ingredientRow):
"""
Returns our best guess of the match between the tags and the
words from the display text.
This problem is difficult for the following reasons:
* not all the words in the display name have associated tags
* the quantity field is stored as a number, but it appears
as a string in the display name
* the comment is often a compilation of different comments in
the display name
"""
ret = []
# strip parens from the token, since they often appear in the
# display_name, but are removed from the comment.
token = utils.normalizeToken(token)
decimalToken = self.parseNumbers(token)
for key, val in ingredientRow.iteritems():
if isinstance(val, basestring):
for n, vt in enumerate(utils.tokenize(val)):
if utils.normalizeToken(vt) == token:
ret.append(key.upper())
elif decimalToken is not None:
try:
if val == decimalToken:
ret.append(key.upper())
except:
pass
return ret
def addPrefixes(self, data):
"""
We use BIO tagging/chunking to differentiate between tags
at the start of a tag sequence and those in the middle. This
is a common technique in entity recognition.
Reference: http://www.kdd.cis.ksu.edu/Courses/Spring-2013/CIS798/Handouts/04-ramshaw95text.pdf
"""
prevTags = None
newData = []
for n, (token, tags) in enumerate(data):
newTags = []
for t in tags:
p = "B" if ((prevTags is None) or (t not in prevTags)) else "I"
newTags.append("%s-%s" % (p, t))
newData.append((token, newTags))
prevTags = tags
return newData
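The BIO rule in `addPrefixes` can be seen on a tiny hand-made example (the tokens below are made up for illustration): a tag gets a `B-` prefix when it starts a run and `I-` when the previous token carried the same tag.

```python
# Standalone re-statement of the BIO prefixing rule from addPrefixes.
def add_prefixes(data):
    prev_tags = None
    new_data = []
    for token, tags in data:
        new_tags = []
        for t in tags:
            # "B" begins a run; "I" continues one already open on the previous token.
            p = "B" if (prev_tags is None) or (t not in prev_tags) else "I"
            new_tags.append("%s-%s" % (p, t))
        new_data.append((token, new_tags))
        prev_tags = tags
    return new_data

tagged = add_prefixes([("fresh", ["COMMENT"]),
                       ("thyme", ["NAME"]),
                       ("leaves", ["NAME"])])
print(tagged)
# [('fresh', ['B-COMMENT']), ('thyme', ['B-NAME']), ('leaves', ['I-NAME'])]
```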
def bestTag(self, tags):
if len(tags) == 1:
return tags[0]
# if there are multiple tags, pick the first which isn't COMMENT
else:
for t in tags:
if (t != "B-COMMENT") and (t != "I-COMMENT"):
return t
# we have no idea what to guess
return "OTHER"
def _parse_args(self, argv):
"""
Parse the command-line arguments into a dict.
"""
opts = optparse.OptionParser()
opts.add_option("--count", default="100", help="(%default)")
opts.add_option("--offset", default="0", help="(%default)")
opts.add_option("--data-path", default="nyt-ingredients-snapshot-2015.csv", help="(%default)")
(options, args) = opts.parse_args(argv)
return options
| 30.730061 | 102 | 0.553404 | 618 | 5,009 | 4.444984 | 0.37055 | 0.014561 | 0.030579 | 0.029123 | 0.040772 | 0.023298 | 0.023298 | 0 | 0 | 0 | 0 | 0.01277 | 0.343382 | 5,009 | 162 | 103 | 30.919753 | 0.822438 | 0.049112 | 0 | 0.117647 | 0 | 0 | 0.046682 | 0.00928 | 0 | 0 | 0 | 0.006173 | 0 | 0 | null | null | 0.023529 | 0.058824 | null | null | 0.023529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
12004752447a3c322e24283675f1768106bc32b5 | 761 | py | Python | wagtailrelated/utils.py | torchbox/wagtail-related | b0e811e5093d6d5c1554cab66f3c5097d6d474ad | [
"BSD-3-Clause"
] | 2 | 2018-09-03T15:09:30.000Z | 2020-11-29T02:23:01.000Z | wagtailrelated/utils.py | torchbox/wagtail-related | b0e811e5093d6d5c1554cab66f3c5097d6d474ad | [
"BSD-3-Clause"
] | null | null | null | wagtailrelated/utils.py | torchbox/wagtail-related | b0e811e5093d6d5c1554cab66f3c5097d6d474ad | [
"BSD-3-Clause"
] | null | null | null | from bs4 import BeautifulSoup
from wagtail.core.fields import StreamField
def extract_text(obj):
"""Extracts data, concatenates and removes html tags
from fields listed in a obj.related_source_fields list.
"""
related_source_fields = getattr(obj._meta.model, 'related_source_fields', None)
if not related_source_fields:
return
html_pieces = []
for source_field in related_source_fields:
field = source_field.get_field(obj.__class__)
field_value = source_field.get_value(obj)
if isinstance(field, StreamField):
field_value = ' '.join(field_value)
html_pieces.append(field_value)
text = ' '.join(html_pieces)
text = BeautifulSoup(text, 'html5lib').text
return text

# codes/globo_videos_cuts/core/tests/models/programs_model_test_case.py (lariodiniz/teste_meta, MIT)
# coding: utf-8
__author__ = "Lário dos Santos Diniz"
from django.test import TestCase
from model_mommy import mommy
from core.models import Programs
class ProgramsModelTestCase(TestCase):
    """Tests for the Programs model."""
def setUp(self):
"""
Initial Test Settings
"""
self.program = mommy.make(Programs)
def tearDown(self):
"""Final method"""
self.program.delete()
def test_there_are_fields(self):
"""test the fields the model"""
self.assertTrue('title' in dir(Programs), 'Class Program does not have the field title')
self.assertTrue('start_time' in dir(Programs), 'Class Program does not have the field start_time')
self.assertTrue('end_time' in dir(Programs), 'Class Program does not have the field end_time')
def test_there_is_a_program(self):
"""test if you are creating a Program correctly"""
        self.assertEqual(Programs.objects.count(), 1)
        self.assertEqual(Programs.objects.all()[0].title, self.program.title)
        self.assertEqual(Programs.objects.all()[0].start_time, self.program.start_time)
        self.assertEqual(Programs.objects.all()[0].end_time, self.program.end_time)

# test_phase_equilibrium.py (khuston/interfacial_transport, MIT)
import numpy as np
import logging
from interfacial_transport import compute_one_phase_equilibrium, compute_mass_balance_one_phase
from interfacial_transport import compute_two_phase_equilibrium, compute_mass_balance_two_phase
from numpy.testing import assert_almost_equal
logger = logging.getLogger(__name__)
# Check one-phase equilibrium for errors
def test_no_errors_one_phase_equilibrium_interface():
alpha = 6.86e-6 # 1/s
beta = 22.1 # m**3/mol/s
n = 0.460 #
kappa = 10.3 #
D = 3.8e-10 # m**2/s
Gamma_infty = 2.25e-6 # mol/m**2
R = 1.9e-3
a = alpha/beta
c0 = 0.0006
times_to_save = np.logspace(-3., 4.)
r, c_save, Gamma_save = compute_one_phase_equilibrium(D, c0, R, a, Gamma_infty, kappa, n, times_to_save)
assert(c_save.ndim == 2)
assert(Gamma_save.ndim == 1)
assert(r.ndim == 1)
assert(len(Gamma_save) == len(times_to_save))
assert(len(c_save) == len(times_to_save))
assert(len(r) == c_save.shape[1])
def test_no_errors_two_phase_equilibrium_interface():
alpha = 6.86e-6 # 1/s
beta = 22.1 # m**3/mol/s
n = 0.460 #
kappa = 10.3 #
DA = 3.8e-10/48. # m**2/s
DB = 3.8e-10 # m**2/s
Gamma_infty = 2.25e-6 # mol/m**2
R = 1.9e-3
aB = alpha/beta
aA = alpha/beta*1.45
c0A = 0.
c0B = 0.0006
times_to_save = np.logspace(-3., 4.)
rA, rB, cA_save, cB_save, Gamma_save = compute_two_phase_equilibrium(DA, DB, c0A, c0B, R, aA, aB, Gamma_infty, kappa, n, times_to_save)
assert(cA_save.ndim == 2)
assert(cB_save.ndim == 2)
assert(Gamma_save.ndim == 1)
assert(rA.ndim == 1)
assert(rB.ndim == 1)
assert(len(Gamma_save) == len(times_to_save))
assert(len(cA_save) == len(times_to_save))
assert(len(cB_save) == len(times_to_save))
assert(len(rA) == cA_save.shape[1])
assert(len(rB) == cB_save.shape[1])
def test_no_errors_compute_mass_balance_one_phase():
Gamma = 1.
c = np.ones(5)
r = np.linspace(1., 2., 5)
compute_mass_balance_one_phase(Gamma, c, r, r[0])
Gamma = np.ones(10)
c = np.ones((10, 20))
r = np.linspace(1., 2., 20)
compute_mass_balance_one_phase(Gamma, c, r, r[0])
def test_no_errors_compute_mass_balance_two_phase():
Gamma = 1.
cA = np.ones(5)
rA = np.linspace(0., 1., 5)
cB = np.ones(5)
rB = np.linspace(1., 2., 5)
compute_mass_balance_two_phase(Gamma, cA, rA, cA, rB, rB[0])
Gamma = np.ones(10)
cA = np.ones((10, 20))
rA = np.linspace(0., 1., 20)
cB = np.ones((10, 20))
rB = np.linspace(1., 2., 20)
compute_mass_balance_two_phase(Gamma, cA, rA, cB, rB, rB[0])
def test_one_phase_mass_balance_accuracy():
alpha = 6.86e-6 # 1/s
beta = 22.1 # m**3/mol/s
n = 0.460 #
kappa = 10.3 #
D = 3.8e-10 # m**2/s
Gamma_infty = 2.25e-6 # mol/m**2
R = 1.9e-3
a = alpha/beta
c0 = 0.0006
times_to_save = np.logspace(-3., 4.)
r, c_save, Gamma_save = compute_one_phase_equilibrium(D, c0, R, a, Gamma_infty, kappa, n, times_to_save)
compute_mass_balance_one_phase(Gamma_save, c_save, r, R)
error = (compute_mass_balance_one_phase(Gamma_save, c_save, r, R) - compute_mass_balance_one_phase(0., np.array([0.]+[c0]*(len(r)-1)), r, R))
assert_almost_equal(error[-1]/Gamma_save[-1], 0., decimal=2)
def test_two_phase_mass_balance_accuracy():
alpha = 6.86e-6 # 1/s
beta = 22.1 # m**3/mol/s
n = 0.460 #
kappa = 10.3 #
DA = 3.8e-10/48. # m**2/s
DB = 3.8e-10 # m**2/s
Gamma_infty = 2.25e-6 # mol/m**2
R = 1.9e-3
aB = alpha/beta
aA = alpha/beta*1.45
c0A = 0.
c0B = 0.0006
times_to_save = np.logspace(-3., 4.)
rA, rB, cA_save, cB_save, Gamma_save = compute_two_phase_equilibrium(DA, DB, c0A, c0B, R, aA, aB, Gamma_infty, kappa, n, times_to_save)
compute_mass_balance_two_phase(Gamma_save, cA_save, rA, cB_save, rB, R)
logger.info('rA={}'.format(rA))
logger.info('rB={}'.format(rB))
logger.info('cA={}'.format(cA_save))
logger.info('cB={}'.format(cB_save))
error = (compute_mass_balance_two_phase(Gamma_save, cA_save, rA, cB_save, rB, R) - compute_mass_balance_two_phase(0., np.array([c0A]*(len(rA)-1)+[0.]), rA, np.array([0.]+[c0B]*(len(rB)-1)), rB, R))
assert_almost_equal(error[-1]/Gamma_save[-1], 0., decimal=2)
def test_validate_one_phase_equilibrium_interface():
pass
| 31.909091 | 201 | 0.60881 | 782 | 4,563 | 3.309463 | 0.112532 | 0.068006 | 0.097372 | 0.056801 | 0.803323 | 0.698609 | 0.688563 | 0.618238 | 0.565688 | 0.53864 | 0 | 0.073051 | 0.238001 | 4,563 | 142 | 202 | 32.133803 | 0.671268 | 0.038571 | 0 | 0.563636 | 0 | 0 | 0.004588 | 0 | 0 | 0 | 0 | 0 | 0.172727 | 1 | 0.063636 | false | 0.009091 | 0.045455 | 0 | 0.109091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1205020ea522f7b1fd7bb580cbb7689312826db5 | 3,914 | py | Python | testbed1/scratchnet_single_switch_test.py | Benny93/SDNMininetScripts | 16844ac1cecfdf27e25ca0f758029c81030b9ba1 | [
"MIT"
] | null | null | null | testbed1/scratchnet_single_switch_test.py | Benny93/SDNMininetScripts | 16844ac1cecfdf27e25ca0f758029c81030b9ba1 | [
"MIT"
] | null | null | null | testbed1/scratchnet_single_switch_test.py | Benny93/SDNMininetScripts | 16844ac1cecfdf27e25ca0f758029c81030b9ba1 | [
"MIT"
] | null | null | null | #!/usr/bin/python
"""
Build a simple network from scratch, using mininet primitives.
This is more complicated than using the higher-level classes,
but it exposes the configuration details and allows customization.
For most tasks, the higher-level API will be preferable.
"""
import csv
import sys
import time
from mininet.net import Mininet
from mininet.node import Node
from mininet.link import Link
from mininet.log import setLogLevel, info
from mininet.util import quietRun
import pingparser
CTLR_IP = '2017:db8::ffaa'
CTLR_PRT = '6653'
# 0: Step-wise testing, 1: Continuous testing
mode = 1
def stop_net(controller, cname, switch):
info("*** Stopping network\n")
controller.cmd('kill %' + cname)
switch.cmd('ovs-vsctl del-br dp0')
switch.deleteIntfs()
info('Net was removed\n')
def scratchNet(cname='controller', cargs='-v ptcp:'):
"Create network from scratch using Open vSwitch."
info("*** Creating nodes\n")
controller = Node('c0', inNamespace=False)
switch = Node('s0', inNamespace=False)
h0 = Node('h0')
h1 = Node('h1')
info("*** Creating links\n")
Link(h0, switch)
Link(h1, switch)
info("*** Configuring hosts\n")
h0.setIP('192.168.123.1/24')
h1.setIP('192.168.123.2/24')
info(str(h0) + '\n')
info(str(h1) + '\n')
info("*** Starting network using Open vSwitch\n")
controller.cmd(cname + ' ' + cargs + '&')
switch.cmd('ovs-vsctl del-br dp0')
switch.cmd('ovs-vsctl add-br dp0')
for intf in switch.intfs.values():
print switch.cmd('ovs-vsctl add-port dp0 %s' % intf)
# Note: controller and switch are in root namespace, and we
# can connect via loopback interface
s_cmd = 'ovs-vsctl set-controller dp0 tcp:[{}]:{}'.format(CTLR_IP, CTLR_PRT)
print s_cmd
switch.cmd(s_cmd)
ping_results = ['received,host,jitter,packet_loss,avgping,minping,time,sent,maxping\n']
try:
h0.cmd('echo "" > pings.txt')
if mode == 0:
step_wise_testing(h0, h1, ping_results)
else:
continuous_testing(h0, h1, ping_results)
except KeyboardInterrupt:
print "Warning: Caught KeyboardInterrupt, stopping network"
tm_local = time.localtime()
dt = time.gmtime()
file_name = 'pings_{}_{}_{}-{}_{}_{}.csv'.format(dt.tm_year, dt.tm_mon, dt.tm_mday, tm_local.tm_hour, tm_local.tm_min, tm_local.tm_sec)
f = open(file_name, 'w+')
for item in ping_results:
f.write(item)
stop_net(controller, cname, switch)
def step_wise_testing(h0, h1, ping_results):
while True:
if 'is_connected' not in quietRun('ovs-vsctl show'):
wait_for_controller_connection()
print "Press ENTER to execute Test\n"
line = sys.stdin.readline()
if line:
info("Key Input Accepted\n")
ping_test(h0, h1, ping_results)
def continuous_testing(h0, h1, ping_results):
while True:
if 'is_connected' not in quietRun('ovs-vsctl show'):
wait_for_controller_connection()
ping_test(h0, h1, ping_results)
time.sleep(1)
def ping_test(h0, h1, ping_results):
info("*** Running test\n")
ping_res = h0.cmdPrint('ping -c1 ' + h1.IP())
ping_res = pingparser.parse(ping_res)
tm_local = time.localtime()
ping_res['time'] = '{}:{}:{}'.format(tm_local.tm_hour, tm_local.tm_min, tm_local.tm_sec)
val_string = ','.join(ping_res.itervalues())
ping_results.append(val_string + "\n")
print ping_res
info("*** Sleep\n")
def wait_for_controller_connection():
info('*** Waiting for switch to connect to controller')
while 'is_connected' not in quietRun('ovs-vsctl show'):
time.sleep(1)
info('.')
info('Connected \n')
if __name__ == '__main__':
setLogLevel('info')
info('*** Scratch network demo (kernel datapath)\n')
Mininet.init()
scratchNet()
| 29.877863 | 143 | 0.648441 | 546 | 3,914 | 4.5 | 0.368132 | 0.04477 | 0.022792 | 0.042735 | 0.245421 | 0.208791 | 0.163614 | 0.148148 | 0.108262 | 0.108262 | 0 | 0.024081 | 0.21487 | 3,914 | 130 | 144 | 30.107692 | 0.775464 | 0.038835 | 0 | 0.148936 | 0 | 0 | 0.247358 | 0.027135 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.095745 | null | null | 0.053191 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
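The ping_test routine in the mininet script above hands raw `ping` output to the third-party pingparser module. As a rough stdlib-only illustration of what that parsing involves (the function name and field keys here are assumptions for the sketch, not pingparser's actual API), the rtt summary line can be extracted with a regex:

```python
import re

def parse_ping_summary(output):
    # Matches the Linux "rtt min/avg/max/mdev = a/b/c/d ms" summary line.
    m = re.search(r"= ([\d.]+)/([\d.]+)/([\d.]+)/([\d.]+) ms", output)
    if m is None:
        return None
    keys = ("minping", "avgping", "maxping", "jitter")
    return dict(zip(keys, m.groups()))

sample = "rtt min/avg/max/mdev = 0.042/0.055/0.068/0.013 ms"
# parse_ping_summary(sample)["avgping"] == "0.055"
```

A real deployment would also pull packet-loss and sent/received counts, as the CSV header in the script expects.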
12192a5c4a0869098ef5a2f6114fc9d572aaa5ab | 2,101 | py | Python | training/pl_logger.py | stdiff/emo-classifier | 211731a44022408c750b611383216ce0578f2d41 | [
"MIT"
] | null | null | null | training/pl_logger.py | stdiff/emo-classifier | 211731a44022408c750b611383216ce0578f2d41 | [
"MIT"
] | null | null | null | training/pl_logger.py | stdiff/emo-classifier | 211731a44022408c750b611383216ce0578f2d41 | [
"MIT"
] | null | null | null | from typing import Dict, Any, Optional
import pandas as pd
from pytorch_lightning.loggers import LightningLoggerBase
from pytorch_lightning.loggers.base import rank_zero_experiment
from pytorch_lightning.utilities import rank_zero_only
class SimpleLogger(LightningLoggerBase):
def __init__(self):
super().__init__()
self.metrics = []
self.params: Optional[Dict[str, Any]] = None
@property
def name(self):
return "simple_logger"
def flush(self):
self.metrics = []
self.params = None
@property
@rank_zero_experiment
def experiment(self):
# Return the experiment object associated with this logger.
pass
@property
def version(self):
# Return the experiment version, int or str.
return "0.1"
@rank_zero_only
def log_hyperparams(self, params):
self.params = params
@rank_zero_only
def log_metrics(self, metrics: Dict[str, Any], step):
## metrics = {"metric_name": 0.123, "epoch": 9}
## step will not be cleared at the end of an epoch. It is just increasing
# print(metrics)
metrics["step"] = step
self.metrics.append(metrics)
@rank_zero_only
def save(self):
# Optional. Any code necessary to save logger data goes here
# If you implement this, remember to call `super().save()`
# at the start of the method (important for aggregation of metrics)
super().save()
@rank_zero_only
def finalize(self, status):
# Optional. Any code that needs to be run after training
# finishes goes here
pass
def get_history(self) -> pd.DataFrame:
rows = []
for metric in self.metrics:
metric_name = [key for key in metric.keys() if key not in ("epoch", "step")][0]
row = {
"epoch": metric["epoch"],
"step": metric["step"],
"metric": metric_name,
"value": metric[metric_name],
}
rows.append(row)
return pd.DataFrame(rows)
| 29.180556 | 91 | 0.611614 | 256 | 2,101 | 4.890625 | 0.394531 | 0.044728 | 0.047923 | 0.047923 | 0.028754 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005394 | 0.294146 | 2,101 | 71 | 92 | 29.591549 | 0.83884 | 0.231794 | 0 | 0.229167 | 0 | 0 | 0.036227 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208333 | false | 0.041667 | 0.104167 | 0.041667 | 0.395833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
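The get_history method in pl_logger.py above pivots the logged metric dictionaries into long-format rows before building a DataFrame. The reshaping step can be sketched without pandas or PyTorch Lightning (the helper name is illustrative, not part of the original module):

```python
def metrics_to_rows(metrics):
    # Each logged entry looks like {"train_loss": 0.5, "epoch": 0, "step": 10}:
    # keep epoch/step as identifiers and emit one row per metric value.
    rows = []
    for metric in metrics:
        name = [k for k in metric if k not in ("epoch", "step")][0]
        rows.append({
            "epoch": metric["epoch"],
            "step": metric["step"],
            "metric": name,
            "value": metric[name],
        })
    return rows

rows = metrics_to_rows([{"train_loss": 0.5, "epoch": 0, "step": 10}])
# rows[0]["metric"] == "train_loss"
```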
12229d09733f5a8668d4db1077a04761f8c6c6e6 | 1,234 | py | Python | pdfencrypter.py | kamimura/pdfendecrypter | 0fc93b8234d7e94cd42267eb05f97ba398f08e1e | [
"MIT"
] | null | null | null | pdfencrypter.py | kamimura/pdfendecrypter | 0fc93b8234d7e94cd42267eb05f97ba398f08e1e | [
"MIT"
] | null | null | null | pdfencrypter.py | kamimura/pdfendecrypter | 0fc93b8234d7e94cd42267eb05f97ba398f08e1e | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
import sys
import PyPDF2
length = len(sys.argv)
if length >= 2:
PASSWORD = sys.argv[1]
else:
print('usage: cmd password [path]')
sys.exit(1)
if length == 3:
PATH = sys.argv[2]
else:
PATH = os.curdir
for folder_name, _, filenames in os.walk(PATH):
for filename in filenames:
if filename.endswith('.pdf'):
filename = folder_name + os.sep + filename
with open(filename, 'rb') as pdf_file:
try:
pdf_reader = PyPDF2.PdfFileReader(pdf_file)
if not pdf_reader.isEncrypted:
pdf_writer = PyPDF2.PdfFileWriter()
pdf_writer.encrypt(PASSWORD)
for page_num in range(pdf_reader.numPages):
page = pdf_reader.getPage(page_num)
pdf_writer.addPage(page)
with open(filename + '_encrypted.pdf', 'wb') as f:
pdf_writer.write(f)
os.rename(filename + '_encrypted.pdf', filename)
except Exception as err:
print('{0}: {1}'.format(filename, err))
| 32.473684 | 74 | 0.523501 | 139 | 1,234 | 4.52518 | 0.467626 | 0.057234 | 0.050874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015464 | 0.371151 | 1,234 | 37 | 75 | 33.351351 | 0.795103 | 0.034846 | 0 | 0.064516 | 0 | 0 | 0.058873 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.096774 | 0.096774 | 0 | 0.096774 | 0.064516 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1225b42f1a8d25d36977830638476dfe2a802c62 | 11,992 | py | Python | schemes/mkcap.py | gold2718/ccpp-framework | 66f1a069b6b15748e08adbe940b8ceb9b39619ab | [
"Apache-2.0"
] | null | null | null | schemes/mkcap.py | gold2718/ccpp-framework | 66f1a069b6b15748e08adbe940b8ceb9b39619ab | [
"Apache-2.0"
] | 39 | 2019-01-25T21:50:33.000Z | 2021-09-03T16:57:43.000Z | schemes/mkcap.py | gold2718/ccpp-framework | 66f1a069b6b15748e08adbe940b8ceb9b39619ab | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
#
# Script to generate a cap module and subroutines
# from a scheme xml file.
#
from __future__ import print_function
import os
import sys
import getopt
import xml.etree.ElementTree as ET
#################### Main program routine
def main():
args = parse_args()
data = parse_scheme(args['scheme'])
cap = Cap()
cap.filename = args['output']
cap.write(data)
#################### Parse the command line arguments
def parse_args():
args = {}
opts, rem = getopt.getopt(sys.argv[1:],
'hvo:',
['help',
'verbose',
'output=',
])
for opt, arg in opts:
if opt in ('-h', '--help'):
lusage()
elif opt in ('-v', '--verbose'):
args['verbose'] = True
elif opt in ('-o', '--output'):
args['output'] = arg
else:
usage()
if (not rem):
eprint("Must specify an input scheme file")
usage()
if (os.path.isfile(rem[0])):
args['scheme'] = rem[0]
else:
eprint("Unable to read input scheme file: {0}".format(rem[0]))
usage()
if (not 'output' in args):
args['output'] = sys.stdout
return args
#################### Parse the scheme xml file into a data dictionary
def parse_scheme(filename):
data = {}
tree = ET.parse(filename)
root = tree.getroot()
data['module'] = root.attrib.get('module')
data['subs'] = {}
for sub in root.findall('subroutine'):
name = sub.attrib.get('name')
data['subs'][name] = {}
data['subs'][name]['vars'] = []
for var in sub.findall('var'):
v = Var()
v.standard_name = var.find('standard_name').text
#v.long_name = var.find('long_name').text
v.units = var.find('units').text
v.local_name = var.find('local_name').text
v.type = var.find('type').text
v.rank = int(var.find('rank').text)
data['subs'][name]['vars'].append(v)
return data
#################### Print a usage statement
def usage():
name = os.path.basename(__file__)
eprint("Usage {0}: [-h] [-v] [-o output.f90] scheme.xml".format(name))
sys.exit(1)
#################### Print a long usage statement
def lusage():
pass
#################### Print a message to STDERR
def eprint(*args, **kwargs):
print(*args, file=sys.stderr, **kwargs)
###############################################################################
class Var(object):
def __init__(self, **kwargs):
self._standard_name = None
self._long_name = None
self._units = None
self._local_name = None
self._type = None
self._rank = None
self._container = None
for key, value in kwargs.items():
setattr(self, "_"+key, value)
@property
def standard_name(self):
'''Get the standard name of the variable.'''
return self._standard_name
@standard_name.setter
def standard_name(self, value):
self._standard_name = value
@property
def long_name(self):
'''Get the long name of the variable.'''
return self._long_name
@long_name.setter
def long_name(self, value):
self._long_name = value
@property
def units(self):
'''Get the units of the variable.'''
return self._units
@units.setter
def units(self, value):
self._units = value
@property
def local_name(self):
'''Get the local variable name of the variable.'''
return self._local_name
@local_name.setter
def local_name(self, value):
self._local_name = value
@property
def type(self):
'''Get the type of the variable.'''
return self._type
@type.setter
def type(self, value):
self._type = value
@property
def rank(self):
'''Get the rank of the variable.'''
return self._rank
@rank.setter
def rank(self, value):
if not isinstance(value, int):
raise TypeError('Invalid type for variable property rank, must be integer')
if (value == 0):
self._rank = ''
else:
self._rank = '('+ ','.join([':'] * value) +')'
@property
def intent(self):
'''Get the intent of the variable.'''
return self._intent
@intent.setter
def intent(self, value):
if not value in ['none', 'in', 'out', 'inout']:
raise ValueError('Invalid value {0} for variable property intent'.format(value))
self._intent = value
@property
def optional(self):
'''Get the optional of the variable.'''
return self._optional
@optional.setter
def optional(self, value):
if not value in ['T', 'F']:
raise ValueError('Invalid value {0} for variable property optional'.format(value))
self._optional = value
@property
def container(self):
'''Get the container of the variable.'''
return self._container
@container.setter
def container(self, value):
self._container = value
def compatible(self, other):
return self.standard_name == other.standard_name \
and self.long_name == other.long_name \
and self.units == other.units \
and self.type == other.type \
and self.rank == other.rank
def print_def(self):
'''Print the definition line for the variable.'''
str = "{s.type}, pointer :: {s.local_name}{s.rank}"
return str.format(s=self)
def print_get(self):
'''Print the data retrieval line for the variable.'''
str='''
call ccpp_field_get(cdata, '{s.standard_name}', {s.local_name}, ierr)
if (ierr /= 0) then
call ccpp_error('Unable to retrieve {s.standard_name}')
return
end if'''
return str.format(s=self)
def print_debug(self):
'''Print debug information for the variable.'''
str='''Contents of {s} (* = mandatory for compatibility):
standard_name = {s.standard_name} *
long_name = {s.long_name} *
units = {s.units} *
local_name = {s.local_name}
type = {s.type} *
rank = {s.rank} *
intent = {s.intent}
optional = {s.optional}
container = {s.container}'''
return str.format(s=self)
@classmethod
def from_table(cls, columns, data):
# DH* - workaround to use the existing table headers
standard_name = data[columns.index('longname')]
#standard_name = data[columns.index('standard_name')]
long_name = data[columns.index('description')]
#long_name = data[columns.index('long_name')]
units = data[columns.index('units')]
local_name = data[columns.index('local var name')]
#local_name = data[columns.index('local_name')]
type = data[columns.index('type')]
rank = data[columns.index('rank')]
intent = data[columns.index('intent')]
optional = data[columns.index('optional')]
# *DH
return cls(standard_name = standard_name,
long_name = long_name,
units = units,
local_name = local_name,
type = type,
rank = rank,
intent = intent,
optional = optional,
)
def to_xml(self, element):
element.set('name', self._standard_name)
sub_element = ET.SubElement(element, 'standard_name')
sub_element.text = self._standard_name
sub_element = ET.SubElement(element, 'long_name')
sub_element.text = self._long_name
sub_element = ET.SubElement(element, 'units')
sub_element.text = self._units
sub_element = ET.SubElement(element, 'local_name')
sub_element.text = self._local_name
sub_element = ET.SubElement(element, 'type')
sub_element.text = self._type
sub_element = ET.SubElement(element, 'rank')
sub_element.text = self._rank
sub_element = ET.SubElement(element, 'intent')
sub_element.text = self._intent
sub_element = ET.SubElement(element, 'optional')
sub_element.text = self._optional
sub_element = ET.SubElement(element, 'container')
sub_element.text = self._container
return element
###############################################################################
class Cap(object):
header='''
!
! This work (Common Community Physics Package), identified by NOAA, NCAR,
! CU/CIRES, is free of known copyright restrictions and is placed in the
! public domain.
!
! THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
! IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
! FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
! THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
! IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
! CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
!
!>
!! @brief Auto-generated cap module for the {module} scheme
!!
!
module {module}_cap
use, intrinsic :: iso_c_binding, &
only: c_f_pointer, c_ptr
use :: ccpp_types, &
only: ccpp_t
use :: ccpp_fields, &
only: ccpp_field_get
use :: ccpp_errors, &
only: ccpp_error
use :: {module}, &
only: {subroutines}
implicit none
private
public :: {subroutine_caps}
contains
'''
sub='''
subroutine {subroutine}_cap(ptr) bind(c)
type(c_ptr), intent(inout) :: ptr
type(ccpp_t), pointer :: cdata
integer :: ierr
{var_defs}
call c_f_pointer(ptr, cdata)
{var_gets}
call {subroutine}({args})
end subroutine {subroutine}_cap
'''
def __init__(self, **kwargs):
for key, value in kwargs.items():
setattr(self, "_"+key, value)
def write(self, data):
if (self.filename is not sys.stdout):
f = open(self.filename, 'w')
else:
f = sys.stdout
subs = ','.join(["{0}".format(s) for s in data['subs']])
sub_caps = ','.join(["{0}_cap".format(s) for s in data['subs']])
f.write(Cap.header.format(module = data['module'],
subroutines = subs,
subroutine_caps = sub_caps))
for (k, v) in data['subs'].items():
var_defs = "\n".join([" "*8 + x.print_def() for x in v['vars']])
var_gets = "\n".join([x.print_get() for x in v['vars']])
args = ','.join(["{0}={0}".format(x.local_name) for x in v['vars']])
f.write(Cap.sub.format(subroutine=k,
var_defs=var_defs,
var_gets=var_gets,
args=args))
f.write("end module {module}_cap\n".format(module = data['module']))
if (f is not sys.stdout):
f.close()
@property
def filename(self):
'''Get the filename to write the output to.'''
return self._filename
@filename.setter
def filename(self, value):
self._filename = value
###############################################################################
if __name__ == "__main__":
main()
| 30.827763 | 94 | 0.532688 | 1,376 | 11,992 | 4.503634 | 0.18314 | 0.042601 | 0.028401 | 0.027594 | 0.208165 | 0.118929 | 0.087462 | 0.071648 | 0.041956 | 0.041956 | 0 | 0.002208 | 0.32013 | 11,992 | 388 | 95 | 30.907216 | 0.757881 | 0.082805 | 0 | 0.097902 | 1 | 0 | 0.282347 | 0.006058 | 0.003497 | 0 | 0 | 0 | 0 | 1 | 0.122378 | false | 0.003497 | 0.017483 | 0.003497 | 0.22028 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
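The Var.rank setter in mkcap.py above encodes an integer rank as a Fortran dimension suffix. A standalone sketch of that mapping (the helper name is an assumption for illustration, not part of mkcap.py):

```python
def rank_suffix(rank):
    # Rank 0 is a scalar (empty suffix); rank n becomes '(:,:,...)' with n colons.
    if rank == 0:
        return ''
    return '(' + ','.join([':'] * rank) + ')'

# rank_suffix(0) == ''
# rank_suffix(3) == '(:,:,:)'
```

This suffix is what `print_def` appends to the local variable name when emitting the Fortran pointer declaration.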
122a8bcfb8def21ca6542908e7ec0138c02aea09 | 1,803 | py | Python | scripts/animate_demo.py | wjchen84/rapprentice | 9232a6a21e2c80f00854912f07dcdc725b0be95a | [
"BSD-2-Clause"
] | 23 | 2015-08-25T19:40:18.000Z | 2020-12-27T09:23:06.000Z | scripts/animate_demo.py | wjchen84/rapprentice | 9232a6a21e2c80f00854912f07dcdc725b0be95a | [
"BSD-2-Clause"
] | null | null | null | scripts/animate_demo.py | wjchen84/rapprentice | 9232a6a21e2c80f00854912f07dcdc725b0be95a | [
"BSD-2-Clause"
] | 8 | 2016-05-18T20:13:06.000Z | 2020-11-03T16:09:50.000Z | #!/usr/bin/env python
"""
Animate demonstration trajectory
"""
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("h5file")
parser.add_argument("--seg")
parser.add_argument("--nopause", action="store_true")
args = parser.parse_args()
import h5py, openravepy,trajoptpy
from rapprentice import animate_traj, ros2rave,clouds
from numpy import asarray
import numpy as np
hdf = h5py.File(args.h5file)
segnames = [args.seg] if args.seg else hdf.keys()
env = openravepy.Environment()
env.StopSimulation()
env.Load("robots/pr2-beta-static.zae")
robot = env.GetRobots()[0]
viewer = trajoptpy.GetViewer(env)
for segname in segnames:
seg_info = hdf[segname]
from rapprentice import berkeley_pr2
r2r = ros2rave.RosToRave(robot, seg_info["joint_states"]["name"])
rave_traj = [r2r.convert(row) for row in asarray(seg_info["joint_states"]["position"])]
robot.SetActiveDOFs(r2r.rave_inds)
robot.SetActiveDOFValues(rave_traj[0])
handles = []
T_w_k = berkeley_pr2.get_kinect_transform(robot)
o = T_w_k[:3,3]
x = T_w_k[:3,0]
y = T_w_k[:3,1]
z = T_w_k[:3,2]
handles.append(env.drawarrow(o, o+.3*x, .005,(1,0,0,1)))
handles.append(env.drawarrow(o, o+.3*y, .005,(0,1,0,1)))
handles.append(env.drawarrow(o, o+.3*z, .005,(0,0,1,1)))
XYZ_k = clouds.depth_to_xyz(np.asarray(seg_info["depth"]), berkeley_pr2.f)
Twk = asarray(seg_info["T_w_k"])
XYZ_w = XYZ_k.dot(Twk[:3,:3].T) + Twk[:3,3][None,None,:]
RGB = np.asarray(seg_info["rgb"])
handles.append(env.plot3(XYZ_w.reshape(-1,3), 2, RGB.reshape(-1,3)[:,::-1]/255.))
animate_traj.animate_traj(rave_traj, robot, pause = not args.nopause)
print "DONE"
trajoptpy.GetViewer(env).Idle()
| 24.04 | 91 | 0.661675 | 277 | 1,803 | 4.151625 | 0.368231 | 0.036522 | 0.015652 | 0.013913 | 0.076522 | 0.076522 | 0.076522 | 0.052174 | 0.052174 | 0 | 0 | 0.040912 | 0.173045 | 1,803 | 74 | 92 | 24.364865 | 0.730382 | 0.011093 | 0 | 0 | 0 | 0 | 0.062572 | 0.014925 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.146341 | null | null | 0.02439 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1234396a6bb8250f31699a3c56eb0bb2cef11b8c | 1,070 | py | Python | tests/library/register_expansion_test.py | Walon1998/dace | 95ddfd3e9a5c654f0f0d66d026e0b64ec0f028a0 | [
"BSD-3-Clause"
] | 1 | 2022-03-11T13:36:34.000Z | 2022-03-11T13:36:34.000Z | tests/library/register_expansion_test.py | Walon1998/dace | 95ddfd3e9a5c654f0f0d66d026e0b64ec0f028a0 | [
"BSD-3-Clause"
] | null | null | null | tests/library/register_expansion_test.py | Walon1998/dace | 95ddfd3e9a5c654f0f0d66d026e0b64ec0f028a0 | [
"BSD-3-Clause"
] | null | null | null | # Copyright 2019-2021 ETH Zurich and the DaCe authors. All rights reserved.
import dace
import dace.library
from dace.transformation import transformation as xf
import pytest
@dace.library.node
class MyLibNode(dace.nodes.LibraryNode):
implementations = {}
default_implementation = 'pure'
def __init__(self, name='MyLibNode', **kwargs):
super().__init__(name=name, **kwargs)
def test_register_expansion():
sdfg = dace.SDFG('libtest')
state = sdfg.add_state()
n = state.add_node(MyLibNode())
# Expect KeyError as pure expansion not given
with pytest.raises(KeyError):
sdfg()
@dace.library.register_expansion(MyLibNode, 'pure')
class ExpandMyLibNode(xf.ExpandTransformation):
environments = []
@staticmethod
def expansion(node: MyLibNode, state: dace.SDFGState, sdfg: dace.SDFG, **kwargs):
return dace.nodes.Tasklet('donothing', code='pass')
# After registering the expansion, the code should work
sdfg()
if __name__ == '__main__':
test_register_expansion()
| 26.75 | 89 | 0.696262 | 123 | 1,070 | 5.861789 | 0.520325 | 0.04577 | 0.058252 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009302 | 0.196262 | 1,070 | 39 | 90 | 27.435897 | 0.82907 | 0.159813 | 0 | 0.08 | 0 | 0 | 0.050279 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0.04 | 0.16 | 0.04 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1242151ef65549586db3bf65bc0bd27cfd616b7b | 9,415 | py | Python | seek/dbtable_content_blobs.py | BMCBCC/NExtSEEK | 7aca407bbc74efc5beb4a98227c6864444b11f61 | [
"MIT"
] | null | null | null | seek/dbtable_content_blobs.py | BMCBCC/NExtSEEK | 7aca407bbc74efc5beb4a98227c6864444b11f61 | [
"MIT"
] | null | null | null | seek/dbtable_content_blobs.py | BMCBCC/NExtSEEK | 7aca407bbc74efc5beb4a98227c6864444b11f61 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
'''
Created on July 12, 2016
@author: Huiming Ding
Email: huiming@mit.edu
Description:
This script implements the interface to the content_blobs database table.
Input: No typical input to define.
Output: No typical output to define.
Example command line:
Log of changes:
'''
import os
import sys
import time
import datetime
import simplejson
import json
import logging
logger = logging.getLogger(__name__)
from .models import Content_blobs
from dmac.dbtable import DBtable
# This is the mapping between the field name used in DataGrid table
# and the field name used in the SQL query for DB retrieval
CONTENT_BLOBS_FILTER_MAPPING = {
}
# Default values for the content_blobs table
CONTENT_BLOBS_DEFAULT = {
#'id':'',
'md5sum':'',
'url':None,
'uuid':'',
'original_filename':'',
'content_type':'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
'asset_id':0,
'asset_type':'DataFile',
'asset_version':1,
'is_webpage':0,
'external_link':0,
'sha1sum':'',
'file_size':0,
'created_at':'',
'updated_at':''
}
class DBtable_content_blobs(DBtable):
''' The class stores all the information about the table content_blobs.
Typical usage of the class
content_blobs = DBtable_content_blobs("DEFAULT")
return content_blobs.retrieveTableList(request)
'''
def __init__(self, whichServer='default'):
print("DBtable_content_blobs")
DBtable.__init__(self, 'SEEK', 'seek_development')
self.tablename = 'content_blobs'
self.tablemodel = Content_blobs
self.fulltablename = self.tablemodel
# this is the table for retrieving the list and shown in a dataGrid
#self.viewtablename = self.dbname + '.' + self.tablename
self.viewtablename = self.tablemodel
self.fields = [
'id',
'md5sum',
'url',
'uuid',
'original_filename',
'content_type',
'asset_id',
'asset_type',
'asset_version',
'is_webpage',
'external_link',
'sha1sum',
'file_size',
'created_at',
'updated_at'
]
# the unique constraint to find the primary key
self.uniqueFields = ['original_filename']
# The primary key name
self.primaryField = "id"
self.fieldMapping = CONTENT_BLOBS_FILTER_MAPPING
self.excludeFields = []
def storeDataFile(self, username, sampleType, record, attributeInfo, uploadEnforced=False):
''' Store one record from input excel file for batch uploading.
Input
record, a dictionary from sample sheet for uploading.
attributeInfo, the list of sample attributes defined in Seek system for this sample type.
uploadEnforced, if False, only run test; if True, forcefully upload the record into DB.
Output
msg, any message
status, whether or nor the test passes.
'''
if not self.__notEmptyLine(record):
msg = 'Error: record for uploading empty in ' + sampleType
print(msg)
return msg, 0, None
# prepare requuired fields for the sample
headers_required = attributeInfo['headers_required']
# Verify whether the record for uploading has all required fields
msg_required, meetRequired = self.__verifyRequiredFields(record, headers_required)
if not meetRequired:
msg = 'Error: ' + msg_required
print(msg)
return msg, 0, None
#keysup = [x.upper() for x in record.keys()]
if 'UID' not in record.keys():
msg = 'Error: Sample record does not have a UID field.'
print(msg)
return msg, 0, None
record_new = self.__getRecord(username, record, attributeInfo)
uid = record_new['title']
#print(record_new)
if not uploadEnforced:
msg = 'Warning: Upload not enforced, test okay.'
#print(msg)
return 'Upload not enforced', 1, uid
#print(record_new)
#return 'Upload to be enforced', 1, uid
msg, status, sample_id = self.storeOneRecord(username, record_new)
if status:
self.__updateProject(username, sample_id)
#print(msg, status, uid)
return msg, status, uid
def searchFile(self, infilename, asset_typeIn=None):
''' Search Seek whether a data file has been uploaded previously.
Input
infilename: original file name from the client side.
Output
diclist, a list of dictionaries/records from content_blobs table.
asset_id, latest asset id
asset_type, asset type
asset_version, asset version, default 1.
nassets, how many assets with the same original name and asset type.
Criteria
Only the first two of the following criteria are applied in the implementation of the script.
1. same file name, applied;
2. same login user, applied;
3. file checksum, not applied;
4. file time stamp, not applied;
5. file size, not applied.
'''
# Step 1. Query content_blobs table whether the data file is already uploaded.
constraint = {}
constraint['original_filename'] = infilename
if asset_typeIn is not None:
constraint['asset_type'] = asset_typeIn
diclist_cb = self.queryRecordsByConstraint(constraint)
asset_id = None
asset_type = None
asset_version = None
nassets = len(diclist_cb)
if nassets==1:
print("unique record found in content_blobs table")
dici = diclist_cb[0]
asset_id = dici['asset_id']
asset_type = dici['asset_type']
asset_version = dici['asset_version']
elif nassets>1:
print("multiple records found, choose the one with the highest version")
version_max = -1
for dici in diclist_cb:
version_i = dici['asset_version']
if version_i is None:
version_i = 0
else:
version_i = int(version_i)
if version_i > version_max:
asset_id = dici['asset_id']
asset_type = dici['asset_type']
asset_version = version_i
else:
print("file not found in content blob")
asset_id = None
asset_type = None
asset_version = None
print("asset info:", asset_id, asset_type, asset_version, nassets)
return asset_id, asset_type, asset_version, nassets
def retrieveFileList(self, username, asset_type):
''' Retrieve a list of records.
Input:
username,
asset_type, such as "Document", "SampleType", "DataFile" or, "Sop"
'''
#filtersdic = dg.getDatagridFilters(ret)
filtersdic = {}
filtersdic['orderby'] = ''
filtersdic['limit'] = ''
filtersdic['suffix'] = ''
filtersdic['startNo'] = 0
filtersdic['endNo'] = 0
#sqlquery_filter, filterRules = self.__getFilteringParameters(ret)
filterRules = [{"field":"asset_type","op":"contains","value":asset_type}]
if asset_type in ["Document", "SampleType", "DataFile", "Sop"]:
sqlquery_filter = " asset_type='" + asset_type + "';"
else:
sqlquery_filter = " "
filtersdic['sqlquery_filter'] = sqlquery_filter
filtersdic['filterRules'] = filterRules
data = self.retrieveRecords(username, filtersdic)
return data
def getRecord(self, asset_id, asset_typeIn):
''' Search Seek whether a data file has been uploaded previously.
Input
asset_id: = primary key for the asset type, such as Sample, data file or SOP
asset_typeIn, one of 'DataFile', 'Sop', 'SampleType', and 'Document'
Output
diclist, a list of dictionaries/records from the content_blobs table
matching the given asset id and type.
'''
# Step 1. Query content_blobs table whether the data file is already uploaded.
constraint = {}
constraint['asset_id'] = asset_id
constraint['asset_type'] = asset_typeIn
diclist_cb = self.queryRecordsByConstraint(constraint)
return diclist_cb | 34.613971 | 101 | 0.585024 | 1,027 | 9,415 | 5.214216 | 0.259981 | 0.043697 | 0.031373 | 0.020915 | 0.303268 | 0.284967 | 0.267414 | 0.254342 | 0.254342 | 0.239402 | 0 | 0.006545 | 0.334679 | 9,415 | 272 | 102 | 34.613971 | 0.84834 | 0.100372 | 0 | 0.181159 | 0 | 0 | 0.172787 | 0.015791 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.065217 | null | null | 0.057971 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
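The searchFile method above selects the content blob with the highest asset_version, treating a missing version as 0. That selection logic can be sketched as a standalone helper (the name is illustrative, not part of the class):

```python
def latest_blob(diclist):
    # Return the record with the largest asset_version; None counts as 0.
    best = None
    best_version = -1
    for dic in diclist:
        version = 0 if dic['asset_version'] is None else int(dic['asset_version'])
        if version > best_version:
            best, best_version = dic, version
    return best

blobs = [{'asset_id': 1, 'asset_version': None}, {'asset_id': 2, 'asset_version': 3}]
# latest_blob(blobs)['asset_id'] == 2
```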

# --- netbox_secretstore/__init__.py (motad333/netbox-secretstore, Apache-2.0) ---
from extras.plugins import PluginConfig
from django.utils.translation import gettext_lazy as _


class NetBoxSecretStore(PluginConfig):
    name = 'netbox_secretstore'
    verbose_name = _('Netbox Secret Store')
    description = _('A Secret Storage for NetBox')
    version = '1.0.8'
    author = 'NetBox Maintainers'
    author_email = ''
    base_url = 'netbox_secretstore'
    min_version = '3.0.0'
    required_settings = []
    caching_config = {
        '*': {
            'ops': 'all'
        }
    }
    default_settings = {
        'public_key_size': 2048
    }


config = NetBoxSecretStore

# --- data/model.py (depowered/mndot-bid-abstracts, MIT) ---
from sqlalchemy import Column, Integer, String, Float, ForeignKey
from sqlalchemy.engine.create import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
CONNECTION_STRING = "sqlite+pysqlite:///data/db.sqlite"
engine = create_engine(CONNECTION_STRING)
Session = sessionmaker(engine)
Base = declarative_base()
class Item2018(Base):
    __tablename__ = "Item2018"
    ItemID_2018 = Column(Integer, primary_key=True, unique=True)
    SpecCode_2018 = Column(String)
    UnitCode_2018 = Column(String)
    ItemCode_2018 = Column(String)
    Description_2018 = Column(String)
    Unit_2018 = Column(String)

    def __str__(self) -> str:
        # Fixed: the original referenced self.Description/self.Unit, which do
        # not exist on this model; the columns carry a _2018 suffix.
        return f'Item(Description={self.Description_2018}, Unit={self.Unit_2018})'

    def __repr__(self) -> str:
        a = f'ItemID = {self.ItemID_2018}'
        b = f'SpecCode = {self.SpecCode_2018}'
        c = f'UnitCode = {self.UnitCode_2018}'
        d = f'ItemCode = {self.ItemCode_2018}'
        e = f'Description = {self.Description_2018}'
        f = f'Unit = {self.Unit_2018}'
        return ', '.join([a, b, c, d, e, f])
class Item2020(Base):
    __tablename__ = "Item2020"
    ItemID_2020 = Column(Integer, primary_key=True, unique=True)
    SpecCode_2020 = Column(String)
    UnitCode_2020 = Column(String)
    ItemCode_2020 = Column(String)
    Description_2020 = Column(String)
    Unit_2020 = Column(String)
    Item2018_ID = Column(Integer)

    def __str__(self) -> str:
        # Fixed: the original referenced self.Description/self.Unit, which do
        # not exist on this model; the columns carry a _2020 suffix.
        return f'Item(Description={self.Description_2020}, Unit={self.Unit_2020})'

    def __repr__(self) -> str:
        a = f'ItemID = {self.ItemID_2020}'
        b = f'SpecCode = {self.SpecCode_2020}'
        c = f'UnitCode = {self.UnitCode_2020}'
        d = f'ItemCode = {self.ItemCode_2020}'
        e = f'Description = {self.Description_2020}'
        f = f'Unit = {self.Unit_2020}'
        return ', '.join([a, b, c, d, e, f])
class Abstract(Base):
    __tablename__ = "Abstract"
    AbstractID = Column(Integer, primary_key=True, unique=True)
    Year = Column(Integer)
    Processed = Column(String)

    def __str__(self) -> str:
        return f'Abstract(AbstractID={self.AbstractID})'

    def __repr__(self) -> str:
        return f'AbstractID = {self.AbstractID}, Year = {self.Year}, Processed = {self.Processed}'
class Contract(Base):
    __tablename__ = "Contract"
    ContractID = Column(Integer, primary_key=True, unique=True)
    Year = Column(Integer)
    LetDate = Column(String)
    SPNumber = Column(String)
    District = Column(String)
    County = Column(String)
    BidderID_0 = Column(Integer, ForeignKey("Bidder.BidderID"))
    BidderID_1 = Column(Integer, ForeignKey("Bidder.BidderID"))
    BidderID_2 = Column(Integer, ForeignKey("Bidder.BidderID"))

    def __str__(self) -> str:
        return f'Contract(ContractID={self.ContractID})'

    def __repr__(self) -> str:
        a = f'ContractID = {self.ContractID}'
        b = f'Year = {self.Year}'
        c = f'LetDate = {self.LetDate}'
        d = f'SPNumber = {self.SPNumber}'
        e = f'District = {self.District}'
        f = f'County = {self.County}'
        g = f'BidderID_0 = {self.BidderID_0}'
        h = f'BidderID_1 = {self.BidderID_1}'
        i = f'BidderID_2 = {self.BidderID_2}'
        return ', '.join([a, b, c, d, e, f, g, h, i])
class Bid(Base):
    __tablename__ = "Bid"
    BidID = Column(Integer, primary_key=True, unique=True)
    ContractID = Column(Integer, ForeignKey("Contract.ContractID"))
    ItemID = Column(Integer)
    SpecYear = Column(Integer)
    Quantity = Column(Float)
    Engineer_UnitPrice = Column(Float)
    Engineer_TotalPrice = Column(Float)
    BidderID_0_UnitPrice = Column(Float)
    BidderID_0_TotalPrice = Column(Float)
    BidderID_1_UnitPrice = Column(Float)
    BidderID_1_TotalPrice = Column(Float)
    BidderID_2_UnitPrice = Column(Float)
    BidderID_2_TotalPrice = Column(Float)

    def __str__(self) -> str:
        return f'Bid(BidID={self.BidID})'

    def __repr__(self) -> str:
        a = f'BidID = {self.BidID}'
        b = f'ContractID = {self.ContractID}'
        c = f'ItemID = {self.ItemID}'  # fixed: the original interpolated self.ContractID here
        d = f'Quantity = {self.Quantity}'
        e = f'Engineer_UnitPrice = {self.Engineer_UnitPrice}'
        f = f'Engineer_TotalPrice = {self.Engineer_TotalPrice}'
        g = f'BidderID_0_UnitPrice = {self.BidderID_0_UnitPrice}'
        h = f'BidderID_0_TotalPrice = {self.BidderID_0_TotalPrice}'
        i = f'BidderID_1_UnitPrice = {self.BidderID_1_UnitPrice}'
        j = f'BidderID_1_TotalPrice = {self.BidderID_1_TotalPrice}'
        k = f'BidderID_2_UnitPrice = {self.BidderID_2_UnitPrice}'
        l = f'BidderID_2_TotalPrice = {self.BidderID_2_TotalPrice}'
        return ', '.join([a, b, c, d, e, f, g, h, i, j, k, l])
class Bidder(Base):
    __tablename__ = "Bidder"
    BidderID = Column(Integer, primary_key=True, unique=True)
    Name = Column(String)

    def __str__(self) -> str:
        return f'Bidder(Name={self.Name})'

    def __repr__(self) -> str:
        return f'BidderID = {self.BidderID}, Name = {self.Name}'
def main():
    # Creates a blank database file
    Base.metadata.create_all(engine)


if __name__ == '__main__':
    main()

# --- backend/profiles/serializers.py (stevethompsonstar/django-react-blog, MIT) ---
from rest_framework import serializers
from .models import Subscriber
class SubscriberSerializer(serializers.ModelSerializer):
    class Meta:
        model = Subscriber
        fields = (
            'email',
        )

# --- fog/fog-client/setup.py (breakEval13/tor_dev, Apache-2.0) ---
from distutils.core import setup
import py2exe
# if py2exe complains "can't find P", try one of the following workarounds:
#
# a. py2exe doesn't support zipped eggs - http://www.py2exe.org/index.cgi/ExeWithEggs
# You should give the --always-unzip option to easy_install, or you can use setup.py directly
# $ python setup.py install --record install.log --single-version-externally-managed
# Don't forget to remove the previous zipped egg.
#
# b. Add an empty __init__.py to the P/ top-level directory, if it's missing
# - this is due to a bug (or misleading documentation) in python's imp.find_module()
setup(
    console=["fog-client"],
    zipfile="py2exe-fog-client.zip",
    options={
        "py2exe": {
            "includes": ["pyptlib", "twisted", "txsocksx"],
            "packages": ["ometa", "terml", "zope.interface"],
        },
    },
)

# --- appshell/endpoints.py (adh/appshell, MIT) ---
from appshell.base import View
from appshell.templates import confirmation, message
from flask import request, flash, redirect
from flask_babelex import Babel, Domain
mydomain = Domain('appshell')
_ = mydomain.gettext
lazy_gettext = mydomain.lazy_gettext
class ConfirmationEndpoint(View):
    methods = ("GET", "POST")
    redirect_to = None

    def prepare(self):
        pass

    def dispatch_request(self, **args):
        self.prepare(**args)
        if request.method == "POST":
            self.do_it(**args)
            return self.done()
        else:
            return confirmation(self.confirmation_message)

    def done(self):
        if self.flash_message:
            flash(*self.flash_message)
        if self.redirect_to:
            return redirect(self.redirect_to)
        return message(_("Done"))

# --- yo_fluq_ds/_fluq/_common.py (okulovsky/yo_ds, MIT) ---
from .._common import *
from yo_fluq import *
Queryable = lambda *args, **kwargs: FlupFactory.QueryableFactory(*args, **kwargs)
T = TypeVar('T')
TOut = TypeVar('TOut')
TKey = TypeVar('TKey')
TValue = TypeVar('TValue')
TFactory = TypeVar('TFactory')

# --- headless_chrome.py (MineRobber9000/discordscript, MIT) ---
from selenium import webdriver
def _options_factory():
    """Produces a selenium.webdriver.ChromeOptions object. Used to force "headless" on invocation. You shouldn't call this function."""
    ret = webdriver.ChromeOptions()
    ret.add_argument("headless")
    return ret


def get_driver(*varargs, args=None):
    """Creates headless selenium.webdriver.Chrome object. Supply command-line options in args or varargs."""
    # Avoid the original mutable default (args=[]): a shared default list
    # would accumulate options across calls.
    args = list(args) if args else []
    args.extend(varargs)
    args = list(set(args))
    opt = _options_factory()
    for arg in args:
        if arg == "headless":
            continue  # already headless
        opt.add_argument(arg)
    return webdriver.Chrome(chrome_options=opt)
# import other useful things
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.action_chains import ActionChains
from selenium.webdriver.support import expected_conditions
# BeautifulSoup support
from bs4 import BeautifulSoup
def soupify(driver):
    return BeautifulSoup(driver.page_source, "html.parser")

# --- lemonsoap/scent/columns_scent.py (Ekrekr/LemonSoap, MIT) ---
# -*- coding: utf-8 -*-
"""
LemonSoap - headers scent.
Deals with column headers.
"""
import pandas as pd
import inflection
import re
import logging
from ..lemon_bar import LemonBar
from .scent_template import ScentTemplate
class ColumnsScent(ScentTemplate):
"""
Manages headers issue identification and fixing.
"""
def __init__(self, lf: LemonBar):
ScentTemplate.__init__(self, lf, "headers",
"columns_scent.ColumnsScent")
def check(self) -> bool:
"""
Identifies issues with headers in a dataframe.
Correct format is "snake_case", with no special characters. Numbers
are however allowed.
Returns:
False if no issues otherwise True.
"""
columns = self._lb().columns
for column in columns:
fixed = self._standardize(column)
if fixed != column:
self._log.info(f"* '{column}' incorrect format, "
f"should be '{fixed}.")
return self._finish_check()
def fix(self) -> LemonBar:
"""
Fixes headers in a given LemonBar.
Returns:
LemonBar with fixes applied.
"""
self.check()
for issue in self._issues:
# OK to call this here as well as in check as unlikely to be
# enough headers to cause an overhead.
fixed = self._standardize(issue[0])
self._log.info(f"* '{issue[0]}' replaced with '{fixed}'")
self._lb().rename(columns={issue[0]: fixed}, inplace=True)
return self._lb
def _standardize(self, inp: str) -> str:
"""
Converts input to standard column header format.
* snake_case.
* No special characters.
* Less than 24 characters long.
* Unique.
Args:
inp: string to fix.
Returns:
Converted input.
"""
# Make underscored, lower case with no special characters.
fixed = inp.replace(" ", "_")
fixed = inflection.underscore(fixed)
fixed = re.sub('\W+', '', fixed)
# Headers less than 24 chars.
if len(fixed) > 24:
fixed = fixed[:24]
# If not unique then try with repeatedly incrementing numbers.
# TODO: O(n^2) algorithm, becomes very slow with lots of headers that
# are the same. Should use precomputation table.
sim_num = 0
fixed_inc = fixed
while fixed_inc in self._lb().columns:
sim_str = str(sim_num)
fixed_inc = fixed + str(sim_num)
sim_num += 1
return fixed_inc
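A stdlib-only sketch of the `_standardize` rules above (snake_case, no special characters, truncation to 24 characters). It approximates `inflection.underscore()` with a simple regex, so CamelCase handling is only roughly equivalent, and it skips the uniqueness pass, which needs the live column index:

```python
import re

def standardize(header):
    # Spaces to underscores, rough CamelCase -> snake_case, then lower-case.
    fixed = header.replace(" ", "_")
    fixed = re.sub(r'(?<=[a-z0-9])(?=[A-Z])', '_', fixed).lower()
    # Drop anything that is not a word character, then cap at 24 characters.
    fixed = re.sub(r'\W+', '', fixed)
    return fixed[:24]

print(standardize("Total Price ($)"))  # → total_price_
print(standardize("UnitPriceUSD"))     # → unit_price_usd
```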

# --- arquivos_de_exercicios_descubra_o_python/Cap. 04/escreveArquivo_start.py (DiegoDBLe/Python-Linkedin, MIT) ---
#
# Writing files with Python functions
#


def escreveArquivo():
    arquivo = open('NovoArquivo.txt', 'w+')
    arquivo.write('Linha gerada com a função Escrevendo Arquivo \r\n')
    arquivo.close()


# escreveArquivo()


def alteraArquivo():
    arquivo = open('NovoArquivo.txt', 'a+')  # 'a' stands for append, i.e. write on the lines after the existing content
    arquivo.write('Linha gerada com a função Altera Arquivo \r\n')
    arquivo.close()


alteraArquivo()
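A self-contained sketch of the difference between the 'w+' and 'a+' modes used above, writing to a temporary file instead of NovoArquivo.txt:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

with open(path, "w+") as f:   # 'w+' truncates: the file starts empty
    f.write("first line\n")
with open(path, "a+") as f:   # 'a+' appends after the existing content
    f.write("second line\n")

with open(path) as f:
    print(f.read().splitlines())  # → ['first line', 'second line']
```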

# --- graphio/queries/query_parameters.py (JTaeger/graphio, MIT) ---
def params_create_rels_unwind_from_objects(relationships, property_identifier=None):
"""
Format Relationship properties into a one level dictionary matching the query generated in
`query_create_rels_from_list`. This is necessary because you cannot access nested dictionairies
in the UNWIND query.
UNWIND { rels } AS rel
MATCH (a:Gene), (b:GeneSymbol)
WHERE a.sid = rel.start_sid AND b.sid = rel.end_sid AND b.taxid = rel.end_taxid
CREATE (a)-[r:MAPS]->(b)
SET r = rel.properties
Call with params:
{'start_sid': 1, 'end_sid': 2, 'end_taxid': '9606', 'properties': {'foo': 'bar} }
:param relationships: List of Relationships.
:return: List of parameter dictionaries.
"""
if not property_identifier:
property_identifier = 'rels'
output = []
for r in relationships:
d = {}
for k, v in r.start_node_properties.items():
d['start_{}'.format(k)] = v
for k, v in r.end_node_properties.items():
d['end_{}'.format(k)] = v
d['properties'] = r.properties
output.append(d)
return {property_identifier: output}
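The flattening can be demonstrated with a stand-in relationship type (the real `Relationship` class comes from graphio; `Rel` here is a made-up namedtuple carrying the same three attributes):

```python
from collections import namedtuple

Rel = namedtuple('Rel', 'start_node_properties end_node_properties properties')

def flatten(rel):
    # Prefix start/end node properties so everything lives on one level.
    d = {'start_{}'.format(k): v for k, v in rel.start_node_properties.items()}
    d.update({'end_{}'.format(k): v for k, v in rel.end_node_properties.items()})
    d['properties'] = rel.properties
    return d

r = Rel({'sid': 1}, {'sid': 2, 'taxid': '9606'}, {'foo': 'bar'})
print(flatten(r))
# → {'start_sid': 1, 'end_sid': 2, 'end_taxid': '9606', 'properties': {'foo': 'bar'}}
```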

# --- user/migrations/0002_userprofile_relations.py (Trippr-dwoc/Trippr-backend, MIT) ---
# Generated by Django 3.2.3 on 2021-10-19 18:54
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('user', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='userprofile',
            name='relations',
            field=models.ManyToManyField(related_name='_user_userprofile_relations_+', to=settings.AUTH_USER_MODEL),
        ),
    ]

# --- scielomanager/journalmanager/models.py (jamilatta/scielo-manager, BSD-2-Clause) ---
# -*- encoding: utf-8 -*-
import urllib
import hashlib
import logging

import choices
import caching.base
from scielomanager import tools

try:
    from collections import OrderedDict
except ImportError:
    from ordereddict import OrderedDict

from django.db import (
    models,
    transaction,
    IntegrityError,
    DatabaseError,
)
from django.core.exceptions import ObjectDoesNotExist
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic
from django.contrib.auth.models import User
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import ugettext as __
from django.conf import settings
from django.db.models.signals import post_save, pre_save
from django.dispatch import receiver
from django.template.defaultfilters import slugify
from django.core.exceptions import ImproperlyConfigured
from scielo_extensions import modelfields
from tastypie.models import create_api_key
import jsonfield
from scielomanager.utils import base28
from . import modelmanagers
User.__bases__ = (caching.base.CachingMixin, models.Model)
User.add_to_class('objects', caching.base.CachingManager())
logger = logging.getLogger(__name__)
EVENT_TYPES = [(ev_type, ev_type) for ev_type in ['added', 'deleted', 'updated']]
ISSUE_DEFAULT_LICENSE_HELP_TEXT = _(u"If not defined, the related journal's use license will be applied. \
The SciELO default use license is BY-NC. Please visit: http://ref.scielo.org/jf5ndd (5.2.11. Política de direitos autorais) for more details.")
def get_user_collections(user_id):
    """
    Return all the collections of a given user. The returned collections are
    the collections the user can access through the collections bar.
    """
    user_collections = User.objects.get(pk=user_id).usercollections_set.all().order_by(
        'collection__name')

    return user_collections
def get_journals_default_use_license():
    """
    Returns the default use license for all new Journals.

    This callable is passed as the default value of the Journal.use_license field.
    The default use license is the one defined by the SciELO criteria, which at
    the time of writing is BY-NC. See http://ref.scielo.org/jf5ndd for more information.
    """
    try:
        return UseLicense.objects.get(is_default=True)
    except UseLicense.DoesNotExist:
        raise ImproperlyConfigured("There is no UseLicense set as default")
class AppCustomManager(caching.base.CachingManager):
    """
    Domain specific model managers.
    """
    def available(self, is_available=True):
        """
        Filter the queryset based on its availability.
        """
        data_queryset = self.get_query_set()

        if not isinstance(is_available, bool):
            try:
                if int(is_available) == 0:
                    is_available = False
                else:
                    is_available = True
            except (ValueError, TypeError):
                is_available = True

        data_queryset = data_queryset.filter(is_trashed=not is_available)

        return data_queryset
class JournalCustomManager(AppCustomManager):

    def all_by_user(self, user, is_available=True, pub_status=None):
        """
        Retrieves all the user's journals, contextualized by
        their default collection.
        """
        default_collection = Collection.objects.get_default_by_user(user)
        objects_all = self.available(is_available).filter(
            collections=default_collection).distinct()

        if pub_status:
            if pub_status in [stat[0] for stat in choices.JOURNAL_PUBLICATION_STATUS]:
                objects_all = objects_all.filter(pub_status=pub_status)

        return objects_all

    def recents_by_user(self, user):
        """
        Retrieves the recently modified objects related to the given user.
        """
        default_collection = Collection.objects.get_default_by_user(user)
        recents = self.filter(
            collections=default_collection).distinct().order_by('-updated')[:5]

        return recents

    def all_by_collection(self, collection, is_available=True):
        objects_all = self.available(is_available).filter(
            collections=collection)

        return objects_all

    def by_issn(self, issn):
        """
        Get the journal assigned to `issn`, whether electronic or print.

        In some cases more than one instance of the same journal will be
        returned, due to the fact that journals present in more than one
        collection are handled separately.
        """
        if issn == '':
            return Journal.objects.none()

        journals = Journal.objects.filter(
            models.Q(print_issn=issn) | models.Q(eletronic_issn=issn)
        )

        return journals
class SectionCustomManager(AppCustomManager):

    def all_by_user(self, user, is_available=True):
        default_collection = Collection.objects.get_default_by_user(user)
        objects_all = self.available(is_available).filter(
            journal__collections=default_collection).distinct()

        return objects_all
class IssueCustomManager(AppCustomManager):

    def all_by_collection(self, collection, is_available=True):
        objects_all = self.available(is_available).filter(
            journal__collections=collection)

        return objects_all
class InstitutionCustomManager(AppCustomManager):
    """
    Add capabilities to Institution subclasses to retrieve querysets
    based on user's collections.
    """
    def all_by_user(self, user, is_available=True):
        default_collection = Collection.objects.get_default_by_user(user)
        objects_all = self.available(is_available).filter(
            collections__in=[default_collection]).distinct()

        return objects_all
class CollectionCustomManager(AppCustomManager):

    def all_by_user(self, user):
        """
        Returns all the Collections related to the given user.
        """
        collections = self.filter(usercollections__user=user).order_by(
            'name')

        return collections

    def get_default_by_user(self, user):
        """
        Returns the Collection marked as default by the given user.
        If none satisfies this condition, the first instance is
        returned instead.

        Like any manager method that does not return Querysets,
        `get_default_by_user` raises DoesNotExist if there is no
        result for the given parameter.
        """
        collections = self.filter(usercollections__user=user,
                                  usercollections__is_default=True).order_by('name')

        if not collections.count():
            try:
                collection = self.all_by_user(user)[0]
            except IndexError:
                raise Collection.DoesNotExist()
            else:
                collection.make_default_to_user(user)
                return collection

        return collections[0]

    def get_managed_by_user(self, user):
        """
        Returns all collections managed by a given user.
        """
        collections = self.filter(usercollections__user=user,
                                  usercollections__is_manager=True).order_by('name')

        return collections
class RegularPressReleaseCustomManager(caching.base.CachingManager):

    def by_journal_pid(self, journal_pid):
        """
        Returns all PressReleases related to a Journal, given its PID.
        """
        journals = Journal.objects.filter(
            models.Q(print_issn=journal_pid) | models.Q(eletronic_issn=journal_pid))
        preleases = self.filter(issue__journal__in=journals.values('id')).select_related('translations')

        return preleases

    def all_by_journal(self, journal):
        """
        Returns all PressReleases related to a Journal.
        """
        preleases = self.filter(issue__journal=journal)

        return preleases

    def by_issue_pid(self, issue_pid):
        """
        Returns all PressReleases related to an Issue, given its PID.
        """
        issn_slice = slice(0, 9)
        year_slice = slice(9, 13)
        order_slice = slice(13, None)

        issn = issue_pid[issn_slice]
        year = issue_pid[year_slice]
        order = int(issue_pid[order_slice])

        preleases_qset = self.by_journal_pid(issn)

        return preleases_qset.filter(issue__publication_year=year).filter(issue__order=order)
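The slicing in `by_issue_pid` encodes the PID layout: a 9-character journal ISSN, a 4-digit publication year, and the issue order. A sketch with a made-up PID:

```python
def split_issue_pid(issue_pid):
    # Same slices as above: ISSN is 9 chars, year 4 chars, order is the rest.
    issn = issue_pid[0:9]
    year = issue_pid[9:13]
    order = int(issue_pid[13:])
    return issn, year, order

print(split_issue_pid('0102-865020120004'))  # → ('0102-8650', '2012', 4)
```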
class AheadPressReleaseCustomManager(caching.base.CachingManager):

    def by_journal_pid(self, journal_pid):
        """
        Returns all PressReleases related to a Journal, given its PID.
        """
        preleases = self.filter(models.Q(journal__print_issn=journal_pid) | models.Q(journal__eletronic_issn=journal_pid))

        return preleases
class Language(caching.base.CachingMixin, models.Model):
    """
    Represents an ISO 639-1 Language Code and its language name in English. Django
    automatically translates language names, if you write them correctly.

    http://en.wikipedia.org/wiki/ISO_639-1_language_matrix
    """
    objects = caching.base.CachingManager()
    nocacheobjects = models.Manager()
    iso_code = models.CharField(_('ISO 639-1 Language Code'), max_length=2)
    name = models.CharField(_('Language Name (in English)'), max_length=64)

    def __unicode__(self):
        return __(self.name)

    class Meta:
        ordering = ['name']
class UserProfile(caching.base.CachingMixin, models.Model):
    objects = caching.base.CachingManager()
    nocacheobjects = models.Manager()
    user = models.OneToOneField(User)
    email = models.EmailField(_('E-mail'), blank=False, unique=True, null=False)

    @property
    def gravatar_id(self):
        return hashlib.md5(self.email.lower().strip()).hexdigest()

    @property
    def avatar_url(self):
        params = urllib.urlencode({'s': 18, 'd': 'mm'})
        return '{0}/avatar/{1}?{2}'.format(getattr(settings, 'GRAVATAR_BASE_URL',
                                           'https://secure.gravatar.com'), self.gravatar_id, params)

    @property
    def get_default_collection(self):
        """
        Return the default collection for this user.
        """
        return Collection.objects.get_default_by_user(self.user)

    def save(self, force_insert=False, force_update=False):
        self.user.email = self.email
        self.user.save()
        return super(UserProfile, self).save(force_insert, force_update)
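The `gravatar_id`/`avatar_url` pair above follows the standard Gravatar scheme (MD5 of the lower-cased, trimmed e-mail). A self-contained sketch adapted to Python 3 — the original file targets Python 2 — using a made-up address:

```python
import hashlib
from urllib.parse import urlencode

email = ' Someone@Example.COM '
# Gravatar hashes the normalized (lower-cased, stripped) address.
gravatar_id = hashlib.md5(email.lower().strip().encode('utf-8')).hexdigest()
params = urlencode({'s': 18, 'd': 'mm'})
url = 'https://secure.gravatar.com/avatar/{0}?{1}'.format(gravatar_id, params)
print(url)
```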
class Collection(caching.base.CachingMixin, models.Model):
objects = CollectionCustomManager()
nocacheobjects = models.Manager()
collection = models.ManyToManyField(User, related_name='user_collection',
through='UserCollections', null=True, blank=True, )
name = models.CharField(_('Collection Name'), max_length=128, db_index=True, )
name_slug = models.SlugField(unique=True, db_index=True, blank=True, null=True)
url = models.URLField(_('Instance URL'), )
logo = models.ImageField(_('Logo'), upload_to='img/collections_logos', null=True, blank=True, )
acronym = models.CharField(_('Acronym'), max_length=16, db_index=True, blank=True, )
country = models.CharField(_('Country'), max_length=32,)
state = models.CharField(_('State'), max_length=32, null=False, blank=True,)
city = models.CharField(_('City'), max_length=32, null=False, blank=True,)
address = models.TextField(_('Address'),)
address_number = models.CharField(_('Number'), max_length=8,)
address_complement = models.CharField(_('Complement'), max_length=128, null=False, blank=True,)
zip_code = models.CharField(_('Zip Code'), max_length=16, null=True, blank=True, )
phone = models.CharField(_('Phone Number'), max_length=16, null=False, blank=True, )
fax = models.CharField(_('Fax Number'), max_length=16, null=False, blank=True, )
email = models.EmailField(_('Email'), )
def __unicode__(self):
return unicode(self.name)
class Meta:
ordering = ['name']
permissions = (("list_collection", "Can list Collections"),)
def save(self, *args, **kwargs):
self.name_slug = slugify(self.name)
super(Collection, self).save(*args, **kwargs)
def add_user(self, user, is_default=False, is_manager=False):
"""
Add the user to the current collection.
"""
UserCollections.objects.create(collection=self,
user=user,
is_default=is_default,
is_manager=is_manager)
def remove_user(self, user):
"""
Removes the user from the current collection.
If the user isn't already related to the given collection,
it will do nothing, silently.
"""
try:
uc = UserCollections.objects.get(collection=self, user=user)
except UserCollections.DoesNotExist:
return None
else:
uc.delete()
def make_default_to_user(self, user):
"""
Makes the current collection, the user's default.
"""
UserCollections.objects.filter(user=user).update(is_default=False)
uc, created = UserCollections.objects.get_or_create(
collection=self, user=user)
uc.is_default = True
uc.save()
def is_default_to_user(self, user):
"""
Returns a boolean value depending if the current collection
is set as default to the given user.
"""
try:
uc = UserCollections.objects.get(collection=self, user=user)
return uc.is_default
except UserCollections.DoesNotExist:
return False
def is_managed_by_user(self, user):
"""
Returns a boolean value depending if the current collection
is managed by the given user.
"""
try:
uc = UserCollections.objects.get(collection=self, user=user)
return uc.is_manager
except UserCollections.DoesNotExist:
return False
class UserCollections(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
user = models.ForeignKey(User)
collection = models.ForeignKey(Collection)
is_default = models.BooleanField(_('Is default'), default=False, null=False, blank=False)
is_manager = models.BooleanField(_('Is manager of the collection?'), default=False, null=False,
blank=False)
class Meta:
unique_together = ("user", "collection", )
class Institution(caching.base.CachingMixin, models.Model):
#Custom manager
objects = AppCustomManager()
nocacheobjects = models.Manager()
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
name = models.CharField(_('Institution Name'), max_length=256, db_index=True)
complement = models.TextField(_('Institution Complements'), blank=True, default="")
acronym = models.CharField(_('Acronym'), max_length=16, db_index=True, blank=True)
country = models.CharField(_('Country'), max_length=32)
state = models.CharField(_('State'), max_length=32, null=False, blank=True)
city = models.CharField(_('City'), max_length=32, null=False, blank=True)
address = models.TextField(_('Address'))
address_number = models.CharField(_('Number'), max_length=8)
address_complement = models.CharField(_('Address Complement'), max_length=128, null=False, blank=True)
zip_code = models.CharField(_('Zip Code'), max_length=16, null=True, blank=True)
phone = models.CharField(_('Phone Number'), max_length=16, null=False, blank=True)
fax = models.CharField(_('Fax Number'), max_length=16, null=False, blank=True)
cel = models.CharField(_('Cel Number'), max_length=16, null=False, blank=True)
email = models.EmailField(_('E-mail'))
is_trashed = models.BooleanField(_('Is trashed?'), default=False, db_index=True)
def __unicode__(self):
return u'%s' % (self.name)
class Meta:
ordering = ['name']
class Sponsor(Institution):
objects = InstitutionCustomManager()
nocacheobjects = models.Manager()
userobjects = modelmanagers.SponsorManager()
collections = models.ManyToManyField(Collection)
class Meta:
permissions = (("list_sponsor", "Can list Sponsors"),)
class SubjectCategory(caching.base.CachingMixin, models.Model):
#Custom manager
objects = JournalCustomManager()
nocacheobjects = models.Manager()
term = models.CharField(_('Term'), max_length=256, db_index=True)
def __unicode__(self):
return self.term
class StudyArea(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
study_area = models.CharField(_('Study Area'), max_length=256,
choices=sorted(choices.SUBJECTS, key=lambda SUBJECTS: SUBJECTS[1]))
def __unicode__(self):
return self.study_area
class Journal(caching.base.CachingMixin, models.Model):
"""
Represents a Journal that is managed by one SciELO Collection.
`editor_address` references the institution that operates the
publishing process.
`publisher_address` references the institution that is responsible
for the Journal.
"""
#Custom manager
objects = JournalCustomManager()
nocacheobjects = models.Manager()
userobjects = modelmanagers.JournalManager()
#Relation fields
creator = models.ForeignKey(User, related_name='enjoy_creator', editable=False)
sponsor = models.ManyToManyField('Sponsor', verbose_name=_('Sponsor'), related_name='journal_sponsor', null=True, blank=True)
previous_title = models.ForeignKey('Journal', verbose_name=_('Previous title'), related_name='prev_title', null=True, blank=True)
use_license = models.ForeignKey('UseLicense', verbose_name=_('Use license'))
collections = models.ManyToManyField('Collection', through='Membership')
languages = models.ManyToManyField('Language',)
national_code = models.CharField(_('National Code'), max_length=64, null=True, blank=True)
abstract_keyword_languages = models.ManyToManyField('Language', related_name="abstract_keyword_languages", )
subject_categories = models.ManyToManyField(SubjectCategory, verbose_name=_("Subject Categories"), related_name="journals", null=True)
study_areas = models.ManyToManyField(StudyArea, verbose_name=_("Study Area"), related_name="journals_migration_tmp", null=True)
editors = models.ManyToManyField(User, related_name='user_editors', null=True, blank=True)
#Fields
current_ahead_documents = models.IntegerField(_('Total of ahead of print documents for the current year'), default=0, blank=True, null=True)
previous_ahead_documents = models.IntegerField(_('Total of ahead of print documents for the previous year'), default=0, blank=True, null=True)
twitter_user = models.CharField(_('Twitter User'), max_length=128, null=True, blank=True)
title = models.CharField(_('Journal Title'), max_length=256, db_index=True)
title_iso = models.CharField(_('ISO abbreviated title'), max_length=256, db_index=True)
short_title = models.CharField(_('Short Title'), max_length=256, db_index=True, null=True)
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
acronym = models.CharField(_('Acronym'), max_length=16, blank=False)
scielo_issn = models.CharField(_('The ISSN used to build the Journal PID.'), max_length=16,
choices=sorted(choices.SCIELO_ISSN, key=lambda SCIELO_ISSN: SCIELO_ISSN[1]))
print_issn = models.CharField(_('Print ISSN'), max_length=9, db_index=True)
eletronic_issn = models.CharField(_('Electronic ISSN'), max_length=9, db_index=True)
subject_descriptors = models.CharField(_('Subject / Descriptors'), max_length=1024)
init_year = models.CharField(_('Initial Year'), max_length=4)
init_vol = models.CharField(_('Initial Volume'), max_length=16)
init_num = models.CharField(_('Initial Number'), max_length=16)
final_year = models.CharField(_('Final Year'), max_length=4, null=True, blank=True)
final_vol = models.CharField(_('Final Volume'), max_length=16, null=False, blank=True)
final_num = models.CharField(_('Final Number'), max_length=16, null=False, blank=True)
medline_title = models.CharField(_('Medline Title'), max_length=256, null=True, blank=True)
medline_code = models.CharField(_('Medline Code'), max_length=64, null=True, blank=True)
frequency = models.CharField(_('Frequency'), max_length=16,
choices=sorted(choices.FREQUENCY, key=lambda FREQUENCY: FREQUENCY[1]))
editorial_standard = models.CharField(_('Editorial Standard'), max_length=64,
choices=sorted(choices.STANDARD, key=lambda STANDARD: STANDARD[1]))
ctrl_vocabulary = models.CharField(_('Controlled Vocabulary'), max_length=64,
choices=choices.CTRL_VOCABULARY)
pub_level = models.CharField(_('Publication Level'), max_length=64,
choices=sorted(choices.PUBLICATION_LEVEL, key=lambda PUBLICATION_LEVEL: PUBLICATION_LEVEL[1]))
secs_code = models.CharField(_('SECS Code'), max_length=64, null=False, blank=True)
copyrighter = models.CharField(_('Copyrighter'), max_length=254)
url_online_submission = models.CharField(_('URL of online submission'), max_length=128, null=True, blank=True)
url_journal = models.CharField(_('URL of the journal'), max_length=128, null=True, blank=True)
notes = models.TextField(_('Notes'), max_length=254, null=True, blank=True)
index_coverage = models.TextField(_('Index Coverage'), null=True, blank=True)
cover = models.ImageField(_('Journal Cover'), upload_to='img/journal_cover/', null=True, blank=True)
logo = models.ImageField(_('Journal Logo'), upload_to='img/journals_logos', null=True, blank=True)
is_trashed = models.BooleanField(_('Is trashed?'), default=False, db_index=True)
other_previous_title = models.CharField(_('Other Previous Title'), max_length=255, blank=True)
editor_name = models.CharField(_('Editor Names'), max_length=512)
editor_address = models.CharField(_('Editor Address'), max_length=512)
editor_address_city = models.CharField(_('Editor City'), max_length=256)
editor_address_state = models.CharField(_('Editor State/Province/Region'), max_length=128)
editor_address_zip = models.CharField(_('Editor Zip/Postal Code'), max_length=64)
editor_address_country = modelfields.CountryField(_('Editor Country'))
editor_phone1 = models.CharField(_('Editor Phone 1'), max_length=32)
editor_phone2 = models.CharField(_('Editor Phone 2'), null=True, blank=True, max_length=32)
editor_email = models.EmailField(_('Editor E-mail'))
publisher_name = models.CharField(_('Publisher Name'), max_length=256)
publisher_country = modelfields.CountryField(_('Publisher Country'))
publisher_state = models.CharField(_('Publisher State/Province/Region'), max_length=64)
publication_city = models.CharField(_('Publication City'), max_length=64)
is_indexed_scie = models.BooleanField(_('SCIE'), default=False)
is_indexed_ssci = models.BooleanField(_('SSCI'), default=False)
is_indexed_aehci = models.BooleanField(_('A&HCI'), default=False)
def __unicode__(self):
return self.title
class Meta:
ordering = ['title']
permissions = (("list_journal", "Can list Journals"),
("list_editor_journal", "Can list editor Journals"))
def issues_as_grid(self, is_available=True):
objects_all = self.issue_set.available(is_available).order_by(
'-publication_year', '-volume')
grid = OrderedDict()
for issue in objects_all:
year_node = grid.setdefault(issue.publication_year, OrderedDict())
volume_node = year_node.setdefault(issue.volume, [])
volume_node.append(issue)
for year, volume in grid.items():
for vol, issues in volume.items():
issues.sort(key=lambda x: x.order)
return grid
def has_issues(self, issues):
"""
Returns ``True`` if all the given issues are bound to the journal.
``issues`` is a list of Issue pk.
"""
issues_to_test = set(int(issue) for issue in issues)
bound_issues = set(issue.pk for issue in self.issue_set.all())
return issues_to_test.issubset(bound_issues)
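``has_issues`` reduces to a set-subset test over primary keys. The same check in isolation, with hypothetical pk values (pks often arrive as strings, e.g. from a form):

```python
requested = {int(pk) for pk in ["10", "42"]}  # pks submitted by the caller
bound = {10, 42, 77}                          # pks actually bound to the journal
all_bound = requested.issubset(bound)         # True only if every pk is bound
```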
def reorder_issues(self, new_order, publication_year, volume=None):
"""
Make persistent the ordering received as a list of ``pk``,
to all the issues in a given ``publication_year`` and ``volume``.
The length of ``new_order`` must match the number of
issues in the given ``publication_year`` and ``volume``.
"""
filters = {'publication_year': publication_year}
if volume:
filters['volume'] = volume
issues = self.issue_set.filter(**filters)
issues_count = issues.count()
new_order_count = len(new_order)
if new_order_count != issues_count:
raise ValueError('new_order length does not match. %s:%s' % (new_order_count, issues_count))
with transaction.commit_on_success():
for i, pk in enumerate(new_order):
order = i + 1
issue = issues.get(pk=pk)
issue.order = order
issue.save()
def is_editor(self, user):
"""
Returns a boolean value depending if the given user is an editor
of the current journal.
"""
try:
self.editors.get(id=user.id)
except ObjectDoesNotExist:
return False
return True
@property
def scielo_pid(self):
"""
Returns the ISSN used as PID on SciELO public catalogs.
"""
attr = u'print_issn' if self.scielo_issn == u'print' else u'eletronic_issn'
return getattr(self, attr)
def join(self, collection, responsible):
"""Make this journal part of the collection.
"""
Membership.objects.create(journal=self,
collection=collection,
created_by=responsible,
status='inprogress')
def membership_info(self, collection, attribute=None):
"""Retrieve info about the relation of this journal with a
given collection.
"""
rel = self.membership_set.get(collection=collection)
if attribute:
return getattr(rel, attribute)
else:
return rel
def change_status(self, collection, new_status, reason, responsible):
rel = self.membership_info(collection)
rel.status = new_status
rel.reason = reason
rel.save()
class Membership(models.Model):
"""
Represents the many-to-many relation
between Journal and Collection.
"""
journal = models.ForeignKey('Journal')
collection = models.ForeignKey('Collection')
status = models.CharField(max_length=16, default="inprogress",
choices=choices.JOURNAL_PUBLICATION_STATUS)
since = models.DateTimeField(auto_now=True)
reason = models.TextField(_('Why are you changing the publication status?'),
blank=True, default="")
created_by = models.ForeignKey(User, editable=False)
def save(self, *args, **kwargs):
"""
Always save a copy at JournalTimeline
"""
super(Membership, self).save(*args, **kwargs)
JournalTimeline.objects.create(journal=self.journal,
collection=self.collection,
status=self.status,
reason=self.reason,
created_by=self.created_by,
since=self.since)
class Meta():
unique_together = ("journal", "collection")
class JournalTimeline(models.Model):
"""
Represents the status history of a journal.
"""
journal = models.ForeignKey('Journal', related_name='statuses')
collection = models.ForeignKey('Collection')
status = models.CharField(max_length=16,
choices=choices.JOURNAL_PUBLICATION_STATUS)
since = models.DateTimeField()
reason = models.TextField(default="")
created_by = models.ForeignKey(User)
class JournalTitle(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
journal = models.ForeignKey(Journal, related_name='other_titles')
title = models.CharField(_('Title'), null=False, max_length=128)
category = models.CharField(_('Title Category'), null=False, max_length=128, choices=sorted(choices.TITLE_CATEGORY, key=lambda TITLE_CATEGORY: TITLE_CATEGORY[1]))
class JournalMission(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
journal = models.ForeignKey(Journal, related_name='missions')
description = models.TextField(_('Mission'))
language = models.ForeignKey('Language', blank=False, null=True)
class UseLicense(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
license_code = models.CharField(_('License Code'), unique=True, null=False, blank=False, max_length=64)
reference_url = models.URLField(_('License Reference URL'), null=True, blank=True)
disclaimer = models.TextField(_('Disclaimer'), null=True, blank=True, max_length=512)
is_default = models.BooleanField(_('Is Default?'), default=False)
def __unicode__(self):
return self.license_code
class Meta:
ordering = ['license_code']
def save(self, *args, **kwargs):
"""
Only one UseLicense must be the default (is_default==True).
If another license is already the default, it is unset (is_default==False).
If no license is set as default yet, the instance being saved becomes the default.
If the only default license is being unset, it is forced to remain the default,
so that exactly one license is always set as default.
"""
qs = UseLicense.objects.filter(is_default=True)
if (qs.count() == 0 ) or (self in qs):
# no other was default, or ``self`` is the current default one,
# so ``self`` will be set as default
self.is_default = True
if self.is_default:
if self.pk:
qs = qs.exclude(pk=self.pk)
if qs.count() != 0:
qs.update(is_default=False)
super(UseLicense, self).save(*args, **kwargs)
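The save logic above enforces a single-default invariant. A minimal stand-in showing the same rule outside Django, with plain dicts in place of model rows:

```python
licenses = [
    {"pk": 1, "is_default": True},
    {"pk": 2, "is_default": False},
]

def set_default(pk):
    # Unset every other license and set the chosen one -- mirroring
    # qs.update(is_default=False) followed by saving self.
    for lic in licenses:
        lic["is_default"] = (lic["pk"] == pk)

set_default(2)
```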
class TranslatedData(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
translation = models.CharField(_('Translation'), null=True, blank=True, max_length=512)
language = models.CharField(_('Language'), choices=sorted(choices.LANGUAGE, key=lambda LANGUAGE: LANGUAGE[1]), null=False, blank=False, max_length=32)
model = models.CharField(_('Model'), null=False, blank=False, max_length=32)
field = models.CharField(_('Field'), null=False, blank=False, max_length=32)
def __unicode__(self):
return self.translation if self.translation is not None else 'Missing trans: {0}.{1}'.format(self.model, self.field)
class SectionTitle(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
section = models.ForeignKey('Section', related_name='titles')
title = models.CharField(_('Title'), max_length=256, null=False)
language = models.ForeignKey('Language')
class Meta:
ordering = ['title']
class Section(caching.base.CachingMixin, models.Model):
"""
Represents a multilingual section of one/many Issues of
a given Journal.
``legacy_code`` contains the section code used by the old
title manager. We decided to store this value purely for
historical reasons, and we don't know if it will last forever.
"""
#Custom manager
objects = SectionCustomManager()
nocacheobjects = models.Manager()
userobjects = modelmanagers.SectionManager()
journal = models.ForeignKey(Journal)
code = models.CharField(unique=True, max_length=21, blank=True)
legacy_code = models.CharField(null=True, blank=True, max_length=16)
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
is_trashed = models.BooleanField(_('Is trashed?'), default=False, db_index=True)
def __unicode__(self):
return ' / '.join([sec_title.title for sec_title in self.titles.all().order_by('language')])
@property
def actual_code(self):
if not self.pk or not self.code:
raise AttributeError('section must be saved in order to have a code')
return self.code
def is_used(self):
try:
return bool(self.issue_set.count())
except ValueError: # raised when the object is not yet saved
return False
def add_title(self, title, language):
"""
Adds a section title in the given language.
A Language instance must be passed as the language argument.
"""
SectionTitle.objects.create(section=self,
title=title, language=language)
def _suggest_code(self, rand_generator=base28.genbase):
"""
Suggests a code for the section instance.
The code is formed by the journal acronym + 4 pseudo-random
base 28 chars.
``rand_generator`` is the callable responsible for the pseudo-random
chars sequence. It may accept the number of chars as argument.
"""
num_chars = getattr(settings, 'SECTION_CODE_TOTAL_RANDOM_CHARS', 4)
fmt = '{0}-{1}'.format(self.journal.acronym, rand_generator(num_chars))
return fmt
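``_suggest_code`` formats the journal acronym plus a pseudo-random base-28 suffix. A sketch with a deterministic stand-in for ``base28.genbase`` and a hypothetical acronym:

```python
def fake_genbase(num_chars):
    # Stand-in for base28.genbase: returns a fixed string for illustration.
    return "B" * num_chars

acronym = "rsp"  # hypothetical journal acronym
code = "{0}-{1}".format(acronym, fake_genbase(4))
```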
def _create_code(self, *args, **kwargs):
if not self.code:
tries = kwargs.pop('max_tries', 5)
while tries > 0:
self.code = self._suggest_code()
try:
super(Section, self).save(*args, **kwargs)
except IntegrityError:
tries -= 1
logger.warning('conflict while trying to generate a section code. %i tries remaining.' % tries)
continue
else:
logger.info('code created successfully for %s' % unicode(self))
break
else:
msg = 'max_tries reached while trying to generate a code for the section %s.' % unicode(self)
logger.error(msg)
raise DatabaseError(msg)
class Meta:
permissions = (("list_section", "Can list Sections"),)
def save(self, *args, **kwargs):
"""
If ``code`` already exists, the section is saved. Else,
the ``code`` will be generated before the save process is
performed.
"""
if self.code:
super(Section, self).save(*args, **kwargs)
else:
# the call to super().save is delegated to _create_code
# because there are needs to control saving max tries.
self._create_code(*args, **kwargs)
class Issue(caching.base.CachingMixin, models.Model):
#Custom manager
objects = IssueCustomManager()
nocacheobjects = models.Manager()
section = models.ManyToManyField(Section, blank=True)
journal = models.ForeignKey(Journal)
volume = models.CharField(_('Volume'), blank=True, max_length=16)
number = models.CharField(_('Number'), blank=True, max_length=16)
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
publication_start_month = models.IntegerField(_('Start Month'), blank=True, null=True, choices=choices.MONTHS)
publication_end_month = models.IntegerField(_('End Month'), blank=True, null=True, choices=choices.MONTHS)
publication_year = models.IntegerField(_('Year'))
is_marked_up = models.BooleanField(_('Is Marked Up?'), default=False, null=False, blank=True)
use_license = models.ForeignKey(UseLicense, null=True, help_text=ISSUE_DEFAULT_LICENSE_HELP_TEXT)
total_documents = models.IntegerField(_('Total of Documents'), default=0)
ctrl_vocabulary = models.CharField(_('Controlled Vocabulary'), max_length=64,
choices=sorted(choices.CTRL_VOCABULARY, key=lambda CTRL_VOCABULARY: CTRL_VOCABULARY[1]), null=False, blank=True)
editorial_standard = models.CharField(_('Editorial Standard'), max_length=64,
choices=sorted(choices.STANDARD, key=lambda STANDARD: STANDARD[1]))
cover = models.ImageField(_('Issue Cover'), upload_to='img/issue_cover/', null=True, blank=True)
is_trashed = models.BooleanField(_('Is trashed?'), default=False, db_index=True)
label = models.CharField(db_index=True, blank=True, null=True, max_length=64)
order = models.IntegerField(_('Issue Order'), blank=True)
type = models.CharField(_('Type'), max_length=15, choices=choices.ISSUE_TYPES, default='regular', editable=False)
suppl_text = models.CharField(_('Suppl Text'), max_length=15, null=True, blank=True)
class Meta:
permissions = (("list_issue", "Can list Issues"),
("reorder_issue", "Can Reorder Issues"))
@property
def scielo_pid(self):
"""
Returns the PID used on SciELO public catalogs, in the form:
``journal_issn + year + order``
"""
jissn = self.journal.scielo_pid
return ''.join(
[
jissn,
unicode(self.publication_year),
u'%04d' % self.order,
]
)
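Concretely, the issue PID concatenates the journal ISSN, the four-digit year, and the zero-padded order (the ISSN below is a made-up example):

```python
jissn = "0102-311X"  # hypothetical journal ISSN
publication_year = 2013
order = 4
pid = "".join([jissn, str(publication_year), "%04d" % order])
```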
@property
def identification(self):
values = [self.number]
if self.type == 'supplement':
values.append('suppl.%s' % self.suppl_text)
return ' '.join([val for val in values if val]).strip().replace(
'spe', 'special').replace('ahead', 'ahead of print')
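The ``identification`` property joins the non-empty parts and expands abbreviations. The string pipeline on its own, for an ahead-of-print issue:

```python
number = "ahead"
values = [number]  # a supplement would append "suppl.<text>" here
identification = (" ".join(v for v in values if v).strip()
                  .replace("spe", "special")
                  .replace("ahead", "ahead of print"))
```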
def __unicode__(self):
return "{0} ({1})".format(self.volume, self.identification).replace('()', '')
@property
def publication_date(self):
return '{0} / {1} - {2}'.format(self.publication_start_month,
self.publication_end_month,
self.publication_year)
@property
def suppl_type(self):
if self.type == 'supplement':
if self.number != '' and self.volume == '':
return 'number'
elif self.number == '' and self.volume != '':
return 'volume'
else:
raise AttributeError('Issues of type %s do not have an attribute named: suppl_type' % self.get_type_display())
def _suggest_order(self, force=False):
"""
Based on ``publication_year`` and ``journal``, suggests the next
``order`` value.
If the Issue already has an ``order``, that value is returned. Otherwise,
the issues of the same ``publication_year`` and ``journal`` are queried
and the ``order`` of the last instance, plus one, is used.
When ``force`` is ``True``, the instance's current ``order`` is ignored
and a fresh suggestion is always returned.
"""
if self.order and not force:
return self.order
filters = {
'publication_year': self.publication_year,
'journal': self.journal,
}
try:
last = Issue.objects.filter(**filters).order_by('order').reverse()[0]
next_order = last.order + 1
except IndexError:
next_order = 1
return next_order
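Stripped of the ORM, the suggestion is "last order plus one, defaulting to 1". A plain-Python equivalent (the queryset version catches ``IndexError``; ``max`` on an empty list raises ``ValueError`` instead):

```python
existing_orders = [1, 2, 3]  # orders already used for the year/journal
try:
    next_order = max(existing_orders) + 1
except ValueError:  # no issues yet
    next_order = 1
```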
def _get_default_use_license(self):
return self.journal.use_license
def save(self, *args, **kwargs):
self.label = unicode(self)
if self.use_license is None and self.journal:
self.use_license = self._get_default_use_license()
if not self.pk:
self.order = self._suggest_order()
else:
# the ordering control is based on publication year attr.
# if an issue is moved between pub years, the order must be reset.
if tools.has_changed(self, 'publication_year'):
self.order = self._suggest_order(force=True)
super(Issue, self).save(*args, **kwargs)
class IssueTitle(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
issue = models.ForeignKey(Issue)
language = models.ForeignKey('Language')
title = models.CharField(_('Title'), max_length=128)
class PendedForm(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
view_name = models.CharField(max_length=128)
form_hash = models.CharField(max_length=32)
user = models.ForeignKey(User, related_name='pending_forms')
created_at = models.DateTimeField(auto_now=True)
class PendedValue(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
form = models.ForeignKey(PendedForm, related_name='data')
name = models.CharField(max_length=255)
value = models.TextField()
class DataChangeEvent(models.Model):
"""
Tracks data changes to make possible for consumer apps to know
what to sync.
"""
changed_at = models.DateTimeField(auto_now=True)
user = models.ForeignKey(User)
content_type = models.ForeignKey(ContentType)
object_id = models.PositiveIntegerField()
content_object = generic.GenericForeignKey('content_type', 'object_id')
event_type = models.CharField(max_length=16, choices=EVENT_TYPES)
collection = models.ForeignKey(Collection)
class PressRelease(caching.base.CachingMixin, models.Model):
"""
Represents a press-release bound to a Journal.
If ``issue`` is None, the press release refers to an ahead-of-print article.
It can be available in one or more languages (restricted by the Journal's
publishing policy).
"""
nocacheobjects = models.Manager()
objects = models.Manager()
doi = models.CharField(_("Press release DOI number"),
max_length=128, null=True, blank=True)
def add_article(self, article):
"""
``article`` is the article's PID, as a string.
"""
PressReleaseArticle.objects.create(press_release=self,
article_pid=article)
def remove_article(self, article):
try:
pra = PressReleaseArticle.objects.get(press_release=self,
article_pid=article)
except PressReleaseArticle.DoesNotExist:
return None
else:
pra.delete()
def add_translation(self, title, content, language):
"""
Adds a new press-release translation.
``language`` is an instance of Language.
"""
PressReleaseTranslation.objects.create(press_release=self,
language=language,
title=title,
content=content)
def remove_translation(self, language):
"""
Removes the press-release translation in the given language.
If the translation doesn't exist, it does nothing, silently.
"""
qry_params = {'press_release': self}
if isinstance(language, basestring):
qry_params['language__iso_code'] = language
else:
qry_params['language'] = language
try:
pr = PressReleaseTranslation.objects.get(**qry_params)
except PressReleaseTranslation.DoesNotExist:
return None
else:
pr.delete()
def get_trans(self, language):
"""
Syntactic sugar for retrieving the translation in a given language.
"""
prt = self.translations.get(language__iso_code=language)
return prt
def __unicode__(self):
"""
Try to get the first title of the Press Release.
The form ensures at least one title.
"""
try:
title = PressReleaseTranslation.objects.filter(press_release=self).order_by('language')[0].title
except IndexError:
return __('No Title')
return title
class Meta:
abstract = False
permissions = (("list_pressrelease", "Can list PressReleases"),)
class PressReleaseTranslation(caching.base.CachingMixin, models.Model):
"""
Represents a press-release in a given language.
"""
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
press_release = models.ForeignKey(PressRelease, related_name='translations')
language = models.ForeignKey('Language')
title = models.CharField(_('Title'), max_length=128)
content = models.TextField(_('Content'))
class PressReleaseArticle(caching.base.CachingMixin, models.Model):
"""
Represents press-releases bound to Articles.
"""
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
press_release = models.ForeignKey(PressRelease, related_name='articles')
article_pid = models.CharField(_('PID'), max_length=32, db_index=True)
class RegularPressRelease(PressRelease):
objects = RegularPressReleaseCustomManager()
userobjects = modelmanagers.RegularPressReleaseManager()
issue = models.ForeignKey(Issue, related_name='press_releases')
class AheadPressRelease(PressRelease):
objects = AheadPressReleaseCustomManager()
userobjects = modelmanagers.AheadPressReleaseManager()
journal = models.ForeignKey(Journal, related_name='press_releases')
class Article(caching.base.CachingMixin, models.Model):
objects = caching.base.CachingManager()
nocacheobjects = models.Manager()
issue = models.ForeignKey(Issue, related_name='articles')
front = jsonfield.JSONField()
xml_url = models.CharField(_('XML URL'), max_length=256)
pdf_url = models.CharField(_('PDF URL'), max_length=256)
images_url = models.CharField(_('Images URL'), max_length=256)
def __unicode__(self):
return u' - '.join([self.title, str(self.issue)])
class Meta:
permissions = (("list_article", "Can list Article"),)
@property
def title(self):
if 'title-group' not in self.front:
return None
default_language = self.front.get('default-language', None)
if default_language in self.front['title-group']:
return self.front['title-group'][default_language]
return self.front['title-group'].values()[0]
@property
def titles(self):
if 'title-group' not in self.front:
return None
return self.front['title-group']
models.signals.post_save.connect(create_api_key, sender=User)
# src/pyro_util/modules/__init__.py (MacoskoLab/pyro-util, MIT license)
from typing import Tuple
import torch
import torch.nn as nn
from pyro.distributions.util import broadcast_shape

from pyro_util.modules.weight_scaling import GammaReLU, WSLinear

T = torch.Tensor


def make_ws_fc(*dims: int) -> nn.Module:
    """Helper function for creating a fully connected neural network.

    This version uses weight-scaled linear layers and gamma-scaled ReLU.

    :param dims: The size of the layers in the network (at least 2)
    :return: nn.Sequential containing all the layers
    """
    layers = [WSLinear(dims[0], dims[1])]
    for in_dim, out_dim in zip(dims[1:], dims[2:]):
        layers.append(GammaReLU())
        layers.append(WSLinear(in_dim, out_dim))

    return nn.Sequential(*layers)


def make_bn_fc(*dims: int) -> nn.Module:
    """Helper function for creating a fully connected neural network.

    This version uses BatchNorm between linear layers.

    :param dims: The size of the layers in the network (at least 2)
    :return: nn.Sequential containing all the layers
    """
    layers = [nn.Linear(dims[0], dims[1])]
    for in_dim, out_dim in zip(dims[1:], dims[2:]):
        layers.append(nn.BatchNorm1d(in_dim))
        layers.append(nn.ReLU())
        layers.append(nn.Linear(in_dim, out_dim))

    return nn.Sequential(*layers)


def split_in_half(t: T) -> Tuple[T, T]:
    """Splits a tensor in half along the final dimension"""
    return t.reshape(t.shape[:-1] + (2, -1)).unbind(-2)


def broadcast_inputs(input_args):
    """Helper for broadcasting inputs to neural net"""
    shape = broadcast_shape(*[s.shape[:-1] for s in input_args]) + (-1,)
    input_args = [s.expand(shape) for s in input_args]
    return input_args
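The `zip(dims[1:], dims[2:])` loop above is what pairs consecutive layer sizes after the first layer; a dependency-free sketch of the resulting (in, out) shape list (the helper `fc_shapes` is illustrative, not part of the module):

```python
def fc_shapes(*dims):
    """Mirror of the layer-shape construction in make_ws_fc / make_bn_fc."""
    first = (dims[0], dims[1])
    hidden = list(zip(dims[1:], dims[2:]))  # (d1,d2), (d2,d3), ...
    return [first] + hidden


print(fc_shapes(784, 256, 64, 10))  # [(784, 256), (256, 64), (64, 10)]
```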

# render.py (ondrejkoren/WeatherFrame, MIT license)
from WeatherScreens.RingScreen import RingScreen
from WeatherScreens.QuadrantScreen import QuadrantScreen
from WeatherScreens.ImageScreen import ImageScreen
from WeatherScreens.ScreenBase import ScreenBase
from datetime import datetime, timedelta
from suntime import Sun, SunTimeException
from dateutil import tz
import pyowm
import argparse
if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="WeatherFrame CLI Utility")
    parser.add_argument("--lat", type=float,
                        help="Latitude in decimal form")
    parser.add_argument("--long", type=float,
                        help="Longitude in decimal form")
    parser.add_argument("--owm", type=str,
                        help="OpenWeatherMap API Token")
    parser.add_argument("--type", type=str,
                        help="Screen type")
    parser.add_argument("--image", type=str,
                        help="Image path")
    args = parser.parse_args()

    latitude = args.lat
    longitude = args.long
    owm_token = args.owm
    screen_type = args.type
    image_path = args.image

    # MOCK data
    weather_data = {
        'wind': {'speed': 33.5, 'deg': 190, 'gust': 42.12},
        'humidity': 100,
        'humidity_indoor': 47,
        'temp': {'temp': -33.77, 'temp_max': 0.56, 'temp_min': -2.0},
        'temp_indoor': 24.12,
        'status': 'Mist',
        'clouds': 90,
        'pressure': {'press': 1009, 'sea_level': 1038.381},
        'observation_time': "2020-01-25 09:04:34+00",
        'forecast': [
            {'status': 'Clouds', 'temp': {'temp': -0.52, 'temp_max': 0.83, 'temp_min': -0.52, 'temp_kf': -1.35}, 'wind': {'speed': 2.21, 'deg': 88}, 'date': "2020-01-26 15:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -1.69, 'temp_max': -0.68, 'temp_min': -1.69, 'temp_kf': -1.01}, 'wind': {'speed': 1.73, 'deg': 80}, 'date': "2020-01-26 18:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -1.75, 'temp_max': -1.07, 'temp_min': -1.75, 'temp_kf': -0.68}, 'wind': {'speed': 1.42, 'deg': 45}, 'date': "2020-01-26 21:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -1.66, 'temp_max': -1.32, 'temp_min': -1.66, 'temp_kf': -0.34}, 'wind': {'speed': 1.32, 'deg': 8}, 'date': "2020-01-27 00:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -1.56, 'temp_kf': -273.15, 'temp_max': -1.56, 'temp_min': -1.56}, 'wind': {'speed': 0.83, 'deg': 17}, 'date': "2020-01-27 03:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -1.48, 'temp_kf': -273.15, 'temp_max': -1.48, 'temp_min': -1.48}, 'wind': {'speed': 1.09, 'deg': 317}, 'date': "2020-01-27 06:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 1.78, 'temp_kf': -273.15, 'temp_max': 1.78, 'temp_min': 1.78}, 'wind': {'speed': 1.53, 'deg': 302}, 'date': "2020-01-27 09:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 4.87, 'temp_kf': -273.15, 'temp_max': 4.87, 'temp_min': 4.87}, 'wind': {'speed': 1.39, 'deg': 267}, 'date': "2020-01-27 12:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 3.01, 'temp_kf': -273.15, 'temp_max': 3.01, 'temp_min': 3.01}, 'wind': {'speed': 1.96, 'deg': 187}, 'date': "2020-01-27 15:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 1.33, 'temp_kf': -273.15, 'temp_max': 1.33, 'temp_min': 1.33}, 'wind': {'speed': 3.08, 'deg': 141}, 'date': "2020-01-27 18:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 1.25, 'temp_kf': -273.15, 'temp_max': 1.25, 'temp_min': 1.25}, 'wind': {'speed': 3.64, 'deg': 140}, 'date': "2020-01-27 21:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 1.46, 'temp_kf': -273.15, 'temp_max': 1.46, 'temp_min': 1.46}, 'wind': {'speed': 5.11, 'deg': 138}, 'date': "2020-01-28 00:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 2.65, 'temp_kf': -273.15, 'temp_max': 2.65, 'temp_min': 2.65}, 'wind': {'speed': 6.79, 'deg': 142}, 'date': "2020-01-28 03:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.88, 'temp_kf': -273.15, 'temp_max': 3.88, 'temp_min': 3.88}, 'wind': {'speed': 5.3, 'deg': 164}, 'date': "2020-01-28 06:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 5.47, 'temp_kf': -273.15, 'temp_max': 5.47, 'temp_min': 5.47}, 'wind': {'speed': 5.01, 'deg': 143}, 'date': "2020-01-28 09:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 6.44, 'temp_kf': -273.15, 'temp_max': 6.44, 'temp_min': 6.44}, 'wind': {'speed': 3.59, 'deg': 335}, 'date': "2020-01-28 12:00:00+00"},
            {'status': 'Rain', 'temp': {'temp': 5.16, 'temp_kf': -273.15, 'temp_max': 5.16, 'temp_min': 5.16}, 'wind': {'speed': 3.21, 'deg': 264}, 'date': "2020-01-28 15:00:00+00"},
            {'status': 'Rain', 'temp': {'temp': 3.55, 'temp_kf': -273.15, 'temp_max': 3.55, 'temp_min': 3.55}, 'wind': {'speed': 3.59, 'deg': 321}, 'date': "2020-01-28 18:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.97, 'temp_kf': -273.15, 'temp_max': 3.97, 'temp_min': 3.97}, 'wind': {'speed': 7.12, 'deg': 301}, 'date': "2020-01-28 21:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 2.98, 'temp_kf': -273.15, 'temp_max': 2.98, 'temp_min': 2.98}, 'wind': {'speed': 6.25, 'deg': 277}, 'date': "2020-01-29 00:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 1.37, 'temp_kf': -273.15, 'temp_max': 1.37, 'temp_min': 1.37}, 'wind': {'speed': 3.69, 'deg': 263}, 'date': "2020-01-29 03:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 2.09, 'temp_kf': -273.15, 'temp_max': 2.09, 'temp_min': 2.09}, 'wind': {'speed': 5.82, 'deg': 213}, 'date': "2020-01-29 06:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 4.53, 'temp_kf': -273.15, 'temp_max': 4.53, 'temp_min': 4.53}, 'wind': {'speed': 3.18, 'deg': 260}, 'date': "2020-01-29 09:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 5.56, 'temp_kf': -273.15, 'temp_max': 5.56, 'temp_min': 5.56}, 'wind': {'speed': 11.16, 'deg': 291}, 'date': "2020-01-29 12:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 4.4, 'temp_kf': -273.15, 'temp_max': 4.4, 'temp_min': 4.4}, 'wind': {'speed': 9.39, 'deg': 296}, 'date': "2020-01-29 15:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.49, 'temp_kf': -273.15, 'temp_max': 3.49, 'temp_min': 3.49}, 'wind': {'speed': 12.78, 'deg': 298}, 'date': "2020-01-29 18:00:00+00"},
            {'status': 'Clear', 'temp': {'temp': 2.37, 'temp_kf': -273.15, 'temp_max': 2.37, 'temp_min': 2.37}, 'wind': {'speed': 6.79, 'deg': 288}, 'date': "2020-01-29 21:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 2.59, 'temp_kf': -273.15, 'temp_max': 2.59, 'temp_min': 2.59}, 'wind': {'speed': 8.32, 'deg': 292}, 'date': "2020-01-30 00:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 1.8, 'temp_kf': -273.15, 'temp_max': 1.8, 'temp_min': 1.8}, 'wind': {'speed': 7.83, 'deg': 294}, 'date': "2020-01-30 03:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 1.06, 'temp_kf': -273.15, 'temp_max': 1.06, 'temp_min': 1.06}, 'wind': {'speed': 5.74, 'deg': 303}, 'date': "2020-01-30 06:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.67, 'temp_kf': -273.15, 'temp_max': 3.67, 'temp_min': 3.67}, 'wind': {'speed': 9.05, 'deg': 305}, 'date': "2020-01-30 09:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 5.38, 'temp_kf': -273.15, 'temp_max': 5.38, 'temp_min': 5.38}, 'wind': {'speed': 9.72, 'deg': 299}, 'date': "2020-01-30 12:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 4.55, 'temp_kf': -273.15, 'temp_max': 4.55, 'temp_min': 4.55}, 'wind': {'speed': 4.51, 'deg': 294}, 'date': "2020-01-30 15:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.21, 'temp_kf': -273.15, 'temp_max': 3.21, 'temp_min': 3.21}, 'wind': {'speed': 4.77, 'deg': 298}, 'date': "2020-01-30 18:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 1.39, 'temp_kf': -273.15, 'temp_max': 1.39, 'temp_min': 1.39}, 'wind': {'speed': 1.37, 'deg': 269}, 'date': "2020-01-30 21:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 0.23, 'temp_kf': -273.15, 'temp_max': 0.23, 'temp_min': 0.23}, 'wind': {'speed': 1.08, 'deg': 155}, 'date': "2020-01-31 00:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -0.07, 'temp_kf': -273.15, 'temp_max': -0.07, 'temp_min': -0.07}, 'wind': {'speed': 0.35, 'deg': 28}, 'date': "2020-01-31 03:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': -0.09, 'temp_kf': -273.15, 'temp_max': -0.09, 'temp_min': -0.09}, 'wind': {'speed': 0.47, 'deg': 342}, 'date': "2020-01-31 06:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 3.67, 'temp_kf': -273.15, 'temp_max': 3.67, 'temp_min': 3.67}, 'wind': {'speed': 1.49, 'deg': 286}, 'date': "2020-01-31 09:00:00+00"},
            {'status': 'Clouds', 'temp': {'temp': 6.95, 'temp_kf': -273.15, 'temp_max': 6.95, 'temp_min': 6.95}, 'wind': {'speed': 1.9, 'deg': 258}, 'date': "2020-01-31 12:00:00+00"}
        ]
    }
    # correct weather data forecast dates
    fixed_forecast = []
    now = datetime.now()
    datapoint_datetime = datetime.strptime(weather_data["forecast"][0]["date"], "%Y-%m-%d %H:%M:%S+00")
    diff = now - datapoint_datetime
    for x in weather_data["forecast"]:
        x_date = datetime.strptime(x["date"], "%Y-%m-%d %H:%M:%S+00")
        x["date"] = x_date + timedelta(days=diff.days + 1)
        x["date"] = x["date"].strftime("%Y-%m-%d %H:%M:%S+00")
        fixed_forecast.append(x)
    weather_data["forecast"] = fixed_forecast
    owm = pyowm.OWM(owm_token)
    observation = owm.weather_at_coords(latitude, longitude)
    w = observation.get_weather()

    # the live observation replaces the mock data above (only the mock forecast survives)
    weather_data = {
        'wind': w.get_wind(),
        'humidity': w.get_humidity(),
        'temp': w.get_temperature('celsius'),
        'clouds': w.get_clouds(),
        'pressure': w.get_pressure(),
        'status': w.get_status(),
        'observation_time': observation.get_reception_time(timeformat="iso")
    }

    screen = None
    if screen_type == "ring":
        screen = RingScreen(coordinates=(latitude, longitude),
                            weather_data=weather_data)
    elif screen_type == "quadrant":
        screen = QuadrantScreen(coordinates=(latitude, longitude),
                                weather_data=weather_data)
    elif screen_type == "image":
        screen = ImageScreen(path=image_path)
    else:
        screen = ScreenBase()

    image = screen.render()
    image.show()
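The forecast fixup above shifts every mock datapoint forward by a whole number of days so it lands after the current time; a standalone sketch of that arithmetic (the function name is mine, `now` is passed explicitly so the result is reproducible):

```python
from datetime import datetime, timedelta

FMT = "%Y-%m-%d %H:%M:%S+00"


def shift_forecast_date(date_str, now):
    """Sketch of the mock-forecast fixup: move a datapoint past `now` by whole days."""
    dt = datetime.strptime(date_str, FMT)
    diff = now - dt
    return (dt + timedelta(days=diff.days + 1)).strftime(FMT)


# 2020-01-26 15:00 shifted relative to 2020-02-01 16:00 -> 7 days forward
print(shift_forecast_date("2020-01-26 15:00:00+00", datetime(2020, 2, 1, 16, 0)))
```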

# evergreen/manage/website/page/urls.py (craigsander/evergreen, MIT license)
from django.conf.urls import url, include
from django.conf import settings
from . import views
# Wire up our API using automatic URL routing.
# Additionally, we include login URLs for the browsable API.
urlpatterns = [
    url(r'manage/', views.index),
]
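Note that Django's old-style `url()` patterns are matched with `re.search` semantics, so the unanchored `r'manage/'` above also matches paths that merely contain that substring; a quick illustration with plain `re` (standalone, not Django itself):

```python
import re

# An unanchored pattern matches anywhere in the path...
assert re.search(r'manage/', 'manage/') is not None
assert re.search(r'manage/', 'foo/manage/') is not None  # possibly unintended

# ...while a ^-anchored pattern only matches at the start.
assert re.search(r'^manage/', 'foo/manage/') is None
print("pattern checks passed")
```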

# python/rsa_encrypt_decrypt.py (hipro/hipro, Apache-2.0 license)
# coding: utf-8
"""加密算法:公钥(私钥)加密,私钥解密"""
from Crypto.PublicKey import RSA
from Crypto import Random
DATA = 'Hello, word!'
PRIVATE_KEY_PEM = """-----BEGIN RSA PRIVATE KEY-----
MIICXQIBAAKBgQDB3c0nwVs6koPkpt6REeT07jK7m9qE9BmDw1Zl55T66rGfKM3g
1DFBq7jtcZ+xcgYAGgvJWPW16nylag/1lVNUxMShm2jlp3MwuBNKRvrXP2u29j9v
AAlM9lMLXzt0Ui4ZfLF9abpti5oD9tWy29Sp9Lt+0OWHKxp1QRazmykQeQIDAQAB
AoGAdL4FMcB9GFtscz+NXVyiPGBISrOCtndr+e2iVIFNNIAp8AcZWx9MfhhTpyC6
IpfgRyVoHZqldCO9Zbrl22RNpfybrP/2BeHx9xJWDXLXNAvDkZNCokCtc/bZYaQU
XCSYHUAmV078E0xZShwMwGu1YgZlz9er3XsqqBrT9ujDjIECQQDTOt+ukShtMJQd
6soNTA5+LU/kA+MKRB7oNPoviEMRRGeonD2ZXbjmzY6i1XJ/YsKPVuMkkvYCtPEY
KcvtCSApAkEA6vTMUBViRTr1Db63WBGpobAr9V8kiiMn6q2TuRBITsyijOgL6u+X
CrpRf+KDVyWC06ZHS/UFPPi+lubIgAU30QJAKtMp3HOTlaeer/4VHuMHoS9AnkLn
egJbncp32sEuj8almXqrxndI8IpGW98YipkURwlfnd+pvty+cJ6wuIr8GQJBAN/2
33cLGzSQ4ZzrigtqMr+Mlip8OfFvV5JtSR4kdjie+efFHe8h2WGBf0SfH8GHYTDt
FJNECW04Uzy22rKlxrECQQCtOkedu7SDr4tb3miKPNy5jyoVBRIR4QElE6DfZoDX
sxf4NowzBDwLbhYHNzSCl0xlIAA/xvFtRkEDtlYjq58n
-----END RSA PRIVATE KEY-----"""
PUBLIC_KEY_PEM = """-----BEGIN PUBLIC KEY-----
MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDB3c0nwVs6koPkpt6REeT07jK7
m9qE9BmDw1Zl55T66rGfKM3g1DFBq7jtcZ+xcgYAGgvJWPW16nylag/1lVNUxMSh
m2jlp3MwuBNKRvrXP2u29j9vAAlM9lMLXzt0Ui4ZfLF9abpti5oD9tWy29Sp9Lt+
0OWHKxp1QRazmykQeQIDAQAB
-----END PUBLIC KEY-----"""
def _encrypt_by_public():
    random_func = Random.new().read
    public_key = RSA.importKey(PUBLIC_KEY_PEM)
    encrypted = public_key.encrypt(DATA, random_func)
    return encrypted


def _encrypt_by_private():
    random_func = Random.new().read
    private_key = RSA.importKey(PRIVATE_KEY_PEM)
    encrypted = private_key.encrypt(DATA, random_func)
    return encrypted


def _decrypt_by_private(msg_encrypt):
    private_key = RSA.importKey(PRIVATE_KEY_PEM)
    decrypted = private_key.decrypt(msg_encrypt)
    return decrypted

def _decrypt_by_public_err(msg_encrypt):
    """Does not work: an RSA public key cannot decrypt (raises TypeError)."""
    public_key = RSA.importKey(PUBLIC_KEY_PEM)
    decrypted = public_key.decrypt(msg_encrypt)
    return decrypted

if __name__ == '__main__':
    print(DATA, _decrypt_by_private(_encrypt_by_public()))
    print(DATA, _decrypt_by_private(_encrypt_by_private()))

    try:
        print(DATA, _decrypt_by_public_err(_encrypt_by_public()))
    except TypeError as e1:
        print(DATA, e1)

    try:
        print(DATA, _decrypt_by_public_err(_encrypt_by_private()))
    except TypeError as e2:
        print(DATA, e2)
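The encrypt/decrypt relationship demonstrated above rests on modular exponentiation with inverse exponents; a textbook-sized sketch using tiny primes (illustration only, never use such parameters for real cryptography; `pow(e, -1, phi)` needs Python 3.8+):

```python
# Toy RSA: keygen, encrypt with public exponent, decrypt with private exponent.
p, q = 61, 53
n = p * q                     # modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent = modular inverse of e

m = 42                        # "plaintext" as a number < n
c = pow(m, e, n)              # encrypt with the public key
print(pow(c, d, n))           # decrypt with the private key -> 42

# Encrypting with the private exponent (signing) inverts with the public one:
print(pow(pow(m, d, n), e, n))  # -> 42
```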

# shop_website/shop/migrations/0002_auto_20200228_1533.py (omar00070/django-shopping-website, MIT license)
# Generated by Django 3.0.3 on 2020-02-28 15:33
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('shop', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='product',
            name='photo',
            field=models.ImageField(default='default.jpg', upload_to='product_image'),
        ),
    ]

# chainer/links/connection/mgu.py (chainer, MIT license)
import numpy
import chainer
from chainer.backends import cuda
from chainer.functions.activation import sigmoid
from chainer.functions.activation import tanh
from chainer.functions.array import concat
from chainer.functions.math import linear_interpolate
from chainer import link
from chainer.links.connection import linear


class MGUBase(link.Chain):

    def __init__(self, n_inputs, n_units):
        super(MGUBase, self).__init__()
        with self.init_scope():
            self.W_f = linear.Linear(n_inputs + n_units, n_units)
            self.W_h = linear.Linear(n_inputs + n_units, n_units)

    def _call_mgu(self, h, x):
        f = sigmoid.sigmoid(self.W_f(concat.concat([h, x])))
        h_bar = tanh.tanh(self.W_h(concat.concat([f * h, x])))
        h_new = linear_interpolate.linear_interpolate(f, h_bar, h)
        return h_new


class StatelessMGU(MGUBase):

    forward = MGUBase._call_mgu


class StatefulMGU(MGUBase):

    def __init__(self, in_size, out_size):
        super(StatefulMGU, self).__init__(in_size, out_size)
        self._state_size = out_size
        self.reset_state()

    def _to_device(self, device, skip_between_cupy_devices=False):
        # Overrides Link._to_device
        # TODO(niboshi): Avoid forcing concrete links to override _to_device
        device = chainer.get_device(device)
        super(StatefulMGU, self)._to_device(
            device, skip_between_cupy_devices=skip_between_cupy_devices)
        if self.h is not None:
            if not (skip_between_cupy_devices
                    and device.xp is cuda.cupy
                    and isinstance(self.h, cuda.ndarray)):
                self.h.to_device(device)
        return self

    def set_state(self, h):
        assert isinstance(h, chainer.Variable)
        h_ = h
        if self.xp is numpy:
            h_.to_cpu()
        else:
            h_.to_gpu()
        self.h = h_

    def reset_state(self):
        self.h = None

    def forward(self, x):
        if self.h is None:
            n_batch = x.shape[0]
            dtype = chainer.get_dtype()
            h_data = self.xp.zeros(
                (n_batch, self._state_size), dtype=dtype)
            h = chainer.Variable(h_data)
        else:
            h = self.h

        self.h = self._call_mgu(h, x)
        return self.h
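The `_call_mgu` update computes f = sigmoid(W_f·[h, x]), h_bar = tanh(W_h·[f*h, x]) and blends them as f·h_bar + (1 - f)·h (the `linear_interpolate` call). A NumPy sketch of one step, with weights passed explicitly and without biases (names and signature are mine, not the Chainer API):

```python
import numpy as np


def mgu_step(h, x, W_f, W_h):
    """One Minimal Gated Unit step: a single forget gate f drives both
    the candidate state and the old/new blend."""
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    f = sig(np.concatenate([h, x]) @ W_f)            # forget gate
    h_bar = np.tanh(np.concatenate([f * h, x]) @ W_h)  # candidate state
    return f * h_bar + (1.0 - f) * h                 # linear_interpolate(f, h_bar, h)


# With zero weights and zero state, the new state stays zero.
out = mgu_step(np.zeros(2), np.zeros(3), np.zeros((5, 2)), np.zeros((5, 2)))
print(out)
```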

# deepinesStore/cardg.py (xoascf/store_deepines, Apache-2.0 license)
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'guis/card.ui'
#
# Created by: PyQt5 UI code generator 5.13.0
#
# WARNING! All changes made in this file will be lost!

from PyQt5 import QtCore, QtGui, QtWidgets


class QLabelClickable(QtWidgets.QLabel):
    clicked = QtCore.pyqtSignal()

    def __init__(self, *args):
        QtWidgets.QLabel.__init__(self, *args)

    def mouseReleaseEvent(self, ev):
        self.clicked.emit()


class Ui_Frame(object):
    def setupUi(self, Frame):
        Frame.setObjectName("Frame")
        Frame.resize(230, 249)
        Frame.setMinimumSize(QtCore.QSize(226, 249))
        Frame.setMaximumSize(QtCore.QSize(230, 16777215))
        self.verticalLayout = QtWidgets.QVBoxLayout(Frame)
        self.verticalLayout.setContentsMargins(0, 0, 0, 0)
        self.verticalLayout.setSpacing(0)
        self.verticalLayout.setObjectName("verticalLayout")
        self.image_app = QLabelClickable(Frame)
        self.image_app.setText("")
        self.image_app.setScaledContents(True)
        self.image_app.setAlignment(QtCore.Qt.AlignCenter)
        self.image_app.setCursor(QtGui.QCursor(QtCore.Qt.PointingHandCursor))
        self.image_app.setObjectName("image_app")
        self.image_app.setStyleSheet("#image_app{margin-top: 10px;}")
        self.verticalLayout.addWidget(self.image_app)
        self.lbl_name_app = QLabelClickable(Frame)
        self.lbl_name_app.setStyleSheet("background-color: transparent;"
                                        "margin-top:5px;")
        self.lbl_name_app.setText("")
        self.lbl_name_app.setAlignment(QtCore.Qt.AlignCenter)
        self.lbl_name_app.setCursor(QtGui.QCursor(QtCore.Qt.PointingHandCursor))
        font = QtGui.QFont()
        font.setFamily("Segoe UI Semibold")
        font.setPointSize(11)
        font.setItalic(False)
        self.lbl_name_app.setFont(font)
        self.lbl_name_app.setWordWrap(True)
        self.lbl_name_app.setObjectName("lbl_name_app")
        self.verticalLayout.addWidget(self.lbl_name_app)
        self.btn_select_app = QLabelClickable(Frame)
        font = QtGui.QFont()
        font.setFamily("Segoe UI Semibold")
        font.setPointSize(9)
        font.setItalic(False)
        self.btn_select_app.setFont(font)
        self.btn_select_app.setWordWrap(True)
        self.btn_select_app.setAlignment(QtCore.Qt.AlignCenter)
        self.btn_select_app.setCursor(QtGui.QCursor(QtCore.Qt.PointingHandCursor))
        self.btn_select_app.setObjectName("btn_select_app")
        self.verticalLayout.addWidget(self.btn_select_app)

        self.retranslateUi(Frame)
        QtCore.QMetaObject.connectSlotsByName(Frame)

    def retranslateUi(self, Frame):
        _translate = QtCore.QCoreApplication.translate
        Frame.setWindowTitle(_translate("Frame", "Card"))
        self.btn_select_app.setText(_translate("Frame", "Instalar"))
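`QLabelClickable` works because `clicked = QtCore.pyqtSignal()` supports the connect/emit protocol: `mouseReleaseEvent` calls `emit()`, which invokes every connected slot. A minimal pure-Python stand-in for that protocol (a sketch of the pattern, not the PyQt implementation):

```python
class Signal:
    """Minimal connect/emit signal, mimicking what pyqtSignal provides."""

    def __init__(self):
        self._slots = []

    def connect(self, slot):
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)


log = []
clicked = Signal()
clicked.connect(log.append)   # like label.clicked.connect(handler)
clicked.emit("clicked")       # what mouseReleaseEvent would trigger
print(log)                    # ['clicked']
```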

# src/Deque/deque_scratch.py (shapovalovdev/AlgorythmsAndDataStructures, Apache-2.0 license)
class Node:
    def __init__(self, v):
        self.next = None
        self.prev = None
        self.value = v


class Deque:
    def __init__(self):
        self.front = None
        self.tail = None

    def addFront(self, item):
        node = Node(item)
        if self.front is None:  # case of no items
            self.front = node
            self.tail = node
        else:  # one or more items: link the new node before the old front
            self.front.prev = node
            node.next = self.front
            self.front = node

    def addTail(self, item):
        node = Node(item)
        if self.front is None:  # case of no items (tail must be set too)
            self.front = node
            self.tail = node
        else:
            self.tail.next = node
            node.prev = self.tail
            self.tail = node

    def removeFront(self):
        if self.front is None:
            return None  # the deque is empty
        item = self.front
        if item.next is not None:  # two or more items
            self.front = item.next
            self.front.prev = None
        else:  # exactly one item
            self.front = None
            self.tail = None
        return item.value

    def removeTail(self):
        if self.front is None:
            return None  # the deque is empty
        if self.front.next is None:  # exactly one item
            item = self.front
            self.front = None
            self.tail = None
        else:  # two or more items
            item = self.tail
            self.tail = item.prev
            self.tail.next = None
        return item.value

    def size(self):
        node = self.front
        length = 0
        while node is not None:
            length += 1
            node = node.next
        return length

    def getFront(self):
        if self.front is None:
            return None
        return self.front.value

    def getTail(self):
        if self.tail is None:
            return None
        return self.tail.value
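For comparison, the standard library's `collections.deque` already provides these four operations in O(1); a quick sketch of the correspondence (built-in names on the right of each comment):

```python
from collections import deque

d = deque()
d.appendleft(1)   # addFront
d.append(2)       # addTail
d.appendleft(0)   # addFront

print(d[0], d[-1], len(d))   # getFront, getTail, size -> 0 2 3

front = d.popleft()          # removeFront -> 0
tail = d.pop()               # removeTail  -> 2
print(front, tail, list(d))  # 0 2 [1]
```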

# ife/features/tests/test_features.py (Collonville/ImageFeatureExtractor, BSD-3-Clause license)
import unittest
from collections import defaultdict

import numpy as np
import pandas as pd

from ife.io.io import ImageReader


class TestMomentFeatures(unittest.TestCase):
    def test_moment_output_type(self) -> None:
        features = ImageReader.read_from_single_file("ife/data/small_rgb.jpg")

        moment = features.moment()
        self.assertIs(np.ndarray, type(moment))

        moment = features.moment(output_type="")
        self.assertIs(np.ndarray, type(moment))

        moment = features.moment(output_type="one_col")
        self.assertIs(np.ndarray, type(moment))
        self.assertEqual(np.zeros(15).shape, moment.shape)  # type: ignore

        moment = features.moment(output_type="dict")
        self.assertIs(defaultdict, type(moment))

        moment = features.moment(output_type="pandas")
        self.assertIs(pd.DataFrame, type(moment))

    def test_colourfulness_output_type(self) -> None:
        features = ImageReader.read_from_single_file("ife/data/small_rgb.jpg")

        moment = features.colourfulness()
        self.assertIs(np.float64, type(moment))

        moment = features.colourfulness(output_type="")
        self.assertIs(np.float64, type(moment))

        moment = features.colourfulness(output_type="one_col")
        self.assertIs(np.float64, type(moment))

        moment = features.colourfulness(output_type="dict")
        self.assertIs(dict, type(moment))

        moment = features.colourfulness(output_type="pandas")
        self.assertIs(pd.DataFrame, type(moment))

# mysite/timesheets/models.py (xanderyzwich/Timesheets, MIT license)
"""The database models and form based on the timesheet model"""
import datetime

from django.db import models
from django.forms import ModelForm, ValidationError


# Create your models here.
class Task(models.Model):
    """Used to support Timesheet class"""
    type = models.CharField(max_length=25)

    class Meta:
        ordering = ('type',)

    def __str__(self):
        return self.type


class Employee(models.Model):
    """Used to support Timesheet class"""
    id = models.IntegerField(primary_key=True)  # PK
    first_name = models.CharField(max_length=25)
    last_name = models.CharField(max_length=25)
    created_date = models.DateField(default=datetime.date.today)

    def name(self):
        return self.first_name + ' ' + self.last_name

    def __str__(self):
        return str(self.id) + ' ' + str(self.first_name) + ' ' + str(self.last_name)


class App(models.Model):
    """Used to support Timesheet class"""
    id = models.IntegerField(primary_key=True)
    name = models.CharField(max_length=25)
    created_date = models.DateField(default=datetime.date.today)

    def __str__(self):
        return str(self.id) + ' ' + str(self.name)


class Defect(models.Model):
    """Used to support Timesheet class"""
    id = models.CharField(primary_key=True, max_length=25)
    app = models.ForeignKey(App, on_delete=models.PROTECT)
    description = models.CharField(max_length=50)
    created_date = models.DateField(default=datetime.date.today)

    def __str__(self):
        return str(self.id) + ' ' + str(self.app) + ' ' + str(self.description)


class Adhoc(models.Model):
    """Used to support Timesheet class"""
    id = models.IntegerField(primary_key=True)
    description = models.CharField(max_length=50)
    hours_projected = models.IntegerField(default=0)
    created_date = models.DateField(default=datetime.date.today)

    def __str__(self):
        output = 'Adhoc Task: ' + str(self.id) + ' - ' + str(self.description)
        if int(self.hours_projected) > 0:
            output += ' (' + str(self.hours_projected) + ' hours projected)'
        return output


class Timesheet(models.Model):  # PK=EMP_ID,APP_ID,TASK_TYPE,DEFECT_ID,ADHOC_ID,TASK_DATE
    """Primary class/table for this application"""
    emp = models.ForeignKey(Employee, on_delete=models.PROTECT)
    app = models.ForeignKey(App, on_delete=models.PROTECT)
    task = models.ForeignKey(Task, on_delete=models.PROTECT)
    defect = models.ForeignKey(Defect, on_delete=models.PROTECT, default=None, blank=True, null=True)
    adhoc = models.ForeignKey(Adhoc, on_delete=models.PROTECT, default=None, blank=True, null=True)
    date = models.DateField(default=datetime.date.today)
    hours = models.DecimalField(decimal_places=2, max_digits=4)

    class Meta:
        ordering = ('-date', 'emp__id', 'app__id', 'task__type', 'defect__id', 'adhoc__id')

    def __str__(self):
        return str('Employee: ' + str(self.emp) + ' App: ' + str(self.app) + ' Task: ' + str(self.task)
                   + ' Defect: ' + str(self.defect) + ' Adhoc: ' + str(self.adhoc)
                   + ' Hours: ' + str(self.hours))


class TimesheetForm(ModelForm):
    """Input form for Timesheet data entry by user"""

    class Meta:
        model = Timesheet
        fields = ['emp', 'app', 'task', 'defect', 'adhoc', 'date', 'hours']

    def clean(self):
        cleaned_data = super().clean()
        task = cleaned_data.get('task')
        adhoc = cleaned_data.get('adhoc')
        defect = cleaned_data.get('defect')
        if task.type == 'Adhoc' and adhoc is None:
            self.fields['adhoc'].required = True
            raise ValidationError("Adhoc item is required")
        elif task.type == 'Defect' and defect is None:
            self.fields['defect'].required = True
            raise ValidationError("Defect item is required")
        elif adhoc is not None and task.type != 'Adhoc':
raise ValidationError("Adhoc requires matching Task")
elif defect is not None and task.type != 'Defect':
raise ValidationError("Defect requires matching Task")
return cleaned_data
| 34.881356 | 117 | 0.659864 | 519 | 4,116 | 5.084778 | 0.188825 | 0.061008 | 0.040925 | 0.054566 | 0.442213 | 0.419098 | 0.364532 | 0.333839 | 0.30125 | 0.271694 | 0 | 0.005542 | 0.210884 | 4,116 | 117 | 118 | 35.179487 | 0.806958 | 0.094023 | 0 | 0.27027 | 0 | 0 | 0.086839 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108108 | false | 0 | 0.040541 | 0.081081 | 0.702703 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d604e88c648108777cbfd88997a6da0f4142321d | 972 | py | Python | waferscreen/inst_control/inactive/keithley_2700_multimeter.py | chw3k5/WaferScreen | c0ca7fe939fe7cd0b722b7d6129b148c03a7505c | [
"Apache-2.0"
] | 1 | 2021-07-30T19:06:07.000Z | 2021-07-30T19:06:07.000Z | waferscreen/inst_control/inactive/keithley_2700_multimeter.py | chw3k5/WaferScreen | c0ca7fe939fe7cd0b722b7d6129b148c03a7505c | [
"Apache-2.0"
] | 8 | 2021-04-22T20:47:48.000Z | 2021-07-30T19:06:01.000Z | waferscreen/inst_control/inactive/keithley_2700_multimeter.py | chw3k5/WaferScreen | c0ca7fe939fe7cd0b722b7d6129b148c03a7505c | [
"Apache-2.0"
] | null | null | null | '''
Created on Mar 11, 2009
@author: schimaf
'''
import gpib_instrument
class Keithley2700Multimeter(gpib_instrument.Gpib_Instrument):
'''
classdocs
'''
def __init__(self, pad, board_number = 0, name = '', sad = 0, timeout = 13, send_eoi = 1, eos_mode = 0):
'''
Constructor
'''
super(Keithley2700Multimeter, self).__init__(board_number, name, pad, sad, timeout, send_eoi, eos_mode)
# GPIB identity string of the instrument
self.id_string = "KEITHLEY INSTRUMENTS INC.,MODEL 2700,0822752,B02"
self.manufacturer = 'Keithley'
self.model_number = '2700'
self.description = 'Multimeter'
self.compare_identity()
def data(self):
result = self.ask(':DATA?')
        print("result", result)
array = result.split(',')
y = array[0]
z = y[0:-3]
voltage = float(z)
return voltage | 24.923077 | 111 | 0.56893 | 104 | 972 | 5.125 | 0.567308 | 0.078799 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060241 | 0.316872 | 972 | 39 | 112 | 24.923077 | 0.74247 | 0.039095 | 0 | 0 | 0 | 0 | 0.100728 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.058824 | null | null | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d60722b9887f89e13eb10b2c11b054a0ed09389f | 324 | py | Python | test/test_comment.py | ExiaSR/server | f29ce921681d8b70f9e2541f7e251deb894bea29 | [
"Apache-2.0"
] | 3 | 2017-02-22T21:15:27.000Z | 2017-08-07T17:30:21.000Z | test/test_comment.py | ExiaSR/server | f29ce921681d8b70f9e2541f7e251deb894bea29 | [
"Apache-2.0"
] | 4 | 2017-02-24T00:47:02.000Z | 2017-03-20T08:51:02.000Z | test/test_comment.py | TeamGhostBuster/restful-api | f29ce921681d8b70f9e2541f7e251deb894bea29 | [
"Apache-2.0"
] | 1 | 2017-01-27T16:22:46.000Z | 2017-01-27T16:22:46.000Z | import pytest
from test.conftest import *
@pytest.mark.run(after='test_create_article_for_user')
@post('/article/{}/comment', {"comment": "shit posting #1"})
def test_post_comment_to_article(result=None, url_id=['article_id']):
assert result.status_code == 200
assert result.json()['content'] == 'shit posting #1'
| 32.4 | 69 | 0.725309 | 46 | 324 | 4.869565 | 0.630435 | 0.107143 | 0.107143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017301 | 0.108025 | 324 | 9 | 70 | 36 | 0.757785 | 0 | 0 | 0 | 0 | 0 | 0.311728 | 0.08642 | 0 | 0 | 0 | 0 | 0.285714 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d607862cf4892c18870cfcbccf290b4c451cd8b2 | 2,243 | py | Python | mazeinawall/generate_dataset.py | ncvescera/QRL-Maze_in_a_Wall | c150612bd17b1a70439853acfbdb2bc7d2b92a67 | [
"Apache-2.0"
] | 2 | 2022-02-25T15:55:28.000Z | 2022-03-24T17:41:42.000Z | mazeinawall/generate_dataset.py | ncvescera/QRL-Maze_in_a_Wall | c150612bd17b1a70439853acfbdb2bc7d2b92a67 | [
"Apache-2.0"
] | 1 | 2022-02-03T15:48:30.000Z | 2022-02-17T08:52:21.000Z | mazeinawall/generate_dataset.py | ncvescera/QRL-Maze_in_a_Wall | c150612bd17b1a70439853acfbdb2bc7d2b92a67 | [
"Apache-2.0"
] | null | null | null | from random import randint, seed
import numpy as np
from os import path, mkdir
from maze_utils import generate_grid
seed_number = 69
training_folder = "training"
testing_folder = "testing"
tot_elem_training = 100  # number of matrices to generate
tot_elem_testing = 20  # number of matrices to generate
max_w = 10  # maximum width
max_h = 10  # maximum height
min_w = 3  # minimum width
min_h = 3  # minimum height
def generate_dataset():
"""
    Generates the training and testing datasets by creating random matrices
    of maximum size 10x10, minimum size 3x3, with at least 1 wall
    :return:
    """
    # set the seed
np.random.seed(seed_number)
seed(seed_number)
generate_training(tot_elem_training)
generate_testing(tot_elem_testing)
def generate_testing(dim: int):
"""
    Generates the testing dataset.
    If the folder does not exist, it is created and populated with random matrices.
    :param dim: number of matrices to create
    :return:
    """
    # create the folder if it does not exist
if not path.exists(testing_folder):
mkdir(testing_folder)
for elem in range(dim):
file_name = f"{testing_folder}/matrice_{elem}"
        # random choice of w, h and walls
w = randint(min_w, max_w)
h = randint(min_h, max_h)
walls = randint(1, int(w * h / 2) - 1)
grid = generate_grid(w, h, walls=walls)
np.savetxt(file_name, grid, delimiter=" ", fmt='%i')
def generate_training(dim: int):
"""
    Generates the training dataset.
    If the folder does not exist, it is created and populated with random matrices.
    :param dim: number of matrices to create
    :return:
    """
    # create the folder if it does not exist
if not path.exists(training_folder):
mkdir(training_folder)
for elem in range(dim):
        file_name = f"{training_folder}/matrice_{elem}"
        # random choice of w, h and walls
w = randint(min_w, max_w)
h = randint(min_h, max_h)
walls = randint(1, int(w * h / 2) - 1)
grid = generate_grid(w, h, walls=walls)
np.savetxt(file_name, grid, delimiter=" ", fmt='%i')
if __name__ == "__main__":
generate_dataset()
| 25.488636 | 76 | 0.637539 | 326 | 2,243 | 4.217791 | 0.269939 | 0.011636 | 0.043636 | 0.049455 | 0.552 | 0.491636 | 0.458182 | 0.458182 | 0.458182 | 0.411636 | 0 | 0.01599 | 0.275078 | 2,243 | 87 | 77 | 25.781609 | 0.829643 | 0.316095 | 0 | 0.3 | 1 | 0 | 0.063624 | 0.043568 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0 | 0.1 | 0 | 0.175 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
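`generate_training` and `generate_testing` above are near-duplicates. As a hedged illustration (the helper name `random_maze_params` is invented here, and `generate_grid` plus the file writing are omitted because `maze_utils` is not part of this dump), the shared sizing logic could live in one place:

```python
from random import randint, seed

MIN_W, MAX_W, MIN_H, MAX_H = 3, 10, 3, 10  # same bounds as the module above


def random_maze_params(rng_seed=None):
    """Pick a width, height and wall count the way both generators do."""
    if rng_seed is not None:
        seed(rng_seed)
    w = randint(MIN_W, MAX_W)
    h = randint(MIN_H, MAX_H)
    walls = randint(1, int(w * h / 2) - 1)  # at least 1 wall
    return w, h, walls


w, h, walls = random_maze_params(69)
```

Each generator could then call this helper and only differ in the output folder.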
d60b9e43d95710ca6c53adbaecd49a801012a6ac | 660 | py | Python | controllers/home.py | elydev01/kvtemplate2 | e1563574795b711b722d5219e8e8a7a158137884 | [
"MIT"
] | null | null | null | controllers/home.py | elydev01/kvtemplate2 | e1563574795b711b722d5219e8e8a7a158137884 | [
"MIT"
] | null | null | null | controllers/home.py | elydev01/kvtemplate2 | e1563574795b711b722d5219e8e8a7a158137884 | [
"MIT"
] | null | null | null | from kivy.lang import Builder
from kivy.metrics import dp
from kivy import properties as p
from kivy.animation import Animation
from kivymd.app import MDApp as App
from kivymd.uix.screen import MDScreen
class HomeMainScreen(MDScreen):
bg_pos = p.NumericProperty(0)
def toggle_bg_pos(self):
bg_pos = 0 if self.bg_pos > 0 else dp(self.height/2)
Animation(bg_pos=bg_pos).start(self)
with open('views/home.kv', encoding='utf-8') as f:
Builder.load_string(f.read())
class HomeScreenApp(App):
def build(self):
return HomeMainScreen()
def main():
HomeScreenApp().run()
if __name__ == '__main__':
main()
| 20 | 60 | 0.698485 | 98 | 660 | 4.540816 | 0.5 | 0.067416 | 0.040449 | 0.044944 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009416 | 0.195455 | 660 | 32 | 61 | 20.625 | 0.828625 | 0 | 0 | 0 | 0 | 0 | 0.039394 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.3 | 0.05 | 0.65 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d60fbcd4c5db31db47d319f9c2fd9dae6d72dff3 | 260 | py | Python | H/283. Move Zeroes.py | shaohy/leetcode | a5a4bbd768b5d5e394f327785bd99c8a3b11ae0b | [
"MIT"
] | null | null | null | H/283. Move Zeroes.py | shaohy/leetcode | a5a4bbd768b5d5e394f327785bd99c8a3b11ae0b | [
"MIT"
] | null | null | null | H/283. Move Zeroes.py | shaohy/leetcode | a5a4bbd768b5d5e394f327785bd99c8a3b11ae0b | [
"MIT"
] | null | null | null | class Solution:
def moveZeroes(self, nums: List[int]) -> None:
"""
Do not return anything, modify nums in-place instead.
"""
        last = 0  # index where the next non-zero element goes
        for i in range(len(nums)):
            if nums[i] != 0:
                nums[last], nums[i] = nums[i], nums[last]
                last += 1 | 28.888889 | 61 | 0.476923 | 31 | 260 | 4 | 0.741935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006536 | 0.411538 | 260 | 9 | 62 | 28.888889 | 0.803922 | 0.203846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
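As a standalone sketch of the canonical in-place two-pointer approach to Move Zeroes (plain function, no LeetCode harness; the example values are arbitrary):

```python
def move_zeroes(nums):
    """Shift all zeroes to the end in-place, keeping relative order."""
    last = 0  # index where the next non-zero element goes
    for i in range(len(nums)):
        if nums[i] != 0:
            nums[last], nums[i] = nums[i], nums[last]
            last += 1


nums = [0, 1, 0, 3, 12]
move_zeroes(nums)  # nums is now [1, 3, 12, 0, 0]
```

Swapping forward avoids the pitfalls of appending to and removing from a list while iterating over it.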
d611e9107074951bab0a620dc787648f0af98fce | 3,045 | py | Python | ultimate-utils-proj-src/uutils/torch/torch_geometric/__init__.py | CBMM/ultimate-utils | 2179eb6a510128ecefea5e2e19108098f6728b05 | [
"MIT"
] | null | null | null | ultimate-utils-proj-src/uutils/torch/torch_geometric/__init__.py | CBMM/ultimate-utils | 2179eb6a510128ecefea5e2e19108098f6728b05 | [
"MIT"
] | null | null | null | ultimate-utils-proj-src/uutils/torch/torch_geometric/__init__.py | CBMM/ultimate-utils | 2179eb6a510128ecefea5e2e19108098f6728b05 | [
"MIT"
] | null | null | null |
# def draw_nx(g, labels=None):
# import matplotlib.pyplot as plt
# if labels is not None:
# g = nx.relabel_nodes(g, labels)
# pos = nx.kamada_kawai_layout(g)
# nx.draw(g, pos, with_labels=True)
# plt.show()
#
# def draw_nx_attributes_as_labels(g, attribute):
# # import pylab
# import matplotlib.pyplot as plt
# import networkx as nx
# labels = nx.get_node_attributes(g, attribute)
# pos = nx.kamada_kawai_layout(g)
# nx.draw(g, pos, labels=labels, with_labels=True)
# # nx.draw(g, labels=labels)
# # pylab.show()
# plt.show()
#
# def draw_nx_with_pygraphviz(g, path2file=None, save_file=False):
# attribute_name = None
# draw_nx_with_pygraphviz_attribtes_as_labels(g, attribute_name, path2file, save_file)
#
# def draw_nx_with_pygraphviz_attribtes_as_labels(g, attribute_name, path2file=None, save_file=False):
# import matplotlib.pyplot as plt
# import matplotlib.image as mpimg
#
# # https://stackoverflow.com/questions/15345192/draw-more-information-on-graph-nodes-using-pygraphviz
# # https://stackoverflow.com/a/67442702/1601580
#
# if path2file is None:
# path2file = './example.png'
# path2file = Path(path2file).expanduser()
# save_file = True
# if type(path2file) == str:
# path2file = Path(path2file).expanduser()
# save_file = True
#
# print(f'\n{g.is_directed()=}')
# g = nx.nx_agraph.to_agraph(g)
# if attribute_name is not None:
# print(f'{g=}')
# # to label in pygrapviz make sure to have the AGraph obj have the label attribute set on the nodes
# g = str(g)
# g = g.replace(attribute_name, 'label')
# print(g)
# # g = pgv.AGraph(g)
# g = pgv.AGraph(g)
# g.layout()
# g.draw(path2file)
#
# # https://stackoverflow.com/questions/20597088/display-a-png-image-from-python-on-mint-15-linux
# img = mpimg.imread(path2file)
# plt.imshow(img)
# plt.show()
#
# # remove file https://stackoverflow.com/questions/6996603/how-to-delete-a-file-or-folder
# if not save_file:
# path2file.unlink()
# tests
def test1():
# conda install -y pytorch-geometric -c rusty1s -c conda-forge
import torch
from torch_geometric.data import Data
# [2, number_edges], edge = (node_idx1, node_idx2), e.g. e = (0,1) = (n0, n1) (note this is reflected on the type torch long)
edge_index = torch.tensor([[0, 1, 1, 2],
[1, 0, 2, 1]], dtype=torch.long)
# features to each node [num_nodes, D]
x = torch.tensor([[0.0], [-1.0], [1.0]])
data = Data(x=x, edge_index=edge_index)
print(data)
# https://discuss.pytorch.org/t/pytorch-geometric/44994
# https://stackoverflow.com/questions/61274847/how-to-visualize-a-torch-geometric-graph-in-python
import networkx as nx
from torch_geometric.utils.convert import to_networkx
g = to_networkx(data)
nx.draw(g)
pass
if __name__ == '__main__':
test1()
print("Done\a") | 32.052632 | 129 | 0.639737 | 428 | 3,045 | 4.406542 | 0.320093 | 0.025451 | 0.055673 | 0.063627 | 0.256098 | 0.193001 | 0.145281 | 0.098621 | 0.098621 | 0.098621 | 0 | 0.038039 | 0.222989 | 3,045 | 95 | 130 | 32.052632 | 0.759087 | 0.785878 | 0 | 0 | 0 | 0 | 0.024306 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0.0625 | 0.25 | 0 | 0.3125 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
d61e6516f718df3f31a9d34a9f5fa19b3f8b469f | 2,071 | py | Python | spinup/exercises/problem_set_1/exercise1_1.py | wowbob396/spinningup | 717013c2d404666a9c362a045bbebe395e58d8a0 | [
"MIT"
] | null | null | null | spinup/exercises/problem_set_1/exercise1_1.py | wowbob396/spinningup | 717013c2d404666a9c362a045bbebe395e58d8a0 | [
"MIT"
] | null | null | null | spinup/exercises/problem_set_1/exercise1_1.py | wowbob396/spinningup | 717013c2d404666a9c362a045bbebe395e58d8a0 | [
"MIT"
] | null | null | null | import tensorflow as tf
import numpy as np
import math
"""
Exercise 1.1: Diagonal Gaussian Likelihood
Write a function which takes in Tensorflow symbols for the means and
log stds of a batch of diagonal Gaussian distributions, along with a
Tensorflow placeholder for (previously-generated) samples from those
distributions, and returns a Tensorflow symbol for computing the log
likelihoods of those samples.
"""
def calculate_error():
pass
def gaussian_likelihood(x, mu, log_std):
"""
Args:
x: Tensor with shape [batch, dim]
mu: Tensor with shape [batch, dim]
log_std: Tensor with shape [batch, dim] or [dim]
Returns:
Tensor with shape [batch]
"""
#######################
# #
# YOUR CODE HERE #
# #
#######################
    # diagonal Gaussian: per-dimension log densities, summed over the dim axis
    log_likelihood = -(1/2)*((tf.math.pow(x-mu,2)/tf.math.pow(tf.exp(log_std),2))+(2*log_std)+np.log(2 * np.pi))
return tf.reduce_sum(log_likelihood,1)
if __name__ == '__main__':
"""
Run this file to verify your solution.
"""
from spinup.exercises.problem_set_1_solutions import exercise1_1_soln
from spinup.exercises.common import print_result
sess = tf.Session()
dim = 10
x = tf.placeholder(tf.float32, shape=(None, dim))
mu = tf.placeholder(tf.float32, shape=(None, dim))
log_std = tf.placeholder(tf.float32, shape=(dim,))
your_gaussian_likelihood = gaussian_likelihood(x, mu, log_std)
true_gaussian_likelihood = exercise1_1_soln.gaussian_likelihood(x, mu, log_std)
batch_size = 32
feed_dict = {x: np.random.rand(batch_size, dim),
mu: np.random.rand(batch_size, dim),
log_std: np.random.rand(dim)}
your_result, true_result = sess.run([your_gaussian_likelihood, true_gaussian_likelihood],
feed_dict=feed_dict)
correct = np.allclose(your_result, true_result)
print_result(correct) | 29.169014 | 112 | 0.638822 | 278 | 2,071 | 4.57554 | 0.348921 | 0.113208 | 0.04717 | 0.062893 | 0.230346 | 0.154874 | 0.053459 | 0 | 0 | 0 | 0 | 0.016518 | 0.239981 | 2,071 | 71 | 113 | 29.169014 | 0.791614 | 0.103332 | 0 | 0 | 0 | 0 | 0.00602 | 0 | 0 | 0 | 0 | 0.014085 | 0 | 1 | 0.068966 | false | 0.034483 | 0.172414 | 0 | 0.275862 | 0.103448 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
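The same diagonal-Gaussian log-likelihood formula can be sanity-checked without a TensorFlow session. This hedged NumPy sketch (the function name and test values are invented here) mirrors the math used in `gaussian_likelihood` above:

```python
import numpy as np


def np_gaussian_likelihood(x, mu, log_std):
    # per-dimension log density of N(mu, exp(log_std)^2), summed over dims
    per_dim = -0.5 * (((x - mu) / np.exp(log_std)) ** 2
                      + 2 * log_std + np.log(2 * np.pi))
    return per_dim.sum(axis=-1)


x = np.zeros((1, 2))
mu = np.zeros((1, 2))
log_std = np.zeros(2)
ll = np_gaussian_likelihood(x, mu, log_std)
# each standard-normal dimension at its mean contributes -0.5*log(2*pi)
```

For two dimensions at the mean with unit variance, the batch entry equals `-log(2*pi)`.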
d62251a3f732e6d3445b6aaa9d46a183c953d427 | 262 | py | Python | setup.py | yuvipanda/fakeokclient | 379893350457900dd10452c27b774f73a1850ed2 | [
"BSD-3-Clause"
] | null | null | null | setup.py | yuvipanda/fakeokclient | 379893350457900dd10452c27b774f73a1850ed2 | [
"BSD-3-Clause"
] | null | null | null | setup.py | yuvipanda/fakeokclient | 379893350457900dd10452c27b774f73a1850ed2 | [
"BSD-3-Clause"
] | null | null | null | import setuptools
setuptools.setup(
name="fakeokpy",
version='0.1',
url="https://github.com/yuvipanda/fakeokpy",
author="Yuvi Panda",
author_email="yuvipanda@gmail.com",
license="BSD-3-Clause",
packages=setuptools.find_packages(),
)
| 21.833333 | 48 | 0.679389 | 31 | 262 | 5.677419 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013575 | 0.156489 | 262 | 11 | 49 | 23.818182 | 0.782805 | 0 | 0 | 0 | 0 | 0 | 0.339695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6225ce13fba3710da7c5601f7127c0706c16753 | 3,889 | py | Python | Archive/Presentation/Cat Code Presentation.py | JohanWinther/cat-state-encoding | 3fa95c5c9d9d223e4b9fbc38fe5e27a46d0d12ef | [
"MIT"
] | 3 | 2020-02-10T01:53:29.000Z | 2022-01-13T09:23:40.000Z | Archive/Presentation/Cat Code Presentation.py | JohanWinther/cat-state-encoding | 3fa95c5c9d9d223e4b9fbc38fe5e27a46d0d12ef | [
"MIT"
] | null | null | null | Archive/Presentation/Cat Code Presentation.py | JohanWinther/cat-state-encoding | 3fa95c5c9d9d223e4b9fbc38fe5e27a46d0d12ef | [
"MIT"
] | 1 | 2021-07-31T08:55:43.000Z | 2021-07-31T08:55:43.000Z |
# coding: utf-8
# $ \newcommand{\cat}[2][\phantom{i}]{\ket{C^{#2}_{#1\alpha}}} $
# $ \newcommand{\ket}[1]{|#1\rangle} $
# $ \newcommand{\bra}[1]{\langle#1|} $
# $ \newcommand{\braket}[2]{\langle#1|#2\rangle} $
# $\newcommand{\au}{\hat{a}^\dagger}$
# $\newcommand{\ad}{\hat{a}}$
# $\newcommand{\bu}{\hat{b}^\dagger}$
# $\newcommand{\bd}{\hat{b}}$
# # Cat Code Preparation with Optimal Control
# <sup>Johan Winther</sup>
#
# ## Goal
# Obtain a set of pulses which will encode the quantum information of a qubit with "cat codes" (and vice versa).
#
# <sub>N. Ofek, A. Petrenko, R. Heeres, P. Reinhold, Z. Leghtas, B. Vlastakis, Y. Liu, L. Frunzio,
# S. M. Girvin, L. Jiang, M. Mirrahimi, M. H. Devoret & R. J. Schoelkopf, ‘Extending the lifetime of a quantum bit with error correction in superconducting circuits’, Nature; London, vol. 536, no. 7617, pp. 441–445, Aug. 2016.</sub>
# # Outline
# * Why cat codes?
# * Optimal control (GRAPE)
# * Using optimal control to generate cat codes
# * My work so far
# # Why use cat codes for error correction?
# The cat code is comprised of the logical basis:
# 
# <p style="text-align: center;">Notation: $ \ket{0}_L = \cat{\pm},\,\, \ket{1}_L = \cat[i]{\pm} $ </p>
# $ \ket{\psi} = c_0 \ket{C_\alpha^\pm} + c_1 \ket{C_{i\alpha}^\pm} $
# 
# ## Crash course in Optimal control (GRAPE)
# 
# We (usually) optimise for fidelity $\newcommand{\tr}[0]{\operatorname{tr}} f_{PSU} = \tfrac{1}{d} \big| \tr \{X_{targ}^{\dagger} X(T)\} \big| $
# # Optimal control for cat codes
# Jaynes-Cummings (dispersive)
# $$ \hat{H} = \omega_s\au\ad \,+ (\omega_a - \chi_{sa}\au\ad)\bu\bd $$
# $$-\, \frac{K_s}{2}\au{}^2\ad{}^2 \,-\, \frac{K_a}{2}\bu{}^2\bd{}^2 $$
# $$+\, \underbrace{\epsilon_a(t)\bu + \epsilon_a^*(t)\bd}_{\text{Qubit drive}} \,+\, \underbrace{\epsilon_s(t)\au + \epsilon_s^*(t)\ad}_{\text{Res drive}} $$
#
# $$ \bu\bd = \ket{e}\bra{e} = \sigma_-\sigma_+ $$
# 
# * Use optimisation to find the pulse envelope which will realise the unitary $ \hat{U}_t \underbrace{(c_0\ket{g} + c_1\ket{e})}_{\text{ancilla}}\underbrace{\ket{0}}_{\text{res}} = \underbrace{\ket{g}}_{\text{ancilla}} \underbrace{(c_0\cat{+} + c_1\cat[i]{+})}_{\text{res}} $
# * Practically this means we want to optimise for $K$ state transfers at the same time $ F_{oc} = \frac{1}{K^2} | \sum_k^K \braket{\psi_k(T)}{\psi_k^{\text{tar}}} |^2 $ where we encode many points on the Bloch sphere in the cat code basis.
# In[7]:
from numpy import sqrt
π = 3.1415926
ω_r = 8.3056 * 2 * π # resonator frequency
ω_q = 6.2815 * 2 * π # qubit frequency
K_q = -2*π*297e-3 # Kerr qubit 200-300 MHz
K_r = 2*π*4.5e-6 # Kerr res 1-10 Khz
ω_ef = ω_q + K_q
ω_gf = ω_q + K_q/2
χ = 25e-3 * 2 * π # parameter in the dispersive hamiltonian
Δ = abs(ω_r - ω_q) # detuning
g = sqrt(Δ * χ) # coupling strength that is consistent with chi
print(g)
# 
# 
# 
# ### My work so far
# * Use the pulse optimisation tool in `QuTiP` (quantum simulation toolbox in Python), or other framework
# * Project status - more difficult than expected
# * Even for the simple things, e.g. bit flip pulse, there are problems with convergence and numerical errors
# * Custom constraints on the pulses aren't implemented yet (nor general optimization goals) in QuTiP
# * I will try `Krotov`, another python toolbox which uses the Krotov method instead of GRAPE
# * Goal of the thesis is to realise this method and then eventually evaluate possible extensions:
# * Other bosonic codes besides "2 lobe"-cat codes
# * Optimise the coefficients of Fock states (theoretical curiosity)
# ## Thank you for listening! Any questions?
| 40.092784 | 280 | 0.651839 | 631 | 3,889 | 3.944532 | 0.432647 | 0.044998 | 0.050623 | 0.064685 | 0.073122 | 0.031338 | 0.031338 | 0.031338 | 0.031338 | 0.031338 | 0 | 0.028493 | 0.16071 | 3,889 | 96 | 281 | 40.510417 | 0.733762 | 0.894574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.083333 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d6240d57782a997c5e7d887f0b05f55415199e37 | 965 | py | Python | problems/41/problem_41.py | r1cc4rdo/daily_coding_problem | 6ac85309fad2f64231ac7ab94aa4158e18bdec40 | [
"Unlicense"
] | 158 | 2018-01-25T06:33:30.000Z | 2022-03-14T23:18:05.000Z | problems/41/problem_41.py | r1cc4rdo/daily_coding_problem | 6ac85309fad2f64231ac7ab94aa4158e18bdec40 | [
"Unlicense"
] | 9 | 2018-07-04T00:31:57.000Z | 2020-05-16T21:02:30.000Z | problems/41/problem_41.py | r1cc4rdo/daily_coding_problem | 6ac85309fad2f64231ac7ab94aa4158e18bdec40 | [
"Unlicense"
] | 50 | 2018-06-22T16:48:44.000Z | 2022-01-11T16:45:48.000Z | def coding_problem_41(flights_db, starting_airport):
"""
Given an unordered list of flights taken by someone, each represented as (origin, destination) pairs, and a
starting airport, compute the person's itinerary. If no such itinerary exists, return null. If there are multiple
possible itineraries, return the lexicographically smallest one. All flights must be used in the itinerary.
Examples:
>>> coding_problem_41([('SFO', 'HKO'), ('YYZ', 'SFO'), ('YUL', 'YYZ'), ('HKO', 'ORD')], 'YUL')
['YUL', 'YYZ', 'SFO', 'HKO', 'ORD']
>>> coding_problem_41([('SFO', 'COM'), ('COM', 'YYZ')], 'COM') # returns None
>>> coding_problem_41([('A', 'B'), ('A', 'C'), ('B', 'C'), ('C', 'A')], 'A')
['A', 'B', 'C', 'A', 'C']
The itinerary ['A', 'C', 'A', 'B', 'C'] is also a valid however the first one is lexicographically smaller.
"""
    # Backtracking search: try outgoing flights in lexicographic order of
    # destination, so the first complete itinerary found is the smallest.
    def visit(airport, legs):
        if not legs:
            return [airport]
        for index, (origin, destination) in sorted(enumerate(legs), key=lambda e: e[1]):
            if origin != airport:
                continue
            rest = legs[:index] + legs[index + 1:]
            itinerary = visit(destination, rest)
            if itinerary:
                return [airport] + itinerary
        return None

    return visit(starting_airport, flights_db)
if __name__ == '__main__':
import doctest
doctest.testmod(verbose=True)
| 40.208333 | 117 | 0.609326 | 131 | 965 | 4.351145 | 0.557252 | 0.091228 | 0.105263 | 0.063158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010283 | 0.193782 | 965 | 23 | 118 | 41.956522 | 0.722365 | 0.790674 | 0 | 0 | 0 | 0 | 0.053333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
d62ad03a43eda3844d3bfe00ca8827143a6790cf | 288 | py | Python | rgs/__init__.py | slavakargin/RandomGeometricStructures | 02cdc1b4d3128e23b1e78f1f1612b51917894eea | [
"MIT"
] | null | null | null | rgs/__init__.py | slavakargin/RandomGeometricStructures | 02cdc1b4d3128e23b1e78f1f1612b51917894eea | [
"MIT"
] | null | null | null | rgs/__init__.py | slavakargin/RandomGeometricStructures | 02cdc1b4d3128e23b1e78f1f1612b51917894eea | [
"MIT"
] | null | null | null | '''
A package to manipulate and display some random structures,
including meander systems, planar triangulations, and ribbon tilings
Created on May 8, 2021
@author: vladislavkargin
'''
'''
#I prefer blank __init__.py
from . import mndrpy
from . import pmaps
from . import ribbons
''' | 16.941176 | 68 | 0.753472 | 38 | 288 | 5.605263 | 0.868421 | 0.140845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.166667 | 288 | 17 | 69 | 16.941176 | 0.866667 | 0.621528 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d631a41f89b7ce2f5771d40e6b71d43ca80b06fc | 282 | py | Python | api/serializers/RouteDistanceSerializer.py | M4hakala/drf_route_api_example | 894989bf71a6c781c54093a7ac6a8d7a1d951146 | [
"MIT"
] | null | null | null | api/serializers/RouteDistanceSerializer.py | M4hakala/drf_route_api_example | 894989bf71a6c781c54093a7ac6a8d7a1d951146 | [
"MIT"
] | null | null | null | api/serializers/RouteDistanceSerializer.py | M4hakala/drf_route_api_example | 894989bf71a6c781c54093a7ac6a8d7a1d951146 | [
"MIT"
] | null | null | null | from rest_framework import serializers
from api.models import RouteModel
class RouteDistanceSerializer(serializers.ModelSerializer):
km = serializers.FloatField(source='distance', read_only=True)
class Meta:
model = RouteModel
fields = ('route_id', 'km')
| 25.636364 | 66 | 0.737589 | 30 | 282 | 6.833333 | 0.766667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177305 | 282 | 10 | 67 | 28.2 | 0.883621 | 0 | 0 | 0 | 0 | 0 | 0.06383 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d63513361fda0a919145ea884276a37e0a46cdc6 | 4,051 | py | Python | tree/basic.py | Matioz/AlphaZero | d50ca8e57752ada2e5a5f6b817d67b6fe753449d | [
"MIT"
] | 1 | 2020-08-08T14:01:27.000Z | 2020-08-08T14:01:27.000Z | tree/basic.py | Matioz/AlphaZero | d50ca8e57752ada2e5a5f6b817d67b6fe753449d | [
"MIT"
] | 13 | 2018-05-20T10:41:08.000Z | 2022-03-11T23:39:11.000Z | tree/basic.py | piojanu/AlphaZero | 7285559374c331d83320f1ef447fb185880531c6 | [
"MIT"
] | null | null | null | import numpy as np
class Node(object):
"""Represents state in MCTS search tree.
Args:
state (object): The environment state corresponding to this node in the search tree.
Note:
Node object is immutable. Node is left without exit edges (empty dict) when it's terminal.
"""
def __init__(self, state):
self._state = state
self._edges = None
@property
def state(self):
"""object: The environment state corresponding to this node in the search tree."""
return self._state
@property
def edges(self):
"""list of Edges: Mapping from this node's possible actions to corresponding edges."""
return self._edges
def expand(self, edges):
"""Initialize Node object with edges.
Args:
edges (dict of Edges): Mapping from this node's possible actions to corresponding edges.
"""
self._edges = edges
def select_edge(self, c=1.):
"""Choose next action (edge) according to UCB formula.
Args:
c (float): The parameter c >= 0 controls the trade-off between choosing lucrative nodes
(low c) and exploring nodes with low visit counts (high c). (Default: 1)
Returns:
            tuple of (int, Edge): Action chosen with UCB formula and the edge
            which represents that action.
or
None: If it is terminal node and has no exit edges.
"""
assert self.edges is not None, "This node hasn't been expanded yet!"
if len(self.edges) == 0:
return None
state_visits = 0
scores = {}
# Initialize every edge's score to its Q-value and count current state visits
for action, edge in self.edges.items():
state_visits += edge.num_visits
scores[(action, edge)] = edge.qvalue
# Add exploration term to every edge's score
for action, edge in self.edges.items():
scores[(action, edge)] += c * edge.prior * \
np.sqrt(state_visits) / (1 + edge.num_visits)
# Choose next action and edge with highest score
action_edge = max(scores, key=scores.get)
return action_edge
class Edge(object):
"""Represents state-actions pair in MCTS search tree.
Args:
prior (float): Action probability from prior policy. (Default: 1.)
"""
def __init__(self, prior=1.):
self._prior = prior
self._next_node = None
self._reward = 0
self._qvalue = 0
self._num_visits = 0
def expand(self, next_node, reward):
"""Explore this edge.
Args:
next_node (Node): Node that this edge points to.
reward (float): Reward of transition represented by this edge.
"""
self._next_node = next_node
self._reward = reward
def update(self, return_t):
"""Update edge with data from child.
Args:
return_t (float): (Un)discounted return from timestep 't' (this edge).
"""
self._num_visits += 1
# This is formula for iteratively calculating average
# NOTE: You can check that first arbitrary value will be forgotten after fist update
self._qvalue += (return_t - self._qvalue) / self.num_visits
@property
def next_node(self):
"""next_node (Node): Node that this edge points to."""
return self._next_node
@property
def reward(self):
"""float: Reward of transition represented by this edge."""
return self._reward
@property
def qvalue(self):
"""float: Quality value of this edge state-action pair."""
return self._qvalue
@property
def prior(self):
"""float: Action probability from prior policy."""
return self._prior
@property
def num_visits(self):
"""int: Number of times this state-action pair was visited."""
return self._num_visits
| 28.935714 | 100 | 0.607011 | 515 | 4,051 | 4.673786 | 0.291262 | 0.029913 | 0.024927 | 0.013295 | 0.263398 | 0.225177 | 0.194433 | 0.170337 | 0.133776 | 0.103864 | 0 | 0.004274 | 0.306838 | 4,051 | 139 | 101 | 29.143885 | 0.85292 | 0.470254 | 0 | 0.160714 | 0 | 0 | 0.018558 | 0 | 0 | 0 | 0 | 0 | 0.017857 | 1 | 0.232143 | false | 0 | 0.035714 | 0 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
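As a purely numeric, standalone sketch of the score `select_edge` computes above (the visit counts, Q-values and priors below are made up for illustration):

```python
import numpy as np

# action -> (num_visits, qvalue, prior), mirroring Edge's fields
edges = {0: (10, 0.5, 0.6), 1: (2, 0.8, 0.4)}
c = 1.0  # exploration constant, as in select_edge(c=1.)

state_visits = sum(n for n, _, _ in edges.values())
scores = {a: q + c * p * np.sqrt(state_visits) / (1 + n)
          for a, (n, q, p) in edges.items()}
best = max(scores, key=scores.get)  # action 1 wins: higher Q and fewer visits
```

The exploration term shrinks as an edge's visit count grows, pushing the search toward less-explored actions early on.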
d648b943b6e2a77549b2f9c033aba864d03bf880 | 506 | py | Python | integrations-and-supported-tools/fastai/scripts/Neptune_fastai.py | neptune-ai/examples | e64cfaadb028e2187063fc43768dfee44074729b | [
"MIT"
] | 15 | 2021-06-11T16:35:15.000Z | 2022-03-29T15:53:59.000Z | integrations-and-supported-tools/fastai/scripts/Neptune_fastai.py | neptune-ai/examples | e64cfaadb028e2187063fc43768dfee44074729b | [
"MIT"
] | 12 | 2021-04-26T13:07:50.000Z | 2021-11-15T10:50:03.000Z | integrations-and-supported-tools/fastai/scripts/Neptune_fastai.py | neptune-ai/examples | e64cfaadb028e2187063fc43768dfee44074729b | [
"MIT"
] | 10 | 2021-05-07T16:28:18.000Z | 2022-02-28T21:47:11.000Z | import fastai
from neptune.new.integrations.fastai import NeptuneCallback
from fastai.vision.all import *
import neptune.new as neptune
run = neptune.init(
project="common/fastai-integration", api_token="ANONYMOUS", tags="basic"
)
path = untar_data(URLs.MNIST_TINY)
dls = ImageDataLoaders.from_csv(path)
# Log all training phases of the learner
learn = cnn_learner(dls, resnet18, cbs=[NeptuneCallback(run=run, base_namespace="experiment")])
learn.fit_one_cycle(2)
learn.fit_one_cycle(1)
run.stop()

# setup.py from Ademan/psycopg2 (OpenSSL license)

# setup.py - distutils packaging
#
# Copyright (C) 2003-2010 Federico Di Gregorio <fog@debian.org>
#
# psycopg2 is free software: you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# psycopg2 is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public
# License for more details.
"""Python-PostgreSQL Database Adapter
psycopg is a PostgreSQL database adapter for the Python programming
language. This is version 2, a complete rewrite of the original code to
provide new-style classes for connection and cursor objects and other sweet
candies. Like the original, psycopg 2 was written with the aim of being
very small and fast, and stable as a rock.
psycopg is different from the other database adapters because it was
designed for heavily multi-threaded applications that create and destroy
lots of cursors and make a conspicuous number of concurrent INSERTs or
UPDATEs. psycopg 2 also provides full asynchronous operations for the really
brave programmer.
"""
classifiers = """\
Development Status :: 5 - Production/Stable
Intended Audience :: Developers
License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
License :: OSI Approved :: Zope Public License
Programming Language :: Python
Programming Language :: C
Programming Language :: SQL
Topic :: Database
Topic :: Database :: Front-Ends
Topic :: Software Development
Topic :: Software Development :: Libraries :: Python Modules
Operating System :: Microsoft :: Windows
Operating System :: Unix
"""
import os
import os.path
import sys
import re
import subprocess
import ConfigParser
from distutils.core import setup, Extension
from distutils.errors import DistutilsFileError
from distutils.command.build_ext import build_ext
from distutils.sysconfig import get_python_inc
from distutils.ccompiler import get_default_compiler
# Take a look at http://www.python.org/dev/peps/pep-0386/
# for a consistent versioning pattern.
PSYCOPG_VERSION = '2.3.0-beta2'
version_flags = ['dt', 'dec']
PLATFORM_IS_WINDOWS = sys.platform.lower().startswith('win')
def get_pg_config(kind, pg_config="pg_config"):
try:
p = subprocess.Popen([pg_config, "--" + kind],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
except OSError:
raise Warning("Unable to find 'pg_config' file")
p.stdin.close()
r = p.stdout.readline().strip()
if not r:
raise Warning(p.stderr.readline())
return r
class psycopg_build_ext(build_ext):
"""Conditionally complement the setup.cfg options file.
    This class configures the include_dirs, library_dirs, libraries
options as required by the system. Most of the configuration happens
in finalize_options() method.
If you want to set up the build step for a peculiar platform, add a
method finalize_PLAT(), where PLAT matches your sys.platform.
"""
user_options = build_ext.user_options[:]
user_options.extend([
('use-pydatetime', None,
         "Use Python datetime objects for date and time representation."),
('pg-config=', None,
"The name of the pg_config binary and/or full path to find it"),
('have-ssl', None,
"Compile with OpenSSL built PostgreSQL libraries (Windows only)."),
('static-libpq', None,
"Statically link the PostgreSQL client library"),
])
boolean_options = build_ext.boolean_options[:]
boolean_options.extend(('use-pydatetime', 'have-ssl', 'static-libpq'))
DEFAULT_PG_CONFIG = "pg_config"
def initialize_options(self):
build_ext.initialize_options(self)
self.use_pg_dll = 1
self.pgdir = None
self.mx_include_dir = None
self.use_pydatetime = 1
self.have_ssl = have_ssl
self.pg_config = self.autodetect_pg_config_path()
def get_compiler(self):
"""Return the name of the C compiler used to compile extensions.
If a compiler was not explicitly set (on the command line, for
example), fall back on the default compiler.
"""
if self.compiler:
# distutils doesn't keep the type of self.compiler uniform; we
# compensate:
if isinstance(self.compiler, str):
name = self.compiler
else:
name = self.compiler.compiler_type
else:
name = get_default_compiler()
return name
def get_pg_config(self, kind):
return get_pg_config(kind, self.pg_config)
def finalize_win32(self):
"""Finalize build system configuration on win32 platform."""
import struct
sysVer = sys.version_info[:2]
# Add compiler-specific arguments:
extra_compiler_args = []
compiler_name = self.get_compiler().lower()
compiler_is_msvc = compiler_name.startswith('msvc')
compiler_is_mingw = compiler_name.startswith('mingw')
if compiler_is_msvc:
# If we're using MSVC 7.1 or later on a 32-bit platform, add the
# /Wp64 option to generate warnings about Win64 portability
# problems.
if sysVer >= (2,4) and struct.calcsize('P') == 4:
extra_compiler_args.append('/Wp64')
elif compiler_is_mingw:
# Default MinGW compilation of Python extensions on Windows uses
# only -O:
extra_compiler_args.append('-O3')
# GCC-compiled Python on non-Windows platforms is built with strict
# aliasing disabled, but that must be done explicitly on Windows to
# avoid large numbers of warnings for perfectly idiomatic Python C
# API code.
extra_compiler_args.append('-fno-strict-aliasing')
# Force correct C runtime library linkage:
if sysVer <= (2,3):
# Yes: 'msvcr60', rather than 'msvcrt', is the correct value
# on the line below:
self.libraries.append('msvcr60')
elif sysVer in ((2,4), (2,5)):
self.libraries.append('msvcr71')
# Beyond Python 2.5, we take our chances on the default C runtime
# library, because we don't know what compiler those future
# versions of Python will use.
for exten in ext: # ext is a global list of Extension objects
exten.extra_compile_args.extend(extra_compiler_args)
# End of add-compiler-specific arguments section.
self.libraries.append("ws2_32")
self.libraries.append("advapi32")
if compiler_is_msvc:
# MSVC requires an explicit "libpq"
self.libraries.remove("pq")
self.libraries.append("secur32")
self.libraries.append("libpq")
self.libraries.append("shfolder")
for path in self.library_dirs:
if os.path.isfile(os.path.join(path, "ms", "libpq.lib")):
self.library_dirs.append(os.path.join(path, "ms"))
break
if self.have_ssl:
self.libraries.append("libeay32")
self.libraries.append("ssleay32")
self.libraries.append("crypt32")
self.libraries.append("user32")
self.libraries.append("gdi32")
def finalize_darwin(self):
"""Finalize build system configuration on darwin platform."""
self.libraries.append('ssl')
self.libraries.append('crypto')
def finalize_linux2(self):
"""Finalize build system configuration on GNU/Linux platform."""
# tell piro that GCC is fine and dandy, but not so MS compilers
for ext in self.extensions:
ext.extra_compile_args.append('-Wdeclaration-after-statement')
def finalize_options(self):
"""Complete the build system configuation."""
build_ext.finalize_options(self)
self.include_dirs.append(".")
if static_libpq:
if not self.link_objects: self.link_objects = []
self.link_objects.append(
os.path.join(self.get_pg_config("libdir"), "libpq.a"))
else:
self.libraries.append("pq")
try:
self.library_dirs.append(self.get_pg_config("libdir"))
self.include_dirs.append(self.get_pg_config("includedir"))
self.include_dirs.append(self.get_pg_config("includedir-server"))
try:
# Here we take a conservative approach: we suppose that
# *at least* PostgreSQL 7.4 is available (this is the only
# 7.x series supported by psycopg 2)
pgversion = self.get_pg_config("version").split()[1]
except:
pgversion = "7.4.0"
verre = re.compile(r"(\d+)\.(\d+)(?:(?:\.(\d+))|(devel|(alpha|beta|rc)\d+))")
m = verre.match(pgversion)
if m:
pgmajor, pgminor, pgpatch = m.group(1, 2, 3)
if pgpatch is None or not pgpatch.isdigit():
pgpatch = 0
else:
sys.stderr.write(
"Error: could not determine PostgreSQL version from '%s'"
% pgversion)
sys.exit(1)
define_macros.append(("PG_VERSION_HEX", "0x%02X%02X%02X" %
(int(pgmajor), int(pgminor), int(pgpatch))))
except Warning, w:
if self.pg_config == self.DEFAULT_PG_CONFIG:
sys.stderr.write("Warning: %s" % str(w))
else:
sys.stderr.write("Error: %s" % str(w))
sys.exit(1)
if hasattr(self, "finalize_" + sys.platform):
getattr(self, "finalize_" + sys.platform)()
def autodetect_pg_config_path(self):
res = None
if PLATFORM_IS_WINDOWS:
res = self.autodetect_pg_config_path_windows()
return res or self.DEFAULT_PG_CONFIG
def autodetect_pg_config_path_windows(self):
# Find the first PostgreSQL installation listed in the registry and
# return the full path to its pg_config utility.
#
# This autodetection is performed *only* if the following conditions
# hold:
#
# 1) The pg_config utility is not already available on the PATH:
if os.popen('pg_config').close() is None: # .close()->None == success
return None
# 2) The user has not specified any of the following settings in
# setup.cfg:
# - pg_config
# - include_dirs
# - library_dirs
for settingName in ('pg_config', 'include_dirs', 'library_dirs'):
try:
val = parser.get('build_ext', settingName)
except ConfigParser.NoOptionError:
pass
else:
if val.strip() != '':
return None
# end of guard conditions
import _winreg
pg_inst_base_dir = None
pg_config_path = None
reg = _winreg.ConnectRegistry(None, _winreg.HKEY_LOCAL_MACHINE)
try:
pg_inst_list_key = _winreg.OpenKey(reg,
'SOFTWARE\\PostgreSQL\\Installations'
)
except EnvironmentError:
pg_inst_list_key = None
if pg_inst_list_key is not None:
try:
# Determine the name of the first subkey, if any:
try:
first_sub_key_name = _winreg.EnumKey(pg_inst_list_key, 0)
except EnvironmentError:
first_sub_key_name = None
if first_sub_key_name is not None:
pg_first_inst_key = _winreg.OpenKey(reg,
'SOFTWARE\\PostgreSQL\\Installations\\'
+ first_sub_key_name
)
try:
pg_inst_base_dir = _winreg.QueryValueEx(
pg_first_inst_key, 'Base Directory'
)[0]
finally:
_winreg.CloseKey(pg_first_inst_key)
finally:
_winreg.CloseKey(pg_inst_list_key)
if pg_inst_base_dir and os.path.exists(pg_inst_base_dir):
pg_config_path = os.path.join(pg_inst_base_dir, 'bin',
'pg_config.exe'
)
# Support unicode paths, if this version of Python provides the
# necessary infrastructure:
if hasattr(sys, 'getfilesystemencoding'):
pg_config_path = pg_config_path.encode(
sys.getfilesystemencoding()
)
return pg_config_path
# let's start with macro definitions (the ones not already in setup.cfg)
define_macros = []
include_dirs = []
# gather information to build the extension module
ext = []
data_files = []
# sources
sources = [
'psycopgmodule.c', 'pqpath.c', 'typecast.c',
'microprotocols.c', 'microprotocols_proto.c',
'connection_type.c', 'connection_int.c', 'cursor_type.c', 'cursor_int.c',
'lobject_type.c', 'lobject_int.c', 'notify_type.c', 'xid_type.c',
'adapter_qstring.c', 'adapter_pboolean.c', 'adapter_binary.c',
'adapter_asis.c', 'adapter_list.c', 'adapter_datetime.c',
'adapter_pfloat.c', 'adapter_pdecimal.c',
'green.c', 'utils.c']
parser = ConfigParser.ConfigParser()
parser.read('setup.cfg')
# Choose a datetime module
have_pydatetime = True
have_mxdatetime = False
use_pydatetime = int(parser.get('build_ext', 'use_pydatetime'))
# check for mx package
if parser.has_option('build_ext', 'mx_include_dir'):
mxincludedir = parser.get('build_ext', 'mx_include_dir')
else:
mxincludedir = os.path.join(get_python_inc(plat_specific=1), "mx")
if os.path.exists(mxincludedir):
include_dirs.append(mxincludedir)
define_macros.append(('HAVE_MXDATETIME','1'))
sources.append('adapter_mxdatetime.c')
have_mxdatetime = True
version_flags.append('mx')
# now decide which package will be the default for date/time typecasts
if have_pydatetime and (use_pydatetime or not have_mxdatetime):
define_macros.append(('PSYCOPG_DEFAULT_PYDATETIME','1'))
elif have_mxdatetime:
define_macros.append(('PSYCOPG_DEFAULT_MXDATETIME','1'))
else:
def e(msg):
sys.stderr.write("error: " + msg + "\n")
e("psycopg requires a datetime module:")
e(" mx.DateTime module not found")
e(" python datetime module not found")
e("Note that psycopg needs the module headers and not just the module")
e("itself. If you installed Python or mx.DateTime from a binary package")
e("you probably need to install its companion -dev or -devel package.")
sys.exit(1)
# generate a nice version string to avoid confusion when users report bugs
for have in parser.get('build_ext', 'define').split(','):
if have == 'PSYCOPG_EXTENSIONS':
version_flags.append('ext')
elif have == 'HAVE_PQPROTOCOL3':
version_flags.append('pq3')
if version_flags:
PSYCOPG_VERSION_EX = PSYCOPG_VERSION + " (%s)" % ' '.join(version_flags)
else:
PSYCOPG_VERSION_EX = PSYCOPG_VERSION
if not PLATFORM_IS_WINDOWS:
define_macros.append(('PSYCOPG_VERSION', '"'+PSYCOPG_VERSION_EX+'"'))
else:
define_macros.append(('PSYCOPG_VERSION', '\\"'+PSYCOPG_VERSION_EX+'\\"'))
if parser.has_option('build_ext', 'have_ssl'):
have_ssl = int(parser.get('build_ext', 'have_ssl'))
else:
have_ssl = 0
if parser.has_option('build_ext', 'static_libpq'):
static_libpq = int(parser.get('build_ext', 'static_libpq'))
else:
static_libpq = 0
# build the extension
sources = map(lambda x: os.path.join('psycopg', x), sources)
ext.append(Extension("psycopg2._psycopg", sources,
define_macros=define_macros,
include_dirs=include_dirs,
undef_macros=[]))
setup(name="psycopg2",
version=PSYCOPG_VERSION,
maintainer="Federico Di Gregorio",
maintainer_email="fog@initd.org",
author="Federico Di Gregorio",
author_email="fog@initd.org",
url="http://initd.org/tracker/psycopg",
download_url = "http://initd.org/pub/software/psycopg2",
license="GPL with exceptions or ZPL",
platforms = ["any"],
description=__doc__.split("\n")[0],
long_description="\n".join(__doc__.split("\n")[2:]),
classifiers=filter(None, classifiers.split("\n")),
data_files=data_files,
package_dir={'psycopg2':'lib'},
packages=['psycopg2'],
cmdclass={ 'build_ext': psycopg_build_ext },
ext_modules=ext)
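The version handling inside `finalize_options` above can be exercised on its own. This sketch reuses the same `verre` regex and the `PG_VERSION_HEX` encoding formula on hypothetical version strings; it is a standalone illustration, not part of the build:

```python
import re

# Same pattern as setup.py's `verre`: matches "X.Y.Z" releases as well as
# "X.Ydevel" / "X.Ybeta1"-style prerelease strings.
verre = re.compile(r"(\d+)\.(\d+)(?:(?:\.(\d+))|(devel|(alpha|beta|rc)\d+))")

def pg_version_hex(pgversion):
    m = verre.match(pgversion)
    if m is None:
        raise ValueError("could not determine PostgreSQL version from %r" % pgversion)
    pgmajor, pgminor, pgpatch = m.group(1, 2, 3)
    if pgpatch is None or not pgpatch.isdigit():
        pgpatch = 0
    # Same encoding as the PG_VERSION_HEX macro definition in setup.py:
    return "0x%02X%02X%02X" % (int(pgmajor), int(pgminor), int(pgpatch))

print(pg_version_hex("8.4.2"))     # 0x080402
print(pg_version_hex("9.1beta1"))  # patch part missing -> 0x090100
```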