hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
604c11d1662643b5e9e977b3126e196c0ca94747 | 1,944 | py | Python | edit/editpublisher.py | lokal-profil/isfdb_site | 0ce20d6347849926d4eda961ea9249c31519eea5 | [
"BSD-3-Clause"
] | null | null | null | edit/editpublisher.py | lokal-profil/isfdb_site | 0ce20d6347849926d4eda961ea9249c31519eea5 | [
"BSD-3-Clause"
] | null | null | null | edit/editpublisher.py | lokal-profil/isfdb_site | 0ce20d6347849926d4eda961ea9249c31519eea5 | [
"BSD-3-Clause"
] | null | null | null | #!_PYTHONLOC
#
# (C) COPYRIGHT 2004-2021 Al von Ruff and Ahasuerus
# ALL RIGHTS RESERVED
#
# The copyright notice above does not evidence any actual or
# intended publication of such source code.
#
# Version: $Revision$
# Date: $Date$
from isfdblib import *
from isfdblib_help import *
from isfdblib_print import *
from isfdb import *
from SQLparsing import *
from login import User
if __name__ == '__main__':
publisherID = SESSION.Parameter(0, 'int')
record = SQLGetPublisher(publisherID)
if not record:
SESSION.DisplayError('Record Does Not Exist')
PrintPreSearch('Publisher Editor')
PrintNavBar('edit/editpublisher.cgi', publisherID)
help = HelpPublisher()
printHelpBox('publisher', 'EditPublisher')
print '<form id="data" METHOD="POST" ACTION="/cgi-bin/edit/submitpublisher.cgi">'
print '<table border="0">'
print '<tbody id="tagBody">'
# Limit the ability to edit publisher names to moderators
user = User()
user.load()
display_only = 1
if SQLisUserModerator(user.id):
display_only = 0
printfield("Publisher Name", "publisher_name", help, record[PUBLISHER_NAME], display_only)
trans_publisher_names = SQLloadTransPublisherNames(record[PUBLISHER_ID])
printmultiple(trans_publisher_names, "Transliterated Name", "trans_publisher_names", help)
webpages = SQLloadPublisherWebpages(record[PUBLISHER_ID])
printWebPages(webpages, 'publisher', help)
printtextarea('Note', 'publisher_note', help, SQLgetNotes(record[PUBLISHER_NOTE]))
printtextarea('Note to Moderator', 'mod_note', help, '')
print '</tbody>'
print '</table>'
print '<p>'
print '<input NAME="publisher_id" VALUE="%d" TYPE="HIDDEN">' % publisherID
print '<input TYPE="SUBMIT" VALUE="Submit Data" tabindex="1">'
print '</form>'
print '<p>'
PrintPostSearch(0, 0, 0, 0, 0, 0)
| 28.173913 | 98 | 0.677469 | 219 | 1,944 | 5.885845 | 0.479452 | 0.03879 | 0.00931 | 0.00931 | 0.004655 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012266 | 0.203189 | 1,944 | 68 | 99 | 28.588235 | 0.819884 | 0.154835 | 0 | 0.052632 | 0 | 0.026316 | 0.280809 | 0.052728 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.157895 | null | null | 0.447368 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
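The editpublisher.py script above gates editing of the publisher name on moderator status via a `display_only` flag passed to `printfield`. A minimal standalone sketch of that pattern (Python 3; the function name and HTML shapes here are illustrative, not ISFDB's actual rendering API):

```python
def render_field(name, value, is_moderator):
    """Render an editable input for moderators, read-only text otherwise.

    Mirrors the display_only logic in editpublisher.py; the markup below
    is a hypothetical simplification, not ISFDB's real output.
    """
    if is_moderator:
        return '<input name="%s" value="%s">' % (name, value)
    return '<span>%s</span>' % value

# Moderators get an editable field; everyone else sees plain text.
print(render_field("publisher_name", "Tor", True))
print(render_field("publisher_name", "Tor", False))
```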
604f0eeff04eca0db1f9e0f762b1e72dacff74c1 | 2,907 | py | Python | audioanalysis_demo/test_audio_analysis.py | tiaotiao/applets | c583a4405ed18c7d74bfba49884525c43d114398 | [
"MIT"
] | null | null | null | audioanalysis_demo/test_audio_analysis.py | tiaotiao/applets | c583a4405ed18c7d74bfba49884525c43d114398 | [
"MIT"
] | null | null | null | audioanalysis_demo/test_audio_analysis.py | tiaotiao/applets | c583a4405ed18c7d74bfba49884525c43d114398 | [
"MIT"
] | null | null | null |
import sys, wave
import AudioAnalysis
FILE_NAME = "snippet.wav"
def testWavWrite():
try:
f = wave.open(FILE_NAME, "rb")
except Exception, e:
print e
print "File type is not wav!"
return
c = wave.open("cnv_" + FILE_NAME, "wb")
print f.getnchannels()
print f.getsampwidth()
print f.getframerate()
print f.getnframes()
#print f.getparams()
total = f.getnframes()
read_count = total / 2
c.setnchannels(f.getnchannels())
c.setsampwidth(f.getsampwidth())
c.setframerate(f.getframerate())
c.setnframes(read_count)
c.setcomptype(f.getcomptype(), f.getcompname())
frames = f.readframes(read_count)
print len(frames)
print "bytes per frame: ", len(frames) / read_count
#for b in frames:
# i = int(b.encode("hex"), 16)
# print b.encode("hex")
#print '#' * (i / 10)
c.writeframes(frames)
print "----------"
f.close()
c.close()
def process(p):
print p
def testAudioAnalysis():
a = AudioAnalysis.AudioAnalysis(FILE_NAME)
print a.getFilename()
print a.getFileType()
a.setFrameInterval(0.01)
print a.analysePower(process)
print a.getPowerMin(), "\tgetPowerMin"
print a.getPowerMax(), "\tgetPowerMax"
print a.getSamplePowerMin(), "\tgetSamplePowerMin"
print a.getSamplePowerMax(), "\tgetSamplePowerMax"
print a.getFrameRate(), "\tgetFrameRate"
print a.getSampleWidth(), "\tgetSampleWidth"
print a.getDuration(), "\tgetDuration"
print a.getFrameInterval(), "\tgetFrameInterval"
print a.getSamples(), "\tgetSamples"
powers = a.getFramePower()
for p in powers:
print "%04lf" % p[0], "%-6d" % p[1] ,'#' * (p[1] / 100)
def main():
    try:
        f = open(FILE_NAME, "rb")
    except IOError, e:
        # open() raises on failure rather than returning a falsy value,
        # so the previous "if not f" check was dead code.
        print e
        print "Open file failed!"
        return
try:
w = wave.open(f)
except Exception, e:
print e
print "File type is not wav!"
return
print "get channels\t", w.getnchannels() # channels, single or double
print "frame rate\t", w.getframerate() # rate, frames per sec
print "samp width\t", w.getsampwidth() # maybe: channels * width = bytes per frame
print "get n frames\t", w.getnframes() # total frames
print "comp type\t", w.getcomptype() # compress
print "params\t", w.getparams()
total = w.getnframes()
read_count = 100
frames = w.readframes(read_count)
print "len(frames)\t", len(frames)
print "bytes per frame\t", len(frames) / read_count
#for b in frames:
#i = int(b.encode("hex"), 16)
#print b.encode("hex")
#print '#' * (i / 10)
print "----------"
w.close()
f.close()
if __name__ == "__main__":
main()
#testAudioAnalysis()
#testWavWrite()
| 23.827869 | 88 | 0.579635 | 340 | 2,907 | 4.894118 | 0.323529 | 0.043269 | 0.024038 | 0.016827 | 0.208534 | 0.208534 | 0.141827 | 0.141827 | 0.141827 | 0.141827 | 0 | 0.011429 | 0.277606 | 2,907 | 122 | 89 | 23.827869 | 0.780952 | 0.117647 | 0 | 0.197368 | 0 | 0 | 0.146504 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.026316 | null | null | 0.460526 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
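test_audio_analysis.py derives bytes per frame empirically by dividing the byte length of `readframes()` by the frame count, and its comment guesses "maybe: channels * width = bytes per frame". That relationship is in fact guaranteed by the `wave` module, and can be checked directly with a throwaway file (Python 3 sketch):

```python
import os
import tempfile
import wave

# Write a tiny 2-channel, 16-bit WAV, then confirm the bytes-per-frame
# relationship that test_audio_analysis.py infers from readframes().
path = os.path.join(tempfile.mkdtemp(), "tone.wav")
with wave.open(path, "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)                     # 16-bit samples
    w.setframerate(44100)
    w.writeframes(b"\x00\x00" * 2 * 100)  # 100 silent stereo frames

with wave.open(path, "rb") as r:
    frames = r.readframes(r.getnframes())
    bytes_per_frame = len(frames) // r.getnframes()
    # channels * sample width == bytes per frame (2 * 2 == 4 here)
    assert bytes_per_frame == r.getnchannels() * r.getsampwidth()
```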
6053b76dec55ceda546ea38cd4b295199bfedd36 | 382 | py | Python | openslides_backend/action/topic/delete.py | reiterl/openslides-backend | d36667f00087ae8baf25853d4cef18a5e6dc7b3b | [
"MIT"
] | null | null | null | openslides_backend/action/topic/delete.py | reiterl/openslides-backend | d36667f00087ae8baf25853d4cef18a5e6dc7b3b | [
"MIT"
] | null | null | null | openslides_backend/action/topic/delete.py | reiterl/openslides-backend | d36667f00087ae8baf25853d4cef18a5e6dc7b3b | [
"MIT"
] | null | null | null | from ...models.models import Topic
from ..default_schema import DefaultSchema
from ..generics import DeleteAction
from ..register import register_action
@register_action("topic.delete")
class TopicDelete(DeleteAction):
"""
Action to delete simple topics that can be shown in the agenda.
"""
model = Topic()
schema = DefaultSchema(Topic()).get_delete_schema()
| 25.466667 | 67 | 0.740838 | 46 | 382 | 6.043478 | 0.565217 | 0.100719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162304 | 382 | 14 | 68 | 27.285714 | 0.86875 | 0.164921 | 0 | 0 | 0 | 0 | 0.039604 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
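The `@register_action("topic.delete")` decorator above follows a standard registry pattern: a parameterized class decorator that files the class under a string key. A minimal sketch of how such a decorator can work (an illustration only, not OpenSlides' actual implementation):

```python
ACTION_REGISTRY = {}

def register_action(name):
    """Return a class decorator that registers the class under `name`."""
    def wrapper(cls):
        ACTION_REGISTRY[name] = cls
        return cls  # class is returned unchanged, only recorded
    return wrapper

@register_action("topic.delete")
class TopicDelete:
    pass

# Lookup by action name recovers the class object itself.
assert ACTION_REGISTRY["topic.delete"] is TopicDelete
```

Because the decorator returns `cls` untouched, registration is a pure side effect: importing the module is enough to make the action discoverable by name.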
605951901688fbda8e99d2e5f2796e9b32eff1fe | 18,195 | py | Python | exchange_calendars/extensions/exchange_calendar_krx.py | syonoki/exchange_calendars | 639ab0f88a874af99bb601824a8ffef2572820d4 | [
"Apache-2.0"
] | null | null | null | exchange_calendars/extensions/exchange_calendar_krx.py | syonoki/exchange_calendars | 639ab0f88a874af99bb601824a8ffef2572820d4 | [
"Apache-2.0"
] | null | null | null | exchange_calendars/extensions/exchange_calendar_krx.py | syonoki/exchange_calendars | 639ab0f88a874af99bb601824a8ffef2572820d4 | [
"Apache-2.0"
] | null | null | null | """
Last update: 2018-10-26
"""
from exchange_calendars.extensions.calendar_extension import ExtendedExchangeCalendar
from pandas import (
Timestamp,
)
from pandas.tseries.holiday import (
Holiday,
previous_friday,
)
from exchange_calendars.exchange_calendar import HolidayCalendar
from datetime import time
from itertools import chain
from pytz import timezone
KRNewYearsDay = Holiday(
'New Years Day',
month=1,
day=1)
KRIndependenceDay = Holiday(
'Independence Day',
month=3,
day=1
)
KRArbourDay = Holiday(
'Arbour Day',
month=4,
day=5,
end_date=Timestamp('2006-01-01'),
)
KRLabourDay = Holiday(
'Labour Day',
month=5,
day=1
)
KRChildrensDay = Holiday(
    'Childrens Day',
    month=5,
    day=5
)
# 현충일
KRMemorialDay = Holiday(
'Memorial Day',
month=6,
day=6
)
# 제헌절
KRConstitutionDay = Holiday(
'Constitution Day',
month=7,
day=17,
end_date=Timestamp('2008-01-01')
)
# 광복절
KRLiberationDay = Holiday(
'Liberation Day',
month=8,
day=15
)
# 개천절
KRNationalFoundationDay = Holiday(
'NationalFoundationDay',
month=10,
day=3
)
Christmas = Holiday(
'Christmas',
month=12,
day=25
)
# 한글날
KRHangulProclamationDay = Holiday(
'Hangul Proclamation Day',
month=10,
day=9,
start_date=Timestamp('2013-01-01')
)
# KRX 연말 휴장
KRXEndOfYearClosing = Holiday(
'KRX Year-end Closing',
month=12,
day=31,
observance=previous_friday,
start_date=Timestamp('2001-01-01')
)
KRXEndOfYearClosing2000 = [
Timestamp('2000-12-27', tz='UTC'),
Timestamp('2000-12-28', tz='UTC'),
Timestamp('2000-12-29', tz='UTC'),
Timestamp('2000-12-30', tz='UTC'),
]
# Lunar New Year
KRLunarNewYear = [
# 2000
Timestamp('2000-02-04', tz='UTC'),
# 2001
Timestamp('2001-01-23', tz='UTC'),
Timestamp('2001-01-24', tz='UTC'),
Timestamp('2001-01-25', tz='UTC'),
# 2002
Timestamp('2002-02-11', tz='UTC'),
Timestamp('2002-02-12', tz='UTC'),
Timestamp('2002-02-13', tz='UTC'),
# 2003
Timestamp('2003-01-31', tz='UTC'),
# 2004
Timestamp('2004-01-21', tz='UTC'),
Timestamp('2004-01-22', tz='UTC'),
Timestamp('2004-01-23', tz='UTC'),
# 2005
Timestamp('2005-02-08', tz='UTC'),
Timestamp('2005-02-09', tz='UTC'),
Timestamp('2005-02-10', tz='UTC'),
# 2006
Timestamp('2006-01-28', tz='UTC'),
Timestamp('2006-01-29', tz='UTC'),
Timestamp('2006-01-30', tz='UTC'),
# 2007
Timestamp('2007-02-19', tz='UTC'),
# 2008
Timestamp('2008-02-06', tz='UTC'),
Timestamp('2008-02-07', tz='UTC'),
Timestamp('2008-02-08', tz='UTC'),
# 2009
Timestamp('2009-01-25', tz='UTC'),
Timestamp('2009-01-26', tz='UTC'),
Timestamp('2009-01-27', tz='UTC'),
# 2010
Timestamp('2010-02-13', tz='UTC'),
Timestamp('2010-02-14', tz='UTC'),
Timestamp('2010-02-15', tz='UTC'),
# 2011
Timestamp('2011-02-02', tz='UTC'),
Timestamp('2011-02-03', tz='UTC'),
Timestamp('2011-02-04', tz='UTC'),
# 2012
Timestamp('2012-01-23', tz='UTC'),
Timestamp('2012-01-24', tz='UTC'),
# 2013
Timestamp('2013-02-11', tz='UTC'),
# 2014
Timestamp('2014-01-30', tz='UTC'),
Timestamp('2014-01-31', tz='UTC'),
# 2015
Timestamp('2015-02-18', tz='UTC'),
Timestamp('2015-02-19', tz='UTC'),
Timestamp('2015-02-20', tz='UTC'),
# 2016
Timestamp('2016-02-07', tz='UTC'),
Timestamp('2016-02-08', tz='UTC'),
Timestamp('2016-02-09', tz='UTC'),
Timestamp('2016-02-10', tz='UTC'),
# 2017
Timestamp('2017-01-27', tz='UTC'),
Timestamp('2017-01-28', tz='UTC'),
Timestamp('2017-01-29', tz='UTC'),
Timestamp('2017-01-30', tz='UTC'),
# 2018
Timestamp('2018-02-15', tz='UTC'),
Timestamp('2018-02-16', tz='UTC'),
Timestamp('2018-02-17', tz='UTC'),
# 2019
Timestamp('2019-02-04', tz='UTC'),
Timestamp('2019-02-05', tz='UTC'),
Timestamp('2019-02-06', tz='UTC'),
# 2020
Timestamp('2020-01-24', tz='UTC'),
Timestamp('2020-01-25', tz='UTC'),
Timestamp('2020-01-26', tz='UTC'),
Timestamp('2020-01-27', tz='UTC'),
# 2021
Timestamp('2021-02-11', tz='UTC'),
Timestamp('2021-02-12', tz='UTC'),
# 2022
Timestamp('2022-01-31', tz='UTC'),
Timestamp('2022-02-01', tz='UTC'),
Timestamp('2022-02-02', tz='UTC'),
]
# Election Days
KRElectionDays = [
Timestamp('2000-04-13', tz='UTC'), # National Assembly
Timestamp('2002-06-13', tz='UTC'), # Regional election
Timestamp('2002-12-19', tz='UTC'), # Presidency
Timestamp('2004-04-15', tz='UTC'), # National Assembly
Timestamp('2006-05-31', tz='UTC'), # Regional election
Timestamp('2007-12-19', tz='UTC'), # Presidency
Timestamp('2008-04-09', tz='UTC'), # National Assembly
Timestamp('2010-06-02', tz='UTC'), # Local election
Timestamp('2012-04-11', tz='UTC'), # National Assembly
Timestamp('2012-12-19', tz='UTC'), # Presidency
Timestamp('2014-06-04', tz='UTC'), # Local election
Timestamp('2016-04-13', tz='UTC'), # National Assembly
Timestamp('2017-05-09', tz='UTC'), # Presidency
Timestamp('2018-06-13', tz='UTC'), # Local election
Timestamp('2020-04-15', tz='UTC'), # National Assembly
Timestamp('2022-03-09', tz='UTC'), # Presidency
Timestamp('2022-06-01', tz='UTC'), # Local election
]
# Buddha's birthday
KRBuddhasBirthday = [
Timestamp('2000-05-11', tz='UTC'),
Timestamp('2001-05-01', tz='UTC'),
Timestamp('2003-05-08', tz='UTC'),
Timestamp('2004-05-26', tz='UTC'),
Timestamp('2005-05-15', tz='UTC'),
Timestamp('2006-05-05', tz='UTC'),
Timestamp('2007-05-24', tz='UTC'),
Timestamp('2008-05-12', tz='UTC'),
Timestamp('2009-05-02', tz='UTC'),
Timestamp('2010-05-21', tz='UTC'),
Timestamp('2011-05-10', tz='UTC'),
Timestamp('2012-05-28', tz='UTC'),
Timestamp('2013-05-17', tz='UTC'),
Timestamp('2014-05-06', tz='UTC'),
Timestamp('2015-05-25', tz='UTC'),
Timestamp('2016-05-14', tz='UTC'),
Timestamp('2017-05-03', tz='UTC'),
Timestamp('2018-05-22', tz='UTC'),
Timestamp('2020-04-30', tz='UTC'),
Timestamp('2021-05-19', tz='UTC'),
]
# Harvest Moon Day
KRHarvestMoonDay = [
# 2000
Timestamp('2000-09-11', tz='UTC'),
Timestamp('2000-09-12', tz='UTC'),
Timestamp('2000-09-13', tz='UTC'),
# 2001
Timestamp('2001-10-01', tz='UTC'),
Timestamp('2001-10-02', tz='UTC'),
# 2002
Timestamp('2002-09-20', tz='UTC'),
# 2003
Timestamp('2003-09-10', tz='UTC'),
Timestamp('2003-09-11', tz='UTC'),
Timestamp('2003-09-12', tz='UTC'),
# 2004
Timestamp('2004-09-27', tz='UTC'),
Timestamp('2004-09-28', tz='UTC'),
Timestamp('2004-09-29', tz='UTC'),
# 2005
Timestamp('2005-09-17', tz='UTC'),
Timestamp('2005-09-18', tz='UTC'),
Timestamp('2005-09-19', tz='UTC'),
# 2006
Timestamp('2006-10-05', tz='UTC'),
Timestamp('2006-10-06', tz='UTC'),
Timestamp('2006-10-07', tz='UTC'),
# 2007
Timestamp('2007-09-24', tz='UTC'),
Timestamp('2007-09-25', tz='UTC'),
Timestamp('2007-09-26', tz='UTC'),
# 2008
Timestamp('2008-09-13', tz='UTC'),
Timestamp('2008-09-14', tz='UTC'),
Timestamp('2008-09-15', tz='UTC'),
# 2009
Timestamp('2009-10-02', tz='UTC'),
Timestamp('2009-10-03', tz='UTC'),
Timestamp('2009-10-04', tz='UTC'),
# 2010
Timestamp('2010-09-21', tz='UTC'),
Timestamp('2010-09-22', tz='UTC'),
Timestamp('2010-09-23', tz='UTC'),
# 2011
Timestamp('2011-09-12', tz='UTC'),
Timestamp('2011-09-13', tz='UTC'),
# 2012
Timestamp('2012-10-01', tz='UTC'),
# 2013
Timestamp('2013-09-18', tz='UTC'),
Timestamp('2013-09-19', tz='UTC'),
Timestamp('2013-09-20', tz='UTC'),
# 2014
Timestamp('2014-09-08', tz='UTC'),
Timestamp('2014-09-09', tz='UTC'),
Timestamp('2014-09-10', tz='UTC'),
# 2015
Timestamp('2015-09-28', tz='UTC'),
Timestamp('2015-09-29', tz='UTC'),
# 2016
Timestamp('2016-09-14', tz='UTC'),
Timestamp('2016-09-15', tz='UTC'),
Timestamp('2016-09-16', tz='UTC'),
# 2017
Timestamp('2017-10-03', tz='UTC'),
Timestamp('2017-10-04', tz='UTC'),
Timestamp('2017-10-05', tz='UTC'),
Timestamp('2017-10-06', tz='UTC'),
# 2018
Timestamp('2018-09-23', tz='UTC'),
Timestamp('2018-09-24', tz='UTC'),
Timestamp('2018-09-25', tz='UTC'),
Timestamp('2018-09-26', tz='UTC'),
# 2019
Timestamp('2019-09-12', tz='UTC'),
Timestamp('2019-09-13', tz='UTC'),
# 2020
Timestamp('2020-09-30', tz='UTC'),
Timestamp('2020-10-01', tz='UTC'),
Timestamp('2020-10-02', tz='UTC'),
# 2021
Timestamp('2021-09-20', tz='UTC'),
Timestamp('2021-09-21', tz='UTC'),
Timestamp('2021-09-22', tz='UTC'),
# 2022
Timestamp('2022-09-09', tz='UTC'),
Timestamp('2022-09-12', tz='UTC'), # 대체휴일
]
# 대체휴일
KRSubstitutionHolidayForChildrensDay2018 = [
Timestamp('2018-05-07', tz='UTC')
]
# 임시공휴일
KRCelebrationForWorldCupHosting = [
Timestamp('2002-07-01', tz='UTC')
]
KRSeventyYearsFromIndependenceDay = [
Timestamp('2015-08-14', tz='UTC')
]
KRTemporaryHolidayForChildrensDay2016 = [
Timestamp('2016-05-06', tz='UTC')
]
KRTemporaryHolidayForHarvestMoonDay2017 = [
Timestamp('2017-10-02', tz='UTC')
]
KRTemporaryHolidayForChildrenDay2018 = [
Timestamp('2018-05-07', tz='UTC')
]
KRTemporaryHolidayForChildrenDay2019 = [
Timestamp('2019-05-06', tz='UTC')
]
KRTemporaryHolidayForLiberationDay2020 = [
Timestamp('2020-08-17', tz='UTC')
]
KRTemporaryHoliday2021 = [
Timestamp('2021-08-16', tz='UTC'), # 광복절 대체휴일
Timestamp('2021-10-04', tz='UTC'), # 개천절 대체휴일
Timestamp('2021-10-11', tz='UTC'), # 한글날 대체휴일
]
KRTemporaryHoliday2022 = [
Timestamp('2022-10-10', tz='UTC'), # 한글날 대체휴일
]
# 잘 모르겠는 휴장일
HolidaysNeedToCheck = [
Timestamp('2000-01-03', tz='UTC')
]
HolidaysBefore1999 = [
Timestamp('1990-01-01', tz='UTC'),
Timestamp('1990-01-02', tz='UTC'),
Timestamp('1990-01-03', tz='UTC'),
Timestamp('1990-01-29', tz='UTC'),
Timestamp('1990-03-01', tz='UTC'),
Timestamp('1990-04-05', tz='UTC'),
Timestamp('1990-05-02', tz='UTC'),
Timestamp('1990-06-06', tz='UTC'),
Timestamp('1990-07-17', tz='UTC'),
Timestamp('1990-08-15', tz='UTC'),
Timestamp('1990-09-03', tz='UTC'),
Timestamp('1990-10-01', tz='UTC'),
Timestamp('1990-10-03', tz='UTC'),
Timestamp('1990-10-09', tz='UTC'),
Timestamp('1990-12-25', tz='UTC'),
Timestamp('1991-01-01', tz='UTC'),
Timestamp('1991-01-02', tz='UTC'),
Timestamp('1991-02-14', tz='UTC'),
Timestamp('1991-02-15', tz='UTC'),
Timestamp('1991-03-01', tz='UTC'),
Timestamp('1991-04-05', tz='UTC'),
Timestamp('1991-05-21', tz='UTC'),
Timestamp('1991-06-06', tz='UTC'),
Timestamp('1991-07-17', tz='UTC'),
Timestamp('1991-08-15', tz='UTC'),
Timestamp('1991-09-23', tz='UTC'),
Timestamp('1991-10-03', tz='UTC'),
Timestamp('1991-12-25', tz='UTC'),
Timestamp('1991-12-30', tz='UTC'),
Timestamp('1992-01-01', tz='UTC'),
Timestamp('1992-09-10', tz='UTC'),
Timestamp('1992-09-11', tz='UTC'),
Timestamp('1992-10-03', tz='UTC'),
Timestamp('1992-12-25', tz='UTC'),
Timestamp('1992-12-29', tz='UTC'),
Timestamp('1992-12-30', tz='UTC'),
Timestamp('1992-12-31', tz='UTC'),
Timestamp('1993-01-01', tz='UTC'),
Timestamp('1993-01-22', tz='UTC'),
Timestamp('1993-03-01', tz='UTC'),
Timestamp('1993-04-05', tz='UTC'),
Timestamp('1993-05-05', tz='UTC'),
Timestamp('1993-05-28', tz='UTC'),
Timestamp('1993-07-17', tz='UTC'),
Timestamp('1993-09-29', tz='UTC'),
Timestamp('1993-09-30', tz='UTC'),
Timestamp('1993-10-01', tz='UTC'),
Timestamp('1993-12-29', tz='UTC'),
Timestamp('1993-12-30', tz='UTC'),
Timestamp('1993-12-31', tz='UTC'),
Timestamp('1994-01-02', tz='UTC'),
Timestamp('1994-02-09', tz='UTC'),
Timestamp('1994-02-10', tz='UTC'),
Timestamp('1994-02-11', tz='UTC'),
Timestamp('1994-03-01', tz='UTC'),
Timestamp('1994-04-05', tz='UTC'),
Timestamp('1994-05-05', tz='UTC'),
Timestamp('1994-06-06', tz='UTC'),
Timestamp('1994-07-17', tz='UTC'),
Timestamp('1994-08-15', tz='UTC'),
Timestamp('1994-09-19', tz='UTC'),
Timestamp('1994-09-20', tz='UTC'),
Timestamp('1994-09-21', tz='UTC'),
Timestamp('1994-10-03', tz='UTC'),
Timestamp('1994-12-29', tz='UTC'),
Timestamp('1994-12-30', tz='UTC'),
Timestamp('1995-01-02', tz='UTC'),
Timestamp('1995-01-30', tz='UTC'),
Timestamp('1995-01-31', tz='UTC'),
Timestamp('1995-02-01', tz='UTC'),
Timestamp('1995-03-01', tz='UTC'),
Timestamp('1995-05-01', tz='UTC'),
Timestamp('1995-05-05', tz='UTC'),
Timestamp('1995-06-06', tz='UTC'),
Timestamp('1995-06-27', tz='UTC'),
Timestamp('1995-07-17', tz='UTC'),
Timestamp('1995-08-15', tz='UTC'),
Timestamp('1995-09-08', tz='UTC'),
Timestamp('1995-09-09', tz='UTC'),
Timestamp('1995-10-03', tz='UTC'),
Timestamp('1995-12-22', tz='UTC'),
Timestamp('1995-12-25', tz='UTC'),
Timestamp('1995-12-28', tz='UTC'),
Timestamp('1995-12-29', tz='UTC'),
Timestamp('1995-12-30', tz='UTC'),
Timestamp('1995-12-31', tz='UTC'),
Timestamp('1996-01-01', tz='UTC'),
Timestamp('1996-01-02', tz='UTC'),
Timestamp('1996-02-19', tz='UTC'),
Timestamp('1996-02-20', tz='UTC'),
Timestamp('1996-03-01', tz='UTC'),
Timestamp('1996-04-05', tz='UTC'),
Timestamp('1996-04-11', tz='UTC'),
Timestamp('1996-05-01', tz='UTC'),
Timestamp('1996-05-05', tz='UTC'),
Timestamp('1996-05-24', tz='UTC'),
Timestamp('1996-06-06', tz='UTC'),
Timestamp('1996-07-17', tz='UTC'),
Timestamp('1996-08-15', tz='UTC'),
Timestamp('1996-09-26', tz='UTC'),
Timestamp('1996-09-27', tz='UTC'),
Timestamp('1996-09-28', tz='UTC'),
Timestamp('1996-10-03', tz='UTC'),
Timestamp('1996-12-25', tz='UTC'),
Timestamp('1996-12-30', tz='UTC'),
Timestamp('1996-12-31', tz='UTC'),
Timestamp('1997-01-01', tz='UTC'),
Timestamp('1997-01-02', tz='UTC'),
Timestamp('1997-02-07', tz='UTC'),
Timestamp('1997-02-08', tz='UTC'),
Timestamp('1997-03-01', tz='UTC'),
Timestamp('1997-04-05', tz='UTC'),
Timestamp('1997-05-05', tz='UTC'),
Timestamp('1997-05-14', tz='UTC'),
Timestamp('1997-06-06', tz='UTC'),
Timestamp('1997-07-17', tz='UTC'),
Timestamp('1997-08-15', tz='UTC'),
Timestamp('1997-09-16', tz='UTC'),
Timestamp('1997-09-17', tz='UTC'),
Timestamp('1997-10-03', tz='UTC'),
Timestamp('1997-12-25', tz='UTC'),
Timestamp('1998-01-01', tz='UTC'),
Timestamp('1998-01-02', tz='UTC'),
Timestamp('1998-01-27', tz='UTC'),
Timestamp('1998-01-28', tz='UTC'),
Timestamp('1998-01-29', tz='UTC'),
Timestamp('1998-03-01', tz='UTC'),
Timestamp('1998-04-05', tz='UTC'),
Timestamp('1998-05-01', tz='UTC'),
Timestamp('1998-05-03', tz='UTC'),
Timestamp('1998-05-05', tz='UTC'),
Timestamp('1998-06-04', tz='UTC'),
Timestamp('1998-06-06', tz='UTC'),
Timestamp('1998-07-17', tz='UTC'),
Timestamp('1998-08-15', tz='UTC'),
Timestamp('1998-10-03', tz='UTC'),
Timestamp('1998-10-04', tz='UTC'),
Timestamp('1998-10-05', tz='UTC'),
Timestamp('1998-10-06', tz='UTC'),
Timestamp('1998-12-25', tz='UTC'),
Timestamp('1998-12-31', tz='UTC'),
Timestamp('1999-01-01', tz='UTC'),
Timestamp('1999-02-15', tz='UTC'),
Timestamp('1999-02-16', tz='UTC'),
Timestamp('1999-02-17', tz='UTC'),
Timestamp('1999-03-01', tz='UTC'),
Timestamp('1999-04-05', tz='UTC'),
Timestamp('1999-05-05', tz='UTC'),
Timestamp('1999-05-22', tz='UTC'),
Timestamp('1999-06-06', tz='UTC'),
Timestamp('1999-07-17', tz='UTC'),
Timestamp('1999-09-23', tz='UTC'),
Timestamp('1999-09-24', tz='UTC'),
Timestamp('1999-09-25', tz='UTC'),
Timestamp('1999-10-03', tz='UTC'),
Timestamp('1999-12-29', tz='UTC'),
Timestamp('1999-12-30', tz='UTC'),
Timestamp('1999-12-31', tz='UTC'),
]
class KRXExchangeCalendar(ExtendedExchangeCalendar):
"""
Exchange calendars for KRX
Open Time: 9:00 AM, Asia/Seoul
Close Time: 3:30 PM, Asia/Seoul (3:00 PM until 2016/07/31)
"""
@property
def name(self):
return "KRX"
@property
def tz(self):
# return timezone('Asia/Seoul')
return timezone('UTC')
@property
def open_time(self):
return time(9, 0)
@property
def open_times(self):
return [(None, time(9, 0))]
@property
def close_time(self):
return time(15, 30)
    @property
    def close_times(self):
        # NOTE: the 3:00 PM close in effect until 2016-07-31 (see the class
        # docstring) is not modeled here; all sessions use the 3:30 PM close.
        return [(None, time(15, 30))]
@property
def regular_holidays(self):
return HolidayCalendar([
KRNewYearsDay,
KRIndependenceDay,
KRArbourDay,
KRLabourDay,
KRChildrensDay,
KRMemorialDay,
KRConstitutionDay,
KRLiberationDay,
KRNationalFoundationDay,
Christmas,
KRHangulProclamationDay,
KRXEndOfYearClosing
])
@property
def special_closes(self):
return []
@property
def adhoc_holidays(self):
return list(chain(
KRXEndOfYearClosing2000,
KRLunarNewYear,
KRElectionDays,
KRBuddhasBirthday,
KRHarvestMoonDay,
KRSubstitutionHolidayForChildrensDay2018,
KRCelebrationForWorldCupHosting,
KRSeventyYearsFromIndependenceDay,
KRTemporaryHolidayForChildrensDay2016,
KRTemporaryHolidayForHarvestMoonDay2017,
KRTemporaryHolidayForChildrenDay2018,
KRTemporaryHolidayForChildrenDay2019,
HolidaysNeedToCheck,
KRTemporaryHolidayForLiberationDay2020,
            KRTemporaryHoliday2021,
            KRTemporaryHoliday2022,
            HolidaysBefore1999,
        ))
def __hash__(self):
return hash(self.name)
def __eq__(self, other):
return self.__class__ == other.__class__
| 28.474178 | 85 | 0.580599 | 2,452 | 18,195 | 4.294861 | 0.081158 | 0.159054 | 0.340329 | 0.041022 | 0.645238 | 0.069699 | 0.012914 | 0 | 0 | 0 | 0 | 0.207138 | 0.19313 | 18,195 | 638 | 86 | 28.518809 | 0.510183 | 0.044957 | 0 | 0.04142 | 0 | 0 | 0.265002 | 0.001216 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021696 | false | 0 | 0.013807 | 0.021696 | 0.059172 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
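`KRXEndOfYearClosing` above relies on pandas' `previous_friday` observance to roll a weekend December 31 back to the preceding Friday. The rule is simple enough to restate with the standard library (this reimplements the observance for illustration; the calendar itself uses `pandas.tseries.holiday`):

```python
import datetime

def previous_friday(dt):
    """Roll Saturday back one day and Sunday back two, leaving weekdays alone,
    matching the semantics of pandas.tseries.holiday.previous_friday."""
    if dt.weekday() == 5:  # Saturday
        return dt - datetime.timedelta(days=1)
    if dt.weekday() == 6:  # Sunday
        return dt - datetime.timedelta(days=2)
    return dt

# 2022-12-31 fell on a Saturday, so the KRX year-end closing is observed
# on Friday 2022-12-30; 2021-12-31 was already a Friday and is unchanged.
assert previous_friday(datetime.date(2022, 12, 31)) == datetime.date(2022, 12, 30)
assert previous_friday(datetime.date(2021, 12, 31)) == datetime.date(2021, 12, 31)
```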
605c46e1dca45ffe66a05a4a91174510b5abbb04 | 433 | py | Python | src/forecastmgmt/ui/masterdata/person_window.py | vvladych/forecastmgmt | 9eea272d00bb42031f49b5bb5af01388ecce31cf | [
"Unlicense"
] | null | null | null | src/forecastmgmt/ui/masterdata/person_window.py | vvladych/forecastmgmt | 9eea272d00bb42031f49b5bb5af01388ecce31cf | [
"Unlicense"
] | 37 | 2015-07-01T22:18:51.000Z | 2016-03-11T21:17:12.000Z | src/forecastmgmt/ui/masterdata/person_window.py | vvladych/forecastmgmt | 9eea272d00bb42031f49b5bb5af01388ecce31cf | [
"Unlicense"
] | null | null | null | from gi.repository import Gtk
from masterdata_abstract_window import MasterdataAbstractWindow
from person_add_mask import PersonAddMask
from person_list_mask import PersonListMask
class PersonWindow(MasterdataAbstractWindow):
def __init__(self, main_window):
super(PersonWindow, self).__init__(main_window, PersonListMask(), PersonAddMask(main_window, self.add_working_area))
| 25.470588 | 124 | 0.757506 | 45 | 433 | 6.866667 | 0.533333 | 0.097087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191686 | 433 | 16 | 125 | 27.0625 | 0.882857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.571429 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
606613f1ae0ac5127e78b4dfa447fad203daeafe | 6,736 | py | Python | scripts/generate.py | jwise/pebble-caltrain | 770497cb38205827fee2e4e4cfdd79bcf60ceb65 | [
"MIT"
] | 1 | 2019-06-18T01:17:25.000Z | 2019-06-18T01:17:25.000Z | scripts/generate.py | jwise/pebble-caltrain | 770497cb38205827fee2e4e4cfdd79bcf60ceb65 | [
"MIT"
] | null | null | null | scripts/generate.py | jwise/pebble-caltrain | 770497cb38205827fee2e4e4cfdd79bcf60ceb65 | [
"MIT"
] | null | null | null | __author__ = 'katharine'
import csv
import struct
import time
import datetime
def generate_files(source_dir, target_dir):
    stops_txt = [x for x in csv.DictReader(open("%s/stops.txt" % source_dir, 'rb')) if x['location_type'] == '0']
    print "%d stops" % len(stops_txt)
    name_replacements = (
        ('Caltrain', ''),
        ('Station', ''),
        ('Mt View', 'Mountain View'),
        ('So. San Francisco', 'South SF'),
        ('South San Francisco', 'South SF'),
    )
    stop_parent_map = {}
    stop_name_map = {}
    stop_map = {}
    stops = []
    for s in stops_txt:
        if s['parent_station'] != '' and s['parent_station'] in stop_parent_map:
            stop_map[int(s['stop_code'])] = stop_parent_map[s['parent_station']]
            continue
        for replacement in name_replacements:
            s['stop_name'] = s['stop_name'].replace(*replacement)
        s['stop_name'] = s['stop_name'].rstrip()
        if s['stop_name'] in stop_name_map:
            stop_map[int(s['stop_code'])] = stop_name_map[s['stop_name']]
            continue
        stop_map[int(s['stop_code'])] = len(stops)
        stop_parent_map[s['parent_station']] = len(stops)
        stop_name_map[s['stop_name']] = len(stops)
        stops.append({
            'name': s['stop_name'],
            'zone': int(s['zone_id']) if s['zone_id'] != '' else 0,
            'lat': float(s['stop_lat']),
            'lon': float(s['stop_lon'])
        })
    with open('%s/stops.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<B', len(stops)))
        for stop in stops:
            f.write(struct.pack('<B18sii', stop['zone'], stop['name'], int(stop['lat'] * 1000000), int(stop['lon'] * 1000000)))
    calendar_txt = list(csv.DictReader(open("%s/calendar.txt" % source_dir, 'rb')))
    cal = []
    cal_map = {}
    for i, x in enumerate(calendar_txt):
        cal_map[x['service_id']] = len(cal)
        end_time = datetime.datetime.strptime(x['end_date'], '%Y%m%d') + datetime.timedelta(1, hours=2)
        cal.append({
            'id': cal_map[x['service_id']],
            'start': time.mktime(time.strptime(x['start_date'], '%Y%m%d')),
            'end': time.mktime(end_time.timetuple()),
            'days': (
                (int(x['monday']) << 0) |
                (int(x['tuesday']) << 1) |
                (int(x['wednesday']) << 2) |
                (int(x['thursday']) << 3) |
                (int(x['friday']) << 4) |
                (int(x['saturday']) << 5) |
                (int(x['sunday']) << 6)
            )
        })
    calendar_dates_txt = list(csv.DictReader(open("%s/calendar_dates.txt" % source_dir, 'rb')))
    for i, x in enumerate(calendar_dates_txt):
        if x['service_id'] in cal_map:
            # XXX: Would be nice to find a way to mark special dates. But
            # we can't, right now. Oh well.
            continue
        cal_map[x['service_id']] = len(cal)
        start_time = datetime.datetime.strptime(x['date'], '%Y%m%d')
        end_time = start_time + datetime.timedelta(1, hours=2)
        cal.append({
            'id': cal_map[x['service_id']],
            'start': time.mktime(start_time.timetuple()),
            'end': time.mktime(end_time.timetuple()),
            'days': 0x7F,
        })
    with open('%s/calendar.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<B', len(cal)))
        for c in cal:
            f.write(struct.pack('<IIB', int(c['start']), int(c['end']), c['days']))
    trips_txt = list(csv.DictReader(open("%s/trips.txt" % source_dir, "rb")))
    tr = []
    tr_map = {}
    # These shouldn't be hardcoded, and should instead be inferred from routes.txt.
    route_map = {
        "BABY BULLET": 0,
        "LIMITED": 1,
        "LOCAL": 2,
        "SHUTTLE": 3,
        "Bu-130": 0,
        "Li-130": 1,
        "Lo-130": 2,
        "TaSj-130": 3,
        "Sp-130": 2,  # XXX: Special Event Extra Service
    }
    short_name_replacements = (
        ('Tamien SJ shuttle', ''),
        ('S', ''),
        ('shuttle', ''),
    )
    for i, trip in enumerate(trips_txt):
        for replacement in short_name_replacements:
            trip['trip_short_name'] = trip['trip_short_name'].replace(*replacement)
        tr.append({
            'direction': int(not int(trip['direction_id'])),  # We picked opposing values for north/south.
            'route': route_map[trip['route_id']],
            'service': cal_map[trip['service_id']],
            'trip_name': int(trip['trip_short_name'])})
        tr_map[trip['trip_id']] = i
    with open('%s/trips.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<H', len(tr)))
        for t in tr:
            f.write(struct.pack('<HBBB', t['trip_name'], t['direction'], t['route'], t['service']))
    times_txt = list(csv.DictReader(open("%s/stop_times.txt" % source_dir)))
    tm = sorted([{
        'time': (int(x['arrival_time'].split(':')[0]) * 60 + int(x['arrival_time'].split(':')[1])),
        'stop': stop_map[int(x['stop_id'])],
        'sequence': int(x['stop_sequence']),
        'trip': tr_map[x['trip_id']]
    } for x in times_txt], key=lambda y: y['time'])
    with open('%s/times.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<H', len(tm)))
        for t in tm:
            f.write(struct.pack('<HHBB', t['trip'], t['time'], t['stop'], t['sequence']))
    stop_times = [sorted([i for i, x in enumerate(tm) if x['stop'] == stop], key=lambda t: tm[t]['time']) for stop, s in enumerate(stops)]
    lengths = [len(x) for x in stop_times]
    with open('%s/stop_index.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<B', len(lengths)))
        counter = len(lengths) * 4 + 1
        for l in lengths:
            f.write(struct.pack('<HH', counter, l))
            counter += l * 2
        for s in stop_times:
            for x in s:
                f.write(struct.pack('<H', x))
    trip_stops = [sorted([i for i, x in enumerate(tm) if x['trip'] == trip], key=lambda k: tm[k]['stop']) for trip, s in enumerate(tr)]
    lengths = map(len, trip_stops)
    with open('%s/trip_index.dat' % target_dir, 'wb') as f:
        f.write(struct.pack('<H', len(lengths)))
        counter = len(lengths) * 3 + 2
        data_start = counter
        for l in lengths:
            f.write(struct.pack('<HB', counter, l))
            counter += l * 2
        if data_start != f.tell():
            raise Exception("%d != %d" % (data_start, f.tell()))
        for s in trip_stops:
            for x in s:
                f.write(struct.pack('<H', x))
        if f.tell() != counter:
            raise Exception("Not the expected length!")
if __name__ == "__main__":
    import sys
    generate_files(sys.argv[1], sys.argv[2])
| 36.215054 | 138 | 0.532512 | 924 | 6,736 | 3.728355 | 0.198052 | 0.021771 | 0.048766 | 0.065022 | 0.354717 | 0.29492 | 0.219448 | 0.16836 | 0.13643 | 0.13643 | 0 | 0.013849 | 0.28177 | 6,736 | 185 | 139 | 36.410811 | 0.698222 | 0.036372 | 0 | 0.144737 | 1 | 0 | 0.172398 | 0.003238 | 0 | 0 | 0.000617 | 0 | 0 | 0 | null | null | 0 | 0.032895 | null | null | 0.006579 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
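The `.dat` writers above rely on fixed-width `struct` records. A minimal round-trip sketch (in Python 3, with made-up station values; micro-degree integers as in the writer) shows how one `stops.dat` record laid out as `'<B18sii'` can be decoded again:

```python
import struct

# One stops.dat record as written above: zone (1 byte), name (18 bytes,
# NUL-padded), then latitude/longitude as micro-degrees (degrees * 1e6).
# The station name and coordinates are illustrative values only.
FMT = '<B18sii'
record = struct.pack(FMT, 2, b'Mountain View', 37394458, -122076263)
assert len(record) == struct.calcsize(FMT)  # 1 + 18 + 4 + 4 = 27 bytes

zone, raw_name, lat_e6, lon_e6 = struct.unpack(FMT, record)
name = raw_name.rstrip(b'\x00').decode('ascii')  # strip the NUL padding
print(zone, name, lat_e6 / 1e6, lon_e6 / 1e6)
```

Little-endian (`<`) packing matters here because the Pebble firmware reading these files assumes a fixed byte order.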
6066c37cd5157dfddcd5a311c42cacbf5b1656ab | 203 | py | Python | cmsplugin_cascade/migrations/0007_add_proxy_models.py | teklager/djangocms-cascade | adc461f7054c6c0f88bc732aefd03b157df2f514 | [
"MIT"
] | 139 | 2015-01-08T22:27:06.000Z | 2021-08-19T03:36:58.000Z | cmsplugin_cascade/migrations/0007_add_proxy_models.py | teklager/djangocms-cascade | adc461f7054c6c0f88bc732aefd03b157df2f514 | [
"MIT"
] | 286 | 2015-01-02T14:15:14.000Z | 2022-03-22T11:00:12.000Z | cmsplugin_cascade/migrations/0007_add_proxy_models.py | teklager/djangocms-cascade | adc461f7054c6c0f88bc732aefd03b157df2f514 | [
"MIT"
] | 91 | 2015-01-16T15:06:23.000Z | 2022-03-23T23:36:54.000Z | from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('cmsplugin_cascade', '0006_bootstrapgallerypluginmodel'),
    ]

    operations = [
    ]
| 16.916667 | 66 | 0.679803 | 16 | 203 | 8.5 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025316 | 0.221675 | 203 | 11 | 67 | 18.454545 | 0.835443 | 0 | 0 | 0 | 0 | 0 | 0.241379 | 0.157635 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
608959b12d0989d6fd9417a960a75d31ceab0a32 | 461 | py | Python | Topics/Submitting data/POST Request With Several Keys/main.py | valenciarichards/hypernews-portal | 0b6c4d8aefe4f8fc7dc90d6542716e98f52515b3 | [
"MIT"
] | 1 | 2021-07-26T03:06:14.000Z | 2021-07-26T03:06:14.000Z | Topics/Submitting data/POST Request With Several Keys/main.py | valenciarichards/hypernews-portal | 0b6c4d8aefe4f8fc7dc90d6542716e98f52515b3 | [
"MIT"
] | null | null | null | Topics/Submitting data/POST Request With Several Keys/main.py | valenciarichards/hypernews-portal | 0b6c4d8aefe4f8fc7dc90d6542716e98f52515b3 | [
"MIT"
] | null | null | null | from django.shortcuts import redirect
from django.views import View
class TodoView(View):
    all_todos = []

    def post(self, request, *args, **kwargs):
        todo = request.POST.get("todo")
        important = request.POST.get("important")
        if todo not in self.all_todos:
            if important:
                self.all_todos = [todo] + self.all_todos
            else:
                self.all_todos.append(todo)
        return redirect("/")
| 27.117647 | 56 | 0.590022 | 55 | 461 | 4.854545 | 0.472727 | 0.149813 | 0.179775 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.301518 | 461 | 16 | 57 | 28.8125 | 0.829193 | 0 | 0 | 0 | 0 | 0 | 0.030369 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
608a1240bebe926df3b7d5f1923157bb4c477433 | 1,614 | py | Python | pybinsim/pose.py | fkleinTUI/pyBinSim | 5320f62422e0a92154272f8167b87cabdcafe27f | [
"MIT"
] | null | null | null | pybinsim/pose.py | fkleinTUI/pyBinSim | 5320f62422e0a92154272f8167b87cabdcafe27f | [
"MIT"
] | null | null | null | pybinsim/pose.py | fkleinTUI/pyBinSim | 5320f62422e0a92154272f8167b87cabdcafe27f | [
"MIT"
] | null | null | null | import logging
from collections import namedtuple
logger = logging.getLogger("pybinsim.Pose")
class Orientation(namedtuple('Orientation', ['yaw', 'pitch', 'roll'])):
    pass


class Position(namedtuple('Position', ['x', 'y', 'z'])):
    pass


class Custom(namedtuple('CustomValues', ['a', 'b', 'c'])):
    pass


class Pose:

    def __init__(self, orientation, position, custom=Custom(0, 0, 0)):
        self.orientation = orientation
        self.position = position
        self.custom = custom

    def create_key(self):
        value_list = list(self.orientation) + list(self.position) + list(self.custom)
        return ','.join([str(x) for x in value_list])

    @staticmethod
    def from_filterValueList(filter_value_list):
        # 'old' format: orientation - position
        if len(filter_value_list) == 6:
            orientation = Orientation(filter_value_list[0], filter_value_list[1], filter_value_list[2])
            position = Position(filter_value_list[3], filter_value_list[4], filter_value_list[5])
            return Pose(orientation, position)

        # 'new' format: orientation - position - custom
        if len(filter_value_list) == 9:
            orientation = Orientation(filter_value_list[0], filter_value_list[1], filter_value_list[2])
            position = Position(filter_value_list[3], filter_value_list[4], filter_value_list[5])
            custom = Custom(filter_value_list[6], filter_value_list[7], filter_value_list[8])
            return Pose(orientation, position, custom)

        raise RuntimeError("Unable to parse filter list: {}".format(filter_value_list))
| 32.938776 | 103 | 0.669145 | 198 | 1,614 | 5.222222 | 0.29798 | 0.182785 | 0.275629 | 0.030948 | 0.297872 | 0.259188 | 0.259188 | 0.259188 | 0.259188 | 0.259188 | 0 | 0.015613 | 0.20632 | 1,614 | 48 | 104 | 33.625 | 0.791569 | 0.050805 | 0 | 0.241379 | 0 | 0 | 0.061478 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0.103448 | 0.068966 | 0 | 0.413793 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
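The two list formats accepted by `Pose.from_filterValueList` amount to "pad the old six-value form with a zeroed custom block before keying". A condensed, self-contained sketch of that logic combined with the `create_key` join (an illustration, not the module's actual API):

```python
def make_pose_key(values):
    """Build the comma-joined key that Pose.create_key() produces,
    accepting both the 6-value (orientation + position) and the
    9-value (orientation + position + custom) filter formats."""
    if len(values) == 6:
        values = list(values) + [0, 0, 0]  # 'old' format: zeroed custom block
    if len(values) != 9:
        raise RuntimeError("Unable to parse filter list: {}".format(values))
    return ','.join(str(v) for v in values)

print(make_pose_key([10, 0, 0, 1.5, 0.0, 2.0]))  # -> 10,0,0,1.5,0.0,2.0,0,0,0
```

The string key makes a pose usable as a dict key for caching filters, which is why `create_key` stringifies every component.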
60931ff29f096eec340bc02f5f6b68402207873f | 5,803 | py | Python | pulsar/datadog_checks/pulsar/check.py | divyamamgai/integrations-extras | 8c40a9cf870578687cc224ee91d3c70cd3a435a4 | [
"BSD-3-Clause"
] | 158 | 2016-06-02T16:25:31.000Z | 2022-03-16T15:55:14.000Z | pulsar/datadog_checks/pulsar/check.py | divyamamgai/integrations-extras | 8c40a9cf870578687cc224ee91d3c70cd3a435a4 | [
"BSD-3-Clause"
] | 554 | 2016-03-15T17:39:12.000Z | 2022-03-31T10:29:16.000Z | pulsar/datadog_checks/pulsar/check.py | divyamamgai/integrations-extras | 8c40a9cf870578687cc224ee91d3c70cd3a435a4 | [
"BSD-3-Clause"
] | 431 | 2016-05-13T15:33:13.000Z | 2022-03-31T10:06:46.000Z | from datadog_checks.base import ConfigurationError, OpenMetricsBaseCheck
EVENT_TYPE = SOURCE_TYPE_NAME = 'pulsar'
class PulsarCheck(OpenMetricsBaseCheck):
"""
PulsarCheck derives from AgentCheck that provides the required check method
"""
def __init__(self, name, init_config, instances=None):
instance = instances[0]
url = instance.get('prometheus_url')
if url is None:
raise ConfigurationError("Unable to find prometheus_url in config file.")
self.NAMESPACE = 'kesque.pulsar'
self.metrics_mapper = {
'pulsar_consumer_available_permits': 'consumer.available_permits',
'pulsar_consumer_blocked_on_unacked_messages': 'consumer.blocked_on_unacked_messages',
'pulsar_consumer_msg_rate_out': 'consumer.msg_rate_out',
'pulsar_consumer_msg_rate_redeliver': 'consumer.msg_rate_redeliver',
'pulsar_consumer_msg_throughput_out': 'consumer.msg_throughput_out',
'pulsar_consumer_unacked_messages': 'consumer.unacked_messages',
'pulsar_consumers_count': 'consumers_count',
'pulsar_entry_size_count': 'entry_size_count',
'pulsar_entry_size_le_100_kb': 'entry_size_le_100_kb',
'pulsar_entry_size_le_128': 'entry_size_le_128',
'pulsar_entry_size_le_16_kb': 'entry_size_le_16_kb',
'pulsar_entry_size_le_1_kb': 'entry_size_le_1_kb',
'pulsar_entry_size_le_1_mb': 'entry_size_le_1_mb',
'pulsar_entry_size_le_2_kb': 'entry_size_le_2_kb',
'pulsar_entry_size_le_4_kb': 'entry_size_le_4_kb',
'pulsar_entry_size_le_512': 'entry_size_le_512',
'pulsar_entry_size_le_overflow': 'entry_size_le_overflow',
'pulsar_entry_size_sum': 'entry_size_sum',
'pulsar_in_bytes_total': 'in_bytes_total',
'pulsar_in_messages_total': 'in_messages_total',
'pulsar_msg_backlog': 'msg_backlog',
'pulsar_out_bytes_total': 'out_bytes_total',
'pulsar_out_messages_total': 'out_messages_total',
'pulsar_producers_count': 'producers_count',
'pulsar_rate_in': 'rate_in',
'pulsar_rate_out': 'rate_out',
'pulsar_replication_backlog': 'replication.backlog',
'pulsar_replication_rate_in': 'replication.rate_in',
'pulsar_replication_rate_out': 'replication.rate_out',
'pulsar_replication_throughput_in': 'replication.throughput_in',
'pulsar_replication_throughput_out': 'replication.throughput_out',
'pulsar_storage_backlog_quota_limit': 'storage.backlog_quota_limit',
'pulsar_storage_backlog_size': 'storage.backlog_size',
'pulsar_storage_read_rate': 'storage.read_rate',
'pulsar_storage_offloaded_size': 'storage.offloaded_size',
'pulsar_storage_size': 'storage.size',
'pulsar_storage_write_latency_count': 'storage.write_latency_count',
'pulsar_storage_write_latency_le_0_5': 'storage.write_latency_le_0_5',
'pulsar_storage_write_latency_le_1': 'storage.write_latency_le_1',
'pulsar_storage_write_latency_le_10': 'storage.write_latency_le_10',
'pulsar_storage_write_latency_le_100': 'storage.write_latency_le_100',
'pulsar_storage_write_latency_le_1000': 'storage.write_latency_le_1000',
'pulsar_storage_write_latency_le_20': 'storage.write_latency_le_20',
'pulsar_storage_write_latency_le_200': 'storage.write_latency_le_200',
'pulsar_storage_write_latency_le_5': 'storage.write_latency_le_5',
'pulsar_storage_write_latency_le_50': 'storage.write_latency_le_50',
'pulsar_storage_write_latency_overflow': 'storage.write_latency_overflow',
'pulsar_storage_write_latency_sum': 'storage.write_latency_sum',
'pulsar_storage_write_rate': 'storage.write_rate',
'pulsar_subscription_back_log': 'subscription.back_log',
'pulsar_subscription_back_log_no_delayed': 'subscription.back_log_no_delayed',
'pulsar_subscription_blocked_on_unacked_messages': 'subscription.blocked_on_unacked_messages',
'pulsar_subscription_delayed': 'subscription.delayed',
'pulsar_subscription_msg_rate_out': 'subscription.msg_rate_out',
'pulsar_subscription_msg_rate_redeliver': 'subscription.msg_rate_redeliver',
'pulsar_subscription_msg_throughput_out': 'subscription.msg_throughput_out',
'pulsar_subscription_unacked_messages': 'subscription.unacked_messages',
'pulsar_subscriptions_count': 'subscriptions.count',
'pulsar_throughput_in': 'throughput_in',
'pulsar_throughput_out': 'throughput_out',
'pulsar_topics_count': 'topics_count',
'scrape_duration_seconds': 'scrape_duration_seconds',
'scrape_samples_post_metric_relabeling': 'scrape_samples_post_metric_relabeling',
'scrape_samples_scraped': 'scrape_samples_scraped',
'topic_load_times': 'topic_load_times',
'topic_load_times_count': 'topic_load_times_count',
'topic_load_times_sum': 'topic_load_times_sum',
'up': 'broker.up',
}
instance.update(
{
'prometheus_url': url,
'namespace': self.NAMESPACE,
'metrics': [self.metrics_mapper],
'send_distribution_counts_as_monotonic': instance.get('send_distribution_counts_as_monotonic', True),
'send_distribution_sums_as_monotonic': instance.get('send_distribution_sums_as_monotonic', True),
}
)
super(PulsarCheck, self).__init__(name, init_config, instances)
| 58.616162 | 117 | 0.693607 | 659 | 5,803 | 5.485584 | 0.182094 | 0.086307 | 0.126141 | 0.104564 | 0.334993 | 0.104011 | 0.036238 | 0 | 0 | 0 | 0 | 0.015546 | 0.212993 | 5,803 | 98 | 118 | 59.214286 | 0.776002 | 0.012924 | 0 | 0 | 0 | 0 | 0.631653 | 0.492647 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011236 | false | 0 | 0.011236 | 0 | 0.033708 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
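The mapper above encodes a plain renaming convention: a scraped Prometheus sample name is looked up and re-emitted under the `kesque.pulsar` namespace. A small sketch of that convention (abbreviated mapper; this is not the agent's actual renaming pipeline):

```python
# Abbreviated copy of the mapping above, to show the naming convention only.
NAMESPACE = 'kesque.pulsar'
metrics_mapper = {
    'pulsar_rate_in': 'rate_in',
    'pulsar_storage_size': 'storage.size',
    'up': 'broker.up',
}

def datadog_name(prom_name):
    """Return the namespaced metric name, or None for unmapped samples."""
    mapped = metrics_mapper.get(prom_name)
    return None if mapped is None else '{}.{}'.format(NAMESPACE, mapped)

print(datadog_name('pulsar_storage_size'))  # -> kesque.pulsar.storage.size
```

Metrics absent from the mapper are simply dropped, which is why every metric the integration should report has to be listed explicitly.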
6093e7dfd679097b8c74873a9180892df2d03fa2 | 1,756 | py | Python | magie/window.py | NiumXp/Magie | f0dc1d135274b5453fde659ba46f09f4b303c099 | [
"MIT"
] | 1 | 2020-10-26T17:16:52.000Z | 2020-10-26T17:16:52.000Z | magie/window.py | NiumXp/Magie | f0dc1d135274b5453fde659ba46f09f4b303c099 | [
"MIT"
] | null | null | null | magie/window.py | NiumXp/Magie | f0dc1d135274b5453fde659ba46f09f4b303c099 | [
"MIT"
] | null | null | null | import pygame
class Window:
    def __init__(self, title: str, dimension: tuple):
        self.surface = None
        self.initial_title = title
        self.initial_dimension = dimension

    @property
    def title(self) -> str:
        """Returns the title of the window."""
        # get_caption() returns (title, icontitle); keep only the title.
        return pygame.display.get_caption()[0]

    @title.setter
    def title(self, new_title: str):
        """Sets the window title."""
        pygame.display.set_caption(new_title)

    def set_title(self, new_title: str):
        """Alias for `Window.title = ...`."""
        self.title = new_title

    @property
    def width(self) -> int:
        """Alias for Window.get_width."""
        return self.get_width()

    @property
    def height(self) -> int:
        """Alias for Window.get_height."""
        return self.get_height()

    @property
    def size(self) -> tuple:
        """Alias for Window.get_size."""
        return self.get_size()

    def get_width(self) -> int:
        """Returns the width of the window."""
        if self.surface:
            return self.surface.get_width()
        return self.initial_dimension[0]

    def get_height(self) -> int:
        """Returns the height of the window."""
        if self.surface:
            return self.surface.get_height()
        return self.initial_dimension[1]

    def get_size(self) -> tuple:
        """Returns the size of the window."""
        if self.surface:
            return self.surface.get_size()
        return self.initial_dimension

    def build(self):
        """Build the window."""
        self.surface = pygame.display.set_mode(self.initial_dimension)
        self.set_title(self.initial_title)
        return self
| 27.4375 | 71 | 0.574601 | 207 | 1,756 | 4.719807 | 0.183575 | 0.102354 | 0.102354 | 0.052201 | 0.21392 | 0.172979 | 0.123849 | 0.090072 | 0.090072 | 0.090072 | 0 | 0.00165 | 0.309795 | 1,756 | 63 | 72 | 27.873016 | 0.804455 | 0.16344 | 0 | 0.179487 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.282051 | false | 0 | 0.025641 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
609ff6c4c10fb31262c0f09062e4b789c62b7f8c | 360 | py | Python | tests/__init__.py | issuu/jmespath | 7480643df8ad0c99269b1fa6b9e793582bcb7efe | [
"MIT"
] | null | null | null | tests/__init__.py | issuu/jmespath | 7480643df8ad0c99269b1fa6b9e793582bcb7efe | [
"MIT"
] | null | null | null | tests/__init__.py | issuu/jmespath | 7480643df8ad0c99269b1fa6b9e793582bcb7efe | [
"MIT"
] | null | null | null | import sys
# The unittest module got a significant overhaul
# in 2.7, so if we're in 2.6 we can use the backported
# version unittest2.
if sys.version_info[:2] == (2, 6):
    import unittest2 as unittest
    import simplejson as json
    from ordereddict import OrderedDict
else:
    import unittest
    import json
    from collections import OrderedDict
| 21.176471 | 54 | 0.725 | 54 | 360 | 4.814815 | 0.555556 | 0.023077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032143 | 0.222222 | 360 | 16 | 55 | 22.5 | 0.896429 | 0.327778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.777778 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
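The same fallback can also be written as feature detection rather than a version check: try the stdlib module first and fall back to the backport only when the import fails. A small sketch of that pattern (the backport branches never run on modern interpreters):

```python
# Feature-detect instead of comparing sys.version_info: attempt the
# stdlib import and fall back to the backport package on ImportError.
try:
    import json
except ImportError:          # pragma: no cover - ancient interpreters only
    import simplejson as json

try:
    from collections import OrderedDict
except ImportError:          # pragma: no cover - Python 2.6 fallback
    from ordereddict import OrderedDict

d = OrderedDict([('b', 2), ('a', 1)])
print(json.dumps(d))  # insertion order preserved -> {"b": 2, "a": 1}
```

Either style works; the try/except form keeps working when the minimum supported version changes, since it names the capability instead of the version number.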
60a1ad6402e8e4a36b677d266c4a8212a4c64c09 | 15,419 | py | Python | WebPortal/gbol_portal/vars.py | ZFMK/GermanBarcodeofLife | f9e9af699ba3f5b3d0d537819a8463c4162d7e82 | [
"Apache-2.0"
] | null | null | null | WebPortal/gbol_portal/vars.py | ZFMK/GermanBarcodeofLife | f9e9af699ba3f5b3d0d537819a8463c4162d7e82 | [
"Apache-2.0"
] | null | null | null | WebPortal/gbol_portal/vars.py | ZFMK/GermanBarcodeofLife | f9e9af699ba3f5b3d0d537819a8463c4162d7e82 | [
"Apache-2.0"
] | null | null | null | import configparser
import logging

log = logging.getLogger(__name__)
c = configparser.ConfigParser()
c.read("production.ini")
config = {}
config['host'] = c['dboption']['chost']
config['port'] = int(c['dboption']['cport'])
config['user'] = c['dboption']['cuser']
config['pw'] = c['dboption']['cpw']
config['db'] = c['dboption']['cdb']
config['homepath'] = c['option']['home']
config['hosturl'] = c['option']['hosturl']
config['news'] = c['news']
config['smtp'] = {}
config['smtp']['sender'] = c['option']['smtp-sender']
config['smtp']['server'] = c['option']['smtp']
config['collection_table'] = {}
config['collection_table']['template'] = c['option']['template_collection_sheet']
config['collection_table']['ordered'] = c['option']['collection_table_ordered']
config['collection_table']['filled'] = c['option']['collection_table_filled']
config['dwb'] = {}
config['dwb']['name_suffix'] = c['option']['dwb_name_suffix']
config['dwb']['connection_string'] = c['option']['dwb_connection_string']
config['dwb']['use_dwb'] = int(c['option']['use_dwb'])
if not c.has_option('option', 'dev_group'):
    log.critical('Option `dev_group` is not defined in production.ini!\nPlease add at least one email to the list.')
    raise NameError('Option `dev_group` is not defined in production.ini!\nPlease add at least one email to the list.')
config['dev_group'] = c['option']['dev_group']
taxon_ids = """100408, 100430, 100431, 100451, 100453, 3000243, 3100522, 3200125,
3200126, 4000014, 4402020, 4403366, 4403382, 4403383, 4404012,
4404135, 4404679, 4405947, 4406565, 4407062, 4408012, 5000093,
5000095, 5000203, 5009403, 5009532, 5100497, 5200013, 5210014,
5220011, 5400004, 5401236, 5413793, 5416518, 5416650, 5426341,
5428084, 5428327, 5428727, 5428849, 5428977, 5429029, 5429176,
5429405, 5430460, 5431215"""
states = {'de': ["Europa",
"Baden-Württemberg",
"Bayern",
"Berlin",
"Brandenburg",
"Bremen",
"Hamburg",
"Hessen",
"Mecklenburg-Vorpommern",
"Niedersachsen",
"Nordrhein-Westfalen",
"Rheinland-Pfalz",
"Saarland",
"Sachsen",
"Sachsen-Anhalt",
"Schleswig-Holstein",
"Thüringen"],
'en': ["Europe",
"Baden-Württemberg",
"Bavaria",
"Berlin",
"Brandenburg",
"Bremen",
"Hamburg",
"Hesse",
"Mecklenburg-Vorpommern",
"Lower Saxony",
"North Rhine Westphalia",
"RhinelandPalatinate",
"Saarland",
"Saxony",
"Saxony-Anhalt",
"Schleswig-Holstein",
"Thuringia"]}
messages = {}
messages['results'] = {}
messages['results']['choose_taxa'] = {'de': '- Bitte wählen Sie ein Taxon aus -',
'en': '- Please choose a taxon -'}
messages['results']['choose_states'] = {'de': '- Bitte wählen Sie ein Bundesland aus -',
'en': '- Please choose a state -'}
messages['news_edit'] = {'de': ' Bearbeiten ', 'en': ' Edit '}
messages['news_reset'] = {'de': " Zurücksetzen ", 'en': " Reset "}
messages['news_reset_html'] = {'de': "<h2><strong>Titel</strong></h2><p>Inhalt</p>",
'en': "<h2><strong>Title</strong></h2><p>Content</p>"}
messages['news_message_saved'] = {'de': "News gespeichert!", 'en': "News saved!"}
messages['news_message_updated'] = {'de': "News bearbeitet!", 'en': "News updated!"}
messages['news_message_empty'] = {'de': "Bitte geben Sie Titel und Inhalt des neuen Newsbeitrages ein!",
'en': "Please enter title and content of the news posting!"}
messages['news_cancel'] = {'de': " Abbrechen ", 'en': " Cancel "}
messages['contact'] = {'de': 'Bitte überprüfen Sie die eingegebenen Daten.', 'en': 'Please check the data entered.'}
messages['contact_send'] = {'de': 'Die Mail wurde versandt!', 'en': 'The mail was sent successfully!'}
messages['letter_sender'] = {'de': 'Absender', 'en': 'Sender'}
messages['letter_send_to'] = {'de': 'Empfänger', 'en': 'Send to'}
messages['letter_order_no'] = {'de': 'Auftragsnummer {0}', 'en': 'Order no. {0}'}
messages['letter_no_samples'] = {'de': 'Anzahl Proben: {0}', 'en': 'No. samples: {0}'}
messages['letter_body1'] = {'de': 'Hinweis: Bitte drucken Sie das Anschreiben aus oder notieren Sie alternativ die ',
'en': 'Please print this cover letter or write the'}
messages['letter_body2'] = {'de': 'Auftragsnummer auf einem Zettel und legen diesen dem Probenpaket bei.',
'en': 'order number on a slip and send it together with your parcel '
'containing the samples.'}
messages['pls_select'] = {'de': 'Bitte wählen', 'en': 'Please select'}
messages['wrong_credentials'] = {'de': 'Falscher Benutzer oder Passwort!', 'en': 'Wrong user or password!'}
messages['still_locked'] = {'de': 'Sie wurden noch nicht von einem Koordinator freigeschaltet!',
'en': 'Your account must be unlocked by the Administrator!'}
messages['required_fields'] = {'de': 'Bitte alle Pflichtfelder ausfüllen!',
'en': 'Please fill out all required fields!'}
messages['username_present'] = {'de': 'Nutzername schon vorhanden, bitte wählen Sie einen anderen.',
'en': 'Username already present, please choose another one.'}
messages['user_created'] = {'de': 'Ihre Registrierungsanfrage wird bearbeitet. Sie werden in Kürze eine Email '
'Benachrichtigung zum Stand Ihrer Freigabe für das GBOL Webportal erhalten.',
'en': 'User created. Please wait for unlock of your account by the administrator.'}
messages['reg_exp_mail_subject'] = {'de': 'Ihre Registrierung beim GBOL Webportal',
'en': 'Your Registration at GBOL Webportal'}
messages['reg_exp_mail_body'] = {'de': 'Hallo {salutation} {title} {vorname} {nachname},\n\n'
'wir haben Ihre Registrierung für die taxonomische Expertise {expertisename} '
'erhalten und an die entsprechenden Koordinatoren weitergeleitet.\n\n'
'Viele Grüße\nIhr GBOL Team',
'en': 'Hello {salutation} {title} {vorname} {nachname},\n\n'
'We have received your registration for the taxonomic expertise {expertisename} and '
'have forwarded it to the corresponding GBOL taxon coordinators.\n\n'
'Best regards,\nYour GBOL team'}
messages['reg_exp_chg_mail_body'] = {'de': 'Hallo {tk_user},\n\n{req_user} hat sich für die Expertise {expertisename} '
'registriert.\nBitte prüfen Sie die Angaben und zertifizieren die '
'Expertise anschließend.\n\nViele Grüße\nIhr GBOL Team',
'en': 'Hello {tk_user},\n\n{req_user} applies for the taxonomic expertise '
'{expertisename}.\nPlease check the data and approve or decline the request.'
'\n\nBest regards, Your GBOL team'}
messages['reg_exp_accept'] = {'de': """Hallo {3} {1} {2},
die Expertise {0} in Ihrem GBOL Konto wurde erfolgreich von einem Koordinator freigegeben.
Viele Grüße
Ihr GBOL Team
""", 'en': """Hello {3} {1} {2}
The expertise {0} of your GBOL account has been approved by the coordinator.
Best regards,
The GBOL Team
"""}
messages['reg_exp_decline'] = {'de': """Hallo {3} {1} {2},
die Expertise {0} in Ihrem GBOL Konto wurde von einem Koordinator abgelehnt.
Sie können sich bei Fragen im Kontakt-Bereich bei uns melden.
Viele Grüße
Ihr GBOL Team
""", 'en': """Hello {3} {1} {2}
The expertise {0} of your GBOL account has been refused by the coordinator.
If you have any questions regarding the GBOL approval process, please send us a note in the contact area.
We will answer your inquiry as soon as possible.
Best regards,
The GBOL Team
"""}
messages['pwd_forgot_email_body'] = {'de': """{0},
eine Anfrage zum Zurücksetzen des Passworts für Ihr Benutzerkonto auf
dem German Barcode of Life Webportal wurde gestellt.
Sie können Ihr Passwort mit einem Klick auf folgenden Link ändern:
http://{1}/sammeln/change-password?link={2}
Ihr Benutzername lautet: {3}
Dieser Link kann nur einmal verwendet werden und leitet Sie zu einer Seite,
auf der Sie ein neues Passwort festlegen können. Er ist einen Tag lang gültig
und läuft automatisch aus, falls Sie ihn nicht verwenden.
Viele Grüße
Das Team von German Barcode of Life""",
'en': """{0},
a request for password reset for your useraccount on the
German Barcode of Life webportal has been posed.
You can change your password with the following link:
http://{1}/sammeln/change-password?link={2}
Your user name is: {3}
Please note: this link can only be used once. The link will direct you to a
website where you can enter a new password.
The link is valid for one day.
Best wishes,
Your team from German Barcode of Life"""}
messages['pwd_forgot_email_subject'] = {'de': 'Neue Login-Daten für {0} auf German Barcode of Life',
'en': 'New login data for your user {0} on German Barcode of '
'Life webportal'}
messages['pwd_forgot_sent'] = {'de': 'Das Passwort und weitere Hinweise wurden an '
'die angegebene Email-Adresse gesendet.',
'en': 'The password and further instructions were sent to your email address.'}
messages['pwd_forgot_not_found'] = {'de': 'Es wurde kein Benutzer mit eingegebenem Namen bzw. Email gefunden.',
'en': 'No user found with the name or the email entered.'}
messages['pwd_unmatch'] = {'de': 'Die beiden Passwörter stimmen nicht überein.', 'en': 'Passwords do not match.'}
messages['pwd_saved'] = {'de': 'Neues Passwort gespeichert.', 'en': 'New password saved'}
messages['pwd__link_used'] = {'de': 'Link wurde bereits benutzt.', 'en': 'The link has been used already'}
messages['pwd__link_invalid'] = {'de': 'Kein gültiger Link.', 'en': 'Link invalid'}
messages['pwd__link_timeout'] = {'de': 'Link ist nicht mehr gültig.', 'en': 'Link has timed out'}
messages['order_success'] = {'de': 'Danke, Ihre Bestellung wurde entgegengenommen.',
'en': 'Thank you, your order has been received.'}
messages['order_info_missing'] = {'de': 'Bitte füllen Sie alle Felder aus.', 'en': 'Please fill out all fields.'}
messages['edt_no_passwd'] = {'de': 'Bitte geben Sie Ihr Passwort an, um das Benutzerprofil zu ändern.',
'en': 'Please provide your password in order to change the userprofile.'}
messages['edt_passwd_wrong'] = {'de': 'Falsches Passwort.', 'en': 'Wrong password.'}
messages['edt_passwd_mismatch'] = {'de': 'Die beiden neuen Passwörter stimmen nicht überein.',
'en': 'Both new passwords do not match.'}
messages['edt_success'] = {'de': 'Benutzerprofil erfolgreich geändert', 'en': 'Userprofile updated.'}
messages['err_upload'] = {'de': 'Ein Fehler ist beim Hochladen der Sammeltabelle aufgetreten. '
'Bitte schicken Sie Ihre Sammeltabelle per E-Mail an den Taxonkoordinator.',
'en': 'An error occurred when uploading the collection sheet. Please send it to the '
'taxon coordinator via e-mail.'}
messages['succ_upload'] = {'de': 'Die Sammeltabelle wurde erfolgreich hochgeladen!',
'en': 'Collection sheet uploaded successfully!'}
messages['download'] = {'de': 'Herunterladen', 'en': 'Download'}
messages['cert'] = {'de': 'zertifiziert', 'en': 'certified'}
messages['subm'] = {'de': 'beantragt', 'en': 'submitted'}
messages['select'] = {'de': 'Auswahl', 'en': 'Please select'}
messages['robot'] = {'de': 'Registrierung konnte nicht durchgeführt werden!', 'en': 'Could not process registration!'}
messages['email_reg_subject'] = {'de': 'GBOL Registrierung', 'en': 'GBOL Registration'}
messages['email_reg_body'] = {'de': """Hallo {4} {2} {3},
ihr GBOL Konto {0} wurde erfolgreich von einem Koordinator freigegeben.
Sie können sich nun im dem Experten-Bereich anmelden.
Viele Grüße
Ihr GBOL Team
""", 'en': """Hello {4} {2} {3}
Your GBOL account has been approved by the coordinator.
You can now login into the expert area.
Best regards,
The GBOL Team
"""}
messages['email_reg_body_decline'] = {'de': """Hallo {4} {2} {3},
ihr GBOL Konto {0} wurde von einem Koordinator abgelehnt.
Sie können sich bei Fragen im Kontakt-Bereich von GBOL bei uns melden.
Viele Grüße
Ihr GBOL Team
""", 'en': """Hello {4} {2} {3}
Your GBOL account has been refused by the coordinator.
If you have any questions regarding the GBOL approval process, please send us a note in the contact area.
We will answer your inquiry as soon as possible.
Best regards,
The GBOL Team
"""}
messages['states'] = {'de': {'raw': 'Neu', 'cooking': 'in Arbeit', 'done': 'Fertig'},
'en': {'raw': 'New', 'cooking': 'in progress', 'done': 'Done'}}
messages['error'] = {'de': 'Keine Ergebnisse gefunden', 'en': 'Nothing found'}
messages['coord'] = {'de': 'Koordinaten (lat/lon)', 'en': 'Coordinates (lat/lon)'}
messages['taxon'] = {'de': 'Taxon', 'en': 'Higher taxon'}
messages['ncoll'] = {'en': 'Not Collected', 'de': 'Nicht gesammelt'}
messages['nbar'] = {'en': 'No Barcode', 'de': 'Kein Barcode'}
messages['barc'] = {'en': 'Barcode', 'de': 'Barcode'}
messages['pub_updated'] = {'en': 'Publication updated!', 'de': 'Publikation bearbeitet!'}
messages['pub_saved'] = {'en': 'Publication saved!', 'de': 'Publikation gespeichert!'}
messages['pub_error'] = {'en': 'Please enter title and content of the publications posting!',
'de': 'Bitte geben Sie Titel und Inhalt des neuen Publikationsbeitrages ein!'}
messages['mail_req_body'] = """Guten Tag {0},
eine Bestellung für Versandmaterial wurde auf dem GBOL-Portal abgesendet.
Gesendet am {1}
Bestellung:
Material: {2}
Anzahl Verpackungseinheiten: {3}
Taxonomische Gruppe: {4}
Nummer erstes Sammelröhrchen: {5}
Nummer letztes Sammelröhrchen: {6}
Absender:
{name}
{street}
{city}
{country}
Email: {email}
"""
# -- In case of an error, one of these messages is sent to the dev_group specified in production.ini.
# NOTE: this reassignment replaces the user-facing messages['error'] entry
# ('Keine Ergebnisse gefunden' / 'Nothing found') defined above; a separate
# key would avoid the clash.
messages['error'] = {}
messages['error']['order_processing'] = """
Eine Bestellung für Versandmaterial konnte nicht verarbeitet werden:
Bestellzeit: {1}
Koordinator (User-id): {0}
Möglicher Transaktions-Key: {9}
Bestellung:
Material: {2}
Anzahl Verpackungseinheiten: {3}
Taxonomische Gruppe (ID): {4}
Nummer erstes Sammelröhrchen: {5}
Nummer letztes Sammelröhrchen: {6}
Bestellt von:
User-ID: {7}
Name: {8}
Fehler:
{10}
"""
# datastore/core/test/test_basic.py (jbenet/datastore, MIT license)
import unittest
import logging

from ..basic import DictDatastore
from ..key import Key
from ..query import Query
class TestDatastore(unittest.TestCase):

    def subtest_simple(self, stores, numelems=1000):

        def checkLength(expected):
            # The original shadowed the builtin ``len`` with a parameter of the
            # same name, turning this check into a silent no-op; compare against
            # the expected count instead.  The TypeError guard remains for
            # stores that do not implement __len__.
            try:
                for sn in stores:
                    self.assertEqual(len(sn), expected)
            except TypeError:
                pass

        self.assertTrue(len(stores) > 0)
        pkey = Key('/dfadasfdsafdas/')
        checkLength(0)

        # ensure removing non-existent keys is ok.
        for value in range(0, numelems):
            key = pkey.child(value)
            for sn in stores:
                self.assertFalse(sn.contains(key))
                sn.delete(key)
                self.assertFalse(sn.contains(key))
        checkLength(0)

        # insert numelems elems
        for value in range(0, numelems):
            key = pkey.child(value)
            for sn in stores:
                self.assertFalse(sn.contains(key))
                sn.put(key, value)
                self.assertTrue(sn.contains(key))
                self.assertEqual(sn.get(key), value)

        # reassure they're all there.
        checkLength(numelems)
        for value in range(0, numelems):
            key = pkey.child(value)
            for sn in stores:
                self.assertTrue(sn.contains(key))
                self.assertEqual(sn.get(key), value)
        checkLength(numelems)

        k = pkey
        n = int(numelems)
        allitems = list(range(0, n))

        def test_query(query, slice):
            for sn in stores:
                try:
                    contents = list(sn.query(Query(pkey)))
                    expected = contents[slice]
                    result = list(sn.query(query))

                    # make sure everything is there.
                    self.assertTrue(len(contents) == len(allitems),
                                    '%s == %s' % (str(contents), str(allitems)))
                    self.assertTrue(all([val in contents for val in allitems]))
                    self.assertTrue(len(result) == len(expected),
                                    '%s == %s' % (str(result), str(expected)))
                    self.assertTrue(all([val in result for val in expected]))

                    # TODO: should order be preserved?
                    # self.assertEqual(result, expected)
                except NotImplementedError:
                    print('WARNING: %s does not implement query.' % sn)

        test_query(Query(k), slice(0, n))
        test_query(Query(k, limit=n), slice(0, n))
        test_query(Query(k, limit=n // 2), slice(0, n // 2))
        test_query(Query(k, offset=n // 2), slice(n // 2, n))
        test_query(Query(k, offset=n // 3, limit=n // 3), slice(n // 3, 2 * (n // 3)))
        del k
        del n

        # change numelems elems
        for value in range(0, numelems):
            key = pkey.child(value)
            for sn in stores:
                self.assertTrue(sn.contains(key))
                sn.put(key, value + 1)
                self.assertTrue(sn.contains(key))
                self.assertNotEqual(value, sn.get(key))
                self.assertEqual(value + 1, sn.get(key))
        checkLength(numelems)

        # remove numelems elems
        for value in range(0, numelems):
            key = pkey.child(value)
            for sn in stores:
                self.assertTrue(sn.contains(key))
                sn.delete(key)
                self.assertFalse(sn.contains(key))
        checkLength(0)
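`test_query` pairs `Query(offset=…, limit=…)` arguments with plain list slices; the expected relationship is ordinary Python slicing. A toy illustration of that pairing (`toy_query` is a stand-in, not the datastore API):

```python
# Stand-in for a datastore query: offset skips elements, limit caps the result.
def toy_query(items, offset=0, limit=None):
    end = None if limit is None else offset + limit
    return items[offset:end]

items = list(range(10))
assert toy_query(items, limit=5) == items[0:5]
assert toy_query(items, offset=5) == items[5:10]
assert toy_query(items, offset=3, limit=3) == items[3:6]
```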
class TestNullDatastore(unittest.TestCase):

    def test_null(self):
        from ..basic import NullDatastore
        s = NullDatastore()
        for c in range(1, 20):
            c = str(c)
            k = Key(c)
            self.assertFalse(s.contains(k))
            self.assertEqual(s.get(k), None)
            s.put(k, c)
            self.assertFalse(s.contains(k))
            self.assertEqual(s.get(k), None)
        for item in s.query(Query(Key('/'))):
            raise Exception('Should not have found anything.')
class TestDictionaryDatastore(TestDatastore):

    def test_dictionary(self):
        s1 = DictDatastore()
        s2 = DictDatastore()
        s3 = DictDatastore()
        stores = [s1, s2, s3]
        self.subtest_simple(stores)
class TestCacheShimDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import CacheShimDatastore
        from ..basic import NullDatastore

        class NullMinusQueryDatastore(NullDatastore):
            def query(self, query):
                raise NotImplementedError

        # make sure the cache is used
        s1 = CacheShimDatastore(NullMinusQueryDatastore(), cache=DictDatastore())

        # make sure the cache is not relied upon
        s2 = CacheShimDatastore(DictDatastore(), cache=NullDatastore())

        # make sure the cache works in tandem
        s3 = CacheShimDatastore(DictDatastore(), cache=DictDatastore())

        self.subtest_simple([s1, s2, s3])
class TestLoggingDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import LoggingDatastore

        class NullLogger(logging.getLoggerClass()):
            def debug(self, *args, **kwargs): pass
            def info(self, *args, **kwargs): pass
            def warning(self, *args, **kwargs): pass
            def error(self, *args, **kwargs): pass
            def critical(self, *args, **kwargs): pass

        s1 = LoggingDatastore(DictDatastore(), logger=NullLogger('null'))
        s2 = LoggingDatastore(DictDatastore())
        self.subtest_simple([s1, s2])
class TestKeyTransformDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import KeyTransformDatastore
        s1 = KeyTransformDatastore(DictDatastore())
        s2 = KeyTransformDatastore(DictDatastore())
        s3 = KeyTransformDatastore(DictDatastore())
        stores = [s1, s2, s3]
        self.subtest_simple(stores)

    def test_reverse_transform(self):
        from ..basic import KeyTransformDatastore

        def transform(key):
            return key.reverse

        ds = DictDatastore()
        kt = KeyTransformDatastore(ds, keytransform=transform)

        k1 = Key('/a/b/c')
        k2 = Key('/c/b/a')

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertFalse(kt.contains(k1))
        self.assertFalse(kt.contains(k2))

        ds.put(k1, 'abc')
        self.assertEqual(ds.get(k1), 'abc')
        self.assertFalse(ds.contains(k2))
        self.assertFalse(kt.contains(k1))
        self.assertEqual(kt.get(k2), 'abc')

        kt.put(k1, 'abc')
        self.assertEqual(ds.get(k1), 'abc')
        self.assertEqual(ds.get(k2), 'abc')
        self.assertEqual(kt.get(k1), 'abc')
        self.assertEqual(kt.get(k2), 'abc')

        ds.delete(k1)
        self.assertFalse(ds.contains(k1))
        self.assertEqual(ds.get(k2), 'abc')
        self.assertEqual(kt.get(k1), 'abc')
        self.assertFalse(kt.contains(k2))

        kt.delete(k1)
        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertFalse(kt.contains(k1))
        self.assertFalse(kt.contains(k2))

    def test_lowercase_transform(self):
        from ..basic import KeyTransformDatastore

        def transform(key):
            return Key(str(key).lower())

        ds = DictDatastore()
        lds = KeyTransformDatastore(ds, keytransform=transform)

        k1 = Key('hello')
        k2 = Key('HELLO')
        k3 = Key('HeLlo')

        ds.put(k1, 'world')
        ds.put(k2, 'WORLD')
        self.assertEqual(ds.get(k1), 'world')
        self.assertEqual(ds.get(k2), 'WORLD')
        self.assertFalse(ds.contains(k3))

        self.assertEqual(lds.get(k1), 'world')
        self.assertEqual(lds.get(k2), 'world')
        self.assertEqual(lds.get(k3), 'world')

        def test(key, val):
            lds.put(key, val)
            self.assertEqual(lds.get(k1), val)
            self.assertEqual(lds.get(k2), val)
            self.assertEqual(lds.get(k3), val)

        test(k1, 'a')
        test(k2, 'b')
        test(k3, 'c')
class TestLowercaseKeyDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import LowercaseKeyDatastore
        s1 = LowercaseKeyDatastore(DictDatastore())
        s2 = LowercaseKeyDatastore(DictDatastore())
        s3 = LowercaseKeyDatastore(DictDatastore())
        stores = [s1, s2, s3]
        self.subtest_simple(stores)

    def test_lowercase(self):
        from ..basic import LowercaseKeyDatastore

        ds = DictDatastore()
        lds = LowercaseKeyDatastore(ds)

        k1 = Key('hello')
        k2 = Key('HELLO')
        k3 = Key('HeLlo')

        ds.put(k1, 'world')
        ds.put(k2, 'WORLD')
        self.assertEqual(ds.get(k1), 'world')
        self.assertEqual(ds.get(k2), 'WORLD')
        self.assertFalse(ds.contains(k3))

        self.assertEqual(lds.get(k1), 'world')
        self.assertEqual(lds.get(k2), 'world')
        self.assertEqual(lds.get(k3), 'world')

        def test(key, val):
            lds.put(key, val)
            self.assertEqual(lds.get(k1), val)
            self.assertEqual(lds.get(k2), val)
            self.assertEqual(lds.get(k3), val)

        test(k1, 'a')
        test(k2, 'b')
        test(k3, 'c')
class TestNamespaceDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import NamespaceDatastore
        s1 = NamespaceDatastore(Key('a'), DictDatastore())
        s2 = NamespaceDatastore(Key('b'), DictDatastore())
        s3 = NamespaceDatastore(Key('c'), DictDatastore())
        stores = [s1, s2, s3]
        self.subtest_simple(stores)

    def test_namespace(self):
        from ..basic import NamespaceDatastore

        k1 = Key('/c/d')
        k2 = Key('/a/b')
        k3 = Key('/a/b/c/d')

        ds = DictDatastore()
        nd = NamespaceDatastore(k2, ds)

        ds.put(k1, 'cd')
        ds.put(k3, 'abcd')
        self.assertEqual(ds.get(k1), 'cd')
        self.assertFalse(ds.contains(k2))
        self.assertEqual(ds.get(k3), 'abcd')

        self.assertEqual(nd.get(k1), 'abcd')
        self.assertFalse(nd.contains(k2))
        self.assertFalse(nd.contains(k3))

        def test(key, val):
            nd.put(key, val)
            self.assertEqual(nd.get(key), val)
            self.assertFalse(ds.contains(key))
            self.assertFalse(nd.contains(k2.child(key)))
            self.assertEqual(ds.get(k2.child(key)), val)

        for i in range(0, 10):
            test(Key(str(i)), 'val%d' % i)
class TestNestedPathDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import NestedPathDatastore
        s1 = NestedPathDatastore(DictDatastore())
        s2 = NestedPathDatastore(DictDatastore(), depth=2)
        s3 = NestedPathDatastore(DictDatastore(), length=2)
        s4 = NestedPathDatastore(DictDatastore(), length=1, depth=2)
        stores = [s1, s2, s3, s4]
        self.subtest_simple(stores)

    def test_nested_path(self):
        from ..basic import NestedPathDatastore
        nested_path = NestedPathDatastore.nestedPath

        def test(depth, length, expected):
            nested = nested_path('abcdefghijk', depth, length)
            self.assertEqual(nested, expected)

        test(3, 2, 'ab/cd/ef')
        test(4, 2, 'ab/cd/ef/gh')
        test(3, 4, 'abcd/efgh/ijk')
        test(1, 4, 'abcd')
        test(3, 10, 'abcdefghij/k')
    def subtest_nested_path_ds(self, **kwargs):
        from ..basic import NestedPathDatastore

        k1 = kwargs.pop('k1')
        k2 = kwargs.pop('k2')
        k3 = kwargs.pop('k3')
        k4 = kwargs.pop('k4')

        ds = DictDatastore()
        np = NestedPathDatastore(ds, **kwargs)

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertFalse(ds.contains(k3))
        self.assertFalse(ds.contains(k4))

        self.assertFalse(np.contains(k1))
        self.assertFalse(np.contains(k2))
        self.assertFalse(np.contains(k3))
        self.assertFalse(np.contains(k4))

        np.put(k1, k1)
        np.put(k2, k2)

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertTrue(ds.contains(k3))
        self.assertTrue(ds.contains(k4))

        self.assertTrue(np.contains(k1))
        self.assertTrue(np.contains(k2))
        self.assertFalse(np.contains(k3))
        self.assertFalse(np.contains(k4))

        self.assertEqual(np.get(k1), k1)
        self.assertEqual(np.get(k2), k2)
        self.assertEqual(ds.get(k3), k1)
        self.assertEqual(ds.get(k4), k2)

        np.delete(k1)
        np.delete(k2)

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertFalse(ds.contains(k3))
        self.assertFalse(ds.contains(k4))

        self.assertFalse(np.contains(k1))
        self.assertFalse(np.contains(k2))
        self.assertFalse(np.contains(k3))
        self.assertFalse(np.contains(k4))

        ds.put(k3, k1)
        ds.put(k4, k2)

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertTrue(ds.contains(k3))
        self.assertTrue(ds.contains(k4))

        self.assertTrue(np.contains(k1))
        self.assertTrue(np.contains(k2))
        self.assertFalse(np.contains(k3))
        self.assertFalse(np.contains(k4))

        self.assertEqual(np.get(k1), k1)
        self.assertEqual(np.get(k2), k2)
        self.assertEqual(ds.get(k3), k1)
        self.assertEqual(ds.get(k4), k2)

        ds.delete(k3)
        ds.delete(k4)

        self.assertFalse(ds.contains(k1))
        self.assertFalse(ds.contains(k2))
        self.assertFalse(ds.contains(k3))
        self.assertFalse(ds.contains(k4))

        self.assertFalse(np.contains(k1))
        self.assertFalse(np.contains(k2))
        self.assertFalse(np.contains(k3))
        self.assertFalse(np.contains(k4))

    def test_3_2(self):
        opts = {}
        opts['k1'] = Key('/abcdefghijk')
        opts['k2'] = Key('/abcdefghijki')
        opts['k3'] = Key('/ab/cd/ef/abcdefghijk')
        opts['k4'] = Key('/ab/cd/ef/abcdefghijki')
        opts['depth'] = 3
        opts['length'] = 2
        self.subtest_nested_path_ds(**opts)

    def test_5_3(self):
        opts = {}
        opts['k1'] = Key('/abcdefghijk')
        opts['k2'] = Key('/abcdefghijki')
        opts['k3'] = Key('/abc/def/ghi/jka/bcd/abcdefghijk')
        opts['k4'] = Key('/abc/def/ghi/jki/abc/abcdefghijki')
        opts['depth'] = 5
        opts['length'] = 3
        self.subtest_nested_path_ds(**opts)

    def test_keyfn(self):
        opts = {}
        opts['k1'] = Key('/abcdefghijk')
        opts['k2'] = Key('/abcdefghijki')
        opts['k3'] = Key('/kj/ih/gf/abcdefghijk')
        opts['k4'] = Key('/ik/ji/hg/abcdefghijki')
        opts['depth'] = 3
        opts['length'] = 2
        opts['keyfn'] = lambda key: key.name[::-1]
        self.subtest_nested_path_ds(**opts)
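The expectations in `test_nested_path` imply that `nestedPath` splits the leading characters of a key name into `depth` components of `length` characters each and joins them with `'/'`. A hedged re-implementation inferred from those assertions, not taken from the library source:

```python
# Inferred from the test expectations, e.g. ('abcdefghijk', 3, 2) -> 'ab/cd/ef'
# and ('abcdefghijk', 3, 10) -> 'abcdefghij/k'.
def toy_nested_path(s, depth, length):
    parts = (s[i * length:(i + 1) * length] for i in range(depth))
    return '/'.join(p for p in parts if p)  # drop empty tail components

assert toy_nested_path('abcdefghijk', 3, 2) == 'ab/cd/ef'
assert toy_nested_path('abcdefghijk', 3, 10) == 'abcdefghij/k'
assert toy_nested_path('abcdefghijk', 1, 4) == 'abcd'
```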
class TestSymlinkDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import SymlinkDatastore
        s1 = SymlinkDatastore(DictDatastore())
        s2 = SymlinkDatastore(DictDatastore())
        s3 = SymlinkDatastore(DictDatastore())
        s4 = SymlinkDatastore(DictDatastore())
        stores = [s1, s2, s3, s4]
        self.subtest_simple(stores)

    def test_symlink_basic(self):
        from ..basic import SymlinkDatastore

        dds = DictDatastore()
        sds = SymlinkDatastore(dds)

        a = Key('/A')
        b = Key('/B')

        sds.put(a, 1)
        self.assertEqual(sds.get(a), 1)
        self.assertEqual(sds.get(b), None)
        self.assertNotEqual(sds.get(b), sds.get(a))

        sds.link(a, b)
        self.assertEqual(sds.get(a), 1)
        self.assertEqual(sds.get(b), 1)
        self.assertEqual(sds.get(a), sds.get(b))

        sds.put(b, 2)
        self.assertEqual(sds.get(a), 2)
        self.assertEqual(sds.get(b), 2)
        self.assertEqual(sds.get(a), sds.get(b))

        sds.delete(a)
        self.assertEqual(sds.get(a), None)
        self.assertEqual(sds.get(b), None)
        self.assertEqual(sds.get(b), sds.get(a))

        sds.put(a, 3)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), 3)
        self.assertEqual(sds.get(b), sds.get(a))

        sds.delete(b)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), None)
        self.assertNotEqual(sds.get(b), sds.get(a))

    def test_symlink_internals(self):
        from ..basic import SymlinkDatastore

        dds = DictDatastore()
        sds = SymlinkDatastore(dds)

        a = Key('/A')
        b = Key('/B')
        c = Key('/C')
        d = Key('/D')

        lva = sds._link_value_for_key(a)
        lvb = sds._link_value_for_key(b)
        lvc = sds._link_value_for_key(c)
        lvd = sds._link_value_for_key(d)

        # helper to check queries
        sds_query = lambda: list(sds.query(Query(Key('/'))))
        dds_query = lambda: list(dds.query(Query(Key('/'))))

        # ensure _link_value_for_key and _link_for_value work
        self.assertEqual(lva, str(a.child(sds.sentinel)))
        self.assertEqual(a, sds._link_for_value(lva))

        # adding a value should work like usual
        sds.put(a, 1)
        self.assertEqual(sds.get(a), 1)
        self.assertEqual(sds.get(b), None)
        self.assertNotEqual(sds.get(b), sds.get(a))

        self.assertEqual(dds.get(a), 1)
        self.assertEqual(dds.get(b), None)

        self.assertEqual(sds_query(), [1])
        self.assertEqual(dds_query(), [1])

        # _follow_link(sds._link_value_for_key(a)) should == get(a)
        self.assertEqual(sds._follow_link(lva), 1)
        self.assertEqual(list(sds._follow_link_gen([lva])), [1])

        # linking keys should work
        sds.link(a, b)
        self.assertEqual(sds.get(a), 1)
        self.assertEqual(sds.get(b), 1)
        self.assertEqual(sds.get(a), sds.get(b))

        self.assertEqual(dds.get(a), 1)
        self.assertEqual(dds.get(b), lva)

        self.assertEqual(sds_query(), [1, 1])
        self.assertEqual(dds_query(), [1, lva])

        # changing link should affect source
        sds.put(b, 2)
        self.assertEqual(sds.get(a), 2)
        self.assertEqual(sds.get(b), 2)
        self.assertEqual(sds.get(a), sds.get(b))

        self.assertEqual(dds.get(a), 2)
        self.assertEqual(dds.get(b), lva)

        self.assertEqual(sds_query(), [2, 2])
        self.assertEqual(dds_query(), [2, lva])

        # deleting source should affect link
        sds.delete(a)
        self.assertEqual(sds.get(a), None)
        self.assertEqual(sds.get(b), None)
        self.assertEqual(sds.get(b), sds.get(a))

        self.assertEqual(dds.get(a), None)
        self.assertEqual(dds.get(b), lva)

        self.assertEqual(sds_query(), [None])
        self.assertEqual(dds_query(), [lva])

        # putting back source should yield working link
        sds.put(a, 3)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), 3)
        self.assertEqual(sds.get(b), sds.get(a))

        self.assertEqual(dds.get(a), 3)
        self.assertEqual(dds.get(b), lva)

        self.assertEqual(sds_query(), [3, 3])
        self.assertEqual(dds_query(), [3, lva])

        # deleting link should not affect source
        sds.delete(b)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), None)
        self.assertNotEqual(sds.get(b), sds.get(a))

        self.assertEqual(dds.get(a), 3)
        self.assertEqual(dds.get(b), None)

        self.assertEqual(sds_query(), [3])
        self.assertEqual(dds_query(), [3])

        # linking should bring back to normal
        sds.link(a, b)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), 3)
        self.assertEqual(sds.get(b), sds.get(a))

        self.assertEqual(dds.get(a), 3)
        self.assertEqual(dds.get(b), lva)

        self.assertEqual(sds_query(), [3, 3])
        self.assertEqual(dds_query(), [3, lva])

        # Adding another link should not affect things.
        sds.link(a, c)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), 3)
        self.assertEqual(sds.get(c), 3)
        self.assertEqual(sds.get(a), sds.get(b))
        self.assertEqual(sds.get(a), sds.get(c))

        self.assertEqual(dds.get(a), 3)
        self.assertEqual(dds.get(b), lva)
        self.assertEqual(dds.get(c), lva)

        self.assertEqual(sds_query(), [3, 3, 3])
        self.assertEqual(dds_query(), [3, lva, lva])

        # linking should be transitive
        sds.link(b, c)
        sds.link(c, d)
        self.assertEqual(sds.get(a), 3)
        self.assertEqual(sds.get(b), 3)
        self.assertEqual(sds.get(c), 3)
        self.assertEqual(sds.get(d), 3)
        self.assertEqual(sds.get(a), sds.get(b))
        self.assertEqual(sds.get(a), sds.get(c))
        self.assertEqual(sds.get(a), sds.get(d))

        self.assertEqual(dds.get(a), 3)
        self.assertEqual(dds.get(b), lva)
        self.assertEqual(dds.get(c), lvb)
        self.assertEqual(dds.get(d), lvc)

        self.assertEqual(sds_query(), [3, 3, 3, 3])
        self.assertEqual(set(dds_query()), set([3, lva, lvb, lvc]))

        self.assertRaises(AssertionError, sds.link, d, a)

    def test_symlink_recursive(self):
        from ..basic import SymlinkDatastore

        dds = DictDatastore()
        sds1 = SymlinkDatastore(dds)
        sds2 = SymlinkDatastore(sds1)

        a = Key('/A')
        b = Key('/B')

        sds2.put(a, 1)
        self.assertEqual(sds2.get(a), 1)
        self.assertEqual(sds2.get(b), None)
        self.assertNotEqual(sds2.get(b), sds2.get(a))

        sds2.link(a, b)
        self.assertEqual(sds2.get(a), 1)
        self.assertEqual(sds2.get(b), 1)
        self.assertEqual(sds2.get(a), sds2.get(b))
        self.assertEqual(sds1.get(a), sds1.get(b))

        sds2.link(a, b)
        self.assertEqual(sds2.get(a), 1)
        self.assertEqual(sds2.get(b), 1)
        self.assertEqual(sds2.get(a), sds2.get(b))
        self.assertEqual(sds1.get(a), sds1.get(b))

        sds2.link(a, b)
        self.assertEqual(sds2.get(a), 1)
        self.assertEqual(sds2.get(b), 1)
        self.assertEqual(sds2.get(a), sds2.get(b))
        self.assertEqual(sds1.get(a), sds1.get(b))

        sds2.put(b, 2)
        self.assertEqual(sds2.get(a), 2)
        self.assertEqual(sds2.get(b), 2)
        self.assertEqual(sds2.get(a), sds2.get(b))
        self.assertEqual(sds1.get(a), sds1.get(b))

        sds2.delete(a)
        self.assertEqual(sds2.get(a), None)
        self.assertEqual(sds2.get(b), None)
        self.assertEqual(sds2.get(b), sds2.get(a))

        sds2.put(a, 3)
        self.assertEqual(sds2.get(a), 3)
        self.assertEqual(sds2.get(b), 3)
        self.assertEqual(sds2.get(b), sds2.get(a))

        sds2.delete(b)
        self.assertEqual(sds2.get(a), 3)
        self.assertEqual(sds2.get(b), None)
        self.assertNotEqual(sds2.get(b), sds2.get(a))
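The symlink tests rely on a link value being stored at the alias key and dereferenced on `get`. A dict-based toy sketch of that dereferencing idea; the `'->'` sentinel format is invented, not the library's:

```python
_SENTINEL = '->'  # invented marker; SymlinkDatastore uses its own sentinel

def toy_link(store, source, target):
    store[target] = _SENTINEL + source  # store a pointer at the alias key

def toy_get(store, key):
    value = store.get(key)
    # follow pointer chains until a real value (or None) is reached
    while isinstance(value, str) and value.startswith(_SENTINEL):
        value = store.get(value[len(_SENTINEL):])
    return value

s = {'/A': 1}
toy_link(s, '/A', '/B')
assert toy_get(s, '/B') == 1
s['/A'] = 2                     # updating the source is seen via the alias
assert toy_get(s, '/B') == 2
```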
class TestDirectoryDatastore(TestDatastore):

    def test_simple(self):
        from ..basic import DirectoryDatastore
        s1 = DirectoryDatastore(DictDatastore())
        s2 = DirectoryDatastore(DictDatastore())
        self.subtest_simple([s1, s2])
class TestDatastoreCollection(TestDatastore):

    def test_tiered(self):
        from ..basic import TieredDatastore

        s1 = DictDatastore()
        s2 = DictDatastore()
        s3 = DictDatastore()
        ts = TieredDatastore([s1, s2, s3])

        k1 = Key('1')
        k2 = Key('2')
        k3 = Key('3')

        s1.put(k1, '1')
        s2.put(k2, '2')
        s3.put(k3, '3')

        self.assertTrue(s1.contains(k1))
        self.assertFalse(s2.contains(k1))
        self.assertFalse(s3.contains(k1))
        self.assertTrue(ts.contains(k1))

        self.assertEqual(ts.get(k1), '1')
        self.assertEqual(s1.get(k1), '1')
        self.assertFalse(s2.contains(k1))
        self.assertFalse(s3.contains(k1))

        self.assertFalse(s1.contains(k2))
        self.assertTrue(s2.contains(k2))
        self.assertFalse(s3.contains(k2))
        self.assertTrue(ts.contains(k2))

        self.assertEqual(s2.get(k2), '2')
        self.assertFalse(s1.contains(k2))
        self.assertFalse(s3.contains(k2))

        self.assertEqual(ts.get(k2), '2')
        self.assertEqual(s1.get(k2), '2')
        self.assertEqual(s2.get(k2), '2')
        self.assertFalse(s3.contains(k2))

        self.assertFalse(s1.contains(k3))
        self.assertFalse(s2.contains(k3))
        self.assertTrue(s3.contains(k3))
        self.assertTrue(ts.contains(k3))

        self.assertEqual(s3.get(k3), '3')
        self.assertFalse(s1.contains(k3))
        self.assertFalse(s2.contains(k3))

        self.assertEqual(ts.get(k3), '3')
        self.assertEqual(s1.get(k3), '3')
        self.assertEqual(s2.get(k3), '3')
        self.assertEqual(s3.get(k3), '3')

        ts.delete(k1)
        ts.delete(k2)
        ts.delete(k3)

        self.assertFalse(ts.contains(k1))
        self.assertFalse(ts.contains(k2))
        self.assertFalse(ts.contains(k3))

        self.subtest_simple([ts])

    def test_sharded(self, numelems=1000):
        from ..basic import ShardedDatastore

        s1 = DictDatastore()
        s2 = DictDatastore()
        s3 = DictDatastore()
        s4 = DictDatastore()
        s5 = DictDatastore()
        stores = [s1, s2, s3, s4, s5]

        # integer division so the sharding function yields a usable list index
        hash = lambda key: int(key.name) * len(stores) // numelems
        sharded = ShardedDatastore(stores, shardingfn=hash)
        sumlens = lambda stores: sum(len(s) for s in stores)

        def checkFor(key, value, sharded, shard=None):
            correct_shard = sharded._stores[hash(key) % len(sharded._stores)]

            for s in sharded._stores:
                if shard and s == shard:
                    self.assertTrue(s.contains(key))
                    self.assertEqual(s.get(key), value)
                else:
                    self.assertFalse(s.contains(key))

            if correct_shard == shard:
                self.assertTrue(sharded.contains(key))
                self.assertEqual(sharded.get(key), value)
            else:
                self.assertFalse(sharded.contains(key))

        self.assertEqual(sumlens(stores), 0)

        # test all correct.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded)
            shard.put(key, value)
            checkFor(key, value, sharded, shard)
        self.assertEqual(sumlens(stores), numelems)

        # ensure it's in the same spots.
        # (the original iterated over an unused `i` here, reusing the stale
        # `value` from the previous loop; iterate over `value` as intended.)
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded, shard)
            shard.put(key, value)
            checkFor(key, value, sharded, shard)
        self.assertEqual(sumlens(stores), numelems)

        # ensure it's in the same spots.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded, shard)
            sharded.put(key, value)
            checkFor(key, value, sharded, shard)
        self.assertEqual(sumlens(stores), numelems)

        # delete them all, alternating between deleting directly from the
        # shard and deleting through the sharded view.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded, shard)
            if value % 2 == 0:
                shard.delete(key)
            else:
                sharded.delete(key)
            checkFor(key, value, sharded)
        self.assertEqual(sumlens(stores), 0)

        # try out adding it to the wrong shards.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            incorrect_shard = stores[(hash(key) + 1) % len(stores)]
            checkFor(key, value, sharded)
            incorrect_shard.put(key, value)
            checkFor(key, value, sharded, incorrect_shard)
        self.assertEqual(sumlens(stores), numelems)

        # ensure it's in the same spots.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            incorrect_shard = stores[(hash(key) + 1) % len(stores)]
            checkFor(key, value, sharded, incorrect_shard)
            incorrect_shard.put(key, value)
            checkFor(key, value, sharded, incorrect_shard)
        self.assertEqual(sumlens(stores), numelems)

        # this won't do anything: the sharded view deletes from the correct
        # shard, but the value lives in the incorrect one.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            incorrect_shard = stores[(hash(key) + 1) % len(stores)]
            checkFor(key, value, sharded, incorrect_shard)
            sharded.delete(key)
            checkFor(key, value, sharded, incorrect_shard)
        self.assertEqual(sumlens(stores), numelems)

        # this will place it correctly.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            incorrect_shard = stores[(hash(key) + 1) % len(stores)]
            correct_shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded, incorrect_shard)
            sharded.put(key, value)
            incorrect_shard.delete(key)
            checkFor(key, value, sharded, correct_shard)
        self.assertEqual(sumlens(stores), numelems)

        # this will remove it correctly.
        for value in range(0, numelems):
            key = Key('/fdasfdfdsafdsafdsa/%d' % value)
            correct_shard = stores[hash(key) % len(stores)]
            checkFor(key, value, sharded, correct_shard)
            sharded.delete(key)
            checkFor(key, value, sharded)
        self.assertEqual(sumlens(stores), 0)

        self.subtest_simple([sharded])
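`test_sharded` depends on `shardingfn` mapping each key to a stable store index. The routing idea can be sketched independently of the real `ShardedDatastore` (toy code; all names are invented):

```python
class ToyShards:
    """Toy key router: only the routing idea, not the real ShardedDatastore."""

    def __init__(self, stores, shardingfn):
        self._stores = stores
        self._fn = shardingfn

    def _shard(self, key):
        # stable index: the same key always routes to the same backing store
        return self._stores[self._fn(key) % len(self._stores)]

    def put(self, key, value):
        self._shard(key)[key] = value

    def get(self, key):
        return self._shard(key).get(key)

backing = [{} for _ in range(3)]
router = ToyShards(backing, shardingfn=hash)
router.put('a', 1)
assert router.get('a') == 1
assert sum(len(s) for s in backing) == 1
```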
if __name__ == '__main__':
    unittest.main()
#!/usr/bin/env python3
# entry.py (Allenyou1126/allenyou-acme.sh, Apache-2.0 license)
import json

from allenyoucert import Cert


def main():
    cert_list = []  # certificates to be collected
    # NOTE: the original line ``a = json()`` raises TypeError because the json
    # module itself is not callable; json.load()/json.loads() was presumably
    # intended, but the data source is not shown, so no call is made here.


if __name__ == '__main__':
    main()
# users/forms.py (iurykrieger96/alura-django, MIT license)
from django import forms
from django.contrib.auth.models import User
from django.forms.utils import ErrorList


class UserForm(forms.Form):
    name = forms.CharField(required=True)
    email = forms.EmailField(required=True)
    password = forms.CharField(required=True)
    phone = forms.CharField(required=True)
    name_company = forms.CharField(required=True)

    def is_valid(self):
        valid = True
        if not super(UserForm, self).is_valid():
            self.add_form_error('Por favor, verifique os dados informados')
            valid = False
        user_exists = User.objects.filter(username=self.data['name']).exists()
        if user_exists:
            self.add_form_error('Usuario ja existente')
            valid = False
        return valid

    def add_form_error(self, message):
        # Renamed from the original typo ``add_erro``; a distinct name is kept
        # on purpose, since Django's Form already defines add_error(field, error)
        # with a different signature.
        self._errors.setdefault(forms.forms.NON_FIELD_ERRORS, ErrorList()).append(message)
# examples/multiple_deserializers.py (klauer/apischema, MIT license)
from dataclasses import dataclass
from apischema import deserialize, deserializer
from apischema.json_schema import deserialization_schema


@dataclass
class Expression:
    value: int


@deserializer
def evaluate_expression(expr: str) -> Expression:
    return Expression(int(eval(expr)))


# Could be shortened into deserializer(Expression), because a class is callable too
@deserializer
def expression_from_value(value: int) -> Expression:
    return Expression(value)


assert deserialization_schema(Expression) == {
    "$schema": "http://json-schema.org/draft/2019-09/schema#",
    "type": ["string", "integer"],
}
assert deserialize(Expression, 0) == deserialize(Expression, "1 - 1") == Expression(0)
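With two deserializers registered, apischema dispatches on the JSON type: strings go through `evaluate_expression` and integers through `expression_from_value`. The dispatch idea can be sketched without apischema (toy code, not the library's mechanism; `ToyExpression` is an invented name):

```python
from dataclasses import dataclass

@dataclass
class ToyExpression:
    value: int

def to_expression(data):
    # mirrors the two registered deserializers: strings are evaluated,
    # integers are wrapped directly
    if isinstance(data, str):
        return ToyExpression(int(eval(data)))
    if isinstance(data, int):
        return ToyExpression(data)
    raise TypeError('unsupported input type: %r' % type(data))

assert to_expression('1 - 1') == ToyExpression(0)
assert to_expression(5) == ToyExpression(5)
```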
# iqoptionapi/country_id.py (mustx1/MYIQ, MIT license)
ID = {"Worldwide": 0,
"AF": 1,
"AL": 2,
"DZ": 3,
"AD": 5,
"AO": 6,
"AI": 7,
"AG": 9,
"AR": 10,
"AM": 11,
"AW": 12,
"AT": 14,
"AZ": 15,
"BS": 16,
"BH": 17,
"BD": 18,
"BB": 19,
"BY": 20,
"BZ": 22,
"BJ": 23,
"BM": 24,
"BO": 26,
"BA": 27,
"BW": 28,
"BV": 29,
"BR": 30,
"BN": 31,
"BG": 32,
"BF": 33,
"BI": 34,
"KH": 35,
"CM": 36,
"CV": 38,
"KY": 39,
"TD": 41,
"CL": 42,
"CN": 43,
"CC": 45,
"CO": 46,
"KM": 47,
"CG": 48,
"CK": 49,
"CR": 50,
"CI": 51,
"HR": 52,
"CU": 53,
"CY": 54,
"CZ": 55,
"DK": 56,
"DJ": 57,
"DM": 58,
"DO": 59,
"TL": 60,
"EC": 61,
"EG": 62,
"SV": 63,
"EE": 66,
"ET": 67,
"FO": 69,
"FJ": 70,
"FI": 71,
"FR": 72,
"GF": 73,
"PF": 74,
"GA": 75,
"GM": 76,
"GE": 77,
"DE": 78,
"GH": 79,
"GR": 81,
"GD": 83,
"GP": 84,
"GT": 86,
"GN": 87,
"GY": 88,
"HT": 89,
"HN": 90,
"HK": 91,
"HU": 92,
"IS": 93,
"ID": 94,
"IQ": 95,
"IE": 96,
"IT": 97,
"JM": 98,
"JO": 100,
"KZ": 101,
"KE": 102,
"KI": 103,
"KW": 104,
"KG": 105,
"LA": 106,
"LV": 107,
"LB": 108,
"LS": 109,
"LR": 110,
"LY": 111,
"LT": 113,
"LU": 114,
"MO": 115,
"MK": 116,
"MG": 117,
"MW": 118,
"MY": 119,
"MV": 120,
"ML": 121,
"MT": 122,
"MQ": 124,
"MR": 125,
"MU": 126,
"MX": 128,
"FM": 129,
"MD": 130,
"MC": 131,
"MN": 132,
"MA": 134,
"MZ": 135,
"MM": 136,
"NA": 137,
"NP": 139,
"NL": 140,
"AN": 141,
"NC": 142,
"NZ": 143,
"NI": 144,
"NE": 145,
"NG": 146,
"NO": 149,
"OM": 150,
"PK": 151,
"PW": 152,
"PA": 153,
"PG": 154,
"PY": 155,
"PE": 156,
"PH": 157,
"PL": 159,
"PT": 160,
"QA": 162,
"RE": 163,
"RO": 164,
"RW": 166,
"KN": 167,
"LC": 168,
"SA": 171,
"SN": 172,
"SC": 173,
"SG": 175,
"SK": 176,
"SI": 177,
"SO": 179,
"ZA": 180,
"KR": 181,
"ES": 182,
"LK": 183,
"SH": 184,
"SR": 186,
"SZ": 187,
"SE": 188,
"CH": 189,
"TW": 191,
"TJ": 192,
"TZ": 193,
"TH": 194,
"TG": 195,
"TT": 198,
"TN": 199,
"TR": 200,
"TM": 201,
"UG": 203,
"UA": 204,
"AE": 205,
"GB": 206,
"UY": 207,
"UZ": 208,
"VE": 211,
"VN": 212,
"VG": 213,
"YE": 216,
"ZM": 218,
"ZW": 219,
"RS": 220,
"ME": 221,
"IN": 225,
"TC": 234,
"CD": 235,
"GG": 236,
"IM": 237,
"JE": 239,
"CW": 246, }
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function
INCLUDES = """
#include <openssl/engine.h>
"""
TYPES = """
static const long Cryptography_HAS_ENGINE_CRYPTODEV;
typedef ... ENGINE;
typedef ... RSA_METHOD;
typedef ... DSA_METHOD;
typedef ... ECDH_METHOD;
typedef ... ECDSA_METHOD;
typedef ... DH_METHOD;
typedef struct {
void (*seed)(const void *, int);
int (*bytes)(unsigned char *, int);
void (*cleanup)();
void (*add)(const void *, int, double);
int (*pseudorand)(unsigned char *, int);
int (*status)();
} RAND_METHOD;
typedef ... STORE_METHOD;
typedef int (*ENGINE_GEN_INT_FUNC_PTR)(ENGINE *);
typedef ... *ENGINE_CTRL_FUNC_PTR;
typedef ... *ENGINE_LOAD_KEY_PTR;
typedef ... *ENGINE_CIPHERS_PTR;
typedef ... *ENGINE_DIGESTS_PTR;
typedef ... ENGINE_CMD_DEFN;
typedef ... UI_METHOD;
static const unsigned int ENGINE_METHOD_RSA;
static const unsigned int ENGINE_METHOD_DSA;
static const unsigned int ENGINE_METHOD_RAND;
static const unsigned int ENGINE_METHOD_ECDH;
static const unsigned int ENGINE_METHOD_ECDSA;
static const unsigned int ENGINE_METHOD_CIPHERS;
static const unsigned int ENGINE_METHOD_DIGESTS;
static const unsigned int ENGINE_METHOD_STORE;
static const unsigned int ENGINE_METHOD_ALL;
static const unsigned int ENGINE_METHOD_NONE;
static const int ENGINE_R_CONFLICTING_ENGINE_ID;
"""
FUNCTIONS = """
ENGINE *ENGINE_get_first(void);
ENGINE *ENGINE_get_last(void);
ENGINE *ENGINE_get_next(ENGINE *);
ENGINE *ENGINE_get_prev(ENGINE *);
int ENGINE_add(ENGINE *);
int ENGINE_remove(ENGINE *);
ENGINE *ENGINE_by_id(const char *);
int ENGINE_init(ENGINE *);
int ENGINE_finish(ENGINE *);
void ENGINE_load_openssl(void);
void ENGINE_load_dynamic(void);
void ENGINE_load_builtin_engines(void);
void ENGINE_cleanup(void);
ENGINE *ENGINE_get_default_RSA(void);
ENGINE *ENGINE_get_default_DSA(void);
ENGINE *ENGINE_get_default_ECDH(void);
ENGINE *ENGINE_get_default_ECDSA(void);
ENGINE *ENGINE_get_default_DH(void);
ENGINE *ENGINE_get_default_RAND(void);
ENGINE *ENGINE_get_cipher_engine(int);
ENGINE *ENGINE_get_digest_engine(int);
int ENGINE_set_default_RSA(ENGINE *);
int ENGINE_set_default_DSA(ENGINE *);
int ENGINE_set_default_ECDH(ENGINE *);
int ENGINE_set_default_ECDSA(ENGINE *);
int ENGINE_set_default_DH(ENGINE *);
int ENGINE_set_default_RAND(ENGINE *);
int ENGINE_set_default_ciphers(ENGINE *);
int ENGINE_set_default_digests(ENGINE *);
int ENGINE_set_default_string(ENGINE *, const char *);
int ENGINE_set_default(ENGINE *, unsigned int);
unsigned int ENGINE_get_table_flags(void);
void ENGINE_set_table_flags(unsigned int);
int ENGINE_register_RSA(ENGINE *);
void ENGINE_unregister_RSA(ENGINE *);
void ENGINE_register_all_RSA(void);
int ENGINE_register_DSA(ENGINE *);
void ENGINE_unregister_DSA(ENGINE *);
void ENGINE_register_all_DSA(void);
int ENGINE_register_ECDH(ENGINE *);
void ENGINE_unregister_ECDH(ENGINE *);
void ENGINE_register_all_ECDH(void);
int ENGINE_register_ECDSA(ENGINE *);
void ENGINE_unregister_ECDSA(ENGINE *);
void ENGINE_register_all_ECDSA(void);
int ENGINE_register_DH(ENGINE *);
void ENGINE_unregister_DH(ENGINE *);
void ENGINE_register_all_DH(void);
int ENGINE_register_RAND(ENGINE *);
void ENGINE_unregister_RAND(ENGINE *);
void ENGINE_register_all_RAND(void);
int ENGINE_register_STORE(ENGINE *);
void ENGINE_unregister_STORE(ENGINE *);
void ENGINE_register_all_STORE(void);
int ENGINE_register_ciphers(ENGINE *);
void ENGINE_unregister_ciphers(ENGINE *);
void ENGINE_register_all_ciphers(void);
int ENGINE_register_digests(ENGINE *);
void ENGINE_unregister_digests(ENGINE *);
void ENGINE_register_all_digests(void);
int ENGINE_register_complete(ENGINE *);
int ENGINE_register_all_complete(void);
int ENGINE_ctrl(ENGINE *, int, long, void *, void (*)(void));
int ENGINE_cmd_is_executable(ENGINE *, int);
int ENGINE_ctrl_cmd(ENGINE *, const char *, long, void *, void (*)(void), int);
int ENGINE_ctrl_cmd_string(ENGINE *, const char *, const char *, int);
ENGINE *ENGINE_new(void);
int ENGINE_free(ENGINE *);
int ENGINE_up_ref(ENGINE *);
int ENGINE_set_id(ENGINE *, const char *);
int ENGINE_set_name(ENGINE *, const char *);
int ENGINE_set_RSA(ENGINE *, const RSA_METHOD *);
int ENGINE_set_DSA(ENGINE *, const DSA_METHOD *);
int ENGINE_set_ECDH(ENGINE *, const ECDH_METHOD *);
int ENGINE_set_ECDSA(ENGINE *, const ECDSA_METHOD *);
int ENGINE_set_DH(ENGINE *, const DH_METHOD *);
int ENGINE_set_RAND(ENGINE *, const RAND_METHOD *);
int ENGINE_set_STORE(ENGINE *, const STORE_METHOD *);
int ENGINE_set_destroy_function(ENGINE *, ENGINE_GEN_INT_FUNC_PTR);
int ENGINE_set_init_function(ENGINE *, ENGINE_GEN_INT_FUNC_PTR);
int ENGINE_set_finish_function(ENGINE *, ENGINE_GEN_INT_FUNC_PTR);
int ENGINE_set_ctrl_function(ENGINE *, ENGINE_CTRL_FUNC_PTR);
int ENGINE_set_load_privkey_function(ENGINE *, ENGINE_LOAD_KEY_PTR);
int ENGINE_set_load_pubkey_function(ENGINE *, ENGINE_LOAD_KEY_PTR);
int ENGINE_set_ciphers(ENGINE *, ENGINE_CIPHERS_PTR);
int ENGINE_set_digests(ENGINE *, ENGINE_DIGESTS_PTR);
int ENGINE_set_flags(ENGINE *, int);
int ENGINE_set_cmd_defns(ENGINE *, const ENGINE_CMD_DEFN *);
const char *ENGINE_get_id(const ENGINE *);
const char *ENGINE_get_name(const ENGINE *);
const RSA_METHOD *ENGINE_get_RSA(const ENGINE *);
const DSA_METHOD *ENGINE_get_DSA(const ENGINE *);
const ECDH_METHOD *ENGINE_get_ECDH(const ENGINE *);
const ECDSA_METHOD *ENGINE_get_ECDSA(const ENGINE *);
const DH_METHOD *ENGINE_get_DH(const ENGINE *);
const RAND_METHOD *ENGINE_get_RAND(const ENGINE *);
const STORE_METHOD *ENGINE_get_STORE(const ENGINE *);
const EVP_CIPHER *ENGINE_get_cipher(ENGINE *, int);
const EVP_MD *ENGINE_get_digest(ENGINE *, int);
int ENGINE_get_flags(const ENGINE *);
const ENGINE_CMD_DEFN *ENGINE_get_cmd_defns(const ENGINE *);
EVP_PKEY *ENGINE_load_private_key(ENGINE *, const char *, UI_METHOD *, void *);
EVP_PKEY *ENGINE_load_public_key(ENGINE *, const char *, UI_METHOD *, void *);
void ENGINE_add_conf_module(void);
"""
MACROS = """
void ENGINE_load_cryptodev(void);
"""
CUSTOMIZATIONS = """
#if defined(LIBRESSL_VERSION_NUMBER)
static const long Cryptography_HAS_ENGINE_CRYPTODEV = 0;
void (*ENGINE_load_cryptodev)(void) = NULL;
#else
static const long Cryptography_HAS_ENGINE_CRYPTODEV = 1;
#endif
"""
'''
What is a Mother Vertex?
A mother vertex in a graph G = (V,E) is a vertex v such that all other vertices in G can be reached by a path from v.
How to find a mother vertex?
Case 1: Undirected connected graph: in this case, all vertices are mother vertices, since every node can reach all the other nodes in the graph.
Case 2: Undirected/directed disconnected graph: in this case, there is no mother vertex, since no single vertex can reach all the other nodes in the graph.
Case 3: Directed connected graph: in this case, we have to find a vertex v in the graph such that all the other nodes can be reached from v through a directed path.
SOLUTION:
If a mother vertex (or vertices) exists, then one of the mother vertices is the last finished vertex in DFS. (Or: a mother vertex has the maximum finish time in a DFS traversal.)
A vertex is said to be finished in DFS if the recursive call for its DFS is over, i.e., all descendants of the vertex have been visited.
Algorithm:
Do a DFS traversal of the given graph. While doing the traversal, keep track of the last finished vertex 'v'. This step takes O(V+E) time.
If a mother vertex (or vertices) exists, then v must be one of them. Check whether v is a mother vertex by doing a DFS/BFS from v. This step also takes O(V+E) time.
Note that there is no need to literally store the finish time for each vertex.
We can just do:
...
...
if node not in visited:
dfs(node)
latest = node
...
...
# Check if latest is indeed a mother vertex.
'''
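The pseudocode in the docstring can be fleshed out into a runnable sketch. The `find_mother_vertex` helper name and the adjacency-list dict representation are assumptions for illustration, not part of the original notes:

```python
def find_mother_vertex(graph, vertices):
    """Return a mother vertex of the directed graph, or None if none exists.

    graph: dict mapping a vertex to the list of its direct successors.
    vertices: iterable of all vertices (covers nodes without outgoing edges).
    """
    visited = set()

    def dfs(start):
        # Iterative DFS so large graphs don't hit the recursion limit.
        stack = [start]
        while stack:
            v = stack.pop()
            if v not in visited:
                visited.add(v)
                stack.extend(graph.get(v, []))

    # Pass 1: the root of the last DFS tree finishes last overall, so it is
    # the only possible mother vertex (no explicit finish times needed).
    latest = None
    for v in vertices:
        if v not in visited:
            dfs(v)
            latest = v

    if latest is None:  # empty graph
        return None

    # Pass 2: verify that the candidate actually reaches every vertex.
    visited.clear()
    dfs(latest)
    return latest if len(visited) == len(set(vertices)) else None
```

Both passes are O(V+E), matching the complexity stated above.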
# In Search for the Lost Memory [Explorer Thief] (3526)
# To be replaced with GMS's exact dialogue.
# Following dialogue has been edited from DeepL on JMS's dialogue transcript (no KMS footage anywhere):
# https://kaengouraiu2.blog.fc2.com/blog-entry-46.html
recoveredMemory = 7081
darkLord = 1052001
sm.setSpeakerID(darkLord)
sm.sendNext("The way you moved without a trace...you must have exceptional talent. "
"Long time no see, #h #.")
sm.sendSay("Since when did you grow up to this point? You're no less inferior to any Dark Lord. "
"You were just a greenhorn that couldn't even hide their presence...Hmph, well, it's been a while since then. "
"Still, it feels weird to see you become so strong. I guess this is how it feels to be proud.")
sm.sendSay("But don't let your guard down. Know that there's still more progress to be made. "
           "As the one who made you into a thief, I know that you can be even stronger...!")
sm.startQuest(parentID)
sm.completeQuest(parentID)
sm.startQuest(recoveredMemory)
sm.setQRValue(recoveredMemory, "1", False)
from django.utils.translation import ugettext_lazy as _
from cms.plugin_base import CMSPluginBase
from cms.plugin_pool import plugin_pool
from cms.models.pluginmodel import CMSPlugin
class SubmenuPlugin(CMSPluginBase):
model = CMSPlugin
name = _("Submenu")
render_template = "cms/plugins/submenu.html"
plugin_pool.register_plugin(SubmenuPlugin)
#!/usr/bin/env python3
"""
Base-Client Class
This is the parent-class of all client-classes and holds properties and functions they all depend on.
Author: Jason Cabezuela
"""
import src.util.debugger as Debugger
import src.util.configmaker as configmaker
class BaseClient(object):
"""Base-Client Class"""
def __init__(self, configpath, configtype, debugFlag = False):
self._Debug = Debugger.Debugger(debugFlag)
self._Debug.write("INIT BaseClient")
defaultPrompt = "-"
self._prompt = defaultPrompt
self._clientConfig = configmaker.getConfig(configpath, configtype)
self._Debug.write("INIT_END BaseClient")
@property
def prompt(self):
return self._prompt
def get_client_configuration(self):
"""Base method for getting client configuration; subclasses should override this."""
def load_client_configuration(self):
"""Base method for loading client configuration into memory; subclasses should override this."""
# -*- coding: utf-8 -*-
import sys
from cryptomon.common import Colors
if sys.version_info >= (3, 0):
import io
else:
import StringIO as io
ascii_title = """
/$$$$$$ /$$ /$$ /$$
/$$__ $$ | $$ | $$$ /$$$
| $$ \__/ /$$$$$$ /$$ /$$ /$$$$$$ /$$$$$$ /$$$$$$ | $$$$ /$$$$ /$$$$$$ /$$$$$$$
| $$ /$$__ $$| $$ | $$ /$$__ $$|_ $$_/ /$$__ $$| $$ $$/$$ $$ /$$__ $$| $$__ $$
| $$ | $$ \__/| $$ | $$| $$ \ $$ | $$ | $$ \ $$| $$ $$$| $$| $$ \ $$| $$ \ $$
| $$ $$| $$ | $$ | $$| $$ | $$ | $$ /$$| $$ | $$| $$\ $ | $$| $$ | $$| $$ | $$
| $$$$$$/| $$ | $$$$$$$| $$$$$$$/ | $$$$/| $$$$$$/| $$ \/ | $$| $$$$$$/| $$ | $$
\______/ |__/ \____ $$| $$____/ \___/ \______/ |__/ |__/ \______/ |__/ |__/
/$$ | $$| $$
| $$$$$$/| $$
\______/ |__/
"""
def process_title(title):
buf = io.StringIO(title)
lines = buf.readlines()
lines = lines[1:-1]
colored_lines = []
colored_title = ""
for line in lines:
colored_lines.append(Colors.BLUE + line[:13] + Colors.YELLOW + line[14:])
for line in colored_lines:
colored_title += line
return colored_title + Colors.ENDLINE
#! /usr/bin/env python
import mcpi.minecraft as minecraft
import mcpi.block as block
import random
import time
mc = minecraft.Minecraft.create()
# ----------------------------------------------------------------------
# S E T U P
# ----------------------------------------------------------------------
# Where Am I?
pos = mc.player.getTilePos()
print "Game center point is %d, %d, %d" % (pos.x, pos.y, pos.z)
limit=256
mc.setBlocks(pos.x, pos.y, pos.z, pos.x+10, pos.y-256, pos.z+10, block.AIR.id)
mc.setBlocks(pos.x, pos.y, pos.z, pos.x-10, pos.y+256, pos.z-10, block.DIAMOND_ORE.id)
# Copyright (C) 2015, 2016 GoSecure Inc.
"""
Telnet Transport and Authentication for the Honeypot
@author: Olivier Bilodeau <obilodeau@gosecure.ca>
"""
from __future__ import annotations
import struct
from twisted.conch.telnet import (
ECHO,
LINEMODE,
NAWS,
SGA,
AuthenticatingTelnetProtocol,
ITelnetProtocol,
)
from twisted.python import log
from cowrie.core.config import CowrieConfig
from cowrie.core.credentials import UsernamePasswordIP
class HoneyPotTelnetAuthProtocol(AuthenticatingTelnetProtocol):
"""
TelnetAuthProtocol that takes care of Authentication. Once authenticated this
protocol is replaced with HoneyPotTelnetSession.
"""
loginPrompt = b"login: "
passwordPrompt = b"Password: "
windowSize = [40, 80]
def connectionMade(self):
# self.transport.negotiationMap[NAWS] = self.telnet_NAWS
# Initial option negotation. Want something at least for Mirai
# for opt in (NAWS,):
# self.transport.doChain(opt).addErrback(log.err)
# I need to doubly escape here since my underlying
# CowrieTelnetTransport hack would remove it and leave just \n
self.transport.write(self.factory.banner.replace(b"\n", b"\r\r\n"))
self.transport.write(self.loginPrompt)
def connectionLost(self, reason):
"""
Fires on pre-authentication disconnects
"""
AuthenticatingTelnetProtocol.connectionLost(self, reason)
def telnet_User(self, line):
"""
Overridden to conditionally kill 'WILL ECHO' which confuses clients
that don't implement a proper Telnet protocol (most malware)
"""
self.username = line # .decode()
# only send ECHO option if we are chatting with a real Telnet client
self.transport.willChain(ECHO)
# FIXME: this should be configurable or provided via filesystem
self.transport.write(self.passwordPrompt)
return "Password"
def telnet_Password(self, line):
username, password = self.username, line # .decode()
del self.username
def login(ignored):
self.src_ip = self.transport.getPeer().host
creds = UsernamePasswordIP(username, password, self.src_ip)
d = self.portal.login(creds, self.src_ip, ITelnetProtocol)
d.addCallback(self._cbLogin)
d.addErrback(self._ebLogin)
# are we dealing with a real Telnet client?
if self.transport.options:
# stop ECHO
# even if ECHO negotiation fails we still want to attempt a login
# this allows us to support dumb clients which is common in malware
# thus the addBoth: on success and on exception (AlreadyNegotiating)
self.transport.wontChain(ECHO).addBoth(login)
else:
# process login
login("")
return "Discard"
def telnet_Command(self, command):
self.transport.protocol.dataReceived(command + b"\r")
return "Command"
def _cbLogin(self, ial):
"""
Fired on a successful login
"""
interface, protocol, logout = ial
protocol.windowSize = self.windowSize
self.protocol = protocol
self.logout = logout
self.state = "Command"
self.transport.write(b"\n")
# Remove the short timeout of the login prompt.
self.transport.setTimeout(
CowrieConfig.getint("honeypot", "interactive_timeout", fallback=300)
)
# replace myself with avatar protocol
protocol.makeConnection(self.transport)
self.transport.protocol = protocol
def _ebLogin(self, failure):
# TODO: provide a way to have user configurable strings for wrong password
self.transport.wontChain(ECHO)
self.transport.write(b"\nLogin incorrect\n")
self.transport.write(self.loginPrompt)
self.state = "User"
def telnet_NAWS(self, data):
"""
From TelnetBootstrapProtocol in twisted/conch/telnet.py
"""
if len(data) == 4:
width, height = struct.unpack("!HH", b"".join(data))
self.windowSize = [height, width]
else:
log.msg("Wrong number of NAWS bytes")
def enableLocal(self, opt):
if opt == ECHO:
return True
# TODO: check if twisted now supports SGA (see git commit c58056b0)
elif opt == SGA:
return False
else:
return False
def enableRemote(self, opt):
# TODO: check if twisted now supports LINEMODE (see git commit c58056b0)
if opt == LINEMODE:
return False
elif opt == NAWS:
return True
elif opt == SGA:
return True
else:
return False
import os
from django.template.base import TemplateDoesNotExist
class Mock():
pass
class MockLoader(BaseLoader):
is_usable = True
def load_template_source(self, template_name, template_dirs=None):
if template_name == 'cms_mock_template.html':
return '<div></div>', 'template.html'
elif template_name == '404.html':
return "404 Not Found", "404.html"
else:
raise TemplateDoesNotExist()
| 25.75 | 70 | 0.671845 | 60 | 515 | 5.616667 | 0.583333 | 0.106825 | 0.106825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022727 | 0.231068 | 515 | 19 | 71 | 27.105263 | 0.828283 | 0 | 0 | 0 | 0 | 0 | 0.145631 | 0.042718 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.076923 | 0.153846 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
60ebcdffbce13db306c3a548fbb31af96bfe8e29 | 1,549 | py | Python | molecule/default/tests/test_default.py | joshbenner/sensu-ansible-role | ecc92ba3462d7edf50ad96ddda61080ba58c29f8 | [
"BSD-3-Clause"
] | null | null | null | molecule/default/tests/test_default.py | joshbenner/sensu-ansible-role | ecc92ba3462d7edf50ad96ddda61080ba58c29f8 | [
"BSD-3-Clause"
] | 1 | 2018-10-31T03:14:05.000Z | 2018-10-31T03:14:05.000Z | molecule/default/tests/test_default.py | joshbenner/sensu-ansible-role | ecc92ba3462d7edf50ad96ddda61080ba58c29f8 | [
"BSD-3-Clause"
] | null | null | null | import os
import testinfra.utils.ansible_runner
testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('all')
def test_packages(host):
package = host.package('sensu')
assert package.is_installed
assert '1.7.0' in package.version
def test_dir_ownership(host):
assert host.file('/opt/sensu').group == 'sensu'
def test_main_config(host):
f = host.file('/etc/sensu/config.json')
assert f.exists
assert f.is_file
assert f.user == 'sensu'
assert f.group == 'sensu'
assert f.mode == 0o600
assert f.contains('rabbitmq')
assert f.contains('check-cpu.rb')
assert f.contains('"foo": "bar"')
assert f.contains('example_subscription')
assert f.contains('"zip": "zap"')
assert not f.contains('subscription_to_be_overridden')
def test_server_running(host):
server = host.service('sensu-server')
assert server.is_running
assert server.is_enabled
def test_api_running(host):
api = host.service('sensu-api')
assert api.is_running
assert api.is_enabled
def test_client_running(host):
client = host.service('sensu-client')
assert client.is_running
assert client.is_enabled
def test_api_listening(host):
assert host.socket('tcp://0.0.0.0:4567').is_listening
def test_plugin_installed(host):
assert host.file('/opt/sensu/embedded/bin/check-memory.rb').exists
# Tests extension install/enable
def test_snmp_listening(host):
assert host.socket('udp://0.0.0.0:1062').is_listening
from django.db import models
from django.utils.translation import ugettext_lazy as _
from django.utils.html import mark_safe
# Create your models here.
class Gellifinsta(models.Model):
class Meta:
ordering = ['-taken_at_datetime']
shortcode = models.CharField(_("Shortcode"), max_length=20)
taken_at_datetime = models.DateTimeField(_("taken at"))
username = models.CharField(_("Username"), max_length=100)
is_active = models.BooleanField(_("Active"),default=True)
is_video = models.BooleanField(_("Video"),default=False)
file_path = models.CharField(_("File Path"), max_length=500)
url = models.CharField(_("URL"), max_length=500)
created_dt = models.DateTimeField(_("Created Date/Time"), auto_now_add=True, null=True)
updated_dt = models.DateTimeField(_("Updated Date/Time"), auto_now=True, null=True)
caption = models.TextField(_("Caption"), blank=True, null=True)
tags = models.TextField(_("Tags"), blank=True, null=True)
def __str__(self):
return self.shortcode + ':' + str(self.taken_at_datetime)
def image_tag(self):
return mark_safe('<img src="%s" width="250" />' % (self.url))
image_tag.short_description = 'Image'
def tags_spaced(self):
return self.tags.replace(',',' ')
tags_spaced.short_description = 'Tags'
class Products(models.Model):
class Meta:
ordering = ['name']
name = models.CharField(_("Name"), max_length=100, unique=True)
is_active = models.BooleanField(_("Active"),default=True)
def __str__(self):
return self.name
from YouTubeFacesDB import generate_ytf_database
###############################################################################
# Create the dataset
###############################################################################
generate_ytf_database(
    directory='../data',  # '/scratch/vitay/Datasets/YouTubeFaces'; Location of the YTF dataset
    filename='ytfdb.h5',  # Name of the HDF5 file to write to
    labels=10,  # Number of labels to randomly select
    max_number=-1,  # Maximum number of images to use
    size=(100, 100),  # Size of the images
    color=False,  # Black and white
    bw_first=True,  # Final shape is (1, w, h)
    cropped=True  # The original images are cropped to the faces
) | 47.133333 | 95 | 0.550212 | 82 | 707 | 4.670732 | 0.658537 | 0.039164 | 0.099217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020101 | 0.155587 | 707 | 15 | 96 | 47.133333 | 0.621441 | 0.414427 | 0 | 0 | 1 | 0 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.090909 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8801787fa421093191e86dccf0ba799d1e648912 | 506 | py | Python | hyssop_aiohttp/component/__init__.py | hsky77/hyssop | 4ab1e82f9e2592de56589c7426a037564bef49a6 | [
"MIT"
] | null | null | null | hyssop_aiohttp/component/__init__.py | hsky77/hyssop | 4ab1e82f9e2592de56589c7426a037564bef49a6 | [
"MIT"
] | null | null | null | hyssop_aiohttp/component/__init__.py | hsky77/hyssop | 4ab1e82f9e2592de56589c7426a037564bef49a6 | [
"MIT"
] | null | null | null | # Copyright (C) 2020-Present the hyssop authors and contributors.
#
# This module is part of hyssop and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php
'''
File created: January 1st 2021
Modified By: hsky77
Last Updated: January 7th 2021 15:30:08 pm
'''
from hyssop.project.component import ComponentTypes
from .aio_client import AioClientComponent
class AioHttpComponentTypes(ComponentTypes):
    AioClient = ('aioclient', 'aio_client', 'AioClientComponent')
| 25.3 | 69 | 0.774704 | 66 | 506 | 5.909091 | 0.757576 | 0.051282 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050228 | 0.134387 | 506 | 19 | 70 | 26.631579 | 0.840183 | 0.551383 | 0 | 0 | 0 | 0 | 0.171296 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
8801ff2af63497d7ca9dadd57139f98ae23b3370 | 5,081 | py | Python | punkweb_boards/rest/serializers.py | Punkweb/punkweb-boards | 8934d15fbff2a3ce9191fdb19d58d029eb55ef16 | [
"BSD-3-Clause"
] | 20 | 2018-02-22T11:36:04.000Z | 2022-03-22T11:48:22.000Z | punkweb_boards/rest/serializers.py | imgVOID/punkweb-boards | 8934d15fbff2a3ce9191fdb19d58d029eb55ef16 | [
"BSD-3-Clause"
] | 28 | 2018-02-22T07:11:46.000Z | 2022-02-23T08:05:29.000Z | punkweb_boards/rest/serializers.py | imgVOID/punkweb-boards | 8934d15fbff2a3ce9191fdb19d58d029eb55ef16 | [
"BSD-3-Clause"
] | 5 | 2018-02-25T11:05:19.000Z | 2021-05-27T02:25:31.000Z | from rest_framework import serializers
from punkweb_boards.conf.settings import SHOUTBOX_DISABLED_TAGS
from punkweb_boards.models import (
    BoardProfile,
    Category,
    Subcategory,
    Thread,
    Post,
    Conversation,
    Message,
    Report,
    Shout,
)


class BoardProfileSerializer(serializers.ModelSerializer):
    post_count = serializers.ReadOnlyField()
    can_shout = serializers.ReadOnlyField()
    rendered_username = serializers.ReadOnlyField()
    rendered_rank = serializers.ReadOnlyField()

    class Meta:
        model = BoardProfile
        fields = "__all__"


class CategorySerializer(serializers.ModelSerializer):
    class Meta:
        model = Category
        exclude = ("auth_req",)


class SubcategorySerializer(serializers.ModelSerializer):
    last_thread = serializers.ReadOnlyField(source="last_thread.id")
    last_thread_title = serializers.ReadOnlyField(source="last_thread.title")
    last_thread_created = serializers.ReadOnlyField(
        source="last_thread.created"
    )
    last_thread_user = serializers.ReadOnlyField(
        source="last_thread.user.profile.rendered_username"
    )
    parent_name = serializers.ReadOnlyField(source="parent.name")
    thread_count = serializers.ReadOnlyField()
    post_count = serializers.ReadOnlyField()
    can_post = serializers.SerializerMethodField()

    def get_can_post(self, obj):
        return obj.can_post(self.context.get("request").user)

    class Meta:
        model = Subcategory
        exclude = ("auth_req",)


class ThreadSerializer(serializers.ModelSerializer):
    last_post = serializers.ReadOnlyField(source="last_post.id")
    last_post_created = serializers.ReadOnlyField(source="last_post.created")
    last_post_username = serializers.ReadOnlyField(
        source="last_post.user.username"
    )
    last_post_rendered_username = serializers.ReadOnlyField(
        source="last_post.user.profile.rendered_username"
    )
    user_username = serializers.ReadOnlyField(source="user.username")
    user_rendered_username = serializers.ReadOnlyField(
        source="user.profile.rendered_username"
    )
    user_image = serializers.ReadOnlyField(source="user.profile.avatar")
    user_post_count = serializers.ReadOnlyField(
        source="user.profile.post_count"
    )
    user_join_date = serializers.ReadOnlyField(source="user.created")
    flagged = serializers.ReadOnlyField(source="reported")
    posts_count = serializers.ReadOnlyField()
    can_edit = serializers.SerializerMethodField()

    def get_can_edit(self, obj):
        return obj.can_edit(self.context.get("request").user)

    class Meta:
        model = Thread
        fields = "__all__"
        read_only_fields = (
            "pinned",
            "closed",
            "user",
            "upvoted_by",
            "downvoted_by",
        )


class PostSerializer(serializers.ModelSerializer):
    flagged = serializers.ReadOnlyField(source="reported")
    can_edit = serializers.SerializerMethodField()

    def get_can_edit(self, obj):
        return obj.can_edit(self.context.get("request").user)

    class Meta:
        model = Post
        fields = "__all__"
        read_only_fields = ("user", "upvoted_by", "downvoted_by")


class ConversationSerializer(serializers.ModelSerializer):
    last_message = serializers.ReadOnlyField(source="last_message.id")
    last_message_title = serializers.ReadOnlyField(source="last_message.title")
    last_message_created = serializers.ReadOnlyField(
        source="last_message.created"
    )
    last_message_user = serializers.ReadOnlyField(
        source="last_message.user.profile.rendered_username"
    )
    message_count = serializers.ReadOnlyField()

    class Meta:
        model = Conversation
        fields = "__all__"
        read_only_fields = ("unread_by",)


class MessageSerializer(serializers.ModelSerializer):
    class Meta:
        model = Message
        fields = "__all__"
        read_only_fields = ("user",)


class ShoutSerializer(serializers.ModelSerializer):
    username = serializers.ReadOnlyField(source="user.username")
    rendered_username = serializers.ReadOnlyField(
        source="user.profile.rendered_username"
    )

    class Meta:
        model = Shout
        fields = (
            "id",
            "user",
            "username",
            "rendered_username",
            "content",
            "_content_rendered",
            "created",
            "modified",
        )
        read_only_fields = ("user",)

    def create(self, validated_data):
        for key in SHOUTBOX_DISABLED_TAGS:
            key_tag = "[{}]".format(key).lower()
            if (
                key_tag[: len(key_tag) - 1]
                in validated_data.get("content").lower()
            ):
                raise serializers.ValidationError(
                    {
                        "notAllowed": "{} is not allowed in the shoutbox".format(
                            key_tag
                        )
                    }
                )
        return Shout.objects.create(**validated_data)
| 30.793939 | 81 | 0.657745 | 478 | 5,081 | 6.732218 | 0.209205 | 0.223741 | 0.205096 | 0.126787 | 0.529832 | 0.224052 | 0.159416 | 0.128341 | 0.116221 | 0.070851 | 0 | 0.000261 | 0.245818 | 5,081 | 164 | 82 | 30.981707 | 0.839509 | 0 | 0 | 0.226277 | 0 | 0 | 0.142492 | 0.045463 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029197 | false | 0 | 0.021898 | 0.021898 | 0.437956 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
88031b336437f0a5497f94eace7653d85a0ddb61 | 1,326 | py | Python | runtime/components/Statistic/moving_minimum_time.py | ulise/hetida-designer | a6be8eb45abf950d5498e3ca756ea1d2e46b5c00 | [
"MIT"
] | 41 | 2020-11-18T10:12:29.000Z | 2022-03-28T21:46:41.000Z | runtime/components/Statistic/moving_minimum_time.py | ulise/hetida-designer | a6be8eb45abf950d5498e3ca756ea1d2e46b5c00 | [
"MIT"
] | 4 | 2020-12-08T15:28:15.000Z | 2022-02-01T11:40:17.000Z | runtime/components/Statistic/moving_minimum_time.py | ulise/hetida-designer | a6be8eb45abf950d5498e3ca756ea1d2e46b5c00 | [
"MIT"
] | 14 | 2020-11-18T11:39:17.000Z | 2022-03-21T15:05:11.000Z | from hetdesrun.component.registration import register
from hetdesrun.datatypes import DataType
import pandas as pd
import numpy as np
# ***** DO NOT EDIT LINES BELOW *****
# These lines may be overwritten if input/output changes.
@register(
    inputs={"data": DataType.Any, "t": DataType.String},
    outputs={"movmin": DataType.Any},
)
def main(*, data, t):
    """entrypoint function for this component

    Usage example:
    >>> main(
    ...     data = pd.Series(
    ...         {
    ...             "2019-08-01T15:20:00": 4.0,
    ...             "2019-08-01T15:20:01": 5.0,
    ...             "2019-08-01T15:20:05": 1.0,
    ...             "2019-08-01T15:20:09": 9.0,
    ...         }
    ...     ),
    ...     t = "4s"
    ... )["movmin"]
    2019-08-01 15:20:00    4.0
    2019-08-01 15:20:01    4.0
    2019-08-01 15:20:05    1.0
    2019-08-01 15:20:09    9.0
    dtype: float64
    """
    # ***** DO NOT EDIT LINES ABOVE *****
    # write your code here.
    try:
        data.index = pd.to_datetime(data.index)
    except (ValueError, TypeError):
        raise TypeError("indices of data must be datetime")

    data_sort = data.sort_index().dropna()

    try:
        return {"movmin": data_sort.rolling(t).min()}
    except ValueError:
        raise ValueError(f"t could not be parsed as frequency: {t}")
| 28.212766 | 68 | 0.555053 | 180 | 1,326 | 4.066667 | 0.466667 | 0.065574 | 0.057377 | 0.071038 | 0.147541 | 0.098361 | 0.038251 | 0 | 0 | 0 | 0 | 0.137461 | 0.281297 | 1,326 | 46 | 69 | 28.826087 | 0.63064 | 0.457014 | 0 | 0.111111 | 1 | 0 | 0.137931 | 0 | 0 | 0 | 0 | 0.021739 | 0 | 1 | 0.055556 | false | 0 | 0.222222 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
880be95fb023fa99a8e4f0737f4b060a1751c3cd | 576 | py | Python | keyboardrow.py | AndySamoil/Elite_Code | 7dc3b7b1b8688c932474f8a10fd2637fd2918bdd | [
"MIT"
] | null | null | null | keyboardrow.py | AndySamoil/Elite_Code | 7dc3b7b1b8688c932474f8a10fd2637fd2918bdd | [
"MIT"
] | null | null | null | keyboardrow.py | AndySamoil/Elite_Code | 7dc3b7b1b8688c932474f8a10fd2637fd2918bdd | [
"MIT"
] | null | null | null | def findWords(self, words: List[str]) -> List[str]:
    ''' sets and iterate through sets
    '''
    every = [set("qwertyuiop"), set("asdfghjkl"), set("zxcvbnm")]
    ans = []
    for word in words:
        l = len(word)
        for sett in every:
            count = 0
            for let in word:
                if let.lower() in sett:
                    count += 1
            if count == l:
                ans.append(word)
return ans | 27.428571 | 69 | 0.362847 | 53 | 576 | 3.943396 | 0.566038 | 0.066986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007547 | 0.539931 | 576 | 21 | 70 | 27.428571 | 0.781132 | 0 | 0 | 0 | 0 | 0 | 0.048964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8811e504a270f2f7246e1ece4241279f011e0643 | 745 | py | Python | netbox/ipam/managers.py | aslafy-z/netbox | a5512dd4c46c005df8752fc330c1382ac22b31ea | [
"Apache-2.0"
] | 1 | 2022-01-25T09:02:56.000Z | 2022-01-25T09:02:56.000Z | netbox/ipam/managers.py | aslafy-z/netbox | a5512dd4c46c005df8752fc330c1382ac22b31ea | [
"Apache-2.0"
] | 4 | 2021-06-08T22:29:06.000Z | 2022-03-12T00:48:51.000Z | netbox/ipam/managers.py | aslafy-z/netbox | a5512dd4c46c005df8752fc330c1382ac22b31ea | [
"Apache-2.0"
] | null | null | null | from django.db import models
from ipam.lookups import Host, Inet
class IPAddressManager(models.Manager):
def get_queryset(self):
"""
By default, PostgreSQL will order INETs with shorter (larger) prefix lengths ahead of those with longer
(smaller) masks. This makes no sense when ordering IPs, which should be ordered solely by family and host
address. We can use HOST() to extract just the host portion of the address (ignoring its mask), but we must
then re-cast this value to INET() so that records will be ordered properly. We are essentially re-casting each
IP address as a /32 or /128.
"""
qs = super().get_queryset()
return qs.order_by(Inet(Host('address')))
| 41.388889 | 118 | 0.689933 | 111 | 745 | 4.603604 | 0.738739 | 0.043053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008834 | 0.240268 | 745 | 17 | 119 | 43.823529 | 0.893993 | 0.613423 | 0 | 0 | 0 | 0 | 0.030172 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
714b8d767c11fadd1e5da33bbf5b7d19a7d70405 | 382 | py | Python | py/2016/5B.py | pedrotari7/advent_of_code | 98d5bc8d903435624a019a5702f5421d7b4ef8c8 | [
"MIT"
] | null | null | null | py/2016/5B.py | pedrotari7/advent_of_code | 98d5bc8d903435624a019a5702f5421d7b4ef8c8 | [
"MIT"
] | null | null | null | py/2016/5B.py | pedrotari7/advent_of_code | 98d5bc8d903435624a019a5702f5421d7b4ef8c8 | [
"MIT"
] | null | null | null | import md5
(i,count) = (0,0)
password = ['']*8
while 1:
    key = 'reyedfim' + str(i)
    md = md5.new(key).hexdigest()
    if md[:5] == '00000':
        index = int(md[5], 16)
        if index < len(password) and password[index] == '':
            password[index] = md[6]
            count += 1
            if count == 8:
                break
    i += 1
i+=1
print ''.join(password) | 17.363636 | 58 | 0.465969 | 50 | 382 | 3.56 | 0.54 | 0.033708 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077236 | 0.356021 | 382 | 22 | 59 | 17.363636 | 0.646341 | 0 | 0 | 0 | 0 | 0 | 0.033943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.266667 | 0.066667 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
715a02ff047054f60c24cd7d80d0ef426229bc1b | 1,658 | py | Python | src/exabgp/bgp/message/update/attribute/bgpls/link/mplsmask.py | pierky/exabgp | 34be537ae5906c0830b31da1152ae63108ccf911 | [
"BSD-3-Clause"
] | 1,560 | 2015-01-01T08:53:05.000Z | 2022-03-29T20:22:43.000Z | src/exabgp/bgp/message/update/attribute/bgpls/link/mplsmask.py | pierky/exabgp | 34be537ae5906c0830b31da1152ae63108ccf911 | [
"BSD-3-Clause"
] | 818 | 2015-01-01T17:38:40.000Z | 2022-03-30T07:29:24.000Z | src/exabgp/bgp/message/update/attribute/bgpls/link/mplsmask.py | pierky/exabgp | 34be537ae5906c0830b31da1152ae63108ccf911 | [
"BSD-3-Clause"
] | 439 | 2015-01-06T21:20:41.000Z | 2022-03-19T23:24:25.000Z | # encoding: utf-8
"""
mplsmask.py
Created by Evelio Vila on 2016-12-01.
Copyright (c) 2014-2017 Exa Networks. All rights reserved.
"""
from exabgp.bgp.message.notification import Notify
from exabgp.bgp.message.update.attribute.bgpls.linkstate import LinkState
from exabgp.bgp.message.update.attribute.bgpls.linkstate import FlagLS
#     0                   1                   2                   3
#     0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
#    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
#    |              Type             |             Length            |
#    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
#    |L|R|  Reserved |
#    +-+-+-+-+-+-+-+-+
# https://tools.ietf.org/html/rfc7752#section-3.3.2.2  MPLS Protocol Mask
#
#    +------------+------------------------------------------+-----------+
#    |    Bit     | Description                              | Reference |
#    +------------+------------------------------------------+-----------+
#    |    'L'     | Label Distribution Protocol (LDP)        | [RFC5036] |
#    |    'R'     | Extension to RSVP for LSP Tunnels        | [RFC3209] |
#    |            | (RSVP-TE)                                |           |
#    | 'Reserved' | Reserved for future use                  |           |
#    +------------+------------------------------------------+-----------+
# RFC 7752 3.3.2.2. MPLS Protocol Mask TLV
@LinkState.register()
class MplsMask(FlagLS):
    REPR = 'MPLS Protocol mask'
    JSON = 'mpls-mask'
    TLV = 1094
    FLAGS = ['LDP', 'RSVP-TE', 'RSV', 'RSV', 'RSV', 'RSV', 'RSV', 'RSV']
    LEN = 1
| 41.45 | 77 | 0.390229 | 164 | 1,658 | 3.945122 | 0.518293 | 0.015456 | 0.018547 | 0.02473 | 0.309119 | 0.281298 | 0.281298 | 0.219475 | 0.219475 | 0.049459 | 0 | 0.071119 | 0.304584 | 1,658 | 39 | 78 | 42.512821 | 0.490026 | 0.729795 | 0 | 0 | 0 | 0 | 0.130641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3 | 0 | 0.9 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
715e64156e2717f5d7270f3da98702a6b8223554 | 253 | py | Python | helpers/Screen.py | 1000monkeys/MastermindRedux | 6b07a341ecbf2ea325949a49c84218cc3632cd33 | [
"Unlicense"
] | null | null | null | helpers/Screen.py | 1000monkeys/MastermindRedux | 6b07a341ecbf2ea325949a49c84218cc3632cd33 | [
"Unlicense"
] | null | null | null | helpers/Screen.py | 1000monkeys/MastermindRedux | 6b07a341ecbf2ea325949a49c84218cc3632cd33 | [
"Unlicense"
] | null | null | null | import sys
import pygame  # needed below: the original referenced a nonexistent self.pygame attribute


class Screen:
    def __init__(self) -> None:
        pass

    def handle_events(self, events):
        for event in events:
            if event.type == pygame.QUIT:
                sys.exit()

    def draw(self, screen):
pass | 19.461538 | 46 | 0.545455 | 31 | 253 | 4.290323 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.359684 | 253 | 13 | 47 | 19.461538 | 0.820988 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0.2 | 0.1 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
71727855c3b5a49ba770b23fd1b96b453bcf8530 | 855 | py | Python | carPooling/migrations/0018_auto_20190521_1651.py | yangtao4389/pinche | 81463761058f67d47cea980f29a061b1e1b2d08a | [
"Apache-2.0"
] | 1 | 2020-09-30T01:27:57.000Z | 2020-09-30T01:27:57.000Z | carPooling/migrations/0018_auto_20190521_1651.py | yangtao4389/pinche | 81463761058f67d47cea980f29a061b1e1b2d08a | [
"Apache-2.0"
] | 9 | 2020-06-05T19:51:33.000Z | 2022-03-11T23:40:25.000Z | carPooling/migrations/0018_auto_20190521_1651.py | yangtao4389/pinche | 81463761058f67d47cea980f29a061b1e1b2d08a | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.4 on 2019-05-21 16:51
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('carPooling', '0017_carpoolingrecunbook'),
]
operations = [
migrations.AlterField(
model_name='carpoolinguserconf',
name='c_name',
field=models.CharField(max_length=128, null=True, verbose_name='真实姓名'),
),
migrations.AlterField(
model_name='carpoolinguserconf',
name='c_phone',
field=models.CharField(db_index=True, max_length=11, verbose_name='电话号码'),
),
migrations.AlterField(
model_name='carpoolinguserconf',
name='c_weixin_id',
field=models.CharField(db_index=True, max_length=128, null=True, verbose_name='微信id'),
),
]
| 29.482759 | 98 | 0.611696 | 89 | 855 | 5.696629 | 0.505618 | 0.118343 | 0.147929 | 0.171598 | 0.57002 | 0.57002 | 0.57002 | 0.157791 | 0 | 0 | 0 | 0.043339 | 0.271345 | 855 | 28 | 99 | 30.535714 | 0.770465 | 0.052632 | 0 | 0.409091 | 1 | 0 | 0.153465 | 0.029703 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
717d2436cb249576851958abd14c5cdda155ab5a | 565 | py | Python | PySDDP/term.py | tscher/PySDDP | ece69b77c951cbb1f046ac184f6fe4fc025ad690 | [
"MIT"
] | 9 | 2021-01-07T13:35:47.000Z | 2022-02-06T14:30:33.000Z | PySDDP/term.py | AndreMarcato/PySDDP | e6b1e60df6a5598c30552be61b07ed642e46399c | [
"MIT"
] | null | null | null | PySDDP/term.py | AndreMarcato/PySDDP | e6b1e60df6a5598c30552be61b07ed642e46399c | [
"MIT"
] | 3 | 2021-01-08T11:37:23.000Z | 2021-04-19T15:07:26.000Z | class term(object):
    # Registration data for the thermoelectric plants (present in TERM.DAT)
    Codigo = None
    Nome = None
    Potencia = None
    FCMax = None
    TEIF = None
    IP = None
    GTMin = None

    # Additional data specified in the thermal configuration file (CONFT)
    Sist = None
    Status = None
    Classe = None

    # Additional data specified in the thermal class file (CLAST)
    Custo = None
    NomeClasse = None
    TipoComb = None

    def insere(self, custo, gmax):
        self.custo = custo
        self.gmax = gmax
| 23.541667 | 79 | 0.640708 | 68 | 565 | 5.323529 | 0.544118 | 0.049724 | 0.104972 | 0.176796 | 0.237569 | 0.237569 | 0.237569 | 0 | 0 | 0 | 0 | 0 | 0.297345 | 565 | 23 | 80 | 24.565217 | 0.911839 | 0.366372 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0 | 0 | 0.882353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
718a929c80bd8d634b1687ba5560ac7c6a4f6fe7 | 264 | py | Python | venv/lib/python2.7/dist-packages/landscape/sysinfo/load.py | pengwu/scapy_env | 3db9c5dea2e219048a2387649d6d89be342903d9 | [
"MIT"
] | null | null | null | venv/lib/python2.7/dist-packages/landscape/sysinfo/load.py | pengwu/scapy_env | 3db9c5dea2e219048a2387649d6d89be342903d9 | [
"MIT"
] | null | null | null | venv/lib/python2.7/dist-packages/landscape/sysinfo/load.py | pengwu/scapy_env | 3db9c5dea2e219048a2387649d6d89be342903d9 | [
"MIT"
] | null | null | null | import os
from twisted.internet.defer import succeed
class Load(object):
    def register(self, sysinfo):
        self._sysinfo = sysinfo

    def run(self):
        self._sysinfo.add_header("System load", str(os.getloadavg()[0]))
        return succeed(None)
| 18.857143 | 72 | 0.666667 | 34 | 264 | 5.088235 | 0.676471 | 0.190751 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004854 | 0.219697 | 264 | 13 | 73 | 20.307692 | 0.834951 | 0 | 0 | 0 | 0 | 0 | 0.041667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
718c6a96017a844d29bf1f77cede2d377a4c970c | 675 | py | Python | src/boh_api/viewsets.py | dougmorato/bag-of-holding | 8a7bc45ced8837bdb00da60dcfb496bb0271f161 | [
"Apache-2.0"
] | null | null | null | src/boh_api/viewsets.py | dougmorato/bag-of-holding | 8a7bc45ced8837bdb00da60dcfb496bb0271f161 | [
"Apache-2.0"
] | 1 | 2021-06-10T23:58:45.000Z | 2021-06-10T23:58:45.000Z | src/boh_api/viewsets.py | dougmorato/bag-of-holding | 8a7bc45ced8837bdb00da60dcfb496bb0271f161 | [
"Apache-2.0"
] | null | null | null | from rest_framework import viewsets
from boh import models
from . import serializers
class OrganizationViewSet(viewsets.ModelViewSet):
    queryset = models.Organization.objects.all()
    serializer_class = serializers.OrganizationSerializer


class ApplicationViewSet(viewsets.ModelViewSet):
    queryset = models.Application.objects.all()
    serializer_class = serializers.ApplicationSerializer


class TagViewSet(viewsets.ModelViewSet):
    queryset = models.Tag.objects.all()
    serializer_class = serializers.TagSerializer


class PersonViewSet(viewsets.ModelViewSet):
    queryset = models.Person.objects.all()
    serializer_class = serializers.PersonSerializer
| 25.961538 | 57 | 0.8 | 64 | 675 | 8.359375 | 0.40625 | 0.149533 | 0.209346 | 0.254206 | 0.269159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127407 | 675 | 25 | 58 | 27 | 0.908319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
719fb32d418ed1529b6d751555ff2385cebf2266 | 623 | py | Python | Last 3 digits of 11^x.py | jaiveergill/Last-Three-Digits-of-11-x | def4519b9b46e41b4c4f2b3a5dbe5566316dd83e | [
"MIT"
] | null | null | null | Last 3 digits of 11^x.py | jaiveergill/Last-Three-Digits-of-11-x | def4519b9b46e41b4c4f2b3a5dbe5566316dd83e | [
"MIT"
] | null | null | null | Last 3 digits of 11^x.py | jaiveergill/Last-Three-Digits-of-11-x | def4519b9b46e41b4c4f2b3a5dbe5566316dd83e | [
"MIT"
] | null | null | null | # This is a simple program to find the last three digits of 11 raised to any given number.
# The main algorithm that does the work is on line 10
def trim_num(num):
    if len(str(num)) > 3:  # no need to trim if the number is 3 or less digits long
        return str(num)[(len(str(num)) - 3):]  # trims the number
    return num


def main(exp):
    init_val = str((((exp-1) * (exp))/2) % 10 + (exp % 100) / 10) + str(exp % 10) + "1"  # The main algorithm which needs to be cleaned (only the last three digits should be shown)
    return "{}".format(trim_num(init_val))
# To use it, simply copy the code and run the function
| 44.5 | 179 | 0.662921 | 114 | 623 | 3.587719 | 0.535088 | 0.04401 | 0.05868 | 0.08802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039337 | 0.224719 | 623 | 13 | 180 | 47.923077 | 0.807453 | 0.569823 | 0 | 0 | 0 | 0 | 0.011494 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
71a33e281903173f09972e5b14ecf88c5dd711ba | 1,251 | py | Python | summary/summary_avail.py | bit0fun/plugins | 1f6f701bf1e60882b8fa61cb735e7033c8c29e3c | [
"BSD-3-Clause"
] | 173 | 2019-01-17T12:40:47.000Z | 2022-03-27T12:14:00.000Z | summary/summary_avail.py | bit0fun/plugins | 1f6f701bf1e60882b8fa61cb735e7033c8c29e3c | [
"BSD-3-Clause"
] | 284 | 2019-03-01T17:54:14.000Z | 2022-03-29T13:27:51.000Z | summary/summary_avail.py | bit0fun/plugins | 1f6f701bf1e60882b8fa61cb735e7033c8c29e3c | [
"BSD-3-Clause"
] | 92 | 2019-02-26T03:45:40.000Z | 2022-03-28T03:23:50.000Z | from datetime import datetime
# ensure an RPC peer is added
def addpeer(p, rpcpeer):
    pid = rpcpeer['id']
    if pid not in p.persist['peerstate']:
        p.persist['peerstate'][pid] = {
            'connected': rpcpeer['connected'],
            'last_seen': datetime.now() if rpcpeer['connected'] else None,
            'avail': 1.0 if rpcpeer['connected'] else 0.0
        }


# exponentially smooth online/offline states of peers
def trace_availability(p, rpcpeers):
    p.persist['availcount'] += 1
    leadwin = max(min(p.avail_window, p.persist['availcount'] * p.avail_interval), p.avail_interval)
    samples = leadwin / p.avail_interval
    alpha = 1.0 / samples
    beta = 1.0 - alpha

    for rpcpeer in rpcpeers['peers']:
        pid = rpcpeer['id']
        addpeer(p, rpcpeer)

        if rpcpeer['connected']:
            p.persist['peerstate'][pid]['last_seen'] = datetime.now()
            p.persist['peerstate'][pid]['connected'] = True
            p.persist['peerstate'][pid]['avail'] = 1.0 * alpha + p.persist['peerstate'][pid]['avail'] * beta
        else:
            p.persist['peerstate'][pid]['connected'] = False
            p.persist['peerstate'][pid]['avail'] = 0.0 * alpha + p.persist['peerstate'][pid]['avail'] * beta
| 36.794118 | 108 | 0.60032 | 156 | 1,251 | 4.769231 | 0.326923 | 0.11828 | 0.205645 | 0.215054 | 0.278226 | 0.094086 | 0.094086 | 0.094086 | 0 | 0 | 0 | 0.013613 | 0.236611 | 1,251 | 33 | 109 | 37.909091 | 0.765445 | 0.06235 | 0 | 0.08 | 0 | 0 | 0.184615 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.04 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
71b9f1ca619e6a3da629a83c1ba692653be95c14 | 409 | py | Python | panel/api/models/provider.py | angeelgarr/DCPanel | 1901a0f4b1b4273b60d3a218797fb6614d05b4c0 | [
"MIT"
] | 7 | 2016-01-06T13:28:35.000Z | 2020-11-30T07:35:59.000Z | panel/api/models/provider.py | angeelgarr/DCPanel | 1901a0f4b1b4273b60d3a218797fb6614d05b4c0 | [
"MIT"
] | null | null | null | panel/api/models/provider.py | angeelgarr/DCPanel | 1901a0f4b1b4273b60d3a218797fb6614d05b4c0 | [
"MIT"
] | 6 | 2017-07-18T06:41:56.000Z | 2022-01-17T07:04:44.000Z | from django.db import models
from django.contrib import admin
class Provider(models.Model):
    name = models.CharField(max_length=50)
    domain = models.CharField(max_length=50)

    class Meta:
        ordering = ['name']
        app_label = 'api'

    def __str__(self):
        return self.domain


@admin.register(Provider)
class ProviderAdmin(admin.ModelAdmin):
    list_display = ('name', 'domain')
| 20.45 | 44 | 0.682152 | 50 | 409 | 5.42 | 0.6 | 0.073801 | 0.132841 | 0.177122 | 0.191882 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012346 | 0.207824 | 409 | 19 | 45 | 21.526316 | 0.824074 | 0 | 0 | 0 | 0 | 0 | 0.041565 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.153846 | 0.076923 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
71c3d256540447d130560ac9efdd84ad55be2fad | 970 | py | Python | IceSpringMusicPlayer/plugins/IceSpringHelloWorldPlugin/helloWorldPlugin.py | baijifeilong/rawsteelp | 425547e6e2395bf4acb62435b18b5b3a4b7ebef4 | [
"MIT"
] | null | null | null | IceSpringMusicPlayer/plugins/IceSpringHelloWorldPlugin/helloWorldPlugin.py | baijifeilong/rawsteelp | 425547e6e2395bf4acb62435b18b5b3a4b7ebef4 | [
"MIT"
] | null | null | null | IceSpringMusicPlayer/plugins/IceSpringHelloWorldPlugin/helloWorldPlugin.py | baijifeilong/rawsteelp | 425547e6e2395bf4acb62435b18b5b3a4b7ebef4 | [
"MIT"
] | null | null | null | # Created by BaiJiFeiLong@gmail.com at 2022/1/21 17:13
import typing
from IceSpringRealOptional.typingUtils import gg
from PySide2 import QtWidgets, QtCore
from IceSpringMusicPlayer import tt
from IceSpringMusicPlayer.common.pluginMixin import PluginMixin
from IceSpringMusicPlayer.common.pluginWidgetMixin import PluginWidgetMixin
from IceSpringMusicPlayer.tt import Text
class HelloWorldPlugin(QtWidgets.QWidget, PluginMixin, PluginWidgetMixin):
    @classmethod
    def getPluginName(cls) -> Text:
        return tt.HelloWorldPlugin_Name

    @classmethod
    def getPluginReplacers(cls) -> typing.Dict[Text, typing.Callable[[], PluginWidgetMixin]]:
        return {tt.HelloWorldWidget_Name: lambda: cls()}

    def __init__(self):
        super().__init__()
        label = QtWidgets.QLabel("Hello World")
        label.setAlignment(gg(QtCore.Qt.AlignmentFlag.AlignCenter))
        self.setLayout(QtWidgets.QGridLayout())
        self.layout().addWidget(label)
| 33.448276 | 93 | 0.760825 | 100 | 970 | 7.28 | 0.55 | 0.131868 | 0.082418 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014616 | 0.153608 | 970 | 28 | 94 | 34.642857 | 0.872107 | 0.053608 | 0 | 0.1 | 0 | 0 | 0.012009 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.35 | 0.1 | 0.65 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
71cfe70b6d19b560b2ea31dad54f5ad9cbddfef1 | 291 | py | Python | Projects/envirohat-monitor/clear-screen.py | pkbullock/RaspberryPi | 1c8e83566e97f65fe530d8d43293f4b26c015d0d | [
"MIT"
] | null | null | null | Projects/envirohat-monitor/clear-screen.py | pkbullock/RaspberryPi | 1c8e83566e97f65fe530d8d43293f4b26c015d0d | [
"MIT"
] | null | null | null | Projects/envirohat-monitor/clear-screen.py | pkbullock/RaspberryPi | 1c8e83566e97f65fe530d8d43293f4b26c015d0d | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import ST7735
import sys
st7735 = ST7735.ST7735(
port=0,
    cs=1,
    dc=9,
    backlight=12,
    rotation=270,
    spi_speed_hz=10000000
)
# Reset the display
st7735.begin()
st7735.reset()
st7735.set_backlight(0)
print("\nDone.")
# Exit cleanly
sys.exit(0) | 12.652174 | 25 | 0.666667 | 43 | 291 | 4.44186 | 0.697674 | 0.125654 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202586 | 0.202749 | 291 | 23 | 26 | 12.652174 | 0.62069 | 0.178694 | 0 | 0 | 0 | 0 | 0.029536 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.133333 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
71d94b479d90dc462de92906e5d1f18653ee665b | 433 | py | Python | cauldron/cli/server/routes/ui_statuses.py | JohnnyPeng18/cauldron | 09120c2a4cef65df46f8c0c94f5d79395b3298cd | [
"MIT"
] | 90 | 2016-09-02T15:11:10.000Z | 2022-01-02T11:37:57.000Z | cauldron/cli/server/routes/ui_statuses.py | JohnnyPeng18/cauldron | 09120c2a4cef65df46f8c0c94f5d79395b3298cd | [
"MIT"
] | 86 | 2016-09-23T16:52:22.000Z | 2022-03-31T21:39:56.000Z | cauldron/cli/server/routes/ui_statuses.py | JohnnyPeng18/cauldron | 09120c2a4cef65df46f8c0c94f5d79395b3298cd | [
"MIT"
] | 261 | 2016-12-22T05:36:48.000Z | 2021-11-26T12:40:42.000Z | import flask
from cauldron.cli.server import run as server_runner
from cauldron.ui import arguments
from cauldron.ui import statuses
@server_runner.APPLICATION.route('/ui-status', methods=['POST'])
def ui_status():
    args = arguments.from_request()
    last_timestamp = args.get('last_timestamp', 0)
    force = args.get('force', False)
    results = statuses.get_status(last_timestamp, force)
    return flask.jsonify(results)
| 28.866667 | 64 | 0.750577 | 59 | 433 | 5.372881 | 0.491525 | 0.113565 | 0.088328 | 0.126183 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002681 | 0.138568 | 433 | 14 | 65 | 30.928571 | 0.847185 | 0 | 0 | 0 | 0 | 0 | 0.076212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
71de0e1236a03e6cfb4186e21a5012996969d350 | 130 | py | Python | WebVisualizations/data.py | chuhaovince/Web-Design-Challenge | 1826a0e2dfbe4e11feb78f0ecce02e0f8a0a7eb5 | [
"ADSL"
] | null | null | null | WebVisualizations/data.py | chuhaovince/Web-Design-Challenge | 1826a0e2dfbe4e11feb78f0ecce02e0f8a0a7eb5 | [
"ADSL"
] | null | null | null | WebVisualizations/data.py | chuhaovince/Web-Design-Challenge | 1826a0e2dfbe4e11feb78f0ecce02e0f8a0a7eb5 | [
"ADSL"
] | null | null | null | import pandas as pd
path = "Resources/cities.csv"
data = pd.read_csv(path)
data_html = data.to_html("data.html", bold_rows = True) | 32.5 | 55 | 0.746154 | 23 | 130 | 4.043478 | 0.652174 | 0.172043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 130 | 4 | 55 | 32.5 | 0.808696 | 0 | 0 | 0 | 0 | 0 | 0.221374 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e087f1cb73d38e79a6d1863be5eb904e7f3b6261 | 1,541 | py | Python | Data_and_Dicts.py | melkisedeath/Harmonic_Analysis_and_Trajectory | a5a2819c053ddd287dcb668fac2f1be7e44f6c59 | [
"MIT"
] | null | null | null | Data_and_Dicts.py | melkisedeath/Harmonic_Analysis_and_Trajectory | a5a2819c053ddd287dcb668fac2f1be7e44f6c59 | [
"MIT"
] | null | null | null | Data_and_Dicts.py | melkisedeath/Harmonic_Analysis_and_Trajectory | a5a2819c053ddd287dcb668fac2f1be7e44f6c59 | [
"MIT"
] | null | null | null | """HERE are the base Points for all valid Tonnetze Systems.
A period of all 12 notes divided by mod 3, mod 4 (always stable)
"""
# x = 4, y = 3
NotePointsT345 = {
    0: (0, 0),
    1: (1, 3),
    2: (2, 2),
    3: (0, 1),
    4: (1, 0),
    5: (2, 3),
    6: (0, 2),
    7: (1, 1),
    8: (2, 0),
    9: (0, 3),
    10: (1, 2),
    11: (2, 1)
}

# x = 8, y = 3
NotePointsT138 = {
    0: (0, 0),
    1: (2, 3),
    2: (1, 2),
    3: (0, 1),
    4: (2, 0),
    5: (1, 3),
    6: (0, 2),
    7: (2, 1),
    8: (1, 0),
    9: (0, 3),
    10: (2, 2),
    11: (1, 1)
}

# x = 2, y = 9
NotePointsT129 = {
    0: (0, 0),
    1: (2, 1),
    2: (1, 0),
    3: (0, 3),
    4: (2, 0),
    5: (1, 3),
    6: (0, 2),
    7: (2, 3),
    8: (1, 2),
    9: (0, 1),
    10: (2, 2),
    11: (1, 1)
}

# x = 4, y = 1
NotePointsT147 = {
    0: (0, 0),
    1: (0, 1),
    2: (0, 2),
    3: (0, 3),
    4: (1, 0),
    5: (1, 1),
    6: (1, 2),
    7: (1, 3),
    8: (2, 0),
    9: (2, 1),
    10: (2, 2),
    11: (2, 3)
}

# x = 2, y = 3
NotePointsT237 = {
    0: (0, 0),
    1: (2, 3),
    2: (1, 0),
    3: (0, 1),
    4: (2, 0),
    5: (1, 1),
    6: (0, 2),
    7: (2, 1),
    8: (1, 2),
    9: (0, 3),
    10: (2, 2),
    11: (1, 3)
}

dictOfTonnetz = {
    'T345': NotePointsT345,
    'T147': NotePointsT147,
    'T138': NotePointsT138,
    'T237': NotePointsT237,
    'T129': NotePointsT129
}

dictOfTonnetze = {
    'T129': [1, 2, 9],
    'T138': [1, 3, 8],
    'T147': [1, 4, 7],
    'T156': [1, 5, 6],
    'T237': [2, 3, 7],
    'T345': [3, 4, 5]
}
| 14.817308 | 64 | 0.345879 | 265 | 1,541 | 2.011321 | 0.177358 | 0.041276 | 0.028143 | 0.037523 | 0.262664 | 0.170732 | 0.170732 | 0.150094 | 0.041276 | 0.041276 | 0 | 0.301587 | 0.386762 | 1,541 | 103 | 65 | 14.961165 | 0.262434 | 0.121999 | 0 | 0.494118 | 0 | 0 | 0.032787 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e09a1b7388131ca361a24a122df607870ceb2f36 | 5,249 | py | Python | legacy/neural_qa/train.py | FrancisLiang/models-1 | e14d5bc1ab36d0dd11977f27cff54605bf99c945 | [
"Apache-2.0"
] | 4 | 2020-01-04T13:15:02.000Z | 2021-07-21T07:50:02.000Z | legacy/neural_qa/train.py | FrancisLiang/models-1 | e14d5bc1ab36d0dd11977f27cff54605bf99c945 | [
"Apache-2.0"
] | 2 | 2019-06-26T03:21:49.000Z | 2019-09-19T09:43:42.000Z | legacy/neural_qa/train.py | FrancisLiang/models-1 | e14d5bc1ab36d0dd11977f27cff54605bf99c945 | [
"Apache-2.0"
] | 3 | 2019-10-31T07:18:49.000Z | 2020-01-13T03:18:39.000Z | import sys
import os
import argparse
import numpy as np
import paddle.v2 as paddle
import reader
import utils
import network
import config
from utils import logger
def save_model(trainer, model_save_dir, parameters, pass_id):
    f = os.path.join(model_save_dir, "params_pass_%05d.tar.gz" % pass_id)
    logger.info("model saved to %s" % f)
    with utils.open_file(f, "w") as f:
        trainer.save_parameter_to_tar(f)


def show_parameter_init_info(parameters):
    """
    Print the information of initialization mean and standard deviation of
    parameters

    :param parameters: the parameters created in a model
    """
    logger.info("Parameter init info:")
    for p in parameters:
        p_val = parameters.get(p)
        logger.info(("%-25s : initial_mean=%-7.4f initial_std=%-7.4f "
                     "actual_mean=%-7.4f actual_std=%-7.4f dims=%s") %
                    (p, parameters.__param_conf__[p].initial_mean,
                     parameters.__param_conf__[p].initial_std, p_val.mean(),
                     p_val.std(), parameters.__param_conf__[p].dims))
    logger.info("\n")


def show_parameter_status(parameters):
    """
    Print some statistical information of parameters in a network

    :param parameters: the parameters created in a model
    """
    for p in parameters:
        abs_val = np.abs(parameters.get(p))
        abs_grad = np.abs(parameters.get_grad(p))
        logger.info(
            ("%-25s avg_abs_val=%-10.6f max_val=%-10.6f avg_abs_grad=%-10.6f "
             "max_grad=%-10.6f min_val=%-10.6f min_grad=%-10.6f") %
            (p, abs_val.mean(), abs_val.max(), abs_grad.mean(), abs_grad.max(),
             abs_val.min(), abs_grad.min()))


def train(conf):
    if not os.path.exists(conf.model_save_dir):
        os.makedirs(conf.model_save_dir, mode=0755)

    settings = reader.Settings(
        vocab=conf.vocab,
        is_training=True,
        label_schema=conf.label_schema,
        negative_sample_ratio=conf.negative_sample_ratio,
        hit_ans_negative_sample_ratio=conf.hit_ans_negative_sample_ratio,
        keep_first_b=conf.keep_first_b,
        seed=conf.seed)

    samples_per_pass = conf.batch_size * conf.batches_per_pass
    train_reader = paddle.batch(
        paddle.reader.buffered(
            reader.create_reader(conf.train_data_path, settings,
                                 samples_per_pass),
            size=samples_per_pass),
        batch_size=conf.batch_size)

    # TODO(lipeng17) v2 API does not support parallel_nn yet. Therefore, we can
    # only use CPU currently
    paddle.init(
        use_gpu=conf.use_gpu,
        trainer_count=conf.trainer_count,
        seed=conf.paddle_seed)

    # network config
    cost = network.training_net(conf)

    # create parameters
    # NOTE: parameter values are not initialized here, therefore, we need to
    # print parameter initialization info in the beginning of the first batch
    parameters = paddle.parameters.create(cost)

    # create optimizer
    rmsprop_optimizer = paddle.optimizer.RMSProp(
        learning_rate=conf.learning_rate,
        rho=conf.rho,
        epsilon=conf.epsilon,
        model_average=paddle.optimizer.ModelAverage(
            average_window=conf.average_window,
            max_average_window=conf.max_average_window))

    # create trainer
    trainer = paddle.trainer.SGD(cost=cost,
                                 parameters=parameters,
                                 update_equation=rmsprop_optimizer)

    # begin training network
    def _event_handler(event):
        """
        Define end batch and end pass event handler
        """
        if isinstance(event, paddle.event.EndIteration):
            sys.stderr.write(".")
            batch_num = event.batch_id + 1
            total_batch = conf.batches_per_pass * event.pass_id + batch_num
            if batch_num % conf.log_period == 0:
                sys.stderr.write("\n")
                logger.info("Total batch=%d Batch=%d CurrentCost=%f Eval: %s" \
                            % (total_batch, batch_num, event.cost, event.metrics))
            if batch_num % conf.show_parameter_status_period == 0:
                show_parameter_status(parameters)
        elif isinstance(event, paddle.event.EndPass):
            save_model(trainer, conf.model_save_dir, parameters, event.pass_id)
        elif isinstance(event, paddle.event.BeginIteration):
            if event.batch_id == 0 and event.pass_id == 0:
                show_parameter_init_info(parameters)

    ## for debugging purpose
    #with utils.open_file("config", "w") as config:
    #    print >> config, paddle.layer.parse_network(cost)

    trainer.train(
        reader=train_reader,
        event_handler=_event_handler,
        feeding=network.feeding,
        num_passes=conf.num_passes)
    logger.info("Training has finished.")


def main():
    conf = config.TrainingConfig()
    logger.info("loading word embeddings...")
    conf.vocab, conf.wordvecs = utils.load_wordvecs(conf.word_dict_path,
                                                    conf.wordvecs_path)
    logger.info("loaded")
    logger.info("length of word dictionary is : %d." % len(conf.vocab))
    train(conf)


if __name__ == "__main__":
    main()
| 33.864516 | 79 | 0.638407 | 671 | 5,249 | 4.749627 | 0.281669 | 0.031377 | 0.018826 | 0.018826 | 0.097898 | 0.026985 | 0.026985 | 0.026985 | 0 | 0 | 0 | 0.011604 | 0.261193 | 5,249 | 154 | 80 | 34.084416 | 0.810211 | 0.085159 | 0 | 0.02 | 0 | 0 | 0.093065 | 0.005195 | 0 | 0 | 0 | 0.006494 | 0 | 0 | null | null | 0.1 | 0.1 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
e0a33a3a50dffea3e1b1459aa66fc2a300fcaf7e | 879 | py | Python | tests/test_random.py | hirnimeshrampuresoftware/python-tcod | c82d60eaaf12e50b405d55df1026c1d00dd283b6 | [
"BSD-2-Clause"
] | 231 | 2018-06-28T10:07:41.000Z | 2022-03-20T16:17:19.000Z | tests/test_random.py | hirnimeshrampuresoftware/python-tcod | c82d60eaaf12e50b405d55df1026c1d00dd283b6 | [
"BSD-2-Clause"
] | 66 | 2018-06-27T19:04:25.000Z | 2022-03-30T21:15:15.000Z | tests/test_random.py | hirnimeshrampuresoftware/python-tcod | c82d60eaaf12e50b405d55df1026c1d00dd283b6 | [
"BSD-2-Clause"
] | 31 | 2018-09-12T00:35:42.000Z | 2022-03-20T16:17:22.000Z | import copy
import pickle
import tcod
def test_tcod_random() -> None:
    rand = tcod.random.Random(tcod.random.COMPLEMENTARY_MULTIPLY_WITH_CARRY)
    assert 0 <= rand.randint(0, 100) <= 100
    assert 0 <= rand.uniform(0, 100) <= 100
    rand.guass(0, 1)
    rand.inverse_guass(0, 1)


def test_tcod_random_copy() -> None:
    rand = tcod.random.Random(tcod.random.MERSENNE_TWISTER)
    rand2 = copy.copy(rand)
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)


def test_tcod_random_pickle() -> None:
    rand = tcod.random.Random(tcod.random.MERSENNE_TWISTER)
    rand2 = pickle.loads(pickle.dumps(rand))
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)
    assert rand.uniform(0, 1) == rand2.uniform(0, 1)
| 30.310345 | 76 | 0.675768 | 136 | 879 | 4.264706 | 0.191176 | 0.048276 | 0.186207 | 0.186207 | 0.662069 | 0.662069 | 0.6 | 0.541379 | 0.541379 | 0.541379 | 0 | 0.071724 | 0.175199 | 879 | 28 | 77 | 31.392857 | 0.728276 | 0 | 0 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.380952 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e0b1b9a256ca71156990f5651aa970e6f745d293 | 1,524 | py | Python | apps/user/urls.py | mrf-foundation/ckios_v1 | 3556a99ba5e01f00e137fd124903ace77d2cba28 | [
"Apache-2.0"
] | null | null | null | apps/user/urls.py | mrf-foundation/ckios_v1 | 3556a99ba5e01f00e137fd124903ace77d2cba28 | [
"Apache-2.0"
] | null | null | null | apps/user/urls.py | mrf-foundation/ckios_v1 | 3556a99ba5e01f00e137fd124903ace77d2cba28 | [
"Apache-2.0"
] | null | null | null | # -*- encoding: utf-8 -*-
"""
Copyright (c) 2021 ronyman.com
"""
from django.contrib import admin
from django.contrib.auth import views as auth_views
from django.urls import path, include
from django.conf import settings
from django.conf.urls.static import static
from apps.user import views as user_views
from .views import EditProfilePage

urlpatterns = [
    # User
    path('admin/', admin.site.urls),
    path('register/', user_views.register, name='register'),
    path('login/', auth_views.LoginView.as_view(template_name='registration/login.html'), name='login'),
    path('profile/', user_views.profile, name='profile'),
    path('edit_profile/', user_views.edit_profile, name='edit_profile'),
    path("myprofile/", user_views.myprofile, name="Myprofile"),
    path('logout/', auth_views.LogoutView.as_view(template_name='users/logout.html'), name='logout'),
    # path('tinymce/', include('tinymce.urls')),
    path('edit_profile_page/', user_views.EditProfilePage.as_view(template_name='registration/edit_profile_page.html'), name='edit_profile_page'),

    # For password reset
    path('admin/password_reset/', auth_views.PasswordResetView.as_view(), name='admin_password_reset'),
    path('admin/password_reset/done/', auth_views.PasswordResetDoneView.as_view(), name='password_reset_done'),
    path('reset/<uidb64>/<token>/', auth_views.PasswordResetConfirmView.as_view(), name='password_reset_confirm'),
    path('reset/done/', auth_views.PasswordResetCompleteView.as_view(), name='password_reset_complete'),
] | 47.625 | 146 | 0.75 | 197 | 1,524 | 5.588832 | 0.284264 | 0.057221 | 0.036331 | 0.049046 | 0.117166 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005069 | 0.093832 | 1,524 | 32 | 147 | 47.625 | 0.79218 | 0.079396 | 0 | 0 | 0 | 0 | 0.27351 | 0.124192 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.190476 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
e0cbddbdfa807e2e3192b9b0f5e91d704ca43fe7 | 1,684 | py | Python | tnnt/uniqdeaths.py | tnnt-devteam/python-backend | 1ecb0ddaccf176726739b64212831d038a7463a0 | [
"MIT"
] | null | null | null | tnnt/uniqdeaths.py | tnnt-devteam/python-backend | 1ecb0ddaccf176726739b64212831d038a7463a0 | [
"MIT"
] | 10 | 2021-07-19T13:04:20.000Z | 2021-12-02T18:36:29.000Z | tnnt/uniqdeaths.py | tnnt-devteam/python-backend | 1ecb0ddaccf176726739b64212831d038a7463a0 | [
"MIT"
] | 1 | 2021-10-17T10:36:51.000Z | 2021-10-17T10:36:51.000Z | from tnnt.settings import UNIQUE_DEATH_REJECTIONS, UNIQUE_DEATH_NORMALIZATIONS
import re
def normalize(death):
    # Given a death string, apply normalizations from settings.
    for regtuple in UNIQUE_DEATH_NORMALIZATIONS:
        death = re.sub(regtuple[0], regtuple[1], death)
    return death


def reject(death):
    # Given a death string, return True if it should be excluded as a
    # unique death and False if not.
    for regex in UNIQUE_DEATH_REJECTIONS:
        if re.search(regex, death) is not None:
            return True
    return False


def compile_unique_deaths(gameQS):
    # Given a QuerySet of Game objects, return a set containing strings of all
    # the unique deaths from those games after rejections and normalizations are
    # applied.
    # This is primarily for aggregation, and runs somewhat faster than it would
    # if we wanted to return the players who got a death and when. This is a
    # post 2021 TODO.

    # First, get all unique, un-normalized deaths.
    raw_uniq_deaths = \
        gameQS.values_list('death', flat=True).distinct()
    # Then apply normalizations and rejections, and turn it into a set
    # to automatically remove any duplicates produced by normalization.
    return set(normalize(d) for d in raw_uniq_deaths if not reject(d))
# post 2021 TODO: showing unique deaths of a player or clan:
# 1. list(Game.objects.values_list('death', 'player__name', 'endtime'))
# 2. iterate through list, filtering any death for which reject is True, and
# normalizing all death strings.
# 3. sort by first death, then endtime.
# 4. filter again by taking only the first player/endtime for each death and
# ignoring later ones.
| 42.1 | 80 | 0.726841 | 252 | 1,684 | 4.785714 | 0.460317 | 0.045605 | 0.034826 | 0.026534 | 0.036484 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010518 | 0.20962 | 1,684 | 39 | 81 | 43.179487 | 0.895567 | 0.606295 | 0 | 0 | 0 | 0 | 0.007776 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 1 | 0.2 | false | 0 | 0.133333 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
e0cd77f1011874355803f2699f0949c70a877b16 | 286 | py | Python | locan/data/hulls/__init__.py | super-resolution/Locan | 94ed7759f7d7ceddee7c7feaabff80010cfedf30 | [
"BSD-3-Clause"
] | 8 | 2021-11-25T20:05:49.000Z | 2022-03-27T17:45:00.000Z | locan/data/hulls/__init__.py | super-resolution/Locan | 94ed7759f7d7ceddee7c7feaabff80010cfedf30 | [
"BSD-3-Clause"
] | 4 | 2021-12-15T22:39:20.000Z | 2022-03-11T17:35:34.000Z | locan/data/hulls/__init__.py | super-resolution/Locan | 94ed7759f7d7ceddee7c7feaabff80010cfedf30 | [
"BSD-3-Clause"
] | 1 | 2022-03-22T19:53:13.000Z | 2022-03-22T19:53:13.000Z | """
Hull objects of localization data.
Submodules:
-----------
.. autosummary::
   :toctree: ./

   hull
   alpha_shape
"""
from locan.data.hulls.alpha_shape import *
from locan.data.hulls.hull import *
__all__ = []
__all__.extend(hull.__all__)
__all__.extend(alpha_shape.__all__)
| 13.619048 | 42 | 0.688811 | 34 | 286 | 5.117647 | 0.470588 | 0.172414 | 0.149425 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15035 | 286 | 20 | 43 | 14.3 | 0.716049 | 0.409091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e0d53d1eacb61e87932581de548a87b7c4071732 | 3,258 | py | Python | networks/metrics.py | pabloduque0/cnn_deconv_viz | 3fc3d8a9dbad8e8e28d4df4023bdb438e4c9cf85 | [
"MIT"
] | null | null | null | networks/metrics.py | pabloduque0/cnn_deconv_viz | 3fc3d8a9dbad8e8e28d4df4023bdb438e4c9cf85 | [
"MIT"
] | null | null | null | networks/metrics.py | pabloduque0/cnn_deconv_viz | 3fc3d8a9dbad8e8e28d4df4023bdb438e4c9cf85 | [
"MIT"
] | null | null | null | from keras import backend as K
import tensorflow as tf
import numpy as np
def custom_dice_coefficient(y_true, y_pred, recall_weight=0.3):
    recall_weight = tf.Variable(recall_weight, dtype=tf.float32)
    regular_dice = dice_coefficient(y_true, y_pred)
    recall = lession_recall(y_true, y_pred)
    recall = tf.cast(recall, dtype=tf.float32)
    recall_addition = recall * regular_dice * recall_weight
    return regular_dice + recall_addition


def lession_recall(y_true, y_pred):
    conn_comp_true = tf.contrib.image.connected_components(tf.cast(tf.squeeze(y_true, axis=[-1]), tf.bool))
    conn_comp_pred = conn_comp_true * tf.cast(tf.squeeze(y_pred, axis=[-1]), tf.int32)

    n_conn_comp_true, _ = tf.unique(K.flatten(conn_comp_true))
    n_conn_comp_pred, _ = tf.unique(K.flatten(conn_comp_pred))

    n_conn_comp_true = tf.size(input=n_conn_comp_true) - 1
    n_conn_comp_pred = tf.size(input=n_conn_comp_pred) - 1

    recall = tf.cond(pred=tf.equal(n_conn_comp_pred, tf.Variable(0)),
                     true_fn=lambda: tf.Variable(1.0, dtype=tf.float64),
                     false_fn=lambda: n_conn_comp_pred / n_conn_comp_true)
    return recall


def thresholded_dice(y_true, y_pred):
    y_true = tf.math.floor(y_true + 0.6)
    return dice_coefficient(y_true, y_pred)


def thresholded_dice_loss(y_true, y_pred):
    return -thresholded_dice(y_true, y_pred)


def custom_dice_coefficient_loss(y_true, y_pred):
    return -custom_dice_coefficient(y_true, y_pred)


def dice_coefficient(y_true, y_pred, smooth=0.1):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_pred_f * y_true_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)


def dice_coefficient_loss(y_true, y_pred):
    return -dice_coefficient(y_true, y_pred)


def sigmoid(x):
    return 1. / (1. + K.exp(-x))


def segmentation_recall(y_true, y_pred):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    recall = K.sum(y_pred_f * y_true_f) / tf.cast(K.sum(y_true_f), tf.float32)
    return recall


def weighted_crossentropy_pixelwise(y_true, y_pred):
    y_pred = tf.clip_by_value(y_pred, 1e-7, 1 - 1e-7)
    y_pred = K.log(y_pred / (1 - y_pred))

    wmh_indexes = np.where(y_true == 1.0)
    weights = np.repeat(1.0, 240 * 240)
    weights = np.reshape(weights, (1, 240, 240, 1))
    weights[wmh_indexes] = 5000.0

    crossentropy = (y_true * weights * -K.log(sigmoid(y_pred)) +
                    (1 - y_true * weights) * -K.log(1 - sigmoid(y_pred)))
    return crossentropy


def prediction_count(y_true, y_pred):
    return tf.math.count_nonzero(y_pred)


def label_count(y_true, y_pred):
    return tf.math.count_nonzero(y_true)


def prediction_sum(y_true, y_pred):
    return tf.reduce_sum(input_tensor=y_pred)


def label_sum(y_true, y_pred):
    return tf.reduce_sum(input_tensor=y_true)
custom_dice_coef = custom_dice_coefficient
custom_dice_loss = custom_dice_coefficient_loss
dice_coef = dice_coefficient
dice_coef_loss = dice_coefficient_loss
weighted_crossentropy = weighted_crossentropy_pixelwise
predicted_count = prediction_count
predicted_sum = prediction_sum
ground_truth_count = label_count
ground_truth_sum = label_sum
pixel_recall = segmentation_recall
obj_recall = lession_recall | 30.448598 | 126 | 0.736648 | 551 | 3,258 | 3.990926 | 0.170599 | 0.081855 | 0.057299 | 0.095498 | 0.452933 | 0.360164 | 0.252842 | 0.18372 | 0.109141 | 0.109141 | 0 | 0.020697 | 0.154696 | 3,258 | 107 | 127 | 30.448598 | 0.777778 | 0 | 0 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.202899 | false | 0 | 0.043478 | 0.115942 | 0.449275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
e0e356f37fd9ccf2de0310d2d8f77f903645f43e | 324 | py | Python | apps/odoo/lib/odoo-10.0.post20170615-py2.7.egg/odoo/addons/stock/models/web_planner.py | gtfarng/Odoo_migrade | 9cc28fae4c379e407645248a29d22139925eafe7 | [
"Apache-2.0"
] | 1 | 2019-12-19T01:53:13.000Z | 2019-12-19T01:53:13.000Z | apps/odoo/lib/odoo-10.0.post20170615-py2.7.egg/odoo/addons/stock/models/web_planner.py | gtfarng/Odoo_migrade | 9cc28fae4c379e407645248a29d22139925eafe7 | [
"Apache-2.0"
] | null | null | null | apps/odoo/lib/odoo-10.0.post20170615-py2.7.egg/odoo/addons/stock/models/web_planner.py | gtfarng/Odoo_migrade | 9cc28fae4c379e407645248a29d22139925eafe7 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from odoo import models
class PlannerInventory(models.Model):
_inherit = 'web.planner'
def _get_planner_application(self):
planner = super(PlannerInventory, self)._get_planner_application()
planner.append(['planner_inventory', 'Inventory Planner'])
return planner
| 24.923077 | 74 | 0.697531 | 34 | 324 | 6.411765 | 0.617647 | 0.091743 | 0.192661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003802 | 0.188272 | 324 | 12 | 75 | 27 | 0.825095 | 0.064815 | 0 | 0 | 0 | 0 | 0.149502 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
e0eeafb1dffc9fb35d396d75f5c0ceb8be5469f6 | 545 | py | Python | django/currencies/migrations/0003_auto_20211121_0701.py | AngelOnFira/megagame-controller | 033fec84babf80ffd0868a0f7d946ac4c18b061c | [
"MIT"
] | null | null | null | django/currencies/migrations/0003_auto_20211121_0701.py | AngelOnFira/megagame-controller | 033fec84babf80ffd0868a0f7d946ac4c18b061c | [
"MIT"
] | 1 | 2022-03-03T21:56:12.000Z | 2022-03-03T21:56:12.000Z | django/currencies/migrations/0003_auto_20211121_0701.py | AngelOnFira/megagame-controller | 033fec84babf80ffd0868a0f7d946ac4c18b061c | [
"MIT"
] | null | null | null | # Generated by Django 3.2.8 on 2021-11-21 12:01
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("currencies", "0002_initial"),
]
operations = [
migrations.AddField(
model_name="payment",
name="completed",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="payment",
name="completion_amount",
field=models.IntegerField(default=0),
),
]
| 22.708333 | 53 | 0.577982 | 52 | 545 | 5.980769 | 0.711538 | 0.115756 | 0.14791 | 0.173633 | 0.244373 | 0.244373 | 0 | 0 | 0 | 0 | 0 | 0.05305 | 0.308257 | 545 | 23 | 54 | 23.695652 | 0.771883 | 0.082569 | 0 | 0.352941 | 1 | 0 | 0.124498 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e0f2ec36c80ac0eba008f5fa3fbe488f2f46b78b | 171 | py | Python | modulo-3/aulas/modulos e pacotes/uteis.py | Luis-Felipe-N/curso-em-video-python | 09ff58ae31ae0360ebec74de609011d527956065 | [
"MIT"
] | null | null | null | modulo-3/aulas/modulos e pacotes/uteis.py | Luis-Felipe-N/curso-em-video-python | 09ff58ae31ae0360ebec74de609011d527956065 | [
"MIT"
] | null | null | null | modulo-3/aulas/modulos e pacotes/uteis.py | Luis-Felipe-N/curso-em-video-python | 09ff58ae31ae0360ebec74de609011d527956065 | [
"MIT"
] | null | null | null | def fatorial(n):
f = 1
while n != 0:
f *= n
n -= 1
return f
def dobro(n):
n *= 2
return n
def triplo(n):
n *= 3
return n
| 11.4 | 17 | 0.403509 | 28 | 171 | 2.464286 | 0.428571 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.473684 | 171 | 14 | 18 | 12.214286 | 0.711111 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e0f9e97bf2ab57eb94b8a3dc17de1f26affd17b5 | 121 | py | Python | output/models/ms_data/regex/re_l32_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 1 | 2021-08-14T17:59:21.000Z | 2021-08-14T17:59:21.000Z | output/models/ms_data/regex/re_l32_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | 4 | 2020-02-12T21:30:44.000Z | 2020-04-15T20:06:46.000Z | output/models/ms_data/regex/re_l32_xsd/__init__.py | tefra/xsdata-w3c-tests | b6b6a4ac4e0ab610e4b50d868510a8b7105b1a5f | [
"MIT"
] | null | null | null | from output.models.ms_data.regex.re_l32_xsd.re_l32 import (
Regex,
Doc,
)
__all__ = [
"Regex",
"Doc",
]
| 12.1 | 59 | 0.603306 | 17 | 121 | 3.823529 | 0.705882 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043956 | 0.247934 | 121 | 9 | 60 | 13.444444 | 0.67033 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4607ba171b6d586a01f576a069c7a776194711fd | 2,646 | py | Python | app/forms.py | Rahmatullina/FinalYearProject | 326f521b9f600dbbc7ace2223bd5aafc79b2267c | [
"Apache-2.0"
] | null | null | null | app/forms.py | Rahmatullina/FinalYearProject | 326f521b9f600dbbc7ace2223bd5aafc79b2267c | [
"Apache-2.0"
] | 9 | 2020-09-26T01:09:35.000Z | 2022-02-10T01:32:30.000Z | app/forms.py | Rahmatullina/FinalYearProject | 326f521b9f600dbbc7ace2223bd5aafc79b2267c | [
"Apache-2.0"
] | null | null | null | from django import forms
from django.contrib.auth.forms import PasswordResetForm, SetPasswordForm
# from .models import RegionModel
# from .models import SERVICE_CHOICES, REGION_CHOICES
from django.contrib.auth import authenticate
# from django.contrib.auth.forms import UserCreationForm, UserChangeForm
# from .models import CustomUser
class LoginForm(forms.Form):
    username = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control'}), max_length=100)
    password = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control', 'type': 'password'}), max_length=100)

    def clean(self):
        username = self.cleaned_data.get('username')
        password = self.cleaned_data.get('password')
        user = authenticate(username=username, password=password)
        if not user or not user.is_active:
            raise forms.ValidationError("Sorry, that login was invalid or user is inactive. Please try again.")
        return self.cleaned_data

    def login(self, request):
        username = self.cleaned_data.get('username')
        password = self.cleaned_data.get('password')
        user = authenticate(username=username, password=password)
        return user
# class PassResetForm(PasswordResetForm):
# email = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control', 'placeholder': 'Enter email',
# 'type':'email'}), max_length=100)
#
#
# class PassResetConfirmForm(SetPasswordForm):
# new_password1 = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control',
# 'placeholder':'Enter new password',
# 'type':'password'}), max_length=100)
# new_password2 = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control',
# 'placeholder': 'Enter new password again',
# 'type': 'password'}), max_length=100)
# class CustomUserCreationForm(UserCreationForm):
#
# class Meta(UserCreationForm):
# model = CustomUser
# fields = UserCreationForm.Meta.fields + ('region_name',)
#
#
# class CustomUserChangeForm(UserChangeForm):
# email = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control'}), max_length=100)
# username = forms.CharField(widget=forms.TextInput(attrs={'class': 'form-control'}), max_length=254)
#
# class Meta:
# model = CustomUser
# fields = ('email','username')
| 46.421053 | 120 | 0.62585 | 260 | 2,646 | 6.3 | 0.273077 | 0.059829 | 0.08547 | 0.106838 | 0.538462 | 0.494505 | 0.455433 | 0.455433 | 0.455433 | 0.421856 | 0 | 0.011663 | 0.254724 | 2,646 | 56 | 121 | 47.25 | 0.818966 | 0.595994 | 0 | 0.333333 | 0 | 0 | 0.140655 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0.333333 | 0.166667 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
460a2bcd85d3d00d44d62b6c67fe157741f17339 | 363 | py | Python | biblioteca/views.py | Dagmoores/ProjetoIntegradorIUnivesp | 86f5004c359b760cceec87bfa905db16e3375653 | [
"MIT"
] | null | null | null | biblioteca/views.py | Dagmoores/ProjetoIntegradorIUnivesp | 86f5004c359b760cceec87bfa905db16e3375653 | [
"MIT"
] | null | null | null | biblioteca/views.py | Dagmoores/ProjetoIntegradorIUnivesp | 86f5004c359b760cceec87bfa905db16e3375653 | [
"MIT"
] | null | null | null | from django.views.generic import DetailView, ListView, TemplateView
from .models import Books
class BooksListView(ListView):
model = Books
class BooksDeitalView(DetailView):
model = Books
class Home(TemplateView):
template_name = './biblioteca/index.html'
class TermsOfService(TemplateView):
template_name = './biblioteca/termsOfService.html' | 24.2 | 67 | 0.77135 | 38 | 363 | 7.315789 | 0.552632 | 0.107914 | 0.107914 | 0.244604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137741 | 363 | 15 | 68 | 24.2 | 0.888179 | 0 | 0 | 0.2 | 0 | 0 | 0.151099 | 0.151099 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
460d75e0442c54e12cc43f9bb046096139d2369e | 2,741 | py | Python | recipes/Python/52228_Remote_control_with_telnetlib/recipe-52228.py | tdiprima/code | 61a74f5f93da087d27c70b2efe779ac6bd2a3b4f | [
"MIT"
] | 2,023 | 2017-07-29T09:34:46.000Z | 2022-03-24T08:00:45.000Z | recipes/Python/52228_Remote_control_with_telnetlib/recipe-52228.py | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 32 | 2017-09-02T17:20:08.000Z | 2022-02-11T17:49:37.000Z | recipes/Python/52228_Remote_control_with_telnetlib/recipe-52228.py | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 780 | 2017-07-28T19:23:28.000Z | 2022-03-25T20:39:41.000Z | # auto_telnet.py - remote control via telnet
import os, sys, string, telnetlib
from getpass import getpass
class AutoTelnet:
def __init__(self, user_list, cmd_list, **kw):
self.host = kw.get('host', 'localhost')
self.timeout = kw.get('timeout', 600)
self.command_prompt = kw.get('command_prompt', "$ ")
self.passwd = {}
for user in user_list:
self.passwd[user] = getpass("Enter user '%s' password: " % user)
self.telnet = telnetlib.Telnet()
for user in user_list:
self.telnet.open(self.host)
ok = self.action(user, cmd_list)
if not ok:
print "Unable to process:", user
self.telnet.close()
def action(self, user, cmd_list):
t = self.telnet
t.write("\n")
login_prompt = "login: "
response = t.read_until(login_prompt, 5)
if string.count(response, login_prompt):
print response
else:
return 0
password_prompt = "Password:"
t.write("%s\n" % user)
response = t.read_until(password_prompt, 3)
if string.count(response, password_prompt):
print response
else:
return 0
t.write("%s\n" % self.passwd[user])
response = t.read_until(self.command_prompt, 5)
if not string.count(response, self.command_prompt):
return 0
for cmd in cmd_list:
t.write("%s\n" % cmd)
response = t.read_until(self.command_prompt, self.timeout)
if not string.count(response, self.command_prompt):
return 0
print response
return 1
if __name__ == '__main__':
basename = os.path.splitext(os.path.basename(sys.argv[0]))[0]
logname = os.environ.get("LOGNAME", os.environ.get("USERNAME"))
host = 'localhost'
import getopt
optlist, user_list = getopt.getopt(sys.argv[1:], 'c:f:h:')
usage = """
usage: %s [-h host] [-f cmdfile] [-c "command"] user1 user2 ...
-c command
-f command file
-h host (default: '%s')
Example: %s -c "echo $HOME" %s
""" % (basename, host, basename, logname)
if len(sys.argv) < 2:
print usage
sys.exit(1)
cmd_list = []
for (opt, optarg) in optlist:
if opt == '-f':
for r in open(optarg).readlines():
if string.rstrip(r):
cmd_list.append(r)
elif opt == '-c':
command = optarg
if command[0] == '"' and command[-1] == '"':
command = command[1:-1]
cmd_list.append(command)
elif opt == '-h':
host = optarg
autoTelnet = AutoTelnet(user_list, cmd_list, host=host)
| 34.2625 | 76 | 0.553448 | 343 | 2,741 | 4.300292 | 0.271137 | 0.037966 | 0.057627 | 0.048814 | 0.199322 | 0.181695 | 0.112542 | 0.065085 | 0.065085 | 0.065085 | 0 | 0.01174 | 0.316308 | 2,741 | 79 | 77 | 34.696203 | 0.775347 | 0.015323 | 0 | 0.175676 | 0 | 0 | 0.119021 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.094595 | 0.040541 | null | null | 0.067568 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
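The recipe above is Python 2 (`print` statements, the `string` module). Its core pattern — read until a prompt, then verify the prompt actually appeared before proceeding — can be sketched in Python 3 without a live telnet session; `saw_prompt` is a hypothetical helper name, not part of the recipe:

```python
def saw_prompt(response: str, prompt: str) -> bool:
    """Python 3 equivalent of the recipe's string.count(response, prompt) check."""
    return response.count(prompt) > 0

# The recipe treats a response that never echoes the expected prompt
# (e.g. a refused connection) as a failure and returns 0 from action().
print(saw_prompt("login: ", "login: "))             # True
print(saw_prompt("Connection refused", "login: "))  # False
```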
4610420d2938e979a1edec925cfaa9db7b11366a | 755 | py | Python | Codes/Python32/Lib/importlib/test/extension/test_path_hook.py | eyantra/FireBird_Swiss_Knife | cac322cf28e2d690b86ba28a75e87551e5e47988 | [
"MIT"
] | 319 | 2016-09-22T15:54:48.000Z | 2022-03-18T02:36:58.000Z | Codes/Python32/Lib/importlib/test/extension/test_path_hook.py | eyantra/FireBird_Swiss_Knife | cac322cf28e2d690b86ba28a75e87551e5e47988 | [
"MIT"
] | 9 | 2016-11-03T21:56:41.000Z | 2020-08-09T19:27:37.000Z | Codes/Python32/Lib/importlib/test/extension/test_path_hook.py | eyantra/FireBird_Swiss_Knife | cac322cf28e2d690b86ba28a75e87551e5e47988 | [
"MIT"
] | 27 | 2016-10-06T16:05:32.000Z | 2022-03-18T02:37:00.000Z | from importlib import _bootstrap
from . import util
import collections
import imp
import sys
import unittest
class PathHookTests(unittest.TestCase):
"""Test the path hook for extension modules."""
# XXX Should it only succeed for pre-existing directories?
# XXX Should it only work for directories containing an extension module?
def hook(self, entry):
return _bootstrap._file_path_hook(entry)
def test_success(self):
# Path hook should handle a directory where a known extension module
# exists.
self.assertTrue(hasattr(self.hook(util.PATH), 'find_module'))
def test_main():
from test.support import run_unittest
run_unittest(PathHookTests)
if __name__ == '__main__':
test_main()
| 23.59375 | 77 | 0.721854 | 98 | 755 | 5.367347 | 0.520408 | 0.045627 | 0.041825 | 0.057034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205298 | 755 | 31 | 78 | 24.354839 | 0.876667 | 0.325828 | 0 | 0 | 0 | 0 | 0.038 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.1875 | false | 0 | 0.4375 | 0.0625 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1ca1fbc58c354b91137807db3948258d4447da15 | 4,395 | py | Python | minos/api_gateway/common/exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | 3 | 2021-05-14T08:13:09.000Z | 2021-05-26T11:25:35.000Z | minos/api_gateway/common/exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | 27 | 2021-05-13T08:43:19.000Z | 2021-08-24T17:19:36.000Z | minos/api_gateway/common/exceptions.py | Clariteia/api_gateway_common | e68095f31091699fc6cc4537bd6acf97a8dc6c3e | [
"MIT"
] | null | null | null | """
Copyright (C) 2021 Clariteia SL
This file is part of minos framework.
Minos framework can not be copied and/or distributed without the express permission of Clariteia SL.
"""
from typing import (
Any,
Type,
)
class MinosException(Exception):
"""Exception class for import packages or modules"""
__slots__ = "_message"
def __init__(self, error_message: str):
self._message = error_message
def __repr__(self):
return f"{type(self).__name__}(message={repr(self._message)})"
def __str__(self) -> str:
"""represent in a string format the error message passed during the instantiation"""
return self._message
class MinosImportException(MinosException):
pass
class MinosProtocolException(MinosException):
pass
class MinosMessageException(MinosException):
pass
class MinosConfigException(MinosException):
"""Base config exception."""
class MinosConfigDefaultAlreadySetException(MinosConfigException):
"""Exception to be raised when some config is already set as default."""
class MinosRepositoryException(MinosException):
"""Base repository exception."""
class MinosRepositoryAggregateNotFoundException(MinosRepositoryException):
"""Exception to be raised when some aggregate is not found on the repository."""
class MinosRepositoryDeletedAggregateException(MinosRepositoryException):
"""Exception to be raised when some aggregate is already deleted from the repository."""
class MinosRepositoryManuallySetAggregateIdException(MinosRepositoryException):
"""Exception to be raised when some aggregate is trying to be created with a manually set id."""
class MinosRepositoryManuallySetAggregateVersionException(MinosRepositoryException):
"""Exception to be raised when some aggregate is trying to be created with a manually set version."""
class MinosRepositoryUnknownActionException(MinosRepositoryException):
"""Exception to be raised when some entry tries to perform an unknown action."""
class MinosRepositoryNonProvidedException(MinosRepositoryException):
"""Exception to be raised when a repository is needed but none is set."""
class MinosModelException(MinosException):
"""Exception to be raised when some mandatory condition is not satisfied by a model."""
pass
class EmptyMinosModelSequenceException(MinosModelException):
"""Exception to be raised when a sequence must be not empty, but it is empty."""
pass
class MultiTypeMinosModelSequenceException(MinosModelException):
"""Exception to be raised when a sequence doesn't satisfy the condition to have the same type for each item."""
pass
class MinosModelAttributeException(MinosException):
"""Base model attributes exception."""
pass
class MinosReqAttributeException(MinosModelAttributeException):
"""Exception to be raised when some required attributes are not provided."""
pass
class MinosTypeAttributeException(MinosModelAttributeException):
"""Exception to be raised when there are any mismatching between the expected and observed attribute type."""
def __init__(self, name: str, target_type: Type, value: Any):
self.name = name
self.target_type = target_type
self.value = value
super().__init__(
f"The {repr(target_type)} expected type for {repr(name)} does not match with "
f"the given data type: {type(value)}"
)
class MinosMalformedAttributeException(MinosModelAttributeException):
"""Exception to be raised when there are any kind of problems with the type definition."""
pass
class MinosParseAttributeException(MinosModelAttributeException):
"""Exception to be raised when there are any kind of problems with the parsing logic."""
def __init__(self, name: str, value: Any, exception: Exception):
self.name = name
self.value = value
self.exception = exception
super().__init__(f"{repr(exception)} was raised while parsing {repr(name)} field with {repr(value)} value.")
class MinosAttributeValidationException(MinosModelAttributeException):
"""Exception to be raised when some fields are not valid."""
def __init__(self, name: str, value: Any):
self.name = name
self.value = value
super().__init__(f"{repr(value)} value does not pass the {repr(name)} field validation.")
| 30.520833 | 116 | 0.739022 | 487 | 4,395 | 6.558522 | 0.305955 | 0.02129 | 0.061052 | 0.08923 | 0.340326 | 0.33469 | 0.268316 | 0.201628 | 0.169693 | 0.111459 | 0 | 0.00111 | 0.180432 | 4,395 | 143 | 117 | 30.734266 | 0.885619 | 0.363367 | 0 | 0.254237 | 0 | 0.016949 | 0.120446 | 0.019331 | 0 | 0 | 0 | 0 | 0 | 1 | 0.101695 | false | 0.169492 | 0.033898 | 0.016949 | 0.559322 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
1caab3990fb21bf24a942c5cae050f1ff9f8b143 | 305 | py | Python | tests/fixtures/db/sqlite.py | code-watch/meltano | 2afff73ed43669b5134dacfce61814f7f4e77a13 | [
"MIT"
] | 8 | 2020-06-16T22:29:54.000Z | 2021-06-04T11:57:57.000Z | tests/fixtures/db/sqlite.py | dotmesh-io/meltano | 4616d44ded9dff4e9ad19a9004349e9baa16ddd5 | [
"MIT"
] | 38 | 2019-12-09T06:53:33.000Z | 2022-03-29T22:29:19.000Z | tests/fixtures/db/sqlite.py | aroder/meltano | b8d1d812f4051b6334986fc6b447d23c4d0d5043 | [
"MIT"
] | 2 | 2020-06-16T22:29:59.000Z | 2020-11-04T05:47:50.000Z | import pytest
import os
import sqlalchemy
import contextlib
@pytest.fixture(scope="session")
def engine_uri(test_dir):
database_path = test_dir.joinpath("pytest_meltano.db")
try:
database_path.unlink()
except FileNotFoundError:
pass
return f"sqlite:///{database_path}"
| 17.941176 | 58 | 0.714754 | 37 | 305 | 5.702703 | 0.702703 | 0.170616 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186885 | 305 | 16 | 59 | 19.0625 | 0.850806 | 0 | 0 | 0 | 0 | 0 | 0.160656 | 0.081967 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0.083333 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
1cab4f72005e2a4605f4cdeb62be5961ecba1542 | 336 | py | Python | arxiv/canonical/util.py | arXiv/arxiv-canonical | a758ed88a568f23a834288aed4dcf7039c1340cf | [
"MIT"
] | 5 | 2019-05-26T22:52:54.000Z | 2021-11-05T12:27:11.000Z | arxiv/canonical/util.py | arXiv/arxiv-canonical | a758ed88a568f23a834288aed4dcf7039c1340cf | [
"MIT"
] | 31 | 2019-06-24T13:51:25.000Z | 2021-11-12T22:27:10.000Z | arxiv/canonical/util.py | arXiv/arxiv-canonical | a758ed88a568f23a834288aed4dcf7039c1340cf | [
"MIT"
] | 4 | 2019-01-10T22:01:54.000Z | 2021-11-05T12:26:58.000Z | """Various helpers and utilities that don't belong anywhere else."""
from typing import Dict, Generic, TypeVar
KeyType = TypeVar('KeyType')
ValueType = TypeVar('ValueType')
class GenericMonoDict(Dict[KeyType, ValueType]):
"""A dict with specific key and value types."""
def __getitem__(self, key: KeyType) -> ValueType: ... | 28 | 68 | 0.720238 | 41 | 336 | 5.804878 | 0.707317 | 0.201681 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151786 | 336 | 12 | 69 | 28 | 0.835088 | 0.309524 | 0 | 0 | 0 | 0 | 0.072072 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
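`GenericMonoDict` is a typing stub (its `__getitem__` body is `...`): it exists so static checkers see one key type and one value type. A runnable sketch of the same idea with concrete types, adding a runtime check that the stub itself does not perform (the runtime enforcement is an assumption for illustration):

```python
from typing import Dict

class IntValueDict(Dict[str, int]):
    """A dict whose values must all be ints; rejects anything else at runtime."""

    def __setitem__(self, key: str, value: int) -> None:
        if not isinstance(value, int):
            raise TypeError(f"expected int, got {type(value).__name__}")
        super().__setitem__(key, value)

d = IntValueDict()
d["answer"] = 42
print(d["answer"])  # 42
```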
1cd41a80f04199f3be841ce38a8ac4428c343606 | 6,620 | py | Python | show/drawing.py | nohamanona/poke-auto-fuka | 9d355694efa0168738795afb403fc89264dcaeae | [
"Apache-2.0"
] | 5 | 2019-12-31T18:38:52.000Z | 2021-01-07T08:57:17.000Z | show/drawing.py | nohamanona/poke-auto-fuka | 9d355694efa0168738795afb403fc89264dcaeae | [
"Apache-2.0"
] | null | null | null | show/drawing.py | nohamanona/poke-auto-fuka | 9d355694efa0168738795afb403fc89264dcaeae | [
"Apache-2.0"
] | 1 | 2020-03-03T08:14:47.000Z | 2020-03-03T08:14:47.000Z | import cv2
import numpy as np
class DrawingClass(object):
def __init__(self):
self.draw_command ='None'
self.frame_count = 0
def drawing(self, frame, fps, num_egg, htc_egg, state):
cv2.putText(frame, 'FPS: {:.2f}'.format(fps),
(10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), thickness=2)
cv2.putText(frame, 'Possessed EGG: {}'.format(num_egg),
(10, 100), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 255), thickness=2)
cv2.putText(frame, 'Hatched EGG: {}'.format(htc_egg),
(10, 130), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 255), thickness=2)
cv2.putText(frame, 'State: {}'.format(state),
(250, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 255), thickness=2)
return frame
def draw_controler(self, frame, command):
#print('draw',command)
if command =='LX MIN':
self.draw_command = 'LX MIN'
elif command =='LX MAX':
self.draw_command = 'LX MAX'
elif command =='LY MIN':
self.draw_command = 'LY MIN'
elif command =='LY MAX':
self.draw_command = 'LY MAX'
elif command =='Button A':
self.draw_command = 'Button A'
elif command =='Button B':
self.draw_command = 'Button B'
elif command =='Button X':
self.draw_command = 'Button X'
elif command =='Button Y':
self.draw_command = 'Button Y'
elif command =='HAT TOP':
self.draw_command = 'HAT TOP'
elif command =='HAT RIGHT':
self.draw_command = 'HAT RIGHT'
elif command =='HAT BOTTOM':
self.draw_command = 'HAT BOTTOM'
elif command =='HAT LEFT':
self.draw_command = 'HAT LEFT'
elif command =='Button START':
self.draw_command = 'Button START'
elif command =='STOP':
self.draw_command = 'STOP'
#stick
if self.draw_command =='LX MIN' or self.draw_command =='HAT LEFT':
cv2.circle(frame, (970, 490), 20, (0, 0, 255), thickness=-1)
elif self.draw_command =='LX MAX' or self.draw_command =='HAT RIGHT':
cv2.circle(frame, (1030, 490), 20, (0, 0, 255), thickness=-1)
elif self.draw_command =='LY MIN' or self.draw_command =='HAT TOP':
cv2.circle(frame, (1000, 460), 20, (0, 0, 255), thickness=-1)
elif self.draw_command =='LY MAX' or self.draw_command =='HAT BOTTOM':
cv2.circle(frame, (1000, 520), 20, (0, 0, 255), thickness=-1)
else:
cv2.circle(frame, (1000, 490), 20, (0, 0, 255), thickness=-1)
cv2.circle(frame, (1000, 490), 50, (0, 0, 255), thickness=2)
#button
if self.draw_command =='Button X':
cv2.circle(frame, (1180, 460), 15, (0, 0, 255), thickness=-1)
cv2.putText(frame, 'X',(1172, 468), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), thickness=2)
self.frame_count +=1
elif self.frame_count == 6:
cv2.circle(frame, (1180, 460), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'X',(1172, 468), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
self.frame_count =0
self.draw_command = 'None'
else:
cv2.circle(frame, (1180, 460), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'X',(1172, 468), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
if self.draw_command =='Button B':
cv2.circle(frame, (1180, 520), 15, (0, 0, 255), thickness=-1)
cv2.putText(frame, 'B',(1172, 528), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), thickness=2)
self.frame_count +=1
elif self.frame_count == 6:
cv2.circle(frame, (1180, 520), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'B',(1172, 528), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
self.frame_count =0
self.draw_command = 'None'
else:
cv2.circle(frame, (1180, 520), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'B',(1172, 528), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
if self.draw_command =='Button Y':
cv2.circle(frame, (1150, 490), 15, (0, 0, 255), thickness=-1)
cv2.putText(frame, 'Y',(1142, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), thickness=2)
self.frame_count +=1
elif self.frame_count == 6:
cv2.circle(frame, (1150, 490), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'Y',(1142, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
self.frame_count =0
self.draw_command = 'None'
else:
cv2.circle(frame, (1150, 490), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'Y',(1142, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
if self.draw_command =='Button A':
cv2.circle(frame, (1210, 490), 15, (0, 0, 255), thickness=-1)
cv2.putText(frame, 'A',(1202, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), thickness=2)
self.frame_count +=1
elif self.frame_count == 6:
cv2.circle(frame, (1210, 490), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'A',(1202, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
self.frame_count =0
self.draw_command = 'None'
else:
cv2.circle(frame, (1210, 490), 15, (0, 0, 255), thickness=2)
cv2.putText(frame, 'A',(1202, 498), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
if self.draw_command =='Button START':
cv2.circle(frame, (1130, 423), 10, (0, 0, 255), thickness=-1)
cv2.putText(frame, '+',(1120, 430), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), thickness=2)
self.frame_count +=1
elif self.frame_count == 6:
cv2.circle(frame, (1130, 423), 10, (0, 0, 255), thickness=1)
cv2.putText(frame, '+',(1120, 430), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
self.frame_count =0
self.draw_command = 'None'
else:
cv2.circle(frame, (1130, 423), 10, (0, 0, 255), thickness=1)
cv2.putText(frame, '+',(1120, 430), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), thickness=2)
return frame | 51.317829 | 109 | 0.540483 | 902 | 6,620 | 3.86031 | 0.09867 | 0.134406 | 0.050258 | 0.124641 | 0.814474 | 0.684951 | 0.652786 | 0.647042 | 0.647042 | 0.637565 | 0 | 0.149515 | 0.299849 | 6,620 | 129 | 110 | 51.317829 | 0.601726 | 0.004834 | 0 | 0.42735 | 0 | 0 | 0.062713 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025641 | false | 0 | 0.017094 | 0 | 0.068376 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
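`draw_controler` repeats the same circle-plus-letter block for every button, with only the coordinates and label changing. That repetition collapses into a lookup table; a pure-Python sketch of the table (no pygame needed here — the values mirror the coordinates hard-coded above):

```python
# Button name -> (circle centre, letter position, label), matching the
# coordinates hard-coded in draw_controler above.
BUTTONS = {
    'Button X': ((1180, 460), (1172, 468), 'X'),
    'Button B': ((1180, 520), (1172, 528), 'B'),
    'Button Y': ((1150, 490), (1142, 498), 'Y'),
    'Button A': ((1210, 490), (1202, 498), 'A'),
}

centre, text_pos, label = BUTTONS['Button A']
print(centre, label)  # (1210, 490) A
```

With such a table, one loop over `BUTTONS.items()` can replace the four near-identical if/elif blocks, issuing the same `cv2.circle`/`cv2.putText` calls per entry.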
1cdaebcf2a2178841183e0647850aae12465877f | 1,859 | py | Python | football/football_test.py | EEdwardsA/DS-OOP-Review | 2352866c5d0ea6a09802c29c17366450f35c75ae | [
"MIT"
] | null | null | null | football/football_test.py | EEdwardsA/DS-OOP-Review | 2352866c5d0ea6a09802c29c17366450f35c75ae | [
"MIT"
] | null | null | null | football/football_test.py | EEdwardsA/DS-OOP-Review | 2352866c5d0ea6a09802c29c17366450f35c75ae | [
"MIT"
] | null | null | null | import unittest
from players import Player, Quarterback
from possible_values import *
from game import Game
from random import randint, uniform, sample
from season import *
# TODO - some things you can add...
class FootballGameTest(unittest.TestCase):
'''test the class'''
def test_field_goal_made(self):
teams = sample(team_names, k=2)
game = Game(teams=teams)
team_prev_points = game.score[teams[0]]
game.field_goal(teams[0])
team_post_points = game.score[teams[0]]
self.assertEqual(team_post_points, team_prev_points + 3)
def test_get_winner(self):
teams = sample(team_names, k=2)
game = Game(teams=teams)
game.field_goal(teams[0])
t1_points = game.score[teams[0]]
t2_points = game.score[teams[1]]
if t1_points >= t2_points:
win, lose = teams
else:
lose, win = teams
self.assertEqual((win,lose), game.get_winning_team())
class FootballPlayerTest(unittest.TestCase):
'''Check the default values for Player and Quarterback
yards=120, touchdowns=5, safety=1,
interceptions=0
'''
def test_default_player_yards(self):
player = Player(name='Dude')
self.assertEqual(player.yards, 120)
def test_player_yards_set_to(self):
player = Player(name='OtherDude', yards=150)
self.assertEqual(player.yards, 150)
def test_default_qb_interceptions(self):
qb = Quarterback(name='FancyDude')
self.assertEqual(qb.interceptions, 4)
def test_default_qb_completed_passes(self):
qb = Quarterback()
self.assertEqual(qb.completed_passes, 20)
def test_passing_score(self):
qb = Quarterback()
self.assertEqual((20 - (2 * 4)), qb.passing_score())
if __name__ == '__main__':
unittest.main()
| 28.166667 | 64 | 0.652501 | 236 | 1,859 | 4.927966 | 0.326271 | 0.042132 | 0.051591 | 0.068788 | 0.217541 | 0.075666 | 0.075666 | 0.075666 | 0.075666 | 0.075666 | 0 | 0.02477 | 0.239914 | 1,859 | 65 | 65 | 28.6 | 0.798301 | 0.088757 | 0 | 0.186047 | 0 | 0 | 0.017964 | 0 | 0 | 0 | 0 | 0.015385 | 0.162791 | 1 | 0.162791 | false | 0.093023 | 0.139535 | 0 | 0.348837 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
1cdce77473b836e98d4d4044b2d6d581603e5972 | 1,930 | py | Python | users/views.py | rossm6/accounts | 74633ce4038806222048d85ef9dfe97a957a6a71 | [
"MIT"
] | 11 | 2021-01-23T01:09:54.000Z | 2021-01-25T07:16:30.000Z | users/views.py | rossm6/accounts | 74633ce4038806222048d85ef9dfe97a957a6a71 | [
"MIT"
] | 7 | 2021-04-06T18:19:10.000Z | 2021-09-22T19:45:03.000Z | users/views.py | rossm6/accounts | 74633ce4038806222048d85ef9dfe97a957a6a71 | [
"MIT"
] | 3 | 2021-01-23T18:55:32.000Z | 2021-02-16T17:47:59.000Z | from django.contrib.auth import update_session_auth_hash
from django.contrib.auth.mixins import LoginRequiredMixin
from django.contrib.auth.models import User
from django.contrib.auth.views import (LoginView, PasswordResetConfirmView,
PasswordResetView)
from django.http import HttpResponse, HttpResponseNotAllowed
from django.shortcuts import render
from django.urls import reverse_lazy
from django.views.generic import CreateView, DeleteView, UpdateView
from users.forms import (SignInForm, SignUpForm, UserPasswordResetForm,
UserProfileForm, UserSetPasswordForm)
from users.mixins import LockDuringEditMixin
from users.models import Lock, UserSession
class SignUp(CreateView):
model = User
form_class = SignUpForm
template_name = "registration/signup.html"
success_url = reverse_lazy("dashboard:dashboard")
class SignIn(LoginView):
form_class = SignInForm
class Profile(LoginRequiredMixin, LockDuringEditMixin, UpdateView):
model = User
form_class = UserProfileForm
template_name = "registration/profile.html"
success_url = reverse_lazy("users:profile")
def get_object(self):
return self.request.user
def form_valid(self, form):
response = super().form_valid(form)
update_session_auth_hash(self.request, self.object) # this will delete the current user session
# and create anew
UserSession.objects.create(user=self.object, session_id=self.request.session.session_key)
return response
class UserPasswordResetView(PasswordResetView):
form_class = UserPasswordResetForm
class UserPasswordResetConfirmView(PasswordResetConfirmView):
form_class = UserSetPasswordForm
def unlock(request, pk):
if request.method == "POST":
lock = Lock.objects.filter(pk=pk).delete()
return HttpResponse('')
return HttpResponseNotAllowed(["POST"])
| 33.275862 | 103 | 0.748705 | 206 | 1,930 | 6.902913 | 0.383495 | 0.056259 | 0.04782 | 0.059072 | 0.035162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178238 | 1,930 | 57 | 104 | 33.859649 | 0.896595 | 0.029534 | 0 | 0.04878 | 0 | 0 | 0.047594 | 0.026203 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073171 | false | 0.195122 | 0.268293 | 0.02439 | 0.829268 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
1cdeb5c9ff7c16810e652dbe520cbde408b27771 | 939 | py | Python | metrics/serializers.py | BrianWaganerSTL/RocketDBaaS | d924589188411371842513060a5e08b1be3cdccf | [
"MIT"
] | 1 | 2018-11-04T09:36:35.000Z | 2018-11-04T09:36:35.000Z | metrics/serializers.py | BrianWaganerSTL/RocketDBaaS_api | d924589188411371842513060a5e08b1be3cdccf | [
"MIT"
] | null | null | null | metrics/serializers.py | BrianWaganerSTL/RocketDBaaS_api | d924589188411371842513060a5e08b1be3cdccf | [
"MIT"
] | null | null | null | from rest_framework import serializers
from metrics.models import Metrics_Cpu, Metrics_PingServer, Metrics_MountPoint, \
Metrics_CpuLoad, Metrics_PingDb
class Metrics_CpuSerializer(serializers.ModelSerializer):
class Meta:
model = Metrics_Cpu
fields = '__all__'
depth = 0
class Metrics_MountPointSerializer(serializers.ModelSerializer):
class Meta:
model = Metrics_MountPoint
fields = '__all__'
depth = 0
class Metrics_CpuLoadSerializer(serializers.ModelSerializer):
class Meta:
model = Metrics_CpuLoad
fields = '__all__'
depth = 0
class Metrics_PingServerSerializer(serializers.ModelSerializer):
class Meta:
model = Metrics_PingServer
fields = '__all__'
depth = 0
class Metrics_PingDbSerializer(serializers.ModelSerializer):
class Meta:
model = Metrics_PingDb
fields = '__all__'
depth = 0 | 27.617647 | 81 | 0.698616 | 89 | 939 | 6.966292 | 0.280899 | 0.096774 | 0.25 | 0.282258 | 0.553226 | 0.553226 | 0 | 0 | 0 | 0 | 0 | 0.007022 | 0.241747 | 939 | 34 | 82 | 27.617647 | 0.863764 | 0 | 0 | 0.535714 | 0 | 0 | 0.037234 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1ce1d1ce742bc81665dc46ca2940199356484a9f | 1,010 | py | Python | tests/structures/test_generator.py | cherub96/voc | 2692d56059e4d4a52768270feaf5179b23609b04 | [
"BSD-3-Clause"
] | 1 | 2021-01-03T00:59:50.000Z | 2021-01-03T00:59:50.000Z | tests/structures/test_generator.py | cherub96/voc | 2692d56059e4d4a52768270feaf5179b23609b04 | [
"BSD-3-Clause"
] | null | null | null | tests/structures/test_generator.py | cherub96/voc | 2692d56059e4d4a52768270feaf5179b23609b04 | [
"BSD-3-Clause"
] | null | null | null | from ..utils import TranspileTestCase
class GeneratorTests(TranspileTestCase):
def test_simple_generator(self):
self.assertCodeExecution("""
def multiplier(first, second):
y = first * second
yield y
y *= second
yield y
y *= second
yield y
y *= second
yield y
print(list(multiplier(1, 20)))
""")
def test_loop_generator(self):
self.assertCodeExecution("""
def fizz_buzz(start, stop):
for i in range(start, stop):
found = False
if i % 2 == 0:
yield 'fizz'
found = True
if i % 3 == 0:
yield 'buzz'
found = True
if not found:
yield i
print(list(fizz_buzz(1, 20)))
""")
| 28.055556 | 44 | 0.40099 | 87 | 1,010 | 4.586207 | 0.436782 | 0.110276 | 0.120301 | 0.097744 | 0.323308 | 0.12782 | 0.12782 | 0.12782 | 0.12782 | 0.12782 | 0 | 0.020661 | 0.520792 | 1,010 | 35 | 45 | 28.857143 | 0.803719 | 0 | 0 | 0.433333 | 0 | 0 | 0.766337 | 0.046535 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.066667 | false | 0 | 0.033333 | 0 | 0.133333 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1ce7603f33584c5aefec4359d0957617e3a28159 | 5,106 | py | Python | CreateHalo.py | yoyoberenguer/MultiplayerGameEngine | 1d1a4c0ab40d636322c4e3299cbc84fb57965b31 | [
"MIT"
] | 4 | 2019-09-08T13:54:14.000Z | 2021-12-18T11:46:59.000Z | CreateHalo.py | yoyoberenguer/MultiplayerGameEngine | 1d1a4c0ab40d636322c4e3299cbc84fb57965b31 | [
"MIT"
] | 1 | 2019-09-01T11:21:39.000Z | 2019-09-01T15:01:21.000Z | CreateHalo.py | yoyoberenguer/MultiplayerGameEngine | 1d1a4c0ab40d636322c4e3299cbc84fb57965b31 | [
"MIT"
] | 1 | 2019-08-23T07:00:20.000Z | 2019-08-23T07:00:20.000Z |
import pygame
from NetworkBroadcast import Broadcast, AnimatedSprite, DeleteSpriteCommand
from Textures import HALO_SPRITE12, HALO_SPRITE14, HALO_SPRITE13
__author__ = "Yoann Berenguer"
__credits__ = ["Yoann Berenguer"]
__version__ = "1.0.0"
__maintainer__ = "Yoann Berenguer"
__email__ = "yoyoberenguer@hotmail.com"
class PlayerHalo(pygame.sprite.Sprite):
images = []
containers = None
def __init__(self, texture_name_, object_, timing_, layer_=0):
self.layer = layer_
pygame.sprite.Sprite.__init__(self, self.containers)
if isinstance(object_.gl.All, pygame.sprite.LayeredUpdates):
object_.gl.All.change_layer(self, object_.layer)
self.object = object_
if isinstance(self.images, pygame.Surface):
self.images = [self.images] * 30
self.images_copy = self.images.copy()
self.image = self.images_copy[0]
self.rect = self.image.get_rect(center=object_.rect.center)
self.dt = 0
self.index = 0
self.gl = object_.gl
self.length = len(self.images) - 1
self.blend = 0
self.timing = timing_
        self.texture_name = texture_name_
        self.id_ = id(self)
        self.player_halo_object = Broadcast(self.make_object())

    def make_object(self) -> AnimatedSprite:
        return AnimatedSprite(frame_=self.gl.FRAME, id_=self.id_, surface_=self.texture_name,
                              layer_=self.layer, blend_=self.blend, rect_=self.rect,
                              index_=self.index)

    def update(self):
        if self.dt > self.timing:
            if self.object.rect.colliderect(self.gl.SCREENRECT):
                self.image = self.images_copy[self.index]
                self.rect = self.image.get_rect(center=self.object.rect.center)
                self.index += 1
                if self.index > self.length:
                    self.kill()
                    return
                self.dt = 0
                self.player_halo_object.update({'frame': self.gl.FRAME,
                                                'rect': self.rect, 'index': self.index})
            else:
                self.kill()
                return
        else:
            self.dt += self.gl.TIME_PASSED_SECONDS
        self.player_halo_object.queue()


class AsteroidHalo(pygame.sprite.Sprite):
    images = []
    containers = None

    def __init__(self, texture_name_, object_, timing_, layer_=0):
        self.layer = layer_
        pygame.sprite.Sprite.__init__(self, self.containers)
        if isinstance(object_.gl.All, pygame.sprite.LayeredUpdates):
            object_.gl.All.change_layer(self, object_.layer)
        self.object = object_
        if isinstance(self.images, pygame.Surface):
            self.images = [self.images] * 30
        self.images_copy = self.images.copy()
        self.image = self.images_copy[0]
        if not id(AsteroidHalo.images) == id(eval(texture_name_)):
            raise ValueError("Asteroid image does not match with its surface name.")
        self.rect = self.image.get_rect(center=object_.rect.center)
        self.dt = 0
        self.index = 0
        self.gl = object_.gl
        self.length = len(self.images) - 1
        self.blend = 0
        self.timing = timing_
        self.texture_name = texture_name_
        self.id_ = id(self)
        self.asteroidHalo_object = Broadcast(self.make_object())
        Broadcast.add_object_id(self.id_)

    def delete_object(self) -> DeleteSpriteCommand:
        """
        Send a command to kill an object on client side.
        :return: DeleteSpriteCommand object
        """
        return DeleteSpriteCommand(frame_=self.gl.FRAME, to_delete_={self.id_: self.texture_name})

    def make_object(self) -> AnimatedSprite:
        return AnimatedSprite(frame_=self.gl.FRAME, id_=self.id_, surface_=self.texture_name,
                              layer_=self.layer, blend_=self.blend, rect_=self.rect,
                              index_=self.index)

    def quit(self) -> None:
        Broadcast.remove_object_id(self.id_)
        obj = Broadcast(self.delete_object())
        obj.queue()
        self.kill()

    def update(self) -> None:
        if self.dt > self.timing:
            if self.object.rect.colliderect(self.gl.SCREENRECT):
                self.image = self.images_copy[self.index]
                self.rect = self.image.get_rect(center=self.object.rect.center)
                self.index += 1
                if self.index > self.length:
                    self.quit()
                    return
                self.asteroidHalo_object.update(
                    {'frame': self.gl.FRAME, 'rect': self.rect, 'index': self.index})
                self.asteroidHalo_object.queue()
                self.dt = 0
            else:
                self.quit()
                return
        else:
            self.dt += self.gl.TIME_PASSED_SECONDS
| 31.9125 | 99 | 0.570897 | 564 | 5,106 | 4.929078 | 0.166667 | 0.057554 | 0.040288 | 0.038849 | 0.694964 | 0.674101 | 0.674101 | 0.674101 | 0.674101 | 0.646043 | 0 | 0.008455 | 0.328241 | 5,106 | 159 | 100 | 32.113208 | 0.802041 | 0.017039 | 0 | 0.731481 | 0 | 0 | 0.032124 | 0.005181 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0.018519 | 0.027778 | 0.018519 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
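The `update()` methods above share one pattern: accumulate elapsed milliseconds in `dt` and only advance the animation index once the accumulator exceeds the per-frame `timing`. A minimal pure-Python sketch of that timing logic, with no pygame dependency — all names here are illustrative, not taken from the game code:

```python
class HaloAnimation:
    """Frame-timing sketch: advance one frame per `timing_ms` of elapsed time."""

    def __init__(self, frame_count: int, timing_ms: float):
        self.index = 0                  # current animation frame
        self.length = frame_count - 1   # last valid frame index
        self.timing = timing_ms         # ms each frame stays on screen
        self.dt = 0.0                   # accumulated elapsed time
        self.finished = False           # stands in for sprite.kill()

    def update(self, elapsed_ms: float) -> None:
        if self.finished:
            return
        self.dt += elapsed_ms           # mirrors dt += TIME_PASSED_SECONDS
        if self.dt > self.timing:
            self.index += 1
            self.dt = 0.0
            if self.index > self.length:
                self.finished = True    # animation exhausted


anim = HaloAnimation(frame_count=3, timing_ms=16.0)
for _ in range(10):
    anim.update(10.0)                   # simulate 10 ms game ticks
print(anim.finished)                    # True once all frames have played
```

This decouples animation speed from tick rate: a slow tick simply pushes `dt` past `timing` sooner.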
1ce98f8cbd7283e38faf8437d2c92e51357a9597 | 93 | py | Python | day4/homework/q7.py | AkshayManchanda/Python_Training | 5a50472d118ac6d40145bf1dd60f26864bf9fb6c | [
"MIT"
] | null | null | null | day4/homework/q7.py | AkshayManchanda/Python_Training | 5a50472d118ac6d40145bf1dd60f26864bf9fb6c | [
"MIT"
] | null | null | null | day4/homework/q7.py | AkshayManchanda/Python_Training | 5a50472d118ac6d40145bf1dd60f26864bf9fb6c | [
"MIT"
] | null | null | null | i = input("Enter a string: ")
words = i.split()
words.sort()
for word in words:
    print(word, end=' ')
| 15.5 | 27 | 0.591398 | 17 | 93 | 3.235294 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.193548 | 93 | 5 | 28 | 18.6 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0.182796 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
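The same word-sort is easier to exercise when the stdin read is factored out into a pure function; `sort_words` is an illustrative name, not part of the original exercise:

```python
def sort_words(text: str) -> str:
    """Split on whitespace, sort alphabetically, rejoin with single spaces."""
    return ' '.join(sorted(text.split()))


print(sort_words("banana apple cherry"))  # apple banana cherry
```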
1ceddd105ecb3e0dcae569f584b7c20f28eab09e | 553 | py | Python | Python/Calculating_Trimmed_Means/calculating_trimmed_means1.py | PeriscopeData/analytics-toolbox | 83effdee380c33e5eecea29528acf5375fd496fb | [
"MIT"
] | 2 | 2019-09-27T22:19:09.000Z | 2019-12-02T23:12:18.000Z | Python/Calculating_Trimmed_Means/calculating_trimmed_means1.py | PeriscopeData/analytics-toolbox | 83effdee380c33e5eecea29528acf5375fd496fb | [
"MIT"
] | 1 | 2019-10-03T17:46:23.000Z | 2019-10-03T17:46:23.000Z | Python/Calculating_Trimmed_Means/calculating_trimmed_means1.py | PeriscopeData/analytics-toolbox | 83effdee380c33e5eecea29528acf5375fd496fb | [
"MIT"
] | 2 | 2021-07-17T18:23:50.000Z | 2022-03-03T04:53:03.000Z | # SQL output is imported as a pandas dataframe variable called "df"
# Source: https://stackoverflow.com/questions/19441730/trimmed-mean-with-percentage-limit-in-python
import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import tmean, scoreatpercentile
import numpy as np
def trimmean(arr, percent):
    lower_limit = scoreatpercentile(arr, percent)
    upper_limit = scoreatpercentile(arr, 100 - percent)
    return tmean(arr, limits=(lower_limit, upper_limit), inclusive=(False, False))
my_result = trimmean(df["amt_paid"].values,10) | 39.5 | 100 | 0.779385 | 77 | 553 | 5.519481 | 0.688312 | 0.047059 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026749 | 0.121157 | 553 | 14 | 101 | 39.5 | 0.847737 | 0.296564 | 0 | 0 | 0 | 0 | 0.020672 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
e80208494724abb41e0606d412c1f9f663383e0b | 227 | py | Python | 2020_01_01/max_values/max_values.py | 94JuHo/Algorithm_study | e2c10ec680d966e5bcc4e7cb88d9514f9ccbbf15 | [
"MIT"
] | null | null | null | 2020_01_01/max_values/max_values.py | 94JuHo/Algorithm_study | e2c10ec680d966e5bcc4e7cb88d9514f9ccbbf15 | [
"MIT"
] | null | null | null | 2020_01_01/max_values/max_values.py | 94JuHo/Algorithm_study | e2c10ec680d966e5bcc4e7cb88d9514f9ccbbf15 | [
"MIT"
] | null | null | null | values = []
for i in range(9):
    values.append(int(input('')))
max_value = 0
location = 0
for i in range(9):
    if values[i] > max_value:
        max_value = values[i]
        location = i + 1
print(max_value)
print(location) | 18.916667 | 33 | 0.61674 | 37 | 227 | 3.675676 | 0.432432 | 0.235294 | 0.088235 | 0.161765 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028902 | 0.237885 | 227 | 12 | 34 | 18.916667 | 0.757225 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.181818 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
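The scan above can be written with `enumerate()`, keeping the 1-based position of the original `location = i + 1`; `max_with_position` is an illustrative name:

```python
def max_with_position(values):
    """Return (largest value, its 1-based position) from a non-empty list."""
    max_value, location = values[0], 1
    for i, v in enumerate(values, start=1):
        if v > max_value:
            max_value, location = v, i
    return max_value, location


print(max_with_position([3, 29, 38, 12, 57, 74, 40, 85, 61]))  # (85, 8)
```

Seeding from `values[0]` instead of `0` also makes the result correct for all-negative inputs, which the original loop would miss.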
e804b98c7267ddc9714ddf384d8bd8fa54ddef35 | 5,249 | py | Python | nova/api/openstack/compute/legacy_v2/contrib/console_auth_tokens.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/api/openstack/compute/legacy_v2/contrib/console_auth_tokens.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/api/openstack/compute/legacy_v2/contrib/console_auth_tokens.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2013 Cloudbase Solutions Srl'
nl|'\n'
comment|'# All Rights Reserved.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'webob'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
name|'import'
name|'extensions'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
name|'import'
name|'wsgi'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'consoleauth'
name|'import'
name|'rpcapi'
name|'as'
name|'consoleauth_rpcapi'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'i18n'
name|'import'
name|'_'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|authorize
name|'authorize'
op|'='
name|'extensions'
op|'.'
name|'extension_authorizer'
op|'('
string|"'compute'"
op|','
string|"'console_auth_tokens'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|ConsoleAuthTokensController
name|'class'
name|'ConsoleAuthTokensController'
op|'('
name|'wsgi'
op|'.'
name|'Controller'
op|')'
op|':'
newline|'\n'
DECL|member|__init__
indent|' '
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
op|'*'
name|'args'
op|','
op|'**'
name|'kwargs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_consoleauth_rpcapi'
op|'='
name|'consoleauth_rpcapi'
op|'.'
name|'ConsoleAuthAPI'
op|'('
op|')'
newline|'\n'
name|'super'
op|'('
name|'ConsoleAuthTokensController'
op|','
name|'self'
op|')'
op|'.'
name|'__init__'
op|'('
op|'*'
name|'args'
op|','
op|'**'
name|'kwargs'
op|')'
newline|'\n'
nl|'\n'
DECL|member|show
dedent|''
name|'def'
name|'show'
op|'('
name|'self'
op|','
name|'req'
op|','
name|'id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Checks a console auth token and returns the related connect info."""'
newline|'\n'
name|'context'
op|'='
name|'req'
op|'.'
name|'environ'
op|'['
string|"'nova.context'"
op|']'
newline|'\n'
name|'authorize'
op|'('
name|'context'
op|')'
newline|'\n'
nl|'\n'
name|'token'
op|'='
name|'id'
newline|'\n'
name|'connect_info'
op|'='
name|'self'
op|'.'
name|'_consoleauth_rpcapi'
op|'.'
name|'check_token'
op|'('
name|'context'
op|','
name|'token'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'connect_info'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'webob'
op|'.'
name|'exc'
op|'.'
name|'HTTPNotFound'
op|'('
name|'explanation'
op|'='
name|'_'
op|'('
string|'"Token not found"'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'console_type'
op|'='
name|'connect_info'
op|'.'
name|'get'
op|'('
string|"'console_type'"
op|')'
newline|'\n'
comment|'# This is currently required only for RDP consoles'
nl|'\n'
name|'if'
name|'console_type'
op|'!='
string|'"rdp-html5"'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'webob'
op|'.'
name|'exc'
op|'.'
name|'HTTPUnauthorized'
op|'('
nl|'\n'
name|'explanation'
op|'='
name|'_'
op|'('
string|'"The requested console type details are not "'
nl|'\n'
string|'"accessible"'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
op|'{'
string|"'console'"
op|':'
nl|'\n'
op|'{'
name|'i'
op|':'
name|'connect_info'
op|'['
name|'i'
op|']'
nl|'\n'
name|'for'
name|'i'
name|'in'
op|'['
string|"'instance_uuid'"
op|','
string|"'host'"
op|','
string|"'port'"
op|','
nl|'\n'
string|"'internal_access_path'"
op|']'
nl|'\n'
name|'if'
name|'i'
name|'in'
name|'connect_info'
op|'}'
op|'}'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|Console_auth_tokens
dedent|''
dedent|''
name|'class'
name|'Console_auth_tokens'
op|'('
name|'extensions'
op|'.'
name|'ExtensionDescriptor'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Console token authentication support."""'
newline|'\n'
DECL|variable|name
name|'name'
op|'='
string|'"ConsoleAuthTokens"'
newline|'\n'
DECL|variable|alias
name|'alias'
op|'='
string|'"os-console-auth-tokens"'
newline|'\n'
DECL|variable|namespace
name|'namespace'
op|'='
op|'('
string|'"http://docs.openstack.org/compute/ext/"'
nl|'\n'
string|'"consoles-auth-tokens/api/v2"'
op|')'
newline|'\n'
DECL|variable|updated
name|'updated'
op|'='
string|'"2013-08-13T00:00:00Z"'
newline|'\n'
nl|'\n'
DECL|member|get_resources
name|'def'
name|'get_resources'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'controller'
op|'='
name|'ConsoleAuthTokensController'
op|'('
op|')'
newline|'\n'
name|'ext'
op|'='
name|'extensions'
op|'.'
name|'ResourceExtension'
op|'('
string|"'os-console-auth-tokens'"
op|','
nl|'\n'
name|'controller'
op|')'
newline|'\n'
name|'return'
op|'['
name|'ext'
op|']'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 14.62117 | 88 | 0.634597 | 751 | 5,249 | 4.376831 | 0.22237 | 0.094919 | 0.063888 | 0.036507 | 0.346821 | 0.280499 | 0.159112 | 0.153636 | 0.095528 | 0.06328 | 0 | 0.005547 | 0.107068 | 5,249 | 358 | 89 | 14.662011 | 0.695754 | 0 | 0 | 0.801676 | 0 | 0 | 0.437417 | 0.042484 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.013966 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
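The nova-token row above stores Python source as a `kind|'text'` token stream rather than plain text. The standard library's `tokenize` module produces something very similar; this sketch renders a one-line program in that style (the formatting is an approximation of the dump, not its exact encoder):

```python
import io
import tokenize


def token_stream(source: str):
    """Render source as kind|'text' lines, roughly like the dump above."""
    out = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        kind = tokenize.tok_name[tok.type].lower()
        out.append(f"{kind}|{tok.string!r}")
    return out


for line in token_stream("x = 1\n"):
    print(line)
```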
e811c36f36c477d41124bae1d31894772c187da0 | 135 | py | Python | ExerciciosdePython/ex049.py | aleksandromelo/Exercicios | 782ff539efa1286180eaf8df8c25c4eca7a5e669 | [
"MIT"
] | null | null | null | ExerciciosdePython/ex049.py | aleksandromelo/Exercicios | 782ff539efa1286180eaf8df8c25c4eca7a5e669 | [
"MIT"
] | null | null | null | ExerciciosdePython/ex049.py | aleksandromelo/Exercicios | 782ff539efa1286180eaf8df8c25c4eca7a5e669 | [
"MIT"
] | null | null | null | num = int(input('Digite um número para ver sua tabuada: '))
for i in range(1, 11):
    print('{} x {:2} = {}'.format(num, i, num * i)) | 33.75 | 59 | 0.57037 | 24 | 135 | 3.208333 | 0.833333 | 0.103896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037383 | 0.207407 | 135 | 4 | 60 | 33.75 | 0.682243 | 0 | 0 | 0 | 0 | 0 | 0.389706 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2
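With the `input()` factored out, the same table becomes testable; the f-string form is equivalent to the original `.format()` call (`tabuada` — "times table" — is an illustrative name):

```python
def tabuada(num: int) -> list[str]:
    """Build the 1..10 multiplication table lines for `num`."""
    return [f"{num} x {i:2} = {num * i}" for i in range(1, 11)]


for line in tabuada(5):
    print(line)
```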
e815853c059c203edb8fd8cdf216a9a5dcd145cf | 218 | py | Python | src/createData.py | saijananiganesan/SimPathFinder | 1634f2cb82c8056256d191be72589c4c531a3f67 | [
"MIT"
] | null | null | null | src/createData.py | saijananiganesan/SimPathFinder | 1634f2cb82c8056256d191be72589c4c531a3f67 | [
"MIT"
] | 1 | 2021-09-27T18:46:29.000Z | 2021-10-01T19:19:48.000Z | src/createData.py | saijananiganesan/SimPathFinder | 1634f2cb82c8056256d191be72589c4c531a3f67 | [
"MIT"
] | null | null | null | from __init__ import ExtractUnlabeledData, SampleUnlabeledData, ExtractLabeledData
E = ExtractLabeledData(data_dir='../labeldata/')
E.get_pathways()
E.get_pathway_names()
E.get_classes_dict()
E.create_df_all_labels()
| 27.25 | 82 | 0.825688 | 27 | 218 | 6.185185 | 0.740741 | 0.071856 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06422 | 218 | 7 | 83 | 31.142857 | 0.818627 | 0 | 0 | 0 | 0 | 0 | 0.059633 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e81b190b66eb2d495bc411eca3c4ded7a86955c8 | 892 | py | Python | app/api/serializers.py | michelmarcondes/django-study-with-docker | 248e41db3f16a5d26662c5e93ebf32716a20195e | [
"MIT"
] | null | null | null | app/api/serializers.py | michelmarcondes/django-study-with-docker | 248e41db3f16a5d26662c5e93ebf32716a20195e | [
"MIT"
] | null | null | null | app/api/serializers.py | michelmarcondes/django-study-with-docker | 248e41db3f16a5d26662c5e93ebf32716a20195e | [
"MIT"
] | null | null | null | from rest_framework import serializers
from projects.models import Project, Tag, Review
from users.models import Profile
class ReviewSerializer(serializers.ModelSerializer):
    class Meta:
        model = Review
        fields = '__all__'


class ProfileSerializer(serializers.ModelSerializer):
    class Meta:
        model = Profile
        fields = '__all__'


class TagSerializer(serializers.ModelSerializer):
    class Meta:
        model = Tag
        fields = '__all__'


class ProjectSerializer(serializers.ModelSerializer):
    owner = ProfileSerializer(many=False)
    tags = TagSerializer(many=True)
    reviews = serializers.SerializerMethodField()

    class Meta:
        model = Project
        fields = '__all__'

    def get_reviews(self, obj):
        reviews = obj.review_set.all()
        serializer = ReviewSerializer(reviews, many=True)
        return serializer.data | 24.777778 | 57 | 0.696188 | 86 | 892 | 7 | 0.430233 | 0.172757 | 0.093023 | 0.174419 | 0.199336 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2287 | 892 | 36 | 58 | 24.777778 | 0.875 | 0 | 0 | 0.307692 | 0 | 0 | 0.031355 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.115385 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2
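Django REST Framework is not needed to see what `SerializerMethodField` accomplishes: the project serializer simply computes one extra output key from a method. A plain-dict sketch of that nesting pattern, with all names illustrative:

```python
class ProjectSketch:
    """Toy stand-in for ProjectSerializer: model fields plus a computed key."""

    def __init__(self, project: dict, reviews: list[dict]):
        self.project = project
        self.reviews = reviews

    def data(self) -> dict:
        out = dict(self.project)                          # the model fields
        out['reviews'] = [dict(r) for r in self.reviews]  # the "method field"
        return out


s = ProjectSketch({'title': 'banter-bot'}, [{'vote': 'up'}])
print(s.data())  # {'title': 'banter-bot', 'reviews': [{'vote': 'up'}]}
```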
e836603f27c307f1405743aabfd1cf7f2bab9fdf | 2,695 | py | Python | Chapter04/python/2.0.0/com/sparksamples/util.py | quguiliang/Machine-Learning-with-Spark-Second-Edition | 0ba131e6c15a3de97609c6cb5d976806ccc14f09 | [
"MIT"
] | 112 | 2017-05-13T15:44:29.000Z | 2022-02-19T20:14:14.000Z | Chapter04/python/2.0.0/com/sparksamples/util.py | tophua/Machine-Learning-with-Spark-Second-Edition | 0d93e992f6c79d55ad5cdcab735dbe6674143974 | [
"MIT"
] | 1 | 2017-05-25T00:10:43.000Z | 2017-05-25T00:10:43.000Z | Chapter04/python/2.0.0/com/sparksamples/util.py | tophua/Machine-Learning-with-Spark-Second-Edition | 0d93e992f6c79d55ad5cdcab735dbe6674143974 | [
"MIT"
] | 115 | 2017-05-06T10:49:00.000Z | 2022-03-08T07:48:54.000Z | import os
import sys
from pyspark.sql.types import *
PATH = "/home/ubuntu/work/ml-resources/spark-ml/data"
SPARK_HOME = "/home/ubuntu/work/spark-2.0.0-bin-hadoop2.7/"
os.environ['SPARK_HOME'] = SPARK_HOME
sys.path.append(SPARK_HOME + "/python")
from pyspark import SparkContext
from pyspark import SparkConf
from pyspark.sql import SparkSession
conf = SparkConf().setAppName("First Spark App").setMaster("local")
sc = SparkContext(conf=conf)
spark = SparkSession(sc)
def get_user_data():
    custom_schema = StructType([
        StructField("no", StringType(), True),
        StructField("age", IntegerType(), True),
        StructField("gender", StringType(), True),
        StructField("occupation", StringType(), True),
        StructField("zipCode", StringType(), True)
    ])
    from pyspark.sql import SQLContext
    sql_context = SQLContext(sc)
    user_df = sql_context.read \
        .format('com.databricks.spark.csv') \
        .options(header='false', delimiter='|') \
        .load("%s/ml-100k/u.user" % PATH, schema=custom_schema)
    return user_df


def get_movie_data_df():
    custom_schema = StructType([
        StructField("no", StringType(), True),
        StructField("moviename", StringType(), True),
        StructField("date", StringType(), True),
        StructField("f1", StringType(), True), StructField("url", StringType(), True),
        StructField("f2", IntegerType(), True), StructField("f3", IntegerType(), True),
        StructField("f4", IntegerType(), True), StructField("f5", IntegerType(), True),
        StructField("f6", IntegerType(), True), StructField("f7", IntegerType(), True),
        StructField("f8", IntegerType(), True), StructField("f9", IntegerType(), True),
        StructField("f10", IntegerType(), True), StructField("f11", IntegerType(), True),
        StructField("f12", IntegerType(), True), StructField("f13", IntegerType(), True),
        StructField("f14", IntegerType(), True), StructField("f15", IntegerType(), True),
        StructField("f16", IntegerType(), True), StructField("f17", IntegerType(), True),
        StructField("f18", IntegerType(), True), StructField("f19", IntegerType(), True)
    ])
    from pyspark.sql import SQLContext
    sql_context = SQLContext(sc)
    movie_df = sql_context.read \
        .format('com.databricks.spark.csv') \
        .options(header='false', delimiter='|') \
        .load("%s/ml-100k/u.item" % PATH, schema=custom_schema)
    return movie_df


def get_movie_data():
    return sc.textFile("%s/ml-100k/u.item" % PATH)


def get_rating_data():
    return sc.textFile("%s/ml-100k/u.data" % PATH)
| 35.460526 | 89 | 0.656401 | 305 | 2,695 | 5.718033 | 0.298361 | 0.223624 | 0.268349 | 0.018349 | 0.358372 | 0.297018 | 0.287844 | 0.287844 | 0.186927 | 0.186927 | 0 | 0.020777 | 0.178479 | 2,695 | 75 | 90 | 35.933333 | 0.766938 | 0 | 0 | 0.298246 | 0 | 0.017544 | 0.128853 | 0.050501 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070175 | false | 0 | 0.175439 | 0.035088 | 0.315789 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
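What the `custom_schema` plus `delimiter='|'` load does can be shown in miniature without Spark: split each `|`-separated line and pass every raw field through its declared type. A hedged pure-Python sketch using the `u.user` schema from the file above:

```python
# (name, caster) pairs mirroring the StructType declared for u.user rows.
SCHEMA = [("no", str), ("age", int), ("gender", str),
          ("occupation", str), ("zipCode", str)]


def parse_user(line: str) -> dict:
    """Apply the declared caster to each '|'-delimited field."""
    return {name: cast(raw)
            for (name, cast), raw in zip(SCHEMA, line.split("|"))}


print(parse_user("1|24|M|technician|85711"))
```

The example input line follows the MovieLens 100k `u.user` layout; only `age` ends up as an `int`, matching `IntegerType()` in the schema.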
1c0188f969e63d35026803dbf920b5a12f7d75cf | 12,648 | py | Python | specs/d3d9caps.py | prahal/apitrace | e9426dd61586757d23d7dddc85b3076f477e7f07 | [
"MIT"
] | 1 | 2020-06-19T12:34:44.000Z | 2020-06-19T12:34:44.000Z | specs/d3d9caps.py | prahal/apitrace | e9426dd61586757d23d7dddc85b3076f477e7f07 | [
"MIT"
] | null | null | null | specs/d3d9caps.py | prahal/apitrace | e9426dd61586757d23d7dddc85b3076f477e7f07 | [
"MIT"
] | null | null | null | ##########################################################################
#
# Copyright 2008-2009 VMware, Inc.
# All Rights Reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
#
##########################################################################/
"""d3d9caps.h"""
from winapi import *
from d3d9types import *
D3DVS20CAPS = Flags(DWORD, [
"D3DVS20CAPS_PREDICATION",
])
D3DVSHADERCAPS2_0 = Struct("D3DVSHADERCAPS2_0", [
(D3DVS20CAPS, "Caps"),
(INT, "DynamicFlowControlDepth"),
(INT, "NumTemps"),
(INT, "StaticFlowControlDepth"),
])
D3DPS20CAPS = Flags(DWORD, [
"D3DPS20CAPS_ARBITRARYSWIZZLE",
"D3DPS20CAPS_GRADIENTINSTRUCTIONS",
"D3DPS20CAPS_PREDICATION",
"D3DPS20CAPS_NODEPENDENTREADLIMIT",
"D3DPS20CAPS_NOTEXINSTRUCTIONLIMIT",
])
D3DPSHADERCAPS2_0 = Struct("D3DPSHADERCAPS2_0", [
(D3DPS20CAPS, "Caps"),
(INT, "DynamicFlowControlDepth"),
(INT, "NumTemps"),
(INT, "StaticFlowControlDepth"),
(INT, "NumInstructionSlots"),
])
D3DCAPS = Flags(DWORD, [
"D3DCAPS_READ_SCANLINE",
])
D3DCAPS2 = Flags(DWORD, [
"D3DCAPS2_FULLSCREENGAMMA",
"D3DCAPS2_CANCALIBRATEGAMMA",
"D3DCAPS2_RESERVED",
"D3DCAPS2_CANMANAGERESOURCE",
"D3DCAPS2_DYNAMICTEXTURES",
"D3DCAPS2_CANAUTOGENMIPMAP",
"D3DCAPS2_CANSHARERESOURCE",
])
D3DCAPS3 = Flags(DWORD, [
"D3DCAPS3_RESERVED",
"D3DCAPS3_ALPHA_FULLSCREEN_FLIP_OR_DISCARD",
"D3DCAPS3_LINEAR_TO_SRGB_PRESENTATION",
"D3DCAPS3_COPY_TO_VIDMEM",
"D3DCAPS3_COPY_TO_SYSTEMMEM",
])
D3DPRESENT_INTERVAL = Flags(DWORD, [
#"D3DPRESENT_INTERVAL_DEFAULT", # 0
"D3DPRESENT_INTERVAL_ONE",
"D3DPRESENT_INTERVAL_TWO",
"D3DPRESENT_INTERVAL_THREE",
"D3DPRESENT_INTERVAL_FOUR",
"D3DPRESENT_INTERVAL_IMMEDIATE",
])
D3DCURSORCAPS = Flags(DWORD, [
"D3DCURSORCAPS_COLOR",
"D3DCURSORCAPS_LOWRES",
])
D3DDEVCAPS = Flags(DWORD, [
"D3DDEVCAPS_EXECUTESYSTEMMEMORY",
"D3DDEVCAPS_EXECUTEVIDEOMEMORY",
"D3DDEVCAPS_TLVERTEXSYSTEMMEMORY",
"D3DDEVCAPS_TLVERTEXVIDEOMEMORY",
"D3DDEVCAPS_TEXTURESYSTEMMEMORY",
"D3DDEVCAPS_TEXTUREVIDEOMEMORY",
"D3DDEVCAPS_DRAWPRIMTLVERTEX",
"D3DDEVCAPS_CANRENDERAFTERFLIP",
"D3DDEVCAPS_TEXTURENONLOCALVIDMEM",
"D3DDEVCAPS_DRAWPRIMITIVES2",
"D3DDEVCAPS_SEPARATETEXTUREMEMORIES",
"D3DDEVCAPS_DRAWPRIMITIVES2EX",
"D3DDEVCAPS_HWTRANSFORMANDLIGHT",
"D3DDEVCAPS_CANBLTSYSTONONLOCAL",
"D3DDEVCAPS_HWRASTERIZATION",
"D3DDEVCAPS_PUREDEVICE",
"D3DDEVCAPS_QUINTICRTPATCHES",
"D3DDEVCAPS_RTPATCHES",
"D3DDEVCAPS_RTPATCHHANDLEZERO",
"D3DDEVCAPS_NPATCHES",
])
D3DPMISCCAPS = Flags(DWORD, [
"D3DPMISCCAPS_MASKZ",
"D3DPMISCCAPS_CULLNONE",
"D3DPMISCCAPS_CULLCW",
"D3DPMISCCAPS_CULLCCW",
"D3DPMISCCAPS_COLORWRITEENABLE",
"D3DPMISCCAPS_CLIPPLANESCALEDPOINTS",
"D3DPMISCCAPS_CLIPTLVERTS",
"D3DPMISCCAPS_TSSARGTEMP",
"D3DPMISCCAPS_BLENDOP",
"D3DPMISCCAPS_NULLREFERENCE",
"D3DPMISCCAPS_INDEPENDENTWRITEMASKS",
"D3DPMISCCAPS_PERSTAGECONSTANT",
"D3DPMISCCAPS_FOGANDSPECULARALPHA",
"D3DPMISCCAPS_SEPARATEALPHABLEND",
"D3DPMISCCAPS_MRTINDEPENDENTBITDEPTHS",
"D3DPMISCCAPS_MRTPOSTPIXELSHADERBLENDING",
"D3DPMISCCAPS_FOGVERTEXCLAMPED",
"D3DPMISCCAPS_POSTBLENDSRGBCONVERT",
])
D3DLINECAPS = Flags(DWORD, [
"D3DLINECAPS_TEXTURE",
"D3DLINECAPS_ZTEST",
"D3DLINECAPS_BLEND",
"D3DLINECAPS_ALPHACMP",
"D3DLINECAPS_FOG",
"D3DLINECAPS_ANTIALIAS",
])
D3DPRASTERCAPS = Flags(DWORD, [
"D3DPRASTERCAPS_DITHER",
"D3DPRASTERCAPS_ZTEST",
"D3DPRASTERCAPS_FOGVERTEX",
"D3DPRASTERCAPS_FOGTABLE",
"D3DPRASTERCAPS_MIPMAPLODBIAS",
"D3DPRASTERCAPS_ZBUFFERLESSHSR",
"D3DPRASTERCAPS_FOGRANGE",
"D3DPRASTERCAPS_ANISOTROPY",
"D3DPRASTERCAPS_WBUFFER",
"D3DPRASTERCAPS_WFOG",
"D3DPRASTERCAPS_ZFOG",
"D3DPRASTERCAPS_COLORPERSPECTIVE",
"D3DPRASTERCAPS_SCISSORTEST",
"D3DPRASTERCAPS_SLOPESCALEDEPTHBIAS",
"D3DPRASTERCAPS_DEPTHBIAS",
"D3DPRASTERCAPS_MULTISAMPLE_TOGGLE",
])
D3DPCMPCAPS = Flags(DWORD, [
"D3DPCMPCAPS_NEVER",
"D3DPCMPCAPS_LESS",
"D3DPCMPCAPS_EQUAL",
"D3DPCMPCAPS_LESSEQUAL",
"D3DPCMPCAPS_GREATER",
"D3DPCMPCAPS_NOTEQUAL",
"D3DPCMPCAPS_GREATEREQUAL",
"D3DPCMPCAPS_ALWAYS",
])
D3DPBLENDCAPS = Flags(DWORD, [
"D3DPBLENDCAPS_ZERO",
"D3DPBLENDCAPS_ONE",
"D3DPBLENDCAPS_SRCCOLOR",
"D3DPBLENDCAPS_INVSRCCOLOR",
"D3DPBLENDCAPS_SRCALPHA",
"D3DPBLENDCAPS_INVSRCALPHA",
"D3DPBLENDCAPS_DESTALPHA",
"D3DPBLENDCAPS_INVDESTALPHA",
"D3DPBLENDCAPS_DESTCOLOR",
"D3DPBLENDCAPS_INVDESTCOLOR",
"D3DPBLENDCAPS_SRCALPHASAT",
"D3DPBLENDCAPS_BOTHSRCALPHA",
"D3DPBLENDCAPS_BOTHINVSRCALPHA",
"D3DPBLENDCAPS_BLENDFACTOR",
"D3DPBLENDCAPS_SRCCOLOR2",
"D3DPBLENDCAPS_INVSRCCOLOR2",
])
D3DPSHADECAPS = Flags(DWORD, [
"D3DPSHADECAPS_COLORGOURAUDRGB",
"D3DPSHADECAPS_SPECULARGOURAUDRGB",
"D3DPSHADECAPS_ALPHAGOURAUDBLEND",
"D3DPSHADECAPS_FOGGOURAUD",
])
D3DPTEXTURECAPS = Flags(DWORD, [
"D3DPTEXTURECAPS_PERSPECTIVE",
"D3DPTEXTURECAPS_POW2",
"D3DPTEXTURECAPS_ALPHA",
"D3DPTEXTURECAPS_SQUAREONLY",
"D3DPTEXTURECAPS_TEXREPEATNOTSCALEDBYSIZE",
"D3DPTEXTURECAPS_ALPHAPALETTE",
"D3DPTEXTURECAPS_NONPOW2CONDITIONAL",
"D3DPTEXTURECAPS_PROJECTED",
"D3DPTEXTURECAPS_CUBEMAP",
"D3DPTEXTURECAPS_VOLUMEMAP",
"D3DPTEXTURECAPS_MIPMAP",
"D3DPTEXTURECAPS_MIPVOLUMEMAP",
"D3DPTEXTURECAPS_MIPCUBEMAP",
"D3DPTEXTURECAPS_CUBEMAP_POW2",
"D3DPTEXTURECAPS_VOLUMEMAP_POW2",
"D3DPTEXTURECAPS_NOPROJECTEDBUMPENV",
])
D3DPTFILTERCAPS = Flags(DWORD, [
"D3DPTFILTERCAPS_MINFPOINT",
"D3DPTFILTERCAPS_MINFLINEAR",
"D3DPTFILTERCAPS_MINFANISOTROPIC",
"D3DPTFILTERCAPS_MINFPYRAMIDALQUAD",
"D3DPTFILTERCAPS_MINFGAUSSIANQUAD",
"D3DPTFILTERCAPS_MIPFPOINT",
"D3DPTFILTERCAPS_MIPFLINEAR",
"D3DPTFILTERCAPS_CONVOLUTIONMONO",
"D3DPTFILTERCAPS_MAGFPOINT",
"D3DPTFILTERCAPS_MAGFLINEAR",
"D3DPTFILTERCAPS_MAGFANISOTROPIC",
"D3DPTFILTERCAPS_MAGFPYRAMIDALQUAD",
"D3DPTFILTERCAPS_MAGFGAUSSIANQUAD",
])
D3DPTADDRESSCAPS = Flags(DWORD, [
"D3DPTADDRESSCAPS_WRAP",
"D3DPTADDRESSCAPS_MIRROR",
"D3DPTADDRESSCAPS_CLAMP",
"D3DPTADDRESSCAPS_BORDER",
"D3DPTADDRESSCAPS_INDEPENDENTUV",
"D3DPTADDRESSCAPS_MIRRORONCE",
])
D3DSTENCILCAPS = Flags(DWORD, [
"D3DSTENCILCAPS_KEEP",
"D3DSTENCILCAPS_ZERO",
"D3DSTENCILCAPS_REPLACE",
"D3DSTENCILCAPS_INCRSAT",
"D3DSTENCILCAPS_DECRSAT",
"D3DSTENCILCAPS_INVERT",
"D3DSTENCILCAPS_INCR",
"D3DSTENCILCAPS_DECR",
"D3DSTENCILCAPS_TWOSIDED",
])
D3DTEXOPCAPS = Flags(DWORD, [
"D3DTEXOPCAPS_DISABLE",
"D3DTEXOPCAPS_SELECTARG1",
"D3DTEXOPCAPS_SELECTARG2",
"D3DTEXOPCAPS_MODULATE",
"D3DTEXOPCAPS_MODULATE2X",
"D3DTEXOPCAPS_MODULATE4X",
"D3DTEXOPCAPS_ADD",
"D3DTEXOPCAPS_ADDSIGNED",
"D3DTEXOPCAPS_ADDSIGNED2X",
"D3DTEXOPCAPS_SUBTRACT",
"D3DTEXOPCAPS_ADDSMOOTH",
"D3DTEXOPCAPS_BLENDDIFFUSEALPHA",
"D3DTEXOPCAPS_BLENDTEXTUREALPHA",
"D3DTEXOPCAPS_BLENDFACTORALPHA",
"D3DTEXOPCAPS_BLENDTEXTUREALPHAPM",
"D3DTEXOPCAPS_BLENDCURRENTALPHA",
"D3DTEXOPCAPS_PREMODULATE",
"D3DTEXOPCAPS_MODULATEALPHA_ADDCOLOR",
"D3DTEXOPCAPS_MODULATECOLOR_ADDALPHA",
"D3DTEXOPCAPS_MODULATEINVALPHA_ADDCOLOR",
"D3DTEXOPCAPS_MODULATEINVCOLOR_ADDALPHA",
"D3DTEXOPCAPS_BUMPENVMAP",
"D3DTEXOPCAPS_BUMPENVMAPLUMINANCE",
"D3DTEXOPCAPS_DOTPRODUCT3",
"D3DTEXOPCAPS_MULTIPLYADD",
"D3DTEXOPCAPS_LERP",
])
D3DFVFCAPS = Flags(DWORD, [
"D3DFVFCAPS_TEXCOORDCOUNTMASK",
"D3DFVFCAPS_DONOTSTRIPELEMENTS",
"D3DFVFCAPS_PSIZE",
])
D3DVTXPCAPS = Flags(DWORD, [
"D3DVTXPCAPS_TEXGEN",
"D3DVTXPCAPS_MATERIALSOURCE7",
"D3DVTXPCAPS_DIRECTIONALLIGHTS",
"D3DVTXPCAPS_POSITIONALLIGHTS",
"D3DVTXPCAPS_LOCALVIEWER",
"D3DVTXPCAPS_TWEENING",
"D3DVTXPCAPS_TEXGEN_SPHEREMAP",
"D3DVTXPCAPS_NO_TEXGEN_NONLOCALVIEWER",
])
D3DDEVCAPS2 = Flags(DWORD, [
"D3DDEVCAPS2_STREAMOFFSET",
"D3DDEVCAPS2_DMAPNPATCH",
"D3DDEVCAPS2_ADAPTIVETESSRTPATCH",
"D3DDEVCAPS2_ADAPTIVETESSNPATCH",
"D3DDEVCAPS2_CAN_STRETCHRECT_FROM_TEXTURES",
"D3DDEVCAPS2_PRESAMPLEDDMAPNPATCH",
"D3DDEVCAPS2_VERTEXELEMENTSCANSHARESTREAMOFFSET",
])
D3DDTCAPS = Flags(DWORD, [
"D3DDTCAPS_UBYTE4",
"D3DDTCAPS_UBYTE4N",
"D3DDTCAPS_SHORT2N",
"D3DDTCAPS_SHORT4N",
"D3DDTCAPS_USHORT2N",
"D3DDTCAPS_USHORT4N",
"D3DDTCAPS_UDEC3",
"D3DDTCAPS_DEC3N",
"D3DDTCAPS_FLOAT16_2",
"D3DDTCAPS_FLOAT16_4",
])
#D3DPS_VERSION = Enum("DWORD", [
# "D3DPS_VERSION(0,0)",
# "D3DPS_VERSION(1,0)",
# "D3DPS_VERSION(1,1)",
# "D3DPS_VERSION(1,2)",
# "D3DPS_VERSION(1,3)",
# "D3DPS_VERSION(1,4)",
# "D3DPS_VERSION(2,0)",
# "D3DPS_VERSION(3,0)",
#])
D3DPS_VERSION = DWORD
#D3DVS_VERSION = Enum("DWORD", [
# "D3DVS_VERSION(0,0)",
# "D3DVS_VERSION(1,0)",
# "D3DVS_VERSION(1,1)",
# "D3DVS_VERSION(2,0)",
# "D3DVS_VERSION(3,0)",
#])
D3DVS_VERSION = DWORD
D3DCAPS9 = Struct("D3DCAPS9", [
(D3DDEVTYPE, "DeviceType"),
(UINT, "AdapterOrdinal"),
(D3DCAPS, "Caps"),
(D3DCAPS2, "Caps2"),
(D3DCAPS3, "Caps3"),
(D3DPRESENT_INTERVAL, "PresentationIntervals"),
(D3DCURSORCAPS, "CursorCaps"),
(D3DDEVCAPS, "DevCaps"),
(D3DPMISCCAPS, "PrimitiveMiscCaps"),
(D3DPRASTERCAPS, "RasterCaps"),
(D3DPCMPCAPS, "ZCmpCaps"),
(D3DPBLENDCAPS, "SrcBlendCaps"),
(D3DPBLENDCAPS, "DestBlendCaps"),
(D3DPCMPCAPS, "AlphaCmpCaps"),
(D3DPSHADECAPS, "ShadeCaps"),
(D3DPTEXTURECAPS, "TextureCaps"),
(D3DPTFILTERCAPS, "TextureFilterCaps"),
(D3DPTFILTERCAPS, "CubeTextureFilterCaps"),
(D3DPTFILTERCAPS, "VolumeTextureFilterCaps"),
(D3DPTADDRESSCAPS, "TextureAddressCaps"),
(D3DPTADDRESSCAPS, "VolumeTextureAddressCaps"),
(D3DLINECAPS, "LineCaps"),
(DWORD, "MaxTextureWidth"),
(DWORD, "MaxTextureHeight"),
(DWORD, "MaxVolumeExtent"),
(DWORD, "MaxTextureRepeat"),
(DWORD, "MaxTextureAspectRatio"),
(DWORD, "MaxAnisotropy"),
(Float, "MaxVertexW"),
(Float, "GuardBandLeft"),
(Float, "GuardBandTop"),
(Float, "GuardBandRight"),
(Float, "GuardBandBottom"),
(Float, "ExtentsAdjust"),
(D3DSTENCILCAPS, "StencilCaps"),
(D3DFVFCAPS, "FVFCaps"),
(D3DTEXOPCAPS, "TextureOpCaps"),
(DWORD, "MaxTextureBlendStages"),
(DWORD, "MaxSimultaneousTextures"),
(D3DVTXPCAPS, "VertexProcessingCaps"),
(DWORD, "MaxActiveLights"),
(DWORD, "MaxUserClipPlanes"),
(DWORD, "MaxVertexBlendMatrices"),
(DWORD, "MaxVertexBlendMatrixIndex"),
(Float, "MaxPointSize"),
(DWORD, "MaxPrimitiveCount"),
(DWORD, "MaxVertexIndex"),
(DWORD, "MaxStreams"),
(DWORD, "MaxStreamStride"),
(D3DVS_VERSION, "VertexShaderVersion"),
(DWORD, "MaxVertexShaderConst"),
(D3DPS_VERSION, "PixelShaderVersion"),
(Float, "PixelShader1xMaxValue"),
(D3DDEVCAPS2, "DevCaps2"),
(Float, "MaxNpatchTessellationLevel"),
(DWORD, "Reserved5"),
(UINT, "MasterAdapterOrdinal"),
(UINT, "AdapterOrdinalInGroup"),
(UINT, "NumberOfAdaptersInGroup"),
(D3DDTCAPS, "DeclTypes"),
(DWORD, "NumSimultaneousRTs"),
(D3DPTFILTERCAPS, "StretchRectFilterCaps"),
(D3DVSHADERCAPS2_0, "VS20Caps"),
(D3DPSHADERCAPS2_0, "PS20Caps"),
(D3DPTFILTERCAPS, "VertexTextureFilterCaps"),
(DWORD, "MaxVShaderInstructionsExecuted"),
(DWORD, "MaxPShaderInstructionsExecuted"),
(DWORD, "MaxVertexShader30InstructionSlots"),
(DWORD, "MaxPixelShader30InstructionSlots"),
])
| 29.971564 | 79 | 0.716872 | 949 | 12,648 | 9.266596 | 0.454162 | 0.026154 | 0.007391 | 0.007505 | 0.01501 | 0.01501 | 0.01501 | 0 | 0 | 0 | 0 | 0.040682 | 0.156546 | 12,648 | 421 | 80 | 30.042755 | 0.783652 | 0.119466 | 0 | 0.092486 | 0 | 0 | 0.607384 | 0.464266 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.00578 | 0 | 0.00578 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
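The `Flags(DWORD, [...])` wrappers above describe capability bitmasks. A small sketch of how a raw DWORD decomposes back into flag names — the two cursor-caps bit values follow the usual d3d9.h numbering, but treat them as illustrative rather than authoritative:

```python
# bit -> name table for one of the simplest flag sets above.
FLAGS = {
    0x1: "D3DCURSORCAPS_COLOR",
    0x2: "D3DCURSORCAPS_LOWRES",
}


def decode_flags(dword: int, table: dict) -> list[str]:
    """Return the names of all flag bits set in `dword`, lowest bit first."""
    return [name for bit, name in sorted(table.items()) if dword & bit]


print(decode_flags(0x3, FLAGS))  # both cursor capabilities set
```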
1c1a874f462704bb21b985ac2653cb99f9e8cd06 | 351 | py | Python | offspect/gui/VWidgets/message.py | translationalneurosurgery/tool-offspect | 011dafb697e8542fc7c3cf8af8523af3ff704a14 | [
"MIT"
] | 1 | 2022-02-23T12:26:45.000Z | 2022-02-23T12:26:45.000Z | offspect/gui/VWidgets/message.py | neuromti/tool-offspect | 011dafb697e8542fc7c3cf8af8523af3ff704a14 | [
"MIT"
] | 51 | 2020-03-15T14:52:36.000Z | 2020-09-28T09:30:53.000Z | offspect/gui/VWidgets/message.py | neuromti/tool-offspect | 011dafb697e8542fc7c3cf8af8523af3ff704a14 | [
"MIT"
] | 1 | 2020-03-24T07:35:30.000Z | 2020-03-24T07:35:30.000Z | # from PyQt5.QtWidgets import QMessageBox
# def raise_error(message: str = "DEFAULT:Error Description:More Information"):
# box = QMessageBox()
# kind, msg, info = message.split(":")
# box.setIcon(QMessageBox.Critical)
# box.setWindowTitle(kind + " Error")
# box.setText(msg)
# box.setInformativeText(info)
# box.exec_()
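The commented-out helper splits a colon-delimited message into kind, text, and details before handing them to a QMessageBox. A minimal, GUI-free sketch of just that parsing step (the function name `split_error_message` is mine, not from this module):

```python
def split_error_message(message: str = "DEFAULT:Error Description:More Information"):
    # mirror the commented-out raise_error: "KIND:message:informative text"
    kind, msg, info = message.split(":")
    return kind, msg, info

print(split_error_message("IO:File missing:Check the path"))
# → ('IO', 'File missing', 'Check the path')
```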
| 29.25 | 79 | 0.669516 | 38 | 351 | 6.131579 | 0.657895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003509 | 0.188034 | 351 | 11 | 80 | 31.909091 | 0.814035 | 0.940171 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1c29c6717099eef5374f68ca8190022df79ffc51 | 4,574 | py | Python | Footy/UnsupportedBantzStrings.py | schleising/banter-bot | 3e51453ae993d2c26dc51464a3cef3875a6be3c9 | [
"Apache-2.0"
] | null | null | null | Footy/UnsupportedBantzStrings.py | schleising/banter-bot | 3e51453ae993d2c26dc51464a3cef3875a6be3c9 | [
"Apache-2.0"
] | 2 | 2022-03-27T10:44:38.000Z | 2022-03-28T19:24:39.000Z | Footy/UnsupportedBantzStrings.py | schleising/banter-bot | 3e51453ae993d2c26dc51464a3cef3875a6be3c9 | [
"Apache-2.0"
] | 1 | 2022-03-28T11:45:47.000Z | 2022-03-28T11:45:47.000Z | # {team} -> Name of team
# {name} -> Name of person who supports team
teamMatchStarted: list[str] = [
"{team} are shit",
"{team} cunts",
"Dirty {team}",
"Dirty {team}, dirty {name}",
]
drawing: list[str] = [
"{team} level, this is a shit match",
"Boring old {team}",
"Happy with how it's going, {name}?",
"Yawn...",
"{team} wankers",
"How can you support this rubbish, {name}?",
"You get the feeling that {team} don't really want this",
"No passion from {team}, {name}",
"If a game of football is like making love to a beautiful woman, this {team} game is a £10 hand job from a swivel-eyed misfit",
"This {team} match is like a game of chess. But with more players and only one piece",
]
teamLeadByOne: list[str] = [
"{team} cheats, the ref's a cunt",
"That was never a goal for {team}",
"{team} don't deserve that",
"Bollocks",
"That should go to VAR",
    "Bit fortunate for {team}",
"Can't imagine {team} will keep this lead",
"Lucky goal for {team}",
"{team} got lucky there",
"{team} aren't good enough to stay ahead",
"Offside!",
]
teamExtendingLead: list[str] = [
"There's no way {team} deserve this lead",
"Have {team} paid the ref?",
"This is bullshit",
"The ref's a cunt, {name}'s a cunt",
"The ref's a cunt, {team} are cunts, {name}'s a cunt",
"Something has gone seriously wrong with this country",
"When I voted for Brexit, I didn't vote for this",
"At least Boris remains in charge, we've still got that",
"Richard Wanger would be turning in his grave",
"Liberal elite bullshit",
"That was so offside",
"VAR!",
"Is the linesman OK?",
"If only {name}'s wife was as dirty as this game",
]
teamLosingLead: list[str] = [
"Lazy old {team}, lazy old {name}",
"{team} are throwing it away",
"{team} are rubbish",
"{team} fucking it up again",
"We really are being treated to some world class flouncing from {team} today",
"Brace yourself, {name}. This is going to hurt",
"This is brown trouser time for {team}",
"I hope {name} brought a spare pair of underpants",
"I see {team} are playing their B Team. B for Bullshit",
]
teamDeficitOfOne: list[str] = [
"This is more like it from {team}",
"Oh dear...",
"{team} wankers",
"How are you feeling, {name}?",
"Bit disappointing, {name}?",
"Not looking good for {team}, {name}",
"You must be furious, {name}",
"{team} have just got no heart",
"This is what happens when you try to buy success",
"All that money spent, {name}, and for what?",
]
teamExtendingDeficit: list[str] = [
"Starting to feel a bit sorry for {team}",
"Never mind, {name}, there's always the next game",
"Poor {team}",
"Whoops...",
"Oh dear, everything OK, {name}?",
"Hey {name}, where'd you get your keeper?\nPOUNDSHOP !! POUNDSHOP !!",
"{team} can't raise themselves for this game, typical",
"A team like {team} have such a proud history, but what we see today is just embarrassing",
"{team} clearly not up for it today",
"{team} are letting you down, {name}",
"Watching {team} is like watching a bunch of cavemen: Neanderthal",
]
teamLosingDeficit: list[str] = [
"Too little too late for {team}",
"{team} won't come back from here",
"The ref's a cunt",
"This is awful",
"What a mess",
"Well this is an unmitigated shit show",
]
teamWon: list[str] = [
"That was a shit game",
"There's no way {team} deserved that",
"Fuck you, {name} !!",
"This will go down in history...\nAs the most tedious game I have ever had the misfortune to witness",
]
teamLost: list[str] = [
"Justice done, {team} lost",
"Job done for {team}?",
"Job done, {name}?",
"{name} !!?",
"Probably the best {team} could hope for",
"Everything OK, {name}?",
"{team} continue to disappoint",
"Well if the football thing doesn't work out for {team}, they can always consider a career on the stage",
"{team} set the bar low",
"{team} fail to meet their already low expectations",
]
teamDrew: list[str] = [
"Another uninspiring result for {team}",
"Thanks for nothing, {team}",
"Well that's 90 minutes we won't get back, thanks {team}",
"Another draw for {team}",
"Boring old {team}",
"You should be happy with that result, {name}",
"If I could pick one highlight from this {team} game it would be when it finally ended.",
"I think {name} will be happy with {team}'s performance today.",
]
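Each template above uses `{team}` and `{name}` placeholders; a bot would pick one at random and fill it in with `str.format`. A minimal sketch (the `pick_line` helper is hypothetical, not part of this module):

```python
import random

def pick_line(templates: list[str], team: str, name: str = "someone") -> str:
    # choose one template and substitute the placeholders used by the lists above
    return random.choice(templates).format(team=team, name=name)

print(pick_line(["{team} are letting you down, {name}"], "City", "Sam"))
# → City are letting you down, Sam
```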
| 34.390977 | 131 | 0.622868 | 687 | 4,574 | 4.148472 | 0.404658 | 0.027018 | 0.012632 | 0.011228 | 0.027368 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001148 | 0.238085 | 4,574 | 132 | 132 | 34.651515 | 0.816356 | 0.014211 | 0 | 0.033613 | 0 | 0.016807 | 0.749001 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.008403 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1c2a4f856378f5f6862c783cf6cd99f449623c3f | 3,271 | py | Python | nlpir/native/classifier.py | NLPIR-team/nlpir-python | 029f81e69ee561725fa017dce09cfd55acb34d20 | [
"MIT"
] | 18 | 2021-01-01T03:07:17.000Z | 2022-03-20T11:25:46.000Z | nlpir/native/classifier.py | yangyaofei/nlpir-python | af7bca9e2c3ba5f07316364da8f5e46a2bbc7c52 | [
"MIT"
] | 23 | 2020-12-04T07:10:09.000Z | 2022-03-10T09:39:26.000Z | nlpir/native/classifier.py | yangyaofei/nlpir-python | af7bca9e2c3ba5f07316364da8f5e46a2bbc7c52 | [
"MIT"
] | 13 | 2020-10-23T13:38:26.000Z | 2022-03-18T12:10:10.000Z | # coding=utf-8
from nlpir.native.nlpir_base import NLPIRBase
from ctypes import c_bool, c_char_p, c_int, POINTER, Structure, c_float
class StDoc(Structure):
    _fields_ = [  # ctypes reads the single-underscore "_fields_"; "__fields__" is silently ignored
("sTitle", c_char_p),
("sContent", c_char_p),
("sAuthor", c_char_p),
("sBoard", c_char_p),
("sDatatype", c_char_p)
]
class Classifier(NLPIRBase):
@property
def dll_name(self):
return "LJClassifier"
@NLPIRBase.byte_str_transform
def init_lib(self, data_path: str, encode: int, license_code: str) -> int:
"""
Call **classifier_init**
:param data_path:
:param encode:
:param license_code:
        :return: 1 on success, 0 on failure
"""
return self.get_func("classifier_init", [c_char_p, c_char_p, c_int, c_char_p], c_bool)(
"rulelist.xml", data_path, encode, license_code)
@NLPIRBase.byte_str_transform
def exit_lib(self) -> bool:
"""
Call **classifier_exit**
:return: exit success or not
"""
return self.get_func("classifier_exit", None, None)()
@NLPIRBase.byte_str_transform
def get_last_error_msg(self) -> str:
return self.get_func("classifier_GetLastErrorMsg", None, c_char_p)()
@NLPIRBase.byte_str_transform
def exec_1(self, data: StDoc, out_type: int = 0):
"""
Call **classifier_exec1**
        Classify the given document structure.
        :param data: the document structure
        :param out_type: whether the output includes confidence scores: 0 without, 1 with
        :return: topic-category string; categories are separated by \t and sorted by
            confidence, high to low, e.g. "要闻 敏感 诉讼" or "要闻 1.00 敏感 0.95 诉讼 0.82"
"""
return self.get_func("classifier_exec1", [POINTER(StDoc), c_int], c_char_p)(data, out_type)
@NLPIRBase.byte_str_transform
def exec(self, title: str, content: str, out_type: int):
"""
Call **classifier_exec**
        Classify the given document.
        :param title: the document title
        :param content: the document body
        :param out_type: whether to include confidence scores, same as :func:`exec_1`
        :return: same as :func:`exec_1`
"""
return self.get_func("classifier_exec", [c_char_p, c_char_p, c_int], c_char_p)(title, content, out_type)
@NLPIRBase.byte_str_transform
def exec_file(self, filename: str, out_type: int) -> str:
"""
Call **classifier_execFile**
        :param filename: the file to classify
        :param out_type: whether the output includes confidence scores: 0 without, 1 with
        :return: topic-category string; categories are separated by \t and sorted by
            confidence, high to low, e.g. "要闻 敏感 诉讼" or "要闻 1.00 敏感 0.95 诉讼 0.82"
"""
return self.get_func("classifier_execFile", [c_char_p, c_int], c_char_p)(filename, out_type)
@NLPIRBase.byte_str_transform
def detail(self, class_name: str):
"""
Call **classifier_detail**
        For the current document, look up a class name and return the detailed match results.
        :param class_name: the result class name
        :return: result details, for example:
::
RULE3:
SUBRULE1: 内幕 1
SUBRULE2: 股市 1 基金 3 股票 8
SUBRULE3: 书摘 2
"""
return self.get_func("classifier_detail", [c_char_p], c_char_p)(class_name)
@NLPIRBase.byte_str_transform
def set_sim_thresh(self, sim: float):
"""
Call **classifier_setsimthresh**
        Set the similarity threshold.
        :param sim: the threshold value
:return:
"""
return self.get_func("classifier_setsimthresh", [c_float])(sim)
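Every wrapper method above funnels through `get_func(name, argtypes, restype)`, the standard ctypes pattern of binding a shared-library symbol once together with its C signature. A standalone sketch of that pattern against libc's `abs` (assumes a Unix-like system where `CDLL(None)` exposes libc symbols; the `bind` helper name is mine, not the NLPIR API):

```python
import ctypes

def bind(lib, name, argtypes, restype):
    # look the symbol up once and attach its C signature, like get_func above
    fn = getattr(lib, name)
    fn.argtypes = argtypes
    fn.restype = restype
    return fn

libc = ctypes.CDLL(None)          # handle to the running process (Unix)
c_abs = bind(libc, "abs", [ctypes.c_int], ctypes.c_int)
print(c_abs(-7))
# → 7
```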
| 27.957265 | 112 | 0.597982 | 426 | 3,271 | 4.312207 | 0.274648 | 0.048993 | 0.058792 | 0.030484 | 0.41644 | 0.242243 | 0.218291 | 0.199238 | 0.148068 | 0.148068 | 0 | 0.017131 | 0.286151 | 3,271 | 116 | 113 | 28.198276 | 0.769593 | 0.293183 | 0 | 0.205128 | 0 | 0 | 0.106022 | 0.025219 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.051282 | 0.051282 | 0.589744 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1c2b194566a9e96dba834338ec915a2289eb1837 | 682 | py | Python | functions/markdown-to-html/markdown2html.py | truls/faas-profiler | d54ca0d9926f38c693f616ba4d08414aea823f51 | [
"MIT"
] | null | null | null | functions/markdown-to-html/markdown2html.py | truls/faas-profiler | d54ca0d9926f38c693f616ba4d08414aea823f51 | [
"MIT"
] | null | null | null | functions/markdown-to-html/markdown2html.py | truls/faas-profiler | d54ca0d9926f38c693f616ba4d08414aea823f51 | [
"MIT"
] | null | null | null | # Copyright (c) 2019 Princeton University
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
from markdown import markdown
import base64
import json
def main(params):
try:
md = json.loads(base64.decodebytes(params["__ow_body"].encode("utf-8")))["markdown"].encode("utf-8")
md_text = base64.decodebytes(md).decode("utf-8")
except KeyError:
return {'Error' : 'Possibly lacking markdown parameter in request.'}
test_id = params["__ow_query"].split("&")[0]
html = markdown(md_text)
return {"result": "ok", "html_response": html, "testid": test_id}
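The handler expects `__ow_body` to be base64 of a JSON object whose `markdown` field is itself base64 of the text, so a caller has to nest the encoding the same way. A round-trip sketch of just that packing and unpacking (the markdown-to-HTML step is omitted so this runs without the `markdown` package):

```python
import base64
import json

md_source = "# Title"
# inner layer: base64 of the markdown text, stored under "markdown"
payload = {"markdown": base64.encodebytes(md_source.encode("utf-8")).decode("utf-8")}
# outer layer: base64 of the whole JSON body, as __ow_body arrives
body = base64.encodebytes(json.dumps(payload).encode("utf-8"))

# unpacking, mirroring main()
md = json.loads(base64.decodebytes(body))["markdown"].encode("utf-8")
md_text = base64.decodebytes(md).decode("utf-8")
print(md_text)
# → # Title
```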
| 29.652174 | 108 | 0.690616 | 94 | 682 | 4.893617 | 0.617021 | 0.026087 | 0.043478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028623 | 0.180352 | 682 | 22 | 109 | 31 | 0.794275 | 0.233138 | 0 | 0.153846 | 0 | 0 | 0.235521 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1c2db146a81095258082a5e01445b3cddf1eab20 | 8,037 | py | Python | users/models.py | moshthepitt/probsc | 9b8cab206bb1c41238e36bd77f5e0573df4d8e2d | [
"MIT"
] | null | null | null | users/models.py | moshthepitt/probsc | 9b8cab206bb1c41238e36bd77f5e0573df4d8e2d | [
"MIT"
] | null | null | null | users/models.py | moshthepitt/probsc | 9b8cab206bb1c41238e36bd77f5e0573df4d8e2d | [
"MIT"
] | null | null | null | from django.conf import settings
from django.db import models
from django.utils.translation import ugettext_lazy as _
from django.utils.encoding import python_2_unicode_compatible
from django.urls import reverse
from django_extensions.db.models import TimeStampedModel
from mptt.models import MPTTModel, TreeForeignKey
from .managers import UserProfileManager, DepartmentManager, PositionManager
User = settings.AUTH_USER_MODEL
class Department(MPTTModel, TimeStampedModel):
"""
Departments in an organisation
"""
name = models.CharField(_("Name"), max_length=255)
description = models.TextField(_("Description"), blank=True, default="")
parent = TreeForeignKey('self', verbose_name=_("Parent"), null=True,
blank=True, related_name='children', db_index=True,
on_delete=models.PROTECT,
help_text=_("The parent department"))
customer = models.ForeignKey(
'customers.Customer', verbose_name=_("Customer"),
on_delete=models.PROTECT)
manager = models.ForeignKey(
User, verbose_name=_("Manager"), on_delete=models.PROTECT,
blank=True, null=True)
active = models.BooleanField(_("Active"), default=True)
objects = DepartmentManager()
class Meta:
verbose_name = _("Department")
verbose_name_plural = _("Departments")
ordering = ['name']
def get_absolute_url(self):
return "#"
def get_edit_url(self):
return reverse('users:departments_edit', args=[self.pk])
def get_delete_url(self):
return reverse('users:departments_delete', args=[self.pk])
def get_list_url(self):
return reverse('users:departments_list')
def __str__(self):
return self.name
class Position(MPTTModel, TimeStampedModel):
"""
Job positions in an organisation
"""
name = models.CharField(_("Name"), max_length=255)
description = models.TextField(_("Description"), blank=True, default="")
department = models.ForeignKey(
Department, verbose_name=_("Department"), on_delete=models.PROTECT)
parent = TreeForeignKey('self', verbose_name=_("Reports To"), null=True,
blank=True, related_name='children', db_index=True,
on_delete=models.PROTECT,
help_text=_("The parent Job Position"))
supervisor = models.ForeignKey(
User, verbose_name=_("Supervisor"), on_delete=models.PROTECT,
blank=True, null=True)
customer = models.ForeignKey(
'customers.Customer', verbose_name=_("Customer"),
on_delete=models.PROTECT)
active = models.BooleanField(_("Active"), default=True)
objects = PositionManager()
class Meta:
verbose_name = _("Job Positions")
verbose_name_plural = _("Job Positions")
ordering = ['name']
def get_absolute_url(self):
return "#"
def get_edit_url(self):
return reverse('users:positions_edit', args=[self.pk])
def get_delete_url(self):
return reverse('users:positions_delete', args=[self.pk])
def get_list_url(self):
return reverse('users:positions_list')
def __str__(self):
return "{} - {}".format(self.department.name, self.name)
@python_2_unicode_compatible
class UserProfile(models.Model):
"""
Model used to store more information on users
"""
ADMIN = '1'
MEMBER = '2'
EDITOR = '3'
MEMBER_ROLE_CHOICES = (
(ADMIN, _('Admin')),
(EDITOR, _('Editor')),
(MEMBER, _('Member')),
)
created_on = models.DateTimeField(_("Created on"), auto_now_add=True)
updated_on = models.DateTimeField(_("Updated on"), auto_now=True)
    user = models.OneToOneField(
        User, verbose_name=_("User"),
        on_delete=models.CASCADE)  # explicit on_delete (required since Django 2.0); CASCADE was the implicit default
position = models.ForeignKey(Position, verbose_name=_(
"job Position"), on_delete=models.SET_NULL, blank=True, null=True,
default=None)
customer = models.ForeignKey('customers.Customer', verbose_name=_(
"Customer"), on_delete=models.SET_NULL, blank=True, null=True,
default=None)
role = models.CharField(
_("Role"), max_length=1, choices=MEMBER_ROLE_CHOICES, blank=False,
default=MEMBER)
active = models.BooleanField(
_("Active"), default=True, help_text="Is the staff member actively "
"employed?")
objects = UserProfileManager()
class Meta:
verbose_name = _("Staff Member")
verbose_name_plural = _("Staff Members")
ordering = ['user__first_name', 'user__last_name', 'user__email']
def get_name(self):
if self.user.get_full_name():
return self.user.get_full_name()
if self.user.email:
return self.user.email
return self.user.username
def get_initials(self):
if self.user.first_name and self.user.last_name:
return "{}{}".format(self.user.first_name[0],
self.user.last_name[0])
if self.user.first_name:
return self.user.first_name[0]
if self.user.last_name:
return self.user.last_name[0]
return self.user.email[0]
def is_admin(self):
return self.role == self.ADMIN
def is_editor(self):
return self.role == self.EDITOR
def can_edit(self):
return self.role == self.EDITOR or self.role == self.ADMIN
def get_subordinates(self):
"""
Returns a queryset of UserProfile objects which report to this
userprofile
"""
if self.position:
queryset = UserProfile.objects.active().exclude(
id=self.id).filter(
models.Q(
position__supervisor=self.user) | models.Q(
position__department__manager=self.user) | models.Q(
position__parent=self.position))
else:
queryset = UserProfile.objects.active().exclude(
id=self.id).filter(
models.Q(
position__supervisor=self.user) | models.Q(
position__department__manager=self.user))
# get job positions of subs
subordinate_positions = Position.objects.filter(
userprofile__in=queryset)
# get any position that may report to these positions
# list of position ids of Positions that report to
# subordinate_positions
reporting_jp_ids = []
for sub_p in subordinate_positions:
reporting_jps = sub_p.get_descendants(include_self=False)
if reporting_jps is not None:
reporting_jp_ids = reporting_jp_ids + list(
reporting_jps.values_list('id', flat=True))
reporting_jp_ids = list(set(reporting_jp_ids))
# get user profiles wiht positions that report to subordinate_positions
reporting_profiles = UserProfile.objects.active().filter(
position__id__in=reporting_jp_ids)
queryset = queryset.union(reporting_profiles)
# unions result in weird filtering so we create a new queryset
queryset_ids = list(set([x.id for x in queryset]))
if queryset_ids:
queryset = UserProfile.objects.filter(id__in=queryset_ids)
else:
queryset = UserProfile.objects.none()
return queryset
def has_subordinates(self):
return self.get_subordinates().exists()
def get_department(self):
if self.position is not None:
return self.position.department.name
return None
def get_absolute_url(self):
return "#"
def get_edit_url(self):
return reverse('users:userprofiles_edit', args=[self.pk])
def get_delete_url(self):
return "#"
def get_list_url(self):
return reverse('users:userprofiles_list')
def __str__(self):
return _("{user}").format(user=self.get_name())
| 33.911392 | 79 | 0.630459 | 902 | 8,037 | 5.37694 | 0.190687 | 0.039175 | 0.032165 | 0.03299 | 0.491959 | 0.399381 | 0.352577 | 0.312165 | 0.289278 | 0.289278 | 0 | 0.002873 | 0.26378 | 8,037 | 236 | 80 | 34.055085 | 0.816799 | 0.057857 | 0 | 0.321429 | 0 | 0 | 0.087583 | 0.018158 | 0 | 0 | 0 | 0 | 0 | 1 | 0.136905 | false | 0 | 0.047619 | 0.113095 | 0.553571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
1c30e979a316677653e10a7d840b2373d881b549 | 1,925 | py | Python | src/modules/loss.py | ab3llini/BlindLess | 46c50fb2748b9d372044d00b901f0cde91946684 | [
"MIT"
] | 1 | 2022-03-19T09:19:12.000Z | 2022-03-19T09:19:12.000Z | src/modules/loss.py | ab3llini/BlindLess | 46c50fb2748b9d372044d00b901f0cde91946684 | [
"MIT"
] | 1 | 2020-02-06T18:26:07.000Z | 2020-02-06T18:26:07.000Z | src/modules/loss.py | ab3llini/BlindLess | 46c50fb2748b9d372044d00b901f0cde91946684 | [
"MIT"
] | null | null | null | from torch.nn import CrossEntropyLoss
class GPT2Loss(CrossEntropyLoss):
def __init__(self, pad_token_id):
super(GPT2Loss, self).__init__(ignore_index=pad_token_id)
def forward(self, output, labels):
"""
Loss function for gpt2
:param output:
:param labels:
:return:
"""
# Flatten the tensors (shift-align)
# Remove last token from output
output = output[..., :-1, :].contiguous().view(-1, output.size(-1))
# Remove the first token from labels e do not care for question
labels = (labels[..., 1:].contiguous()).view(-1)
# Compute the actual loss
return super(GPT2Loss, self).forward(output, labels)
class VisualGPT2Loss(GPT2Loss):
def __init__(self, pad_token_id, extract=None):
super(VisualGPT2Loss, self).__init__(pad_token_id=pad_token_id)
if extract is not None:
assert type(extract) == int, 'Extract value MUST be integer'
self.extract = extract
def forward(self, output, labels):
if self.extract is not None:
output = output[self.extract]
# Compute the actual loss
return super(VisualGPT2Loss, self).forward(output, labels[0])
class BERTLoss(CrossEntropyLoss):
def __init__(self, pad_token_id):
super(BERTLoss, self).__init__(ignore_index=pad_token_id)
def forward(self, output, labels):
"""
Loss function for gpt2
:param output:
:param labels:
:return:
"""
# Flatten the tensors (shift-align)
# Remove last token from output
output = output[..., :-1, :].contiguous().view(-1, output.size(-1))
# Remove the first token from labels e do not care for question
labels = (labels[..., 1:].contiguous()).view(-1)
# Compute the actual loss
return super(BERTLoss, self).forward(output, labels)
| 31.048387 | 75 | 0.619221 | 230 | 1,925 | 5.008696 | 0.252174 | 0.048611 | 0.060764 | 0.055556 | 0.664931 | 0.642361 | 0.597222 | 0.597222 | 0.524306 | 0.524306 | 0 | 0.014184 | 0.267532 | 1,925 | 61 | 76 | 31.557377 | 0.802837 | 0.232727 | 0 | 0.36 | 0 | 0 | 0.021106 | 0 | 0 | 0 | 0 | 0 | 0.04 | 1 | 0.24 | false | 0 | 0.04 | 0 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1c31541017e2e3db5152ae18abbb5211d1ab50d4 | 6,481 | py | Python | analyze_tls.py | khushhallchandra/CN-project | 405ce86e4e65e116531aa19287b8d05c959b1441 | [
"MIT"
] | null | null | null | analyze_tls.py | khushhallchandra/CN-project | 405ce86e4e65e116531aa19287b8d05c959b1441 | [
"MIT"
] | null | null | null | analyze_tls.py | khushhallchandra/CN-project | 405ce86e4e65e116531aa19287b8d05c959b1441 | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
def main(filename):
data = pd.read_csv(filename, header=None)
means = data.mean(axis = 0)
stds = data.std(axis = 0)
return means[0], means[1], stds[0], stds[1]
if __name__ == '__main__':
files_http1 = ['./results/benchmark_size/http1_txt1.csv', './results/benchmark_size/http1_txt2.csv', './results/benchmark_size/http1_txt3.csv', './results/benchmark_size/http1_txt4.csv', './results/benchmark_size/http1_txt5.csv']
files_http1_tls = ['./results/benchmark_size/http1_tls_txt1.csv', './results/benchmark_size/http1_tls_txt2.csv', './results/benchmark_size/http1_tls_txt3.csv', './results/benchmark_size/http1_tls_txt4.csv', './results/benchmark_size/http1_tls_txt5.csv']
files_http2 = ['./results/benchmark_size/http2_txt1.csv', './results/benchmark_size/http2_txt2.csv', './results/benchmark_size/http2_txt3.csv', './results/benchmark_size/http2_txt4.csv', './results/benchmark_size/http2_txt5.csv']
files_http2_tls = ['./results/benchmark_size/http2_tls_txt1.csv', './results/benchmark_size/http2_tls_txt2.csv', './results/benchmark_size/http2_tls_txt3.csv', './results/benchmark_size/http2_tls_txt4.csv', './results/benchmark_size/http2_tls_txt5.csv']
time_tot_http2, time_contentTransfer_http2 = [], []
std_tot_http2, std_contentTransfer_http2 = [], []
time_tot_http1, time_contentTransfer_http1 = [], []
std_tot_http1, std_contentTransfer_http1 = [], []
time_tot_http2_tls, time_contentTransfer_http2_tls = [], []
std_tot_http2_tls, std_contentTransfer_http2_tls = [], []
time_tot_http1_tls, time_contentTransfer_http1_tls = [], []
std_tot_http1_tls, std_contentTransfer_http1_tls = [], []
for f in files_http2:
t1, t2, std1, std2 = main(f)
time_contentTransfer_http2.append(t1)
time_tot_http2.append(t2)
std_contentTransfer_http2.append(2*std1)
std_tot_http2.append(2*std2)
for f in files_http1:
t1, t2, std1, std2 = main(f)
time_contentTransfer_http1.append(t1)
time_tot_http1.append(t2)
std_contentTransfer_http1.append(2*std1)
std_tot_http1.append(2*std2)
for f in files_http2_tls:
t1, t2, std1, std2 = main(f)
time_contentTransfer_http2_tls.append(t1)
time_tot_http2_tls.append(t2)
std_contentTransfer_http2_tls.append(2*std1)
std_tot_http2_tls.append(2*std2)
for f in files_http1_tls:
t1, t2, std1, std2 = main(f)
time_contentTransfer_http1_tls.append(t1)
time_tot_http1_tls.append(t2)
std_contentTransfer_http1_tls.append(2*std1)
std_tot_http1_tls.append(2*std2)
x = [100, 1000, 10000, 100000, 1000000]
time_tot_http2, time_contentTransfer_http2 = np.array(time_tot_http2), np.array(time_contentTransfer_http2)
std_tot_http2, std_contentTransfer_http2 = np.array(std_tot_http2), np.array(std_contentTransfer_http2)
time_tot_http1, time_contentTransfer_http1 = np.array(time_tot_http1), np.array(time_contentTransfer_http1)
std_tot_http1, std_contentTransfer_http1 = np.array(std_tot_http1), np.array(std_contentTransfer_http1)
time_tot_http2_tls, time_contentTransfer_http2_tls = np.array(time_tot_http2_tls), np.array(time_contentTransfer_http2_tls)
std_tot_http2_tls, std_contentTransfer_http2_tls = np.array(std_tot_http2_tls), np.array(std_contentTransfer_http2_tls)
time_tot_http1_tls, time_contentTransfer_http1_tls = np.array(time_tot_http1_tls), np.array(time_contentTransfer_http1_tls)
std_tot_http1_tls, std_contentTransfer_http1_tls = np.array(std_tot_http1_tls), np.array(std_contentTransfer_http1_tls)
fig, ax = plt.subplots()
ax.grid()
ax.plot(x, time_contentTransfer_http1, 'o-', color='r', label="HTTP1")
ax.plot(x, time_contentTransfer_http1_tls, 'o-', color='g', label="HTTP1_with_tls")
ax.plot(x, time_contentTransfer_http2, 'o-', color='b', label="SPDY")
ax.plot(x, time_contentTransfer_http2_tls, 'o-', color='k', label="SPDY_with_tls")
ax.fill_between(x, time_contentTransfer_http1 - std_contentTransfer_http1, time_contentTransfer_http1 + std_contentTransfer_http1, color='gray', alpha=0.3)
ax.fill_between(x, time_contentTransfer_http2 - std_contentTransfer_http2, time_contentTransfer_http2 + std_contentTransfer_http2, color='gray', alpha=0.3)
ax.fill_between(x, time_contentTransfer_http1_tls - std_contentTransfer_http1_tls, time_contentTransfer_http1_tls + std_contentTransfer_http1_tls, color='gray', alpha=0.3)
ax.fill_between(x, time_contentTransfer_http2_tls - std_contentTransfer_http2_tls, time_contentTransfer_http2_tls + std_contentTransfer_http2_tls, color='gray', alpha=0.3)
# ax.errorbar(x, time_contentTransfer_http2, yerr=std_contentTransfer_http2, fmt='-', color='r', label="HTTP2")
# ax.errorbar(x, time_contentTransfer_quic, yerr=std_contentTransfer_quic, fmt='-', color='b', label="QUIC")
ax.set_xlabel('Size of data (Length)')
ax.set_ylabel('Time (in ms)')
ax.legend()
ax.set_xscale('log')
ax.set_title('Comparison of Time Taken for Data Transfer with TLS ON/OFF')
fig.savefig('results/plots/time_contentTransfer_tls.png', dpi=fig.dpi)
fig, ax = plt.subplots()
ax.grid()
ax.plot(x, time_tot_http1, 'o-', color='r', label="HTTP1")
ax.plot(x, time_tot_http1_tls, 'o-', color='g', label="HTTP1_with_tls")
ax.plot(x, time_tot_http2, 'o-', color='b', label="SPDY")
ax.plot(x, time_tot_http2_tls, 'o-', color='k', label="SPDY_with_tls")
ax.fill_between(x, time_tot_http1 - std_tot_http1, time_tot_http1 + std_tot_http1, color='gray', alpha=0.3)
ax.fill_between(x, time_tot_http2 - std_tot_http2, time_tot_http2 + std_tot_http2, color='gray', alpha=0.3)
ax.fill_between(x, time_tot_http1_tls - std_tot_http1_tls, time_tot_http1_tls + std_tot_http1_tls, color='gray', alpha=0.3)
ax.fill_between(x, time_tot_http2_tls - std_tot_http2_tls, time_tot_http2_tls + std_tot_http2_tls, color='gray', alpha=0.3)
# ax.errorbar(x, time_tot_http2, yerr=std_tot_http2, fmt='-', color='r', label="HTTP2")
# ax.errorbar(x, time_tot_quic, yerr=std_tot_quic, fmt='-', color='b', label="QUIC")
ax.set_xlabel('Size of data (Length)')
ax.set_ylabel('Time (in ms)')
ax.legend()
ax.set_xscale('log')
ax.set_title('Comparison of Total Time with TLS ON/OFF')
fig.savefig('results/plots/total_time_tls.png', dpi=fig.dpi) | 54.923729 | 257 | 0.738158 | 974 | 6,481 | 4.526694 | 0.103696 | 0.059878 | 0.090724 | 0.083466 | 0.889091 | 0.75482 | 0.540712 | 0.529372 | 0.448628 | 0.346564 | 0 | 0.044298 | 0.132696 | 6,481 | 118 | 258 | 54.923729 | 0.740082 | 0.059404 | 0 | 0.186047 | 0 | 0 | 0.19698 | 0.14675 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011628 | false | 0 | 0.034884 | 0 | 0.05814 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
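The shaded bands above are mean ± 2·std per column, computed by `main()` with pandas. The same statistics in plain NumPy (note pandas' `.std()` uses `ddof=1`, while NumPy defaults to `ddof=0`; the sample values are made up):

```python
import numpy as np

data = np.array([[1.0, 10.0],
                 [3.0, 14.0]])       # rows = runs, cols = measured times
means = data.mean(axis=0)            # → [ 2. 12.]
stds = data.std(axis=0, ddof=1)      # sample std, matching pandas' default
print(means, means - 2 * stds, means + 2 * stds)
```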
1c33fa15ddbf9c5dfc357e4226f51b2734c6f579 | 738 | py | Python | nodes/List/GetTaskRenderListIndex.py | atticus-lv/RenderNode | 8a4797a2186b76fedebc5d634cff298e69089474 | [
"Apache-2.0"
] | 17 | 2021-11-21T09:26:55.000Z | 2022-03-09T06:56:01.000Z | nodes/List/GetTaskRenderListIndex.py | atticus-lv/RenderNode | 8a4797a2186b76fedebc5d634cff298e69089474 | [
"Apache-2.0"
] | 1 | 2021-12-05T13:02:48.000Z | 2021-12-06T08:02:34.000Z | nodes/List/GetTaskRenderListIndex.py | atticus-lv/RenderNode | 8a4797a2186b76fedebc5d634cff298e69089474 | [
"Apache-2.0"
] | 4 | 2021-11-23T14:49:34.000Z | 2021-12-30T15:04:58.000Z | import bpy
from bpy.props import *
from ...nodes.BASE.node_base import RenderNodeBase
class RenderNodeGetListIndex(RenderNodeBase):
"""A simple input node"""
bl_idname = 'RenderNodeGetListIndex'
bl_label = 'Get List Index'
def init(self, context):
self.create_output('RenderNodeSocketInt', "index", 'Index')
def process(self,context,id,path):
node = self.id_data.nodes.get(bpy.context.window_manager.rsn_active_list)
if not node or node.bl_idname != 'RenderNodeTaskRenderListNode': return
self.outputs[0].set_value(node.active_index)
def register():
bpy.utils.register_class(RenderNodeGetListIndex)
def unregister():
bpy.utils.unregister_class(RenderNodeGetListIndex)
| 26.357143 | 81 | 0.730352 | 88 | 738 | 5.977273 | 0.522727 | 0.153992 | 0.045627 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001618 | 0.162602 | 738 | 27 | 82 | 27.333333 | 0.849515 | 0.025745 | 0 | 0 | 0 | 0 | 0.130435 | 0.070126 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.1875 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1c3ca96a8752bea73f340ee28894ea1bdab8af22 | 215 | py | Python | Python 3/19.prac_no2.py | ByeonUi-Hyeok/practice | 6f55ddcb662e2bf8e0c3fb4c4af0beb77a1c7d2d | [
"MIT"
] | 1 | 2021-06-11T08:55:03.000Z | 2021-06-11T08:55:03.000Z | Python 3/19.prac_no2.py | ByeonUi-Hyeok/practice | 6f55ddcb662e2bf8e0c3fb4c4af0beb77a1c7d2d | [
"MIT"
] | null | null | null | Python 3/19.prac_no2.py | ByeonUi-Hyeok/practice | 6f55ddcb662e2bf8e0c3fb4c4af0beb77a1c7d2d | [
"MIT"
] | null | null | null | import funcvote as vote
votes = input("투표내용 >>>")
# print(votes)
# print(type(votes))
result = vote.str2int(votes)
print(vote.countvotes(result))
result = vote.countvotes(result)
vote.printvote(result)
# 투표 초안 | 14.333333 | 32 | 0.716279 | 29 | 215 | 5.310345 | 0.517241 | 0.194805 | 0.25974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005348 | 0.130233 | 215 | 15 | 33 | 14.333333 | 0.818182 | 0.172093 | 0 | 0 | 0 | 0 | 0.045714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1c3f21c6980082d2b5b98180066cf9ba8b94eb50 | 156 | py | Python | utils/runtime_mode.py | omiderfanmanesh/dengue-infections-prediction | 6b4e4aa4af6f6e2cc581fd7828634bbfdc446340 | [
"Apache-2.0"
] | null | null | null | utils/runtime_mode.py | omiderfanmanesh/dengue-infections-prediction | 6b4e4aa4af6f6e2cc581fd7828634bbfdc446340 | [
"Apache-2.0"
] | null | null | null | utils/runtime_mode.py | omiderfanmanesh/dengue-infections-prediction | 6b4e4aa4af6f6e2cc581fd7828634bbfdc446340 | [
"Apache-2.0"
] | 1 | 2021-06-05T10:05:44.000Z | 2021-06-05T10:05:44.000Z | # Copyright (c) 2021, Omid Erfanmanesh, All rights reserved.
class RuntimeMode:
TRAIN = 0
TUNING = 1
CROSS_VAL = 2
FEATURE_IMPORTANCE = 3
| 19.5 | 61 | 0.666667 | 20 | 156 | 5.1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069565 | 0.262821 | 156 | 7 | 62 | 22.285714 | 0.817391 | 0.371795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1c45545157e97f9c4e1cc68b6cafb654b5d57282 | 439 | py | Python | news/views.py | valch85/newssite | ef612a7bde4ff1d6e1e35f5cc4ec9407f031270e | [
"Apache-2.0"
] | null | null | null | news/views.py | valch85/newssite | ef612a7bde4ff1d6e1e35f5cc4ec9407f031270e | [
"Apache-2.0"
] | 2 | 2020-02-12T00:16:37.000Z | 2020-06-05T20:42:49.000Z | news/views.py | valch85/newssite | ef612a7bde4ff1d6e1e35f5cc4ec9407f031270e | [
"Apache-2.0"
] | null | null | null | from django.shortcuts import render, get_object_or_404
from .models import News
# Create your views here.
def index(request):
latest_news_list = News.objects.order_by('-pub_date')[:10]
context = {'latest_news_list': latest_news_list}
return render(request, 'news/index.html', context)
def detail(request, news_id):
new = get_object_or_404(News, pk=news_id)
return render(request, 'news/detail.html', {'new': new})
| 27.4375 | 62 | 0.728929 | 66 | 439 | 4.606061 | 0.5 | 0.098684 | 0.138158 | 0.092105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021333 | 0.145786 | 439 | 15 | 63 | 29.266667 | 0.789333 | 0.052392 | 0 | 0 | 0 | 0 | 0.143204 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
1c4d007b31f3f642fe520a7abaa4b88348fd22fe | 179 | py | Python | torch/metrics/accuracy_score.py | LilDataScientist/PyTorch-From-Scratch | ae3c0bffc5a36a9a7c123b98f52bdaa32fbedef6 | [
"MIT"
] | null | null | null | torch/metrics/accuracy_score.py | LilDataScientist/PyTorch-From-Scratch | ae3c0bffc5a36a9a7c123b98f52bdaa32fbedef6 | [
"MIT"
] | null | null | null | torch/metrics/accuracy_score.py | LilDataScientist/PyTorch-From-Scratch | ae3c0bffc5a36a9a7c123b98f52bdaa32fbedef6 | [
"MIT"
] | null | null | null | import numpy as np
def accuracy_score(y_true, y_pred):
a = np.argmax(y_true, axis=1)
b = np.argmax(y_pred, axis=1)
return np.count_nonzero(a == b) / y_true.shape[0]
| 22.375 | 53 | 0.664804 | 35 | 179 | 3.2 | 0.571429 | 0.133929 | 0.160714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.195531 | 179 | 7 | 54 | 25.571429 | 0.756944 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
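A usage sketch for the `accuracy_score` snippet above: both inputs are row-wise class distributions (one-hot labels and predicted scores), and the function counts matching argmax rows. The sample arrays are made up for illustration.

```python
import numpy as np


def accuracy_score(y_true, y_pred):
    # Fraction of rows where the predicted class (argmax) matches the label.
    a = np.argmax(y_true, axis=1)
    b = np.argmax(y_pred, axis=1)
    return np.count_nonzero(a == b) / y_true.shape[0]


# Two of the three predicted rows agree with the one-hot labels.
y_true = np.array([[1, 0], [0, 1], [1, 0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8], [0.3, 0.7]])
print(accuracy_score(y_true, y_pred))  # 0.666...
```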
1c57c91d84e8ef886ecab5c688f26666500663aa | 536 | py | Python | Tree/node.py | philipwerner/python_data_structures | 554c38376b732f65c5c168d0e1bd30bea3d1ab6b | [
"MIT"
] | null | null | null | Tree/node.py | philipwerner/python_data_structures | 554c38376b732f65c5c168d0e1bd30bea3d1ab6b | [
"MIT"
] | null | null | null | Tree/node.py | philipwerner/python_data_structures | 554c38376b732f65c5c168d0e1bd30bea3d1ab6b | [
"MIT"
] | null | null | null | """Node class module for Binary Tree."""
class Node(object):
"""The Node class."""
def __init__(self, value):
"""Initialization of node object."""
self.value = value
self.left = None
self.right = None
def __str__(self):
"""Return a string representation of the node object."""
return f'{self.value}'
def __repr__(self):
"""Return a representation of the node object."""
return f'<Node | Value: {self.value} | Left: {self.left} | Right: {self.right}>'
| 26.8 | 88 | 0.585821 | 66 | 536 | 4.575758 | 0.363636 | 0.13245 | 0.072848 | 0.152318 | 0.238411 | 0.238411 | 0.238411 | 0 | 0 | 0 | 0 | 0 | 0.268657 | 536 | 19 | 89 | 28.210526 | 0.770408 | 0.328358 | 0 | 0 | 0 | 0.111111 | 0.245509 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
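A small driver for the `Node` class above, showing how `__str__` and `__repr__` differ; the tree values are arbitrary examples.

```python
class Node(object):
    """The Node class (reproduced from the snippet above)."""

    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

    def __str__(self):
        return f'{self.value}'

    def __repr__(self):
        return f'<Node | Value: {self.value} | Left: {self.left} | Right: {self.right}>'


root = Node(10)
root.left = Node(5)
root.right = Node(15)
print(str(root))   # 10
print(repr(root))  # <Node | Value: 10 | Left: 5 | Right: 15>
```

Note that `repr` formats the children through their `__str__`, so nested nodes print as bare values rather than full `<Node …>` strings.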
1c5a7f175c98d892dc83db59726cb2f27a8bed94 | 2,198 | py | Python | parser/fase2/team20/execution/executeSentence2.py | LopDlMa/tytus | 0b43ee1c7300cb11ddbe593e08239321b71dc443 | [
"MIT"
] | null | null | null | parser/fase2/team20/execution/executeSentence2.py | LopDlMa/tytus | 0b43ee1c7300cb11ddbe593e08239321b71dc443 | [
"MIT"
] | null | null | null | parser/fase2/team20/execution/executeSentence2.py | LopDlMa/tytus | 0b43ee1c7300cb11ddbe593e08239321b71dc443 | [
"MIT"
] | null | null | null | from .AST.sentence import *
from .AST.expression import *
from .AST.error import *
import sys
sys.path.append("../")
from console import *
def executeSentence2(self, sentence):
if isinstance(sentence, CreateDatabase):
h=0
elif isinstance(sentence, ShowDatabases):
h=0
elif isinstance(sentence, DropDatabase):
h=0
elif isinstance(sentence,Use):
h=0
elif isinstance(sentence,CreateTable):
h=0
elif isinstance(sentence, CreateType):
h=0
elif isinstance(sentence, InsertAll):
h=0
elif isinstance(sentence, Insert):
h=0
elif isinstance(sentence, Delete):
archivo = open("C3D.py", 'a')
archivo.write("\n")
archivo.write("ICreateDatabase("+sentence.name+","+sentence.ifNotExistsFlag+","+sentence.OrReplace+","+sentence.OwnerMode+")")
archivo.close()
elif isinstance(sentence,Select):
print(sentence.columns)
#print(sentence.columns[0].function)
#print(sentence.columns[0].expression)
print(sentence.tables)
print(sentence.options)
elif isinstance(sentence,DropTable):
h=0
elif isinstance(sentence,AlterDatabaseRename):
archivo = open("C3D.py", 'a')
archivo.write("\n")
archivo.write("ICreateDatabase("+sentence.name+","+sentence.ifNotExistsFlag+","+sentence.OrReplace+","+sentence.OwnerMode+")")
archivo.close()
elif isinstance(sentence,Update):
h=0
elif isinstance(sentence,AlterTableDropConstraint):
archivo = open("C3D.py", 'a')
archivo.write("\n")
archivo.write("ICreateDatabase("+sentence.name+","+sentence.ifNotExistsFlag+","+sentence.OrReplace+","+sentence.OwnerMode+")")
archivo.close()
elif isinstance(sentence,AlterTableAlterColumnType):
h=0
elif isinstance(sentence, AlterTableAddColumn):
h=0
elif isinstance(sentence, AlterTableDropColumn):
archivo = open("C3D.py", 'a')
archivo.write("\n")
archivo.write("ICreateDatabase("+sentence.name+","+sentence.ifNotExistsFlag+","+sentence.OrReplace+","+sentence.OwnerMode+")")
archivo.close()
| 36.032787 | 135 | 0.641947 | 219 | 2,198 | 6.442922 | 0.242009 | 0.216867 | 0.249468 | 0.136074 | 0.642098 | 0.437987 | 0.437987 | 0.437987 | 0.437987 | 0.437987 | 0 | 0.010989 | 0.213376 | 2,198 | 60 | 136 | 36.633333 | 0.80509 | 0.032757 | 0 | 0.509091 | 0 | 0 | 0.056026 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018182 | false | 0 | 0.090909 | 0 | 0.109091 | 0.054545 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
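The long `isinstance` ladder in `executeSentence2` above can be sketched as a type-keyed dispatch table instead. The `Sentence` subclasses and handler names below are stand-ins for the project's real AST classes, not the actual API.

```python
# Hypothetical stand-ins for the AST sentence classes.
class Sentence: pass
class CreateDatabase(Sentence): pass
class Select(Sentence): pass


def handle_create_database(sentence):
    return "create-database"


def handle_select(sentence):
    return "select"


# One entry per concrete sentence type, replacing the elif chain.
HANDLERS = {
    CreateDatabase: handle_create_database,
    Select: handle_select,
}


def execute(sentence):
    # Fall back to a no-op for sentence types without a handler yet,
    # mirroring the h=0 placeholder branches in the original.
    handler = HANDLERS.get(type(sentence))
    return handler(sentence) if handler else None


print(execute(Select()))  # select
```

This keeps each handler small and makes adding a new sentence type a one-line table entry; using `type(sentence)` as the key means exact types match, not subclasses.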
1c5d3932d3d58eb3852f548752bb665e5c02d910 | 475 | py | Python | pysol/core/helpers.py | lotfio/pysol | 34fac6d1ec246a7a037d8237e00974a9a9548faa | [
"MIT"
] | 2 | 2019-10-09T21:58:20.000Z | 2020-01-08T07:29:28.000Z | pysol/core/helpers.py | lotfio/pysol | 34fac6d1ec246a7a037d8237e00974a9a9548faa | [
"MIT"
] | null | null | null | pysol/core/helpers.py | lotfio/pysol | 34fac6d1ec246a7a037d8237e00974a9a9548faa | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#| This file is part of cony
#|
#| @package Pysol python cli application
#| @author <lotfio lakehal>
#| @license MIT
#| @version 0.1.0
#| @copyright 2019 lotfio lakehal
import sys
# load module function
# this function loads a module by string name
def load_module(module):
module_path = module
if module_path in sys.modules:
return sys.modules[module_path]
return __import__(module_path, fromlist=[module]) | 23.75 | 53 | 0.673684 | 64 | 475 | 4.859375 | 0.640625 | 0.128617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02168 | 0.223158 | 475 | 20 | 53 | 23.75 | 0.821138 | 0.547368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
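A usage sketch for `load_module` above: it returns an already-imported module from `sys.modules`, otherwise imports it by its dotted string name. `json` is used purely as a convenient standard-library example.

```python
import sys


def load_module(module):
    # Return a module object given its dotted name as a string.
    module_path = module
    if module_path in sys.modules:
        return sys.modules[module_path]
    return __import__(module_path, fromlist=[module])


json_mod = load_module("json")
print(json_mod.dumps({"a": 1}))  # {"a": 1}
```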
1c76dc2dd43f773b5f6265abe296c42c73bdff7c | 597 | py | Python | rgislackbot/dispatcher/dispatchconfig.py | raginggeek/RGISlackBot | 9fddf78b37f494eb6605da890a28f41427f37f03 | [
"MIT"
] | null | null | null | rgislackbot/dispatcher/dispatchconfig.py | raginggeek/RGISlackBot | 9fddf78b37f494eb6605da890a28f41427f37f03 | [
"MIT"
] | 11 | 2019-03-08T01:38:40.000Z | 2019-03-15T16:21:59.000Z | rgislackbot/dispatcher/dispatchconfig.py | raginggeek/RGISlackBot | 9fddf78b37f494eb6605da890a28f41427f37f03 | [
"MIT"
] | null | null | null | class DispatchConfig:
def __init__(self, raw_config):
self.registered_commands = {}
for package in raw_config["handlers"]:
for command in package["commands"]:
self.registered_commands[command] = {
"class": package["class"],
"fullpath": ".".join([package["package"], package["module"], package["class"]])
}
def get_handler_by_command(self, command):
if command in self.registered_commands:
return self.registered_commands[command]
else:
return None
| 37.3125 | 99 | 0.572864 | 57 | 597 | 5.77193 | 0.421053 | 0.170213 | 0.267477 | 0.176292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.313233 | 597 | 15 | 100 | 39.8 | 0.802439 | 0 | 0 | 0 | 0 | 0 | 0.088777 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
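A driver for the `DispatchConfig` class above. The shape of `raw_config` is inferred from the constructor; the package, module, and command values are made up for illustration.

```python
class DispatchConfig:
    def __init__(self, raw_config):
        self.registered_commands = {}
        for package in raw_config["handlers"]:
            for command in package["commands"]:
                self.registered_commands[command] = {
                    "class": package["class"],
                    "fullpath": ".".join([package["package"], package["module"], package["class"]])
                }

    def get_handler_by_command(self, command):
        if command in self.registered_commands:
            return self.registered_commands[command]
        return None


# Hypothetical config: one handler package exposing two command aliases.
config = {
    "handlers": [
        {"package": "handlers", "module": "ping", "class": "PingHandler",
         "commands": ["ping", "pong"]},
    ]
}
dc = DispatchConfig(config)
print(dc.get_handler_by_command("ping")["fullpath"])  # handlers.ping.PingHandler
```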
1c79747bf1ea18f4c3b8be4f42301cb16c8ee8f3 | 223 | py | Python | labJS/conf.py | lpomfrey/django-labjs | f35346ec7f3b87ae24b2d7a01c06001ceb4173bc | [
"MIT"
] | null | null | null | labJS/conf.py | lpomfrey/django-labjs | f35346ec7f3b87ae24b2d7a01c06001ceb4173bc | [
"MIT"
] | null | null | null | labJS/conf.py | lpomfrey/django-labjs | f35346ec7f3b87ae24b2d7a01c06001ceb4173bc | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from appconf import AppConf
from django.conf import settings # noqa
class LabjsConf(AppConf):
ENABLED = not settings.DEBUG
DEBUG_TOGGLE = 'labjs'
| 18.583333 | 40 | 0.730942 | 28 | 223 | 5.607143 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005495 | 0.183857 | 223 | 11 | 41 | 20.272727 | 0.857143 | 0.116592 | 0 | 0 | 0 | 0 | 0.025773 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
1c855f3c4b60017317473eca05c6f77584434cbc | 2,464 | py | Python | QUICK_START/NODE_SQUEEZESEG_CLUSTER/src/script/squeezeseg/utils/clock.py | Hqss/DINK | 5fecaa65e2f9da48eb8ac38ef709aa555fca8766 | [
"BSD-3-Clause"
] | 189 | 2019-01-16T03:05:23.000Z | 2020-09-14T14:54:16.000Z | QUICK_START/NODE_SQUEEZESEG_CLUSTER/src/script/squeezeseg/utils/clock.py | jtpils/DINK | 5f6b3eaba279126f79ae6607f965311002d7451c | [
"BSD-3-Clause"
] | 3 | 2019-02-11T06:20:15.000Z | 2020-04-05T07:03:53.000Z | QUICK_START/NODE_SQUEEZESEG_CLUSTER/src/script/squeezeseg/utils/clock.py | jtpils/DINK | 5f6b3eaba279126f79ae6607f965311002d7451c | [
"BSD-3-Clause"
] | 25 | 2019-01-16T03:05:24.000Z | 2020-04-04T21:07:53.000Z | #! /usr/bin/python2
# -*- coding: utf-8 -*-
"""
Clock function to take running time following Segmatch.
"""
# BSD 3-Clause License
#
# Copyright (c) 2019, FPAI
# Copyright (c) 2019, SeriouslyHAO
# Copyright (c) 2019, xcj2019
# Copyright (c) 2019, Leonfirst
#
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
#
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# * Neither the name of the copyright holder nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
import datetime
class Clock(object):
def __init__(self):
self.kSecondsToMiliseconds = 1000.0
self.kMicrosecondsToMiliseconds = 0.001
self.start()
def start(self):
self.real_time_start_ = datetime.datetime.now()
def takeTime(self):
seconds = (datetime.datetime.now() - self.real_time_start_).seconds
useconds = (datetime.datetime.now() - self.real_time_start_).microseconds
self.real_time_ms_ = (seconds*self.kSecondsToMiliseconds + useconds*self.kMicrosecondsToMiliseconds) + 0.5
def getRealTime(self):
return self.real_time_ms_
def takeRealTime(self):
self.takeTime()
return self.getRealTime()
| 39.741935 | 114 | 0.745536 | 328 | 2,464 | 5.542683 | 0.503049 | 0.022002 | 0.033003 | 0.028053 | 0.140814 | 0.114411 | 0.114411 | 0.074807 | 0.074807 | 0.074807 | 0 | 0.016882 | 0.18263 | 2,464 | 61 | 115 | 40.393443 | 0.885799 | 0.683847 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.058824 | 0.058824 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
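A usage sketch for the `Clock` class above, which reports elapsed wall-clock time in milliseconds (the `+ 0.5` acts as a rounding offset, so the result is always at least 0.5 ms). The body is equivalent to the original, with the delta computed once for clarity.

```python
import datetime


class Clock(object):
    # Measures wall-clock time in milliseconds (reproduced from the snippet above).
    def __init__(self):
        self.kSecondsToMiliseconds = 1000.0
        self.kMicrosecondsToMiliseconds = 0.001
        self.start()

    def start(self):
        self.real_time_start_ = datetime.datetime.now()

    def takeTime(self):
        delta = datetime.datetime.now() - self.real_time_start_
        self.real_time_ms_ = (delta.seconds * self.kSecondsToMiliseconds
                              + delta.microseconds * self.kMicrosecondsToMiliseconds) + 0.5

    def getRealTime(self):
        return self.real_time_ms_

    def takeRealTime(self):
        self.takeTime()
        return self.getRealTime()


clock = Clock()
elapsed_ms = clock.takeRealTime()
print(elapsed_ms >= 0.5)  # True
```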