hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ca0012763ebde5a8a825cd260de95730a4c414f2 | 926 | py | Python | Stack/stackusinglinkedlist.py | shaurtoonetwork/Data-Structures-Implemented-In-Python | 31f770165d535547e6ab3973ca92944cd4d93e11 | [
"Unlicense"
] | null | null | null | Stack/stackusinglinkedlist.py | shaurtoonetwork/Data-Structures-Implemented-In-Python | 31f770165d535547e6ab3973ca92944cd4d93e11 | [
"Unlicense"
] | null | null | null | Stack/stackusinglinkedlist.py | shaurtoonetwork/Data-Structures-Implemented-In-Python | 31f770165d535547e6ab3973ca92944cd4d93e11 | [
"Unlicense"
] | null | null | null | class Node:
    def __init__(self, data):
        self.data = data
        self.next = None


class Stack():
    def __init__(self):
        self.head = None

    def push(self, data):
        new_node = Node(data)
        new_node.next = self.head
        self.head = new_node

    def pop(self):
        cur_node = self.head
        if cur_node is None:
            # popping an empty stack would otherwise raise AttributeError
            return
        self.head = cur_node.next
        cur_node = None

    def get_stack(self):
        cur_node = self.head
        while cur_node:
            print(cur_node.data)
            cur_node = cur_node.next

    def peek(self):
        top_node = self.head
        if top_node is None:
            # peeking an empty stack would otherwise raise AttributeError
            return
        print(top_node.data)

    def size(self, node):
        if node is None:
            return 0
        return 1 + self.size(node.next)

    def is_empty(self):
        if self.head is None:
            print("Yes")
        else:
            print("No")


s = Stack()
s.push("A")
s.push("B")
s.push("C")
s.push("D")
s.push("E")
s.pop()
s.pop()
s.get_stack()
| 15.694915 | 37 | 0.541037 | 137 | 926 | 3.481752 | 0.255474 | 0.134172 | 0.075472 | 0.067086 | 0.079665 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003231 | 0.331533 | 926 | 58 | 38 | 15.965517 | 0.767367 | 0 | 0 | 0.097561 | 0 | 0 | 0.010846 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.195122 | false | 0 | 0 | 0 | 0.292683 | 0.097561 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6b8dd05f5f7cbaff9802e64ef9b63bb934698fe | 10,059 | py | Python | emailhandler.py | wcascades/Simple_Slack_Bot | b674909d5e02d8f15918fb4dee9f2cb1bcf1079a | [
"MIT"
] | null | null | null | emailhandler.py | wcascades/Simple_Slack_Bot | b674909d5e02d8f15918fb4dee9f2cb1bcf1079a | [
"MIT"
] | 3 | 2021-08-05T14:56:04.000Z | 2022-03-06T18:43:58.000Z | emailhandler.py | wcascades/Simple_Slack_Bot | b674909d5e02d8f15918fb4dee9f2cb1bcf1079a | [
"MIT"
] | null | null | null | # don't use 3.9, currently using 3.7
import mysecrets
import re
import email
import logging
import slackhandler
import my_parser
from exchangelib import Credentials, Account, DELEGATE, Configuration, FaultTolerance, Message
from imapclient import IMAPClient
from collections import deque


class emailhandler:
    def __init__(self, count=10, protocol='EWS'):
        self.last_ten_tickets = deque([], maxlen=count)
        self.protocol = protocol
        self.credentials = Credentials(mysecrets.username, mysecrets.password)
        self.on_call = _get_on_call_number_from_file(mysecrets.oncalltxt_location)
        self.permanent_numbers = mysecrets.permanent_numbers
        if not self.on_call:
            self.on_call = mysecrets.on_call
        self.config = Configuration(server=mysecrets.host,
                                    credentials=self.credentials,
                                    retry_policy=FaultTolerance())
        self.account = Account(primary_smtp_address=mysecrets.email_address,
                               config=self.config,
                               credentials=self.credentials,
                               autodiscover=False,
                               access_type=DELEGATE)
        self.phonebook = {}
        # with open('phonebook.csv') as file:
        #     csv_file = csv.DictReader(file)
        #     for row in csv_file:
        #         self.phonebook[row['Username']] = row['Phone_Number']

    # doesn't add ticket if it is already in the deque
    def add_ticket_num(self, ticket):
        if ticket not in self.last_ten_tickets:
            self.last_ten_tickets.append(ticket)
            return ticket

    # Needs to be of email type and not exchangelib message
    def process_emails(self, emails):
        if isinstance(emails, list):
            for mail in emails:
                # Update on call logic
                on_call_phone_num = _on_call_update_email(mail)
                if on_call_phone_num:
                    self.on_call = on_call_phone_num
                    _update_on_call_file(on_call_phone_num)
                    logging.debug("emailhandler.py :: On call number has been updated to " + on_call_phone_num)
                    slackhandler.notifyOnCallUpdate(on_call_phone_num)
                # Who is on call request
                elif _on_call_request_email(mail):
                    logging.debug("emailhandler.py :: on call request notification" +
                                  " being sent to slackhandler")
                    slackhandler.notify_inform_who_is_on_call(self.on_call)
                # Priority 1 or 2 logic
                else:
                    num_pri_tuple = _get_ticket_num(str(mail['Subject']))
                    if num_pri_tuple:
                        ticket_num = self.add_ticket_num(num_pri_tuple[0])
                        if ticket_num:
                            # This block is reached if it's a new ticket to the bot
                            if num_pri_tuple[1] == 1:
                                logging.debug("emailhandler.py :: sending message to slackhandler.notify priority 1")
                                slackhandler.notifyP1(mail)
                                self.notify_on_call(mail, self.on_call)
                                for num in self.permanent_numbers:
                                    if num != self.on_call:
                                        self.notify_on_call(mail, num)
                            elif num_pri_tuple[1] == 2:
                                logging.debug("emailhandler.py :: sending message to slackhandler.notify priority 2")
                                slackhandler.notifyP2(mail)
                            else:
                                logging.error("Invalid block reached in process_emails")

    # returns array of new emails
    def get_emails(self):
        if self.protocol == 'IMAP':
            with IMAPClient(host=mysecrets.host) as client:
                # init IMAP connection
                client.login(mysecrets.username, mysecrets.password)
                client.select_folder('Inbox')
                # returns uids of emails
                messages = client.search(['UNSEEN'])
                # returns emails in a dictionary format
                email_dict = client.fetch(messages, ['RFC822'])
                client.add_flags(messages, '\\SEEN')
                # close out imap connection
                client.shutdown()
            emails = []
            # convert emails from dict format to email format
            for mail in email_dict.values():
                emails.append(email.message_from_string(mail[b'RFC822'].decode("UTF-8")))
            return emails
        if self.protocol == 'EWS':
            # get unread emails
            unread = self.account.inbox.filter(is_read=False)
            logging.debug("emailhandler.py get_emails()::" + str(unread.count()))
            emails = []
            # convert from exchangelib.items.message.Message object to email object
            for mail in unread:
                try:
                    emails.append(_convert_from_exchange_email(mail))
                    logging.debug("emailhandler.py get_emails unread email found :: " + str(mail.subject))
                    # mark as read
                    mail.is_read = True
                    mail.save(update_fields=['is_read'])
                except Exception:
                    logging.error("emailhandler.py:: ERROR in reading email. Not email?")
            return emails

    # Sets the flag on all email to seen
    def read_all_emails(self):
        if self.protocol == 'IMAP':
            with IMAPClient(host=mysecrets.host) as client:
                client.login(mysecrets.username, mysecrets.password)
                client.select_folder('Inbox')
                messages = client.search(['UNSEEN'])
                client.add_flags(messages, '\\SEEN')
                client.shutdown()
        if self.protocol == 'EWS':
            # get unread emails
            unread = self.account.inbox.filter(is_read=False)
            for mail in unread:
                logging.debug('emailhandler.py:: Unread email found in read_all_emails: ' + str(mail.subject))
                mail.is_read = True
                # todo: save is returning a massive string - check documentation
                mail.save(update_fields=['is_read'])

    # mail needs to be of email type and not exchangelib message
    def notify_on_call(self, mail, phone_number):
        on_call_email_to_sms = phone_number + "@vtext.com"
        logging.debug("emailhandler.py :: Entering Notify_on_Call" +
                      "\n - Subject = " +
                      str(mail["Subject"]) +
                      "\n to_recipients = " +
                      on_call_email_to_sms)
        if phone_number:
            body_string = (mail["Subject"] +
                           "\n" +
                           "Center ID: " +
                           my_parser.get_cid(mail) +
                           "\n"
                           "Summary: " +
                           my_parser.get_summary(mail))
            message_to_send = Message(
                account=self.account,
                subject='',
                body=body_string,
                to_recipients=[on_call_email_to_sms]
            )
            try:
                message_to_send.send()
                logging.debug("emailhandler.py :: email sent to " + str(on_call_email_to_sms))
            except Exception:
                logging.error("emailhandler.py :: FAILED TO SEND ON CALL TEXT")
        else:
            logging.debug("emailhandler.py :: Unable to send on call text, on_call is empty")


# get ticket number from a valid high priority subject line
# If str is not in proper format, nothing is returned
# Return tuple of (ticket_number, priority)
def _get_ticket_num(subj):
    # emails follow the following format
    # Incident# 12345 is a Priority 1 ticket and has been assigned to your team
    pattern = re.compile(mysecrets.ticket_regex_string)
    if re.search(pattern, subj):
        nums = re.findall(r"\d+", subj)
        return int(nums[0]), int(nums[1])


def _convert_from_exchange_email(mail):
    return email.message_from_string(mail.mime_content.decode("UTF-8"))


def _on_call_update_email(mail):
    if isinstance(mail, email.message.Message):
        if str(mail['Subject']).upper() == "UPDATE ON-CALL":
            # first payload seems to be body - This could change depending where and how it's sent
            # Should be consistent throughout enterprise
            logging.debug("emailhandler.py :: On-Call Update Found")
            phone_num_groups = ''
            for p in mail.get_payload():
                phone_num_groups = re.match(r"^\d{10}", p.get_payload())
                if phone_num_groups:
                    return phone_num_groups.group(0)


def _update_on_call_file(phone_number):
    try:
        with open(mysecrets.oncalltxt_location, 'w') as file_obj:
            file_obj.write(phone_number)
    except IOError:
        logging.error("emailhandler.py :: IO error received while trying to update oncall.txt")
    except Exception:
        logging.error("emailhandler.py :: Unexpected error occurred trying to update oncall.txt")


def _get_on_call_number_from_file(oncall_file):
    phone_number = ''
    try:
        with open(oncall_file, 'r') as file_obj:
            phone_number = file_obj.readline(10)
    except IOError:
        logging.error("emailhandler.py :: IO error received while trying to read oncall.txt")
    except Exception:
        logging.error("emailhandler.py :: Unexpected error occurred trying to read oncall.txt")
    return phone_number


def _on_call_request_email(mail):
    if isinstance(mail, email.message.Message):
        if str(mail['Subject']).upper() == "SLACKBOT WHO IS ON CALL":
            logging.debug("emailhandler.py :: _on_call_request_email found!")
            return True
    return False
| 42.987179 | 117 | 0.571429 | 1,136 | 10,059 | 4.865317 | 0.227993 | 0.045594 | 0.052108 | 0.05645 | 0.349195 | 0.274109 | 0.213136 | 0.188891 | 0.188891 | 0.188891 | 0 | 0.005772 | 0.345561 | 10,059 | 233 | 118 | 43.171674 | 0.833814 | 0.125659 | 0 | 0.266272 | 0 | 0 | 0.14483 | 0.002511 | 0 | 0 | 0 | 0.004292 | 0 | 1 | 0.071006 | false | 0.017751 | 0.053254 | 0.005917 | 0.183432 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6b8e6b112b5f5a0738910a049da1d3ae0235a6c | 1,698 | py | Python | notebooks/jetlatcalcs.py | afahadabdullah/cmip6hack-emergentconstraints | 287044e7bfb76bc318856d9b25b9e5e2b52ed477 | [
"MIT"
] | 1 | 2019-11-21T02:36:59.000Z | 2019-11-21T02:36:59.000Z | notebooks/jetlatcalcs.py | afahadabdullah/cmip6hack-emergentconstraints | 287044e7bfb76bc318856d9b25b9e5e2b52ed477 | [
"MIT"
] | null | null | null | notebooks/jetlatcalcs.py | afahadabdullah/cmip6hack-emergentconstraints | 287044e7bfb76bc318856d9b25b9e5e2b52ed477 | [
"MIT"
] | 6 | 2019-10-15T20:17:21.000Z | 2019-10-18T22:20:12.000Z | print( "you have successfully imported the jet latitude calculation subroutines" )
import numpy as np


def getzonalmeanonplev(xdatin, plev):
    # -----calculates zonal mean and picks out a pressure level-----
    # input : xdatin = xarray data array (plev,nlat,nlon)
    #       : plev = desired pressure level (hPa)
    # output: xdatzmonp = the zonal mean of xdatin on plev
    # --------------------------------------------------------------
    plevpa = plev * 100.  # convert to Pa
    xdatzm = xdatin.mean(dim='lon')
    xdatzmonp = xdatzm.sel(plev=plevpa, method='nearest')
    return xdatzmonp


def calcjetlat(uzm, minlat, maxlat):
    # -----calculate the latitude of a maximum between latitude bounds
    # input : uzm = data array(nlat)
    #         minlat = minimum latitude over which to search for the max
    #         maxlat = maximum latitude over which to search for the max
    # output: jlatv = the jet latitude
    #         jspeedv = the jet maximum speed
    # the jet is the maximum of the quadratic fit to the grid point maximum
    # and the two adjacent grid points (as Kidston and Gerber 2010)
    # NaN's are skipped
    lats = uzm.sel(lat=slice(minlat, maxlat)).coords['lat']
    imax = uzm.sel(lat=slice(minlat, maxlat)).argmax(dim='lat', skipna=True, keep_attrs=True)
    if (imax == 0) or (imax == len(lats) - 1):
        jlatv = np.nan
        jspeedv = np.nan
        print("!!!! no local maximum found in calcjetlat")
    else:
        lat4fit = lats.isel(lat=slice(int(imax) - 1, int(imax) + 2))
        u4fit = uzm.sel(lat=slice(minlat, maxlat)).isel(lat=slice(int(imax) - 1, int(imax) + 2))
        coefs = np.polyfit(lat4fit, u4fit, 2)
        jlatv = -1. * coefs[1] / (2. * coefs[0])
        jspeedv = coefs[2] + coefs[1] * jlatv + coefs[0] * jlatv ** 2
    return jlatv, jspeedv
| 38.590909 | 92 | 0.65371 | 246 | 1,698 | 4.50813 | 0.45122 | 0.036069 | 0.024346 | 0.037872 | 0.182146 | 0.182146 | 0.111812 | 0.111812 | 0.050496 | 0 | 0 | 0.018813 | 0.186101 | 1,698 | 43 | 93 | 39.488372 | 0.783647 | 0.430506 | 0 | 0 | 0 | 0 | 0.134595 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.095238 | 0 | 0.285714 | 0.095238 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6ba96554a84cae2263637750307f51ca135bde3 | 1,488 | py | Python | TournamentReminder.py | 0ffkilter/StunfiskBot | 5ec1fd27b875ede1845747f22bf5b1fafe81caff | [
"MIT"
] | 1 | 2019-05-10T00:29:15.000Z | 2019-05-10T00:29:15.000Z | TournamentReminder.py | 0ffkilter/StunfiskBot | 5ec1fd27b875ede1845747f22bf5b1fafe81caff | [
"MIT"
] | null | null | null | TournamentReminder.py | 0ffkilter/StunfiskBot | 5ec1fd27b875ede1845747f22bf5b1fafe81caff | [
"MIT"
] | null | null | null | import praw, sys, os
user_agent = "Tournament Reminder by /u/0ffkilter"
config_file = open('%s%s' %(os.getcwd(), '/Config.txt'), 'r')
config = config_file.read().split('\n')
config_file.close()
reddit = praw.Reddit(user_agent = user_agent)
reddit.login('stunfiskhelperbot', config[1])
part_file = open('%s%s' %(os.getcwd(), '/Participants.txt'), 'r')
parts = part_file.read().split('\n')
part_file.close()
message = ("Hello! You are receiving this message because you are "
"signed up for /r/stunfisk's Bucket O' Mons Tournament! If you "
"did not sign up for the tournament, that means that /u/0ffkilter "
"typed in someone's name wrong. You should probably let him know. \n\n"
"In Any case, this is a reminder that Round 1 of the Tournament is out! \n\n"
"The Theme is 'power pokes', and the post can be found "
"[here](http://www.reddit.com/r/stunfisk/comments/2cejgl/tournament_bucket_o_mons_round_1_announcement/)\n\n "
"You have until Tuesday, August 5th 12:00 PST to complete your match! \n\n"
"Additional rules and regulations can be found on the aforementioned post. \n\n"
"Send Questions or comments to /u/0ffkilter!")
subject = "Reminder for Bucket O' Mons Tournament!"
parts = ['bigyeIIowtaxi']
for participant in parts:
    try:
        reddit.send_message(participant, subject, message)
        print('Sent -> %s' % participant)
    except Exception:
        print('Failed -> %s' % participant)
| 33.818182 | 118 | 0.670699 | 219 | 1,488 | 4.484018 | 0.502283 | 0.010183 | 0.033605 | 0.020367 | 0.03666 | 0.03666 | 0 | 0 | 0 | 0 | 0 | 0.010067 | 0.198925 | 1,488 | 43 | 119 | 34.604651 | 0.813758 | 0 | 0 | 0 | 0 | 0.071429 | 0.572966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.035714 | 0 | 0.035714 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6baabf4be70eeed58c26930ea96749e16e4205c | 1,699 | py | Python | NGG_NEW.py | Antrix9/NUMBER_GUESSING_GAME | b5fcd5aca17b52dcec1bdbb5f661e20b82cc768b | [
"MIT"
] | null | null | null | NGG_NEW.py | Antrix9/NUMBER_GUESSING_GAME | b5fcd5aca17b52dcec1bdbb5f661e20b82cc768b | [
"MIT"
] | null | null | null | NGG_NEW.py | Antrix9/NUMBER_GUESSING_GAME | b5fcd5aca17b52dcec1bdbb5f661e20b82cc768b | [
"MIT"
] | 1 | 2021-03-27T16:28:33.000Z | 2021-03-27T16:28:33.000Z | import random
while True:
    random_list = random.sample((3, 4, 5), 2)
    # print(random_list)
    random_number = random.randint(1, 1000)
    # print(random_number)
    start_range = random_number - random_list[0]
    end_range = random_number + random_list[1]
    print(f'Range Hint: The Number is between {start_range} & {end_range}')

    # prime hint (1 is not prime)
    prime = random_number > 1
    for x in range(2, random_number):
        if random_number % x == 0:
            prime = False
            break
    if prime:
        print('Prime Number Hint: TRUE')
    else:
        print('Prime Number Hint: FALSE')

    # odd-even hint
    if random_number % 2 == 0:
        print('ODD or EVEN Hint: Even')
    else:
        print('ODD or EVEN Hint: Odd')

    chances = 4
    while chances > 0:
        user_input = int(input('Enter Your Guess:-'))
        if user_input == random_number:
            print(f'You Guessed Right, Number is {random_number}')
            break
        else:
            print('Try Again! Read Hints Carefully\n')
            if user_input > random_number:
                print('The Number is Less than your Input Number\n')
            else:
                print('The Number is More than your Input Number\n')
            chances = chances - 1
            print('')

    # scoring system
    if chances == 0:
        print('You Lose')
    elif chances == 1:
        print('You Tried Hard')
    elif chances >= 2:
        print('Well Done! You Genius Guy')

    ask = input('Want to Play Again: Play(1) or Exit(2)--')
    if ask == '2':
        break
    else:
        print('Starting game Again...... \n')
| 26.138462 | 76 | 0.540906 | 215 | 1,699 | 4.176744 | 0.334884 | 0.13363 | 0.060134 | 0.051225 | 0.207127 | 0.062361 | 0 | 0 | 0 | 0 | 0 | 0.021739 | 0.350206 | 1,699 | 64 | 77 | 26.546875 | 0.791667 | 0.045909 | 0 | 0.155556 | 0 | 0 | 0.291237 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.022222 | 0.022222 | 0 | 0.022222 | 0.311111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6bc65524aa9146a7d999087ba23c1ddab175047 | 284 | py | Python | 2048test.py | zhangzhibo123/2048game | 2bd573fdcff61b47bcdafc6aa498c6cd9e5b55a7 | [
"Apache-2.0"
] | null | null | null | 2048test.py | zhangzhibo123/2048game | 2bd573fdcff61b47bcdafc6aa498c6cd9e5b55a7 | [
"Apache-2.0"
] | null | null | null | 2048test.py | zhangzhibo123/2048game | 2bd573fdcff61b47bcdafc6aa498c6cd9e5b55a7 | [
"Apache-2.0"
] | null | null | null | # map = [[0 for i in range(4)] for j in range(4)]
# print(map)
map = [[2, 0, 0, 0], [0, 0, 0, 4], [8, 0, 0, 0], [4, 0, 0, 2]]
# ma = [[map[c][r] for c in range(4)] for r in reversed(range(4))]
# print(ma)
ma = [[map[c][r] for r in reversed(range(4))] for c in range(4)]
print(ma) | 23.666667 | 66 | 0.517606 | 65 | 284 | 2.261538 | 0.230769 | 0.108844 | 0.102041 | 0.081633 | 0.571429 | 0.272109 | 0 | 0 | 0 | 0 | 0 | 0.103604 | 0.21831 | 284 | 12 | 67 | 23.666667 | 0.558559 | 0.46831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6c2400ffbeca603e01ee66e01e65345a49b8317 | 3,110 | py | Python | scripts/generate_result.py | XinYao1994/HOPE | 99b41b457b67d3e5d6dd182f8aa2ce4ea66e4a68 | [
"Apache-2.0"
] | 108 | 2020-04-23T19:06:51.000Z | 2022-02-23T20:05:09.000Z | scripts/generate_result.py | XinYao1994/HOPE | 99b41b457b67d3e5d6dd182f8aa2ce4ea66e4a68 | [
"Apache-2.0"
] | 1 | 2021-07-07T05:58:57.000Z | 2021-07-07T05:58:57.000Z | scripts/generate_result.py | XinYao1994/HOPE | 99b41b457b67d3e5d6dd182f8aa2ce4ea66e4a68 | [
"Apache-2.0"
] | 11 | 2020-04-24T01:53:50.000Z | 2022-01-21T07:36:14.000Z | import sys
import os
import numpy as np

RESULT_DIR = 'results/'
PREFIX = ['ART', 'btree', 'prefixbtree', 'hot', 'microbench/cpr_latency', 'SuRF']
TYPE = ['point', 'range']
DATASETS = ['email', 'url', 'wiki']
VAR = ['cpr', 'x', 'height', 'fpr', 'lat', 'insertlat', 'lookuplat', 'mem', 'stats']
BT_OUTFILE = "../results/microbench/build_time_breakdown/bt_breakdown.csv"


def generate_result_single(dirpath, filename):
    full_path = dirpath + filename
    output_path = dirpath + 'final_' + filename
    results = []
    with open(full_path, 'r') as f:
        print("Generate result for " + full_path)
        lines = f.readlines()
        cnt = 0
        for line in lines:
            line = line.strip('\n')
            if line == '-':
                break
            cnt += 1
        results = [[] for i in range(cnt)]
        idx = 0
        for i, line in enumerate(lines):
            line = line.strip(',\n')
            if line == '-':
                idx = 0
                continue
            results[idx].append(np.array([float(x) for x in line.split(',')]))
            idx += 1
        results = (np.mean(np.asarray(results), axis=1))
    # Output results to file
    with open(output_path, 'w') as of:
        for row in results:
            line_result = ''
            for col in row:
                line_result += str(col) + ','
            line_result = line_result.strip(',')
            line_result += '\n'
            of.write(line_result)


def microtree():
    for pre in PREFIX:
        for t in TYPE:
            if pre == 'microbench/cpr_latency':
                cur_dir = RESULT_DIR + pre + '/'
            else:
                cur_dir = RESULT_DIR + pre + '/' + t + '/'
            for v in VAR:
                for d in DATASETS:
                    file_prefix = v + '_' + d
                    files = os.listdir(cur_dir)
                    for f in files:
                        if f.startswith(file_prefix):
                            generate_result_single(cur_dir, f)


def buildtime():
    ss_time = []
    encode_time = []
    build_dict_time = []
    with open("bt") as f:
        lines = f.readlines()
        for line in lines:
            wl = line.split("=")
            key = wl[0].strip()
            if (key == "Symbol Select time"):
                ss_time.append(wl[1].strip())
            if (key == "Code Assign(Hu-Tucker) time"):
                encode_time.append(wl[1].strip())
            if (key == "Build Dictionary time"):
                build_dict_time.append(wl[1].strip())
    with open(BT_OUTFILE, 'w') as f:
        for i in range(0, len(ss_time)):
            f.write(ss_time[i] + "," + encode_time[i] + "," + build_dict_time[i] + "\n")


def triearray():
    pass


if __name__ == "__main__":
    if len(sys.argv) < 2:
        print(sys.argv)
        print("Did not generate any results, return")
        exit(0)
    exp_type = sys.argv[1]
    if exp_type == "micro-tree":
        microtree()
    elif exp_type == "bt":
        buildtime()
    elif exp_type == "ta":
        triearray()
    else:
        print("Unknown experiment type")
| 30.194175 | 86 | 0.50418 | 376 | 3,110 | 4.015957 | 0.337766 | 0.039735 | 0.025828 | 0.025828 | 0.099338 | 0.063576 | 0.063576 | 0 | 0 | 0 | 0 | 0.006965 | 0.353698 | 3,110 | 102 | 87 | 30.490196 | 0.744279 | 0.007074 | 0 | 0.113636 | 0 | 0 | 0.129294 | 0.033377 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0.011364 | 0.034091 | 0 | 0.079545 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6c2e06c318572e03e24661769d9fadacecffe3d | 1,647 | py | Python | raspberryPi/control.py | djvolz/CarlLights | 7e0f9d16f2a93c9fcdbd7c4caa3438d39685c498 | [
"Beerware"
] | null | null | null | raspberryPi/control.py | djvolz/CarlLights | 7e0f9d16f2a93c9fcdbd7c4caa3438d39685c498 | [
"Beerware"
] | null | null | null | raspberryPi/control.py | djvolz/CarlLights | 7e0f9d16f2a93c9fcdbd7c4caa3438d39685c498 | [
"Beerware"
] | null | null | null | # -*- coding: utf-8 -*-
# @Author: djvolz
# @Date:   2016-11-14 17:03:11
# @Last Modified by:   djvolz
# @Last Modified time: 2016-11-15 01:13:28

import time
import json   # import json library to parse messages
import boto3  # import boto library that handles queuing functions

import lightSequence as ls


class Controller:
    def __init__(self):
        # Set the default action
        self._action = 'undefined'
        # Get the service resource
        sqs = boto3.resource('sqs')
        # Get the queue
        self._queue = sqs.get_queue_by_name(QueueName='talkWithCarlLights')

    def lightsFactory(self, type):
        program = ls.LightSequence.factory(type)
        if (program):
            print("TURNING LIGHTS ON LIKE WE AIN'T GOT NO POWER BILL")
            program.run()

    def processMessages(self):
        # Process messages by printing out body and optional author name
        for message in \
                self._queue.receive_messages(
                    MessageAttributeNames=['Author']):
            # Parse the message request for the action.
            request = json.loads(message.body)
            action = request['request']['action']
            if (action):
                self._action = action
            # Let the queue know that the message is processed
            message.delete()

    def run(self):
        # Jump straight in
        while True:
            # Process messages from Alexa
            self.processMessages()
            # Generate the light sequence based on the specified action
            self.lightsFactory(self._action)
            # Sleepy time
            time.sleep(1)
| 28.894737 | 75 | 0.604129 | 191 | 1,647 | 5.141361 | 0.539267 | 0.03055 | 0.032587 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028344 | 0.314511 | 1,647 | 56 | 76 | 29.410714 | 0.841453 | 0.336976 | 0 | 0 | 0 | 0 | 0.091248 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.321429 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6c7c7680b4ba76a99f84c48fad447e0e2a46a23 | 2,350 | py | Python | measurements/tasks/cleanup.py | nat64check/zaphod_backend | b92191950825e1a4fd8c34207c8491d587cfb61f | [
"BSD-3-Clause"
] | 1 | 2017-11-14T16:22:38.000Z | 2017-11-14T16:22:38.000Z | measurements/tasks/cleanup.py | sjm-steffann/nat64check_zaphod_backend | b92191950825e1a4fd8c34207c8491d587cfb61f | [
"BSD-3-Clause"
] | 5 | 2019-12-03T05:36:07.000Z | 2021-06-25T15:20:04.000Z | measurements/tasks/cleanup.py | sjm-steffann/nat64check_zaphod_backend | b92191950825e1a4fd8c34207c8491d587cfb61f | [
"BSD-3-Clause"
] | null | null | null | # ••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••
# Copyright (c) 2018, S.J.M. Steffann. This software is licensed under the BSD
# 3-Clause License. Please see the LICENSE file in the project root directory.
# ••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••
import sys

import requests
from django.utils.translation import gettext_lazy as _
from uwsgi_tasks import RetryTaskException, task

from generic.utils import TokenAuth, print_error, print_message, print_warning, retry_get


@task(retry_count=5, retry_timeout=300)
def remove_from_trillian(pk):
    from measurements.models import InstanceRun

    try:
        # Try to find the InstanceRun multiple times, in case of a race condition
        run = retry_get(InstanceRun.objects.exclude(analysed=None), pk=pk)
        if not run.analysed:
            print_warning(_("InstanceRun {pk} has not yet been analysed").format(pk=pk))
            return

        if not run.trillian_url:
            # Already cleaned up
            return

        print_message(_("Deleting InstanceRun {run.pk} ({run.url}) from {run.trillian.name}").format(run=run))

        response = requests.request(
            method='DELETE',
            url=run.trillian_url,
            auth=TokenAuth(run.trillian.token),
            timeout=(5, 15),
        )
        print(response)

        if response.status_code not in [204, 404]:
            # 204 = deleted, 404 = doesn't exist anymore
            print_error(
                _("{run.trillian.name} didn't accept our request ({response.status_code}), retrying later").format(
                    run=run,
                    response=response
                )
            )
            raise RetryTaskException

        run.trillian_url = ''
        run.save()

        print_message(_("Trillian {run.trillian.name} deleted completed InstanceRun {run.pk}").format(run=run))

    except RetryTaskException:
        raise

    except InstanceRun.DoesNotExist:
        print_warning(_("InstanceRun {pk} does not exist anymore").format(pk=pk))
        return

    except Exception as ex:
        print_error(_('{name} on line {line}: {msg}').format(
            name=type(ex).__name__,
            line=sys.exc_info()[-1].tb_lineno,
            msg=ex
        ))
        raise RetryTaskException

# filter_dict/core.py (repo: daddinuz/filter_dict, license: MIT)
import typing
import chainable_iterator
K = typing.TypeVar('K')
V = typing.TypeVar('V')
P = typing.List[K]
Predicate = typing.Callable[[P, K, V], bool]
_Node = typing.NamedTuple('_Node', (('path', P), ('data', typing.Mapping)))
def match_all(_kp: P, _k: K, _v: V) -> bool:
return True
def _decompose_dict(predicate: Predicate, the_dict: typing.Mapping[K, V]) \
-> typing.Generator[typing.Tuple[P, V], None, None]:
root = chainable_iterator.ChainableIterator((_Node([], the_dict),))
for node in root:
path = node.path
for k, v in node.data.items():
path.append(k)
if isinstance(v, dict):
root.chain((_Node(path, v),))
if predicate(path, k, v):
yield path, v
path = path[:-1]
def _recompose_dict(items: typing.Iterator[typing.Tuple[P, V]]) -> typing.Mapping[K, V]:
the_dict = {}
for path, value in items:
base, last = the_dict, len(path) - 1
for i, key in enumerate(path):
base = base.setdefault(key, {} if i < last else value)
return the_dict
def filter_dict(predicate: Predicate, the_dict: typing.Mapping[K, V]) -> typing.Mapping[K, V]:
return _recompose_dict(_decompose_dict(predicate, the_dict))
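The path-to-nested-dict rebuild in `_recompose_dict` can be exercised on its own. This standalone copy of its logic (renamed `recompose` to avoid the `chainable_iterator` dependency) shows how `(path, value)` pairs become a nested mapping:

```python
def recompose(items):
    # Rebuild a nested dict from (path, value) pairs, as _recompose_dict does:
    # walk/create intermediate dicts for every key except the last, which
    # receives the value itself.
    the_dict = {}
    for path, value in items:
        base, last = the_dict, len(path) - 1
        for i, key in enumerate(path):
            base = base.setdefault(key, {} if i < last else value)
    return the_dict

result = recompose([(["a", "b"], 1), (["a", "c"], 2), (["d"], 3)])
print(result)  # {'a': {'b': 1, 'c': 2}, 'd': 3}
```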

# src/marlb.py (repo: bennett-nguyen/marlb, license: MIT)
import argparse
import os
import re
from utils import interpret
parser = argparse.ArgumentParser()
parser.add_argument("-m", "--mode", choices=["interpret", "debug"], required=True, help="Mode to interpret your file")
parser.add_argument("-f", "--file", required=True, help="File to interpret, pass in either file name (in the current working directory) or a file path")
args = parser.parse_args()
def processor(file_name):
if not file_name:
return
try:
if not file_name.endswith(".marble"):
print(f"error: invalid file extension '{file_name[file_name.index('.'):]}'\nvalid extension: '*.marble'")
return
with open(f"./{file_name}", "r", encoding="utf8") as f:
raw_code = f.readlines()
pre_processed_code = []
for line_number, content in enumerate(raw_code, start=1):
content = re.sub("[^qweruiopasdjk!:<>\+-]", "", re.match("^[^\"]*", content).group())
if len(content) == 1 and "\n" in content or not content:
continue
pre_processed_code.append((line_number, content))
interpret(pre_processed_code, args.mode)
except FileNotFoundError:
print(f"error: file '{file_name}' not found in {os.getcwd()}")
return
except ValueError:
print(f"error: '{file_name}' is not recognized as a file")
return
except KeyboardInterrupt:
print("error: proccess has been interrupted")
return
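The preprocessing step inside `processor` first drops everything from the first double quote, then strips every character outside the marble alphabet. The same two regexes on a made-up source line (the sample input is an assumption, not from the repository):

```python
import re

# A made-up marble source line: code, then a quoted string the matcher drops.
line = 'qwer xyz! "a string literal"'
code_part = re.match(r'^[^"]*', line).group()                 # keep text before the first quote
stripped = re.sub(r"[^qweruiopasdjk!:<>\+-]", "", code_part)  # keep only the marble alphabet
print(stripped)  # qwer!
```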
def path_parser():
path = args.file.split("\\") if "\\" in args.file else args.file.split("/")
if not path[-1]:
path.pop()
file_name = path[-1]
    invalid_characters = ['"', '~', '!', '@', '#', '$', '%', '^', '&', '*', '(', ')', ',', '+', '{', '}', '\\', '"', '|', '<', '>', '?', '`', '=', '[', ']', ';', "'", '\\', '/']
for entry in path:
if any(char in entry for char in invalid_characters):
print(f"error: invalid syntax '{args.file}'")
return
if len(path) != 1:
for entry in path[:-1]:
try:
os.chdir(entry)
except FileNotFoundError:
print(f"error: directory '{entry}' doesn't exist in {os.getcwd()}")
return
except OSError:
print(f"error: invalid syntax '{args.file}'")
return
return file_name
if __name__ == "__main__":
processor(path_parser())

# almatsurat_data_compiler.py (repo: IMuslim-org/Al-Matsurat-Internalisation, license: BSD-2-Clause)
import csv
import json
import os
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("--dzikir_type", dest="dzikir_type", type=str, help="Please Add Al Matsurat Type")
args = parser.parse_args()
dzikir_type = args.dzikir_type
if dzikir_type is None:
dzikir_type = "sugro"
__dir__ = os.path.dirname(__file__)
arabic_file_kubro = open(f"{__dir__}/data/kubo.json")
arabic_file_sugro = open(f"{__dir__}/data/sugro.json")
latin_sugro = open(f"{__dir__}/data/latin_al-matsurat_sugro.csv", mode="r")
if dzikir_type == "sugro":
dzikirData = json.loads(arabic_file_sugro.read())
else:
dzikirData = json.loads(arabic_file_kubro.read())
with latin_sugro as csv_file:
csv_reader = csv.DictReader(csv_file)
# print(csv_reader)
currentId = 0
contentsLength = 0
latins = []
for row in csv_reader:
if contentsLength ==0:
contentsLength = len(dzikirData[currentId]['contents'])
latins.append(row["Latin Text"])
if len(latins) == contentsLength:
dzikirData[currentId]['latins'] = latins
latins = []
contentsLength = 0
currentId += 1
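The loop above pairs CSV rows with dzikir entries by accumulating latin lines until the current entry's `contents` length is reached, then attaching them as `latins`. The same logic on a tiny made-up dataset:

```python
# Hypothetical miniature of the data: two dzikir entries and three latin rows.
dzikir_data = [{"contents": ["a", "b"]}, {"contents": ["c"]}]
rows = ["latin 1", "latin 2", "latin 3"]

current_id, latins, contents_length = 0, [], 0
for text in rows:
    if contents_length == 0:
        contents_length = len(dzikir_data[current_id]["contents"])
    latins.append(text)
    if len(latins) == contents_length:
        # This entry has received all of its latin lines; move to the next.
        dzikir_data[current_id]["latins"] = latins
        latins, contents_length = [], 0
        current_id += 1

print(dzikir_data[0]["latins"])  # ['latin 1', 'latin 2']
```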
if dzikir_type == "sugro":
open(f"{__dir__}/data/sugro.json", "w").write(str(list(dzikirData)).replace('\'', '"'))
# print(csv)

# prometheus_api_client/metric_range_df.py (repo: chauhankaranraj/prometheus-api-client-python, license: MIT)
"""A pandas.DataFrame subclass for Prometheus range vector responses."""
from pandas import DataFrame
from pandas._typing import Axes, Dtype
from typing import Optional, Sequence
class MetricRangeDataFrame(DataFrame):
"""Subclass to format and represent Prometheus query response as pandas.DataFrame.
Assumes response is either a json or sequence of jsons.
This class should be used specifically to instantiate a query response,
where the query response has several timestamp values per series.
That is, a range vector is expected.
If the data is an instant vector, use MetricSnapshotDataFrame instead.
Some argument descriptions in this docstring were copied from pandas.core.frame.DataFrame.
:param data: (list|json) A single metric (json with keys "metric" and "values"/"value")
or list of such metrics received from Prometheus as a response to query
:param index: (pandas.Index|array-like) Index to use for resulting dataframe. Will default to
pandas.RangeIndex if no indexing information part of input data and no index provided.
:param columns: (pandas.Index|array-like) Column labels to use for resulting dataframe. Will
default to list of labels + "timestamp" + "value" if not provided.
:param dtype: (dtype) default None. Data type to force. Only a single dtype is allowed. If None, infer.
:param copy: (bool) default False. Copy data from inputs. Only affects DataFrame / 2d ndarray input.
Example Usage:
.. code-block:: python
prom = PrometheusConnect()
metric_data = prom.get_current_metric_value(metric_name='up', label_config=my_label_config)
metric_df = MetricRangeDataFrame(metric_data)
metric_df.head()
'''
+------------+------------+-----------------+--------------------+-------+
| | __name__ | cluster | label_2 | value |
+-------------------------+-----------------+--------------------+-------+
| timestamp | | | | |
+============+============+=================+====================+=======+
| 1577836800 | __up__ | cluster_id_0 | label_2_value_2 | 0 |
+-------------------------+-----------------+--------------------+-------+
| 1577836801 | __up__ | cluster_id_1 | label_2_value_3 | 1 |
    +-------------------------+-----------------+--------------------+-------+
'''
"""
def __init__(
self,
data=None,
index: Optional[Axes] = None,
columns: Optional[Axes] = None,
dtype: Optional[Dtype] = None,
copy: bool = False,
):
"""Functions as a constructor for MetricRangeDataFrame class."""
if data is not None:
# if just a single json instead of list/set/other sequence of jsons,
# treat as list with single entry
if not isinstance(data, Sequence):
data = [data]
row_data = []
for v in data:
if "value" in v:
raise TypeError(
"data must be a range vector. Expected range vector, got instant vector"
)
for t in v["values"]:
row_data.append({**v["metric"], "timestamp": t[0], "value": t[1]})
# init df normally now
super(MetricRangeDataFrame, self).__init__(
data=row_data, index=index, columns=columns, dtype=dtype, copy=copy
)
self.set_index(["timestamp"], inplace=True)
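The flattening the constructor performs can be sketched without pandas: each series' `values` pairs are expanded into one row per timestamp, merged with the series labels (the sample response below is made up, in the Prometheus range-vector shape):

```python
# A made-up two-sample range vector in the Prometheus response shape.
data = [{"metric": {"__name__": "up", "cluster": "cluster_id_0"},
         "values": [[1577836800, "0"], [1577836801, "1"]]}]

# One dict per (series, timestamp) pair, exactly what feeds the DataFrame.
rows = [{**series["metric"], "timestamp": t[0], "value": t[1]}
        for series in data for t in series["values"]]
print(rows[0])
# {'__name__': 'up', 'cluster': 'cluster_id_0', 'timestamp': 1577836800, 'value': '0'}
```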

# cdApi/discount.py (repo: renqiukai/cd_api, license: MIT)
'''
@description: "Full reduction / gift" (manjiansong) promotion API.
@date: 2020/2/13 4:28:26 PM
@author: Ren Qiukai (任秋锴)
@version: 1.0
'''
from .base import base
from .product import product
from .store import store
class discount(base):
def __init__(self, token):
super().__init__(token)
def list(self,
state=None,
name=None,
productCodes=None,
companyId=None,
storeId=None,
pageNum=1, pageSize=1000):
api_name = "manager/discount/list"
data = {
"pageNum": pageNum,
"pageSize": pageSize,
}
        if state:
            data["state"] = state
        if name:
            data["name"] = name
        if productCodes:
            data["productCodes"] = productCodes
        if companyId:
            data["companyId"] = companyId
        if storeId:
            data["storeId"] = storeId
return self.request(api_name, data)
def create(self, data):
api_name = "manager/discount/add"
response = self.request(api_name, data, method="POST")
return response
def createykj(self, storeCodes, productCodes,
price, beginTime, endTime, name):
        '''Import a "fixed price" (yikoujia) promotion.'''
beginTime = f"{beginTime} 00:00:00"
endTime = f"{endTime} 23:59:59"
        # Main request payload
data = {
"activeGoods": 2,
"beginTime": beginTime,
"endTime": endTime,
"expressFree": 1,
"id": "",
"name": name,
"discountType": 2,
"type": 4,
"unit": 2,
"time": [beginTime, endTime],
"productList": [],
"rules": [],
"storeList": [],
}
        # Add the promotion rules
rule = {
"id": "",
"usePrice": 1,
"price": price,
"discount": None,
"giftId": None,
"giftIds": [],
"giftRule": 1,
"giftLimit": None,
}
data["rules"].append(rule)
        # Add the store list
s = store(self.token)
store_map = {}
for store_data in s.getStoreList():
store_id = store_data.get("store_id")
store_code = store_data.get("store_code")
store_parent_id = store_data.get("store_parent_id")
store_map[store_code] = [store_id, store_parent_id]
for store_code in storeCodes:
store_id, parentId = store_map[store_code]
data["storeList"].append(
{"id": store_id, "type": 0, "parentId": parentId})
        # Add the product list
p = product(self.token)
for code in productCodes:
product_data = p.list(productCode=code)
if not product_data:
                print(f"{code}\tproduct not found.")
return None
product_data = product_data[0]
productId = product_data.get("id")
product_request_payload = {
"productId": productId,
"specType": 1,
"specIds": [],
"price": None,
"beginTime": None,
"endTime": None,
}
data["productList"].append(product_request_payload)
response = self.create(data)
print(productCodes, response)
return response
def createmz(self, storeCodes, productCodes,
rules, beginTime, endTime, name):
        '''Import a tiered percentage-discount (manzhe) promotion.'''
beginTime = f"{beginTime} 00:00:00"
endTime = f"{endTime} 23:59:59"
        # Main request payload
data = {
"activeGoods": 2,
"beginTime": beginTime,
"endTime": endTime,
"expressFree": 1,
"id": "",
"name": name,
"discountType": 2,
"type": 2,
"unit": 2,
"time": [beginTime, endTime],
"productList": [],
"rules": [],
"storeList": [],
}
        # Add the promotion rules
for row in rules:
usePrice, discount_num = row
rule = {
"id": "",
"usePrice": usePrice,
"price": None,
"discount": discount_num,
"giftId": None,
"giftIds": [],
"giftRule": 1,
"giftLimit": None,
}
data["rules"].append(rule)
        # Add the store list
data["storeList"] = storeCodes
        # Add the product list
p = product(self.token)
for code in productCodes:
product_data = p.list(productCode=code)
if not product_data:
                print(f"{code}\tproduct not found.")
return None
product_data = product_data[0]
productId = product_data.get("id")
product_request_payload = {
"productId": productId,
"specType": 1,
"specIds": [],
"price": None,
"beginTime": None,
"endTime": None,
}
data["productList"].append(product_request_payload)
# print(data)
response = self.create(data)
print(productCodes, name, rules, response)
return response
    def create_example(self):
        api_name = "manager/discount/add"
        data = {
            # Store list; type: 0 = store, 1 = branch company
            "storeList": [
                {"id": 3, "type": 1, "parentId": 0},
                {"id": 11, "type": 0, "parentId": 3},
                {"id": 26, "type": 0, "parentId": 3},
                {"id": 29, "type": 0, "parentId": 3},
                {"id": 27002, "type": 0, "parentId": 3}
            ],
            # Promotion scope: 1 = whole store, 2 = specified products
            "activeGoods": 2,
            # Start and end time
            "beginTime": "2020-02-15 00:00:00",
            "endTime": "2020-02-22 23:59:59",
            # Free shipping: 0 = no, 1 = yes
            "expressFree": 1,
            "id": "",
            # Promotion name ("whole order - general discount")
            "name": "整单-一般打折",
            # Product list with spec info
            "productList": [
                {"productId": 15641, "specIds": [], "specType": 1}
            ],
            # Discount type: 1 = order-level discount, 2 = product-level discount
            "discountType": 2,
            # Type: 1 = cash off, 2 = percentage discount, 3 = gift, 4 = fixed price, 5 = Nth-item discount
            "type": 2,
            # Unit: 1 = yuan, 2 = items
            "unit": 2,
            # Discount rules
            "rules": [
                {
                    # Claim limit
                    "giftLimit": "",
                    # Claim rule: 1 = by item count, 2 = by amount (yuan)
                    "giftRule": 1,
                    # Rule id
                    "id": "",
                    # Percentage discount (9 = pay 90% of the price)
                    "discount": 9,
                    "isOpenGift": False,
                    # Gift ids
                    "giftIds": [],
                    # Cash discount
                    "price": "",
                    # Minimum spend threshold
                    "usePrice": "0"
                }
            ],
            "time": ["2020-02-15 00:00:00", "2020-02-22 23:59:59"]
        }
        return self.request(api_name, data, method="POST")
def read(self, _id):
api_name = f"manager/discount/info"
data = {
"discountId": _id,
}
response = self.request(api_name, data)
# print(response)
return response
def update(self, data):
api_name = f"manager/discount/update"
# print(data)
response = self.request(api_name, data, method="POST")
return response
def copy(self, _id, name=None, beginTime=None, endTime=None):
response = self.read(_id)
data = response.get("data")
rules = data.get("rules")
data["id"] = ""
if name:
data["name"] = name
if beginTime:
data["beginTime"] = beginTime
if endTime:
data["endTime"] = endTime
if rules:
for idx, rule in enumerate(data["rules"]):
data["rules"][idx]["id"] = ""
response = self.create(data)
# print(data, response)
return response
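The `copy` flow above reads an existing promotion, blanks the ids so the server creates new records, optionally overrides a few fields, and re-submits. A dependency-free sketch of just the dict manipulation (the `copy_promo` name and sample payload are illustrative, not part of the API):

```python
def copy_promo(promo, name=None):
    data = dict(promo)                 # shallow copy of the payload
    data["id"] = ""                    # blank the top-level id
    # Blank every rule id so the server allocates fresh ones.
    data["rules"] = [{**r, "id": ""} for r in data.get("rules", [])]
    if name:
        data["name"] = name
    return data

src = {"id": "42", "name": "old promo", "rules": [{"id": "7", "discount": 9}]}
out = copy_promo(src, name="new promo")
print(out["id"] == "", out["rules"][0]["id"] == "", out["name"])  # True True new promo
```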
def delete(self, _id):
api_name = f"manager/discount/delete_stop"
data = {
"id": _id,
"operate": 0
}
response = self.request(api_name, data, method="POST")
# print(response)
return response

# blanc_basic_pages/migrations/0002_page_hero_image.py (repo: src-r-r/blanc-basic-pages, license: BSD-3-Clause)
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
import blanc_basic_assets.fields
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('assets', '0001_initial'),
('pages', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='page',
name='hero_image',
field=blanc_basic_assets.fields.AssetForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='assets.Image'),
),
]

# src/app/beer_garden/metrics.py (repo: ExpressHermes/beer-garden, license: MIT)
# -*- coding: utf-8 -*-
""" Metrics Service
The metrics service manages:
* Connectivity to the Prometheus Server
* Creating default summary views in Prometheus
* Publishing `Request` metrics
"""
import datetime
import logging
from http.server import ThreadingHTTPServer
from brewtils.models import Request
from brewtils.stoppable_thread import StoppableThread
from prometheus_client import Gauge, Counter, Summary
from prometheus_client.exposition import MetricsHandler
from prometheus_client.registry import REGISTRY
import beer_garden.db.api as db
class PrometheusServer(StoppableThread):
"""Wraps a ThreadingHTTPServer to serve Prometheus metrics"""
def __init__(self, host, port):
self.logger = logging.getLogger(__name__)
self.display_name = "Prometheus Server"
self._host = host
self._port = port
# Basically prometheus_client.exposition.start_http_server
metrics_handler = MetricsHandler.factory(REGISTRY)
self.httpd = ThreadingHTTPServer((host, port), metrics_handler)
super(PrometheusServer, self).__init__(
logger=self.logger, name="PrometheusServer"
)
def run(self):
self.logger.debug("Initializing metric counts")
initialize_counts()
self.logger.info(f"Starting {self.display_name} on {self._host}:{self._port}")
self.httpd.serve_forever()
self.logger.info(f"{self.display_name} is stopped")
def stop(self):
self.httpd.shutdown()
# Summaries:
plugin_command_latency = Summary(
"bg_plugin_command_latency_seconds",
"Total time taken for a command to complete in seconds.",
["system", "instance_name", "system_version", "command", "status"],
)
# Counters:
completed_request_counter = Counter(
"bg_completed_requests_total",
"Number of completed requests.",
["system", "instance_name", "system_version", "command", "status"],
)
request_counter_total = Counter(
"bg_requests_total",
"Number of requests.",
["system", "instance_name", "system_version", "command"],
)
# Gauges:
queued_request_gauge = Gauge(
"bg_queued_requests",
"Number of requests waiting to be processed.",
["system", "instance_name", "system_version"],
)
in_progress_request_gauge = Gauge(
"bg_in_progress_requests",
"Number of requests IN_PROGRESS",
["system", "instance_name", "system_version"],
)
def request_latency(start_time):
"""Measure request latency in seconds as a float."""
return (datetime.datetime.utcnow() - start_time).total_seconds()
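A quick check of the helper's semantics with a synthetic start time three seconds in the past (the expression is the same one `request_latency` evaluates):

```python
import datetime

# Synthetic start time three seconds in the past.
start = datetime.datetime.utcnow() - datetime.timedelta(seconds=3)
latency = (datetime.datetime.utcnow() - start).total_seconds()
print(round(latency))  # 3
```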
def initialize_counts():
requests = db.query(
Request, filter_params={"status__in": ["CREATED", "IN_PROGRESS"]}
)
for request in requests:
label_args = {
"system": request.system,
"system_version": request.system_version,
"instance_name": request.instance_name,
}
if request.status == "CREATED":
queued_request_gauge.labels(**label_args).inc()
elif request.status == "IN_PROGRESS":
in_progress_request_gauge.labels(**label_args).inc()
def request_created(request):
queued_request_gauge.labels(
system=request.system,
system_version=request.system_version,
instance_name=request.instance_name,
).inc()
request_counter_total.labels(
system=request.system,
system_version=request.system_version,
instance_name=request.instance_name,
command=request.command,
).inc()
def request_started(request):
"""Update metrics associated with a Request update
This call should happen after the save to the database.
"""
labels = {
"system": request.system,
"system_version": request.system_version,
"instance_name": request.instance_name,
}
queued_request_gauge.labels(**labels).dec()
in_progress_request_gauge.labels(**labels).inc()
def request_completed(request):
"""Update metrics associated with a Request update
This call should happen after the save to the database.
"""
labels = {
"system": request.system,
"system_version": request.system_version,
"instance_name": request.instance_name,
}
in_progress_request_gauge.labels(**labels).dec()
latency = request_latency(request.created_at)
labels["command"] = request.command
labels["status"] = request.status
completed_request_counter.labels(**labels).inc()
plugin_command_latency.labels(**labels).observe(latency)

# sendmail/__init__.py (repo: hmif-itb/hmif-send-mail, license: MIT)
import argparse
from .helpers import yaml_parser
from .mailer import Mailer
from .utils import csv_to_recipients
from .exceptions import TemplateNotFoundException
from .exceptions import TemplateAndCSVNotMatchException
def main():
parser = argparse.ArgumentParser()
parser.add_argument("--config", help="yaml file as mail configuration")
parser.add_argument("--service", help="mail service", default="aws")
parser.add_argument("--mode", help="package app mode", default="send_email")
parser.add_argument("--template-name", help="name of the template to be made")
parser.add_argument(
"--subject",
help="subject of the email",
default="Congrats! Welcome to HMIF Mentoring program",
)
parser.add_argument("--txt", help="txt template content of the email")
parser.add_argument("--html", help="html template content of the email")
args = parser.parse_args()
mode = args.mode
service = args.service
if args.service == "aws":
if mode == "create_template":
mailer = Mailer()
template_name = args.template_name
subject = args.subject
txt = args.txt
html = args.html
mailer.create_template(template_name, subject, html, txt)
elif mode == "send_email":
config_yaml_file = args.config
config_data = yaml_parser(config_yaml_file)
for items in config_data:
sender_name = items["sender"]["name"]
sender_email = items["sender"]["email"]
mailer = Mailer(sender_name, sender_email)
for spec in items["spec"]:
template_name = spec["template"]
template_data = spec["recipient_data"]
try:
mailer.check_template_exist(template_name)
mailer.check_template_match(template_name, template_data)
mail_recipients = csv_to_recipients(template_data)
mailer.send_mail(mail_recipients, template_name)
except TemplateNotFoundException as e:
print(e)
except TemplateAndCSVNotMatchException as e:
print(e)
except Exception as e:
print(e)

# aquascope/tests/aquascope/webserver/data_access/db/dummy_uploads.py (repo: MicroscopeIT/aquascope_backend, license: MIT)
import copy
from bson import ObjectId
from aquascope.webserver.data_access.db.upload import Upload, DEFAULT_UPLOAD_PROJECTION
from aquascope.webserver.data_access.db.util import project_dict
_DUMMY_UPLOADS = [
{
'_id': ObjectId('000000000000000000001000'),
'filename': 'dummy0',
'state': 'initialized',
'tags': []
},
{
'_id': ObjectId('000000000000000000001001'),
'filename': 'dummy1',
'state': 'uploaded',
'tags': ['dummy_tag_1', 'dummy_tag_2', 'with_broken_records_field'],
'broken_records': [],
'broken_record_count': 0
},
{
'_id': ObjectId('000000000000000000001002'),
'filename': 'dummy2',
'state': 'processing',
'image_count': 20,
'duplicate_image_count': 0,
'duplicate_filenames': [],
'tags': ['dummy_tag_1']
},
{
'_id': ObjectId('000000000000000000001003'),
'filename': 'dummy3',
'state': 'finished',
'image_count': 10,
'duplicate_image_count': 2,
'duplicate_filenames': [
'img1.jpg',
'img2.jpg'
],
'tags': ['sth']
},
{
'_id': ObjectId('000000000000000000001004'),
'filename': 'dummy4',
'state': 'failed',
'tags': []
}
]
DUMMY_UPLOADS_WITH_DEFAULT_PROJECTION = [
Upload(project_dict(copy.deepcopy(upload), DEFAULT_UPLOAD_PROJECTION)) for upload in _DUMMY_UPLOADS
]
DUMMY_UPLOADS = [Upload(upload) for upload in _DUMMY_UPLOADS]

# Features/OCR.py (repo: JosephLteif/Smart-Assistant, license: MIT)
# Import required packages
import os
import time
import cv2
import imutils
import numpy as np
import pytesseract
from autocorrect import Speller
from PIL import Image
def VideoOn():
video = cv2.VideoCapture(0, cv2.CAP_DSHOW)
while True:
# check returns true if python can actually read and frame is ndim numpy array
check, frame = video.read()
cv2.imshow('Capturing...', frame)
key = cv2.waitKey(1)
if key == ord('q'):
check, frame = video.read()
a = cv2.imwrite("Data\\OCR_Data\\CaptureImage.jpg", frame)
break
video.release()
cv2.destroyAllWindows()
def TesseractSetup():
# Mention the installed location of Tesseract-OCR in your system
pytesseract.pytesseract.tesseract_cmd = 'C:\\Program Files (x86)\\Tesseract-OCR\\tesseract.exe'
def CropBorder():
# return cropped image from which text needs to be extracted
im = Image.open("Data\OCR_Data\SteveJobsQuote.jpg")
# im = Image.open("./Assets/quote-luck-is-when-skill-meets-opportunity-vinnie-paz-80-71-88.jpg")
if im.mode != 'RGB':
im = im.convert('RGB')
im.save("Data\OCR_Data\SteveJobsQuote.jpg", dpi=(300, 300))
# return border_crop("CaptureImage.jpg")
return cv2.imread("Data\OCR_Data\SteveJobsQuote.jpg")
def ExtractImageData(img):
# cv2.imshow("img", img)
# cv2.waitKey(0)
img = cv2.resize(img, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
# cv2.imshow("img", img)
# cv2.waitKey(0)
# image data
    try:
        data = pytesseract.image_to_osd(img).split()
    except Exception:
        return "", 0
    # Detect the script (OSD "Script" field; used here as the language)
    language = data[-4]
    # Detect the rotation angle in degrees (OSD "Rotate" field)
    rotation = data[-9]
    print(rotation)
    print(data)
    # Return image data
    return language, rotation
def PreprocessingImage(img, rotation):
cv2.imshow("img", img)
cv2.waitKey(0)
# apply rotation
rotated = imutils.rotate(img, angle=-(int(rotation)))
cv2.imshow("img", rotated)
cv2.waitKey(0)
# Resize the image to a given scale
img = cv2.resize(rotated, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
cv2.imshow("img", img)
cv2.waitKey(0)
# Blur using GaussianBlur method
img = cv2.GaussianBlur(img, (5, 5), 0)
cv2.imshow("img", img)
cv2.waitKey(0)
# Convert the image to gray scale
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
# cv2.imshow("img", gray)
# cv2.waitKey(0)
# Apply threshhold
thresh1 = cv2.threshold(gray, 0, 255, cv2.THRESH_TOZERO)[1]
cv2.imshow("img", thresh1)
cv2.waitKey(0)
# Specify structure shape and kernel size.
# Kernel size increases or decreases the area
# of the rectangle to be detected.
# A smaller value like (10, 10) will detect
# each word instead of a sentence.
rect_kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (20, 20))
# Appplying dilation on the threshold image
dilation = cv2.dilate(thresh1, rect_kernel, iterations=1)
cv2.imshow("img", dilation)
cv2.waitKey(0)
# Finding contours
contours, hierarchy = cv2.findContours(
dilation, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
# Creating a copy of image
im2 = img.copy()
return im2, contours
def CreateFileToPrintTo():
# A text file is created and flushed
file = open("Data\\OCR_Data\\recognized.txt", "w+")
file.write("")
file.close()
def FindContour(im2, contours, language):
# Looping through the identified contours
# Then rectangular part is cropped and passed on
# to pytesseract for extracting text from it
# Extracted text is then written into the text file
result = ""
file = open("Data\\OCR_Data\\recognized.txt", "a", encoding='utf-8')
for cnt in contours:
x, y, w, h = cv2.boundingRect(cnt)
# Drawing a rectangle on copied image
rect = cv2.rectangle(im2, (x, y), (x + w, y + h), (0, 255, 0), 2)
# Cropping the text block for giving input to OCR
cropped = im2[y:y + h, x:x + w]
try:
# Apply OCR on the cropped image
if language.lower() == 'latin':
text = pytesseract.image_to_string(cropped, lang="eng")
else:
text = pytesseract.image_to_string(cropped)
except:
return "",""
# Storing the text
result += (text + "\n")
return result, file
def AppendResultToFile(result, file):
spell = Speller(only_replacements=True)
result = result.replace(" ", "")
var = spell(result)
file.write(var)
# Close the file
file.close
def launch():
# VideoOn()
TesseractSetup()
img = CropBorder()
language, rotation = ExtractImageData(img)
im2, contours = PreprocessingImage(img, rotation)
CreateFileToPrintTo()
result, file = FindContour(im2, contours, language)
AppendResultToFile(result, file)
os.remove('Data\\OCR_Data\\CaptureImage.jpg') | 29.011696 | 100 | 0.642209 | 645 | 4,961 | 4.897674 | 0.368992 | 0.02849 | 0.034188 | 0.023742 | 0.149414 | 0.106363 | 0.084204 | 0.039253 | 0.039253 | 0.039253 | 0 | 0.029162 | 0.239669 | 4,961 | 171 | 101 | 29.011696 | 0.808325 | 0.263657 | 0 | 0.153061 | 0 | 0 | 0.091337 | 0.070578 | 0 | 0 | 0 | 0 | 0 | 1 | 0.091837 | false | 0 | 0.081633 | 0 | 0.234694 | 0.020408 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
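The `ExtractImageData` function above pulls the script and rotation out of Tesseract's OSD report by fixed position in the whitespace-split output (`data[-9]`, `data[-4]`), which silently breaks if the report gains or loses a field. A hedged sketch of a key-based parser instead; the sample string is modeled on Tesseract's `Key: value` OSD format, not captured from a real run:

```python
def parse_osd(osd_text):
    """Parse 'Key: value' lines from an OSD-style report into a dict."""
    fields = {}
    for line in osd_text.splitlines():
        key, sep, value = line.partition(":")
        if sep:  # only lines that actually contain a colon
            fields[key.strip()] = value.strip()
    return fields


# Sample modeled on Tesseract OSD output (hypothetical values).
sample = "Page number: 0\nRotate: 270\nScript: Latin"
info = parse_osd(sample)
rotation = int(info.get("Rotate", "0"))
script = info.get("Script", "")
```

Looking fields up by name this way keeps the caller working even when the OSD report's line order or line count changes between Tesseract versions.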
b6e1ea932d8c5cacb20dcd319573741e59522077 | 4,395 | py | Python | scrapy/tests/test_contrib_exp_crawlspider_reqgen.py | chongiadung/choinho | d2a216fe7a5064d73cdee3e928a7beef7f511fd1 | [
"MIT"
] | null | null | null | scrapy/tests/test_contrib_exp_crawlspider_reqgen.py | chongiadung/choinho | d2a216fe7a5064d73cdee3e928a7beef7f511fd1 | [
"MIT"
] | 10 | 2020-02-11T23:34:28.000Z | 2022-03-11T23:16:12.000Z | scrapy/tests/test_contrib_exp_crawlspider_reqgen.py | chongiadung/choinho | d2a216fe7a5064d73cdee3e928a7beef7f511fd1 | [
"MIT"
] | 3 | 2018-08-05T14:54:25.000Z | 2021-06-07T01:49:59.000Z | from twisted.internet import defer
from twisted.trial import unittest

from scrapy.http import Request
from scrapy.http import HtmlResponse
from scrapy.utils.python import equal_attributes

from scrapy.contrib_exp.crawlspider.reqext import SgmlRequestExtractor
from scrapy.contrib_exp.crawlspider.reqgen import RequestGenerator
from scrapy.contrib_exp.crawlspider.reqproc import Canonicalize
from scrapy.contrib_exp.crawlspider.reqproc import FilterDomain
from scrapy.contrib_exp.crawlspider.reqproc import FilterUrl
from scrapy.contrib_exp.crawlspider.reqproc import FilterDupes


class RequestGeneratorTest(unittest.TestCase):

    def setUp(self):
        url = 'http://example.org/somepage/index.html'
        html = """<html><head><title>Page title<title>
        <body><p><a href="item/12.html">Item 12</a></p>
        <p><a href="/about.html">About us</a></p>
        <img src="/logo.png" alt="Company logo (not a link)" />
        <p><a href="../othercat.html">Other category</a></p>
        <p><a href="/" /></p></body></html>"""
        self.response = HtmlResponse(url, body=html)
        self.deferred = defer.Deferred()

        self.requests = [
            Request('http://example.org/somepage/item/12.html',
                    meta={'link_text': 'Item 12'}),
            Request('http://example.org/about.html',
                    meta={'link_text': 'About us'}),
            Request('http://example.org/othercat.html',
                    meta={'link_text': 'Other category'}),
            Request('http://example.org/',
                    meta={'link_text': ''}),
        ]

    def _equal_requests_list(self, list1, list2):
        list1 = list(list1)
        list2 = list(list2)
        if not len(list1) == len(list2):
            return False
        for (req1, req2) in zip(list1, list2):
            if not equal_attributes(req1, req2, ['url']):
                return False
        return True

    def test_basic(self):
        reqgen = RequestGenerator([], [], callback=self.deferred)
        # generate_requests returns a generator
        requests = reqgen.generate_requests(self.response)
        self.failUnlessEqual(list(requests), [])

    def test_request_extractor(self):
        extractors = [
            SgmlRequestExtractor()
        ]
        # extract all requests
        reqgen = RequestGenerator(extractors, [], callback=self.deferred)
        requests = reqgen.generate_requests(self.response)
        self.failUnless(self._equal_requests_list(requests, self.requests))

    def test_request_processor(self):
        extractors = [
            SgmlRequestExtractor()
        ]

        processors = [
            Canonicalize(),
            FilterDupes(),
        ]
        reqgen = RequestGenerator(extractors, processors, callback=self.deferred)
        requests = reqgen.generate_requests(self.response)
        self.failUnless(self._equal_requests_list(requests, self.requests))

        # filter domain
        processors = [
            Canonicalize(),
            FilterDupes(),
            FilterDomain(deny='example.org'),
        ]
        reqgen = RequestGenerator(extractors, processors, callback=self.deferred)
        requests = reqgen.generate_requests(self.response)
        self.failUnlessEqual(list(requests), [])

        # filter url
        processors = [
            Canonicalize(),
            FilterDupes(),
            FilterUrl(deny=(r'about', r'othercat')),
        ]
        reqgen = RequestGenerator(extractors, processors, callback=self.deferred)
        requests = reqgen.generate_requests(self.response)
        self.failUnless(self._equal_requests_list(requests, [
            Request('http://example.org/somepage/item/12.html',
                    meta={'link_text': 'Item 12'}),
            Request('http://example.org/',
                    meta={'link_text': ''}),
        ]))

        processors = [
            Canonicalize(),
            FilterDupes(),
            FilterUrl(allow=r'/somepage/'),
        ]
        reqgen = RequestGenerator(extractors, processors, callback=self.deferred)
        requests = reqgen.generate_requests(self.response)
        self.failUnless(self._equal_requests_list(requests, [
            Request('http://example.org/somepage/item/12.html',
                    meta={'link_text': 'Item 12'}),
        ]))
| 35.16 | 81 | 0.598635 | 435 | 4,395 | 5.954023 | 0.225287 | 0.034749 | 0.043243 | 0.056757 | 0.501158 | 0.471042 | 0.471042 | 0.385714 | 0.385714 | 0.385714 | 0 | 0.009455 | 0.278043 | 4,395 | 124 | 82 | 35.443548 | 0.806807 | 0.014334 | 0 | 0.484211 | 0 | 0.010526 | 0.164317 | 0.023111 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.115789 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
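The `_equal_requests_list` helper above compares requests pairwise on their `url` attribute via `scrapy.utils.python.equal_attributes`. A minimal stand-in for that helper, with a hypothetical `FakeRequest` class for demonstration (the real scrapy utility may handle more cases):

```python
def equal_attributes(obj1, obj2, attributes):
    """True when both objects agree on every named attribute."""
    return all(
        getattr(obj1, attr, None) == getattr(obj2, attr, None)
        for attr in attributes
    )


class FakeRequest:
    """Hypothetical stand-in for scrapy.http.Request."""
    def __init__(self, url, meta=None):
        self.url = url
        self.meta = meta or {}


a = FakeRequest("http://example.org/", meta={"link_text": ""})
b = FakeRequest("http://example.org/", meta={"link_text": "home"})
same_url = equal_attributes(a, b, ["url"])          # only 'url' compared
same_all = equal_attributes(a, b, ["url", "meta"])  # meta differs
```

Comparing only on `url` is exactly why the test fixtures can differ in `meta` and still match.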
b6e323612c27f1c648a39e77bf266f1f8b1edb1c | 5,891 | py | Python | scripts/train_agents.py | Silviatulli/minmax_transparency_project | 45f93870d4366c8a1d0d5a43cf270ec92bc7a63b | [
"MIT"
] | 1 | 2021-07-13T02:55:16.000Z | 2021-07-13T02:55:16.000Z | scripts/train_agents.py | Silviatulli/minmax_transparency_project | 45f93870d4366c8a1d0d5a43cf270ec92bc7a63b | [
"MIT"
] | 1 | 2020-07-06T08:44:28.000Z | 2020-07-06T08:44:28.000Z | scripts/train_agents.py | Silviatulli/minmax_transparency_project | 45f93870d4366c8a1d0d5a43cf270ec92bc7a63b | [
"MIT"
] | 1 | 2020-03-15T15:53:40.000Z | 2020-03-15T15:53:40.000Z | import tensorflow as tf
import json

import jsonpickle
import numpy as np
from sklearn.model_selection import KFold

from minmax.game_model import GameState


def load_data(file_name):
    with open(file_name, "r") as file:
        data = json.load(file)
    data_decoded = jsonpickle.decode(data)

    states = list()
    actions = list()
    rewards = list()
    next_states = list()
    for state, action, reward, next_state in data_decoded:
        state_idx = GameState.get_state_id(state)
        states.append(state_idx)
        next_state_idx = GameState.get_state_id(next_state)
        next_states.append(next_state_idx)
        actions.append(int(action))
        rewards.append(reward)

    states = np.stack(states)
    actions = np.stack(actions)
    rewards = np.stack(rewards)
    next_states = np.stack(next_states)
    return states, actions, rewards, next_states


def q_learning_model():
    NUM_STATES = 12 * 12 * 12 * 12 * 2
    NUM_ACTIONS = 18
    GAMMA = 0.99

    model_in = tf.keras.layers.Input(shape=(1,), dtype=tf.int32)
    tmp = tf.one_hot(model_in, NUM_STATES)
    tmp = tf.keras.layers.Dense(NUM_ACTIONS, use_bias=False)(tmp)
    model_out = tf.squeeze(tmp, axis=1)
    q_function = tf.keras.Model(model_in, model_out)

    state = tf.keras.layers.Input(shape=(1,), dtype=tf.int32, name="State")
    action = tf.keras.layers.Input(shape=(1,), dtype=tf.int32, name="Action")
    reward = tf.keras.layers.Input(shape=(1,), name="Reward")
    next_state = tf.keras.layers.Input(
        shape=(1,), dtype=tf.int32, name="Next_State")

    td_target = reward + GAMMA * tf.reduce_max(q_function(next_state), axis=-1)
    predictions = tf.gather(q_function(state), action, axis=-1)

    train_model = tf.keras.Model(
        inputs=[state, action, reward, next_state],
        outputs=[predictions, td_target]
    )
    # to date it still feels as if tf.stop_gradient is a horrible
    # hack similar to DDQL to stabilize the algorithm
    td_error = 0.5 * tf.abs(tf.stop_gradient(td_target) - predictions) ** 2
    train_model.add_loss(td_error, [state, action, reward, next_state])

    predicted_action = tf.argmax(q_function(state), axis=-1)
    correct_predictions = tf.keras.metrics.categorical_accuracy(
        action, predicted_action)
    train_model.add_metric(correct_predictions,
                           name="Matched_Actions", aggregation="mean")
    return q_function, train_model


def policy_gradient():
    NUM_ACTIONS = 18

    model_in = tf.keras.layers.Input(shape=(1,), dtype=tf.int32)
    tmp = model_in
    tmp = tf.keras.layers.Dense(NUM_ACTIONS, activation="softmax")(tmp)
    model_out = 1.0 * tmp
    policy_fn = tf.keras.Model(model_in, model_out)

    state = tf.keras.layers.Input(shape=(1,), dtype=tf.int32, name="State")
    action = tf.keras.layers.Input(shape=(1,), dtype=tf.int32, name="Action")

    target = tf.one_hot(action, NUM_ACTIONS)
    target = tf.squeeze(target, axis=1)
    predictions = policy_fn(state)

    train_model = tf.keras.Model(
        inputs=[state, action], outputs=[predictions, target])
    error = tf.keras.losses.categorical_crossentropy(target, predictions)
    train_model.add_loss(error, [state, action])

    most_likely_action = tf.argmax(policy_fn(state), axis=-1)
    correct_predictions = tf.keras.metrics.categorical_accuracy(
        action, most_likely_action)
    train_model.add_metric(correct_predictions,
                           name="Matched_Actions", aggregation="mean")
    return policy_fn, train_model


if __name__ == "__main__":
    states, actions, rewards, next_states = load_data("data.json")
    indices = np.arange(len(states))

    # Q Learning
    # ===========
    # q_function, train_q = q_learning_model()
    # training the q-learning agent:
    # train_q.compile(optimizer="sgd")
    # train_q.fit([states, actions, rewards, next_states])
    # using the learned model:
    # q_values = q_function(states).numpy()
    # best_actions = np.argmax(q_values, axis=-1)

    # Policy Gradient
    # ================
    # policy_fn, train_policy = policy_gradient()
    # training the policy gradient:
    # train_policy.compile(optimizer="sgd")
    # train_policy.fit([states, actions, rewards, next_states])
    # use the learned model:
    # action_props = policy_fn(states).numpy()
    # cum_prop = np.cumsum(action_props, axis=-1)
    # rng = np.random.rand(len(action_props))[..., np.newaxis]
    # best_actions = np.argmax(rng <= cum_prop, axis=-1)

    q_scores = list()
    policy_scores = list()
    for train_idx, test_idx in KFold(shuffle=True).split(indices):
        train_data = [
            states[train_idx, ...],
            actions[train_idx, ...],
            rewards[train_idx, ...],
            next_states[train_idx, ...],
        ]
        test_data = [
            states[test_idx, ...],
            actions[test_idx, ...],
            rewards[test_idx, ...],
            next_states[test_idx, ...],
        ]

        q_function, train_q = q_learning_model()
        train_q.compile(optimizer="sgd")
        train_q.fit(train_data)
        _, score = train_q.evaluate(test_data)
        q_scores.append(score)

        policy_fn, train_policy = policy_gradient()
        train_policy.compile(optimizer="sgd")
        train_policy.fit(train_data)
        _, score = train_policy.evaluate(test_data)
        policy_scores.append(score)

    q_scores = np.array(q_scores)
    print(f"Q-Learning Accuracy: M={np.mean(q_scores):.2f} "
          f"(SD={np.std(q_scores):.2f})")
    policy_scores = np.array(policy_scores)
    print(f"Policy-Iteration Accuracy: M={np.mean(policy_scores):.2f} "
          f"(SD={np.std(policy_scores):.2f})")
| 33.282486 | 79 | 0.654728 | 800 | 5,891 | 4.5975 | 0.195 | 0.036161 | 0.042414 | 0.04894 | 0.430941 | 0.386079 | 0.346384 | 0.313757 | 0.250136 | 0.250136 | 0 | 0.015457 | 0.209302 | 5,891 | 176 | 80 | 33.471591 | 0.774152 | 0.138856 | 0 | 0.226087 | 0 | 0 | 0.056922 | 0.02261 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026087 | false | 0 | 0.052174 | 0 | 0.104348 | 0.017391 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
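The `td_target = reward + GAMMA * max_a Q(s', a)` line is the heart of the Q-learning loss above. A plain tabular sketch of the same update rule, with hypothetical toy values and no TensorFlow:

```python
GAMMA = 0.99


def td_target(reward, next_q_values, gamma=GAMMA):
    """One-step temporal-difference target: r + gamma * max_a Q(s', a)."""
    return reward + gamma * max(next_q_values)


def td_update(q, state, action, reward, next_state, alpha=0.5, gamma=GAMMA):
    """Move Q(s, a) a step of size alpha toward the TD target."""
    target = td_target(reward, q[next_state], gamma)
    q[state][action] += alpha * (target - q[state][action])


# Toy table: two states, two actions, all Q-values start at zero.
q = {0: [0.0, 0.0], 1: [0.0, 0.0]}
td_update(q, state=0, action=1, reward=1.0, next_state=1)
# target = 1.0 + 0.99 * 0 = 1.0, so Q(0, 1) moves halfway there: 0.5
```

The `tf.stop_gradient` in the Keras version plays the role of treating `target` as a constant here: only `Q(s, a)` moves, never the target it is chasing.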
b6e3ed186797b0db9105b6d4c93d6db5f25cd9a2 | 447 | py | Python | blockapi/test/api/test_tronscan.py | jendis/blockapi | c886cef954a504fc926c132b30acddc71abe5f12 | [
"MIT"
] | null | null | null | blockapi/test/api/test_tronscan.py | jendis/blockapi | c886cef954a504fc926c132b30acddc71abe5f12 | [
"MIT"
] | null | null | null | blockapi/test/api/test_tronscan.py | jendis/blockapi | c886cef954a504fc926c132b30acddc71abe5f12 | [
"MIT"
] | null | null | null | from pytest import mark
from blockapi.api.tronscan import TronscanAPI
from blockapi.test_init import test_addresses


class TestTronscanAPI:
    ADDRESS = test_addresses['TRX'][0]

    @mark.vcr()
    def test_get_balance(self):
        api = TronscanAPI(address=self.ADDRESS)
        result = api.get_balance()
        assert next((r["amount"] for r in result if r["symbol"] == "TRX")) == \
            0.588285
        assert len(result) == 45
| 24.833333 | 78 | 0.655481 | 58 | 447 | 4.948276 | 0.568966 | 0.083624 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02907 | 0.230425 | 447 | 17 | 79 | 26.294118 | 0.805233 | 0 | 0 | 0 | 0 | 0 | 0.040268 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.083333 | false | 0 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
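The test above uses `next()` over a generator expression to find the TRX entry; without a default, `next()` raises `StopIteration` when the symbol is absent from the balance list. A small sketch of the same lookup with a fallback:

```python
def find_amount(balances, symbol, default=None):
    """Return the 'amount' of the first entry matching symbol, else default."""
    return next(
        (b["amount"] for b in balances if b["symbol"] == symbol),
        default,
    )


balances = [{"symbol": "TRX", "amount": 0.588285},
            {"symbol": "BTC", "amount": 0.01}]
trx = find_amount(balances, "TRX")    # 0.588285
doge = find_amount(balances, "DOGE")  # None instead of StopIteration
```

In the test itself the bare `next()` is fine (a missing entry should fail the test loudly), but application code calling the same API is better served by the two-argument form.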
b6e731b3f53ab367c89ef0ea8e1cbffb0d990775 | 429 | py | Python | kornia/feature/loftr/backbone/__init__.py | Ishticode/kornia | 974abb43ec72d12dbd244a2fb247bbbab8498de0 | [
"ECL-2.0",
"Apache-2.0"
] | 4,894 | 2019-10-24T15:51:39.000Z | 2022-03-30T22:58:33.000Z | kornia/feature/loftr/backbone/__init__.py | Ishticode/kornia | 974abb43ec72d12dbd244a2fb247bbbab8498de0 | [
"ECL-2.0",
"Apache-2.0"
] | 912 | 2019-10-24T16:08:42.000Z | 2022-03-31T19:07:09.000Z | kornia/feature/loftr/backbone/__init__.py | Ishticode/kornia | 974abb43ec72d12dbd244a2fb247bbbab8498de0 | [
"ECL-2.0",
"Apache-2.0"
] | 557 | 2019-10-24T16:02:43.000Z | 2022-03-28T07:33:33.000Z | from .resnet_fpn import ResNetFPN_8_2, ResNetFPN_16_4
def build_backbone(config):
if config['backbone_type'] == 'ResNetFPN':
if config['resolution'] == (8, 2):
return ResNetFPN_8_2(config['resnetfpn'])
elif config['resolution'] == (16, 4):
return ResNetFPN_16_4(config['resnetfpn'])
else:
raise ValueError(f"LOFTR.BACKBONE_TYPE {config['backbone_type']} not supported.")
| 35.75 | 89 | 0.655012 | 53 | 429 | 5.056604 | 0.471698 | 0.022388 | 0.08209 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044118 | 0.207459 | 429 | 11 | 90 | 39 | 0.744118 | 0 | 0 | 0 | 0 | 0 | 0.27972 | 0.058275 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
b6e9e820b7afe09d1d292ddb58ac088c11338fa6 | 5,298 | py | Python | bitsoapi/Client.py | oxsoftdev/bitsoapi | daa15c8921f4e49e07f1a958e8837d3548db329d | [
"MIT"
] | null | null | null | bitsoapi/Client.py | oxsoftdev/bitsoapi | daa15c8921f4e49e07f1a958e8837d3548db329d | [
"MIT"
] | null | null | null | bitsoapi/Client.py | oxsoftdev/bitsoapi | daa15c8921f4e49e07f1a958e8837d3548db329d | [
"MIT"
] | null | null | null | from .errors import (ApiError, ApiClientError)
from .mixins import ApiClientMixin
from .models.public import (
    AvailableBooks,
    Ticker,
    OrderBook,
    Trade,
)
# NOTE: the private-endpoint response models below are referenced by this
# client but were never imported in the original file; the module path used
# here is an assumption.
from .models.private import (
    AccountStatus,
    Balance,
    Fees,
    LedgerEntry,
)


class Client(ApiClientMixin):
    def __init__(self, key=None, secret=None):
        self.base_url = 'https://bitso.com/api/v3'
        self.key = key
        self._secret = secret

    # public api

    def available_books(self):
        url = '%s/available_books/' % self.base_url
        resp = self._request_url(url, 'GET')
        return AvailableBooks(resp)

    def ticker(self, book):
        url = '%s/ticker/' % self.base_url
        parameters = {}
        parameters['book'] = book
        resp = self._request_url(url, 'GET', params=parameters)
        return Ticker(resp['payload'])

    def order_book(self, book, aggregate=True):
        url = '%s/order_book/' % self.base_url
        parameters = {}
        parameters['book'] = book
        parameters['aggregate'] = aggregate
        resp = self._request_url(url, 'GET', params=parameters)
        return OrderBook(resp['payload'])

    def trades(self, book, **kwargs):
        url = '%s/trades/' % self.base_url
        parameters = {}
        parameters['book'] = book
        if 'marker' in kwargs:
            parameters['marker'] = kwargs['marker']
        parameters['limit'] = kwargs.get('limit', 100)
        if 'sort' in kwargs:
            parameters['sort'] = kwargs['sort']
        resp = self._request_url(url, 'GET', params=parameters)
        return [Trade(o) for o in resp['payload']]

    # private api

    def account_status(self):
        url = '%s/account_status/' % self.base_url
        resp = self._request_url(url, 'GET', private=True)
        return AccountStatus(resp['payload'])

    def balance(self):
        url = '%s/balance/' % self.base_url
        resp = self._request_url(url, 'GET', private=True)
        return Balance(resp['payload'])

    def fees(self):
        url = '%s/fees/' % self.base_url
        resp = self._request_url(url, 'GET', private=True)
        return Fees(resp['payload'])

    def ledger(self, operation='', marker=None, limit=25, sort='desc'):
        _operations = ['', 'trades', 'fees', 'fundings', 'withdrawals']
        if not isinstance(operation, str) or operation not in _operations:
            raise ApiClientError({'message': 'invalid operation'})
        url = '%s/ledger/%s' % (self.base_url, operation)
        parameters = {}
        if marker:
            parameters['marker'] = marker
        if limit:
            parameters['limit'] = limit
        if sort:
            parameters['sort'] = sort
        resp = self._request_url(url, 'GET', params=parameters, private=True)
        return [LedgerEntry(entry) for entry in resp['payload']]

    def withdrawals(self):
        raise NotImplementedError

    def fundings(self):
        raise NotImplementedError

    def user_trades(self, tids=[], book=None, marker=None, limit=25, sort='desc'):
        raise NotImplementedError

    def order_trades(self, oid):
        raise NotImplementedError

    def open_orders(self, book=None):
        raise NotImplementedError

    def lookup_orders(self, oids):
        raise NotImplementedError

    def cancel_orders(self, oids):
        if isinstance(oids, str):
            oids = [oids]
        url = '%s/orders/' % self.base_url
        url += '%s/' % ('-'.join(oids))
        resp = self._request_url(url, 'DELETE', private=True)
        return resp['payload']

    def place_order(self, book, side, type, **kwargs):
        _sides = ['buy', 'sell']
        _types = ['market', 'limit']
        if not isinstance(book, str) or not len(book):
            raise ApiClientError({'message': 'book not specified'})
        if not isinstance(side, str) or side not in _sides:
            raise ApiClientError({'message': 'side not specified'})
        if not isinstance(type, str) or type not in _types:
            raise ApiClientError({'message': 'type not specified'})
        major = str(kwargs.get('major', '')).strip()
        minor = str(kwargs.get('minor', '')).strip()
        # exactly one of major/minor must be given
        if bool(major) == bool(minor):
            raise ApiClientError({'message': 'an order must be specified in terms of major or minor, never both'})
        if str(kwargs.get('price', '')).strip() and type != 'limit':
            raise ApiClientError({'message': 'price for use only with limit orders'})
        url = '%s/orders/' % self.base_url
        parameters = {}
        parameters['book'] = book
        parameters['type'] = type
        parameters['side'] = side
        if 'major' in kwargs:
            parameters['major'] = kwargs.get('major')
        if 'minor' in kwargs:
            parameters['minor'] = kwargs.get('minor')
        if 'price' in kwargs:
            parameters['price'] = kwargs.get('price')
        resp = self._request_url(url, 'POST', params=parameters, private=True)
        return resp['payload']

    def funding_destination(self, fund_currency):
        raise NotImplementedError

    def btc_withdrawal(self, amount, address):
        raise NotImplementedError

    def eth_withdrawal(self, amount, address):
        raise NotImplementedError

    def spei_withdrawal(self):
        raise NotImplementedError
| 34.855263 | 114 | 0.597206 | 596 | 5,298 | 5.209732 | 0.204698 | 0.028341 | 0.038969 | 0.057971 | 0.300483 | 0.258937 | 0.212882 | 0.152979 | 0.121417 | 0.050242 | 0 | 0.00207 | 0.270479 | 5,298 | 151 | 115 | 35.086093 | 0.801294 | 0.004153 | 0 | 0.235772 | 0 | 0 | 0.128794 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.170732 | false | 0 | 0.02439 | 0 | 0.284553 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
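The guard clauses in `place_order` and `ledger` combine an `isinstance` check with a membership test; joining them with `and` means an *invalid string* (e.g. `side='hold'`) slips through, because the first operand is false for any string. The reject condition needs `or`. A sketch of the intended guard as a reusable helper (the helper name and message format are illustrative, not part of the bitso client):

```python
def validate_choice(name, value, allowed):
    """Raise ValueError unless value is a string drawn from allowed."""
    if not isinstance(value, str) or value not in allowed:
        raise ValueError("%s must be one of %s, got %r"
                         % (name, sorted(allowed), value))
    return value


side = validate_choice("side", "buy", ["buy", "sell"])   # returns "buy"
# validate_choice("side", "hold", ["buy", "sell"])  -> ValueError
# validate_choice("side", 5, ["buy", "sell"])       -> ValueError
```

With De Morgan's law in mind: "valid" is `isinstance(...) and value in allowed`, so its negation must use `or`, never `and`.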
b6ec2598a56faa62709a6fdad2b2f7f1d84fc242 | 3,256 | py | Python | pycryptoki/mechanism/des.py | nondejus/pycryptoki | 5dbc9fe6b1d17b2bc52a3e91a01abcd0ddc2b0e3 | [
"Apache-2.0"
] | null | null | null | pycryptoki/mechanism/des.py | nondejus/pycryptoki | 5dbc9fe6b1d17b2bc52a3e91a01abcd0ddc2b0e3 | [
"Apache-2.0"
] | null | null | null | pycryptoki/mechanism/des.py | nondejus/pycryptoki | 5dbc9fe6b1d17b2bc52a3e91a01abcd0ddc2b0e3 | [
"Apache-2.0"
] | null | null | null | """
DES3-specific mechanism implementations.
"""
import logging
from ctypes import c_void_p, cast, pointer, sizeof, POINTER

from . import Mechanism
from ..attributes import to_byte_array
from ..conversions import from_bytestring
from ..cryptoki import CK_ULONG, CK_BYTE, CK_BYTE_PTR, CK_DES_CTR_PARAMS, \
    CK_KEY_DERIVATION_STRING_DATA, CK_DES_CBC_ENCRYPT_DATA_PARAMS

LOG = logging.getLogger(__name__)


class DES3CTRMechanism(Mechanism):
    """
    DES3 CTR Mechanism param conversion.
    """
    REQUIRED_PARAMS = ['cb', 'ulCounterBits']

    def to_c_mech(self):
        """
        Convert extra parameters to ctypes, then build out the mechanism.

        :return: :class:`~pycryptoki.cryptoki.CK_MECHANISM`
        """
        super(DES3CTRMechanism, self).to_c_mech()
        ctr_params = CK_DES_CTR_PARAMS()
        ctr_params.cb = (CK_BYTE * 8)(*self.params['cb'])
        ctr_params.ulCounterBits = CK_ULONG(self.params['ulCounterBits'])
        self.mech.pParameter = cast(pointer(ctr_params), c_void_p)
        self.mech.usParameterLen = CK_ULONG(sizeof(ctr_params))
        return self.mech


class DES3ECBEncryptDataMechanism(Mechanism):
    """
    DES3 mechanism for deriving keys from encrypted data.
    """
    REQUIRED_PARAMS = ['data']

    def to_c_mech(self):
        """
        Convert extra parameters to ctypes, then build out the mechanism.

        :return: :class:`~pycryptoki.cryptoki.CK_MECHANISM`
        """
        super(DES3ECBEncryptDataMechanism, self).to_c_mech()
        # from https://www.cryptsoft.com/pkcs11doc/v220
        # /group__SEC__12__14__2__MECHANISM__PARAMETERS.html
        # CKM_DES3_ECB_ENCRYPT_DATA
        # Note: data should be the same size as (or larger than) the key,
        # in multiples of 8.
        params = CK_KEY_DERIVATION_STRING_DATA()
        pdata, data_len = to_byte_array(from_bytestring(self.params['data']))
        pdata = cast(pdata, CK_BYTE_PTR)
        params.pData = pdata
        params.ulLen = CK_ULONG(data_len.value)
        self.mech.pParameter = cast(pointer(params), c_void_p)
        self.mech.usParameterLen = CK_ULONG(sizeof(params))
        return self.mech


class DES3CBCEncryptDataMechanism(Mechanism):
    """
    DES3 CBC mechanism for deriving keys from encrypted data.
    """
    REQUIRED_PARAMS = ['iv', 'data']

    def to_c_mech(self):
        """
        Convert extra parameters to ctypes, then build out the mechanism.

        :return: :class:`~pycryptoki.cryptoki.CK_MECHANISM`
        """
        super(DES3CBCEncryptDataMechanism, self).to_c_mech()
        # from https://www.cryptsoft.com/pkcs11doc/v220
        # /group__SEC__12__14__2__MECHANISM__PARAMETERS.html
        # CKM_DES3_CBC_ENCRYPT_DATA
        # Note: data should be the same size as (or larger than) the key,
        # in multiples of 8.
        params = CK_DES_CBC_ENCRYPT_DATA_PARAMS()
        pdata, data_len = to_byte_array(from_bytestring(self.params['data']))
        pdata = cast(pdata, CK_BYTE_PTR)
        # Note: the IV should always be a length of 8.
        params.iv = (CK_BYTE * 8)(*self.params['iv'])
        params.pData = pdata
        params.length = CK_ULONG(data_len.value)
        self.mech.pParameter = cast(pointer(params), c_void_p)
        self.mech.usParameterLen = CK_ULONG(sizeof(params))
        return self.mech | 36.58427 | 77 | 0.677211 | 412 | 3,256 | 5.06068 | 0.23301 | 0.034532 | 0.020144 | 0.021583 | 0.663309 | 0.618705 | 0.569784 | 0.569784 | 0.569784 | 0.517026 | 0 | 0.014601 | 0.221744 | 3,256 | 89 | 78 | 36.58427 | 0.808208 | 0.292383 | 0 | 0.372093 | 0 | 0 | 0.023245 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.069767 | false | 0 | 0.139535 | 0 | 0.418605 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
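The `CK_KEY_DERIVATION_STRING_DATA` handling above boils down to packing a byte string and its length into a ctypes structure. A self-contained stand-in using only the stdlib `ctypes`; the structure below mimics the PKCS#11 layout for illustration and is not the real `pycryptoki` type:

```python
import ctypes


class KeyDerivationStringData(ctypes.Structure):
    """Mimics PKCS#11 CK_KEY_DERIVATION_STRING_DATA: byte pointer + length."""
    _fields_ = [
        ("pData", ctypes.POINTER(ctypes.c_ubyte)),
        ("ulLen", ctypes.c_ulong),
    ]


def pack_data(raw):
    """Copy raw bytes into a ctypes buffer and wrap it in the struct."""
    buf = (ctypes.c_ubyte * len(raw))(*raw)
    params = KeyDerivationStringData()
    # ctypes keeps a reference to buf through the cast pointer, so the
    # buffer stays alive as long as the struct does
    params.pData = ctypes.cast(buf, ctypes.POINTER(ctypes.c_ubyte))
    params.ulLen = len(raw)
    return params


params = pack_data(b"\x01\x02\x03\x04\x05\x06\x07\x08")
```

The real mechanism classes do the same dance one level up: build the param struct, then `cast(pointer(params), c_void_p)` into the mechanism's `pParameter` slot with `sizeof(params)` as the length.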
b6ef7292ee095b344a1e27bdac06d5abb78f801f | 4,772 | py | Python | main.py | hollow-earth/handwritten-latex | 077ecc09edd94d9cda10700886c74bb93f0f6121 | [
"MIT"
] | null | null | null | main.py | hollow-earth/handwritten-latex | 077ecc09edd94d9cda10700886c74bb93f0f6121 | [
"MIT"
] | null | null | null | main.py | hollow-earth/handwritten-latex | 077ecc09edd94d9cda10700886c74bb93f0f6121 | [
"MIT"
] | null | null | null | import tkinter as tk
from tkinter.filedialog import askopenfilename
from PIL import ImageTk, Image
import execute


def updateText(text):
    text_widget["state"] = tk.NORMAL
    text_widget.insert(tk.INSERT, text)
    text_widget["state"] = tk.DISABLED


def getFile():
    global filename
    filename = askopenfilename()
    if ".png" not in filename.lower() and ".jpg" not in filename.lower():
        updateText("Not a valid image, please use a jpg or png.\n")
    else:
        updateText("Valid image found at {file} \n".format(file=str(filename)))
        img = Image.open(filename)
        newHeight = 300 / img.size[0]
        img = img.resize((299, int(newHeight * img.size[1])), Image.ANTIALIAS)
        img = ImageTk.PhotoImage(img)
        untreatedImage.configure(image=img)
        untreatedImage.image = img
        untreatedImage.pack(side=tk.LEFT)


def executeProgram():
    if filename == "":
        updateText("Please select an image first.\n")
    else:
        global processedFlag
        global processedImage
        # int() either returns an int or raises ValueError, so validate
        # with try/except rather than a type() check on its result
        try:
            thresholdValue = int(thresholdEntry.get())
        except ValueError:
            updateText("Value is not an integer.\n")
            raise ValueError("Value is not an integer.")
        if thresholdValue > 255 or thresholdValue < 0:
            updateText("Value of threshold should be 0-255.\n")
            raise ValueError("Value of threshold should be 0-255.")
        processedFlag = True
        img = execute.processImage(filename, thresholdValue)
        processedImage = img
        img = Image.fromarray(img)
        newHeight = 300 / img.size[0]
        img = img.resize((299, int(newHeight * img.size[1])), Image.ANTIALIAS)
        img = ImageTk.PhotoImage(img)
        procImage.configure(image=img)
        procImage.image = img
        procImage.pack(side=tk.RIGHT)
        updateText("Processed image has been generated.\n")


def showImage():
    def destroyWindow():
        nwin.destroy()
        showImageButton["state"], okButton["state"], downloadButton["state"] = tk.NORMAL, tk.NORMAL, tk.NORMAL

    if filename == "":
        updateText("Please select an image first.\n")
    elif processedFlag == False:
        updateText("Please process an image first.\n")
    else:
        global processedImage
        nwin = tk.Toplevel()
        nwin.title("Processed Image")
        photo3 = Image.fromarray(processedImage)
        photo2 = ImageTk.PhotoImage(photo3)
        nwinCanvas = tk.Canvas(nwin, width=photo3.size[0], height=photo3.size[1])
        nwinCanvas.pack(expand=tk.YES, fill=tk.BOTH)
        showImageButton["state"], okButton["state"], downloadButton["state"] = tk.DISABLED, tk.DISABLED, tk.DISABLED
        nwinCanvas.create_image(1, 1, image=photo2, anchor=tk.NW)
        nwin.resizable(True, True)
        nwin.protocol("WM_DELETE_WINDOW", destroyWindow)
        nwin.mainloop()


filename = ""
processedFlag = False

root = tk.Tk()
root.title("Handwriting to LaTeX")
root.geometry("600x600")
root.resizable(False, False)

canvas = tk.Canvas(root, height=624, width=600, bg="#e6e6e6")
canvas.pack()

imageFrame = tk.Frame(root)
imageFrame.place(relwidth=1, relheight=0.6, rely=0.1)

frame = tk.Frame(root)
frame.place(relwidth=1, relheight=0.1)

uploadImage = tk.PhotoImage(file="./UI/upload.png")
okImage = tk.PhotoImage(file="./UI/ok.png")
magnifyImage = tk.PhotoImage(file="./UI/magnify.png")
downloadImage = tk.PhotoImage(file="./UI/download.png")

downloadButton = tk.Button(frame, bg="#ff8080", command=getFile, image=downloadImage, relief="flat", width=150, compound="left")
okButton = tk.Button(frame, bg="#91FF80", command=executeProgram, image=okImage, relief="flat", width=150, compound="left")
showImageButton = tk.Button(frame, bg="#73FFFB", command=showImage, image=magnifyImage, relief="flat", width=150, compound="left")
updateButton = tk.Button(frame, bg="#FFFF99", image=uploadImage, relief="flat", width=150, compound="left")

downloadButton.place(relx=0, rely=0)
okButton.place(relx=0.50, rely=0)
showImageButton.place(relx=0.25, rely=0)
updateButton.place(relx=0.75, rely=0)

textFrame = tk.Frame(root)
textFrame.place(relwidth=1, relheight=0.3, rely=0.7)

text_widget = tk.Text(textFrame, width=100, height=9, padx=3, pady=3)
text_widget.pack(side=tk.LEFT)
text_widget.insert(tk.INSERT, "Waiting for image input...\n")
text_widget["state"] = tk.DISABLED

procImage = tk.Label(imageFrame, width=298, height=372, pady=1, padx=1)
untreatedImage = tk.Label(imageFrame, width=298, height=372, pady=1, padx=1)

thresholdEntry = tk.Entry(imageFrame, bg="#91FF80")
thresholdEntry.place(relx=0.96, rely=0, width=25, height=20)

root.mainloop() | 40.440678 | 132 | 0.67917 | 614 | 4,772 | 5.262215 | 0.286645 | 0.021665 | 0.015475 | 0.022284 | 0.282266 | 0.210152 | 0.162798 | 0.11204 | 0.11204 | 0.084184 | 0 | 0.035321 | 0.181266 | 4,772 | 118 | 133 | 40.440678 | 0.791656 | 0 | 0 | 0.165049 | 0 | 0 | 0.125707 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048544 | false | 0 | 0.048544 | 0 | 0.097087 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
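The original `executeProgram` validated the entry widget's text with `type(int(thresholdValue)) is not int`, which can never be true: `int()` either returns an int or raises `ValueError` before the comparison runs. A sketch of the validation as a standalone helper, with the 0-255 bounds taken from the messages in the GUI code:

```python
def parse_threshold(text, low=0, high=255):
    """Return text as an int in [low, high], or raise ValueError."""
    try:
        value = int(text)
    except ValueError:
        raise ValueError("Value is not an integer.")
    if not low <= value <= high:
        raise ValueError("Value of threshold should be %d-%d." % (low, high))
    return value


level = parse_threshold("128")  # 128
# parse_threshold("abc") and parse_threshold("300") both raise ValueError
```

Keeping the parsing out of the event handler also makes it trivially testable without spinning up a Tk window.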
b6f0188653bb2ee871af6055822ae4236aad3138 | 1,681 | py | Python | aaem/components/ashp_base/config.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 1 | 2022-01-23T07:18:36.000Z | 2022-01-23T07:18:36.000Z | aaem/components/ashp_base/config.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 5 | 2017-07-14T21:56:46.000Z | 2017-07-14T21:59:15.000Z | aaem/components/ashp_base/config.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 2 | 2020-04-28T18:12:55.000Z | 2021-01-13T01:56:57.000Z | """
Air Source Heat Pump Base Configuration
--------------------------------------
Contains Air Source Heat Pump Base configuration info for the community
data yaml file, and other set-up requirements
"""
from aaem.components import definitions
from pandas import DataFrame
COMPONENT_NAME = "air source heat pumps base"
IMPORT = "IMPORT"
UNKNOWN = "UNKNOWN"
order = [
'enabled',
'lifetime',
'start year',
'btu/hrs',
'cost per btu/hrs',
'o&m per year',
'data',
'performance data'
]
structure = {
COMPONENT_NAME: {
'enabled': bool, #
'lifetime': int, # number years <int>
'start year': int, # start year <int>
'btu/hrs': float,
'cost per btu/hrs': float,
'o&m per year': float,
'data': DataFrame,
'performance data': {
'COP': list,
'Temperature': list,
'Percent of Total Capacity': list,
}
}
}
comments = {
'enabled': definitions.ENABLED,
'lifetime': definitions.LIFETIME,
'start year': definitions.START_YEAR_WITH_TYPE,
'btu/hrs': '[float] per ASHP unit [btu/hrs]',
'cost per btu/hrs': '[float] cost per btu/hrs [$/(btu/hrs)]',
'o&m per year':'[float] operations and maintenance costs per year [$/year]',
'data':
"[DataFrame] Yearly climate data including 'Peak Month % of total', 'Capacity Factor', 'Minimum Temp', Avg. Temp(monthly), and % heating load (monthly)",
'performance data':
"[dict] contains lists of equal length for keys 'Temperature', 'COP' (Coefficient of performance), and 'Percent of Total Capacity'"
}
## list of prerequisites for module
prereq_comps = []
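The `structure` mapping above declares the expected Python type for each YAML key. A minimal, hypothetical validation sketch (not part of aaem) showing how a loaded config section could be checked against such a spec:

```python
structure = {
    "air source heat pumps base": {
        "enabled": bool,
        "lifetime": int,
        "start year": int,
    }
}


def check_types(section, spec):
    """Return the keys whose values do not match the declared type."""
    return [
        key
        for key, expected in spec.items()
        if isinstance(expected, type) and not isinstance(section.get(key), expected)
    ]


cfg = {"enabled": True, "lifetime": "20", "start year": 2017}
bad = check_types(cfg, structure["air source heat pumps base"])  # ["lifetime"]
```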
# ---------------------------------------------------------------------------
# File: tkRAD/xml/rad_xml_attributes_dict.py
# (repo: eblade/telegram, license: MIT)
# ---------------------------------------------------------------------------
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
tkRAD - tkinter Rapid Application Development library
(c) 2013+ Raphaël SEBAN <motus@laposte.net>
This program is free software: you can redistribute it and/or
modify it under the terms of the GNU General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program.
If not, see: http://www.gnu.org/licenses/
"""
# lib imports
from ..core import struct_dict as SD
from . import rad_xml_attribute as XA
class RADXMLAttributesDict (SD.StructDict):
r"""
StructDict subclass for commodity;
handles support for RADXMLAttribute items;
"""
def __init__ (self, *args, **kw):
r"""
class constructor;
implements @item_type=RADXMLAttribute;
"""
# super class inits
super().__init__(*args, **kw)
# member inits
self.item_type = XA.RADXMLAttribute
self.item_value_getter = "get_value"
self.item_value_setter = "set_value"
# end def
# end class RADXMLAttributesDict
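The class above pins `item_type` so stored items are handled as `RADXMLAttribute` objects. The general pattern, a dict subclass that coerces values to a fixed type on assignment, can be sketched without tkRAD (hypothetical `TypedDict`, not the real `StructDict`):

```python
class TypedDict(dict):
    # type every stored value is coerced to; str here purely for illustration
    item_type = str

    def __setitem__(self, key, value):
        # coerce on write, so reads always see item_type instances
        if not isinstance(value, self.item_type):
            value = self.item_type(value)
        super().__setitem__(key, value)
```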
# ---------------------------------------------------------------------------
# File: noogle/models.py
# (repo: mtik00/noogle, license: MIT)
# ---------------------------------------------------------------------------
import enum
from sqlalchemy import Integer, String, Column, Enum, UniqueConstraint
from sqlalchemy.sql import exists, and_
from sqlalchemy_utils import ArrowType
from sqlalchemy.exc import IntegrityError
from .db import Base, session
import arrow
from .settings import settings
from .utils import get_scheduled_date
from .helpers import print_log
class State(enum.Enum):
waiting = 0
complete = 1
removed = 2
def __str__(self):
return self.name
class Action(enum.Enum):
home = 0
away = 1
eco = 2
heat = 3
def __str__(self):
return self.name
class Structure(Base):
"""Describes a Google structure."""
__tablename__ = "structures"
id = Column(Integer, primary_key=True, autoincrement=True, nullable=False)
name = Column(String, nullable=False)
custom_name = Column(String, nullable=True)
def __repr__(self) -> str:
return f"<Structure(custom_name={self.custom_name})>"
class Thermostat(Base):
"""Describes a Nest thermostat."""
__tablename__ = "thermostats"
id = Column(Integer, primary_key=True, autoincrement=True, nullable=False)
name = Column(String, nullable=False)
label = Column(String, nullable=True)
structure_name = Column(String, nullable=False)
def __repr__(self) -> str:
return f"<Thermostat(label={self.label})>"
class Event(Base):
"""Describes a single event stored in cache."""
__tablename__ = "events"
__table_args__ = (
UniqueConstraint(
"event_id", "scheduled_date", "calendar_id", name="event_id__date__cal__uc"
),
{"sqlite_autoincrement": True},
)
id = Column(Integer, primary_key=True, autoincrement=True, nullable=False)
event_id = Column(String, nullable=False)
name = Column(String, nullable=True)
action = Column(Enum(Action), nullable=False)
calendar_id = Column(String, default="primary")
parent_event_id = Column(Integer, nullable=True)
state = Column(Enum(State), nullable=False, default=State.waiting)
scheduled_date = Column(ArrowType, nullable=True)
actioned_date = Column(ArrowType, nullable=True)
description = Column(String, nullable=True)
structure_name = Column(String, default="", nullable=False)
structure_id = Column(String, default="", nullable=False)
def __str__(self):
return f"<Event {self.action}/{self.state}/{self.scheduled_date}>"
def __repr__(self):
return str(self)
@staticmethod
def waiting():
"""Return all waiting events"""
return session.query(Event).filter(Event.state == State.waiting).all()
@staticmethod
def exists(event_id, scheduled_date, state=State.waiting):
"""Returns True if the event_id exists, False otherwise"""
return session.query(
exists().where(
and_(
Event.event_id == event_id,
Event.scheduled_date == scheduled_date,
Event.state == state,
)
)
).scalar()
@staticmethod
def create_from_gcal(gcal_event, commit=True):
e = Event(
name=gcal_event["summary"], event_id=gcal_event["id"], state=State.waiting
)
e.actioned_date = None
parts = e.name.split(":")
if len(parts) == 2:
e.action = Action[parts[1].strip()]
elif len(parts) == 3:
e.action = Action[parts[1].strip()]
e.description = parts[2].strip()
else:
print_log(f'WARNING: Cannot parse event name: "{e.name}"')
if "date" in gcal_event["start"]:
# The user has an "all day" event in gcal.
default_time = (
settings.calendar.default_home_time
if e.action == Action.home  # compare enum members; `.value == Action.home` was always False
else settings.calendar.default_away_time
)
e.scheduled_date = arrow.get(
gcal_event["start"]["date"]
+ " "
+ default_time
+ " "
+ settings.calendar.timezone,
"YYYY-MM-DD H:mm ZZZ",
)
else:
# NOTE: 'dateTime' includes the timezone
e.scheduled_date = get_scheduled_date(gcal_event)
if commit:
try:
session.add(e)
session.commit()
except IntegrityError:
session.rollback()
return e
@staticmethod
def events_missing(gcal_event_list):
result = []
# If there are no events returned, return all waiting events.
if not gcal_event_list:
return session.query(Event).filter(Event.state == State.waiting).all()
for gcal_event in gcal_event_list:
scheduled_date = get_scheduled_date(gcal_event)
events = (
session.query(Event)
.filter(
and_(
Event.event_id == gcal_event["id"], Event.state == State.waiting
)
)
.all()
)
if not events:
continue
result += [x for x in events if x.scheduled_date != scheduled_date]
# Ensure that future events cached in the DB show up in the list from google
gcal_ids = [x["id"] for x in gcal_event_list]
removed_events = (
session.query(Event).filter(
and_(Event.state == State.waiting, Event.event_id.notin_(gcal_ids))
)
).all()
result += removed_events
return result
def mark_event_missing(self):
self.state = State.removed
session.add(self)
session.commit()
def mark_event_done(self):
self.state = State.complete
session.add(self)
session.commit()
def commit(self):
try:
session.add(self)
session.commit()
return True
except IntegrityError:
session.rollback()
raise
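The name-parsing convention used by `create_from_gcal` above (calendar summaries like `nest:away` or `nest:eco:some description`) can be sketched standalone; the helper below is illustrative and not part of noogle:

```python
import enum


class Action(enum.Enum):
    home = 0
    away = 1
    eco = 2
    heat = 3


def parse_summary(name):
    """Return (action, description) for summaries like 'nest:away:optional text'.

    description is None when absent; both are None when the summary
    does not follow the colon convention.
    """
    parts = name.split(":")
    if len(parts) == 2:
        return Action[parts[1].strip()], None
    if len(parts) == 3:
        return Action[parts[1].strip()], parts[2].strip()
    return None, None
```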
# ---------------------------------------------------------------------------
# File: params.py
# (repo: ptarau/StanzaGraphs, license: Apache-2.0)
# ---------------------------------------------------------------------------
import json
import os
import pickle
import subprocess
from inspect import getframeinfo, stack
import langid
PARAMS = dict(
TRACE=1,
TARGET_LANG='en', # tried zh,fr,sp,de,hu,ro,ar,el,la,it,ru,ja
RANKER='betweenness',
UPLOAD_DIRECTORY='uploads/',
OUTPUT_DIRECTORY='out/',
k_count=7,
s_count=5,
translation=True,
pics=False,
CACHING=0
)
def out_dirs():
out = PARAMS['OUTPUT_DIRECTORY']
return [out + x for x in
('overview.txt',
'pdftexts/',
'sums/',
'keys/'
)
]
def pdf2txt(pdf, txt):
subprocess.run(["pdftotext", "-q", pdf, txt])
if os.path.getsize(txt) > 32:
return True
os.remove(txt)
return False
def detect_lang(text):
return langid.classify(text)[0]
def to_json(obj, fname, indent=1):
"""
serializes an object to a json file;
assumes the object is made of arrays and dicts
"""
with open(fname, "w") as outf:
# encode('utf8')
json.dump(obj, outf, indent=indent, ensure_ascii=False)
def from_json(fname):
"""
deserializes an object from a json file
"""
with open(fname, "r") as inf:
obj = json.load(inf)
return obj
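A quick round-trip through the two helpers above, self-contained with a temporary file:

```python
import json
import os
import tempfile


def to_json(obj, fname, indent=1):
    with open(fname, "w") as outf:
        json.dump(obj, outf, indent=indent, ensure_ascii=False)


def from_json(fname):
    with open(fname, "r") as inf:
        return json.load(inf)


# write a small structure out and read it back
fd, path = tempfile.mkstemp(suffix=".json")
os.close(fd)
to_json({"keys": ["graph", "summary"]}, path)
roundtrip = from_json(path)
os.remove(path)
```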
def exists_file(fname):
""" if it exists as file or dir"""
return os.path.exists(fname)
def home_dir():
from pathlib import Path
return str(Path.home())
def ensure_path(fname):
folder, _ = os.path.split(fname)
os.makedirs(folder, exist_ok=True)
def to_pickle(obj, fname='./arxiv.pickle'):
"""
serializes an object to a .pickle file
"""
ensure_path(fname)
with open(fname, "wb") as outf:
pickle.dump(obj, outf)
def from_pickle(fname):
"""
deserializes an object from a pickle file
"""
with open(fname, "rb") as inf:
return pickle.load(inf)
def load_delimited(fname, delimiter):
with open(fname, mode="rt") as f:
for line in f:
xs = line.split(delimiter)
last = xs[-1]
xs[-1] = last[0:-1]
yield xs
def take(n, gen):
for i, x in enumerate(gen):
if i >= n: break
yield x
def pp(gen, n=10):
if isinstance(gen, dict):
gen = gen.items()
for x in take(n, gen):
print(x)
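`take` above stops a (possibly infinite) generator after `n` items; a self-contained copy with a usage line:

```python
def take(n, gen):
    # yield at most n items from any iterable, then stop
    for i, x in enumerate(gen):
        if i >= n:
            break
        yield x


squares = (i * i for i in range(10 ** 6))
first_three = list(take(3, squares))  # [0, 1, 4]
```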
def ppp(*args, **kwargs):
"""
logging mechanism with possible DEBUG extras
will tell from which line in which file the printed
message originates from
"""
if PARAMS["TRACE"] < 1: return
if PARAMS["TRACE"] >= 1:
caller = getframeinfo(stack()[1][0])
print('DEBUG:',
caller.filename.split('/')[-1],
'->', caller.lineno, end=': ')
print(*args, **kwargs)
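`ppp` locates the call site through the `getframeinfo`/`stack()` machinery; the same trick in isolation (illustrative helper, not part of params.py):

```python
from inspect import getframeinfo, stack


def call_site():
    """Return 'filename:lineno' of whoever called this function."""
    caller = getframeinfo(stack()[1][0])
    return "{}:{}".format(caller.filename.split("/")[-1], caller.lineno)
```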
"""
def force_quiet(fun,*args,**kwargs) :
sout=sys.stdout
serr = sys.stderr
f = open(os.devnull, 'w')
sys.stdout = f
sys.stderr = f
result=fun(*args,**kwargs)
sys.stdout = sout
sys.stderr = serr
return result
"""
# ---------------------------------------------------------------------------
# File: safe_exploration/environments/ndpendulum.py
# (repo: oscarkey/safe-exploration, license: MIT)
# ---------------------------------------------------------------------------
"""Contains the n dimensional inverted pendulum environment."""
import warnings
from typing import Optional
import matplotlib.pyplot as plt
import numpy as np
from numpy import ndarray
from polytope import polytope
from scipy.integrate import ode
from scipy.spatial.qhull import ConvexHull
from ..utils import assert_shape
from .environments import Environment
class NDPendulum(Environment):
"""N dimensional inverted pendulum environment.
The pendulum is represented using hyperspherical coordinates, with a fixed value for r. Thus the state is (n-1)
angles, along with their associated velocities:
0 .. n-2:        d_theta1 .. d_theta(n-1)   (angular velocities)
n-1 .. 2(n-1)-1: theta1 .. theta(n-1)       (angles)
There are (n-1) actions, each exerting torque in the plane of one of the angles.
"""
def __init__(self, name: str = "NDPendulum", n: int = 3, l: float = .5, m: float = .15, g: float = 9.82,
b: float = 0., dt: float = .05, init_m: Optional[float] = None, init_std: Optional[float] = None,
plant_noise: ndarray = np.array([0.01, 0.01, 0.01, 0.01]) ** 2, u_min: float = -1., u_max: float = 1,
target: ndarray = np.array([0.0, 0.0]), verbosity: int = 1, norm_x=None, norm_u=None):
"""
:param name: name of the system
:param n: number of dimensions, >=3
:param l: length of the pendulum
:param m: mass of the pendulum
:param g: gravitation constant
:param b: friction coefficient of the system
:param init_m: [(n-1)*2 x 0] initial state mean
:param init_std: standard deviation of the start state sample distribution. Note: This is not(!) the uncertainty
of the state but merely allows for variation in the initial (deterministic) state.
:param u_min: maximum negative torque applied to the system in any dimension
:param u_max: maximum positive torque applied to the system in any dimension
:param target: [(n-1)*2 x 0] target state
"""
assert b == 0., 'Friction is not supported.'
# We have n-1 angles, and for each a position and velocity.
state_dimen = (n - 1) * 2
# We can exert torque in the plane of each of the angles.
action_dimen = (n - 1)
num_angles = n - 1
u_min = np.array([u_min] * num_angles)
u_max = np.array([u_max] * num_angles)
p_origin = np.array([0.0] * state_dimen)
init_m = init_m if init_m is not None else np.array([0., ] * state_dimen)
init_std = init_std if init_std is not None else np.array([0.01, ] * state_dimen)
super().__init__(name, state_dimen, action_dimen, dt, init_m, init_std, plant_noise, u_min, u_max, target,
verbosity, p_origin)
self.odesolver = ode(self._dynamics)
self.l = l
self.m = m
self.g = g
self.b = b
self.target = target
self.target_ilqr = init_m
self.n = n
self.num_angles = num_angles
warnings.warn("Normalization turned off for now. Need to look into it")
max_deg = 30
if norm_x is None:
norm_x = np.array([1.] * state_dimen) # norm_x = np.array([np.sqrt(g/l), np.deg2rad(max_deg)])
if norm_u is None:
norm_u = np.array([1.] * action_dimen) # norm_u = np.array([g*m*l*np.sin(np.deg2rad(max_deg))])
self.norm = [norm_x, norm_u]
self.inv_norm = [arr ** -1 for arr in self.norm]
self._init_safety_constraints()
raise NotImplementedError('NDPendulum doesn\'t work properly yet!')
@property
def l_mu(self) -> ndarray:
return np.array(([0.05] * self.num_angles) + ([.02] * self.num_angles))
@property
def l_sigm(self) -> ndarray:
return np.array(([0.05] * self.num_angles) + ([.02] * self.num_angles))
def _reset(self):
self.odesolver.set_initial_value(self.current_state, 0.0)
def _check_current_state(self, state=None):
if state is None:
state = self.current_state
# Check if the state lies inside the safe polytope i.e. A * x <= b.
res = np.matmul(self.h_mat_safe, state) - self.h_safe.T
satisfied = not (res > 0).any()
# We don't use the status code.
status_code = 0
return not satisfied, status_code
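The check above is a polytope membership test: the state is safe iff `A x <= b` holds row-wise for the safety matrices. The same test in pure Python, on a hypothetical 2-D box rather than the pendulum's actual hull:

```python
def in_polytope(A, b, x):
    """True iff A @ x <= b holds for every row of A."""
    return all(
        sum(a_ij * x_j for a_ij, x_j in zip(row, x)) <= b_i
        for row, b_i in zip(A, b)
    )


# the box |x_0| <= 1, |x_1| <= 1 written in half-space form A x <= b
A = [[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]]
b = [1.0, 1.0, 1.0, 1.0]
```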
def _dynamics(self, t, state, action):
""" Evaluate the system dynamics
Parameters
----------
t: float
Input Parameter required for the odesolver for time-dependent
odes. Has no influence in this system.
state: n_sx1 array[float]
The current state of the system
action: n_ux1 array[float]
The action to be applied at the current time step
Returns
-------
dz: n_sx1 array[float]
The ode evaluated at the given inputs.
"""
assert_shape(state, (self.n_s,))
assert_shape(action, (self.n_u,))
velocity = state[:self.num_angles]
position = state[self.num_angles:]
gravity_proj = np.zeros_like(position)
gravity_proj[0] = self.g / self.l * np.sin(position[0])
inertia = self.m * self.l ** 2
dvelocity = gravity_proj + action / inertia # - b / inertia * state[0]
dposition = velocity
return np.concatenate((dvelocity.flat, dposition.flat))
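A minimal integration sketch of the dynamics above for n = 3 (two angles), using plain `math` and one explicit Euler step; the constants mirror the class defaults, the rest is illustrative:

```python
import math


def pendulum_dynamics(state, action, l=0.5, m=0.15, g=9.82):
    """state = (dtheta1, dtheta2, theta1, theta2); gravity only enters theta1."""
    dtheta1, dtheta2, theta1, _theta2 = state
    inertia = m * l ** 2
    ddtheta1 = g / l * math.sin(theta1) + action[0] / inertia
    ddtheta2 = action[1] / inertia
    return (ddtheta1, ddtheta2, dtheta1, dtheta2)


# one Euler step from the upright equilibrium with zero torque
dt = 0.05
state = (0.0, 0.0, 0.0, 0.0)
deriv = pendulum_dynamics(state, (0.0, 0.0))
state = tuple(s + dt * d for s, d in zip(state, deriv))
```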
def _jac_dynamics(self):
""" Evaluate the jacobians of the system dynamics
Returns
-------
jac: (n_s) x (n_s+n_u) array[float]
The jacobian of the dynamics w.r.t. the state and action
"""
state = np.zeros((self.n_s,))
position = state[self.num_angles:]
theta1 = position[0]
inertia = self.m * self.l ** 2
jac_acl = np.array([[0., 0., self.g / self.l * np.cos(theta1), 0., 1/inertia, 0.], #
[0., 0., 0., 0., 0., 1/inertia]])
jac_vel = np.eye(self.num_angles, self.n_s + self.n_u)
return np.vstack((jac_acl, jac_vel))
def state_to_obs(self, state=None, add_noise=False):
""" Transform the dynamics state to the state to be observed
Parameters
----------
state: n_sx0 1darray[float]
The internal state of the system.
add_noise: bool, optional
If this is set to TRUE, a noisy observation is returned
Returns
-------
state: 2x0 1darray[float]
The state as is observed by the agent.
In the case of the inverted pendulum, this is the same.
"""
if state is None:
state = self.current_state
noise = 0
if add_noise:
noise += np.random.randn(self.n_s) * np.sqrt(self.plant_noise)
state_noise = state + noise
state_norm = state_noise * self.inv_norm[0]
return state_norm
def random_action(self) -> ndarray:
c = 0.5
return c * (np.random.rand(self.n_u) * (self.u_max - self.u_min) + self.u_min)
def _init_safety_constraints(self):
""" Get state and safety constraints
We define the state constraints as:
x_0 - 3*x_1 <= 1
x_0 - 3*x_1 >= -1
x_1 <= max_rad
x_1 >= -max_rad
"""
max_dx = 2.0
max_theta1_deg = 20
max_dtheta1 = 1.2
max_dtheta1_at_vertical = 0.8
max_theta1_rad = np.deg2rad(max_theta1_deg)
# -max_dtheta <dtheta <= max_dtheta
h_0_mat = np.asarray([[1., 0.], [-1., 0.]])
h_0_vec = np.array([max_dtheta1, max_dtheta1])[:, None]
# (1/.4)*dtheta + (2/.26)*theta <= 1
# 2*max_dtheta + c*max_rad <= 1
# => c = (1+2*max_dtheta) / max_rad
# for max_deg = 30, max_dtheta = 1.5 => c \approx 7.62
corners_polygon = np.array([[max_dtheta1, max_dtheta1, -max_theta1_rad, -max_theta1_rad], #
[max_dtheta1, -max_dtheta1, -max_theta1_rad, max_theta1_rad], #
[max_dtheta1, max_dtheta1_at_vertical, -max_theta1_rad, 0.], #
[max_dtheta1, -max_dtheta1_at_vertical, -max_theta1_rad, 0.], #
[-max_dtheta1, max_dtheta1, max_theta1_rad, -max_theta1_rad], #
[-max_dtheta1, -max_dtheta1, max_theta1_rad, max_theta1_rad], #
[-max_dtheta1, max_dtheta1_at_vertical, max_theta1_rad, 0.], #
[-max_dtheta1, -max_dtheta1_at_vertical, max_theta1_rad, 0.], #
[max_dtheta1_at_vertical, max_dtheta1, 0., -max_theta1_rad], #
[max_dtheta1_at_vertical, -max_dtheta1, 0., max_theta1_rad], #
[max_dtheta1_at_vertical, max_dtheta1_at_vertical, 0., 0.], #
[max_dtheta1_at_vertical, -max_dtheta1_at_vertical, 0., 0.], #
[-max_dtheta1_at_vertical, max_dtheta1, 0., -max_theta1_rad], #
[-max_dtheta1_at_vertical, -max_dtheta1, 0., max_theta1_rad], #
[-max_dtheta1_at_vertical, max_dtheta1_at_vertical, 0., 0.], #
[-max_dtheta1_at_vertical, -max_dtheta1_at_vertical, 0., 0.]])
ch = ConvexHull(corners_polygon)
# returns the equation for the convex hull of the corner points s.t. eq = [H,h]
# with Hx <= -h
eq = ch.equations
h_mat_safe = eq[:, :self.n_s]
h_safe = -eq[:, self.n_s:] # We want the form Ax <= b , hence A = H, b = -h
p = polytope.qhull(corners_polygon)
# normalize safety bounds
self.h_mat_safe = h_mat_safe
self.h_safe = h_safe
self.h_mat_obs = None  # np.asarray([[0.,1.],[0.,-1.]])
self.h_obs = None # np.array([.6,.6]).reshape(2,1)
# arrange the corner points such that they can be plotted via a line plot
self.corners_polygon = corners_polygon
self.ch_safety_bounds = ch
def get_safety_constraints(self, normalize=True):
""" Return the safe constraints
Parameters
----------
normalize: boolean, optional
If TRUE: Returns normalized constraints
"""
if normalize:
m_x = np.diag(self.norm[0])
h_mat_safe = np.dot(self.h_mat_safe, m_x)
else:
h_mat_safe = self.h_mat_safe
return h_mat_safe, self.h_safe, self.h_mat_obs, self.h_obs
def _render_env(self, screen, axis: [float], display_width: int, display_height: int):
theta = self.current_state[2]
phi = self.current_state[3]
x = np.sin(theta) * np.cos(phi)
y = np.sin(theta) * np.sin(phi)
z = np.cos(theta)
fig = plt.figure(figsize=plt.figaspect(0.5))
ax1 = fig.add_subplot(1, 2, 1, projection='3d')
ax1.elev = 90
ax1.azim = 90
ax1.set_xlim(-1, 1)
ax1.set_ylim(-1, 1)
ax1.set_zlim(-1, 1)
ax1.plot([0, 1], [0, 0], [0, 0], color='grey')
ax1.plot([0, 0], [0, 1], [0, 0], color='grey')
ax1.plot([0, 0], [0, 0], [0, 1], color='grey')
ax1.plot([0, x], [0, y], [0, z])
ax1.scatter([x], [y], [z])
ax2 = fig.add_subplot(1, 2, 2, projection='3d')
ax2.elev = 0
ax2.azim = 90
ax2.set_xlim(-1, 1)
ax2.set_ylim(-1, 1)
ax2.set_zlim(-1, 1)
ax2.plot([0, 1], [0, 0], [0, 0], color='grey')
ax2.plot([0, 0], [0, 1], [0, 0], color='grey')
ax2.plot([0, 0], [0, 0], [0, 1], color='grey')
ax2.plot([0, x], [0, y], [0, z])
ax2.scatter([x], [y], [z])
plt.show()
# # Clear screen to black.
# screen.fill((0, 0, 0))
#
# center_x = display_width / 2
# center_y = display_height / 2
#
# length = min(display_width, display_height) / 3
#
# theta = self.current_state[1]
# end_x = center_x - length * math.sin(theta)
# end_y = center_y - length * math.cos(theta)
#
# pygame.draw.circle(screen, (255, 255, 255), (center_x, center_y), 10)
# pygame.draw.line(screen, (255, 255, 255), (center_x, center_y), (end_x, end_y), width=3)
def plot_ellipsoid_trajectory(self, p, q, vis_safety_bounds=True):
raise NotImplementedError
# ---------------------------------------------------------------------------
# File: docs/conf.py
# (repo: mlibrary/keycard, license: BSD-3-Clause)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
import guzzle_sphinx_theme
from recommonmark.parser import CommonMarkParser
# -- General configuration ------------------------------------------------
project = u'Keycard'
copyright = u'2018, Regents of the University of Michigan'
author = u'Noah Botimer'
version = u'0.2.4'
release = u'0.2.4'
extensions = ['guzzle_sphinx_theme']
templates_path = ['_templates']
master_doc = 'index'
source_parsers = {
'.md': CommonMarkParser,
}
source_suffix = ['.rst', '.md']
language = None
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
pygments_style = 'sphinx'
todo_include_todos = False
# -- Options for HTML output ----------------------------------------------
html_theme_path = guzzle_sphinx_theme.html_theme_path()
html_theme = 'guzzle_sphinx_theme'
html_static_path = ['_static']
# Guzzle theme options (see theme.conf for more information)
html_theme_options = {
"project_nav_name": "Keycard",
}
html_sidebars = {
'**': [
'logo-text.html',
'globaltoc.html',
'searchbox.html',
]
}
# ---------------------------------------------------------------------------
# File: tests/integrations/test_bottle.py
# (repo: ForroKulcs/bugsnag-python, license: MIT)
# ---------------------------------------------------------------------------
from webtest import TestApp
import bottle
from bottle import route, template
import bugsnag
from bugsnag.wsgi.middleware import BugsnagMiddleware
from tests.utils import IntegrationTest
class TestBottle(IntegrationTest):
def setUp(self):
super(TestBottle, self).setUp()
bugsnag.configure(endpoint=self.server.url,
session_endpoint=self.server.url,
auto_capture_sessions=False,
api_key='3874876376238728937',
asynchronous=False)
def test_routing_error(self):
@route('/beans')
def index():
raise Exception('oh no!')
app = bottle.app()
app.catchall = False
app = TestApp(BugsnagMiddleware(app))
self.assertRaises(Exception, lambda: app.get('/beans'))
self.assertEqual(1, len(self.server.received))
payload = self.server.received[0]['json_body']
event = payload['events'][0]
self.assertTrue(event['unhandled'])
self.assertEqual(event['context'], 'GET /beans')
self.assertEqual(event['exceptions'][0]['errorClass'], 'Exception')
self.assertEqual(event['exceptions'][0]['message'], 'oh no!')
runtime_versions = event['device']['runtimeVersions']
self.assertEqual(runtime_versions['bottle'], '0.12.18')
assert 'environment' not in event['metaData']
def test_enable_environment(self):
bugsnag.configure(send_environment=True)
@route('/beans')
def index():
raise Exception('oh no!')
app = bottle.app()
app.catchall = False
app = TestApp(BugsnagMiddleware(app))
self.assertRaises(Exception, lambda: app.get('/beans'))
self.assertEqual(1, len(self.server.received))
payload = self.server.received[0]['json_body']
metadata = payload['events'][0]['metaData']
self.assertEqual(metadata['environment']['PATH_INFO'], '/beans')
def test_template_error(self):
@route('/berries/<variety>')
def index(variety):
return template('{{type1}} {{type2}}', type1=variety)
app = bottle.app()
app.catchall = False
app = TestApp(BugsnagMiddleware(app))
self.assertRaises(Exception, lambda: app.get('/berries/red'))
self.assertEqual(1, len(self.server.received))
payload = self.server.received[0]['json_body']
event = payload['events'][0]
self.assertTrue(event['unhandled'])
self.assertEqual(event['context'], 'GET /berries/red')
self.assertEqual(event['exceptions'][0]['errorClass'], 'NameError')
self.assertEqual(event['exceptions'][0]['message'],
"name 'type2' is not defined")
assert 'environment' not in event['metaData']
runtime_versions = event['device']['runtimeVersions']
self.assertEqual(runtime_versions['bottle'], bottle.__version__)
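The tests wrap the Bottle app in `BugsnagMiddleware`, which intercepts exceptions raised by the wrapped WSGI app and re-raises them after reporting. The shape of such a wrapper, reduced to a hypothetical sketch with no Bugsnag involved:

```python
class ReportingMiddleware:
    """Call the wrapped WSGI app; record any exception, then re-raise it."""

    def __init__(self, app, reporter):
        self.app = app
        self.reporter = reporter

    def __call__(self, environ, start_response):
        try:
            return self.app(environ, start_response)
        except Exception as exc:
            self.reporter((environ.get("PATH_INFO"), type(exc).__name__))
            raise


reports = []


def failing_app(environ, start_response):
    raise RuntimeError("oh no!")


app = ReportingMiddleware(failing_app, reports.append)
try:
    app({"PATH_INFO": "/beans"}, lambda status, headers: None)
except RuntimeError:
    pass
```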
# ---------------------------------------------------------------------------
# File: snappy_pipeline/workflows/somatic_variant_filtration/__init__.py
# (repo: eudesbarbosa/snappy-pipeline, license: MIT)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
"""Implementation of the ``somatic_variant_filtration`` step
=====================
Default Configuration
=====================
The default configuration is as follows.
.. include:: DEFAULT_CONFIG_somatic_variant_filtration.rst
=========
Important
=========
Because the EB Filter step is so time consuming, the data going in
can be heavily prefiltered (e.g. using Jannovar with the offExome flag).
TODO: document filter, for now see the eb_filter wrapper!
=======
Concept
=======
All variants are annotated with the dkfz-bias-filter to remove sequencing
and PCR artifacts. Which variants are annotated with EBFilter is variable, i.e.
only variants that have the PASS flag set, because we assume only those will
be kept.
We borrowed the general workflow from variant_filtration, i.e. working with
pre-defined filter sets and exon/region lists.
========
Workflow
========
* 1. Do the filtering genome wide (this file needs to be there, always)
- dkfz-ebfilter-filterset1-genomewide
* 2. optionally, subset to regions defined in a bed file, which returns
- dkfz-ebfilter-filterset1-regions1
and so on for filterset1 to n
filterset1:
filter bPcr, bSeq flags from dkfz-bias-filter
filterset2:
additionally filter variants with EBscore < x, x is configurable
"""
from collections import OrderedDict
import os
import random
import sys
from biomedsheets.shortcuts import CancerCaseSheet, CancerCaseSheetOptions, is_not_background
from snakemake.io import expand
from snappy_pipeline.utils import dictify, listify
from snappy_pipeline.workflows.abstract import BaseStep, BaseStepPart, LinkOutStepPart
from snappy_pipeline.workflows.ngs_mapping import NgsMappingWorkflow
from snappy_pipeline.workflows.somatic_variant_annotation import SomaticVariantAnnotationWorkflow
from snappy_pipeline.workflows.somatic_variant_calling import (
SOMATIC_VARIANT_CALLERS_MATCHED,
SomaticVariantCallingWorkflow,
)
__author__ = "Manuel Holtgrewe <manuel.holtgrewe@bihealth.de>"
#: Extensions of files to create as main payload
EXT_VALUES = (".vcf.gz", ".vcf.gz.tbi", ".vcf.gz.md5", ".vcf.gz.tbi.md5")
#: Names of the files to create for the extension
EXT_NAMES = ("vcf", "tbi", "vcf_md5", "tbi_md5")
#: Default configuration for the somatic_variant_filtration step
DEFAULT_CONFIG = r"""
# Default configuration somatic_variant_filtration
step_config:
somatic_variant_filtration:
drmaa_snippet: '' # default, you can override by step below
path_somatic_variant_annotation: ../somatic_variant_annotation
path_ngs_mapping: ../ngs_mapping
tools_ngs_mapping: null
tools_somatic_variant_calling: null
filter_sets:
# no_filter: no_filters # implicit, always defined
dkfz_only: '' # empty
dkfz_and_ebfilter:
ebfilter_threshold: 2.4
dkfz_and_ebfilter_and_oxog:
vaf_threshold: 0.08
coverage_threshold: 5
dkfz_and_oxog:
vaf_threshold: 0.08
coverage_threshold: 5
exon_lists: {}
# genome_wide: null # implicit, always defined
    # ensembl74: path/to/ensembl74.bed
eb_filter:
shuffle_seed: 1
panel_of_normals_size: 25
min_mapq: 20
min_baseq: 15
# Parallelization configuration
drmaa_snippet: '' # value to pass in as additional DRMAA arguments
window_length: 10000000 # split input into windows of this size, each triggers a job
num_jobs: 500 # number of windows to process in parallel
use_drmaa: true # use drmaa for parallel processing
restart_times: 5 # number of times to re-launch jobs in case of failure
max_jobs_per_second: 2 # throttling of job creation
max_status_checks_per_second: 10 # throttling of status checks
debug_trunc_tokens: 0 # truncation to first N tokens (0 for none)
keep_tmpdir: never # keep temporary directory, {always, never, onerror}
job_mult_memory: 1 # memory multiplier
job_mult_time: 1 # running time multiplier
merge_mult_memory: 1 # memory multiplier for merging
merge_mult_time: 1 # running time multiplier for merging
ignore_chroms: # patterns of chromosome names to ignore
- NC_007605 # herpes virus
- hs37d5 # GRCh37 decoy
    - chrEBV  # Epstein-Barr virus
- '*_decoy' # decoy contig
- 'HLA-*' # HLA genes
- 'GL000220.*' # Contig with problematic, repetitive DNA in GRCh37
"""
class SomaticVariantFiltrationStepPart(BaseStepPart):
"""Shared code for all tools in somatic_variant_filtration"""
def __init__(self, parent):
super().__init__(parent)
self.log_path = (
r"work/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
r"dkfz_bias_filter.{tumor_library,[^\.]+}/log/snakemake.dkfz_bias_filter.log"
)
# Build shortcut from cancer bio sample name to matched cancer sample
self.tumor_ngs_library_to_sample_pair = OrderedDict()
for sheet in self.parent.shortcut_sheets:
self.tumor_ngs_library_to_sample_pair.update(
sheet.all_sample_pairs_by_tumor_dna_ngs_library
)
# Build mapping from donor name to donor.
self.donors = OrderedDict()
for sheet in self.parent.shortcut_sheets:
for donor in sheet.donors:
self.donors[donor.name] = donor
@dictify
def _get_log_file(self, action):
"""Return path to log file for the given action"""
assert action in self.actions, "Invalid action"
        if action == "write_panel":
            # a bare ``return`` inside this @dictify generator would be lost,
            # so the single log path is yielded under the "log" key instead
            yield "log", (
"work/{mapper}.eb_filter.panel_of_normals/log/"
"{mapper}.eb_filter.panel_of_normals.log"
)
else:
name_pattern = self.token
key_ext = (
("log", ".log"),
("conda_info", ".conda_info.txt"),
("conda_list", ".conda_list.txt"),
)
for key, ext in key_ext:
yield key, os.path.join("work", name_pattern, "log", name_pattern + ext)
def get_normal_lib_name(self, wildcards):
"""Return name of normal (non-cancer) library"""
pair = self.tumor_ngs_library_to_sample_pair[wildcards.tumor_library]
return pair.normal_sample.dna_ngs_library.name
def get_params(self, action):
"""Return arguments to pass down."""
_ = action
def params_function(wildcards):
if wildcards.tumor_library not in self.donors:
return {
"tumor_library": wildcards.tumor_library,
"normal_library": self.get_normal_lib_name(wildcards),
}
else:
return {}
return params_function
class DkfzBiasFilterStepPart(SomaticVariantFiltrationStepPart):
"""Flag variants with the DKFZ bias filter"""
name = "dkfz_bias_filter"
def __init__(self, parent):
super().__init__(parent)
self.actions = ("run",)
self.token = (
"{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.{tumor_library}"
)
@dictify
def get_input_files(self, action):
"""Return path to jannovar-annotated vcf input file"""
assert action == "run"
# VCF file and index
tpl = (
"output/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf.{tumor_library}/out/"
"{mapper}.{var_caller}.jannovar_annotate_somatic_vcf.{tumor_library}"
)
key_ext = {"vcf": ".vcf.gz", "tbi": ".vcf.gz.tbi"}
variant_annotation = self.parent.sub_workflows["somatic_variant_annotation"]
for key, ext in key_ext.items():
yield key, variant_annotation(tpl + ext)
# BAM file and index
tpl = "output/{mapper}.{tumor_library}/out/{mapper}.{tumor_library}"
key_ext = {"bam": ".bam", "bai": ".bam.bai"}
ngs_mapping = self.parent.sub_workflows["ngs_mapping"]
for key, ext in key_ext.items():
yield key, ngs_mapping(tpl + ext)
@dictify
def get_output_files(self, action):
"""Return output files for the filtration"""
assert action == "run"
prefix = (
r"work/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
r"dkfz_bias_filter.{tumor_library,[^\.]+}/out/{mapper}.{var_caller}."
r"jannovar_annotate_somatic_vcf.dkfz_bias_filter.{tumor_library}"
)
key_ext = {
"vcf": ".vcf.gz",
"tbi": ".vcf.gz.tbi",
"vcf_md5": ".vcf.gz.md5",
"tbi_md5": ".vcf.gz.tbi.md5",
}
for key, ext in key_ext.items():
yield key, prefix + ext
@classmethod
def update_cluster_config(cls, cluster_config):
"""Update cluster configuration with resource requirements"""
cluster_config["somatic_variant_filtration_dkfz_bias_filter_run"] = {
"mem": 3 * 1024,
"time": "72:00",
"ntasks": 1,
}
class EbFilterStepPart(SomaticVariantFiltrationStepPart):
"""Flag variants with EBFilter"""
name = "eb_filter"
def __init__(self, parent):
super().__init__(parent)
self.actions = ("run", "write_panel")
self.token = (
"{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.eb_filter.{tumor_library}"
)
def get_input_files(self, action):
assert action in self.actions
return getattr(self, "_get_input_files_{}".format(action))
@dictify
def _get_input_files_run(self, wildcards):
# VCF file and index
tpl = (
"work/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.{tumor_library}/out/{mapper}.{var_caller}."
"jannovar_annotate_somatic_vcf.dkfz_bias_filter."
"{tumor_library}"
)
key_ext = {"vcf": ".vcf.gz", "tbi": ".vcf.gz.tbi"}
for key, ext in key_ext.items():
yield key, tpl.format(**wildcards) + ext
# BAM file and index
tpl = "output/{mapper}.{tumor_library}/out/{mapper}.{tumor_library}"
key_ext = {"bam": ".bam", "bai": ".bam.bai"}
ngs_mapping = self.parent.sub_workflows["ngs_mapping"]
for key, ext in key_ext.items():
yield key, ngs_mapping(tpl.format(**wildcards) + ext)
# Panel of normals TXT file
yield "txt", self._get_output_files_write_panel()["txt"].format(**wildcards)
def _get_input_files_write_panel(self, wildcards):
bam_paths = self._get_panel_of_normal_bams(wildcards)
return {"bam": bam_paths, "bai": [p + ".bai" for p in bam_paths]}
def get_output_files(self, action):
"""Return output files for the filtration"""
assert action in self.actions
return getattr(self, "_get_output_files_{}".format(action))()
@dictify
def _get_output_files_run(self):
prefix = (
r"work/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
r"dkfz_bias_filter.eb_filter.{tumor_library,[^\.]+}/out/"
r"{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
r"dkfz_bias_filter.eb_filter.{tumor_library}"
)
key_ext = {
"vcf": ".vcf.gz",
"tbi": ".vcf.gz.tbi",
"vcf_md5": ".vcf.gz.md5",
"tbi_md5": ".vcf.gz.tbi.md5",
}
for key, ext in key_ext.items():
yield key, prefix + ext
@dictify
def _get_output_files_write_panel(self):
# TODO: add the actual normal sample here?!
yield "txt", (
"work/{mapper}.eb_filter.panel_of_normals/out/{mapper}.eb_filter."
"panel_of_normals.txt"
)
def write_panel_of_normals_file(self, wildcards):
"""Write out file with paths to panels-of-normal"""
output_path = self.get_output_files("write_panel")["txt"].format(**wildcards)
with open(output_path, "wt") as outf:
for bam_path in self._get_panel_of_normal_bams(wildcards):
print(bam_path, file=outf)
@listify
def _get_panel_of_normal_bams(self, wildcards):
"""Return list of "panel of normal" BAM files."""
libraries = []
for sheet in self.parent.shortcut_sheets:
for donor in sheet.donors:
for bio_sample in donor.bio_samples.values():
if not bio_sample.extra_infos["isTumor"]:
libraries.append(bio_sample.dna_ngs_library.name)
libraries.sort()
random.seed(self.config["eb_filter"]["shuffle_seed"])
lib_count = self.config["eb_filter"]["panel_of_normals_size"]
random.shuffle(libraries)
ngs_mapping = self.parent.sub_workflows["ngs_mapping"]
tpl = "output/{mapper}.{normal_library}/out/{mapper}.{normal_library}"
for library in libraries[:lib_count]:
yield ngs_mapping(tpl.format(normal_library=library, **wildcards) + ".bam")
@staticmethod
def update_cluster_config(cluster_config):
"""Update cluster configuration with resource requirements"""
cluster_config["somatic_variant_filtration_eb_filter_run"] = {
"mem": 8 * 1024,
"time": "144:00",
"ntasks": 1,
}
class ApplyFiltersStepPartBase(SomaticVariantFiltrationStepPart):
"""Base class for the different filters."""
name = None
def __init__(self, parent):
super().__init__(parent)
name_pattern = (
"{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.eb_filter.{tumor_library}.{filter_set}.{exon_list}"
)
self.base_path_out = os.path.join("work", name_pattern, "out", name_pattern + "{ext}")
self.path_log = os.path.join("work", name_pattern, "log", name_pattern + ".log")
def update_cluster_config(self, cluster_config):
cluster_config["variant_filtration_{}_run".format(self.name)] = {
"mem": int(3.75 * 1024 * 2),
"time": "01:00",
"ntasks": 2,
}
class ApplyFiltersStepPart(ApplyFiltersStepPartBase):
"""Apply the configured filters."""
name = "apply_filters"
def get_args(self, action):
def args_function(wildcards):
result = {
"normal_sample": self.get_normal_lib_name(wildcards),
"tumor_sample": wildcards.tumor_library,
}
return result
assert action == "run"
return args_function
@dictify
def get_input_files(self, action):
assert action == "run", "Unsupported actions"
tpl = (
"work/{mapper}.{var_caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.eb_filter.{tumor_library}/out/{mapper}.{var_caller}."
"jannovar_annotate_somatic_vcf.dkfz_bias_filter.eb_filter."
"{tumor_library}"
)
key_ext = {"vcf": ".vcf.gz", "tbi": ".vcf.gz.tbi"}
for key, ext in key_ext.items():
yield key, tpl + ext
@dictify
def get_output_files(self, action):
assert action == "run"
for key, ext in zip(EXT_NAMES, EXT_VALUES):
yield key, self.base_path_out.replace("{step}", self.name).replace(
"{exon_list}", "genome_wide"
).replace("{ext}", ext)
def get_log_file(self, action):
assert action == "run"
return self.path_log.replace("{step}", self.name).replace("{exon_list}", "genome_wide")
class FilterToExonsStepPart(ApplyFiltersStepPartBase):
"""Apply the configured filters."""
name = "filter_to_exons"
def get_input_files(self, action):
@dictify
def input_function(wildcards):
for key, ext in zip(EXT_NAMES, EXT_VALUES):
yield key, self.base_path_out.format(
step="apply_filters",
mapper=wildcards.mapper,
var_caller=wildcards.var_caller,
filter_set=wildcards.filter_set,
exon_list="genome_wide",
ext=ext,
)
assert action == "run", "Unsupported actions"
return input_function
@dictify
def get_output_files(self, action):
assert action == "run"
for key, ext in zip(EXT_NAMES, EXT_VALUES):
yield key, self.base_path_out.replace("{step}", "filter_to_exons").replace("{ext}", ext)
def get_log_file(self, action):
assert action == "run"
return self.path_log.replace("{step}", self.name)
class SomaticVariantFiltrationWorkflow(BaseStep):
"""Perform somatic variant filtration"""
name = "somatic_variant_filtration"
sheet_shortcut_class = CancerCaseSheet
sheet_shortcut_kwargs = {
"options": CancerCaseSheetOptions(allow_missing_normal=True, allow_missing_tumor=True)
}
@classmethod
def default_config_yaml(cls):
"""Return default config YAML, to be overwritten by project-specific one."""
return DEFAULT_CONFIG
def __init__(
self, workflow, config, cluster_config, config_lookup_paths, config_paths, workdir
):
super().__init__(
workflow,
config,
cluster_config,
config_lookup_paths,
config_paths,
workdir,
(SomaticVariantAnnotationWorkflow, SomaticVariantCallingWorkflow, NgsMappingWorkflow),
)
# Register sub step classes so the sub steps are available
self.register_sub_step_classes(
(
DkfzBiasFilterStepPart,
EbFilterStepPart,
ApplyFiltersStepPart,
FilterToExonsStepPart,
LinkOutStepPart,
)
)
# Register sub workflows
self.register_sub_workflow(
"somatic_variant_annotation", self.config["path_somatic_variant_annotation"]
)
self.register_sub_workflow("ngs_mapping", self.config["path_ngs_mapping"])
# Copy over "tools" setting from somatic_variant_calling/ngs_mapping if not set here
if not self.config["tools_ngs_mapping"]:
self.config["tools_ngs_mapping"] = self.w_config["step_config"]["ngs_mapping"]["tools"][
"dna"
]
if not self.config["tools_somatic_variant_calling"]:
self.config["tools_somatic_variant_calling"] = self.w_config["step_config"][
"somatic_variant_calling"
]["tools"]
@listify
def get_result_files(self):
"""Return list of result files
Process all primary DNA libraries and perform pairwise calling for tumor/normal pairs
"""
callers = set(self.config["tools_somatic_variant_calling"])
name_pattern = (
"{mapper}.{caller}.jannovar_annotate_somatic_vcf."
"dkfz_bias_filter.eb_filter.{tumor_library.name}."
"{filter_set}.{exon_list}"
)
filter_sets = ["no_filter"]
filter_sets += self.config["filter_sets"].keys()
exon_lists = ["genome_wide"]
exon_lists += list(self.config["exon_lists"].keys())
yield from self._yield_result_files_matched(
os.path.join("output", name_pattern, "out", name_pattern + "{ext}"),
mapper=self.config["tools_ngs_mapping"],
caller=callers & set(SOMATIC_VARIANT_CALLERS_MATCHED),
filter_set=filter_sets,
exon_list=exon_lists,
ext=EXT_VALUES,
)
# TODO: filtration for joint calling not implemented yet
def _yield_result_files_matched(self, tpl, **kwargs):
"""Build output paths from path template and extension list.
This function returns the results from the matched somatic variant callers such as
Mutect.
"""
for sheet in filter(is_not_background, self.shortcut_sheets):
for sample_pair in sheet.all_sample_pairs:
if (
not sample_pair.tumor_sample.dna_ngs_library
or not sample_pair.normal_sample.dna_ngs_library
):
msg = (
"INFO: sample pair for cancer bio sample {} has is missing primary"
"normal or primary cancer NGS library"
)
print(msg.format(sample_pair.tumor_sample.name), file=sys.stderr)
continue
yield from expand(
tpl, tumor_library=[sample_pair.tumor_sample.dna_ngs_library], **kwargs
)
def check_config(self):
"""Check that the path to the NGS mapping is present"""
self.ensure_w_config(
("step_config", "somatic_variant_filtration", "path_somatic_variant_annotation"),
"Path to variant calling not configured but required for somatic variant annotation",
)
__version__ = "0.0.1"
import asyncio
import logging
from contextvars import ContextVar
from pathlib import Path
from typing import Any, Awaitable, Union, cast
from .context import Context
logger = logging.getLogger(__name__)
class wrapper:
def __getattr__(self, name: str) -> Any:
return getattr(_context.get(), name)
def __setattr__(self, name: str, value: Any) -> None:
setattr(_context.get(), name, value)
_context: ContextVar[Context] = ContextVar("var", default=Context())
context: Context = cast(Context, wrapper())
def init_context(
cache_dir: Union[Path, str, None] = None,
) -> None:
logger.debug("start context %s", cache_dir)
_context.get().open_cache(cache_dir)
def run(
func: Awaitable[Any],
) -> Any:
async def _run() -> Any:
try:
ret = await func
finally:
await context.close()
return ret
ret = asyncio.run(_run())
logger.debug("finished")
return ret
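# Example usage (sketch -- ``fetch_page`` stands in for any user-supplied
# coroutine and is not part of this package):
#
#   import scrapeacademy
#   scrapeacademy.init_context()
#   result = scrapeacademy.run(fetch_page("https://example.org"))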
import os
import random
import time
import json
import datetime
from random import randint
from pyfiglet import figlet_format
from flask import Flask, g, session, redirect, request, url_for, jsonify
from requests_oauthlib import OAuth2Session
OAUTH2_CLIENT_ID = '456608429843283998' #os.environ['OAUTH2_CLIENT_ID']
OAUTH2_CLIENT_SECRET = '03D26-iZchBxx5ncJxN6fjxJkP6k0x-g' #os.environ['OAUTH2_CLIENT_SECRET']
OAUTH2_REDIRECT_URI = 'http://128.1932.254.226:5000/callback'
API_BASE_URL = os.environ.get('API_BASE_URL', 'https://discordapp.com/api')
AUTHORIZATION_BASE_URL = API_BASE_URL + '/oauth2/authorize'
TOKEN_URL = API_BASE_URL + '/oauth2/token'
app = Flask(__name__)
app.debug = True
app.config['SECRET_KEY'] = OAUTH2_CLIENT_SECRET
quotes = [
'"The death of one man is a tragedy. The death of millions is a statistic."',
'"It is enough that the people know there was an election. The people who cast the votes decide nothing. The people who count the votes decide everything."',
    '"Death is the solution to all problems. No man - no problem."',
'"The only real power comes out of a long rifle."',
'"Education is a weapon, whose effect depends on who holds it in his hands and at whom it is aimed."',
'"In the Soviet army it takes more courage to retreat than advance."',
'"Gaiety is the most outstanding feature of the Soviet Union."',
'"I trust no one, not even myself."',
'"The Pope! How many divisions has _he_ got?"',
'"BENIS"'
]
expandList = [
'cunt',
'fuck',
'goddamn',
'bitch',
'whore',
'slut',
'fortnight',
'fortnut',
'fortnite',
'mixed reality',
'microsoft',
'emac',
    'ruby',
'webscale',
'web scale',
'windows',
'dick'
]
import discord
TOKEN = 'NDU2NjA4NDI5ODQzMjgzOTk4.DgNcRw.EviOEVoX7Lwtb1oHcOp3RGzg5L8'
# 0 = none, 1 = lobby phase, 2 = in progress
gameStatus = 0
host = None
players = []
spies = []
regulars = []
missionsAttempted = 0
missionsFailed = 0
missionsPassed = 0
leader = None
team = []
votes = []
agents = []
teamStatus = 0
rejects = 0
#: Number of spies for 5-10 players, indexed by ``len(players) - 5``
spiesPerPlayers = [2, 2, 3, 3, 3, 4]
#: Team size per mission (rows: missions 1-5; columns: 5, 6, 7, and 8+ players)
playersPerMission = [
    [2, 2, 2, 3],
    [3, 3, 3, 4],
    [2, 4, 3, 4],
    [3, 3, 4, 5],
    [3, 4, 4, 5]
]
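# Sketch helper (assumption -- nothing below calls it): how the two tables
# above are indexed.  spiesPerPlayers is indexed by len(players) - 5, and
# playersPerMission by mission number and the same player-count column,
# capped at 3 for games with 8 or more players (see the pick handler).
def requiredTeamSize(numPlayers, missionIndex, table=None):
    """Return the team size for the given mission and player count."""
    if table is None:
        table = playersPerMission
    column = min(numPlayers - 5, 3)
    return table[missionIndex][column]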
client = discord.Client()
async def say(text, channel):
global client
await client.send_message(channel, text)
def gameEnd():
    global gameStatus
    global players
global host
global spies
global regulars
global missionsAttempted
global missionsFailed
global missionsPassed
global leader
global team
global votes
global teamStatus
global rejects
gameStatus = 0 # 0 = none, 1 = lobby phase, 2 = in progress
host = None
players = []
spies = []
regulars = []
missionsAttempted = 0
missionsFailed = 0
missionsPassed = 0
leader = None
team = []
votes = []
teamStatus = 0
rejects = 0
async def gameBegin():
global gameStatus
global host
global spies
global regulars
global missionsAttempted
global missionsFailed
global missionsPassed
global leader
global team
global votes
global teamStatus
global rejects
    gameStatus = 2  # the game is now in progress
    randLeader = randint(0, len(players) - 1)
leader = players[randLeader]
for p in players:
regulars.append(p)
totalSpies = spiesPerPlayers[len(players)-5]
for x in range(0, totalSpies):
randSpy = randint(0,len(regulars)-1)
print(str(randSpy))
spy = regulars[randSpy]
print(str(spy))
regulars.remove(spy)
spies.append(spy)
    spyMessage = ', '.join(str(e) for e in spies)
for p in players:
if p in spies:
await say('You are a spy! Your partner(s) in crime are: ' + spyMessage, p)
#client.send_message(p, 'You are a spy! Your partner(s) in crime are: ' + spyMessage)
else:
await say('You are part of the resistance!', p)
#client.send_message(p, 'You are part of the resistance!')
def checkVotes():
    # Tally the current voting round.  The original function body was empty,
    # so this minimal bookkeeping is a best-guess sketch; results are not
    # announced to the players yet.
    global missionsAttempted, missionsFailed, missionsPassed
    global votes, teamStatus, rejects, agents
    if teamStatus == 0:
        if -1 in votes:  # team proposal: wait until every player has voted
            return
        if votes.count(1) > votes.count(0):
            teamStatus = 1  # approved; agents now vote pass/fail
        else:
            rejects += 1
            agents = []
    else:
        agent_votes = [votes[players.index(a)] for a in agents]
        if -1 in agent_votes:  # mission: wait until every agent has voted
            return
        missionsAttempted += 1
        if 0 in agent_votes:
            missionsFailed += 1
        else:
            missionsPassed += 1
        teamStatus = 0
        agents = []
    votes = [-1] * len(players)
@client.event
async def on_message(message):
global gameStatus
global host
global spies
global regulars
global missionsAttempted
global missionsFailed
global missionsPassed
global leader
global team
global votes
global teamStatus
global rejects
# we do not want the bot to reply to itself
if message.author == client.user:
return
print(message.content)
#text = message.content.split(' ')[1]
text = ' '.join(message.content.split()[1:])
#text2 = ' '.join(message.content.split()[2:])
message.content = message.content.lower()
    with open('log.txt', 'a') as f:
        curTime = datetime.datetime.utcnow().isoformat() + '|' + '{:30.30}'.format(message.author.name) + '|' + message.author.id + '|' + message.content + '\n'
        f.write(curTime)
if(message.channel.id == '360125095043268608'):
return
if(message.channel.id == '364919001434030101'):
return
for word in expandList:
if word in message.content:
msg = 'Expand your vocabulary.'
await client.send_message(message.channel, msg)
break
if message.content.startswith('_help'):
msg = 'Commands: _help, _guidance, _big, _pig, _soviet, _avatar'
await client.send_message(message.channel, msg)
    if message.content.startswith('_roll'):
        print('text: ' + text)
        try:
            sides = int(text)
        except ValueError:
            sides = 0  # no (or non-numeric) argument given
        if sides > 0:
num = randint(1, sides)
comment = ''
if(num == 1 and sides != 1):
comment = 'The universe has deathed you, have fun kiddo.'
elif(num == sides):
comment = 'Hot diggety dice-eyes, nice roll partner!'
elif(num > (sides/2)):
comment = 'Not bad.'
elif(num <= (sides/2)):
comment = 'Could be better.'
msg = str(sides) + ' sided die result: `' + str(num) + '`\n' + comment
await client.send_message(message.channel, msg)
if message.content.startswith('_resist host'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
print(message.channel.type.id)
return
if gameStatus != 0:
await client.send_message(message.channel, 'A game is currently in progress.')
return
host = message.author
players.append(message.author)
votes.append(-1)
gameStatus = 1
await client.send_message(message.channel, 'A resistance lobby is now being hosted.')
if message.content.startswith('_resist start'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
return
if gameStatus != 1:
await client.send_message(message.channel, 'You are not hosting a lobby')
return
if message.author != host:
await client.send_message(message.channel, 'You are not the host')
return
if len(players) < 5:
await client.send_message(message.channel, 'You need at least 5 players to play.')
return
await client.send_message(message.channel, 'The game has begun')
await gameBegin()
if message.content.startswith('_resist close'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
return
if message.author != host:
await client.send_message(message.channel, 'You are not the host.')
return
gameEnd()
if message.content.startswith('_resist pick'):
if message.channel.type != discord.ChannelType.private:
await client.send_message(message.channel, 'This must be in a DM.')
return
if message.author not in players:
await say('You are not playing', message.channel)
return
if message.author != leader:
await say('You are not the mission leader.', message.channel)
return
if len(agents) != 0:
await say('Agents have already been assigned.', message.channel)
return
index = len(players) - 5
if index > 2:
index = 3;
if len(message.mentions) != playersPerMission[missionsAttempted][index]:
            await say('You must assign exactly ' + str(playersPerMission[missionsAttempted][index]) + ' agents.', message.channel)
return
for agent in message.mentions:
agents.append(agent)
if message.content.startswith('_resist approve'):
if message.channel.type != discord.ChannelType.private:
await client.send_message(message.channel, 'This must be in a DM.')
return
if message.author not in players:
await say('You are not playing.', message.channel)
return
if teamStatus != 0:
await say('Team must be forming.', message.channel)
return
index = players.index(message.author)
votes[index] = 1
checkVotes()
if message.content.startswith('_resist reject'):
if message.channel.type != discord.ChannelType.private:
await client.send_message(message.channel, 'This must be in a DM.')
return
if message.author not in players:
await say('You are not playing.', message.channel)
return
if teamStatus != 0:
await say('Team must be forming.', message.channel)
return
index = players.index(message.author)
votes[index] = 0
checkVotes()
if message.content.startswith('_resist pass'):
        if message.channel.type != discord.ChannelType.private:
await client.send_message(message.channel, 'This must be in a DM.')
return
if message.author not in players:
await say('You are not playing.', message.channel)
return
if teamStatus != 1:
await say('Team must be approved.', message.channel)
return
        if message.author not in agents:
            await say('You are not on the team.', message.channel)
            return
index = players.index(message.author)
votes[index] = 1
checkVotes()
if message.content.startswith('_resist fail'):
if message.channel.type != discord.ChannelType.private:
await client.send_message(message.channel, 'This must be in a DM.')
return
if message.author not in players:
await say('You are not playing.', message.channel)
return
if teamStatus != 1:
await say('Team must be approved.', message.channel)
return
        if message.author not in agents:
            await say('You are not on the team.', message.channel)
            return
index = players.index(message.author)
votes[index] = 0
checkVotes()
if message.content.startswith('_resist join'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
return
        if gameStatus != 1:
            await client.send_message(message.channel, 'There is no open lobby to join.')
            return
        if len(players) >= 10:
            await client.send_message(message.channel, 'Game full.')
return
if message.author in players:
await client.send_message(message.channel, 'You are already in this game.')
return
players.append(message.author)
votes.append(-1)
if message.content.startswith('_resist leave'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
return
if message.author not in players:
await client.send_message(message.channel, 'You cannot leave a game you aren\'t in.')
return
if message.author == host:
gameEnd()
        elif gameStatus == 1:
            index = players.index(message.author)
            players.pop(index)
            votes.pop(index)
elif gameStatus == 2:
gameEnd()
if message.content.startswith('_resist players'):
if message.channel.type != discord.ChannelType.text:
await client.send_message(message.channel, 'This must be in a guild channel.')
return
        msg = ', '.join(str(e) for e in players)
await client.send_message(message.channel, msg)
if message.content.startswith('_guidance'):
#msg = 'Hello {0.author.mention}'.format(message)
msg = random.choice(quotes)
await client.send_message(message.channel, msg)
'''
if message.content.startswith('_death'):
target = message.mentions
if target:
if target[0]:
targ = target[0]
currentTime = datetime.datetime.utcnow()
delta
with open('death.json', 'r') as read_file:
data = json.load(read_file)
entry = data[message.author.id]
allow = True
if entry:
death = entry['death']
life = entry['life']
if death:
delta = (currentTime - death).days
else:
delta = (currentTime - life).days
if(delta < 1):
msg = 'You have to wait a whole day'
else:
msg = '{0.author.mention}'.format(message) + ' has ***DEATHED*** ' + targ.mention
data = {}
with open('death.json', 'w') as write_file:
json.dump(data, write_file)
await client.send_message(message.channel, msg)
'''
if message.content.startswith('_big'):
msg = '```' + figlet_format(text, width=160) + '```'
await client.send_message(message.channel, msg)
if message.content.startswith('_soviet'):
#msg = '`This message is from the capitalist pigs at OSU:`\n\n' + text
msg = '`Capitalist pig` <' + message.author.name + '> ' + text
await client.send_message(client.get_channel('456665509555994624'), msg)
if message.content.startswith('_pig'):
#msg = '`This message is from the soviet scum in the Clubhaus:`\n\n' + text
msg = '`Soviet scum` <' + message.author.name + '> ' + text
await client.send_message(client.get_channel('456716843407638529'), msg)
if message.content.startswith('_avatar'):
msg = message.mentions[0].avatar_url
await client.send_message(message.channel, msg)
if client.user in message.mentions:
msg = 'Da'
await client.send_message(message.channel, msg)
@client.event
async def on_ready():
print('vvvv')
print('Logged in as')
print(client.user.name)
print(client.user.id)
await client.change_presence(game=discord.Game(name='with capitalist pigs'))
print('^^^^')
client.run(TOKEN)
if __name__ == '__main__':
app.run()
import unittest
import pysal
from pysal.core.IOHandlers.gwt import GwtIO
import tempfile
import os
import warnings


class test_GwtIO(unittest.TestCase):

    def setUp(self):
        self.test_file = test_file = pysal.examples.get_path('juvenile.gwt')
        self.obj = GwtIO(test_file, 'r')

    def test_close(self):
        f = self.obj
        f.close()
        self.assertRaises(ValueError, f.read)

    def test_read(self):
        w = self.obj.read()
        self.assertEqual(168, w.n)
        self.assertEqual(16.678571428571427, w.mean_neighbors)
        w.transform = 'B'
        self.assertEqual([1.0], list(w[1].values()))

    def test_seek(self):
        self.test_read()
        self.assertRaises(StopIteration, self.obj.read)
        self.obj.seek(0)
        self.test_read()

    # Commented out by CRS, GWT 'w' mode removed until we can find a good
    # solution for retaining distances; see issue #153.
    # Added back by CRS.
    def test_write(self):
        w = self.obj.read()
        f = tempfile.NamedTemporaryFile(
            suffix='.gwt', dir=pysal.examples.get_path(''))
        fname = f.name
        f.close()
        o = pysal.open(fname, 'w')
        # Copy the shapefile and ID variable names from the old gwt.
        # This is only available after the read() method has been called.
        # o.shpName = self.obj.shpName
        # o.varName = self.obj.varName
        o.write(w)
        o.close()
        wnew = pysal.open(fname, 'r').read()
        self.assertEqual(wnew.pct_nonzero, w.pct_nonzero)
        os.remove(fname)


if __name__ == '__main__':
    unittest.main()
# File: scrnatools/qc/_filter_cells.py
# Repo: j-germino/sc-rna-tools-git @ 4e9a4fce40f6a303a1869e93a6e52e8db663bd06
# License: BSD-3-Clause

"""
Filters cells based on gene number, total counts, and % mitochondrial

From sc-rna-tools package

Created on Mon Jan 10 15:57:46 2022

@author: joe germino (joe.germino@ucsf.edu)
"""
# external imports
from anndata import AnnData
from typing import Tuple, Optional

# sc-rna-tools package imports
from .._configs import configs
from .._utils import debug
from ..plotting import qc_plotting

logger = configs.create_logger(__name__.split('_', 1)[1])


# -------------------------------------------------------function----------------------------------------------------- #
@debug(logger, configs)
def filter_cells(
        adata: AnnData,
        gene_thresholds: Tuple[int, int],
        count_thresholds: Tuple[int, int],
        mt_threshold: int = 10,
        save_path: Optional[str] = None,
        file_type: str = "png",
        *args, **kwargs
) -> AnnData:
    """Filters cells based on gene number, total counts, and % mitochondrial

    Parameters
    ----------
    adata
        The AnnData with the data to filter
    gene_thresholds
        A Tuple of thresholds for the number of genes per cell, with
        'gene_thresholds[0]' being the lower bound and 'gene_thresholds[1]'
        being the upper bound (both exclusive).
    count_thresholds
        A Tuple of thresholds for the number of total counts per cell, with
        'count_thresholds[0]' being the lower bound and 'count_thresholds[1]'
        being the upper bound (both exclusive).
    mt_threshold
        The maximum percent mitochondrial reads per cell. Default 10%.
    save_path
        The path and file name prefix to save QC plots to ('_qc_plots' or
        '_filtered_qc_plots' and the file type provided with 'file_type'
        will be appended)
    file_type
        The file type for the figures to be saved
    args
        Arguments to pass on to qc_plotting function calls
    kwargs
        Keyword arguments to pass on to qc_plotting function calls

    Returns
    -------
    An AnnData object with cells that don't pass the thresholds filtered out
    """
    qc_plotting(
        adata,
        counts_thresholds=count_thresholds,
        genes_thresholds=gene_thresholds,
        save_path=f"{save_path}_qc_plots.{file_type}",
        *args, **kwargs
    )
    logger.info(f"Number of cells before QC filtering: {len(adata.obs)}")
    filtered_adata = adata[adata.obs.pct_counts_mt < mt_threshold].copy()
    filtered_adata = filtered_adata[filtered_adata.obs.total_counts < count_thresholds[1]]
    filtered_adata = filtered_adata[filtered_adata.obs.total_counts > count_thresholds[0]]
    filtered_adata = filtered_adata[filtered_adata.obs.n_genes_by_counts < gene_thresholds[1]]
    filtered_adata = filtered_adata[filtered_adata.obs.n_genes_by_counts > gene_thresholds[0]].copy()
    logger.info(f"Number of cells after QC filtering: {len(filtered_adata.obs)}")
    qc_plotting(
        filtered_adata,
        show_thresholds=False,
        save_path=f"{save_path}_filtered_qc_plots.{file_type}",
        *args, **kwargs
    )
    return filtered_adata
# File: SOGcommand/infile_processor.py
# Repo: SalishSeaCast/SOG @ e8c914ea743076829d848e7d90df9b5e383e8b53
# License: Apache-2.0

# -*- coding: utf-8 -*-
"""SOG infile processor.
Do various operations on input files for the the SOG bio-physical
model of deep estuaries. Most notably, convert new YAML infiles into
the old Fortran-style infiles that SOG reads.
This module provides services to the SOG command processor.
:Author: Doug Latornell <djl@douglatornell.ca>
:License: Apache License, Version 2.0
Copyright 2010-2014 Doug Latornell and The University of British Columbia
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import (
absolute_import,
division,
print_function,
unicode_literals,
)
import pprint
import sys
from tempfile import NamedTemporaryFile
import colander
import yaml
from . import SOG_infile
from .SOG_infile_schema import (
SOG_Infile,
SOG_KEYS,
SOG_EXTRA_KEYS,
SOG_AVG_HIST_FORCING_KEYS,
)
from .SOG_YAML_schema import (
YAML_Infile,
yaml_to_infile,
)
__all__ = ['create_infile', 'read_infile']
def create_infile(yaml_infile, edit_files):
"""Create a SOG Fortran-style infile for SOG to read from
`yaml_infile`.
:arg yaml_infile: Path/name of a SOG YAML infile.
:type yaml_infile: str
:arg edit_files: Paths/names of YAML infile snippets to be merged
into `yaml_infile`.
:type edit_files: list
:returns infile_name: Path/name of the SOG Fortran-style temporary
infile that is created.
:rtype: str
"""
data = _read_yaml_infile(yaml_infile)
YAML = YAML_Infile()
yaml_struct = _deserialize_yaml(data, YAML, yaml_infile)
for edit_file in edit_files:
edit_data = _read_yaml_infile(edit_file)
edit_struct = _deserialize_yaml(
edit_data, YAML, edit_file, edit_mode=True)
_merge_yaml_structs(edit_struct, yaml_struct, YAML)
infile_struct = yaml_to_infile(YAML, yaml_struct)
SOG = SOG_Infile()
data = SOG.serialize(infile_struct)
with NamedTemporaryFile(mode='wt', suffix='.infile', delete=False) as f:
SOG_infile.dump(
data, SOG_KEYS, SOG_EXTRA_KEYS, SOG_AVG_HIST_FORCING_KEYS, f)
infile_name = f.name
return infile_name
def read_infile(yaml_infile, edit_files, key):
"""Return value for specified infile key.
:arg yaml_infile: Path/name of a SOG YAML infile.
:type yaml_infile: str
:arg edit_files: Paths/names of YAML infile snippets to be merged
into `yaml_infile`.
:type edit_files: list
:arg key: Infile key to return value for.
Key must be "fully qualified";
i.e. a dotted name path through the nested YAML mappings,
like :kbd:`initial_conditions.init_datetime`.
:type key: str
:returns value: Infile value associated with key
:rtype: str
"""
data = _read_yaml_infile(yaml_infile)
YAML = YAML_Infile()
yaml_struct = _deserialize_yaml(data, YAML, yaml_infile, edit_mode=True)
for edit_file in edit_files:
edit_data = _read_yaml_infile(edit_file)
edit_struct = _deserialize_yaml(
edit_data, YAML, edit_file, edit_mode=True)
_merge_yaml_structs(edit_struct, yaml_struct, YAML)
try:
value = YAML.get_value(yaml_struct, key)['value']
except KeyError:
print('KeyError: {0}'.format(key), file=sys.stderr)
sys.exit(2)
return value
def _read_yaml_infile(yaml_infile):
"""Read `yaml_infile` and return the resulting Python dict.
:arg yaml_infile: Path/name of a SOG YAML infile.
:type yaml_infile: str
:returns data: Content of `yaml_infile` as a Python dict.
:rtype: dict
"""
with open(yaml_infile, 'rt') as f:
try:
data = yaml.safe_load(f)
except yaml.scanner.ScannerError:
print('Unable to parse {0}: Are you sure that it is YAML?'
.format(yaml_infile), file=sys.stderr)
sys.exit(2)
return data
def _deserialize_yaml(data, yaml_schema, yaml_infile, edit_mode=False):
"""Deserialize `data` according to `yaml_schema` and return the
resulting YAML schema data structure.
:arg data: Content of `yaml_infile` as a Python dict.
:type data: dict
:arg yaml_schema: SOG YAML infile schema instance
:type yaml_schema: :class:`YAML_Infile` instance
:arg yaml_infile: Path/name of a SOG YAML infile.
:type yaml_infile: str
:arg edit_mode: Turn edit mode on/off for schema binding;
defaults to False.
True means that elements can be missing from schema block
mappings;
used to deserialize edit files.
False means that missing elements aren't allowed;
used to deserialize the base infile.
:type edit_mode: boolean
:returns yaml_struct: SOG YAML infile data structure
:rtype: nested dicts
"""
yaml_schema = yaml_schema.bind(allow_missing=edit_mode)
try:
yaml_struct = yaml_schema.deserialize(data)
except colander.Invalid as e:
print('Invalid SOG YAML in {0}. '
'The following parameters are missing or misspelled:'
.format(yaml_infile), file=sys.stderr)
pprint.pprint(e.asdict(), sys.stderr)
sys.exit(2)
return yaml_struct
def _merge_yaml_structs(edit_struct, yaml_struct, schema):
"""Merge non-None values in `edit_struct` into `yaml_struct`.
:arg edit_struct: Edit file data structure to be merged into `yaml_struct`.
:type edit_struct: dict
:arg yaml_struct: SOG YAML infile data structure to receive merge from
`edit_struct`.
:type yaml_struct: dict
:arg schema: SOG YAML infile schema instance.
:type schema: :class:`YAML_Infile` instance
"""
for key in schema.flatten(yaml_struct):
try:
value = schema.get_value(edit_struct, key)
if value is not None:
schema.set_value(yaml_struct, key, value)
except TypeError:
# Ignore empty block mappings
pass
| 32.8 | 79 | 0.676829 | 907 | 6,560 | 4.702315 | 0.259096 | 0.107855 | 0.02626 | 0.015944 | 0.370692 | 0.325205 | 0.295662 | 0.240328 | 0.240328 | 0.223447 | 0 | 0.004261 | 0.24878 | 6,560 | 199 | 80 | 32.964824 | 0.861201 | 0.504878 | 0 | 0.277108 | 0 | 0 | 0.059786 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060241 | false | 0.012048 | 0.120482 | 0 | 0.228916 | 0.072289 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
# File: tensorflow/python/tpu/tpu_embedding_v2_test_lib.py
# Repo: katherinekowalski/tensorflow @ 26e92c016a0a1fb76c3651070b859a97e7623256
# License: Apache-2.0

# Copyright 2020 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Library module for TPU Embedding mid level API test."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
from tensorflow.python.ops import init_ops_v2
from tensorflow.python.platform import test
from tensorflow.python.tpu import tpu_embedding_v2_utils
class EmbeddingTestBase(test.TestCase):
  """Base embedding test class for use on CPU and TPU."""

  def _create_initial_data(self):
    """Create the common test data used by both TPU and CPU."""
    self.embedding_values = np.array(list(range(32)), dtype=np.float64)
    self.initializer = init_ops_v2.Constant(self.embedding_values)
    # Embedding for video initialized to
    #   0 1 2 3
    #   4 5 6 7
    #   ...
    self.table_video = tpu_embedding_v2_utils.TableConfig(
        vocabulary_size=8,
        dim=4,
        initializer=self.initializer,
        combiner='sum',
        name='video')
    # Embedding for user initialized to
    #   0 1
    #   2 3
    #   4 5
    #   6 7
    #   ...
    self.table_user = tpu_embedding_v2_utils.TableConfig(
        vocabulary_size=16,
        dim=2,
        initializer=self.initializer,
        combiner='mean',
        name='user')
    self.feature_config = (
        tpu_embedding_v2_utils.FeatureConfig(
            table=self.table_video, name='watched'),
        tpu_embedding_v2_utils.FeatureConfig(
            table=self.table_video, name='favorited'),
        tpu_embedding_v2_utils.FeatureConfig(
            table=self.table_user, name='friends'))
    self.batch_size = 2
    self.data_batch_size = 4

    # One (global) batch of inputs
    # sparse tensor for watched:
    # row 0: 0
    # row 1: 0, 1
    # row 2: 0, 1
    # row 3: 1
    self.feature_watched_indices = [[0, 0], [1, 0], [1, 1],
                                    [2, 0], [2, 1], [3, 0]]
    self.feature_watched_values = [0, 0, 1, 0, 1, 1]
    self.feature_watched_row_lengths = [1, 2, 2, 1]
    # sparse tensor for favorited:
    # row 0: 0, 1
    # row 1: 1
    # row 2: 0
    # row 3: 0, 1
    self.feature_favorited_indices = [[0, 0], [0, 1], [1, 0],
                                      [2, 0], [3, 0], [3, 1]]
    self.feature_favorited_values = [0, 1, 1, 0, 0, 1]
    self.feature_favorited_row_lengths = [2, 1, 1, 2]
    # sparse tensor for friends:
    # row 0: 3
    # row 1: 0, 1, 2
    # row 2: 3
    # row 3: 0, 1, 2
    self.feature_friends_indices = [[0, 0], [1, 0], [1, 1], [1, 2],
                                    [2, 0], [3, 0], [3, 1], [3, 2]]
    self.feature_friends_values = [3, 0, 1, 2, 3, 0, 1, 2]
    self.feature_friends_row_lengths = [1, 3, 1, 3]
# File: code/tmp_rtrip/test/test_buffer.py
# Repo: emilyemorehouse/ast-and-me @ 3f58117512e125e1ecbe3c72f2f0d26adb80b7b3
# License: MIT

import contextlib
import unittest
from test import support
from itertools import permutations, product
from random import randrange, sample, choice
import warnings
import sys, array, io
from decimal import Decimal
from fractions import Fraction
try:
    from _testbuffer import *
except ImportError:
    ndarray = None

try:
    import struct
except ImportError:
    struct = None

try:
    import ctypes
except ImportError:
    ctypes = None

try:
    with warnings.catch_warnings():
        from numpy import ndarray as numpy_array
except ImportError:
    numpy_array = None

SHORT_TEST = True
NATIVE = {'?': 0, 'c': 0, 'b': 0, 'B': 0, 'h': 0, 'H': 0, 'i': 0, 'I': 0,
          'l': 0, 'L': 0, 'n': 0, 'N': 0, 'f': 0, 'd': 0, 'P': 0}

if numpy_array:
    del NATIVE['n']
    del NATIVE['N']

if struct:
    try:
        struct.pack('Q', 2 ** 64 - 1)
        NATIVE['q'] = 0
        NATIVE['Q'] = 0
    except struct.error:
        pass

STANDARD = {'?': (0, 2), 'c': (0, 1 << 8),
            'b': (-(1 << 7), 1 << 7), 'B': (0, 1 << 8),
            'h': (-(1 << 15), 1 << 15), 'H': (0, 1 << 16),
            'i': (-(1 << 31), 1 << 31), 'I': (0, 1 << 32),
            'l': (-(1 << 31), 1 << 31), 'L': (0, 1 << 32),
            'q': (-(1 << 63), 1 << 63), 'Q': (0, 1 << 64),
            'f': (-(1 << 63), 1 << 63), 'd': (-(1 << 1023), 1 << 1023)}

def native_type_range(fmt):
    """Return range of a native type."""
    if fmt == 'c':
        lh = 0, 256
    elif fmt == '?':
        lh = 0, 2
    elif fmt == 'f':
        lh = -(1 << 63), 1 << 63
    elif fmt == 'd':
        lh = -(1 << 1023), 1 << 1023
    else:
        for exp in (128, 127, 64, 63, 32, 31, 16, 15, 8, 7):
            try:
                struct.pack(fmt, (1 << exp) - 1)
                break
            except struct.error:
                pass
        lh = (-(1 << exp), 1 << exp) if exp & 1 else (0, 1 << exp)
    return lh
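# A hand-worked example of the range rule used by native_type_range() above
# (an illustrative sanity check, not part of the original test suite):

```python
# A signed type that packs into exp + 1 bits spans [-(1 << exp), 1 << exp);
# e.g. a signed 8-bit 'b' probes successfully at exp == 7.
exp = 7
lo, hi = -(1 << exp), 1 << exp
assert (lo, hi) == (-128, 128)
```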
fmtdict = {'': NATIVE, '@': NATIVE, '<': STANDARD, '>': STANDARD,
           '=': STANDARD, '!': STANDARD}

if struct:
    for fmt in fmtdict['@']:
        fmtdict['@'][fmt] = native_type_range(fmt)

MEMORYVIEW = NATIVE.copy()
ARRAY = NATIVE.copy()
for k in NATIVE:
    if not k in 'bBhHiIlLfd':
        del ARRAY[k]

BYTEFMT = NATIVE.copy()
for k in NATIVE:
    if not k in 'Bbc':
        del BYTEFMT[k]

fmtdict['m'] = MEMORYVIEW
fmtdict['@m'] = MEMORYVIEW
fmtdict['a'] = ARRAY
fmtdict['b'] = BYTEFMT
fmtdict['@b'] = BYTEFMT

MODE = 0
MULT = 1
cap = {'ndarray': (['', '@', '<', '>', '=', '!'], ['', '1', '2', '3']),
       'array': (['a'], ['']),
       'numpy': ([''], ['']),
       'memoryview': (['@m', 'm'], ['']),
       'bytefmt': (['@b', 'b'], [''])}

def randrange_fmt(mode, char, obj):
    """Return random item for a type specified by a mode and a single
       format character."""
    x = randrange(*fmtdict[mode][char])
    if char == 'c':
        x = bytes([x])
        if obj == 'numpy' and x == b'\x00':
            x = b'\x01'
    if char == '?':
        x = bool(x)
    if char == 'f' or char == 'd':
        x = struct.pack(char, x)
        x = struct.unpack(char, x)[0]
    return x

def gen_item(fmt, obj):
    """Return single random item."""
    mode, chars = fmt.split('#')
    x = []
    for c in chars:
        x.append(randrange_fmt(mode, c, obj))
    return x[0] if len(x) == 1 else tuple(x)
def gen_items(n, fmt, obj):
    """Return a list of random items (or a scalar)."""
    if n == 0:
        return gen_item(fmt, obj)
    lst = [0] * n
    for i in range(n):
        lst[i] = gen_item(fmt, obj)
    return lst

def struct_items(n, obj):
    mode = choice(cap[obj][MODE])
    xfmt = mode + '#'
    fmt = mode.strip('amb')
    nmemb = randrange(2, 10)
    for _ in range(nmemb):
        char = choice(tuple(fmtdict[mode]))
        multiplier = choice(cap[obj][MULT])
        xfmt += char * int(multiplier if multiplier else 1)
        fmt += multiplier + char
    items = gen_items(n, xfmt, obj)
    item = gen_item(xfmt, obj)
    return fmt, items, item
def randitems(n, obj='ndarray', mode=None, char=None):
    """Return random format, items, item."""
    if mode is None:
        mode = choice(cap[obj][MODE])
    if char is None:
        char = choice(tuple(fmtdict[mode]))
    multiplier = choice(cap[obj][MULT])
    fmt = mode + '#' + char * int(multiplier if multiplier else 1)
    items = gen_items(n, fmt, obj)
    item = gen_item(fmt, obj)
    fmt = mode.strip('amb') + multiplier + char
    return fmt, items, item

def iter_mode(n, obj='ndarray'):
    """Iterate through supported mode/char combinations."""
    for mode in cap[obj][MODE]:
        for char in fmtdict[mode]:
            yield randitems(n, obj, mode, char)

def iter_format(nitems, testobj='ndarray'):
    """Yield (format, items, item) for all possible modes and format
       characters plus one random compound format string."""
    for t in iter_mode(nitems, testobj):
        yield t
    if testobj != 'ndarray':
        return
    yield struct_items(nitems, testobj)
def is_byte_format(fmt):
    return 'c' in fmt or 'b' in fmt or 'B' in fmt

def is_memoryview_format(fmt):
    """format suitable for memoryview"""
    x = len(fmt)
    return (x == 1 or (x == 2 and fmt[0] == '@')) and fmt[x - 1] in MEMORYVIEW

NON_BYTE_FORMAT = [c for c in fmtdict['@'] if not is_byte_format(c)]

def atomp(lst):
    """Tuple items (representing structs) are regarded as atoms."""
    return not isinstance(lst, list)

def listp(lst):
    return isinstance(lst, list)
def prod(lst):
    """Product of list elements."""
    if len(lst) == 0:
        return 0
    x = lst[0]
    for v in lst[1:]:
        x *= v
    return x
def strides_from_shape(ndim, shape, itemsize, layout):
    """Calculate strides of a contiguous array. Layout is 'C' or
       'F' (Fortran)."""
    if ndim == 0:
        return ()
    if layout == 'C':
        strides = list(shape[1:]) + [itemsize]
        for i in range(ndim - 2, -1, -1):
            strides[i] *= strides[i + 1]
    else:
        strides = [itemsize] + list(shape[:-1])
        for i in range(1, ndim):
            strides[i] *= strides[i - 1]
    return strides
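# A hand-worked example of the stride arithmetic implemented by
# strides_from_shape() above (an illustrative sanity check, not part of
# the original test suite):

```python
# For shape (3, 4) with 8-byte items: the C layout advances one full row
# (4 * 8 bytes) per step of the first index and one item per step of the
# second, while the Fortran layout advances one item per row and one full
# column (3 * 8 bytes) per step of the second index.
shape, itemsize = (3, 4), 8
c_strides = [shape[1] * itemsize, itemsize]
f_strides = [itemsize, shape[0] * itemsize]
assert c_strides == [32, 8]
assert f_strides == [8, 24]
```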
def _ca(items, s):
    """Convert flat item list to the nested list representation of a
       multidimensional C array with shape 's'."""
    if atomp(items):
        return items
    if len(s) == 0:
        return items[0]
    lst = [0] * s[0]
    stride = len(items) // s[0] if s[0] else 0
    for i in range(s[0]):
        start = i * stride
        lst[i] = _ca(items[start:start + stride], s[1:])
    return lst

def _fa(items, s):
    """Convert flat item list to the nested list representation of a
       multidimensional Fortran array with shape 's'."""
    if atomp(items):
        return items
    if len(s) == 0:
        return items[0]
    lst = [0] * s[0]
    stride = s[0]
    for i in range(s[0]):
        lst[i] = _fa(items[i::stride], s[1:])
    return lst

def carray(items, shape):
    if listp(items) and not 0 in shape and prod(shape) != len(items):
        raise ValueError('prod(shape) != len(items)')
    return _ca(items, shape)

def farray(items, shape):
    if listp(items) and not 0 in shape and prod(shape) != len(items):
        raise ValueError('prod(shape) != len(items)')
    return _fa(items, shape)
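# A hand-worked illustration of the nestings produced by carray() and
# farray() above for shape (2, 3) (a sanity check, not part of the
# original test suite):

```python
# The same six flat items nest row-major (C: consecutive runs) or
# column-major (Fortran: strided picks).
items = [1, 2, 3, 4, 5, 6]
c_nested = [items[0:3], items[3:6]]
f_nested = [items[0::2], items[1::2]]
assert c_nested == [[1, 2, 3], [4, 5, 6]]
assert f_nested == [[1, 3, 5], [2, 4, 6]]
```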
def indices(shape):
    """Generate all possible tuples of indices."""
    iterables = [range(v) for v in shape]
    return product(*iterables)

def getindex(ndim, ind, strides):
    """Convert multi-dimensional index to the position in the flat list."""
    ret = 0
    for i in range(ndim):
        ret += strides[i] * ind[i]
    return ret
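# A hand-worked example of the inner product computed by getindex() above
# (an illustrative sanity check, not part of the original test suite):

```python
# With strides (3, 1) -- a C-contiguous 2-D array of row width 3 --
# the index (1, 2) lands at flat position 1*3 + 2*1 == 5.
strides, ind = (3, 1), (1, 2)
assert sum(s * i for s, i in zip(strides, ind)) == 5
```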
def transpose(src, shape):
    """Transpose flat item list that is regarded as a multi-dimensional
       matrix defined by shape: dest...[k][j][i] = src[i][j][k]... """
    if not shape:
        return src
    ndim = len(shape)
    sstrides = strides_from_shape(ndim, shape, 1, 'C')
    dstrides = strides_from_shape(ndim, shape[::-1], 1, 'C')
    dest = [0] * len(src)
    for ind in indices(shape):
        fr = getindex(ndim, ind, sstrides)
        to = getindex(ndim, ind[::-1], dstrides)
        dest[to] = src[fr]
    return dest
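# A hand-worked example of the index mapping that transpose() above
# performs (an illustrative sanity check, not part of the original
# test suite):

```python
# With shape (2, 3), flat index 3*i + j moves to 2*j + i, so a row-major
# 2x3 matrix interleaves its rows when transposed to 3x2.
src = [1, 2, 3, 4, 5, 6]
dest = [0] * 6
for i in range(2):
    for j in range(3):
        dest[2 * j + i] = src[3 * i + j]
assert dest == [1, 4, 2, 5, 3, 6]
```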
def _flatten(lst):
    """flatten list"""
    if lst == []:
        return lst
    if atomp(lst):
        return [lst]
    return _flatten(lst[0]) + _flatten(lst[1:])

def flatten(lst):
    """flatten list or return scalar"""
    if atomp(lst):
        return lst
    return _flatten(lst)

def slice_shape(lst, slices):
    """Get the shape of lst after slicing: slices is a list of slice
       objects."""
    if atomp(lst):
        return []
    return [len(lst[slices[0]])] + slice_shape(lst[0], slices[1:])

def multislice(lst, slices):
    """Multi-dimensional slicing: slices is a list of slice objects."""
    if atomp(lst):
        return lst
    return [multislice(sublst, slices[1:]) for sublst in lst[slices[0]]]
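# A hand-worked example of the multi-dimensional slicing performed by
# multislice() and predicted by slice_shape() above (an illustrative
# sanity check, not part of the original test suite):

```python
# Taking rows 0 and 2 and columns 1..2 of a 3x3 matrix leaves a 2x2
# result; slice_shape() would report the same [2, 2] shape.
m = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
sliced = [row[1:3] for row in m[0:3:2]]
assert sliced == [[2, 3], [8, 9]]
assert [len(m[0:3:2]), len(m[0][1:3])] == [2, 2]
```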
def m_assign(llst, rlst, lslices, rslices):
    """Multi-dimensional slice assignment: llst and rlst are the operands,
       lslices and rslices are lists of slice objects. llst and rlst must
       have the same structure.

       For a two-dimensional example, this is not implemented in Python:

         llst[0:3:2, 0:3:2] = rlst[1:3:1, 1:3:1]

       Instead we write:

         lslices = [slice(0,3,2), slice(0,3,2)]
         rslices = [slice(1,3,1), slice(1,3,1)]
         multislice_assign(llst, rlst, lslices, rslices)
    """
    if atomp(rlst):
        return rlst
    rlst = [m_assign(l, r, lslices[1:], rslices[1:])
            for l, r in zip(llst[lslices[0]], rlst[rslices[0]])]
    llst[lslices[0]] = rlst
    return llst

def cmp_structure(llst, rlst, lslices, rslices):
    """Compare the structure of llst[lslices] and rlst[rslices]."""
    lshape = slice_shape(llst, lslices)
    rshape = slice_shape(rlst, rslices)
    if len(lshape) != len(rshape):
        return -1
    for i in range(len(lshape)):
        if lshape[i] != rshape[i]:
            return -1
        if lshape[i] == 0:
            return 0
    return 0

def multislice_assign(llst, rlst, lslices, rslices):
    """Return llst after assigning: llst[lslices] = rlst[rslices]"""
    if cmp_structure(llst, rlst, lslices, rslices) < 0:
        raise ValueError('lvalue and rvalue have different structures')
    return m_assign(llst, rlst, lslices, rslices)
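# A hand-worked version of the two-dimensional example from the m_assign()
# docstring above, done with plain slice assignment (an illustrative
# sanity check, not part of the original test suite):

```python
# llst[0:3:2, 0:3:2] = rlst[1:3:1, 1:3:1] for two 3x3 operands: rows 1..2,
# columns 1..2 of rlst land in rows 0 and 2, columns 0 and 2 of llst.
llst = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
rlst = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
rows = [r[1:3] for r in rlst[1:3]]
for l, r in zip(llst[0:3:2], rows):
    l[0:3:2] = r
assert llst == [[5, 0, 6], [0, 0, 0], [8, 0, 9]]
```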
def verify_structure(memlen, itemsize, ndim, shape, strides, offset):
    """Verify that the parameters represent a valid array within
       the bounds of the allocated memory:
           char *mem: start of the physical memory block
           memlen: length of the physical memory block
           offset: (char *)buf - mem
    """
    if offset % itemsize:
        return False
    if offset < 0 or offset + itemsize > memlen:
        return False
    if any(v % itemsize for v in strides):
        return False

    if ndim <= 0:
        return ndim == 0 and not shape and not strides
    if 0 in shape:
        return True

    imin = sum(strides[j] * (shape[j] - 1) for j in range(ndim)
               if strides[j] <= 0)
    imax = sum(strides[j] * (shape[j] - 1) for j in range(ndim)
               if strides[j] > 0)

    return 0 <= offset + imin and offset + imax + itemsize <= memlen
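# A hand-worked example of the bounds check performed by verify_structure()
# above (an illustrative sanity check, not part of the original test
# suite):

```python
# Three 8-byte items packed contiguously fit exactly into 24 bytes at
# offset 0; shifting the start by one item pushes the last item past the
# end of the block.
memlen, itemsize, shape, strides = 24, 8, (3,), (8,)
imax = strides[0] * (shape[0] - 1)              # byte offset of last item
assert 0 + imax + itemsize <= memlen            # offset 0: in bounds
assert not (8 + imax + itemsize <= memlen)      # offset 8: out of bounds
```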
def get_item(lst, indices):
    for i in indices:
        lst = lst[i]
    return lst

def memory_index(indices, t):
    """Location of an item in the underlying memory."""
    memlen, itemsize, ndim, shape, strides, offset = t
    p = offset
    for i in range(ndim):
        p += strides[i] * indices[i]
    return p

def is_overlapping(t):
    """The structure 't' is overlapping if at least one memory location
       is visited twice while iterating through all possible tuples of
       indices."""
    memlen, itemsize, ndim, shape, strides, offset = t
    visited = 1 << memlen
    for ind in indices(shape):
        i = memory_index(ind, t)
        bit = 1 << i
        if visited & bit:
            return True
        visited |= bit
    return False
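# A hand-worked illustration of the overlap condition detected by
# is_overlapping() above (an illustrative sanity check, not part of the
# original test suite):

```python
# A zero stride maps both indices of a length-2 dimension to byte 0,
# so the same location is visited twice; an 8-byte stride gives each
# index its own location.
zero_stride_locs = {0 * i for i in range(2)}
full_stride_locs = {8 * i for i in range(2)}
assert len(zero_stride_locs) == 1   # same byte visited twice: overlapping
assert len(full_stride_locs) == 2   # distinct bytes: not overlapping
```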
def rand_structure(itemsize, valid, maxdim=5, maxshape=16, shape=()):
    """Return random structure:

           (memlen, itemsize, ndim, shape, strides, offset)

       If 'valid' is true, the returned structure is valid, otherwise
       invalid. If 'shape' is given, use that instead of creating a
       random shape.
    """
    if not shape:
        ndim = randrange(maxdim + 1)
        if ndim == 0:
            if valid:
                return itemsize, itemsize, ndim, (), (), 0
            else:
                nitems = randrange(1, 16 + 1)
                memlen = nitems * itemsize
                offset = -itemsize if randrange(2) == 0 else memlen
                return memlen, itemsize, ndim, (), (), offset

        minshape = 2
        n = randrange(100)
        if n >= 95 and valid:
            minshape = 0
        elif n >= 90:
            minshape = 1
        shape = [0] * ndim

        for i in range(ndim):
            shape[i] = randrange(minshape, maxshape + 1)
    else:
        ndim = len(shape)

    maxstride = 5
    n = randrange(100)
    zero_stride = True if n >= 95 and n & 1 else False

    strides = [0] * ndim
    strides[ndim - 1] = itemsize * randrange(-maxstride, maxstride + 1)
    if not zero_stride and strides[ndim - 1] == 0:
        strides[ndim - 1] = itemsize

    for i in range(ndim - 2, -1, -1):
        maxstride *= shape[i + 1] if shape[i + 1] else 1
        if zero_stride:
            strides[i] = itemsize * randrange(-maxstride, maxstride + 1)
        else:
            strides[i] = ((1, -1)[randrange(2)] *
                          itemsize * randrange(1, maxstride + 1))

    imin = imax = 0
    if not 0 in shape:
        imin = sum(strides[j] * (shape[j] - 1) for j in range(ndim)
                   if strides[j] <= 0)
        imax = sum(strides[j] * (shape[j] - 1) for j in range(ndim)
                   if strides[j] > 0)

    nitems = imax - imin
    if valid:
        offset = -imin * itemsize
        memlen = offset + (imax + 1) * itemsize
    else:
        memlen = (-imin + imax) * itemsize
        offset = -imin - itemsize if randrange(2) == 0 else memlen

    return memlen, itemsize, ndim, shape, strides, offset

def randslice_from_slicelen(slicelen, listlen):
    """Create a random slice of len slicelen that fits into listlen."""
    maxstart = listlen - slicelen
    start = randrange(maxstart + 1)
    maxstep = (listlen - start) // slicelen if slicelen else 1
    step = randrange(1, maxstep + 1)
    stop = start + slicelen * step
    s = slice(start, stop, step)
    _, _, _, control = slice_indices(s, listlen)
    if control != slicelen:
        raise RuntimeError
    return s

def randslice_from_shape(ndim, shape):
    """Create two sets of slices for an array x with shape 'shape'
       such that shapeof(x[lslices]) == shapeof(x[rslices])."""
    lslices = [0] * ndim
    rslices = [0] * ndim
    for n in range(ndim):
        l = shape[n]
        slicelen = randrange(1, l + 1) if l > 0 else 0
        lslices[n] = randslice_from_slicelen(slicelen, l)
        rslices[n] = randslice_from_slicelen(slicelen, l)
    return tuple(lslices), tuple(rslices)

def rand_aligned_slices(maxdim=5, maxshape=16):
    """Create (lshape, rshape, tuple(lslices), tuple(rslices)) such that
       shapeof(x[lslices]) == shapeof(y[rslices]), where x is an array
       with shape 'lshape' and y is an array with shape 'rshape'."""
    ndim = randrange(1, maxdim + 1)
    minshape = 2
    n = randrange(100)
    if n >= 95:
        minshape = 0
    elif n >= 90:
        minshape = 1

    all_random = True if randrange(100) >= 80 else False
    lshape = [0] * ndim
    rshape = [0] * ndim
    lslices = [0] * ndim
    rslices = [0] * ndim

    for n in range(ndim):
        small = randrange(minshape, maxshape + 1)
        big = randrange(minshape, maxshape + 1)
        if big < small:
            big, small = small, big

        if all_random:
            start = randrange(-small, small + 1)
            stop = randrange(-small, small + 1)
            step = (1, -1)[randrange(2)] * randrange(1, small + 2)
            s_small = slice(start, stop, step)
            _, _, _, slicelen = slice_indices(s_small, small)
        else:
            slicelen = randrange(1, small + 1) if small > 0 else 0
            s_small = randslice_from_slicelen(slicelen, small)

        s_big = randslice_from_slicelen(slicelen, big)

        if randrange(2) == 0:
            rshape[n], lshape[n] = big, small
            rslices[n], lslices[n] = s_big, s_small
        else:
            rshape[n], lshape[n] = small, big
            rslices[n], lslices[n] = s_small, s_big

    return lshape, rshape, tuple(lslices), tuple(rslices)
def randitems_from_structure(fmt, t):
"""Return a list of random items for structure 't' with format
'fmtchar'."""
memlen, itemsize, _, _, _, _ = t
return gen_items(memlen // itemsize, '#' + fmt, 'numpy')
def ndarray_from_structure(items, fmt, t, flags=0):
"""Return ndarray from the tuple returned by rand_structure()"""
memlen, itemsize, ndim, shape, strides, offset = t
return ndarray(items, shape=shape, strides=strides, format=fmt, offset=
offset, flags=ND_WRITABLE | flags)
def numpy_array_from_structure(items, fmt, t):
"""Return numpy_array from the tuple returned by rand_structure()"""
memlen, itemsize, ndim, shape, strides, offset = t
buf = bytearray(memlen)
for j, v in enumerate(items):
struct.pack_into(fmt, buf, j * itemsize, v)
return numpy_array(buffer=buf, shape=shape, strides=strides, dtype=fmt,
offset=offset)

def cast_items(exporter, fmt, itemsize, shape=None):
    """Interpret the raw memory of 'exporter' as a list of items with
       size 'itemsize'. If shape=None, the new structure is assumed to
       be 1-D with n * itemsize = bytelen. If shape is given, the usual
       constraint for contiguous arrays prod(shape) * itemsize = bytelen
       applies. On success, return (items, shape). If the constraints
       cannot be met, return (None, shape). If a chunk of bytes is
       interpreted as NaN as a result of float conversion, return
       ('nan', shape)."""
    bytelen = exporter.nbytes
    if shape:
        if prod(shape) * itemsize != bytelen:
            return None, shape
    elif shape == []:
        if exporter.ndim == 0 or itemsize != bytelen:
            return None, shape
    else:
        n, r = divmod(bytelen, itemsize)
        shape = [n]
        if r != 0:
            return None, shape

    mem = exporter.tobytes()
    byteitems = [mem[i:i + itemsize] for i in range(0, len(mem), itemsize)]

    items = []
    for v in byteitems:
        item = struct.unpack(fmt, v)[0]
        if item != item:  # NaN is the only value not equal to itself
            return 'nan', shape
        items.append(item)

    return (items, shape) if shape != [] else (items[0], shape)
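The NaN check in cast_items() above relies on the IEEE-754 rule that NaN compares unequal to itself; decoding an arbitrary byte chunk as a float may produce NaN. A minimal standalone sketch of just that check:

```python
import struct

# A byte pattern that decodes to NaN; "item != item" is true only for NaN.
nan_bytes = struct.pack('<d', float('nan'))
item = struct.unpack('<d', nan_bytes)[0]
is_nan = item != item
print(is_nan)  # True
```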

def gencastshapes():
    """Generate shapes to test casting."""
    for n in range(32):
        yield [n]
    ndim = randrange(4, 6)
    minshape = 1 if randrange(100) > 80 else 2
    yield [randrange(minshape, 5) for _ in range(ndim)]
    ndim = randrange(2, 4)
    minshape = 1 if randrange(100) > 80 else 2
    yield [randrange(minshape, 5) for _ in range(ndim)]


def genslices(n):
    """Generate all possible slices for a single dimension."""
    return product(range(-n, n + 1), range(-n, n + 1), range(-n, n + 1))
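genslices() above enumerates every (start, stop, step) triple in [-n, n], deliberately including degenerate candidates such as step == 0. A standalone sketch of the same enumeration (genslices_sketch is a hypothetical name used only for this illustration):

```python
from itertools import product

def genslices_sketch(n):
    # All (start, stop, step) triples drawn from [-n, n]:
    # (2*n + 1)**3 combinations in total.
    return product(range(-n, n + 1), repeat=3)

triples = list(genslices_sketch(1))
print(len(triples))  # 27 == (2*1 + 1)**3
```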

def genslices_ndim(ndim, shape):
    """Generate all possible slice tuples for 'shape'."""
    iterables = [genslices(shape[n]) for n in range(ndim)]
    return product(*iterables)

def rslice(n, allow_empty=False):
    """Generate a random slice for a single dimension of length n.
       If allow_empty=True, the slices may be empty, otherwise they
       will be non-empty."""
    minlen = 0 if allow_empty or n == 0 else 1
    slicelen = randrange(minlen, n + 1)
    return randslice_from_slicelen(slicelen, n)

def rslices(n, allow_empty=False):
    """Generate random slices for a single dimension."""
    for _ in range(5):
        yield rslice(n, allow_empty)


def rslices_ndim(ndim, shape, iterations=5):
    """Generate random slice tuples for 'shape'."""
    # non-empty slices
    for _ in range(iterations):
        yield tuple(rslice(shape[n]) for n in range(ndim))
    # possibly empty slices
    for _ in range(iterations):
        yield tuple(rslice(shape[n], allow_empty=True) for n in range(ndim))
    # invalid slices
    yield tuple(slice(0, 1, 0) for _ in range(ndim))

def rpermutation(iterable, r=None):
    """Yield a single random permutation (cheap stand-in for
       itertools.permutations in short-test mode)."""
    pool = tuple(iterable)
    r = len(pool) if r is None else r
    yield tuple(sample(pool, r))

def ndarray_print(nd):
    """Print ndarray for debugging."""
    try:
        x = nd.tolist()
    except (TypeError, NotImplementedError):
        x = nd.tobytes()
    if isinstance(nd, ndarray):
        offset = nd.offset
        flags = nd.flags
    else:
        offset = 'unknown'
        flags = 'unknown'
    print("ndarray(%s, shape=%s, strides=%s, suboffsets=%s, offset=%s, "
          "format='%s', itemsize=%s, flags=%s)" %
          (x, nd.shape, nd.strides, nd.suboffsets, offset, nd.format,
           nd.itemsize, flags))
    sys.stdout.flush()


ITERATIONS = 100
MAXDIM = 5
MAXSHAPE = 10

if SHORT_TEST:
    ITERATIONS = 10
    MAXDIM = 3
    MAXSHAPE = 4
    genslices = rslices
    genslices_ndim = rslices_ndim
    permutations = rpermutation

@unittest.skipUnless(struct, 'struct module required for this test.')
@unittest.skipUnless(ndarray, 'ndarray object required for this test')
class TestBufferProtocol(unittest.TestCase):

    def setUp(self):
        self.sizeof_void_p = get_sizeof_void_p()

    def verify(self, result, *, obj,
               itemsize, fmt, readonly,
               ndim, shape, strides,
               lst, sliced=False, cast=False):
        # Verify buffer contents against expected values.
        if shape:
            expected_len = prod(shape) * itemsize
        elif not fmt:  # array has been implicitly cast to unsigned bytes
            expected_len = len(lst)
        else:  # ndim == 0
            expected_len = itemsize

        # Reconstruct suboffsets from strides.
        suboffsets = ()
        if result.suboffsets:
            self.assertGreater(ndim, 0)

            suboffset0 = 0
            for n in range(1, ndim):
                if shape[n] == 0:
                    break
                if strides[n] <= 0:
                    suboffset0 += -strides[n] * (shape[n] - 1)

            suboffsets = [suboffset0] + [-1 for v in range(ndim - 1)]

            # Not correct if slicing has occurred in the first dimension.
            stride0 = self.sizeof_void_p
            if strides[0] < 0:
                stride0 = -stride0
            strides = [stride0] + list(strides[1:])

        self.assertIs(result.obj, obj)
        self.assertEqual(result.nbytes, expected_len)
        self.assertEqual(result.itemsize, itemsize)
        self.assertEqual(result.format, fmt)
        self.assertEqual(result.readonly, readonly)
        self.assertEqual(result.ndim, ndim)
        self.assertEqual(result.shape, tuple(shape))
        if not (sliced and suboffsets):
            self.assertEqual(result.strides, tuple(strides))
        self.assertEqual(result.suboffsets, tuple(suboffsets))

        if isinstance(result, ndarray) or is_memoryview_format(fmt):
            rep = result.tolist() if fmt else result.tobytes()
            self.assertEqual(rep, lst)

        if not fmt:  # array has been cast to unsigned bytes
            return
        if not cast:  # castless buffer retains the format
            b = bytearray()
            buf_err = None
            for ind in indices(shape):
                try:
                    item1 = get_pointer(result, ind)
                    item2 = get_item(lst, ind)
                    if isinstance(item2, tuple):
                        x = struct.pack(fmt, *item2)
                    else:
                        x = struct.pack(fmt, item2)
                    b.extend(x)
                except BufferError:
                    buf_err = True  # re-exporter does not provide full buffer
                    break
                self.assertEqual(item1, item2)

            if not buf_err:
                # test tobytes()
                self.assertEqual(result.tobytes(), b)

                # test hex()
                m = memoryview(result)
                h = ''.join('%02x' % c for c in b)
                self.assertEqual(m.hex(), h)

                # 'flattened' := elements of lst in C-order
                ff = fmt if fmt else 'B'
                flattened = flatten(lst)

                for order in ['C', 'F', 'A']:
                    expected = result
                    if order == 'F':
                        if (not is_contiguous(result, 'A') or
                                is_contiguous(result, 'C')):
                            # Convert the flattened list to Fortran order.
                            trans = transpose(flattened, shape)
                            expected = ndarray(trans, shape=shape, format=ff,
                                               flags=ND_FORTRAN)
                    elif (not is_contiguous(result, 'A') or
                          is_contiguous(result, 'F') and order == 'C'):
                        # The flattened list is already in C-order.
                        expected = ndarray(flattened, shape=shape, format=ff)

                    contig = get_contiguous(result, PyBUF_READ, order)
                    self.assertEqual(contig.tobytes(), b)
                    self.assertTrue(cmp_contig(contig, expected))

                    if ndim == 0:
                        continue

                    nmemb = len(flattened)
                    ro = 0 if readonly else ND_WRITABLE

                    # To 'C'
                    contig = py_buffer_to_contiguous(result, 'C',
                                                     PyBUF_FULL_RO)
                    self.assertEqual(len(contig), nmemb * itemsize)
                    initlst = [struct.unpack_from(fmt, contig, n * itemsize)
                               for n in range(nmemb)]
                    if len(initlst[0]) == 1:
                        initlst = [v[0] for v in initlst]

                    y = ndarray(initlst, shape=shape, flags=ro, format=fmt)
                    self.assertEqual(memoryview(y), memoryview(result))

                    # To 'F'
                    contig = py_buffer_to_contiguous(result, 'F',
                                                     PyBUF_FULL_RO)
                    self.assertEqual(len(contig), nmemb * itemsize)
                    initlst = [struct.unpack_from(fmt, contig, n * itemsize)
                               for n in range(nmemb)]
                    if len(initlst[0]) == 1:
                        initlst = [v[0] for v in initlst]

                    y = ndarray(initlst, shape=shape, flags=ro | ND_FORTRAN,
                                format=fmt)
                    self.assertEqual(memoryview(y), memoryview(result))

                    # To 'A'
                    contig = py_buffer_to_contiguous(result, 'A',
                                                     PyBUF_FULL_RO)
                    self.assertEqual(len(contig), nmemb * itemsize)
                    initlst = [struct.unpack_from(fmt, contig, n * itemsize)
                               for n in range(nmemb)]
                    if len(initlst[0]) == 1:
                        initlst = [v[0] for v in initlst]

                    f = ND_FORTRAN if is_contiguous(result, 'F') else 0
                    y = ndarray(initlst, shape=shape, flags=f | ro, format=fmt)
                    self.assertEqual(memoryview(y), memoryview(result))
        if is_memoryview_format(fmt):
            try:
                m = memoryview(result)
            except BufferError:  # re-exporter does not provide full info
                return
            ex = result.obj if isinstance(result, memoryview) else result

            self.assertIs(m.obj, ex)
            self.assertEqual(m.nbytes, expected_len)
            self.assertEqual(m.itemsize, itemsize)
            self.assertEqual(m.format, fmt)
            self.assertEqual(m.readonly, readonly)
            self.assertEqual(m.ndim, ndim)
            self.assertEqual(m.shape, tuple(shape))
            if not (sliced and suboffsets):
                self.assertEqual(m.strides, tuple(strides))
            self.assertEqual(m.suboffsets, tuple(suboffsets))

            n = 1 if ndim == 0 else len(lst)
            self.assertEqual(len(m), n)

            rep = result.tolist() if fmt else result.tobytes()
            self.assertEqual(rep, lst)
            self.assertEqual(m, result)

    def verify_getbuf(self, orig_ex, ex, req, sliced=False):
        def simple_fmt(ex):
            return ex.format == '' or ex.format == 'B'

        def match(req, flag):
            return req & flag == flag

        if (  # writable request to read-only exporter
                (ex.readonly and match(req, PyBUF_WRITABLE)) or
                # cannot match explicit contiguity request
                (match(req, PyBUF_C_CONTIGUOUS) and not ex.c_contiguous) or
                (match(req, PyBUF_F_CONTIGUOUS) and not ex.f_contiguous) or
                (match(req, PyBUF_ANY_CONTIGUOUS) and not ex.contiguous) or
                # buffer needs suboffsets
                (not match(req, PyBUF_INDIRECT) and ex.suboffsets) or
                # buffer without strides must be C-contiguous
                (not match(req, PyBUF_STRIDES) and not ex.c_contiguous) or
                # PyBUF_SIMPLE|PyBUF_FORMAT and PyBUF_WRITABLE|PyBUF_FORMAT
                (not match(req, PyBUF_ND) and match(req, PyBUF_FORMAT))):
            self.assertRaises(BufferError, ndarray, ex, getbuf=req)
            return

        if isinstance(ex, ndarray) or is_memoryview_format(ex.format):
            lst = ex.tolist()
        else:
            nd = ndarray(ex, getbuf=PyBUF_FULL_RO)
            lst = nd.tolist()

        # The consumer may have requested default values or a NULL format.
        ro = 0 if match(req, PyBUF_WRITABLE) else ex.readonly
        fmt = ex.format
        itemsize = ex.itemsize
        ndim = ex.ndim
        if not match(req, PyBUF_FORMAT):
            fmt = ''
            lst = orig_ex.tobytes()
        if not match(req, PyBUF_ND):
            ndim = 1
        shape = orig_ex.shape if match(req, PyBUF_ND) else ()
        strides = orig_ex.strides if match(req, PyBUF_STRIDES) else ()

        nd = ndarray(ex, getbuf=req)
        self.verify(nd, obj=ex,
                    itemsize=itemsize, fmt=fmt, readonly=ro,
                    ndim=ndim, shape=shape, strides=strides,
                    lst=lst, sliced=sliced)

    def test_ndarray_getbuf(self):
        requests = (
            # distinct flags
            PyBUF_INDIRECT, PyBUF_STRIDES, PyBUF_ND, PyBUF_SIMPLE,
            PyBUF_C_CONTIGUOUS, PyBUF_F_CONTIGUOUS, PyBUF_ANY_CONTIGUOUS,
            # compound requests
            PyBUF_FULL, PyBUF_FULL_RO,
            PyBUF_RECORDS, PyBUF_RECORDS_RO,
            PyBUF_STRIDED, PyBUF_STRIDED_RO,
            PyBUF_CONTIG, PyBUF_CONTIG_RO,
        )
        # items and format
        items_fmt = (
            ([True if x % 2 else False for x in range(12)], '?'),
            ([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 'b'),
            ([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], 'B'),
            ([(2 ** 31 - x) if x % 2 else (-2 ** 31 + x)
              for x in range(12)], 'l'),
        )
        # shape, strides, offset
        structure = (
            ([], [], 0),
            ([1, 3, 1], [], 0),
            ([12], [], 0),
            ([12], [-1], 11),
            ([6], [2], 0),
            ([6], [-2], 11),
            ([3, 4], [], 0),
            ([3, 4], [-4, -1], 11),
            ([2, 2], [4, 1], 4),
            ([2, 2], [-4, -1], 8),
        )
        # ndarray creation flags
        ndflags = (0, ND_WRITABLE, ND_FORTRAN, ND_FORTRAN | ND_WRITABLE,
                   ND_PIL, ND_PIL | ND_WRITABLE)
        # getbuf request flag combinations
        real_flags = (0, PyBUF_WRITABLE, PyBUF_FORMAT,
                      PyBUF_WRITABLE | PyBUF_FORMAT)

        for items, fmt in items_fmt:
            itemsize = struct.calcsize(fmt)
            for shape, strides, offset in structure:
                strides = [v * itemsize for v in strides]
                offset *= itemsize
                for flags in ndflags:
                    if strides and flags & ND_FORTRAN:
                        continue
                    if not shape and flags & ND_PIL:
                        continue

                    _items = items if shape else items[0]
                    ex1 = ndarray(_items, format=fmt, flags=flags,
                                  shape=shape, strides=strides, offset=offset)
                    ex2 = ex1[::-2] if shape else None

                    m1 = memoryview(ex1)
                    if ex2:
                        m2 = memoryview(ex2)
                    if ex1.ndim == 0 or (ex1.ndim == 1 and shape and strides):
                        self.assertEqual(m1, ex1)
                    if ex2 and ex2.ndim == 1 and shape and strides:
                        self.assertEqual(m2, ex2)

                    for req in requests:
                        for bits in real_flags:
                            self.verify_getbuf(ex1, ex1, req | bits)
                            self.verify_getbuf(ex1, m1, req | bits)
                            if ex2:
                                self.verify_getbuf(ex2, ex2, req | bits,
                                                   sliced=True)
                                self.verify_getbuf(ex2, m2, req | bits,
                                                   sliced=True)
        items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

        # ND_GETBUF_FAIL
        ex = ndarray(items, shape=[12], flags=ND_GETBUF_FAIL)
        self.assertRaises(BufferError, ndarray, ex)

        # Re-exporting a simple buffer cannot satisfy stricter requests.
        base = ndarray([9], [1])
        ex = ndarray(base, getbuf=PyBUF_SIMPLE)
        self.assertRaises(BufferError, ndarray, ex, getbuf=PyBUF_WRITABLE)
        self.assertRaises(BufferError, ndarray, ex, getbuf=PyBUF_ND)
        self.assertRaises(BufferError, ndarray, ex, getbuf=PyBUF_STRIDES)
        self.assertRaises(BufferError, ndarray, ex, getbuf=PyBUF_C_CONTIGUOUS)
        self.assertRaises(BufferError, ndarray, ex, getbuf=PyBUF_F_CONTIGUOUS)
        self.assertRaises(BufferError, ndarray, ex,
                          getbuf=PyBUF_ANY_CONTIGUOUS)
        nd = ndarray(ex, getbuf=PyBUF_SIMPLE)

        # Arrays with zero items or extent-1 dimensions are both
        # C- and Fortran-contiguous.
        for shape in ([1, 12, 1], [7, 0, 7]):
            for order in (0, ND_FORTRAN):
                ex = ndarray(items, shape=shape, flags=order | ND_WRITABLE)
                self.assertTrue(is_contiguous(ex, 'F'))
                self.assertTrue(is_contiguous(ex, 'C'))

                for flags in requests:
                    nd = ndarray(ex, getbuf=flags)
                    self.assertTrue(is_contiguous(nd, 'F'))
                    self.assertTrue(is_contiguous(nd, 'C'))

    def test_ndarray_exceptions(self):
        nd = ndarray([9], [1])
        ndm = ndarray([9], [1], flags=ND_VAREXPORT)

        # Initialization of a new ndarray or mutation of an existing array.
        for c in (ndarray, nd.push, ndm.push):
            # Invalid types.
            self.assertRaises(TypeError, c, {1, 2, 3})
            self.assertRaises(TypeError, c, [1, 2, '3'])
            self.assertRaises(TypeError, c, [1, 2, (3, 4)])
            self.assertRaises(TypeError, c, [1, 2, 3], shape={3})
            self.assertRaises(TypeError, c, [1, 2, 3], shape=[3], strides={1})
            self.assertRaises(TypeError, c, [1, 2, 3], shape=[3], offset=[])
            self.assertRaises(TypeError, c, [1], shape=[1], format={})
            self.assertRaises(TypeError, c, [1], shape=[1], flags={})
            self.assertRaises(TypeError, c, [1], shape=[1], getbuf={})

            # Invalid flag combinations.
            self.assertRaises(TypeError, c, [1], shape=[1], strides=[1],
                              flags=ND_FORTRAN)
            self.assertRaises(TypeError, c, [1], shape=[], flags=ND_PIL)

            # Invalid items.
            self.assertRaises(ValueError, c, [], shape=[1])
            self.assertRaises(ValueError, c, ['XXX'], shape=[1], format='L')
            self.assertRaises(struct.error, c, [1000], shape=[1], format='B')
            self.assertRaises(ValueError, c, [1, (2, 3)], shape=[2],
                              format='B')
            self.assertRaises(ValueError, c, [1, 2, 3], shape=[3],
                              format='QL')

            # Invalid ndim.
            n = ND_MAX_NDIM + 1
            self.assertRaises(ValueError, c, [1] * n, shape=[1] * n)

            # Invalid shape.
            self.assertRaises(ValueError, c, [1], shape=[-1])
            self.assertRaises(ValueError, c, [1, 2, 3], shape=['3'])
            self.assertRaises(OverflowError, c, [1], shape=[2 ** 128])
            self.assertRaises(ValueError, c, [1, 2, 3, 4, 5], shape=[2, 2],
                              offset=3)

            # Invalid strides.
            self.assertRaises(ValueError, c, [1, 2, 3], shape=[3],
                              strides=['1'])
            self.assertRaises(OverflowError, c, [1], shape=[1],
                              strides=[2 ** 128])

            # Invalid combinations of strides, shape and format.
            self.assertRaises(ValueError, c, [1, 2], shape=[2, 1],
                              strides=[1])
            self.assertRaises(ValueError, c, [1, 2, 3, 4], shape=[2],
                              strides=[3], format='L')

            # Invalid offset.
            self.assertRaises(ValueError, c, [1, 2, 3], shape=[3], offset=4)
            self.assertRaises(ValueError, c, [1, 2, 3], shape=[1], offset=3,
                              format='L')

            # Invalid format.
            self.assertRaises(ValueError, c, [1, 2, 3], shape=[3], format='')
            self.assertRaises(struct.error, c, [(1, 2, 3)], shape=[1],
                              format='@#$')

            # Striding out of the memory bounds.
            items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
            self.assertRaises(ValueError, c, items, shape=[2, 3],
                              strides=[-3, -2], offset=5)

            # Invalid argument combinations.
            self.assertRaises(TypeError, c, bytearray(), format='Q')
            self.assertRaises(TypeError, c, [1], shape=[1], getbuf=PyBUF_FULL)
            self.assertRaises(TypeError, c, [1])

        # PyBUF_WRITABLE request to a read-only exporter.
        self.assertRaises(BufferError, ndarray, b'123', getbuf=PyBUF_WRITABLE)

        # ND_VAREXPORT can only be specified during construction.
        nd = ndarray([9], [1], flags=ND_VAREXPORT)
        self.assertRaises(ValueError, nd.push, [1], [1], flags=ND_VAREXPORT)

        # push()/pop() are invalid operations for consumers.
        nd = ndarray(b'123')
        self.assertRaises(BufferError, nd.push, [1], [1])
        self.assertRaises(BufferError, nd.pop)

        # Single remaining buffer cannot be popped while it is exported.
        nd = ndarray([9], [1])
        nd.push([1], [1])
        m = memoryview(nd)
        self.assertRaises(BufferError, nd.push, [1], [1])
        self.assertRaises(BufferError, nd.pop)
        m.release()
        nd.pop()
        self.assertRaises(BufferError, nd.pop)
        del nd

        # get_pointer()
        self.assertRaises(TypeError, get_pointer, {}, [1, 2, 3])
        self.assertRaises(TypeError, get_pointer, b'123', {})

        nd = ndarray(list(range(100)), shape=[1] * 100)
        self.assertRaises(ValueError, get_pointer, nd, [5])

        nd = ndarray(list(range(12)), shape=[3, 4])
        self.assertRaises(ValueError, get_pointer, nd, [2, 3, 4])
        self.assertRaises(ValueError, get_pointer, nd, [3, 3])
        self.assertRaises(ValueError, get_pointer, nd, [-3, 3])
        self.assertRaises(OverflowError, get_pointer, nd, [1 << 64, 3])

        # tolist() needs a format.
        ex = ndarray([1, 2, 3], shape=[3], format='L')
        nd = ndarray(ex, getbuf=PyBUF_SIMPLE)
        self.assertRaises(ValueError, nd.tolist)

        # memoryview_from_buffer()
        ex1 = ndarray([1, 2, 3], shape=[3], format='L')
        ex2 = ndarray(ex1)
        nd = ndarray(ex2)
        self.assertRaises(TypeError, nd.memoryview_from_buffer)

        nd = ndarray([(1,) * 200], shape=[1], format='L' * 200)
        self.assertRaises(TypeError, nd.memoryview_from_buffer)

        n = ND_MAX_NDIM
        nd = ndarray(list(range(n)), shape=[1] * n)
        self.assertRaises(ValueError, nd.memoryview_from_buffer)

        # get_contiguous()
        nd = ndarray([1], shape=[1])
        self.assertRaises(TypeError, get_contiguous, 1, 2, 3, 4, 5)
        self.assertRaises(TypeError, get_contiguous, nd, 'xyz', 'C')
        self.assertRaises(OverflowError, get_contiguous, nd, 2 ** 64, 'C')
        self.assertRaises(TypeError, get_contiguous, nd, PyBUF_READ, 961)
        self.assertRaises(UnicodeEncodeError, get_contiguous, nd, PyBUF_READ,
                          '\u2007')
        self.assertRaises(ValueError, get_contiguous, nd, PyBUF_READ, 'Z')
        self.assertRaises(ValueError, get_contiguous, nd, 255, 'A')

        # cmp_contig()
        nd = ndarray([1], shape=[1])
        self.assertRaises(TypeError, cmp_contig, 1, 2, 3, 4, 5)
        self.assertRaises(TypeError, cmp_contig, {}, nd)
        self.assertRaises(TypeError, cmp_contig, nd, {})

        # is_contiguous()
        nd = ndarray([1], shape=[1])
        self.assertRaises(TypeError, is_contiguous, 1, 2, 3, 4, 5)
        self.assertRaises(TypeError, is_contiguous, {}, 'A')
        self.assertRaises(TypeError, is_contiguous, nd, 201)

    def test_ndarray_linked_list(self):
        for perm in permutations(range(5)):
            m = [0] * 5
            nd = ndarray([1, 2, 3], shape=[3], flags=ND_VAREXPORT)
            m[0] = memoryview(nd)

            for i in range(1, 5):
                nd.push([1, 2, 3], shape=[3])
                m[i] = memoryview(nd)

            for i in range(5):
                m[perm[i]].release()

            self.assertRaises(BufferError, nd.pop)
            del nd

    def test_ndarray_format_scalar(self):
        # ndim = 0: scalar
        for fmt, scalar, _ in iter_format(0):
            itemsize = struct.calcsize(fmt)
            nd = ndarray(scalar, shape=(), format=fmt)
            self.verify(nd, obj=None,
                        itemsize=itemsize, fmt=fmt, readonly=1,
                        ndim=0, shape=(), strides=(),
                        lst=scalar)

    def test_ndarray_format_shape(self):
        # ndim = 1, shape = [n]
        nitems = randrange(1, 10)
        for fmt, items, _ in iter_format(nitems):
            itemsize = struct.calcsize(fmt)
            for flags in (0, ND_PIL):
                nd = ndarray(items, shape=[nitems], format=fmt, flags=flags)
                self.verify(nd, obj=None,
                            itemsize=itemsize, fmt=fmt, readonly=1,
                            ndim=1, shape=(nitems,), strides=(itemsize,),
                            lst=items)

    def test_ndarray_format_strides(self):
        # ndim = 1, strides
        nitems = randrange(1, 30)
        for fmt, items, _ in iter_format(nitems):
            itemsize = struct.calcsize(fmt)
            for step in range(-5, 5):
                if step == 0:
                    continue

                shape = [len(items[::step])]
                strides = [step * itemsize]
                offset = itemsize * (nitems - 1) if step < 0 else 0

                for flags in (0, ND_PIL):
                    nd = ndarray(items, shape=shape, strides=strides,
                                 format=fmt, offset=offset, flags=flags)
                    self.verify(nd, obj=None,
                                itemsize=itemsize, fmt=fmt, readonly=1,
                                ndim=1, shape=shape, strides=strides,
                                lst=items[::step])

    def test_ndarray_fortran(self):
        items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
        ex = ndarray(items, shape=(3, 4), strides=(1, 3))
        nd = ndarray(ex, getbuf=PyBUF_F_CONTIGUOUS | PyBUF_FORMAT)
        self.assertEqual(nd.tolist(), farray(items, (3, 4)))

    def test_ndarray_multidim(self):
        for ndim in range(5):
            shape_t = [randrange(2, 10) for _ in range(ndim)]
            nitems = prod(shape_t)
            for shape in permutations(shape_t):
                fmt, items, _ = randitems(nitems)
                itemsize = struct.calcsize(fmt)

                for flags in (0, ND_PIL):
                    if ndim == 0 and flags == ND_PIL:
                        continue

                    # C array
                    nd = ndarray(items, shape=shape, format=fmt, flags=flags)
                    strides = strides_from_shape(ndim, shape, itemsize, 'C')
                    lst = carray(items, shape)
                    self.verify(nd, obj=None,
                                itemsize=itemsize, fmt=fmt, readonly=1,
                                ndim=ndim, shape=shape, strides=strides,
                                lst=lst)

                    if is_memoryview_format(fmt):
                        # memoryview: reconstruct strides
                        ex = ndarray(items, shape=shape, format=fmt)
                        nd = ndarray(ex,
                                     getbuf=PyBUF_CONTIG_RO | PyBUF_FORMAT)
                        self.assertTrue(nd.strides == ())
                        mv = nd.memoryview_from_buffer()
                        self.verify(mv, obj=None,
                                    itemsize=itemsize, fmt=fmt, readonly=1,
                                    ndim=ndim, shape=shape, strides=strides,
                                    lst=lst)

                    # Fortran array
                    nd = ndarray(items, shape=shape, format=fmt,
                                 flags=flags | ND_FORTRAN)
                    strides = strides_from_shape(ndim, shape, itemsize, 'F')
                    lst = farray(items, shape)
                    self.verify(nd, obj=None,
                                itemsize=itemsize, fmt=fmt, readonly=1,
                                ndim=ndim, shape=shape, strides=strides,
                                lst=lst)

    def test_ndarray_index_invalid(self):
        # not writable
        nd = ndarray([1], shape=[1])
        self.assertRaises(TypeError, nd.__setitem__, 1, 8)
        mv = memoryview(nd)
        self.assertEqual(mv, nd)
        self.assertRaises(TypeError, mv.__setitem__, 1, 8)

        # cannot be deleted
        nd = ndarray([1], shape=[1], flags=ND_WRITABLE)
        self.assertRaises(TypeError, nd.__delitem__, 1)
        mv = memoryview(nd)
        self.assertEqual(mv, nd)
        self.assertRaises(TypeError, mv.__delitem__, 1)

        # overflow
        nd = ndarray([1], shape=[1], flags=ND_WRITABLE)
        self.assertRaises(OverflowError, nd.__getitem__, 1 << 64)
        self.assertRaises(OverflowError, nd.__setitem__, 1 << 64, 8)
        mv = memoryview(nd)
        self.assertEqual(mv, nd)
        self.assertRaises(IndexError, mv.__getitem__, 1 << 64)
        self.assertRaises(IndexError, mv.__setitem__, 1 << 64, 8)

        # format
        items = [1, 2, 3, 4, 5, 6, 7, 8]
        nd = ndarray(items, shape=[len(items)], format='B',
                     flags=ND_WRITABLE)
        self.assertRaises(struct.error, nd.__setitem__, 2, 300)
        self.assertRaises(ValueError, nd.__setitem__, 1, (100, 200))
        mv = memoryview(nd)
        self.assertEqual(mv, nd)
        self.assertRaises(ValueError, mv.__setitem__, 2, 300)
        self.assertRaises(TypeError, mv.__setitem__, 1, (100, 200))

        items = [(1, 2), (3, 4), (5, 6)]
        nd = ndarray(items, shape=[len(items)], format='LQ',
                     flags=ND_WRITABLE)
        self.assertRaises(ValueError, nd.__setitem__, 2, 300)
        self.assertRaises(struct.error, nd.__setitem__, 1, (b'\x001', 200))

    def test_ndarray_index_scalar(self):
        # scalar
        nd = ndarray(1, shape=(), flags=ND_WRITABLE)
        mv = memoryview(nd)
        self.assertEqual(mv, nd)

        x = nd[()]
        self.assertEqual(x, 1)
        x = nd[...]
        self.assertEqual(x.tolist(), nd.tolist())

        x = mv[()]
        self.assertEqual(x, 1)
        x = mv[...]
        self.assertEqual(x.tolist(), nd.tolist())

        self.assertRaises(TypeError, nd.__getitem__, 0)
        self.assertRaises(TypeError, mv.__getitem__, 0)
        self.assertRaises(TypeError, nd.__setitem__, 0, 8)
        self.assertRaises(TypeError, mv.__setitem__, 0, 8)

        self.assertEqual(nd.tolist(), 1)
        self.assertEqual(mv.tolist(), 1)

        nd[()] = 9
        self.assertEqual(nd.tolist(), 9)
        mv[()] = 9
        self.assertEqual(mv.tolist(), 9)

        nd[...] = 5
        self.assertEqual(nd.tolist(), 5)
        mv[...] = 5
        self.assertEqual(mv.tolist(), 5)

    def test_ndarray_index_null_strides(self):
        ex = ndarray(list(range(2 * 4)), shape=[2, 4], flags=ND_WRITABLE)
        nd = ndarray(ex, getbuf=PyBUF_CONTIG)

        # Sub-views are not possible without strides.
        self.assertRaises(BufferError, nd.__getitem__, 1)
        self.assertRaises(BufferError, nd.__getitem__, slice(3, 5, 1))

    def test_ndarray_index_getitem_single(self):
        # getitem
        for fmt, items, _ in iter_format(5):
            nd = ndarray(items, shape=[5], format=fmt)
            for i in range(-5, 5):
                self.assertEqual(nd[i], items[i])

            self.assertRaises(IndexError, nd.__getitem__, -6)
            self.assertRaises(IndexError, nd.__getitem__, 5)

            if is_memoryview_format(fmt):
                mv = memoryview(nd)
                self.assertEqual(mv, nd)
                for i in range(-5, 5):
                    self.assertEqual(mv[i], items[i])

                self.assertRaises(IndexError, mv.__getitem__, -6)
                self.assertRaises(IndexError, mv.__getitem__, 5)

        # getitem with null strides
        for fmt, items, _ in iter_format(5):
            ex = ndarray(items, shape=[5], flags=ND_WRITABLE, format=fmt)
            nd = ndarray(ex, getbuf=PyBUF_CONTIG | PyBUF_FORMAT)
            for i in range(-5, 5):
                self.assertEqual(nd[i], items[i])

            if is_memoryview_format(fmt):
                mv = nd.memoryview_from_buffer()
                self.assertIs(mv.__eq__(nd), NotImplemented)
                for i in range(-5, 5):
                    self.assertEqual(mv[i], items[i])

        # getitem with null format
        items = [1, 2, 3, 4, 5]
        ex = ndarray(items, shape=[5])
        nd = ndarray(ex, getbuf=PyBUF_CONTIG_RO)
        for i in range(-5, 5):
            self.assertEqual(nd[i], items[i])

        # getitem with null shape, strides and format
        items = [1, 2, 3, 4, 5]
        ex = ndarray(items, shape=[5])
        nd = ndarray(ex, getbuf=PyBUF_SIMPLE)
        for i in range(-5, 5):
            self.assertEqual(nd[i], items[i])

    def test_ndarray_index_setitem_single(self):
        # assign single value
        for fmt, items, single_item in iter_format(5):
            nd = ndarray(items, shape=[5], format=fmt, flags=ND_WRITABLE)
            for i in range(5):
                items[i] = single_item
                nd[i] = single_item
            self.assertEqual(nd.tolist(), items)

            self.assertRaises(IndexError, nd.__setitem__, -6, single_item)
            self.assertRaises(IndexError, nd.__setitem__, 5, single_item)

            if not is_memoryview_format(fmt):
                continue

            nd = ndarray(items, shape=[5], format=fmt, flags=ND_WRITABLE)
            mv = memoryview(nd)
            self.assertEqual(mv, nd)
            for i in range(5):
                items[i] = single_item
                mv[i] = single_item
            self.assertEqual(mv.tolist(), items)

            self.assertRaises(IndexError, mv.__setitem__, -6, single_item)
            self.assertRaises(IndexError, mv.__setitem__, 5, single_item)

        # assign single value: lobject = robject
        for fmt, items, single_item in iter_format(5):
            nd = ndarray(items, shape=[5], format=fmt, flags=ND_WRITABLE)
            for i in range(-5, 4):
                items[i] = items[i + 1]
                nd[i] = nd[i + 1]
            self.assertEqual(nd.tolist(), items)

            if not is_memoryview_format(fmt):
                continue

            nd = ndarray(items, shape=[5], format=fmt, flags=ND_WRITABLE)
            mv = memoryview(nd)
            self.assertEqual(mv, nd)
            for i in range(-5, 4):
                items[i] = items[i + 1]
                mv[i] = mv[i + 1]
            self.assertEqual(mv.tolist(), items)

    def test_ndarray_index_getitem_multidim(self):
        shape_t = (2, 3, 5)
        nitems = prod(shape_t)
        for shape in permutations(shape_t):
            fmt, items, _ = randitems(nitems)

            for flags in (0, ND_PIL):
                # C array
                nd = ndarray(items, shape=shape, format=fmt, flags=flags)
                lst = carray(items, shape)

                for i in range(-shape[0], shape[0]):
                    self.assertEqual(lst[i], nd[i].tolist())
                    for j in range(-shape[1], shape[1]):
                        self.assertEqual(lst[i][j], nd[i][j].tolist())
                        for k in range(-shape[2], shape[2]):
                            self.assertEqual(lst[i][j][k], nd[i][j][k])

                # Fortran array
                nd = ndarray(items, shape=shape, format=fmt,
                             flags=flags | ND_FORTRAN)
                lst = farray(items, shape)

                for i in range(-shape[0], shape[0]):
                    self.assertEqual(lst[i], nd[i].tolist())
                    for j in range(-shape[1], shape[1]):
                        self.assertEqual(lst[i][j], nd[i][j].tolist())
                        for k in range(-shape[2], shape[2]):
                            self.assertEqual(lst[i][j][k], nd[i][j][k])

    def test_ndarray_sequence(self):
        nd = ndarray(1, shape=())
        self.assertRaises(TypeError, eval, '1 in nd', locals())
        mv = memoryview(nd)
        self.assertEqual(mv, nd)
        self.assertRaises(TypeError, eval, '1 in mv', locals())

        for fmt, items, _ in iter_format(5):
            nd = ndarray(items, shape=[5], format=fmt)
            for i, v in enumerate(nd):
                self.assertEqual(v, items[i])
                self.assertTrue(v in nd)

            if is_memoryview_format(fmt):
                mv = memoryview(nd)
                for i, v in enumerate(mv):
                    self.assertEqual(v, items[i])
                    self.assertTrue(v in mv)

    def test_ndarray_slice_invalid(self):
        items = [1, 2, 3, 4, 5, 6, 7, 8]

        # rvalue is not an exporter
        xl = ndarray(items, shape=[8], flags=ND_WRITABLE)
        ml = memoryview(xl)
        self.assertRaises(TypeError, xl.__setitem__, slice(0, 8, 1), items)
        self.assertRaises(TypeError, ml.__setitem__, slice(0, 8, 1), items)

        # rvalue does not provide a full buffer
        xl = ndarray(items, shape=[8], flags=ND_WRITABLE)
        ex = ndarray(items, shape=[8], flags=ND_WRITABLE)
        xr = ndarray(ex, getbuf=PyBUF_ND)
        self.assertRaises(BufferError, xl.__setitem__, slice(0, 8, 1), xr)

        # zero step
        nd = ndarray(items, shape=[8], format='L', flags=ND_WRITABLE)
        mv = memoryview(nd)
        self.assertRaises(ValueError, nd.__getitem__, slice(0, 1, 0))
        self.assertRaises(ValueError, mv.__getitem__, slice(0, 1, 0))

        nd = ndarray(items, shape=[2, 4], format='L', flags=ND_WRITABLE)
        mv = memoryview(nd)
        self.assertRaises(ValueError, nd.__getitem__,
                          (slice(0, 1, 1), slice(0, 1, 0)))
        self.assertRaises(ValueError, nd.__getitem__,
                          (slice(0, 1, 0), slice(0, 1, 1)))
        self.assertRaises(TypeError, nd.__getitem__, '@%$')
        self.assertRaises(TypeError, nd.__getitem__, ('@%$', slice(0, 1, 1)))
        self.assertRaises(TypeError, nd.__getitem__, (slice(0, 1, 1), {}))

        # memoryview: multidimensional slices
        self.assertRaises(NotImplementedError, mv.__getitem__,
                          (slice(0, 1, 1), slice(0, 1, 0)))
        self.assertRaises(TypeError, mv.__getitem__, '@%$')

        # differing format
        xl = ndarray(items, shape=[8], format='B', flags=ND_WRITABLE)
        xr = ndarray(items, shape=[8], format='b')
        ml = memoryview(xl)
        mr = memoryview(xr)
        self.assertRaises(ValueError, xl.__setitem__, slice(0, 1, 1),
                          xr[7:8])
        self.assertEqual(xl.tolist(), items)
        self.assertRaises(ValueError, ml.__setitem__, slice(0, 1, 1),
                          mr[7:8])
        self.assertEqual(ml.tolist(), items)
        # differing itemsize
        xl = ndarray(items, shape=[8], format='B', flags=ND_WRITABLE)
        xr = ndarray(items, shape=[8], format='L')
        ml = memoryview(xl)
        mr = memoryview(xr)
        self.assertRaises(ValueError, xl.__setitem__, slice(0, 1, 1),
                          xr[7:8])
        self.assertEqual(xl.tolist(), items)
        self.assertRaises(ValueError, ml.__setitem__, slice(0, 1, 1),
                          mr[7:8])
        self.assertEqual(ml.tolist(), items)
        # differing ndim
        xl = ndarray(items, shape=[2, 4], format='b', flags=ND_WRITABLE)
        xr = ndarray(items, shape=[8], format='b')
        ml = memoryview(xl)
        mr = memoryview(xr)
        self.assertRaises(ValueError, xl.__setitem__, slice(0, 1, 1),
                          xr[7:8])
        self.assertEqual(xl.tolist(), [[1, 2, 3, 4], [5, 6, 7, 8]])
        self.assertRaises(NotImplementedError, ml.__setitem__,
                          slice(0, 1, 1), mr[7:8])

        # differing shape
        xl = ndarray(items, shape=[8], format='b', flags=ND_WRITABLE)
        xr = ndarray(items, shape=[8], format='b')
        ml = memoryview(xl)
        mr = memoryview(xr)
        self.assertRaises(ValueError, xl.__setitem__, slice(0, 2, 1),
                          xr[7:8])
        self.assertEqual(xl.tolist(), items)
        self.assertRaises(ValueError, ml.__setitem__, slice(0, 2, 1),
                          mr[7:8])
        self.assertEqual(ml.tolist(), items)

        # slice_indices()
        self.assertRaises(TypeError, slice_indices, slice(0, 1, 2), {})
        self.assertRaises(TypeError, slice_indices, '###########', 1)
        self.assertRaises(ValueError, slice_indices, slice(0, 1, 0), 4)

        # add_suboffsets()
        x = ndarray(items, shape=[8], format='b', flags=ND_PIL)
        self.assertRaises(TypeError, x.add_suboffsets)

        ex = ndarray(items, shape=[8], format='B')
        x = ndarray(ex, getbuf=PyBUF_SIMPLE)
        self.assertRaises(TypeError, x.add_suboffsets)

    def test_ndarray_slice_zero_shape(self):
        items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

        x = ndarray(items, shape=[12], format='L', flags=ND_WRITABLE)
        y = ndarray(items, shape=[12], format='L')
        x[4:4] = y[9:9]
        self.assertEqual(x.tolist(), items)

        ml = memoryview(x)
        mr = memoryview(y)
        self.assertEqual(ml, x)
        self.assertEqual(ml, y)
        ml[4:4] = mr[9:9]
        self.assertEqual(ml.tolist(), items)

        x = ndarray(items, shape=[3, 4], format='L', flags=ND_WRITABLE)
        y = ndarray(items, shape=[4, 3], format='L')
        x[1:2, 2:2] = y[1:2, 3:3]
        self.assertEqual(x.tolist(), carray(items, [3, 4]))

    def test_ndarray_slice_multidim(self):
        shape_t = (2, 3, 5)
        ndim = len(shape_t)
        nitems = prod(shape_t)
        for shape in permutations(shape_t):
            fmt, items, _ = randitems(nitems)
            itemsize = struct.calcsize(fmt)

            for flags in (0, ND_PIL):
                nd = ndarray(items, shape=shape, format=fmt, flags=flags)
                lst = carray(items, shape)

                for slices in rslices_ndim(ndim, shape):
                    listerr = None
                    try:
                        sliced = multislice(lst, slices)
                    except Exception as e:
                        listerr = e.__class__

                    nderr = None
                    try:
                        ndsliced = nd[slices]
                    except Exception as e:
                        nderr = e.__class__

                    if nderr or listerr:
                        self.assertIs(nderr, listerr)
                    else:
                        self.assertEqual(ndsliced.tolist(), sliced)

    def test_ndarray_slice_redundant_suboffsets(self):
        shape_t = (2, 3, 5, 2)
        ndim = len(shape_t)
        nitems = prod(shape_t)
        for shape in permutations(shape_t):
            fmt, items, _ = randitems(nitems)
            itemsize = struct.calcsize(fmt)

            nd = ndarray(items, shape=shape, format=fmt)
            nd.add_suboffsets()
            ex = ndarray(items, shape=shape, format=fmt)
            ex.add_suboffsets()
            mv = memoryview(ex)
            lst = carray(items, shape)

            for slices in rslices_ndim(ndim, shape):
                listerr = None
                try:
                    sliced = multislice(lst, slices)
                except Exception as e:
                    listerr = e.__class__

                nderr = None
                try:
                    ndsliced = nd[slices]
                except Exception as e:
                    nderr = e.__class__

                if nderr or listerr:
                    self.assertIs(nderr, listerr)
                else:
                    self.assertEqual(ndsliced.tolist(), sliced)

    def test_ndarray_slice_assign_single(self):
        for fmt, items, _ in iter_format(5):
            for lslice in genslices(5):
                for rslice in genslices(5):
                    for flags in (0, ND_PIL):
                        f = flags | ND_WRITABLE
                        nd = ndarray(items, shape=[5], format=fmt, flags=f)
                        ex = ndarray(items, shape=[5], format=fmt, flags=f)
                        mv = memoryview(ex)

                        lsterr = None
                        diff_structure = None
                        lst = items[:]
                        try:
                            lval = lst[lslice]
                            rval = lst[rslice]
                            lst[lslice] = lst[rslice]
                            diff_structure = len(lval) != len(rval)
                        except Exception as e:
                            lsterr = e.__class__

                        nderr = None
                        try:
                            nd[lslice] = nd[rslice]
                        except Exception as e:
                            nderr = e.__class__

                        if diff_structure:  # ndarray cannot change shape
                            self.assertIs(nderr, ValueError)
                        else:
                            self.assertEqual(nd.tolist(), lst)
                            self.assertIs(nderr, lsterr)

                        if not is_memoryview_format(fmt):
                            continue

                        mverr = None
                        try:
                            mv[lslice] = mv[rslice]
                        except Exception as e:
                            mverr = e.__class__

                        if diff_structure:  # memoryview cannot change shape
                            self.assertIs(mverr, ValueError)
                        else:
                            self.assertEqual(mv.tolist(), lst)
                            self.assertEqual(mv, nd)
                            self.assertIs(mverr, lsterr)
                            self.verify(mv, obj=ex,
                                        itemsize=nd.itemsize, fmt=fmt,
                                        readonly=0, ndim=nd.ndim,
                                        shape=nd.shape, strides=nd.strides,
                                        lst=nd.tolist())

    def test_ndarray_slice_assign_multidim(self):
        shape_t = (2, 3, 5)
        ndim = len(shape_t)
        nitems = prod(shape_t)
        for shape in permutations(shape_t):
            fmt, items, _ = randitems(nitems)

            for flags in (0, ND_PIL):
                for _ in range(ITERATIONS):
                    lslices, rslices = randslice_from_shape(ndim, shape)

                    nd = ndarray(items, shape=shape, format=fmt,
                                 flags=flags | ND_WRITABLE)
                    lst = carray(items, shape)

                    listerr = None
                    try:
                        result = multislice_assign(lst, lst, lslices,
                                                   rslices)
                    except Exception as e:
                        listerr = e.__class__

                    nderr = None
                    try:
                        nd[lslices] = nd[rslices]
                    except Exception as e:
                        nderr = e.__class__

                    if nderr or listerr:
                        self.assertIs(nderr, listerr)
                    else:
                        self.assertEqual(nd.tolist(), result)

    def test_ndarray_random(self):
        # construction of valid arrays
        for _ in range(ITERATIONS):
            for fmt in fmtdict['@']:
                itemsize = struct.calcsize(fmt)

                t = rand_structure(itemsize, True, maxdim=MAXDIM,
                                   maxshape=MAXSHAPE)
                self.assertTrue(verify_structure(*t))
                items = randitems_from_structure(fmt, t)

                x = ndarray_from_structure(items, fmt, t)
                xlist = x.tolist()

                mv = memoryview(x)
                if is_memoryview_format(fmt):
                    mvlist = mv.tolist()
                    self.assertEqual(mvlist, xlist)

                if t[2] > 0:
                    # ndim > 0: test against suboffsets representation
                    y = ndarray_from_structure(items, fmt, t, flags=ND_PIL)
                    ylist = y.tolist()
                    self.assertEqual(xlist, ylist)

                    mv = memoryview(y)
                    if is_memoryview_format(fmt):
                        self.assertEqual(mv, y)
                        mvlist = mv.tolist()
                        self.assertEqual(mvlist, ylist)

                if numpy_array:
                    shape = t[3]
                    if 0 in shape:
                        continue
                    z = numpy_array_from_structure(items, fmt, t)
                    self.verify(x, obj=None,
                                itemsize=z.itemsize, fmt=fmt, readonly=0,
                                ndim=z.ndim, shape=z.shape,
                                strides=z.strides, lst=z.tolist())
    def test_ndarray_random_invalid(self):
        # exceptions during construction of invalid arrays
        for _ in range(ITERATIONS):
            for fmt in fmtdict['@']:
                itemsize = struct.calcsize(fmt)

                t = rand_structure(itemsize, False, maxdim=MAXDIM,
                                   maxshape=MAXSHAPE)
                self.assertFalse(verify_structure(*t))
                items = randitems_from_structure(fmt, t)

                nderr = False
                try:
                    x = ndarray_from_structure(items, fmt, t)
                except Exception as e:
                    nderr = e.__class__
                self.assertTrue(nderr)

                if numpy_array:
                    numpy_err = False
                    try:
                        y = numpy_array_from_structure(items, fmt, t)
                    except Exception as e:
                        numpy_err = e.__class__

                    if 0:  # disabled: numpy does not reliably raise here
                        self.assertTrue(numpy_err)
    def test_ndarray_random_slice_assign(self):
        # valid slice assignments
        for _ in range(ITERATIONS):
            for fmt in fmtdict['@']:
                itemsize = struct.calcsize(fmt)

                lshape, rshape, lslices, rslices = \
                    rand_aligned_slices(maxdim=MAXDIM, maxshape=MAXSHAPE)
                tl = rand_structure(itemsize, True, shape=lshape)
                tr = rand_structure(itemsize, True, shape=rshape)
                self.assertTrue(verify_structure(*tl))
                self.assertTrue(verify_structure(*tr))
                litems = randitems_from_structure(fmt, tl)
                ritems = randitems_from_structure(fmt, tr)

                xl = ndarray_from_structure(litems, fmt, tl)
                xr = ndarray_from_structure(ritems, fmt, tr)
                xl[lslices] = xr[rslices]
                xllist = xl.tolist()
                xrlist = xr.tolist()

                ml = memoryview(xl)
                mr = memoryview(xr)
                self.assertEqual(ml.tolist(), xllist)
                self.assertEqual(mr.tolist(), xrlist)

                if tl[2] > 0 and tr[2] > 0:
                    # ndim > 0: test against suboffsets representation
                    yl = ndarray_from_structure(litems, fmt, tl, flags=ND_PIL)
                    yr = ndarray_from_structure(ritems, fmt, tr, flags=ND_PIL)
                    yl[lslices] = yr[rslices]
                    yllist = yl.tolist()
                    yrlist = yr.tolist()
                    self.assertEqual(xllist, yllist)
                    self.assertEqual(xrlist, yrlist)

                    ml = memoryview(yl)
                    mr = memoryview(yr)
                    self.assertEqual(ml.tolist(), yllist)
                    self.assertEqual(mr.tolist(), yrlist)

                if numpy_array:
                    if 0 in lshape or 0 in rshape:
                        continue
                    zl = numpy_array_from_structure(litems, fmt, tl)
                    zr = numpy_array_from_structure(ritems, fmt, tr)
                    zl[lslices] = zr[rslices]

                    if not is_overlapping(tl) and not is_overlapping(tr):
                        # Slice assignment of overlapping structures
                        # is undefined in NumPy.
                        self.verify(xl, obj=None,
                                    itemsize=zl.itemsize, fmt=fmt, readonly=0,
                                    ndim=zl.ndim, shape=zl.shape,
                                    strides=zl.strides, lst=zl.tolist())

                        self.verify(xr, obj=None,
                                    itemsize=zr.itemsize, fmt=fmt, readonly=0,
                                    ndim=zr.ndim, shape=zr.shape,
                                    strides=zr.strides, lst=zr.tolist())
    def test_ndarray_re_export(self):
        items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]

        nd = ndarray(items, shape=[3, 4], flags=ND_PIL)
        ex = ndarray(nd)

        self.assertTrue(ex.flags & ND_PIL)
        self.assertIs(ex.obj, nd)
        self.assertEqual(ex.suboffsets, (0, -1))
        self.assertFalse(ex.c_contiguous)
        self.assertFalse(ex.f_contiguous)
        self.assertFalse(ex.contiguous)
    def test_ndarray_zero_shape(self):
        # zeros in shape
        for flags in (0, ND_PIL):
            nd = ndarray([1, 2, 3], shape=[0], flags=flags)
            mv = memoryview(nd)
            self.assertEqual(mv, nd)
            self.assertEqual(nd.tolist(), [])
            self.assertEqual(mv.tolist(), [])

            nd = ndarray([1, 2, 3], shape=[0, 3, 3], flags=flags)
            self.assertEqual(nd.tolist(), [])

            nd = ndarray([1, 2, 3], shape=[3, 0, 3], flags=flags)
            self.assertEqual(nd.tolist(), [[], [], []])

            nd = ndarray([1, 2, 3], shape=[3, 3, 0], flags=flags)
            self.assertEqual(nd.tolist(),
                             [[[], [], []], [[], [], []], [[], [], []]])
    def test_ndarray_zero_strides(self):
        # zero strides
        for flags in (0, ND_PIL):
            nd = ndarray([1], shape=[5], strides=[0], flags=flags)
            mv = memoryview(nd)
            self.assertEqual(mv, nd)
            self.assertEqual(nd.tolist(), [1, 1, 1, 1, 1])
            self.assertEqual(mv.tolist(), [1, 1, 1, 1, 1])
    def test_ndarray_offset(self):
        nd = ndarray(list(range(20)), shape=[3], offset=7)
        self.assertEqual(nd.offset, 7)
        self.assertEqual(nd.tolist(), [7, 8, 9])
    def test_ndarray_memoryview_from_buffer(self):
        for flags in (0, ND_PIL):
            nd = ndarray(list(range(3)), shape=[3], flags=flags)
            m = nd.memoryview_from_buffer()
            self.assertEqual(m, nd)
    def test_ndarray_get_pointer(self):
        for flags in (0, ND_PIL):
            nd = ndarray(list(range(3)), shape=[3], flags=flags)
            for i in range(3):
                self.assertEqual(nd[i], get_pointer(nd, [i]))
    def test_ndarray_tolist_null_strides(self):
        ex = ndarray(list(range(20)), shape=[2, 2, 5])

        nd = ndarray(ex, getbuf=PyBUF_ND | PyBUF_FORMAT)
        self.assertEqual(nd.tolist(), ex.tolist())

        m = memoryview(ex)
        self.assertEqual(m.tolist(), ex.tolist())
    def test_ndarray_cmp_contig(self):

        self.assertFalse(cmp_contig(b'123', b'456'))

        x = ndarray(list(range(12)), shape=[3, 4])
        y = ndarray(list(range(12)), shape=[4, 3])
        self.assertFalse(cmp_contig(x, y))

        x = ndarray([1], shape=[1], format='B')
        self.assertTrue(cmp_contig(x, b'\x01'))
        self.assertTrue(cmp_contig(b'\x01', x))
    def test_ndarray_hash(self):

        a = array.array('L', [1, 2, 3])
        nd = ndarray(a)
        self.assertRaises(ValueError, hash, nd)  # writable exporter

        # one-dimensional
        b = bytes(list(range(12)))

        nd = ndarray(list(range(12)), shape=[12])
        self.assertEqual(hash(nd), hash(b))

        # C-contiguous
        nd = ndarray(list(range(12)), shape=[3, 4])
        self.assertEqual(hash(nd), hash(b))

        nd = ndarray(list(range(12)), shape=[3, 2, 2])
        self.assertEqual(hash(nd), hash(b))

        # Fortran contiguous
        b = bytes(transpose(list(range(12)), shape=[4, 3]))
        nd = ndarray(list(range(12)), shape=[3, 4], flags=ND_FORTRAN)
        self.assertEqual(hash(nd), hash(b))

        b = bytes(transpose(list(range(12)), shape=[2, 3, 2]))
        nd = ndarray(list(range(12)), shape=[2, 3, 2], flags=ND_FORTRAN)
        self.assertEqual(hash(nd), hash(b))

        # suboffsets
        b = bytes(list(range(12)))
        nd = ndarray(list(range(12)), shape=[2, 2, 3], flags=ND_PIL)
        self.assertEqual(hash(nd), hash(b))

        # non-byte formats
        nd = ndarray(list(range(12)), shape=[2, 2, 3], format='L')
        self.assertEqual(hash(nd), hash(nd.tobytes()))
    def test_py_buffer_to_contiguous(self):
        requests = (
            # distinct flags
            PyBUF_INDIRECT, PyBUF_STRIDES, PyBUF_ND, PyBUF_SIMPLE,
            # compound requests
            PyBUF_FULL, PyBUF_FULL_RO,
            PyBUF_RECORDS, PyBUF_RECORDS_RO,
            PyBUF_STRIDED, PyBUF_STRIDED_RO,
            PyBUF_CONTIG, PyBUF_CONTIG_RO,
        )

        # no buffer interface
        self.assertRaises(TypeError, py_buffer_to_contiguous, {}, 'F',
                          PyBUF_FULL_RO)

        # scalar
        nd = ndarray(9, shape=(), format='L', flags=ND_WRITABLE)
        for order in ['C', 'F', 'A']:
            for request in requests:
                b = py_buffer_to_contiguous(nd, order, request)
                self.assertEqual(b, nd.tobytes())

        # zeros in shape
        nd = ndarray([1], shape=[0], format='L', flags=ND_WRITABLE)
        for order in ['C', 'F', 'A']:
            for request in requests:
                b = py_buffer_to_contiguous(nd, order, request)
                self.assertEqual(b, b'')

        nd = ndarray(list(range(8)), shape=[2, 0, 7], format='L',
                     flags=ND_WRITABLE)
        for order in ['C', 'F', 'A']:
            for request in requests:
                b = py_buffer_to_contiguous(nd, order, request)
                self.assertEqual(b, b'')

        # one-dimensional: 'C' and 'F' order coincide
        for f in [0, ND_FORTRAN]:
            nd = ndarray([1], shape=[1], format='h', flags=f | ND_WRITABLE)
            ndbytes = nd.tobytes()
            for order in ['C', 'F', 'A']:
                for request in requests:
                    b = py_buffer_to_contiguous(nd, order, request)
                    self.assertEqual(b, ndbytes)

            nd = ndarray([1, 2, 3], shape=[3], format='b',
                         flags=f | ND_WRITABLE)
            ndbytes = nd.tobytes()
            for order in ['C', 'F', 'A']:
                for request in requests:
                    b = py_buffer_to_contiguous(nd, order, request)
                    self.assertEqual(b, ndbytes)

        # one-dimensional, non-contiguous
        nd = ndarray([1, 2, 3], shape=[2], strides=[2], flags=ND_WRITABLE)
        ndbytes = nd.tobytes()
        for order in ['C', 'F', 'A']:
            for request in [PyBUF_STRIDES, PyBUF_FULL]:
                b = py_buffer_to_contiguous(nd, order, request)
                self.assertEqual(b, ndbytes)

        nd = nd[::-1]
        ndbytes = nd.tobytes()
        for order in ['C', 'F', 'A']:
            for request in requests:
                try:
                    b = py_buffer_to_contiguous(nd, order, request)
                except BufferError:
                    continue
                self.assertEqual(b, ndbytes)

        # multi-dimensional
        lst = list(range(12))
        for f in [0, ND_FORTRAN]:
            nd = ndarray(lst, shape=[3, 4], flags=f | ND_WRITABLE)
            if numpy_array:
                na = numpy_array(buffer=bytearray(lst), shape=[3, 4],
                                 dtype='B', order='C' if f == 0 else 'F')

            # 'C' request
            if f == ND_FORTRAN:  # 'F' to 'C' conversion
                x = ndarray(transpose(lst, [4, 3]), shape=[3, 4],
                            flags=ND_WRITABLE)
                expected = x.tobytes()
            else:
                expected = nd.tobytes()
            for request in requests:
                try:
                    b = py_buffer_to_contiguous(nd, 'C', request)
                except BufferError:
                    continue
                self.assertEqual(b, expected)

                # Check that the output can be used as the basis of a C array:
                y = ndarray([v for v in b], shape=[3, 4], flags=ND_WRITABLE)
                self.assertEqual(memoryview(y), memoryview(nd))

                if numpy_array:
                    self.assertEqual(b, na.tobytes(order='C'))

            # 'F' request
            if f == 0:  # 'C' to 'F' conversion
                x = ndarray(transpose(lst, [3, 4]), shape=[4, 3],
                            flags=ND_WRITABLE)
            else:
                x = ndarray(lst, shape=[3, 4], flags=ND_WRITABLE)
            expected = x.tobytes()
            for request in [PyBUF_FULL, PyBUF_FULL_RO, PyBUF_INDIRECT,
                            PyBUF_STRIDES, PyBUF_ND]:
                try:
                    b = py_buffer_to_contiguous(nd, 'F', request)
                except BufferError:
                    continue
                self.assertEqual(b, expected)

                # Check that the output can be used as the basis of a
                # Fortran array:
                y = ndarray([v for v in b], shape=[3, 4],
                            flags=ND_FORTRAN | ND_WRITABLE)
                self.assertEqual(memoryview(y), memoryview(nd))

                if numpy_array:
                    self.assertEqual(b, na.tobytes(order='F'))

            # 'A' request
            if f == ND_FORTRAN:
                x = ndarray(lst, shape=[3, 4], flags=ND_WRITABLE)
                expected = x.tobytes()
            else:
                expected = nd.tobytes()
            for request in [PyBUF_FULL, PyBUF_FULL_RO, PyBUF_INDIRECT,
                            PyBUF_STRIDES, PyBUF_ND]:
                try:
                    b = py_buffer_to_contiguous(nd, 'A', request)
                except BufferError:
                    continue
                self.assertEqual(b, expected)

                # Check that the output can be used as the basis of an
                # array with the same order as the input:
                y = ndarray([v for v in b], shape=[3, 4],
                            flags=f | ND_WRITABLE)
                self.assertEqual(memoryview(y), memoryview(nd))

                if numpy_array:
                    self.assertEqual(b, na.tobytes(order='A'))

        # multi-dimensional, non-contiguous input
        nd = ndarray(list(range(12)), shape=[3, 4],
                     flags=ND_WRITABLE | ND_PIL)

        # 'C'
        b = py_buffer_to_contiguous(nd, 'C', PyBUF_FULL_RO)
        self.assertEqual(b, nd.tobytes())
        y = ndarray([v for v in b], shape=[3, 4], flags=ND_WRITABLE)
        self.assertEqual(memoryview(y), memoryview(nd))

        # 'F'
        b = py_buffer_to_contiguous(nd, 'F', PyBUF_FULL_RO)
        x = ndarray(transpose(lst, [3, 4]), shape=[4, 3], flags=ND_WRITABLE)
        self.assertEqual(b, x.tobytes())
        y = ndarray([v for v in b], shape=[3, 4],
                    flags=ND_FORTRAN | ND_WRITABLE)
        self.assertEqual(memoryview(y), memoryview(nd))

        # 'A'
        b = py_buffer_to_contiguous(nd, 'A', PyBUF_FULL_RO)
        self.assertEqual(b, nd.tobytes())
        y = ndarray([v for v in b], shape=[3, 4], flags=ND_WRITABLE)
        self.assertEqual(memoryview(y), memoryview(nd))
    def test_memoryview_construction(self):

        items_shape = [(9, []), ([1, 2, 3], [3]),
                       (list(range(2 * 3 * 5)), [2, 3, 5])]

        # NumPy style, C-contiguous:
        for items, shape in items_shape:

            # From PEP-3118 compliant exporter:
            ex = ndarray(items, shape=shape)
            m = memoryview(ex)
            self.assertTrue(m.c_contiguous)
            self.assertTrue(m.contiguous)

            ndim = len(shape)
            strides = strides_from_shape(ndim, shape, 1, 'C')
            lst = carray(items, shape)

            self.verify(m, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=strides,
                        lst=lst)

            # From memoryview:
            m2 = memoryview(m)
            self.verify(m2, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=strides,
                        lst=lst)

            # PyMemoryView_FromBuffer(): no strides
            nd = ndarray(ex, getbuf=PyBUF_CONTIG_RO | PyBUF_FORMAT)
            self.assertEqual(nd.strides, ())
            m = nd.memoryview_from_buffer()
            self.verify(m, obj=None,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=strides,
                        lst=lst)

            # PyMemoryView_FromBuffer(): no format, shape, strides
            nd = ndarray(ex, getbuf=PyBUF_SIMPLE)
            self.assertEqual(nd.format, '')
            self.assertEqual(nd.shape, ())
            self.assertEqual(nd.strides, ())
            m = nd.memoryview_from_buffer()

            lst = [items] if ndim == 0 else items
            self.verify(m, obj=None,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=1, shape=[ex.nbytes], strides=(1,),
                        lst=lst)

        # NumPy style, Fortran contiguous:
        for items, shape in items_shape:

            # From PEP-3118 compliant exporter:
            ex = ndarray(items, shape=shape, flags=ND_FORTRAN)
            m = memoryview(ex)
            self.assertTrue(m.f_contiguous)
            self.assertTrue(m.contiguous)

            ndim = len(shape)
            strides = strides_from_shape(ndim, shape, 1, 'F')
            lst = farray(items, shape)

            self.verify(m, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=strides,
                        lst=lst)

            # From memoryview:
            m2 = memoryview(m)
            self.verify(m2, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=strides,
                        lst=lst)

        # PIL style:
        for items, shape in items_shape[1:]:

            # From PEP-3118 compliant exporter:
            ex = ndarray(items, shape=shape, flags=ND_PIL)
            m = memoryview(ex)

            ndim = len(shape)
            lst = carray(items, shape)

            self.verify(m, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=ex.strides,
                        lst=lst)

            # From memoryview:
            m2 = memoryview(m)
            self.verify(m2, obj=ex,
                        itemsize=1, fmt='B', readonly=1,
                        ndim=ndim, shape=shape, strides=ex.strides,
                        lst=lst)

        # Invalid number of arguments:
        self.assertRaises(TypeError, memoryview, b'9', 'x')
        # Not a buffer provider:
        self.assertRaises(TypeError, memoryview, {})
        # Non-compliant buffer provider:
        ex = ndarray([1, 2, 3], shape=[3])
        nd = ndarray(ex, getbuf=PyBUF_SIMPLE)
        self.assertRaises(BufferError, memoryview, nd)
        nd = ndarray(ex, getbuf=PyBUF_CONTIG_RO | PyBUF_FORMAT)
        self.assertRaises(BufferError, memoryview, nd)

        # ndim > 64:
        nd = ndarray([1] * 128, shape=[1] * 128, format='L')
        self.assertRaises(ValueError, memoryview, nd)
        self.assertRaises(ValueError, nd.memoryview_from_buffer)
        self.assertRaises(ValueError, get_contiguous, nd, PyBUF_READ, 'C')
        self.assertRaises(ValueError, get_contiguous, nd, PyBUF_READ, 'F')
        self.assertRaises(ValueError, get_contiguous, nd[::-1],
                          PyBUF_READ, 'C')
    def test_memoryview_cast_zero_shape(self):
        # Casts are undefined if the buffer is multidimensional and the
        # shape contains zeros. Such arrays are still regarded as
        # C-contiguous, but the cast is rejected.
        items = [1, 2, 3]
        for shape in ([0, 3, 3], [3, 0, 3], [3, 3, 0]):
            ex = ndarray(items, shape=shape)
            self.assertTrue(ex.c_contiguous)
            msrc = memoryview(ex)
            self.assertRaises(TypeError, msrc.cast, 'c')

        # An empty one-dimensional view can be cast:
        for fmt, _, _ in iter_format(1, 'memoryview'):
            msrc = memoryview(b'')
            m = msrc.cast(fmt)
            self.assertEqual(m.tobytes(), b'')
            self.assertEqual(m.tolist(), [])
    check_sizeof = support.check_sizeof
    def test_memoryview_sizeof(self):
        check = self.check_sizeof
        vsize = support.calcvobjsize
        base_struct = 'Pnin 2P2n2i5P P'
        per_dim = '3n'

        items = list(range(8))
        check(memoryview(b''), vsize(base_struct + 1 * per_dim))
        a = ndarray(items, shape=[2, 4], format='b')
        check(memoryview(a), vsize(base_struct + 2 * per_dim))
        a = ndarray(items, shape=[2, 2, 2], format='b')
        check(memoryview(a), vsize(base_struct + 3 * per_dim))
    def test_memoryview_struct_module(self):

        class INT(object):
            def __init__(self, val):
                self.val = val
            def __int__(self):
                return self.val

        class IDX(object):
            def __init__(self, val):
                self.val = val
            def __index__(self):
                return self.val

        def f():
            return 7

        values = [INT(9), IDX(9),
                  2.2 + 3j, Decimal('-21.1'), 12.2, Fraction(5, 2),
                  [1, 2, 3], {4, 5, 6}, {7: 8}, (), (9,),
                  True, False, None, NotImplemented,
                  b'a', b'abc', bytearray(b'a'), bytearray(b'abc'),
                  'a', 'abc', 'a', 'abc', f, lambda x: x]

        for fmt, items, item in iter_format(10, 'memoryview'):
            ex = ndarray(items, shape=[10], format=fmt, flags=ND_WRITABLE)
            nd = ndarray(items, shape=[10], format=fmt, flags=ND_WRITABLE)
            m = memoryview(ex)

            struct.pack_into(fmt, nd, 0, item)
            m[0] = item
            self.assertEqual(m[0], nd[0])

            itemsize = struct.calcsize(fmt)
            if 'P' in fmt:
                continue

            for v in values:
                struct_err = None
                try:
                    struct.pack_into(fmt, nd, itemsize, v)
                except struct.error:
                    struct_err = struct.error

                mv_err = None
                try:
                    m[1] = v
                except (TypeError, ValueError) as e:
                    mv_err = e.__class__

                if struct_err or mv_err:
                    self.assertIsNot(struct_err, None)
                    self.assertIsNot(mv_err, None)
                else:
                    self.assertEqual(m[1], nd[1])
    def test_memoryview_cast_zero_strides(self):
        # Casts are undefined if strides contains zeros: such an array
        # is not contiguous, so it cannot be cast.
        ex = ndarray([1, 2, 3], shape=[3], strides=[0])
        self.assertFalse(ex.c_contiguous)
        msrc = memoryview(ex)
        self.assertRaises(TypeError, msrc.cast, 'c')
    def test_memoryview_cast_invalid(self):
        # invalid format
        for sfmt in NON_BYTE_FORMAT:
            sformat = '@' + sfmt if randrange(2) else sfmt
            ssize = struct.calcsize(sformat)
            for dfmt in NON_BYTE_FORMAT:
                dformat = '@' + dfmt if randrange(2) else dfmt
                dsize = struct.calcsize(dformat)
                ex = ndarray(list(range(32)), shape=[32 // ssize],
                             format=sformat)
                msrc = memoryview(ex)
                self.assertRaises(TypeError, msrc.cast, dfmt,
                                  [32 // dsize])

        for sfmt, sitems, _ in iter_format(1):
            ex = ndarray(sitems, shape=[1], format=sfmt)
            msrc = memoryview(ex)
            for dfmt, _, _ in iter_format(1):
                if not is_memoryview_format(dfmt):
                    self.assertRaises(ValueError, msrc.cast, dfmt,
                                      [32 // dsize])
                elif not is_byte_format(sfmt) and not is_byte_format(dfmt):
                    self.assertRaises(TypeError, msrc.cast, dfmt,
                                      [32 // dsize])

        # itemsize mismatch
        size_h = struct.calcsize('h')
        size_d = struct.calcsize('d')
        ex = ndarray(list(range(2 * 2 * size_d)), shape=[2, 2, size_d],
                     format='h')
        msrc = memoryview(ex)
        self.assertRaises(TypeError, msrc.cast, shape=[2, 2, size_h],
                          format='d')

        # invalid number of arguments
        ex = ndarray(list(range(120)), shape=[1, 2, 3, 4, 5])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.cast)
        self.assertRaises(TypeError, m.cast, 1, 2, 3)

        # format not a string
        self.assertRaises(TypeError, m.cast, {})

        # invalid format character
        self.assertRaises(ValueError, m.cast, 'X')
        self.assertRaises(ValueError, m.cast, '@X')
        self.assertRaises(ValueError, m.cast, '@XY')

        # destination format not implemented
        self.assertRaises(ValueError, m.cast, '=B')
        self.assertRaises(ValueError, m.cast, '!L')
        self.assertRaises(ValueError, m.cast, '<P')
        self.assertRaises(ValueError, m.cast, '>l')
        self.assertRaises(ValueError, m.cast, 'BI')
        self.assertRaises(ValueError, m.cast, 'xBI')

        # source format not implemented
        ex = ndarray([(1, 2), (3, 4)], shape=[2], format='II')
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.__getitem__, 0)
        self.assertRaises(NotImplementedError, m.__setitem__, 0, 8)
        self.assertRaises(NotImplementedError, m.tolist)

        # invalid shape type
        ex = ndarray(list(range(120)), shape=[1, 2, 3, 4, 5])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.cast, 'B', shape={})

        # invalid shape elements
        ex = ndarray(list(range(120)), shape=[2 * 3 * 4 * 5])
        m = memoryview(ex)
        self.assertRaises(OverflowError, m.cast, 'B', shape=[2 ** 64])
        self.assertRaises(ValueError, m.cast, 'B', shape=[-1])
        self.assertRaises(ValueError, m.cast, 'B',
                          shape=[2, 3, 4, 5, 6, 7, -1])
        self.assertRaises(ValueError, m.cast, 'B',
                          shape=[2, 3, 4, 5, 6, 7, 0])
        self.assertRaises(TypeError, m.cast, 'B',
                          shape=[2, 3, 4, 5, 6, 7, 'x'])

        # N-D -> N-D cast
        ex = ndarray([9] * (3 * 5 * 7 * 11), shape=[3, 5, 7, 11])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.cast, 'I', shape=[2, 3, 4, 5])

        # cast with ndim > 64
        nd = ndarray(list(range(128)), shape=[128], format='I')
        m = memoryview(nd)
        self.assertRaises(ValueError, m.cast, 'I', [1] * 128)

        # view length not a multiple of the destination itemsize
        ex = ndarray([9] * (3 * 5 * 7 * 11), shape=[3 * 5 * 7 * 11])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.cast, 'I', shape=[2, 3, 4, 5])

        # prod(shape) * itemsize != buffer size
        ex = ndarray([9] * (3 * 5 * 7 * 11), shape=[3 * 5 * 7 * 11])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.cast, 'B', shape=[2, 3, 4, 5])

        # prod(shape) * itemsize overflow
        nd = ndarray(list(range(128)), shape=[128], format='I')
        m1 = memoryview(nd)
        nd = ndarray(list(range(128)), shape=[128], format='B')
        m2 = memoryview(nd)
        if sys.maxsize == 2 ** 63 - 1:
            self.assertRaises(TypeError, m1.cast, 'B',
                              [7, 7, 73, 127, 337, 92737, 649657])
            self.assertRaises(ValueError, m1.cast, 'B',
                              [2 ** 20, 2 ** 20, 2 ** 10, 2 ** 10, 2 ** 3])
            self.assertRaises(ValueError, m2.cast, 'I',
                              [2 ** 20, 2 ** 20, 2 ** 10, 2 ** 10, 2 ** 1])
        else:
            self.assertRaises(TypeError, m1.cast, 'B', [1, 2147483647])
            self.assertRaises(ValueError, m1.cast, 'B',
                              [2 ** 10, 2 ** 10, 2 ** 5, 2 ** 5, 2 ** 1])
            self.assertRaises(ValueError, m2.cast, 'I',
                              [2 ** 10, 2 ** 10, 2 ** 5, 2 ** 3, 2 ** 1])
    def test_memoryview_cast(self):
        bytespec = (
            ('B', lambda ex: list(ex.tobytes())),
            ('b', lambda ex: [x - 256 if x > 127 else x
                              for x in list(ex.tobytes())]),
            ('c', lambda ex: [bytes(chr(x), 'latin-1')
                              for x in list(ex.tobytes())]),
        )

        def iter_roundtrip(ex, m, items, fmt):
            srcsize = struct.calcsize(fmt)
            for bytefmt, to_bytelist in bytespec:

                m2 = m.cast(bytefmt)
                lst = to_bytelist(ex)
                self.verify(m2, obj=ex,
                            itemsize=1, fmt=bytefmt, readonly=0,
                            ndim=1, shape=[31 * srcsize], strides=(1,),
                            lst=lst, cast=True)

                m3 = m2.cast(fmt)
                self.assertEqual(m3, ex)
                lst = ex.tolist()
                self.verify(m3, obj=ex,
                            itemsize=srcsize, fmt=fmt, readonly=0,
                            ndim=1, shape=[31], strides=(srcsize,),
                            lst=lst, cast=True)

        # cast from ndim = 0 to ndim = 1
        srcsize = struct.calcsize('I')
        ex = ndarray(9, shape=[], format='I')
        destitems, destshape = cast_items(ex, 'B', 1)
        m = memoryview(ex)
        m2 = m.cast('B')
        self.verify(m2, obj=ex,
                    itemsize=1, fmt='B', readonly=1,
                    ndim=1, shape=destshape, strides=(1,),
                    lst=destitems, cast=True)

        # cast from ndim = 1 to ndim = 0
        destsize = struct.calcsize('I')
        ex = ndarray([9] * destsize, shape=[destsize], format='B')
        destitems, destshape = cast_items(ex, 'I', destsize, shape=[])
        m = memoryview(ex)
        m2 = m.cast('I', shape=[])
        self.verify(m2, obj=ex,
                    itemsize=destsize, fmt='I', readonly=1,
                    ndim=0, shape=(), strides=(),
                    lst=destitems, cast=True)

        # array.array: roundtrip to/from bytes
        for fmt, items, _ in iter_format(31, 'array'):
            ex = array.array(fmt, items)
            m = memoryview(ex)
            iter_roundtrip(ex, m, items, fmt)

        # ndarray: roundtrip to/from bytes
        for fmt, items, _ in iter_format(31, 'memoryview'):
            ex = ndarray(items, shape=[31], format=fmt, flags=ND_WRITABLE)
            m = memoryview(ex)
            iter_roundtrip(ex, m, items, fmt)
    def test_memoryview_cast_1D_ND(self):
        # Cast between C-contiguous buffers. At least one of the buffers
        # must be 1-D.
        for _tshape in gencastshapes():
            for char in fmtdict['@']:
                tfmt = ('', '@')[randrange(2)] + char
                tsize = struct.calcsize(tfmt)
                n = prod(_tshape) * tsize
                obj = 'memoryview' if is_byte_format(tfmt) else 'bytefmt'
                for fmt, items, _ in iter_format(n, obj):
                    size = struct.calcsize(fmt)
                    shape = [n] if n > 0 else []
                    tshape = _tshape + [size]

                    ex = ndarray(items, shape=shape, format=fmt)
                    m = memoryview(ex)

                    titems, tshape = cast_items(ex, tfmt, tsize, shape=tshape)

                    if titems is None:
                        self.assertRaises(TypeError, m.cast, tfmt, tshape)
                        continue
                    if titems == 'nan':
                        continue  # NaNs in lists are a recipe for trouble.

                    # 1-D -> N-D
                    nd = ndarray(titems, shape=tshape, format=tfmt)
                    m2 = m.cast(tfmt, shape=tshape)
                    ndim = len(tshape)
                    strides = nd.strides
                    lst = nd.tolist()
                    self.verify(m2, obj=ex,
                                itemsize=tsize, fmt=tfmt, readonly=1,
                                ndim=ndim, shape=tshape, strides=strides,
                                lst=lst, cast=True)

                    # N-D -> 1-D
                    m3 = m2.cast(fmt)
                    m4 = m2.cast(fmt, shape=shape)
                    ndim = len(shape)
                    strides = ex.strides
                    lst = ex.tolist()

                    self.verify(m3, obj=ex,
                                itemsize=size, fmt=fmt, readonly=1,
                                ndim=ndim, shape=shape, strides=strides,
                                lst=lst, cast=True)
                    self.verify(m4, obj=ex,
                                itemsize=size, fmt=fmt, readonly=1,
                                ndim=ndim, shape=shape, strides=strides,
                                lst=lst, cast=True)

        if ctypes:
            # ctypes exporters are writable through a cast view
            class BEPoint(ctypes.BigEndianStructure):
                _fields_ = [('x', ctypes.c_long), ('y', ctypes.c_double)]
            point = BEPoint(100, 200.1)
            m1 = memoryview(point)
            m2 = m1.cast('B')
            self.assertEqual(m2.obj, point)
            self.assertEqual(m2.itemsize, 1)
            self.assertEqual(m2.readonly, 0)
            self.assertEqual(m2.ndim, 1)
            self.assertEqual(m2.shape, (m2.nbytes,))
            self.assertEqual(m2.strides, (1,))
            self.assertEqual(m2.suboffsets, ())

            x = ctypes.c_double(1.2)
            m1 = memoryview(x)
            m2 = m1.cast('c')
            self.assertEqual(m2.obj, x)
            self.assertEqual(m2.itemsize, 1)
            self.assertEqual(m2.readonly, 0)
            self.assertEqual(m2.ndim, 1)
            self.assertEqual(m2.shape, (m2.nbytes,))
            self.assertEqual(m2.strides, (1,))
            self.assertEqual(m2.suboffsets, ())
    def test_memoryview_tolist(self):

        a = array.array('h', list(range(-6, 6)))
        m = memoryview(a)
        self.assertEqual(m, a)
        self.assertEqual(m.tolist(), a.tolist())

        a = a[2::3]
        m = m[2::3]
        self.assertEqual(m, a)
        self.assertEqual(m.tolist(), a.tolist())

        ex = ndarray(list(range(2 * 3 * 5 * 7 * 11)),
                     shape=[11, 2, 7, 3, 5], format='L')
        m = memoryview(ex)
        self.assertEqual(m.tolist(), ex.tolist())

        ex = ndarray([(2, 5), (7, 11)], shape=[2], format='lh')
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.tolist)

        ex = ndarray([b'12345'], shape=[1], format='s')
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.tolist)

        ex = ndarray([b'a', b'b', b'c', b'd', b'e', b'f'], shape=[2, 3],
                     format='s')
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.tolist)
    def test_memoryview_repr(self):
        m = memoryview(bytearray(9))
        r = m.__repr__()
        self.assertTrue(r.startswith('<memory'))

        m.release()
        r = m.__repr__()
        self.assertTrue(r.startswith('<released'))
    def test_memoryview_sequence(self):

        for fmt in ('d', 'f'):
            inf = float(1e1000)
            ex = array.array(fmt, [1.0, inf, 3.0])
            m = memoryview(ex)
            self.assertIn(1.0, m)
            self.assertIn(1e1000, m)
            self.assertIn(3.0, m)

        ex = ndarray(9.0, [], format='f')
        m = memoryview(ex)
        self.assertRaises(TypeError, eval, '9.0 in m', locals())
    @contextlib.contextmanager
    def assert_out_of_bounds_error(self, dim):
        with self.assertRaises(IndexError) as cm:
            yield
        self.assertEqual(str(cm.exception),
                         'index out of bounds on dimension %d' % (dim,))
    def test_memoryview_index(self):

        # ndim = 0
        ex = ndarray(12.5, shape=[], format='d')
        m = memoryview(ex)
        self.assertEqual(m[()], 12.5)
        self.assertEqual(m[...], m)
        self.assertEqual(m[...], ex)
        self.assertRaises(TypeError, m.__getitem__, 0)

        ex = ndarray((1, 2, 3), shape=[], format='iii')
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.__getitem__, ())

        # range
        ex = ndarray(list(range(7)), shape=[7], flags=ND_WRITABLE)
        m = memoryview(ex)

        self.assertRaises(IndexError, m.__getitem__, 2 ** 64)
        self.assertRaises(TypeError, m.__getitem__, 2.0)
        self.assertRaises(TypeError, m.__getitem__, 0.0)

        # out of bounds
        self.assertRaises(IndexError, m.__getitem__, -8)
        self.assertRaises(IndexError, m.__getitem__, 8)

        # multi-dimensional
        ex = ndarray(list(range(12)), shape=[3, 4], flags=ND_WRITABLE)
        m = memoryview(ex)

        self.assertEqual(m[0, 0], 0)
        self.assertEqual(m[2, 0], 8)
        self.assertEqual(m[2, 3], 11)
        self.assertEqual(m[-1, -1], 11)
        self.assertEqual(m[-3, -4], 0)

        # out of bounds
        for index in (3, -4):
            with self.assert_out_of_bounds_error(dim=1):
                m[index, 0]
        for index in (4, -5):
            with self.assert_out_of_bounds_error(dim=2):
                m[0, index]
        self.assertRaises(IndexError, m.__getitem__, (2 ** 64, 0))
        self.assertRaises(IndexError, m.__getitem__, (0, 2 ** 64))

        self.assertRaises(TypeError, m.__getitem__, (0, 0, 0))
        self.assertRaises(TypeError, m.__getitem__, (0.0, 0.0))

        # Not implemented: multidimensional sub-views
        self.assertRaises(NotImplementedError, m.__getitem__, ())
        self.assertRaises(NotImplementedError, m.__getitem__, 0)
    def test_memoryview_assign(self):

        # ndim = 0
        ex = ndarray(12.5, shape=[], format='f', flags=ND_WRITABLE)
        m = memoryview(ex)
        m[()] = 22.5
        self.assertEqual(m[()], 22.5)
        m[...] = 23.5
        self.assertEqual(m[()], 23.5)
        self.assertRaises(TypeError, m.__setitem__, 0, 24.7)

        # read-only
        ex = ndarray(list(range(7)), shape=[7])
        m = memoryview(ex)
        self.assertRaises(TypeError, m.__setitem__, 2, 10)

        # range
        ex = ndarray(list(range(7)), shape=[7], flags=ND_WRITABLE)
        m = memoryview(ex)

        self.assertRaises(IndexError, m.__setitem__, 2 ** 64, 9)
        self.assertRaises(TypeError, m.__setitem__, 2.0, 10)
        self.assertRaises(TypeError, m.__setitem__, 0.0, 11)

        # out of bounds
        self.assertRaises(IndexError, m.__setitem__, -8, 20)
        self.assertRaises(IndexError, m.__setitem__, 8, 25)

        # pack_single() success:
        for fmt in fmtdict['@']:
            if fmt == 'c' or fmt == '?':
                continue
            ex = ndarray([1, 2, 3], shape=[3], format=fmt,
                         flags=ND_WRITABLE)
            m = memoryview(ex)
            i = randrange(-3, 3)
            m[i] = 8
            self.assertEqual(m[i], 8)
            self.assertEqual(m[i], ex[i])

        ex = ndarray([b'1', b'2', b'3'], shape=[3], format='c',
                     flags=ND_WRITABLE)
        m = memoryview(ex)
        m[2] = b'9'
        self.assertEqual(m[2], b'9')

        ex = ndarray([True, False, True], shape=[3], format='?',
                     flags=ND_WRITABLE)
        m = memoryview(ex)
        m[1] = True
        self.assertEqual(m[1], True)

        # pack_single() exceptions:
        nd = ndarray([b'x'], shape=[1], format='c', flags=ND_WRITABLE)
        m = memoryview(nd)
        self.assertRaises(TypeError, m.__setitem__, 0, 100)

        ex = ndarray(list(range(120)), shape=[1, 2, 3, 4, 5],
                     flags=ND_WRITABLE)
        m1 = memoryview(ex)

        for fmt, _range in fmtdict['@'].items():
            if fmt == '?':  # PyObject_IsTrue() accepts anything
                continue
            if fmt == 'c':  # tested above
                continue
            m2 = m1.cast(fmt)
            lo, hi = _range
            if fmt == 'd' or fmt == 'f':
                lo, hi = -2 ** 1024, 2 ** 1024
            if fmt != 'P':  # PyLong_AsVoidPtr() accepts negative numbers
                self.assertRaises(ValueError, m2.__setitem__, 0, lo - 1)
                self.assertRaises(TypeError, m2.__setitem__, 0, 'xyz')
            self.assertRaises(ValueError, m2.__setitem__, 0, hi)

        # invalid item
        m2 = m1.cast('c')
        self.assertRaises(ValueError, m2.__setitem__, 0, b'\xff\xff')

        # format not implemented
        ex = ndarray(list(range(1)), shape=[1], format='xL',
                     flags=ND_WRITABLE)
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.__setitem__, 0, 1)

        ex = ndarray([b'12345'], shape=[1], format='s', flags=ND_WRITABLE)
        m = memoryview(ex)
        self.assertRaises(NotImplementedError, m.__setitem__, 0, 1)

        # multi-dimensional
        ex = ndarray(list(range(12)), shape=[3, 4], flags=ND_WRITABLE)
        m = memoryview(ex)
        m[0, 1] = 42
        self.assertEqual(ex[0][1], 42)
        m[-1, -1] = 43
        self.assertEqual(ex[2][3], 43)

        # errors
        for index in (3, -4):
            with self.assert_out_of_bounds_error(dim=1):
                m[index, 0] = 0
        for index in (4, -5):
            with self.assert_out_of_bounds_error(dim=2):
                m[0, index] = 0
        self.assertRaises(IndexError, m.__setitem__, (2 ** 64, 0), 0)
        self.assertRaises(IndexError, m.__setitem__, (0, 2 ** 64), 0)

        self.assertRaises(TypeError, m.__setitem__, (0, 0, 0), 0)
        self.assertRaises(TypeError, m.__setitem__, (0.0, 0.0), 0)

        # Not implemented: multidimensional sub-views
        self.assertRaises(NotImplementedError, m.__setitem__, 0, [2, 3])
    def test_memoryview_slice(self):

        ex = ndarray(list(range(12)), shape=[12], flags=ND_WRITABLE)
        m = memoryview(ex)

        # zero step
        self.assertRaises(ValueError, m.__getitem__, slice(0, 2, 0))
        self.assertRaises(ValueError, m.__setitem__, slice(0, 2, 0),
                          bytearray([1, 2]))

        # 0-dim slicing (identity function)
        self.assertRaises(NotImplementedError, m.__getitem__, ())

        # multidimensional slices
        ex = ndarray(list(range(12)), shape=[12], flags=ND_WRITABLE)
        m = memoryview(ex)

        self.assertRaises(NotImplementedError, m.__getitem__,
                          (slice(0, 2, 1), slice(0, 2, 1)))
        self.assertRaises(NotImplementedError, m.__setitem__,
                          (slice(0, 2, 1), slice(0, 2, 1)),
                          bytearray([1, 2]))

        # invalid slice tuple
        self.assertRaises(TypeError, m.__getitem__, (slice(0, 2, 1), {}))
        self.assertRaises(TypeError, m.__setitem__, (slice(0, 2, 1), {}),
                          bytearray([1, 2]))

        # rvalue is not an exporter
        self.assertRaises(TypeError, m.__setitem__, slice(0, 1, 1), [1])

        # non-contiguous slice assignment
        for flags in (0, ND_PIL):
            ex1 = ndarray(list(range(12)), shape=[12], strides=[-1],
                          offset=11, flags=ND_WRITABLE | flags)
            ex2 = ndarray(list(range(24)), shape=[12], strides=[2],
                          flags=flags)
            m1 = memoryview(ex1)
            m2 = memoryview(ex2)

            ex1[2:5] = ex1[2:5]
            m1[2:5] = m2[2:5]

            self.assertEqual(m1, ex1)
            self.assertEqual(m2, ex2)

            ex1[1:3][::-1] = ex2[0:2][::1]
            m1[1:3][::-1] = m2[0:2][::1]

            self.assertEqual(m1, ex1)
            self.assertEqual(m2, ex2)

            ex1[4:1:-2][::-1] = ex1[1:4:2][::1]
            m1[4:1:-2][::-1] = m1[1:4:2][::1]

            self.assertEqual(m1, ex1)
            self.assertEqual(m2, ex2)
    def test_memoryview_array(self):

        def cmptest(testcase, a, b, m, singleitem):
            for i, _ in enumerate(a):
                ai = a[i]
                mi = m[i]
                testcase.assertEqual(ai, mi)
                a[i] = singleitem
                if singleitem != ai:
                    testcase.assertNotEqual(a, m)
                    testcase.assertNotEqual(a, b)
                else:
                    testcase.assertEqual(a, m)
                    testcase.assertEqual(a, b)
                m[i] = singleitem
                testcase.assertEqual(a, m)
                testcase.assertEqual(b, m)
                a[i] = ai
                m[i] = mi

        for n in range(1, 5):
            for fmt, items, singleitem in iter_format(n, 'array'):
                for lslice in genslices(n):
                    for rslice in genslices(n):

                        a = array.array(fmt, items)
                        b = array.array(fmt, items)
                        m = memoryview(b)

                        self.assertEqual(m, a)
                        self.assertEqual(m.tolist(), a.tolist())
                        self.assertEqual(m.tobytes(), a.tobytes())
                        self.assertEqual(len(m), len(a))

                        cmptest(self, a, b, m, singleitem)

                        array_err = None
                        have_resize = None
                        try:
                            al = a[lslice]
                            ar = a[rslice]
                            a[lslice] = a[rslice]
                            have_resize = len(al) != len(ar)
                        except Exception as e:
                            array_err = e.__class__

                        m_err = None
                        try:
                            m[lslice] = m[rslice]
                        except Exception as e:
                            m_err = e.__class__

                        if have_resize:  # memoryview cannot change shape
                            self.assertIs(m_err, ValueError)
                        elif m_err or array_err:
                            self.assertIs(m_err, array_err)
                        else:
                            self.assertEqual(m, a)
                            self.assertEqual(m.tolist(), a.tolist())
                            self.assertEqual(m.tobytes(), a.tobytes())
                            cmptest(self, a, b, m, singleitem)
    def test_memoryview_compare_special_cases(self):

        a = array.array('L', [1, 2, 3])
        b = array.array('L', [1, 2, 7])

        # Ordering comparisons raise:
        v = memoryview(a)
        w = memoryview(b)
        for attr in ('__lt__', '__le__', '__gt__', '__ge__'):
            self.assertIs(getattr(v, attr)(w), NotImplemented)
            self.assertIs(getattr(a, attr)(v), NotImplemented)

        # Released views compare equal to themselves:
        v = memoryview(a)
        v.release()
        self.assertEqual(v, v)
        self.assertNotEqual(v, a)
        self.assertNotEqual(a, v)

        v = memoryview(a)
        w = memoryview(a)
        w.release()
        self.assertNotEqual(v, w)
        self.assertNotEqual(w, v)

        # Operand does not implement the buffer protocol:
        v = memoryview(a)
        self.assertNotEqual(v, [1, 2, 3])

        # NaNs
        nd = ndarray([(0, 0)], shape=[1], format='l x d x',
                     flags=ND_WRITABLE)
        nd[0] = (-1, float('nan'))
        self.assertNotEqual(memoryview(nd), nd)

        # The struct module does not understand the 'u' format:
        a = array.array('u', 'xyz')
        v = memoryview(a)
        self.assertNotEqual(a, v)
        self.assertNotEqual(v, a)

        # Unsupported ctypes format strings compare unequal:
        if ctypes:
            class BEPoint(ctypes.BigEndianStructure):
                _fields_ = [('x', ctypes.c_long), ('y', ctypes.c_long)]
            point = BEPoint(100, 200)
            a = memoryview(point)
            b = memoryview(point)
            self.assertNotEqual(a, b)
            self.assertNotEqual(a, point)
            self.assertNotEqual(point, a)
            self.assertRaises(NotImplementedError, a.tolist)
def test_memoryview_compare_ndim_zero(self):
nd1 = ndarray(1729, shape=[], format='@L')
nd2 = ndarray(1729, shape=[], format='L', flags=ND_WRITABLE)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, w)
self.assertEqual(w, v)
self.assertEqual(v, nd2)
self.assertEqual(nd2, v)
self.assertEqual(w, nd1)
self.assertEqual(nd1, w)
self.assertFalse(v.__ne__(w))
self.assertFalse(w.__ne__(v))
w[()] = 1728
self.assertNotEqual(v, w)
self.assertNotEqual(w, v)
self.assertNotEqual(v, nd2)
self.assertNotEqual(nd2, v)
self.assertNotEqual(w, nd1)
self.assertNotEqual(nd1, w)
self.assertFalse(v.__eq__(w))
self.assertFalse(w.__eq__(v))
nd = ndarray(list(range(12)), shape=[12], flags=ND_WRITABLE | ND_PIL)
ex = ndarray(list(range(12)), shape=[12], flags=ND_WRITABLE | ND_PIL)
m = memoryview(ex)
self.assertEqual(m, nd)
m[9] = 100
self.assertNotEqual(m, nd)
nd1 = ndarray((1729, 1.2, b'12345'), shape=[], format='Lf5s')
        nd2 = ndarray((1729, 1.2, b'12345'), shape=[], format='hf5s', flags=ND_WRITABLE)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, w)
self.assertEqual(w, v)
self.assertEqual(v, nd2)
self.assertEqual(nd2, v)
self.assertEqual(w, nd1)
self.assertEqual(nd1, w)
nd1 = ndarray((1729, 1.2, b'12345'), shape=[], format='Lf5s')
nd2 = ndarray((-1729, 1.2, b'12345'), shape=[], format='hf5s',
flags=ND_WRITABLE)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertNotEqual(v, w)
self.assertNotEqual(w, v)
self.assertNotEqual(v, nd2)
self.assertNotEqual(nd2, v)
self.assertNotEqual(w, nd1)
self.assertNotEqual(nd1, w)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
def test_memoryview_compare_ndim_one(self):
nd1 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='@h')
nd2 = ndarray([-529, 576, -625, 676, 729], shape=[5], format='@h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='<i')
nd2 = ndarray([-529, 576, -625, 676, 729], shape=[5], format='>h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray([-529, -625, -729], shape=[3], format='@h')
nd2 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='@h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd2[::2])
self.assertEqual(w[::2], nd1)
self.assertEqual(v, w[::2])
self.assertEqual(v[::-1], w[::-2])
nd1 = ndarray([-529, -625, -729], shape=[3], format='!h')
nd2 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='<l')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd2[::2])
self.assertEqual(w[::2], nd1)
self.assertEqual(v, w[::2])
self.assertEqual(v[::-1], w[::-2])
nd1 = ndarray([-529, -625, -729], shape=[3], format='@h')
nd2 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='@h',
flags=ND_PIL)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd2[::2])
self.assertEqual(w[::2], nd1)
self.assertEqual(v, w[::2])
self.assertEqual(v[::-1], w[::-2])
nd1 = ndarray([-529, -625, -729], shape=[3], format='h 0c')
        nd2 = ndarray([-529, 576, -625, 676, -729], shape=[5], format='> h', flags=ND_PIL)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd2[::2])
self.assertEqual(w[::2], nd1)
self.assertEqual(v, w[::2])
self.assertEqual(v[::-1], w[::-2])
def test_memoryview_compare_zero_shape(self):
nd1 = ndarray([900, 961], shape=[0], format='@h')
nd2 = ndarray([-900, -961], shape=[0], format='@h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
nd1 = ndarray([900, 961], shape=[0], format='= h0c')
nd2 = ndarray([-900, -961], shape=[0], format='@ i')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
def test_memoryview_compare_zero_strides(self):
nd1 = ndarray([900, 900, 900, 900], shape=[4], format='@L')
nd2 = ndarray([900], shape=[4], strides=[0], format='L')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
nd1 = ndarray([(900, 900)] * 4, shape=[4], format='@ Li')
nd2 = ndarray([(900, 900)], shape=[4], strides=[0], format='!L h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
def test_memoryview_compare_random_formats(self):
n = 10
for char in fmtdict['@m']:
fmt, items, singleitem = randitems(n, 'memoryview', '@', char)
for flags in (0, ND_PIL):
nd = ndarray(items, shape=[n], format=fmt, flags=flags)
m = memoryview(nd)
self.assertEqual(m, nd)
nd = nd[::-3]
m = memoryview(nd)
self.assertEqual(m, nd)
n = 10
for _ in range(100):
fmt, items, singleitem = randitems(n)
for flags in (0, ND_PIL):
nd = ndarray(items, shape=[n], format=fmt, flags=flags)
m = memoryview(nd)
self.assertEqual(m, nd)
nd = nd[::-3]
m = memoryview(nd)
self.assertEqual(m, nd)
def test_memoryview_compare_multidim_c(self):
nd1 = ndarray(list(range(-15, 15)), shape=[3, 2, 5], format='@h')
nd2 = ndarray(list(range(0, 30)), shape=[3, 2, 5], format='@h')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray([(0, 1, 2)] * 30, shape=[3, 2, 5], format='=f q xxL')
nd2 = ndarray([(-1.2, 1, 2)] * 30, shape=[3, 2, 5], format='< f 2Q')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray(list(range(30)), shape=[2, 3, 5], format='L')
nd2 = ndarray(list(range(30)), shape=[3, 2, 5], format='L')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray([(0, 1, 2)] * 21, shape=[3, 7], format='! b B xL')
nd2 = ndarray([(0, 1, 2)] * 21, shape=[7, 3], format='= Qx l xxL')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray(list(range(30)), shape=[2, 3, 5], format='L')
nd2 = ndarray(list(range(30)), shape=[2, 3, 5], format='l')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
def test_memoryview_compare_multidim_fortran(self):
nd1 = ndarray(list(range(-15, 15)), shape=[5, 2, 3], format='@h',
flags=ND_FORTRAN)
nd2 = ndarray(list(range(0, 30)), shape=[5, 2, 3], format='@h',
flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray([(2 ** 64 - 1, -1)] * 6, shape=[2, 3], format='=Qq',
flags=ND_FORTRAN)
nd2 = ndarray([(-1, 2 ** 64 - 1)] * 6, shape=[2, 3], format='=qQ',
flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray(list(range(-15, 15)), shape=[2, 3, 5], format='l',
flags=ND_FORTRAN)
nd2 = ndarray(list(range(-15, 15)), shape=[3, 2, 5], format='l',
flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
nd1 = ndarray(list(range(-15, 15)), shape=[2, 3, 5], format='0ll',
flags=ND_FORTRAN)
nd2 = ndarray(list(range(-15, 15)), shape=[3, 2, 5], format='l',
flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
        nd1 = ndarray(list(range(30)), shape=[5, 2, 3], format='@h', flags=ND_FORTRAN)
        nd2 = ndarray(list(range(30)), shape=[5, 2, 3], format='@b', flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
def test_memoryview_compare_multidim_mixed(self):
lst1 = list(range(-15, 15))
lst2 = transpose(lst1, [3, 2, 5])
nd1 = ndarray(lst1, shape=[3, 2, 5], format='@l')
nd2 = ndarray(lst2, shape=[3, 2, 5], format='l', flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, w)
lst1 = [(-3.3, -22, b'x')] * 30
lst1[5] = -2.2, -22, b'x'
lst2 = transpose(lst1, [3, 2, 5])
nd1 = ndarray(lst1, shape=[3, 2, 5], format='d b c')
nd2 = ndarray(lst2, shape=[3, 2, 5], format='d h c', flags=ND_FORTRAN)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, w)
ex1 = ndarray(list(range(40)), shape=[5, 8], format='@I')
nd1 = ex1[3:1:-1, ::-2]
ex2 = ndarray(list(range(40)), shape=[5, 8], format='I')
nd2 = ex2[1:3:1, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
        ex1 = ndarray([(2 ** 31 - 1, -2 ** 31)] * 22, shape=[11, 2], format='=ii')
nd1 = ex1[3:1:-1, ::-2]
        ex2 = ndarray([(2 ** 31 - 1, -2 ** 31)] * 22, shape=[11, 2], format='>ii')
nd2 = ex2[1:3:1, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
ex1 = ndarray(list(range(30)), shape=[2, 3, 5], format='b')
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray(list(range(30)), shape=[3, 2, 5], format='b')
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
ex1 = ndarray(list(range(30)), shape=[2, 3, 5], format='B')
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray(list(range(30)), shape=[3, 2, 5], format='b')
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
ex1 = ndarray([(2, b'123')] * 30, shape=[5, 3, 2], format='b3s')
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray([(2, b'123')] * 30, shape=[5, 3, 2], format='i3s')
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
def test_memoryview_compare_multidim_zero_shape(self):
nd1 = ndarray(list(range(30)), shape=[0, 3, 2], format='i')
nd2 = ndarray(list(range(30)), shape=[5, 0, 2], format='@i')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
def test_memoryview_compare_multidim_zero_strides(self):
nd1 = ndarray([900] * 80, shape=[4, 5, 4], format='@L')
nd2 = ndarray([900], shape=[4, 5, 4], strides=[0, 0, 0], format='L')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
self.assertEqual(v.tolist(), w.tolist())
nd1 = ndarray([(1, 2)] * 10, shape=[2, 5], format='=lQ')
nd2 = ndarray([(1, 2)], shape=[2, 5], strides=[0, 0], format='<lQ')
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
def test_memoryview_compare_multidim_suboffsets(self):
ex1 = ndarray(list(range(40)), shape=[5, 8], format='@I')
nd1 = ex1[3:1:-1, ::-2]
ex2 = ndarray(list(range(40)), shape=[5, 8], format='I', flags=ND_PIL)
nd2 = ex2[1:3:1, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
ex1 = ndarray([(2 ** 64 - 1, -1)] * 40, shape=[5, 8], format='=Qq',
flags=ND_WRITABLE)
ex1[2][7] = 1, -2
nd1 = ex1[3:1:-1, ::-2]
ex2 = ndarray([(2 ** 64 - 1, -1)] * 40, shape=[5, 8], format='>Qq',
flags=ND_PIL | ND_WRITABLE)
ex2[2][7] = 1, -2
nd2 = ex2[1:3:1, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
        ex1 = ndarray(list(range(30)), shape=[2, 3, 5], format='b', flags=ND_PIL)
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray(list(range(30)), shape=[3, 2, 5], format='b')
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
ex1 = ndarray([(2 ** 8 - 1, -1)] * 40, shape=[2, 3, 5], format='Bb',
flags=ND_PIL | ND_WRITABLE)
nd1 = ex1[1:2, ::-2]
ex2 = ndarray([(2 ** 8 - 1, -1)] * 40, shape=[3, 2, 5], format='Bb')
nd2 = ex2[1:2, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
        ex1 = ndarray(list(range(30)), shape=[5, 3, 2], format='i', flags=ND_PIL)
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray(list(range(30)), shape=[5, 3, 2], format='@I', flags=ND_PIL)
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, nd2)
self.assertEqual(w, nd1)
self.assertEqual(v, w)
        ex1 = ndarray([(b'hello', b'', 1)] * 27, shape=[3, 3, 3], format='5s0sP', flags=ND_PIL | ND_WRITABLE)
ex1[1][2][2] = b'sushi', b'', 1
nd1 = ex1[1:3, ::-2]
        ex2 = ndarray([(b'hello', b'', 1)] * 27, shape=[3, 3, 3], format='5s0sP', flags=ND_PIL | ND_WRITABLE)
        ex1[1][2][2] = b'hustle', b'', 1
nd2 = ex2[1:3, ::-2]
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertNotEqual(v, nd2)
self.assertNotEqual(w, nd1)
self.assertNotEqual(v, w)
lst1 = list(range(-15, 15))
lst2 = transpose(lst1, [3, 2, 5])
nd1 = ndarray(lst1, shape=[3, 2, 5], format='@l', flags=ND_PIL)
        nd2 = ndarray(lst2, shape=[3, 2, 5], format='l', flags=ND_FORTRAN | ND_PIL)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, w)
lst1 = [(b'sashimi', b'sliced', 20.05)] * 30
lst1[11] = b'ramen', b'spicy', 9.45
lst2 = transpose(lst1, [3, 2, 5])
nd1 = ndarray(lst1, shape=[3, 2, 5], format='< 10p 9p d', flags=ND_PIL)
        nd2 = ndarray(lst2, shape=[3, 2, 5], format='> 10p 9p d', flags=ND_FORTRAN | ND_PIL)
v = memoryview(nd1)
w = memoryview(nd2)
self.assertEqual(v, nd1)
self.assertEqual(w, nd2)
self.assertEqual(v, w)
def test_memoryview_compare_not_equal(self):
for byteorder in ['=', '<', '>', '!']:
            x = ndarray([2 ** 63] * 120, shape=[3, 5, 2, 2, 2],
                        format=byteorder + 'Q')
            y = ndarray([2 ** 63] * 120, shape=[3, 5, 2, 2, 2],
                        format=byteorder + 'Q', flags=ND_WRITABLE | ND_FORTRAN)
y[2][3][1][1][1] = 1
a = memoryview(x)
b = memoryview(y)
self.assertEqual(a, x)
self.assertEqual(b, y)
self.assertNotEqual(a, b)
self.assertNotEqual(a, y)
self.assertNotEqual(b, x)
            x = ndarray([(2 ** 63, 2 ** 31, 2 ** 15)] * 120, shape=[3, 5, 2, 2, 2],
                        format=byteorder + 'QLH')
            y = ndarray([(2 ** 63, 2 ** 31, 2 ** 15)] * 120, shape=[3, 5, 2, 2, 2],
                        format=byteorder + 'QLH', flags=ND_WRITABLE | ND_FORTRAN)
y[2][3][1][1][1] = 1, 1, 1
a = memoryview(x)
b = memoryview(y)
self.assertEqual(a, x)
self.assertEqual(b, y)
self.assertNotEqual(a, b)
self.assertNotEqual(a, y)
self.assertNotEqual(b, x)
def test_memoryview_check_released(self):
a = array.array('d', [1.1, 2.2, 3.3])
m = memoryview(a)
m.release()
self.assertRaises(ValueError, memoryview, m)
self.assertRaises(ValueError, m.cast, 'c')
self.assertRaises(ValueError, ndarray, m)
self.assertRaises(ValueError, m.tolist)
self.assertRaises(ValueError, m.tobytes)
self.assertRaises(ValueError, eval, '1.0 in m', locals())
self.assertRaises(ValueError, m.__getitem__, 0)
self.assertRaises(ValueError, m.__setitem__, 0, 1)
for attr in ('obj', 'nbytes', 'readonly', 'itemsize', 'format',
'ndim', 'shape', 'strides', 'suboffsets', 'c_contiguous',
'f_contiguous', 'contiguous'):
self.assertRaises(ValueError, m.__getattribute__, attr)
b = array.array('d', [1.1, 2.2, 3.3])
m1 = memoryview(a)
m2 = memoryview(b)
self.assertEqual(m1, m2)
m1.release()
self.assertNotEqual(m1, m2)
self.assertNotEqual(m1, a)
self.assertEqual(m1, m1)
def test_memoryview_tobytes(self):
t = -529, 576, -625, 676, -729
nd = ndarray(t, shape=[5], format='@h')
m = memoryview(nd)
self.assertEqual(m, nd)
self.assertEqual(m.tobytes(), nd.tobytes())
nd = ndarray([t], shape=[1], format='>hQiLl')
m = memoryview(nd)
self.assertEqual(m, nd)
self.assertEqual(m.tobytes(), nd.tobytes())
nd = ndarray([t for _ in range(12)], shape=[2, 2, 3], format='=hQiLl')
m = memoryview(nd)
self.assertEqual(m, nd)
self.assertEqual(m.tobytes(), nd.tobytes())
        nd = ndarray([t for _ in range(120)], shape=[5, 2, 2, 3, 2], format='<hQiLl')
m = memoryview(nd)
self.assertEqual(m, nd)
self.assertEqual(m.tobytes(), nd.tobytes())
if ctypes:
class BEPoint(ctypes.BigEndianStructure):
_fields_ = [('x', ctypes.c_long), ('y', ctypes.c_long)]
point = BEPoint(100, 200)
a = memoryview(point)
self.assertEqual(a.tobytes(), bytes(point))
def test_memoryview_get_contiguous(self):
self.assertRaises(TypeError, get_contiguous, {}, PyBUF_READ, 'F')
self.assertRaises(BufferError, get_contiguous, b'x', PyBUF_WRITE, 'C')
nd = ndarray([1, 2, 3], shape=[2], strides=[2])
self.assertRaises(BufferError, get_contiguous, nd, PyBUF_WRITE, 'A')
nd = ndarray(9, shape=(), format='L')
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(m, nd)
self.assertEqual(m[()], 9)
nd = ndarray(9, shape=(), format='L', flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(m, nd)
self.assertEqual(m[()], 9)
for order in ['C', 'F', 'A']:
nd[()] = 9
m = get_contiguous(nd, PyBUF_WRITE, order)
self.assertEqual(m, nd)
self.assertEqual(m[()], 9)
m[()] = 10
self.assertEqual(m[()], 10)
self.assertEqual(nd[()], 10)
nd = ndarray([1], shape=[0], format='L', flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertRaises(IndexError, m.__getitem__, 0)
self.assertEqual(m, nd)
self.assertEqual(m.tolist(), [])
        nd = ndarray(list(range(8)), shape=[2, 0, 7], format='L', flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(ndarray(m).tolist(), [[], []])
nd = ndarray([1], shape=[1], format='h', flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_WRITE, order)
self.assertEqual(m, nd)
self.assertEqual(m.tolist(), nd.tolist())
nd = ndarray([1, 2, 3], shape=[3], format='b', flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_WRITE, order)
self.assertEqual(m, nd)
self.assertEqual(m.tolist(), nd.tolist())
nd = ndarray([1, 2, 3], shape=[2], strides=[2], flags=ND_WRITABLE)
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(m, nd)
self.assertEqual(m.tolist(), nd.tolist())
self.assertRaises(TypeError, m.__setitem__, 1, 20)
self.assertEqual(m[1], 3)
self.assertEqual(nd[1], 3)
nd = nd[::-1]
for order in ['C', 'F', 'A']:
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(m, nd)
self.assertEqual(m.tolist(), nd.tolist())
self.assertRaises(TypeError, m.__setitem__, 1, 20)
self.assertEqual(m[1], 1)
self.assertEqual(nd[1], 1)
nd = ndarray(list(range(12)), shape=[3, 4], flags=ND_WRITABLE)
for order in ['C', 'A']:
m = get_contiguous(nd, PyBUF_WRITE, order)
self.assertEqual(ndarray(m).tolist(), nd.tolist())
self.assertRaises(BufferError, get_contiguous, nd, PyBUF_WRITE, 'F')
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(ndarray(m).tolist(), nd.tolist())
        nd = ndarray(list(range(12)), shape=[3, 4], flags=ND_WRITABLE | ND_FORTRAN)
for order in ['F', 'A']:
m = get_contiguous(nd, PyBUF_WRITE, order)
self.assertEqual(ndarray(m).tolist(), nd.tolist())
self.assertRaises(BufferError, get_contiguous, nd, PyBUF_WRITE, 'C')
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(ndarray(m).tolist(), nd.tolist())
nd = ndarray(list(range(12)), shape=[3, 4], flags=ND_WRITABLE | ND_PIL)
for order in ['C', 'F', 'A']:
self.assertRaises(BufferError, get_contiguous, nd, PyBUF_WRITE,
order)
m = get_contiguous(nd, PyBUF_READ, order)
self.assertEqual(ndarray(m).tolist(), nd.tolist())
nd = ndarray([1, 2, 3, 4, 5], shape=[3], strides=[2])
m = get_contiguous(nd, PyBUF_READ, 'C')
self.assertTrue(m.c_contiguous)
def test_memoryview_serializing(self):
size = struct.calcsize('i')
a = array.array('i', [1, 2, 3, 4, 5])
m = memoryview(a)
buf = io.BytesIO(m)
b = bytearray(5 * size)
buf.readinto(b)
self.assertEqual(m.tobytes(), b)
size = struct.calcsize('L')
nd = ndarray(list(range(12)), shape=[2, 3, 2], format='L')
m = memoryview(nd)
buf = io.BytesIO(m)
b = bytearray(2 * 3 * 2 * size)
buf.readinto(b)
self.assertEqual(m.tobytes(), b)
def test_memoryview_hash(self):
b = bytes(list(range(12)))
m = memoryview(b)
self.assertEqual(hash(b), hash(m))
mc = m.cast('c', shape=[3, 4])
self.assertEqual(hash(mc), hash(b))
mx = m[::-2]
b = bytes(list(range(12))[::-2])
self.assertEqual(hash(mx), hash(b))
nd = ndarray(list(range(30)), shape=[3, 2, 5], flags=ND_FORTRAN)
m = memoryview(nd)
self.assertEqual(hash(m), hash(nd))
nd = ndarray(list(range(30)), shape=[3, 2, 5])
x = nd[::2, :, ::-1]
m = memoryview(x)
self.assertEqual(hash(m), hash(x))
nd = ndarray(list(range(30)), shape=[2, 5, 3], flags=ND_PIL)
x = nd[::2, :, ::-1]
m = memoryview(x)
self.assertEqual(hash(m), hash(x))
x = ndarray(list(range(12)), shape=[12], format='B')
a = memoryview(x)
y = ndarray(list(range(12)), shape=[12], format='b')
b = memoryview(y)
self.assertEqual(a, b)
self.assertEqual(hash(a), hash(b))
nd = ndarray(list(range(12)), shape=[2, 2, 3], format='L')
m = memoryview(nd)
self.assertRaises(ValueError, m.__hash__)
nd = ndarray(list(range(-6, 6)), shape=[2, 2, 3], format='h')
m = memoryview(nd)
self.assertRaises(ValueError, m.__hash__)
nd = ndarray(list(range(12)), shape=[2, 2, 3], format='= L')
m = memoryview(nd)
self.assertRaises(ValueError, m.__hash__)
nd = ndarray(list(range(-6, 6)), shape=[2, 2, 3], format='< h')
m = memoryview(nd)
self.assertRaises(ValueError, m.__hash__)
def test_memoryview_release(self):
a = bytearray([1, 2, 3])
m = memoryview(a)
nd = ndarray(m)
self.assertRaises(BufferError, m.release)
del nd
m.release()
a = bytearray([1, 2, 3])
m = memoryview(a)
nd1 = ndarray(m, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
nd2 = ndarray(nd1, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
self.assertIs(nd2.obj, m)
self.assertRaises(BufferError, m.release)
del nd1, nd2
m.release()
a = bytearray([1, 2, 3])
m1 = memoryview(a)
m2 = memoryview(m1)
nd = ndarray(m2)
m1.release()
self.assertRaises(BufferError, m2.release)
del nd
m2.release()
a = bytearray([1, 2, 3])
m1 = memoryview(a)
m2 = memoryview(m1)
nd1 = ndarray(m2, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
nd2 = ndarray(nd1, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
self.assertIs(nd2.obj, m2)
m1.release()
self.assertRaises(BufferError, m2.release)
del nd1, nd2
m2.release()
nd = ndarray([1, 2, 3], shape=[3], flags=ND_VAREXPORT)
m1 = memoryview(nd)
nd.push([4, 5, 6, 7, 8], shape=[5])
m2 = memoryview(nd)
x = memoryview(m1)
self.assertEqual(x.tolist(), m1.tolist())
y = memoryview(m2)
self.assertEqual(y.tolist(), m2.tolist())
self.assertEqual(y.tolist(), nd.tolist())
m2.release()
y.release()
nd.pop()
self.assertEqual(x.tolist(), nd.tolist())
del nd
m1.release()
x.release()
def catch22(b):
with memoryview(b) as m2:
pass
x = bytearray(b'123')
with memoryview(x) as m1:
catch22(m1)
self.assertEqual(m1[0], ord(b'1'))
x = ndarray(list(range(12)), shape=[2, 2, 3], format='l')
y = ndarray(x, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
z = ndarray(y, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
self.assertIs(z.obj, x)
with memoryview(z) as m:
catch22(m)
self.assertEqual(m[0:1].tolist(), [[[0, 1, 2], [3, 4, 5]]])
for flags in (0, ND_REDIRECT):
x = bytearray(b'123')
with memoryview(x) as m1:
del x
y = ndarray(m1, getbuf=PyBUF_FULL_RO, flags=flags)
with memoryview(y) as m2:
del y
z = ndarray(m2, getbuf=PyBUF_FULL_RO, flags=flags)
with memoryview(z) as m3:
del z
catch22(m3)
catch22(m2)
catch22(m1)
self.assertEqual(m1[0], ord(b'1'))
self.assertEqual(m2[1], ord(b'2'))
self.assertEqual(m3[2], ord(b'3'))
del m3
del m2
del m1
x = bytearray(b'123')
with memoryview(x) as m1:
del x
y = ndarray(m1, getbuf=PyBUF_FULL_RO, flags=flags)
with memoryview(y) as m2:
del y
z = ndarray(m2, getbuf=PyBUF_FULL_RO, flags=flags)
with memoryview(z) as m3:
del z
catch22(m1)
catch22(m2)
catch22(m3)
self.assertEqual(m1[0], ord(b'1'))
self.assertEqual(m2[1], ord(b'2'))
self.assertEqual(m3[2], ord(b'3'))
del m1, m2, m3
x = bytearray(b'123')
with self.assertRaises(BufferError):
with memoryview(x) as m:
ex = ndarray(m)
m[0] == ord(b'1')
def test_memoryview_redirect(self):
nd = ndarray([(1.0 * x) for x in range(12)], shape=[12], format='d')
a = array.array('d', [(1.0 * x) for x in range(12)])
for x in (nd, a):
y = ndarray(x, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
z = ndarray(y, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
m = memoryview(z)
self.assertIs(y.obj, x)
self.assertIs(z.obj, x)
self.assertIs(m.obj, x)
self.assertEqual(m, x)
self.assertEqual(m, y)
self.assertEqual(m, z)
self.assertEqual(m[1:3], x[1:3])
self.assertEqual(m[1:3], y[1:3])
self.assertEqual(m[1:3], z[1:3])
del y, z
self.assertEqual(m[1:3], x[1:3])
def test_memoryview_from_static_exporter(self):
fmt = 'B'
lst = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
self.assertRaises(TypeError, staticarray, 1, 2, 3)
x = staticarray()
y = memoryview(x)
self.verify(y, obj=x, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
for i in range(12):
self.assertEqual(y[i], i)
del x
del y
x = staticarray()
y = memoryview(x)
del y
del x
x = staticarray()
y = ndarray(x, getbuf=PyBUF_FULL_RO)
z = ndarray(y, getbuf=PyBUF_FULL_RO)
m = memoryview(z)
self.assertIs(y.obj, x)
self.assertIs(m.obj, z)
self.verify(m, obj=z, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
del x, y, z, m
x = staticarray()
y = ndarray(x, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
z = ndarray(y, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
m = memoryview(z)
self.assertIs(y.obj, x)
self.assertIs(z.obj, x)
self.assertIs(m.obj, x)
self.verify(m, obj=x, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
del x, y, z, m
x = staticarray(legacy_mode=True)
y = memoryview(x)
self.verify(y, obj=None, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
for i in range(12):
self.assertEqual(y[i], i)
del x
del y
x = staticarray(legacy_mode=True)
y = memoryview(x)
del y
del x
x = staticarray(legacy_mode=True)
y = ndarray(x, getbuf=PyBUF_FULL_RO)
z = ndarray(y, getbuf=PyBUF_FULL_RO)
m = memoryview(z)
self.assertIs(y.obj, None)
self.assertIs(m.obj, z)
self.verify(m, obj=z, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
del x, y, z, m
x = staticarray(legacy_mode=True)
y = ndarray(x, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
z = ndarray(y, getbuf=PyBUF_FULL_RO, flags=ND_REDIRECT)
m = memoryview(z)
self.assertIs(y.obj, None)
self.assertIs(z.obj, y)
self.assertIs(m.obj, y)
self.verify(m, obj=y, itemsize=1, fmt=fmt, readonly=1, ndim=1,
shape=[12], strides=[1], lst=lst)
del x, y, z, m
def test_memoryview_getbuffer_undefined(self):
        nd = ndarray([1, 2, 3], [3], flags=ND_GETBUF_FAIL | ND_GETBUF_UNDEFINED)
self.assertRaises(BufferError, memoryview, nd)
def test_issue_7385(self):
x = ndarray([1, 2, 3], shape=[3], flags=ND_GETBUF_FAIL)
self.assertRaises(BufferError, memoryview, x)
if __name__ == '__main__':
unittest.main()
# --- processing/geoboundaries/standardize/recode.py ---
# (repos: fieldmaps/edge-matcher, fieldmaps/admin-boundaries; license: MIT)
import pandas as pd
from .utils import logging, filter_config, DATABASE
logger = logging.getLogger(__name__)
con = f'postgresql:///{DATABASE}'
def get_ids(level):
result = []
for l in range(level, -1, -1):
result += [f'adm{l}_id', f'adm{l}_name']
return ','.join(result)
def rename_id(df, level):
for l in range(level, -1, -1):
df[f'adm{l}_src'] = df[f'adm{l}_id']
df[f'adm{l}_name1'] = None
df[f'adm{l}_name2'] = None
return df
def get_max_pad(df, level):
col = f'adm{level}_id'
col_higher = f'adm{level-1}_id'
prev_id = None
higher_id = None
id_num = None
id_max = 0
for _, row in df.iterrows():
if row[col_higher] != higher_id:
id_num = 1
elif row[col] != prev_id:
id_num = id_num + 1
higher_id = row[col_higher]
prev_id = row[col]
id_max = max(id_max, id_num)
return len(str(id_max))
def create_ids(df, name, level, date):
df['adm0_id'] = f"{name.upper()}-{date.strftime('%Y%m%d')}"
for l in range(1, level+1):
col = f'adm{l}_id'
col_higher = f'adm{l-1}_id'
prev_id = None
higher_id = None
id_num = None
id_max = get_max_pad(df, l)
        for idx, row in df.iterrows():
            if row[col_higher] != higher_id:
                id_num = 1
            elif row[col] != prev_id:
                id_num = id_num + 1
            higher_id = row[col_higher]
            prev_id = row[col]
            new_val = f'{higher_id}-{str(id_num).zfill(id_max)}'
            # iterrows() yields row copies, so assigning into `row` is lost;
            # write the new id back through the DataFrame itself
            df.at[idx, col] = new_val
return df
def order_ids(level):
result = []
for l in range(level, -1, -1):
result += [f'adm{l}_id', f'adm{l}_src',
f'adm{l}_name', f'adm{l}_name1', f'adm{l}_name2']
return result
def add_meta(df, row):
meta_1 = ['src_lvl']
for m in meta_1:
df[m] = row[m]
df['src_lang'] = 'en'
df['src_lang1'] = None
df['src_lang2'] = None
meta_2 = ['src_date', 'src_update', 'src_name',
'src_name1', 'src_lic', 'src_url']
for m in meta_2:
df[m] = row[m]
df['src_date'] = pd.to_datetime(df['src_date'])
df['src_update'] = pd.to_datetime(df['src_update'])
df['src_grp'] = 'geoBoundaries'
return df
def handle_filter(df, level, config):
col = f"adm{config['adm']}_src"
for name, (switch, *args) in config['layers'].items():
if switch == '==':
df1 = df[df[col].isin(args)]
elif switch == '!=':
df1 = df[~df[col].isin(args)]
df1.to_sql(f'{name}_adm{level}_01', con,
if_exists='replace', index=False, method='multi')
def main(_, name, level, row):
query = f'SELECT {get_ids(level)} FROM {name}_adm{level}_00'
df = pd.read_sql_query(query, con)
cols = list(map(lambda x: [f'adm{x}_name', f'adm{x}_id'], range(level+1)))
cols = [i for l in cols for i in l]
df = df.sort_values(by=cols)
df = rename_id(df, level)
df = create_ids(df, name, level, row['src_update'])
df = df[order_ids(level)]
df = add_meta(df, row)
if name in filter_config.keys():
handle_filter(df, level, filter_config[name])
else:
df.to_sql(f'{name}_adm{level}_01', con,
if_exists='replace', index=False, method='multi')
logger.info(f'{name}_adm{level}')
| 29.719298 | 78 | 0.556375 | 539 | 3,388 | 3.276438 | 0.200371 | 0.043035 | 0.039638 | 0.019819 | 0.396376 | 0.337486 | 0.323896 | 0.286523 | 0.286523 | 0.286523 | 0 | 0.015164 | 0.279811 | 3,388 | 113 | 79 | 29.982301 | 0.708607 | 0 | 0 | 0.333333 | 0 | 0 | 0.182113 | 0.036895 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.020833 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
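The file above derives a zero-pad width from the largest sibling count and builds hierarchical `parent-NN` identifiers with `zfill`. A minimal standalone sketch of that scheme (the function and place names here are hypothetical, not from the source):

```python
def pad_width(count: int) -> int:
    """Number of digits needed to zero-pad IDs 1..count."""
    return len(str(count))


def make_ids(parent: str, names: list) -> dict:
    """Assign '<parent>-<NN>' IDs to names, zero-padded to the max width."""
    width = pad_width(len(names))
    return {name: f"{parent}-{str(i).zfill(width)}"
            for i, name in enumerate(names, start=1)}


# Eleven children force a two-digit pad; a single child needs only one digit.
ids = make_ids("KEN-20200101", [f"region_{n}" for n in range(1, 12)])
```

With eleven regions the pad width is 2, so the first child becomes `KEN-20200101-01`.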
8e087c3aba36d8a1da8af28ad2d6df931d6e4180 | 1,112 | py | Python | app/vehicle/models.py | ab7289-tandon-nyu/csgy6083_PDS_Project | d2b7d22274dcabbb6ae35c17a8ffd06498f3634f | [
"MIT"
] | null | null | null | app/vehicle/models.py | ab7289-tandon-nyu/csgy6083_PDS_Project | d2b7d22274dcabbb6ae35c17a8ffd06498f3634f | [
"MIT"
] | null | null | null | app/vehicle/models.py | ab7289-tandon-nyu/csgy6083_PDS_Project | d2b7d22274dcabbb6ae35c17a8ffd06498f3634f | [
"MIT"
] | null | null | null |
from typing import Optional

from app.db import DTOBase


class Vehicle(DTOBase):
    __id_field__ = "vin"
    __display__ = "display_name"

    def __init__(
        self,
        make: str,
        model: str,
        year: int,
        state: str,
        policy_id: int = -1,
        vin: Optional[str] = None,
    ):
        self.make = make
        self.model = model
        self.year = year
        self.state = state
        self.policy_id = policy_id
        self.vin = vin

    @property
    def display_name(self):
        return f"{self.vin} - {self.make} {self.model}"


class Driver(DTOBase):
    __id_field__ = "license"
    __display__ = "full_name"

    def __init__(
        self, fname: str, mname: str, lname: str, birthdate: str, license: Optional[str] = None
    ):
        self.fname = fname
        self.mname = mname
        self.lname = lname
        self.birthdate = birthdate
        self.license = license

    @property
    def full_name(self):
        return f"{self.fname} {self.lname}"


class VehicleDriver:
    def __init__(self, license: Optional[str] = None, vin: Optional[str] = None):
        self.license = license
        self.vin = vin
| 20.981132 | 85 | 0.563849 | 131 | 1,112 | 4.503817 | 0.267176 | 0.047458 | 0.055932 | 0.050847 | 0.064407 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001346 | 0.331835 | 1,112 | 52 | 86 | 21.384615 | 0.792732 | 0 | 0 | 0.25 | 0 | 0 | 0.083633 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.025 | 0.05 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
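The models above declare a class-level `__display__` attribute that names the property a base class presumably resolves for display purposes. `app.db.DTOBase` is not shown in this record, so the base class below is a hypothetical stand-in illustrating only that indirection:

```python
class DTOBase:
    # Hypothetical stand-in for app.db.DTOBase (not shown in the source):
    # resolves the attribute named by the class-level __display__ marker.
    __display__ = None

    def display(self) -> str:
        return getattr(self, type(self).__display__)


class Vehicle(DTOBase):
    __display__ = "display_name"

    def __init__(self, vin, make, model):
        self.vin, self.make, self.model = vin, make, model

    @property
    def display_name(self):
        return f"{self.vin} - {self.make} {self.model}"


car = Vehicle("1HGCM", "Honda", "Accord")
```

`car.display()` then dereferences `__display__` and returns `"1HGCM - Honda Accord"`.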
8e098d9290daa9ce8fe636df83ecdb38807d6eed | 484 | py | Python | setup.py | JesDanis/Sweaden_Prueba | d028c599eabfe7534eff3113dcaa4aba42254a97 | [
"MIT"
] | null | null | null | setup.py | JesDanis/Sweaden_Prueba | d028c599eabfe7534eff3113dcaa4aba42254a97 | [
"MIT"
] | null | null | null | setup.py | JesDanis/Sweaden_Prueba | d028c599eabfe7534eff3113dcaa4aba42254a97 | [
"MIT"
] | null | null | null |
from setuptools import setup, find_packages

with open('requirements.txt') as f:
    install_requires = f.read().strip().split('\n')

# get version from __version__ variable in apptest/__init__.py
from apptest import __version__ as version

setup(
    name='apptest',
    version=version,
    description='prueba',
    author='orlando Cholota',
    author_email='edwin_orlando83@hotmail.com',
    packages=find_packages(),
    zip_safe=False,
    include_package_data=True,
    install_requires=install_requires
)
| 24.2 | 62 | 0.783058 | 64 | 484 | 5.578125 | 0.671875 | 0.12605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004619 | 0.105372 | 484 | 19 | 63 | 25.473684 | 0.819861 | 0.123967 | 0 | 0 | 0 | 0 | 0.172986 | 0.063981 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.133333 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e0b0e23dd2e90df198a864385d478cf3eb2e293 | 5,448 | py | Python | flask_helper/_flask.py | meisanggou/JYFlask | c23c659ee60dfc0f9ed4ea47c6a3b018d318100c | [
"MIT"
] | 2 | 2018-11-15T11:18:13.000Z | 2020-05-16T22:12:10.000Z | flask_helper/_flask.py | meisanggou/Flask-Helper | c23c659ee60dfc0f9ed4ea47c6a3b018d318100c | [
"MIT"
] | null | null | null | flask_helper/_flask.py | meisanggou/Flask-Helper | c23c659ee60dfc0f9ed4ea47c6a3b018d318100c | [
"MIT"
] | 1 | 2018-05-07T01:11:23.000Z | 2018-05-07T01:11:23.000Z |
# !/usr/bin/env python
# coding: utf-8

from flask import Flask
import os

from flask_helper.exception import InvalidHookClass
from flask_helper.flask_hook import FlaskHook
from flask_helper.hooks.cors_hook import CorsHook
from flask_helper.hooks.handle_30x_hook import Handle30xHook
from flask_helper.hooks.real_ip_hook import RealIPHook
from flask_helper.hooks.user_agent_hook import UserAgentHook
from flask_helper.view import View
from flask_helper.utils.loader import load_classes_from_directory
from flask_helper.utils.loader import load_objects_from_directory
from flask_helper.utils.log import DummyLog

__author__ = 'zhouhenglc'


class _HookFlask(object):
    def __init__(self, log=None):
        self.hooks = []
        self._hook_log = log if log else DummyLog()

    def before_request_hook(self):
        for hook in self.hooks:
            resp = hook.before_request()
            if resp is not None:
                return resp

    def after_request_hook(self, response):
        for hook in reversed(self.hooks):
            response = hook.after_request(response)
        return response

    def add_hook(self, hook_cls, *args, **kwargs):
        if not issubclass(hook_cls, FlaskHook):
            raise InvalidHookClass()
        hook_obj = hook_cls(self, *args, **kwargs)
        insert_i = len(self.hooks)
        for i in range(len(self.hooks) - 1, -1, -1):
            if type(self.hooks[i]) == hook_cls:
                return self.hooks[i]
            if hook_obj.priority < self.hooks[i].priority:
                insert_i = i
        self._hook_log.info('add hook %s priority is %s',
                            hook_obj.__class__.__name__,
                            hook_obj.priority)
        self.hooks.insert(insert_i, hook_obj)
        return hook_obj


class PredefinedHookFlask(_HookFlask):
    def cross_domain(self, **kwargs):
        hook_obj = self.add_hook(CorsHook, **kwargs)
        return hook_obj

    def filter_user_agent(self, *args, **kwargs):
        hook_obj = self.add_hook(UserAgentHook, *args, **kwargs)
        return hook_obj

    def handle_30x(self, **kwargs):
        hook_obj = self.add_hook(Handle30xHook, **kwargs)
        return hook_obj

    def real_ip(self, trust_proxy=None):
        if trust_proxy is None:
            trust_proxy = ["127.0.0.1"]
        hook_obj = self.add_hook(RealIPHook, trust_proxy=trust_proxy)
        return hook_obj


class FlaskHelper(Flask, PredefinedHookFlask):
    def __init__(self, import_name, *args, **kwargs):
        self.log = kwargs.pop('log', DummyLog())
        Flask.__init__(self, import_name, *args, **kwargs)
        PredefinedHookFlask.__init__(self, self.log)
        self.before_request_funcs.setdefault(None, [])
        self.after_request_funcs.setdefault(None, [])
        self.before_request_funcs[None].append(self.before_request_hook)
        self.after_request_funcs[None].append(self.after_request_hook)

        self.hooks_folders = set()
        default_hooks_folder = os.path.join(self.root_path, 'hooks')
        if os.path.exists(default_hooks_folder):
            self.register_hooks(default_hooks_folder)

        self.views_folders = set()
        self._views = set()
        default_views_folder = os.path.join(self.root_path, 'views')
        if os.path.exists(default_views_folder):
            self.register_views(default_views_folder)

    def register_blueprint(self, blueprint, **options):
        if isinstance(blueprint, View):
            self.jinja_env.globals.update(blueprint.jinja_env)
        self.log.info('register blueprint %s', blueprint.name)
        Flask.register_blueprint(self, blueprint, **options)

    def register_views(self, views_folder):
        self.log.info('register views from %s', views_folder)
        views_folder = os.path.abspath(views_folder)
        if views_folder in self.views_folders:
            return
        self.views_folders.add(views_folder)
        module_prefix = 'flask_helper.views_%s' % len(self.hooks_folders)
        v_objects = load_objects_from_directory(views_folder, module_prefix,
                                                View)
        for v_obj in v_objects:
            if v_obj.name in self._views:
                self.log.warning('%s blueprint name exist', v_obj.name)
                return
            self.register_blueprint(v_obj)
            self._views.add(v_obj.name)

    def register_hooks(self, hooks_folder):
        self.log.info('register hooks from %s', hooks_folder)
        hooks_folder = os.path.abspath(hooks_folder)
        if hooks_folder in self.hooks_folders:
            return
        self.hooks_folders.add(hooks_folder)
        module_prefix = 'flask_helper.hooks_%s' % len(self.hooks_folders)
        h_classes = load_classes_from_directory(hooks_folder, module_prefix,
                                                FlaskHook)
        for h_class in h_classes:
            self.add_hook(h_class)

    def run(self, host=None, port=None, **options):
        log = options.pop('log', None)
        try:
            import eventlet
            from eventlet import wsgi
            # eventlet.monkey_patch()
            if host is None:
                host = '0.0.0.0'
            if port is None:
                port = 5000
            listen = eventlet.listen((host, port))
            wsgi.server(listen, self, log=log, **options)
        except ImportError:
            # Fall back to the builtin server; pass self explicitly so
            # host/port are not misbound to Flask.run's self parameter.
            Flask.run(self, host=host, port=port, **options)
| 37.572414 | 76 | 0.641887 | 690 | 5,448 | 4.794203 | 0.181159 | 0.04081 | 0.045345 | 0.024184 | 0.247279 | 0.093712 | 0.055623 | 0 | 0 | 0 | 0 | 0.006497 | 0.265419 | 5,448 | 144 | 77 | 37.833333 | 0.82009 | 0.010646 | 0 | 0.067797 | 0 | 0 | 0.036762 | 0.007798 | 0 | 0 | 0 | 0 | 0 | 1 | 0.110169 | false | 0 | 0.144068 | 0 | 0.372881 | 0.059322 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
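`_HookFlask.add_hook` above keeps its hook list in ascending priority order by scanning from the end and inserting before the first hook with a strictly higher priority, so equal-priority hooks keep registration order. A small sketch of that same ordering rule, using `bisect` and a minimal stand-in `Hook` class (not the library's `FlaskHook`):

```python
import bisect


class Hook:
    def __init__(self, name, priority):
        self.name, self.priority = name, priority


def add_hook(hooks, hook):
    # Insert before the first hook with a strictly higher priority, so
    # hooks of equal priority keep their registration order -- the same
    # ordering the reversed scan in _HookFlask.add_hook produces.
    idx = bisect.bisect_right([h.priority for h in hooks], hook.priority)
    hooks.insert(idx, hook)


hooks = []
for name, prio in [("cors", 10), ("real_ip", 1), ("agent", 10), ("30x", 5)]:
    add_hook(hooks, Hook(name, prio))
```

After the four registrations the list is ordered `real_ip(1), 30x(5), cors(10), agent(10)`, with the two priority-10 hooks in the order they were added.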
8e0ce2cde6afdfd6d125f65d229bd7d0e3594dc9 | 5,802 | py | Python | Functions/UpdateSpreadsheet/update-spreadsheet.py | bretterism/CommunityBot | c25db66ae09b79628123f9850ab59c7cbe7b7984 | [
"MIT"
] | null | null | null | Functions/UpdateSpreadsheet/update-spreadsheet.py | bretterism/CommunityBot | c25db66ae09b79628123f9850ab59c7cbe7b7984 | [
"MIT"
] | null | null | null | Functions/UpdateSpreadsheet/update-spreadsheet.py | bretterism/CommunityBot | c25db66ae09b79628123f9850ab59c7cbe7b7984 | [
"MIT"
] | null | null | null |
import base64
import boto3
import datetime
import gspread
import json
import logging
from oauth2client.service_account import ServiceAccountCredentials

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def get_client_secret(filename):
    """Get the client secret from encrypted file. Returns decrypted json object"""
    with open(filename) as file:
        json_file = json.load(file)
    cyphertext = json_file['CiphertextBlob']
    blob = base64.b64decode(cyphertext)

    client = boto3.client('kms')
    secret = client.decrypt(CiphertextBlob=blob)['Plaintext']
    s = secret.decode('ascii')
    return json.loads(s)


def connect(filename):
    # use creds to create a client to interact with the Google Drive API
    scope = ['https://spreadsheets.google.com/feeds',
             'https://www.googleapis.com/auth/drive']
    keyfile_dict = get_client_secret(filename)
    creds = ServiceAccountCredentials.from_json_keyfile_dict(keyfile_dict=keyfile_dict, scopes=scope)
    client = gspread.authorize(creds)
    return client


def get_dinner_titles(workbook):
    """gets the list of available dinners from the workbook by worksheet titles"""
    worksheets = workbook.worksheets()
    worksheet_titles = [i.title for i in worksheets]
    # Exclude system titles
    worksheet_titles.remove('History')
    worksheet_titles.remove('Settings')
    return worksheet_titles


def get_next_dinner_title(workbook):
    """gets the title of the next dinner"""
    # Check for override in the settings tab
    settings_worksheet = workbook.worksheet('Settings')
    override_cell = settings_worksheet.find('Override')
    override_value = settings_worksheet.cell(override_cell.row, 2).value
    next_dinner = override_value

    # Checking dinner dates in the History tab
    worksheet_titles = get_dinner_titles(workbook)
    if override_value not in worksheet_titles:
        # Get next week's dinner template by selecting the oldest date from the history table
        history_worksheet = workbook.worksheet('History')
        history_values = history_worksheet.get_all_values()

        oldest_date = datetime.datetime.now()
        next_dinner = ''
        for h in history_values:
            dinner_date = datetime.datetime.strptime(h[1], '%m/%d/%Y %H:%M:%S')
            print(h[0], dinner_date)
            if dinner_date <= oldest_date:
                next_dinner = h[0]
                oldest_date = dinner_date
    else:
        # Clear the override so next week will be back on the regular rotation
        settings_worksheet.update_cell(override_cell.row, override_cell.col + 1, '')

    return next_dinner


def get_next_dinner(workbook):
    next_dinner_title = get_next_dinner_title(workbook)
    print(next_dinner_title)
    return workbook.worksheet(next_dinner_title)


def get_dinner_items(worksheet):
    """gets the list of items from the worksheet dinner template"""
    dinner_items = worksheet.col_values(1)
    dinner_items = [d.strip(' ') for d in dinner_items]
    return dinner_items


def reset_spreadsheet(worksheet, theme_location, fooditem_range):
    """clears last week's items from the spreadsheet"""
    # Clear dinner theme
    worksheet.update_acell(theme_location, '')

    # Clear dinner items
    range_of_cells = worksheet.range(fooditem_range)
    for cell in range_of_cells:
        cell.value = ''
    worksheet.update_cells(range_of_cells)


def insert_new_dinner(dinner_worksheet, template_worksheet, theme_location, fooditem_range):
    dinner_items = get_dinner_items(template_worksheet)
    dinner_theme = template_worksheet.title

    # Adding new dinner theme
    dinner_worksheet.update_acell(theme_location, dinner_theme)

    # Adding new dinner items
    fooditem_range_start = fooditem_range.split(':')[0]
    fooditem_cell = dinner_worksheet.acell(fooditem_range_start)
    for idx, item in enumerate(dinner_items):
        update_row = fooditem_cell.row + idx
        update_col = fooditem_cell.col
        dinner_worksheet.update_cell(update_row, update_col, item)


def set_history_date(workbook, dinner_theme):
    history_worksheet = workbook.worksheet('History')
    theme_cell = history_worksheet.find(dinner_theme)
    datetime_now = datetime.datetime.now().strftime('%m/%d/%Y %H:%M:%S')
    history_worksheet.update_cell(theme_cell.row, theme_cell.col + 1, datetime_now)
    print(datetime_now)


def notify_users(bot_id, msg):
    lam = boto3.client('lambda')
    payload = {}
    payload['Bot_ID'] = bot_id
    payload['Message'] = msg
    try:
        response = lam.invoke(FunctionName='NotifyUsers',
                              InvocationType='RequestResponse',
                              Payload=json.dumps(payload))
    except Exception as e:
        print(e)
        raise e


def lambda_handler(event, context):
    logger.info(event)

    # Variables
    theme_location = 'B1'
    fooditem_range = 'A4:B50'

    # Obtain client connection
    client = connect('client_secret_encrypted.json')

    # Gather workbooks/worksheets
    workbook_templates_name = event['Templates_Workbook']
    workbook_dinner_name = event['Dinner_Workbook']
    worksheet_dinner_name = event['Dinner_Worksheet']
    workbook_template = client.open(workbook_templates_name)
    workbook_dinner = client.open(workbook_dinner_name)
    dinner_worksheet = workbook_dinner.worksheet(worksheet_dinner_name)
    dinner_template_worksheet = get_next_dinner(workbook_template)
    dinner_theme = dinner_template_worksheet.title

    # Clear out last week's dinner
    reset_spreadsheet(dinner_worksheet, theme_location, fooditem_range)

    # Insert new dinner
    insert_new_dinner(dinner_worksheet, dinner_template_worksheet, theme_location, fooditem_range)

    # Set the timestamp for the new dinner in the history sheet
    set_history_date(workbook_template, dinner_theme)

    # Notify Users the new spreadsheet is up
    spreadsheet_url = 'https://docs.google.com/spreadsheets/d/{}/edit?usp=sharing'.format(workbook_dinner.id)
    msg = 'Community Bot here *Bleep* *Bloop*\nThe new spreadsheet is up! Next week\'s theme is {}.\nPlease sign-up for a few items to share!\n{}'.format(dinner_theme, spreadsheet_url)
    notify_users(event['Bot_ID'], msg)
| 32.413408 | 181 | 0.777146 | 794 | 5,802 | 5.440806 | 0.255668 | 0.027778 | 0.017361 | 0.027778 | 0.098611 | 0.022685 | 0 | 0 | 0 | 0 | 0 | 0.004353 | 0.128921 | 5,802 | 178 | 182 | 32.595506 | 0.850416 | 0.154947 | 0 | 0.017857 | 0 | 0 | 0.105328 | 0.00576 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098214 | false | 0 | 0.0625 | 0 | 0.214286 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
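The rotation rule in `get_next_dinner_title` above picks the theme whose last-served timestamp is the oldest. For timestamps that all lie in the past (the usual case for a history table), that scan is equivalent to a `min` over parsed dates; a stripped-down sketch with hypothetical history rows:

```python
from datetime import datetime

FMT = "%m/%d/%Y %H:%M:%S"


def next_dinner(history):
    # Pick the theme with the oldest last-served timestamp -- the same
    # rotation rule get_next_dinner_title applies to the History tab,
    # assuming every timestamp is in the past.
    return min(history, key=lambda row: datetime.strptime(row[1], FMT))[0]


history = [
    ("Taco Night", "06/01/2021 18:00:00"),
    ("Pasta", "05/18/2021 18:00:00"),
    ("BBQ", "05/25/2021 18:00:00"),
]
```

Here `next_dinner(history)` returns `"Pasta"`, the least recently served theme.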
8e0d8add1a61f1897a6bf0bdee26d14f7d06d4d5 | 543 | py | Python | tests/testing_utils.py | josephsv96/crate_classifier | 02a0b55d5714f1e40923b007526c25ffd0c4e098 | [
"MIT"
] | 1 | 2020-11-18T11:30:42.000Z | 2020-11-18T11:30:42.000Z | tests/testing_utils.py | josephsv96/crate_classifier | 02a0b55d5714f1e40923b007526c25ffd0c4e098 | [
"MIT"
] | 7 | 2021-04-30T21:39:14.000Z | 2022-01-13T03:32:59.000Z | tests/testing_utils.py | josephsv96/crate_classifier | 02a0b55d5714f1e40923b007526c25ffd0c4e098 | [
"MIT"
] | null | null | null |
# Helper Functions
try:
    from src.utils import load_json
except ImportError as error:
    print(f"Error: {error}; Local modules not found")
except Exception as exception:
    print(exception)


def load_params_1():
    """Returns source path of images and number of exposures."""
    PKG_1_PARAMS = load_json("config/pkg_1_config.json")
    return PKG_1_PARAMS


def load_params_2():
    """Returns source path of images and number of exposures."""
    PKG_2_PARAMS = load_json("config/pkg_2_config.json")
    return PKG_2_PARAMS
| 22.625 | 60 | 0.71639 | 81 | 543 | 4.567901 | 0.432099 | 0.064865 | 0.07027 | 0.102703 | 0.383784 | 0.259459 | 0.259459 | 0.259459 | 0.259459 | 0.259459 | 0 | 0.018349 | 0.197053 | 543 | 23 | 61 | 23.608696 | 0.830275 | 0.248619 | 0 | 0 | 0 | 0 | 0.219144 | 0.120907 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e10e72463d04064e483ee31cb51f2fbc3e11508 | 380 | py | Python | setup.py | gonicus/clacks | da579f0acc4e48cf2e9451417ac6792282cf7ab6 | [
"ZPL-2.1"
] | 2 | 2015-01-26T07:15:19.000Z | 2015-11-09T13:42:11.000Z | setup.py | gonicus/clacks | da579f0acc4e48cf2e9451417ac6792282cf7ab6 | [
"ZPL-2.1"
] | null | null | null | setup.py | gonicus/clacks | da579f0acc4e48cf2e9451417ac6792282cf7ab6 | [
"ZPL-2.1"
] | null | null | null |
#!/usr/bin/env python

import os
import sys

modules = ['common',
           'agent',
           'dbus',
           'client',
           'shell',
           'utils']

for module in modules:
    os.system("cd %s && ./setup.py %s" % (module, " ".join(sys.argv[1:])))

for root, dirs, files in os.walk("plugins"):
    if "setup.py" in files:
        os.system("cd %s && ./setup.py %s" % (root, " ".join(sys.argv[1:])))
| 21.111111 | 76 | 0.544737 | 56 | 380 | 3.696429 | 0.553571 | 0.101449 | 0.096618 | 0.10628 | 0.183575 | 0.183575 | 0.183575 | 0 | 0 | 0 | 0 | 0.00678 | 0.223684 | 380 | 17 | 77 | 22.352941 | 0.694915 | 0.052632 | 0 | 0 | 0 | 0 | 0.256267 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e1396c831f746a08108e2cb3118687406bb55ae | 546 | py | Python | force_arp.py | t-amorim/network_package | 863e684b012553d174ad5b5582fb43b33aba9c3e | [
"MIT"
] | null | null | null | force_arp.py | t-amorim/network_package | 863e684b012553d174ad5b5582fb43b33aba9c3e | [
"MIT"
] | null | null | null | force_arp.py | t-amorim/network_package | 863e684b012553d174ad5b5582fb43b33aba9c3e | [
"MIT"
] | null | null | null |
#!/usr/bin/env python
# -*- coding: utf-8 -*-

__author__ = "Thomas AMORIM"
__credits__ = ["Thomas Amorim", "Pierre-François Bonnefoi", "Scapy"]
__license__ = "MIT"
__version__ = "1.1"
__status__ = "OK"

from scapy.all import *
import sys


def force_arp(ipslash):
    p = Ether(dst="ff:ff:ff:ff:ff:ff", src="00:03:24:45:11:34")/ARP(hwsrc="00:03:24:45:11:34", psrc="192.168.1.0", pdst=ipslash)
    p.show2()
    send(p)


if __name__ == "__main__":
    default_ip = "192.168.1.0/24"
    # Use the CLI argument when one is given, otherwise the default target.
    ip = sys.argv[1] if len(sys.argv) > 1 else default_ip
    force_arp(ip)
| 26 | 123 | 0.663004 | 98 | 546 | 3.367347 | 0.571429 | 0.060606 | 0.072727 | 0.072727 | 0.169697 | 0.072727 | 0 | 0 | 0 | 0 | 0 | 0.120833 | 0.120879 | 546 | 21 | 124 | 26 | 0.566667 | 0.076923 | 0 | 0 | 0 | 0 | 0.32008 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.125 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e1afbc263f6d797c000a3851ad609f3c38c6432 | 747 | py | Python | addons/omi-blender-gltf-main/reload_package.py | V-Sekai/V-Sekai-Blender-tools | 3473ad4abb737756290a9007273519460742960d | [
"MIT"
] | 2 | 2021-12-21T16:38:58.000Z | 2022-01-08T00:56:35.000Z | addons/omi-blender-gltf-main/reload_package.py | V-Sekai/V-Sekai-Blender-game-tools | 3473ad4abb737756290a9007273519460742960d | [
"MIT"
] | 1 | 2022-01-29T05:46:50.000Z | 2022-01-29T05:46:50.000Z | addons/omi-blender-gltf-main/reload_package.py | V-Sekai/V-Sekai-Blender-game-tools | 3473ad4abb737756290a9007273519460742960d | [
"MIT"
] | 1 | 2021-11-07T19:41:34.000Z | 2021-11-07T19:41:34.000Z |
# from io_scene_gltf2
# Copyright 2018-2019 The glTF-Blender-IO authors.
# Apache 2.0
#
# Script reloading (if the user calls 'Reload Scripts' from Blender)
#


def reload_package(module_dict_main):
    import importlib
    from pathlib import Path

    def reload_package_recursive(current_dir, module_dict):
        for path in current_dir.iterdir():
            if "__init__" in str(path) or path.stem not in module_dict:
                continue

            if path.is_file() and path.suffix == ".py":
                importlib.reload(module_dict[path.stem])
            elif path.is_dir():
                reload_package_recursive(path, module_dict[path.stem].__dict__)

    reload_package_recursive(Path(__file__).parent, module_dict_main)
| 31.125 | 79 | 0.677376 | 100 | 747 | 4.73 | 0.49 | 0.12685 | 0.139535 | 0.07611 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019264 | 0.235609 | 747 | 23 | 80 | 32.478261 | 0.809107 | 0.195448 | 0 | 0 | 0 | 0 | 0.01855 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.25 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
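`reload_package_recursive` above walks a package directory with `pathlib`, skipping `__init__` modules, handling `.py` files at each level and recursing into subdirectories. The same traversal pattern can be demonstrated without the `importlib.reload` side effect (file names below are invented for the demo):

```python
import tempfile
from pathlib import Path


def find_py_files(root: Path):
    # Collect .py file names the way reload_package_recursive walks a
    # package: skip __init__ modules, take files, recurse into dirs.
    found = []
    for path in sorted(root.iterdir()):
        if "__init__" in path.name:
            continue
        if path.is_file() and path.suffix == ".py":
            found.append(path.name)
        elif path.is_dir():
            found.extend(find_py_files(path))
    return found


# Build a throwaway package layout to walk.
root = Path(tempfile.mkdtemp())
(root / "a.py").write_text("")
(root / "__init__.py").write_text("")
sub = root / "pkg"
sub.mkdir()
(sub / "b.py").write_text("")
```

`find_py_files(root)` yields `["a.py", "b.py"]`: the `__init__.py` is skipped and `b.py` is found through recursion.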
8e1e0c186a840151723210baf4fa4bdbf7361239 | 7,938 | py | Python | archives/nnAvicaching_find_rewards_without_lp.py | anmolkabra/avicaching-summ17 | 3b85c1b70adcbe5d5b2764195090b28093081b1f | [
"MIT"
] | null | null | null | archives/nnAvicaching_find_rewards_without_lp.py | anmolkabra/avicaching-summ17 | 3b85c1b70adcbe5d5b2764195090b28093081b1f | [
"MIT"
] | null | null | null | archives/nnAvicaching_find_rewards_without_lp.py | anmolkabra/avicaching-summ17 | 3b85c1b70adcbe5d5b2764195090b28093081b1f | [
"MIT"
] | null | null | null |
#!/usr/bin/env python

# DEPRECATED -- model didn't work. Was trying to constrain rewards differently.
# Beware -- lack of documentation. Refer to nnAvicaching_find_rewards.py for
# support

from __future__ import print_function
import torch, torch.nn as nn, torch.nn.functional as torchfun, torch.optim as optim
from torch.autograd import Variable
import numpy as np, argparse, time, os, sys
import avicaching_data as ad

# =============================================================================
# options
# =============================================================================
parser = argparse.ArgumentParser(description="NN Avicaching model for finding rewards")
parser.add_argument("--lr", type=float, default=0.01, metavar="LR",
                    help="inputs learning rate of the network (default=0.01)")
parser.add_argument("--momentum", type=float, default=1.0, metavar="M",
                    help="inputs SGD momentum (default=1.0)")
parser.add_argument("--no-cuda", action="store_true", default=False,
                    help="disables CUDA training")
parser.add_argument("--epochs", type=int, default=10, metavar="E",
                    help="inputs the number of epochs to train for")
parser.add_argument("--locations", type=int, default=116, metavar="J",
                    help="inputs the number of locations (default=116)")
parser.add_argument("--time", type=int, default=173, metavar="T",
                    help="inputs total time of data collection; number of weeks (default=173)")
parser.add_argument("--eta", type=float, default=10.0, metavar="F",
                    help="inputs parameter eta in the model (default=10.0)")
parser.add_argument("--rewards", type=float, default=1000.0, metavar="R",
                    help="inputs the total budget of rewards to be distributed (default=1000.0)")
parser.add_argument("--weights-file", type=str,
                    default="./stats/weights/normalizedR_gpu, origXYR_epochs=1000, train= 80%, time=98.6947 sec.txt",
                    metavar="f", help="inputs the location of the file to use weights from")
parser.add_argument("--log-interval", type=int, default=1, metavar="I",
                    help="prints training information at I epoch intervals (default=1)")
parser.add_argument("--expand-R", action="store_true", default=False,
                    help="expands the reward vectors into matrices with distributed rewards")
parser.add_argument("--lambda-loss", type=float, default=10.0,
                    help="inputs the lambda for penalizing rewards amounting to greater than total rewards (default=10.0)")
parser.add_argument("--lambda-update", type=float, default=0.01,
                    help="inputs the learning rate for lambda (default=0.01)")
parser.add_argument('--seed', type=int, default=1, metavar='S',
                    help='random seed (default: 1)')

args = parser.parse_args()

# assigning cuda check and test check to single variables
args.cuda = not args.no_cuda and torch.cuda.is_available()

torch.manual_seed(args.seed)
np.random.seed(seed=args.seed)
if args.cuda:
    torch.cuda.manual_seed(args.seed)

# =============================================================================
# parameters and constants
# =============================================================================
np.set_printoptions(formatter={'float': lambda x: "{0:0.3f}".format(x)})
J, T, weights_file_name = args.locations, args.time, args.weights_file
totalR, l = args.rewards, args.lambda_loss
X, W_for_r, F_DIST, numFeatures = [], [], [], 0
F_DIST_weighted = []
torchten = torch.FloatTensor

# =============================================================================
# data input
# =============================================================================
def read_set_data():
    global X, W, F_DIST, numFeatures, F_DIST_weighted, W_for_r
    # read f and dist datasets from file, operate on them
    F = ad.read_F_file("./data/loc_feature_with_avicaching_combined.csv", J)
    DIST = ad.read_dist_file("./data/site_distances_km_drastic_price_histlong_0327_0813_combined.txt", J)
    # read W and X
    W = ad.read_weights_file(weights_file_name, J)
    X, _, _ = ad.read_XYR_file("./data/density_shift_histlong_as_previous_loc_classical_drastic_price_0327_0813.txt", J, T)

    # process data for the NN
    F, DIST = ad.normalize(F, along_dim=0, using_max=True), ad.normalize(DIST, using_max=True)  # normalize using max
    numFeatures = len(F[0]) + 1  # distance included
    F_DIST = torchten(ad.combine_DIST_F(F, DIST, J, numFeatures))
    numFeatures += 1  # for rewards later

    # split W, then multiply the fdist portion with F_DIST
    W = np.expand_dims(W, axis=2)
    W_for_fdist, W_for_r = ad.split_along_dim(W, numFeatures - 1, dim=1)
    F_DIST_weighted = Variable(torch.bmm(F_DIST, torchten(W_for_fdist)).squeeze(dim=2), requires_grad=False)

    # condense X along T into a single vector and normalize
    X = ad.normalize(X.sum(axis=0), using_max=False)
    W_for_r, X = Variable(torchten(W_for_r), requires_grad=False), Variable(torchten(X), requires_grad=False)

# =============================================================================
# MyNet class
# =============================================================================
class MyNet(nn.Module):

    def __init__(self, J, totalR, eta):
        super(MyNet, self).__init__()
        self.J, self.totalR, self.eta = J, totalR, eta

        # initiate R
        self.r = np.random.multinomial(self.totalR, [1 / float(J)] * J, size=1)
        normalizedR = ad.normalize(self.r, using_max=False)
        self.R = nn.Parameter(torchten(normalizedR))
        print("random rewards:\n", self.r)

    def forward(self, inp):
        repeatedR = self.R.repeat(J, 1).unsqueeze(dim=2)
        inp = torch.bmm(repeatedR, W_for_r).view(-1, J) + F_DIST_weighted
        inp = torchfun.relu(inp)
        # add eta to inp[u][u]
        # eta_matrix = Variable(self.eta * torch.eye(J).type(torchten))
        # if args.cuda:
        #     eta_matrix = eta_matrix.cuda()
        # inp += eta_matrix
        return torchfun.softmax(inp)


def train(net, optimizer):
    global W_for_r, l, totalR
    start_time = time.time()

    # build input
    if args.cuda:
        W_for_r = W_for_r.cuda()

    # feed in data
    P = net(W_for_r).t()  # P is now weighted -> softmax

    # calculate loss
    Y = torch.mv(P, X)
    loss = torch.norm(Y - torch.mean(Y).expand_as(Y)).pow(2) / J + \
        l * torchfun.relu(torch.sum(net.R) - 1.0)

    # backpropagate
    optimizer.zero_grad()
    loss.backward()

    # update the rewards and constrain them
    optimizer.step()
    net.R.data = project_to_min(net.R.data, 0.0)
    l += (args.lambda_update * (torch.sum(net.R.data) - 1.0))  # update lambda

    end_time = time.time()
    return (end_time - start_time, loss.data[0], net.R.data.sum())

# =============================================================================
# utility functions for training and testing routines
# =============================================================================
def build_input(rt):
    """
    Builds the final input for the NN. Joins F_DIST and expanded R.
    """
    if args.expand_R:
        return torch.cat([F_DIST, rt.repeat(J, 1, 1)], dim=2)
    return torch.cat([F_DIST, rt.repeat(J, 1)], dim=2)

# =============================================================================
# main program
# =============================================================================
if __name__ == "__main__":
    read_set_data()
    net = MyNet(J, totalR, args.eta)
    if args.cuda:
        net.cuda()
    optimizer = optim.SGD(net.parameters(), lr=args.lr, momentum=args.momentum, nesterov=True)

    total_time = 0
    for e in xrange(1, args.epochs + 1):
        train_res = train(net, optimizer)
        total_time += train_res[0]
        if e % 20 == 0:
            print("epoch=%5d, loss=%.10f, budget=%.10f" % (e, train_res[1], train_res[2]), train_res[0], l)

    print("determined rewards:\n", net.R.data.cpu().numpy() * 1000)
    print("total time: %.5f" % total_time)
| 45.102273 | 123 | 0.599899 | 1,077 | 7,938 | 4.267409 | 0.264624 | 0.027415 | 0.051784 | 0.015666 | 0.084856 | 0.049608 | 0.01262 | 0.01262 | 0.01262 | 0 | 0 | 0.020276 | 0.16125 | 7,938 | 176 | 124 | 45.102273 | 0.670021 | 0.240867 | 0 | 0.027523 | 0 | 0.009174 | 0.220845 | 0.038874 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045872 | false | 0 | 0.045872 | 0 | 0.137615 | 0.06422 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
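The model above turns each location's weighted inputs into a visit-probability distribution with a softmax. That step can be illustrated without torch; this is a plain-Python, numerically stable sketch of softmax itself, not the reward model:

```python
import math


def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating
    # so large inputs do not overflow math.exp.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


probs = softmax([2.0, 1.0, 0.1])
```

The outputs sum to 1 and preserve the input ordering, which is what lets the model treat them as per-location visit probabilities.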
8e1eb2172438f2b56adc51f147f65e93bc5948ab | 8,094 | py | Python | orchestrator/utils/functional.py | florisie/orchestrator-core | 7a5a997fc809cdf53dc942d1ee1fa945de4eb4d8 | [
"Apache-2.0"
] | 15 | 2021-02-18T15:30:25.000Z | 2022-02-19T18:57:51.000Z | orchestrator/utils/functional.py | florisie/orchestrator-core | 7a5a997fc809cdf53dc942d1ee1fa945de4eb4d8 | [
"Apache-2.0"
] | 70 | 2021-02-24T17:59:23.000Z | 2022-03-31T08:24:56.000Z | orchestrator/utils/functional.py | florisie/orchestrator-core | 7a5a997fc809cdf53dc942d1ee1fa945de4eb4d8 | [
"Apache-2.0"
] | 6 | 2021-02-23T18:10:15.000Z | 2022-02-15T16:07:38.000Z | # Copyright 2019-2020 SURF.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import itertools
from typing import Callable, Iterable, List, Optional, Sequence, Set, TypeVar, Union
import more_itertools
import structlog
logger = structlog.get_logger(__name__)
def first_available_or_next(values: Iterable[int], start: int = 0) -> int:
"""Return first available value or the next logical value.
>>> first_available_or_next([0, 1, 3])
2
>>> first_available_or_next([0, 1, 2, 3])
4
>>> first_available_or_next([1, 2, 3])
0
>>> first_available_or_next([])
0
>>> first_available_or_next([0, 1, 3], start=11)
11
>>> first_available_or_next([0, 1, 3], start=4)
4
>>> first_available_or_next([], 22)
22
>>> first_available_or_next([1, 100, 101], 33)
33
>>> first_available_or_next([11, 22, 33, 44, 55], 33)
34
Args:
values: an iterable of integer values.
start: set starting value.
Returns:
First available value or next logical one.
"""
# +2: one +1 so the candidate range reaches max(values) + 1, and another +1 because range's stop is exclusive.
stop = max(values, default=0) + 2
if start >= stop:
stop = start + 1
return min(set(range(start, stop)) - set(values))
def orig(func: Callable) -> Callable:
"""Return the function wrapped by one or more decorators.
Args:
func: step function
Returns:
Undecorated step function for testing purposes.
"""
f = func
while hasattr(f, "__wrapped__"):
f = f.__wrapped__ # type:ignore
return f
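Since `functools.wraps` sets `__wrapped__` on each wrapper it produces, `orig` can peel an arbitrary decorator stack. A standalone sketch (the `deco` decorator and `step` function are illustrative, not part of the module):

```python
import functools

def deco(f):
    @functools.wraps(f)  # records the wrapped callable in __wrapped__
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper

def orig(func):
    f = func
    while hasattr(f, "__wrapped__"):
        f = f.__wrapped__
    return f

@deco
@deco
def step():
    return "done"

# orig unwinds both decorator layers back to the undecorated function.
assert step.__wrapped__ is not orig(step)
assert orig(step)() == "done"
```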
def join_cs(*args: Union[Iterable[str], str]) -> str:
"""Return comma separated string from one or more comma separated strings or iterables of strings.
It handles empty strings and inserts commas properly.
See: `test_join_cs` for examples.
Args:
args: One or more comma separated strings or iterables that should be joined.
Returns:
A comma separated string.
"""
def to_iterable(value: Union[Iterable[str], str]) -> Iterable[str]:
if isinstance(value, str):
return filter(None, value.split(","))
return value
return ",".join(itertools.chain(*map(to_iterable, args)))
def expand_ranges(ranges: Sequence[Sequence[int]], inclusive: bool = False) -> List[int]:
"""Expand sequence of range definitions into sorted and deduplicated list of individual values.
A range definition is either a:
* one element sequence -> an individual value.
* two element sequence -> a range of values (either inclusive or exclusive).
>>> expand_ranges([[1], [2], [10, 12]])
[1, 2, 10, 11]
>>> expand_ranges([[1], [2], [10, 12]], inclusive=True)
[1, 2, 10, 11, 12]
>>> expand_ranges([[]])
Traceback (most recent call last):
...
ValueError: Expected 1 or 2 element list for range definition. Got 0 element list instead.
Resulting list is sorted::
>>> expand_ranges([[100], [1, 4]], inclusive=True)
[1, 2, 3, 4, 100]
Args:
ranges: sequence of range definitions
inclusive: are the stop values of the range definition inclusive or exclusive.
Returns:
Sorted deduplicated list of individual values.
Raises:
ValueError: if range definition is not a one or two element sequence.
"""
values: Set[int] = set()
for r in ranges:
if len(r) == 2:
values.update(range(r[0], r[1] + (1 if inclusive else 0)))
elif len(r) == 1:
values.add(r[0])
else:
raise ValueError(f"Expected 1 or 2 element list for range definition. Got {len(r)} element list instead.")
return sorted(values)
T = TypeVar("T")
def as_t(value: Optional[T]) -> T:
"""Cast `value` to non-Optional.
One often needs to assign a value that was typed as being `Optional` to a variable that is typed non-Optional. MyPy
rightfully takes issue with these assignments (strict Optional checking is default since MyPy 0.600) unless we
have explicitly determined these values to be not `None`. The most succinct way to do that is using an `assert`
statement::
x: Optional[int] = 7
assert x is not None
y: int = x
However that gets tedious pretty fast. One might be inclined to turn off strict Optional checking. However that
would be a bad decision; `None` values will percolate through data structures and cause issues at locations far from
where they originally came from. A better solution would be to fail right where the issue occurred but using a
somewhat more convenient syntax.
Some languages such as Kotlin provide the `as` operator:
.. code-block:: kotlin
val x: Int? = 7 // ? declaring the Int to be nullable
val y: Int = x as Int
That is the inspiration for this function. `t` referring to the type being wrapped in an `Optional`. Hence `as_t`
meaning `as the non-Optional type`.
The above Python example now becomes::
x: Optional[int] = 7
y: int = as_t(x)
`as_t` checks whether the value passed to it is not `None`, satisfying MyPy. If it happens to be `None` it raises a
`ValueError`, satisfying our requirement to fail at the location where we require the value to be not None and not
somewhere far down the code path.
Args:
value: `Optional` value to be cast to non-Optional
Returns:
non-Optional value.
Raises:
ValueError: in case `value` is `None`
"""
if value is None:
raise ValueError("Trying to cast a value to non-Optional type failed due to value being None.")
return value
def ireplace(iterable: Iterable[T], old: T, new: T) -> Iterable[T]:
"""Replace one or more occurrences of a specific value in an iterable with another value.
The 'i' prefix indicates 'iterable' and is there to distinguish it from other similar functions.
>>> list(ireplace(["1-10", "", "22"], "", "0"))
['1-10', '0', '22']
Args:
iterable: The iterable that needs to have a specific value replaced for all its occurrences.
old: The value in the iterable to replace.
new: The value to replace `old` with.
Returns:
A new iterable with `old` values replaced by `new` values
"""
yield from more_itertools.replace(iterable, lambda v: v == old, [new])
def to_ranges(i: Iterable[int]) -> Iterable[range]:
"""Convert a sorted iterable of ints to an iterable of range objects.
IMPORTANT: the iterable passed in should be sorted and not contain duplicate elements.
Examples::
>>> list(to_ranges([2, 3, 4, 5, 7, 8, 9, 45, 46, 47, 49, 51, 53, 54, 55, 56, 57, 58, 59, 60, 61]))
[range(2, 6), range(7, 10), range(45, 48), range(49, 50), range(51, 52), range(53, 62)]
Args:
i: sorted iterable
Yields:
range object for each consecutive set of integers
"""
# The trick here is the key function (the lambda one) that calculates the difference between an element of the
# iterable `i` and its corresponding enumeration value. For consecutive values in the iterable, this difference
# will be the same! All these values (those with the same difference) are grouped by the `groupby` function. We
# return the first and last element to construct a `range` object
for _, g in itertools.groupby(enumerate(i), lambda t: t[1] - t[0]):
group = list(g)
yield range(group[0][1], group[-1][1] + 1)
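The `groupby` trick can be checked in isolation; a self-contained copy of the generator behaves as the docstring example promises:

```python
import itertools
from typing import Iterable

def to_ranges(i: Iterable[int]) -> Iterable[range]:
    # Consecutive integers share the same (value - index) difference,
    # so groupby collapses each run into one group.
    for _, g in itertools.groupby(enumerate(i), lambda t: t[1] - t[0]):
        group = list(g)
        yield range(group[0][1], group[-1][1] + 1)

assert list(to_ranges([2, 3, 4, 7, 8, 10])) == [range(2, 5), range(7, 9), range(10, 11)]
# Round trip: expanding the ranges recovers the original sorted input.
assert [n for r in to_ranges([2, 3, 4, 7]) for n in r] == [2, 3, 4, 7]
```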
| 33.308642 | 142 | 0.657401 | 1,195 | 8,094 | 4.400837 | 0.307113 | 0.031945 | 0.030424 | 0.03803 | 0.087659 | 0.062559 | 0.047347 | 0.042974 | 0.016733 | 0.016733 | 0 | 0.033851 | 0.244502 | 8,094 | 242 | 143 | 33.446281 | 0.826165 | 0.704102 | 0 | 0.047619 | 0 | 0.02381 | 0.092105 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.190476 | false | 0 | 0.095238 | 0 | 0.452381 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e218d1e4830fc81eed6cc871e828febdb5d8a5f | 4,884 | py | Python | cscs-checks/apps/gromacs/gromacs_check.py | ajocksch/reframe | f701ca5075d78cef2dbf9dc8b754bf70b1a04036 | [
"BSD-3-Clause"
] | null | null | null | cscs-checks/apps/gromacs/gromacs_check.py | ajocksch/reframe | f701ca5075d78cef2dbf9dc8b754bf70b1a04036 | [
"BSD-3-Clause"
] | null | null | null | cscs-checks/apps/gromacs/gromacs_check.py | ajocksch/reframe | f701ca5075d78cef2dbf9dc8b754bf70b1a04036 | [
"BSD-3-Clause"
] | null | null | null | import itertools
import os
import reframe as rfm
import reframe.utility.sanity as sn
class GromacsBaseCheck(rfm.RunOnlyRegressionTest):
def __init__(self, output_file):
super().__init__()
self.valid_prog_environs = ['PrgEnv-gnu']
self.executable = 'gmx_mpi'
# Reset sources dir relative to the SCS apps prefix
self.sourcesdir = os.path.join(self.current_system.resourcesdir,
'Gromacs')
self.keep_files = [output_file]
energy = sn.extractsingle(r'\s+Potential\s+Kinetic En\.\s+Total Energy'
r'\s+Conserved En\.\s+Temperature\n'
r'(\s+\S+){2}\s+(?P<energy>\S+)(\s+\S+){2}\n'
r'\s+Pressure \(bar\)\s+Constr\. rmsd',
output_file, 'energy', float, item=-1)
energy_reference = -3270799.9
self.sanity_patterns = sn.all([
sn.assert_found('Finished mdrun', output_file),
sn.assert_reference(energy, energy_reference, -0.001, 0.001)
])
self.perf_patterns = {
'perf': sn.extractsingle(r'Performance:\s+(?P<perf>\S+)',
output_file, 'perf', float)
}
self.modules = ['GROMACS']
self.maintainers = ['VH']
self.strict_check = False
self.use_multithreading = False
self.extra_resources = {
'switches': {
'num_switches': 1
}
}
class GromacsGPUCheck(GromacsBaseCheck):
def __init__(self, variant):
super().__init__('md.log')
self.valid_systems = ['daint:gpu', 'dom:gpu']
self.descr = 'GROMACS GPU check'
self.name = 'gromacs_gpu_%s_check' % variant
self.executable_opts = ('mdrun -dlb yes -ntomp 1 -npme 0 '
'-s herflat.tpr ').split()
self.variables = {'CRAY_CUDA_MPS': '1'}
self.tags = {'scs'}
self.num_gpus_per_node = 1
if self.current_system.name == 'dom':
self.num_tasks = 72
self.num_tasks_per_node = 12
else:
self.num_tasks = 192
self.num_tasks_per_node = 12
@rfm.simple_test
class GromacsGPUMaintCheck(GromacsGPUCheck):
def __init__(self):
super().__init__('maint')
self.tags |= {'maintenance'}
self.reference = {
'dom:gpu': {
'perf': (29.3, -0.05, None)
},
'daint:gpu': {
'perf': (42.0, -0.10, None)
},
}
@rfm.simple_test
class GromacsGPUProdCheck(GromacsGPUCheck):
def __init__(self):
super().__init__('prod')
self.tags |= {'production'}
self.reference = {
'dom:gpu': {
'perf': (29.3, -0.05, None)
},
'daint:gpu': {
'perf': (42.0, -0.20, None)
},
}
class GromacsCPUCheck(GromacsBaseCheck):
def __init__(self, variant):
super().__init__('md.log')
self.valid_systems = ['daint:mc', 'dom:mc']
self.descr = 'GROMACS CPU check'
self.name = 'gromacs_cpu_%s_check' % variant
self.executable_opts = ('mdrun -dlb yes -ntomp 1 -npme -1 '
'-nb cpu -s herflat.tpr ').split()
if self.current_system.name == 'dom':
self.num_tasks = 216
self.num_tasks_per_node = 36
else:
self.num_tasks = 576
self.num_tasks_per_node = 36
@rfm.simple_test
class GromacsCPUProdCheck(GromacsCPUCheck):
def __init__(self):
super().__init__('prod')
self.tags |= {'production'}
self.reference = {
'dom:mc': {
'perf': (42.7, -0.05, None)
},
'daint:mc': {
'perf': (70.4, -0.20, None)
},
}
# FIXME: This test is obsolete; it is kept only for reference.
@rfm.parameterized_test([1], [2], [4], [6], [8])
class GromacsCPUMonchAcceptance(GromacsBaseCheck):
def __init__(self, num_nodes):
super().__init__('md.log')
self.valid_systems = ['monch:compute']
self.descr = 'GROMACS %d-node CPU check on monch' % num_nodes
self.name = 'gromacs_cpu_monch_%d_node_check' % num_nodes
self.executable_opts = ('mdrun -dlb yes -ntomp 1 -npme -1 '
'-nsteps 5000 -nb cpu -s herflat.tpr ').split()
self.tags = {'monch_acceptance'}
self.num_tasks_per_node = 20
self.num_tasks = num_nodes * self.num_tasks_per_node
reference_by_nodes = {1: 2.6, 2: 5.1, 4: 11.1, 6: 15.8, 8: 20.6}
self.reference = {
'monch:compute': {
'perf': (reference_by_nodes[num_nodes], -0.15, None)
}
}
| 31.509677 | 79 | 0.524775 | 550 | 4,884 | 4.410909 | 0.310909 | 0.03751 | 0.054411 | 0.037098 | 0.340066 | 0.314509 | 0.241962 | 0.229596 | 0.229596 | 0.198269 | 0 | 0.035736 | 0.341114 | 4,884 | 154 | 80 | 31.714286 | 0.718148 | 0.022523 | 0 | 0.284553 | 0 | 0.00813 | 0.168309 | 0.025781 | 0 | 0 | 0 | 0.006494 | 0.01626 | 1 | 0.056911 | false | 0 | 0.03252 | 0 | 0.146341 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e21a21a2e04fd87d50fcbd6fe1e05543f8a0c63 | 3,490 | py | Python | tests/test_serialization.py | andsteing/being | 0d0dca71edc512df47fe5ff3bea692e728f90924 | [
"MIT"
] | 2 | 2021-11-11T12:16:43.000Z | 2022-01-13T06:06:20.000Z | tests/test_serialization.py | andsteing/being | 0d0dca71edc512df47fe5ff3bea692e728f90924 | [
"MIT"
] | 5 | 2022-01-13T08:01:54.000Z | 2022-02-22T12:28:02.000Z | tests/test_serialization.py | andsteing/being | 0d0dca71edc512df47fe5ff3bea692e728f90924 | [
"MIT"
] | 3 | 2022-01-11T18:16:35.000Z | 2022-01-13T13:14:26.000Z | import unittest
import enum
from typing import NamedTuple
import numpy as np
from numpy.testing import assert_equal
from scipy.interpolate import PPoly, CubicSpline, BPoly
from being.serialization import (
ENUM_LOOKUP, EOT, NAMED_TUPLE_LOOKUP, FlyByDecoder, dumps, enum_from_dict,
enum_to_dict, loads, named_tuple_as_dict, named_tuple_from_dict,
register_enum, register_named_tuple, _enum_type_qualname,
)
class TestSerialization(unittest.TestCase):
def assert_splines_equal(self, a, b):
assert_equal(a.x, a.x)
assert_equal(a.c, a.c)
self.assertEqual(a.extrapolate, b.extrapolate)
self.assertEqual(a.axis, b.axis)
def test_splines(self):
spline = CubicSpline([0, 1, 2, 4,], [[0, 1], [1, 0], [2, 1], [3, 0],])
splineCpy = loads(dumps(spline))
self.assert_splines_equal(spline, splineCpy)
def test_that_we_end_up_with_the_correct_spline_types(self):
spline = CubicSpline([0, 1, 3, 6], [0, 1, 0, -1])
ppoly = PPoly(spline.c, spline.x)
bpoly = BPoly.from_power_basis(spline)
self.assert_splines_equal(loads(dumps(spline)), ppoly)
self.assert_splines_equal(loads(dumps(ppoly)), ppoly)
self.assert_splines_equal(loads(dumps(bpoly)), bpoly)
def test_numpy_array(self):
arrays = [
np.array(1),
np.array(1.234),
np.random.random(10),
np.random.random((10, 2, 3)),
(255 * np.random.random((10, 2, 3))).astype(np.uint8),
]
for arr in arrays:
arrCpy = loads(dumps(arr))
assert_equal(arrCpy, arr)
def test_with_new_named_tuple(self):
Foo = NamedTuple('Foo', name=str, id=int)
foo = Foo('Calimero', 42)
dct = named_tuple_as_dict(foo)
self.assertEqual(dct, {
'type': 'Foo',
'name': 'Calimero',
'id': 42,
})
with self.assertRaises(RuntimeError):
named_tuple_from_dict(dct)
register_named_tuple(Foo)
foo2 = named_tuple_from_dict(dct)
self.assertEqual(foo, foo2)
self.assertEqual(foo, loads(dumps(foo)))
NAMED_TUPLE_LOOKUP.pop('Foo')
def test_with_enum(self):
Foo = enum.Enum('Foo', 'FIRST SECOND THIRD')
foo = Foo.SECOND
dct = enum_to_dict(foo)
self.assertEqual(dct, {
'type': _enum_type_qualname(Foo),
'members': list(Foo.__members__),
'value': foo.value,
})
with self.assertRaises(RuntimeError):
enum_from_dict(dct)
register_enum(Foo)
foo2 = enum_from_dict(dct)
self.assertEqual(foo, foo2)
self.assertEqual(foo, loads(dumps(foo)))
ENUM_LOOKUP.pop(_enum_type_qualname(Foo))
def test_a_set_maps_back_to_itself(self):
x = {1, 2, 'Hello, world!'}
y = loads(dumps(x))
self.assertEqual(x, y)
class TestFlyByDecoder(unittest.TestCase):
def test_doc_example(self):
dec = FlyByDecoder()
snippets = [
'"Hello, World!"\x041.23',
'4\x04[1, 2, 3, 4]\x04{"a":',
' 1, "b": 2}\x04'
]
self.assertEqual(list(dec.decode_more(snippets[0])), ['Hello, World!'])
self.assertEqual(list(dec.decode_more(snippets[1])), [1.234, [1, 2, 3, 4]])
self.assertEqual(list(dec.decode_more(snippets[2])), [{'a': 1, 'b': 2}])
if __name__ == '__main__':
unittest.main()
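The `FlyByDecoder` contract exercised above — buffer incoming chunks and emit complete JSON values separated by `\x04` (End Of Transmission) — can be sketched independently of the `being` package. This is a hypothetical minimal re-implementation, not the library's code:

```python
import json

class FlyByDecoder:
    """Accumulate text chunks and decode every complete \\x04-terminated value."""

    def __init__(self, term="\x04"):
        self.term = term
        self.buf = ""

    def decode_more(self, chunk):
        self.buf += chunk
        # Everything before the last terminator is complete; the tail is kept.
        *complete, self.buf = self.buf.split(self.term)
        return [json.loads(part) for part in complete]

dec = FlyByDecoder()
assert dec.decode_more('"Hello, World!"\x041.23') == ["Hello, World!"]
assert dec.decode_more('4\x04[1, 2, 3, 4]\x04{"a":') == [1.234, [1, 2, 3, 4]]
assert dec.decode_more(' 1, "b": 2}\x04') == [{"a": 1, "b": 2}]
```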
| 28.842975 | 83 | 0.601719 | 452 | 3,490 | 4.418142 | 0.25 | 0.090135 | 0.045068 | 0.044066 | 0.27992 | 0.224337 | 0.161242 | 0.064096 | 0.064096 | 0.064096 | 0 | 0.031141 | 0.263897 | 3,490 | 120 | 84 | 29.083333 | 0.746205 | 0 | 0 | 0.113636 | 0 | 0 | 0.049284 | 0 | 0 | 0 | 0 | 0 | 0.261364 | 1 | 0.090909 | false | 0 | 0.079545 | 0 | 0.193182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e267a431f39688558b66516192bbde659a69318 | 3,056 | py | Python | delta/utils/solver/solver_utils.py | hchang000/delta | 89320bd538e360d939c50d9f303e81554f6ce7ac | [
"Apache-2.0"
] | 1 | 2019-07-15T11:42:38.000Z | 2019-07-15T11:42:38.000Z | delta/utils/solver/solver_utils.py | hchang000/delta | 89320bd538e360d939c50d9f303e81554f6ce7ac | [
"Apache-2.0"
] | null | null | null | delta/utils/solver/solver_utils.py | hchang000/delta | 89320bd538e360d939c50d9f303e81554f6ce7ac | [
"Apache-2.0"
] | null | null | null | # Copyright (C) 2017 Beijing Didi Infinity Technology and Development Co.,Ltd.
# All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Solver utilities."""
import os
import tensorflow as tf
from absl import logging
def get_checkpoint_dir(config):
"""Get the directory of the checkpoint."""
model_path = config['solver']['saver']['model_path']
checkpoint_dir = os.path.join(model_path, "model")
return checkpoint_dir
def get_ckpt_state(config):
"""Get the checkpoint state."""
checkpoint_dir = get_checkpoint_dir(config)
ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
return ckpt
def get_session_conf(config):
"""Get the config for the tensorflow session."""
tfconf = config['solver']['run_config']
session_conf = tf.ConfigProto(
allow_soft_placement=tfconf['allow_soft_placement'],
log_device_placement=tfconf['log_device_placement'],
intra_op_parallelism_threads=tfconf['intra_op_parallelism_threads'],
inter_op_parallelism_threads=tfconf['inter_op_parallelism_threads'],
gpu_options=tf.GPUOptions(allow_growth=tfconf['allow_growth']))
return session_conf
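`get_session_conf` only reads five keys from `config['solver']['run_config']`; a minimal illustrative config (the values here are placeholders, not project defaults) would be:

```python
# Hypothetical minimal config shape expected by get_session_conf.
config = {
    "solver": {
        "run_config": {
            "allow_soft_placement": True,
            "log_device_placement": False,
            "intra_op_parallelism_threads": 0,   # 0 lets TF pick a thread count
            "inter_op_parallelism_threads": 0,
            "allow_growth": True,                # grow GPU memory on demand
        }
    }
}

tfconf = config["solver"]["run_config"]
assert set(tfconf) == {
    "allow_soft_placement",
    "log_device_placement",
    "intra_op_parallelism_threads",
    "inter_op_parallelism_threads",
    "allow_growth",
}
```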
def to_saved_model(config, sess, inputs: dict, outputs: dict):
"""Save model to tensorflow SavedModel."""
export_path_base = config["solver"]["service"]["model_path"]
model_version = config["solver"]["service"]["model_version"]
export_path = os.path.join(
tf.compat.as_bytes(export_path_base), tf.compat.as_bytes(model_version))
export_path = os.path.abspath(export_path)
logging.info('Exporting model to: {}'.format(export_path))
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
# Build the signature_def_map.
signature_def = tf.saved_model.predict_signature_def(inputs, outputs)
builder.add_meta_graph_and_variables(
sess, [tf.saved_model.tag_constants.SERVING],
signature_def_map={'infer': signature_def},
strip_default_attrs=True)
builder.save(as_text=True)
logging.info('Done exporting!')
def save_infer_res(config, logits, preds):
"""Save the result of inference."""
res_file = config["data"]["infer"]["res"]
res_dir = os.path.dirname(res_file)
if not os.path.exists(res_dir):
os.makedirs(res_dir)
logging.info("Save inference result to: {}".format(res_file))
with open(res_file, "w") as in_f:
for logit, pred in zip(logits, preds):
in_f.write(" ".join(["{:.3f}".format(num) for num in logit]) +
"\t{}\n".format(pred))
| 38.683544 | 80 | 0.717932 | 424 | 3,056 | 4.962264 | 0.403302 | 0.03327 | 0.038023 | 0.015209 | 0.026616 | 0.026616 | 0 | 0 | 0 | 0 | 0 | 0.003421 | 0.139071 | 3,056 | 78 | 81 | 39.179487 | 0.796275 | 0.30072 | 0 | 0 | 0 | 0 | 0.141081 | 0.026781 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.066667 | 0 | 0.244444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e2a7ecb5e30e765a31e831a79df046ffd35183a | 5,255 | py | Python | splparser/parser.py | lowell80/splparser | 2b33d74d59565cc53ae47452126621300dba9ed8 | [
"BSD-3-Clause"
] | 31 | 2015-12-02T15:41:06.000Z | 2022-03-16T22:27:23.000Z | splparser/parser.py | lowell80/splparser | 2b33d74d59565cc53ae47452126621300dba9ed8 | [
"BSD-3-Clause"
] | 1 | 2021-06-24T11:23:00.000Z | 2021-06-24T11:23:00.000Z | splparser/parser.py | lowell80/splparser | 2b33d74d59565cc53ae47452126621300dba9ed8 | [
"BSD-3-Clause"
] | 15 | 2015-12-10T16:48:20.000Z | 2022-02-26T18:12:44.000Z | #!/usr/bin/env python
import imp
import logging
import os
import sys
import ply.yacc
LOGDIR = 'logs'
logging.getLogger().setLevel(logging.DEBUG)
class SPLParser(object):
"""Represents a parser object that can parse SPL queries.
You should not need to create one of these. Simply import
parse from the splparser module.
"""
def __init__(self, lexermod, parsetab_name, parsetab_dir, logname, rulesmod, optimize=True):
"""Creates an SPLParser object.
:param self: The object being created
:type self: SPLParser
:param lexermod: The corresponding lexer module
:type lexermod: module
:param parsetab_name: The name of the toplevel parse table file
:type parsetab_name: str
:param parsetab_dir: The directory in which to store parse tables
:type parsetab_dir: str
:param logname: The name of the log
:type logname: str
:param rulesmod: The module containing the top-level rules
:type rulesmod: module
:param optimize: Whether or not to write a log (it slows performance and can cause the parser to sometimes fail due to too many open file handles -- one for each command parsed as they have separate log files)
:type optimize: bool
:rtype: SPLParser
"""
self.lexer = lexermod.lex()
modname, dirname = self.setup_parsetab(parsetab_name, parsetab_dir)
self.parsetab_name = modname
self.parsetab_dir = dirname
self.rules = rulesmod
self.optimize = optimize
if not optimize: # TODO: Why is this conditional necessary?
self.log = self.setup_log(logname)
self.parser = ply.yacc.yacc(module=self.rules,
debug=True,
debuglog=self.log,
tabmodule=self.parsetab_name,
outputdir=self.parsetab_dir,
optimize=optimize)
else:
self.parser = ply.yacc.yacc(module=self.rules,
tabmodule=self.parsetab_name,
outputdir=self.parsetab_dir,
optimize=optimize)
def setup_parsetab(self, parsetab_name, parsetab_dir):
loaded = False
try: # check for parsetabs in current installation
install_location = os.path.dirname(__file__)
path_to_parsetab = os.path.join(install_location, parsetab_dir, parsetab_name + '.py')
parsetab = imp.load_source(parsetab_name, path_to_parsetab)
loaded = True
install_parsetab_dir = os.path.join(install_location, parsetab_dir)
return parsetab_name, install_parsetab_dir
except IOError: # parsetab module does not exist in install location
pass
if not loaded:
try: # check for parsetabs in current directory
path_to_parsetab = os.path.join(parsetab_dir, parsetab_name + '.py')
parsetab = imp.load_source(parsetab_name, path_to_parsetab)
loaded = True
return parsetab_name, parsetab_dir
except IOError: # parsetab module does not exist in current directory
pass
if not loaded:
try: # in case the above failed, create dir for PLY to write parsetabs in
os.stat(parsetab_dir)
except:
try:
os.makedirs(parsetab_dir)
except OSError:
msg = "ERROR: Need permission to write to ./%s\n" % parsetab_dir
sys.stderr.write(msg)
raise
return parsetab_name, parsetab_dir
def setup_log(self, name):
"""Set up the log so that parsing info can be written out.
:param self: The current SPLParser object
:type self: SPLParser
:param name: The name of the log file
:type name: str
:rtype: logging.Logger
"""
try:
os.stat(LOGDIR)
except:
try:
os.makedirs(LOGDIR)
except OSError:
sys.stderr.write("WARNING: Can't write logs to ./" + LOGDIR + "\n")
logger = logging.getLogger(name)
logger.setLevel(logging.DEBUG)
filehandler = logging.FileHandler(LOGDIR + "/" + str(name) + ".log")
filehandler.setLevel(logging.DEBUG)
logger.addHandler(filehandler)
return logger
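The same file-handler wiring can be exercised standalone (using a temporary directory in place of the module's `LOGDIR`):

```python
import logging
import os
import tempfile

logdir = tempfile.mkdtemp()  # stand-in for LOGDIR
logger = logging.getLogger("spl_demo")
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler(os.path.join(logdir, "spl_demo.log"))
handler.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.debug("parser initialised")
handler.flush()

with open(os.path.join(logdir, "spl_demo.log")) as fh:
    assert "parser initialised" in fh.read()
```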
def parse(self, data):
"""Parse the given string.
:param self: The current SPLParser object
:type self: SPLParser
:param data: The string to parse
:type data: str
:rtype: ParseTreeNode
"""
parsetree = None
try:
if not self.optimize:
parsetree = self.parser.parse(data, lexer=self.lexer, debug=self.log)
else:
parsetree = self.parser.parse(data, lexer=self.lexer)
except NotImplementedError:
raise
except Exception:
raise
return parsetree
| 38.639706 | 217 | 0.577735 | 582 | 5,255 | 5.113402 | 0.286942 | 0.066532 | 0.033602 | 0.038642 | 0.320228 | 0.276546 | 0.244288 | 0.22379 | 0.171371 | 0.171371 | 0 | 0 | 0.354139 | 5,255 | 135 | 218 | 38.925926 | 0.876841 | 0.294386 | 0 | 0.433735 | 0 | 0 | 0.025985 | 0 | 0 | 0 | 0 | 0.007407 | 0 | 1 | 0.048193 | false | 0.024096 | 0.048193 | 0 | 0.168675 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e2b126355edbe84a1de3d96f2ced5993bbd3398 | 1,529 | py | Python | lib/cache.py | rx007/cheat.sh | 0df9db42214dc0bda0f474171583dcedbbfe5717 | [
"MIT"
] | 28,992 | 2017-05-24T12:21:03.000Z | 2022-03-31T09:35:08.000Z | lib/cache.py | rx007/cheat.sh | 0df9db42214dc0bda0f474171583dcedbbfe5717 | [
"MIT"
] | 274 | 2017-05-24T17:14:48.000Z | 2022-03-31T12:50:14.000Z | lib/cache.py | rx007/cheat.sh | 0df9db42214dc0bda0f474171583dcedbbfe5717 | [
"MIT"
] | 1,578 | 2017-05-24T20:40:41.000Z | 2022-03-31T15:23:42.000Z | """
Cache implementation.
Currently only two types of cache are allowed:
* "none" cache switched off
* "redis" use redis for cache
Configuration parameters:
cache.type = redis | none
cache.redis.db
cache.redis.host
cache.redis.port
"""
import os
import json
from config import CONFIG
_REDIS = None
if CONFIG['cache.type'] == 'redis':
import redis
_REDIS = redis.Redis(
host=CONFIG['cache.redis.host'],
port=CONFIG['cache.redis.port'],
db=CONFIG['cache.redis.db'])
_REDIS_PREFIX = ''
if CONFIG.get("cache.redis.prefix", ""):
_REDIS_PREFIX = CONFIG["cache.redis.prefix"] + ":"
def put(key, value):
"""
Save `value` with `key`, and serialize it if needed
"""
if _REDIS_PREFIX:
key = _REDIS_PREFIX + key
if CONFIG["cache.type"] == "redis" and _REDIS:
if isinstance(value, (dict, list)):
value = json.dumps(value)
_REDIS.set(key, value)
def get(key):
"""
Read `value` by `key`, and deserialize it if needed
"""
if _REDIS_PREFIX:
key = _REDIS_PREFIX + key
if CONFIG["cache.type"] == "redis" and _REDIS:
value = _REDIS.get(key)
try:
value = json.loads(value)
except (ValueError, TypeError):
pass
return value
return None
def delete(key):
"""
Remove `key` from the database
"""
if _REDIS:
if _REDIS_PREFIX:
key = _REDIS_PREFIX + key
_REDIS.delete(key)
return None
| 20.662162 | 55 | 0.590582 | 188 | 1,529 | 4.675532 | 0.308511 | 0.125142 | 0.095563 | 0.086462 | 0.21843 | 0.193402 | 0.193402 | 0.159272 | 0.159272 | 0.159272 | 0 | 0 | 0.286462 | 1,529 | 73 | 56 | 20.945205 | 0.805683 | 0.257031 | 0 | 0.27027 | 0 | 0 | 0.11819 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0.027027 | 0.108108 | 0 | 0.27027 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e2b2151bad3cd55a3b085be74899c264fdd8090 | 4,614 | py | Python | scripts/consensus.py | juanjo75es/s-aligner | 0e3dd3425ce40d5c2e062410f308300437f272b2 | [
"MIT"
] | null | null | null | scripts/consensus.py | juanjo75es/s-aligner | 0e3dd3425ce40d5c2e062410f308300437f272b2 | [
"MIT"
] | null | null | null | scripts/consensus.py | juanjo75es/s-aligner | 0e3dd3425ce40d5c2e062410f308300437f272b2 | [
"MIT"
] | null | null | null | #! /usr/bin/env python3
# -*- coding: utf-8 -*-
from Bio import SeqIO
import argparse
import sys
from argparse import RawTextHelpFormatter
import numpy as np
from collections import namedtuple
MySNP = namedtuple("MySNP", "pos prev new")
snps=[]
def print_vnc(reference, snps, out):
f = open(out+".vcf", "w")
f.write("##fileformat=VCFv4.1\n")
f.write("##source=s-aligner\n")
f.write("#CHROM POS ID REF ALT QUAL FILTER\n")
for snp in snps:
f.write("xxx")
f.write("\t")
f.write(str(snp.pos))
f.write("\t.\t")
f.write(str(snp.prev))
f.write("\t")
f.write(str(snp.new))
f.write("\t.\tPASS")
f.write("\n")
f.close()
def consensus(fasta, reference, out):
for record in SeqIO.parse(reference, "fasta"):
referenceseq = record.seq
print(len(referenceseq))
c_columns = np.array([0]*len(referenceseq)*2)
g_columns = np.array([0]*len(referenceseq)*2)
t_columns = np.array([0]*len(referenceseq)*2)
a_columns = np.array([0]*len(referenceseq)*2)
gap_columns = np.array([0]*len(referenceseq)*2)
prevseq=''
lastseq=''
maxlen=0
for record in SeqIO.parse(fasta, "fasta"):
recordseq = record.seq
if(len(prevseq)>0):
found_first_char=False
pos=0
for c in prevseq:
if( c!='-'):
if(not found_first_char):
found_first_char = True
if(c=='A'):
a_columns[pos] +=1
elif(c=='C'):
c_columns[pos] +=1
elif(c=='G'):
g_columns[pos] +=1
elif(c=='T'):
t_columns[pos] +=1
elif (found_first_char):
gap_columns[pos] += 1
pos += 1
if(len(recordseq) > maxlen):
maxlen=len(recordseq)
lastseq=recordseq
prevseq=recordseq
consensus= ""
posreal=1
for i in range(maxlen):
x = max(a_columns[i],c_columns[i],g_columns[i],t_columns[i],gap_columns[i])
c='ñ'
if(a_columns[i] == x):
c='A'
elif(c_columns[i] == x):
c='C'
elif(g_columns[i] == x):
c='G'
elif(t_columns[i] == x):
c='T'
elif(gap_columns[i] == x):
c='-'
if(x==0):
c='N'
if(c!=lastseq[i] and c!='ñ'):
snp=MySNP(posreal,lastseq[i],c)
snps.append(snp)
#print(str(posreal)+":: "+str(lastseq[i])+" -> "+str(c))
if(c!='-'):
consensus = "".join((consensus, c))
posreal+=1
#print("consensus:"+consensus)
f = open(out, "w")
f.write(">consensus")
f.write("\n")
f.write(consensus)
f.write("\n")
f.close()
print_vnc(reference,snps,out)
#print(snps)
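The per-column tally maintained by the loop above is a straightforward majority vote. A toy version on a three-sequence alignment (plain lists instead of numpy arrays, hypothetical sequences):

```python
# Toy gapped alignment: three sequences of equal length.
seqs = ["ACG-", "ACGT", "AAGT"]
length = len(seqs[0])

# One counter per symbol and column, mirroring the *_columns arrays above.
counts = {c: [0] * length for c in "ACGT-"}
for s in seqs:
    for pos, base in enumerate(s):
        counts[base][pos] += 1

# Majority vote per column; ties fall to the first symbol in "ACGT-".
consensus = "".join(
    max("ACGT-", key=lambda c: counts[c][pos]) for pos in range(length)
)
assert consensus == "ACGT"  # the gap in seqs[0] is outvoted by the two Ts
```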
if __name__ == '__main__':
parser = argparse.ArgumentParser(prog='consensus', usage='%(prog)s -i inputFasta -r referenceFasta -o outputFasta', description="""
*****************************************************************************
**********************************consensus**********************************
** The `consensus` script builds a majority-vote consensus sequence from **
** an aligned FASTA file, writes it to the output file, and reports the **
** differences against the reference sequence as SNPs in a VCF file. **
*****************************************************************************""", formatter_class=RawTextHelpFormatter)
parser.add_argument("-i", metavar="", dest="inputFASTA",
help="Specify the name of the input file")
parser.add_argument("-r", metavar="", dest="inputREF",
help="Specify the name of the reference file")
parser.add_argument("-o", metavar="", dest="inputOUT",
help="Specify the name for the output file")
args = parser.parse_args()
if len(sys.argv) < 2:
print(parser.print_help())
if args.inputFASTA is None and args.inputOUT is None:
print("You haven't specified an input or output silly")
elif args.inputFASTA is None:
print("You can't give an output without an input")
elif args.inputOUT is None:
print("Provide and output file")
else:
consensus(args.inputFASTA, args.inputREF, args.inputOUT) | 35.221374 | 127 | 0.504551 | 561 | 4,614 | 4.067736 | 0.285205 | 0.042068 | 0.030675 | 0.032866 | 0.218668 | 0.124452 | 0.104294 | 0 | 0 | 0 | 0 | 0.009065 | 0.306675 | 4,614 | 131 | 128 | 35.221374 | 0.704283 | 0.03186 | 0 | 0.077586 | 0 | 0 | 0.241093 | 0.056688 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017241 | false | 0.008621 | 0.051724 | 0 | 0.068966 | 0.060345 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e2cb35cee027810b3a9f3b5a2585ec05cb53c10 | 3,287 | py | Python | scripts/backup_old/metadata.py | RodrigoNaves/ginan-bitbucket-update-tests | 4bd5cc0a9dd0e94b1c2d8b35385e128404009b0c | [
"Apache-2.0"
] | 73 | 2021-07-08T23:35:08.000Z | 2022-03-31T15:17:58.000Z | scripts/backup_old/metadata.py | RodrigoNaves/ginan-bitbucket-update-tests | 4bd5cc0a9dd0e94b1c2d8b35385e128404009b0c | [
"Apache-2.0"
] | 5 | 2021-09-27T14:27:32.000Z | 2022-03-21T23:50:02.000Z | scripts/backup_old/metadata.py | RodrigoNaves/ginan-bitbucket-update-tests | 4bd5cc0a9dd0e94b1c2d8b35385e128404009b0c | [
"Apache-2.0"
] | 39 | 2021-07-12T05:42:51.000Z | 2022-03-31T15:15:34.000Z | import requests
import json
import argparse
import xml.etree.ElementTree as ET

#----------------------------------------
# example ---> python cost_metadata.py -s MOBS
parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('-s', dest='station', default='', help="station, 4 digits, bala")
args = parser.parse_args()

mdat = {}

#----------------------------------------
def getStationSetUp(station_ID, archive_url='https://gws.geodesy.ga.gov.au/setups/search/findCurrentByFourCharacterId?id='):
    """
    Example of how to get the latest metadata from the GA archive.
    This example will return the latest metadata for each station.
    To get the latest set-up of a station you can run the following command:
        curl 'https://gws.geodesy.ga.gov.au/setups/search/findCurrentByFourCharacterId?id=MOBS' -
    """
    archive_url = archive_url + station_ID
    request = requests.get(archive_url)
    request.raise_for_status()
    jdat = json.loads(request.content)
    for dd in jdat['equipmentInUse']:
        if dd['content']['id']['equipmentType'] == "gnss receiver":
            mdat['rcvT'] = dd['content']['id']['type']
            mdat['rcvN'] = dd['content']['id']['serialNumber']
        elif dd['content']['id']['equipmentType'] == "gnss antenna":
            mdat['antT'] = dd['content']['id']['type']
            mdat['antN'] = dd['content']['id']['serialNumber']
            mdat['antdU'] = dd['content']['configuration']['markerArpUpEccentricity']
            mdat['antdN'] = dd['content']['configuration']['markerArpNorthEccentricity']
            mdat['antdE'] = dd['content']['configuration']['markerArpEastEccentricity']
    return 1


def getStationLog(station_ID, archive_url='https://gws.geodesy.ga.gov.au/siteLogs/search/findByFourCharacterId?id='):
    """
    To get the full site log you can run the command:
        curl 'https://gws.geodesy.ga.gov.au/siteLogs/search/findByFourCharacterId?id=MOBS&format=geodesyml' -i
    """
    ns = {
        'geo': 'urn:xml-gov-au:icsm:egeodesy:0.4',
        'gml': 'http://www.opengis.net/gml/3.2',
        'xlink': 'http://www.w3.org/1999/xlink',
        'gmd': 'http://www.isotc211.org/2005/gmd',
        'gmx': 'http://www.isotc211.org/2005/gmx',
        'om': 'http://www.opengis.net/om/2.0',
        'gco': 'http://www.isotc211.org/2005/gco',
        'xsi': 'http://www.w3.org/2001/XMLSchema-instance',
    }
    archive_url = archive_url + station_ID + '&format=geodesyml'
    request = requests.get(archive_url)
    request.raise_for_status()
    root = ET.fromstring(request.content)
    mdat['domes'] = root.findall('.//geo:siteIdentification//geo:iersDOMESNumber', ns)[0].text
    pos = root.findall('.//geo:siteLocation//geo:geodeticPosition//gml:pos', ns)[0].text.split()
    mdat['lat'] = pos[0]
    mdat['long'] = pos[1]
    mdat['height'] = pos[2]


getStationSetUp(args.station)
getStationLog(args.station)
print('{0:},{1:},{2:},{3:},{4:},{5:},{6:},{7:},{8:},{9:},{10:}'.format(args.station, mdat['domes'], mdat['lat'], mdat['long'], mdat['height'], mdat['rcvT'], mdat['rcvN'], mdat['antT'], mdat['antdU'], mdat['antdN'], mdat['antdE']))
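The `findall` calls above resolve the `geo:`/`gml:` prefixes through the `ns` dictionary rather than through the document itself. A minimal self-contained sketch of the same namespaced-XPath pattern (the XML snippet here is invented for illustration):

```python
import xml.etree.ElementTree as ET

xml = ('<site xmlns:gml="http://www.opengis.net/gml/3.2">'
       '<gml:pos>-37.8 144.9 40.0</gml:pos></site>')
root = ET.fromstring(xml)
ns = {'gml': 'http://www.opengis.net/gml/3.2'}

# The prefix in the XPath string is looked up in the ns dict,
# so any prefix works as long as the URI matches the document.
pos = root.findall('.//gml:pos', ns)[0].text.split()
print(pos)  # → ['-37.8', '144.9', '40.0']
```

This is also why the `ttp://` typo in the original `gco` namespace URI mattered: a namespace entry only matches if the URI string is byte-identical to the one in the document.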

# File: salt/utils/beacons.py (Noah-Huppert/salt, Apache-2.0)
"""
Utilities for beacons
"""

import copy


def remove_hidden_options(config, whitelist):
    """
    Remove any hidden options not whitelisted
    """
    for entry in copy.copy(config):
        for func in entry:
            if func.startswith("_") and func not in whitelist:
                config.remove(entry)
    return config
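Iterating over `copy.copy(config)` while removing from `config` itself is deliberate: removing from a list you are currently iterating skips the element that slides into the removed slot. A small demonstration with invented entries:

```python
import copy

items = [{'_a': 1}, {'_b': 2}, {'keep': 3}]

# Wrong: mutating the list being iterated skips the element
# that shifts into the freed index after each removal.
broken = list(items)
for entry in broken:
    if any(k.startswith('_') for k in entry):
        broken.remove(entry)
print(broken)  # → [{'_b': 2}, {'keep': 3}] -- {'_b': 2} was skipped

# Right: iterate a shallow copy, mutate the original.
fixed = list(items)
for entry in copy.copy(fixed):
    if any(k.startswith('_') for k in entry):
        fixed.remove(entry)
print(fixed)  # → [{'keep': 3}]
```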

# File: k3rn3l-ctf-2021/united-trees-of-america/main.py (AZ-0/Writeups, MIT)
import random
import math
from Crypto.Util.number import *


def isPrime(p):
    for i in range(2, math.isqrt(p) + 1):
        if p % i == 0:
            return False
    return True


flag = bytes_to_long(open('flag.txt', 'rb').read())

p = int(input('Enter a prime: '))
assert 10 < p, 'Prime too small'
assert p < 250, 'Prime too big'
assert isPrime(p), 'Number not prime'

coeffs = [random.getrandbits(128) for _ in range(1000)]
k = sum(coeffs[i] for i in range(0, len(coeffs), p - 1))
coeffs[0] += flag - k


def poly(coeffs, n, p):
    return sum(c * pow(n, i, p) for i, c in enumerate(coeffs)) % p


n = int(input('Enter a number: '))
assert 1 < n < p - 1, "We're feeling sneaky today, hmm?"

op = 0
for i in range(1, p):
    op += poly(coeffs, pow(n, i, p), p)
print(op % p)
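The leaked value `op` sums the polynomial over the powers of `n` mod `p`. When `n` generates the multiplicative group, the sum collapses by the classical power-sum identity: the sum of `x**c` over the nonzero residues mod a prime `p` is `p - 1` (that is, `-1`) when `(p - 1)` divides `c`, and `0` otherwise. A quick numeric check of that identity (the prime 11 is an arbitrary choice):

```python
# Verify: sum of x**c over x = 1..p-1 (mod prime p) is p-1 when
# (p-1) | c (every term is 1 by Fermat's little theorem), else 0
# (geometric series over a generator cancels).
p = 11
for c in range(0, 25):
    s = sum(pow(x, c, p) for x in range(1, p)) % p
    expected = (p - 1) if c % (p - 1) == 0 else 0
    assert s == expected, (c, s)
print('identity holds for p =', p)
```

This is what makes the challenge solvable: only the coefficients at indices divisible by `p - 1` survive in `op`, which is exactly the set the author adjusted with `coeffs[0] += flag - k`.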

# File: wtm_envs/mujoco/ant_four_rooms/navigate.py (knowledgetechnologyuhh/goal_conditioned_RL_baselines, MIT)
from gym import utils
from wtm_envs.mujoco import ant_env
from wtm_envs.mujoco.hook_env_pddl import PDDLHookEnv
import numpy as np


class AntFourRoomsEnv(ant_env.AntEnv, utils.EzPickle):
    def __init__(self, reward_type='sparse'):
        name = "ant_four_rooms.xml"

        # Provide initial state space consisting of the ranges for all joint angles and velocities.
        # In the Ant Reacher task, we use a random initial torso position and fixed values for the remainder.
        initial_joint_pos = np.array([0, 0, 0.55, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, -1.0, 0.0, -1.0, 0.0, 1.0])
        initial_joint_pos = np.reshape(initial_joint_pos, (len(initial_joint_pos), 1))
        initial_joint_ranges = np.concatenate((initial_joint_pos, initial_joint_pos), 1)
        initial_joint_ranges[0] = np.array([-6, 6])
        initial_joint_ranges[1] = np.array([-6, 6])

        # Concatenate velocity ranges
        initial_state_space = np.concatenate(
            (initial_joint_ranges, np.zeros((len(initial_joint_ranges) - 1, 2))), 0)

        # Provide end goal space.
        max_range = 6
        goal_space_train = [[-max_range, max_range], [-max_range, max_range], [0.45, 0.55]]
        goal_space_test = [[-max_range, max_range], [-max_range, max_range], [0.45, 0.55]]

        # Provide a function that maps from the state space to the end goal space.
        # This is used to
        # (i) determine whether the agent should be given the sparse reward and
        # (ii) for Hindsight Experience Replay to determine which end goal was achieved after a sequence of actions.
        project_state_to_end_goal = lambda sim, state: state[:3]

        # Set end goal achievement thresholds. If the agent is within the threshold for each dimension,
        # the end goal has been achieved and the reward of 0 is granted.
        # For the Ant Reacher task, the end goal will be the desired (x,y) position of the torso.
        len_threshold = 0.4
        height_threshold = 0.2
        end_goal_thresholds = np.array([len_threshold, len_threshold, height_threshold])

        # Provide range for each dimension of subgoal space in order to configure subgoal actor networks.
        # Subgoal space can be the same as the state space or some other projection out of the state space.
        # The subgoal space in the Ant Reacher task is the desired (x,y,z) position and (x,y,z) translational velocity of the torso.
        cage_max_dim = 8
        max_height = 1
        max_velo = 3
        subgoal_bounds = np.array(
            [[-cage_max_dim, cage_max_dim], [-cage_max_dim, cage_max_dim], [0, max_height], [-max_velo, max_velo],
             [-max_velo, max_velo]])

        # Provide state to subgoal projection function.
        # a = np.concatenate((sim.data.qpos[:2], np.array([4 if sim.data.qvel[i] > 4 else -4 if sim.data.qvel[i] < -4 else sim.data.qvel[i] for i in range(3)])))
        project_state_to_subgoal = lambda sim, state: np.concatenate((sim.data.qpos[:2], np.array(
            [1 if sim.data.qpos[2] > 1 else sim.data.qpos[2]]), np.array(
            [3 if sim.data.qvel[i] > 3 else -3 if sim.data.qvel[i] < -3 else sim.data.qvel[i] for i in range(2)])))

        # Set subgoal achievement thresholds
        velo_threshold = 0.8
        quat_threshold = 0.5
        # subgoal_thresholds = np.array([len_threshold, len_threshold, height_threshold, quat_threshold, quat_threshold, quat_threshold, quat_threshold, velo_threshold, velo_threshold, velo_threshold])
        subgoal_thresholds = np.array([len_threshold, len_threshold, height_threshold, velo_threshold, velo_threshold])

        ant_env.AntEnv.__init__(
            self, 'ant_four_rooms/environment.xml', n_substeps=15,
            reward_type=reward_type, name=name, goal_space_train=goal_space_train, goal_space_test=goal_space_test,
            project_state_to_end_goal=project_state_to_end_goal, project_state_to_subgoal=project_state_to_subgoal,
            end_goal_thresholds=end_goal_thresholds, initial_state_space=initial_state_space,
            subgoal_bounds=subgoal_bounds, subgoal_thresholds=subgoal_thresholds)
        utils.EzPickle.__init__(self)
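The nested conditional expressions inside `project_state_to_subgoal` clamp each velocity component to [-3, 3]. A standalone sketch of that clamping (sample values are invented); `np.clip(qvel, -3, 3)` is the vectorised NumPy equivalent:

```python
# Scalar form of the conditional-expression clamp used in the lambda above.
def clamp(v, lo=-3, hi=3):
    return hi if v > hi else lo if v < lo else v

qvel = [5.2, -4.1, 0.5]  # invented sample velocity components
clipped = [clamp(v) for v in qvel]
print(clipped)  # → [3, -3, 0.5]
```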

# File: src/OTLMOW/OTLModel/Datatypes/DtcSierbeplAanleg.py (davidvlaminck/OTLMOW, MIT)
# coding=utf-8
from OTLMOW.OTLModel.BaseClasses.AttributeInfo import AttributeInfo
from OTLMOW.OTLModel.BaseClasses.OTLAttribuut import OTLAttribuut
from OTLMOW.OTLModel.Datatypes.ComplexField import ComplexField
from OTLMOW.OTLModel.Datatypes.KlAanplantingswijzeSierbeplanting import KlAanplantingswijzeSierbeplanting
from OTLMOW.OTLModel.Datatypes.KlSierbeplContainer import KlSierbeplContainer
from OTLMOW.OTLModel.Datatypes.KlSierbeplPlantmaat import KlSierbeplPlantmaat
from OTLMOW.OTLModel.Datatypes.KlVegetatiePlantverband import KlVegetatiePlantverband
from OTLMOW.OTLModel.Datatypes.NonNegIntegerField import NonNegIntegerField


# Generated with OTLComplexDatatypeCreator. To modify: extend, do not edit
class DtcSierbeplAanlegWaarden(AttributeInfo):
    def __init__(self, parent=None):
        AttributeInfo.__init__(self, parent)
        self._aanplantingswijze = OTLAttribuut(field=KlAanplantingswijzeSierbeplanting,
                                               naam='aanplantingswijze',
                                               label='aanplantingswijze',
                                               objectUri='https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg.aanplantingswijze',
                                               definition='Manier van aanplanten.',
                                               owner=self)

        self._containermaat = OTLAttribuut(field=KlSierbeplContainer,
                                           naam='containermaat',
                                           label='containermaat',
                                           objectUri='https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg.containermaat',
                                           definition='De grootte van de pot of container waarin de plant wordt geleverd. De P staat voor pot, de C voor container. Het getal geeft de grootte weer in centimeter.',
                                           owner=self)

        self._plantdichtheid = OTLAttribuut(field=NonNegIntegerField,
                                            naam='plantdichtheid',
                                            label='plantdichtheid',
                                            objectUri='https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg.plantdichtheid',
                                            definition='Aantal planten per vierkante meter.',
                                            owner=self)

        self._plantmaat = OTLAttribuut(field=KlSierbeplPlantmaat,
                                       naam='plantmaat',
                                       label='plantmaat',
                                       objectUri='https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg.plantmaat',
                                       definition='De hoogte van de plant in cm gemeten tussen een minimum en maximum waarde.',
                                       owner=self)

        self._plantverband = OTLAttribuut(field=KlVegetatiePlantverband,
                                          naam='plantverband',
                                          label='plantverband',
                                          objectUri='https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg.plantverband',
                                          definition='De wijze waarop de planten zijn geschikt.',
                                          owner=self)

    @property
    def aanplantingswijze(self):
        """Manier van aanplanten."""
        return self._aanplantingswijze.get_waarde()

    @aanplantingswijze.setter
    def aanplantingswijze(self, value):
        self._aanplantingswijze.set_waarde(value, owner=self._parent)

    @property
    def containermaat(self):
        """De grootte van de pot of container waarin de plant wordt geleverd. De P staat voor pot, de C voor container. Het getal geeft de grootte weer in centimeter."""
        return self._containermaat.get_waarde()

    @containermaat.setter
    def containermaat(self, value):
        self._containermaat.set_waarde(value, owner=self._parent)

    @property
    def plantdichtheid(self):
        """Aantal planten per vierkante meter."""
        return self._plantdichtheid.get_waarde()

    @plantdichtheid.setter
    def plantdichtheid(self, value):
        self._plantdichtheid.set_waarde(value, owner=self._parent)

    @property
    def plantmaat(self):
        """De hoogte van de plant in cm gemeten tussen een minimum en maximum waarde."""
        return self._plantmaat.get_waarde()

    @plantmaat.setter
    def plantmaat(self, value):
        self._plantmaat.set_waarde(value, owner=self._parent)

    @property
    def plantverband(self):
        """De wijze waarop de planten zijn geschikt."""
        return self._plantverband.get_waarde()

    @plantverband.setter
    def plantverband(self, value):
        self._plantverband.set_waarde(value, owner=self._parent)


# Generated with OTLComplexDatatypeCreator. To modify: extend, do not edit
class DtcSierbeplAanleg(ComplexField, AttributeInfo):
    """Complex datatype voor dat de aanleg van sierbeplanting beschrijft."""
    naam = 'DtcSierbeplAanleg'
    label = 'Sierbeplanting aanleg'
    objectUri = 'https://wegenenverkeer.data.vlaanderen.be/ns/onderdeel#DtcSierbeplAanleg'
    definition = 'Complex datatype voor dat de aanleg van sierbeplanting beschrijft.'
    waardeObject = DtcSierbeplAanlegWaarden

    def __str__(self):
        return ComplexField.__str__(self)
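The generated class above pairs each private attribute wrapper with a `@property` getter/setter that delegates to `get_waarde`/`set_waarde`. A minimal sketch of that pattern in isolation; the class names `Attribute` and `Plant` are invented for illustration and are not part of OTLMOW:

```python
# Hypothetical stand-in for OTLAttribuut: stores a value behind
# get_waarde/set_waarde, as the generated properties above expect.
class Attribute:
    def __init__(self):
        self._value = None

    def get_waarde(self):
        return self._value

    def set_waarde(self, value, owner=None):
        self._value = value


class Plant:
    def __init__(self):
        self._plantdichtheid = Attribute()

    @property
    def plantdichtheid(self):
        return self._plantdichtheid.get_waarde()

    @plantdichtheid.setter
    def plantdichtheid(self, value):
        self._plantdichtheid.set_waarde(value, owner=self)


plant = Plant()
plant.plantdichtheid = 4
print(plant.plantdichtheid)  # → 4
```

The indirection lets the wrapper object validate or track ownership on every assignment, while callers use plain attribute syntax.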

# File: scripts/migrate_game_servers.py (Eric-Le-Ge/online-town-public-release, Unlicense)
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore

ROOM_COLLECTION = "rooms"

cred = credentials.Certificate("../onlinetown-401f0-firebase-adminsdk-gab3z-1a7a54c2da.json")
firebase_admin.initialize_app(cred)
db = firestore.client()

SERVER_MAP = {
    "BLANK": "BLANK",
}

if __name__ == '__main__':
    docs = db.collection(ROOM_COLLECTION).stream()
    for doc in docs:
        # map = doc.to_dict()['map']
        # print("{} map: {}".format(doc.id, map))
        # if (map == 100 or map == 101):
        #     print("Changing {}".format(doc.id))
        #     db.collection(ROOM_COLLECTION).document(doc.id).update({'map': 140})
        newServer = ""
        try:
            server = doc.to_dict()['serverURL']
            try:
                newServer = SERVER_MAP[server]
                db.collection(ROOM_COLLECTION).document(doc.id).update({'serverURL': newServer})
            except KeyError:
                newServer = server
        except KeyError:
            server = "KeyError"
            newServer = ""
        print(server, newServer)
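The inner `try`/`except KeyError` around the `SERVER_MAP` lookup can be expressed with `dict.get`, which returns a fallback instead of raising. A small sketch with invented mapping entries:

```python
SERVER_MAP = {"old.example.com": "new.example.com"}  # invented entries

results = []
for server in ("old.example.com", "unknown.example.com"):
    # dict.get falls back to the original value when the key is missing,
    # replacing the inner try/except KeyError in the script above.
    results.append((server, SERVER_MAP.get(server, server)))

print(results)
```

With `get`, the update call would then be guarded by a simple `if new_server != server:` check rather than by exception flow.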

# File: examples/pqa_stacker.py (alex-ip/agdc, BSD-3-Clause)
#!/usr/bin/env python
#===============================================================================
# Copyright (c) 2014 Geoscience Australia
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither Geoscience Australia nor the names of its contributors may be
# used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#===============================================================================
'''
Created on 21/02/2013
@author: u76345
'''
import os
import sys
import logging
from datetime import datetime, time
from agdc import Stacker
from eotools.utils import log_multiline
# Set top level standard output
console_handler = logging.StreamHandler(sys.stdout)
console_handler.setLevel(logging.INFO)
console_formatter = logging.Formatter('%(message)s')
console_handler.setFormatter(console_formatter)
logger = logging.getLogger(__name__)
if not logger.level:
    logger.setLevel(logging.DEBUG)  # Default logging level for all modules
    logger.addHandler(console_handler)
class PQAStacker(Stacker):
    """ Subclass of Stacker
    Used to implement specific functionality to create stacks of derived datasets.
    """

    def derive_datasets(self, input_dataset_info, stack_output_info, tile_type_info):
        """ Overrides abstract function in stacker class. Called in Stacker.stack_derived() function.

        Arguments:
            input_dataset_info: Dict keyed by processing level (e.g. ORTHO, NBAR, PQA, DEM)
                containing all tile info which can be used within the function.
                A sample is shown below:

        input_dataset_info = {'NBAR': {'band_name': 'Visible Blue',
                                       'band_tag': 'B10',
                                       'end_datetime': datetime.datetime(2000, 2, 9, 23, 46, 36, 722217),
                                       'end_row': 77,
                                       'level_name': 'NBAR',
                                       'nodata_value': -999L,
                                       'path': 91,
                                       'satellite_tag': 'LS7',
                                       'sensor_name': 'ETM+',
                                       'start_datetime': datetime.datetime(2000, 2, 9, 23, 46, 12, 722217),
                                       'start_row': 77,
                                       'tile_layer': 1,
                                       'tile_pathname': '/g/data/v10/datacube/EPSG4326_1deg_0.00025pixel/LS7_ETM/150_-025/2000/LS7_ETM_NBAR_150_-025_2000-02-09T23-46-12.722217.tif'},
                              'ORTHO': {'band_name': 'Thermal Infrared (Low Gain)',
                                        'band_tag': 'B61',
                                        'end_datetime': datetime.datetime(2000, 2, 9, 23, 46, 36, 722217),
                                        'end_row': 77,
                                        'level_name': 'ORTHO',
                                        'nodata_value': 0L,
                                        'path': 91,
                                        'satellite_tag': 'LS7',
                                        'sensor_name': 'ETM+',
                                        'start_datetime': datetime.datetime(2000, 2, 9, 23, 46, 12, 722217),
                                        'start_row': 77,
                                        'tile_layer': 1,
                                        'tile_pathname': '/g/data/v10/datacube/EPSG4326_1deg_0.00025pixel/LS7_ETM/150_-025/2000/LS7_ETM_ORTHO_150_-025_2000-02-09T23-46-12.722217.tif'},
                              'PQA': {'band_name': 'Pixel Quality Assurance',
                                      'band_tag': 'PQA',
                                      'end_datetime': datetime.datetime(2000, 2, 9, 23, 46, 36, 722217),
                                      'end_row': 77,
                                      'level_name': 'PQA',
                                      'nodata_value': None,
                                      'path': 91,
                                      'satellite_tag': 'LS7',
                                      'sensor_name': 'ETM+',
                                      'start_datetime': datetime.datetime(2000, 2, 9, 23, 46, 12, 722217),
                                      'start_row': 77,
                                      'tile_layer': 1,
                                      'tile_pathname': '/g/data/v10/datacube/EPSG4326_1deg_0.00025pixel/LS7_ETM/150_-025/2000/LS7_ETM_PQA_150_-025_2000-02-09T23-46-12.722217.tif'}
                              }

        Arguments (Cont'd):
            stack_output_info: dict containing stack output information.
                Obtained from stacker object.
                A sample is shown below:

        stack_output_info = {'x_index': 144,
                             'y_index': -36,
                             'stack_output_dir': '/g/data/v10/tmp/ndvi',
                             'start_datetime': None, # Datetime object or None
                             'end_datetime': None, # Datetime object or None
                             'satellite': None, # String or None
                             'sensor': None} # String or None

        Arguments (cont'd):
            tile_type_info: dict containing tile type information.
                Obtained from stacker object (e.g: stacker.tile_type_info) after instantiation.
                A sample is shown below:

        tile_type_info = {'crs': 'EPSG:4326',
                          'file_extension': '.tif',
                          'file_format': 'GTiff',
                          'format_options': 'COMPRESS=LZW,BIGTIFF=YES',
                          'tile_directory': 'EPSG4326_1deg_0.00025pixel',
                          'tile_type_id': 1L,
                          'tile_type_name': 'Unprojected WGS84 1-degree at 4000 pixels/degree',
                          'unit': 'degree',
                          'x_origin': 0.0,
                          'x_pixel_size': Decimal('0.00025000000000000000'),
                          'x_pixels': 4000L,
                          'x_size': 1.0,
                          'y_origin': 0.0,
                          'y_pixel_size': Decimal('0.00025000000000000000'),
                          'y_pixels': 4000L,
                          'y_size': 1.0}

        Function must create one or more GDAL-supported output datasets. Useful functions in the
        Stacker class include Stacker.get_pqa_mask(), but it is left to the coder to produce exactly
        what is required for a single slice of the temporal stack of derived quantities.

        Returns:
            output_dataset_info: Dict keyed by stack filename
                containing filenames of GDAL-supported output datasets created by this function.
                Note that the key(s) will be used as the output filename for the VRT temporal stack
                and each dataset created must contain only a single band.
        """
        # Replace this with code to do fancy stuff
        # Use the code for the Stacker.derive_datasets() as a template
        return Stacker.derive_datasets(self, input_dataset_info, stack_output_info, tile_type_info)
if __name__ == '__main__':
    def date2datetime(input_date, time_offset=time.min):
        if not input_date:
            return None
        return datetime.combine(input_date, time_offset)

    # Stacker class takes care of command line parameters
    pqa_stacker = PQAStacker()

    if pqa_stacker.debug:
        console_handler.setLevel(logging.DEBUG)

    # Check for required command line parameters
    assert pqa_stacker.x_index, 'Tile X-index not specified (-x or --x_index)'
    assert pqa_stacker.y_index, 'Tile Y-index not specified (-y or --y_index)'
    assert pqa_stacker.output_dir, 'Output directory not specified (-o or --output)'

    stack_info_dict = pqa_stacker.stack_derived(x_index=pqa_stacker.x_index,
                                                y_index=pqa_stacker.y_index,
                                                stack_output_dir=pqa_stacker.output_dir,
                                                start_datetime=date2datetime(pqa_stacker.start_date, time.min),
                                                end_datetime=date2datetime(pqa_stacker.end_date, time.max),
                                                satellite=pqa_stacker.satellite,
                                                sensor=pqa_stacker.sensor)

    log_multiline(logger.debug, stack_info_dict, 'stack_info_dict', '\t')

    logger.info('Finished creating %d temporal stack files in %s.', len(stack_info_dict), pqa_stacker.output_dir)
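`date2datetime` pads a bare date out to a full datetime via `datetime.combine`, using `time.min` for the start of a range and `time.max` for the end, and passes `None` through unchanged. A standalone check (the sample date is arbitrary):

```python
from datetime import date, datetime, time

def date2datetime(input_date, time_offset=time.min):
    if not input_date:
        return None
    return datetime.combine(input_date, time_offset)

start = date2datetime(date(2013, 2, 21), time.min)
end = date2datetime(date(2013, 2, 21), time.max)
print(start)                # → 2013-02-21 00:00:00
print(end)                  # → 2013-02-21 23:59:59.999999
print(date2datetime(None))  # → None
```

Using `time.max` for the end bound makes the range inclusive of the whole final day, which is why `stack_derived` is called with `time.min`/`time.max` for start and end respectively.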

# File: 1_code/speed_challenge/difficulty_utils.py (jaimiles23/Multiplication_Medley, MIT)
"""/**
 * @author [Jai Miles]
 * @email [jaimiles23@gmail.com]
 * @create date 2020-05-21 11:55:58
 * @modify date 2020-06-16 23:33:58
 * @desc [
    SC_Difficulty class with methods to set speed challenge difficulty:
        - Ask for difficulties
        - Acknowledge difficulty message.
 ]
 */
"""

##########
# Imports
##########

from statistics import mean
import random

from logs import log_func_name, logger
from aux_utils.create_tuple_message_clauses import get_linear_nlg
from pause.pauser import Pauser
import speed_challenge.data


##########
# Difficulty class
##########

class SC_Difficulty(object):

    ##########
    # Ask for Difficulty
    ##########

    @staticmethod
    @log_func_name
    def get_q_sc_difficulty(player_object, ) -> str:
        """Master method to return what difficulty prompt."""
        sc_plays = player_object.sc_plays
        speech_list = []

        if sc_plays < 2:
            ms_difficulty_list = SC_Difficulty.get_ms_difficulty_list()
            ms_get_help = SC_Difficulty.h_get_ms_can_ask_help()
            speech_list += Pauser.make_ms_pause_level_list(
                ms_difficulty_list, 2.1, ms_get_help, 1.75)

        q_what_difficulty = SC_Difficulty.h_get_ms_what_difficulty()
        speech_list.append(q_what_difficulty)
        return ' '.join(speech_list)

    @staticmethod
    @log_func_name
    def h_get_ms_what_difficulty() -> str:
        """Helper method returns prompt asking what difficulty."""
        return get_linear_nlg(
            speed_challenge.data.MMT_WHAT_DIFFICULTY)

    @staticmethod
    @log_func_name
    def get_ms_difficulty_list() -> str:
        """Returns message of list of difficulties user can select."""
        return get_linear_nlg(
            speed_challenge.data.MMT_CAN_USE_DIFF)

    @staticmethod
    @log_func_name
    def h_get_ms_can_ask_help() -> str:
        """Tells the user to ask for help to hear about the difficulties."""
        return speed_challenge.data.MS_GET_DIFF_HELP

    @staticmethod
    @log_func_name
    def get_ms_not_register() -> str:
        """Returns message that did not register the user's input."""
        return get_linear_nlg(
            speed_challenge.data.MTT_TRY_AGAIN)

    ##########
    # Acknowledge Difficulty
    ##########

    @staticmethod
    @log_func_name
    def get_ms_using_difficulty(difficulty: str) -> str:
        """Returns message that will use difficulty."""
        ms_use = random.choice(
            speed_challenge.data.MT_USE)
        ms_difficulty = speed_challenge.data.MS_DIFFICULTY_FORMAT.format(
            difficulty)
        return ' '.join([ms_use, ms_difficulty])
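`get_ms_using_difficulty` builds its spoken response by picking a random opener from a message tuple and joining it with a formatted difficulty clause. A self-contained sketch of that assembly; the tuple and format-string contents here are invented stand-ins for `speed_challenge.data`:

```python
import random

random.seed(0)  # deterministic pick for the demo

MT_USE = ("Sure, using", "Okay, setting")   # invented opener tuple
MS_DIFFICULTY_FORMAT = "{} difficulty."     # invented format string

ms_use = random.choice(MT_USE)
ms_difficulty = MS_DIFFICULTY_FORMAT.format("hard")
speech = ' '.join([ms_use, ms_difficulty])
print(speech)
```

Randomising the opener while keeping the informative clause fixed is a cheap way to make repeated voice responses sound less robotic.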

# File: examples/example07_branch_dag_parameterization.py (entailor/pytailor, BSD-3-Clause)
"""
pyTailor Example 7

This example introduces the following NEW concepts:
    - Use BranchTask to "branch out" a DAG
    - For BranchTask definitions:
        - Use *branch_files* to specify which files to use for branching.
          *branch_files* is given as one or more file tags.
"""

from pytailor import (
    PythonTask,
    BranchTask,
    DAG,
    Workflow,
    Project,
    FileSet,
    Files,
    Outputs,
)

### workflow definition ###

files = Files()
outputs = Outputs()

with DAG(name="dag") as dag:
    with BranchTask(
        name="branch",
        branch_data=[files.testfiles],
        branch_files=[files.testfiles],
    ):
        with DAG(name="sub-dag") as sub_dag:
            t1 = PythonTask(
                function="glob.glob",
                name="task 2",
                args=["**/*.txt"],
                kwargs={"recursive": True},
                download=files.testfiles,
                output_to=outputs.glob_res,
            )
            PythonTask(
                function="builtins.print",
                name="task 3",
                args=[files.testfiles, outputs.glob_res],
                parents=t1,
            )

### workflow run ###

# open a project
prj = Project.from_name("Test")

# create a fileset and upload files
fileset = FileSet(prj)
fileset.upload(testfiles=["testfiles/testfile_01.txt", "testfiles/testfile_02.txt"])

# create a workflow:
wf = Workflow(project=prj, dag=dag, name="branch workflow 2", fileset=fileset)

# run the workflow
wf.run()

# check the status of the workflow
print("The workflow finished with state:")
print(wf.state)
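Task 2 in the DAG above runs `glob.glob("**/*.txt", recursive=True)` against each downloaded branch. What that call actually matches can be checked locally; the directory and file names below are invented for the demo:

```python
import glob
import os
import tempfile

# Sandbox directory with one top-level and one nested .txt file.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'sub'))
open(os.path.join(root, 'a.txt'), 'w').close()
open(os.path.join(root, 'sub', 'b.txt'), 'w').close()

os.chdir(root)
# With recursive=True, '**' matches zero or more directory levels,
# so both a.txt and sub/b.txt are found.
matches = sorted(glob.glob('**/*.txt', recursive=True))
print(matches)
```

Without `recursive=True`, `'**'` behaves like a single `'*'` and the nested file would be missed.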

# File: fstelemetry/telemetry.py (jbencina/fstelemetry, MIT)
from SimConnect import AircraftRequests, SimConnect
import csv
import os
import time
class Telemetry():
    def __init__(self, keys):
        self.requests = self._make_connection()
        self.keys = keys

    def _make_connection(self):
        sm = SimConnect()
        return AircraftRequests(sm, _time=0)

    def get_data(self):
        d = {}
        d['time'] = time.time()
        for k in self.keys:
            d[k] = self.requests.get(k)
        return d

    def write_log(self, data, path):
        fieldnames = [k for k in data]
        exists = os.path.exists(path)
        with open(path, 'a', newline='') as csvfile:
            writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
            if not exists:
                writer.writeheader()
            writer.writerow(data)

    def listen(self, path, interval=1.0):
        print(f'Listening for FS events with {interval} second delay')
        while True:
            data = self.get_data()
            self.write_log(data, path)
            print(f'Logged event at {data["time"]}')
time.sleep(interval) | 26.093023 | 70 | 0.567736 | 134 | 1,122 | 4.656716 | 0.432836 | 0.038462 | 0.057692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003968 | 0.326203 | 1,122 | 43 | 71 | 26.093023 | 0.821429 | 0 | 0 | 0 | 0 | 0 | 0.077471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15625 | false | 0 | 0.125 | 0 | 0.375 | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
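`Telemetry.write_log` appends rows and emits the CSV header only when the file does not yet exist. That pattern can be exercised in isolation without SimConnect (the path and field names below are illustrative):

```python
import csv
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "log.csv")

def write_log(data, path):
    # Same idea as Telemetry.write_log: append a row, writing the
    # header only on first creation of the file.
    fieldnames = list(data)
    exists = os.path.exists(path)
    with open(path, "a", newline="") as csvfile:
        writer = csv.DictWriter(csvfile, fieldnames=fieldnames)
        if not exists:
            writer.writeheader()
        writer.writerow(data)

write_log({"time": 1.0, "ALTITUDE": 1200}, path)
write_log({"time": 2.0, "ALTITUDE": 1250}, path)

with open(path) as f:
    lines = f.read().splitlines()
print(lines[0])  # header appears exactly once, on the first line
```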
8e38670c4af3cddf39a7934e3dd858bcc1966769 | 1,840 | py | Python | optumpublicdata/gov_data.py | cpaulisi/optumpublicdata | 8f62c3d856e02ee994f816a245329a08905abb85 | [
"MIT"
] | null | null | null | optumpublicdata/gov_data.py | cpaulisi/optumpublicdata | 8f62c3d856e02ee994f816a245329a08905abb85 | [
"MIT"
] | null | null | null | optumpublicdata/gov_data.py | cpaulisi/optumpublicdata | 8f62c3d856e02ee994f816a245329a08905abb85 | [
"MIT"
] | null | null | null | import pandas as pd
import requests
class hhs_extract:
    # initialize with api_key
    def __init__(self, bearer_token):
        self.api_key = bearer_token

    # extract HHS data
    def extract_data(self) -> pd.DataFrame:
        # read data from Socrata HHS API for county level hospitalizations at CMS registered hospitals
        hhs_return_size = 1
        hhs_total_size = 0
        hhs_df_list = list()
        while (hhs_return_size > 0):
            payload = {'$$app_token': self.api_key, '$limit':50000, '$offset':hhs_total_size}
            hhs_res = requests.get("https://healthdata.gov/resource/anag-cw7u.json", params=payload)
            temp_results_df = pd.DataFrame.from_records(hhs_res.json())
            hhs_return_size = len(temp_results_df)
            hhs_total_size += hhs_return_size
            hhs_df_list.append(temp_results_df)
        hhs_results_df = pd.concat(hhs_df_list)
        return hhs_results_df


class cdc_extract:
    # initialize with api_key
    def __init__(self, bearer_token):
        self.api_key = bearer_token

    # extract cdc data
    def extract_data(self) -> pd.DataFrame:
        # read data from Socrata CDC API for county level vaccinations across all dates
        vax_return_size = 1
        vax_total_size = 0
        vax_df_list = list()
        while (vax_return_size > 0):
            payload = {'$$app_token': self.api_key, '$limit':50000, '$offset':vax_total_size}
            vax_res = requests.get("https://data.cdc.gov/resource/8xkx-amqh.json", params=payload)
            temp_results_df = pd.DataFrame.from_records(vax_res.json())
            vax_return_size = len(temp_results_df)
            vax_total_size += vax_return_size
            vax_df_list.append(temp_results_df)
        vax_results_df = pd.concat(vax_df_list)
return vax_results_df | 40 | 102 | 0.658152 | 254 | 1,840 | 4.413386 | 0.26378 | 0.080285 | 0.069581 | 0.053524 | 0.50669 | 0.50669 | 0.4157 | 0.4157 | 0.4157 | 0.4157 | 0 | 0.01312 | 0.254348 | 1,840 | 46 | 103 | 40 | 0.803936 | 0.136957 | 0 | 0.176471 | 0 | 0 | 0.087231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.058824 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
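Both extractors use the same `$offset` pagination loop: keep requesting pages until one comes back empty. A sketch of just that loop against a stubbed fetcher (the record shape and page size here are fabricated, standing in for `requests.get` against the Socrata API):

```python
# Fabricated dataset standing in for the remote resource.
RECORDS = [{"fips": str(i)} for i in range(7)]

def fetch_page(offset, limit=3):
    # Stand-in for requests.get(..., params={'$offset': offset, '$limit': limit}).
    return RECORDS[offset:offset + limit]

def extract_all():
    total, pages, size = 0, [], 1
    # Same loop shape as extract_data: advance the offset by the number
    # of records returned, and stop on the first empty page.
    while size > 0:
        page = fetch_page(total)
        size = len(page)
        total += size
        pages.append(page)
    return [r for p in pages for r in p]

rows = extract_all()
print(len(rows))  # 7 — all pages collected, no duplicates
```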
8e395970b87c621a1fcc58e8559a6aaf781e24f8 | 263 | py | Python | Code/Python/port_scan.py | rf-peixoto/Studies | 89178b6caa29cfa24d54058f3c7fc3424c541760 | [
"CC0-1.0"
] | 1 | 2021-02-10T12:53:37.000Z | 2021-02-10T12:53:37.000Z | Code/Python/port_scan.py | rf-peixoto/Studies | 89178b6caa29cfa24d54058f3c7fc3424c541760 | [
"CC0-1.0"
] | null | null | null | Code/Python/port_scan.py | rf-peixoto/Studies | 89178b6caa29cfa24d54058f3c7fc3424c541760 | [
"CC0-1.0"
] | null | null | null | import socket
import sys
for i in range(1, 65535):
    skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        # connect() returns None and raises on failure, so comparing it
        # to 0 could never succeed; connect_ex() returns an error code
        # (0 when the port is open), which is what this check needs.
        if skt.connect_ex((sys.argv[1], i)) == 0:
            print("Port: {0} OPEN".format(i))
        skt.close()
    except:
        continue
| 21.916667 | 59 | 0.574144 | 38 | 263 | 3.921053 | 0.684211 | 0.161074 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0.281369 | 263 | 11 | 60 | 23.909091 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0.053232 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
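The scanner's `== 0` check only works with `connect_ex()`, which reports an error code instead of raising. A minimal demonstration against a local listener, so there is a known-open port to probe (the loopback address and OS-assigned port are just for the demo):

```python
import socket

# Open a listening socket on an OS-assigned port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]

probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.settimeout(1.0)
# connect_ex() returns 0 on success rather than raising an exception,
# matching the comparison the port scanner performs.
result = probe.connect_ex(("127.0.0.1", open_port))
probe.close()
listener.close()
print(result)  # 0 for an open port
```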
8e3a44cfa7146ec2b5aaef9188e372242a439598 | 1,933 | py | Python | h2o-py/tests/testdir_munging/pyunit_unique.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 6,098 | 2015-05-22T02:46:12.000Z | 2022-03-31T16:54:51.000Z | h2o-py/tests/testdir_munging/pyunit_unique.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 2,517 | 2015-05-23T02:10:54.000Z | 2022-03-30T17:03:39.000Z | h2o-py/tests/testdir_munging/pyunit_unique.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 2,199 | 2015-05-22T04:09:55.000Z | 2022-03-28T22:20:45.000Z | import sys
sys.path.insert(1,"../../")
import h2o
from tests import pyunit_utils
def pyunit_unique():
    iris = h2o.import_file(pyunit_utils.locate("smalldata/iris/iris.csv"))
    uniques = iris[4].unique()
    rows, cols = uniques.dim
    assert rows == 3 and cols == 1, "Expected 3 rows and 1 column, but got {0} rows and {1} column".format(rows, cols)
    assert "Iris-setosa" in uniques[0], "Expected Iris-setosa to be in the set of unique species, but it wasn't"
    assert "Iris-virginica" in uniques[0], "Expected Iris-virginica to be in the set of unique species, but it wasn't"
    assert "Iris-versicolor" in uniques[0], "Expected Iris-versicolor to be in the set of unique species, but it wasn't"

    fr = h2o.create_frame(rows=5, cols=1, time_fraction=1)
    assert fr.type(0) == "time"
    uf = fr.unique()
    assert uf.type(0) == "time"
    uf.refresh()
    assert uf.type(0) == "time"

    prostate = h2o.import_file(pyunit_utils.locate("smalldata/parser/csv2orc/prostate_NA.csv"))
    prostate["GLEASON"] = prostate["GLEASON"].asfactor()
    uniques = prostate["GLEASON"].unique(include_nas=True)
    uniques_without_nas = prostate["GLEASON"].unique()
    prostate_pandas = prostate.as_data_frame()
    uniques_pandas = prostate_pandas["GLEASON"].unique()
    assert uniques.nrows == len(uniques_pandas)
    assert uniques_without_nas.nrows == len(uniques_pandas) - 1

    # make sure domains are recalculated with each temp assign
    df_example = h2o.H2OFrame({'time': ['M','M','M','D','D','M','M','D'],
                               'amount': [1,4,5,0,0,1,3,0]})
    df_example['amount'] = df_example['amount'].asfactor()
    filtered = df_example[df_example['time']=='D', 'amount']
    uniques = filtered['amount'].unique()
    assert len(uniques) == 1
    assert uniques.as_data_frame().iat[0,0] == 0


if __name__ == "__main__":
    pyunit_utils.standalone_test(pyunit_unique)
else:
    pyunit_unique()
| 39.44898 | 120 | 0.669943 | 284 | 1,933 | 4.419014 | 0.320423 | 0.035857 | 0.023904 | 0.043028 | 0.246215 | 0.166534 | 0.166534 | 0.104382 | 0.104382 | 0.104382 | 0 | 0.023285 | 0.177962 | 1,933 | 48 | 121 | 40.270833 | 0.76652 | 0.028971 | 0 | 0.054054 | 0 | 0 | 0.2608 | 0.0336 | 0 | 0 | 0 | 0 | 0.297297 | 1 | 0.027027 | false | 0 | 0.135135 | 0 | 0.162162 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
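The `include_nas` distinction the test checks — that `unique(include_nas=True)` yields exactly one more value than the default — can be illustrated without h2o, using plain Python with `None` standing in for a missing value (this is a sketch, not h2o's actual implementation):

```python
# Fabricated column with two missing entries.
values = [7, 6, 7, None, 6, 8, None]

def unique(seq, include_nas=False):
    # Collect first occurrences; skip missing values unless asked to
    # include them, mirroring the include_nas flag in the h2o test.
    seen = []
    for v in seq:
        if v is None and not include_nas:
            continue
        if v not in seen:
            seen.append(v)
    return seen

with_nas = unique(values, include_nas=True)
without_nas = unique(values)
print(len(with_nas) - len(without_nas))  # 1 — the NA bucket
```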
8e40586624bfd81a0d9a6a3e8dac58538818efa1 | 310 | py | Python | Ex6/q3/server.py | sreyom31/app-srm | 5ca1ac1a3681e160564b05a7d10db2d1b94e5fd1 | [
"MIT"
] | null | null | null | Ex6/q3/server.py | sreyom31/app-srm | 5ca1ac1a3681e160564b05a7d10db2d1b94e5fd1 | [
"MIT"
] | null | null | null | Ex6/q3/server.py | sreyom31/app-srm | 5ca1ac1a3681e160564b05a7d10db2d1b94e5fd1 | [
"MIT"
] | null | null | null | import socket
s = socket.socket()
host = socket.gethostname()
port = 12346
s.bind((host, port))
s.listen(5)
while True:
c, addr = s.accept()
print("connection from", addr)
rec = c.recv(1024).decode("utf-8")
print(rec)
if rec == 'ping':
c.sendto('pong'.encode(), addr)
c.close() | 20.666667 | 39 | 0.6 | 46 | 310 | 4.043478 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045082 | 0.212903 | 310 | 15 | 40 | 20.666667 | 0.717213 | 0 | 0 | 0 | 0 | 0 | 0.090032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e41732c0c1c201ebe223546073ee99f68d2d8f4 | 1,085 | py | Python | spider/luogu.py | MeiK2333/recent_contests | d3c7d2bfb83651b556b501f6000117493503dc7c | [
"MIT"
] | 3 | 2019-08-01T13:18:12.000Z | 2021-04-21T09:10:55.000Z | spider/luogu.py | MeiK-h/recent_contests | 5d4890b3fb9bf65bb31854a8be0e6049ebbf0a06 | [
"MIT"
] | 2 | 2021-06-04T04:53:04.000Z | 2021-06-15T06:40:09.000Z | spider/luogu.py | MeiK-h/recent_contests | 5d4890b3fb9bf65bb31854a8be0e6049ebbf0a06 | [
"MIT"
] | 1 | 2021-06-10T13:33:36.000Z | 2021-06-10T13:33:36.000Z | from datetime import datetime, timezone
import requests
from schemas import Contest
from spider.utils import update_platform
def main():
    headers = {
        "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 "
                      "Safari/537.36 "
    }
    resp = requests.get(
        "https://www.luogu.org/contest/list?page=1&_contentOnly=1", headers=headers
    )
    contests = resp.json()["currentData"]["contests"]["result"]
    data = []
    tz = timezone.utc
    for contest in contests:
        link = f'https://www.luogu.org/contest/{contest["id"]}'
        start_time = datetime.fromtimestamp(contest["startTime"], tz=tz)
        end_time = datetime.fromtimestamp(contest["endTime"], tz=tz)
        data.append(
            Contest(
                name=contest["name"],
                link=link,
                start_time=start_time,
                end_time=end_time,
                contest_id=link.split("/")[-1],
            )
        )
    update_platform("洛谷", data)


if __name__ == "__main__":
    main()
| 25.232558 | 116 | 0.587097 | 126 | 1,085 | 4.904762 | 0.531746 | 0.043689 | 0.042071 | 0.05178 | 0.074434 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03944 | 0.275576 | 1,085 | 42 | 117 | 25.833333 | 0.746819 | 0 | 0 | 0 | 0 | 0.032258 | 0.251613 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032258 | false | 0 | 0.129032 | 0 | 0.16129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
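The spider converts Unix timestamps from the API into timezone-aware datetimes by passing `tz=timezone.utc` to `fromtimestamp`. A standalone sketch of that conversion (the timestamp value below is an arbitrary example):

```python
from datetime import datetime, timezone

ts = 1565000000  # example contest start time, in seconds since the epoch

# Passing tz makes the result timezone-aware and independent of the
# machine's local timezone, unlike a bare datetime.fromtimestamp(ts).
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())
```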
8e4174c582d0f7afb25fdca0511c1255d5ea364c | 1,054 | py | Python | img_colorization.py | MuhammedAshraf2020/ImageColorization | e18949e0bd577e6ef85b3c376a28f7d51206fb1f | [
"MIT"
] | 2 | 2021-01-21T22:23:40.000Z | 2021-01-29T23:19:14.000Z | img_colorization.py | MuhammedAshraf2020/ImageColorization | e18949e0bd577e6ef85b3c376a28f7d51206fb1f | [
"MIT"
] | null | null | null | img_colorization.py | MuhammedAshraf2020/ImageColorization | e18949e0bd577e6ef85b3c376a28f7d51206fb1f | [
"MIT"
] | null | null | null | from processing import *
from decodingModel import *
# Using Transfer learning
feature_extract_model = VggModel()
#Decoding model
colorize = model()
#prepare data in hard disk
PrepareData(datapath = "/content/data" , save_file = "/content/processed/" ,
            target_size = (224 , 224) , batch_size = 32 , feature_extract_model = feature_extract_model)

training_dir = "/content/processed"
num_train_samples = 1000
batch_size = 32
steps_per_epoch = np.floor(num_train_samples/batch_size)
epochs = 200

for i in range(epochs):
    generator = data_generator_baseline(training_dir, num_train_samples, batch_size)
    fit_history = colorize.fit_generator(generator, epochs=1, steps_per_epoch=steps_per_epoch, verbose=1)
    if i % 10 == 0:
        colorize.save('model_merge_' + str(i) + '.h5')

X = test_images(path = "/content/oldes" , shape = (224 , 224) , batch_size = 2 ,
                feature_extract_model = feature_extract_model , model = colorize )
show_images(X , width = 20 , hight = 20 , columns = 2 , rows = 1)
| 34 | 104 | 0.701139 | 140 | 1,054 | 5 | 0.5 | 0.1 | 0.135714 | 0.042857 | 0.177143 | 0.108571 | 0 | 0 | 0 | 0 | 0 | 0.042403 | 0.194497 | 1,054 | 30 | 105 | 35.133333 | 0.782097 | 0.058824 | 0 | 0 | 0 | 0 | 0.082636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
8e42b718ee16f23a8ba27a4d6ff7efea8f8ce315 | 23,434 | py | Python | cloudify_cli/env.py | TS-at-WS/cloudify-cli | 598b54ecd67495a76678177f910cdc5eac6128d0 | [
"Apache-2.0"
] | null | null | null | cloudify_cli/env.py | TS-at-WS/cloudify-cli | 598b54ecd67495a76678177f910cdc5eac6128d0 | [
"Apache-2.0"
] | 10 | 2020-08-02T07:45:42.000Z | 2021-06-11T01:03:45.000Z | cloudify_cli/env.py | TS-at-WS/cloudify-cli | 598b54ecd67495a76678177f910cdc5eac6128d0 | [
"Apache-2.0"
] | null | null | null | ########
# Copyright (c) 2014 GigaSpaces Technologies Ltd. All rights reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
############
import os
import json
import types
import shutil
import pkgutil
import getpass
import tempfile
import itertools
from base64 import urlsafe_b64encode
import yaml
import requests
from cloudify_rest_client import CloudifyClient
from cloudify_rest_client.client import HTTPClient
from cloudify.cluster_status import CloudifyNodeType
from cloudify_rest_client.utils import is_kerberos_env
from cloudify_rest_client.exceptions import CloudifyClientError
from . import constants
from .exceptions import CloudifyCliError
_ENV_NAME = 'manager'
DEFAULT_LOG_FILE = os.path.expanduser(
    '{0}/cloudify-{1}/cloudify-cli.log'.format(
        tempfile.gettempdir(), getpass.getuser()))

CLOUDIFY_WORKDIR = os.path.join(
    os.environ.get('CFY_WORKDIR', os.path.expanduser('~')),
    constants.CLOUDIFY_BASE_DIRECTORY_NAME)
PROFILES_DIR = os.path.join(CLOUDIFY_WORKDIR, 'profiles')
ACTIVE_PROFILE = os.path.join(CLOUDIFY_WORKDIR, 'active.profile')
CLUSTER_RETRY_INTERVAL = 5
def delete_profile(profile_name):
    if is_profile_exists(profile_name):
        profile_dir = get_profile_dir(profile_name)
        shutil.rmtree(profile_dir)
    else:
        raise CloudifyCliError(
            'Profile {0} does not exist'.format(profile_name))


def is_profile_exists(profile_name):
    try:
        return os.path.isfile(get_context_path(profile_name))
    except CloudifyCliError:
        return False


def assert_profile_exists(profile_name):
    if not is_profile_exists(profile_name):
        raise CloudifyCliError(
            'Profile {0} does not exist. You can run `cfy init {0}` to '
            'create the profile.'.format(profile_name))


def set_active_profile(profile_name):
    global profile
    with open(ACTIVE_PROFILE, 'w+') as active_profile:
        active_profile.write(profile_name)
    profile = get_profile_context(profile_name, suppress_error=True)


def get_active_profile():
    if os.path.isfile(ACTIVE_PROFILE):
        with open(ACTIVE_PROFILE) as active_profile:
            return active_profile.read().strip()
    else:
        # We return None explicitly as no profile is active.
        return None


def set_target_manager(manager_host):
    global target_manager
    target_manager = manager_host


def get_target_manager():
    return target_manager


def get_profile_names():
    # TODO: This is too.. ambiguous. We should change it so there are
    # no exclusions.
    excluded = ['local']
    profile_names = [item for item in os.listdir(PROFILES_DIR)
                     if item not in excluded and not item.startswith('.')]
    return profile_names


def assert_manager_active():
    if not is_manager_active():
        raise CloudifyCliError(
            'This command is only available when using a manager. '
            'Please use the `cfy profiles use` command to connect '
            'to a Cloudify Manager.')


def assert_local_active():
    if is_manager_active():
        raise CloudifyCliError(
            'This command is not available when using a manager. '
            'You can run `cfy profiles use local` to stop using a manager.')


def assert_credentials_set():
    error_msg = 'Manager {0} must be set in order to use a manager.\n' \
                'You can set it in the profile by running ' \
                '`cfy profiles set {1}`, or you can set the `CLOUDIFY_{2}` ' \
                'environment variable.'
    if not get_kerberos_env():
        if not get_username():
            raise CloudifyCliError(
                error_msg.format('Username', '--manager-username', 'USERNAME')
            )
        if not get_password():
            raise CloudifyCliError(
                error_msg.format('Password', '--manager-password', 'PASSWORD')
            )
    if not get_tenant_name():
        raise CloudifyCliError(
            error_msg.format('Tenant', '--manager-tenant', 'TENANT')
        )


def is_manager_active():
    active_profile = get_active_profile()
    if not active_profile:
        return False
    if active_profile == 'local':
        return False
    p = get_profile_context(active_profile, suppress_error=True)
    if not (p and p.manager_ip):
        return False
    return True


def get_profile_context(profile_name=None, suppress_error=False):
    # empty profile with nothing but default values
    default = ProfileContext()
    profile_name = profile_name or get_active_profile()
    if profile_name == 'local':
        if suppress_error:
            return default
        raise CloudifyCliError('Local profile does not have context')
    try:
        path = get_context_path(profile_name)
        with open(path) as f:
            context = yaml.load(f.read())
        # fill the default with values from existing profile (the default is
        # used as base because some of the attributes may not be in the
        # existing profile file)
        for key, value in context.__dict__.items():
            setattr(default, key, value)
    except CloudifyCliError:
        if not suppress_error:
            raise
    return default


def config_initialized_with_logging():
    """
    This is for the Windows agent: plugin URLs from
    import_resolver are written to config.yaml during installation, so we can
    have a scenario where config exists but has no logger paths defined.
    """
    has_logging = False
    if os.path.isfile(os.path.join(CLOUDIFY_WORKDIR, 'config.yaml')):
        with open(os.path.join(CLOUDIFY_WORKDIR, 'config.yaml'), 'r') as f:
            has_logging = ('logging' in f.read())
    return has_logging


def is_initialized(profile_name=None):
    """
    Check if a profile or an environment is initialized.

    If profile_name is provided, it will check if the profile
    is initialized. If not, it will just check that workenv is.
    """
    if profile_name:
        return get_profile_dir(profile_name) is not None
    else:
        return config_initialized_with_logging()


def get_context_path(profile_name, suppress_error=False):
    base_dir = get_profile_dir(profile_name, suppress_error)
    if not base_dir:
        return
    return os.path.join(
        base_dir,
        constants.CLOUDIFY_PROFILE_CONTEXT_FILE_NAME
    )


def get_profile_dir(profile_name=None, suppress_error=False):
    active_profile = profile_name or get_active_profile()
    if active_profile and os.path.isdir(
            os.path.join(PROFILES_DIR, active_profile)):
        return os.path.join(PROFILES_DIR, active_profile)
    elif suppress_error:
        return
    else:
        raise CloudifyCliError('Profile directory does not exist')


def raise_uninitialized():
    error = CloudifyCliError(
        'Cloudify environment is not initialized')
    error.possible_solutions = [
        "Run 'cfy init'"
    ]
    raise error


def is_cluster(client_profile=None):
    if client_profile is None:
        client_profile = profile
    return (not isinstance(client_profile.cluster, list) and
            client_profile.cluster.get(CloudifyNodeType.MANAGER))
def get_rest_client(client_profile=None,
                    rest_host=None,
                    rest_port=None,
                    rest_protocol=None,
                    rest_cert=None,
                    username=None,
                    password=None,
                    tenant_name=None,
                    trust_all=False,
                    cluster=None,
                    kerberos_env=None):
    if client_profile is None:
        client_profile = profile
    rest_host = rest_host or client_profile.manager_ip
    rest_port = rest_port or client_profile.rest_port
    rest_protocol = rest_protocol or client_profile.rest_protocol
    rest_cert = rest_cert or get_ssl_cert(client_profile)
    username = username or get_username(client_profile)
    password = password or get_password(client_profile)
    tenant_name = tenant_name or get_tenant_name(client_profile)
    trust_all = trust_all or get_ssl_trust_all()
    headers = get_auth_header(username, password)
    headers[constants.CLOUDIFY_TENANT_HEADER] = tenant_name
    cluster = cluster or is_cluster(client_profile)
    kerberos_env = kerberos_env \
        if kerberos_env is not None else client_profile.kerberos_env

    if kerberos_env is False \
            or (kerberos_env is None and not is_kerberos_env()):
        if not username:
            raise CloudifyCliError('Command failed: Missing Username')
        if not password:
            raise CloudifyCliError('Command failed: Missing password')

    if cluster:
        client = CloudifyClusterClient(host=rest_host,
                                       port=rest_port,
                                       protocol=rest_protocol,
                                       headers=headers,
                                       cert=rest_cert,
                                       trust_all=trust_all,
                                       profile=client_profile,
                                       kerberos_env=kerberos_env)
    else:
        client = CloudifyClient(host=rest_host,
                                port=rest_port,
                                protocol=rest_protocol,
                                headers=headers,
                                cert=rest_cert,
                                trust_all=trust_all,
                                kerberos_env=kerberos_env)
    return client


def build_manager_host_string(ssh_user='', ip=''):
    ip = ip or profile.manager_ip
    return build_host_string(ip, ssh_user)


def build_host_string(ip, ssh_user=''):
    ssh_user = ssh_user or profile.ssh_user
    if not ssh_user:
        raise CloudifyCliError('`ssh_user` is not set in the current '
                               'profile. Please run '
                               '`cfy profiles set --ssh-user <ssh-user>`.')
    return '{0}@{1}'.format(ssh_user, ip)


def get_default_rest_cert_local_path():
    base_dir = get_profile_dir(suppress_error=True) or CLOUDIFY_WORKDIR
    return os.path.join(base_dir, constants.PUBLIC_REST_CERT)


def get_username(from_profile=None):
    if from_profile is None:
        from_profile = profile
    username = os.environ.get(constants.CLOUDIFY_USERNAME_ENV)
    if username and from_profile.manager_username:
        raise CloudifyCliError('Manager Username is set in profile *and* in '
                               'the `CLOUDIFY_USERNAME` env variable. Resolve '
                               'the conflict before continuing.\n'
                               'Either unset the env variable, or run '
                               '`cfy profiles unset --manager-username`')
    return username or from_profile.manager_username


def get_password(from_profile=None):
    if from_profile is None:
        from_profile = profile
    password = os.environ.get(constants.CLOUDIFY_PASSWORD_ENV)
    if password and from_profile.manager_password:
        raise CloudifyCliError('Manager Password is set in profile *and* in '
                               'the `CLOUDIFY_PASSWORD` env variable. Resolve '
                               'the conflict before continuing.\n'
                               'Either unset the env variable, or run '
                               '`cfy profiles unset --manager-password`')
    return password or from_profile.manager_password


def get_tenant_name(from_profile=None):
    if from_profile is None:
        from_profile = profile
    tenant = os.environ.get(constants.CLOUDIFY_TENANT_ENV)
    if tenant and from_profile.manager_tenant:
        raise CloudifyCliError('Manager Tenant is set in profile *and* in '
                               'the `CLOUDIFY_TENANT` env variable. Resolve '
                               'the conflict before continuing.\n'
                               'Either unset the env variable, or run '
                               '`cfy profiles unset --manager-tenant`')
    return tenant or from_profile.manager_tenant


def get_kerberos_env(from_profile=None):
    if from_profile is None:
        from_profile = profile
    return from_profile.kerberos_env


def get_ssl_cert(from_profile=None):
    """Return the path to a local copy of the manager's public certificate.

    :return: If the LOCAL_REST_CERT_FILE env var was set by the user *or* if
    `rest_certificate` is set in the profile - use it,
    If it wasn't set, check if the certificate file is found in its default
    location. If so - use it, otherwise - return None

    Note that if it is set in both profile and env var - an error will be
    raised
    """
    if from_profile is None:
        from_profile = profile
    cert = os.environ.get(constants.LOCAL_REST_CERT_FILE)
    if cert and from_profile.rest_certificate:
        raise CloudifyCliError('Rest Certificate is set in profile *and* in '
                               'the `LOCAL_REST_CERT_FILE` env variable. '
                               'Resolve the conflict before continuing.\n'
                               'Either unset the env variable, or run '
                               '`cfy profiles unset --rest_certificate`')
    if cert or from_profile.rest_certificate:
        return cert or from_profile.rest_certificate

    default_cert_file = get_default_rest_cert_local_path()
    return default_cert_file if os.path.isfile(default_cert_file) else None


def get_ssl_trust_all():
    trust_all = os.environ.get(constants.CLOUDIFY_SSL_TRUST_ALL)
    if trust_all is not None and len(trust_all) > 0:
        return True
    return False


def get_version_data():
    data = pkgutil.get_data('cloudify_cli', 'VERSION')
    return json.loads(data)


def get_manager_version_data(rest_client=None):
    if not rest_client:
        if not get_profile_context(suppress_error=True):
            return None
        try:
            rest_client = get_rest_client()
        except CloudifyCliError:
            return None
    try:
        version_data = rest_client.manager.get_version()
    except CloudifyClientError:
        return None

    version_data['ip'] = profile.manager_ip
    return version_data
class ProfileContext(yaml.YAMLObject):
    yaml_tag = u'!CloudifyProfileContext'
    yaml_loader = yaml.Loader

    def __init__(self, profile_name=None):
        # Note that __init__ is not called when loading from yaml.
        # When adding a new ProfileContext attribute, make sure that
        # all methods handle the case when the attribute is missing
        self._profile_name = profile_name
        self.manager_ip = None
        self.ssh_key = None
        self._ssh_port = None
        self.ssh_user = None
        self.provider_context = dict()
        self.manager_username = None
        self.manager_password = None
        self.manager_tenant = None
        self.rest_port = constants.DEFAULT_REST_PORT
        self.rest_protocol = constants.DEFAULT_REST_PROTOCOL
        self.rest_certificate = None
        self.kerberos_env = False
        self._cluster = dict()

    def to_dict(self):
        return dict(
            name=self.profile_name,
            manager_ip=self.manager_ip,
            ssh_key_path=self.ssh_key,
            ssh_port=self.ssh_port,
            ssh_user=self.ssh_user,
            provider_context=self.provider_context,
            manager_username=self.manager_username,
            manager_tenant=self.manager_tenant,
            rest_port=self.rest_port,
            rest_protocol=self.rest_protocol,
            rest_certificate=self.rest_certificate,
            kerberos_env=self.kerberos_env,
            cluster=self.cluster
        )

    @property
    def ssh_port(self):
        return self._ssh_port

    @ssh_port.setter
    def ssh_port(self, ssh_port):
        # If the port is int, we want to change it to a string. Otherwise,
        # leave None as is
        ssh_port = str(ssh_port) if ssh_port else None
        self._ssh_port = ssh_port

    @property
    def profile_name(self):
        return getattr(self, '_profile_name', None) \
            or getattr(self, 'manager_ip', None)

    @property
    def cluster(self):
        # default the ._cluster attribute here, so that all callers can use it
        # as just ._cluster, even if it's not present in the source yaml
        if not hasattr(self, '_cluster'):
            self._cluster = dict()
        return self._cluster

    @cluster.setter
    def cluster(self, cluster):
        self._cluster = cluster

    @profile_name.setter
    def profile_name(self, profile_name):
        self._profile_name = profile_name

    def _get_context_path(self):
        init_path = get_profile_dir(self.profile_name)
        context_path = os.path.join(
            init_path,
            constants.CLOUDIFY_PROFILE_CONTEXT_FILE_NAME)
        return context_path

    @property
    def workdir(self):
        return os.path.join(PROFILES_DIR, self.profile_name)

    def save(self, destination=None):
        if not self.profile_name:
            raise CloudifyCliError('No profile name or Manager IP set')
        workdir = destination or self.workdir
        # Create a new file
        if not os.path.exists(workdir):
            os.makedirs(workdir)
        target_file_path = os.path.join(
            workdir,
            constants.CLOUDIFY_PROFILE_CONTEXT_FILE_NAME)
        with open(target_file_path, 'w') as f:
            f.write(yaml.dump(self))


def get_auth_header(username, password):
    header = {}

    if username and password:
        credentials = '{0}:{1}'.format(username, password)
        encoded_credentials = urlsafe_b64encode(credentials)
        header = {
            constants.CLOUDIFY_AUTHENTICATION_HEADER:
                constants.BASIC_AUTH_PREFIX + ' ' + encoded_credentials}

    return header
# attributes that can differ for each node in a cluster. Those will be updated
# in the profile when we switch to a new master.
# Dicts with these keys live in profile.cluster, and are added there during
# either `cfy cluster update-profile` (in which case some of them might be
# missing, eg. ssh_*), or during a `cfy cluster join`.
# If a value is missing, we will use the value from the last active manager.
# Only the IP is required.
# Note that not all attributes are allowed - username/password will be
# the same for every node in the cluster.
CLUSTER_NODE_ATTRS = ['host_ip', 'host_type', 'rest_port', 'rest_protocol',
'ssh_port', 'ssh_user', 'ssh_key']
class ClusterHTTPClient(HTTPClient):
def __init__(self, *args, **kwargs):
profile = kwargs.pop('profile')
super(ClusterHTTPClient, self).__init__(*args, **kwargs)
if not profile.cluster:
raise ValueError('Cluster client invoked for an empty cluster!')
self._cluster = list(profile.cluster.get(CloudifyNodeType.MANAGER))
self._profile = profile
first_node = self._cluster[0]
self.cert = first_node.get('cert') or self.cert
self.trust_all = first_node.get('trust_all') or self.trust_all
self.default_timeout_sec = self.default_timeout_sec or (5, None)
def do_request(self, *args, **kwargs):
# this request can be retried for each manager - if the data is
        # a generator, we need to copy it, so we can send it more than once
        copied_data = None
        if isinstance(kwargs.get('data'), types.GeneratorType):
            copied_data = itertools.tee(kwargs.pop('data'),
                                        len(self._cluster) + 1)
        if kwargs.get('timeout') is None:
            kwargs['timeout'] = self.default_timeout_sec
        if copied_data is not None:
            kwargs['data'] = copied_data[-1]
        manager_host = get_target_manager()
        if manager_host:
            self.host = manager_host
            return self._try_do_request(*args, **kwargs)
        # First try with the main manager ip given when creating the profile
        # with `cfy profiles use`
        self.host = self._profile.manager_ip
        response = self._try_do_request(*args, **kwargs)
        if response:
            return response
        for node_index, node in list(enumerate(
                self._profile.cluster[CloudifyNodeType.MANAGER])):
            if self._profile.manager_ip in [node['host_ip'], node['hostname']]:
                continue
            self._use_node(node)
            if copied_data is not None:
                kwargs['data'] = copied_data[node_index]
            response = self._try_do_request(*args, **kwargs)
            if response:
                return response
        raise CloudifyClientError('All cluster nodes are offline')

    def _try_do_request(self, *args, **kwargs):
        try:
            return super(ClusterHTTPClient, self).do_request(*args,
                                                             **kwargs)
        except (requests.exceptions.ConnectionError,
                CloudifyClientError) as e:
            if isinstance(e, CloudifyClientError) and e.status_code != 502:
                raise

    def _use_node(self, node):
        if node['host_ip'] == self.host:
            return
        self.host = node['host_ip']
        for attr in ['rest_port', 'rest_protocol', 'trust_all', 'cert']:
            new_value = node.get(attr)
            if new_value:
                setattr(self, attr, new_value)
        self._update_profile(node)

    def _update_profile(self, node):
        """
        Put the node at the start of the cluster list in profile.

        The client tries nodes in the order of the cluster list, so putting
        the node first will make the client try it first next time. This makes
        the client always try the last-known-active-manager first.
        """
        self._profile.cluster[CloudifyNodeType.MANAGER].remove(node)
        self._profile.cluster[CloudifyNodeType.MANAGER] = (
            [node] + self._profile.cluster[CloudifyNodeType.MANAGER])
        for node_attr in CLUSTER_NODE_ATTRS:
            if node_attr in node:
                setattr(self._profile, node_attr, node[node_attr])
        self._profile.save()


class CloudifyClusterClient(CloudifyClient):
    """
    A CloudifyClient that will retry the queries with the current manager.

    When a request fails with a connection error, this will keep trying with
    every node in the cluster, until it finds an active manager.
    When an active manager is found, the profile will be updated with its
    address.
    """
    def __init__(self, profile, *args, **kwargs):
        self._profile = profile
        super(CloudifyClusterClient, self).__init__(*args, **kwargs)

    def client_class(self, *args, **kwargs):
        kwargs.setdefault('profile', self._profile)
        return ClusterHTTPClient(*args, **kwargs)


profile = get_profile_context(suppress_error=True)
target_manager = None
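The reordering that `_update_profile` performs is a plain move-to-front shuffle. A minimal sketch with hypothetical node records (plain dicts, not real profile objects):

```python
# Hypothetical cluster list; dicts stand in for manager node records.
cluster = [{'host_ip': '10.0.0.1'},
           {'host_ip': '10.0.0.2'},
           {'host_ip': '10.0.0.3'}]


def move_to_front(nodes, node):
    # Same remove-then-prepend shuffle as _update_profile: the
    # last-known-active manager is tried first on the next request.
    nodes.remove(node)
    return [node] + nodes


cluster = move_to_front(cluster, {'host_ip': '10.0.0.2'})
# The active node is now first; the relative order of the rest is kept.
```

`list.remove` compares by equality, which is why an equal dict is enough to identify the node here.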
# File: Calibration/HcalAlCaRecoProducers/python/alcastreamHcalIsotrkOutput_cff.py
# (repo: ckamtsikis/cmssw, license: Apache-2.0)
import FWCore.ParameterSet.Config as cms
# output block for alcastream HCAL Isotrk
# output module
# module alcastreamHcalIsotrkOutput = PoolOutputModule
alcastreamHcalIsotrkOutput = cms.PSet(
    outputCommands = cms.untracked.vstring('drop *',
        'keep *_offlineBeamSpot_*_*',
        'keep edmTriggerResults_*_*_*',
        'keep triggerTriggerEvent_*_*_*',
        'keep *_gtStage2Digis_*_*',
        'keep HcalNoiseSummary_hcalnoise_*_*',
        'keep *_hbhereco_*_*',
        'keep recoTracks_generalTracks_*_*',
        'keep recoTrackExtras_generalTracks_*_*',
        'keep *_IsoProd_*_*',
    )
)
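The `outputCommands` list is evaluated in order, with the last command matching a branch winning, which is why `'drop *'` comes first and the keeps follow. A rough Python sketch of that selection semantics (an illustration, not CMSSW's actual implementation; branch names below are hypothetical):

```python
from fnmatch import fnmatch


def is_kept(branch, commands):
    # Commands are applied in order; the last one whose glob pattern
    # matches the branch name decides whether it is kept or dropped.
    keep = True
    for command in commands:
        verb, pattern = command.split(None, 1)
        if fnmatch(branch, pattern):
            keep = (verb == 'keep')
    return keep


commands = ['drop *', 'keep *_hbhereco_*_*', 'keep *_IsoProd_*_*']
assert is_kept('HBHERecHitsSorted_hbhereco__RECO', commands)
assert not is_kept('recoMuons_muons__RECO', commands)
```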
# File: Python (Basic) Skills Certification Test/2.Python: String Representations of Objects.py
# (repo: manavnarang/HackerrankCertifications, license: BSD-3-Clause)
#Python: String Representations of Objects
#Implement two vehicle classes:
#Car:
#The constructor for Car must take two arguments. The first of them is its maximum speed, and the second one is a string that denotes the units in which the speed is given: either "km/h" or "mph".
#The class must be implemented to return a string based on the arguments. For example, if car is an object of class Car with a maximum speed of 120, and the unit is "km/h", then printing car prints the following string: "Car with the maximum speed of 120 km/h", without quotes. If the maximum speed is 94 and the unit is "mph", then printing car prints in the following string: "Car with the maximum speed of 94 mph", without quotes.
#Boat:
#The constructor for Boat must take a single argument denoting its maximum speed in knots.
#The class must be implemented to return a string based on the argument. For example, if boat is an object of class Boat with a maximum speed of 82, then printing boat prints the following string: "Boat with the maximum speed of 82 knots", without quotes.
#The implementations of the classes will be tested by a provided code stub on several input files. Each input file contains several queries, and each query constructs an object of one of the classes. It then prints the string representation of the object to the standard output.
#Constraints
#1 ≤ the number of queries in one test file ≤ 100
#The lengths of each of the words is at most 10.
#Sample Case 0
#Sample Input
#STDIN Function
#----- -------
#2 → number of queries, q = 2
#car 151 km/h → query parameters = ["car 151 km/h", "boat 77"]
#boat 77
#Sample Output
#Car with the maximum speed of 151 km/h
#Boat with the maximum speed of 77 knots
#Explanation
#There are 2 queries. In the first of them, an object of class Car with the maximum speed of 151 in km/h is constructed, and then its string representation is printed to the output. In the second query, an object of class Boat is constructed with the maximum speed of 77 knots, and then its string representation is printed to the output.
#Solution:
#!/bin/python3

import math
import os
import random
import re
import sys


class Car:
    def __init__(self, maxspeed, speed_unit):
        self.maxspeed = maxspeed
        self.speed_unit = speed_unit

    def __str__(self):
        sen1 = "Car with the maximum speed of {} {}".format(self.maxspeed,
                                                            self.speed_unit)
        return sen1


class Boat:
    def __init__(self, maxspeed):
        self.maxspeed = maxspeed

    def __str__(self):
        sen1 = "Boat with the maximum speed of {} knots".format(self.maxspeed)
        return sen1


if __name__ == '__main__':
    fptr = open(os.environ['OUTPUT_PATH'], 'w')
    q = int(input())
    queries = []
    for _ in range(q):
        args = input().split()
        vehicle_type, params = args[0], args[1:]
        if vehicle_type == "car":
            max_speed, speed_unit = int(params[0]), params[1]
            vehicle = Car(max_speed, speed_unit)
        elif vehicle_type == "boat":
            max_speed = int(params[0])
            vehicle = Boat(max_speed)
        else:
            raise ValueError("invalid vehicle type")
        fptr.write("%s\n" % vehicle)
    fptr.close()
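Sample Case 0 above can be reproduced directly: `print` calls `__str__` on the objects. A self-contained demo (the classes are restated here so the snippet runs on its own):

```python
class Car:
    def __init__(self, maxspeed, speed_unit):
        self.maxspeed = maxspeed
        self.speed_unit = speed_unit

    def __str__(self):
        return "Car with the maximum speed of {} {}".format(self.maxspeed,
                                                            self.speed_unit)


class Boat:
    def __init__(self, maxspeed):
        self.maxspeed = maxspeed

    def __str__(self):
        return "Boat with the maximum speed of {} knots".format(self.maxspeed)


print(Car(151, "km/h"))  # Car with the maximum speed of 151 km/h
print(Boat(77))          # Boat with the maximum speed of 77 knots
```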
# File: tests/transitfeed/testtrip.py
# (repo: gungnir888/transitfeed3, license: Apache-2.0)
# Copyright (C) 2007 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Unit tests for the trip module.
from io import StringIO
from tests import util
import transitfeed
class DuplicateStopSequenceTestCase(util.TestCase):
  def runTest(self):
    accumulator = util.RecordingProblemAccumulator(
        self, ("ExpirationDate", "NoServiceExceptions"))
    problems = transitfeed.ProblemReporter(accumulator)
    schedule = transitfeed.Schedule(problem_reporter=problems)
    schedule.load(util.data_path('duplicate_stop_sequence'),
                  extra_validation=True)
    e = accumulator.pop_exception('InvalidValue')
    self.assertEqual('stop_sequence', e.column_name)
    self.assertEqual(10, e.value)
    accumulator.assert_no_more_exceptions()


class MissingEndpointTimesTestCase(util.TestCase):
  def runTest(self):
    accumulator = util.RecordingProblemAccumulator(
        self, ('ExpirationDate', 'NoServiceExceptions'))
    problems = transitfeed.ProblemReporter(accumulator)
    schedule = transitfeed.Schedule(problem_reporter=problems)
    schedule.load(util.data_path('missing_endpoint_times'),
                  extra_validation=True)
    e = accumulator.pop_invalid_value('arrival_time')
    self.assertEqual('', e.value)
    e = accumulator.pop_invalid_value('departure_time')
    self.assertEqual('', e.value)


class TripMemoryZipTestCase(util.MemoryZipTestCase):
  def assertLoadAndCheckExtraValues(self, schedule_file):
    """Load file-like schedule_file and check for extra trip columns."""
    load_problems = util.get_test_failure_problem_reporter(
        self, ("ExpirationDate", "UnrecognizedColumn"))
    loaded_schedule = transitfeed.Loader(schedule_file,
                                         loader_problems=load_problems,
                                         extra_validation=True).load()
    self.assertEqual("foo", loaded_schedule.get_trip("AB1")["t_foo"])
    self.assertEqual("", loaded_schedule.get_trip("AB2")["t_foo"])
    self.assertEqual("", loaded_schedule.get_trip("AB1")["n_foo"])
    self.assertEqual("bar", loaded_schedule.get_trip("AB2")["n_foo"])
    # Uncomment the following lines to print the string in testExtraFileColumn
    # print repr(zipfile.ZipFile(schedule_file).read("trips.txt"))
    # self.fail()

  def testExtraObjectAttribute(self):
    """Extra columns added to an object are preserved when writing."""
    schedule = self.MakeLoaderAndLoad()
    # Add an attribute to an existing trip
    trip1 = schedule.get_trip("AB1")
    trip1.t_foo = "foo"
    # Make a copy of trip_id=AB1 and add an attribute before AddTripObject
    trip2 = transitfeed.Trip(field_dict=trip1)
    trip2.trip_id = "AB2"
    trip2.t_foo = ""
    trip2.n_foo = "bar"
    schedule.add_trip_object(trip2)
    trip2.add_stop_time(stop=schedule.get_stop("BULLFROG"),
                        stop_time="09:00:00")
    trip2.add_stop_time(stop=schedule.get_stop("STAGECOACH"),
                        stop_time="09:30:00")
    saved_schedule_file = StringIO()
    schedule.write_google_transit_feed(saved_schedule_file)
    self.accumulator.assert_no_more_exceptions()
    self.assertLoadAndCheckExtraValues(saved_schedule_file)

  def testExtraFileColumn(self):
    """Extra columns loaded from a file are preserved when writing."""
    # Uncomment the code in assertLoadAndCheckExtraValues to generate this
    # string.
    self.SetArchiveContents(
        "trips.txt",
        "route_id,service_id,trip_id,t_foo,n_foo\n"
        "AB,FULLW,AB1,foo,\n"
        "AB,FULLW,AB2,,bar\n")
    self.AppendToArchiveContents(
        "stop_times.txt",
        "AB2,09:00:00,09:00:00,BULLFROG,1\n"
        "AB2,09:30:00,09:30:00,STAGECOACH,2\n")
    load1_problems = util.get_test_failure_problem_reporter(
        self, ("ExpirationDate", "UnrecognizedColumn"))
    schedule = self.MakeLoaderAndLoad(loader_problems=load1_problems)
    saved_schedule_file = StringIO()
    schedule.write_google_transit_feed(saved_schedule_file)
    self.assertLoadAndCheckExtraValues(saved_schedule_file)
class TripValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    trip = transitfeed.Trip()
    repr(trip)  # shouldn't crash

    schedule = self.SimpleSchedule()
    trip = transitfeed.Trip()
    repr(trip)  # shouldn't crash

    trip = transitfeed.Trip()
    trip.trip_headsign = '\xBA\xDF\x0D'  # Not valid ascii or utf8
    repr(trip)  # shouldn't crash

    trip.route_id = '054C'
    trip.service_id = 'WEEK'
    trip.trip_id = '054C-00'
    trip.trip_headsign = 'via Polish Hill'
    trip.trip_short_name = 'X12'
    trip.direction_id = '0'
    trip.block_id = None
    trip.shape_id = None
    trip.bikes_allowed = '1'
    trip.wheelchair_accessible = '2'
    trip.validate(self.problems)
    self.accumulator.assert_no_more_exceptions()
    repr(trip)  # shouldn't crash

    # missing route ID
    trip.route_id = None
    self.ValidateAndExpectMissingValue(trip, 'route_id')
    trip.route_id = '054C'

    # missing service ID
    trip.service_id = None
    self.ValidateAndExpectMissingValue(trip, 'service_id')
    trip.service_id = 'WEEK'

    # missing trip ID
    trip.trip_id = None
    self.ValidateAndExpectMissingValue(trip, 'trip_id')
    trip.trip_id = '054C-00'

    # invalid direction ID
    trip.direction_id = 'NORTH'
    self.ValidateAndExpectInvalidValue(trip, 'direction_id')
    trip.direction_id = '0'

    # invalid bikes_allowed
    trip.bikes_allowed = '3'
    self.ValidateAndExpectInvalidValue(trip, 'bikes_allowed')
    trip.bikes_allowed = None

    # invalid wheelchair_accessible
    trip.wheelchair_accessible = '3'
    self.ValidateAndExpectInvalidValue(trip, 'wheelchair_accessible')
    trip.wheelchair_accessible = None

    # AddTripObject validates that route_id, service_id, .... are found in the
    # schedule. The Validate calls made by self.Expect... above can't make this
    # check because trip is not in a schedule.
    trip.route_id = '054C-notfound'
    schedule.add_trip_object(trip, self.problems, True)
    e = self.accumulator.pop_exception('InvalidValue')
    self.assertEqual('route_id', e.column_name)
    self.accumulator.assert_no_more_exceptions()
    trip.route_id = '054C'

    # Make sure calling Trip.Validate validates that route_id and service_id
    # are found in the schedule.
    trip.service_id = 'WEEK-notfound'
    trip.validate(self.problems)
    e = self.accumulator.pop_exception('InvalidValue')
    self.assertEqual('service_id', e.column_name)
    self.accumulator.assert_no_more_exceptions()
    trip.service_id = 'WEEK'

    trip.validate(self.problems)
    self.accumulator.assert_no_more_exceptions()

    # expect no problems for non-overlapping periods
    trip.add_frequency("06:00:00", "12:00:00", 600)
    trip.add_frequency("01:00:00", "02:00:00", 1200)
    trip.add_frequency("04:00:00", "05:00:00", 1000)
    trip.add_frequency("12:00:00", "19:00:00", 700)
    trip.validate(self.problems)
    self.accumulator.assert_no_more_exceptions()
    trip.clear_frequencies()

    # overlapping headway periods
    trip.add_frequency("00:00:00", "12:00:00", 600)
    trip.add_frequency("06:00:00", "18:00:00", 1200)
    self.ValidateAndExpectOtherProblem(trip)
    trip.clear_frequencies()

    trip.add_frequency("12:00:00", "20:00:00", 600)
    trip.add_frequency("06:00:00", "18:00:00", 1200)
    self.ValidateAndExpectOtherProblem(trip)
    trip.clear_frequencies()

    trip.add_frequency("06:00:00", "12:00:00", 600)
    trip.add_frequency("00:00:00", "25:00:00", 1200)
    self.ValidateAndExpectOtherProblem(trip)
    trip.clear_frequencies()

    trip.add_frequency("00:00:00", "20:00:00", 600)
    trip.add_frequency("06:00:00", "18:00:00", 1200)
    self.ValidateAndExpectOtherProblem(trip)
    trip.clear_frequencies()
    self.accumulator.assert_no_more_exceptions()
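The overlapping-headway cases above reduce to a pairwise interval-intersection test. A minimal sketch, assuming headway periods are half-open `[start, end)` intervals in seconds since midnight (a standalone illustration, not transitfeed's actual check):

```python
def periods_overlap(periods):
    # Two half-open intervals [s1, e1) and [s2, e2) intersect
    # exactly when s1 < e2 and s2 < e1.
    for i, (s1, e1) in enumerate(periods):
        for s2, e2 in periods[i + 1:]:
            if s1 < e2 and s2 < e1:
                return True
    return False


# Non-overlapping and overlapping periods from the test cases above.
ok = [(6 * 3600, 12 * 3600), (1 * 3600, 2 * 3600), (12 * 3600, 19 * 3600)]
bad = [(0, 12 * 3600), (6 * 3600, 18 * 3600)]
assert not periods_overlap(ok)
assert periods_overlap(bad)
```

The half-open convention is what makes a period ending at 12:00:00 compatible with one starting at 12:00:00.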
class TripSequenceValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()
    # Make a new trip without any stop times
    trip = schedule.get_route("054C").add_trip(trip_id="054C-00")
    stop1 = schedule.get_stop('stop1')
    stop2 = schedule.get_stop('stop2')
    stop3 = schedule.get_stop('stop3')
    stoptime1 = transitfeed.StopTime(self.problems, stop1,
                                     stop_time='12:00:00', stop_sequence=1)
    stoptime2 = transitfeed.StopTime(self.problems, stop2,
                                     stop_time='11:30:00', stop_sequence=2)
    stoptime3 = transitfeed.StopTime(self.problems, stop3,
                                     stop_time='12:15:00', stop_sequence=3)
    trip._add_stop_time_object_unordered(stoptime1, schedule)
    trip._add_stop_time_object_unordered(stoptime2, schedule)
    trip._add_stop_time_object_unordered(stoptime3, schedule)
    trip.validate(self.problems)
    e = self.accumulator.pop_exception('OtherProblem')
    self.assertTrue(e.format_problem().find('Timetravel detected') != -1)
    self.assertTrue(e.format_problem().find('number 2 in trip 054C-00') != -1)
    self.accumulator.assert_no_more_exceptions()


class TripServiceIDValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()
    trip1 = transitfeed.Trip()
    trip1.route_id = "054C"
    trip1.service_id = "WEEKDAY"
    trip1.trip_id = "054C_WEEK"
    self.ExpectInvalidValueInClosure(
        column_name="service_id",
        value="WEEKDAY",
        c=lambda: schedule.add_trip_object(trip1, validate=True))


class TripDistanceFromStopToShapeValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()
    stop1 = schedule.stops["stop1"]
    stop2 = schedule.stops["stop2"]
    stop3 = schedule.stops["stop3"]

    # Set shape_dist_traveled
    trip = schedule.trips["CITY1"]
    trip.clear_stop_times()
    trip.add_stop_time(stop1, stop_time="12:00:00", shape_dist_traveled=0)
    trip.add_stop_time(stop2, stop_time="12:00:45", shape_dist_traveled=500)
    trip.add_stop_time(stop3, stop_time="12:02:30", shape_dist_traveled=1500)
    trip.shape_id = "shape1"

    # Add a valid shape for the trip to the current schedule.
    shape = transitfeed.Shape("shape1")
    shape.add_point(48.2, 1.00, 0)
    shape.add_point(48.2, 1.01, 500)
    shape.add_point(48.2, 1.03, 1500)
    shape.max_distance = 1500
    schedule.add_shape_object(shape)

    # The schedule should validate with no problems.
    self.ExpectNoProblems(schedule)

    # Delete a stop latitude. This should not crash validation.
    stop1.stop_lat = None
    self.ValidateAndExpectMissingValue(schedule, "stop_lat")


class TripHasStopTimeValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()
    trip = schedule.get_route("054C").add_trip(trip_id="054C-00")

    # We should get an OtherProblem here because the trip has no stops.
    self.ValidateAndExpectOtherProblem(schedule)

    # It should trigger a TYPE_ERROR if there are frequencies for the trip
    # but no stops
    trip.add_frequency("01:00:00", "12:00:00", 600)
    schedule.validate(self.problems)
    self.accumulator.pop_exception('OtherProblem')  # pop first warning
    e = self.accumulator.pop_exception('OtherProblem')  # pop frequency error
    self.assertTrue(e.format_problem().find('Frequencies defined, but') != -1)
    self.assertTrue(e.format_problem().find('given in trip 054C-00') != -1)
    self.assertEquals(transitfeed.TYPE_ERROR, e.type)
    self.accumulator.assert_no_more_exceptions()
    trip.clear_frequencies()

    # Add a stop, but with only one stop passengers have nowhere to exit!
    stop = transitfeed.Stop(36.425288, -117.133162, "Demo Stop 1", "STOP1")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:11:00", departure_time="5:12:00")
    self.ValidateAndExpectOtherProblem(schedule)

    # Add another stop, and then validation should be happy.
    stop = transitfeed.Stop(36.424288, -117.133142, "Demo Stop 2", "STOP2")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:15:00", departure_time="5:16:00")
    schedule.validate(self.problems)

    trip.add_stop_time(stop, stop_time="05:20:00")
    trip.add_stop_time(stop, stop_time="05:22:00")

    # Last stop must always have a time
    trip.add_stop_time(stop, arrival_secs=None, departure_secs=None)
    self.ExpectInvalidValueInClosure(
        'arrival_time',
        c=lambda: trip.get_end_time(loader_problems=self.problems))
class ShapeDistTraveledOfStopTimeValidationTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()

    shape = transitfeed.Shape("shape_1")
    shape.add_point(36.425288, -117.133162, 0)
    shape.add_point(36.424288, -117.133142, 1)
    schedule.add_shape_object(shape)

    trip = schedule.get_route("054C").add_trip(trip_id="054C-00")
    trip.shape_id = "shape_1"

    stop = transitfeed.Stop(36.425288, -117.133162, "Demo Stop 1", "STOP1")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:11:00", departure_time="5:12:00",
                       stop_sequence=0, shape_dist_traveled=0)
    stop = transitfeed.Stop(36.424288, -117.133142, "Demo Stop 2", "STOP2")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:15:00", departure_time="5:16:00",
                       stop_sequence=1, shape_dist_traveled=1)
    stop = transitfeed.Stop(36.423288, -117.133122, "Demo Stop 3", "STOP3")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:18:00", departure_time="5:19:00",
                       stop_sequence=2, shape_dist_traveled=2)
    self.accumulator.assert_no_more_exceptions()
    schedule.validate(self.problems)
    e = self.accumulator.pop_exception('OtherProblem')
    self.assertMatchesRegex('shape_dist_traveled=2', e.format_problem())
    self.accumulator.assert_no_more_exceptions()

    # Error if the distance decreases.
    shape.add_point(36.421288, -117.133132, 2)
    stop = transitfeed.Stop(36.421288, -117.133122, "Demo Stop 4", "STOP4")
    schedule.add_stop_object(stop)
    stoptime = transitfeed.StopTime(self.problems, stop,
                                    arrival_time="5:29:00",
                                    departure_time="5:29:00", stop_sequence=3,
                                    shape_dist_traveled=1.7)
    trip.add_stop_time_object(stoptime, schedule=schedule)
    self.accumulator.assert_no_more_exceptions()
    schedule.validate(self.problems)
    e = self.accumulator.pop_exception('InvalidValue')
    self.assertMatchesRegex('stop STOP4 has', e.format_problem())
    self.assertMatchesRegex('shape_dist_traveled=1.7', e.format_problem())
    self.assertMatchesRegex('distance was 2.0.', e.format_problem())
    self.assertEqual(e.type, transitfeed.TYPE_ERROR)
    self.accumulator.assert_no_more_exceptions()

    # Warning if distance remains the same between two stop_times
    stoptime.shape_dist_traveled = 2.0
    trip.replace_stop_time_object(stoptime, schedule=schedule)
    schedule.validate(self.problems)
    e = self.accumulator.pop_exception('InvalidValue')
    self.assertMatchesRegex('stop STOP4 has', e.format_problem())
    self.assertMatchesRegex('shape_dist_traveled=2.0', e.format_problem())
    self.assertMatchesRegex('distance was 2.0.', e.format_problem())
    self.assertEqual(e.type, transitfeed.TYPE_WARNING)
    self.accumulator.assert_no_more_exceptions()


class StopMatchWithShapeTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = self.SimpleSchedule()

    shape = transitfeed.Shape("shape_1")
    shape.add_point(36.425288, -117.133162, 0)
    shape.add_point(36.424288, -117.143142, 1)
    schedule.add_shape_object(shape)

    trip = schedule.get_route("054C").add_trip(trip_id="054C-00")
    trip.shape_id = "shape_1"

    # Stop 1 is only 600 meters away from shape, which is allowed.
    stop = transitfeed.Stop(36.425288, -117.139162, "Demo Stop 1", "STOP1")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:11:00", departure_time="5:12:00",
                       stop_sequence=0, shape_dist_traveled=0)
    # Stop 2 is more than 1000 meters away from shape, which is not allowed.
    stop = transitfeed.Stop(36.424288, -117.158142, "Demo Stop 2", "STOP2")
    schedule.add_stop_object(stop)
    trip.add_stop_time(stop, arrival_time="5:15:00", departure_time="5:16:00",
                       stop_sequence=1, shape_dist_traveled=1)

    schedule.validate(self.problems)
    e = self.accumulator.pop_exception('StopTooFarFromShapeWithDistTraveled')
    self.assertTrue(e.format_problem().find('Demo Stop 2') != -1)
    self.assertTrue(e.format_problem().find('1344 meters away') != -1)
    self.accumulator.assert_no_more_exceptions()
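The "1344 meters away" figure expected above can be roughly reproduced with an equirectangular approximation between Demo Stop 2 and the nearest shape vertex. This is a hypothetical helper for illustration, not transitfeed's actual distance code, and the Earth radius used here is the common mean value:

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius; approximate


def approx_distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: adequate for the short distances
    # involved in stop-to-shape checks.
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat)
    dy = math.radians(lat2 - lat1)
    return EARTH_RADIUS_M * math.hypot(dx, dy)


# Demo Stop 2 vs the nearest shape vertex from the test above.
d = approx_distance_m(36.424288, -117.158142, 36.424288, -117.143142)
assert 1300 < d < 1400  # close to the 1344 m the validator reports
```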
class TripAddStopTimeObjectTestCase(util.ValidationTestCase):
  def runTest(self):
    schedule = transitfeed.Schedule(problem_reporter=self.problems)
    schedule.add_agency("\xc8\x8b Fly Agency", "http://iflyagency.com",
                        "America/Los_Angeles")
    schedule.get_default_service_period().set_date_has_service('20070101')
    stop1 = schedule.add_stop(lng=140, lat=48.2, name="Stop 1")
    stop2 = schedule.add_stop(lng=140.001, lat=48.201, name="Stop 2")
    route = schedule.add_route("B", "Beta", "Bus")
    trip = route.add_trip(schedule, "bus trip")
    trip.add_stop_time_object(transitfeed.StopTime(self.problems, stop1,
                                                   arrival_secs=10,
                                                   departure_secs=10),
                              schedule=schedule,
                              loader_problems=self.problems)
    trip.add_stop_time_object(transitfeed.StopTime(self.problems, stop2,
                                                   arrival_secs=20,
                                                   departure_secs=20),
                              schedule=schedule,
                              loader_problems=self.problems)
    # TODO: Factor out checks or use mock problems object
    self.ExpectOtherProblemInClosure(
        lambda: trip.add_stop_time_object(
            transitfeed.StopTime(self.problems, stop1,
                                 arrival_secs=15,
                                 departure_secs=15),
            schedule=schedule, loader_problems=self.problems))
    trip.add_stop_time_object(transitfeed.StopTime(self.problems, stop1),
                              schedule=schedule,
                              loader_problems=self.problems)
    self.ExpectOtherProblemInClosure(
        lambda: trip.add_stop_time_object(
            transitfeed.StopTime(self.problems, stop1,
                                 arrival_secs=15,
                                 departure_secs=15),
            schedule=schedule, loader_problems=self.problems))
    trip.add_stop_time_object(transitfeed.StopTime(self.problems, stop1,
                                                   arrival_secs=30,
                                                   departure_secs=30),
                              schedule=schedule,
                              loader_problems=self.problems)
    self.accumulator.assert_no_more_exceptions()


class TripReplaceStopTimeObjectTestCase(util.TestCase):
  def runTest(self):
    schedule = transitfeed.Schedule()
    schedule.add_agency("\xc8\x8b Fly Agency", "http://iflyagency.com",
                        "America/Los_Angeles")
    schedule.get_default_service_period().set_date_has_service('20070101')
    stop1 = schedule.add_stop(lng=140, lat=48.2, name="Stop 1")
    route = schedule.add_route("B", "Beta", "Bus")
    trip = route.add_trip(schedule, "bus trip")
    stoptime = transitfeed.StopTime(transitfeed.default_problem_reporter,
                                    stop1,
                                    arrival_secs=10,
                                    departure_secs=10)
    trip.add_stop_time_object(stoptime, schedule=schedule)
    stoptime.departure_secs = 20
    trip.replace_stop_time_object(stoptime, schedule=schedule)
    stoptimes = trip.get_stop_times()
    self.assertEqual(len(stoptimes), 1)
    self.assertEqual(stoptimes[0].departure_secs, 20)

    unknown_stop = schedule.add_stop(lng=140, lat=48.2, name="unknown")
    unknown_stoptime = transitfeed.StopTime(
        transitfeed.default_problem_reporter, unknown_stop,
        arrival_secs=10,
        departure_secs=10)
    unknown_stoptime.stop_sequence = 5
    # Attempting to replace a non-existent StopTime raises an error
    self.assertRaises(transitfeed.Error, trip.replace_stop_time_object,
                      unknown_stoptime, schedule=schedule)


class SingleTripTestCase(util.TestCase):
  def setUp(self):
    schedule = transitfeed.Schedule(
        problem_reporter=util.ExceptionProblemReporterNoExpiration())
    schedule.new_default_agency(agency_name="Test Agency",
                                agency_url="http://example.com",
                                agency_timezone="America/Los_Angeles")
    route = schedule.add_route(short_name="54C", long_name="Polish Hill",
                               route_type=3)
    service_period = schedule.get_default_service_period()
    service_period.set_date_has_service("20070101")
    trip = route.add_trip(schedule, 'via Polish Hill')
    stop1 = schedule.add_stop(36.425288, -117.133162, "Demo Stop 1")
    stop2 = schedule.add_stop(36.424288, -117.133142, "Demo Stop 2")

    self.schedule = schedule
    self.trip = trip
    self.stop1 = stop1
    self.stop2 = stop2


class TripStopTimeAccessorsTestCase(SingleTripTestCase):
  def runTest(self):
    self.trip.add_stop_time(
        self.stop1, arrival_time="5:11:00", departure_time="5:12:00")
    self.trip.add_stop_time(
        self.stop2, arrival_time="5:15:00", departure_time="5:16:00")

    # Add some more stop times and test GetEndTime does the correct thing
    self.assertEqual(transitfeed.format_seconds_since_midnight(
        self.trip.get_start_time()), "05:11:00")
    self.assertEqual(transitfeed.format_seconds_since_midnight(
        self.trip.get_end_time()), "05:16:00")

    self.trip.add_stop_time(self.stop1, stop_time="05:20:00")
    self.assertEqual(
        transitfeed.format_seconds_since_midnight(self.trip.get_end_time()),
        "05:20:00")
    self.trip.add_stop_time(self.stop2, stop_time="05:22:00")
    self.assertEqual(
        transitfeed.format_seconds_since_midnight(self.trip.get_end_time()),
        "05:22:00")
class TripGetStopTimesTestCase(SingleTripTestCase):
  def runTest(self):
    self.trip.add_stop_time(
        self.stop1,
        arrival_time="5:11:00",
        departure_time="5:12:00",
        stop_headsign='Stop Headsign',
        pickup_type=1,
        drop_off_type=2,
        shape_dist_traveled=100,
        timepoint=1)
    self.trip.add_stop_time(
        self.stop2, arrival_time="5:15:00", departure_time="5:16:00")

    stop_times = self.trip.get_stop_times()
    self.assertEquals(2, len(stop_times))
    st = stop_times[0]
    self.assertEquals(self.stop1.stop_id, st.stop_id)
    self.assertEquals('05:11:00', st.arrival_time)
    self.assertEquals('05:12:00', st.departure_time)
    self.assertEquals(u'Stop Headsign', st.stop_headsign)
    self.assertEquals(1, st.pickup_type)
    self.assertEquals(2, st.drop_off_type)
    self.assertEquals(100.0, st.shape_dist_traveled)
    self.assertEquals(1, st.timepoint)
    st = stop_times[1]
    self.assertEquals(self.stop2.stop_id, st.stop_id)
    self.assertEquals('05:15:00', st.arrival_time)
    self.assertEquals('05:16:00', st.departure_time)

    tuples = self.trip.get_stop_times_tuples()
    self.assertEquals(2, len(tuples))
    self.assertEqual(
        (self.trip.trip_id, "05:11:00", "05:12:00", self.stop1.stop_id,
         1, u'Stop Headsign', 1, 2, 100.0, 1),
        tuples[0])
    self.assertEqual(
        (self.trip.trip_id, "05:15:00", "05:16:00", self.stop2.stop_id,
         2, '', '', '', '', ''),
        tuples[1])


class TripClearStopTimesTestCase(util.TestCase):
  def runTest(self):
    schedule = transitfeed.Schedule(
        problem_reporter=util.ExceptionProblemReporterNoExpiration())
    schedule.new_default_agency(agency_name="Test Agency",
                                agency_timezone="America/Los_Angeles")
    route = schedule.add_route(short_name="54C", long_name="Hill",
                               route_type=3)
    schedule.get_default_service_period().set_date_has_service("20070101")
    stop1 = schedule.add_stop(36, -117.1, "Demo Stop 1")
    stop2 = schedule.add_stop(36, -117.2, "Demo Stop 2")
    stop3 = schedule.add_stop(36, -117.3, "Demo Stop 3")
    trip = route.add_trip(schedule, "via Polish Hill")
    trip.clear_stop_times()
    self.assertFalse(trip.get_stop_times())
    trip.add_stop_time(stop1, stop_time="5:11:00")
    self.assertTrue(trip.get_stop_times())
    trip.clear_stop_times()
    self.assertFalse(trip.get_stop_times())
    trip.add_stop_time(stop3, stop_time="4:00:00")  # Can insert earlier time
    trip.add_stop_time(stop2, stop_time="4:15:00")
    trip.add_stop_time(stop1, stop_time="4:21:00")
    old_stop_times = trip.get_stop_times()
    self.assertTrue(old_stop_times)
    trip.clear_stop_times()
    self.assertFalse(trip.get_stop_times())
    for st in old_stop_times:
      trip.add_stop_time_object(st)
    self.assertEqual(trip.get_start_time(), 4 * 3600)
    self.assertEqual(trip.get_end_time(), 4 * 3600 + 21 * 60)


class InvalidRouteAgencyTestCase(util.LoadTestCase):
  def runTest(self):
    self.load('invalid_route_agency')
    self.accumulator.pop_invalid_value("agency_id", "routes.txt")
    self.accumulator.pop_invalid_value("route_id", "trips.txt")
    self.accumulator.assert_no_more_exceptions()


class InvalidAgencyIdsTestCase(util.LoadTestCase):
  def runTest(self):
    self.load('invalid_agency_ids')
    self.accumulator.pop_exception('OtherProblem')
    self.accumulator.assert_no_more_exceptions()


class AddStopTimeParametersTestCase(util.TestCase):
  def runTest(self):
    problem_reporter = util.get_test_failure_problem_reporter(self)
    schedule = transitfeed.Schedule(problem_reporter=problem_reporter)
    route = schedule.add_route(short_name="10", long_name="",
                               route_type="Bus")
    stop = schedule.add_stop(40, -128, "My stop")
    # Stop must be added to schedule so that the call
    # AddStopTime -> AddStopTimeObject -> GetStopTimes -> GetStop can work
    trip = transitfeed.Trip()
    trip.route_id = route.route_id
    trip.service_id = schedule.get_default_service_period().service_id
    trip.trip_id = "SAMPLE_TRIP"
    schedule.add_trip_object(trip)

    # First stop must have time
    trip.add_stop_time(stop, arrival_secs=300, departure_secs=360)
    trip.add_stop_time(stop)
    trip.add_stop_time(stop, arrival_time="00:07:00",
                       departure_time="00:07:30")
    trip.validate(problem_reporter)
class AddFrequencyValidationTestCase(util.ValidationTestCase):
  def ExpectInvalidValue(self, start_time, end_time, headway,
                         column_name, value):
    try:
      trip = transitfeed.Trip()
      trip.add_frequency(start_time, end_time, headway)
      self.fail("Expected InvalidValue error on %s" % column_name)
    except transitfeed.InvalidValue as e:
      self.assertEqual(column_name, e.column_name)
      self.assertEqual(value, e.value)
      self.assertEqual(0, len(trip.get_frequency_tuples()))

  def ExpectMissingValue(self, start_time, end_time, headway, column_name):
    trip = transitfeed.Trip()
    try:
      trip.add_frequency(start_time, end_time, headway)
      self.fail("Expected MissingValue error on %s" % column_name)
    except transitfeed.MissingValue as e:
      self.assertEqual(column_name, e.column_name)
      self.assertEqual(0, len(trip.get_frequency_tuples()))

  def runTest(self):
    # these should work fine
    trip = transitfeed.Trip()
    trip.trip_id = "SAMPLE_ID"
    trip.add_frequency(0, 50, 1200)
    trip.add_frequency("01:00:00", "02:00:00", "600")
    trip.add_frequency(u"02:00:00", u"03:00:00", u"1800")
    headways = trip.get_frequency_tuples()
    self.assertEqual(3, len(headways))
    self.assertEqual((0, 50, 1200, 0), headways[0])
    self.assertEqual((3600, 7200, 600, 0), headways[1])
    self.assertEqual((7200, 10800, 1800, 0), headways[2])
    self.assertEqual([("SAMPLE_ID", "00:00:00", "00:00:50", "1200", "0"),
                      ("SAMPLE_ID", "01:00:00", "02:00:00", "600", "0"),
                      ("SAMPLE_ID", "02:00:00", "03:00:00", "1800", "0")],
                     trip.get_frequency_output_tuples())

    # now test invalid input
    self.ExpectMissingValue(None, 50, 1200, "start_time")
    self.ExpectMissingValue("", 50, 1200, "start_time")
    self.ExpectInvalidValue("midnight", 50, 1200, "start_time",
                            "midnight")
    self.ExpectInvalidValue(-50, 50, 1200, "start_time", -50)
    self.ExpectMissingValue(0, None, 1200, "end_time")
    self.ExpectMissingValue(0, "", 1200, "end_time")
    self.ExpectInvalidValue(0, "noon", 1200, "end_time", "noon")
    self.ExpectInvalidValue(0, -50, 1200, "end_time", -50)
    self.ExpectMissingValue(0, 600, 0, "headway_secs")
    self.ExpectMissingValue(0, 600, None, "headway_secs")
    self.ExpectMissingValue(0, 600, "", "headway_secs")
    self.ExpectInvalidValue(0, 600, "test", "headway_secs", "test")
    self.ExpectInvalidValue(0, 600, -60, "headway_secs", -60)
    self.ExpectInvalidValue(0, 0, 1200, "end_time", 0)
    self.ExpectInvalidValue("12:00:00", "06:00:00", 1200, "end_time",
                            21600)
class GetTripTimeTestCase(util.TestCase):
"""Test for GetStopTimeTrips and GetTimeInterpolatedStops"""
def setUp(self):
problems = util.get_test_failure_problem_reporter(self)
schedule = transitfeed.Schedule(problem_reporter=problems)
self.schedule = schedule
schedule.add_agency("Agency", "http://iflyagency.com",
"America/Los_Angeles")
service_period = schedule.get_default_service_period()
service_period.set_date_has_service('20070101')
self.stop1 = schedule.add_stop(lng=140.01, lat=0, name="140.01,0")
self.stop2 = schedule.add_stop(lng=140.02, lat=0, name="140.02,0")
self.stop3 = schedule.add_stop(lng=140.03, lat=0, name="140.03,0")
self.stop4 = schedule.add_stop(lng=140.04, lat=0, name="140.04,0")
self.stop5 = schedule.add_stop(lng=140.05, lat=0, name="140.05,0")
self.route1 = schedule.add_route("1", "One", "Bus")
self.trip1 = self.route1.add_trip(schedule, "trip 1", trip_id='trip1')
self.trip1.add_stop_time(self.stop1, schedule=schedule, departure_secs=100,
arrival_secs=100)
self.trip1.add_stop_time(self.stop2, schedule=schedule)
self.trip1.add_stop_time(self.stop3, schedule=schedule)
# loop back to stop2 to test that interpolated stops work ok even when
# a stop between timepoints is further from the timepoint than the
# preceding
self.trip1.add_stop_time(self.stop2, schedule=schedule)
self.trip1.add_stop_time(self.stop4, schedule=schedule, departure_secs=400,
arrival_secs=400)
self.trip2 = self.route1.add_trip(schedule, "trip 2", trip_id='trip2')
self.trip2.add_stop_time(self.stop2, schedule=schedule, departure_secs=500,
arrival_secs=500)
    self.trip2.add_stop_time(self.stop3, schedule=schedule, departure_secs=600,
                             arrival_secs=600)
    self.trip2.add_stop_time(self.stop4, schedule=schedule, departure_secs=700,
                             arrival_secs=700)
    self.trip2.add_stop_time(self.stop3, schedule=schedule, departure_secs=800,
                             arrival_secs=800)
self.trip3 = self.route1.add_trip(schedule, "trip 3", trip_id='trip3')
def testGetTimeInterpolatedStops(self):
rv = self.trip1.get_time_interpolated_stops()
self.assertEqual(5, len(rv))
(secs, stoptimes, istimepoints) = tuple(zip(*rv))
self.assertEqual((100, 160, 220, 280, 400), secs)
self.assertEqual(("140.01,0", "140.02,0", "140.03,0", "140.02,0", "140.04,0"),
tuple([st.stop.stop_name for st in stoptimes]))
self.assertEqual((True, False, False, False, True), istimepoints)
self.assertEqual([], self.trip3.get_time_interpolated_stops())
def testGetTimeInterpolatedStopsUntimedEnd(self):
    self.trip2.add_stop_time(self.stop3, schedule=self.schedule)
    self.assertRaises(ValueError, self.trip2.get_time_interpolated_stops)
def testGetTimeInterpolatedStopsUntimedStart(self):
# Temporarily replace the problem reporter so that adding the first
# StopTime without a time doesn't throw an exception.
old_problems = self.schedule.problem_reporter
self.schedule.problem_reporter = util.get_test_failure_problem_reporter(
self, ("OtherProblem",))
    self.trip3.add_stop_time(self.stop3, schedule=self.schedule)
self.schedule.problem_reporter = old_problems
    self.trip3.add_stop_time(self.stop2, schedule=self.schedule,
                             departure_secs=500, arrival_secs=500)
    self.assertRaises(ValueError, self.trip3.get_time_interpolated_stops)
def testGetTimeInterpolatedStopsSingleStopTime(self):
    self.trip3.add_stop_time(self.stop3, schedule=self.schedule,
                             departure_secs=500, arrival_secs=500)
rv = self.trip3.get_time_interpolated_stops()
self.assertEqual(1, len(rv))
self.assertEqual(500, rv[0][0])
self.assertEqual(True, rv[0][2])
def testGetStopTimeTrips(self):
stopa = self.schedule.get_nearest_stops(lon=140.03, lat=0)[0]
self.assertEqual("140.03,0", stopa.stop_name) # Got stop3?
rv = stopa.get_stop_time_trips(self.schedule)
self.assertEqual(3, len(rv))
(secs, trip_index, istimepoints) = tuple(zip(*rv))
self.assertEqual((220, 600, 800), secs)
self.assertEqual(("trip1", "trip2", "trip2"), tuple([ti[0].trip_id for ti in trip_index]))
self.assertEqual((2, 1, 3), tuple([ti[1] for ti in trip_index]))
self.assertEqual((False, True, True), istimepoints)
def testStopTripIndex(self):
trip_index = self.stop3.trip_index
trip_ids = [t.trip_id for t, i in trip_index]
self.assertEqual(["trip1", "trip2", "trip2"], trip_ids)
self.assertEqual([2, 1, 3], [i for t, i in trip_index])
def testGetTrips(self):
self.assertEqual(
set([t.trip_id for t in self.stop1.get_trips(self.schedule)]),
{self.trip1.trip_id})
self.assertEqual(
set([t.trip_id for t in self.stop2.get_trips(self.schedule)]),
{self.trip1.trip_id, self.trip2.trip_id})
self.assertEqual(
set([t.trip_id for t in self.stop3.get_trips(self.schedule)]),
{self.trip1.trip_id, self.trip2.trip_id})
self.assertEqual(
set([t.trip_id for t in self.stop4.get_trips(self.schedule)]),
{self.trip1.trip_id, self.trip2.trip_id})
self.assertEqual(
set([t.trip_id for t in self.stop5.get_trips(self.schedule)]),
set())
class GetFrequencyTimesTestCase(util.TestCase):
"""Test for GetFrequencyStartTimes and GetFrequencyStopTimes"""
def setUp(self):
problems = util.get_test_failure_problem_reporter(self)
schedule = transitfeed.Schedule(problem_reporter=problems)
self.schedule = schedule
schedule.add_agency("Agency", "http://iflyagency.com",
"America/Los_Angeles")
service_period = schedule.get_default_service_period()
service_period.set_start_date("20080101")
service_period.set_end_date("20090101")
service_period.set_weekday_service(True)
self.stop1 = schedule.add_stop(lng=140.01, lat=0, name="140.01,0")
self.stop2 = schedule.add_stop(lng=140.02, lat=0, name="140.02,0")
self.stop3 = schedule.add_stop(lng=140.03, lat=0, name="140.03,0")
self.stop4 = schedule.add_stop(lng=140.04, lat=0, name="140.04,0")
self.stop5 = schedule.add_stop(lng=140.05, lat=0, name="140.05,0")
self.route1 = schedule.add_route("1", "One", "Bus")
self.trip1 = self.route1.add_trip(schedule, "trip 1", trip_id="trip1")
# add different types of stop times
    self.trip1.add_stop_time(self.stop1, arrival_time="17:00:00",
                             departure_time="17:01:00") # both arrival and departure time
    self.trip1.add_stop_time(self.stop2, schedule=schedule) # non timed
    self.trip1.add_stop_time(self.stop3, stop_time="17:45:00") # only stop_time
# add headways starting before the trip
self.trip1.add_frequency("16:00:00", "18:00:00", 1800) # each 30 min
self.trip1.add_frequency("18:00:00", "20:00:00", 2700) # each 45 min
def testGetFrequencyStartTimes(self):
start_times = self.trip1.get_frequency_start_times()
self.assertEqual(
["16:00:00", "16:30:00", "17:00:00", "17:30:00",
"18:00:00", "18:45:00", "19:30:00"],
[transitfeed.format_seconds_since_midnight(secs) for secs in start_times])
    # get_headway_start_times is deprecated, but should still return the same
    # result as get_frequency_start_times
    self.assertEqual(start_times,
                     self.trip1.get_headway_start_times())
def testGetFrequencyStopTimes(self):
stoptimes_list = self.trip1.get_frequency_stop_times()
arrival_secs = []
departure_secs = []
for stoptimes in stoptimes_list:
arrival_secs.append([st.arrival_secs for st in stoptimes])
departure_secs.append([st.departure_secs for st in stoptimes])
    # get_headway_stop_times is deprecated, but should still return the same
    # result as get_frequency_stop_times.
    # StopTimes are instantiated as they're read from the DB so they can't be
    # compared directly, but checking {arrival,departure}_secs should be enough
    # to catch most errors.
    headway_stoptimes_list = self.trip1.get_headway_stop_times()
    headway_arrival_secs = []
    headway_departure_secs = []
    for stoptimes in headway_stoptimes_list:
      headway_arrival_secs.append([st.arrival_secs for st in stoptimes])
      headway_departure_secs.append([st.departure_secs for st in stoptimes])
self.assertEqual(arrival_secs, headway_arrival_secs)
self.assertEqual(departure_secs, headway_departure_secs)
self.assertEqual(([57600, None, 60300], [59400, None, 62100], [61200, None, 63900],
[63000, None, 65700], [64800, None, 67500], [67500, None, 70200],
[70200, None, 72900]),
tuple(arrival_secs))
self.assertEqual(([57660, None, 60300], [59460, None, 62100], [61260, None, 63900],
[63060, None, 65700], [64860, None, 67500], [67560, None, 70200],
[70260, None, 72900]),
tuple(departure_secs))
    # test that stoptimes are created with the same parameters as the ones from the original trip
stoptimes = self.trip1.get_stop_times()
for stoptimes_clone in stoptimes_list:
self.assertEqual(len(stoptimes_clone), len(stoptimes))
for st_clone, st in zip(stoptimes_clone, stoptimes):
for name in st.__slots__:
if name not in ('arrival_secs', 'departure_secs'):
self.assertEqual(getattr(st, name), getattr(st_clone, name))

# knowledge_graph/preprocessing/preprocessor.py (ihaeyong/drama-graph, MIT license)
from preprocessing.sentence_processor import *
from preprocessing.coreference import *
from preprocessing.to_statement import *
from utils.macro import *
class preprocessor:
def __init__(self, config, demo_json):
self.input = []
self.config = config
if config['preprocessing']['load']:
self.output = jsonload(self.config['preprocessing']['output_path'])
return
elif config['mode'] == 'subtitle':
self.subtitle_loader()
elif config['mode'] == 'demo':
self.demo_loader(demo_json)
self.coref = coreference(config, self.input)
self.sentence_processor = sentence_processor(config, self.coref.output)
self.to_stmt = to_statement(config, self.sentence_processor.output)
self.output = self.to_stmt.output
def demo_loader(self, demo_json):
self.input.append(demo_json)
return
def subtitle_loader(self):
subtitle_path = self.config['preprocessing']['substitle_file']
# for path in diriter(subtitle_path):
# self.input.append(jsonload(path))
self.input.append(jsonload(subtitle_path))
return
def save_output(self):
if self.config['preprocessing']['load']:
return
jsondump(self.output, self.config['preprocessing']['output_path'])
return

# udpoptions/listener.py (adventureloop/network-tests, 0BSD license)
import udp_options
import udp_usrreq
def callback(pcb, data=None, options=None, error=None):
print(pcb)
if __name__ == "__main__":
print("startings")
udp_usrreq.bindaddr('0.0.0.0', 5005, callback)
udp_usrreq.run_loop()

# tests/test_events.py (mucker-capital/django-slack-utils, BSD-3-Clause license)
import json
from unittest import mock
from django.http import JsonResponse
from django.test import TestCase
from django.urls import reverse
from slack_utils import signals
class EventsViewTestCase(TestCase):
def test_verification(self):
with mock.patch('slack_utils.decorators.verify_request') as verify_mock:
resp = self.client.post(reverse('slack-events-api'), "{}", content_type='application/json')
self.assertTrue(verify_mock.called)
def test_url_verification_handshake(self):
with mock.patch('slack_utils.decorators.verify_request', return_value=True):
resp = self.client.post(reverse('slack-events-api'), json.dumps({
"token": "Jhj5dZrVaK7ZwHHjRyZWjbDl",
"challenge": "3eZbrw1aBm2rZgRNFdxV2595E9CY3gmdALWMmHkvFXO7tYXAYM8P",
"type": "url_verification"
}), content_type='application/json')
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.content,
JsonResponse({"challenge": "3eZbrw1aBm2rZgRNFdxV2595E9CY3gmdALWMmHkvFXO7tYXAYM8P"}).content)
def test_app_rate_limited(self):
with mock.patch('slack_utils.decorators.verify_request', return_value=True):
resp = self.client.post(reverse('slack-events-api'), json.dumps({
"token": "Jhj5dZrVaK7ZwHHjRyZWjbDl",
"type": "app_rate_limited",
"team_id": "T123456",
"minute_rate_limited": 1518467820,
"api_app_id": "A123456"
}), content_type='application/json')
self.assertEqual(resp.status_code, 200)
def test_event(self):
def handler(sender, event_type, event_data, signal, **kwargs):
handler.signal_was_called = True
handler.event_type = event_type
handler.event_data = event_data
handler.kwargs = kwargs
signals.event_received.connect(handler)
try:
with mock.patch('slack_utils.decorators.verify_request', return_value=True):
resp = self.client.post(reverse('slack-events-api'), json.dumps({
"token": "z26uFbvR1xHJEdHE1OQiO6t8",
"team_id": "T061EG9RZ",
"api_app_id": "A0FFV41KK",
"event": {
"type": "reaction_added",
"user": "U061F1EUR",
"item": {
"type": "message",
"channel": "C061EG9SL",
"ts": "1464196127.000002"
},
"reaction": "slightly_smiling_face",
"item_user": "U0M4RL1NY",
"event_ts": "1465244570.336841"
},
"type": "event_callback",
"authed_users": [
"U061F7AUR"
],
"event_id": "Ev9UQ52YNA",
"event_time": 1234567890
}), content_type='application/json')
finally:
signals.event_received.disconnect(handler)
self.assertEqual(resp.status_code, 200)
self.assertTrue(handler.signal_was_called)
self.assertEqual(handler.event_type, 'reaction_added')
self.assertDictEqual(handler.event_data, {
"user": "U061F1EUR",
"item": {
"type": "message",
"channel": "C061EG9SL",
"ts": "1464196127.000002"
},
"reaction": "slightly_smiling_face",
"item_user": "U0M4RL1NY",
"event_ts": "1465244570.336841"
})
self.assertDictEqual(handler.kwargs, {
"token": "z26uFbvR1xHJEdHE1OQiO6t8",
"team_id": "T061EG9RZ",
"api_app_id": "A0FFV41KK",
"authed_users": [
"U061F7AUR"
],
"event_id": "Ev9UQ52YNA",
"event_time": 1234567890
        })

# goes17.py (MarcoCiaramella/goes17-to-geotiff, MIT license)
import sys
from osgeo import gdal
from osgeo.gdalconst import *
from osgeo import osr
import netCDFreader as ncr
import datetime
import numpy as np
class Goes17Reader:
def __init__(self, nc):
self.dataset = ncr.open(nc,'NETCDF4')
self._read_rad()
self._read_projection()
self._extension()
self._read_datetime()
self._read_band_info()
def _extension(self):
self.ext = [-5434894.885056, -5434894.885056, 5434894.885056, 5434894.885056]
def _read_rad(self):
"""Example
<type 'netCDF4.Variable'>
int16 Rad(y, x)
_FillValue: 1023
long_name: ABI L1b Radiances
standard_name: toa_outgoing_radiance_per_unit_wavelength
_Unsigned: true
sensor_band_bit_depth: 10
valid_range: [ 0 1022]
scale_factor: 0.812106
add_offset: -25.9366
units: W m-2 sr-1 um-1
coordinates: band_id band_wavelength t y x
grid_mapping: goes_imager_projection
ancillary_variables: DQF
resolution: y: 0.000056 rad x: 0.000056 rad
cell_methods: t: point area: point
unlimited dimensions:
current shape = (5424, 5424)
filling on"""
rad = ncr.readVar(self.dataset,'Rad')
self.rad = rad[:]
self.rad = np.flip(self.rad, 1)
self.rad = np.rot90(self.rad, 2)
self.dy = self.rad.shape[0]
self.dx = self.rad.shape[1]
self.nodata = rad._FillValue
self.rad_units = str(rad.units)
def _read_projection(self):
"""Example:
<type 'netCDF4.Variable'>
int32 goes_imager_projection()
long_name: GOES-R ABI fixed grid projection
grid_mapping_name: geostationary
perspective_point_height: 35786023.0
semi_major_axis: 6378137.0
semi_minor_axis: 6356752.31414
inverse_flattening: 298.2572221
latitude_of_projection_origin: 0.0
longitude_of_projection_origin: -137.0
sweep_angle_axis: x
unlimited dimensions:
current shape = ()
filling on, default _FillValue of -2147483647 used"""
proj = ncr.readVar(self.dataset,'goes_imager_projection')
lon_0 = proj.longitude_of_projection_origin
h = proj.perspective_point_height
majior_ax = proj.semi_major_axis
minor_ax = proj.semi_minor_axis
self.proj4 = "+proj=geos +lon_0=%s +h=%s +x_0=0 +y_0=0 +a=%s +b=%s +units=m"%(lon_0,h,majior_ax,minor_ax)
def _read_datetime(self):
""" Example
<type 'netCDF4.Variable'>
float64 t()
long_name: J2000 epoch mid-point between the start and end image scan in seconds
standard_name: time
units: seconds since 2000-01-01 12:00:00
axis: T
bounds: time_bounds
unlimited dimensions:
current shape = ()
filling on, default _FillValue of 9.96920996839e+36 used"""
t = ncr.readVar(self.dataset,'t')
tsplit = t.units.split(' ')
date = tsplit[2]
time = tsplit[3]
seconds = int(t[:])
self.datetime = str(datetime.datetime.strptime(str(date+'T'+time),'%Y-%m-%dT%H:%M:%S') + datetime.timedelta(seconds=seconds))
def _read_band_info(self):
self.band_id = str(ncr.readVar(self.dataset,'band_id')[:][0])
bandVar = ncr.readVar(self.dataset,'band_wavelength')
self.band_wavelength = str(bandVar[:][0])+str(bandVar.units)
def export_geotiff(self,output_file):
#tiff file section
format = "GTiff"
#get driver
driver = gdal.GetDriverByName( format )
dst_ds = driver.Create( output_file, self.dx, self.dy, 1, gdal.GDT_Float32 )
adfGeoTransform = [
self.ext[0],
(self.ext[2] - self.ext[0]) / float(self.dx),
0.0,
self.ext[1],
0.0,
(self.ext[3] - self.ext[1]) / float(self.dy)
]
dst_ds.SetGeoTransform( adfGeoTransform )
dst_ds.SetMetadataItem('TIFFTAG_DATETIME',self.datetime,'')
dst_ds.SetMetadataItem('RAD_UNITS',self.rad_units,'')
dst_ds.SetMetadataItem('BAND_ID',self.band_id,'')
dst_ds.SetMetadataItem('BAND_WAVELENGTH',self.band_wavelength,'')
srs = osr.SpatialReference()
srs.ImportFromProj4(self.proj4)
dst_ds.SetProjection( srs.ExportToWkt() )
dst_ds.GetRasterBand(1).SetNoDataValue(float(self.nodata))
#write data
dst_ds.GetRasterBand(1).WriteArray( self.rad )
        # Once we're done, properly close the dataset
dst_ds = None
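The `adfGeoTransform` assembled in `export_geotiff` above maps pixel indices to projection coordinates. A standalone sketch of that arithmetic, using the fixed full-disk extent and the 5424x5424 grid hard-coded above (plain arithmetic, no GDAL required):

```python
# Standalone sketch of the geotransform math from export_geotiff above.
# Uses the fixed GOES full-disk extent and the 5424x5424 ABI grid size.
ext = [-5434894.885056, -5434894.885056, 5434894.885056, 5434894.885056]
dx = dy = 5424
geotransform = [
    ext[0],                         # origin x (m)
    (ext[2] - ext[0]) / float(dx),  # pixel width (m), about 2004 m per pixel
    0.0,                            # no row rotation
    ext[1],                         # origin y (m)
    0.0,                            # no column rotation
    (ext[3] - ext[1]) / float(dy),  # pixel height (m)
]
```

The six coefficients follow GDAL's affine convention: origin, pixel size, and rotation terms (zero here, since the grid is axis-aligned).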

# src/ethereum/utils/transaction.py (norswap/execution-specs, CC0-1.0 license)
"""
Utility Functions For Transactions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
..contents:: Table of Contents
:backlinks: none
:local:
Introduction
------------
Transactions specific utility functions used in this application.
"""
from typing import Any, Dict, Tuple
from ethereum.base_types import Bytes0
from ethereum.frontier.eth_types import Transaction
from ethereum.frontier.utils.hexadecimal import hex_to_address
from ethereum.utils.hexadecimal import hex_to_bytes, hex_to_u256
def json_to_transactions(json_data: Dict[Any, Any]) -> Tuple[Transaction, ...]:
"""
Convert json data to tuple of transaction objects.
Parameters
----------
json_data :
The transactions data where the values are hexadecimals.
Returns
-------
transactions : `Tuple[Transaction, ...]`
The transaction objects obtained from the json data.
"""
transactions = []
for transaction in json_data["transactions"]:
tx = Transaction(
nonce=hex_to_u256(transaction["nonce"]),
gas_price=hex_to_u256(transaction["gasPrice"]),
gas=hex_to_u256(transaction["gas"]),
to=(
Bytes0(b"")
if transaction["to"] == ""
else hex_to_address(transaction["to"])
),
value=hex_to_u256(transaction["value"]),
data=hex_to_bytes(transaction["input"]),
v=hex_to_u256(transaction["v"]),
r=hex_to_u256(transaction["r"]),
s=hex_to_u256(transaction["s"]),
)
transactions.append(tx)
return tuple(transactions)
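The `hex_to_u256` helper used above parses "0x"-prefixed hexadecimal strings from the fixture JSON into integers. A minimal stand-in sketch of that conversion (`hex_to_int` is a hypothetical simplification; the real helper also enforces the 256-bit value range):

```python
def hex_to_int(value: str) -> int:
    # Minimal stand-in for hex_to_u256: parse a "0x"-prefixed hex string.
    # The real helper additionally enforces the 256-bit range of the result.
    if not value.startswith("0x"):
        raise ValueError("expected a 0x-prefixed hex string")
    return int(value, 16)
```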

# allenopt/optimize.py (teffland/allenopt, MIT license)
""" Optimize a base allennlp configuration with a gaussian process by providing
a hyperparam search file.
"""
from argparse import ArgumentParser
import os
import numpy as np
import json
import _jsonnet
import subprocess
from datetime import datetime
from collections import OrderedDict, defaultdict
import skopt
from skopt.space.space import Categorical, Integer, Real, Space
from skopt.utils import normalize_dimensions
from allennlp.common.params import parse_overrides
from allenopt.util import *
from allenopt.plot import *
logger = init_logger(__name__, logging.DEBUG)
def parse_args(args=[]):
    parser = ArgumentParser(description="Optimize allennlp model hyperparams with random search, a gaussian process, or a tree-based regressor")
parser.add_argument('base_config_path', help="Base config path")
parser.add_argument('search_config_path', help="Search space config path")
parser.add_argument('--include-package', help='Source package to pass to allennlp')
parser.add_argument('-s', '--serialization-dir', type=str, help="Base directory to save trials in." )
parser.add_argument('-o', '--overrides', type=str, default=None, help="If provided, we will override the base config with these")
parser.add_argument('-e', '--evaluate-on-test', type=str, default=None, help="If provided, we will evaluate the best model on this test set.")
parser.add_argument('-n', '--n-calls', type=int, default=10, help="Number of trials")
parser.add_argument('-r', '--random-seed', type=int, default=None, help="Set a random state.")
parser.add_argument('-m', '--mode', type=str, default='gp', choices=['random', 'tree', 'gp'], help="Minimizer type. 'gp' is gaussian process, 'random' is random search, 'tree' is extra trees search.")
parser.add_argument('--n-random-starts', type=int, default=1, help="If provided, seed process with n random function evals in addition to the defaul x0")
parser.add_argument('--xi', type=float, default=0.01, help="Exploration/expoitation param")
parser.add_argument('--kappa', type=float, default=1.96, help="Exploration/expoitation param")
parser.add_argument('--no-delete-worse', action='store_true', help='By default we delete heavy ".th" and ".gz" files for worse trials as we go to save disk space. This disables that.')
return parser.parse_args(args) if args else parser.parse_args()
def run(args):
# Create base serialization dir
if not os.path.exists(args.serialization_dir):
os.makedirs(args.serialization_dir)
# Read in search configuration and create the blackbox function to optimize
f, dimensions, x0, trial_paths, delete_worse_files_cb = setup(args)
n_random_starts = max(1,args.n_random_starts) if x0 is None else args.n_random_starts
callback = None if args.no_delete_worse else delete_worse_files_cb
# Run the actual optimization
if args.mode == 'gp':
results = skopt.gp_minimize(
f, dimensions,
x0=x0,
n_calls=args.n_calls,
n_random_starts=n_random_starts,
random_state=args.random_seed,
verbose=True,
acq_optimizer='sampling',
xi=args.xi,
kappa=args.kappa,
callback=callback,
)
elif args.mode == 'random':
results = skopt.dummy_minimize(
f, dimensions,
x0=x0,
n_calls=args.n_calls,
random_state=args.random_seed,
verbose=True,
callback=callback,
)
elif args.mode == 'tree':
results = skopt.forest_minimize(
f, dimensions,
x0=x0,
n_calls=args.n_calls,
n_random_starts=n_random_starts,
random_state=args.random_seed,
verbose=True,
xi=args.xi,
kappa=args.kappa,
callback=callback,
)
# Maybe evaluate the best model on the test dataset
if args.evaluate_on_test:
logger.info('EVALUATE ON TEST')
evaluate_on_test(args, results, trial_paths)
# Save a bunch of visualizations of the search process
logger.info('PLOTTING RESULTS')
plot_results(args.serialization_dir, results)
logger.info('ALL DONE')
def setup(args):
""" Create the blackbox function to optimize.
This is a complex function that wraps the true parameter setting and training
in subprocess calls to allennlp.
"""
base_config = json.loads(_jsonnet.evaluate_file(args.base_config_path))
search_config = json.loads(_jsonnet.evaluate_file(args.search_config_path))
arg_overrides = parse_overrides(args.overrides)
# Flatten configs and get shorthand mappings
flat_base_config = flatten(base_config)
flat_search_config = flatten(search_config)
    shorthands = get_shorthands(flat_search_config)
    # Extract any variable dimensions and the mapping to their keys
    search_space = extract_search_space(flat_search_config)
    lambdas = extract_lambdas(flat_search_config)
    dimensions = list(search_space.values())
    # We no longer use the base config as an initial point because the base config
    # needs to be minimal -- it cannot contain fields which aren't used by certain hp
    # configurations, since overrides cannot "delete" a field in the base config.
    x0 = None  # get_x0(flat_base_config, search_space)
    trial_num = 0
    trial_paths = dict()

    # Construct the objective function f
    def f(x):
        nonlocal trial_num
        nonlocal trial_paths
        # Map x to the config keys that need to be updated, converting numpy
        # scalars to native Python types along the way.
        newx = []
        for d, p in zip(dimensions, x):
            print(d.name, d, p, type(p))
            if 'numpy' in str(type(p)):
                p = p.item()
            newx.append(p)
        x = newx
        overrides = skopt.utils.point_asdict(search_space, x)
        overrides = fill_search_constants(overrides, flat_search_config)
        overrides = restrict_type_overrides(overrides, flat_search_config)
        # print(f'Overrides after fill and restrict: {json.dumps(overrides, indent=2)}')
        # Construct the trial serialization path
        trial_str = construct_trial_name(overrides, shorthands, trial_num)
        trial_path = os.path.join(args.serialization_dir, trial_str)
        trial_paths[trial_num] = trial_path
        # Construct the overrides string
        processed_overrides = format_overrides(overrides, lambdas, base_config, arg_overrides)
        print(f'Sampled config: {json.dumps(processed_overrides, indent=2)}')
        override_str = json.dumps(processed_overrides, indent=None)
        # Run the `allennlp train` subprocess
        cmd = f"allennlp train {args.base_config_path} -f -s {trial_path} -o '{override_str}' --file-friendly-logging --include-package {args.include_package}"
        print(f'CMD: {cmd}')
        try:
            subprocess.check_call(cmd, shell=True)
        except Exception as e:
            logger.error(e, exc_info=True)
            raise e
        trial_num += 1
        # Retrieve the best validation metric and return that value. AllenNLP prefixes
        # the metric with '+' (maximize) or '-' (minimize); skopt always minimizes,
        # so maximized metrics are negated.
        with open(os.path.join(trial_path, 'metrics.json')) as metrics_file:
            metrics = json.load(metrics_file)
        validation_metric = base_config['trainer']['validation_metric']
        negate = validation_metric.startswith('+')
        validation_metric = validation_metric.lstrip('+-')
        y = metrics[f'best_validation_{validation_metric}']
        if negate:
            y = -y
        return y
    # Construct a callback which keeps only the best weights/archive on disk
    def delete_worse_files_cb(results):
        """Remove .th and .gz files for any trials that aren't the best so far."""
        nonlocal trial_num
        nonlocal trial_paths
        logger.info(f'DELETE WORSE FILES, trial num: {trial_num}')
        best_trial_num = np.argmin(results.func_vals).item()
        logger.info(f'Func values: {results.func_vals}, best is {best_trial_num} '
                    f'with path {trial_paths[best_trial_num]}')
        for i in range(trial_num):
            if i != best_trial_num:
                logger.info(f'Deleting .th and .gz files at {trial_paths[i]}')
                th_path = os.path.join(trial_paths[i], '*.th')
                gz_path = os.path.join(trial_paths[i], '*.gz')
                cmd = f"rm -f {th_path} && rm -f {gz_path}"
                try:
                    subprocess.check_call(cmd, shell=True)
                except Exception as e:
                    logger.error(e, exc_info=True)
                    raise e

    return f, dimensions, x0, trial_paths, delete_worse_files_cb
def evaluate_on_test(args, results, trial_paths):
    """Look at all models in the serialization dir for the argmaximizer
    of the 'best_validation_metric', then evaluate that model on the test set.
    """
    # Find the best trial model
    best_trial_num = np.argmin(results.func_vals).item()
    best_trial_path = trial_paths[best_trial_num]
    model_path = os.path.join(best_trial_path, 'model.tar.gz')
    # Evaluate that model on the test dataset, dumping to best_trial_test_metrics.json
    output_path = os.path.join(args.serialization_dir, 'best_trial_test_metrics.json')
    cuda_device = json.loads(_jsonnet.evaluate_file(args.base_config_path))['trainer'].get('cuda_device', -1)
    cmd = (f"allennlp evaluate {model_path} {args.evaluate_on_test} "
           f"--output-file {output_path} --cuda-device {cuda_device} "
           f"--include-package {args.include_package}")
    logger.info(f'EVALUATE CMD: {cmd}')
    try:
        subprocess.check_call(cmd, shell=True)
    except Exception as e:
        logger.error(e, exc_info=True)
        raise e
    # Open the results and add the path of the best model so we know which trial won.
    with open(output_path) as inf:
        test_metrics = json.load(inf)
    test_metrics['best_trial_path'] = best_trial_path
    logger.info(f'Best trial path was {best_trial_path} with test metrics: '
                f'{json.dumps(test_metrics, indent=2)}')
    with open(output_path, 'w') as outf:
        json.dump(test_metrics, outf)


if __name__ == '__main__':
    args = parse_args()
    run(args)
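The sign handling at the end of `f` is the subtle part: AllenNLP's `validation_metric` carries a `+` (higher is better) or `-` (lower is better) prefix, while skopt always minimizes. A standalone sketch of that convention (the function name `to_minimizable` is mine, not from this script):

```python
def to_minimizable(validation_metric: str, value: float) -> float:
    """Convert an AllenNLP-style metric value into something a minimizer can use.

    A '+' prefix means higher is better, so the value is negated before it is
    handed to the minimizer; a '-' prefix already means lower is better.
    """
    negate = validation_metric.startswith('+')
    return -value if negate else value

print(to_minimizable('+accuracy', 0.9))  # -0.9
print(to_minimizable('-loss', 1.25))     # 1.25
```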
# api/animals.py (from Super-Serious/bot, MIT license)
from typing import Callable, Dict, Tuple, TYPE_CHECKING, Union
from requests import get
from telegram import MessageEntity

if TYPE_CHECKING:
    import telegram
    import telegram.ext


def animal(update: 'telegram.Update', _context: 'telegram.ext.CallbackContext') -> None:
    """Reply with an animal picture or fact for the invoked bot command."""
    if update.message:
        message: 'telegram.Message' = update.message
    else:
        return

    # The bot command may arrive in the caption (photo messages) or the text.
    animal_choice: str
    if update.message.caption:
        animal_choice = list(message.parse_caption_entities([MessageEntity.BOT_COMMAND]).values())[0]
    elif update.message.text:
        animal_choice = list(message.parse_entities([MessageEntity.BOT_COMMAND]).values())[0]
    else:
        # No caption or text to parse a command from.
        return
    # Strip a trailing @BotName so '/cat@MyBot' matches '/cat'.
    animal_choice = animal_choice.partition('@')[0]

    urls: Dict[str, Tuple[str, Callable]] = {
        "/shiba": (
            'http://shibe.online/api/shibes?count=1&urls=true&httpsUrls=false',
            lambda resp: message.reply_photo(resp[0])
        ),
        "/fox": (
            'https://randomfox.ca/floof/',
            lambda resp: message.reply_photo(resp['image'])
        ),
        "/cat": (
            'https://api.thecatapi.com/v1/images/search',
            lambda resp: message.reply_photo(resp[0]['url'])
        ),
        "/catfact": (
            'https://cat-fact.herokuapp.com/facts/random',
            lambda resp: message.reply_text(resp['text'])
        ),
    }
    response: Union[list, dict] = get(urls[animal_choice][0]).json()
    urls[animal_choice][1](response)
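The `urls` dict above is a dispatch table: each command maps to an endpoint plus a callable that knows how to unpack that endpoint's particular JSON shape. The same pattern in isolation, with made-up endpoint names and a canned response instead of a live HTTP call:

```python
from typing import Callable, Dict, Tuple

# Hypothetical endpoints; each handler unpacks a different response shape.
handlers: Dict[str, Tuple[str, Callable]] = {
    "/cat": ("https://api.example.com/cat", lambda resp: resp[0]["url"]),
    "/fox": ("https://api.example.com/fox", lambda resp: resp["image"]),
}

fake_response = [{"url": "https://cdn.example.com/cat.jpg"}]
url, extract = handlers["/cat"]
print(extract(fake_response))  # https://cdn.example.com/cat.jpg
```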
# appgate/bytes.py (from appgate/sdp-operator, MIT license)
import base64
import binascii
import datetime
import hashlib
import re
from typing import Optional, Any, Dict, List, Callable

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.x509 import load_pem_x509_certificate

from appgate.customloaders import CustomFieldsEntityLoader
from appgate.openapi.attribmaker import SimpleAttribMaker
from appgate.openapi.types import OpenApiDict, AttribType, AttributesDict, \
    K8S_LOADERS_FIELD_NAME, InstanceMakerConfig, Entity_T, LoaderFunc


__all__ = [
    'checksum_attrib_maker',
    'size_attrib_maker',
    'certificate_attrib_maker',
]


def datetime_utc(d: datetime.datetime) -> datetime.datetime:
    if not d.utcoffset():
        return d.astimezone()
    return d


def create_certificate_loader(loader: LoaderFunc, entity_type: type) -> Callable[..., Any]:
    def certificate_bytes(value: Any, data: str) -> Entity_T:
        """
        Creates an Entity_T with the details of a PEM certificate.
        NOTE: Entity_T must be compatible with the fields in the dict returned here.
        NOTE: We need to increase the version by one since:
          Version ::= INTEGER { v1(0), v2(1), v3(2) }
        """
        cert = load_pem_x509_certificate(data.encode())  # type: ignore
        valid_from = re.sub(r'\+\d\d:\d\d', 'Z',
                            datetime_utc(cert.not_valid_before).isoformat(timespec='milliseconds'))
        valid_to = re.sub(r'\+\d\d:\d\d', 'Z',
                          datetime_utc(cert.not_valid_after).isoformat(timespec='milliseconds'))
        public_key = cert.public_key().public_bytes(
            Encoding.PEM,
            PublicFormat.SubjectPublicKeyInfo).decode().splitlines()
        del public_key[0]
        del public_key[-1]
        cert_data = {
            'version': cert.version.value + 1,
            'serial': str(cert.serial_number),
            'issuer': ', '.join(cert.issuer.rfc4514_string().split(',')),
            'subject': ', '.join(cert.subject.rfc4514_string().split(',')),
            'validFrom': valid_from,
            'validTo': valid_to,
            'fingerprint': binascii.hexlify(cert.fingerprint(hashes.SHA256())).decode(),
            'certificate': base64.b64encode(cert.public_bytes(Encoding.PEM)).decode(),
            'subjectPublicKey': ''.join(public_key),
        }
        return loader(cert_data, None, entity_type)

    return certificate_bytes
def checksum_bytes(value: Any, data: str) -> str:
    bytes_decoded: bytes = base64.b64decode(data)
    return hashlib.sha256(bytes_decoded).hexdigest()


def size_bytes(value: Any, data: str) -> int:
    bytes_decoded: bytes = base64.b64decode(data)
    return len(bytes_decoded)
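Both helpers decode the base64 payload before measuring it, so the reported size and checksum describe the raw bytes, not the base64 text. A quick round-trip check (the payload here is illustrative):

```python
import base64
import hashlib

payload = b"hello world"
data = base64.b64encode(payload).decode()  # how a bytes field arrives in the API

# Mirror checksum_bytes / size_bytes: decode first, then hash and measure.
decoded = base64.b64decode(data)
checksum = hashlib.sha256(decoded).hexdigest()
size = len(decoded)

print(size)  # 11, the decoded length; len(data) would be 16
```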
class BytesFieldAttribMaker(SimpleAttribMaker):
    def __init__(self, name: str, tpe: type, base_tpe: type, default: Optional[AttribType],
                 factory: Optional[type], definition: OpenApiDict,
                 source_field: str,
                 loader: Callable[..., Any]) -> None:
        super().__init__(name, tpe, base_tpe, default, factory, definition)
        self.source_field = source_field
        self.loader = loader

    def values(self, attributes: Dict[str, 'SimpleAttribMaker'], required_fields: List[str],
               instance_maker_config: 'InstanceMakerConfig') -> AttributesDict:
        values = super().values(attributes, required_fields, instance_maker_config)
        values['eq'] = True
        if 'metadata' not in values:
            values['metadata'] = {}
        values['metadata'][K8S_LOADERS_FIELD_NAME] = [CustomFieldsEntityLoader(
            loader=self.loader,
            dependencies=[self.source_field],
            field=self.name,
        )]
        return values
def checksum_attrib_maker(name: str, tpe: type, base_tpe: type, default: Optional[AttribType],
                          factory: Optional[type], definition: OpenApiDict,
                          source_field: str) -> BytesFieldAttribMaker:
    return BytesFieldAttribMaker(name, tpe, base_tpe, default, factory, definition, source_field,
                                 checksum_bytes)


def size_attrib_maker(name: str, tpe: type, base_tpe: type, default: Optional[AttribType],
                      factory: Optional[type], definition: OpenApiDict,
                      source_field: str) -> BytesFieldAttribMaker:
    return BytesFieldAttribMaker(name, tpe, base_tpe, default, factory, definition, source_field,
                                 size_bytes)


def certificate_attrib_maker(name: str, tpe: type, base_tpe: type, default: Optional[AttribType],
                             factory: Optional[type], definition: OpenApiDict,
                             source_field: str,
                             loader: LoaderFunc) -> BytesFieldAttribMaker:
    return BytesFieldAttribMaker(name, tpe, base_tpe, default, factory, definition, source_field,
                                 create_certificate_loader(loader, base_tpe))
# rate_limit/tests/test_actiongroups.py (from sapcc/openstack-rate-limit-middleware, Apache-2.0 license)
# Copyright 2019 SAP SE
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import unittest
import os

from rate_limit.rate_limit import OpenStackRateLimitMiddleware
from . import fake

WORKDIR = os.path.dirname(os.path.realpath(__file__))
CONFIGPATH = WORKDIR + '/fixtures/groups.yaml'


class TestActionGroups(unittest.TestCase):
    is_setup = False

    def setUp(self):
        if self.is_setup:
            return
        self.app = OpenStackRateLimitMiddleware(
            app=fake.FakeApp(),
            config_file=CONFIGPATH
        )
        self.is_setup = True

    def test_groups(self):
        rl_groups = self.app.rate_limit_groups
        self.assertIsNotNone(
            rl_groups,
            "expected rate limit groups to be '{0}' but got '{1}'".format(
                """
                groups:
                    write:
                        - update
                        - delete
                    read:
                        - read
                        - read/list
                """,
                rl_groups
            )
        )

    def test_mapping(self):
        stimuli = [
            {'action': 'create', 'expected': 'create'},
            {'action': 'update', 'expected': 'write'},
            {'action': 'delete', 'expected': 'write'},
            {'action': 'read', 'expected': 'read'},
            {'action': 'read/list', 'expected': 'read'},
        ]
        for stim in stimuli:
            action = stim.get('action')
            expected_action = stim.get('expected')
            got_action = self.app.get_action_from_rate_limit_groups(action)
            self.assertEqual(
                got_action,
                expected_action,
                "action should be '{0}' but got '{1}'".format(expected_action, got_action)
            )


if __name__ == '__main__':
    unittest.main()
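The mapping exercised by `test_mapping` reduces to a plain lookup over the fixture's groups: an action resolves to the first group that lists it, and falls back to itself. A standalone sketch of that behavior (my reconstruction for illustration, not the middleware's actual implementation):

```python
groups = {"write": ["update", "delete"], "read": ["read", "read/list"]}

def action_to_group(action: str, groups: dict) -> str:
    """Return the rate-limit group containing the action, or the action itself."""
    for group, actions in groups.items():
        if action in actions:
            return group
    return action

print(action_to_group("update", groups))  # write
print(action_to_group("create", groups))  # create
```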
# Algorithms/Strings/Super Reduced String/solution.py (from kitarp29/ds-algo-solutions, MIT license)
# input the string
n = input().strip()
x = 0
while x + 1 < len(n):
    if n[x] == n[x + 1]:
        # Delete the adjacent pair, then step back in case a new pair formed.
        n = n[:x] + n[x + 2:]
        if x > 0:
            x -= 1
    else:
        x += 1
# output the reduced string, or "Empty String" if nothing remains
if n == "":
    print("Empty String")
else:
    print(n)
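The slicing approach rebuilds the string on every deletion, which is quadratic in the worst case. A common linear-time alternative keeps a stack and cancels adjacent equal pairs as they appear (my sketch, not part of the original solution):

```python
def super_reduced(s: str) -> str:
    stack = []
    for ch in s:
        if stack and stack[-1] == ch:
            stack.pop()  # adjacent equal pair cancels out
        else:
            stack.append(ch)
    return ''.join(stack) or "Empty String"

print(super_reduced("aaabccddd"))  # abd
print(super_reduced("aa"))         # Empty String
```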
# corehq/motech/openmrs/forms.py (from rochakchauhan/commcare-hq, BSD-3-Clause license)
from django import forms
from django.core.exceptions import ValidationError
from django.utils.translation import ugettext as _

from crispy_forms.helper import FormHelper
from crispy_forms.layout import Submit

from corehq.apps.userreports.ui.fields import JsonField
from corehq.motech.openmrs.const import (
    ADDRESS_PROPERTIES,
    IMPORT_FREQUENCY_CHOICES,
    LOG_LEVEL_CHOICES,
    NAME_PROPERTIES,
    PERSON_PROPERTIES,
)


class OpenmrsConfigForm(forms.Form):
    openmrs_provider = forms.CharField(label=_("Provider UUID"), required=False)
    case_config = JsonField(expected_type=dict)
    form_configs = JsonField(expected_type=list)

    def __init__(self, *args, **kwargs):
        super(OpenmrsConfigForm, self).__init__(*args, **kwargs)
        self.helper = FormHelper()
        self.helper.add_input(Submit('submit', _('Save Changes')))

    def clean_case_config(self):
        for key in self.cleaned_data['case_config']['person_properties']:
            if key not in PERSON_PROPERTIES:
                raise ValidationError(
                    _('person property key "%(key)s" is not valid.'),
                    code='invalid',
                    params={'key': key}
                )

        for key in self.cleaned_data['case_config']['person_preferred_name']:
            if key not in NAME_PROPERTIES:
                raise ValidationError(
                    _('person preferred name key "%(key)s" is not valid.'),
                    code='invalid',
                    params={'key': key}
                )

        for key in self.cleaned_data['case_config']['person_preferred_address']:
            if key not in ADDRESS_PROPERTIES:
                raise ValidationError(
                    _('person preferred address key "%(key)s" is not valid.'),
                    code='invalid',
                    params={'key': key}
                )

        for id_ in self.cleaned_data['case_config']['match_on_ids']:
            if id_ not in self.cleaned_data['case_config']['patient_identifiers']:
                raise ValidationError(
                    _('ID "%(id_)s" used in "match_on_ids" is missing from "patient_identifiers".'),
                    code='invalid',
                    params={'id_': id_}
                )

        return self.cleaned_data['case_config']
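The last loop in `clean_case_config` enforces a cross-field invariant: every entry in `match_on_ids` must also appear in `patient_identifiers`. Stripped of the Django form machinery, the check looks like this (a standalone sketch; the function name and sample config are mine):

```python
def missing_match_ids(case_config: dict) -> list:
    """Return IDs listed in 'match_on_ids' that are absent from
    'patient_identifiers'; an empty list means the config is valid."""
    identifiers = case_config.get('patient_identifiers', {})
    return [id_ for id_ in case_config.get('match_on_ids', [])
            if id_ not in identifiers]

config = {
    'patient_identifiers': {'uuid': {}, 'national_id': {}},
    'match_on_ids': ['uuid', 'passport'],
}
print(missing_match_ids(config))  # ['passport']
```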
_owner_id_label = _('Owner ID')
_location_type_name_label = _('Organization Level')


class OpenmrsImporterForm(forms.Form):
    server_url = forms.CharField(label=_('OpenMRS URL'), required=True,
                                 help_text=_('e.g. "http://www.example.com/openmrs"'))
    username = forms.CharField(label=_('Username'), required=True)
    password = forms.CharField(label=_('Password'), widget=forms.PasswordInput, required=False)
    notify_addresses_str = forms.CharField(label=_('Addresses to send notifications'), required=False,
                                           help_text=_('A comma-separated list of email addresses to send error '
                                                       'notifications'))
    location_id = forms.CharField(label=_('Location ID'), required=False,
                                  help_text=_('If a project space has multiple OpenMRS servers to import from, '
                                              'for which CommCare location is this OpenMRS server authoritative?'))
    import_frequency = forms.ChoiceField(label=_('Import Frequency'), choices=IMPORT_FREQUENCY_CHOICES,
                                         help_text=_('How often should cases be imported?'), required=False)
    log_level = forms.TypedChoiceField(label=_('Log Level'), required=False, choices=LOG_LEVEL_CHOICES, coerce=int)
    timezone = forms.CharField(label=_('Timezone'), required=False,
                               help_text=_("Timezone name. If not specified, the domain's timezone will be used."))
    report_uuid = forms.CharField(label=_('Report UUID'), required=True,
                                  help_text=_('The OpenMRS UUID of the report of patients to be imported'))
    report_params = JsonField(label=_('Report Parameters'), required=False, expected_type=dict)
    case_type = forms.CharField(label=_('Case Type'), required=True)
    owner_id = forms.CharField(label=_owner_id_label, required=False,
                               help_text=_('The ID of the mobile worker or location who will own new cases'))
    location_type_name = forms.CharField(label=_location_type_name_label, required=False,
                                         help_text=_('The Organization Level whose mobile worker will own new '
                                                     'cases'))
    external_id_column = forms.CharField(label=_('External ID Column'), required=True,
                                         help_text=_("The column that contains the OpenMRS UUID of the patient"))
    name_columns = forms.CharField(label=_('Name Columns'), required=True,
                                   help_text=_('Space-separated column(s) to be concatenated to create the case '
                                               'name (e.g. "givenName familyName")'))
    column_map = JsonField(label=_('Map columns to properties'), required=True, expected_type=list,
                           help_text=_('e.g. [{"column": "givenName", "property": "first_name"}, ...]'))
# tflite2onnx/op/conv.py (from briangrifiin/tflite2onnx, Apache-2.0 license)
import logging
import tflite

from tflite2onnx.layout import Layout
from tflite2onnx.op.activation import handleFusedActivation
from tflite2onnx.op.common import Operator
from tflite2onnx.op.padding import computePaddingSize

logger = logging.getLogger('tflite2onnx')


class Conv(Operator):
    TypeMapping = {
        tflite.BuiltinOperator.CONV_2D: 'Conv',
        tflite.BuiltinOperator.DEPTHWISE_CONV_2D: 'Conv',
    }

    def __init__(self, TFactory, index):
        super().__init__(TFactory, index)

        self.attrs['kernel_shape'] = []
        self.attrs['strides'] = []
        # ONNX: this attribute cannot be used simultaneously with the `auto_pad` attribute.
        # Re-initialized during self.parse(), as it needs the shape of the input.
        # We would prefer `auto_pad`, but ONNXRuntime doesn't support
        # `dilation` + `auto_pad`, so we use `pads` to work around it.
        self.attrs['pads'] = [0, 0, 0, 0]
        # XXX Not enabled as ONNXRuntime has a limitation inferring pads for non-1 dilation
        # self.attrs['auto_pad'] = 'SAME_UPPER'  # See ComputePaddingHeightWidth() of TFLite
        self.attrs['dilations'] = []
        self.attrs['group'] = -1

        self.setInited()

    @property
    def type(self):
        return 'Conv'

    @property
    def isDepthwise(self):
        op = self.tflite
        opcode = self.model.OperatorCodes(op.OpcodeIndex()).BuiltinCode()
        return (opcode is tflite.BuiltinOperator.DEPTHWISE_CONV_2D)

    def parse(self):
        logger.debug("Parsing %s...", self.type)
        op = self.tflite
        opcode = self.model.OperatorCodes(op.OpcodeIndex()).BuiltinCode()
        assert(opcode in self.TypeMapping)

        assert(op.InputsLength() == 3), "TFLite Conv always has bias"
        assert(op.OutputsLength() == 1)

        # input
        ilayout = Layout('NHWC', 'NCHW')
        it = self.parseInput(0, ilayout)

        # weight
        wlayout = Layout('CHWM', 'MCHW') if self.isDepthwise else Layout('OHWI', 'OIHW')
        wt = self.parseInput(1, wlayout)

        # bias
        self.parseInput(2, is_bias=True)

        # output
        olayout = Layout('NHWC', 'NCHW')
        ot = self.parseOutput(0, olayout)

        # options
        op_opt = op.BuiltinOptions()
        option = tflite.DepthwiseConv2DOptions() if self.isDepthwise else tflite.Conv2DOptions()
        option.Init(op_opt.Bytes, op_opt.Pos)

        self.attrs['dilations'] = [option.DilationHFactor(), option.DilationWFactor()]
        self.attrs['group'] = wt.shape[3] if self.isDepthwise else 1
        self.attrs['kernel_shape'] = wt.shape[1:3]
        self.attrs['strides'] = [option.StrideH(), option.StrideW()]
        # XXX Not enabled as ONNXRuntime has a limitation inferring pads for non-1 dilation
        # self.attrs['auto_pad'] = PaddingMapping[option.Padding()]
        if self.isDepthwise:
            assert(option.DepthMultiplier() == 1)
        self.attrs['pads'] = computePaddingSize(option.Padding(), it.shape[1:3],
                                                self.attrs['kernel_shape'],
                                                self.attrs['strides'], self.attrs['dilations'])

        handleFusedActivation(self, option, ot)

        self.setParsed()

    def propagatableTensors(self):
        return list()

    def transform(self):
        pass
class TransposeConv(Operator):
    TypeMapping = {
        tflite.BuiltinOperator.TRANSPOSE_CONV: 'ConvTranspose',
    }

    # FIXME: cases that are untested yet (we don't fully understand the semantic gap):
    # 1. Special output shape for VALID padding
    # 2. Different input/output shape for SAME padding

    def __init__(self, TFactory, index):
        super().__init__(TFactory, index)

        self.attrs['dilations'] = [1, 1]  # TFLite TransposeConv doesn't have dilation
        self.attrs['group'] = 1  # TFLite TransposeConv doesn't have group
        self.attrs['kernel_shape'] = []
        # self.attrs['output_padding'] = []
        self.attrs['output_shape'] = []
        # pads are overwritten by output_shape
        # self.attrs['auto_pad'] = 'NOTSET'
        # self.attrs['pads'] = []
        self.attrs['strides'] = []

        self.setInited()

    @property
    def type(self):
        return 'ConvTranspose'

    def parse(self):
        logger.debug("Parsing %s...", self.type)
        op = self.tflite
        opcode = self.model.OperatorCodes(op.OpcodeIndex()).BuiltinCode()
        assert(opcode in self.TypeMapping)

        assert(op.InputsLength() == 3)
        assert(op.OutputsLength() == 1)

        # output shape
        osi = op.Inputs(0)
        oshape = self.TFactory.getData(self.model, self.graph, osi, 'int32')

        # X
        ilayout = Layout('NHWC', 'NCHW')
        self.parseInput(2, ilayout)

        # weight
        wlayout = Layout('OHWI', 'IOHW')
        wt = self.parseInput(1, wlayout)

        # FIXME: we don't have a model containing bias.

        # output
        olayout = Layout('NHWC', 'NCHW')
        ot = self.parseOutput(0, olayout)
        assert((ot.shape == oshape).all())

        # options
        op_opt = op.BuiltinOptions()
        option = tflite.TransposeConvOptions()
        option.Init(op_opt.Bytes, op_opt.Pos)

        self.attrs['kernel_shape'] = wt.shape[1:3]
        self.attrs['strides'] = [option.StrideH(), option.StrideW()]
        oslayout = Layout('NHWC', 'NCHW')
        self.attrs['output_shape'] = oslayout.transform(oshape)

        self.setParsed()

    def propagatableTensors(self):
        return list()

    def transform(self):
        pass
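Both classes lean on `computePaddingSize` to translate TFLite's SAME padding into explicit ONNX `pads`. The usual formula pads to `ceil(input / stride)` outputs, with the extra pixel placed at the bottom/right. A sketch of that computation (my reimplementation for illustration; the project's real helper may differ in edge cases):

```python
def same_pads(input_hw, kernel_hw, strides, dilations):
    """Compute ONNX-style pads [top, left, bottom, right] mimicking TFLite SAME."""
    begin, end = [], []
    for i, k, s, d in zip(input_hw, kernel_hw, strides, dilations):
        effective_k = (k - 1) * d + 1          # dilated kernel extent
        out = (i + s - 1) // s                 # ceil(i / s)
        total = max(0, (out - 1) * s + effective_k - i)
        begin.append(total // 2)               # smaller half first
        end.append(total - total // 2)         # extra pixel goes at the end
    return begin + end

print(same_pads([224, 224], [3, 3], [2, 2], [1, 1]))  # [0, 0, 1, 1]
```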
# bazel/linux/bundles.bzl (from pete-enf/enkit, BSD-3-Clause license)
load("//bazel/linux:providers.bzl", "KernelBundleInfo", "KernelImageInfo", "RuntimeBundleInfo", "RuntimeInfo")
load("//bazel/linux:utils.bzl", "expand_deps", "get_compatible")
load("//bazel/utils:messaging.bzl", "location", "package")
load("//bazel/utils:files.bzl", "files_to_dir")
load("@bazel_skylib//lib:shell.bzl", "shell")

def _kunit_bundle(ctx):
    ki = ctx.attr.image[KernelImageInfo]
    mods = get_compatible(ctx, ki.arch, ki.package, ctx.attr.module)
    alldeps = expand_deps(ctx, mods, ctx.attr.depth)

    commands = [
        # modprobe does not work correctly without /sys
        "mount -t sysfs sysfs /sys",
    ]
    inputs = []
    for kmod in alldeps:
        commands += ["", "# module " + package(kmod.label)]
        if kmod.setup:
            commands += kmod.setup
        for mod in kmod.files:
            if mod.extension != "ko":
                continue
            commands.append("load " + mod.short_path)
            inputs.append(mod)

    init = ctx.actions.declare_file(ctx.attr.name + "-kunit.sh")
    ctx.actions.expand_template(
        template = ctx.file._template_kunit,
        output = init,
        substitutions = {
            "{target}": package(ctx.label),
            "{message}": "KUNIT TESTS",
            "{commands}": "\n".join(commands),
        },
        is_executable = True,
    )

    check = ctx.actions.declare_file(ctx.attr.name + "-check.sh")
    ctx.actions.expand_template(
        template = ctx.file._template_check,
        output = check,
        substitutions = {
            "{target}": package(ctx.label),
            "{parser}": ctx.executable._parser.short_path,
        },
        is_executable = True,
    )

    outside_runfiles = ctx.runfiles(files = ctx.attr._parser.files.to_list())
    outside_runfiles = outside_runfiles.merge(ctx.attr._parser.default_runfiles)
    inside_runfiles = ctx.runfiles(inputs)

    return [
        DefaultInfo(files = depset([init, check]), runfiles = inside_runfiles.merge(outside_runfiles)),
        RuntimeBundleInfo(
            run = RuntimeInfo(binary = init, runfiles = inside_runfiles),
            check = RuntimeInfo(binary = check, runfiles = outside_runfiles),
        ),
    ]

kunit_bundle = rule(
    doc = """\
Generates a directory containing the kernel modules, all their dependencies,
and an init script to run them as a kunit test.""",
    implementation = _kunit_bundle,
    attrs = {
        "module": attr.label(
            mandatory = True,
            providers = [KernelBundleInfo],
            doc = "The label of the KUnit linux kernel module to be used for testing. It must define a kunit_test_suite so that when loaded, KUnit will start executing its tests.",
        ),
        "image": attr.label(
            mandatory = True,
            providers = [KernelImageInfo],
            doc = "The label of a kernel image this test will run against. Important to select the correct architecture and package module.",
        ),
        "depth": attr.int(
            default = 5,
            doc = "Maximum recursive depth when expanding a list of kernel module dependencies.",
        ),
        "_template_kunit": attr.label(
            allow_single_file = True,
            default = Label("//bazel/linux:templates/kunit.template.sh"),
            doc = "The template to generate the bash script used to run the tests.",
        ),
        "_template_check": attr.label(
            allow_single_file = True,
            default = Label("//bazel/linux:templates/check_kunit.template.sh"),
            doc = "The template to generate the bash script used to check the test output.",
        ),
        "_parser": attr.label(
            default = Label("//bazel/linux/kunit:kunit_zip"),
            doc = "KUnit TAP output parser.",
            executable = True,
            cfg = "host",
        ),
    },
)
# python/pyqt/LearnPyQt/stack_layout.py (from zeroam/TIL, MIT license)
import sys
from PyQt5.QtWidgets import (
    QApplication,
    QMainWindow,
    QWidget,
    QVBoxLayout,
    QStackedLayout,
    QPushButton,
)
from PyQt5.QtGui import (
    QPalette,
    QColor,
)


class Color(QWidget):
    def __init__(self, color, *args, **kwargs):
        super(Color, self).__init__(*args, **kwargs)
        self.setAutoFillBackground(True)
        palette = self.palette()
        palette.setColor(QPalette.Window, QColor(color))
        self.setPalette(palette)


class MainWindow(QMainWindow):
    def __init__(self, *args, **kwargs):
        super(MainWindow, self).__init__(*args, **kwargs)
        self.color_index = 3
        self.setWindowTitle("Jayone's Awesome App")

        layout = QVBoxLayout()
        layout2 = QStackedLayout()
        layout2.addWidget(Color('red'))
        layout2.addWidget(Color('green'))
        layout2.addWidget(Color('blue'))
        layout2.addWidget(Color('yellow'))
        layout2.setCurrentIndex(self.color_index)
        layout.addLayout(layout2)
        self.stack_layout = layout2

        push_button = QPushButton('change')
        push_button.clicked.connect(self.button_click)
        layout.addWidget(push_button)

        widget = QWidget()
        widget.setLayout(layout)
        self.setCentralWidget(widget)

    def button_click(self):
        self.color_index += 1
        if self.color_index > 3:
            self.color_index = 0
        self.stack_layout.setCurrentIndex(self.color_index)


app = QApplication(sys.argv)
window = MainWindow()
window.show()
app.exec_()
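`button_click` cycles the visible stacked-layout page with an explicit reset; the same wrap-around is usually written with the modulo operator. A minimal sketch (independent of Qt, so it can run headless; the function name is mine):

```python
def next_index(current: int, count: int) -> int:
    """Advance a stacked-layout index, wrapping back to 0 past the last page."""
    return (current + 1) % count

i = 3
for _ in range(3):
    i = next_index(i, 4)
    print(i)  # prints 0, then 1, then 2
```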
f3ddd665e60d0bf2e360d9f1687c0e8c2fbb5966 | 2,376 | py | Python | src/estimagic/shared/check_option_dicts.py | PaulBehler/estimagic | c14f743986262d508e55738c90737cb504fe987b | [
"MIT"
] | 7 | 2019-05-11T07:19:46.000Z | 2019-05-31T07:03:13.000Z | src/estimagic/shared/check_option_dicts.py | PaulBehler/estimagic | c14f743986262d508e55738c90737cb504fe987b | [
"MIT"
] | 14 | 2019-05-04T14:15:52.000Z | 2019-06-10T11:45:27.000Z | src/estimagic/shared/check_option_dicts.py | PaulBehler/estimagic | c14f743986262d508e55738c90737cb504fe987b | [
"MIT"
] | 1 | 2019-05-21T08:44:37.000Z | 2019-05-21T08:44:37.000Z | """Check option dictionaries for minimize, maximize and first_derivative."""
def check_optimization_options(options, usage, algorithm_mandatory=True):
"""Check optimize_options or maximize_options for usage in estimation functions."""
options = {} if options is None else options
if algorithm_mandatory:
if not isinstance(options, dict) or "algorithm" not in options:
raise ValueError(
"optimize_options or maximize_options must be a dict containing at "
"least the entry 'algorithm'"
)
else:
if not isinstance(options, dict):
raise ValueError(
"optimize_options or maximize_options must be a dict or None."
)
criterion_options = {
"criterion",
"criterion_kwargs",
"derivative",
"derivative_kwargs",
"criterion_and_derivative",
"criterion_and_derivative_kwargs",
}
invalid_criterion = criterion_options.intersection(options)
if invalid_criterion:
msg = (
"Entries related to the criterion function, its derivatives or keyword "
"arguments of those functions are not valid entries of optimize_options "
f"or maximize_options for {usage}. Remove: {invalid_criterion}"
)
raise ValueError(msg)
general_options = {"logging", "log_options", "constraints"}
invalid_general = general_options.intersection(options)
if invalid_general:
msg = (
"The following are not valid entries of optimize_options because they are "
"not only relevant for minimization but also for inference: "
f"{invalid_general}"
)
raise ValueError(msg)
def check_numdiff_options(numdiff_options, usage):
"""Check numdiff_options for usage in estimation and optimization functions."""
numdiff_options = {} if numdiff_options is None else numdiff_options
internal_options = {
"func",
"func_kwargs",
"lower_bounds",
"upper_bounds",
"f0",
"key",
}
invalid = internal_options.intersection(numdiff_options)
if invalid:
msg = (
"The following options are set internally and are not allowed in "
f"numdiff_options for {usage}: {invalid}"
)
raise ValueError(msg)
| 32.108108 | 87 | 0.63931 | 252 | 2,376 | 5.845238 | 0.321429 | 0.076035 | 0.046164 | 0.050917 | 0.279701 | 0.126273 | 0.126273 | 0.078751 | 0.078751 | 0.078751 | 0 | 0.00059 | 0.286195 | 2,376 | 73 | 88 | 32.547945 | 0.867925 | 0.093434 | 0 | 0.145455 | 0 | 0 | 0.371375 | 0.025725 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036364 | false | 0 | 0 | 0 | 0.036364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
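Both checkers in check_option_dicts.py rely on the same idiom: collect forbidden keys in a set and test membership with `set.intersection`, which accepts the options dict directly (it iterates over its keys). A minimal sketch with an illustrative subset of the forbidden keys:

```python
# Illustrative subset of the forbidden keys from check_optimization_options.
forbidden = {"criterion", "criterion_kwargs", "derivative"}
options = {"algorithm": "scipy_lbfgsb", "derivative": lambda x: x}

# set.intersection iterates the dict's keys, so no explicit .keys() is needed.
invalid = forbidden.intersection(options)
if invalid:
    message = f"Remove: {sorted(invalid)}"
print(sorted(invalid))  # -> ['derivative']
```

Using sets keeps the check O(len(forbidden) + len(options)) and makes the error message list exactly the offending entries.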
f3e1253dea65e55b992c34afad185934a10cd9ae | 760 | py | Python | model_testing.py | rafaelfrieri1/Optimization-Project | 20db5200cd361d358e213310c6eb2997c893ff27 | [
"MIT"
] | null | null | null | model_testing.py | rafaelfrieri1/Optimization-Project | 20db5200cd361d358e213310c6eb2997c893ff27 | [
"MIT"
] | null | null | null | model_testing.py | rafaelfrieri1/Optimization-Project | 20db5200cd361d358e213310c6eb2997c893ff27 | [
"MIT"
] | null | null | null | from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import recall_score, f1_score, precision_score
# X and y should be defined from the data before this script is run.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)
mlp = MLPClassifier(max_iter=1000, hidden_layer_sizes=(100, 50), alpha=0.0001, solver='adam', random_state=3)
mlp.fit(X_train, y_train)
y_res = mlp.predict(X_test)
print("accuracy:", mlp.score(X_test, y_test))
print("recall: ", recall_score(y_test, y_res))
print("f1: ", f1_score(y_test, y_res))
print("precision: ", precision_score(y_test, y_res))
# This part of fitting and predicting will be used later with our database to make predictions for the optimization model.
| 47.5 | 122 | 0.785526 | 131 | 760 | 4.305344 | 0.473282 | 0.044326 | 0.053191 | 0.053191 | 0.092199 | 0.067376 | 0 | 0 | 0 | 0 | 0 | 0.028024 | 0.107895 | 760 | 15 | 123 | 50.666667 | 0.803835 | 0.211842 | 0 | 0 | 0 | 0 | 0.060302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.272727 | 0 | 0.272727 | 0.363636 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
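For intuition on what sklearn's `precision_score`, `recall_score`, and `f1_score` compute, here is a hedged, from-scratch sketch for the binary case (`binary_metrics` and the sample labels are hypothetical, not part of the original script):

```python
def binary_metrics(y_true, y_pred):
    """Precision, recall and F1 for binary labels, computed from scratch."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# One false negative out of three positives: precision 1.0, recall 2/3, F1 0.8.
p, r, f = binary_metrics([1, 0, 1, 1], [1, 0, 0, 1])
```

Note that the sklearn functions take the true labels first (`metric(y_true, y_pred)`), so the prediction vector must be the second argument.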
f3e34a856aa1b320193e636ce72fc1a183de8078 | 7,077 | py | Python | e/mail-relay/fabfile.py | zhouli121018/nodejsgm | 0ccbc8acf61badc812f684dd39253d55c99f08eb | [
"MIT"
] | null | null | null | e/mail-relay/fabfile.py | zhouli121018/nodejsgm | 0ccbc8acf61badc812f684dd39253d55c99f08eb | [
"MIT"
] | 18 | 2020-06-05T18:17:40.000Z | 2022-03-11T23:25:21.000Z | e/mail-relay/fabfile.py | zhouli121018/nodejsgm | 0ccbc8acf61badc812f684dd39253d55c99f08eb | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import StringIO
from fabric.api import env, put, run
from fabric.contrib.files import append, contains, exists
supervisord_config_template = '''\
[unix_http_server]
file={work_dir}/run/supervisor.sock
[supervisord]
logfile={work_dir}/log/supervisord.log
logfile_maxbytes=10MB
logfile_backups=10
loglevel=info
pidfile={work_dir}/run/supervisord.pid
nodaemon=false
minfds=65536
minprocs=200
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
[supervisorctl]
serverurl=unix://{work_dir}/run/supervisor.sock
'''
program_template = '''\
[program:{package}_{module}]
command = {python_prefix}/bin/python -m {package}.{module}
directory = {work_dir}
autostart = true
autorestart = true
redirect_stderr = true
stdout_logfile = {work_dir}/log/{module}.log
stdout_logfile_maxbytes = 100MB
stdout_logfile_backups = 10
environment = PYTHONPATH="{work_dir}/ttt", PACKAGE_CONFIG="{work_dir}/etc/{package}.config"
'''
program_template_1 = '''\
[program:{package}_{module}]
command = {python_prefix}/bin/python {path}{args}
directory = {work_dir}
autostart = true
autorestart = true
redirect_stderr = true
stdout_logfile = {work_dir}/log/{module}.log
stdout_logfile_maxbytes = 100MB
stdout_logfile_backups = 10
'''
task_les_config_template = '''\
[redis]
host = 127.0.0.1
port = 6379
db = 0
[postgres]
host = 127.0.0.1
port = 5432
database = mm-log
user = mm-log
password = ******
[mysql]
host = 127.0.0.1
port = 3306
db = mm-ms
user = edm_web
passwd = ******
[task]
server_address = 0.0.0.0
server_port = {task_port}
message_dir = /usr/local/mm-bs/data/mails-default
'''
program_list = [
'edm_app/src/wc_dispatcher',
'edm_app/web_api/server',
'mm-bs/bin/channel_chk.py',
'mm-bs/src/assign_address',
'mm-bs/src/bs_esmtpd',
'mm-bs/src/handle_mails',
'mm-bs/src/smtpsender',
'mm-bs/src/split_mails',
'mm-bs/src/sync_redis',
'mm-bs/src/testallocator',
'mm-bs/web_api/server',
'mm-log/bin/logmonitor',
'zhi_meng/manage runserver 0.0.0.0:8888'
]
# ------------------------------------------------
def install_package():
if not exists('/etc/yum.repos.d/epel.repo'):
run('rpm -ivh http://dl.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm')
run('yum groupinstall -y Development')
run('yum install -y bzip2-devel expat-devel gdbm-devel libffi-devel openssl-devel readline-devel sqlite-devel'
' zlib-devel postgresql-devel zeromq3-devel')
run('yum install -y denyhosts dnsmasq redis bash-completion bind-utils nc ntpdate openssh-clients rlwrap strace'
' telnet tmux emacs-nox vim-minimal')
def set_datetime():
try:
run('cp -f /usr/share/zoneinfo/Asia/Shanghai /etc/localtime')
run('ntpdate pool.ntp.org')
run('hwclock -w')
except:
pass
def start_service():
run('chkconfig denyhosts on')
run('chkconfig dnsmasq on')
run('chkconfig redis on')
run('service denyhosts start')
run('service dnsmasq start')
run('service redis start')
def create_pyenv(pyenv):
if not exists('/usr/local/pyenv'):
run('git clone https://github.com/yyuu/pyenv.git /usr/local/pyenv')
if not exists('/usr/local/pyenv/plugins/pyenv-virtualenv'):
run('git clone https://github.com/yyuu/pyenv-virtualenv.git /usr/local/pyenv/plugins/pyenv-virtualenv')
if not exists('/usr/local/pyenv/versions/2.7.11/bin/python'):
run('PYENV_ROOT=/usr/local/pyenv /usr/local/pyenv/bin/pyenv install -f 2.7.11')
if not exists('/usr/local/pyenv/versions/{}/bin/python'.format(pyenv)):
run('PYENV_ROOT=/usr/local/pyenv /usr/local/pyenv/bin/pyenv virtualenv -f 2.7.11 {}'.format(pyenv))
run('PYENV_ROOT=/usr/local/pyenv /usr/local/pyenv/bin/pyenv rehash')
# ------------------------------------------------
def make_dir(work_dir):
run('mkdir -p {}/etc'.format(work_dir))
run('mkdir -p {}/log'.format(work_dir))
run('mkdir -p {}/message'.format(work_dir))
run('mkdir -p {}/run'.format(work_dir))
def copy_source(base_dir, source_path):
put(source_path, '{}/lo.tar.gz'.format(base_dir))
run('tar -C {0} -xf {0}/lo.tar.gz'.format(base_dir))
def install_requirement(pyenv, file):
run('/usr/local/pyenv/versions/{}/bin/pip install -r {}'.format(pyenv, file))
def supervisord_config(base_dir, work_dir, pyenv):
l = [supervisord_config_template.format(work_dir=work_dir)]
for module in ['dispatch', 'server']:
l.append(program_template.format(
package='task_les',
module=module,
python_prefix='/usr/local/pyenv/versions/{}'.format(pyenv),
work_dir=work_dir
))
for p in program_list:
i = p.find(' ')
if i >= 0:
path = p[:i]
args = p[i:]
else:
path = p
args = ''
sp = path.split('/')
run('mkdir -p {}/{}/log'.format(base_dir, sp[0]))
l.append(program_template_1.format(
package=sp[0],
module=sp[-1],
python_prefix='/usr/local/pyenv/versions/{}'.format(pyenv),
work_dir='{}/{}'.format(base_dir, sp[0]),
path='{}/{}.pyc'.format(base_dir, path),
args=args))
put(StringIO.StringIO('\n'.join(l)), '{}/etc/supervisord.conf'.format(work_dir))
c = '/usr/local/pyenv/versions/{}/bin/supervisord -c {}/etc/supervisord.conf'.format(pyenv, work_dir)
if not contains('/etc/rc.local', c):
append('/etc/rc.local', c)
def task_les_config(work_dir, task_port):
s = task_les_config_template.format(work_dir=work_dir, task_port=task_port)
put(StringIO.StringIO(s), '{}/etc/task_les.config'.format(work_dir))
# ------------------------------------------------
def supervisord_start(work_dir, pyenv):
if not exists('{}/run/supervisor.sock'.format(work_dir)):
run('/usr/local/pyenv/versions/{}/bin/supervisord -c {}/etc/supervisord.conf'.format(pyenv, work_dir))
else:
run('/usr/local/pyenv/versions/{}/bin/supervisorctl -c {}/etc/supervisord.conf reload'.format(pyenv, work_dir))
# ------------------------------------------------
def deploy_task(host, port, user, password, base_dir, source_path):
env.abort_on_prompts = True
env.host_string = '{}@{}:{}'.format(user, host, port)
env.password = password
pyenv = 'relay'
work_dir = '{}/mail-relay'.format(base_dir)
install_package()
set_datetime()
start_service()
create_pyenv(pyenv)
install_requirement(pyenv, '{}/requirements_all.txt'.format(work_dir))
deploy_task('127.0.0.1', 22, 'root', '***', '/usr/local', '/opt/lo/lo.tar.gz')
| 32.168182 | 119 | 0.614102 | 928 | 7,077 | 4.53556 | 0.269397 | 0.054882 | 0.055595 | 0.039914 | 0.340461 | 0.310287 | 0.24804 | 0.202899 | 0.163934 | 0.163934 | 0 | 0.018489 | 0.205172 | 7,077 | 219 | 120 | 32.315068 | 0.729778 | 0.03363 | 0 | 0.156977 | 0 | 0.034884 | 0.565574 | 0.233167 | 0 | 0 | 0 | 0 | 0 | 1 | 0.063953 | false | 0.02907 | 0.017442 | 0 | 0.081395 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
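The fabfile's `list_projects_in_solution` extracts project paths from a Visual Studio `.sln` with a single regex. A self-contained sketch of that pattern on a made-up solution line (the GUIDs are illustrative):

```python
import re

# Same pattern as list_projects_in_solution in the fabfile above.
project_line = re.compile(r'Project\("[^"]+"\) = "[^"]+", "([^"]+)"')

# Illustrative .sln line; the second quoted field is the project file path.
sample = ('Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = '
          '"richtext", "wx_richtext.vcxproj", "{7FB0902D-8579-5DCE-B883-DAF66A885005}"')
m = project_line.match(sample)
print(m.group(1))  # -> wx_richtext.vcxproj
```

Group 1 captures only the relative project path, which the fabfile then joins with the solution's directory.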
f3e74f0279ac7b565e15ae42cae38266be347e34 | 11,178 | py | Python | modules/dipmath.py | mavost/boreholetools | 0b6caae86cd4e30794e5ec469383914d101d5ee3 | [
"MIT"
] | 10 | 2020-03-04T00:55:19.000Z | 2021-11-16T17:23:44.000Z | modules/dipmath.py | mavost/boreholetools | 0b6caae86cd4e30794e5ec469383914d101d5ee3 | [
"MIT"
] | null | null | null | modules/dipmath.py | mavost/boreholetools | 0b6caae86cd4e30794e5ec469383914d101d5ee3 | [
"MIT"
] | 1 | 2020-12-07T19:49:27.000Z | 2020-12-07T19:49:27.000Z | #!/usr/bin/python #Linux shebang plus chmod to make executable
# ------------------------------------------------------------
# FILENAME: dipmath.py
# VERSION: 1.0 - Python 3.6
# PURPOSE:
# AUTHOR: MVS
# LAST CHANGE: 04/09/2018
# ------------------------------------------------------------
# tools for manipulating dipmeter interpretation files
import math
import sys
from modules import boreholemath
from modules import fileio
class DipPoint(object):
"""provides dip and dip azimuth properties to a point including a corresponding
tangential vector and its manipulation
"""
@staticmethod
def get_matrix_x(angle):
"""static function returns a matrix rotation by angle along X axis
:param angle: rotation angle (radian)
:return: 3x3 list of flt
"""
return [[1.0, 0.0, 0.0],
[0.0, math.cos(angle), -math.sin(angle)],
[0.0, math.sin(angle), math.cos(angle)]]
@staticmethod
def get_matrix_y(angle):
"""static function returns a matrix rotation by angle along Y axis
:param angle: rotation angle (radian)
:return: 3x3 list of flt
"""
return [[math.cos(angle), 0.0, math.sin(angle)],
[0.0, 1.0, 0.0],
[-math.sin(angle), 0.0, math.cos(angle)]]
@staticmethod
def get_matrix_z(angle):
"""static function returns a matrix rotation by angle along Z axis
:param angle: rotation angle (radian)
:return: 3x3 list of flt
"""
return [[math.cos(angle), -math.sin(angle), 0.0],
[math.sin(angle), math.cos(angle), 0.0],
[0.0, 0.0, 1.0]]
def __init__(self, dip=0.0, dazim=0.0):
"""
constructor for dip / dip azimuth instance and corresponding unit vector pointing in direction
of maximum falling dip
:param dip: dip of geological bed from horizontal in falling direction (grad)
:param dazim: dip azimuth of geological bed measured clockwise from grid north (grad)
"""
# debug output
self.verbose = True
# inclination angle between reference frame XY-plane and max falling dip 0 <= dip < pi
self.dip = (math.pi + math.radians(dip)) % math.pi
# azimuth angle projected to XY-plane, referenced to X-axis 0 <= dazim < 2*pi
self.dazim = (math.pi * 2.0 + math.radians(dazim)) % (math.pi * 2.0)
# define a unit vector pointing in direction of maximum falling dip
# defaults to 1,0,0
self.dipN = math.cos(self.dip) * math.cos(self.dazim)
self.dipE = math.cos(self.dip) * math.sin(self.dazim)
self.dipV = math.sin(self.dip)
def _rotate(self, matrix_func, angle):
"""
helper function to actually perform a functional-based rotation of the tangential vector by given angle.
Note, that this function does not change the instance.
:param matrix_func: functional object - either X,Y,Z rotation matrix
:param angle: angle of rotation (grad)
:return: rotated three-component vector as list of flt
"""
angrad = math.radians(angle)
matrix = matrix_func(angrad)
in_vec = [self.dipN, self.dipE, self.dipV]
res_vec = [0.0, 0.0, 0.0]
for i in range(3):
for j in range(3):
res_vec[i] += matrix[i][j] * in_vec[j]
return res_vec
def _update_angles(self, newdips):
"""back-calculate dip and dip azimuth from a tangential vector changed by rotation
:param newdips: list(3) of flt corresponding to the three-component tangential vector
"""
try:
if len(newdips) != 3:
raise ValueError('Exception: Dip vector has wrong number of components: ', len(newdips))
# calculate length of vector
length = math.sqrt(newdips[0]*newdips[0]+newdips[1]*newdips[1]+newdips[2]*newdips[2])
if length > 0:
# if dip vector is too long - rescale to unit vector
self.dipN = newdips[0] / length
self.dipE = newdips[1] / length
self.dipV = newdips[2] / length
else:
raise ValueError('Exception: Dip vector has zero length')
horiz = math.sqrt(self.dipN * self.dipN + self.dipE * self.dipE)
self.dip = math.acos(horiz)
self.dazim = (math.pi * 2.0 + math.atan2(self.dipE, self.dipN)) % (math.pi * 2.0)
if self.verbose:
# print('New tangential vector:')
print(' X: {0:7.2f}, Y: {1:7.2f}, Z: {2:7.2f}'.format(*newdips))
print(' Dip: {0:8.3f}, Azimuth: {1:8.3f}'.format(math.degrees(self.dip),
math.degrees(self.dazim)))
except ValueError as err:
print(err.args)
sys.exit(1)
def __str__(self):
"""overloaded string operator"""
return 'Dip: {0:8.3f}, Azimuth: {1:8.3f}'.format(math.degrees(self.dip), math.degrees(self.dazim))
def rotate_x(self, angle):
"""
handle to perform a rotation of bed dip instance by X axis using angle and updating the dip / dip azimuth
:param angle: angle of rotation (grad)
"""
vec = self._rotate(DipPoint.get_matrix_x, angle)
self._update_angles(vec)
def rotate_y(self, angle):
"""
handle to perform a rotation of bed dip instance by Y axis using angle and updating the dip / dip azimuth
:param angle: angle of rotation (grad)
"""
vec = self._rotate(DipPoint.get_matrix_y, angle)
self._update_angles(vec)
def rotate_z(self, angle):
"""
handle to perform a rotation of bed dip instance by Z axis using angle and updating the dip / dip azimuth
:param angle: angle of rotation (grad)
"""
vec = self._rotate(DipPoint.get_matrix_z, angle)
self._update_angles(vec)
class DipMarker(DipPoint):
"""DipPoint anchored at a measured depth along a borehole; when a well
geometry is supplied, the dip is reoriented from the tool reference frame
into true coordinates.
"""
def __init__(self, md, dip=None, dazim=None, wellgeometry_in=None, verbose=False):
"""
:param md: measured depth of the marker (depth units)
:param dip: bed dip from horizontal (grad), optional
:param dazim: dip azimuth clockwise from grid north (grad), optional
:param wellgeometry_in: borehole survey interpolation object, optional
:param verbose: enable debug output
"""
self.md = md
if dip is not None and dazim is not None:
self.in_dip = math.radians(dip)
self.in_dazim = math.radians(dazim)
# conversion to radians in DipPoint class
super(DipMarker, self).__init__(dip, dazim)
else:
# initialize as zero
self.in_dip = 0.0
self.in_dazim = 0.0
super(DipMarker, self).__init__(0.0, 0.0)
self.verbose = verbose
if self.verbose:
print(self)
if wellgeometry_in is not None and dip is not None and dazim is not None:
self.clpoint = wellgeometry_in.calculate_cl_point(self.md)
if self.verbose:
print('Dipmarker correction:')
print('MD: {0:8.2f}, '.format(self.md) + super(DipMarker, self).__str__())
self.reorient_dip()
def __str__(self):
""" overloading string operator """
return 'MD: {0:8.3f}, Dip: {1:8.3f}, Azimuth: {2:8.3f}'.format(self.md, math.degrees(self.dip),
math.degrees(self.dazim))
def output_list(self, mymode=0):
"""format the marker for file output
:param mymode: 0 for basic output (MD, dip, azimuth), 1 for detailed output
:return: list of fixed-width strings
"""
if mymode == 0:
return [f'{self.md:{10}.{2}f}', f'{math.degrees(self.dip):{10}.{5}f}',
f'{math.degrees(self.dazim):{10}.{5}f}']
else:
return [f'{self.md:{10}.{2}f}', f'{math.degrees(self.in_dip):{10}.{5}f}',
f'{math.degrees(self.in_dazim):{10}.{5}f}',
f'{math.degrees(self.dip):{10}.{5}f}', f'{math.degrees(self.dazim):{10}.{5}f}',
f'{math.degrees(self.clpoint.incl):{10}.{5}f}', f'{math.degrees(self.clpoint.azim):{10}.{5}f}']
def reorient_dip(self):
"""rotate the dip vector by the borehole inclination (about Y) and
azimuth (about Z) interpolated at the marker depth
"""
by_y = math.degrees(self.clpoint.incl)
by_z = math.degrees(self.clpoint.azim)
if self.verbose:
print(' Borehole INCL: {0:8.3f}, Borehole AZIM: {1:8.3f}'.format(by_y, by_z))
print(' Rotation on Y-Axis with borehole inclination: {0:8.3f}'.format(by_y))
self.rotate_y(by_y)
if self.verbose:
print(' Rotation on Z-Axis with borehole azimuth : {0:8.3f}'.format(by_z))
self.rotate_z(by_z)
if __name__ == '__main__': # call test environment only if module is called standalone
TWIDTH = 79 # terminal width excluding EOL
print(TWIDTH*'=')
print('module test: dipmath'.ljust(TWIDTH, '-'))
print(TWIDTH*'=')
print('Testing: Class DipPoint')
point = DipPoint(45, 0)
print('Input:')
print(point)
print('Rotation starts:')
for ang in range(0, 360, 15):
point.rotate_y(15)
# print('Rot: ', ang, ' Result:', point)
print(TWIDTH*'=')
print('Testing: Class DipMarker')
# generate a borehole deviation survey / interpolation object
wellgeometry = boreholemath.TransformBoreHoleSurvey(datadir='..\\data', mode=0, relativeCoords=True, verbose=False)
# correct one marker by extracting the corresponding borehole inclination/azimuth
print('Apply correction on one DipMarker point:')
dmarker = DipMarker(5000, 45, 10, wellgeometry, verbose=True)
print(dmarker)
# repeat the same for data read from a file
print('Opening dipmarker file:')
inargs = {'datadir': '..\\data', 'filename_in': 'sample-dipmarker.txt',
'headerlines_in': 1, 'columns_in': (1, 2, 3)}
reader = fileio.BHReaderWriter(**inargs)
lines = reader.read_data()
result = []
for line in lines:
try:
# convert data to numbers and check for depth-sorting
line = [float(i) for i in line]
result.append(DipMarker(*line, wellgeometry, verbose=False))
except ValueError:
print('Exception: Error during conversion of survey data')
sys.exit()
print('Before - MD: {0:8.3f}, Dip: {1:8.3f}, Azimuth: {2:8.3f}'.format(*line))
print('After - ' + str(result[-1]))
print('Writing dipmarker file:')
# mode = basic(0) or detailed(1) output
mode = 1
outdata = []
for item in result:
outdata.append(item.output_list(mode))
if mode == 1:
outheader = ('Well: UNKNOWN', 'MD [depthunit]', 'DIP_ORIG [deg]',
'DAZI_ORIG [deg]', 'DIP [deg]', 'DAZI [deg]', 'INCL [deg]', 'AZIM [deg]')
else:
outheader = ('Well: UNKNOWN', 'MD [depthunit]', 'DIP [deg]', 'DAZI [deg]')
outargs = {'datadir': '..\\data', 'filename_out': 'out_sample-dipmarker.txt',
'header_out': outheader, 'data_out': outdata, 'verbose': True}
writer = fileio.BHReaderWriter(**outargs)
writer.write_data()
print(TWIDTH*'=')
else:
print('Importing ' + __name__)
| 39.638298 | 119 | 0.565933 | 1,434 | 11,178 | 4.332636 | 0.207113 | 0.009979 | 0.007726 | 0.007082 | 0.349429 | 0.306615 | 0.275712 | 0.243522 | 0.230484 | 0.207951 | 0 | 0.027089 | 0.296565 | 11,178 | 281 | 120 | 39.779359 | 0.763068 | 0.266327 | 0 | 0.15 | 0 | 0.01875 | 0.183734 | 0.042152 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0875 | false | 0 | 0.03125 | 0 | 0.18125 | 0.16875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
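The rotation matrices in dipmath.py can be sanity-checked with pure Python: rotating the default north-pointing unit tangential vector by 90° about the Z axis must yield an east-pointing vector. A minimal sketch (mirroring `get_matrix_z` and `_rotate`, with illustrative helper names):

```python
import math

def rot_z(angle_deg):
    """Z-axis rotation matrix, matching DipPoint.get_matrix_z."""
    a = math.radians(angle_deg)
    return [[math.cos(a), -math.sin(a), 0.0],
            [math.sin(a),  math.cos(a), 0.0],
            [0.0, 0.0, 1.0]]

def apply(matrix, vec):
    """Matrix-vector product, as in DipPoint._rotate."""
    return [sum(matrix[i][j] * vec[j] for j in range(3)) for i in range(3)]

# North-pointing unit vector (N, E, V) rotated 90 deg about Z points east.
v = apply(rot_z(90.0), [1.0, 0.0, 0.0])
```

The same check for `rotate_y` would tilt a horizontal vector into the vertical component, which is how `reorient_dip` applies borehole inclination before azimuth.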
f3e876544f718223952354b4ac3f13e220acfbeb | 12,000 | py | Python | dependency_wxwidgets.py | madeso/build | 79f1d4e592e0f6ebd6bfa3db39cf4ba2b5ef98db | [
"MIT"
] | null | null | null | dependency_wxwidgets.py | madeso/build | 79f1d4e592e0f6ebd6bfa3db39cf4ba2b5ef98db | [
"MIT"
] | null | null | null | dependency_wxwidgets.py | madeso/build | 79f1d4e592e0f6ebd6bfa3db39cf4ba2b5ef98db | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import urllib.request
import os
import zipfile
import sys
import argparse
import re
import subprocess
###############################################################################
## Classes
class TextReplacer:
def __init__(self):
self.res = []
def add(self, reg: str, rep: str):
self.res.append( (reg, rep) )
return self
def replace(self, text: str) -> str:
for replacer in self.res:
reg = replacer[0]
rep = replacer[1]
text = text.replace(reg, rep)
return text
class Settings:
def __init__(self, root: str, install_dist: str, install: str, wx_root: str, build: str, appveyor_msbuild: str, platform: str):
self.root = root
self.install_dist = install_dist
self.install = install
self.wx_root = wx_root
self.build = build
self.appveyor_msbuild = appveyor_msbuild
self.platform = platform
def print(self):
print('root:', self.root)
print('install_dist:', self.install_dist)
print('install:', self.install)
print('wx_root:', self.wx_root)
print('build:', self.build)
print('appveyor_msbuild:', self.appveyor_msbuild)
print('platform:', self.platform)
def is_appveyor(self) -> bool:
key = 'APPVEYOR'
value = os.environ[key] if key in os.environ else ''
return value.lower().strip() == 'true'
def append_appveyor(self, args):
if self.is_appveyor():
args.append(self.appveyor_msbuild)
###############################################################################
## Functions
def setup() -> Settings:
root = os.getcwd()
install_dist = os.path.join(root, 'dependencies')
install = os.path.join(root, 'dist')
wx_root = os.path.join(install_dist, 'wx')
build = os.path.join(root, 'build')
appveyor_msbuild = r'/logger:C:\Program Files\AppVeyor\BuildAgent\Appveyor.MSBuildLogger.dll'
platform = 'x64'
if os.environ.get('PLATFORM', 'unknown') == 'x86':
platform = 'Win32'
return Settings(root, install_dist, install, wx_root, build, appveyor_msbuild, platform)
def verify_dir_exist(path):
if not os.path.isdir(path):
os.makedirs(path)
def download_file(url, path):
if not os.path.isfile(path):
urllib.request.urlretrieve(url, path)
else:
print("Already downloaded", path)
def list_projects_in_solution(path):
ret = []
directory_name = os.path.dirname(path)
project_line = re.compile(r'Project\("[^"]+"\) = "[^"]+", "([^"]+)"')
with open(path) as sln:
for line in sln:
# Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "richtext", "wx_richtext.vcxproj", "{7FB0902D-8579-5DCE-B883-DAF66A885005}"
project_match = project_line.match(line)
if project_match:
ret.append(os.path.join(directory_name, project_match.group(1)))
return ret
def add_definition_to_project(path, define):
# <PreprocessorDefinitions>WIN32;_LIB;_CRT_SECURE_NO_DEPRECATE=1;_CRT_NON_CONFORMING_SWPRINTFS=1;_SCL_SECURE_NO_WARNINGS=1;__WXMSW__;NDEBUG;_UNICODE;WXBUILDING;%(PreprocessorDefinitions)</PreprocessorDefinitions>
preproc = re.compile(r'([ ]*<PreprocessorDefinitions>)([^<]*</PreprocessorDefinitions>)')
lines = []
with open(path) as project:
for line in project:
preproc_match = preproc.match(line)
if preproc_match:
lines.append('{0}{1};{2}'.format(preproc_match.group(1), define, preproc_match.group(2)))
else:
lines.append(line.rstrip())
with open(path, mode='w') as project:
for line in lines:
project.write(line + '\n')
# change from:
# <RuntimeLibrary>MultiThreadedDebugDLL</RuntimeLibrary> to <RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
# <RuntimeLibrary>MultiThreadedDLL</RuntimeLibrary> to <RuntimeLibrary>MultiThreaded</RuntimeLibrary>
def change_to_static_link(path):
mtdebug = re.compile(r'([ ]*)<RuntimeLibrary>MultiThreadedDebugDLL')
mtrelease = re.compile(r'([ ]*)<RuntimeLibrary>MultiThreadedDLL')
lines = []
with open(path) as project:
for line in project:
mdebug = mtdebug.match(line)
mrelease = mtrelease.match(line)
if mdebug:
print('in {project} changed to static debug'.format(project=path))
lines.append('{spaces}<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>'.format(spaces=mdebug.group(1)))
elif mrelease:
print('in {project} changed to static release'.format(project=path))
lines.append('{spaces}<RuntimeLibrary>MultiThreaded</RuntimeLibrary>'.format(spaces=mrelease.group(1)))
else:
lines.append(line.rstrip())
with open(path, mode='w') as project:
for line in lines:
project.write(line + '\n')
def change_all_projects_to_static(sln):
projects = list_projects_in_solution(sln)
for proj in projects:
change_to_static_link(proj)
def add_definition_to_solution(sln, definition):
projects = list_projects_in_solution(sln)
for proj in projects:
add_definition_to_project(proj, definition)
def make_single_project_64(project_path, rep):
if not os.path.isfile(project_path):
print('missing ' + project_path)
return
lines = []
with open(project_path) as project:
for line in project:
new_line = rep.replace(line.rstrip())
lines.append(new_line)
with open(project_path, 'w') as project:
for line in lines:
project.write(line + '\n')
def make_projects_64(sln):
projects = list_projects_in_solution(sln)
rep = TextReplacer()
rep.add('Win32', 'x64')
rep.add('<DebugInformationFormat>EditAndContinue</DebugInformationFormat>', '<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>')
rep.add('<TargetMachine>MachineX86</TargetMachine>', '<TargetMachine>MachineX64</TargetMachine>')
# protobuf specific hack since cmake looks in x64 folder
rep.add(r'<OutDir>Release\</OutDir>', r'<OutDir>x64\Release\</OutDir>')
rep.add(r'<OutDir>Debug\</OutDir>', r'<OutDir>x64\Debug\</OutDir>')
for project in projects:
make_single_project_64(project, rep)
def make_solution_64(solution_path):
rep = TextReplacer()
rep.add('Win32', 'x64')
lines = []
with open(solution_path) as slnlines:
for line in slnlines:
new_line = rep.replace(line.rstrip())
lines.append(new_line)
with open(solution_path, 'w') as solution_handle:
for line in lines:
solution_handle.write(line + '\n')
def convert_sln_to_64(sln):
make_solution_64(sln)
make_projects_64(sln)
def extract_zip_to(path_to_zip, target):
with zipfile.ZipFile(path_to_zip, 'r') as zip_handle:
zip_handle.extractall(target)
###############################################################################
## Commands
def handle_make_solution_64_cmd(args):
convert_sln_to_64(args.sln)
def handle_change_all_projects_to_static_cmd(args):
change_all_projects_to_static(args.sln)
def handle_list_projects_cmd(cmd):
projects = list_projects_in_solution(cmd.sln)
for proj in projects:
print("project", proj)
def handle_add_definition_cmd(args):
add_definition_to_project(args.project, args.define)
def handle_change_to_static_cmd(args):
change_to_static_link(args.project)
def handle_install_cmd(args):
settings = setup()
build = args.build
wx_url = "https://github.com/wxWidgets/wxWidgets/releases/download/v3.1.4/wxWidgets-3.1.4.zip"
wx_zip = os.path.join(settings.install_dist, "wx.zip")
wx_sln = os.path.join(settings.wx_root, 'build', 'msw', 'wx_vc16.sln')
print('Root:', settings.root)
print('wxWidgets solution: ', wx_sln)
verify_dir_exist(settings.install_dist)
verify_dir_exist(settings.wx_root)
print("downloading wx...")
download_file(wx_url, os.path.join(settings.install_dist, wx_zip))
print("extracting wx")
extract_zip_to(wx_zip, settings.wx_root)
print("changing wx to static")
change_all_projects_to_static(wx_sln)
print("building wxwidgets")
print("-----------------------------------")
wx_msbuild_cmd = [
'msbuild',
'/p:Configuration=Release',
'/p:Platform={}'.format(settings.platform)
]
settings.append_appveyor(wx_msbuild_cmd)
wx_msbuild_cmd.append(wx_sln)
if build:
sys.stdout.flush()
subprocess.check_call(wx_msbuild_cmd)
else:
print(wx_msbuild_cmd)
def handle_cmake_cmd(_):
settings = setup()
subinstall = os.path.join(settings.install, 'windows', settings.platform)
os.makedirs(settings.build)
os.makedirs(settings.install)
os.makedirs(subinstall)
generator = 'Visual Studio 16 2019'
cmakecmd = [
'cmake',
"-DCMAKE_INSTALL_PREFIX={}".format(subinstall),
"-DwxWidgets_ROOT_DIR={}".format(settings.wx_root),
"-DRIDE_BUILD_COMMIT=%APPVEYOR_REPO_COMMIT%",
"-DRIDE_BUILD_NUMBER=%APPVEYOR_BUILD_NUMBER%",
"-DRIDE_BUILD_BRANCH=%APPVEYOR_REPO_BRANCH%",
"-DRIDE_BUILD_REPO=%APPVEYOR_REPO_NAME%",
'-G', generator,
'-A', settings.platform,
settings.root
]
sys.stdout.flush()
subprocess.check_call(cmakecmd, cwd=settings.build)
def handle_build_cmd(_):
settings = setup()
ride_sln = os.path.join(settings.build, 'PACKAGE.vcxproj')
ride_msbuild_cmd = [
'msbuild',
'/p:Configuration=Release',
'/p:Platform={}'.format(settings.platform),
settings.appveyor_msbuild,
ride_sln
]
sys.stdout.flush()
subprocess.check_call(ride_msbuild_cmd)
def handle_print_cmd(_):
settings = setup()
settings.print()
###############################################################################
## Main
def main():
parser = argparse.ArgumentParser(description='Does the windows build')
subparsers = parser.add_subparsers()
install_parser = subparsers.add_parser('install')
install_parser.set_defaults(func=handle_install_cmd)
install_parser.add_argument('--nobuild', dest='build', action='store_const', const=False, default=True)
install_parser = subparsers.add_parser('listprojects')
install_parser.set_defaults(func=handle_list_projects_cmd)
install_parser.add_argument('sln', help='solution file')
static_project_parser = subparsers.add_parser('static_project')
static_project_parser.set_defaults(func=handle_change_to_static_cmd)
static_project_parser.add_argument('project', help='make a project statically link to the CRT')
static_project_parser = subparsers.add_parser('to64')
static_project_parser.set_defaults(func=handle_make_solution_64_cmd)
static_project_parser.add_argument('sln', help='the solution to upgrade')
static_solution_parser = subparsers.add_parser('static_sln')
static_solution_parser.set_defaults(func=handle_change_all_projects_to_static_cmd)
static_solution_parser.add_argument('sln', help='make all the projects in the specified solution staticly link to the CRT')
install_parser = subparsers.add_parser('add_define')
install_parser.set_defaults(func=handle_add_definition_cmd)
install_parser.add_argument('project', help='project file')
install_parser.add_argument('define', help='preprocessor to add')
cmake_parser = subparsers.add_parser('cmake')
cmake_parser.set_defaults(func=handle_cmake_cmd)
build_parser = subparsers.add_parser('build')
build_parser.set_defaults(func=handle_build_cmd)
print_parser = subparsers.add_parser('print')
print_parser.set_defaults(func=handle_print_cmd)
args = parser.parse_args()
args.func(args)
if __name__ == "__main__":
main()
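The dispatch pattern used in `main()` above (each subparser binds its handler via `set_defaults(func=...)`, then `args.func(args)` invokes it) can be exercised in isolation. A minimal sketch; the `greet` subcommand and its handler are illustrative names, not part of the build script:

```python
import argparse


def handle_greet(args):
    # Handlers receive the parsed Namespace, mirroring handle_build_cmd etc.
    return 'hello {}'.format(args.name)


def build_parser():
    parser = argparse.ArgumentParser(description='dispatch demo')
    subparsers = parser.add_subparsers()
    greet_parser = subparsers.add_parser('greet')
    greet_parser.add_argument('name')
    greet_parser.set_defaults(func=handle_greet)
    return parser


if __name__ == '__main__':
    args = build_parser().parse_args(['greet', 'world'])
    print(args.func(args))  # -> hello world
```

One caveat of this pattern: invoking the script with no subcommand leaves `args` without a `func` attribute, so `args.func(args)` raises `AttributeError` on Python 3 unless the subparsers are marked required.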

###############################################################################
# raw2dataset.py (repo: CoderSLZhang/-LTSM-, license: MIT)
###############################################################################

# zhangshulin
# 2018-3-17
# e-mail: zhangslwork@yeah.net
TRAIN_IN_PATH = './datasets/rawdata/train/in.txt'
TRAIN_OUT_PATH = './datasets/rawdata/train/out.txt'
TEST_IN_PATH = './datasets/rawdata/test/in.txt'
TEST_OUT_PATH = './datasets/rawdata/test/out.txt'
TOTAL_PATH = './datasets/all_couplets.txt'
def create_data_file(train_in_path, train_out_path, test_in_path, test_out_path, total_path):
    with open(train_in_path, 'r', encoding='utf8') as f:
        train_in_arr = f.readlines()

    with open(train_out_path, 'r', encoding='utf8') as f:
        train_out_arr = f.readlines()

    with open(test_in_path, 'r', encoding='utf8') as f:
        test_in_arr = f.readlines()

    with open(test_out_path, 'r', encoding='utf8') as f:
        test_out_arr = f.readlines()

    train_in_arr = map(process_in_couplet, train_in_arr)
    train_out_arr = map(process_out_couplet, train_out_arr)
    test_in_arr = map(process_in_couplet, test_in_arr)
    test_out_arr = map(process_out_couplet, test_out_arr)

    train_in_out_arr = [up + down for up, down in zip(train_in_arr, train_out_arr)
                        if len(up.strip()) != 0 and len(down.strip()) != 0]
    test_in_out_arr = [up + down for up, down in zip(test_in_arr, test_out_arr)
                       if len(up.strip()) != 0 and len(down.strip()) != 0]

    total_arr = train_in_out_arr + test_in_out_arr

    with open(total_path, 'w', encoding='utf8') as f:
        f.writelines(total_arr)

    print('data file creating complete ^_^')


def process_in_couplet(couplet):
    return couplet.replace(' ', '').replace('\n', ';')


def process_out_couplet(couplet):
    return couplet.replace(' ', '').replace('\n', '。\n')

if __name__ == '__main__':
    create_data_file(TRAIN_IN_PATH, TRAIN_OUT_PATH, TEST_IN_PATH, TEST_OUT_PATH, TOTAL_PATH)
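The two helpers above normalize a raw couplet pair: inter-character spaces are dropped, the first line's newline becomes a `;` separator, and the second line's newline becomes `。` plus a real newline, so each output record is one complete couplet per line. A self-contained demonstration (the sample couplet text is illustrative, not taken from the dataset):

```python
def process_in_couplet(couplet):
    # First (upper) line: strip spaces, newline becomes the ';' separator.
    return couplet.replace(' ', '').replace('\n', ';')


def process_out_couplet(couplet):
    # Second (lower) line: strip spaces, end with a full stop and newline.
    return couplet.replace(' ', '').replace('\n', '。\n')


record = process_in_couplet('晚 风 摇 树 树 还 挺\n') + process_out_couplet('晨 露 润 花 花 更 红\n')
print(record)  # -> 晚风摇树树还挺;晨露润花花更红。
```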

###############################################################################
# rayvision_c4d/analyze_c4d.py (repo: renderbus/rayvision_c4d, license: Apache-2.0)
###############################################################################

# -*- coding: utf-8 -*-
"""A interface for c4d."""
# Import built-in models
from __future__ import print_function
from __future__ import unicode_literals
import base64
import hashlib
import logging
import os
import sys
import time
from builtins import str
from rayvision_c4d.constants import PACKAGE_NAME
from rayvision_c4d.get_preferences import GetInstallPath
from rayvision_log import init_logger
from rayvision_utils import constants
from rayvision_utils import utils
from rayvision_utils.cmd import Cmd
from rayvision_utils.exception import tips_code
from rayvision_utils.exception.exception import AnalyseFailError, CGFileNotExistsError
VERSION = sys.version_info[0]

class AnalyzeC4d(object):
    def __init__(self, cg_file, software_version, project_name,
                 plugin_config, render_software="CINEMA 4D", render_layer_type="0",
                 input_project_path=None, local_os=None, workspace=None,
                 custom_exe_path=None,
                 platform="2",
                 logger=None,
                 log_folder=None,
                 log_name=None,
                 log_level="DEBUG"
                 ):
        """Initialize and examine the analysis information.

        Args:
            cg_file (str): Scene file path.
            software_version (str): Software version.
            project_name (str): The project name.
            plugin_config (dict): Plugin information.
            render_software (str): Software name, CINEMA 4D by default.
            render_layer_type (str): 0 is render layer, 1 is render setup.
            input_project_path (str): The working path of the scenario.
            local_os (str): System name, linux or windows.
            workspace (str): Storage path for the analysis result files.
            custom_exe_path (str): Custom exe path used for the analysis.
            platform (str): Platform number.
            logger (object, optional): Custom log object.
            log_folder (str, optional): Custom log save location.
            log_name (str, optional): Custom log file name.
            log_level (str): Log level, e.g. "DEBUG", "INFO", "WARNING", "ERROR".
        """
        self.logger = logger
        if not self.logger:
            init_logger(PACKAGE_NAME, log_folder, log_name)
            self.logger = logging.getLogger(__name__)
            self.logger.setLevel(level=log_level.upper())

        self.check_path(cg_file)
        self.cg_file = cg_file
        self.render_software = render_software
        self.input_project_path = input_project_path or ""
        self.render_layer_type = render_layer_type
        self.software_version = software_version
        self.project_name = project_name
        self.plugin_config = plugin_config

        local_os = self.check_local_os(local_os)
        self.local_os = local_os
        self.tmp_mark = str(int(time.time()))
        workspace = os.path.join(self.check_workspace(workspace),
                                 self.tmp_mark)
        if not os.path.exists(workspace):
            os.makedirs(workspace)
        self.workspace = workspace

        if custom_exe_path:
            self.check_path(custom_exe_path)
        self.custom_exe_path = custom_exe_path

        self.platform = platform
        self.task_json = os.path.join(workspace, "task.json")
        self.tips_json = os.path.join(workspace, "tips.json")
        self.asset_json = os.path.join(workspace, "asset.json")
        self.upload_json = os.path.join(workspace, "upload.json")
        self.analyse_log_path = os.path.join(workspace, "analyze.log")
        self.tips_info = {}
        self.task_info = {}
        self.asset_info = {}
        self.upload_info = {}
    @staticmethod
    def check_path(tmp_path):
        """Check if the path exists."""
        if not os.path.exists(tmp_path):
            raise CGFileNotExistsError("{} is not found".format(tmp_path))

    def add_tip(self, code, info):
        """Add an error message.

        Args:
            code (str): Error code.
            info (str or list): Error message description.
        """
        if isinstance(info, str):
            self.tips_info[code] = [info]
        elif isinstance(info, list):
            self.tips_info[code] = info
        else:
            raise Exception("info must be a list or str.")

    def save_tips(self):
        """Write the error messages to tips.json."""
        utils.json_save(self.tips_json, self.tips_info, ensure_ascii=False)

    @staticmethod
    def check_local_os(local_os):
        """Check the system name.

        Args:
            local_os (str): System name.

        Returns:
            str
        """
        if not local_os:
            if "win" in sys.platform.lower():
                local_os = "windows"
            else:
                local_os = "linux"
        return local_os
    def check_workspace(self, workspace):
        """Check the working environment.

        Args:
            workspace (str): Workspace path.

        Returns:
            str: Workspace path.
        """
        if not workspace:
            if self.local_os == "windows":
                workspace = os.path.join(os.environ["USERPROFILE"], "renderfarm_sdk")
            else:
                workspace = os.path.join(os.environ["HOME"], "renderfarm_sdk")
        else:
            self.check_path(workspace)
        return workspace

    def analyse_cg_file(self):
        """Analyse cg file.

        Analyze the scene file to get the path to the startup file of the CG
        software.
        """
        # Find the version from the cg file
        if VERSION == 3:
            version = self.check_version3(self.cg_file)
        else:
            version = self.check_version2(self.cg_file)
        if int(float(version)) != int(float(self.software_version)):
            self.add_tip(tips_code.CG_NOTMATCH, "{0} {1}".format(
                self.render_software, self.software_version))
            self.save_tips()
        # Find the installation path with the version
        if self.custom_exe_path is not None:
            exe_path = self.custom_exe_path
        else:
            exe_path = self.find_location()
        return exe_path
    def write_task_json(self):
        """Initialize task.json."""
        constants.TASK_INFO["task_info"]["input_cg_file"] = self.cg_file.replace("\\", "/")
        constants.TASK_INFO["task_info"]["input_project_path"] = self.input_project_path.replace("\\", "/")
        constants.TASK_INFO["task_info"]["render_layer_type"] = self.render_layer_type
        constants.TASK_INFO["task_info"]["project_name"] = self.project_name
        constants.TASK_INFO["task_info"]["cg_id"] = "2005"
        constants.TASK_INFO["task_info"]["os_name"] = "1" if self.local_os == "windows" else "0"
        constants.TASK_INFO["task_info"]["platform"] = self.platform
        constants.TASK_INFO["software_config"] = {
            "plugins": self.plugin_config,
            "cg_version": self.software_version,
            "cg_name": self.render_software
        }
        utils.json_save(self.task_json, constants.TASK_INFO)

    def check_result(self):
        """Check that the analysis result files exist."""
        for json_path in [self.task_json, self.asset_json,
                          self.tips_json]:
            if not os.path.exists(json_path):
                msg = "Json file is not generated: {0}".format(json_path)
                return False, msg
        return True, None
    def get_file_md5(self, file_path):
        """Generate the md5 value for the given file."""
        hash_md5 = hashlib.md5()
        if os.path.exists(file_path):
            with open(file_path, 'rb') as file_path_f:
                while True:
                    data_flow = file_path_f.read(8096)
                    if not data_flow:
                        break
                    hash_md5.update(data_flow)
        return hash_md5.hexdigest()
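The chunked-read idiom in `get_file_md5` keeps memory usage flat regardless of scene-file size, which matters for multi-gigabyte c4d projects. A self-contained sketch of the same idiom (the temporary file and its contents are illustrative):

```python
import hashlib
import tempfile


def file_md5(path, block_size=8096):
    # Read in fixed-size blocks so a large file never sits fully in memory.
    digest = hashlib.md5()
    with open(path, 'rb') as handle:
        while True:
            block = handle.read(block_size)
            if not block:
                break
            digest.update(block)
    return digest.hexdigest()


# Demonstrate on a small temporary file.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'hello')
    name = tmp.name
print(file_md5(name))  # same digest as hashlib.md5(b'hello').hexdigest()
```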
    def write_upload_json(self):
        """Generate the upload.json."""
        assets = self.asset_info["asset"]
        upload_asset = []
        self.upload_info["scene"] = [
            {
                "local": self.cg_file.replace("\\", "/"),
                "server": utils.convert_path(self.cg_file),
                "hash": self.get_file_md5(self.cg_file)
            }
        ]
        for path in assets:
            resources = {}
            local = path.split(" (mtime")[0]
            server = utils.convert_path(local)
            resources["local"] = local.replace("\\", "/")
            resources["server"] = server
            upload_asset.append(resources)
        # Add the cg file to upload.json
        upload_asset.append({
            "local": self.cg_file.replace("\\", "/"),
            "server": utils.convert_path(self.cg_file)
        })
        self.upload_info["asset"] = upload_asset
        utils.json_save(self.upload_json, self.upload_info)

    def __copy_file(self, src, dst):
        copy_cmd = 'xcopy /s /y /f /e "%s" "%s"' % (src, dst)
        print('Copy command: [%s]' % copy_cmd)
        os.system(copy_cmd)
    def update_pyp_script(self, exe_path, cg_ver):
        print('Update analyze pyp...')
        curr_dir = os.path.dirname(__file__)
        base_dir = os.path.abspath(curr_dir)
        src_plugin = os.path.join(base_dir, 'tool')
        maxon_temp_path = os.path.join(os.getenv('APPDATA'), 'MAXON')
        if not os.path.exists(maxon_temp_path):
            os.makedirs(maxon_temp_path)
        flag = False
        for dir in os.listdir(maxon_temp_path):
            lower_dir = dir.lower()
            lower_inst = os.path.basename(os.path.dirname(exe_path)).lower()
            lower_ver = cg_ver.lower()
            print(lower_dir, lower_inst, lower_ver)
            if lower_dir.startswith(lower_ver) or lower_dir.startswith(lower_inst):
                flag = True
                maxon_plugin_path = os.path.join(maxon_temp_path, dir, 'plugins')
                if not os.path.exists(maxon_plugin_path):
                    os.makedirs(maxon_plugin_path)
                else:
                    try:
                        os.remove(os.path.join(maxon_plugin_path, 'RBAnalyzer.pyp'))
                        os.system('del /q /s %s\\python26\\*' % maxon_plugin_path)
                        os.system('del /q /s %s\\python27\\*' % maxon_plugin_path)
                        os.system('del /q /s %s\\python37\\*' % maxon_plugin_path)
                    except:
                        pass
                print('Copy pyp: from [%s] to [%s]' % (src_plugin, maxon_plugin_path))
                try:
                    self.__copy_file(src_plugin, maxon_plugin_path)
                    print('RBAnalyzer.pyp was updated...')
                except:
                    pass
        if not flag:
            path_finder = GetInstallPath()
            pref_path_inst = path_finder.install_path(os.path.dirname(exe_path))
            if not os.path.exists(pref_path_inst):
                os.makedirs(pref_path_inst)
            flag = self.update_pyp_script(exe_path, cg_ver)
        return flag
    def analyse(self, exe_path):
        """Build a cmd command to perform the analysis scenario.

        Args:
            exe_path (str): Absolute path of the Cinema 4D executable.

        Raises:
            AnalyseFailError: Analysis scenario failed.
        """
        if not os.path.exists(exe_path):
            self.logger.error("Please enter the c4d software absolute path")
            raise AnalyseFailError
        cg_ver = '{} {}'.format(self.render_software, self.software_version)  # e.g. "CINEMA 4D R19"
        if not self.update_pyp_script(exe_path, cg_ver):
            print('[ERROR] MAXON appdata "%appdata%/MAXON" not found')
            raise ValueError('MAXON appdata not found')
        self.write_task_json()
        print('Analyze cg file: [%s]' % self.cg_file)
        if sys.version_info.major == 2:
            cg_file = base64.b64encode(bytes(self.cg_file)).decode("utf-8")
        else:
            cg_file = base64.b64encode(bytes(self.cg_file, 'utf-8')).decode("utf-8")
        print('Encoded cg file: [%s]' % cg_file)
        if self.local_os == 'windows':
            cmd = ('"{exe_path}" -cg_file="{cg_file}" -task_json="{task_json}" '
                   '-asset_json="{asset_json}" -tips_json="{tips_json}" -upload_json="{upload_json}" '
                   '-log_path="{log_path}" '
                   '-parallel -nogui').format(
                exe_path=exe_path,
                cg_file=cg_file,
                task_json=self.task_json,
                asset_json=self.asset_json,
                tips_json=self.tips_json,
                upload_json=self.upload_json,
                log_path=self.analyse_log_path
            )
        else:
            self.logger.error("c4d does not support linux rendering")
            # Bail out early: `cmd` is only built on windows.
            raise AnalyseFailError
        self.logger.debug(cmd)
        code, _, _ = Cmd.run(cmd, shell=True)
        if code not in [0, 1]:
            self.add_tip(tips_code.UNKNOW_ERR, "")
            self.save_tips()
            raise AnalyseFailError
        # Determine whether the analysis is successful by
        # determining whether a json file is generated.
        status, msg = self.check_result()
        if status is False:
            self.add_tip(tips_code.UNKNOW_ERR, msg)
            self.save_tips()
            raise AnalyseFailError(msg)
        self.tips_info = utils.json_load(self.tips_json)
        self.asset_info = utils.json_load(self.asset_json)
        self.task_info = utils.json_load(self.task_json)
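`analyse()` base64-encodes the scene path before putting it on the command line, so paths containing non-ASCII characters pass through the shell intact; the plugin side presumably decodes it the same way (the decoder lives in RBAnalyzer.pyp, which is not part of this file). A self-contained sketch of that round trip; the sample path is illustrative:

```python
import base64

# Illustrative scene path with non-ASCII characters (not from the source).
cg_file = 'D:/scenes/测试工程.c4d'

# Encode as done in analyse(): utf-8 bytes -> base64 -> ascii-safe string.
encoded = base64.b64encode(cg_file.encode('utf-8')).decode('utf-8')

# The receiving side reverses the transformation losslessly.
decoded = base64.b64decode(encoded.encode('utf-8')).decode('utf-8')
print(encoded)
```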

###############################################################################
# main/xdg-utils/template.py (repo: RoastVeg/cports, license: BSD-2-Clause)
###############################################################################

pkgname = "xdg-utils"
pkgver = "1.1.3"
pkgrel = 0
_commit = "d11b33ec7f24cfb1546f6b459611d440013bdc72"
build_style = "gnu_configure"
make_cmd = "gmake"
make_dir = "."
hostmakedepends = ["xmlto", "lynx", "gmake"]
depends = ["xset"]
pkgdesc = "Basic desktop integration scripts"
maintainer = "q66 <q66@chimera-linux.org>"
license = "MIT"
url = "https://www.freedesktop.org/wiki/Software/xdg-utils"
source = f"https://gitlab.freedesktop.org/xdg/{pkgname}/-/archive/{_commit}.tar.gz"
sha256 = "cc7f8b1292a4c1fa2054594642ff90e3740269033a32d97bcf9bd04322d5555c"
# no check target
options = ["!check"]

def post_install(self):
    self.install_license("LICENSE")