#!/usr/bin/python
# -*- coding: utf-8 -*-
# ABOUT DEV. & SOURCE CODE
# nabilanavab, india, kerala
# Telegram: @cyberstainbot
# Email: nabilanavab@gmail.com
# copyright ©️ 2021 nabilanavab
# Released Under Apache License
import os
import fitz
import shutil
import logging
import convertapi
from PIL import Image
from time import sleep
from configs import Config, Msgs
from pyrogram import Client, filters
from pyrogram.types import ForceReply
from PyPDF2 import PdfFileWriter, PdfFileReader
from pyrogram.types import InputMediaPhoto, InputMediaDocument
from pyrogram.types import InlineKeyboardButton, InlineKeyboardMarkup
# LOGGING INFO
# logging.basicConfig(level=logging.INFO)
logging.getLogger("pyrogram").setLevel(logging.WARNING)
# PYROGRAM INSTANCE
bot = Client(
"pyroPdf",
parse_mode = "markdown",
api_id = Config.API_ID,
api_hash = Config.API_HASH,
bot_token = Config.API_TOKEN
)
# GLOBAL VARIABLES
PDF = {}          # images collected per chat for pdf generation
media = {}        # media groups for sending images (pdf to img)
PDF2IMG = {}      # file_id of each user's last pdf (for later use)
PROCESS = []      # chats with a job currently in progress
mediaDoc = {}     # media groups for sending documents (pdf to img)
PAGENOINFO = {}   # page-range info from the user's last /extract request
PDF2IMGPGNO = {}  # page count of the user's last pdf (for extraction)
# SUPPORTED FILES
suprtedFile = [
".jpg", ".jpeg", ".png"
] # Img to pdf file support
suprtedPdfFile = [
".epub", ".xps", ".oxps",
".cbz", ".fb2"
] # files to pdf (zero limits)
suprtedPdfFile2 = [
".csv", ".doc", ".docx", ".dot",
".dotx", ".log", ".mpp", ".mpt",
".odt", ".pot", ".potx", ".pps",
".ppsx", ".ppt", ".pptx", ".pub",
".rtf", ".txt", ".vdx", ".vsd",
".vsdx", ".vst", ".vstx", ".wpd",
".wps", ".wri", ".xls", ".xlsb",
".xlsx", ".xlt", ".xltx", ".xml"
] # file to pdf (ConvertAPI limit)
# CREATING ConvertAPI INSTANCE
if Config.CONVERT_API is not None:
convertapi.api_secret = os.getenv("CONVERT_API")
if Config.MAX_FILE_SIZE:
    MAX_FILE_SIZE = int(os.getenv("MAX_FILE_SIZE"))
    # convert the MB limit to bytes for comparison with document.file_size
    MAX_FILE_SIZE_IN_kiB = MAX_FILE_SIZE * 1024 * 1024
# FORCE SUBSCRIPTION
async def forceSub(chatId):
try:
await bot.get_chat_member(
str(Config.UPDATE_CHANNEL), chatId
)
return "subscribed"
except Exception:
try:
invite_link = await bot.create_chat_invite_link(
int(Config.UPDATE_CHANNEL)
)
            user = await bot.get_chat(chatId)
            await bot.send_message(
                chatId,
                Msgs.forceSubMsg.format(
                    user.first_name, chatId
                ),
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"🌟 JOIN CHANNEL 🌟",
url = invite_link.invite_link
)
],
[
InlineKeyboardButton(
"Refresh ♻️",
callback_data = "refresh"
)
]
]
)
)
return "notSubscribed"
except Exception:
pass
# REPLY TO /start COMMAND
@bot.on_message(filters.command(["start"]))
async def start(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
if Config.UPDATE_CHANNEL:
try:
await bot.get_chat_member(
str(Config.UPDATE_CHANNEL), message.chat.id
)
except Exception:
invite_link = await bot.create_chat_invite_link(
int(Config.UPDATE_CHANNEL)
)
await bot.send_message(
message.chat.id,
Msgs.forceSubMsg.format(
message.from_user.first_name, message.chat.id
),
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"🌟 JOIN CHANNEL 🌟",
url = invite_link.invite_link
)
],
[
InlineKeyboardButton(
"Refresh ♻️",
callback_data = "refresh"
)
]
]
)
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
return
await bot.send_message(
message.chat.id,
Msgs.welcomeMsg.format(
message.from_user.first_name, message.chat.id
),
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Know More ❤️",
callback_data = "strtDevEdt"
),
InlineKeyboardButton(
"Explore Bot 🎊",
callback_data = "imgsToPdfEdit"
)
],
[
InlineKeyboardButton(
"Close",
callback_data = "close"
)
]
]
)
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
except Exception:
pass
# if message is a /id
@bot.on_message(filters.command(["id"]))
async def userId(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, f'Your Id: `{message.chat.id}`'
)
except Exception:
pass
# if message is a /feedback
@bot.on_message(filters.command(["feedback"]))
async def feedback(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview = True
)
except Exception:
pass
# /deletes : Deletes current Images to pdf Queue
@bot.on_message(filters.command(["delete"]))
async def cancelI2P(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
del PDF[message.chat.id]
await bot.send_message(
message.chat.id, "`Queue deleted Successfully..`🤧",
reply_to_message_id = message.message_id
)
shutil.rmtree(f"{message.chat.id}")
except Exception:
await bot.send_message(
            message.chat.id, "`No Queue found..`😲",
reply_to_message_id = message.message_id
)
# cancel current pdf to image Queue
@bot.on_message(filters.command(["cancel"]))
async def cancelP2I(bot, message):
try:
PROCESS.remove(message.chat.id)
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, '`Canceled current work..`🤧'
)
except Exception:
await bot.send_message(
message.chat.id, '`Nothing to cancel..`🏃'
)
# if message is an image
@bot.on_message(filters.private & filters.photo)
async def images(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
if Config.UPDATE_CHANNEL:
check = await forceSub(message.chat.id)
if check == "notSubscribed":
return
imageReply = await bot.send_message(
message.chat.id,
"`Downloading your Image..⏳`",
reply_to_message_id = message.message_id
)
if not isinstance(PDF.get(message.chat.id), list):
PDF[message.chat.id] = []
await message.download(
f"{message.chat.id}/{message.chat.id}.jpg"
)
img = Image.open(
f"{message.chat.id}/{message.chat.id}.jpg"
).convert("RGB")
PDF[message.chat.id].append(img)
await imageReply.edit(
Msgs.imageAdded.format(len(PDF[message.chat.id]))
)
except Exception:
pass
# if message is a document/file
@bot.on_message(filters.private & filters.document)
async def documents(bot, message):
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
if Config.UPDATE_CHANNEL:
check = await forceSub(message.chat.id)
if check == "notSubscribed":
return
isPdfOrImg = message.document.file_name
fileSize = message.document.file_size
fileNm, fileExt = os.path.splitext(isPdfOrImg)
if Config.MAX_FILE_SIZE and fileSize >= int(MAX_FILE_SIZE_IN_kiB):
try:
bigFileUnSupport = await bot.send_message(
message.chat.id,
Msgs.bigFileUnSupport.format(Config.MAX_FILE_SIZE, Config.MAX_FILE_SIZE)
)
sleep(5)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = bigFileUnSupport.message_id
)
except Exception:
pass
elif fileExt.lower() in suprtedFile:
try:
imageDocReply = await bot.send_message(
message.chat.id,
"`Downloading your Image..⏳`",
reply_to_message_id = message.message_id
)
if not isinstance(PDF.get(message.chat.id), list):
PDF[message.chat.id] = []
await message.download(
f"{message.chat.id}/{message.chat.id}.jpg"
)
img = Image.open(
f"{message.chat.id}/{message.chat.id}.jpg"
).convert("RGB")
PDF[message.chat.id].append(img)
await imageDocReply.edit(
Msgs.imageAdded.format(len(PDF[message.chat.id]))
)
except Exception as e:
await imageDocReply.edit(
Msgs.errorEditMsg.format(e)
)
sleep(5)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = imageDocReply.message_id
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
elif fileExt.lower() == ".pdf":
try:
if message.chat.id in PROCESS:
await message.reply_text(
'`Doing Some other Work.. 🥵`'
)
return
pdfMsgId = await bot.send_message(
message.chat.id,
"`Processing.. 🚶`"
)
await message.download(
f"{message.message_id}/pdftoimage.pdf"
)
doc = fitz.open(f'{message.message_id}/pdftoimage.pdf')
noOfPages = doc.pageCount
PDF2IMG[message.chat.id] = message.document.file_id
PDF2IMGPGNO[message.chat.id] = noOfPages
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = pdfMsgId.message_id
)
await bot.send_chat_action(
message.chat.id, "typing"
)
pdfMsgId = await message.reply_text(
Msgs.pdfReplyMsg.format(noOfPages),
reply_markup = ForceReply(),
parse_mode = "md"
)
doc.close()
shutil.rmtree(f'{message.message_id}')
except Exception as e:
try:
PROCESS.remove(message.chat.id)
doc.close()
shutil.rmtree(f'{message.message_id}')
await pdfMsgId.edit(
Msgs.errorEditMsg.format(e)
)
sleep(15)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = pdfMsgId.message_id
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
except Exception:
pass
elif fileExt.lower() in suprtedPdfFile:
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
pdfMsgId = await message.reply_text(
"`Downloading your file..⏳`",
)
await message.download(
f"{message.message_id}/{isPdfOrImg}"
)
await pdfMsgId.edit(
"`Creating pdf..`💛"
)
                fileDoc = fitz.open(
                    f"{message.message_id}/{isPdfOrImg}"
                )
                pdfBytes = fileDoc.convert_to_pdf()
                fileDoc.close()
                pdf = fitz.open("pdf", pdfBytes)
                pdf.save(
                    f"{message.message_id}/{fileNm}.pdf",
                    garbage = 4,
                    deflate = True,
                )
                pdf.close()
await pdfMsgId.edit(
"`Started Uploading..`🏋️"
)
                with open(
                    f"{message.message_id}/{fileNm}.pdf", "rb"
                ) as sendfile:
                    await bot.send_document(
                        chat_id = message.chat.id,
                        document = sendfile,
                        thumb = Config.PDF_THUMBNAIL,
                        caption = f"`Converted: {fileExt} to pdf`"
                    )
await pdfMsgId.edit(
"`Uploading Completed..❤️`"
)
shutil.rmtree(f"{message.message_id}")
sleep(5)
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview = True
)
except Exception as e:
try:
shutil.rmtree(f"{message.message_id}")
await pdfMsgId.edit(
Msgs.errorEditMsg.format(e)
)
sleep(15)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = pdfMsgId.message_id
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
except Exception:
pass
elif fileExt.lower() in suprtedPdfFile2:
if os.getenv("CONVERT_API") is None:
pdfMsgId = await message.reply_text(
"`Owner Forgot to add ConvertAPI.. contact Owner 😒`",
)
sleep(15)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = pdfMsgId.message_id
)
else:
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
pdfMsgId = await message.reply_text(
"`Downloading your file..⏳`",
)
await message.download(
f"{message.message_id}/{isPdfOrImg}"
)
await pdfMsgId.edit(
"`Creating pdf..`💛"
)
                    try:
                        # convertapi.convert() is synchronous; awaiting it would fail
                        convertapi.convert(
                            "pdf",
                            {
                                "File": f"{message.message_id}/{isPdfOrImg}"
                            },
                            from_format = fileExt[1:],
                        ).save_files(
                            f"{message.message_id}/{fileNm}.pdf"
                        )
                    except Exception:
                        try:
                            shutil.rmtree(f"{message.message_id}")
                            await pdfMsgId.edit(
                                "`ConvertAPI limit reached.. contact Owner`"
                            )
                        except Exception:
                            pass
                        return
                    with open(
                        f"{message.message_id}/{fileNm}.pdf", "rb"
                    ) as sendfile:
                        await bot.send_document(
                            chat_id = message.chat.id,
                            document = sendfile,
                            thumb = Config.PDF_THUMBNAIL,
                            caption = f"`Converted: {fileExt} to pdf`",
                        )
await pdfMsgId.edit(
"`Uploading Completed..`🏌️"
)
shutil.rmtree(f"{message.message_id}")
sleep(5)
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview = True
)
except Exception:
pass
else:
try:
await bot.send_chat_action(
message.chat.id, "typing"
)
unSuprtd = await bot.send_message(
message.chat.id, "`unsupported file..🙄`"
)
sleep(15)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = message.message_id
)
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = unSuprtd.message_id
)
except Exception:
pass
except Exception:
pass
# if message is /extract
@bot.on_message(filters.command(["extract"]))
async def extract(bot, message):
try:
if message.chat.id in PROCESS:
await bot.send_chat_action(
message.chat.id, "typing"
)
await message.reply_text("`Doing Some Work..🥵`", quote=True)
return
needPages = message.text.replace('/extract ', '')
if message.chat.id not in PDF2IMG:
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id,
"`send me a pdf first..🤥`"
)
return
else:
pageStartAndEnd = list(needPages.replace('-',':').split(':'))
if len(pageStartAndEnd) > 2:
await bot.send_message(
message.chat.id,
"`I just asked you starting & ending 😅`"
)
return
elif len(pageStartAndEnd) == 2:
try:
if (1 <= int(pageStartAndEnd[0]) <= PDF2IMGPGNO[message.chat.id]):
if (int(pageStartAndEnd[0]) < int(pageStartAndEnd[1]) <= PDF2IMGPGNO[message.chat.id]):
                            PAGENOINFO[message.chat.id] = [False, int(pageStartAndEnd[0]), int(pageStartAndEnd[1]), None]  # [isSinglePage, start, end, singlePageNumber]
else:
await bot.send_message(
message.chat.id,
"`Syntax Error: errorInEndingPageNumber 😅`"
)
return
else:
await bot.send_message(
message.chat.id,
"`Syntax Error: errorInStartingPageNumber 😅`"
)
return
                except Exception:
await bot.send_message(
message.chat.id,
"`Syntax Error: noSuchPageNumbers 🤭`"
)
return
elif len(pageStartAndEnd) == 1:
if pageStartAndEnd[0] == "/extract":
                if (PDF2IMGPGNO[message.chat.id]) == 1:
                    # list layout: [isSinglePage, start, end, singlePageNumber]
                    PAGENOINFO[message.chat.id] = [True, None, None, 1]
                else:
                    PAGENOINFO[message.chat.id] = [False, 1, PDF2IMGPGNO[message.chat.id], None]
elif 0 < int(pageStartAndEnd[0]) <= PDF2IMGPGNO[message.chat.id]:
PAGENOINFO[message.chat.id] = [True, None, None, pageStartAndEnd[0]]
else:
await bot.send_message(
message.chat.id,
'`Syntax Error: noSuchPageNumber 🥴`'
)
return
else:
await bot.send_message(
message.chat.id,
                "`Syntax Error: pageNumberMustBeAnInteger 🧠`"
)
return
        if PAGENOINFO[message.chat.id][0] is False:
if pageStartAndEnd[0] == "/extract":
await bot.send_message(
message.chat.id,
text = f"Extract images from `{PAGENOINFO[message.chat.id][1]}` to `{PAGENOINFO[message.chat.id][2]}` As:",
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Images 🖼️️",
callback_data = "multipleImgAsImages"
),
InlineKeyboardButton(
"Document 📁 ",
callback_data = "multipleImgAsDocument"
)
],
[
InlineKeyboardButton(
"PDF 🎭",
callback_data = "multipleImgAsPdfError"
)
]
]
)
)
else:
await bot.send_message(
message.chat.id,
text = f"Extract images from `{PAGENOINFO[message.chat.id][1]}` to `{PAGENOINFO[message.chat.id][2]}` As:",
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Images 🖼️️",
callback_data = "multipleImgAsImages"
),
InlineKeyboardButton(
"Document 📁 ",
callback_data = "multipleImgAsDocument"
)
],
[
InlineKeyboardButton(
"PDF 🎭",
callback_data = "multipleImgAsPdf"
)
]
]
)
)
        if PAGENOINFO[message.chat.id][0] is True:
await bot.send_message(
message.chat.id,
text = f"Extract page number: `{PAGENOINFO[message.chat.id][3]}` As:",
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Images 🖼️️",
callback_data = "asImages"
),
InlineKeyboardButton(
"Document 📁 ",
callback_data = "asDocument"
)
],
[
InlineKeyboardButton(
"PDF 🎭",
callback_data = "asPdf"
)
]
]
)
)
except Exception:
try:
del PAGENOINFO[message.chat.id]
PROCESS.remove(message.chat.id)
except Exception:
pass
# If message is /text
@bot.on_message(filters.command(["text"]))
async def textCommand(bot, message):
try:
if message.chat.id in PROCESS:
await bot.send_chat_action(
message.chat.id, "typing"
)
await message.reply_text(
"`Doing Some Work..🥵`"
)
return
if message.chat.id not in PDF2IMG:
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id,
"`send me a pdf first..🤥`"
)
return
else:
await bot.send_message(
message.chat.id,
                text = "Send Extracted Text As:",
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Text ✍️",
callback_data = "txtMsg"
),
InlineKeyboardButton(
"Txt File 🗂️",
callback_data = "txtFile"
)
],
[
InlineKeyboardButton(
"Html 🌐",
callback_data = "txtHtml"
),
InlineKeyboardButton(
"Json 🔖",
callback_data = "txtJson"
)
]
]
)
)
except Exception:
try:
del PAGENOINFO[message.chat.id]
PROCESS.remove(message.chat.id)
except Exception:
pass
# If message is /encrypt
@bot.on_message(filters.command(["encrypt"]))
async def encrypt(bot, message):
try:
if message.chat.id in PROCESS:
await bot.send_chat_action(
message.chat.id, "typing"
)
await message.reply_text(
"`Doing Some Work..🥵`"
)
return
if message.chat.id not in PDF2IMG:
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id,
"`send me a pdf first..🤥`"
)
return
password = message.text.replace('/encrypt ', '')
if password == '/encrypt':
await bot.send_message(
message.chat.id,
"`can't find a password..`🐹"
)
return
PROCESS.append(message.chat.id)
await bot.send_chat_action(
message.chat.id, "typing"
)
pdfMsgId = await bot.send_message(
message.chat.id,
"`Downloading your pdf..`🕐"
)
await bot.download_media(
PDF2IMG[message.chat.id],
f"{message.message_id}/pdf.pdf"
)
await pdfMsgId.edit(
"`Encrypting pdf.. `🔐"
)
outputFileObj = PdfFileWriter()
inputFile = PdfFileReader(
f"{message.message_id}/pdf.pdf"
)
pgNmbr = inputFile.numPages
        if pgNmbr > 150:
            await bot.send_message(
                message.chat.id,
                "send me a pdf less than 150pgs..👀"
            )
            # clean up so the user is not left stuck in PROCESS
            PROCESS.remove(message.chat.id)
            shutil.rmtree(f"{message.message_id}")
            return
for i in range(pgNmbr):
if pgNmbr >= 50:
if i % 10 == 0:
await pdfMsgId.edit(
f"`Encrypted {i}/{pgNmbr} pages..`🔑",
)
page = inputFile.getPage(i)
outputFileObj.addPage(page)
outputFileObj.encrypt(password)
await pdfMsgId.edit(
text = "`Started Uploading..`🏋️",
)
with open(
f"{message.message_id}/Encrypted.pdf", "wb"
) as f:
outputFileObj.write(f)
if message.chat.id not in PROCESS:
try:
shutil.rmtree(f'{message.message_id}')
return
except Exception:
return
await bot.send_chat_action(
message.chat.id, "upload_document"
)
with open(
f"{message.message_id}/Encrypted.pdf", "rb"
) as sendfile:
await bot.send_document(
chat_id = message.chat.id,
document = sendfile,
thumb = Config.PDF_THUMBNAIL,
caption = Msgs.encryptedFileCaption.format(
pgNmbr, password
)
)
await pdfMsgId.edit(
"`Uploading Completed..`🏌️",
)
shutil.rmtree(f"{message.message_id}")
del PDF2IMG[message.chat.id]
PROCESS.remove(message.chat.id)
sleep(5)
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview=True
)
    except Exception as e:
        try:
            PROCESS.remove(message.chat.id)
            shutil.rmtree(f"{message.message_id}")
            await pdfMsgId.edit(
                Msgs.errorEditMsg.format(e)
            )
except Exception:
pass
# If message is /generate
@bot.on_message(filters.command(["generate"]))
async def generate(bot, message):
try:
newName = str(message.text.replace("/generate", ""))
images = PDF.get(message.chat.id)
if isinstance(images, list):
pgnmbr = len(PDF[message.chat.id])
del PDF[message.chat.id]
if not images:
await bot.send_chat_action(
message.chat.id, "typing"
)
            imagesNotFounded = await message.reply_text(
                "`No image found..!!`😒"
            )
sleep(5)
await message.delete()
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = imagesNotFounded.message_id
)
return
gnrtMsgId = await bot.send_message(
            message.chat.id, "`Generating pdf..`💚"
)
if newName == " name":
fileName = f"{message.from_user.first_name}" + ".pdf"
elif len(newName) > 1 and len(newName) <= 15:
fileName = f"{newName}" + ".pdf"
elif len(newName) > 15:
fileName = f"{message.from_user.first_name}" + ".pdf"
else:
fileName = f"{message.chat.id}" + ".pdf"
images[0].save(fileName, save_all = True, append_images = images[1:])
await gnrtMsgId.edit(
"`Uploading pdf.. `🏋️",
)
await bot.send_chat_action(
message.chat.id, "upload_document"
)
with open(fileName, "rb") as sendfile:
await bot.send_document(
chat_id = message.chat.id,
document = sendfile,
thumb = Config.PDF_THUMBNAIL,
caption = f"file Name: `{fileName}`\n\n`Total pg's: {pgnmbr}`",
)
await gnrtMsgId.edit(
"`Successfully Uploaded.. `🤫",
)
os.remove(fileName)
shutil.rmtree(f"{message.chat.id}")
sleep(5)
await bot.send_chat_action(
message.chat.id, "typing"
)
await bot.send_message(
message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview = True
)
    except Exception as e:
        print(e)
        # fileName may be unbound if the error happened early; guard the cleanup
        try:
            os.remove(fileName)
            shutil.rmtree(f"{message.chat.id}")
        except Exception:
            pass
# delete spam messages
@bot.on_message(filters.private)
async def spam(bot, message):
try:
spamMsgId = await bot.send_message(
            message.chat.id, "`unsupported media..😪`"
)
sleep(5)
await message.delete()
await bot.delete_messages(
chat_id = message.chat.id,
message_ids = spamMsgId.message_id
)
except Exception:
pass
@bot.on_callback_query()
async def answer(client, callbackQuery):
edit = callbackQuery.data
if edit == "strtDevEdt":
try:
await callbackQuery.edit_message_text(
Msgs.aboutDev, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Follow",
url = "https://t.me/cyberstainbot"
),
InlineKeyboardButton(
"🔙 Home 🏡",
callback_data = "back"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "imgsToPdfEdit":
try:
await callbackQuery.edit_message_text(
Msgs.I2PMsg, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"🔙 Home 🏡",
callback_data = "back"
),
InlineKeyboardButton(
"PDF to images ➡️",
callback_data = "pdfToImgsEdit"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "pdfToImgsEdit":
try:
await callbackQuery.edit_message_text(
Msgs.P2IMsg, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"🔙 Imgs To Pdf",
callback_data = "imgsToPdfEdit"
),
InlineKeyboardButton(
"Home 🏡",
callback_data = "back"
),
InlineKeyboardButton(
"file to Pdf ➡️",
callback_data = "filsToPdfEdit"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "filsToPdfEdit":
try:
await callbackQuery.edit_message_text(
Msgs.F2PMsg, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"🔙 PDF to imgs",
callback_data = "pdfToImgsEdit"
),
InlineKeyboardButton(
"Home 🏡",
callback_data = "back"
),
InlineKeyboardButton(
"WARNING ⚠️",
callback_data = "warningEdit"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "warningEdit":
try:
await callbackQuery.edit_message_text(
Msgs.warningMessage, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"WARNING ⚠️",
callback_data = "warningEdit"
),
InlineKeyboardButton(
"Home 🏡",
callback_data = "back"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "back":
try:
await callbackQuery.edit_message_text(
Msgs.back2Start, disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Know More ♥️",
callback_data = "strtDevEdt"
),
InlineKeyboardButton(
"Explore More 🎊",
callback_data = "imgsToPdfEdit"
)
],
[
InlineKeyboardButton(
"Close 🚶",
callback_data = "close"
)
]
]
)
)
return
except Exception:
pass
elif edit == "close":
try:
await bot.delete_messages(
chat_id = callbackQuery.message.chat.id,
message_ids = callbackQuery.message.message_id
)
return
except Exception:
pass
elif edit in ["multipleImgAsImages", "multipleImgAsDocument", "asImages", "asDocument"]:
try:
if (callbackQuery.message.chat.id in PROCESS) or (callbackQuery.message.chat.id not in PDF2IMG):
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "Same work done before..🏃"
)
return
PROCESS.append(callbackQuery.message.chat.id)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "`Downloading your pdf..⏳`"
)
await bot.download_media(
PDF2IMG[callbackQuery.message.chat.id],
f'{callbackQuery.message.message_id}/pdf.pdf'
)
del PDF2IMG[callbackQuery.message.chat.id]
del PDF2IMGPGNO[callbackQuery.message.chat.id]
doc = fitz.open(f'{callbackQuery.message.message_id}/pdf.pdf')
zoom = 1
mat = fitz.Matrix(zoom, zoom)
if edit == "multipleImgAsImages" or edit == "multipleImgAsDocument":
if int(int(PAGENOINFO[callbackQuery.message.chat.id][2])+1 - int(PAGENOINFO[callbackQuery.message.chat.id][1])) >= 11:
await bot.pin_chat_message(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
disable_notification = True,
both_sides = True
)
percNo = 0
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Total pages: {int(PAGENOINFO[callbackQuery.message.chat.id][2])+1 - int(PAGENOINFO[callbackQuery.message.chat.id][1])}..⏳`"
)
totalPgList = range(int(PAGENOINFO[callbackQuery.message.chat.id][1]), int(PAGENOINFO[callbackQuery.message.chat.id][2] + 1))
cnvrtpg = 0
for i in range(0, len(totalPgList), 10):
pgList = totalPgList[i:i+10]
                    os.makedirs(f'{callbackQuery.message.message_id}/pgs', exist_ok = True)
for pageNo in pgList:
page = doc.loadPage(pageNo-1)
pix = page.getPixmap(matrix = mat)
cnvrtpg += 1
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Converted: {cnvrtpg}/{int((PAGENOINFO[callbackQuery.message.chat.id][2])+1 - int(PAGENOINFO[callbackQuery.message.chat.id][1]))} pages.. 🤞`"
)
if callbackQuery.message.chat.id not in PROCESS:
try:
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Canceled at {cnvrtpg}/{int((PAGENOINFO[callbackQuery.message.chat.id][2])+1 - int(PAGENOINFO[callbackQuery.message.chat.id][1]))} pages.. 🙄`"
)
shutil.rmtree(f'{callbackQuery.message.message_id}')
doc.close()
return
except Exception:
return
                        # writePNG creates the file itself; the extra open() was redundant
                        pix.writePNG(f'{callbackQuery.message.message_id}/pgs/{pageNo}.jpg')
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Started Uploading: {cnvrtpg}'th pg \n\nThis might take some Time :(.. 🤞`"
)
directory = f'{callbackQuery.message.message_id}/pgs'
imag = [os.path.join(directory, file) for file in os.listdir(directory)]
imag.sort(key=os.path.getctime)
percNo = percNo + len(imag)
media[callbackQuery.message.chat.id] = []
mediaDoc[callbackQuery.message.chat.id] = []
LrgFileNo = 1
for file in imag:
if os.path.getsize(file) >= 1000000:
picture = Image.open(file)
CmpImg = f'{callbackQuery.message.message_id}/pgs/temp{LrgFileNo}.jpeg'
picture.save(CmpImg, "JPEG", optimize=True, quality = 50)
LrgFileNo += 1
if os.path.getsize(CmpImg) >= 1000000:
continue
else:
media[
callbackQuery.message.chat.id
].append(
InputMediaPhoto(media = file)
)
mediaDoc[
callbackQuery.message.chat.id
].append(
InputMediaDocument(media = file)
)
continue
media[
callbackQuery.message.chat.id
].append(
InputMediaPhoto(media = file)
)
mediaDoc[
callbackQuery.message.chat.id
].append(
InputMediaDocument(media = file)
)
if edit == "multipleImgAsImages":
if callbackQuery.message.chat.id not in PROCESS:
try:
shutil.rmtree(f'{callbackQuery.message.message_id}')
doc.close()
return
except Exception:
return
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_photo"
)
try:
await bot.send_media_group(
callbackQuery.message.chat.id,
media[callbackQuery.message.chat.id]
)
except Exception:
del media[callbackQuery.message.chat.id]
del mediaDoc[callbackQuery.message.chat.id]
if edit == "multipleImgAsDocument":
if callbackQuery.message.chat.id not in PROCESS:
try:
shutil.rmtree(f'{callbackQuery.message.message_id}')
doc.close()
return
except Exception:
return
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_document"
)
try:
await bot.send_media_group(
callbackQuery.message.chat.id,
mediaDoc[callbackQuery.message.chat.id]
)
except Exception:
del mediaDoc[callbackQuery.message.chat.id]
del media[callbackQuery.message.chat.id]
shutil.rmtree(f'{callbackQuery.message.message_id}/pgs')
PROCESS.remove(callbackQuery.message.chat.id)
del PAGENOINFO[callbackQuery.message.chat.id]
doc.close()
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f'`Uploading Completed.. `🏌️'
)
shutil.rmtree(f'{callbackQuery.message.message_id}')
sleep(5)
await bot.send_chat_action(
callbackQuery.message.chat.id, "typing"
)
await bot.send_message(
callbackQuery.message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview=True
)
if edit == "asImages" or edit == "asDocument":
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Fetching page Number:{PAGENOINFO[callbackQuery.message.chat.id][3]} 🤧`"
)
page = doc.loadPage(int(PAGENOINFO[callbackQuery.message.chat.id][3])-1)
pix = page.getPixmap(matrix = mat)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f"`Successfully Converted your page..✌️`"
)
os.mkdir(f'{callbackQuery.message.message_id}/pgs')
with open(
f'{callbackQuery.message.message_id}/pgs/{PAGENOINFO[callbackQuery.message.chat.id][3]}.jpg','wb'
):
pix.writePNG(f'{callbackQuery.message.message_id}/pgs/{PAGENOINFO[callbackQuery.message.chat.id][3]}.jpg')
file = f'{callbackQuery.message.message_id}/pgs/{PAGENOINFO[callbackQuery.message.chat.id][3]}.jpg'
if os.path.getsize(file) >= 1000000:
picture = Image.open(file)
CmpImg = f'{callbackQuery.message.message_id}/pgs/temp{PAGENOINFO[callbackQuery.message.chat.id][3]}.jpeg'
picture.save(
CmpImg,
"JPEG",
optimize = True,
quality = 50
)
file = CmpImg
if os.path.getsize(CmpImg) >= 1000000:
await bot.send_message(
callbackQuery.message.chat.id,
'`too high resolution.. 🙄`'
)
return
if edit == "asImages":
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_photo"
)
sendfile = open(file,'rb')
await bot.send_photo(
callbackQuery.message.chat.id,
sendfile
)
if edit == "asDocument":
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_document"
)
sendfile = open(file,'rb')
await bot.send_document(
callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = sendfile
)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = f'`Uploading Completed.. `🏌️'
)
PROCESS.remove(callbackQuery.message.chat.id)
del PAGENOINFO[callbackQuery.message.chat.id]
doc.close()
shutil.rmtree(f'{callbackQuery.message.message_id}')
sleep(5)
await bot.send_chat_action(
callbackQuery.message.chat.id, "typing"
)
await bot.send_message(
callbackQuery.message.chat.id, Msgs.feedbackMsg,
disable_web_page_preview = True
)
except Exception as e:
try:
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = Msgs.errorEditMsg.format(e)
)
shutil.rmtree(f'{callbackQuery.message.message_id}')
PROCESS.remove(callbackQuery.message.chat.id)
doc.close()
except Exception:
pass
elif edit == "multipleImgAsPdfError":
try:
await bot.answer_callback_query(
callbackQuery.id,
text = Msgs.fullPdfSplit,
show_alert = True,
cache_time = 0
)
except Exception:
pass
elif edit in ["multipleImgAsPdf", "asPdf"]:
try:
if (callbackQuery.message.chat.id in PROCESS) or (callbackQuery.message.chat.id not in PDF2IMG):
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "Same work done before..🏃"
)
return
PROCESS.append(callbackQuery.message.chat.id)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "`Downloading your pdf..⏳`"
)
await bot.download_media(
PDF2IMG[callbackQuery.message.chat.id],
f'{callbackQuery.message.message_id}/pdf.pdf'
)
del PDF2IMG[callbackQuery.message.chat.id]
del PDF2IMGPGNO[callbackQuery.message.chat.id]
try:
if edit == "multipleImgAsPdf":
splitInputPdf = PdfFileReader(f'{callbackQuery.message.message_id}/pdf.pdf')
splitOutput = PdfFileWriter()
for i in range(int(PAGENOINFO[callbackQuery.message.chat.id][1])-1, int(PAGENOINFO[callbackQuery.message.chat.id][2])):
splitOutput.addPage(
splitInputPdf.getPage(i)
)
file_path = f"{callbackQuery.message.message_id}/split.pdf"
with open(file_path, "wb") as output_stream:
splitOutput.write(output_stream)
await bot.send_document(
chat_id = callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = f"{callbackQuery.message.message_id}/split.pdf"
)
if edit == "asPdf":
splitInputPdf = PdfFileReader(f'{callbackQuery.message.message_id}/pdf.pdf')
splitOutput = PdfFileWriter()
splitOutput.addPage(
splitInputPdf.getPage(
int(PAGENOINFO[callbackQuery.message.chat.id][3])-1
)
)
with open(f"{callbackQuery.message.message_id}/split.pdf", "wb") as output_stream:
splitOutput.write(output_stream)
await bot.send_document(
chat_id = callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = f"{callbackQuery.message.message_id}/split.pdf"
)
shutil.rmtree(f"{callbackQuery.message.message_id}")
PROCESS.remove(callbackQuery.message.chat.id)
del PAGENOINFO[callbackQuery.message.chat.id]
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "`Uploading Completed..🤞`"
)
except Exception as e:
try:
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = Msgs.errorEditMsg.format(e)
)
shutil.rmtree(f"{callbackQuery.message.message_id}")
PROCESS.remove(callbackQuery.message.chat.id)
del PAGENOINFO[callbackQuery.message.chat.id]
except Exception:
pass
except Exception as e:
try:
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = Msgs.errorEditMsg.format(e)
)
shutil.rmtree(f"{callbackQuery.message.message_id}")
PROCESS.remove(callbackQuery.message.chat.id)
del PAGENOINFO[callbackQuery.message.chat.id]
except Exception:
pass
elif edit in ["txtFile", "txtMsg", "txtHtml", "txtJson"]:
try:
if (callbackQuery.message.chat.id in PROCESS) or (callbackQuery.message.chat.id not in PDF2IMG):
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "Same work done before..🏃"
)
return
PROCESS.append(callbackQuery.message.chat.id)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "`Downloading your pdf..⏳`"
)
await bot.download_media(
PDF2IMG[callbackQuery.message.chat.id],
f'{callbackQuery.message.message_id}/pdf.pdf'
)
del PDF2IMG[callbackQuery.message.chat.id]
del PDF2IMGPGNO[callbackQuery.message.chat.id]
doc = fitz.open(f'{callbackQuery.message.message_id}/pdf.pdf') # open document
if edit == "txtFile":
out = open(f'{callbackQuery.message.message_id}/pdf.txt', "wb") # open text output
for page in doc: # iterate the document pages
text = page.get_text().encode("utf8") # get plain text (is in UTF-8)
out.write(text) # write text of page()
out.write(bytes((12,))) # write page delimiter (form feed 0x0C)
out.close()
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_document"
)
sendfile = open(f"{callbackQuery.message.message_id}/pdf.txt",'rb')
await bot.send_document(
chat_id = callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = sendfile
)
sendfile.close()
if edit == "txtMsg":
for page in doc: # iterate the document pages
pdfText = page.get_text().encode("utf8") # get plain text (is in UTF-8)
if 1 <= len(pdfText) <= 1048:
if callbackQuery.message.chat.id not in PROCESS:
try:
await bot.send_chat_action(
callbackQuery.message.chat.id, "typing"
)
await bot.send_message(
callbackQuery.message.chat.id, pdfText
)
except Exception:
return
if edit == "txtHtml":
out = open(f'{callbackQuery.message.message_id}/pdf.html', "wb") # open text output
for page in doc: # iterate the document pages
text = page.get_text("html").encode("utf8") # get plain text as html(is in UTF-8)
out.write(text) # write text of page()
out.write(bytes((12,))) # write page delimiter (form feed 0x0C)
out.close()
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_document"
)
sendfile = open(f"{callbackQuery.message.message_id}/pdf.html",'rb')
await bot.send_document(
chat_id = callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = sendfile
)
sendfile.close()
if edit == "txtJson":
out = open(f'{callbackQuery.message.message_id}/pdf.json', "wb") # open text output
for page in doc: # iterate the document pages
text = page.get_text("json").encode("utf8") # get plain text as html(is in UTF-8)
out.write(text) # write text of page()
out.write(bytes((12,))) # write page delimiter (form feed 0x0C)
out.close()
await bot.send_chat_action(
callbackQuery.message.chat.id, "upload_document"
)
sendfile = open(f"{callbackQuery.message.message_id}/pdf.json", 'rb')
await bot.send_document(
chat_id = callbackQuery.message.chat.id,
thumb = Config.PDF_THUMBNAIL,
document = sendfile
)
sendfile.close()
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = "`Completed my task..😉`"
)
PROCESS.remove(callbackQuery.message.chat.id)
shutil.rmtree(f'{callbackQuery.message.message_id}')
except Exception as e:
try:
await bot.send_message(
callbackQuery.message.chat.id,
Msgs.errorEditMsg.format(e)
)
shutil.rmtree(f'{callbackQuery.message.message_id}')
PROCESS.remove(callbackQuery.message.chat.id)
doc.close()
except Exception:
pass
elif edit == "refresh":
try:
await bot.get_chat_member(
str(Config.UPDATE_CHANNEL),
callbackQuery.message.chat.id
)
await bot.edit_message_text(
chat_id = callbackQuery.message.chat.id,
message_id = callbackQuery.message.message_id,
text = Msgs.welcomeMsg.format(
callbackQuery.from_user.first_name,
callbackQuery.message.chat.id
),
disable_web_page_preview = True,
reply_markup = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(
"Know More ❤️",
callback_data = "strtDevEdt"
),
InlineKeyboardButton(
"Explore Bot 🎊",
callback_data = "imgsToPdfEdit"
)
],
[
InlineKeyboardButton(
"Close",
callback_data = "close"
)
]
]
)
)
except Exception:
try:
await bot.answer_callback_query(
callbackQuery.id,
text = Msgs.foolRefresh,
show_alert = True,
cache_time = 0
)
except Exception:
pass
bot.run()
| 36.003982 | 189 | 0.404731 | 5,406 | 72,332 | 5.317795 | 0.096004 | 0.064283 | 0.118478 | 0.103103 | 0.780889 | 0.755496 | 0.715528 | 0.678378 | 0.632705 | 0.601259 | 0 | 0.006019 | 0.517669 | 72,332 | 2,008 | 190 | 36.021912 | 0.814263 | 0.023157 | 0 | 0.588715 | 0 | 0.005643 | 0.100003 | 0.045993 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.021317 | 0.00815 | 0 | 0.033856 | 0.000627 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
53efe75aa5213a51d2c87ab1988ccd831bd540ee | 20,471 | py | Python | tests/src/smiley/smiley/tests/test_db.py | incognitoRepo/hdlogger | c738161ef3144469ba0f47caf89770613031e96e | [
"BSD-2-Clause"
] | null | null | null | tests/src/smiley/smiley/tests/test_db.py | incognitoRepo/hdlogger | c738161ef3144469ba0f47caf89770613031e96e | [
"BSD-2-Clause"
] | null | null | null | tests/src/smiley/smiley/tests/test_db.py | incognitoRepo/hdlogger | c738161ef3144469ba0f47caf89770613031e96e | [
"BSD-2-Clause"
] | null | null | null | # -*- encoding: utf-8 -*-
import datetime
import fixtures
import json
import profile
import pstats
import tempfile
import testtools
import six
from smiley import db
from smiley import stats
class InitializationTest(testtools.TestCase):
def test_initialize_first_time(self):
with tempfile.NamedTemporaryFile() as f:
conn = db.DB._open_db(f.name)
cursor = conn.cursor()
cursor.execute(u'select * from run')
results = cursor.fetchall()
self.assertEqual(0, len(results))
def test_initialize_second_time(self):
with tempfile.NamedTemporaryFile() as f:
db.DB._open_db(f.name)
conn2 = db.DB._open_db(f.name)
cursor = conn2.cursor()
cursor.execute(u'select * from run')
results = cursor.fetchall()
self.assertEqual(0, len(results))
class TransactionTest(testtools.TestCase):
def setUp(self):
super(TransactionTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
def test_commit(self):
with db.transaction(self.db.conn) as c:
c.execute(
"""
INSERT INTO run (id, cwd, description, start_time)
VALUES ('12345', 'cwd-here', 'useful description',
1370436103.65)
""")
db2 = db.DB(':memory:')
c2 = db2.conn.cursor()
c2.execute('select * from run')
d = c2.fetchall()
self.assertEqual(d, [])
c3 = self.db.conn.cursor()
c3.execute('select * from run')
d = c3.fetchall()
self.assertEqual(len(d), 1)
def test_rollback(self):
try:
with db.transaction(self.db.conn) as c:
c.execute(
"""
INSERT INTO run (id, cwd, description, start_time)
VALUES ('12345', 'cwd-here', 'useful description',
1370436103.65)
""")
db2 = db.DB(':memory:')
c2 = db2.conn.cursor()
c2.execute('select * from run')
d = c2.fetchall()
self.assertEqual(d, [])
raise RuntimeError('testing')
except RuntimeError as err:
self.assertEqual(str(err), 'testing')
c3 = self.db.conn.cursor()
c3.execute('select * from run')
d = c3.fetchall()
self.assertEqual(len(d), 0)
class DBTest(testtools.TestCase):
def setUp(self):
super(DBTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
def test_start_run(self):
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
c = self.db.conn.cursor()
c.execute('select * from run')
data = c.fetchall()
self.assertEqual(len(data), 1)
row = data[0]
self.assertEqual(row['id'], '12345')
self.assertEqual(row['cwd'], '/no/such/dir')
self.assertEqual(row['description'], '"command line would go here"')
self.assertEqual(row['start_time'], 1370436103.65)
self.assertEqual(row['end_time'], None)
self.assertEqual(row['error_message'], None)
self.assertEqual(row['traceback'], None)
def test_start_run_repeat_run_id(self):
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
try:
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
except ValueError as e:
self.assertIn('12345', six.text_type(e))
def test_end_run_clean(self):
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
self.db.end_run(
'12345',
1370436104.65,
message=None,
traceback=None,
stats=None,
)
c = self.db.conn.cursor()
c.execute('select * from run')
data = c.fetchall()
self.assertEqual(len(data), 1)
row = data[0]
self.assertEqual(row['id'], '12345')
self.assertEqual(row['start_time'], 1370436103.65)
self.assertEqual(row['end_time'], 1370436104.65)
def test_end_run_traceback(self):
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
try:
raise RuntimeError('test exception')
except RuntimeError as err:
import sys
self.db.end_run(
'12345',
1370436104.65,
message=six.text_type(err),
traceback=sys.exc_info()[-1],
stats=None,
)
c = self.db.conn.cursor()
c.execute('select * from run')
data = c.fetchall()
self.assertEqual(len(data), 1)
row = data[0]
self.assertEqual(row['id'], '12345')
self.assertEqual(row['error_message'], 'test exception')
# FIXME: Need to serialize the traceback better
assert 'traceback' in row['traceback']
class TraceTest(testtools.TestCase):
def setUp(self):
super(TraceTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
self.db.start_run(
'12345',
'/no/such/dir',
'command line would go here',
1370436103.65,
)
self.local_values = {'name': ['value', 'pairs']}
self.trace_arg = [{'complex': 'value'}]
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=99,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=100,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
def test_insertion_order(self):
c = self.db.conn.cursor()
c.execute('select * from trace order by id')
data = c.fetchall()
line_nos = [r['line_no'] for r in data]
self.assertEqual(line_nos, [99, 100])
def test_local_vars(self):
c = self.db.conn.cursor()
c.execute('select * from trace order by id')
row = c.fetchone()
self.assertEqual(json.loads(row['local_vars']),
self.local_values)
def test_trace_arg(self):
c = self.db.conn.cursor()
c.execute('select * from trace order by id')
row = c.fetchone()
self.assertEqual(json.loads(row['trace_arg']),
self.trace_arg)
class QueryTest(testtools.TestCase):
def setUp(self):
super(QueryTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
self.db.start_run(
'12345',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436103.65,
)
self.local_values = {'name': ['value', 'pairs']}
self.trace_arg = [{'complex': 'value'}]
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=99,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=100,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.start_run(
'6789',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436104.65,
)
self.db.end_run(
'6789',
1370436105.65,
'error message',
None,
stats=None,
)
def test_get_runs(self):
runs = list(self.db.get_runs())
self.assertEqual(len(runs), 2)
self.assertEqual(runs[0].id, '12345')
self.assertEqual(runs[1].id, '6789')
def test_get_runs_desc(self):
runs = list(self.db.get_runs(sort_order='DESC'))
self.assertEqual(len(runs), 2)
self.assertEqual(runs[0].id, '6789')
self.assertEqual(runs[1].id, '12345')
def test_get_runs_errors(self):
runs = list(self.db.get_runs(True))
self.assertEqual(len(runs), 1)
self.assertEqual(runs[0].id, '6789')
def test_get_run(self):
run = self.db.get_run('12345')
self.assertEqual(run.id, '12345')
self.assertEqual(
run.description,
['command', 'line', 'would', 'go', 'here']
)
self.assertIsNone(run.stats)
def test_get_run_missing(self):
self.assertRaises(
db.NoSuchRun,
self.db.get_run,
'no-run-with-this-id',
)
def test_get_trace(self):
trace = list(self.db.get_trace('12345'))
self.assertEqual(len(trace), 2)
line_nos = [r.line_no for r in trace]
self.assertEqual(line_nos, [99, 100])
class QueryWithStatsTest(testtools.TestCase):
def setUp(self):
super(QueryWithStatsTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
self.db.start_run(
'12345',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436103.65,
)
self.local_values = {'name': ['value', 'pairs']}
self.trace_arg = [{'complex': 'value'}]
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=99,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=100,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.start_run(
'6789',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436104.65,
)
stats_data = stats.stats_to_blob(pstats.Stats(profile.Profile()))
self.db.end_run(
'6789',
1370436105.65,
'error message',
None,
stats=stats_data,
)
def test_get_run(self):
run = self.db.get_run('6789')
self.assertIsNotNone(run.stats)
class ThreadQueryTest(testtools.TestCase):
def setUp(self):
super(ThreadQueryTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
self.db.start_run(
'12345',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436103.65,
)
self.local_values = {'name': ['value', 'pairs']}
self.trace_arg = [{'complex': 'value'}]
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=99,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t1',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=100,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t2',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=99,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436104.65,
)
self.db.trace(
run_id='12345',
thread_id='t2',
call_id='abcd',
event='test',
func_name='test_trace',
line_no=100,
filename='test_db.py',
trace_arg=self.trace_arg,
local_vars=self.local_values,
timestamp=1370436106.65,
)
self.db.start_run(
'6789',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436104.65,
)
self.db.end_run(
'6789',
1370436105.65,
'error message',
None,
stats=None,
)
def test_get_thread_details_all_thread(self):
details = list(self.db.get_thread_details('12345'))
by_id = {t.id: t for t in details}
thread_ids = set(by_id.keys())
self.assertEqual(thread_ids, set(['t1', 't2']))
def test_get_thread_details_start_and_end(self):
details = list(self.db.get_thread_details('12345'))
by_id = {t.id: t for t in details}
self.assertEqual(by_id['t2'].start_time,
datetime.datetime.fromtimestamp(1370436104.65))
self.assertEqual(by_id['t2'].end_time,
datetime.datetime.fromtimestamp(1370436106.65))
def test_get_thread_details_num_events(self):
details = list(self.db.get_thread_details('12345'))
by_id = {t.id: t for t in details}
self.assertEqual(by_id['t1'].num_events, 2)
self.assertEqual(by_id['t2'].num_events, 2)
def test_get_trace_no_thread(self):
trace = list(self.db.get_trace('12345'))
self.assertEqual(len(trace), 4)
def test_get_trace_with_thread(self):
trace = list(self.db.get_trace('12345', 't1'))
self.assertEqual(len(trace), 2)
ids = set(t.thread_id for t in trace)
self.assertEqual(ids, set(['t1']))
class FileCacheTest(testtools.TestCase):
def setUp(self):
super(FileCacheTest, self).setUp()
self.useFixture(fixtures.FakeLogger())
self.db = db.DB(':memory:')
self.db.start_run(
'12345',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436103.65,
)
self.db.start_run(
'6789',
'/no/such/dir',
['command', 'line', 'would', 'go', 'here'],
1370436103.65,
)
def test_add_file(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
c = self.db.conn.cursor()
c.execute('select * from file')
rows = list(c.fetchall())
self.assertEqual(len(rows), 1)
row = rows[0]
self.assertEqual(row['body'], 'this would be the body')
self.assertEqual(row['name'], 'test-file.txt')
def test_add_file_unicode_name(self):
self.db.cache_file_for_run(
'12345',
u'téßt-file.txt',
'this would be the body',
)
c = self.db.conn.cursor()
c.execute('select * from file')
rows = list(c.fetchall())
self.assertEqual(len(rows), 1)
row = rows[0]
self.assertEqual(row['body'], 'this would be the body')
self.assertEqual(row['name'], u'téßt-file.txt')
def test_add_file_body(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
u'thîs would be thé bødy',
)
c = self.db.conn.cursor()
c.execute('select * from file')
rows = list(c.fetchall())
self.assertEqual(len(rows), 1)
row = rows[0]
self.assertEqual(row['body'], u'thîs would be thé bødy')
self.assertEqual(row['name'], 'test-file.txt')
def test_add_file_twice_same(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
c = self.db.conn.cursor()
c.execute('select * from file')
rows = list(c.fetchall())
self.assertEqual(len(rows), 1)
row = rows[0]
self.assertEqual(row['body'], 'this would be the body')
self.assertEqual(row['name'], 'test-file.txt')
def test_add_file_twice_different(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this body has changed',
)
c = self.db.conn.cursor()
c.execute('select * from file')
rows = list(c.fetchall())
self.assertEqual(len(rows), 2)
def test_retrieve_via_name(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
body = self.db.get_cached_file(
'12345',
'test-file.txt',
)
self.assertEqual(body, 'this would be the body')
def test_retrieve_via_signature(self):
signature = self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
name, body = self.db.get_cached_file_by_id(
'12345',
signature,
)
self.assertEqual(name, 'test-file.txt')
self.assertEqual(body, 'this would be the body')
def test_retrieve_signature(self):
signature = self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
actual = self.db.get_file_signature(
'12345',
'test-file.txt',
)
self.assertEqual(signature, actual)
def test_retrieve_from_run_bad_id(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
body = self.db.get_cached_file(
'6789', # wrong run id
'test-file.txt',
)
self.assertEqual(body, '')
def test_retrieve_from_run_bad_name(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
body = self.db.get_cached_file(
'12345',
'no-such-file.txt',
)
self.assertEqual(body, '')
def test_list_files(self):
self.db.cache_file_for_run(
'12345',
'test-file.txt',
'this would be the body',
)
self.db.cache_file_for_run(
'12345',
'test-file2.txt',
'this would be the body',
)
files = list(self.db.get_files_for_run('12345'))
self.assertEqual(2, len(files))
names = [f.name for f in files]
self.assertEqual(['test-file.txt', 'test-file2.txt'], names)
| 30.327407 | 76 | 0.515705 | 2,338 | 20,471 | 4.360992 | 0.085971 | 0.048843 | 0.035308 | 0.023342 | 0.804237 | 0.77756 | 0.726756 | 0.6939 | 0.683503 | 0.675853 | 0 | 0.065773 | 0.351619 | 20,471 | 674 | 77 | 30.372404 | 0.702403 | 0.004006 | 0 | 0.681667 | 0 | 0 | 0.137808 | 0 | 0 | 0 | 0 | 0.001484 | 0.118333 | 1 | 0.068333 | false | 0 | 0.018333 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
54dd0638e38d9e0459606f9b9382942339932f25 | 45 | py | Python | captif_data_structures/__init__.py | captif-nz/captif-data-structures | 321a8338841e4986e0f8ab1c66743cba1e0c54c9 | [
"MIT"
] | null | null | null | captif_data_structures/__init__.py | captif-nz/captif-data-structures | 321a8338841e4986e0f8ab1c66743cba1e0c54c9 | [
"MIT"
] | null | null | null | captif_data_structures/__init__.py | captif-nz/captif-data-structures | 321a8338841e4986e0f8ab1c66743cba1e0c54c9 | [
"MIT"
] | null | null | null | __version__ = "0.12"
from . import readers
| 9 | 21 | 0.688889 | 6 | 45 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0.2 | 45 | 4 | 22 | 11.25 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
54fbd151aaad593b695e94ed818194368eb54a21 | 30 | py | Python | biscuit/_version.py | dcoker/biscuit-py | 35f5c64fdd97d26c009df54f619062437e168cb8 | [
"Apache-2.0"
] | 5 | 2016-07-20T02:03:20.000Z | 2020-09-11T15:34:41.000Z | biscuit/_version.py | dcoker/biscuit-py | 35f5c64fdd97d26c009df54f619062437e168cb8 | [
"Apache-2.0"
] | 1 | 2020-07-22T10:07:46.000Z | 2020-07-22T10:07:46.000Z | biscuit/_version.py | dcoker/biscuit-py | 35f5c64fdd97d26c009df54f619062437e168cb8 | [
"Apache-2.0"
] | 2 | 2016-07-20T02:03:25.000Z | 2020-02-07T10:56:21.000Z | """0.1.2"""
VERSION = __doc__
| 10 | 17 | 0.566667 | 5 | 30 | 2.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 0.133333 | 30 | 2 | 18 | 15 | 0.384615 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
0700f2ae63c1f6ff771a426dd5ca2892deab2faa | 4,024 | py | Python | rlcard/games/bridge/utils/action_event.py | xiviu123/rlcard | 2a5273dff6c9dd49a3d4ab84a952fed9a387955b | [
"MIT"
] | null | null | null | rlcard/games/bridge/utils/action_event.py | xiviu123/rlcard | 2a5273dff6c9dd49a3d4ab84a952fed9a387955b | [
"MIT"
] | null | null | null | rlcard/games/bridge/utils/action_event.py | xiviu123/rlcard | 2a5273dff6c9dd49a3d4ab84a952fed9a387955b | [
"MIT"
] | null | null | null | '''
File name: bridge/utils/action_event.py
Author: William Hale
Date created: 11/25/2021
'''
from .bridge_card import BridgeCard
# ====================================
# Action_ids:
# 0 -> no_bid_action_id
# 1 to 35 -> bid_action_id (bid amount by suit or NT)
# 36 -> pass_action_id
# 37 -> dbl_action_id
# 38 -> rdbl_action_id
# 39 to 90 -> play_card_action_id
# ====================================
class ActionEvent(object): # Interface
no_bid_action_id = 0
first_bid_action_id = 1
pass_action_id = 36
dbl_action_id = 37
rdbl_action_id = 38
first_play_card_action_id = 39
def __init__(self, action_id: int):
self.action_id = action_id
def __eq__(self, other):
result = False
if isinstance(other, ActionEvent):
result = self.action_id == other.action_id
return result
@staticmethod
def from_action_id(action_id: int):
if action_id == ActionEvent.pass_action_id:
return PassAction()
elif ActionEvent.first_bid_action_id <= action_id <= 35:
bid_amount = 1 + (action_id - ActionEvent.first_bid_action_id) // 5
bid_suit_id = (action_id - ActionEvent.first_bid_action_id) % 5
bid_suit = BridgeCard.suits[bid_suit_id] if bid_suit_id < 4 else None
return BidAction(bid_amount, bid_suit)
elif action_id == ActionEvent.dbl_action_id:
return DblAction()
elif action_id == ActionEvent.rdbl_action_id:
return RdblAction()
elif ActionEvent.first_play_card_action_id <= action_id < ActionEvent.first_play_card_action_id + 52:
card_id = action_id - ActionEvent.first_play_card_action_id
card = BridgeCard.card(card_id=card_id)
return PlayCardAction(card=card)
else:
raise Exception(f'ActionEvent from_action_id: invalid action_id={action_id}')
@staticmethod
def get_num_actions():
''' Return the number of possible actions in the game
'''
        return 1 + 35 + 3 + 52  # no_bid, 35 bids, pass, dbl, rdbl, 52 play_card
class CallActionEvent(ActionEvent): # Interface
pass
class PassAction(CallActionEvent):
def __init__(self):
super().__init__(action_id=ActionEvent.pass_action_id)
def __str__(self):
return "pass"
def __repr__(self):
return "pass"
class BidAction(CallActionEvent):
    def __init__(self, bid_amount: int, bid_suit: 'str | None'):
suits = BridgeCard.suits
if bid_suit and bid_suit not in suits:
raise Exception(f'BidAction has invalid suit: {bid_suit}')
if bid_suit in suits:
bid_suit_id = suits.index(bid_suit)
else:
bid_suit_id = 4
bid_action_id = bid_suit_id + 5 * (bid_amount - 1) + ActionEvent.first_bid_action_id
super().__init__(action_id=bid_action_id)
self.bid_amount = bid_amount
self.bid_suit = bid_suit
def __str__(self):
bid_suit = self.bid_suit
if not bid_suit:
bid_suit = 'NT'
return f'{self.bid_amount}{bid_suit}'
def __repr__(self):
return self.__str__()
class DblAction(CallActionEvent):
def __init__(self):
super().__init__(action_id=ActionEvent.dbl_action_id)
def __str__(self):
return "dbl"
def __repr__(self):
return "dbl"
class RdblAction(CallActionEvent):
def __init__(self):
super().__init__(action_id=ActionEvent.rdbl_action_id)
def __str__(self):
return "rdbl"
def __repr__(self):
return "rdbl"
class PlayCardAction(ActionEvent):
def __init__(self, card: BridgeCard):
play_card_action_id = ActionEvent.first_play_card_action_id + card.card_id
super().__init__(action_id=play_card_action_id)
self.card: BridgeCard = card
def __str__(self):
return f"{self.card}"
def __repr__(self):
return f"{self.card}"
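The action-id layout documented at the top of this file can be checked with a minimal standalone sketch of the same arithmetic. This is a hypothetical re-derivation, not the module's own API: the constants mirror the comment block, and the suit letters/order are illustrative placeholders for `BridgeCard.suits`, which is not shown here.

```python
# Standalone sketch of the action-id layout (constants mirror the comment
# block above; suit letters are an illustrative assumption).
FIRST_BID, PASS, DBL, RDBL, FIRST_PLAY = 1, 36, 37, 38, 39
SUITS = ['C', 'D', 'H', 'S']  # suit_id 4 means no-trump (NT)

def decode(action_id):
    if action_id == 0:
        return 'no_bid'
    if FIRST_BID <= action_id <= 35:
        amount = 1 + (action_id - FIRST_BID) // 5   # 5 strains per level
        suit_id = (action_id - FIRST_BID) % 5
        suit = SUITS[suit_id] if suit_id < 4 else 'NT'
        return f'{amount}{suit}'
    if action_id == PASS:
        return 'pass'
    if action_id == DBL:
        return 'dbl'
    if action_id == RDBL:
        return 'rdbl'
    if FIRST_PLAY <= action_id < FIRST_PLAY + 52:
        return f'card_{action_id - FIRST_PLAY}'
    raise ValueError(action_id)

print(decode(1))   # 1C
print(decode(35))  # 7NT
print(decode(90))  # card_51
```

The 35 bid ids cover 7 levels times 5 strains (four suits plus NT), which is why the divisor and modulus are both 5.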
| 28.338028 | 109 | 0.634692 | 521 | 4,024 | 4.43762 | 0.180422 | 0.183391 | 0.090398 | 0.055363 | 0.318339 | 0.253028 | 0.16436 | 0.16436 | 0.16436 | 0.037197 | 0 | 0.017845 | 0.261928 | 4,024 | 141 | 110 | 28.539007 | 0.760606 | 0.124503 | 0 | 0.280899 | 0 | 0 | 0.048193 | 0.013769 | 0 | 0 | 0 | 0 | 0 | 1 | 0.213483 | false | 0.089888 | 0.011236 | 0.101124 | 0.561798 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 4 |
07164762307834306469c9174d10629dadb80d27 | 976 | py | Python | src/schnetpack/representation/__init__.py | peder2tm/schnetpack | 3bc033be9d43267742545835ebc5e8bbfc676cc3 | [
"MIT"
] | 1 | 2022-01-24T12:21:03.000Z | 2022-01-24T12:21:03.000Z | src/schnetpack/representation/__init__.py | peder2tm/schnetpack | 3bc033be9d43267742545835ebc5e8bbfc676cc3 | [
"MIT"
] | null | null | null | src/schnetpack/representation/__init__.py | peder2tm/schnetpack | 3bc033be9d43267742545835ebc5e8bbfc676cc3 | [
"MIT"
] | null | null | null | """
Classes for constructing the different representations available in SchnetPack. This encompasses SchNet [#schnet4]_,
Behler-type atom centered symmetry functions (ACSF) [#acsf2]_ and a weighted variant thereof (wACSF) [#wacsf2]_.
References
----------
.. [#schnet4] Schütt, Arbabzadah, Chmiela, Müller, Tkatchenko:
Quantum-chemical insights from deep tensor neural networks.
Nature Communications, 8, 13890. 2017.
.. [#acsf2] Behler:
Atom-centered symmetry functions for constructing high-dimensional neural network potentials.
The Journal of Chemical Physics 134. 074106. 2011.
.. [#wacsf2] Gastegger, Schwiedrzik, Bittermann, Berzsenyi, Marquetand:
wACSF -- Weighted atom-centered symmetry functions as descriptors in machine learning potentials.
The Journal of Chemical Physics 148 (24), 241709. 2018.
"""
from schnetpack.representation.schnet import SchNet, SchNetInteraction
from schnetpack.representation.hdnn import BehlerSFBlock, StandardizeSF
| 48.8 | 116 | 0.780738 | 110 | 976 | 6.9 | 0.690909 | 0.047431 | 0.079051 | 0.114625 | 0.097497 | 0.097497 | 0 | 0 | 0 | 0 | 0 | 0.051887 | 0.131148 | 976 | 19 | 117 | 51.368421 | 0.84316 | 0.843238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
071acf42f1bc7a3b136bd7f9aaaea11be02d0fa8 | 128 | py | Python | geocode.py | knazariy/lviv_parkingbot | b2990046d8d2c84a3ad5ba0bade98f837a16b49f | [
"MIT"
] | 1 | 2021-01-18T14:08:37.000Z | 2021-01-18T14:08:37.000Z | geocode.py | knazariy/lviv_parkingbot | b2990046d8d2c84a3ad5ba0bade98f837a16b49f | [
"MIT"
] | null | null | null | geocode.py | knazariy/lviv_parkingbot | b2990046d8d2c84a3ad5ba0bade98f837a16b49f | [
"MIT"
] | null | null | null | class Geocode:
def __init__(self, longitude, latitude):
self.longitude = longitude
self.latitude = latitude
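A tiny usage sketch for the `Geocode` value object above (the class is restated so the snippet is self-contained; the coordinates are made-up example values, not from this repo):

```python
# Restatement of the Geocode class above, plus example construction.
class Geocode:
    def __init__(self, longitude, latitude):
        self.longitude = longitude
        self.latitude = latitude

# Hypothetical coordinates near Lviv's city centre, for illustration only.
point = Geocode(longitude=24.0316, latitude=49.8397)
print(point.longitude, point.latitude)
```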
| 25.6 | 44 | 0.671875 | 13 | 128 | 6.307692 | 0.538462 | 0.317073 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 128 | 4 | 45 | 32 | 0.854167 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
074ec97e659f56644e1dc8489f7326dcda245392 | 148 | py | Python | kdl_wagtail/draftail/apps.py | kingsdigitallab/django-kdl-wagtail | 457623a35057f88ee575397ac2c68797f35085e1 | [
"MIT"
] | 3 | 2020-02-18T07:19:13.000Z | 2021-06-14T20:35:08.000Z | kdl_wagtail/draftail/apps.py | kingsdigitallab/django-kdl-wagtail | 457623a35057f88ee575397ac2c68797f35085e1 | [
"MIT"
] | 16 | 2019-02-08T19:39:27.000Z | 2020-07-30T20:01:38.000Z | kdl_wagtail/draftail/apps.py | kingsdigitallab/django-kdl-wagtail | 457623a35057f88ee575397ac2c68797f35085e1 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class KdlWagtailDraftailConfig(AppConfig):
name = 'kdl_wagtail.draftail'
label = 'kdl_wagtail_draftail'
| 21.142857 | 42 | 0.777027 | 16 | 148 | 7 | 0.75 | 0.178571 | 0.321429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148649 | 148 | 6 | 43 | 24.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0.27027 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
4acd093ced1c96b2a30c00e8db04eace937dc7b9 | 119 | py | Python | src/playground/fun.py | mainpyp/paper-miner | ca76439e743920a1fa659b7d7d51be0840fd8ce9 | [
"Apache-2.0"
] | null | null | null | src/playground/fun.py | mainpyp/paper-miner | ca76439e743920a1fa659b7d7d51be0840fd8ce9 | [
"Apache-2.0"
] | null | null | null | src/playground/fun.py | mainpyp/paper-miner | ca76439e743920a1fa659b7d7d51be0840fd8ce9 | [
"Apache-2.0"
] | null | null | null | my_dict = {
"key 1": 1,
"key 2": 2,
"key 3": 3
}
for key in my_dict:
print(f"{key} -> {my_dict[key]}") | 14.875 | 37 | 0.470588 | 22 | 119 | 2.409091 | 0.454545 | 0.339623 | 0.339623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072289 | 0.302521 | 119 | 8 | 37 | 14.875 | 0.566265 | 0 | 0 | 0 | 0 | 0 | 0.316667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
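The loop above indexes `my_dict[key]` on every iteration; an equivalent, slightly more idiomatic sketch uses `.items()` to get each `(key, value)` pair directly:

```python
my_dict = {
    "key 1": 1,
    "key 2": 2,
    "key 3": 3
}
# .items() yields (key, value) pairs, avoiding a second dict lookup per key.
lines = [f"{key} -> {value}" for key, value in my_dict.items()]
print("\n".join(lines))
```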
4af3f1bc2df16ac0177f5280f8ee0392928dccd9 | 404 | py | Python | app/passwordhelper.py | DSurguy/nuzlocker | 62dac0079d83cb62b381e09fa43a220b952eac83 | [
"MIT"
] | null | null | null | app/passwordhelper.py | DSurguy/nuzlocker | 62dac0079d83cb62b381e09fa43a220b952eac83 | [
"MIT"
] | 1 | 2018-02-07T02:12:03.000Z | 2018-02-07T02:13:24.000Z | app/passwordhelper.py | DSurguy/nuzlocker | 62dac0079d83cb62b381e09fa43a220b952eac83 | [
"MIT"
] | null | null | null | import hashlib
import base64
import os
class PasswordHelper:
def get_hash(self, val):
return hashlib.sha512(val.encode('utf-8')).hexdigest()
    def get_salt(self):
        # decode to str: b32encode returns bytes, which cannot be
        # concatenated with the str password in validate_password
        return base64.b32encode(os.urandom(20)).decode('utf-8')
    def validate_password(self, plain, salt, expected):
        return self.get_hash(plain + salt) == expected
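The hash/salt round trip can be sketched standalone with the same stdlib calls (module-level functions here, rather than the class's methods). The `.decode('utf-8')` is the important detail: `base64.b32encode` returns `bytes`, which must become `str` before concatenation with a `str` password.

```python
import base64
import hashlib
import os

def get_hash(val):
    # SHA-512 hex digest of the UTF-8 encoded input
    return hashlib.sha512(val.encode('utf-8')).hexdigest()

def get_salt():
    # 20 random bytes -> 32-character base32 string (no padding needed)
    return base64.b32encode(os.urandom(20)).decode('utf-8')

salt = get_salt()
stored = get_hash('hunter2' + salt)                # what would be persisted
assert get_hash('hunter2' + salt) == stored        # correct password validates
assert get_hash('wrong' + salt) != stored          # wrong password rejected
```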
ab029b3a835a3452bd754ef464c566eed2576939 | 2,368 | py | Python | models/synthesizer_net.py | ly-zhu/cof-net | e447501aa3ca918dcfc6be5bf3e7497cf39aa1d3 | [
"MIT"
] | 3 | 2021-09-23T12:03:09.000Z | 2021-12-06T10:32:12.000Z | models/synthesizer_net.py | ly-zhu/cof-net | e447501aa3ca918dcfc6be5bf3e7497cf39aa1d3 | [
"MIT"
] | null | null | null | models/synthesizer_net.py | ly-zhu/cof-net | e447501aa3ca918dcfc6be5bf3e7497cf39aa1d3 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
class InnerProd(nn.Module):
def __init__(self, fc_dim):
super(InnerProd, self).__init__()
self.scale = nn.Parameter(torch.ones(fc_dim))
self.bias = nn.Parameter(torch.zeros(1))
def forward(self, feat_img, feat_sound):
sound_size = feat_sound.size()
B, C = sound_size[0], sound_size[1]
feat_img = feat_img.view(B, 1, C)
z = torch.bmm(feat_img * self.scale, feat_sound.view(B, C, -1)) \
.view(B, 1, *sound_size[2:])
z = z + self.bias
return z
def forward_nosum(self, feat_img, feat_sound):
(B, C, H, W) = feat_sound.size()
feat_img = feat_img.view(B, C)
z = (feat_img * self.scale).view(B, C, 1, 1) * feat_sound
z = z + self.bias
return z
# inference purposes
def forward_pixelwise(self, feats_img, feat_sound):
(B, C, HI, WI) = feats_img.size()
(B, C, HS, WS) = feat_sound.size()
feats_img = feats_img.view(B, C, HI*WI)
feats_img = feats_img.transpose(1, 2)
feat_sound = feat_sound.view(B, C, HS * WS)
z = torch.bmm(feats_img * self.scale, feat_sound) \
.view(B, HI, WI, HS, WS)
z = z + self.bias
return z
class Bias(nn.Module):
def __init__(self):
super(Bias, self).__init__()
self.bias = nn.Parameter(torch.zeros(1))
# self.bias = nn.Parameter(-torch.ones(1))
def forward(self, feat_img, feat_sound):
(B, C, H, W) = feat_sound.size()
feat_img = feat_img.view(B, 1, C)
z = torch.bmm(feat_img, feat_sound.view(B, C, H * W)).view(B, 1, H, W)
z = z + self.bias
return z
def forward_nosum(self, feat_img, feat_sound):
(B, C, H, W) = feat_sound.size()
z = feat_img.view(B, C, 1, 1) * feat_sound
z = z + self.bias
return z
# inference purposes
def forward_pixelwise(self, feats_img, feat_sound):
(B, C, HI, WI) = feats_img.size()
(B, C, HS, WS) = feat_sound.size()
feats_img = feats_img.view(B, C, HI*WI)
feats_img = feats_img.transpose(1, 2)
feat_sound = feat_sound.view(B, C, HS * WS)
z = torch.bmm(feats_img, feat_sound) \
.view(B, HI, WI, HS, WS)
z = z + self.bias
return z
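The core of `forward_pixelwise` (in both classes above) is a batched matrix product: `torch.bmm` on a `(B, P, C)` tensor and a `(B, C, Q)` tensor contracts over the channel axis `C`. A tiny pure-Python sketch of that contraction, using nested lists instead of torch tensors for illustration:

```python
# Pure-Python sketch of torch.bmm's contraction:
# z[b][p][q] = sum_c a[b][p][c] * b[b][c][q]
def bmm(a, b):
    return [[[sum(a[n][p][c] * b[n][c][q] for c in range(len(b[n])))
              for q in range(len(b[n][0]))]
             for p in range(len(a[n]))]
            for n in range(len(a))]

img = [[[1.0, 2.0]]]      # shape (B=1, P=1, C=2): one pixel's feature vector
sound = [[[3.0], [4.0]]]  # shape (B=1, C=2, Q=1): one spectrogram position
print(bmm(img, sound))    # [[[11.0]]] = 1*3 + 2*4
```

In the module itself the result is then reshaped to `(B, HI, WI, HS, WS)`, giving one sound map per image pixel.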
| 33.828571 | 78 | 0.566301 | 373 | 2,368 | 3.391421 | 0.123324 | 0.156522 | 0.042688 | 0.066403 | 0.803953 | 0.735968 | 0.735968 | 0.664032 | 0.63083 | 0.63083 | 0 | 0.011364 | 0.293919 | 2,368 | 69 | 79 | 34.318841 | 0.745215 | 0.032939 | 0 | 0.649123 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.140351 | false | 0 | 0.052632 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ab0e1d7b2467c461e2ae0bafc7f55825abc7a4c0 | 10,619 | py | Python | pynaoqi-python2.7-2.5.5.5-linux64/lib/python2.7/dist-packages/diagnostic_msgs/msg/_DiagnosticStatus.py | applejenny66/docker_pepper | 2469cc4db6585161a31ac44c8fcf2605d71318b1 | [
"MIT"
] | null | null | null | pynaoqi-python2.7-2.5.5.5-linux64/lib/python2.7/dist-packages/diagnostic_msgs/msg/_DiagnosticStatus.py | applejenny66/docker_pepper | 2469cc4db6585161a31ac44c8fcf2605d71318b1 | [
"MIT"
] | null | null | null | pynaoqi-python2.7-2.5.5.5-linux64/lib/python2.7/dist-packages/diagnostic_msgs/msg/_DiagnosticStatus.py | applejenny66/docker_pepper | 2469cc4db6585161a31ac44c8fcf2605d71318b1 | [
"MIT"
] | null | null | null | """autogenerated by genpy from diagnostic_msgs/DiagnosticStatus.msg. Do not edit."""
import sys
python3 = True if sys.hexversion > 0x03000000 else False
import genpy
import struct
import diagnostic_msgs.msg
class DiagnosticStatus(genpy.Message):
_md5sum = "d0ce08bc6e5ba34c7754f563a9cabaf1"
_type = "diagnostic_msgs/DiagnosticStatus"
_has_header = False #flag to mark the presence of a Header object
_full_text = """# This message holds the status of an individual component of the robot.
#
# Possible levels of operations
byte OK=0
byte WARN=1
byte ERROR=2
byte STALE=3
byte level # level of operation enumerated above
string name # a description of the test/component reporting
string message # a description of the status
string hardware_id # a hardware unique string
KeyValue[] values # an array of values associated with the status
================================================================================
MSG: diagnostic_msgs/KeyValue
string key # what to label this value when viewing
string value # a value to track over time
"""
# Pseudo-constants
OK = 0
WARN = 1
ERROR = 2
STALE = 3
__slots__ = ['level','name','message','hardware_id','values']
_slot_types = ['byte','string','string','string','diagnostic_msgs/KeyValue[]']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
set to None will be assigned a default value. The recommend
use is keyword arguments as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
level,name,message,hardware_id,values
:param args: complete set of field values, in .msg order
:param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(DiagnosticStatus, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.level is None:
self.level = 0
if self.name is None:
self.name = ''
if self.message is None:
self.message = ''
if self.hardware_id is None:
self.hardware_id = ''
if self.values is None:
self.values = []
else:
self.level = 0
self.name = ''
self.message = ''
self.hardware_id = ''
self.values = []
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
:param buff: buffer, ``StringIO``
"""
try:
buff.write(_struct_b.pack(self.level))
_x = self.name
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.message
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.hardware_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
length = len(self.values)
buff.write(_struct_I.pack(length))
for val1 in self.values:
_x = val1.key
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = val1.value
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(_x))))
except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(_x))))
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
:param str: byte array of serialized message, ``str``
"""
try:
if self.values is None:
self.values = None
end = 0
start = end
end += 1
(self.level,) = _struct_b.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.name = str[start:end].decode('utf-8')
else:
self.name = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.message = str[start:end].decode('utf-8')
else:
self.message = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.hardware_id = str[start:end].decode('utf-8')
else:
self.hardware_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
self.values = []
for i in range(0, length):
val1 = diagnostic_msgs.msg.KeyValue()
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
val1.key = str[start:end].decode('utf-8')
else:
val1.key = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
val1.value = str[start:end].decode('utf-8')
else:
val1.value = str[start:end]
self.values.append(val1)
return self
except struct.error as e:
raise genpy.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
:param buff: buffer, ``StringIO``
:param numpy: numpy python module
"""
try:
buff.write(_struct_b.pack(self.level))
_x = self.name
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.message
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = self.hardware_id
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
length = len(self.values)
buff.write(_struct_I.pack(length))
for val1 in self.values:
_x = val1.key
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
_x = val1.value
length = len(_x)
if python3 or type(_x) == unicode:
_x = _x.encode('utf-8')
length = len(_x)
if python3:
buff.write(struct.pack('<I%sB'%length, length, *_x))
else:
buff.write(struct.pack('<I%ss'%length, length, _x))
except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(_x))))
except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(_x))))
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
:param str: byte array of serialized message, ``str``
:param numpy: numpy python module
"""
try:
if self.values is None:
self.values = None
end = 0
start = end
end += 1
(self.level,) = _struct_b.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.name = str[start:end].decode('utf-8')
else:
self.name = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.message = str[start:end].decode('utf-8')
else:
self.message = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
self.hardware_id = str[start:end].decode('utf-8')
else:
self.hardware_id = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
self.values = []
for i in range(0, length):
val1 = diagnostic_msgs.msg.KeyValue()
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
val1.key = str[start:end].decode('utf-8')
else:
val1.key = str[start:end]
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
start = end
end += length
if python3:
val1.value = str[start:end].decode('utf-8')
else:
val1.value = str[start:end]
self.values.append(val1)
return self
except struct.error as e:
raise genpy.DeserializationError(e) #most likely buffer underfill
_struct_I = genpy.struct_I
_struct_b = struct.Struct("<b")
| 31.140762 | 123 | 0.577361 | 1,415 | 10,619 | 4.207067 | 0.132862 | 0.077944 | 0.062825 | 0.040316 | 0.70687 | 0.70687 | 0.675962 | 0.671258 | 0.658156 | 0.641021 | 0 | 0.016212 | 0.285526 | 10,619 | 340 | 124 | 31.232353 | 0.76842 | 0.116772 | 0 | 0.836237 | 1 | 0 | 0.117424 | 0.021034 | 0 | 0 | 0.001084 | 0 | 0 | 1 | 0.020906 | false | 0 | 0.013937 | 0 | 0.083624 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ab0eb1ba97301839d4b3a0b26001febbc8617a43 | 52 | py | Python | samples/__init__.py | mbenbernard/contracts | 13cbdc74c70f8133f9e8d87bef2766941a814c49 | [
"Apache-2.0"
] | 1 | 2021-03-08T08:45:12.000Z | 2021-03-08T08:45:12.000Z | samples/__init__.py | mbenbernard/contracts | 13cbdc74c70f8133f9e8d87bef2766941a814c49 | [
"Apache-2.0"
] | 1 | 2017-08-30T01:49:09.000Z | 2017-08-30T01:49:09.000Z | samples/__init__.py | mbenbernard/contracts | 13cbdc74c70f8133f9e8d87bef2766941a814c49 | [
"Apache-2.0"
] | null | null | null | # Copyright 2017 Benoit Bernard All Rights Reserved. | 52 | 52 | 0.826923 | 7 | 52 | 6.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 0.134615 | 52 | 1 | 52 | 52 | 0.866667 | 0.961538 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ab1230b10d948293bf80a88dbeb355f33b8d9231 | 10,965 | py | Python | networkx/generators/social.py | armando1793/networkx | 48326e1761c08d7a073aec53f7a644baf2249ef6 | [
"BSD-3-Clause"
] | 184 | 2017-12-20T21:50:06.000Z | 2022-03-19T13:24:58.000Z | networkx/generators/social.py | armando1793/networkx | 48326e1761c08d7a073aec53f7a644baf2249ef6 | [
"BSD-3-Clause"
] | 26 | 2020-03-24T18:07:06.000Z | 2022-03-12T00:12:27.000Z | networkx/generators/social.py | armando1793/networkx | 48326e1761c08d7a073aec53f7a644baf2249ef6 | [
"BSD-3-Clause"
] | 136 | 2018-01-09T22:52:06.000Z | 2022-02-24T13:26:18.000Z | """
Famous social networks.
"""
import networkx as nx
__author__ = """\n""".join(['Jordi Torrents <jtorrents@milnou.net>',
'Katy Bold <kbold@princeton.edu>',
'Aric Hagberg <aric.hagberg@gmail.com)'])
__all__ = ['karate_club_graph', 'davis_southern_women_graph',
'florentine_families_graph']
def karate_club_graph():
"""Return Zachary's Karate Club graph.
Each node in the returned graph has a node attribute 'club' that
indicates the name of the club to which the member represented by that node
belongs, either 'Mr. Hi' or 'Officer'.
Examples
--------
To get the name of the club to which a node belongs::
>>> import networkx as nx
>>> G = nx.karate_club_graph()
>>> G.nodes[5]['club']
'Mr. Hi'
>>> G.nodes[9]['club']
'Officer'
References
----------
.. [1] Zachary, Wayne W.
"An Information Flow Model for Conflict and Fission in Small Groups."
*Journal of Anthropological Research*, 33, 452--473, (1977).
.. [2] Data file from:
http://vlado.fmf.uni-lj.si/pub/networks/data/Ucinet/UciData.htm
"""
# Create the set of all members, and the members of each club.
all_members = set(range(34))
club1 = {0, 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13, 16, 17, 19, 21}
# club2 = all_members - club1
G = nx.Graph()
G.add_nodes_from(all_members)
G.name = "Zachary's Karate Club"
zacharydat = """\
0 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0
1 0 1 1 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 0 0 0 0 0 1 0 0 0
1 1 0 1 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0
1 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
1 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 0 1 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 1
0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 1 1
0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1
1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 1
0 0 1 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 1 0 1 0 1 1 0 0 0 0 0 1 1 1 0 1
0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 1 0 0 1 1 1 0 1 1 0 0 1 1 1 1 1 1 1 0"""
for row, line in enumerate(zacharydat.split('\n')):
thisrow = [int(b) for b in line.split()]
for col, entry in enumerate(thisrow):
if entry == 1:
G.add_edge(row, col)
# Add the name of each member's club as a node attribute.
for v in G:
G.nodes[v]['club'] = 'Mr. Hi' if v in club1 else 'Officer'
return G
def davis_southern_women_graph():
"""Return Davis Southern women social network.
This is a bipartite graph.
References
----------
.. [1] A. Davis, Gardner, B. B., Gardner, M. R., 1941. Deep South.
University of Chicago Press, Chicago, IL.
"""
G = nx.Graph()
# Top nodes
women = ["Evelyn Jefferson",
"Laura Mandeville",
"Theresa Anderson",
"Brenda Rogers",
"Charlotte McDowd",
"Frances Anderson",
"Eleanor Nye",
"Pearl Oglethorpe",
"Ruth DeSand",
"Verne Sanderson",
"Myra Liddel",
"Katherina Rogers",
"Sylvia Avondale",
"Nora Fayette",
"Helen Lloyd",
"Dorothy Murchison",
"Olivia Carleton",
"Flora Price"]
G.add_nodes_from(women, bipartite=0)
# Bottom nodes
events = ["E1",
"E2",
"E3",
"E4",
"E5",
"E6",
"E7",
"E8",
"E9",
"E10",
"E11",
"E12",
"E13",
"E14"]
G.add_nodes_from(events, bipartite=1)
G.add_edges_from([("Evelyn Jefferson", "E1"),
("Evelyn Jefferson", "E2"),
("Evelyn Jefferson", "E3"),
("Evelyn Jefferson", "E4"),
("Evelyn Jefferson", "E5"),
("Evelyn Jefferson", "E6"),
("Evelyn Jefferson", "E8"),
("Evelyn Jefferson", "E9"),
("Laura Mandeville", "E1"),
("Laura Mandeville", "E2"),
("Laura Mandeville", "E3"),
("Laura Mandeville", "E5"),
("Laura Mandeville", "E6"),
("Laura Mandeville", "E7"),
("Laura Mandeville", "E8"),
("Theresa Anderson", "E2"),
("Theresa Anderson", "E3"),
("Theresa Anderson", "E4"),
("Theresa Anderson", "E5"),
("Theresa Anderson", "E6"),
("Theresa Anderson", "E7"),
("Theresa Anderson", "E8"),
("Theresa Anderson", "E9"),
("Brenda Rogers", "E1"),
("Brenda Rogers", "E3"),
("Brenda Rogers", "E4"),
("Brenda Rogers", "E5"),
("Brenda Rogers", "E6"),
("Brenda Rogers", "E7"),
("Brenda Rogers", "E8"),
("Charlotte McDowd", "E3"),
("Charlotte McDowd", "E4"),
("Charlotte McDowd", "E5"),
("Charlotte McDowd", "E7"),
("Frances Anderson", "E3"),
("Frances Anderson", "E5"),
("Frances Anderson", "E6"),
("Frances Anderson", "E8"),
("Eleanor Nye", "E5"),
("Eleanor Nye", "E6"),
("Eleanor Nye", "E7"),
("Eleanor Nye", "E8"),
("Pearl Oglethorpe", "E6"),
("Pearl Oglethorpe", "E8"),
("Pearl Oglethorpe", "E9"),
("Ruth DeSand", "E5"),
("Ruth DeSand", "E7"),
("Ruth DeSand", "E8"),
("Ruth DeSand", "E9"),
("Verne Sanderson", "E7"),
("Verne Sanderson", "E8"),
("Verne Sanderson", "E9"),
("Verne Sanderson", "E12"),
("Myra Liddel", "E8"),
("Myra Liddel", "E9"),
("Myra Liddel", "E10"),
("Myra Liddel", "E12"),
("Katherina Rogers", "E8"),
("Katherina Rogers", "E9"),
("Katherina Rogers", "E10"),
("Katherina Rogers", "E12"),
("Katherina Rogers", "E13"),
("Katherina Rogers", "E14"),
("Sylvia Avondale", "E7"),
("Sylvia Avondale", "E8"),
("Sylvia Avondale", "E9"),
("Sylvia Avondale", "E10"),
("Sylvia Avondale", "E12"),
("Sylvia Avondale", "E13"),
("Sylvia Avondale", "E14"),
("Nora Fayette", "E6"),
("Nora Fayette", "E7"),
("Nora Fayette", "E9"),
("Nora Fayette", "E10"),
("Nora Fayette", "E11"),
("Nora Fayette", "E12"),
("Nora Fayette", "E13"),
("Nora Fayette", "E14"),
("Helen Lloyd", "E7"),
("Helen Lloyd", "E8"),
("Helen Lloyd", "E10"),
("Helen Lloyd", "E11"),
("Helen Lloyd", "E12"),
("Dorothy Murchison", "E8"),
("Dorothy Murchison", "E9"),
("Olivia Carleton", "E9"),
("Olivia Carleton", "E11"),
("Flora Price", "E9"),
("Flora Price", "E11")])
G.graph['top'] = women
G.graph['bottom'] = events
return G
def florentine_families_graph():
"""Return Florentine families graph.
References
----------
.. [1] Ronald L. Breiger and Philippa E. Pattison
Cumulated social roles: The duality of persons and their algebras,1
Social Networks, Volume 8, Issue 3, September 1986, Pages 215-256
"""
G = nx.Graph()
G.add_edge('Acciaiuoli', 'Medici')
G.add_edge('Castellani', 'Peruzzi')
G.add_edge('Castellani', 'Strozzi')
G.add_edge('Castellani', 'Barbadori')
G.add_edge('Medici', 'Barbadori')
G.add_edge('Medici', 'Ridolfi')
G.add_edge('Medici', 'Tornabuoni')
G.add_edge('Medici', 'Albizzi')
G.add_edge('Medici', 'Salviati')
G.add_edge('Salviati', 'Pazzi')
G.add_edge('Peruzzi', 'Strozzi')
G.add_edge('Peruzzi', 'Bischeri')
G.add_edge('Strozzi', 'Ridolfi')
G.add_edge('Strozzi', 'Bischeri')
G.add_edge('Ridolfi', 'Tornabuoni')
G.add_edge('Tornabuoni', 'Guadagni')
G.add_edge('Albizzi', 'Ginori')
G.add_edge('Albizzi', 'Guadagni')
G.add_edge('Bischeri', 'Guadagni')
G.add_edge('Guadagni', 'Lamberteschi')
return G
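`karate_club_graph` above builds its edges by parsing a whitespace-separated adjacency matrix embedded as a string. The same parsing idea can be sketched standalone on a tiny 3-node triangle, without networkx (here deduplicating the symmetric matrix into one tuple per undirected edge, whereas the function above relies on `Graph.add_edge` to ignore the duplicate direction):

```python
# Standalone sketch of the adjacency-matrix parsing used above,
# on a 3-node triangle (stdlib only).
matrix = """\
0 1 1
1 0 1
1 1 0"""

edges = set()
for row, line in enumerate(matrix.split('\n')):
    for col, entry in enumerate(int(b) for b in line.split()):
        if entry == 1 and row < col:   # keep each undirected edge once
            edges.add((row, col))

print(sorted(edges))  # [(0, 1), (0, 2), (1, 2)]
```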
# Source: codes_auto/301.remove-invalid-parentheses.py (smartmark-pro/leetcode_record, MIT)
#
# @lc app=leetcode.cn id=301 lang=python3
#
# [301] remove-invalid-parentheses
#
None
# @lc code=end
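
The stub above ships with `None` in place of a solution body. A standard approach to LeetCode 301 (remove the minimum number of parentheses so the string becomes valid) is breadth-first search over strings with one character deleted per level; this is a sketch, not the repository's own solution:

```python
class Solution:
    def removeInvalidParentheses(self, s: str) -> list:
        def valid(t: str) -> bool:
            # Valid when ')' never outnumbers '(' and counts balance out
            count = 0
            for ch in t:
                if ch == '(':
                    count += 1
                elif ch == ')':
                    count -= 1
                    if count < 0:
                        return False
            return count == 0

        # BFS: the first level that contains any valid string uses the
        # minimum number of removals, so return all valid strings there
        level = {s}
        while True:
            found = [t for t in level if valid(t)]
            if found:
                return found
            level = {t[:i] + t[i + 1:] for t in level for i in range(len(t))}

print(sorted(Solution().removeInvalidParentheses("()())()")))  # ['(())()', '()()()']
```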
# Source: client/py_client/utils/interceptors/__init__.py (thefstock/FirstockPy, MIT)
from .interceptor import *
from .http_error import *

# Source: skdecide/solvers.py (jeromerobert/scikit-decide, MIT)
# Copyright (c) AIRBUS and its affiliates.
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""This module contains base classes for quickly building solvers."""
from __future__ import annotations
from typing import List, Callable
from skdecide.core import D, autocast_all, autocastable
from skdecide.domains import Domain
from skdecide.builders.solver.policy import DeterministicPolicies
__all__ = ['Solver', 'DeterministicPolicySolver']
# MAIN BASE CLASS
class Solver:
"""This is the highest level solver class (inheriting top-level class for each mandatory solver characteristic).
This helper class can be used as the main base class for solvers.
Typical use:
```python
class MySolver(Solver, ...)
```
    with "..." replaced when needed by a number of classes from the following solver characteristics (the ones in
    parentheses are optional):
- **(assessability)**: Utilities -> QValues
- **(policy)**: Policies -> UncertainPolicies -> DeterministicPolicies
- **(restorability)**: Restorable
"""
T_domain = Domain
@classmethod
def get_domain_requirements(cls) -> List[type]:
"""Get domain requirements for this solver class to be applicable.
Domain requirements are classes from the #skdecide.builders.domain package that the domain needs to inherit from.
# Returns
A list of classes to inherit from.
"""
return cls._get_domain_requirements()
@classmethod
def _get_domain_requirements(cls) -> List[type]:
"""Get domain requirements for this solver class to be applicable.
Domain requirements are classes from the #skdecide.builders.domain package that the domain needs to inherit from.
# Returns
A list of classes to inherit from.
"""
def is_domain_builder(cls): # detected by having only single-'base class' ancestors until root
remove_ancestors = []
while True:
bases = cls.__bases__
if len(bases) == 0:
return True, remove_ancestors
elif len(bases) == 1:
cls = bases[0]
remove_ancestors.append(cls)
else:
return False, []
i = 0
sorted_ancestors = list(cls.T_domain.__mro__[:-1])
while i < len(sorted_ancestors):
ancestor = sorted_ancestors[i]
is_builder, remove_ancestors = is_domain_builder(ancestor)
if is_builder:
sorted_ancestors = [a for a in sorted_ancestors if a not in remove_ancestors]
i += 1
else:
sorted_ancestors.remove(ancestor)
return sorted_ancestors
@classmethod
def check_domain(cls, domain: Domain) -> bool:
"""Check whether a domain is compliant with this solver type.
By default, #Solver.check_domain() provides some boilerplate code and internally
calls #Solver._check_domain_additional() (which returns True by default but can be overridden to define
specific checks in addition to the "domain requirements"). The boilerplate code automatically checks whether all
domain requirements are met.
# Parameters
domain: The domain to check.
# Returns
True if the domain is compliant with the solver type (False otherwise).
"""
return cls._check_domain(domain)
@classmethod
def _check_domain(cls, domain: Domain) -> bool:
"""Check whether a domain is compliant with this solver type.
By default, #Solver._check_domain() provides some boilerplate code and internally
calls #Solver._check_domain_additional() (which returns True by default but can be overridden to define specific
checks in addition to the "domain requirements"). The boilerplate code automatically checks whether all domain
requirements are met.
# Parameters
domain: The domain to check.
# Returns
True if the domain is compliant with the solver type (False otherwise).
"""
check_requirements = all(isinstance(domain, req) for req in cls._get_domain_requirements())
return check_requirements and cls._check_domain_additional(domain)
@classmethod
def _check_domain_additional(cls, domain: D) -> bool:
"""Check whether the given domain is compliant with the specific requirements of this solver type (i.e. the
ones in addition to "domain requirements").
This is a helper function called by default from #Solver._check_domain(). It focuses on specific checks, as
opposed to taking also into account the domain requirements for the latter.
# Parameters
domain: The domain to check.
# Returns
True if the domain is compliant with the specific requirements of this solver type (False otherwise).
"""
return True
def reset(self) -> None:
"""Reset whatever is needed on this solver before running a new episode.
This function does nothing by default but can be overridden if needed (e.g. to reset the hidden state of a LSTM
policy network, which carries information about past observations seen in the previous episode).
"""
return self._reset()
def _reset(self) -> None:
"""Reset whatever is needed on this solver before running a new episode.
This function does nothing by default but can be overridden if needed (e.g. to reset the hidden state of a LSTM
policy network, which carries information about past observations seen in the previous episode).
"""
pass
def solve(self, domain_factory: Callable[[], Domain]) -> None:
"""Run the solving process.
By default, #Solver.solve() provides some boilerplate code and internally calls #Solver._solve(). The
boilerplate code transforms the domain factory to auto-cast the new domains to the level expected by the solver.
# Parameters
domain_factory: A callable with no argument returning the domain to solve (can be just a domain class).
!!! tip
            The nature of the solutions produced here depends on the solver's other characteristics, like
            #policy and #assessability.
"""
return self._solve(domain_factory)
def _solve(self, domain_factory: Callable[[], Domain]) -> None:
"""Run the solving process.
By default, #Solver._solve() provides some boilerplate code and internally calls #Solver._solve_domain(). The
boilerplate code transforms the domain factory to auto-cast the new domains to the level expected by the solver.
# Parameters
domain_factory: A callable with no argument returning the domain to solve (can be just a domain class).
!!! tip
            The nature of the solutions produced here depends on the solver's other characteristics, like
            #policy and #assessability.
"""
def cast_domain_factory():
domain = domain_factory()
autocast_all(domain, domain, self.T_domain)
return domain
return self._solve_domain(cast_domain_factory)
def _solve_domain(self, domain_factory: Callable[[], D]) -> None:
"""Run the solving process.
This is a helper function called by default from #Solver._solve(), the difference being that the domain factory
here returns domains auto-cast to the level expected by the solver.
# Parameters
domain_factory: A callable with no argument returning the domain to solve (auto-cast to expected level).
!!! tip
            The nature of the solutions produced here depends on the solver's other characteristics, like
            #policy and #assessability.
"""
raise NotImplementedError
@autocastable
def solve_from(self, memory: D.T_memory[D.T_state]) -> None:
"""Run the solving process from a given state.
!!! tip
Create the domain first by calling the @Solver.reset() method
# Parameters
memory: The source memory (state or history) of the transition.
!!! tip
            The nature of the solutions produced here depends on the solver's other characteristics, like
            #policy and #assessability.
"""
return self._solve_from(memory)
def _solve_from(self, memory: D.T_memory[D.T_state]) -> None:
"""Run the solving process from a given state.
!!! tip
Create the domain first by calling the @Solver.reset() method
# Parameters
memory: The source memory (state or history) of the transition.
!!! tip
            The nature of the solutions produced here depends on the solver's other characteristics, like
            #policy and #assessability.
"""
pass
def _initialize(self):
        """Run long-lasting initialization code here, or code to be executed
        when entering a 'with' context statement.
        """
pass
def _cleanup(self):
        """Run cleanup code here, or code to be executed when exiting a
        'with' context statement.
        """
pass
def __enter__(self):
"""Allow for calling the solver within a 'with' context statement.
Note that some solvers require such context statements to properly
clean their status before exiting the Python interpreter, thus it
is a good habit to always call solvers within a 'with' statement.
"""
self._initialize()
return self
def __exit__(self, type, value, tb):
"""Allow for calling the solver within a 'with' context statement.
Note that some solvers require such context statements to properly
clean their status before exiting the Python interpreter, thus it
is a good habit to always call solvers within a 'with' statement.
"""
self._cleanup()
# ALTERNATE BASE CLASSES (for typical combinations)
class DeterministicPolicySolver(Solver, DeterministicPolicies):
"""This is a typical deterministic policy solver class.
    This helper class can be used as an alternate base class for solvers, inheriting the following:
- Solver
- DeterministicPolicies
Typical use:
```python
class MySolver(DeterministicPolicySolver)
```
!!! tip
It is also possible to refine any alternate base class, like for instance:
```python
class MySolver(DeterministicPolicySolver, QValues)
```
"""
pass
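
`Solver.check_domain` reduces to isinstance tests against every class returned by `_get_domain_requirements()`, followed by the solver-specific hook. A standalone sketch of that mechanism (the class names here are illustrative, not part of scikit-decide):

```python
class SingleAgent: ...
class FullyObservable: ...

class GridWorld(SingleAgent, FullyObservable): ...

class MySolver:
    # Stand-in for what _get_domain_requirements() would return
    domain_requirements = [SingleAgent, FullyObservable]

    @classmethod
    def check_domain(cls, domain) -> bool:
        # Mirrors Solver._check_domain: every requirement class must be an
        # ancestor of the domain, then the additional hook may veto
        meets_requirements = all(
            isinstance(domain, req) for req in cls.domain_requirements
        )
        return meets_requirements and cls._check_domain_additional(domain)

    @classmethod
    def _check_domain_additional(cls, domain) -> bool:
        return True  # override for solver-specific checks

print(MySolver.check_domain(GridWorld()))  # True
print(MySolver.check_domain(object()))     # False
```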
# Source: pyHMT2D/cli/__init__.py (ali-mahdavi-mazdeh/pyHMT2D, MIT)
from .ras_to_srh import ras_to_srh
from .hmt_calibrate import hmt_calibrate
from .srh_mesh_to_vtk import hmt_srh_mesh_to_vtk
from .srh_to_vtk import hmt_srh_to_vtk
__all__ = ["ras_to_srh", "hmt_calibrate", "hmt_srh_mesh_to_vtk", "hmt_srh_to_vtk"]

# Source: streamz_ext/__init__.py (xpdAcq/streamz_ext, BSD-3-Clause)
__version__ = '0.2.1'
from .core import *
# Source: kokki/providers/service/redhat.py (samuel/kokki, BSD-3-Clause)
__all__ = ["RedhatServiceProvider"]
from kokki.providers.service import ServiceProvider
class RedhatServiceProvider(ServiceProvider):
def enable_runlevel(self, runlevel):
pass
# Source: quantfin/portfolio/__init__.py (gusamarante/QuantFin, MIT)
from quantfin.portfolio.asset_allocation import Markowitz, BlackLitterman
from quantfin.portfolio.performance import Performance
from quantfin.portfolio.construction import HRP, PrincipalPortfolios
__all__ = ['Markowitz', 'Performance', 'HRP', 'BlackLitterman', 'PrincipalPortfolios']
# Source: fgietAdmission/templatetags/my_filters.py (rpsingh21/Fgiet-Admission, MIT)
from django import template
from django.template.loader import get_template
register = template.Library()
@register.filter(name='is_deploma')
def is_deploma(value):
return value['year'] == '2' and value['course'] == 'BTech'
@register.filter(name='is_first_year')
def is_first_year(value):
return value['year'] == '1' and value['course'] == 'BTech'
@register.filter(name='is_mca')
def is_mca(value):
return value['course'] == 'MCA'
@register.filter(name='is_btech_first_year')
def is_btech_first_year(instance):
    return instance.applyYear == '1' and instance.course == 'BTech'
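
Registered filters like these are invoked from templates (e.g. `{{ candidate|is_btech_first_year }}`), but the predicates themselves are plain functions. A Django-free sketch of the same logic, using a hypothetical `value` dict:

```python
def is_first_year(value):
    # Same predicate as the registered @register.filter above
    return value['year'] == '1' and value['course'] == 'BTech'

def is_mca(value):
    return value['course'] == 'MCA'

print(is_first_year({'year': '1', 'course': 'BTech'}))  # True
print(is_mca({'year': '1', 'course': 'BTech'}))         # False
```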
# Source: iscan/apps.py (MontrealCorpusTools/iscan-server, MIT)
from django.apps import AppConfig
import logging
log = logging.getLogger(__name__)
class IscanConfig(AppConfig):
name = 'iscan'
# Source: students/K33422/Khusnutdinov_Sergei/Lab_03/birds/birds_app/apps.py (DanteLeapman/ITMO_ICT_WebDevelopment_2020-2021, MIT)
from django.apps import AppConfig
class BirdsAppConfig(AppConfig):
name = 'birds_app'
# Source: aula#22/desafio113/leitura.py (daramariabs/exercicios-python, MIT)
def leiaInt():
loop = True
while loop:
try:
numero = int(input("Digite um valor inteiro:"))
return numero
except (ValueError, TypeError):
print("ERRO! Digite um numero inteiro válido.")
except KeyboardInterrupt:
#return "O usuario não informou este numero.\n"
loop = False
print("O usuario não informou este numero.")
return 0
def leiaFloat():
loop = True
while loop:
try:
numero = float(input("Digite um valor real:"))
return numero
except (ValueError, TypeError):
print("ERRO! Digite um numero real válido.")
except KeyboardInterrupt:
#return "O usuario não informou este numero.\n"
loop = False
print("\nO usuario não informou este numero.")
            return 0
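
Both readers above repeat the same retry loop; the pattern generalizes to any converter, and injecting the reader makes it testable without a terminal. A sketch (not part of the original exercise):

```python
def read_value(convert, prompt='Digite um valor: ', reader=input):
    """Ask again until `convert` accepts the answer; return 0 on Ctrl+C."""
    while True:
        try:
            return convert(reader(prompt))
        except (ValueError, TypeError):
            print('ERRO! Valor inválido.')
        except KeyboardInterrupt:
            print('\nO usuario não informou este numero.')
            return 0

# A scripted reader stands in for the keyboard
answers = iter(['abc', '42'])
print(read_value(int, reader=lambda _: next(answers)))  # 42
```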
# Source: stellar_base/__init__.py (bic2007/py-stellar-base, Apache-2.0)
from .version import __version__
from .keypair import Keypair
from .builder import Builder
from .address import Address
from .horizon import Horizon
# Source: knn_indexing/__settings__.py (CACppuccino/NewsAggregatorSem4, Apache-2.0)
MODEL_URL = "https://tfhub.dev/google/nnlm-en-dim50/2"
MODEL_DIM = 50 # Keep this updated when changing the model
URL = "https://admin:admin@localhost:9200"

# Source: or_gym/envs/registry.py (CanLi1/or-gym, MIT)
from gym.envs.registration import EnvRegistry
registry = EnvRegistry()
def register(id, **kwargs):
return registry.register(id, **kwargs)
def make(id, **kwargs):
return registry.make(id, **kwargs)
def spec(id):
    return registry.spec(id)
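
These wrappers just delegate to gym's `EnvRegistry`. The register/make flow can be shown with a minimal stand-in registry (not gym's actual class; `Knapsack` here is a made-up environment):

```python
# Minimal stand-in for gym's EnvRegistry, showing the register/make flow
class MiniRegistry:
    def __init__(self):
        self._specs = {}

    def register(self, id, entry_point=None, **kwargs):
        # Store the constructor and its default kwargs under the env id
        self._specs[id] = (entry_point, kwargs)

    def make(self, id, **overrides):
        entry_point, kwargs = self._specs[id]
        return entry_point(**{**kwargs, **overrides})

registry = MiniRegistry()

class Knapsack:
    def __init__(self, capacity=50):
        self.capacity = capacity

registry.register("Knapsack-v0", entry_point=Knapsack, capacity=100)
print(registry.make("Knapsack-v0").capacity)             # 100
print(registry.make("Knapsack-v0", capacity=5).capacity) # 5
```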
# Source: helot/mysql/__init__.py (jpazarzis/helot_mysql, MIT)
from .wrappers import execute_query
from .wrappers import db_connection
from .wrappers import make_query_executor
from .wrappers import make_non_query_executor
from .wrappers import query_executor_user
__all__ = [
'execute_query',
'db_connection',
'make_query_executor',
'make_non_query_executor',
'query_executor_user'
]
# Source: affirm/main.py (dominictarro/Affirm, MIT)
# Package imports
import os
# Local imports
from endpoints import app
if __name__ == "__main__":
app.run()
# Source: tests/test_auth.py (fyntex/gcp-utils-python, MIT)
from unittest import TestCase
from fd_gcp.auth import ( # noqa: F401
get_env_default_credentials, get_env_project_id, get_gce_credentials,
load_credentials_from_file,
)
class FunctionsTestCase(TestCase):
def test_get_env_default_credentials(self) -> None:
# TODO: implement test
# get_env_default_credentials()
pass
def test_get_env_project_id(self) -> None:
# TODO: implement test
# get_env_project_id()
pass
def test_get_gce_credentials(self) -> None:
# TODO: implement test
# get_gce_credentials()
pass
def test_load_credentials_from_file(self) -> None:
# TODO: implement test
# load_credentials_from_file()
pass
| 24.8 | 73 | 0.674731 | 91 | 744 | 5.098901 | 0.296703 | 0.077586 | 0.086207 | 0.181034 | 0.538793 | 0.241379 | 0.241379 | 0 | 0 | 0 | 0 | 0.005415 | 0.255376 | 744 | 29 | 74 | 25.655172 | 0.83213 | 0.263441 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 1 | 0.285714 | false | 0.285714 | 0.142857 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
91c2643dc4fe3db21557f6c3b1881177b8d87ac7 | 6,101 | py | Python | tests/test_cli.py | bollwyvl/pathy | 36c8de95572047862557f4009103db1037816f78 | [
"Apache-2.0"
] | null | null | null | tests/test_cli.py | bollwyvl/pathy | 36c8de95572047862557f4009103db1037816f78 | [
"Apache-2.0"
] | null | null | null | tests/test_cli.py | bollwyvl/pathy | 36c8de95572047862557f4009103db1037816f78 | [
"Apache-2.0"
] | null | null | null | import pytest
from typer.testing import CliRunner
from pathy import Pathy
from pathy.cli import app
from .conftest import TEST_ADAPTERS
runner = CliRunner()
# TODO: add a test for wildcard cp/mv/rm/ls paths (e.g. "pathy cp gs://my-bucket/*.file ./")
# TODO: add a test for streaming in/out sources (e.g. "pathy cp - gs://my-bucket/my.file")
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_cp_file(with_adapter, bucket: str):
source = f"gs://{bucket}/cli_cp_file/file.txt"
destination = f"gs://{bucket}/cli_cp_file/other.txt"
Pathy(source).write_text("---")
assert runner.invoke(app, ["cp", source, destination]).exit_code == 0
assert Pathy(source).exists()
assert Pathy(destination).is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_cp_folder(with_adapter, bucket: str):
root = Pathy.from_bucket(bucket)
source = root / "cli_cp_folder"
destination = root / "cli_cp_folder_other"
for i in range(2):
for j in range(2):
(source / f"{i}" / f"{j}").write_text("---")
assert runner.invoke(app, ["cp", str(source), str(destination)]).exit_code == 0
assert Pathy(source).exists()
assert Pathy(destination).is_dir()
for i in range(2):
for j in range(2):
assert (destination / f"{i}" / f"{j}").is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_mv_folder(with_adapter, bucket: str):
root = Pathy.from_bucket(bucket)
source = root / "cli_mv_folder"
destination = root / "cli_mv_folder_other"
for i in range(2):
for j in range(2):
(source / f"{i}" / f"{j}").write_text("---")
assert runner.invoke(app, ["mv", str(source), str(destination)]).exit_code == 0
assert not Pathy(source).exists()
assert Pathy(destination).is_dir()
# Ensure source files are gone
for i in range(2):
for j in range(2):
assert not (source / f"{i}" / f"{j}").is_file()
# And dest files exist
for i in range(2):
for j in range(2):
assert (destination / f"{i}" / f"{j}").is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_mv_file(with_adapter, bucket: str):
source = f"gs://{bucket}/cli_mv_file/file.txt"
destination = f"gs://{bucket}/cli_mv_file/other.txt"
Pathy(source).write_text("---")
assert Pathy(source).exists()
assert runner.invoke(app, ["mv", source, destination]).exit_code == 0
assert not Pathy(source).exists()
assert Pathy(destination).is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_mv_file_across_buckets(with_adapter, bucket: str, other_bucket: str):
source = f"gs://{bucket}/cli_mv_file_across_buckets/file.txt"
destination = f"gs://{other_bucket}/cli_mv_file_across_buckets/other.txt"
Pathy(source).write_text("---")
assert Pathy(source).exists()
assert runner.invoke(app, ["mv", source, destination]).exit_code == 0
assert not Pathy(source).exists()
assert Pathy(destination).is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_mv_folder_across_buckets(with_adapter, bucket: str, other_bucket: str):
source = Pathy.from_bucket(bucket) / "cli_mv_folder_across_buckets"
destination = Pathy.from_bucket(other_bucket) / "cli_mv_folder_across_buckets"
for i in range(2):
for j in range(2):
(source / f"{i}" / f"{j}").write_text("---")
assert runner.invoke(app, ["mv", str(source), str(destination)]).exit_code == 0
assert not Pathy(source).exists()
assert Pathy(destination).is_dir()
# Ensure source files are gone
for i in range(2):
for j in range(2):
assert not (source / f"{i}" / f"{j}").is_file()
# And dest files exist
for i in range(2):
for j in range(2):
assert (destination / f"{i}" / f"{j}").is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_rm_file(with_adapter, bucket: str):
source = f"gs://{bucket}/cli_rm_file/file.txt"
path = Pathy(source)
path.write_text("---")
assert path.exists()
assert runner.invoke(app, ["rm", source]).exit_code == 0
assert not path.exists()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_rm_verbose(with_adapter, bucket: str):
root = Pathy.from_bucket(bucket) / "cli_rm_folder"
source = str(root / "file.txt")
other = str(root / "folder/other")
Pathy(source).write_text("---")
Pathy(other).write_text("---")
result = runner.invoke(app, ["rm", "-v", source])
assert result.exit_code == 0
assert source in result.output
assert other not in result.output
Pathy(source).write_text("---")
result = runner.invoke(app, ["rm", "-rv", str(root)])
assert result.exit_code == 0
assert source in result.output
assert other in result.output
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_rm_folder(with_adapter, bucket: str):
root = Pathy.from_bucket(bucket)
source = root / "cli_rm_folder"
for i in range(2):
for j in range(2):
(source / f"{i}" / f"{j}").write_text("---")
# Returns exit code 1 without recursive flag when given a folder
assert runner.invoke(app, ["rm", str(source)]).exit_code == 1
assert runner.invoke(app, ["rm", "-r", str(source)]).exit_code == 0
assert not Pathy(source).exists()
# Ensure source files are gone
for i in range(2):
for j in range(2):
assert not (source / f"{i}" / f"{j}").is_file()
@pytest.mark.parametrize("adapter", TEST_ADAPTERS)
def test_cli_ls(with_adapter, bucket: str):
root = Pathy.from_bucket(bucket) / "cli_ls"
one = str(root / "file.txt")
two = str(root / "other.txt")
three = str(root / "folder/file.txt")
Pathy(one).write_text("---")
Pathy(two).write_text("---")
Pathy(three).write_text("---")
result = runner.invoke(app, ["ls", str(root)])
assert result.exit_code == 0
assert one in result.output
assert two in result.output
assert str(root / "folder") in result.output
| 36.975758 | 92 | 0.653172 | 895 | 6,101 | 4.29162 | 0.097207 | 0.036449 | 0.041656 | 0.042958 | 0.808904 | 0.772195 | 0.727154 | 0.695652 | 0.644363 | 0.638115 | 0 | 0.006668 | 0.188822 | 6,101 | 164 | 93 | 37.20122 | 0.769448 | 0.06081 | 0 | 0.553846 | 0 | 0 | 0.120608 | 0.058207 | 0 | 0 | 0 | 0.006098 | 0.323077 | 1 | 0.076923 | false | 0 | 0.038462 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
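Every test in the file above follows the same invoke-and-assert pattern: run the command through a runner, check the exit code, then check the side effects and captured output. A stripped-down, stdlib-only sketch of that pattern (the `cli_main` function is a hypothetical stand-in for the Typer app, not part of pathy):

```python
import contextlib
import io

def cli_main(argv):
    """Toy 'cp'-style entry point standing in for the real CLI app."""
    cmd, *rest = argv
    print(f"{cmd}: {' '.join(rest)}")
    return 0

# Invoke the command, capturing stdout the way CliRunner captures output.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exit_code = cli_main(["cp", "gs://my-bucket/a.txt", "gs://my-bucket/b.txt"])

assert exit_code == 0
assert "gs://my-bucket/a.txt" in buf.getvalue()
```

`typer.testing.CliRunner.invoke` bundles the same two observations (exit code and output) into a single result object, which is what the assertions above inspect.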
37cd4147a36043134723e00c74e3ed2532fbfc1b | 13,952 | py | Python | tests/test_enforcer.py | hanjiangxue20/pycasbin | 6c9beab42e8f1e18e1c94afc96a668c261d6aa27 | [
"Apache-2.0"
] | null | null | null | tests/test_enforcer.py | hanjiangxue20/pycasbin | 6c9beab42e8f1e18e1c94afc96a668c261d6aa27 | [
"Apache-2.0"
] | null | null | null | tests/test_enforcer.py | hanjiangxue20/pycasbin | 6c9beab42e8f1e18e1c94afc96a668c261d6aa27 | [
"Apache-2.0"
] | null | null | null | import os
import time
from unittest import TestCase

import casbin


def get_examples(path):
    examples_path = os.path.split(os.path.realpath(__file__))[0] + "/../examples/"
    return os.path.abspath(examples_path + path)


class TestSub:
    def __init__(self, name, age):
        self.name = name
        self.age = age


class TestCaseBase(TestCase):
    def get_enforcer(self, model=None, adapter=None):
        return casbin.Enforcer(
            model,
            adapter,
        )


class TestConfig(TestCaseBase):
    def test_enforcer_basic(self):
        e = self.get_enforcer(
            get_examples("basic_model.conf"),
            get_examples("basic_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertFalse(e.enforce("alice", "data2", "read"))
        self.assertTrue(e.enforce("bob", "data2", "write"))
        self.assertFalse(e.enforce("bob", "data1", "write"))

    def test_enforce_ex_basic(self):
        e = self.get_enforcer(
            get_examples("basic_model.conf"),
            get_examples("basic_policy.csv"),
        )
        self.assertTupleEqual(
            e.enforce_ex("alice", "data1", "read"), (True, ["alice", "data1", "read"])
        )
        self.assertTupleEqual(e.enforce_ex("alice", "data2", "read"), (False, []))
        self.assertTupleEqual(
            e.enforce_ex("bob", "data2", "write"), (True, ["bob", "data2", "write"])
        )
        self.assertTupleEqual(e.enforce_ex("bob", "data1", "write"), (False, []))

    def test_model_set_load(self):
        e = self.get_enforcer(
            get_examples("basic_model.conf"),
            get_examples("basic_policy.csv"),
        )
        if not isinstance(e, casbin.SyncedEnforcer):
            e.set_model(None)
            self.assertTrue(e.model is None)
            # creating new model
            e.load_model()
            self.assertTrue(e.model is not None)

    def test_enforcer_basic_without_spaces(self):
        e = self.get_enforcer(
            get_examples("basic_model_without_spaces.conf"),
            get_examples("basic_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertFalse(e.enforce("alice", "data1", "write"))
        self.assertFalse(e.enforce("alice", "data2", "read"))
        self.assertFalse(e.enforce("alice", "data2", "write"))
        self.assertFalse(e.enforce("bob", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data1", "write"))
        self.assertFalse(e.enforce("bob", "data2", "read"))
        self.assertTrue(e.enforce("bob", "data2", "write"))

    def test_enforce_basic_with_root(self):
        e = self.get_enforcer(
            get_examples("basic_with_root_model.conf"), get_examples("basic_policy.csv")
        )
        self.assertTrue(e.enforce("root", "any", "any"))

    def test_enforce_basic_without_resources(self):
        e = self.get_enforcer(
            get_examples("basic_without_resources_model.conf"),
            get_examples("basic_without_resources_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "read"))
        self.assertFalse(e.enforce("alice", "write"))
        self.assertTrue(e.enforce("bob", "write"))
        self.assertFalse(e.enforce("bob", "read"))

    def test_enforce_basic_without_users(self):
        e = self.get_enforcer(
            get_examples("basic_without_users_model.conf"),
            get_examples("basic_without_users_policy.csv"),
        )
        self.assertTrue(e.enforce("data1", "read"))
        self.assertFalse(e.enforce("data1", "write"))
        self.assertTrue(e.enforce("data2", "write"))
        self.assertFalse(e.enforce("data2", "read"))

    def test_enforce_ip_match(self):
        e = self.get_enforcer(
            get_examples("ipmatch_model.conf"), get_examples("ipmatch_policy.csv")
        )
        self.assertTrue(e.enforce("192.168.2.1", "data1", "read"))
        self.assertFalse(e.enforce("192.168.3.1", "data1", "read"))

    def test_enforce_key_match(self):
        e = self.get_enforcer(
            get_examples("keymatch_model.conf"), get_examples("keymatch_policy.csv")
        )
        self.assertTrue(e.enforce("alice", "/alice_data/test", "GET"))
        self.assertFalse(e.enforce("alice", "/bob_data/test", "GET"))
        self.assertTrue(e.enforce("cathy", "/cathy_data", "GET"))
        self.assertTrue(e.enforce("cathy", "/cathy_data", "POST"))
        self.assertFalse(e.enforce("cathy", "/cathy_data/12", "POST"))

    def test_enforce_key_match2(self):
        e = self.get_enforcer(
            get_examples("keymatch2_model.conf"), get_examples("keymatch2_policy.csv")
        )
        self.assertTrue(e.enforce("alice", "/alice_data/resource", "GET"))
        self.assertTrue(e.enforce("alice", "/alice_data2/123/using/456", "GET"))

    def test_enforce_priority(self):
        e = self.get_enforcer(
            get_examples("priority_model.conf"), get_examples("priority_policy.csv")
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertFalse(e.enforce("alice", "data1", "write"))
        self.assertFalse(e.enforce("alice", "data2", "read"))
        self.assertFalse(e.enforce("alice", "data2", "write"))
        self.assertFalse(e.enforce("bob", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data1", "write"))
        self.assertTrue(e.enforce("bob", "data2", "read"))
        self.assertFalse(e.enforce("bob", "data2", "write"))

    def test_enforce_priority_indeterminate(self):
        e = self.get_enforcer(
            get_examples("priority_model.conf"),
            get_examples("priority_indeterminate_policy.csv"),
        )
        self.assertFalse(e.enforce("alice", "data1", "read"))

    def test_enforce_rbac(self):
        e = self.get_enforcer(
            get_examples("rbac_model.conf"), get_examples("rbac_policy.csv")
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data1", "read"))
        self.assertTrue(e.enforce("bob", "data2", "write"))
        self.assertTrue(e.enforce("alice", "data2", "read"))
        self.assertTrue(e.enforce("alice", "data2", "write"))
        self.assertFalse(
            e.enforce("bogus", "data2", "write")
        )  # test non-existent subject

    def test_enforce_rbac__empty_policy(self):
        e = self.get_enforcer(
            get_examples("rbac_model.conf"), get_examples("empty_policy.csv")
        )
        self.assertFalse(e.enforce("alice", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data2", "write"))
        self.assertFalse(e.enforce("alice", "data2", "read"))
        self.assertFalse(e.enforce("alice", "data2", "write"))

    def test_enforce_rbac_with_deny(self):
        e = self.get_enforcer(
            get_examples("rbac_with_deny_model.conf"),
            get_examples("rbac_with_deny_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertTrue(e.enforce("bob", "data2", "write"))
        self.assertTrue(e.enforce("alice", "data2", "read"))
        self.assertFalse(e.enforce("alice", "data2", "write"))

    def test_enforce_rbac_with_domains(self):
        e = self.get_enforcer(
            get_examples("rbac_with_domains_model.conf"),
            get_examples("rbac_with_domains_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "domain1", "data1", "read"))
        self.assertTrue(e.enforce("alice", "domain1", "data1", "write"))
        self.assertFalse(e.enforce("alice", "domain1", "data2", "read"))
        self.assertFalse(e.enforce("alice", "domain1", "data2", "write"))
        self.assertFalse(e.enforce("bob", "domain2", "data1", "read"))
        self.assertFalse(e.enforce("bob", "domain2", "data1", "write"))
        self.assertTrue(e.enforce("bob", "domain2", "data2", "read"))
        self.assertTrue(e.enforce("bob", "domain2", "data2", "write"))

    def test_enforce_rbac_with_not_deny(self):
        e = self.get_enforcer(
            get_examples("rbac_with_not_deny_model.conf"),
            get_examples("rbac_with_deny_policy.csv"),
        )
        self.assertFalse(e.enforce("alice", "data2", "write"))

    def test_enforce_rbac_with_resource_roles(self):
        e = self.get_enforcer(
            get_examples("rbac_with_resource_roles_model.conf"),
            get_examples("rbac_with_resource_roles_policy.csv"),
        )
        self.assertTrue(e.enforce("alice", "data1", "read"))
        self.assertTrue(e.enforce("alice", "data1", "write"))
        self.assertFalse(e.enforce("alice", "data2", "read"))
        self.assertTrue(e.enforce("alice", "data2", "write"))
        self.assertFalse(e.enforce("bob", "data1", "read"))
        self.assertFalse(e.enforce("bob", "data1", "write"))
        self.assertFalse(e.enforce("bob", "data2", "read"))
        self.assertTrue(e.enforce("bob", "data2", "write"))

    def test_enforce_rbac_with_pattern(self):
        e = self.get_enforcer(
            get_examples("rbac_with_pattern_model.conf"),
            get_examples("rbac_with_pattern_policy.csv"),
        )

        # set matching function to key_match2
        e.add_named_matching_func("g2", casbin.util.key_match2)
        self.assertTrue(e.enforce("alice", "/book/1", "GET"))
        self.assertTrue(e.enforce("alice", "/book/2", "GET"))
        self.assertTrue(e.enforce("alice", "/pen/1", "GET"))
        self.assertFalse(e.enforce("alice", "/pen/2", "GET"))
        self.assertFalse(e.enforce("bob", "/book/1", "GET"))
        self.assertFalse(e.enforce("bob", "/book/2", "GET"))
        self.assertTrue(e.enforce("bob", "/pen/1", "GET"))
        self.assertTrue(e.enforce("bob", "/pen/2", "GET"))

        # replace key_match2 with key_match3
        e.add_named_matching_func("g2", casbin.util.key_match3)
        self.assertTrue(e.enforce("alice", "/book2/1", "GET"))
        self.assertTrue(e.enforce("alice", "/book2/2", "GET"))
        self.assertTrue(e.enforce("alice", "/pen2/1", "GET"))
        self.assertFalse(e.enforce("alice", "/pen2/2", "GET"))
        self.assertFalse(e.enforce("bob", "/book2/1", "GET"))
        self.assertFalse(e.enforce("bob", "/book2/2", "GET"))
        self.assertTrue(e.enforce("bob", "/pen2/1", "GET"))
        self.assertTrue(e.enforce("bob", "/pen2/2", "GET"))

    def test_rbac_with_multiply_matched_pattern(self):
        e = self.get_enforcer(
            get_examples("rbac_with_multiply_matched_pattern.conf"),
            get_examples("rbac_with_multiply_matched_pattern.csv"),
        )
        e.add_named_matching_func("g2", casbin.util.glob_match)
        self.assertTrue(e.enforce("root@localhost", "/", "org.create"))

    def test_enforce_abac_log_enabled(self):
        e = self.get_enforcer(get_examples("abac_model.conf"))
        sub = "alice"
        obj = {"Owner": "alice", "id": "data1"}
        self.assertTrue(e.enforce(sub, obj, "write"))

    def test_abac_with_sub_rule(self):
        e = self.get_enforcer(
            get_examples("abac_rule_model.conf"), get_examples("abac_rule_policy.csv")
        )
        sub1 = TestSub("alice", 16)
        sub2 = TestSub("bob", 20)
        sub3 = TestSub("alice", 65)

        self.assertFalse(e.enforce(sub1, "/data1", "read"))
        self.assertFalse(e.enforce(sub1, "/data2", "read"))
        self.assertFalse(e.enforce(sub1, "/data1", "write"))
        self.assertTrue(e.enforce(sub1, "/data2", "write"))

        self.assertTrue(e.enforce(sub2, "/data1", "read"))
        self.assertFalse(e.enforce(sub2, "/data2", "read"))
        self.assertFalse(e.enforce(sub2, "/data1", "write"))
        self.assertTrue(e.enforce(sub2, "/data2", "write"))

        self.assertTrue(e.enforce(sub3, "/data1", "read"))
        self.assertFalse(e.enforce(sub3, "/data2", "read"))
        self.assertFalse(e.enforce(sub3, "/data1", "write"))
        self.assertFalse(e.enforce(sub3, "/data2", "write"))

    def test_abac_with_multiple_sub_rules(self):
        e = self.get_enforcer(
            get_examples("abac_multiple_rules_model.conf"),
            get_examples("abac_multiple_rules_policy.csv"),
        )
        sub1 = TestSub("alice", 16)
        sub2 = TestSub("alice", 20)
        sub3 = TestSub("bob", 65)
        sub4 = TestSub("bob", 35)

        self.assertFalse(e.enforce(sub1, "/data1", "read"))
        self.assertFalse(e.enforce(sub1, "/data2", "read"))
        self.assertFalse(e.enforce(sub1, "/data1", "write"))
        self.assertFalse(e.enforce(sub1, "/data2", "write"))

        self.assertTrue(e.enforce(sub2, "/data1", "read"))
        self.assertFalse(e.enforce(sub2, "/data2", "read"))
        self.assertFalse(e.enforce(sub2, "/data1", "write"))
        self.assertFalse(e.enforce(sub2, "/data2", "write"))

        self.assertFalse(e.enforce(sub3, "/data1", "read"))
        self.assertFalse(e.enforce(sub3, "/data2", "read"))
        self.assertFalse(e.enforce(sub3, "/data1", "write"))
        self.assertFalse(e.enforce(sub3, "/data2", "write"))

        self.assertFalse(e.enforce(sub4, "/data1", "read"))
        self.assertFalse(e.enforce(sub4, "/data2", "read"))
        self.assertFalse(e.enforce(sub4, "/data1", "write"))
        self.assertTrue(e.enforce(sub4, "/data2", "write"))


class TestConfigSynced(TestConfig):
    def get_enforcer(self, model=None, adapter=None):
        return casbin.SyncedEnforcer(
            model,
            adapter,
        )

    def test_auto_loading_policy(self):
        e = self.get_enforcer(
            get_examples("basic_model.conf"),
            get_examples("basic_policy.csv"),
        )
        e.start_auto_load_policy(5 / 1000)
        self.assertTrue(e.is_auto_loading_running())
        e.stop_auto_load_policy()
        # thread needs a moment to exit
        time.sleep(10 / 1000)
        self.assertFalse(e.is_auto_loading_running())
| 40.914956 | 88 | 0.607798 | 1,662 | 13,952 | 4.917569 | 0.091456 | 0.118439 | 0.133121 | 0.188548 | 0.812309 | 0.763367 | 0.644684 | 0.52184 | 0.463844 | 0.405359 | 0 | 0.020498 | 0.220255 | 13,952 | 340 | 89 | 41.035294 | 0.730766 | 0.010393 | 0 | 0.368794 | 0 | 0 | 0.207014 | 0.04833 | 0 | 0 | 0 | 0 | 0.443262 | 1 | 0.099291 | false | 0 | 0.014184 | 0.007092 | 0.138298 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
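`TestConfigSynced` re-runs every test inherited from `TestConfig` against `casbin.SyncedEnforcer` simply by overriding the `get_enforcer` factory method. A minimal, casbin-free sketch of that factory-override pattern (the adder classes are hypothetical):

```python
import io
import unittest

class AdderTests(unittest.TestCase):
    """Base class: tests obtain the object under test from a factory method."""
    def get_adder(self):
        return lambda a, b: a + b

    def test_add(self):
        self.assertEqual(self.get_adder()(2, 3), 5)

class OtherAdderTests(AdderTests):
    """Overriding only the factory re-runs every inherited test against a
    second implementation, the way TestConfigSynced does above."""
    def get_adder(self):
        return lambda a, b: sum((a, b))

# Run the subclass's (inherited) tests programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(OtherAdderTests)
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite)
assert result.wasSuccessful()
```

This keeps the two enforcer variants behaviorally in lockstep: any test added to the base class is automatically exercised against both implementations.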
5339b2c2092bc79d29ea25cdf623ccce0b69cef1 | 1,294 | py | Python | git/setup.py | unitopia-de/python-efuns | ece29d5980aa09c7d61dd3ec776b99e989463eb3 | [
"0BSD"
] | 2 | 2020-05-01T11:57:49.000Z | 2021-08-14T14:54:01.000Z | git/setup.py | unitopia-de/python-efuns | ece29d5980aa09c7d61dd3ec776b99e989463eb3 | [
"0BSD"
] | null | null | null | git/setup.py | unitopia-de/python-efuns | ece29d5980aa09c7d61dd3ec776b99e989463eb3 | [
"0BSD"
] | null | null | null | import setuptools

setuptools.setup(
    name="ldmud-efun-git",
    version="0.0.1",
    author="UNItopia Administration",
    author_email="mudadm@UNItopia.DE",
    description="Git Efuns for UNItopia",
    long_description="Offers efun for interaction with the Git version management.",
    packages=setuptools.find_packages(),
    install_requires=[
        'ldmud-asyncio',
    ],
    classifiers=[
        "Programming Language :: Python :: 3",
        "Operating System :: OS Independent",
    ],
    entry_points={
        'ldmud_efun': [
            'git_list_commits = ldmudefungit.gitefuns:efun_git_list_commits',
            'git_info_commit = ldmudefungit.gitefuns:efun_git_info_commit',
            'git_show_diff = ldmudefungit.gitefuns:efun_git_show_diff',
            'git_status = ldmudefungit.gitefuns:efun_git_status',
            'git_status_diff = ldmudefungit.gitefuns:efun_git_status_diff',
            'git_commit = ldmudefungit.gitefuns:efun_git_commit',
            'git_reverse = ldmudefungit.gitefuns:efun_git_reverse',
            'git_cat = ldmudefungit.gitefuns:efun_git_cat',
            'git_search_commits = ldmudefungit.gitefuns:efun_git_search_commits',
        ]
    },
    zip_safe=False,
)
| 39.212121 | 84 | 0.636785 | 133 | 1,294 | 5.864662 | 0.421053 | 0.098718 | 0.276923 | 0.311538 | 0.301282 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004202 | 0.264297 | 1,294 | 32 | 85 | 40.4375 | 0.815126 | 0 | 0 | 0.064516 | 0 | 0 | 0.603555 | 0.276662 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.032258 | 0 | 0.032258 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
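The `entry_points` block is the discovery mechanism: a host process can scan the `ldmud_efun` group at runtime and import each registered callable by name. A hedged sketch of that lookup with `importlib.metadata` (how LDMud itself consumes the group is an assumption here; on a machine without this package installed the group is simply empty):

```python
from importlib.metadata import entry_points

eps = entry_points()
# Python 3.10+ exposes .select(); older versions return a dict of lists.
if hasattr(eps, "select"):
    group = list(eps.select(group="ldmud_efun"))
else:
    group = list(eps.get("ldmud_efun", []))

# Map efun name -> entry point; ep.load() would import the target callable.
efuns = {ep.name: ep for ep in group}
assert isinstance(efuns, dict)
```

Because registration lives in package metadata, adding a new efun only requires another `name = module:callable` line here; no host-side code changes.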
533bf4ab25ce69289d116edc26a4daf328d7baff | 204 | py | Python | pyfbx/utils/synchronized_func.py | zhangxinlei-cn/pyfbx | 8b732efdc47057b7b1cb0127a6ee570c7d8984c7 | [
"MIT"
] | null | null | null | pyfbx/utils/synchronized_func.py | zhangxinlei-cn/pyfbx | 8b732efdc47057b7b1cb0127a6ee570c7d8984c7 | [
"MIT"
] | null | null | null | pyfbx/utils/synchronized_func.py | zhangxinlei-cn/pyfbx | 8b732efdc47057b7b1cb0127a6ee570c7d8984c7 | [
"MIT"
] | null | null | null | import threading


def synchronized(func):
    func.__lock__ = threading.Lock()

    def synced_func(*args, **kws):
        with func.__lock__:
            return func(*args, **kws)

    return synced_func | 20.4 | 37 | 0.637255 | 24 | 204 | 5 | 0.458333 | 0.133333 | 0.183333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 204 | 10 | 38 | 20.4 | 0.784314 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4
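A quick usage sketch of the `synchronized` decorator above: the per-function lock serializes concurrent calls, so the otherwise unsafe read-modify-write below stays correct under threading (the counter example is hypothetical, not part of pyfbx):

```python
import threading

def synchronized(func):
    func.__lock__ = threading.Lock()

    def synced_func(*args, **kws):
        with func.__lock__:
            return func(*args, **kws)

    return synced_func

counter = {"value": 0}

@synchronized
def increment():
    # Safe despite load/add/store not being atomic: only one thread
    # can hold func.__lock__ at a time.
    counter["value"] += 1

threads = [
    threading.Thread(target=lambda: [increment() for _ in range(1000)])
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert counter["value"] == 4000
```

Note the lock is created once per decorated function, so distinct decorated functions do not block each other.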
535357d45414776ec01f6cb6940f01e2eb56831e | 50 | py | Python | __init__.py | mssalvador/RegnSkabData | b0372b5824c6a82786033f8244e6230cdd4d1e5c | [
"Apache-2.0"
] | null | null | null | __init__.py | mssalvador/RegnSkabData | b0372b5824c6a82786033f8244e6230cdd4d1e5c | [
"Apache-2.0"
] | null | null | null | __init__.py | mssalvador/RegnSkabData | b0372b5824c6a82786033f8244e6230cdd4d1e5c | [
"Apache-2.0"
] | null | null | null | '''
Created on Jan 3, 2017
@author: svanhmic
'''
| 8.333333 | 22 | 0.62 | 7 | 50 | 4.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0.2 | 50 | 5 | 23 | 10 | 0.65 | 0.82 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
535dda32324beea3a32d924cc387dcfce6defdc8 | 256 | py | Python | lib_pypy/_codecs_cn.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 381 | 2018-08-18T03:37:22.000Z | 2022-02-06T23:57:36.000Z | lib_pypy/_codecs_cn.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 75 | 2016-01-14T16:03:02.000Z | 2020-04-29T22:51:53.000Z | lib_pypy/_codecs_cn.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 55 | 2015-08-16T02:41:30.000Z | 2022-03-20T20:33:35.000Z | # this getcodec() function supports any multibyte codec, although
# for compatibility with CPython it should only be used for the
# codecs from this module, i.e.:
#
# 'gb2312', 'gbk', 'gb18030', 'hz'
from _multibytecodec import __getcodec as getcodec
| 32 | 65 | 0.738281 | 35 | 256 | 5.314286 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042453 | 0.171875 | 256 | 7 | 66 | 36.571429 | 0.834906 | 0.753906 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
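The codecs this module backs are the standard Chinese encodings; from user code they are normally reached through `str.encode`/`bytes.decode` rather than the private `getcodec` hook. A small round-trip sketch over two of the codecs named in the comment above:

```python
# Round-trip a short string ("ni hao") through two multibyte Chinese codecs.
text = "\u4f60\u597d"
for encoding in ("gb2312", "gb18030"):
    data = text.encode(encoding)
    assert isinstance(data, bytes)
    assert data.decode(encoding) == text
```

gb18030 is the superset encoding of the family, so anything gb2312 can represent round-trips through it as well.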
53621316be7b0d7f410bfe02cdb0b045d9c6a6f3 | 6,382 | py | Python | yardstick/tests/unit/benchmark/core/test_plugin.py | upfront710/yardstick | 2c3898f2ca061962cedbfc7435f78b59aa39b097 | [
"Apache-2.0"
] | 28 | 2017-02-07T07:46:42.000Z | 2021-06-30T08:11:06.000Z | yardstick/tests/unit/benchmark/core/test_plugin.py | upfront710/yardstick | 2c3898f2ca061962cedbfc7435f78b59aa39b097 | [
"Apache-2.0"
] | 6 | 2018-01-18T08:00:54.000Z | 2019-04-11T04:51:41.000Z | yardstick/tests/unit/benchmark/core/test_plugin.py | upfront710/yardstick | 2c3898f2ca061962cedbfc7435f78b59aa39b097 | [
"Apache-2.0"
] | 46 | 2016-12-13T10:05:47.000Z | 2021-02-18T07:33:06.000Z | ##############################################################################
# Copyright (c) 2016 Huawei Technologies Co.,Ltd and others.
#
# All rights reserved. This program and the accompanying materials
# are made available under the terms of the Apache License, Version 2.0
# which accompanies this distribution, and is available at
# http://www.apache.org/licenses/LICENSE-2.0
##############################################################################
import copy
import os
import pkg_resources

import mock
import six
import testtools

from yardstick import ssh
from yardstick.benchmark.core import plugin
from yardstick.tests import fixture


class PluginTestCase(testtools.TestCase):

    FILE = """
schema: "yardstick:plugin:0.1"

plugins:
  name: sample
  deployment:
    ip: 10.1.0.50
    user: root
    password: root
"""

    NAME = 'sample'
    DEPLOYMENT = {'ip': '10.1.0.50', 'user': 'root', 'password': 'root'}

    def setUp(self):
        super(PluginTestCase, self).setUp()
        self.plugin_parser = plugin.PluginParser(mock.Mock())
        self.plugin = plugin.Plugin()
        self.useFixture(fixture.PluginParserFixture(PluginTestCase.FILE))

        self._mock_ssh_from_node = mock.patch.object(ssh.SSH, 'from_node')
        self.mock_ssh_from_node = self._mock_ssh_from_node.start()
        self.mock_ssh_obj = mock.Mock()
        self.mock_ssh_from_node.return_value = self.mock_ssh_obj
        self.mock_ssh_obj.wait = mock.Mock()
        self.mock_ssh_obj._put_file_shell = mock.Mock()

        self._mock_log_info = mock.patch.object(plugin.LOG, 'info')
        self.mock_log_info = self._mock_log_info.start()

        self.addCleanup(self._cleanup)

    def _cleanup(self):
        self._mock_ssh_from_node.stop()
        self._mock_log_info.stop()

    @mock.patch.object(six.moves.builtins, 'print')
    def test_install(self, *args):
        args = mock.Mock()
        args.input_file = [mock.Mock()]
        with mock.patch.object(self.plugin, '_install_setup') as \
                mock_install, \
                mock.patch.object(self.plugin, '_run') as mock_run:
            self.plugin.install(args)
            mock_install.assert_called_once_with(PluginTestCase.NAME,
                                                 PluginTestCase.DEPLOYMENT)
            mock_run.assert_called_once_with(PluginTestCase.NAME)

    @mock.patch.object(six.moves.builtins, 'print')
    def test_remove(self, *args):
        args = mock.Mock()
        args.input_file = [mock.Mock()]
        with mock.patch.object(self.plugin, '_remove_setup') as \
                mock_remove, \
                mock.patch.object(self.plugin, '_run') as mock_run:
            self.plugin.remove(args)
            mock_remove.assert_called_once_with(PluginTestCase.NAME,
                                                PluginTestCase.DEPLOYMENT)
            mock_run.assert_called_once_with(PluginTestCase.NAME)

    @mock.patch.object(pkg_resources, 'resource_filename',
                       return_value='script')
    def test__install_setup(self, mock_resource_filename):
        plugin_name = 'plugin_name'
        self.plugin._install_setup(plugin_name, PluginTestCase.DEPLOYMENT)

        mock_resource_filename.assert_called_once_with(
            'yardstick.resources', 'scripts/install/' + plugin_name + '.bash')
        self.mock_ssh_from_node.assert_called_once_with(
            PluginTestCase.DEPLOYMENT)
        self.mock_ssh_obj.wait.assert_called_once_with(timeout=600)
        self.mock_ssh_obj._put_file_shell.assert_called_once_with(
            'script', '~/{0}.sh'.format(plugin_name))

    @mock.patch.object(pkg_resources, 'resource_filename',
                       return_value='script')
    @mock.patch.object(os, 'environ', return_value='1.2.3.4')
    def test__install_setup_with_ip_local(self, mock_os_environ,
                                          mock_resource_filename):
        plugin_name = 'plugin_name'
        deployment = copy.deepcopy(PluginTestCase.DEPLOYMENT)
        deployment['ip'] = 'local'
        self.plugin._install_setup(plugin_name, deployment)

        mock_os_environ.__getitem__.assert_called_once_with('JUMP_HOST_IP')
        mock_resource_filename.assert_called_once_with(
            'yardstick.resources',
            'scripts/install/' + plugin_name + '.bash')
        self.mock_ssh_from_node.assert_called_once_with(
            deployment, overrides={'ip': os.environ["JUMP_HOST_IP"]})
        self.mock_ssh_obj.wait.assert_called_once_with(timeout=600)
        self.mock_ssh_obj._put_file_shell.assert_called_once_with(
            'script', '~/{0}.sh'.format(plugin_name))

    @mock.patch.object(pkg_resources, 'resource_filename',
                       return_value='script')
    def test__remove_setup(self, mock_resource_filename):
        plugin_name = 'plugin_name'
        self.plugin._remove_setup(plugin_name, PluginTestCase.DEPLOYMENT)

        mock_resource_filename.assert_called_once_with(
            'yardstick.resources',
            'scripts/remove/' + plugin_name + '.bash')
        self.mock_ssh_from_node.assert_called_once_with(
            PluginTestCase.DEPLOYMENT)
        self.mock_ssh_obj.wait.assert_called_once_with(timeout=600)
        self.mock_ssh_obj._put_file_shell.assert_called_once_with(
            'script', '~/{0}.sh'.format(plugin_name))

    @mock.patch.object(pkg_resources, 'resource_filename',
                       return_value='script')
    @mock.patch.object(os, 'environ', return_value='1.2.3.4')
    def test__remove_setup_with_ip_local(self, mock_os_environ,
                                         mock_resource_filename):
        plugin_name = 'plugin_name'
        deployment = copy.deepcopy(PluginTestCase.DEPLOYMENT)
        deployment['ip'] = 'local'
        self.plugin._remove_setup(plugin_name, deployment)

        mock_os_environ.__getitem__.assert_called_once_with('JUMP_HOST_IP')
        mock_resource_filename.assert_called_once_with(
            'yardstick.resources',
            'scripts/remove/' + plugin_name + '.bash')
        self.mock_ssh_from_node.assert_called_once_with(
            deployment, overrides={'ip': os.environ["JUMP_HOST_IP"]})
        self.mock_ssh_obj.wait.assert_called_once_with(timeout=600)
        # NOTE: was `_put_file_shell.mock_os_environ(...)`, which is a no-op
        # on a Mock; the parallel tests make the intended assertion clear.
        self.mock_ssh_obj._put_file_shell.assert_called_once_with(
            'script', '~/{0}.sh'.format(plugin_name))
| 42.832215 | 78 | 0.648699 | 764 | 6,382 | 5.066754 | 0.162304 | 0.059933 | 0.059675 | 0.108499 | 0.757685 | 0.734177 | 0.711444 | 0.711444 | 0.690003 | 0.671661 | 0 | 0.009274 | 0.222814 | 6,382 | 148 | 79 | 43.121622 | 0.771169 | 0.04591 | 0 | 0.512397 | 0 | 0 | 0.11188 | 0.003712 | 0 | 0 | 0 | 0 | 0.173554 | 1 | 0.066116 | false | 0.016529 | 0.07438 | 0 | 0.173554 | 0.016529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
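`setUp` above uses the start/stop form of `mock.patch.object` plus `addCleanup`, so every test runs against patched SSH and logging and the patches are guaranteed to be undone. The same start/stop mechanics in isolation (the `Service` class is a hypothetical stand-in for `ssh.SSH`):

```python
from unittest import mock

class Service:
    def fetch(self):
        return "real"

# Start the patch manually, as setUp() does, instead of using a decorator.
patcher = mock.patch.object(Service, "fetch", return_value="fake")
patcher.start()
try:
    # While the patch is active, every instance sees the mock.
    assert Service().fetch() == "fake"
finally:
    # This is what addCleanup(self._cleanup) guarantees in the test class.
    patcher.stop()

assert Service().fetch() == "real"
```

The start/stop form is what lets one `setUp` share a single mock object (`self.mock_ssh_obj`) across many tests, with `addCleanup` ensuring the unpatch happens even if a test fails.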
7277f1beec5a595185f6379a589577ce9b694ccf | 4,900 | py | Python | falkon/tests/test_nysel.py | mohamad-amin/falkon | 581c761b4a4cb7bf6a299613700db8414c419a52 | [
"MIT"
] | 130 | 2020-06-18T08:30:30.000Z | 2022-03-21T15:43:17.000Z | falkon/tests/test_nysel.py | mohamad-amin/falkon | 581c761b4a4cb7bf6a299613700db8414c419a52 | [
"MIT"
] | 32 | 2020-06-26T09:24:45.000Z | 2022-03-20T10:37:36.000Z | falkon/tests/test_nysel.py | mohamad-amin/falkon | 581c761b4a4cb7bf6a299613700db8414c419a52 | [
"MIT"
] | 17 | 2020-07-13T17:28:02.000Z | 2022-02-15T19:55:40.000Z | import numpy as np
import pytest
import torch
import falkon
from falkon.center_selection import UniformSelector
from falkon.tests.gen_random import gen_random, gen_sparse_matrix
from falkon.utils import decide_cuda
M = 500
D = 20
num_centers = 100
@pytest.fixture
def rowmaj_arr() -> torch.Tensor:
return torch.from_numpy(gen_random(M, D, 'float64', False))
@pytest.fixture
def colmaj_arr() -> torch.Tensor:
return torch.from_numpy(gen_random(M, D, 'float64', True))
@pytest.fixture
def uniform_sel() -> UniformSelector:
return UniformSelector(np.random.default_rng(0), num_centers=num_centers)
@pytest.mark.parametrize("device", [
pytest.param("cpu"),
pytest.param("cuda:0", marks=[pytest.mark.skipif(not decide_cuda(), reason="No GPU found.")])
])
def test_c_order(uniform_sel, rowmaj_arr, device):
rowmaj_arr = rowmaj_arr.to(device=device)
    centers: torch.Tensor = uniform_sel.select(rowmaj_arr, None)
    assert centers.stride() == (D, 1), "UniformSel changed input stride"
    assert centers.size() == (num_centers, D), "UniformSel did not output correct size"
    assert centers.dtype == rowmaj_arr.dtype
    assert centers.device == rowmaj_arr.device


def test_cuda(uniform_sel, rowmaj_arr):
    centers: torch.Tensor = uniform_sel.select(rowmaj_arr, None)
    assert centers.stride() == (D, 1), "UniformSel changed input stride"
    assert centers.size() == (num_centers, D), "UniformSel did not output correct size"
    assert centers.dtype == rowmaj_arr.dtype
    assert centers.device == rowmaj_arr.device

    centers, idx = uniform_sel.select_indices(rowmaj_arr, None)
    assert centers.device == idx.device
    assert idx.dtype == torch.long
    assert len(idx) == num_centers


def test_f_order(uniform_sel, colmaj_arr):
    centers: torch.Tensor = uniform_sel.select(colmaj_arr, None)
    assert centers.stride() == (1, num_centers), "UniformSel changed input stride"
    assert centers.size() == (num_centers, D), "UniformSel did not output correct size"
    assert centers.dtype == colmaj_arr.dtype
    assert centers.device == colmaj_arr.device

    centers, idx = uniform_sel.select_indices(colmaj_arr, None)
    assert centers.device == idx.device
    assert idx.dtype == torch.long
    assert len(idx) == num_centers


def test_great_m(colmaj_arr):
    uniform_sel = UniformSelector(np.random.default_rng(0), num_centers=M + 1)
    centers: torch.Tensor = uniform_sel.select(colmaj_arr, None)
    assert centers.size() == (M, D), "UniformSel did not output correct size"
    assert centers.dtype == colmaj_arr.dtype
    assert centers.device == colmaj_arr.device

    centers, idx = uniform_sel.select_indices(colmaj_arr, None)
    assert centers.device == idx.device
    assert idx.dtype == torch.long
    assert len(idx) == centers.shape[0]


def test_sparse_csr(uniform_sel):
    sparse_csr = gen_sparse_matrix(M, D, np.float32, 0.01)
    centers: falkon.sparse.SparseTensor = uniform_sel.select(sparse_csr, None)
    assert centers.size() == (num_centers, D), "UniformSel did not output correct size"
    assert centers.is_csr is True, "UniformSel did not preserve sparsity correctly"
    assert centers.dtype == sparse_csr.dtype
    assert centers.device == sparse_csr.device

    centers, idx = uniform_sel.select_indices(sparse_csr, None)
    assert centers.device == idx.device
    assert idx.dtype == torch.long
    assert len(idx) == centers.shape[0]


def test_sparse_csc():
    uniform_sel = UniformSelector(np.random.default_rng(0), num_centers=5)
    sparse_csc = gen_sparse_matrix(M, D, np.float32, 0.01).transpose_csc()
    centers: falkon.sparse.SparseTensor = uniform_sel.select(sparse_csc, None)
    assert centers.size() == (5, M), "UniformSel did not output correct size"
    assert centers.is_csc is True, "UniformSel did not preserve sparsity correctly"
    assert centers.dtype == sparse_csc.dtype
    assert centers.device == sparse_csc.device

    centers, idx = uniform_sel.select_indices(sparse_csc, None)
    assert centers.device == idx.device
    assert idx.dtype == torch.long
    assert len(idx) == centers.shape[0]


def test_with_y(uniform_sel, colmaj_arr):
    Y = torch.empty(M, 1, dtype=colmaj_arr.dtype)
    centers, cY = uniform_sel.select(colmaj_arr, Y)
    assert centers.stride() == (1, num_centers), "UniformSel changed input stride"
    assert centers.size() == (num_centers, D), "UniformSel did not output correct size"
    assert cY.size() == (num_centers, 1), "UniformSel did not output correct Y size"
    assert centers.dtype == colmaj_arr.dtype
    assert centers.device == colmaj_arr.device
    assert cY.dtype == Y.dtype
    assert cY.device == Y.device

    centers, cY, idx = uniform_sel.select_indices(colmaj_arr, Y)
    assert idx.dtype == torch.long
    assert len(idx) == centers.shape[0]
    assert len(idx) == cY.shape[0]


if __name__ == "__main__":
    pytest.main(args=[__file__])
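For reference, the behaviour these tests exercise (uniform row sampling without replacement, capped at the number of available rows as `test_great_m` expects) can be sketched with plain NumPy. The function name and signature below are illustrative, not Falkon's actual `UniformSelector` API.

```python
import numpy as np

def uniform_select(X, num_centers, seed=0):
    # Sample up to num_centers distinct row indices uniformly at random,
    # capping at the number of rows actually available.
    rng = np.random.default_rng(seed)
    m = min(num_centers, X.shape[0])
    idx = rng.choice(X.shape[0], size=m, replace=False)
    return X[idx], idx

X = np.arange(30, dtype=np.float32).reshape(10, 3)
centers, idx = uniform_select(X, num_centers=4)
assert centers.shape == (4, 3) and len(idx) == 4
centers, idx = uniform_select(X, num_centers=11)  # more centers than rows
assert centers.shape == (10, 3)                   # capped at X.shape[0]
```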
# File: palestras/apps.py (repo: jhoneffernandes/python-trabalho, license: MIT)
from django.apps import AppConfig
class PalestrasConfig(AppConfig):
    name = 'palestras'
# File: python-ds-practice-solution/01_product/product.py (repo: Tigerbackwood/Springboard, license: MIT)

# Write a function called product which takes in two numbers
# and returns the product of the numbers.
# Examples:
# product(2, 2) # 4
# product(2, -2) # -4
def product(a, b):
    """Return product of a and b.

    >>> product(2, 2)
    4

    >>> product(2, -2)
    -4
    """
    return a * b
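The docstring above already contains doctests; they can be run directly with the stdlib `doctest` module. The copy of `product` below is repeated only so the snippet is self-contained.

```python
import doctest

def product(a, b):
    """Return product of a and b.

    >>> product(2, 2)
    4
    >>> product(2, -2)
    -4
    """
    return a * b

if __name__ == "__main__":
    # Runs every doctest found in this module's docstrings.
    results = doctest.testmod()
    print(results)  # TestResults(failed=0, attempted=2)
```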
# File: examples/mambo.py (repo: juanAFernandez/testing-with-python, license: Apache-2.0)
import time
with description('mamba'):
    with it('is tested with mamba itself'):
        pass

    with it('supports Python 3'):
        pass
# File: flash_learning/models/leaderboard.py (repo: tobifanibi/flash_learning, license: MIT)
from sqlalchemy.orm import relationship
from flash_learning import db
class Leaderboard(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    user = relationship("User")
# File: tests/test_is_ip_v6.py (repo: alvistack/daveoncode-python-string-utils, license: MIT)
from unittest import TestCase
from string_utils import is_ip_v6
class IsIpV6TestCase(TestCase):
    def test_return_false_for_non_string_objects(self):
        # noinspection PyTypeChecker
        self.assertFalse(is_ip_v6(None))
        # noinspection PyTypeChecker
        self.assertFalse(is_ip_v6(1))
        # noinspection PyTypeChecker
        self.assertFalse(is_ip_v6([]))
        # noinspection PyTypeChecker
        self.assertFalse(is_ip_v6({'a': 1}))
        # noinspection PyTypeChecker
        self.assertFalse(is_ip_v6(True))

    def test_ip_cannot_be_blank(self):
        self.assertFalse(is_ip_v6(''))
        self.assertFalse(is_ip_v6(' '))

    def test_ipv4_is_not_recognized(self):
        self.assertFalse(is_ip_v6('255.100.100.75'))

    def test_returns_false_for_invalid_ip_v6(self):
        self.assertFalse(is_ip_v6('2001.db8:85a3:0000:0000:8a2e:370:7334'))
        self.assertFalse(is_ip_v6('2001:db8|85a3:0000:0000:8a2e:370:1'))
        self.assertFalse(is_ip_v6('123:db8:85a3:0000:0000:8a2e:370,1'))
        self.assertFalse(is_ip_v6('2001:db8:85a3:0:0:8a2e:370'))

    def test_recognizes_valid_ip_v6(self):
        self.assertTrue(is_ip_v6('2001:db8:85a3:0000:0000:8a2e:370:7334'))
        self.assertTrue(is_ip_v6('2001:db8:85a3:0000:0000:8a2e:370:1'))
        self.assertTrue(is_ip_v6('123:db8:85a3:0000:0000:8a2e:370:1'))
        self.assertTrue(is_ip_v6('2001:db8:85a3:0:0:8a2e:370:7334'))
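For comparison, a rough stand-in for `is_ip_v6` can be built on the stdlib `ipaddress` module. This is a sketch: it is not guaranteed to match `string_utils`' exact accept/reject rules (for example around compressed `::` forms).

```python
import ipaddress

def looks_like_ip_v6(candidate):
    # Reject non-strings up front, then let ipaddress do the parsing.
    if not isinstance(candidate, str):
        return False
    try:
        return ipaddress.ip_address(candidate).version == 6
    except ValueError:
        return False

assert looks_like_ip_v6('2001:db8:85a3:0:0:8a2e:370:7334')
assert not looks_like_ip_v6('255.100.100.75')               # IPv4, not IPv6
assert not looks_like_ip_v6('2001:db8:85a3:0:0:8a2e:370')   # only 7 groups
assert not looks_like_ip_v6(None)
```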
# File: populationsim/tests/__init__.py (repo: gregmacfarlane/populationsim, license: BSD-3-Clause)

# PopulationSim
# See full license in LICENSE.txt.
# File: apps/accounts/tests/test_user_login.py (repo: dstl/lighthouse, license: MIT)

# (c) Crown Owned Copyright, 2016. Dstl.
from django.contrib.auth import get_user_model
from django.core.urlresolvers import reverse
from django_webtest import WebTest
from apps.teams.models import Team
from apps.organisations.models import Organisation
class UserWebTest(WebTest):
    def test_can_login(self):
        get_user_model().objects.create_user(userid='user@0001.com')

        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # userid in the navigation, login link not
        self.assertTrue(
            response.html.find(
                'span',
                attrs={'data-slug': 'user0001com'}
            )
        )
        self.assertFalse(
            response.html.find('a', class_='login')
        )
    def test_new_userid_creates_normal_account(self):
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # userid in the navigation, login link not
        self.assertTrue(
            response.html.find(
                'span',
                attrs={'data-slug': 'user0001com'}
            )
        )
        self.assertFalse(
            response.html.find('a', class_='login')
        )

        # created a passwordless user in the db
        user = get_user_model().objects.get(slug='user0001com')
        self.assertTrue(user.pk)
        self.assertFalse(user.has_usable_password())
    def test_no_userid_doesnt_create_account(self):
        form = self.app.get(reverse('login')).form
        response = form.submit()

        # login link in the navigation
        self.assertTrue(
            response.html.find('a', class_='login')
        )
    def test_login_as_user_with_password_redirects_to_admin(self):
        get_user_model().objects.create_user(
            userid='user@0001.com', password='password')

        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit()

        self.assertEquals(response.status_code, 302)
        self.assertEquals(response.location, 'http://localhost:80/admin/')
    # We want to test that a user who is missing a username is redirected
    # to the update user profile page
    def test_user_missing_data_redirected(self):
        # This user doesn't have a username
        user = get_user_model().objects.create_user(userid='user@0001.com')

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # Check that the add display name text is shown
        self.assertIn(
            'add a display name',
            response.html.find(
                None, {"class": "user_id"}
            ).text
        )

        # Check that the link in the nav is heading to the right place
        self.assertEquals(
            response.html.find(
                'span', attrs={'data-slug': user.slug}
            ).find('a').attrs['href'],
            '/users/' + str(user.slug) + '/update-profile'
        )

        # We should now be on the user needs to add information page
        self.assertEquals(
            response.html.find(
                'h3',
                attrs={'class': 'error-summary-heading'}
            ).text,
            'Please add your name'
        )
    # Meanwhile a user who has a username but no teams will end up
    # at the page asking for them to enter additional team information
    def test_user_has_username_but_no_teams_redirected(self):
        # This user has a username
        user = get_user_model().objects.create_user(
            userid='user@0001.com', name='User 0001')

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # Check that the join a team text is shown
        self.assertIn(
            'join a team',
            response.html.find(
                None, {"class": "user_id"}
            ).text
        )

        # Check that the link in the nav is heading to the right place
        self.assertEquals(
            response.html.find(
                'span', attrs={'data-slug': user.slug}
            ).find('a').attrs['href'],
            '/users/' + str(user.slug) + '/update-profile/teams'
        )

        # Make sure we *don't* have an alert summary heading
        self.assertEquals(
            response.html.find(
                'h3',
                attrs={'class': 'error-summary-heading'}
            ).text,
            'Please add additional team information'
        )
    # A user may have username, teams but still missing the extra information
    # they will get an alert bell notification and link to update their
    # profile.
    def test_user_has_username_teams_no_extra_info_redirected(self):
        # This user has a username and teams
        o = Organisation(name='org0001')
        o.save()
        t = Team(name='team0001', organisation=o)
        t.save()
        user = get_user_model().objects.create_user(
            userid='user@0001.com', name='User 0001')
        user.teams.add(t)
        user.save()

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # Check that the add more details text is shown
        self.assertIn(
            'enter more details',
            response.html.find(
                None, {"class": "user_id"}
            ).text
        )

        # Check that the link in the nav is heading to the right place
        self.assertIn(
            '/users/user0001com/update-profile',
            response.html.find(
                'span', attrs={'data-slug': 'user0001com'}
            ).find('a').attrs['href'],
        )

        # Make sure we *don't* have an alert summary heading
        self.assertEquals(
            response.html.find(
                'h3',
                attrs={'class': 'alert-summary-heading'}
            ).text,
            'Please add additional information'
        )
    def test_user_with_full_profile_goes_to_links(self):
        # This user has a username and teams
        o = Organisation(name='org0001')
        o.save()
        t = Team(name='team0001', organisation=o)
        t.save()
        user = get_user_model().objects.create_user(
            userid='user@0001.com',
            name='User 0001',
            best_way_to_find='In the kitchen',
            best_way_to_contact='By email',
            phone='00000000',
            email='user@0001.com',
        )
        user.teams.add(t)
        user.save()

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        self.assertIn(
            'All tools',
            response.html.find('h1').text
        )
    # Check that a link showing the user's slug appears in the top nav
    # bar
    def test_slug_and_link_exists_in_nav(self):
        # Create the user
        get_user_model().objects.create_user(userid='user@0001.com')

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # Go to the profile page (any page would do)
        response = self.app.get(
            reverse(
                'user-detail',
                kwargs={'slug': 'user0001com'}
            )
        )

        # Check that the slug is displayed at the top, that it has a link
        # and the link test is the user's slug
        slug_div = response.html.find(
            'span',
            attrs={'data-slug': 'user0001com'}
        )
        self.assertTrue(slug_div)
        slug_link = slug_div.find('a')
        self.assertTrue(slug_link)
        slug_text = slug_link.text
        self.assertIn('user@0001.com', slug_text)
    # Check that a link showing the user's username appears in the top nav
    # bar
    def test_username_and_link_exists_in_nav(self):
        # Create the user
        get_user_model().objects.create_user(
            userid='user@0001.com', name='User 0001')

        # Log in as user
        form = self.app.get(reverse('login')).form
        form['userid'] = 'user@0001.com'
        response = form.submit().follow()

        # Go to the profile page (any page would do)
        response = self.app.get(
            reverse(
                'user-detail',
                kwargs={'slug': 'user0001com'}
            )
        )

        # Check that the username is displayed at the top, that it has a link
        # and the link test is the user's slug
        slug_div = response.html.find(
            'span',
            attrs={'data-slug': 'user0001com'}
        )
        self.assertTrue(slug_div)
        slug_link = slug_div.find('a')
        self.assertTrue(slug_link)
        slug_text = slug_link.text
        self.assertIn('User 0001', slug_text)
class KeycloakHeaderLoginTest(WebTest):
    def test_auto_login(self):
        headers = {'KEYCLOAK_USERNAME': 'user@0001.com'}
        response = self.app.get(reverse('login'), headers=headers)
        self.assertEqual(
            'http://localhost:80/users/user0001com/update-profile',
            response.location
        )

    def test_auto_login_for_admin(self):
        get_user_model().objects.create_user(
            userid='admin@0001.com', password='password')
        headers = {'KEYCLOAK_USERNAME': 'admin@0001.com'}
        response = self.app.get(reverse('login'), headers=headers)
        self.assertEqual(
            'http://localhost:80/users/admin0001com/update-profile',
            response.location
        )

    def test_auto_login_for_complete_profile_goes_to_links(self):
        # This user has a username and teams
        o = Organisation(name='org0001')
        o.save()
        t = Team(name='team0001', organisation=o)
        t.save()
        user = get_user_model().objects.create_user(
            userid='user@0001.com',
            name='User 0001',
            best_way_to_find='In the kitchen',
            best_way_to_contact='By email',
            phone='00000000',
            email='user@0001.com',
        )
        user.teams.add(t)
        user.save()

        headers = {'KEYCLOAK_USERNAME': 'user@0001.com'}
        response = self.app.get(reverse('login'), headers=headers)
        self.assertEqual(
            'http://localhost:80/links',
            response.location
        )
# File: section_15_(intermediate)/quadrant.py (repo: govex/python-lessons, license: MIT)

def quadrant(address):
    "Returns the DC quadrant for the address given"
    return [quadrant for quadrant in address.split(' ') if quadrant in ['NW', 'NE', 'SW', 'SE']] or None
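Note that, despite the docstring, the expression returns a *list* of matching quadrant tokens (or `None` when nothing matches). A standalone copy for demonstration, with the loop variable renamed for readability:

```python
def quadrant(address):
    "Returns the DC quadrant for the address given"
    # Keep only the tokens that are DC quadrant abbreviations; fall back to None.
    return [q for q in address.split(' ') if q in ['NW', 'NE', 'SW', 'SE']] or None

print(quadrant('1600 Pennsylvania Ave NW'))  # ['NW']
print(quadrant('123 Main St'))               # None
```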
# File: easyPass-WS/misha/compilers_cli.py (repo: CiganOliviu/esential, license: MIT)
import click
import sys
from setup import environment_variables
sys.path.insert(0, 'dependencies/')
from dependencies.compilers import compilers, exe_compilers
class misha_compilers_system():
    def __init__(self):
        super(misha_compilers_system, self).__init__()


# click supplies only --file_name, so each command takes a single parameter.
@click.command()
@click.option('--file_name', help='The file that you want to compile')
def run_gcc(file_name):
    compiler_refference = compilers()
    compiler_refference.gcc_compiler(environment_variables.workflow_space_path, file_name)


@click.command()
@click.option('--file_name', help='The file that you want to compile')
def run_jit(file_name):
    compiler_refference = compilers()
    compiler_refference.jit_compiler(environment_variables.workflow_space_path, file_name)


@click.command()
@click.option('--file_name', help='The file that you want to compile')
def run_python2(file_name):
    compiler_refference = compilers()
    compiler_refference.python2_run(environment_variables.workflow_space_path, file_name)


@click.command()
@click.option('--file_name', help='The file that you want to compile')
def run_python3(file_name):
    compiler_refference = compilers()
    compiler_refference.python3_run(environment_variables.workflow_space_path, file_name)


@click.command()
@click.option('--file_name', help='The file that you want to compile')
def go_run(file_name):
    compiler_refference = compilers()
    compiler_refference.go_run(environment_variables.workflow_space_path, file_name)


@click.command()
@click.option('--file_name', help='The file that you want to compile')
def go_build(file_name):
    compiler_refference = compilers()
    compiler_refference.go_build(environment_variables.workflow_space_path, file_name)


@click.command()
def run_exe_python2():
    exe_compiler_refference = exe_compilers()
    exe_compiler_refference.python2_exe()


@click.command()
def run_exe_python3():
    exe_compiler_refference = exe_compilers()
    exe_compiler_refference.python3_exe()
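A hedged sketch of what a helper like `compilers().gcc_compiler` might do under the hood, namely assemble and run a compiler command with `subprocess`. The helper name, argument layout, and output path here are assumptions, not the real API of `dependencies.compilers`.

```python
import os
import subprocess

def gcc_compile(workspace, file_name, run=False):
    # Build the gcc invocation; only execute it when explicitly asked to.
    source = os.path.join(workspace, file_name)
    output = os.path.join(workspace, 'a.out')
    cmd = ['gcc', source, '-o', output]
    if run:
        subprocess.run(cmd, check=True)
    return cmd

print(gcc_compile('/tmp/ws', 'hello.c'))  # ['gcc', '/tmp/ws/hello.c', '-o', '/tmp/ws/a.out']
```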
# File: python/ase/scripts/take_picture.py (repo: vlinhd11/vlinhd11-android-scripting, license: Apache-2.0)
import android
droid = android.Android()
droid.cameraCapturePicture('/sdcard/foo.jpg')
# File: dsapi/config/__init__.py (repo: splunk-soar-connectors/digitalshadows, license: Apache-2.0)
#
# Licensed under Apache 2.0 (https://www.apache.org/licenses/LICENSE-2.0.txt)
#
# File: genbank_fasta/apps.py (repo: danmcelroy/VoSeq, license: BSD-3-Clause)
from django.apps import AppConfig
class GenbankFasta(AppConfig):
    name = 'genbank_fasta'
# File: venv/bin/django-admin.py (repo: yuzhouStayHungry/clean-blog, license: MIT)

#!/Users/yuzhou_1su/DjangoProject/cleanblog/venv/bin/python
from django.core import management
if __name__ == "__main__":
    management.execute_from_command_line()
# File: frontend/app.py (repo: KSTARK007/testingfinal, license: MIT)
from flask import *
import hashlib
from pymongo import *
import string
import datetime
import re
from flask_cors import *
app = Flask(__name__)
cors = CORS(app)
@app.errorhandler(Exception)
def page_not_found(e):
return render_template('error.html',errorValue=404)
@app.route('/')
@cross_origin()
def start():
return render_template('index.html',val = "False")
@app.route('/<var>')
def index(var):
try:
int(var)
return render_template('index.html',val = var)
except Exception as e:
return render_template('error.html',errorValue=404)
@app.route('/signup')
def signup():
return render_template('signup.html')
@app.route('/login/<var>')
def login(var):
return render_template('login.html' ,val = var)
@app.route('/login')
def loginstart():
return render_template('login.html' ,val = "False")
@app.route('/upload')
def upload():
return render_template('upload.html')
@app.route('/cat')
def cat():
return render_template('cat.html')
@app.route('/addcat')
def addcat():
return render_template('addcat.html')
@app.route('/rmuser')
def rmusr():
return render_template('rmuser.html')
@app.route('/rmcat')
def rmcat():
return render_template('rmcat.html')
if __name__ == '__main__':
app.run(host='0.0.0.0',port=80,debug=True) | 19.261538 | 53 | 0.710863 | 179 | 1,252 | 4.815642 | 0.307263 | 0.167053 | 0.278422 | 0.048724 | 0.296984 | 0.266821 | 0.118329 | 0.118329 | 0.118329 | 0.118329 | 0 | 0.010791 | 0.111821 | 1,252 | 65 | 54 | 19.261538 | 0.764388 | 0 | 0 | 0.040816 | 0 | 0 | 0.167598 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.22449 | false | 0 | 0.142857 | 0.204082 | 0.612245 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
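The app imports `hashlib` but the routes above never use it; one common pattern such an app might intend is salted password hashing with PBKDF2. This is an illustrative sketch, not this app's actual scheme.

```python
import hashlib
import os

def hash_password(password, salt=None):
    # Derive a salted PBKDF2-HMAC-SHA256 digest; pass the same salt back in to verify.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac('sha256', password.encode('utf-8'), salt, 100_000)
    return salt, digest

salt, digest = hash_password('hunter2')
assert hash_password('hunter2', salt)[1] == digest   # same password verifies
assert hash_password('letmein', salt)[1] != digest   # different password fails
```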
# File: asnets/experiments/actprop_2l_h_add.py (repo: xf1590281/ASNets, license: MIT)

"""A two-layer configuration for the action/proposition network w/ h-add
teacher."""
# use defaults from actprop_1l
from .actprop_1l_h_add import * # noqa F401
NUM_LAYERS = 2
# File: auth0/v3/authentication/__init__.py (repo: santiagoroman/auth0-python, license: MIT)
from .authorize_client import AuthorizeClient
from .database import Database
from .delegated import Delegated
from .enterprise import Enterprise
from .get_token import GetToken
from .logout import Logout
from .passwordless import Passwordless
from .social import Social
from .users import Users
| 29.5 | 45 | 0.847458 | 38 | 295 | 6.526316 | 0.394737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122034 | 295 | 9 | 46 | 32.777778 | 0.957529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.111111 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
fe6ad2f3e2c814ab3d5f1b9147093c23ea3c387e | 17,631 | py | Python | anuga/abstract_2d_finite_volumes/tests/test_general_mesh.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | anuga/abstract_2d_finite_volumes/tests/test_general_mesh.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | anuga/abstract_2d_finite_volumes/tests/test_general_mesh.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | null | null | null | #!/usr/bin/env python
from __future__ import division
from builtins import str
from builtins import range
from past.utils import old_div
import unittest
from math import sqrt, pi
from anuga.config import epsilon
from anuga.abstract_2d_finite_volumes.general_mesh import General_mesh
from anuga.coordinate_transforms.geo_reference import Geo_reference
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
from anuga.shallow_water.shallow_water_domain import Domain
import numpy as num
class Test_General_Mesh(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_get_vertex_coordinates(self):
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
#Create basic mesh
nodes, triangles, _ = rectangular(1, 3)
domain = General_mesh(nodes, triangles)
assert num.allclose(domain.get_nodes(), nodes)
M = domain.number_of_triangles
V = domain.get_vertex_coordinates()
assert V.shape[0] == 3*M
for i in range(M):
for j in range(3):
k = triangles[i,j] #Index of vertex j in triangle i
assert num.allclose(V[3*i+j,:], nodes[k])
def test_get_vertex_coordinates_with_geo_ref(self):
x0 = 314036.58727982
y0 = 6224951.2960092
geo = Geo_reference(56, x0, y0)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
nodes = num.array([a, b, c, d, e, f])
nodes_absolute = geo.get_absolute(nodes)
# bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]], int)
domain = General_mesh(nodes, triangles, geo_reference=geo)
verts = domain.get_vertex_coordinates(triangle_id=0) # bac
msg = ("num.array([b,a,c])=\n%s\nshould be close to 'verts'=\n%s"
% (str(num.array([b,a,c])), str(verts)))
self.assertTrue(num.allclose(num.array([b,a,c]), verts), msg)
verts = domain.get_vertex_coordinates(triangle_id=0)
msg = ("num.array([b,a,c])=\n%s\nshould be close to 'verts'=\n%s"
% (str(num.array([b,a,c])), str(verts)))
self.assertTrue(num.allclose(num.array([b,a,c]), verts), msg)
verts = domain.get_vertex_coordinates(triangle_id=0, absolute=True)
msg = ("num.array([...])=\n%s\nshould be close to 'verts'=\n%s"
% (str(num.array([nodes_absolute[1],
nodes_absolute[0],
nodes_absolute[2]])),
str(verts)))
self.assertTrue(num.allclose(num.array([nodes_absolute[1],
nodes_absolute[0],
nodes_absolute[2]]),
verts), msg)
verts = domain.get_vertex_coordinates(triangle_id=0,
absolute=True)
msg = ("num.array([...])=\n%s\nshould be close to 'verts'=\n%s"
% (str(num.array([nodes_absolute[1],
nodes_absolute[0],
nodes_absolute[2]])),
str(verts)))
self.assertTrue(num.allclose(num.array([nodes_absolute[1],
nodes_absolute[0],
nodes_absolute[2]]),
verts), msg)
def test_get_vertex_coordinates_triangle_id(self):
"""test_get_vertex_coordinates_triangle_id
Test that vertices for one triangle can be returned.
"""
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
#Create basic mesh
nodes, triangles, _ = rectangular(1, 3)
domain = General_mesh(nodes, triangles)
assert num.allclose(domain.get_nodes(), nodes)
M = domain.number_of_triangles
for i in range(M):
V = domain.get_vertex_coordinates(triangle_id=i)
assert V.shape[0] == 3
for j in range(3):
k = triangles[i,j] #Index of vertex j in triangle i
assert num.allclose(V[j,:], nodes[k])
def test_get_edge_midpoint_coordinates(self):
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
#Create basic mesh
nodes, triangles, _ = rectangular(1, 3)
domain = General_mesh(nodes, triangles)
assert num.allclose(domain.get_nodes(), nodes)
M = domain.number_of_triangles
E = domain.get_edge_midpoint_coordinates()
assert E.shape[0] == 3*M
for i in range(M):
k0 = triangles[i,0] #Index of vertex 0 in triangle i
k1 = triangles[i,1] #Index of vertex 1 in triangle i
k2 = triangles[i,2] #Index of vertex 2 in triangle i
assert num.allclose(E[3*i+0,:], 0.5*(nodes[k1]+nodes[k2]) )
assert num.allclose(E[3*i+1,:], 0.5*(nodes[k0]+nodes[k2]) )
assert num.allclose(E[3*i+2,:], 0.5*(nodes[k1]+nodes[k0]) )
def test_get_edge_midpoint_coordinates_with_geo_ref(self):
x0 = 314036.58727982
y0 = 6224951.2960092
geo = Geo_reference(56, x0, y0)
a = num.array([0.0, 0.0])
b = num.array([0.0, 2.0])
c = num.array([2.0, 0.0])
d = num.array([0.0, 4.0])
e = num.array([2.0, 2.0])
f = num.array([4.0, 0.0])
nodes = num.array([a, b, c, d, e, f])
nodes_absolute = geo.get_absolute(nodes)
# bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]], int)
domain = General_mesh(nodes, triangles, geo_reference=geo)
verts = domain.get_edge_midpoint_coordinates(triangle_id=0) # bac
msg = ("num.array(1/2[a+c,b+c,a+b])=\n%s\nshould be close to 'verts'=\n%s"
% (str(num.array([0.5*(a+c),0.5*(b+c),0.5*(a+b)])), str(verts)))
self.assertTrue(num.allclose(num.array([0.5*(a+c),0.5*(b+c),0.5*(a+b)]), verts), msg)
verts = domain.get_edge_midpoint_coordinates(triangle_id=0, absolute=True)
msg = ("num.array([...])=\n%s\nshould be close to 'verts'=\n%s"
% (str(0.5*num.array([nodes_absolute[0]+nodes_absolute[2],
nodes_absolute[1]+nodes_absolute[2],
nodes_absolute[1]+nodes_absolute[0]])),
str(verts)))
self.assertTrue(num.allclose(0.5*num.array([nodes_absolute[0]+nodes_absolute[2],
nodes_absolute[1]+nodes_absolute[2],
nodes_absolute[1]+nodes_absolute[0]]),
verts), msg)
def test_get_edge_midpoint_coordinates_triangle_id(self):
"""test_get_vertex_coordinates_triangle_id
Test that vertices for one triangle can be returned.
"""
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
#Create basic mesh
nodes, triangles, _ = rectangular(1, 3)
domain = General_mesh(nodes, triangles)
assert num.allclose(domain.get_nodes(), nodes)
M = domain.number_of_triangles
for i in range(M):
E = domain.get_edge_midpoint_coordinates(triangle_id=i)
assert E.shape[0] == 3
k0 = triangles[i,0] #Index of vertex 0 in triangle i
            k1 = triangles[i,1] #Index of vertex 1 in triangle i
            k2 = triangles[i,2] #Index of vertex 2 in triangle i
assert num.allclose(E[0,:], 0.5*(nodes[k1]+nodes[k2]))
assert num.allclose(E[1,:], 0.5*(nodes[k0]+nodes[k2]))
assert num.allclose(E[2,:], 0.5*(nodes[k1]+nodes[k0]))
E0 = domain.get_edge_midpoint_coordinate(i, 0 )
E1 = domain.get_edge_midpoint_coordinate(i, 1 )
E2 = domain.get_edge_midpoint_coordinate(i, 2 )
assert num.allclose(E0, 0.5*(nodes[k1]+nodes[k2]))
assert num.allclose(E1, 0.5*(nodes[k0]+nodes[k2]))
assert num.allclose(E2, 0.5*(nodes[k1]+nodes[k0]))
def test_get_vertex_values(self):
"""Get connectivity based on triangle lists.
"""
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
#Create basic mesh
nodes, triangles, _ = rectangular(1, 3)
domain = General_mesh(nodes, triangles)
msg = ("domain.get_triangles()=\n%s\nshould be the same as "
"'triangles'=\n%s"
% (str(domain.get_triangles()), str(triangles)))
assert num.allclose(domain.get_triangles(), triangles), msg
msg = ("domain.get_triangles([0,4])=\n%s\nshould be the same as "
"'[triangles[0], triangles[4]]' which is\n%s"
% (str(domain.get_triangles([0,4])),
str([triangles[0], triangles[4]])))
assert num.allclose(domain.get_triangles([0,4]),
[triangles[0], triangles[4]]), msg
def test_vertex_value_indices(self):
"""Check that structures are correct.
"""
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
nodes = num.array([a, b, c, d, e, f])
        # bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]])
domain1 = General_mesh(nodes, triangles)
#Create larger mesh
nodes, triangles, _ = rectangular(3, 6)
domain2 = General_mesh(nodes, triangles)
# Test both meshes
for domain in [domain1, domain2]:
assert sum(domain.number_of_triangles_per_node) ==\
len(domain.vertex_value_indices)
# Check number of triangles per node
count = [0]*domain.number_of_nodes
for triangle in domain.triangles:
for i in triangle:
count[i] += 1
#print count
#
assert num.allclose(count, domain.number_of_triangles_per_node)
# Check indices
current_node = 0
k = 0 # Track triangles touching on node
for index in domain.vertex_value_indices:
k += 1
triangle = old_div(index, 3)
vertex = index % 3
assert domain.triangles[triangle, vertex] == current_node
if domain.number_of_triangles_per_node[current_node] == k:
# Move on to next node
k = 0
current_node += 1
def test_get_triangles_and_vertices_per_node(self):
"""test_get_triangles_and_vertices_per_node -
Test that tuples of triangle, vertex can be extracted
from inverted triangles structure
"""
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
nodes = num.array([a, b, c, d, e, f])
        # bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]])
domain = General_mesh(nodes, triangles)
# One node
L = domain.get_triangles_and_vertices_per_node(node=2)
assert num.allclose(L[0], [0, 2])
assert num.allclose(L[1], [1, 1])
assert num.allclose(L[2], [2, 1])
# All nodes
ALL = domain.get_triangles_and_vertices_per_node()
assert len(ALL) == 6
for i, Lref in enumerate(ALL):
L = domain.get_triangles_and_vertices_per_node(node=i)
assert num.allclose(L, Lref)
def test_areas(self):
#Create basic mesh
points, vertices, boundary = rectangular(1, 3)
domain = General_mesh(points, vertices)
assert domain.get_area() == 1.0
def test_one_degenerate_triangles(self):
a = num.array([1.0, 1.0])
b = num.array([0.0, 2.0])
c = num.array([2.0, 0.0])
d = num.array([0.0, 4.0])
e = num.array([2.0, 2.0])
f = num.array([4.0, 0.0])
nodes = num.array([a, b, c, d, e, f])
# bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]], int)
        try:
            domain = General_mesh(nodes, triangles)
        except AssertionError:
            # expected assertion error
            pass
        else:
            self.fail('degenerate triangle should raise AssertionError')
def test_two_degenerate_triangles(self):
a = num.array([1.0, 1.0])
b = num.array([0.0, 2.0])
c = num.array([2.0, 0.0])
d = num.array([1.0, 2.0])
e = num.array([2.0, 2.0])
f = num.array([4.0, 0.0])
nodes = num.array([a, b, c, d, e, f])
# bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]], int)
        try:
            domain = General_mesh(nodes, triangles)
        except AssertionError:
            # expected assertion error
            pass
        else:
            self.fail('degenerate triangles should raise AssertionError')
def test_get_unique_vertex_values(self):
"""
get unique_vertex based on triangle lists.
"""
#Create basic mesh
points, vertices, boundary = rectangular(1, 3)
domain = General_mesh(points, vertices)
assert domain.get_unique_vertices() == [0,1,2,3,4,5,6,7]
unique_vertices = domain.get_unique_vertices([0,1,4])
assert unique_vertices == [0,1,2,4,5,6,7]
unique_vertices = domain.get_unique_vertices([0,4])
assert unique_vertices == [0,2,4,5,6,7]
def test_get_node(self):
"""test_get_triangles_and_vertices_per_node -
Test that tuples of triangle, vertex can be extracted
from inverted triangles structure
"""
x0 = 314036.58727982
y0 = 6224951.2960092
geo = Geo_reference(56, x0, y0)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
nodes = num.array([a, b, c, d, e, f])
nodes_absolute = geo.get_absolute(nodes)
# bac, bce, ecf, dbe
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]])
domain = General_mesh(nodes, triangles, geo_reference = geo)
node = domain.get_node(2)
msg = ('\nc=%s\nnode=%s' % (str(c), str(node)))
self.assertTrue(num.alltrue(c == node), msg)
# repeat get_node(), see if result same
node = domain.get_node(2)
msg = ('\nc=%s\nnode=%s' % (str(c), str(node)))
self.assertTrue(num.alltrue(c == node), msg)
node = domain.get_node(2, absolute=True)
msg = ('\nnodes_absolute[2]=%s\nnode=%s'
% (str(nodes_absolute[2]), str(node)))
self.assertTrue(num.alltrue(nodes_absolute[2] == node), msg)
# repeat get_node(2, absolute=True), see if result same
node = domain.get_node(2, absolute=True)
msg = ('\nnodes_absolute[2]=%s\nnode=%s'
% (str(nodes_absolute[2]), str(node)))
self.assertTrue(num.alltrue(nodes_absolute[2] == node), msg)
def test_assert_index_in_nodes(self):
"""test_assert_index_in_nodes -
Test that node indices in triangles are within nodes array.
"""
x0 = 314036.58727982
y0 = 6224951.2960092
geo = Geo_reference(56, x0, y0)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
nodes = num.array([a, b, c, d, e, f])
nodes_absolute = geo.get_absolute(nodes)
# max index is 5, use 5, expect success
triangles = num.array([[1,0,2], [1,2,4], [4,2,5], [3,1,4]])
General_mesh(nodes, triangles, geo_reference=geo)
# should fail with negative area
triangles = num.array([[0,1,2], [1,2,4], [4,2,5], [3,1,4]])
self.assertRaises(AssertionError, General_mesh,
nodes, triangles, geo_reference=geo)
# max index is 5, use 6, expect assert failure
triangles = num.array([[1,6,2], [1,2,4], [4,2,5], [3,1,4]])
self.assertRaises(AssertionError, General_mesh,
nodes, triangles, geo_reference=geo)
# max index is 5, use 10, expect assert failure
triangles = num.array([[1,10,2], [1,2,4], [4,2,5], [3,1,4]])
self.assertRaises(AssertionError, General_mesh,
nodes, triangles, geo_reference=geo)
################################################################################
if __name__ == "__main__":
suite = unittest.makeSuite(Test_General_Mesh, 'test')
runner = unittest.TextTestRunner()
runner.run(suite)
| 33.905769 | 93 | 0.527593 | 2,355 | 17,631 | 3.808493 | 0.086624 | 0.015163 | 0.009366 | 0.047385 | 0.827406 | 0.787936 | 0.725722 | 0.683131 | 0.665849 | 0.629056 | 0 | 0.059638 | 0.336169 | 17,631 | 519 | 94 | 33.971098 | 0.706681 | 0.109807 | 0 | 0.616393 | 0 | 0.009836 | 0.039525 | 0.021158 | 0 | 0 | 0 | 0 | 0.160656 | 1 | 0.055738 | false | 0.013115 | 0.059016 | 0 | 0.118033 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
fe6c18ed9d18ccdddd4d22529050322301a98d2d | 134 | py | Python | Basics/9.Libraries.py | AMZEnterprise/Python_Course_Jadi | 4c1b3512ae0292f897d3ae2aa6449be6a5adb514 | [
"MIT"
] | null | null | null | Basics/9.Libraries.py | AMZEnterprise/Python_Course_Jadi | 4c1b3512ae0292f897d3ae2aa6449be6a5adb514 | [
"MIT"
] | null | null | null | Basics/9.Libraries.py | AMZEnterprise/Python_Course_Jadi | 4c1b3512ae0292f897d3ae2aa6449be6a5adb514 | [
"MIT"
] | null | null | null | # Standard Libraries and bulit-in functions
import random
from random import randint
random.random()
print()
# Third Party Libraries | 16.75 | 43 | 0.798507 | 18 | 134 | 5.944444 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141791 | 134 | 8 | 44 | 16.75 | 0.930435 | 0.470149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
fe84b4969c27f8b4c20bf3935594814c60c29442 | 318 | py | Python | PythonCurso01/aula117_metodos_magicos/exemplo08.py | AlissonAnjos21/Aprendendo | 9454d9e53ef9fb8bc61bf481b6592164f5bf8695 | [
"MIT"
] | null | null | null | PythonCurso01/aula117_metodos_magicos/exemplo08.py | AlissonAnjos21/Aprendendo | 9454d9e53ef9fb8bc61bf481b6592164f5bf8695 | [
"MIT"
] | null | null | null | PythonCurso01/aula117_metodos_magicos/exemplo08.py | AlissonAnjos21/Aprendendo | 9454d9e53ef9fb8bc61bf481b6592164f5bf8695 | [
"MIT"
] | null | null | null | class Exemplo:
def __init__(self):
pass
    # It is not advisable to rely on this method: the Python documentation
    # notes that __del__ is not always called
    def __del__(self):
        print('Runs whenever the Python garbage collector reclaims this object, i.e. at the end of execution')
exemplo = Exemplo()
| 31.8 | 115 | 0.704403 | 47 | 318 | 4.595745 | 0.765957 | 0.074074 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238994 | 318 | 9 | 116 | 35.333333 | 0.892562 | 0.342767 | 0 | 0 | 0 | 0 | 0.42029 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0.333333 | false | 0.166667 | 0 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
fe8f0ed9c53cbc7e01c664c648d67f4cb15c6d61 | 262 | py | Python | 2021-03-08-Introduction-to-Python/examples/24-special-methods.py | s3rvac/talks | 469ea5d2d3d90527f77863b85746bbc2d7236cb1 | [
"BSD-3-Clause"
] | 2 | 2019-05-15T06:42:32.000Z | 2020-08-01T11:48:40.000Z | 2019-03-04-Introduction-to-Python/examples/24-special-methods.py | s3rvac/talks | 469ea5d2d3d90527f77863b85746bbc2d7236cb1 | [
"BSD-3-Clause"
] | null | null | null | 2019-03-04-Introduction-to-Python/examples/24-special-methods.py | s3rvac/talks | 469ea5d2d3d90527f77863b85746bbc2d7236cb1 | [
"BSD-3-Clause"
] | 1 | 2017-03-28T21:14:37.000Z | 2017-03-28T21:14:37.000Z | # A list that disallows assignments:
class MyList(list):
def __setitem__(self, index, value):
raise RuntimeError('assignment is not supported')
lst = MyList([1, 2, 3])  # avoid shadowing the built-in name `list`
print(lst[0]) # 1
# lst[0] = 2 # RuntimeError: assignment is not supported | 29.111111 | 60 | 0.683206 | 36 | 262 | 4.861111 | 0.638889 | 0.251429 | 0.274286 | 0.308571 | 0.411429 | 0 | 0 | 0 | 0 | 0 | 0.033175 | 0.194656 | 262 | 8 | 61 | 32.75 | 0.796209 | 0.358779 | 0 | 0 | 0 | 0 | 0.165644 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0.2 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
| 29.111111 | 60 | 0.683206 | 36 | 262 | 4.861111 | 0.638889 | 0.251429 | 0.274286 | 0.308571 | 0.411429 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033175 | 0.194656 | 262 | 8 | 61 | 32.75 | 0.796209 | 0.358779 | 0 | 0 | 0 | 0 | 0.165644 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0.2 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
feac253ae16e41427a0d1e15e9a5b922dd9aec99 | 273 | py | Python | veqtor_keras/layers/__init__.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | 1 | 2020-08-07T14:47:16.000Z | 2020-08-07T14:47:16.000Z | veqtor_keras/layers/__init__.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | null | null | null | veqtor_keras/layers/__init__.py | veqtor/veqtor_keras | 303f81b7c6aaa7962b288541275fe7ea618804b9 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from . import attention_layers
from . import time_delay_layers
# Cleanup symbols to avoid polluting namespace.
del absolute_import
del division
del print_function | 24.818182 | 47 | 0.860806 | 37 | 273 | 5.837838 | 0.486486 | 0.138889 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.124542 | 273 | 11 | 48 | 24.818182 | 0.903766 | 0.164835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0.25 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
22a95a969b273d51070df525e229aec0466e82e7 | 101 | py | Python | microcosm_sagemaker/tests/data_models/simple_prediction.py | globality-corp/microcosm-sagemaker | c112ea2c1f5c40c1973c292b73ca0fadbf461280 | [
"Apache-2.0"
] | null | null | null | microcosm_sagemaker/tests/data_models/simple_prediction.py | globality-corp/microcosm-sagemaker | c112ea2c1f5c40c1973c292b73ca0fadbf461280 | [
"Apache-2.0"
] | 15 | 2019-04-22T19:46:32.000Z | 2022-02-11T17:31:43.000Z | microcosm_sagemaker/tests/data_models/simple_prediction.py | globality-corp/microcosm-sagemaker | c112ea2c1f5c40c1973c292b73ca0fadbf461280 | [
"Apache-2.0"
] | null | null | null | from dataclasses import dataclass
@dataclass
class SimplePrediction:
uri: str
score: float
| 12.625 | 33 | 0.752475 | 11 | 101 | 6.909091 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207921 | 101 | 7 | 34 | 14.428571 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
22a98684208b5df9638f77d19f4291944f037cc4 | 3,469 | py | Python | electrumsv/gui/qt/wallet_api.py | CherryDT/electrumsv | 6b778b1c363e22286c3e3ef1bc5a2fa56955ac48 | [
"MIT"
] | 1 | 2021-12-28T10:52:11.000Z | 2021-12-28T10:52:11.000Z | electrumsv/gui/qt/wallet_api.py | SomberNight/electrumsv | 28262e3cab7b73e4960466f8aee252975953acf8 | [
"MIT"
] | null | null | null | electrumsv/gui/qt/wallet_api.py | SomberNight/electrumsv | 28262e3cab7b73e4960466f8aee252975953acf8 | [
"MIT"
] | null | null | null |
from decimal import Decimal
from functools import partial
from typing import Any, Optional, Iterable, Tuple
from PyQt5.QtCore import pyqtSignal, QObject
from electrumsv.app_state import app_state
from electrumsv.contacts import (ContactEntry, ContactIdentity, IdentitySystem, IdentityCheckResult)
class WalletAPI(QObject):
# TODO: ...
fiat_rate_changed = pyqtSignal(Decimal)
# TODO: ...
fiat_currency_changed = pyqtSignal(str)
contact_changed = pyqtSignal(bool, object, object)
def __init__(self, wallet_window: 'ElectrumWindow') -> None:
self.wallet_window = wallet_window
super().__init__(wallet_window)
app_state.app.identity_added_signal.connect(partial(self._on_contact_change, True))
app_state.app.identity_removed_signal.connect(partial(self._on_contact_change, False))
app_state.app.contact_added_signal.connect(partial(self._on_contact_change, True))
app_state.app.contact_removed_signal.connect(partial(self._on_contact_change, False))
# Contact related:
def add_identity(self, contact_id: int, system_id: IdentitySystem, system_data: str) -> None:
self.wallet_window.contacts.add_identity(contact_id, system_id, system_data)
def add_contact(self, system_id: IdentitySystem, label: str,
identity_data: Any) -> ContactEntry:
return self.wallet_window.contacts.add_contact(system_id, label, identity_data)
def remove_contacts(self, contact_ids: Iterable[int]) -> None:
self.wallet_window.contacts.remove_contacts(contact_ids)
def remove_identity(self, contact_id: int, identity_id: int) -> None:
self.wallet_window.contacts.remove_identity(contact_id, identity_id)
def set_label(self, contact_id: int, label: str) -> None:
self.wallet_window.contacts.set_label(contact_id, label)
def get_contact(self, contact_id: int) -> Optional[ContactEntry]:
return self.wallet_window.contacts.get_contact(contact_id)
def get_identities(self):
return self.wallet_window.contacts.get_contact_identities()
def check_label(self, label: str) -> IdentityCheckResult:
return self.wallet_window.contacts.check_label(label)
def check_identity_valid(self, system_id: IdentitySystem, system_data: Any,
skip_exists: Optional[bool]=False) -> IdentityCheckResult:
return self.wallet_window.contacts.check_identity_valid(system_id, system_data, skip_exists)
# Balance related.
def get_balance(self, account_id=None) -> int:
c, u, x = self.wallet_window.wallet.get_balance()
return c + u
def get_fiat_unit(self) -> Optional[str]:
fx = app_state.fx
if fx and fx.is_enabled():
return fx.get_currency()
def get_amount_and_units(self, amount: int) -> Tuple[str, str]:
return self.wallet_window.get_amount_and_units(amount)
# Fiat related.
def get_fiat_amount(self, sv_value: int) -> Optional[str]:
fx = app_state.fx
if fx and fx.is_enabled():
return fx.format_amount(sv_value)
def get_base_unit(self) -> str:
return app_state.base_unit()
def get_base_amount(self, sv_value: int) -> str:
return self.wallet_window.format_amount(sv_value)
def _on_contact_change(self, added: bool, contact: ContactEntry,
identity: Optional[ContactIdentity]=None) -> None:
self.contact_changed.emit(added, contact, identity)
| 38.120879 | 100 | 0.722398 | 451 | 3,469 | 5.254989 | 0.18847 | 0.081013 | 0.094515 | 0.091139 | 0.4 | 0.293671 | 0.244726 | 0.134177 | 0.134177 | 0.091139 | 0 | 0.000352 | 0.181897 | 3,469 | 90 | 101 | 38.544444 | 0.834743 | 0.019314 | 0 | 0.070175 | 0 | 0 | 0.004124 | 0 | 0 | 0 | 0 | 0.011111 | 0 | 1 | 0.298246 | false | 0 | 0.105263 | 0.140351 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
22c754b2ee19e7551983acff84ed921ee332a6ae | 86 | py | Python | Module-2-Python-Basics/Types-And-Variables/specify_str_type.py | CloudSkills/Coding-From-Scratch-Course | 9b2f75f0f39fe3703f6e10b8fed0834078f6cb73 | [
"MIT"
] | null | null | null | Module-2-Python-Basics/Types-And-Variables/specify_str_type.py | CloudSkills/Coding-From-Scratch-Course | 9b2f75f0f39fe3703f6e10b8fed0834078f6cb73 | [
"MIT"
] | null | null | null | Module-2-Python-Basics/Types-And-Variables/specify_str_type.py | CloudSkills/Coding-From-Scratch-Course | 9b2f75f0f39fe3703f6e10b8fed0834078f6cb73 | [
"MIT"
] | null | null | null | def greeting(name: str) -> str:
print(name)
return name
greeting(name='Mike') | 17.2 | 31 | 0.651163 | 12 | 86 | 4.666667 | 0.583333 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197674 | 86 | 5 | 32 | 17.2 | 0.811594 | 0 | 0 | 0 | 0 | 0 | 0.045977 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0.25 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
22e354a29410be9d095b29a517a7cb6df59feae3 | 374 | py | Python | djapati/models.py | subajat1/djapati | 58d29024b133415bcf676fe3b865ee8754df3b0a | [
"MIT"
] | null | null | null | djapati/models.py | subajat1/djapati | 58d29024b133415bcf676fe3b865ee8754df3b0a | [
"MIT"
] | 1 | 2020-02-12T03:05:53.000Z | 2020-02-12T03:05:53.000Z | djapati/models.py | subajat1/djapati | 58d29024b133415bcf676fe3b865ee8754df3b0a | [
"MIT"
] | null | null | null | from django.db import models
class Payload(models.Model):
head = models.CharField(max_length=64)
body = models.CharField(max_length=256)
icon = models.CharField(max_length=256)
url = models.CharField(max_length=256)
class Meta:
verbose_name = 'Payload'
verbose_name_plural = 'Payloads'
def __str__(self):
        return self.head  # the model defines no `name` field
| 23.375 | 43 | 0.684492 | 48 | 374 | 5.104167 | 0.541667 | 0.244898 | 0.293878 | 0.391837 | 0.330612 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037671 | 0.219251 | 374 | 15 | 44 | 24.933333 | 0.80137 | 0 | 0 | 0 | 0 | 0 | 0.040107 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.090909 | 0.090909 | 0.818182 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
22e9ee292062cbc7648a915c9e6bbfb2cbb07535 | 248 | py | Python | test_functools.py | lliryc/baytrino | 272bd17b7a7c9fb027e976adb7278ec5b582d36c | [
"Apache-2.0"
] | 6 | 2021-01-29T12:13:51.000Z | 2022-01-28T18:02:38.000Z | test_functools.py | lliryc/baytrino | 272bd17b7a7c9fb027e976adb7278ec5b582d36c | [
"Apache-2.0"
] | null | null | null | test_functools.py | lliryc/baytrino | 272bd17b7a7c9fb027e976adb7278ec5b582d36c | [
"Apache-2.0"
] | null | null | null | import functools
@functools.lru_cache(500)
def calc(*args) -> str:
return " ".join(args)
print(calc("test"))
print(calc("test1", "test2"))
print(calc("test2", "test1"))
print(calc("test1", "test2"))
print(calc("test"))
print(calc.cache_info())
| 19.076923 | 29 | 0.669355 | 35 | 248 | 4.685714 | 0.457143 | 0.329268 | 0.158537 | 0.219512 | 0.5 | 0.341463 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0.092742 | 248 | 12 | 30 | 20.666667 | 0.688889 | 0 | 0 | 0.4 | 0 | 0 | 0.157258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | true | 0 | 0.1 | 0.1 | 0.3 | 0.6 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
22f157d4a1d69ba98add3f0c019819a22707e46c | 119 | py | Python | hack/arp/res.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | 1 | 2019-07-09T09:59:39.000Z | 2019-07-09T09:59:39.000Z | hack/arp/res.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | null | null | null | hack/arp/res.py | MisterZhouZhou/pythonLearn | 8933c7a6d444d3d86a173984e6cf4c08dbf84039 | [
"Apache-2.0"
] | null | null | null | from scapy.all import *
from scapy.layers.l2 import ARP
# Query the MAC address of a host on the local network
res = sr1(ARP(pdst='192.168.1.60'))
print(res) | 19.833333 | 35 | 0.731092 | 21 | 119 | 4.142857 | 0.761905 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104762 | 0.117647 | 119 | 6 | 36 | 19.833333 | 0.72381 | 0.109244 | 0 | 0 | 0 | 0 | 0.114286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0.25 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
fe032e598984d27fbe76949a36854a5b32e171ec | 2,689 | py | Python | 011_Largest_product_in_a_grid.py | gradam/project-euler | d939ad54d057fd8ed2db06948b6ccc7833d68ab6 | [
"MIT"
] | null | null | null | 011_Largest_product_in_a_grid.py | gradam/project-euler | d939ad54d057fd8ed2db06948b6ccc7833d68ab6 | [
"MIT"
] | null | null | null | 011_Largest_product_in_a_grid.py | gradam/project-euler | d939ad54d057fd8ed2db06948b6ccc7833d68ab6 | [
"MIT"
] | null | null | null | from datetime import datetime
startTime = datetime.now()
grd="08 02 22 97 38 15 00 40 00 75 04 05 07 78 52 12 50 77 91 08 49 49 99 40 17 81 18 57 60 87 17 40 98 43 69 48 04 56 62 00 81 49 31 73 55 79 14 29 93 71 40 67 53 88 30 03 49 13 36 65 52 70 95 23 04 60 11 42 69 24 68 56 01 32 56 71 37 02 36 91 22 31 16 71 51 67 63 89 41 92 36 54 22 40 40 28 66 33 13 80 24 47 32 60 99 03 45 02 44 75 33 53 78 36 84 20 35 17 12 50 32 98 81 28 64 23 67 10 26 38 40 67 59 54 70 66 18 38 64 70 67 26 20 68 02 62 12 20 95 63 94 39 63 08 40 91 66 49 94 21 24 55 58 05 66 73 99 26 97 17 78 78 96 83 14 88 34 89 63 72 21 36 23 09 75 00 76 44 20 45 35 14 00 61 33 97 34 31 33 95 78 17 53 28 22 75 31 67 15 94 03 80 04 62 16 14 09 53 56 92 16 39 05 42 96 35 31 47 55 58 88 24 00 17 54 24 36 29 85 57 86 56 00 48 35 71 89 07 05 44 44 37 44 60 21 58 51 54 17 58 19 80 81 68 05 94 47 69 28 73 92 13 86 52 17 77 04 89 55 40 04 52 08 83 97 35 99 16 07 97 57 32 16 26 26 79 33 27 98 66 88 36 68 87 57 62 20 72 03 46 33 67 46 55 12 32 63 93 53 69 04 42 16 73 38 25 39 11 24 94 72 18 08 46 29 32 40 62 76 36 20 69 36 41 72 30 23 88 34 62 99 69 82 67 59 85 74 04 36 16 20 73 35 29 78 31 90 01 74 31 49 71 48 86 81 16 23 57 05 54 01 70 54 71 83 51 54 69 16 92 33 48 61 43 52 01 89 19 67 48"
lgrd = grd.split()
grd3 = []
tym1 = []
grd4 = []
sum_poz = []
sum_pion = []
sum_kos_dol = []
sum_kos_gor = []
z=0
ind_a = 0
ind2_a = 3
kon = []
# convert the flat string into a 20x20 grid of ints
for x in lgrd:
tym1.append(x)
if len(tym1) == 20:
grd3.append(tym1)
tym1=[]
for x in grd3:
    grd4.append([int(i) for i in x])
# horizontal products
for a in grd4:
    tym2 = 0
    while tym2 < 17:
        num = a[tym2]*a[tym2+1]*a[tym2+2]*a[tym2+3]
sum_poz.append(num)
tym2+=1
for x in sum_poz:
kon.append(x)
# vertical products
while z < 20:
ind=0
while ind<17:
tym2 = []
for a in grd4[ind:ind+4]:
tym2.append(a[z])
sum_pion.append(tym2[0]*tym2[1]*tym2[2]*tym2[3])
ind+=1
z+=1
for x in sum_pion:
kon.append(x)
#suma po skosie w dol
while ind_a<17:
ind_b = 0
while ind_b<17:
sum_kos_dol.append(grd4[ind_a][ind_b]*grd4[ind_a+1][ind_b+1]*grd4[ind_a+2][ind_b+2]*grd4[ind_a+3][ind_b+3])
ind_b+=1
ind_a+=1
for x in sum_kos_dol:
kon.append(x)
#suma po skosie w gore
while ind2_a<20:
ind2_b=0
while ind2_b<17:
sum_kos_gor.append(grd4[ind2_a][ind2_b]*grd4[ind2_a-1][ind2_b+1]*grd4[ind2_a-2][ind2_b+2]*grd4[ind2_a-3][ind2_b+3])
ind2_b+=1
ind2_a+=1
for x in sum_kos_gor:
kon.append(x)
# wyliczanie największej wartości.
print(max(kon))
print (datetime.now() - startTime) | 40.134328 | 1,205 | 0.628858 | 686 | 2,689 | 2.397959 | 0.223032 | 0.017021 | 0.021885 | 0.017021 | 0.070517 | 0.058359 | 0.044985 | 0 | 0 | 0 | 0 | 0.466494 | 0.28412 | 2,689 | 67 | 1,206 | 40.134328 | 0.388052 | 0.04537 | 0 | 0.098361 | 0 | 0.016393 | 0.467994 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016393 | 0 | 0.016393 | 0.032787 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
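The record above scans the four directions with separate hand-rolled loops. As a cross-check, the same brute-force scan can be sketched as a single function; the names below are illustrative and not part of the original file, and a small grid stands in for the full 20x20 problem grid:

```python
def largest_product(grid, window=4):
    """Largest product of `window` adjacent numbers in any of the four directions."""
    n = len(grid)
    best = 0
    # right, down, down-right, up-right
    directions = [(0, 1), (1, 0), (1, 1), (-1, 1)]
    for r in range(n):
        for c in range(n):
            for dr, dc in directions:
                # last cell of the window must stay inside the grid
                rr, cc = r + (window - 1) * dr, c + (window - 1) * dc
                if 0 <= rr < n and 0 <= cc < n:
                    p = 1
                    for k in range(window):
                        p *= grid[r + k * dr][c + k * dc]
                    best = max(best, p)
    return best

small = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(largest_product(small, window=2))  # → 72 (8 * 9 in the bottom row)
```

Running it on the 20x20 grid from the record with `window=4` reproduces the answer the corrected script prints.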
a3a6abdd7a3c0f9aba4153e4add8c713a63edcfc | 126 | py | Python | notebooks/exercise_solutions/n03_kinematics_define-point.py | pydy/pydy-tutorial-human-standing | 72b1d8513e339e9b10e501bd3490caa3fa997bc4 | [
"CC-BY-4.0"
] | 134 | 2015-05-19T15:24:18.000Z | 2022-03-12T09:39:03.000Z | notebooks/exercise_solutions/n03_kinematics_define-point.py | pydy/pydy-tutorial-human-standing | 72b1d8513e339e9b10e501bd3490caa3fa997bc4 | [
"CC-BY-4.0"
] | 46 | 2015-05-05T18:08:20.000Z | 2022-01-28T11:12:42.000Z | notebooks/exercise_solutions/n03_kinematics_define-point.py | pydy/pydy-tutorial-pycon-2014 | 72b1d8513e339e9b10e501bd3490caa3fa997bc4 | [
"CC-BY-4.0"
] | 62 | 2015-06-16T01:50:51.000Z | 2022-02-26T07:39:41.000Z | # exercise solution; assumes `symbols`, `Point`, `knee`, `ankle` and
# `upper_leg_frame` are defined in earlier notebook cells (sympy / sympy.physics.mechanics)
upper_leg_length = symbols('l_U')
hip = Point('H')
hip.set_pos(knee, upper_leg_length * upper_leg_frame.y)
hip.pos_from(ankle) | 31.5 | 55 | 0.769841 | 24 | 126 | 3.666667 | 0.666667 | 0.272727 | 0.318182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079365 | 126 | 4 | 56 | 31.5 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0.031496 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a3b33ba6a11c9b443b7433b520dbd501dc9eea56 | 20,091 | py | Python | xero_python/finance/models/data_source_response.py | gavinwhyte/xero-python | 53a028c3b7c51da1db203b616bf7b7a028a4a1d2 | [
"MIT"
] | 1 | 2022-01-22T20:50:36.000Z | 2022-01-22T20:50:36.000Z | xero_python/finance/models/data_source_response.py | kos7138/xero-python | fd4b00016366880d61b42437397e732f53fc8ce2 | [
"MIT"
] | null | null | null | xero_python/finance/models/data_source_response.py | kos7138/xero-python | fd4b00016366880d61b42437397e732f53fc8ce2 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
Xero Finance API
The Finance API is a collection of endpoints which customers can use in the course of a loan application, which may assist lenders to gain the confidence they need to provide capital. # noqa: E501
Contact: api@xero.com
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
from xero_python.models import BaseModel
class DataSourceResponse(BaseModel):
"""NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
"""
Attributes:
openapi_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
openapi_types = {
"direct_bank_feed": "float",
"indirect_bank_feed": "float",
"file_upload": "float",
"manual": "float",
"direct_bank_feed_pos": "float",
"indirect_bank_feed_pos": "float",
"file_upload_pos": "float",
"manual_pos": "float",
"direct_bank_feed_neg": "float",
"indirect_bank_feed_neg": "float",
"file_upload_neg": "float",
"manual_neg": "float",
"other_pos": "float",
"other_neg": "float",
"other": "float",
}
attribute_map = {
"direct_bank_feed": "directBankFeed",
"indirect_bank_feed": "indirectBankFeed",
"file_upload": "fileUpload",
"manual": "manual",
"direct_bank_feed_pos": "directBankFeedPos",
"indirect_bank_feed_pos": "indirectBankFeedPos",
"file_upload_pos": "fileUploadPos",
"manual_pos": "manualPos",
"direct_bank_feed_neg": "directBankFeedNeg",
"indirect_bank_feed_neg": "indirectBankFeedNeg",
"file_upload_neg": "fileUploadNeg",
"manual_neg": "manualNeg",
"other_pos": "otherPos",
"other_neg": "otherNeg",
"other": "other",
}
def __init__(
self,
direct_bank_feed=None,
indirect_bank_feed=None,
file_upload=None,
manual=None,
direct_bank_feed_pos=None,
indirect_bank_feed_pos=None,
file_upload_pos=None,
manual_pos=None,
direct_bank_feed_neg=None,
indirect_bank_feed_neg=None,
file_upload_neg=None,
manual_neg=None,
other_pos=None,
other_neg=None,
other=None,
): # noqa: E501
"""DataSourceResponse - a model defined in OpenAPI""" # noqa: E501
self._direct_bank_feed = None
self._indirect_bank_feed = None
self._file_upload = None
self._manual = None
self._direct_bank_feed_pos = None
self._indirect_bank_feed_pos = None
self._file_upload_pos = None
self._manual_pos = None
self._direct_bank_feed_neg = None
self._indirect_bank_feed_neg = None
self._file_upload_neg = None
self._manual_neg = None
self._other_pos = None
self._other_neg = None
self._other = None
self.discriminator = None
if direct_bank_feed is not None:
self.direct_bank_feed = direct_bank_feed
if indirect_bank_feed is not None:
self.indirect_bank_feed = indirect_bank_feed
if file_upload is not None:
self.file_upload = file_upload
if manual is not None:
self.manual = manual
if direct_bank_feed_pos is not None:
self.direct_bank_feed_pos = direct_bank_feed_pos
if indirect_bank_feed_pos is not None:
self.indirect_bank_feed_pos = indirect_bank_feed_pos
if file_upload_pos is not None:
self.file_upload_pos = file_upload_pos
if manual_pos is not None:
self.manual_pos = manual_pos
if direct_bank_feed_neg is not None:
self.direct_bank_feed_neg = direct_bank_feed_neg
if indirect_bank_feed_neg is not None:
self.indirect_bank_feed_neg = indirect_bank_feed_neg
if file_upload_neg is not None:
self.file_upload_neg = file_upload_neg
if manual_neg is not None:
self.manual_neg = manual_neg
if other_pos is not None:
self.other_pos = other_pos
if other_neg is not None:
self.other_neg = other_neg
if other is not None:
self.other = other
@property
def direct_bank_feed(self):
"""Gets the direct_bank_feed of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:return: The direct_bank_feed of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._direct_bank_feed
@direct_bank_feed.setter
def direct_bank_feed(self, direct_bank_feed):
"""Sets the direct_bank_feed of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:param direct_bank_feed: The direct_bank_feed of this DataSourceResponse. # noqa: E501
:type: float
"""
self._direct_bank_feed = direct_bank_feed
@property
def indirect_bank_feed(self):
"""Gets the indirect_bank_feed of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. # noqa: E501
:return: The indirect_bank_feed of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._indirect_bank_feed
@indirect_bank_feed.setter
def indirect_bank_feed(self, indirect_bank_feed):
"""Sets the indirect_bank_feed of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. # noqa: E501
:param indirect_bank_feed: The indirect_bank_feed of this DataSourceResponse. # noqa: E501
:type: float
"""
self._indirect_bank_feed = indirect_bank_feed
@property
def file_upload(self):
"""Gets the file_upload of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:return: The file_upload of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._file_upload
@file_upload.setter
def file_upload(self, file_upload):
"""Sets the file_upload of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:param file_upload: The file_upload of this DataSourceResponse. # noqa: E501
:type: float
"""
self._file_upload = file_upload
@property
def manual(self):
"""Gets the manual of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:return: The manual of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._manual
@manual.setter
def manual(self, manual):
"""Sets the manual of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. # noqa: E501
:param manual: The manual of this DataSourceResponse. # noqa: E501
:type: float
"""
self._manual = manual
@property
def direct_bank_feed_pos(self):
"""Gets the direct_bank_feed_pos of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:return: The direct_bank_feed_pos of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._direct_bank_feed_pos
@direct_bank_feed_pos.setter
def direct_bank_feed_pos(self, direct_bank_feed_pos):
"""Sets the direct_bank_feed_pos of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:param direct_bank_feed_pos: The direct_bank_feed_pos of this DataSourceResponse. # noqa: E501
:type: float
"""
self._direct_bank_feed_pos = direct_bank_feed_pos
@property
def indirect_bank_feed_pos(self):
"""Gets the indirect_bank_feed_pos of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:return: The indirect_bank_feed_pos of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._indirect_bank_feed_pos
@indirect_bank_feed_pos.setter
def indirect_bank_feed_pos(self, indirect_bank_feed_pos):
"""Sets the indirect_bank_feed_pos of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:param indirect_bank_feed_pos: The indirect_bank_feed_pos of this DataSourceResponse. # noqa: E501
:type: float
"""
self._indirect_bank_feed_pos = indirect_bank_feed_pos
@property
def file_upload_pos(self):
"""Gets the file_upload_pos of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:return: The file_upload_pos of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._file_upload_pos
@file_upload_pos.setter
def file_upload_pos(self, file_upload_pos):
"""Sets the file_upload_pos of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:param file_upload_pos: The file_upload_pos of this DataSourceResponse. # noqa: E501
:type: float
"""
self._file_upload_pos = file_upload_pos
@property
def manual_pos(self):
"""Gets the manual_pos of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:return: The manual_pos of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._manual_pos
@manual_pos.setter
def manual_pos(self, manual_pos):
"""Sets the manual_pos of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:param manual_pos: The manual_pos of this DataSourceResponse. # noqa: E501
:type: float
"""
self._manual_pos = manual_pos
@property
def direct_bank_feed_neg(self):
"""Gets the direct_bank_feed_neg of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:return: The direct_bank_feed_neg of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._direct_bank_feed_neg
@direct_bank_feed_neg.setter
def direct_bank_feed_neg(self, direct_bank_feed_neg):
"""Sets the direct_bank_feed_neg of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a direct bank feed in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:param direct_bank_feed_neg: The direct_bank_feed_neg of this DataSourceResponse. # noqa: E501
:type: float
"""
self._direct_bank_feed_neg = direct_bank_feed_neg
@property
def indirect_bank_feed_neg(self):
"""Gets the indirect_bank_feed_neg of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:return: The indirect_bank_feed_neg of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._indirect_bank_feed_neg
@indirect_bank_feed_neg.setter
def indirect_bank_feed_neg(self, indirect_bank_feed_neg):
"""Sets the indirect_bank_feed_neg of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was an indirect bank feed to Xero (usually via Yodlee). This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:param indirect_bank_feed_neg: The indirect_bank_feed_neg of this DataSourceResponse. # noqa: E501
:type: float
"""
self._indirect_bank_feed_neg = indirect_bank_feed_neg
@property
def file_upload_neg(self):
"""Gets the file_upload_neg of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:return: The file_upload_neg of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._file_upload_neg
@file_upload_neg.setter
def file_upload_neg(self, file_upload_neg):
"""Sets the file_upload_neg of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was a CSV file upload in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:param file_upload_neg: The file_upload_neg of this DataSourceResponse. # noqa: E501
:type: float
"""
self._file_upload_neg = file_upload_neg
@property
def manual_neg(self):
"""Gets the manual_neg of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:return: The manual_neg of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._manual_neg
@manual_neg.setter
def manual_neg(self, manual_neg):
"""Sets the manual_neg of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was manually keyed in to Xero. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:param manual_neg: The manual_neg of this DataSourceResponse. # noqa: E501
:type: float
"""
self._manual_neg = manual_neg
@property
def other_pos(self):
"""Gets the other_pos of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:return: The other_pos of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._other_pos
@other_pos.setter
def other_pos(self, other_pos):
"""Sets the other_pos of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. Only positive transactions are included. # noqa: E501
:param other_pos: The other_pos of this DataSourceResponse. # noqa: E501
:type: float
"""
self._other_pos = other_pos
@property
def other_neg(self):
"""Gets the other_neg of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:return: The other_neg of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._other_neg
@other_neg.setter
def other_neg(self, other_neg):
"""Sets the other_neg of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. Only negative transactions are included. # noqa: E501
:param other_neg: The other_neg of this DataSourceResponse. # noqa: E501
:type: float
"""
self._other_neg = other_neg
@property
def other(self):
"""Gets the other of this DataSourceResponse. # noqa: E501
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. # noqa: E501
:return: The other of this DataSourceResponse. # noqa: E501
:rtype: float
"""
return self._other
@other.setter
def other(self, other):
"""Sets the other of this DataSourceResponse.
Sum of the amounts of all statement lines where the source of the data was any other category. This gives an indication on the certainty of correctness of the data. # noqa: E501
:param other: The other of this DataSourceResponse. # noqa: E501
:type: float
"""
self._other = other
| 41.85625 | 261 | 0.681698 | 2,785 | 20,091 | 4.727828 | 0.048833 | 0.076555 | 0.066986 | 0.095694 | 0.835878 | 0.768284 | 0.732589 | 0.697425 | 0.683299 | 0.619883 | 0 | 0.016017 | 0.260415 | 20,091 | 479 | 262 | 41.943633 | 0.870112 | 0.58016 | 0 | 0.078534 | 0 | 0 | 0.096562 | 0.012607 | 0 | 0 | 0 | 0 | 0 | 1 | 0.162304 | false | 0 | 0.010471 | 0 | 0.267016 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a3bbf8cd4c7152a23b8a13fdc6a067e0f7b2de50 | 148 | py | Python | src/tap_apple_search_ads/api/__init__.py | mighty-digital/tap-apple-search-ads | de7de13509c06e4ce4ef89884b23a9b9d7182d56 | [
"MIT"
] | 1 | 2022-01-18T15:04:40.000Z | 2022-01-18T15:04:40.000Z | src/tap_apple_search_ads/api/__init__.py | mighty-digital/tap-apple-search-ads | de7de13509c06e4ce4ef89884b23a9b9d7182d56 | [
"MIT"
] | null | null | null | src/tap_apple_search_ads/api/__init__.py | mighty-digital/tap-apple-search-ads | de7de13509c06e4ce4ef89884b23a9b9d7182d56 | [
"MIT"
] | null | null | null | """Apple Search Ads API"""
from tap_apple_search_ads.api import auth, utils
__all__ = [
"auth",
"utils",
]
API_DATE_FORMAT = r"%Y-%m-%d"
| 13.454545 | 48 | 0.635135 | 23 | 148 | 3.695652 | 0.695652 | 0.258824 | 0.329412 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.195946 | 148 | 10 | 49 | 14.8 | 0.714286 | 0.135135 | 0 | 0 | 0 | 0 | 0.139344 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
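The `API_DATE_FORMAT` constant in the record above is a standard strftime pattern. A quick sketch of how such a pattern round-trips between `datetime` objects and API date strings (the variable names here are illustrative, not from the tap itself):

```python
from datetime import datetime

API_DATE_FORMAT = r"%Y-%m-%d"  # same pattern as the tap's constant

d = datetime(2024, 3, 7)
s = d.strftime(API_DATE_FORMAT)           # datetime -> API string
print(s)                                  # → 2024-03-07
parsed = datetime.strptime(s, API_DATE_FORMAT)  # API string -> datetime
print(parsed.date())                      # → 2024-03-07
```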
a3ca05ce1f9774fb3dcd5fdf95d5cec90541e229 | 10,622 | py | Python | sdk/python/pulumi_azure/healthcare/outputs.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/healthcare/outputs.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/healthcare/outputs.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = [
'ServiceAuthenticationConfiguration',
'ServiceCorsConfiguration',
'GetServiceAuthenticationConfigurationResult',
'GetServiceCorsConfigurationResult',
]
@pulumi.output_type
class ServiceAuthenticationConfiguration(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "smartProxyEnabled":
suggest = "smart_proxy_enabled"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in ServiceAuthenticationConfiguration. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
ServiceAuthenticationConfiguration.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
ServiceAuthenticationConfiguration.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
audience: Optional[str] = None,
authority: Optional[str] = None,
smart_proxy_enabled: Optional[bool] = None):
"""
:param str audience: The intended audience to receive authentication tokens for the service. The default value is https://azurehealthcareapis.com
:param str authority: The Azure Active Directory (tenant) that serves as the authentication authority to access the service. The default authority is the Directory defined in the authentication scheme in use when running this provider.
Authority must be registered to Azure AD and in the following format: https://{Azure-AD-endpoint}/{tenant-id}.
:param bool smart_proxy_enabled: Enables the 'SMART on FHIR' option for mobile and web implementations.
"""
if audience is not None:
pulumi.set(__self__, "audience", audience)
if authority is not None:
pulumi.set(__self__, "authority", authority)
if smart_proxy_enabled is not None:
pulumi.set(__self__, "smart_proxy_enabled", smart_proxy_enabled)
@property
@pulumi.getter
def audience(self) -> Optional[str]:
"""
The intended audience to receive authentication tokens for the service. The default value is https://azurehealthcareapis.com
"""
return pulumi.get(self, "audience")
@property
@pulumi.getter
def authority(self) -> Optional[str]:
"""
The Azure Active Directory (tenant) that serves as the authentication authority to access the service. The default authority is the Directory defined in the authentication scheme in use when running this provider.
Authority must be registered to Azure AD and in the following format: https://{Azure-AD-endpoint}/{tenant-id}.
"""
return pulumi.get(self, "authority")
@property
@pulumi.getter(name="smartProxyEnabled")
def smart_proxy_enabled(self) -> Optional[bool]:
"""
Enables the 'SMART on FHIR' option for mobile and web implementations.
"""
return pulumi.get(self, "smart_proxy_enabled")
@pulumi.output_type
class ServiceCorsConfiguration(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "allowCredentials":
suggest = "allow_credentials"
elif key == "allowedHeaders":
suggest = "allowed_headers"
elif key == "allowedMethods":
suggest = "allowed_methods"
elif key == "allowedOrigins":
suggest = "allowed_origins"
elif key == "maxAgeInSeconds":
suggest = "max_age_in_seconds"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in ServiceCorsConfiguration. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
ServiceCorsConfiguration.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
ServiceCorsConfiguration.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
allow_credentials: Optional[bool] = None,
allowed_headers: Optional[Sequence[str]] = None,
allowed_methods: Optional[Sequence[str]] = None,
allowed_origins: Optional[Sequence[str]] = None,
max_age_in_seconds: Optional[int] = None):
"""
:param bool allow_credentials: If credentials are allowed via CORS.
:param Sequence[str] allowed_headers: A set of headers to be allowed via CORS.
:param Sequence[str] allowed_methods: The methods to be allowed via CORS.
:param Sequence[str] allowed_origins: A set of origins to be allowed via CORS.
:param int max_age_in_seconds: The max age to be allowed via CORS.
"""
if allow_credentials is not None:
pulumi.set(__self__, "allow_credentials", allow_credentials)
if allowed_headers is not None:
pulumi.set(__self__, "allowed_headers", allowed_headers)
if allowed_methods is not None:
pulumi.set(__self__, "allowed_methods", allowed_methods)
if allowed_origins is not None:
pulumi.set(__self__, "allowed_origins", allowed_origins)
if max_age_in_seconds is not None:
pulumi.set(__self__, "max_age_in_seconds", max_age_in_seconds)
@property
@pulumi.getter(name="allowCredentials")
def allow_credentials(self) -> Optional[bool]:
"""
If credentials are allowed via CORS.
"""
return pulumi.get(self, "allow_credentials")
@property
@pulumi.getter(name="allowedHeaders")
def allowed_headers(self) -> Optional[Sequence[str]]:
"""
A set of headers to be allowed via CORS.
"""
return pulumi.get(self, "allowed_headers")
@property
@pulumi.getter(name="allowedMethods")
def allowed_methods(self) -> Optional[Sequence[str]]:
"""
The methods to be allowed via CORS.
"""
return pulumi.get(self, "allowed_methods")
@property
@pulumi.getter(name="allowedOrigins")
def allowed_origins(self) -> Optional[Sequence[str]]:
"""
A set of origins to be allowed via CORS.
"""
return pulumi.get(self, "allowed_origins")
@property
@pulumi.getter(name="maxAgeInSeconds")
def max_age_in_seconds(self) -> Optional[int]:
"""
The max age to be allowed via CORS.
"""
return pulumi.get(self, "max_age_in_seconds")
@pulumi.output_type
class GetServiceAuthenticationConfigurationResult(dict):
def __init__(__self__, *,
audience: str,
authority: str,
smart_proxy_enabled: bool):
"""
:param str audience: The intended audience to receive authentication tokens for the service.
:param str authority: The Azure Active Directory (tenant) that serves as the authentication authority to access the service.
:param bool smart_proxy_enabled: Is the 'SMART on FHIR' option for mobile and web implementations enabled?
"""
pulumi.set(__self__, "audience", audience)
pulumi.set(__self__, "authority", authority)
pulumi.set(__self__, "smart_proxy_enabled", smart_proxy_enabled)
@property
@pulumi.getter
def audience(self) -> str:
"""
The intended audience to receive authentication tokens for the service.
"""
return pulumi.get(self, "audience")
@property
@pulumi.getter
def authority(self) -> str:
"""
The Azure Active Directory (tenant) that serves as the authentication authority to access the service.
"""
return pulumi.get(self, "authority")
@property
@pulumi.getter(name="smartProxyEnabled")
def smart_proxy_enabled(self) -> bool:
"""
Is the 'SMART on FHIR' option for mobile and web implementations enabled?
"""
return pulumi.get(self, "smart_proxy_enabled")
@pulumi.output_type
class GetServiceCorsConfigurationResult(dict):
def __init__(__self__, *,
allow_credentials: bool,
allowed_headers: Sequence[str],
allowed_methods: Sequence[str],
allowed_origins: Sequence[str],
max_age_in_seconds: int):
"""
:param bool allow_credentials: Are credentials allowed via CORS?
:param Sequence[str] allowed_headers: The set of headers to be allowed via CORS.
:param Sequence[str] allowed_methods: The methods to be allowed via CORS.
:param Sequence[str] allowed_origins: The set of origins to be allowed via CORS.
:param int max_age_in_seconds: The max age to be allowed via CORS.
"""
pulumi.set(__self__, "allow_credentials", allow_credentials)
pulumi.set(__self__, "allowed_headers", allowed_headers)
pulumi.set(__self__, "allowed_methods", allowed_methods)
pulumi.set(__self__, "allowed_origins", allowed_origins)
pulumi.set(__self__, "max_age_in_seconds", max_age_in_seconds)
@property
@pulumi.getter(name="allowCredentials")
def allow_credentials(self) -> bool:
"""
Are credentials allowed via CORS?
"""
return pulumi.get(self, "allow_credentials")
@property
@pulumi.getter(name="allowedHeaders")
def allowed_headers(self) -> Sequence[str]:
"""
The set of headers to be allowed via CORS.
"""
return pulumi.get(self, "allowed_headers")
@property
@pulumi.getter(name="allowedMethods")
def allowed_methods(self) -> Sequence[str]:
"""
The methods to be allowed via CORS.
"""
return pulumi.get(self, "allowed_methods")
@property
@pulumi.getter(name="allowedOrigins")
def allowed_origins(self) -> Sequence[str]:
"""
The set of origins to be allowed via CORS.
"""
return pulumi.get(self, "allowed_origins")
@property
@pulumi.getter(name="maxAgeInSeconds")
def max_age_in_seconds(self) -> int:
"""
The max age to be allowed via CORS.
"""
return pulumi.get(self, "max_age_in_seconds")
| 38.766423 | 243 | 0.651007 | 1,210 | 10,622 | 5.495868 | 0.117355 | 0.030075 | 0.042105 | 0.045714 | 0.783308 | 0.745865 | 0.730376 | 0.652632 | 0.652632 | 0.650226 | 0 | 0.000126 | 0.255413 | 10,622 | 273 | 244 | 38.908425 | 0.840688 | 0.282715 | 0 | 0.571429 | 1 | 0.012422 | 0.170548 | 0.027549 | 0 | 0 | 0 | 0 | 0 | 1 | 0.161491 | false | 0 | 0.031056 | 0 | 0.341615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
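The generated classes in the record above each carry an `attribute_map` from snake_case Python attribute names to camelCase JSON keys. A minimal, self-contained sketch of that translation pattern (not the actual Pulumi SDK machinery; the two sample keys are borrowed from the CORS class):

```python
# attribute_map style used by generated SDK classes: python attr -> JSON key
attribute_map = {
    "allow_credentials": "allowCredentials",
    "max_age_in_seconds": "maxAgeInSeconds",
}
# invert once: JSON key -> python attr
json_to_attr = {v: k for k, v in attribute_map.items()}

def from_json(payload: dict) -> dict:
    """Rename camelCase payload keys to their snake_case attribute names."""
    return {json_to_attr.get(k, k): v for k, v in payload.items()}

print(from_json({"allowCredentials": True, "maxAgeInSeconds": 3600}))
# → {'allow_credentials': True, 'max_age_in_seconds': 3600}
```

Keeping the forward map as the source of truth and inverting it at import time is the same design choice the generated code makes: one dict drives both serialization directions.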
a3f3bc7465977a51100ac6afd5aa22ba19af5fdc | 148 | py | Python | python-intro/exercicios/problem20.py | elzasimoes/selenium-python | 50e4ad8a46864b06193eda09aa2a2a047f98974c | [
"CC0-1.0"
] | 2 | 2020-06-03T04:45:35.000Z | 2020-07-10T03:21:17.000Z | python-intro/exercicios/problem20.py | elzasimoes/selenium-python | 50e4ad8a46864b06193eda09aa2a2a047f98974c | [
"CC0-1.0"
] | null | null | null | python-intro/exercicios/problem20.py | elzasimoes/selenium-python | 50e4ad8a46864b06193eda09aa2a2a047f98974c | [
"CC0-1.0"
] | 2 | 2020-06-03T11:52:33.000Z | 2020-06-07T00:01:57.000Z | #Based on the previous exercises, build a dictionary with the
#following keys:
#lista, somatório, tamanho, maior valor and menor valor
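A minimal sketch of the requested dictionary, assuming the list from the earlier exercises is available under the hypothetical name `valores`:

```python
valores = [3, 7, 1, 9]  # placeholder for the list built in the earlier exercises

resultado = {
    "lista": valores,
    "somatório": sum(valores),
    "tamanho": len(valores),
    "maior valor": max(valores),
    "menor valor": min(valores),
}

print(resultado)
```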
| 37 | 74 | 0.804054 | 21 | 148 | 5.666667 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 148 | 3 | 75 | 49.333333 | 0.929688 | 0.959459 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a3f80871b63b6cb7cf96620784162c650e341fb2 | 17,710 | py | Python | src/falconpy/firewall_management.py | shawndwells/falconpy | b411eff4a7594bfcf50dc9c301bf441de1cb3d6d | [
"Unlicense"
] | null | null | null | src/falconpy/firewall_management.py | shawndwells/falconpy | b411eff4a7594bfcf50dc9c301bf441de1cb3d6d | [
"Unlicense"
] | 10 | 2021-05-31T06:39:18.000Z | 2022-03-21T23:04:29.000Z | src/falconpy/firewall_management.py | shawndwells/falconpy | b411eff4a7594bfcf50dc9c301bf441de1cb3d6d | [
"Unlicense"
] | null | null | null | """
_______ __ _______ __ __ __
| _ .----.-----.--.--.--.--| | _ | |_.----|__| |--.-----.
|. 1___| _| _ | | | | _ | 1___| _| _| | <| -__|
|. |___|__| |_____|________|_____|____ |____|__| |__|__|__|_____|
|: 1 | |: 1 |
|::.. . | CROWDSTRIKE FALCON |::.. . | FalconPy
`-------' `-------'
OAuth2 API - Customer SDK
firewall_management - CrowdStrike Falcon Firewall Management API interface class
This is free and unencumbered software released into the public domain.
Anyone is free to copy, modify, publish, use, compile, sell, or
distribute this software, either in source code form or as a compiled
binary, for any purpose, commercial or non-commercial, and by any
means.
In jurisdictions that recognize copyright laws, the author or authors
of this software dedicate any and all copyright interest in the
software to the public domain. We make this dedication for the benefit
of the public at large and to the detriment of our heirs and
successors. We intend this dedication to be an overt act of
relinquishment in perpetuity of all present and future rights to this
software under copyright law.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
For more information, please refer to <https://unlicense.org>
"""
from ._util import parse_id_list, service_request
from ._service_class import ServiceClass
class Firewall_Management(ServiceClass):
""" The only requirement to instantiate an instance of this class
is a valid token provided by the Falcon API SDK OAuth2 class.
"""
def aggregate_events(self: object, body: dict) -> dict:
""" Aggregate events for customer. """
# [POST] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/aggregate_events
FULL_URL = self.base_url+'/fwmgr/aggregates/events/GET/v1'
HEADERS = self.headers
BODY = body
returned = service_request(caller=self,
method="POST",
endpoint=FULL_URL,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def aggregate_policy_rules(self: object, body: dict) -> dict:
""" Aggregate rules within a policy for customer. """
# [POST] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/aggregate_policy_rules
FULL_URL = self.base_url+'/fwmgr/aggregates/policy-rules/GET/v1'
HEADERS = self.headers
BODY = body
returned = service_request(caller=self,
method="POST",
endpoint=FULL_URL,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def aggregate_rule_groups(self: object, body: dict) -> dict:
""" Aggregate rule groups for customer. """
# [POST] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/aggregate_rule_groups
FULL_URL = self.base_url+'/fwmgr/aggregates/rule-groups/GET/v1'
HEADERS = self.headers
BODY = body
returned = service_request(caller=self,
method="POST",
endpoint=FULL_URL,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def aggregate_rules(self: object, body: dict) -> dict:
""" Aggregate rules for customer. """
# [POST] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/aggregate_rules
FULL_URL = self.base_url+'/fwmgr/aggregates/rules/GET/v1'
HEADERS = self.headers
BODY = body
returned = service_request(caller=self,
method="POST",
endpoint=FULL_URL,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_events(self: object, ids) -> dict:
""" Get events entities by ID and optionally version. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_events
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/events/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_firewall_fields(self: object, ids) -> dict:
""" Get the firewall field specifications by ID. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_firewall_fields
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/firewall-fields/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_platforms(self: object, ids) -> dict:
""" Get platforms by ID, e.g., windows or mac or droid. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_platforms
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/platforms/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_policy_containers(self: object, ids) -> dict:
""" Get policy container entities by policy ID. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_policy_containers
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/policies/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
# TODO: Update dynamic documentation to handle the cs_username parameter
def update_policy_container(self: object, body: dict, cs_username: str) -> dict:
""" Update an identified policy container. """
# [PUT] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/update_policy_container
FULL_URL = self.base_url+'/fwmgr/entities/policies/v1'
        HEADERS = dict(self.headers)  # copy so the shared headers dict is not mutated
        HEADERS['X-CS-USERNAME'] = cs_username
BODY = body
returned = service_request(caller=self,
method="PUT",
endpoint=FULL_URL,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_rule_groups(self: object, ids) -> dict:
""" Get rule group entities by ID. These groups do not contain their rule entites,
just the rule IDs in precedence order.
"""
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_rule_groups
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/rule-groups/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def create_rule_group(self: object, body: dict, cs_username: str, parameters: dict = {}) -> dict:
""" Create new rule group on a platform for a customer with a name and description, and return the ID. """
# [POST] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/create_rule_group
FULL_URL = self.base_url+'/fwmgr/entities/rule-groups/v1'
        HEADERS = dict(self.headers)  # copy so the shared headers dict is not mutated
        HEADERS['X-CS-USERNAME'] = cs_username
PARAMS = parameters
BODY = body
returned = service_request(caller=self,
method="POST",
endpoint=FULL_URL,
params=PARAMS,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def delete_rule_groups(self: object, ids, cs_username: str, parameters: dict = {}) -> dict:
""" Delete rule group entities by ID. """
        # [DELETE] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/delete_rule_groups
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/rule-groups/v1?ids={}'.format(ID_LIST)
        HEADERS = dict(self.headers)  # copy so the shared headers dict is not mutated
        HEADERS['X-CS-USERNAME'] = cs_username
PARAMS = parameters
returned = service_request(caller=self,
method="DELETE",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def update_rule_group(self: object, body: dict, cs_username: str, parameters: dict = {}) -> dict:
""" Update name, description, or enabled status of a rule group, or create, edit, delete, or reorder rules. """
# [PATCH] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/update_rule_group
FULL_URL = self.base_url+'/fwmgr/entities/rule-groups/v1'
        HEADERS = dict(self.headers)  # copy so the shared headers dict is not mutated
        HEADERS['X-CS-USERNAME'] = cs_username
PARAMS = parameters
BODY = body
returned = service_request(caller=self,
method="PATCH",
endpoint=FULL_URL,
params=PARAMS,
body=BODY,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def get_rules(self: object, ids) -> dict:
""" Get rule entities by ID (64-bit unsigned int as decimal string) or Family ID (32-character hexadecimal string). """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/get_rules
ID_LIST = str(parse_id_list(ids)).replace(",", "&ids=")
FULL_URL = self.base_url+'/fwmgr/entities/rules/v1?ids={}'.format(ID_LIST)
HEADERS = self.headers
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_events(self: object, parameters: dict = {}) -> dict:
""" Find all event IDs matching the query with filter. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_events
FULL_URL = self.base_url+'/fwmgr/queries/events/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_firewall_fields(self: object, parameters: dict = {}) -> dict:
""" Get the firewall field specification IDs for the provided platform. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_firewall_fields
FULL_URL = self.base_url+'/fwmgr/queries/firewall-fields/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_platforms(self: object, parameters: dict = {}) -> dict:
""" Get the list of platform names. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_platforms
FULL_URL = self.base_url+'/fwmgr/queries/platforms/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_policy_rules(self: object, parameters: dict = {}) -> dict:
""" Find all firewall rule IDs matching the query with filter, and return them in precedence order. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_policy_rules
FULL_URL = self.base_url+'/fwmgr/queries/policy-rules/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_rule_groups(self: object, parameters: dict = {}) -> dict:
""" Find all rule group IDs matching the query with filter. """
# [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_rule_groups
FULL_URL = self.base_url+'/fwmgr/queries/rule-groups/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
def query_rules(self: object, parameters: dict = {}) -> dict:
""" Find all rule IDs matching the query with filter. """
        # [GET] https://assets.falcon.crowdstrike.com/support/api/swagger.html#/firewall-management/query_rules
FULL_URL = self.base_url+'/fwmgr/queries/rules/v1'
HEADERS = self.headers
PARAMS = parameters
returned = service_request(caller=self,
method="GET",
endpoint=FULL_URL,
params=PARAMS,
headers=HEADERS,
verify=self.ssl_verify
)
return returned
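Every ID-based getter above rewrites a comma-delimited ID list into the repeated `ids=` query parameters the API expects, via `str(parse_id_list(ids)).replace(",", "&ids=")`. A self-contained sketch of that transformation — `format_ids` is a hypothetical stand-in, not part of the SDK, so no Falcon credentials or network access are needed:

```python
def format_ids(ids):
    # Accept either a comma-delimited string or a list of IDs and
    # join them in the repeated-parameter form used by the endpoints above.
    if isinstance(ids, str):
        ids = [part.strip() for part in ids.split(",")]
    return "&ids=".join(str(i) for i in ids)

url = "/fwmgr/entities/rules/v1?ids={}".format(format_ids(["123", "456"]))
print(url)  # /fwmgr/entities/rules/v1?ids=123&ids=456
```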
| 50.3125 | 127 | 0.528515 | 1,736 | 17,710 | 5.223502 | 0.140553 | 0.030878 | 0.037494 | 0.061756 | 0.740626 | 0.720776 | 0.704124 | 0.668946 | 0.620644 | 0.609727 | 0 | 0.002727 | 0.378769 | 17,710 | 351 | 128 | 50.45584 | 0.821487 | 0.297798 | 0 | 0.747967 | 0 | 0 | 0.06526 | 0.051848 | 0 | 0 | 0 | 0.002849 | 0 | 1 | 0.081301 | false | 0 | 0.00813 | 0 | 0.174797 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a3ff8aaf6a86487f445c1355bc47f2f1a388009f | 880 | py | Python | soluzione_it3/cliente.py | mary023010/bank_managment_system_pro | b678012f881e471fc55fbff7116ba232905dc890 | [
"MIT"
] | null | null | null | soluzione_it3/cliente.py | mary023010/bank_managment_system_pro | b678012f881e471fc55fbff7116ba232905dc890 | [
"MIT"
] | null | null | null | soluzione_it3/cliente.py | mary023010/bank_managment_system_pro | b678012f881e471fc55fbff7116ba232905dc890 | [
"MIT"
] | null | null | null |
class Cliente:
def __init__(self,nome_cliente,numero_telefono):
self.__id = id(self)
self.__nome_cliente = nome_cliente
self.__numero_telefono = numero_telefono
def __repr__(self):
return f"Id {self.id} Cliente {self.nome_cliente} telefono {self.numero_telefono}"
def __get_id(self):
return self.__id
id = property(__get_id,)
def __get_nome_cliente(self):
return self.__nome_cliente
def __set_nome_cliente(self,nome_cliente):
self.__nome_cliente = nome_cliente
nome_cliente = property(__get_nome_cliente,__set_nome_cliente)
def __get_numero_telefono(self):
return self.__numero_telefono
def __set_numero_telefono(self,numero_telefono):
self.__numero_telefono = numero_telefono
numero_telefono = property(__get_numero_telefono,__set_numero_telefono)
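The accessors above follow the classic `property()` wiring with name-mangled private getters and setters. A condensed, self-contained sketch of the same pattern — `Contatto` is a hypothetical one-attribute mini-version of `Cliente`:

```python
class Contatto:
    # Hypothetical mini-version of Cliente, reduced to one attribute
    # to show the property() wiring in isolation.
    def __init__(self, numero_telefono):
        self.__numero_telefono = numero_telefono

    def __get_numero_telefono(self):
        return self.__numero_telefono

    def __set_numero_telefono(self, numero_telefono):
        self.__numero_telefono = numero_telefono

    numero_telefono = property(__get_numero_telefono, __set_numero_telefono)

c = Contatto("333-1234")
c.numero_telefono = "333-9999"  # goes through the setter
print(c.numero_telefono)  # 333-9999
```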
| 27.5 | 90 | 0.718182 | 110 | 880 | 5.018182 | 0.136364 | 0.259058 | 0.163043 | 0.119565 | 0.315217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209091 | 880 | 31 | 91 | 28.387097 | 0.793103 | 0 | 0 | 0.2 | 0 | 0 | 0.082192 | 0.025114 | 0 | 0 | 0 | 0 | 0 | 1 | 0.35 | false | 0 | 0 | 0.2 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
431649699b29556df5ac23dbbd5226b5c5439ec0 | 1,324 | py | Python | app/standalone/PlaneSpecificationParser.py | egenerat/gae-django | f12379483cf3917ed3cb46ca5ff0b94daf89fc50 | [
"MIT"
] | 3 | 2016-07-08T23:49:32.000Z | 2018-04-15T22:55:01.000Z | app/standalone/PlaneSpecificationParser.py | egenerat/gae-django | f12379483cf3917ed3cb46ca5ff0b94daf89fc50 | [
"MIT"
] | 27 | 2017-02-05T15:57:04.000Z | 2018-04-15T22:57:26.000Z | app/standalone/PlaneSpecificationParser.py | egenerat/gae-django | f12379483cf3917ed3cb46ca5ff0b94daf89fc50 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from app.common.string_methods import get_value_from_regex, get_amount
class PlaneSpecificationParser(object):
def __init__(self, html_page):
self.html_page = html_page
def get_plane_model(self):
return get_value_from_regex(u"""<span class="titre">Fiche détaillée de l'avion : (.+?)</span>""", self.html_page)
def get_speed(self):
return get_amount(get_value_from_regex(u'<td class="fiche1">Vitesse de croisière</td>[\n\s]+<td class="fiche2">(.+) Km/h</td>', self.html_page))
def get_kerosene_capacity(self):
return get_amount(get_value_from_regex(u"""<td class="fiche1">Capacité maximale de carburant</td>[\n\s]+<td class="fiche2">(.+) litres</td>""", self.html_page))
# does not work with a 600st
# def get_engine_nb(self):
# return get_int_from_regex(u"""<td class="fiche1">Poussée</td>[\n\s]+<td class="fiche2">(\d+) x .+ kN</td>""", self.html_page)
def get_kerosene_consumption(self):
return get_amount(get_value_from_regex(u"""<td class="fiche1">Consommation \(des moteurs\)</td>[\n\s]+<td class="fiche2">(.+) litres/heure</td>""", self.html_page))
def get_price(self):
return get_amount(get_value_from_regex(u"""<td class="fiche1">Prix</td>[\n\s]+<td class="fiche2">(.+) \$</td>""", self.html_page))
| 47.285714 | 172 | 0.670695 | 203 | 1,324 | 4.128079 | 0.334975 | 0.083532 | 0.114558 | 0.121718 | 0.515513 | 0.472554 | 0.360382 | 0.238663 | 0.238663 | 0.238663 | 0 | 0.012238 | 0.135952 | 1,324 | 27 | 173 | 49.037037 | 0.72028 | 0.153323 | 0 | 0 | 0 | 0.285714 | 0.364695 | 0.159498 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0.071429 | 0.357143 | 0.928571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
4a3061ac15923495625ebe4c26c82cbaf1b34d0c | 138 | py | Python | cactus/util/default_root.py | grayfallstown/cactus-blockchain | 680d68d0bb7694bd4b99e4906b356e014bca7734 | [
"Apache-2.0"
] | 1 | 2021-07-20T12:13:06.000Z | 2021-07-20T12:13:06.000Z | cactus/util/default_root.py | grayfallstown/cactus-blockchain | 680d68d0bb7694bd4b99e4906b356e014bca7734 | [
"Apache-2.0"
] | 2 | 2022-02-27T18:07:25.000Z | 2022-03-25T19:22:44.000Z | cactus/util/default_root.py | grayfallstown/cactus-blockchain | 680d68d0bb7694bd4b99e4906b356e014bca7734 | [
"Apache-2.0"
] | null | null | null | import os
from pathlib import Path
DEFAULT_ROOT_PATH = Path(os.path.expanduser(os.getenv("CACTUS_ROOT", "~/.cactus/mainnet"))).resolve()
| 27.6 | 101 | 0.76087 | 20 | 138 | 5.1 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07971 | 138 | 4 | 102 | 34.5 | 0.80315 | 0 | 0 | 0 | 0 | 0 | 0.202899 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
4a36b79eb864048e63b744dba771b8385c6dae3c | 2,150 | py | Python | Products/CMFCore/tests/test_zcml.py | fulv/Products.CMFCore | 1d6ce101b10aaefba8aa917b6aa404e6c49e254d | [
"ZPL-2.1"
] | 3 | 2015-11-24T16:26:02.000Z | 2019-04-09T07:37:12.000Z | Products/CMFCore/tests/test_zcml.py | fulv/Products.CMFCore | 1d6ce101b10aaefba8aa917b6aa404e6c49e254d | [
"ZPL-2.1"
] | 86 | 2015-09-10T16:25:08.000Z | 2022-03-17T07:16:30.000Z | Products/CMFCore/tests/test_zcml.py | fulv/Products.CMFCore | 1d6ce101b10aaefba8aa917b6aa404e6c49e254d | [
"ZPL-2.1"
] | 16 | 2015-08-21T21:35:35.000Z | 2021-08-04T18:20:55.000Z | ##############################################################################
#
# Copyright (c) 2007 Zope Foundation and Contributors.
#
# This software is subject to the provisions of the Zope Public License,
# Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution.
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
##############################################################################
"""Unit tests for zcml module.
"""
import doctest
import unittest
def test_registerDirectory():
"""
Use the cmf:registerDirectory directive::
>>> import Products.CMFCore
>>> from Zope2.App import zcml
>>> configure_zcml = '''
... <configure xmlns:cmf="http://namespaces.zope.org/cmf">
... <cmf:registerDirectory
... name="fake_skin"
... directory="tests/fake_skins/fake_skin"
... recursive="True"
... ignore="foo bar"
... />
... </configure>'''
>>> zcml.load_config('meta.zcml', Products.CMFCore)
>>> zcml.load_string(configure_zcml)
Make sure the directory is registered correctly::
>>> from Products.CMFCore.DirectoryView import _dirreg
>>> reg_keys = (
... 'Products.CMFCore:tests/fake_skins/fake_skin',
... 'Products.CMFCore:tests/fake_skins/fake_skin/test_directory')
>>> reg_keys[0] in _dirreg._directories
True
>>> reg_keys[1] in _dirreg._directories
True
>>> info = _dirreg._directories[reg_keys[0]]
>>> info._reg_key == reg_keys[0]
True
>>> info.ignore
('.', '..', 'foo', 'bar')
Clean up and make sure the cleanup works::
>>> from zope.testing.cleanup import cleanUp
>>> cleanUp()
>>> reg_keys[0] in _dirreg._directories
False
>>> reg_keys[1] in _dirreg._directories
False
"""
def test_suite():
return unittest.TestSuite((
doctest.DocTestSuite(),
))
| 31.15942 | 78 | 0.58186 | 230 | 2,150 | 5.291304 | 0.469565 | 0.040263 | 0.026294 | 0.044371 | 0.167625 | 0.149548 | 0.060805 | 0 | 0 | 0 | 0 | 0.007813 | 0.226047 | 2,150 | 68 | 79 | 31.617647 | 0.723558 | 0.767442 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | true | 0 | 0.285714 | 0.142857 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
4a5725b8736096cc9f90e9de829a14478bee5791 | 450 | bzl | Python | workspace_definitions.bzl | matzipan/rules_foreign_cc | 07e1645dcc6c013b954d26b826fcd41f85585e55 | [
"Apache-2.0"
] | 2 | 2021-03-18T04:14:56.000Z | 2021-03-18T05:11:09.000Z | workspace_definitions.bzl | matzipan/rules_foreign_cc | 07e1645dcc6c013b954d26b826fcd41f85585e55 | [
"Apache-2.0"
] | null | null | null | workspace_definitions.bzl | matzipan/rules_foreign_cc | 07e1645dcc6c013b954d26b826fcd41f85585e55 | [
"Apache-2.0"
] | 1 | 2021-03-01T17:51:22.000Z | 2021-03-01T17:51:22.000Z | """Deprecated in favor of `//foreign_cc:repositories.bzl"""
load("//foreign_cc:repositories.bzl", _rules_foreign_cc_dependencies = "rules_foreign_cc_dependencies")
# buildifier: disable=print
print(
"`@rules_foreign_cc//:workspace_definitions.bzl` has been replaced by " +
"`@rules_foreign_cc//foreign_cc:repositories.bzl`. Please use the " +
"updated source location",
)
rules_foreign_cc_dependencies = _rules_foreign_cc_dependencies
| 34.615385 | 103 | 0.775556 | 56 | 450 | 5.839286 | 0.464286 | 0.247706 | 0.256881 | 0.318043 | 0.318043 | 0.318043 | 0.318043 | 0.318043 | 0 | 0 | 0 | 0 | 0.106667 | 450 | 12 | 104 | 37.5 | 0.813433 | 0.177778 | 0 | 0 | 0 | 0 | 0.590659 | 0.423077 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
4a742ecbd39dff8e79ce53970d2c0b23f9a79247 | 6,596 | py | Python | crosshair/abcstring.py | samuelchassot/CrossHair | 4eac7a23e470567cc23e6d0916ce6dd6820eacd8 | [
"MIT"
] | null | null | null | crosshair/abcstring.py | samuelchassot/CrossHair | 4eac7a23e470567cc23e6d0916ce6dd6820eacd8 | [
"MIT"
] | null | null | null | crosshair/abcstring.py | samuelchassot/CrossHair | 4eac7a23e470567cc23e6d0916ce6dd6820eacd8 | [
"MIT"
] | null | null | null | import collections.abc
from numbers import Integral
import sys
from collections import UserString
from crosshair.tracers import NoTracing
# Similar to UserString, but allows you to lazily supply the contents
# when accessed.
# Sadly, this illusion doesn't fully work: various Python operations
# require a actual strings or subclasses.
# (see related issue: https://bugs.python.org/issue16397)
# TODO: Our symbolic strings likely already override most of these methods.
# Consider removing this class.
_MISSING = object()
def _real_string(thing: object):
with NoTracing():
return thing.data if isinstance(thing, (UserString, AbcString)) else thing
def _real_int(thing: object):
return thing.__int__() if isinstance(thing, Integral) else thing
class AbcString(collections.abc.Sequence, collections.abc.Hashable):
"""
Implement just ``__str__``.
Useful for making lazy strings.
"""
data = property(lambda s: s.__str__())
def __str__(self):
        raise NotImplementedError  # subclasses must supply the string contents
def __repr__(self):
return repr(self.data)
def __hash__(self):
return hash(self.data)
def __eq__(self, string):
return self.data == _real_string(string)
def __lt__(self, string):
return self.data < _real_string(string)
def __le__(self, string):
return self.data <= _real_string(string)
def __gt__(self, string):
return self.data > _real_string(string)
def __ge__(self, string):
return self.data >= _real_string(string)
def __contains__(self, char):
return _real_string(char) in self.data
def __len__(self):
return len(self.data)
def __getitem__(self, index):
return self.data[index]
def __add__(self, other):
other = _real_string(other)
if isinstance(other, str):
return self.data + other
return self.data + str(other)
def __radd__(self, other):
other = _real_string(other)
if isinstance(other, str):
return other + self.data
return str(other) + self.data
def __mul__(self, n):
return self.data * n
def __rmul__(self, n):
return self.data * n
def __mod__(self, args):
return self.data % args
def __rmod__(self, template):
return str(template) % self.data
# the following methods are defined in alphabetical order:
def capitalize(self):
return self.data.capitalize()
def casefold(self):
return self.data.casefold()
def center(self, width, *args):
return self.data.center(width, *args)
def count(self, sub, start=0, end=sys.maxsize):
return self.data.count(_real_string(sub), start, end)
def encode(self, encoding=_MISSING, errors=_MISSING):
if encoding is not _MISSING:
if errors is not _MISSING:
return self.data.encode(encoding, errors)
return self.data.encode(encoding)
return self.data.encode()
def endswith(self, suffix, start=0, end=sys.maxsize):
return self.data.endswith(suffix, start, end)
def expandtabs(self, tabsize=8):
return self.data.expandtabs(_real_int(tabsize))
def find(self, sub, start=0, end=sys.maxsize):
return self.data.find(_real_string(sub), start, end)
def format(self, *args, **kwds):
return self.data.format(*args, **kwds)
def format_map(self, mapping):
return self.data.format_map(mapping)
def index(self, sub, start=0, end=sys.maxsize):
return self.data.index(_real_string(sub), start, end)
def isalpha(self):
return self.data.isalpha()
def isalnum(self):
return self.data.isalnum()
def isascii(self):
return self.data.isascii()
def isdecimal(self):
return self.data.isdecimal()
def isdigit(self):
return self.data.isdigit()
def isidentifier(self):
return self.data.isidentifier()
def islower(self):
return self.data.islower()
def isnumeric(self):
return self.data.isnumeric()
def isprintable(self):
return self.data.isprintable()
def isspace(self):
return self.data.isspace()
def istitle(self):
return self.data.istitle()
def isupper(self):
return self.data.isupper()
def join(self, seq):
return self.data.join(seq)
def ljust(self, width, *args):
return self.data.ljust(width, *args)
def lower(self):
return self.data.lower()
def lstrip(self, chars=None):
return self.data.lstrip(_real_string(chars))
maketrans = str.maketrans
def partition(self, sep):
return self.data.partition(_real_string(sep))
def replace(self, old, new, maxsplit=-1):
return self.data.replace(_real_string(old), _real_string(new), maxsplit)
def rfind(self, sub, start=0, end=sys.maxsize):
return self.data.rfind(_real_string(sub), start, end)
def rindex(self, sub, start=0, end=sys.maxsize):
return self.data.rindex(_real_string(sub), start, end)
def rjust(self, width, *args):
return self.data.rjust(width, *args)
def rpartition(self, sep):
return self.data.rpartition(sep)
def rsplit(self, sep=None, maxsplit=-1):
return self.data.rsplit(sep, maxsplit)
def rstrip(self, chars=None):
return self.data.rstrip(_real_string(chars))
def split(self, sep=None, maxsplit=-1):
return self.data.split(sep, maxsplit)
def splitlines(self, keepends=False):
return self.data.splitlines(keepends)
def startswith(self, prefix, start=0, end=sys.maxsize):
return self.data.startswith(prefix, start, end)
def strip(self, chars=None):
return self.data.strip(_real_string(chars))
def swapcase(self):
return self.data.swapcase()
def title(self):
return self.data.title()
def translate(self, *args):
return self.data.translate(*args)
def upper(self):
return self.data.upper()
def zfill(self, width):
return self.data.zfill(width)
if sys.version_info >= (3, 9):
def removeprefix(self, prefix: str) -> "AbcString":
if self.startswith(prefix):
return self[len(prefix) :]
return self
def removesuffix(self, suffix: str) -> "AbcString":
if self.endswith(suffix):
suffixlen = len(suffix)
if suffixlen == 0:
return self
return self[:-suffixlen]
return self
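The intended use is subclassing with only `__str__` overridden, so the contents are computed on access rather than up front. A self-contained stand-in — `LazyString` is a hypothetical trimmed-down version of `AbcString` (the real class also depends on CrossHair's tracing machinery, omitted here):

```python
import collections.abc

class LazyString(collections.abc.Sequence, collections.abc.Hashable):
    # Hypothetical mini-version of AbcString: everything is derived
    # from the string the subclass produces in __str__.
    data = property(lambda s: s.__str__())

    def __str__(self):
        raise NotImplementedError

    def __len__(self):
        return len(self.data)

    def __getitem__(self, index):
        return self.data[index]

    def __hash__(self):
        return hash(self.data)

    def __eq__(self, other):
        return self.data == other

class Greeting(LazyString):
    def __str__(self):
        return "hello"  # computed lazily, only when the string is used

g = Greeting()
print(len(g), g[0], g == "hello")  # 5 h True
```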
| 26.48996 | 82 | 0.63675 | 834 | 6,596 | 4.877698 | 0.229017 | 0.12586 | 0.196165 | 0.079646 | 0.283186 | 0.242625 | 0.173304 | 0.161996 | 0.129056 | 0.076205 | 0 | 0.00385 | 0.251819 | 6,596 | 248 | 83 | 26.596774 | 0.820466 | 0.070952 | 0 | 0.056962 | 0 | 0 | 0.00295 | 0 | 0 | 0 | 0 | 0.004032 | 0 | 1 | 0.411392 | false | 0 | 0.031646 | 0.367089 | 0.911392 | 0.012658 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
4a7775a5a4217eb2f87e2eed10cc2d5b58f13cc6 | 1,119 | py | Python | testprojects/pants-plugins/src/python/test_pants_plugin/register.py | revl/pants | 8ad83e4ca80c095d44efceafd8b41e575da39c65 | [
"Apache-2.0"
] | 1 | 2020-06-13T22:01:39.000Z | 2020-06-13T22:01:39.000Z | testprojects/pants-plugins/src/python/test_pants_plugin/register.py | revl/pants | 8ad83e4ca80c095d44efceafd8b41e575da39c65 | [
"Apache-2.0"
] | null | null | null | testprojects/pants-plugins/src/python/test_pants_plugin/register.py | revl/pants | 8ad83e4ca80c095d44efceafd8b41e575da39c65 | [
"Apache-2.0"
] | 3 | 2020-06-30T08:28:13.000Z | 2021-07-28T09:35:57.000Z | # Copyright 2018 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
import os
from test_pants_plugin.pants_testutil_tests import PantsTestutilTests
from test_pants_plugin.subsystems.pants_testutil_subsystem import PantsTestutilSubsystem
from test_pants_plugin.tasks.deprecation_warning_task import DeprecationWarningTask
from test_pants_plugin.tasks.lifecycle_stub_task import LifecycleStubTask
from pants.build_graph.build_file_aliases import BuildFileAliases
from pants.goal.task_registrar import TaskRegistrar as task
def build_file_aliases():
return BuildFileAliases(
context_aware_object_factories={"pants_testutil_tests": PantsTestutilTests,}
)
def register_goals():
task(name="deprecation-warning-task", action=DeprecationWarningTask).install()
task(name="lifecycle-stub-task", action=LifecycleStubTask).install("lifecycle-stub-goal")
def global_subsystems():
return (PantsTestutilSubsystem,)
if os.environ.get("_RAISE_KEYBOARDINTERRUPT_ON_IMPORT", False):
raise KeyboardInterrupt("ctrl-c during import!")
| 34.96875 | 93 | 0.823056 | 133 | 1,119 | 6.676692 | 0.488722 | 0.036036 | 0.058559 | 0.085586 | 0.054054 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005964 | 0.100983 | 1,119 | 31 | 94 | 36.096774 | 0.87674 | 0.112601 | 0 | 0 | 0 | 0 | 0.138384 | 0.058586 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.5 | 0.111111 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 4 |
4a8223c7c5492317e1554b21f6e7cbbaf0995dc8 | 621 | py | Python | 0x06-python-classes/2-main.py | ricardo1470/holbertonschool-higher_level_programming | aab73c8efee665b0215958ee7b338871f13634bc | [
"CNRI-Python"
] | 1 | 2022-02-07T12:13:18.000Z | 2022-02-07T12:13:18.000Z | 0x06-python-classes/2-main.py | ricardo1470/holbertonschool-higher_level_programming | aab73c8efee665b0215958ee7b338871f13634bc | [
"CNRI-Python"
] | null | null | null | 0x06-python-classes/2-main.py | ricardo1470/holbertonschool-higher_level_programming | aab73c8efee665b0215958ee7b338871f13634bc | [
"CNRI-Python"
] | 1 | 2021-12-06T18:15:54.000Z | 2021-12-06T18:15:54.000Z | #!/usr/bin/python3
Square = __import__('2-square').Square
my_square_1 = Square(3)
print(type(my_square_1))
print(my_square_1.__dict__)
my_square_2 = Square()
print(type(my_square_2))
print(my_square_2.__dict__)
try:
print(my_square_1.size)
except Exception as e:
print(e)
try:
print(my_square_1.__size)
except Exception as e:
print(e)
try:
my_square_3 = Square("3")
print(type(my_square_3))
print(my_square_3.__dict__)
except Exception as e:
print(e)
try:
my_square_4 = Square(-89)
print(type(my_square_4))
print(my_square_4.__dict__)
except Exception as e:
print(e)
| 17.742857 | 38 | 0.708535 | 105 | 621 | 3.714286 | 0.190476 | 0.287179 | 0.2 | 0.174359 | 0.553846 | 0.553846 | 0.430769 | 0.348718 | 0.348718 | 0.238462 | 0 | 0.038685 | 0.167472 | 621 | 34 | 39 | 18.264706 | 0.715667 | 0.027375 | 0 | 0.444444 | 0 | 0 | 0.014925 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.037037 | 0.518519 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
4ac45f9be3ceab834a993a7edbd4b548147bb869 | 415 | py | Python | python/branchedflowsim/__init__.py | ngc92/branchedflowsim | d38c0e7f892d07d0abd9b63d30570c41b3b83b34 | [
"MIT"
] | null | null | null | python/branchedflowsim/__init__.py | ngc92/branchedflowsim | d38c0e7f892d07d0abd9b63d30570c41b3b83b34 | [
"MIT"
] | null | null | null | python/branchedflowsim/__init__.py | ngc92/branchedflowsim | d38c0e7f892d07d0abd9b63d30570c41b3b83b34 | [
"MIT"
] | null | null | null | from __future__ import absolute_import
# import sub-modules
from . import io
from . import observers
from . import incoming
from . import results
from . import correlation
from . import medium
from . import utils
from . import potential
from .tracer import trace, trace_multiple, reduce_trace_multiple
from .medium import MediumSpec, ScalarPotentialSpec, generate_multiple
from .potential import Potential, Field
| 25.9375 | 70 | 0.816867 | 53 | 415 | 6.226415 | 0.415094 | 0.242424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139759 | 415 | 15 | 71 | 27.666667 | 0.92437 | 0.043373 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
436b0519cb5907ed8792f47b50c4d1866633bc89 | 188 | py | Python | deui/html/view/var_element.py | urushiyama/DeUI | 14530d2dae7d96a3dee30759f85e02239fb433c5 | [
"MIT"
] | 1 | 2021-10-17T01:54:18.000Z | 2021-10-17T01:54:18.000Z | deui/html/view/var_element.py | urushiyama/DeUI | 14530d2dae7d96a3dee30759f85e02239fb433c5 | [
"MIT"
] | null | null | null | deui/html/view/var_element.py | urushiyama/DeUI | 14530d2dae7d96a3dee30759f85e02239fb433c5 | [
"MIT"
] | null | null | null | from .element import Element
class Variable(Element):
"""
Represents name of a variable in mathematic expression or program.
"""
def __str__(self):
return "var"
| 17.090909 | 70 | 0.654255 | 22 | 188 | 5.409091 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.260638 | 188 | 10 | 71 | 18.8 | 0.856115 | 0.351064 | 0 | 0 | 0 | 0 | 0.028302 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
437f79f50cee62e695fe5e41b24f264e35c17727 | 52 | py | Python | 2009/scientific-computing/project1/src/util/__init__.py | rla/old-code | 06aa69c3adef8434992410687d466dc42779e57b | [
"Ruby",
"MIT"
] | 2 | 2015-11-08T10:01:47.000Z | 2020-03-10T00:00:58.000Z | 2009/scientific-computing/project1/src/util/__init__.py | rla/old-code | 06aa69c3adef8434992410687d466dc42779e57b | [
"Ruby",
"MIT"
] | null | null | null | 2009/scientific-computing/project1/src/util/__init__.py | rla/old-code | 06aa69c3adef8434992410687d466dc42779e57b | [
"Ruby",
"MIT"
] | null | null | null | """A package for common classes in the project 1.""" | 52 | 52 | 0.711538 | 9 | 52 | 4.111111 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022727 | 0.153846 | 52 | 1 | 52 | 52 | 0.818182 | 0.884615 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
43a3c6c2334937f59e3bbe7161a705206eaf02a9 | 21,580 | py | Python | appengine/src/greenday_api/tests/test_video_api/test_filter_tags.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 6 | 2018-07-31T16:48:07.000Z | 2020-02-01T03:17:51.000Z | appengine/src/greenday_api/tests/test_video_api/test_filter_tags.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 41 | 2018-08-07T16:43:07.000Z | 2020-06-05T18:54:50.000Z | appengine/src/greenday_api/tests/test_video_api/test_filter_tags.py | meedan/montage | 4da0116931edc9af91f226876330645837dc9bcc | [
"Apache-2.0"
] | 1 | 2018-08-07T16:40:18.000Z | 2018-08-07T16:40:18.000Z | """
Video tag filter tests for :mod:`greenday_api.video.video_api <greenday_api.video.video_api>`
"""
# LIBRARIES
import datetime
from milkman.dairy import milkman
import mock
from google.appengine.api.search import Index
# FRAMEWORK
from django.utils import timezone
# GREENDAY
from greenday_core.models import (
Project,
VideoCollection,
)
from greenday_core.tests.base import TestCaseTagHelpers
from ..base import ApiTestCase
from ...video.video_api import VideoAPI
from ...video.containers import VideoFilterContainer
# pylint: disable=R0902
class VideoTagFilterAPITests(TestCaseTagHelpers, ApiTestCase):
"""
Tests for :func:`greenday_api.video.video_api.VideoAPI.video_tag_filter <greenday_api.video.video_api.VideoAPI.video_tag_filter>`
"""
api_type = VideoAPI
def setUp(self):
"""
Bootstrap test data
"""
super(VideoTagFilterAPITests, self).setUp()
self.project = milkman.deliver(Project)
self.project.set_owner(self.admin)
self.video_1 = self.create_video(
project=self.project,
channel_id=u"UCDASmtEzVZS5PZxjiRjcHKA",
publish_date=datetime.datetime(2014, 1, 1, tzinfo=timezone.utc),
name=u"Bazooka")
self.video_2 = self.create_video(
channel_id=u"UCmA0uNMDy4wx9NHu1OfAw9g",
publish_date=datetime.datetime(2014, 2, 1, tzinfo=timezone.utc),
recorded_date=datetime.datetime(2014, 1, 15, tzinfo=timezone.utc),
project=self.project)
self.video_3 = self.create_video(
channel_id=u"UCHX5-wIWTaClDu6uTKXKItg",
publish_date=datetime.datetime(2014, 3, 1, tzinfo=timezone.utc),
project=self.project)
# Video 1: Alpha
self.tag_1, self.project_tag_1, _, self.video_1_tag_instance = \
self.create_video_instance_tag(
name='Alpha',
project=self.project,
video=self.video_1,
user=self.admin,
start_seconds=5,
end_seconds=42)
# Video 2: Alpha
self.create_video_instance_tag(
video=self.video_2,
user=self.admin,
project_tag=self.project_tag_1)
# Video 3: Bravo
self.tag_2, self.project_tag_2, _, _ = self.create_video_instance_tag(
name='Bravo',
project=self.project,
video=self.video_3,
user=self.admin)
# Video 2: Charlie
self.tag_3, self.project_tag_3, _, _ = self.create_video_instance_tag(
name='Charlie',
project=self.project,
video=self.video_2,
user=self.admin)
# Video 3: Charlie
_, _, _, _ = self.create_video_instance_tag(
project=self.project,
project_tag=self.project_tag_3,
video=self.video_3,
user=self.admin)
# create some collections
self.collection_1 = milkman.deliver(
VideoCollection, project=self.project, name='Collection 1')
self.collection_2 = milkman.deliver(
VideoCollection, project=self.project, name='Collection 2')
self.collection_1.add_video(self.video_2)
self.collection_2.add_video(self.video_1)
# add a video outside of this project to make sure we filter on project
self.other_project = milkman.deliver(Project)
self.other_project.set_owner(self.admin)
self.other_video = self.create_video(youtube_video=self.video_1.youtube_video)
def test_filter_response_data(self):
"""
Check API response data
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(3, len(response.items))
first_item = response.items[0]
self.assertEqual(self.tag_1.pk, first_item.global_tag_id)
self.assertEqual(self.tag_1.name, first_item.name)
self.assertEqual(self.tag_1.description, first_item.description)
self.assertEqual(self.tag_1.image_url, first_item.image_url)
self.assertEqual(2, len(first_item.instances))
first_instance = first_item.instances[0]
self.assertEqual(
self.video_1_tag_instance.start_seconds,
first_instance.start_seconds)
self.assertEqual(
self.video_1_tag_instance.end_seconds,
first_instance.end_seconds)
self.assertEqual(
self.video_1.pk,
first_instance.video_id)
self.assertEqual(
self.video_1.youtube_video.name,
first_instance.video_name)
self.assertEqual(
self.video_1.youtube_video.youtube_id,
first_instance.youtube_id)
self.assertEqual(
self.video_1_tag_instance.user_id,
first_instance.user_id)
def test_filter_keyword(self):
"""
Filter by keyword
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id, q=u"Bazooka")
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(1, len(response.items))
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_1_tag_instance.start_seconds,
response.items[0].instances[0].start_seconds)
def test_filter_tags_single(self):
"""
Filter by tag
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids="%s" % self.project_tag_2.pk
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(2, len(response.items))
global_tag_ids = [t.global_tag_id for t in response.items]
self.assertIn(self.tag_2.pk, global_tag_ids)
self.assertIn(self.tag_3.pk, global_tag_ids)
self.assertEqual(
self.video_3.pk, response.items[0].instances[0].video_id)
self.assertEqual(1, len(response.items[0].instances))
def test_filter_tags_single_negation(self):
"""
Filter by negated tag
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids="-%s" % self.project_tag_2.pk
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(2, len(response.items))
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
def test_filter_tags_two(self):
"""
Filter by two tags
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids=",".join(
map(str, (self.project_tag_2.pk, self.project_tag_3.pk))
)
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
global_tag_ids = [t.global_tag_id for t in response.items]
self.assertIn(self.tag_3.pk, global_tag_ids)
self.assertIn(self.tag_2.pk, global_tag_ids)
tag_2_item = next(
t for t in response.items if t.global_tag_id == self.tag_2.pk)
self.assertEqual(self.tag_2.pk, tag_2_item.global_tag_id)
self.assertEqual(1, len(tag_2_item.instances))
self.assertEqual(
self.video_3.pk, tag_2_item.instances[0].video_id)
tag_3_item = next(
t for t in response.items if t.global_tag_id == self.tag_3.pk)
self.assertEqual(self.tag_3.pk, tag_3_item.global_tag_id)
self.assertEqual(2, len(tag_3_item.instances))
self.assertEqual(
self.video_2.pk, tag_3_item.instances[0].video_id)
self.assertEqual(
self.video_3.pk, tag_3_item.instances[1].video_id)
def test_filter_tags_two_negation(self):
"""
Filter by negated and non-negated tag
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids=",".join(
(str(self.project_tag_3.pk), '-%s' % self.project_tag_2.pk)
)
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
global_tag_ids = [t.global_tag_id for t in response.items]
self.assertIn(self.tag_3.pk, global_tag_ids)
self.assertIn(self.tag_1.pk, global_tag_ids)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_2.pk, response.items[0].instances[0].video_id)
def test_filter_tags_multiple(self):
"""
Filter by list of tags
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids=",".join(
map(str, (
self.project_tag_1.pk,
self.project_tag_2.pk,
self.project_tag_3.pk))
)
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(self.tag_2.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_3.pk, response.items[1].instances[0].video_id)
self.assertEqual(self.tag_3.pk, response.items[2].global_tag_id)
self.assertEqual(2, len(response.items[2].instances))
self.assertEqual(
self.video_2.pk, response.items[2].instances[0].video_id)
self.assertEqual(
self.video_3.pk, response.items[2].instances[1].video_id)
def test_filter_tags_multiple_with_negation(self):
"""
Filter by combination of negated and non-negated tags
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
tag_ids=",".join(
(
str(self.project_tag_1.pk),
str(self.project_tag_3.pk),
'-%s' % self.project_tag_2.pk
)
)
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
def test_filter_date_between(self):
"""
Filter published date between two dates
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="publish_date__between__2014-1-20__2014-2-20"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_2.pk, response.items[0].instances[0].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
def test_filter_date_notbetween(self):
"""
Filter published date not between two dates
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="publish_date__notbetween__2014-1-20__2014-2-20"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(self.tag_2.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_3.pk, response.items[1].instances[0].video_id)
def test_filter_date_after(self):
"""
Filter published date after a date
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="publish_date__after__2014-1-2"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_2.pk, response.items[0].instances[0].video_id)
self.assertEqual(self.tag_2.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_3.pk, response.items[1].instances[0].video_id)
self.assertEqual(self.tag_3.pk, response.items[2].global_tag_id)
self.assertEqual(2, len(response.items[2].instances))
self.assertEqual(
self.video_2.pk, response.items[2].instances[0].video_id)
self.assertEqual(
self.video_3.pk, response.items[2].instances[1].video_id)
def test_filter_date_before(self):
"""
Filter published date before a date
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="publish_date__before__2014-2-2"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
def test_filter_date_exact(self):
"""
Filter published date exactly matches a date
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="publish_date__exact__2014-1-1"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
def test_filter_recorded_date_exact(self):
"""
Filter recorded date after a date
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
date="recorded_date__exact__2014-1-15"
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_2.pk, response.items[0].instances[0].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
def test_filter_channel_id(self):
"""
Filter by channel ID
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
channel_ids=",".join(
(self.video_1.youtube_video.channel_id, self.video_2.youtube_video.channel_id,)
)
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
self.assertEqual(1, len(response.items[1].instances))
def test_filter_collection_id(self):
"""
Filter by collection ID
"""
self._sign_in(self.admin)
# single collection id
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
collection_id="%s" % self.collection_1.pk
)
with self.assertNumQueries(7):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(1, len(response.items[0].instances))
self.assertEqual(
self.video_2.pk, response.items[0].instances[0].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
# multiple collection id
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
collection_id=",".join(
(str(self.collection_1.pk), str(self.collection_2.pk))
)
)
with self.assertNumQueries(6):
response = self.api.video_tag_filter(request)
self.assertEqual(self.tag_1.pk, response.items[0].global_tag_id)
self.assertEqual(2, len(response.items[0].instances))
self.assertEqual(
self.video_1.pk, response.items[0].instances[0].video_id)
self.assertEqual(
self.video_2.pk, response.items[0].instances[1].video_id)
self.assertEqual(self.tag_3.pk, response.items[1].global_tag_id)
self.assertEqual(1, len(response.items[1].instances))
self.assertEqual(
self.video_2.pk, response.items[1].instances[0].video_id)
@mock.patch.object(Index, "search")
def test_filter_location(self, mock_search):
"""
Filter by location
"""
self._sign_in(self.admin)
request = VideoFilterContainer.combined_message_class(
project_id=self.project.id,
location="3.14__4.28__5"
)
response = self.api.video_tag_filter(request)
# not expecting results because the local search stub in the SDK
# does not support geo-filtering
self.assertEqual(0, len(response.items))
self.assertEqual(
'((project_id:"{project_id}") AND '
'(distance(location, geopoint(3.140000, 4.280000)) < 8046 AND '
'has_location:"1"))'.format(
project_id=self.project.pk),
mock_search.mock_calls[0][1][0].query_string
)
| 35.669421 | 137 | 0.628916 | 2,709 | 21,580 | 4.778147 | 0.066076 | 0.133266 | 0.114493 | 0.08529 | 0.804002 | 0.752781 | 0.710445 | 0.677611 | 0.660074 | 0.643541 | 0 | 0.028287 | 0.261168 | 21,580 | 604 | 138 | 35.728477 | 0.783555 | 0.051715 | 0 | 0.612981 | 0 | 0 | 0.024332 | 0.015452 | 0 | 0 | 0 | 0 | 0.331731 | 1 | 0.043269 | false | 0 | 0.024038 | 0 | 0.072115 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
43abbc24e8585168a1786a8f256c7403423e72bf | 43 | py | Python | kubeshell/__init__.py | leizhag/kube-shell | af8d77c59d85cddd39963a3f0112cc48a8be82a6 | [
"Apache-2.0"
] | 9 | 2020-04-29T09:02:31.000Z | 2020-08-16T17:50:46.000Z | kubeshell/__init__.py | leizhag/kbsh | af8d77c59d85cddd39963a3f0112cc48a8be82a6 | [
"Apache-2.0"
] | null | null | null | kubeshell/__init__.py | leizhag/kbsh | af8d77c59d85cddd39963a3f0112cc48a8be82a6 | [
"Apache-2.0"
] | null | null | null | __version__ = '0.1.1'
from . import logger
| 14.333333 | 21 | 0.697674 | 7 | 43 | 3.714286 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0.162791 | 43 | 2 | 22 | 21.5 | 0.638889 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
43c68e98cb78e4c63ccf5c45cd5e66d4eae10303 | 306 | py | Python | galaxpay/__init__.py | gabicavalcante/galaxpay-api-python | b683295daa39ef362f7418ceb63a69067ff8be8c | [
"MIT"
] | null | null | null | galaxpay/__init__.py | gabicavalcante/galaxpay-api-python | b683295daa39ef362f7418ceb63a69067ff8be8c | [
"MIT"
] | 3 | 2019-10-30T00:17:41.000Z | 2019-10-30T19:11:05.000Z | galaxpay/__init__.py | gabicavalcante/galaxpay-api-python | b683295daa39ef362f7418ceb63a69067ff8be8c | [
"MIT"
] | 1 | 2020-09-18T13:12:48.000Z | 2020-09-18T13:12:48.000Z | __author__ = "Gabriela Cavalcante"
__copyright__ = "Copyright 2019, Gabriela Cavalcante da Silva"
__credits__ = ["Gabriela Cavalcante da Silva"]
__license__ = "MIT"
__version__ = "0.1.0"
__maintainer__ = "Gabriela Cavalcante da Silva"
__email__ = "gabicavalcantesilva@gmail.com"
__status__ = "Development"
| 34 | 62 | 0.781046 | 32 | 306 | 6.46875 | 0.625 | 0.347826 | 0.289855 | 0.362319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025926 | 0.117647 | 306 | 8 | 63 | 38.25 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0.545752 | 0.094771 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
43e233a1f69051e3cbcd996369845a4098846cbe | 531 | py | Python | stin/parameters.py | SoftwareDevEngResearch/Gas-liquid-flows | 10db5551da534341c5f1eb6f781759f68381f0ea | [
"MIT"
] | 1 | 2020-06-28T19:45:12.000Z | 2020-06-28T19:45:12.000Z | stin/parameters.py | SoftwareDevEngResearch/Gas-liquid-flows | 10db5551da534341c5f1eb6f781759f68381f0ea | [
"MIT"
] | null | null | null | stin/parameters.py | SoftwareDevEngResearch/Gas-liquid-flows | 10db5551da534341c5f1eb6f781759f68381f0ea | [
"MIT"
] | 2 | 2019-04-18T22:27:20.000Z | 2019-04-26T15:37:04.000Z | γ = 1.3 # coefficient of adiabatic process - 5th equation of the system
C_0 = 1.1 # distribution coefficient - 6th equation of the system
v_s = 0.2 # slip velocity - 6th equation of the system
a = 500 # acoustic velocity - 5th equation of the system
ρ_L = 1000 # density of the liquid phase - 3rd equation of the system
g = 9.81 # acceleration of gravity - 3rd equation of the system
f = 0.03 # friction factor - 3rd equation of the system
D = 0.002 # m^2 - for 1" (2.5cm) annulus (as if there is one) - 3rd equation of the system
| 59 | 90 | 0.719397 | 97 | 531 | 3.907216 | 0.515464 | 0.118734 | 0.274406 | 0.401055 | 0.46438 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 0.20904 | 531 | 8 | 91 | 66.375 | 0.816667 | 0.826742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
78e2550cab067becf8849ff81727da8b6c945e0b | 106 | py | Python | matlab2cpp/rules/_cx_vec.py | emc2norway/m2cpp | 81943057c184c539b409282cbbd47bbf933db04f | [
"BSD-3-Clause"
] | 28 | 2017-04-25T10:06:38.000Z | 2022-02-09T07:25:34.000Z | matlab2cpp/rules/_cx_vec.py | emc2norway/m2cpp | 81943057c184c539b409282cbbd47bbf933db04f | [
"BSD-3-Clause"
] | null | null | null | matlab2cpp/rules/_cx_vec.py | emc2norway/m2cpp | 81943057c184c539b409282cbbd47bbf933db04f | [
"BSD-3-Clause"
] | 5 | 2017-04-25T17:54:53.000Z | 2022-03-21T20:15:15.000Z | from assign import Assign
from variables import *
from vec import Get, Set
Declare = "cx_vec %(name)s ;"
| 17.666667 | 29 | 0.735849 | 17 | 106 | 4.529412 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179245 | 106 | 5 | 30 | 21.2 | 0.885057 | 0 | 0 | 0 | 0 | 0 | 0.160377 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
78fc9d48b205d91e3b31e8f14813f8d033e02046 | 85 | py | Python | nlpaug/model/spectrogram/spectrogram.py | techthiyanes/nlpaug | bb2fc63349bf949f6f6047ff447a0efb16983c0a | [
"MIT"
] | 3,121 | 2019-04-21T07:02:47.000Z | 2022-03-31T22:17:36.000Z | nlpaug/model/spectrogram/spectrogram.py | techthiyanes/nlpaug | bb2fc63349bf949f6f6047ff447a0efb16983c0a | [
"MIT"
] | 186 | 2019-05-31T18:18:13.000Z | 2022-03-28T10:11:05.000Z | nlpaug/model/spectrogram/spectrogram.py | techthiyanes/nlpaug | bb2fc63349bf949f6f6047ff447a0efb16983c0a | [
"MIT"
] | 371 | 2019-03-17T17:59:56.000Z | 2022-03-31T01:45:15.000Z | class Spectrogram:
def manipulate(self, data):
raise NotImplementedError
| 21.25 | 33 | 0.717647 | 8 | 85 | 7.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223529 | 85 | 3 | 34 | 28.333333 | 0.924242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
78fcfa602b527d61c62689d048b09b19bd9a3d77 | 110 | py | Python | {{ cookiecutter.repo_name }}/src/models/train/__init__.py | jnirschl/cookiecutter-data-science | b27bf00280c0aa4b437290eb4f3d45579214ed7f | [
"MIT"
] | null | null | null | {{ cookiecutter.repo_name }}/src/models/train/__init__.py | jnirschl/cookiecutter-data-science | b27bf00280c0aa4b437290eb4f3d45579214ed7f | [
"MIT"
] | null | null | null | {{ cookiecutter.repo_name }}/src/models/train/__init__.py | jnirschl/cookiecutter-data-science | b27bf00280c0aa4b437290eb4f3d45579214ed7f | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
__all__ = ["callbacks", "fit"]
from .callbacks import set
from .fit import fit, main
| 15.714286 | 30 | 0.7 | 16 | 110 | 4.5625 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010753 | 0.154545 | 110 | 6 | 31 | 18.333333 | 0.774194 | 0.190909 | 0 | 0 | 0 | 0 | 0.136364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
601ceec1571c0e9ffd8eb902461c7ba1dfd83dea | 236 | py | Python | source/backend/event_scheduler/infrastructure/app.py | awslabs/aws-media-replay-engine | 2c217eff42f8e2c56b43e2ecf593f5aaa92c5451 | [
"Apache-2.0"
] | 22 | 2021-11-24T01:23:07.000Z | 2022-03-26T23:24:46.000Z | source/backend/event_scheduler/infrastructure/app.py | awslabs/aws-media-replay-engine | 2c217eff42f8e2c56b43e2ecf593f5aaa92c5451 | [
"Apache-2.0"
] | null | null | null | source/backend/event_scheduler/infrastructure/app.py | awslabs/aws-media-replay-engine | 2c217eff42f8e2c56b43e2ecf593f5aaa92c5451 | [
"Apache-2.0"
] | 3 | 2021-12-10T09:42:51.000Z | 2022-02-16T02:22:50.000Z | #!/usr/bin/env python3
from aws_cdk import core as cdk
from stacks.eventSchedulerStack import EventSchedulerStack
app = cdk.App()
EventSchedulerStack(app, 'aws-mre-event-scheduler', description="MRE Event Scheduler stack")
app.synth()
| 29.5 | 92 | 0.79661 | 32 | 236 | 5.84375 | 0.59375 | 0.235294 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004695 | 0.097458 | 236 | 7 | 93 | 33.714286 | 0.873239 | 0.088983 | 0 | 0 | 0 | 0 | 0.224299 | 0.107477 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
603174df97a7fbcc2976148b32e2bdde0de1dde4 | 217 | py | Python | DeepEBM/Torture/__init__.py | taufikxu/FD-ScoreMatching | 9df0789bb98bb798b3de57072f63ee4b2f19947f | [
"MIT"
] | 12 | 2020-05-23T10:02:12.000Z | 2021-03-25T19:54:00.000Z | DeepEBM/Torture/__init__.py | taufikxu/FD-ScoreMatching | 9df0789bb98bb798b3de57072f63ee4b2f19947f | [
"MIT"
] | 6 | 2021-03-19T15:30:28.000Z | 2022-03-12T00:51:16.000Z | DeepEBM/Torture/__init__.py | taufikxu/FD-ScoreMatching | 9df0789bb98bb798b3de57072f63ee4b2f19947f | [
"MIT"
] | 4 | 2020-11-04T03:52:45.000Z | 2021-12-28T16:07:08.000Z | from . import Models
from . import Layers
from . import advtools
from . import shortcuts
from . import dataset
from . import loss_function
from . import utils
from .shortcuts import *
import torch
import numpy as np
| 18.083333 | 27 | 0.778802 | 31 | 217 | 5.419355 | 0.451613 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179724 | 217 | 11 | 28 | 19.727273 | 0.94382 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
603d9716e80e12b24b7a8c2b5b5c35291dd15cdf | 311 | py | Python | Giraffe/Chef/ChineseChef.py | MaggieIllustrations/softuni-github-programming | f5695cb14602f3d2974359f6d8734332acc650d3 | [
"MIT"
] | null | null | null | Giraffe/Chef/ChineseChef.py | MaggieIllustrations/softuni-github-programming | f5695cb14602f3d2974359f6d8734332acc650d3 | [
"MIT"
] | null | null | null | Giraffe/Chef/ChineseChef.py | MaggieIllustrations/softuni-github-programming | f5695cb14602f3d2974359f6d8734332acc650d3 | [
"MIT"
] | 1 | 2022-01-14T17:12:44.000Z | 2022-01-14T17:12:44.000Z | class ChineseChef:
def make_chicken(self):
print("The chef makes a chicken")
def make_salad(self):
print("The chef makes a salad")
def make_special_dish(self):
print("The chef makes orange chicken")
def make_fried_rice(self):
print("The chef makes fried rice") | 25.916667 | 46 | 0.649518 | 44 | 311 | 4.454545 | 0.386364 | 0.142857 | 0.244898 | 0.326531 | 0.438776 | 0.22449 | 0 | 0 | 0 | 0 | 0 | 0 | 0.257235 | 311 | 12 | 47 | 25.916667 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0.320513 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0 | 0 | 0 | 0.555556 | 0.444444 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
606b0857ad101df24e38e58e6e0b3763dbb8667f | 63 | py | Python | mamba/component/protocol_controller/__init__.py | ismaelJimenez/mamba_server | e6e2343291a0df24f226bde0d13e5bfa74cc3650 | [
"MIT"
] | null | null | null | mamba/component/protocol_controller/__init__.py | ismaelJimenez/mamba_server | e6e2343291a0df24f226bde0d13e5bfa74cc3650 | [
"MIT"
] | null | null | null | mamba/component/protocol_controller/__init__.py | ismaelJimenez/mamba_server | e6e2343291a0df24f226bde0d13e5bfa74cc3650 | [
"MIT"
] | null | null | null | from .mamba_protocol_controller import MambaProtocolController
| 31.5 | 62 | 0.920635 | 6 | 63 | 9.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063492 | 63 | 1 | 63 | 63 | 0.949153 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
608a0f8f76e03d9bc14c32c397f4d0e521bc851e | 714 | py | Python | day-08/day-08-built-in-functions.py | nerdchallenges/python-101-book | 78f67197e0febe59cfc373f28e0385b32bb0d0d1 | [
"MIT"
] | null | null | null | day-08/day-08-built-in-functions.py | nerdchallenges/python-101-book | 78f67197e0febe59cfc373f28e0385b32bb0d0d1 | [
"MIT"
] | null | null | null | day-08/day-08-built-in-functions.py | nerdchallenges/python-101-book | 78f67197e0febe59cfc373f28e0385b32bb0d0d1 | [
"MIT"
] | null | null | null | # Built In Functions
# Example 1
print("This is a built-in function") # Function Call
print(str(5))
print("Hello programmer.", "Welcome To Python 101!", "3rd argument of function")
# Kilobyte Nerd Challenge #1 - Print the following statements all on a new line
# Hello programmer
# Welcome to Python 101
# 3rd Argument of function
print("Hello programmer.")
print("Welcome to Python 101!")
print("3rd argument of function")
# Example 2
print("Hello programmer.", "Welcome to Python 101!", "3rd argument of function", sep="\n")
print("Hello programmer.", "Welcome to Python 101!", "3rd argument of function", sep="\t")
print("Hello programmer.", "Welcome to Python 101!", "3rd argument of function", sep="--->")
| 35.7 | 92 | 0.717087 | 104 | 714 | 4.923077 | 0.346154 | 0.175781 | 0.175781 | 0.210938 | 0.583984 | 0.583984 | 0.583984 | 0.583984 | 0.583984 | 0.583984 | 0 | 0.045827 | 0.144258 | 714 | 19 | 93 | 37.578947 | 0.792144 | 0.270308 | 0 | 0 | 0 | 0 | 0.686275 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
60a95d609ee2f166e696e721645e1886818a99b2 | 2,927 | py | Python | nanotune/drivers/dac_interface.py | jenshnielsen/nanotune | 0f2a252d1986f9a5ff155fad626658f85aec3f3e | [
"MIT"
] | null | null | null | nanotune/drivers/dac_interface.py | jenshnielsen/nanotune | 0f2a252d1986f9a5ff155fad626658f85aec3f3e | [
"MIT"
] | null | null | null | nanotune/drivers/dac_interface.py | jenshnielsen/nanotune | 0f2a252d1986f9a5ff155fad626658f85aec3f3e | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from typing import Optional, Tuple, Type
from enum import Enum
import qcodes as qc
from qcodes.instrument.base import Instrument
from qcodes.instrument.channel import ChannelList, InstrumentChannel
class RelayState(Enum):
ground = 0
smc = 1
bus = 2
floating = 3
class DACChannelInterface(InstrumentChannel, ABC):
def __init__(self, parent, name, channel_id):
super().__init__(parent, name)
self._channel_id = channel_id
@property
def channel_id(self) -> int:
return self._channel_id
@property
@abstractmethod
def supports_hardware_ramp(self) -> bool:
pass
@abstractmethod
def set_voltage(self, new_voltage: float) -> None:
pass
@abstractmethod
def get_voltage(self) -> float:
pass
@abstractmethod
def set_voltage_limit(self, new_limits: Tuple[float, float]) -> None:
pass
@abstractmethod
def get_voltage_limit(self) -> Tuple[float, float]:
pass
@abstractmethod
def get_voltage_step(self) -> float:
pass
@abstractmethod
def set_voltage_step(self, new_step: float) -> None:
pass
@abstractmethod
def get_frequency(self) -> float:
pass
@abstractmethod
def set_frequency(self, new_frequency: float) -> None:
pass
@abstractmethod
def get_offset(self) -> float:
pass
@abstractmethod
def set_offset(self, value: float):
pass
@abstractmethod
def get_amplitude(self) -> float:
pass
@abstractmethod
def set_amplitude(self, value: float):
pass
@abstractmethod
def get_relay_state(self) -> RelayState:
pass
@abstractmethod
def set_relay_state(self, new_state: RelayState):
""" """
pass
@abstractmethod
def ramp_voltage(self, target_voltage: float, ramp_rate: Optional[float] = None):
pass
@abstractmethod
def set_ramp_rate(self, value: float):
pass
@abstractmethod
def get_ramp_rate(self) -> float:
pass
@abstractmethod
def get_waveform(self) -> str:
pass
@abstractmethod
def set_waveform(self, waveform: str):
pass
class DACInterface(Instrument):
def __init__(self, name, DACChannelClass: Type[DACChannelInterface]):
assert issubclass(DACChannelClass, DACChannelInterface)
super().__init__(name)
channels = ChannelList(self, "Channels", DACChannelClass, snapshotable=False)
for chan_id in range(0, 64):
chan_name = f"ch{chan_id:02d}"
channel = DACChannelClass(self, chan_name, chan_id)
channels.append(channel)
self.add_submodule(chan_name, channel)
self.add_submodule("channels", channels)
@abstractmethod
def run(self):
pass
@abstractmethod
def sync(self):
pass
| 22.689922 | 85 | 0.647762 | 321 | 2,927 | 5.697819 | 0.255452 | 0.204483 | 0.229634 | 0.142154 | 0.311099 | 0.240022 | 0.149809 | 0 | 0 | 0 | 0 | 0.004194 | 0.266826 | 2,927 | 128 | 86 | 22.867188 | 0.848089 | 0 | 0 | 0.479167 | 0 | 0 | 0.010616 | 0 | 0 | 0 | 0 | 0 | 0.010417 | 1 | 0.260417 | false | 0.229167 | 0.0625 | 0.010417 | 0.40625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
60aa22dcc4b1f82d676551ef574880dea3953212 | 9,207 | py | Python | karbor-1.3.0/karbor/tests/fullstack/test_checkpoints.py | scottwedge/OpenStack-Stein | 7077d1f602031dace92916f14e36b124f474de15 | [
"Apache-2.0"
] | null | null | null | karbor-1.3.0/karbor/tests/fullstack/test_checkpoints.py | scottwedge/OpenStack-Stein | 7077d1f602031dace92916f14e36b124f474de15 | [
"Apache-2.0"
] | 5 | 2019-08-14T06:46:03.000Z | 2021-12-13T20:01:25.000Z | karbor-1.3.0/karbor/tests/fullstack/test_checkpoints.py | scottwedge/OpenStack-Stein | 7077d1f602031dace92916f14e36b124f474de15 | [
"Apache-2.0"
] | 2 | 2020-03-15T01:24:15.000Z | 2020-07-22T20:34:26.000Z | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from karbor.common import constants
from karbor.tests.fullstack import karbor_base
from karbor.tests.fullstack import karbor_objects as objects
class CheckpointsTest(karbor_base.KarborBaseTest):
"""Test Checkpoints operation """
def setUp(self):
super(CheckpointsTest, self).setUp()
self.provider_id = self.provider_id_os
def test_checkpoint_create(self):
volume = self.store(objects.Volume())
volume.create(1)
plan = self.store(objects.Plan())
volume_parameter_key = "OS::Cinder::Volume#{id}".format(id=volume.id)
backup_name = "volume-backup-{id}".format(id=volume.id)
parameters = {
"OS::Cinder::Volume": {
"backup_mode": "full",
"force": False
},
volume_parameter_key: {
"backup_name": backup_name
}
}
plan.create(self.provider_id_os, [volume, ],
parameters=parameters)
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id, plan.id, timeout=2400)
search_opts = {"volume_id": volume.id}
backups = self.cinder_client.backups.list(search_opts=search_opts)
self.assertEqual(1, len(backups))
search_opts = {"name": backup_name}
backups = self.cinder_client.backups.list(search_opts=search_opts)
self.assertEqual(1, len(backups))
def test_checkpoint_delete(self):
volume = self.store(objects.Volume())
volume.create(1)
plan = self.store(objects.Plan())
plan.create(self.provider_id, [volume, ])
checkpoint = objects.Checkpoint()
checkpoint.create(self.provider_id, plan.id, timeout=2400)
checkpoint_item = self.karbor_client.checkpoints.get(self.provider_id,
checkpoint.id)
self.assertEqual(constants.CHECKPOINT_STATUS_AVAILABLE,
checkpoint_item.status)
checkpoint.close()
items = self.karbor_client.checkpoints.list(self.provider_id)
ids = [item.id for item in items]
self.assertTrue(checkpoint.id not in ids)
def test_checkpoint_list(self):
volume = self.store(objects.Volume())
volume.create(1)
plan = self.store(objects.Plan())
plan.create(self.provider_id_noop, [volume, ])
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id_noop, plan.id, timeout=2400)
items = self.karbor_client.checkpoints.list(self.provider_id_noop)
ids = [item.id for item in items]
self.assertTrue(checkpoint.id in ids)
def test_checkpoint_get(self):
volume = self.store(objects.Volume())
volume.create(1)
plan = self.store(objects.Plan())
plan.create(self.provider_id, [volume, ])
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id, plan.id, timeout=2400)
# sanity
checkpoint_item = self.karbor_client.checkpoints.get(self.provider_id,
checkpoint.id)
self.assertEqual(constants.CHECKPOINT_STATUS_AVAILABLE,
checkpoint_item.status)
self.assertEqual(checkpoint.id, checkpoint_item.id)
def test_server_attached_volume_only_protect_server(self):
"""Test checkpoint for server with attached volume
Test checkpoint for server which has attached one volume,
but only add server in protect source
"""
volume = self.store(objects.Volume())
volume.create(1)
server = self.store(objects.Server())
server.create()
server.attach_volume(volume.id)
plan = self.store(objects.Plan())
plan.create(self.provider_id_noop, [server, ])
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id_noop, plan.id, timeout=2400)
items = self.karbor_client.checkpoints.list(self.provider_id_noop)
ids = [item.id for item in items]
self.assertTrue(checkpoint.id in ids)
def test_server_attached_volume_protect_both(self):
"""Test checkpoint for server with attached volume
Test checkpoint for server which has attached one volume,
and add server and volume both in protect source
"""
volume = self.store(objects.Volume())
volume.create(1)
server = self.store(objects.Server())
server.create()
server.attach_volume(volume.id)
plan = self.store(objects.Plan())
plan.create(self.provider_id_noop, [server, volume])
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id_noop, plan.id, timeout=2400)
items = self.karbor_client.checkpoints.list(self.provider_id_noop)
ids = [item.id for item in items]
self.assertTrue(checkpoint.id in ids)
def test_server_boot_from_volume_with_attached_volume(self):
"""Test checkpoint for server with a bootable volume
Test checkpoint for server which has booted form one bootable
volume.
"""
bootable_volume = self.store(objects.Volume())
bootable_volume_id = bootable_volume.create(1, create_from_image=True)
volume = self.store(objects.Volume())
volume.create(1)
server = self.store(objects.Server())
server.create(volume=bootable_volume_id)
server.attach_volume(volume.id)
plan = self.store(objects.Plan())
plan.create(self.provider_id, [server, ])
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id, plan.id, timeout=2400)
items = self.karbor_client.checkpoints.list(self.provider_id)
ids = [item.id for item in items]
self.assertTrue(checkpoint.id in ids)
search_opts = {"volume_id": volume.id}
backups = self.cinder_client.backups.list(search_opts=search_opts)
self.assertEqual(1, len(backups))
search_opts = {"volume_id": bootable_volume_id}
bootable_backups = self.cinder_client.backups.list(
search_opts=search_opts)
self.assertEqual(1, len(bootable_backups))
server.detach_volume(volume.id)
def test_checkpoint_share_projection(self):
share = self.store(objects.Share())
share.create("NFS", 1)
plan = self.store(objects.Plan())
share_parameter_key = "OS::Manila::Share#{id}".format(
id=share.id)
snapshot_name = "share-snapshot-{id}".format(id=share.id)
parameters = {
"OS::Manila::Share": {
"force": False
},
share_parameter_key: {
"snapshot_name": snapshot_name
}
}
plan.create(self.provider_id_os, [share, ],
parameters=parameters)
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id, plan.id, timeout=2400)
search_opts = {"share_id": share.id}
snapshots = self.manila_client.share_snapshots.list(
search_opts=search_opts)
self.assertEqual(1, len(snapshots))
search_opts = {"name": snapshot_name}
backups = self.manila_client.share_snapshots.list(
search_opts=search_opts)
self.assertEqual(1, len(backups))
def test_checkpoint_volume_snapshot(self):
volume = self.store(objects.Volume())
volume.create(1)
plan = self.store(objects.Plan())
volume_parameter_key = "OS::Cinder::Volume#{id}".format(id=volume.id)
snapshot_name = "volume-snapshot-{id}".format(id=volume.id)
parameters = {
"OS::Cinder::Volume": {
"force": False
},
volume_parameter_key: {
"snapshot_name": snapshot_name
}
}
plan.create(self.provider_id_os_volume_snapshot, [volume, ],
parameters=parameters)
checkpoint = self.store(objects.Checkpoint())
checkpoint.create(self.provider_id_os_volume_snapshot, plan.id,
timeout=2400)
search_opts = {"volume_id": volume.id}
snapshots = self.cinder_client.volume_snapshots.list(
search_opts=search_opts)
self.assertEqual(1, len(snapshots))
search_opts = {"name": snapshot_name}
snapshots = self.cinder_client.volume_snapshots.list(
search_opts=search_opts)
self.assertEqual(1, len(snapshots))
| 38.684874 | 78 | 0.637558 | 1,074 | 9,207 | 5.299814 | 0.133147 | 0.047435 | 0.084329 | 0.063247 | 0.768974 | 0.737526 | 0.715039 | 0.704322 | 0.697997 | 0.68324 | 0 | 0.008492 | 0.258173 | 9,207 | 237 | 79 | 38.848101 | 0.82489 | 0.106441 | 0 | 0.651163 | 0 | 0 | 0.037438 | 0.008374 | 0 | 0 | 0 | 0 | 0.093023 | 1 | 0.05814 | false | 0 | 0.017442 | 0 | 0.081395 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
60d909faf0fa90907f795cc2c1b3d68161ac6e90 | 130 | py | Python | app/main/errors.py | stevekibe/newsapp | 61cb31a28ecfe05c46eaad40ad97fe6d8e9b6c26 | [
"MIT"
] | null | null | null | app/main/errors.py | stevekibe/newsapp | 61cb31a28ecfe05c46eaad40ad97fe6d8e9b6c26 | [
"MIT"
] | null | null | null | app/main/errors.py | stevekibe/newsapp | 61cb31a28ecfe05c46eaad40ad97fe6d8e9b6c26 | [
"MIT"
] | null | null | null | def four_Ow_four(error):
'''
method to render the 404 error page
'''
return render_template('fourOwfour.html'),404 | 26 | 49 | 0.669231 | 18 | 130 | 4.666667 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0.215385 | 130 | 5 | 49 | 26 | 0.764706 | 0.269231 | 0 | 0 | 0 | 0 | 0.1875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
717ef781390c0c6bec7503adc936487428536fd1 | 174 | py | Python | ServerlessController/providers_app/forms.py | pacslab/ChainFaaS | f99dd3753de21a93c61cc411b88b7ab2c5da9efe | [
"Apache-2.0"
] | 7 | 2020-08-27T23:32:43.000Z | 2022-02-18T12:08:50.000Z | ServerlessController/providers_app/forms.py | pacslab/ChainFaaS | f99dd3753de21a93c61cc411b88b7ab2c5da9efe | [
"Apache-2.0"
] | 6 | 2020-11-02T07:03:22.000Z | 2021-06-10T19:58:48.000Z | ServerlessController/providers_app/forms.py | pacslab/ChainFaaS | f99dd3753de21a93c61cc411b88b7ab2c5da9efe | [
"Apache-2.0"
] | 2 | 2020-04-16T00:47:21.000Z | 2021-04-27T07:45:52.000Z | from django import forms
from profiles.models import Provider
class ProviderForm(forms.ModelForm):
class Meta:
model = Provider
fields = ('ram', 'cpu')
| 19.333333 | 36 | 0.678161 | 20 | 174 | 5.9 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.235632 | 174 | 8 | 37 | 21.75 | 0.887218 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |