hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d34d4bd2e0a6bfa6226a7b4b08226b7097b82c36 | 175 | py | Python | code/python/pymir/commands/train_seql.py | mfranco/pymir | f6d86bfdec942156ae95984f1ef8182d8983181f | [
"MIT"
] | 1 | 2020-01-18T21:47:59.000Z | 2020-01-18T21:47:59.000Z | code/python/pymir/commands/train_seql.py | maigfrga/pymir | f6d86bfdec942156ae95984f1ef8182d8983181f | [
"MIT"
] | null | null | null | code/python/pymir/commands/train_seql.py | maigfrga/pymir | f6d86bfdec942156ae95984f1ef8182d8983181f | [
"MIT"
] | null | null | null | from pymir.analytics.key_detection.musicnet.ml.seql import seql_trainer
def run():
"""
Musicnet metadata format for training seql
"""
seql_trainer.compute()
| 19.444444 | 71 | 0.714286 | 22 | 175 | 5.545455 | 0.772727 | 0.180328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188571 | 175 | 8 | 72 | 21.875 | 0.859155 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d35ea0e8065b0c5cecd94bf1415278b8d67cea54 | 892 | py | Python | lib/hachoir/parser/archive/__init__.py | 0x20Man/Watcher3 | 4656b42bc5879a3741bb95f534b7c6612a25264d | [
"Apache-2.0"
] | 320 | 2017-03-28T23:33:45.000Z | 2022-02-17T08:45:01.000Z | lib/hachoir/parser/archive/__init__.py | 0x20Man/Watcher3 | 4656b42bc5879a3741bb95f534b7c6612a25264d | [
"Apache-2.0"
] | 300 | 2017-03-28T19:22:54.000Z | 2021-12-01T01:11:55.000Z | lib/hachoir/parser/archive/__init__.py | 0x20Man/Watcher3 | 4656b42bc5879a3741bb95f534b7c6612a25264d | [
"Apache-2.0"
] | 90 | 2017-03-29T16:12:43.000Z | 2022-03-01T06:23:48.000Z | from hachoir.parser.archive.ace import AceFile # noqa
from hachoir.parser.archive.ar import ArchiveFile # noqa
from hachoir.parser.archive.bomstore import BomFile # noqa
from hachoir.parser.archive.bzip2_parser import Bzip2Parser # noqa
from hachoir.parser.archive.cab import CabFile # noqa
from hachoir.parser.archive.gzip_parser import GzipParser # noqa
from hachoir.parser.archive.tar import TarFile # noqa
from hachoir.parser.archive.zip import ZipFile # noqa
from hachoir.parser.archive.rar import RarFile # noqa
from hachoir.parser.archive.rpm import RpmFile # noqa
from hachoir.parser.archive.sevenzip import SevenZipParser # noqa
from hachoir.parser.archive.mar import MarFile # noqa
from hachoir.parser.archive.mozilla_ar import MozillaArchive # noqa
from hachoir.parser.archive.zlib import ZlibData # noqa
from hachoir.parser.archive.prs_pak import PRSPakFile # noqa
| 55.75 | 68 | 0.815022 | 124 | 892 | 5.830645 | 0.298387 | 0.228216 | 0.352697 | 0.497925 | 0.542185 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002541 | 0.117713 | 892 | 15 | 69 | 59.466667 | 0.916137 | 0.08296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d376bc305c32a030c68c2a55a1be006f0de7c9be | 7,475 | py | Python | cogs/music.py | ilyLuxa/Athena-bot | 7cb242a8c4626dbe7866850e629f85657c698514 | [
"MIT"
] | null | null | null | cogs/music.py | ilyLuxa/Athena-bot | 7cb242a8c4626dbe7866850e629f85657c698514 | [
"MIT"
] | null | null | null | cogs/music.py | ilyLuxa/Athena-bot | 7cb242a8c4626dbe7866850e629f85657c698514 | [
"MIT"
] | 1 | 2021-09-19T21:49:08.000Z | 2021-09-19T21:49:08.000Z | from discord.ext import commands
import discord
from requests.models import default_hooks
import youtube_dl
class music(commands.Cog):
def __init__(self,client):
self.client = client
@commands.command(
name="join",
description='Joins voice channel.',
usage="`a!join`"
)
async def join(self,ctx):
        if ctx.author.voice is None:
            await ctx.send("**You must be connected to a voice channel to use this command.**")
            return
        voice_channel = ctx.author.voice.channel
if ctx.voice_client is None:
await voice_channel.connect()
await ctx.send(f"*Successfully connected to: {voice_channel}*")
else:
if len(ctx.voice_client.channel.members) == 1:
await ctx.voice_client.move_to(voice_channel)
await ctx.send(f"*Successfully connected to: {voice_channel}*")
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
@commands.command(
name="disconnect",
description='Leaves voice channel.',
usage="`a!disconnect`"
)
async def disconnect(self,ctx):
voice = ctx.voice_client
if ctx.author.voice is None:
await ctx.send("**You must be connected to a voice channel to use this command.**")
elif ctx.voice_client is None:
await ctx.send("**I'm not in a voice channel.**")
else:
if ctx.author.voice.channel == voice.channel:
await ctx.voice_client.disconnect()
await ctx.send(f"*Successfully disconnected from: {voice.channel}*")
else:
if len(ctx.voice_client.channel.members) == 1:
await ctx.voice_client.disconnect()
await ctx.send(f"*Successfully disconnected from: {voice.channel}*")
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
@commands.command(
name="play",
description='Plays song from url into voice channel.',
usage="`a!play [url]`"
)
async def play(self,ctx,url):
voice = ctx.voice_client
        if ctx.author.voice is None:
            await ctx.send("**You must be connected to a voice channel to use this command.**")
            return
        voice_channel = ctx.author.voice.channel
if ctx.voice_client is None:
await voice_channel.connect()
FFMPEG_OPTIONS = {'before_options': '-reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 5','options': '-vn'}
YDL_OPTIONS = {'format':'bestaudio'}
with youtube_dl.YoutubeDL(YDL_OPTIONS) as ydl:
info = ydl.extract_info(url, download=False)
url2 = info['formats'][0]['url']
source = await discord.FFmpegOpusAudio.from_probe(url2,**FFMPEG_OPTIONS)
ctx.voice_client.play(source)
else:
if ctx.author.voice.channel == voice.channel:
FFMPEG_OPTIONS = {'before_options': '-reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 5','options': '-vn'}
YDL_OPTIONS = {'format':'bestaudio'}
with youtube_dl.YoutubeDL(YDL_OPTIONS) as ydl:
info = ydl.extract_info(url, download=False)
url2 = info['formats'][0]['url']
source = await discord.FFmpegOpusAudio.from_probe(url2,**FFMPEG_OPTIONS)
ctx.voice_client.play(source)
else:
if len(ctx.voice_client.channel.members) == 1:
await ctx.voice_client.move_to(voice_channel)
FFMPEG_OPTIONS = {'before_options': '-reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 5','options': '-vn'}
YDL_OPTIONS = {'format':'bestaudio'}
with youtube_dl.YoutubeDL(YDL_OPTIONS) as ydl:
info = ydl.extract_info(url, download=False)
url2 = info['formats'][0]['url']
source = await discord.FFmpegOpusAudio.from_probe(url2,**FFMPEG_OPTIONS)
ctx.voice_client.play(source)
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
@commands.command(
name="pause",
description='Pauses current song.',
usage="`a!pause`"
)
async def pause(self,ctx):
voice = ctx.voice_client
if ctx.author.voice is None:
await ctx.send("**You must be connected to a voice channel to use this command.**")
elif ctx.voice_client is None:
await ctx.send("**I'm not in a voice channel.**")
else:
if ctx.author.voice.channel == voice.channel:
if ctx.voice_client.is_playing():
ctx.voice_client.pause()
await ctx.send("*Successfully paused the song.*")
elif ctx.voice_client.is_paused():
await ctx.send("**I'm already paused.**")
else:
await ctx.send("**Nothing is playing.**")
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
@commands.command(
name="resume",
description='Resumes current song.',
usage="`a!resume`"
)
async def resume(self,ctx):
voice = ctx.voice_client
if ctx.author.voice is None:
await ctx.send("**You must be connected to a voice channel to use this command.**")
elif ctx.voice_client is None:
await ctx.send("**I'm not in a voice channel.**")
else:
if ctx.author.voice.channel == voice.channel:
if ctx.voice_client.is_paused():
ctx.voice_client.resume()
await ctx.send("*Successfully resumed the song.*")
elif ctx.voice_client.is_playing():
await ctx.send("**Music is already playing.**")
else:
await ctx.send("**Nothing is playing.**")
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
@commands.command(
name="stop",
description='Stops current song.',
usage="`a!stop`"
)
async def stop(self,ctx):
voice = ctx.voice_client
if ctx.author.voice is None:
await ctx.send("**You must be connected to a voice channel to use this command.**")
elif ctx.voice_client is None:
await ctx.send("**I'm not in a voice channel.**")
else:
if ctx.author.voice.channel == voice.channel:
if ctx.voice_client.is_playing() or ctx.voice_client.is_paused():
ctx.voice_client.stop()
await ctx.send("*Successfully stopped the song.*")
else:
await ctx.send("**Nothing is playing.**")
else:
await ctx.send("**Someone else is already listening to music in different channel.**")
def setup(client):
client.add_cog(music(client))
| 43.71345 | 134 | 0.555184 | 855 | 7,475 | 4.753216 | 0.126316 | 0.103346 | 0.103346 | 0.047244 | 0.813484 | 0.805118 | 0.805118 | 0.791831 | 0.772884 | 0.772884 | 0 | 0.004237 | 0.33699 | 7,475 | 170 | 135 | 43.970588 | 0.815779 | 0 | 0 | 0.657895 | 0 | 0 | 0.257084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013158 | false | 0 | 0.026316 | 0 | 0.046053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d38cf84193ef201fc156eda1f39a3e96eb90298b | 17,823 | py | Python | goutdotcom/flare/views.py | Spiewart/goutdotcom | 0916155732a72fcb8c8a2fb0f4dd81efef618af8 | [
"MIT"
] | null | null | null | goutdotcom/flare/views.py | Spiewart/goutdotcom | 0916155732a72fcb8c8a2fb0f4dd81efef618af8 | [
"MIT"
] | null | null | null | goutdotcom/flare/views.py | Spiewart/goutdotcom | 0916155732a72fcb8c8a2fb0f4dd81efef618af8 | [
"MIT"
] | null | null | null | from django.contrib import messages
from django.contrib.auth.mixins import LoginRequiredMixin
from django.http import HttpResponseRedirect
from django.http.response import Http404
from django.shortcuts import redirect
from django.urls import reverse
from django.views.generic import CreateView, DetailView, ListView, UpdateView
from django.views.generic.base import TemplateView
from ..history.forms import (
AnginaForm,
CHFSimpleForm,
HeartAttackSimpleForm,
HypertensionSimpleForm,
PVDForm,
StrokeSimpleForm,
)
from ..history.models import CHF, PVD, Angina, HeartAttack, Hypertension, Stroke
from ..lab.forms import UrateFlareForm
from ..lab.models import Urate
from .forms import FlareForm
from .models import Flare
# Create your views here.
class AboutFlares(TemplateView):
template_name = "flare/about.html"
class FlareDetail(DetailView):
model = Flare
### NEED TO CHECK IF FLARE BELONGS TO LOGGED IN USER, PERMISSION DENIED IF NOT
class FlareList(LoginRequiredMixin, ListView):
"""Changed allow_empty to = False so it returns 404 when empty, then redirect with dispatch to Flare About view"""
allow_empty = False
paginate_by = 5
model = Flare
"""Overrode dispatch to redirect to Flare About view if FlareList view returns 404, as in the case of it being empty due to allow_empty=False
"""
def dispatch(self, *args, **kwargs):
try:
return super().dispatch(*args, **kwargs)
except Http404:
            messages.info(self.request, "No flares to list!")
return redirect("flare:about")
def get_context_data(self, **kwargs):
context = super(ListView, self).get_context_data(**kwargs)
context.update(
{
"flare_list": Flare.objects.filter(user=self.request.user),
}
)
return context
def get_queryset(self):
queryset = super().get_queryset()
return queryset.filter(user=self.request.user).order_by("-created")
class FlareCreate(CreateView):
model = Flare
form_class = FlareForm
urate_form_class = UrateFlareForm
angina_form_class = AnginaForm
hypertension_form_class = HypertensionSimpleForm
heartattack_form_class = HeartAttackSimpleForm
CHF_form_class = CHFSimpleForm
stroke_form_class = StrokeSimpleForm
PVD_form_class = PVDForm
def form_valid(self, form):
if self.request.user.is_authenticated:
if self.request.user.patientprofile:
if self.request.user.patientprofile.gender:
if self.request.user.patientprofile.gender == "male":
form.instance.male = True
else:
form.instance.male = False
form.instance.user = self.request.user
return super().form_valid(form)
else:
return super().form_valid(form)
def get_context_data(self, **kwargs):
context = super(FlareCreate, self).get_context_data(**kwargs)
if self.request.user.is_authenticated:
if "urate_form" not in context:
context["urate_form"] = self.urate_form_class(self.request.GET)
if "angina_form" not in context:
context["angina_form"] = self.angina_form_class(instance=self.request.user.medicalprofile.angina)
if "hypertension_form" not in context:
context["hypertension_form"] = self.hypertension_form_class(
instance=self.request.user.medicalprofile.hypertension
)
if "heartattack_Form" not in context:
context["heartattack_form"] = self.heartattack_form_class(
instance=self.request.user.medicalprofile.heartattack
)
if "CHF_form" not in context:
context["CHF_form"] = self.CHF_form_class(instance=self.request.user.medicalprofile.CHF)
if "stroke_form" not in context:
context["stroke_form"] = self.stroke_form_class(instance=self.request.user.medicalprofile.stroke)
if "PVD_form" not in context:
context["PVD_form"] = self.PVD_form_class(instance=self.request.user.medicalprofile.PVD)
else:
if "urate_form" not in context:
context["urate_form"] = self.urate_form_class(self.request.GET)
if "angina_form" not in context:
context["angina_form"] = self.angina_form_class(self.request.GET)
if "hypertension_form" not in context:
context["hypertension_form"] = self.hypertension_form_class(self.request.GET)
if "heartattack_form" not in context:
context["heartattack_form"] = self.heartattack_form_class(self.request.GET)
if "CHF_form" not in context:
context["CHF_form"] = self.CHF_form_class(self.request.GET)
if "stroke_form" not in context:
context["stroke_form"] = self.stroke_form_class(self.request.GET)
if "PVD_form" not in context:
context["PVD_form"] = self.PVD_form_class(self.request.GET)
return context
def get_form_kwargs(self):
"""Overwrites get_form_kwargs() to look for 'flare' kwarg in GET request, uses 'flare' to query database for associated flare for use in FlareAidForm
returns: [dict: dict containing 'flare' kwarg for form]"""
# Assign self.gender to None so FlareForm won't error on loading from GET request kwargs before calling super() which will overwrite kwargs
self.gender = None
# Checks if user is logged in, if they have a patient profile gender, and if so, assign to self.gender
if self.request.user.is_authenticated:
if self.request.user.patientprofile.gender:
self.gender = self.request.user.patientprofile.gender
kwargs = super(FlareCreate, self).get_form_kwargs()
# Pass self.gender to FlareForm as kwarg for use in form processing of male field
if self.gender:
kwargs["gender"] = self.gender
return kwargs
def get_object(self):
object = self.model
return object
def post(self, request, *args, **kwargs):
self.object = self.get_object()
form = self.form_class(request.POST, instance=Flare())
urate_form = self.urate_form_class(request.POST, instance=Urate())
if request.user.is_authenticated:
angina_form = self.angina_form_class(request.POST, instance=request.user.medicalprofile.angina)
hypertension_form = self.hypertension_form_class(
request.POST, instance=request.user.medicalprofile.hypertension
)
heartattack_form = self.heartattack_form_class(
request.POST, instance=request.user.medicalprofile.heartattack
)
CHF_form = self.CHF_form_class(request.POST, instance=request.user.medicalprofile.CHF)
stroke_form = self.stroke_form_class(request.POST, instance=request.user.medicalprofile.stroke)
PVD_form = self.PVD_form_class(request.POST, instance=request.user.medicalprofile.PVD)
else:
angina_form = self.angina_form_class(request.POST, instance=Angina())
hypertension_form = self.hypertension_form_class(request.POST, instance=Hypertension())
heartattack_form = self.heartattack_form_class(request.POST, instance=HeartAttack())
CHF_form = self.CHF_form_class(request.POST, instance=CHF())
stroke_form = self.stroke_form_class(request.POST, instance=Stroke())
PVD_form = self.PVD_form_class(request.POST, instance=PVD())
if form.is_valid():
flare_data = form.save(commit=False)
if urate_form.is_valid():
urate_data = urate_form.save(commit=False)
if urate_data.value:
if request.user.is_authenticated:
urate_data.user = request.user
urate_data.save()
flare_data.urate = urate_data
angina_data = angina_form.save(commit=False)
angina_data.last_modified = "Flare"
angina_data.save()
hypertension_data = hypertension_form.save(commit=False)
hypertension_data.last_modified = "Flare"
hypertension_data.save()
heartattack_data = heartattack_form.save(commit=False)
heartattack_data.last_modified = "Flare"
heartattack_data.save()
CHF_data = CHF_form.save(commit=False)
CHF_data.last_modified = "Flare"
CHF_data.save()
stroke_data = stroke_form.save(commit=False)
stroke_data.last_modified = "Flare"
stroke_data.save()
PVD_data = PVD_form.save(commit=False)
PVD_data.last_modified = "Flare"
PVD_data.save()
flare_data.angina = angina_data
flare_data.hypertension = hypertension_data
flare_data.heartattack = heartattack_data
flare_data.CHF = CHF_data
flare_data.stroke = stroke_data
flare_data.PVD = PVD_data
flare_data.save()
return self.form_valid(form)
else:
if request.user.is_authenticated:
return self.render_to_response(
self.get_context_data(
form=form,
urate_form=self.urate_form_class(request.POST, instance=Urate()),
angina_form=self.angina_form_class(request.POST, instance=request.user.medicalprofile.angina),
hypertension_form=self.hypertension_form_class(
request.POST, instance=request.user.medicalprofile.hypertension
),
heartattack_form=self.heartattack_form_class(
request.POST, instance=request.user.medicalprofile.heartattack
),
CHF_form=self.CHF_form_class(request.POST, instance=request.user.medicalprofile.CHF),
stroke_form=self.stroke_form_class(request.POST, instance=request.user.medicalprofile.stroke),
PVD_form=self.PVD_form_class(request.POST, instance=request.user.medicalprofile.PVD),
)
)
else:
return self.render_to_response(
self.get_context_data(
form=form,
urate_form=self.urate_form_class(request.POST, instance=Urate()),
angina_form=self.angina_form_class(request.POST, instance=Angina()),
hypertension_form=self.hypertension_form_class(request.POST, instance=Hypertension()),
heartattack_form=self.heartattack_form_class(request.POST, instance=HeartAttack()),
CHF_form=self.CHF_form_class(request.POST, instance=CHF()),
stroke_form=self.stroke_form_class(request.POST, instance=Stroke()),
PVD_form=self.PVD_form_class(request.POST, instance=PVD()),
)
)
class FlareUpdate(LoginRequiredMixin, UpdateView):
model = Flare
form_class = FlareForm
urate_form_class = UrateFlareForm
angina_form_class = AnginaForm
hypertension_form_class = HypertensionSimpleForm
heartattack_form_class = HeartAttackSimpleForm
CHF_form_class = CHFSimpleForm
stroke_form_class = StrokeSimpleForm
PVD_form_class = PVDForm
def get_context_data(self, **kwargs):
context = super(FlareUpdate, self).get_context_data(**kwargs)
if self.request.POST:
if "urate_form" not in context:
context["urate_form"] = self.urate_form_class(self.request.POST, instance=self.object.urate)
if "angina_form" not in context:
context["angina_form"] = self.angina_form_class(instance=self.request.user.medicalprofile.angina)
if "hypertension_form" not in context:
context["hypertension_form"] = self.hypertension_form_class(
instance=self.request.user.medicalprofile.hypertension
)
if "heartattack_form" not in context:
context["heartattack_form"] = self.heartattack_form_class(
instance=self.request.user.medicalprofile.heartattack
)
if "CHF_form" not in context:
context["CHF_form"] = self.CHF_form_class(instance=self.request.user.medicalprofile.CHF)
if "stroke_form" not in context:
context["stroke_form"] = self.stroke_form_class(instance=self.request.user.medicalprofile.stroke)
if "PVD_form" not in context:
context["PVD_form"] = self.PVD_form_class(instance=self.request.user.medicalprofile.PVD)
else:
if "urate_form" not in context:
context["urate_form"] = self.urate_form_class(instance=self.object.urate)
if "angina_form" not in context:
context["angina_form"] = self.angina_form_class(instance=self.request.user.medicalprofile.angina)
if "heartattack_form" not in context:
context["heartattack_form"] = self.heartattack_form_class(
instance=self.request.user.medicalprofile.heartattack
)
if "hypertension_form" not in context:
context["hypertension_form"] = self.hypertension_form_class(
instance=self.request.user.medicalprofile.hypertension
)
if "CHF_form" not in context:
context["CHF_form"] = self.CHF_form_class(instance=self.request.user.medicalprofile.CHF)
if "stroke_form" not in context:
context["stroke_form"] = self.stroke_form_class(instance=self.request.user.medicalprofile.stroke)
if "PVD_form" not in context:
context["PVD_form"] = self.PVD_form_class(instance=self.request.user.medicalprofile.PVD)
return context
def post(self, request, **kwargs):
self.object = self.get_object()
form = self.form_class(request.POST, request.FILES, instance=self.object)
if form.is_valid():
flare_data = form.save(commit=False)
flare_data.user = request.user
urate_form = self.urate_form_class(request.POST, instance=self.object.urate)
angina_form = self.angina_form_class(request.POST, instance=request.user.medicalprofile.angina)
hypertension_form = self.hypertension_form_class(
request.POST, instance=request.user.medicalprofile.hypertension
)
            heartattack_form = self.heartattack_form_class(
request.POST, instance=request.user.medicalprofile.heartattack
)
CHF_form = self.CHF_form_class(request.POST, instance=request.user.medicalprofile.CHF)
stroke_form = self.stroke_form_class(request.POST, instance=request.user.medicalprofile.stroke)
PVD_form = self.PVD_form_class(request.POST, instance=request.user.medicalprofile.PVD)
if urate_form.is_valid():
urate_data = urate_form.save(commit=False)
if urate_data.value:
urate_data.user = request.user
urate_data.save()
flare_data.urate = urate_data
angina_data = angina_form.save(commit=False)
angina_data.last_modified = "Flare"
angina_data.save()
hypertension_data = hypertension_form.save(commit=False)
hypertension_data.last_modified = "Flare"
hypertension_data.save()
heartattack_data = heartattack_form.save(commit=False)
heartattack_data.last_modified = "Flare"
heartattack_data.save()
CHF_data = CHF_form.save(commit=False)
CHF_data.last_modified = "Flare"
CHF_data.save()
stroke_data = stroke_form.save(commit=False)
stroke_data.last_modified = "Flare"
stroke_data.save()
PVD_data = PVD_form.save(commit=False)
PVD_data.last_modified = "Flare"
PVD_data.save()
            flare_data.angina = angina_data
            flare_data.hypertension = hypertension_data
flare_data.heartattack = heartattack_data
flare_data.CHF = CHF_data
flare_data.stroke = stroke_data
flare_data.PVD = PVD_data
flare_data.save()
return self.form_valid(form)
else:
return self.render_to_response(
self.get_context_data(
form=form,
urate_form=self.urate_form_class(request.POST, instance=Urate()),
angina_form=self.angina_form_class(request.POST, instance=request.user.medicalprofile.angina),
hypertension_form=self.hypertension_form_class(
request.POST, instance=request.user.medicalprofile.hypertension
),
heartattack_form=self.heartattack_form_class(
request.POST, instance=request.user.medicalprofile.heartattack
),
CHF_form=self.CHF_form_class(request.POST, instance=request.user.medicalprofile.CHF),
stroke_form=self.stroke_form_class(request.POST, instance=request.user.medicalprofile.stroke),
PVD_form=self.PVD_form_class(request.POST, instance=request.user.medicalprofile.PVD),
)
)
| 49.92437 | 157 | 0.635471 | 1,977 | 17,823 | 5.512898 | 0.085483 | 0.071841 | 0.063125 | 0.078906 | 0.800532 | 0.771447 | 0.7564 | 0.754656 | 0.737407 | 0.733187 | 0 | 0.001013 | 0.279863 | 17,823 | 356 | 158 | 50.064607 | 0.84815 | 0.040958 | 0 | 0.660436 | 0 | 0 | 0.046161 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031153 | false | 0 | 0.043614 | 0 | 0.208723 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d3a3ad1bfa261ca1b7a17e7d9ef0112762e8b1d9 | 30 | py | Python | src/data/__init__.py | lucasalexsorensen/mlops | 2d8157eb493061775bdab9a8e176d2bdcc2c166e | [
"MIT"
] | null | null | null | src/data/__init__.py | lucasalexsorensen/mlops | 2d8157eb493061775bdab9a8e176d2bdcc2c166e | [
"MIT"
] | null | null | null | src/data/__init__.py | lucasalexsorensen/mlops | 2d8157eb493061775bdab9a8e176d2bdcc2c166e | [
"MIT"
] | null | null | null | from .data import MaskDataset
| 15 | 29 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.961538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6ccbc65d8b1942df4b506a1846eafe5488db7549 | 9,166 | py | Python | tests/test_rest_views.py | timgates42/django-nap | 34a2a3f9a194457ea5391b62d49f8ee975bc5aeb | [
"BSD-3-Clause"
] | 114 | 2015-01-15T23:03:38.000Z | 2021-11-02T07:58:08.000Z | tests/test_rest_views.py | timgates42/django-nap | 34a2a3f9a194457ea5391b62d49f8ee975bc5aeb | [
"BSD-3-Clause"
] | 23 | 2015-01-08T00:37:24.000Z | 2021-02-06T09:30:15.000Z | tests/test_rest_views.py | timgates42/django-nap | 34a2a3f9a194457ea5391b62d49f8ee975bc5aeb | [
"BSD-3-Clause"
] | 19 | 2015-01-13T17:19:50.000Z | 2020-03-09T11:02:38.000Z | from django.test import TestCase
import json
from nap.http import STATUS
from .models import Poll, Choice
class ListRestViewTest(TestCase):
def setUp(self):
self.question_1 = {
'question': 'Question 1',
'pub_date': '2016-05-13 00:00:00',
'kill_date': None,
}
self.question_2 = {
'question': 'Question 2',
'pub_date': '2015-05-13 00:00:00',
'kill_date': None,
}
Poll.objects.create(**self.question_1)
Poll.objects.create(**self.question_2)
def test_get(self):
response = self.client.get('/rest/polls/')
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data[0], dict(self.question_1, choices=[]))
self.assertEqual(data[1], dict(self.question_2, choices=[]))
def test_post(self):
request_data = {}
response = self.client.post('/rest/polls/',
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {'question': 'Question 1'}
response = self.client.post('/rest/polls/',
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {'pub_date': '2016-05-13 00:00:00'}
response = self.client.post('/rest/polls/',
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {
'question': 'Question 1',
'pub_date': '2016-05-13 00:00:00',
'kill_date': None,
}
response = self.client.post('/rest/polls/',
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.CREATED)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data, dict(request_data, choices=[]))
class SingleObjectRestViewTest(TestCase):
def setUp(self):
self.default_question = 'Default question'
self.default_pub_date = '2015-01-01 00:00:00'
self.poll = Poll.objects.create(question=self.default_question, pub_date=self.default_pub_date)
def test_get(self):
response = self.client.get('/rest/polls/{}'.format(self.poll.pk))
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data['question'], self.default_question)
self.assertEqual(data['pub_date'], self.default_pub_date)
def test_put(self):
# put requires all fields
request_data = {}
response = self.client.put('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {'question': 'A new question'}
response = self.client.put('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {'pub_date': '2014-06-01 12:30:30'}
response = self.client.put('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.BAD_REQUEST)
request_data = {'question': 'A new question', 'pub_date': '2014-06-01 12:30:30'}
response = self.client.put('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data, dict(request_data, choices=[], kill_date=None))
# reload poll from db and see that it's updated
poll = Poll.objects.get(pk=self.poll.pk)
self.assertEqual(poll.question, request_data['question'])
self.assertEqual(poll.pub_date.isoformat(' '), request_data['pub_date'])
def test_patch(self):
# patch can have any combination of fields
request_data = {}
response = self.client.patch('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data, {
'question': self.default_question,
'pub_date': self.default_pub_date,
'kill_date': None,
'choices': [],
})
# reload poll from db and see that it's updated
poll = Poll.objects.get(pk=self.poll.pk)
self.assertEqual(poll.question, self.default_question)
self.assertEqual(poll.pub_date.isoformat(' '), self.default_pub_date)
# one field
request_data = {'question': 'One field question'}
response = self.client.patch('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data, {
'question': request_data['question'],
'pub_date': self.default_pub_date,
'kill_date': None,
'choices': [],
})
# reload poll from db and see that it's updated
poll = Poll.objects.get(pk=self.poll.pk)
self.assertEqual(poll.question, request_data['question'])
self.assertEqual(poll.pub_date.isoformat(' '), self.default_pub_date)
request_data = {'question': 'A new question', 'pub_date': '2014-06-01 12:30:30'}
response = self.client.patch('/rest/polls/{}'.format(self.poll.pk),
data=json.dumps(request_data),
content_type='application/json')
self.assertEqual(response.status_code, STATUS.OK)
self.assertEqual(response['Content-Type'], 'application/json')
data = json.loads(response.getvalue().decode())
self.assertEqual(data, dict(request_data, choices=[], kill_date=None))
# reload poll from db and see that it's updated
poll = Poll.objects.get(pk=self.poll.pk)
self.assertEqual(poll.question, request_data['question'])
self.assertEqual(poll.pub_date.isoformat(' '), request_data['pub_date'])
def test_delete(self):
response = self.client.delete('/rest/polls/{}'.format(self.poll.pk))
self.assertEqual(response.status_code, STATUS.NO_CONTENT)
# make sure the poll isn't in the db
self.assertFalse(Poll.objects.filter(pk=self.poll.pk).exists())
class ChoiceListViewTest(TestCase):
def setUp(self):
self.question_data = {
'question': 'Default question',
'pub_date': '2015-01-01 00:00:00',
}
self.poll = Poll.objects.create(**self.question_data)
def test_get(self):
response = self.client.get('/rest/polls/{}/choice/'.format(self.poll.pk))
self.assertEqual(response.status_code, 200)
def test_post(self):
response = self.client.post('/rest/polls/{}/choice/'.format(self.poll.pk),
data=json.dumps({
'choice_text': 'Does this work?',
'poll': self.poll.pk,
}),
content_type='application/json')
self.assertEqual(response.status_code, 201)
def test_paginate(self):
choices = [
Choice.objects.create(poll=self.poll),
Choice.objects.create(poll=self.poll),
]
response = self.client.get('/rest/polls/{}/choice/'.format(self.poll.pk))
self.assertEqual(response.status_code, 200)
| 45.376238 | 103 | 0.581933 | 1,018 | 9,166 | 5.112967 | 0.103143 | 0.118156 | 0.106052 | 0.094909 | 0.868972 | 0.848799 | 0.808838 | 0.800384 | 0.78732 | 0.775985 | 0 | 0.022436 | 0.285184 | 9,166 | 201 | 104 | 45.60199 | 0.771978 | 0.031966 | 0 | 0.660606 | 0 | 0 | 0.143051 | 0.007446 | 0 | 0 | 0 | 0 | 0.254545 | 1 | 0.072727 | false | 0 | 0.024242 | 0 | 0.115152 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6cec910d2b106eeb001696a06aaf2faeca73b9ff | 38 | py | Python | language.py | super-system-studio/calculator | db217c1074ee915e9fb8ea7fdcd006794e920613 | [
"Apache-2.0"
] | null | null | null | language.py | super-system-studio/calculator | db217c1074ee915e9fb8ea7fdcd006794e920613 | [
"Apache-2.0"
] | 1 | 2018-11-25T02:29:37.000Z | 2018-11-25T02:30:35.000Z | language.py | SuperSystemStudio/calculator | db217c1074ee915e9fb8ea7fdcd006794e920613 | [
"Apache-2.0"
] | 1 | 2018-11-20T13:09:30.000Z | 2018-11-20T13:09:30.000Z | import os
import basic
basic.language
| 9.5 | 14 | 0.842105 | 6 | 38 | 5.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131579 | 38 | 3 | 15 | 12.666667 | 0.969697 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9f1b243789fef1c25ca47ec59f43de1b20261b7f | 28 | py | Python | lib/utils.py | percebus/leetcode-excercises | 1d746e2255a3a5a652535ed1984a5aa45165e17e | [
"Unlicense"
] | null | null | null | lib/utils.py | percebus/leetcode-excercises | 1d746e2255a3a5a652535ed1984a5aa45165e17e | [
"Unlicense"
] | null | null | null | lib/utils.py | percebus/leetcode-excercises | 1d746e2255a3a5a652535ed1984a5aa45165e17e | [
"Unlicense"
] | null | null | null |
def noop(x):
return x
| 5.6 | 12 | 0.535714 | 5 | 28 | 3 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.357143 | 28 | 4 | 13 | 7 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
9f3c0bb28ad3592876e82dcfd0bb00598878b481 | 75 | py | Python | leveldbs/__init__.py | infinityfuture/leveldb-server | e786b29745f18a416607df28fe7a5d5de95fff1c | [
"MIT"
] | null | null | null | leveldbs/__init__.py | infinityfuture/leveldb-server | e786b29745f18a416607df28fe7a5d5de95fff1c | [
"MIT"
] | 1 | 2018-08-18T18:45:02.000Z | 2018-08-18T18:45:02.000Z | leveldbs/__init__.py | infinityfuture/leveldb-server | e786b29745f18a416607df28fe7a5d5de95fff1c | [
"MIT"
] | null | null | null |
# from .client import LevelDBClient
from .zmq_client import LevelDBClient
| 18.75 | 37 | 0.826667 | 9 | 75 | 6.777778 | 0.555556 | 0.393443 | 0.819672 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 75 | 3 | 38 | 25 | 0.938462 | 0.44 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
9f5e4cba6ce8b0cdf94faf79102391798911f91f | 6,221 | py | Python | pytextdist/vector_similarity.py | ywu94/python-text-distance | e96b68144a9df0ff78b635245e2ee8201473d6bc | [
"MIT"
] | 5 | 2020-03-18T18:07:16.000Z | 2022-02-16T11:37:36.000Z | pytextdist/vector_similarity.py | ywu94/python-text-distance | e96b68144a9df0ff78b635245e2ee8201473d6bc | [
"MIT"
] | 1 | 2020-03-12T00:58:19.000Z | 2020-03-18T05:37:37.000Z | pytextdist/vector_similarity.py | ywu94/python-text-distance | e96b68144a9df0ff78b635245e2ee8201473d6bc | [
"MIT"
] | 3 | 2020-03-12T00:44:07.000Z | 2021-01-22T07:56:38.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import math
import logging
logger = logging.getLogger(__name__)
from .input_validator import input_validator
from .preprocessing import phrase_preprocessing, ngram_counter
@input_validator(str, str, n=int)
def cosine_similarity(phrase_1, phrase_2, n=1, grain="word", ignore_non_alnumspc=True, ignore_space=True, ignore_numeric=True, ignore_case=True):
"""
Get cosine similarity between two text phrases
|
| Argument
| | phrase_1, phrase_2: text phrases to compare
|
| Parameter
| | n: number of continuous tokens to group
| | grain: "char" or "word", grain for building vector
|
| Parameter for preprocessing
| | ignore_non_alnumspc: whether to remove all non alpha/numeric/space characters
| | ignore_space: whether to remove all spaces if grain is character
| | ignore_numeric: whether to remove all numeric characters
| | ignore_case: whether to convert all alpha characters to lower case
|
| Output
| | similarity (type: float)
"""
l_1 = phrase_preprocessing(phrase_1, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_1 = ngram_counter(l_1, n=n)
l_2 = phrase_preprocessing(phrase_2, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_2 = ngram_counter(l_2, n=n)
numerator = sum([counter_1[x] * counter_2[x] for x in set(counter_1.keys()) & set(counter_2.keys())])
denominator = math.sqrt(sum([v**2 for v in counter_1.values()])) * math.sqrt(sum([v**2 for v in counter_2.values()]))
similarity = numerator/denominator
return similarity
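As a sanity check, the cosine formula above (dot product of n-gram counts over the product of L2 norms) can be reproduced standalone with `collections.Counter`; the `cosine` helper below is an illustrative sketch at word-unigram grain, not part of pytextdist:

```python
from collections import Counter
import math

def cosine(a, b):
    # word-unigram cosine: dot product of counts / product of L2 norms
    c1, c2 = Counter(a.lower().split()), Counter(b.lower().split())
    num = sum(c1[k] * c2[k] for k in c1.keys() & c2.keys())
    den = math.sqrt(sum(v * v for v in c1.values())) * math.sqrt(sum(v * v for v in c2.values()))
    return num / den

print(cosine("the quick brown fox", "the slow brown dog"))  # 2 shared words / (2 * 2) -> 0.5
```

This agrees with `cosine_similarity(phrase_1, phrase_2, n=1, grain="word")` up to the preprocessing flags.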
@input_validator(str, str, n=int)
def jaccard_similarity(phrase_1, phrase_2, n=1, grain="word", ignore_non_alnumspc=True, ignore_space=True, ignore_numeric=True, ignore_case=True):
"""
Get jaccard similarity between two text phrases
|
| Argument
| | phrase_1, phrase_2: text phrases to compare
|
| Parameter
| | n: number of continuous tokens to group
| | grain: "char" or "word", grain for building vector
|
| Parameter for preprocessing
| | ignore_non_alnumspc: whether to remove all non alpha/numeric/space characters
| | ignore_space: whether to remove all spaces if grain is character
| | ignore_numeric: whether to remove all numeric characters
| | ignore_case: whether to convert all alpha characters to lower case
|
| Output
| | similarity (type: float)
"""
l_1 = phrase_preprocessing(phrase_1, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_1 = ngram_counter(l_1, n=n)
unique_token_1 = set(counter_1.keys())
l_2 = phrase_preprocessing(phrase_2, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_2 = ngram_counter(l_2, n=n)
unique_token_2 = set(counter_2.keys())
numerator = len(unique_token_1 & unique_token_2)
denominator = len(unique_token_1 | unique_token_2)
similarity = numerator/denominator
return similarity
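For intuition: at character grain with n=1, the Jaccard formula above reduces to plain set overlap over unique characters. A standalone sketch (illustrative helper, not the library call):

```python
def jaccard_chars(a, b):
    # character-grain Jaccard: |A intersect B| / |A union B| over unique characters
    s1, s2 = set(a.lower()), set(b.lower())
    return len(s1 & s2) / len(s1 | s2)

print(jaccard_chars("night", "nacht"))  # shares {n, h, t} out of 7 unique chars -> 3/7
```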
@input_validator(str, str, n=int)
def sorensen_dice_similarity(phrase_1, phrase_2, n=1, grain="word", ignore_non_alnumspc=True, ignore_space=True, ignore_numeric=True, ignore_case=True):
"""
	Get Sorensen-Dice similarity between two text phrases
|
| Argument
| | phrase_1, phrase_2: text phrases to compare
|
| Parameter
| | n: number of continuous tokens to group
| | grain: "char" or "word", grain for building vector
|
| Parameter for preprocessing
| | ignore_non_alnumspc: whether to remove all non alpha/numeric/space characters
| | ignore_space: whether to remove all spaces if grain is character
| | ignore_numeric: whether to remove all numeric characters
| | ignore_case: whether to convert all alpha characters to lower case
|
| Output
| | similarity (type: float)
"""
l_1 = phrase_preprocessing(phrase_1, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_1 = ngram_counter(l_1, n=n)
unique_token_1 = set(counter_1.keys())
l_2 = phrase_preprocessing(phrase_2, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_2 = ngram_counter(l_2, n=n)
unique_token_2 = set(counter_2.keys())
numerator = 2 * len(unique_token_1 & unique_token_2)
denominator = len(unique_token_1) + len(unique_token_2)
similarity = numerator/denominator
return similarity
@input_validator(str, str, n=int)
def qgram_similarity(phrase_1, phrase_2, n=1, grain="word", ignore_non_alnumspc=True, ignore_space=True, ignore_numeric=True, ignore_case=True):
"""
Get Q-Gram similarity between two text phrases
|
| Argument
| | phrase_1, phrase_2: text phrases to compare
|
| Parameter
| | n: number of continuous tokens to group
| | grain: "char" or "word", grain for building vector
|
| Parameter for preprocessing
| | ignore_non_alnumspc: whether to remove all non alpha/numeric/space characters
| | ignore_space: whether to remove all spaces if grain is character
| | ignore_numeric: whether to remove all numeric characters
| | ignore_case: whether to convert all alpha characters to lower case
|
| Output
| | similarity (type: float)
"""
l_1 = phrase_preprocessing(phrase_1, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_1 = ngram_counter(l_1, n=n)
l_2 = phrase_preprocessing(phrase_2, grain=grain, ignore_non_alnumspc=ignore_non_alnumspc, ignore_numeric=ignore_numeric, ignore_case=ignore_case, ignore_space=ignore_space)
counter_2 = ngram_counter(l_2, n=n)
numerator = sum([abs(counter_1.get(key,0)-counter_2.get(key,0)) for key in set(counter_1.keys())|set(counter_2.keys())])
denominator = sum([max(counter_1.get(key,0), counter_2.get(key,0)) for key in set(counter_1.keys())|set(counter_2.keys())])
similarity = 1 - numerator/denominator
return similarity
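The Q-gram similarity above is one minus the L1 distance between the count vectors, normalized by the sum of the elementwise maxima. A standalone word-unigram sketch (`qgram` is an illustrative helper, not the library function):

```python
from collections import Counter

def qgram(a, b):
    # word-unigram Q-gram similarity: 1 - L1(c1, c2) / sum of elementwise maxima
    c1, c2 = Counter(a.lower().split()), Counter(b.lower().split())
    keys = c1.keys() | c2.keys()
    num = sum(abs(c1[k] - c2[k]) for k in keys)
    den = sum(max(c1[k], c2[k]) for k in keys)
    return 1 - num / den

print(qgram("to be or not to be", "to be is to do"))  # 1 - 5/8 -> 0.375
```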
| 38.88125 | 174 | 0.775438 | 927 | 6,221 | 4.926645 | 0.101402 | 0.047296 | 0.089337 | 0.080578 | 0.917013 | 0.917013 | 0.917013 | 0.911101 | 0.911101 | 0.899715 | 0 | 0.016614 | 0.12924 | 6,221 | 159 | 175 | 39.125786 | 0.826472 | 0.371323 | 0 | 0.596154 | 0 | 0 | 0.004218 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.134615 | 0 | 0.288462 | 0.019231 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9f8ceab77aea964bd7aed15d9632d0047b8e6d89 | 76 | py | Python | session25_cython/example.py | morales-gregorio/Python-Module-of-the-Week | 2c68e20be3e174be9b91c92ac872806dd982e7d2 | [
"MIT"
] | 15 | 2017-06-22T11:57:38.000Z | 2022-03-31T13:34:07.000Z | session25_cython/example.py | morales-gregorio/Python-Module-of-the-Week | 2c68e20be3e174be9b91c92ac872806dd982e7d2 | [
"MIT"
] | 3 | 2019-10-16T10:32:55.000Z | 2020-01-09T09:24:48.000Z | session25_cython/example.py | morales-gregorio/Python-Module-of-the-Week | 2c68e20be3e174be9b91c92ac872806dd982e7d2 | [
"MIT"
] | 6 | 2016-10-07T12:50:24.000Z | 2019-11-28T11:15:04.000Z | from lib import return_primes
import numpy as np
print(return_primes(100))
| 15.2 | 29 | 0.815789 | 13 | 76 | 4.615385 | 0.769231 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.131579 | 76 | 4 | 30 | 19 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9f96ed6e00fc4363a159be4ed549fd2edd8b65b6 | 30 | py | Python | devdoc/__init__.py | CaioTeixeira95/automacdoc | 37f488ed2cee53ac716136d2c00907ee70574a46 | [
"MIT"
] | 2 | 2020-05-12T22:36:39.000Z | 2021-10-11T18:03:48.000Z | devdoc/__init__.py | CaioTeixeira95/devdoc | 37f488ed2cee53ac716136d2c00907ee70574a46 | [
"MIT"
] | 1 | 2020-05-12T18:23:17.000Z | 2020-05-12T18:23:17.000Z | devdoc/__init__.py | CaioTeixeira95/devdoc | 37f488ed2cee53ac716136d2c00907ee70574a46 | [
"MIT"
] | null | null | null | from .devdoc import write_doc
| 15 | 29 | 0.833333 | 5 | 30 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9fbe76937f493012fd0b0fe12aa99ee1c5c1cfda | 20,148 | py | Python | lib/cogs/mod.py | TheProfessor-dev/Cyborg | 4f79d74c287c14f5b02c144660d5d0652f757ced | [
"MIT"
] | null | null | null | lib/cogs/mod.py | TheProfessor-dev/Cyborg | 4f79d74c287c14f5b02c144660d5d0652f757ced | [
"MIT"
] | null | null | null | lib/cogs/mod.py | TheProfessor-dev/Cyborg | 4f79d74c287c14f5b02c144660d5d0652f757ced | [
"MIT"
] | null | null | null | import discord
from discord.ext import commands
import typing
from datetime import datetime
class Mod(commands.Cog):
def __init__(self, bot):
self.bot = bot
@commands.command(aliases=['pmuser'], help="Dm the user which is mentioned.")
@commands.has_permissions(manage_messages=True)
async def dmuser(self, ctx, user: discord.User, *, msg):
try:
await user.send(f'**{ctx.message.author}** has a message for you, \n `{msg}`')
await ctx.send("Done!")
except:
await ctx.send(f'The user has his/her DMs turned off.')
@dmuser.error
async def dm_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
embed = discord.Embed(description=f'{ctx.author.mention}Please mention the user.',
colour=ctx.author.colour
)
await ctx.send(embed=embed, delete_after=10)
if isinstance(error, commands.MissingPermissions):
embed = discord.Embed(
                description=f"{ctx.message.author.mention} :x: You need `manage_messages` permission to use this command.",
colour=ctx.author.colour)
await ctx.send(embed=embed, delete_after=10)
@commands.command(aliases=['clearuser'], description="Clear messages of particular user in a channel")
@commands.guild_only()
@commands.has_permissions(manage_messages=True)
async def purgeuser(self, ctx, user: discord.Member,
num_messages: typing.Optional[int] = 100,
):
channel = ctx.message.channel
        if ctx.guild.me.top_role < user.top_role:
            return await ctx.send("I can't purge messages from a member whose top role is above mine.")
        if ctx.message.author.top_role < user.top_role:
            return await ctx.send("You can't purge messages from a member with a higher role than yours.")
def check(msg):
return msg.author.id == user.id
await ctx.message.delete()
await channel.purge(limit=num_messages, check=check, before=None)
embed = discord.Embed(title="User Messages cleared",
colour=ctx.author.colour)
embed.add_field(name="Member", value=user.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="No. of messages", value=f"{num_messages}", inline=False)
embed.add_field(name="Channel", value=channel.mention, inline=False)
embed.timestamp = datetime.utcnow()
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
return await ctx.send(embed=embed, delete_after=10)
log_channel = self.bot.get_channel(id=r)
await log_channel.send(embed=embed)
@purgeuser.error
async def purge_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
embed = discord.Embed(description=f'{ctx.author.mention}Please mention the user.',
colour=ctx.author.colour
)
await ctx.send(embed=embed, delete_after=10)
if isinstance(error, commands.MissingPermissions):
embed = discord.Embed(
                description=f"{ctx.message.author.mention} :x: You need `manage_messages` permission to use this command.",
colour=ctx.author.colour)
await ctx.send(embed=embed, delete_after=10)
@commands.command(name='ban', help='use to ban members.')
@commands.guild_only()
@commands.has_permissions(ban_members=True)
async def ban(self, ctx, member: discord.Member, *, reason=None):
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
await member.ban(reason=reason)
embed = discord.Embed(title="Member Banned",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
return await ctx.send(embed=embed, delete_after=10)
log_channel = self.bot.get_channel(id=r)
await member.ban(reason=reason)
embed = discord.Embed(title="Member Banned",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
await log_channel.send(embed=embed)
@ban.error
    async def ban_error(self, ctx, error):
        if isinstance(error, commands.MissingRequiredArgument):
            embed = discord.Embed(description=f'{ctx.author.mention}Please mention the member to be **Banned**.',
                                  colour=ctx.author.colour
                                  )
            await ctx.send(embed=embed, delete_after=10)
        if isinstance(error, commands.MissingPermissions):
            embed = discord.Embed(
                description=f"{ctx.message.author.mention} :x: You need `ban_members` permission to use this command.",
                colour=ctx.author.colour)
            await ctx.send(embed=embed, delete_after=10)
@commands.command(name='clear', help='Using this command you can clear messages in any channel.')
@commands.guild_only()
@commands.has_permissions(administrator=True)
async def clear(self, ctx, amount1: int):
channel = ctx.message.channel
await ctx.channel.purge(limit=amount1)
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
embed = discord.Embed(title="Messages Clear",
colour=ctx.author.colour)
embed.add_field(name="Number of messages clear", value=f"{amount1}", inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Channel", value=channel.mention, inline=False)
embed.timestamp = datetime.utcnow()
return await ctx.send(embed=embed, delete_after=10)
log_channel = self.bot.get_channel(id=r)
embed = discord.Embed(title="Messages Clear",
colour=ctx.author.colour)
embed.add_field(name="Number of messages clear", value=f"{amount1}", inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Channel", value=channel.mention, inline=False)
embed.timestamp = datetime.utcnow()
return await log_channel.send(embed=embed)
@clear.error
async def clear_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
embed = discord.Embed(
description=f'{ctx.author.mention}Please specify the number of messages to be **purge**',
colour=ctx.author.colour
)
await ctx.send(embed=embed, delete_after=10)
if isinstance(error, commands.MissingPermissions):
embed = discord.Embed(
                description=f"{ctx.author.mention} :x: You need `administrator` permission to use this command.",
colour=ctx.author.colour)
await ctx.send(embed=embed, delete_after=10)
@commands.command(name='kick', help='This command is used to kick someone from server. Only given to some members.')
@commands.guild_only()
@commands.has_permissions(kick_members=True)
async def kick(self, ctx, member: discord.Member, *, reason=None):
await member.kick(reason=reason)
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
embed = discord.Embed(title="Member Kicked",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
return await ctx.send(embed=embed, delete_after=10)
log_channel = self.bot.get_channel(id=r)
embed = discord.Embed(title="Member Kicked",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
await log_channel.send(embed=embed)
@kick.error
    async def kick_error(self, ctx, error):
        if isinstance(error, commands.MissingRequiredArgument):
            embed = discord.Embed(description=f'{ctx.author.mention}Please mention the member to be **Kicked**.',
                                  colour=ctx.author.colour
                                  )
            await ctx.send(embed=embed, delete_after=10)
        if isinstance(error, commands.MissingPermissions):
            embed = discord.Embed(
                description=f"{ctx.author.mention} :x: You need `kick_members` permission to use this command.",
                colour=ctx.author.colour)
            await ctx.send(embed=embed, delete_after=10)
@commands.command(name='unban', help='Unbans member')
@commands.guild_only()
@commands.has_permissions(ban_members=True)
async def unban(self, ctx, *, member):
        banned_users = await ctx.guild.bans()  # list of BanEntry (user, reason) records
        member_name, member_discriminator = member.split('#')  # split "name#discriminator"
        for ban_entry in banned_users:  # look for the matching user in the ban list
            user = ban_entry.user
            if (user.name, user.discriminator) == (member_name, member_discriminator):
                await ctx.guild.unban(user)  # lift the ban
                embed = discord.Embed(title="Member Unbanned",
                                      colour=ctx.author.colour)
                embed.add_field(name="Member", value=member, inline=False)
                embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
                embed.timestamp = datetime.utcnow()
                r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
                if not r:
                    return await ctx.send(embed=embed, delete_after=10)  # no log channel configured
                log_channel = self.bot.get_channel(id=r)
                await log_channel.send(embed=embed)  # record the unban in the log channel
@unban.error
    async def unban_error(self, ctx, error):
        if isinstance(error, commands.MissingRequiredArgument):
            await ctx.send(f'{ctx.author.mention}Please enter the name and discriminator of the member.', delete_after=10)
        if isinstance(error, commands.MissingPermissions):
            embed = discord.Embed(
                description=f"{ctx.author.mention} :x: You need `ban_members` permission to use this command.",
                colour=ctx.author.colour)
            await ctx.send(embed=embed, delete_after=10)
@commands.command()
@commands.has_permissions(manage_roles=True, manage_guild=True)
async def mute(self, ctx, member: discord.Member, *, reason=None):
role = discord.utils.get(ctx.guild.roles, name="Muted")
guild = ctx.guild
if role not in guild.roles:
            await ctx.send("The mute command isn't set up yet. Run **$setup** to create the Muted role.")
else:
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
                await member.edit(roles=[role])  # note: this replaces all of the member's roles with just Muted
embed = discord.Embed(title="Member Muted",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
await ctx.send(embed=embed, delete_after=10)
return
            await member.edit(roles=[role])  # note: this replaces all of the member's roles with just Muted
embed = discord.Embed(title="Member Muted",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Reason", value=reason, inline=False)
embed.timestamp = datetime.utcnow()
log_channel = self.bot.get_channel(id=r)
await log_channel.send(embed=embed)
@commands.command()
@commands.has_permissions(manage_roles=True, manage_guild=True)
async def unmute(self, ctx, member: discord.Member, role: discord.Role):
        await member.edit(roles=[role])  # note: this replaces all of the member's roles with just the given role
embed = discord.Embed(title="Member Unmuted",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.add_field(name="Role given", value=role.name, inline=False)
embed.timestamp = datetime.utcnow()
r = await self.bot.db.fetchval(f"SELECT channel_id FROM logchannel WHERE guild_id = {ctx.guild.id}")
if not r:
await ctx.send(embed=embed, delete_after=10)
return
log_channel = self.bot.get_channel(id=r)
await log_channel.send(embed=embed)
@mute.error
    async def mute_error(self, ctx, error):
        if isinstance(error, commands.MissingRequiredArgument):
            await ctx.send(f'{ctx.author.mention}Please mention the member to be muted.', delete_after=10)
        if isinstance(error, commands.MissingPermissions):
            embed = discord.Embed(
                description=f"{ctx.author.mention} :x: You need `manage_roles` & `manage_guild` permissions to use this command.",
                colour=ctx.author.colour)
            await ctx.send(embed=embed, delete_after=10)
@unmute.error
    async def unmute_error(self, ctx, error):
        if isinstance(error, commands.MissingPermissions):
            embed = discord.Embed(
                description=f"{ctx.author.mention} :x: You need `manage_roles` & `manage_guild` permissions to use this command.",
                colour=ctx.author.colour)
            await ctx.send(embed=embed, delete_after=10)
@commands.command()
@commands.has_permissions(administrator=True)
async def lock(self, ctx, role: discord.Role):
await ctx.channel.set_permissions(role, send_messages=False, read_messages=True)
await ctx.send("Channel locked.", delete_after=10)
@commands.command()
@commands.has_permissions(administrator=True)
async def unlock(self, ctx, role: discord.Role):
await ctx.channel.set_permissions(role, send_messages=True, read_messages=True)
await ctx.send("Channel unlocked.", delete_after=10)
@commands.command(aliases=['invite'])
async def createbotlink(self, ctx):
embed = discord.Embed(title="Invite me !",
url="https://discord.com/api/oauth2/authorize?client_id=802539167992119296&permissions=8&scope=bot",
colour=ctx.author.colour)
await ctx.send(embed=embed)
@commands.command(name="role", description='Gives role to user.')
async def give_role(self, ctx, member: discord.Member, *, role: discord.Role):
if role not in member.roles:
await member.add_roles(role)
            r = await self.bot.db.fetchval("SELECT channel_id FROM logchannel WHERE guild_id = $1", ctx.guild.id)
if not r:
embed = discord.Embed(title="Add Role",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.timestamp = datetime.utcnow()
return await ctx.send(embed=embed, delete_after=10)
embed = discord.Embed(title="Add Role",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.timestamp = datetime.utcnow()
            log_channel = self.bot.get_channel(id=r)
            await log_channel.send(embed=embed)
            return
await ctx.send(f"{member.mention} already has {role.name}")
@commands.command(name="rrole", description='Removes role to user.')
async def remove_role(self, ctx, member: discord.Member, *, role: discord.Role):
if role in member.roles:
await member.remove_roles(role)
            r = await self.bot.db.fetchval("SELECT channel_id FROM logchannel WHERE guild_id = $1", ctx.guild.id)
if not r:
embed = discord.Embed(title="Remove Role",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.timestamp = datetime.utcnow()
return await ctx.send(embed=embed, delete_after=10)
            embed = discord.Embed(title="Remove Role",
colour=ctx.author.colour)
embed.add_field(name="Member", value=member.display_name, inline=False)
embed.add_field(name="Actioned By", value=ctx.author.name, inline=False)
embed.timestamp = datetime.utcnow()
log_channel = self.bot.get_channel(id=r)
await log_channel.send(embed=embed)
return
await ctx.send(f"{member.mention} does not have {role.name}")
@lock.error
async def lock_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
await ctx.send("Role is the required argument which is missing.", delete_after=10)
@unlock.error
async def unlock_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
await ctx.send("Role is the required argument which is missing.", delete_after=10)
def setup(bot):
bot.add_cog(Mod(bot))
print("Mod cog is loaded")
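
The fetchval calls in this cog build SQL with f-strings; parameter binding is the safer pattern (with asyncpg that would be a `$1` placeholder). A stand-alone sketch using stdlib sqlite3, purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logchannel (guild_id INTEGER, channel_id INTEGER)")
conn.execute("INSERT INTO logchannel VALUES (?, ?)", (123, 456))

def fetch_log_channel(conn, guild_id):
    # The ? placeholder lets the driver handle quoting instead of an f-string
    row = conn.execute(
        "SELECT channel_id FROM logchannel WHERE guild_id = ?", (guild_id,)
    ).fetchone()
    return row[0] if row else None

print(fetch_log_channel(conn, 123))  # -> 456
print(fetch_log_channel(conn, 999))  # -> None
```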
| 50.878788 | 131 | 0.613758 | 2,423 | 20,148 | 5.018985 | 0.087082 | 0.041444 | 0.043829 | 0.057314 | 0.81416 | 0.795412 | 0.789984 | 0.761862 | 0.750596 | 0.738179 | 0 | 0.00585 | 0.278787 | 20,148 | 395 | 132 | 51.007595 | 0.831051 | 0.018414 | 0 | 0.653731 | 0 | 0.026866 | 0.164628 | 0.014971 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008955 | false | 0 | 0.01194 | 0.002985 | 0.068657 | 0.01194 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9fc1f35f332f8f503415c4d15077d6e61afaf3d4 | 17,334 | py | Python | userbot/plugins/feks.py | Doom098/userbot | 11f0225a75241ab9492b1c435414c77de287b8a6 | [
"MIT"
] | 12 | 2022-01-06T19:52:48.000Z | 2022-03-06T09:05:08.000Z | userbot/plugins/feks.py | Doom098/userbot | 11f0225a75241ab9492b1c435414c77de287b8a6 | [
"MIT"
] | null | null | null | userbot/plugins/feks.py | Doom098/userbot | 11f0225a75241ab9492b1c435414c77de287b8a6 | [
"MIT"
] | 64 | 2022-01-06T19:55:15.000Z | 2022-03-29T21:03:01.000Z | import asyncio
from telethon.tl.functions.users import GetFullUserRequest
from telethon.tl.types import ChannelParticipantsAdmins
from FIREX.utils import admin_cmd
from userbot.cmdhelp import CmdHelp
# The six fake-action commands differ only in their pattern and default
# action, so register them from one table via a handler factory.
def _make_action_handler(default_action):
    async def handler(event):
        if event.fwd_from:
            return
        await event.delete()
        input_str = event.pattern_match.group(1)
        action = input_str if input_str else default_action
        async with borg.action(event.chat_id, action):
            await asyncio.sleep(600)  # keep the fake action up for 10 minutes
    return handler

for _pattern, _action in [
    ("ftyping", "typing"),
    ("fcontact", "contact"),
    ("fgame", "game"),
    ("flocation", "location"),
    ("fvoice", "recording"),
    ("fvideo", "uploading"),
]:
    borg.on(admin_cmd(pattern=_pattern + " ?(.*)"))(_make_action_handler(_action))
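
The fake-action handlers above are identical except for their command pattern and default action; a handler factory (closure) captures the varying piece. A minimal, framework-free sketch of that closure pattern (names illustrative):

```python
def make_handler(default_action):
    # The returned function remembers default_action via its closure
    def handle(override=None):
        return override if override else default_action
    return handle

typing_handler = make_handler("typing")
game_handler = make_handler("game")

print(typing_handler())         # -> typing
print(typing_handler("dance"))  # -> dance
print(game_handler())           # -> game
```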
@borg.on(admin_cmd("fgben"))
async def gbun(event):
if event.fwd_from:
return
gbunVar = event.text
gbunVar = gbunVar[6:]
mentions = "`Warning!! User 𝙂𝘽𝘼𝙉𝙉𝙀𝘿 By Admin...\n`"
no_reason = "__Reason: Madarchod Saala"
await event.edit("** Nikal Lawde❗️⚜️☠️**")
    await asyncio.sleep(3.5)
chat = await event.get_input_chat()
async for x in borg.iter_participants(chat, filter=ChannelParticipantsAdmins):
mentions += f""
reply_message = None
if event.reply_to_msg_id:
reply_message = await event.get_reply_message()
replied_user = await event.client(GetFullUserRequest(reply_message.sender_id))
firstname = replied_user.user.first_name
usname = replied_user.user.username
idd = reply_message.sender_id
# make meself invulnerable cuz why not xD
if idd == 2082798662:
await reply_message.reply(
                "`Wait a second, this is my master!`\n**How dare you threaten to ban my master!**\n\n__Your account has been hacked! Pay 99$ to my master__ [Eviral](https://t.me/Eviral) __to release your account__😏"
)
else:
jnl = (
"`Warning!! `"
"[{}](tg://user?id={})"
"` 𝙂𝘽𝘼𝙉𝙉𝙀𝘿 By Admin...\n\n`"
"**Person's Name: ** __{}__\n"
"**ID : ** `{}`\n"
).format(firstname, idd, firstname, idd)
            if usname is None:
                jnl += "**Victim's username: ** `Doesn't own a username!`\n"
            else:
                jnl += "**Victim's username** : @{}\n".format(usname)
if len(gbunVar) > 0:
gbunm = "`{}`".format(gbunVar)
gbunr = "**Reason: **" + gbunm
jnl += gbunr
else:
jnl += no_reason
await reply_message.reply(jnl)
else:
mention = "`Warning!! User 𝙂𝘽𝘼𝙉𝙉𝙀𝘿 By Admin...\nReason: Not Given `"
await event.reply(mention)
await event.delete()
@bot.on(admin_cmd(pattern="fgban ?(.*)"))
async def _(event):
if not event.text[0].isalpha() and event.text[0] not in ("/", "#", "@", "!"):
        await event.edit("Preparing to gban this user....")
await asyncio.sleep(2)
await event.edit("Gbanning user.....")
await asyncio.sleep(2)
        # Fake progress counter: 1, then 5-190 in steps of 5, then 200 and 204
        for count in [1, *range(5, 195, 5), 200]:
            await event.edit(f"Gbanning user... \n {count} chats")
            await asyncio.sleep(2)
        await event.edit("Gbanning user... \n 204 chats")
await asyncio.sleep(1.5)
await event.edit(
            "Gbanned this user successfully in😏: 204 chats.\nBlocked and added to gban watch!"
)
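
The fgban and fungban handlers replay the same fixed countdown and differ only in their prefix text; the sequence could be factored into one coroutine. A sketch with a stub event (the helper name and stub are hypothetical):

```python
import asyncio

# Hypothetical helper: replays the fixed fake progress sequence.
async def fake_progress(event, prefix, delay=2):
    counts = [1, *range(5, 195, 5), 200, 204]
    for count in counts:
        await event.edit(f"{prefix}... \n {count} chats")
        await asyncio.sleep(delay)
    return counts

class StubEvent:
    def __init__(self):
        self.edits = []
    async def edit(self, text):
        self.edits.append(text)

stub = StubEvent()
counts = asyncio.run(fake_progress(stub, "Gbanning user", delay=0))
print(stub.edits[-1])  # -> Gbanning user... \n 204 chats
```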
@bot.on(admin_cmd(pattern="fungban ?(.*)"))
async def _(event):
if not event.text[0].isalpha() and event.text[0] not in ("/", "#", "@", "!"):
await event.edit(
            "Preparing to ungban this user, please wait for a while....."
)
await asyncio.sleep(2)
await event.edit("UnGbanning user.....")
await asyncio.sleep(2)
        # Fake progress counter: 1, then 5-190 in steps of 5, then 200 and 204
        for count in [1, *range(5, 195, 5), 200]:
            await event.edit(f"UnGbanning user... \n {count} chats")
            await asyncio.sleep(2)
        await event.edit("UnGbanning user... \n 204 chats")
await asyncio.sleep(1.5)
await event.edit(
            "UnGbanned this user successfully in 204 chats.\nUnBlocked and removed from gban watch"
)
@borg.on(admin_cmd("fmute"))
async def gmute(event):
if event.fwd_from:
return
gbunVar = event.text
gbunVar = gbunVar[6:]
mentions = "**Warning!! User Gmuted By Admin...\n**"
    no_reason = "__Reason: now stay globally muted"
await event.edit("** Gmutting...**")
    await asyncio.sleep(2)
chat = await event.get_input_chat()
async for x in borg.iter_participants(chat, filter=ChannelParticipantsAdmins):
mentions += f""
reply_message = None
if event.reply_to_msg_id:
reply_message = await event.get_reply_message()
replied_user = await event.client(GetFullUserRequest(reply_message.sender_id))
firstname = replied_user.user.first_name
usname = replied_user.user.username
idd = reply_message.sender_id
# make meself invulnerable cuz why not xD
if idd == 2082798662:
await reply_message.reply(
                "`Wait a second, this is my master!`\n**How dare you threaten to mute my master!**\n\n__Your account has been hacked! Pay 99$ to my master__ [Eviral](https://t.me/Eviral) __to release your account__😏"
)
else:
jnl = (
"`Warning!! `"
"[{}](tg://user?id={})"
"` Gmutted By Admin...\n\n`"
"**Name: ** __{}__\n"
"**ID : ** `{}`\n"
).format(firstname, idd, firstname, idd)
            if usname is None:
                jnl += "**Victim's username: ** `Doesn't have a username!`\n"
            else:
                jnl += "**Victim's username** : @{}\n".format(usname)
if len(gbunVar) > 0:
gbunm = "`{}`".format(gbunVar)
gbunr = "**Reason: **" + gbunm
jnl += gbunr
else:
jnl += no_reason
await reply_message.reply(jnl)
else:
mention = "**Warning!! User Gmutted By Admin...\nReason: Not Given **"
await event.reply(mention)
await event.delete()
@borg.on(admin_cmd("funmute"))
async def ungmute(event):
if event.fwd_from:
return
gbunVar = event.text
gbunVar = gbunVar[6:]
mentions = "**Warning!! User Unmuted By Admin...\n**"
    no_reason = "__Reason: forget the past"
await event.edit("**Ungmutting...**")
    await asyncio.sleep(2)
chat = await event.get_input_chat()
async for x in borg.iter_participants(chat, filter=ChannelParticipantsAdmins):
mentions += f""
reply_message = None
if event.reply_to_msg_id:
reply_message = await event.get_reply_message()
replied_user = await event.client(GetFullUserRequest(reply_message.sender_id))
firstname = replied_user.user.first_name
usname = replied_user.user.username
idd = reply_message.sender_id
# make meself invulnerable cuz why not xD
if idd == 2082798662:
await reply_message.reply(
"Wait a second. Maine Gmute kab kiya Owner ko toh main unmute karu!!!"
)
else:
jnl = (
"`Warning!! `"
"[{}](tg://user?id={})"
"` Ungmutted By Admin...\n\n`"
"**Name: ** __{}__\n"
"**ID : ** `{}`\n"
).format(firstname, idd, firstname, idd)
            if usname is None:
                jnl += "**Victim's username: ** `Doesn't have a username!`\n"
            else:
                jnl += "**Victim's username** : @{}\n".format(usname)
if len(gbunVar) > 0:
gbunm = "`{}`".format(gbunVar)
gbunr = "**Reason: **" + gbunm
jnl += gbunr
else:
jnl += no_reason
await reply_message.reply(jnl)
else:
        mention = "**Warning!! User Ungmutted By Admin...\nReason: Not Given **"
await event.reply(mention)
await event.delete()
CmdHelp("feks").add_command(
    "fgban", None, "A kind of fake gban; try it yourself"
).add_command(
    "fgben", None, "A kind of fake fed gban; try it yourself"
).add_command(
    "fungban", None, "A kind of fake ungban; try it yourself"
).add_command(
    "fmute", None, "A kind of fake gmute; try it yourself"
).add_command(
    "funmute", None, "A kind of fake ungmute; try it yourself"
).add_command(
    "ftyping",
    None,
    "A kind of fake typing action; try it yourself. The same works for fcontact, fgame, flocation, fvoice and fvideo",
).add_command(
    "fcontact", None, "A kind of fake contact action; try it yourself"
).add()
| 38.952809 | 223 | 0.592362 | 2,229 | 17,334 | 4.536115 | 0.120682 | 0.11077 | 0.154683 | 0.14954 | 0.909603 | 0.869152 | 0.860449 | 0.860449 | 0.860449 | 0.851548 | 0 | 0.030269 | 0.275759 | 17,334 | 444 | 224 | 39.040541 | 0.774415 | 0.013788 | 0 | 0.649758 | 0 | 0.007246 | 0.284519 | 0.007785 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014493 | 0 | 0.036232 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4ca7ca885885a0247952a4f3c49bbcb6d80c0d21 | 215 | py | Python | 1-python-basico (Logica de programacao)/aula10/aula10.py | Leodf/projetos-python | 64e6262e6535d92624ad50148634d881608a7523 | [
"MIT"
] | null | null | null | 1-python-basico (Logica de programacao)/aula10/aula10.py | Leodf/projetos-python | 64e6262e6535d92624ad50148634d881608a7523 | [
"MIT"
] | null | null | null | 1-python-basico (Logica de programacao)/aula10/aula10.py | Leodf/projetos-python | 64e6262e6535d92624ad50148634d881608a7523 | [
"MIT"
] | null | null | null | """
IF, ELIF and ELSE conditionals
"""
if False:
    print("True.")
elif False:
    print('Now it is true.')
elif False:
    print('One more true one')
else:
    print('My expression is not true')
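
All three conditions above are False, so only the else branch runs. Wrapped in a function, the chain is tested top to bottom and the first truthy branch short-circuits the rest:

```python
def classify(n):
    # Branches are checked in order; the first True one wins.
    if n > 0:
        return "positive"
    elif n < 0:
        return "negative"
    else:
        return "zero"

print(classify(3))   # -> positive
print(classify(-1))  # -> negative
print(classify(0))   # -> zero
```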
| 16.538462 | 47 | 0.655814 | 29 | 215 | 4.862069 | 0.586207 | 0.212766 | 0.269504 | 0.340426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209302 | 215 | 13 | 48 | 16.538462 | 0.829412 | 0.116279 | 0 | 0.25 | 0 | 0 | 0.453552 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
4cac091e4c29e88a070b6667a46a545fc4d607f1 | 132 | py | Python | registration/tests/__init__.py | husarion/django-registration | 9d4099011c64d0e9d1a502a7b230fa2547d7f771 | [
"BSD-3-Clause"
] | null | null | null | registration/tests/__init__.py | husarion/django-registration | 9d4099011c64d0e9d1a502a7b230fa2547d7f771 | [
"BSD-3-Clause"
] | null | null | null | registration/tests/__init__.py | husarion/django-registration | 9d4099011c64d0e9d1a502a7b230fa2547d7f771 | [
"BSD-3-Clause"
] | null | null | null | from ..tests.default_backend import *
from ..tests.forms import *
from ..tests.models import *
from ..tests.simple_backend import *
| 26.4 | 37 | 0.757576 | 18 | 132 | 5.444444 | 0.444444 | 0.367347 | 0.459184 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 132 | 4 | 38 | 33 | 0.844828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
4cc591c9e5395aa79ada712426ea4ed1eee35e96 | 18,075 | py | Python | src/data/nli.py | matejklemen/paraphrase-nli | d31ecba7e54eea4c3b098a2de7bc35f538e4e6dc | [
"MIT"
] | null | null | null | src/data/nli.py | matejklemen/paraphrase-nli | d31ecba7e54eea4c3b098a2de7bc35f538e4e6dc | [
"MIT"
] | null | null | null | src/data/nli.py | matejklemen/paraphrase-nli | d31ecba7e54eea4c3b098a2de7bc35f538e4e6dc | [
"MIT"
] | null | null | null | import itertools
from copy import deepcopy
from typing import Optional, List, Union, Iterable
from warnings import warn
import datasets
import torch
import pandas as pd
from src.data import TransformersSeqPairDataset
class SNLITransformersDataset(TransformersSeqPairDataset):
def __init__(self, split: Union[str, Iterable[str]], tokenizer, max_length: Optional[int] = None, return_tensors: Optional[str] = None,
custom_label_names: Optional[List[str]] = None, binarize: Optional[bool] = False):
_split = (split,) if isinstance(split, str) else split
datasets_list = [datasets.load_dataset("snli", split=curr_split) for curr_split in _split]
all_hypothesis = list(itertools.chain(*[curr_dataset["hypothesis"] for curr_dataset in datasets_list]))
all_premise = list(itertools.chain(*[curr_dataset["premise"] for curr_dataset in datasets_list]))
all_label = list(itertools.chain(*[curr_dataset["label"] for curr_dataset in datasets_list]))
if custom_label_names is None:
self.label_names = datasets_list[0].features["label"].names
else:
self.label_names = custom_label_names
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
# Examples that have a valid label (!= -1)
valid_indices = [_i for _i in range(len(all_label)) if all_label[_i] != -1]
self.str_premise = [all_premise[_i] for _i in valid_indices]
self.str_hypothesis = [all_hypothesis[_i] for _i in valid_indices]
valid_label = [all_label[_i] for _i in valid_indices]
optional_kwargs = {}
if return_tensors is not None:
valid_label = torch.tensor(valid_label)
optional_kwargs["return_tensors"] = "pt"
if max_length is not None:
optional_kwargs["max_length"] = max_length
optional_kwargs["padding"] = "max_length"
optional_kwargs["truncation"] = "longest_first"
encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
encoded["labels"] = valid_label
if binarize:
encoded["labels"] = (encoded["labels"] == self.label2idx["entailment"]).long()
self.label_names = ["not_entailment", "entailment"]
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
super().__init__(**encoded)
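
Each dataset class in this module repeats the same label bookkeeping: build `label2idx`/`idx2label` from the label names, then drop every example whose label is -1. In isolation the pattern looks like:

```python
label_names = ["entailment", "neutral", "contradiction"]
label2idx = {label: i for i, label in enumerate(label_names)}
idx2label = {i: label for label, i in label2idx.items()}

all_label = [0, -1, 2, 1, -1]
all_premise = ["p0", "p1", "p2", "p3", "p4"]

# Keep only indices whose label is valid (!= -1), mirroring valid_indices above
valid_indices = [i for i in range(len(all_label)) if all_label[i] != -1]
premises = [all_premise[i] for i in valid_indices]
labels = [all_label[i] for i in valid_indices]
print(premises, labels)  # -> ['p0', 'p2', 'p3'] [0, 2, 1]
```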
class MultiNLITransformersDataset(TransformersSeqPairDataset):
def __init__(self, split: Union[str, Iterable[str]], tokenizer,
max_length: Optional[int] = None, return_tensors: Optional[str] = None,
custom_label_names: Optional[List[str]] = None, binarize: Optional[bool] = False):
_split = (split,) if isinstance(split, str) else split
datasets_list = [datasets.load_dataset("multi_nli", split=curr_split) for curr_split in _split]
all_pair_ids = list(itertools.chain(*[curr_dataset["pairID"] for curr_dataset in datasets_list]))
all_genres = list(itertools.chain(*[curr_dataset["genre"] for curr_dataset in datasets_list]))
all_hypothesis = list(itertools.chain(*[curr_dataset["hypothesis"] for curr_dataset in datasets_list]))
all_premise = list(itertools.chain(*[curr_dataset["premise"] for curr_dataset in datasets_list]))
all_label = list(itertools.chain(*[curr_dataset["label"] for curr_dataset in datasets_list]))
if custom_label_names is None:
self.label_names = datasets_list[0].features["label"].names
else:
self.label_names = custom_label_names
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
# Examples that have a valid label (!= -1)
valid_indices = [_i for _i in range(len(all_label)) if all_label[_i] != -1]
self.pair_ids = [all_pair_ids[_i] for _i in valid_indices]
self.str_premise = [all_premise[_i] for _i in valid_indices]
self.str_hypothesis = [all_hypothesis[_i] for _i in valid_indices]
self.genre = [all_genres[_i] for _i in valid_indices]
valid_label = [all_label[_i] for _i in valid_indices]
optional_kwargs = {}
if return_tensors is not None:
valid_label = torch.tensor(valid_label)
optional_kwargs["return_tensors"] = "pt"
if max_length is not None:
optional_kwargs["max_length"] = max_length
optional_kwargs["padding"] = "max_length"
optional_kwargs["truncation"] = "longest_first"
encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
encoded["labels"] = valid_label
if binarize:
encoded["labels"] = (encoded["labels"] == self.label2idx["entailment"]).long()
self.label_names = ["not_entailment", "entailment"]
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
super().__init__(**encoded)
class XNLITransformersDataset(TransformersSeqPairDataset):
def __init__(self, lang: Union[str, Iterable[str]], split: Union[str, Iterable[str]], tokenizer,
max_length: Optional[int] = None, return_tensors: Optional[str] = None,
custom_label_names: Optional[List[str]] = None, binarize: Optional[bool] = False):
_lang = (lang,) if isinstance(lang, str) else lang
_split = (split,) if isinstance(split, str) else split
self.tokenizer = tokenizer
datasets_list = [datasets.load_dataset("xnli", curr_lang, split=curr_split)
for curr_lang, curr_split in zip(_lang, _split)]
all_hypothesis = list(itertools.chain(*[curr_dataset["hypothesis"] for curr_dataset in datasets_list]))
all_premise = list(itertools.chain(*[curr_dataset["premise"] for curr_dataset in datasets_list]))
all_label = list(itertools.chain(*[curr_dataset["label"] for curr_dataset in datasets_list]))
if custom_label_names is None:
self.label_names = datasets_list[0].features["label"].names
else:
self.label_names = custom_label_names
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
# Examples that have a valid label (!= -1)
valid_indices = [_i for _i in range(len(all_label)) if all_label[_i] != -1]
self.str_premise = [all_premise[_i] for _i in valid_indices]
self.str_hypothesis = [all_hypothesis[_i] for _i in valid_indices]
valid_label = [all_label[_i] for _i in valid_indices]
optional_kwargs = {}
if return_tensors is not None:
valid_label = torch.tensor(valid_label)
optional_kwargs["return_tensors"] = "pt"
self.max_length = None
if max_length is not None:
self.max_length = max_length
optional_kwargs["max_length"] = max_length
optional_kwargs["padding"] = "max_length"
optional_kwargs["truncation"] = "longest_first"
encoded = self.tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
encoded["labels"] = valid_label
if binarize:
encoded["labels"] = (encoded["labels"] == self.label2idx["entailment"]).long()
self.label_names = ["not_entailment", "entailment"]
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
super().__init__(**encoded)
def override_data(self, new_seq1: List[str], new_seq2: List[str], new_labels: List[int],
new_label_set: List[str] = None):
assert self.max_length is not None
assert len(new_seq1) == len(new_seq2) == len(new_labels)
if new_label_set is not None:
self.label_names = new_label_set
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
self.str_premise = new_seq1
self.str_hypothesis = new_seq2
new_encoded = self.tokenizer.batch_encode_plus(list(zip(new_seq1, new_seq2)), max_length=self.max_length,
padding="max_length", truncation="longest_first",
return_tensors="pt")
new_encoded["labels"] = torch.tensor(new_labels)
assert all(attr_name in new_encoded for attr_name in self.valid_attrs)
self.num_examples = len(new_seq1)
for attr, values in new_encoded.items():
setattr(self, attr, values)
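
`override_data` above swaps the encoded tensors in place with `setattr` after checking `valid_attrs`; stripped of the tokenizer and torch, the mechanism is just:

```python
class Holder:
    valid_attrs = ["input_ids", "labels"]

    def override(self, new_encoded):
        # Refuse partial updates: every expected attribute must be supplied
        assert all(name in new_encoded for name in self.valid_attrs)
        for attr, values in new_encoded.items():
            setattr(self, attr, values)

h = Holder()
h.override({"input_ids": [[1, 2], [3, 4]], "labels": [0, 1]})
print(h.labels)  # -> [0, 1]
```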
class RTETransformersDataset(TransformersSeqPairDataset):
def __init__(self, path: Union[str, Iterable[str]], tokenizer,
max_length: Optional[int] = None, return_tensors: Optional[str] = None):
_path = (path,) if isinstance(path, str) else path
df = pd.concat([pd.read_csv(curr_path) for curr_path in _path]).reset_index(drop=True)
self.label_names = ["not_entailment", "entailment"]
self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}
self.str_premise = df["premise"].tolist()
self.str_hypothesis = df["hypothesis"].tolist()
if "label" in df.columns:
valid_label = list(map(lambda lbl: self.label2idx[lbl], df["label"].tolist()))
else:
            warn("No labels present in file - setting all labels to 0, so you should ignore metrics based on these")
valid_label = [0] * len(self.str_premise)
optional_kwargs = {}
if return_tensors is not None:
valid_label = torch.tensor(valid_label)
optional_kwargs["return_tensors"] = "pt"
if max_length is not None:
optional_kwargs["max_length"] = max_length
optional_kwargs["padding"] = "max_length"
optional_kwargs["truncation"] = "longest_first"
encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
encoded["labels"] = valid_label
super().__init__(**encoded)

class SciTailTransformersDataset(TransformersSeqPairDataset):
    def __init__(self, split: Union[str, Iterable[str]], tokenizer, max_length: Optional[int] = None, return_tensors: Optional[str] = None,
                 custom_label_names: Optional[List[str]] = None, binarize: Optional[bool] = False):
        _split = (split,) if isinstance(split, str) else split

        datasets_list = [datasets.load_dataset("scitail", "tsv_format", split=curr_split) for curr_split in _split]
        all_hypothesis = list(itertools.chain(*[curr_dataset["hypothesis"] for curr_dataset in datasets_list]))
        all_premise = list(itertools.chain(*[curr_dataset["premise"] for curr_dataset in datasets_list]))
        all_label = list(itertools.chain(*[curr_dataset["label"] for curr_dataset in datasets_list]))

        if custom_label_names is None:
            self.label_names = ["neutral", "entails"]
        else:
            # SciTail is two-class NLI
            assert len(custom_label_names) == 2
            self.label_names = custom_label_names

        self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
        self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}

        all_label = [self.label2idx.get(_lbl, -1) for _lbl in all_label]

        # Examples that have a valid label (!= -1)
        valid_indices = [_i for _i in range(len(all_label)) if all_label[_i] != -1]
        self.str_premise = [all_premise[_i] for _i in valid_indices]
        self.str_hypothesis = [all_hypothesis[_i] for _i in valid_indices]
        valid_label = [all_label[_i] for _i in valid_indices]

        optional_kwargs = {}
        if return_tensors is not None:
            valid_label = torch.tensor(valid_label)
            optional_kwargs["return_tensors"] = "pt"

        if max_length is not None:
            optional_kwargs["max_length"] = max_length
            optional_kwargs["padding"] = "max_length"
            optional_kwargs["truncation"] = "longest_first"

        encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
        encoded["labels"] = valid_label

        if binarize:
            # Leave the argument in for consistency though
            warn("'binarize' is an unused argument in SciTail as it is binary by default")

        super().__init__(**encoded)

class OCNLITransformersDataset(TransformersSeqPairDataset):
    def __init__(self, path: Union[str, Iterable[str]], tokenizer,
                 max_length: Optional[int] = None, return_tensors: Optional[str] = None,
                 binarize: Optional[bool] = False):
        _path = (path,) if isinstance(path, str) else path

        df = pd.concat([pd.read_json(curr_path, lines=True) for curr_path in _path]).reset_index(drop=True)

        self.label_names = ["entailment", "neutral", "contradiction"]
        self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
        self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}

        self.str_premise = df["sentence1"].tolist()
        self.str_hypothesis = df["sentence2"].tolist()
        self.genre = df["genre"].tolist()

        if "label" in df.columns:
            valid_label = list(map(lambda lbl: self.label2idx.get(lbl, -1), df["label"].tolist()))
        else:
            warn("No labels present in file - setting all labels to 0, so you should ignore metrics based on these")
            valid_label = [0] * len(self.str_premise)

        # Examples that have a valid label (!= -1)
        valid_indices = [_i for _i in range(len(valid_label)) if valid_label[_i] != -1]
        self.str_premise = [self.str_premise[_i] for _i in valid_indices]
        self.str_hypothesis = [self.str_hypothesis[_i] for _i in valid_indices]
        self.genre = [self.genre[_i] for _i in valid_indices]
        valid_label = [valid_label[_i] for _i in valid_indices]

        optional_kwargs = {}
        if return_tensors is not None:
            valid_label = torch.tensor(valid_label)
            optional_kwargs["return_tensors"] = "pt"

        if max_length is not None:
            optional_kwargs["max_length"] = max_length
            optional_kwargs["padding"] = "max_length"
            optional_kwargs["truncation"] = "longest_first"

        encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
        encoded["labels"] = valid_label

        if binarize:
            encoded["labels"] = (encoded["labels"] == self.label2idx["entailment"]).long()
            self.label_names = ["not_entailment", "entailment"]
            self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
            self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}

        super().__init__(**encoded)

class AssinTransformersDataset(TransformersSeqPairDataset):
    def __init__(self, split: Union[str, Iterable[str]], tokenizer, max_length: Optional[int] = None,
                 return_tensors: Optional[str] = None):
        _split = (split,) if isinstance(split, str) else split

        datasets_list = [datasets.load_dataset("assin", "full", split=curr_split) for curr_split in _split]
        all_hypothesis = list(itertools.chain(*[curr_dataset["hypothesis"] for curr_dataset in datasets_list]))
        all_premise = list(itertools.chain(*[curr_dataset["premise"] for curr_dataset in datasets_list]))
        # NOTE: intentional typo as it is like this in hf/datasets repository
        all_label = list(itertools.chain(*[curr_dataset["entailment_judgment"] for curr_dataset in datasets_list]))

        self.orig_label_names = ["neutral", "entailment", "paraphrase"]
        self.orig_label = deepcopy(all_label)
        all_label = list(map(lambda _lbl: int(_lbl >= 1), all_label))  # group {entailment, paraphrase} into entailment

        self.label_names = ["neutral", "entailment"]
        self.label2idx = {curr_label: i for i, curr_label in enumerate(self.label_names)}
        self.idx2label = {i: curr_label for curr_label, i in self.label2idx.items()}

        self.str_premise = all_premise
        self.str_hypothesis = all_hypothesis
        valid_label = all_label

        optional_kwargs = {}
        if return_tensors is not None:
            valid_label = torch.tensor(valid_label)
            optional_kwargs["return_tensors"] = "pt"

        if max_length is not None:
            optional_kwargs["max_length"] = max_length
            optional_kwargs["padding"] = "max_length"
            optional_kwargs["truncation"] = "longest_first"

        encoded = tokenizer.batch_encode_plus(list(zip(self.str_premise, self.str_hypothesis)), **optional_kwargs)
        encoded["labels"] = valid_label

        super().__init__(**encoded)

if __name__ == "__main__":
    from transformers import BertTokenizerFast
    tokenizer = BertTokenizerFast.from_pretrained("neuralmind/bert-base-portuguese-cased")
    # dataset = SciTailTransformersDataset("train", tokenizer)
    # hfl/chinese-roberta-wwm-ext
    # hfl/chinese-roberta-wwm-ext-large
    dataset = AssinTransformersDataset("train", tokenizer)
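Every dataset class above repeats the same valid-label filtering idiom: raw labels are mapped through `label2idx` with `-1` as the unknown marker, then all parallel lists are restricted to indices whose label resolved. A minimal standalone sketch of that idiom (the function name `filter_valid_examples` is illustrative, not part of the module):

```python
def filter_valid_examples(premises, hypotheses, labels, label2idx):
    # Map raw labels to class indices; unknown labels become -1
    mapped = [label2idx.get(lbl, -1) for lbl in labels]
    # Keep only positions whose label resolved to a known class
    valid = [i for i in range(len(mapped)) if mapped[i] != -1]
    return ([premises[i] for i in valid],
            [hypotheses[i] for i in valid],
            [mapped[i] for i in valid])

if __name__ == "__main__":
    p, h, y = filter_valid_examples(
        ["p1", "p2", "p3"], ["h1", "h2", "h3"],
        ["neutral", "bogus", "entails"],
        {"neutral": 0, "entails": 1},
    )
    print(p, h, y)  # ['p1', 'p3'] ['h1', 'h3'] [0, 1]
```

Keeping the three lists index-aligned is what makes the later `zip(self.str_premise, self.str_hypothesis)` call line up with `labels`.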
# Source: core/objs/rh_folha_salario.py (repo: aanacleto/erp-, license: MIT)
] | 2 | 2017-12-04T14:59:22.000Z | 2018-12-06T18:50:29.000Z | # !/usr/bin/env python3
# -*- encoding: utf-8 -*-
"""
ERP+
"""
__author__ = 'António Anacleto'
__credits__ = []
__version__ = "1.0"
__maintainer__ = "António Anacleto"
__status__ = "Development"
__model_name__ = 'rh_folha_salario.RHFolhaSalario'

import auth, base_models
from orm import *
from form import *

try:
    from my_rh_funcionario_subsidio import RHFuncionarioSubsidio
except ImportError:
    from rh_funcionario_subsidio import RHFuncionarioSubsidio
try:
    from my_rh_outros_rendimentos_funcionario import RHOutrosRendimentosFuncionario
except ImportError:
    from rh_outros_rendimentos_funcionario import RHOutrosRendimentosFuncionario
try:
    from my_rh_gasto_suportado import RHGastoSuportado
except ImportError:
    from rh_gasto_suportado import RHGastoSuportado
try:
    from my_rh_recibo_salario import RHReciboSalario
except ImportError:
    from rh_recibo_salario import RHReciboSalario
try:
    from my_rh_linha_recibo_salario import RHLinhaReciboSalario
except ImportError:
    from rh_linha_recibo_salario import RHLinhaReciboSalario
try:
    from my_rh_rendimento_funcionario import RHRendimentoFuncionario
except ImportError:
    from rh_rendimento_funcionario import RHRendimentoFuncionario
try:
    from my_rh_desconto_funcionario import RHDescontoFuncionario
except ImportError:
    from rh_desconto_funcionario import RHDescontoFuncionario
try:
    from my_terceiro import Terceiro
except ImportError:
    from terceiro import Terceiro
try:
    from my_rh_contrato_funcionario import RHContratoFuncionario
except ImportError:
    from rh_contrato_funcionario import RHContratoFuncionario
try:
    from my_rh_outros_descontos_funcionario import RHOutrosDescontosFuncionario
except ImportError:
    from rh_outros_descontos_funcionario import RHOutrosDescontosFuncionario
try:
    from my_rh_tipo_rendimento import RHTipoRendimento
except ImportError:
    from rh_tipo_rendimento import RHTipoRendimento
try:
    from my_rh_tipo_desconto import RHTipoDesconto
except ImportError:
    from rh_tipo_desconto import RHTipoDesconto

from rh_taxa_retencao import RHTaxaRetencao

class RHFolhaSalario(Model, View):
    def __init__(self, **kargs):
        Model.__init__(self, **kargs)
        self.__name__ = 'rh_folha_salario'
        self.__title__ = 'Folhas de Salários'
        self.__model_name__ = __model_name__
        self.__list_edit_mode__ = 'edit'
        self.__order_by__ = 'rh_folha_salario.periodo'

        self.__workflow__ = (
            'estado', {
                'Rascunho': ['Activar', 'Cancelar'],
                'Activo': ['Gerar Recibos', 'Cancelar'],
                'Gerado': ['Cancelar Recibos', 'Cancelar', 'Confirmar Recibos'],
                'Recibos Cancelado': ['Gerar Recibos'],
                'Recibos Confirmado': ['Efectuar Pagamento']
            }
        )
        self.__workflow_auth__ = {
            'Gerar Recibos': ['Contabilista'],
            'Cancelar Recibos': ['Contabilista'],
            'Confirmar Recibos': ['Contabilista'],
            'Efectuar Pagamento': ['Contabilista'],
            'Activar': ['Contabilista'],
            'Encerrar': ['Contabilista'],
            'Rascunho': ['Gestor'],
            'Cancelar': ['Gestor'],
            'full_access': ['Gestor']
        }
        self.__auth__ = {
            'read': ['All'],
            'write': ['Contabilista'],
            'create': ['Contabilista'],
            'delete': ['Contabilista'],
            'full_access': ['Gestor']
        }
        self.__no_edit__ = [('estado', ['Pago', 'Cancelado'])]
        self.__get_options__ = ['periodo']
        self.__tabs__ = [
            ('Salários', ['ano', 'periodo', 'mes', 'para_todos', 'estado', 'rh_funcionario_subsidio']),
            ('Outros Rendimentos e Descontos', ['rh_outros_rendimentos_funcionario', 'rh_outros_descontos_funcionario']),
            ('Recibos', ['rh_recibo_salario'])
        ]

        self.ano = combo_field(view_order=1, size=45, name='Ano', args='required', default=datetime.date.today().year, options='model.getAno()')
        self.periodo = combo_field(view_order=2, size=70, name='Periodo', args='required', default=datetime.date.today().strftime("%m"), options='model.getPeriodo()')
        self.para_todos = boolean_field(view_order=4, size=50, name='Todos Funcionários', default=True)
        self.estado = info_field(view_order=5, name='Estado', size=45, default='Rascunho', args='readonly')
        self.rh_funcionario_subsidio = list_field(view_order=6, name='Funcionários á Processar', condition="rh_folha_salario='{id}'", model_name='rh_funcionario_subsidio.RHFuncionarioSubsidio', list_edit_mode='popup', onlist=False)
        self.rh_outros_rendimentos_funcionario = list_field(view_order=7, condition="rh_folha_salario='{id}'", model_name='rh_outros_rendimentos_funcionario.RHOutrosRendimentosFuncionario', list_edit_mode='inline', onlist=False, name='Outros Rendimentos (Apenas Neste Periodo)')
        self.rh_outros_descontos_funcionario = list_field(view_order=8, condition="rh_folha_salario='{id}'", model_name='rh_outros_descontos_funcionario.RHOutrosDescontosFuncionario', list_edit_mode='inline', onlist=False, name='Outros Descontos (Apenas Neste Periodo)')
        self.rh_recibo_salario = list_field(view_order=9, name='Recibos Salários', condition="rh_folha_salario='{id}' AND estado!='Cancelado'", model_name='rh_recibo_salario.RHReciboSalario', list_edit_mode='popup', onlist=False)
    def getAno(self):
        options = []
        for ano in range(datetime.date.today().year - 2, datetime.date.today().year + 2):
            options.append((str(ano), str(ano)))
        return options

    def getPeriodo(self):
        return [('01', 'Janeiro'), ('02', 'Fevereiro'), ('03', 'Março'), ('04', 'Abril'), ('05', 'Maio'), ('06', 'Junho'), ('07', 'Julho'), ('08', 'Agosto'), ('09', 'Setembro'), ('10', 'Outubro'), ('11', 'Novembro'), ('12', 'Dezembro'), ('13', 'Subsídio Natal'), ('14', 'Prémio Produtividade'), ('15', 'Subsídio Féria')]
    def Gerar_Recibos(self, key, window_id):
        """Generates the salary and bonus payslips for a period."""
        self.kargs = get_model_record(model=self, key=key, force_db=True)
        if self.kargs['para_todos']:
            funcionarios = Terceiro(where="funcionario=True").get(order_by='nome')
        else:
            sql = """SELECT t.* FROM terceiro t
                WHERE (t.active = True OR t.active IS NULL) AND t.id IN (
                    SELECT fs.terceiro FROM rh_funcionario_subsidio fs
                    WHERE (fs.active = True OR fs.active IS NULL)
                    AND fs.rh_folha_salario = '{idSal}')""".format(idSal=key)
            funcionarios = run_sql(sql)
        if len(funcionarios) != 0:
            if self.kargs['periodo'] in ('13', '14', '15'):
                self.gerarRecibosSubsidios(key=key)
            else:
                for func in funcionarios:
                    contrato = RHContratoFuncionario(where="terceiro='{id}' AND activo=True".format(id=func['id'])).get()
                    if contrato:
                        contrato = contrato[0]
                        content = {
                            'estado': 'Rascunho',
                            'user': self.kargs['user'],
                            'rh_folha_salario': self.kargs['id'],
                            'terceiro': func['id'],
                            'periodo': self.kargs['periodo'],
                            'ano': self.kargs['ano'],
                            'tipo_funcionario': contrato['tipo']
                        }
                        idRecibo = RHReciboSalario(**content).put()
                        linhas = []
                        # SALARY
                        linhas.append({'nome': 'Salário', 'valor': contrato['salario_base'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'salario', 'parte_trib': to_decimal(0)})
                        # IUR (income tax withholding)
                        if (contrato['tipo'] == 'dependente') & (contrato['vinculo'] in ('efectivo', 'a_termo')):
                            # compute the new taxable base (salary plus taxable earnings)
                            salario_bruto_iur = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_tributaveis(idFuncionario=func['id']))
                            iur = self.calcularIURfuncionario(salario=salario_bruto_iur)
                            linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                        elif (contrato['tipo'] == 'dependente') & (contrato['vinculo'] == 'prestacao_servico'):
                            # compute the new taxable base (salary plus taxable earnings)
                            salario_bruto_iur = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_tributaveis(idFuncionario=func['id']))
                            iur = self.calcularIURprestadorServico(salario=salario_bruto_iur)
                            linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                        elif contrato['tipo'] == 'pensionista':
                            iur = self.calcularRetencaoPensionista(salario=contrato['salario_base'])
                            linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                        elif contrato['tipo'] == 'nao_residente':
                            iur = self.calcularIURnaoResidente(salario=contrato['salario_base'])
                            linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                        # SOCIAL SECURITY
                        if (contrato['tipo'] == 'dependente') & (contrato['vinculo'] in ('efectivo', 'a_termo')):
                            salario_bruto_inps = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_Inps(idFuncionario=func['id']))
                            inps = self.calcularDescontoInpsFuncionario(salario=salario_bruto_inps)
                            linhas.append({'nome': 'Segurança Social', 'valor': -inps, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'inps', 'parte_trib': to_decimal(0)})
                            # employer-side contribution
                            RHGastoSuportado(rh_recibo_salario=idRecibo, nome="INPS", valor=self.calcularDescontoInpsEntidade(salario=salario_bruto_inps), user=self.kargs['user']).put()
                        # EARNINGS
                        rendimentos = RHRendimentoFuncionario(where="terceiro='{id}'".format(id=func['id'])).get()
                        for line in rendimentos:
                            tipo = RHTipoRendimento(where="id='{tr}'".format(tr=line['rh_tipo_rendimento'])).get()
                            if tipo:
                                linhas.append({'nome': tipo[0]['nome'], 'valor': line['valor'], 'rh_tipo_rendimento': tipo[0]['id'], 'rh_tipo_desconto': None, 'origem': 'rendimento', 'parte_trib': self.get_parteTributavel(tipo_rend=tipo[0]['id'], valor=line['valor'])})
                        # OTHER EARNINGS
                        outrosRendimentos = RHOutrosRendimentosFuncionario(where="terceiro='{id}' AND rh_folha_salario = '{folha}'".format(id=func['id'], folha=key)).get()
                        for line in outrosRendimentos:
                            tipo = RHTipoRendimento(where="id='{tr}'".format(tr=line['rh_tipo_rendimento'])).get()
                            if tipo:
                                linhas.append({'nome': tipo[0]['nome'], 'valor': line['valor'], 'rh_tipo_rendimento': tipo[0]['id'], 'rh_tipo_desconto': None, 'origem': 'rendimento', 'parte_trib': self.get_parteTributavel(tipo_rend=tipo[0]['id'], valor=line['valor'])})
                        # DEDUCTIONS
                        descontos = RHDescontoFuncionario(where="terceiro='{idTer}'".format(idTer=func['id'])).get()
                        for line in descontos:
                            tipo = RHTipoDesconto(where="id='{tr}'".format(tr=line['rh_tipo_desconto'])).get()
                            if tipo:
                                tipo = tipo[0]
                                if tipo['taxa']:
                                    if tipo['base'] == 'salario base':
                                        valor = contrato['salario_base'] * line['valor'] / 100
                                        linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                                    else:
                                        valor = (contrato['salario_base'] + self.get_total_rendimento_tributaveis(idFuncionario=func['id'])) * line['valor'] / 100
                                        linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                                else:
                                    linhas.append({'nome': tipo['nome'], 'valor': -line['valor'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                        # OTHER DEDUCTIONS
                        outrosDescontos = RHOutrosDescontosFuncionario(where="terceiro='{idTer}' AND rh_folha_salario='{idSal}'".format(idTer=func['id'], idSal=key)).get()
                        for line in outrosDescontos:
                            tipo = RHTipoDesconto(where="id='{tr}'".format(tr=line['rh_tipo_desconto'])).get()
                            if tipo:
                                tipo = tipo[0]
                                if tipo['taxa']:
                                    if tipo['base'] == 'salario base':
                                        valor = contrato['salario_base'] * line['valor'] / 100
                                        linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                                    else:
                                        valor = (contrato['salario_base'] + self.get_total_rendimento_tributaveis(idFuncionario=func['id'])) * line['valor'] / 100
                                        linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                                else:
                                    linhas.append({'nome': tipo['nome'], 'valor': -line['valor'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                        for line in linhas:
                            content = {
                                'user': self.kargs['user'],
                                'rh_recibo_salario': idRecibo,
                                'nome': line['nome'],
                                'valor': line['valor'],
                                'rh_tipo_rendimento': line['rh_tipo_rendimento'],
                                'rh_tipo_desconto': line['rh_tipo_desconto'],
                                'origem': line['origem'],
                                'parte_trib': line['parte_trib']
                            }
                            RHLinhaReciboSalario(**content).put()
            # update state
            self.kargs['estado'] = 'Gerado'
            self.put()
        else:
            return error_message('Não pode gerar recibos sem indicar os funcionarios!\n')
        return form_edit(window_id=window_id).show()
    def gerarRecibosSubsidios(self, key):
        """Generates the bonus (13th/14th/15th period) payslips."""
        self.kargs = get_model_record(model=self, key=key, force_db=True)
        funcionarios = []
        if self.kargs['para_todos']:
            funcionarios = Terceiro(where="funcionario=True").get(order_by='nome')
        else:
            sql = """SELECT t.* FROM terceiro t
                WHERE (t.active = True OR t.active IS NULL)
                AND t.id IN (
                    SELECT fs.terceiro FROM rh_funcionario_subsidio fs
                    WHERE (fs.active = True OR fs.active IS NULL)
                    AND fs.rh_folha_salario = '{idSal}'
                )""".format(idSal=key)
            funcionarios = run_sql(sql)
        for func in funcionarios:
            contrato = RHContratoFuncionario(where="terceiro='{id}'".format(id=func['id'])).get()
            if contrato:
                contrato = contrato[0]
                content = {
                    'estado': 'Rascunho',
                    'user': self.kargs['user'],
                    'rh_folha_salario': self.kargs['id'],
                    'terceiro': func['id'],
                    'periodo': self.kargs['periodo'],
                    'ano': self.kargs['ano']
                }
                idRecibo = RHReciboSalario(**content).put()
                linhas = []
                # SALARY
                linhas.append({'nome': 'Salário', 'valor': contrato['salario_base'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'salario', 'parte_trib': to_decimal(0)})
                # IUR (income tax withholding)
                if (contrato['tipo'] == 'dependente') & (contrato['vinculo'] in ('efectivo', 'a_termo')):
                    # compute the new taxable base (salary plus taxable earnings)
                    salario_bruto_iur = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_tributaveis(idFuncionario=func['id']))
                    iur = self.calcularIURfuncionario(salario=salario_bruto_iur)
                    linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                elif (contrato['tipo'] == 'dependente') & (contrato['vinculo'] == 'prestacao_servico'):
                    # compute the new taxable base (salary plus taxable earnings)
                    salario_bruto_iur = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_tributaveis(idFuncionario=func['id']))
                    iur = self.calcularIURprestadorServico(salario=salario_bruto_iur)
                    linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                elif contrato['tipo'] == 'pensionista':
                    iur = self.calcularRetencaoPensionista(salario=contrato['salario_base'])
                    linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                elif contrato['tipo'] == 'nao_residente':
                    iur = self.calcularIURnaoResidente(salario=contrato['salario_base'])
                    linhas.append({'nome': 'IUR', 'valor': -iur, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'iur', 'parte_trib': to_decimal(0)})
                # SOCIAL SECURITY
                if (contrato['tipo'] == 'dependente') & (contrato['vinculo'] in ('efectivo', 'a_termo')):
                    salario_bruto_inps = to_decimal(contrato['salario_base']) + to_decimal(self.get_total_rendimento_Inps_subsidio(idFuncionario=func['id']))
                    inps = self.calcularDescontoInpsFuncionario(salario=salario_bruto_inps)
                    linhas.append({'nome': 'Segurança Social', 'valor': -inps, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': None, 'origem': 'inps', 'parte_trib': to_decimal(0)})
                    # employer-side contribution
                    RHGastoSuportado(rh_recibo_salario=idRecibo, nome="INPS", valor=self.calcularDescontoInpsEntidade(salario=salario_bruto_inps), user=self.kargs['user']).put()
                # EARNINGS
                rendimentos = RHRendimentoFuncionario(where="terceiro='{id}' AND em_subsidio = True".format(id=func['id'])).get()
                for line in rendimentos:
                    tipo = RHTipoRendimento(where="id='{tr}'".format(tr=line['rh_tipo_rendimento'])).get()
                    if tipo:
                        linhas.append({'nome': tipo[0]['nome'], 'valor': line['valor'], 'rh_tipo_rendimento': tipo[0]['id'], 'rh_tipo_desconto': None, 'origem': 'rendimento', 'parte_trib': self.get_parteTributavel(tipo_rend=tipo[0]['id'], valor=line['valor'])})
                # OTHER EARNINGS
                outrosRendimentos = RHOutrosRendimentosFuncionario(where="terceiro='{id}' AND rh_folha_salario = '{folha}'".format(id=func['id'], folha=key)).get()
                for line in outrosRendimentos:
                    tipo = RHTipoRendimento(where="id='{tr}'".format(tr=line['rh_tipo_rendimento'])).get()
                    if tipo:
                        linhas.append({'nome': tipo[0]['nome'], 'valor': line['valor'], 'rh_tipo_rendimento': tipo[0]['id'], 'rh_tipo_desconto': None, 'origem': 'rendimento', 'parte_trib': self.get_parteTributavel(tipo_rend=tipo[0]['id'], valor=line['valor'])})
                # DEDUCTIONS
                descontos = RHDescontoFuncionario(where="terceiro='{idTer}' AND em_subsidio=True".format(idTer=func['id'])).get()
                for line in descontos:
                    tipo = RHTipoDesconto(where="id='{tr}'".format(tr=line['rh_tipo_desconto'])).get()
                    if tipo:
                        tipo = tipo[0]
                        if tipo['taxa']:
                            if tipo['base'] == 'salario base':
                                valor = contrato['salario_base'] * line['valor'] / 100
                                linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                            else:
                                valor = (contrato['salario_base'] + self.get_total_rendimento_tributaveis_subsidio(idFuncionario=func['id'])) * line['valor'] / 100
                                linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                        else:
                            linhas.append({'nome': tipo['nome'], 'valor': -line['valor'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'desconto', 'parte_trib': to_decimal(0)})
                # OTHER DEDUCTIONS
                outrosDescontos = RHOutrosDescontosFuncionario(where="terceiro='{idTer}' AND rh_folha_salario='{idSal}'".format(idTer=func['id'], idSal=key)).get()
                for line in outrosDescontos:
                    tipo = RHTipoDesconto(where="id='{tr}'".format(tr=line['rh_tipo_desconto'])).get()
                    if tipo:
                        tipo = tipo[0]
                        if tipo['taxa']:
                            if tipo['base'] == 'salario base':
                                valor = contrato['salario_base'] * line['valor'] / 100
                                linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                            else:
                                valor = (contrato['salario_base'] + self.get_total_rendimento_tributaveis_subsidio(idFuncionario=func['id'])) * line['valor'] / 100
                                linhas.append({'nome': tipo['nome'], 'valor': -valor, 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                        else:
                            linhas.append({'nome': tipo['nome'], 'valor': -line['valor'], 'rh_tipo_rendimento': None, 'rh_tipo_desconto': tipo['id'], 'origem': 'outros_descontos', 'parte_trib': to_decimal(0)})
                for line in linhas:
                    content = {
                        'user': self.kargs['user'],
                        'rh_recibo_salario': idRecibo,
                        'nome': line['nome'],
                        'valor': line['valor'],
                        'rh_tipo_rendimento': line['rh_tipo_rendimento'],
                        'rh_tipo_desconto': line['rh_tipo_desconto'],
                        'origem': line['origem'],
                        'parte_trib': line['parte_trib']
                    }
                    RHLinhaReciboSalario(**content).put()
def get_total_rendimento_tributaveis(self, idFuncionario):
"""
retorna o total tributavel dos rendimentos (excepto salario) de um funcionario
"""
total = to_decimal(0)
sql = """SELECT rf.valor, tr.*
FROM rh_rendimento_funcionario rf, rh_tipo_rendimento tr
WHERE (rf.active = True OR rf.active IS NULL)
AND (tr.active = True OR tr.active IS NULL)
AND rf.rh_tipo_rendimento = tr.id
AND rf.terceiro = '{id}'""".format(id=idFuncionario)
rendimentos = run_sql(sql)
for line in rendimentos:
#verificar se o valor é maior que o limite de isencao
if to_decimal(line['valor']) > to_decimal(line['limite_isento']):
a_tributar = to_decimal(line['valor']) - to_decimal(line['limite_isento'])
total+=a_tributar*to_decimal(line['percent_tribuavel'])/100
return total
def get_parteTributavel(self,tipo_rend, valor):
'''
retorna a parte tributavel do valor de um rendimento
'''
total=to_decimal(0)
tipo = RHTipoRendimento(where="id='{id}'".format(id=tipo_rend)).get()
if tipo:
tipo=tipo[0]
if to_decimal(valor) > to_decimal(tipo['limite_isento']):
a_tributar = to_decimal(valor) - to_decimal(tipo['limite_isento'])
total+=a_tributar*to_decimal(tipo['percent_tribuavel'])/100
return total
def get_total_rendimento_tributaveis_subsidio(self, idFuncionario):
"""
retorna o total tributavel dos rendimentos (excepto salario) de um funcionario
"""
total = to_decimal(0)
sql = """SELECT rf.valor, tr.*
FROM rh_rendimento_funcionario rf, rh_tipo_rendimento tr
WHERE (rf.active = True OR rf.active IS NULL)
AND (tr.active = True OR tr.active IS NULL)
AND rf.rh_tipo_rendimento = tr.id
AND rf.em_subsidio=True
AND rf.terceiro = '{id}'""".format(id=idFuncionario)
rendimentos = run_sql(sql)
for line in rendimentos:
#verificar se o valor é maior que o limite de isencao
if to_decimal(line['valor']) > to_decimal(line['limite_isento']):
a_tributar = to_decimal(line['valor']) - to_decimal(line['limite_isento'])
total+=a_tributar*to_decimal(line['percent_tribuavel'])/100
return total
    def get_total_rendimento_Inps(self, idFuncionario):
        """
        Returns the total of an employee's incomes that contribute to social security.
        """
        total = to_decimal(0)
        sql = """SELECT rf.valor, tr.*
            FROM rh_rendimento_funcionario rf, rh_tipo_rendimento tr
            WHERE rf.rh_tipo_rendimento = tr.id
              AND tr.desconto_inps = True
              AND rf.terceiro = '{id}'""".format(id=idFuncionario)
        rendimentos = run_sql(sql)
        for line in rendimentos:
            if line['desconto_inps']:
                total += to_decimal(line['valor'])
        return total

    def get_total_rendimento_Inps_subsidio(self, idFuncionario):
        """
        Returns the total of an employee's incomes that count towards subsidies
        and contribute to social security.
        """
        total = to_decimal(0)
        sql = """SELECT rf.valor, tr.*
            FROM rh_rendimento_funcionario rf, rh_tipo_rendimento tr
            WHERE rf.rh_tipo_rendimento = tr.id
              AND rf.em_subsidio = True
              AND tr.desconto_inps = True
              AND rf.terceiro = '{id}'""".format(id=idFuncionario)
        rendimentos = run_sql(sql)
        for line in rendimentos:
            if line['desconto_inps']:
                total += to_decimal(line['valor'])
        return total
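A minimal, self-contained sketch of the contribution-base totalling done by the two methods above: only income lines whose type is flagged `desconto_inps` are summed, optionally restricted to subsidy lines. The field names mirror the joined queries above; the data is illustrative:

```python
from decimal import Decimal

def total_rendimento_inps(rendimentos, em_subsidio=None):
    """Sum the income lines that contribute to social security.

    `rendimentos` is a list of dicts shaped like the joined query rows
    above; when `em_subsidio` is given, only lines with that flag count.
    """
    total = Decimal(0)
    for line in rendimentos:
        if not line.get('desconto_inps'):
            continue
        if em_subsidio is not None and line.get('em_subsidio') != em_subsidio:
            continue
        total += Decimal(line['valor'])
    return total

# Illustrative rows, not real payroll data:
rendimentos = [
    {'valor': 10000, 'desconto_inps': True, 'em_subsidio': False},
    {'valor': 5000, 'desconto_inps': True, 'em_subsidio': True},
    {'valor': 2000, 'desconto_inps': False, 'em_subsidio': True},
]
```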
    def Activar(self, key, window_id):
        self.kargs = get_model_record(model=self, key=key)
        self.kargs['estado'] = 'Activo'
        self.put()
        return form_edit(window_id=window_id).show()

    def Cancelar_Recibos(self, key, window_id):
        """
        Cancels the receipts generated from this salary sheet.
        """
        self.kargs = get_model_record(model=self, key=key)
        try:
            from my_rh_recibo_salario import RHReciboSalario
        except ImportError:
            from rh_recibo_salario import RHReciboSalario
        recibos = RHReciboSalario(where="rh_folha_salario='{id}'".format(id=self.kargs['id'])).get()
        for recibo in recibos:
            recibo['estado'] = 'Cancelado'
            recibo['user'] = self.kargs['user']
            RHReciboSalario(**recibo).put()
        self.kargs['estado'] = 'Recibos Cancelado'
        self.put()
        return form_edit(window_id=window_id).show()

    def Confirmar_Recibos(self, key, window_id):
        """
        Confirms the receipts generated from this salary sheet.
        """
        self.kargs = get_model_record(model=self, key=key)
        try:
            from my_rh_recibo_salario import RHReciboSalario
        except ImportError:
            from rh_recibo_salario import RHReciboSalario
        recibos = RHReciboSalario(where="rh_folha_salario='{id}' AND estado='Rascunho'".format(id=self.kargs['id'])).get()
        for recibo in recibos:
            recibo['estado'] = 'Confirmado'
            recibo['user'] = self.kargs['user']
            RHReciboSalario(**recibo).put()
        self.kargs['estado'] = 'Recibos Confirmado'
        self.put()
        return form_edit(window_id=window_id).show()

    def Efectuar_Pagamento(self, key, window_id):
        self.kargs = get_model_record(model=self, key=key)
        try:
            from my_rh_recibo_salario import RHReciboSalario
        except ImportError:
            from rh_recibo_salario import RHReciboSalario
        recibos = RHReciboSalario(where="rh_folha_salario='{id}' AND estado='Confirmado'".format(id=self.kargs['id'])).get()
        for recibo in recibos:
            recibo['estado'] = 'Pago'
            recibo['user'] = self.kargs['user']
            RHReciboSalario(**recibo).put()
        self.kargs['estado'] = 'Pago'
        self.put()
        return form_edit(window_id=window_id).show()

    def Cancelar(self, key, window_id):
        self.kargs = get_model_record(model=self, key=key)
        try:
            from my_rh_recibo_salario import RHReciboSalario
        except ImportError:
            from rh_recibo_salario import RHReciboSalario
        recibos = RHReciboSalario(where="rh_folha_salario='{id}'".format(id=key)).get()
        for recibo in recibos:
            recibo['estado'] = 'Cancelado'
            recibo['user'] = self.kargs['user']
            RHReciboSalario(**recibo).put()
        self.kargs['estado'] = 'Cancelado'
        self.put()
        return form_edit(window_id=window_id).show()

    def Rascunho(self, key, window_id):
        self.kargs = get_model_record(model=self, key=key)
        self.kargs['estado'] = 'Rascunho'
        self.put()
        return form_edit(window_id=window_id).show()
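The `Cancelar_Recibos` / `Confirmar_Recibos` / `Efectuar_Pagamento` / `Cancelar` / `Rascunho` methods above implement a small state machine over the salary sheet's receipts: Rascunho (draft) → Confirmado (confirmed) → Pago (paid), with cancellation reachable from any state. The transition table below is inferred from the WHERE clauses above — it is a sketch, not defined by the module itself:

```python
# Receipt states and the transitions implied by the methods above.
# Inferred from the code, not authoritative.
TRANSITIONS = {
    'Rascunho': {'Confirmado', 'Cancelado'},
    'Confirmado': {'Pago', 'Cancelado'},
    'Pago': {'Cancelado'},
    'Cancelado': set(),
}

def transition(recibo, novo_estado):
    """Return a copy of a receipt dict moved to a new state, if allowed."""
    if novo_estado not in TRANSITIONS[recibo['estado']]:
        raise ValueError(
            "cannot go from {} to {}".format(recibo['estado'], novo_estado))
    return dict(recibo, estado=novo_estado)
```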
    def taxas_retencao(self):
        def get_results():
            try:
                from my_rh_retencao import RHRetencao
            except ImportError:
                from rh_retencao import RHRetencao
            taxa = RHRetencao().get()
            if taxa:
                return taxa[0]
        return erp_cache.get(key=self.__model_name__ + 'retencoes', createfunc=get_results)

    def calcularIURfuncionario(self, salario):
        """
        Calculates the employee's IUR given the taxable salary amount.
        """
        iur = to_decimal(0)
        taxa = RHTaxaRetencao(where="minimo<='{sal}' AND maximo>='{sal}'".format(sal=salario)).get()
        if taxa:
            iur = to_decimal(salario) * to_decimal(taxa[0]['taxa']) / 100
        return iur
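`calcularIURfuncionario` resolves the rate from a bracket table where each row carries a `minimo`/`maximo` range and a `taxa` percentage, then applies that single rate to the whole salary. A standalone sketch with an illustrative bracket table (the figures below are made up, not the real IUR schedule):

```python
from decimal import Decimal

# Illustrative bracket table: (minimo, maximo, taxa %). Not real rates.
BRACKETS = [
    (Decimal(0), Decimal(30000), Decimal(0)),
    (Decimal(30001), Decimal(80000), Decimal(10)),
    (Decimal(80001), Decimal(250000), Decimal(20)),
]

def calcular_iur(salario):
    """Apply the rate of the bracket containing `salario` to the whole amount."""
    salario = Decimal(salario)
    for minimo, maximo, taxa in BRACKETS:
        if minimo <= salario <= maximo:
            return salario * taxa / 100
    return Decimal(0)
```

Note this is a flat-rate-per-bracket lookup, as in the method above, not a marginal (progressive-slice) calculation.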
    def calcularRetencaoPensionista(self, salario):
        if salario not in (0, None, '0', 'None', ''):
            salario = to_decimal(salario)
            # annual exemption of 960,000 applied monthly (960000 / 12 = 80000)
            retencao = salario - to_decimal(960000) / 12
            return retencao
        else:
            return to_decimal(0)
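A worked version of the pensioner retention base: the hard-coded 960,000 is an annual exemption, so 960000 / 12 = 80,000 is deducted from the monthly amount. This sketch additionally floors the result at zero, which the method above does not do — an assumption, flagged here and in the code:

```python
from decimal import Decimal

EXEMPTION_ANNUAL = Decimal(960000)

def retencao_pensionista(salario_mensal):
    """Monthly retention base: salary minus 1/12 of the annual exemption.

    Unlike the original method, negative results are clamped to zero
    (an assumption made for this sketch).
    """
    base = Decimal(salario_mensal) - EXEMPTION_ANNUAL / 12
    return max(base, Decimal(0))
```

For example, a 100,000 monthly salary leaves a base of 100000 - 80000 = 20,000.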
    def calcularIURnaoResidente(self, salario):
        if salario not in (0, None, '0', 'None', ''):
            salario = to_decimal(salario)
            taxa = self.taxas_retencao()
            if taxa:
                retencao = salario * to_decimal(taxa['taxa_nao_residente']) / 100
                return retencao
            else:
                return to_decimal(0)
        else:
            return to_decimal(0)

    def calcularIURprestadorServico(self, salario):
        if salario not in (0, None, '0', 'None', ''):
            salario = to_decimal(salario)
            taxa = self.taxas_retencao()
            if taxa:
                retencao = salario * to_decimal(taxa['taxa_prestacao_servico']) / 100
                return retencao
            else:
                return to_decimal(0)
        else:
            return to_decimal(0)

    def calcularDescontoInpsFuncionario(self, salario):
        if salario not in (0, None, '0', 'None', ''):
            salario = to_decimal(salario)
            taxa = self.taxas_retencao()
            if taxa:
                retencao = salario * to_decimal(taxa['taxa_inps_func']) / 100
                return retencao
            else:
                return to_decimal(0)
        else:
            return to_decimal(0)

    def calcularDescontoInpsEntidade(self, salario):
        if salario not in (0, None, '0', 'None', ''):
            salario = to_decimal(salario)
            taxa = self.taxas_retencao()
            if taxa:
                retencao = salario * to_decimal(taxa['taxa_inps_entidade']) / 100
                return retencao
            else:
                return to_decimal(0)
        else:
            return to_decimal(0)

# --- src/Lib/site-packages/simpleaio/__init__.py
#     (repo: NUS-ALSET/ace-react-redux-brython, license: MIT) ---
"""The asyncio package, tracking PEP 3156."""
from .events import *
from .coroutines import *
from .futures import *
from .tasks import *
from .http import *

# --- src/datadog_api_client/v2/api/security_monitoring_api.py
#     (repo: rchenzheng/datadog-api-client-python, license: Apache-2.0) ---
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2019-Present Datadog, Inc.
import re # noqa: F401
import sys # noqa: F401
from datadog_api_client.v2.api_client import ApiClient, Endpoint as _Endpoint
from datadog_api_client.v2.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types,
)
from datadog_api_client.v2.model.api_error_response import APIErrorResponse
from datadog_api_client.v2.model.security_filter_create_request import SecurityFilterCreateRequest
from datadog_api_client.v2.model.security_filter_delete_response import SecurityFilterDeleteResponse
from datadog_api_client.v2.model.security_filter_response import SecurityFilterResponse
from datadog_api_client.v2.model.security_filter_update_request import SecurityFilterUpdateRequest
from datadog_api_client.v2.model.security_filters_response import SecurityFiltersResponse
from datadog_api_client.v2.model.security_monitoring_list_rules_response import SecurityMonitoringListRulesResponse
from datadog_api_client.v2.model.security_monitoring_rule_create_payload import SecurityMonitoringRuleCreatePayload
from datadog_api_client.v2.model.security_monitoring_rule_response import SecurityMonitoringRuleResponse
from datadog_api_client.v2.model.security_monitoring_rule_update_payload import SecurityMonitoringRuleUpdatePayload
from datadog_api_client.v2.model.security_monitoring_signal_list_request import SecurityMonitoringSignalListRequest
from datadog_api_client.v2.model.security_monitoring_signals_list_response import SecurityMonitoringSignalsListResponse
from datadog_api_client.v2.model.security_monitoring_signals_sort import SecurityMonitoringSignalsSort
class SecurityMonitoringApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
self._create_security_filter_endpoint = _Endpoint(
settings={
"response_type": (SecurityFilterResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/configuration/security_filters",
"operation_id": "create_security_filter",
"http_method": "POST",
"servers": None,
},
params_map={
"all": [
"body",
],
"required": [
"body",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"body": (SecurityFilterCreateRequest,),
},
"attribute_map": {},
"location_map": {
"body": "body",
},
"collection_format_map": {},
},
headers_map={"accept": ["application/json"], "content_type": ["application/json"]},
api_client=api_client,
)
self._create_security_monitoring_rule_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringRuleResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/rules",
"operation_id": "create_security_monitoring_rule",
"http_method": "POST",
"servers": None,
},
params_map={
"all": [
"body",
],
"required": [
"body",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"body": (SecurityMonitoringRuleCreatePayload,),
},
"attribute_map": {},
"location_map": {
"body": "body",
},
"collection_format_map": {},
},
headers_map={"accept": ["application/json"], "content_type": ["application/json"]},
api_client=api_client,
)
self._delete_security_filter_endpoint = _Endpoint(
settings={
"response_type": (SecurityFilterDeleteResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}",
"operation_id": "delete_security_filter",
"http_method": "DELETE",
"servers": None,
},
params_map={
"all": [
"security_filter_id",
],
"required": [
"security_filter_id",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"security_filter_id": (str,),
},
"attribute_map": {
"security_filter_id": "security_filter_id",
},
"location_map": {
"security_filter_id": "path",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._delete_security_monitoring_rule_endpoint = _Endpoint(
settings={
"response_type": None,
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/rules/{rule_id}",
"operation_id": "delete_security_monitoring_rule",
"http_method": "DELETE",
"servers": None,
},
params_map={
"all": [
"rule_id",
],
"required": [
"rule_id",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"rule_id": (str,),
},
"attribute_map": {
"rule_id": "rule_id",
},
"location_map": {
"rule_id": "path",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._get_security_filter_endpoint = _Endpoint(
settings={
"response_type": (SecurityFilterResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}",
"operation_id": "get_security_filter",
"http_method": "GET",
"servers": None,
},
params_map={
"all": [
"security_filter_id",
],
"required": [
"security_filter_id",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"security_filter_id": (str,),
},
"attribute_map": {
"security_filter_id": "security_filter_id",
},
"location_map": {
"security_filter_id": "path",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._get_security_monitoring_rule_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringRuleResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/rules/{rule_id}",
"operation_id": "get_security_monitoring_rule",
"http_method": "GET",
"servers": None,
},
params_map={
"all": [
"rule_id",
],
"required": [
"rule_id",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"rule_id": (str,),
},
"attribute_map": {
"rule_id": "rule_id",
},
"location_map": {
"rule_id": "path",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._list_security_filters_endpoint = _Endpoint(
settings={
"response_type": (SecurityFiltersResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/configuration/security_filters",
"operation_id": "list_security_filters",
"http_method": "GET",
"servers": None,
},
params_map={"all": [], "required": [], "nullable": [], "enum": [], "validation": []},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {},
"attribute_map": {},
"location_map": {},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._list_security_monitoring_rules_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringListRulesResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/rules",
"operation_id": "list_security_monitoring_rules",
"http_method": "GET",
"servers": None,
},
params_map={
"all": [
"page_size",
"page_number",
],
"required": [],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"page_size": (int,),
"page_number": (int,),
},
"attribute_map": {
"page_size": "page[size]",
"page_number": "page[number]",
},
"location_map": {
"page_size": "query",
"page_number": "query",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
self._list_security_monitoring_signals_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringSignalsListResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/signals",
"operation_id": "list_security_monitoring_signals",
"http_method": "GET",
"servers": None,
},
params_map={
"all": [
"filter_query",
"filter_from",
"filter_to",
"sort",
"page_cursor",
"page_limit",
],
"required": [],
"nullable": [],
"enum": [],
"validation": [
"page_limit",
],
},
root_map={
"validations": {
("page_limit",): {
"inclusive_maximum": 1000,
},
},
"allowed_values": {},
"openapi_types": {
"filter_query": (str,),
"filter_from": (datetime,),
"filter_to": (datetime,),
"sort": (SecurityMonitoringSignalsSort,),
"page_cursor": (str,),
"page_limit": (int,),
},
"attribute_map": {
"filter_query": "filter[query]",
"filter_from": "filter[from]",
"filter_to": "filter[to]",
"sort": "sort",
"page_cursor": "page[cursor]",
"page_limit": "page[limit]",
},
"location_map": {
"filter_query": "query",
"filter_from": "query",
"filter_to": "query",
"sort": "query",
"page_cursor": "query",
"page_limit": "query",
},
"collection_format_map": {},
},
headers_map={
"accept": ["application/json"],
"content_type": [],
},
api_client=api_client,
)
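Each `_Endpoint` above pairs an `attribute_map` (pythonic name → wire name, e.g. `page_limit` → `page[limit]`) with a `location_map` that routes each value into the query string, path, or body. A minimal sketch of the query-parameter translation, assuming dicts shaped like the ones above (this is not the client's own code):

```python
def build_query_params(kwargs, attribute_map, location_map):
    """Translate pythonic kwargs into wire-format query parameters.

    Mirrors the attribute_map/location_map pairing used by the endpoint
    definitions above; a sketch, not the ApiClient implementation.
    """
    return {
        attribute_map[name]: value
        for name, value in kwargs.items()
        if location_map.get(name) == "query"
    }

# Maps copied in shape from the signals endpoint above:
attribute_map = {"page_cursor": "page[cursor]", "page_limit": "page[limit]"}
location_map = {"page_cursor": "query", "page_limit": "query", "body": "body"}
```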
self._search_security_monitoring_signals_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringSignalsListResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/signals/search",
"operation_id": "search_security_monitoring_signals",
"http_method": "POST",
"servers": None,
},
params_map={
"all": [
"body",
],
"required": [],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"body": (SecurityMonitoringSignalListRequest,),
},
"attribute_map": {},
"location_map": {
"body": "body",
},
"collection_format_map": {},
},
headers_map={"accept": ["application/json"], "content_type": ["application/json"]},
api_client=api_client,
)
self._update_security_filter_endpoint = _Endpoint(
settings={
"response_type": (SecurityFilterResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}",
"operation_id": "update_security_filter",
"http_method": "PATCH",
"servers": None,
},
params_map={
"all": [
"security_filter_id",
"body",
],
"required": [
"security_filter_id",
"body",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"security_filter_id": (str,),
"body": (SecurityFilterUpdateRequest,),
},
"attribute_map": {
"security_filter_id": "security_filter_id",
},
"location_map": {
"security_filter_id": "path",
"body": "body",
},
"collection_format_map": {},
},
headers_map={"accept": ["application/json"], "content_type": ["application/json"]},
api_client=api_client,
)
self._update_security_monitoring_rule_endpoint = _Endpoint(
settings={
"response_type": (SecurityMonitoringRuleResponse,),
"auth": ["apiKeyAuth", "appKeyAuth"],
"endpoint_path": "/api/v2/security_monitoring/rules/{rule_id}",
"operation_id": "update_security_monitoring_rule",
"http_method": "PUT",
"servers": None,
},
params_map={
"all": [
"rule_id",
"body",
],
"required": [
"rule_id",
"body",
],
"nullable": [],
"enum": [],
"validation": [],
},
root_map={
"validations": {},
"allowed_values": {},
"openapi_types": {
"rule_id": (str,),
"body": (SecurityMonitoringRuleUpdatePayload,),
},
"attribute_map": {
"rule_id": "rule_id",
},
"location_map": {
"rule_id": "path",
"body": "body",
},
"collection_format_map": {},
},
headers_map={"accept": ["application/json"], "content_type": ["application/json"]},
api_client=api_client,
)
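The signal-listing endpoints above support cursor pagination via `page[cursor]` and `page[limit]`. A hedged sketch of the typical consumption loop — `fetch_page` here is a stand-in for a call such as `api.list_security_monitoring_signals`, and the response shape (a `data` list plus an optional `meta.page.after` cursor) is an assumption mirroring `SecurityMonitoringSignalsListResponse`:

```python
def iterate_signals(fetch_page, page_limit=10):
    """Yield items across cursor-paginated pages.

    `fetch_page(cursor, limit)` is assumed to return a dict with a
    "data" list and an optional meta.page.after cursor. Sketch only.
    """
    cursor = None
    while True:
        page = fetch_page(cursor, page_limit)
        for item in page.get("data", []):
            yield item
        cursor = page.get("meta", {}).get("page", {}).get("after")
        if not cursor:
            break

# A fake stand-in for the API call, for demonstration:
_pages = {
    None: {"data": ["sig-1", "sig-2"], "meta": {"page": {"after": "cursor-1"}}},
    "cursor-1": {"data": ["sig-3"], "meta": {}},
}

def fake_fetch(cursor, limit):
    return _pages[cursor]
```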
def create_security_filter(self, body, **kwargs):
"""Create a security filter # noqa: E501
Create a security filter. See the [security filter guide](https://docs.datadoghq.com/security_platform/guide/how-to-setup-security-filters-using-security-monitoring-api/) for more examples. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_security_filter(body, async_req=True)
>>> result = thread.get()
Args:
body (SecurityFilterCreateRequest): The definition of the new security filter.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityFilterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._create_security_filter_endpoint.default_arguments(kwargs)
kwargs["body"] = body
return self._create_security_filter_endpoint.call_with_http_info(**kwargs)
def create_security_monitoring_rule(self, body, **kwargs):
"""Create a detection rule # noqa: E501
Create a detection rule. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_security_monitoring_rule(body, async_req=True)
>>> result = thread.get()
Args:
body (SecurityMonitoringRuleCreatePayload):
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringRuleResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._create_security_monitoring_rule_endpoint.default_arguments(kwargs)
kwargs["body"] = body
return self._create_security_monitoring_rule_endpoint.call_with_http_info(**kwargs)
def delete_security_filter(self, security_filter_id, **kwargs):
"""Delete a security filter # noqa: E501
Delete a specific security filter. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_security_filter(security_filter_id, async_req=True)
>>> result = thread.get()
Args:
security_filter_id (str): The ID of the security filter.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityFilterDeleteResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._delete_security_filter_endpoint.default_arguments(kwargs)
kwargs["security_filter_id"] = security_filter_id
return self._delete_security_filter_endpoint.call_with_http_info(**kwargs)
def delete_security_monitoring_rule(self, rule_id, **kwargs):
"""Delete an existing rule # noqa: E501
Delete an existing rule. Default rules cannot be deleted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_security_monitoring_rule(rule_id, async_req=True)
>>> result = thread.get()
Args:
rule_id (str): The ID of the rule.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._delete_security_monitoring_rule_endpoint.default_arguments(kwargs)
kwargs["rule_id"] = rule_id
return self._delete_security_monitoring_rule_endpoint.call_with_http_info(**kwargs)
def get_security_filter(self, security_filter_id, **kwargs):
"""Get a security filter # noqa: E501
Get the details of a specific security filter. See the [security filter guide](https://docs.datadoghq.com/security_platform/guide/how-to-setup-security-filters-using-security-monitoring-api/) for more examples. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_security_filter(security_filter_id, async_req=True)
>>> result = thread.get()
Args:
security_filter_id (str): The ID of the security filter.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityFilterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._get_security_filter_endpoint.default_arguments(kwargs)
kwargs["security_filter_id"] = security_filter_id
return self._get_security_filter_endpoint.call_with_http_info(**kwargs)
def get_security_monitoring_rule(self, rule_id, **kwargs):
"""Get a rule's details # noqa: E501
Get a rule's details. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_security_monitoring_rule(rule_id, async_req=True)
>>> result = thread.get()
Args:
rule_id (str): The ID of the rule.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringRuleResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._get_security_monitoring_rule_endpoint.default_arguments(kwargs)
kwargs["rule_id"] = rule_id
return self._get_security_monitoring_rule_endpoint.call_with_http_info(**kwargs)
def list_security_filters(self, **kwargs):
"""Get all security filters # noqa: E501
Get the list of configured security filters with their definitions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_security_filters(async_req=True)
>>> result = thread.get()
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityFiltersResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._list_security_filters_endpoint.default_arguments(kwargs)
return self._list_security_filters_endpoint.call_with_http_info(**kwargs)
def list_security_monitoring_rules(self, **kwargs):
"""List rules # noqa: E501
List rules. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_security_monitoring_rules(async_req=True)
>>> result = thread.get()
Keyword Args:
            page_size (int): Size for a given page. [optional] if omitted the server will use the default value of 10
            page_number (int): Specific page number to return. [optional] if omitted the server will use the default value of 0
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
            _check_input_type (bool): specifies if type checking
                should be done on the data sent to the server.
                Default is True.
            _check_return_type (bool): specifies if type checking
                should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringListRulesResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._list_security_monitoring_rules_endpoint.default_arguments(kwargs)
return self._list_security_monitoring_rules_endpoint.call_with_http_info(**kwargs)
def list_security_monitoring_signals(self, **kwargs):
"""Get a quick list of security signals # noqa: E501
The list endpoint returns security signals that match a search query. Both this endpoint and the POST endpoint can be used interchangeably when listing security signals. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_security_monitoring_signals(async_req=True)
>>> result = thread.get()
Keyword Args:
filter_query (str): The search query for security signals. [optional]
filter_from (datetime): The minimum timestamp for requested security signals. [optional]
filter_to (datetime): The maximum timestamp for requested security signals. [optional]
sort (SecurityMonitoringSignalsSort): The order of the security signals in results. [optional]
page_cursor (str): A list of results using the cursor provided in the previous query. [optional]
page_limit (int): The maximum number of security signals in the response. [optional] if omitted the server will use the default value of 10
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringSignalsListResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._list_security_monitoring_signals_endpoint.default_arguments(kwargs)
return self._list_security_monitoring_signals_endpoint.call_with_http_info(**kwargs)
def search_security_monitoring_signals(self, **kwargs):
"""Get a list of security signals # noqa: E501
Returns security signals that match a search query. Both this endpoint and the GET endpoint can be used interchangeably for listing security signals. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.search_security_monitoring_signals(async_req=True)
>>> result = thread.get()
Keyword Args:
body (SecurityMonitoringSignalListRequest): [optional]
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringSignalsListResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._search_security_monitoring_signals_endpoint.default_arguments(kwargs)
return self._search_security_monitoring_signals_endpoint.call_with_http_info(**kwargs)
def update_security_filter(self, security_filter_id, body, **kwargs):
"""Update a security filter # noqa: E501
Update a specific security filter. Returns the security filter object when the request is successful. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_security_filter(security_filter_id, body, async_req=True)
>>> result = thread.get()
Args:
security_filter_id (str): The ID of the security filter.
body (SecurityFilterUpdateRequest): New definition of the security filter.
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityFilterResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._update_security_filter_endpoint.default_arguments(kwargs)
kwargs["security_filter_id"] = security_filter_id
kwargs["body"] = body
return self._update_security_filter_endpoint.call_with_http_info(**kwargs)
def update_security_monitoring_rule(self, rule_id, body, **kwargs):
"""Update an existing rule # noqa: E501
Update an existing rule. When updating `cases`, `queries` or `options`, the whole field must be included. For example, when modifying a query all queries must be included. Default rules can only be updated to be enabled and to change notifications. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_security_monitoring_rule(rule_id, body, async_req=True)
>>> result = thread.get()
Args:
rule_id (str): The ID of the rule.
body (SecurityMonitoringRuleUpdatePayload):
Keyword Args:
_return_http_data_only (bool): return the response data only, without
the HTTP status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (float/tuple): timeout setting for this request. If one
number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
SecurityMonitoringRuleResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs = self._update_security_monitoring_rule_endpoint.default_arguments(kwargs)
kwargs["rule_id"] = rule_id
kwargs["body"] = body
return self._update_security_monitoring_rule_endpoint.call_with_http_info(**kwargs)
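# Every method above documents the same sync/async contract: call it directly for a
# blocking request, or pass async_req=True and resolve the returned thread with .get().
# The shape of that contract can be sketched without the generated Datadog client;
# everything below (SketchApi, the fake endpoint payload) is illustrative, using
# concurrent.futures in place of the client's thread pool:

```python
from concurrent.futures import ThreadPoolExecutor


class SketchApi:
    """Illustrative stand-in for the generated client, not the real Datadog API."""

    def __init__(self):
        self._executor = ThreadPoolExecutor(max_workers=4)

    def _call_endpoint(self, rule_id):
        # Stands in for the HTTP round trip.
        return {"id": rule_id, "name": "example rule"}

    def update_security_monitoring_rule(self, rule_id, async_req=False):
        if async_req:
            # The caller resolves the future with .result(), analogous to
            # thread.get() in the docstrings above.
            return self._executor.submit(self._call_endpoint, rule_id)
        return self._call_endpoint(rule_id)


api = SketchApi()
future = api.update_security_monitoring_rule("abc-123", async_req=True)
print(future.result()["id"])  # abc-123
```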

# --- File: Practice_problems/Edabit/Edaaaaabit.py (repo: riyabhatia26/Python-Programming, MIT) ---
# Edaaaaabit
# Write a function that takes an integer and returns a string with the given number of "a"s in Edabit.
def how_many_times(num):
    return 'Ed{0}bit'.format("a" * num)
print(how_many_times(5)) #Edaaaaabit
print(how_many_times(0)) #Edbit
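# The two printed checks can also be pinned down as assertions (a sketch, not part
# of the original exercise; the function is repeated so the snippet is self-contained):

```python
def how_many_times(num):
    # Same one-liner as above, repeated so this snippet runs on its own.
    return 'Ed{0}bit'.format("a" * num)


assert how_many_times(5) == 'Edaaaaabit'
assert how_many_times(0) == 'Edbit'
assert how_many_times(1) == 'Edabit'
```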

# --- File: Count_Fre.py (repo: jasonwho321/Guangxin, MIT) ---
import pandas as pd
import xlwings as xw
import math
def Get_Inven():
    df = pd.read_csv(r'E:\OneDrive\露露\22-03-15-Inventorys (2)(1).csv')
    df1 = pd.DataFrame(df, columns=['Goods Name', 'Packing Size', 'Gross Weight'])
    df_2 = df1['Packing Size'].str.split('*', expand=True)
    df1 = pd.concat([df1, df_2], axis=1)
    df1 = df1.rename(columns={0: 'Long', 1: 'Width', 2: 'Height'})
    df1 = pd.DataFrame(df1, columns=['Goods Name', 'Packing Size', 'Gross Weight', 'Long', 'Width', 'Height'])
    return df1
def Get_Inven1():
    df = pd.read_csv(r'E:\OneDrive\露露\邮费.csv')
    df1 = pd.DataFrame(df, columns=['Goods Name', 'Packing Size', 'Gross Weight', 'Long', 'Width', 'Height'])
    return df, df1
def Get_Fed1_price():
    app = xw.App(visible=True, add_book=False)
    book = app.books.open(r'E:\OneDrive\露露\物流对外报价\20210118 lsq+phl 对外报价-Noya提供.xlsx')
    sheet = book.sheets[0]
    value = sheet.range('B5:J155').options(pd.DataFrame, header=1, index=0).value
    book.close()
    app.quit()
    return value
def Get_UPS_Price():
    app = xw.App(visible=True, add_book=False)
    book = app.books.open(r'E:\OneDrive\露露\220128 ups 报价--rancho ontario atlanta us lax us ccp us aus us jfk us alt .xlsx')
    sheet = book.sheets[0]
    value = sheet.range('A2:I152').options(pd.DataFrame, header=1, index=0).value
    book.close()
    app.quit()
    return value
def Coun_UPS_Fre(df1, value):
    fre_list = []
    for i in range(len(df1)):
        print(i)
        dic = df1.iloc[i].to_dict()
        freight = 0.00
        L = float(dic['Long'])
        W = float(dic['Width'])
        H = float(dic['Height'])
        GW = float(dic['Gross Weight'])
        LWH_list = [L, W, H]
        LWH_list.sort()
        try:
            if GW > 67.5 or LWH_list[-1] > 274 or (L + 2 * (W + H)) > 419:
                freight = 9999.00
            else:
                GWLBS = GW / 0.45
                VOL = (L * 0.3937 * W * 0.3937 * H * 0.3937) / 139
                WG_list = [GWLBS, VOL]
                WG_list.sort()
                FinalGW = WG_list[-1]
                FinalGW = math.ceil(FinalGW)
                if GWLBS > 50.0:
                    Over_Wet = 20.4
                else:
                    Over_Wet = 0.0
                if LWH_list[-1] > 121.0 or LWH_list[-2] > 76.0 or 330.0 >= (L + 2 * (W + H)) > 266.0:
                    Over_lon = 10.7
                else:
                    Over_lon = 0.0
                if 419.0 >= (L + 2 * (W + H)) > 330.0:
                    Over_Vol = 110.0
                    if Over_lon != 0.0 and FinalGW < 90:
                        FinalGW = 90
                else:
                    Over_Vol = 0.0
                house_fee = 3.8
                base_Fre = value.loc[value['≤磅/lbs'] == float(FinalGW), 'Zone 8']
                base_Fre = float(base_Fre.iloc[0])
                add_up = base_Fre + Over_Vol + Over_Wet + Over_lon + house_fee
                Fuel_add = add_up * 0.13
                freight = freight + Fuel_add + add_up
        except Exception:
            freight = "NA"
        fre_list.append(freight)
    df1['UPS_Freight'] = fre_list
    df1.to_csv(r'E:\OneDrive\露露\邮费.csv')
def Coun_Fedex_Fre(df1, value):
    fre_list = []
    for i in range(len(df1)):
        print(i)
        dic = df1.iloc[i].to_dict()
        freight = 0.00
        L = float(dic['Long'])
        W = float(dic['Width'])
        H = float(dic['Height'])
        GW = float(dic['Gross Weight'])
        LWH_list = [L, W, H]
        LWH_list.sort()
        try:
            if GW > 67.5 or LWH_list[-1] > 274 or (LWH_list[-1] + 2 * (LWH_list[-2] + LWH_list[-3])) > 419:
                freight = 9999.00
            else:
                GWLBS = GW / 0.45
                VOL = (L * 0.3937 * W * 0.3937 * H * 0.3937) / 250
                WG_list = [GWLBS, VOL]
                WG_list.sort()
                FinalGW = WG_list[-1]
                FinalGW = math.ceil(FinalGW)
                Over_Wet = 0.0
                if LWH_list[-1] > 243.84 or (LWH_list[-1] + 2 * (LWH_list[-2] + LWH_list[-3])) > 330.0:
                    Over_Vol = 48.4
                    if FinalGW < 90:
                        FinalGW = 90
                else:
                    Over_Vol = 0.0
                AHS = 0.0
                if (LWH_list[-1] + 2 * (LWH_list[-2] + LWH_list[-3])) < 330.0:
                    if GWLBS > 50:
                        AHS += 11.55
                    if LWH_list[-1] > 122 or LWH_list[-2] > 76.2 or (L + 2 * (W + H)) * 0.3937 > 267:
                        AHS += 8.8
                house_fee = 4.95
                base_Fre = value.loc[value['≤磅/lbs'] == float(FinalGW), 'Zone 8']
                base_Fre = float(base_Fre.iloc[0])
                # AHS is computed above but was never included in the original
                # ("Over_Wet++house_fee" was a stray double plus); add it here
                add_up = base_Fre + Over_Vol + Over_Wet + AHS + house_fee
                freight = freight + add_up
        except Exception:
            freight = "NA"
        fre_list.append(freight)
    df1['Fedex_Freight'] = fre_list
    df1.to_csv(r'E:\OneDrive\露露\邮费.csv')
def Coun_Fed1_Fre(df1, value, df):
    fre_list = []
    for i in range(len(df1)):
        print(i)
        dic = df1.iloc[i].to_dict()
        freight = 0.00
        L = float(dic['Long'])
        W = float(dic['Width'])
        H = float(dic['Height'])
        GW = float(dic['Gross Weight'])
        LWH_list = [L, W, H]
        LWH_list.sort()
        try:
            if GW > 67.5 or LWH_list[-1] > 274 or (L + 2 * (W + H)) > 419:
                freight = 9999.00
            else:
                GWLBS = GW / 0.45
                VOL = (L * 0.3937 * W * 0.3937 * H * 0.3937) / 194
                WG_list = [GWLBS, VOL]
                WG_list.sort()
                FinalGW = WG_list[-1]
                FinalGW = math.ceil(FinalGW)
                if GWLBS > 50.0:
                    Over_Wet = 17.05 + 5.39
                else:
                    Over_Wet = 0.0
                if LWH_list[-1] > 121.0 or LWH_list[-2] > 76.0 or 330.0 >= (L + 2 * (W + H)) > 266.0:
                    Over_lon = 10.45 + 5.39
                else:
                    Over_lon = 0.0
                if 419.0 >= (L + 2 * (W + H)) > 330.0 or LWH_list[-1] > 243.0:
                    Over_Vol = 71.5 + 29.15
                    if Over_lon != 0.0 and FinalGW < 90:
                        FinalGW = 90
                else:
                    Over_Vol = 0.0
                house_fee = 3.85
                base_Fre = value.loc[value['Lbs.'] == float(FinalGW), 'Zone 8']
                base_Fre = float(base_Fre.iloc[0])
                add_up = base_Fre + Over_Vol + Over_Wet + Over_lon + house_fee
                Fuel_add = add_up * 0.13
                freight = freight + Fuel_add + add_up
        except Exception:
            freight = "NA"
        fre_list.append(freight)
    df['Fed1_Freight'] = fre_list
    df.to_csv(r'E:\OneDrive\露露\邮费.csv')
def main_ups():
    df1 = Get_Inven()
    value = Get_UPS_Price()
    Coun_UPS_Fre(df1, value)
def main_Fed1():
    df, df1 = Get_Inven1()
    value = Get_Fed1_price()
    Coun_Fed1_Fre(df1, value, df)
if __name__ == '__main__':
    main_Fed1()
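# All three Coun_*_Fre functions apply the same billable-weight rule: convert
# centimetres to inches, compute dimensional weight with a carrier divisor (139 in
# the UPS branch, 250 and 194 in the FedEx branches), and bill the larger of actual
# weight in pounds and dimensional weight, rounded up. Isolated as a standalone
# sketch (the function name and example dimensions are illustrative):

```python
import math


def billable_weight_lbs(l_cm, w_cm, h_cm, gross_kg, dim_divisor=139):
    """Billable weight: the larger of actual weight (lbs) and dimensional weight."""
    actual_lbs = gross_kg / 0.45          # same kg-to-lbs factor the script uses
    l_in, w_in, h_in = (d * 0.3937 for d in (l_cm, w_cm, h_cm))
    dim_lbs = l_in * w_in * h_in / dim_divisor
    return math.ceil(max(actual_lbs, dim_lbs))


print(billable_weight_lbs(30, 30, 30, 1))   # 12: dimensional weight dominates
print(billable_weight_lbs(10, 10, 10, 20))  # 45: actual weight dominates
```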

# --- File: test/suite/out/E27.py (repo: shardros/autopep8, MIT) ---
#: Okay
True and False
#: E271
True and False
#: E272
True and False
#: E271
if 1:
    True and False
#: E273
True and False
#: E273 E274
True and False
#: E271
a and b
#: E271
1 and b
#: E271
a and 2
#: E271 E272
1 and b
#: E271 E272
a and 2
#: E272
this and False
#: E273
a and b
#: E274
a and b
#: E273 E274
this and False

# --- File: tests/test_logger.py (repo: TahaEntezari/ramstk, BSD-3-Clause) ---
# pylint: skip-file
# type: ignore
# -*- coding: utf-8 -*-
#
# tests.test_utilities.py is part of The RAMSTK Project
#
# All rights reserved.
# Copyright 2007 - 2017 Doyle Rowland doyle.rowland <AT> reliaqual <DOT> com
"""Test class for testing the Utilities module algorithms and models."""
# Standard Library Imports
import logging
import os
# Third Party Imports
import pytest
from pubsub import pub
# RAMSTK Package Imports
from ramstk import RAMSTKLogManager
class TestLogManager:
"""Test class for RAMSTKLogManager methods."""
def test_create_log_manager(self):
"""__init__() should create an instance of the RAMSTKLogManager."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
assert isinstance(DUT, RAMSTKLogManager)
assert isinstance(DUT.loggers["ramstk.logger"], logging.Logger)
assert DUT.log_file == _testlog
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_delete_fmea")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_action")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_cause")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_control")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_mechanism")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_mode")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_update_fmea")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_delete_function")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_delete_hazard")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_function")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_hazard")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_update_function")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_hardware")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_validation")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_stakeholder")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_revision")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_environment")
assert pub.isSubscribed(
DUT._do_log_fail_message, "fail_insert_failure_definition"
)
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_mission")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_mission_phase")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_requirement")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_opload")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_opstress")
assert pub.isSubscribed(DUT._do_log_fail_message, "fail_insert_test_method")
@pytest.mark.unit
def test_log_fail_messages(self):
"""_do_log_fail_message() should be called when fail_* messages are broadcast and log the associated error message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("DEBUG", "DEBUG", True)
pub.sendMessage(
"fail_delete_fmea",
error_message=("Attempted to delete non-existent " "FMEA element ID ax."),
)
pub.sendMessage(
"fail_update_fmea",
error_message=(
"Attempted to save non-existent FMEA " "element with FMEA ID ax."
),
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["DEBUG"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Attempted to delete non-existent FMEA element ID ax."
)
assert _lines[1].split(":", 5)[-1].strip() == (
"Attempted to save non-existent FMEA element with FMEA ID ax."
)
@pytest.mark.unit
def test_do_log_debug(self):
"""do_log_debug() should be called when the do_log_debug_msg message is broadcast and log the associated debug message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("DEBUG", "DEBUG", True)
pub.sendMessage(
"do_log_debug_msg",
logger_name="DEBUG",
message="Test DEBUG message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["DEBUG"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test DEBUG message sent and logged."
)
@pytest.mark.unit
def test_do_log_info(self):
"""do_log_info() should be called when the do_log_info_msg message is broadcast and log the associated debug message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("INFO", "INFO", True)
pub.sendMessage(
"do_log_info_msg",
logger_name="INFO",
message="Test INFO message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["INFO"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test INFO message sent and logged."
)
@pytest.mark.unit
def test_do_log_info_ignore_debug(self):
"""do_log_info() should not be called when the do_log_debug_msg message is broadcast and the log level is INFO."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("INFO", "INFO", True)
pub.sendMessage(
"do_log_debug_msg",
logger_name="INFO",
message="Test DEBUG message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["INFO"], logging.Logger)
assert _lines == []
@pytest.mark.unit
def test_do_log_info_higher_level_messages(self):
"""do_log_info() should log WARN, ERROR, and CRITICAL level information when it is an INFO log manager."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("INFO", "INFO")
pub.sendMessage(
"do_log_warning_msg",
logger_name="INFO",
message="Test WARN message sent and logged.",
)
pub.sendMessage(
"do_log_error_msg",
logger_name="INFO",
message="Test ERROR message sent and logged.",
)
pub.sendMessage(
"do_log_critical_msg",
logger_name="INFO",
message="Test CRITICAL message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["INFO"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test WARN message sent and logged."
)
assert _lines[1].split(":", 5)[-1].strip() == (
"Test ERROR message sent and logged."
)
assert _lines[2].split(":", 5)[-1].strip() == (
"Test CRITICAL message sent and logged."
)
@pytest.mark.unit
def test_do_log_warning(self):
"""do_log_warning() should be called when the do_log_warning_msg message is broadcast and log the associated debug message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("WARN", "WARN", True)
pub.sendMessage(
"do_log_warning_msg",
logger_name="WARN",
message="Test WARN message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["WARN"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test WARN message sent and logged."
)
@pytest.mark.unit
def test_do_log_warning_ignore_debug_info(self):
"""do_log_warning() should not log a debug or info message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("WARN", "WARN", True)
pub.sendMessage(
"do_log_debug_msg",
logger_name="WARN",
message="Test DEBUG message sent and logged.",
)
pub.sendMessage(
"do_log_info_msg",
logger_name="WARN",
message="Test INFO message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["WARN"], logging.Logger)
assert _lines == []
@pytest.mark.unit
def test_do_log_error(self):
"""do_log_error() should be called when the do_log_error_msg message is broadcast and log the associated debug message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("ERROR", "ERROR", True)
pub.sendMessage(
"do_log_error_msg",
logger_name="ERROR",
message="Test ERROR message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["ERROR"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test ERROR message sent and logged."
)
@pytest.mark.unit
def test_do_log_error_ignore_debug_info_warning(self):
"""do_log_warning() should not log a debug, info, or warning message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("ERROR", "ERROR", True)
pub.sendMessage(
"do_log_debug_msg",
logger_name="ERROR",
message="Test DEBUG message sent and logged.",
)
pub.sendMessage(
"do_log_info_msg",
logger_name="ERROR",
message="Test INFO message sent and logged.",
)
pub.sendMessage(
"do_log_warning_msg",
logger_name="ERROR",
message="Test WARN message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["ERROR"], logging.Logger)
assert _lines == []
@pytest.mark.unit
def test_do_log_critical(self):
"""do_log_critical() should be called when the do_log_critical_msg message is broadcast and log the associated debug message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("CRITICAL", "CRITICAL", True)
pub.sendMessage(
"do_log_critical_msg",
logger_name="CRITICAL",
message="Test CRITICAL message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["CRITICAL"], logging.Logger)
assert _lines[0].split(":", 5)[-1].strip() == (
"Test CRITICAL message sent and logged."
)
@pytest.mark.unit
def test_do_log_critical_ignore_debug_info_warning_error(self):
"""do_log_warning() should not log a debug, info, warning, or error message."""
_testlog = "./test_info.log"
if os.path.exists(_testlog):
os.remove(_testlog)
DUT = RAMSTKLogManager(_testlog)
DUT.do_create_logger("CRITICAL", "CRITICAL", True)
pub.sendMessage(
"do_log_debug_msg",
logger_name="CRITICAL",
message="Test DEBUG message sent and logged.",
)
pub.sendMessage(
"do_log_info_msg",
logger_name="CRITICAL",
message="Test INFO message sent and logged.",
)
pub.sendMessage(
"do_log_warning_msg",
logger_name="CRITICAL",
message="Test WARN message sent and logged.",
)
pub.sendMessage(
"do_log_error_msg",
logger_name="CRITICAL",
message="Test ERROR message sent and logged.",
)
_test_log = open(_testlog, "r")
_lines = _test_log.readlines()
assert isinstance(DUT.loggers["CRITICAL"], logging.Logger)
assert _lines == []
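# The pattern these tests exercise, a manager whose logging callbacks are subscribed
# to named broadcast topics, can be sketched without pypubsub or RAMSTK. TinyPub and
# MiniLogManager below are illustrative stand-ins, not the real APIs:

```python
import logging
import os
import tempfile


class TinyPub:
    """Minimal stand-in for pypubsub: maps topic names to subscriber callbacks."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, func, topic):
        self._subs.setdefault(topic, []).append(func)

    def send_message(self, topic, **kwargs):
        for func in self._subs.get(topic, []):
            func(**kwargs)


pub = TinyPub()


class MiniLogManager:
    """Routes broadcast failure messages into a log file, as the tests above expect."""

    def __init__(self, log_file, topics):
        self.log_file = log_file
        self.logger = logging.getLogger("mini")
        self.logger.setLevel(logging.DEBUG)
        self.logger.addHandler(logging.FileHandler(log_file))
        for topic in topics:
            pub.subscribe(self._do_log_fail_message, topic)

    def _do_log_fail_message(self, error_message):
        self.logger.debug(error_message)


fd, path = tempfile.mkstemp(suffix=".log")
os.close(fd)
manager = MiniLogManager(path, ["fail_insert_action"])
pub.send_message("fail_insert_action", error_message="Attempted to insert action.")
with open(path) as handle:
    print(handle.read().strip())  # Attempted to insert action.
```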

# --- File: app/html_parser.py (repo: s2t2/omniparser-prep-py, MIT) ---
print("PARSING AN HTML FILE...")

# --- File: src/greenbudget/lib/utils/__init__.py (repo: nickmflorin/django-proper-architecture-testing, MIT) ---
from .builtins import *  # noqa

# --- File: test_utils.py (repo: kaarne/splitgram, MIT) ---
from utils import parse_message, split_costs
def test_parse():
    assert parse_message(None) is None
    assert parse_message('lorem') is None
    assert parse_message('lorem 19.99') is None
    assert parse_message('99999') is None
    assert parse_message('-99999') is None
    assert parse_message(1) is None
    assert parse_message('19.99') == 19.99
    assert parse_message('19,99') == 19.99
    assert parse_message('19.99 lorem') == 19.99
def test_split():
assert split_costs(None) == {}
assert split_costs({}) == {}
assert split_costs({'a': 0, 'b': 0}) == {}
assert split_costs({'a': 10, 'b': 0}) == {'b': {'a': 5}}
assert split_costs({'a': 0, 'b': 10}) == {'a': {'b': 5}}
assert split_costs({'a': 10, 'b': 10}) == {}
assert split_costs({'a': 10.68888, 'b': 0}) == {'b': {'a': 5.34444}}
assert split_costs({'a': 0, 'b': 10.68888}) == {'a': {'b': 5.34444}}
assert split_costs({'a': -10, 'b': 10}) == {'a': {'b': 10}}
assert split_costs({'a': -10, 'b': -10}) == {}
assert split_costs({'a': 0, 'b': 200, 'c': 100}) == {'a': {'b': 100}}
assert split_costs({'a': 0, 'b': 100, 'c': 200}) == {'a': {'c': 100}}
assert split_costs({'a': 100, 'b': 200, 'c': 0}) == {'c': {'b': 100}}
assert split_costs({'a': 100, 'b': 0, 'c': 200}) == {'b': {'c': 100}}
assert split_costs({'a': 200, 'b': 100, 'c': 0}) == {'c': {'a': 100}}
assert split_costs({'a': 200, 'b': 0, 'c': 100}) == {'b': {'a': 100}}
assert split_costs({'a': 1000, 'b': 0, 'c': 600, 'd': 0}) == \
{'b': {'a': 400}, 'd': {'a': 200, 'c': 200}}
assert split_costs({'a': 100, 'b': 100, 'c': 0, 'd': 100, 'e': 100}) == \
{'c': {'a': 20.0, 'b': 20.0, 'd': 20.0, 'e': 20.0}}
assert split_costs({'a': 1, 'b': 0, 'c': 1, 'd': 0, 'e': 1, 'f': 0}) == \
{'b': {'a': 0.5}, 'd': {'c': 0.5}, 'f': {'e': 0.5}}
assert split_costs({'a': 0, 'b': 1, 'c': 0, 'd': 1, 'e': 0, 'f': 1}) == \
{'a': {'b': 0.5}, 'c': {'d': 0.5}, 'e': {'f': 0.5}}
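The `utils` module under test is not included here. Purely as an illustration, a `parse_message` reverse-engineered from the assertions above might look like this (hypothetical — the real implementation may differ):

```python
import re

def parse_message(message):
    # Hypothetical sketch inferred from the tests: accept only strings
    # that start with a decimal number (dot or comma as the separator)
    # and return it as a float; everything else yields None.
    if not isinstance(message, str):
        return None
    match = re.match(r'\d+[.,]\d+', message)
    if match is None:
        return None
    return float(match.group(0).replace(',', '.'))

assert parse_message('19,99') == 19.99
assert parse_message('lorem 19.99') is None
assert parse_message(1) is None
```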
| 45.681818 | 77 | 0.473632 | 326 | 2,010 | 2.819018 | 0.104294 | 0.228509 | 0.348205 | 0.332971 | 0.688792 | 0.652884 | 0.457019 | 0.254625 | 0.254625 | 0.254625 | 0 | 0.139119 | 0.220398 | 2,010 | 43 | 78 | 46.744186 | 0.447352 | 0 | 0 | 0 | 0 | 0 | 0.073134 | 0 | 0 | 0 | 0 | 0 | 0.805556 | 1 | 0.055556 | true | 0 | 0.027778 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
644be25440b0431b083c7883bf82ebafda7f81d2 | 551 | py | Python | review/double_for_loop_list_comprehension.py | Rorima/exercicios-python | ca78e2d2402c2aa90efd95ccaa620c0a8b42444f | [
"MIT"
] | null | null | null | review/double_for_loop_list_comprehension.py | Rorima/exercicios-python | ca78e2d2402c2aa90efd95ccaa620c0a8b42444f | [
"MIT"
] | null | null | null | review/double_for_loop_list_comprehension.py | Rorima/exercicios-python | ca78e2d2402c2aa90efd95ccaa620c0a8b42444f | [
"MIT"
] | null | null | null | """
# Normal for loop
my_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
for sub_list in my_list:
for number in sub_list:
print(number)
# List comprehension
my_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
numbers = [number for sub_list in my_list for number in sub_list]
for i in numbers: print(i)
# Maybe not a good practice
my_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
[print(number) for sub_list in my_list for number in sub_list]
"""
my_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
[print(number) for sub_list in my_list for number in sub_list]
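For comparison, the same flattening can be done without using a comprehension for its side effects — a small sketch with the standard library:

```python
import itertools

my_list = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# chain.from_iterable flattens exactly one level of nesting
flat = list(itertools.chain.from_iterable(my_list))
assert flat == [1, 2, 3, 4, 5, 6, 7, 8, 9]

# printing stays an explicit loop, the usual recommendation
for number in flat:
    print(number)
```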
| 23.956522 | 65 | 0.6098 | 112 | 551 | 2.857143 | 0.232143 | 0.15 | 0.0875 | 0.1 | 0.725 | 0.725 | 0.725 | 0.725 | 0.725 | 0.725 | 0 | 0.083527 | 0.217786 | 551 | 22 | 66 | 25.045455 | 0.658933 | 0.785844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
646f70e8f63f172c2d95fc5446123e23196c416e | 140 | py | Python | code/udls/__init__.py | acids-ircam/lottery_mir | 1440d717d7fd688ac43c1a406602aaf2d5a3842d | [
"MIT"
] | 10 | 2020-07-29T23:12:15.000Z | 2022-03-23T16:27:43.000Z | code/udls/__init__.py | acids-ircam/lottery_mir | 1440d717d7fd688ac43c1a406602aaf2d5a3842d | [
"MIT"
] | null | null | null | code/udls/__init__.py | acids-ircam/lottery_mir | 1440d717d7fd688ac43c1a406602aaf2d5a3842d | [
"MIT"
] | 1 | 2022-02-06T11:42:28.000Z | 2022-02-06T11:42:28.000Z | from .base_dataset import SimpleLMDBDataset
from .domain_adaptation import DomainAdaptationDataset
from .simple_dataset import SimpleDataset | 46.666667 | 54 | 0.9 | 15 | 140 | 8.2 | 0.666667 | 0.211382 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078571 | 140 | 3 | 55 | 46.666667 | 0.953488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
64c5a9c568e03e745ede2eb87d25d2ca722ca61d | 7,121 | gyp | Python | gyp/pixman_test.gyp | Frankie-666/color-emoji.skia | f1634a9952086155b9069d49ab91f1fa43b5ec6a | [
"BSD-3-Clause"
] | 2 | 2017-05-19T08:53:12.000Z | 2017-08-28T11:59:26.000Z | gyp/pixman_test.gyp | Frankie-666/color-emoji.skia | f1634a9952086155b9069d49ab91f1fa43b5ec6a | [
"BSD-3-Clause"
] | 2 | 2017-07-25T09:37:22.000Z | 2017-08-04T07:18:56.000Z | gyp/pixman_test.gyp | Frankie-666/color-emoji.skia | f1634a9952086155b9069d49ab91f1fa43b5ec6a | [
"BSD-3-Clause"
] | 2 | 2017-08-09T09:03:23.000Z | 2020-05-26T09:14:49.000Z | # GYP file to build unit tests.
{
'includes': [
'apptype_console.gypi',
'common.gypi',
],
'targets': [
{
'target_name': 'pixman_test',
'type': 'executable',
'mac_bundle' : 1,
'defines': [
'HAVE_CONFIG_H',
],
'include_dirs' : [
'../src/core',
'../experimental/pixman',
'../experimental/SimpleCocoaApp', # needed to get SimpleApp.h
],
'sources': [
'../experimental/pixman/config.h',
'../experimental/pixman/junk.cpp',
# '../../../pixman/demos/alpha-test.c',
'../../../pixman/demos/checkerboard.c',
# '../../../pixman/demos/clip-in.c',
# '../../../pixman/demos/clip-test.c',
# '../../../pixman/demos/composite-test.c',
# '../../../pixman/demos/convolution-test.c',
# '../../../pixman/demos/gradient-test.c',
# '../../../pixman/demos/gtk-utils.c',
# '../../../pixman/demos/parrot.c',
# '../../../pixman/demos/quad2quad.c',
# '../../../pixman/demos/radial-test.c',
# '../../../pixman/demos/screen-test.c',
# '../../../pixman/demos/srgb-test.c',
# '../../../pixman/demos/srgb-trap-test.c',
# '../../../pixman/demos/trap-test.c',
# '../../../pixman/demos/tri-test.c',
'../../../pixman/demos/gtk-utils.h',
# '../../../pixman/test/a1-trap-test.c',
# '../../../pixman/test/affine-test.c',
# '../../../pixman/test/alpha-loop.c',
# '../../../pixman/test/alphamap.c',
# '../../../pixman/test/blitters-test.c',
# '../../../pixman/test/combiner-test.c',
# '../../../pixman/test/composite-traps-test.c',
# '../../../pixman/test/composite.c',
# '../../../pixman/test/fetch-test.c',
# '../../../pixman/test/glyph-test.c',
# '../../../pixman/test/gradient-crash-test.c',
# '../../../pixman/test/infinite-loop.c',
# '../../../pixman/test/lowlevel-blt-bench.c',
# '../../../pixman/test/oob-test.c',
# '../../../pixman/test/pdf-op-test.c',
# '../../../pixman/test/region-contains-test.c',
# '../../../pixman/test/region-test.c',
# '../../../pixman/test/region-translate-test.c',
# '../../../pixman/test/rotate-test.c',
# '../../../pixman/test/scaling-crash-test.c',
# '../../../pixman/test/scaling-helpers-test.c',
# '../../../pixman/test/scaling-test.c',
# '../../../pixman/test/stress-test.c',
# '../../../pixman/test/trap-crasher.c',
'../../../pixman/test/utils.c',
'../../../pixman/test/utils.h',
'../../../pixman/pixman/pixman-access-accessors.c',
'../../../pixman/pixman/pixman-access.c',
# '../../../pixman/pixman/pixman-arm-neon.c',
# '../../../pixman/pixman/pixman-arm-simd.c',
'../../../pixman/pixman/pixman-arm.c',
'../../../pixman/pixman/pixman-bits-image.c',
'../../../pixman/pixman/pixman-combine-float.c',
'../../../pixman/pixman/pixman-combine32.c',
'../../../pixman/pixman/pixman-conical-gradient.c',
'../../../pixman/pixman/pixman-edge-accessors.c',
'../../../pixman/pixman/pixman-edge.c',
'../../../pixman/pixman/pixman-fast-path.c',
'../../../pixman/pixman/pixman-general.c',
'../../../pixman/pixman/pixman-glyph.c',
'../../../pixman/pixman/pixman-gradient-walker.c',
'../../../pixman/pixman/pixman-image.c',
'../../../pixman/pixman/pixman-implementation.c',
'../../../pixman/pixman/pixman-linear-gradient.c',
'../../../pixman/pixman/pixman-matrix.c',
# '../../../pixman/pixman/pixman-mips-dspr2.c',
'../../../pixman/pixman/pixman-mips.c',
'../../../pixman/pixman/pixman-mmx.c',
'../../../pixman/pixman/pixman-noop.c',
'../../../pixman/pixman/pixman-ppc.c',
'../../../pixman/pixman/pixman-radial-gradient.c',
# '../../../pixman/pixman/pixman-region.c',
'../../../pixman/pixman/pixman-region16.c',
'../../../pixman/pixman/pixman-region32.c',
'../../../pixman/pixman/pixman-solid-fill.c',
'../../../pixman/pixman/pixman-sse2.c',
'../../../pixman/pixman/pixman-timer.c',
'../../../pixman/pixman/pixman-trap.c',
'../../../pixman/pixman/pixman-utils.c',
# '../../../pixman/pixman/pixman-vmx.c',
'../../../pixman/pixman/pixman-x86.c',
'../../../pixman/pixman/pixman.c',
# '../../../pixman/pixman/pixman-arm-neon-asm-bilinear.S',
# '../../../pixman/pixman/pixman-arm-neon-asm.S',
# '../../../pixman/pixman/pixman-arm-simd-asm.S',
# '../../../pixman/pixman/pixman-mips-dspr2-asm.S',
# '../../../pixman/pixman/pixman-mips-memcpy-asm.S',
'../../../pixman/pixman/loongson-mmintrin.h',
'../../../pixman/pixman/pixman-accessor.h',
'../../../pixman/pixman/pixman-arm-common.h',
'../../../pixman/pixman/pixman-arm-neon-asm.h',
'../../../pixman/pixman/pixman-combine32.h',
'../../../pixman/pixman/pixman-compiler.h',
'../../../pixman/pixman/pixman-edge-imp.h',
'../../../pixman/pixman/pixman-inlines.h',
'../../../pixman/pixman/pixman-mips-dspr2-asm.h',
'../../../pixman/pixman/pixman-mips-dspr2.h',
'../../../pixman/pixman/pixman-private.h',
'../../../pixman/pixman/pixman.h',
],
'dependencies': [
'skia_base_libs.gyp:skia_base_libs',
'effects.gyp:effects',
'experimental.gyp:experimental',
'images.gyp:images',
'pdf.gyp:pdf',
'views.gyp:views',
'xml.gyp:xml',
],
'conditions': [
[ 'skia_os in ["linux", "freebsd", "openbsd", "solaris"]', {
}],
[ 'skia_os == "win"', {
}],
[ 'skia_os == "mac"', {
'sources': [
# Mac files
'../src/views/mac/SkEventNotifier.h',
'../src/views/mac/SkEventNotifier.mm',
'../src/views/mac/skia_mac.mm',
'../src/views/mac/SkNSView.h',
'../src/views/mac/SkNSView.mm',
'../src/views/mac/SkOptionsTableView.h',
'../src/views/mac/SkOptionsTableView.mm',
'../src/views/mac/SkOSWindow_Mac.mm',
'../src/views/mac/SkTextFieldCell.h',
'../src/views/mac/SkTextFieldCell.m',
],
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/QuartzCore.framework',
'$(SDKROOT)/System/Library/Frameworks/OpenGL.framework',
],
'xcode_settings' : {
'INFOPLIST_FILE' : '../experimental/Intersection/EdgeDemoApp-Info.plist',
},
'mac_bundle_resources' : [
'../experimental/Intersection/EdgeDemoApp.xib',
],
}],
],
'msvs_settings': {
'VCLinkerTool': {
'SubSystem': '2',
'AdditionalDependencies': [
'd3d9.lib',
],
},
},
},
],
}
# Local Variables:
# tab-width:2
# indent-tabs-mode:nil
# End:
# vim: set expandtab tabstop=2 shiftwidth=2:
| 40.005618 | 85 | 0.491925 | 709 | 7,121 | 4.911142 | 0.25952 | 0.361861 | 0.268811 | 0.196439 | 0.272832 | 0.078978 | 0 | 0 | 0 | 0 | 0 | 0.00434 | 0.223424 | 7,121 | 177 | 86 | 40.231638 | 0.625316 | 0.353742 | 0 | 0.134454 | 0 | 0 | 0.661532 | 0.567121 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
64ce01c255b10789c6dc0565e482dde551e8eee4 | 4,988 | py | Python | tests/functional/test_user_client.py | orached/itora_tuto | 8de36d834fc7ef2dc8895ec7ac048fb420de76e3 | [
"MIT"
] | null | null | null | tests/functional/test_user_client.py | orached/itora_tuto | 8de36d834fc7ef2dc8895ec7ac048fb420de76e3 | [
"MIT"
] | 3 | 2020-03-24T18:03:03.000Z | 2021-02-02T22:23:27.000Z | tests/functional/test_user_client.py | orached/itora_tuto | 8de36d834fc7ef2dc8895ec7ac048fb420de76e3 | [
"MIT"
] | null | null | null | import pytest
def test_edit_profile(setUp, populate_db):
"""
GIVEN an anonymous User
WHEN he tries to access to profile edition page
THEN he is redirected to login page
"""
response = setUp.get('/edit_profile')
assert response.status_code == 302
response = setUp.get('/edit_profile', follow_redirects=True)
assert b"Please log in to access this page." in response.data
"""
GIVEN an authenticated User
WHEN he access to profile edition page and post a modification
THEN the profile modification is correctly processed
"""
# login with john's account
response = setUp.post('/auth/login', data={
'username': 'john',
'password': 'cat'
}, follow_redirects=True)
#john change his username to jojo
    # (the comment above: john changes his username to jojo)
response = setUp.post('/edit_profile', data={
'username': 'jojo'
}, follow_redirects=True)
assert response.status_code == 200
assert "Vos modifications ont été enregistrées." in response.data.decode("utf-8")
def test_follow_user(setUp, populate_db):
"""
GIVEN an anonymous User
WHEN he tries to access follow user page
THEN he is redirected to login page
"""
response = setUp.get('/follow/susan', follow_redirects=True)
assert b"Please log in to access this page." in response.data
# login with john's account
response = setUp.post('/auth/login', data={
'username': 'john',
'password': 'cat'
}, follow_redirects=True)
"""
GIVEN an authenticated User
WHEN he tries to follow himself
THEN an error message is displayed
"""
response = setUp.get('/follow/john', follow_redirects=True)
assert b"Vous ne pouvez pas vous suivre !" in response.data
"""
GIVEN an authenticated User
WHEN he tries to follow an inexistent user
THEN an error message is displayed
"""
response = setUp.get('/follow/inexistent', follow_redirects=True)
assert b"Utilisateur inexistent introuvable." in response.data
"""
GIVEN an authenticated User
WHEN he tries to follow a different user
THEN it's processed correctly
"""
response = setUp.get('/follow/susan', follow_redirects=True)
assert b"Vous suivez maintenant susan !" in response.data
def test_unfollow_user(setUp, populate_db):
"""
GIVEN an anonymous User
WHEN he tries to access unfollow user page
THEN he is redirected to login page
"""
response = setUp.get('/unfollow/susan', follow_redirects=True)
assert b"Please log in to access this page." in response.data
# login with john's account
response = setUp.post('/auth/login', data={
'username': 'john',
'password': 'cat'
}, follow_redirects=True)
"""
GIVEN an authenticated User
WHEN he tries to unfollow himself
THEN an error message is displayed
"""
response = setUp.get('/unfollow/john', follow_redirects=True)
assert b"Vous ne pouvez pas vous suivre !" in response.data
"""
GIVEN an authenticated User
WHEN he tries to unfollow an inexistent user
THEN an error message is displayed
"""
response = setUp.get('/unfollow/inexistent', follow_redirects=True)
assert b"Utilisateur inexistent introuvable." in response.data
"""
GIVEN an authenticated User
WHEN he tries to unfollow a different user
THEN it's processed correctly
"""
response = setUp.get('/unfollow/susan', follow_redirects=True)
assert b"Vous ne suivez plus susan." in response.data
def test_send_message(setUp, populate_db):
"""
GIVEN an anonymous User
WHEN he tries to send a message
THEN he is redirected to login page
"""
response = setUp.get('/send_message/susan', follow_redirects=True)
assert b"Please log in to access this page." in response.data
# login with john's account
response = setUp.post('/auth/login', data={
'username': 'john',
'password': 'cat'
}, follow_redirects=True)
"""
GIVEN an authenticated User
WHEN he tries to send a message to a different user
THEN it's processed correctly
"""
response = setUp.post('/send_message/susan', data={
'message': 'Hi susan! I like your posts.'
}, follow_redirects=True)
assert "Votre message a été envoyé." in response.data.decode("utf-8")
response = setUp.get('/auth/logout', follow_redirects=True)
"""
GIVEN an authenticated User
WHEN he receives a new message
THEN it appears in navigation bar and he can read the message in messages page
"""
# login with susan's account
response = setUp.post('/auth/login', data={
'username': 'susan',
'password': 'dog'
}, follow_redirects=True)
response = setUp.get('/notifications')
assert b"\"data\":1" in response.data
response = setUp.get('/messages')
assert b"john" in response.data
assert b"Hi susan! I like your posts." in response.data | 33.253333 | 85 | 0.6668 | 664 | 4,988 | 4.953313 | 0.159639 | 0.083004 | 0.103983 | 0.091213 | 0.779264 | 0.754941 | 0.711766 | 0.708726 | 0.701429 | 0.659167 | 0 | 0.002362 | 0.236167 | 4,988 | 150 | 86 | 33.253333 | 0.860892 | 0.113673 | 0 | 0.516129 | 0 | 0 | 0.284386 | 0 | 0 | 0 | 0 | 0 | 0.274194 | 1 | 0.064516 | false | 0.080645 | 0.016129 | 0 | 0.080645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
b3a480dfb1c245c67d204cd0f5a6d6680af79046 | 381 | py | Python | rasa_chinese/nlu/__init__.py | wegylexy/rasa_chinese | 76ecd8b6bc7b518b9d8c416668fa2a0ad75bdd37 | [
"Apache-2.0"
] | 80 | 2020-12-28T06:33:01.000Z | 2022-03-30T09:02:19.000Z | rasa_chinese/nlu/__init__.py | AlexAuthor7/rasa_chinese | 76ecd8b6bc7b518b9d8c416668fa2a0ad75bdd37 | [
"Apache-2.0"
] | 4 | 2021-08-20T10:30:22.000Z | 2022-03-14T05:43:22.000Z | rasa_chinese/nlu/__init__.py | AlexAuthor7/rasa_chinese | 76ecd8b6bc7b518b9d8c416668fa2a0ad75bdd37 | [
"Apache-2.0"
] | 24 | 2020-12-28T08:36:17.000Z | 2022-03-29T11:11:41.000Z | from rasa_chinese.nlu.tokenizers import LanguageModelTokenizer
from rasa_chinese.nlu.featurizers import BertTextFeaturizer, BertCharFeaturizer
from rasa_chinese.nlu.extractors import BilstmCrfTensorFlowEntityExtractor
from rasa_chinese.nlu.classifiers import TextCnnTensorFlowClassifier, DenseNetworkTensorFlowClassifier
from rasa_chinese.nlu.utils import TensorflowNLP, PaddleNLP
| 63.5 | 102 | 0.905512 | 38 | 381 | 8.947368 | 0.473684 | 0.117647 | 0.220588 | 0.264706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060367 | 381 | 5 | 103 | 76.2 | 0.949721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b3e453e48d6aa2c3047c8c08381560a0912905e4 | 931 | py | Python | correct_python_programs/find_first_in_sorted.py | PatrickShaw/QuixBugs | 5a2eb2987fdac12860b526ffa92a57e5831fd639 | [
"MIT"
] | 22 | 2018-01-29T01:56:30.000Z | 2022-03-21T12:25:40.000Z | correct_python_programs/find_first_in_sorted.py | zixifan/QuixBugs | 5a2eb2987fdac12860b526ffa92a57e5831fd639 | [
"MIT"
] | 31 | 2017-12-18T21:04:34.000Z | 2022-02-21T07:38:09.000Z | correct_python_programs/find_first_in_sorted.py | zixifan/QuixBugs | 5a2eb2987fdac12860b526ffa92a57e5831fd639 | [
"MIT"
] | 19 | 2018-01-06T14:18:33.000Z | 2022-03-21T12:25:43.000Z |
def find_first_in_sorted(arr, x):
lo = 0
hi = len(arr)
while lo < hi:
mid = (lo + hi) // 2
if x == arr[mid] and (mid == 0 or x != arr[mid - 1]):
return mid
elif x <= arr[mid]:
hi = mid
else:
lo = mid + 1
return -1
"""
def find_first_in_sorted(arr, x):
lo = 0
hi = len(arr)
while lo <= hi - 1:
mid = (lo + hi) // 2
if x == arr[mid] and (mid == 0 or x != arr[mid - 1]):
return mid
elif x <= arr[mid]:
hi = mid
else:
lo = mid + 1
return -1
def find_first_in_sorted(arr, x):
lo = 0
hi = len(arr)
while lo + 1 <= hi:
mid = (lo + hi) // 2
if x == arr[mid] and (mid == 0 or x != arr[mid - 1]):
return mid
elif x <= arr[mid]:
hi = mid
else:
lo = mid + 1
return -1
"""
| 16.051724 | 61 | 0.398496 | 137 | 931 | 2.642336 | 0.153285 | 0.099448 | 0.174033 | 0.116022 | 0.994475 | 0.994475 | 0.994475 | 0.994475 | 0.994475 | 0.994475 | 0 | 0.03992 | 0.461869 | 931 | 57 | 62 | 16.333333 | 0.682635 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3778f689c2fa9b1e20b9906365d2728570133b57 | 165 | py | Python | temboo/core/Library/Google/Gmailv2/Attachments/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Google/Gmailv2/Attachments/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | null | null | null | temboo/core/Library/Google/Gmailv2/Attachments/__init__.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | [
"Apache-2.0"
] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z | from temboo.Library.Google.Gmailv2.Attachments.GetAttachment import GetAttachment, GetAttachmentInputSet, GetAttachmentResultSet, GetAttachmentChoreographyExecution
| 82.5 | 164 | 0.909091 | 12 | 165 | 12.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006329 | 0.042424 | 165 | 1 | 165 | 165 | 0.943038 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
37bfa95565a184fa817c3a14829ffc6b57be54f2 | 1,707 | py | Python | Curso-em-video/Aula_102.py | JhonAI13/Curso_python | 27dedb0effa2c26140f46392e993b8e7a27d6eb3 | [
"MIT"
] | null | null | null | Curso-em-video/Aula_102.py | JhonAI13/Curso_python | 27dedb0effa2c26140f46392e993b8e7a27d6eb3 | [
"MIT"
] | null | null | null | Curso-em-video/Aula_102.py | JhonAI13/Curso_python | 27dedb0effa2c26140f46392e993b8e7a27d6eb3 | [
"MIT"
] | null | null | null | # """ 102: Crie um programa que tenha uma função fatorial() que receba dois parâmetros:
# o primeiro que indique o número a calcular e outro chamado show, que será um valor
# lógico (opcional) indicando se será mostrado ou não na tela o processo de cálculo do fatorial."""
#
#
# def fatorial(n, show=False):
# """
# -> e uma função que dis a fatorial.
# :param n: o numero a fatorar
# :param show: Se você quer que apareça o historico
# :return: sem retorno
# """
# c = n
# if show:
# print('{}!'.format(c), "=", end='')
# print(' {}'.format(c), 'X', end='')
# for x in range(n - 1):
# c -= 1
# n = n * c
# if show:
# print(' {} '.format(c), end='')
# if c != 1:
# print('X', end='')
# if c == 1:
# print('=', end='')
# print(n)
#
#
# fatorial(6, show=True)
# fatorial(8)
# help(fatorial)
#
""" 102: Crie um programa que tenha uma função fatorial() que receba dois parâmetros:
o primeiro que indique o número a calcular e outro chamado show, que será um valor
lógico (opcional) indicando se será mostrado ou não na tela o processo de cálculo do fatorial."""
def fatorial(n, show=False):
"""
-> e uma função que dis a fatorial.
:param n: o numero a fatorar
:param show: Se você quer que apareça o historico
:return: sem retorno
"""
f = 1
for c in range(n, 0, -1):
if show:
print(c, end='')
if c > 1:
print(' x ', end='')
else:
print(' = ', end='')
f *= c
return f
print(fatorial(5, show=True))
print(fatorial(6))
help(fatorial)
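A quick check of the factorial above (restated so the snippet runs on its own; with show left off, nothing is printed):

```python
def fatorial(n, show=False):
    # Same logic as above: multiply n, n-1, ..., 1, optionally
    # printing the steps along the way.
    f = 1
    for c in range(n, 0, -1):
        if show:
            print(c, end='')
            print(' x ' if c > 1 else ' = ', end='')
        f *= c
    return f

assert fatorial(5) == 120
assert fatorial(1) == 1
assert fatorial(0) == 1  # empty product
```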
| 27.532258 | 101 | 0.537786 | 238 | 1,707 | 3.857143 | 0.289916 | 0.039216 | 0.035948 | 0.022876 | 0.805011 | 0.795207 | 0.753813 | 0.753813 | 0.716776 | 0.716776 | 0 | 0.015557 | 0.322203 | 1,707 | 61 | 102 | 27.983607 | 0.777874 | 0.746924 | 0 | 0 | 0 | 0 | 0.016484 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0 | 0 | 0.142857 | 0.357143 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
80b70ae0a897a32464af41608c3d43ab3ab75ff6 | 112 | py | Python | myflaskapp/dashboard/utils.py | RubyHome/Parking-Dev | 20f5e1c44160ecc80b65ef0df63f1cd96e2de564 | [
"BSD-3-Clause"
] | null | null | null | myflaskapp/dashboard/utils.py | RubyHome/Parking-Dev | 20f5e1c44160ecc80b65ef0df63f1cd96e2de564 | [
"BSD-3-Clause"
] | null | null | null | myflaskapp/dashboard/utils.py | RubyHome/Parking-Dev | 20f5e1c44160ecc80b65ef0df63f1cd96e2de564 | [
"BSD-3-Clause"
] | null | null | null | import time
def timestamp():
"""Return the current timestamp as an integer."""
return int(time.time())
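A small usage sketch of the helper above:

```python
import time

def timestamp():
    """Return the current timestamp as an integer."""
    return int(time.time())

# Truncates the float from time.time() to whole seconds.
ts = timestamp()
assert isinstance(ts, int)
assert ts > 0
```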
| 18.666667 | 53 | 0.669643 | 15 | 112 | 5 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.196429 | 112 | 5 | 54 | 22.4 | 0.833333 | 0.383929 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
039cc1e6af2a53fe9969ccda01bf2c70d70fc648 | 101 | py | Python | test/test_example.py | freakleesin/CSC510-Group35-p1 | 554dd1e68e2da8da06b2b4a1154193038a48e0cc | [
"MIT"
] | null | null | null | test/test_example.py | freakleesin/CSC510-Group35-p1 | 554dd1e68e2da8da06b2b4a1154193038a48e0cc | [
"MIT"
] | null | null | null | test/test_example.py | freakleesin/CSC510-Group35-p1 | 554dd1e68e2da8da06b2b4a1154193038a48e0cc | [
"MIT"
] | 2 | 2021-08-29T19:17:04.000Z | 2021-08-29T19:56:11.000Z | from code.myfunc import mymul
# content of test/test_example.py
def test_answer():
assert mymul(3, 4) == 12
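The `code.myfunc` module is not shown; judging from the single assertion, `mymul` is presumably plain multiplication — a hypothetical stand-in:

```python
def mymul(a, b):
    # Hypothetical stand-in for code.myfunc.mymul, inferred from the
    # assertion in the test above.
    return a * b

assert mymul(3, 4) == 12
```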
| 16.833333 | 29 | 0.693069 | 17 | 101 | 4.058824 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049383 | 0.19802 | 101 | 5 | 30 | 20.2 | 0.802469 | 0.188119 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
03b929ebbde0de84dff53c4f36912142134ad027 | 984 | py | Python | result.py | lidongyv/Reppoint-Tracking | 81b81e921f6b905e68aba117ffc4fca8ffcfcfd6 | [
"MIT"
] | null | null | null | result.py | lidongyv/Reppoint-Tracking | 81b81e921f6b905e68aba117ffc4fca8ffcfcfd6 | [
"MIT"
] | null | null | null | result.py | lidongyv/Reppoint-Tracking | 81b81e921f6b905e68aba117ffc4fca8ffcfcfd6 | [
"MIT"
] | null | null | null | reppoints_moment_r101_dcn_fpn_2x.pth
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.416
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.620
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.453
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.245
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.463
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.541
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.342
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.547
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.582
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.379
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.631
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.742 | 75.692308 | 78 | 0.607724 | 171 | 984 | 3.467836 | 0.222222 | 0.080944 | 0.111298 | 0.118044 | 0.876897 | 0.779089 | 0.738617 | 0.738617 | 0.738617 | 0.541315 | 0 | 0.194839 | 0.212398 | 984 | 13 | 79 | 75.692308 | 0.570323 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
03e34dfe542a961ff776e7485c6352a8e1796794 | 35 | py | Python | center/app/networker/processors/__init__.py | netSensTeam/netSens | 7ab5f41a7103e6c86aa6cb2eff3df68c301e48c1 | [
"MIT"
] | null | null | null | center/app/networker/processors/__init__.py | netSensTeam/netSens | 7ab5f41a7103e6c86aa6cb2eff3df68c301e48c1 | [
"MIT"
] | 3 | 2021-05-10T13:50:55.000Z | 2022-03-02T08:12:46.000Z | center/app/networker/processors/__init__.py | netSensTeam/netSens | 7ab5f41a7103e6c86aa6cb2eff3df68c301e48c1 | [
"MIT"
] | null | null | null | from processors._processor import * | 35 | 35 | 0.857143 | 4 | 35 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 35 | 1 | 35 | 35 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
03e99a1e90e85456a76ee060598c0fae8af109ea | 175 | py | Python | storage_integration/storage_integration/doctype/storage_integration_settings/test_storage_integration_settings.py | rutwikhdev/storage_integration | 0765a6506d703404a8d6df2d5a6920b6ab7f8cf3 | [
"MIT"
] | 8 | 2022-02-25T13:29:08.000Z | 2022-03-25T12:42:34.000Z | storage_integration/storage_integration/doctype/storage_integration_settings/test_storage_integration_settings.py | rutwikhdev/storage_integration | 0765a6506d703404a8d6df2d5a6920b6ab7f8cf3 | [
"MIT"
] | 1 | 2022-03-28T07:47:16.000Z | 2022-03-28T07:47:16.000Z | storage_integration/storage_integration/doctype/storage_integration_settings/test_storage_integration_settings.py | rutwikhdev/storage_integration | 0765a6506d703404a8d6df2d5a6920b6ab7f8cf3 | [
"MIT"
] | 6 | 2022-02-25T19:28:05.000Z | 2022-03-26T03:00:49.000Z | # Copyright (c) 2022, Frappe Technologies and Contributors
# See license.txt
# import frappe
import unittest
class TestStorageIntegrationSettings(unittest.TestCase):
pass
| 17.5 | 58 | 0.805714 | 19 | 175 | 7.421053 | 0.842105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026316 | 0.131429 | 175 | 9 | 59 | 19.444444 | 0.901316 | 0.491429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
ff1a7a3dfb9aad253083dc13c885793d10881850 | 148 | py | Python | behave/shut_down.py | flavioesposito/BeA | 1f5f8b60f17940d2da96b699f0391b0febf771aa | [
"MIT"
] | null | null | null | behave/shut_down.py | flavioesposito/BeA | 1f5f8b60f17940d2da96b699f0391b0febf771aa | [
"MIT"
] | null | null | null | behave/shut_down.py | flavioesposito/BeA | 1f5f8b60f17940d2da96b699f0391b0febf771aa | [
"MIT"
] | 2 | 2018-04-18T17:54:34.000Z | 2022-02-09T07:34:34.000Z | import os
# Flush all existing iptables rules, then set every chain's default
# policy to DROP, cutting off inbound, outbound, and forwarded traffic.
os.system('iptables -F')
os.system('iptables -P INPUT DROP')
os.system('iptables -P OUTPUT DROP')
os.system('iptables -P FORWARD DROP')
| 18.5 | 37 | 0.716216 | 24 | 148 | 4.416667 | 0.416667 | 0.301887 | 0.603774 | 0.481132 | 0.396226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121622 | 148 | 7 | 38 | 21.142857 | 0.815385 | 0 | 0 | 0 | 0 | 0 | 0.544218 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
206748826295bfc6444c78bf2ca217bcac7925e5 | 179 | py | Python | django_app/myapp/views.py | bb-k8/docker-examples | 77f9c78e714b2888554eba8a706e72388d272813 | [
"MIT"
] | null | null | null | django_app/myapp/views.py | bb-k8/docker-examples | 77f9c78e714b2888554eba8a706e72388d272813 | [
"MIT"
] | null | null | null | django_app/myapp/views.py | bb-k8/docker-examples | 77f9c78e714b2888554eba8a706e72388d272813 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.http import HttpResponse
# Create your views here.
def index(req):
return HttpResponse("Simple django app running in docker!") | 29.833333 | 63 | 0.787709 | 25 | 179 | 5.64 | 0.8 | 0.141844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145251 | 179 | 6 | 63 | 29.833333 | 0.921569 | 0.128492 | 0 | 0 | 0 | 0 | 0.232258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
207cd22c5687f57b81f327add04366de1f8a9887 | 337 | py | Python | benwaonline/entities/__init__.py | goosechooser/benwaonline | e2879412aa6c3c230d25cd60072445165517b6b6 | [
"MIT"
] | null | null | null | benwaonline/entities/__init__.py | goosechooser/benwaonline | e2879412aa6c3c230d25cd60072445165517b6b6 | [
"MIT"
] | 16 | 2017-09-13T10:21:40.000Z | 2020-06-01T04:32:22.000Z | benwaonline/entities/__init__.py | goosechooser/benwaonline | e2879412aa6c3c230d25cd60072445165517b6b6 | [
"MIT"
] | null | null | null | from benwaonline.entities.entity import Entity
from benwaonline.entities.user import User, UserLike
from benwaonline.entities.post import Post, PostLike
from benwaonline.entities.image import Image
from benwaonline.entities.preview import Preview
from benwaonline.entities.comment import Comment
from benwaonline.entities.tag import Tag
| 42.125 | 52 | 0.863501 | 44 | 337 | 6.613636 | 0.295455 | 0.360825 | 0.553265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089021 | 337 | 7 | 53 | 48.142857 | 0.947883 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2082be6bb1dea61d467001aaa1227df81da27fc8 | 111 | py | Python | Pytorch/Visualization/style_modules/__init__.py | Kuga23/Deep-Learning | 86980338208c702b6bfcbcfffdb18498e389a56b | [
"MIT"
] | null | null | null | Pytorch/Visualization/style_modules/__init__.py | Kuga23/Deep-Learning | 86980338208c702b6bfcbcfffdb18498e389a56b | [
"MIT"
] | null | null | null | Pytorch/Visualization/style_modules/__init__.py | Kuga23/Deep-Learning | 86980338208c702b6bfcbcfffdb18498e389a56b | [
"MIT"
] | null | null | null | from .content_loss import ContentLoss
from .style_loss import StyleLoss
from .tv_loss import TotalVariationLoss | 37 | 39 | 0.873874 | 15 | 111 | 6.266667 | 0.6 | 0.319149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099099 | 111 | 3 | 39 | 37 | 0.94 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
209a0ecb142d4792642ac90414642641cfca60a8 | 30 | py | Python | tests/test_prompts_input.py | lichensky/PyInquirer | 31f4da76bbbf73585a14819aaabe9fe3432e721d | [
"MIT"
] | 1,587 | 2018-06-14T03:05:42.000Z | 2022-03-31T17:51:51.000Z | tests/test_prompts_input.py | lichensky/PyInquirer | 31f4da76bbbf73585a14819aaabe9fe3432e721d | [
"MIT"
] | 131 | 2018-06-27T12:38:34.000Z | 2022-03-09T16:26:27.000Z | tests/test_prompts_input.py | lichensky/PyInquirer | 31f4da76bbbf73585a14819aaabe9fe3432e721d | [
"MIT"
] | 257 | 2018-07-03T13:47:08.000Z | 2022-03-24T19:49:46.000Z | # TODO tests from Inquirer.js
| 15 | 29 | 0.766667 | 5 | 30 | 4.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 30 | 1 | 30 | 30 | 0.92 | 0.9 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 1 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
20ade8236dabd1bf4e1582cff78807a8d4b0ae1b | 130 | py | Python | src/main.py | WannaHeal/line-lottery-numbers-sender | f20dddd8c92aeae17db0c4689f752fa050c58592 | [
"MIT"
] | null | null | null | src/main.py | WannaHeal/line-lottery-numbers-sender | f20dddd8c92aeae17db0c4689f752fa050c58592 | [
"MIT"
] | null | null | null | src/main.py | WannaHeal/line-lottery-numbers-sender | f20dddd8c92aeae17db0c4689f752fa050c58592 | [
"MIT"
] | null | null | null | from services.sender import send_message_to_users
def run():
send_message_to_users()
if __name__ == "__main__":
run()
| 13 | 49 | 0.715385 | 18 | 130 | 4.388889 | 0.722222 | 0.278481 | 0.329114 | 0.455696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184615 | 130 | 9 | 50 | 14.444444 | 0.745283 | 0 | 0 | 0 | 0 | 0 | 0.061538 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
20b9290111d074abdabf01c189a3f55fa600e8e5 | 102 | py | Python | tests/test_package.py | ValidereInc/pyomnisci | c28d680fe589091beeafb4f6bdc4e63d016b3ad2 | [
"Apache-2.0"
] | 7 | 2021-02-05T17:00:21.000Z | 2022-02-04T20:55:14.000Z | tests/test_package.py | ValidereInc/pyomnisci | c28d680fe589091beeafb4f6bdc4e63d016b3ad2 | [
"Apache-2.0"
] | 19 | 2021-01-14T18:48:13.000Z | 2022-01-13T00:26:22.000Z | tests/test_package.py | ValidereInc/pyomnisci | c28d680fe589091beeafb4f6bdc4e63d016b3ad2 | [
"Apache-2.0"
] | 8 | 2020-11-18T01:58:36.000Z | 2022-01-27T19:45:50.000Z | import pyomnisci
def test_versioning():
assert pyomnisci.__version__ not in (None, "", "0.0.0")
| 17 | 59 | 0.696078 | 14 | 102 | 4.714286 | 0.785714 | 0.060606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035294 | 0.166667 | 102 | 5 | 60 | 20.4 | 0.741176 | 0 | 0 | 0 | 0 | 0 | 0.04902 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
20e3efe13db623823148de6c0dffef19e5cb93d7 | 2,471 | py | Python | descarteslabs/workflows/types/numpy/__init__.py | carderne/descarteslabs-python | 757b480efb8d58474a3bf07f1dbd90652b46ed64 | [
"Apache-2.0"
] | 167 | 2017-03-23T22:16:58.000Z | 2022-03-08T09:19:30.000Z | descarteslabs/workflows/types/numpy/__init__.py | carderne/descarteslabs-python | 757b480efb8d58474a3bf07f1dbd90652b46ed64 | [
"Apache-2.0"
] | 93 | 2017-03-23T22:11:40.000Z | 2021-12-13T18:38:53.000Z | descarteslabs/workflows/types/numpy/__init__.py | carderne/descarteslabs-python | 757b480efb8d58474a3bf07f1dbd90652b46ed64 | [
"Apache-2.0"
] | 46 | 2017-03-25T19:12:14.000Z | 2021-08-15T18:04:29.000Z | from . import linalg
from . import ma
from ..array import Array
from .numpy_ufuncs import (
add,
subtract,
multiply,
divide,
logaddexp,
logaddexp2,
true_divide,
floor_divide,
negative,
power,
float_power,
remainder,
mod,
conj,
conjugate,
exp,
exp2,
log,
log2,
log10,
log1p,
expm1,
sqrt,
square,
cbrt,
reciprocal,
sin,
cos,
tan,
arcsin,
arccos,
arctan,
arctan2,
hypot,
sinh,
cosh,
tanh,
arcsinh,
arccosh,
arctanh,
deg2rad,
rad2deg,
bitwise_and,
bitwise_or,
bitwise_xor,
bitwise_not,
invert,
greater,
greater_equal,
less,
less_equal,
not_equal,
equal,
logical_and,
logical_or,
logical_xor,
logical_not,
maximum,
minimum,
fmax,
fmin,
isfinite,
isinf,
isnan,
signbit,
copysign,
nextafter,
spacing,
ldexp,
fmod,
floor,
ceil,
trunc,
degrees,
radians,
rint,
fabs,
sign,
absolute,
)
from .numpy_functions import np_funcs
globals().update(np_funcs)
array = Array
__all__ = [
# Array constructor
"array",
# Ufuncs
"add",
"subtract",
"multiply",
"divide",
"logaddexp",
"logaddexp2",
"true_divide",
"floor_divide",
"negative",
"power",
"float_power",
"remainder",
"mod",
"conj",
"conjugate",
"exp",
"exp2",
"log",
"log2",
"log10",
"log1p",
"expm1",
"sqrt",
"square",
"cbrt",
"reciprocal",
"sin",
"cos",
"tan",
"arcsin",
"arccos",
"arctan",
"arctan2",
"hypot",
"sinh",
"cosh",
"tanh",
"arcsinh",
"arccosh",
"arctanh",
"deg2rad",
"rad2deg",
"bitwise_and",
"bitwise_or",
"bitwise_xor",
"bitwise_not",
"invert",
"greater",
"greater_equal",
"less",
"less_equal",
"not_equal",
"equal",
"logical_and",
"logical_or",
"logical_xor",
"logical_not",
"maximum",
"minimum",
"fmax",
"fmin",
"isfinite",
"isinf",
"isnan",
"signbit",
"copysign",
"nextafter",
"spacing",
"ldexp",
"fmod",
"floor",
"ceil",
"trunc",
"degrees",
"radians",
"rint",
"fabs",
"sign",
"absolute",
# sub-packages
"linalg",
"ma",
]
__all__ += list(np_funcs.keys())
| 13.576923 | 37 | 0.508296 | 226 | 2,471 | 5.376106 | 0.402655 | 0.017284 | 0.031276 | 0.041152 | 0.83786 | 0.83786 | 0.83786 | 0.83786 | 0.83786 | 0.83786 | 0 | 0.01243 | 0.348847 | 2,471 | 181 | 38 | 13.651934 | 0.742697 | 0.014974 | 0 | 0 | 0 | 0 | 0.220576 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.02907 | 0 | 0.02907 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
20e50f0f6c79248e2ebeb380b2fde0e5a134529d | 31 | py | Python | languages/python/modelfox/__init__.py | OneToolsCollection/tangramdotdev-tangram | 666343a87b88a1c1b34a4be2298f6aa54f0fc2eb | [
"MIT"
] | null | null | null | languages/python/modelfox/__init__.py | OneToolsCollection/tangramdotdev-tangram | 666343a87b88a1c1b34a4be2298f6aa54f0fc2eb | [
"MIT"
] | null | null | null | languages/python/modelfox/__init__.py | OneToolsCollection/tangramdotdev-tangram | 666343a87b88a1c1b34a4be2298f6aa54f0fc2eb | [
"MIT"
] | null | null | null | from .modelfox_python import *
| 15.5 | 30 | 0.806452 | 4 | 31 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
20f45386089ad8915d2f57079f192463c2332038 | 5,896 | py | Python | resources/dot_PyCharm/system/python_stubs/-762174762/PySide/QtGui/QTabWidget.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | 1 | 2020-04-20T02:27:20.000Z | 2020-04-20T02:27:20.000Z | resources/dot_PyCharm/system/python_stubs/cache/8cdc475d469a13122bc4bc6c3ac1c215d93d5f120f5cc1ef33a8f3088ee54d8e/PySide/QtGui/QTabWidget.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | null | null | null | resources/dot_PyCharm/system/python_stubs/cache/8cdc475d469a13122bc4bc6c3ac1c215d93d5f120f5cc1ef33a8f3088ee54d8e/PySide/QtGui/QTabWidget.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | null | null | null | # encoding: utf-8
# module PySide.QtGui
# from C:\Python27\lib\site-packages\PySide\QtGui.pyd
# by generator 1.147
# no doc
# imports
import PySide.QtCore as __PySide_QtCore
import Shiboken as __Shiboken
from QWidget import QWidget
class QTabWidget(QWidget):
# no doc
def addTab(self, *args, **kwargs): # real signature unknown
pass
def changeEvent(self, *args, **kwargs): # real signature unknown
pass
def clear(self, *args, **kwargs): # real signature unknown
pass
def cornerWidget(self, *args, **kwargs): # real signature unknown
pass
def count(self, *args, **kwargs): # real signature unknown
pass
def currentChanged(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def currentIndex(self, *args, **kwargs): # real signature unknown
pass
def currentWidget(self, *args, **kwargs): # real signature unknown
pass
def documentMode(self, *args, **kwargs): # real signature unknown
pass
def elideMode(self, *args, **kwargs): # real signature unknown
pass
def event(self, *args, **kwargs): # real signature unknown
pass
def heightForWidth(self, *args, **kwargs): # real signature unknown
pass
def iconSize(self, *args, **kwargs): # real signature unknown
pass
def indexOf(self, *args, **kwargs): # real signature unknown
pass
def initStyleOption(self, *args, **kwargs): # real signature unknown
pass
def insertTab(self, *args, **kwargs): # real signature unknown
pass
def isMovable(self, *args, **kwargs): # real signature unknown
pass
def isTabEnabled(self, *args, **kwargs): # real signature unknown
pass
def keyPressEvent(self, *args, **kwargs): # real signature unknown
pass
def minimumSizeHint(self, *args, **kwargs): # real signature unknown
pass
def paintEvent(self, *args, **kwargs): # real signature unknown
pass
def removeTab(self, *args, **kwargs): # real signature unknown
pass
def resizeEvent(self, *args, **kwargs): # real signature unknown
pass
def selected(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def setCornerWidget(self, *args, **kwargs): # real signature unknown
pass
def setCurrentIndex(self, *args, **kwargs): # real signature unknown
pass
def setCurrentWidget(self, *args, **kwargs): # real signature unknown
pass
def setDocumentMode(self, *args, **kwargs): # real signature unknown
pass
def setElideMode(self, *args, **kwargs): # real signature unknown
pass
def setIconSize(self, *args, **kwargs): # real signature unknown
pass
def setMovable(self, *args, **kwargs): # real signature unknown
pass
def setTabBar(self, *args, **kwargs): # real signature unknown
pass
def setTabEnabled(self, *args, **kwargs): # real signature unknown
pass
def setTabIcon(self, *args, **kwargs): # real signature unknown
pass
def setTabPosition(self, *args, **kwargs): # real signature unknown
pass
def setTabsClosable(self, *args, **kwargs): # real signature unknown
pass
def setTabShape(self, *args, **kwargs): # real signature unknown
pass
def setTabText(self, *args, **kwargs): # real signature unknown
pass
def setTabToolTip(self, *args, **kwargs): # real signature unknown
pass
def setTabWhatsThis(self, *args, **kwargs): # real signature unknown
pass
def setUsesScrollButtons(self, *args, **kwargs): # real signature unknown
pass
def showEvent(self, *args, **kwargs): # real signature unknown
pass
def sizeHint(self, *args, **kwargs): # real signature unknown
pass
def tabBar(self, *args, **kwargs): # real signature unknown
pass
def tabCloseRequested(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def tabIcon(self, *args, **kwargs): # real signature unknown
pass
def tabInserted(self, *args, **kwargs): # real signature unknown
pass
def tabPosition(self, *args, **kwargs): # real signature unknown
pass
def tabRemoved(self, *args, **kwargs): # real signature unknown
pass
def tabsClosable(self, *args, **kwargs): # real signature unknown
pass
def tabShape(self, *args, **kwargs): # real signature unknown
pass
def tabText(self, *args, **kwargs): # real signature unknown
pass
def tabToolTip(self, *args, **kwargs): # real signature unknown
pass
def tabWhatsThis(self, *args, **kwargs): # real signature unknown
pass
def usesScrollButtons(self, *args, **kwargs): # real signature unknown
pass
def widget(self, *args, **kwargs): # real signature unknown
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
East = PySide.QtGui.QTabWidget.TabPosition.East
North = PySide.QtGui.QTabWidget.TabPosition.North
Rounded = PySide.QtGui.QTabWidget.TabShape.Rounded
South = PySide.QtGui.QTabWidget.TabPosition.South
staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x0000000004BFE6C8>'
TabPosition = None # (!) real value is "<type 'PySide.QtGui.QTabWidget.TabPosition'>"
TabShape = None # (!) real value is "<type 'PySide.QtGui.QTabWidget.TabShape'>"
Triangular = PySide.QtGui.QTabWidget.TabShape.Triangular
West = PySide.QtGui.QTabWidget.TabPosition.West
| 28.621359 | 106 | 0.639925 | 653 | 5,896 | 5.739663 | 0.183767 | 0.201174 | 0.309498 | 0.273746 | 0.664088 | 0.648879 | 0.648879 | 0.638741 | 0.03762 | 0 | 0 | 0.004524 | 0.25017 | 5,896 | 205 | 107 | 28.760976 | 0.843248 | 0.305122 | 0 | 0.446154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.446154 | false | 0.446154 | 0.023077 | 0 | 0.546154 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
459d10542e982bd5946e7d3d73ff426e8c8ee10b | 10,739 | py | Python | homePage/views/create_a_char.py | bmackley/history | cddf1405f23c98702f2c5b4c02571b5cc20a449a | [
"Apache-2.0"
] | null | null | null | homePage/views/create_a_char.py | bmackley/history | cddf1405f23c98702f2c5b4c02571b5cc20a449a | [
"Apache-2.0"
] | null | null | null | homePage/views/create_a_char.py | bmackley/history | cddf1405f23c98702f2c5b4c02571b5cc20a449a | [
"Apache-2.0"
] | null | null | null | from django import forms
from django.conf import settings
from django.http import HttpResponse, HttpResponseRedirect, Http404
from django.contrib.auth import authenticate, login
from homePage import models as m
from . import templater
import glob
def process_request(request):
sign = m.Sign.objects.get(name = '589')
line = 1
new_char = m.AssyrianChar()
new_char.id = 1
new_char.line = line
new_char.Sign = sign
new_char.note = "6a74eaae-3c30-4661-a7fb-2adfc974df0b"
new_char.save()
sign = m.Sign.objects.get(name = '78')
line = 1
new_char = m.AssyrianChar()
new_char.id = 2
new_char.line = line
new_char.Sign = sign
new_char.note = "6fc68815-1ead-420c-8eb9-844a23491b75"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 1
new_char = m.AssyrianChar()
new_char.id = 3
new_char.line = line
new_char.Sign = sign
new_char.note = "9f25034b-e28a-4459-ae5d-255372ed3098"
new_char.save()
sign = m.Sign.objects.get(name = '170')
line = 1
new_char = m.AssyrianChar()
new_char.id = 4
new_char.line = line
new_char.Sign = sign
new_char.note = "7fa1866d-993e-45f0-88f8-2aa8559ff7c0"
new_char.save()
sign = m.Sign.objects.get(name = '557')
line = 1
new_char = m.AssyrianChar()
new_char.id = 5
new_char.line = line
new_char.Sign = sign
new_char.note = "3336488b-f628-404c-b742-b6dee11fc348"
new_char.save()
sign = m.Sign.objects.get(name = '319')
line = 2
new_char = m.AssyrianChar()
new_char.id = 6
new_char.line = line
new_char.Sign = sign
new_char.note = "7fa1f185-7831-42ff-9bd0-9faa485626f3"
new_char.save()
sign = m.Sign.objects.get(name = '396a')
line = 2
new_char = m.AssyrianChar()
new_char.id = 7
new_char.line = line
new_char.Sign = sign
new_char.note = "1848f5e6-c0cd-4c6c-a05e-39527afe8a79"
new_char.save()
sign = m.Sign.objects.get(name = '335')
line = 2
new_char = m.AssyrianChar()
new_char.id = 8
new_char.line = line
new_char.Sign = sign
new_char.note = "f3e64cbc-5bdf-4354-bffc-8a979d90aef7"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 2
new_char = m.AssyrianChar()
new_char.id = 9
new_char.line = line
new_char.Sign = sign
new_char.note = "d7512498-1f54-451b-86d6-5fa6ef3df3ee"
new_char.save()
sign = m.Sign.objects.get(name = '112')
line = 2
new_char = m.AssyrianChar()
new_char.id = 10
new_char.line = line
new_char.Sign = sign
new_char.note = "60a955c2-d106-4fda-866e-73499205842f"
new_char.save()
sign = m.Sign.objects.get(name = '532')
line = 2
new_char = m.AssyrianChar()
new_char.id = 11
new_char.line = line
new_char.Sign = sign
new_char.note = "8e10dd18-31d4-495b-b75b-91d2e5ef24a0"
new_char.save()
sign = m.Sign.objects.get(name = '399')
line = 2
new_char = m.AssyrianChar()
new_char.id = 12
new_char.line = line
new_char.Sign = sign
new_char.note = "1b59f811-a99a-46f3-bbdb-c48058ccaf52"
new_char.save()
sign = m.Sign.objects.get(name = '142')
line = 3
new_char = m.AssyrianChar()
new_char.id = 13
new_char.line = line
new_char.Sign = sign
new_char.note = "f928d175-18be-42e5-9b4d-8b4bfeae4545"
new_char.save()
sign = m.Sign.objects.get(name = '396a')
line = 3
new_char = m.AssyrianChar()
new_char.id = 14
new_char.line = line
new_char.Sign = sign
new_char.note = "5ed612b4-a20d-4aac-a1fb-874409fe21d8"
new_char.save()
sign = m.Sign.objects.get(name = '148')
line = 3
new_char = m.AssyrianChar()
new_char.id = 15
new_char.line = line
new_char.Sign = sign
new_char.note = "87e1d11e-906a-4389-8ce6-5e6c4e84fb6f"
new_char.save()
sign = m.Sign.objects.get(name = '342')
line = 3
new_char = m.AssyrianChar()
new_char.id = 16
new_char.line = line
new_char.Sign = sign
new_char.note = "658700b4-1170-437b-b035-599121ff4764"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 4
new_char = m.AssyrianChar()
new_char.id = 17
new_char.line = line
new_char.Sign = sign
new_char.note = "49736074-a7d8-422e-9824-87be6c0ee80a"
new_char.save()
sign = m.Sign.objects.get(name = '70')
line = 4
new_char = m.AssyrianChar()
new_char.id = 18
new_char.line = line
new_char.Sign = sign
new_char.note = "5df86a86-bf07-4a7b-8440-594fe1c1df42"
new_char.save()
sign = m.Sign.objects.get(name = '598a')
line = 4
new_char = m.AssyrianChar()
new_char.id = 19
new_char.line = line
new_char.Sign = sign
new_char.note = "d3cf7ecd-38d7-415f-b293-75a06faadbba"
new_char.save()
sign = m.Sign.objects.get(name = '595')
line = 4
new_char = m.AssyrianChar()
new_char.id = 20
new_char.line = line
new_char.Sign = sign
new_char.note = "f053847d-ce61-4a2f-8caf-dfe97d4d9d4d"
new_char.save()
sign = m.Sign.objects.get(name = '468')
line = 4
new_char = m.AssyrianChar()
new_char.id = 21
new_char.line = line
new_char.Sign = sign
new_char.note = "49b69885-7666-49b8-9f88-9efd70a8cafd"
new_char.save()
sign = m.Sign.objects.get(name = '381')
line = 4
new_char = m.AssyrianChar()
new_char.id = 22
new_char.line = line
new_char.Sign = sign
new_char.note = "1789d20d-9d90-4fcd-83a5-d1a9ba987c37"
new_char.save()
sign = m.Sign.objects.get(name = '319')
line = 5
new_char = m.AssyrianChar()
new_char.id = 23
new_char.line = line
new_char.Sign = sign
new_char.note = "4ff057e6-9f0a-4d19-aa93-a4fa57e74826"
new_char.save()
sign = m.Sign.objects.get(name = '319')
line = 5
new_char = m.AssyrianChar()
new_char.id = 24
new_char.line = line
new_char.Sign = sign
new_char.note = "47c688b5-408b-4cbd-87f0-ce894aad3121"
new_char.save()
sign = m.Sign.objects.get(name = '86')
line = 5
new_char = m.AssyrianChar()
new_char.id = 25
new_char.line = line
new_char.Sign = sign
new_char.note = "ebbac1d6-121e-4c70-bb3a-b8e358d5fec9"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 5
new_char = m.AssyrianChar()
new_char.id = 26
new_char.line = line
new_char.Sign = sign
new_char.note = "5daf6862-f961-4f4e-a2ff-625ea3b74f8b"
new_char.save()
sign = m.Sign.objects.get(name = '212')
line = 6
new_char = m.AssyrianChar()
new_char.id = 27
new_char.line = line
new_char.Sign = sign
new_char.note = "865f4680-019c-4356-81d5-fd82131b70e3"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 6
new_char = m.AssyrianChar()
new_char.id = 28
new_char.line = line
new_char.Sign = sign
new_char.note = "943fdd75-ead6-485e-b089-0dcf0d93d9de"
new_char.save()
sign = m.Sign.objects.get(name = '170')
line = 6
new_char = m.AssyrianChar()
new_char.id = 29
new_char.line = line
new_char.Sign = sign
new_char.note = "0b3efae9-f633-4354-906f-f860c96326c8"
new_char.save()
sign = m.Sign.objects.get(name = '112')
line = 6
new_char = m.AssyrianChar()
new_char.id = 30
new_char.line = line
new_char.Sign = sign
new_char.note = "ebfc3dbf-aed6-4b62-89cf-3450235c78e4"
new_char.save()
sign = m.Sign.objects.get(name = '354a')
line = 6
new_char = m.AssyrianChar()
new_char.id = 31
new_char.line = line
new_char.Sign = sign
new_char.note = "1d6f0582-fccc-4657-87bb-7ebfb713228d"
new_char.save()
sign = m.Sign.objects.get(name = '342')
line = 6
new_char = m.AssyrianChar()
new_char.id = 32
new_char.line = line
new_char.Sign = sign
new_char.note = "f7da5f84-74e0-4ae8-9fb5-661698bea15d"
new_char.save()
sign = m.Sign.objects.get(name = '342')
line = 7
new_char = m.AssyrianChar()
new_char.id = 33
new_char.line = line
new_char.Sign = sign
new_char.note = "55970697-e8e9-4ec7-b84a-cfc521ff1fed"
new_char.save()
sign = m.Sign.objects.get(name = '342')
line = 7
new_char = m.AssyrianChar()
new_char.id = 34
new_char.line = line
new_char.Sign = sign
new_char.note = "7ca1c742-48f7-4dcc-ae2b-e69ecc122373"
new_char.save()
sign = m.Sign.objects.get(name = '13')
line = 7
new_char = m.AssyrianChar()
new_char.id = 35
new_char.line = line
new_char.Sign = sign
new_char.note = "1addafae-e4f9-44e2-9f53-966f92abdfff"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 7
new_char = m.AssyrianChar()
new_char.id = 36
new_char.line = line
new_char.Sign = sign
new_char.note = "3dc4805f-c487-47a8-9c56-67302a21912b"
new_char.save()
sign = m.Sign.objects.get(name = '70')
line = 8
new_char = m.AssyrianChar()
new_char.id = 37
new_char.line = line
new_char.Sign = sign
new_char.note = "bb0269e6-f295-4267-b060-70e888354219"
new_char.save()
sign = m.Sign.objects.get(name = '319')
line = 8
new_char = m.AssyrianChar()
new_char.id = 38
new_char.line = line
new_char.Sign = sign
new_char.note = "3c2e55ea-24f4-45f8-bc72-ba48b79dcfd0"
new_char.save()
sign = m.Sign.objects.get(name = '319')
line = 8
new_char = m.AssyrianChar()
new_char.id = 39
new_char.line = line
new_char.Sign = sign
new_char.note = "ca4176e1-fb25-4815-99e2-527e1dd9bc04"
new_char.save()
sign = m.Sign.objects.get(name = '86')
line = 8
new_char = m.AssyrianChar()
new_char.id = 40
new_char.line = line
new_char.Sign = sign
new_char.note = "4799ca5d-791b-4cd6-b009-99b64ff6f3e3"
new_char.save()
sign = m.Sign.objects.get(name = '579')
line = 9
new_char = m.AssyrianChar()
new_char.id = 41
new_char.line = line
new_char.Sign = sign
new_char.note = "8a312545-2bd5-48e3-8187-0f1c5b0393ba"
new_char.save()
sign = m.Sign.objects.get(name = '142')
line = 9
new_char = m.AssyrianChar()
new_char.id = 42
new_char.line = line
new_char.Sign = sign
new_char.note = "214ae14d-fd18-4197-b76c-afc303d2cb39"
new_char.save()
sign = m.Sign.objects.get(name = '206')
line = 9
new_char = m.AssyrianChar()
new_char.id = 43
new_char.line = line
new_char.Sign = sign
new_char.note = "43be12c1-7216-421c-8161-2b13cf348835"
new_char.save()
sign = m.Sign.objects.get(name = '383')
line = 9
new_char = m.AssyrianChar()
new_char.id = 44
new_char.line = line
new_char.Sign = sign
new_char.note = "5073b62d-d285-41c7-a573-c83ad993ca27"
new_char.save()
# sign = m.Sign.objects.get(name = '451')
# line = 9
# new_char = m.AssyrianChar()
# new_char.id = 45
# new_char.line = line
# new_char.Sign = sign
# new_char.note = "14235979-31be-4dea-9778-80584272915e"
# new_char.save()
tvars = {
}
return templater.render_to_response(request, 'index.html', tvars)
| 25.149883 | 69 | 0.684794 | 1,706 | 10,739 | 4.150059 | 0.196366 | 0.266949 | 0.057203 | 0.101695 | 0.768362 | 0.768362 | 0.765113 | 0.765113 | 0.765113 | 0.513418 | 0 | 0.132651 | 0.181488 | 10,739 | 426 | 70 | 25.20892 | 0.67281 | 0.019182 | 0 | 0.71159 | 0 | 0 | 0.163816 | 0.150513 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005391 | false | 0 | 0.037736 | 0 | 0.045822 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
45b34c1d3fbbe1110848c2478bbd09217e0f64ef | 22,678 | py | Python | test/test_data_elements.py | crazynayan/tpf3 | a17b82e7c6eb1c9237af9388fb3e7b18a213f50b | [
"MIT"
] | null | null | null | test/test_data_elements.py | crazynayan/tpf3 | a17b82e7c6eb1c9237af9388fb3e7b18a213f50b | [
"MIT"
] | 1 | 2021-06-02T00:47:09.000Z | 2021-06-02T00:47:09.000Z | test/test_data_elements.py | crazynayan/tpf3 | a17b82e7c6eb1c9237af9388fb3e7b18a213f50b | [
"MIT"
] | null | null | null | from copy import deepcopy
from typing import List, Tuple
from requests.utils import quote
from config import config
from test import TestAPI
class OutputRegisters(TestAPI):
def setUp(self) -> None:
self.test_data: dict = self.get_sample_test_data()
self.reg_updated: bool = False
def tearDown(self) -> None:
if self.reg_updated:
self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'regs': []})
def test_empty(self):
response = self.get(f"/test_data/{self.test_data['id']}")
self.assertEqual(200, response.status_code)
self.assertDictEqual(dict(), response.json()['outputs'][0]['regs'])
def test_few_reg(self):
response = self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'regs': ['R1', 'R3']})
self.assertEqual(200, response.status_code)
self.reg_updated = True
response = self.get(f"/test_data/{self.test_data['id']}")
self.assertEqual(200, response.status_code)
self.assertDictEqual({'R1': 0, 'R3': 0}, response.json()['outputs'][0]['regs'])
def test_all_reg(self):
response = self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'regs': list(config.REGISTERS)})
self.assertEqual(200, response.status_code)
self.reg_updated = True
response = self.get(f"/test_data/{self.test_data['id']}")
self.assertEqual(200, response.status_code)
self.assertDictEqual({reg: 0 for reg in config.REGISTERS}, response.json()['outputs'][0]['regs'])
def test_invalid_id(self):
response = self.patch(f"/test_data/invalid_id/output/regs", json={'regs': list(config.REGISTERS)})
self.assertEqual(404, response.status_code)
def test_invalid_key(self):
response = self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'invalid': ['R1']})
self.assertEqual(400, response.status_code)
self.assertDictEqual({'message': 'Invalid format of Registers', 'error': 'Bad Request'}, response.json())
def test_invalid_reg(self):
response = self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'regs': ['R1', 'R16']})
self.assertEqual(400, response.status_code)
def test_reg_tuple(self):
response = self.patch(f"/test_data/{self.test_data['id']}/output/regs", json={'regs': ('R1', 'R15')})
self.assertEqual(200, response.status_code)
self.reg_updated = True
response = self.get(f"/test_data/{self.test_data['id']}")
self.assertEqual(200, response.status_code)
self.assertDictEqual({'R1': 0, 'R15': 0}, response.json()['outputs'][0]['regs'])
class OutputFields(TestAPI):
def setUp(self) -> None:
self.test_data: dict = self.get_sample_test_data()
self.macro_fields: List[Tuple[str, str]] = list()
self.maxDiff = None
def tearDown(self) -> None:
for macro_name, field_name in self.macro_fields:
self.delete(f"/test_data/{self.test_data['id']}/output/cores/{macro_name}/fields/{quote(field_name)}")
def _check_field_byte(self, macro_name: str, field_name: str, length: int, input_len: int = None,
input_base_reg: str = None) -> dict:
body: dict = {'field': f"{field_name}", 'length': input_len if input_len is not None else 0,
'base_reg': input_base_reg if input_base_reg is not None else str()}
response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/{macro_name}/fields", json=body)
self.assertEqual(200, response.status_code)
self.macro_fields.append((macro_name, field_name))
body['length'] = length
body.pop('base_reg', None)
self.assertDictEqual(body, response.json())
return body
def _check_core(self, macro_name: str, field_bytes: List[dict], base_reg: str = None):
response = self.get(f"/test_data/{self.test_data['id']}")
self.assertEqual(200, response.status_code)
actual_test_data = response.json()
expected_test_data = deepcopy(self.test_data)
core_id = actual_test_data['outputs'][0]['cores'][0]['id']
base_reg = base_reg if base_reg else str()
core = {'id': core_id, 'base_reg': base_reg, 'field_data': list(), 'macro_name': macro_name, 'variation': 0,
'variation_name': str()}
expected_test_data['outputs'][0]['cores'].append(core)
expected_test_data['outputs'][0]['cores'][0]['field_data'].extend(field_bytes)
self.assertDictEqual(expected_test_data, actual_test_data)

    def test_default_field_no_length(self) -> None:
        field_byte1 = self._check_field_byte('WA0AA', 'WA0BBR', 2)
        field_byte2 = self._check_field_byte('WA0AA', '#WA0TTY', 1, input_len=0, input_base_reg='R0')
        self._check_core('WA0AA', [field_byte1, field_byte2])

    def test_default_field_with_length_change_it(self) -> None:
        self._check_field_byte('EB0EB', 'EBW000', 10, input_len=10)
        field_byte = self._check_field_byte('EB0EB', 'EBW000', 6, input_len=6)
        self._check_core('EB0EB', [field_byte])

    def test_based_field_delete(self) -> None:
        field_byte1 = self._check_field_byte('UI2PF', 'UI2CNN', 1, input_base_reg='R7')
        field_byte2 = self._check_field_byte('UI2PF', 'UI2INC', 3, input_len=3, input_base_reg='R6')
        self._check_core('UI2PF', [field_byte1, field_byte2], base_reg='R6')
        # Delete 1st field
        response = self.delete(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields/UI2CNN")
        self.assertEqual(200, response.status_code)
        del self.macro_fields[0]
        self.assertDictEqual(field_byte1, response.json())
        self._check_core('UI2PF', [field_byte2], base_reg='R6')
        # Deleting the 1st field again will give an error
        response = self.delete(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields/UI2CNN")
        self.assertEqual(400, response.status_code)
        # Delete 2nd field
        response = self.delete(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields/UI2INC")
        self.assertEqual(200, response.status_code)
        del self.macro_fields[0]
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        self.assertEqual(list(), response.json()['outputs'][0]['cores'])

    def test_invalid_id(self):
        response = self.patch(f"/test_data/invalid_id/output/cores/WA0AA/fields", json={'field': 'WA0BBR'})
        self.assertEqual(404, response.status_code)

    def test_no_field(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields", json={})
        self.assertEqual(400, response.status_code)

    def test_empty_field(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields", json={'field': str()})
        self.assertEqual(400, response.status_code)

    def test_invalid_macro(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/INVALID_MACRO/fields",
                              json={'field': 'WA0BBR'})
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Error in adding field', 'error': 'Bad Request'}, response.json())

    def test_invalid_length(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'length': 'invalid_type'})
        self.assertEqual(400, response.status_code)

    def test_invalid_data(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': 123})
        self.assertEqual(400, response.status_code)

    def test_field_not_in_macro(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields", json={'field': 'EBW000'})
        self.assertEqual(400, response.status_code)
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(self.test_data, response.json())

    def test_data_in_body(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': 'some_data'})
        self.assertEqual(400, response.status_code)

    def test_base_reg_for_default_macros(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'base_reg': 'R1'})
        self.assertEqual(400, response.status_code)

    def test_no_base_reg_for_non_default_macros(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields",
                              json={'field': 'UI2CNN'})
        self.assertEqual(400, response.status_code)

    def test_base_reg_R0_for_non_default_macros(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields",
                              json={'field': 'UI2CNN', 'base_reg': 'R0'})
        self.assertEqual(400, response.status_code)

    def test_invalid_base_reg_number(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields",
                              json={'field': 'UI2CNN', 'base_reg': 12})
        self.assertEqual(400, response.status_code)

    def test_invalid_base_reg(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/output/cores/UI2PF/fields",
                              json={'field': 'UI2CNN', 'base_reg': 'R16'})
        self.assertEqual(400, response.status_code)

    def test_delete_invalid_id(self):
        response = self.delete(f"/test_data/invalid_id/output/cores/WA0AA/fields/WA0BBR")
        self.assertEqual(404, response.status_code)

    def test_delete_macro_name_not_in_core(self):
        response = self.delete(f"/test_data/{self.test_data['id']}/output/cores/WA0AA/fields/WA0BBR")
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Error in deleting field', 'error': 'Bad Request'}, response.json())


class InputFields(TestAPI):

    def setUp(self) -> None:
        self.test_data: dict = self.get_sample_test_data()
        self.macro_fields: List[Tuple[str, str]] = list()
        self.maxDiff = None

    def tearDown(self) -> None:
        for macro_name, field_name in self.macro_fields:
            self.delete(f"/test_data/{self.test_data['id']}/input/cores/{macro_name}/fields/{quote(field_name)}")

    def _check_field_byte(self, macro_name: str, field_name: str, data: str) -> dict:
        body: dict = {'field': f"{field_name}", 'data': data, 'variation': 0, 'variation_name': str()}
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/{macro_name}/fields", json=body)
        self.assertEqual(200, response.status_code)
        self.macro_fields.append((macro_name, field_name))
        del body['variation']
        del body['variation_name']
        self.assertDictEqual(body, response.json())
        return body

    def _check_core(self, macro_name: str, field_bytes: List[dict]):
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        actual_test_data = response.json()
        expected_test_data = deepcopy(self.test_data)
        core_id = actual_test_data['cores'][0]['id']
        core = {'id': core_id, 'base_reg': str(), 'field_data': list(), 'macro_name': macro_name, 'variation': 0,
                'variation_name': str()}
        expected_test_data['cores'].append(core)
        expected_test_data['cores'][0]['field_data'].extend(field_bytes)
        self.assertDictEqual(expected_test_data, actual_test_data)

    def test_default_field(self) -> None:
        field_byte1 = self._check_field_byte('WA0AA', 'WA0BBR', 'F1F2')
        field_byte2 = self._check_field_byte('WA0AA', '#WA0TTY', '01')
        self._check_core('WA0AA', [field_byte1, field_byte2])

    def test_based_field_delete(self) -> None:
        field_byte1 = self._check_field_byte('EB0EB', 'EBW000', '80')
        field_byte2 = self._check_field_byte('EB0EB', 'EBW001', 'C1C2C3C4')
        self._check_core('EB0EB', [field_byte1, field_byte2])
        # Delete 1st field
        response = self.delete(f"/test_data/{self.test_data['id']}/input/cores/EB0EB/fields/EBW000")
        self.assertEqual(200, response.status_code)
        del self.macro_fields[0]
        self.assertDictEqual(field_byte1, response.json())
        self._check_core('EB0EB', [field_byte2])
        # Deleting the 1st field again will give an error
        response = self.delete(f"/test_data/{self.test_data['id']}/input/cores/EB0EB/fields/EBW000")
        self.assertEqual(400, response.status_code)
        # Delete 2nd field
        response = self.delete(f"/test_data/{self.test_data['id']}/input/cores/EB0EB/fields/EBW001")
        self.assertEqual(200, response.status_code)
        del self.macro_fields[0]
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        self.assertEqual(list(), response.json()['cores'])

    def test_invalid_id(self):
        response = self.patch(f"/test_data/invalid_id/input/cores/WA0AA/fields", json={'field': 'WA0BBR', 'data': '01'})
        self.assertEqual(404, response.status_code)

    def test_no_field(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields", json={'data': '01'})
        self.assertEqual(400, response.status_code)

    def test_empty_field(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': str(), 'data': '01'})
        self.assertEqual(400, response.status_code)

    def test_invalid_macro(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/INVALID_MACRO/fields",
                              json={'field': 'WA0BBR', 'data': '01'})
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Error in adding field', 'error': 'Bad Request'}, response.json())
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(self.test_data, response.json())

    def test_invalid_length(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'length': 'invalid_type', 'data': '01'})
        self.assertEqual(400, response.status_code)

    def test_invalid_data(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': 123})
        self.assertEqual(400, response.status_code)

    def test_field_not_in_macro(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'EBW000', 'data': '01'})
        self.assertEqual(400, response.status_code)

    def test_no_data(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR'})
        self.assertEqual(400, response.status_code)

    def test_empty_data(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': ''})
        self.assertEqual(400, response.status_code)

    def test_length_in_body(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': '01', 'length': 1})
        self.assertEqual(400, response.status_code)

    def test_base_reg_in_body(self):
        response = self.patch(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields",
                              json={'field': 'WA0BBR', 'data': '01', 'base_reg': 'R1'})
        self.assertEqual(400, response.status_code)

    def test_delete_invalid_id(self):
        response = self.delete(f"/test_data/invalid_id/input/cores/WA0AA/fields/WA0BBR")
        self.assertEqual(404, response.status_code)

    def test_delete_macro_name_not_in_core(self):
        response = self.delete(f"/test_data/{self.test_data['id']}/input/cores/WA0AA/fields/WA0BBR")
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Error in deleting field', 'error': 'Bad Request'}, response.json())


class InputRegisters(TestAPI):

    def setUp(self) -> None:
        self.test_data: dict = self.get_sample_test_data()
        self.reg_list: list = list()
        self.maxDiff = None

    def tearDown(self) -> None:
        for reg in self.reg_list:
            self.delete(f"/test_data/{self.test_data['id']}/input/regs/{reg}")
        return

    def test_few_reg_delete(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14', 'value': 10})
        self.assertEqual(200, response.status_code)
        self.assertDictEqual({'test_data_id': self.test_data['id']}, response.json())
        self.reg_list.append('R14')
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R3', 'value': -5})
        self.assertEqual(200, response.status_code)
        self.reg_list.append('R3')
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        test_data = deepcopy(self.test_data)
        test_data['regs']['R14'] = 10
        test_data['regs']['R3'] = -5
        self.assertDictEqual(test_data, response.json())
        response = self.delete(f"/test_data/{self.test_data['id']}/input/regs/r14")
        self.assertEqual(400, response.status_code)
        response = self.delete(f"/test_data/{self.test_data['id']}/input/regs/R14")
        self.assertEqual(200, response.status_code)
        self.reg_list.remove('R14')
        self.assertDictEqual({'test_data_id': self.test_data['id']}, response.json())
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        del test_data['regs']['R14']
        self.assertDictEqual(test_data, response.json())
        response = self.delete(f"/test_data/{self.test_data['id']}/input/regs/R3")
        self.assertEqual(200, response.status_code)
        self.reg_list.remove('R3')
        response = self.get(f"/test_data/{self.test_data['id']}")
        self.assertEqual(200, response.status_code)
        self.assertDictEqual(self.test_data, response.json())

    def test_invalid_id(self) -> None:
        response = self.patch(f"/test_data/invalid_id/input/regs", json={'reg': 'R14', 'value': 10})
        self.assertEqual(404, response.status_code)

    def test_key_regs(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'regs': 'R14', 'value': 10})
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Invalid format of input Register', 'error': 'Bad Request'}, response.json())

    def test_no_reg(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'value': 10})
        self.assertEqual(400, response.status_code)

    def test_reg_not_string(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 10, 'value': 10})
        self.assertEqual(400, response.status_code)

    def test_invalid_reg(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'r1', 'value': 10})
        self.assertEqual(400, response.status_code)

    def test_empty_body(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={})
        self.assertEqual(400, response.status_code)

    def test_key_values(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14', 'values': 10})
        self.assertEqual(400, response.status_code)

    def test_no_value(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14'})
        self.assertEqual(400, response.status_code)

    def test_value_not_int(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14', 'value': '10'})
        self.assertEqual(400, response.status_code)

    def test_value_high(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14', 'value': 2147483648})
        self.assertEqual(400, response.status_code)

    def test_value_high_boundary(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R14', 'value': 2147483647})
        self.assertEqual(200, response.status_code)
        self.reg_list.append('R14')

    def test_value_low(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R1', 'value': -2147483649})
        self.assertEqual(400, response.status_code)

    def test_value_low_boundary(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R1', 'value': -2147483648})
        self.assertEqual(200, response.status_code)
        self.reg_list.append('R1')

    def test_3_keys(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs", json={'reg': 'R1', 'value': 1, 'val': 3})
        self.assertEqual(400, response.status_code)

    def test_multiple_reg(self) -> None:
        response = self.patch(f"/test_data/{self.test_data['id']}/input/regs",
                              json=[{'reg': 'R1', 'value': 12}, {'reg': 'R3', 'value': 32}])
        self.assertEqual(400, response.status_code)

    def test_delete_invalid_id(self) -> None:
        response = self.delete(f"/test_data/invalid_id/input/regs/R1")
        self.assertEqual(404, response.status_code)

    def test_delete_reg_not_present(self) -> None:
        response = self.delete(f"/test_data/{self.test_data['id']}/input/regs/R1")
        self.assertEqual(400, response.status_code)
        self.assertDictEqual({'message': 'Invalid Register', 'error': 'Bad Request'}, response.json())
| 51.076577 | 120 | 0.644369 | 3,022 | 22,678 | 4.622105 | 0.052614 | 0.111111 | 0.073883 | 0.076174 | 0.920891 | 0.905713 | 0.878436 | 0.86333 | 0.83541 | 0.795246 | 0 | 0.031013 | 0.192389 | 22,678 | 443 | 121 | 51.191874 | 0.731641 | 0.007188 | 0 | 0.529412 | 0 | 0.005602 | 0.249245 | 0.180958 | 0 | 0 | 0 | 0 | 0.282913 | 1 | 0.196078 | false | 0 | 0.014006 | 0 | 0.229692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
45bee478e5a166be9cec6d292d6f4fb82e64ddfa | 133 | py | Python | syft/frameworks/torch/__init__.py | antonrd/PySyft | c23d17022fc83774119b737059aed3d5e3dff20b | [
"Apache-2.0"
] | null | null | null | syft/frameworks/torch/__init__.py | antonrd/PySyft | c23d17022fc83774119b737059aed3d5e3dff20b | [
"Apache-2.0"
] | null | null | null | syft/frameworks/torch/__init__.py | antonrd/PySyft | c23d17022fc83774119b737059aed3d5e3dff20b | [
"Apache-2.0"
] | null | null | null | from . import hook_args
from . import tensors
from . import federated
from . import differential_privacy
from .hook import TorchHook
| 22.166667 | 34 | 0.81203 | 18 | 133 | 5.888889 | 0.5 | 0.377358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150376 | 133 | 5 | 35 | 26.6 | 0.938053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
45cbb4e26286c0ae8832cf175df8541c3f547d73 | 241 | py | Python | zcrmsdk/src/com/zoho/crm/api/dc/__init__.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | zcrmsdk/src/com/zoho/crm/api/dc/__init__.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | zcrmsdk/src/com/zoho/crm/api/dc/__init__.py | zoho/zohocrm-python-sdk-2.0 | 3a93eb3b57fed4e08f26bd5b311e101cb2995411 | [
"Apache-2.0"
] | null | null | null | from .au_data_center import AUDataCenter
from .cn_data_center import CNDataCenter
from .data_center import DataCenter
from .eu_data_center import EUDataCenter
from .in_data_center import INDataCenter
from .us_data_center import USDataCenter
| 34.428571 | 40 | 0.875519 | 35 | 241 | 5.714286 | 0.428571 | 0.3 | 0.48 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099585 | 241 | 6 | 41 | 40.166667 | 0.921659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
45d7b36587ca0e172a727eeec4826e7953a8a1c5 | 29 | py | Python | pg_check/__init__.py | rleitemayermore-godaddy/pgsql_proxy_checker | 9cf7f540622464b007a9735d7f9fa775156daea2 | [
"BSD-3-Clause"
] | null | null | null | pg_check/__init__.py | rleitemayermore-godaddy/pgsql_proxy_checker | 9cf7f540622464b007a9735d7f9fa775156daea2 | [
"BSD-3-Clause"
] | null | null | null | pg_check/__init__.py | rleitemayermore-godaddy/pgsql_proxy_checker | 9cf7f540622464b007a9735d7f9fa775156daea2 | [
"BSD-3-Clause"
] | null | null | null | from .pg_check import PgCheck | 29 | 29 | 0.862069 | 5 | 29 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 29 | 1 | 29 | 29 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
afd7e3a748d5eb7c86a629467bdfa6b6b2e3d424 | 19,865 | py | Python | dragonflow/controller/apps/sfc_mpls_driver.py | qianyuqiao/dragonflow | 0f154d4f794b02ac5b7fd61a3417d89e7b10912d | [
"Apache-2.0"
] | null | null | null | dragonflow/controller/apps/sfc_mpls_driver.py | qianyuqiao/dragonflow | 0f154d4f794b02ac5b7fd61a3417d89e7b10912d | [
"Apache-2.0"
] | null | null | null | dragonflow/controller/apps/sfc_mpls_driver.py | qianyuqiao/dragonflow | 0f154d4f794b02ac5b7fd61a3417d89e7b10912d | [
"Apache-2.0"
] | null | null | null | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from os_ken.lib.packet import ether_types
from oslo_log import log
from dragonflow.controller.apps import sfc_driver_base
from dragonflow.controller.common import constants
from dragonflow.db.models import sfc
LOG = log.getLogger(__name__)
def _get_index_by_id(lst, obj):
return next(i for i, o in enumerate(lst) if o.id == obj.id)
def _create_group_id(label, extra):
# FIXME add global way to share group IDs
return (label << 8) | extra
def _get_dispatch_to_all_group_id(label):
return _create_group_id(label, 1)
def _get_dispatch_locally_group_id(label):
return _create_group_id(label, 2)
class _SimpleMplsLabelAllocator(object):
@classmethod
def _create_label(cls, chain_idx, fc_idx, ppg_idx):
return ppg_idx | (fc_idx << 8) | (chain_idx << 11)
@classmethod
def _get_ingress_label(cls, port_chain, flow_classifier, port_pair_group):
fc_idx = _get_index_by_id(
port_chain.flow_classifiers,
flow_classifier,
)
ppg_idx = _get_index_by_id(
port_chain.port_pair_groups,
port_pair_group,
)
return cls._create_label(port_chain.chain_id, fc_idx, ppg_idx)
@classmethod
def _get_egress_label(cls, port_chain, flow_classifier, port_pair_group):
label = cls._get_ingress_label(
port_chain,
flow_classifier,
port_pair_group,
)
return label + 1
@classmethod
def _get_encap_label(cls, port_chain, flow_classifier):
# Can be done faster but this reads better
return cls._get_ingress_label(
port_chain,
flow_classifier,
port_chain.port_pair_groups[0],
)
@classmethod
def _get_decap_label(cls, port_chain, flow_classifier):
# Can be done faster but this reads better
return cls._get_egress_label(
port_chain,
flow_classifier,
port_chain.port_pair_groups[-1],
)
class MplsDriver(_SimpleMplsLabelAllocator, sfc_driver_base.SfcBaseDriver):
_ETH_TYPE_TO_TC = {
ether_types.ETH_TYPE_IP: 0,
ether_types.ETH_TYPE_IPV6: 1,
}
def __init__(self, app):
self.app = app
def install_encap_flows(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
table_id=constants.SFC_ENCAP_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
reg2=flow_classifier.unique_key,
eth_type=eth_type,
),
inst=[
self.app.parser.OFPInstructionActions(
self.app.ofproto.OFPIT_APPLY_ACTIONS,
[
self.app.parser.OFPActionPushMpls(
ether_types.ETH_TYPE_MPLS,
),
self.app.parser.OFPActionSetField(
mpls_label=self._get_encap_label(
port_chain,
flow_classifier,
),
),
self.app.parser.OFPActionSetField(
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
],
),
self.app.parser.OFPInstructionGotoTable(
constants.SFC_MPLS_DISPATCH_TABLE
),
],
)
def uninstall_encap_flows(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=constants.SFC_ENCAP_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
reg2=flow_classifier.unique_key,
eth_type=eth_type,
),
)
def install_decap_flows(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_decap_label(
port_chain,
flow_classifier,
),
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
inst=[
self.app.parser.OFPInstructionActions(
self.app.ofproto.OFPIT_APPLY_ACTIONS,
[
self.app.parser.OFPActionPopMpls(eth_type),
self.app.parser.OFPActionSetField(
reg2=flow_classifier.unique_key,
),
],
),
self.app.parser.OFPInstructionGotoTable(
constants.SFC_END_OF_CHAIN_TABLE,
),
],
)
def uninstall_decap_flows(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_decap_label(
port_chain,
flow_classifier
),
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
)
def install_forward_to_dest(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_decap_label(
port_chain,
flow_classifier
),
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
inst=[
self.app.parser.OFPInstructionActions(
self.app.ofproto.OFPIT_APPLY_ACTIONS,
[
self.app.parser.OFPActionSetField(
reg2=flow_classifier.dest_port.unique_key,
),
],
),
self.app.parser.OFPInstructionGotoTable(
constants.EGRESS_TABLE,
),
],
)
def uninstall_forward_to_dest(self, port_chain, flow_classifier):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
priority=constants.PRIORITY_HIGH,
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_decap_label(
port_chain,
flow_classifier,
),
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
)
def _port_pair_to_bucket(self, port_pair):
if (
port_pair.correlation_mechanism == sfc.CORR_MPLS or
not port_pair.ingress_port.is_local
):
next_table = constants.EGRESS_TABLE
else:
next_table = constants.SFC_MPLS_PP_DECAP_TABLE
actions = [
self.app.parser.OFPActionSetField(
reg7=port_pair.ingress_port.unique_key,
),
self.app.parser.NXActionResubmitTable(table_id=next_table),
]
return self.app.parser.OFPBucket(actions=actions, weight=1)
def _install_port_pair_decap_flows(self, label):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
table_id=constants.SFC_MPLS_PP_DECAP_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
actions=[
self.app.parser.OFPActionPopMpls(eth_type),
self.app.parser.NXActionResubmitTable(
table_id=constants.EGRESS_TABLE,
),
],
)
def _uninstall_port_pair_decap_flows(self, label):
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=constants.SFC_MPLS_PP_DECAP_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
),
)
def _install_dispatch_to_all_port_pairs(self, port_pair_group, label):
all_group_id = _get_dispatch_to_all_group_id(label)
# Add group: pick random SF from all available
self.app.add_group(
group_id=all_group_id,
group_type=self.app.ofproto.OFPGT_SELECT,
buckets=[
self._port_pair_to_bucket(pp)
for pp in port_pair_group.port_pairs
],
replace=True,
)
# Add flow: label => execute above group
self.app.mod_flow(
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
),
actions=[self.app.parser.OFPActionGroup(group_id=all_group_id)],
)
def _uninstall_dispatch_to_all_port_pairs(self, port_pair_group, label):
all_group_id = _get_dispatch_to_all_group_id(label)
# Remove execute group flow
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
priority=constants.PRIORITY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
),
)
# Delete group
self.app.del_group(
group_id=all_group_id,
group_type=self.app.ofproto.OFPGT_SELECT,
)
def _install_dispatch_to_local_port_pairs(self, port_pair_group, label):
local_pps = [
pp for pp in port_pair_group.port_pairs if pp.ingress_port.is_local
]
if not local_pps:
return
local_group_id = _get_dispatch_locally_group_id(label)
# Add group: pick random SF from local only
self.app.add_group(
group_id=local_group_id,
group_type=self.app.ofproto.OFPGT_SELECT,
buckets=[self._port_pair_to_bucket(pp) for pp in local_pps],
replace=True,
)
# Add flow: label => execute above group
self.app.mod_flow(
table_id=constants.INGRESS_DESTINATION_PORT_LOOKUP_TABLE,
priority=constants.PRIORITY_VERY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
),
actions=[self.app.parser.OFPActionGroup(group_id=local_group_id)],
)
def _uninstall_dispatch_to_local_port_pairs(self, port_pair_group, label):
local_pps = [
pp for pp in port_pair_group.port_pairs if pp.ingress_port.is_local
]
if not local_pps:
return
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=constants.INGRESS_DESTINATION_PORT_LOOKUP_TABLE,
priority=constants.PRIORITY_VERY_HIGH,
match=self.app.parser.OFPMatch(
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=label,
),
)
local_group_id = _get_dispatch_locally_group_id(label)
self.app.del_group(
group_id=local_group_id,
group_type=self.app.ofproto.OFPGT_SELECT,
)
def install_port_pair_group_flows(self, port_chain, port_pair_group):
for flow_classifier in port_chain.flow_classifiers:
label = self._get_ingress_label(
port_chain,
flow_classifier,
port_pair_group,
)
# Flows to remove MPLS shim for non MPLS service functions
self._install_port_pair_decap_flows(label)
self._install_dispatch_to_all_port_pairs(port_pair_group, label)
self._install_dispatch_to_local_port_pairs(port_pair_group, label)
def uninstall_port_pair_group_flows(self, port_chain, port_pair_group):
for flow_classifier in port_chain.flow_classifiers:
label = self._get_ingress_label(
port_chain,
flow_classifier,
port_pair_group,
)
self._uninstall_port_pair_decap_flows(label)
self._uninstall_dispatch_to_all_port_pairs(port_pair_group, label)
self._uninstall_dispatch_to_local_port_pairs(
port_pair_group, label)
def install_port_pair_egress_flows(self, port_chain, port_pair_group,
port_pair):
if port_pair.correlation_mechanism == sfc.CORR_MPLS:
self._install_mpls_port_pair_egress_flows(
port_chain,
port_pair_group,
port_pair,
)
elif port_pair.correlation_mechanism == sfc.CORR_NONE:
self._install_none_port_pair_egress_flows(
port_chain,
port_pair_group,
port_pair,
)
else:
LOG.warning('Driver does not support correlation_mechanism %s',
port_pair.correlation_mechanism)
def _install_mpls_port_pair_egress_flows(self, port_chain, port_pair_group,
port_pair):
for flow_classifier in port_chain.flow_classifiers:
self.app.mod_flow(
table_id=self.app.dfdp.apps['portsec'].states.main,
priority=constants.PRIORITY_VERY_HIGH,
match=self.app.parser.OFPMatch(
reg6=port_pair.egress_port.unique_key,
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_ingress_label(
port_chain,
flow_classifier,
port_pair_group,
),
),
actions=[
self.app.parser.OFPActionSetField(
mpls_label=self._get_egress_label(
port_chain,
flow_classifier,
port_pair_group
),
),
self.app.parser.NXActionResubmitTable(
table_id=constants.SFC_MPLS_DISPATCH_TABLE),
],
)
def _install_none_port_pair_egress_flows(self, port_chain, port_pair_group,
port_pair):
for flow_classifier in port_chain.flow_classifiers:
mpls_label = self._get_egress_label(
port_chain,
flow_classifier,
port_pair_group,
)
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
table_id=self.app.dfdp.apps['portsec'].states.main,
priority=constants.PRIORITY_VERY_HIGH,
match=self.app.parser.OFPMatch(
reg6=port_pair.egress_port.unique_key,
eth_type=eth_type,
),
actions=[
self.app.parser.OFPActionPushMpls(
ether_types.ETH_TYPE_MPLS,
),
self.app.parser.OFPActionSetField(
mpls_label=mpls_label,
),
self.app.parser.OFPActionSetField(
mpls_tc=self._ETH_TYPE_TO_TC[eth_type],
),
self.app.parser.NXActionResubmitTable(
table_id=constants.SFC_MPLS_DISPATCH_TABLE,
),
],
)
def uninstall_port_pair_egress_flows(self, port_chain, port_pair_groups,
port_pair):
if port_pair.correlation_mechanism == sfc.CORR_MPLS:
self._uninstall_mpls_port_pair_egress_flows(
port_chain,
port_pair_groups,
port_pair,
)
elif port_pair.correlation_mechanism == sfc.CORR_NONE:
self._uninstall_none_port_pair_egress_flows(port_pair)
else:
LOG.warning('Driver does not support correlation_mechanism %s',
port_pair.correlation_mechanism)
def _uninstall_mpls_port_pair_egress_flows(self, port_chain,
port_pair_group, port_pair):
for flow_classifier in port_chain.flow_classifiers:
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
table_id=self.app.dfdp.apps['portsec'].states.main,
priority=constants.PRIORITY_VERY_HIGH,
match=self.app.parser.OFPMatch(
reg6=port_pair.egress_port.unique_key,
eth_type=ether_types.ETH_TYPE_MPLS,
mpls_label=self._get_ingress_label(
port_chain,
flow_classifier,
port_pair_group,
),
),
)
def _uninstall_none_port_pair_egress_flows(self, port_pair):
for eth_type in self._ETH_TYPE_TO_TC:
self.app.mod_flow(
command=self.app.ofproto.OFPFC_DELETE_STRICT,
priority=constants.PRIORITY_VERY_HIGH,
table_id=self.app.dfdp.apps['portsec'].states.main,
match=self.app.parser.OFPMatch(
reg6=port_pair.egress_port.unique_key,
eth_type=eth_type,
),
)
# === src/err.py (rome2rio/TokyoGTFS, MIT license) ===

class DataAssertion(ValueError):
    """ValueError subclass raised when a data assertion fails."""
    pass
# === libs/annea_foo/src/annea_foo/__init__.py (annea-ai/python-framework, MIT license) ===

from .example import print_joke  # pylint: disable=E0401
# === bin/pycid/__init__.py (runborg/pycid, MIT license) ===

import mfnmatch
from loginrc import loginrc
# === MainGenerator.py (rexliu3/MineSweeper, MIT license) ===

from random import randint

def can_place(row, column, notRow, notColumn, numRows, numColumns):
    # A mine may be placed here only if the cell neither shares a row or
    # column with the safe cell (notRow, notColumn) nor lies in the 3x3
    # neighbourhood around it. The behaviour matches the original unrolled
    # checks; numRows/numColumns are unused but kept for signature
    # compatibility, since notRow/notColumn are always in range.
    if row == notRow or column == notColumn:
        return False
    if abs(row - notRow) <= 1 and abs(column - notColumn) <= 1:
        return False
    return True

def place_mines(board, numRows, numColumns, numMines, notRow, notColumn):
    for k in range(numMines):
        # Re-roll until the cell is both placeable and not already a mine.
        # Checking both conditions in one loop ensures can_place still holds
        # after retrying an occupied cell.
        rowCoordinate = randint(0, numRows - 1)
        columnCoordinate = randint(0, numColumns - 1)
        while (not can_place(rowCoordinate, columnCoordinate, notRow,
                             notColumn, numRows, numColumns)
               or board[rowCoordinate][columnCoordinate] < 0):
            rowCoordinate = randint(0, numRows - 1)
            columnCoordinate = randint(0, numColumns - 1)
        board[rowCoordinate][columnCoordinate] = -1
        # Bump the count of every in-bounds neighbour that is not a mine;
        # the centre cell itself is skipped because it now holds -1.
        for dRow in (-1, 0, 1):
            for dColumn in (-1, 0, 1):
                r = rowCoordinate + dRow
                c = columnCoordinate + dColumn
                if 0 <= r < numRows and 0 <= c < numColumns and board[r][c] >= 0:
                    board[r][c] += 1
    return board

def set_up(board, numRows, numColumns, numMines, notRow, notColumn):
for i in range(numRows):
board.append([])
for j in range(numColumns):
board[i].append(0)
board = place_mines(board, numRows, numColumns, numMines, notRow, notColumn)
return board
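The neighbour updates performed by place_mines can be exercised in isolation; a self-contained sketch (the 3x3 layout is assumed purely for illustration) applying the same in-bounds, non-mine increment around a single centre mine:

```python
# Standalone 3x3 board with one mine (-1) at the centre.
board = [[0, 0, 0],
         [0, -1, 0],
         [0, 0, 0]]
numRows = numColumns = 3
rowCoordinate = columnCoordinate = 1

# Same rule as place_mines: bump every in-bounds neighbour that is not a mine.
for dRow in (-1, 0, 1):
    for dColumn in (-1, 0, 1):
        r, c = rowCoordinate + dRow, columnCoordinate + dColumn
        if 0 <= r < numRows and 0 <= c < numColumns and board[r][c] >= 0:
            board[r][c] += 1
```

Every cell adjacent to the mine ends up holding 1, which is exactly the count a Minesweeper tile displays.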
# === tests/test_windows.py (ThatXliner/userpath, Apache-2.0/MIT license) ===

import pytest
import userpath
from .utils import ON_WINDOWS_CI, get_random_path
pytestmark = pytest.mark.skipif(not ON_WINDOWS_CI, reason='Tests only for throwaway Windows VMs on CI')
def test_prepend():
location = get_random_path()
assert not userpath.in_current_path(location)
assert userpath.prepend(location, check=True)
assert userpath.in_new_path(location)
assert userpath.need_shell_restart(location)
def test_prepend_multiple():
locations = [get_random_path(), get_random_path()]
assert not userpath.in_current_path(locations)
assert userpath.prepend(locations, check=True)
assert userpath.in_new_path(locations)
assert userpath.need_shell_restart(locations)
def test_append():
location = get_random_path()
assert not userpath.in_current_path(location)
assert userpath.append(location, check=True)
assert userpath.in_new_path(location)
assert userpath.need_shell_restart(location)
def test_append_multiple():
locations = [get_random_path(), get_random_path()]
assert not userpath.in_current_path(locations)
assert userpath.append(locations, check=True)
assert userpath.in_new_path(locations)
assert userpath.need_shell_restart(locations)
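The tests above exercise userpath against the real Windows user PATH; the string-level semantics of prepend/append can be sketched without touching the registry (the two helpers below are hypothetical illustrations, not part of the userpath API):

```python
import os

def prepend(path_var, location):
    """Hypothetical helper: put location in front of a PATH-style string."""
    return location + os.pathsep + path_var if path_var else location

def append(path_var, location):
    """Hypothetical helper: put location at the end of a PATH-style string."""
    return path_var + os.pathsep + location if path_var else location

path = prepend("/usr/bin", "/opt/tools/bin")
path = append(path, "/home/user/.local/bin")
```

Prepended entries win lookup order, which is why the tests assert `in_new_path` immediately after a successful `prepend`.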
# === tests/fields/test_o2o_with_unique.py (tortoise-orm, Apache-2.0 license) ===

from tests import testmodels
from tortoise.contrib import test
from tortoise.exceptions import IntegrityError, OperationalError
from tortoise.queryset import QuerySet
class TestOneToOneFieldWithUnique(test.TestCase):
async def test_principal__empty(self):
with self.assertRaises(IntegrityError):
await testmodels.Principal.create()
async def test_principal__create_by_id(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school_id=school.id)
self.assertEqual(principal.school_id, school.id)
self.assertEqual(await school.principal, principal)
async def test_principal__create_by_name(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
await principal.fetch_related("school")
self.assertEqual(principal.school, school)
self.assertEqual(await school.principal, principal)
async def test_principal__by_name__created_prefetched(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
self.assertEqual(principal.school, school)
self.assertEqual(await school.principal, principal)
async def test_principal__by_name__unfetched(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
principal = await testmodels.Principal.get(id=principal.id)
self.assertIsInstance(principal.school, QuerySet)
async def test_principal__by_name__re_awaited(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
await principal.fetch_related("school")
self.assertEqual(principal.school, school)
self.assertEqual(await principal.school, school)
async def test_principal__by_name__awaited(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
principal = await testmodels.Principal.get(id=principal.id)
self.assertEqual(await principal.school, school)
self.assertEqual(await school.principal, principal)
async def test_update_by_name(self):
school = await testmodels.School.create(id=1024, name="School1")
school2 = await testmodels.School.create(id=2048, name="School2")
principal0 = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
await testmodels.Principal.filter(id=principal0.id).update(school=school2)
principal = await testmodels.Principal.get(id=principal0.id)
await principal.fetch_related("school")
self.assertEqual(principal.school, school2)
self.assertEqual(await school.principal, None)
self.assertEqual(await school2.principal, principal)
async def test_update_by_id(self):
school = await testmodels.School.create(id=1024, name="School1")
school2 = await testmodels.School.create(id=2048, name="School2")
principal0 = await testmodels.Principal.create(name="Sang-Heon Jeon", school_id=school.id)
await testmodels.Principal.filter(id=principal0.id).update(school_id=school2.id)
principal = await testmodels.Principal.get(id=principal0.id)
self.assertEqual(principal.school_id, school2.id)
self.assertEqual(await school.principal, None)
self.assertEqual(await school2.principal, principal)
async def test_delete_by_name(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
del principal.school
with self.assertRaises(IntegrityError):
await principal.save()
async def test_principal__uninstantiated_create(self):
        school = testmodels.School(id=1024, name="School1")
with self.assertRaisesRegex(OperationalError, "You should first call .save()"):
await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
async def test_principal__instantiated_create(self):
school = await testmodels.School.create(id=1024, name="School1")
await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
async def test_principal__fetched_bool(self):
school = await testmodels.School.create(id=1024, name="School1")
await school.fetch_related("principal")
self.assertFalse(bool(school.principal))
await testmodels.Principal.create(name="Sang-Heon Jeon", school=school)
await school.fetch_related("principal")
self.assertTrue(bool(school.principal))
async def test_principal__filter(self):
school = await testmodels.School.create(id=1024, name="School1")
principal = await testmodels.Principal.create(name="Sang-Heon Jeon1", school=school)
self.assertEqual(await school.principal.filter(name="Sang-Heon Jeon1"), principal)
self.assertEqual(await school.principal.filter(name="Sang-Heon Jeon2"), None)
# === bgui/text/__init__.py (MIT license) ===

import abc
# TODO: This just follows the blf interface, which isn't very Pythonic

class TextLibrary(metaclass=abc.ABCMeta):
    """Class for handling text drawing."""
@abc.abstractmethod
def load(self, filename):
pass
@abc.abstractmethod
def draw(self, fontid, text):
pass
@abc.abstractmethod
def dimensions(self, fontid, text):
pass
@abc.abstractmethod
def position(self, fontid, x, y, z):
pass
@abc.abstractmethod
def size(self, fontid, size, dpi):
        pass
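A minimal concrete backend makes the abstract contract clearer. The interface is re-stated below so the sketch is self-contained, and RecordingText is a hypothetical backend, not part of any real library:

```python
import abc

class TextLibrary(abc.ABC):
    """Re-statement of the drawing interface above, trimmed to two methods."""

    @abc.abstractmethod
    def load(self, filename):
        pass

    @abc.abstractmethod
    def draw(self, fontid, text):
        pass

class RecordingText(TextLibrary):
    """Hypothetical backend that records calls instead of rendering."""

    def __init__(self):
        self.calls = []

    def load(self, filename):
        self.calls.append(("load", filename))
        return 0  # pretend font id

    def draw(self, fontid, text):
        self.calls.append(("draw", fontid, text))

txt = RecordingText()
fontid = txt.load("default.ttf")
txt.draw(fontid, "hello")
```

Instantiating TextLibrary itself raises TypeError, since its abstract methods are unimplemented; only concrete subclasses like RecordingText can be constructed.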
# === pymaster/workspaces.py (NaMaster, BSD-3-Clause license) ===

from pymaster import nmtlib as lib
import numpy as np
class NmtWorkspace(object):
"""
NmtWorkspace objects are used to compute and store the coupling matrix associated with an incomplete sky coverage, and used in the MASTER algorithm. When initialized, this object is practically empty. The information describing the coupling matrix must be computed or read from a file afterwards.
"""
def __init__(self):
self.wsp = None
def __del__(self):
if self.wsp is not None:
lib.workspace_free(self.wsp)
self.wsp = None
def read_from(self, fname):
"""
Reads the contents of an NmtWorkspace object from a file (encoded using an internal binary format).
:param str fname: input file name
"""
if self.wsp is not None:
lib.workspace_free(self.wsp)
self.wsp = None
self.wsp = lib.read_workspace(fname)
def compute_coupling_matrix(self, fl1, fl2, bins, is_teb=False):
"""
Computes coupling matrix associated with the cross-power spectrum of two NmtFields and an NmtBin binning scheme. Note that the mode coupling matrix will only contain ells up to the maximum multipole included in the NmtBin bandpowers.
:param NmtField fl1,fl2: fields to correlate
:param NmtBin bin: binning scheme
        :param boolean is_teb: if true, all mode-coupling matrices (0-0,0-2,2-2) will be computed at the same time. In this case, fl1 must be a spin-0 field and fl2 must be spin-2.
"""
if self.wsp is not None:
lib.workspace_free(self.wsp)
self.wsp = None
self.wsp = lib.comp_coupling_matrix(fl1.fl, fl2.fl, bins.bin, int(is_teb))
def write_to(self, fname):
"""
Writes the contents of an NmtWorkspace object to a file (encoded using an internal binary format).
:param str fname: output file name
"""
if self.wsp is None:
raise RuntimeError("Must initialize workspace before writing")
lib.write_workspace(self.wsp, fname)
    def get_coupling_matrix(self):
"""
Returns the currently stored mode-coupling matrix.
:return: mode-coupling matrix. The matrix will have shape `[nrows,nrows]`, with `nrows = n_cls * n_ells`, where `n_cls` is the number of power spectra (1, 2 or 4 for spin0-0, spin0-2 and spin2-2 correlations) and `n_ells = lmax + 1` (normally `lmax = 3 * nside - 1`). The assumed ordering of power spectra is such that the `l`-th element of the `i`-th power spectrum be stored with index `l * n_cls + i`.
"""
if self.wsp is None:
raise RuntimeError("Must initialize workspace before getting a MCM")
        nrows = (self.wsp.lmax + 1) * self.wsp.ncls
        return lib.get_mcm(self.wsp, nrows * nrows).reshape([nrows, nrows])
    def update_coupling_matrix(self, new_matrix):
"""
Updates the stored mode-coupling matrix.
The new matrix (`new_matrix`) must have shape `[nrows,nrows]`, with `nrows = n_cls * n_ells`, where `n_cls` is the number of power spectra (1, 2 or 4 for spin0-0, spin0-2 and spin2-2 correlations) and `n_ells = lmax + 1` (normally `lmax = 3 * nside - 1`). The assumed ordering of power spectra is such that the `l`-th element of the `i`-th power spectrum be stored with index `l * n_cls + i`.
:param new_matrix: matrix that will replace the mode-coupling matrix.
"""
if self.wsp is None:
raise RuntimeError("Must initialize workspace before updating MCM")
        if len(new_matrix) != (self.wsp.lmax + 1) * self.wsp.ncls:
            raise ValueError("Input matrix has an inconsistent size")
        lib.update_mcm(self.wsp, len(new_matrix), new_matrix.flatten())
def couple_cell(self, cl_in):
"""
Convolves a set of input power spectra with a coupling matrix (see Eq. 6 of the C API documentation).
:param cl_in: set of input power spectra. The number of power spectra must correspond to the spins of the two fields that this NmtWorkspace object was initialized with (i.e. 1 for two spin-0 fields, 2 for one spin-0 and one spin-2 field and 4 for two spin-2 fields).
:return: coupled power spectrum
"""
if (len(cl_in) != self.wsp.ncls) or (len(cl_in[0]) < self.wsp.lmax + 1):
raise ValueError("Input power spectrum has wrong shape")
cl1d = lib.couple_cell_py(self.wsp, cl_in, self.wsp.ncls * (self.wsp.lmax + 1))
clout = np.reshape(cl1d, [self.wsp.ncls, self.wsp.lmax + 1])
return clout
def decouple_cell(self, cl_in, cl_bias=None, cl_noise=None):
"""
        Decouples a set of pseudo-Cl power spectra into a set of bandpowers by inverting the binned coupling matrix (see Eq. 4 of the C API documentation).
:param cl_in: set of input power spectra. The number of power spectra must correspond to the spins of the two fields that this NmtWorkspace object was initialized with (i.e. 1 for two spin-0 fields, 2 for one spin-0 and one spin-2 field, 4 for two spin-2 fields and 7 if this NmtWorkspace was created using `is_teb=True`).
:param cl_bias: bias to the power spectrum associated to contaminant residuals (optional). This can be computed through :func:`pymaster.deprojection_bias`.
:param cl_noise: noise bias (i.e. angular power spectrum of masked noise realizations).
:return: set of decoupled bandpowers
"""
if (len(cl_in) != self.wsp.ncls) or (len(cl_in[0]) < self.wsp.lmax + 1):
raise ValueError("Input power spectrum has wrong shape")
if cl_bias is not None:
if (len(cl_bias) != self.wsp.ncls) or (len(cl_bias[0]) < self.wsp.lmax + 1):
raise ValueError("Input bias power spectrum has wrong shape")
clb = cl_bias.copy()
else:
clb = np.zeros_like(cl_in)
if cl_noise is not None:
if (len(cl_noise) != self.wsp.ncls) or (
len(cl_noise[0]) < self.wsp.lmax + 1
):
raise ValueError("Input noise power spectrum has wrong shape")
cln = cl_noise.copy()
else:
cln = np.zeros_like(cl_in)
cl1d = lib.decouple_cell_py(
self.wsp, cl_in, cln, clb, self.wsp.ncls * self.wsp.bin.n_bands
)
clout = np.reshape(cl1d, [self.wsp.ncls, self.wsp.bin.n_bands])
return clout
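The `l * n_cls + i` row ordering used by get_coupling_matrix and update_coupling_matrix can be made concrete with a tiny pure-Python sketch (the sizes below are chosen arbitrarily for illustration):

```python
n_cls = 2   # e.g. a spin-0 x spin-2 correlation has 2 power spectra
lmax = 2    # toy value; normally lmax = 3 * nside - 1
n_ells = lmax + 1

def row_index(l, i):
    # Row of the mode-coupling matrix that holds multipole l of spectrum i.
    return l * n_cls + i

# Spectra are interleaved ell by ell: rows 0..5 correspond to
# (l=0,i=0), (l=0,i=1), (l=1,i=0), (l=1,i=1), (l=2,i=0), (l=2,i=1)
order = [(l, i) for l in range(n_ells) for i in range(n_cls)]
```

A replacement matrix passed to update_coupling_matrix must follow this interleaved ordering along both of its axes.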
class NmtWorkspaceFlat(object):
"""
NmtWorkspaceFlat objects are used to compute and store the coupling matrix associated with an incomplete sky coverage, and used in the flat-sky version of the MASTER algorithm. When initialized, this object is practically empty. The information describing the coupling matrix must be computed or read from a file afterwards.
"""
def __init__(self):
self.wsp = None
def __del__(self):
if self.wsp is not None:
lib.workspace_flat_free(self.wsp)
self.wsp = None
def read_from(self, fname):
"""
Reads the contents of an NmtWorkspaceFlat object from a file (encoded using an internal binary format).
:param str fname: input file name
"""
if self.wsp is not None:
lib.workspace_flat_free(self.wsp)
self.wsp = None
self.wsp = lib.read_workspace_flat(fname)
def compute_coupling_matrix(
self, fl1, fl2, bins, ell_cut_x=[1., -1.], ell_cut_y=[1., -1.], is_teb=False
):
"""
Computes coupling matrix associated with the cross-power spectrum of two NmtFieldFlats and an NmtBinFlat binning scheme.
:param NmtFieldFlat fl1,fl2: fields to correlate
:param NmtBinFlat bin: binning scheme
:param float(2) ell_cut_x: remove all modes with ell_x in the interval [ell_cut_x[0],ell_cut_x[1]] from the calculation.
:param float(2) ell_cut_y: remove all modes with ell_y in the interval [ell_cut_y[0],ell_cut_y[1]] from the calculation.
        :param boolean is_teb: if true, all mode-coupling matrices (0-0,0-2,2-2) will be computed at the same time. In this case, fl1 must be a spin-0 field and fl2 must be spin-2.
"""
if self.wsp is not None:
lib.workspace_flat_free(self.wsp)
self.wsp = None
self.wsp = lib.comp_coupling_matrix_flat(
fl1.fl,
fl2.fl,
bins.bin,
ell_cut_x[0],
ell_cut_x[1],
ell_cut_y[0],
ell_cut_y[1],
int(is_teb),
)
def write_to(self, fname):
"""
Writes the contents of an NmtWorkspaceFlat object to a file (encoded using an internal binary format).
:param str fname: output file name
"""
if self.wsp is None:
raise RuntimeError("Must initialize workspace before writing")
lib.write_workspace_flat(self.wsp, fname)
def couple_cell(self, ells, cl_in):
"""
Convolves a set of input power spectra with a coupling matrix (see Eq. 6 of the C API documentation).
:param ells: list of multipoles on which the input power spectra are defined
:param cl_in: set of input power spectra. The number of power spectra must correspond to the spins of the two fields that this NmtWorkspaceFlat object was initialized with (i.e. 1 for two spin-0 fields, 2 for one spin-0 and one spin-2 field and 4 for two spin-2 fields).
:return: coupled power spectrum. The coupled power spectra are returned at the multipoles returned by calling :func:`get_ell_sampling` for any of the fields that were used to generate the workspace.
"""
if (len(cl_in) != self.wsp.ncls) or (len(cl_in[0]) != len(ells)):
raise ValueError("Input power spectrum has wrong shape")
cl1d = lib.couple_cell_py_flat(
self.wsp, ells, cl_in, self.wsp.ncls * self.wsp.bin.n_bands
)
clout = np.reshape(cl1d, [self.wsp.ncls, self.wsp.bin.n_bands])
return clout
def decouple_cell(self, cl_in, cl_bias=None, cl_noise=None):
"""
        Decouples a set of pseudo-Cl power spectra into a set of bandpowers by inverting the binned coupling matrix (see Eq. 4 of the C API documentation).
:param cl_in: set of input power spectra. The number of power spectra must correspond to the spins of the two fields that this NmtWorkspaceFlat object was initialized with (i.e. 1 for two spin-0 fields, 2 for one spin-0 and one spin-2 field, 4 for two spin-2 fields and 7 if this NmtWorkspaceFlat was created using `is_teb=True`). These power spectra must be defined at the multipoles returned by :func:`get_ell_sampling` for any of the fields used to create the workspace.
:param cl_bias: bias to the power spectrum associated to contaminant residuals (optional). This can be computed through :func:`pymaster.deprojection_bias_flat`.
:param cl_noise: noise bias (i.e. angular power spectrum of masked noise realizations).
:return: set of decoupled bandpowers
"""
if (len(cl_in) != self.wsp.ncls) or (len(cl_in[0]) != self.wsp.bin.n_bands):
raise ValueError("Input power spectrum has wrong shape")
if cl_bias is not None:
if (len(cl_bias) != self.wsp.ncls) or (
len(cl_bias[0]) != self.wsp.bin.n_bands
):
raise ValueError("Input bias power spectrum has wrong shape")
clb = cl_bias.copy()
else:
clb = np.zeros_like(cl_in)
if cl_noise is not None:
if (len(cl_noise) != self.wsp.ncls) or (
len(cl_noise[0]) != self.wsp.bin.n_bands
):
raise ValueError("Input noise power spectrum has wrong shape")
cln = cl_noise.copy()
else:
cln = np.zeros_like(cl_in)
cl1d = lib.decouple_cell_py_flat(
self.wsp, cl_in, cln, clb, self.wsp.ncls * self.wsp.bin.n_bands
)
clout = np.reshape(cl1d, [self.wsp.ncls, self.wsp.bin.n_bands])
return clout
def deprojection_bias(f1, f2, cls_guess):
"""
Computes the bias associated to contaminant removal to the cross-pseudo-Cl of two fields.
:param NmtField f1,f2: fields to correlate
:param cls_guess: set of power spectra corresponding to a best-guess of the true power spectra of f1 and f2.
:return: deprojection bias power spectra.
"""
if len(cls_guess) != f1.fl.nmaps * f2.fl.nmaps:
raise ValueError("Proposal Cell doesn't match number of maps")
if len(cls_guess[0]) != f1.fl.lmax + 1:
raise ValueError("Proposal Cell doesn't match map resolution")
cl1d = lib.comp_deproj_bias(
f1.fl, f2.fl, cls_guess, len(cls_guess) * len(cls_guess[0])
)
cl2d = np.reshape(cl1d, [len(cls_guess), len(cls_guess[0])])
return cl2d
def uncorr_noise_deprojection_bias(f1, map_var):
"""
Computes the bias associated to contaminant removal in the presence of uncorrelated inhomogeneous noise to the auto-pseudo-Cl of a given field f1.
:param NmtField f1: fields to correlate
    :param map_var: array containing a HEALPix map corresponding to the local noise variance (per steradian).
:return: deprojection bias power spectra.
"""
ncls = f1.fl.nmaps * f1.fl.nmaps
nells = f1.fl.lmax + 1
if len(map_var) != f1.fl.npix:
raise ValueError("Variance map doesn't match map resolution")
cl1d = lib.comp_uncorr_noise_deproj_bias(f1.fl, map_var, ncls * nells)
cl2d = np.reshape(cl1d, [ncls, nells])
return cl2d
def deprojection_bias_flat(
f1, f2, b, ells, cls_guess, ell_cut_x=[1., -1.], ell_cut_y=[1., -1.]
):
"""
Computes the bias associated to contaminant removal to the cross-pseudo-Cl of two flat-sky fields. The returned power spectrum is defined at the multipoles returned by the method :func:`get_ell_sampling` of either f1 or f2.
:param NmtFieldFlat f1,f2: fields to correlate
:param NmtBinFlat b: binning scheme defining output bandpower
:param ells: list of multipoles on which the proposal power spectra are defined
:param cls_guess: set of power spectra corresponding to a best-guess of the true power spectra of f1 and f2.
:param float(2) ell_cut_x: remove all modes with ell_x in the interval [ell_cut_x[0],ell_cut_x[1]] from the calculation.
:param float(2) ell_cut_y: remove all modes with ell_y in the interval [ell_cut_y[0],ell_cut_y[1]] from the calculation.
:return: deprojection bias power spectra.
"""
if len(cls_guess) != f1.fl.nmaps * f2.fl.nmaps:
raise ValueError("Proposal Cell doesn't match number of maps")
if len(cls_guess[0]) != len(ells):
raise ValueError("cls_guess and ells must have the same length")
cl1d = lib.comp_deproj_bias_flat(
f1.fl,
f2.fl,
b.bin,
ell_cut_x[0],
ell_cut_x[1],
ell_cut_y[0],
ell_cut_y[1],
ells,
cls_guess,
f1.fl.nmaps * f2.fl.nmaps * b.bin.n_bands,
)
cl2d = np.reshape(cl1d, [f1.fl.nmaps * f2.fl.nmaps, b.bin.n_bands])
return cl2d
def compute_coupled_cell(f1, f2):
"""
Computes the full-sky angular power spectra of two masked fields (f1 and f2) without aiming to deconvolve the mode-coupling matrix. Effectively, this is equivalent to calling the usual HEALPix anafast routine on the masked and contaminant-cleaned maps.
:param NmtField f1,f2: fields to correlate
:return: array of coupled power spectra
"""
if f1.fl.nside != f2.fl.nside:
raise ValueError("Fields must have same resolution")
cl1d = lib.comp_pspec_coupled(
f1.fl, f2.fl, f1.fl.nmaps * f2.fl.nmaps * (f1.fl.lmax + 1)
)
clout = np.reshape(cl1d, [f1.fl.nmaps * f2.fl.nmaps, f1.fl.lmax + 1])
return clout
def compute_coupled_cell_flat(f1, f2, b, ell_cut_x=[1., -1.], ell_cut_y=[1., -1.]):
"""
Computes the angular power spectra of two masked flat-sky fields (f1 and f2) without aiming to deconvolve the mode-coupling matrix. Effectively, this is equivalent to computing the map FFTs and averaging over rings of wavenumber. The returned power spectrum is defined at the multipoles returned by the method :func:`get_ell_sampling` of either f1 or f2.
:param NmtFieldFlat f1,f2: fields to correlate
:param NmtBinFlat b: binning scheme defining output bandpower
:param float(2) ell_cut_x: remove all modes with ell_x in the interval [ell_cut_x[0],ell_cut_x[1]] from the calculation.
:param float(2) ell_cut_y: remove all modes with ell_y in the interval [ell_cut_y[0],ell_cut_y[1]] from the calculation.
:return: array of coupled power spectra
"""
if (f1.nx != f2.nx) or (f1.ny != f2.ny):
raise ValueError("Fields must have same resolution")
cl1d = lib.comp_pspec_coupled_flat(
f1.fl,
f2.fl,
b.bin,
f1.fl.nmaps * f2.fl.nmaps * b.bin.n_bands,
ell_cut_x[0],
ell_cut_x[1],
ell_cut_y[0],
ell_cut_y[1],
)
clout = np.reshape(cl1d, [f1.fl.nmaps * f2.fl.nmaps, b.bin.n_bands])
return clout


def compute_full_master(f1, f2, b, cl_noise=None, cl_guess=None, workspace=None):
    """
    Computes the full MASTER estimate of the power spectrum of two fields
    (f1 and f2). This is equivalent to successively calling:

    - :func:`pymaster.NmtWorkspace.compute_coupling_matrix`
    - :func:`pymaster.deprojection_bias`
    - :func:`pymaster.compute_coupled_cell`
    - :func:`pymaster.NmtWorkspace.decouple_cell`

    :param NmtField f1,f2: fields to correlate
    :param NmtBin b: binning scheme defining output bandpower
    :param cl_noise: noise bias (i.e. angular power spectrum of masked noise
        realizations) (optional).
    :param cl_guess: set of power spectra corresponding to a best-guess of the
        true power spectra of f1 and f2. Needed only to compute the contaminant
        cleaning bias (optional).
    :param NmtWorkspace workspace: object containing the mode-coupling matrix
        associated with an incomplete sky coverage. If provided, the function
        will skip the computation of the mode-coupling matrix and use the
        information encoded in this object.
    :return: set of decoupled bandpowers
    """
    if f1.fl.nside != f2.fl.nside:
        raise ValueError("Fields must have same resolution")
    if cl_noise is not None:
        if len(cl_noise) != f1.fl.nmaps * f2.fl.nmaps:
            raise ValueError("Wrong length for noise power spectrum")
        cln = cl_noise.copy()
    else:
        cln = np.zeros([f1.fl.nmaps * f2.fl.nmaps, 3 * f1.fl.nside])
    if cl_guess is not None:
        if len(cl_guess) != f1.fl.nmaps * f2.fl.nmaps:
            raise ValueError("Wrong length for guess power spectrum")
        clg = cl_guess.copy()
    else:
        clg = np.zeros([f1.fl.nmaps * f2.fl.nmaps, 3 * f1.fl.nside])
    if workspace is None:
        cl1d = lib.comp_pspec(
            f1.fl, f2.fl, b.bin, None, cln, clg, len(cln) * b.bin.n_bands
        )
    else:
        cl1d = lib.comp_pspec(
            f1.fl, f2.fl, b.bin, workspace.wsp, cln, clg, len(cln) * b.bin.n_bands
        )
    clout = np.reshape(cl1d, [len(cln), b.bin.n_bands])
    return clout
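The algebra behind the final decoupling step can be sketched with a toy binned coupling matrix (pure numpy; the matrix and spectra below are invented stand-ins, not NaMaster output): the measured coupled bandpowers are the coupling matrix applied to the true spectrum plus a noise bias, and the estimator subtracts the bias and inverts the matrix.

```python
import numpy as np

# Toy MASTER estimator: cl_coupled = M @ cl_true + noise bias;
# decoupling subtracts the bias and solves the linear system.
rng = np.random.default_rng(1)
n_bands = 8
M = np.eye(n_bands) + 0.05 * rng.random((n_bands, n_bands))  # invented coupling matrix
cl_true = 1.0 / (np.arange(n_bands) + 1.0)
cl_noise = np.full(n_bands, 0.01)

cl_coupled = M @ cl_true + cl_noise            # what the coupled estimate would see
cl_decoupled = np.linalg.solve(M, cl_coupled - cl_noise)

print(np.allclose(cl_decoupled, cl_true))      # True
```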


def compute_full_master_flat(
    f1,
    f2,
    b,
    cl_noise=None,
    cl_guess=None,
    ells_guess=None,
    workspace=None,
    ell_cut_x=[1., -1.],
    ell_cut_y=[1., -1.],
):
    """
    Computes the full MASTER estimate of the power spectrum of two flat-sky
    fields (f1 and f2). This is equivalent to successively calling:

    - :func:`pymaster.NmtWorkspaceFlat.compute_coupling_matrix`
    - :func:`pymaster.deprojection_bias_flat`
    - :func:`pymaster.compute_coupled_cell_flat`
    - :func:`pymaster.NmtWorkspaceFlat.decouple_cell`

    :param NmtFieldFlat f1,f2: fields to correlate
    :param NmtBinFlat b: binning scheme defining output bandpower
    :param cl_noise: noise bias (i.e. angular power spectrum of masked noise
        realizations) (optional). This power spectrum should correspond to the
        bandpowers defined by b.
    :param cl_guess: set of power spectra corresponding to a best-guess of the
        true power spectra of f1 and f2. Needed only to compute the contaminant
        cleaning bias (optional).
    :param ells_guess: multipoles at which cl_guess is defined.
    :param NmtWorkspaceFlat workspace: object containing the mode-coupling
        matrix associated with an incomplete sky coverage. If provided, the
        function will skip the computation of the mode-coupling matrix and use
        the information encoded in this object.
    :param float(2) ell_cut_x: remove all modes with ell_x in the interval
        [ell_cut_x[0], ell_cut_x[1]] from the calculation.
    :param float(2) ell_cut_y: remove all modes with ell_y in the interval
        [ell_cut_y[0], ell_cut_y[1]] from the calculation.
    :return: set of decoupled bandpowers
    """
    if (f1.nx != f2.nx) or (f1.ny != f2.ny):
        raise ValueError("Fields must have same resolution")
    if cl_noise is not None:
        if (len(cl_noise) != f1.fl.nmaps * f2.fl.nmaps) or (
            len(cl_noise[0]) != b.bin.n_bands
        ):
            raise ValueError("Wrong length for noise power spectrum")
        cln = cl_noise.copy()
    else:
        cln = np.zeros([f1.fl.nmaps * f2.fl.nmaps, b.bin.n_bands])
    if cl_guess is not None:
        if ells_guess is None:
            raise ValueError("Must provide ell-values for cl_guess")
        if (len(cl_guess) != f1.fl.nmaps * f2.fl.nmaps) or (
            len(cl_guess[0]) != len(ells_guess)
        ):
            raise ValueError("Wrong length for guess power spectrum")
        lf = ells_guess.copy()
        clg = cl_guess.copy()
    else:
        lf = b.get_effective_ells()
        clg = np.zeros([f1.fl.nmaps * f2.fl.nmaps, b.bin.n_bands])
    if workspace is None:
        cl1d = lib.comp_pspec_flat(
            f1.fl,
            f2.fl,
            b.bin,
            None,
            cln,
            lf,
            clg,
            len(cln) * b.bin.n_bands,
            ell_cut_x[0],
            ell_cut_x[1],
            ell_cut_y[0],
            ell_cut_y[1],
        )
    else:
        cl1d = lib.comp_pspec_flat(
            f1.fl,
            f2.fl,
            b.bin,
            workspace.wsp,
            cln,
            lf,
            clg,
            len(cln) * b.bin.n_bands,
            ell_cut_x[0],
            ell_cut_x[1],
            ell_cut_y[0],
            ell_cut_y[1],
        )
    clout = np.reshape(cl1d, [len(cln), b.bin.n_bands])
    return clout

# === test/test_files/pylops/pytests/test_basicoperators.py (SoftwareUnderstanding/inspect4py, BSD-3-Clause) ===
import pytest
import numpy as np
from numpy.testing import assert_array_equal, assert_array_almost_equal
from scipy.sparse import rand
from scipy.sparse.linalg import lsqr
from pylops.utils import dottest
from pylops.basicoperators import Regression, LinearRegression, MatrixMult, \
    Identity, Zero, Flip, Symmetrize, Roll, Sum, Real, Imag, Conj
par1 = {'ny': 11, 'nx': 11, 'imag': 0,
        'dtype': 'float64'}   # square real
par2 = {'ny': 21, 'nx': 11, 'imag': 0,
        'dtype': 'float64'}   # overdetermined real
par1j = {'ny': 11, 'nx': 11, 'imag': 1j,
         'dtype': 'complex128'}   # square complex
par2j = {'ny': 21, 'nx': 11, 'imag': 1j,
         'dtype': 'complex128'}   # overdetermined complex
par3 = {'ny': 11, 'nx': 21, 'imag': 0,
        'dtype': 'float64'}   # underdetermined real
np.random.seed(10)
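The `dottest` helper used by every test below checks the adjoint identity: for a linear operator A and random vectors u, v, the inner product u·(A v) must equal (Aᴴ u)·v up to floating-point error. A minimal hand-rolled version for an explicit real matrix looks like this (illustrative only, not the pylops implementation):

```python
import numpy as np

# Hand-rolled dot test for an explicit real matrix A:
# <u, A v> must match <A^T u, v> up to floating-point error.
rng = np.random.default_rng(0)
A = rng.normal(size=(21, 11))
u = rng.normal(size=21)
v = rng.normal(size=11)

lhs = np.dot(u, A @ v)
rhs = np.dot(A.T @ u, v)
assert np.isclose(lhs, rhs)
```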


@pytest.mark.parametrize("par", [(par1), (par2)])
def test_Regression(par):
    """Dot-test, inversion and apply for Regression operator
    """
    np.random.seed(10)
    order = 4
    t = np.arange(par['ny'], dtype=np.float32)
    LRop = Regression(t, order=order, dtype=par['dtype'])
    assert dottest(LRop, par['ny'], order + 1)

    x = np.array([1., 2., 0., 3., -1.], dtype=np.float32)
    xlsqr = lsqr(LRop, LRop * x, damp=1e-10, iter_lim=300, show=0)[0]
    assert_array_almost_equal(x, xlsqr, decimal=3)

    y = LRop * x
    y1 = LRop.apply(t, x)
    assert_array_almost_equal(y, y1, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2)])
def test_LinearRegression(par):
    """Dot-test and inversion for LinearRegression operator
    """
    np.random.seed(10)
    t = np.arange(par['ny'], dtype=np.float32)
    LRop = LinearRegression(t, dtype=par['dtype'])
    assert dottest(LRop, par['ny'], 2)

    x = np.array([1., 2.], dtype=np.float32)
    xlsqr = lsqr(LRop, LRop * x, damp=1e-10, iter_lim=300, show=0)[0]
    assert_array_almost_equal(x, xlsqr, decimal=3)

    y = LRop * x
    y1 = LRop.apply(t, x)
    assert_array_almost_equal(y, y1, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_MatrixMult(par):
    """Dot-test and inversion for MatrixMult operator
    """
    np.random.seed(10)
    G = np.random.normal(0, 10, (par['ny'],
                                 par['nx'])).astype('float32') + \
        par['imag'] * np.random.normal(0, 10, (par['ny'],
                                               par['nx'])).astype('float32')
    Gop = MatrixMult(G, dtype=par['dtype'])
    assert dottest(Gop, par['ny'], par['nx'],
                   complexflag=0 if par['imag'] == 0 else 3)

    x = np.ones(par['nx']) + par['imag'] * np.ones(par['nx'])
    xlsqr = lsqr(Gop, Gop * x, damp=1e-20, iter_lim=300, show=0)[0]
    assert_array_almost_equal(x, xlsqr, decimal=4)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_MatrixMult_sparse(par):
    """Dot-test and inversion for MatrixMult operator using a sparse matrix
    """
    np.random.seed(10)
    G = rand(par['ny'], par['nx'], density=0.75).astype('float32') + \
        par['imag'] * rand(par['ny'], par['nx'], density=0.75).astype('float32')
    Gop = MatrixMult(G, dtype=par['dtype'])
    assert dottest(Gop, par['ny'], par['nx'],
                   complexflag=0 if par['imag'] == 1 else 3)

    x = np.ones(par['nx']) + par['imag'] * np.ones(par['nx'])
    xlsqr = lsqr(Gop, Gop * x, damp=1e-20, iter_lim=300, show=0)[0]
    assert_array_almost_equal(x, xlsqr, decimal=4)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_MatrixMult_repeated(par):
    """Dot-test and inversion for MatrixMult operator repeated
    along another dimension
    """
    np.random.seed(10)
    G = np.random.normal(0, 10, (par['ny'], par['nx'])).astype('float32') + \
        par['imag'] * np.random.normal(0, 10, (par['ny'],
                                               par['nx'])).astype('float32')
    Gop = MatrixMult(G, dims=5, dtype=par['dtype'])
    assert dottest(Gop, par['ny'] * 5, par['nx'] * 5,
                   complexflag=0 if par['imag'] == 1 else 3)

    x = (np.ones((par['nx'], 5)) +
         par['imag'] * np.ones((par['nx'], 5))).flatten()
    xlsqr = lsqr(Gop, Gop * x, damp=1e-20, iter_lim=300, show=0)[0]
    assert_array_almost_equal(x, xlsqr, decimal=4)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Identity_inplace(par):
    """Dot-test, forward and adjoint for Identity operator
    """
    np.random.seed(10)
    Iop = Identity(par['ny'], par['nx'], dtype=par['dtype'], inplace=True)
    assert dottest(Iop, par['ny'], par['nx'],
                   complexflag=0 if par['imag'] == 0 else 3)

    x = np.ones(par['nx']) + par['imag'] * np.ones(par['nx'])
    y = Iop * x
    x1 = Iop.H * y
    assert_array_almost_equal(x[:min(par['ny'], par['nx'])],
                              y[:min(par['ny'], par['nx'])], decimal=4)
    assert_array_almost_equal(x[:min(par['ny'], par['nx'])],
                              x1[:min(par['ny'], par['nx'])], decimal=4)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Identity_noinplace(par):
    """Dot-test, forward and adjoint for Identity operator (not in place)
    """
    np.random.seed(10)
    Iop = Identity(par['ny'], par['nx'], dtype=par['dtype'], inplace=False)
    assert dottest(Iop, par['ny'], par['nx'],
                   complexflag=0 if par['imag'] == 0 else 3)

    x = np.ones(par['nx']) + par['imag'] * np.ones(par['nx'])
    y = Iop * x
    x1 = Iop.H * y
    assert_array_almost_equal(x[:min(par['ny'], par['nx'])],
                              y[:min(par['ny'], par['nx'])], decimal=4)
    assert_array_almost_equal(x[:min(par['ny'], par['nx'])],
                              x1[:min(par['ny'], par['nx'])], decimal=4)

    # change value in x and check it doesn't change in y
    x[0] = 10
    assert x[0] != y[0]


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Zero(par):
    """Dot-test, forward and adjoint for Zero operator
    """
    np.random.seed(10)
    Zop = Zero(par['ny'], par['nx'], dtype=par['dtype'])
    assert dottest(Zop, par['ny'], par['nx'])

    x = np.ones(par['nx']) + par['imag'] * np.ones(par['nx'])
    y = Zop * x
    x1 = Zop.H * y
    assert_array_almost_equal(y, np.zeros(par['ny']))
    assert_array_almost_equal(x1, np.zeros(par['nx']))


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_Flip1D(par):
    """Dot-test, forward and adjoint for Flip operator on 1d signal
    """
    np.random.seed(10)
    x = np.arange(par['ny']) + par['imag'] * np.arange(par['ny'])

    Fop = Flip(par['ny'], dtype=par['dtype'])
    assert dottest(Fop, par['ny'], par['ny'])

    y = Fop * x
    xadj = Fop.H * y
    assert_array_equal(x, xadj)
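Why the Flip tests can assert exact (not approximate) equality: flipping is a permutation, so the operator is orthogonal and its adjoint undoes the forward pass exactly. A plain-numpy sketch of the same round trip:

```python
import numpy as np

# Flip as a permutation: the adjoint (another flip) restores x exactly,
# with no floating-point arithmetic involved.
x = np.arange(11.0)
y = x[::-1]        # forward: reverse the signal
xadj = y[::-1]     # adjoint: reverse it back
assert np.array_equal(x, xadj)
```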


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_Flip2D(par):
    """Dot-test, forward and adjoint for Flip operator on 2d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']), np.ones(par['nx'])) + \
        par['imag'] * np.outer(np.arange(par['ny']), np.ones(par['nx']))
    x['1'] = np.outer(np.ones(par['ny']), np.arange(par['nx'])) + \
        par['imag'] * np.outer(np.ones(par['ny']), np.arange(par['nx']))

    for dir in [0, 1]:
        Fop = Flip(par['ny'] * par['nx'], dims=(par['ny'], par['nx']),
                   dir=dir, dtype=par['dtype'])
        assert dottest(Fop, par['ny'] * par['nx'], par['ny'] * par['nx'])

        y = Fop * x[str(dir)].flatten()
        xadj = Fop.H * y.flatten()
        xadj = xadj.reshape(par['ny'], par['nx'])
        assert_array_equal(x[str(dir)], xadj)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j)])
def test_Flip3D(par):
    """Dot-test, forward and adjoint for Flip operator on 3d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.arange(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['1'] = np.outer(np.ones(par['ny']),
                      np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['2'] = np.outer(np.ones(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx'])

    for dir in [0, 1, 2]:
        Fop = Flip(par['ny'] * par['nx'] * par['nx'],
                   dims=(par['ny'], par['nx'], par['nx']),
                   dir=dir, dtype=par['dtype'])
        assert dottest(Fop, par['ny'] * par['nx'] * par['nx'],
                       par['ny'] * par['nx'] * par['nx'])

        y = Fop * x[str(dir)].flatten()
        xadj = Fop.H * y.flatten()
        xadj = xadj.reshape(par['ny'], par['nx'], par['nx'])
        assert_array_equal(x[str(dir)], xadj)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Symmetrize1D(par):
    """Dot-test, forward and inverse for Symmetrize operator on 1d signal
    """
    np.random.seed(10)
    x = np.arange(par['ny']) + par['imag'] * np.arange(par['ny'])

    Sop = Symmetrize(par['ny'], dtype=par['dtype'])
    dottest(Sop, par['ny'] * 2 - 1, par['ny'], verb=True)

    y = Sop * x
    xinv = Sop / y
    assert_array_almost_equal(x, xinv, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Symmetrize2D(par):
    """Dot-test, forward and inverse for Symmetrize operator on 2d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']), np.ones(par['nx'])) + \
        par['imag'] * np.outer(np.arange(par['ny']), np.ones(par['nx']))
    x['1'] = np.outer(np.ones(par['ny']), np.arange(par['nx'])) + \
        par['imag'] * np.outer(np.ones(par['ny']), np.arange(par['nx']))

    for dir in [0, 1]:
        Sop = Symmetrize(par['ny'] * par['nx'],
                         dims=(par['ny'], par['nx']),
                         dir=dir, dtype=par['dtype'])
        y = Sop * x[str(dir)].flatten()
        assert dottest(Sop, y.size, par['ny'] * par['nx'])

        xinv = Sop / y
        assert_array_almost_equal(x[str(dir)].ravel(), xinv, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Symmetrize3D(par):
    """Dot-test, forward and adjoint for Symmetrize operator on 3d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.arange(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['1'] = np.outer(np.ones(par['ny']),
                      np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['2'] = np.outer(np.ones(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx'])

    for dir in [0, 1, 2]:
        Sop = Symmetrize(par['ny'] * par['nx'] * par['nx'],
                         dims=(par['ny'], par['nx'], par['nx']),
                         dir=dir, dtype=par['dtype'])
        y = Sop * x[str(dir)].flatten()
        assert dottest(Sop, y.size, par['ny'] * par['nx'] * par['nx'])

        xinv = Sop / y
        assert_array_almost_equal(x[str(dir)].ravel(), xinv, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Roll1D(par):
    """Dot-test, forward and adjoint for Roll operator on 1d signal
    """
    np.random.seed(10)
    x = np.arange(par['ny']) + par['imag'] * np.arange(par['ny'])

    Rop = Roll(par['ny'], shift=2, dtype=par['dtype'])
    assert dottest(Rop, par['ny'], par['ny'])

    y = Rop * x
    xadj = Rop.H * y
    assert_array_almost_equal(x, xadj, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Roll2D(par):
    """Dot-test, forward and inverse for Roll operator on 2d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']), np.ones(par['nx'])) + \
        par['imag'] * np.outer(np.arange(par['ny']),
                               np.ones(par['nx']))
    x['1'] = np.outer(np.ones(par['ny']), np.arange(par['nx'])) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.arange(par['nx']))

    for dir in [0, 1]:
        Rop = Roll(par['ny'] * par['nx'],
                   dims=(par['ny'], par['nx']),
                   dir=dir, shift=-2, dtype=par['dtype'])
        y = Rop * x[str(dir)].flatten()
        assert dottest(Rop, par['ny'] * par['nx'], par['ny'] * par['nx'])

        xadj = Rop.H * y
        assert_array_almost_equal(x[str(dir)].ravel(), xadj, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Roll3D(par):
    """Dot-test, forward and adjoint for Roll operator on 3d signal
    """
    np.random.seed(10)
    x = {}
    x['0'] = np.outer(np.arange(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.arange(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['1'] = np.outer(np.ones(par['ny']),
                      np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.arange(par['nx']))[:, :, np.newaxis] * \
        np.ones(par['nx'])
    x['2'] = np.outer(np.ones(par['ny']),
                      np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx']) + \
        par['imag'] * np.outer(np.ones(par['ny']),
                               np.ones(par['nx']))[:, :, np.newaxis] * \
        np.arange(par['nx'])

    for dir in [0, 1, 2]:
        Rop = Roll(par['ny'] * par['nx'] * par['nx'],
                   dims=(par['ny'], par['nx'], par['nx']),
                   dir=dir, shift=3, dtype=par['dtype'])
        y = Rop * x[str(dir)].flatten()
        assert dottest(Rop, par['ny'] * par['nx'] * par['nx'],
                       par['ny'] * par['nx'] * par['nx'])

        xinv = Rop.H * y
        assert_array_almost_equal(x[str(dir)].ravel(), xinv, decimal=3)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Sum2D(par):
    """Dot-test for Sum operator on 2d signal
    """
    for dir in [0, 1]:
        dim_d = [par['ny'], par['nx']]
        dim_d.pop(dir)
        Sop = Sum(dims=(par['ny'], par['nx']),
                  dir=dir, dtype=par['dtype'])
        assert dottest(Sop, np.prod(dim_d), par['ny'] * par['nx'])


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Sum3D(par):
    """Dot-test, forward and adjoint for Sum operator on 3d signal
    """
    for dir in [0, 1, 2]:
        dim_d = [par['ny'], par['nx'], par['nx']]
        dim_d.pop(dir)
        Sop = Sum(dims=(par['ny'], par['nx'], par['nx']),
                  dir=dir, dtype=par['dtype'])
        assert dottest(Sop, np.prod(dim_d), par['ny'] * par['nx'] * par['nx'])


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Real(par):
    """Dot-test, forward and adjoint for Real operator
    """
    Rop = Real(dims=(par['ny'], par['nx']), dtype=par['dtype'])
    if np.dtype(par['dtype']).kind == 'c':
        complexflag = 3
    else:
        complexflag = 0
    assert dottest(Rop, par['ny'] * par['nx'], par['ny'] * par['nx'],
                   complexflag=complexflag)

    np.random.seed(10)
    x = (np.random.randn(par['nx'] * par['ny'])
         + par['imag'] * np.random.randn(par['nx'] * par['ny']))
    y = Rop * x
    assert_array_equal(y, np.real(x))

    y = (np.random.randn(par['nx'] * par['ny'])
         + par['imag'] * np.random.randn(par['nx'] * par['ny']))
    x = Rop.H * y
    assert_array_equal(x, np.real(y) + 0j)
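The assertion `x == np.real(y) + 0j` reflects what the adjoint of the real-part map does: it re-embeds a real vector into the complex space with zero imaginary part. A plain-numpy sketch of the same relationship:

```python
import numpy as np

# Forward: drop the imaginary part. Adjoint: embed back as complex
# with zero imaginary part (the result is complex-typed but purely real).
y = np.array([1.5 - 2.0j, -0.5 + 1.0j])
x = np.real(y) + 0j
assert x.dtype.kind == 'c'
assert np.array_equal(x, np.array([1.5 + 0j, -0.5 + 0j]))
```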


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Imag(par):
    """Dot-test, forward and adjoint for Imag operator
    """
    Iop = Imag(dims=(par['ny'], par['nx']), dtype=par['dtype'])
    if np.dtype(par['dtype']).kind == 'c':
        complexflag = 3
    else:
        complexflag = 0
    assert dottest(Iop, par['ny'] * par['nx'], par['ny'] * par['nx'],
                   complexflag=complexflag)

    np.random.seed(10)
    x = (np.random.randn(par['nx'] * par['ny'])
         + par['imag'] * np.random.randn(par['nx'] * par['ny']))
    y = Iop * x
    assert_array_equal(y, np.imag(x))

    y = (np.random.randn(par['nx'] * par['ny'])
         + par['imag'] * np.random.randn(par['nx'] * par['ny']))
    x = Iop.H * y
    if np.dtype(par['dtype']).kind == 'c':
        assert_array_equal(x, 0 + 1j * np.real(y))
    else:
        assert_array_equal(x, 0)


@pytest.mark.parametrize("par", [(par1), (par2), (par1j), (par2j), (par3)])
def test_Conj(par):
    """Dot-test, forward and adjoint for Conj operator
    """
    Cop = Conj(dims=(par['ny'], par['nx']), dtype=par['dtype'])
    if np.dtype(par['dtype']).kind == 'c':
        complexflag = 3
    else:
        complexflag = 0
    assert dottest(Cop, par['ny'] * par['nx'], par['ny'] * par['nx'],
                   complexflag=complexflag)

    np.random.seed(10)
    x = (np.random.randn(par['nx'] * par['ny'])
         + par['imag'] * np.random.randn(par['nx'] * par['ny']))
    y = Cop * x
    xadj = Cop.H * y
    assert_array_equal(x, xadj)
    assert_array_equal(y, np.conj(x))
    assert_array_equal(xadj, np.conj(y))

# === pipy/__init__.py (rhsmits91/pipy, MIT) ===
from pipy import parameters, pipeline, widgets
from pipy.tests import testing

# === netmiko/zte/__init__.py (mtuska/netmiko, MIT) ===
from netmiko.zte.zte_zxros import ZteZxrosSSH
from netmiko.zte.zte_zxros import ZteZxrosTelnet
__all__ = ["ZteZxrosSSH", "ZteZxrosTelnet"]

# === tes.py (nucklehead/voxforge-kreyol, MIT) ===
import io
from flask import Flask
from stokaj_fichye.nextcloud.kliyan import stoke_fichye_a
app = Flask(__name__)
stoke_fichye_a(app, 'test.zip', io.StringIO('asdasd'))

# === New_World.py (sayaji/MyFirstRepo, MIT) ===
print("Hello New World!!")
9249c5f8c4e60f618f8eeec811c203e922c23c37 | 913 | py | Python | src/robobo/movement/simple_movements.py | josejorgers/bobo-script | 02641c420ec3f7eca7031d789b3638bde0f7fd1d | [
"MIT"
] | 1 | 2021-10-20T20:53:24.000Z | 2021-10-20T20:53:24.000Z | src/robobo/movement/simple_movements.py | josejorgers/bobo-script | 02641c420ec3f7eca7031d789b3638bde0f7fd1d | [
"MIT"
] | null | null | null | src/robobo/movement/simple_movements.py | josejorgers/bobo-script | 02641c420ec3f7eca7031d789b3638bde0f7fd1d | [
"MIT"
] | null | null | null | def move_forward(robot, speed, time=None):
    if time is None:
        robot.moveWheels(speed, speed)
    else:
        robot.moveWheelsByTime(speed, speed, time)


def turn_right(robot, speed, time=None):
    if time is None:
        robot.moveWheels(-speed, speed)
    else:
        robot.moveWheelsByTime(-speed, speed, time)


def turn_left(robot, speed, time=None):
    if time is None:
        robot.moveWheels(speed, -speed)
    else:
        robot.moveWheelsByTime(speed, -speed, time)


def move_backward(robot, speed, time=None):
    if time is None:
        robot.moveWheels(-speed, -speed)
    else:
        robot.moveWheelsByTime(-speed, -speed, time)


def diagonal_movement(robot, speed_left, speed_right, time=None):
    if time is None:
        robot.moveWheels(speed_left, speed_right)
    else:
        robot.moveWheelsByTime(speed_left, speed_right, time)


def stop(robot):
    robot.stopMotors()
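A minimal sketch of how these helpers might be exercised without hardware. `FakeRobot` is a made-up stand-in that only records the calls it receives; the real Robobo object is assumed to expose `moveWheels`, `moveWheelsByTime` and `stopMotors` as used above. Two of the helpers are repeated here so the snippet is self-contained:

```python
class FakeRobot:
    """Records the wheel commands it receives instead of moving motors."""
    def __init__(self):
        self.calls = []

    def moveWheels(self, left, right):
        self.calls.append(('moveWheels', left, right))

    def moveWheelsByTime(self, left, right, time):
        self.calls.append(('moveWheelsByTime', left, right, time))

    def stopMotors(self):
        self.calls.append(('stopMotors',))


def move_forward(robot, speed, time=None):
    if time is None:
        robot.moveWheels(speed, speed)
    else:
        robot.moveWheelsByTime(speed, speed, time)


def stop(robot):
    robot.stopMotors()


robot = FakeRobot()
move_forward(robot, 20)            # untimed: run until stopped
move_forward(robot, 20, time=2)    # timed variant
stop(robot)
print(robot.calls)
```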

# === ummon/utils/__init__.py (matherm/ummon3, BSD-3-Clause) ===
from .average_utils import *
from .data_utils import *
from .dataset_utils import *
from .memory_utils import *
from .model_utils import *
from .stats_utils import *
from .time_utils import *
from .batch_utils import *
from .FastDataLoader import *
9293ac049518c688e6881850ed7804dba357975f | 43 | py | Python | lqr_mdp/__init__.py | vincentzhang/lqr_mdp | 91d531bd534a8594a4175b8ae2f1db00228a42cb | [
"Apache-2.0"
] | 2 | 2020-08-11T17:37:43.000Z | 2021-08-25T05:28:25.000Z | lqr_mdp/__init__.py | vincentzhang/lqr_mdp | 91d531bd534a8594a4175b8ae2f1db00228a42cb | [
"Apache-2.0"
] | null | null | null | lqr_mdp/__init__.py | vincentzhang/lqr_mdp | 91d531bd534a8594a4175b8ae2f1db00228a42cb | [
"Apache-2.0"
] | null | null | null | from .lqr import MDP_LQR_Cont, MDP_LQR_Disc | 43 | 43 | 0.860465 | 9 | 43 | 3.666667 | 0.666667 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 43 | 1 | 43 | 43 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |

# === jc3_api.gyp (xforce/jc3-api, MIT) ===
{
  'targets': [
    {
      'target_name': 'jc3_api',
      'type': 'static_library',
      'dependencies': [
        'deps/jc3_hooking/jc3_hooking.gyp:jc3_hooking',
      ],
      'direct_dependent_settings': {
        'include_dirs': ['.', 'deps/boost', 'deps/stl'],
      },
      'include_dirs': ['.', 'deps/boost', 'deps/stl'],
      'sources': [
        'jc3/entities/character.h',
        'jc3/entities/character.cpp',
        'jc3/entities/damageable.h',
        'jc3/entities/game_object.h',
        'jc3/entities/game_object.cpp',
        'jc3/entities/physics_game_object.h',
        'jc3/entities/player.h',
        'jc3/entities/vehicle.h',
        'jc3/entities/pfx/pfx_breakable.h',
        'jc3/entities/pfx/pfx_rigid_body.h',
        'jc3/entities/pfx/pfx_vehicle.h',
        'jc3/entities/pfx/pfx_game_object.h',
        'jc3/entities/pfx/pfx_instance.h',
        'jc3/entities/pfx/air_aerodynamics.h',
        'jc3/entities/pfx/air_audio.h',
        'jc3/entities/pfx/air_engine.h',
        'jc3/entities/pfx/air_global.h',
        'jc3/entities/pfx/boat_audio.h',
        'jc3/entities/pfx/boat_buoyancy.h',
        'jc3/entities/pfx/boat_engine.h',
        'jc3/entities/pfx/boat_global.h',
        'jc3/entities/pfx/boat_steering.h',
        'jc3/entities/pfx/brakes.h',
        'jc3/entities/pfx/buoyancy.h',
        'jc3/entities/pfx/custom_land_global.h',
        'jc3/entities/pfx/custom_velocity_damper.h',
        'jc3/entities/pfx/driver_lean.h',
        'jc3/entities/pfx/effect_attachments.h',
        'jc3/entities/pfx/fins.h',
        'jc3/entities/pfx/helicopter_model.h',
        'jc3/entities/pfx/helicopter_steering.h',
        'jc3/entities/pfx/land_aerodynamics.h',
        'jc3/entities/pfx/land_audio.h',
        'jc3/entities/pfx/land_global.h',
        'jc3/entities/pfx/land_steering.h',
        'jc3/entities/pfx/lights.h',
        'jc3/entities/pfx/motorbike_steering.h',
        'jc3/entities/pfx/motorbike_suspension.h',
        'jc3/entities/pfx/propellers.h',
        'jc3/entities/pfx/rotors.h',
        'jc3/entities/pfx/rudders.h',
        'jc3/entities/pfx/suspension.h',
        'jc3/entities/pfx/transmission.h',
        'jc3/entities/pfx/vehicle_misc.h',
        'jc3/entities/pfx/water_interaction.h',
        'jc3/spawn/spawn_system.h',
        'jc3/spawn/spawn_system.cpp',
        'jc3/ui/overlay_ui.h',
        'jc3/ui/overlay_ui.cpp',
      ],
    },
  ]
}

# === toontown/speedchat/TTSCDecoders.py (AnonymousDeveloper65535/open-toontown, BSD-3-Clause) ===
from TTSCToontaskTerminal import decodeTTSCToontaskMsg
from TTSCResistanceTerminal import decodeTTSCResistanceMsg
| 38 | 58 | 0.929825 | 8 | 114 | 13.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070175 | 114 | 2 | 59 | 57 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2bdf3e5afff522af274b522f668f840a089c7c62 | 4,381 | py | Python | tests/test_cfm_advance.py | vinniemo90/madden-cfm-chatbot | 062f22860d2848edafba5094695ef244c3d5d02e | [
"MIT"
] | 4 | 2019-05-27T20:53:28.000Z | 2020-12-22T00:42:14.000Z | tests/test_cfm_advance.py | vinniemo90/madden-cfm-chatbot | 062f22860d2848edafba5094695ef244c3d5d02e | [
"MIT"
] | 2 | 2019-10-24T00:01:50.000Z | 2021-04-30T20:48:45.000Z | tests/test_cfm_advance.py | vinniemo90/madden-cfm-chatbot | 062f22860d2848edafba5094695ef244c3d5d02e | [
"MIT"
] | 8 | 2019-02-23T09:43:34.000Z | 2020-02-27T23:49:54.000Z | from unittest.mock import patch
import sys
sys.path.append('./')
import cfm_advance
import cfm_schedule
import constants
from firebase_admin import db
@patch('firebase_admin.db')
def test_advance_to_preseason(mock_db):
db_root = mock_db.reference()
schedule = cfm_advance.advance_to_preseason(db_root)
assert isinstance(schedule, list)
assert schedule[0] == 'Preseason Week 1 Schedule'
@patch('cfm_schedule.get_user_games')
@patch('firebase_admin.db')
def test_advance_to_preseason_error(mock_db, mock_get_user_games):
db_root = mock_db.reference()
mock_get_user_games.return_value = [constants.USER_GAME_ERROR]
schedule = cfm_advance.advance_to_preseason(db_root)
assert isinstance(schedule, list)
assert schedule[0] == constants.USER_GAME_ERROR
@patch('firebase_admin.db')
def test_advance_to_reg(mock_db):
db_root = mock_db.reference()
schedule = cfm_advance.advance_to_reg(db_root)
assert isinstance(schedule, list)
assert schedule[0] == 'Regular Season Week 1 Schedule'
@patch('cfm_schedule.get_user_games')
@patch('firebase_admin.db')
def test_advance_to_reg_error(mock_db, mock_get_user_games):
db_root = mock_db.reference()
mock_get_user_games.return_value = [constants.USER_GAME_ERROR]
schedule = cfm_advance.advance_to_reg(db_root)
assert isinstance(schedule, list)
assert schedule[0] == constants.USER_GAME_ERROR
@patch('firebase_admin.db')
def test_advance_to_playoffs(mock_db):
db_root = mock_db.reference()
schedule = cfm_advance.advance_to_playoffs(db_root)
assert isinstance(schedule, list)
assert schedule[0] == 'Wildcard Schedule'
@patch('cfm_schedule.get_user_games')
@patch('firebase_admin.db')
def test_advance_to_playoffs_error(mock_db, mock_get_user_games):
db_root = mock_db.reference()
mock_get_user_games.return_value = [constants.USER_GAME_ERROR]
schedule = cfm_advance.advance_to_playoffs(db_root)
assert isinstance(schedule, list)
assert schedule[0] == constants.USER_GAME_ERROR
@patch('cfm_advance.advance_to_preseason')
@patch('firebase_admin.db')
def test_advance_pre(mock_db, mock_advance_to_preseason):
db_root = mock_db.reference()
mock_advance_to_preseason.return_value = ['abc']
schedule = cfm_advance.advance(db_root, ['/advance', 'pre'], 0)
assert schedule == 'abc'
@patch('cfm_advance.advance_to_preseason')
@patch('firebase_admin.db')
def test_advance_pre_error(mock_db, mock_adv_to_preseason):
db_root = mock_db
mock_adv_to_preseason.side_effect = Exception
schedule = cfm_advance.advance(db_root, ['/advance', 'pre'], 0)
assert schedule == constants.UNEXPECTED_ERR_MSG
@patch('cfm_advance.advance_to_reg')
@patch('firebase_admin.db')
def test_advance_reg(mock_db, mock_adv_reg):
db_root = mock_db.reference()
mock_adv_reg.return_value = ['foo']
schedule = cfm_advance.advance(db_root, ['/advance', 'reg'], 0)
assert schedule == 'foo'
@patch('cfm_advance.advance_to_reg')
@patch('firebase_admin.db')
def test_advance_reg_error(mock_db, mock_adv_reg):
db_root = mock_db
mock_adv_reg.side_effect = Exception
schedule = cfm_advance.advance(db_root, ['/advance', 'reg'], 0)
assert schedule == constants.UNEXPECTED_ERR_MSG
@patch('cfm_advance.advance_to_playoffs')
@patch('firebase_admin.db')
def test_advance_playoffs(mock_db, mock_adv_to_playoffs):
db_root = mock_db.reference()
mock_adv_to_playoffs.return_value = ['bar']
schedule = cfm_advance.advance(db_root, ['/advance', 'playoffs'], 0)
assert schedule == 'bar'
@patch('cfm_advance.advance_to_playoffs')
@patch('firebase_admin.db')
def test_advance_playoffs_error(mock_db, mock_adv_to_playoffs):
db_root = mock_db
mock_adv_to_playoffs.side_effect = Exception
schedule = cfm_advance.advance(db_root, ['/advance', 'playoffs'], 0)
assert schedule == constants.UNEXPECTED_ERR_MSG
@patch('firebase_admin.db')
def test_advance_offseason(mock_db):
db_root = mock_db.reference()
schedule = cfm_advance.advance(db_root, ['/advance', 'offseason'], 0)
assert schedule == 'Offseason Stage 1: Resign Players'
@patch('firebase_admin.db')
def test_advance_offseason_error(mock_db):
db_root = mock_db
mock_db.update.side_effect = Exception
schedule = cfm_advance.advance(db_root, ['/advance', 'offseason'], 0)
assert schedule == constants.UNEXPECTED_ERR_MSG | 37.767241 | 73 | 0.761242 | 631 | 4,381 | 4.892235 | 0.087163 | 0.056365 | 0.110139 | 0.090703 | 0.907677 | 0.901523 | 0.865241 | 0.837059 | 0.801101 | 0.758341 | 0 | 0.004433 | 0.124629 | 4,381 | 116 | 74 | 37.767241 | 0.800522 | 0 | 0 | 0.656863 | 0 | 0 | 0.167047 | 0.059105 | 0 | 0 | 0 | 0 | 0.196078 | 1 | 0.137255 | false | 0 | 0.058824 | 0 | 0.196078 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9203a75195ae3a4280d422554e6bcadf44057704 | 245 | py | Python | main.py | lmallez/wikidata-tree-generator | 4fe6b8af6615083e670bdd9495624f4292fd53c0 | [
"MIT"
] | 4 | 2020-07-06T09:48:30.000Z | 2020-10-27T06:56:44.000Z | main.py | lmallez/wikidata-tree-generator | 4fe6b8af6615083e670bdd9495624f4292fd53c0 | [
"MIT"
] | 2 | 2020-10-10T13:59:19.000Z | 2021-06-25T15:44:46.000Z | main.py | lmallez/wikidata-tree-generator | 4fe6b8af6615083e670bdd9495624f4292fd53c0 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import sys
from wikidata.entity import EntityId
from wikidata_tree_generator.generate_from_yaml import generate_from_yaml
if __name__ == '__main__':
generate_from_yaml(sys.argv[1], EntityId(sys.argv[2]), sys.argv[3])
| 30.625 | 73 | 0.791837 | 38 | 245 | 4.684211 | 0.552632 | 0.202247 | 0.269663 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0181 | 0.097959 | 245 | 7 | 74 | 35 | 0.78733 | 0.085714 | 0 | 0 | 1 | 0 | 0.035874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
920733ec4074f412b6c779d11278321f860b5354 | 42 | py | Python | lithops/standalone/__init__.py | kpavel/lithops | 395eef8b283512bd714d3633dcd94258e1df620c | [
"Apache-2.0"
] | 158 | 2020-09-16T13:22:03.000Z | 2022-03-28T20:01:31.000Z | lithops/standalone/__init__.py | kpavel/lithops | 395eef8b283512bd714d3633dcd94258e1df620c | [
"Apache-2.0"
] | 256 | 2018-05-20T13:01:51.000Z | 2020-09-16T09:09:54.000Z | lithops/standalone/__init__.py | kpavel/lithops | 395eef8b283512bd714d3633dcd94258e1df620c | [
"Apache-2.0"
] | 48 | 2020-09-19T15:29:53.000Z | 2022-03-23T17:08:24.000Z | from .standalone import StandaloneHandler
| 21 | 41 | 0.880952 | 4 | 42 | 9.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 42 | 1 | 42 | 42 | 0.973684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a6219320097cbf34b8d717f219794d77a379d273 | 33 | py | Python | src/vbox/config/__init__.py | autumnjolitz/vbox | 38015ee0bb2316982490172068b791762e092a85 | [
"BSD-3-Clause"
] | 1 | 2020-08-12T15:03:31.000Z | 2020-08-12T15:03:31.000Z | src/vbox/config/__init__.py | autumnjolitz/vbox | 38015ee0bb2316982490172068b791762e092a85 | [
"BSD-3-Clause"
] | null | null | null | src/vbox/config/__init__.py | autumnjolitz/vbox | 38015ee0bb2316982490172068b791762e092a85 | [
"BSD-3-Clause"
] | null | null | null | from .interface import VirtualBox | 33 | 33 | 0.878788 | 4 | 33 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 1 | 33 | 33 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a6219908ed7ba5c1557dce5486dbcb8dde0e18dd | 223 | py | Python | genesys/genesys/codelets/adl/graph/__init__.py | ziqingzeng/public | 4102b3bd42f43b49cf74599492d52d4f755ab7b2 | [
"BSD-3-Clause"
] | 6 | 2021-04-20T06:33:25.000Z | 2022-02-24T06:46:13.000Z | genesys/genesys/codelets/adl/graph/__init__.py | ziqingzeng/public | 4102b3bd42f43b49cf74599492d52d4f755ab7b2 | [
"BSD-3-Clause"
] | 3 | 2021-04-20T04:28:51.000Z | 2021-05-24T05:14:31.000Z | genesys/genesys/codelets/adl/graph/__init__.py | ziqingzeng/public | 4102b3bd42f43b49cf74599492d52d4f755ab7b2 | [
"BSD-3-Clause"
] | 4 | 2021-04-08T16:38:46.000Z | 2021-04-30T05:51:30.000Z | from .architecture_graph import ArchitectureGraph
from .architecture_node import ArchitectureNode
from .compute_node import ComputeNode
from .communication_node import CommunicationNode
from .storage_node import StorageNode | 44.6 | 49 | 0.892377 | 25 | 223 | 7.76 | 0.52 | 0.206186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085202 | 223 | 5 | 50 | 44.6 | 0.95098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a624f7ddb4f955832a199cdfc284b3c19c67c149 | 98 | py | Python | tests/local_airflow/load_lib_test.py | Fahadsaadullahkhan/KubernetesJobOperator | d96f9498667f937503d1e45142060904674f823f | [
"MIT"
] | 35 | 2020-02-10T16:55:41.000Z | 2022-03-18T01:25:00.000Z | tests/local_airflow/load_lib_test.py | Fahadsaadullahkhan/KubernetesJobOperator | d96f9498667f937503d1e45142060904674f823f | [
"MIT"
] | 26 | 2020-02-10T05:36:44.000Z | 2022-03-02T18:44:47.000Z | tests/local_airflow/load_lib_test.py | Fahadsaadullahkhan/KubernetesJobOperator | d96f9498667f937503d1e45142060904674f823f | [
"MIT"
] | 8 | 2020-02-28T23:24:07.000Z | 2021-11-29T21:35:46.000Z | from airflow_kubernetes_job_operator.kubernetes_job_operator import (
KubernetesJobOperator,
) | 32.666667 | 69 | 0.867347 | 10 | 98 | 8 | 0.7 | 0.325 | 0.525 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091837 | 98 | 3 | 70 | 32.666667 | 0.898876 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
5b4a4333ee04b641d1595c1defee0e62dcd12652 | 30 | py | Python | pythonBase/advancePyton/chapter13/asyncio_http.py | cangchengkun/pythonbase | 4e01331b1c7c13d86f32f697dd812cb267abe7ef | [
"CNRI-Python"
] | null | null | null | pythonBase/advancePyton/chapter13/asyncio_http.py | cangchengkun/pythonbase | 4e01331b1c7c13d86f32f697dd812cb267abe7ef | [
"CNRI-Python"
] | null | null | null | pythonBase/advancePyton/chapter13/asyncio_http.py | cangchengkun/pythonbase | 4e01331b1c7c13d86f32f697dd812cb267abe7ef | [
"CNRI-Python"
] | null | null | null | # asyncio没有提供http协议接口 aiohttp
| 15 | 29 | 0.866667 | 2 | 30 | 13 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 30 | 1 | 30 | 30 | 0.962963 | 0.9 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5b4cf4c8f4e7391bb6f6f9f20746f132df30dd4e | 21 | py | Python | KD_Lib/KD/vision/DML/__init__.py | khizirsiddiqui/KD_Lib | 0eccceccf9bbc994ec4b380a75114192cf358286 | [
"MIT"
] | 360 | 2020-05-11T08:18:20.000Z | 2022-03-31T01:48:43.000Z | KD_Lib/KD/vision/DML/__init__.py | khizirsiddiqui/KD_Lib | 0eccceccf9bbc994ec4b380a75114192cf358286 | [
"MIT"
] | 91 | 2020-05-11T08:14:56.000Z | 2022-03-30T05:29:03.000Z | KD_Lib/KD/vision/DML/__init__.py | khizirsiddiqui/KD_Lib | 0eccceccf9bbc994ec4b380a75114192cf358286 | [
"MIT"
] | 39 | 2020-05-11T08:06:47.000Z | 2022-03-29T05:11:18.000Z | from .dml import DML
| 10.5 | 20 | 0.761905 | 4 | 21 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5b5008acd1ac79adf36d5999e8c804216147cdb6 | 2,063 | py | Python | src/envassert/file.py | shirou/envassert | 1113eb7fd1b1dffa28243e9fd11fd49830388124 | [
"MIT"
] | null | null | null | src/envassert/file.py | shirou/envassert | 1113eb7fd1b1dffa28243e9fd11fd49830388124 | [
"MIT"
] | null | null | null | src/envassert/file.py | shirou/envassert | 1113eb7fd1b1dffa28243e9fd11fd49830388124 | [
"MIT"
] | null | null | null | from __future__ import with_statement
from fabric.api import run, sudo, hide, env
def exists(location):
with hide("everything"):
return run('test -e "%s" && echo OK ; true' % (location)).endswith("OK")
def is_file(location):
with hide("everything"):
return run("test -f '%s' && echo OK ; true" % (location)).endswith("OK")
def is_dir(location):
with hide("everything"):
return run("test -d '%s' && echo OK ; true" % (location)).endswith("OK")
def is_link(location):
with hide("everything"):
return run("test -L '%s' && echo OK ; true" % (location)).endswith("OK")
def dir_exists(location):
with hide("everything"):
return run('test -d "%s" && echo OK ; true' % (location)).endswith("OK")
def has_line(location, line):
with hide("everything"):
text = run('cat "%s"' % (location))
return text.find(line) >= 0
def has_line_sudo(location, line):
with hide("everything"):
text = sudo('cat "%s"' % (location))
return text.find(line) >= 0
def owner_is(location, name):
with hide("everything"):
if env.platform_family == "freebsd":
return run('stat -f %%Su %s | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
else:
return run('stat -c %%U %s | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
def mode_is(location, name):
with hide("everything"):
if env.platform_family == "freebsd":
return run('stat -f %%Op %s | cut -c 4-6 | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
else:
return run('stat -c %%a %s | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
def group_is(location, name):
with hide("everything"):
if env.platform_family == "freebsd":
return run('stat -f %%Sg %s | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
else:
return run('stat -c %%G %s | grep "^%s$" && echo OK ; true' % (location, name)).endswith("OK")
| 32.234375 | 120 | 0.566651 | 275 | 2,063 | 4.185455 | 0.207273 | 0.086012 | 0.066898 | 0.105126 | 0.880973 | 0.880973 | 0.821894 | 0.754127 | 0.682016 | 0.56212 | 0 | 0.002556 | 0.241396 | 2,063 | 63 | 121 | 32.746032 | 0.732907 | 0 | 0 | 0.418605 | 0 | 0.023256 | 0.291323 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.232558 | false | 0 | 0.046512 | 0 | 0.581395 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
5b84c1dd5de1653ab50f977f021bc6ecec34aeca | 156 | py | Python | tests/unit/testhelper.py | starkcoffee/randomchatroom | 6784d1f32916a5d5e2cf3c7d9f6be927a061b5ef | [
"MIT"
] | null | null | null | tests/unit/testhelper.py | starkcoffee/randomchatroom | 6784d1f32916a5d5e2cf3c7d9f6be927a061b5ef | [
"MIT"
] | null | null | null | tests/unit/testhelper.py | starkcoffee/randomchatroom | 6784d1f32916a5d5e2cf3c7d9f6be927a061b5ef | [
"MIT"
] | 1 | 2020-01-05T02:29:18.000Z | 2020-01-05T02:29:18.000Z | import sys
from os.path import dirname
# import the app code by putting directory ../.. on the path
sys.path.append(dirname(dirname(dirname(__file__))))
| 19.5 | 60 | 0.75 | 24 | 156 | 4.708333 | 0.625 | 0.247788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141026 | 156 | 7 | 61 | 22.285714 | 0.843284 | 0.371795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5bd41dfb3ddd6a498f0df1d0d8689e1c6b4941fd | 82 | py | Python | keys.py | dvatvani/trakt-ratings-trends | 9c341592439bb4b825e8b4feb040653e434bdd67 | [
"MIT"
] | 2 | 2021-02-26T07:32:16.000Z | 2021-03-25T04:24:24.000Z | keys.py | dvatvani/trakt-ratings-trends | 9c341592439bb4b825e8b4feb040653e434bdd67 | [
"MIT"
] | null | null | null | keys.py | dvatvani/trakt-ratings-trends | 9c341592439bb4b825e8b4feb040653e434bdd67 | [
"MIT"
] | null | null | null | trakt_api_key = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx' | 82 | 82 | 0.939024 | 4 | 82 | 18.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 82 | 1 | 82 | 82 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0.771084 | 0.771084 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5be76276e87e417627666b8c3e716b4ec1330053 | 5,548 | py | Python | scripts/icws_csv_data_export.py | ioffe-ben/icws-snippets | 3a2d859a296272f55ed579c7bc5c1e483e9368d5 | [
"BSD-3-Clause"
] | 1 | 2021-11-01T13:50:27.000Z | 2021-11-01T13:50:27.000Z | scripts/icws_csv_data_export.py | ioffe-ben/icws-snippets | 3a2d859a296272f55ed579c7bc5c1e483e9368d5 | [
"BSD-3-Clause"
] | null | null | null | scripts/icws_csv_data_export.py | ioffe-ben/icws-snippets | 3a2d859a296272f55ed579c7bc5c1e483e9368d5 | [
"BSD-3-Clause"
] | null | null | null | # Copyright (c) 2021, Ben Ioffe (github.com/ioffe-ben). All rights reserved. Copyrights licensed under the BSD 3-Clause License. See the accompanying LICENSE file for terms.
# Code snippet description: the following snippet shows how to generate a .csv document with IC users, workgroups, roles, skills, wrap-up codes, and wrap-up categories using ICWS API.
import json
import requests
import icws_cloud_authentication # or import icws_premise_authentication for PureConnect Premise (more details: https://github.com/ioffe-ben/pureconnect-icws-code-snippets/blob/main/scripts/icws_premise_authentication.py)
# Generate a .csv document with IC users
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/users', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'IC Usernames\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['id']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('ic_users_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "ic_users_export.csv" file in the same folder as this script')
# Generate a .csv document with workgroups
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/workgroups', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'Workgroups\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['id']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('workgroups_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "workgroups_export.csv" file in the same folder as this script')
# Generate a .csv document with roles
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/roles', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'Roles\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['id']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('roles_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "roles_export.csv" file in the same folder as this script')
# Generate a .csv document with skills
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/skills', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'Skills\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['id']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('skills_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "skills_export.csv" file in the same folder as this script')
# Generate a .csv document with Wrap-up Codes
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/wrap-up-codes', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'Wrap-up Codes\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['displayName']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('wrap-up-codes_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "wrap-up-codes_export.csv" file in the same folder as this script')
# Generate a .csv document with Wrap-up Categories
request = requests.get(icws_cloud_authentication.baseURL + icws_cloud_authentication.json_connection_response['sessionId'] + '/configuration/wrap-up-categories', headers=icws_cloud_authentication.header, cookies=icws_cloud_authentication.cookie, verify=False)
json_items = json.loads(request.text)
i = 0
csv_export_string = 'Wrap-up Codes\n'
while i < len(json_items['items']):
item = json_items['items'][i]['configurationId']['displayName']
i += 1
csv_export_string = csv_export_string + item + '\n'
print('(' + str(i) + ' out of ' + str(len(json_items['items'])) + ') ' + item)
with open('wrap-up-categories_export.csv', 'w') as file:
file.write(csv_export_string)
file.close()
print('Export completed, see "wrap-up-categories_export.csv" file in the same folder as this script')
| 48.666667 | 259 | 0.733598 | 780 | 5,548 | 5.023077 | 0.138462 | 0.057427 | 0.146759 | 0.052067 | 0.840735 | 0.833078 | 0.833078 | 0.817254 | 0.817254 | 0.817254 | 0 | 0.003527 | 0.131218 | 5,548 | 113 | 260 | 49.097345 | 0.809336 | 0.141673 | 0 | 0.691358 | 0 | 0.024691 | 0.251684 | 0.059343 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.037037 | 0.148148 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7507a3ca79d840b6515aa600f81486d05f9e0cd9 | 155 | py | Python | ptgnn/baseneuralmodel/__init__.py | mir-am/ptgnn | 54b6f8a6411d31833e7ba904ac6bf24c37861e34 | [
"MIT"
] | 319 | 2020-05-16T01:08:03.000Z | 2022-03-31T18:47:21.000Z | ptgnn/baseneuralmodel/__init__.py | mir-am/ptgnn | 54b6f8a6411d31833e7ba904ac6bf24c37861e34 | [
"MIT"
] | 8 | 2020-06-26T13:54:34.000Z | 2022-02-01T17:31:47.000Z | ptgnn/baseneuralmodel/__init__.py | mir-am/ptgnn | 54b6f8a6411d31833e7ba904ac6bf24c37861e34 | [
"MIT"
] | 40 | 2020-05-21T13:36:51.000Z | 2022-03-16T12:56:21.000Z | from .abstractneuralmodel import AbstractNeuralModel
from .modulewithmetrics import ModuleWithMetrics
from .trainer import AbstractScheduler, ModelTrainer
| 38.75 | 52 | 0.890323 | 13 | 155 | 10.615385 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083871 | 155 | 3 | 53 | 51.666667 | 0.971831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7526a1873c2b3c6a9ed6aeb6abe77344785e80e4 | 38 | py | Python | AsyncLibrary/__init__.py | grzegorz-krol/robotframework-async | d9378cbbd8992ae63f5b7e0b8b5ba53952f989d8 | [
"MIT"
] | 25 | 2015-04-04T00:49:49.000Z | 2021-11-12T17:19:50.000Z | AsyncLibrary/__init__.py | grzegorz-krol/robotframework-async | d9378cbbd8992ae63f5b7e0b8b5ba53952f989d8 | [
"MIT"
] | 10 | 2015-07-29T17:44:21.000Z | 2021-12-27T10:16:57.000Z | AsyncLibrary/__init__.py | grzegorz-krol/robotframework-async | d9378cbbd8992ae63f5b7e0b8b5ba53952f989d8 | [
"MIT"
] | 15 | 2015-06-01T05:45:46.000Z | 2021-01-20T22:57:06.000Z | from .robot_async import AsyncLibrary
| 19 | 37 | 0.868421 | 5 | 38 | 6.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
75324d6be1c8202e0d06077cca3517e33b0cd943 | 2,526 | py | Python | plot_pdf.py | eladgsofer/dlg | 043ef1181a8fc476e01d73bb9df59bd5faed66f9 | [
"MIT"
] | null | null | null | plot_pdf.py | eladgsofer/dlg | 043ef1181a8fc476e01d73bb9df59bd5faed66f9 | [
"MIT"
] | null | null | null | plot_pdf.py | eladgsofer/dlg | 043ef1181a8fc476e01d73bb9df59bd5faed66f9 | [
"MIT"
] | null | null | null | import matplotlib as plt
import pickle
import numpy as np
def plot_graphs(algo, iteration_list):
with open('output/ITER_MAT_LOSS_' + algo + '_VANILLA.npy', 'rb') as f:
dlg_loss_per_iter_matrix = pickle.load(f)
with open('output/ITER_MAT_MSE_' + algo + '_VANILLA.npy', 'rb') as f:
dlg_mse_per_iter_matrix = pickle.load(f)
with open('output/ITER_MAT_SSIM_' + algo + '_VANILLA.npy', 'rb') as f:
dlg_ssim_per_iter_matrix = pickle.load(f)
with open('output/ITER_MAT_LOSS_' + algo + '_JOPEQ.npy', 'rb') as f:
jopeq_loss_per_iter_matrix = pickle.load(f)
with open('output/ITER_MAT_MSE_' + algo + '_JOPEQ.npy', 'rb') as f:
jopeq_mse_per_iter_matrix = pickle.load(f)
with open('output/ITER_MAT_SSIM_' + algo + '_JOPEQ.npy', 'rb') as f:
jopeq_ssim_per_iter_matrix = pickle.load(f)
with open('output/ITER_GRAD_MAT_NORM_' + algo + '_new.npy', 'rb') as f:
grads_norm_mat = pickle.load(f)
font = {'weight': 'bold', 'size': 12}
plt.figure()
plt.rc('font', **font)
plt.plot(iteration_list, np.mean(np.log(dlg_loss_per_iter_matrix), axis=1),'-o',linewidth=1.5)
plt.plot(iteration_list, np.mean(np.log(jopeq_loss_per_iter_matrix), axis=1), '-*', linewidth=1.5, markersize=8)
plt.legend(['Vanilla DLG', 'JoPEQ'])
plt.grid(visible=True, axis="y")
plt.grid(visible=True, which='minor')
plt.xlabel("epoches")
plt.ylabel("log(loss)")
plt.savefig("log_loss_dlg_vs_jopeq.pdf", format="pdf", bbox_inches="tight")
plt.figure()
plt.rc('font', **font)
plt.plot(iteration_list, np.mean(np.log(dlg_mse_per_iter_matrix), axis=1), linewidth=3)
plt.plot(iteration_list, np.mean(np.log(jopeq_mse_per_iter_matrix), axis=1), linewidth=3)
plt.title("dlg vanilla MSE vs JoPEQ MSE")
plt.grid(visible=True, axis="y")
plt.grid(visible=True, which='minor')
plt.xlabel("epoches")
plt.ylabel("log(MSE)")
plt.figure()
plt.rc('font', **font)
plt.plot(iteration_list, np.mean(dlg_ssim_per_iter_matrix, axis=1), '-o', linewidth=1.5)
plt.plot(iteration_list, np.mean(jopeq_ssim_per_iter_matrix, axis=1), '-*', linewidth=1.5, markersize=8)
plt.legend(['Vanilla DLG', 'JoPEQ'])
plt.grid(visible=True, axis="y")
plt.grid(visible=True, which='minor')
plt.xlabel("epoches")
plt.ylabel("SSIM")
plt.savefig("ssim_dlg_vs_jopeq.pdf", format="pdf", bbox_inches="tight")
plt.figure()
plt.plot(iteration_list, np.mean(grads_norm_mat, axis=1), linewidth=3)
plt.show() | 44.315789 | 116 | 0.669834 | 402 | 2,526 | 3.962687 | 0.174129 | 0.052731 | 0.097928 | 0.079096 | 0.853735 | 0.837414 | 0.812932 | 0.721908 | 0.721908 | 0.652229 | 0 | 0.010363 | 0.159541 | 2,526 | 57 | 117 | 44.315789 | 0.739991 | 0 | 0 | 0.352941 | 0 | 0 | 0.17966 | 0.061733 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019608 | false | 0 | 0.058824 | 0 | 0.078431 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
753719719029315f266975b6eeb89a0a85368340 | 27 | py | Python | docker_python_template/other.py | mattwalshdev/docker_python_template | d180c880c2bb4609d5f00ba948c3339f2d05de2d | [
"MIT"
] | null | null | null | docker_python_template/other.py | mattwalshdev/docker_python_template | d180c880c2bb4609d5f00ba948c3339f2d05de2d | [
"MIT"
] | null | null | null | docker_python_template/other.py | mattwalshdev/docker_python_template | d180c880c2bb4609d5f00ba948c3339f2d05de2d | [
"MIT"
] | null | null | null | print("I'm another file!!") | 27 | 27 | 0.666667 | 5 | 27 | 3.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 27 | 1 | 27 | 27 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
7557029c81abc848e6992b6a7ef2b133a6d3d8a8 | 274 | py | Python | pybullet-env-mods/pybullet_env_mods/envs/__init__.py | homayoonfarrahi/cycle-time-study | 3939d1ffa7e74eaea2ebf059290dd817f4fe084c | [
"Apache-2.0"
] | null | null | null | pybullet-env-mods/pybullet_env_mods/envs/__init__.py | homayoonfarrahi/cycle-time-study | 3939d1ffa7e74eaea2ebf059290dd817f4fe084c | [
"Apache-2.0"
] | null | null | null | pybullet-env-mods/pybullet_env_mods/envs/__init__.py | homayoonfarrahi/cycle-time-study | 3939d1ffa7e74eaea2ebf059290dd817f4fe084c | [
"Apache-2.0"
] | null | null | null | from pybullet_env_mods.envs.self_aware_reacher_env import SelfAwareReacherBulletEnv
from pybullet_env_mods.envs.self_aware_reacher_env import HighFreqReacherBulletEnv
from pybullet_env_mods.envs.high_freq_inverted_pendulum_env import HighFreqInvertedDoublePendulumBulletEnv
| 68.5 | 106 | 0.934307 | 34 | 274 | 7.058824 | 0.470588 | 0.15 | 0.1875 | 0.2375 | 0.495833 | 0.4 | 0.4 | 0.4 | 0.4 | 0.4 | 0 | 0 | 0.043796 | 274 | 3 | 107 | 91.333333 | 0.916031 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f35d2b073871238f8b7004835319ec3bc4ac0c02 | 323 | py | Python | terrascript/librato/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/librato/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | terrascript/librato/r.py | vutsalsinghal/python-terrascript | 3b9fb5ad77453d330fb0cd03524154a342c5d5dc | [
"BSD-2-Clause"
] | null | null | null | # terrascript/librato/r.py
import terrascript
class librato_space(terrascript.Resource):
pass
class librato_space_chart(terrascript.Resource):
pass
class librato_metric(terrascript.Resource):
pass
class librato_alert(terrascript.Resource):
pass
class librato_service(terrascript.Resource):
pass
| 16.15 | 48 | 0.780186 | 37 | 323 | 6.648649 | 0.351351 | 0.243902 | 0.46748 | 0.455285 | 0.569106 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145511 | 323 | 19 | 49 | 17 | 0.891304 | 0.074303 | 0 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.454545 | 0.090909 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
f361ce2611ce262a4963939c167171239ba46ffa | 10,376 | py | Python | tests/core/test_nested_data.py | utsc-networking/utsc-tools | d5bc10cf825f1be46999d5a42da62cc0df456f0c | [
"MIT"
] | null | null | null | tests/core/test_nested_data.py | utsc-networking/utsc-tools | d5bc10cf825f1be46999d5a42da62cc0df456f0c | [
"MIT"
] | null | null | null | tests/core/test_nested_data.py | utsc-networking/utsc-tools | d5bc10cf825f1be46999d5a42da62cc0df456f0c | [
"MIT"
] | null | null | null | # pylint: disable=unused-argument
from utsc.core import NestedData
def test_nesteddata_unstructure():
input_data = {
"menu": {
"header": "SVG Viewer",
"items": [
{"id": "Open"},
{"id": "OpenNew", "label": "Open New"},
None,
{"id": "ZoomIn", "label": "Zoom In"},
{"id": "ZoomOut", "label": "Zoom Out"},
{"id": "OriginalView", "label": "Original View"},
None,
{"id": "Quality"},
{"id": "Pause"},
{"id": "Mute"},
None,
{"id": "Find", "label": "Find..."},
{"id": "FindAgain", "label": "Find Again"},
{"id": "Copy"},
{"id": "CopyAgain", "label": "Copy Again"},
{"id": "CopySVG", "label": "Copy SVG"},
{"id": "ViewSVG", "label": "View SVG"},
{"id": "ViewSource", "label": "View Source"},
{"id": "SaveAs", "label": "Save As"},
None,
{"id": "Help"},
{"id": "About", "label": "About Adobe CVG Viewer..."},
],
"other": {"[key1]": True, "[key2]": False},
}
}
expected_output = [
("menu.header", "SVG Viewer"),
("menu.items.[0].id", "Open"),
("menu.items.[1].id", "OpenNew"),
("menu.items.[1].label", "Open New"),
("menu.items.[2]", None),
("menu.items.[3].id", "ZoomIn"),
("menu.items.[3].label", "Zoom In"),
("menu.items.[4].id", "ZoomOut"),
("menu.items.[4].label", "Zoom Out"),
("menu.items.[5].id", "OriginalView"),
("menu.items.[5].label", "Original View"),
("menu.items.[6]", None),
("menu.items.[7].id", "Quality"),
("menu.items.[8].id", "Pause"),
("menu.items.[9].id", "Mute"),
("menu.items.[10]", None),
("menu.items.[11].id", "Find"),
("menu.items.[11].label", "Find..."),
("menu.items.[12].id", "FindAgain"),
("menu.items.[12].label", "Find Again"),
("menu.items.[13].id", "Copy"),
("menu.items.[14].id", "CopyAgain"),
("menu.items.[14].label", "Copy Again"),
("menu.items.[15].id", "CopySVG"),
("menu.items.[15].label", "Copy SVG"),
("menu.items.[16].id", "ViewSVG"),
("menu.items.[16].label", "View SVG"),
("menu.items.[17].id", "ViewSource"),
("menu.items.[17].label", "View Source"),
("menu.items.[18].id", "SaveAs"),
("menu.items.[18].label", "Save As"),
("menu.items.[19]", None),
("menu.items.[20].id", "Help"),
("menu.items.[21].id", "About"),
("menu.items.[21].label", "About Adobe CVG Viewer..."),
("menu.other.[key1]", True),
("menu.other.[key2]", False),
]
output = []
for keypath, value in NestedData.unstructure(input_data):
assert isinstance(keypath, str)
output.append((keypath, value))
assert output == expected_output
def test_nesteddata_restructure():
input_data = [
("menu.header", "SVG Viewer"),
("menu.items.[0].id", "Open"),
("menu.items.[1].id", "OpenNew"),
("menu.items.[1].label", "Open New"),
("menu.items.[2]", None),
("menu.items.[3].id", "ZoomIn"),
("menu.items.[3].label", "Zoom In"),
("menu.items.[4].id", "ZoomOut"),
("menu.items.[4].label", "Zoom Out"),
("menu.items.[5].id", "OriginalView"),
("menu.items.[5].label", "Original View"),
("menu.items.[6]", None),
("menu.items.[7].id", "Quality"),
("menu.items.[8].id", "Pause"),
("menu.items.[9].id", "Mute"),
("menu.items.[10]", None),
("menu.items.[11].id", "Find"),
("menu.items.[11].label", "Find..."),
("menu.items.[12].id", "FindAgain"),
("menu.items.[12].label", "Find Again"),
("menu.items.[13].id", "Copy"),
("menu.items.[14].id", "CopyAgain"),
("menu.items.[14].label", "Copy Again"),
("menu.items.[15].id", "CopySVG"),
("menu.items.[15].label", "Copy SVG"),
("menu.items.[16].id", "ViewSVG"),
("menu.items.[16].label", "View SVG"),
("menu.items.[17].id", "ViewSource"),
("menu.items.[17].label", "View Source"),
("menu.items.[18].id", "SaveAs"),
("menu.items.[18].label", "Save As"),
("menu.items.[19]", None),
("menu.items.[20].id", "Help"),
("menu.items.[21].id", "About"),
("menu.items.[21].label", "About Adobe CVG Viewer..."),
("menu.other.[key1]", True),
("menu.other.[key2]", False),
]
expected_output = {
"menu": {
"header": "SVG Viewer",
"items": [
{"id": "Open"},
{"id": "OpenNew", "label": "Open New"},
None,
{"id": "ZoomIn", "label": "Zoom In"},
{"id": "ZoomOut", "label": "Zoom Out"},
{"id": "OriginalView", "label": "Original View"},
None,
{"id": "Quality"},
{"id": "Pause"},
{"id": "Mute"},
None,
{"id": "Find", "label": "Find..."},
{"id": "FindAgain", "label": "Find Again"},
{"id": "Copy"},
{"id": "CopyAgain", "label": "Copy Again"},
{"id": "CopySVG", "label": "Copy SVG"},
{"id": "ViewSVG", "label": "View SVG"},
{"id": "ViewSource", "label": "View Source"},
{"id": "SaveAs", "label": "Save As"},
None,
{"id": "Help"},
{"id": "About", "label": "About Adobe CVG Viewer..."},
],
"other": {"[key1]": True, "[key2]": False},
}
}
output = NestedData.restructure(input_data)
assert output == expected_output
def test_nesteddata_remap():
keymap = [
# basic renaming
("menu.header", "menu.footer"),
# renaming with shell-style wildcards
("menu.items.[1].*", "menu.items.[1].new*"),
# multiple rules can be applied to the same items, will be applied in order
("menu.items.*", "menu.newitems.*"),
# support multiple wildcards
("menu.*.[3].*", "menu.*.[3].*altered"),
# can move entire branches of the tree around, reattach them to other parts of the tree
("menu.newitems.[4].*", "menu.newsubkey.*"),
]
input_data = {
"menu": {
"header": "SVG Viewer",
"items": [
{"id": "Open"},
{"id": "OpenNew", "label": "Open New"},
None,
{"id": "ZoomIn", "label": "Zoom In"},
{"id": "ZoomOut", "label": "Zoom Out"},
],
"other": {"[key1]": True, "[key2]": False},
}
}
expected_output = {
"menu": {
"footer": "SVG Viewer",
"newitems": [
{"id": "Open"},
{"newid": "OpenNew", "newlabel": "Open New"},
None,
{"idaltered": "ZoomIn", "labelaltered": "Zoom In"},
],
"newsubkey": {"id": "ZoomOut", "label": "Zoom Out"},
"other": {"[key1]": True, "[key2]": False},
}
}
unstructured = NestedData.unstructure(input_data)
unstructured = NestedData.remap(unstructured, keymap)
output = NestedData.restructure(unstructured)
assert output == expected_output
def test_nesteddata_filter():
input_data = {
"menu": {
"header": "SVG Viewer",
"items": [
{"id": "Open"},
{"id": "OpenNew", "label": "Open New"},
None,
{"id": "ZoomIn", "label": "Zoom In"},
{"id": "ZoomOut", "label": "Zoom Out"},
{"id": "OriginalView", "label": "Original View"},
None,
{"id": "Quality"},
{"id": "Pause"},
{"id": "Mute"},
None,
{"id": "Find", "label": "Find..."},
{"id": "FindAgain", "label": "Find Again"},
{"id": "Copy"},
{"id": "CopyAgain", "label": "Copy Again"},
{"id": "CopySVG", "label": "Copy SVG"},
{"id": "ViewSVG", "label": "View SVG"},
{"id": "ViewSource", "label": "View Source"},
{"id": "SaveAs", "label": "Save As"},
None,
{"id": "Help"},
{"id": "About", "label": "About Adobe CVG Viewer..."},
],
"other": {
"first": {"id": "Help"},
"second": {"id": "Help"},
},
}
}
filters = [
"menu.header", # full match
"menu.other.first", # partial match
"menu.items.*.*", # regex match, filter out all entries in items which don't have an id
]
expected_output = {
"menu": {
"header": "SVG Viewer",
"items": [
{"id": "Open"},
{"id": "OpenNew", "label": "Open New"},
{"id": "ZoomIn", "label": "Zoom In"},
{"id": "ZoomOut", "label": "Zoom Out"},
{"id": "OriginalView", "label": "Original View"},
{"id": "Quality"},
{"id": "Pause"},
{"id": "Mute"},
{"id": "Find", "label": "Find..."},
{"id": "FindAgain", "label": "Find Again"},
{"id": "Copy"},
{"id": "CopyAgain", "label": "Copy Again"},
{"id": "CopySVG", "label": "Copy SVG"},
{"id": "ViewSVG", "label": "View SVG"},
{"id": "ViewSource", "label": "View Source"},
{"id": "SaveAs", "label": "Save As"},
{"id": "Help"},
{"id": "About", "label": "About Adobe CVG Viewer..."},
],
"other": {"first": {"id": "Help"}},
}
}
unstructured = NestedData.unstructure(input_data)
filtered = NestedData.filter_(unstructured, filters)
output = NestedData.restructure(filtered)
assert output == expected_output
| 38.147059 | 96 | 0.427814 | 983 | 10,376 | 4.490336 | 0.144456 | 0.146806 | 0.021749 | 0.030131 | 0.780471 | 0.761441 | 0.754418 | 0.725193 | 0.717943 | 0.717943 | 0 | 0.018374 | 0.344352 | 10,376 | 271 | 97 | 38.287823 | 0.630457 | 0.034888 | 0 | 0.796078 | 0 | 0 | 0.379548 | 0.033587 | 0 | 0 | 0 | 0 | 0.019608 | 1 | 0.015686 | false | 0 | 0.003922 | 0 | 0.019608 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f36b0e967d23b83573072b1385c4ce3c5d802899 | 88 | py | Python | eod/runner/__init__.py | scott-mao/EOD | f10e64de86c0f356ebf5c7e923f4042eec4207b1 | [
"Apache-2.0"
] | 1 | 2022-01-12T01:51:39.000Z | 2022-01-12T01:51:39.000Z | eod/runner/__init__.py | YZW-explorer/EOD | f10e64de86c0f356ebf5c7e923f4042eec4207b1 | [
"Apache-2.0"
] | null | null | null | eod/runner/__init__.py | YZW-explorer/EOD | f10e64de86c0f356ebf5c7e923f4042eec4207b1 | [
"Apache-2.0"
] | null | null | null | from .base_runner import BaseRunner # noqa
from .quant_runner import QuantRunner # noqa
| 29.333333 | 44 | 0.818182 | 12 | 88 | 5.833333 | 0.666667 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 88 | 2 | 45 | 44 | 0.921053 | 0.102273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f36db3be3d72b932bbf88cacd5dfc7e022c60d1b | 126,657 | py | Python | healthier/entries/migrations/0015_physicalactivity.py | muatik/healthier | 571662f0e9bbae4ff09014e4f6e37e11fa485a12 | [
"MIT"
] | 9 | 2017-02-07T12:00:00.000Z | 2021-03-15T05:29:13.000Z | healthier/entries/migrations/0015_physicalactivity.py | muatik/healthier | 571662f0e9bbae4ff09014e4f6e37e11fa485a12 | [
"MIT"
] | 37 | 2016-10-03T08:26:33.000Z | 2016-12-18T15:49:30.000Z | healthier/entries/migrations/0015_physicalactivity.py | muatik/healthier | 571662f0e9bbae4ff09014e4f6e37e11fa485a12 | [
"MIT"
] | 6 | 2017-02-09T17:23:11.000Z | 2018-04-19T02:17:09.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.10.3 on 2016-12-11 17:49
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('entries', '0014_auto_20161211_1436'),
]
operations = [
migrations.CreateModel(
name='PhysicalActivity',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('code', models.IntegerField()),
('METS', models.FloatField()),
('name', models.TextField(max_length=255)),
],
),
migrations.RunSQL([
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01003', 'bicycling, mountain, uphill, vigorous', '14.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01004', 'bicycling, mountain, competitive, racing', '16.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01008', 'bicycling, BMX', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01009', 'bicycling, mountain, general', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01010', 'bicycling, <10 mph, leisure, to work or for pleasure (Taylor Code 115)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01011', 'bicycling, to/from work, self selected pace', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01013', 'bicycling, on dirt or farm road, moderate pace', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01015', 'bicycling, general', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01018', 'bicycling, leisure, 5.5 mph', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01019', 'bicycling, leisure, 9.4 mph', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01020', 'bicycling, 10-11.9 mph, leisure, slow, light effort', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01030', 'bicycling, 12-13.9 mph, leisure, moderate effort', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01040', 'bicycling, 14-15.9 mph, racing or leisure, fast, vigorous effort', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01050', 'bicycling, 16-19 mph, racing/not drafting or > 19 mph drafting, very fast, racing general', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01060', 'bicycling, > 20 mph, racing, not drafting', '15.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01065', 'bicycling, 12 mph, seated, hands on brake hoods or bar drops, 80 rpm', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01066', 'bicycling, 12 mph, standing, hands on brake hoods, 60 rpm', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('01070', 'unicycling', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02001', 'activity promoting video game (e.g., Wii Fit), light effort (e.g., balance, yoga)', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02003', 'activity promoting video game (e.g., Wii Fit), moderate effort (e.g., aerobic, resistance)', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02005', 'activity promoting video/arcade game (e.g., Exergaming, Dance Dance Revolution), vigorous effort', '7.2');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02008', 'army type obstacle course exercise, boot camp training program', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02010', 'bicycling, stationary, general', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02011', 'bicycling, stationary, 30-50 watts, very light to light effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02012', 'bicycling, stationary, 90-100 watts, moderate to vigorous effort', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02013', 'bicycling, stationary, 101-160 watts, vigorous effort', '8.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02014', 'bicycling, stationary, 161-200 watts, vigorous effort', '11.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02015', 'bicycling, stationary, 201-270 watts, very vigorous effort', '14.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02017', 'bicycling, stationary, 51-89 watts, light-to-moderate effort', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02019', 'bicycling, stationary, RPM/Spin bike class', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02020', 'calisthenics (e.g., push ups, sit ups, pull-ups, jumping jacks), vigorous effort', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02022', 'calisthenics (e.g., push ups, sit ups, pull-ups, lunges), moderate effort', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02024', 'calisthenics (e.g., situps, abdominal crunches), light effort', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02030', 'calisthenics, light or moderate effort, general (e.g., back exercises), going up & down from floor (Taylor Code 150)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02035', 'circuit training, moderate effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02040', 'circuit training, including kettlebells, some aerobic movement with minimal rest, general, vigorous intensity', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02045', 'CurvesTM exercise routines in women', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02048', 'Elliptical trainer, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02050', 'resistance training (weight lifting, free weight, nautilus or universal), power lifting or body building, vigorous effort (Taylor Code 210)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02052', 'resistance (weight) training, squats , slow or explosive effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02054', 'resistance (weight) training, multiple exercises, 8-15 repetitions at varied resistance', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02060', 'health club exercise, general (Taylor Code 160)', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02061', 'health club exercise classes, general, gym/weight training combined in one visit', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02062', 'health club exercise, conditioning classes', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02064', 'home exercise, general', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02065', 'stair-treadmill ergometer, general', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02068', 'rope skipping, general', '12.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02070', 'rowing, stationary ergometer, general, vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02071', 'rowing, stationary, general, moderate effort', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02072', 'rowing, stationary, 100 watts, moderate effort', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02073', 'rowing, stationary, 150 watts, vigorous effort', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02074', 'rowing, stationary, 200 watts, very vigorous effort', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02080', 'ski machine, general', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02085', 'slide board exercise, general', '11.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02090', 'slimnastics, jazzercise', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02101', 'stretching, mild', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02105', 'pilates, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02110', 'teaching exercise class (e.g., aerobic, water)', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02112', 'therapeutic exercise ball, Fitball exercise', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02115', 'upper body exercise, arm ergometer', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02117', 'upper body exercise, stationary bicycle - Airdyne (arms only) 40 rpm, moderate', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02120', 'water aerobics, water calisthenics, water exercise', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02135', 'whirlpool, sitting', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02140', 'video exercise workouts, TV conditioning programs (e.g., yoga, stretching), light effort', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02143', 'video exercise workouts, TV conditioning programs (e.g., cardio-resistance), moderate effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02146', 'video exercise workouts, TV conditioning programs (e.g., cardio-resistance), vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02150', 'yoga, Hatha', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02160', 'yoga, Power', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02170', 'yoga, Nadisodhana', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02180', 'yoga, Surya Namaskar', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02200', 'native New Zealander physical activities (e.g., Haka Powhiri, Moteatea, Waita Tira, Whakawatea, etc.), general, moderate effort', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('02205', 'native New Zealander physical activities (e.g., Haka, Taiahab), general, vigorous effort', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03010', 'ballet, modern, or jazz, general, rehearsal or class', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03012', 'ballet, modern, or jazz, performance, vigorous effort', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03014', 'tap', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03015', 'aerobic, general', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03016', 'aerobic, step, with 6 - 8 inch step', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03017', 'aerobic, step, with 10 - 12 inch step', '9.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03018', 'aerobic, step, with 4-inch step', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03019', 'bench step class, general', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03020', 'aerobic, low impact', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03021', 'aerobic, high impact', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03022', 'aerobic dance wearing 10-15 lb weights', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03025', 'ethnic or cultural dancing (e.g., Greek, Middle Eastern, hula, salsa, merengue, bamba y plena, flamenco, belly, and swing)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03030', 'ballroom, fast (Taylor Code 125)', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03031', 'general dancing (e.g., disco, folk, Irish step dancing, line dancing, polka, contra, country)', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03038', 'ballroom dancing, competitive, general', '11.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03040', 'ballroom, slow (e.g., waltz, foxtrot, slow dancing, samba, tango, 19th century dance, mambo, cha cha)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03050', 'Anishinaabe Jingle Dancing', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('03060', 'Caribbean dance (Abakua, Beguine, Bellair, Bongo, Brukins, Caribbean Quadrills, Dinki Mini, Gere, Gumbay, Ibo, Jonkonnu, Kumina, Oreisha, Jambu)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04001', 'fishing, general', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04005', 'fishing, crab fishing', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04007', 'fishing, catching fish with hands', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04010', 'fishing related, digging worms, with shovel', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04020', 'fishing from river bank and walking', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04030', 'fishing from boat or canoe, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04040', 'fishing from river bank, standing (Taylor Code 660)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04050', 'fishing in stream, in waders (Taylor Code 670)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04060', 'fishing, ice, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04061', 'fishing, jog or line, standing, general', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04062', 'fishing, dip net, setting net and retrieving fish, general', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04063', 'fishing, set net, setting net and retrieving fish, general', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04064', 'fishing, fishing wheel, setting net and retrieving fish, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04065', 'fishing with a spear, standing', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04070', 'hunting, bow and arrow, or crossbow', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04080', 'hunting, deer, elk, large game (Taylor Code 170)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04081', 'hunting large game, dragging carcass', '11.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04083', 'hunting large marine animals', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04085', 'hunting large game, from a hunting stand, limited walking', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04086', 'hunting large game from a car, plane, or boat', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04090', 'hunting, duck, wading', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04095', 'hunting, flying fox, squirrel', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04100', 'hunting, general', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04110', 'hunting, pheasants or grouse (Taylor Code 680)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04115', 'hunting, birds', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04120', 'hunting, rabbit, squirrel, prairie chick, raccoon, small game (Taylor Code 690)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04123', 'hunting, pigs, wild', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04124', 'trapping game, general', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04125', 'hunting, hiking with hunting gear', '9.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04130', 'pistol shooting or trap shooting, standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04140', 'rifle exercises, shooting, lying down', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('04145', 'rifle exercises, shooting, kneeling or standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05010', 'cleaning, sweeping carpet or floors, general', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05011', 'cleaning, sweeping, slow, light effort', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05012', 'cleaning, sweeping, slow, moderate effort', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05020', 'cleaning, heavy or major (e.g. wash car, wash windows, clean garage), moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05021', 'cleaning, mopping, standing, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05022', 'cleaning windows, washing windows, general', '3.2');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05023', 'mopping, standing, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05024', 'polishing floors, standing, walking slowly, using electric polishing machine', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05025', 'multiple household tasks all at once, light effort', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05026', 'multiple household tasks all at once, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05027', 'multiple household tasks all at once, vigorous effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05030', 'cleaning, house or cabin, general, moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05032', 'dusting or polishing furniture, general', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05035', 'kitchen activity, general, (e.g., cooking, washing dishes, cleaning up), moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05040', 'cleaning, general (straightening up), changing linen, carrying out trash, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05041', 'wash dishes, standing or in general (not broken into stand/walk components)', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05042', 'wash dishes, clearing dishes from table, walking, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05043', 'vacuuming, general, moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05044', 'butchering animals, small', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05045', 'butchering animal, large, vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05046', 'cutting and smoking fish, drying fish or meat', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05048', 'tanning hides, general', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05049', 'cooking or food preparation, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05050', 'cooking or food preparation - standing or sitting or in general (not broken into stand/walk components), manual appliances, light effort', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05051', 'serving food, setting table, implied walking or standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05052', 'cooking or food preparation, walking', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05053', 'feeding household animals', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05055', 'putting away groceries (e.g. carrying groceries, shopping without a grocery cart), carrying packages', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05056', 'carrying groceries upstairs', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05057', 'cooking Indian bread on an outside stove', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05060', 'food shopping with or without a grocery cart, standing or walking', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05065', 'non-food shopping, with or without a cart, standing or walking', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05070', 'ironing', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05080', 'knitting, sewing, light effort, wrapping presents, sitting', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05082', 'sewing with a machine', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05090', 'laundry, fold or hang clothes, put clothes in washer or dryer, packing suitcase, washing clothes by hand, implied standing, light effort', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05092', 'laundry, hanging wash, washing clothes by hand, moderate effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05095', 'laundry, putting away clothes, gathering clothes to pack, putting away laundry, implied walking', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05100', 'making bed, changing linens', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05110', 'maple syruping/sugar bushing (including carrying buckets, carrying wood)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05120', 'moving furniture, household items, carrying boxes', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05121', 'moving, lifting light loads', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05125', 'organizing room', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05130', 'scrubbing floors, on hands and knees, scrubbing bathroom, bathtub, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05131', 'scrubbing floors, on hands and knees, scrubbing bathroom, bathtub, light effort', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05132', 'scrubbing floors, on hands and knees, scrubbing bathroom, bathtub, vigorous effort', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05140', 'sweeping garage, sidewalk or outside of house', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05146', 'standing, packing/unpacking boxes, occasional lifting of lightweight household items, loading or unloading items in car, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05147', 'implied walking, putting away household items, moderate effort', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05148', 'watering plants', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05149', 'building a fire inside', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05150', 'moving household items upstairs, carrying boxes or furniture', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05160', 'standing, light effort tasks (pump gas, change light bulb, etc.)', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05165', 'walking, moderate effort tasks, non-cleaning (readying to leave, shut/lock doors, close windows, etc.)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05170', 'sitting, playing with child(ren), light effort, only active periods', '2.2');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05171', 'standing, playing with child(ren), light effort, only active periods', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05175', 'walking/running, playing with child(ren), moderate effort, only active periods', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05180', 'walking/running, playing with child(ren), vigorous effort, only active periods', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05181', 'walking and carrying small child, child weighing 15 lbs or more', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05182', 'walking and carrying small child, child weighing less than 15 lbs', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05183', 'standing, holding child', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05184', 'child care, infant, general', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05185', 'child care, sitting/kneeling (e.g., dressing, bathing, grooming, feeding, occasional lifting of child), light effort, general', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05186', 'child care, standing (e.g., dressing, bathing, grooming, feeding, occasional lifting of child), moderate effort', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05188', 'reclining with baby', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05189', 'breastfeeding, sitting or reclining', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05190', 'sit, playing with animals, light effort, only active periods', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05191', 'stand, playing with animals, light effort, only active periods', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05192', 'walk/run, playing with animals, general, light effort, only active periods', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05193', 'walk/run, playing with animals, moderate effort, only active periods', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05194', 'walk/run, playing with animals, vigorous effort, only active periods', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05195', 'standing, bathing dog', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05197', 'animal care, household animals, general', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05200', 'elder care, disabled adult, bathing, dressing, moving into and out of bed, only active periods', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('05205', 'elder care, disabled adult, feeding, combing hair, light effort, only active periods', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06010', 'airplane repair', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06020', 'automobile body work', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06030', 'automobile repair, light or moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06040', 'carpentry, general, workshop (Taylor Code 620)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06050', 'carpentry, outside house, installing rain gutters (Taylor Code 640), carpentry, outside house, building a fence', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06052', 'carpentry, outside house, building a fence', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06060', 'carpentry, finishing or refinishing cabinets or furniture', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06070', 'carpentry, sawing hardwood', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06072', 'carpentry, home remodeling tasks, moderate effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06074', 'carpentry, home remodeling tasks, light effort', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06080', 'caulking, chinking log cabin', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06090', 'caulking, except log cabin', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06100', 'cleaning gutters', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06110', 'excavating garage', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06120', 'hanging storm windows', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06122', 'hanging sheet rock inside house', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06124', 'hammering nails', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06126', 'home repair, general, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06127', 'home repair, general, moderate effort', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06128', 'home repair, general, vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06130', 'laying or removing carpet', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06140', 'laying tile or linoleum, repairing appliances', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06144', 'repairing appliances', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06150', 'painting, outside home (Taylor Code 650)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06160', 'painting inside house, wallpapering, scraping paint', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06165', 'painting, (Taylor Code 630)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06167', 'plumbing, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06170', 'put on and removal of tarp - sailboat', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06180', 'roofing', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06190', 'sanding floors with a power sander', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06200', 'scraping and painting sailboat or powerboat', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06205', 'sharpening tools', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06210', 'spreading dirt with a shovel', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06220', 'washing and waxing hull of sailboat or airplane', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06225', 'washing and waxing car', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06230', 'washing fence, painting fence, moderate effort', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('06240', 'wiring, tapping-splicing', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07010', 'lying quietly and watching television', '1.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07011', 'lying quietly, doing nothing, lying in bed awake, listening to music (not talking or reading)', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07020', 'sitting quietly and watching television', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07021', 'sitting quietly, general', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07022', 'sitting quietly, fidgeting, general, fidgeting hands', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07023', 'sitting, fidgeting feet', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07024', 'sitting, smoking', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07025', 'sitting, listening to music (not talking or reading) or watching a movie in a theater', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07026', 'sitting at a desk, resting head in hands', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07030', 'sleeping', '1.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07040', 'standing quietly, standing in a line', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07041', 'standing, fidgeting', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07050', 'reclining, writing', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07060', 'reclining, talking or talking on phone', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07070', 'reclining, reading', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('07075', 'meditating', '1.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08009', 'carrying, loading or stacking wood, loading/unloading or carrying lumber, light-to-moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08010', 'carrying, loading or stacking wood, loading/unloading or carrying lumber', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08019', 'chopping wood, splitting logs, moderate effort', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08020', 'chopping wood, splitting logs, vigorous effort', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08025', 'clearing light brush, thinning garden, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08030', 'clearing brush/land, undergrowth, or ground, hauling branches, wheelbarrow chores, vigorous effort', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08040', 'digging sandbox, shoveling sand', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08045', 'digging, spading, filling garden, composting, light-to-moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08050', 'digging, spading, filling garden, composting (Taylor Code 590)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08052', 'digging, spading, filling garden, composting, vigorous effort', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08055', 'driving tractor', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08057', 'felling trees, large size', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08058', 'felling trees, small-medium size', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08060', 'gardening with heavy power tools, tilling a garden, chain saw', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08065', 'gardening, using containers, older adults > 60 years', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08070', 'irrigation channels, opening and closing ports', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08080', 'laying crushed rock', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08090', 'laying sod', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08095', 'mowing lawn, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08100', 'mowing lawn, riding mower (Taylor Code 550)', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08110', 'mowing lawn, walk, hand mower (Taylor Code 570)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08120', 'mowing lawn, walk, power mower, moderate or vigorous effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08125', 'mowing lawn, power mower, light or moderate effort (Taylor Code 590)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08130', 'operating snow blower, walking', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08135', 'planting, potting, transplanting seedlings or plants, light effort', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08140', 'planting seedlings, shrub, stooping, moderate effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08145', 'planting crops or garden, stooping, moderate effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08150', 'planting trees', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08160', 'raking lawn or leaves, moderate effort', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08165', 'raking lawn (Taylor Code 600)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08170', 'raking roof with snow rake', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08180', 'riding snow blower', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08190', 'sacking grass, leaves', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08192', 'shoveling dirt or mud', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08195', 'shoveling snow, by hand, moderate effort', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08200', 'shoveling snow, by hand (Taylor Code 610)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08202', 'shoveling snow, by hand, vigorous effort', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08210', 'trimming shrubs or trees, manual cutter', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08215', 'trimming shrubs or trees, power cutter, using leaf blower, edger, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08220', 'walking, applying fertilizer or seeding a lawn, push applicator', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08230', 'watering lawn or garden, standing or walking', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08239', 'weeding, cultivating garden, light-to-moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08240', 'weeding, cultivating garden (Taylor Code 580)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08241', 'weeding, cultivating garden, using a hoe, moderate-to-vigorous effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08245', 'gardening, general, moderate effort', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08246', 'picking fruit off trees, picking fruits/vegetables, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08248', 'picking fruit off trees, gleaning fruits, picking fruits/vegetables, climbing ladder to pick fruit, vigorous effort', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08250', 'implied walking/standing - picking up yard, light, picking flowers or vegetables', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08251', 'walking, gathering gardening tools', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08255', 'wheelbarrow, pushing garden cart or wheelbarrow', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08260', 'yard work, general, light effort', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08261', 'yard work, general, moderate effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('08262', 'yard work, general, vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09000', 'board game playing, sitting', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09005', 'casino gambling, standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09010', 'card playing, sitting', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09013', 'chess game, sitting', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09015', 'copying documents, standing', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09020', 'drawing, writing, painting, standing', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09025', 'laughing, sitting', '1.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09030', 'sitting, reading, book, newspaper, etc.', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09040', 'sitting, writing, desk work, typing', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09045', 'sitting, playing traditional video game, computer game', '1.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09050', 'standing, talking in person, on the phone, computer, or text messaging, light effort', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09055', 'sitting, talking in person, on the phone, computer, or text messaging, light effort', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09060', 'sitting, studying, general, including reading and/or writing, light effort', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09065', 'sitting, in class, general, including note-taking or class discussion', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09070', 'standing, reading', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09071', 'standing, miscellaneous', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09075', 'sitting, arts and crafts, carving wood, weaving, spinning wool, light effort', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09080', 'sitting, arts and crafts, carving wood, weaving, spinning wool, moderate effort', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09085', 'standing, arts and crafts, sand painting, carving, weaving, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09090', 'standing, arts and crafts, sand painting, carving, weaving, moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09095', 'standing, arts and crafts, sand painting, carving, weaving, vigorous effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09100', 'retreat/family reunion activities involving sitting, relaxing, talking, eating', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09101', 'retreat/family reunion activities involving playing games with children', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09105', 'touring/traveling/vacation involving riding in a vehicle', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09106', 'touring/traveling/vacation involving walking', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09110', 'camping involving standing, walking, sitting, light-to-moderate effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('09115', 'sitting at a sporting event, spectator', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10010', 'accordion, sitting', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10020', 'cello, sitting', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10030', 'conducting orchestra, standing', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10035', 'double bass, standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10040', 'drums, sitting', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10045', 'drumming (e.g., bongo, conga, benbe), moderate, sitting', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10050', 'flute, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10060', 'horn, standing', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10070', 'piano, sitting', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10074', 'playing musical instruments, general', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10077', 'organ, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10080', 'trombone, standing', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10090', 'trumpet, standing', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10100', 'violin, sitting', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10110', 'woodwind, sitting', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10120', 'guitar, classical, folk, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10125', 'guitar, rock and roll band, standing', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10130', 'marching band, baton twirling, walking, moderate pace, general', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10131', 'marching band, playing an instrument, walking, brisk pace, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('10135', 'marching band, drum major, walking', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11003', 'active workstation, treadmill desk, walking', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11006', 'airline flight attendant', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11010', 'bakery, general, moderate effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11015', 'bakery, light effort', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11020', 'bookbinding', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11030', 'building road, driving heavy machinery', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11035', 'building road, directing traffic, standing', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11038', 'carpentry, general, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11040', 'carpentry, general, moderate effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11042', 'carpentry, general, heavy or vigorous effort', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11050', 'carrying heavy loads (e.g., bricks, tools)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11060', 'carrying moderate loads up stairs, moving boxes 25-49 lbs', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11070', 'chambermaid, hotel housekeeper, making bed, cleaning bathroom, pushing cart', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11080', 'coal mining, drilling coal, rock', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11090', 'coal mining, erecting supports', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11100', 'coal mining, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11110', 'coal mining, shoveling coal', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11115', 'cook, chef', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11120', 'construction, outside, remodeling, new structures (e.g., roof repair, miscellaneous)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11125', 'custodial work, light effort (e.g., cleaning sink and toilet, dusting, vacuuming, light cleaning)', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11126', 'custodial work, moderate effort (e.g., electric buffer, feathering arena floors, mopping, taking out trash, vacuuming)', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11130', 'electrical work (e.g., hook up wire, tapping-splicing)', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11135', 'engineer (e.g., mechanical or electrical)', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11145', 'farming, vigorous effort (e.g., baling hay, cleaning barn)', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11146', 'farming, moderate effort (e.g., feeding animals, chasing cattle by walking and/or horseback, spreading manure, harvesting crops)', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11147', 'farming, light effort (e.g., cleaning animal sheds, preparing animal feed)', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11170', 'farming, driving tasks (e.g., driving tractor or harvester)', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11180', 'farming, feeding small animals', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11190', 'farming, feeding cattle, horses', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11191', 'farming, hauling water for animals, general hauling water', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11192', 'farming, taking care of animals (e.g., grooming, brushing, shearing sheep, assisting with birthing, medical care, branding), general', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11195', 'farming, rice, planting, grain milling activities', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11210', 'farming, milking by hand, cleaning pails, moderate effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11220', 'farming, milking by machine, light effort', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11240', 'fire fighter, general', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11244', 'fire fighter, rescue victim, automobile accident, using pike pole', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11245', 'fire fighter, raising and climbing ladder with full gear, simulated fire suppression', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11246', 'fire fighter, hauling hoses on ground, carrying/hoisting equipment, breaking down walls etc., wearing full gear', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11247', 'fishing, commercial, light effort', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11248', 'fishing, commercial, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11249', 'fishing, commercial, vigorous effort', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11250', 'forestry, ax chopping, very fast, 1.25 kg axe, 51 blows/min, extremely vigorous effort', '17.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11260', 'forestry, ax chopping, slow, 1.25 kg axe, 19 blows/min, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11262', 'forestry, ax chopping, fast, 1.25 kg axe, 35 blows/min, vigorous effort', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11264', 'forestry, moderate effort (e.g., sawing wood with power saw, weeding, hoeing)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11266', 'forestry, vigorous effort (e.g., barking, felling, or trimming trees, carrying or stacking logs, planting seeds, sawing lumber by hand)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11370', 'furriery', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11375', 'garbage collector, walking, dumping bins into truck', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11378', 'hairstylist (e.g., plaiting hair, manicure, make-up artist)', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11380', 'horse grooming, including feeding, cleaning stalls, bathing, brushing, clipping, longeing and exercising horses', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11381', 'horse, feeding, watering, cleaning stalls, implied walking and lifting loads', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11390', 'horse racing, galloping', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11400', 'horse racing, trotting', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11410', 'horse racing, walking', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11413', 'kitchen maid', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11415', 'lawn keeper, yard work, general', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11418', 'laundry worker', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11420', 'locksmith', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11430', 'machine tooling (e.g., machining, working sheet metal, machine fitter, operating lathe, welding) light-to-moderate effort', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11450', 'machine tooling, operating punch press, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11472', 'manager, property', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11475', 'manual or unskilled labor, general, light effort', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11476', 'manual or unskilled labor, general, moderate effort', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11477', 'manual or unskilled labor, general, vigorous effort', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11480', 'masonry, concrete, moderate effort', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11482', 'masonry, concrete, light effort', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11485', 'massage therapist, standing', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11490', 'moving, carrying or pushing heavy objects, 75 lbs or more, only active time (e.g., desks, moving van work)', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11495', 'skindiving or SCUBA diving as a frogman, Navy Seal', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11500', 'operating heavy duty equipment, automated, not driving', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11510', 'orange grove work, picking fruit', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11514', 'painting, house, furniture, moderate effort', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11516', 'plumbing activities', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11520', 'printing, paper industry worker, standing', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11525', 'police, directing traffic, standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11526', 'police, driving a squad car, sitting', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11527', 'police, riding in a squad car, sitting', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11528', 'police, making an arrest, standing', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11529', 'postal carrier, walking to deliver mail', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11530', 'shoe repair, general', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11540', 'shoveling, digging ditches', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11550', 'shoveling, more than 16 lbs/minute, deep digging, vigorous effort', '8.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11560', 'shoveling, less than 10 lbs/minute, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11570', 'shoveling, 10 to 15 lbs/minute, vigorous effort', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11580', 'sitting tasks, light effort (e.g., office work, chemistry lab work, computer work, light assembly repair, watch repair, reading, desk work)', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11585', 'sitting meetings, light effort, general, and/or with talking involved (e.g., eating at a business meeting)', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11590', 'sitting tasks, moderate effort (e.g., pushing heavy levers, riding mower/forklift, crane operation)', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11593', 'sitting, teaching stretching or yoga, or light effort exercise class', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11600', 'standing tasks, light effort (e.g., bartending, store clerk, assembling, filing, duplicating, librarian, putting up a Christmas tree, standing and talking at work, changing clothes when teaching physical education, standing)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11610', 'standing, light/moderate effort (e.g., assemble/repair heavy parts, welding, stocking parts, auto repair, standing, packing boxes, nursing patient care)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11615', 'standing, moderate effort, lifting items continuously, 10-20 lbs, with limited walking or resting', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11620', 'standing, moderate effort, intermittent lifting 50 lbs, hitch/twisting ropes', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11630', 'standing, moderate/heavy tasks (e.g., lifting more than 50 lbs, masonry, painting, paper hanging)', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11708', 'steel mill, moderate effort (e.g., fettling, forging, tipping molds)', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11710', 'steel mill, vigorous effort (e.g., hand rolling, merchant mill rolling, removing slag, tending furnace)', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11720', 'tailoring, cutting fabric', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11730', 'tailoring, general', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11740', 'tailoring, hand sewing', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11750', 'tailoring, machine sewing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11760', 'tailoring, pressing', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11763', 'tailoring, weaving, light effort (e.g., finishing operations, washing, dyeing, inspecting cloth, counting yards, paperwork)', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11765', 'tailoring, weaving, moderate effort (e.g., spinning and weaving operations, delivering boxes of yarn to spinners, loading of warp beams, pinwinding, conewinding, warping, cloth cutting)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11766', 'truck driving, loading and unloading truck, tying down load, standing, walking and carrying heavy loads', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11767', 'truck, driving delivery truck, taxi, shuttlebus, school bus', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11770', 'typing, electric, manual or computer', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11780', 'using heavy power tools such as pneumatic tools (e.g., jackhammers, drills)', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11790', 'using heavy tools (not power) such as shovel, pick, tunnel bar, spade', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11791', 'walking on job, less than 2.0 mph, very slow speed, in office or lab area', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11792', 'walking on job, 3.0 mph, in office, moderate speed, not carrying anything', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11793', 'walking on job, 3.5 mph, in office, brisk speed, not carrying anything', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11795', 'walking on job, 2.5 mph, slow speed and carrying light objects less than 25 lbs', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11796', 'walking, gathering things at work, ready to leave', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11797', 'walking, 2.5 mph, slow speed, carrying heavy objects more than 25 lbs', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11800', 'walking, 3.0 mph, moderately and carrying light objects less than 25 lbs', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11805', 'walking, pushing a wheelchair', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11810', 'walking, 3.5 mph, briskly and carrying objects less than 25 lbs', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11820', 'walking or walk downstairs or standing, carrying objects about 25 to 49 lbs', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11830', 'walking or walk downstairs or standing, carrying objects about 50 to 74 lbs', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11840', 'walking or walk downstairs or standing, carrying objects about 75 to 99 lbs', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11850', 'walking or walk downstairs or standing, carrying objects about 100 lbs or more', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('11870', 'working in scene shop, theater actor, backstage employee', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12010', 'jog/walk combination (jogging component of less than 10 minutes) (Taylor Code 180)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12020', 'jogging, general', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12025', 'jogging, in place', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12027', 'jogging, on a mini-tramp', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12029', 'running, 4 mph (13 min/mile)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12030', 'running, 5 mph (12 min/mile)', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12040', 'running, 5.2 mph (11.5 min/mile)', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12050', 'running, 6 mph (10 min/mile)', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12060', 'running, 6.7 mph (9 min/mile)', '10.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12070', 'running, 7 mph (8.5 min/mile)', '11.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12080', 'running, 7.5 mph (8 min/mile)', '11.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12090', 'running, 8 mph (7.5 min/mile)', '11.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12100', 'running, 8.6 mph (7 min/mile)', '12.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12110', 'running, 9 mph (6.5 min/mile)', '12.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12120', 'running, 10 mph (6 min/mile)', '14.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12130', 'running, 11 mph (5.5 min/mile)', '16.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12132', 'running, 12 mph (5 min/mile)', '19.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12134', 'running, 13 mph (4.6 min/mile)', '19.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12135', 'running, 14 mph (4.3 min/mile)', '23.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12140', 'running, cross country', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12150', 'running (Taylor Code 200)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12170', 'running, stairs, up', '15.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12180', 'running, on a track, team practice', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12190', 'running, training, pushing a wheelchair or baby carrier', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('12200', 'running, marathon', '13.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13000', 'getting ready for bed, general, standing', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13009', 'sitting on toilet, eliminating while standing or squatting', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13010', 'bathing, sitting', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13020', 'dressing, undressing, standing or sitting', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13030', 'eating, sitting', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13035', 'talking and eating or eating only, standing', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13036', 'taking medication, sitting or standing', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13040', 'grooming, washing hands, shaving, brushing teeth, putting on make-up, sitting or standing', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13045', 'hairstyling, standing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13046', 'having hair or nails done by someone else, sitting', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('13050', 'showering, toweling off, standing', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('14010', 'active, vigorous effort', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('14020', 'general, moderate effort', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('14030', 'passive, light effort, kissing, hugging', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15000', 'Alaska Native Games, Eskimo Olympics, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15010', 'archery, non-hunting', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15020', 'badminton, competitive (Taylor Code 450)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15030', 'badminton, social singles and doubles, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15040', 'basketball, game (Taylor Code 490)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15050', 'basketball, non-game, general (Taylor Code 480)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15055', 'basketball, general', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15060', 'basketball, officiating (Taylor Code 500)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15070', 'basketball, shooting baskets', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15072', 'basketball, drills, practice', '9.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15075', 'basketball, wheelchair', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15080', 'billiards', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15090', 'bowling (Taylor Code 390)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15092', 'bowling, indoor, bowling alley', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15100', 'boxing, in ring, general', '12.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15110', 'boxing, punching bag', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15120', 'boxing, sparring', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15130', 'broomball', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15135', 'children\u2019s games, adults playing (e.g., hopscotch, 4-square, dodgeball, playground apparatus, t-ball, tetherball, marbles, arcade games), moderate effort', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15138', 'cheerleading, gymnastic moves, competitive', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15140', 'coaching, football, soccer, basketball, baseball, swimming, etc.', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15142', 'coaching, actively playing sport with players', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15150', 'cricket, batting, bowling, fielding', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15160', 'croquet', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15170', 'curling', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15180', 'darts, wall or lawn', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15190', 'drag racing, pushing or driving a car', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15192', 'auto racing, open wheel', '8.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15200', 'fencing', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15210', 'football, competitive', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15230', 'football, touch, flag, general (Taylor Code 510)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15232', 'football, touch, flag, light effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15235', 'football or baseball, playing catch', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15240', 'frisbee playing, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15250', 'frisbee, ultimate', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15255', 'golf, general', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15265', 'golf, walking, carrying clubs', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15270', 'golf, miniature, driving range', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15285', 'golf, walking, pulling clubs', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15290', 'golf, using power cart (Taylor Code 070)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15300', 'gymnastics, general', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15310', 'hacky sack', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15320', 'handball, general (Taylor Code 520)', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15330', 'handball, team', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15335', 'high ropes course, multiple elements', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15340', 'hang gliding', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15350', 'hockey, field', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15360', 'hockey, ice, general', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15362', 'hockey, ice, competitive', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15370', 'horseback riding, general', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15375', 'horse chores, feeding, watering, cleaning stalls, implied walking and lifting loads', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15380', 'saddling, cleaning, grooming, harnessing and unharnessing horse', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15390', 'horseback riding, trotting', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15395', 'horseback riding, canter or gallop', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15400', 'horseback riding, walking', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15402', 'horseback riding, jumping', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15408', 'horse cart, driving, standing or sitting', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15410', 'horseshoe pitching, quoits', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15420', 'jai alai', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15425', 'martial arts, different types, slower pace, novice performers, practice', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15430', 'martial arts, different types, moderate pace (e.g., judo, jujitsu, karate, kick boxing, tae kwan do, tai-bo, Muay Thai boxing)', '10.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15440', 'juggling', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15450', 'kickball', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15460', 'lacrosse', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15465', 'lawn bowling, bocce ball, outdoor', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15470', 'moto-cross, off-road motor sports, all-terrain vehicle, general', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15480', 'orienteering', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15490', 'paddleball, competitive', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15500', 'paddleball, casual, general (Taylor Code 460)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15510', 'polo, on horseback', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15520', 'racquetball, competitive', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15530', 'racquetball, general (Taylor Code 470)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15533', 'rock or mountain climbing (Taylor Code 470) (Formerly code = 17120)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15535', 'rock climbing, ascending rock, high difficulty', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15537', 'rock climbing, ascending or traversing rock, low-to-moderate difficulty', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15540', 'rock climbing, rappelling', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15542', 'rodeo sports, general, light effort', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15544', 'rodeo sports, general, moderate effort', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15546', 'rodeo sports, general, vigorous effort', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15550', 'rope jumping, fast pace, 120-160 skips/min', '12.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15551', 'rope jumping, moderate pace, 100-120 skips/min, general, 2 foot skip, plain bounce', '11.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15552', 'rope jumping, slow pace, < 100 skips/min, 2 foot skip, rhythm bounce', '8.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15560', 'rugby, union, team, competitive', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15562', 'rugby, touch, non-competitive', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15570', 'shuffleboard', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15580', 'skateboarding, general, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15582', 'skateboarding, competitive, vigorous effort', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15590', 'skating, roller (Taylor Code 360)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15591', 'rollerblading, in-line skating, 14.4 km/h (9.0 mph), recreational pace', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15592', 'rollerblading, in-line skating, 17.7 km/h (11.0 mph), moderate pace, exercise training', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15593', 'rollerblading, in-line skating, 21.0 to 21.7 km/h (13.0 to 13.6 mph), fast pace, exercise training', '12.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15594', 'rollerblading, in-line skating, 24.0 km/h (15.0 mph), maximal effort', '14.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15600', 'skydiving, base jumping, bungee jumping', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15605', 'soccer, competitive', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15610', 'soccer, casual, general (Taylor Code 540)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15620', 'softball or baseball, fast or slow pitch, general (Taylor Code 440)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15625', 'softball, practice', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15630', 'softball, officiating', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15640', 'softball, pitching', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15645', 'sports spectator, very excited, emotional, physically moving', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15650', 'squash (Taylor Code 530)', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15652', 'squash, general', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15660', 'table tennis, ping pong (Taylor Code 410)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15670', 'tai chi, qi gong, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15672', 'tai chi, qi gong, sitting, light effort', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15675', 'tennis, general', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15680', 'tennis, doubles (Taylor Code 430)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15685', 'tennis, doubles', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15690', 'tennis, singles (Taylor Code 420)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15695', 'tennis, hitting balls, non-game play, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15700', 'trampoline, recreational', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15702', 'trampoline, competitive', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15710', 'volleyball (Taylor Code 400)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15711', 'volleyball, competitive, in gymnasium', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15720', 'volleyball, non-competitive, 6 - 9 member team, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15725', 'volleyball, beach, in sand', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15730', 'wrestling (one match = 5 minutes)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15731', 'wallyball, general', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15732', 'track and field (e.g., shot, discus, hammer throw)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15733', 'track and field (e.g., high jump, long jump, triple jump, javelin, pole vault)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('15734', 'track and field (e.g., steeplechase, hurdles)', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16010', 'automobile or light truck (not a semi) driving', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16015', 'riding in a car or truck', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16016', 'riding in a bus or train', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16020', 'flying airplane or helicopter', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16030', 'motor scooter, motorcycle', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16035', 'pulling rickshaw', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16040', 'pushing plane in and out of hangar', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16050', 'truck, semi, tractor, > 1 ton, or bus, driving', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('16060', 'walking for transportation, 2.8-3.2 mph, level, moderate pace, firm surface', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17010', 'backpacking (Taylor Code 050)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17012', 'backpacking, hiking or organized walking with a daypack', '7.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17020', 'carrying 15 pound load (e.g. suitcase), level ground or downstairs', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17021', 'carrying 15 lb child, slow walking', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17025', 'carrying load upstairs, general', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17026', 'carrying 1 to 15 lb load, upstairs', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17027', 'carrying 16 to 24 lb load, upstairs', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17028', 'carrying 25 to 49 lb load, upstairs', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17029', 'carrying 50 to 74 lb load, upstairs', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17030', 'carrying > 74 lb load, upstairs', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17031', 'loading/unloading a car, implied walking', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17033', 'climbing hills, no load', '6.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17035', 'climbing hills with 0 to 9 lb load', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17040', 'climbing hills with 10 to 20 lb load', '7.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17050', 'climbing hills with 21 to 42 lb load', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17060', 'climbing hills with 42+ lb load', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17070', 'descending stairs', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17080', 'hiking, cross country (Taylor Code 040)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17082', 'hiking or walking at a normal pace through fields and hillsides', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17085', 'bird watching, slow walk', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17088', 'marching, moderate speed, military, no pack', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17090', 'marching rapidly, military, no pack', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17100', 'pushing or pulling stroller with child or walking with children, 2.5 to 3.1 mph', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17105', 'pushing a wheelchair, non-occupational', '3.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17110', 'race walking', '6.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17130', 'stair climbing, using or climbing up ladder (Taylor Code 030)', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17133', 'stair climbing, slow pace', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17134', 'stair climbing, fast pace', '8.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17140', 'using crutches', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17150', 'walking, household', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17151', 'walking, less than 2.0 mph, level, strolling, very slow', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17152', 'walking, 2.0 mph, level, slow pace, firm surface', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17160', 'walking for pleasure (Taylor Code 010)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17161', 'walking from house to car or bus, from car or bus to go places, from car or bus to and from the worksite', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17162', 'walking to neighbor\u2019s house or family\u2019s house for social reasons', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17165', 'walking the dog', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17170', 'walking, 2.5 mph, level, firm surface', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17180', 'walking, 2.5 mph, downhill', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17190', 'walking, 2.8 to 3.2 mph, level, moderate pace, firm surface', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17200', 'walking, 3.5 mph, level, brisk, firm surface, walking for exercise', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17210', 'walking, 2.9 to 3.5 mph, uphill, 1 to 5% grade', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17211', 'walking, 2.9 to 3.5 mph, uphill, 6% to 15% grade', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17220', 'walking, 4.0 mph, level, firm surface, very brisk pace', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17230', 'walking, 4.5 mph, level, firm surface, very, very brisk', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17231', 'walking, 5.0 mph, level, firm surface', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17235', 'walking, 5.0 mph, uphill, 3% grade', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17250', 'walking, for pleasure, work break', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17260', 'walking, grass track', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17262', 'walking, normal pace, plowed field or sand', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17270', 'walking, to work or class (Taylor Code 015)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17280', 'walking, to and from an outhouse', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17302', 'walking, for exercise, 3.5 to 4 mph, with ski poles, Nordic walking, level, moderate pace', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17305', 'walking, for exercise, 5.0 mph, with ski poles, Nordic walking, level, fast pace', '9.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17310', 'walking, for exercise, with ski poles, Nordic walking, uphill', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17320', 'walking, backwards, 3.5 mph, level', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('17325', 'walking, backwards, 3.5 mph, uphill, 5% grade', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18010', 'boating, power, driving', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18012', 'boating, power, passenger, light', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18020', 'canoeing, on camping trip (Taylor Code 270)', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18025', 'canoeing, harvesting wild rice, knocking rice off the stalks', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18030', 'canoeing, portaging', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18040', 'canoeing, rowing, 2.0-3.9 mph, light effort', '2.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18050', 'canoeing, rowing, 4.0-5.9 mph, moderate effort', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18060', 'canoeing, rowing, kayaking, competition, >6 mph, vigorous effort', '12.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18070', 'canoeing, rowing, for pleasure, general (Taylor Code 250)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18080', 'canoeing, rowing, in competition, or crew or sculling (Taylor Code 260)', '12.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18090', 'diving, springboard or platform', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18100', 'kayaking, moderate effort', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18110', 'paddle boat', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18120', 'sailing, boat and board sailing, windsurfing, ice sailing, general (Taylor Code 235)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18130', 'sailing, in competition', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18140', 'sailing, Sunfish/Laser/Hobby Cat, Keel boats, ocean sailing, yachting, leisure', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18150', 'skiing, water or wakeboarding (Taylor Code 220)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18160', 'jet skiing, driving, in water', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18180', 'skindiving, fast', '15.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18190', 'skindiving, moderate', '11.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18200', 'skindiving, scuba diving, general (Taylor Code 310)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18210', 'snorkeling (Taylor Code 310)', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18220', 'surfing, body or board, general', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18222', 'surfing, body or board, competitive', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18225', 'paddle boarding, standing', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18230', 'swimming laps, freestyle, fast, vigorous effort', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18240', 'swimming laps, freestyle, front crawl, slow, light or moderate effort', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18250', 'swimming, backstroke, general, training or competition', '9.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18255', 'swimming, backstroke, recreational', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18260', 'swimming, breaststroke, general, training or competition', '10.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18265', 'swimming, breaststroke, recreational', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18270', 'swimming, butterfly, general', '13.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18280', 'swimming, crawl, fast speed, ~75 yards/minute, vigorous effort', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18290', 'swimming, crawl, medium speed, ~50 yards/minute, vigorous effort', '8.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18300', 'swimming, lake, ocean, river (Taylor Codes 280, 295)', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18310', 'swimming, leisurely, not lap swimming, general', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18320', 'swimming, sidestroke, general', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18330', 'swimming, synchronized', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18340', 'swimming, treading water, fast, vigorous effort', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18350', 'swimming, treading water, moderate effort, general', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18352', 'tubing, floating on a river, general', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18355', 'water aerobics, water calisthenics', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18360', 'water polo', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18365', 'water volleyball', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18366', 'water jogging', '9.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18367', 'water walking, light effort, slow pace', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18368', 'water walking, moderate effort, moderate pace', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18369', 'water walking, vigorous effort, brisk pace', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18370', 'whitewater rafting, kayaking, or canoeing', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18380', 'windsurfing, not pumping for speed', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18385', 'windsurfing or kitesurfing, crossing trial', '11.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('18390', 'windsurfing, competition, pumping for speed', '13.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19005', 'dog sledding, mushing', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19006', 'dog sledding, passenger', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19010', 'moving ice house, set up/drill holes', '6.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19011', 'ice fishing, sitting', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19018', 'skating, ice dancing', '14.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19020', 'skating, ice, 9 mph or less', '5.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19030', 'skating, ice, general (Taylor Code 360)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19040', 'skating, ice, rapidly, more than 9 mph, not competitive', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19050', 'skating, speed, competitive', '13.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19060', 'ski jumping, climb up carrying skis', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19075', 'skiing, general', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19080', 'skiing, cross country, 2.5 mph, slow or light effort, ski walking', '6.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19090', 'skiing, cross country, 4.0-4.9 mph, moderate speed and effort, general', '9.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19100', 'skiing, cross country, 5.0-7.9 mph, brisk speed, vigorous effort', '12.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19110', 'skiing, cross country, >8.0 mph, elite skier, racing', '15.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19130', 'skiing, cross country, hard snow, uphill, maximum, snow mountaineering', '15.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19135', 'skiing, cross-country, skating', '13.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19140', 'skiing, cross-country, biathlon, skating technique', '13.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19150', 'skiing, downhill, alpine or snowboarding, light effort, active time only', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19160', 'skiing, downhill, alpine or snowboarding, moderate effort, general, active time only', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19170', 'skiing, downhill, vigorous effort, racing', '8.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19175', 'skiing, roller, elite racers', '12.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19180', 'sledding, tobogganing, bobsledding, luge (Taylor Code 370)', '7.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19190', 'snow shoeing, moderate effort', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19192', 'snow shoeing, vigorous effort', '10.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19200', 'snowmobiling, driving, moderate', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19202', 'snowmobiling, passenger', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19252', 'snow shoveling, by hand, moderate effort', '5.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19254', 'snow shoveling, by hand, vigorous effort', '7.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('19260', 'snow blower, walking and pushing', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20000', 'sitting in church, in service, attending a ceremony, sitting quietly', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20001', 'sitting, playing an instrument at church', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20005', 'sitting in church, talking or singing, attending a ceremony, sitting, active participation', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20010', 'sitting, reading religious materials at home', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20015', 'standing quietly in church, attending a ceremony', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20020', 'standing, singing in church, attending a ceremony, standing, active participation', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20025', 'kneeling in church or at home, praying', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20030', 'standing, talking in church', '1.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20035', 'walking in church', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20036', 'walking, less than 2.0 mph, very slow', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20037', 'walking, 3.0 mph, moderate speed, not carrying anything', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20038', 'walking, 3.5 mph, brisk speed, not carrying anything', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20039', 'walk/stand combination for religious purposes, usher', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20040', 'praise with dance or run, spiritual dancing in church', '5.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20045', 'serving food at church', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20046', 'preparing food at church', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20047', 'washing dishes, cleaning kitchen at church', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20050', 'eating at church', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20055', 'eating/talking at church or standing eating, American Indian Feast days', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20060', 'cleaning church', '3.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20061', 'general yard work at church', '4.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20065', 'standing, moderate effort (e.g., lifting heavy objects, assembling at fast rate)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20095', 'standing, moderate-to-heavy effort, manual labor, lifting >= 50 lbs, heavy maintenance', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('20100', 'typing, electric, manual, or computer', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21000', 'sitting, meeting, general, and/or with talking involved', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21005', 'sitting, light office work, in general', '1.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21010', 'sitting, moderate work', '2.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21015', 'standing, light work (filing, talking, assembling)', '2.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21016', 'sitting, child care, only active periods', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21017', 'standing, child care, only active periods', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21018', 'walk/run play with children, moderate, only active periods', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21019', 'walk/run play with children, vigorous, only active periods', '5.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21020', 'standing, light/moderate work (e.g., pack boxes, assemble/repair, set up chairs/furniture)', '3.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21025', 'standing, moderate (lifting 50 lbs., assembling at fast rate)', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21030', 'standing, moderate/heavy work', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21035', 'typing, electric, manual, or computer', '1.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21040', 'walking, less than 2.0 mph, very slow', '2.0');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21045', 'walking, 3.0 mph, moderate speed, not carrying anything', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21050', 'walking, 3.5 mph, brisk speed, not carrying anything', '4.3');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21055', 'walking, 2.5 mph slowly and carrying objects less than 25 lbs', '3.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21060', 'walking, 3.0 mph moderately and carrying objects less than 25 lbs, pushing something', '4.5');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21065', 'walking, 3.5 mph, briskly and carrying objects less than 25 lbs', '4.8');", None),
("INSERT INTO entries_PhysicalActivity (code, name, METS) VALUES ('21070', 'walk/stand combination, for volunteer purposes', '3.0');", None)
])
]
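The trailing `None` on every row suggests each tuple is an `(sql, params)` pair. A minimal sketch of how such pairs could be applied to a SQLite database — the `apply_entries` helper and the `CREATE TABLE` column types are assumptions for illustration; the table and column names (`entries_PhysicalActivity`, `code`, `name`, `METS`) come from the INSERT statements above:

```python
import sqlite3

def apply_entries(conn, statements):
    """Execute (sql, params) pairs; None params means the SQL has no placeholders."""
    cur = conn.cursor()
    for sql, params in statements:
        if params is None:
            cur.execute(sql)
        else:
            cur.execute(sql, params)
    conn.commit()

# Assumed schema: the dump does not include the CREATE TABLE, so the
# column types here are a guess consistent with the quoted values above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE entries_PhysicalActivity (code TEXT PRIMARY KEY, name TEXT, METS TEXT)"
)

# One representative row taken verbatim from the list above.
apply_entries(conn, [
    ("INSERT INTO entries_PhysicalActivity (code, name, METS) "
     "VALUES ('17165', 'walking the dog', '3.0');", None),
])

# Look up the METS value for an activity code.
row = conn.execute(
    "SELECT name, METS FROM entries_PhysicalActivity WHERE code = ?", ("17165",)
).fetchone()
```

Storing METS as TEXT mirrors the quoted literals in the dump; a numeric workload would likely cast the column to REAL instead.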